Science.gov

Sample records for cladistic analysis based

  1. Cladistic analysis of Bantu languages: a new tree based on combined lexical and grammatical data

    NASA Astrophysics Data System (ADS)

    Rexová, Kateřina; Bastin, Yvonne; Frynta, Daniel

    2006-04-01

    The phylogeny of the Bantu languages is reconstructed by application of the cladistic methodology to the combined lexical and grammatical data (87 languages, 144 characters). A maximum parsimony tree and Bayesian analysis supported some previously recognized clades, e.g., that of eastern and southern Bantu languages. Moreover, the results revealed that Bantu languages south and east of the equatorial forest are probably monophyletic. This suggests an unorthodox scenario of Bantu expansion including (after initial radiation in their homelands and neighboring territories) just a single passage through rainforest areas followed by a subsequent divergence into major clades. The likely localization of this divergence is in the area west of the Great Lakes. This conforms to the view that demographic expansion and dispersal throughout the dry-forest and savanna regions of subequatorial Africa were associated with the acquisition of new technologies (iron metallurgy and grain cultivation).
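
    Throughout this listing, "maximum parsimony" refers to scoring candidate trees by the minimum number of character-state changes they require. As a hedged illustration of what that criterion computes for a single character (not the analysis performed in this or any other record below), the sketch applies Fitch's small-parsimony count to an invented four-taxon tree and an invented binary character.

```python
# Hedged illustration: Fitch small-parsimony count for one character on a fixed
# tree. It only shows what a "maximum parsimony" criterion scores per character;
# the tree topology and states below are invented for the example.
def fitch_count(tree, states):
    """Return (state_set, changes) for a nested-tuple tree whose leaves are taxon names."""
    if isinstance(tree, str):                      # leaf: its observed state
        return {states[tree]}, 0
    (left_set, left_cost), (right_set, right_cost) = (fitch_count(t, states) for t in tree)
    common = left_set & right_set
    if common:                                     # intersection: no extra change needed
        return common, left_cost + right_cost
    return left_set | right_set, left_cost + right_cost + 1   # union: one change

# Hypothetical 4-taxon tree ((A,B),(C,D)) and one binary character.
tree = (("A", "B"), ("C", "D"))
states = {"A": "0", "B": "0", "C": "1", "D": "1"}
root_states, changes = fitch_count(tree, states)
print(f"minimum changes for this character on this tree: {changes}")   # -> 1
```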

  2. Cladistical Analysis of the Jovian and Saturnian Satellite Systems

    NASA Astrophysics Data System (ADS)

    Holt, Timothy. R.; Brown, Adrian. J.; Nesvorný, David; Horner, Jonathan; Carter, Brad

    2018-06-01

    Jupiter and Saturn each have complex systems of satellites and rings. These satellites can be classified into dynamical groups, implying similar formation scenarios. Recently, a large number of additional irregular satellites, which have yet to be classified, have been discovered around both gas giants. The aim of this paper is to examine the relationships between the satellites and rings of the gas giants, using an analytical technique called cladistics. Cladistics is traditionally used to examine relationships between living organisms, the “tree of life.” In this work, we perform the first cladistical study of objects in a planetary science context. Our method uses the orbital, physical, and compositional characteristics of satellites to classify the objects in the Jovian and Saturnian systems. We find that the major relationships between the satellites in the two systems, such as families, as presented in previous studies, are broadly preserved. In addition, based on our analysis of the Jovian system, we identify a new retrograde irregular family, the Iocaste family, and suggest that the Phoebe family of the Saturnian system can be further divided into two subfamilies. We also propose that the Saturnian irregular families be renamed, to be consistent with the convention used in Jovian families. Using cladistics, we are also able to assign the new unclassified irregular satellites to families. Taken together, the results of this study demonstrate the potential use of the cladistical technique in the investigation of relationships between orbital bodies.

  3. Evolution of amino acid metabolism inferred through cladistic analysis.

    PubMed

    Cunchillos, Chomin; Lecointre, Guillaume

    2003-11-28

    Because free amino acids were most probably available in primitive abiotic environments, their metabolism is likely to have provided some of the very first metabolic pathways of life. What were the first enzymatic reactions to emerge? A cladistic analysis of metabolic pathways of the 16 aliphatic amino acids and 2 portions of the Krebs cycle was performed using four criteria of homology. The analysis is not based on sequence comparisons but, rather, on coding similarities in enzyme properties. The properties used are shared specific enzymatic activity, shared enzymatic function without substrate specificity, shared coenzymes, and shared functional family. The tree shows that the earliest pathways to emerge are not portions of the Krebs cycle but metabolisms of aspartate, asparagine, glutamate, and glutamine. The views of Horowitz (Horowitz, N. H. (1945) Proc. Natl. Acad. Sci. U. S. A. 31, 153-157) and Cordón (Cordón, F. (1990) Tratado Evolucionista de Biologia, Aguilar, Madrid, Spain), according to which the upstream reactions in the catabolic pathways and the downstream reactions in the anabolic pathways are the earliest in evolution, are globally corroborated, although with some exceptions. These are due to later opportunistic connections of pathways (actually already suggested by these authors). The earliest enzymatic functions were mostly catabolic: deaminations, transaminations, and decarboxylations. From the consensus tree we extracted four time spans for amino acid metabolism development. For some amino acids, catabolism and biosynthesis occurred at the same time (Asp, Glu, Lys, Leu, Ala, Val, Ile, Pro, Arg). For others, the ultimate reactions that use amino acids as a substrate or as a product are distinct in time, with catabolism preceding anabolism for Asn, Gln, and Cys and anabolism preceding catabolism for Ser, Met, and Thr. Cladistic analysis of the structure of biochemical pathways makes hypotheses in biochemical evolution explicit and parsimonious.

  4. Larvae of the genus Eleodes (Coleoptera, Tenebrionidae): matrix-based descriptions, cladistic analysis, and key to late instars

    PubMed Central

    Smith, Aaron D.; Dornburg, Rebecca; Wheeler, Quentin D.

    2014-01-01

    Abstract Darkling beetle larvae (Coleoptera, Tenebrionidae) are collectively referred to as false wireworms. Larvae from several species in the genus Eleodes are considered to be agricultural pests, though relatively little work has been done to associate larvae with adults of the same species and only a handful of species have been characterized in their larval state. Morphological characters from late instar larvae were examined and coded to produce a matrix in the server-based content management system mx. The resulting morphology matrix was used to produce larval species descriptions, reconstruct a phylogeny, and build a key to the species included in the matrix. Larvae are described for the first time for the following 12 species: Eleodes anthracinus Blaisdell, Eleodes carbonarius (Say), Eleodes caudiferus LeConte, Eleodes extricatus (Say), Eleodes goryi Solier, Eleodes hispilabris (Say), Eleodes nigropilosus LeConte, Eleodes pilosus Horn, Eleodes subnitens LeConte, Eleodes tenuipes Casey, Eleodes tribulus Thomas, and Eleodes wheeleri Aalbu, Smith & Triplehorn. The larval stage of Eleodes armatus LeConte is redescribed with additional characters to differentiate it from the newly described congeneric larvae. PMID:25009429

  5. A cladistic analysis of Aristotle's animal groups in the Historia animalium.

    PubMed

    von Lieven, Alexander Fürst; Humar, Marcel

    2008-01-01

    The Historia animalium (HA) of Aristotle contains an extraordinarily rich compilation of descriptions of animal anatomy, development, and behaviour. It is believed that Aristotle's aim in HA was to describe the correlations of characters rather than to classify or define animal groups. In order to assess whether Aristotle, while organising his character correlations, referred to a pre-existing classification that underlies the descriptions in HA, we carried out a cladistic analysis according to the following procedure: by disentangling 147 species and 40 higher-taxon designations from 157 predicates in the texts, we transcribed Aristotle's descriptions on anatomy and development of animals in books I-V of HA into a character matrix for a cladistic analysis. By analysing the distribution of characters as described in his books, we obtained a non-phylogenetic dendrogram displaying 58 monophyletic groups, 29 of which have equivalents among Aristotle's group designations. Eleven Aristotelian groupings turned out to be non-monophyletic, and six of them are inconsistent with the monophyletic groups. Twelve of the 29 taxa without equivalents in Aristotle's works have equivalents in modern classifications. With this analysis we demonstrate that there is a fairly consistent underlying classification in the zoological works of Aristotle. The peculiarities of Aristotle's character basis are discussed and the dendrogram is compared with a current phylogenetic tree.

  6. Taxonomic revision and cladistic analysis of the Neotropical genus Acrochaeta Wiedemann, 1830 (Diptera: Stratiomyidae: Sarginae).

    PubMed

    Fachin, Diego Aguilar; Amorim, Dalton De Souza

    2015-11-30

    The Neotropical genus Acrochaeta Wiedemann is revised and a cladistic analysis of the genus based on morphological characters is presented. This paper raises the total number of extant Acrochaeta species from 10 to 14 with the description of nine new species, the synonymy of one species, the transfer of five species to other genera and the transfer of one species of Merosargus to Acrochaeta. The new species described (of which eight are from Brazil and one from Bolivia and Peru) are Acrochaeta asapha nov. sp., A. balbii nov. sp., A. dichrostyla nov. sp., A. polychaeta nov. sp., A. pseudofasciata nov. sp., A. pseudopolychaeta nov. sp., A. rhombostyla nov. sp., A. ruschii nov. sp. and A. stigmata nov. sp. The primary types of all Acrochaeta species were studied, at least from photos, and when possible with the study of dissected male or female terminalia. A. mexicana Lindner is proposed as a junior synonym of A. flaveola Bigot. M. chalconota (Brauer) comb. nov., M. degenerata (Lindner) comb. nov., M. longiventris (Enderlein) comb. nov. and M. picta (Brauer) comb. nov. are herein transferred from Acrochaeta to Merosargus Loew, and Chrysochlorina elegans (Perty) comb. nov. is transferred from Acrochaeta to Chrysochlorina James. A. convexifrons (McFadden) comb. nov. is transferred from Merosargus to Acrochaeta. The limits of the genus and its insertion in the Sarginae are considered, and an updated generic diagnosis is provided. All species of the genus are redescribed and diagnosed, and illustrated with photos of the habitus, thorax, wing, and drawings of the antenna and male and female terminalia. Distribution maps are provided for the species, along with an identification key for adults of all species. Parsimony analyses were carried out under equal and implied weights. Our matrix includes 43 terminal taxa--of which 26 are outgroup species from four different sargine genera--and 59 adult morphological characters. The phylogenetic analysis supports the monophyly of

  7. The evolution of the centrifugal visual system of vertebrates. A cladistic analysis and new hypotheses.

    PubMed

    Repérant, J; Médina, M; Ward, R; Miceli, D; Kenigfest, N B; Rio, J P; Vesselkin, N P

    2007-01-01

    In a recent review of the available data concerning the centrifugal visual system (CVS) of vertebrates [Repérant, J., Ward, R., Miceli, D., Rio, J.P., Médina, M., Kenigfest, N.B., Vesselkin, N.P., 2006. The centrifugal visual system of vertebrates: a comparative analysis of its functional anatomical organization, Brain Res. Rev. 52, 1-57], we have shown that this feature of the visual system is not a particularity of birds, but is a permanent component of the vertebrate central nervous system which nevertheless shows considerable morphological and functional variation from one taxonomic group to another. Given these findings, the primary objective of the present article is an attempt to specify the evolutionary significance of this phylogenetic diversity. We begin by drawing up an inventory of this variation under several headings: the intracerebral location of the retinopetal neurons; the mode of intra-retinal arborizations of the centrifugal fibres and the nature of their targets; their neurochemical properties; and the afferent supplies of these neurons. We subsequently discuss these variations, particularly that of the intracerebral location of the retinopetal neurons during development and in adult forms, using the neuromeric terminology and in the framework of cladistic analysis, and seek to interpret them in a phylogenetic context. From this analysis, it becomes evident that the CVS is not a homogeneous entity formed by neurons with a common embryological origin, but rather a collection of at least eight distinct subsystems arising in very different regions of the neuraxis. These are the olfacto-retinal, dorsal thalamo-retinal, ventral thalamo-retinal, pretecto-retinal, tecto-retinal, tegmento-mesencephalo-retinal, dorsal isthmo-retinal and ventral isthmo-retinal systems. The olfacto-retinal system, which is probably absent in Agnatha, appears to be a plesiomorphic characteristic of all Gnathostomata, while on the other hand the tegmento

  8. Cladistic analysis of extant and fossil African papionins using craniodental data.

    PubMed

    Gilbert, Christopher C

    2013-05-01

    This study examines African papionin phylogenetic history through a comprehensive cladistic analysis of extant and fossil craniodental morphology using both quantitative and qualitative characters. To account for the well-documented influence of allometry on the papionin skull, the general allometric coding method was applied to characters determined to be significantly affected by allometry. Results of the analyses suggest that Parapapio, Pliopapio, and Papio izodi are stem African papionin taxa. Crown Plio-Pleistocene African papionin taxa include Gorgopithecus, Lophocebus cf. albigena, Procercocebus, Soromandrillus (new genus defined herein) quadratirostris, and, most likely, Dinopithecus. Furthermore, S. quadratirostris is a member of a clade also containing Mandrillus, Cercocebus, and Procercocebus; ?Theropithecus baringensis is strongly supported as a primitive member of the genus Theropithecus; Gorgopithecus is closely related to Papio and Lophocebus; and Theropithecus is possibly the most primitive crown African papionin taxon. Finally, character transformation analyses identify a series of morphological transformations during the course of papionin evolution. The origin of crown African papionins is diagnosed, at least in part, by the appearance of definitive and well-developed male maxillary ridges and maxillary fossae. Among crown African papionins, Papio, Lophocebus, and Gorgopithecus are further united by the most extensive development of the maxillary fossae. The Soromandrillus/Mandrillus/Cercocebus/Procercocebus clade is diagnosed by upturned nuchal crests (especially in males), widely divergent temporal lines (especially in males), medially oriented maxillary ridges in males, medially oriented inferior petrous processes, and a tendency to enlarge the premolars as an adaptation for hard-object food processing. The adaptive origins of the genus Theropithecus appear associated with a diet requiring an increase in size of the temporalis, the optimal

  9. A Cladistic Analysis of Phenotypic Associations with Haplotypes Inferred from Restriction Endonuclease Mapping. IV. Nested Analyses with Cladogram Uncertainty and Recombination

    PubMed Central

    Templeton, A. R.; Sing, C. F.

    1993-01-01

    We previously developed an analytical strategy based on cladistic theory to identify subsets of haplotypes that are associated with significant phenotypic deviations. Our initial approach was limited to segments of DNA in which little recombination occurs. In such cases, a cladogram can be constructed from the restriction site data to estimate the evolutionary steps that interrelate the observed haplotypes to one another. The cladogram is then used to define a nested statistical design for identifying mutational steps associated with significant phenotypic deviations. The central assumption behind this strategy is that a mutation responsible for a particular phenotypic effect is embedded within the evolutionary history that is represented by the cladogram. The power of this approach depends on the accuracy of the cladogram in portraying the evolutionary history of the DNA region. This accuracy can be diminished both by recombination and by uncertainty in the estimated cladogram topology. In a previous paper, we presented an algorithm for estimating the set of likely cladograms and recombination events. In this paper we present an algorithm for defining a nested statistical design under cladogram uncertainty and recombination. Given the nested design, phenotypic associations can be examined using either a nested analysis of variance (for haploids or homozygous strains) or permutation testing (for outcrossed, diploid gene regions). In this paper we also extend this analytical strategy to include categorical phenotypes in addition to quantitative phenotypes. Some worked examples are presented using Drosophila data sets. These examples illustrate that having some recombination may actually enhance the biological inferences that may be derived from a cladistic analysis. In particular, recombination can be used to assign a physical localization to a given subregion for mutations responsible for significant phenotypic effects. PMID:8100789
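
    The abstract above mentions permutation testing of phenotypic associations within a nested clade design. As a minimal sketch of that kind of test (not the authors' algorithm; the data, clade labels, and function names are invented for illustration), one can compare the observed phenotypic difference between two nested clades against differences obtained after shuffling the clade labels:

```python
# Minimal sketch of a clade-based permutation test for phenotypic association.
# Illustrative only; not the nested analysis described in the paper.
import random

def permutation_test(phenotypes, clade_labels, n_perm=10000, seed=1):
    """Test whether the mean phenotype differs between two nested clades."""
    groups = sorted(set(clade_labels))
    assert len(groups) == 2, "sketch handles a single two-clade contrast"

    def mean_diff(labels):
        a = [p for p, l in zip(phenotypes, labels) if l == groups[0]]
        b = [p for p, l in zip(phenotypes, labels) if l == groups[1]]
        return abs(sum(a) / len(a) - sum(b) / len(b))

    observed = mean_diff(clade_labels)
    rng = random.Random(seed)
    shuffled = list(clade_labels)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)                # break the haplotype-phenotype link
        if mean_diff(shuffled) >= observed:  # as-or-more-extreme difference
            hits += 1
    return observed, (hits + 1) / (n_perm + 1)

# Hypothetical phenotype values and the nested clade each haplotype falls in.
obs, p = permutation_test([4.1, 3.8, 5.2, 5.0, 4.9, 3.7], ["A", "A", "B", "B", "B", "A"])
print(f"observed difference = {obs:.2f}, permutation p = {p:.3f}")
```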

  10. The evolution of the dorsal thalamus of jawed vertebrates, including mammals: cladistic analysis and a new hypothesis.

    PubMed

    Butler, A B

    1994-01-01

    The evolution of the dorsal thalamus in the various lineages of jawed vertebrates has been an enigma, partly due to two prevalent misconceptions: the belief that the multitude of nuclei in the dorsal thalamus of mammals could be meaningfully compared neither with the relatively few nuclei in the dorsal thalamus of anamniotes nor with the intermediate number of dorsal thalamic nuclei of other amniotes, and a definition of the dorsal thalamus that focused too narrowly on the features of the dorsal thalamus of mammals. The cladistic analysis carried out here allows us to recognize which features are plesiomorphic and which apomorphic for the dorsal thalamus of jawed vertebrates and to then reconstruct the major changes that have occurred in the dorsal thalamus over evolution. Embryological data examined in the context of Von Baerian theory (embryos of later-descendant species resemble the embryos of earlier-descendant species to the point of their divergence) support a new 'Dual Elaboration Hypothesis' of dorsal thalamic evolution generated from this cladistic analysis. From the morphotype for an early stage in the embryological development of the dorsal thalamus of jawed vertebrates, the divergent, sequential stages of the development of the dorsal thalamus are derived for each major radiation and compared. The new hypothesis holds that the dorsal thalamus comprises two basic divisions--the collothalamus and the lemnothalamus--that receive their predominant input from the midbrain roof and (plesiomorphically) from lemniscal pathways, including the optic tract, respectively. Where present, the collothalamic, midbrain-sensory relay nuclei are homologous to each other in all vertebrate radiations as discrete nuclei. Within the lemnothalamus, the dorsal lateral geniculate nucleus of mammals and the dorsal lateral optic nucleus of non-synapsid amniotes (diapsid reptiles, birds and turtles) are homologous as discrete nuclei; most or all of the ventral nuclear group

  11. A cladistically based reinterpretation of the taxonomy of two Afrotropical tenebrionid genera Ectateus Koch, 1956 and Selinus Mulsant & Rey, 1853 (Coleoptera, Tenebrionidae, Platynotina).

    PubMed

    Kamiński, Marcin Jan

    2014-01-01

    On the basis of a newly performed cladistic analysis, a new classification of the representatives of two Afrotropical tenebrionid genera, Ectateus Koch, 1956 and Selinus Mulsant & Rey, 1853 sensu Iwan 2002a, is provided. Eleoselinus is described as a new genus. The genus Monodius, previously synonymized with Selinus by Iwan (2002), is redescribed and considered a separate genus. The following new combinations are proposed: Ectateus calcaripes (Gebien, 1904), Monodius laevistriatus (Fairmaire, 1897), Monodius lamottei (Gridelli, 1954), Monodius plicicollis (Fairmaire, 1897), Eleoselinus villiersi (Ardoin, 1965) and Eleoselinus ursynowiensis (Kamiński, 2011). A neotype for Ectateus calcaripes and lectotypes for E. crenatus (Fairmaire, 1897), E. ghesquierei Koch, 1956 and Monodius malaisei malaisei Koch, 1956 are designated to fix the taxonomic status of these taxa. The following synonymies are proposed: Selinus monardi Kaszab, 1951 and Ectateus latipennis Koch, 1956 with E. crenatus (Fairmaire, 1897). Identification keys are provided to all known species of Ectateus sensu novum, Eleoselinus, Monodius and Selinus sensu novum.

  12. Taxonomic revision and cladistic analysis of Avicularia Lamarck, 1818 (Araneae, Theraphosidae, Aviculariinae) with description of three new aviculariine genera

    PubMed Central

    Fukushima, Caroline Sayuri; Bertani, Rogério

    2017-01-01

    Abstract The genus Avicularia Lamarck, 1818 is revised and all species are rediagnosed. The type species, described as Aranea avicularia Linnaeus, 1758, is the oldest mygalomorph species described and its taxonomic history is extensive and confusing. Cladistic analyses using both equal and implied weights were carried out with a matrix of 46 taxa from seven theraphosid subfamilies, and 71 morphological and ecological characters. The optimal cladogram found with Piwe and concavity = 6 suggests Avicularia and Aviculariinae are monophyletic. Subfamily Aviculariinae includes Avicularia Lamarck, 1818, Typhochlaena C. L. Koch, 1850, Tapinauchenius Ausserer, 1871, Stromatopelma Karsch, 1881, Ephebopus Simon, 1892, Psalmopoeus Pocock, 1895, Heteroscodra Pocock, 1899, Iridopelma Pocock, 1901, Pachistopelma Pocock, 1901, Ybyrapora gen. n., Caribena gen. n., and Antillena gen. n. The clade is supported by well-developed scopulae on tarsi and metatarsi, greatly extended laterally. Avicularia synapomorphies are: juveniles bearing black tarsi that contrast with the other, lighter articles; spermathecae with an accentuated outward curvature medially; and a male palpal bulb in which the medial portion of the embolus and the margin of the tegulum form an acute angle in retrolateral view. Avicularia is composed of twelve species, including three new species: Avicularia avicularia (Linnaeus, 1758), Avicularia glauca Simon, 1891, Avicularia variegata (F. O. Pickard-Cambridge, 1896) stat. n., Avicularia minatrix Pocock, 1903, Avicularia taunayi (Mello-Leitão, 1920), Avicularia juruensis Mello-Leitão, 1923, Avicularia rufa Schiapelli & Gerschman, 1945, Avicularia purpurea Kirk, 1990, Avicularia hirschii Bullmer et al. 2006, Avicularia merianae sp. n., Avicularia lynnae sp. n., and Avicularia caei sp. n. Avicularia species are distributed throughout Mexico, Costa Rica, Panama, Trinidad and Tobago, Venezuela, Guyana, Suriname, French Guiana, Colombia, Ecuador, Peru, Bolivia, and Brazil. Three new genera are erected

  13. Investigating the origins of the Irregular satellites using Cladistics

    NASA Astrophysics Data System (ADS)

    Holt, Timothy; Horner, Jonti; Tylor, Christopher; Nesvorny, David; Brown, Adrian; Carter, Brad

    2017-10-01

    The irregular satellites of Jupiter and Saturn are thought to be objects captured during a period of instability in the early solar system. However, the precise origins of these small bodies remain elusive. We use cladistics, a technique traditionally used by biologists, to help constrain the origins of these bodies. Our research contributes to a growing body of work that uses cladistics in astronomy, collectively called astrocladistics. We present one of the first instances of cladistics being used in a planetary science context. The analysis uses physical and compositional characteristics of three prograde Jovian irregular satellites (Themisto, Leda & Himalia), five retrograde Jovian irregular satellites (Ananke, Carme, Pasiphae, Sinope & Callirrhoe), along with Phoebe, a retrograde irregular satellite of Saturn, and several other regular Jovian and Saturnian satellites. Each of these members is a representative of its respective taxonomic group. The irregular satellites are compared with other well-studied solar system bodies, including satellites, terrestrial planets, main belt asteroids, comets, and minor planets. We find that the Jovian irregular satellites cluster with asteroids and Ceres. The Saturnian satellites studied here are found to form an association with the comets, adding to the narrative of exchange between the outer solar system and Saturnian orbital space. Both of these results demonstrate the utility of cladistics as an analysis tool for the planetary sciences.

  14. A New Paleozoic Symmoriiformes (Chondrichthyes) from the Late Carboniferous of Kansas (USA) and Cladistic Analysis of Early Chondrichthyans

    PubMed Central

    Pradel, Alan; Tafforeau, Paul; Maisey, John G.; Janvier, Philippe

    2011-01-01

    Background The relationships of cartilaginous fishes are discussed in the light of well preserved three-dimensional Paleozoic specimens. There is no consensus to date on the interrelationship of Paleozoic chondrichthyans, although three main phylogenetic hypotheses exist in the current literature: 1. the Paleozoic shark-like chondrichthyans, such as the Symmoriiformes, are grouped along with the modern sharks (neoselachians) into a clade which is sister group of holocephalans; 2. the Symmoriiformes are related to holocephalans, whereas the other Paleozoic shark-like chondrichthyans are related to neoselachians; 3. many Paleozoic shark-like chondrichthyans, such as the Symmoriiformes, are stem chondrichthyans, whereas stem and crown holocephalans are sister group to the stem and crown neoselachians in a crown-chondrichthyan clade. This third hypothesis was proposed recently, based mainly on dental characters. Methodology/Principal Findings On the basis of two well preserved chondrichthyan neurocrania from the Late Carboniferous of Kansas, USA, we describe here a new species of Symmoriiformes, Kawichthys moodiei gen. et sp. nov., which was investigated by means of computerized X-ray synchrotron microtomography. We present a new phylogenetic analysis based on neurocranial characters, which supports the third hypothesis and corroborates the hypothesis that crown-group chondrichthyans (Holocephali+Neoselachii) form a tightly-knit group within the chondrichthyan total group, by providing additional, non dental characters. Conclusions/Significance Our results highlight the importance of new well preserved Paleozoic fossils and new techniques of observation, and suggest that a new look at the synapomorphies of the crown-group chondrichthyans would be worthwhile in terms of understanding the adaptive significance of phylogenetically important characters. PMID:21980367

  15. A new paleozoic Symmoriiformes (Chondrichthyes) from the late Carboniferous of Kansas (USA) and cladistic analysis of early chondrichthyans.

    PubMed

    Pradel, Alan; Tafforeau, Paul; Maisey, John G; Janvier, Philippe

    2011-01-01

    The relationships of cartilaginous fishes are discussed in the light of well preserved three-dimensional Paleozoic specimens. There is no consensus to date on the interrelationship of Paleozoic chondrichthyans, although three main phylogenetic hypotheses exist in the current literature: 1. the Paleozoic shark-like chondrichthyans, such as the Symmoriiformes, are grouped along with the modern sharks (neoselachians) into a clade which is sister group of holocephalans; 2. the Symmoriiformes are related to holocephalans, whereas the other Paleozoic shark-like chondrichthyans are related to neoselachians; 3. many Paleozoic shark-like chondrichthyans, such as the Symmoriiformes, are stem chondrichthyans, whereas stem and crown holocephalans are sister group to the stem and crown neoselachians in a crown-chondrichthyan clade. This third hypothesis was proposed recently, based mainly on dental characters. On the basis of two well preserved chondrichthyan neurocrania from the Late Carboniferous of Kansas, USA, we describe here a new species of Symmoriiformes, Kawichthys moodiei gen. et sp. nov., which was investigated by means of computerized X-ray synchrotron microtomography. We present a new phylogenetic analysis based on neurocranial characters, which supports the third hypothesis and corroborates the hypothesis that crown-group chondrichthyans (Holocephali+Neoselachii) form a tightly-knit group within the chondrichthyan total group, by providing additional, non dental characters. Our results highlight the importance of new well preserved Paleozoic fossils and new techniques of observation, and suggest that a new look at the synapomorphies of the crown-group chondrichthyans would be worthwhile in terms of understanding the adaptive significance of phylogenetically important characters.

  16. Revision of the African pollen beetle genera Tarchonanthogethes and Xenostrongylogethes, with insect-host plant relationships, identification key, and cladistic analysis of the Anthystrix genus-complex (Coleoptera: Nitidulidae: Meligethinae).

    PubMed

    Audisio, Paolo; Cline, Andrew R; Trizzino, Marco; Mancini, Emiliano; Antonini, Gloria; Sabatelli, Simone; Cerretti, Pierfilippo

    2015-02-19

    The Afrotropical endemic pollen beetle genera Tarchonanthogethes Audisio & Cline and Xenostrongylogethes Audisio & Cline, of the Anthystrix genus-complex, are revised. Eleven new species of Tarchonanthogethes (T. autumnalis, sp. nov., T. bisignatus, sp. nov., T. fasciatus, sp. nov., T. gratiellae, sp. nov., T. hermani, sp. nov., T. hystrix, sp. nov., T. lilliputianus, sp. nov., T. maasai, sp. nov., T. manconiae, sp. nov., T. pectinipes, sp. nov., T. thalycriformis, sp. nov.) and one new Xenostrongylogethes (X. cychramoides, sp. nov.) are described, illustrated and compared with related taxa. Tarchonanthogethes hirtus Kirejtshuk & Easton, 1988 is synonymized with T. martini (syn. nov.). Meligethes assutus Easton, 1960 from Kenya is transferred from Afrogethes Audisio & Cline to Tarchonanthogethes (comb. nov.). Meligethes singularis Grouvelle, 1919 from southern Africa is transferred from Tarchonanthogethes to Meligethinus Grouvelle, 1906 (comb. nov.). Larval host-plants for Tarchonanthogethes and Xenostrongylogethes include dioecious bushes and trees of Tarchonantheae Asteraceae (genera Brachylaena R.Br. and Tarchonanthus L.). All species currently attributed to the genera Anthystrix Kirejtshuk, Sebastiangethes Audisio, Kirk-Spriggs & Cline, Tarchonanthogethes and Xenostrongylogethes (Anthystrix genus-complex) are included in a morphology-based cladistic analysis to provide a rigorous hypothesis of phylogenetic relationships. An identification key to all 25 known species in the Anthystrix genus-complex, including all available data on insect host plant relationships, is presented.

  17. Morphological cladistic study of coregonine fishes

    USGS Publications Warehouse

    Smith, G.R.; Todd, T.N.

    1992-01-01

    A cladistic analysis of 50 characters from 26 taxa of coregonine fishes and two outgroup taxa yields a phylogenetic tree with two major branches, best summarized as two genera - Prosopium and Coregonus. The presence of teeth on the palatine, long maxillae, and long supramaxillae is primitive, whereas loss of teeth, short or notched maxillae, and short supramaxillae are derived traits. P. coulteri and C. huntsmani are morphologically and phylogenetically primitive members of their groups. The widespread species P. cylindraceum and P. williamsoni are morphologically advanced in parallel with the subgenus Coregonus (whitefishes): they share subterminal mouths, short jaws, and reduced teeth. Prosopium gemmifer parallels the ciscoes, subgenus Leucichthys. The whitefishes, C. ussuriensis, C. lavaretus, C. clupeaformis, and C. nasus, are a monophyletic group, the subgenus Coregonus. The subgenus Leucichthys is a diverse, relatively plesiomorphic assemblage, widespread in the Holarctic region. This assemblage includes the inconnu, Stenodus.

  18. Integrating restriction site-associated DNA sequencing (RAD-seq) with morphological cladistic analysis clarifies evolutionary relationships among major species groups of bee orchids

    PubMed Central

    Sramkó, Gábor; Paun, Ovidiu

    2018-01-01

    Abstract Background and Aims Bee orchids (Ophrys) have become the most popular model system for studying reproduction via insect-mediated pseudo-copulation and for exploring the consequent, putatively adaptive, evolutionary radiations. However, despite intensive past research, both the phylogenetic structure and species diversity within the genus remain highly contentious. Here, we integrate next-generation sequencing and morphological cladistic techniques to clarify the phylogeny of the genus. Methods At least two accessions of each of the ten species groups previously circumscribed from large-scale cloned nuclear ribosomal internal transcribed spacer (nrITS) sequencing were subjected to restriction site-associated sequencing (RAD-seq). The resulting matrix of 4159 single nucleotide polymorphisms (SNPs) for 34 accessions was used to construct an unrooted network and a rooted maximum likelihood phylogeny. A parallel morphological cladistic matrix of 43 characters generated both polymorphic and non-polymorphic sets of parsimony trees before being mapped across the RAD-seq topology. Key Results RAD-seq data strongly support the monophyly of nine out of ten groups previously circumscribed using nrITS and resolve three major clades; in contrast, supposed microspecies are barely distinguishable. Strong incongruence separated the RAD-seq trees from both the morphological trees and traditional classifications; mapping of the morphological characters across the RAD-seq topology rendered them far more homoplastic. Conclusions The comparatively high level of morphological homoplasy reflects extensive convergence, whereas the derived placement of the fusca group is attributed to paedomorphic simplification. The phenotype of the most recent common ancestor of the extant lineages is inferred, but it post-dates the majority of the character-state changes that typify the genus. RAD-seq may represent the high-water mark of the contribution of molecular phylogenetics to

  19. Integrating restriction site-associated DNA sequencing (RAD-seq) with morphological cladistic analysis clarifies evolutionary relationships among major species groups of bee orchids.

    PubMed

    Bateman, Richard M; Sramkó, Gábor; Paun, Ovidiu

    2018-01-25

    Bee orchids (Ophrys) have become the most popular model system for studying reproduction via insect-mediated pseudo-copulation and for exploring the consequent, putatively adaptive, evolutionary radiations. However, despite intensive past research, both the phylogenetic structure and species diversity within the genus remain highly contentious. Here, we integrate next-generation sequencing and morphological cladistic techniques to clarify the phylogeny of the genus. At least two accessions of each of the ten species groups previously circumscribed from large-scale cloned nuclear ribosomal internal transcribed spacer (nrITS) sequencing were subjected to restriction site-associated sequencing (RAD-seq). The resulting matrix of 4159 single nucleotide polymorphisms (SNPs) for 34 accessions was used to construct an unrooted network and a rooted maximum likelihood phylogeny. A parallel morphological cladistic matrix of 43 characters generated both polymorphic and non-polymorphic sets of parsimony trees before being mapped across the RAD-seq topology. RAD-seq data strongly support the monophyly of nine out of ten groups previously circumscribed using nrITS and resolve three major clades; in contrast, supposed microspecies are barely distinguishable. Strong incongruence separated the RAD-seq trees from both the morphological trees and traditional classifications; mapping of the morphological characters across the RAD-seq topology rendered them far more homoplastic. The comparatively high level of morphological homoplasy reflects extensive convergence, whereas the derived placement of the fusca group is attributed to paedomorphic simplification. The phenotype of the most recent common ancestor of the extant lineages is inferred, but it post-dates the majority of the character-state changes that typify the genus. RAD-seq may represent the high-water mark of the contribution of molecular phylogenetics to understanding evolution within Ophrys; further progress will require

  20. Revision, cladistic analysis and biogeography of Typhochlaena C. L. Koch, 1850, Pachistopelma Pocock, 1901 and Iridopelma Pocock, 1901 (Araneae, Theraphosidae, Aviculariinae).

    PubMed

    Bertani, Rogério

    2012-01-01

    Three aviculariine genera endemic to Brazil are revised. Typhochlaena C. L. Koch, 1850 is resurrected, including five species; Pachistopelma Pocock, 1901 includes two species; and Iridopelma Pocock, 1901, six species. Nine species are newly described: Typhochlaena amma sp. n., Typhochlaena costae sp. n., Typhochlaena curumim sp. n., Typhochlaena paschoali sp. n., Pachistopelma bromelicola sp. n., Iridopelma katiae sp. n., Iridopelma marcoi sp. n., Iridopelma oliveirai sp. n. and Iridopelma vanini sp. n. Three new synonymies are established: Avicularia pulchra Mello-Leitão, 1933 and Avicularia recifiensis Struchen & Brändle, 1996 are junior synonyms of Pachistopelma rufonigrum Pocock, 1901 syn. n., and Avicularia palmicola Mello-Leitão, 1945 is a junior synonym of Iridopelma hirsutum Pocock, 1901 syn. n. Pachistopelma concolor Caporiacco, 1947 is transferred to Tapinauchenius Ausserer, 1871, making the new combination Tapinauchenius concolor (Caporiacco, 1947) comb. n. Lectotypes are newly designated for Pachistopelma rufonigrum Pocock, 1901, Iridopelma hirsutum Pocock, 1901 and Pachistopelma concolor Caporiacco, 1947. Cladistic analyses using both equal and implied weights were carried out with a matrix comprising 62 characters and 38 terminal taxa. The chosen cladogram found with X-Pee-Wee and concavity 6 suggests they are monophyletic. All species are keyed and mapped, and information on species habitat and area cladograms is presented. Discussion on biogeography and conservation is provided.

  1. Revision, cladistic analysis and biogeography of Typhochlaena C. L. Koch, 1850, Pachistopelma Pocock, 1901 and Iridopelma Pocock, 1901 (Araneae, Theraphosidae, Aviculariinae)

    PubMed Central

    Bertani, Rogério

    2012-01-01

    Abstract Three aviculariine genera endemic to Brazil are revised. Typhochlaena C. L. Koch, 1850 is resurrected, including five species; Pachistopelma Pocock, 1901 includes two species; and Iridopelma Pocock, 1901, six species. Nine species are newly described: Typhochlaena amma sp. n., Typhochlaena costae sp. n., Typhochlaena curumim sp. n., Typhochlaena paschoali sp. n., Pachistopelma bromelicola sp. n., Iridopelma katiae sp. n., Iridopelma marcoi sp. n., Iridopelma oliveirai sp. n. and Iridopelma vanini sp. n. Three new synonymies are established: Avicularia pulchra Mello-Leitão, 1933 and Avicularia recifiensis Struchen & Brändle, 1996 are junior synonyms of Pachistopelma rufonigrum Pocock, 1901 syn. n., and Avicularia palmicola Mello-Leitão, 1945 is a junior synonym of Iridopelma hirsutum Pocock, 1901 syn. n. Pachistopelma concolor Caporiacco, 1947 is transferred to Tapinauchenius Ausserer, 1871, making the new combination Tapinauchenius concolor (Caporiacco, 1947) comb. n. Lectotypes are newly designated for Pachistopelma rufonigrum Pocock, 1901, Iridopelma hirsutum Pocock, 1901 and Pachistopelma concolor Caporiacco, 1947. Cladistic analyses using both equal and implied weights were carried out with a matrix comprising 62 characters and 38 terminal taxa. The chosen cladogram found with X-Pee-Wee and concavity 6 suggests they are monophyletic. All species are keyed and mapped, and information on species habitat and area cladograms is presented. Discussion on biogeography and conservation is provided. PMID:23166476

  2. Brain, calvarium, cladistics: A new approach to an old question, who are modern humans and Neandertals?

    PubMed

    Mounier, Aurélien; Balzeau, Antoine; Caparros, Miguel; Grimaud-Hervé, Dominique

    2016-03-01

    The evolutionary history of the genus Homo is the focus of major research efforts in palaeoanthropology. However, the use of palaeoneurology to infer phylogenies of our genus is rare. Here we use cladistics to test the importance of the brain in differentiating and defining Neandertals and modern humans. The analysis is based on morphological data from the calvarium and endocast of Pleistocene fossils and results in a single most parsimonious cladogram. We demonstrate that the joint use of endocranial and calvarial features with cladistics provides a unique means to understand the evolution of the genus Homo. The main results of this study indicate that: (i) the endocranial features are more phylogenetically informative than the characters from the calvarium; (ii) the specific differentiation of Neandertals and modern humans is mostly supported by well-known calvarial autapomorphies; (iii) the endocranial anatomy of modern humans and Neandertals shows strong similarities, which appeared in the fossil record with the last common ancestor of both species; and (iv) apart from encephalisation, human endocranial anatomy changed tremendously during the end of the Middle Pleistocene. This may be linked to major cultural and technological novelties that had happened by the end of the Middle Pleistocene (e.g., expansion of the Middle Stone Age (MSA) in Africa and Mousterian in Europe). The combined study of endocranial and exocranial anatomy offers opportunities to further understand human evolution and the implications for the phylogeny of our genus.

  3. Pattern cladistics and the 'realism-antirealism debate' in the philosophy of biology.

    PubMed

    Vergara-Silva, Francisco

    2009-06-01

    Despite the amount of work that has been produced on the subject over the years, the 'transformation of cladistics' is still a misunderstood episode in the history of comparative biology. Here, I analyze two outstanding, highly contrasting historiographic accounts on the matter, under the perspective of an influential dichotomy in the philosophy of science: the opposition between Scientific Realism and Empiricism. Placing special emphasis on the notion of 'causal grounding' of morphological characters (sensu Olivier Rieppel) in modern developmental biology's (mechanistic) theories, I arrive at the conclusion that a 'new transformation of cladistics' is philosophically plausible. This 'reformed' understanding of 'pattern cladistics' entails retaining the interpretation of cladograms as 'schemes of synapomorphies', but in association to construing cladogram nodes as 'developmental-genetic taxic homologies', instead of 'standard Darwinian ancestors'. The reinterpretation of pattern cladistics presented here additionally proposes to take Bas Van Fraassen's 'constructive empiricism' as a philosophical stance that could properly support such analysis of developmental-genetic data for systematic purposes. The latter suggestion is justified through a reappraisal of previous ideas developed by prominent pattern cladists (mainly, Colin Patterson), which concerned a scientifically efficient 'observable/non-observable distinction' linked to the conceptual pair 'ontogeny and phylogeny'. Finally, I argue that a robust articulation of Antirealist alternatives in systematics may provide a rational basis for its disciplinary separation from evolutionary biology, as well as for a critical reconsideration of the proper role of certain Scientific Realist positions, currently popular in comparative biology.

  4. dCITE: Measuring Necessary Cladistic Information Can Help You Reduce Polytomy Artefacts in Trees.

    PubMed

    Wise, Michael J

    2016-01-01

    Biologists regularly create phylogenetic trees to better understand the evolutionary origins of their species of interest, and often use genomes as their data source. However, as more and more incomplete genomes are published, in many cases it may not be possible to compute genome-based phylogenetic trees due to large gaps in the assembled sequences. In addition, comparison of complete genomes may not even be desirable due to the presence of horizontally acquired and homologous genes. A decision must therefore be made about which gene, or gene combinations, should be used to compute a tree. Deflated Cladistic Information based on Total Entropy (dCITE) is proposed as an easily computed metric for measuring the cladistic information in multiple sequence alignments representing a range of taxa, without the need to first compute the corresponding trees. dCITE scores can be used to rank candidate genes or decide whether input sequences provide insufficient cladistic information, making artefactual polytomies more likely. The dCITE method can be applied to protein, nucleotide or encoded phenotypic data, so can be used to select which data-type is most appropriate, given the choice. In a series of experiments the dCITE method was compared with related measures. Then, as a practical demonstration, the ideas developed in the paper were applied to a dataset representing species from the order Campylobacterales; trees based on sequence combinations, selected on the basis of their dCITE scores, were compared with a tree constructed to mimic Multi-Locus Sequence Typing (MLST) combinations of fragments. We see that the greater the dCITE score the more likely it is that the computed phylogenetic tree will be free of artefactual polytomies. Secondly, cladistic information saturates, beyond which little additional cladistic information can be obtained by adding additional sequences. Finally, sequences with high cladistic information produce more consistent trees for the same taxa.
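
    The abstract above does not give the dCITE formula itself. As a hedged sketch of the kind of entropy computation such an alignment-information metric operates on (and nothing more; the function names and toy alignment below are illustrative assumptions, not the published method), one can sum per-column Shannon entropies across an alignment:

```python
# Illustrative sketch only: a per-column Shannon-entropy score for an alignment.
# This is NOT the published dCITE formula; it only shows the kind of quantity an
# entropy-based cladistic-information metric is built on.
import math
from collections import Counter

def column_entropy(column):
    """Shannon entropy (bits) of the residue distribution in one alignment column."""
    counts = Counter(c for c in column if c != "-")   # ignore gap characters
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def alignment_information(seqs):
    """Total entropy summed over the columns of an equal-length alignment."""
    assert len({len(s) for s in seqs}) == 1, "sequences must be aligned"
    return sum(column_entropy(col) for col in zip(*seqs))

# Hypothetical toy alignment for four taxa.
aln = ["ACGTACGA", "ACGTACGG", "ACTTACGA", "ACTAACGG"]
print(f"summed column entropy = {alignment_information(aln):.3f} bits")
```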

  5. dCITE: Measuring Necessary Cladistic Information Can Help You Reduce Polytomy Artefacts in Trees

    PubMed Central

    2016-01-01

    Biologists regularly create phylogenetic trees to better understand the evolutionary origins of their species of interest, and often use genomes as their data source. However, as more and more incomplete genomes are published, in many cases it may not be possible to compute genome-based phylogenetic trees due to large gaps in the assembled sequences. In addition, comparison of complete genomes may not even be desirable due to the presence of horizontally acquired and homologous genes. A decision must therefore be made about which gene, or gene combinations, should be used to compute a tree. Deflated Cladistic Information based on Total Entropy (dCITE) is proposed as an easily computed metric for measuring the cladistic information in multiple sequence alignments representing a range of taxa, without the need to first compute the corresponding trees. dCITE scores can be used to rank candidate genes or decide whether input sequences provide insufficient cladistic information, making artefactual polytomies more likely. The dCITE method can be applied to protein, nucleotide or encoded phenotypic data, so can be used to select which data-type is most appropriate, given the choice. In a series of experiments the dCITE method was compared with related measures. Then, as a practical demonstration, the ideas developed in the paper were applied to a dataset representing species from the order Campylobacterales; trees based on sequence combinations, selected on the basis of their dCITE scores, were compared with a tree constructed to mimic Multi-Locus Sequence Typing (MLST) combinations of fragments. We see that the greater the dCITE score the more likely it is that the computed phylogenetic tree will be free of artefactual polytomies. Secondly, cladistic information saturates, beyond which little additional cladistic information can be obtained by adding additional sequences. Finally, sequences with high cladistic information produce more consistent trees for the same taxa

  6. Multivariate and Cladistic Analyses of Isolated Teeth Reveal Sympatry of Theropod Dinosaurs in the Late Jurassic of Northern Germany.

    PubMed

    Gerke, Oliver; Wings, Oliver

    2016-01-01

    Remains of theropod dinosaurs are very rare in Northern Germany because the area was repeatedly submerged by a shallow epicontinental sea during the Mesozoic. Here, 80 Late Jurassic theropod teeth are described, of which the majority were collected over decades from marine carbonates in 19th-century quarries that are now abandoned and backfilled. Eighteen different morphotypes (A-R) could be distinguished, and 3D models based on micro-CT scans of the best examples of all morphotypes are included as supplements. The teeth were identified with the assistance of discriminant function analysis and cladistic analysis based on updated data matrices. The results show that a large variety of theropod groups were present in the Late Jurassic of northern Germany. Identified specimens comprise basal Tyrannosauroidea, as well as Allosauroidea, Megalosauroidea cf. Marshosaurus, Megalosauridae cf. Torvosaurus and probably Ceratosauria. The formerly reported presence of Dromaeosauridae in the Late Jurassic of northern Germany could not be confirmed. Some teeth of this study resemble specimens described as pertaining to Carcharodontosauria (morphotype A) and Abelisauridae (morphotype K). This interpretation is, however, not supported by discriminant function analysis and cladistic analysis. Two smaller morphotypes (N and Q) differ only in some probably size-related characteristics from larger morphotypes (B and C) and could well represent juveniles of adult specimens. The similarity of the northern German theropods to groups from contemporaneous localities suggests faunal exchange via land connections in the Late Jurassic between Germany, Portugal and North America.

  7. Cladistic Analysis of Olfactory and Vomeronasal Systems

    PubMed Central

    Ubeda-Bañon, Isabel; Pro-Sistiaga, Palma; Mohedano-Moriano, Alicia; Saiz-Sanchez, Daniel; de la Rosa-Prieto, Carlos; Gutierrez-Castellanos, Nicolás; Lanuza, Enrique; Martinez-Garcia, Fernando; Martinez-Marcos, Alino

    2010-01-01

    Most tetrapods possess two nasal organs for detecting chemicals in their environment, which are the sensory detectors of the olfactory and vomeronasal systems. The seventies’ view that the olfactory system was only devoted to sense volatiles, whereas the vomeronasal system was exclusively specialized for pheromone detection, was challenged by accumulating data showing deep anatomical and functional interrelationships between both systems. In addition, the assumption that the vomeronasal system appeared as an adaptation to terrestrial life is being questioned as well. The aim of the present work is to use a comparative strategy to gain insight into our understanding of the evolution of chemical “cortex.” We have analyzed the organization of the olfactory and vomeronasal cortices of reptiles, marsupials, and placental mammals and we have compared our findings with data from other taxa in order to better understand the evolutionary history of the nasal sensory systems in vertebrates. The olfactory and vomeronasal cortices have been re-investigated in garter snakes (Thamnophis sirtalis), short-tailed opossums (Monodelphis domestica), and rats (Rattus norvegicus) by tracing the efferents of the main and accessory olfactory bulbs using injections of neuroanatomical anterograde tracers (dextran-amines). In snakes, the medial olfactory tract is quite evident, whereas the main vomeronasal-recipient structure, the nucleus sphaericus, is a folded cortical-like structure, located at the caudal edge of the amygdala. In marsupials, which are acallosal mammals, the rhinal fissure is relatively dorsal and the olfactory and vomeronasal cortices relatively expanded. Placental mammals, like marsupials, show partially overlapping olfactory and vomeronasal projections in the rostral basal telencephalon. These data raise the interesting question of how the telencephalon has been re-organized in different groups according to the biological relevance of chemical senses. PMID:21290004

  8. Cladistic analysis of olfactory and vomeronasal systems.

    PubMed

    Ubeda-Bañon, Isabel; Pro-Sistiaga, Palma; Mohedano-Moriano, Alicia; Saiz-Sanchez, Daniel; de la Rosa-Prieto, Carlos; Gutierrez-Castellanos, Nicolás; Lanuza, Enrique; Martinez-Garcia, Fernando; Martinez-Marcos, Alino

    2011-01-01

    Most tetrapods possess two nasal organs for detecting chemicals in their environment, which are the sensory detectors of the olfactory and vomeronasal systems. The seventies' view that the olfactory system was only devoted to sense volatiles, whereas the vomeronasal system was exclusively specialized for pheromone detection, was challenged by accumulating data showing deep anatomical and functional interrelationships between both systems. In addition, the assumption that the vomeronasal system appeared as an adaptation to terrestrial life is being questioned as well. The aim of the present work is to use a comparative strategy to gain insight into our understanding of the evolution of chemical "cortex." We have analyzed the organization of the olfactory and vomeronasal cortices of reptiles, marsupials, and placental mammals and we have compared our findings with data from other taxa in order to better understand the evolutionary history of the nasal sensory systems in vertebrates. The olfactory and vomeronasal cortices have been re-investigated in garter snakes (Thamnophis sirtalis), short-tailed opossums (Monodelphis domestica), and rats (Rattus norvegicus) by tracing the efferents of the main and accessory olfactory bulbs using injections of neuroanatomical anterograde tracers (dextran-amines). In snakes, the medial olfactory tract is quite evident, whereas the main vomeronasal-recipient structure, the nucleus sphaericus, is a folded cortical-like structure, located at the caudal edge of the amygdala. In marsupials, which are acallosal mammals, the rhinal fissure is relatively dorsal and the olfactory and vomeronasal cortices relatively expanded. Placental mammals, like marsupials, show partially overlapping olfactory and vomeronasal projections in the rostral basal telencephalon. These data raise the interesting question of how the telencephalon has been re-organized in different groups according to the biological relevance of chemical senses.

  9. Mitochondrial DNA haplogroup phylogeny of the dog: Proposal for a cladistic nomenclature.

    PubMed

    Fregel, Rosa; Suárez, Nicolás M; Betancor, Eva; González, Ana M; Cabrera, Vicente M; Pestano, José

    2015-05-01

    Canis lupus familiaris mitochondrial DNA analysis has increased in recent years, not only for the purpose of deciphering dog domestication but also for forensic genetic studies or breed characterization. The resultant accumulation of data has increased the need for a normalized and phylogenetic-based nomenclature like those provided for human maternal lineages. Although a standardized classification has been proposed, haplotype names within clades have been assigned gradually without considering the evolutionary history of dog mtDNA. Moreover, this classification is based only on the D-loop region, proven to be insufficient for phylogenetic purposes due to its high number of recurrent mutations and the lack of relevant information present in the coding region. In this study, we design 1) a refined mtDNA cladistic nomenclature from a phylogenetic tree based on complete sequences, classifying dog maternal lineages into haplogroups defined by specific diagnostic mutations, and 2) a coding region SNP analysis that allows a more accurate classification into haplogroups when combined with D-loop sequencing, thus improving the phylogenetic information obtained in dog mitochondrial DNA studies.
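
    As a purely illustrative sketch of what "classification into haplogroups defined by specific diagnostic mutations" can mean in practice (the haplogroup names, positions, and bases below are invented and are not the nomenclature proposed in the paper), one could match a sample's coding-region calls against a table of diagnostic SNPs:

```python
# Hedged sketch: assigning a sample to a haplogroup from diagnostic positions.
# Haplogroup names, positions, and bases are invented for illustration only.
DIAGNOSTIC_SNPS = {                 # haplogroup -> {position: derived base}
    "A": {1034: "T", 5243: "G"},
    "B": {1034: "T", 8020: "A"},
    "C": {2761: "C"},
}

def assign_haplogroup(sample_calls):
    """Return haplogroups whose diagnostic mutations are all present in the sample."""
    matches = []
    for hg, snps in DIAGNOSTIC_SNPS.items():
        if all(sample_calls.get(pos) == base for pos, base in snps.items()):
            matches.append(hg)
    return matches or ["unassigned"]

# Hypothetical coding-region calls for one dog.
print(assign_haplogroup({1034: "T", 5243: "G", 8020: "C"}))   # -> ['A']
```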

  10. Cognitive cladistics and cultural override in Hominid spatial cognition

    PubMed Central

    Haun, Daniel B. M.; Rapold, Christian J.; Call, Josep; Janzen, Gabriele; Levinson, Stephen C.

    2006-01-01

    Current approaches to human cognition often take a strong nativist stance based on Western adult performance, backed up where possible by neonate and infant research and almost never by comparative research across the Hominidae. Recent research suggests considerable cross-cultural differences in cognitive strategies, including relational thinking, a domain where infant research is impossible because of lack of cognitive maturation. Here, we apply the same paradigm across children and adults of different cultures and across all nonhuman great ape genera. We find that both child and adult spatial cognition systematically varies with language and culture but that, nevertheless, there is a clear inherited bias for one spatial strategy in the great apes. It is reasonable to conclude, we argue, that language and culture mask the native tendencies in our species. This cladistic approach suggests that the correct perspective on human cognition is neither nativist uniformitarian nor “blank slate” but recognizes the powerful impact that language and culture can have on our shared primate cognitive biases. PMID:17079489

  11. Evo-SETI: A Mathematical Tool for Cladistics, Evolution, and SETI

    PubMed Central

    Maccone, Claudio

    2017-01-01

    The discovery of new exoplanets makes us wonder where each new exoplanet stands along its way to develop life as we know it on Earth. Our Evo-SETI Theory is a mathematical way to face this problem. We describe cladistics and evolution by virtue of a few statistical equations based on lognormal probability density functions (pdf) in time. We call b-lognormal a lognormal pdf starting at instant b (birth). Then, the lifetime of any living being becomes a suitable b-lognormal in time. Next, our "Peak-Locus Theorem" translates cladistics: each species created by evolution is a b-lognormal whose peak lies on the exponentially growing number of living species. This exponential is the mean value of a stochastic process called "Geometric Brownian Motion" (GBM). Past mass extinctions were all lows of this GBM. In addition, the Shannon Entropy (with a reversed sign) of each b-lognormal is the measure of how evolved that species is, and we call it EvoEntropy. The "molecular clock" is re-interpreted as the EvoEntropy straight line in time whenever the mean value is exactly the GBM exponential. We were also able to extend the Peak-Locus Theorem to any mean value other than the exponential. For example, we derive in this paper for the first time the EvoEntropy corresponding to the Markov-Korotayev (2007) "cubic" evolution: a curve of logarithmic increase. PMID:28383497
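
    For reference, a b-lognormal as described above is the ordinary lognormal density shifted to start at the birth instant b; a minimal rendering of the density and of its differential (Shannon) entropy, which is unchanged by the shift b and whose sign Maccone reverses (up to units, bits versus nats, and the normalization chosen in the paper) to obtain EvoEntropy, is:

```latex
% b-lognormal density starting at birth instant b (defined for t > b)
f(t;\mu,\sigma,b) \;=\; \frac{1}{(t-b)\,\sigma\sqrt{2\pi}}
  \exp\!\left(-\frac{\bigl(\ln(t-b)-\mu\bigr)^{2}}{2\sigma^{2}}\right),
  \qquad t > b .

% Differential (Shannon) entropy of a lognormal; the shift b does not change it.
H \;=\; \mu + \tfrac{1}{2} + \ln\!\bigl(\sigma\sqrt{2\pi}\bigr).
```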

  12. Evo-SETI: A Mathematical Tool for Cladistics, Evolution, and SETI.

    PubMed

    Maccone, Claudio

    2017-04-06

    The discovery of new exoplanets makes us wonder where each new exoplanet stands along its way to develop life as we know it on Earth. Our Evo-SETI Theory is a mathematical way to face this problem. We describe cladistics and evolution by virtue of a few statistical equations based on lognormal probability density functions (pdf) in time. We call b-lognormal a lognormal pdf starting at instant b (birth). Then, the lifetime of any living being becomes a suitable b-lognormal in time. Next, our "Peak-Locus Theorem" translates cladistics: each species created by evolution is a b-lognormal whose peak lies on the exponentially growing number of living species. This exponential is the mean value of a stochastic process called "Geometric Brownian Motion" (GBM). Past mass extinctions were all lows of this GBM. In addition, the Shannon Entropy (with a reversed sign) of each b-lognormal is the measure of how evolved that species is, and we call it EvoEntropy. The "molecular clock" is re-interpreted as the EvoEntropy straight line in time whenever the mean value is exactly the GBM exponential. We were also able to extend the Peak-Locus Theorem to any mean value other than the exponential. For example, we derive in this paper for the first time the EvoEntropy corresponding to the Markov-Korotayev (2007) "cubic" evolution: a curve of logarithmic increase.

  13. A Cladist is a systematist who seeks a natural classification: some comments on Quinn (2017).

    PubMed

    Williams, David M; Ebach, Malte C

    2018-01-01

    In response to Quinn (Biol Philos, 2017. 10.1007/s10539-017-9577-z) we identify cladistics to be about natural classifications and their discovery and thereby propose to add an eighth cladistic definition to Quinn's list, namely the systematist who seeks to discover natural classifications, regardless of their affiliation, theoretical or methodological justifications.

  14. Morphological cladistic analysis of eight popular Olive (Olea europaea L.) cultivars grown in Saudi Arabia using Numerical Taxonomic System for personal computer to detect phyletic relationship and their proximate fruit composition

    PubMed Central

    Al-Ruqaie, I.; Al-Khalifah, N.S.; Shanavaskhan, A.E.

    2015-01-01

    Varietal identification of olives is an intrinsic and empirical exercise owing to the large number of synonyms and homonyms, intensive exchange of genotypes, presence of varietal clones and lack of proper certification in nurseries. A comparative study of the morphological characters of eight olive cultivars grown in Saudi Arabia was carried out and analyzed using the NTSYSpc (Numerical Taxonomy System for personal computer) system, which segregated the smaller-fruited cultivars into one clade and the rest into two clades. Koroneiki, a Greek cultivar with small fruit, shared an arm with the Spanish variety Arbosana. Morphologic analysis using NTSYSpc revealed that biometrics of leaves, fruits and seeds are reliable morphologic characters to distinguish between varieties, except for a few morphologically very similar olive cultivars. The proximate analysis showed significant variations in the protein, fiber, crude fat, ash and moisture content of different cultivars. The study also showed that neither the size of fruit nor the fruit pulp thickness is a limiting factor determining crude fat content of olives. PMID:26858547

  15. A phylogenetic analysis of the megadiverse Chalcidoidea (Hymenoptera)

    USDA-ARS?s Scientific Manuscript database

    Chalcidoidea (Hymenoptera) are extremely diverse with an estimated 500,000 species. We present the first phylogenetic analysis of the superfamily based on a cladistic analysis of both morphological and molecular data. A total of 233 morphological characters were scored for 300 taxa and 265 genera, a...

  16. Cladistic analyses of behavioural variation in wild Pan troglodytes: exploring the chimpanzee culture hypothesis.

    PubMed

    Lycett, Stephen J; Collard, Mark; McGrew, William C

    2009-10-01

    Long-term field studies have revealed considerable behavioural differences among groups of wild Pan troglodytes. Here, we report three sets of cladistic analyses that were designed to shed light on issues relating to this interpopulation variation that are of particular relevance to palaeoanthropology. In the first set of analyses, we focused on the proximate cause of the variation. Some researchers have argued that it is cultural, while others have suggested that it is the result of genetic differences. Because the eastern and western subspecies of P. troglodytes are well differentiated genetically while groups within the subspecies are not, we reasoned that if the genetic hypothesis is correct, the phylogenetic signal should be stronger when data from the eastern and western subspecies are analysed together compared to when data from only the eastern subspecies are analysed. Using randomisation procedures, we found that the phylogenetic signal was substantially stronger within a single subspecies than with two. The results of the first set of analyses, therefore, were inconsistent with the predictions of the genetic hypothesis. The other two sets of analyses built on the results of the first and assumed that the intergroup behavioural variation is cultural in nature. Recent work has shown that, contrary to what anthropologists and archaeologists have long believed, vertical intergroup transmission is often more important than horizontal intergroup transmission in human cultural evolution. In the second set of analyses, we sought to determine how important vertical transmission has been in the evolution of chimpanzee cultural diversity. The first analysis we carried out indicated that the intergroup similarities and differences in behaviour are consistent with the divergence of the western and eastern subspecies, which is what would be expected if vertical intergroup transmission has been the dominant process. In the second analysis, we found that the

  17. Cladistic biogeography of Juglans (Juglandaceae) based on chloroplast DNA intergenic spacer sequences

    USDA-ARS?s Scientific Manuscript database

    The phylogenetic utility of sequence variation from five chloroplast DNA intergenic spacer (IGS) regions: trnT-trnF, psbA-trnH, atpB-rbcL, trnV-16S rRNA, and trnS-trnfM was examined in the genus Juglans. A total of seventeen taxa representing the four sections within Juglans and an outgroup taxon, ...

  18. A phylogenetic analysis of the Gruiformes (Aves) based on morphological characters, with an emphasis on the rails (Rallidae)

    PubMed Central

    Livezey, B. C.

    1998-01-01

    The order Gruiformes, for which even familial composition remains controversial, is perhaps the least well understood avian order from a phylogenetic perspective. The history of the systematics of the order is presented, and the ecological and biogeographic characteristics of its members are summarized. Using cladistic techniques, phylogenetic relationships among fossil and modern genera of the Gruiformes were estimated based on 381 primarily osteological characters; relationships among modern species of Grues (Psophiidae, Aramidae, Gruidae, Heliornithidae and Rallidae) were assessed based on these characters augmented by 189 characters of the definitive integument. A strict consensus tree for 20,000 shortest trees compiled for the matrix of gruiform genera (length = 967, CI = 0.517) revealed a number of nodes common to the solution set, many of which were robust to bootstrapping and had substantial support (Bremer) indices. Robust nodes included those supporting: a sister relationship between the Pedionomidae and Turnicidae; monophyly of the Gruiformes exclusive of the Pedionomidae and Turnicidae; a sister relationship between the Cariamidae and Phorusrhacoidea; a sister relationship between a clade comprising Eurypyga and Messelornis and one comprising Rhynochetos and Aptornis; monophyly of the Grues (Psophiidae, Aramidae, Gruidae, Heliornithidae and Rallidae); monophyly of a clade (Gruoidea) comprising (in order of increasingly close relationship) Psophia, Aramus, Balearica and other Gruidae, with monophyly of each member in this series confirmed; a sister relationship between the Heliornithidae and Rallidae; and monophyly of the Rallidae exclusive of Himantornis. Autapomorphic divergence was comparatively high for Pedionomus, Eurypyga, Psophia, Himantornis and Fulica; extreme autapomorphy, much of which is unique for the order, characterized the extinct, flightless Aptornis. In the species-level analysis of modern Grues, special efforts were made to limit the

  19. A New Morphological Phylogeny of the Ophiuroidea (Echinodermata) Accords with Molecular Evidence and Renders Microfossils Accessible for Cladistics

    PubMed Central

    Thuy, Ben; Stöhr, Sabine

    2016-01-01

    Ophiuroid systematics is currently in a state of upheaval, with recent molecular estimates fundamentally clashing with traditional, morphology-based classifications. Here, we attempt a long overdue recast of a morphological phylogeny estimate of the Ophiuroidea taking into account latest insights on microstructural features of the arm skeleton. Our final estimate is based on a total of 45 ingroup taxa, including 41 recent species covering the full range of extant ophiuroid higher taxon diversity and 4 fossil species known from exceptionally preserved material, and the Lower Carboniferous Aganaster gregarius as the outgroup. A total of 130 characters were scored directly on specimens. The tree resulting from the Bayesian inference analysis of the full data matrix is reasonably well resolved and well supported, and refutes all previous classifications, with most traditional families discredited as poly- or paraphyletic. In contrast, our tree agrees remarkably well with the latest molecular estimate, thus paving the way towards an integrated new classification of the Ophiuroidea. Among the characters which were qualitatively found to accord best with our tree topology, we selected a list of potential synapomorphies for future formal clade definitions. Furthermore, an analysis with 13 of the ingroup taxa reduced to the lateral arm plate characters produced a tree which was essentially similar to the full dataset tree. This suggests that dissociated lateral arm plates can be analysed in combination with fully known taxa and thus effectively unlocks the extensive record of fossil lateral arm plates for phylogenetic estimates. Finally, the age and position within our tree implies that the ophiuroid crown-group had started to diversify by the Early Triassic. PMID:27227685

  20. Higher-order phylogeny of modern birds (Theropoda, Aves: Neornithes) based on comparative anatomy. II. Analysis and discussion

    PubMed Central

    LIVEZEY, BRADLEY C; ZUSI, RICHARD L

    2007-01-01

    In recent years, avian systematics has been characterized by a diminished reliance on morphological cladistics of modern taxa, intensive palaeornithological research stimulated by new discoveries and an inundation by analyses based on DNA sequences. Unfortunately, in contrast to significant insights into basal origins, the broad picture of neornithine phylogeny remains largely unresolved. Morphological studies have emphasized characters of use in palaeontological contexts. Molecular studies, following disillusionment with the pioneering, but non-cladistic, work of Sibley and Ahlquist, have differed markedly from each other and from morphological works in both methods and findings. Consequently, at the turn of the millennium, points of robust agreement among schools concerning higher-order neornithine phylogeny have been limited to the two basalmost and several mid-level, primary groups. This paper describes a phylogenetic (cladistic) analysis of 150 taxa of Neornithes, including exemplars from all non-passeriform families, and subordinal representatives of Passeriformes. Thirty-five outgroup taxa encompassing Crocodylia, predominately theropod Dinosauria, and selected Mesozoic birds were used to root the trees. Based on study of specimens and the literature, 2954 morphological characters were defined; these characters have been described in a companion work, approximately one-third of which were multistate (i.e. comprised at least three states), and states within more than one-half of these multistate characters were ordered for analysis. Complete heuristic searches using 10 000 random-addition replicates recovered a total solution set of 97 well-resolved, most-parsimonious trees (MPTs). The set of MPTs was confirmed by an expanded heuristic search based on 10 000 random-addition replicates and a full ratchet-augmented exploration to ascertain global optima. A strict consensus tree of MPTs included only six trichotomies, i.e. nodes differing topologically among MPTs

  1. Knowledge-Based Image Analysis.

    DTIC Science & Technology

    1981-04-01

    Knowledge-based image analysis (report ETL-0258), by George C. Stockman, Barbara A. Lambird, David Lavine, and Laveen N. Kanal. Keywords: extraction, verification, region classification, pattern recognition, image analysis. Only the report header, author list, and keyword list are legible in the scanned record.

  2. Homeopathy and systematics: a systematic analysis of the therapeutic effects of the plant species used in homeopathy.

    PubMed

    Bharatan, V

    2008-07-01

    The therapeutic effects of the plant species used in homeopathy have never been subjected to systematic analysis. A survey of the various Materiae Medicae shows that over 800 plant species are the source of medicines in homeopathy. As these medicines are considered related to one another with respect to their therapeutic effects for treating similar symptoms, the aim is to classify and map them using the concept of homology. This involves placing the discipline of homeopathy into a comparative framework using these plant medicines as taxa, therapeutic effects as characters, and contemporary cladistic techniques to analyse these relationships. The results are compared using cladograms based on different data sets used in biology (e.g. morphological characters and DNA sequences) to test whether similar cladistic patterns exist among these medicines. By classifying the therapeutic actions, genuine homologies can be distinguished from homoplasies. As this is a comparative study it has been necessary first to update the existing nomenclature of the plant species in the homeopathic literature in line with the current International Code of Botanical Nomenclature.

  3. Base compaction specification feasibility analysis.

    DOT National Transportation Integrated Search

    2012-12-01

    The objective of this research is to establish the technical engineering and cost : analysis concepts that will enable WisDOT management to objectively evaluate the : feasibility of switching construction specification philosophies for aggregate base...

  4. Model-Based Safety Analysis

    NASA Technical Reports Server (NTRS)

    Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.

    2006-01-01

    System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.

  5. Hand-Based Biometric Analysis

    NASA Technical Reports Server (NTRS)

    Bebis, George

    2013-01-01

    Hand-based biometric analysis systems and techniques provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis reuses commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.

  6. Hand-Based Biometric Analysis

    NASA Technical Reports Server (NTRS)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis reuses commonly seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.

  7. Wavelet-based polarimetry analysis

    NASA Astrophysics Data System (ADS)

    Ezekiel, Soundararajan; Harrity, Kyle; Farag, Waleed; Alford, Mark; Ferris, David; Blasch, Erik

    2014-06-01

    Wavelet transformation has become a cutting edge and promising approach in the field of image and signal processing. A wavelet is a waveform of effectively limited duration that has an average value of zero. Wavelet analysis is done by breaking up the signal into shifted and scaled versions of the original signal. The key advantage of a wavelet is that it is capable of revealing smaller changes, trends, and breakdown points that are not revealed by other techniques such as Fourier analysis. The phenomenon of polarization has been studied for quite some time and is a very useful tool for target detection and tracking. Long Wave Infrared (LWIR) polarization is beneficial for detecting camouflaged objects and is a useful approach when identifying and distinguishing manmade objects from natural clutter. In addition, the Stokes Polarization Parameters, which are calculated from 0°, 45°, 90°, 135°, right-circular, and left-circular intensity measurements, provide spatial orientations of target features and suppress natural features. In this paper, we propose a wavelet-based polarimetry analysis (WPA) method to analyze Long Wave Infrared Polarimetry Imagery to discriminate targets such as dismounts and vehicles from background clutter. These parameters can be used for image thresholding and segmentation. Experimental results show the wavelet-based polarimetry analysis is efficient and can be used in a wide range of applications such as change detection, shape extraction, target recognition, and feature-aided tracking.
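
    As a minimal sketch of the quantities involved (not the authors' WPA pipeline), the snippet below forms the Stokes images from the four polarizer orientations plus the two circular measurements and then takes one level of a 2-D wavelet decomposition of the degree-of-linear-polarization image; PyWavelets is assumed to be available, and the array shapes and wavelet choice are arbitrary.

```python
# Sketch: Stokes-parameter images from polarimetric intensity measurements,
# followed by a single-level 2-D wavelet decomposition of the DoLP image.
# Not the authors' WPA method; shapes and wavelet choice are assumptions.
import numpy as np
import pywt

def stokes_and_dolp(i0, i45, i90, i135, irc, ilc, eps=1e-9):
    """Compute Stokes images S0..S3 and the degree of linear polarization."""
    s0 = i0 + i90                     # total intensity
    s1 = i0 - i90                     # horizontal vs. vertical
    s2 = i45 - i135                   # +45 deg vs. -45 deg
    s3 = irc - ilc                    # right- vs. left-circular
    dolp = np.sqrt(s1**2 + s2**2) / (s0 + eps)
    return s0, s1, s2, s3, dolp

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = [rng.random((128, 128)) for _ in range(6)]   # toy LWIR frames
    s0, s1, s2, s3, dolp = stokes_and_dolp(*frames)
    # one-level Haar decomposition: approximation + 3 detail sub-bands
    cA, (cH, cV, cD) = pywt.dwt2(dolp, "haar")
    print(cA.shape, cH.shape)          # (64, 64) (64, 64)
```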

  8. Use of Parsimony Analysis to Identify Areas of Endemism of Chinese Birds: Implications for Conservation and Biogeography

    PubMed Central

    Huang, Xiao-Lei; Qiao, Ge-Xia; Lei, Fu-Min

    2010-01-01

    Parsimony analysis of endemicity (PAE) was used to identify areas of endemism (AOEs) for Chinese birds at the subregional level. Four AOEs were identified based on a distribution database of 105 endemic species and using 18 avifaunal subregions as the operating geographical units (OGUs). The four AOEs are the Qinghai-Zangnan Subregion, the Southwest Mountainous Subregion, the Hainan Subregion and the Taiwan Subregion. Cladistic analysis of subregions generally supports the division of China’s avifauna into Palaearctic and Oriental realms. Two PAE area trees were produced from two different distribution datasets (year 1976 and 2007). The 1976 topology has four distinct subregional branches; however, the 2007 topology has three distinct branches. Moreover, three Palaearctic subregions in the 1976 tree clustered together with the Oriental subregions in the 2007 tree. Such topological differences may reflect changes in the distribution of bird species through circa three decades. PMID:20559504
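
    The core data structure in PAE is a presence/absence matrix in which the OGUs (here, avifaunal subregions) play the role of taxa and the endemic species play the role of characters. A minimal sketch of building such a matrix and dumping it in a NEXUS-like block for a parsimony program is shown below; the subregion and species names are made up for illustration.

```python
# Sketch: build a PAE presence/absence matrix (OGUs as rows/"taxa",
# species as columns/"characters") and print it as a simple NEXUS-like block.
# Subregion and species names are made up for illustration.
records = [
    ("Subregion_A", "species_1"), ("Subregion_A", "species_2"),
    ("Subregion_B", "species_2"), ("Subregion_B", "species_3"),
    ("Subregion_C", "species_1"), ("Subregion_C", "species_3"),
]

ogus = sorted({ogu for ogu, _ in records})
species = sorted({sp for _, sp in records})
present = set(records)

matrix = {ogu: "".join("1" if (ogu, sp) in present else "0" for sp in species)
          for ogu in ogus}

print("BEGIN DATA;")
print(f"  DIMENSIONS NTAX={len(ogus)} NCHAR={len(species)};")
print('  FORMAT DATATYPE=STANDARD SYMBOLS="01";')
print("  MATRIX")
for ogu in ogus:
    print(f"    {ogu:<12} {matrix[ogu]}")
print("  ;")
print("END;")
```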

  9. Data depth based clustering analysis

    DOE PAGES

    Jeong, Myeong -Hun; Cai, Yaping; Sullivan, Clair J.; ...

    2016-01-01

    Here, this paper proposes a new algorithm for identifying patterns within data, based on data depth. Such a clustering analysis has an enormous potential to discover previously unknown insights from existing data sets. Many clustering algorithms already exist for this purpose. However, most algorithms are not affine invariant. Therefore, they must operate with different parameters after the data sets are rotated, scaled, or translated. Further, most clustering algorithms, based on Euclidean distance, can be sensitive to noise because they have no global perspective. Parameter selection also significantly affects the clustering results of each algorithm. Unlike many existing clustering algorithms, the proposed algorithm, called data depth based clustering analysis (DBCA), is able to detect coherent clusters after the data sets are affine transformed without changing a parameter. It is also robust to noise because using data depth can measure centrality and outlyingness of the underlying data. Further, it can generate relatively stable clusters by varying the parameter. The experimental comparison with the leading state-of-the-art alternatives demonstrates that the proposed algorithm outperforms DBSCAN and HDBSCAN in terms of affine invariance, and exceeds or matches the robustness to noise of DBSCAN or HDBSCAN. The robustness to parameter selection is also demonstrated through the case study of clustering Twitter data.
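
    To make the data-depth idea concrete, Mahalanobis depth is one classical, affine-invariant depth that assigns high values to central points and low values to outlying ones; the sketch below illustrates that notion only and is not the authors' DBCA algorithm.

```python
# Sketch: Mahalanobis depth, a classical affine-invariant data depth.
# High depth = central point, low depth = outlying point.
# Illustrates the data-depth notion only; this is not the DBCA algorithm.
import numpy as np

def mahalanobis_depth(points):
    """Depth of each row of `points` relative to the whole sample."""
    x = np.asarray(points, dtype=float)
    mean = x.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(x, rowvar=False))   # pseudo-inverse for safety
    diff = x - mean
    d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)  # squared Mahalanobis distance
    return 1.0 / (1.0 + d2)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = rng.normal(size=(200, 2))
    data[0] = [8.0, 8.0]                   # an obvious outlier
    depth = mahalanobis_depth(data)
    print(depth[0], depth.argmin() == 0)   # the outlier gets (near-)minimal depth
```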

  10. Certification-Based Process Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one insures that a product works as intended, and contains no flaws.

  11. The Cladistic Basis for the Phylogenetic Diversity (PD) Measure Links Evolutionary Features to Environmental Gradients and Supports Broad Applications of Microbial Ecology’s “Phylogenetic Beta Diversity” Framework

    PubMed Central

    Faith, Daniel P.; Lozupone, Catherine A.; Nipperess, David; Knight, Rob

    2009-01-01

    The PD measure of phylogenetic diversity interprets branch lengths cladistically to make inferences about feature diversity. PD calculations extend conventional species-level ecological indices to the features level. The “phylogenetic beta diversity” framework developed by microbial ecologists calculates PD-dissimilarities between community localities. Interpretation of these PD-dissimilarities at the feature level explains the framework’s success in producing ordinations revealing environmental gradients. An example gradients space using PD-dissimilarities illustrates how evolutionary features form unimodal response patterns to gradients. This features model supports new application of existing species-level methods that are robust to unimodal responses, plus novel applications relating to climate change, commercial products discovery, and community assembly. PMID:20087461
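
    As a concrete anchor for the PD calculation itself (one common rooted formulation, not the authors' implementation): the PD of a taxon set is the sum of the branch lengths of all branches lying on the paths connecting those taxa to the root, each counted once.

```python
# Sketch: Faith's PD on a tiny rooted tree, as the summed length of the union
# of branches on root-to-taxon paths (one common rooted variant of PD).
# Tree encoded as child -> (parent, branch_length); names are illustrative.
TREE = {
    "A":  ("n1",   1.0),
    "B":  ("n1",   2.0),
    "C":  ("n2",   1.5),
    "D":  ("n2",   0.5),
    "n1": ("root", 3.0),
    "n2": ("root", 1.0),
}

def faith_pd(taxa, tree=TREE):
    """Sum branch lengths of the union of edges on paths from `taxa` to the root."""
    used = set()
    total = 0.0
    for taxon in taxa:
        node = taxon
        while node in tree:                 # walk up until the root
            if node not in used:
                used.add(node)
                total += tree[node][1]
            node = tree[node][0]
    return total

if __name__ == "__main__":
    print(faith_pd({"A", "B"}))    # 1.0 + 2.0 + 3.0 = 6.0
    print(faith_pd({"A", "C"}))    # 1.0 + 3.0 + 1.5 + 1.0 = 6.5
```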

  12. Target Tracking Based Scene Analysis

    DTIC Science & Technology

    1984-08-01

    Only garbled fragments of the report's reference list are legible in the scanned record, including S.T. Barnard and M.A. Fischler, "Computational Stereo", Computing Surveys 14, 1982, pp. 553-572.

  13. Evidence based practice readiness: A concept analysis.

    PubMed

    Schaefer, Jessica D; Welton, John M

    2018-01-15

    To analyse and define the concept "evidence based practice readiness" in nurses. Evidence based practice readiness is a term commonly used in health literature, but without a clear understanding of what readiness means. Concept analysis is needed to define the meaning of evidence based practice readiness. A concept analysis was conducted using Walker and Avant's method to clarify the defining attributes of evidence based practice readiness as well as antecedents and consequences. A Boolean search of PubMed and Cumulative Index for Nursing and Allied Health Literature was conducted and limited to those published after the year 2000. Eleven articles met the inclusion criteria for this analysis. Evidence based practice readiness incorporates personal and organisational readiness. Antecedents include the ability to recognize the need for evidence based practice, ability to access and interpret evidence based practice, and a supportive environment. The concept analysis demonstrates the complexity of the concept and its implications for nursing practice. The four pillars of evidence based practice readiness: nursing, training, equipping and leadership support are necessary to achieve evidence based practice readiness. Nurse managers are in the position to address all elements of evidence based practice readiness. Creating an environment that fosters evidence based practice can improve patient outcomes, decrease health care costs, increase nurses' job satisfaction and decrease nursing turnover. © 2018 John Wiley & Sons Ltd.

  14. Content-based analysis of news video

    NASA Astrophysics Data System (ADS)

    Yu, Junqing; Zhou, Dongru; Liu, Huayong; Cai, Bo

    2001-09-01

    In this paper, we present a schema for content-based analysis of broadcast news video. First, we separate commercials from news using audiovisual features. Then, we automatically organize news programs into a content hierarchy at various levels of abstraction via effective integration of video, audio, and text data available from the news programs. Based on these news video structure and content analysis technologies, a TV news video Library is generated, from which users can retrieve definite news story according to their demands.

  15. Analysis of Toxic and Non-Toxic Alexandrium (Dinophyceae) Species Using Ribosomal RNA Gene Sequences

    DTIC Science & Technology

    1993-02-01

    Only fragments of the report's reference list are legible in the scanned record, including Therriault, J.-C. (1988), Cladistic analysis of electrophoretic variants within the toxic dinoflagellate genus Protogonyaulax, Botanica Marina 31: 39-51, and Hallegraeff, G.M. and Bolch, C.J. (1992), Transport of toxic dinoflagellate cysts via ship's ballast water, Botanica Marina.

  16. Team-Based Care: A Concept Analysis.

    PubMed

    Baik, Dawon

    2017-10-01

    The purpose of this concept analysis is to clarify and analyze the concept of team-based care in clinical practice. Team-based care has garnered attention as a way to enhance healthcare delivery and patient care related to quality and safety. However, there is no consensus on the concept of team-based care; as a result, the lack of common definition impedes further studies on team-based care. This analysis was conducted using Walker and Avant's strategy. Literature searches were conducted using PubMed, Cumulative Index to Nursing and Allied Health Literature (CINAHL), and PsycINFO, with a timeline from January 1985 to December 2015. The analysis demonstrates that the concept of team-based care has three core attributes: (a) interprofessional collaboration, (b) patient-centered approach, and (c) integrated care process. This is accomplished through understanding other team members' roles and responsibilities, a climate of mutual respect, and organizational support. Consequences of team-based care are identified with three aspects: (a) patient, (b) healthcare professional, and (c) healthcare organization. This concept analysis helps better understand the characteristics of team-based care in the clinical practice as well as promote the development of a theoretical definition of team-based care. © 2016 Wiley Periodicals, Inc.

  17. Soft-tissue anatomy of the primates: phylogenetic analyses based on the muscles of the head, neck, pectoral region and upper limb, with notes on the evolution of these muscles

    PubMed Central

    Diogo, R; Wood, B

    2011-01-01

    Apart from molecular data, nearly all the evidence used to study primate relationships comes from hard tissues. Here, we provide details of the first parsimony and Bayesian cladistic analyses of the order Primates based exclusively on muscle data. The most parsimonious tree obtained from the cladistic analysis of 166 characters taken from the head, neck, pectoral and upper limb musculature is fully congruent with the most recent evolutionary molecular tree of Primates. That is, this tree recovers not only the relationships among the major groups of primates, i.e. Strepsirrhini {Tarsiiformes [Platyrrhini (Cercopithecidae, Hominoidea)]}, but it also recovers the relationships within each of these inclusive groups. Of the 301 character state changes occurring in this tree, ca. 30% are non-homoplasic evolutionary transitions; within the 220 changes that are unambiguously optimized in the tree, ca. 15% are reversions. The trees obtained by using characters derived from the muscles of the head and neck are more similar to the most recent evolutionary molecular tree than are the trees obtained by using characters derived from the pectoral and upper limb muscles. It was recently argued that since the Pan/Homo split, chimpanzees accumulated more phenotypic adaptations than humans, but our results indicate that modern humans accumulated more muscle character state changes than chimpanzees, and that both these taxa accumulated more changes than gorillas. This overview of the evolution of the primate head, neck, pectoral and upper limb musculature suggests that the only muscle groups for which modern humans have more muscles than most other extant primates are the muscles of the face, larynx and forearm. PMID:21689100

  18. Soft-tissue anatomy of the primates: phylogenetic analyses based on the muscles of the head, neck, pectoral region and upper limb, with notes on the evolution of these muscles.

    PubMed

    Diogo, R; Wood, B

    2011-09-01

    Apart from molecular data, nearly all the evidence used to study primate relationships comes from hard tissues. Here, we provide details of the first parsimony and Bayesian cladistic analyses of the order Primates based exclusively on muscle data. The most parsimonious tree obtained from the cladistic analysis of 166 characters taken from the head, neck, pectoral and upper limb musculature is fully congruent with the most recent evolutionary molecular tree of Primates. That is, this tree recovers not only the relationships among the major groups of primates, i.e. Strepsirrhini {Tarsiiformes [Platyrrhini (Cercopithecidae, Hominoidea)]}, but it also recovers the relationships within each of these inclusive groups. Of the 301 character state changes occurring in this tree, ca. 30% are non-homoplasic evolutionary transitions; within the 220 changes that are unambiguously optimized in the tree, ca. 15% are reversions. The trees obtained by using characters derived from the muscles of the head and neck are more similar to the most recent evolutionary molecular tree than are the trees obtained by using characters derived from the pectoral and upper limb muscles. It was recently argued that since the Pan/Homo split, chimpanzees accumulated more phenotypic adaptations than humans, but our results indicate that modern humans accumulated more muscle character state changes than chimpanzees, and that both these taxa accumulated more changes than gorillas. This overview of the evolution of the primate head, neck, pectoral and upper limb musculature suggests that the only muscle groups for which modern humans have more muscles than most other extant primates are the muscles of the face, larynx and forearm. © 2011 The Authors. Journal of Anatomy © 2011 Anatomical Society of Great Britain and Ireland.

  19. Robust Mediation Analysis Based on Median Regression

    PubMed Central

    Yuan, Ying; MacKinnon, David P.

    2014-01-01

    Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925
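
    A minimal sketch of the idea (assuming statsmodels is available; this is not the authors' full procedure, which also covers inference and the multilevel extension): fit both the mediator and outcome models by median (0.5-quantile) regression and form the indirect effect as the product of the two path coefficients.

```python
# Sketch: product-of-coefficients mediation with median (0.5-quantile) regression.
# Assumes statsmodels is installed; an illustration, not the authors' full method.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 500
x = rng.normal(size=n)
m = 0.6 * x + rng.standard_t(df=3, size=n)        # heavy-tailed errors
y = 0.5 * m + 0.2 * x + rng.standard_t(df=3, size=n)
data = pd.DataFrame({"X": x, "M": m, "Y": y})

# path a: X -> M; path b and direct effect c': M, X -> Y, both at the median
fit_a = smf.quantreg("M ~ X", data).fit(q=0.5)
fit_b = smf.quantreg("Y ~ M + X", data).fit(q=0.5)

a = fit_a.params["X"]
b = fit_b.params["M"]
print("indirect effect a*b =", a * b)             # close to 0.6 * 0.5 = 0.3
print("direct effect c'    =", fit_b.params["X"])
```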

  20. Wavelet-based image analysis system for soil texture analysis

    NASA Astrophysics Data System (ADS)

    Sun, Yun; Long, Zhiling; Jang, Ping-Rey; Plodinec, M. John

    2003-05-01

    Soil texture is defined as the relative proportion of clay, silt and sand found in a given soil sample. It is an important physical property of soil that affects such phenomena as plant growth and agricultural fertility. Traditional methods used to determine soil texture are either time consuming (hydrometer), or subjective and experience-demanding (field tactile evaluation). Considering that textural patterns observed at soil surfaces are uniquely associated with soil textures, we propose an innovative approach to soil texture analysis, in which wavelet frames-based features representing texture contents of soil images are extracted and categorized by applying a maximum likelihood criterion. The soil texture analysis system has been tested successfully with an accuracy of 91% in classifying soil samples into one of three general categories of soil textures. In comparison with the common methods, this wavelet-based image analysis approach is convenient, efficient, fast, and objective.
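
    A rough sketch of the feature-extraction half of such a system (not the authors' exact wavelet-frame features or their maximum likelihood classifier): compute sub-band energies from a 2-D wavelet decomposition of a soil image and use them as a texture feature vector. PyWavelets is assumed to be available.

```python
# Sketch: wavelet sub-band energies as texture features for a soil image.
# Illustrative only; the paper uses wavelet-frame features and a maximum
# likelihood classifier, neither of which is reproduced exactly here.
import numpy as np
import pywt

def wavelet_energy_features(image, wavelet="db2", level=3):
    """Return the mean energy of each detail sub-band across `level` scales."""
    coeffs = pywt.wavedec2(np.asarray(image, dtype=float), wavelet, level=level)
    features = []
    for detail in coeffs[1:]:               # skip the approximation band
        for band in detail:                 # (horizontal, vertical, diagonal)
            features.append(np.mean(band ** 2))
    return np.array(features)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    soil_image = rng.random((256, 256))     # stand-in for a soil surface image
    print(wavelet_energy_features(soil_image).shape)   # (9,) = 3 levels x 3 bands
```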

  1. Temperature Based Stress Analysis of Notched Members

    DTIC Science & Technology

    1979-03-01

    Only fragments of the scanned report are legible: list-of-figures entries ("Strain Behavior of Mild Steel", "Percent Restoration vs. Residual Stress", "Examples of a Good Weld and Three Defective Welds"), a description of a system for measuring temperatures in deforming metals based on the use of thermistor flakes, used to show that more heating occurs near stress ..., and a note that thermocouples were welded to the specimen surface, an attachment method quite suitable for stress analysis.

  2. Watershed-based Morphometric Analysis: A Review

    NASA Astrophysics Data System (ADS)

    Sukristiyanti, S.; Maria, R.; Lestiana, H.

    2018-02-01

    Drainage basin/watershed analysis based on morphometric parameters is very important for watershed planning. Morphometric analysis of a watershed is the best method to identify the relationships among various aspects of the area. Although many technical papers have dealt with this area of study, there is no standard classification or agreed implication for each parameter, and evaluating the value of every morphometric parameter can be confusing. This paper deals with the meaning of the values of the various morphometric parameters, with adequate contextual information. A critical review is presented on each classification, the range of values, and their implications. Besides classification and its impact, the authors are also concerned with the quality of input data, in terms of both data preparation and the scale/detail level of mapping. This review hopefully gives a comprehensive explanation to assist upcoming research dealing with morphometric analysis.

  3. A dictionary based informational genome analysis

    PubMed Central

    2012-01-01

    Background In the post-genomic era several methods of computational genomics are emerging to understand how the whole information is structured within genomes. The literature of the last five years accounts for several alignment-free methods, which have arisen as alternative metrics for the dissimilarity of biological sequences. Among others, recent approaches are based on empirical frequencies of DNA k-mers in whole genomes. Results Any set of words (factors) occurring in a genome provides a genomic dictionary. About sixty genomes were analyzed by means of informational indexes based on genomic dictionaries, where a systemic view replaces a local sequence analysis. A software prototype applying the methodology outlined here carried out some computations on genomic data. We computed informational indexes and built the genomic dictionaries of different sizes, along with frequency distributions. The software performed three main tasks: computation of informational indexes, storage of these in a database, and index analysis and visualization. The validation was done by investigating genomes of various organisms. A systematic analysis of genomic repeats of several lengths, which is of vivid interest in biology (for example to compute excessively represented functional sequences, such as promoters), was discussed, and suggested a method to define synthetic genetic networks. Conclusions We introduced a methodology based on dictionaries, and an efficient motif-finding software application for comparative genomics. This approach could be extended along many investigation lines, namely exported to other contexts of computational genomics, as a basis for discrimination of genomic pathologies. PMID:22985068
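
    A minimal sketch of the dictionary idea (not the authors' software): collect the k-mer dictionary of a sequence with its empirical frequencies, and derive a simple informational index such as the empirical k-mer entropy.

```python
# Sketch: genomic dictionary of k-mers with empirical frequencies, plus one
# simple informational index (empirical k-mer entropy). Illustrative only.
import math
from collections import Counter

def kmer_dictionary(sequence, k):
    """Counter mapping each k-mer (factor of length k) to its occurrence count."""
    return Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))

def kmer_entropy(sequence, k):
    """Empirical Shannon entropy (bits) of the k-mer frequency distribution."""
    counts = kmer_dictionary(sequence, k)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

if __name__ == "__main__":
    genome = "ACGTACGTTGCAACGTGGTACGT"      # toy sequence
    print(dict(kmer_dictionary(genome, 3)))
    print(round(kmer_entropy(genome, 3), 3))
```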

  4. Chapter 11. Community analysis-based methods

    SciTech Connect

    Cao, Y.; Wu, C.H.; Andersen, G.L.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  5. Surrogate-based Analysis and Optimization

    NASA Technical Reports Server (NTRS)

    Queipo, Nestor V.; Haftka, Raphael T.; Shyy, Wei; Goel, Tushar; Vaidyanathan, Raj; Tucker, P. Kevin

    2005-01-01

    A major challenge to the successful full-scale development of modern aerospace systems is to address competing objectives such as improved performance, reduced costs, and enhanced safety. Accurate, high-fidelity models are typically time consuming and computationally expensive. Furthermore, informed decisions should be made with an understanding of the impact (global sensitivity) of the design variables on the different objectives. In this context, the so-called surrogate-based approach for analysis and optimization can play a very valuable role. The surrogates are constructed using data drawn from high-fidelity models, and provide fast approximations of the objectives and constraints at new design points, thereby making sensitivity and optimization studies feasible. This paper provides a comprehensive discussion of the fundamental issues that arise in surrogate-based analysis and optimization (SBAO), highlighting concepts, methods, techniques, as well as practical implications. The issues addressed include the selection of the loss function and regularization criteria for constructing the surrogates, design of experiments, surrogate selection and construction, sensitivity analysis, convergence, and optimization. The multi-objective optimal design of a liquid rocket injector is presented to highlight the state of the art and to help guide future efforts.
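
    As a toy illustration of the SBAO loop described here (assuming scikit-learn and SciPy are available; the "expensive" model is faked by a cheap function, and the kernel, sample plan, and optimizer settings are arbitrary choices): sample the expensive model at a few design points, fit a Gaussian-process surrogate, and optimize the surrogate instead of the model.

```python
# Sketch of a surrogate-based optimization step: sample an "expensive" model,
# fit a Gaussian-process surrogate, then optimize the surrogate. Illustrative
# only; kernel, sample plan, and optimizer settings are arbitrary choices.
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_model(x):                      # stand-in for a high-fidelity code
    return (x - 0.3) ** 2 + 0.1 * np.sin(15 * x)

# design of experiments: a handful of samples of the expensive model
x_train = np.linspace(0.0, 1.0, 8).reshape(-1, 1)
y_train = expensive_model(x_train).ravel()

surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-6)
surrogate.fit(x_train, y_train)

# optimize the cheap surrogate rather than the expensive model
result = minimize(lambda x: surrogate.predict(x.reshape(1, -1))[0],
                  x0=np.array([0.5]), bounds=[(0.0, 1.0)])
print("surrogate minimum near x =", result.x)
```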

  6. Risk analysis based on hazards interactions

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessments in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  7. NASA Lunar Base Wireless System Propagation Analysis

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Upanavage, Matthew; Sham, Catherine C.

    2007-01-01

    There have been many radio wave propagation studies using both experimental and theoretical techniques over recent years. However, most studies have been in support of commercial cellular phone wireless applications. The signal frequencies are mostly at the commercial cellular and Personal Communications Service bands. The antenna configurations are mostly one on a high tower and one near the ground, to simulate communications between a cellular base station and a mobile unit. There is great interest in wireless communication and sensor systems for NASA lunar missions because of the emerging importance of establishing permanent lunar human exploration bases. Because of the specific lunar terrain geometries and RF frequencies of interest to the NASA missions, much of the published literature for the commercial cellular and PCS bands of 900 and 1800 MHz may not be directly applicable to the lunar base wireless system and environment. There are various communication and sensor configurations required to support all elements of a lunar base, for example, communications between astronauts, between astronauts and lunar vehicles, and between lunar vehicles and satellites in lunar orbit. There are also various wireless sensor systems linking scientific and experimental sensors with data-collection ground stations. This presentation illustrates the propagation analysis of the lunar wireless communication and sensor systems taking into account three-dimensional terrain multipath effects. It is observed that the propagation characteristics are significantly affected by the presence of the lunar terrain. The obtained results indicate that the lunar surface material, terrain geometry and antenna location are important factors affecting the propagation characteristics of the lunar wireless systems. The path loss can be much more severe than in free space propagation and is greatly affected by the antenna height, surface material and operating frequency. The

  8. Service-based analysis of biological pathways

    PubMed Central

    Zheng, George; Bouguettaya, Athman

    2009-01-01

    Background Computer-based pathway discovery is concerned with two important objectives: pathway identification and analysis. Conventional mining and modeling approaches aimed at pathway discovery are often effective at achieving either objective, but not both. Such limitations can be effectively tackled leveraging a Web service-based modeling and mining approach. Results Inspired by molecular recognitions and drug discovery processes, we developed a Web service mining tool, named PathExplorer, to discover potentially interesting biological pathways linking service models of biological processes. The tool uses an innovative approach to identify useful pathways based on graph-based hints and service-based simulation verifying user's hypotheses. Conclusion Web service modeling of biological processes allows the easy access and invocation of these processes on the Web. Web service mining techniques described in this paper enable the discovery of biological pathways linking these process service models. Algorithms presented in this paper for automatically highlighting interesting subgraph within an identified pathway network enable the user to formulate hypothesis, which can be tested out using our simulation algorithm that are also described in this paper. PMID:19796403

  9. Particle Pollution Estimation Based on Image Analysis.

    PubMed

    Liu, Chenbin; Tsow, Francis; Zou, Yi; Tao, Nongjian

    2016-01-01

    Exposure to fine particles can cause various diseases, and an easily accessible method to monitor the particles can help raise public awareness and reduce harmful exposures. Here we report a method to estimate PM air pollution based on analysis of a large number of outdoor images available for Beijing, Shanghai (China) and Phoenix (US). Six image features were extracted from the images, which were used, together with other relevant data, such as the position of the sun, date, time, geographic information and weather conditions, to predict PM2.5 index. The results demonstrate that the image analysis method provides good prediction of PM2.5 indexes, and different features have different significance levels in the prediction.

  10. Particle Pollution Estimation Based on Image Analysis

    PubMed Central

    Liu, Chenbin; Tsow, Francis; Zou, Yi; Tao, Nongjian

    2016-01-01

    Exposure to fine particles can cause various diseases, and an easily accessible method to monitor the particles can help raise public awareness and reduce harmful exposures. Here we report a method to estimate PM air pollution based on analysis of a large number of outdoor images available for Beijing, Shanghai (China) and Phoenix (US). Six image features were extracted from the images, which were used, together with other relevant data, such as the position of the sun, date, time, geographic information and weather conditions, to predict PM2.5 index. The results demonstrate that the image analysis method provides good prediction of PM2.5 indexes, and different features have different significance levels in the prediction. PMID:26828757

  11. Constructing storyboards based on hierarchical clustering analysis

    NASA Astrophysics Data System (ADS)

    Hasebe, Satoshi; Sami, Mustafa M.; Muramatsu, Shogo; Kikuchi, Hisakazu

    2005-07-01

    There are growing needs for quick preview of video contents for the purpose of improving accessibility of video archives as well as reducing network traffic. In this paper, a storyboard that contains a user-specified number of keyframes is produced from a given video sequence. It is based on hierarchical cluster analysis of feature vectors that are derived from wavelet coefficients of video frames. Consistent use of extracted feature vectors is the key to avoiding repeated, computationally intensive parsing of the same video sequence. Experimental results suggest that a significant reduction in computational time is gained by this strategy.
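
    A minimal sketch of the keyframe-selection step under such a strategy (assumptions: per-frame feature vectors are already available, e.g. from wavelet coefficients, and SciPy is installed): cluster the vectors hierarchically into the requested number of groups and keep the frame closest to each cluster centroid.

```python
# Sketch: pick N storyboard keyframes by hierarchical clustering of per-frame
# feature vectors (assumed precomputed) and choosing the frame nearest each
# cluster centroid. Illustrative only, not the authors' exact pipeline.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def select_keyframes(features, n_keyframes):
    """Return indices of representative frames, one per hierarchical cluster."""
    z = linkage(features, method="ward")
    labels = fcluster(z, t=n_keyframes, criterion="maxclust")
    keyframes = []
    for cluster_id in np.unique(labels):
        members = np.where(labels == cluster_id)[0]
        centroid = features[members].mean(axis=0)
        dists = np.linalg.norm(features[members] - centroid, axis=1)
        keyframes.append(int(members[dists.argmin()]))
    return sorted(keyframes)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame_features = rng.random((300, 64))   # 300 frames, 64-d feature vectors
    print(select_keyframes(frame_features, n_keyframes=6))
```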

  12. Power quality analysis based on spatial correlation

    NASA Astrophysics Data System (ADS)

    Li, Jiangtao; Zhao, Gang; Liu, Haibo; Li, Fenghou; Liu, Xiaoli

    2018-03-01

    With industrialization and urbanization, the importance of electricity in production and daily life is growing, so the prediction of power quality has increasing potential significance. Traditional power quality analysis methods include power quality data compression, disturbance event pattern classification, and disturbance parameter calculation. Under certain conditions, these methods can predict power quality. This paper analyses the temporal variation of the power quality of one provincial power grid in China, and the spatial distribution of power quality is analyzed based on spatial autocorrelation. This paper tries to show that this geographic research approach is effective for mining latent information in power quality data.
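
    One standard way to quantify the spatial autocorrelation mentioned here is global Moran's I; a minimal sketch follows, with toy values and weights, and not necessarily the indicator or weighting scheme the authors used.

```python
# Sketch: global Moran's I for a power-quality indicator measured at several
# sites, given a spatial weight matrix. Toy data; the paper's exact indicator
# and weighting scheme may differ.
import numpy as np

def morans_i(values, weights):
    """Global Moran's I: I = (n / sum(W)) * (z' W z) / (z' z)."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    z = x - x.mean()
    n = len(x)
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

if __name__ == "__main__":
    # four sites on a line, rook-style adjacency weights
    values = [0.9, 1.0, 2.1, 2.0]
    weights = np.array([[0, 1, 0, 0],
                        [1, 0, 1, 0],
                        [0, 1, 0, 1],
                        [0, 0, 1, 0]])
    print(round(morans_i(values, weights), 3))   # ~0.33: positive autocorrelation
```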

  13. AR(p)-based detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Alvarez-Ramirez, J.; Rodriguez, E.

    2018-07-01

    Autoregressive models are commonly used for modeling time series from nature, economics and finance. This work explored simple autoregressive AR(p) models to remove long-term trends in detrended fluctuation analysis (DFA). Crude oil prices and the bitcoin exchange rate were considered, with the former corresponding to a mature market and the latter to an emergent market. Results showed that AR(p)-based DFA performs similarly to traditional DFA. However, the former DFA provides information on the stability of long-term trends, which is valuable for understanding and quantifying the dynamics of complex time series from financial systems.

  14. Sequence information gain based motif analysis.

    PubMed

    Maynou, Joan; Pairó, Erola; Marco, Santiago; Perera, Alexandre

    2015-11-09

    The detection of regulatory regions in candidate sequences is essential for the understanding of the regulation of a particular gene and the mechanisms involved. This paper proposes a novel methodology based on information theoretic metrics for finding regulatory sequences in promoter regions. This methodology (SIGMA) has been tested on genomic sequence data for Homo sapiens and Mus musculus. SIGMA has been compared with different publicly available alternatives for motif detection, such as MEME/MAST, Biostrings (Bioconductor package), MotifRegressor, and previous work such as Qresiduals projections or information-theoretic detectors. Comparative results, in the form of Receiver Operating Characteristic curves, show how, in 70% of the studied Transcription Factor Binding Sites, the SIGMA detector has a better performance and behaves more robustly than the methods compared, while having a similar computational time. The performance of SIGMA can be explained by its parametric simplicity in the modelling of the non-linear co-variability in the binding motif positions. Sequence Information Gain based Motif Analysis is a generalisation of a non-linear model of cis-regulatory sequence detection based on Information Theory. This generalisation allows us to detect transcription factor binding sites with maximum performance disregarding the covariability observed in the positions of the training set of sequences. SIGMA is freely available to the public at http://b2slab.upc.edu.
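
    For orientation only, the sketch below computes the standard per-position information content of a set of aligned binding sites, the information-theoretic quantity that detectors of this kind build upon; it is not the SIGMA algorithm itself, and the aligned sites are toy data.

```python
# Sketch: per-position information content (bits) of aligned binding sites,
# the standard information-theoretic quantity underlying motif detectors.
# This is not the SIGMA algorithm itself; the sites are toy data.
import math
from collections import Counter

def information_content(sites, alphabet="ACGT", pseudocount=0.5):
    """Information content per motif position, assuming a uniform background."""
    length = len(sites[0])
    ic = []
    for pos in range(length):
        counts = Counter(site[pos] for site in sites)
        total = sum(counts.get(b, 0) + pseudocount for b in alphabet)
        entropy = 0.0
        for base in alphabet:
            p = (counts.get(base, 0) + pseudocount) / total
            entropy -= p * math.log2(p)
        ic.append(math.log2(len(alphabet)) - entropy)   # 2 bits minus entropy
    return ic

if __name__ == "__main__":
    sites = ["TATAAT", "TATGAT", "TACAAT", "TATACT"]    # toy aligned sites
    print([round(v, 2) for v in information_content(sites)])
```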

  15. Specialized Community-Based Care: An Evidence-Based Analysis

    PubMed Central

    2012-01-01

    Background: Specialized community-based care (SCBC) refers to services that manage chronic illness through formalized links between primary and specialized care. Objectives: The objectives of this evidence-based analysis (EBA) were as follows: to summarize the literature on SCBC, also known as intermediate care; to synthesize the evidence from previous Medical Advisory Secretariat (now Health Quality Ontario) EBAs on SCBC for heart failure, diabetes, chronic obstructive pulmonary disease (COPD), and chronic wounds; and to examine the role of SCBC in family practice. Results, Part 1 (Systematic Review of Intermediate Care): Seven systematic reviews on intermediate care since 2008 were identified. The literature base is complex and difficult to define. There is evidence to suggest that intermediate care is effective in improving outcomes; however, the effective interventions are still uncertain. Part 2 (Synthesis of Evidence in Intermediate Care): Mortality: heart failure, significant reduction in patients receiving SCBC; COPD, nonsignificant reduction. Hospitalization: heart failure, nonsignificant reduction; COPD, significant reduction. Emergency department visits: heart failure, nonsignificant reduction; COPD, significant reduction. Disease-specific patient outcomes: COPD, nonsignificant improvement in lung function; diabetes, significant reduction in hemoglobin A1c (HbA1c) and systolic blood pressure; chronic wounds, significant increase in the proportion of healed wounds. Quality of life: heart failure, trend toward improvement; COPD, significant improvement. Part 3 (Intermediate Care in Family Practice, Evidence-Based Analysis): Five randomized controlled trials were identified comparing SCBC

  16. Recurrence quantity analysis based on matrix eigenvalues

    NASA Astrophysics Data System (ADS)

    Yang, Pengbo; Shang, Pengjian

    2018-06-01

    Recurrence plots are a powerful tool for visualization and analysis of dynamical systems. Recurrence quantification analysis (RQA), based on point density and diagonal and vertical line structures in the recurrence plot, provides alternative measures to quantify the complexity of dynamical systems. In this paper, we present a new measure based on the recurrence matrix to quantify the dynamical properties of a given system. Matrix eigenvalues can reflect the basic characteristics of complex systems, so we show the properties of the system by exploring the eigenvalues of the recurrence matrix. Considering that Shannon entropy has been defined as a complexity measure, we propose the definition of entropy of matrix eigenvalues (EOME) as a new RQA measure. We confirm that EOME can be used as a metric to quantify the behavior changes of the system. As a given dynamical system changes from a non-chaotic to a chaotic regime, the EOME increases as well. Larger EOME values imply higher complexity and lower predictability. We also study the effect of some factors on EOME, including data length, recurrence threshold, the embedding dimension, and additional noise. Finally, we demonstrate an application in physiology. The advantage of this measure lies in its high sensitivity and simple computation.
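
    A minimal sketch of an EOME-style computation is given below; the way the eigenvalues are normalized into a probability distribution is an assumption made for illustration, since the abstract does not specify it.

```python
# Sketch of an entropy-of-matrix-eigenvalues (EOME) style measure for a
# recurrence matrix. The normalization of eigenvalues into a probability
# distribution is an illustrative assumption.
import numpy as np

def recurrence_matrix(x, eps):
    d = np.abs(x[:, None] - x[None, :])   # scalar series; use norms for vectors
    return (d <= eps).astype(float)

def eome(x, eps):
    R = recurrence_matrix(x, eps)
    lam = np.abs(np.linalg.eigvalsh(R))   # R is symmetric, eigenvalues are real
    p = lam / lam.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))         # Shannon entropy of the spectrum

# Example: logistic map in a periodic versus a chaotic regime
def logistic(r, n=500, x0=0.3):
    x = np.empty(n); x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1 - x[i - 1])
    return x

# expect a larger EOME value in the chaotic regime (r = 4.0)
print(eome(logistic(3.5), 0.1), eome(logistic(4.0), 0.1))
```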

  17. Visibility Graph Based Time Series Analysis

    PubMed Central

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115
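
    The basic natural-visibility mapping the method builds on can be sketched in a few lines; note that the paper additionally maps series segments to separate graphs and links successive states, which is not shown here.

```python
# Minimal sketch of the natural visibility graph mapping: samples (a, x_a)
# and (b, x_b) are connected if every intermediate sample lies strictly
# below the straight line joining them.
import numpy as np

def visibility_edges(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    edges = []
    for a in range(n - 1):
        for b in range(a + 1, n):
            visible = all(
                x[c] < x[b] + (x[a] - x[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.append((a, b))
    return edges

print(visibility_edges([1.0, 3.0, 2.0, 4.0, 1.5]))
```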

  18. Visibility Graph Based Time Series Analysis.

    PubMed

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.

  19. Data-Flow Based Model Analysis

    NASA Technical Reports Server (NTRS)

    Saad, Christian; Bauer, Bernhard

    2010-01-01

    The concept of (meta) modeling combines an intuitive way of formalizing the structure of an application domain with a high expressiveness that makes it suitable for a wide variety of use cases, and it has therefore become an integral part of many areas in computer science. While the definition of modeling languages through the use of meta models, e.g. in the Unified Modeling Language (UML), is a well-understood process, their validation and the extraction of behavioral information are still a challenge. In this paper we present a novel approach for dynamic model analysis along with several fields of application. Examining the propagation of information along the edges and nodes of the model graph makes it possible to extend and simplify the definition of semantic constraints in comparison to the capabilities offered by, e.g., the Object Constraint Language. Performing a flow-based analysis also enables the simulation of dynamic behavior, thus providing an "abstract interpretation"-like analysis method for the modeling domain.
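
    The following generic worklist sketch illustrates the kind of fixed-point propagation of information along a model graph that the abstract describes; the graph, transfer function, and join operation are placeholders, not the authors' framework.

```python
# Generic worklist sketch of propagating analysis information along the
# edges of a model graph until a fixed point is reached. The model graph,
# transfer function, and join are placeholders, not the paper's framework.
def propagate(nodes, succs, transfer, join, init):
    """nodes: iterable of node ids; succs[n]: successor ids;
    transfer(n, value) -> value flowing out of n;
    join(a, b) -> merged value; init: initial value per node."""
    value = {n: init for n in nodes}
    worklist = list(nodes)
    while worklist:
        n = worklist.pop()
        out = transfer(n, value[n])
        for m in succs.get(n, ()):
            merged = join(value[m], out)
            if merged != value[m]:
                value[m] = merged
                worklist.append(m)
    return value

# Example: which nodes are reachable from the "tainted" source node A
succs = {"A": ["B"], "B": ["C"], "C": []}
result = propagate(["A", "B", "C"], succs,
                   transfer=lambda n, v: v or n == "A",
                   join=lambda a, b: a or b,
                   init=False)
print(result)  # {'A': False, 'B': True, 'C': True}
```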

  20. BASE Flexible Array Preliminary Lithospheric Structure Analysis

    NASA Astrophysics Data System (ADS)

    Yeck, W. L.; Sheehan, A. F.; Anderson, M. L.; Siddoway, C. S.; Erslev, E.; Harder, S. H.; Miller, K. C.

    2009-12-01

    The Bighorns Arch Seismic Experiment (BASE) is a Flexible Array experiment integrated with EarthScope. The goal of BASE is to develop a better understanding of how basement-involved foreland arches form and what their link is to plate tectonic processes. To achieve this goal, the crustal structure under the Bighorn Mountain range, Bighorn Basin, and Powder River Basin of northern Wyoming and southern Montana is investigated through the deployment of 35 broadband seismometers, 200 short period seismometers, 1600 “Texan” instruments using active sources and 800 “Texan” instruments monitoring passive sources, together with field structural analysis of brittle structures. The novel combination of these approaches and the anticipated simultaneous data inversion will give a detailed structural crustal image of the Bighorn region at all levels of the crust. Four models have been proposed for the formation of the Bighorn foreland arch: subhorizontal detachment within the crust, lithospheric buckling, pure shear lithospheric thickening, and fault blocks defined by lithosphere-penetrating thrust faults. During the summer of 2009, we deployed 35 broadband instruments, which have already recorded several magnitude 7+ teleseismic events. Through P wave receiver function analysis of these 35 stations, folded in with many EarthScope Transportable Array stations in the region, we present a preliminary map of the Mohorovicic discontinuity. This crustal map is our first test of how the unique Moho geometries predicted by the four hypothesized models of basement-involved arches fit seismic observations for the Bighorn Mountains. In addition, shear-wave splitting analysis for our first few recorded teleseisms helps us determine if strong lithospheric deformation is preserved under the range. These analyses help lead us to our final goal, a complete 4D (3D spatial plus temporal) lithospheric-scale model of arch formation which will advance our understanding of the mechanisms

  1. A revision and phylogenetic analysis of the spider genus Oxysoma Nicolet (Araneae: Anyphaenidae, Amaurobioidinae).

    PubMed

    Aisen, Santiago; Ramírez, Martín J

    2015-08-06

    We review the spider genus Oxysoma Nicolet, with most of its species endemic to the southern temperate forests in Chile and Argentina, and present a phylogenetic analysis including seven species, of which three are newly described in this study (O. macrocuspis new species, O. kuni new species, and O. losruiles new species, all from Chile), together with 107 other representatives of Anyphaenidae. New geographical records and distribution maps are provided for all species, with illustrations and reviewed diagnoses for the genus and the four previously known species (O. punctatum Nicolet, O. saccatum (Tullgren), O. longiventre (Nicolet) and O. itambezinho Ramírez). The phylogenetic analysis using cladistic methods is based on 264 previously defined characters plus one character that arises from this study. The three new species are closely related to Oxysoma longiventre, and these four species compose what we define as the Oxysoma longiventre species group. The phylogenetic analysis did not retrieve the monophyly of Oxysoma, which should be reevaluated in the future, together with the genus Tasata.

  2. Cluster-based exposure variation analysis

    PubMed Central

    2013-01-01

    Background Static posture, repetitive movements and lack of physical variation are known risk factors for work-related musculoskeletal disorders, and thus need to be properly assessed in occupational studies. The aims of this study were (i) to investigate the effectiveness of a conventional exposure variation analysis (EVA) in discriminating exposure time lines and (ii) to compare it with a new cluster-based method for analysis of exposure variation. Methods For this purpose, we simulated a repeated cyclic exposure varying within each cycle between “low” and “high” exposure levels in a “near” or “far” range, and with “low” or “high” velocities (exposure change rates). The duration of each cycle was also manipulated by selecting a “small” or “large” standard deviation of the cycle time. These parameters reflected three dimensions of exposure variation, i.e. range, frequency and temporal similarity. Each simulation trace included two realizations of 100 concatenated cycles with either low (ρ = 0.1), medium (ρ = 0.5) or high (ρ = 0.9) correlation between the realizations. These traces were analyzed by conventional EVA, and a novel cluster-based EVA (C-EVA). Principal component analysis (PCA) was applied on the marginal distributions of 1) the EVA of each of the realizations (univariate approach), 2) a combination of the EVA of both realizations (multivariate approach) and 3) C-EVA. The least number of principal components describing more than 90% of variability in each case was selected and the projection of marginal distributions along the selected principal component was calculated. A linear classifier was then applied to these projections to discriminate between the simulated exposure patterns, and the accuracy of classified realizations was determined. Results C-EVA classified exposures more correctly than univariate and multivariate EVA approaches; classification accuracy was 49%, 47% and 52% for EVA (univariate

  3. Joint multifractal analysis based on wavelet leaders

    NASA Astrophysics Data System (ADS)

    Jiang, Zhi-Qiang; Yang, Yan-Hong; Wang, Gang-Jin; Zhou, Wei-Xing

    2017-12-01

    Mutually interacting components form complex systems and these components usually have long-range cross-correlated outputs. Using wavelet leaders, we propose a method for characterizing the joint multifractal nature of these long-range cross correlations; we call this method joint multifractal analysis based on wavelet leaders (MF-X-WL). We test the validity of the MF-X-WL method by performing extensive numerical experiments on dual binomial measures with multifractal cross correlations and bivariate fractional Brownian motions (bFBMs) with monofractal cross correlations. Both experiments indicate that MF-X-WL is capable of detecting cross correlations in synthetic data with acceptable estimating errors. We also apply the MF-X-WL method to pairs of series from financial markets (returns and volatilities) and online worlds (online numbers of different genders and different societies) and determine intriguing joint multifractal behavior.
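
    For orientation, a schematic form of the joint scaling relation that wavelet-leader-based joint multifractal analysis relies on is given below; the notation is generic rather than quoted from the paper.

```latex
% Schematic joint scaling relation for two series X and Y with wavelet
% leaders L_X(j,k), L_Y(j,k) at scale 2^j (notation assumed, not verbatim
% from the paper):
\[
  S(q_1, q_2, j) \;=\; \frac{1}{n_j} \sum_{k=1}^{n_j}
  \bigl[ L_X(j,k) \bigr]^{q_1} \bigl[ L_Y(j,k) \bigr]^{q_2}
  \;\sim\; 2^{\, j\, \zeta(q_1, q_2)} ,
\]
% where n_j is the number of leaders at scale 2^j; the joint singularity
% spectrum then follows from zeta(q_1, q_2) via a Legendre transform.
```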

  4. Unobtrusive Biometric System Based on Electroencephalogram Analysis

    NASA Astrophysics Data System (ADS)

    Riera, A.; Soria-Frisch, A.; Caparrini, M.; Grau, C.; Ruffini, G.

    2007-12-01

    Features extracted from electroencephalogram (EEG) recordings have proved to be unique enough between subjects for biometric applications. We show here that biometry based on these recordings offers a novel way to robustly authenticate or identify subjects. In this paper, we present a rapid and unobtrusive authentication method that only uses 2 frontal electrodes referenced to another one placed at the ear lobe. Moreover, the system makes use of a multistage fusion architecture, which is shown to improve system performance. The performance analysis of the system presented in this paper stems from an experiment with 51 subjects and 36 intruders, where an equal error rate (EER) of 3.4% is obtained, that is, a true acceptance rate (TAR) of 96.6% and a false acceptance rate (FAR) of 3.4%. The obtained performance measures improve on the results of similar systems presented in earlier work.

  5. Voxel-Based LIDAR Analysis and Applications

    NASA Astrophysics Data System (ADS)

    Hagstrom, Shea T.

    One of the greatest recent changes in the field of remote sensing is the addition of high-quality Light Detection and Ranging (LIDAR) instruments. In particular, the past few decades have been greatly beneficial to these systems because of increases in data collection speed and accuracy, as well as a reduction in the costs of components. These improvements allow modern airborne instruments to resolve sub-meter details, making them ideal for a wide variety of applications. Because LIDAR uses active illumination to capture 3D information, its output is fundamentally different from other modalities. Despite this difference, LIDAR datasets are often processed using methods appropriate for 2D images and that do not take advantage of its primary virtue of 3-dimensional data. It is this problem we explore by using volumetric voxel modeling. Voxel-based analysis has been used in many applications, especially medical imaging, but rarely in traditional remote sensing. In part this is because the memory requirements are substantial when handling large areas, but with modern computing and storage this is no longer a significant impediment. Our reason for using voxels to model scenes from LIDAR data is that there are several advantages over standard triangle-based models, including better handling of overlapping surfaces and complex shapes. We show how incorporating system position information from early in the LIDAR point cloud generation process allows radiometrically-correct transmission and other novel voxel properties to be recovered. This voxelization technique is validated on simulated data using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) software, a first-principles based ray-tracer developed at the Rochester Institute of Technology. Voxel-based modeling of LIDAR can be useful on its own, but we believe its primary advantage is when applied to problems where simpler surface-based 3D models conflict with the requirement of realistic geometry. To
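
    A minimal voxelization sketch in Python is given below to make the basic idea concrete: points are binned into cubic cells and a simple per-voxel attribute (here a point count) is accumulated; the attribute choice is illustrative only, not the radiometric properties discussed in the abstract.

```python
# Minimal sketch of voxelizing a LIDAR point cloud: each point is assigned
# to a cubic cell of side `voxel_size`, and per-voxel point counts (a
# simple occupancy/density attribute) are accumulated.
import numpy as np

def voxelize(points, voxel_size=1.0):
    """points: (N, 3) array of x, y, z. Returns dict {(i, j, k): count}."""
    idx = np.floor(points / voxel_size).astype(int)
    voxels = {}
    for key in map(tuple, idx):
        voxels[key] = voxels.get(key, 0) + 1
    return voxels

pts = np.random.rand(1000, 3) * 10.0   # synthetic 10 m cube of points
grid = voxelize(pts, voxel_size=0.5)
print(len(grid), "occupied voxels")
```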

  6. [Concept analysis "Competency-based education"].

    PubMed

    Loosli, Clarence

    2016-03-01

    Competency-based education (CBE) stands out at the global level as the best educational practice. Indeed, CBE is supposed to improve the quality of care provided by newly graduated nurses. Yet, there is a dearth of knowledge in the nursing literature regarding the definition of the CBE concept. CBE is implemented differently in each institution, even within the same discipline in a single country. What accounts for CBE in nursing education? The aim was to clarify the meaning of the CBE concept through a literature review in order to propose a definition. Wilson's concept analysis method framed our literature review across two databases: CINAHL and ERIC. Following the 11 Wilson analysis techniques, we identified CBE as a multidimensional concept clustering three dimensions: learning, teaching, and assessment. Nurse educators are accountable for providing society with competent newly graduated professionals. Schools should strive for visibility and transparency in the means they use to accomplish their educational activities. This first attempt to understand the CBE concept opens a debate concerning its further development and clarification. This first description of the CBE concept is a step toward its identification and assessment.

  7. Interactive analysis of geodata based intelligence

    NASA Astrophysics Data System (ADS)

    Wagner, Boris; Eck, Ralf; Unmüessig, Gabriel; Peinsipp-Byma, Elisabeth

    2016-05-01

    When a spatiotemporal event happens, multi-source intelligence data are gathered to understand the problem, and strategies for solving it are investigated. The difficulties arising from handling spatial and temporal intelligence data represent the main problem. A map can serve as the bridge for visualizing the data and for providing the most understandable model for all stakeholders. For the analysis of geodata based intelligence data, software was developed as a working environment that combines geodata with optimized ergonomics, thereby substantially facilitating interaction with the common operational picture (COP). The composition of the COP is based on geodata services, which are normalized by international standards of the Open Geospatial Consortium (OGC). The basic geodata are combined with intelligence data from images (IMINT) and humans (HUMINT), stored in a NATO Coalition Shared Data Server (CSD). These intelligence data can be combined with further information sources, e.g., live sensors. As a result, a COP is generated and an interaction suitable for the specific workspace is added. This allows users to work interactively with the COP, i.e., to search with an on-board CSD client for suitable intelligence data and integrate them into the COP. Furthermore, users can enrich the scenario with findings from the data of interactive live sensors and add data from other sources. This allows intelligence services to contribute effectively to the process by which military and disaster management operations are organized.

  8. Constraints based analysis of extended cybernetic models.

    PubMed

    Mandli, Aravinda R; Venkatesh, Kareenhalli V; Modak, Jayant M

    2015-11-01

    The cybernetic modeling framework provides an interesting approach to model the regulatory phenomena occurring in microorganisms. In the present work, we adopt a constraints based approach to analyze the nonlinear behavior of the extended equations of the cybernetic model. We first show that the cybernetic model exhibits linear growth behavior under the constraint of no resource allocation for the induction of the key enzyme. We then quantify the maximum achievable specific growth rate of microorganisms on mixtures of substitutable substrates under various kinds of regulation and show its use in gaining an understanding of the regulatory strategies of microorganisms. Finally, we show that Saccharomyces cerevisiae exhibits suboptimal dynamic growth with a long diauxic lag phase when growing on a mixture of glucose and galactose and discuss its potential to achieve optimal growth with a significantly reduced diauxic lag period. The analysis carried out in the present study illustrates the utility of adopting a constraints based approach to understand the dynamic growth strategies of microorganisms. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  9. Bismuth-based electrochemical stripping analysis

    DOEpatents

    Wang, Joseph

    2004-01-27

    Method and apparatus for trace metal detection and analysis using bismuth-coated electrodes and electrochemical stripping analysis. Both anodic stripping voltammetry and adsorptive stripping analysis may be employed.

  10. Micromechanics Based Failure Analysis of Heterogeneous Materials

    NASA Astrophysics Data System (ADS)

    Sertse, Hamsasew M.

    are performed for both brittle failure/high cycle fatigue (HCF) for negligible plastic strain and ductile failure/low cycle fatigue (LCF) for large plastic strain. The proposed approach is incorporated in SwiftComp and used to predict the initial failure envelope, stress-strain curve for various loading conditions, and fatigue life of heterogeneous materials. The combined effects of strain hardening and progressive fatigue damage on the effective properties of heterogeneous materials are also studied. The capability of the current approach is validated using several representative examples of heterogeneous materials including binary composites, continuous fiber-reinforced composites, particle-reinforced composites, discontinuous fiber-reinforced composites, and woven composites. The predictions of MSG are also compared with the predictions obtained using various micromechanics approaches such as Generalized Methods of Cells (GMC), Mori-Tanaka (MT), and Double Inclusions (DI) and Representative Volume Element (RVE) Analysis (referred to as 3-dimensional finite element analysis (3D FEA) in this document). This study demonstrates that a micromechanics based failure analysis has great potential to rigorously and more accurately analyze initiation and progression of damage in heterogeneous materials. However, this approach requires material properties specific to damage analysis, which need to be independently calibrated for each constituent.

  11. Temporal Expression-based Analysis of Metabolism

    PubMed Central

    Segrè, Daniel

    2012-01-01

    Metabolic flux is frequently rerouted through cellular metabolism in response to dynamic changes in the intra- and extra-cellular environment. Capturing the mechanisms underlying these metabolic transitions in quantitative and predictive models is a prominent challenge in systems biology. Progress in this regard has been made by integrating high-throughput gene expression data into genome-scale stoichiometric models of metabolism. Here, we extend previous approaches to perform a Temporal Expression-based Analysis of Metabolism (TEAM). We apply TEAM to understanding the complex metabolic dynamics of the respiratorily versatile bacterium Shewanella oneidensis grown under aerobic, lactate-limited conditions. TEAM predicts temporal metabolic flux distributions using time-series gene expression data. Increased predictive power is achieved by supplementing these data with a large reference compendium of gene expression, which allows us to take into account the unique character of the distribution of expression of each individual gene. We further propose a straightforward method for studying the sensitivity of TEAM to changes in its fundamental free threshold parameter θ, and reveal that discrete zones of distinct metabolic behavior arise as this parameter is changed. By comparing the qualitative characteristics of these zones to additional experimental data, we are able to constrain the range of θ to a small, well-defined interval. In parallel, the sensitivity analysis reveals the inherently difficult nature of dynamic metabolic flux modeling: small errors early in the simulation propagate to relatively large changes later in the simulation. We expect that handling such “history-dependent” sensitivities will be a major challenge in the future development of dynamic metabolic-modeling techniques. PMID:23209390

  12. Derivative based sensitivity analysis of gamma index

    PubMed Central

    Sarkar, Biplab; Pradhan, Anirudh; Ganesh, T.

    2015-01-01

    Originally developed as a tool for patient-specific quality assurance in advanced treatment delivery methods to compare between measured and calculated dose distributions, the gamma index (γ) concept was later extended to compare between any two dose distributions. It takes into account both the dose difference (DD) and distance-to-agreement (DTA) measurements in the comparison. Its strength lies in its capability to give a quantitative value for the analysis, unlike other methods. For every point on the reference curve, if there is at least one point in the evaluated curve that satisfies the pass criteria (e.g., δDD = 1%, δDTA = 1 mm), the point is included in the quantitative score as “pass.” Gamma analysis does not account for the gradient of the evaluated curve - it looks at only the minimum gamma value, and if it is <1, then the point passes, no matter what the gradient of the evaluated curve is. In this work, an attempt has been made to present a derivative-based method for the identification of the dose gradient. A mathematically derived reference profile (RP) representing the penumbral region of a 6 MV 10 cm × 10 cm field was generated from an error function. A general test profile (GTP) was created from this RP by introducing 1 mm distance error and 1% dose error at each point. This was considered the first of the two evaluated curves. By its nature, this curve is smooth and would satisfy the pass criteria for all points in it. The second evaluated profile was generated as a sawtooth test profile (STTP) which again would satisfy the pass criteria for every point on the RP. However, being a sawtooth curve, it is not smooth and would obviously compare poorly with the smooth profile. Considering the smooth GTP as an acceptable profile when it passed the gamma pass criteria (1% DD and 1 mm DTA) against the RP, the first and second order derivatives of the DDs (δD’, δD”) between these two curves were derived and used as the boundary
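
    For reference, a minimal 1D gamma computation consistent with the definition above can be sketched as follows; the tanh-shaped profiles below merely stand in for the error-function-based reference profile described in the abstract.

```python
# Sketch of a 1D gamma-index computation between a reference and an
# evaluated dose profile (a global dose-difference criterion is assumed).
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dta_mm=1.0, dd_percent=1.0):
    """Return gamma for every reference point; gamma <= 1 means 'pass'."""
    dd_abs = dd_percent / 100.0 * np.max(d_ref)     # global DD criterion
    gammas = np.empty(len(x_ref))
    for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
        dist2 = ((x_eval - xr) / dta_mm) ** 2
        dose2 = ((d_eval - dr) / dd_abs) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + dose2))  # minimum over evaluated points
    return gammas

x = np.linspace(-10, 10, 201)                 # positions in mm
ref = 0.5 * (1 - np.tanh(x))                  # penumbra-like reference profile
ev = 0.5 * (1 - np.tanh(x - 0.5))             # evaluated profile shifted 0.5 mm
print(np.mean(gamma_1d(x, ref, x, ev) <= 1.0))  # gamma pass rate
```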

  13. EWET: Data collection and interface for the genetic analysis of Echinococcus multilocularis based on EmsB microsatellite.

    PubMed

    Knapp, Jenny; Damy, Sylvie; Brillaud, Jonathan; Tissot, Jean-Daniel; Navion, Jérémy; Mélior, Raphael; Afonso, Eve; Hormaz, Vanessa; Gottstein, Bruno; Umhang, Gérald; Casulli, Adriano; Dadeau, Frédéric; Millon, Laurence; Raoul, Francis

    2017-01-01

    The evolution and dispersion history of organisms on Earth can best be studied through biological markers in molecular epidemiological studies. The biological diversity of the cestode Echinococcus multilocularis was investigated using different cladistic approaches. First, the morphological aspects were explored in connection with its ecology. More recently, molecular aspects were investigated to better understand the nature of the variations observed among isolates. The study of the tandemly repeated multilocus microsatellite EmsB allowed us to attain a high genetic diversity level where other classic markers have failed. Since 2006, EmsB data have been collected on specimens from various endemic foci of the parasite in Europe (in historic and newly endemic areas), Asia (China, Japan and Kyrgyzstan), and North America (Canada and Alaska). Biological data on the isolates and metadata were also recorded (e.g. host, geographical location, EmsB analysis, citation in the literature). In order to make available the data set of 1,166 isolates from classic and aberrant domestic and wild animal hosts (larval lesions and adult worms) and from human origin, an open web access interface, developed in PHP and connected to a PostgreSQL database, was created for the EmsB Website for the Echinococcus Typing (EWET) project. It allows researchers to access the data collection, perform genetic analyses online (e.g. defining the genetic distance between their own samples and the samples in the database), consult distribution maps of EmsB profiles, and record and share their new EmsB genotyping data. In order to standardize the EmsB analyses performed in the different laboratories throughout the world, a calibrator was developed. The final aim of this project was to gather and arrange the available data to permit a better understanding of the dispersion and transmission patterns of the parasite among definitive and intermediate hosts, in order to organize control strategies on the ground.

  14. EWET: Data collection and interface for the genetic analysis of Echinococcus multilocularis based on EmsB microsatellite

    PubMed Central

    Damy, Sylvie; Brillaud, Jonathan; Tissot, Jean-Daniel; Navion, Jérémy; Mélior, Raphael; Afonso, Eve; Hormaz, Vanessa; Gottstein, Bruno; Umhang, Gérald; Casulli, Adriano; Dadeau, Frédéric; Millon, Laurence; Raoul, Francis

    2017-01-01

    The evolution and dispersion history of organisms on Earth can best be studied through biological markers in molecular epidemiological studies. The biological diversity of the cestode Echinococcus multilocularis was investigated using different cladistic approaches. First, the morphological aspects were explored in connection with its ecology. More recently, molecular aspects were investigated to better understand the nature of the variations observed among isolates. The study of the tandemly repeated multilocus microsatellite EmsB allowed us to attain a high genetic diversity level where other classic markers have failed. Since 2006, EmsB data have been collected on specimens from various endemic foci of the parasite in Europe (in historic and newly endemic areas), Asia (China, Japan and Kyrgyzstan), and North America (Canada and Alaska). Biological data on the isolates and metadata were also recorded (e.g. host, geographical location, EmsB analysis, citation in the literature). In order to make available the data set of 1,166 isolates from classic and aberrant domestic and wild animal hosts (larval lesions and adult worms) and from human origin, an open web access interface, developed in PHP and connected to a PostgreSQL database, was created for the EmsB Website for the Echinococcus Typing (EWET) project. It allows researchers to access the data collection, perform genetic analyses online (e.g. defining the genetic distance between their own samples and the samples in the database), consult distribution maps of EmsB profiles, and record and share their new EmsB genotyping data. In order to standardize the EmsB analyses performed in the different laboratories throughout the world, a calibrator was developed. The final aim of this project was to gather and arrange the available data to permit a better understanding of the dispersion and transmission patterns of the parasite among definitive and intermediate hosts, in order to organize control strategies on the ground. PMID:28972978

  15. Thermodynamics-Based Metabolic Flux Analysis

    PubMed Central

    Henry, Christopher S.; Broadbelt, Linda J.; Hatzimanikatis, Vassily

    2007-01-01

    A new form of metabolic flux analysis (MFA) called thermodynamics-based metabolic flux analysis (TMFA) is introduced with the capability of generating thermodynamically feasible flux and metabolite activity profiles on a genome scale. TMFA involves the use of a set of linear thermodynamic constraints in addition to the mass balance constraints typically used in MFA. TMFA produces flux distributions that do not contain any thermodynamically infeasible reactions or pathways, and it provides information about the free energy change of reactions and the range of metabolite activities in addition to reaction fluxes. TMFA is applied to study the thermodynamically feasible ranges for the fluxes and the Gibbs free energy change, ΔrG′, of the reactions and the activities of the metabolites in the genome-scale metabolic model of Escherichia coli developed by Palsson and co-workers. In the TMFA of the genome scale model, the metabolite activities and reaction ΔrG′ are able to achieve a wide range of values at optimal growth. The reaction dihydroorotase is identified as a possible thermodynamic bottleneck in E. coli metabolism with a ΔrG′ constrained close to zero while numerous reactions are identified throughout metabolism for which ΔrG′ is always highly negative regardless of metabolite concentrations. As it has been proposed previously, these reactions with exclusively negative ΔrG′ might be candidates for cell regulation, and we find that a significant number of these reactions appear to be the first steps in the linear portion of numerous biosynthesis pathways. The thermodynamically feasible ranges for the concentration ratios ATP/ADP, NAD(P)/NAD(P)H, and H
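
    For orientation, a condensed sketch of the kind of linear thermodynamic constraints TMFA adds to the mass-balance constraints is given below; the symbols are generic rather than the paper's exact notation.

```latex
% Condensed sketch of TMFA-style thermodynamic constraints added to the
% usual mass-balance constraint S v = 0 (symbols z_j, K, v_max are generic):
\[
  \Delta_r G'_j \;=\; \Delta_r G'^{\circ}_j \;+\; RT \sum_i n_{ij} \ln a_i ,
  \qquad
  v_j - z_j\, v_{\max} \;\le\; 0 ,
  \qquad
  \Delta_r G'_j - K + K z_j \;<\; 0 ,
\]
% with z_j a binary "use" variable: reaction j may carry flux (z_j = 1)
% only if its transformed Gibbs energy change is negative.
```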

  16. Image based performance analysis of thermal imagers

    NASA Astrophysics Data System (ADS)

    Wegner, D.; Repasi, E.

    2016-05-01

    Due to advances in technology, modern thermal imagers resemble sophisticated image processing systems in functionality. Advanced signal and image processing tools enclosed in the camera body extend the basic image capturing capability of thermal cameras. This is done in order to enhance the display presentation of the captured scene or specific scene details. Usually, the implemented methods are proprietary company expertise, distributed without extensive documentation. This makes the comparison of thermal imagers, especially from different companies, a difficult task (or at least a very time consuming/expensive task - e.g. requiring the execution of a field trial and/or an observer trial). For example, a thermal camera equipped with turbulence mitigation capability represents such a closed system. The Fraunhofer IOSB has started to build up a system for testing thermal imagers by image based methods in the lab environment. This will extend our capability of measuring the classical IR-system parameters (e.g. MTF, MTDP, etc.) in the lab. The system is set up around the IR-scene projector, which is necessary for the thermal display (projection) of an image sequence for the IR-camera under test. The same set of thermal test sequences might be presented to every unit under test. For turbulence mitigation tests, this could be e.g. the same turbulence sequence. During system tests, gradual variation of input parameters (e.g. thermal contrast) can be applied. First ideas on test scene selection and on how to assemble an imaging suite (a set of image sequences) for the analysis of thermal imaging systems containing such black boxes in the image forming path are discussed.

  17. Bariatric surgery: an evidence-based analysis.

    PubMed

    2005-01-01

    To conduct an evidence-based analysis of the effectiveness and cost-effectiveness of bariatric surgery. Obesity is defined as a body mass index (BMI) of at least 30 kg/m(2). Morbid obesity is defined as a BMI of at least 40 kg/m(2) or at least 35 kg/m(2) with comorbid conditions. Comorbid conditions associated with obesity include diabetes, hypertension, dyslipidemias, obstructive sleep apnea, weight-related arthropathies, and stress urinary incontinence. It is also associated with depression, and cancers of the breast, uterus, prostate, and colon, and is an independent risk factor for cardiovascular disease. Obesity is also associated with higher all-cause mortality at any age, even after adjusting for potential confounding factors like smoking. A person with a BMI of 30 kg/m(2) has about a 50% higher risk of dying than does someone with a healthy BMI. The risk more than doubles at a BMI of 35 kg/m(2). An expert estimated that about 160,000 people are morbidly obese in Ontario. In the United States, the prevalence of morbid obesity is 4.7% (1999-2000). In Ontario, the 2004 Chief Medical Officer of Health Report said that in 2003, almost one-half of Ontario adults were overweight (BMI 25-29.9 kg/m(2)) or obese (BMI ≥ 30 kg/m(2)). About 57% of Ontario men and 42% of Ontario women were overweight or obese. The proportion of the population that was overweight or obese increased gradually from 44% in 1990 to 49% in 2000, and it appears to have stabilized at 49% in 2003. The report also noted that the tendency to be overweight and obese increases with age up to 64 years. BMI should be used cautiously for people aged 65 years and older, because the "normal" range may begin at slightly above 18.5 kg/m(2) and extend into the "overweight" range. The Chief Medical Officer of Health cautioned that these data may underestimate the true extent of the problem, because they were based on self reports, and people tend to over-report their height and under-report their weight

  18. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Liebetreu, John; Moore, Michael S.; Price, Jeremy C.; Abbott, Ben

    2005-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  19. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Moore, Michael S.; Price, Jeremy C.; Abbott, Ben; Liebetreu, John; Reinhart, Richard C.; Kacpura, Thomas J.

    2007-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  20. Using Willie's Acid-Base Box for Blood Gas Analysis

    ERIC Educational Resources Information Center

    Dietz, John R.

    2011-01-01

    In this article, the author describes a method developed by Dr. William T. Lipscomb for teaching blood gas analysis of acid-base status and provides three examples using Willie's acid-base box. Willie's acid-base box is constructed using three of the parameters of standard arterial blood gas analysis: (1) pH; (2) bicarbonate; and (3) CO2…

  1. Recent Electrochemical and Optical Sensors in Flow-Based Analysis

    PubMed Central

    Chailapakul, Orawon; Ngamukot, Passapol; Yoosamran, Alongkorn; Siangproh, Weena; Wangfuengkanagul, Nattakarn

    2006-01-01

    Some recent analytical sensors based on electrochemical and optical detection coupled with different flow techniques have been chosen in this overview. A brief description of fundamental concepts and applications of each flow technique, such as flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA), and multipumped FIA (MPFIA) were reviewed.

  2. Risk-Based Explosive Safety Analysis

    DTIC Science & Technology

    2016-11-30

    Safety siting of energetic liquids and propellants can be greatly aided by the use of risk-based methodologies. The low probability of exposed personnel and the

  3. Retinal imaging analysis based on vessel detection.

    PubMed

    Jamal, Arshad; Hazim Alkawaz, Mohammed; Rehman, Amjad; Saba, Tanzila

    2017-07-01

    With advances in digital imaging and computing power, computationally intelligent technologies are in high demand for use in ophthalmic care and treatment. In the current research, Retina Image Analysis (RIA) was developed for optometrists at the Eye Care Center in Management and Science University. This research aims to analyze the retina through vessel detection. The RIA assists in the analysis of retinal images, and specialists are served with various options like saving, processing, and analyzing retinal images through its advanced interface layout. Additionally, RIA assists in the selection of vessel segments, processing these vessels by calculating their diameter, standard deviation, and length, and displaying the detected vessels on the retina. The Agile Unified Process is adopted as the methodology in developing this research. To conclude, Retina Image Analysis might help optometrists gain a better understanding when analyzing a patient's retina. Finally, the Retina Image Analysis procedure is developed using MATLAB (R2011b). Promising results are attained that are comparable with the state of the art. © 2017 Wiley Periodicals, Inc.

  4. EEG Based Analysis of Cognitive Load Enhance Instructional Analysis

    ERIC Educational Resources Information Center

    Dan, Alex; Reiner, Miriam

    2017-01-01

    One of the recommended approaches in instructional design methods is to optimize the value of working memory capacity and avoid cognitive overload. Educational neuroscience offers novel processes and methodologies to analyze cognitive load based on physiological measures. Observing psychophysiological changes when they occur in response to the…

  5. Security analysis of quadratic phase based cryptography

    NASA Astrophysics Data System (ADS)

    Muniraj, Inbarasan; Guo, Changliang; Malallah, Ra'ed; Healy, John J.; Sheridan, John T.

    2016-09-01

    The linear canonical transform (LCT) is essential in modeling coherent light field propagation through first-order optical systems. Recently, a generic optical system, known as a Quadratic Phase Encoding System (QPES), for encrypting a two-dimensional (2D) image has been reported. It has been reported that, together with the two phase keys, the individual LCT parameters serve as keys of the cryptosystem. However, it is important that such encryption systems also satisfy certain dynamic security properties. Therefore, in this work, we examine some cryptographic evaluation methods, such as the Avalanche Criterion and Bit Independence, which indicate the degree of security of cryptographic algorithms, on QPES. We compare our simulation results with the conventional Fourier and Fresnel transform based DRPE systems. The results show that the LCT based DRPE has better avalanche and bit independence characteristics than the conventional Fourier and Fresnel based encryption systems.
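
    The avalanche criterion itself is easy to state operationally: flip one input bit and measure the fraction of output bits that change, averaged over bit positions. The sketch below uses SHA-256 purely as a stand-in black box so the example runs; it is not the QPES/DRPE system evaluated in the paper.

```python
# Sketch of an avalanche-criterion measurement: flip one input bit and
# measure the fraction of output bits that change. SHA-256 serves only as
# a stand-in "black box"; it is not the optical cryptosystem under study.
import hashlib
import numpy as np

def to_bits(data: bytes) -> np.ndarray:
    return np.unpackbits(np.frombuffer(data, dtype=np.uint8))

def encrypt(bits: np.ndarray) -> np.ndarray:          # stand-in black box
    return to_bits(hashlib.sha256(np.packbits(bits).tobytes()).digest())

def avalanche(bits: np.ndarray) -> float:
    base = encrypt(bits)
    ratios = []
    for i in range(len(bits)):
        flipped = bits.copy()
        flipped[i] ^= 1                               # flip a single input bit
        ratios.append(np.mean(encrypt(flipped) != base))
    return float(np.mean(ratios))                     # ideal value is near 0.5

msg = to_bits(b"plaintext block")
print(avalanche(msg))
```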

  6. Texton-based analysis of paintings

    NASA Astrophysics Data System (ADS)

    van der Maaten, Laurens J. P.; Postma, Eric O.

    2010-08-01

    The visual examination of paintings is traditionally performed by skilled art historians using their eyes. Recent advances in intelligent systems may support art historians in determining the authenticity or date of creation of paintings. In this paper, we propose a technique for the examination of brushstroke structure that views the wildly overlapping brushstrokes as texture. The analysis of the painting texture is performed with the help of a texton codebook, i.e., a codebook of small prototypical textural patches. The texton codebook can be learned from a collection of paintings. Our textural analysis technique represents paintings in terms of histograms that measure the frequency with which the textons in the codebook occur in the painting (so-called texton histograms). We present experiments that show the validity and effectiveness of our technique for textural analysis on a collection of digitized high-resolution reproductions of paintings by Van Gogh and his contemporaries. As texton histograms cannot easily be interpreted by art experts, the paper proposes two approaches to visualize the results of the textural analysis. The first approach visualizes the similarities between the histogram representations of paintings by employing a recently proposed dimensionality reduction technique, called t-SNE. We show that t-SNE reveals a clear separation between paintings created by Van Gogh and those created by other painters. In addition, the period of creation is faithfully reflected in the t-SNE visualizations. The second approach visualizes the similarities and differences between paintings by highlighting regions in a painting in which the textural structure of the painting is unusual. We illustrate the validity of this approach by means of an experiment in which we highlight regions in a painting by Monet that are not very "Van Gogh-like". Taken together, we believe the tools developed in this study are well capable of assisting art historians in support of
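
    A minimal sketch of the texton-codebook and histogram construction is given below; the patch size and codebook size are illustrative choices, and scikit-learn's k-means is assumed as the codebook learner, since the paper does not state its exact settings.

```python
# Sketch of texton-histogram construction: learn a small codebook of
# prototypical patches with k-means, then describe each painting by the
# histogram of nearest-codeword assignments of its patches.
import numpy as np
from sklearn.cluster import KMeans

def extract_patches(img, size=5, step=5):
    """img: 2D grayscale array -> (n_patches, size*size) matrix."""
    h, w = img.shape
    return np.array([img[i:i + size, j:j + size].ravel()
                     for i in range(0, h - size + 1, step)
                     for j in range(0, w - size + 1, step)])

def learn_codebook(images, k=32):
    patches = np.vstack([extract_patches(im) for im in images])
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit(patches)

def texton_histogram(img, codebook):
    labels = codebook.predict(extract_patches(img))
    hist = np.bincount(labels, minlength=codebook.n_clusters).astype(float)
    return hist / hist.sum()

# Toy usage: fit a codebook on synthetic images, then histogram one of them;
# in practice the histograms would feed a visualization such as t-SNE.
imgs = [np.random.rand(50, 50) for _ in range(3)]
cb = learn_codebook(imgs, k=8)
print(texton_histogram(imgs[0], cb))
```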

  7. Gender-Based Analysis On-Line Dialogue. Final Report.

    ERIC Educational Resources Information Center

    2001

    An online dialogue on gender-based analysis (GBA) was held from February 15 to March 7, 2001. Invitations and a background paper titled "Why Gender-Based Analysis?" were sent to 350 women's organizations and individuals throughout Canada. Efforts were made to ensure that aboriginal and Metis women, visible minority women, and women with…

  8. Automated Video-Based Traffic Count Analysis.

    DOT National Transportation Integrated Search

    2016-01-01

    The goal of this effort has been to develop techniques that could be applied to the : detection and tracking of vehicles in overhead footage of intersections. To that end we : have developed and published techniques for vehicle tracking based on dete...

  9. Web-Based Statistical Sampling and Analysis

    ERIC Educational Resources Information Center

    Quinn, Anne; Larson, Karen

    2016-01-01

    Consistent with the Common Core State Standards for Mathematics (CCSSI 2010), the authors write that they have asked students to do statistics projects with real data. To obtain real data, their students use the free Web-based app, Census at School, created by the American Statistical Association (ASA) to help promote civic awareness among school…

  10. Remote sensing based on hyperspectral data analysis

    NASA Astrophysics Data System (ADS)

    Sharifahmadian, Ershad

    In remote sensing, accurate identification of far objects, especially concealed objects, is difficult. In this study, to improve object detection from a distance, hyperspectral imaging and wideband technology are employed, with the emphasis on wideband radar. As the wideband data include a broad range of frequencies, they can reveal information about both the surface of the object and its content. Two main contributions are made in this study: 1) Developing the concept of return loss for target detection: Unlike typical radar detection methods, which use the radar cross section to detect an object, it is possible to enhance the process of detection and identification of concealed targets using wideband radar based on the electromagnetic characteristics --conductivity, permeability, permittivity, and return loss-- of materials. During the identification process, collected wideband data are evaluated against information from a wideband signature library which has already been built. In fact, several classes (e.g. metal, wood, etc.) and subclasses (e.g. metals with high conductivity) have been defined based on their electromagnetic characteristics. Materials in a scene are then classified based on these classes. As an example, materials with high electrical conductivity can be conveniently detected. In fact, increasing relative conductivity leads to a reduction in the return loss. Therefore, metals with high conductivity (e.g. copper) show stronger radar reflections compared with metals with low conductivity (e.g. stainless steel). Thus, it is possible to appropriately discriminate copper from stainless steel. 2) Target recognition techniques: To detect and identify targets, several techniques have been proposed, in particular the Multi-Spectral Wideband Radar Image (MSWRI), which is able to localize and identify concealed targets. The MSWRI is based on the theory of the robust Capon beamformer. During the identification process, information from the wideband signature library is utilized
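
    The claim that higher conductivity reduces return loss follows from the standard reflection-coefficient definitions, summarized below under a normal-incidence assumption; these are textbook relations, not formulas quoted from the record.

```latex
% Standard reflection coefficient and return loss (normal incidence at an
% interface with impedance Z_L seen from a medium of impedance Z_0):
\[
  \Gamma \;=\; \frac{Z_L - Z_0}{Z_L + Z_0},
  \qquad
  \mathrm{RL}\,[\mathrm{dB}] \;=\; -20 \log_{10} \lvert \Gamma \rvert .
\]
% For a good conductor the surface impedance |Z_L| is small compared with
% Z_0, so |Gamma| approaches 1 and the return loss approaches 0 dB: higher
% conductivity means a smaller return loss and a stronger radar reflection.
```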

  11. Google glass based immunochromatographic diagnostic test analysis

    NASA Astrophysics Data System (ADS)

    Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan

    2015-03-01

    Integration of optical imagers and sensors into recently emerging wearable computational devices allows for simpler and more intuitive methods of integrating biomedical imaging and medical diagnostics tasks into existing infrastructures. Here we demonstrate the ability of one such device, the Google Glass, to perform qualitative and quantitative analysis of immunochromatographic rapid diagnostic tests (RDTs) using a voice-commandable hands-free software-only interface, as an alternative to larger and more bulky desktop or handheld units. Using the built-in camera of Glass to image one or more RDTs (labeled with Quick Response (QR) codes), our Glass software application uploads the captured image and related information (e.g., user name, GPS, etc.) to our servers for remote analysis and storage. After digital analysis of the RDT images, the results are transmitted back to the originating Glass device, and made available through a website in geospatial and tabular representations. We tested this system on qualitative human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) RDTs. For qualitative HIV tests, we demonstrate successful detection and labeling (i.e., yes/no decisions) for up to 6-fold dilution of HIV samples. For quantitative measurements, we activated and imaged PSA concentrations ranging from 0 to 200 ng/mL and generated calibration curves relating the RDT line intensity values to PSA concentration. By providing automated digitization of both qualitative and quantitative test results, this wearable colorimetric diagnostic test reader platform on Google Glass can reduce operator errors caused by poor training, provide real-time spatiotemporal mapping of test results, and assist with remote monitoring of various biomedical conditions.

  12. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Moore, Michael S.; Price, Jeremy C.; Reinhart, Richard; Liebetreu, John; Kacpura, Tom J.

    2005-01-01

    This paper presents the tool chain, methodology, and results of an on-going study being performed jointly by Space Communication Experts at NASA Glenn Research Center (GRC), General Dynamics C4 Systems (GD), and Southwest Research Institute (SwRI). The team is evaluating the applicability and tradeoffs concerning the use of Software Defined Radio (SDR) technologies for Space missions. The Space Telecommunications Radio Systems (STRS) project is developing an approach toward building SDR-based transceivers for space communications applications based on an accompanying software architecture that can be used to implement transceivers for NASA space missions. The study is assessing the overall cost and benefit of employing SDR technologies in general, and of developing a software architecture standard for its space SDR transceivers. The study is considering the cost and benefit of existing architectures, such as the Joint Tactical Radio Systems (JTRS) Software Communications Architecture (SCA), as well as potential new space-specific architectures.

  13. Fault Analysis-based Logic Encryption (Preprint)

    DTIC Science & Technology

    2013-11-01


  14. What Is Evidence-Based Behavior Analysis?

    PubMed Central

    Smith, Tristram

    2013-01-01

    Although applied behavior analysts often say they engage in evidence-based practice, they express differing views on what constitutes “evidence” and “practice.” This article describes a practice as a service offered by a provider to help solve a problem presented by a consumer. Solving most problems (e.g., increasing or decreasing a behavior and maintaining this change) requires multiple intervention procedures (i.e., a package). Single-subject studies are invaluable in investigating individual procedures, but researchers still need to integrate the procedures into a package. The package must be standardized enough for independent providers to replicate yet flexible enough to allow individualization; intervention manuals are the primary technology for achieving this balance. To test whether the package is effective in solving consumers' problems, researchers must evaluate outcomes of the package as a whole, usually in group studies such as randomized controlled trials. From this perspective, establishing an evidence-based practice involves more than analyzing the effects of discrete intervention procedures on behavior; it requires synthesizing information so as to offer thorough solutions to problems. Recognizing the need for synthesis offers behavior analysts many promising opportunities to build on their existing research to increase the quality and quantity of evidence-based practices. PMID:25729130

  15. Discretization analysis of bifurcation based nonlinear amplifiers

    NASA Astrophysics Data System (ADS)

    Feldkord, Sven; Reit, Marco; Mathis, Wolfgang

    2017-09-01

    Recently, for modeling biological amplification processes, nonlinear amplifiers based on the supercritical Andronov-Hopf bifurcation have been widely analyzed analytically. For technical realizations, digital systems have become the most relevant systems in signal processing applications. The underlying continuous-time systems are transferred to the discrete-time domain using numerical integration methods. Within this contribution, the effects of numerical integration methods on the qualitative behavior of Andronov-Hopf bifurcation based systems are analyzed. It is shown exemplarily that explicit Runge-Kutta methods transform the truncated normal form equation of the Andronov-Hopf bifurcation into the normal form equation of the Neimark-Sacker bifurcation. Dependent on the order of the integration method, higher order terms are added during this transformation. A rescaled normal form equation of the Neimark-Sacker bifurcation is introduced that allows a parametric design of a discrete-time system which corresponds to the rescaled Andronov-Hopf system. This system approximates the characteristics of the rescaled Hopf-type amplifier for a large range of parameters. The natural frequency and the peak amplitude are preserved for every set of parameters. The Neimark-Sacker bifurcation based systems avoid the large computational effort that would be caused by applying higher order integration methods to the continuous-time normal form equations.
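
    The transformation described can be made concrete with the truncated Hopf normal form and its forward-Euler discretization, sketched below; this is a schematic illustration rather than the paper's exact rescaled equations.

```latex
% Truncated normal form of the supercritical Andronov-Hopf bifurcation and
% its explicit (forward) Euler discretization with step size h:
\[
  \dot z = (\sigma + i\omega)\, z - |z|^2 z , \qquad z \in \mathbb{C},
\]
\[
  z_{k+1} = z_k + h\bigl[(\sigma + i\omega) z_k - |z_k|^2 z_k\bigr]
          = \bigl[ 1 + h(\sigma + i\omega) \bigr] z_k - h\, |z_k|^2 z_k .
\]
% The fixed point z = 0 of the discrete map has the complex multiplier
% 1 + h(sigma + i*omega); when this multiplier crosses the unit circle the
% map undergoes a Neimark-Sacker bifurcation, the discrete-time counterpart
% of the Hopf bifurcation discussed in the abstract.
```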

  16. Determinants of birthweight: gender based analysis.

    PubMed

    Halileh, Samia; Abu-Rmeileh, Niveen; Watt, Graham; Spencer, Nick; Gordon, Nahida

    2008-09-01

    The objective of this cross sectional study is to look at determinants of birth weight and their association with the gender of the baby in 2,795 full term children living in the occupied Palestinian territory, derived from a stratified random sample of 2,994 households in the West Bank and 2,234 households in the Gaza Strip. The response rate was 85%. Multivariable analysis using analysis of variance for mixed models showed that sex and birth order, maternal age and education and to a lesser extent region were determinants of birth weight for all children. The effect of maternal education on birth weight differed for female and male infants, tending to be relatively unchanged for male infants and with mean birth weights increasing with maternal education in female infants. The effect of birth order differed by maternal age, with mean birth weight increasing with maternal age for first and second births; but being unaffected by maternal age for infants of birth order greater than two. We conclude that birth weight is influenced by common biological determinants across cultures, but is also influenced by social, ethnic, and environmental factors that are culture specific, of which some might be gender related.
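
    As a rough illustration of this kind of multivariable model, the sketch below fits a random-intercept mixed model in Python with the interactions highlighted in the record; the file and column names are hypothetical stand-ins, and the original analysis used analysis of variance for mixed models rather than this exact specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file and column names; one row per full-term child.
df = pd.read_csv("births.csv")

# Random intercept per household, with the sex-by-maternal-education and
# birth-order-by-maternal-age interactions the record reports as significant.
model = smf.mixedlm(
    "birthweight_g ~ C(sex) * C(maternal_education) + C(birth_order) * maternal_age + C(region)",
    data=df,
    groups=df["household_id"],
)
result = model.fit()
print(result.summary())
```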

  17. Meeting Report: Tissue-based Image Analysis.

    PubMed

    Saravanan, Chandra; Schumacher, Vanessa; Brown, Danielle; Dunstan, Robert; Galarneau, Jean-Rene; Odin, Marielle; Mishra, Sasmita

    2017-10-01

    Quantitative image analysis (IA) is a rapidly evolving area of digital pathology. Although not a new concept, the quantification of histological features on photomicrographs used to be cumbersome, resource-intensive, and limited to specialists and specialized laboratories. Recent technological advances like highly efficient automated whole slide digitizer (scanner) systems, innovative IA platforms, and the emergence of pathologist-friendly image annotation and analysis systems mean that quantification of features on histological digital images will become increasingly prominent in pathologists' daily professional lives. The added value of quantitative IA in pathology includes confirmation of equivocal findings noted by a pathologist, increasing the sensitivity of feature detection, quantification of signal intensity, and improving efficiency. There is no denying that quantitative IA is part of the future of pathology; however, there are also several potential pitfalls when trying to estimate volumetric features from limited 2-dimensional sections. This continuing education session on quantitative IA offered a broad overview of the field; a hands-on toxicologic pathologist experience with IA principles, tools, and workflows; a discussion on how to apply basic stereology principles in order to minimize bias in IA; and finally, a reflection on the future of IA in the toxicologic pathology field.

  18. Confidence-Based Learning in Investment Analysis

    NASA Astrophysics Data System (ADS)

    Serradell-Lopez, Enric; Lara-Navarra, Pablo; Castillo-Merino, David; González-González, Inés

    The aim of this study is to determine the effectiveness of using multiple-choice tests in subjects related to administration and business management. To this end we used a multiple-choice test with specific questions to verify the extent of knowledge gained and the confidence and trust in the answers. The tests were administered to a group of 200 students in the bachelor's degree program in Business Administration and Management. The analysis was carried out in one subject within the scope of investment analysis and measured the level of knowledge gained and the degree of trust and security in the responses at two different points in the course. The measurements took into account the different levels of difficulty of the questions asked and the time students spent completing the test. The results confirm that students are generally able to acquire more knowledge over the course and show increases in the degree of trust and confidence in their answers. They also confirm that the difficulty level of the questions, set a priori by the subject coordinators, is related to the levels of security and confidence in the answers. It is estimated that the improvement in the skills learned is viewed favourably by businesses and is especially important for students' job placement.

  19. Market-Based Multirobot Coordination: A Survey and Analysis

    DTIC Science & Technology

    2005-04-01

    observe new information about their surroundings. Market-based approaches can often seamlessly incorporate online tasks by auctioning new tasks as they... Market-Based Multirobot Coordination: A Survey and Analysis. M. Bernardine Dias, Robert Zlot, Nidhi Kalra, and Anthony Stentz. CMU-RI-TR-05-13, April 2005.

  20. Network Anomaly Detection Based on Wavelet Analysis

    NASA Astrophysics Data System (ADS)

    Lu, Wei; Ghorbani, Ali A.

    2008-12-01

    Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining the wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.
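
    A minimal PyWavelets sketch of the underlying idea, using a synthetic per-second traffic feature: the wavelet approximation models the slowly varying baseline and large residuals flag anomalous bursts. The paper's actual detector additionally combines fifteen features with system identification, which is not reproduced here.

```python
import numpy as np
import pywt

# Synthetic per-second traffic feature (e.g. flows per second) with an injected burst.
rng = np.random.default_rng(0)
x = rng.poisson(lam=100, size=4096).astype(float)
x[2000:2008] += 400

# Multi-level discrete wavelet decomposition: the approximation coefficients capture
# the slowly varying baseline, the detail coefficients capture bursts.
coeffs = pywt.wavedec(x, "db4", level=4)
approx, details = coeffs[0], coeffs[1:]

# Residual-based detector: reconstruct the approximation only and flag samples whose
# residual exceeds a robust (median-absolute-deviation) threshold.
baseline = pywt.waverec([approx] + [np.zeros_like(d) for d in details], "db4")[: len(x)]
residual = x - baseline
mad = np.median(np.abs(residual - np.median(residual)))
anomalies = np.where(np.abs(residual) > 6 * mad)[0]
print("flagged sample indices:", anomalies[:10], "...")
```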

  1. Sandia National Laboratories analysis code data base

    NASA Astrophysics Data System (ADS)

    Peterson, C. W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems, and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code 'ownership' and release status, and references describing the physical models and numerical implementation.

  2. Carbon Nanotube Based Devices for Intracellular Analysis

    NASA Astrophysics Data System (ADS)

    Singhal, Riju Mohan

    would potentially lead to the highly sought after "selective component extraction" and analysis from a single cell. These multi-functional devices therefore provide a picture of the physiological state of a living cell and function as endoscopes for single cell analysis.

  3. The Route Analysis Based On Flight Plan

    NASA Astrophysics Data System (ADS)

    Feriyanto, Nur; Saleh, Chairul; Fauzi, Achmad; Rachman Dzakiyullah, Nur; Riza Iwaputra, Kahfi

    2016-02-01

    Economic development affects the use of air transportation, since business activity has increased in every aspect. Many people these days prefer to travel by airplane because it saves time and money. This situation also affects flight routes, and many airlines offer new routes to deal with competition. Managing flight routes is one of the problems that must be addressed in order to find efficient and effective routes. This paper investigates the best routes based on flight performance by determining the amount of block fuel for the Jakarta-Denpasar flight route. Moreover, this work compares two kinds of aircraft and two tracks by calculating flight distance, flight time and block fuel. The results show that Track II of the Jakarta-Denpasar route has the most effective and efficient block fuel when flown with the Airbus 320-200 aircraft. This study can contribute to practice in making effective decisions, especially helping the company's executive management to select the appropriate aircraft and track in the flight plan based on block fuel consumption for business operation.

  4. Trajectory Based Behavior Analysis for User Verification

    NASA Astrophysics Data System (ADS)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers need a verification step for authorized access. The goal of verification is to tell the true account owner apart from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is robust against copying or simulation by non-authorized users or automatic programs such as bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with manifold-learnt tuning for capturing the pairwise relationship. Based on the pairwise relationship, we can plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
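
    A hedged sketch of the enrollment-and-scoring idea: a single Gaussian over trajectory displacements stands in for the paper's Markov chain with Gaussian transitions and manifold-tuned dissimilarity measure; all names here are illustrative.

```python
import numpy as np
from scipy.stats import multivariate_normal

def step_vectors(traj):
    """traj: (T, 2) array of recorded positions; returns successive displacements."""
    return np.diff(np.asarray(traj, dtype=float), axis=0)

def enroll(trajectories):
    """Fit one Gaussian over displacements pooled across a user's enrollment trajectories
    (a first-order stand-in for the Gaussian-transition Markov model in the record)."""
    steps = np.vstack([step_vectors(t) for t in trajectories])
    return steps.mean(axis=0), np.cov(steps, rowvar=False) + 1e-6 * np.eye(2)

def score(traj, model):
    """Average log-likelihood of a candidate trajectory under the enrolled model;
    accept if it exceeds a threshold calibrated on held-out genuine trajectories."""
    mean, cov = model
    return multivariate_normal(mean, cov).logpdf(step_vectors(traj)).mean()
```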

  5. Advanced overlay analysis through design based metrology

    NASA Astrophysics Data System (ADS)

    Ji, Sunkeun; Yoo, Gyun; Jo, Gyoyeon; Kang, Hyunwoo; Park, Minwoo; Kim, Jungchan; Park, Chanha; Yang, Hyunjo; Yim, Donggyu; Maruyama, Kotaro; Park, Byungjun; Yamamoto, Masahiro

    2015-03-01

    As design rules shrink, overlay has become a critical factor in semiconductor manufacturing. However, the overlay error determined by a conventional measurement with an overlay mark, based on IBO or DBO, often does not represent the physical placement error in the cell area. The mismatch may arise from the size or pitch difference between the overlay mark and the cell pattern. Pattern distortion caused by etching or CMP can also be a source of the mismatch. In 2014, we demonstrated that measuring overlay in the cell area with a DBM (Design Based Metrology) tool yields more accurate overlay values than the conventional method using an overlay mark. We verified the reproducibility by measuring repeatable patterns in the cell area, and also demonstrated the reliability by comparing with CD-SEM data. Until now we have focused on the overlay mismatch between the overlay mark and the cell area; furthermore, we are now concerned with cell areas having different pattern density and etch loading. Overlay values differ across cells with diverse patterning environments. In this paper, the overlay error was investigated from cell edge to center. For this experiment, we verified several critical layers in DRAM by using an improved DBM tool (better resolution and speed), NGR3520.

  6. Image based SAR product simulation for analysis

    NASA Technical Reports Server (NTRS)

    Domik, G.; Leberl, F.

    1987-01-01

    SAR product simulation serves to predict SAR image gray values for various flight paths. Input typically consists of a digital elevation model and backscatter curves. A new method of product simulation is described that also employs a real SAR input image; this can be denoted 'image-based simulation'. Different methods to perform this SAR prediction are presented and their advantages and disadvantages discussed. Ascending and descending orbit images from NASA's SIR-B experiment were used for verification of the concept: input images from ascending orbits were converted into images from a descending orbit; the results are compared to the available real imagery to verify that the prediction technique produces meaningful image data.

  7. Pathway-based personalized analysis of cancer

    PubMed Central

    Drier, Yotam; Sheffer, Michal; Domany, Eytan

    2013-01-01

    We introduce Pathifier, an algorithm that infers pathway deregulation scores for each tumor sample on the basis of expression data. This score is determined, in a context-specific manner, for every particular dataset and type of cancer that is being investigated. The algorithm transforms gene-level information into pathway-level information, generating a compact and biologically relevant representation of each sample. We demonstrate the algorithm’s performance on three colorectal cancer datasets and two glioblastoma multiforme datasets and show that our multipathway-based representation is reproducible, preserves much of the original information, and allows inference of complex biologically significant information. We discovered several pathways that were significantly associated with survival of glioblastoma patients and two whose scores are predictive of survival in colorectal cancer: CXCR3-mediated signaling and oxidative phosphorylation. We also identified a subclass of proneural and neural glioblastoma with significantly better survival, and an EGF receptor-deregulated subclass of colon cancers. PMID:23547110
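
    For orientation only, a drastically simplified sketch of a pathway-level deregulation score: Pathifier itself fits a principal curve through the samples, whereas the stand-in below uses the first principal component and measures each sample's displacement from the reference (normal) samples along it. Names and inputs are hypothetical.

```python
import numpy as np
from sklearn.decomposition import PCA

def pathway_deregulation_scores(expr, pathway_genes, normal_mask):
    """expr: pandas DataFrame (samples x genes); pathway_genes: list of gene names;
    normal_mask: boolean array marking the non-tumour reference samples.

    Pathifier projects samples onto a principal curve; here the first principal
    component serves as a simplified stand-in, and the score is each sample's
    distance along it from the centroid of the reference samples."""
    sub = expr[pathway_genes].to_numpy(dtype=float)
    sub = (sub - sub.mean(axis=0)) / (sub.std(axis=0) + 1e-9)   # z-score each gene
    coord = PCA(n_components=1).fit_transform(sub).ravel()       # position along PC1
    ref = coord[normal_mask].mean()
    score = np.abs(coord - ref)
    return score / (score.max() + 1e-12)                         # normalised per-sample score
```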

  8. Structural Analysis Using Computer Based Methods

    NASA Technical Reports Server (NTRS)

    Dietz, Matthew R.

    2013-01-01

    The stiffness of a flex hose that will be used in the umbilical arms of the Space Launch System's mobile launcher needed to be determined in order to properly qualify ground umbilical plate behavior during vehicle separation post T-0. This data is also necessary to properly size and design the motors used to retract the umbilical arms. Therefore an experiment was created to determine the stiffness of the hose. Before the test apparatus for the experiment could be built, the structure had to be analyzed to ensure it would not fail under the given loading conditions. The design model was imported into the analysis software and optimized to decrease runtime while still providing accurate results and allowing for seamless meshing. Areas exceeding the allowable stresses in the structure were located and modified before submitting the design for fabrication. In addition, a mock-up of a deep space habitat and its support frame was designed and needed to be analyzed for structural integrity under different loading conditions. The load cases were provided by the customer and were applied to the structure after optimizing the geometry. Once again, weak points in the structure were located, design changes were recommended to the customer, and the process was repeated until the load conditions were met without exceeding the allowable stresses. After the stresses met the required factors of safety, the designs were released for fabrication.

  9. Simulation based analysis of laser beam brazing

    NASA Astrophysics Data System (ADS)

    Dobler, Michael; Wiethop, Philipp; Schmid, Daniel; Schmidt, Michael

    2016-03-01

    Laser beam brazing is a well-established joining technology in car body manufacturing with main applications in the joining of divided tailgates and the joining of roof and side panels. A key advantage of laser brazed joints is the seam's visual quality which satisfies highest requirements. However, the laser beam brazing process is very complex and process dynamics are only partially understood. In order to gain deeper knowledge of the laser beam brazing process, to determine optimal process parameters and to test process variants, a transient three-dimensional simulation model of laser beam brazing is developed. This model takes into account energy input, heat transfer as well as fluid and wetting dynamics that lead to the formation of the brazing seam. A validation of the simulation model is performed by metallographic analysis and thermocouple measurements for different parameter sets of the brazing process. These results show that the multi-physical simulation model not only can be used to gain insight into the laser brazing process but also offers the possibility of process optimization in industrial applications. The model's capabilities in determining optimal process parameters are exemplarily shown for the laser power. Small deviations in the energy input can affect the brazing results significantly. Therefore, the simulation model is used to analyze the effect of the lateral laser beam position on the energy input and the resulting brazing seam.

  10. Analysis of Vehicle-Based Security Operations

    SciTech Connect

    Carter, Jason M; Paul, Nate R

    Vehicle-to-vehicle (V2V) communications promises to increase roadway safety by providing each vehicle with 360 degree situational awareness of other vehicles in proximity, and by complementing onboard sensors such as radar or camera in detecting imminent crash scenarios. In the United States, approximately three hundred million automobiles could participate in a fully deployed V2V system if Dedicated Short-Range Communication (DSRC) device use becomes mandatory. The system's reliance on continuous communication, however, provides a potential means for unscrupulous persons to transmit false data in an attempt to cause crashes, create traffic congestion, or simply render the system useless. V2V communications must be highly scalable while retaining robust security and privacy preserving features to meet the intra-vehicle and vehicle-to-infrastructure communication requirements for a growing vehicle population. Oak Ridge National Laboratory is investigating a Vehicle-Based Security System (VBSS) to provide security and privacy for a fully deployed V2V and V2I system. In the VBSS an On-board Unit (OBU) generates short-term certificates and signs Basic Safety Messages (BSM) to preserve privacy and enhance security. This work outlines a potential VBSS structure and its operational concepts; it examines how a vehicle-based system might feasibly provide security and privacy, highlights remaining challenges, and explores potential mitigations to address those challenges. Certificate management alternatives that attempt to meet V2V security and privacy requirements have been examined previously by the research community including privacy-preserving group certificates, shared certificates, and functional encryption. Due to real-world operational constraints, adopting one of these approaches for VBSS V2V communication is difficult. Timely misbehavior detection and revocation are still open problems for any V2V system. We explore the alternative approaches that

  11. Model Based Analysis and Test Generation for Flight Software

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  12. Comparative analysis of data base management systems

    NASA Technical Reports Server (NTRS)

    Smith, R.

    1983-01-01

    A study to determine if the Remote File Inquiry (RFI) system would handle the future requirements of the user community is discussed. RFI is a locally written and locally maintained on-line query/update package. The current and future on-line requirements of the user community were studied. Additional consideration was given to the types of data structuring the users required. The survey indicated the features of greatest benefit were: sort, subtotals, totals, record selection, storage of queries, global updating and the ability to page break. The major deficiencies were: one level of hierarchy, excessive response time, software unreliability, difficulty in adding, deleting and modifying records, complicated error messages and the lack of ability to perform interfield comparisons. Missing features users required were: formatted screens, interfield comparisons, interfield arithmetic, multiple file access, security and data integrity. The survey team recommended Kennedy Space Center move forward to state-of-the-art software, a Data Base Management System which is thoroughly tested and easy to implement and use.

  13. Indoor air quality analysis based on Hadoop

    NASA Astrophysics Data System (ADS)

    Tuo, Wang; Yunhua, Sun; Song, Tian; Liang, Yu; Weihong, Cui

    2014-03-01

    The air of the office environment is our research object. The data on temperature, humidity, and concentrations of carbon dioxide, carbon monoxide and ammonia are collected every one to eight seconds by the sensor monitoring system. All the data are stored in the HBase database of the Hadoop platform. With the help of HBase's column-oriented storage and versioning features (the time column is added automatically), the time-series data sets are built based on the primary key Row-key and the timestamp. The parallel computing programming model MapReduce is used to process the millions of data points collected by the sensors. By analysing the changing trends of parameter values at different times of the same day and at the same time on various dates, the impact of human and other factors on the room microenvironment is determined according to the movement of the office staff. Moreover, an effective way to improve indoor air quality is proposed at the end of this paper.
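
    A small, in-memory illustration of the MapReduce pattern described here (the actual study runs on Hadoop and reads from HBase; the record layout below is a hypothetical stand-in): readings are keyed by parameter and hour of day and averaged to expose the daily pattern.

```python
from collections import defaultdict
from statistics import mean

# records: iterable of (timestamp: datetime, parameter: str, value: float) tuples.

def mapper(record):
    """Key each reading by (parameter, hour of day)."""
    ts, param, value = record
    yield (param, ts.hour), value

def reducer(key, values):
    """Average all readings sharing a (parameter, hour) key."""
    return key, mean(values)

def run(records):
    groups = defaultdict(list)
    for rec in records:                     # "map" phase
        for key, value in mapper(rec):
            groups[key].append(value)       # shuffle/group by key
    return dict(reducer(k, v) for k, v in groups.items())   # "reduce" phase

# run(records) yields e.g. {("co2", 9): 612.4, ...}, exposing the daily pattern the
# paper relates to the movement of office staff.
```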

  14. Engineering Analysis Using a Web-based Protocol

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.; Claus, Russell W.

    2002-01-01

    This paper reviews the development of a web-based framework for engineering analysis. A one-dimensional, high-speed analysis code called LAPIN was used in this study, but the approach can be generalized to any engineering analysis tool. The web-based framework enables users to store, retrieve, and execute an engineering analysis from a standard web-browser. We review the encapsulation of the engineering data into the eXtensible Markup Language (XML) and various design considerations in the storage and retrieval of application data.

  15. Analysis of composition-based metagenomic classification.

    PubMed

    Higashi, Susan; Barreto, André da Motta Salles; Cantão, Maurício Egidio; de Vasconcelos, Ana Tereza Ribeiro

    2012-01-01

    An essential step of a metagenomic study is the taxonomic classification, that is, the identification of the taxonomic lineage of the organisms in a given sample. The taxonomic classification process involves a series of decisions. Currently, in the context of metagenomics, such decisions are usually based on empirical studies that consider one specific type of classifier. In this study we propose a general framework for analyzing the impact that several decisions can have on the classification problem. Instead of focusing on any specific classifier, we define a generic score function that provides a measure of the difficulty of the classification task. Using this framework, we analyze the impact of the following parameters on the taxonomic classification problem: (i) the length of n-mers used to encode the metagenomic sequences, (ii) the similarity measure used to compare sequences, and (iii) the type of taxonomic classification, which can be conventional or hierarchical, depending on whether the classification process occurs in a single shot or in several steps according to the taxonomic tree. We defined a score function that measures the degree of separability of the taxonomic classes under a given configuration induced by the parameters above. We conducted an extensive computational experiment and found out that reasonable values for the parameters of interest could be (i) intermediate values of n, the length of the n-mers; (ii) any similarity measure, because all of them resulted in similar scores; and (iii) the hierarchical strategy, which performed better in all of the cases. As expected, short n-mers generate lower configuration scores because they give rise to frequency vectors that represent distinct sequences in a similar way. On the other hand, large values for n result in sparse frequency vectors that represent differently metagenomic fragments that are in fact similar, also leading to low configuration scores. Regarding the similarity measure, in
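
    A short sketch of the representation being analyzed: each metagenomic fragment is encoded as an n-mer frequency vector and compared with a similarity measure (cosine similarity below; the study found the tested measures behaved similarly).

```python
import itertools
import numpy as np

def kmer_profile(seq, n=4):
    """n-mer frequency vector of a DNA fragment (the study varies n)."""
    kmers = ["".join(p) for p in itertools.product("ACGT", repeat=n)]
    index = {k: i for i, k in enumerate(kmers)}
    v = np.zeros(len(kmers))
    for i in range(len(seq) - n + 1):
        j = index.get(seq[i:i + n].upper())
        if j is not None:            # skip n-mers containing ambiguous bases
            v[j] += 1
    return v / max(v.sum(), 1.0)

def cosine_similarity(a, b):
    """One of several similarity measures; the study found they behaved similarly."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Fragments are then assigned to the taxon with the most similar reference profile,
# either in a single shot or level by level down the taxonomic tree (hierarchical).
```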

  16. Environmental Assessment: General Plan-Based Environmental Impact Analysis Process, Laughlin Air Force Base

    DTIC Science & Technology

    2007-05-01

    BASED ENVIRONMENTAL IMPACT ANALYSIS PROCESS, LAUGHLIN AIR FORCE BASE, TEXAS. AGENCY: 47th Flying Training Wing (FTW), Laughlin Air Force Base (AFB), Texas... [The remainder of this record is OCR residue from a Laughlin Air Force Base environmental restoration map labelling sites AOC01, PS018, WP002, DP008, and WP006.]

  17. Peptidomic analysis of the extensive array of host-defense peptides in skin secretions of the dodecaploid frog Xenopus ruwenzoriensis (Pipidae).

    PubMed

    Coquet, Laurent; Kolodziejek, Jolanta; Jouenne, Thierry; Nowotny, Norbert; King, Jay D; Conlon, J Michael

    2016-09-01

    The Uganda clawed frog Xenopus ruwenzoriensis with a karyotype of 2n=108 is one of the very few vertebrates with dodecaploid status. Peptidomic analysis of norepinephrine-stimulated skin secretions from this species led to the isolation and structural characterization of 23 host-defense peptides belonging to the following families: magainin (3 peptides), peptide glycine-leucine-amide (PGLa; 6 peptides), xenopsin precursor fragment (XPF; 3 peptides), caerulein precursor fragment (CPF; 8 peptides), and caerulein precursor fragment-related peptide (CPF-RP; 3 peptides). In addition, the secretions contained caerulein, identical to the peptide from Xenopus laevis, and two peptides that were identified as members of the trefoil factor family (TFF). The data indicate that silencing of the host-defense peptide genes following polyploidization has been appreciable and non-uniform. Consistent with data derived from comparison of nucleotide sequences of mitochondrial and nuclear genes, cladistic analyses based upon the primary structures of the host-defense peptides provide support for an evolutionary scenario in which X. ruwenzoriensis arose from an allopolyploidization event involving an octoploid ancestor of the present-day frogs belonging to the Xenopus amieti species group and a tetraploid ancestor of Xenopus pygmaeus. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models

    NASA Astrophysics Data System (ADS)

    Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter

    Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences on a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification based approach, and compare this approach to one based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.

  19. Data Base Reexamination as Part of IDS Secondary Analysis.

    ERIC Educational Resources Information Center

    Curry, Blair H.; And Others

    Data reexamination is a critical component for any study. The complexity of the study, the time available for data base development and analysis, and the relationship of the study to educational policy-making can all increase the criticality of such reexamination. Analysis of the error levels in the National Institute of Education's Instructional…

  20. Web-Based Trainer for Electrical Circuit Analysis

    ERIC Educational Resources Information Center

    Weyten, L.; Rombouts, P.; De Maeyer, J.

    2009-01-01

    A Web-based system for training electric circuit analysis is presented in this paper. It is centered on symbolic analysis techniques and it not only verifies the student's final answer, but it also tracks and coaches him/her through all steps of his/her reasoning path. The system mimics homework assignments, enhanced by immediate personalized…

  1. Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.

    SciTech Connect

    Noonan, Nicholas James

    2015-07-01

    This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).

  2. Content Analysis of a Computer-Based Faculty Activity Repository

    ERIC Educational Resources Information Center

    Baker-Eveleth, Lori; Stone, Robert W.

    2013-01-01

    The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…

  3. Integrated Formulation of Beacon-Based Exception Analysis for Multimissions

    NASA Technical Reports Server (NTRS)

    Mackey, Ryan; James, Mark; Park, Han; Zak, Mickail

    2003-01-01

    Further work on beacon-based exception analysis for multimissions (BEAM), a method of real-time, automated diagnosis of complex electromechanical systems, has greatly expanded its capability and suitability for application. This expanded formulation, which fully integrates physical models and symbolic analysis, is described. The new formulation of BEAM expands upon previous advanced techniques for analysis of signal data, utilizing mathematical modeling of the system physics, and expert-system reasoning,

  4. Comprehension-Based versus Production-Based Grammar Instruction: A Meta-Analysis of Comparative Studies

    ERIC Educational Resources Information Center

    Shintani, Natsuko; Li, Shaofeng; Ellis, Rod

    2013-01-01

    This article reports a meta-analysis of studies that investigated the relative effectiveness of comprehension-based instruction (CBI) and production-based instruction (PBI). The meta-analysis only included studies that featured a direct comparison of CBI and PBI in order to ensure methodological and statistical robustness. A total of 35 research…

  5. Dynamic Chest Image Analysis: Evaluation of Model-Based Pulmonary Perfusion Analysis With Pyramid Images

    DTIC Science & Technology

    2001-10-25

    Image Analysis aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the Dynamic Pulmonary Imaging technique [18,5,17,6]. We have proposed and evaluated a multiresolutional method with an explicit ventilation model based on pyramid images for ventilation analysis. We have further extended the method for ventilation analysis to pulmonary perfusion. This paper focuses on the clinical evaluation of our method for

  6. Space shuttle booster multi-engine base flow analysis

    NASA Technical Reports Server (NTRS)

    Tang, H. H.; Gardiner, C. R.; Anderson, W. A.; Navickas, J.

    1972-01-01

    A comprehensive review of currently available techniques pertinent to several prominent aspects of the base thermal problem of the space shuttle booster is given along with a brief review of experimental results. A tractable engineering analysis, capable of predicting the power-on base pressure, base heating, and other base thermal environmental conditions, such as base gas temperature, is presented and used for an analysis of various space shuttle booster configurations. The analysis consists of a rational combination of theoretical treatments of the prominent flow interaction phenomena in the base region. These theories consider jet mixing, plume flow, axisymmetric flow effects, base injection, recirculating flow dynamics, and various modes of heat transfer. Such effects as initial boundary layer expansion at the nozzle lip, reattachment, recompression, choked vent flow, and nonisoenergetic mixing processes are included in the analysis. A unified method was developed and programmed to numerically obtain compatible solutions for the various flow field components in both flight and ground test conditions. Preliminary prediction for a 12-engine space shuttle booster base thermal environment was obtained for a typical trajectory history. Theoretical predictions were also obtained for some clustered-engine experimental conditions. Results indicate good agreement between the data and theoretical predictions.

  7. An Evidence-Based Videotaped Running Biomechanics Analysis.

    PubMed

    Souza, Richard B

    2016-02-01

    Running biomechanics play an important role in the development of injuries. Performing a running biomechanics analysis on injured runners can help to develop treatment strategies. This article provides a framework for a systematic video-based running biomechanics analysis plan based on the current evidence on running injuries, using 2-dimensional (2D) video and readily available tools. Fourteen measurements are proposed in this analysis plan from lateral and posterior video. Identifying simple 2D surrogates for 3D biomechanical variables of interest allows for widespread translation of best practices, and has the best opportunity to impact the highly prevalent problem of the injured runner. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Preprocessing and Analysis of LC-MS-Based Proteomic Data

    PubMed Central

    Tsai, Tsung-Heng; Wang, Minkun; Ressom, Habtom W.

    2016-01-01

    Liquid chromatography coupled with mass spectrometry (LC-MS) has been widely used for profiling protein expression levels. This chapter is focused on LC-MS data preprocessing, which is a crucial step in the analysis of LC-MS based proteomics. We provide a high-level overview, highlight associated challenges, and present a step-by-step example for analysis of data from LC-MS based untargeted proteomic study. Furthermore, key procedures and relevant issues with the subsequent analysis by multiple reaction monitoring (MRM) are discussed. PMID:26519169

  9. Model-Based Linkage Analysis of a Quantitative Trait.

    PubMed

    Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H

    2017-01-01

    Linkage Analysis is a family-based method of analysis to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.

  10. CognitionMaster: an object-based image analysis framework

    PubMed Central

    2013-01-01

    Background Automated image analysis methods are becoming more and more important to extract and quantify image features in microscopy-based biomedical studies and several commercial or open-source tools are available. However, most of the approaches rely on pixel-wise operations, a concept that has limitations when high-level object features and relationships between objects are studied and if user-interactivity on the object-level is desired. Results In this paper we present an open-source software that facilitates the analysis of content features and object relationships by using objects as basic processing unit instead of individual pixels. Our approach enables also users without programming knowledge to compose “analysis pipelines“ that exploit the object-level approach. We demonstrate the design and use of example pipelines for the immunohistochemistry-based cell proliferation quantification in breast cancer and two-photon fluorescence microscopy data about bone-osteoclast interaction, which underline the advantages of the object-based concept. Conclusions We introduce an open source software system that offers object-based image analysis. The object-based concept allows for a straight-forward development of object-related interactive or fully automated image analysis solutions. The presented software may therefore serve as a basis for various applications in the field of digital image analysis. PMID:23445542

  11. Trial-Based Functional Analysis Informs Treatment for Vocal Scripting.

    PubMed

    Rispoli, Mandy; Brodhead, Matthew; Wolfe, Katie; Gregori, Emily

    2018-05-01

    Research on trial-based functional analysis has primarily focused on socially maintained challenging behaviors. However, procedural modifications may be necessary to clarify ambiguous assessment results. The purposes of this study were to evaluate the utility of iterative modifications to trial-based functional analysis on the identification of putative reinforcement and subsequent treatment for vocal scripting. For all participants, modifications to the trial-based functional analysis identified a primary function of automatic reinforcement. The structure of the trial-based format led to identification of social attention as an abolishing operation for vocal scripting. A noncontingent attention treatment was evaluated using withdrawal designs for each participant. This noncontingent attention treatment resulted in near zero levels of vocal scripting for all participants. Implications for research and practice are presented.

  12. Success Probability Analysis for Shuttle Based Microgravity Experiments

    NASA Technical Reports Server (NTRS)

    Liou, Ying-Hsin Andrew

    1996-01-01

    Presented in this report are the results of data analysis of shuttle-based microgravity flight experiments. Potential factors were identified in the previous grant period, and in this period 26 factors were selected for data analysis. In this project, the degree of success was developed and used as the performance measure. 293 of the 391 experiments in the Lewis Research Center Microgravity Database were assigned degrees of success. The frequency analysis and the analysis of variance were conducted to determine the significance of the factors that affect experiment success.

  13. Basic gait analysis based on continuous wave radar.

    PubMed

    Zhang, Jun

    2012-09-01

    A gait analysis method based on continuous wave (CW) radar is proposed in this paper. Time-frequency analysis is used to analyze the radar micro-Doppler echo from walking humans, and the relationships between the time-frequency spectrogram and human biological gait are discussed. The methods for extracting the gait parameters from the spectrogram are studied in depth and experiments on more than twenty subjects have been performed to acquire the radar gait data. The gait parameters are calculated and compared. The gait differences between men and women are presented based on the experimental data and extracted features. Gait analysis based on CW radar will provide a new method for clinical diagnosis and therapy. Copyright © 2012 Elsevier B.V. All rights reserved.
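
    A hedged SciPy sketch of the time-frequency step: a synthetic micro-Doppler echo stands in for the measured CW-radar return, and a basic gait parameter (the cadence) is estimated from the oscillation of the dominant Doppler frequency in the spectrogram. All numbers are illustrative.

```python
import numpy as np
from scipy.signal import spectrogram

# Toy complex baseband echo: torso Doppler of 60 Hz, modulated +/- 20 Hz at a 0.9 Hz cadence.
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
phase = 2 * np.pi * (60 * t + (20 / (2 * np.pi * 0.9)) * np.sin(2 * np.pi * 0.9 * t))
echo = np.exp(1j * phase)

# Time-frequency analysis of the micro-Doppler signature.
f, tt, Sxx = spectrogram(echo, fs=fs, nperseg=128, noverlap=96, return_onesided=False)

# Track the dominant Doppler bin over time; its oscillation rate is the gait cadence.
freq_track = f[np.abs(Sxx).argmax(axis=0)]
spec = np.abs(np.fft.rfft(freq_track - freq_track.mean()))
cadence_hz = np.fft.rfftfreq(freq_track.size, d=tt[1] - tt[0])[spec.argmax()]
print("estimated cadence ~ %.2f Hz" % cadence_hz)   # ~0.9 Hz for this toy signal
```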

  14. CASAS: Cancer Survival Analysis Suite, a web based application.

    PubMed

    Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne

    2017-01-01

    We present CASAS, a shiny R based tool for interactive survival analysis and visualization of results. The tool provides a web-based one stop shop to perform the following types of survival analysis:  quantile, landmark and competing risks, in addition to standard survival analysis.  The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots.  Univariate analysis can be performed on one or several user specified variable(s) simultaneously, the results of which are displayed in a single table that includes log rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/.
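
    CASAS itself is an R/shiny application; purely as an illustration of the kinds of summaries it produces (Kaplan-Meier curves, median survival, univariate hazard ratios with significance), a Python sketch using the lifelines package is given below with hypothetical file and column names.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical columns: time-to-event, event indicator (1 = event), group, age.
df = pd.read_csv("cohort.csv")

# Kaplan-Meier curve and median survival per group.
km = KaplanMeierFitter()
for name, grp in df.groupby("group"):
    km.fit(grp["time"], grp["event"], label=str(name))
    print(name, "median survival:", km.median_survival_time_)

# Univariate Cox model: hazard ratio and p-value, analogous to the single summary table.
cph = CoxPHFitter()
cph.fit(df[["time", "event", "age"]], duration_col="time", event_col="event")
cph.print_summary()
```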

  15. CASAS: Cancer Survival Analysis Suite, a web based application

    PubMed Central

    Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne

    2017-01-01

    We present CASAS, a shiny R based tool for interactive survival analysis and visualization of results. The tool provides a web-based one stop shop to perform the following types of survival analysis:  quantile, landmark and competing risks, in addition to standard survival analysis.  The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots.  Univariate analysis can be performed on one or several user specified variable(s) simultaneously, the results of which are displayed in a single table that includes log rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/. PMID:28928946

  16. 13C-based metabolic flux analysis: fundamentals and practice.

    PubMed

    Yang, Tae Hoon

    2013-01-01

    Isotope-based metabolic flux analysis is one of the emerging technologies applied to system level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, (13)C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement (13)C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is of importance along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing (13)C-labeling experiments and properly acquiring key data sets essential for in vivo flux analysis implementation. In this regard, the modeling fundamentals of (13)C-labeling systems and analytical data processing are the main topics we will deal with in this chapter. Along with this, the relevant numerical optimization techniques are addressed to help implementation of the entire computational procedures aiming at (13)C-based metabolic flux analysis in vivo.
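
    A deliberately tiny example of the fitting idea behind isotope-based flux analysis: a measured enrichment is modeled as a flux-weighted mixture of pathway-specific enrichments and the flux split is estimated by least squares. Real 13C-MFA fits full isotopomer or cumomer balance models; the numbers below are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy flux-ratio estimation: metabolite B is formed by two converging pathways whose
# products carry different 13C enrichments (p1, p2, assumed known from atom mapping).
p1, p2 = 0.45, 0.05          # hypothetical enrichments contributed by each route
measured_b = 0.29            # hypothetical measurement (e.g., from GC-MS)

def residuals(x):
    v1 = x[0]                # fraction of B formed via pathway 1 (pathway 2 carries 1 - v1)
    return [v1 * p1 + (1.0 - v1) * p2 - measured_b]

fit = least_squares(residuals, x0=[0.5], bounds=(0.0, 1.0))
print("estimated flux fraction via pathway 1: %.2f" % fit.x[0])   # ~0.60 for these numbers
```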

  17. Event-based analysis of free-living behaviour.

    PubMed

    Granat, Malcolm H

    2012-11-01

    The quantification of free-living physical activities is important in understanding how physical activity and sedentary behaviour impact on health and also on how interventions might modify free-living behaviour to enhance health. Quantification, and the terminology used, has in many ways been determined by the choice of measurement technique. The inter-related issues around measurement devices and terminology used are explored. This paper proposes a terminology and a systematic approach for the analysis of free-living activity information using event-based activity data. The event-based approach uses a flexible hierarchical classification of events and, dependent on the research question, analysis can then be undertaken on a selection of these events. The quantification of free-living behaviour is therefore the result of the analysis on the patterns of these chosen events. The application of this approach is illustrated with results from a range of published studies by our group showing how event-based analysis provides a flexible yet robust method of addressing the research question(s) and provides a deeper insight into free-living behaviour. It is proposed that it is through event-based analysis we can more clearly understand how behaviour is related to health and also how we can produce more relevant outcome measures.
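
    A minimal sketch of the event-based idea: an epoch-by-epoch activity classification is collapsed into events (uninterrupted runs of one activity), and analysis is then run on a selection of those events. Activity labels and thresholds are illustrative.

```python
from dataclasses import dataclass
from itertools import groupby

@dataclass
class Event:
    kind: str          # e.g. "sitting", "standing", "walking" (hypothetical labels)
    start_s: float
    duration_s: float

def to_events(samples, epoch_s=1.0):
    """Collapse an epoch-by-epoch activity classification into events, one per
    uninterrupted run of the same activity."""
    events, t = [], 0.0
    for kind, run in groupby(samples):
        n = len(list(run))
        events.append(Event(kind, t, n * epoch_s))
        t += n * epoch_s
    return events

def long_sedentary_bouts(events, min_s=1800):
    """Example selection from the event hierarchy: sitting bouts of at least 30 minutes."""
    return [e for e in events if e.kind == "sitting" and e.duration_s >= min_s]
```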

  18. Paper-based analytical devices for environmental analysis.

    PubMed

    Meredith, Nathan A; Quinn, Casey; Cate, David M; Reilly, Thomas H; Volckens, John; Henry, Charles S

    2016-03-21

    The field of paper-based microfluidics has experienced rapid growth over the past decade. Microfluidic paper-based analytical devices (μPADs), originally developed for point-of-care medical diagnostics in resource-limited settings, are now being applied in new areas, such as environmental analyses. Low-cost paper sensors show great promise for on-site environmental analysis; the theme of ongoing research complements existing instrumental techniques by providing high spatial and temporal resolution for environmental monitoring. This review highlights recent applications of μPADs for environmental analysis along with technical advances that may enable μPADs to be more widely implemented in field testing.

  19. Exploratory Analysis of Supply Chains in the Defense Industrial Base

    DTIC Science & Technology

    2012-04-01

    Instruments Industry Group 382: Laboratory Apparatus and Analytical, Optical, Measuring, and Controlling Instruments; 3821 Laboratory Apparatus and Furniture... INSTITUTE FOR DEFENSE ANALYSES. Exploratory Analysis of Supply Chains in the Defense Industrial Base. James R. Dominy... contract DASW01-04-C-0003, AH-7-3315, "Exploratory Analysis of Supply Chains in the Defense Industrial Base," for the Director, Industrial Policy. The

  20. Graph-based urban scene analysis using symbolic data

    NASA Astrophysics Data System (ADS)

    Moissinac, Henri; Maitre, Henri; Bloch, Isabelle

    1995-07-01

    A framework is presented for the interpretation of an urban landscape based on the analysis of aerial pictures. This method has been designed for the use of a priori knowledge provided by a geographic map in order to improve the image analysis stage. A coherent final interpretation of the studied area is proposed. It relies on a graph-based data structure to model the urban landscape, and on a global uncertainty management to evaluate the final confidence we can have in the results presented. This structure and uncertainty management tend to reflect the hierarchy of the available data and the interpretation levels.

  1. Web-Based Analysis and Publication of Flow Cytometry Experiments

    PubMed Central

    Kotecha, Nikesh; Krutzik, Peter O.; Irish, Jonathan M.

    2014-01-01

    Cytobank is a web-based application for storage, analysis, and sharing of flow cytometry experiments. Researchers use a web browser to log in and use a wide range of tools developed for basic and advanced flow cytometry. In addition to providing access to standard cytometry tools from any computer, Cytobank creates a platform and community for developing new analysis and publication tools. Figure layouts created on Cytobank are designed to allow transparent access to the underlying experiment annotation and data processing steps. Since all flow cytometry files and analysis data are stored on a central server, experiments and figures can be viewed or edited by anyone with the proper permissions from any computer with Internet access. Once a primary researcher has performed the initial analysis of the data, collaborators can engage in experiment analysis and make their own figure layouts using the gated, compensated experiment files. Cytobank is available to the scientific community at www.cytobank.org PMID:20578106

  2. Research of second harmonic generation images based on texture analysis

    NASA Astrophysics Data System (ADS)

    Liu, Yao; Li, Yan; Gong, Haiming; Zhu, Xiaoqin; Huang, Zufang; Chen, Guannan

    2014-09-01

    Texture analysis plays a crucial role in identifying objects or regions of interest in an image. It has been applied to a variety of medical image processing tasks, ranging from the detection of disease and the segmentation of specific anatomical structures to the differentiation between healthy and pathological tissues. Second harmonic generation (SHG) microscopy, as a potential noninvasive tool for imaging biological tissues, has been widely used in medicine, with reduced phototoxicity and photobleaching. In this paper, we clarify the principles of texture analysis, including statistical, transform, structural and model-based methods, and give examples of its applications, reviewing studies of the technique. Moreover, we apply texture analysis to SHG images for the differentiation of human skin scar tissues. A texture analysis method based on local binary patterns (LBP) and the wavelet transform was used to extract texture features of SHG images from collagen in normal and abnormal scars, and the scar SHG images were then classified as normal or abnormal. Compared with other texture analysis methods with respect to receiver operating characteristic analysis, LBP combined with the wavelet transform was demonstrated to achieve higher accuracy. It can provide a new way for the clinical diagnosis of scar types. Finally, future developments of texture analysis for SHG images are discussed.
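
    A hedged sketch of the feature-extraction step, combining a local binary pattern histogram with wavelet sub-band energies via scikit-image and PyWavelets; the parameter values are illustrative rather than those used in the study.

```python
import numpy as np
import pywt
from skimage.feature import local_binary_pattern

def texture_features(img, P=8, R=1, wavelet="db2", level=2):
    """Combine a rotation-invariant LBP histogram with wavelet sub-band energies."""
    img = np.asarray(img, dtype=float)

    # Local binary pattern histogram ("uniform" patterns take values 0..P+1)
    lbp = local_binary_pattern(img, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)

    # Mean energy of each wavelet detail sub-band
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    energies = [np.mean(np.square(band)) for lvl in coeffs[1:] for band in lvl]

    return np.concatenate([hist, energies])

# The feature vectors can then be fed to any off-the-shelf classifier (e.g. an SVM)
# to separate normal-scar from abnormal-scar SHG images.
```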

  3. Breath analysis based on micropreconcentrator for early cancer diagnosis

    NASA Astrophysics Data System (ADS)

    Lee, Sang-Seok

    2018-02-01

    We are developing micropreconcentrators based on micro/nanotechnology to detect trace levels of volatile organic compound (VOC) gases contained in human and canine exhaled breath. The possibility of using exhaled VOC gases as biomarkers for various cancer diagnoses has been previously discussed. For early cancer diagnosis, detection of trace levels of VOC gas is indispensable. Using micropreconcentrators based on MEMS technology or nanotechnology is very promising for the detection of VOC gas. A micropreconcentrator-based breath analysis technique also has advantages in terms of cost performance and availability for the diagnosis of various cancers. In this paper, we introduce the design, fabrication and evaluation results of our MEMS- and nanotechnology-based micropreconcentrators. In the MEMS-based device, we propose a flower-leaf-type Si microstructure, and its shape and configuration are optimized quantitatively by finite element method simulation. The nanotechnology-based micropreconcentrator consists of carbon nanotube (CNT) structures. As a result, we achieve ppb-level VOC gas detection with our micropreconcentrators and an ordinary gas chromatography system that by itself can detect VOCs only on the order of ppm in gas samples. In the performance evaluation, we also confirm that the CNT-based micropreconcentrator shows a 115 times better concentration ratio than the Si-based micropreconcentrator. Moreover, we discuss a commercialization idea for new cancer diagnosis using breath analysis. Future work and preliminary clinical testing in dogs are also discussed.

  4. CASKS (Computer Analysis of Storage Casks): A microcomputer based analysis system for storage cask review

    SciTech Connect

    Chen, T.F.; Mok, G.C.; Carlson, R.W.

    1996-12-01

    CASKS is a microcomputer based computer system developed by LLNL to assist the Nuclear Regulatory Commission in performing confirmatory analyses for licensing review of radioactive-material storage cask designs. The analysis programs of the CASKS computer system consist of four modules--the impact analysis module, the thermal analysis module, the thermally-induced stress analysis module, and the pressure-induced stress analysis module. CASKS uses a series of menus to coordinate input programs, cask analysis programs, output programs, data archive programs and databases, so the user is able to run the system in an interactive environment. This paper outlines the theoretical background on the impact analysis module and the yielding surface formulation. The close agreement between the CASKS analytical predictions and the results obtained from the two storage cask drop tests performed by SNL and by BNFL at Winfrith serves as the validation of the CASKS impact analysis module.

  5. Casks (computer analysis of storage casks): A microcomputer based analysis system for storage cask review

    SciTech Connect

    Chen, T.F.; Mok, G.C.; Carlson, R.W.

    1995-08-01

    CASKS is a microcomputer based computer system developed by LLNL to assist the Nuclear Regulatory Commission in performing confirmatory analyses for licensing review of radioactive-material storage cask designs. The analysis programs of the CASKS computer system consist of four modules: the impact analysis module, the thermal analysis module, the thermally-induced stress analysis module, and the pressure-induced stress analysis module. CASKS uses a series of menus to coordinate input programs, cask analysis programs, output programs, data archive programs and databases, so the user is able to run the system in an interactive environment. This paper outlines the theoretical background on the impact analysis module and the yielding surface formulation. The close agreement between the CASKS analytical predictions and the results obtained from the two storage cask drop tests performed by SNL and by BNFL at Winfrith serves as the validation of the CASKS impact analysis module.

  6. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    PubMed

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which is obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework EBprot that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristic and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Web-based visual analysis for high-throughput genomics

    PubMed Central

    2013-01-01

    Background Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the

  8. ANALYSIS/PLOT: a graphics package for use with the SORT/ANALYSIS data bases

    SciTech Connect

    Sady, C.A.

    1983-08-01

    This report describes a graphics package that is used with the SORT/ANALYSIS data bases. The data listed by the SORT/ANALYSIS program can be presented in pie, bar, line, or Gantt chart form. Instructions for the use of the plotting program and descriptions of the subroutines are given in the report.

  9. Computer-Based Interaction Analysis with DEGREE Revisited

    ERIC Educational Resources Information Center

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  10. Likelihood-Based Confidence Intervals in Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Oort, Frans J.

    2011-01-01

    In exploratory or unrestricted factor analysis, all factor loadings are free to be estimated. In oblique solutions, the correlations between common factors are free to be estimated as well. The purpose of this article is to show how likelihood-based confidence intervals can be obtained for rotated factor loadings and factor correlations, by…

  11. Time Series Analysis Based on Running Mann Whitney Z Statistics

    USDA-ARS?s Scientific Manuscript database

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
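    The record above is truncated, but the described procedure (rankings in moving windows converted to Mann-Whitney U and then Z statistics) is straightforward to illustrate. Below is a minimal Python sketch of a running Mann-Whitney Z statistic for change detection; the window length, the synthetic series, and the use of the standard normal approximation (rather than whatever Monte Carlo normalization the truncated abstract refers to) are assumptions for illustration, not the authors' implementation.

```python
# Sketch of a running Mann-Whitney Z statistic for change detection in a time
# series. Window length and the normal-approximation normalization are assumed.
import numpy as np
from scipy.stats import mannwhitneyu

def running_mw_z(series, window=30):
    """Slide two adjacent windows along the series and return a Z statistic
    comparing the ranks of the earlier window against the later one."""
    series = np.asarray(series, dtype=float)
    z = np.full(series.size, np.nan)
    for t in range(window, series.size - window):
        a = series[t - window:t]          # earlier window
        b = series[t:t + window]          # later window
        u, _ = mannwhitneyu(a, b, alternative="two-sided")
        # Normal approximation: E[U] = n1*n2/2, Var[U] = n1*n2*(n1+n2+1)/12
        n1, n2 = a.size, b.size
        mu = n1 * n2 / 2.0
        sigma = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
        z[t] = (u - mu) / sigma
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(0, 1, 200), rng.normal(1.5, 1, 200)])  # step change
    z = running_mw_z(x, window=30)
    print("largest |Z| near the change point:", np.nanmax(np.abs(z)))
```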

  12. Advancing School-Based Interventions through Economic Analysis

    ERIC Educational Resources Information Center

    Olsson, Tina M.; Ferrer-Wreder, Laura; Eninger, Lilianne

    2014-01-01

    Commentators interested in school-based prevention programs point to the importance of economic issues for the future of prevention efforts. Many of the processes and aims of prevention science are dependent upon prevention resources. Although economic analysis is an essential tool for assessing resource use, the attention given economic analysis…

  13. Problem Based Learning: Cognitive and Metacognitive Processes during Problem Analysis.

    ERIC Educational Resources Information Center

    De Grave, W. S.; And Others

    1996-01-01

    To investigate whether problem-based learning leads to conceptual change, the cognitive and metacognitive processes of a group of medical students were studied during the problem analysis phase, and their verbal communication and thinking processes were analyzed. Stimulated recall of the thinking process during the discussion detected a conceptual…

  14. Spreadsheet-based engine data analysis tool - user's guide.

    DOT National Transportation Integrated Search

    2016-07-01

    This record refers to both the spreadsheet tool - Fleet Equipment Performance Measurement Preventive Maintenance Model: Spreadsheet-Based Engine Data Analysis Tool, http://ntl.bts.gov/lib/60000/60000/60007/0-6626-P1_Final.xlsm - and its accompanying ...

  15. Automatic Online Lecture Highlighting Based on Multimedia Analysis

    ERIC Educational Resources Information Center

    Che, Xiaoyin; Yang, Haojin; Meinel, Christoph

    2018-01-01

    Textbook highlighting is widely considered to be beneficial for students. In this paper, we propose a comprehensive solution to highlight the online lecture videos in both sentence- and segment-level, just as is done with paper books. The solution is based on automatic analysis of multimedia lecture materials, such as speeches, transcripts, and…

  16. Utilizing Problem-Based Learning in Qualitative Analysis Lab Experiments

    ERIC Educational Resources Information Center

    Hicks, Randall W.; Bevsek, Holly M.

    2012-01-01

    A series of qualitative analysis (QA) laboratory experiments utilizing a problem-based learning (PBL) module has been designed and implemented. The module guided students through the experiments under the guise of cleaning up a potentially contaminated water site as employees of an environmental chemistry laboratory. The main goal was the…

  17. An Analysis of Losses to the Southern Commercial Timberland Base

    Treesearch

    Ian A. Munn; David Cleaves

    1998-01-01

    Demographic and physical factors influencing the conversion of commercial timberland in the South to non-forestry uses between the last two Forest Inventory Analysis (FIA) surveys were investigated. GIS techniques linked Census data and FIA plot level data. Multinomial logit regression identified factors associated with losses to the timberland base. Conversion to...

  18. Content-Based Analysis of Bumper Stickers in Jordan

    ERIC Educational Resources Information Center

    Jaradat, Abdullah A.

    2016-01-01

    This study has set out to investigate bumper stickers in Jordan focusing mainly on the themes of the stickers. The study hypothesized that bumper stickers in Jordan reflect a wide range of topics including social, economic, and political. Due to being the first study of this phenomenon, the study has adopted content-based analysis to determine the…

  19. Geopolitical E-Analysis Based on E-Learning Content

    ERIC Educational Resources Information Center

    Dinicu, Anca; Oancea, Romana

    2017-01-01

    In a world of great complexity, understanding the manner states act and react becomes more and more an intriguing quest due to the multiple relations of dependence and interdependence that characterize "the global puzzle". Within this context, an analysis based on a geopolitical approach becomes a very useful means used to determine not…

  20. Analysis of Computer Network Information Based on "Big Data"

    NASA Astrophysics Data System (ADS)

    Li, Tianli

    2017-11-01

    As the era develops, computer networks and big data have gradually become part of everyday life. People use computers to make their lives more convenient, but at the same time many information-security problems demand attention. This paper analyzes the information security of computer networks from a "big data" perspective and puts forward some solutions.

  1. A Rational Analysis of Rule-Based Concept Learning

    ERIC Educational Resources Information Center

    Goodman, Noah D.; Tenenbaum, Joshua B.; Feldman, Jacob; Griffiths, Thomas L.

    2008-01-01

    This article proposes a new model of human concept learning that provides a rational analysis of learning feature-based concepts. This model is built upon Bayesian inference for a grammatically structured hypothesis space--a concept language of logical rules. This article compares the model predictions to human generalization judgments in several…

  2. a Buffer Analysis Based on Co-Location Algorithm

    NASA Astrophysics Data System (ADS)

    Zhou, G.; Huang, S.; Wang, H.; Zhang, R.; Wang, Q.; Sha, H.; Liu, X.; Pan, Q.

    2018-05-01

    Buffer analysis is a common tool of spatial analysis that deals with problems of proximity in GIS. It examines the relationship between a central object and the other objects within a certain distance of it. Buffer analysis can present complicated problems more scientifically and visually, and provide valuable information for users. Over the past decades, much research has been devoted to buffer analysis. As the accuracy demanded of spatial analysis keeps improving, users expect its results to express the actual situation more exactly. Because of various factors, the extent of influence and contact range of a geographic element on surrounding objects is uncertain. Each object in nature has its own characteristics and rules of change; objects are both independent of and related to one another. However, almost all existing buffer-generation algorithms are based on a fixed buffer distance and do not consider the co-location relationship among instances. Consequently, resources are wasted retrieving useless information while useful information is ignored.
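    As a point of reference for the fixed-distance buffers this abstract argues against, the following is a minimal sketch of a conventional buffer query using Shapely; the geometries and the 100 m distance are invented for illustration.

```python
# Minimal fixed-distance buffer example using Shapely, illustrating the
# conventional approach the abstract contrasts with co-location-aware buffers.
# The geometries and the 100 m distance are illustrative assumptions.
from shapely.geometry import Point, LineString

well = Point(500.0, 500.0)
road = LineString([(0.0, 0.0), (1000.0, 800.0)])

well_zone = well.buffer(100.0)      # circular buffer around a point feature
road_zone = road.buffer(100.0)      # corridor buffer around a line feature

# A typical proximity query: which features fall inside the buffer?
print("road intersects the well buffer:", well_zone.intersects(road))
print("overlap area of the two buffers:", well_zone.intersection(road_zone).area)
```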

  3. [Electroencephalogram Feature Selection Based on Correlation Coefficient Analysis].

    PubMed

    Zhou, Jinzhi; Tang, Xiaofang

    2015-08-01

    In order to improve classification accuracy with small amounts of motor imagery training data in the development of brain-computer interface (BCI) systems, we proposed an analysis method that automatically selects characteristic parameters based on correlation coefficient analysis. Using the five sample sets of dataset IVa from the 2005 BCI Competition, we applied the short-time Fourier transform (STFT) and correlation coefficient calculation to reduce the dimensionality of the raw electroencephalogram data, then performed feature extraction based on common spatial patterns (CSP) and classified the results with linear discriminant analysis (LDA). Simulation results showed that the average classification accuracy was higher with the correlation coefficient feature selection method than without it. Compared with a support vector machine (SVM) feature optimization algorithm, correlation coefficient analysis leads to better parameter selection and improves classification accuracy.
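    A minimal sketch of the correlation-coefficient feature-selection step followed by LDA classification is given below. The log band-power feature extraction, the data shapes, and the number of retained features are assumptions, and the CSP stage described in the abstract is omitted for brevity.

```python
# Sketch of correlation-coefficient feature selection followed by LDA, in the
# spirit of the pipeline above. Feature extraction is reduced to log band power
# per channel; the CSP stage is omitted and all shapes/thresholds are assumed.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def band_power_features(trials, fs=100.0, band=(8.0, 30.0)):
    """trials: (n_trials, n_channels, n_samples) -> log band power features."""
    spectra = np.abs(np.fft.rfft(trials, axis=-1)) ** 2
    freqs = np.fft.rfftfreq(trials.shape[-1], d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.log(spectra[..., mask].mean(axis=-1))       # (n_trials, n_channels)

def select_by_correlation(features, labels, n_keep=10):
    """Keep the features whose |Pearson correlation| with the labels is largest."""
    corrs = np.array([np.corrcoef(features[:, j], labels)[0, 1]
                      for j in range(features.shape[1])])
    return np.argsort(-np.abs(corrs))[:n_keep]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X_raw = rng.normal(size=(120, 64, 200))     # 120 trials, 64 channels, 2 s @ 100 Hz
    y = rng.integers(0, 2, size=120)
    feats = band_power_features(X_raw)
    keep = select_by_correlation(feats, y, n_keep=10)
    clf = LinearDiscriminantAnalysis().fit(feats[:, keep], y)
    print("training accuracy on selected channels:", clf.score(feats[:, keep], y))
```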

  4. Control volume based hydrocephalus research; analysis of human data

    NASA Astrophysics Data System (ADS)

    Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer

    2010-11-01

    Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; these are qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure-volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics-based framework. Clinical data obtained for analysis are discussed along with the data processing techniques used to extract terms in the conservation equations. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.

  5. Laser-Based Lighting: Experimental Analysis and Perspectives

    PubMed Central

    Yushchenko, Maksym; Buffolo, Matteo; Meneghini, Matteo; Zanoni, Enrico

    2017-01-01

    This paper presents an extensive analysis of the operating principles, theoretical background, advantages and limitations of laser-based lighting systems. In the first part of the paper we discuss the main advantages and issues of laser-based lighting, and present a comparison with conventional LED-lighting technology. In the second part of the paper, we present original experimental data on the stability and reliability of phosphor layers for laser lighting, based on high light-intensity and high-temperature degradation tests. In the third part of the paper (for the first time) we present a detailed comparison between three different solutions for laser lighting, based on (i) transmissive phosphor layers; (ii) a reflective/angled phosphor layer; and (iii) a parabolic reflector, by discussing the advantages and drawbacks of each approach. The results presented within this paper can be used as a guideline for the development of advanced lighting systems based on laser diodes. PMID:29019958

  6. Analysis of Aerospike Plume Induced Base-Heating Environment

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See

    1998-01-01

    Computational analysis is conducted to study the effect of an aerospike engine plume on the X-33 base-heating environment during ascent flight. To properly account for the effect of forebody and aftbody flowfield features such as shocks, and to allow for potential plume-induced flow separation, the thermo-flowfield at the trajectory points is computed. The computational methodology is based on a three-dimensional finite-difference, viscous flow, chemically reacting, pressure-based computational fluid dynamics formulation, and a three-dimensional, finite-volume, spectral-line-based weighted-sum-of-gray-gases radiation absorption model computational heat transfer formulation. The predicted convective and radiative base-heat fluxes are presented.

  7. Effects of Interventions Based in Behavior Analysis on Motor Skill Acquisition: A Meta-Analysis

    ERIC Educational Resources Information Center

    Alstot, Andrew E.; Kang, Minsoo; Alstot, Crystal D.

    2013-01-01

    Techniques based in applied behavior analysis (ABA) have been shown to be useful across a variety of settings to improve numerous behaviors. Specifically within physical activity settings, several studies have examined the effect of interventions based in ABA on a variety of motor skills, but the overall effects of these interventions are unknown.…

  8. Geographical classification of Epimedium based on HPLC fingerprint analysis combined with multi-ingredients quantitative analysis.

    PubMed

    Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang

    2017-05-01

    A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality. Copyright © 2016 John Wiley & Sons, Ltd.

  9. Thermodynamics-based Metabolite Sensitivity Analysis in metabolic networks.

    PubMed

    Kiparissides, A; Hatzimanikatis, V

    2017-01-01

    The increasing availability of large metabolomics datasets enhances the need for computational methodologies that can organize the data in a way that can lead to the inference of meaningful relationships. Knowledge of the metabolic state of a cell and how it responds to various stimuli and extracellular conditions can offer significant insight in the regulatory functions and how to manipulate them. Constraint based methods, such as Flux Balance Analysis (FBA) and Thermodynamics-based flux analysis (TFA), are commonly used to estimate the flow of metabolites through genome-wide metabolic networks, making it possible to identify the ranges of flux values that are consistent with the studied physiological and thermodynamic conditions. However, unless key intracellular fluxes and metabolite concentrations are known, constraint-based models lead to underdetermined problem formulations. This lack of information propagates as uncertainty in the estimation of fluxes and basic reaction properties such as the determination of reaction directionalities. Therefore, knowledge of which metabolites, if measured, would contribute the most to reducing this uncertainty can significantly improve our ability to define the internal state of the cell. In the present work we combine constraint based modeling, Design of Experiments (DoE) and Global Sensitivity Analysis (GSA) into the Thermodynamics-based Metabolite Sensitivity Analysis (TMSA) method. TMSA ranks metabolites comprising a metabolic network based on their ability to constrain the gamut of possible solutions to a limited, thermodynamically consistent set of internal states. TMSA is modular and can be applied to a single reaction, a metabolic pathway or an entire metabolic network. This is, to our knowledge, the first attempt to use metabolic modeling in order to provide a significance ranking of metabolites to guide experimental measurements. Copyright © 2016 International Metabolic Engineering Society. Published by Elsevier

  10. Economic analysis of transmission line engineering based on industrial engineering

    NASA Astrophysics Data System (ADS)

    Li, Yixuan

    2017-05-01

    Modern industrial engineering is applied to the technical and cost analysis of power transmission and transformation engineering, where it can effectively reduce investment costs. First, the power transmission project is analyzed economically. Based on a feasibility study of power transmission and transformation project investment, a proposal on company-level cost management is put forward through economic analysis of the system's effect, and the cost management system is optimized. Then, through cost analysis of the power transmission and transformation project, new issues arising from construction costs are identified, which is of guiding significance for further improving the cost management of such projects. Finally, in view of the current state of power transmission project cost management, concrete measures to reduce project cost are given from the two aspects of system optimization and technology optimization.

  11. Incorporating Semantics into Data Driven Workflows for Content Based Analysis

    NASA Astrophysics Data System (ADS)

    Argüello, M.; Fernandez-Prieto, M. J.

    Finding meaningful associations between text elements and knowledge structures within clinical narratives in a highly verbal domain, such as psychiatry, is a challenging goal. The research presented here uses a small corpus of case histories and brings into play pre-existing knowledge, and therefore complements other approaches that use large corpora (millions of words) and no pre-existing knowledge. The paper describes a variety of experiments for content-based analysis: Linguistic Analysis using NLP-oriented approaches, Sentiment Analysis, and Semantically Meaningful Analysis. Although it is not standard practice, the paper advocates providing automatic support to annotate the functionality as well as the data for each experiment by performing semantic annotation that uses OWL and OWL-S. Lessons learnt can be transmitted to legacy clinical databases facing the conversion of clinical narratives according to prominent Electronic Health Records standards.

  12. Model-based gene set analysis for Bioconductor.

    PubMed

    Bauer, Sebastian; Robinson, Peter N; Gagneur, Julien

    2011-07-01

    Gene Ontology and other forms of gene-category analysis play a major role in the evaluation of high-throughput experiments in molecular biology. Single-category enrichment analysis procedures such as Fisher's exact test tend to flag large numbers of redundant categories as significant, which can complicate interpretation. We have recently developed an approach called model-based gene set analysis (MGSA), that substantially reduces the number of redundant categories returned by the gene-category analysis. In this work, we present the Bioconductor package mgsa, which makes the MGSA algorithm available to users of the R language. Our package provides a simple and flexible application programming interface for applying the approach. The mgsa package has been made available as part of Bioconductor 2.8. It is released under the conditions of the Artistic license 2.0. peter.robinson@charite.de; julien.gagneur@embl.de.

  13. A Most Probable Point-Based Method for Reliability Analysis, Sensitivity Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Hou, Gene J.-W; Newman, Perry A. (Technical Monitor)

    2004-01-01

    A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The minimum distance associated with the MPP provides a measurement of safety probability, which can be obtained by approximate probability integration methods such as FORM or SORM. The reliability sensitivity equations are derived first in this paper, based on the derivatives of the optimal solution. Examples are provided later to demonstrate the use of these derivatives for better reliability analysis and reliability-based design optimization (RBDO).
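    The following is a minimal sketch of the MPP search described above: the MPP is found by minimizing the distance to the origin in standard normal space subject to the limit-state constraint, and the resulting reliability index is converted to a first-order (FORM) failure probability. The limit-state function and starting point are toy assumptions, not taken from the paper.

```python
# Minimal sketch of finding the most probable point (MPP) for a toy limit-state
# function and turning the resulting reliability index into a FORM probability
# estimate. The limit state g(u) and starting point are illustrative only.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def g(u):
    """Toy limit state in standard normal space; failure when g(u) <= 0."""
    return 5.0 - u[0] - 2.0 * u[1]

# MPP: the point on g(u) = 0 closest to the origin in standard normal space.
res = minimize(fun=lambda u: np.dot(u, u),            # squared distance to origin
               x0=np.array([1.0, 1.0]),
               constraints={"type": "eq", "fun": g},
               method="SLSQP")

u_star = res.x
beta = np.linalg.norm(u_star)                          # reliability index
pf_form = norm.cdf(-beta)                              # first-order failure probability
print("MPP:", u_star, "beta:", beta, "Pf (FORM):", pf_form)
```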

  14. Graph-based layout analysis for PDF documents

    NASA Astrophysics Data System (ADS)

    Xu, Canhui; Tang, Zhi; Tao, Xin; Li, Yun; Shi, Cao

    2013-03-01

    To increase the flexibility and enrich the reading experience of e-books on small portable screens, a graph-based method is proposed to perform layout analysis on Portable Document Format (PDF) documents. Digital-born documents have inherent advantages, such as representing text and fractional images in explicit form, which can be straightforwardly exploited. To integrate traditional image-based document analysis with the inherent metadata provided by the PDF parser, the page primitives, including text, image and path elements, are processed to produce text and non-text layers for separate analysis. The graph-based method operates at the superpixel representation level, and page text elements corresponding to vertices are used to construct an undirected graph. The Euclidean distance between adjacent vertices is applied in a top-down manner to cut the graph tree formed by Kruskal's algorithm, and edge orientation is then used in a bottom-up manner to extract text lines from each subtree. Non-textual objects, on the other hand, are segmented by connected component analysis. For each segmented text and non-text composite, a 13-dimensional feature vector is extracted for labelling purposes. Experimental results on selected pages from PDF books are presented.
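    A minimal sketch of the top-down grouping step is shown below: a minimum spanning tree is built over text-element centroids and edges longer than a threshold are cut to separate blocks. The centroids and the threshold are invented, and the bottom-up edge-orientation line extraction described in the abstract is not shown.

```python
# Sketch of the top-down grouping step: build a minimum spanning tree over text
# element centroids and cut long edges to separate blocks. Centroids and the
# distance threshold are made-up values; the bottom-up line extraction step
# described in the abstract is not shown.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components
from scipy.spatial.distance import squareform, pdist

centroids = np.array([[10, 10], [12, 11], [14, 10],      # block 1
                      [80, 90], [82, 91], [84, 92]])     # block 2

dist = squareform(pdist(centroids))                      # full Euclidean distance matrix
mst = minimum_spanning_tree(dist).toarray()              # MST as a dense matrix

cut_threshold = 20.0                                     # assumed page-scale threshold
mst[mst > cut_threshold] = 0.0                           # cut long edges (top-down step)

n_blocks, labels = connected_components(mst, directed=False)
print("number of text blocks:", n_blocks, "labels:", labels)
```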

  15. Timescale analysis of rule-based biochemical reaction networks

    PubMed Central

    Klinke, David J.; Finley, Stacey D.

    2012-01-01

    The flow of information within a cell is governed by a series of protein-protein interactions that can be described as a reaction network. Mathematical models of biochemical reaction networks can be constructed by repetitively applying specific rules that define how reactants interact and what new species are formed upon reaction. To aid in understanding the underlying biochemistry, timescale analysis is one method developed to prune the size of the reaction network. In this work, we extend the methods associated with timescale analysis to reaction rules instead of the species contained within the network. To illustrate this approach, we applied timescale analysis to a simple receptor-ligand binding model and a rule-based model of Interleukin-12 (IL-12) signaling in naïve CD4+ T cells. The IL-12 signaling pathway includes multiple protein-protein interactions that collectively transmit information; however, the level of mechanistic detail sufficient to capture the observed dynamics has not been justified based upon the available data. The analysis correctly predicted that reactions associated with JAK2 and TYK2 binding to their corresponding receptor exist at a pseudo-equilibrium. In contrast, reactions associated with ligand binding and receptor turnover regulate cellular response to IL-12. An empirical Bayesian approach was used to estimate the uncertainty in the timescales. This approach complements existing rank- and flux-based methods that can be used to interrogate complex reaction networks. Ultimately, timescale analysis of rule-based models is a computational tool that can be used to reveal the biochemical steps that regulate signaling dynamics. PMID:21954150

  16. Risk-based planning analysis for a single levee

    NASA Astrophysics Data System (ADS)

    Hui, Rui; Jachens, Elizabeth; Lund, Jay

    2016-04-01

    Traditional risk-based analysis for levee planning focuses primarily on overtopping failure. Although many levees fail before overtopping, few planning studies explicitly include intermediate geotechnical failures in flood risk analysis. This study develops a risk-based model for two simplified levee failure modes: overtopping failure and overall intermediate geotechnical failure from through-seepage, determined by the levee cross section represented by levee height and crown width. Overtopping failure is based only on water level and levee height, while through-seepage failure depends on many geotechnical factors as well, mathematically represented here as a function of levee crown width using levee fragility curves developed from professional judgment or analysis. These levee planning decisions are optimized to minimize the annual expected total cost, which sums expected (residual) annual flood damage and annualized construction costs. Applicability of this optimization approach to planning new levees or upgrading existing levees is demonstrated preliminarily for a levee on a small river protecting agricultural land, and a major levee on a large river protecting a more valuable urban area. Optimized results show higher likelihood of intermediate geotechnical failure than overtopping failure. The effects of uncertainty in levee fragility curves, economic damage potential, construction costs, and hydrology (changing climate) are explored. Optimal levee crown width is more sensitive to these uncertainties than height, while the derived general principles and guidelines for risk-based optimal levee planning remain the same.
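    A toy sketch of the risk-based planning formulation described above follows: levee height and crown width are chosen to minimize annualized construction cost plus expected annual flood damage from overtopping and through-seepage. Every number (costs, exceedance curve, fragility curve) is an invented placeholder rather than a value from the study.

```python
# Toy sketch of the risk-based planning idea: choose levee height and crown
# width to minimize annualized construction cost plus expected annual flood
# damage from overtopping and through-seepage failure. Every number below
# (costs, flood frequency, fragility curve) is an invented placeholder.
import numpy as np

damage_if_fail = 5.0e6          # economic damage of a levee failure ($)
annual_cost_height = 2.0e4      # annualized construction cost per metre of height
annual_cost_width = 1.0e4       # annualized construction cost per metre of crown width

def p_overtop(height):
    """Assumed annual probability that the water level exceeds the levee crest."""
    return np.exp(-height)      # placeholder exceedance curve

def p_seepage(width):
    """Assumed annual probability of intermediate (through-seepage) failure."""
    return 0.05 * np.exp(-0.5 * width)   # placeholder fragility curve

best = None
for h in np.arange(1.0, 8.0, 0.25):
    for w in np.arange(2.0, 12.0, 0.5):
        p_fail = p_overtop(h) + (1 - p_overtop(h)) * p_seepage(w)
        total = (annual_cost_height * h + annual_cost_width * w
                 + p_fail * damage_if_fail)
        if best is None or total < best[0]:
            best = (total, h, w)

print("minimum annual expected total cost: $%.0f at height %.2f m, width %.1f m" % best)
```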

  17. Sources and Nature of Cost Analysis Data Base Reference Manual.

    DTIC Science & Technology

    1983-07-01

    [OCR fragment of the report documentation page and table of contents; recoverable details: report number USAAVRADCOM TM 83-F-3; Section 6, Data for Multiple Applications; Section 7, Glossary of Cost Analysis Terms; Section 8, References; Section 9, Bibliography.]

  18. Parametric Design and Mechanical Analysis of Beams based on SINOVATION

    NASA Astrophysics Data System (ADS)

    Xu, Z. G.; Shen, W. D.; Yang, D. Y.; Liu, W. M.

    2017-07-01

    In engineering practice, engineers must carry out complicated calculations when the loads on a beam are complex. These analyses and calculations take a lot of time, and the results can be unreliable. Therefore, VS2005 and the ADK were used, with the C++ programming language, to develop beam-design software based on the 3D CAD software SINOVATION. The software performs mechanical analysis and parameterized design of various types of beams and outputs the design report in HTML format, improving the efficiency and reliability of beam design.

  19. Logistics Enterprise Evaluation Model Based On Fuzzy Clustering Analysis

    NASA Astrophysics Data System (ADS)

    Fu, Pei-hua; Yin, Hong-bo

    In this thesis, we introduce an evaluation model for logistics enterprises based on a fuzzy clustering algorithm. First of all, we present the evaluation index system, which covers basic information, management level, technical strength, transport capacity, informatization level, market competition and customer service. We determine the index weights according to grades, and evaluate the integrated capability of the logistics enterprises using the fuzzy cluster analysis method. The thesis describes the system evaluation module and the cluster analysis module in detail, explains how these two modules were implemented, and finally presents the results of the system.
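    The abstract does not specify the clustering algorithm in detail; the sketch below assumes a standard fuzzy c-means procedure over made-up indicator scores and illustrates only the clustering step, not the weighted index system of the evaluation model.

```python
# Compact fuzzy c-means sketch for grouping logistics enterprises by indicator
# scores. The seven indicators, the scores and the choice of three clusters are
# placeholders; this illustrates the clustering step only, not the full
# weighted evaluation model.
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random((X.shape[0], c))
    u /= u.sum(axis=1, keepdims=True)                   # random initial memberships
    for _ in range(n_iter):
        um = u ** m
        centers = um.T @ X / um.sum(axis=0)[:, None]    # membership-weighted centroids
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2.0 / (m - 1.0)))
        u /= u.sum(axis=1, keepdims=True)               # update memberships
    return centers, u

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    scores = rng.random((30, 7))                        # 30 enterprises x 7 indicators
    centers, memberships = fuzzy_c_means(scores, c=3)
    print("hard cluster assignment:", memberships.argmax(axis=1))
```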

  20. Setting Standards for Medically-Based Running Analysis

    PubMed Central

    Vincent, Heather K.; Herman, Daniel C.; Lear-Barnes, Leslie; Barnes, Robert; Chen, Cong; Greenberg, Scott; Vincent, Kevin R.

    2015-01-01

    Setting standards for medically based running analyses is necessary to ensure that runners receive a high-quality service from practitioners. Medical and training history, physical and functional tests, and motion analysis of running at self-selected and faster speeds are key features of a comprehensive analysis. Self-reported history and movement symmetry are critical factors that require follow-up therapy or long-term management. Pain or injury is typically the result of a functional deficit above or below the site along the kinematic chain. PMID:25014394

  1. A phylogenetic and biogeographic analysis of the genera of Spirorchinae (Digenea: Spirorchidae) parasitic in freshwater turtles.

    PubMed

    Platt, T R

    1992-08-01

    Cladistic analysis of the freshwater genera of Spirorchinae (Schistosomatoidea: Spirorchidae sensu Yamaguti, 1971) plus Haematotrema Stunkard, 1923, and Aphanospirorchis Platt, 1990, was completed. The Spirorchinae were considered monophyletic based on synapomorphies of the esophagus. Three lineages, Spirhapalum (Europe/Asia), Plasmiorchis+Hemiorchis (India), and Spirorchis + Henotosoma + Haematotrema + Aphanospirorchis (North America), were identified. Nelsen consensus analysis was used as the basis for recognizing 3 valid monophyletic genera: Spirhapalum, Plasmiorchis, and Spirorchis. Hapalotrematinae sensu Smith, 1972 (e.g., Hapalorhynchus/Coeuritrema), is considered the most plesiomorphic group of spirorchids. Freshwater representatives of the hapalotrematines have been reported from 7 of 12 extant turtle families, including the relatively primitive Pelomedusidae (Pleurodira) and exhibit a worldwide distribution. It is hypothesized that this group arose in the early Triassic period, prior to the breakup of Pangea. Thus, it represents a primitive lineage that was present during the diversification of turtle lineages in the mid-Mesozoic era. Spirorchinae arose later (late Cretaceous period) as a Laurasian component parasitic in the more recent pond turtles (Emydidae + Bataguridae). Species of Spirhapalum retained a relatively plesiomorphic distribution, and they are found in emydids (Europe) and batagurids (Asia). Species of Spirorchis arose and diversified with North America emydids following the separation of North America and Europe in the late Cretaceous or early Tertiary periods. Species of Plasmiorchis are hypothesized to be derived from Asian ancestors that accompanied the colonization of India by Asian batagurids during the early Tertiary period. The presence of Spirorchis species in snapping turtles (Chelydridae/North America) and of Plasmiorchis species in Indian soft-shelled turtle (Trionychidae) are considered independent colonization events.

  2. Linear discriminant analysis based on L1-norm maximization.

    PubMed

    Zhong, Fujin; Zhang, Jiashu

    2013-08-01

    Linear discriminant analysis (LDA) is a well-known dimensionality reduction technique, which is widely used for many purposes. However, conventional LDA is sensitive to outliers because its objective function is based on the distance criterion using L2-norm. This paper proposes a simple but effective robust LDA version based on L1-norm maximization, which learns a set of local optimal projection vectors by maximizing the ratio of the L1-norm-based between-class dispersion and the L1-norm-based within-class dispersion. The proposed method is theoretically proved to be feasible and robust to outliers while overcoming the singular problem of the within-class scatter matrix for conventional LDA. Experiments on artificial datasets, standard classification datasets and three popular image databases demonstrate the efficacy of the proposed method.
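    The sketch below only evaluates the L1-norm dispersion-ratio criterion for a candidate projection vector and compares it with the classical L2 Fisher ratio on toy data containing an outlier; the paper's iterative procedure for maximizing the L1 criterion is not reproduced here.

```python
# Sketch that evaluates the L1-norm dispersion ratio of a candidate projection
# vector and compares it with the classical L2 Fisher ratio. This illustrates
# the criterion only; the iterative optimization from the paper is not
# reproduced, and the toy data (with an outlier) is invented.
import numpy as np

def l1_ratio(w, X, y):
    """L1 between-class dispersion / L1 within-class dispersion along w."""
    w = w / np.linalg.norm(w)
    mean_all = X.mean(axis=0)
    between, within = 0.0, 0.0
    for c in np.unique(y):
        Xc = X[y == c]
        between += Xc.shape[0] * abs(w @ (Xc.mean(axis=0) - mean_all))
        within += np.abs((Xc - Xc.mean(axis=0)) @ w).sum()
    return between / within

def l2_fisher_ratio(w, X, y):
    """Classical (L2) Fisher criterion along w, for comparison."""
    w = w / np.linalg.norm(w)
    mean_all = X.mean(axis=0)
    between, within = 0.0, 0.0
    for c in np.unique(y):
        Xc = X[y == c]
        between += Xc.shape[0] * (w @ (Xc.mean(axis=0) - mean_all)) ** 2
        within += (((Xc - Xc.mean(axis=0)) @ w) ** 2).sum()
    return between / within

rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 1, (50, 2)), rng.normal([4, 0], 1, (50, 2))])
X = np.vstack([X, [[0.0, 25.0]]])                  # a single gross outlier
y = np.array([0] * 50 + [1] * 50 + [0])

w = np.array([1.0, 0.0])                           # project onto the discriminative axis
print("L1 dispersion ratio:", l1_ratio(w, X, y))
print("L2 Fisher ratio    :", l2_fisher_ratio(w, X, y))
```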

  3. YersiniaBase: a genomic resource and analysis platform for comparative analysis of Yersinia.

    PubMed

    Tan, Shi Yang; Dutta, Avirup; Jakubovics, Nicholas S; Ang, Mia Yang; Siow, Cheuk Chuen; Mutha, Naresh Vr; Heydari, Hamed; Wee, Wei Yee; Wong, Guat Jah; Choo, Siew Woh

    2015-01-16

    Yersinia is a genus of Gram-negative bacteria that includes serious pathogens such as Yersinia pestis, which causes plague, Yersinia pseudotuberculosis, and Yersinia enterocolitica. The remaining species are generally considered non-pathogenic to humans, although there is evidence that at least some of them can cause occasional infections using mechanisms distinct from those of the more pathogenic species. With advances in sequencing technologies, many Yersinia genomes have been sequenced. However, there is currently no specialized platform to hold the rapidly growing Yersinia genomic data and to provide analysis tools, particularly for comparative analyses, which are required to provide improved insights into their biology, evolution and pathogenicity. To facilitate ongoing and future research on Yersinia, especially the generally non-pathogenic species, a well-defined repository and analysis platform is needed to hold the Yersinia genomic data and analysis tools for the research community. Hence, we have developed YersiniaBase, a robust and user-friendly resource and analysis platform for Yersinia genomic data. YersiniaBase holds a total of twelve species and 232 genome sequences, the majority of which are Yersinia pestis. To smooth the process of searching genomic data in a large database, we implemented an Asynchronous JavaScript and XML (AJAX)-based real-time search system in YersiniaBase. Besides incorporating existing tools, which include a JavaScript-based genome browser (JBrowse) and the Basic Local Alignment Search Tool (BLAST), YersiniaBase also has in-house developed tools: (1) Pairwise Genome Comparison tool (PGC) for comparing two user-selected genomes; (2) Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomics analysis of Yersinia genomes; (3) YersiniaTree for constructing phylogenetic trees of Yersinia. We ran analyses based on the tools and genomic data in YersiniaBase and the

  4. BCC skin cancer diagnosis based on texture analysis techniques

    NASA Astrophysics Data System (ADS)

    Chuang, Shao-Hui; Sun, Xiaoyan; Chang, Wen-Yu; Chen, Gwo-Shing; Huang, Adam; Li, Jiang; McKenzie, Frederic D.

    2011-03-01

    In this paper, we present a texture analysis based method for diagnosing the Basal Cell Carcinoma (BCC) skin cancer using optical images taken from the suspicious skin regions. We first extracted the Run Length Matrix and Haralick texture features from the images and used a feature selection algorithm to identify the most effective feature set for the diagnosis. We then utilized a Multi-Layer Perceptron (MLP) classifier to classify the images to BCC or normal cases. Experiments showed that detecting BCC cancer based on optical images is feasible. The best sensitivity and specificity we achieved on our data set were 94% and 95%, respectively.
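    A minimal sketch of a texture-feature pipeline in the spirit of this abstract is given below, using grey-level co-occurrence (Haralick-style) features from scikit-image and a scikit-learn MLP classifier. Synthetic noise patches stand in for the optical skin images, and the run-length-matrix features and the feature-selection step are omitted.

```python
# Minimal texture-classification sketch: grey-level co-occurrence (Haralick-
# style) features from scikit-image fed to an MLP classifier. Synthetic noise
# patches stand in for real skin images, and the run-length-matrix features
# mentioned in the abstract are not included.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.neural_network import MLPClassifier

def glcm_features(patch):
    """Contrast/homogeneity/energy/correlation from a 4-direction GLCM."""
    glcm = graycomatrix(patch, distances=[1],
                        angles=[0, np.pi/4, np.pi/2, 3*np.pi/4],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

rng = np.random.default_rng(0)
smooth = [rng.integers(100, 130, (32, 32), dtype=np.uint8) for _ in range(40)]
rough = [rng.integers(0, 256, (32, 32), dtype=np.uint8) for _ in range(40)]

X = np.array([glcm_features(p) for p in smooth + rough])
y = np.array([0] * 40 + [1] * 40)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```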

  5. Performance and analysis of MAC protocols based on application

    NASA Astrophysics Data System (ADS)

    Yadav, Ravi; Daniel, A. K.

    2018-04-01

    Wireless Sensor Networks are among the most rapidly emerging technologies of recent decades, covering a large application area that spans civilian and military uses. A Wireless Sensor Network primarily consists of low-power, low-cost, multifunctional sensor nodes that collaborate and communicate via a wireless medium. Because sensor nodes are deployed in an ad hoc manner, they must organize themselves to communicate with each other. These characteristics make WSNs a challenging research area. This paper gives an overview of the characteristics of WSNs, their architecture and contention-based MAC protocols, and presents an analysis of various protocols based on performance.

  6. Heart Sound Biometric System Based on Marginal Spectrum Analysis

    PubMed Central

    Zhao, Zhidong; Shen, Qinqin; Ren, Fangqin

    2013-01-01

    This work presents a heart sound biometric system based on marginal spectrum analysis, which is a new feature extraction technique for identification purposes. This heart sound identification system is comprised of signal acquisition, pre-processing, feature extraction, training, and identification. Experiments on the selection of the optimal values for the system parameters are conducted. The results indicate that the new spectrum coefficients result in a significant increase in the recognition rate of 94.40% compared with that of the traditional Fourier spectrum (84.32%) based on a database of 280 heart sounds from 40 participants. PMID:23429515

  7. Open access for ALICE analysis based on virtualization technology

    NASA Astrophysics Data System (ADS)

    Buncic, P.; Gheata, M.; Schutz, Y.

    2015-12-01

    Open access is one of the important leverages for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime it is crucial that third party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modification factor in Pb-Pb collisions. The interface can be easily extended to include an arbitrary number of additional analysis modules. We present the current status of the tools used by ALICE through the CERN open access portal, and the plans for future extensions of this system.

  8. Job optimization in ATLAS TAG-based distributed analysis

    NASA Astrophysics Data System (ADS)

    Mambelli, M.; Cranshaw, J.; Gardner, R.; Maeno, T.; Malon, D.; Novak, M.

    2010-04-01

    The ATLAS experiment is projected to collect over one billion events/year during the first few years of operation. The efficient selection of events for various physics analyses across all appropriate samples presents a significant technical challenge. ATLAS computing infrastructure leverages the Grid to tackle the analysis across large samples by organizing data into a hierarchical structure and exploiting distributed computing to churn through the computations. This includes events at different stages of processing: RAW, ESD (Event Summary Data), AOD (Analysis Object Data), DPD (Derived Physics Data). Event Level Metadata Tags (TAGs) contain information about each event stored using multiple technologies accessible by POOL and various web services. This allows users to apply selection cuts on quantities of interest across the entire sample to compile a subset of events that are appropriate for their analysis. This paper describes new methods for organizing jobs using the TAGs criteria to analyze ATLAS data. It further compares different access patterns to the event data and explores ways to partition the workload for event selection and analysis. Here analysis is defined as a broader set of event processing tasks including event selection and reduction operations ("skimming", "slimming" and "thinning") as well as DPD making. Specifically it compares analysis with direct access to the events (AOD and ESD data) to access mediated by different TAG-based event selections. We then compare different ways of splitting the processing to maximize performance.

  9. Arterial stiffness estimation based photoplethysmographic pulse wave analysis

    NASA Astrophysics Data System (ADS)

    Huotari, Matti; Maatta, Kari; Kostamovaara, Juha

    2010-11-01

    Arterial stiffness is one of the indices of vascular health and can be assessed through pulse wave analysis. In this work we decompose the pulse waveform in order to estimate arterial elasticity. The pulse is first measured optically with a photoplethysmograph and then decomposed into four lognormal component waveforms, for which a very good fit can be found between the original pulse wave and the sum of the decomposed components. Several studies have demonstrated that measures of this kind predict cardiovascular events, while dynamic factors such as arterial stiffness depend on fixed structural features of the vascular wall. Arterial stiffness is estimated based on pulse wave decomposition analysis in the radial and tibial arteries. Elucidation of the precise relationship between endothelial function and vascular stiffness still awaits further study.
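    A minimal sketch of the lognormal decomposition idea follows, fitting a sum of lognormal component waves to a synthetic pulse with non-linear least squares. The synthetic pulse, the use of two components (the study uses four), and the initial guesses are assumptions for illustration.

```python
# Sketch of decomposing a photoplethysmographic pulse into lognormal component
# waves with non-linear least squares. The synthetic pulse, the use of two
# components (the abstract describes four) and the initial guesses are assumed.
import numpy as np
from scipy.optimize import curve_fit

def lognormal_wave(t, amp, mu, sigma):
    t = np.clip(t, 1e-9, None)                         # lognormal defined for t > 0
    return amp * np.exp(-(np.log(t) - mu) ** 2 / (2 * sigma ** 2)) / t

def two_lognormals(t, a1, mu1, s1, a2, mu2, s2):
    return lognormal_wave(t, a1, mu1, s1) + lognormal_wave(t, a2, mu2, s2)

t = np.linspace(0.01, 1.0, 200)                        # one cardiac cycle (s)
true = two_lognormals(t, 1.0, -1.6, 0.35, 0.5, -0.9, 0.30)
pulse = true + np.random.default_rng(0).normal(0, 0.01, t.size)

p0 = [1.0, -1.5, 0.3, 0.4, -1.0, 0.3]                  # rough initial guesses
params, _ = curve_fit(two_lognormals, t, pulse, p0=p0)
residual = np.sqrt(np.mean((two_lognormals(t, *params) - pulse) ** 2))
print("fitted parameters:", np.round(params, 3), "RMS residual:", residual)
```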

  10. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  11. Estimating Driving Performance Based on EEG Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Lin, Chin-Teng; Wu, Ruei-Cheng; Jung, Tzyy-Ping; Liang, Sheng-Fu; Huang, Teng-Yi

    2005-12-01

    The growing number of traffic accidents in recent years has become a serious concern to society. Accidents caused by a driver's drowsiness behind the steering wheel have a high fatality rate because of the marked decline in the driver's perception, recognition, and vehicle control abilities while sleepy. Preventing such accidents is highly desirable but requires techniques for continuously detecting, estimating, and predicting the driver's level of alertness and delivering effective feedback to maintain maximum performance. This paper proposes an EEG-based drowsiness estimation system that combines electroencephalogram (EEG) log subband power spectra, correlation analysis, principal component analysis, and linear regression models to indirectly estimate the driver's drowsiness level in a virtual-reality-based driving simulator. Our results demonstrate that it is feasible to accurately and quantitatively estimate driving performance, expressed as the deviation between the center of the vehicle and the center of the cruising lane, in a realistic driving simulator.
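    The sketch below illustrates the estimation chain described above (log subband power features, PCA, linear regression onto a driving-deviation signal) on synthetic data; the band edges, data shapes, and deviation signal are assumptions, and the correlation-based selection step is omitted.

```python
# Sketch of the estimation chain: log subband power features -> PCA -> linear
# regression onto a driving-performance signal (lane deviation). The EEG data,
# band edges and deviation signal are synthetic, and the correlation-based
# channel-selection step from the abstract is omitted.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_epochs, n_channels, n_samples, fs = 300, 30, 250, 250
eeg = rng.normal(size=(n_epochs, n_channels, n_samples))

# Log power in theta (4-7 Hz) and alpha (8-12 Hz) bands for each channel.
spectra = np.abs(np.fft.rfft(eeg, axis=-1)) ** 2
freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
bands = [(4.0, 7.0), (8.0, 12.0)]
feats = np.concatenate(
    [np.log(spectra[..., (freqs >= lo) & (freqs <= hi)].mean(axis=-1)) for lo, hi in bands],
    axis=1)                                            # (n_epochs, 2 * n_channels)

# Synthetic "lane deviation" loosely tied to one feature, plus noise.
deviation = 0.8 * feats[:, 0] + rng.normal(0, 0.5, n_epochs)

pcs = PCA(n_components=10).fit_transform(feats)
model = LinearRegression().fit(pcs, deviation)
print("R^2 of the drowsiness/deviation model:", model.score(pcs, deviation))
```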

  12. Gold Nanoparticles-Based Barcode Analysis for Detection of Norepinephrine.

    PubMed

    An, Jeung Hee; Lee, Kwon-Jai; Choi, Jeong-Woo

    2016-02-01

    Nanotechnology-based bio-barcode amplification analysis offers an innovative approach for detecting neurotransmitters. We evaluated the efficacy of this method for detecting norepinephrine in normal and oxidative-stress-damaged dopaminergic cells. Our approach uses a combination of DNA barcodes and bead-based immunoassays for detecting neurotransmitters with surface-enhanced Raman spectroscopy (SERS), and provides polymerase chain reaction (PCR)-like sensitivity. This method relies on magnetic Dynabeads containing antibodies and on nanoparticles that are loaded both with DNA barcodes and with antibodies that can sandwich the target protein captured by the Dynabead-bound antibodies. The aggregate sandwich structures are magnetically separated from the solution and treated to remove the conjugated barcode DNA. The DNA barcodes are then identified by SERS and PCR analysis. The concentration of norepinephrine in dopaminergic cells can be readily detected using the bio-barcode assay, which is a rapid, high-throughput screening tool for detecting neurotransmitters.

  13. Moon-Based INSAR Geolocation and Baseline Analysis

    NASA Astrophysics Data System (ADS)

    Liu, Guang; Ren, Yuanzhen; Ye, Hanlin; Guo, Huadong; Ding, Yixing; Ruan, Zhixing; Lv, Mingyang; Dou, Changyong; Chen, Zhaoning

    2016-07-01

    An Earth observation platform is a host whose characteristics to some extent determine its capability for Earth observation. Currently most platforms under development are satellites; carrying out systematic observations from a Moon-based Earth observation platform, in contrast, is still a new concept. The Moon is Earth's only natural satellite and the only one that humans have reached; observing the Earth with sensors placed on the Moon would offer a different perspective. Moon-based InSAR (SAR interferometry), one of the important Earth observation technologies, has all-day, all-weather observation capability, but its unique characteristics still need to be analyzed. This article discusses key issues of geometric positioning and the baseline parameters of Moon-based InSAR. Based on ephemeris data, the position, libration and attitude of the Earth and Moon are obtained, and the position of the Moon-based SAR sensor is obtained by coordinate transformation from a fixed selenocentric coordinate system to a terrestrial coordinate system; together with the range-Doppler equations, the positioning model is analyzed. After establishing the Moon-based InSAR baseline equation, the different baseline errors are analyzed and the influence of the Moon-based InSAR baseline on Earth observation applications is obtained.

  14. Using the DOE Knowledge Base for Special Event Analysis

    SciTech Connect

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

    The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically-derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanoes), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified

  15. Research on Air Quality Evaluation based on Principal Component Analysis

    NASA Astrophysics Data System (ADS)

    Wang, Xing; Wang, Zilin; Guo, Min; Chen, Wei; Zhang, Huan

    2018-01-01

    Economic growth has led to declining environmental capacity and deteriorating air quality. Air quality evaluation, as a foundation of environmental monitoring and air pollution control, has become increasingly important. Based on principal component analysis (PCA), this paper evaluates the air quality of a large city in the Beijing-Tianjin-Hebei area over the past 10 years and identifies the influencing factors, in order to provide a reference for air quality management and air pollution control.
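    A minimal sketch of a PCA-based air-quality score is shown below: pollutant indicators are standardized, projected onto principal components, and the component scores are combined with explained-variance weights. The pollutant table and the weighting scheme are assumptions for illustration, not the paper's data or exact method.

```python
# Sketch of a PCA-based composite air-quality score: standardize the pollutant
# indicators, project onto the principal components, and weight the component
# scores by their explained-variance ratios. The pollutant table is invented.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Ten years x six indicators (e.g. PM2.5, PM10, SO2, NO2, CO, O3), made up.
X = rng.normal(loc=[60, 95, 20, 45, 1.1, 90],
               scale=[15, 20, 6, 10, 0.3, 20], size=(10, 6))

Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=3).fit(Z)
component_scores = pca.transform(Z)                      # (years, 3)
weights = pca.explained_variance_ratio_ / pca.explained_variance_ratio_.sum()
composite = component_scores @ weights                   # one score per year

for year, score in zip(range(2008, 2018), composite):
    print(year, round(float(score), 3))
```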

  16. GLC analysis of base composition of RNA and DNA hydrolysates

    NASA Technical Reports Server (NTRS)

    Lakings, D. B.; Gehreke, C. W.

    1971-01-01

    Various methods used for the analysis of the base composition of RNA and DNA hydrolysates are presented. The methods discussed are: (1) ion-exchange chromatography, (2) paper chromatography, (3) paper electrophoresis, (4) thin layer chromatography, (5) paper chromatography and time of flight mass spectrometry, and (6) gas-liquid chromatography. The equipment required and the conditions for obtaining the best results with each method are described.

  17. Content-based TV sports video retrieval using multimodal analysis

    NASA Astrophysics Data System (ADS)

    Yu, Yiqing; Liu, Huayong; Wang, Hongbin; Zhou, Dongru

    2003-09-01

    In this paper, we propose content-based video retrieval, which is retrieval based on a video's semantic content. Because video data is composed of multimodal information streams such as visual, auditory and textual streams, we describe a strategy that uses multimodal analysis for automatically parsing sports video. The paper first defines the basic structure of the sports video database system, and then introduces a new approach that integrates visual stream analysis, speech recognition, speech signal processing and text extraction to realize video retrieval. The experimental results for TV sports video of football games indicate that multimodal analysis is effective for video retrieval by quickly browsing tree-like video clips or inputting keywords within a predefined domain.

  18. [Model-based biofuels system analysis: a review].

    PubMed

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.

  19. Facial Affect Recognition Using Regularized Discriminant Analysis-Based Algorithms

    NASA Astrophysics Data System (ADS)

    Lee, Chien-Cheng; Huang, Shin-Sheng; Shih, Cheng-Yuan

    2010-12-01

    This paper presents a novel and effective method for facial expression recognition including happiness, disgust, fear, anger, sadness, surprise, and neutral state. The proposed method utilizes a regularized discriminant analysis-based boosting algorithm (RDAB) with effective Gabor features to recognize the facial expressions. Entropy criterion is applied to select the effective Gabor feature which is a subset of informative and nonredundant Gabor features. The proposed RDAB algorithm uses RDA as a learner in the boosting algorithm. The RDA combines strengths of linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA). It solves the small sample size and ill-posed problems suffered from QDA and LDA through a regularization technique. Additionally, this study uses the particle swarm optimization (PSO) algorithm to estimate optimal parameters in RDA. Experiment results demonstrate that our approach can accurately and robustly recognize facial expressions.

  20. Student Engagement: A Principle-Based Concept Analysis.

    PubMed

    Bernard, Jean S

    2015-08-04

    A principle-based concept analysis of student engagement was used to examine the state of the science across disciplines. Four major perspectives of philosophy of science guided analysis and provided a framework for study of interrelationships and integration of conceptual components which then resulted in formulation of a theoretical definition. Findings revealed student engagement as a dynamic reiterative process marked by positive behavioral, cognitive, and affective elements exhibited in pursuit of deep learning. This process is influenced by a broader sociocultural environment bound by contextual preconditions of self-investment, motivation, and a valuing of learning. Outcomes of student engagement include satisfaction, sense of well-being, and personal development. Findings of this analysis prove relevant to nursing education as faculty transition from traditional teaching paradigms, incorporate learner-centered strategies, and adopt innovative pedagogical methodologies. It lends support for curricula reform, development of more accurate evaluative measures, and creation of meaningful teaching-learning environments within the discipline.

  1. GOMA: functional enrichment analysis tool based on GO modules

    PubMed Central

    Huang, Qiang; Wu, Ling-Yun; Wang, Yong; Zhang, Xiang-Sun

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results. PMID:23237213
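
    As background for how enrichment scores are typically computed before any module grouping, the following sketch runs a plain hypergeometric enrichment test on made-up gene counts; the gene sets and GO terms are assumptions, and this is not the GOMA optimization model itself.

```python
# Minimal per-term GO enrichment test (hypergeometric); toy counts only.
from scipy.stats import hypergeom

background = 10_000                   # genes in the genome (assumed)
study = 150                           # genes flagged by the experiment (assumed)
term_annotations = {"GO:0006955 immune response": (300, 18),
                    "GO:0008152 metabolic process": (4000, 55)}

for term, (annotated, hits) in term_annotations.items():
    # P(X >= hits) when drawing `study` genes from a background in which
    # `annotated` genes carry the term
    p = hypergeom.sf(hits - 1, background, annotated, study)
    print(f"{term}: p = {p:.2e}")
```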

  2. Dynamic Network-Based Epistasis Analysis: Boolean Examples

    PubMed Central

    Azpeitia, Eugenio; Benítez, Mariana; Padilla-Longoria, Pablo; Espinosa-Soto, Carlos; Alvarez-Buylla, Elena R.

    2011-01-01

    In this article we focus on how the hierarchical and single-path assumptions of epistasis analysis can bias the inference of gene regulatory networks. Here we emphasize the critical importance of dynamic analyses, and specifically illustrate the use of Boolean network models. Epistasis in a broad sense refers to gene interactions; however, as originally proposed by Bateson, epistasis is defined as the blocking of a particular allelic effect due to the effect of another allele at a different locus (herein, classical epistasis). Classical epistasis analysis has proven powerful and useful, allowing researchers to infer and assign directionality to gene interactions. As larger data sets become available, the analysis of classical epistasis is being complemented with computer science tools and systems biology approaches. We show that when the hierarchical and single-path assumptions are not met in classical epistasis analysis, access to relevant information and the correct inference of gene interaction topologies are hindered, and it becomes necessary to consider the temporal dynamics of gene interactions. The use of dynamical networks can overcome these limitations. We particularly focus on the use of Boolean networks that, like classical epistasis analysis, rely on logical formalisms, and hence can complement classical epistasis analysis and relax its assumptions. We develop a couple of theoretical examples and analyze them from a dynamic Boolean network model perspective. Boolean networks could help to guide additional experiments and discern among alternative regulatory schemes that would be impossible or difficult to infer without the elimination of these assumptions from classical epistasis analysis. We also use examples from the literature to show how a Boolean network-based approach has resolved ambiguities and guided epistasis analysis. Our article complements previous accounts, not only by focusing on the implications of the hierarchical and
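
    A tiny Boolean network sketch of the kind discussed here (the update rules and genes are toy assumptions, not the models analyzed in the article) shows how exhaustive dynamic updating exposes attractors that a hierarchical, single-path reading of epistasis could miss.

```python
# Toy three-gene Boolean network: enumerate every initial state and follow
# synchronous updates until the trajectory revisits a state (an attractor).
from itertools import product

def update(state):
    a, b, c = state
    return (
        int(not c),        # A is repressed by C
        int(a),            # B is activated by A
        int(a and not b),  # C needs A but is blocked by B
    )

for start in product([0, 1], repeat=3):
    seen, state = [], start
    while state not in seen:
        seen.append(state)
        state = update(state)
    print(start, "->", state, "(first revisited state on the attractor)")
```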

  3. Geographic Object-Based Image Analysis - Towards a new paradigm.

    PubMed

    Blaschke, Thomas; Hay, Geoffrey J; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis - GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales, and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the 'per-pixel paradigm' and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and on an analysis of peer-reviewed scientific literature, we conclude that GEOBIA is a new and evolving paradigm.

  4. Principle-based analysis of the concept of telecare.

    PubMed

    Solli, Hilde; Bjørk, Ida Torunn; Hvalvik, Sigrun; Hellesø, Ragnhild

    2012-12-01

    To report a concept analysis of telecare. Lately, telecare has become a worldwide, modern way of giving care over distance by means of technology. Other concepts, like telemedicine, e-health, and telehealth, focus on the same topic, though the boundaries between them seem to be blurred. Sources comprise 44 English-language research articles retrieved from the Medline and Cinahl databases (1995-October 2011). The design was a literature review. A principle-based analysis was undertaken through content analysis of the definitions, attributes, preconditions, and outcomes of the concept. The attributes are well described according to the use of technology, caring activity, persons involved, and accessibility. Preconditions and outcomes are well described concerning individual and health-political needs and benefits. The concept did not hold its boundaries through theoretical integration with the concepts of telemedicine and telehealth. The definition of telecare competes with concepts like home-based e-health, telehomecare, telephonecare, telephone-based psychosocial services, telehealth, and telemedicine. Assessment of the definitions resulted in the suggestion of a new definition: telecare is the use of information, communication, and monitoring technologies that allow healthcare providers to remotely evaluate health status, provide educational interventions, or deliver health and social care to patients in their homes. The logical principle was assessed to be partly immature, whereas the pragmatic and linguistic principles were found to be mature. A new definition is suggested, and this has moved the epistemological principle forward to maturity. © 2012 Blackwell Publishing Ltd.

  5. Multi-membership gene regulation in pathway based microarray analysis

    PubMed Central

    2011-01-01

    Background Gene expression analysis has been intensively researched for more than a decade. Recently, there has been elevated interest in the integration of microarray data analysis with other types of biological knowledge in a holistic analytical approach. We propose a methodology that can be applied to pathway-based microarray data analysis, based on the observation that a substantial proportion of genes present in biochemical pathway databases are members of a number of distinct pathways. Our methodology aims towards establishing the state of individual pathways, by identifying those truly affected by the experimental conditions based on the behaviour of such genes. For that purpose it considers all the pathways in which a gene participates and the general census of gene expression per pathway. Results We utilise hill climbing, simulated annealing and a genetic algorithm to analyse the consistency of the produced results, through the application of fuzzy adjusted Rand indices and the Hamming distance. All algorithms produce highly consistent gene-to-pathway allocations, revealing the contribution of genes to pathway functionality, in agreement with current pathway state visualisation techniques, with the simulated annealing search proving slightly superior in terms of efficiency. Conclusions We show that the expression values of genes, which are members of a number of biochemical pathways or modules, are the net effect of the contribution of each gene to these biochemical processes. We show that by manipulating the pathway and module contribution of such genes to follow underlying trends we can interpret microarray results centred on the behaviour of these genes. PMID:21939531
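
    The following schematic sketch illustrates the general idea of allocating a multi-membership gene's expression across its pathways with simulated annealing; the energy function, data, and pathway names are toy assumptions, not the objective or data used in the paper.

```python
# Simulated annealing over a gene's pathway-contribution shares; toy setup.
import math, random

pathways = ["glycolysis", "TCA", "pentose_phosphate"]   # hypothetical
gene_expression = 2.4                                   # observed log-ratio (assumed)
pathway_trend = {"glycolysis": 1.8, "TCA": 0.3, "pentose_phosphate": 0.5}

def energy(weights):
    # Penalize disagreement between each pathway's allocated share and the
    # general trend of the other genes in that pathway.
    return sum((gene_expression * w - pathway_trend[p]) ** 2
               for p, w in zip(pathways, weights))

def neighbour(weights):
    w = [max(0.0, x + random.gauss(0, 0.05)) for x in weights]
    s = sum(w) or 1.0
    return [x / s for x in w]          # keep the shares normalized

random.seed(1)
current = [1 / len(pathways)] * len(pathways)
T = 1.0
while T > 1e-3:
    cand = neighbour(current)
    dE = energy(cand) - energy(current)
    if dE < 0 or random.random() < math.exp(-dE / T):
        current = cand                  # accept improvements, sometimes worse moves
    T *= 0.995                          # geometric cooling schedule
print(dict(zip(pathways, (round(w, 2) for w in current))))
```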

  6. Multi-membership gene regulation in pathway based microarray analysis.

    PubMed

    Pavlidis, Stelios P; Payne, Annette M; Swift, Stephen M

    2011-09-22

    Gene expression analysis has been intensively researched for more than a decade. Recently, there has been elevated interest in the integration of microarray data analysis with other types of biological knowledge in a holistic analytical approach. We propose a methodology that can be applied to pathway-based microarray data analysis, based on the observation that a substantial proportion of genes present in biochemical pathway databases are members of a number of distinct pathways. Our methodology aims towards establishing the state of individual pathways, by identifying those truly affected by the experimental conditions based on the behaviour of such genes. For that purpose it considers all the pathways in which a gene participates and the general census of gene expression per pathway. We utilise hill climbing, simulated annealing and a genetic algorithm to analyse the consistency of the produced results, through the application of fuzzy adjusted Rand indices and the Hamming distance. All algorithms produce highly consistent gene-to-pathway allocations, revealing the contribution of genes to pathway functionality, in agreement with current pathway state visualisation techniques, with the simulated annealing search proving slightly superior in terms of efficiency. We show that the expression values of genes, which are members of a number of biochemical pathways or modules, are the net effect of the contribution of each gene to these biochemical processes. We show that by manipulating the pathway and module contribution of such genes to follow underlying trends we can interpret microarray results centred on the behaviour of these genes.

  7. Seahawk: moving beyond HTML in Web-based bioinformatics analysis.

    PubMed

    Gordon, Paul M K; Sensen, Christoph W

    2007-06-18

    Traditional HTML interfaces for input to and output from Bioinformatics analysis on the Web are highly variable in style, content and data formats. Combining multiple analyses can therefore be an onerous task for biologists. Semantic Web Services allow automated discovery of conceptual links between remote data analysis servers. A shared data ontology and service discovery/execution framework is particularly attractive in Bioinformatics, where data and services are often both disparate and distributed. Instead of biologists copying, pasting and reformatting data between various Web sites, Semantic Web Service protocols such as MOBY-S hold out the promise of seamlessly integrating multi-step analysis. We have developed a program (Seahawk) that allows biologists to intuitively and seamlessly chain together Web Services using a data-centric, rather than the customary service-centric approach. The approach is illustrated with a ferredoxin mutation analysis. Seahawk concentrates on lowering entry barriers for biologists: no prior knowledge of the data ontology, or relevant services is required. In stark contrast to other MOBY-S clients, in Seahawk users simply load Web pages and text files they already work with. Underlying the familiar Web-browser interaction is an XML data engine based on extensible XSLT style sheets, regular expressions, and XPath statements which import existing user data into the MOBY-S format. As an easily accessible applet, Seahawk moves beyond standard Web browser interaction, providing mechanisms for the biologist to concentrate on the analytical task rather than on the technical details of data formats and Web forms. As the MOBY-S protocol nears a 1.0 specification, we expect more biologists to adopt these new semantic-oriented ways of doing Web-based analysis, which empower them to do more complicated, ad hoc analysis workflow creation without the assistance of a programmer.

  8. Seahawk: moving beyond HTML in Web-based bioinformatics analysis

    PubMed Central

    Gordon, Paul MK; Sensen, Christoph W

    2007-01-01

    Background Traditional HTML interfaces for input to and output from Bioinformatics analysis on the Web are highly variable in style, content and data formats. Combining multiple analyses can therefore be an onerous task for biologists. Semantic Web Services allow automated discovery of conceptual links between remote data analysis servers. A shared data ontology and service discovery/execution framework is particularly attractive in Bioinformatics, where data and services are often both disparate and distributed. Instead of biologists copying, pasting and reformatting data between various Web sites, Semantic Web Service protocols such as MOBY-S hold out the promise of seamlessly integrating multi-step analysis. Results We have developed a program (Seahawk) that allows biologists to intuitively and seamlessly chain together Web Services using a data-centric, rather than the customary service-centric approach. The approach is illustrated with a ferredoxin mutation analysis. Seahawk concentrates on lowering entry barriers for biologists: no prior knowledge of the data ontology, or relevant services is required. In stark contrast to other MOBY-S clients, in Seahawk users simply load Web pages and text files they already work with. Underlying the familiar Web-browser interaction is an XML data engine based on extensible XSLT style sheets, regular expressions, and XPath statements which import existing user data into the MOBY-S format. Conclusion As an easily accessible applet, Seahawk moves beyond standard Web browser interaction, providing mechanisms for the biologist to concentrate on the analytical task rather than on the technical details of data formats and Web forms. As the MOBY-S protocol nears a 1.0 specification, we expect more biologists to adopt these new semantic-oriented ways of doing Web-based analysis, which empower them to do more complicated, ad hoc analysis workflow creation without the assistance of a programmer. PMID:17577405

  9. A knowledge base for Vitis vinifera functional analysis.

    PubMed

    Pulvirenti, Alfredo; Giugno, Rosalba; Distefano, Rosario; Pigola, Giuseppe; Mongiovi, Misael; Giudice, Girolamo; Vendramin, Vera; Lombardo, Alessandro; Cattonaro, Federica; Ferro, Alfredo

    2015-01-01

    Vitis vinifera (grapevine) is the most important fruit species in the modern world. Wine and table grape sales contribute significantly to the economy of major wine producing countries. The most relevant goals in wine production concern quality and safety. In order to significantly improve the achievement of these objectives and to gain biological knowledge about cultivars, a genomic approach is the most reliable strategy. The recent grapevine genome sequencing offers the opportunity to study the potential roles of genes and microRNAs in fruit maturation and other physiological and pathological processes. Although several systems allowing the analysis of plant genomes have been reported, none of them has been designed specifically for the functional analysis of grapevine genomes of cultivars under environmental stress in connection with microRNA data. Here we introduce a novel knowledge base, called BIOWINE, designed for the functional analysis of Vitis vinifera genomes of cultivars present in Sicily. The system allows the analysis of RNA-seq experiments of two different cultivars, namely Nero d'Avola and Nerello Mascalese. Samples were taken under different climatic conditions of phenological phases, diseases, and geographic locations. The BIOWINE web interface is equipped with data analysis modules for grapevine genomes. In particular, users may analyze the current genome assembly together with the RNA-seq data through a customized version of GBrowse. The web interface allows users to perform gene set enrichment by exploiting third-party databases. BIOWINE is a knowledge base implementing a set of bioinformatics tools for the analysis of grapevine genomes. The system aims to increase our understanding of the grapevine varieties and species of Sicilian products, focusing on adaptability to different climatic conditions, phenological phases, diseases, and geographic locations.

  10. Ontology-based specification, identification and analysis of perioperative risks.

    PubMed

    Uciteli, Alexandr; Neumann, Juliane; Tahar, Kais; Saleh, Kutaiba; Stucke, Stephan; Faulbrück-Röhr, Sebastian; Kaeding, André; Specht, Martin; Schmidt, Tobias; Neumuth, Thomas; Besting, Andreas; Stegemann, Dominik; Portheine, Frank; Herre, Heinrich

    2017-09-06

    Medical personnel in hospitals often work under great physical and mental strain. In medical decision-making, errors can never be completely ruled out. Several studies have shown that between 50 and 60% of adverse events could have been avoided through better organization, more attention or more effective security procedures. Critical situations especially arise during interdisciplinary collaboration and the use of complex medical technology, for example during surgical interventions and in perioperative settings (the period of time before, during and after surgical intervention). In this paper, we present an ontology and an ontology-based software system, which can identify risks across medical processes and support the avoidance of errors, in particular in the perioperative setting. We developed a practicable definition of the risk notion, which is easily understandable by the medical staff and is usable by the software tools. Based on this definition, we developed a Risk Identification Ontology (RIO) and used it for the specification and the identification of perioperative risks. An agent system was developed, which gathers risk-relevant data during the whole perioperative treatment process from various sources and provides it for risk identification and analysis in a centralized fashion. The results of such an analysis are provided to the medical personnel in the form of context-sensitive hints and alerts. For the identification of the ontologically specified risks, we developed an ontology-based software module, called Ontology-based Risk Detector (OntoRiDe). About 20 risks relating to cochlear implantation (CI) have already been implemented. Comprehensive testing has indicated the correctness of the data acquisition, risk identification and analysis components, as well as the web-based visualization of results.

  11. Performance of Koyna dam based on static and dynamic analysis

    NASA Astrophysics Data System (ADS)

    Azizan, Nik Zainab Nik; Majid, Taksiah A.; Nazri, Fadzli Mohamed; Maity, Damodar

    2017-10-01

    This paper discusses the performance of Koyna dam based on static pushover analysis (SPO) and incremental dynamic analysis (IDA). The SPO in this study considered two types of lateral load: inertial load and hydrodynamic load. The structure was analysed until damage appeared on the dam body. The IDA curves were developed based on 7 ground motions with the following characteristics: (i) the distance from the epicenter is less than 15 km, (ii) the magnitude is equal to or greater than 5.5, and (iii) the PGA is equal to or greater than 0.15 g. All ground motions were converted to response spectra and scaled to the developed elastic response spectrum in order to match the characteristics of the ground motion to the soil type. The elastic response spectrum was developed for soil type B using Eurocode 8. The SPO and IDA methods are able to determine the limit states of the dam. The limit states proposed in this study are the yielding and ultimate states, which are identified based on the crack patterns formed on the structural model. The maximum crest displacements from both methods are compared to define the limit states of the dam. The yielding-state displacement for Koyna dam is 23.84 mm and the ultimate-state displacement is 44.91 mm. The results can be used as a guideline for monitoring Koyna dam under seismic loading, considering both static and dynamic analysis.

  12. Web-Based Virtual Laboratory for Food Analysis Course

    NASA Astrophysics Data System (ADS)

    Handayani, M. N.; Khoerunnisa, I.; Sugiarti, Y.

    2018-02-01

    Implementation of the food analysis course in the Agro-industrial Technology Education study program faced several problems. These include laboratory space and equipment that are not commensurate with the number of students, as well as a lack of interactive learning tools. On the other hand, students' information technology literacy is quite high, and the internet is easily accessible on campus. This is both a challenge and an opportunity in the development of learning media that can help optimize learning in the laboratory. This study aims to develop a web-based virtual laboratory as an alternative learning medium for the food analysis course. This research follows the R&D (research and development) approach of the Borg & Gall model. Expert assessment of the developed web-based virtual laboratory, covering software engineering, visual communication, material relevance, usefulness, and language, indicated that it is feasible as a learning medium. The results of the limited-scale and wide-scale tests show that students strongly agree with the development of the web-based virtual laboratory. The response of students to this virtual laboratory was positive. Suggestions from students provide opportunities for improving the web-based virtual laboratory and should be considered in further research.

  13. Matching biomedical ontologies based on formal concept analysis.

    PubMed

    Zhao, Mengyi; Zhang, Songmao; Li, Weizhuo; Chen, Guowei

    2018-03-19

    The goal of ontology matching is to identify correspondences between entities from different yet overlapping ontologies so as to facilitate semantic integration, reuse and interoperability. As a well-developed mathematical model for analyzing individuals and structuring concepts, Formal Concept Analysis (FCA) has been applied to ontology matching (OM) tasks since the beginning of OM research, whereas the ontological knowledge exploited in FCA-based methods has been limited. This motivates the study in this paper, i.e., to empower FCA with as much ontological knowledge as possible for identifying mappings across ontologies. We propose a method based on Formal Concept Analysis to identify and validate mappings across ontologies, including one-to-one mappings, complex mappings and correspondences between object properties. Our method, called FCA-Map, incrementally generates a total of five types of formal contexts and extracts mappings from the derived lattices. First, the token-based formal context describes how class names, labels and synonyms share lexical tokens, leading to lexical mappings (anchors) across ontologies. Second, the relation-based formal context describes how classes are in taxonomic, partonomic and disjoint relationships with the anchors, leading to positive and negative structural evidence for validating the lexical matching. Third, the positive relation-based context can be used to discover structural mappings. Afterwards, the property-based formal context describes how object properties are used in axioms to connect anchor classes across ontologies, leading to property mappings. Last, the restriction-based formal context describes co-occurrence of classes across ontologies in anonymous ancestors of anchors, from which extended structural mappings and complex mappings can be identified. Evaluation on the Anatomy, the Large Biomedical Ontologies, and the Disease and Phenotype track of the 2016 Ontology Alignment Evaluation Initiative campaign
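
    For readers unfamiliar with formal contexts, the following minimal sketch enumerates the formal concepts of a tiny object-attribute context by closing attribute sets; the context is a toy assumption and is not one of the five contexts generated by FCA-Map.

```python
# Naive formal concept enumeration: for every attribute subset, compute its
# extent (objects sharing all the attributes) and close it back to an intent.
from itertools import combinations

context = {                      # object -> set of attributes (toy example)
    "ClassA": {"token:heart", "token:valve"},
    "ClassB": {"token:heart"},
    "ClassC": {"token:valve", "token:aortic"},
}
attributes = set().union(*context.values())

def extent(attrs):               # objects having every attribute in attrs
    return {o for o, a in context.items() if attrs <= a}

def intent(objs):                # attributes shared by every object in objs
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

concepts = set()
for r in range(len(attributes) + 1):
    for attrs in combinations(sorted(attributes), r):
        objs = extent(set(attrs))
        concepts.add((frozenset(objs), frozenset(intent(objs))))

for ext, inten in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(ext), "<->", sorted(inten))
```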

  14. Physics-based deformable organisms for medical image analysis

    NASA Astrophysics Data System (ADS)

    Hamarneh, Ghassan; McIntosh, Chris

    2005-04-01

    Previously, "Deformable organisms" were introduced as a novel paradigm for medical image analysis that uses artificial life modelling concepts. Deformable organisms were designed to complement the classical bottom-up deformable models methodologies (geometrical and physical layers), with top-down intelligent deformation control mechanisms (behavioral and cognitive layers). However, a true physical layer was absent and in order to complete medical image segmentation tasks, deformable organisms relied on pure geometry-based shape deformations guided by sensory data, prior structural knowledge, and expert-generated schedules of behaviors. In this paper we introduce the use of physics-based shape deformations within the deformable organisms framework yielding additional robustness by allowing intuitive real-time user guidance and interaction when necessary. We present the results of applying our physics-based deformable organisms, with an underlying dynamic spring-mass mesh model, to segmenting and labelling the corpus callosum in 2D midsagittal magnetic resonance images.

  15. An activity-based methodology for operations cost analysis

    NASA Technical Reports Server (NTRS)

    Korsmeyer, David; Bilby, Curt; Frizzell, R. A.

    1991-01-01

    This report describes an activity-based cost estimation method, proposed for the Space Exploration Initiative (SEI), as an alternative to NASA's traditional mass-based cost estimation method. A case study demonstrates how the activity-based cost estimation technique can be used to identify the operations that have a significant impact on costs over the life cycle of the SEI. The case study yielded an operations cost of $101 billion for the 20-year span of the lunar surface operations for the Option 5a program architecture. In addition, the results indicated that the support and training costs for the missions were the greatest contributors to the annual cost estimates. A cost-sensitivity analysis of the cultural and architectural drivers determined that the length of training and the amount of support associated with the ground support personnel for mission activities are the most significant cost contributors.

  16. Train integrity detection risk analysis based on PRISM

    NASA Astrophysics Data System (ADS)

    Wen, Yuan

    2018-04-01

    GNSS based Train Integrity Monitoring System (TIMS) is an effective and low-cost detection scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as uncertainty of wireless communication channels, which may lead to the failure of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method of train integrity detection based on PRISM is proposed in this article. First, we analyze the risk factors (in GNSS communication process and the on-board communication process) and model them. Then, we evaluate the performance of the model in PRISM based on the field data. Finally, we discuss how these risk factors influence the train integrity detection process.

  17. Coformer screening using thermal analysis based on binary phase diagrams.

    PubMed

    Yamashita, Hiroyuki; Hirakura, Yutaka; Yuda, Masamichi; Terada, Katsuhide

    2014-08-01

    The advent of cocrystals has demonstrated a growing need for efficient and comprehensive coformer screening in search of better development forms, including salt forms. Here, we investigated a coformer screening system for salts and cocrystals based on binary phase diagrams using thermal analysis and examined the effectiveness of the method. Indomethacin and tenoxicam were used as models of active pharmaceutical ingredients (APIs). Physical mixtures of an API and 42 kinds of coformers were analyzed using Differential Scanning Calorimetry (DSC) and X-ray DSC. We also conducted coformer screening using a conventional slurry method and compared these results with those from the thermal analysis method and previous studies. Compared with the slurry method, the thermal analysis method was a high-performance screening system, particularly for APIs with low solubility and/or propensity to form solvates. However, this method faced hurdles for screening coformers combined with an API in the presence of kinetic hindrance for salt or cocrystal formation during heating or if there is degradation near the metastable eutectic temperature. The thermal analysis and slurry methods are considered complementary to each other for coformer screening. Feasibility of the thermal analysis method in drug discovery practice is ensured given its small scale and high throughput.

  18. Validation Database Based Thermal Analysis of an Advanced RPS Concept

    NASA Technical Reports Server (NTRS)

    Balint, Tibor S.; Emis, Nickolas D.

    2006-01-01

    Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.

  19. Web-based analysis and publication of flow cytometry experiments.

    PubMed

    Kotecha, Nikesh; Krutzik, Peter O; Irish, Jonathan M

    2010-07-01

    Cytobank is a Web-based application for storage, analysis, and sharing of flow cytometry experiments. Researchers use a Web browser to log in and use a wide range of tools developed for basic and advanced flow cytometry. In addition to providing access to standard cytometry tools from any computer, Cytobank creates a platform and community for developing new analysis and publication tools. Figure layouts created on Cytobank are designed to allow transparent access to the underlying experiment annotation and data processing steps. Since all flow cytometry files and analysis data are stored on a central server, experiments and figures can be viewed or edited by anyone with the proper permission, from any computer with Internet access. Once a primary researcher has performed the initial analysis of the data, collaborators can engage in experiment analysis and make their own figure layouts using the gated, compensated experiment files. Cytobank is available to the scientific community at http://www.cytobank.org. (c) 2010 by John Wiley & Sons, Inc.

  20. Complexity analysis of human physiological signals based on case studies

    NASA Astrophysics Data System (ADS)

    Angelova, Maia; Holloway, Philip; Ellis, Jason

    2015-04-01

    This work focuses on methods for investigation of physiological time series based on complexity analysis. It is a part of a wider programme to determine non-invasive markers for healthy ageing. We consider two case studies investigated with actigraphy: (a) sleep and alternations with insomnia, and (b) ageing effects on mobility patterns. We illustrate, using these case studies, the application of fractal analysis to the investigation of regulation patterns and control, and change of physiological function. In the first case study, fractal analysis techniques were implemented to study the correlations present in sleep actigraphy for individuals suffering from acute insomnia in comparison with healthy controls. The aim was to investigate if complexity analysis can detect the onset of adverse health-related events. The subjects with acute insomnia displayed significantly higher levels of complexity, possibly a result of too much activity in the underlying regulatory systems. The second case study considered mobility patterns during night time and their variations with age. It showed that complexity metrics can identify change in physiological function with ageing. Both studies demonstrated that complexity analysis can be used to investigate markers of health, disease and healthy ageing.
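
    A compact sketch of detrended fluctuation analysis, one of the standard fractal techniques used on actigraphy, is given below; the synthetic white-noise signal and window sizes are assumptions for illustration, and the expected scaling exponent for white noise is about 0.5.

```python
# Minimal detrended fluctuation analysis (DFA) returning the scaling exponent.
import numpy as np

def dfa(signal, scales):
    y = np.cumsum(signal - np.mean(signal))          # integrated profile
    flucts = []
    for s in scales:
        n_win = len(y) // s
        segs = y[: n_win * s].reshape(n_win, s)
        t = np.arange(s)
        rms = []
        for seg in segs:
            coeff = np.polyfit(t, seg, 1)            # local linear detrend
            rms.append(np.sqrt(np.mean((seg - np.polyval(coeff, t)) ** 2)))
        flucts.append(np.mean(rms))
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]   # exponent alpha

rng = np.random.default_rng(0)
white_noise = rng.normal(size=4096)
print(round(dfa(white_noise, [16, 32, 64, 128, 256]), 2))     # ~0.5 expected
```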

  1. Emergy Analysis and Sustainability Efficiency Analysis of Different Crop-Based Biodiesel in Life Cycle Perspective

    PubMed Central

    Ren, Jingzheng; Manzardo, Alessandro; Mazzi, Anna; Fedele, Andrea; Scipioni, Antonio

    2013-01-01

    Biodiesel, a promising alternative energy resource, has become a hot spot in chemical engineering, but there is also an argument about the sustainability of biodiesel. In order to analyze the sustainability of biodiesel production systems and select the most sustainable scenario, various kinds of crop-based biodiesel, including soybean-, rapeseed-, sunflower-, jatropha- and palm-based biodiesel production options, are studied by emergy analysis; the soybean-based scenario is recognized as the most sustainable scenario and should be chosen for further study in China. The DEA method is used to evaluate the sustainability efficiencies of these options; the biodiesel production systems based on soybean, sunflower, and palm are considered DEA efficient, whereas the rapeseed-based and jatropha-based scenarios need to be improved, and methods for improvement have also been specified. PMID:23766723
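
    To illustrate the DEA efficiency scoring step, the following sketch solves the input-oriented CCR model in multiplier form with a linear-programming solver; the input/output table is a made-up example, not the crop-based biodiesel data analyzed in the paper.

```python
# Input-oriented CCR DEA (multiplier form): maximize weighted output of the
# scenario under evaluation, with its weighted input fixed to 1 and no
# scenario allowed an output/input ratio above 1. Toy data only.
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])   # inputs (e.g. land, energy)
Y = np.array([[1.0], [1.0], [1.0], [1.0]])                        # outputs (e.g. fuel yield)

def ccr_efficiency(j):
    n_out, n_in = Y.shape[1], X.shape[1]
    c = np.concatenate([-Y[j], np.zeros(n_in)])            # maximize u . y_j
    A_eq = np.concatenate([np.zeros(n_out), X[j]])[None]   # v . x_j = 1
    A_ub = np.hstack([Y, -X])                               # u . y_k - v . x_k <= 0
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(len(X)),
                  A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun

for j in range(len(X)):
    print(f"scenario {j}: efficiency = {ccr_efficiency(j):.2f}")
```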

  2. Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty

    SciTech Connect

    Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.

    Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.

  3. Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty

    DOE PAGES

    Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.

    2016-09-12

    Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.
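
    As a reference point for the variance-based indices that DSA generalizes, the sketch below estimates first-order Sobol indices by Monte Carlo for a toy model; the test function, sampling ranges, and sample size are assumptions, and this is not the authors' DSA implementation.

```python
# Monte Carlo estimate of first-order Sobol indices (Saltelli 2010 estimator).
import numpy as np

def model(x):                     # toy model: x1 dominates the output variance
    return 2.0 * x[:, 0] + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(0)
N, d = 100_000, 2
A = rng.uniform(-1, 1, (N, d))
B = rng.uniform(-1, 1, (N, d))
yA, yB = model(A), model(B)
total_var = np.var(np.concatenate([yA, yB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]           # resample only input i
    yABi = model(ABi)
    Si = np.mean(yB * (yABi - yA)) / total_var   # first-order index estimate
    print(f"S_{i + 1} = {Si:.2f}")
```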

  4. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    PubMed

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python-based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train and local field potential analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided

  5. Marker Configuration Model-Based Roentgen Fluoroscopic Analysis.

    PubMed

    Garling, Eric H; Kaptein, Bart L; Geleijns, Koos; Nelissen, Rob G H H; Valstar, Edward R

    2005-04-01

    It remains unknown if and how the polyethylene bearing in mobile bearing knees moves during dynamic activities with respect to the tibial base plate. Marker Configuration Model-Based Roentgen Fluoroscopic Analysis (MCM-based RFA) uses a marker configuration model of inserted tantalum markers in order to accurately estimate the pose of an implant or bone using single-plane Roentgen images or fluoroscopic images. The goal of this study is to assess the accuracy of MCM-based RFA in a standard fluoroscopic set-up using phantom experiments and to determine the error propagation with computer simulations. The experimental set-up of the phantom study was calibrated using a calibration box equipped with 600 tantalum markers, which corrected for image distortion and determined the focus position. In the computer simulation study, the influence of image distortion, MC-model accuracy, focus position, the relative distance between MC-models and MC-model configuration on the accuracy of MCM-based RFA was assessed. The phantom study established that the in-plane accuracy of MCM-based RFA is 0.1 mm and the out-of-plane accuracy is 0.9 mm. The rotational accuracy is 0.1 degrees. A ninth-order polynomial model was used to correct for image distortion. Marker-based RFA was estimated to have, in a worst-case scenario, an in vivo translational accuracy of 0.14 mm (x-axis), 0.17 mm (y-axis), 1.9 mm (z-axis), respectively, and a rotational accuracy of 0.3 degrees. When using fluoroscopy to study kinematics, image distortion and the accuracy of models are important factors which influence the accuracy of the measurements. MCM-based RFA has the potential to be an accurate, clinically useful tool for studying kinematics after total joint replacement using standard equipment.

  6. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    NASA Technical Reports Server (NTRS)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers and enables scalable computation and storage needs, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.

  7. Resilience Analysis of Countries under Disasters Based on Multisource Data.

    PubMed

    Zhang, Nan; Huang, Hong

    2018-01-01

    Disasters occur almost daily in the world. Because emergencies frequently have no precedent, are highly uncertain, and can be very destructive, improving a country's resilience is an efficient way to reduce risk. In this article, we collected more than 20,000 historical data points from disasters from 207 countries to enable us to calculate the severity of disasters and the danger they pose to countries. In addition, 6 primary indices (disaster, personal attribute, infrastructure, economics, education, and occupation) including 38 secondary influencing factors are considered in analyzing the resilience of countries. Using these data, we obtained the danger, expected number of deaths, and resilience of all 207 countries. We found that a country covering a large area is more likely to have a low resilience score. Through sensitivity analysis of all secondary indices, we found that population density, frequency of disasters, and GDP are the three most critical factors affecting resilience. Based on broad-spectrum resilience analysis of the different continents, Oceania and South America have the highest resilience, while Asia has the lowest. Over the past 50 years, the resilience of many countries has been improved sharply, especially in developing countries. Based on our results, we analyze the comprehensive resilience and provide some optimal suggestions to efficiently improve resilience. © 2017 Society for Risk Analysis.

  8. A graph-based system for network-vulnerability analysis

    SciTech Connect

    Swiler, L.P.; Phillips, C.

    1998-06-01

    This paper presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The graph-based tool can identify the set of attack paths that have a high probability of success (or a low effort cost) for the attacker. The system could be used to test the effectiveness of making configuration changes, implementing an intrusion detection system, etc. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.
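
    A small sketch in the spirit of the approach: edges of an attack graph carry success probabilities, and the most likely attack path is recovered as a shortest path over -log(probability) weights; the graph, node names, and probabilities are assumptions, not data from the paper.

```python
# Most-likely attack path via Dijkstra over -log(p) edge weights (networkx).
import math
import networkx as nx

G = nx.DiGraph()
edges = [
    ("outside", "dmz_web", 0.7),       # exploit the web server
    ("dmz_web", "app_server", 0.4),    # pivot to the application tier
    ("outside", "vpn", 0.2),           # compromise VPN credentials
    ("vpn", "app_server", 0.9),
    ("app_server", "database", 0.6),   # escalate to the asset of interest
]
for src, dst, p in edges:
    G.add_edge(src, dst, weight=-math.log(p), prob=p)

path = nx.shortest_path(G, "outside", "database", weight="weight")
prob = math.prod(G[u][v]["prob"] for u, v in zip(path, path[1:]))
print(path, f"success probability = {prob:.2f}")
```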

  9. Complexity analysis based on generalized deviation for financial markets

    NASA Astrophysics Data System (ADS)

    Li, Chao; Shang, Pengjian

    2018-03-01

    In this paper, a new modified method is proposed as a measure to investigate the correlation between past price and future volatility for financial time series, known as complexity analysis based on generalized deviation. In comparison with the former retarded volatility model, the new approach is both simple and computationally efficient. The method based on the generalized deviation function presents an exhaustive way of quantifying the rules of the financial market. The robustness of this method is verified by numerical experiments with both artificial and financial time series. Results show that the generalized deviation complexity analysis method not only identifies the volatility of financial time series, but also provides a comprehensive way of distinguishing the different characteristics of stock indices and individual stocks. Exponential functions can be used to successfully fit the volatility curves and quantify the changes of complexity for stock market data. We then study the influence of the negative domain of the deviation coefficient and the differences between volatile periods and calm periods. After analysing the data from the experimental model, we found that the generalized deviation model has definite advantages in exploring the relationship between historical returns and future volatility.

  10. Applying Model Analysis to a Resource-Based Analysis of the Force and Motion Conceptual Evaluation

    ERIC Educational Resources Information Center

    Smith, Trevor I.; Wittmann, Michael C.; Carter, Tom

    2014-01-01

    Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information…

  11. Principle-based concept analysis: Caring in nursing education

    PubMed Central

    Salehian, Maryam; Heydari, Abbas; Aghebati, Nahid; Moonaghi, Hossein Karimi; Mazloom, Seyed Reza

    2016-01-01

    Introduction The aim of this principle-based concept analysis was to analyze caring in nursing education and to explain the current state of the science based on epistemologic, pragmatic, linguistic, and logical philosophical principles. Methods A principle-based concept analysis method was used to analyze the nursing literature. The dataset included 46 English language studies, published from 2005 to 2014, and they were retrieved through PROQUEST, MEDLINE, CINAHL, ERIC, SCOPUS, and SID scientific databases. The key dimensions of the data were collected using a validated data-extraction sheet. The four principles of assessing pragmatic utility were used to analyze the data. The data were managed by using MAXQDA 10 software. Results The scientific literature that deals with caring in nursing education relies on implied meaning. Caring in nursing education refers to student-teacher interactions that are formed on the basis of human values and focused on the unique needs of the students (epistemological principle). The result of student-teacher interactions is the development of both the students and the teachers. Numerous applications of the concept of caring in nursing education are available in the literature (pragmatic principle). There is consistency in the meaning of the concept, as a central value of the faculty-student interaction (linguistic principle). Compared with other related concepts, such as “caring pedagogy,” “value-based education,” and “teaching excellence,” caring in nursing education does not have exact and clear conceptual boundaries (logic principle). Conclusion Caring in nursing education was identified as an approach to teaching and learning, and it is formed based on teacher-student interactions and sustainable human values. A greater understanding of the conceptual basis of caring in nursing education will improve the caring behaviors of teachers, create teaching-learning environments, and help experts in curriculum development

  12. Principle-based concept analysis: Caring in nursing education.

    PubMed

    Salehian, Maryam; Heydari, Abbas; Aghebati, Nahid; Karimi Moonaghi, Hossein; Mazloom, Seyed Reza

    2016-03-01

    The aim of this principle-based concept analysis was to analyze caring in nursing education and to explain the current state of the science based on epistemologic, pragmatic, linguistic, and logical philosophical principles. A principle-based concept analysis method was used to analyze the nursing literature. The dataset included 46 English language studies, published from 2005 to 2014, and they were retrieved through PROQUEST, MEDLINE, CINAHL, ERIC, SCOPUS, and SID scientific databases. The key dimensions of the data were collected using a validated data-extraction sheet. The four principles of assessing pragmatic utility were used to analyze the data. The data were managed by using MAXQDA 10 software. The scientific literature that deals with caring in nursing education relies on implied meaning. Caring in nursing education refers to student-teacher interactions that are formed on the basis of human values and focused on the unique needs of the students (epistemological principle). The result of student-teacher interactions is the development of both the students and the teachers. Numerous applications of the concept of caring in nursing education are available in the literature (pragmatic principle). There is consistency in the meaning of the concept, as a central value of the faculty-student interaction (linguistic principle). Compared with other related concepts, such as "caring pedagogy," "value-based education," and "teaching excellence," caring in nursing education does not have exact and clear conceptual boundaries (logic principle). Caring in nursing education was identified as an approach to teaching and learning, and it is formed based on teacher-student interactions and sustainable human values. A greater understanding of the conceptual basis of caring in nursing education will improve the caring behaviors of teachers, create teaching-learning environments, and help experts in curriculum development.

  13. Structure-Based Phylogenetic Analysis of the Lipocalin Superfamily.

    PubMed

    Lakshmi, Balasubramanian; Mishra, Madhulika; Srinivasan, Narayanaswamy; Archunan, Govindaraju

    2015-01-01

    Lipocalins constitute a superfamily of extracellular proteins that are found in all three kingdoms of life. Although very divergent in their sequences and functions, they show remarkable similarity in 3-D structures. Lipocalins bind and transport small hydrophobic molecules. Earlier sequence-based phylogenetic studies of lipocalins highlighted that they have a long evolutionary history. However the molecular and structural basis of their functional diversity is not completely understood. The main objective of the present study is to understand functional diversity of the lipocalins using a structure-based phylogenetic approach. The present study with 39 protein domains from the lipocalin superfamily suggests that the clusters of lipocalins obtained by structure-based phylogeny correspond well with the functional diversity. The detailed analysis on each of the clusters and sub-clusters reveals that the 39 lipocalin domains cluster based on their mode of ligand binding though the clustering was performed on the basis of gross domain structure. The outliers in the phylogenetic tree are often from single member families. Also structure-based phylogenetic approach has provided pointers to assign putative function for the domains of unknown function in lipocalin family. The approach employed in the present study can be used in the future for the functional identification of new lipocalin proteins and may be extended to other protein families where members show poor sequence similarity but high structural similarity.

  14. Finite element analysis of osteoporosis models based on synchrotron radiation

    NASA Astrophysics Data System (ADS)

    Xu, W.; Xu, J.; Zhao, J.; Sun, J.

    2016-04-01

    With the growing pressure of an aging society, China, like the rest of the world, has to face an increasing population of osteoporosis patients. Recently, synchrotron radiation has become an essential tool for biomedical exploration, with the advantages of high resolution and high stability. In order to study characteristic changes in different stages of primary osteoporosis, this research focused on the different periods of osteoporosis in rats based on synchrotron radiation. Both bone histomorphometry analysis and finite element analysis were then carried out on the reconstructed three-dimensional models. Finally, the changes of bone tissue in different periods were compared quantitatively. Histomorphometry analysis showed that the trabecular structure degraded in osteoporosis as the bone volume decreased. For femurs, the bone volume fraction (bone volume / total volume, BV/TV) decreased from 69% to 43%. That led to an increase of the trabecular separation (from 45.05 μm to 97.09 μm) and a reduction of the trabecular number (from 7.99 mm-1 to 5.97 mm-1). Simulation of various mechanical tests with finite element analysis (FEA) indicated that, with the exacerbation of osteoporosis, the bones' resistance to compression, bending and torsion gradually became weaker. The compression stiffness of the femurs decreased from 1770.96 F μm-1 to 697.41 F μm-1, while the bending and torsion stiffness decreased from 1390.80 F μm-1 to 566.11 F μm-1 and from 2957.28 N·m/° to 691.31 N·m/°, respectively, indicating the decrease of bone strength; this matched the histomorphometry analysis. This study suggested that FEA and synchrotron radiation, combined with histomorphometry analysis, are excellent methods for analysing bone strength.

  15. COBRApy: COnstraints-Based Reconstruction and Analysis for Python.

    PubMed

    Ebrahim, Ali; Lerman, Joshua A; Palsson, Bernhard O; Hyduke, Daniel R

    2013-08-08

    COnstraint-Based Reconstruction and Analysis (COBRA) methods are widely used for genome-scale modeling of metabolic networks in both prokaryotes and eukaryotes. Due to the successes with metabolism, there is an increasing effort to apply COBRA methods to reconstruct and analyze integrated models of cellular processes. The COBRA Toolbox for MATLAB is a leading software package for genome-scale analysis of metabolism; however, it was not designed to elegantly capture the complexity inherent in integrated biological networks and lacks an integration framework for the multiomics data used in systems biology. The openCOBRA Project is a community effort to promote constraints-based research through the distribution of freely available software. Here, we describe COBRA for Python (COBRApy), a Python package that provides support for basic COBRA methods. COBRApy is designed in an object-oriented fashion that facilitates the representation of the complex biological processes of metabolism and gene expression. COBRApy does not require MATLAB to function; however, it includes an interface to the COBRA Toolbox for MATLAB to facilitate use of legacy codes. For improved performance, COBRApy includes parallel processing support for computationally intensive processes. COBRApy is an object-oriented framework designed to meet the computational challenges associated with the next generation of stoichiometric constraint-based models and high-density omics data sets. http://opencobra.sourceforge.net/
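
    A minimal flux balance analysis run with COBRApy might look like the following; the SBML file name is a placeholder, and the exact calls may differ between COBRApy versions.

```python
# Minimal FBA sketch with COBRApy: load a genome-scale model from SBML and
# optimize its default objective. "e_coli_core.xml" is a placeholder path.
import cobra

model = cobra.io.read_sbml_model("e_coli_core.xml")   # hypothetical local file
solution = model.optimize()                            # maximize the objective (e.g. biomass)

print(solution.objective_value)    # optimal objective flux
print(solution.fluxes.head())      # first few reaction fluxes (pandas Series)
```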

  16. Cost Analysis of an Office-based Surgical Suite

    PubMed Central

    LaBove, Gabrielle

    2016-01-01

    Introduction: Operating costs are a significant part of delivering surgical care. Having a system to analyze these costs is imperative for decision making and efficiency. We present an analysis of surgical supply, labor and administrative costs, and remuneration of procedures as a means for a practice to analyze its cost effectiveness; this affects the quality of care through the ability to provide services. The costs of surgical care cannot be estimated blindly, as reconstructive and cosmetic procedures have different percentages of overhead. Methods: A detailed financial analysis of office-based surgical suite costs for surgical procedures was determined based on company contract prices and average use of supplies. The average time spent on scheduling, prepping, and doing the surgery was factored using employee rates. Results: The most expensive minor-procedure supplies are suture needles. The 4 most common procedures, from most to least expensive, are abdominoplasty, breast augmentation, facelift, and lipectomy. Conclusions: Reconstructive procedures require a greater portion of collection to cover costs. Without adjustment of both patient and insurance remuneration in the practice, providing quality care will become increasingly difficult. PMID:27536482

  17. Automated image-based phenotypic analysis in zebrafish embryos

    PubMed Central

    Vogt, Andreas; Cholewinski, Andrzej; Shen, Xiaoqiang; Nelson, Scott; Lazo, John S.; Tsang, Michael; Hukriede, Neil A.

    2009-01-01

    Presently, the zebrafish is the only vertebrate model compatible with contemporary paradigms of drug discovery. Zebrafish embryos are amenable to automation necessary for high-throughput chemical screens, and optical transparency makes them potentially suited for image-based screening. However, the lack of tools for automated analysis of complex images presents an obstacle to utilizing the zebrafish as a high-throughput screening model. We have developed an automated system for imaging and analyzing zebrafish embryos in multi-well plates regardless of embryo orientation and without user intervention. Images of fluorescent embryos were acquired on a high-content reader and analyzed using an artificial intelligence-based image analysis method termed Cognition Network Technology (CNT). CNT reliably detected transgenic fluorescent embryos (Tg(fli1:EGFP)y1) arrayed in 96-well plates and quantified intersegmental blood vessel development in embryos treated with small molecule inhibitors of angiogenesis. The results demonstrate that it is feasible to adapt image-based high-content screening methodology to measure complex whole organism phenotypes. PMID:19235725

  18. Least-dependent-component analysis based on mutual information

    NASA Astrophysics Data System (ADS)

    Stögbauer, Harald; Kraskov, Alexander; Astakhov, Sergey A.; Grassberger, Peter

    2004-12-01

    We propose to use precise estimators of mutual information (MI) to find the least dependent components in a linearly mixed signal. On the one hand, this seems to lead to better blind source separation than with any other presently available algorithm. On the other hand, it has the advantage, compared to other implementations of “independent” component analysis (ICA), some of which are based on crude approximations for MI, that the numerical values of the MI can be used for (i) estimating residual dependencies between the output components; (ii) estimating the reliability of the output by comparing the pairwise MIs with those of remixed components; and (iii) clustering the output according to the residual interdependencies. For the MI estimator, we use a recently proposed k-nearest-neighbor-based algorithm. For time sequences, we combine this with delay embedding, in order to take into account nontrivial time correlations. After several tests with artificial data, we apply the resulting MILCA (mutual-information-based least dependent component analysis) algorithm to a real-world dataset, the ECG of a pregnant woman.
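
    As an illustration only (not the authors' MILCA code), the sketch below uses a k-nearest-neighbor MI estimator to check residual dependence between components recovered from a linear mixture, which is the kind of quality check listed under point (i) of the abstract. FastICA stands in for the demixing step, and scikit-learn's kNN-based mutual_info_regression for the Kraskov-style MI estimate; the mixing matrix and sources are invented.

```python
# Hedged sketch: estimate residual MI between demixed components of a toy mixture.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
n = 5000
s1 = rng.laplace(size=n)                  # two independent non-Gaussian sources
s2 = rng.uniform(-1, 1, size=n)
X = np.c_[s1, s2] @ np.array([[1.0, 0.4], [0.3, 1.0]])  # linear mixing

components = FastICA(n_components=2, random_state=0).fit_transform(X)

# Residual MI between the two recovered components (ideally close to zero)
mi = mutual_info_regression(components[:, [0]], components[:, 1], n_neighbors=4)
print(f"residual MI estimate: {mi[0]:.4f} nats")
```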

  19. Visualization-based analysis of multiple response survey data

    NASA Astrophysics Data System (ADS)

    Timofeeva, Anastasiia

    2017-11-01

    During a survey, respondents are often allowed to tick more than one answer option for a question. Analysis and visualization of such data are difficult because multiple response variables must be processed. With standard representations such as pie and bar charts, information about the association between different answer options is lost. The author proposes a visualization approach for multiple response variables based on Venn diagrams. For a more informative representation with a large number of overlapping groups, it is suggested to use similarity and association matrices. Some aggregate indicators of dissimilarity (similarity) are proposed, based on the determinant of the similarity matrix and the maximum eigenvalue of the association matrix. The application of the proposed approaches is illustrated with an analysis of advertising sources. Intersection of sets indicates that the same consumer audience is covered by several advertising sources; this information is very important for allocating the advertising budget. The differences between target groups across advertising sources are also of interest. To identify such differences, hypotheses of homogeneity and independence are tested. Recent approaches to the problem are briefly reviewed and compared, and an alternative procedure is suggested. It is based on partitioning a consumer audience into pairwise disjoint subsets and includes hypothesis testing of the difference between the population proportions. It turned out to be more suitable for the real problem being solved.
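
    A minimal sketch of the kind of summaries such a visualization builds on, under assumptions of my own (the respondent counts, source names, and probabilities are hypothetical): multiple-response answers are stored as a binary indicator matrix, from which pairwise overlap (association) counts and a Jaccard-style similarity matrix are computed.

```python
# Hedged sketch: overlap and similarity matrices for multiple-response survey data.
import numpy as np

# rows = respondents, columns = advertising sources they recalled (hypothetical data)
rng = np.random.default_rng(0)
X = (rng.random((500, 4)) < [0.6, 0.4, 0.3, 0.2]).astype(int)
sources = ["TV", "web", "radio", "print"]

overlap = X.T @ X                                  # joint counts of ticking both options
counts = np.diag(overlap)
union = counts[:, None] + counts[None, :] - overlap
jaccard = overlap / np.maximum(union, 1)           # similarity matrix

print("overlap counts:\n", overlap)
print("max eigenvalue of similarity matrix:",
      round(float(np.linalg.eigvalsh(jaccard).max()), 3))
```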

  20. Automated Video Based Facial Expression Analysis of Neuropsychiatric Disorders

    PubMed Central

    Wang, Peng; Barrett, Frederick; Martin, Elizabeth; Milanova, Marina; Gur, Raquel E.; Gur, Ruben C.; Kohler, Christian; Verma, Ragini

    2008-01-01

    Deficits in emotional expression are prominent in several neuropsychiatric disorders, including schizophrenia. Available clinical facial expression evaluations provide subjective and qualitative measurements, which are based on static 2D images that do not capture the temporal dynamics and subtleties of expression changes. Therefore, there is a need for automated, objective and quantitative measurements of facial expressions captured using videos. This paper presents a computational framework that creates probabilistic expression profiles for video data and can potentially help to automatically quantify emotional expression differences between patients with neuropsychiatric disorders and healthy controls. Our method automatically detects and tracks facial landmarks in videos, and then extracts geometric features to characterize facial expression changes. To analyze temporal facial expression changes, we employ probabilistic classifiers that analyze facial expressions in individual frames, and then propagate the probabilities throughout the video to capture the temporal characteristics of facial expressions. The applications of our method to healthy controls and case studies of patients with schizophrenia and Asperger’s syndrome demonstrate the capability of the video-based expression analysis method in capturing subtleties of facial expression. Such results can pave the way for a video-based method for quantitative analysis of facial expressions in clinical research of disorders that cause affective deficits. PMID:18045693

  1. Laparoscopic surgery skills evaluation: analysis based on accelerometers.

    PubMed

    Sánchez, Alexis; Rodríguez, Omaira; Sánchez, Renata; Benítez, Gustavo; Pena, Romina; Salamo, Oriana; Baez, Valentina

    2014-01-01

    Technical skills assessment is considered an important part of surgical training. Subjective assessment is not appropriate for training feedback, and there is now increased demand for objective assessment of surgical performance. Economy of movement has been proposed as an excellent alternative for this purpose. The investigators describe a readily available method to evaluate surgical skills through motion analysis using accelerometers in Apple's iPod Touch device. Two groups of individuals with different minimally invasive surgery skill levels (experts and novices) were evaluated. Each group was asked to perform a given task with an iPod Touch placed on the dominant-hand wrist. The Accelerometer Data Pro application makes it possible to obtain movement-related data detected by the accelerometers. Average acceleration and maximum acceleration for each axis (x, y, and z) were determined and compared. The analysis of average acceleration and maximum acceleration showed statistically significant differences between groups on both the y (P = .04, P = .03) and z (P = .04, P = .04) axes. This demonstrates the ability to distinguish between experts and novices. The analysis of the x axis showed no significant differences between groups, which could be explained by the fact that the task involves few movements on this axis. Accelerometer-based motion analysis is a useful tool to evaluate laparoscopic skill development of surgeons and should be used in training programs. Validation of this device in an in vivo setting is a research goal of the investigators' team.
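
    The sketch below is illustrative only (not the investigators' app): it summarizes per-axis accelerometer traces into mean and peak acceleration and compares two skill groups with a nonparametric test, mirroring the expert-versus-novice comparison described in the abstract. The recordings, group sizes, and noise levels are hypothetical.

```python
# Hedged sketch: compare per-axis accelerometer summaries between two skill groups.
import numpy as np
from scipy.stats import mannwhitneyu

def summarize(trace):
    """trace: (n_samples, 3) array of x, y, z accelerations for one task run."""
    mag = np.abs(trace)
    return mag.mean(axis=0), mag.max(axis=0)   # per-axis average and maximum

rng = np.random.default_rng(1)
# Hypothetical recordings: experts move less (economy of movement) than novices.
experts = [rng.normal(0, 0.8, size=(2000, 3)) for _ in range(10)]
novices = [rng.normal(0, 1.3, size=(2000, 3)) for _ in range(10)]

exp_means = np.array([summarize(t)[0] for t in experts])
nov_means = np.array([summarize(t)[0] for t in novices])

for axis, name in enumerate("xyz"):
    stat, p = mannwhitneyu(exp_means[:, axis], nov_means[:, axis])
    print(f"{name}-axis mean acceleration: U={stat:.1f}, p={p:.3f}")
```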

  2. GIS-based poverty and population distribution analysis in China

    NASA Astrophysics Data System (ADS)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is not only related to socio-economic factors but also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced to reveal regional differences. More than 100,000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty tends to concentrate in most of west China and in mountainous rural areas of central China. Furthermore, the fifth census data are overlaid onto those poor areas in order to capture their internal diversity of socio-economic characteristics. By overlaying poverty-related socio-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities, and family composition, the findings show that poverty distribution is strongly correlated with high illiteracy rates, a high percentage of ethnic minorities, and larger family size.

  3. Histogram analysis for smartphone-based rapid hematocrit determination

    PubMed Central

    Jalal, Uddin M.; Kim, Sang C.; Shim, Joon S.

    2017-01-01

    A novel and rapid analysis technique using histograms has been proposed for the colorimetric quantification of blood hematocrit. A smartphone-based “Histogram” app for hematocrit detection has been developed, integrating the smartphone's embedded camera with a microfluidic chip via a custom-made optical platform. The developed histogram analysis is effective for automatic detection of the sample channel, including auto-calibration, and can analyze single-channel as well as multi-channel images. Furthermore, the method is advantageous for quantifying blood hematocrit under both uniform and varying optical conditions. Rapid determination of blood hematocrit carries substantial information regarding physiological disorders, and the use of such reproducible, cost-effective, and standard techniques may effectively help with the diagnosis and prevention of a number of human diseases. PMID:28717569

  4. GPS baseline configuration design based on robustness analysis

    NASA Astrophysics Data System (ADS)

    Yetkin, M.; Berber, M.

    2012-11-01

    The robustness analysis results obtained from a Global Positioning System (GPS) network are dramatically influenced by the configuration of the observed baselines. The selection of optimal GPS baselines may allow for a cost-effective survey campaign and a sufficiently robust network. Furthermore, using the approach described in this paper, the required number of sessions, the baselines to be observed, and the significance levels for statistical testing and robustness analysis can be determined even before the GPS campaign starts. In this study, we propose a robustness criterion for the optimal design of geodetic networks, and present a very simple and efficient algorithm based on this criterion for the selection of optimal GPS baselines. We also show the relationship between the number of sessions and the non-centrality parameter. Finally, a numerical example is given to verify the efficacy of the proposed approach.

  5. Robust LOD scores for variance component-based linkage analysis.

    PubMed

    Blangero, J; Williams, J T; Almasy, L

    2000-01-01

    The variance component method is now widely used for linkage analysis of quantitative traits. Although this approach offers many advantages, the importance of the underlying assumption of multivariate normality of the trait distribution within pedigrees has not been studied extensively. Simulation studies have shown that traits with leptokurtic distributions yield linkage test statistics that exhibit excessive Type I error when analyzed naively. We derive analytical formulae relating the deviation from the expected asymptotic distribution of the LOD score to the kurtosis and total heritability of the quantitative trait. A simple correction constant yields a robust LOD score for any deviation from normality and for any pedigree structure, and effectively eliminates the problem of inflated Type I error due to misspecification of the underlying probability model in variance component-based linkage analysis.

  6. Cloud-based Jupyter Notebooks for Water Data Analysis

    NASA Astrophysics Data System (ADS)

    Castronova, A. M.; Brazil, L.; Seul, M.

    2017-12-01

    The development and adoption of technologies by the water science community to improve our ability to openly collaborate and share workflows will have a transformative impact on how we address the challenges associated with collaborative and reproducible scientific research. Jupyter notebooks offer one solution by providing an open-source platform for creating metadata-rich toolchains for modeling and data analysis applications. Adoption of this technology within the water sciences, coupled with publicly available datasets from agencies such as USGS, NASA, and EPA, enables researchers to easily prototype and execute data-intensive toolchains. Moreover, implementing this software stack in a cloud-based environment extends its native functionality to provide researchers a mechanism to build and execute toolchains that are too large or computationally demanding for typical desktop computers. Additionally, this cloud-based solution enables scientists to disseminate data processing routines alongside journal publications in an effort to support reproducibility. For example, these data collection and analysis toolchains can be shared, archived, and published using the HydroShare platform or downloaded and executed locally to reproduce scientific analysis. This work presents the design and implementation of a cloud-based Jupyter environment and its application for collecting, aggregating, and munging various datasets in a transparent, sharable, and self-documented manner. The goals of this work are to establish a free and open source platform for domain scientists to (1) conduct data-intensive and computationally intensive collaborative research, (2) utilize high performance libraries, models, and routines within a pre-configured cloud environment, and (3) enable dissemination of research products. This presentation will discuss recent efforts towards achieving these goals, and describe the architectural design of the notebook server in an effort to support collaborative research.

  7. IMMAN: free software for information theory-based chemometric analysis.

    PubMed

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is a multi-platform software package developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
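
    As an illustration of one of the supervised ranking criteria the abstract lists, the sketch below (not IMMAN itself) computes the information gain of a continuous feature with respect to a class label after equal-interval discretization. The data, bin count, and feature names are assumptions made for the example.

```python
# Hedged sketch: information gain of a discretized feature with respect to a label.
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, labels, bins=5):
    # equal-interval discretization of the (continuous) feature
    edges = np.linspace(feature.min(), feature.max(), bins + 1)[1:-1]
    binned = np.digitize(feature, edges)
    h_y = entropy(labels)
    h_y_given_x = sum((binned == b).mean() * entropy(labels[binned == b])
                      for b in np.unique(binned))
    return h_y - h_y_given_x

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 300)
informative = y + rng.normal(0, 0.5, 300)          # feature correlated with the class
noise = rng.normal(0, 1, 300)                      # unrelated feature
print("IG(informative) =", round(information_gain(informative, y), 3))
print("IG(noise)       =", round(information_gain(noise, y), 3))
```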

  8. Demonstrating Change with Astronaut Photography Using Object Based Image Analysis

    NASA Technical Reports Server (NTRS)

    Hollier, Andi; Jagge, Amy

    2017-01-01

    Every day, hundreds of images of Earth flood the Crew Earth Observations database as astronauts use handheld digital cameras to capture spectacular frames from the International Space Station. The variety of resolutions and perspectives provides a template for assessing land cover change over decades. We will focus on urban growth in the second-fastest-growing city in the nation, Houston, TX, using Object-Based Image Analysis. This research will contribute to the land change science community, integrated resource planning, and monitoring of the rapid rate of urban sprawl.

  9. Selecting supplier combination based on fuzzy multicriteria analysis

    NASA Astrophysics Data System (ADS)

    Han, Zhi-Qiu; Luo, Xin-Xing; Chen, Xiao-Hong; Yang, Wu-E.

    2015-07-01

    Existing multicriteria analysis (MCA) methods are probably ineffective in selecting a supplier combination. Thus, an MCA-based fuzzy 0-1 programming method is introduced. The programming relates to a simple MCA matrix that is used to select a single supplier. By solving the programming, the most feasible combination of suppliers is selected. Importantly, this result differs from selecting suppliers one by one according to a single-selection order, which is used to rank sole suppliers in existing MCA methods. An example highlights such difference and illustrates the proposed method.

  10. A Match Method Based on Latent Semantic Analysis for Earthquake Hazard Emergency Plans

    NASA Astrophysics Data System (ADS)

    Sun, D.; Zhao, S.; Zhang, Z.; Shi, X.

    2017-09-01

    The structure of earthquake emergency plans is complex, and it is difficult for decision makers to reach a decision in a short time. To address this problem, this paper presents a match method based on Latent Semantic Analysis (LSA). After word segmentation preprocessing of the emergency plans, we extract keywords according to part of speech and word frequency. Then, through LSA, we map the documents and the query information into the semantic space and calculate the correlation of documents and queries from the relation between their vectors. The experimental results indicate that LSA can improve the accuracy of emergency plan retrieval efficiently.
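
    A minimal sketch (assumed, not the paper's code) of LSA-based retrieval: plan documents and a query are mapped into a low-rank semantic space with truncated SVD and documents are ranked by cosine similarity. The example plans, query, and number of components are invented.

```python
# Hedged sketch: LSA retrieval with TF-IDF + truncated SVD + cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

plans = [
    "evacuate residents near the fault and open emergency shelters",
    "inspect bridges and restore power after strong ground shaking",
    "allocate medical teams and supplies to collapsed buildings",
]
query = ["medical rescue for building collapse"]

vectorizer = TfidfVectorizer()
doc_term = vectorizer.fit_transform(plans + query)

svd = TruncatedSVD(n_components=2, random_state=0)   # the latent semantic space
vectors = svd.fit_transform(doc_term)

scores = cosine_similarity(vectors[-1:], vectors[:-1])[0]
best = scores.argmax()
print(f"best matching plan: {best} (score {scores[best]:.2f})")
```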

  11. Electromagnetic fields from mobile phone base station - variability analysis.

    PubMed

    Bienkowski, Pawel; Zubrzak, Bartlomiej

    2015-09-01

    The article describes the character of the electromagnetic field (EMF) in the surroundings of mobile phone base stations (BS) and its variability in time, with an emphasis on the measurement difficulties related to its pulsed and multi-frequency nature. The work also presents long-term monitoring measurements performed recently at different locations in Poland: a small city with dispersed building development and a major Polish city with a dense urban area. The authors tried to determine trends in the EMF spectrum by analyzing daily changes of the measured EMF levels at those locations. The research was performed using selective electromagnetic meters and an EMF meter with spectrum analysis.

  12. Lossless droplet transfer of droplet-based microfluidic analysis

    DOEpatents

    Kelly, Ryan T [West Richland, WA; Tang, Keqi [Richland, WA; Page, Jason S [Kennewick, WA; Smith, Richard D [Richland, WA

    2011-11-22

    A transfer structure for droplet-based microfluidic analysis is characterized by a first conduit containing a first stream having at least one immiscible droplet of aqueous material and a second conduit containing a second stream comprising an aqueous fluid. The interface between the first conduit and the second conduit can define a plurality of apertures, wherein the apertures are sized to prevent exchange of the first and second streams between conduits while allowing lossless transfer of droplets from the first conduit to the second conduit through contact between the first and second streams.

  13. Web-Based Instruction and Learning: Analysis and Needs Assessment

    NASA Technical Reports Server (NTRS)

    Grabowski, Barbara; McCarthy, Marianne; Koszalka, Tiffany

    1998-01-01

    An analysis and needs assessment was conducted to identify kindergarten through grade 14 (K-14) customer needs with regard to using the World Wide Web (WWW) for instruction and to identify obstacles K-14 teachers face in utilizing NASA Learning Technologies products in the classroom. The needs assessment was conducted as part of the Dryden Learning Technologies Project, which is a collaboration between Dryden Flight Research Center (DFRC), Edwards, California, and The Pennsylvania State University (PSU), University Park, Pennsylvania. The overall project is a multiyear effort to conduct research in the development of teacher training and tools for Web-based science, mathematics and technology instruction and learning.

  14. An adhered-particle analysis system based on concave points

    NASA Astrophysics Data System (ADS)

    Wang, Wencheng; Guan, Fengnian; Feng, Lin

    2018-04-01

    Particles that adhere together affect image analysis in computer vision systems. In this paper, a method based on concave points is designed. First, a corner detection algorithm is adopted to obtain a rough estimate of potential concave points after image segmentation. Then, the area ratio of the candidates is computed to accurately localize the final separation points. Finally, the separation points of each particle and the neighboring pixels are used to estimate the original particles before adhesion and to provide estimated profile images. The experimental results show that this approach provides good results that match the human visual cognitive mechanism.

  15. Virtual Estimator for Piecewise Linear Systems Based on Observability Analysis

    PubMed Central

    Morales-Morales, Cornelio; Adam-Medina, Manuel; Cervantes, Ilse; Vela-Valdés and, Luis G.; García Beltrán, Carlos Daniel

    2013-01-01

    This article proposes a virtual sensor for piecewise linear systems based on observability analysis that is a function of a commutation law related to the system's output. This virtual sensor is also known as a state estimator. In addition, it presents an active-mode detector for cases where the commutation sequences of each linear subsystem are arbitrary and unknown. To this end, the article proposes a set of virtual estimators that discern the commutation paths of the system and allow their outputs to be estimated. A methodology to test the observability of discrete-time piecewise linear systems is also proposed. An academic example is presented to show the obtained results. PMID:23447007

  16. Principal Component Analysis Based Measure of Structural Holes

    NASA Astrophysics Data System (ADS)

    Deng, Shiguo; Zhang, Wenqing; Yang, Huijie

    2013-02-01

    Based upon principal component analysis, a new measure called the compressibility coefficient is proposed to evaluate structural holes in networks. This measure incorporates a new effect from identical patterns in networks. It is found that the compressibility coefficient for Watts-Strogatz small-world networks increases monotonically with the rewiring probability and saturates to that of the corresponding shuffled networks, whereas the compressibility coefficient for extended Barabasi-Albert scale-free networks decreases monotonically with the preferential effect and is significantly larger than that of the corresponding shuffled networks. This measure is helpful in diverse research fields for evaluating the global efficiency of networks.

  17. EEG analysis using wavelet-based information tools.

    PubMed

    Rosso, O A; Martin, M T; Figliola, A; Keller, K; Plastino, A

    2006-06-15

    Wavelet-based informational tools for quantitative electroencephalogram (EEG) record analysis are reviewed. Relative wavelet energies, wavelet entropies and wavelet statistical complexities are used in the characterization of scalp EEG records corresponding to secondary generalized tonic-clonic epileptic seizures. In particular, we show that the epileptic recruitment rhythm observed during seizure development is well described in terms of the relative wavelet energies. In addition, during the concomitant time-period the entropy diminishes while complexity grows. This is construed as evidence supporting the conjecture that an epileptic focus, for this kind of seizures, triggers a self-organized brain state characterized by both order and maximal complexity.
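
    The sketch below is a hedged illustration of the wavelet quantifiers described in the abstract: relative wavelet energies per decomposition level and the associated Shannon-type wavelet entropy, computed here with PyWavelets on a toy signal. The wavelet family, decomposition level, and signal are my own choices, not those of the paper.

```python
# Hedged sketch: relative wavelet energies and wavelet entropy of a toy signal.
import numpy as np
import pywt

fs = 256.0
t = np.arange(0, 4, 1 / fs)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)

coeffs = pywt.wavedec(signal, "db4", level=5)            # multiresolution decomposition
energies = np.array([np.sum(c ** 2) for c in coeffs])    # energy per level
p = energies / energies.sum()                            # relative wavelet energies

wavelet_entropy = -np.sum(p * np.log(p))                 # Shannon entropy of the energies
print("relative energies:", np.round(p, 3))
print("wavelet entropy:", round(float(wavelet_entropy), 3))
```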

  18. Content and user-based music visual analysis

    NASA Astrophysics Data System (ADS)

    Guo, Xiaochun; Tang, Lei

    2015-12-01

    In recent years, people's ability to collect music has been greatly enhanced. Many people who prefer listening to music offline store thousands of songs on local storage or portable devices. However, their ability to manage music information has not improved accordingly, which results in two problems. One is how to find favourite songs in a large music dataset in a way that satisfies different individuals. The other is how to compose a playlist quickly. To solve these problems, the authors propose a content- and user-based music visual analysis approach. We first developed a new recommendation algorithm based on the content of music and the user's behaviour, which satisfies individual preferences. Then, we make use of visualization and interaction tools to illustrate the relationships between songs and help people compose a suitable playlist. At the end of this paper, a survey is presented to show that our system is usable and effective.

  19. Structural analysis of nanocomposites based on HDPE/EPDM blends.

    PubMed

    Zitzumbo, Roberto; Alonso, Sergio; Avalos, Felipe; Ortiz, José C; López-Manchado, Miguel A; Arroyo, Miguel

    2006-02-01

    Intercalated and exfoliated nanocomposites based on HDPE and EPDM blends with an organoclay have been obtained through the addition of EPDM-g-MA as a compatibilizer. The combined effect of the clay and EPDM-g-MA on the rheological behaviour is very noticeable, with a marked increase in viscosity that suggests the formation of a percolated structural network induced by the presence of intercalated and exfoliated silicate layers. As deduced from rheological studies, a morphology based on nanostructured micro-domains dispersed in a continuous HDPE phase is proposed for EPDM/HDPE blend nanocomposites. XRD and SEM analysis suggest that two different transport phenomena take place simultaneously during the intercalation process in the melt: one due to diffusion of HDPE chains into the tactoids and the other due to diffusion of EPDM-g-MA into the silicate galleries.

  20. Harmonic analysis of electrified railway based on improved HHT

    NASA Astrophysics Data System (ADS)

    Wang, Feng

    2018-04-01

    In this paper, the causes and harmful effects of harmonics in electric locomotive electrical systems are first studied and analyzed. Based on the characteristics of the harmonics in the electrical system, the Hilbert-Huang transform (HHT) method is introduced. Building on an in-depth analysis of the empirical mode decomposition method and the Hilbert transform, the causes of and solutions to the endpoint effect and the modal aliasing problem in the HHT method are explored. To address the endpoint effect, this paper uses a point-symmetric extension method to extend the collected data; to address the modal aliasing problem, it preprocesses the signal with a high-frequency auxiliary harmonic and gives an empirical formula for this auxiliary harmonic. Finally, combining the suppression of the HHT endpoint effect and the modal aliasing problem, an improved HHT method is proposed and simulated in MATLAB. The simulation results show that the improved HHT is effective for the electric locomotive power supply system.
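
    As a hedged illustration of the Hilbert step of HHT (the EMD stage is skipped here by assuming the input is already a single intrinsic mode function), the sketch below computes the analytic signal and instantaneous frequency with SciPy, the kind of quantity used to track individual harmonics. The sampling rate and harmonic frequency are invented.

```python
# Hedged sketch: instantaneous amplitude and frequency of one IMF via the Hilbert transform.
import numpy as np
from scipy.signal import hilbert

fs = 5000.0                                   # sampling rate, Hz
t = np.arange(0, 0.2, 1 / fs)
imf = np.sin(2 * np.pi * 250 * t)             # a 5th-harmonic component of a 50 Hz system

analytic = hilbert(imf)
amplitude = np.abs(analytic)                  # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi) # instantaneous frequency, Hz

print(f"mean instantaneous frequency: {inst_freq[100:-100].mean():.1f} Hz")
```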

  1. Recurrence quantity analysis based on singular value decomposition

    NASA Astrophysics Data System (ADS)

    Bian, Songhan; Shang, Pengjian

    2017-05-01

    The recurrence plot (RP) has become a powerful tool in many different sciences over the last three decades. To quantify the complexity and structure of an RP, recurrence quantification analysis (RQA) has been developed based on measures of recurrence density, diagonal lines, vertical lines, and horizontal lines. This paper studies the RP based on singular value decomposition, which offers a new perspective on RP analysis. The principal singular value proportion (PSVP) is proposed as a new RQA measure: a larger PSVP indicates higher complexity of a system, whereas a smaller PSVP reflects a regular and stable system. Considering the advantage of this method in detecting the complexity and periodicity of systems, several simulation and real-data experiments are chosen to examine the performance of this new RQA measure.
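
    A sketch under stated assumptions (the exact definition in the paper may differ): build a simple thresholded recurrence plot from a scalar series and compute a "principal singular value proportion" as the largest singular value over the sum of all singular values. The series, threshold, and comparison cases are invented for illustration.

```python
# Hedged sketch: recurrence plot + SVD-based principal singular value proportion.
import numpy as np

def recurrence_matrix(x, eps):
    d = np.abs(x[:, None] - x[None, :])    # pairwise distances of a scalar series
    return (d < eps).astype(float)

rng = np.random.default_rng(0)
t = np.linspace(0, 20 * np.pi, 400)
periodic = np.sin(t)                       # regular system
noise = rng.normal(size=t.size)            # irregular system

for name, series in [("periodic", periodic), ("noise", noise)]:
    R = recurrence_matrix(series, eps=0.2)
    s = np.linalg.svd(R, compute_uv=False)
    psvp = s[0] / s.sum()
    print(f"{name}: PSVP = {psvp:.3f}")
```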

  2. Live face detection based on the analysis of Fourier spectra

    NASA Astrophysics Data System (ADS)

    Li, Jiangwei; Wang, Yunhong; Tan, Tieniu; Jain, Anil K.

    2004-08-01

    Biometrics is a rapidly developing technology that identifies a person based on his or her physiological or behavioral characteristics. To ensure correct authentication, a biometric system must be able to detect and reject the use of a copy of a biometric instead of the live biometric. This function is usually termed "liveness detection". This paper describes a new method for live face detection. Using structure and movement information of the live face, an effective live face detection algorithm is presented. Compared to existing approaches, which concentrate on the measurement of 3D depth information, this method is based on the analysis of Fourier spectra of a single face image or of face image sequences. Experimental results show that the proposed method has an encouraging performance.

  3. CO Component Estimation Based on the Independent Component Analysis

    NASA Astrophysics Data System (ADS)

    Ichiki, Kiyotomo; Kaji, Ryohei; Yamamoto, Hiroaki; Takeuchi, Tsutomu T.; Fukui, Yasuo

    2014-01-01

    Fast Independent Component Analysis (FastICA) is a component separation algorithm based on the levels of non-Gaussianity. Here we apply FastICA to the component separation problem of the microwave background, including carbon monoxide (CO) line emissions that are found to contaminate the PLANCK High Frequency Instrument (HFI) data. Specifically, we prepare 100 GHz, 143 GHz, and 217 GHz mock microwave sky maps, which include galactic thermal dust, NANTEN CO line, and the cosmic microwave background (CMB) emissions, and then estimate the independent components based on the kurtosis. We find that FastICA can successfully estimate the CO component as the first independent component in our deflection algorithm because its distribution has the largest degree of non-Gaussianity among the components. Thus, FastICA can be a promising technique to extract CO-like components without prior assumptions about their distributions and frequency dependences.
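
    The following is an illustrative sketch (not the authors' pipeline): mock multi-frequency "sky maps" are separated into independent components with FastICA and the outputs are ranked by kurtosis, since the abstract identifies the CO-like component as the most non-Gaussian one. The component models, mixing matrix, and map size are invented.

```python
# Hedged sketch: FastICA separation of mock maps, ranked by non-Gaussianity (kurtosis).
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
npix = 20000
cmb = rng.normal(size=npix)                        # near-Gaussian background
dust = rng.gamma(2.0, 1.0, size=npix)              # skewed foreground
co = np.where(rng.random(npix) < 0.02, 5.0, 0.0)   # sparse, strongly non-Gaussian line emission

# Three "frequency channels" as different linear mixtures of the components
mixing = np.array([[1.0, 0.8, 1.2],
                   [1.0, 1.1, 0.1],
                   [1.0, 1.5, 0.4]])
maps = np.c_[cmb, dust, co] @ mixing.T

ica = FastICA(n_components=3, random_state=0)
components = ica.fit_transform(maps)

ranked = np.argsort(-np.abs(kurtosis(components, axis=0)))
print("components ordered by |kurtosis| (most non-Gaussian first):", ranked)
```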

  4. CO component estimation based on the independent component analysis

    SciTech Connect

    Ichiki, Kiyotomo; Kaji, Ryohei; Yamamoto, Hiroaki

    2014-01-01

    Fast Independent Component Analysis (FastICA) is a component separation algorithm based on the levels of non-Gaussianity. Here we apply FastICA to the component separation problem of the microwave background, including carbon monoxide (CO) line emissions that are found to contaminate the PLANCK High Frequency Instrument (HFI) data. Specifically, we prepare 100 GHz, 143 GHz, and 217 GHz mock microwave sky maps, which include galactic thermal dust, NANTEN CO line, and the cosmic microwave background (CMB) emissions, and then estimate the independent components based on the kurtosis. We find that FastICA can successfully estimate the CO component as the first independent component in our deflection algorithm because its distribution has the largest degree of non-Gaussianity among the components. Thus, FastICA can be a promising technique to extract CO-like components without prior assumptions about their distributions and frequency dependences.

  5. Position Accuracy Analysis of a Robust Vision-Based Navigation

    NASA Astrophysics Data System (ADS)

    Gaglione, S.; Del Pizzo, S.; Troisi, S.; Angrisano, A.

    2018-05-01

    Using images to determine camera position and attitude is a well-established method, widespread in applications such as UAV navigation. In harsh environments, where GNSS could be degraded or denied, image-based positioning could be a candidate for an integrated or alternative system. In this paper, such a method is investigated using a system based on a single camera and 3D maps. A robust estimation method is proposed in order to limit the effect of blunders or noisy measurements on the position solution. The proposed approach is tested using images collected in an urban canyon, where GNSS positioning is very inaccurate. A prior photogrammetric survey was performed to build the 3D model of the test area. A position accuracy analysis is performed and the effect of the proposed robust method is validated.

  6. Geospatial analysis based on GIS integrated with LADAR.

    PubMed

    Fetterman, Matt R; Freking, Robert; Fernandez-Cull, Christy; Hinkle, Christopher W; Myne, Anu; Relyea, Steven; Winslow, Jim

    2013-10-07

    In this work, we describe multi-layered analyses of a high-resolution broad-area LADAR data set in support of expeditionary activities. High-level features are extracted from the LADAR data, such as the presence and location of buildings and cars, and then these features are used to populate a GIS (geographic information system) tool. We also apply line-of-sight (LOS) analysis to develop a path-planning module. Finally, visualization is addressed and enhanced with a gesture-based control system that allows the user to navigate through the enhanced data set in a virtual immersive experience. This work has operational applications including military, security, disaster relief, and task-based robotic path planning.

  7. Independent Component Analysis applied to Ground-based observations

    NASA Astrophysics Data System (ADS)

    Martins-Filho, Walter; Griffith, Caitlin; Pearson, Kyle; Waldmann, Ingo; Alvarez-Candal, Alvaro; Zellem, Robert Thomas

    2018-01-01

    Transit measurements of Jovian-sized exoplanetary atmospheres allow one to study the composition of exoplanets, largely independently of the planet’s temperature profile. However, measurements of hot-Jupiter transits must achieve a high level of accuracy in the flux to determine the spectral modulation of the exoplanetary atmosphere. To accomplish this level of precision, we need to separate systematic errors, and, for ground-based measurements, the effects of Earth’s atmosphere, from the signal due to the exoplanet, which is several orders of magnitude smaller. The effects of the terrestrial atmosphere and some of the time-dependent systematic errors of ground-based transit measurements are treated mainly by dividing the host star by a reference star at each wavelength and time step of the transit. Recently, Independent Component Analysis (ICA) has been used to remove systematic effects from the raw data of space-based observations (Waldmann, 2014, 2012; Morello et al., 2016, 2015). ICA is a statistical method born from the ideas of blind-source separation studies, which can be used to de-trend several independent source signals of a data set (Hyvarinen and Oja, 2000). This technique requires no additional prior knowledge of the data set. In addition, it has the advantage of requiring no reference star. Here we apply ICA to ground-based photometry of the exoplanet XO-2b recorded by the 61” Kuiper Telescope and compare the results of the ICA to those of a previous analysis from Zellem et al. (2015), which does not use ICA. We also simulate the effects of various conditions (concerning the systematic errors, noise, and the stability of the object on the detector) to determine the conditions under which ICA can be used with high precision to extract the light curve of exoplanetary photometry measurements.

  8. Poisson-event-based analysis of cell proliferation.

    PubMed

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division, rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified temporal and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
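
    A sketch under stated assumptions (not the paper's code): mitotic events are simulated as a nonhomogeneous Poisson process with an exponentially increasing rate using the thinning method, and the interevent times are then summarized in the same spirit as the abstract's analysis. The rate parameters and observation window are hypothetical.

```python
# Hedged sketch: simulate a nonhomogeneous Poisson event series and summarize interevent times.
import numpy as np

rng = np.random.default_rng(0)
T = 48.0                       # observation window, hours
lam0, growth = 0.5, 0.03       # hypothetical rate parameters: lambda(t) = lam0 * exp(growth * t)
lam_max = lam0 * np.exp(growth * T)

# Thinning (Lewis-Shedler): draw candidates at the max rate, keep with prob lambda(t)/lam_max
t, events = 0.0, []
while True:
    t += rng.exponential(1.0 / lam_max)
    if t > T:
        break
    if rng.random() < lam0 * np.exp(growth * t) / lam_max:
        events.append(t)

inter = np.diff(events)
print(f"{len(events)} events, mean interevent time {inter.mean():.2f} h")
```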

  9. Mobile applications for weight management: theory-based content analysis.

    PubMed

    Azar, Kristen M J; Lesser, Lenard I; Laing, Brian Y; Stephens, Janna; Aurora, Magi S; Burke, Lora E; Palaniappan, Latha P

    2013-11-01

    The use of smartphone applications (apps) to assist with weight management is increasingly prevalent, but the quality of these apps is not well characterized. The goal of the study was to evaluate diet/nutrition and anthropometric tracking apps based on incorporation of features consistent with theories of behavior change. A comparative, descriptive assessment was conducted of the top-rated free apps in the Health and Fitness category available in the iTunes App Store. Health and Fitness apps (N=200) were evaluated using predetermined inclusion/exclusion criteria and categorized based on commonality in functionality, features, and developer description. Four researchers then evaluated the two most popular apps in each category using two instruments: one based on traditional behavioral theory (score range: 0-100) and the other on the Fogg Behavioral Model (score range: 0-6). Data collection and analysis occurred in November 2012. Eligible apps (n=23) were divided into five categories: (1) diet tracking; (2) healthy cooking; (3) weight/anthropometric tracking; (4) grocery decision making; and (5) restaurant decision making. The mean behavioral theory score was 8.1 (SD=4.2); the mean persuasive technology score was 1.9 (SD=1.7). The top-rated app on both scales was Lose It! by Fitnow Inc. All apps received low overall scores for inclusion of behavioral theory-based strategies. © 2013 American Journal of Preventive Medicine.

  10. Comparative Analysis of a Principal Component Analysis-Based and an Artificial Neural Network-Based Method for Baseline Removal.

    PubMed

    Carvajal, Roberto C; Arias, Luis E; Garces, Hugo O; Sbarbaro, Daniel G

    2016-04-01

    This work presents a non-parametric method based on a principal component analysis (PCA) and a parametric one based on artificial neural networks (ANN) to remove continuous baseline features from spectra. The non-parametric method estimates the baseline based on a set of sampled basis vectors obtained from PCA applied over a previously composed continuous spectra learning matrix. The parametric method, however, uses an ANN to filter out the baseline. Previous studies have demonstrated that this method is one of the most effective for baseline removal. The evaluation of both methods was carried out by using a synthetic database designed for benchmarking baseline removal algorithms, containing 100 synthetic composed spectra at different signal-to-baseline ratio (SBR), signal-to-noise ratio (SNR), and baseline slopes. In addition, to demonstrate the utility of the proposed methods and to compare them in a real application, a spectral data set measured from a flame radiation process was used. Several performance metrics such as correlation coefficient, chi-square value, and goodness-of-fit coefficient were calculated to quantify and compare both algorithms. Results demonstrate that the PCA-based method outperforms the one based on ANN both in terms of performance and simplicity. © The Author(s) 2016.
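
    The sketch below is a hedged illustration of the non-parametric idea described in the abstract: a small basis of smooth baselines is learned with PCA from a synthetic learning matrix, and the baseline of a new spectrum is estimated by least-squares projection onto that basis. The learning matrix, peak shapes, and projection step are assumptions of this example (a real implementation would typically down-weight peak regions), not the authors' exact procedure.

```python
# Hedged sketch: PCA-learned baseline basis + least-squares baseline estimation.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 500)

# Learning matrix of synthetic smooth baselines (varying offsets, slopes, curvatures)
baselines = np.array([a + b * x + c * x ** 2
                      for a, b, c in rng.uniform(0, 2, size=(200, 3))])
pca = PCA(n_components=3).fit(baselines)
basis = np.vstack([np.ones_like(x), pca.components_])      # include a constant term

# New spectrum = baseline + narrow peak + noise
true_baseline = 0.5 + 1.2 * x + 0.8 * x ** 2
peak = 1.5 * np.exp(-0.5 * ((x - 0.4) / 0.01) ** 2)
spectrum = true_baseline + peak + rng.normal(0, 0.02, x.size)

coeffs, *_ = np.linalg.lstsq(basis.T, spectrum, rcond=None)
estimated_baseline = basis.T @ coeffs
corrected = spectrum - estimated_baseline
print(f"baseline RMSE: {np.sqrt(np.mean((estimated_baseline - true_baseline) ** 2)):.3f}")
```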

  11. Bayesian analysis of physiologically based toxicokinetic and toxicodynamic models.

    PubMed

    Hack, C Eric

    2006-04-17

    Physiologically based toxicokinetic (PBTK) and toxicodynamic (TD) models of bromate in animals and humans would improve our ability to accurately estimate the toxic doses in humans based on available animal studies. These mathematical models are often highly parameterized and must be calibrated in order for the model predictions of internal dose to adequately fit the experimentally measured doses. Highly parameterized models are difficult to calibrate and it is difficult to obtain accurate estimates of uncertainty or variability in model parameters with commonly used frequentist calibration methods, such as maximum likelihood estimation (MLE) or least squared error approaches. The Bayesian approach called Markov chain Monte Carlo (MCMC) analysis can be used to successfully calibrate these complex models. Prior knowledge about the biological system and associated model parameters is easily incorporated in this approach in the form of prior parameter distributions, and the distributions are refined or updated using experimental data to generate posterior distributions of parameter estimates. The goal of this paper is to give the non-mathematician a brief description of the Bayesian approach and Markov chain Monte Carlo analysis, how this technique is used in risk assessment, and the issues associated with this approach.
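
    As a hedged, minimal illustration of the MCMC calibration idea described in the abstract (not the paper's PBTK model), the sketch below uses a random-walk Metropolis sampler to calibrate a single elimination rate of a toy one-compartment kinetic model against noisy concentration data, combining a prior with the likelihood to obtain a posterior distribution. All data and parameter values are invented.

```python
# Hedged sketch: random-walk Metropolis calibration of a one-parameter kinetic model.
import numpy as np

rng = np.random.default_rng(0)
times = np.array([0.5, 1, 2, 4, 8])
true_k = 0.3
data = np.exp(-true_k * times) + rng.normal(0, 0.02, times.size)  # "observed" concentrations

def log_post(k):
    if k <= 0:
        return -np.inf
    pred = np.exp(-k * times)
    log_lik = -0.5 * np.sum(((data - pred) / 0.02) ** 2)          # Gaussian likelihood
    log_prior = -0.5 * ((np.log(k) - np.log(0.5)) / 1.0) ** 2     # lognormal prior on k
    return log_lik + log_prior

k, chain = 0.5, []
for _ in range(20000):
    proposal = k + rng.normal(0, 0.05)                            # random-walk proposal
    if np.log(rng.random()) < log_post(proposal) - log_post(k):
        k = proposal                                              # accept
    chain.append(k)

posterior = np.array(chain[5000:])                                # discard burn-in
print(f"posterior mean k = {posterior.mean():.3f} ± {posterior.std():.3f}")
```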

  12. Streaming Support for Data Intensive Cloud-Based Sequence Analysis

    PubMed Central

    Issa, Shadi A.; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J.; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed

    2013-01-01

    Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation. PMID:23710461

  13. Research on cloud-based remote measurement and analysis system

    NASA Astrophysics Data System (ADS)

    Gao, Zhiqiang; He, Lingsong; Su, Wei; Wang, Can; Zhang, Changfan

    2015-02-01

    The promising potential of cloud computing and its convergence with technologies such as cloud storage, cloud push, and mobile computing allows for the creation and delivery of new types of cloud services. Building on the idea of cloud computing, this paper presents a cloud-based remote measurement and analysis system. The system mainly consists of three parts: a signal acquisition client, a web server deployed on the cloud service, and a remote client. The system is a dedicated website developed using asp.net and Flex RIA technology, which resolves the trade-off between the two monitoring modes, B/S and C/S. The platform, deployed on the cloud server, provides condition monitoring and data analysis services over the Internet. The signal acquisition device is responsible for data collection (sensor data, audio, video, etc.) and regularly pushes the monitoring data to the cloud storage database. Data acquisition equipment in this system only needs data collection and network functions, such as a smartphone or smart sensor. The system's scale adjusts dynamically according to the number of applications and users, so resources are not wasted. As a representative case study, we developed a prototype system based on the Ali cloud service using a rotor test rig as the research object. Experimental results demonstrate that the proposed system architecture is feasible.

  14. An Efficient Soft Set-Based Approach for Conflict Analysis

    PubMed Central

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, and so on. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in conflict data sets, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handle conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach achieves up to 3.9% lower computational time compared with rough set theory. PMID:26928627

  15. Streaming support for data intensive cloud-based sequence analysis.

    PubMed

    Issa, Shadi A; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed

    2013-01-01

    Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of "resources-on-demand" and "pay-as-you-go", scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.

  16. Image-Based 3D Reconstruction and Analysis for Orthodontia

    NASA Astrophysics Data System (ADS)

    Knyaz, V. A.

    2012-08-01

    Among the main tasks of orthodontia are the analysis of dental arches and treatment planning to provide the correct position for every tooth. The treatment plan is based on measuring tooth parameters and designing the ideal arch curve that the teeth are to form after treatment. The most common technique for moving teeth uses standard brackets placed on the teeth and a wire of given shape clamped by these brackets to produce the forces needed to move each tooth in a given direction. The disadvantages of the standard bracket technique are the low accuracy of tooth dimension measurements and the difficulty of applying a standard approach to the wide variety of complex orthodontic cases. An image-based technique for orthodontic planning, treatment, and documentation aimed at overcoming these disadvantages is proposed. The proposed approach provides accurate measurement of the tooth parameters needed for adequate planning, design of correct tooth positions, and monitoring of the treatment process. The developed technique applies photogrammetric methods to dental arch 3D model generation, bracket position determination, and tooth movement analysis.

  17. Principle-based concept analysis: intentionality in holistic nursing theories.

    PubMed

    Aghebati, Nahid; Mohammadi, Eesa; Ahmadi, Fazlollah; Noaparast, Khosrow Bagheri

    2015-03-01

    This is a report of a principle-based concept analysis of intentionality in holistic nursing theories. A principle-based concept analysis method was used to analyze seven holistic theories. The data included eight books and 31 articles (1998-2011), which were retrieved through MEDLINE and CINAHL. Erickson, Kriger, Parse, Watson, and Zahourek define intentionality as a capacity, a focused consciousness, and a pattern of human being. Rogers and Newman do not explicitly mention intentionality; however, they do explain pattern and consciousness (epistemology). Intentionality has been operationalized as a core concept of nurse-client relationships (pragmatic). The theories are consistent on intentionality as a noun and as an attribute of the person-intentionality is different from intent and intention (linguistic). There is ambiguity concerning the boundaries between intentionality and consciousness (logic). Theoretically, intentionality is an evolutionary capacity to integrate human awareness and experience. Because intentionality is an individualized concept, we introduced it as "a matrix of continuous known changes" that emerges in two forms: as a capacity of human being and as a capacity of transpersonal caring. This study has produced a theoretical definition of intentionality and provides a foundation for future research to further investigate intentionality to better delineate its boundaries. © The Author(s) 2014.

  18. Analysis of Fat Intake Based on the US Department of ...

    EPA Pesticide Factsheets

    EPA released the final report, Analysis of Fat Intake Based on USDA’s 1994-1996, 1998 Continuing Survey of Food Intakes by Individuals (CSFII, Final Report). For this report, the EPA conducted an analysis of fat consumption across the U.S. population based on data derived from the U.S. Department of Agriculture's 1994-96, 1998 Continuing Survey of Food Intakes by Individuals (CSFII) and EPA's Food Commodity Intake Database (FCID). Percentiles of fat consumption were calculated on the basis of total mass and on a per-unit body-weight basis for 12 food categories and 98 demographic cohorts. In addition, dietary breakdown and fat intake percentiles were calculated for a subset of the sample population whose consumption of animal fats exceeded the 90th percentile within its age group. Many chemicals found in the environment tend to accumulate in fatty tissue. Assessing risks from these chemicals requires knowledge of dietary habits and the amount of fat present in various types of foods.

  19. Postpartum sexual health: a principle-based concept analysis.

    PubMed

    O'Malley, Deirdre; Higgins, Agnes; Smith, Valerie

    2015-10-01

    The aim of this study is to report an analysis of the concept of postpartum sexual health. Postpartum sexual health is a minimally understood concept, most often framed within physical/biological dimensions or as a 'checklist' task in postpartum information provision. This has the potential to leave women unprepared to manage transient or normative sexual health changes after childbirth. For meaningful discussions, clarity and understanding of postpartum sexual health is required. A principle-based method of concept analysis. The databases of PubMed, CINAHL, Maternity and Infant Care, PsychInfo, Web of Science, EMBASE, SCOPUS and Social Science Index were systematically searched, from their earliest dates, using a combination of key terms, including; 'sexual health', 'sexual function', 'dyspareunia', 'sexuality', 'sexual desire', 'sexual dysfunction', 'postnatal' and 'postpartum', resulting in a final included dataset of 91 studies. Using the principle-based approach, postpartum sexual health was analysed under the four philosophical principles of epistemological, pragmatic, linguistic and logical. Philosophically, postpartum sexual health is underdeveloped as a concept. A precise theoretical definition remains elusive and, presently, postpartum sexual health cannot be separated theoretically from sexuality and sexual function. Identified antecedents include an instrument free birth, an intact perineum and avoidance of episiotomy. Attributes include sexual arousal, desire, orgasm, sexual satisfaction and resumption of sexual intercourse. Outcomes are sexual satisfaction and a satisfying intimate relationship with one's partner. Postpartum sexual health is conceptually immature with limited applicability in current midwifery practice. © 2015 John Wiley & Sons Ltd.

  20. Inquiry-Based Approach to a Carbohydrate Analysis Experiment

    NASA Astrophysics Data System (ADS)

    Senkbeil, Edward G.

    1999-01-01

    The analysis of an unknown carbohydrate in an inquiry-based learning format has proven to be a valuable and interesting undergraduate biochemistry laboratory experiment. Students are given a list of carbohydrates and a list of references for carbohydrate analysis. The references contain a variety of well-characterized wet chemistry and instrumental techniques for carbohydrate identification, but the students must develop an appropriate sequential protocol for unknown identification. The students are required to provide a list of chemicals and procedures and a flow chart for identification before the lab. During the 3-hour laboratory period, they utilize their accumulated information and knowledge to classify and identify their unknown. Advantages of the inquiry-based format are (i) students must be well prepared in advance to be successful in the laboratory, (ii) students feel a sense of accomplishment in both designing and carrying out a successful experiment, and (iii) the carbohydrate background information digested by the students significantly decreases the amount of lecture time required for this topic.

  1. Analysis of Android Device-Based Solutions for Fall Detection

    PubMed Central

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-01-01

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in the case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions. PMID:26213928

  2. An Efficient Soft Set-Based Approach for Conflict Analysis.

    PubMed

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in conflict data sets, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach achieves a computational time up to 3.9% lower than that of rough set theory.
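
    The co-occurrence idea can be illustrated with a small sketch. The following Python fragment is purely illustrative (the voting data and parameter names are invented, and this is not the authors' implementation): each parameter of a soft set maps to the set of agents supporting it, and pairwise co-occurrence counts indicate how strongly two parameters are backed by the same agents.

    ```python
    # Illustrative sketch of parameter co-occurrence in a soft set
    # (hypothetical voting data, not the paper's dataset or code).
    from itertools import combinations

    # Soft set: parameter (e.g. an issue voted "in favour") -> supporting agents
    soft_set = {
        "issue_A": {"agent1", "agent2", "agent3"},
        "issue_B": {"agent2", "agent3", "agent5"},
        "issue_C": {"agent4"},
    }

    def cooccurrence(soft_set):
        """Count, for every pair of parameters, the agents supporting both."""
        counts = {}
        for (p, objs_p), (q, objs_q) in combinations(soft_set.items(), 2):
            counts[(p, q)] = len(objs_p & objs_q)
        return counts

    if __name__ == "__main__":
        for pair, n in cooccurrence(soft_set).items():
            print(pair, n)
    ```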

  3. Analysis of Android Device-Based Solutions for Fall Detection.

    PubMed

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-07-23

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in the case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  4. Nondeducibility-Based Analysis of Cyber-Physical Systems

    NASA Astrophysics Data System (ADS)

    Gamage, Thoshitha; McMillin, Bruce

    Controlling information flow in a cyber-physical system (CPS) is challenging because cyber domain decisions and actions manifest themselves as visible changes in the physical domain. This paper presents a nondeducibility-based observability analysis for CPSs. In many CPSs, the capacity of a low-level (LL) observer to deduce high-level (HL) actions ranges from limited to none. However, a collaborative set of observers strategically located in a network may be able to deduce all the HL actions. This paper models a distributed power electronics control device network using a simple DC circuit in order to understand the effect of multiple observers in a CPS. The analysis reveals that the number of observers required to deduce all the HL actions in a system increases linearly with the number of configurable units. A simple definition of nondeducibility based on the uniqueness of low-level projections is also presented. This definition is used to show that a system with two security domain levels could be considered “nondeducibility secure” if no unique LL projections exist.

  5. Heating Analysis in Constant-pressure Hydraulic System based on Energy Analysis

    NASA Astrophysics Data System (ADS)

    Wu, Chao; Xu, Cong; Mao, Xuyao; Li, Bin; Hu, Junhua; Liu, Yiou

    2017-12-01

    Hydraulic systems are widely used in industrial applications, but heating has become an important problem restricting the wider adoption of hydraulic technology. High temperatures seriously affect the operation of a hydraulic system and can even cause seizure and other serious failures. Based on an analysis of the heat damage in hydraulic systems, this paper identifies the causes of this problem, and an application example shows that energy analysis can accurately locate the main sources of heating in a hydraulic system, which provides strong practical guidance.

  6. Identification and human condition analysis based on the human voice analysis

    NASA Astrophysics Data System (ADS)

    Mieshkov, Oleksandr Yu.; Novikov, Oleksandr O.; Novikov, Vsevolod O.; Fainzilberg, Leonid S.; Kotyra, Andrzej; Smailova, Saule; Kozbekova, Ainur; Imanbek, Baglan

    2017-08-01

    The paper presents a two-stage biotechnical system for human condition analysis that is based on analysis of the human voice signal. At the initial stage, the voice signal is pre-processed and its characteristics in the time domain are determined. At the first stage, the developed system is capable of identifying the person in the database on the basis of the extracted characteristics. At the second stage, a model of the human voice is built from the real voice signals after clustering the whole database.

  7. Point-based and model-based geolocation analysis of airborne laser scanning data

    NASA Astrophysics Data System (ADS)

    Sefercik, Umut Gunes; Buyuksalih, Gurcan; Jacobsen, Karsten; Alkan, Mehmet

    2017-01-01

    Airborne laser scanning (ALS) is one of the most effective remote sensing technologies providing precise three-dimensional (3-D) dense point clouds. A large-size ALS digital surface model (DSM) covering the whole Istanbul province was analyzed by point-based and model-based comprehensive statistical approaches. Point-based analysis was performed using checkpoints on flat areas. Model-based approaches were implemented in two steps: strip-to-strip comparison of overlapping ALS DSMs individually in three subareas, and comparison of the merged ALS DSMs with terrestrial laser scanning (TLS) DSMs in four other subareas. In the model-based approach, the standard deviation of height and the normalized median absolute deviation were used as the accuracy indicators, combined with the dependency on terrain inclination. The results demonstrate that terrain roughness has a strong impact on the vertical accuracy of ALS DSMs. The relative horizontal shifts, determined and partially improved by merging the overlapping strips and by comparison of the ALS and TLS data, were found not to be negligible. The analysis of the ALS DSM in relation to the TLS DSM allowed us to determine the characteristics of the DSM in detail.

  8. A Web-Based Development Environment for Collaborative Data Analysis

    NASA Astrophysics Data System (ADS)

    Erdmann, M.; Fischer, R.; Glaser, C.; Klingebiel, D.; Komm, M.; Müller, G.; Rieger, M.; Steggemann, J.; Urban, M.; Winchen, T.

    2014-06-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually. We further report on the results of an application with more than 100 third-year students using VISPA for their regular particle physics exercises during the winter term 2012/13. Besides the ambition to support and simplify the development cycle of physics analyses, new use cases such as fast, location-independent status queries, the validation of results, and the ability to share analyses within worldwide collaborations with a single click become conceivable.

  9. A graph-based network-vulnerability analysis system

    SciTech Connect

    Swiler, L.P.; Phillips, C.; Gaylor, T.

    1998-05-03

    This paper presents a graph based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs or costs representing level of effort for the attacker, various graph algorithms such as shortest path algorithms can identify the attack paths with the highest probability of success.
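
    As a rough sketch of the path-finding step described above (toy graph and probabilities, not the authors' tool), the most probable attack path can be found with Dijkstra's algorithm after converting edge success probabilities into additive -log(p) weights:

    ```python
    # Illustrative only: hypothetical attack graph with per-step success
    # probabilities; the most likely end-to-end path minimizes sum of -log(p).
    import heapq
    import math

    attack_graph = {
        "outside":    [("web_server", 0.6), ("vpn", 0.2)],
        "web_server": [("db_server", 0.5)],
        "vpn":        [("db_server", 0.9)],
        "db_server":  [],
    }

    def most_probable_path(graph, start, goal):
        """Dijkstra on -log(p) weights; returns (path, overall probability)."""
        pq = [(0.0, start, [start])]          # (accumulated -log p, node, path)
        best = {start: 0.0}
        while pq:
            cost, node, path = heapq.heappop(pq)
            if node == goal:
                return path, math.exp(-cost)
            for nxt, p in graph[node]:
                new_cost = cost - math.log(p)
                if new_cost < best.get(nxt, float("inf")):
                    best[nxt] = new_cost
                    heapq.heappush(pq, (new_cost, nxt, path + [nxt]))
        return None, 0.0

    print(most_probable_path(attack_graph, "outside", "db_server"))
    ```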

  10. A graph-based network-vulnerability analysis system

    SciTech Connect

    Swiler, L.P.; Phillips, C.; Gaylor, T.

    1998-01-01

    This report presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.

  11. Projection-Based Reduced Order Modeling for Spacecraft Thermal Analysis

    NASA Technical Reports Server (NTRS)

    Qian, Jing; Wang, Yi; Song, Hongjun; Pant, Kapil; Peabody, Hume; Ku, Jentung; Butler, Charles D.

    2015-01-01

    This paper presents a mathematically rigorous, subspace projection-based reduced order modeling (ROM) methodology and an integrated framework to automatically generate reduced order models for spacecraft thermal analysis. Two key steps in the reduced order modeling procedure are described: (1) the acquisition of a full-scale spacecraft model in the ordinary differential equation (ODE) and differential algebraic equation (DAE) form to resolve its dynamic thermal behavior; and (2) the ROM to markedly reduce the dimension of the full-scale model. Specifically, proper orthogonal decomposition (POD) in conjunction with discrete empirical interpolation method (DEIM) and trajectory piece-wise linear (TPWL) methods are developed to address the strong nonlinear thermal effects due to coupled conductive and radiative heat transfer in the spacecraft environment. Case studies using NASA-relevant satellite models are undertaken to verify the capability and to assess the computational performance of the ROM technique in terms of speed-up and error relative to the full-scale model. ROM exhibits excellent agreement in spatiotemporal thermal profiles (<0.5% relative error in pertinent time scales) along with salient computational acceleration (up to two orders of magnitude speed-up) over the full-scale analysis. These findings establish the feasibility of ROM to perform rational and computationally affordable thermal analysis, develop reliable thermal control strategies for spacecraft, and greatly reduce the development cycle times and costs.
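
    The POD step of such a reduction can be sketched in a few lines. The snippet below is illustrative only, using synthetic snapshot data (DEIM and TPWL are not shown): snapshots of the full-order state are collected column-wise, the leading left singular vectors form the reduced basis, and full states are projected onto it.

    ```python
    # Sketch of the POD step with synthetic data; not the paper's spacecraft model.
    import numpy as np

    rng = np.random.default_rng(0)
    n_nodes, n_snapshots, r = 500, 60, 8   # full model size, snapshots, reduced order

    # Hypothetical snapshot matrix: each column is a temperature field at one time
    snapshots = rng.standard_normal((n_nodes, n_snapshots))

    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    basis = U[:, :r]                        # POD basis (n_nodes x r)

    # Galerkin-style projection of a full state onto the reduced coordinates
    T_full = snapshots[:, 0]
    T_reduced = basis.T @ T_full            # r coefficients
    T_approx = basis @ T_reduced            # reconstruction in the full space

    rel_err = np.linalg.norm(T_full - T_approx) / np.linalg.norm(T_full)
    print(f"reduced dimension: {r}, reconstruction error: {rel_err:.3f}")
    ```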

  12. Gis-Based Spatial Statistical Analysis of College Graduates Employment

    NASA Astrophysics Data System (ADS)

    Tang, R.

    2012-07-01

    It is urgently necessary to understand the distribution and employment status of college graduates for proper allocation of human resources and overall planning of strategic industries. This study provides empirical evidence regarding the use of geocoding and spatial analysis in the distribution and employment status of college graduates, based on data from the 2004-2008 Wuhan Municipal Human Resources and Social Security Bureau, China. The spatio-temporal distribution of employment units was analyzed with geocoding using ArcGIS software, and the stepwise multiple linear regression method via SPSS software was used to predict employment and to identify spatially associated enterprise and professional demand in the future. The results show that the enterprises in the Wuhan East Lake high and new technology development zone increased dramatically from 2004 to 2008 and tended to be distributed southeastward. Furthermore, the models built by statistical analysis suggest that the specialty graduates majored in has an important impact on the number employed and on the number of graduates engaging in pillar industries. In conclusion, the combination of GIS and statistical analysis, which helps to simulate the spatial distribution of employment status, is a potential tool for human resource development research.

  13. Cost Risk Analysis Based on Perception of the Engineering Process

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.

    1986-01-01

    In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This allows errors from at least three sources. The historical cost data may be in error by some unknown amount. The managers' evaluation of the new project and its similarity to the old project may be in error. The factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed. The engineering

  14. Perinatal Bereavement: A Principle-based Concept Analysis

    PubMed Central

    FENSTERMACHER, Kimberly; HUPCEY, Judith E.

    2013-01-01

    Aim This paper is a report of an analysis of the concept of perinatal bereavement. Background The concept of perinatal bereavement emerged in the scientific literature during the 1970s. Perinatal bereavement is a practice based concept, although it is not well defined in the scientific literature and is often intermingled with the concepts of mourning and grief. Design Concept Analysis. Data Sources Using the term ‘perinatal bereavement’ and limits of only English and human, PubMed and CINAHL were searched to yield 278 available references dating from 1974 – 2011. Articles specific to the experience of perinatal bereavement were reviewed. The final data set was 143 articles. Review Methods The methods of principle-based concept analysis were used. Results reveal conceptual components (antecedents, attributes and outcomes) which are delineated to create a theoretical definition of perinatal bereavement. Results The concept is epistemologically immature, with few explicit definitions to describe the phenomenon. Inconsistency in conceptual meaning threatens the construct validity of measurement tools for perinatal bereavement and contributes to incongruent theoretical definitions. This has implications for both nursing science (how the concept is studied and theoretically integrated) and clinical practice (timing and delivery of support interventions). Conclusions Perinatal bereavement is a multifaceted global phenomenon that follows perinatal loss. Lack of conceptual clarity and lack of a clearly articulated conceptual definition impede the synthesis and translation of research findings into practice. A theoretical definition of perinatal bereavement is offered as a platform for researchers to advance the concept through research and theory development. PMID:23458030

  15. [Evidence based medicine and cost-effectiveness analysis in ophthalmology].

    PubMed

    Nováková, D; Rozsíval, P

    2004-09-01

    The aims are to familiarize the reader with the term evidence-based medicine (EBM), to explain the principle of cost-effectiveness analysis (price-profit), and to show its usefulness in comparing the effectiveness of different medical procedures. Using a few examples, this article explains the relevance and calculation of important parameters of cost-effectiveness (CE) analysis, such as the utility value (UV) and quality-adjusted life years (QALY). In addition, the calculation of UV and QALY for cataract surgery, including its complications, is provided. According to this method, highly effective procedures include laser photocoagulation and cryocoagulation of the early stages of retinopathy of prematurity, treatment of amblyopia, cataract surgery of one or both eyes, and, among vitreoretinal procedures, early vitrectomy for hemophthalmos in proliferative diabetic retinopathy and grid laser photocoagulation for diabetic macular edema or for visual acuity loss due to branch retinal vein occlusion. On the other hand, procedures with low cost-effectiveness include treatment of central retinal artery occlusion with anterior chamber paracentesis or CO2 inhalation, and photodynamic therapy for choroidal neovascularization in age-related macular degeneration when the visual acuity of the better eye is 20/200. Cost-effectiveness analysis is a promising new method for evaluating the success of a medical procedure by comparing its final effect with its financial costs. In evaluating the effectiveness of individual procedures, three main aspects are considered: the subjective impact of the disease on the patient's life, the objective results of clinical examination, and the financial costs of the procedure. According to this method, cataract surgery and procedures in pediatric ophthalmology are among the most effective surgical methods.

  16. Sensitive albuminuria analysis using dye-binding based test strips.

    PubMed

    Delanghe, Joris R; Himpe, Jonas; De Cock, Naomi; Delanghe, Sigurd; De Herde, Kevin; Stove, Veronique; Speeckaert, Marijn M

    2017-08-01

    Populations at increased risk for chronic kidney disease should be screened for albuminuria. The possibilities of advanced urine strip readers based on complementary metal oxide semiconductor (CMOS) sensor technology were investigated for obtaining quantitative albuminuria results. Reflectance data of test strips (Sysmex UFC 3500 reader + CMOS) were compared with albuminuria (BNII) and with proteinuria (Cobas 8000). Urinary creatinine was assayed using a Jaffe-based creatinine assay (Cobas 8000). A calibration curve was established between 11.5 and 121.5 mg/L, with a detection limit of 5.5 mg/L. Within-run CV values of the reflectance data were 0.21% (UC-Control L; 10 mg/L) and 0.37% (UC-Control H; >150 mg/L) for albumin, and 0.71%/3.97% for creatinine. Between-run CV values were 0.24%/0.42% for albumin and 0.93%/5.13% for creatinine. A strong correlation (r=0.92) was obtained between albuminuria (BNII) and protein strip reflectance data. Creatinine reflectance data correlated well with Jaffe-based urinary creatinine data (r=0.90). The albumin:creatinine ratio obtained by test strip and by wet chemistry showed a good correlation (r=0.59). Carbamylated, glycated and partially hydrolyzed isoforms of albumin could be detected by the test strip. A dye-binding based albumin test strip assay in combination with a CMOS-based reader would potentially allow quantitative analysis of albuminuria and determination of the albumin:creatinine ratio. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. A molecular phylogeny of anseriformes based on mitochondrial DNA analysis.

    PubMed

    Donne-Goussé, Carole; Laudet, Vincent; Hänni, Catherine

    2002-06-01

    To study the phylogenetic relationships among Anseriformes, sequences for the complete mitochondrial control region (CR) were determined from 45 waterfowl representing 24 genera, i.e., half of the existing genera. To confirm the results based on CR analysis we also analyzed representative species based on two mitochondrial protein-coding genes, cytochrome b (cytb) and NADH dehydrogenase subunit 2 (ND2). These data allowed us to construct a robust phylogeny of the Anseriformes and to compare it with existing phylogenies based on morphological or molecular data. Chauna and Dendrocygna were identified as early offshoots of the Anseriformes. All the remaining taxa fell into two clades that correspond to the two subfamilies Anatinae and Anserinae. Within Anserinae Branta and Anser cluster together, whereas Coscoroba, Cygnus, and Cereopsis form a relatively weak clade with Cygnus diverging first. Five clades are clearly recognizable among Anatinae: (i) the Anatini with Anas and Lophonetta; (ii) the Aythyini with Aythya and Netta; (iii) the Cairinini with Cairina and Aix; (iv) the Mergini with Mergus, Bucephala, Melanitta, Callonetta, Somateria, and Clangula, and (v) the Tadornini with Tadorna, Chloephaga, and Alopochen. The Tadornini diverged early on from the Anatinae; then the Mergini and a large group that comprises the Anatini, Aythyini, Cairinini, and two isolated genera, Chenonetta and Marmaronetta, diverged. The phylogeny obtained with the control region appears more robust than the one obtained with mitochondrial protein-coding genes such as ND2 and cytb. This suggests that the CR is a powerful tool for bird phylogeny, not only at a small scale (i.e., relationships between species) but also at the family level. Whereas morphological analysis effectively resolved the split between Anatinae and Anserinae and the existence of some of the clades, the precise composition of the clades are different when morphological and molecular data are compared. (c) 2002 Elsevier

  18. Model-based Bayesian inference for ROC data analysis

    NASA Astrophysics Data System (ADS)

    Lei, Tianhu; Bae, K. Ty

    2013-03-01

    This paper presents a study of model-based Bayesian inference applied to Receiver Operating Characteristic (ROC) data. The model is a simple version of a general non-linear regression model. Unlike the Dorfman model, it uses a probit link function with a zero-one covariate to express the binormal distributions in a single formula. The model also includes a scale parameter. Bayesian inference is implemented by the Markov Chain Monte Carlo (MCMC) method carried out with Bayesian inference Using Gibbs Sampling (BUGS). In contrast to classical statistical theory, the Bayesian approach treats model parameters as random variables characterized by prior distributions. With a substantial number of simulated samples generated by the sampling algorithm, the posterior distributions of the parameters, and hence the parameters themselves, can be accurately estimated. MCMC-based BUGS adopts the Adaptive Rejection Sampling (ARS) protocol, which requires the probability density function (pdf) from which samples are drawn to be log-concave with respect to the targeted parameters. Our study corrects a common misconception and proves that the pdf of this regression model is log-concave with respect to its scale parameter. Therefore, ARS's requirement is satisfied, and a Gaussian prior, which is conjugate and possesses many analytic and computational advantages, is assigned to the scale parameter. A cohort of 20 simulated data sets and 20 simulations from each data set are used in our study. Output analysis and convergence diagnostics for the MCMC method are assessed with the CODA package. Models and methods using a continuous Gaussian prior and a discrete categorical prior are compared. Intensive simulations and performance measures are given to illustrate our practice in the framework of model-based Bayesian inference using the MCMC method.
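
    A much-simplified sketch of the modelling idea is given below, using synthetic data and a plain random-walk Metropolis sampler rather than the BUGS/ARS machinery described in the paper: a probit link with a zero-one covariate ties the two groups of an ROC experiment together in a single regression formula whose parameters are sampled from the posterior.

    ```python
    # Simplified illustration only: synthetic data, weak Gaussian priors,
    # random-walk Metropolis instead of Gibbs sampling with ARS.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(1)

    # Synthetic data: d = 0 (signal absent) or 1 (signal present); y = rated positive
    d = rng.integers(0, 2, size=400)
    true_a, true_b = -0.5, 1.2
    y = (rng.standard_normal(400) < true_a + true_b * d).astype(int)

    def log_post(theta):
        a, b = theta
        p = np.clip(norm.cdf(a + b * d), 1e-12, 1 - 1e-12)   # probit link
        loglik = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
        logprior = norm.logpdf(a, 0, 10) + norm.logpdf(b, 0, 10)
        return loglik + logprior

    theta, samples = np.zeros(2), []
    for _ in range(5000):                       # random-walk Metropolis
        prop = theta + 0.1 * rng.standard_normal(2)
        if np.log(rng.random()) < log_post(prop) - log_post(theta):
            theta = prop
        samples.append(theta.copy())

    a_hat, b_hat = np.mean(samples[1000:], axis=0)   # discard burn-in
    print(f"posterior means: a ~ {a_hat:.2f}, b ~ {b_hat:.2f}")
    ```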

  19. Feature-Based Statistical Analysis of Combustion Simulation Data

    SciTech Connect

    Bennett, J; Krishnamoorthy, V; Liu, S

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for

  20. Cloud-based data-proximate visualization and analysis

    NASA Astrophysics Data System (ADS)

    Fisher, Ward

    2017-04-01

    The rise in cloud computing, coupled with the growth of "Big Data", has led to a migration away from local scientific data storage. The increasing size of remote scientific data sets, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service. The solution to this problem is deceptively simple: move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This increases transfer speeds, while egress costs are lowered or eliminated entirely. The challenge then becomes creating tools that are cloud-ready. This is addressed by application streaming, a technology that allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, netbook, smartphone, or the next generation of hardware, whatever it may be. Unidata has harnessed application streaming to provide a cloud-capable version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and includes a brief discussion of the underlying technologies involved.

  1. Lunar base thermal management/power system analysis and design

    NASA Technical Reports Server (NTRS)

    Mcghee, Jerry R.

    1992-01-01

    A compilation of several lunar surface thermal management and power system studies completed under contract and IR&D is presented. The work includes analysis and preliminary design of all major components of an integrated thermal management system, including loads determination, active internal acquisition and transport equipment, external transport systems (active and passive), passive insulation, solar shielding, and a range of lunar surface radiator concepts. Several computer codes were utilized in support of this study, including RADSIM to calculate radiation exchange factors and view factors, RADIATOR (developed in-house) for heat rejection system sizing and performance analysis over a lunar day, SURPWER for power system sizing, and CRYSTORE for cryogenic system performance predictions. Although much of the work was performed in support of lunar rover studies, any or all of the results can be applied to a range of surface applications. Output data include thermal loads summaries, subsystem performance data, mass, and volume estimates (where applicable), integrated and worst-case lunar day radiator size/mass and effective sink temperatures for several concepts (shielded and unshielded), and external transport system performance estimates for both single and two-phase (heat pumped) transport loops. Several advanced radiator concepts are presented, along with brief assessments of possible system benefits and potential drawbacks. System point designs are presented for several cases, executed in support of the contract and IR&D studies, although the parametric nature of the analysis is stressed to illustrate applicability of the analysis procedure to a wide variety of lunar surface systems. The reference configuration(s) derived from the various studies will be presented along with supporting criteria. A preliminary design will also be presented for the reference basing scenario, including qualitative data regarding TPS concerns and issues.

  2. Maximum flow-based resilience analysis: From component to system

    PubMed Central

    Jin, Chong; Li, Ruiying; Kang, Rui

    2017-01-01

    Resilience, the ability to withstand disruptions and recover quickly, must be considered during system design because any disruption of the system may cause considerable loss, including economic and societal. This work develops analytic maximum flow-based resilience models for series and parallel systems using Zobel’s resilience measure. The two analytic models can be used to evaluate quantitatively and compare the resilience of the systems with the corresponding performance structures. For systems with identical components, the resilience of the parallel system increases with increasing number of components, while the resilience remains constant in the series system. A Monte Carlo-based simulation method is also provided to verify the correctness of our analytic resilience models and to analyze the resilience of networked systems based on that of components. A road network example is used to illustrate the analysis process, and the resilience comparison among networks with different topologies but the same components indicates that a system with redundant performance is usually more resilient than one without redundant performance. However, not all redundant capacities of components can improve the system resilience, the effectiveness of the capacity redundancy depends on where the redundant capacity is located. PMID:28545135
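
    The series/parallel contrast can be illustrated with a small maximum-flow computation. The sketch below uses hypothetical component capacities and the networkx library, and is not the paper's analytic resilience model: in the series arrangement the source-to-sink flow is limited by the weakest component, while in the parallel arrangement the capacities add.

    ```python
    # Illustrative max-flow comparison of series vs. parallel component layouts
    # (hypothetical capacities; edges without a 'capacity' attribute are unbounded).
    import networkx as nx

    def series_flow(capacities):
        """Components chained source -> c0 -> c1 -> ... -> sink."""
        G, prev = nx.DiGraph(), "s"
        for i, cap in enumerate(capacities):
            G.add_edge(prev, f"c{i}", capacity=cap)
            prev = f"c{i}"
        G.add_edge(prev, "t")                 # unbounded final edge
        return nx.maximum_flow_value(G, "s", "t")

    def parallel_flow(capacities):
        """Components side by side between source and sink."""
        G = nx.DiGraph()
        for i, cap in enumerate(capacities):
            G.add_edge("s", f"c{i}", capacity=cap)
            G.add_edge(f"c{i}", "t")          # unbounded
        return nx.maximum_flow_value(G, "s", "t")

    caps = [3.0, 5.0, 2.0]                    # hypothetical component capacities
    print("series max flow:  ", series_flow(caps))    # limited by the weakest link
    print("parallel max flow:", parallel_flow(caps))  # capacities add up
    ```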

  3. Aerodynamic flight evaluation analysis and data base update

    NASA Technical Reports Server (NTRS)

    Boyle, W. W.; Miller, M. S.; Wilder, G. O.; Reheuser, R. D.; Sharp, R. S.; Bridges, G. I.

    1989-01-01

    Research was conducted to determine the feasibility of replacing the Solid Rocket Boosters on the existing Space Shuttle Launch Vehicle (SSLV) with Liquid Rocket Boosters (LRB). As a part of the LRB selection process, a series of wind tunnel tests were conducted along with aero studies to determine the effects of different LRB configurations on the SSLV. Final results were tabulated into increments and added to the existing SSLV data base. The research conducted in this study was taken from a series of wind tunnel tests conducted at Marshall's 14-inch Trisonic Wind Tunnel. The effects on the axial force (CAF), normal force (CNF), pitching moment (CMF), side force (CY), wing shear force (CSR), wing torque moment (CTR), and wing bending moment (CBR) coefficients were investigated for a number of candidate LRB configurations. The aero effects due to LRB protuberances, ET/LRB separation distance, and aft skirts were also gathered from the tests. Analysis was also conducted to investigate the base pressure and plume effects due to the new booster geometries. The test results found in Phases 1 and 2 of wind tunnel testing are discussed and compared. Preliminary LRB lateral/directional data results and trends are given. The protuberance and gap/skirt effects are discussed. The base pressure/plume effects study is discussed and results are given.

  4. NURBS-Based Geometry for Integrated Structural Analysis

    NASA Technical Reports Server (NTRS)

    Oliver, James H.

    1997-01-01

    This grant was initiated in April 1993 and completed in September 1996. The primary goal of the project was to exploit the emerging defacto CAD standard of Non- Uniform Rational B-spline (NURBS) based curve and surface geometry to integrate and streamline the process of turbomachinery structural analysis. We focused our efforts on critical geometric modeling challenges typically posed by the requirements of structural analysts. We developed a suite of software tools that facilitate pre- and post-processing of NURBS-based turbomachinery blade models for finite element structural analyses. We also developed tools to facilitate the modeling of blades in their manufactured (or cold) state based on nominal operating shape and conditions. All of the software developed in the course of this research is written in the C++ language using the Iris Inventor 3D graphical interface tool-kit from Silicon Graphics. In addition to enhanced modularity, improved maintainability, and efficient prototype development, this design facilitates the re-use of code developed for other NASA projects and provides a uniform and professional 'look and feel' for all applications developed by the Iowa State Team.

  5. Analysis of base and codon usage by rubella virus.

    PubMed

    Zhou, Yumei; Chen, Xianfeng; Ushijima, Hiroshi; Frey, Teryl K

    2012-05-01

    Rubella virus (RUBV), a small, plus-strand RNA virus that is an important human pathogen, has the unique feature that the GC content of its genome (70%) is the highest (by 20%) among RNA viruses. To determine the effect of this GC content on genomic evolution, base and codon usage were analyzed across viruses from eight diverse genotypes of RUBV. Despite differences in frequency of codon use, the favored codons in the RUBV genome matched those in the human genome for 18 of the 20 amino acids, indicating adaptation to the host. Although usage patterns were conserved in corresponding genes in the diverse genotypes, within-genome comparison revealed that both base and codon usages varied regionally, particularly in the hypervariable region (HVR) of the P150 replicase gene. While directional mutation pressure was predominant in determining base and codon usage within most of the genome (with the strongest tendency being towards C's at third codon positions), natural selection was predominant in the HVR region. The GC content of this region was the highest in the genome (>80%), and it was not clear if selection at the nucleotide level accompanied selection at the amino acid level. Dinucleotide frequency analysis of the RUBV genome revealed that TpA usage was lower than expected, similar to mammalian genes; however, CpG usage was not suppressed, and TpG usage was not enhanced, as is the case in mammalian genes.
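
    The basic quantities behind such an analysis (GC content, GC at third codon positions, and codon counts) are straightforward to compute; the short sketch below uses a toy coding sequence rather than RUBV genome data.

    ```python
    # Toy example of base and codon usage metrics; not the RUBV sequences.
    from collections import Counter

    seq = "ATGGCGGCCGCTTGCGGCTAA"             # hypothetical coding sequence

    gc_content = (seq.count("G") + seq.count("C")) / len(seq)
    codons = [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]
    codon_usage = Counter(codons)
    gc3 = sum(1 for c in codons if c[2] in "GC") / len(codons)  # GC at 3rd positions

    print(f"GC content: {gc_content:.2f}, GC3: {gc3:.2f}")
    print(codon_usage.most_common())
    ```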

  6. A practical approach to object based requirements analysis

    NASA Technical Reports Server (NTRS)

    Drew, Daniel W.; Bishop, Michael

    1988-01-01

    Presented here is an approach developed at the Unisys Houston Operation Division, which supports the early identification of objects. This domain oriented analysis and development concept is based on entity relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object oriented design is also promoted by having requirements described in terms of objects. Presented is a five step process by which objects are identified from the requirements to create a problem definition model. This process involves establishing a base line requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model and a brief discussion on how this approach might be used in a large scale development effort.

  7. A Novel Multiobjective Evolutionary Algorithm Based on Regression Analysis

    PubMed Central

    Song, Zhiming; Wang, Maocai; Dai, Guangming; Vasile, Massimiliano

    2015-01-01

    As is known, the Pareto set of a continuous multiobjective optimization problem with m objective functions is a piecewise continuous (m − 1)-dimensional manifold in the decision space under some mild conditions. However, how to utilize the regularity to design multiobjective optimization algorithms has become the research focus. In this paper, based on this regularity, a model-based multiobjective evolutionary algorithm with regression analysis (MMEA-RA) is put forward to solve continuous multiobjective optimization problems with variable linkages. In the algorithm, the optimization problem is modelled as a promising area in the decision space by a probability distribution, and the centroid of the probability distribution is (m − 1)-dimensional piecewise continuous manifold. The least squares method is used to construct such a model. A selection strategy based on the nondominated sorting is used to choose the individuals to the next generation. The new algorithm is tested and compared with NSGA-II and RM-MEDA. The result shows that MMEA-RA outperforms RM-MEDA and NSGA-II on the test instances with variable linkages. At the same time, MMEA-RA has higher efficiency than the other two algorithms. A few shortcomings of MMEA-RA have also been identified and discussed in this paper. PMID:25874246

  8. A Research Roadmap for Computation-Based Human Reliability Analysis

    SciTech Connect

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  9. Spectral Entropy Based Neuronal Network Synchronization Analysis Based on Microelectrode Array Measurements

    PubMed Central

    Kapucu, Fikret E.; Välkki, Inkeri; Mikkonen, Jarno E.; Leone, Chiara; Lenk, Kerstin; Tanskanen, Jarno M. A.; Hyttinen, Jari A. K.

    2016-01-01

    Synchrony and asynchrony are essential aspects of the functioning of interconnected neuronal cells and networks. New information on neuronal synchronization can be expected to aid in understanding these systems. Synchronization provides insight in the functional connectivity and the spatial distribution of the information processing in the networks. Synchronization is generally studied with time domain analysis of neuronal events, or using direct frequency spectrum analysis, e.g., in specific frequency bands. However, these methods have their pitfalls. Thus, we have previously proposed a method to analyze temporal changes in the complexity of the frequency of signals originating from different network regions. The method is based on the correlation of time varying spectral entropies (SEs). SE assesses the regularity, or complexity, of a time series by quantifying the uniformity of the frequency spectrum distribution. It has been previously employed, e.g., in electroencephalogram analysis. Here, we revisit our correlated spectral entropy method (CorSE), providing evidence of its justification, usability, and benefits. Here, CorSE is assessed with simulations and in vitro microelectrode array (MEA) data. CorSE is first demonstrated with a specifically tailored toy simulation to illustrate how it can identify synchronized populations. To provide a form of validation, the method was tested with simulated data from integrate-and-fire model based computational neuronal networks. To demonstrate the analysis of real data, CorSE was applied on in vitro MEA data measured from rat cortical cell cultures, and the results were compared with three known event based synchronization measures. Finally, we show the usability by tracking the development of networks in dissociated mouse cortical cell cultures. The results show that temporal correlations in frequency spectrum distributions reflect the network relations of neuronal populations. In the simulated data, CorSE unraveled the
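
    The core of the CorSE computation can be sketched as follows, under stated assumptions (synthetic two-channel signals, Welch spectra, fixed one-second windows): spectral entropy is computed in sliding windows for each channel, and the resulting entropy time series are then correlated.

    ```python
    # Illustrative sketch of correlated spectral entropy with synthetic signals;
    # window length, sampling rate and signals are assumed, not from the study.
    import numpy as np
    from scipy.signal import welch

    def spectral_entropy(x, fs):
        """Shannon entropy (bits) of the normalized power spectrum."""
        f, pxx = welch(x, fs=fs, nperseg=min(256, len(x)))
        p = pxx / np.sum(pxx)
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def windowed_se(x, fs, win):
        return np.array([spectral_entropy(x[i:i + win], fs)
                         for i in range(0, len(x) - win + 1, win)])

    fs, win = 1000, 1000                       # 1 s windows at 1 kHz (assumed)
    rng = np.random.default_rng(2)
    t = np.arange(60 * fs) / fs
    shared = np.sin(2 * np.pi * 10 * t) * (1 + np.sin(2 * np.pi * 0.05 * t))
    ch1 = shared + 0.5 * rng.standard_normal(t.size)   # two "electrodes" sharing
    ch2 = shared + 0.5 * rng.standard_normal(t.size)   # a slowly modulated rhythm

    se1, se2 = windowed_se(ch1, fs, win), windowed_se(ch2, fs, win)
    corse = np.corrcoef(se1, se2)[0, 1]
    print(f"correlation of spectral entropies: {corse:.2f}")
    ```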

  10. Independent Component Analysis applied to Ground-based observations

    NASA Astrophysics Data System (ADS)

    Martins-Filho, Walter; Griffith, Caitlin Ann; Pearson, Kyle; Waldmann, Ingo; Alvarez-Candal, Alvaro; Zellem, Robert

    2017-10-01

    Transit measurements of Jovian-sized exoplanetary atmospheres allow one to study the composition of exoplanets, largely independently of the planet's temperature profile. However, measurements of hot-Jupiter transits must achieve a high level of accuracy in the flux to determine the spectral modulations of the exoplanetary atmosphere. To accomplish this level of precision, we need to extract systematic errors, and, for ground-based measurements, the effects of Earth's atmosphere, from the signal due to the exoplanet, which is several orders of magnitude smaller. The effects of the terrestrial atmosphere and some of the time-dependent systematic errors of ground-based transit measurements are treated mainly by dividing the host star by a reference star at each wavelength and time step of the transit. Recently, Independent Component Analysis (ICA) has been used to remove systematic effects from the raw data of space-based observations (Waldmann, 2014, 2012; Morello et al., 2016, 2015). ICA is a statistical method born from the ideas of blind-source separation studies, which can be used to de-trend several independent source signals of a data set (Hyvarinen and Oja, 2000). This technique requires no additional prior knowledge of the data set. In addition, this technique has the advantage of requiring no reference star. Here we apply ICA to ground-based photometry of the exoplanet XO-2b recorded by the 61" Kuiper Telescope and compare the results of the ICA to those of a previous analysis from Zellem et al. (2015), which does not use ICA. We also simulate the effects of various conditions (concerning the systematic errors, noise and the stability of the object on the detector) to determine the conditions under which ICA can be used with high precision to extract the light curve of exoplanetary photometry measurements.
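
    The general approach can be sketched with scikit-learn's FastICA on synthetic light curves (this is not the specific pipeline applied to the XO-2b data): several wavelength channels observing different mixtures of the same underlying sources are decomposed into statistically independent components, separating the transit signal from shared systematics.

    ```python
    # Illustrative ICA de-trending on synthetic light curves; mixing matrix,
    # systematics and transit shape are invented for the example.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(3)
    n_points = 500
    t = np.linspace(-0.1, 0.1, n_points)

    transit = 1.0 - 0.01 * (np.abs(t) < 0.04).astype(float)  # toy box-shaped transit
    airmass = 1.0 + 0.05 * (t - t.min())                     # slow systematic trend
    jitter = 0.003 * np.sin(2 * np.pi * 40 * t)               # fast systematic

    # Several "wavelength channels": different mixtures of the same sources + noise
    mixing = rng.uniform(0.5, 1.5, size=(6, 3))
    sources = np.vstack([transit, airmass, jitter])
    observations = mixing @ sources + 0.001 * rng.standard_normal((6, n_points))

    ica = FastICA(n_components=3, random_state=0)
    components = ica.fit_transform(observations.T)   # (n_points, 3) recovered sources
    print("recovered component array shape:", components.shape)
    ```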

  11. EM calibration based on Post OPC layout analysis

    NASA Astrophysics Data System (ADS)

    Sreedhar, Aswin; Kundu, Sandip

    2010-03-01

    Design for Manufacturability (DFM) involves changes to the design and CAD tools to help increase pattern printability and improve process control. Design for Reliability (DFR) does the same to improve the reliability of devices against failures such as electromigration (EM), gate-oxide breakdown, hot carrier injection (HCI), Negative Bias Temperature Instability (NBTI) and mechanical stress effects. Electromigration (EM) occurs due to migration or displacement of atoms as a result of the movement of electrons through a conducting medium. The rate of migration determines the Mean Time to Failure (MTTF), which is modeled as a function of temperature and current density. The model itself is calibrated through failure analysis (FA) of parts that are deemed to have failed due to EM against design parameters such as linewidth. Reliability Verification (RV) of a design involves verifying that every conducting line in the design meets a certain MTTF threshold. In order to perform RV, the current density for each wire must be computed. Current itself is a function of the parasitics that are determined through RC extraction. The standard practice is to perform the RC extraction and current density calculation on drawn, pre-OPC layouts. If a wire fails to meet the MTTF threshold, it may be resized. Subsequently, mask preparation steps such as OPC and PSM introduce extra features such as SRAFs, jogs, hammerheads and serifs that change the resistance, capacitance and current density values. Hence, calibrating the EM model based on pre-OPC layouts will lead to different results compared to post-OPC layouts. In this work, we compare EM model calibration and reliability checks based on the drawn layout versus the predicted layout, where the drawn layout is the pre-OPC layout and the predicted layout is based on litho simulation of the post-OPC layout. Results show significant divergence between these two approaches, making a case for a methodology based on the predicted layout.
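
    The dependence of EM lifetime on temperature and current density is conventionally captured by Black's equation, MTTF = A * J**(-n) * exp(Ea / (k*T)). The sketch below uses illustrative constants only; as described above, the model parameters are in practice calibrated against failure-analysis data. The point of the example is that a post-OPC wire narrower than drawn carries a higher current density and therefore a lower predicted MTTF.

    ```python
    # Worked example of the standard Black's-equation EM lifetime model
    # (constants A, n and Ea below are illustrative, not calibrated values).
    import math

    K_BOLTZMANN = 8.617e-5        # Boltzmann constant in eV/K

    def em_mttf(j, temp_kelvin, A=1.0e3, n=2.0, Ea=0.9):
        """MTTF from current density j (A/cm^2) and temperature (arbitrary time units)."""
        return A * j ** (-n) * math.exp(Ea / (K_BOLTZMANN * temp_kelvin))

    # A wire that prints ~30% narrower than drawn carries ~30% higher current density:
    print(em_mttf(1.0e6, 378.15))   # drawn width, 105 C stress temperature
    print(em_mttf(1.3e6, 378.15))   # litho-predicted narrowing => shorter lifetime
    ```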

  12. Hydrological landscape analysis based on digital elevation data

    NASA Astrophysics Data System (ADS)

    Seibert, J.; McGlynn, B.; Grabs, T.; Jensco, K.

    2008-12-01

    Topography is a major factor controlling both hydrological and soil processes at the landscape scale. While this is well accepted qualitatively, quantifying relationships between topography and spatial variations of hydrologically relevant variables at the landscape scale still remains a challenging research topic. In this presentation, we describe hydrological landscape analysis (HLA) as a way to derive relevant topographic indices to describe the spatial variations of hydrological variables at the landscape scale. We demonstrate our HLA approach with four high-resolution digital elevation models (DEMs) from Sweden, Switzerland and Montana (USA). To investigate scale effects on HLA metrics, we compared DEMs of different resolutions. These LiDAR-derived DEMs of 3 m, 10 m, and 30 m resolution represent catchments of ~5 km2 ranging from low to high relief. A central feature of HLA is the flowpath-based analysis of topography and the separation of hillslopes, riparian areas, and the stream network. We included the following metrics: riparian area delineation, riparian buffer potential, separation of stream inflows into right- and left-bank components, travel time proxies based on flowpath distances and gradients to the channel, and, as a hydrologic analogue to the hypsometric curve, we suggest the distribution of elevations above the stream network (computed based on the location where a certain flow pathway enters the stream). Several of these indices depended clearly on DEM resolution, whereas this effect was minor for others. While the hypsometric curves all were S-shaped, the 'hillslope-hypsometric curves' had the shape of a power function with exponents less than 1. In a similar way we separated flow pathway lengths and gradients between hillslopes and streams and compared a topographic travel time proxy, which was based on the integration of gradients along the flow pathways. Besides the comparison of HLA metrics for different catchments and DEM resolutions we present

  13. Phylogeny of the Acanthocephala based on morphological characters.

    PubMed

    Monks, S

    2001-02-01

    Only four previous studies of relationships among acanthocephalans have included cladistic analyses, and knowledge of the phylogeny of the group has not kept pace with that of other taxa. The purpose of this study is to provide a more comprehensive analysis of the phylogenetic relationships among members of the phylum Acanthocephala using morphological characters. The most appropriate outgroups are those that share a common early cell-cleavage pattern (polar placement of centrioles), such as the Rotifera, rather than the Priapulida (meridional placement of centrioles) to provide character polarity based on common ancestry rather than a general similarity likely due to convergence of body shapes. The phylogeny of 22 species of the Acanthocephala was evaluated based on 138 binary and multistate characters derived from comparative morphological and ontogenetic studies. Three assumptions of cement gland structure were tested: (i) the plesiomorphic type of cement glands in the Rotifera, as the sister group, is undetermined; (ii) non-syncytial cement glands are plesiomorphic; and (iii) syncytial cement glands are plesiomorphic. The results were used to test an early move of Tegorhynchus pectinarius to Koronacantha and to evaluate the relationship between Tegorhynchus and Illiosentis. Analysis of the data-set for each of these assumptions of cement gland structure produced the same single most parsimonious tree topology. Using Assumptions i and ii for the cement glands, the trees were the same length (length = 404 steps, CI = 0.545, CIX = 0.517, HI = 0.455, HIX = 0.483, RI = 0.670, RC = 0.365). Using Assumption iii, the tree was three steps longer (length = 408 steps, CI = 0.539, CIX = 0.512, HI = 0.461, HIX = 0.488, RI = 0.665, RC = 0.359). The tree indicates that the Palaeacanthocephala and Eoacanthocephala both are monophyletic and are sister taxa. The members of the Archiacanthocephala are basal to the other two clades, but do not themselves form a clade. The results

  14. Rasch model based analysis of the Force Concept Inventory

    NASA Astrophysics Data System (ADS)

    Planinic, Maja; Ivanjek, Lana; Susac, Ana

    2010-06-01

    The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One such tool is the stochastic Rasch model, which enables construction of linear measures for persons and items from raw test scores and which can provide important insight into the structure and functioning of the test (how item difficulties are distributed within the test, how well the items fit the model, and how well the items work together to define the underlying construct). The data for the Rasch analysis come from large-scale research conducted in 2006-07, which investigated Croatian high school students' conceptual understanding of mechanics on a representative sample of 1676 students (age 17-18 years). The instrument used in the research was the FCI. The average FCI score for the whole sample was found to be (27.7 ± 0.4)%, indicating that most of the students were still non-Newtonians at the end of high school, despite the fact that physics is a compulsory subject in Croatian schools. The large set of obtained data was analyzed with the Rasch measurement computer software WINSTEPS 3.66. Since the FCI is routinely used as pretest and post-test on two very different types of population (non-Newtonian and predominantly Newtonian), an additional predominantly Newtonian sample (N = 141, average FCI score of 64.5%) of first-year students enrolled in an introductory physics course at the University of Zagreb was also analyzed. The Rasch model based analysis suggests that the FCI has succeeded in defining a sufficiently unidimensional construct for each population. The analysis of fit of data to the model found no grossly misfitting items which would degrade measurement. Some items with larger misfit and items with significantly different difficulties in the two samples of students do require further examination
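
    As a brief illustration of the dichotomous Rasch model underlying the analysis above: the probability of a correct response depends only on the difference between person ability and item difficulty, both on the same logit scale. The item difficulties below are hypothetical, not the FCI calibration.

```python
# Sketch of the dichotomous Rasch model (illustrative parameters only).
import numpy as np


def rasch_probability(theta, b):
    """P(correct) for person ability theta and item difficulty b (logits)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))


def expected_score(theta, difficulties):
    """Expected raw test score of a person with ability theta."""
    return float(np.sum(rasch_probability(theta, np.asarray(difficulties))))


if __name__ == "__main__":
    item_difficulties = np.linspace(-2.0, 2.0, 30)   # 30 hypothetical items
    for theta in (-1.0, 0.0, 1.5):
        print(theta, round(expected_score(theta, item_difficulties), 2))
```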

  15. Functional analysis-based interventions for challenging behaviour in dementia.

    PubMed

    Moniz Cook, Esme D; Swift, Katie; James, Ian; Malouf, Reem; De Vugt, Marjolein; Verhey, Frans

    2012-02-15

    Functional analysis (FA) for the management of challenging behaviour is a promising behavioural intervention that involves exploring the meaning or purpose of an individual's behaviour. It extends the 'ABC' approach of behavioural analysis to overcome the restriction of having to derive a single explanatory hypothesis for the person's behaviour. It is seen as a first-line alternative to traditional pharmacological management for agitation and aggression. FA typically requires the therapist to develop and evaluate hypothesis-driven strategies that aid family and staff caregivers to reduce or resolve a person's distress and its associated behavioural manifestations. To assess the effects of functional analysis-based interventions for people with dementia (and their caregivers) living in their own home or in other settings. We searched ALOIS: the Cochrane Dementia and Cognitive Improvement Group's Specialized Register on 3 March 2011 using the terms: FA, behaviour (intervention, management, modification), BPSD, psychosocial and Dementia. Randomised controlled trials (RCTs) with reported behavioural outcomes that could be associated with functional analysis for the management of challenging behaviour in dementia were included. Four reviewers selected trials for inclusion. Two reviewers worked independently to extract data and assess trial quality, including bias. Meta-analyses for reported incidence, frequency, severity of care recipient challenging behaviour and mood (primary outcomes) and caregiver reaction, burden and mood were performed. Details of adverse effects were noted. Eighteen trials are included in the review. The majority were in family care settings. For fourteen studies, FA was just one aspect of a broad multi-component programme of care. Assessing the effect of FA was compromised by ill-defined protocols for the duration of component parts of these programmes (i.e. frequency of the intervention or actual time spent). Therefore, establishing the real effect of the

  16. Performance Analysis of an Actor-Based Distributed Simulation

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    Object-oriented design of simulation programs appears to be very attractive because of the natural association of components in the simulated system with objects. There is great potential in distributing the simulation across several computers for the purpose of parallel computation and its consequent handling of larger problems in less elapsed time. One approach to such a design is to use "actors", that is, active objects with their own thread of control. Because these objects execute concurrently, communication is via messages. This is in contrast to an object-oriented design using passive objects where communication between objects is via method calls (direct calls when they are in the same address space and remote procedure calls when they are in different address spaces or different machines). This paper describes a performance analysis program for the evaluation of a design for distributed simulations based upon actors.
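
    The following is a minimal sketch of the actor idea described above: each actor owns its own thread of control and a mailbox, and actors communicate only by asynchronous messages rather than by direct method calls. This is a generic Python illustration, not the paper's simulation framework; the class and message names are hypothetical.

```python
# Minimal actor: an active object with its own thread and a message mailbox.
import queue
import threading


class Actor(threading.Thread):
    def __init__(self, name):
        super().__init__(daemon=True)
        self.name = name
        self.mailbox = queue.Queue()

    def send(self, message):
        # Asynchronous communication: callers never invoke handle() directly.
        self.mailbox.put(message)

    def run(self):
        while True:
            message = self.mailbox.get()
            if message is None:              # poison pill terminates the actor
                break
            self.handle(message)

    def handle(self, message):
        print(f"{self.name} received: {message}")


if __name__ == "__main__":
    a = Actor("component-A")
    a.start()
    a.send("advance simulation clock to t=1.0")
    a.send(None)
    a.join()
```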

  17. Visual traffic jam analysis based on trajectory data.

    PubMed

    Wang, Zuchao; Lu, Min; Yuan, Xiaoru; Zhang, Junping; van de Wetering, Huub

    2013-12-01

    In this work, we present an interactive system for visual analysis of urban traffic congestion based on GPS trajectories. For these trajectories we develop strategies to extract and derive traffic jam information. After cleaning the trajectories, they are matched to a road network. Subsequently, traffic speed on each road segment is computed and traffic jam events are automatically detected. Spatially and temporally related events are concatenated into so-called traffic jam propagation graphs. These graphs form a high-level description of a traffic jam and its propagation in time and space. Our system provides multiple views for visually exploring and analyzing the traffic condition of a large city as a whole, on the level of propagation graphs, and on the road segment level. Case studies with 24 days of taxi GPS trajectories collected in Beijing demonstrate the effectiveness of our system.
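
    A small sketch of the jam-event and propagation-graph idea follows: a road segment is considered jammed at a time step when its speed falls below a threshold, and two jam events are linked when they occur on adjacent segments in consecutive time steps. The speed data, adjacency, and threshold are hypothetical, not the paper's detection rules.

```python
# Toy jam detection and propagation-graph construction (illustrative only).
SPEED_THRESHOLD = 15.0  # km/h below which a segment is considered jammed


def detect_jam_events(speeds):
    """speeds: {segment_id: [speed at t0, t1, ...]} -> list of (segment, t) events."""
    return [(seg, t) for seg, series in speeds.items()
            for t, v in enumerate(series) if v < SPEED_THRESHOLD]


def propagation_graph(events, adjacency):
    """Link events on adjacent segments occurring in consecutive time steps."""
    event_set = set(events)
    edges = []
    for seg, t in events:
        for neighbour in adjacency.get(seg, ()):
            if (neighbour, t + 1) in event_set:
                edges.append(((seg, t), (neighbour, t + 1)))
    return edges


if __name__ == "__main__":
    speeds = {"A": [40, 12, 10, 35], "B": [38, 36, 11, 9], "C": [50, 48, 47, 46]}
    adjacency = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
    events = detect_jam_events(speeds)
    print("events:", events)
    print("propagation edges:", propagation_graph(events, adjacency))
```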

  18. Microscopic image analysis for reticulocyte based on watershed algorithm

    NASA Astrophysics Data System (ADS)

    Wang, J. Q.; Liu, G. F.; Liu, J. G.; Wang, G.

    2007-12-01

    We present a watershed-based algorithm for the analysis of light microscopic images of reticulocytes (RETs), to be used in an automated RET recognition system for peripheral blood. The original images, obtained by micrography, are segmented by a modified watershed algorithm and recognized in terms of gray entropy and the area of connected regions. In the watershed algorithm, judgment conditions are controlled according to the character of the image, and the segmentation is performed by morphological subtraction. The algorithm was simulated with MATLAB software. Automated and manual scoring gave similar results, with good correlation (r = 0.956) between the two methods over 50 RET images. The results indicate that the algorithm for peripheral blood RETs is comparable to conventional manual scoring and is superior in objectivity. The algorithm avoids time-consuming calculations such as ultra-erosion and region growth, which consequently speeds up the computation.
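
    For illustration, here is a generic marker-controlled watershed segmentation in the spirit of the approach above. The original work used a modified watershed with morphological subtraction in MATLAB; this sketch uses scikit-image on a synthetic image of two touching "cells", and all parameters are assumptions.

```python
# Marker-controlled watershed segmentation on a synthetic binary image.
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed


def segment(binary):
    # Distance transform: bright ridges inside each object.
    distance = ndi.distance_transform_edt(binary)
    # One marker per local maximum of the distance map.
    coords = peak_local_max(distance, min_distance=10, labels=binary)
    mask = np.zeros(distance.shape, dtype=bool)
    mask[tuple(coords.T)] = True
    markers, _ = ndi.label(mask)
    # Flood the negated distance map from the markers, restricted to the objects.
    return watershed(-distance, markers, mask=binary)


if __name__ == "__main__":
    yy, xx = np.mgrid[0:80, 0:80]
    binary = ((xx - 28) ** 2 + (yy - 40) ** 2 < 15 ** 2) | \
             ((xx - 52) ** 2 + (yy - 40) ** 2 < 15 ** 2)
    labels = segment(binary)
    print("objects found:", labels.max())   # expected: 2 separated cells
```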

  19. Knowledge-based requirements analysis for automating software development

    NASA Technical Reports Server (NTRS)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  20. Agent-based model to rural urban migration analysis

    NASA Astrophysics Data System (ADS)

    Silveira, Jaylson J.; Espíndola, Aquino L.; Penna, T. J. P.

    2006-05-01

    In this paper, we analyze the rural-urban migration phenomenon as it is usually observed in economies which are in the early stages of industrialization. The analysis is conducted by means of a statistical mechanics approach which builds a computational agent-based model. Agents are placed on a lattice and the connections among them are described via an Ising-like model. Simulations on this computational model show some emergent properties that are common in developing economies, such as a transitional dynamics characterized by continuous growth of urban population, followed by the equalization of expected wages between rural and urban sectors (the Harris-Todaro equilibrium condition), urban concentration and an increase in per capita income.
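
    An illustrative sketch of such an Ising-like lattice of agents follows: each agent is rural (-1) or urban (+1), and its choice is influenced by its lattice neighbours (imitation, coupling J) and by an expected urban-rural wage differential acting as an external field. The wages, coupling, and update rule below are hypothetical, not the parameters of the paper.

```python
# Toy Ising-like migration lattice with a Harris-Todaro-style wage differential.
import numpy as np

rng = np.random.default_rng(1)
N, J, BETA = 64, 0.8, 2.0            # lattice size, neighbour coupling, choice sensitivity
RURAL_WAGE, URBAN_WAGE = 1.0, 1.5

state = -np.ones((N, N))             # everyone starts in the rural sector

for step in range(20000):
    urban_share = (state > 0).mean()
    employment_prob = min(1.0, 0.6 / max(urban_share, 1e-9))   # crowding in the city
    field = URBAN_WAGE * employment_prob - RURAL_WAGE          # expected wage differential
    i, j = rng.integers(N), rng.integers(N)
    neighbours = state[(i + 1) % N, j] + state[i - 1, j] + state[i, (j + 1) % N] + state[i, j - 1]
    local = J * neighbours + field
    # Logistic (Glauber-like) choice: probability of being urban given local influence.
    state[i, j] = 1.0 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * BETA * local)) else -1.0

print("final urban population share:", round((state > 0).mean(), 3))
```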

  1. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  2. An adjoint-based sensitivity analysis of thermoacoustic network models

    NASA Astrophysics Data System (ADS)

    Sogaro, Francesca; Morgans, Aimee; Schmid, Peter

    2017-11-01

    Thermoacoustic instability is a phenomenon that occurs in numerous combustion systems, from rockets to land-based gas turbines. The acoustic oscillations of these systems are of significant importance as they can result in severe vibrations, thrust oscillations, thermal stresses and mechanical loads that lead to fatigue or even failure. In this work we use a low-order network model representation of a combustor system where linear acoustics are solved together with the appropriate boundary conditions, area change jump conditions, acoustic dampers and an appropriate flame transfer function. Special emphasis is directed towards the interaction between acoustically driven instabilities and flame-intrinsic modes. Adjoint methods are used to perform a sensitivity analysis of the spectral properties of the system to changes in the parameters involved. An exchange of modal identity between acoustic and intrinsic modes will be demonstrated and analyzed. The results provide insight into the interplay between various mode types and build a quantitative foundation for the design of combustors.
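
    For a matrix (network-model) eigenvalue problem A(p) v = λ v, the adjoint-based first-order sensitivity used in this kind of analysis is dλ/dp = wᴴ (dA/dp) v / (wᴴ v), where v and w are the right and left (adjoint) eigenvectors. The sketch below verifies this identity on a generic random matrix, not a thermoacoustic network model.

```python
# Adjoint (left-eigenvector) eigenvalue sensitivity vs. finite differences.
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
dA = rng.standard_normal((n, n))         # dA/dp: direction of the parameter change

eigvals, V = np.linalg.eig(A)            # right eigenvectors
W = np.linalg.inv(V).conj().T            # columns of W are left (adjoint) eigenvectors

k = 0                                    # track the first eigenvalue
v, w = V[:, k], W[:, k]
adjoint_sensitivity = (w.conj() @ dA @ v) / (w.conj() @ v)

# Finite-difference check: perturb A and match the nearest eigenvalue.
eps = 1e-6
pert = np.linalg.eigvals(A + eps * dA)
fd = (pert[np.argmin(abs(pert - eigvals[k]))] - eigvals[k]) / eps

print("adjoint:", adjoint_sensitivity)
print("finite difference:", fd)
```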

  3. CFD Based Computations of Flexible Helicopter Blades for Stability Analysis

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    2011-01-01

    As a collaborative effort among government aerospace research laboratories, an advanced version of a widely used computational fluid dynamics code, OVERFLOW, was recently released. This latest version includes additions to model flexible rotating multiple blades. In this paper, the OVERFLOW code is applied to improve the accuracy of airload computations over the linear lifting line theory that uses displacements from a beam model. Data transfers required at every revolution are managed through a Unix-based script that runs jobs on large super-cluster computers. Results are demonstrated for the 4-bladed UH-60A helicopter. Deviations of computed data from flight data are evaluated. Fourier analysis post-processing that is suitable for aeroelastic stability computations is performed.

  4. Identification and annotation of erotic film based on content analysis

    NASA Astrophysics Data System (ADS)

    Wang, Donghui; Zhu, Miaoliang; Yuan, Xin; Qian, Hui

    2005-02-01

    The paper brings forward a new method for identifying and annotating erotic films based on content analysis. First, the film is decomposed into video and audio streams. Then, the video stream is segmented into shots and key frames are extracted from each shot. We filter the shots that include potential erotic content by finding the nude human body in key frames. A Gaussian model in YCbCr color space for detecting skin regions is presented. An external polygon that covers the skin regions is used as an approximation of the human body. Last, we give the degree of nudity by calculating the ratio of skin area to whole body area with weighted parameters. The result of the experiment shows the effectiveness of our method.

  5. Image Registration Algorithm Based on Parallax Constraint and Clustering Analysis

    NASA Astrophysics Data System (ADS)

    Wang, Zhe; Dong, Min; Mu, Xiaomin; Wang, Song

    2018-01-01

    To resolve the problem of slow computation speed and low matching accuracy in image registration, a new image registration algorithm based on a parallax constraint and clustering analysis is proposed. Firstly, the Harris corner detection algorithm is used to extract the feature points of two images. Secondly, the Normalized Cross Correlation (NCC) function is used to perform approximate matching of the feature points, yielding the initial feature pairs. Then, according to the parallax constraint condition, the initial feature pairs are preprocessed by the K-means clustering algorithm, which removes feature point pairs with obvious errors from the approximate matching step. Finally, the Random Sample Consensus (RANSAC) algorithm is adopted to refine the feature points and obtain the final matching result, realizing fast and accurate image registration. The experimental results show that the image registration algorithm proposed in this paper can improve the accuracy of image matching while ensuring the real-time performance of the algorithm.
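
    A rough sketch of this kind of pipeline is given below (Harris corners, NCC patch matching, RANSAC refinement); the K-means parallax-constraint filtering step of the paper is omitted for brevity. OpenCV is assumed, the test images are synthetic, and all thresholds are illustrative.

```python
# Harris corners + NCC matching + RANSAC homography on a synthetic image pair.
import cv2
import numpy as np


def ncc(a, b):
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())


def match(img1, img2, patch=7, min_ncc=0.8):
    pts1 = cv2.goodFeaturesToTrack(img1, 200, 0.01, 10, useHarrisDetector=True)
    pts2 = cv2.goodFeaturesToTrack(img2, 200, 0.01, 10, useHarrisDetector=True)
    pts1 = pts1.reshape(-1, 2).astype(int)
    pts2 = pts2.reshape(-1, 2).astype(int)
    r = patch // 2
    pairs = []
    for x1, y1 in pts1:
        p1 = img1[y1 - r:y1 + r + 1, x1 - r:x1 + r + 1].astype(float)
        if p1.shape != (patch, patch):
            continue
        best, best_pt = min_ncc, None
        for x2, y2 in pts2:
            p2 = img2[y2 - r:y2 + r + 1, x2 - r:x2 + r + 1].astype(float)
            if p2.shape != (patch, patch):
                continue
            score = ncc(p1, p2)
            if score > best:
                best, best_pt = score, (x2, y2)
        if best_pt is not None:
            pairs.append(((x1, y1), best_pt))
    src = np.float32([p[0] for p in pairs])
    dst = np.float32([p[1] for p in pairs])
    # RANSAC rejects remaining mismatches while estimating the transform.
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H, int(inliers.sum())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img1 = cv2.GaussianBlur((rng.random((200, 200)) * 255).astype(np.uint8), (5, 5), 0)
    M = np.float32([[1, 0, 12], [0, 1, 7]])               # known shift
    img2 = cv2.warpAffine(img1, M, (200, 200))
    H, n = match(img1, img2)
    print("inliers:", n)
    print("estimated translation:", H[0, 2], H[1, 2])
```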

  6. Analysis of Environmental Law Enforcement Mechanism Based on Economic Principle

    NASA Astrophysics Data System (ADS)

    Cao, Hongjun; Shao, Haohao; Cai, Xuesen

    2017-11-01

    Strengthening and improving the environmental law enforcement mechanism is an important way to protect the ecological environment. Based on economic principles, this paper analyzes how the marginal management costs (in the sense of Pigou) and the marginal transaction costs (in the sense of Coase) vary with the growing number of pollutant-discharging enterprises. From this analysis, we draw the following conclusions. In the process of strengthening the environmental law enforcement mechanism, we should first fully mobilize all participants in environmental law enforcement, such as legislative bodies and law enforcement agencies, public welfare organizations, television, newspapers, enterprises and the public, so that they form a reasonable and organic structural system; we should then use various management means, such as government regulation, legal sanctions, fines, persuasion and public censure, which also need to form an organic structural system.

  7. Iris recognition based on robust principal component analysis

    NASA Astrophysics Data System (ADS)

    Karn, Pradeep; He, Xiao Hai; Yang, Shuai; Wu, Xiao Hong

    2014-11-01

    Iris images acquired under different conditions often suffer from blur, occlusion due to eyelids and eyelashes, specular reflection, and other artifacts. Existing iris recognition systems do not perform well on these types of images. To overcome these problems, we propose an iris recognition method based on robust principal component analysis. The proposed method decomposes all training images into a low-rank matrix and a sparse error matrix, where the low-rank matrix is used for feature extraction. The sparsity concentration index approach is then applied to validate the recognition result. Experimental results using the CASIA V4 and IIT Delhi V1 iris image databases showed that the proposed method achieved competitive performance in both recognition accuracy and computational efficiency.

  8. Analysis of rocket flight stability based on optical image measurement

    NASA Astrophysics Data System (ADS)

    Cui, Shuhua; Liu, Junhu; Shen, Si; Wang, Min; Liu, Jun

    2018-02-01

    Based on the abundant image data available from optical measurement systems, this paper puts forward a method for evaluating rocket flight stability using measurements of the imaging characteristics of the carrier rocket. Building on this measurement method, the attitude parameters of the rocket body in the coordinate system are calculated from the measurement data of multiple high-speed television cameras, the parameters are then converted to the angle of attack of the rocket body, and it is assessed whether the rocket has good flight stability, i.e., whether it flies with a small angle of attack. The measurement method and the steps of the mathematical algorithm were verified through data processing tests, in which the rocket flight stability state can be observed intuitively and faults in the guidance system can also be identified visually to support failure analysis.

  9. Encoder fault analysis system based on Moire fringe error signal

    NASA Astrophysics Data System (ADS)

    Gao, Xu; Chen, Wei; Wan, Qiu-hua; Lu, Xin-ran; Xie, Chun-yu

    2018-02-01

    To address faults and code errors in the practical application of photoelectric shaft encoders, a fast and accurate encoder fault analysis system is developed from the perspective of Moire fringe photoelectric signal processing. A DSP28335 is selected as the core processor, a high-speed serial A/D converter acquisition card is used, and a temperature measuring circuit using the AD7420 is designed. Discrete data of the Moire fringe error signal are collected at different temperatures and sent to the host computer through wireless transmission. The error signal quality index and fault type are displayed on the host computer based on the error signal identification method. The error signal quality can be used to diagnose the state of error codes through the human-machine interface.

  10. Waveguide-based electro-absorption modulator performance: comparative analysis

    NASA Astrophysics Data System (ADS)

    Amin, Rubab; Khurgin, Jacob B.; Sorger, Volker J.

    2018-06-01

    Electro-optic modulation is a key function for data communication. Given the vast amount of data handled, understanding the intricate physics and trade-offs of on-chip modulators allows performance regimes not yet explored to be revealed. Here we show a holistic performance analysis for waveguide-based electro-absorption modulators. Our approach centers around material properties revealing the obtainable optical absorption, leading to the effective modal cross-section, and material broadening effects. Taken together, both describe the modulator's physical behavior entirely. We consider a plurality of material modulation classes, including two-level absorbers such as quantum dots, free carrier accumulation or depletion such as in ITO or silicon, two-dimensional electron gases in semiconductors such as quantum wells, Pauli blocking in graphene, and excitons in two-dimensional atomic layered materials such as transition metal dichalcogenides. Our results show that reducing the modal area generally improves modulator performance, defined by the amount of induced electrical charge, and hence the energy-per-bit function, required to switch the signal. We find that broadening increases the amount of switching charge needed. While some material classes allow for reduced broadening, such as quantum dots and 2-dimensional materials, due to their reduced Coulomb screening leading to increased oscillator strengths, the sharpness of broadening is overshadowed by thermal effects independent of the material class. Further, we find that plasmonics allows the switching charge and energy-per-bit function to be reduced by about one order of magnitude compared to bulk photonics. This analysis is aimed as a guide for the community to predict anticipated modulator performance based on both existing and emerging materials.

  11. Adansonian Analysis and Deoxyribonucleic Acid Base Composition of Serratia marcescens

    PubMed Central

    Colwell, R. R.; Mandel, M.

    1965-01-01

    Colwell, R. R. (Georgetown University, Washington, D.C.), and M. Mandel. Adansonian analysis and deoxyribonucleic acid base composition of Serratia marcescens. J. Bacteriol. 89:454–461. 1965.—A total of 33 strains of Serratia marcescens were subjected to Adansonian analysis for which more than 200 coded features for each of the organisms were included. In addition, the base composition [expressed as moles per cent guanine + cytosine (G + C)] of the deoxyribonucleic acid (DNA) prepared from each of the strains was determined. Except for four strains which were intermediate between Serratia and the Hafnia and Aerobacter group C of Edwards and Ewing, the S. marcescens species group proved to be extremely homogeneous, and the different strains showed high affinities for each other (mean similarity, S̄ = 77%). The G + C ratio of the DNA from the Serratia strains ranged from 56.2 to 58.4% G + C. Many species names have been listed for the genus, but only a single clustering of the strains was obtained at the species level, for which the species name S. marcescens was retained. S. kiliensis, S. indica, S. plymuthica, and S. marinorubra could not be distinguished from S. marcescens; it was concluded, therefore, that there is only a single species in the genus. The variety designation kiliensis does not appear to be valid, since no subspecies clustering of strains with negative Voges-Proskauer reactions could be detected. The characteristics of the species are listed, and a description of S. marcescens is presented. PMID:14255714

  12. Computer-based analysis of holography using ray tracing.

    PubMed

    Latta, J N

    1971-12-01

    The application of a ray-tracing methodology to holography is presented. Emphasis is placed on establishing a very general foundation from which to build a general computer-based implementation. As few restrictions as possible are placed on the recording and reconstruction geometry. The necessary equations are established from the construction and reconstruction parameters of the hologram. The aberrations are defined following H. H. Hopkins, and these aberration specification techniques are compared with those used previously to analyze holography. Representative of the flexibility of the ray-tracing approach, two examples are considered. The first compares the answers between a wavefront-matching and the ray-tracing analysis in the case of aberration balancing to compensate for chromatic aberrations. The results are very close and establish the basic utility of aberration balancing. Further indicative of the power of ray tracing, a thick-media analysis is included in the computer programs. This section is then used to perform a study of the effects of hologram emulsion shrinkage and methods for compensation. Compensating such holograms introduces aberrations, and these are considered for both reflection and transmission holograms.

  13. Quantitative Analysis of Intracellular Motility Based on Optical Flow Model

    PubMed Central

    Li, Heng

    2017-01-01

    Analysis of cell mobility is a key issue for abnormality identification and classification in cell biology research. However, since cell deformation induced by various biological processes is random and cell protrusion is irregular, it is difficult to measure cell morphology and motility in microscopic images. To address this dilemma, we propose an improved variational optical flow model for quantitative analysis of intracellular motility, which not only extracts intracellular motion fields effectively but also deals with the optical flow computation problem at the border by taking advantage of formulations based on the L1 and L2 norms, respectively. In the energy functional of our proposed optical flow model, the data term is in the form of the L2 norm; the smoothness term changes with regional features through an adaptive parameter, using the L1 norm near the edge of the cell and the L2 norm away from the edge. We further extract histograms of oriented optical flow (HOOF) after the optical flow field of intracellular motion is computed. Then distances between different HOOFs are calculated as the intracellular motion features to grade the intracellular motion. Experimental results show that the features extracted from HOOFs provide new insights into the relationship between cell motility and special pathological conditions. PMID:29065574
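
    The histogram-of-oriented-optical-flow (HOOF) feature mentioned above can be sketched as follows: a dense flow field is computed between two frames (here with OpenCV's Farneback method rather than the paper's variational model), and the flow vectors are binned by orientation, weighted by magnitude. The frames and bin count are hypothetical.

```python
# Dense optical flow followed by a magnitude-weighted orientation histogram.
import cv2
import numpy as np


def hoof(flow, n_bins=8):
    fx, fy = flow[..., 0].ravel(), flow[..., 1].ravel()
    magnitude = np.hypot(fx, fy)
    angle = np.arctan2(fy, fx)                       # -pi .. pi
    hist, _ = np.histogram(angle, bins=n_bins, range=(-np.pi, np.pi), weights=magnitude)
    total = hist.sum()
    return hist / total if total > 0 else hist       # normalise to a distribution


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame1 = cv2.GaussianBlur((rng.random((120, 120)) * 255).astype(np.uint8), (5, 5), 0)
    frame2 = np.roll(frame1, shift=3, axis=1)        # synthetic motion to the right
    flow = cv2.calcOpticalFlowFarneback(frame1, frame2, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    print("HOOF:", np.round(hoof(flow), 3))          # mass concentrated near angle 0
```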

  14. Multi-spectrometer calibration transfer based on independent component analysis.

    PubMed

    Liu, Yan; Xu, Hao; Xia, Zhenzhen; Gong, Zhiyong

    2018-02-26

    Calibration transfer is indispensable for practical applications of near infrared (NIR) spectroscopy due to the need for precise and consistent measurements across different spectrometers. In this work, a method for multi-spectrometer calibration transfer is described based on independent component analysis (ICA). A spectral matrix is first obtained by aligning the spectra measured on different spectrometers. Then, by using independent component analysis, the aligned spectral matrix is decomposed into the mixing matrix and the independent components of the different spectrometers. The differences in measurements between spectrometers can then be standardized by correcting the coefficients within the independent components. Two NIR datasets of corn and edible oil samples measured with three and four spectrometers, respectively, were used to test the reliability of this method. The results for both datasets reveal that spectra measured on different spectrometers can be transferred simultaneously and that the partial least squares (PLS) models built with the measurements on one spectrometer can correctly predict spectra transferred from another.
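
    Only the decomposition step lends itself to a short sketch: spectra from different instruments are stacked into one matrix and decomposed with ICA, so that per-instrument differences appear in the mixing coefficients that the method then standardizes. The coefficient-correction rule itself is specific to the paper and is not reproduced here; the "spectra" below are synthetic.

```python
# ICA decomposition of spectra simulated for two instruments with different responses.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
wavelengths = np.linspace(0, 1, 300)
# Two latent "pure component" spectra (Gaussian bands).
s1 = np.exp(-((wavelengths - 0.3) ** 2) / 0.002)
s2 = np.exp(-((wavelengths - 0.7) ** 2) / 0.004)


def measure(concentrations, gain, offset):
    """Simulate one spectrometer with its own gain/offset characteristics."""
    return gain * (concentrations @ np.vstack([s1, s2])) + offset


conc = rng.random((20, 2))
X = np.vstack([measure(conc, 1.00, 0.00),      # instrument A
               measure(conc, 0.92, 0.05)])     # instrument B, slightly different response

ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(X.T)               # independent components (spectral profiles)
mixing = ica.mixing_                           # coefficients per measured spectrum
print("sources shape:", sources.shape, "mixing shape:", mixing.shape)
```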

  15. Phylogenetic relationships of Malassezia species based on multilocus sequence analysis.

    PubMed

    Castellá, Gemma; Coutinho, Selene Dall' Acqua; Cabañes, F Javier

    2014-01-01

    Members of the genus Malassezia are lipophilic basidiomycetous yeasts, which are part of the normal cutaneous microbiota of humans and other warm-blooded animals. Currently, this genus consists of 14 species that have been characterized by phenetic and molecular methods. Although several molecular methods have been used to identify and/or differentiate Malassezia species, sequencing of the rRNA genes and the chitin synthase-2 gene (CHS2) is the most widely employed. There is little information about the β-tubulin gene in the genus Malassezia, a gene that has been used for the analysis of complex species groups. The aim of the present study was to sequence a fragment of the β-tubulin gene of Malassezia species and analyze their phylogenetic relationships using a multilocus sequence approach based on two rRNA genes (ITS including 5.8S rRNA and the D1/D2 region of 26S rRNA) together with two protein-encoding genes (CHS2 and β-tubulin). The phylogenetic study of the partial β-tubulin gene sequences indicated that this molecular marker can be used to assess diversity and identify new species. The multilocus sequence analysis of the four loci provides robust support to delineate species at the terminal nodes and could help to estimate divergence times for the origin and diversification of Malassezia species.

  16. Flow-based analysis using microfluidics-chemiluminescence systems.

    PubMed

    Al Lawati, Haider A J

    2013-01-01

    This review will discuss various approaches and techniques in which analysis using microfluidics-chemiluminescence systems (MF-CL) has been reported. A variety of applications is examined, including environmental, pharmaceutical, biological, food and herbal analysis. Reported uses of CL reagents, sample introduction techniques, sample pretreatment methods, CL signal enhancement and detection systems are discussed. A hydrodynamic pumping system is predominantly used for these applications. However, several reports are available in which electro-osmotic (EO) pumping has been implemented. Various sample pretreatment methods have been used, including liquid-liquid extraction, solid-phase extraction and molecularly imprinted polymers. A wide range of innovative techniques has been reported for CL signal enhancement. Most of these techniques are based on enhancement of the mixing process in the microfluidics channels, which leads to enhancement of the CL signal. However, other techniques are also reported, such as mirror reaction, liquid core waveguide, on-line pre-derivatization and the use of an opaque white chip with a thin transparent seal. Photodetectors are the most commonly used detectors; however, other detection systems have also been used, including integrated electrochemiluminescence (ECL) and organic photodiodes (OPDs). Copyright © 2012 John Wiley & Sons, Ltd.

  17. Artistic image analysis using graph-based learning approaches.

    PubMed

    Carneiro, Gustavo

    2013-08-01

    We introduce a new methodology for the problem of artistic image analysis, which, among other tasks, involves the automatic identification of visual classes present in an art work. In this paper, we advocate the idea that artistic image analysis must explore a graph that captures the network of artistic influences by computing similarities in terms of appearance and manual annotation. One of the novelties of our methodology is the proposed formulation, which is a principled way of combining these two similarities in a single graph. Using this graph, we show that an efficient random walk algorithm based on an inverted label propagation formulation produces more accurate annotation and retrieval results compared with the following baseline algorithms: bag of visual words, label propagation, matrix completion, and structural learning. We also show that the proposed approach leads to more efficient inference and training procedures. This experiment is run on a database containing 988 artistic images (with 49 visual classification problems divided into a multiclass problem with 27 classes and 48 binary problems), where we show the inference and training running times, and quantitative comparisons with respect to several retrieval and annotation performance measures.

  18. Hydrochemical analysis of groundwater using a tree-based model

    NASA Astrophysics Data System (ADS)

    Litaor, M. Iggy; Brielmann, H.; Reichmann, O.; Shenker, M.

    2010-06-01

    Hydrochemical indices are commonly used to ascertain aquifer characteristics, salinity problems, anthropogenic inputs and resource management, among others. This study was conducted to test the applicability of a binary decision tree model to aquifer evaluation using hydrochemical indices as input. The main advantage of the tree-based model compared to other commonly used statistical procedures such as cluster and factor analyses is the ability to classify groundwater samples with assigned probability and the reduction of a large data set into a few significant variables without creating new factors. We tested the model using data sets collected from headwater springs of the Jordan River, Israel. The model evaluation consisted of several levels of complexity, from simple separation between the calcium-magnesium-bicarbonate water type of karstic aquifers to the more challenging separation of calcium-sodium-bicarbonate water type flowing through perched and regional basaltic aquifers. In all cases, the model assigned measures for goodness of fit in the form of misclassification errors and singled out the most significant variable in the analysis. The model proceeded through a sequence of partitions providing insight into different possible pathways and changing lithology. The model results were extremely useful in constraining the interpretation of geological heterogeneity and constructing a conceptual flow model for a given aquifer. The tree model clearly identified the hydrochemical indices that were excluded from the analysis, thus providing information that can lead to a decrease in the number of routinely analyzed variables and a significant reduction in laboratory cost.
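
    A minimal sketch of this kind of tree-based classification follows: hydrochemical indices serve as input features and a binary decision tree assigns each groundwater sample to an aquifer type with an associated probability. The indices, class labels, and data below are synthetic, not the Jordan River spring measurements.

```python
# Decision-tree classification of synthetic groundwater samples.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
# Hypothetical indices: [Ca/Mg ratio, Na/(Na+Ca), HCO3 (meq/L)]
karstic = np.column_stack([rng.normal(3.0, 0.4, 40),
                           rng.normal(0.15, 0.05, 40),
                           rng.normal(4.5, 0.5, 40)])
basaltic = np.column_stack([rng.normal(1.2, 0.3, 40),
                            rng.normal(0.45, 0.08, 40),
                            rng.normal(3.0, 0.5, 40)])
X = np.vstack([karstic, basaltic])
y = np.array([0] * 40 + [1] * 40)            # 0 = karstic aquifer, 1 = basaltic aquifer

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["Ca/Mg", "Na/(Na+Ca)", "HCO3"]))
# Class membership probability for a new sample, as assigned by the tree.
print(tree.predict_proba([[2.8, 0.2, 4.2]]))
```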

  19. Cnn Based Retinal Image Upscaling Using Zero Component Analysis

    NASA Astrophysics Data System (ADS)

    Nasonov, A.; Chesnakov, K.; Krylov, A.

    2017-05-01

    The aim of the paper is to obtain high-quality image upscaling for noisy images that are typical in medical image processing. A new training scenario for a convolutional neural network based image upscaling method is proposed. Its main idea is a novel dataset preparation method for deep learning. The dataset contains pairs of noisy low-resolution images and corresponding noiseless high-resolution images. To achieve better results at edges and textured areas, Zero Component Analysis is applied to these images. The upscaling results are compared with other state-of-the-art methods like DCCI, SI-3 and SRCNN on noisy medical ophthalmological images. Objective evaluation of the results confirms the high quality of the proposed method. Visual analysis shows that fine details and structures like blood vessels are preserved, the noise level is reduced and no artifacts or non-existing details are added. These properties are essential in establishing a retinal diagnosis, so the proposed algorithm is recommended for use in real medical applications.
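
    Zero Component Analysis (ZCA) whitening, mentioned above as a preprocessing step, decorrelates image patches with the transform W = U diag(1/sqrt(s + eps)) Uᵀ built from the eigendecomposition of their covariance while keeping the result close to the original data. The sketch below uses random patches as stand-ins for the real training data.

```python
# ZCA whitening of flattened image patches.
import numpy as np


def zca_whiten(X, eps=1e-5):
    """X: (n_samples, n_features) matrix of flattened patches."""
    X = X - X.mean(axis=0)                       # centre each feature
    cov = X.T @ X / X.shape[0]
    s, U = np.linalg.eigh(cov)                   # eigendecomposition of the covariance
    W = U @ np.diag(1.0 / np.sqrt(s + eps)) @ U.T
    return X @ W, W


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    patches = rng.random((1000, 8 * 8))          # hypothetical 8x8 image patches
    whitened, W = zca_whiten(patches)
    cov_after = whitened.T @ whitened / whitened.shape[0]
    print("max off-diagonal covariance after ZCA:",
          float(np.abs(cov_after - np.diag(np.diag(cov_after))).max()))
```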

  20. Aroma characterization based on aromatic series analysis in table grapes

    PubMed Central

    Wu, Yusen; Duan, Shuyan; Zhao, Liping; Gao, Zhen; Luo, Meng; Song, Shiren; Xu, Wenping; Zhang, Caixi; Ma, Chao; Wang, Shiping

    2016-01-01

    Aroma is an important part of quality in table grapes, but the key aroma compounds and the aroma series of table grapes remain unknown. In this paper, we identified 67 aroma compounds in 20 table grape cultivars; 20 in pulp and 23 in skin were active compounds. C6 compounds were the basic background volatiles, but the aroma contents of pulp juice and skin depended mainly on the levels of esters and terpenes, respectively. Most notably, the ‘Kyoho’ grapevine series showed high contents of esters in pulp, while Muscat/floral cultivars showed abundant monoterpenes in skin. For the aroma series, table grapes were characterized mainly by herbaceous, floral, balsamic, sweet and fruity series. Simple and visualizable aroma profiles were established using aroma fingerprints based on the aromatic series. Hierarchical cluster analysis (HCA) and principal component analysis (PCA) showed that the aroma profiles of pulp juice, skin and whole berries could be classified into 5, 3, and 5 groups, respectively. Combined with sensory evaluation, we could conclude that the fatty and balsamic series were the preferred aromatic series, and the contents of their contributors (β-ionone and octanal) may be useful as indicators for the improvement of breeding and cultivation measures for table grapes. PMID:27487935

  1. Discrete Mathematical Approaches to Graph-Based Traffic Analysis

    SciTech Connect

    Joslyn, Cliff A.; Cowley, Wendy E.; Hogan, Emilie A.

    2014-04-01

    Modern cyber defense and analytics requires general, formal models of cyber systems. Multi-scale network models are prime candidates for such formalisms, using discrete mathematical methods based in hierarchically-structured directed multigraphs which also include rich sets of labels. An exemplar of an application of such an approach is traffic analysis, that is, observing and analyzing connections between clients, servers, hosts, and actors within IP networks, over time, to identify characteristic or suspicious patterns. Towards that end, NetFlow (or more generically, IPFLOW) data are available from routers and servers which summarize coherent groups of IP packets flowing through the network. In this paper, we consider traffic analysis of NetFlow using both basic graph statistics and two new mathematical measures involving labeled degree distributions and time interval overlap measures. We do all of this over the VAST test data set of 96M synthetic NetFlow graph edges, against which we can identify characteristic patterns of simulated ground-truth network attacks.
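
    One of the measures mentioned above, a labeled degree distribution over a directed multigraph of flow records, can be sketched in a few lines: for each source node, count outgoing flows per label. The flow triples below are made up, not the VAST data set, and the labels stand in for whatever edge labels the analysis uses.

```python
# Labeled out-degree distribution over NetFlow-style (source, destination, label) edges.
from collections import Counter, defaultdict

flows = [
    ("10.0.0.1", "10.0.0.7", "http"),
    ("10.0.0.1", "10.0.0.7", "http"),
    ("10.0.0.1", "10.0.0.9", "dns"),
    ("10.0.0.2", "10.0.0.7", "ssh"),
    ("10.0.0.2", "10.0.0.9", "dns"),
]

labeled_out_degree = defaultdict(Counter)
for src, dst, label in flows:
    labeled_out_degree[src][label] += 1          # multigraph: parallel edges all count

for node, counts in labeled_out_degree.items():
    print(node, dict(counts))
```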

  2. Codebook-based electrooculography data analysis towards cognitive activity recognition.

    PubMed

    Lagodzinski, P; Shirahama, K; Grzegorzek, M

    2018-04-01

    With the advancement of mobile/wearable technology, people have started to use a variety of sensing devices to track their daily activities as well as their health and fitness conditions in order to improve quality of life. This work addresses eye movement analysis, which, due to its strong correlation with cognitive tasks, can be successfully utilized in activity recognition. Eye movements are recorded using an electrooculographic (EOG) system built into the frames of glasses, which can be worn more unobtrusively and comfortably than other devices. Since the obtained information is low-level sensor data expressed as a sequence of values sampled at constant intervals (100 Hz), the cognitive activity recognition problem is formulated as sequence classification. However, it is unclear what kind of features are useful for accurate cognitive activity recognition. Thus, a machine learning algorithm such as the codebook approach is applied, which, instead of focusing on feature engineering, uses a distribution of characteristic subsequences (codewords) to describe sequences of recorded EOG data, where the codewords are obtained by clustering a large number of subsequences. Further, statistical analysis of the codeword distribution results in discovering features which are characteristic of a certain activity class. Experimental results demonstrate good accuracy of the codebook-based cognitive activity recognition, reflecting the effective usage of the codewords. Copyright © 2017 Elsevier Ltd. All rights reserved.
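
    The codebook idea can be sketched as follows: sliding-window subsequences are extracted from the signal, clustered into codewords with k-means, and each recording is then represented by its codeword histogram, which serves as the feature vector for classification. The synthetic "EOG" signals, window length, and codebook size below are all assumptions.

```python
# Codebook features: cluster sliding-window subsequences, then histogram the codewords.
import numpy as np
from sklearn.cluster import KMeans


def subsequences(signal, window, step):
    return np.array([signal[i:i + window] for i in range(0, len(signal) - window + 1, step)])


def codeword_histogram(signal, codebook, window, step):
    words = codebook.predict(subsequences(signal, window, step))
    hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
    return hist / hist.sum()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    window, step, n_codewords = 50, 10, 8
    # Synthetic recordings: saccade-like square jumps vs. slow drift.
    reading = np.repeat(rng.choice([-1.0, 1.0], 60), 20) + 0.05 * rng.standard_normal(1200)
    resting = np.cumsum(0.02 * rng.standard_normal(1200))
    training = np.vstack([subsequences(reading, window, step),
                          subsequences(resting, window, step)])
    codebook = KMeans(n_clusters=n_codewords, n_init=10, random_state=0).fit(training)
    print("reading:", np.round(codeword_histogram(reading, codebook, window, step), 2))
    print("resting:", np.round(codeword_histogram(resting, codebook, window, step), 2))
```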

  3. Consumer’s market analysis of products based on cassava

    NASA Astrophysics Data System (ADS)

    Unteawati, Bina; Fitriani; Fatih, Cholid

    2018-03-01

    Cassava products play an important role in enhancing household income in rural areas. Cassava as a raw food material is plentiful as a local food in Lampung, and cassava processing is a strategic value-addition activity; such value-addition activities are key to creating additional income sources in rural areas. Households produce cassava products as snacks or additional food, operating on a small scale with traditional and discontinuous production, and they lack technology, capital, and market access. Measuring the sustainability of their business is therefore important, as markets increasingly drive business globally. This research aims to (1) describe the demand for locally produced cassava products in rural areas and (2) analyze consumers' perceptions of cassava products. The research took place in Lampung Province, covering Bandar Lampung and Metro City and the Pringsewu, Pesawaran, Central Lampung, and East Lampung districts, and was held from February until April 2017. Data were analyzed by descriptive statistics and multidimensional scaling. Based on the analysis we conclude that (1) the demand for cassava products from rural areas was large in volume and regular, with substantial transactions. This is very important for the business cycle: continuous consumer demand will sustain the production of cassava products, producers will buy fresh cassava from farmers, and regular consumption of fresh cassava by rural home industries will help balance the farm-gate price of fresh cassava; (2) consumers' perceptions of cassava products in different markets showed that they prefer cassava chips over other cassava products, followed by crackers, opak, and tiwul rice. Urban consumers prefer cassava products as snacks (chips, crumbs, and opak), with a consumption frequency of 2-5 times per week and purchase volumes of 1-3 kg, while consumers in rural areas consume them more frequently, on a daily basis. Multidimensional scaling

  4. A network-based analysis of CMIP5 "historical" experiments

    NASA Astrophysics Data System (ADS)

    Bracco, A.; Foudalis, I.; Dovrolis, C.

    2012-12-01

    In computer science, "complex network analysis" refers to a set of metrics, modeling tools and algorithms commonly used in the study of complex nonlinear dynamical systems. Its main premise is that the underlying topology or network structure of a system has a strong impact on its dynamics and evolution. By allowing local and non-local statistical interactions to be investigated, network analysis provides a powerful, but only marginally explored, framework to validate climate models and investigate teleconnections, assessing their strength, range, and impacts on the climate system. In this work we propose a new, fast, robust and scalable methodology to examine, quantify, and visualize climate sensitivity, while constraining general circulation model (GCM) outputs with observations. The goal of our novel approach is to uncover relations in the climate system that are not (or not fully) captured by more traditional methodologies used in climate science and often adopted from nonlinear dynamical systems analysis, and to explain known climate phenomena in terms of the network structure or its metrics. Our methodology is based on a solid theoretical framework and employs mathematical and statistical tools that have been exploited only tentatively in climate research so far. Suitably adapted to the climate problem, these tools can assist in visualizing the trade-offs in representing global links and teleconnections among different data sets. Here we present the methodology and compare network properties for different reanalysis data sets and a suite of CMIP5 coupled GCM outputs. With an extensive model intercomparison in terms of the climate network that each model leads to, we quantify how each model reproduces major teleconnections, rank model performances, and identify common or specific errors in comparing model outputs and observations.

  5. Computer based imaging and analysis of root gravitropism

    NASA Technical Reports Server (NTRS)

    Evans, M. L.; Ishikawa, H.

    1997-01-01

    Two key issues in studies of the nature of the gravitropic response in roots have been the determination of the precise pattern of differential elongation responsible for downward bending and the identification of the cells that show the initial motor response. The main approach for examining patterns of differential growth during root gravitropic curvature has been to apply markers to the root surface and photograph the root at regular intervals during gravitropic curvature. Although these studies have provided valuable information on the characteristics of the gravitropic motor response in roots, their labor intensive nature limits sample size and discourages both high frequency of sampling and depth of analysis of surface expansion data. In this brief review we describe the development of computer-based video analysis systems for automated measurement of root growth and shape change and discuss some key features of the root gravitropic response that have been revealed using this methodology. We summarize the capabilities of several new pieces of software designed to measure growth and shape changes in graviresponding roots and describe recent progress in developing analysis systems for studying the small, but experimentally popular, primary roots of Arabidopsis. A key finding revealed by such studies is that the initial gravitropic response of roots of maize and Arabidopsis occurs in the distal elongation zone (DEZ) near the root apical meristem, not in the main elongation zone. Another finding is that the initiation of rapid elongation in the DEZ following gravistimulation appears to be related to rapid membrane potential changes in this region of the root. These observations have provided the incentive for ongoing studies examining possible links between potential growth modifying factors (auxin, calcium, protons) and gravistimulated changes in membrane potential and growth patterns in the DEZ.

  6. Diffraction-analysis-based characterization of very fine gratings

    NASA Astrophysics Data System (ADS)

    Bischoff, Joerg; Truckenbrodt, Horst; Bauer, Joachim J.

    1997-09-01

    Fine gratings with spatial periods below one micron, either ruled mechanically or patterned holographically, play a key role as encoders in high-precision translational or rotational coordinate and measuring machines. Besides, the fast in-line characterization of submicron patterns is a stringent demand in recent microelectronic technology. Thus, a rapid, destruction-free and highly accurate measuring technique is required to ensure quality during manufacturing and for final testing. We propose an optical method which has already been successfully introduced in the semiconductor industry. Here, the inverse scatter problem inherent in this diffraction-based approach is overcome by sophisticated data analysis such as multivariate regression or neural networks. Briefly sketched, the procedure is as follows: certain diffraction efficiencies are measured with an optical angle-resolved scatterometer and assigned to a number of profile parameters via data analysis (prediction). Beforehand, the specific measuring model has to be calibrated. If the wavelength-to-period ratio is well below unity, it is quite easy to gather enough diffraction orders. However, for gratings with spatial periods smaller than the probing wavelength, only the specular reflex will propagate for perpendicular incidence (zero-order grating). Consequently, it is virtually impossible to perform a regression analysis. A proper means to tackle this bottleneck is to record the zero-order reflex as a function of the incident angle. In this paper, the measurement of submicron gratings is discussed with the examples of 0.8, 1.0 and 1.4 micron period resist gratings on silicon, etched silicon oxide on silicon (same periods) and a 512 nm pitch chromium grating on quartz. Using a He-Ne laser with 633 nm wavelength and measuring the direct reflex in both linear polarizations, it is shown that even submicron patterning processes can be monitored and the resulting profiles with linewidths below a half micron can be

  7. Fine tuning breath-hold-based cerebrovascular reactivity analysis models.

    PubMed

    van Niftrik, Christiaan Hendrik Bas; Piccirelli, Marco; Bozinov, Oliver; Pangalu, Athina; Valavanis, Antonios; Regli, Luca; Fierstra, Jorn

    2016-02-01

    We elaborate on existing analysis methods for breath-hold (BH)-derived cerebrovascular reactivity (CVR) measurements and describe novel insights and models toward more exact CVR interpretation. Five blood-oxygen-level-dependent (BOLD) fMRI datasets of neurovascular patients with unilateral hemispheric hemodynamic impairment were used to test various BH CVR analysis methods. Temporal lag (phase), percent BOLD signal change (CVR), and explained variance (coherence) maps were calculated using three different sine models and two novel "optimal signal" model-free methods based on the unaffected hemisphere and the sagittal sinus fMRI signal time series, respectively. All models showed significant differences in CVR and coherence between the affected (hemodynamically impaired) and unaffected hemispheres. Voxel-wise phase determination significantly increases CVR (0.60 ± 0.18 vs. 0.82 ± 0.27; P < 0.05). Incorporating different durations of breath hold and resting period in one sine model (two-task) did increase coherence in the unaffected hemisphere, as well as eliminating the negative phase commonly obtained by one-task frequency models. The novel model-free "optimal signal" methods both explained the BOLD MR data similarly to the two-task sine model. Our CVR analysis demonstrates improved CVR and coherence after implementation of voxel-wise phase and frequency adjustment. The novel "optimal signal" methods provide a robust and feasible alternative to the sine models, as both are model-free and independent of compliance. Here, the sagittal sinus model may be advantageous, as it is independent of hemispheric CVR impairment.

  8. Worry about breast cancer recurrence: a population-based analysis.

    PubMed

    Tewari, Apoorva; Chagpar, Anees B

    2014-07-01

    As more patients with breast cancer survive treatment, the importance of their long-term quality of life is increasing. One important concern for many survivors is fear of recurrence. To better understand worry about recurrence, we conducted a population-based statistical analysis. The National Health Interview Survey (NHIS) is the largest annual source of health information for the U.S. population. We obtained data from the 2010 survey, which asked breast cancer survivors about their fear of recurrence and quality of life. Data were analyzed using SUDAAN software. The 2010 NHIS sample represented 2,668,697 breast cancer survivors. On univariate analysis, worry about recurrence was correlated with current age (P = 0.03) and radiation therapy (P = 0.04). Worry was strongly associated with perceived risk of recurrence (P < 0.01) and decreased overall quality of life (P < 0.01) as well as lower self-reported physical (P < 0.01) and mental (P < 0.01) health and poor satisfaction with social activities and relationships (P < 0.01). On multivariate analysis, worry was not independently associated with decreased quality of life (P = 0.09). However, those who "always worried" about recurrence had a lower quality of life (odds ratio, 0.06; 95% confidence interval, 0.01 to 0.45). Worry about recurrence among breast cancer survivors is associated with age and radiation therapy and is correlated with self-reported physical health, mental health, social relationships, and overall quality of life. It is a significant predictor of decreased quality of life in those who worry the most. Screening for worry about recurrence is an important measure for the improvement of quality of life among breast cancer survivors.

  9. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis.

    PubMed

    Greensmith, David J

    2014-01-01

    Here I present an Excel based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps which convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs equally well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow. Copyright © 2013 The Author. Published by Elsevier Ireland Ltd. All rights reserved.

  10. Optical coherence tomography imaging based on non-harmonic analysis

    NASA Astrophysics Data System (ADS)

    Cao, Xu; Hirobayashi, Shigeki; Chong, Changho; Morosawa, Atsushi; Totsuka, Koki; Suzuki, Takuya

    2009-11-01

    A new processing technique called Non-Harmonic Analysis (NHA) is proposed for OCT imaging. Conventional Fourier-domain OCT relies on the FFT calculation, which depends on the window function and length. Axial resolution is inversely proportional to the FFT frame length, which is limited by the swept range of the swept source in SS-OCT or by the pixel count of the CCD in SD-OCT, so the resolution is degraded in FD-OCT. However, the NHA process is intrinsically free from these trade-offs; NHA can resolve high frequencies without being influenced by the window function or the frame length of the sampled data. In this study, the NHA process is explained and applied to OCT imaging and compared with OCT images based on the FFT. In order to validate the benefit of NHA in OCT, we carried out OCT imaging based on NHA with three different samples: onion skin, human skin and pig eye. The results show that the NHA process can achieve a practical image resolution equivalent to that of a 100 nm swept range while using a wavelength range reduced to less than half.

  11. Factor analysis and predictive validity of microcomputer-based tests

    NASA Technical Reports Server (NTRS)

    Kennedy, R. S.; Baltzley, D. R.; Turnage, J. J.; Jones, M. B.

    1989-01-01

    Eleven tests were selected from two microcomputer-based performance test batteries because these tests had previously exhibited rapid stability (less than 10 min. of practice) and high retest reliability efficiencies (r greater than 0.707 for each 3 min. of testing). The battery was administered three times to each of 108 college students (48 men and 60 women) and a factor analysis was performed. Two of the three identified factors appear to be related to information processing ("encoding" and "throughput/decoding"), and the third was named an "output/speed" factor. The spatial, memory, and verbal tests loaded on the "encoding" factor and included Grammatical Reasoning, Pattern Comparison, Continuous Recall, and Matrix Rotation. The "throughput/decoding" tests included perceptual/numerical tests like Math Processing, Code Substitution, and Pattern Comparison. The output/speed factor was identified by Tapping and Reaction Time tests. The Wonderlic Personnel Test was group administered before the first and after the last administration of the performance tests. The multiple Rs in the total sample between the combined Wonderlic as a criterion and less than 5 min. of microcomputer testing on Grammatical Reasoning and Math Processing as predictors ranged between 0.41 and 0.52 on the three test administrations. Based on these results, the authors recommend a core battery which, if time permits, would consist of two tests from each factor. Such a battery is now known to permit stable, reliable, and efficient assessment.

  12. Advanced microgrid design and analysis for forward operating bases

    NASA Astrophysics Data System (ADS)

    Reasoner, Jonathan

    This thesis takes a holistic approach to creating an improved electric power generation system for a forward operating base (FOB) of the future through the design of an isolated microgrid. After an extensive literature search, this thesis found a need for drastic improvement of the FOB power system. A thorough design process analyzed FOB demand, researched demand-side management improvements, evaluated various generation sources and energy storage options, and performed a HOMER® discrete optimization to determine the best microgrid design. Further sensitivity analysis was performed to see how changing parameters would affect the outcome. Lastly, this research also looks at some of the challenges associated with incorporating a design which relies heavily on inverter-based generation sources, and gives possible solutions to help make a renewable-energy-powered microgrid a reality. While this thesis uses a FOB as the case study, the process and discussion can be adapted to aid in the design of an off-grid, small-scale power grid which utilizes high penetration levels of renewable energy.

  13. A probabilistic model of emphysema based on granulometry analysis

    NASA Astrophysics Data System (ADS)

    Marcos, J. V.; Nava, R.; Cristobal, G.; Munoz-Barrutia, A.; Escalante-Ramírez, B.; Ortiz-de-Solórzano, C.

    2013-11-01

    Emphysema is associated with the destruction of lung parenchyma, resulting in abnormal enlargement of airspaces. Accurate quantification of emphysema is required for a better understanding of the disease as well as for the assessment of drugs and treatments. In the present study, a novel method for emphysema characterization from histological lung images is proposed. Mice with elastase-induced emphysema were used to simulate the effect of the disease on the lungs. A database composed of 50 normal and 50 emphysematous lung patches of size 512 x 512 pixels was used in our experiments. The purpose is to automatically identify those patches containing emphysematous tissue. The proposed approach is based on granulometry analysis, which provides a pattern spectrum describing the distribution of airspaces in the lung region under evaluation. The profile of the spectrum was summarized by a set of statistical features. A logistic regression model was then used to estimate the probability that a patch is emphysematous from this feature set. An accuracy of 87% was achieved by our method in the classification between normal and emphysematous samples. This result shows the utility of our granulometry-based method for quantifying the lesions due to emphysema.
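
    As a minimal sketch of the pipeline described above (grey-scale openings of increasing size yield a pattern spectrum; summary statistics of that spectrum feed a logistic regression), the following assumes 2-D grey-scale patches. The feature set, opening sizes, and solver settings are illustrative, not the paper's exact configuration.

        import numpy as np
        from scipy import ndimage
        from sklearn.linear_model import LogisticRegression

        def pattern_spectrum(img, max_size=15):
            """Granulometry: image 'mass' removed by grey openings of increasing size."""
            volumes = [ndimage.grey_opening(img, size=(k, k)).sum()
                       for k in range(1, max_size + 1)]
            spectrum = -np.diff(volumes)                 # mass removed at each scale
            return spectrum / (spectrum.sum() + 1e-12)   # normalised pattern spectrum

        def spectrum_features(spectrum):
            """Summarise the spectrum with a few simple statistics (illustrative)."""
            scales = np.arange(1, len(spectrum) + 1)
            mean = (scales * spectrum).sum()
            std = np.sqrt(((scales - mean) ** 2 * spectrum).sum())
            return np.array([mean, std, spectrum.max(), spectrum.argmax() + 1])

        def train_classifier(patches, labels):
            """patches: list of 2-D arrays; labels: 1 = emphysematous, 0 = normal."""
            X = np.vstack([spectrum_features(pattern_spectrum(p)) for p in patches])
            return LogisticRegression(max_iter=1000).fit(X, labels)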

  14. IMU-Based Joint Angle Measurement for Gait Analysis

    PubMed Central

    Seel, Thomas; Raisch, Jorg; Schauer, Thomas

    2014-01-01

    This contribution is concerned with joint angle calculation based on inertial measurement data in the context of human motion analysis. Unlike most robotic devices, the human body lacks even surfaces and right angles. Therefore, we focus on methods that avoid assuming certain orientations in which the sensors are mounted with respect to the body segments. After a review of available methods that may cope with this challenge, we present a set of new methods for: (1) joint axis and position identification; and (2) flexion/extension joint angle measurement. In particular, we propose methods that use only gyroscopes and accelerometers and, therefore, do not rely on a homogeneous magnetic field. We provide results from gait trials of a transfemoral amputee in which we compare the inertial measurement unit (IMU)-based methods to an optical 3D motion capture system. Unlike most authors, we place the optical markers on anatomical landmarks instead of attaching them to the IMUs. Root mean square errors of the knee flexion/extension angles are found to be less than 1° on the prosthesis and about 3° on the human leg. For the plantar/dorsiflexion of the ankle, both deviations are about 1°. PMID:24743160
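
    The paper's joint-axis identification and sensor-fusion steps are not reproduced here. As a simplified sketch, once a joint axis is known in each segment's sensor frame, a (drift-prone) flexion/extension angle can be obtained by integrating the difference of the angular rates projected onto that axis; the authors additionally fuse this with accelerometer-based estimates, which is omitted below. Variable names are illustrative.

        import numpy as np

        def flexion_angle_gyro(gyr_upper, gyr_lower, axis_upper, axis_lower, fs):
            """Flexion/extension angle from two segment-mounted gyroscopes.

            gyr_*: (N, 3) angular rates in rad/s in each sensor frame.
            axis_*: unit joint-axis vector in the respective sensor frame,
                    assumed to come from a prior joint-axis identification step.
            Pure integration drifts; accelerometer fusion is omitted in this sketch.
            """
            rate = gyr_upper @ axis_upper - gyr_lower @ axis_lower  # relative rate about the axis
            angle = np.cumsum(rate) / fs                            # rectangular integration
            return np.degrees(angle)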

  15. Agent-based simulation for human-induced hazard analysis.

    PubMed

    Bulleit, William M; Drewek, Matthew W

    2011-02-01

    Terrorism could be treated as a hazard for design purposes. For instance, the terrorist hazard could be analyzed in a manner similar to the way that seismic hazard is handled. No matter how terrorism is dealt with in the design of systems, the need for predictions of the frequency and magnitude of the hazard will be required. And, if the human-induced hazard is to be designed for in a manner analogous to natural hazards, then the predictions should be probabilistic in nature. The model described in this article is a prototype model that used agent-based modeling (ABM) to analyze terrorist attacks. The basic approach in this article of using ABM to model human-induced hazards has been preliminarily validated in the sense that the attack magnitudes seem to be power-law distributed and attacks occur mostly in regions where high levels of wealth pass through, such as transit routes and markets. The model developed in this study indicates that ABM is a viable approach to modeling socioeconomic-based infrastructure systems for engineering design to deal with human-induced hazards. © 2010 Society for Risk Analysis.

  16. Perceptual security of encrypted images based on wavelet scaling analysis

    NASA Astrophysics Data System (ADS)

    Vargas-Olmos, C.; Murguía, J. S.; Ramírez-Torres, M. T.; Mejía Carlos, M.; Rosu, H. C.; González-Aguilar, H.

    2016-08-01

    The scaling behavior of the pixel fluctuations of encrypted images is evaluated by using the detrended fluctuation analysis based on wavelets, a modern technique that has been successfully used recently for a wide range of natural phenomena and technological processes. As encryption algorithms, we use the Advanced Encryption System (AES) in RBT mode and two versions of a cryptosystem based on cellular automata, with the encryption process applied both fully and partially by selecting different bitplanes. In all cases, the results show that the encrypted images in which no understandable information can be visually appreciated and whose pixels look totally random present a persistent scaling behavior with the scaling exponent α close to 0.5, implying no correlation between pixels when the DFA with wavelets is applied. This suggests that the scaling exponents of the encrypted images can be used as a perceptual security criterion in the sense that when their values are close to 0.5 (the white noise value) the encrypted images are more secure also from the perceptual point of view.
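
    The study uses a wavelet-based variant of detrended fluctuation analysis; the conventional polynomial-detrended DFA sketched below is a stand-in that estimates the same kind of scaling exponent (alpha near 0.5 for uncorrelated pixel fluctuations). Scales and detrending order are illustrative.

        import numpy as np

        def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
            """Conventional order-1 DFA; returns the scaling exponent alpha."""
            y = np.cumsum(x - np.mean(x))          # integrated profile
            flucts = []
            for s in scales:
                n_seg = len(y) // s
                segs = y[:n_seg * s].reshape(n_seg, s)
                t = np.arange(s)
                f2 = []
                for seg in segs:
                    coef = np.polyfit(t, seg, 1)   # local linear trend
                    f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
                flucts.append(np.sqrt(np.mean(f2)))
            return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

        # For an encrypted image, the row-concatenated pixel fluctuations should
        # give alpha close to 0.5:  alpha = dfa_exponent(img.astype(float).ravel())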

  17. Mouse-based genetic modeling and analysis of Down syndrome

    PubMed Central

    Xing, Zhuo; Li, Yichen; Pao, Annie; Bennett, Abigail S.; Tycko, Benjamin; Mobley, William C.; Yu, Y. Eugene

    2016-01-01

    Introduction: Down syndrome (DS), caused by human trisomy 21 (Ts21), can be considered a prototypical model for understanding the effects of chromosomal aneuploidies in other diseases. Human chromosome 21 (Hsa21) is syntenically conserved with three regions in the mouse genome. Sources of data: A review of recent advances in genetic modeling and analysis of DS. Using Cre/loxP-mediated chromosome engineering, a substantial number of new mouse models of DS have recently been generated, which facilitates better understanding of disease mechanisms in DS. Areas of agreement: Based on evolutionary conservation, Ts21 can be modeled by engineered triplication of Hsa21 syntenic regions in mice. The validity of the models is supported by the exhibition of DS-related phenotypes. Areas of controversy: Although substantial progress has been made, it remains a challenge to unravel the relative importance of specific candidate genes and molecular mechanisms underlying the various clinical phenotypes. Growing points: Further understanding of mechanisms based on data from mouse models, in parallel with human studies, may lead to novel therapies for clinical manifestations of Ts21 and insights into the roles of aneuploidies in other developmental disorders and cancers. PMID:27789459

  18. Poka Yoke system based on image analysis and object recognition

    NASA Astrophysics Data System (ADS)

    Belu, N.; Ionescu, L. M.; Misztal, A.; Mazăre, A.

    2015-11-01

    Poka Yoke is a quality management method aimed at preventing faults from arising during production processes; it deals with "fail-safing" or "mistake-proofing". The Poka Yoke concept was created and developed by Shigeo Shingo for the Toyota Production System. Poka Yoke is used in many fields, especially in monitoring production processes. In many cases, identifying faults in a production process involves a cost higher than the cost of disposal itself. Usually, Poka Yoke solutions are based on multiple sensors that identify nonconformities, which means placing additional equipment (mechanical, electronic) on the production line. As a consequence, and because the method itself is invasive and affects the production process, the cost of diagnostics increases, and the machines by which a Poka Yoke system is implemented become bulkier and more sophisticated. In this paper we propose a solution for a Poka Yoke system based on image analysis and identification of faults. The solution consists of a module for image acquisition, mid-level processing, and an object recognition module using an associative memory (Hopfield network type). All are integrated into an embedded system with an AD (analog-to-digital) converter and a Zynq 7000 device (22 nm technology).
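
    The record does not give the network details; the sketch below is a minimal binary Hopfield associative memory with Hebbian storage, illustrating how binarised templates of conforming parts could be stored and recalled from a noisy camera patch. Pattern size, update scheme, and the usage comment are assumptions, not the authors' implementation.

        import numpy as np

        class Hopfield:
            """Minimal binary (+1/-1) Hopfield associative memory with Hebbian learning."""

            def __init__(self, patterns):
                patterns = np.asarray(patterns, dtype=float)   # shape (P, N), entries +-1
                n = patterns.shape[1]
                self.w = (patterns.T @ patterns) / n
                np.fill_diagonal(self.w, 0.0)                  # no self-connections

            def recall(self, probe, n_iter=20):
                s = np.sign(np.asarray(probe, dtype=float))
                for _ in range(n_iter):                        # synchronous updates
                    s = np.sign(self.w @ s)
                    s[s == 0] = 1
                return s

        # Usage idea: store binarised templates of conforming parts, then feed a
        # binarised camera patch; convergence to a stored template indicates a
        # recognised (conforming) part, otherwise a possible nonconformity.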

  19. Neural Representation. A Survey-Based Analysis of the Notion

    PubMed Central

    Vilarroya, Oscar

    2017-01-01

    The word representation (as in “neural representation”), and many of its related terms, such as to represent, representational and the like, play a central explanatory role in neuroscience literature. For instance, in “place cell” literature, place cells are extensively associated with their role in “the representation of space.” In spite of its extended use, we still lack a clear, universal and widely accepted view on what it means for a nervous system to represent something, on what makes a neural activity a representation, and on what is re-presented. The lack of a theoretical foundation and definition of the notion has not hindered actual research. My aim here is to identify how active scientists use the notion of neural representation, and eventually to list a set of criteria, based on actual use, that can help in distinguishing between genuine or non-genuine neural-representation candidates. In order to attain this objective, I present first the results of a survey of authors within two domains, place-cell and multivariate pattern analysis (MVPA) research. Based on the authors’ replies, and on a review of neuroscientific research, I outline a set of common properties that an account of neural representation seems to require. I then apply these properties to assess the use of the notion in two domains of the survey, place-cell and MVPA studies. I conclude by exploring a shift in the notion of representation suggested by recent literature. PMID:28900406

  20. Skin injury model classification based on shape vector analysis

    PubMed Central

    2012-01-01

    Background: Skin injuries can be crucial in judicial decision making. Forensic experts base their classification on subjective opinions. This study investigates whether known classes of simulated skin injuries are correctly classified statistically based on 3D surface models and derived numerical shape descriptors. Methods: Skin injury surface characteristics are simulated with plasticine. Six injury classes - abrasions, incised wounds, gunshot entry wounds, smooth and textured strangulation marks, as well as patterned injuries - with 18 instances each are used for a k-fold cross validation with six partitions. Deformed plasticine models are captured with a 3D surface scanner. Mean curvature is estimated for each polygon surface vertex. Subsequently, distance distributions and derived aspect ratios, convex hulls, concentric spheres, hyperbolic points and Fourier transforms are used to generate 1284-dimensional shape vectors. Subsequent descriptor reduction maximizing SNR (signal-to-noise ratio) results in an average of 41 descriptors (varying across k-folds). With a non-normal multivariate distribution of heteroskedastic data, the requirements for LDA (linear discriminant analysis) are not met. Thus, the shrinkage parameters of RDA (regularized discriminant analysis) are optimized, yielding the best performance with λ = 0.99 and γ = 0.001. Results: The receiver operating characteristic of a descriptive RDA yields an ideal area under the curve of 1.0 for all six categories. Predictive RDA results in an average CRR (correct recognition rate) of 97.22% under a 6-partition k-fold. Adding uniform noise within the range of one standard deviation degrades the average CRR to 71.3%. Conclusions: Digitized 3D surface shape data can be used to automatically classify idealized shape models of simulated skin injuries. Deriving some well established descriptors such as histograms, saddle shape of hyperbolic points or convex hulls with subsequent reduction of dimensionality while maximizing SNR

  1. Accelerator-based chemical and elemental analysis of atmospheric aerosols

    NASA Astrophysics Data System (ADS)

    Mentes, Besim

    obtained as volatile and non-volatile fractions, analysis of acidic aerosols is possible, aerosols can be size-fractionated using a cascade impactor as collection device, total analysis time for a sample is around 45 min, the sample mass load is from around 1 to 30 μg/cm2. An intercomparison of IBT and ion chromatography (IC) when a DMPS system was used as a reference instrument has been performed (Paper IV). Ions of K, Na, SO4, NO3 and NH4 were determined and quantified by both IBT and IC. The intercomparison showed that the procedure used in IBT does not suffer from any selective losses, especially not from the NO3 and NH4 compounds, which exhibit an appreciable interaction with the gas phase as NH3 and HNO3. An impactor-based aerosol sampler for upper tropospheric conditions has been developed (Paper V). Despite the low aerosol concentration at that altitude the sulphur concentration can be measured, with a detection limit of 1 ng/m 3 for one hour sampling by optimising parameters in the use of PIXE analysis.

  2. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  3. Community-based care for the specialized management of heart failure: an evidence-based analysis.

    PubMed

    2009-01-01

    In August 2008, the Medical Advisory Secretariat (MAS) presented a vignette to the Ontario Health Technology Advisory Committee (OHTAC) on a proposed targeted health care delivery model for chronic care. The proposed model was defined as multidisciplinary, ambulatory, community-based care that bridged the gap between primary and tertiary care, and was intended for individuals with a chronic disease who were at risk of a hospital admission or emergency department visit. The goals of this care model were thought to include: the prevention of emergency department visits, a reduction in hospital admissions and re-admissions, facilitation of earlier hospital discharge, a reduction or delay in long-term care admissions, and an improvement in mortality and other disease-specific patient outcomes. OHTAC approved the development of an evidence-based assessment to determine the effectiveness of specialized community-based care for the management of heart failure, Type 2 diabetes and chronic wounds. PLEASE VISIT THE MEDICAL ADVISORY SECRETARIAT WEB SITE AT: www.health.gov.on.ca/ohtas to review the following reports associated with the Specialized Multidisciplinary Community-Based Care series: "Specialized multidisciplinary community-based care series: a summary of evidence-based analyses"; "Community-based care for the specialized management of heart failure: an evidence-based analysis"; "Community-based care for chronic wound management: an evidence-based analysis". Please note that the evidence-based analysis of specialized community-based care for the management of diabetes titled "Community-based care for the management of type 2 diabetes: an evidence-based analysis" has been published as part of the Diabetes Strategy Evidence Platform at this URL: http://www.health.gov.on.ca/english/providers/program/mas/tech/ohtas/tech_diabetes_20091020.html PLEASE VISIT THE TORONTO HEALTH ECONOMICS AND TECHNOLOGY ASSESSMENT COLLABORATIVE WEB SITE AT: http

  4. Evidence-based Neuro Linguistic Psychotherapy: a meta-analysis.

    PubMed

    Zaharia, Cătălin; Reiner, Melita; Schütz, Peter

    2015-12-01

    The Neuro Linguistic Programming (NLP) framework has enjoyed enormous popularity in the field of applied psychology. NLP has been used in business, education, law, medicine and psychotherapy to identify people's patterns and alter their responses to stimuli, so they are better able to regulate their environment and themselves. NLP looks at achieving goals, creating stable relationships, eliminating barriers such as fears and phobias, building self-confidence and self-esteem, and achieving peak performance. Neuro Linguistic Psychotherapy (NLPt) encompasses NLP as a framework and set of interventions in the treatment of individuals with different psychological and/or social problems. We aimed to systematically analyse the available data regarding the effectiveness of Neuro Linguistic Psychotherapy (NLPt). The present work is a meta-analysis of studies, observational or randomized controlled trials, evaluating the efficacy of Neuro Linguistic Programming in individuals with different psychological and/or social problems. The following databases were searched to identify studies in English and German: CENTRAL in the Cochrane Library; PubMed; ISI Web of Knowledge (including results from Medline and the Web of Science); PsycINFO (including PsycARTICLES); Psyndex; Deutschsprachige Diplomarbeiten der Psychologie (database of theses in Psychology in German language); Social SciSearch; National Library of Health; and two NLP-specific research databases: one from the NLP Community (http://www.nlp.de/cgi-bin/research/nlprdb.cgi?action=res_entries) and one from the NLP Group (http://www.nlpgrup.com/bilimselarastirmalar/bilimsel-arastirmalar-4.html#Zweig154). From a total number of 425 studies, 350 were removed and considered not relevant based on the title and abstract. Included in the final analysis are 12 studies, with numbers of participants ranging between 12 and 115 subjects. The vast majority of studies were prospective observational. The actual paper represents the first

  5. GaitaBase: Web-based repository system for gait analysis.

    PubMed

    Tirosh, Oren; Baker, Richard; McGinley, Jenny

    2010-02-01

    The need to share gait analysis data to improve clinical decision support has been recognised since the early 1990s. GaitaBase has been established to provide a web-accessible repository system of gait analysis data to improve the sharing of data across local and international clinical and research community. It is used by several clinical and research groups across the world providing cross-group access permissions to retrieve and analyse the data. The system is useful for bench-marking and quality assurance, clinical consultation, and collaborative research. It has the capacity to increase the population sample size and improve the quality of 'normative' gait data. In addition the accumulated stored data may facilitate clinicians in comparing their own gait data with others, and give a valuable insight into how effective specific interventions have been for others. 2009 Elsevier Ltd. All rights reserved.

  6. Precursor Analysis for Flight- and Ground-Based Anomaly Risk Significance Determination

    NASA Technical Reports Server (NTRS)

    Groen, Frank

    2010-01-01

    This slide presentation reviews the precursor analysis for flight and ground based anomaly risk significance. It includes information on accident precursor analysis, real models vs. models, and probabilistic analysis.

  7. Cluster-based analysis of multi-model climate ensembles

    NASA Astrophysics Data System (ADS)

    Hyde, Richard; Hossaini, Ryan; Leeson, Amber A.

    2018-06-01

    Clustering - the automated grouping of similar data - can provide powerful and unique insight into large and complex data sets, in a fast and computationally efficient manner. While clustering has been used in a variety of fields (from medical image processing to economics), its application within atmospheric science has been fairly limited to date, and the potential benefits of the application of advanced clustering techniques to climate data (both model output and observations) has yet to be fully realised. In this paper, we explore the specific application of clustering to a multi-model climate ensemble. We hypothesise that clustering techniques can provide (a) a flexible, data-driven method of testing model-observation agreement and (b) a mechanism with which to identify model development priorities. We focus our analysis on chemistry-climate model (CCM) output of tropospheric ozone - an important greenhouse gas - from the recent Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP). Tropospheric column ozone from the ACCMIP ensemble was clustered using the Data Density based Clustering (DDC) algorithm. We find that a multi-model mean (MMM) calculated using members of the most-populous cluster identified at each location offers a reduction of up to ˜ 20 % in the global absolute mean bias between the MMM and an observed satellite-based tropospheric ozone climatology, with respect to a simple, all-model MMM. On a spatial basis, the bias is reduced at ˜ 62 % of all locations, with the largest bias reductions occurring in the Northern Hemisphere - where ozone concentrations are relatively large. However, the bias is unchanged at 9 % of all locations and increases at 29 %, particularly in the Southern Hemisphere. The latter demonstrates that although cluster-based subsampling acts to remove outlier model data, such data may in fact be closer to observed values in some locations. We further demonstrate that clustering can provide a viable and
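
    The paper uses the Data Density based Clustering (DDC) algorithm; the sketch below uses k-means as a stand-in to illustrate the subsampling step described above: at each grid point, cluster the ensemble members' values and average only the most-populous cluster. Array shapes, cluster count, and the choice of k-means are assumptions for illustration.

        import numpy as np
        from sklearn.cluster import KMeans

        def clustered_multi_model_mean(ensemble, n_clusters=3):
            """ensemble: (n_models, n_lat, n_lon), e.g. tropospheric column ozone.

            Returns a 'clustered MMM': at each grid point, the mean of the members
            falling in the most-populous cluster (k-means stands in for DDC here).
            """
            n_models, n_lat, n_lon = ensemble.shape
            mmm = np.empty((n_lat, n_lon))
            for i in range(n_lat):
                for j in range(n_lon):
                    vals = ensemble[:, i, j].reshape(-1, 1)
                    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(vals)
                    biggest = np.bincount(labels).argmax()
                    mmm[i, j] = vals[labels == biggest].mean()
            return mmm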

  8. The State of Phylogenetic Analysis: Narrow Visions and Simple Answers-Examples from the Diptera (flies).

    PubMed

    Borkent, Art

    2018-01-17

    The order Diptera is remarkably diverse, not only in species but in morphological variation in every life stage, making them excellent candidates for phylogenetic analysis. Such analysis has been hampered by methods that have severely restricted character state interpretation. Morphological-based phylogenies should be based on a deep understanding of the morphology, development and function of character states, and have extensive outgroup comparisons made to determine their polarity. Character states clearly vary in their value for determining phylogenetic relationships and this needs to be studied and utilized. Characters themselves need more explicit discussion, including how some may be developmentally or functionally related to other characters (and potentially not independent indicators of genealogical relationship). The current practice by many, of filling a matrix with poorly understood character states and highly limited outgroup comparisons, is unacceptable if the results are to be a valid reflection of the actual history of the group. Parsimony analysis is not an objective interpretation of phylogenetic relationships when all characters are treated as equal in value. Exact mathematical values applied to characters are entirely arbitrary and are generally used to produce a phylogeny that the author considers as reasonable. Mathematical appraisal of a given node is similarly inconsequential because characters do not have an intrinsic mathematical value. Bremer support, for example, provides values that have no biological reality but provide the pretence of objectivity. Cladists need to focus their attention on testing the validity of each synapomorphy proposed, as the basis for all further phylogenetic interpretation, rather than the testing of differing phylogenies through various comparative programs. Current phylogenetic analyses have come to increasingly depend on DNA sequence-based characters, in spite of their tumultuous history of inconsistent results

  9. Conceptual bases of Christian, faith-based substance abuse rehabilitation programs: qualitative analysis of staff interviews.

    PubMed

    McCoy, Lisa K; Hermos, John A; Bokhour, Barbara G; Frayne, Susan M

    2004-09-01

    Faith-based substance abuse rehabilitation programs provide residential treatment for many substance abusers. To determine the key governing concepts of such programs, we conducted semi-structured interviews with a sample of eleven clinical and administrative staff referred to us by program directors at six Evangelical Christian, faith-based, residential rehabilitation programs representing two large, nationwide networks. Qualitative analysis using grounded theory methods examined how spirituality is incorporated into treatment and elicited key theories of addiction and recovery. Although containing comprehensive secular components, the core activities are strongly rooted in a Christian belief system that informs their understanding of addiction and recovery and drives the treatment format. These governing conceptions, that addiction stems from attempts to fill a spiritual void through substance use and that recovery comes through salvation and a long-term relationship with God, provide an explicit, theory-driven model upon which they base their core treatment activities. Knowledge of these core concepts and practices should be helpful to clinicians in considering referrals to faith-based recovery programs.

  10. [Bibliometric analysis of current glaucoma research based on Pubmed database].

    PubMed

    Huang, Wen-bin; Wang, Wei; Zhou, Min-wen; Chen, Shi-da; Zhang, Xiu-lan

    2013-11-01

    To survey the distribution pattern and subject domain knowledge of worldwide glaucoma research based on literature in the Pubmed database. Literature on glaucoma published from 2007 to 2011 was identified in the Pubmed database. The analytic items for an article included publication year, country, language, author, and journal. After core MeSH terms had been characterized by BICOMS, a co-occurrence matrix was built. Cluster analysis was performed with SPSS 20.0, and the visualized network was then drawn using Ucinet 6.0. In total, 6427 articles were included; the number of annual articles changed slightly between 2007 and 2011. The United States, England, Germany, Australia, and France together accounted for 77.63% of articles. There were 52 high-frequency subjects, and hot topics were clustered into the following 10 categories: (1) pathology of the optic disc and nerve fibers and OCT application; (2) methods of visual field (VF) and visual function examination; (3) glaucoma drug medications; (4) pathology and physiology of primary open angle glaucoma (POAG), including VF and intraocular pressure (IOP); (5) glaucoma surgery; (6) gene research related to POAG; (7) glaucoma disease pathology and animal models; (8) ocular hypertension (OHT)-induced complications and corneal changes; (9) etiology of congenital glaucoma and complications; (10) etiology and epidemiology of glaucoma. The visualized domain knowledge mapping was successfully built. The pathology of the optic disc and nerve fibers, medications, and surgery were well developed. Study of IOP and the visual field was in the core domain, with important links to etiology, diagnosis, and therapy. Research on glaucoma genes, disease pathology models, congenital glaucoma, etiology and epidemiology was less developed, leaving considerable room for growth. The distribution pattern and subject domain knowledge of worldwide glaucoma research in the recent five years were shown by using bibliometric analysis. Western developed

  11. Spectral decomposition of asteroid Itokawa based on principal component analysis

    NASA Astrophysics Data System (ADS)

    Koga, Sumire C.; Sugita, Seiji; Kamata, Shunichi; Ishiguro, Masateru; Hiroi, Takahiro; Tatsumi, Eri; Sasaki, Sho

    2018-01-01

    The heliocentric stratification of asteroid spectral types may hold important information on the early evolution of the Solar System. Asteroid spectral taxonomy is based largely on principal component analysis. However, how the surface properties of asteroids, such as composition and age, are projected in the principal-component (PC) space is not understood well. We decompose multi-band disk-resolved visible spectra of the Itokawa surface with principal component analysis (PCA) in comparison with main-belt asteroids. The obtained distribution of Itokawa spectra projected in the PC space of main-belt asteroids follows a linear trend linking the Q-type and S-type regions and is consistent with the results of space-weathering experiments on ordinary chondrites and olivine, suggesting that this trend may be a space-weathering-induced spectral evolution track for S-type asteroids. Comparison with space-weathering experiments also yields a short average surface age (< a few million years) for Itokawa, consistent with the cosmic-ray-exposure time of returned samples from Itokawa. The Itokawa PC score distribution exhibits asymmetry along the evolution track, strongly suggesting that space weathering has begun to saturate on this young asteroid. The freshest spectrum found on Itokawa exhibits a clear sign of space weathering, indicating again that space weathering occurs very rapidly on this body. We also conducted PCA on Itokawa spectra alone and compared the results with space-weathering experiments. The obtained results indicate that the first principal component of Itokawa surface spectra is consistent with spectral change due to space weathering and that the spatial variation in the degree of space weathering is very large (a factor of three in surface age), which strongly suggests the presence of strong regional/local resurfacing process(es) on this small asteroid.
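
    A minimal sketch of the projection step described above: fit PCA on main-belt asteroid spectra and project the disk-resolved Itokawa spectra into the same PC space. Band count, normalisation, and variable names are illustrative assumptions.

        import numpy as np
        from sklearn.decomposition import PCA

        def project_into_pc_space(mainbelt_spectra, itokawa_spectra, n_components=3):
            """Both inputs: (n_spectra, n_bands) reflectance spectra (e.g. normalised
            at a common band). PCA axes are defined by the main-belt sample; Itokawa
            spectra are then projected onto the same axes for comparison."""
            pca = PCA(n_components=n_components).fit(mainbelt_spectra)
            pc_mainbelt = pca.transform(mainbelt_spectra)
            pc_itokawa = pca.transform(itokawa_spectra)
            return pc_mainbelt, pc_itokawa, pca.explained_variance_ratio_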

  12. Geography-based structural analysis of the Internet

    SciTech Connect

    Kasiviswanathan, Shiva; Eidenbenz, Stephan; Yan, Guanhua

    2010-01-01

    In this paper, we study some geographic aspects of the Internet. We base our analysis on a large set of geolocated IP hop-level session data (including about 300,000 backbone routers, 150 million end hosts, and 1 billion sessions) that we synthesized from a variety of different input sources such as US census data, computer usage statistics, Internet market share data, IP geolocation data sets, CAIDA's Skitter data set for backbone connectivity, and BGP routing tables. We use this model to perform a nationwide and statewide geographic analysis of the Internet. Our main observations are: (1) There is a dominant coast-to-coast pattern in US Internet traffic. In fact, in many instances even if the end-devices are not near either coast, the traffic between them still takes a long detour through the coasts. (2) More than half of the Internet paths are inflated by 100% or more compared to their corresponding geometric straight-line distance. This circuitousness makes the average ratio between the routing distance and geometric distance large (around 10). (3) The weighted mean hop count is around 5, but hop counts are very loosely correlated with distances. The weighted mean AS count (number of ASes traversed) is around 3. (4) The AS size and AS location number distributions are heavy-tailed and strongly correlated. Most ASes are medium sized and there is wide variability in the geographic dispersion size (measured in terms of the convex hull area) of these ASes.

  13. Structural Decomposition Analysis of China’s Industrial Energy Consumption Based on Input-Output Analysis

    NASA Astrophysics Data System (ADS)

    Huang, X. Y.; Zhou, J. Q.; Wang, Z.; Deng, L. C.; Hong, S.

    2017-05-01

    China is now at a stage of accelerated industrialization and urbanization, with energy-intensive industries contributing a large proportion of economic growth. In this study, we examined industrial energy consumption by decomposition analysis to describe the driving factors of energy consumption in China. Based on input-output (I-O) tables from the World Input-Output Database (WIOD) website and China's energy use data from 1995 to 2011, we studied the sectoral changes in energy efficiency over the examined period. The results showed that all industries increased their energy efficiency. Energy consumption was decomposed into three factors by the logarithmic mean Divisia index (LMDI) method. The increase in production output was the leading factor driving up China's energy consumption. World Trade Organization accession and financial crises had a great impact on energy consumption. Based on these results, a series of energy policy suggestions for decision-makers is proposed.
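
    The additive LMDI decomposition mentioned above splits the change in energy use into activity, structure, and intensity effects using logarithmic-mean weights. The sketch below follows the standard LMDI-I formulas; the sectoral input arrays are illustrative, not the WIOD data used in the study.

        import numpy as np

        def logmean(a, b):
            """Logarithmic mean used as the LMDI weight."""
            a, b = np.asarray(a, float), np.asarray(b, float)
            return np.where(np.isclose(a, b), a, (a - b) / (np.log(a) - np.log(b)))

        def lmdi_additive(Q0, QT, share0, shareT, intens0, intensT):
            """Additive LMDI-I decomposition of E = Q * share_i * intensity_i by sector.

            Returns (activity, structure, intensity) effects; their sum equals E_T - E_0.
            """
            E0 = Q0 * share0 * intens0
            ET = QT * shareT * intensT
            w = logmean(ET, E0)
            activity = np.sum(w * np.log(QT / Q0))
            structure = np.sum(w * np.log(shareT / share0))
            intensity = np.sum(w * np.log(intensT / intens0))
            return activity, structure, intensity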

  14. BMP analysis system for watershed-based stormwater management.

    PubMed

    Zhen, Jenny; Shoemaker, Leslie; Riverson, John; Alvi, Khalid; Cheng, Mow-Soung

    2006-01-01

    Best Management Practices (BMPs) are measures for mitigating nonpoint source (NPS) pollution caused mainly by stormwater runoff. Established urban and newly developing areas must develop cost-effective means of restoring or minimizing impacts and planning future growth. Prince George's County in Maryland, USA, a fast-growing region in the Washington, DC metropolitan area, has developed a number of tools to support analysis and decision making for stormwater management planning and design at the watershed level. These tools support watershed analysis, innovative BMPs, and optimization. Application of these tools can help achieve environmental goals and lead to significant cost savings. This project includes software development that utilizes GIS information and technology, integrates BMP process simulation models, and applies system optimization techniques for BMP planning and selection. The system employs ESRI ArcGIS as the platform, and provides GIS-based visualization and support for developing networks including sequences of land uses, BMPs, and stream reaches. The system also provides interfaces for BMP placement, BMP attribute data input, and decision optimization management. The system includes a stand-alone BMP simulation and evaluation module, which complements both research and regulatory nonpoint source control assessment efforts, and allows flexibility in examining various BMP design alternatives. Process-based simulation of BMPs provides a technique that is sensitive to local climate and rainfall patterns. The system incorporates a meta-heuristic optimization technique to find the most cost-effective BMP placement and implementation plan given a control target or a fixed cost. A case study is presented to demonstrate the application of the Prince George's County system. The case study involves a highly urbanized area in the Anacostia River (a tributary to the Potomac River) watershed southeast of Washington, DC. An innovative system of

  15. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  16. CAD-Based Shielding Analysis for ITER Port Diagnostics

    NASA Astrophysics Data System (ADS)

    Serikov, Arkady; Fischer, Ulrich; Anthoine, David; Bertalot, Luciano; De Bock, Maartin; O'Connor, Richard; Juarez, Rafael; Krasilnikov, Vitaly

    2017-09-01

    Radiation shielding analysis conducted in support of the design development of the contemporary diagnostic systems integrated inside the ITER ports relies on the use of CAD models. This paper presents the CAD-based MCNP Monte Carlo radiation transport and activation analyses for the Diagnostic Upper and Equatorial Port Plugs (UPP #3 and EPP #8, #17). The creation of the complicated 3D MCNP models of the diagnostic systems was substantially accelerated by application of the CAD-to-MCNP converter programs MCAM and McCad. High-performance computing resources of the Helios supercomputer allowed the parallel MCNP transport calculations to be sped up with the MPI/OpenMP interface. The shielding solutions found could be universal, reducing port R&D costs. The shield block behind the Tritium and Deposit Monitor (TDM) optical box was added to study its influence on the Shut-Down Dose Rate (SDDR) in the Port Interspace (PI) of EPP#17. The influence of neutron streaming along the Lost Alpha Monitor (LAM) on the neutron energy spectra calculated in the Tangential Neutron Spectrometer (TNS) of EPP#8 was also examined. For the UPP#3 with Charge eXchange Recombination Spectroscopy (CXRS-core), excessive neutron streaming along the CXRS shutter was found, which should be prevented in a further design iteration.

  17. MESA: Message-Based System Analysis Using Runtime Verification

    NASA Technical Reports Server (NTRS)

    Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter

    2017-01-01

    In this paper, we present a novel approach and framework for runtime verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring have to happen using a non-intrusive approach, by connecting to a variety of network interfaces. Due to a large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.

  18. LSD-based analysis of high-resolution stellar spectra

    NASA Astrophysics Data System (ADS)

    Tsymbal, V.; Tkachenko, A.; Van Reeth, T.

    2014-11-01

    We present a generalization of the method of least-squares deconvolution (LSD), a powerful tool for extracting high S/N average line profiles from stellar spectra. The generalization of the method is effected by extending it towards multiprofile LSD and by introducing the possibility to correct the line strengths from the initial mask. We illustrate the new approach by two examples: (a) the detection of asteroseismic signatures from low S/N spectra of single stars, and (b) disentangling spectra of multiple stellar objects. The analysis is applied to spectra obtained with 2-m class telescopes in the course of spectroscopic ground-based support for space missions such as CoRoT and Kepler. Usually, rather high S/N is required, so smaller telescopes can only compete successfully with more advanced ones when one can apply a technique that enables a remarkable increase in the S/N of the spectra which they observe. Since the LSD profiles have the potential for reconstructing what is common to all the spectral profiles, the technique should have a particular practical application to faint stars observed with 2-m class telescopes and whose spectra show remarkable LPVs.
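
    In standard LSD the observed spectrum is modelled as the convolution of a line mask with a common mean profile; the sketch below builds a simplified (nearest-pixel) mask matrix and solves the weighted least-squares problem for that mean profile. The multiprofile extension and line-strength corrections described in the abstract are omitted, and all variable names are illustrative.

        import numpy as np

        def lsd_profile(obs_flux, obs_weights, line_positions, line_depths,
                        wavelength_grid, velocity_grid, c=299792.458):
            """Least-squares deconvolution: solve M Z ~= (1 - flux) for the mean profile Z."""
            n_pix, n_vel = len(wavelength_grid), len(velocity_grid)
            M = np.zeros((n_pix, n_vel))
            for lam0, depth in zip(line_positions, line_depths):
                lam = lam0 * (1.0 + velocity_grid / c)      # Doppler-shifted line positions
                idx = np.searchsorted(wavelength_grid, lam)  # nearest-pixel mapping (simplified)
                ok = (idx > 0) & (idx < n_pix)
                M[idx[ok], np.arange(n_vel)[ok]] += depth
            y = 1.0 - obs_flux                               # line depth relative to continuum
            sw = np.sqrt(obs_weights)                        # weighted least squares
            Z, *_ = np.linalg.lstsq(M * sw[:, None], y * sw, rcond=None)
            return Z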

  19. Behavior Analysis Based on Coordinates of Body Tags

    NASA Astrophysics Data System (ADS)

    Luštrek, Mitja; Kaluža, Boštjan; Dovgan, Erik; Pogorelc, Bogdan; Gams, Matjaž

    This paper describes fall detection, activity recognition and the detection of anomalous gait in the Confidence project. The project aims to prolong the independence of the elderly by detecting falls and other types of behavior indicating a health problem. The behavior will be analyzed based on the coordinates of tags worn on the body. The coordinates will be detected with radio sensors. We describe two Confidence modules. The first one classifies the user's activity into one of six classes, including falling. The second one detects walking anomalies, such as limping, dizziness and hemiplegia. The walking analysis can automatically adapt to each person by using only the examples of normal walking of that person. Both modules employ machine learning: the paper focuses on the features they use and the effect of tag placement and sensor noise on the classification accuracy. Four tags were enough for activity recognition accuracy of over 93% at moderate sensor noise, while six were needed to detect walking anomalies with the accuracy of over 90%.

  20. Resource Management Scheme Based on Ubiquitous Data Analysis

    PubMed Central

    Lee, Heung Ki; Jung, Jaehee

    2014-01-01

    Resource management of the main memory and process handler is critical to enhancing the system performance of a web server. Owing to the transaction delay time that affects incoming requests from web clients, web server systems utilize several web processes to anticipate future requests. This procedure is able to decrease the web generation time because there are enough processes to handle the incoming requests from web browsers. However, inefficient process management results in low service quality for the web server system. Proper pregenerated process mechanisms are required for dealing with the clients' requests. Unfortunately, it is difficult to predict how many requests a web server system is going to receive. If a web server system builds too many web processes, it wastes a considerable amount of memory space, and thus performance is reduced. We propose an adaptive web process manager scheme based on the analysis of web log mining. In the proposed scheme, the number of web processes is controlled through prediction of incoming requests, and accordingly, the web process management scheme consumes the least possible web transaction resources. In experiments, real web trace data were used to prove the improved performance of the proposed scheme. PMID:25197692

  1. Pyrosequencing Based Microbial Community Analysis of Stabilized Mine Soils

    NASA Astrophysics Data System (ADS)

    Park, J. E.; Lee, B. T.; Son, A.

    2015-12-01

    Heavy metals leached from exhausted mines have been causing severe environmental problems in nearby soils and groundwater. Environmental mitigation was performed in Korea based on heavy metal stabilization using calcite and steel slag. Since soil stabilization only temporarily immobilizes the contaminants in the soil matrix, the potential risk of re-leaching heavy metals still exists. Therefore, follow-up management of stabilized soils and corresponding evaluation methods are required to avoid consequent contamination from the stabilized soils. In this study, microbial community analysis using pyrosequencing was performed to assess the potential leaching of the stabilized soils. Rarefaction curves and the Chao1 and Shannon indices showed that the stabilized soil had lower richness and diversity than a non-contaminated negative control. At the phylum level, as the degree of contamination increased, most phyla decreased, with the only exception being an increase in Proteobacteria. Among Proteobacteria, Gammaproteobacteria increased with heavy metal contamination. At the species level, Methylobacter tundripaludum of the Gammaproteobacteria showed the highest relative portion of the microbial community, indicating that methanotrophs may play an important role in either the solubilization or the immobilization of heavy metals in stabilized soils.

  2. Life-cycle analysis of bio-based aviation fuels.

    PubMed

    Han, Jeongwoo; Elgowainy, Amgad; Cai, Hao; Wang, Michael Q

    2013-12-01

    Well-to-wake (WTWa) analysis of bio-based aviation fuels, including hydroprocessed renewable jet (HRJ) from various oil seeds, Fischer-Tropsch jet (FTJ) from corn-stover and co-feeding of coal and corn-stover, and pyrolysis jet from corn stover, is conducted and compared with petroleum jet. WTWa GHG emission reductions relative to petroleum jet can be 41-63% for HRJ, 68-76% for pyrolysis jet and 89% for FTJ from corn stover. The HRJ production stage dominates WTWa GHG emissions from HRJ pathways. The differences in GHG emissions from HRJ production stage among considered feedstocks are much smaller than those from fertilizer use and N2O emissions related to feedstock collection stage. Sensitivity analyses on FTJ production from coal and corn-stover are also conducted, showing the importance of biomass share in the feedstock, carbon capture and sequestration options, and overall efficiency. For both HRJ and FTJ, co-product handling methods have significant impacts on WTWa results. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Modeling Chinese ionospheric layer parameters based on EOF analysis

    NASA Astrophysics Data System (ADS)

    Yu, You; Wan, Weixing; Xiong, Bo; Ren, Zhipeng; Zhao, Biqiang; Zhang, Yun; Ning, Baiqi; Liu, Libo

    2015-05-01

    Using 24-ionosonde observations in and around China during the 20th solar cycle, an assimilative model is constructed to map the ionospheric layer parameters (foF2, hmF2, M(3000)F2, and foE) over China based on empirical orthogonal function (EOF) analysis. First, we decompose the background maps from the International Reference Ionosphere model 2007 (IRI-07) into different EOF modes. The obtained EOF modes consist of two factors: the EOF patterns and the corresponding EOF amplitudes. These two factors individually reflect the spatial distributions (e.g., the latitudinal dependence such as the equatorial ionization anomaly structure and the longitude structure with east-west difference) and temporal variations on different time scales (e.g., solar cycle, annual, semiannual, and diurnal variations) of the layer parameters. Then, the EOF patterns and long-term observations of ionosondes are assimilated to get the observed EOF amplitudes, which are further used to construct the Chinese Ionospheric Maps (CIMs) of the layer parameters. In contrast with the IRI-07 model, the mapped CIMs successfully capture the inherent temporal and spatial variations of the ionospheric layer parameters. Finally, comparison of the modeled (EOF and IRI-07 model) and observed values reveals that the EOF model reproduces the observation with smaller root-mean-square errors and higher linear correlation coefficients. In addition, IRI discrepancy at the low latitude especially for foF2 is effectively removed by EOF model.
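
    EOF analysis of the background maps amounts to a singular value decomposition of the space-time data matrix, separating spatial EOF patterns from time-dependent amplitudes; the observed amplitudes are then obtained by fitting the ionosonde data onto the retained patterns. The sketch below shows only the decomposition step; array shapes and the number of modes are illustrative.

        import numpy as np

        def eof_decompose(maps, n_modes=10):
            """maps: (n_times, n_grid) values of a layer parameter (e.g. foF2) on a grid.

            Returns the mean map, spatial EOF patterns, time-dependent amplitudes,
            and the fraction of variance carried by each mode.
            """
            mean_map = maps.mean(axis=0)
            anomalies = maps - mean_map
            U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
            patterns = Vt[:n_modes]                    # (n_modes, n_grid) spatial EOFs
            amplitudes = U[:, :n_modes] * s[:n_modes]  # (n_times, n_modes) amplitudes
            variance_frac = s[:n_modes] ** 2 / np.sum(s ** 2)
            return mean_map, patterns, amplitudes, variance_frac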

  4. Modeling Chinese ionospheric layer parameters based on EOF analysis

    NASA Astrophysics Data System (ADS)

    Yu, You; Wan, Weixing

    2016-04-01

    Using 24-ionosonde observations in and around China during the 20th solar cycle, an assimilative model is constructed to map the ionospheric layer parameters (foF2, hmF2, M(3000)F2, and foE) over China based on empirical orthogonal function (EOF) analysis. First, we decompose the background maps from the International Reference Ionosphere model 2007 (IRI-07) into different EOF modes. The obtained EOF modes consist of two factors: the EOF patterns and the corresponding EOF amplitudes. These two factors individually reflect the spatial distributions (e.g., the latitudinal dependence such as the equatorial ionization anomaly structure and the longitude structure with east-west difference) and temporal variations on different time scales (e.g., solar cycle, annual, semiannual, and diurnal variations) of the layer parameters. Then, the EOF patterns and long-term observations of ionosondes are assimilated to get the observed EOF amplitudes, which are further used to construct the Chinese Ionospheric Maps (CIMs) of the layer parameters. In contrast with the IRI-07 model, the mapped CIMs successfully capture the inherent temporal and spatial variations of the ionospheric layer parameters. Finally, comparison of the modeled (EOF and IRI-07 model) and observed values reveals that the EOF model reproduces the observation with smaller root-mean-square errors and higher linear correlation coefficients. In addition, IRI discrepancy at the low latitude especially for foF2 is effectively removed by EOF model.

  5. Analysis of Human Mobility Based on Cellular Data

    NASA Astrophysics Data System (ADS)

    Arifiansyah, F.; Saptawati, G. A. P.

    2017-01-01

    Nowadays not only adults but even teenagers and children have their own mobile phones. This phenomenon indicates that the mobile phone has become an important part of everyday life. Accordingly, the amount of cellular data has also increased rapidly. Cellular data are defined as the data that record communication among mobile phone users. Cellular data are easy to obtain because the telecommunications company already records them for its billing system. Billing data keep a log of each user's cellular data usage over time, from which we can obtain information about communication between users. Through data visualization, interesting patterns can be seen in the raw cellular data, so that users can obtain prior knowledge before performing data analysis. Cellular data processing can be done using data mining to find human mobility patterns in the existing data. In this paper, we use frequent pattern mining and association rules to observe the relations between attributes in cellular data and then visualize them. We used the Weka tools to find the rules in the data mining stage. Generally, the utilization of cellular data can provide supporting information for the decision-making process and serve as data support to provide solutions and information needed by decision makers.
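
    The paper performs the rule mining in Weka; as a minimal language-agnostic sketch of the same step, the code below counts frequent item pairs in call-record "transactions" and derives simple A -> B association rules with support and confidence. The transaction encoding and thresholds are illustrative assumptions.

        from itertools import combinations
        from collections import Counter

        def pair_rules(transactions, min_support=0.05, min_confidence=0.6):
            """Mine frequent item pairs and A -> B rules from call-record transactions.

            transactions: list of sets of attribute values, e.g.
            {"cell:A12", "hour:evening", "duration:long"}.
            """
            n = len(transactions)
            item_counts = Counter(i for t in transactions for i in t)
            pair_counts = Counter(p for t in transactions
                                  for p in combinations(sorted(t), 2))
            rules = []
            for (a, b), c in pair_counts.items():
                if c / n < min_support:
                    continue
                for lhs, rhs in ((a, b), (b, a)):
                    conf = c / item_counts[lhs]
                    if conf >= min_confidence:
                        rules.append((lhs, rhs, c / n, conf))
            return rules  # (antecedent, consequent, support, confidence)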

  6. Automatic comic page image understanding based on edge segment analysis

    NASA Astrophysics Data System (ADS)

    Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai

    2013-12-01

    Comic page image understanding aims to analyse the layout of the comic page images by detecting the storyboards and identifying the reading order automatically. It is the key technique to produce the digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines and further identify the reading order of these storyboards. The proposed method is performed on a data set consisting of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms the existing methods.
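
    The full storyboard-detection and reading-order pipeline is not reproduced here. As a sketch of the first two stages only (edge extraction and candidate border-line detection), the code below uses OpenCV's Canny detector and probabilistic Hough transform; the thresholds and the axis-alignment filter are illustrative assumptions, not the authors' edge-segment chaining method.

        import cv2
        import numpy as np

        def candidate_border_lines(page_path, canny_lo=50, canny_hi=150):
            """Extract long, nearly axis-aligned line segments from a comic page image."""
            gray = cv2.imread(page_path, cv2.IMREAD_GRAYSCALE)
            edges = cv2.Canny(gray, canny_lo, canny_hi)
            segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                                       minLineLength=gray.shape[1] // 10, maxLineGap=5)
            lines = []
            if segments is not None:
                for x1, y1, x2, y2 in segments[:, 0]:
                    angle = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1))) % 90
                    if angle < 5 or angle > 85:   # keep near-horizontal/vertical segments
                        lines.append((x1, y1, x2, y2))
            return lines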

  7. Clustered Numerical Data Analysis Using Markov Lie Monoid Based Networks

    NASA Astrophysics Data System (ADS)

    Johnson, Joseph

    2016-03-01

    We have designed and built an optimal numerical standardization algorithm that links numerical values with their associated units, error level, and defining metadata, thus supporting automated data exchange and new levels of artificial intelligence (AI). The software manages all dimensional and error analysis and computational tracing. Tables of entities versus properties of these generalized numbers (called ``metanumbers'') support a transformation of each table into a network among the entities and another network among their properties, where the network connection matrix is based upon a proximity metric between the two items. We previously proved that every network is isomorphic to the Lie algebra that generates continuous Markov transformations. We have also shown that the eigenvectors of these Markov matrices provide an agnostic clustering of the underlying patterns. We will present this methodology and show how our new work on conversion of scientific numerical data through this process can reveal underlying information clusters ordered by the eigenvalues. We will also show how the linking of clusters from different tables can be used to form a ``supernet'' of all numerical information supporting new initiatives in AI.
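
    A hedged sketch of the chain described: build a proximity matrix between entity rows of a table, treat it as the off-diagonal part of a Markov generator (columns summing to zero), and order or cluster entities using the leading eigenvectors. The Gaussian proximity metric, the symmetrisation step, and the variable names are assumptions for illustration, not the authors' metanumber software.

        import numpy as np

        def markov_generator(table):
            """table: (n_entities, n_properties) of standardised numerical values."""
            d2 = ((table[:, None, :] - table[None, :, :]) ** 2).sum(-1)
            prox = np.exp(-d2 / (2 * np.median(d2[d2 > 0])))   # assumed Gaussian proximity
            np.fill_diagonal(prox, 0.0)
            return prox - np.diag(prox.sum(axis=0))            # columns sum to zero

        def spectral_clusters(gen, n_modes=3):
            """Order entities by the leading eigenvectors of the (symmetrised) generator."""
            vals, vecs = np.linalg.eigh((gen + gen.T) / 2)
            order = np.argsort(vals)[::-1]                     # slowest (near-zero) modes first
            return vals[order[:n_modes]], vecs[:, order[:n_modes]]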

  8. Wavelet packet-based insufficiency murmurs analysis method

    NASA Astrophysics Data System (ADS)

    Choi, Samjin; Jiang, Zhongwei

    2007-12-01

    In this paper, an analysis method for aortic and mitral insufficiency murmurs using the wavelet packet technique is proposed for classifying valvular heart defects. Considering the different frequency distributions of normal sounds and insufficiency murmurs in the frequency domain, we used two properties, the relative wavelet energy and the Shannon wavelet entropy, which describe the energy information and the entropy information in the selected frequency band, respectively. Then, signal-to-murmur ratio (SMR) measures, representing the ratio between the frequency band for normal heart sounds (15.62-187.50 Hz) and the band for aortic and mitral insufficiency murmurs (187.50-703.12 Hz), were employed as a classification criterion to identify insufficiency murmurs. The proposed measures were validated by case studies. A total of 194 heart sound signals, with 48 normal and 146 abnormal sound cases acquired from 6 healthy volunteers and 30 patients, were tested. The normal sound signals were recorded by applying a self-produced wireless electric stethoscope system to subjects with no history of other heart complications. Insufficiency murmurs were grouped into two valvular heart defects, aortic insufficiency and mitral insufficiency; these murmur subjects included no other coexistent valvular defects. As a result, the proposed insufficiency murmur detection method showed very high classification efficiency. Therefore, the proposed heart sound classification method based on the wavelet packet was validated for the classification of valvular heart defects, especially insufficiency murmurs.
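
    A minimal sketch of the band-energy features: a discrete wavelet decomposition (a stand-in for the wavelet packet used in the paper) gives subband energies from which the relative wavelet energy, the Shannon wavelet entropy, and an SMR-style ratio between a low and a high band can be computed. The sampling rate (2 kHz assumed), wavelet, level, and band indices are illustrative assumptions.

        import numpy as np
        import pywt

        def murmur_features(heart_sound, wavelet='db4', level=6):
            """Relative subband energies, Shannon wavelet entropy, and an SMR-style ratio."""
            coeffs = pywt.wavedec(heart_sound, wavelet, level=level)   # [cA6, cD6, ..., cD1]
            energies = np.array([np.sum(c ** 2) for c in coeffs])
            rel = energies / energies.sum()                     # relative wavelet energy
            entropy = -np.sum(rel * np.log2(rel + 1e-12))       # Shannon wavelet entropy
            # Assuming fs ~ 2 kHz: cD6-cD4 roughly cover the normal-sound band
            # (~16-125 Hz) and cD3-cD2 the murmur band (~125-500 Hz); indices are illustrative.
            smr = 10 * np.log10((rel[1] + rel[2] + rel[3]) / (rel[4] + rel[5] + 1e-12))
            return rel, entropy, smr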

  9. Personalized glucose-insulin model based on signal analysis.

    PubMed

    Goede, Simon L; de Galan, Bastiaan E; Leow, Melvin Khee Shing

    2017-04-21

    Plasma glucose measurements for diabetes patients are generally presented as a glucose concentration-time profile with 15-60 min sampling intervals. This limited resolution obscures detailed dynamic events of glucose appearance and metabolism. Measurement intervals of 15 min or more could contribute to imperfections in present diabetes treatment. High-resolution data from mixed meal tolerance tests (MMTT) for 24 type 1 and type 2 diabetes patients were used in our present modeling. We introduce a model based on the physiological properties of transport, storage and utilization. This logistic approach follows the principles of electrical network analysis and signal processing theory. The method mimics the physiological equivalent of glucose homeostasis, comprising meal ingestion and absorption via the gastrointestinal tract (GIT) to the endocrine nexus between the liver and the pancreatic alpha and beta cells. This model demystifies the metabolic 'black box' by enabling in silico simulations and fitting of individual responses to clinical data. Five-minute-interval MMTT data measured from diabetic subjects yield two independent model parameters that characterize the complete glucose system response at a personalized level. From the individual data measurements, we obtain a model which can be analyzed with a standard electrical network simulator for diagnostics and treatment optimization. The insulin dosing time scale can then be accurately adjusted to match the individual requirements of characterized diabetic patients without the physical burden of treatment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Robust linear discriminant analysis with distance based estimators

    NASA Astrophysics Data System (ADS)

    Lim, Yai-Fung; Yahaya, Sharipah Soaad Syed; Ali, Hazlina

    2017-11-01

    Linear discriminant analysis (LDA) is a supervised classification technique concerning the relationship between a categorical variable and a set of continuous variables. The main objective of LDA is to create a function to distinguish between populations and to allocate future observations to previously defined populations. Under the assumptions of normality and homoscedasticity, LDA yields the optimal linear discriminant rule (LDR) between two or more groups. However, the optimality of LDA relies heavily on the sample mean and pooled sample covariance matrix, which are known to be sensitive to outliers. To alleviate this problem, a new robust LDA using distance-based estimators known as the minimum variance vector (MVV) is proposed in this study. The MVV estimators were used in place of the classical sample mean and classical sample covariance to form a robust linear discriminant rule (RLDR). Simulation and real-data studies were conducted to examine the performance of the proposed RLDR, measured in terms of misclassification error rates. The computational results showed that the proposed RLDR is better than the classical LDR and comparable with an existing robust LDR.
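
    A sketch of the substitution idea is shown below; since the MVV estimator is specific to the paper, the minimum covariance determinant (MCD) estimator from scikit-learn is used here as a stand-in robust location/scatter estimator.

      import numpy as np
      from sklearn.covariance import MinCovDet

      def robust_ldr(X1, X2):
          # Robust location and scatter per group (MCD as a stand-in for the paper's MVV).
          mcd1, mcd2 = MinCovDet().fit(X1), MinCovDet().fit(X2)
          m1, m2 = mcd1.location_, mcd2.location_
          n1, n2 = len(X1), len(X2)
          pooled = ((n1 - 1) * mcd1.covariance_ + (n2 - 1) * mcd2.covariance_) / (n1 + n2 - 2)
          w = np.linalg.solve(pooled, m1 - m2)       # discriminant direction
          c = w @ (m1 + m2) / 2.0                    # cut-off between the two group centres
          return lambda X: np.where(np.atleast_2d(X) @ w >= c, 1, 2)

      # usage: rule = robust_ldr(X_group1, X_group2); labels = rule(X_new)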

  11. Iris-based medical analysis by geometric deformation features.

    PubMed

    Ma, Lin; Zhang, D; Li, Naimin; Cai, Yan; Zuo, Wangmeng; Wang, Kuanguan

    2013-01-01

    Iris analysis studies the relationship between human health and changes in the anatomy of the iris. Whereas iris recognition focuses on modeling the overall structure of the iris, iris diagnosis emphasizes detecting and analyzing local variations in the characteristics of irises. This paper focuses on studying the geometrical structure changes in irises that are caused by gastrointestinal diseases, and on measuring the observable deformations in the geometrical structures of irises that are related to the roundness, diameter and other geometric forms of the pupil and the collarette. Pupil- and collarette-based features are defined and extracted. A series of experiments is implemented on our experimental pathological iris database, including manual clustering of both normal and pathological iris images, manual classification by non-specialists, manual classification by individuals with a medical background, verification of the classification ability of the proposed features, and disease recognition by applying the proposed features. The results demonstrate the effectiveness and clinical diagnostic significance of the proposed features and a reliable recognition performance for automatic disease diagnosis. Our research results offer a novel systematic perspective for iridology studies and promote the progress of both theoretical and practical work in iris diagnosis.
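
    For example, the roundness of a pupil or collarette boundary can be computed directly from its contour points; the shoelace-based sketch below is a generic illustration and not the authors' exact feature definition.

      import numpy as np

      def roundness(contour):
          # 4*pi*Area / Perimeter**2 of a closed contour given as an (N, 2) array of (x, y) points
          x, y = contour[:, 0], contour[:, 1]
          area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))   # shoelace formula
          perimeter = np.sum(np.hypot(x - np.roll(x, -1), y - np.roll(y, -1)))
          return 4.0 * np.pi * area / perimeter ** 2

      # a perfect circle gives a roundness close to 1; deformed pupils give smaller values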

  12. Residual Stress Analysis Based on Acoustic and Optical Methods.

    PubMed

    Yoshida, Sanichiro; Sasaki, Tomohiro; Usui, Masaru; Sakamoto, Shuichi; Gurney, David; Park, Ik-Keun

    2016-02-16

    Co-application of acoustoelasticity and optical interferometry to residual stress analysis is discussed. The underlying idea is to combine the advantages of both methods. Acoustoelasticity is capable of evaluating residual stress absolutely, but it is a single-point measurement. Optical interferometry is able to measure deformation yielding two-dimensional, full-field data, but it is not suitable for absolute evaluation of residual stresses. By theoretically relating the deformation data to residual stresses, and calibrating them with the absolute residual stress evaluated at a reference point, it is possible to measure residual stresses quantitatively, nondestructively and two-dimensionally. The feasibility of the idea has been tested with a butt-jointed dissimilar plate specimen. A steel plate 18.5 mm wide, 50 mm long and 3.37 mm thick is braze-jointed to a cemented carbide plate of the same dimensions along the 18.5 mm side. Acoustoelasticity evaluates the elastic modulus at reference points via acoustic velocity measurement. A tensile load is applied to the specimen at a constant pulling rate in a stress range substantially lower than the yield stress. Optical interferometry measures the resulting acceleration field. Based on the theory of harmonic oscillation, the acceleration field is qualitatively correlated to compressive and tensile residual stresses. The acoustic and optical results show reasonable agreement in the compressive and tensile residual stresses, indicating the feasibility of the idea.
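
    The calibration step can be illustrated in a few lines: a full-field optical quantity is scaled so that it matches the absolute residual stress obtained acoustoelastically at a reference point. The linear scaling below is a simplified assumption standing in for the theoretical relation described by the authors.

      import numpy as np

      def calibrate_stress_map(optical_field, ref_index, sigma_ref):
          # Scale the full-field optical quantity so that it reproduces the absolute
          # residual stress sigma_ref measured acoustoelastically at the reference point.
          field = np.asarray(optical_field, dtype=float)
          k = sigma_ref / field[ref_index]
          return k * field

      # usage: stress_map = calibrate_stress_map(acceleration_field, (i_ref, j_ref), sigma_ref)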

  13. a Region-Based Multi-Scale Approach for Object-Based Image Analysis

    NASA Astrophysics Data System (ADS)

    Kavzoglu, T.; Yildiz Erdemir, M.; Tonbul, H.

    2016-06-01

    Within the last two decades, object-based image analysis (OBIA), which considers objects (i.e. groups of pixels) instead of individual pixels, has gained popularity and attracted increasing interest. The most important stage of OBIA is image segmentation, which groups spectrally similar adjacent pixels considering not only spectral features but also spatial and textural features. Although there are several parameters (scale, shape, compactness and band weights) to be set by the analyst, the scale parameter stands out as the most important parameter in the segmentation process. Estimating the optimal scale parameter is crucially important for increasing classification accuracy, and it depends on image resolution, image object size and the characteristics of the study area. In this study, two scale-selection strategies were implemented in the image segmentation process using a pan-sharpened QuickBird-2 image. The first strategy estimates optimal scale parameters for the eight sub-regions. For this purpose, the local variance/rate of change (LV-RoC) graphs produced by the ESP-2 tool were analysed to determine fine, moderate and coarse scales for each region. In the second strategy, the image was segmented using the three candidate scale values (fine, moderate, coarse) determined from the LV-RoC graph calculated for the whole image. The nearest-neighbour classifier was applied in all segmentation experiments, and an equal number of pixels was randomly selected to calculate accuracy metrics (overall accuracy and kappa coefficient). Comparison of region-based and image-based segmentation carried out on the classified images showed that region-based multi-scale OBIA produced significantly more accurate results than image-based single-scale OBIA. The difference in classification accuracy reached 10% in terms of overall accuracy.
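
    The scale-selection logic of the ESP-style analysis can be sketched as follows: local variance (LV) is computed for segmentations at increasing scale values, and peaks in its rate of change (RoC) indicate candidate scales. The functions below assume the LV values have already been computed for each scale and are a generic illustration rather than the ESP-2 tool itself.

      import numpy as np

      def rate_of_change(lv):
          # RoC of local variance between consecutive scale values (ESP-style)
          lv = np.asarray(lv, dtype=float)
          roc = np.zeros_like(lv)
          roc[1:] = (lv[1:] - lv[:-1]) / lv[:-1] * 100.0
          return roc

      def candidate_scales(scales, lv):
          # Local maxima of the RoC curve suggest fine, moderate and coarse scale candidates.
          roc = rate_of_change(lv)
          return [scales[i] for i in range(1, len(roc) - 1)
                  if roc[i] > roc[i - 1] and roc[i] > roc[i + 1]]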

  14. Economic analysis of an internet-based depression prevention intervention.

    PubMed

    Ruby, Alexander; Marko-Holguin, Monika; Fogel, Joshua; Van Voorhees, Benjamin W

    2013-09-01

    interventions like CATCH-IT appears economically viable in the context of an Accountable Care Organization. Furthermore, while the cost of implementing an effective safety protocol is proportionally high for this intervention, CATCH-IT is still significantly cheaper to implement than current treatment options. Limitations of this research included diminished participation in follow-up surveys assessing willingness-to-pay. IMPLICATIONS FOR HEALTH CARE PROVISION AND USE AND HEALTH POLICIES: This research emphasizes that preventive interventions have the potential to be cheaper to implement than treatment protocols, even before taking into account lost productivity due to illness. Research such as this business application analysis of the CATCH-IT program highlights the importance of supporting preventive medical interventions as the healthcare system already does for treatment interventions. This research is the first to analyze the economic costs of an Internet-based intervention. Further research into the costs and outcomes of such interventions is certainly warranted before they are widely adopted. Furthermore, more research regarding the safety of Internet-based programs will likely need to be conducted before they are broadly accepted.

  15. Traditional Mold Analysis Compared to a DNA-based Method of Mold Analysis with Applications in Asthmatics' Homes

    EPA Science Inventory

    Traditional environmental mold analysis is based on microscopic observations and counting of mold structures collected from the air on a sticky surface or culturing of molds on growth media for identification and quantification. A DNA-based method of mold analysis called mol...

  16. Reliability analysis based on the losses from failures.

    PubMed

    Todinov, M T

    2006-04-01

    The conventional reliability analysis is based on the premise that increasing the reliability of a system will decrease the losses from failures. On the basis of counterexamples, it is demonstrated that this is valid only if all failures are associated with the same losses. In the case of failures associated with different losses, a system with larger reliability is not necessarily characterized by smaller losses from failures. Consequently, a theoretical framework and models are proposed for a reliability analysis linking reliability and the losses from failures. Equations related to the distributions of the potential losses from failure have been derived. It is argued that the classical risk equation only estimates the average value of the potential losses from failure and does not provide insight into the variability associated with the potential losses. Equations have also been derived for determining the potential and the expected losses from failures for nonrepairable and repairable systems with components arranged in series, with arbitrary life distributions. The equations are also valid for systems/components with multiple mutually exclusive failure modes. The expected loss given failure is a linear combination of the expected losses from failure associated with the separate failure modes, scaled by the conditional probabilities with which the failure modes initiate failure. On this basis, an efficient method for simplifying complex reliability block diagrams has been developed. Branches of components arranged in series whose failures are mutually exclusive can be reduced to single components with equivalent hazard rate, downtime, and expected costs associated with intervention and repair. A model for estimating the expected losses from early-life failures has also been developed. For a specified time interval, the expected losses from early-life failures are a sum of the products of the expected number of failures in the specified time intervals covering the
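
    The stated linear combination can be written directly; the sketch below assumes that the conditional probabilities with which each mutually exclusive failure mode initiates failure and the expected loss associated with each mode are known.

      def expected_loss_given_failure(p_mode, loss_mode):
          # E[L | failure] = sum_k P(mode k initiates failure) * E[L | mode k]
          assert abs(sum(p_mode) - 1.0) < 1e-9, "conditional mode probabilities must sum to one"
          return sum(p * l for p, l in zip(p_mode, loss_mode))

      # usage: expected_loss_given_failure([0.6, 0.3, 0.1], [1.0e4, 5.0e4, 2.0e5])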

  17. Hybrid diagnostic system: beacon-based exception analysis for multimissions - Livingstone integration

    NASA Technical Reports Server (NTRS)

    Park, Han G.; Cannon, Howard; Bajwa, Anupa; Mackey, Ryan; James, Mark; Maul, William

    2004-01-01

    This paper describes the initial integration of a hybrid reasoning system utilizing a continuous domain feature-based detector, Beacon-based Exceptions Analysis for Multimissions (BEAM), and a discrete domain model-based reasoner, Livingstone.

  18. [Quality evaluation of Artemisiae Argyi Folium based on fingerprint analysis and quantitative analysis of multicomponents].

    PubMed

    Guo, Long; Jiao, Qian; Zhang, Dan; Liu, Ai-Peng; Wang, Qian; Zheng, Yu-Guang

    2018-03-01

    Artemisiae Argyi Folium, the dried leaves of Artemisia argyi, has been widely used in traditional Chinese and folk medicines for the treatment of hemorrhage, pain, and skin itch. Phytochemical studies indicated that volatile oil, organic acids and flavonoids are the main bioactive components in Artemisiae Argyi Folium. Compared with the volatile compounds, research on the nonvolatile compounds in Artemisiae Argyi Folium is limited. In the present study, an accurate and reliable fingerprint approach was developed using HPLC for quality control of Artemisiae Argyi Folium. A total of 10 common peaks were marked, and the similarity of all the Artemisiae Argyi Folium samples was above 0.940. The established fingerprint method could be used for quality control of Artemisiae Argyi Folium. Furthermore, an HPLC method was applied for simultaneous determination of seven bioactive compounds, including five organic acids and two flavonoids, in Artemisiae Argyi Folium and Artemisiae Lavandulaefoliae Folium samples. Moreover, chemometrics methods such as hierarchical clustering analysis and principal component analysis were performed to compare and discriminate the Artemisiae Argyi Folium and Artemisiae Lavandulaefoliae Folium samples based on the quantitative data of the analytes. The results indicated that simultaneous quantification of multiple components coupled with chemometrics analysis could be a well-acceptable strategy to identify and evaluate the quality of Artemisiae Argyi Folium. Copyright© by the Chinese Pharmaceutical Association.
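
    The chemometrics step can be reproduced generically: the quantitative data matrix (samples x analytes) is standardized and then examined with principal component analysis and hierarchical clustering. The sketch below uses scikit-learn and SciPy and assumes the seven-analyte concentration table is already available; it is not the authors' exact workflow.

      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from scipy.cluster.hierarchy import linkage, fcluster

      def chemometric_analysis(concentrations, n_groups=2):
          # concentrations: (n_samples, n_analytes) array, e.g. 5 organic acids + 2 flavonoids
          X = StandardScaler().fit_transform(concentrations)
          scores = PCA(n_components=2).fit_transform(X)           # PCA score plot coordinates
          Z = linkage(X, method="ward")                            # hierarchical clustering tree
          groups = fcluster(Z, t=n_groups, criterion="maxclust")   # e.g. Argyi vs Lavandulaefoliae
          return scores, groups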

  19. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    NASA Astrophysics Data System (ADS)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

    While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient techniques for performing probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to use FORM directly in numerical slope stability evaluations, as it requires the definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method, in which an explicit performance function is developed from the results of numerical simulations so that FORM can be applied. The implementation of the proposed methodology is demonstrated on a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function in representing the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while its accuracy is lower, with an error of 24%.
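
    A compact sketch of the response-surface/FORM coupling is given below: a quadratic surface is fitted to factor-of-safety values from a handful of numerical runs, and the reliability index is then found by searching for the design point in standard-normal space. The two random variables, their statistics and the simulated values are hypothetical placeholders.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      # hypothetical simulation results: (cohesion kPa, friction angle deg) -> factor of safety
      X = np.array([[80, 30], [100, 30], [80, 35], [100, 35], [90, 32.5], [90, 30], [90, 35]])
      fos = np.array([0.95, 1.10, 1.12, 1.28, 1.11, 1.02, 1.20])

      def features(x):
          c, p = x
          return np.array([1.0, c, p, c * c, p * p, c * p])

      # quadratic response surface for the performance function g = FoS - 1 (limit state g = 0)
      coef, *_ = np.linalg.lstsq(np.array([features(x) for x in X]), fos - 1.0, rcond=None)
      g = lambda x: features(x) @ coef

      # FORM: design point search in standard-normal space (independent normal variables assumed)
      mean, std = np.array([90.0, 32.5]), np.array([8.0, 2.0])
      res = minimize(lambda u: u @ u, x0=np.zeros(2),
                     constraints={"type": "eq", "fun": lambda u: g(mean + std * u)})
      beta = np.sqrt(res.fun)
      print(f"reliability index beta = {beta:.2f}, Pf = {norm.cdf(-beta):.3f}")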

  20. UAV-based urban structural damage assessment using object-based image analysis and semantic reasoning

    NASA Astrophysics Data System (ADS)

    Fernandez Galarreta, J.; Kerle, N.; Gerke, M.

    2015-06-01

    Structural damage assessment is critical after disasters but remains a challenge. Many studies have explored the potential of remote sensing data, but limitations of vertical data persist. Oblique imagery has been identified as more useful, though the multi-angle imagery also adds a new dimension of complexity. This paper addresses damage assessment based on multi-perspective, overlapping, very high resolution oblique images obtained with unmanned aerial vehicles (UAVs). 3-D point-cloud assessment for the entire building is combined with detailed object-based image analysis (OBIA) of façades and roofs. This research focuses not on automatic damage assessment, but on creating a methodology that supports the often ambiguous classification of intermediate damage levels, aiming at producing comprehensive per-building damage scores. We identify completely damaged structures in the 3-D point cloud, and for all other cases provide the OBIA-based damage indicators to be used as auxiliary information by damage analysts. The results demonstrate the usability of the 3-D point-cloud data to identify major damage features. Also the UAV-derived and OBIA-processed oblique images are shown to be a suitable basis for the identification of detailed damage features on façades and roofs. Finally, we also demonstrate the possibility of aggregating the multi-perspective damage information at building level.

  1. Analysis of space shuttle main engine data using Beacon-based exception analysis for multi-missions

    NASA Technical Reports Server (NTRS)

    Park, H.; Mackey, R.; James, M.; Zak, M.; Kynard, M.; Sebghati, J.; Greene, W.

    2002-01-01

    This paper describes analysis of the Space Shuttle Main Engine (SSME) sensor data using Beacon-based exception analysis for multimissions (BEAM), a new technology developed for sensor analysis and diagnostics in autonomous space systems by the Jet Propulsion Laboratory (JPL).

  2. OMERACT-based fibromyalgia symptom subgroups: an exploratory cluster analysis.

    PubMed

    Vincent, Ann; Hoskin, Tanya L; Whipple, Mary O; Clauw, Daniel J; Barton, Debra L; Benzo, Roberto P; Williams, David A

    2014-10-16

    The aim of this study was to identify subsets of patients with fibromyalgia with similar symptom profiles using the Outcome Measures in Rheumatology (OMERACT) core symptom domains. Female patients with a diagnosis of fibromyalgia and currently meeting fibromyalgia research survey criteria completed the Brief Pain Inventory, the 30-item Profile of Mood States, the Medical Outcomes Sleep Scale, the Multidimensional Fatigue Inventory, the Multiple Ability Self-Report Questionnaire, the Fibromyalgia Impact Questionnaire-Revised (FIQ-R) and the Short Form-36 between 1 June 2011 and 31 October 2011. Hierarchical agglomerative clustering was used to identify subgroups of patients with similar symptom profiles. To validate the results from this sample, hierarchical agglomerative clustering was repeated in an external sample of female patients with fibromyalgia with similar inclusion criteria. A total of 581 females with a mean age of 55.1 (range, 20.1 to 90.2) years were included. A four-cluster solution best fit the data, and each clustering variable differed significantly (P <0.0001) among the four clusters. The four clusters divided the sample into severity levels: Cluster 1 reflects the lowest average levels across all symptoms, and Cluster 4 reflects the highest average levels. Clusters 2 and 3 capture moderate symptom levels and differed mainly in their profiles of anxiety and depression, with Cluster 2 having lower levels of depression and anxiety than Cluster 3, despite higher levels of pain. The results of the cluster analysis of the external sample (n = 478) were very similar to those found in the original cluster analysis, except for a slight difference in sleep problems. This was despite the patients in the validation sample being significantly younger (P <0.0001) and having more severe symptoms (higher FIQ-R total scores (P = 0.0004)). In our study, we incorporated core OMERACT symptom domains, which allowed for clustering based on a
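
    The clustering step can be sketched with SciPy: symptom-domain scores are standardized and grouped with Ward's hierarchical agglomerative clustering, then cut into four clusters. The variable names below are generic placeholders for the OMERACT domain scores, not the authors' exact variables or settings.

      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.stats import zscore

      def symptom_subgroups(scores, n_clusters=4):
          # scores: (n_patients, n_domains) array of pain, fatigue, sleep, mood, cognition, ... scores
          Z = linkage(zscore(scores, axis=0), method="ward")
          return fcluster(Z, t=n_clusters, criterion="maxclust")   # cluster label per patient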

  3. A review of vision-based motion analysis in sport.

    PubMed

    Barris, Sian; Button, Chris

    2008-01-01

    Efforts at player motion tracking have traditionally involved a range of data collection techniques, from live observation to post-event video analysis, where player movement patterns are manually recorded and categorized to determine performance effectiveness. Due to the considerable time required to manually collect and analyse such data, research has tended to focus only on small numbers of players within predefined playing areas. Whilst notational analysis is a convenient, practical and typically inexpensive technique, the validity and reliability of the process can vary depending on a number of factors, including how many observers are used, their experience, and the quality of their viewing perspective. Undoubtedly the application of automated tracking technology to team sports has been hampered by the inadequate video and computational facilities available at sports venues. However, the complex nature of movement inherent to many physical activities also represents a significant hurdle to overcome. Athletes tend to exhibit quick and agile movements, with many unpredictable changes in direction and frequent collisions with other players. Each of these characteristics of player behaviour violates the assumptions of smooth movement on which computer tracking algorithms are typically based. Systems such as TRAKUS, SoccerMan, TRAKPERFORMANCE, Pfinder and Prozone all provide extrinsic feedback information to coaches and athletes. However, commercial tracking systems still require a fair amount of operator intervention to process the data after capture and are often limited by the restricted capture environments that can be used and the necessity for individuals to wear tracking devices. Whilst some online tracking systems alleviate the requirements of manual tracking, to our knowledge a completely automated system suitable for sports performance is not yet commercially available. Automatic motion tracking has been used successfully in other domains outside

  4. A Cost Analysis of School-Based Lifestyle Interventions.

    PubMed

    Oosterhoff, Marije; Bosma, Hans; van Schayck, Onno C P; Joore, Manuela A

    2018-05-31

    A uniform approach for costing school-based lifestyle interventions is currently lacking. The objective of this study was to develop a template for costing primary school-based lifestyle interventions and apply this to the costing of the "Healthy Primary School of the Future" (HPSF) and the "Physical Activity School" (PAS), which aim to improve physical activity and dietary behaviors. Cost-effectiveness studies were reviewed to identify the cost items. Societal costs were reflected by summing up the education, household and leisure, labor and social security, and health perspectives. Cost inputs for HPSF and PAS were obtained for the first year after implementation. In a scenario analysis, the costs were explored for a hypothetical steady state. From a societal perspective, the per child costs were €2.7/$3.3 (HPSF) and €- 0.3/$- 0.4 (PAS) per day during the first year after implementation, and €1.0/$1.2 and €- 1.3/$- 1.6 in a steady state, respectively (2016 prices). The highest costs were incurred by the education perspective (first year: €8.7/$10.6 (HPSF) and €4.0/$4.9 (PAS); steady state: €6.1/$7.4 (HPSF) and €2.1/$2.6 (PAS)), whereas most of the cost offsets were received by the household and leisure perspective (first year: €- 6.0/$- 7.3 (HPSF) and €- 4.4/$- 5.4 (PAS); steady state: €- 5.0/$- 6.1 (HPSF) and €- 3.4/$- 4.1 (PAS)). The template proved helpful for costing HPSF and PAS from various stakeholder perspectives. The costs for the education sector were fully (PAS) and almost fully (HPSF) compensated by the savings within the household sector. Whether the additional costs of HPSF over PAS represent value for money will depend on their relative effectiveness.

  5. A performance-based approach to landslide risk analysis

    NASA Astrophysics Data System (ADS)

    Romeo, R. W.

    2009-04-01

    An approach to risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is shown and discussed. The risk is a possible loss due to the occurrence of a potentially damaging event. Analytically, the risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility, which defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is: Risk = P(D >= d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand), and 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of semi-empirical methods: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Only two landslide properties are required: the areal extent and the type (or kinematism). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand on resistance; the resistance capacity is given by the cumulative distribution functions of the limit state performance (fragility functions) assessed via damage surveys and card compilation. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses). The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given
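
    The stated convolution of hazard and fragility can be written as a discrete sum over landslide severities; the sketch below assumes the hazard is given as a probability mass function over severity classes and the fragility as the conditional probability of exceeding the damage threshold at each severity.

      import numpy as np

      def damage_probability(hazard_pmf, fragility):
          # Risk = P(D >= d) = sum_s P(S = s) * P(D >= d | S = s)
          hazard_pmf = np.asarray(hazard_pmf, dtype=float)
          fragility = np.asarray(fragility, dtype=float)
          return float(np.sum(hazard_pmf * fragility))

      # usage: damage_probability([0.7, 0.2, 0.1], [0.05, 0.4, 0.9])  # three severity classes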

  6. A LiDAR based analysis of hydraulic hazard mapping

    NASA Astrophysics Data System (ADS)

    Cazorzi, F.; De Luca, A.; Checchinato, A.; Segna, F.; Dalla Fontana, G.

    2012-04-01

    Mapping hydraulic hazard is a delicate procedure, as it involves both technical and socio-economic aspects. On the one hand, no dangerous areas should be excluded; on the other hand, it is important not to assign more surface than necessary to use limitations. The availability of a high-resolution topographic survey nowadays makes it possible to face this task with innovative procedures, both in the planning (mapping) phase and in the map validation phase. The latter is the subject of the present work. It should be stressed that the described procedure is proposed purely as a preliminary analysis based on topography only, and is therefore not intended in any way to replace more sophisticated analysis methods based on hydraulic modelling. The reference elevation model is a combination of the digital terrain model and the digital building model (DTM+DBM). The option of using the standard surface model (DSM) is not viable, as the DSM represents the vegetation canopy as a solid volume. This has the consequence of unrealistically treating the vegetation as a geometric obstacle to water flow. In some cases the construction of the topographic model requires the identification and digitization of the principal breaklines, such as river banks, ditches and similar natural or artificial structures. The geometrical and topological procedure for the validation of the hydraulic hazard maps consists of two steps. In the first step the whole area is subdivided into fluvial segments, with length chosen as a reasonable trade-off between the need to keep the hydrographical unit as complete as possible and the need to separate sections of the river bed with significantly different morphology. Each of these segments is made of a single elongated polygon, whose shape can be quite complex, especially for meandering river sections, where the flow direction (i.e. the potential energy gradient associated with the talweg) is often inverted. In the second step the segments are analysed

  7. Management of chronic pressure ulcers: an evidence-based analysis.

    PubMed

    2009-01-01

    In April 2008, the Medical Advisory Secretariat began an evidence-based review of the literature concerning pressure ulcers. Please visit the Medical Advisory Secretariat Web site, http://www.health.gov.on.ca/english/providers/program/mas/tech/tech_mn.html, to review the titles currently available within the Pressure Ulcers series: PRESSURE ULCER PREVENTION: an evidence-based analysis; The cost-effectiveness of prevention strategies for pressure ulcers in long-term care homes in Ontario: projections of the Ontario Pressure Ulcer Model (field evaluation); MANAGEMENT OF CHRONIC PRESSURE ULCERS: an evidence-based analysis. The Medical Advisory Secretariat (MAS) conducted a systematic review on interventions used to treat pressure ulcers in order to answer the following questions: Do currently available interventions for the treatment of pressure ulcers increase the healing rate of pressure ulcers compared with standard care, a placebo, or other similar interventions? Within each category of intervention, which one is most effective in promoting the healing of existing pressure ulcers? A pressure ulcer is a localized injury to the skin and/or underlying tissue, usually over a bony prominence, as a result of pressure, or pressure in conjunction with shear and/or friction. Many areas of the body, especially the sacrum and the heel, are prone to the development of pressure ulcers. People with impaired mobility (e.g., stroke or spinal cord injury patients) are most vulnerable to pressure ulcers. Other factors that predispose people to pressure ulcer formation are poor nutrition, poor sensation, urinary and fecal incontinence, and poor overall physical and mental health. The prevalence of pressure ulcers in Ontario has been estimated to range from a median of 22.1% in community settings to a median of 29.9% in nonacute care facilities. Pressure ulcers have been shown to increase the risk of mortality among geriatric patients by as much as 400%, to increase the frequency

  8. Analysis of mixed model in gear transmission based on ADAMS

    NASA Astrophysics Data System (ADS)

    Li, Xiufeng; Wang, Yabin

    2012-09-01

    Traditional methods for simulating mechanical gear drives include the gear pair method and the solid-to-solid contact method. The former has higher solving efficiency but lower accuracy; the latter usually obtains more precise results, but the calculation process is complex and does not converge easily. Currently, most research focuses on the description of geometric models and the definition of boundary conditions; however, neither solves the problem fundamentally. To improve simulation efficiency while ensuring high accuracy, a mixed model method is presented, in which gear tooth profiles take the place of the solid gear to simulate gear movement. In the modeling process, the solid models of the mechanism are first built in SolidWorks; the point coordinates of the gear outline curves are then collected using the SolidWorks API and fitted curves are created in ADAMS from these coordinates; next, the positions of the fitted curves are adjusted according to the position of the contact area; finally, the loading conditions, boundary conditions and simulation parameters are defined. The method provides gear shape information through the tooth profile curves, simulates the meshing process through curve-to-curve contact of the tooth profiles, and supplies mass and inertia data via the solid gear models. This simulation process combines the two models to complete the gear drive analysis. To verify the validity of the presented method, both theoretical derivation and numerical simulation of a runaway escapement were conducted. The results show that the computational efficiency of the mixed model method is 1.4 times that of the traditional solid-to-solid contact method, while the simulation results are closer to theoretical calculations. Consequently, the mixed model method has high application value for the study of gear mechanism dynamics.

  9. Generalization of Entropy Based Divergence Measures for Symbolic Sequence Analysis

    PubMed Central

    Ré, Miguel A.; Azad, Rajeev K.

    2014-01-01

    Entropy based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of Kullback-Leibler divergence or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of its sharing properties with families of other divergence measures and its interpretability in different domains including statistical physics, information theory and mathematical statistics. The uniqueness and versatility of this measure arise because of a number of attributes including generalization to any number of probability distributions and association of weights to the distributions. Furthermore, its entropic formulation allows its generalization in different statistical frameworks, such as, non-extensive Tsallis statistics and higher order Markovian statistics. We revisit these generalizations and propose a new generalization of JSD in the integrated Tsallis and Markovian statistical framework. We show that this generalization can be interpreted in terms of mutual information. We also investigate the performance of different JSD generalizations in deconstructing chimeric DNA sequences assembled from bacterial genomes including that of E. coli, S. enterica typhi, Y. pestis and H. influenzae. Our results show that the JSD generalizations bring in more pronounced improvements when the sequences being compared are from phylogenetically proximal organisms, which are often difficult to distinguish because of their compositional similarity. While small but noticeable improvements were observed with the Tsallis statistical JSD generalization, relatively large improvements were observed with the Markovian generalization. In contrast, the proposed Tsallis-Markovian generalization yielded more pronounced improvements relative to the Tsallis and Markovian generalizations, specifically when the sequences being compared arose from phylogenetically proximal organisms. PMID:24728338
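
    The weighted generalization of JSD to any number of distributions can be written compactly; the sketch below implements the standard (Shannon) form only, not the Tsallis or Markovian generalizations discussed in the paper.

      import numpy as np

      def shannon_entropy(p):
          p = p[p > 0]
          return -np.sum(p * np.log2(p))

      def jensen_shannon_divergence(dists, weights=None):
          # JSD_w(P_1, ..., P_n) = H(sum_i w_i P_i) - sum_i w_i H(P_i)
          P = np.asarray(dists, dtype=float)
          w = np.full(len(P), 1.0 / len(P)) if weights is None else np.asarray(weights, dtype=float)
          return shannon_entropy(w @ P) - np.sum([wi * shannon_entropy(p) for wi, p in zip(w, P)])

      # usage: jensen_shannon_divergence([[0.5, 0.5], [0.9, 0.1]], weights=[0.5, 0.5])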

  10. Economic and policy analysis of university-based drug "detailing".

    PubMed

    Soumerai, S B; Avorn, J

    1986-04-01

    The cost-effectiveness of quality assurance programs is often poorly documented, especially for innovative approaches. The authors analyzed the economic effects of an experimental educational outreach program designed to reduce inappropriate drug prescribing, based on a four-state randomized controlled trial (N = 435 physicians). Primary care physicians randomized into the face-to-face group were offered two individualized educational sessions with clinical pharmacists, lasting an average of 18 minutes each, concerning optimal use of three drug groups that are often used inappropriately. After the program, expenditures for target drugs prescribed by these physicians to Medicaid patients decreased by 13%, compared with controls (P = 0.002); this effect was stable over three quarters. Implementation of this program for 10,000 physicians would lead to projected drug savings (to Medicaid only) of $2,050,000, compared with resource costs of $940,000. Net savings remain high, even after adjustment for use of substitution medications. Although there was a ninefold difference in average preintervention prescribing levels between the highest and lowest thirds of the sample, all groups reduced target drug expenditures at the same rate. Targeting of higher-volume prescribers would thus further raise the observed benefit-to-cost ratio from approximately 1.8 to at least 3.0. Net benefits would also increase further if non-Medicaid savings were added, or if the analysis included quality-of-care considerations. Although print materials alone may be marginally cost-effective, print plus face-to-face approaches offer greater net benefits. The authors conclude that a program of brief, face-to-face "detailing" visits conducted by academic rather than commercial sources can be a highly cost-effective method for improving drug therapy decisions. Such an approach makes possible the enhancement of physicians' clinical expertise without relying on restriction of drug choices.

  11. Generalization of entropy based divergence measures for symbolic sequence analysis.

    PubMed

    Ré, Miguel A; Azad, Rajeev K

    2014-01-01

    Entropy based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of Kullback-Leibler divergence or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of its sharing properties with families of other divergence measures and its interpretability in different domains including statistical physics, information theory and mathematical statistics. The uniqueness and versatility of this measure arise because of a number of attributes including generalization to any number of probability distributions and association of weights to the distributions. Furthermore, its entropic formulation allows its generalization in different statistical frameworks, such as, non-extensive Tsallis statistics and higher order Markovian statistics. We revisit these generalizations and propose a new generalization of JSD in the integrated Tsallis and Markovian statistical framework. We show that this generalization can be interpreted in terms of mutual information. We also investigate the performance of different JSD generalizations in deconstructing chimeric DNA sequences assembled from bacterial genomes including that of E. coli, S. enterica typhi, Y. pestis and H. influenzae. Our results show that the JSD generalizations bring in more pronounced improvements when the sequences being compared are from phylogenetically proximal organisms, which are often difficult to distinguish because of their compositional similarity. While small but noticeable improvements were observed with the Tsallis statistical JSD generalization, relatively large improvements were observed with the Markovian generalization. In contrast, the proposed Tsallis-Markovian generalization yielded more pronounced improvements relative to the Tsallis and Markovian generalizations, specifically when the sequences being compared arose from phylogenetically proximal organisms.

  12. Pattern-based IP block detection, verification, and variability analysis

    NASA Astrophysics Data System (ADS)

    Ahmad Ibrahim, Muhamad Asraf Bin; Muhsain, Mohamad Fahmi Bin; Kamal Baharin, Ezni Aznida Binti; Sweis, Jason; Lai, Ya-Chieh; Hurat, Philippe

    2018-03-01

    The goal of a foundry partner is to deliver high quality silicon product to its customers on time. There is an assumed trust that the silicon will yield, function and perform as expected when the design fits all the sign-off criteria. The use of Intellectual Property (IP) blocks is very common today and provides the customer with pre-qualified and optimized functions for their design thus shortening the design cycle. There are many methods by which an IP Block can be generated and placed within layout. Even with the most careful methods and following of guidelines comes the responsibility of sign-off checking. A foundry needs to detect where these IP Blocks have been placed and look for any violations. This includes DRC clean modifications to the IP Block which may or may not be intentional. Using a pattern-based approach to detect all IP Blocks used provides the foundry advanced capabilities to analyze them further for any kind of changes which could void the OPC and process window optimizations. Having any changes in an IP Block could cause functionality changes or even failures. This also opens the foundry to legal and cost issues while at the same time forcing re-spins of the design. In this publication, we discuss the methodology we have employed to avoid process issues and tape-out errors while at the same time reduce our manual work and improve the turnaround time. We are also able to use our pattern analysis to improve our OPC optimizations when modifications are encountered which have not been seen before.

  13. Computerized summary scoring: crowdsourcing-based latent semantic analysis.

    PubMed

    Li, Haiying; Cai, Zhiqiang; Graesser, Arthur C

    2017-11-03

    In this study we developed and evaluated a crowdsourcing-based latent semantic analysis (LSA) approach to computerized summary scoring (CSS). LSA is a frequently used mathematical component in CSS, where LSA similarity represents the extent to which the to-be-graded target summary is similar to a model summary or a set of exemplar summaries. Researchers have proposed different formulations of the model summary in previous studies, such as pregraded summaries, expert-generated summaries, or source texts. The former two methods, however, require substantial human time, effort, and costs in order to either grade or generate summaries. Using source texts does not require human effort, but it also does not predict human summary scores well. With human summary scores as the gold standard, in this study we evaluated the crowdsourcing LSA method by comparing it with seven other LSA methods that used sets of summaries from different sources (either experts or crowdsourced) of differing quality, along with source texts. Results showed that crowdsourcing LSA predicted human summary scores as well as expert-good and crowdsourcing-good summaries, and better than the other methods. A series of analyses with different numbers of crowdsourcing summaries demonstrated that the number (from 10 to 100) did not significantly affect performance. These findings imply that crowdsourcing LSA is a promising approach to CSS, because it saves human effort in generating the model summary while still yielding comparable performance. This approach to small-scale CSS provides a practical solution for instructors in courses, and also advances research on automated assessments in which student responses are expected to semantically converge on subject matter content.
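
    The LSA similarity computation itself can be sketched with scikit-learn: a TF-IDF matrix over the crowdsourced model summaries is reduced with truncated SVD, and the to-be-graded summary is scored by its cosine similarity to the model summaries in that latent space. The corpus, dimensionality and scoring rule below are illustrative assumptions, not the authors' exact pipeline.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.decomposition import TruncatedSVD
      from sklearn.metrics.pairwise import cosine_similarity

      def lsa_summary_score(model_summaries, target_summary, n_components=50):
          vec = TfidfVectorizer(stop_words="english")
          X = vec.fit_transform(model_summaries + [target_summary])
          n_components = min(n_components, X.shape[1] - 1, len(model_summaries))
          Z = TruncatedSVD(n_components=n_components).fit_transform(X)
          sims = cosine_similarity(Z[-1:], Z[:-1])          # target vs. each model summary
          return sims.mean()                                 # average LSA similarity as the score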

  14. Joint multifractal analysis based on the partition function approach: analytical analysis, numerical simulation and empirical application

    NASA Astrophysics Data System (ADS)

    Xie, Wen-Jie; Jiang, Zhi-Qiang; Gu, Gao-Feng; Xiong, Xiong; Zhou, Wei-Xing

    2015-10-01

    Many complex systems generate multifractal time series which are long-range cross-correlated. Numerous methods have been proposed to characterize the multifractal nature of these long-range cross correlations. However, several important issues about these methods are not well understood and most methods consider only one moment order. We study the joint multifractal analysis based on partition function with two moment orders, which was initially invented to investigate fluid fields, and derive analytically several important properties. We apply the method numerically to binomial measures with multifractal cross correlations and bivariate fractional Brownian motions without multifractal cross correlations. For binomial multifractal measures, the explicit expressions of mass function, singularity strength and multifractal spectrum of the cross correlations are derived, which agree excellently with the numerical results. We also apply the method to stock market indexes and unveil intriguing multifractality in the cross correlations of index volatilities.
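
    The two-moment-order partition function can be sketched directly: both measures are summed over boxes of size s, the joint moments are formed, and the scaling exponent follows from a log-log regression. The implementation below assumes the two series are positive measures of equal length and is a generic illustration of the approach rather than the authors' code.

      import numpy as np

      def joint_scaling_exponent(x, y, q1, q2, scales):
          # chi(q1, q2; s) = sum_boxes mu_x(box)**q1 * mu_y(box)**q2; tau = d log(chi) / d log(s)
          x = np.abs(np.asarray(x, dtype=float))
          y = np.abs(np.asarray(y, dtype=float))
          log_s, log_chi = [], []
          for s in scales:
              n = len(x) // s
              bx = x[:n * s].reshape(n, s).sum(axis=1); bx /= bx.sum()   # box measures of x
              by = y[:n * s].reshape(n, s).sum(axis=1); by /= by.sum()   # box measures of y
              log_s.append(np.log(s))
              log_chi.append(np.log(np.sum(bx ** q1 * by ** q2)))
          slope, _ = np.polyfit(log_s, log_chi, 1)
          return slope    # joint scaling exponent tau(q1, q2)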

  15. A Stirling engine analysis method based upon moving gas nodes

    NASA Technical Reports Server (NTRS)

    Martini, W. R.

    1986-01-01

    A Lagrangian nodal analysis method for Stirling engines (SEs) is described, validated, and applied to a conventional SE and an isothermalized SE (with fins in the hot and cold spaces). The analysis employs a constant-mass gas node (which moves with respect to the solid nodes during each time step) instead of the fixed gas nodes of Eulerian analysis. The isothermalized SE is found to have efficiency only slightly greater than that of a conventional SE.

  16. CMS Configuration Editor: GUI based application for user analysis job

    NASA Astrophysics Data System (ADS)

    de Cosa, A.

    2011-12-01

    We present the user interface and software architecture of the Configuration Editor for the CMS experiment. The analysis workflow is organized in a modular way and integrated within the CMS framework, which organizes user analysis code flexibly. The Python scripting language is adopted to define the job configuration that drives the analysis workflow. Developing analysis jobs and managing the configuration of the many required modules can be a challenging task for users, especially newcomers. For this reason, a graphical tool has been conceived to edit and inspect configuration files. A set of common analysis tools defined in the CMS Physics Analysis Toolkit (PAT) can be steered and configured using the Config Editor. A user-defined analysis workflow can be produced starting from a standard configuration file, applying and configuring PAT tools according to specific user requirements. CMS users can adopt the Config Editor to create their analysis while visualizing in real time the effects of their actions. They can visualize the structure of their configuration, look at the modules included in the workflow, inspect the dependencies among the modules and check the data flow. They can see the values at which parameters are set and change them according to the requirements of their analysis task. Integrating common tools into the GUI required adopting an object-oriented structure in the Python definition of the PAT tools and defining a layer of abstraction from which all PAT tools inherit.

  17. Reachability Analysis for Base Placement in Mobile Manipulators

    NASA Technical Reports Server (NTRS)

    Seraji, H.

    1994-01-01

    This paper addresses the problem of base placement for mobile robots, and proposes a simple off-line solution to determine the appropriate base locations from which the robot can reach a target point.

  18. Feasibility analysis of base compaction specification : [project brief].

    DOT National Transportation Integrated Search

    2013-02-01

    Appropriate design and construction of the aggregate base layer has significant influence on structural stability and performance of pavements. Controlling the construction quality of the granular base layer is important to achieve long-lasting p...

  19. Testing Evolutionary Hypotheses in the Classroom with MacClade Software.

    ERIC Educational Resources Information Center

    Codella, Sylvio G.

    2002-01-01

    Introduces MacClade which is a Macintosh-based software package that uses the techniques of cladistic analysis to explore evolutionary patterns. Describes a novel and effective exercise that allows undergraduate biology majors to test a hypothesis about behavioral evolution in insects. (Contains 13 references.) (Author/YDS)

  20. GIS-Based crash referencing and analysis system

    DOT National Transportation Integrated Search

    1999-02-01

    One area where a Geographic Information System (GIS) has yet to be extensively applied is in the analysis of crash data. Computerized crash analysis systems in which crash data, roadway inventory data, and traffic operations data can be merged are us...

  1. Rasch Based Analysis of Oral Proficiency Test Data.

    ERIC Educational Resources Information Center

    Nakamura, Yuji

    2001-01-01

    This paper examines the rating scale data of oral proficiency tests analyzed by a Rasch Analysis focusing on an item map and factor analysis. In discussing the item map, the difficulty order of six items and students' answering patterns are analyzed using descriptive statistics and measures of central tendency of test scores. The data ranks the…

  2. Performance analysis of a potassium-base AMTEC cell

    SciTech Connect

    Huang, C.; Hendricks, T.J.; Hunt, T.K.

    1998-07-01

    Sodium-BASE Alkali-Metal-Thermal-to-Electric-Conversion (AMTEC) cells have been receiving increased attention and funding from the Department of Energy, NASA and the United States Air Force. Recently, sodium-BASE (Na-BASE) AMTEC cells were selected for the Advanced Radioisotope Power System (ARPS) program for the next generation of deep-space missions and spacecraft. Potassium-BASE (K-BASE) AMTEC cells have not received as much attention to date, even though the vapor pressure of potassium is higher than that of sodium at the same temperature, so that K-BASE AMTEC cells with potentially higher open-circuit voltage and higher power output than Na-BASE AMTEC cells are possible. Because the surface tension of potassium is about half that of sodium at the same temperature, the artery and evaporator design in a potassium AMTEC cell has much more challenging pore-size requirements than designs using sodium. This paper uses a flexible thermal/fluid/electrical model to predict the performance of a K-BASE AMTEC cell. Pore sizes in the artery of K-BASE AMTEC cells must be smaller by an order of magnitude than in Na-BASE AMTEC cells. The performance of a K-BASE AMTEC cell was higher than that of a Na-BASE AMTEC cell at low voltages/high currents. K-BASE AMTEC cells also have the potential for much better electrode performance, thereby creating another avenue for potentially better performance in K-BASE AMTEC cells.

  3. Ontology-Based Vaccine Adverse Event Representation and Analysis.

    PubMed

    Xie, Jiangan; He, Yongqun

    2017-01-01

    Vaccines are among the greatest inventions of modern medicine and have contributed greatly to the relief of human misery and the increase in life expectancy. In 1796, an English country physician, Edward Jenner, discovered that inoculating mankind with cowpox can protect them from smallpox (Riedel S, Edward Jenner and the history of smallpox and vaccination. Proceedings (Baylor University. Medical Center) 18(1):21, 2005). Based on worldwide vaccination, we finally succeeded in the eradication of smallpox in 1977 (Henderson, Vaccine 29:D7-D9, 2011). Other disabling and lethal diseases, like poliomyelitis and measles, are targeted for eradication (Bonanni, Vaccine 17:S120-S125, 1999). Although vaccine development and administration are tremendously successful and cost-effective practices for human health, no vaccine is 100% safe for everyone, because each person reacts to vaccinations differently given different genetic backgrounds and health conditions. Although all licensed vaccines are generally safe for the majority of people, vaccinees may still suffer adverse events (AEs) in reaction to various vaccines, some of which can be serious or even fatal (Haber et al., Drug Saf 32(4):309-323, 2009). Hence, the double-edged sword of vaccination remains a concern. To support integrative AE data collection and analysis, it is critical to adopt an AE normalization strategy. In the past decades, different controlled terminologies, including the Medical Dictionary for Regulatory Activities (MedDRA) (Brown EG, Wood L, Wood S, et al., Drug Saf 20(2):109-117, 1999), the Common Terminology Criteria for Adverse Events (CTCAE) (NCI, The Common Terminology Criteria for Adverse Events (CTCAE). Available from: http://evs.nci.nih.gov/ftp1/CTCAE/About.html . Access on 7 Oct 2015), and the World Health Organization (WHO) Adverse Reactions Terminology (WHO-ART) (WHO, The WHO Adverse Reaction Terminology - WHO-ART. Available from: https://www.umc-products.com/graphics/28010.pdf

  4. Violence in public transportation: an approach based on spatial analysis.

    PubMed

    Sousa, Daiane Castro Bittencourt de; Pitombo, Cira Souza; Rocha, Samille Santos; Salgueiro, Ana Rita; Delgado, Juan Pedro Moreno

    2017-12-11

    To carry out a spatial analysis of the occurrence of acts of violence (specifically robberies) in public transportation, identifying the regions of greater incidence using geostatistics, and possible causes with the aid of a multicriteria analysis in the Geographic Information System. The unit of analysis is the traffic analysis zone of the survey named Origem-Destino, carried out in Salvador, state of Bahia, in 2013. The robberies recorded by the Department of Public Security of Bahia in 2013 were located and made compatible with the limits of the traffic analysis zones and, later, associated with the respective centroids. After determining the regions with the highest probability of robbery, we carried out a geographic analysis of the possible causes in the region with the highest robbery potential, considering the factors analyzed using a multicriteria analysis in a Geographic Information System environment. The execution of the two steps of this study allowed us to identify the areas corresponding to the greatest probability of occurrence of robberies in public transportation. In addition, the three most vulnerable road sections (Estrada da Liberdade, Rua Pero Vaz, and Avenida General San Martin) were identified in these areas. In these sections, the factors that most contribute to the potential for robbery on buses are: F1 - proximity to places that facilitate escape, F3 - great movement of persons, and F2 - absence of policing, respectively. Indicator Kriging (geostatistical estimation) can be used to construct a spatial probability surface, which can be a useful tool for the implementation of public policies. The multicriteria analysis in the Geographic Information System environment allowed us to understand the spatial factors related to the phenomenon under analysis.

  5. Violence in public transportation: an approach based on spatial analysis

    PubMed Central

    de Sousa, Daiane Castro Bittencourt; Pitombo, Cira Souza; Rocha, Samille Santos; Salgueiro, Ana Rita; Delgado, Juan Pedro Moreno

    2017-01-01

    ABSTRACT OBJECTIVE To carry out a spatial analysis of the occurrence of acts of violence (specifically robberies) in public transportation, identifying the regions of greater incidence using geostatistics, and possible causes with the aid of a multicriteria analysis in the Geographic Information System. METHODS The unit of analysis is the traffic analysis zone of the survey named Origem-Destino, carried out in Salvador, state of Bahia, in 2013. The robberies recorded by the Department of Public Security of Bahia in 2013 were located and made compatible with the limits of the traffic analysis zones and, later, associated with the respective centroids. After determining the regions with the highest probability of robbery, we carried out a geographic analysis of the possible causes in the region with the highest robbery potential, considering the factors analyzed using a multicriteria analysis in a Geographic Information System environment. RESULTS The execution of the two steps of this study allowed us to identify the areas corresponding to the greatest probability of occurrence of robberies in public transportation. In addition, the three most vulnerable road sections (Estrada da Liberdade, Rua Pero Vaz, and Avenida General San Martin) were identified in these areas. In these sections, the factors that most contribute to the potential for robbery on buses are: F1 - proximity to places that facilitate escape, F3 - great movement of persons, and F2 - absence of policing, respectively. CONCLUSIONS Indicator Kriging (geostatistical estimation) can be used to construct a spatial probability surface, which can be a useful tool for the implementation of public policies. The multicriteria analysis in the Geographic Information System environment allowed us to understand the spatial factors related to the phenomenon under analysis. PMID:29236883

  6. Geomechanics-Based Stochastic Analysis of Injection- Induced Seismicity

    SciTech Connect

    Ghassemi, Ahmad

    The production of geothermal energy from dry and low-permeability reservoirs is achieved by water circulation in natural and/or man-made fractures, and is referred to as enhanced or engineered geothermal systems (EGS). Often, the permeable zones have to be created by stimulation, a process which involves fracture initiation and/or activation of discontinuities such as faults and joints due to pore pressure and in-situ stress perturbations. The stimulation of a rock mass is often accompanied by multiple microseismic events. Micro-seismic events are associated with rock failure in shear, and with shear slip on new or pre-existing fracture planes and possibly their propagation. The microseismic signals contain information about the sources of energy that can be used for understanding the hydraulic fracturing process and the created reservoir properties. Detection and interpretation of microseismic events is useful for estimating the stimulated zone, the created reservoir permeability and fracture growth, and the geometry of the geological structures and the in-situ stress state. The process is commonly referred to as seismicity-based reservoir characterization (SBRC). Although progress has been made by the scientific and geothermal communities in the quantitative and qualitative analysis of reservoir stimulation using SBRC, several key questions remain unresolved in the analysis of micro-seismicity, namely: the variation of seismic activity with injection rate, delayed micro-seismicity, and the relation of the stimulated zone to the injected volume and its rate and to the resulting reservoir permeability. In addition, the current approach to SBRC does not consider the full range of relevant poroelastic and thermoelastic phenomena and neglects the uncertainty in rock properties and in-situ stress in the data inversion process. The objective of this research and technology development was to develop a 3D SBRC model that addresses these shortcomings by taking into account hydro

  7. Functional brain imaging: an evidence-based analysis.

    PubMed

    2006-01-01

    The objective of this analysis is to review a spectrum of functional brain imaging technologies to identify whether there are any imaging modalities that are more effective than others for various brain pathology conditions. This evidence-based analysis reviews magnetoencephalography (MEG), magnetic resonance spectroscopy (MRS), positron emission tomography (PET), and functional magnetic resonance imaging (fMRI) for the diagnosis or surgical management of the following conditions: Alzheimer's disease (AD), brain tumours, epilepsy, multiple sclerosis (MS), and Parkinson's disease (PD). TARGET POPULATION AND CONDITION Alzheimer's disease is a progressive, degenerative, neurologic condition characterized by cognitive impairment and memory loss. The Canadian Study on Health and Aging estimated that there will be 97,000 incident cases (about 60,000 women) of dementia (including AD) in Canada in 2006. In Ontario, there will be an estimated 950 new cases and 580 deaths due to brain cancer in 2006. Treatments for brain tumours include surgery and radiation therapy. However, one of the limitations of radiation therapy is that it damages tissue through necrosis and scarring. Computed tomography (CT) and magnetic resonance imaging (MRI) may not distinguish between radiation effects and resistant tissue, creating a potential role for functional brain imaging. Epilepsy is a chronic disorder that provokes repetitive seizures. In Ontario, the rate of epilepsy is estimated to be 5 cases per 1,000 people. Most people with epilepsy are effectively managed with drug therapy, but about 50% do not respond to drug therapy. Surgical resection of the seizure foci may be considered in these patients, and functional brain imaging may play a role in localizing the seizure foci. Multiple sclerosis is a progressive, inflammatory, demyelinating disease of the central nervous system (CNS). The cause of MS is unknown; however, it is thought to be due to a combination of etiologies, including

  8. Quality control analysis : part II : soil and aggregate base course.

    DOT National Transportation Integrated Search

    1966-07-01

    This is the second of the three reports on the quality control analysis of highway construction materials. : It deals with the statistical evaluation of results from several construction projects to determine the basic pattern of variability with res...

  9. The coordinate-based meta-analysis of neuroimaging data

    PubMed Central

    Samartsidis, Pantelis; Montagna, Silvia; Nichols, Thomas E.; Johnson, Timothy D.

    2017-01-01

    Neuroimaging meta-analysis is an area of growing interest in statistics. The special characteristics of neuroimaging data render classical meta-analysis methods inapplicable and therefore new methods have been developed. We review existing methodologies, explaining the benefits and drawbacks of each. A demonstration on a real dataset of emotion studies is included. We discuss some still-open problems in the field to highlight the need for future research. PMID:29545671

  10. The coordinate-based meta-analysis of neuroimaging data.

    PubMed

    Samartsidis, Pantelis; Montagna, Silvia; Nichols, Thomas E; Johnson, Timothy D

    2017-01-01

    Neuroimaging meta-analysis is an area of growing interest in statistics. The special characteristics of neuroimaging data render classical meta-analysis methods inapplicable and therefore new methods have been developed. We review existing methodologies, explaining the benefits and drawbacks of each. A demonstration on a real dataset of emotion studies is included. We discuss some still-open problems in the field to highlight the need for future research.

  11. Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge Based Image Analysis

    DTIC Science & Technology

    1989-08-01

    Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge Based Image Analysis. Final Technical Report, December... Keywords: pattern recognition, blackboard oriented symbolic processing, knowledge based image analysis, image understanding, aerial imagery, urban area.

  12. Soil Studies: Applying Acid-Base Chemistry to Environmental Analysis.

    ERIC Educational Resources Information Center

    West, Donna M.; Sterling, Donna R.

    2001-01-01

    Laboratory activities for chemistry students focus attention on the use of acid-base chemistry to examine environmental conditions. After using standard laboratory procedures to analyze soil and rainwater samples, students use web-based resources to interpret their findings. Uses CBL probes and graphing calculators to gather and analyze data and…

  13. Classroom Application of a Trial-Based Functional Analysis

    ERIC Educational Resources Information Center

    Bloom, Sarah E.; Iwata, Brian A.; Fritz, Jennifer N.; Roscoe, Eileen M.; Carreau, Abbey B.

    2011-01-01

    We evaluated a trial-based approach to conducting functional analyses in classroom settings. Ten students referred for problem behavior were exposed to a series of assessment trials, which were interspersed among classroom activities throughout the day. Results of these trial-based functional analyses were compared to those of more traditional…

  14. Developing a Problem-Based Course Based on Needs Analysis to Enhance English Reading Ability of Thai Undergraduate Students

    ERIC Educational Resources Information Center

    Bosuwon, Takwa; Woodrow, Lindy

    2009-01-01

    This paper reports on a needs analysis underlying a proposed business English reading course using a problem-based learning approach designed to enhance English reading abilities of Thai undergraduate students. As part of a work in progress, the needs analysis survey was done prior to the course design with the major stakeholders in business and…

  15. Triangulation Based 3D Laser Imaging for Fracture Orientation Analysis

    NASA Astrophysics Data System (ADS)

    Mah, J.; Claire, S.; Steve, M.

    2009-05-01

    Laser imaging has recently been identified as a potential tool for rock mass characterization. This contribution focuses on the application of triangulation based, short-range laser imaging to determine fracture orientation and surface texture. This technology measures the distance to the target by triangulating the projected and reflected laser beams, and also records the reflection intensity. In this study, we acquired 3D laser images of rock faces using the Laser Camera System (LCS), a portable instrument developed by Neptec Design Group (Ottawa, Canada). The LCS uses an infrared laser beam and is immune to the lighting conditions. The maximum image resolution is 1024 x 1024 volumetric image elements. Depth resolution is 0.5 mm at 5 m. An above ground field trial was conducted at a blocky road cut with well defined joint sets (Kingston, Ontario). An underground field trial was conducted at the Inco 175 Ore body (Sudbury, Ontario) where images were acquired in the dark and the joint set features were more subtle. At each site, from a distance of 3 m away from the rock face, a grid of six images (approximately 1.6 m by 1.6 m) was acquired at maximum resolution with 20% overlap between adjacent images. This corresponds to a density of 40 image elements per square centimeter. Polyworks, a high density 3D visualization software tool, was used to align and merge the images into a single digital triangular mesh. The conventional method of determining fracture orientations is by manual measurement using a compass. In order to be accepted as a substitute for this method, the LCS should be capable of performing at least to the capabilities of manual measurements. To compare fracture orientation estimates derived from the 3D laser images to manual measurements, 160 inclinometer readings were taken at the above ground site. Three prominent joint sets (strike/dip: 236/09, 321/89, 325/01) were identified by plotting the joint poles on a stereonet. Underground, two main joint

  16. Image-based RSA: Roentgen stereophotogrammetric analysis based on 2D-3D image registration.

    PubMed

    de Bruin, P W; Kaptein, B L; Stoel, B C; Reiber, J H C; Rozing, P M; Valstar, E R

    2008-01-01

    Image-based Roentgen stereophotogrammetric analysis (IBRSA) integrates 2D-3D image registration and conventional RSA. Instead of radiopaque RSA bone markers, IBRSA uses 3D CT data, from which digitally reconstructed radiographs (DRRs) are generated. Using 2D-3D image registration, the 3D pose of the CT is iteratively adjusted such that the generated DRRs resemble the 2D RSA images as closely as possible, according to an image matching metric. Effectively, by registering all 2D follow-up moments to the same 3D CT, the CT volume functions as common ground. In two experiments, using RSA and using a micromanipulator as gold standard, IBRSA has been validated on cadaveric and sawbone scapula radiographs, and good matching results have been achieved. The accuracy was: |mu| < 0.083 mm for translations and |mu| < 0.023 degrees for rotations. The precision sigma in x-, y-, and z-direction was 0.090, 0.077, and 0.220 mm for translations and 0.155 degrees, 0.243 degrees, and 0.074 degrees for rotations. Our results show that the accuracy and precision of in vitro IBRSA, performed under ideal laboratory conditions, are lower than in vitro standard RSA but higher than in vivo standard RSA. Because IBRSA does not require radiopaque markers, it adds functionality to the RSA method by opening new directions and possibilities for research, such as dynamic analyses using fluoroscopy on subjects without markers and computer navigation applications.

  17. Dynamic Systems Analysis for Turbine Based Aero Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.

    2016-01-01

    The aircraft engine design process seeks to optimize the overall system-level performance, weight, and cost for a given concept. Steady-state simulations and data are used to identify trade-offs that should be balanced to optimize the system in a process known as systems analysis. These systems analysis simulations and data may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic systems analysis provides the capability for assessing the dynamic trade-offs at an earlier stage of the engine design process. The dynamic systems analysis concept, developed tools, and potential benefit are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed to provide the user with an estimate of the closed-loop performance (response time) and operability (high pressure compressor surge margin) for a given engine design and set of control design requirements. TTECTrA, along with engine deterioration information, can be used to develop a more generic relationship between performance and operability that can impact the engine design constraints and potentially lead to a more efficient engine.

  18. Advanced electrophysiologic mapping systems: an evidence-based analysis.

    PubMed

    2006-01-01

    any of the advanced systems to fluoroscopy-guided ablation of tachycardia. English-language studies with sample sizes greater than or equal to 20 that were published between 2000 and 2005 were included. Observational studies on safety of advanced mapping systems and fluoroscopy were also included. Outcomes of interest were acute success, defined as termination of arrhythmia immediately following ablation; long-term success, defined as being arrhythmia free at follow-up; total procedure time; fluoroscopy time; radiation dose; number of radiofrequency pulses; complications; cost; and the cost-effectiveness ratio. Quality of the individual studies was assessed using established criteria. Quality of the overall evidence was determined by applying the GRADE evaluation system. (3) Qualitative synthesis of the data was performed. Quantitative analysis using Revman 4.2 was performed when appropriate. Quality of the Studies Thirty-four studies met the inclusion criteria. These comprised 18 studies on CARTO (4 randomized controlled trials [RCTs] and 14 non-RCTs), 3 RCTs on EnSite NavX, 4 studies on LocaLisa Navigational System (1 RCT and 3 non-RCTs), 2 studies on EnSite and CARTO, 1 on Polar Constellation basket catheter, and 7 studies on radiation safety. The quality of the studies ranged from moderate to low. Most of the studies had small sample sizes with selection bias, and there was no blinding of patients or care providers in any of the studies. Duration of follow-up ranged from 6 weeks to 29 months, with most having at least 6 months of follow-up. There was heterogeneity with respect to the approach to ablation, definition of success, and drug management before and after the ablation procedure. Evidence is based on a small number of small RCTs and non-RCTs with methodological flaws. Advanced nonfluoroscopy mapping/navigation systems provided real time 3-dimensional images with integration of anatomic and electrical potential information that enable better visualization of

  19. Crude oil price analysis and forecasting based on variational mode decomposition and independent component analysis

    NASA Astrophysics Data System (ADS)

    E, Jianwei; Bao, Yanling; Ye, Jimin

    2017-10-01

    As one of the most vital energy resources in the world, crude oil plays a significant role in the international economic market, and the fluctuation of its price has attracted academic and commercial attention. Many methods exist for forecasting the trend of the crude oil price, but traditional models fail to predict it accurately. Therefore, a hybrid method is proposed in this paper that combines variational mode decomposition (VMD), independent component analysis (ICA), and the autoregressive integrated moving average (ARIMA), called VMD-ICA-ARIMA. The purpose of this study is to analyze the factors that influence the crude oil price and to predict its future values. The major steps are as follows: first, the VMD model is applied to the original signal (the crude oil price) to decompose it adaptively into mode functions; second, independent components are separated by ICA, and how the independent components affect the crude oil price is analyzed; finally, the crude oil price is forecast with the ARIMA model, and the forecast trend demonstrates that the crude oil price declines periodically. Compared with the benchmark ARIMA and EEMD-ICA-ARIMA models, VMD-ICA-ARIMA forecasts the crude oil price more accurately.
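
    The decomposition-then-forecast pipeline described above can be sketched in a few lines of Python. This is only an illustration, assuming the third-party vmdpy package for VMD plus scikit-learn and statsmodels, with a synthetic price series and arbitrary parameter choices; it does not reproduce how the paper recombines the three pieces.

      # Rough VMD -> ICA -> ARIMA sketch (synthetic data; libraries are assumptions).
      import numpy as np
      from vmdpy import VMD
      from sklearn.decomposition import FastICA
      from statsmodels.tsa.arima.model import ARIMA

      price = np.cumsum(np.random.randn(500)) + 60.0   # stand-in for the crude oil price

      # 1) Adaptive decomposition of the price signal into K mode functions.
      K = 5
      modes, _, _ = VMD(price, alpha=2000, tau=0.0, K=K, DC=0, init=1, tol=1e-7)

      # 2) Separate statistically independent components from the modes.
      ica = FastICA(n_components=K, random_state=0)
      components = ica.fit_transform(modes.T)          # shape (n_samples, K)

      # 3) Fit an ARIMA model on the price series and forecast future values.
      fit = ARIMA(price, order=(1, 1, 1)).fit()
      print(fit.forecast(steps=10))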

  20. Reliability analysis of the solar array based on Fault Tree Analysis

    NASA Astrophysics Data System (ADS)

    Jianing, Wu; Shaoze, Yan

    2011-07-01

    The solar array is an important device used on spacecraft, which influences the quality of in-orbit operation of the spacecraft and even the launch. This paper analyzes the reliability of the mechanical system and identifies the most vital subsystem of the solar array. The fault tree analysis (FTA) model is established according to the operating process of the mechanical system based on the DFH-3 satellite; the logical expression of the top event is obtained by Boolean algebra and the reliability of the solar array is calculated. The conclusion shows that the hinges are the most vital links between the solar arrays. By analyzing the structural importance (SI) of the hinge's FTA model, some fatal causes, including faults of the seal, insufficient torque of the locking spring, temperature in space, and friction force, can be identified. Damage is the initial stage of the fault, so limiting damage is important for preventing faults. Furthermore, recommendations for improving reliability associated with damage limitation are discussed, which can be used in redesigning the solar array and in reliability growth planning.
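
    The Boolean-algebra step the abstract refers to reduces, for independent basic events, to simple AND/OR probability combinations. The toy calculation below illustrates only that step; the event names and probabilities are invented, not values from the DFH-3 analysis.

      # Toy fault-tree top-event calculation (event probabilities are illustrative).
      def p_or(*ps):
          """Probability that at least one independent basic event occurs."""
          out = 1.0
          for p in ps:
              out *= (1.0 - p)
          return 1.0 - out

      def p_and(*ps):
          """Probability that all independent basic events occur."""
          out = 1.0
          for p in ps:
              out *= p
          return out

      # Hypothetical basic events for a hinge failure.
      seal_fault        = 1e-3
      spring_torque_low = 5e-4
      thermal_stress    = 2e-3
      friction_force    = 1e-3

      # Top event: the hinge fails if the seal fails OR the spring torque is too low
      # OR thermal stress and friction act together.
      hinge_failure = p_or(seal_fault, spring_torque_low,
                           p_and(thermal_stress, friction_force))
      print(f"P(hinge failure) = {hinge_failure:.2e}")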

  1. dada - a web-based 2D detector analysis tool

    NASA Astrophysics Data System (ADS)

    Osterhoff, Markus

    2017-06-01

    The data daemon, dada, is a server backend for unified access to 2D pixel detector image data stored with different detectors, file formats and saved with varying naming conventions and folder structures across instruments. Furthermore, dada implements basic pre-processing and analysis routines from pixel binning over azimuthal integration to raster scan processing. Common user interactions with dada are by a web frontend, but all parameters for an analysis are encoded into a Uniform Resource Identifier (URI) which can also be written by hand or scripts for batch processing.
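
    Because every analysis is addressed by a parameter-encoded URI, batch jobs can be generated by a short script. The snippet below only illustrates the idea with Python's standard library; the host name and parameter keys are hypothetical and are not dada's actual interface.

      # Building parameter-encoded analysis URIs for batch processing (illustrative only).
      from urllib.parse import urlencode, urlunsplit

      jobs = [
          {"instrument": "p10", "scan": scan, "binning": 4, "task": "azimuthal_integration"}
          for scan in range(40, 44)          # hypothetical parameter names and values
      ]
      for params in jobs:
          uri = urlunsplit(("https", "dada.example.org", "/analysis", urlencode(params), ""))
          print(uri)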

  2. A Study of Crisis Management Based on Stakeholders Analysis Model

    NASA Astrophysics Data System (ADS)

    Qingchun, Yue

    2017-11-01

    From the viewpoint of stakeholder theory, enterprises should not only serve shareholders but also attend to the demands of stakeholders. Stakeholders in an enterprise crisis are the organizations and individuals that cause the crisis, respond to it, or are affected by it. In this paper, we first trace the development of stakeholder theory systematically; second, with the help of an enterprise-crisis stakeholder analysis model, we analyze who the stakeholders in an enterprise crisis are and their membership, using Shuanghui Group as an example for further analysis; finally, we put forward proposals for handling enterprise crises from the stakeholders' point of view.

  3. Analysis of MLS Based Surveillance System (MLSS) Concepts

    DOT National Transportation Integrated Search

    1989-04-01

    This report examines a number of surveillance system concepts to support safe independent runway approaches and converging runways under weather conditions. All surveillance concepts are based on the use of MLS signals. The resulting surveillance is ava...

  4. Image watermarking capacity analysis based on Hopfield neural network

    NASA Astrophysics Data System (ADS)

    Zhang, Fan; Zhang, Hongbin

    2004-11-01

    In watermarking schemes, watermarking can be viewed as a communication problem. Almost all previous work on image watermarking capacity is based on information theory, using the Shannon formula to calculate the watermarking capacity. In this paper, we present a blind watermarking algorithm using a Hopfield neural network, and analyze watermarking capacity based on the neural network. In our watermarking algorithm, watermarking capacity is determined by the attraction basin of the associative memory.
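
    For contrast with the neural-network approach, the Shannon-formula estimate the authors mention treats each pixel as a Gaussian channel with capacity C = (1/2) log2(1 + S/N) bits per pixel. The numbers in the sketch below are illustrative only.

      # Conventional Shannon-formula capacity estimate (illustrative values).
      import numpy as np

      def capacity_bits_per_pixel(watermark_power, noise_power):
          # C = 1/2 * log2(1 + S/N) bits per channel use (per pixel).
          return 0.5 * np.log2(1.0 + watermark_power / noise_power)

      watermark_power = 10.0      # allowed embedding distortion per pixel (illustrative)
      attack_noise    = 50.0      # attack/compression noise power (illustrative)
      pixels = 512 * 512
      print(pixels * capacity_bits_per_pixel(watermark_power, attack_noise), "bits")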

  5. A Risk-Analysis Approach to Implementing Web-Based Assessment

    ERIC Educational Resources Information Center

    Ricketts, Chris; Zakrzewski, Stan

    2005-01-01

    Computer-Based Assessment is a risky business. This paper proposes the use of a model for web-based assessment systems that identifies pedagogic, operational, technical (non web-based), web-based and financial risks. The strategies and procedures for risk elimination or reduction arise from risk analysis and management and are the means by which…

  6. USARCENT AOR Contingency Base Waste Stream Analysis: An Analysis of Solid Waste Streams at Five Bases in the U. S. Army Central (USARCENT) Area of Responsibility

    DTIC Science & Technology

    2013-03-31

    certainly remain comingled with other solid waste. For example, some bases provided containers for segregation of recyclables including plastic and...prevalent types of solid waste are food (19.1% by average sample weight), wood (18.9%), and plastics (16.0%) based on analysis of bases in...within the interval shown. Food and wood wastes are the largest components of the average waste stream (both at ~19% by weight), followed by plastic

  7. Prediction of Recidivism in Juvenile Offenders Based on Discriminant Analysis.

    ERIC Educational Resources Information Center

    Proefrock, David W.

    The recent development of strong statistical techniques has made accurate predictions of recidivism possible. To investigate the utility of discriminant analysis methodology in making predictions of recidivism in juvenile offenders, the court records of 271 male and female juvenile offenders, aged 12-16, were reviewed. A cross validation group…

  8. A Structural and Content-Based Analysis for Web Filtering.

    ERIC Educational Resources Information Center

    Lee, P. Y.; Hui, S. C.; Fong, A. C. M.

    2003-01-01

    Presents an analysis of the distinguishing features of pornographic Web pages so that effective filtering techniques can be developed. Surveys the existing techniques for Web content filtering and describes the implementation of a Web content filtering system that uses an artificial neural network. (Author/LRW)

  9. The Bases of Teacher Expectancies: A Meta-Analysis.

    ERIC Educational Resources Information Center

    Dusek, Jerome B.; Joseph, Gail

    1983-01-01

    A meta-analysis of 77 studies on teacher expectancies led to the following conclusions: student attractiveness, conduct, cumulative folder information, race, and social class were related to teacher expectancies. Student gender and the number of parents at home were not related to teacher expectancies. (Author/LC)

  10. Knowledge-Base Semantic Gap Analysis for the Vulnerability Detection

    NASA Astrophysics Data System (ADS)

    Wu, Raymond; Seki, Keisuke; Sakamoto, Ryusuke; Hisada, Masayuki

    Web security has become a pressing concern in Internet computing. To cope with ever-rising security complexity, semantic analysis is proposed to fill the gap that current approaches fail to cover. Conventional methods limit their focus to the physical source code rather than an abstraction of its semantics; as a result, new types of vulnerability are missed, causing tremendous business loss.

  11. Rasch Model Based Analysis of the Force Concept Inventory

    ERIC Educational Resources Information Center

    Planinic, Maja; Ivanjek, Lana; Susac, Ana

    2010-01-01

    The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One of such tools is the stochastic Rasch model, which enables construction of linear…

  12. Network-Based Visual Analysis of Tabular Data

    ERIC Educational Resources Information Center

    Liu, Zhicheng

    2012-01-01

    Tabular data is pervasive in the form of spreadsheets and relational databases. Although tables often describe multivariate data without explicit network semantics, it may be advantageous to explore the data modeled as a graph or network for analysis. Even when a given table design conveys some static network semantics, analysts may want to look…

  13. The Mathematical Analysis of Style: A Correlation-Based Approach.

    ERIC Educational Resources Information Center

    Oppenheim, Rosa

    1988-01-01

    Examines mathematical models of style analysis, focusing on the pattern in which literary characteristics occur. Describes an autoregressive integrated moving average model (ARIMA) for predicting sentence length in different works by the same author and comparable works by different authors. This technique is valuable in characterizing stylistic…

  14. Automatic Text Analysis Based on Transition Phenomena of Word Occurrences

    ERIC Educational Resources Information Center

    Pao, Miranda Lee

    1978-01-01

    Describes a method of selecting index terms directly from a word frequency list, an idea originally suggested by Goffman. Results of the analysis of word frequencies of two articles seem to indicate that the automated selection of index terms from a frequency list holds some promise for automatic indexing. (Author/MBR)

  15. Improving Cluster Analysis with Automatic Variable Selection Based on Trees

    DTIC Science & Technology

    2014-12-01

    regression trees; Daisy, DISsimilAritY; PAM, partitioning around medoids; PMA, penalized multivariate analysis; SPC, sparse principal components; UPGMA, unweighted...unweighted pair-group average method (UPGMA). This method measures dissimilarities between all objects in two clusters and takes the average value

  16. Hierarchical Factoring Based On Image Analysis And Orthoblique Rotations.

    PubMed

    Stankov, L

    1979-07-01

    The procedure for hierarchical factoring suggested by Schmid and Leiman (1957) is applied within the framework of image analysis and orthoblique rotational procedures. It is shown that this approach necessarily leads to correlated higher order factors. Also, one can obtain a smaller number of factors than produced by typical hierarchical procedures.

  17. Weighting-Based Sensitivity Analysis in Causal Mediation Studies

    ERIC Educational Resources Information Center

    Hong, Guanglei; Qin, Xu; Yang, Fan

    2018-01-01

    Through a sensitivity analysis, the analyst attempts to determine whether a conclusion of causal inference could be easily reversed by a plausible violation of an identification assumption. Analytic conclusions that are harder to alter by such a violation are expected to add a higher value to scientific knowledge about causality. This article…

  18. Spectrum-Based and Collaborative Network Topology Analysis and Visualization

    ERIC Educational Resources Information Center

    Hu, Xianlin

    2013-01-01

    Networks are of significant importance in many application domains, such as World Wide Web and social networks, which often embed rich topological information. Since network topology captures the organization of network nodes and links, studying network topology is very important to network analysis. In this dissertation, we study networks by…

  19. The Application of Structured Job Analysis Information Based on the Position Analysis Questionnaire (PAQ).

    DTIC Science & Technology

    Position Analysis Questionnaire (PAQ). This job analysis instrument consists of 187 job elements organized into six divisions. In the analysis of a job...with the PAQ, the relevance of the individual elements to the job is rated using any of several rating scales, such as importance or time.

  20. Team-Based Models for End-of-Life Care: An Evidence-Based Analysis

    PubMed Central

    2014-01-01

    Background End of life refers to the period when people are living with advanced illness that will not stabilize and from which they will not recover and will eventually die. It is not limited to the period immediately before death. Multiple services are required to support people and their families during this time period. The model of care used to deliver these services can affect the quality of the care they receive. Objectives Our objective was to determine whether an optimal team-based model of care exists for service delivery at end of life. In systematically reviewing such models, we considered their core components: team membership, services offered, modes of patient contact, and setting. Data Sources A literature search was performed on October 14, 2013, using Ovid MEDLINE, Ovid MEDLINE In-Process and Other Non-Indexed Citations, Ovid Embase, EBSCO Cumulative Index to Nursing & Allied Health Literature (CINAHL), and EBM Reviews, for studies published from January 1, 2000, to October 14, 2013. Review Methods Abstracts were reviewed by a single reviewer and full-text articles were obtained that met the inclusion criteria. Studies were included if they evaluated a team model of care compared with usual care in an end-of-life adult population. A team was defined as having at least 2 health care disciplines represented. Studies were limited to English publications. A meta-analysis was completed to obtain pooled effect estimates where data permitted. The GRADE quality of the evidence was evaluated. Results Our literature search located 10 randomized controlled trials which, among them, evaluated the following 6 team-based models of care: hospital, direct contact; home, direct contact; home, indirect contact; comprehensive, indirect contact; comprehensive, direct contact; and comprehensive, direct and early contact. Direct contact is when team members see the patient; indirect contact is when they advise another health care practitioner (e.g., a family doctor) who sees

  1. Analysis of selected data from the triservice missile data base

    NASA Technical Reports Server (NTRS)

    Allen, Jerry M.; Shaw, David S.; Sawyer, Wallace C.

    1989-01-01

    An extremely large, systematic, axisymmetric-body/tail-fin data base has been gathered through tests of an innovative missile model design which is described herein. These data were originally obtained for incorporation into a missile aerodynamics code based on engineering methods (Program MISSILE3), but these data are also valuable as diagnostic test cases for developing computational methods because of the individual-fin data included in the data base. Detailed analyses of four sample cases from these data are presented to illustrate interesting individual-fin force and moment trends. These samples quantitatively show how bow shock, fin orientation, fin deflection, and body vortices can produce strong, unusual, and computationally challenging effects on individual fin loads. Flow-visualization photographs are examined to provide physical insight into the cause of these effects.

  2. Systems Analysis Of Advanced Coal-Based Power Plants

    NASA Technical Reports Server (NTRS)

    Ferrall, Joseph F.; Jennings, Charles N.; Pappano, Alfred W.

    1988-01-01

    Report presents appraisal of integrated coal-gasification/fuel-cell power plants. Based on study comparing fuel-cell technologies with each other and with coal-based alternatives; recommends most promising ones for research and development. Evaluates capital cost, cost of electricity, fuel consumption, and conformance with environmental standards. Analyzes sensitivity of cost of electricity to changes in fuel cost, to economic assumptions, and to level of technology. Recommends further evaluation of integrated coal-gasification/fuel-cell, integrated coal-gasification/combined-cycle, and pulverized-coal-fired plants. Concludes with appendixes detailing plant-performance models, subsystem-performance parameters, performance goals, cost bases, plant-cost data sheets, and plant sensitivity to fuel-cell performance.

  3. Analysis of Twyman-Green detection system based on polarization interference

    NASA Astrophysics Data System (ADS)

    Huang, Yaolin; Wang, Min; Shao, Xiaoping; Kou, Yuanfeng

    2018-02-01

    The optical surface deviation of a lens can directly affect the quality of the optical system. In order to detect the surface shape effectively and accurately, an optical surface on-line detection system based on polarization interference technology is designed and developed. The system is built on a Twyman-Green interference optical path combined with polarization interference measurement. Based on theoretical derivation of the optical path and simulation in the ZEMAX software, the experimental optical path is constructed. Using parallel light as the source, a concave lens is measured; the relations among the size of the polarization beam-splitting prism, the detectable radius of curvature, and the lens aperture are analyzed, and a detection range is given.

  4. Conceptual analysis of a lunar base transportation system

    NASA Technical Reports Server (NTRS)

    Hoy, Trevor D.; Johnson, Lloyd B., III; Persons, Mark B.; Wright, Robert L.

    1992-01-01

    Important to the planning for a lunar base is the development of transportation requirements for the establishment and maintenance of that base. This was accomplished as part of a lunar base systems assessment study conducted by the NASA Langley Research Center in conjunction with the NASA Johnson Space Center. Lunar base parameters are presented using a baseline lunar facility concept and timeline of developmental phases. Masses for habitation and scientific modules, power systems, life support systems, and thermal control systems were generated, assuming space station technology as a starting point. The masses were manifested by grouping various systems into cargo missions and interspersing manned flights consistent with construction and base maintenance timelines. A computer program that sizes the orbital transfer vehicles (OTV's), lunar landers, lunar ascenders, and the manned capsules was developed. This program consists of an iterative technique to solve the rocket equation successively for each velocity correction (delta V) in a mission. The delta V values reflect integrated trajectory values and include gravity losses. As the program computed fuel masses, it matched structural masses from General Dynamics' modular space-based OTV design. Variables in the study included the operation mode (i.e., expendable vs. reusable and single-stage vs. two-stage OTV's), cryogenic specific impulse, reflecting different levels of engine technology, and aerobraking vs. all-propulsive return to Earth orbit. The use of lunar-derived oxygen was also examined for its general impact. For each combination of factors, the low-Earth orbit (LEO) stack masses and Earth-to-orbit (ETO) lift requirements are summarized by individual mission and totaled for the developmental phase. In addition to these discrete data, trends in the variation of study parameters are presented.
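
    The iterative technique described above amounts to applying the rocket equation burn by burn, working backwards from the delivered payload so that each burn's propellant is sized for the stack mass it must push. The sketch below shows only that bookkeeping; the delta-V values, specific impulse, and payload are illustrative, not figures from the study, and structural-mass scaling is omitted.

      # Successive rocket-equation sizing, in reverse mission order (illustrative values).
      import math

      G0 = 9.81  # m/s^2

      def propellant_for_burn(final_mass_kg, delta_v_ms, isp_s):
          """Propellant so that the stack ends the burn at final_mass_kg."""
          return final_mass_kg * (math.exp(delta_v_ms / (isp_s * G0)) - 1.0)

      payload = 15000.0                              # kg delivered (illustrative)
      isp = 460.0                                    # cryogenic engine Isp, s (illustrative)
      burns_reversed = [1900.0, 2100.0, 3100.0]      # m/s: descent, LOI, TLI (illustrative)

      mass = payload
      for dv in burns_reversed:
          prop = propellant_for_burn(mass, dv, isp)
          mass += prop                               # stack mass before this burn
          print(f"dv={dv:6.0f} m/s  propellant={prop:9.0f} kg  stack={mass:9.0f} kg")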

  5. Dynamic soft tissue deformation estimation based on energy analysis

    NASA Astrophysics Data System (ADS)

    Gao, Dedong; Lei, Yong; Yao, Bin

    2016-10-01

    Needle placement accuracy on the order of millimeters is required in many needle-based surgeries. The tissue deformation, especially that occurring on the surface of organ tissue, affects the needle-targeting accuracy of both manual and robotic needle insertions. It is necessary to understand the mechanism of tissue deformation during needle insertion into soft tissue. In this paper, soft tissue surface deformation is investigated on the basis of continuum mechanics, where a geometry model is presented to quantitatively approximate the volume of tissue deformation. The energy-based method is applied to the dynamic process of needle insertion into soft tissue based on continuum mechanics, and the volume of the cone is exploited to quantitatively approximate the deformation on the surface of soft tissue. The external work is converted into potential, kinetic, dissipated, and strain energies during the dynamic rigid needle-tissue interactive process. The needle insertion experimental setup, consisting of a linear actuator, force sensor, needle, tissue container, and a light, is constructed, while an image-based method for measuring the depth and radius of the soft tissue surface deformations is introduced to obtain the experimental data. The relationship between the changed volume of tissue deformation and the insertion parameters is created based on the law of conservation of energy, with the volume of tissue deformation having been obtained using image-based measurements. The experiments are performed on phantom specimens, and an energy-based analytical fitted model is presented to estimate the volume of tissue deformation. The experimental results show that the energy-based analytical fitted model can predict the volume of soft tissue deformation, and the root mean squared errors of the fitting model and experimental data are 0.61 and 0.25 at the velocities 2.50 mm/s and 5.00 mm/s. The estimating parameters of the soft tissue surface deformations are proven to be useful
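
    The geometric approximation mentioned above, in which the surface deformation is treated as a cone whose depth and radius come from the image-based measurement, is a one-line formula; the sketch below uses made-up measurements purely for illustration.

      # Cone-volume approximation of the surface deformation (values illustrative).
      import math

      def cone_volume_mm3(radius_mm, depth_mm):
          # V = pi * r^2 * h / 3 for a cone of radius r and depth h.
          return math.pi * radius_mm ** 2 * depth_mm / 3.0

      print(cone_volume_mm3(radius_mm=6.0, depth_mm=2.5), "mm^3")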

  6. DNA sequence analysis with droplet-based microfluidics

    PubMed Central

    Abate, Adam R.; Hung, Tony; Sperling, Ralph A.; Mary, Pascaline; Rotem, Assaf; Agresti, Jeremy J.; Weiner, Michael A.; Weitz, David A.

    2014-01-01

    Droplet-based microfluidic techniques can form and process micrometer scale droplets at thousands per second. Each droplet can house an individual biochemical reaction, allowing millions of reactions to be performed in minutes with small amounts of total reagent. This versatile approach has been used for engineering enzymes, quantifying concentrations of DNA in solution, and screening protein crystallization conditions. Here, we use it to read the sequences of DNA molecules with a FRET-based assay. Using probes of different sequences, we interrogate a target DNA molecule for polymorphisms. With a larger probe set, additional polymorphisms can be interrogated as well as targets of arbitrary sequence. PMID:24185402

  7. Deformation Monitoring and Analysis of Lsp Landslide Based on Gbinsar

    NASA Astrophysics Data System (ADS)

    Zhou, L.; Guo, J.; Yang, F.

    2018-05-01

    Monitoring and analyzing the deformation of riverside landslides in urban areas, in order to understand their deformation behavior, is an important means of landslide safety assessment. In this paper, the stability of the Liu Sha Peninsula landslide during its reinforcement after the landslide disaster is addressed: continuous, high-precision deformation monitoring of the landslide was carried out using the GBInSAR technique. Meanwhile, two-dimensional deformation time series of the landslide body were retrieved by the time series analysis method. The deformation monitoring and analysis results show that the reinforcement belt on the landslide body was basically stable and the deformation of most PS points on the reinforcement belt was within 1 mm. The deformation of most areas on the landslide body was basically within 4 mm, and the deformation presented obvious nonlinear changes. The GBInSAR technique can quickly and effectively capture the overall deformation of the riverside landslide and the evolution process of the deformation.

  8. Signal analysis of accelerometry data using gravity-based modeling

    NASA Astrophysics Data System (ADS)

    Davey, Neil P.; James, Daniel A.; Anderson, Megan E.

    2004-03-01

    Triaxial accelerometers have been used to measure human movement parameters in swimming. Interpretation of data is difficult due to interference sources including interaction of external bodies. In this investigation the authors developed a model to simulate the physical movement of the lower back. Theoretical accelerometry outputs were derived, thus giving an ideal, or noiseless, dataset. An experimental data collection apparatus was developed by adapting a system to the aquatic environment for investigation of swimming. Model data was compared against recorded data and showed strong correlation. Comparison of recorded and modeled data can be used to identify changes in body movement; this is especially useful when cyclic patterns are present in the activity. Strong correlations between data sets allowed development of signal processing algorithms for swimming stroke analysis, developed first on the pure noiseless dataset and then applied to performance data. Video analysis was also used to validate study results and has shown potential to provide acceptable results.

  9. Topological analysis of metabolic networks based on petri net theory.

    PubMed

    Zevedei-Oancea, Ionela; Schuster, Stefan

    2011-01-01

    Petri net concepts provide additional tools for the modelling of metabolic networks. Here, the similarities between the counterparts in traditional biochemical modelling and Petri net theory are discussed. For example the stoichiometry matrix of a metabolic network corresponds to the incidence matrix of the Petri net. The flux modes and conservation relations have the T-invariants, respectively, P-invariants as counterparts. We reveal the biological meaning of some notions specific to the Petri net framework (traps, siphons, deadlocks, liveness). We focus on the topological analysis rather than on the analysis of the dynamic behaviour. The treatment of external metabolites is discussed. Some simple theoretical examples are presented for illustration. Also the Petri nets corresponding to some biochemical networks are built to support our results. For example, the role of triose phosphate isomerase (TPI) in Trypanosoma brucei metabolism is evaluated by detecting siphons and traps. All Petri net properties treated in this contribution are exemplified on a system extracted from nucleotide metabolism.
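
    The correspondence described above (stoichiometry matrix = incidence matrix, flux modes = T-invariants) can be checked on a toy network: T-invariants are the non-negative vectors in the right null space of the incidence matrix. The sketch below, assuming SymPy, uses a three-transition chain invented for illustration, not one of the paper's examples.

      # T-invariants of a toy Petri net as the null space of its incidence matrix.
      import sympy as sp

      # Rows = places (metabolites A, B); columns = transitions (source, A->B, sink).
      C = sp.Matrix([
          [1, -1,  0],   # place A
          [0,  1, -1],   # place B
      ])

      # T-invariants x satisfy C * x = 0; here the single basis vector is (1, 1, 1),
      # i.e., firing every transition once returns the net to its initial marking.
      for v in C.nullspace():
          print(v.T)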

  10. Topological analysis of metabolic networks based on Petri net theory.

    PubMed

    Zevedei-Oancea, Ionela; Schuster, Stefan

    2003-01-01

    Petri net concepts provide additional tools for the modelling of metabolic networks. Here, the similarities between the counterparts in traditional biochemical modelling and Petri net theory are discussed. For example the stoichiometry matrix of a metabolic network corresponds to the incidence matrix of the Petri net. The flux modes and conservation relations have the T-invariants, respectively, P-invariants as counterparts. We reveal the biological meaning of some notions specific to the Petri net framework (traps, siphons, deadlocks, liveness). We focus on the topological analysis rather than on the analysis of the dynamic behaviour. The treatment of external metabolites is discussed. Some simple theoretical examples are presented for illustration. Also the Petri nets corresponding to some biochemical networks are built to support our results. For example, the role of triose phosphate isomerase (TPI) in Trypanosoma brucei metabolism is evaluated by detecting siphons and traps. All Petri net properties treated in this contribution are exemplified on a system extracted from nucleotide metabolism.

  11. Sensitivity analysis of navy aviation readiness based sparing model

    DTIC Science & Technology

    2017-09-01

    variability. (See Figure 4: research design flowchart.) Figure 4 lays out the four steps of the methodology, starting in the upper left-hand...as a function of changes in key inputs. We develop NAVARM Experimental Designs (NED), a computational tool created by applying a state-of-the-art...experimental design to the NAVARM model. Statistical analysis of the resulting data identifies the most influential cost factors. Those are, in order of

  12. Cost Analysis Sources and Documents Data Base Reference Manual (Update)

    DTIC Science & Technology

    1989-06-01

    M: Reference Manual PRICE H: Training Course Workbook 11. Use in Cost Analysis. Important source of cost estimates for electronic and mechanical...Nature of Data. Contains many microeconomic time series by month or quarter. 5. Level of Detail. Very detailed. 6. Normalization Processes Required...Reference Manual. Moorestown, N.J.: GE Corporation, September 1986. 64. PRICE Training Course Workbook. Moorestown, N.J.: GE Corporation, February 1986

  13. Noncoding sequence classification based on wavelet transform analysis: part II

    NASA Astrophysics Data System (ADS)

    Paredes, O.; Strojnik, M.; Romo-Vázquez, R.; Vélez-Pérez, H.; Ranta, R.; Garcia-Torales, G.; Scholl, M. K.; Morales, J. A.

    2017-09-01

    DNA sequences in human genome can be divided into the coding and noncoding ones. We hypothesize that the characteristic periodicities of the noncoding sequences are related to their function. We describe the procedure to identify these characteristic periodicities using the wavelet analysis. Our results show that three groups of noncoding sequences, each one with different biological function, may be differentiated by their wavelet coefficients within specific frequency range.
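
    The periodicity search described above can be illustrated with a continuous wavelet transform of a numeric indicator signal derived from the sequence. The sketch assumes the PyWavelets package and uses a random sequence and an arbitrary scale range in place of real noncoding DNA.

      # CWT-based periodicity scan of a nucleotide indicator signal (illustrative).
      import numpy as np
      import pywt

      rng = np.random.default_rng(1)
      seq = rng.choice(list("ACGT"), size=1024)        # stand-in for a noncoding sequence

      # Binary indicator signal for one nucleotide (a common numeric mapping).
      signal = (seq == "G").astype(float)

      scales = np.arange(1, 64)
      coeffs, freqs = pywt.cwt(signal, scales, "morl")

      # Mean wavelet power per scale highlights characteristic periodicities.
      power = (np.abs(coeffs) ** 2).mean(axis=1)
      print("dominant scale:", scales[power.argmax()])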

  14. Mission-Based Scenario Research: Experimental Design and Analysis

    DTIC Science & Technology

    2011-08-10

    with Army Research Lab; DCS Corporation, Alexandria, VA and Warren, MI 14. ABSTRACT In this paper, we discuss a neuroimaging experiment that...Development and Engineering Center Warren, MI Kelvin S. Oie, PhD Army Research Laboratory Aberdeen Proving Ground, MD ABSTRACT In this paper, we...experiment, this paper will emphasize analyses that employ a pattern classification analysis approach. These classification examples aim to identify

  15. Classification of pulmonary airway disease based on mucosal color analysis

    NASA Astrophysics Data System (ADS)

    Suter, Melissa; Reinhardt, Joseph M.; Riker, David; Ferguson, John Scott; McLennan, Geoffrey

    2005-04-01

    Airway mucosal color changes occur in response to the development of bronchial diseases including lung cancer, cystic fibrosis, chronic bronchitis, emphysema and asthma. These associated changes are often visualized using standard macro-optical bronchoscopy techniques. A limitation to this form of assessment is that the subtle changes that indicate early stages in disease development may often be missed as a result of this highly subjective assessment, especially in inexperienced bronchoscopists. Tri-chromatic CCD chip bronchoscopes allow for digital color analysis of the pulmonary airway mucosa. This form of analysis may facilitate a greater understanding of airway disease response. A 2-step image classification approach is employed: the first step is to distinguish between healthy and diseased bronchoscope images and the second is to classify the detected abnormal images into 1 of 4 possible disease categories. A database of airway mucosal color constructed from healthy human volunteers is used as a standard against which statistical comparisons are made from mucosa with known apparent airway abnormalities. This approach demonstrates great promise as an effective detection and diagnosis tool to highlight potentially abnormal airway mucosa identifying a region possibly suited to further analysis via airway forceps biopsy, or newly developed micro-optical biopsy strategies. Following the identification of abnormal airway images a neural network is used to distinguish between the different disease classes. We have shown that classification of potentially diseased airway mucosa is possible through comparative color analysis of digital bronchoscope images. The combination of the two strategies appears to increase the classification accuracy in addition to greatly decreasing the computational time.

  16. Principles and tools for collaborative entity-based intelligence analysis.

    PubMed

    Bier, Eric A; Card, Stuart K; Bodnar, John W

    2010-01-01

    Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis.

  17. Intelligent Control in Automation Based on Wireless Traffic Analysis

    SciTech Connect

    Kurt Derr; Milos Manic

    2007-09-01

    Wireless technology is a central component of many factory automation infrastructures in both the commercial and government sectors, providing connectivity among various components in industrial realms (distributed sensors, machines, mobile process controllers). However, wireless technologies provide more threats to computer security than wired environments. The advantageous features of Bluetooth technology resulted in Bluetooth unit shipments climbing to five million per week at the end of 2005 [1, 2]. This is why the real-time interpretation and understanding of Bluetooth traffic behavior is critical in both maintaining the integrity of computer systems and increasing the efficient use of this technology in control-type applications. Although neuro-fuzzy approaches have been applied to wireless 802.11 behavior analysis in the past, a significantly different Bluetooth protocol framework has not been extensively explored using this technology. This paper presents a new neuro-fuzzy traffic analysis algorithm for this still-new territory of Bluetooth traffic. Further enhancements of this algorithm are presented along with the comparison against the traditional, numerical approach. Through test examples, interesting Bluetooth traffic behavior characteristics were captured, and the comparative elegance of this computationally inexpensive approach was demonstrated. This analysis can be used to provide directions for future development and use of this prevailing technology in various control-type applications, as well as making the use of it more secure.

  18. Sentiments Analysis of Reviews Based on ARCNN Model

    NASA Astrophysics Data System (ADS)

    Xu, Xiaoyu; Xu, Ming; Xu, Jian; Zheng, Ning; Yang, Tao

    2017-10-01

    Sentiment analysis of product reviews is designed to help customers understand the status of a product. The traditional approach to sentiment analysis relies on a fixed-length input feature vector, which is the performance bottleneck of the basic encoder-decoder architecture. In this paper, we propose an attention mechanism combined with a BRNN-CNN model, referred to as the ARCNN model. To capture the semantic relations between words and avoid the curse of dimensionality, we use the GloVe algorithm to train vector representations of words. The ARCNN model is then proposed to handle the training of deep features. Specifically, the BRNN handles variable-length input and preserves time-series information, while the CNN captures deeper connections among semantic links. Moreover, the attention mechanism automatically learns from the data and optimizes the allocation of weights. Finally, a softmax classifier completes the sentiment classification of reviews. Experiments show that the proposed method improves the accuracy of sentiment classification compared with benchmark methods.
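
    One way to picture the architecture is a word-embedding layer feeding a bidirectional RNN, a convolutional layer, and an attention step before the softmax classifier. The Keras sketch below is only a rough reading of that description: the layer sizes, the random (non-GloVe) embedding initialization, and the exact wiring are assumptions, not the authors' ARCNN implementation.

      # Rough BRNN + CNN + attention sentiment classifier (architecture is an assumption).
      import tensorflow as tf
      from tensorflow.keras import layers

      vocab_size, embed_dim, max_len = 20000, 100, 200

      inputs = layers.Input(shape=(max_len,))
      x = layers.Embedding(vocab_size, embed_dim)(inputs)     # GloVe vectors in the paper

      rnn = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)  # sequence context
      cnn = layers.Conv1D(64, 5, padding="same", activation="relu")(rnn)     # deeper local links

      att = layers.Attention()([cnn, cnn])                     # learn where to place weight
      pooled = layers.GlobalAveragePooling1D()(att)

      outputs = layers.Dense(2, activation="softmax")(pooled)  # positive / negative review
      model = tf.keras.Model(inputs, outputs)
      model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
      model.summary()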

  19. Intelligent Control in Automation Based on Wireless Traffic Analysis

    SciTech Connect

    Kurt Derr; Milos Manic

    Wireless technology is a central component of many factory automation infrastructures in both the commercial and government sectors, providing connectivity among various components in industrial realms (distributed sensors, machines, mobile process controllers). However, wireless technologies provide more threats to computer security than wired environments. The advantageous features of Bluetooth technology resulted in Bluetooth unit shipments climbing to five million per week at the end of 2005 [1, 2]. This is why the real-time interpretation and understanding of Bluetooth traffic behavior is critical in both maintaining the integrity of computer systems and increasing the efficient use of this technology in control-type applications. Although neuro-fuzzy approaches have been applied to wireless 802.11 behavior analysis in the past, a significantly different Bluetooth protocol framework has not been extensively explored using this technology. This paper presents a new neuro-fuzzy traffic analysis algorithm for this still-new territory of Bluetooth traffic. Further enhancements of this algorithm are presented along with the comparison against the traditional, numerical approach. Through test examples, interesting Bluetooth traffic behavior characteristics were captured, and the comparative elegance of this computationally inexpensive approach was demonstrated. This analysis can be used to provide directions for future development and use of this prevailing technology in various control-type applications, as well as making the use of it more secure.

  20. Depth data research of GIS based on clustering analysis algorithm

    NASA Astrophysics Data System (ADS)

    Xiong, Yan; Xu, Wenli

    2018-03-01

    GIS data are spatially distributed: geographic data have both spatial and attribute characteristics and also change with time, so the amount of data is very large. Many industries and government departments now use GIS, but without a proper data analysis and mining scheme, GIS cannot reach its full effectiveness and much of the data is wasted. In this paper, we take the geographic information needs of a national security department as the experimental object and, considering the characteristics of GIS data in time, space, and attributes, apply a cluster analysis algorithm. We further study a mining scheme for in-depth data and obtain an algorithm model that automatically classifies sample data and then supports exploratory analysis. The research shows that the algorithm model and the information mining scheme can quickly find hidden in-depth information from the surface data of GIS, thus improving the efficiency of the security department. The algorithm can also be extended to other fields.
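
    As a simple illustration of the kind of clustering step described above, spatio-temporal records can be scaled and grouped with k-means. The sketch assumes scikit-learn; the synthetic points and the (x, y, hour-of-day) feature choice stand in for the department's actual GIS data.

      # k-means clustering of synthetic spatio-temporal GIS records (illustrative).
      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(2)
      n = 1000
      records = np.column_stack([
          rng.uniform(0, 50, n),    # easting (km)
          rng.uniform(0, 50, n),    # northing (km)
          rng.uniform(0, 24, n),    # hour of day as a simple temporal attribute
      ])

      # Scale so space and time contribute comparably, then cluster.
      features = StandardScaler().fit_transform(records)
      labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(features)
      print(np.bincount(labels))    # cluster sizes for exploratory analysis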