Sample records for alternative classification schemes

  1. Using Simulations to Investigate the Longitudinal Stability of Alternative Schemes for Classifying and Identifying Children with Reading Disabilities

    ERIC Educational Resources Information Center

    Schatschneider, Christopher; Wagner, Richard K.; Hart, Sara A.; Tighe, Elizabeth L.

    2016-01-01

    The present study employed data simulation techniques to investigate the 1-year stability of alternative classification schemes for identifying children with reading disabilities. Classification schemes investigated include low performance, unexpected low performance, dual-discrepancy, and a rudimentary form of constellation model of reading…

  2. Reconsideration of the scheme of the international classification of functioning, disability and health: incentives from the Netherlands for a global debate.

    PubMed

    Heerkens, Yvonne F; de Weerd, Marjolein; Huber, Machteld; de Brouwer, Carin P M; van der Veen, Sabina; Perenboom, Rom J M; van Gool, Coen H; Ten Napel, Huib; van Bon-Martens, Marja; Stallinga, Hillegonda A; van Meeteren, Nico L U

    2018-03-01

    The ICF (International Classification of Functioning, Disability and Health) framework (used worldwide to describe 'functioning' and 'disability'), including the ICF scheme (a visualization of functioning as the result of interaction between a health condition and contextual factors), needs reconsideration. The purpose of this article is to discuss alternative ICF schemes. The ICF was reconsidered through a literature review and discussions with 23 Dutch ICF experts; twenty-six experts were then invited to rank the three resulting alternative schemes. The literature review yielded five themes: 1) societal developments; 2) health and research influences; 3) conceptualization of health; 4) models/frameworks of health and disability; and 5) criticism of the ICF (e.g. the position of 'health condition' at the top and the role of 'contextual factors'). The experts concluded that the ICF scheme gives the impression that the medical perspective, rather than the biopsychosocial perspective, is dominant. Three alternative ICF schemes were ranked by 16 (62%) experts, resulting in one preferred scheme. There is a need for a new ICF scheme that better reflects the ICF framework, for further (inter)national consideration. These Dutch schemes should be reviewed on a global scale, to develop a scheme that is more consistent with current and foreseen developments and changing ideas on health.

    Implications for Rehabilitation: We propose that policy makers at the community, regional and (inter)national levels consider using the alternative schemes of the International Classification of Functioning, Disability and Health in their plans to promote the functioning and health of their citizens, and that researchers and teachers incorporate the alternative schemes into their research and education to emphasize the biopsychosocial paradigm. We propose setting up an international Delphi procedure involving citizens (including patients) and experts in healthcare, occupational care, research, education, policy and planning, to reach consensus on an alternative scheme of the International Classification of Functioning, Disability and Health. We recommend discussing the alternatives to the present scheme of the International Classification of Functioning, Disability and Health within the World Health Organization's current update and revision process, as part of the discussion on the future of the framework (including its ontology, title and relation to the International Classification of Diseases). We also recommend revising the definition of personal factors, drafting a list of personal factors that can be used in policy making, clinical practice, research and education, and investing effort in revising the present list of environmental factors to make it more useful in, e.g., occupational health care.

  3. An Alternative Classification Scheme for Teaching Performance Incentives Using a Factor Analytic Approach.

    ERIC Educational Resources Information Center

    Mertler, Craig A.

    This study attempted to (1) expand the dichotomous classification scheme typically used by educators and researchers to describe teaching incentives and (2) offer administrators and teachers an alternative framework within which to develop incentive systems. Elementary, middle, and high school teachers in Ohio rated 10 commonly instituted teaching…

  4. Development of a methodology for classifying software errors

    NASA Technical Reports Server (NTRS)

    Gerhart, S. L.

    1976-01-01

    A mathematical formalization of the intuition behind classification of software errors is devised and then extended to a classification discipline: Every classification scheme should have an easily discernible mathematical structure and certain properties of the scheme should be decidable (although whether or not these properties hold is relative to the intended use of the scheme). Classification of errors then becomes an iterative process of generalization from actual errors to terms defining the errors together with adjustment of definitions according to the classification discipline. Alternatively, whenever possible, small scale models may be built to give more substance to the definitions. The classification discipline and the difficulties of definition are illustrated by examples of classification schemes from the literature and a new study of observed errors in published papers of programming methodologies.

  5. Validation of a selective ensemble-based classification scheme for myoelectric control using a three-dimensional Fitts' Law test.

    PubMed

    Scheme, Erik J; Englehart, Kevin B

    2013-07-01

    When controlling a powered upper limb prosthesis it is important not only to know how to move the device, but also when not to move. A novel approach to pattern recognition control, using a selective multiclass one-versus-one classification scheme, has been shown to be capable of rejecting unintended motions. This method was shown to outperform other popular classification schemes when presented with muscle contractions that did not correspond to desired actions. In this work, a 3-D Fitts' Law test is proposed as a suitable alternative to using virtual limb environments for evaluating real-time myoelectric control performance. The test is used to compare the selective approach to a state-of-the-art linear discriminant analysis classification-based scheme. The framework is shown to obey Fitts' Law for both control schemes, producing linear regression fittings with high coefficients of determination (R² > 0.936). Additional performance metrics focused on quality of control are discussed and incorporated in the evaluation. Using this framework the selective classification-based scheme is shown to produce significantly higher efficiency and completion rates, and significantly lower overshoot and stopping distances, with no significant difference in throughput.
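
    The evaluation above rests on fitting Fitts' Law to movement data and reporting throughput. As a rough, hedged sketch of that analysis (not the authors' code), the following fits MT = a + b·ID with the Shannon formulation ID = log2(D/W + 1) and reports R² and mean throughput; the target distances, widths and movement times are invented for the example.

```python
import numpy as np

def fitts_regression(distances, widths, movement_times):
    """Fit movement time MT = a + b * ID, with ID = log2(D/W + 1) (Shannon form).

    Returns intercept a, slope b, coefficient of determination R^2,
    and mean throughput ID/MT (bits per second).
    """
    ids = np.log2(np.asarray(distances) / np.asarray(widths) + 1.0)
    mts = np.asarray(movement_times)

    # Ordinary least-squares fit of MT against ID.
    b, a = np.polyfit(ids, mts, deg=1)
    predicted = a + b * ids
    ss_res = np.sum((mts - predicted) ** 2)
    ss_tot = np.sum((mts - mts.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot

    throughput = np.mean(ids / mts)
    return a, b, r_squared, throughput

# Hypothetical target conditions: distance, width (arbitrary units), measured times (s).
a, b, r2, tp = fitts_regression([10, 20, 40], [4, 4, 2], [0.61, 0.78, 1.12])
print(f"MT = {a:.2f} + {b:.2f} * ID, R^2 = {r2:.3f}, throughput = {tp:.2f} bits/s")
```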

  6. VTOL shipboard letdown guidance system analysis

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.; Karmali, M. S.

    1983-01-01

    Alternative letdown guidance strategies are examined for landing a VTOL aircraft aboard a small aviation ship under adverse environmental conditions. Off-line computer simulation of the shipboard landing task is used to assess the relative merits of the proposed guidance schemes. The touchdown performance of a nominal constant rate of descent (CROD) letdown strategy serves as a benchmark for ranking the performance of the alternative letdown schemes. Analysis of ship motion time histories indicates the existence of an alternating sequence of quiescent and rough motions, called lulls and swells. A real-time lull/swell classification algorithm based upon ship motion pattern features is developed. The classification algorithm is used to command a go/no-go signal indicating the initiation and termination of an acceptable landing window. Simulation results show that such a go/no-go, pattern-based letdown guidance strategy improves touchdown performance.

  7. Defining functional biomes and monitoring their change globally.

    PubMed

    Higgins, Steven I; Buitenwerf, Robert; Moncrieff, Glenn R

    2016-11-01

    Biomes are important constructs for organizing understanding of how the world's major terrestrial ecosystems differ from one another and for monitoring change in these ecosystems. Yet existing biome classification schemes have been criticized for being overly subjective and for explicitly or implicitly invoking climate. We propose a new biome map and classification scheme that uses information on (i) an index of vegetation productivity, (ii) whether the minimum of vegetation activity is in the driest or coldest part of the year, and (iii) vegetation height. Although biomes produced on the basis of this classification show a strong spatial coherence, they show little congruence with existing biome classification schemes. Our biome map provides an alternative classification scheme for comparing the biogeochemical rates of terrestrial ecosystems. We use this new biome classification scheme to analyse the patterns of biome change observed over recent decades. Overall, 13% to 14% of analysed pixels shifted in biome state over the 30-year study period. A wide range of biome transitions were observed. For example, biomes with tall vegetation and minimum vegetation activity in the cold season shifted to higher productivity biome states. Biomes with short vegetation and low seasonality shifted to seasonally moisture-limited biome states. Our findings and method provide a new source of data for rigorously monitoring global vegetation change, analysing drivers of vegetation change and for benchmarking models of terrestrial ecosystem function. © 2016 John Wiley & Sons Ltd.
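
    Because the scheme above is built from just three inputs (a productivity index, whether minimum vegetation activity falls in the dry or cold season, and vegetation height), a rule-based classifier illustrates the idea compactly. The sketch below is only an illustration: the thresholds and class labels are invented and are not those of the published biome map.

```python
def classify_biome(productivity, min_activity_season, height_m,
                   prod_threshold=0.5, height_threshold=5.0):
    """Assign a coarse functional biome label from three vegetation attributes.

    productivity        -- index of vegetation productivity, scaled to [0, 1]
    min_activity_season -- 'dry' or 'cold': when vegetation activity is lowest
    height_m            -- vegetation height in metres
    Thresholds and labels are illustrative placeholders only.
    """
    prod = "high-productivity" if productivity >= prod_threshold else "low-productivity"
    stature = "tall" if height_m >= height_threshold else "short"
    limitation = "moisture-limited" if min_activity_season == "dry" else "cold-limited"
    return f"{prod} {stature} {limitation}"

print(classify_biome(0.72, "cold", 18.0))   # e.g. 'high-productivity tall cold-limited'
print(classify_biome(0.31, "dry", 1.2))     # e.g. 'low-productivity short moisture-limited'
```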

  8. Ecosystem classifications based on summer and winter conditions.

    PubMed

    Andrew, Margaret E; Nelson, Trisalyn A; Wulder, Michael A; Hobart, George W; Coops, Nicholas C; Farmer, Carson J Q

    2013-04-01

    Ecosystem classifications map an area into relatively homogenous units for environmental research, monitoring, and management. However, their effectiveness is rarely tested. Here, three classifications are (1) defined and characterized for Canada along summertime productivity (moderate-resolution imaging spectrometer fraction of absorbed photosynthetically active radiation) and wintertime snow conditions (special sensor microwave/imager snow water equivalent), independently and in combination, and (2) comparatively evaluated to determine the ability of each classification to represent the spatial and environmental patterns of alternative schemes, including the Canadian ecozone framework. All classifications depicted similar patterns across Canada, but detailed class distributions differed. Class spatial characteristics varied with environmental conditions within classifications, but were comparable between classifications. There was moderate correspondence between classifications. The strongest association was between productivity classes and ecozones. The classification along both productivity and snow balanced these two sets of variables, yielding intermediate levels of association in all pairwise comparisons. Despite relatively low spatial agreement between classifications, they successfully captured patterns of the environmental conditions underlying alternate schemes (e.g., snow classes explained variation in productivity and vice versa). The performance of ecosystem classifications and the relevance of their input variables depend on the environmental patterns and processes used for applications and evaluation. Productivity or snow regimes, as constructed here, may be desirable when summarizing patterns controlled by summer- or wintertime conditions, respectively, or of climate change responses. General purpose ecosystem classifications should include both sets of drivers. Classifications should be carefully, quantitatively, and comparatively evaluated relative to a particular application prior to their implementation as monitoring and assessment frameworks.

  9. [The establishment, development and application of classification approach of freshwater phytoplankton based on the functional group: a review].

    PubMed

    Yang, Wen; Zhu, Jin-Yong; Lu, Kai-Hong; Wan, Li; Mao, Xiao-Hua

    2014-06-01

    Appropriate schemes for the classification of freshwater phytoplankton are prerequisites and important tools for revealing phytoplankton succession and studying freshwater ecosystems. An alternative approach, the functional group classification of freshwater phytoplankton, has been proposed and developed in response to the deficiencies of Linnaean and molecular identification in ecological applications. The functional group of phytoplankton is a classification scheme based on autecology. In this study, the theoretical basis and classification criteria of the functional group (FG), morpho-functional group (MFG) and morphology-based functional group (MBFG) approaches were summarized, along with their merits and demerits. FG was considered the optimal classification approach for aquatic ecology research and aquatic environment evaluation. The application status of FG was introduced, and the evaluation standards and problems of two FG-based approaches to assessing water quality, the Q and QR index methods, were briefly discussed.

  10. Comparing the performance of flat and hierarchical Habitat/Land-Cover classification models in a NATURA 2000 site

    NASA Astrophysics Data System (ADS)

    Gavish, Yoni; O'Connell, Jerome; Marsh, Charles J.; Tarantino, Cristina; Blonda, Palma; Tomaselli, Valeria; Kunin, William E.

    2018-02-01

    The increasing need for high quality Habitat/Land-Cover (H/LC) maps has triggered considerable research into novel machine-learning based classification models. In many cases, H/LC classes follow pre-defined hierarchical classification schemes (e.g., CORINE), in which fine H/LC categories are thematically nested within more general categories. However, none of the existing machine-learning algorithms account for this pre-defined hierarchical structure. Here we introduce a novel Random Forest (RF) based application of hierarchical classification, which fits a separate local classification model at every branching point of the thematic tree, and then integrates all the different local models into a single global prediction. We applied the hierarchical RF approach in a NATURA 2000 site in Italy, using two land-cover (CORINE, FAO-LCCS) and one habitat classification scheme (EUNIS) that differ from one another in the shape of the class hierarchy. For all 3 classification schemes, both the hierarchical model and a flat model alternative provided accurate predictions, with kappa values mostly above 0.9 (despite using only 2.2-3.2% of the study area as training cells). The flat approach slightly outperformed the hierarchical models when the hierarchy was relatively simple, while the hierarchical model worked better under more complex thematic hierarchies. Most misclassifications came from habitat pairs that are thematically distant yet spectrally similar. In 2 out of 3 classification schemes, the additional constraints of the hierarchical model resulted in fewer such serious misclassifications relative to the flat model. The hierarchical model also provided valuable information on variable importance which can shed light on "black-box" based machine learning algorithms like RF. We suggest various ways by which hierarchical classification models can increase the accuracy and interpretability of H/LC classification maps.
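
    The core idea of the record above, one local classifier per branching point of the thematic tree chained into a single global prediction, can be sketched with scikit-learn Random Forests as follows. The hierarchy, features and class names are hypothetical; this is not the authors' implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical thematic hierarchy: internal nodes map to children; leaves are fine classes.
HIERARCHY = {
    "root": ["vegetated", "non-vegetated"],
    "vegetated": ["forest", "grassland"],
    "non-vegetated": ["water", "bare_soil"],
}

def child_containing(node, leaf):
    """Return the child of `node` whose subtree contains the fine class `leaf`, else None."""
    for child in HIERARCHY[node]:
        if child == leaf:
            return child
        if child in HIERARCHY and child_containing(child, leaf) is not None:
            return child
    return None

def fit_hierarchical_rf(X, y_leaf):
    """Fit one local Random Forest at every branching point of the hierarchy."""
    models = {}
    for node in HIERARCHY:
        local = [(x, child_containing(node, leaf)) for x, leaf in zip(X, y_leaf)]
        local = [(x, c) for x, c in local if c is not None]   # samples under this node
        X_node = np.array([x for x, _ in local])
        y_node = np.array([c for _, c in local])
        models[node] = RandomForestClassifier(n_estimators=100).fit(X_node, y_node)
    return models

def predict_hierarchical(models, x):
    """Descend the hierarchy, applying the local model at each branching point."""
    node = "root"
    while node in HIERARCHY:
        node = models[node].predict(np.asarray(x).reshape(1, -1))[0]
    return node

# Toy data: two spectral features per cell, hypothetical fine labels.
X = np.array([[0.8, 0.1], [0.7, 0.2], [0.2, 0.9], [0.1, 0.8], [0.75, 0.15], [0.15, 0.85]])
y = np.array(["forest", "grassland", "water", "bare_soil", "forest", "water"])
models = fit_hierarchical_rf(X, y)
print(predict_hierarchical(models, [0.78, 0.12]))
```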

  11. Classification of extraterrestrial civilizations

    NASA Astrophysics Data System (ADS)

    Tang, Tong B.; Chang, Grace

    1991-06-01

    A scheme of classification of extraterrestrial intelligence (ETI) communities based on the scope of energy accessible to the civilization in question is proposed as an alternative to the Kardashev (1964) scheme that includes three types of civilization, as determined by their levels of energy expenditure. The proposed scheme includes six classes: (1) a civilization that runs essentially on energy exerted by individual beings or by domesticated lower life forms, (2) harnessing of natural sources on planetary surface with artificial constructions, like water wheels and wind sails, (3) energy from fossils and fissionable isotopes, mined beneath the planet surface, (4) exploitation of nuclear fusion on a large scale, whether on the planet, in space, or from primary solar energy, (5) extensive use of antimatter for energy storage, and (6) energy from spacetime, perhaps via the action of naked singularities.

  12. Interpretation for scales of measurement linking with abstract algebra

    PubMed Central

    2014-01-01

    The Stevens classification of levels of measurement involves four types of scale: “Nominal”, “Ordinal”, “Interval” and “Ratio”. This classification has been used widely in medical fields and has accomplished an important role in composition and interpretation of scale. With this classification, levels of measurements appear organized and validated. However, a group theory-like systematization beckons as an alternative because of its logical consistency and unexceptional applicability in the natural sciences but which may offer great advantages in clinical medicine. According to this viewpoint, the Stevens classification is reformulated within an abstract algebra-like scheme; ‘Abelian modulo additive group’ for “Ordinal scale” accompanied with ‘zero’, ‘Abelian additive group’ for “Interval scale”, and ‘field’ for “Ratio scale”. Furthermore, a vector-like display arranges a mixture of schemes describing the assessment of patient states. With this vector-like notation, data-mining and data-set combination is possible on a higher abstract structure level based upon a hierarchical-cluster form. Using simple examples, we show that operations acting on the corresponding mixed schemes of this display allow for a sophisticated means of classifying, updating, monitoring, and prognosis, where better data mining/data usage and efficacy is expected. PMID:24987515

  13. Interpretation for scales of measurement linking with abstract algebra.

    PubMed

    Sawamura, Jitsuki; Morishita, Shigeru; Ishigooka, Jun

    2014-01-01

    The Stevens classification of levels of measurement involves four types of scale: "Nominal", "Ordinal", "Interval" and "Ratio". This classification has been used widely in medical fields and has accomplished an important role in composition and interpretation of scale. With this classification, levels of measurements appear organized and validated. However, a group theory-like systematization beckons as an alternative because of its logical consistency and unexceptional applicability in the natural sciences but which may offer great advantages in clinical medicine. According to this viewpoint, the Stevens classification is reformulated within an abstract algebra-like scheme; 'Abelian modulo additive group' for "Ordinal scale" accompanied with 'zero', 'Abelian additive group' for "Interval scale", and 'field' for "Ratio scale". Furthermore, a vector-like display arranges a mixture of schemes describing the assessment of patient states. With this vector-like notation, data-mining and data-set combination is possible on a higher abstract structure level based upon a hierarchical-cluster form. Using simple examples, we show that operations acting on the corresponding mixed schemes of this display allow for a sophisticated means of classifying, updating, monitoring, and prognosis, where better data mining/data usage and efficacy is expected.
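
    Read side by side, the two records above describe a correspondence that can be summarized compactly; the block below is an illustrative paraphrase of the abstract, not the authors' formal definitions (the abstract assigns no algebraic structure to the nominal scale).

```latex
% Illustrative restatement of the correspondence described in the abstract.
\[
\begin{array}{lll}
\textbf{Scale} & \textbf{Admissible comparisons} & \textbf{Algebraic analogue}\\
\text{Nominal} & =,\ \neq & \text{(none assigned in the abstract)}\\
\text{Ordinal} & <,\ > & \text{Abelian modulo-additive group with a `zero'}\\
\text{Interval} & x - y & \text{Abelian additive group } (G,+)\\
\text{Ratio} & x / y & \text{field } (F,+,\times)
\end{array}
\]
```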

  14. Sampling for area estimation: A comparison of full-frame sampling with the sample segment approach. [Kansas

    NASA Technical Reports Server (NTRS)

    Hixson, M. M.; Bauer, M. E.; Davis, B. J.

    1979-01-01

    The effect of sampling on the accuracy (precision and bias) of crop area estimates made from classifications of LANDSAT MSS data was investigated. Full-frame classifications of wheat and non-wheat for eighty counties in Kansas were repetitively sampled to simulate alternative sampling plans. Four sampling schemes involving different numbers of samples and different size sampling units were evaluated. The precision of the wheat area estimates increased as the segment size decreased and the number of segments was increased. Although the average bias associated with the various sampling schemes was not significantly different, the maximum absolute bias was directly related to sampling unit size.

  15. Classification schemes for knowledge translation interventions: a practical resource for researchers.

    PubMed

    Slaughter, Susan E; Zimmermann, Gabrielle L; Nuspl, Megan; Hanson, Heather M; Albrecht, Lauren; Esmail, Rosmin; Sauro, Khara; Newton, Amanda S; Donald, Maoliosa; Dyson, Michele P; Thomson, Denise; Hartling, Lisa

    2017-12-06

    As implementation science advances, the number of interventions to promote the translation of evidence into healthcare, health systems, or health policy is growing. Accordingly, classification schemes for these knowledge translation (KT) interventions have emerged. A recent scoping review identified 51 classification schemes of KT interventions to integrate evidence into healthcare practice; however, the review did not evaluate the quality of the classification schemes or provide detailed information to assist researchers in selecting a scheme for their context and purpose. This study aimed to further examine and assess the quality of these classification schemes of KT interventions, and provide information to aid researchers when selecting a classification scheme. We abstracted the following information from each of the original 51 classification scheme articles: authors' objectives; purpose of the scheme and field of application; socioecologic level (individual, organizational, community, system); adaptability (broad versus specific); target group (patients, providers, policy-makers), intent (policy, education, practice), and purpose (dissemination versus implementation). Two reviewers independently evaluated the methodological quality of the development of each classification scheme using an adapted version of the AGREE II tool. Based on these assessments, two independent reviewers reached consensus about whether to recommend each scheme for researcher use, or not. Of the 51 original classification schemes, we excluded seven that were not specific classification schemes, not accessible or duplicates. Of the remaining 44 classification schemes, nine were not recommended. Of the 35 recommended classification schemes, ten focused on behaviour change and six focused on population health. Many schemes (n = 29) addressed practice considerations. Fewer schemes addressed educational or policy objectives. Twenty-five classification schemes had broad applicability, six were specific, and four had elements of both. Twenty-three schemes targeted health providers, nine targeted both patients and providers and one targeted policy-makers. Most classification schemes were intended for implementation rather than dissemination. Thirty-five classification schemes of KT interventions were developed and reported with sufficient rigour to be recommended for use by researchers interested in KT in healthcare. Our additional categorization and quality analysis will aid in selecting suitable classification schemes for research initiatives in the field of implementation science.

  16. The future of transposable element annotation and their classification in the light of functional genomics - what we can learn from the fables of Jean de la Fontaine?

    PubMed

    Arensburger, Peter; Piégu, Benoît; Bigot, Yves

    2016-01-01

    Transposable element (TE) science has been significantly influenced by the pioneering ideas of David Finnegan near the end of the last century, as well as by the classification systems that were subsequently developed. Today, whole genome TE annotation is mostly done using tools that were developed to aid gene annotation rather than to specifically study TEs. We argue that further progress in the TE field is impeded both by current TE classification schemes and by a failure to recognize that TE biology is fundamentally different from that of multicellular organisms. Novel genome wide TE annotation methods are helping to redefine our understanding of TE sequence origins and evolution. We briefly discuss some of these new methods as well as ideas for possible alternative classification schemes. Our hope is to encourage the formation of a society to organize a larger debate on these questions and to promote the adoption of standards for annotation and an improved TE classification.

  17. Cross-ontological analytics for alignment of different classification schemes

    DOEpatents

    Posse, Christian; Sanfilippo, Antonio P; Gopalan, Banu; Riensche, Roderick M; Baddeley, Robert L

    2010-09-28

    Quantification of the similarity between nodes in multiple electronic classification schemes is provided by automatically identifying relationships and similarities between nodes within and across the electronic classification schemes. Quantifying the similarity between a first node in a first electronic classification scheme and a second node in a second electronic classification scheme involves finding a third node in the first electronic classification scheme, wherein a first product value of an inter-scheme similarity value between the second and third nodes and an intra-scheme similarity value between the first and third nodes is a maximum. A fourth node in the second electronic classification scheme can be found, wherein a second product value of an inter-scheme similarity value between the first and fourth nodes and an intra-scheme similarity value between the second and fourth nodes is a maximum. The maximum between the first and second product values represents a measure of similarity between the first and second nodes.
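
    The patent abstract above fully specifies the similarity computation, so a direct sketch is possible. In the toy example below, the schemes, the inter- and intra-scheme similarity tables and the default value are invented; only the max-of-two-bridged-products rule comes from the record.

```python
def cross_scheme_similarity(n1, n2, scheme1_nodes, scheme2_nodes, inter, intra):
    """Similarity between node n1 (scheme 1) and node n2 (scheme 2), per the record above.

    inter(a, b) -- inter-scheme similarity between nodes of different schemes
    intra(a, b) -- intra-scheme similarity between nodes of the same scheme
    Both are assumed to be caller-supplied functions returning values in [0, 1].
    """
    # Best bridge through a third node in scheme 1.
    first = max(inter(n2, n3) * intra(n1, n3) for n3 in scheme1_nodes)
    # Best bridge through a fourth node in scheme 2.
    second = max(inter(n1, n4) * intra(n2, n4) for n4 in scheme2_nodes)
    return max(first, second)

# Hypothetical toy schemes with hand-set similarity lookups.
scheme1 = ["animals", "mammals", "birds"]
scheme2 = ["fauna", "canines", "raptors"]
INTER = {("mammals", "canines"): 0.9, ("birds", "raptors"): 0.8, ("animals", "fauna"): 0.95}
INTRA = {("animals", "mammals"): 0.7, ("animals", "birds"): 0.7,
         ("fauna", "canines"): 0.6, ("fauna", "raptors"): 0.6}

def sym_lookup(table, a, b, default=0.05):
    return table.get((a, b), table.get((b, a), default))

inter = lambda a, b: sym_lookup(INTER, a, b)
intra = lambda a, b: 1.0 if a == b else sym_lookup(INTRA, a, b)

print(cross_scheme_similarity("mammals", "canines", scheme1, scheme2, inter, intra))
```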

  18. An evaluation of sampling and full enumeration strategies for Fisher Jenks classification in big data settings

    USGS Publications Warehouse

    Rey, Sergio J.; Stephens, Philip A.; Laura, Jason R.

    2017-01-01

    Large data contexts present a number of challenges to optimal choropleth map classifiers. Application of optimal classifiers to a sample of the attribute space is one proposed solution. The properties of alternative sampling-based classification methods are examined through a series of Monte Carlo simulations. The impacts of spatial autocorrelation, number of desired classes, and form of sampling are shown to have significant impacts on the accuracy of map classifications. Tradeoffs between improved speed of the sampling approaches and loss of accuracy are also considered. The results suggest the possibility of guiding the choice of classification scheme as a function of the properties of large data sets.
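
    A minimal sketch of the sampling-based workflow discussed above, assuming the third-party jenkspy package for the Fisher-Jenks breaks (the study itself does not prescribe this library, and the keyword name of its class-count argument varies across versions, so it is passed positionally): compute breaks on a random sample, then classify the full attribute vector.

```python
import numpy as np
import jenkspy  # third-party; assumed available (pip install jenkspy)

def sampled_jenks_breaks(values, n_classes=5, sample_size=2_000, seed=0):
    """Estimate Fisher-Jenks class breaks from a random sample of a large attribute vector."""
    rng = np.random.default_rng(seed)
    values = np.asarray(values, dtype=float)
    sample = rng.choice(values, size=sample_size, replace=False) if values.size > sample_size else values
    return jenkspy.jenks_breaks(sample.tolist(), n_classes)  # positional call for version safety

def classify(values, breaks):
    """Assign each value to a class (0..k-1) given Jenks break points."""
    return np.clip(np.digitize(values, breaks[1:-1]), 0, len(breaks) - 2)

# Hypothetical skewed attribute for one million map units.
values = np.random.default_rng(1).lognormal(mean=3.0, sigma=1.0, size=1_000_000)
breaks = sampled_jenks_breaks(values, n_classes=5, sample_size=2_000)
classes = classify(values, np.asarray(breaks))
print(breaks, np.bincount(classes))
```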

  19. 15 CFR 921.3 - National Estuarine Research Reserve System biogeographic classification scheme and estuarine...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... System biogeographic classification scheme and estuarine typologies. 921.3 Section 921.3 Commerce and... biogeographic classification scheme and estuarine typologies. (a) National Estuarine Research Reserves are... classification scheme based on regional variations in the nation's coastal zone has been developed. The...

  20. 15 CFR 921.3 - National Estuarine Research Reserve System biogeographic classification scheme and estuarine...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... System biogeographic classification scheme and estuarine typologies. 921.3 Section 921.3 Commerce and... biogeographic classification scheme and estuarine typologies. (a) National Estuarine Research Reserves are... classification scheme based on regional variations in the nation's coastal zone has been developed. The...

  21. 15 CFR 921.3 - National Estuarine Research Reserve System biogeographic classification scheme and estuarine...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... System biogeographic classification scheme and estuarine typologies. 921.3 Section 921.3 Commerce and... biogeographic classification scheme and estuarine typologies. (a) National Estuarine Research Reserves are... classification scheme based on regional variations in the nation's coastal zone has been developed. The...

  22. 15 CFR 921.3 - National Estuarine Research Reserve System biogeographic classification scheme and estuarine...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... System biogeographic classification scheme and estuarine typologies. 921.3 Section 921.3 Commerce and... biogeographic classification scheme and estuarine typologies. (a) National Estuarine Research Reserves are... classification scheme based on regional variations in the nation's coastal zone has been developed. The...

  23. 15 CFR 921.3 - National Estuarine Research Reserve System biogeographic classification scheme and estuarine...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... System biogeographic classification scheme and estuarine typologies. 921.3 Section 921.3 Commerce and... biogeographic classification scheme and estuarine typologies. (a) National Estuarine Research Reserves are... classification scheme based on regional variations in the nation's coastal zone has been developed. The...

  24. Classifying quantum entanglement through topological links

    NASA Astrophysics Data System (ADS)

    Quinta, Gonçalo M.; André, Rui

    2018-04-01

    We propose an alternative classification scheme for quantum entanglement based on topological links. This is done by identifying a nonrigid ring to a particle, attributing the act of cutting and removing a ring to the operation of tracing out the particle, and associating linked rings to entangled particles. This analogy naturally leads us to a classification of multipartite quantum entanglement based on all possible distinct links for a given number of rings. To determine all different possibilities, we develop a formalism that associates any link to a polynomial, with each polynomial thereby defining a distinct equivalence class. To demonstrate the use of this classification scheme, we choose qubit quantum states as our example of physical system. A possible procedure to obtain qubit states from the polynomials is also introduced, providing an example state for each link class. We apply the formalism for the quantum systems of three and four qubits and demonstrate the potential of these tools in a context of qubit networks.

  25. A Computerized English-Spanish Correlation Index to Five Biomedical Library Classification Schemes Based on MeSH*

    PubMed Central

    Muench, Eugene V.

    1971-01-01

    A computerized English/Spanish correlation index to five biomedical library classification schemes and computerized English/Spanish and Spanish/English listings of MeSH are described. The index was accomplished by supplying appropriate classification numbers from five classification schemes (National Library of Medicine; Library of Congress; Dewey Decimal; Cunningham; Boston Medical) to MeSH and a Spanish translation of MeSH. The data were keypunched, merged on magnetic tape, and sorted in a computer alphabetically by English and Spanish subject headings and sequentially by classification number. Some benefits and uses of the index are: a complete index to classification schemes based on MeSH terms; a tool for conversion of classification numbers when reclassifying collections; a Spanish index and a crude Spanish translation of five classification schemes; and a data base for future applications, e.g., automatic classification. Other classification schemes, such as the UDC, and translations of MeSH into other languages can be added. PMID:5172471

  26. Medical image enhancement using resolution synthesis

    NASA Astrophysics Data System (ADS)

    Wong, Tak-Shing; Bouman, Charles A.; Thibault, Jean-Baptiste; Sauer, Ken D.

    2011-03-01

    We introduce a post-processing approach to improve the quality of CT reconstructed images. The scheme is adapted from the resolution-synthesis (RS) [1] interpolation algorithm. In this approach, we consider the input image, scanned at a particular dose level, as a degraded version of a high quality image scanned at a high dose level. Image enhancement is achieved by predicting the high quality image by classification-based linear regression. To improve the robustness of our scheme, we also apply the minimum description length principle to determine the optimal number of predictors to use in the scheme, and ridge regression to regularize the design of the predictors. Experimental results show that our scheme is effective in reducing the noise in images reconstructed from filtered back projection without significant loss of image details. Alternatively, our scheme can also be applied to reduce dose while maintaining image quality at an acceptable level.
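
    A schematic stand-in for the post-processing scheme described above (not the authors' RS implementation): patches of the low-quality image are clustered, a ridge-regression predictor is fitted per patch class against the corresponding high-quality pixels, and enhancement applies the per-class predictors. The MDL-based choice of the number of predictors is omitted, and the patch size, cluster count and toy images are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

def extract_patches(image, size=5):
    """Return (patches, centre_coords) for all size x size patches of a 2-D image."""
    h, w = image.shape
    r = size // 2
    patches, coords = [], []
    for i in range(r, h - r):
        for j in range(r, w - r):
            patches.append(image[i - r:i + r + 1, j - r:j + r + 1].ravel())
            coords.append((i, j))
    return np.array(patches), coords

def train_rs_predictors(low_img, high_img, n_classes=8, patch=5, alpha=1.0):
    """Classification-based ridge regression: one linear predictor per patch class."""
    X, coords = extract_patches(low_img, patch)
    y = np.array([high_img[i, j] for i, j in coords])
    classifier = KMeans(n_clusters=n_classes, n_init=10, random_state=0).fit(X)
    predictors = {c: Ridge(alpha=alpha).fit(X[classifier.labels_ == c], y[classifier.labels_ == c])
                  for c in range(n_classes)}
    return classifier, predictors

def enhance(low_img, classifier, predictors, patch=5):
    """Predict an enhanced image patch-by-patch with the per-class predictors."""
    out = low_img.astype(float).copy()
    X, coords = extract_patches(low_img, patch)
    labels = classifier.predict(X)
    for x, c, (i, j) in zip(X, labels, coords):
        out[i, j] = predictors[c].predict(x.reshape(1, -1))[0]
    return out

# Toy demonstration with a synthetic noisy ("low-dose") and clean ("high-dose") image pair.
rng = np.random.default_rng(0)
clean = rng.random((40, 40))
noisy = clean + 0.1 * rng.standard_normal((40, 40))
clf, preds = train_rs_predictors(noisy, clean)
enhanced = enhance(noisy, clf, preds)
print(float(np.mean((enhanced - clean) ** 2)), float(np.mean((noisy - clean) ** 2)))
```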

  27. Sampling for area estimation: A comparison of full-frame sampling with the sample segment approach

    NASA Technical Reports Server (NTRS)

    Hixson, M.; Bauer, M. E.; Davis, B. J. (Principal Investigator)

    1979-01-01

    The author has identified the following significant results. Full-frame classifications of wheat and non-wheat for eighty counties in Kansas were repetitively sampled to simulate alternative sampling plans. Evaluation of four sampling schemes involving different numbers of samples and different size sampling units shows that the precision of the wheat estimates increased as the segment size decreased and the number of segments was increased. Although the average bias associated with the various sampling schemes was not significantly different, the maximum absolute bias was directly related to sampling unit size.

  28. An improved fault detection classification and location scheme based on wavelet transform and artificial neural network for six phase transmission line using single end data only.

    PubMed

    Koley, Ebha; Verma, Khushaboo; Ghosh, Subhojit

    2015-01-01

    Restrictions on right of way and increasing power demand have boosted the development of six phase transmission. It offers a viable alternative for transmitting more power without major modification of the existing structure of three phase double circuit transmission systems. Despite these advantages, the low acceptance of six phase systems is attributed to the unavailability of a proper protection scheme. The complexity arising from the large number of possible faults in six phase lines makes protection quite challenging. The proposed work presents a hybrid wavelet transform and modular artificial neural network based fault detector, classifier and locator for six phase lines using single end data only. The standard deviations of the approximate coefficients of the voltage and current signals obtained using the discrete wavelet transform are applied as input to the modular artificial neural network for fault classification and location. The proposed scheme has been tested for all 120 types of shunt faults with variation in location, fault resistance, and fault inception angle. The variation in power system parameters, viz. short circuit capacity of the source and its X/R ratio, voltage, frequency and CT saturation, has also been investigated. The results confirm the effectiveness and reliability of the proposed protection scheme, which makes it ideal for real time implementation.
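
    The feature-extraction step described above (standard deviation of the DWT approximation coefficients of each phase's voltage and current, fed to a neural network) can be sketched as follows, assuming the PyWavelets and scikit-learn packages; the modular network structure is collapsed here into a single detector, and the training data are synthetic.

```python
import numpy as np
import pywt  # PyWavelets; assumed available
from sklearn.neural_network import MLPClassifier

def dwt_feature(signal, wavelet="db4", level=3):
    """Standard deviation of the approximation coefficients of a one-cycle signal window."""
    approx = pywt.wavedec(signal, wavelet, level=level)[0]
    return float(np.std(approx))

def features_from_phases(voltages, currents, wavelet="db4", level=3):
    """Build the feature vector from per-phase voltage and current windows.

    voltages, currents -- arrays of shape (n_phases, n_samples); six phases here.
    """
    feats = [dwt_feature(v, wavelet, level) for v in voltages]
    feats += [dwt_feature(c, wavelet, level) for c in currents]
    return np.array(feats)

# Hypothetical training matrix: rows of 12 features (6 voltage + 6 current std-devs),
# labels 0 = no fault, 1 = fault (a full scheme would use modular nets per sub-task).
rng = np.random.default_rng(0)
X = rng.random((200, 12))
y = (X[:, :6].mean(axis=1) > 0.5).astype(int)
detector = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0).fit(X, y)

v = rng.standard_normal((6, 128))   # six-phase voltage window (synthetic)
c = rng.standard_normal((6, 128))   # six-phase current window (synthetic)
print(features_from_phases(v, c).shape)  # -> (12,)
print(detector.predict(X[:3]))
```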

  29. Physiotherapy movement based classification approaches to low back pain: comparison of subgroups through review and developer/expert survey.

    PubMed

    Karayannis, Nicholas V; Jull, Gwendolen A; Hodges, Paul W

    2012-02-20

    Several classification schemes, each with its own philosophy and categorizing method, subgroup low back pain (LBP) patients with the intent to guide treatment. Physiotherapy derived schemes usually have a movement impairment focus, but the extent to which other biological, psychological, and social factors of pain are encompassed requires exploration. Furthermore, within the prevailing 'biological' domain, the overlap of subgrouping strategies within the orthopaedic examination remains unexplored. The aim of this study was "to review and clarify through developer/expert survey, the theoretical basis and content of physical movement classification schemes, determine their relative reliability and similarities/differences, and to consider the extent of incorporation of the bio-psycho-social framework within the schemes". A database search for relevant articles related to LBP and subgrouping or classification was conducted. Five dominant movement-based schemes were identified: Mechanical Diagnosis and Treatment (MDT), Treatment Based Classification (TBC), Pathoanatomic Based Classification (PBC), Movement System Impairment Classification (MSI), and O'Sullivan Classification System (OCS) schemes. Data were extracted and a survey sent to the classification scheme developers/experts to clarify operational criteria, reliability, decision-making, and converging/diverging elements between schemes. Survey results were integrated into the review and approval obtained for accuracy. Considerable diversity exists between schemes in how movement informs subgrouping and in the consideration of broader neurosensory, cognitive, emotional, and behavioural dimensions of LBP. Despite differences in assessment philosophy, a common element lies in their objective to identify a movement pattern related to a pain reduction strategy. Two dominant movement paradigms emerge: (i) loading strategies (MDT, TBC, PBC) aimed at eliciting a phenomenon of centralisation of symptoms; and (ii) modified movement strategies (MSI, OCS) targeted towards documenting the movement impairments associated with the pain state. Schemes vary on: the extent to which loading strategies are pursued; the assessment of movement dysfunction; and advocated treatment approaches. A biomechanical assessment predominates in the majority of schemes (MDT, PBC, MSI), certain psychosocial aspects (fear-avoidance) are considered in the TBC scheme, certain neurophysiologic (central versus peripherally mediated pain states) and psychosocial (cognitive and behavioural) aspects are considered in the OCS scheme.

  30. On Classification in the Study of Failure, and a Challenge to Classifiers

    NASA Technical Reports Server (NTRS)

    Wasson, Kimberly S.

    2003-01-01

    Classification schemes are abundant in the literature of failure. They serve a number of purposes, some more successfully than others. We examine several classification schemes constructed for various purposes relating to failure and its investigation, and discuss their values and limits. The analysis results in a continuum of uses for classification schemes, that suggests that the value of certain properties of these schemes is dependent on the goals a classification is designed to forward. The contrast in the value of different properties for different uses highlights a particular shortcoming: we argue that while humans are good at developing one kind of scheme: dynamic, flexible classifications used for exploratory purposes, we are not so good at developing another: static, rigid classifications used to trap and organize data for specific analytic goals. Our lack of strong foundation in developing valid instantiations of the latter impedes progress toward a number of investigative goals. This shortcoming and its consequences pose a challenge to researchers in the study of failure: to develop new methods for constructing and validating static classification schemes of demonstrable value in promoting the goals of investigations. We note current productive activity in this area, and outline foundations for more.

  31. Proposed new classification scheme for chemical injury to the human eye.

    PubMed

    Bagley, Daniel M; Casterton, Phillip L; Dressler, William E; Edelhauser, Henry F; Kruszewski, Francis H; McCulley, James P; Nussenblatt, Robert B; Osborne, Rosemarie; Rothenstein, Arthur; Stitzel, Katherine A; Thomas, Karluss; Ward, Sherry L

    2006-07-01

    Various ocular alkali burn classification schemes have been published and used to grade human chemical eye injuries for the purpose of identifying treatments and forecasting outcomes. The ILSI chemical eye injury classification scheme was developed for the additional purpose of collecting detailed human eye injury data to provide information on the mechanisms associated with chemical eye injuries. This information will have clinical application, as well as use in the development and validation of new methods to assess ocular toxicity. A panel of ophthalmic researchers proposed the new classification scheme based upon current knowledge of the mechanisms of eye injury, and their collective clinical and research experience. Additional ophthalmologists and researchers were surveyed to critique the scheme. The draft scheme was revised, and the proposed scheme represents the best consensus from at least 23 physicians and scientists. The new scheme classifies chemical eye injury into five categories based on clinical signs, symptoms, and expected outcomes. Diagnostic classification is based primarily on two clinical endpoints: (1) the extent (area) of injury at the limbus, and (2) the degree of injury (area and depth) to the cornea. The new classification scheme provides a uniform system for scoring eye injury across chemical classes, and provides enough detail for the clinician to collect data that will be relevant to identifying the mechanisms of ocular injury.

  32. OBJECTIVE METEOROLOGICAL CLASSIFICATION SCHEME DESIGNED TO ELUCIDATE OZONE'S DEPENDENCE ON METEOROLOGY

    EPA Science Inventory

    This paper utilizes a two-stage clustering approach as part of an objective classification scheme designed to elucidate O3's dependence on meteorology. When applied to ten years (1981-1990) of meteorological data for Birmingham, Alabama, the classification scheme identified seven ...

  33. Entropy-based gene ranking without selection bias for the predictive classification of microarray data.

    PubMed

    Furlanello, Cesare; Serafini, Maria; Merler, Stefano; Jurman, Giuseppe

    2003-11-06

    We describe the E-RFE method for gene ranking, which is useful for the identification of markers in the predictive classification of array data. The method supports a practical modeling scheme designed to avoid the construction of classification rules based on the selection of too small gene subsets (an effect known as the selection bias, in which the estimated predictive errors are too optimistic due to testing on samples already considered in the feature selection process). With E-RFE, we speed up the recursive feature elimination (RFE) with SVM classifiers by eliminating chunks of uninteresting genes using an entropy measure of the SVM weights distribution. An optimal subset of genes is selected according to a two-strata model evaluation procedure: modeling is replicated by an external stratified-partition resampling scheme, and, within each run, an internal K-fold cross-validation is used for E-RFE ranking. Also, the optimal number of genes can be estimated according to the saturation of Zipf's law profiles. Without a decrease of classification accuracy, E-RFE allows a speed-up factor of 100 with respect to standard RFE, while improving on alternative parametric RFE reduction strategies. Thus, a process for gene selection and error estimation is made practical, ensuring control of the selection bias, and providing additional diagnostic indicators of gene importance.
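
    A hedged sketch of the entropy-accelerated RFE idea described above: a linear SVM is fitted, the entropy of its weight-magnitude distribution is computed, and a chunk of the lowest-weight features is discarded, with larger chunks when the entropy is low. The chunk-size rule below is a simplified stand-in for the paper's schedule, and the stratified resampling, internal cross-validation and Zipf-law model selection are omitted.

```python
import numpy as np
from sklearn.svm import SVC

def weight_entropy(weights, bins=10):
    """Shannon entropy of the distribution of absolute SVM weights."""
    hist, _ = np.histogram(np.abs(weights), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def entropy_rfe(X, y, min_features=10, C=1.0, bins=10):
    """Entropy-accelerated recursive feature elimination with a linear SVM.

    At each step, the chunk of lowest-|w| features removed grows when the weight
    distribution has low entropy (many near-zero, uninteresting weights).
    Returns the indices of the surviving features.
    """
    remaining = np.arange(X.shape[1])
    while remaining.size > min_features:
        svm = SVC(kernel="linear", C=C).fit(X[:, remaining], y)
        w = svm.coef_.ravel()
        h = weight_entropy(w, bins)
        # Chunk shrinks from ~half the features (h ~ 0) toward 1 (h ~ log2(bins)).
        frac = max(0.0, 1.0 - h / np.log2(bins))
        chunk = max(1, min(int(frac * remaining.size * 0.5),
                           remaining.size - min_features))
        order = np.argsort(np.abs(w))          # least informative first
        remaining = np.delete(remaining, order[:chunk])
    return remaining

# Toy example: 200 samples, 500 features, only the first 5 informative.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 500))
y = (X[:, :5].sum(axis=1) > 0).astype(int)
print(sorted(entropy_rfe(X, y, min_features=10)))
```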

  34. FIELD TESTS OF GEOGRAPHICALLY-DEPENDENT VS. THRESHOLD-BASED WATERSHED CLASSIFICATION SCHEMES IN THE GREAT LAKES BASIN

    EPA Science Inventory

    We compared classification schemes based on watershed storage (wetland + lake area/watershed area) and forest fragmentation with a geographically-based classification scheme for two case studies involving 1) Lake Superior tributaries and 2) watersheds of riverine coastal wetlands...

  35. FIELD TESTS OF GEOGRAPHICALLY-DEPENDENT VS. THRESHOLD-BASED WATERSHED CLASSIFICATION SCHEMES IN THE GREAT LAKES BASIN

    EPA Science Inventory

    We compared classification schemes based on watershed storage (wetland + lake area/watershed area) and forest fragmentation with a geographically-based classification scheme for two case studies involving 1) Lake Superior tributaries and 2) watersheds of riverine coastal wetlands ...

  36. Enriching User-Oriented Class Associations for Library Classification Schemes.

    ERIC Educational Resources Information Center

    Pu, Hsiao-Tieh; Yang, Chyan

    2003-01-01

    Explores the possibility of adding user-oriented class associations to hierarchical library classification schemes. Analyses a log of book circulation records from a university library in Taiwan and shows that classification schemes can be made more adaptable by analyzing circulation patterns of similar users. (Author/LRW)

  37. 15 CFR Appendix I to Part 921 - Biogeographic Classification Scheme

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 3 2014-01-01 2014-01-01 false Biogeographic Classification Scheme I Appendix I to Part 921 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade... Part 921—Biogeographic Classification Scheme Acadian 1. Northern of Maine (Eastport to the Sheepscot...

  38. 15 CFR Appendix I to Part 921 - Biogeographic Classification Scheme

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 15 Commerce and Foreign Trade 3 2013-01-01 2013-01-01 false Biogeographic Classification Scheme I Appendix I to Part 921 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade... Part 921—Biogeographic Classification Scheme Acadian 1. Northern of Maine (Eastport to the Sheepscot...

  39. 15 CFR Appendix I to Part 921 - Biogeographic Classification Scheme

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 15 Commerce and Foreign Trade 3 2012-01-01 2012-01-01 false Biogeographic Classification Scheme I Appendix I to Part 921 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade... Part 921—Biogeographic Classification Scheme Acadian 1. Northern of Maine (Eastport to the Sheepscot...

  40. 15 CFR Appendix I to Part 921 - Biogeographic Classification Scheme

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Biogeographic Classification Scheme I Appendix I to Part 921 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade... Part 921—Biogeographic Classification Scheme Acadian 1. Northern of Maine (Eastport to the Sheepscot...

  41. 15 CFR Appendix I to Part 921 - Biogeographic Classification Scheme

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 3 2011-01-01 2011-01-01 false Biogeographic Classification Scheme I Appendix I to Part 921 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade... Part 921—Biogeographic Classification Scheme Acadian 1. Northern of Maine (Eastport to the Sheepscot...

  42. A Classification Methodology and Retrieval Model to Support Software Reuse

    DTIC Science & Technology

    1988-01-01

    Dewey Decimal Classification (DDC 18), an enumerative scheme, occupies 40 pages [Buchanan 1979]. Langridge [1973] states that the facets listed in the... sense of historical importance or widespread use. The schemes are: Dewey Decimal Classification (DDC), Universal Decimal Classification (UDC)... [The remainder of this snippet is table-of-contents residue covering sections on library classification, the Dewey Decimal Classification and the Universal Decimal Classification.]

  43. Classification of close binary systems by Svechnikov

    NASA Astrophysics Data System (ADS)

    Dryomova, G. N.

    The paper presents a historical overview of classification schemes for eclipsing variable stars, highlighting the advantages of Svechnikov's classification scheme for close binary systems, which is widely appreciated for the simplicity and brevity of its classification criteria.

  44. State of the Art in the Cramer Classification Scheme and ...

    EPA Pesticide Factsheets

    Slide presentation at the SOT FDA Colloquium on State of the Art in the Cramer Classification Scheme and Threshold of Toxicological Concern in College Park, MD.

  45. MeMoVolc report on classification and dynamics of volcanic explosive eruptions

    NASA Astrophysics Data System (ADS)

    Bonadonna, C.; Cioni, R.; Costa, A.; Druitt, T.; Phillips, J.; Pioli, L.; Andronico, D.; Harris, A.; Scollo, S.; Bachmann, O.; Bagheri, G.; Biass, S.; Brogi, F.; Cashman, K.; Dominguez, L.; Dürig, T.; Galland, O.; Giordano, G.; Gudmundsson, M.; Hort, M.; Höskuldsson, A.; Houghton, B.; Komorowski, J. C.; Küppers, U.; Lacanna, G.; Le Pennec, J. L.; Macedonio, G.; Manga, M.; Manzella, I.; Vitturi, M. de'Michieli; Neri, A.; Pistolesi, M.; Polacci, M.; Ripepe, M.; Rossi, E.; Scheu, B.; Sulpizio, R.; Tripoli, B.; Valade, S.; Valentine, G.; Vidal, C.; Wallenstein, N.

    2016-11-01

    Classifications of volcanic eruptions were first introduced in the early twentieth century mostly based on qualitative observations of eruptive activity, and over time, they have gradually been developed to incorporate more quantitative descriptions of the eruptive products from both deposits and observations of active volcanoes. Progress in physical volcanology, and increased capability in monitoring, measuring and modelling of explosive eruptions, has highlighted shortcomings in the way we classify eruptions and triggered a debate around the need for eruption classification and the advantages and disadvantages of existing classification schemes. Here, we (i) review and assess existing classification schemes, focussing on subaerial eruptions; (ii) summarize the fundamental processes that drive and parameters that characterize explosive volcanism; (iii) identify and prioritize the main research that will improve the understanding, characterization and classification of volcanic eruptions and (iv) provide a roadmap for producing a rational and comprehensive classification scheme. In particular, classification schemes need to be objective-driven and simple enough to permit scientific exchange and promote transfer of knowledge beyond the scientific community. Schemes should be comprehensive and encompass a variety of products, eruptive styles and processes, including for example, lava flows, pyroclastic density currents, gas emissions and cinder cone or caldera formation. Open questions, processes and parameters that need to be addressed and better characterized in order to develop more comprehensive classification schemes and to advance our understanding of volcanic eruptions include conduit processes and dynamics, abrupt transitions in eruption regime, unsteadiness, eruption energy and energy balance.

  46. Fused man-machine classification schemes to enhance diagnosis of breast microcalcifications

    NASA Astrophysics Data System (ADS)

    Andreadis, Ioannis; Chatzistergos, Sevastianos; Spyrou, George; Nikita, Konstantina

    2017-11-01

    Computer aided diagnosis (CADx) approaches are developed towards the effective discrimination between benign and malignant clusters of microcalcifications. Different sources of information are exploited, such as features extracted from the image analysis of the region of interest, features related to the location of the cluster inside the breast, age of the patient and descriptors provided by the radiologists while performing their diagnostic task. A series of different CADx schemes are implemented, each of which uses a different category of features and adopts a variety of machine learning algorithms and alternative image processing techniques. A novel framework is introduced where these independent diagnostic components are properly combined according to features critical to a radiologist in an attempt to identify the most appropriate CADx schemes for the case under consideration. An open access database (Digital Database of Screening Mammography (DDSM)) has been elaborated to construct a large dataset with cases of varying subtlety, in order to ensure the development of schemes with high generalization ability, as well as extensive evaluation of their performance. The obtained results indicate that the proposed framework succeeds in improving the diagnostic procedure, as the achieved overall classification performance outperforms all the independent single diagnostic components, as well as the radiologists that assessed the same cases, in terms of accuracy, sensitivity, specificity and area under the curve following receiver operating characteristic analysis.

  47. PolSAR Land Cover Classification Based on Hidden Polarimetric Features in Rotation Domain and SVM Classifier

    NASA Astrophysics Data System (ADS)

    Tao, C.-S.; Chen, S.-W.; Li, Y.-Z.; Xiao, S.-P.

    2017-09-01

    Land cover classification is an important application for polarimetric synthetic aperture radar (PolSAR) data utilization. Roll-invariant polarimetric features such as H/Ani/α/Span are commonly adopted in PolSAR land cover classification. However, the target orientation diversity effect makes PolSAR image understanding and interpretation difficult. Only using the roll-invariant polarimetric features may introduce ambiguity in the interpretation of targets' scattering mechanisms and limit the subsequent classification accuracy. To address this problem, this work firstly focuses on hidden polarimetric feature mining in the rotation domain along the radar line of sight using the recently reported uniform polarimetric matrix rotation theory and the visualization and characterization tool of polarimetric coherence pattern. The former rotates the acquired polarimetric matrix along the radar line of sight and fully describes the rotation characteristics of each entry of the matrix. Sets of new polarimetric features are derived to describe the hidden scattering information of the target in the rotation domain. The latter extends the traditional polarimetric coherence at a given rotation angle to the rotation domain for complete interpretation. A visualization and characterization tool is established to derive new polarimetric features for hidden information exploration. Then, a classification scheme is developed combining both the selected new hidden polarimetric features in the rotation domain and the commonly used roll-invariant polarimetric features with a support vector machine (SVM) classifier. Comparison experiments based on AIRSAR and multi-temporal UAVSAR data demonstrate that compared with the conventional classification scheme which only uses the roll-invariant polarimetric features, the proposed classification scheme achieves both higher classification accuracy and better robustness. For AIRSAR data, the overall classification accuracy with the proposed classification scheme is 94.91 %, while that with the conventional classification scheme is 93.70 %. Moreover, for multi-temporal UAVSAR data, the averaged overall classification accuracy with the proposed classification scheme is up to 97.08 %, which is much higher than the 87.79 % from the conventional classification scheme. Furthermore, for multi-temporal PolSAR data, the proposed classification scheme can achieve better robustness. The comparison studies also clearly demonstrate that mining and utilization of hidden polarimetric features and information in the rotation domain can gain added benefits for PolSAR land cover classification and provide a new vision for PolSAR image interpretation and application.

  48. A Classification Scheme for Smart Manufacturing Systems’ Performance Metrics

    PubMed Central

    Lee, Y. Tina; Kumaraguru, Senthilkumaran; Jain, Sanjay; Robinson, Stefanie; Helu, Moneer; Hatim, Qais Y.; Rachuri, Sudarsan; Dornfeld, David; Saldana, Christopher J.; Kumara, Soundar

    2017-01-01

    This paper proposes a classification scheme for performance metrics for smart manufacturing systems. The discussion focuses on three such metrics: agility, asset utilization, and sustainability. For each of these metrics, we discuss classification themes, which we then use to develop a generalized classification scheme. In addition to the themes, we discuss a conceptual model that may form the basis for the information necessary for performance evaluations. Finally, we present future challenges in developing robust, performance-measurement systems for real-time, data-intensive enterprises. PMID:28785744

  9. Automated structural classification of lipids by machine learning.

    PubMed

    Taylor, Ryan; Miller, Ryan H; Miller, Ryan D; Porter, Michael; Dalgleish, James; Prince, John T

    2015-03-01

    Modern lipidomics is largely dependent upon structural ontologies because of the great diversity exhibited in the lipidome, but no automated lipid classification exists to facilitate this partitioning. The size of the putative lipidome far exceeds the number currently classified, despite a decade of work. Automated classification would benefit ongoing classification efforts by decreasing the time needed and increasing the accuracy of classification while providing classifications for mass spectral identification algorithms. We introduce a tool that automates classification into the LIPID MAPS ontology of known lipids with >95% accuracy and novel lipids with 63% accuracy. The classification is based upon simple chemical characteristics and modern machine learning algorithms. The decision trees produced are intelligible and can be used to clarify implicit assumptions about the current LIPID MAPS classification scheme. These characteristics and decision trees are made available to facilitate alternative implementations. We also discovered many hundreds of lipids that are currently misclassified in the LIPID MAPS database, strongly underscoring the need for automated classification. Source code and chemical characteristic lists as SMARTS search strings are available under an open-source license at https://www.github.com/princelab/lipid_classifier. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
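    The general approach, binary substructure characteristics feeding a decision tree, can be illustrated roughly as below, assuming RDKit for SMARTS matching and scikit-learn for the tree. The SMARTS patterns, SMILES strings and labels are invented for illustration and are not the published characteristic lists or the LIPID MAPS categories.

        import numpy as np
        from rdkit import Chem                      # assumes RDKit is installed
        from sklearn.tree import DecisionTreeClassifier

        # Illustrative SMARTS-based chemical characteristics (not the published list).
        smarts = {
            "carboxylic_acid": Chem.MolFromSmarts("C(=O)[OH]"),
            "ester":           Chem.MolFromSmarts("C(=O)O[#6]"),
            "phosphate":       Chem.MolFromSmarts("P(=O)(O)O"),
        }

        def featurize(smiles):
            """Binary vector: does the molecule contain each substructure?"""
            mol = Chem.MolFromSmiles(smiles)
            return [int(mol.HasSubstructMatch(patt)) for patt in smarts.values()]

        # Toy training data (SMILES, category label), purely illustrative.
        train = [("CCCCCCCCCCCCCCCC(=O)O", "fatty_acid"),
                 ("CCCCCCCCCCCCCCCC(=O)OC", "fatty_ester"),
                 ("OP(=O)(O)OCC(O)CO", "glycerophosphate")]
        X = np.array([featurize(s) for s, _ in train])
        y = [label for _, label in train]

        tree = DecisionTreeClassifier().fit(X, y)
        print(tree.predict([featurize("CCCCCCCC(=O)O")]))  # classify a new structure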

  10. CLASSIFICATION FRAMEWORK FOR COASTAL ECOSYSTEM RESPONSES TO AQUATIC STRESSORS

    EPA Science Inventory

    Many classification schemes have been developed to group ecosystems based on similar characteristics. To date, however, no single scheme has addressed coastal ecosystem responses to multiple stressors. We developed a classification framework for coastal ecosystems to improve the ...

  11. Evaluation of Visual Alerts in the Maritime Domain. Study 2. Program Modifications

    DTIC Science & Technology

    2009-02-01

    feedback that they were wrong, and without consulting the Status screen again enter the alternate answer ("qwe"). That is, the need to consult the... [Excerpt also contains Table 3, "First proposed target types and classification scheme", with columns Type / Size / Speed / Weapons / Flag / Response: Neutral, Large, Slow, No, Other, QWE; Hostile, Small, Fast, Yes, Other, ASD; Friendly, Large/Small, Slow, ...]

  12. THE ROLE OF WATERSHED CLASSIFICATION IN DIAGNOSING CAUSES OF BIOLOGICAL IMPAIRMENT

    EPA Science Inventory

    We compared classification schemes based on watershed storage (wetland + lake area/watershed area) and forest fragmentation with a geographically-based classification scheme for two case studies involving 1) Lake Superior tributaries and 2) watersheds of riverine coastal wetlands ...

  13. Selective classification for improved robustness of myoelectric control under nonideal conditions.

    PubMed

    Scheme, Erik J; Englehart, Kevin B; Hudgins, Bernard S

    2011-06-01

    Recent literature in pattern recognition-based myoelectric control has highlighted a disparity between classification accuracy and the usability of upper limb prostheses. This paper suggests that the conventionally defined classification accuracy may be idealistic and may not reflect true clinical performance. Herein, a novel myoelectric control system based on a selective multiclass one-versus-one classification scheme, capable of rejecting unknown data patterns, is introduced. This scheme is shown to outperform nine other popular classifiers when compared using conventional classification accuracy as well as a form of leave-one-out analysis that may be more representative of real prosthetic use. Additionally, the classification scheme allows for real-time, independent adjustment of individual class-pair boundaries, making it flexible and intuitive for clinical use.
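    A rough sketch of the general idea behind a selective one-versus-one scheme, accepting a class only when all of its pairwise decisions agree and rejecting the pattern otherwise, is shown below. It assumes scikit-learn binary SVMs and a simple unanimity rule on synthetic data; the actual system's classifier, features and rejection criterion may differ.

        import numpy as np
        from itertools import combinations
        from sklearn.svm import SVC

        def fit_pairwise(X, y):
            """Train one binary classifier per class pair."""
            models = {}
            for a, b in combinations(np.unique(y), 2):
                mask = np.isin(y, [a, b])
                models[(a, b)] = SVC(kernel="rbf").fit(X[mask], y[mask])
            return models

        def predict_selective(models, x, classes):
            """Return a class only if it wins every pairwise contest it is part of."""
            votes = {c: 0 for c in classes}
            for m in models.values():
                votes[m.predict(x.reshape(1, -1))[0]] += 1
            best = max(votes, key=votes.get)
            # Each class takes part in (n_classes - 1) pairwise contests.
            return best if votes[best] == len(classes) - 1 else "reject"

        # Toy EMG-like feature data (illustrative only).
        rng = np.random.default_rng(1)
        X = rng.normal(size=(90, 8))
        y = np.repeat([0, 1, 2], 30)
        models = fit_pairwise(X, y)
        print(predict_selective(models, X[0], np.unique(y)))  # a class label or "reject"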

  14. A classification scheme for edge-localized modes based on their probability distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shabbir, A., E-mail: aqsa.shabbir@ugent.be; Max Planck Institute for Plasma Physics, D-85748 Garching; Hornung, G.

    We present here an automated classification scheme which is particularly well suited to scenarios where the parameters have significant uncertainties or are stochastic quantities. To this end, the parameters are modeled with probability distributions in a metric space and classification is conducted using the notion of nearest neighbors. The presented framework is then applied to the classification of type I and type III edge-localized modes (ELMs) from a set of carbon-wall plasmas at JET. This provides a fast, standardized classification of ELM types which is expected to significantly reduce the effort of ELM experts in identifying ELM types. Further, the classification scheme is general and can be applied to various other plasma phenomena as well.
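    As a rough illustration of nearest-neighbor classification of objects represented by probability distributions, the sketch below treats each object as a set of samples of a stochastic parameter and uses the 1-D Wasserstein distance as the metric. The metric choice, the synthetic data and the "type I"/"type III" labels are assumptions for illustration only.

        import numpy as np
        from scipy.stats import wasserstein_distance

        def nn_classify(test_samples, train_sets, train_labels):
            """1-NN over distributions, with the 1-D Wasserstein distance."""
            dists = [wasserstein_distance(test_samples, s) for s in train_sets]
            return train_labels[int(np.argmin(dists))]

        rng = np.random.default_rng(2)
        # Toy objects: parameter samples whose distributions differ by ELM type.
        train_sets = [rng.normal(1.0, 0.2, 200), rng.normal(1.1, 0.2, 200),
                      rng.normal(0.4, 0.2, 200), rng.normal(0.5, 0.2, 200)]
        train_labels = ["type I", "type I", "type III", "type III"]

        test = rng.normal(0.45, 0.2, 200)
        print(nn_classify(test, train_sets, train_labels))  # expected: "type III"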

  15. Alternative ways of representing Zapotec and Cuicatec folk classification of birds: a multidimensional model and its implications for culturally-informed conservation in Oaxaca, México.

    PubMed

    Alcántara-Salinas, Graciela; Ellen, Roy F; Valiñas-Coalla, Leopoldo; Caballero, Javier; Argueta-Villamar, Arturo

    2013-12-09

    We report on a comparative ethno-ornithological study of Zapotec and Cuicatec communities in Northern Oaxaca, Mexico, that provided a challenge to some existing descriptions of folk classification. Our default model was the taxonomic system of ranks developed by Brent Berlin. Fieldwork was conducted in the Zapotec village of San Miguel Tiltepec and in the Cuicatec village of San Juan Teponaxtla, using a combination of ethnographic interviews and pile-sorting tests. Post-fieldwork, Principal Component Analysis using NTSYSpc V. 2.11f was applied to obtain pattern variation for the answers from different participants. Using language and pile-sorting data analysed through Principal Component Analysis, we show how both Zapotec and Cuicatec subjects place a particular emphasis on an intermediate level of classification. These categories group birds with non-birds using ecological and behavioral criteria, and violate a strict distinction between symbolic and mundane (or ‘natural’), and between ‘general-purpose’ and ‘single-purpose’ schemes. We suggest that shared classificatory knowledge embodying everyday schemes for apprehending the world of birds might be better reflected in a multidimensional model that would also provide a more realistic basis for developing culturally-informed conservation strategies.

  16. Alternative ways of representing Zapotec and Cuicatec folk classification of birds: a multidimensional model and its implications for culturally-informed conservation in Oaxaca, México

    PubMed Central

    2013-01-01

    Background We report on a comparative ethno-ornithological study of Zapotec and Cuicatec communities in Northern Oaxaca, Mexico that provided a challenge to some existing descriptions of folk classification. Our default model was the taxonomic system of ranks developed by Brent Berlin. Methods Fieldwork was conducted in the Zapotec village of San Miguel Tiltepec and in the Cuicatec village of San Juan Teponaxtla, using a combination of ethnographic interviews and pile-sorting tests. Post-fieldwork, Principal Component Analysis using NTSYSpc V. 2.11f was applied to obtain pattern variation for the answers from different participants. Results and conclusion Using language and pile-sorting data analysed through Principal Component Analysis, we show how both Zapotec and Cuicatec subjects place a particular emphasis on an intermediate level of classification. These categories group birds with non-birds using ecological and behavioral criteria, and violate a strict distinction between symbolic and mundane (or ‘natural’), and between ‘general-purpose’ and ‘single-purpose’ schemes. We suggest that shared classificatory knowledge embodying everyday schemes for apprehending the world of birds might be better reflected in a multidimensional model that would also provide a more realistic basis for developing culturally-informed conservation strategies. PMID:24321280

  17. Mapping Mangrove Density from Rapideye Data in Central America

    NASA Astrophysics Data System (ADS)

    Son, Nguyen-Thanh; Chen, Chi-Farn; Chen, Cheng-Ru

    2017-06-01

    Mangrove forests provide a wide range of socioeconomic and ecological services for coastal communities. Extensive aquaculture development of mangrove waters in many developing countries has constantly ignored the services of mangrove ecosystems, leading to unintended environmental consequences. Monitoring the current status and distribution of mangrove forests is deemed important for evaluating forest management strategies. This study aims to delineate the density distribution of mangrove forests in the Gulf of Fonseca, Central America with Rapideye data using support vector machines (SVM). The data collected in 2012 for density classification of mangrove forests were processed based on four different band combination schemes: scheme-1 (bands 1-3, 5, excluding the red-edge band 4), scheme-2 (bands 1-5), scheme-3 (bands 1-3, 5 incorporating the normalized difference vegetation index, NDVI), and scheme-4 (bands 1-3, 5 incorporating the normalized difference red-edge index, NDRI). We also examined whether the Rapideye red-edge band makes an obvious contribution that improves the classification results. Three main steps of data processing were employed: (1) data pre-processing, (2) image classification, and (3) accuracy assessment, to evaluate the contribution of the red-edge band to the accuracy of the classification results across the four schemes. The classification maps, compared with the ground reference data, indicated slightly higher accuracy for schemes 2 and 4. The overall accuracies and Kappa coefficients were 97% and 0.95 for scheme-2 and 96.9% and 0.95 for scheme-4, respectively.

  18. Realistic Expectations for Rock Identification.

    ERIC Educational Resources Information Center

    Westerback, Mary Elizabeth; Azer, Nazmy

    1991-01-01

    Presents a rock classification scheme for use by beginning students. The scheme is based on rock textures (glassy, crystalline, clastic, and organic framework) and observable structures (vesicles and graded bedding). Discusses problems in other rock classification schemes which may produce confusion, misidentification, and anxiety. (10 references)…

  19. A Philosophical Approach to Describing Science Content: An Example From Geologic Classification.

    ERIC Educational Resources Information Center

    Finley, Fred N.

    1981-01-01

    Examines how research of philosophers of science may be useful to science education researchers and curriculum developers in the development of descriptions of science content related to classification schemes. Provides examples of concept analysis of two igneous rock classification schemes. (DS)

  20. Simple adaptive sparse representation based classification schemes for EEG based brain-computer interface applications.

    PubMed

    Shin, Younghak; Lee, Seungchan; Ahn, Minkyu; Cho, Hohyun; Jun, Sung Chan; Lee, Heung-No

    2015-11-01

    One of the main problems related to electroencephalogram (EEG) based brain-computer interface (BCI) systems is the non-stationarity of the underlying EEG signals. This results in the deterioration of the classification performance during experimental sessions. Therefore, adaptive classification techniques are required for EEG based BCI applications. In this paper, we propose simple adaptive sparse representation based classification (SRC) schemes. Supervised and unsupervised dictionary update techniques for new test data and a dictionary modification method by using the incoherence measure of the training data are investigated. The proposed methods are very simple and additional computation for the re-training of the classifier is not needed. The proposed adaptive SRC schemes are evaluated using two BCI experimental datasets. The proposed methods are assessed by comparing classification results with the conventional SRC and other adaptive classification methods. On the basis of the results, we find that the proposed adaptive schemes show relatively improved classification accuracy as compared to conventional methods without requiring additional computation. Copyright © 2015 Elsevier Ltd. All rights reserved.
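    The basic sparse representation classification (SRC) step that these adaptive schemes build on can be sketched as follows: a test trial is sparsely coded over a dictionary whose columns are training trials, and the class whose atoms give the smallest reconstruction residual wins. This is a generic illustration on synthetic data, assuming an orthogonal-matching-pursuit coder; the paper's dictionary-update and modification rules are not reproduced.

        import numpy as np
        from sklearn.linear_model import OrthogonalMatchingPursuit

        def src_classify(D, labels, x, n_nonzero=5):
            """D: dictionary (n_features x n_train), columns are training trials.
            Sparse-code x over D, then pick the class with the smallest residual."""
            omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero,
                                            fit_intercept=False).fit(D, x)
            coef = omp.coef_
            best, best_res = None, np.inf
            for c in np.unique(labels):
                coef_c = np.where(labels == c, coef, 0.0)   # keep only class-c atoms
                res = np.linalg.norm(x - D @ coef_c)
                if res < best_res:
                    best, best_res = c, res
            return best

        rng = np.random.default_rng(3)
        n_feat, n_per_class = 32, 20
        # Toy EEG feature vectors for two classes with different signatures.
        D = np.hstack([rng.normal(0.0, 1.0, (n_feat, n_per_class)),
                       rng.normal(2.0, 1.0, (n_feat, n_per_class))])
        labels = np.array([0] * n_per_class + [1] * n_per_class)
        x = rng.normal(2.0, 1.0, n_feat)                     # unseen class-1 trial
        print(src_classify(D, labels, x))                    # expected: 1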

  1. THE WESTERN LAKE SUPERIOR COMPARATIVE WATERSHED FRAMEWORK: A FIELD TEST OF GEOGRAPHICALLY-DEPENDENT VS. THRESHOLD-BASED GEOGRAPHICALLY-INDEPENDENT CLASSIFICATION

    EPA Science Inventory

    Stratified random selection of watersheds allowed us to compare geographically-independent classification schemes based on watershed storage (wetland + lake area/watershed area) and forest fragmentation with a geographically-based classification scheme within the Northern Lakes a...

  2. Towards a Collaborative Intelligent Tutoring System Classification Scheme

    ERIC Educational Resources Information Center

    Harsley, Rachel

    2014-01-01

    This paper presents a novel classification scheme for Collaborative Intelligent Tutoring Systems (CITS), an emergent research field. The three emergent classifications of CITS are unstructured, semi-structured, and fully structured. While all three types of CITS offer opportunities to improve student learning gains, the full extent to which these…

  3. A new classification scheme of European cold-water coral habitats: Implications for ecosystem-based management of the deep sea

    NASA Astrophysics Data System (ADS)

    Davies, J. S.; Guillaumont, B.; Tempera, F.; Vertino, A.; Beuck, L.; Ólafsdóttir, S. H.; Smith, C. J.; Fosså, J. H.; van den Beld, I. M. J.; Savini, A.; Rengstorf, A.; Bayle, C.; Bourillet, J.-F.; Arnaud-Haond, S.; Grehan, A.

    2017-11-01

    Cold-water corals (CWC) can form complex structures which provide refuge, nursery grounds and physical support for a diversity of other living organisms. However, despite this ecological significance, CWCs remain vulnerable to human pressures such as fishing, pollution, ocean acidification and global warming. Providing coherent and representative conservation of vulnerable marine ecosystems including CWCs is one of the aims of the Marine Protected Areas networks being implemented across European seas and oceans under the EC Habitats Directive, the Marine Strategy Framework Directive and the OSPAR Convention. In order to adequately represent ecosystem diversity, these initiatives require a standardised habitat classification that organises the variety of biological assemblages and provides consistent and functional criteria to map them across European Seas. One such classification system, EUNIS, enables a broad-level classification of the deep sea based on abiotic and geomorphological features. More detailed, lower biotope-related levels are currently under-developed, particularly with regard to deep-water habitats (>200 m depth). This paper proposes a hierarchical CWC biotope classification scheme that could be incorporated into existing classification schemes such as EUNIS. The scheme was developed within the EU FP7 project CoralFISH to capture the variability of CWC habitats identified using a wealth of seafloor imagery datasets from across the Northeast Atlantic and Mediterranean. Depending on the resolution of the imagery being interpreted, this hierarchical scheme allows data to be recorded from broad CWC biotope categories down to detailed taxonomy-based levels, thereby providing a flexible yet valuable level of information for management. The CWC biotope classification scheme identifies 81 biotopes and highlights the limitations of the classification framework and guidance provided by EUNIS, the EC Habitats Directive, OSPAR and FAO, which largely underrepresent CWC habitats.

  4. Map Classification: A Comparison of Schemes with Special Reference to the Continent of Africa. Occasional Papers, Number 154.

    ERIC Educational Resources Information Center

    Merrett, Christopher E.

    This guide to the theory and practice of map classification begins with a discussion of the filing of maps and the function of map classification based on area and theme as illustrated by four maps of Africa. The description of the various classification systems which follows is divided into book schemes with provision for maps (including Dewey…

  5. PBT assessment under REACH: Screening for low aquatic bioaccumulation with QSAR classifications based on physicochemical properties to replace BCF in vivo testing on fish.

    PubMed

    Nendza, Monika; Kühne, Ralph; Lombardo, Anna; Strempel, Sebastian; Schüürmann, Gerrit

    2018-03-01

    Aquatic bioconcentration factors (BCFs) are critical in PBT (persistent, bioaccumulative, toxic) and risk assessment of chemicals. High costs and use of more than 100 fish per standard BCF study (OECD 305) call for alternative methods to replace as much in vivo testing as possible. The BCF waiving scheme is a screening tool combining QSAR classifications based on physicochemical properties related to the distribution (hydrophobicity, ionisation), persistence (biodegradability, hydrolysis), solubility and volatility (Henry's law constant) of substances in water bodies and aquatic biota to predict substances with low aquatic bioaccumulation (nonB, BCF<2000). The BCF waiving scheme was developed with a dataset of reliable BCFs for 998 compounds and externally validated with another 181 substances. It performs with 100% sensitivity (no false negatives), >50% efficacy (waiving potential), and complies with the OECD principles for valid QSARs. The chemical applicability domain of the BCF waiving scheme is given by the structures of the training set, with some compound classes explicitly excluded like organometallics, poly- and perfluorinated compounds, aromatic triphenylphosphates, surfactants. The prediction confidence of the BCF waiving scheme is based on applicability domain compliance, consensus modelling, and the structural similarity with known nonB and B/vB substances. Compounds classified as nonB by the BCF waiving scheme are candidates for waiving of BCF in vivo testing on fish due to low concern with regard to the B criterion. The BCF waiving scheme supports the 3Rs with a possible reduction of >50% of BCF in vivo testing on fish. If the target chemical is outside the applicability domain of the BCF waiving scheme or not classified as nonB, further assessments with in silico, in vitro or in vivo methods are necessary to either confirm or reject bioaccumulative behaviour. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Predominant-period site classification for response spectra prediction equations in Italy

    USGS Publications Warehouse

    Di Alessandro, Carola; Bonilla, Luis Fabian; Boore, David M.; Rovelli, Antonio; Scotti, Oona

    2012-01-01

    We propose a site‐classification scheme based on the predominant period of the site, as determined from the average horizontal‐to‐vertical (H/V) spectral ratios of ground motion. Our scheme extends Zhao et al. (2006) classifications by adding two classes, the most important of which is defined by flat H/V ratios with amplitudes less than 2. The proposed classification is investigated by using 5%‐damped response spectra from Italian earthquake records. We select a dataset of 602 three‐component analog and digital recordings from 120 earthquakes recorded at 214 seismic stations within a hypocentral distance of 200 km. Selected events are in the moment‐magnitude range 4.0≤Mw≤6.8 and focal depths from a few kilometers to 46 km. We computed H/V ratios for these data and used them to classify each site into one of six classes. We then investigate the impact of this classification scheme on empirical ground‐motion prediction equations (GMPEs) by comparing its performance with that of the conventional rock/soil classification. Although the adopted approach results in only a small reduction of the overall standard deviation, the use of H/V spectral ratios in site classification does capture the signature of sites with flat frequency‐response, as well as deep and shallow‐soil profiles, characterized by long‐ and short‐period resonance, respectively; in addition, the classification scheme is relatively quick and inexpensive, which is an advantage over schemes based on measurements of shear‐wave velocity.
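    The quantity at the heart of this scheme, the average horizontal-to-vertical spectral ratio and the period at which it peaks, can be computed roughly as below. This is a simplified sketch on a synthetic record: real applications window and smooth the spectra, average over many earthquakes and apply the paper's class boundaries, none of which are reproduced here.

        import numpy as np

        def hv_predominant_period(ns, ew, ud, dt):
            """Average horizontal-to-vertical Fourier spectral ratio and the period
            at which it peaks (no smoothing or windowing in this sketch)."""
            freqs = np.fft.rfftfreq(len(ud), dt)
            spec = lambda x: np.abs(np.fft.rfft(x))
            h = np.sqrt(spec(ns) * spec(ew))           # geometric mean of horizontals
            hv = h / np.maximum(spec(ud), 1e-12)
            k = np.argmax(hv[1:]) + 1                  # skip the zero-frequency bin
            return hv, 1.0 / freqs[k]

        # Synthetic 3-component record with a 2 Hz horizontal resonance (0.5 s period).
        dt = 0.01
        t = np.arange(0, 40, dt)
        rng = np.random.default_rng(4)
        ud = rng.normal(size=t.size)
        ns = rng.normal(size=t.size) + 3 * np.sin(2 * np.pi * 2.0 * t)
        ew = rng.normal(size=t.size) + 3 * np.sin(2 * np.pi * 2.0 * t)
        hv, period = hv_predominant_period(ns, ew, ud, dt)
        print("predominant period ~", round(period, 2), "s")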

  7. Classification scheme for phenomenological universalities in growth problems in physics and other sciences.

    PubMed

    Castorina, P; Delsanto, P P; Guiot, C

    2006-05-12

    A classification in universality classes of broad categories of phenomenologies, belonging to physics and other disciplines, may be very useful for a cross fertilization among them and for the purpose of pattern recognition and interpretation of experimental data. We present here a simple scheme for the classification of nonlinear growth problems. The success of the scheme in predicting and characterizing the well known Gompertz, West, and logistic models, suggests to us the study of a hitherto unexplored class of nonlinear growth problems.

  8. Enhancing Vocabulary Acquisition through Reading: A Hierarchy of Text-Related Exercise Types.

    ERIC Educational Resources Information Center

    Wesche, M.; Paribakht, T. Sima

    This paper describes a classification scheme developed to examine the effects of extensive reading on primary and second language vocabulary acquisition and reports on an experiment undertaken to test the model scheme. The classification scheme represents a hypothesized hierarchy of the degree and type of mental processing required by various…

  9. Mammogram classification scheme using 2D-discrete wavelet and local binary pattern for detection of breast cancer

    NASA Astrophysics Data System (ADS)

    Adi Putra, Januar

    2018-04-01

    In this paper, we propose a new mammogram classification scheme to classify breast tissues as normal or abnormal. A feature matrix is generated by applying the Local Binary Pattern to all the detail coefficients from the 2D-DWT of the region of interest (ROI) of a mammogram. Feature selection is used to reduce the dimensionality of the data and discard irrelevant features; here, the F-test and t-test are applied to the extracted features to select the relevant ones. The best features are used in a neural network classifier for classification. In this research we use the MIAS and DDSM databases. In addition to the proposed scheme, competing schemes are also simulated for comparative analysis. The proposed scheme performs better with respect to accuracy, specificity and sensitivity. In the experiments, the proposed scheme achieves an accuracy of up to 92.71%, while the lowest accuracy obtained is 77.08%.
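    The feature-extraction step described above, LBP histograms computed on the detail coefficients of a 2-D DWT of the ROI, can be sketched roughly as follows, assuming PyWavelets and scikit-image. The wavelet choice, LBP parameters and the random stand-in ROI are placeholders, not the paper's settings, and the downstream neural network is omitted.

        import numpy as np
        import pywt
        from skimage.feature import local_binary_pattern

        def dwt_lbp_features(roi, wavelet="haar", P=8, R=1.0):
            """2-D DWT of the ROI, then an LBP histogram on each detail sub-band."""
            _, (cH, cV, cD) = pywt.dwt2(roi.astype(float), wavelet)
            feats = []
            for band in (cH, cV, cD):
                lbp = local_binary_pattern(band, P, R, method="uniform")
                hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
                feats.append(hist)
            return np.concatenate(feats)

        # Toy "mammogram ROI" (random image standing in for real data).
        roi = np.random.default_rng(5).random((128, 128))
        print(dwt_lbp_features(roi).shape)   # 3 sub-bands x (P + 2) bins = 30 features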

  10. The impact of catchment source group classification on the accuracy of sediment fingerprinting outputs.

    PubMed

    Pulley, Simon; Foster, Ian; Collins, Adrian L

    2017-06-01

    The objective classification of sediment source groups is at present an under-investigated aspect of source tracing studies, which has the potential to statistically improve discrimination between sediment sources and reduce uncertainty. This paper investigates this potential using three different source group classification schemes. The first classification scheme was simple surface and subsurface groupings (Scheme 1). The tracer signatures were then used in a two-step cluster analysis to identify the sediment source groupings naturally defined by the tracer signatures (Scheme 2). The cluster source groups were then modified by splitting each one into a surface and subsurface component to suit catchment management goals (Scheme 3). The schemes were tested using artificial mixtures of sediment source samples. Controlled corruptions were made to some of the mixtures to mimic the potential causes of tracer non-conservatism present when using tracers in natural fluvial environments. It was determined how accurately the known proportions of sediment sources in the mixtures were identified after unmixing modelling using the three classification schemes. The cluster analysis derived source groups (Scheme 2) significantly increased tracer variability ratios (inter-/intra-source group variability) (up to 2122%, median 194%) compared to the surface and subsurface groupings (Scheme 1). As a result, the composition of the artificial mixtures was identified an average of 9.8% more accurately on the 0-100% contribution scale. It was found that the cluster groups could be reclassified into a surface and subsurface component (Scheme 3) with no significant increase in composite uncertainty (a 0.1% increase over Scheme 2). The far smaller effects of simulated tracer non-conservatism for the cluster-analysis-based schemes (2 and 3) were primarily attributed to the increased inter-group variability producing a sediment source signal far larger than the non-conservatism noise, in contrast to Scheme 1. Modified cluster-analysis-based classification methods have the potential to reduce composite uncertainty significantly in future source tracing studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
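    The unmixing step used to test each source-group classification can be viewed as a constrained least-squares problem: find non-negative source proportions, summing to one, that best reproduce the mixture's tracer signature. The sketch below is a generic illustration using scipy's bounded least squares followed by renormalization; it does not reproduce the paper's mixing model, tracers or uncertainty handling.

        import numpy as np
        from scipy.optimize import lsq_linear

        def unmix(source_means, mixture):
            """source_means: (n_tracers x n_sources) mean signature per source group;
            mixture: (n_tracers,) measured signature. Returns estimated proportions."""
            res = lsq_linear(source_means, mixture, bounds=(0.0, 1.0))
            p = res.x
            return p / p.sum()                  # enforce proportions summing to 1

        # Toy example: 3 tracers, 2 source groups, known 70/30 mixture.
        A = np.array([[10.0, 2.0],
                      [1.0, 8.0],
                      [5.0, 5.0]])
        true_p = np.array([0.7, 0.3])
        mix = A @ true_p
        print(np.round(unmix(A, mix), 2))       # ~ [0.7, 0.3]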

  11. A recursive field-normalized bibliometric performance indicator: an application to the field of library and information science.

    PubMed

    Waltman, Ludo; Yan, Erjia; van Eck, Nees Jan

    2011-10-01

    Two commonly used ideas in the development of citation-based research performance indicators are the idea of normalizing citation counts based on a field classification scheme and the idea of recursive citation weighting (as in PageRank-inspired indicators). We combine these two ideas in a single indicator, referred to as the recursive mean normalized citation score indicator, and we study the validity of this indicator. Our empirical analysis shows that the proposed indicator is highly sensitive to the field classification scheme that is used. The indicator also has a strong tendency to reinforce biases caused by the classification scheme. Based on these observations, we advise against the use of indicators in which the idea of normalization based on a field classification scheme and the idea of recursive citation weighting are combined.
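    Purely to illustrate how recursive citation weighting and field normalization can be combined, the sketch below runs a PageRank-style damped iteration over a toy citation network and then normalizes each paper's score by the mean score of its field class. This is not the recursive mean normalized citation score as defined in the paper; the damping factor, iteration scheme and data are assumptions.

        import numpy as np

        def field_normalized_pagerank(citations, fields, d=0.85, n_iter=100):
            """citations[j] = list of papers that paper j cites.
            Recursive citation weighting via a PageRank-style iteration, followed
            by normalization against the mean score of each field class."""
            n = len(citations)
            score = np.ones(n)
            for _ in range(n_iter):
                new = np.full(n, 1.0 - d)
                for j, refs in enumerate(citations):
                    for i in refs:                       # paper j cites paper i
                        new[i] += d * score[j] / max(len(refs), 1)
                score = new
            out = np.empty(n)
            for f in set(fields):
                idx = [i for i in range(n) if fields[i] == f]
                out[idx] = score[idx] / np.mean(score[idx])
            return out

        # Toy corpus: 5 papers in two field classes; paper 1 cites paper 0, etc.
        citations = [[], [0], [0, 1], [2], [2]]
        fields = ["LIS", "LIS", "LIS", "CS", "CS"]
        print(np.round(field_normalized_pagerank(citations, fields), 2))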

  12. Mild, moderate, meaningful? Examining the psychological and functioning correlates of DSM-5 eating disorder severity specifiers.

    PubMed

    Gianini, Loren; Roberto, Christina A; Attia, Evelyn; Walsh, B Timothy; Thomas, Jennifer J; Eddy, Kamryn T; Grilo, Carlos M; Weigel, Thomas; Sysko, Robyn

    2017-08-01

    This study evaluated the DSM-5 severity specifiers for treatment-seeking groups of participants with anorexia nervosa (AN), the purging form of bulimia nervosa (BN), and binge-eating disorder (BED). One hundred and sixty-two participants with AN, 93 participants with BN, and 343 participants with BED were diagnosed using semi-structured interviews, sub-categorized using DSM-5 severity specifiers and compared on demographic and cross-sectional clinical measures. In AN, the number of previous hospitalizations and the duration of illness increased with severity, but there was no difference across severity groups on measures of eating pathology, depression, or measures of self-reported physical or emotional functioning. In BN, the level of eating concerns increased across the severity groups, but the groups did not differ on measures of depression, self-esteem, and most eating pathology variables. In BN, support was also found for an alternative severity classification scheme based upon the number of methods of purging. In BED, levels of several measures of eating pathology and self-reported physical and emotional functioning increased across the severity groups. For BED, however, support was also found for an alternative severity classification scheme based upon overvaluation of shape and weight. Preliminary evidence was also found for a transdiagnostic severity index based upon overvaluation of shape and weight. Overall, these data show limited support for the DSM-5 severity specifiers for BN and modest support for the DSM-5 severity specifiers for AN and BED. © 2017 Wiley Periodicals, Inc.

  13. Real-time ultrasonic weld evaluation system

    NASA Astrophysics Data System (ADS)

    Katragadda, Gopichand; Nair, Satish; Liu, Harry; Brown, Lawrence M.

    1996-11-01

    Ultrasonic testing techniques are currently used as an alternative to radiography for detecting, classifying, and sizing weld defects, and for evaluating weld quality. Typically, ultrasonic weld inspections are performed manually, which requires significant operator expertise and time. Thus, in recent years, the emphasis has been on developing automated methods to aid or replace operators in critical weld inspections where inspection time, reliability, and operator safety are major issues. During this period, significant advances were made in the areas of weld defect classification and sizing. Very few of these methods, however, have found their way into the market, largely due to the lack of an integrated approach enabling real-time implementation. Also, not much research effort was directed toward improving weld acceptance criteria. This paper presents an integrated system utilizing state-of-the-art techniques for complete automation of the weld inspection procedure. The modules discussed include transducer tracking, classification, sizing, and weld acceptance criteria. Transducer tracking was studied by experimentally evaluating sonic and optical position tracking techniques; details of this evaluation are presented. Classification is obtained using a multi-layer perceptron. Results from different feature extraction schemes, including a new method based on a combination of time- and frequency-domain signal representations, are given. Algorithms developed to automate defect registration and sizing are discussed. A fuzzy-logic criterion for weld acceptance is presented, describing how this scheme provides improved robustness compared to traditional flow-diagram standards.

  14. Etiological classification of ischemic stroke in young patients: a comparative study of TOAST, CCS, and ASCO.

    PubMed

    Gökçal, Elif; Niftaliyev, Elvin; Asil, Talip

    2017-09-01

    Analysis of stroke subtypes is important for making treatment decisions and prognostic evaluations. The TOAST classification system is most commonly used, but the CCS and ASCO classification systems might be more useful to identify stroke etiologies in young patients whose strokes have a wide range of different causes. In this manuscript, we aim to compare the differences in subtype classification between TOAST, CCS, and ASCO in young stroke patients. The TOAST, CCS, and ASCO classification schemes were applied to 151 patients with ischemic stroke aged 18-49 years old and the proportion of subtypes classified by each scheme was compared. For comparison, determined etiologies were defined as cases with evident and probable subtypes when using the CCS scheme and cases with grade 1 and 2 subtypes but no other grade 1 subtype when using the ASCO scheme. The McNemar test with Bonferroni correction was used to assess significance. By TOAST, 41.1% of patients' stroke etiology was classified as undetermined etiology, 19.2% as cardioembolic, 13.2% as large artery atherosclerosis, 11.3% as small vessel occlusion, and 15.2% as other causes. Compared with TOAST, both CCS and ASCO assigned fewer patients to the undetermined etiology group (30.5% p < 0.001 and 26.5% p < 0.001, respectively) and assigned more patients to the small vessel occlusion category (19.9%, p < 0.001, and 21.9%, p < 0.001, respectively). Additionally, both schemes assigned more patients to the large artery atherosclerosis group (15.9 and 16.6%, respectively). The proportion of patients assigned to either the cardioembolic or the other causes etiology did not differ significantly between the three schemes. Application of the CCS and ASCO classification schemes in young stroke patients seems feasible, and using both schemes may result in fewer patients being classified as undetermined etiology. New studies with more patients and a prospective design are needed to explore this topic further.

  15. Development and application of a new comprehensive image-based classification scheme for coastal and benthic environments along the southeast Florida continental shelf

    NASA Astrophysics Data System (ADS)

    Makowski, Christopher

    The coastal (terrestrial) and benthic environments along the southeast Florida continental shelf show a unique biophysical succession of marine features, from a highly urbanized, developed coastal region in the north (i.e. northern Miami-Dade County) to a protected marine sanctuary in the southeast (i.e. the Florida Keys National Marine Sanctuary). However, a standard bio-geomorphological classification scheme for this area of coastal and benthic environments is lacking. The purpose of this study was to test whether new parameters integrating geomorphological components with dominant biological covers could be developed and applied across multiple remote sensing platforms as an innovative way to identify, interpret, and classify diverse coastal and benthic environments along the southeast Florida continental shelf. An ordered, manageable hierarchical classification scheme was developed to incorporate the categories of Physiographic Realm, Morphodynamic Zone, Geoform, Landform, Dominant Surface Sediment, and Dominant Biological Cover. Six different remote sensing platforms (five multi-spectral satellite image sensors and one source of high-resolution aerial orthoimagery) were acquired, delineated according to the new classification scheme, and compared to determine optimal formats for classifying the study area. Cognitive digital classification at a nominal scale of 1:6000 proved to be more accurate than autoclassification programs and was therefore used to differentiate coastal marine environments based on spectral reflectance characteristics, such as color, tone, saturation, pattern, and texture of the seafloor topography. In addition, attribute tables were created in conjunction with the interpretations to quantify and compare the spatial relationships between classificatory units. IKONOS-2 satellite imagery was determined to be the optimal platform for applying the hierarchical classification scheme; however, each remote sensing platform had beneficial properties depending on research goals, logistical restrictions, and financial support. This study concluded that a new comprehensive hierarchical classification scheme for identifying coastal marine environments along the southeast Florida continental shelf could be achieved by integrating geomorphological features with biological coverages. This newly developed scheme, which can be applied across multiple remote sensing platforms with GIS software, establishes an innovative classification protocol to be used in future research studies.

  16. Demonstration of Advanced EMI Models for Live-Site UXO Discrimination at Former Camp Butner, North Carolina

    DTIC Science & Technology

    2012-05-01

    Report excerpt (table of contents): 2.3.3 Classification using template matching; 2.4 Details of classification schemes; 2.4.1 Camp Butner TEMTADS data inversion and classification scheme.

  17. Transporter taxonomy - a comparison of different transport protein classification schemes.

    PubMed

    Viereck, Michael; Gaulton, Anna; Digles, Daniela; Ecker, Gerhard F

    2014-06-01

    Currently, there are more than 800 well characterized human membrane transport proteins (including channels and transporters) and there are estimates that about 10% (approx. 2000) of all human genes are related to transport. Membrane transport proteins are of interest as potential drug targets, for drug delivery, and as a cause of side effects and drug–drug interactions. In light of the development of Open PHACTS, which provides an open pharmacological space, we analyzed selected membrane transport protein classification schemes (Transporter Classification Database, ChEMBL, IUPHAR/BPS Guide to Pharmacology, and Gene Ontology) for their ability to serve as a basis for pharmacology driven protein classification. A comparison of these membrane transport protein classification schemes by using a set of clinically relevant transporters as use-case reveals the strengths and weaknesses of the different taxonomy approaches.

  18. A scheme for a flexible classification of dietary and health biomarkers.

    PubMed

    Gao, Qian; Praticò, Giulia; Scalbert, Augustin; Vergères, Guy; Kolehmainen, Marjukka; Manach, Claudine; Brennan, Lorraine; Afman, Lydia A; Wishart, David S; Andres-Lacueva, Cristina; Garcia-Aloy, Mar; Verhagen, Hans; Feskens, Edith J M; Dragsted, Lars O

    2017-01-01

    Biomarkers are an efficient means to examine intakes or exposures and their biological effects and to assess system susceptibility. Aided by novel profiling technologies, the biomarker research field is undergoing rapid development and new putative biomarkers are continuously emerging in the scientific literature. However, the existing concepts for classification of biomarkers in the dietary and health area may be ambiguous, leading to uncertainty about their application. In order to better understand the potential of biomarkers and to communicate their use and application, it is imperative to have a solid scheme for biomarker classification that will provide a well-defined ontology for the field. In this manuscript, we provide an improved scheme for biomarker classification based on their intended use rather than the technology or outcomes (six subclasses are suggested: food compound intake biomarkers (FCIBs), food or food component intake biomarkers (FIBs), dietary pattern biomarkers (DPBs), food compound status biomarkers (FCSBs), effect biomarkers, physiological or health state biomarkers). The application of this scheme is described in detail for the dietary and health area and is compared with previous biomarker classification for this field of research.

  19. Model Validation and Site Characterization for Early Deployment MHK Sites and Establishment of Wave Classification Scheme

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kilcher, Levi F

    Model Validation and Site Characterization for Early Deployment Marine and Hydrokinetic Energy Sites and Establishment of Wave Classification Scheme: presentation from the Water Power Technologies Office Peer Review, FY14-FY16.

  20. Mathematical model of blasting schemes management in mining operations in presence of random disturbances

    NASA Astrophysics Data System (ADS)

    Kazakova, E. I.; Medvedev, A. N.; Kolomytseva, A. O.; Demina, M. I.

    2017-11-01

    The paper presents a mathematical model of blasting schemes management in the presence of random disturbances. Based on the lemmas and theorems proved, a stable control functional is formulated. A universal classification of blasting schemes is developed. The main classification attributes suggested are: the orientation, in plan, of the charging-well rows relative to the rock block; the presence of cuts in the blasting schemes; the separation of the well series into elements; and the blasting sequence. The periodic regularity of the transition from one short-delay blasting scheme to another is proved.

  1. Less-Complex Method of Classifying MPSK

    NASA Technical Reports Server (NTRS)

    Hamkins, Jon

    2006-01-01

    An alternative to an optimal method of automated classification of signals modulated with M-ary phase-shift-keying (M-ary PSK or MPSK) has been derived. The alternative method is approximate, but it offers nearly optimal performance and entails much less complexity, which translates to much less computation time. Modulation classification is becoming increasingly important in radio-communication systems that utilize multiple data modulation schemes and include software-defined or software-controlled receivers. Such a receiver may "know" little a priori about an incoming signal but may be required to correctly classify its data rate, modulation type, and forward error-correction code before properly configuring itself to acquire and track the symbol timing, carrier frequency, and phase, and ultimately produce decoded bits. Modulation classification has long been an important component of military interception of initially unknown radio signals transmitted by adversaries. Modulation classification may also be useful for enabling cellular telephones to automatically recognize different signal types and configure themselves accordingly. The concept of modulation classification as outlined in the preceding paragraph is quite general. However, at the present early stage of development, and for the purpose of describing the present alternative method, the term "modulation classification" or simply "classification" signifies, more specifically, a distinction between M-ary and M'-ary PSK, where M and M' represent two different integer multiples of 2. Both the prior optimal method and the present alternative method require the acquisition of magnitude and phase values of a number (N) of consecutive baseband samples of the incoming signal + noise. The prior optimal method is based on a maximum-likelihood (ML) classification rule that requires a calculation of likelihood functions for the M and M' hypotheses: each likelihood function is an integral, over a full cycle of carrier phase, of a complicated sum of functions of the baseband sample values, the carrier phase, the carrier-signal and noise magnitudes, and M or M'. Then the likelihood ratio, defined as the ratio between the likelihood functions, is computed, leading to the choice of whichever hypothesis, M or M', is more likely. In the alternative method, the integral in each likelihood function is approximated by a sum over values of the integrand sampled at a number, l, of equally spaced values of carrier phase. Used in this way, l is a parameter that can be adjusted to trade computational complexity against the probability of misclassification. In the limit as l approaches infinity, one obtains the integral form of the likelihood function and thus recovers the ML classification. The present approximate method has been tested in comparison with the ML method by means of computational simulations. The results of the simulations have shown that the performance (as quantified by probability of misclassification) of the approximate method is nearly indistinguishable from that of the ML method (see figure).
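    A minimal sketch of the approximation idea, replacing the integral over carrier phase in each hypothesis's likelihood with a sum over l equally spaced phase values and then choosing the more likely hypothesis, is given below. The signal model (unit-amplitude MPSK in complex Gaussian noise), the grid size and all parameter values are assumptions for illustration, not the reference implementation.

        import numpy as np
        from scipy.special import logsumexp

        def log_likelihood(r, M, sigma2, A=1.0, n_phase=16):
            """Approximate log-likelihood of the hypothesis 'MPSK with M symbols';
            the integral over the unknown carrier phase is replaced by a sum over
            n_phase equally spaced phase values (the parameter l in the text)."""
            phases = 2 * np.pi * np.arange(n_phase) / n_phase
            symbols = A * np.exp(1j * 2 * np.pi * np.arange(M) / M)
            ll_per_phase = []
            for phi in phases:
                # per-sample log p(r_k | phi, M), averaging over the M equally likely symbols
                d2 = np.abs(r[:, None] - symbols[None, :] * np.exp(1j * phi)) ** 2
                ll_per_phase.append(np.sum(logsumexp(-d2 / sigma2, axis=1) - np.log(M)))
            return logsumexp(ll_per_phase) - np.log(n_phase)   # average over the phase grid

        # Toy experiment: N noisy QPSK samples; decide between QPSK (M=4) and 8-PSK (M=8).
        rng = np.random.default_rng(6)
        N, sigma2 = 200, 0.2
        tx = np.exp(1j * (2 * np.pi * rng.integers(0, 4, N) / 4 + 0.3))  # unknown carrier phase
        r = tx + np.sqrt(sigma2 / 2) * (rng.normal(size=N) + 1j * rng.normal(size=N))
        M_hat = 4 if log_likelihood(r, 4, sigma2) > log_likelihood(r, 8, sigma2) else 8
        print("classified as M =", M_hat)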

  2. DISSECT: a new mnemonic-based approach to the categorization of aortic dissection.

    PubMed

    Dake, M D; Thompson, M; van Sambeek, M; Vermassen, F; Morales, J P

    2013-08-01

    Classification systems for aortic dissection provide important guides to clinical decision-making, but the relevance of traditional categorization schemes is being questioned in an era when endovascular techniques are assuming a growing role in the management of this frequently complex and catastrophic entity. In recognition of the expanding range of interventional therapies now used as alternatives to conventional treatment approaches, the Working Group on Aortic Diseases of the DEFINE Project developed a categorization system that features the specific anatomic and clinical manifestations of the disease process that are most relevant to contemporary decision-making. The DISSECT classification system is a mnemonic-based approach to the evaluation of aortic dissection. It guides clinicians through an assessment of six critical characteristics that facilitate optimal communication of the most salient details that currently influence the selection of a therapeutic option, including those findings that are key when considering an endovascular procedure, but are not taken into account by the DeBakey or Stanford categorization schemes. The six features of aortic dissection include: duration of disease; intimal tear location; size of the dissected aorta; segmental extent of aortic involvement; clinical complications of the dissection, and thrombus within the aortic false lumen. In current clinical practice, endovascular therapy is increasingly considered as an alternative to medical management or open surgical repair in select cases of type B aortic dissection. Currently, endovascular aortic repair is not used for patients with type A aortic dissection, but catheter-based techniques directed at peripheral branch vessel ischemia that may complicate type A dissection are considered valuable adjunctive interventions, when indicated. The use of a new system for categorization of aortic dissection, DISSECT, addresses the shortcomings of well-known established schemes devised more than 40 years ago, before the introduction of endovascular techniques. It will serve as a guide to support a critical analysis of contemporary therapeutic options and inform management decisions based on specific features of the disease process. Copyright © 2013 European Society for Vascular Surgery. All rights reserved.

  3. Classification of childhood epilepsies in a tertiary pediatric neurology clinic using a customized classification scheme from the international league against epilepsy 2010 report.

    PubMed

    Khoo, Teik-Beng

    2013-01-01

    In its 2010 report, the International League Against Epilepsy Commission on Classification and Terminology made a number of changes to the organization, terminology, and classification of seizures and epilepsies. This study aims to test the usefulness of this revised classification scheme on children with epilepsies aged between 0 and 18 years. Of 527 patients, 75.1% had only one type of seizure, and the commonest was focal seizure (61.9%). A specific electroclinical syndrome diagnosis could be made in 27.5%. Only 2.1% had a distinctive constellation. In this cohort, 46.9% had an underlying structural, metabolic, or genetic etiology. Among the important causes were pre-/perinatal insults, malformation of cortical development, intracranial infections, and neurocutaneous syndromes. However, 23.5% of the patients in our cohort were classified as having "epilepsies of unknown cause." The revised classification scheme is generally useful for pediatric patients. To make it more inclusive and clinically meaningful, some local customizations are required.

  4. Toward an endovascular internal carotid artery classification system.

    PubMed

    Shapiro, M; Becske, T; Riina, H A; Raz, E; Zumofen, D; Jafar, J J; Huang, P P; Nelson, P K

    2014-02-01

    Does the world need another ICA classification scheme? We believe so. The purpose of the proposed angiography-driven classification is to optimize description of the carotid artery from the endovascular perspective. A review of existing, predominantly surgically-driven classifications is performed, and a new scheme, based on the study of the NYU aneurysm angiographic and cross-sectional databases, is proposed. Seven segments - cervical, petrous, cavernous, paraophthalmic, posterior communicating, choroidal, and terminus - are named. This nomenclature recognizes intrinsic uncertainty in the precise angiographic and cross-sectional localization of aneurysms adjacent to the dural rings, regarding all lesions distal to the cavernous segment as potentially intradural. Rather than subdividing various transitional, ophthalmic, and hypophyseal aneurysm subtypes, as necessitated by their varied surgical approaches and risks, the proposed classification emphasizes their common endovascular treatment features, while recognizing that many complex, trans-segmental, and fusiform aneurysms not readily classifiable into presently available, saccular-aneurysm-driven schemes are being increasingly addressed by endovascular means. We believe this classification may find utility in standardizing nomenclature for outcome tracking, treatment trials and physician communication.

  5. Underwater target classification using wavelet packets and neural networks.

    PubMed

    Azimi-Sadjadi, M R; Yao, D; Huang, Q; Dobeck, G J

    2000-01-01

    In this paper, a new subband-based classification scheme is developed for classifying underwater mines and mine-like targets from the acoustic backscattered signals. The system consists of a feature extractor using wavelet packets in conjunction with linear predictive coding (LPC), a feature selection scheme, and a backpropagation neural-network classifier. The data set used for this study consists of the backscattered signals from six different objects: two mine-like targets and four nontargets for several aspect angles. Simulation results on ten different noisy realizations and for signal-to-noise ratio (SNR) of 12 dB are presented. The receiver operating characteristic (ROC) curve of the classifier generated based on these results demonstrated excellent classification performance of the system. The generalization ability of the trained network was demonstrated by computing the error and classification rate statistics on a large data set. A multiaspect fusion scheme was also adopted in order to further improve the classification performance.

  6. Do thoraco-lumbar spinal injuries classification systems exhibit lower inter- and intra-observer agreement than other fractures classifications?: A comparison using fractures of the trochanteric area of the proximal femur as contrast model.

    PubMed

    Urrutia, Julio; Zamora, Tomas; Klaber, Ianiv; Carmona, Maximiliano; Palma, Joaquin; Campos, Mauricio; Yurac, Ratko

    2016-04-01

    It has been postulated that the complex patterns of spinal injuries have prevented adequate agreement using thoraco-lumbar spinal injuries (TLSI) classifications; however, limb fracture classifications have also shown variable agreement. This study compared agreement using two TLSI classifications with agreement using two classifications of fractures of the trochanteric area of the proximal femur (FTAPF). Six evaluators classified the radiographs and computed tomography scans of 70 patients with acute TLSI using the Denis and the new AOSpine thoraco-lumbar injury classifications. Additionally, six evaluators classified the radiographs of 70 patients with FTAPF using the Tronzo and the AO schemes. Six weeks later, all cases were presented in a random sequence for repeat assessment. The Kappa coefficient (κ) was used to determine agreement. Inter-observer agreement: for TLSI, using the AOSpine classification, the mean κ was 0.62 (0.57-0.66) considering fracture types, and 0.55 (0.52-0.57) considering sub-types; using the Denis classification, κ was 0.62 (0.59-0.65). For FTAPF, with the AO scheme, the mean κ was 0.58 (0.54-0.63) considering fracture types and 0.31 (0.28-0.33) considering sub-types; for the Tronzo classification, κ was 0.54 (0.50-0.57). Intra-observer agreement: for TLSI, using the AOSpine scheme, the mean κ was 0.77 (0.72-0.83) considering fracture types, and 0.71 (0.67-0.76) considering sub-types; for the Denis classification, κ was 0.76 (0.71-0.81). For FTAPF, with the AO scheme, the mean κ was 0.75 (0.69-0.81) considering fracture types and 0.45 (0.39-0.51) considering sub-types; for the Tronzo classification, κ was 0.64 (0.58-0.70). Using the main types of the AO classifications, inter- and intra-observer agreement for TLSI was comparable to agreement for FTAPF; when sub-types were included, inter- and intra-observer agreement for TLSI was significantly better than for FTAPF. Inter- and intra-observer agreement using the Denis classification was also significantly better than agreement using the Tronzo scheme. Copyright © 2015 Elsevier Ltd. All rights reserved.
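    The agreement statistic used throughout these comparisons, Cohen's kappa, corrects the observed proportion of agreement for the agreement expected by chance. A minimal illustration with made-up ratings is shown below; scikit-learn's implementation covers the two-rater case, and the study's actual rating data are not reproduced.

        from sklearn.metrics import cohen_kappa_score

        # Hypothetical fracture-type codes assigned by two observers to ten cases.
        observer_1 = ["A1", "A2", "B1", "A1", "C1", "B1", "A2", "A1", "B1", "C1"]
        observer_2 = ["A1", "A2", "B1", "A2", "C1", "B1", "A2", "A1", "B2", "C1"]

        kappa = cohen_kappa_score(observer_1, observer_2)
        print(round(kappa, 2))   # 1.0 = perfect agreement, 0 = chance-level agreement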

  7. A comparative agreement evaluation of two subaxial cervical spine injury classification systems: the AOSpine and the Allen and Ferguson schemes.

    PubMed

    Urrutia, Julio; Zamora, Tomas; Campos, Mauricio; Yurac, Ratko; Palma, Joaquin; Mobarec, Sebastian; Prada, Carlos

    2016-07-01

    We performed an agreement study using two subaxial cervical spine classification systems: the AOSpine and the Allen and Ferguson (A&F) classifications. We sought to determine which scheme allows better agreement by different evaluators and by the same evaluator on different occasions. Complete imaging studies of 65 patients with subaxial cervical spine injuries were classified by six evaluators (three spine sub-specialists and three senior orthopaedic surgery residents) using the AOSpine subaxial cervical spine classification system and the A&F scheme. The cases were displayed in a random sequence after a 6-week interval for repeat evaluation. The Kappa coefficient (κ) was used to determine inter- and intra-observer agreement. Inter-observer: considering the main AO injury types, the agreement was substantial for the AOSpine classification [κ = 0.61 (0.57-0.64)]; using AO sub-types, the agreement was moderate [κ = 0.57 (0.54-0.60)]. For the A&F classification, the agreement [κ = 0.46 (0.42-0.49)] was significantly lower than using the AOSpine scheme. Intra-observer: the agreement was substantial considering injury types [κ = 0.68 (0.62-0.74)] and considering sub-types [κ = 0.62 (0.57-0.66)]. Using the A&F classification, the agreement was also substantial [κ = 0.66 (0.61-0.71)]. No significant differences were observed between spine surgeons and orthopaedic residents in the overall inter- and intra-observer agreement, or in the inter- and intra-observer agreement of specific type of injuries. The AOSpine classification (using the four main injury types or at the sub-types level) allows a significantly better agreement than the A&F classification. The A&F scheme does not allow reliable communication between medical professionals.

  8. Developing a contributing factor classification scheme for Rasmussen's AcciMap: Reliability and validity evaluation.

    PubMed

    Goode, N; Salmon, P M; Taylor, N Z; Lenné, M G; Finch, C F

    2017-10-01

    One factor potentially limiting the uptake of Rasmussen's (1997) Accimap method by practitioners is the lack of a contributing factor classification scheme to guide accident analyses. This article evaluates the intra- and inter-rater reliability and criterion-referenced validity of a classification scheme developed to support the use of Accimap by led outdoor activity (LOA) practitioners. The classification scheme has two levels: the system level describes the actors, artefacts and activity context in terms of 14 codes; the descriptor level breaks the system level codes down into 107 specific contributing factors. The study involved 11 LOA practitioners using the scheme on two separate occasions to code a pre-determined list of contributing factors identified from four incident reports. Criterion-referenced validity was assessed by comparing the codes selected by LOA practitioners to those selected by the method creators. Mean intra-rater reliability scores at the system (M = 83.6%) and descriptor (M = 74%) levels were acceptable. Mean inter-rater reliability scores were not consistently acceptable for both coding attempts at the system level (M_T1 = 68.8%; M_T2 = 73.9%), and were poor at the descriptor level (M_T1 = 58.5%; M_T2 = 64.1%). Mean criterion-referenced validity scores at the system level were acceptable (M_T1 = 73.9%; M_T2 = 75.3%). However, they were not consistently acceptable at the descriptor level (M_T1 = 67.6%; M_T2 = 70.8%). Overall, the results indicate that the classification scheme does not currently satisfy reliability and validity requirements, and that further work is required. The implications for the design and development of contributing factors classification schemes are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. New Course Design: Classification Schemes and Information Architecture.

    ERIC Educational Resources Information Center

    Weinberg, Bella Hass

    2002-01-01

    Describes a course developed at St. John's University (New York) in the Division of Library and Information Science that relates traditional classification schemes to information architecture and Web sites. Highlights include functional aspects of information architecture, that is, the way content is structured; assignments; student reactions; and…

  10. An alternative view of protein fold space.

    PubMed

    Shindyalov, I N; Bourne, P E

    2000-02-15

    Comparing and subsequently classifying protein structures has received significant attention concurrent with the increase in the number of experimentally derived 3-dimensional structures. Classification schemes have focused on biological function found within protein domains and on structure classification based on topology. Here an alternative view is presented that groups substructures. Substructures are long (50-150 residue) highly repetitive near-contiguous pieces of polypeptide chain that occur frequently in a set of proteins from the PDB defined as structurally non-redundant over the complete polypeptide chain. The substructure classification is based on a previously reported Combinatorial Extension (CE) algorithm that provides a significantly different set of structure alignments than those previously described, having, for example, only a 40% overlap with FSSP. Qualitatively the algorithm provides longer contiguous aligned segments at the price of a slightly higher root-mean-square deviation (rmsd). Clustering these alignments gives a discrete and highly repetitive set of substructures not detectable by sequence similarity alone. In some cases different substructures represent all or different parts of well known folds indicative of the Russian doll effect--the continuity of protein fold space. In other cases they fall into different structure and functional classifications. It is too early to determine whether these newly classified substructures represent new insights into the evolution of a structural framework important to many proteins. What is apparent from ongoing work is that these substructures have the potential to be useful probes in finding remote sequence homology and in structure prediction studies. The characteristics of the complete all-by-all comparison of the polypeptide chains present in the PDB and details of the filtering procedure by pair-wise structure alignment that led to the emergent substructure gallery are discussed. Substructure classification, alignments, and tools to analyze them are available at http://cl.sdsc.edu/ce.html.

  11. Classification of baseline toxicants for QSAR predictions to replace fish acute toxicity studies.

    PubMed

    Nendza, Monika; Müller, Martin; Wenzel, Andrea

    2017-03-22

    Fish acute toxicity studies are required for environmental hazard and risk assessment of chemicals by national and international legislations such as REACH, the regulations of plant protection products and biocidal products, or the GHS (globally harmonised system) for classification and labelling of chemicals. Alternative methods like QSARs (quantitative structure-activity relationships) can replace many ecotoxicity tests. However, complete substitution of in vivo animal tests by in silico methods may not be realistic. For the so-called baseline toxicants, it is possible to predict the fish acute toxicity with sufficient accuracy from log Kow and, hence, valid QSARs can replace in vivo testing. In contrast, excess toxicants and chemicals not reliably classified as baseline toxicants require further in silico, in vitro or in vivo assessments. Thus, the critical task is to discriminate between baseline and excess toxicants. For fish acute toxicity, we derived a scheme based on structural alerts and physicochemical property thresholds to classify chemicals as either baseline toxicants (=predictable by QSARs) or as potential excess toxicants (=not predictable by baseline QSARs). The step-wise approach identifies baseline toxicants (true negatives) in a precautionary way to avoid false negative predictions. Therefore, a certain fraction of false positives can be tolerated, i.e. baseline toxicants without specific effects that may be tested instead of predicted. Application of the classification scheme to a new heterogeneous dataset for diverse fish species results in 40% baseline toxicants, 24% excess toxicants and 36% compounds not classified. Thus, we can conclude that replacing about half of the fish acute toxicity tests by QSAR predictions is realistic to be achieved in the short-term. The long-term goals are classification criteria also for further groups of toxicants and to replace as many in vivo fish acute toxicity tests as possible with valid QSAR predictions.
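    To make the two-step logic concrete, here is a minimal sketch of a baseline/excess discrimination step followed by a baseline-narcosis QSAR prediction. The alert flag, the log Kow applicability bounds and the regression coefficients are placeholders invented for illustration, not the published scheme.

    ```python
    # Illustrative sketch only: discriminate baseline from potential excess
    # toxicants, then predict acute toxicity for the baseline cases.
    def classify_chemical(log_kow: float, has_structural_alert: bool) -> str:
        if has_structural_alert:
            return "potential excess toxicant"   # needs in vitro / in vivo follow-up
        if not (-1.0 <= log_kow <= 6.0):         # hypothetical applicability domain
            return "not classified"
        return "baseline toxicant"               # predictable by a baseline QSAR

    def baseline_lc50_mmol_per_l(log_kow: float) -> float:
        # Generic narcosis-type relationship log(1/LC50) = a*logKow + b;
        # the coefficients below are illustrative only.
        a, b = 0.85, -1.9
        return 10 ** -(a * log_kow + b)

    for log_kow, alert in [(2.5, False), (4.1, True), (7.5, False)]:
        label = classify_chemical(log_kow, alert)
        estimate = baseline_lc50_mmol_per_l(log_kow) if label == "baseline toxicant" else None
        print(log_kow, label, estimate)
    ```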

  12. Multivariate decoding of brain images using ordinal regression.

    PubMed

    Doyle, O M; Ashburner, J; Zelaya, F O; Williams, S C R; Mehta, M A; Marquand, A F

    2013-11-01

    Neuroimaging data are increasingly being used to predict potential outcomes or groupings, such as clinical severity, drug dose response, and transitional illness states. In these examples, the variable (target) we want to predict is ordinal in nature. Conventional classification schemes assume that the targets are nominal and hence ignore their ranked nature, whereas parametric and/or non-parametric regression models enforce a metric notion of distance between classes. Here, we propose a novel, alternative multivariate approach that overcomes these limitations - whole brain probabilistic ordinal regression using a Gaussian process framework. We applied this technique to two data sets of pharmacological neuroimaging data from healthy volunteers. The first study was designed to investigate the effect of ketamine on brain activity and its subsequent modulation with two compounds - lamotrigine and risperidone. The second study investigates the effect of scopolamine on cerebral blood flow and its modulation using donepezil. We compared ordinal regression to multi-class classification schemes and metric regression. Considering the modulation of ketamine with lamotrigine, we found that ordinal regression significantly outperformed multi-class classification and metric regression in terms of accuracy and mean absolute error. However, for risperidone ordinal regression significantly outperformed metric regression but performed similarly to multi-class classification both in terms of accuracy and mean absolute error. For the scopolamine data set, ordinal regression was found to outperform both multi-class and metric regression techniques considering the regional cerebral blood flow in the anterior cingulate cortex. Ordinal regression was thus the only method that performed well in all cases. Our results indicate the potential of an ordinal regression approach for neuroimaging data while providing a fully probabilistic framework with elegant approaches for model selection. Copyright © 2013. Published by Elsevier Inc.
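    The comparison above rests on scoring nominal, ordinal and metric treatments of the same ranked target with accuracy and mean absolute error. The sketch below illustrates that evaluation on synthetic data; a plain multi-class classifier and a rounded linear regressor stand in for the paper's Gaussian-process ordinal model, which is not reproduced here.

    ```python
    # Hedged sketch: evaluating nominal vs metric treatment of an ordinal target.
    import numpy as np
    from sklearn.linear_model import LogisticRegression, Ridge
    from sklearn.metrics import accuracy_score, mean_absolute_error
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 20))
    # Synthetic ordinal target with four ranked levels (0..3).
    y = (np.clip((X[:, 0] + rng.normal(scale=0.5, size=300)).round(), -1, 2) + 1).astype(int)

    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

    clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)   # nominal treatment
    reg = Ridge().fit(Xtr, ytr)                             # metric treatment
    y_clf = clf.predict(Xte)
    y_reg = np.clip(np.round(reg.predict(Xte)), y.min(), y.max()).astype(int)

    for name, pred in [("multi-class", y_clf), ("metric + rounding", y_reg)]:
        print(name,
              "acc = %.2f" % accuracy_score(yte, pred),
              "MAE = %.2f" % mean_absolute_error(yte, pred))
    ```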

  13. Lexicon-enhanced sentiment analysis framework using rule-based classification scheme.

    PubMed

    Asghar, Muhammad Zubair; Khan, Aurangzeb; Ahmad, Shakeel; Qasim, Maria; Khan, Imran Ali

    2017-01-01

    With the rapid increase in social networks and blogs, the social media services are increasingly being used by online communities to share their views and experiences about a particular product, policy and event. Due to the economic importance of these reviews, there is a growing trend of writing user reviews to promote a product. Nowadays, users prefer online blogs and review sites to purchase products. Therefore, user reviews are considered as an important source of information in Sentiment Analysis (SA) applications for decision making. In this work, we exploit the wealth of user reviews, available through the online forums, to analyze the semantic orientation of words by categorizing them into positive and negative classes to identify and classify emoticons, modifiers, general-purpose and domain-specific words expressed in the public's feedback about the products. However, the unsupervised learning approach employed in previous studies is becoming less efficient due to data sparseness, low accuracy due to non-consideration of emoticons, modifiers, and presence of domain-specific words, as they may result in inaccurate classification of users' reviews. Lexicon-enhanced sentiment analysis based on a rule-based classification scheme is an alternative approach for improving sentiment classification of users' reviews in online communities. In addition to the sentiment terms used in general-purpose sentiment analysis, we integrate emoticons, modifiers and domain-specific terms to analyze the reviews posted in online communities. To test the effectiveness of the proposed method, we considered users' reviews in three domains. The results obtained from different experiments demonstrate that the proposed method overcomes the limitations of previous methods and the performance of the sentiment analysis is improved after considering emoticons, modifiers, negations, and domain-specific terms when compared to baseline methods.
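    A toy rule-based lexicon scorer in the spirit described above is sketched below; the word lists, weights and rules are invented for illustration and are not the authors' lexicon or rule set.

    ```python
    # Toy lexicon-plus-rules sentiment scorer handling emoticons, modifiers
    # and negation; all entries and weights are hypothetical.
    LEXICON = {"good": 1, "great": 2, "bad": -1, "terrible": -2}
    EMOTICONS = {":)": 1, ":(": -1}
    MODIFIERS = {"very": 1.5, "slightly": 0.5}
    NEGATIONS = {"not", "never"}

    def score(review: str) -> str:
        total, weight, negate = 0.0, 1.0, False
        for tok in review.lower().split():
            if tok in NEGATIONS:
                negate = True
                continue
            if tok in MODIFIERS:
                weight = MODIFIERS[tok]
                continue
            value = LEXICON.get(tok, EMOTICONS.get(tok, 0))
            if negate:
                value = -value
            total += weight * value
            weight, negate = 1.0, False
        return "positive" if total > 0 else "negative" if total < 0 else "neutral"

    print(score("battery life is not good :("))   # -> negative
    print(score("very great phone :)"))           # -> positive
    ```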

  14. Enhancing Vocabulary Acquisition Through Reading: A Hierarchy of Text-Related Exercise Types.

    ERIC Educational Resources Information Center

    Paribakht, T. Sima; Wesche, Marjorie

    1996-01-01

    Presents a classification scheme for reading-related exercises advocated in English-as-a-Foreign-Language textbooks. The scheme proposes a hierarchy of the degree and type of mental processing required by various vocabulary exercises. The categories of classification are selective attention, recognition, manipulation, interpretation and…

  15. Comparing ecoregional classifications for natural areas management in the Klamath Region, USA

    USGS Publications Warehouse

    Sarr, Daniel A.; Duff, Andrew; Dinger, Eric C.; Shafer, Sarah L.; Wing, Michael; Seavy, Nathaniel E.; Alexander, John D.

    2015-01-01

    We compared three existing ecoregional classification schemes (Bailey, Omernik, and World Wildlife Fund) with two derived schemes (Omernik Revised and Climate Zones) to explore their effectiveness in explaining species distributions and to better understand natural resource geography in the Klamath Region, USA. We analyzed presence/absence data derived from digital distribution maps for trees, amphibians, large mammals, small mammals, migrant birds, and resident birds using three statistical analyses of classification accuracy (Analysis of Similarity, Canonical Analysis of Principal Coordinates, and Classification Strength). The classifications were roughly comparable in classification accuracy, with Omernik Revised showing the best overall performance. Trees showed the strongest fidelity to the classifications, and large mammals showed the weakest fidelity. We discuss the implications for regional biogeography and describe how intermediate resolution ecoregional classifications may be appropriate for use as natural areas management domains.

  16. Monitoring nanotechnology using patent classifications: an overview and comparison of nanotechnology classification schemes

    NASA Astrophysics Data System (ADS)

    Jürgens, Björn; Herrero-Solana, Victor

    2017-04-01

    Patents are an essential information source used to monitor, track, and analyze nanotechnology. When it comes to searching nanotechnology-related patents, a keyword search is often incomplete and struggles to cover such an interdisciplinary discipline. Patent classification schemes can reveal far better results since they are assigned by experts who classify the patent documents according to their technology. In this paper, we present the most important classifications to search nanotechnology patents and analyze how nanotechnology is covered in the main patent classification systems used in search systems nowadays: the International Patent Classification (IPC), the United States Patent Classification (USPC), and the Cooperative Patent Classification (CPC). We conclude that nanotechnology has significantly better patent coverage in the CPC, since considerably more nanotechnology documents were retrieved than by using the other classifications, and we thus recommend its use for all professionals involved in nanotechnology patent searches.

  17. The Nutraceutical Bioavailability Classification Scheme: Classifying Nutraceuticals According to Factors Limiting their Oral Bioavailability.

    PubMed

    McClements, David Julian; Li, Fang; Xiao, Hang

    2015-01-01

    The oral bioavailability of a health-promoting dietary component (nutraceutical) may be limited by various physicochemical and physiological phenomena: liberation from food matrices, solubility in gastrointestinal fluids, interaction with gastrointestinal components, chemical degradation or metabolism, and epithelium cell permeability. Nutraceutical bioavailability can therefore be improved by designing food matrices that control their bioaccessibility (B*), absorption (A*), and transformation (T*) within the gastrointestinal tract (GIT). This article reviews the major factors influencing the gastrointestinal fate of nutraceuticals, and then uses this information to develop a new scheme to classify the major factors limiting nutraceutical bioavailability: the nutraceutical bioavailability classification scheme (NuBACS). This new scheme is analogous to the biopharmaceutical classification scheme (BCS) used by the pharmaceutical industry to classify drug bioavailability, but it contains additional factors important for understanding nutraceutical bioavailability in foods. The article also highlights potential strategies for increasing the oral bioavailability of nutraceuticals based on their NuBACS designation (B*A*T*).

  18. Application of a 5-tiered scheme for standardized classification of 2,360 unique mismatch repair gene variants in the InSiGHT locus-specific database.

    PubMed

    Thompson, Bryony A; Spurdle, Amanda B; Plazzer, John-Paul; Greenblatt, Marc S; Akagi, Kiwamu; Al-Mulla, Fahd; Bapat, Bharati; Bernstein, Inge; Capellá, Gabriel; den Dunnen, Johan T; du Sart, Desiree; Fabre, Aurelie; Farrell, Michael P; Farrington, Susan M; Frayling, Ian M; Frebourg, Thierry; Goldgar, David E; Heinen, Christopher D; Holinski-Feder, Elke; Kohonen-Corish, Maija; Robinson, Kristina Lagerstedt; Leung, Suet Yi; Martins, Alexandra; Moller, Pal; Morak, Monika; Nystrom, Minna; Peltomaki, Paivi; Pineda, Marta; Qi, Ming; Ramesar, Rajkumar; Rasmussen, Lene Juel; Royer-Pokora, Brigitte; Scott, Rodney J; Sijmons, Rolf; Tavtigian, Sean V; Tops, Carli M; Weber, Thomas; Wijnen, Juul; Woods, Michael O; Macrae, Finlay; Genuardi, Maurizio

    2014-02-01

    The clinical classification of hereditary sequence variants identified in disease-related genes directly affects clinical management of patients and their relatives. The International Society for Gastrointestinal Hereditary Tumours (InSiGHT) undertook a collaborative effort to develop, test and apply a standardized classification scheme to constitutional variants in the Lynch syndrome-associated genes MLH1, MSH2, MSH6 and PMS2. Unpublished data submission was encouraged to assist in variant classification and was recognized through microattribution. The scheme was refined by multidisciplinary expert committee review of the clinical and functional data available for variants, applied to 2,360 sequence alterations, and disseminated online. Assessment using validated criteria altered classifications for 66% of 12,006 database entries. Clinical recommendations based on transparent evaluation are now possible for 1,370 variants that were not obviously protein truncating from nomenclature. This large-scale endeavor will facilitate the consistent management of families suspected to have Lynch syndrome and demonstrates the value of multidisciplinary collaboration in the curation and classification of variants in public locus-specific databases.

  19. Cheese Classification, Characterization, and Categorization: A Global Perspective.

    PubMed

    Almena-Aliste, Montserrat; Mietton, Bernard

    2014-02-01

    Cheese is one of the most fascinating, complex, and diverse foods enjoyed today. Three elements constitute the cheese ecosystem: ripening agents, consisting of enzymes and microorganisms; the composition of the fresh cheese; and the environmental conditions during aging. These factors determine and define not only the sensory quality of the final cheese product but also the vast diversity of cheeses produced worldwide. How we define and categorize cheese is a complicated matter. There are various approaches to cheese classification, and a global approach for classification and characterization is needed. We review current cheese classification schemes and the limitations inherent in each of the schemes described. While some classification schemes are based on microbiological criteria, others rely on descriptions of the technologies used for cheese production. The goal of this review is to present an overview of comprehensive and practical integrative classification models in order to better describe cheese diversity and the fundamental differences within cheeses, as well as to connect fundamental technological, microbiological, chemical, and sensory characteristics to contribute to an overall characterization of the main families of cheese, including the expanding world of American artisanal cheeses.

  20. New KF-PP-SVM classification method for EEG in brain-computer interfaces.

    PubMed

    Yang, Banghua; Han, Zhijun; Zan, Peng; Wang, Qian

    2014-01-01

    Classification methods are a crucial direction in the current study of brain-computer interfaces (BCIs). To improve the classification accuracy for electroencephalogram (EEG) signals, a novel KF-PP-SVM (kernel Fisher, posterior probability, and support vector machine) classification method is developed. Its detailed process entails the use of common spatial patterns to obtain features, based on which the within-class scatter is calculated. Then the scatter is added into the kernel function of a radial basis function to construct a new kernel function. This new kernel is integrated into the SVM to obtain a new classification model. Finally, the output of the SVM is calculated based on posterior probability and the final recognition result is obtained. To evaluate the effectiveness of the proposed KF-PP-SVM method, EEG data collected in the laboratory are processed with four different classification schemes (KF-PP-SVM, KF-SVM, PP-SVM, and SVM). The results showed that the overall average improvements arising from the use of the KF-PP-SVM scheme as opposed to the KF-SVM, PP-SVM and SVM schemes are 2.49%, 5.83% and 6.49%, respectively.
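    A hedged sketch of the general idea follows: fold a within-class scatter term into an RBF-style kernel and read out posterior probabilities from the SVM. Feature extraction with common spatial patterns is omitted, the data are synthetic, and the exact kernel form is an assumption rather than the published KF-PP-SVM formulation.

    ```python
    # Sketch: RBF-like kernel whose width is tied to the within-class scatter,
    # used inside an SVM with probabilistic outputs.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 1, (50, 8)), rng.normal(1, 1, (50, 8))])
    y = np.array([0] * 50 + [1] * 50)

    # Mean within-class scatter: average squared distance to each class mean.
    scatter = np.mean([np.mean(np.sum((X[y == c] - X[y == c].mean(0)) ** 2, axis=1))
                       for c in np.unique(y)])

    def scatter_rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * scatter))

    model = SVC(kernel=scatter_rbf, probability=True).fit(X, y)
    print(model.predict_proba(X[:3]))   # class posteriors for the first few epochs
    ```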

  1. Application of a five-tiered scheme for standardized classification of 2,360 unique mismatch repair gene variants lodged on the InSiGHT locus-specific database

    PubMed Central

    Plazzer, John-Paul; Greenblatt, Marc S.; Akagi, Kiwamu; Al-Mulla, Fahd; Bapat, Bharati; Bernstein, Inge; Capellá, Gabriel; den Dunnen, Johan T.; du Sart, Desiree; Fabre, Aurelie; Farrell, Michael P.; Farrington, Susan M.; Frayling, Ian M.; Frebourg, Thierry; Goldgar, David E.; Heinen, Christopher D.; Holinski-Feder, Elke; Kohonen-Corish, Maija; Robinson, Kristina Lagerstedt; Leung, Suet Yi; Martins, Alexandra; Moller, Pal; Morak, Monika; Nystrom, Minna; Peltomaki, Paivi; Pineda, Marta; Qi, Ming; Ramesar, Rajkumar; Rasmussen, Lene Juel; Royer-Pokora, Brigitte; Scott, Rodney J.; Sijmons, Rolf; Tavtigian, Sean V.; Tops, Carli M.; Weber, Thomas; Wijnen, Juul; Woods, Michael O.; Macrae, Finlay; Genuardi, Maurizio

    2015-01-01

    Clinical classification of sequence variants identified in hereditary disease genes directly affects clinical management of patients and their relatives. The International Society for Gastrointestinal Hereditary Tumours (InSiGHT) undertook a collaborative effort to develop, test and apply a standardized classification scheme to constitutional variants in the Lynch Syndrome genes MLH1, MSH2, MSH6 and PMS2. Unpublished data submission was encouraged to assist variant classification, and recognized by microattribution. The scheme was refined by multidisciplinary expert committee review of clinical and functional data available for variants, applied to 2,360 sequence alterations, and disseminated online. Assessment using validated criteria altered classifications for 66% of 12,006 database entries. Clinical recommendations based on transparent evaluation are now possible for 1,370 variants not obviously protein-truncating from nomenclature. This large-scale endeavor will facilitate consistent management of suspected Lynch Syndrome families, and demonstrates the value of multidisciplinary collaboration for curation and classification of variants in public locus-specific databases. PMID:24362816

  2. Video event classification and image segmentation based on noncausal multidimensional hidden Markov models.

    PubMed

    Ma, Xiang; Schonfeld, Dan; Khokhar, Ashfaq A

    2009-06-01

    In this paper, we propose a novel solution to an arbitrary noncausal, multidimensional hidden Markov model (HMM) for image and video classification. First, we show that the noncausal model can be solved by splitting it into multiple causal HMMs and simultaneously solving each causal HMM using a fully synchronous distributed computing framework, therefore referred to as distributed HMMs. Next we present an approximate solution to the multiple causal HMMs that is based on an alternating updating scheme and assumes a realistic sequential computing framework. The parameters of the distributed causal HMMs are estimated by extending the classical 1-D training and classification algorithms to multiple dimensions. The proposed extension to arbitrary causal, multidimensional HMMs allows state transitions that are dependent on all causal neighbors. We, thus, extend three fundamental algorithms to multidimensional causal systems, i.e., 1) expectation-maximization (EM), 2) general forward-backward (GFB), and 3) Viterbi algorithms. In the simulations, we choose to limit ourselves to a noncausal 2-D model whose noncausality is along a single dimension, in order to significantly reduce the computational complexity. Simulation results demonstrate the superior performance, higher accuracy rate, and applicability of the proposed noncausal HMM framework to image and video classification.

  3. Sunspot Pattern Classification using PCA and Neural Networks (Poster)

    NASA Technical Reports Server (NTRS)

    Rajkumar, T.; Thompson, D. E.; Slater, G. L.

    2005-01-01

    The sunspot classification scheme presented in this paper is considered as a 2-D classification problem on archived datasets, and is not a real-time system. As a first step, it mirrors the Zuerich/McIntosh historical classification system and reproduces classification of sunspot patterns based on preprocessing and neural net training datasets. Ultimately, the project intends to move from more rudimentary schemes to developing spatial-temporal-spectral classes derived by correlating spatial and temporal variations in various wavelengths to the brightness fluctuation spectrum of the sun in those wavelengths. Once the approach is generalized, the focus will naturally move from a 2-D to an n-D classification, where "n" includes time and frequency. Here, the 2-D perspective refers both to the actual SOHO Michelson Doppler Imager (MDI) images that are processed and to the fact that a 2-D matrix is created from each image during preprocessing. The 2-D matrix is the result of running Principal Component Analysis (PCA) over the selected dataset images, and the resulting matrices and their eigenvalues are the objects that are stored in a database, classified, and compared. These matrices are indexed according to the standard McIntosh classification scheme.

  4. Classification of Dust Days by Satellite Remotely Sensed Aerosol Products

    NASA Technical Reports Server (NTRS)

    Sorek-Hammer, M.; Cohen, A.; Levy, Robert C.; Ziv, B.; Broday, D. M.

    2013-01-01

    Considerable progress in satellite remote sensing (SRS) of dust particles has been seen in the last decade. From an environmental health perspective, the detection of such events, once linked to ground particulate matter (PM) concentrations, can serve as a proxy for acute exposure to respirable particles of certain properties (i.e. size, composition, and toxicity). Because the region is considerably affected by atmospheric dust, previous studies in the Eastern Mediterranean, and in Israel in particular, have focused on mechanistic and synoptic prediction, classification, and characterization of dust events. In particular, a scheme for identifying dust days (DD) in Israel based on ground PM10 (particulate matter smaller than 10 μm) measurements has been suggested, which has been validated by compositional analysis. This scheme requires information regarding ground PM10 levels, which is naturally limited in places with sparse ground-monitoring coverage. In such cases, SRS may be an efficient and cost-effective alternative to ground measurements. This work demonstrates a new model for identifying DD and non-DD (NDD) over Israel based on an integration of aerosol products from different satellite platforms (Moderate Resolution Imaging Spectroradiometer (MODIS) and Ozone Monitoring Instrument (OMI)). Analysis of ground-monitoring data from 2007 to 2008 in southern Israel revealed 67 DD, with more than 88 percent occurring during winter and spring. A Classification and Regression Tree (CART) model that was applied to a database containing ground monitoring (the dependent variable) and SRS aerosol product (the independent variables) records revealed an optimal set of binary variables for the identification of DD. These variables are combinations of the following primary variables: the calendar month, ground-level relative humidity (RH), the aerosol optical depth (AOD) from MODIS, and the aerosol absorbing index (AAI) from OMI. A logistic regression that uses these variables, coded as binary variables, demonstrated 93.2 percent correct classifications of DD and NDD. Evaluation of the combined CART-logistic regression scheme in an adjacent geographical region (Gush Dan) demonstrated good results. Using SRS aerosol products for DD and NDD identification may enable us to distinguish between health, ecological, and environmental effects that result from exposure to these distinct particle populations.
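    The two-stage structure of such a scheme, a tree that suggests binary splits followed by a logistic regression on the resulting indicators, can be sketched as below. The variable names, thresholds and the synthetic label rule are placeholders, not the study's fitted model or data.

    ```python
    # Hedged sketch of a CART-then-logistic-regression classifier for a binary
    # event label, using synthetic stand-in predictors.
    import numpy as np
    import pandas as pd
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    df = pd.DataFrame({
        "month": rng.integers(1, 13, 500),
        "rh": rng.uniform(10, 90, 500),     # ground-level relative humidity (%)
        "aod": rng.uniform(0, 2, 500),      # aerosol optical depth (placeholder)
        "aai": rng.uniform(-1, 4, 500),     # aerosol absorbing index (placeholder)
    })
    dust_day = ((df["aod"] > 0.8) & (df["aai"] > 1.0)).astype(int)  # synthetic label

    # Stage 1: a shallow tree suggests candidate split thresholds.
    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(df, dust_day)
    thresholds = {df.columns[f]: t for f, t in
                  zip(tree.tree_.feature, tree.tree_.threshold) if f >= 0}

    # Stage 2: logistic regression on the binary indicators implied by those splits.
    binary = pd.DataFrame({f"{c}>{t:.2f}": (df[c] > t).astype(int)
                           for c, t in thresholds.items()})
    logit = LogisticRegression().fit(binary, dust_day)
    print("training accuracy:", logit.score(binary, dust_day))
    ```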

  5. Improved opponent color local binary patterns: an effective local image descriptor for color texture classification

    NASA Astrophysics Data System (ADS)

    Bianconi, Francesco; Bello-Cerezo, Raquel; Napoletano, Paolo

    2018-01-01

    Texture classification plays a major role in many computer vision applications. Local binary patterns (LBP) encoding schemes have largely been proven to be very effective for this task. Improved LBP (ILBP) are conceptually simple, easy to implement, and highly effective LBP variants based on a point-to-average thresholding scheme instead of a point-to-point one. We propose the use of this encoding scheme for extracting intra- and interchannel features for color texture classification. We experimentally evaluated the resulting improved opponent color LBP alone and in concatenation with the ILBP of the local color contrast map on a set of image classification tasks over 9 datasets of generic color textures and 11 datasets of biomedical textures. The proposed approach outperformed other grayscale and color LBP variants in nearly all the datasets considered and proved competitive even against image features from last generation convolutional neural networks, particularly for the classification of biomedical images.
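    The point-to-average thresholding that distinguishes ILBP from classical LBP can be sketched in a few lines for a single channel; the opponent-colour, inter-channel part of the descriptor is omitted, and this naive loop is for illustration rather than an efficient implementation.

    ```python
    # Minimal single-channel sketch of the point-to-average idea: every pixel in
    # a 3x3 patch (centre included) is thresholded against the patch mean.
    import numpy as np

    def ilbp_codes(img: np.ndarray) -> np.ndarray:
        h, w = img.shape
        codes = np.zeros((h - 2, w - 2), dtype=np.int32)
        for i in range(1, h - 1):
            for j in range(1, w - 1):
                patch = img[i - 1:i + 2, j - 1:j + 2].astype(float)
                bits = (patch >= patch.mean()).astype(int).ravel()   # 9 bits
                codes[i - 1, j - 1] = int("".join(map(str, bits)), 2)
        return codes

    rng = np.random.default_rng(3)
    image = rng.integers(0, 256, size=(32, 32))
    hist, _ = np.histogram(ilbp_codes(image), bins=2 ** 9, range=(0, 2 ** 9))
    print("feature vector length:", hist.size)   # 512-bin ILBP histogram
    ```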

  6. TFM classification and staging of oral submucous fibrosis: A new proposal.

    PubMed

    Arakeri, Gururaj; Thomas, Deepak; Aljabab, Abdulsalam S; Hunasgi, Santosh; Rai, Kirthi Kumar; Hale, Beverley; Fonseca, Felipe Paiva; Gomez, Ricardo Santiago; Rahimi, Siavash; Merkx, Matthias A W; Brennan, Peter A

    2018-04-01

    We have evaluated the rationale of existing grading and staging schemes of oral submucous fibrosis (OSMF) based on how they are categorized. A novel classification and staging scheme is proposed. A total of 300 OSMF patients were evaluated for agreement between functional, clinical, and histopathological staging. Bilateral biopsies were assessed in 25 patients to evaluate for any differences in histopathological staging of OSMF in the same mouth. Extent of clinician agreement for categorized staging data was evaluated using Cohen's weighted kappa analysis. Cross-tabulation was performed on categorical grading data to understand the intercorrelation, and the unweighted kappa analysis was used to assess the bilateral grade agreement. Probabilities of less than 0.05 were considered significant. Data were analyzed using SPSS Statistics (version 25.0, IBM, USA). A low agreement was found between all the stages depicting the independent nature of trismus, clinical features, and histopathological components (K = 0.312, 0.167, 0.152) in OSMF. Following analysis, a three-component classification scheme (TFM classification) was developed that describes the severity of each independently, grouping them using a novel three-tier staging scheme as a guide to the treatment plan. The proposed classification and staging could be useful for effective communication, categorization, and for recording data and prognosis, and for guiding treatment plans. Furthermore, the classification considers OSMF malignant transformation in detail. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. Discovery of User-Oriented Class Associations for Enriching Library Classification Schemes.

    ERIC Educational Resources Information Center

    Pu, Hsiao-Tieh

    2002-01-01

    Presents a user-based approach to exploring the possibility of adding user-oriented class associations to hierarchical library classification schemes. Classes not grouped in the same subject hierarchies yet relevant to users' knowledge are obtained by analyzing a log book of a university library's circulation records, using collaborative filtering…

  8. Social Constructivism: Botanical Classification Schemes of Elementary School Children.

    ERIC Educational Resources Information Center

    Tull, Delena

    The assertion that there is a social component to children's construction of knowledge about natural phenomena is supported by evidence from an examination of children's classification schemes for plants. An ethnographic study was conducted with nine sixth grade children in central Texas. The children classified plants in the outdoors, in a…

  9. A Classification Scheme for Career Education Resource Materials.

    ERIC Educational Resources Information Center

    Koontz, Ronald G.

    The introductory section of the paper expresses its purpose: to devise a classification scheme for career education resource material, which will be used to develop the USOE Office of Career Education Resource Library and will be disseminated to interested State departments of education and local school districts to assist them in classifying…

  10. A Classification Scheme for Adult Education. Education Libraries Bulletin, Supplement Twelve.

    ERIC Educational Resources Information Center

    Greaves, Monica A., Comp.

    This classification scheme, based on the 'facet formula' theory of Ranganathan, is designed primarily for the library of the National Institute of Adult Education in London, England. Kinds of persons being educated (educands), methods and problems of education, specific countries, specific organizations, and forms in which the information is…

  11. A Computer Oriented Scheme for Coding Chemicals in the Field of Biomedicine.

    ERIC Educational Resources Information Center

    Bobka, Marilyn E.; Subramaniam, J.B.

    The chemical coding scheme of the Medical Coding Scheme (MCS), developed for use in the Comparative Systems Laboratory (CSL), is outlined and evaluated in this report. The chemical coding scheme provides a classification scheme and encoding method for drugs and chemical terms. Using the scheme complicated chemical structures may be expressed…

  12. A Noise-Filtered Under-Sampling Scheme for Imbalanced Classification.

    PubMed

    Kang, Qi; Chen, XiaoShuang; Li, SiSi; Zhou, MengChu

    2017-12-01

    Under-sampling is a popular data preprocessing method in dealing with class imbalance problems, with the purposes of balancing datasets to achieve a high classification rate and avoiding the bias toward majority class examples. It always uses the full minority data in a training dataset. However, some noisy minority examples may reduce the performance of classifiers. In this paper, a new under-sampling scheme is proposed by incorporating a noise filter before executing resampling. In order to verify its efficiency, this scheme is implemented based on four popular under-sampling methods, i.e., Undersampling + Adaboost, RUSBoost, UnderBagging, and EasyEnsemble, through benchmarks and significance analysis. Furthermore, this paper also summarizes the relationship between algorithm performance and imbalanced ratio. Experimental results indicate that the proposed scheme can improve the original undersampling-based methods with significance in terms of three popular metrics for imbalanced classification, i.e., the area under the curve, F-measure, and G-mean.
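    The "filter noise first, then under-sample" idea can be illustrated with off-the-shelf components; the sketch below uses the imbalanced-learn package as a stand-in and does not reproduce the paper's own filter or the four boosted/bagged variants.

    ```python
    # Hedged sketch: edited-nearest-neighbours noise filtering followed by
    # random under-sampling on a synthetic imbalanced dataset.
    from collections import Counter
    from sklearn.datasets import make_classification
    from imblearn.under_sampling import EditedNearestNeighbours, RandomUnderSampler

    X, y = make_classification(n_samples=2000, weights=[0.9, 0.1],
                               n_informative=4, random_state=0)
    print("original:", Counter(y))

    X_f, y_f = EditedNearestNeighbours().fit_resample(X, y)              # drop noisy examples
    X_b, y_b = RandomUnderSampler(random_state=0).fit_resample(X_f, y_f)  # balance the classes
    print("filtered + under-sampled:", Counter(y_b))
    ```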

  13. Maximum likelihood estimation of label imperfections and its use in the identification of mislabeled patterns

    NASA Technical Reports Server (NTRS)

    Chittineni, C. B.

    1979-01-01

    The problem of estimating label imperfections and the use of the estimation in identifying mislabeled patterns is presented. Expressions for the maximum likelihood estimates of classification errors and a priori probabilities are derived from the classification of a set of labeled patterns. Expressions also are given for the asymptotic variances of probability of correct classification and proportions. Simple models are developed for imperfections in the labels and for classification errors and are used in the formulation of a maximum likelihood estimation scheme. Schemes are presented for the identification of mislabeled patterns in terms of threshold on the discriminant functions for both two-class and multiclass cases. Expressions are derived for the probability that the imperfect label identification scheme will result in a wrong decision and are used in computing thresholds. The results of practical applications of these techniques in the processing of remotely sensed multispectral data are presented.

  14. Machine learning in APOGEE. Unsupervised spectral classification with K-means

    NASA Astrophysics Data System (ADS)

    Garcia-Dias, Rafael; Allende Prieto, Carlos; Sánchez Almeida, Jorge; Ordovás-Pascual, Ignacio

    2018-05-01

    Context. The volume of data generated by astronomical surveys is growing rapidly. Traditional analysis techniques in spectroscopy either demand intensive human interaction or are computationally expensive. In this scenario, machine learning, and unsupervised clustering algorithms in particular, offer interesting alternatives. The Apache Point Observatory Galactic Evolution Experiment (APOGEE) offers a vast data set of near-infrared stellar spectra, which is perfect for testing such alternatives. Aims: Our research applies an unsupervised classification scheme based on K-means to the massive APOGEE data set. We explore whether the data are amenable to classification into discrete classes. Methods: We apply the K-means algorithm to 153 847 high resolution spectra (R ≈ 22 500). We discuss the main virtues and weaknesses of the algorithm, as well as our choice of parameters. Results: We show that a classification based on normalised spectra captures the variations in stellar atmospheric parameters, chemical abundances, and rotational velocity, among other factors. The algorithm is able to separate the bulge and halo populations, and distinguish dwarfs, sub-giants, RC, and RGB stars. However, a discrete classification in flux space does not result in a neat organisation in the parameters' space. Furthermore, the lack of obvious groups in flux space causes the results to be fairly sensitive to the initialisation, and disrupts the efficiency of commonly-used methods to select the optimal number of clusters. Our classification is publicly available, including extensive online material associated with the APOGEE Data Release 12 (DR12). Conclusions: Our description of the APOGEE database can help greatly with the identification of specific types of targets for various applications. We find a lack of obvious groups in flux space, and identify limitations of the K-means algorithm in dealing with this kind of data. Full Tables B.1-B.4 are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/612/A98
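    A minimal sketch of unsupervised spectral classification with K-means is shown below; the spectra are synthetic stand-ins for continuum-normalised fluxes, and the number of clusters is an arbitrary illustrative choice.

    ```python
    # Sketch: K-means clustering of normalised synthetic "spectra".
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(4)
    n_stars, n_pixels = 500, 300
    centres = rng.uniform(0.7, 1.0, size=(5, n_pixels))     # 5 latent classes
    labels_true = rng.integers(0, 5, n_stars)
    spectra = centres[labels_true] + rng.normal(0, 0.01, (n_stars, n_pixels))

    km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(spectra)
    print("cluster sizes:", np.bincount(km.labels_))
    ```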

  15. A Standardised Vocabulary for Identifying Benthic Biota and Substrata from Underwater Imagery: The CATAMI Classification Scheme

    PubMed Central

    Jordan, Alan; Rees, Tony; Gowlett-Holmes, Karen

    2015-01-01

    Imagery collected by still and video cameras is an increasingly important tool for minimal impact, repeatable observations in the marine environment. Data generated from imagery includes identification, annotation and quantification of biological subjects and environmental features within an image. To be long-lived and useful beyond their project-specific initial purpose, and to maximize their utility across studies and disciplines, marine imagery data should use a standardised vocabulary of defined terms. This would enable the compilation of regional, national and/or global data sets from multiple sources, contributing to broad-scale management studies and development of automated annotation algorithms. The classification scheme developed under the Collaborative and Automated Tools for Analysis of Marine Imagery (CATAMI) project provides such a vocabulary. The CATAMI classification scheme introduces Australian-wide acknowledged, standardised terminology for annotating benthic substrates and biota in marine imagery. It combines coarse-level taxonomy and morphology, and is a flexible, hierarchical classification that bridges the gap between habitat/biotope characterisation and taxonomy, acknowledging limitations when describing biological taxa through imagery. It is fully described, documented, and maintained through curated online databases, and can be applied across benthic image collection methods, annotation platforms and scoring methods. Following release in 2013, the CATAMI classification scheme was taken up by a wide variety of users, including government, academia and industry. This rapid acceptance highlights the scheme’s utility and the potential to facilitate broad-scale multidisciplinary studies of marine ecosystems when applied globally. Here we present the CATAMI classification scheme, describe its conception and features, and discuss its utility and the opportunities as well as challenges arising from its use. PMID:26509918

  16. A new Fourier transform based CBIR scheme for mammographic mass classification: a preliminary invariance assessment

    NASA Astrophysics Data System (ADS)

    Gundreddy, Rohith Reddy; Tan, Maxine; Qui, Yuchen; Zheng, Bin

    2015-03-01

    The purpose of this study is to develop and test a new content-based image retrieval (CBIR) scheme that achieves higher reproducibility when implemented in an interactive computer-aided diagnosis (CAD) system, without significantly reducing lesion classification performance. This is a new Fourier transform based CBIR algorithm that determines image similarity of two regions of interest (ROI) based on the difference of average regional image pixel value distribution in two Fourier transform mapped images under comparison. A reference image database involving 227 ROIs depicting verified soft-tissue breast lesions was used. For each testing ROI, the queried lesion center was systematically shifted from 10 to 50 pixels to simulate inter-user variation in querying the suspicious lesion center when using an interactive CAD system. The lesion classification performance and reproducibility as the queried lesion center shifted were assessed and compared among the three CBIR schemes based on Fourier transform, mutual information and Pearson correlation. Each CBIR scheme retrieved the 10 most similar reference ROIs and computed a likelihood score of the queried ROI depicting a malignant lesion. The experimental results showed that the three CBIR schemes yielded very comparable lesion classification performance as measured by the areas under ROC curves, with p-values greater than 0.498. However, the CBIR scheme using the Fourier transform yielded the highest invariance to both queried lesion center shift and lesion size change. This study demonstrated the feasibility of improving the robustness of interactive CAD systems by adding a new Fourier transform based image feature to CBIR schemes.
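    The retrieval-and-scoring loop can be sketched as follows: similarity is taken as the difference between average regional magnitudes of the 2-D Fourier transforms, the 10 most similar reference ROIs are retrieved, and the malignancy "likelihood" is their malignant fraction. The coarse block-grid averaging and the random stand-in images are assumptions, not the published algorithm or data.

    ```python
    # Hedged sketch of Fourier-transform-based CBIR retrieval and scoring.
    import numpy as np

    def ft_signature(roi: np.ndarray, blocks: int = 4) -> np.ndarray:
        mag = np.abs(np.fft.fftshift(np.fft.fft2(roi)))
        h, w = mag.shape
        bh, bw = h // blocks, w // blocks
        return np.array([mag[i*bh:(i+1)*bh, j*bw:(j+1)*bw].mean()
                         for i in range(blocks) for j in range(blocks)])

    rng = np.random.default_rng(5)
    references = rng.random((227, 64, 64))       # stand-in reference ROIs
    ref_labels = rng.integers(0, 2, 227)         # 1 = malignant (synthetic labels)
    query = rng.random((64, 64))

    q_sig = ft_signature(query)
    dists = np.array([np.abs(ft_signature(r) - q_sig).mean() for r in references])
    top10 = np.argsort(dists)[:10]
    print("likelihood of malignancy:", ref_labels[top10].mean())
    ```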

  17. An Analysis Platform for Multiscale Hydrogeologic Modeling with Emphasis on Hybrid Multiscale Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scheibe, Timothy D.; Murphy, Ellyn M.; Chen, Xingyuan

    2015-01-01

    One of the most significant challenges facing hydrogeologic modelers is the disparity between those spatial and temporal scales at which fundamental flow, transport and reaction processes can best be understood and quantified (e.g., microscopic to pore scales, seconds to days) and those at which practical model predictions are needed (e.g., plume to aquifer scales, years to centuries). While the multiscale nature of hydrogeologic problems is widely recognized, technological limitations in computation and characterization restrict most practical modeling efforts to fairly coarse representations of heterogeneous properties and processes. For some modern problems, the necessary level of simplification is such that model parameters may lose physical meaning and model predictive ability is questionable for any conditions other than those to which the model was calibrated. Recently, there has been broad interest across a wide range of scientific and engineering disciplines in simulation approaches that more rigorously account for the multiscale nature of systems of interest. In this paper, we review a number of such approaches and propose a classification scheme for defining different types of multiscale simulation methods and those classes of problems to which they are most applicable. Our classification scheme is presented in terms of a flow chart (Multiscale Analysis Platform or MAP), and defines several different motifs of multiscale simulation. Within each motif, the member methods are reviewed and example applications are discussed. We focus attention on hybrid multiscale methods, in which two or more models with different physics described at fundamentally different scales are directly coupled within a single simulation. Very recently these methods have begun to be applied to groundwater flow and transport simulations, and we discuss these applications in the context of our classification scheme. As computational and characterization capabilities continue to improve, we envision that hybrid multiscale modeling will become more common and may become a viable alternative to conventional single-scale models in the near future.

  18. An analysis platform for multiscale hydrogeologic modeling with emphasis on hybrid multiscale methods.

    PubMed

    Scheibe, Timothy D; Murphy, Ellyn M; Chen, Xingyuan; Rice, Amy K; Carroll, Kenneth C; Palmer, Bruce J; Tartakovsky, Alexandre M; Battiato, Ilenia; Wood, Brian D

    2015-01-01

    One of the most significant challenges faced by hydrogeologic modelers is the disparity between the spatial and temporal scales at which fundamental flow, transport, and reaction processes can best be understood and quantified (e.g., microscopic to pore scales and seconds to days) and at which practical model predictions are needed (e.g., plume to aquifer scales and years to centuries). While the multiscale nature of hydrogeologic problems is widely recognized, technological limitations in computation and characterization restrict most practical modeling efforts to fairly coarse representations of heterogeneous properties and processes. For some modern problems, the necessary level of simplification is such that model parameters may lose physical meaning and model predictive ability is questionable for any conditions other than those to which the model was calibrated. Recently, there has been broad interest across a wide range of scientific and engineering disciplines in simulation approaches that more rigorously account for the multiscale nature of systems of interest. In this article, we review a number of such approaches and propose a classification scheme for defining different types of multiscale simulation methods and those classes of problems to which they are most applicable. Our classification scheme is presented in terms of a flowchart (Multiscale Analysis Platform), and defines several different motifs of multiscale simulation. Within each motif, the member methods are reviewed and example applications are discussed. We focus attention on hybrid multiscale methods, in which two or more models with different physics described at fundamentally different scales are directly coupled within a single simulation. Very recently these methods have begun to be applied to groundwater flow and transport simulations, and we discuss these applications in the context of our classification scheme. As computational and characterization capabilities continue to improve, we envision that hybrid multiscale modeling will become more common and also a viable alternative to conventional single-scale models in the near future. © 2014, National Ground Water Association.

  19. The classification of anxiety and hysterical states. Part I. Historical review and empirical delineation.

    PubMed

    Sheehan, D V; Sheehan, K H

    1982-08-01

    The history of the classification of anxiety, hysterical, and hypochondriacal disorders is reviewed. Problems in the ability of current classification schemes to predict, control, and describe the relationship between the symptoms and other phenomena are outlined. Existing classification schemes failed the first test of a good classification model--that of providing categories that are mutually exclusive. The independence of these diagnostic categories from each other does not appear to hold up on empirical testing. In the absence of inherently mutually exclusive categories, further empirical investigation of these classes is obstructed since statistically valid analysis of the nominal data and any useful multivariate analysis would be difficult if not impossible. It is concluded that the existing classifications are unsatisfactory and require some fundamental reconceptualization.

  20. A classification scheme for alternative oxidases reveals the taxonomic distribution and evolutionary history of the enzyme in angiosperms.

    PubMed

    Costa, José Hélio; McDonald, Allison E; Arnholdt-Schmitt, Birgit; Fernandes de Melo, Dirce

    2014-11-01

    A classification scheme based on protein phylogenies and the sequence harmony method was used to clarify the taxonomic distribution and evolutionary history of the alternative oxidase (AOX) in angiosperms. Analyses of a large data set showed that the AOX1 and AOX2 subfamilies were distributed into 4 phylogenetic clades: AOX1a-c/1e, AOX1d, AOX2a-c and AOX2d. High diversity in AOX family compositions was found. While the AOX2 subfamily was not detected in monocots, the AOX1 subfamily has expanded (AOX1a-e) in the large majority of these plants. In addition, Poales AOX1b and 1d were orthologous to eudicot AOX1d and were therefore renamed AOX1d1 and 1d2. AOX1 or AOX2 losses were detected in some eudicot plants. Several AOX2 duplications (AOX2a-c) were identified in eudicot species, mainly in the asterids. The AOX2b originally identified in eudicots in the Fabales order (soybean, cowpea) was divergent from AOX2a-c, sharing some specific amino acids with AOX1d, and was therefore renamed AOX2d. AOX1d and AOX2d seem to be stress-responsive, facultative and mutually exclusive among species, suggesting a complementary role with an AOX1(a) in stress conditions. Based on the data collected, we present a model for the evolutionary history of AOX in angiosperms and highlight specific areas where further research would be most beneficial. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Evaluation of effectiveness of wavelet based denoising schemes using ANN and SVM for bearing condition classification.

    PubMed

    Vijay, G S; Kumar, H S; Srinivasa Pai, P; Sriram, N S; Rao, Raj B K N

    2012-01-01

    Wavelet-based denoising has proven its ability to denoise bearing vibration signals by improving the signal-to-noise ratio (SNR) and reducing the root-mean-square error (RMSE). In this paper, seven wavelet-based denoising schemes have been evaluated based on the performance of the Artificial Neural Network (ANN) and the Support Vector Machine (SVM) for bearing condition classification. The work consists of two parts. In the first part, a synthetic signal simulating a defective bearing vibration signal with added Gaussian noise was subjected to these denoising schemes, and the best scheme based on the SNR and the RMSE was identified. In the second part, the vibration signals collected from a customized Rolling Element Bearing (REB) test rig for four bearing conditions were subjected to these denoising schemes. Several time and frequency domain features were extracted from the denoised signals, out of which a few sensitive features were selected using Fisher's Criterion (FC). The extracted features were used to train and test the ANN and the SVM. The best denoising scheme identified, based on the classification performances of the ANN and the SVM, was found to be the same as the one obtained using the synthetic signal.
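    One wavelet-based denoising scheme and the two figures of merit mentioned above (SNR and RMSE) can be sketched as follows; the wavelet, decomposition level and threshold rule are illustrative choices, not the seven schemes compared in the paper.

    ```python
    # Hedged sketch: soft-threshold wavelet denoising of a noisy synthetic
    # signal, scored by RMSE and SNR against the clean signal.
    import numpy as np
    import pywt

    rng = np.random.default_rng(6)
    t = np.linspace(0, 1, 2048)
    clean = np.sin(2 * np.pi * 50 * t) * (1 + 0.5 * np.sin(2 * np.pi * 5 * t))
    noisy = clean + rng.normal(0, 0.4, t.size)

    coeffs = pywt.wavedec(noisy, "db4", level=5)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745           # noise estimate
    thr = sigma * np.sqrt(2 * np.log(noisy.size))            # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    denoised = pywt.waverec(coeffs, "db4")[:noisy.size]

    rmse = np.sqrt(np.mean((denoised - clean) ** 2))
    snr = 10 * np.log10(np.sum(clean ** 2) / np.sum((denoised - clean) ** 2))
    print(f"RMSE = {rmse:.3f}, SNR = {snr:.1f} dB")
    ```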

  2. Classification of basic facilities for high-rise residential: A survey from 100 housing scheme in Kajang area

    NASA Astrophysics Data System (ADS)

    Ani, Adi Irfan Che; Sairi, Ahmad; Tawil, Norngainy Mohd; Wahab, Siti Rashidah Hanum Abd; Razak, Muhd Zulhanif Abd

    2016-08-01

    High demand for housing and limited land in town areas have increased the provision of high-rise residential schemes. This type of housing has different owners but shares the same land lot and common facilities. Thus, maintenance works of the buildings and common facilities must be well organized. The purpose of this paper is to identify and classify basic facilities for high-rise residential buildings, with the aim of improving the management of such schemes. The method adopted is a survey of 100 high-rise residential schemes, ranging from affordable to high-cost housing, selected using snowball sampling. The scope of this research is the Kajang area, which is being rapidly developed with high-rise housing. The objective of the survey is to list all facilities in each sampled scheme. The results confirmed that the 11 pre-determined classifications hold true and can provide a realistic classification for high-rise residential schemes. This paper proposes a redefinition of the facilities provided in order to create a better management system and to give a clear definition of high-rise residential types based on their facilities.

  3. Applying the Methodology of the Community College Classification Scheme to the Public Master's Colleges and Universities Sector

    ERIC Educational Resources Information Center

    Kinkead, J. Clint.; Katsinas, Stephen G.

    2011-01-01

    This work brings forward the geographically-based classification scheme for the public Master's Colleges and Universities sector. Using the same methodology developed by Katsinas and Hardy (2005) to classify community colleges, this work classifies Master's Colleges and Universities. This work has four major findings and conclusions. First, a…

  4. What's in a Name? A Comparison of Methods for Classifying Predominant Type of Maltreatment

    ERIC Educational Resources Information Center

    Lau, A.S.; Leeb, R.T.; English, D.; Graham, J.C.; Briggs, E.C.; Brody, K.E.; Marshall, J.M.

    2005-01-01

    Objective: The primary aim of the study was to identify a classification scheme, for determining the predominant type of maltreatment in a child's history that best predicts differences in developmental outcomes. Method: Three different predominant type classification schemes were examined in a sample of 519 children with a history of alleged…

  5. Using methods from the data mining and machine learning literature for disease classification and prediction: A case study examining classification of heart failure sub-types

    PubMed Central

    Austin, Peter C.; Tu, Jack V.; Ho, Jennifer E.; Levy, Daniel; Lee, Douglas S.

    2014-01-01

    Objective: Physicians classify patients into those with or without a specific disease. Furthermore, there is often interest in classifying patients according to disease etiology or subtype. Classification trees are frequently used to classify patients according to the presence or absence of a disease. However, classification trees can suffer from limited accuracy. In the data-mining and machine learning literature, alternate classification schemes have been developed. These include bootstrap aggregation (bagging), boosting, random forests, and support vector machines. Study Design and Setting: We compared the performance of these classification methods with those of conventional classification trees to classify patients with heart failure according to the following sub-types: heart failure with preserved ejection fraction (HFPEF) vs. heart failure with reduced ejection fraction (HFREF). We also compared the ability of these methods to predict the probability of the presence of HFPEF with that of conventional logistic regression. Results: We found that modern, flexible tree-based methods from the data mining literature offer substantial improvement in prediction and classification of heart failure sub-type compared to conventional classification and regression trees. However, conventional logistic regression had superior performance for predicting the probability of the presence of HFPEF compared to the methods proposed in the data mining literature. Conclusion: The use of tree-based methods offers superior performance over conventional classification and regression trees for predicting and classifying heart failure subtypes in a population-based sample of patients from Ontario. However, these methods do not offer substantial improvements over logistic regression for predicting the presence of HFPEF. PMID:23384592
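    The kind of comparison described above can be sketched on synthetic data: a single classification tree versus bagged and boosted ensembles and an SVM for class prediction, alongside logistic regression for predicted probabilities. The models, metrics and data below are illustrative stand-ins, not the study's cohort or fitted models.

    ```python
    # Hedged sketch: comparing tree-based classifiers, an SVM and logistic
    # regression by accuracy and AUC on a synthetic binary outcome.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.ensemble import BaggingClassifier, RandomForestClassifier, GradientBoostingClassifier
    from sklearn.svm import SVC
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score, roc_auc_score

    X, y = make_classification(n_samples=1500, n_informative=8, random_state=0)
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

    models = {
        "tree": DecisionTreeClassifier(random_state=0),
        "bagging": BaggingClassifier(random_state=0),
        "random forest": RandomForestClassifier(random_state=0),
        "boosting": GradientBoostingClassifier(random_state=0),
        "SVM": SVC(probability=True, random_state=0),
        "logistic": LogisticRegression(max_iter=1000),
    }
    for name, m in models.items():
        m.fit(Xtr, ytr)
        proba = m.predict_proba(Xte)[:, 1]
        print(f"{name:14s} acc={accuracy_score(yte, m.predict(Xte)):.2f} "
              f"AUC={roc_auc_score(yte, proba):.2f}")
    ```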

  6. A new classification scheme for periodontal and peri-implant diseases and conditions - Introduction and key changes from the 1999 classification.

    PubMed

    Caton, Jack G; Armitage, Gary; Berglundh, Tord; Chapple, Iain L C; Jepsen, Søren; Kornman, Kenneth S; Mealey, Brian L; Papapanou, Panos N; Sanz, Mariano; Tonetti, Maurizio S

    2018-06-01

    A classification scheme for periodontal and peri-implant diseases and conditions is necessary for clinicians to properly diagnose and treat patients as well as for scientists to investigate etiology, pathogenesis, natural history, and treatment of the diseases and conditions. This paper summarizes the proceedings of the World Workshop on the Classification of Periodontal and Peri-implant Diseases and Conditions. The workshop was co-sponsored by the American Academy of Periodontology (AAP) and the European Federation of Periodontology (EFP) and included expert participants from all over the world. Planning for the conference, which was held in Chicago on November 9 to 11, 2017, began in early 2015. An organizing committee from the AAP and EFP commissioned 19 review papers and four consensus reports covering relevant areas in periodontology and implant dentistry. The authors were charged with updating the 1999 classification of periodontal diseases and conditions and developing a similar scheme for peri-implant diseases and conditions. Reviewers and workgroups were also asked to establish pertinent case definitions and to provide diagnostic criteria to aid clinicians in the use of the new classification. All findings and recommendations of the workshop were agreed to by consensus. This introductory paper presents an overview for the new classification of periodontal and peri-implant diseases and conditions, along with a condensed scheme for each of four workgroup sections, but readers are directed to the pertinent consensus reports and review papers for a thorough discussion of the rationale, criteria, and interpretation of the proposed classification. Changes to the 1999 classification are highlighted and discussed. Although the intent of the workshop was to base classification on the strongest available scientific evidence, lower level evidence and expert opinion were inevitably used whenever sufficient research data were unavailable. The scope of this workshop was to align and update the classification scheme to the current understanding of periodontal and peri-implant diseases and conditions. This introductory overview presents the schematic tables for the new classification of periodontal and peri-implant diseases and conditions and briefly highlights changes made to the 1999 classification. It cannot present the wealth of information included in the reviews, case definition papers, and consensus reports that has guided the development of the new classification, and reference to the consensus and case definition papers is necessary to provide a thorough understanding of its use for either case management or scientific investigation. Therefore, it is strongly recommended that the reader use this overview as an introduction to these subjects. Accessing this publication online will allow the reader to use the links in this overview and the tables to view the source papers (Table 1). © 2018 American Academy of Periodontology and European Federation of Periodontology.

  7. Automatic classification of protein structures using physicochemical parameters.

    PubMed

    Mohan, Abhilash; Rao, M Divya; Sunderrajan, Shruthi; Pennathur, Gautam

    2014-09-01

    Protein classification is the first step to functional annotation; SCOP and Pfam databases are currently the most relevant protein classification schemes. However, the disproportion in the number of three dimensional (3D) protein structures generated versus their classification into relevant superfamilies/families emphasizes the need for automated classification schemes. Predicting function of novel proteins based on sequence information alone has proven to be a major challenge. The present study focuses on the use of physicochemical parameters in conjunction with machine learning algorithms (Naive Bayes, Decision Trees, Random Forest and Support Vector Machines) to classify proteins into their respective SCOP superfamily/Pfam family, using sequence derived information. Spectrophores™, a 1D descriptor of the 3D molecular field surrounding a structure was used as a benchmark to compare the performance of the physicochemical parameters. The machine learning algorithms were modified to select features based on information gain for each SCOP superfamily/Pfam family. The effect of combining physicochemical parameters and spectrophores on classification accuracy (CA) was studied. Machine learning algorithms trained with the physicochemical parameters consistently classified SCOP superfamilies and Pfam families with a classification accuracy above 90%, while spectrophores performed with a CA of around 85%. Feature selection improved classification accuracy for both physicochemical parameters and spectrophores based machine learning algorithms. Combining both attributes resulted in a marginal loss of performance. Physicochemical parameters were able to classify proteins from both schemes with classification accuracy ranging from 90-96%. These results suggest the usefulness of this method in classifying proteins from amino acid sequences.
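
    As a loose sketch of this kind of pipeline, the example below performs feature selection by mutual information (a close relative of the information gain used above) followed by an SVM, on synthetic multi-class data; the feature count, the value of k, and the kernel settings are assumptions rather than the paper's configuration.

```python
# Sketch: information-gain-style feature selection (via mutual information)
# feeding an SVM, on synthetic stand-ins for sequence-derived features.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=40, n_informative=10,
                           n_classes=4, n_clusters_per_class=1, random_state=1)

clf = make_pipeline(
    StandardScaler(),
    SelectKBest(mutual_info_classif, k=15),  # keep the most informative features
    SVC(kernel="rbf", C=10.0),
)
acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
print(f"mean cross-validated accuracy: {acc:.2%}")
```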

  8. A classification scheme for risk assessment methods.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stamp, Jason Edwin; Campbell, Philip LaRoche

    2004-08-01

    This report presents a classification scheme for risk assessment methods. This scheme, like all classification schemes, provides meaning by imposing a structure that identifies relationships. Our scheme is based on two orthogonal aspects--level of detail and approach. The resulting structure is shown in Table 1 and is explained in the body of the report. The report imposes this structure on the set of risk assessment methods in order to reveal their relationships and thus optimize their usage. We present a two-dimensional structure in the form of a matrix, using three abstraction levels for the rows and three approaches for the columns. For each of the nine cells in the matrix we identify the method type by name and example. The matrix helps the user understand: (1) what to expect from a given method, (2) how it relates to other methods, and (3) how best to use it. Each cell in the matrix represents a different arrangement of strengths and weaknesses. Those arrangements shift gradually as one moves through the table, each cell being optimal for a particular situation. The intention of this report is to enable informed use of the methods, so that the method chosen is optimal for the situation at hand. The matrix, with type names in the cells, is introduced in Table 2 on page 13 of the report. Unless otherwise stated, we use the word 'method' in this report to refer to a 'risk assessment method', though at times we use the full phrase. The terms 'risk assessment' and 'risk management' are close enough that we do not attempt to distinguish them in this report. The remainder of the report is organized as follows. In Section 2 we provide context for the report--what a 'method' is and where it fits. In Section 3 we present background for our classification scheme--what other schemes we have found, and the fundamental nature of methods and their necessary incompleteness. In Section 4 we present our classification scheme in the form of a matrix, followed by an analogy that should aid understanding of the scheme, concluding with an explanation of the two dimensions and the nine types in our scheme. In Section 5 we present examples of each of the classification types. In Section 6 we present conclusions.

  9. Classification of instability after reverse shoulder arthroplasty guides surgical management and outcomes.

    PubMed

    Abdelfattah, Adham; Otto, Randall J; Simon, Peter; Christmas, Kaitlyn N; Tanner, Gregory; LaMartina, Joey; Levy, Jonathan C; Cuff, Derek J; Mighell, Mark A; Frankle, Mark A

    2018-04-01

    Revision of unstable reverse shoulder arthroplasty (RSA) remains a significant challenge. The purpose of this study was to determine the reliability of a new treatment-guiding classification for instability after RSA, to describe the clinical outcomes of patients stabilized operatively, and to identify those with a higher risk of recurrence. All patients undergoing revision for instability after RSA were identified at our institution. Demographic, clinical, radiographic, and intraoperative data were collected. A classification was developed using all identified causes of instability after RSA and allocating them to 1 of 3 defined treatment-guiding categories. Eight surgeons reviewed all data and applied the classification scheme to each case. Interobserver and intraobserver reliability was used to evaluate the classification scheme. Preoperative clinical outcomes were compared with final follow-up in stabilized shoulders. Forty-three revision cases in 34 patients met the inclusion criteria for the study. Five patients remained unstable after revision. Persistent instability occurred most commonly with persistent deltoid dysfunction and postoperative acromial fractures, but also in 1 case of soft tissue impingement. Twenty-one patients remained stable at a minimum of 2 years of follow-up and had significant improvement of clinical outcome scores and range of motion. Reliability of the classification scheme showed substantial and almost perfect interobserver and intraobserver agreement among all the participants (κ = 0.699 and κ = 0.851, respectively). Instability after RSA can be successfully treated with revision surgery using the reliable treatment-guiding classification scheme presented herein. However, more understanding is needed for patients with greater risk of recurrent instability after revision surgery. Copyright © 2017 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.
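
    For readers unfamiliar with the agreement statistics quoted above, the snippet below shows how a pairwise Cohen's kappa is computed for two raters assigning cases to the three treatment-guiding categories; the labels are invented for illustration, and agreement across all eight surgeons would instead call for a multi-rater statistic such as Fleiss' kappa.

```python
# Sketch: interobserver agreement (Cohen's kappa) for two raters assigning
# hypothetical revision cases to treatment-guiding categories 1-3.
from sklearn.metrics import cohen_kappa_score

rater_a = [1, 1, 2, 3, 2, 1, 3, 3, 2, 1, 2, 2]   # illustrative labels only
rater_b = [1, 1, 2, 3, 2, 1, 3, 2, 2, 1, 2, 3]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.3f}")  # chance-corrected agreement
```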

  10. A Three-Phase Decision Model of Computer-Aided Coding for the Iranian Classification of Health Interventions (IRCHI).

    PubMed

    Azadmanjir, Zahra; Safdari, Reza; Ghazisaeedi, Marjan; Mokhtaran, Mehrshad; Kameli, Mohammad Esmail

    2017-06-01

    Accurate coded data in healthcare are critical. Computer-Assisted Coding (CAC) is an effective tool for improving clinical coding, particularly when a new classification is being developed and implemented. Determining the appropriate development method, however, requires considering the specifications of existing CAC systems, the requirements of each type, the available infrastructure, and the classification scheme itself. The aim of the study was to develop a decision model for determining the accurate code of each medical intervention in the Iranian Classification of Health Interventions (IRCHI) that can be implemented as a suitable CAC system. First, a sample of existing CAC systems was reviewed. Then the feasibility of each type of CAC was examined with regard to the prerequisites for its implementation. In the next step, a proper model was proposed according to the structure of the classification scheme and was implemented as an interactive system. There is a significant relationship between the level of assistance a CAC system provides and its integration with electronic medical documents. Implementation of fully automated CAC systems is currently impossible due to the immature development of electronic medical records and problems in using natural language for medical documentation. A model was therefore proposed to develop a semi-automated CAC system based on hierarchical relationships between entities in the classification scheme and on decision logic that specifies the characters of the code step by step through a web-based interactive user interface. It is composed of three phases, selecting the Target, Action, and Means, respectively, for an intervention. The proposed model suits the current status of clinical documentation and coding in Iran as well as the structure of the new classification scheme. Our results show that it is practical. However, the model needs to be evaluated in the next stage of the research.

  11. Twenty five years of beach monitoring in Hong Kong: A re-examination of the beach water quality classification scheme from a comparative and global perspective.

    PubMed

    Thoe, W; Lee, Olive H K; Leung, K F; Lee, T; Ashbolt, Nicholas J; Yang, Ron R; Chui, Samuel H K

    2018-06-01

    Hong Kong's beach water quality classification scheme, used effectively for >25 years in protecting public health, was first established in local epidemiology studies during the late 1980s where Escherichia coli (E. coli) was identified as the most suitable faecal indicator bacteria. To review and further substantiate the scheme's robustness, a performance check was carried out to classify water quality of 37 major local beaches in Hong Kong during four bathing seasons (March-October) from 2010 to 2013. Given the enterococci and E. coli data collected, beach classification by the local scheme was found to be in line with the prominent international benchmarks recommended by the World Health Organization and the European Union. Local bacteriological studies over the last 15 years further confirmed that E. coli is the more suitable faecal indicator bacteria than enterococci in the local context. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Update on diabetes classification.

    PubMed

    Thomas, Celeste C; Philipson, Louis H

    2015-01-01

    This article highlights the difficulties in creating a definitive classification of diabetes mellitus in the absence of a complete understanding of the pathogenesis of the major forms. This brief review shows the evolving nature of the classification of diabetes mellitus. No classification scheme is ideal, and all have some overlap and inconsistencies. The only diabetes in which it is possible to accurately diagnose by DNA sequencing, monogenic diabetes, remains undiagnosed in more than 90% of the individuals who have diabetes caused by one of the known gene mutations. The point of classification, or taxonomy, of disease, should be to give insight into both pathogenesis and treatment. It remains a source of frustration that all schemes of diabetes mellitus continue to fall short of this goal. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Identification of terrain cover using the optimum polarimetric classifier

    NASA Technical Reports Server (NTRS)

    Kong, J. A.; Swartz, A. A.; Yueh, H. A.; Novak, L. M.; Shin, R. T.

    1988-01-01

    A systematic approach for the identification of terrain media such as vegetation canopy, forest, and snow-covered fields is developed using the optimum polarimetric classifier. The covariance matrices for various terrain cover are computed from theoretical models of random medium by evaluating the scattering matrix elements. The optimal classification scheme makes use of a quadratic distance measure and is applied to classify a vegetation canopy consisting of both trees and grass. Experimentally measured data are used to validate the classification scheme. Analytical and Monte Carlo simulated classification errors using the fully polarimetric feature vector are compared with classification based on single features which include the phase difference between the VV and HH polarization returns. It is shown that the full polarimetric results are optimal and provide better classification performance than single feature measurements.
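
    The quadratic distance measure mentioned above is, loosely speaking, the distance used by a Gaussian quadratic discriminant; the sketch below applies such a classifier to synthetic two-class "tree vs. grass" feature vectors. The feature dimensions, means, and covariances are invented for illustration and do not come from the paper's random-medium models.

```python
# Sketch: a quadratic (Gaussian) classifier over synthetic polarimetric
# feature vectors, loosely mirroring a quadratic distance measure.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
trees = rng.multivariate_normal([2.0, 0.5, 1.0], np.diag([0.5, 0.2, 0.3]), 500)
grass = rng.multivariate_normal([1.0, 1.5, 0.2], np.diag([0.3, 0.4, 0.1]), 500)
X = np.vstack([trees, grass])
y = np.array([0] * 500 + [1] * 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = QuadraticDiscriminantAnalysis().fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2%}")
```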

  14. A proposed classification scheme for Ada-based software products

    NASA Technical Reports Server (NTRS)

    Cernosek, Gary J.

    1986-01-01

    As the requirements for producing software in the Ada language become a reality for projects such as the Space Station, a great amount of Ada-based program code will begin to emerge. Recognizing the potential for varying levels of quality to result in Ada programs, what is needed is a classification scheme that describes the quality of a software product whose source code exists in Ada form. A 5-level classification scheme is proposed that attempts to decompose this potentially broad spectrum of quality which Ada programs may possess. The number of classes and their corresponding names are not as important as the mere fact that there needs to be some set of criteria from which to evaluate programs existing in Ada. Exact criteria for each class are not presented, nor are any detailed suggestions of how to effectively implement this quality assessment. The idea of Ada-based software classification is introduced, and a set of requirements on which to base further research and development is suggested.

  15. Multidimensional classification of magma types for altered igneous rocks and application to their tectonomagmatic discrimination and igneous provenance of siliciclastic sediments

    NASA Astrophysics Data System (ADS)

    Verma, Surendra P.; Rivera-Gómez, M. Abdelaly; Díaz-González, Lorena; Pandarinath, Kailasa; Amezcua-Valdez, Alejandra; Rosales-Rivera, Mauricio; Verma, Sanjeet K.; Quiroz-Ruiz, Alfredo; Armstrong-Altrin, John S.

    2017-05-01

    A new multidimensional scheme consistent with the International Union of Geological Sciences (IUGS) is proposed for the classification of igneous rocks in terms of four magma types: ultrabasic, basic, intermediate, and acid. Our procedure is based on an extensive database of major element compositions of 33,868 relatively fresh rock samples having a multinormal distribution (initial database of 37,215 samples). The multinormal distribution of the database, in terms of log-ratios of samples, was ascertained by a new computer program, DOMuDaF, in which the discordancy test was applied at the 99.9% confidence level. Isometric log-ratio (ilr) transformation was used to provide overall percent correct classification of 88.7%, 75.8%, 88.0%, and 80.9% for ultrabasic, basic, intermediate, and acid rocks, respectively. Given its known mathematical and uncertainty-propagation properties, this transformation could be adopted for routine applications. The incorrect classifications were mainly for "neighbour" magma types, e.g., basic for ultrabasic and vice versa. Some of these misclassifications do not have any effect on multidimensional tectonic discrimination. For efficient application of this multidimensional scheme, a new computer program MagClaMSys_ilr (MagClaMSys-Magma Classification Major-element based System) was written, which is available for on-line processing on http://tlaloc.ier.unam.mx/index.html. This classification scheme was tested with newly compiled data for relatively fresh Neogene igneous rocks and was found to be consistent with the conventional IUGS procedure. The new scheme was successfully applied to inter-laboratory data for three geochemical reference materials (basalts JB-1 and JB-1a, and andesite JA-3) from Japan and showed that the inferred magma types are consistent with the rock names (basic for basalts JB-1 and JB-1a and intermediate for andesite JA-3). The scheme was also successfully applied to five case studies of older Archaean to Mesozoic igneous rocks. Similar or more reliable results were obtained from existing tectonomagmatic discrimination diagrams when used in conjunction with the new computer program than with the IUGS scheme. The application to three case studies of igneous provenance of sedimentary rocks was demonstrated as a novel approach. Finally, we show that the new scheme is more robust to post-emplacement compositional changes than the conventional IUGS procedure.
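
    The isometric log-ratio transformation used by such schemes is a standard compositional-data device; the sketch below shows one common form of it (after Egozcue and co-workers) applied to a hypothetical major-element composition. The oxide values are illustrative only, and the downstream multinormal discriminant analysis is not reproduced here.

```python
# Sketch: isometric log-ratio (ilr) coordinates of a D-part composition,
# the kind of transformation applied before multinormal classification.
import numpy as np

def ilr(x):
    """Isometric log-ratio transform of a composition (one standard form)."""
    x = np.asarray(x, dtype=float)
    x = x / x.sum()                           # close the composition to 1
    d = len(x)
    out = np.empty(d - 1)
    for j in range(1, d):
        gm = np.exp(np.mean(np.log(x[:j])))   # geometric mean of first j parts
        out[j - 1] = np.sqrt(j / (j + 1)) * np.log(gm / x[j])
    return out

# Hypothetical basalt-like oxides: SiO2, Al2O3, Fe2O3t, MgO, CaO, Na2O+K2O
sample = [49.5, 14.2, 11.8, 7.6, 10.9, 3.4]
print(ilr(sample))   # coordinates suitable for discriminant analysis
```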

  16. Classification of proteins: available structural space for molecular modeling.

    PubMed

    Andreeva, Antonina

    2012-01-01

    The wealth of available protein structural data provides unprecedented opportunity to study and better understand the underlying principles of protein folding and protein structure evolution. A key to achieving this lies in the ability to analyse these data and to organize them in a coherent classification scheme. Over the past years several protein classifications have been developed that aim to group proteins based on their structural relationships. Some of these classification schemes explore the concept of structural neighbourhood (structural continuum), whereas others utilize the notion of protein evolution and thus provide a discrete rather than continuum view of protein structure space. This chapter presents a strategy for classification of proteins with known three-dimensional structure. Steps in the classification process, along with basic definitions, are introduced. Examples illustrating some fundamental concepts of protein folding and evolution, with a special focus on the exceptions to them, are presented.

  17. Inter-sectoral costs and benefits of mental health prevention: towards a new classification scheme.

    PubMed

    Drost, Ruben M W A; Paulus, Aggie T G; Ruwaard, Dirk; Evers, Silvia M A A

    2013-12-01

    Many preventive interventions for mental disorders have costs and benefits that spill over to sectors outside the healthcare sector. Little is known about these "inter-sectoral costs and benefits" (ICBs) of prevention. However, to achieve an efficient allocation of scarce resources, insights on ICBs are indispensable. The main aim was to identify the ICBs related to the prevention of mental disorders and provide a sector-specific classification scheme for these ICBs. Using PubMed, a literature search was conducted for ICBs of mental disorders and related (psycho)social effects. A policy perspective was used to build the scheme's structure, which was adapted to the outcomes of the literature search. In order to validate the scheme's international applicability inside and outside the mental health domain, semi-structured interviews were conducted with (inter)national experts in the broad fields of health promotion and disease prevention. The searched-for items appeared in a total of 52 studies. The ICBs found were classified in one of four sectors: "Education", "Labor and Social Security", "Household and Leisure" or "Criminal Justice System". Psycho(social) effects were placed in a separate section under "Individual and Family". Based on interviews, the scheme remained unadjusted, apart from adding a population-based dimension. This is the first study which offers a sector-specific classification of ICBs. Given the explorative nature of the study, no guidelines on sector-specific classification of ICBs were available. Nevertheless, the classification scheme was acknowledged by an international audience and could therefore provide added value to researchers and policymakers in the field of mental health economics and prevention. The identification and classification of ICBs offers decision makers supporting information on how to optimally allocate scarce resources with respect to preventive interventions for mental disorders. By exploring a new area of research, which has remained largely unexplored until now, the current study has an added value as it may form the basis for the development of a tool which can be used to calculate the ICBs of specific mental health related preventive interventions.

  18. Classifying GRB 170817A/GW170817 in a Fermi duration-hardness plane

    NASA Astrophysics Data System (ADS)

    Horváth, I.; Tóth, B. G.; Hakkila, J.; Tóth, L. V.; Balázs, L. G.; Rácz, I. I.; Pintér, S.; Bagoly, Z.

    2018-03-01

    GRB 170817A, associated with the LIGO-Virgo GW170817 neutron-star merger event, lacks the short duration and hard spectrum of a Short gamma-ray burst (GRB) expected from long-standing classification models. Correctly identifying the class to which this burst belongs requires comparison with other GRBs detected by the Fermi GBM. The aim of our analysis is to classify Fermi GRBs and to test whether or not GRB 170817A belongs—as suggested—to the Short GRB class. The Fermi GBM catalog provides a large database with many measured variables that can be used to explore gamma-ray burst classification. We use statistical techniques to look for clustering in a sample of 1298 gamma-ray bursts described by duration and spectral hardness. Classification of the detected bursts shows that GRB 170817A most likely belongs to the Intermediate, rather than the Short GRB class. We discuss this result in light of theoretical neutron-star merger models and existing GRB classification schemes. It appears that GRB classification schemes may not yet be linked to appropriate theoretical models, and that theoretical models may not yet adequately account for known GRB class properties. We conclude that GRB 170817A may not fit into a simple phenomenological classification scheme.
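
    One common way to look for the Short/Intermediate/Long structure discussed above is to fit Gaussian mixtures in the (log duration, log hardness) plane and compare models by an information criterion. The sketch below does this on synthetic data; the cluster parameters, sample sizes, and the test burst are invented and do not reproduce the Fermi GBM analysis.

```python
# Sketch: model selection over Gaussian mixtures in a duration-hardness plane
# and soft classification of a single burst (synthetic data throughout).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)
short = np.column_stack([rng.normal(-0.5, 0.4, 300), rng.normal(0.4, 0.2, 300)])
inter = np.column_stack([rng.normal(0.7, 0.4, 200), rng.normal(0.0, 0.2, 200)])
long_ = np.column_stack([rng.normal(1.6, 0.4, 800), rng.normal(-0.2, 0.2, 800)])
X = np.vstack([short, inter, long_])  # columns: log10 T90, log10 hardness ratio

for k in (1, 2, 3, 4):
    gmm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(X)
    print(f"k={k}: BIC={gmm.bic(X):.1f}")

best = GaussianMixture(n_components=3, n_init=5, random_state=0).fit(X)
burst = np.array([[np.log10(2.0), np.log10(0.9)]])  # a hypothetical burst
print("class membership probabilities:", best.predict_proba(burst).round(3))
```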

  19. Interactive searching of facial image databases

    NASA Astrophysics Data System (ADS)

    Nicholls, Robert A.; Shepherd, John W.; Shepherd, Jean

    1995-09-01

    A set of psychological facial descriptors has been devised to enable computerized searching of criminal photograph albums. The descriptors have been used to encode image databases of up to twelve thousand images. Using a system called FACES, the databases are searched by translating a witness' verbal description into corresponding facial descriptors. Trials of FACES have shown that this coding scheme is more productive and efficient than searching traditional photograph albums. An alternative method of searching the encoded database using a genetic algorithm is currently being tested. The genetic search method does not require the witness to verbalize a description of the target but merely to indicate a degree of similarity between the target and a limited selection of images from the database. The major drawback of FACES is that it requires manual encoding of images. Research is being undertaken to automate the process; however, it will require an algorithm which can predict human descriptive values. Alternatives to human-derived coding schemes exist using statistical classifications of images. Since databases encoded using statistical classifiers do not have an obvious direct mapping to human-derived descriptors, a search method which does not require the entry of human descriptors is needed. A genetic search algorithm is being tested for this purpose.

  1. Diffuse Lung Disease in Biopsied Children 2 to 18 Years of Age. Application of the chILD Classification Scheme.

    PubMed

    Fan, Leland L; Dishop, Megan K; Galambos, Csaba; Askin, Frederic B; White, Frances V; Langston, Claire; Liptzin, Deborah R; Kroehl, Miranda E; Deutsch, Gail H; Young, Lisa R; Kurland, Geoffrey; Hagood, James; Dell, Sharon; Trapnell, Bruce C; Deterding, Robin R

    2015-10-01

    Children's Interstitial and Diffuse Lung Disease (chILD) is a heterogeneous group of disorders that is challenging to categorize. In a previous study, a classification scheme was successfully applied to children 0 to 2 years of age who underwent lung biopsies for chILD. This classification scheme has not been evaluated in children 2 to 18 years of age. This multicenter interdisciplinary study sought to describe the spectrum of biopsy-proven chILD in North America and to apply a previously reported classification scheme in children 2 to 18 years of age. Mortality and risk factors for mortality were also assessed. Patients 2 to 18 years of age who underwent lung biopsies for diffuse lung disease at 12 North American institutions were included. Demographic and clinical data were collected and described. The lung biopsies were reviewed by pediatric lung pathologists with expertise in diffuse lung disease and were classified by the chILD classification scheme. Logistic regression was used to determine risk factors for mortality. A total of 191 cases were included in the final analysis. The number of biopsies varied by center (5-49 biopsies; mean, 15.8), as did age at biopsy (2-18 yr; mean, 10.6 yr). The most common classification category in this cohort was Disorders of the Immunocompromised Host (40.8%), and the least common was Disorders of Infancy (4.7%). Immunocompromised patients suffered the highest mortality (52.8%). Additional associations with mortality included mechanical ventilation, worse clinical status at time of biopsy, tachypnea, hemoptysis, and crackles. Pulmonary hypertension was found to be a risk factor for mortality, but only in the immunocompetent patients. In patients 2 to 18 years of age who underwent lung biopsies for diffuse lung disease, there were far fewer diagnoses prevalent in infancy and more overlap with adult diagnoses. Immunocompromised patients with diffuse lung disease who underwent lung biopsies had less than 50% survival at time of last follow-up.

  2. The Semantic Management of Environmental Resources within the Interoperable Context of the EuroGEOSS: Alignment of GEMET and the GEOSS SBAs

    NASA Astrophysics Data System (ADS)

    Cialone, Claudia; Stock, Kristin

    2010-05-01

    EuroGEOSS is a European Commission funded project. It aims at improving a scientific understanding of the complex mechanisms which drive changes affecting our planet, identifying and establishing interoperable arrangements between environmental information systems. These systems would be sustained and operated by organizations with a clear mandate and resources and rendered available following the specifications of already existent frameworks such as GEOSS (the Global Earth Observation System of systems)1 and INSPIRE (the Infrastructure for Spatial Information in the European Community)2. The EuroGEOSS project's infrastructure focuses on three thematic areas: forestry, drought and biodiversity. One of the important activities in the project is the retrieval, parsing and harmonization of the large amount of heterogeneous environmental data available at local, regional and global levels between these strategic areas. The challenge is to render it semantically and technically interoperable in a simple way. An initial step in achieving this semantic and technical interoperability involves the selection of appropriate classification schemes (for example, thesauri, ontologies and controlled vocabularies) to describe the resources in the EuroGEOSS framework. These classifications become a crucial part of the interoperable framework scaffolding because they allow data providers to describe their resources and thus support resource discovery, execution and orchestration of varying levels of complexity. However, at present, given the diverse range of environmental thesauri, controlled vocabularies and ontologies and the large number of resources provided by project participants, the selection of appropriate classification schemes involves a number of considerations. First of all, there is the semantic difficulty of selecting classification schemes that contain concepts that are relevant to each thematic area. Secondly, EuroGEOSS is intended to accommodate a number of existing environmental projects (for example, GEOSS and INSPIRE). This requirement imposes constraints on the selection. Thirdly, the selected classification scheme or group of schemes (if more than one) must be capable of alignment (establishing different kinds of mappings between concepts, hence preserving intact the original knowledge schemes) or merging (the creation of another unique ontology from the original ontological sources) (Pérez-Gómez et al., 2004). Last but not least, there is the issue of including multi-lingual schemes that are based on free, open standards (non-proprietary). Using these selection criteria, we aim to support open and convenient data discovery and exchange for users who speak different languages (particularly the European ones for the broad scopes of EuroGEOSS). In order to support the project, we have developed a solution that employs two classification schemes: the Societal Benefit Areas (SBAs)3: the upper-level environmental categorization developed for the GEOSS project and the GEneral Multilingual Environmental Thesaurus (GEMET)4: a general environmental thesaurus whose conceptual structure has already been integrated with the spatial data themes proposed by the INSPIRE project. The former seems to provide the spatial data keywords relevant to the INSPIRE's Directive (JRC, 2008). In this way, we provide users with a basic set of concepts to support resource description and discovery in the thematic areas while supporting the requirements of INSPIRE and GEOSS. 
    Furthermore, the use of only two classification schemes, together with the fact that the SBAs are very general categories while GEMET includes much more detailed, yet still top-level, concepts, makes alignment an achievable task. Alignment was selected over merging because it leaves the existing classification schemes intact and requires only a simple activity of defining mappings from GEMET to the SBAs. In order to accomplish this task we are developing a simple, automated, open-source application to assist thematic experts in defining the mappings between concepts in the two classification schemes. The application will then generate SKOS mappings (exactMatch, closeMatch, broadMatch, narrowMatch, relatedMatch) based on thematic expert selections linking the concepts in GEMET with the SBAs (including both the general Societal Benefit Areas and their subcategories). Once these mappings are defined and the SKOS files generated, resource providers will be able to select concepts from either GEMET or the SBAs (or a mixture) to describe their resources, and discovery approaches will support selection of concepts from either classification scheme, also returning results classified using the other scheme. While the focus of our work has been on the SBAs and GEMET, we also plan to provide a method for resource providers to further extend the semantic infrastructure by defining alignments to new classification schemes if these are required to support particular specialized thematic areas that are not covered by GEMET. In this way, the approach is flexible and suited to the general scope of EuroGEOSS, allowing specialists to increase at will the level of semantic quality and specificity of data added to the initial infrastructural skeleton of the project. References: Joint Research Centre (JRC), 2008. INSPIRE Metadata Editor User Guide. Pérez-Gómez A., Fernandez-Lopez M., Corcho O. Ontological Engineering: With Examples from the Areas of Knowledge Management, e-Commerce and the Semantic Web. Springer: London, 2004.
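
    To make the SKOS mapping output concrete, the sketch below emits a single expert-selected alignment triple with rdflib; the GEMET and SBA concept URIs are hypothetical placeholders rather than the project's actual identifiers.

```python
# Sketch: serializing one SKOS alignment (e.g. skos:closeMatch) between a
# GEMET concept and a GEOSS SBA concept; the URIs below are placeholders.
from rdflib import Graph, URIRef
from rdflib.namespace import SKOS

g = Graph()
gemet_concept = URIRef("http://www.eionet.europa.eu/gemet/concept/example")
sba_concept = URIRef("http://example.org/geoss/sba/biodiversity")

g.add((gemet_concept, SKOS.closeMatch, sba_concept))  # expert-selected mapping
print(g.serialize(format="turtle"))
```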

  3. Sensitivity analysis of the GEMS soil organic carbon model to land cover land use classification uncertainties under different climate scenarios in Senegal

    USGS Publications Warehouse

    Dieye, A.M.; Roy, David P.; Hanan, N.P.; Liu, S.; Hansen, M.; Toure, A.

    2012-01-01

    Spatially explicit land cover land use (LCLU) change information is needed to drive biogeochemical models that simulate soil organic carbon (SOC) dynamics. Such information is increasingly being mapped using remotely sensed satellite data, with classification schemes and uncertainties constrained by the sensing system, classification algorithms and land cover schemes. In this study, automated LCLU classification of multi-temporal Landsat satellite data was used to assess the sensitivity of SOC modeled by the Global Ensemble Biogeochemical Modeling System (GEMS). The GEMS was run for an area of 1560 km2 in Senegal under three climate change scenarios, with LCLU maps generated using different Landsat classification approaches. This research provides a method to estimate the variability of SOC, specifically the SOC uncertainty due to satellite classification errors, which we show depends not only on the LCLU classification errors but also on where the LCLU classes occur relative to the other GEMS model inputs.

  4. Approach for a Clinically Useful Comprehensive Classification of Vascular and Neural Aspects of Diabetic Retinal Disease

    PubMed Central

    Abramoff, Michael D.; Fort, Patrice E.; Han, Ian C.; Jayasundera, K. Thiran; Sohn, Elliott H.; Gardner, Thomas W.

    2018-01-01

    The Early Treatment Diabetic Retinopathy Study (ETDRS) and other standardized classification schemes have laid a foundation for tremendous advances in the understanding and management of diabetic retinopathy (DR). However, technological advances in optics and image analysis, especially optical coherence tomography (OCT), OCT angiography (OCTa), and ultra-widefield imaging, as well as new discoveries in diabetic retinal neuropathy (DRN), are exposing the limitations of ETDRS and other classification systems to completely characterize retinal changes in diabetes, which we term diabetic retinal disease (DRD). While it may be most straightforward to add axes to existing classification schemes, as diabetic macular edema (DME) was added as an axis to earlier DR classifications, doing so may make these classifications increasingly complicated and thus clinically intractable. Therefore, we propose future research efforts to develop a new, comprehensive, and clinically useful classification system that will identify multimodal biomarkers to reflect the complex pathophysiology of DRD and accelerate the development of therapies to prevent vision-threatening DRD. PMID:29372250

  5. Approach for a Clinically Useful Comprehensive Classification of Vascular and Neural Aspects of Diabetic Retinal Disease.

    PubMed

    Abramoff, Michael D; Fort, Patrice E; Han, Ian C; Jayasundera, K Thiran; Sohn, Elliott H; Gardner, Thomas W

    2018-01-01

    The Early Treatment Diabetic Retinopathy Study (ETDRS) and other standardized classification schemes have laid a foundation for tremendous advances in the understanding and management of diabetic retinopathy (DR). However, technological advances in optics and image analysis, especially optical coherence tomography (OCT), OCT angiography (OCTa), and ultra-widefield imaging, as well as new discoveries in diabetic retinal neuropathy (DRN), are exposing the limitations of ETDRS and other classification systems to completely characterize retinal changes in diabetes, which we term diabetic retinal disease (DRD). While it may be most straightforward to add axes to existing classification schemes, as diabetic macular edema (DME) was added as an axis to earlier DR classifications, doing so may make these classifications increasingly complicated and thus clinically intractable. Therefore, we propose future research efforts to develop a new, comprehensive, and clinically useful classification system that will identify multimodal biomarkers to reflect the complex pathophysiology of DRD and accelerate the development of therapies to prevent vision-threatening DRD.

  6. Local Laplacian Coding From Theoretical Analysis of Local Coding Schemes for Locally Linear Classification.

    PubMed

    Pang, Junbiao; Qin, Lei; Zhang, Chunjie; Zhang, Weigang; Huang, Qingming; Yin, Baocai

    2015-12-01

    Local coordinate coding (LCC) is a framework for approximating a Lipschitz-smooth function by combining linear functions into a nonlinear one. For locally linear classification, LCC requires a coding scheme that largely determines the nonlinear approximation ability, posing two main challenges: 1) locality, so that faraway anchors have smaller influence on the current datum, and 2) flexibility, balancing the reconstruction of the current datum against locality. In this paper, we address the problem through theoretical analysis of the simplest local coding schemes, i.e., local Gaussian coding and local Student coding, and propose local Laplacian coding (LPC) to achieve both locality and flexibility. We apply LPC to locally linear classifiers to solve diverse classification tasks. Performance comparable to or exceeding that of state-of-the-art methods demonstrates the effectiveness of the proposed method.
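
    For orientation, the sketch below implements local Gaussian coding, the simplest of the local coding schemes analyzed above: each sample is coded by normalized Gaussian affinities to its nearest anchors. The anchor set, neighbourhood size, and bandwidth are assumptions; the paper's Laplacian variant and the downstream classifier are not reproduced.

```python
# Sketch: local Gaussian coding of a sample against a set of anchor points.
import numpy as np

def local_gaussian_code(x, anchors, k=5, sigma=1.0):
    """Code x by Gaussian weights on its k nearest anchors (others get 0)."""
    d2 = np.sum((anchors - x) ** 2, axis=1)
    nearest = np.argsort(d2)[:k]
    w = np.zeros(len(anchors))
    w[nearest] = np.exp(-d2[nearest] / (2.0 * sigma ** 2))
    return w / w.sum()

rng = np.random.default_rng(0)
anchors = rng.normal(size=(64, 10))      # e.g. k-means centroids
x = rng.normal(size=10)
code = local_gaussian_code(x, anchors)
print(code.nonzero()[0], code[code > 0].round(3))  # sparse, locality-preserving
```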

  7. Toward functional classification of neuronal types.

    PubMed

    Sharpee, Tatyana O

    2014-09-17

    How many types of neurons are there in the brain? This basic neuroscience question remains unsettled despite many decades of research. Classification schemes have been proposed based on anatomical, electrophysiological, or molecular properties. However, different schemes do not always agree with each other. This raises the question of whether one can classify neurons based on their function directly. For example, among sensory neurons, can a classification scheme be devised that is based on their role in encoding sensory stimuli? Here, theoretical arguments are outlined for how this can be achieved using information theory by looking at optimal numbers of cell types and paying attention to two key properties: correlations between inputs and noise in neural responses. This theoretical framework could help to map the hierarchical tree relating different neuronal classes within and across species. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Applications of the U.S. Geological Survey's global land cover product

    USGS Publications Warehouse

    Reed, B.

    1997-01-01

    The U.S. Geological Survey (USGS), in partnership with several international agencies and universities, has produced a global land cover characteristics database. The land cover data were created using multitemporal analysis of advanced very high resolution radiometer satellite images in conjunction with other existing geographic data. A translation table permits the conversion of the land cover classes into several conventional land cover schemes that are used by ecosystem modelers, climate modelers, land management agencies, and other user groups. The alternative classification schemes include Global Ecosystems, the Biosphere Atmosphere Transfer Scheme, the Simple Biosphere, the USGS Anderson Level 2, and the International Geosphere Biosphere Programme. The distribution system for these data is through the World Wide Web (the web site address is: http://edcwww.cr.usgs.gov/landdaac/glcc/glcc.html) or by magnetic media upon special request. The availability of the data over the World Wide Web, in conjunction with the flexible database structure, allows easy data access for a wide range of users. The web site contains a user registration form that allows analysis of the diverse applications of large-area land cover data. Currently, applications are divided among mapping (20 percent), conservation (30 percent), and modeling (35 percent).

  9. Applying graphics user interface to group technology classification and coding at the Boeing Aerospace Company

    NASA Astrophysics Data System (ADS)

    Ness, P. H.; Jacobson, H.

    1984-10-01

    The thrust of 'group technology' is toward the exploitation of similarities in component design and manufacturing process plans to achieve assembly line flow cost efficiencies for small batch production. The systematic method devised for the identification of similarities in component geometry and processing steps is a coding and classification scheme implemented by interactive CAD/CAM systems. This coding and classification scheme has led to significant increases in computer processing power, allowing rapid searches and retrievals on the basis of a 30-digit code together with user-friendly computer graphics.

  10. Mental Task Classification Scheme Utilizing Correlation Coefficient Extracted from Interchannel Intrinsic Mode Function.

    PubMed

    Rahman, Md Mostafizur; Fattah, Shaikh Anowarul

    2017-01-01

    In view of the recent increase in brain-computer interface (BCI) based applications, the importance of efficient classification of various mental tasks has increased greatly. In order to obtain effective classification, an efficient feature extraction scheme is necessary; in the proposed method, the interchannel relationship among electroencephalogram (EEG) data is utilized for this purpose. It is expected that the correlation obtained from different combinations of channels will differ across mental tasks, which can be exploited to extract distinctive features. The empirical mode decomposition (EMD) technique is employed on a test EEG signal obtained from a channel, which provides a number of intrinsic mode functions (IMFs), and correlation coefficients are extracted from interchannel IMF data. Simultaneously, different statistical features are also obtained from each IMF. Finally, the feature matrix is formed utilizing interchannel correlation features and intrachannel statistical features of the selected IMFs of the EEG signal. Different kernels of the support vector machine (SVM) classifier are used to carry out the classification task. An EEG dataset containing ten different combinations of five different mental tasks is utilized to demonstrate the classification performance, and a very high level of accuracy is achieved by the proposed scheme compared to existing methods.
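
    A rough sketch of the feature construction is given below: empirical mode decomposition of two channels, correlation coefficients between corresponding IMFs, plus simple per-IMF statistics. It assumes the third-party PyEMD package (installed as EMD-signal) and uses synthetic two-channel signals; the actual EEG data, channel pairing, and SVM training are not reproduced.

```python
# Sketch: interchannel IMF correlation features plus intrachannel statistics,
# on synthetic two-channel signals (assumes PyEMD; pip install EMD-signal).
import numpy as np
from PyEMD import EMD

t = np.arange(0, 4, 1 / 128)                      # 4 s at 128 Hz
rng = np.random.default_rng(0)
ch1 = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)
ch2 = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * rng.normal(size=t.size)

imfs1 = EMD().emd(ch1)
imfs2 = EMD().emd(ch2)

n = min(3, len(imfs1), len(imfs2))                # use the first few IMFs
features = [np.corrcoef(imfs1[i], imfs2[i])[0, 1] for i in range(n)]
features += [float(np.std(imfs1[i])) for i in range(n)]  # simple statistics
print(np.round(features, 3))                      # vector for an SVM classifier
```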

  11. Comparative efficiency of a scheme of cyclic alternating-period subtraction

    NASA Astrophysics Data System (ADS)

    Golikov, V. S.; Artemenko, I. G.; Malinin, A. P.

    1986-06-01

    The estimation of the detection quality of a signal against a background of correlated noise according to the Neyman-Pearson criterion is examined. It is shown that, in a number of cases, the cyclic alternating-period subtraction scheme has a higher noise immunity than the conventional alternating-period subtraction scheme.

  12. Reconciling Mining with the Conservation of Cave Biodiversity: A Quantitative Baseline to Help Establish Conservation Priorities.

    PubMed

    Jaffé, Rodolfo; Prous, Xavier; Zampaulo, Robson; Giannini, Tereza C; Imperatriz-Fonseca, Vera L; Maurity, Clóvis; Oliveira, Guilherme; Brandi, Iuri V; Siqueira, José O

    2016-01-01

    Caves pose significant challenges for mining projects, since they harbor many endemic and threatened species, and must therefore be protected. Recent discussions between academia, environmental protection agencies, and industry partners, have highlighted problems with the current Brazilian legislation for the protection of caves. While the licensing process is long, complex and cumbersome, the criteria used to assign caves into conservation relevance categories are often subjective, with relevance being mainly determined by the presence of obligate cave dwellers (troglobites) and their presumed rarity. However, the rarity of these troglobitic species is questionable, as most remain unidentified to the species level and their habitats and distribution ranges are poorly known. Using data from 844 iron caves retrieved from different speleology reports for the Carajás region (South-Eastern Amazon, Brazil), one of the world's largest deposits of high-grade iron ore, we assess the influence of different cave characteristics on four biodiversity proxies (species richness, presence of troglobites, presence of rare troglobites, and presence of resident bat populations). We then examine how the current relevance classification scheme ranks caves with different biodiversity indicators. Large caves were found to be important reservoirs of biodiversity, so they should be prioritized in conservation programs. Our results also reveal spatial autocorrelation in all the biodiversity proxies assessed, indicating that iron caves should be treated as components of a cave network immersed in the karst landscape. Finally, we show that by prioritizing the conservation of rare troglobites, the current relevance classification scheme is undermining overall cave biodiversity and leaving ecologically important caves unprotected. We argue that conservation efforts should target subterranean habitats as a whole and propose an alternative relevance ranking scheme, which could help simplify the assessment process and channel more resources to the effective protection of overall cave biodiversity.

  13. Assessment of Metronidazole Susceptibility in Helicobacter pylori: Statistical Validation and Error Rate Analysis of Breakpoints Determined by the Disk Diffusion Test

    PubMed Central

    Chaves, Sandra; Gadanho, Mário; Tenreiro, Rogério; Cabrita, José

    1999-01-01

    Metronidazole susceptibility of 100 Helicobacter pylori strains was assessed by determining the inhibition zone diameters by disk diffusion test and the MICs by agar dilution and PDM Epsilometer test (E test). Linear regression analysis was performed, allowing the definition of significant linear relations, and revealed correlations of disk diffusion results with both E-test and agar dilution results (r2 = 0.88 and 0.81, respectively). No significant differences (P = 0.84) were found between MICs defined by E test and those defined by agar dilution, taken as a standard. Reproducibility comparison between E-test and disk diffusion tests showed that they are equivalent and with good precision. Two interpretative susceptibility schemes (with or without an intermediate class) were compared by an interpretative error rate analysis method. The susceptibility classification scheme that included the intermediate category was retained, and breakpoints were assessed for diffusion assay with 5-μg metronidazole disks. Strains with inhibition zone diameters less than 16 mm were defined as resistant (MIC > 8 μg/ml), those with zone diameters equal to or greater than 16 mm but less than 21 mm were considered intermediate (4 μg/ml < MIC ≤ 8 μg/ml), and those with zone diameters of 21 mm or greater were regarded as susceptible (MIC ≤ 4 μg/ml). Error rate analysis applied to this classification scheme showed occurrence frequencies of 1% for major errors and 7% for minor errors, when the results were compared to those obtained by agar dilution. No very major errors were detected, suggesting that disk diffusion might be a good alternative for determining the metronidazole sensitivity of H. pylori strains. PMID:10203543
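
    The breakpoints stated above translate directly into a small decision rule; the sketch below applies them to a few illustrative zone diameters (the diameters themselves are invented examples).

```python
# Sketch: applying the reported zone-diameter breakpoints for 5-microgram
# metronidazole disks (the diameters below are illustrative values).
def classify_zone(diameter_mm: float) -> str:
    """Map an inhibition zone diameter to a susceptibility category."""
    if diameter_mm < 16:
        return "resistant"        # MIC > 8 ug/ml
    if diameter_mm < 21:
        return "intermediate"     # 4 ug/ml < MIC <= 8 ug/ml
    return "susceptible"          # MIC <= 4 ug/ml

for d in (12.0, 17.5, 24.0):
    print(d, "mm ->", classify_zone(d))
```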

  14. Chondrule formation, metamorphism, brecciation, an important new primary chondrule group, and the classification of chondrules

    NASA Technical Reports Server (NTRS)

    Sears, Derek W. G.; Shaoxiong, Huang; Benoit, Paul H.

    1995-01-01

    The recently proposed compositional classification scheme for meteoritic chondrules divides the chondrules into groups depending on the composition of their two major phases, olivine (or pyroxene) and the mesostasis, both of which are genetically important. The scheme is here applied to discussions of three topics: the petrographic classification of Roosevelt County 075 (the least-metamorphosed H chondrite known), brecciation (an extremely important and ubiquitous process probably experienced by greater than 40% of all unequilibrated ordinary chondrites), and the group A5 chondrules in the least metamorphosed ordinary chondrites, which have many similarities to chondrules in the highly metamorphosed 'equilibrated' chondrites. Since composition provides insights into both the primary formation properties of the chondrules and the effects of metamorphism on the entire assemblage, it is possible to determine the petrographic type of RC075 as 3.1 with unique certainty. Similarly, the new scheme can be applied to individual chondrules without knowledge of the petrographic type of the host chondrite, which makes it especially suitable for studying breccias. Finally, the new scheme has revealed the existence of chondrules not identified by previous techniques and which appear to be extremely important. Like group A1 and A2 chondrules (but unlike group B1 chondrules), the primitive group A5 chondrules did not supercool during formation, but unlike group A1 and A2 chondrules (and like group B1 chondrules) they did not suffer volatile loss and reduction during formation. It is concluded that the compositional classification scheme provides important new insights into the formation and history of chondrules and chondrites which would be overlooked by previous schemes.

  15. Carnegie's New Community Engagement Classification: Affirming Higher Education's Role in Community

    ERIC Educational Resources Information Center

    Driscoll, Amy

    2009-01-01

    In 2005, the Carnegie Foundation for the Advancement of Teaching (CFAT) stirred the higher education world with the announcement of a new classification for institutions that engage with community. The classification, community engagement, is the first in a set of planned classification schemes resulting from the foundation's reexamination of the…

  16. Automated reuseable components system study results

    NASA Technical Reports Server (NTRS)

    Gilroy, Kathy

    1989-01-01

    The Automated Reusable Components System (ARCS) was developed under a Phase 1 Small Business Innovative Research (SBIR) contract for the U.S. Army CECOM. The objectives of the ARCS program were: (1) to investigate issues associated with automated reuse of software components, identify alternative approaches, and select promising technologies, and (2) to develop tools that support component classification and retrieval. The approach followed was to research emerging techniques and experimental applications associated with reusable software libraries, to investigate the more mature information retrieval technologies for applicability, and to investigate the applicability of specialized technologies to improve the effectiveness of a reusable component library. Various classification schemes and retrieval techniques were identified and evaluated for potential application in an automated library system for reusable components. Strategies for library organization and management, component submittal and storage, and component search and retrieval were developed. A prototype ARCS was built to demonstrate the feasibility of automating the reuse process. The prototype was created using a subset of the classification and retrieval techniques that were investigated. The demonstration system was exercised and evaluated using reusable Ada components selected from the public domain. A requirements specification for a production-quality ARCS was also developed.

  17. Dental panoramic image analysis for enhancement biomarker of mandibular condyle for osteoporosis early detection

    NASA Astrophysics Data System (ADS)

    Suprijanto; Azhari; Juliastuti, E.; Septyvergy, A.; Setyagar, N. P. P.

    2016-03-01

    Osteoporosis is a degenerative disease characterized by low Bone Mineral Density (BMD). Currently, BMD level is determined by Dual Energy X-ray Absorptiometry (DXA) at the lumbar vertebrae and femur. Previous studies reported that the dental panoramic radiography image has potential information for early osteoporosis detection. This work reports an alternative scheme that consists of determining a Region of Interest (ROI) around the mandibular condyle in the image as a biomarker, extracting features from the ROI, and classifying bone condition. The minimum intensity value in the cavity area is used to compensate for an offset in the ROI. For feature extraction, the fraction of intensity values in the ROI that represent high bone density and the total ROI area are used. Classification is evaluated from the ability of each feature, and their combination, to detect BMD in two classes (normal and abnormal) with the artificial neural network method. The evaluation used 105 panoramic images from menopausal women, divided into 36 training and 69 test images across the two classes. The two-class classification achieved an accuracy of 88.0% and a sensitivity of 88.0%.
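
    As a rough sketch of the two ROI features and the neural-network step, the example below computes a high-intensity fraction and total area from synthetic ROIs and trains a small classifier; the intensity threshold, ROI sizes, and class separation are assumptions, not values from the study.

```python
# Sketch: ROI-derived features (dense-pixel fraction, total area) feeding a
# small neural network; synthetic ROIs stand in for condyle crops.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

def roi_features(roi, threshold=0.6):
    dense_fraction = float(np.mean(roi > threshold))  # proxy for bone density
    return [dense_fraction, float(roi.size)]          # plus total ROI area

X, y = [], []
for label in (0, 1):                                  # 0 = normal, 1 = abnormal
    for _ in range(60):
        side = int(rng.integers(30, 50))
        roi = rng.normal(0.7 - 0.2 * label, 0.15, size=(side, side)).clip(0, 1)
        X.append(roi_features(roi))
        y.append(label)

X_tr, X_te, y_tr, y_te = train_test_split(np.array(X), np.array(y),
                                          stratify=y, random_state=0)
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                  random_state=0))
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2%}")
```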

  18. A new classification of glaucomas

    PubMed Central

    Bordeianu, Constantin-Dan

    2014-01-01

    Purpose To suggest a new glaucoma classification that is pathogenic, etiologic, and clinical. Methods After discussing the logical pathway used in criteria selection, the paper presents the new classification and compares it with the classification currently in use, that is, the one issued by the European Glaucoma Society in 2008. Results The paper proves that the new classification is clear (being based on a coherent and consistently followed set of criteria), is comprehensive (framing all forms of glaucoma), and helps in understanding the sickness (in that it uses a logical framing system). The great advantage is that it facilitates therapeutic decision making in that it offers direct therapeutic suggestions and avoids errors leading to disasters. Moreover, the scheme remains open to any new development. Conclusion The suggested classification is a pathogenic, etiologic, and clinical classification that fulfills the conditions of an ideal classification. The suggested classification is the first classification in which the main criterion is consistently used for the first 5 to 7 crossings until its differentiation capabilities are exhausted. Then, secondary criteria (etiologic and clinical) pick up the relay until each form finds its logical place in the scheme. In order to avoid unclear aspects, the genetic criterion is no longer used, being replaced by age, one of the clinical criteria. The suggested classification brings only benefits to all categories of ophthalmologists: the beginners will have a tool to better understand the sickness and to ease their decision making, whereas the experienced doctors will have their practice simplified. For all doctors, errors leading to therapeutic disasters will be less likely to happen. Finally, researchers will have the object of their work gathered in the group of glaucoma with unknown or uncertain pathogenesis, whereas the results of their work will easily find a logical place in the scheme, as the suggested classification remains open to any new development. PMID:25246759

  19. Classification for Estuarine Ecosystems: A Review and Comparison of Selected Classification Schemes

    EPA Science Inventory

    Estuarine scientists have devoted considerable effort to classifying coastal, estuarine and marine environments and their watersheds, for a variety of purposes. These classifications group systems with similarities – most often in physical and hydrodynamic properties – in order ...

  20. Comprehensive 4-stage categorization of bicuspid aortic valve leaflet morphology by cardiac MRI in 386 patients.

    PubMed

    Murphy, I G; Collins, J; Powell, A; Markl, M; McCarthy, P; Malaisrie, S C; Carr, J C; Barker, A J

    2017-08-01

    Bicuspid aortic valve (BAV) disease is heterogeneous and related to valve dysfunction and aortopathy. Appropriate follow up and surveillance of patients with BAV may depend on correct phenotypic categorization. There are multiple classification schemes; however, a need exists to comprehensively capture commissure fusion, leaflet asymmetry, and valve orifice orientation. Our aim was to develop a BAV classification scheme for use at MRI to ascertain the frequency of different phenotypes and the consistency of BAV classification. The BAV classification scheme builds on the Sievers surgical BAV classification, adding valve orifice orientation, partial leaflet fusion and leaflet asymmetry. A single observer successfully applied this classification to 386 of 398 Cardiac MRI studies. Repeatability of categorization was ascertained with intraobserver and interobserver kappa scores. Sensitivity and specificity of MRI findings were determined from operative reports, where available. Fusion of the right and left leaflets accounted for over half of all cases. Partial leaflet fusion was seen in 46% of patients. Good interobserver agreement was seen for orientation of the valve opening (κ = 0.90), type (κ = 0.72) and presence of partial fusion (κ = 0.83, p < 0.0001). Retrospective review of operative notes showed sensitivity and specificity for orientation (90%, 93%) and for Sievers type (73%, 87%). The proposed BAV classification schema was assessed by MRI for its reliability to classify valve morphology in addition to illustrating the wide heterogeneity of leaflet size, orifice orientation, and commissural fusion. The classification may be helpful in further understanding the relationship between valve morphology, flow derangement and aortopathy.
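
    Interobserver repeatability of a categorical valve classification, as quoted above, can be quantified with Cohen's kappa. A minimal sketch follows; the two readers' category labels are hypothetical placeholders.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical leaflet-fusion categories assigned by two readers to the same ten valves.
reader_1 = ["RL", "RL", "RN", "LN", "RL", "RN", "RL", "LN", "RL", "RN"]
reader_2 = ["RL", "RL", "RN", "RL", "RL", "RN", "RL", "LN", "RN", "RN"]

print(f"interobserver kappa = {cohen_kappa_score(reader_1, reader_2):.2f}")
```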

  1. Centrifuge: rapid and sensitive classification of metagenomic sequences

    PubMed Central

    Song, Li; Breitwieser, Florian P.

    2016-01-01

    Centrifuge is a novel microbial classification engine that enables rapid, accurate, and sensitive labeling of reads and quantification of species on desktop computers. The system uses an indexing scheme based on the Burrows-Wheeler transform (BWT) and the Ferragina-Manzini (FM) index, optimized specifically for the metagenomic classification problem. Centrifuge requires a relatively small index (4.2 GB for 4078 bacterial and 200 archaeal genomes) and classifies sequences at very high speed, allowing it to process the millions of reads from a typical high-throughput DNA sequencing run within a few minutes. Together, these advances enable timely and accurate analysis of large metagenomics data sets on conventional desktop computers. Because of its space-optimized indexing schemes, Centrifuge also makes it possible to index the entire NCBI nonredundant nucleotide sequence database (a total of 109 billion bases) with an index size of 69 GB, in contrast to k-mer-based indexing schemes, which require far more extensive space. PMID:27852649
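
    The indexing idea behind Centrifuge, a Burrows-Wheeler transform plus an FM-index supporting backward search, can be illustrated with a minimal sketch. This toy version builds the BWT by sorting rotations and counts exact occurrences of a short query in a reference string; it is not Centrifuge's space-optimized implementation.

```python
def bwt(text):
    """Burrows-Wheeler transform via full rotation sort (fine for short strings)."""
    text += "$"                                    # unique terminator
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return "".join(r[-1] for r in rotations)

def fm_index(bwt_str):
    """Build the C[] table and per-prefix occurrence counts used by backward search."""
    counts, running, occ = {}, {}, []
    for ch in bwt_str:
        occ.append(dict(running))                  # occ[i][c] = count of c in bwt_str[:i]
        running[ch] = running.get(ch, 0) + 1
        counts[ch] = counts.get(ch, 0) + 1
    occ.append(dict(running))
    c_table, total = {}, 0
    for ch in sorted(counts):
        c_table[ch] = total                        # number of symbols smaller than ch
        total += counts[ch]
    return c_table, occ

def count_occurrences(query, bwt_str):
    """Backward search: how many times 'query' occurs in the original text."""
    c_table, occ = fm_index(bwt_str)
    lo, hi = 0, len(bwt_str)
    for ch in reversed(query):
        if ch not in c_table:
            return 0
        lo = c_table[ch] + occ[lo].get(ch, 0)
        hi = c_table[ch] + occ[hi].get(ch, 0)
        if lo >= hi:
            return 0
    return hi - lo

reference = "ACGTACGTACGA"
print(count_occurrences("ACGT", bwt(reference)))   # -> 2
```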

  2. Modern radiosurgical and endovascular classification schemes for brain arteriovenous malformations.

    PubMed

    Tayebi Meybodi, Ali; Lawton, Michael T

    2018-05-04

    Stereotactic radiosurgery (SRS) and endovascular techniques are commonly used for treating brain arteriovenous malformations (bAVMs). They are usually used as ancillary techniques to microsurgery but may also be used as solitary treatment options. Careful patient selection requires a clear estimate of the treatment efficacy and complication rates for the individual patient. As such, classification schemes are an essential part of patient selection paradigm for each treatment modality. While the Spetzler-Martin grading system and its subsequent modifications are commonly used for microsurgical outcome prediction for bAVMs, the same system(s) may not be easily applicable to SRS and endovascular therapy. Several radiosurgical- and endovascular-based grading scales have been proposed for bAVMs. However, a comprehensive review of these systems including a discussion on their relative advantages and disadvantages is missing. This paper is dedicated to modern classification schemes designed for SRS and endovascular techniques.

  3. Stokes space modulation format classification based on non-iterative clustering algorithm for coherent optical receivers.

    PubMed

    Mai, Xiaofeng; Liu, Jie; Wu, Xiong; Zhang, Qun; Guo, Changjian; Yang, Yanfu; Li, Zhaohui

    2017-02-06

    A Stokes-space modulation format classification (MFC) technique is proposed for coherent optical receivers by using a non-iterative clustering algorithm. In the clustering algorithm, two simple parameters are calculated to help find the density peaks of the data points in Stokes space and no iteration is required. Correct MFC can be realized in numerical simulations among PM-QPSK, PM-8QAM, PM-16QAM, PM-32QAM and PM-64QAM signals within practical optical signal-to-noise ratio (OSNR) ranges. The performance of the proposed MFC algorithm is also compared with those of other schemes based on clustering algorithms. The simulation results show that good classification performance can be achieved using the proposed MFC scheme with moderate time complexity. Proof-of-concept experiments are finally implemented to demonstrate MFC among PM-QPSK/16QAM/64QAM signals, which confirm the feasibility of our proposed MFC scheme.
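
    The non-iterative clustering step is in the spirit of density-peak clustering: for each sample, compute a local density and the distance to the nearest sample of higher density, and treat samples where both values are large as cluster centres (one centre per constellation point in Stokes space). The sketch below uses 2-D Gaussian blobs as stand-ins for Stokes-space samples; the kernel width and the peak-selection threshold are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def density_peaks(points, dc=0.5):
    """Density-peak statistics (Rodriguez & Laio style), no iteration needed.

    rho is a Gaussian-kernel local density; delta is the distance to the nearest
    point of higher density (or the largest distance, for the densest point).
    Cluster centres are points where both rho and delta are large.
    """
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    rho = np.exp(-(d / dc) ** 2).sum(axis=1) - 1.0
    delta = np.empty(len(points))
    for i in range(len(points)):
        higher = np.where(rho > rho[i])[0]
        delta[i] = d[i].max() if higher.size == 0 else d[i, higher].min()
    return rho, delta

# Toy stand-in for Stokes-space samples: three well-separated blobs.
rng = np.random.default_rng(1)
centres = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])
pts = np.vstack([c + 0.2 * rng.standard_normal((50, 2)) for c in centres])

rho, delta = density_peaks(pts)
peak_score = rho * delta                    # large only for cluster centres
n_clusters = int((peak_score > 0.5 * peak_score.max()).sum())
print("estimated number of clusters:", n_clusters)
```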

  4. Towards biological plausibility of electronic noses: A spiking neural network based approach for tea odour classification.

    PubMed

    Sarkar, Sankho Turjo; Bhondekar, Amol P; Macaš, Martin; Kumar, Ritesh; Kaur, Rishemjit; Sharma, Anupma; Gulati, Ashu; Kumar, Amod

    2015-11-01

    The paper presents a novel encoding scheme for neuronal code generation for odour recognition using an electronic nose (EN). This scheme is based on channel encoding using multiple Gaussian receptive fields superimposed over the temporal EN responses. The encoded data are further applied to a spiking neural network (SNN) for pattern classification. Two forms of SNN, a back-propagation-based SpikeProp and a dynamic evolving SNN, are used to learn the encoded responses. The effects of information encoding on the performance of SNNs have been investigated. Statistical tests have been performed to determine the contribution of the SNN and the encoding scheme to overall odour discrimination. The approach has been implemented in odour classification of orthodox black tea (Kangra-Himachal Pradesh Region), thereby demonstrating a biomimetic approach for EN data analysis. Copyright © 2015 Elsevier Ltd. All rights reserved.
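
    The channel-encoding idea, converting an analog sensor reading into early or late spike times through a bank of Gaussian receptive fields, can be sketched as follows. The number of fields, their centres and width, and the time window are illustrative assumptions rather than the settings used in the paper.

```python
import numpy as np

def grf_encode(value, n_fields=6, v_min=0.0, v_max=1.0, t_max=10.0):
    """Encode a scalar sensor reading into spike times via Gaussian receptive fields.

    Each neuron has a Gaussian tuning curve over [v_min, v_max]; a strong response
    (value near the centre) produces an early spike, a weak response a late one.
    The width follows a common population-coding heuristic (an assumption here).
    """
    centres = np.linspace(v_min, v_max, n_fields)
    width = (v_max - v_min) / (1.5 * (n_fields - 1))
    activation = np.exp(-0.5 * ((value - centres) / width) ** 2)   # in (0, 1]
    return t_max * (1.0 - activation)          # high activation -> early spike time

# One temporal sample of a normalized electronic-nose channel response.
print(np.round(grf_encode(0.37), 2))
```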

  5. Error analysis of filtering operations in pixel-duplicated images of diabetic retinopathy

    NASA Astrophysics Data System (ADS)

    Mehrubeoglu, Mehrube; McLauchlan, Lifford

    2010-08-01

    In this paper, diabetic retinopathy is chosen as a sample target image to demonstrate the effectiveness of image enlargement through pixel duplication in identifying regions of interest. Pixel duplication is presented as a simpler alternative to data interpolation techniques for detecting small structures in the images. A comparative analysis is performed on different image processing schemes applied to both original and pixel-duplicated images. Structures of interest are detected and classification parameters are optimized for minimum false positive detection in the original and enlarged retinal pictures. The error analysis demonstrates the advantages as well as the shortcomings of pixel duplication in image enhancement when spatial averaging operations (smoothing filters) are also applied.
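
    Pixel duplication itself is a simple integer up-scaling without interpolation; a minimal sketch, combined with the kind of smoothing (spatial averaging) filter discussed above, is shown below. The toy image and the filter size are placeholders.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def duplicate_pixels(image, factor=2):
    """Enlarge an image by integer pixel duplication (no interpolation)."""
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

img = np.array([[0.1, 0.9],
                [0.4, 0.2]])
enlarged = duplicate_pixels(img, factor=3)   # 2x2 -> 6x6
smoothed = uniform_filter(enlarged, size=3)  # 3x3 spatial averaging (smoothing) filter
print(enlarged.shape, smoothed.shape)
```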

  6. Causation and Validation of Nursing Diagnoses: A Middle Range Theory.

    PubMed

    de Oliveira Lopes, Marcos Venícios; da Silva, Viviane Martins; Herdman, T Heather

    2017-01-01

    To describe a predictive middle range theory (MRT) that provides a process for validation and incorporation of nursing diagnoses in clinical practice. Literature review. The MRT includes definitions, a pictorial scheme, propositions, causal relationships, and translation to nursing practice. The MRT can be a useful alternative for education, research, and translation of this knowledge into practice. This MRT can assist clinicians in understanding clinical reasoning, based on temporal logic and spectral interaction among elements of nursing classifications. In turn, this understanding will improve the use and accuracy of nursing diagnosis, which is a critical component of the nursing process that forms a basis for nursing practice standards worldwide. © 2015 NANDA International, Inc.

  7. 10 CFR 61.58 - Alternative requirements for waste classification and characteristics.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... LAND DISPOSAL OF RADIOACTIVE WASTE Technical Requirements for Land Disposal Facilities § 61.58 Alternative requirements for waste classification and characteristics. The Commission may, upon request or on... 10 Energy 2 2014-01-01 2014-01-01 false Alternative requirements for waste classification and...

  8. 10 CFR 61.58 - Alternative requirements for waste classification and characteristics.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... LAND DISPOSAL OF RADIOACTIVE WASTE Technical Requirements for Land Disposal Facilities § 61.58 Alternative requirements for waste classification and characteristics. The Commission may, upon request or on... 10 Energy 2 2012-01-01 2012-01-01 false Alternative requirements for waste classification and...

  9. 10 CFR 61.58 - Alternative requirements for waste classification and characteristics.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... LAND DISPOSAL OF RADIOACTIVE WASTE Technical Requirements for Land Disposal Facilities § 61.58 Alternative requirements for waste classification and characteristics. The Commission may, upon request or on... 10 Energy 2 2010-01-01 2010-01-01 false Alternative requirements for waste classification and...

  10. 10 CFR 61.58 - Alternative requirements for waste classification and characteristics.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... LAND DISPOSAL OF RADIOACTIVE WASTE Technical Requirements for Land Disposal Facilities § 61.58 Alternative requirements for waste classification and characteristics. The Commission may, upon request or on... 10 Energy 2 2013-01-01 2013-01-01 false Alternative requirements for waste classification and...

  11. 10 CFR 61.58 - Alternative requirements for waste classification and characteristics.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... LAND DISPOSAL OF RADIOACTIVE WASTE Technical Requirements for Land Disposal Facilities § 61.58 Alternative requirements for waste classification and characteristics. The Commission may, upon request or on... 10 Energy 2 2011-01-01 2011-01-01 false Alternative requirements for waste classification and...

  12. Addition of Histology to the Paris Classification of Pediatric Crohn Disease Alters Classification of Disease Location.

    PubMed

    Fernandes, Melissa A; Verstraete, Sofia G; Garnett, Elizabeth A; Heyman, Melvin B

    2016-02-01

    The aim of the study was to investigate the value of microscopic findings in the classification of pediatric Crohn disease (CD) by determining whether classification of disease changes significantly with inclusion of histologic findings. Sixty patients were randomly selected from a cohort of patients studied at the Pediatric Inflammatory Bowel Disease Clinic at the University of California, San Francisco Benioff Children's Hospital. Two physicians independently reviewed the electronic health records of the included patients to determine the Paris classification for each patient by adhering to present guidelines and then by including microscopic findings. Macroscopic and combined disease location classifications were discordant in 34 (56.6%), with no statistically significant differences between groups. Interobserver agreement was higher in the combined classification (κ = 0.73, 95% confidence interval 0.65-0.82) as opposed to when classification was limited to macroscopic findings (κ = 0.53, 95% confidence interval 0.40-0.58). When evaluating the proximal upper gastrointestinal tract (Paris L4a), the interobserver agreement was better in macroscopic compared with the combined classification. Disease extent classifications differed significantly when comparing isolated macroscopic findings (Paris classification) with the combined scheme that included microscopy. Further studies are needed to determine which scheme provides more accurate representation of disease extent.

  13. The search for structure - Object classification in large data sets. [for astronomers

    NASA Technical Reports Server (NTRS)

    Kurtz, Michael J.

    1988-01-01

    Research concerning object classification schemes is reviewed, focusing on large data sets. Classification techniques are discussed, including syntactic and decision-theoretic methods, fuzzy techniques, and stochastic and fuzzy grammars. Consideration is given to the automation of MK classification (Morgan and Keenan, 1973) and other problems associated with the classification of spectra. In addition, the classification of galaxies is examined, including the problems of systematic errors, blended objects, galaxy types, and galaxy clusters.

  14. A Three-Phase Decision Model of Computer-Aided Coding for the Iranian Classification of Health Interventions (IRCHI)

    PubMed Central

    Azadmanjir, Zahra; Safdari, Reza; Ghazisaeedi, Marjan; Mokhtaran, Mehrshad; Kameli, Mohammad Esmail

    2017-01-01

    Introduction: Accurately coded data in healthcare are critical. Computer-Assisted Coding (CAC) is an effective tool for improving clinical coding, particularly when a new classification is being developed and implemented. However, determining the appropriate development method requires considering the specifications of existing CAC systems, the requirements of each type, the available infrastructure and the classification scheme itself. Aim: The aim of the study was to develop a decision model for determining the accurate code of each medical intervention in the Iranian Classification of Health Interventions (IRCHI) that can be implemented as a suitable CAC system. Methods: First, a sample of existing CAC systems was reviewed. Then, the feasibility of each CAC type was examined with regard to the prerequisites for its implementation. Next, a model was proposed according to the structure of the classification scheme and implemented as an interactive system. Results: There is a significant relationship between the level of assistance of a CAC system and its integration with electronic medical documents. Implementation of fully automated CAC systems is currently impossible due to the immature development of electronic medical records and problems with the language used in medical documentation. Therefore, a model was proposed for a semi-automated CAC system based on the hierarchical relationships between entities in the classification scheme and on decision-making logic that specifies the characters of the code step by step through a web-based interactive user interface. The model is composed of three phases that select the Target, Action and Means of an intervention, respectively. Conclusion: The proposed model suits the current status of clinical documentation and coding in Iran as well as the structure of the new classification scheme. Our results show that it is practical. However, the model needs to be evaluated in the next stage of the research. PMID:28883671

  15. Classification and reduction of pilot error

    NASA Technical Reports Server (NTRS)

    Rogers, W. H.; Logan, A. L.; Boley, G. D.

    1989-01-01

    Human error is a primary or contributing factor in about two-thirds of commercial aviation accidents worldwide. With the ultimate goal of reducing pilot error accidents, this contract effort is aimed at understanding the factors underlying error events and reducing the probability of certain types of errors by modifying underlying factors such as flight deck design and procedures. A review of the literature relevant to error classification was conducted. Classification includes categorizing types of errors, the information processing mechanisms and factors underlying them, and identifying factor-mechanism-error relationships. The classification scheme developed by Jens Rasmussen was adopted because it provided a comprehensive yet basic error classification shell or structure that could easily accommodate the addition of details on domain-specific factors. For these purposes, factors specific to the aviation environment were incorporated. Hypotheses concerning the relationships among a small number of underlying factors, information processing mechanisms, and the error types identified in the classification scheme were formulated. ASRS data were reviewed and a simulation experiment was performed to evaluate and quantify the hypotheses.

  16. A Visual Basic program to plot sediment grain-size data on ternary diagrams

    USGS Publications Warehouse

    Poppe, L.J.; Eliason, A.H.

    2008-01-01

    Sedimentologic datasets are typically large and compiled into tables or databases, but pure numerical information can be difficult to understand and interpret. Thus, scientists commonly use graphical representations to reduce complexities, recognize trends and patterns in the data, and develop hypotheses. Of the graphical techniques, one of the most common methods used by sedimentologists is to plot the basic gravel, sand, silt, and clay percentages on equilateral triangular diagrams. This means of presenting data is simple and facilitates rapid classification of sediments and comparison of samples. The original classification scheme developed by Shepard (1954) used a single ternary diagram with sand, silt, and clay in the corners and 10 categories to graphically show the relative proportions among these three grades within a sample. This scheme, however, did not allow for sediments with significant amounts of gravel. Therefore, Shepard's classification scheme was later modified by the addition of a second ternary diagram with two categories to account for gravel and gravelly sediment (Schlee, 1973). The system devised by Folk (1954, 1974) is also based on two triangular diagrams, but it has 21 categories and uses the term mud (defined as silt plus clay). Patterns within the triangles of both systems differ, as does the emphasis placed on gravel. For example, in the system described by Shepard, gravelly sediments have more than 10% gravel; in Folk's system, slightly gravelly sediments have as little as 0.01% gravel. Folk's classification scheme stresses gravel because its concentration is a function of the highest current velocity at the time of deposition, as is the maximum grain size of the available detritus; Shepard's classification scheme emphasizes the ratios of sand, silt, and clay because they reflect sorting and reworking (Poppe et al., 2005). The program described herein (SEDPLOT) generates verbal equivalents and ternary diagrams to characterize sediment grain-size distributions. It is written in Microsoft Visual Basic 6.0 and provides a window to facilitate program execution. The inputs for the sediment fractions are percentages of gravel, sand, silt, and clay in the Wentworth (1922) grade scale, and the program permits the user to select output in either the Shepard (1954) classification scheme, modified as described above, or the Folk (1954, 1974) scheme. Users select options primarily with mouse-click events and through interactive dialogue boxes. This program is intended as a companion to other Visual Basic software we have developed to process sediment data (Poppe et al., 2003, 2004).
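
    The core of such a program is a mapping from gravel, sand, silt and clay percentages to a verbal class. The sketch below gives a simplified Shepard-style decision for gravel-free samples; the real Shepard (1954) diagram has 10 fields with specific boundaries, and SEDPLOT's actual logic (and the Folk scheme) is more detailed than these illustrative thresholds.

```python
def shepard_class_simplified(sand, silt, clay):
    """Simplified Shepard-style name for a gravel-free sample (percentages summing to 100).

    The real Shepard (1954) diagram has 10 fields with specific boundaries; the
    thresholds below are a rough illustration, not SEDPLOT's exact logic.
    """
    for fraction, name in ((sand, "sand"), (silt, "silt"), (clay, "clay")):
        if fraction >= 75:
            return name
    if sand >= 20 and silt >= 20 and clay >= 20:
        return "sand-silt-clay"
    # Otherwise name the two dominant fractions, the minor one as the adjective.
    parts = sorted([(sand, "sand"), (silt, "silt"), (clay, "clay")], reverse=True)
    adjectives = {"sand": "sandy", "silt": "silty", "clay": "clayey"}
    return f"{adjectives[parts[1][1]]} {parts[0][1]}"

print(shepard_class_simplified(sand=62, silt=30, clay=8))   # e.g. 'silty sand'
```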

  17. An analysis of the synoptic and climatological applicability of circulation type classifications for Ireland

    NASA Astrophysics Data System (ADS)

    Broderick, Ciaran; Fealy, Rowan

    2013-04-01

    Circulation type classifications (CTCs) compiled as part of the COST733 Action, entitled 'Harmonisation and Application of Weather Type Classifications for European Regions', are examined for their synoptic and climatological applicability to Ireland based on their ability to characterise surface temperature and precipitation. In all, 16 different objective classification schemes, representative of four different methodological approaches to circulation typing (optimization algorithms, threshold based methods, eigenvector techniques and leader algorithms), are considered. Several statistical metrics which variously quantify the ability of CTCs to discretize daily data into well-defined homogeneous groups are used to evaluate and compare different approaches to synoptic typing. The records from 14 meteorological stations located across the island of Ireland are used in the study. The results indicate that while it was not possible to identify a single optimum classification or approach to circulation typing - conditional on the location and surface variables considered - a number of general assertions regarding the performance of different schemes can be made. The findings for surface temperature indicate that those classifications based on predefined thresholds (e.g. Litynski, GrossWetterTypes and original Lamb Weather Type) perform well, as do the Kruizinga and Lund classification schemes. Similarly, for precipitation, predefined-type classifications return high skill scores, as do those classifications derived using some optimization procedure (e.g. SANDRA, Self Organizing Maps and K-Means clustering). For both temperature and precipitation the results generally indicate that the classifications perform best for the winter season - reflecting the closer coupling between large-scale circulation and surface conditions during this period. In contrast to the findings for temperature, spatial patterns in the performance of classifications were more evident for precipitation. In the case of this variable, the more westerly synoptic stations open to zonal airflow and less influenced by regional scale forcings generally exhibited a stronger link with large-scale circulation.

  18. The Classification of Hysteria and Related Disorders: Historical and Phenomenological Considerations

    PubMed Central

    North, Carol S.

    2015-01-01

    This article examines the history of the conceptualization of dissociative, conversion, and somatoform syndromes in relation to one another, chronicles efforts to classify these and other phenomenologically-related psychopathology in the American diagnostic system for mental disorders, and traces the subsequent divergence in opinions of dissenting sectors on classification of these disorders. This article then considers the extensive phenomenological overlap across these disorders in empirical research, and from this foundation presents a new model for the conceptualization of these disorders. The classification of disorders formerly known as hysteria and phenomenologically-related syndromes has long been contentious and unsettled. Examination of the long history of the conceptual difficulties, which remain inherent in existing classification schemes for these disorders, can help to address the continuing controversy. This review clarifies the need for a major conceptual revision of the current classification of these disorders. A new phenomenologically-based classification scheme for these disorders is proposed that is more compatible with the agnostic and atheoretical approach to diagnosis of mental disorders used by the current classification system. PMID:26561836

  19. An on-line BCI for control of hand grasp sequence and holding using adaptive probabilistic neural network.

    PubMed

    Hazrati, Mehrnaz Kh; Erfanian, Abbas

    2008-01-01

    This paper presents a new EEG-based Brain-Computer Interface (BCI) for on-line control of the sequence of hand grasping and holding in a virtual reality environment. The goal of this research is to develop an interaction technique that will allow the BCI to be effective in real-world scenarios for hand grasp control. Moreover, for consistency of the man-machine interface, it is desirable that the intended movement be what the subject imagines. For this purpose, we developed an on-line BCI based on the classification of EEG associated with imagination of hand-grasping movement and the resting state. A classifier based on a probabilistic neural network (PNN) was introduced for classifying the EEG. The PNN is a feedforward neural network that realizes the Bayes decision discriminant function by estimating the probability density function using mixtures of Gaussian kernels. Two types of classification schemes were considered here for on-line hand control: adaptive and static. In contrast to the static classification, the adaptive classifier was continuously updated on-line during recording. The experimental evaluation on six subjects on different days demonstrated that, by using the static scheme, a classification accuracy as high as the rate obtained by the adaptive scheme can be achieved. In the best case, an average classification accuracy of 93.0% and 85.8% was obtained using the adaptive and static schemes, respectively. The results obtained from more than 1500 trials on six subjects showed that an interactive virtual reality environment can be used as an effective tool for subject training in BCI.
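
    The PNN decision rule, a Gaussian (Parzen) kernel density estimate per class followed by the Bayes decision, can be sketched in a few lines. The synthetic two-class features stand in for EEG features of imagined grasping versus rest, and the kernel width sigma is an illustrative assumption.

```python
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Probabilistic neural network: one Gaussian (Parzen) kernel density per class,
    then the Bayes decision rule (pick the class with the largest density)."""
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(axis=-1)
        scores.append(np.exp(-d2 / (2.0 * sigma ** 2)).mean(axis=1))
    return classes[np.argmax(np.stack(scores, axis=1), axis=1)]

# Synthetic two-class "EEG feature" data (e.g. imagined grasp vs. rest).
rng = np.random.default_rng(2)
X0 = rng.normal([0.0, 0.0], 0.7, size=(80, 2))
X1 = rng.normal([2.0, 2.0], 0.7, size=(80, 2))
X_train = np.vstack([X0[:60], X1[:60]])
y_train = np.array([0] * 60 + [1] * 60)
X_test = np.vstack([X0[60:], X1[60:]])
y_test = np.array([0] * 20 + [1] * 20)

print("accuracy:", (pnn_predict(X_train, y_train, X_test) == y_test).mean())
```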

  20. Classification of Instructional Programs: 2000 Edition.

    ERIC Educational Resources Information Center

    Morgan, Robert L.; Hunt, E. Stephen

    This third revision of the Classification of Instructional Programs (CIP) updates and modifies education program classifications, providing a taxonomic scheme that supports the accurate tracking, assessment, and reporting of field of study and program completions activity. This edition has also been adopted as the standard field of study taxonomy…

  1. Attribution of local climate zones using a multitemporal land use/land cover classification scheme

    NASA Astrophysics Data System (ADS)

    Wicki, Andreas; Parlow, Eberhard

    2017-04-01

    Worldwide, the number of people living in an urban environment exceeds the rural population, with an increasing tendency. Especially in relation to global climate change, cities play a major role considering the impacts of extreme heat waves on the population. For urban planners, it is important to know which types of urban structures are beneficial for a comfortable urban climate and which actions can be taken to improve urban climate conditions. Therefore, it is essential to distinguish not only between urban and rural environments, but also between different levels of urban densification. To compare these built-up types within different cities worldwide, Stewart and Oke developed the concept of local climate zones (LCZ), defined by morphological characteristics. The original LCZ scheme often has considerable problems when adapted to European cities with historical city centers, including narrow streets and irregular patterns. In this study, a method to bridge the gap between a classical land use/land cover (LULC) classification and the LCZ scheme is presented. Multitemporal Landsat 8 data are used to create a high-accuracy LULC map, which is linked to the LCZ by morphological parameters derived from a high-resolution digital surface model and cadastral data. A bijective combination of the different classification schemes could not be achieved completely due to overlapping threshold values and the spatially homogeneous distribution of morphological parameters, but the attribution of LCZ to the LULC classification was successful.

  2. Prediction of cause of death from forensic autopsy reports using text classification techniques: A comparative study.

    PubMed

    Mujtaba, Ghulam; Shuib, Liyana; Raj, Ram Gopal; Rajandram, Retnagowri; Shaikh, Khairunisa

    2018-07-01

    Automatic text classification techniques are useful for classifying plaintext medical documents. This study aims to automatically predict the cause of death from free-text forensic autopsy reports by comparing various schemes for feature extraction, term weighting or feature value representation, text classification, and feature reduction. For the experiments, autopsy reports belonging to eight different causes of death were collected, preprocessed and converted into 43 master feature vectors using various schemes for feature extraction, representation, and reduction. Six different text classification techniques were applied to these 43 master feature vectors to construct a classification model that can predict the cause of death. Finally, classification model performance was evaluated using four performance measures: overall accuracy, macro precision, macro F-measure, and macro recall. The experiments showed that unigram features obtained the highest performance compared to bigram, trigram, and hybrid-gram features. Furthermore, among the feature representation schemes, term frequency and term frequency with inverse document frequency obtained similar and better results than binary frequency and normalized term frequency with inverse document frequency. The chi-square feature reduction approach outperformed the Pearson correlation and information gain approaches. Finally, among the text classification algorithms, the support vector machine classifier outperformed random forest, Naive Bayes, k-nearest neighbor, decision tree, and ensemble-voted classifiers. These results and comparisons hold practical importance and serve as references for future work, providing baselines against which future proposals can be compared with existing automated text classification techniques. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
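
    The best-performing combination reported above (unigram features, term frequency with inverse document frequency, chi-square feature reduction and a linear support vector machine) can be sketched as a scikit-learn pipeline. The toy report snippets, cause-of-death labels and the number of selected features are placeholders, not the study's data or settings.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Placeholder autopsy-report snippets; real reports and cause-of-death labels
# would obviously be needed for a meaningful model.
reports = [
    "extensive myocardial infarction with coronary artery occlusion",
    "blunt force trauma to the head with subdural haemorrhage",
    "asphyxia due to ligature strangulation",
    "acute myocardial ischaemia and cardiac hypertrophy",
]
causes = ["cardiac", "trauma", "asphyxia", "cardiac"]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 1)),   # unigram features, tf-idf weighting
    SelectKBest(chi2, k=10),               # chi-square feature reduction
    LinearSVC(),                           # linear support vector machine
)
model.fit(reports, causes)
print(model.predict(["sudden cardiac death with myocardial scarring"]))
```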

  3. 46 CFR 8.450 - Termination of classification society authority.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Termination of classification society authority. 8.450... VESSEL INSPECTION ALTERNATIVES Alternate Compliance Program § 8.450 Termination of classification society authority. (a) The Coast Guard may terminate an authorization agreement with a classification society to...

  4. 46 CFR 8.450 - Termination of classification society authority.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 1 2011-10-01 2011-10-01 false Termination of classification society authority. 8.450... VESSEL INSPECTION ALTERNATIVES Alternate Compliance Program § 8.450 Termination of classification society authority. (a) The Coast Guard may terminate an authorization agreement with a classification society to...

  5. DREAM: Classification scheme for dialog acts in clinical research query mediation.

    PubMed

    Hoxha, Julia; Chandar, Praveen; He, Zhe; Cimino, James; Hanauer, David; Weng, Chunhua

    2016-02-01

    Clinical data access involves complex but opaque communication between medical researchers and query analysts. Understanding such communication is indispensable for designing intelligent human-machine dialog systems that automate query formulation. This study investigates email communication and proposes a novel scheme for classifying dialog acts in clinical research query mediation. We analyzed 315 email messages exchanged in the communication for 20 data requests obtained from three institutions. The messages were segmented into 1333 utterance units. Through a rigorous process, we developed a classification scheme and applied it for dialog act annotation of the extracted utterances. Evaluation results with high inter-annotator agreement demonstrate the reliability of this scheme. This dataset is used to contribute preliminary understanding of dialog acts distribution and conversation flow in this dialog space. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Multi-criteria decision aid approach for the selection of the best compromise management scheme for ELVs: the case of Cyprus.

    PubMed

    Mergias, I; Moustakas, K; Papadopoulos, A; Loizidou, M

    2007-08-25

    Each alternative scheme for treating a vehicle at its end of life has its own consequences from a social, environmental, economic and technical point of view. Furthermore, the criteria used to determine these consequences are often contradictory and not equally important. In the presence of multiple conflicting criteria, an optimal alternative scheme never exists. A multiple-criteria decision aid (MCDA) method to aid the Decision Maker (DM) in selecting the best compromise scheme for the management of End-of-Life Vehicles (ELVs) is presented in this paper. The constitution of a set of alternative schemes, the selection of a list of relevant criteria to evaluate these alternative schemes and the choice of an appropriate management system are also analyzed in this framework. The proposed procedure relies on the PROMETHEE method, which belongs to the well-known family of multiple criteria outranking methods. For this purpose, level, linear and Gaussian functions are used as preference functions.
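
    The three preference functions mentioned map the difference between two alternatives on a criterion to a preference degree in [0, 1]; aggregating these degrees over criteria and alternatives yields the outranking flows that PROMETHEE ranks by. A minimal sketch follows; the thresholds, weights and the toy ELV-scheme scores are illustrative assumptions.

```python
import numpy as np

def pref_linear(d, q=0.0, p=1.0):
    """Linear preference: 0 below the indifference threshold q, 1 above the preference threshold p."""
    return np.clip((d - q) / (p - q), 0.0, 1.0) * (d > 0)

def pref_level(d, q=0.0, p=1.0):
    """Level preference: 0, 0.5 or 1 depending on where the difference d falls."""
    return np.where(d <= q, 0.0, np.where(d <= p, 0.5, 1.0))

def pref_gaussian(d, s=1.0):
    """Gaussian preference: smooth increase governed by the parameter s."""
    return np.where(d <= 0, 0.0, 1.0 - np.exp(-(d ** 2) / (2 * s ** 2)))

# Toy evaluation table: 3 ELV-management schemes x 2 criteria (higher is better),
# equal weights; all numbers are illustrative placeholders. Only the linear
# preference function is used in the aggregation below.
scores = np.array([[0.6, 0.8],
                   [0.9, 0.4],
                   [0.5, 0.7]])
weights = np.array([0.5, 0.5])

n = len(scores)
pi = np.zeros((n, n))                       # aggregated preference indices
for a in range(n):
    for b in range(n):
        d = scores[a] - scores[b]
        pi[a, b] = np.sum(weights * pref_linear(d, q=0.05, p=0.4))

phi_plus = pi.sum(axis=1) / (n - 1)         # positive outranking flows
print("positive flows:", np.round(phi_plus, 3))
```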

  7. 46 CFR 8.420 - Classification society authorization to participate in the Alternate Compliance Program.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Classification society authorization to participate in... § 8.420 Classification society authorization to participate in the Alternate Compliance Program. (a) The Commandant may authorize a recognized classification society to participate in the ACP...

  8. 46 CFR 8.420 - Classification society authorization to participate in the Alternate Compliance Program.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 1 2011-10-01 2011-10-01 false Classification society authorization to participate in... § 8.420 Classification society authorization to participate in the Alternate Compliance Program. (a) The Commandant may authorize a recognized classification society to participate in the ACP...

  9. Acute Oral Toxicity of Trimethylolethane Trinitrate (TMETN) in Sprague- Dawley Rats

    DTIC Science & Technology

    1989-07-01

    The acute oral median lethal dose (MLD) was … mg/kg in male Sprague-Dawley rats and 1027.4 ± 63.7 mg/kg in female Sprague-Dawley rats. These MLD values place TMETN in the "slightly toxic" range by the classification scheme of Hodge and Sterner; the results therefore indicate that TMETN is a slightly toxic compound. KEY WORDS: Acute Oral Toxicity.

  10. [Economic management of health crises affecting production animals in Europe].

    PubMed

    Vandeputte, S; Humblet, M F; Fecher-Bourgeois, F; Gosset, C; Albert, A; Vernaillen, F; Saegerman, C

    2011-12-01

    The importance of animal health crises has considerably increased over the last few years. When a crisis occurs, farmers can receive financial support through various public, private and mixed compensation schemes. Economic losses resulting from diseases may be direct and indirect. If a disease is covered by European Union regulations then countries have a legal obligation to partly compensate farmers for direct losses, either directly through the national budget, or through a specific fund. The European Veterinary Fund also co-finances these losses. Only a few countries provide compensation for indirect losses. The private insurance sector also provides protection against some direct and indirect losses but the risks covered are variable. To encourage farmers to subscribe to this kind of insurance, some public authorities provide subsidies to help pay the premium. Insurance companies do not generally cover the risks linked to contagious diseases, but some companies do extend cover to include this type of risk. Several alternatives, such as mutual funds, are available to improve risk coverage. There is a lack of harmonisation among the various compensation schemes of different countries. Public authorities cannot provide full compensation, but mutual funds and private insurance companies are alternatives that should be further investigated and their use should be extended to other countries. A classification of diseases would harmonise the situation at the European level.

  11. NASA Scope and Subject Category Guide

    NASA Technical Reports Server (NTRS)

    2011-01-01

    This guide provides a simple, effective tool to assist aerospace information analysts and database builders in the high-level subject classification of technical materials. Each of the 76 subject categories comprising the classification scheme is presented with a description of category scope, a listing of subtopics, cross references, and an indication of particular areas of NASA interest. The guide also includes an index of nearly 3,000 specific research topics cross referenced to the subject categories. The portable document format (PDF) version of the guide contains links in the index from each input subject to its corresponding categories. In addition to subject classification, the guide can serve as an aid to searching databases that use the classification scheme, and is also an excellent selection guide for those involved in the acquisition of aerospace literature. The CD-ROM contains both HTML and PDF versions.

  12. A risk-based classification scheme for genetically modified foods. III: Evaluation using a panel of reference foods.

    PubMed

    Chao, Eunice; Krewski, Daniel

    2008-12-01

    This paper presents an exploratory evaluation of four functional components of a proposed risk-based classification scheme (RBCS) for crop-derived genetically modified (GM) foods in a concordance study. Two independent raters assigned concern levels to 20 reference GM foods using a rating form based on the proposed RBCS. The four components of evaluation were: (1) degree of concordance, (2) distribution across concern levels, (3) discriminating ability of the scheme, and (4) ease of use. At least one of the 20 reference foods was assigned to each of the possible concern levels, demonstrating the ability of the scheme to identify GM foods of different concern with respect to potential health risk. There was reasonably good concordance between the two raters for the three separate parts of the RBCS. The raters agreed that the criteria in the scheme were sufficiently clear in discriminating reference foods into different concern levels, and that with some experience, the scheme was reasonably easy to use. Specific issues and suggestions for improvements identified in the concordance study are discussed.

  13. Using random forest for reliable classification and cost-sensitive learning for medical diagnosis.

    PubMed

    Yang, Fan; Wang, Hua-zhen; Mi, Hong; Lin, Cheng-de; Cai, Wei-wen

    2009-01-30

    Most machine-learning classifiers output label predictions for new instances without indicating how reliable the predictions are. The applicability of these classifiers is limited in critical domains where incorrect predictions have serious consequences, like medical diagnosis. Further, the default assumption of equal misclassification costs is most likely violated in medical diagnosis. In this paper, we present a modified random forest classifier which is incorporated into the conformal predictor scheme. A conformal predictor is a transductive learning scheme, using Kolmogorov complexity to test the randomness of a particular sample with respect to the training sets. Our method exhibits the well-calibrated property that the performance can be set prior to classification and the accuracy is exactly equal to the predefined confidence level. Further, to address the cost-sensitive problem, we extend our method to a label-conditional predictor which takes into account different costs for misclassifications in different classes and allows a different confidence level to be specified for each class. Intensive experiments on benchmark datasets and real-world applications show that the resultant classifier is well calibrated and able to control the class-specific risk. The use of the RF outlier measure to design a nonconformity measure benefits the resultant predictor. Further, the label-conditional classifier turns out to be an alternative approach to the cost-sensitive learning problem, relying on label-wise predefined confidence levels. The target of minimizing the risk of misclassification is achieved by specifying a different confidence level for each class.
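
    The calibration idea can be illustrated with a split (inductive) conformal predictor built on a random forest. The paper itself uses a transductive scheme with an RF outlier-based nonconformity measure, so the sketch below, which simply uses one minus the forest's predicted probability of the true class, is a simplified approximation of that idea, not the authors' method.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Split (inductive) conformal prediction on a public benchmark dataset.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_rest, y_tr, y_rest = train_test_split(X, y, test_size=0.5, random_state=0)
X_cal, X_te, y_cal, y_te = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Nonconformity: 1 - predicted probability of the true class on the calibration set.
cal_scores = 1.0 - rf.predict_proba(X_cal)[np.arange(len(y_cal)), y_cal]

epsilon = 0.05                                   # target error rate (95% confidence)
threshold = np.quantile(cal_scores, 1 - epsilon)

# A prediction set contains every label whose nonconformity stays below the threshold.
test_scores = 1.0 - rf.predict_proba(X_te)       # shape (n_test, n_classes)
pred_sets = test_scores <= threshold
coverage = pred_sets[np.arange(len(y_te)), y_te].mean()
print(f"empirical coverage at {1 - epsilon:.0%} confidence: {coverage:.3f}")
```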

  14. A new local-global approach for classification.

    PubMed

    Peres, R T; Pedreira, C E

    2010-09-01

    In this paper, we propose a new local-global pattern classification scheme that combines supervised and unsupervised approaches, taking advantage of both local and global environments. We understand as global methods the ones concerned with constructing a model for the whole problem space using the totality of the available observations. Local methods focus on subregions of the space, possibly using an appropriately selected subset of the sample. In the proposed method, the sample is first divided into local cells by a vector quantization unsupervised algorithm, the LBG (Linde-Buzo-Gray). In a second stage, the resulting assemblage of much easier problems is locally solved with a scheme inspired by Bayes' rule. Four classification methods were implemented for comparison with the proposed scheme: Learning Vector Quantization (LVQ), feedforward neural networks, Support Vector Machine (SVM) and k-Nearest Neighbors. These four methods and the proposed scheme were applied to eleven datasets: two controlled experiments plus nine publicly available datasets from the UCI repository. The proposed method has shown quite competitive performance when compared to these classical and widely used classifiers. Our method is simple to understand and implement and is based on very intuitive concepts. Copyright 2010 Elsevier Ltd. All rights reserved.
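
    The local-global idea, first partition the space into cells with an unsupervised vector quantizer and then resolve each cell locally with class frequencies in a Bayes-like fashion, can be sketched as follows. KMeans is used here as a closely related stand-in for the LBG algorithm, and the synthetic data and number of cells are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
# Two-class synthetic data with local structure.
X = np.vstack([rng.normal([0.0, 0.0], 0.8, (150, 2)),
               rng.normal([2.5, 1.5], 0.8, (150, 2))])
y = np.array([0] * 150 + [1] * 150)

# Stage 1 (unsupervised, global -> local): partition the space into cells.
# The paper uses the LBG algorithm; k-means is a closely related stand-in.
vq = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X)
cells = vq.labels_

# Stage 2 (supervised, local): per-cell class frequencies act as a simple Bayes rule.
cell_posteriors = np.zeros((8, 2))
for c in range(8):
    labels_in_cell = y[cells == c]
    for k in (0, 1):
        cell_posteriors[c, k] = (labels_in_cell == k).mean() if labels_in_cell.size else 0.5

def predict(points):
    """Assign each point to its nearest cell and return that cell's majority class."""
    nearest = vq.predict(points)
    return cell_posteriors[nearest].argmax(axis=1)

print("training accuracy:", (predict(X) == y).mean())
```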

  15. A novel encoding scheme for effective biometric discretization: Linearly Separable Subcode.

    PubMed

    Lim, Meng-Hui; Teoh, Andrew Beng Jin

    2013-02-01

    Separability in a code is crucial in guaranteeing a decent Hamming-distance separation among the codewords. In multibit biometric discretization where a code is used for quantization-intervals labeling, separability is necessary for preserving distance dissimilarity when feature components are mapped from a discrete space to a Hamming space. In this paper, we examine separability of Binary Reflected Gray Code (BRGC) encoding and reveal its inadequacy in tackling interclass variation during the discrete-to-binary mapping, leading to a tradeoff between classification performance and entropy of binary output. To overcome this drawback, we put forward two encoding schemes exhibiting full-ideal and near-ideal separability capabilities, known as Linearly Separable Subcode (LSSC) and Partially Linearly Separable Subcode (PLSSC), respectively. These encoding schemes convert the conventional entropy-performance tradeoff into an entropy-redundancy tradeoff in the increase of code length. Extensive experimental results vindicate the superiority of our schemes over the existing encoding schemes in discretization performance. This opens up possibilities of achieving much greater classification performance with high output entropy.
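
    The separability issue with BRGC labelling can be seen directly from the codewords: adjacent quantization intervals always differ by one bit, but so can intervals that are far apart, which weakens the distance preservation that LSSC and PLSSC are designed to restore. A small sketch:

```python
def brgc(index, n_bits):
    """Binary Reflected Gray Code label of a quantization-interval index."""
    g = index ^ (index >> 1)
    return format(g, f"0{n_bits}b")

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# 8 quantization intervals labelled with 3-bit BRGC codewords.
codes = [brgc(i, 3) for i in range(8)]
print(codes)
# Adjacent intervals differ by 1 bit, but so do intervals 0 and 7, illustrating
# the weak relation between interval distance and Hamming distance.
print(hamming(codes[0], codes[1]), hamming(codes[0], codes[7]))
```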

  16. A new scheme for urban impervious surface classification from SAR images

    NASA Astrophysics Data System (ADS)

    Zhang, Hongsheng; Lin, Hui; Wang, Yunpeng

    2018-05-01

    Urban impervious surfaces have been recognized as a significant indicator for various environmental and socio-economic studies. There is an increasingly urgent demand for timely and accurate monitoring of impervious surfaces with satellite technology from local to global scales. In the past decades, optical remote sensing has been widely employed for this task with various techniques. However, a range of challenges remains, e.g. handling cloud contamination on optical data. Therefore, Synthetic Aperture Radar (SAR) was introduced for this challenging task because it is uniquely all-time- and all-weather-capable. Nevertheless, even as more SAR data are applied, the methodology used for impervious surface classification remains unchanged from that used for optical datasets. This shortcoming has prevented the community from fully exploring the potential of SAR data for impervious surface classification. We propose a new scheme that is comparable to the well-known and fundamental Vegetation-Impervious surface-Soil (V-I-S) model for mapping urban impervious surfaces. Three scenes of fully polarimetric Radarsat-2 data for the cities of Shenzhen, Hong Kong and Macau were employed to test and validate the proposed methodology. Experimental results indicated that the overall accuracy and Kappa coefficient were 96.00% and 0.8808 in Shenzhen, 93.87% and 0.8307 in Hong Kong and 97.48% and 0.9354 in Macau, indicating the applicability and great potential of the new scheme for impervious surface classification using polarimetric SAR data. Comparison with the traditional scheme indicated that the new scheme was able to improve the overall accuracy by up to 4.6% and the Kappa coefficient by up to 0.18.

  17. FORUM: A Suggestion for an Improved Vegetation Scheme for Local and Global Mapping and Monitoring.

    PubMed

    ADAMS

    1999-01-01

    Understanding of global ecological problems is at least partly dependent on clear assessments of vegetation change, and such assessment is always dependent on the use of a vegetation classification scheme. Use of satellite remotely sensed data is the only practical means of carrying out any global-scale vegetation mapping exercise, but if the resulting maps are to be useful to most ecologists and conservationists, they must be closely tied to clearly defined features of vegetation on the ground. Furthermore, much of the mapping that does take place involves more local-scale description of field sites; for purposes of cost and practicality, such studies usually do not involve remote sensing using satellites. There is a need for a single scheme that integrates the smallest to the largest scale in a way that is meaningful to most environmental scientists. Existing schemes are unsatisfactory for this task; they are ambiguous, unnecessarily complex, and their categories do not correspond to common-sense definitions. In response to these problems, a simple structural-physiognomically based scheme with 23 fundamental categories is proposed here for mapping and monitoring on any scale, from local to global. The fundamental categories each subdivide into more specific structural categories for more detailed mapping, but all the categories can be used throughout the world and at any scale, allowing intercomparison between regions. The next stage in the process will be to obtain the views of as many people working in as many different fields as possible, to see whether the proposed scheme suits their needs and how it should be modified. With a few modifications, such a scheme could easily be appended to an existing land cover classification scheme, such as the FAO system, greatly increasing the usefulness and accessibility of the results of the land cover classification. KEY WORDS: Vegetation scheme; Mapping; Monitoring; Land cover

  18. What are the most fire-dangerous atmospheric circulations in the Eastern-Mediterranean? Analysis of the synoptic wildfire climatology.

    PubMed

    Paschalidou, A K; Kassomenos, P A

    2016-01-01

    Wildfire management is closely linked to robust forecasts of changes in wildfire risk related to meteorological conditions. This link can be bridged either through fire weather indices or through statistical techniques that directly relate atmospheric patterns to wildfire activity. In the present work the COST-733 classification schemes are applied in order to link wildfires in Greece with synoptic circulation patterns. The analysis reveals that the majority of wildfire events can be explained by a small number of specific synoptic circulations, hence reflecting the synoptic climatology of wildfires. All 8 classification schemes used prove that the most fire-dangerous conditions in Greece are characterized by a combination of high atmospheric pressure systems located N to NW of Greece, coupled with lower pressures located over the very eastern part of the Mediterranean, an atmospheric pressure pattern closely linked to the local Etesian winds over the Aegean Sea. During these events, the atmospheric pressure has been reported to be anomalously high, while anomalously low 500 hPa geopotential heights and negative total water column anomalies were also observed. Among the various classification schemes used, the 2 Principal Component Analysis-based classifications, namely the PCT and the PXE, as well as the Leader Algorithm classification LND, proved to be the best options, in terms of being capable of isolating the vast majority of fire events in a small number of classes with increased frequency of occurrence. It is estimated that these 3 schemes, in combination with medium-range to seasonal climate forecasts, could be used by wildfire risk managers to provide increased wildfire prediction accuracy. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Computer-aided diagnosis of pulmonary diseases using x-ray darkfield radiography

    NASA Astrophysics Data System (ADS)

    Einarsdóttir, Hildur; Yaroshenko, Andre; Velroyen, Astrid; Bech, Martin; Hellbach, Katharina; Auweter, Sigrid; Yildirim, Önder; Meinel, Felix G.; Eickelberg, Oliver; Reiser, Maximilian; Larsen, Rasmus; Kjær Ersbøll, Bjarne; Pfeiffer, Franz

    2015-12-01

    In this work we develop a computer-aided diagnosis (CAD) scheme for classification of pulmonary disease for grating-based x-ray radiography. In addition to conventional transmission radiography, the grating-based technique provides a dark-field imaging modality, which utilizes the scattering properties of the x-rays. This modality has shown great potential for diagnosing early stage emphysema and fibrosis in mouse lungs in vivo. The CAD scheme is developed to assist radiologists and other medical experts in developing new diagnostic methods when evaluating grating-based images. The scheme consists of three stages: (i) automatic lung segmentation; (ii) feature extraction from lung shape and dark-field image intensities; (iii) classification between healthy, emphysema and fibrosis lungs. A study of 102 mice was conducted with 34 healthy, 52 emphysema and 16 fibrosis subjects. Each image was manually annotated to build an experimental dataset. System performance was assessed by: (i) determining the quality of the segmentations; (ii) validating emphysema and fibrosis recognition by a linear support vector machine using leave-one-out cross-validation. In terms of segmentation quality, we obtained an overlap percentage (Ω) of 92.63 ± 3.65%, a Dice Similarity Coefficient (DSC) of 89.74 ± 8.84% and a Jaccard Similarity Coefficient of 82.39 ± 12.62%. For classification, the accuracy, sensitivity and specificity of diseased lung recognition were all 100%. Classification between emphysema and fibrosis resulted in an accuracy of 93%, whilst the sensitivity was 94% and the specificity 88%. In addition to the automatic classification of lungs, deviation maps created by the CAD scheme provide a visual aid for medical experts to further assess the severity of pulmonary disease in the lung, and highlight affected regions.
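
    The segmentation-quality metrics quoted above (overlap percentage, Dice and Jaccard coefficients) are straightforward to compute from binary masks; a minimal sketch with toy masks follows. The exact definition of the overlap percentage used in the paper may differ from the simple reference-coverage fraction assumed here.

```python
import numpy as np

def segmentation_metrics(pred, truth):
    """Overlap, Dice and Jaccard coefficients for two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    overlap = intersection / truth.sum()            # fraction of the reference covered
    dice = 2 * intersection / (pred.sum() + truth.sum())
    jaccard = intersection / union
    return overlap, dice, jaccard

# Toy masks standing in for an automatic and a manually annotated lung segmentation.
truth = np.zeros((10, 10), dtype=bool); truth[2:8, 2:8] = True
pred = np.zeros((10, 10), dtype=bool);  pred[3:9, 2:8] = True

print([round(m, 3) for m in segmentation_metrics(pred, truth)])
```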

  20. Centrifuge: rapid and sensitive classification of metagenomic sequences.

    PubMed

    Kim, Daehwan; Song, Li; Breitwieser, Florian P; Salzberg, Steven L

    2016-12-01

    Centrifuge is a novel microbial classification engine that enables rapid, accurate, and sensitive labeling of reads and quantification of species on desktop computers. The system uses an indexing scheme based on the Burrows-Wheeler transform (BWT) and the Ferragina-Manzini (FM) index, optimized specifically for the metagenomic classification problem. Centrifuge requires a relatively small index (4.2 GB for 4078 bacterial and 200 archaeal genomes) and classifies sequences at very high speed, allowing it to process the millions of reads from a typical high-throughput DNA sequencing run within a few minutes. Together, these advances enable timely and accurate analysis of large metagenomics data sets on conventional desktop computers. Because of its space-optimized indexing schemes, Centrifuge also makes it possible to index the entire NCBI nonredundant nucleotide sequence database (a total of 109 billion bases) with an index size of 69 GB, in contrast to k-mer-based indexing schemes, which require far more extensive space. © 2016 Kim et al.; Published by Cold Spring Harbor Laboratory Press.

  1. "Interactive Classification Technology"

    NASA Technical Reports Server (NTRS)

    deBessonet, Cary

    1999-01-01

    The investigators are upgrading a knowledge representation language called SL (Symbolic Language) and an automated reasoning system called SMS (Symbolic Manipulation System) to enable the technologies to be used in automated reasoning and interactive classification systems. The overall goals of the project are: a) the enhancement of the representation language SL to accommodate multiple perspectives and a wider range of meaning; b) the development of a sufficient set of operators to enable the interpreter of SL to handle representations of basic cognitive acts; and c) the development of a default inference scheme to operate over SL notation as it is encoded. As to particular goals, the first-year work plan focused on inferencing and representation issues, including: 1) the development of higher level cognitive/classification functions and conceptual models for use in inferencing and decision making; 2) the specification of a more detailed scheme of defaults and the enrichment of SL notation to accommodate the scheme; and 3) the adoption of additional perspectives for inferencing.

  2. Combination of support vector machine, artificial neural network and random forest for improving the classification of convective and stratiform rain using spectral features of SEVIRI data

    NASA Astrophysics Data System (ADS)

    Lazri, Mourad; Ameur, Soltane

    2018-05-01

    A model combining three classifiers, namely Support Vector Machine, Artificial Neural Network and Random Forest (SAR), is designed for improving the classification of convective and stratiform rain. This model (SAR model) has been trained and then tested on a dataset derived from MSG-SEVIRI (Meteosat Second Generation-Spinning Enhanced Visible and Infrared Imager). Well-classified, mid-classified and misclassified pixels are determined from the combination of the three classifiers. Mid-classified and misclassified pixels, which are considered unreliable, are reclassified using a novel training of the developed scheme in which only the input data corresponding to the pixels in question are used. This whole process is repeated a second time and applied to mid-classified and misclassified pixels separately. Learning and validation of the developed scheme are realized against co-located data observed by ground radar. The developed scheme outperformed the different classifiers used separately and reached an overall classification accuracy of 97.40%.
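
    One plausible reading of the combination step is sketched below: during training, each pixel is labelled well-, mid- or mis-classified according to how many of the three classifiers agree with the ground-radar reference, and the unreliable (mid- and mis-classified) pixels would then be re-trained separately. The synthetic features standing in for SEVIRI spectral channels, and this interpretation of the agreement rule, are assumptions; the paper's exact retraining procedure is not reproduced.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(4)
# Synthetic stand-in for SEVIRI spectral features with a convective(1)/stratiform(0) label.
X = rng.normal(size=(300, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=300) > 0).astype(int)
X_tr, y_tr, X_te, y_te = X[:200], y[:200], X[200:], y[200:]

models = [SVC().fit(X_tr, y_tr),
          MLPClassifier(max_iter=2000, random_state=0).fit(X_tr, y_tr),
          RandomForestClassifier(random_state=0).fit(X_tr, y_tr)]
votes = np.stack([m.predict(X_te) for m in models])      # shape (3, n_pixels)

agreement = (votes == y_te).sum(axis=0)                  # 0..3 classifiers correct
well = agreement == 3                                    # well-classified pixels
mid = (agreement == 2) | (agreement == 1)                # mid-classified pixels
mis = agreement == 0                                     # misclassified pixels
print("well / mid / mis:", well.sum(), mid.sum(), mis.sum())
# In the SAR scheme the unreliable (mid- and mis-classified) pixels are then
# re-trained and re-classified separately; that refinement step is omitted here.
```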

  3. Microtopographic characterization of ice-wedge polygon landscape in Barrow, Alaska: a digital map of troughs, rims, centers derived from high resolution (0.25 m) LiDAR data

    DOE Data Explorer

    Gangodagamage, Chandana; Wullschleger, Stan

    2014-07-03

    The dataset represents a microtopographic characterization of the ice-wedge polygon landscape in Barrow, Alaska. Three microtopographic features are delineated using a 0.25 m high-resolution digital elevation dataset derived from LiDAR. The troughs, rims, and centers are the three categories in this classification scheme. The polygon troughs are the surface expression of the ice wedges and lie at lower elevations than the polygon interior. The elevated shoulders of the polygon interior immediately adjacent to the polygon troughs are the polygon rims for low-center polygons. In the case of high-center polygons, these features are the topographic highs. In this classification scheme, both topographic highs and rims are treated as polygon rims. The next version of the dataset will include a more refined classification scheme with separate classes for rims and topographic highs. The interior part of the polygon just inside the polygon rims constitutes the polygon centers.

  4. 78 FR 60670 - Airworthiness Directives; The Boeing Company Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-02

    ... in this regard. Request To Approve an Alternate Generic Repair Scheme as an AMOC British Airways requested that an alternate generic repair scheme be approved as an AMOC to this final rule. British Airways... scheme to British Airways which allowed British Airways to manufacture certain repair parts. British...

  5. Dewey Decimal Classification for U. S. Conn: An Advantage?

    ERIC Educational Resources Information Center

    Marek, Kate

    This paper examines the use of the Dewey Decimal Classification (DDC) system at the U. S. Conn Library at Wayne State College (WSC) in Nebraska. Several developments in the last 20 years which have eliminated the trend toward reclassification of academic library collections from DDC to the Library of Congress (LC) classification scheme are…

  6. A Global Classification System for Catchment Hydrology

    NASA Astrophysics Data System (ADS)

    Woods, R. A.

    2004-05-01

    It is a shocking state of affairs - there is no underpinning scientific taxonomy of catchments. There are widely used global classification systems for climate, river morphology, lakes and wetlands, but for river catchments there exists only a plethora of inconsistent, incomplete regional schemes. By proceeding without a common taxonomy for catchments, freshwater science has missed one of its key developmental stages, and has leapt from definition of phenomena to experiments, theories and models, without the theoretical framework of a classification. I propose the development of a global hierarchical classification system for physical aspects of river catchments, to help underpin physical science in the freshwater environment and provide a solid foundation for classification of river ecosystems. Such a classification scheme can open completely new vistas in hydrology: for example, it will be possible to (i) rationally transfer experimental knowledge of hydrological processes between basins anywhere in the world, provided they belong to the same class; (ii) perform meaningful meta-analyses in order to reconcile studies that show inconsistent results; and (iii) generate new testable hypotheses which involve locations worldwide.

  7. Guidelines for a priori grouping of species in hierarchical community models

    USGS Publications Warehouse

    Pacifici, Krishna; Zipkin, Elise; Collazo, Jaime; Irizarry, Julissa I.; DeWan, Amielle A.

    2014-01-01

    Recent methodological advances permit the estimation of species richness and occurrences for rare species by linking species-level occurrence models at the community level. The value of such methods is underscored by the ability to examine the influence of landscape heterogeneity on species assemblages at large spatial scales. A salient advantage of community-level approaches is that parameter estimates for data-poor species are more precise as the estimation process borrows from data-rich species. However, this analytical benefit raises a question about the degree to which inferences are dependent on the implicit assumption of relatedness among species. Here, we assess the sensitivity of community/group-level metrics and of individual species-level inferences to various classification schemes for grouping species assemblages using multispecies occurrence models. We explore the implications of these groupings on parameter estimates for avian communities in two ecosystems: tropical forests in Puerto Rico and temperate forests in the northeastern United States. We report on the classification performance and the extent of variability in occurrence probabilities and species richness estimates that can be observed depending on the classification scheme used. We found estimates of species richness to be most precise and to have the best predictive performance when all of the data were grouped at a single community level. Community/group-level parameters appear to be heavily influenced by the grouping criteria, but were not driven strictly by the total number of detections for species. We found that different grouping schemes can provide an opportunity to identify unique assemblage responses that would not have been found if all of the species were analyzed together. We suggest three guidelines: (1) classification schemes should be determined based on study objectives; (2) model selection should be used to quantitatively compare different classification approaches; and (3) sensitivity of results to different classification approaches should be assessed. These guidelines should help researchers apply hierarchical community models in the most effective manner.

  8. Strain Rate Tensor Estimation in Cine Cardiac MRI Based on Elastic Image Registration

    NASA Astrophysics Data System (ADS)

    Sánchez-Ferrero, Gonzalo Vegas; Vega, Antonio Tristán; Grande, Lucilio Cordero; de La Higuera, Pablo Casaseca; Fernández, Santiago Aja; Fernández, Marcos Martín; López, Carlos Alberola

    In this work we propose an alternative method to estimate and visualize the Strain Rate Tensor (SRT) in Magnetic Resonance Images (MRI) when Phase Contrast MRI (PCMRI) and Tagged MRI (TMRI) are not available. This alternative is based on image processing techniques. Concretely, image registration algorithms are used to estimate the movement of the myocardium at each point. Additionally, a consistency checking method is presented to validate the accuracy of the estimates when no gold standard is available. Results prove that the consistency checking method provides an upper bound of the mean squared error of the estimate. Our experiments with real data show that the registration algorithm provides a useful deformation field to estimate the SRT fields. A classification of regional contraction patterns as normal or dysfunctional, compared against expert diagnosis, indicates that the parameters extracted from the estimated SRT can represent these patterns. Additionally, a scheme for visualizing and analyzing the local behavior of the SRT field is presented.

  9. Diagnostic classification scheme in Iranian breast cancer patients using a decision tree.

    PubMed

    Malehi, Amal Saki

    2014-01-01

    The objective of this study was to determine a diagnostic classification scheme using a decision tree based model. The study was conducted as a retrospective case-control study in Imam Khomeini hospital in Tehran during 2001 to 2009. Data, including demographic and clinical-pathological characteristics, were uniformly collected from 624 females, 312 of whom were referred with a positive diagnosis of breast cancer (cases) and 312 of whom were healthy women (controls). The decision tree was implemented to develop a diagnostic classification scheme using CART 6.0 Software. The AUC (area under the curve) was measured as the overall performance of the diagnostic classification of the decision tree. Five variables were identified as main risk factors of breast cancer and six subgroups as high risk. The results indicated that increasing age, low age at menarche, single and divorced status, irregular menarche pattern and family history of breast cancer are the important diagnostic factors in Iranian breast cancer patients. The sensitivity and specificity of the analysis were 66% and 86.9%, respectively. The high AUC (0.82) also showed an excellent classification and diagnostic performance of the model. A decision tree based model appears to be suitable for identifying risk factors and high or low risk subgroups. It can also assist clinicians in decision making, since it identifies underlying prognostic relationships and the model is explicit and easy to understand.
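
    To make the approach concrete, the sketch below fits a CART-style diagnostic tree and evaluates it by AUC. The feature set and data are synthetic placeholders, and scikit-learn's DecisionTreeClassifier stands in for the CART 6.0 software used in the study.

    ```python
    # Hedged sketch of a CART-style diagnostic tree scored by AUC; all data below are
    # synthetic placeholders, not the study's patient records.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 624
    X = np.column_stack([
        rng.normal(50, 12, n),        # age
        rng.normal(13, 1.5, n),       # age at menarche
        rng.integers(0, 2, n),        # family history of breast cancer (0/1)
    ])
    y = rng.integers(0, 2, n)         # case/control labels (synthetic)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
    tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=20, random_state=0)
    tree.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, tree.predict_proba(X_te)[:, 1])
    print(f"AUC on held-out data: {auc:.2f}")
    ```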

  10. Heuristic pattern correction scheme using adaptively trained generalized regression neural networks.

    PubMed

    Hoya, T; Chambers, J A

    2001-01-01

    In many pattern classification problems, an intelligent neural system is required which can learn the newly encountered but misclassified patterns incrementally, while keeping a good classification performance over the past patterns stored in the network. In the paper, a heuristic pattern correction scheme is proposed using adaptively trained generalized regression neural networks (GRNNs). The scheme is based upon both network growing and dual-stage shrinking mechanisms. In the network growing phase, a subset of the misclassified patterns in each incoming data set is iteratively added into the network until all the patterns in the incoming data set are classified correctly. Then, the redundancy introduced in the growing phase is removed in the dual-stage network shrinking. Both long- and short-term memory models are considered in the network shrinking, which are motivated by biological studies of the brain. The learning capability of the proposed scheme is investigated through extensive simulation studies.
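
    A GRNN is essentially a kernel-weighted average of stored training targets. The fragment below is a minimal GRNN predictor included only to illustrate that building block; the growing and dual-stage shrinking mechanisms of the proposed scheme, and its memory models, are not reproduced here.

    ```python
    # Minimal GRNN (generalized regression neural network) predictor, for illustration
    # only; the pattern-correction scheme of the record adds growing/shrinking on top.
    import numpy as np

    def grnn_predict(X_train, y_train, X_query, sigma=0.5):
        """Nadaraya-Watson style GRNN: kernel-weighted average of stored targets."""
        preds = []
        for x in X_query:
            d2 = np.sum((X_train - x) ** 2, axis=1)        # squared distances to patterns
            w = np.exp(-d2 / (2.0 * sigma ** 2))           # RBF kernel weights
            preds.append(w @ y_train / (w.sum() + 1e-12))  # weighted average of targets
        return np.array(preds)

    # Toy usage: one-hot class targets turn the regression output into class scores.
    X_train = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
    y_train = np.array([[1, 0], [1, 0], [0, 1], [0, 1]])   # two classes, one-hot
    scores = grnn_predict(X_train, y_train, np.array([[0.9, 0.9]]))
    print("predicted class:", scores.argmax(axis=1))
    ```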

  11. Particle-size distribution models for the conversion of Chinese data to FAO/USDA system.

    PubMed

    Shangguan, Wei; Dai, YongJiu; García-Gutiérrez, Carlos; Yuan, Hua

    2014-01-01

    We investigated eleven particle-size distribution (PSD) models to determine the appropriate models for describing the PSDs of 16349 Chinese soil samples. These data are based on three soil texture classification schemes, including one ISSS (International Society of Soil Science) scheme with four data points and two Katschinski's schemes with five and six data points, respectively. The adjusted coefficient of determination (r²), Akaike's information criterion (AIC), and the geometric mean error ratio (GMER) were used to evaluate the model performance. The soil data were converted to the USDA (United States Department of Agriculture) standard using PSD models and the fractal concept. The performance of the PSD models was affected by soil texture and by the fraction classification scheme. The performance of the PSD models also varied with the clay content of soils. The Anderson, Fredlund, modified logistic growth, Skaggs, and Weibull models were the best.
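
    To make the model-fitting step concrete, the sketch below fits one candidate PSD form (a two-parameter Weibull-type cumulative curve) to a handful of made-up cumulative fraction points and scores it with adjusted r² and a Gaussian-error AIC. The study itself compares eleven model forms on ISSS- and Katschinski-scheme data; the curve, data points, and scoring details below are illustrative assumptions.

    ```python
    # Hedged sketch: fit a Weibull-type cumulative PSD curve, P(d) = 1 - exp(-a * d**b),
    # to example points and score the fit. The numbers are placeholders, not study data.
    import numpy as np
    from scipy.optimize import curve_fit

    def weibull_cdf(d, a, b):
        return 1.0 - np.exp(-a * d ** b)

    d = np.array([0.001, 0.005, 0.01, 0.05, 0.25, 1.0])   # particle diameter (mm), example points
    p = np.array([0.12, 0.30, 0.42, 0.70, 0.93, 1.00])    # cumulative mass fraction (made up)

    params, _ = curve_fit(weibull_cdf, d, p, p0=[1.0, 0.5], maxfev=10000)
    resid = p - weibull_cdf(d, *params)

    n, k = len(p), len(params)
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((p - p.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    r2_adj = 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)
    aic = n * np.log(ss_res / n) + 2 * k                   # Gaussian-error AIC up to a constant

    print(f"a={params[0]:.3f}, b={params[1]:.3f}, adj r2={r2_adj:.3f}, AIC={aic:.1f}")
    ```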

  12. Analysis of the Local Lymph Node Assay (LLNA) variability for assessing the prediction of skin sensitisation potential and potency of chemicals with non-animal approaches.

    PubMed

    Dumont, Coralie; Barroso, João; Matys, Izabela; Worth, Andrew; Casati, Silvia

    2016-08-01

    The knowledge of the biological mechanisms leading to the induction of skin sensitisation has in recent years favoured the development of alternative non-animal methods. During the formal validation process, results from the Local Lymph Node Assay (LLNA) are generally used as reference data to assess the predictive capacity of the non-animal tests. This study reports an analysis of the variability of the LLNA for a set of chemicals for which multiple studies are available and considers three hazard classification schemes: POS/NEG, GHS/CLP and ECETOC. As the type of vehicle used in an LLNA study is known to influence the results to some extent, two analyses were performed: one considering the solvent used to test the chemicals and one without considering the solvent. The results show that the number of discordant classifications increases when a chemical is tested in more than one solvent. Moreover, it can be concluded that study results leading to classification in the strongest classes (1A and EXT) seem to be more reliable than those in the weakest classes. This study highlights the importance of considering the variability of the reference data when evaluating non-animal tests. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  13. Log-ratio transformed major element based multidimensional classification for altered High-Mg igneous rocks

    NASA Astrophysics Data System (ADS)

    Verma, Surendra P.; Rivera-Gómez, M. Abdelaly; Díaz-González, Lorena; Quiroz-Ruiz, Alfredo

    2016-12-01

    A new multidimensional classification scheme consistent with the chemical classification of the International Union of Geological Sciences (IUGS) is proposed for the nomenclature of High-Mg altered rocks. Our procedure is based on an extensive database of major element (SiO2, TiO2, Al2O3, Fe2O3t, MnO, MgO, CaO, Na2O, K2O, and P2O5) compositions of a total of 33,868 (920 High-Mg and 32,948 "Common") relatively fresh igneous rock samples. The database consisting of these multinormally distributed samples in terms of their isometric log-ratios was used to propose a set of 11 discriminant functions and 6 diagrams to facilitate High-Mg rock classification. The multinormality required by linear discriminant and canonical analysis was ascertained by a new computer program DOMuDaF. One multidimensional function can distinguish the High-Mg and Common igneous rocks with high percent success values of about 86.4% and 98.9%, respectively. Similarly, from 10 discriminant functions the High-Mg rocks can also be classified as one of the four rock types (komatiite, meimechite, picrite, and boninite), with high success values of about 88%-100%. Satisfactory functioning of this new classification scheme was confirmed by seven independent tests. Five further case studies involving application to highly altered rocks illustrate the usefulness of our proposal. A computer program HMgClaMSys was written to efficiently apply the proposed classification scheme, which will be available for online processing of igneous rock compositional data. Monte Carlo simulation modeling and mass-balance computations confirmed the robustness of our classification with respect to analytical errors and postemplacement compositional changes.
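
    The general recipe behind such a scheme can be sketched as: close the major-element composition, map it to isometric log-ratio (ilr) coordinates, and train a linear discriminant function. The fragment below illustrates that pipeline on synthetic compositions; it does not reproduce the published discriminant-function coefficients, the full ten-oxide system, or the HMgClaMSys/DOMuDaF programs.

    ```python
    # Hedged sketch: ilr-transformed compositions fed to a linear discriminant, as a
    # generic stand-in for the multidimensional scheme described above. The oxide
    # subset and the training compositions are synthetic assumptions.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def ilr(x):
        """One standard ilr basis: balances of the first j parts against part j+1."""
        x = np.asarray(x, dtype=float)
        x = x / x.sum()                                   # close composition to unit sum
        z = []
        for j in range(1, len(x)):
            gm = np.exp(np.mean(np.log(x[:j])))           # geometric mean of first j parts
            z.append(np.sqrt(j / (j + 1.0)) * np.log(gm / x[j]))
        return np.array(z)

    # Synthetic stand-ins for (SiO2, MgO, FeOt, CaO) wt% of "High-Mg" vs "Common" rocks.
    rng = np.random.default_rng(1)
    high_mg = np.abs(rng.normal([45, 25, 12, 8], 3, size=(200, 4)))
    common  = np.abs(rng.normal([60, 4, 8, 6], 3, size=(200, 4)))
    X = np.array([ilr(row) for row in np.vstack([high_mg, common])])
    y = np.array([1] * 200 + [0] * 200)

    lda = LinearDiscriminantAnalysis().fit(X, y)
    print("training accuracy of the toy discriminant:", lda.score(X, y))
    ```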

  14. Adaptive video-based vehicle classification technique for monitoring traffic.

    DOT National Transportation Integrated Search

    2015-08-01

    This report presents a methodology for extracting two vehicle features, vehicle length and number of axles, in order to classify vehicles from video, based on the Federal Highway Administration's (FHWA) recommended vehicle classification scheme....

  15. Stygoregions – a promising approach to a bioregional classification of groundwater systems

    PubMed Central

    Stein, Heide; Griebler, Christian; Berkhoff, Sven; Matzke, Dirk; Fuchs, Andreas; Hahn, Hans Jürgen

    2012-01-01

    Linked to diverse biological processes, groundwater ecosystems deliver essential services to mankind, the most important of which is the provision of drinking water. In contrast to surface waters, ecological aspects of groundwater systems are ignored by the current European Union and national legislation. Groundwater management and protection measures refer exclusively to its good physicochemical and quantitative status. Current initiatives in developing ecologically sound integrative assessment schemes by taking groundwater fauna into account depend on the initial classification of subsurface bioregions. In a large-scale survey, the regional and biogeographical distribution patterns of groundwater-dwelling invertebrates were examined for many parts of Germany. Following an exploratory approach, our results underline that the distribution patterns of invertebrates in groundwater are not in accordance with any existing bioregional classification system established for surface habitats. In consequence, we propose to develop a new classification scheme for groundwater ecosystems based on stygoregions. PMID:22993698

  16. Kinetic energy classification and smoothing for compact B-spline basis sets in quantum Monte Carlo

    DOE PAGES

    Krogel, Jaron T.; Reboredo, Fernando A.

    2018-01-25

    Quantum Monte Carlo calculations of defect properties of transition metal oxides have become feasible in recent years due to increases in computing power. As the system size has grown, availability of on-node memory has become a limiting factor. Saving memory while minimizing computational cost is now a priority. The main growth in memory demand stems from the B-spline representation of the single particle orbitals, especially for heavier elements such as transition metals where semi-core states are present. Despite the associated memory costs, splines are computationally efficient. In this paper, we explore alternatives to reduce the memory usage of splined orbitals without significantly affecting numerical fidelity or computational efficiency. We make use of the kinetic energy operator to both classify and smooth the occupied set of orbitals prior to splining. By using a partitioning scheme based on the per-orbital kinetic energy distributions, we show that memory savings of about 50% is possible for select transition metal oxide systems. Finally, for production supercells of practical interest, our scheme incurs a performance penalty of less than 5%.

  17. Kinetic energy classification and smoothing for compact B-spline basis sets in quantum Monte Carlo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krogel, Jaron T.; Reboredo, Fernando A.

    Quantum Monte Carlo calculations of defect properties of transition metal oxides have become feasible in recent years due to increases in computing power. As the system size has grown, availability of on-node memory has become a limiting factor. Saving memory while minimizing computational cost is now a priority. The main growth in memory demand stems from the B-spline representation of the single particle orbitals, especially for heavier elements such as transition metals where semi-core states are present. Despite the associated memory costs, splines are computationally efficient. In this paper, we explore alternatives to reduce the memory usage of splined orbitals without significantly affecting numerical fidelity or computational efficiency. We make use of the kinetic energy operator to both classify and smooth the occupied set of orbitals prior to splining. By using a partitioning scheme based on the per-orbital kinetic energy distributions, we show that memory savings of about 50% is possible for select transition metal oxide systems. Finally, for production supercells of practical interest, our scheme incurs a performance penalty of less than 5%.

  18. Kinetic energy classification and smoothing for compact B-spline basis sets in quantum Monte Carlo

    NASA Astrophysics Data System (ADS)

    Krogel, Jaron T.; Reboredo, Fernando A.

    2018-01-01

    Quantum Monte Carlo calculations of defect properties of transition metal oxides have become feasible in recent years due to increases in computing power. As the system size has grown, availability of on-node memory has become a limiting factor. Saving memory while minimizing computational cost is now a priority. The main growth in memory demand stems from the B-spline representation of the single particle orbitals, especially for heavier elements such as transition metals where semi-core states are present. Despite the associated memory costs, splines are computationally efficient. In this work, we explore alternatives to reduce the memory usage of splined orbitals without significantly affecting numerical fidelity or computational efficiency. We make use of the kinetic energy operator to both classify and smooth the occupied set of orbitals prior to splining. By using a partitioning scheme based on the per-orbital kinetic energy distributions, we show that memory savings of about 50% is possible for select transition metal oxide systems. For production supercells of practical interest, our scheme incurs a performance penalty of less than 5%.

  19. Modern classification and outcome predictors of surgery in patients with brain arteriovenous malformations.

    PubMed

    Tayebi Meybodi, Ali; Lawton, Michael T

    2018-02-23

    Brain arteriovenous malformations (bAVM) are challenging lesions. Part of this challenge stems from the infinite diversity of these lesions regarding shape, location, anatomy, and physiology. This diversity has called for a variety of treatment modalities, of which microsurgical resection prevails as the mainstay of treatment. As such, outcome prediction and management strategy mainly rely on unraveling the nature of these complex tangles and the way each lesion responds to various therapeutic modalities. This strategy requires the ability to decipher each lesion through accurate and efficient categorization. Therefore, classification schemes are essential parts of treatment planning and outcome prediction. This article summarizes the different surgical classification schemes and outcome predictors proposed for bAVMs.

  20. Systems-based decomposition schemes for the approximate solution of multi-term fractional differential equations

    NASA Astrophysics Data System (ADS)

    Ford, Neville J.; Connolly, Joseph A.

    2009-07-01

    We give a comparison of the efficiency of three alternative decomposition schemes for the approximate solution of multi-term fractional differential equations using the Caputo form of the fractional derivative. The schemes we compare are based on conversion of the original problem into a system of equations. We review alternative approaches and consider how the most appropriate numerical scheme may be chosen to solve a particular equation.
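
    For reference, the notation behind the record can be sketched as follows: the Caputo derivative used above, and one standard conversion of a multi-term equation into a system, assuming commensurate orders (a simplifying assumption made here for illustration, not a requirement stated by the paper).

    ```latex
    % Caputo derivative of order \alpha with n-1 < \alpha < n (as used in the record above):
    \[
      {}^{C}\!D^{\alpha}_{0} y(t)
        = \frac{1}{\Gamma(n-\alpha)} \int_{0}^{t} (t-s)^{\,n-\alpha-1}\, y^{(n)}(s)\, \mathrm{d}s .
    \]
    % One standard decomposition into a system, assuming commensurate orders \alpha_k = k\alpha:
    \[
      D^{m\alpha} y + \sum_{k=0}^{m-1} a_k D^{k\alpha} y = f(t)
      \;\Longrightarrow\;
      \begin{cases}
        D^{\alpha} y_k = y_{k+1}, & k = 1,\dots,m-1,\\[2pt]
        D^{\alpha} y_m = f(t) - \displaystyle\sum_{k=0}^{m-1} a_k\, y_{k+1},
      \end{cases}
      \qquad y_1 = y .
    \]
    ```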

  1. Medical X-ray Image Hierarchical Classification Using a Merging and Splitting Scheme in Feature Space.

    PubMed

    Fesharaki, Nooshin Jafari; Pourghassem, Hossein

    2013-07-01

    Due to the daily mass production and the widespread variation of medical X-ray images, it is necessary to classify these images for searching and retrieval purposes, especially for content-based medical image retrieval systems. In this paper, a medical X-ray image hierarchical classification structure based on a novel merging and splitting scheme and using shape and texture features is proposed. In the first level of the proposed structure, to improve the classification performance, similar classes with regard to shape content are grouped based on merging measures and shape features into general overlapped classes. In the next levels of this structure, the overlapped classes are split into smaller classes based on the classification performance of a combination of shape and texture features or texture features only. Ultimately, in the last levels, this procedure continues until all the classes are formed separately. Moreover, to optimize the feature vector in the proposed structure, we use an orthogonal forward selection algorithm according to a Mahalanobis class separability measure as a feature selection and reduction algorithm. In other words, according to the complexity and inter-class distance of each class, a sub-space of the feature space is selected in each level and then a supervised merging and splitting scheme is applied to form the hierarchical classification. The proposed structure is evaluated on a database consisting of 2158 medical X-ray images of 18 classes (IMAGECLEF 2005 database), and an accuracy rate of 93.6% in the last level of the hierarchical structure is obtained for an 18-class classification problem.

  2. Deep learning aided decision support for pulmonary nodules diagnosing: a review.

    PubMed

    Yang, Yixin; Feng, Xiaoyi; Chi, Wenhao; Li, Zhengyang; Duan, Wenzhe; Liu, Haiping; Liang, Wenhua; Wang, Wei; Chen, Ping; He, Jianxing; Liu, Bo

    2018-04-01

    Deep learning techniques have recently emerged as promising decision-support approaches to automatically analyze medical images for different clinical diagnostic purposes. Computer-assisted diagnosis of pulmonary nodules has received considerable theoretical, computational, and empirical research attention, and numerous methods have been developed over the past five decades for the detection and classification of pulmonary nodules on different image formats, including chest radiographs, computed tomography (CT), and positron emission tomography. The recent remarkable progress in deep learning for pulmonary nodules, achieved in both academia and industry, has demonstrated that deep learning techniques are promising alternative decision-support schemes for tackling the central issues in pulmonary nodule diagnosis, including feature extraction, nodule detection, false-positive reduction, and benign-malignant classification for the huge volume of chest scan data. The main goal of this investigation is to provide a comprehensive state-of-the-art review of deep learning aided decision support for pulmonary nodule diagnosis. As far as the authors know, this is the first review devoted exclusively to deep learning techniques for pulmonary nodule diagnosis.

  3. Beyond the frontiers of neuronal types

    PubMed Central

    Battaglia, Demian; Karagiannis, Anastassios; Gallopin, Thierry; Gutch, Harold W.; Cauli, Bruno

    2012-01-01

    Cortical neurons and, particularly, inhibitory interneurons display a large diversity of morphological, synaptic, electrophysiological, and molecular properties, as well as diverse embryonic origins. Various authors have proposed alternative classification schemes that rely on the concomitant observation of several multimodal features. However, a broad variability is generally observed even among cells that are grouped into a same class. Furthermore, the attribution of specific neurons to a single defined class is often difficult, because individual properties vary in a highly graded fashion, suggestive of continua of features between types. Going beyond the description of representative traits of distinct classes, we focus here on the analysis of atypical cells. We introduce a novel paradigm for neuronal type classification, assuming explicitly the existence of a structured continuum of diversity. Our approach, grounded on the theory of fuzzy sets, identifies a small optimal number of model archetypes. At the same time, it quantifies the degree of similarity between these archetypes and each considered neuron. This allows highlighting archetypal cells, which bear a clear similarity to a single model archetype, and edge cells, which manifest a convergence of traits from multiple archetypes. PMID:23403725

  4. A new automated spectral feature extraction method and its application in spectral classification and defective spectra recovery

    NASA Astrophysics Data System (ADS)

    Wang, Ke; Guo, Ping; Luo, A.-Li

    2017-03-01

    Spectral feature extraction is a crucial procedure in automated spectral analysis. This procedure starts from the spectral data and produces informative and non-redundant features, facilitating the subsequent automated processing and analysis with machine-learning and data-mining techniques. In this paper, we present a new automated feature extraction method for astronomical spectra, with application in spectral classification and defective spectra recovery. The basic idea of our approach is to train a deep neural network to extract features of spectra with different levels of abstraction in different layers. The deep neural network is trained with a fast layer-wise learning algorithm in an analytical way without any iterative optimization procedure. We evaluate the performance of the proposed scheme on real-world spectral data. The results demonstrate that our method is superior regarding its comprehensive performance, and the computational cost is significantly lower than that for other methods. The proposed method can be regarded as a new valid alternative general-purpose feature extraction method for various tasks in spectral data analysis.

  5. A Visual Basic program to classify sediments based on gravel-sand-silt-clay ratios

    USGS Publications Warehouse

    Poppe, L.J.; Eliason, A.H.; Hastings, M.E.

    2003-01-01

    Nomenclature describing size distributions is important to geologists because grain size is the most basic attribute of sediments. Traditionally, geologists have divided sediments into four size fractions that include gravel, sand, silt, and clay, and classified these sediments based on ratios of the various proportions of the fractions. Definitions of these fractions have long been standardized to the grade scale described by Wentworth (1922), and two main classification schemes have been adopted to describe the approximate relationship between the size fractions. Specifically, according to the Wentworth grade scale, gravel-sized particles have a nominal diameter of ⩾2.0 mm; sand-sized particles have nominal diameters from <2.0 mm to ⩾62.5 μm; silt-sized particles have nominal diameters from <62.5 to ⩾4.0 μm; and clay is <4.0 μm. As for sediment classification, most sedimentologists use one of the systems described either by Shepard (1954) or Folk (1954, 1974). The original scheme devised by Shepard (1954) utilized a single ternary diagram with sand, silt, and clay in the corners to graphically show the relative proportions among these three grades within a sample. This scheme, however, does not allow for sediments with significant amounts of gravel. Therefore, Shepard's classification scheme (Fig. 1) was subsequently modified by the addition of a second ternary diagram to account for the gravel fraction (Schlee, 1973). The system devised by Folk (1954, 1974) is also based on two triangular diagrams (Fig. 2), but it has 23 major categories, and uses the term mud (defined as silt plus clay). The patterns within the triangles of both systems differ, as does the emphasis placed on gravel. For example, in the system described by Shepard, gravelly sediments have more than 10% gravel; in Folk's system, slightly gravelly sediments have as little as 0.01% gravel. Folk's classification scheme stresses gravel because its concentration is a function of the highest current velocity at the time of deposition, together with the maximum grain size of the detritus that is available; Shepard's classification scheme emphasizes the ratios of sand, silt, and clay because they reflect sorting and reworking (Poppe et al., 2000).
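
    The classification step itself reduces to ratio thresholds on a ternary diagram. The fragment below is a much-simplified, illustrative Shepard-style classifier; the USGS program implements the full Shepard and Folk diagrams, including the gravel ternary and Folk's 23 classes, none of which are reproduced here.

    ```python
    # Much-simplified sketch of ratio-based sediment naming in the spirit of Shepard
    # (1954); thresholds and interior-field naming are illustrative simplifications.
    # Inputs are weight percentages that sum to 100.
    def shepard_like_class(gravel, sand, silt, clay):
        if gravel > 10:                              # Shepard: "gravelly" above 10% gravel
            return "gravelly sediment"
        total = sand + silt + clay
        s, si, c = (100 * sand / total, 100 * silt / total, 100 * clay / total)
        if s >= 75:
            return "sand"
        if si >= 75:
            return "silt"
        if c >= 75:
            return "clay"
        # Interior fields of the ternary diagram: name by the two dominant fractions.
        pairs = sorted([("sand", s), ("silt", si), ("clay", c)], key=lambda kv: -kv[1])
        adj = {"sand": "sandy", "silt": "silty", "clay": "clayey"}
        return f"{adj[pairs[1][0]]} {pairs[0][0]}"   # e.g. "silty sand"

    print(shepard_like_class(gravel=2, sand=60, silt=30, clay=8))   # -> "silty sand"
    ```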

  6. Etiologic classification of TIA and minor stroke by A-S-C-O and causative classification system as compared to TOAST reduces the proportion of patients categorized as cause undetermined.

    PubMed

    Desai, Jamsheed A; Abuzinadah, Ahmad R; Imoukhuede, Oje; Bernbaum, Manya L; Modi, Jayesh; Demchuk, Andrew M; Coutts, Shelagh B

    2014-01-01

    The sorting of patients based on the underlying pathophysiology is central to preventing recurrent stroke after a transient ischemic attack and minor stroke (TIA-MS). The causative classification of stroke (CCS) and the A-S-C-O (A for atherosclerosis, S for small vessel disease, C for cardiac source, O for other cause) classification schemes have recently been developed. These systems have not been specifically applied to the TIA-MS population. We hypothesized that both CCS and A-S-C-O would increase the proportion of patients with a definitive etiologic mechanism for TIA-MS as compared with TOAST. Patients were analyzed from the CATCH study. A single stroke physician assigned all patients to an etiologic subtype using published algorithms for TOAST, CCS and ASCO. We compared the proportions in the various categories for each classification scheme, and the association with stroke progression or recurrence was then assessed. The TOAST, CCS and A-S-C-O classification schemes were applied in 469 TIA-MS patients. When compared to TOAST, both CCS (58.0 vs. 65.3%; p < 0.0001) and ASCO grade 1 or 2 (37.5 vs. 65.3%; p < 0.0001) assigned fewer patients as cause undetermined. CCS had increased assignment of cardioembolism (+3.8%, p = 0.0001) as compared with TOAST. ASCO grade 1 or 2 had increased assignment of cardioembolism (+8.5%, p < 0.0001), large artery atherosclerosis (+14.9%, p < 0.0001) and small artery occlusion (+4.3%, p < 0.0001) as compared with TOAST. Compared with CCS, using ASCO resulted in a 20.5% absolute reduction in patients assigned to the 'cause undetermined' category (p < 0.0001). Patients who had multiple high-risk etiologies by either CCS or ASCO classification, or an ASCO undetermined classification, had a higher chance of having a recurrent event. Both CCS and ASCO schemes reduce the proportion of TIA and minor stroke patients classified as 'cause undetermined.' ASCO resulted in the fewest patients classified as cause undetermined. Stroke recurrence after TIA-MS is highest in patients with multiple high-risk etiologies or cryptogenic stroke classified by ASCO. © 2014 S. Karger AG, Basel.

  7. Development of a Hazard Classification Scheme for Substances Used in the Fraudulent Adulteration of Foods.

    PubMed

    Everstine, Karen; Abt, Eileen; McColl, Diane; Popping, Bert; Morrison-Rowe, Sara; Lane, Richard W; Scimeca, Joseph; Winter, Carl; Ebert, Andrew; Moore, Jeffrey C; Chin, Henry B

    2018-01-01

    Food fraud, the intentional misrepresentation of the true identity of a food product or ingredient for economic gain, is a threat to consumer confidence and public health and has received increased attention from both regulators and the food industry. Following updates to food safety certification standards and publication of new U.S. regulatory requirements, we undertook a project to (i) develop a scheme to classify food fraud-related adulterants based on their potential health hazard and (ii) apply this scheme to the adulterants in a database of 2,970 food fraud records. The classification scheme was developed by a panel of experts in food safety and toxicology from the food industry, academia, and the U.S. Food and Drug Administration. Categories and subcategories were created through an iterative process of proposal, review, and validation using a subset of substances known to be associated with the fraudulent adulteration of foods. Once developed, the scheme was applied to the adulterants in the database. The resulting scheme included three broad categories: 1, potentially hazardous adulterants; 2, adulterants that are unlikely to be hazardous; and 3, unclassifiable adulterants. Categories 1 and 2 consisted of seven subcategories intended to further define the range of hazard potential for adulterants. Application of the scheme to the 1,294 adulterants in the database resulted in 45% of adulterants classified in category 1 (potentially hazardous). Twenty-seven percent of the 1,294 adulterants had a history of causing consumer illness or death, were associated with safety-related regulatory action, or were classified as allergens. These results reinforce the importance of including a consideration of food fraud-related adulterants in food safety systems. This classification scheme supports food fraud mitigation efforts and hazard identification as required in the U.S. Food Safety Modernization Act Preventive Controls Rules.

  8. Initial interpretation and evaluation of a profile-based classification system for the anxiety and mood disorders: Incremental validity compared to DSM-IV categories.

    PubMed

    Rosellini, Anthony J; Brown, Timothy A

    2014-12-01

    Limitations in anxiety and mood disorder diagnostic reliability and validity due to the categorical approach to classification used by the Diagnostic and Statistical Manual of Mental Disorders (DSM) have been long recognized. Although these limitations have led researchers to forward alternative classification schemes, few have been empirically evaluated. In a sample of 1,218 outpatients with anxiety and mood disorders, the present study examined the validity of Brown and Barlow's (2009) proposal to classify the anxiety and mood disorders using an integrated dimensional-categorical approach based on transdiagnostic emotional disorder vulnerabilities and phenotypes. Latent class analyses of 7 transdiagnostic dimensional indicators suggested that a 6-class (i.e., profile) solution provided the best model fit and was the most conceptually interpretable. Interpretation of the classes was further supported when compared with DSM diagnoses (i.e., within-class prevalence of diagnoses, using diagnoses to predict class membership). In addition, hierarchical multiple regression models were used to demonstrate the incremental validity of the profiles; class probabilities consistently accounted for unique variance in anxiety and mood disorder outcomes above and beyond DSM diagnoses. These results provide support for the potential development and utility of a hybrid dimensional-categorical profile approach to anxiety and mood disorder classification. In particular, the availability of dimensional indicators and corresponding profiles may serve as a useful complement to DSM diagnoses for both researchers and clinicians. (c) 2014 APA, all rights reserved.

  9. Initial Interpretation and Evaluation of a Profile-Based Classification System for the Anxiety and Mood Disorders: Incremental Validity Compared to DSM-IV Categories

    PubMed Central

    Rosellini, Anthony J.; Brown, Timothy A.

    2014-01-01

    Limitations in anxiety and mood disorder diagnostic reliability and validity due to the categorical approach to classification used by the Diagnostic and Statistical Manual of Mental Disorders (DSM) have been long recognized. Although these limitations have led researchers to forward alternative classification schemes, few have been empirically evaluated. In a sample of 1,218 outpatients with anxiety and mood disorders, the present study examined the validity of Brown and Barlow's (2009) proposal to classify the anxiety and mood disorders using an integrated dimensional-categorical approach based on transdiagnostic emotional disorder vulnerabilities and phenotypes. Latent class analyses of seven transdiagnostic dimensional indicators suggested that a six-class (i.e., profile) solution provided the best model fit and was the most conceptually interpretable. Interpretation of the classes was further supported when compared with DSM-IV diagnoses (i.e., within-class prevalence of diagnoses, using diagnoses to predict class membership). In addition, hierarchical multiple regression models were used to demonstrate the incremental validity of the profiles; class probabilities consistently accounted for unique variance in anxiety and mood disorder outcomes above and beyond DSM diagnoses. These results provide support for the potential development and utility of a hybrid dimensional-categorical profile approach to anxiety and mood disorder classification. In particular, the availability of dimensional indicators and corresponding profiles may serve as a useful complement to DSM diagnoses for both researchers and clinicians. PMID:25265416

  10. Functional traits, convergent evolution, and periodic tables of niches.

    PubMed

    Winemiller, Kirk O; Fitzgerald, Daniel B; Bower, Luke M; Pianka, Eric R

    2015-08-01

    Ecology is often said to lack general theories sufficiently predictive for applications. Here, we examine the concept of a periodic table of niches and feasibility of niche classification schemes from functional trait and performance data. Niche differences and their influence on ecological patterns and processes could be revealed effectively by first performing data reduction/ordination analyses separately on matrices of trait and performance data compiled according to logical associations with five basic niche 'dimensions', or aspects: habitat, life history, trophic, defence and metabolic. Resultant patterns then are integrated to produce interpretable niche gradients, ordinations and classifications. Degree of scheme periodicity would depend on degrees of niche conservatism and convergence causing species clustering across multiple niche dimensions. We analysed a sample data set containing trait and performance data to contrast two approaches for producing niche schemes: species ordination within niche gradient space, and niche categorisation according to trait-value thresholds. Creation of niche schemes useful for advancing ecological knowledge and its applications will depend on research that produces functional trait and performance datasets directly related to niche dimensions along with criteria for data standardisation and quality. As larger databases are compiled, opportunities will emerge to explore new methods for data reduction, ordination and classification. © 2015 The Authors. Ecology Letters published by CNRS and John Wiley & Sons Ltd.

  11. Using dual classifications in the development of avian wetland indices of biological integrity for wetlands in West Virginia, USA.

    PubMed

    Veselka, Walter; Anderson, James T; Kordek, Walter S

    2010-05-01

    Considerable resources are being used to develop and implement bioassessment methods for wetlands to ensure that "biological integrity" is maintained under the United States Clean Water Act. Previous research has demonstrated that avian composition is susceptible to human impairments at multiple spatial scales. Using a site-specific disturbance gradient, we built avian wetland indices of biological integrity (AW-IBI) specific to two wetland classification schemes, one based on vegetative structure and the other based on the wetland's position in the landscape and sources of water. The resulting class-specific AW-IBI comprised one to four metrics that varied in their sensitivity to the disturbance gradient. Some of these metrics were specific to only one of the classification schemes, whereas others could discriminate varying levels of disturbance regardless of classification scheme. Overall, all of the derived biological indices specific to the vegetative structure-based classes of wetlands had a significant relation with the disturbance gradient; however, the biological index derived for floodplain wetlands exhibited a more consistent response to a local disturbance gradient. We suspect that the consistency of this response is due to the inherent connectivity of available habitat in floodplain wetlands.

  12. A Critical Review of Mode of Action (MOA) Assignment ...

    EPA Pesticide Factsheets

    There are various structure-based classification schemes to categorize chemicals based on mode of action (MOA), which have been applied in both ecological and human health toxicology. With increasing calls to assess thousands of chemicals, some of which have little available information other than structure, a clear understanding of how each of these MOA schemes was devised, what information it is based on, and the limitations of each approach is critical. Several groups are developing low-tier methods to more easily classify or assess chemicals, using approaches such as the ecological threshold of concern (eco-TTC) and chemical activity. Evaluation of these approaches and determination of their domain of applicability is partly dependent on the MOA classification that is used. The most commonly used MOA classification schemes for ecotoxicology include Verhaar and Russom (included in ASTER), both of which are used to predict acute aquatic toxicity MOA. Verhaar is a QSAR-based system that classifies chemicals into one of 4 classes, with a 5th class specified for those chemicals that are not classified in the other 4. ASTER/Russom includes 8 classifications: narcotics (3 groups), oxidative phosphorylation uncouplers, respiratory inhibitors, electrophiles/proelectrophiles, AChE inhibitors, or CNS seizure agents. Other methodologies include TEST (Toxicity Estimation Software Tool), a computational chemistry-based application that allows prediction to one of 5 broad MOA

  13. Global land cover mapping: a review and uncertainty analysis

    USGS Publications Warehouse

    Congalton, Russell G.; Gu, Jianyu; Yadav, Kamini; Thenkabail, Prasad S.; Ozdogan, Mutlu

    2014-01-01

    Given the advances in remotely sensed imagery and associated technologies, several global land cover maps have been produced in recent times including IGBP DISCover, UMD Land Cover, Global Land Cover 2000 and GlobCover 2009. However, the utility of these maps for specific applications has often been hampered due to considerable amounts of uncertainties and inconsistencies. A thorough review of these global land cover projects including evaluating the sources of error and uncertainty is prudent and enlightening. Therefore, this paper describes our work in which we compared, summarized and conducted an uncertainty analysis of the four global land cover mapping projects using an error budget approach. The results showed that the classification scheme and the validation methodology had the highest error contribution and implementation priority. A comparison of the classification schemes showed that there are many inconsistencies between the definitions of the map classes. This is especially true for the mixed type classes for which thresholds vary for the attributes/discriminators used in the classification process. Examination of these four global mapping projects provided quite a few important lessons for the future global mapping projects including the need for clear and uniform definitions of the classification scheme and an efficient, practical, and valid design of the accuracy assessment.

  14. A soft computing scheme incorporating ANN and MOV energy in fault detection, classification and distance estimation of EHV transmission line with FSC.

    PubMed

    Khadke, Piyush; Patne, Nita; Singh, Arvind; Shinde, Gulab

    2016-01-01

    In this article, a novel and accurate scheme for fault detection, classification and fault distance estimation for a fixed series compensated transmission line is proposed. The proposed scheme is based on an artificial neural network (ANN) and metal oxide varistor (MOV) energy, employing the Levenberg-Marquardt training algorithm. The novelty of this scheme is the use of the MOV energy signals of the fixed series capacitors (FSC) as input to train the ANN; such an approach has not been used in earlier fault analysis algorithms. The proposed scheme uses only single-end measurements of the MOV energy signals in all three phases over one cycle from the occurrence of a fault. These MOV energy signals are then fed as input to the ANN for fault distance estimation. The feasibility and reliability of the proposed scheme have been evaluated for all ten types of fault in a test power system model, at different fault inception angles and over numerous fault locations. Real transmission system parameters of the 3-phase 400 kV Wardha-Aurangabad transmission line (400 km) with 40% FSC at the Power Grid Wardha substation, India, are considered for this research. Extensive simulation experiments show that the proposed scheme provides quite accurate results, demonstrating a complete protection scheme with high accuracy, simplicity and robustness.
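
    The core idea, per-phase MOV energy features measured at a single line end feeding a neural network that estimates fault distance, can be sketched as below. The data are synthetic placeholders, and scikit-learn's MLP with its default solver stands in for the Levenberg-Marquardt-trained ANN of the study.

    ```python
    # Hedged sketch: regress fault distance from per-phase MOV energy features with a
    # small neural network. Feature model and data are synthetic assumptions only.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n = 1500
    distance_km = rng.uniform(0, 400, n)                       # fault location along the line
    mov_energy = np.column_stack([                             # per-phase MOV energy (arbitrary units)
        np.exp(-distance_km / 150) + 0.05 * rng.normal(size=n),
        np.exp(-distance_km / 180) + 0.05 * rng.normal(size=n),
        np.exp(-distance_km / 120) + 0.05 * rng.normal(size=n),
    ])

    X_tr, X_te, y_tr, y_te = train_test_split(mov_energy, distance_km, random_state=0)
    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=3000, random_state=0))
    model.fit(X_tr, y_tr)
    print("R^2 of fault-distance estimate on held-out faults:", round(model.score(X_te, y_te), 3))
    ```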

  15. Evaluation of host and viral factors associated with severe dengue based on the 2009 WHO classification.

    PubMed

    Pozo-Aguilar, Jorge O; Monroy-Martínez, Verónica; Díaz, Daniel; Barrios-Palacios, Jacqueline; Ramos, Celso; Ulloa-García, Armando; García-Pillado, Janet; Ruiz-Ordaz, Blanca H

    2014-12-11

    Dengue fever (DF) is the most prevalent arthropod-borne viral disease affecting humans. The World Health Organization (WHO) proposed a revised classification in 2009 to enable the more effective identification of cases of severe dengue (SD). This was designed primarily as a clinical tool, but it also enables cases of SD to be differentiated into three specific subcategories (severe vascular leakage, severe bleeding, and severe organ dysfunction). However, no study has addressed whether this classification has an advantage in estimating factors associated with the progression of disease severity or dengue pathogenesis. In a dengue outbreak, we evaluated host and viral risk factors that could contribute to the development of SD according to the 2009 WHO classification. A prospective cross-sectional study was performed during an epidemic of dengue in 2009 in Chiapas, Mexico. Data were analyzed for host and viral factors associated with dengue cases, using the 1997 and 2009 WHO classifications. The cost-benefit ratio (CBR) was also estimated. The sensitivity of the 1997 WHO classification for determining SD was 75%, and the specificity was 97.7%. For the 2009 scheme, these were 100% and 81.1%, respectively. The 2009 classification showed a higher benefit (537%) with a lower cost (10.2%) than the 1997 WHO scheme. A secondary antibody response was strongly associated with SD. Early viral load was higher in cases of SD than in those with DF. Logistic regression analysis identified predictive SD factors (secondary infection, disease phase, viral load) within the 2009 classification. However, within the 1997 scheme it was not possible to differentiate risk factors between DF and dengue hemorrhagic fever or dengue shock syndrome. The critical clinical stage for determining SD progression was the transition from fever to defervescence, in which plasma leakage can occur. The clinical phenotype of SD is influenced by host (secondary response) and viral (viral load) factors. The 2009 WHO classification showed greater sensitivity to identify SD in real time. Timely identification of SD enables accurate early decisions, allowing proper management of health resources for the benefit of patients at risk for SD. This is possible based on the 2009 WHO classification.

  16. Reconciling Mining with the Conservation of Cave Biodiversity: A Quantitative Baseline to Help Establish Conservation Priorities

    PubMed Central

    Prous, Xavier; Zampaulo, Robson; Giannini, Tereza C.; Imperatriz-Fonseca, Vera L.; Maurity, Clóvis; Oliveira, Guilherme; Brandi, Iuri V.; Siqueira, José O.

    2016-01-01

    Caves pose significant challenges for mining projects, since they harbor many endemic and threatened species, and must therefore be protected. Recent discussions between academia, environmental protection agencies, and industry partners, have highlighted problems with the current Brazilian legislation for the protection of caves. While the licensing process is long, complex and cumbersome, the criteria used to assign caves into conservation relevance categories are often subjective, with relevance being mainly determined by the presence of obligate cave dwellers (troglobites) and their presumed rarity. However, the rarity of these troglobitic species is questionable, as most remain unidentified to the species level and their habitats and distribution ranges are poorly known. Using data from 844 iron caves retrieved from different speleology reports for the Carajás region (South-Eastern Amazon, Brazil), one of the world's largest deposits of high-grade iron ore, we assess the influence of different cave characteristics on four biodiversity proxies (species richness, presence of troglobites, presence of rare troglobites, and presence of resident bat populations). We then examine how the current relevance classification scheme ranks caves with different biodiversity indicators. Large caves were found to be important reservoirs of biodiversity, so they should be prioritized in conservation programs. Our results also reveal spatial autocorrelation in all the biodiversity proxies assessed, indicating that iron caves should be treated as components of a cave network immersed in the karst landscape. Finally, we show that by prioritizing the conservation of rare troglobites, the current relevance classification scheme is undermining overall cave biodiversity and leaving ecologically important caves unprotected. We argue that conservation efforts should target subterranean habitats as a whole and propose an alternative relevance ranking scheme, which could help simplify the assessment process and channel more resources to the effective protection of overall cave biodiversity. PMID:27997576

  17. COMPARISON OF GEOGRAPHIC CLASSIFICATION SCHEMES FOR MID-ATLANTIC STREAM FISH ASSEMBLAGES

    EPA Science Inventory

    Understanding the influence of geographic factors in structuring fish assemblages is crucial to developing a comprehensive assessment of stream conditions. We compared the classification strengths (CS) of geographic groups (ecoregions and catchments), stream order, and groups bas...

  18. Sorting Potatoes for Miss Bonner.

    ERIC Educational Resources Information Center

    Herreid, Clyde Freeman

    1998-01-01

    Discusses the basis of a classification scheme for types of case studies. Four major classification headings are identified: (1) individual assignment; (2) lecture; (3) discussion; and (4) small group activities. Describes each heading from the point of view of several teaching methods. (DDR)

  19. SOM Classification of Martian TES Data

    NASA Technical Reports Server (NTRS)

    Hogan, R. C.; Roush, T. L.

    2002-01-01

    A classification scheme based on unsupervised self-organizing maps (SOM) is described. Results from its application to the ASU mineral spectral database are presented. Applications to the Martian Thermal Emission Spectrometer data are discussed. Additional information is contained in the original extended abstract.
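
    For readers unfamiliar with self-organizing maps, the fragment below trains a tiny SOM from scratch and assigns a sample to its best-matching node. It is only a generic illustration of unsupervised SOM clustering, not the classifier applied to the ASU spectral library or the TES data.

    ```python
    # Tiny from-scratch self-organizing map; a generic illustration, not the record's method.
    import numpy as np

    def train_som(data, rows=5, cols=5, iters=2000, lr0=0.5, sigma0=2.0, seed=0):
        rng = np.random.default_rng(seed)
        weights = rng.random((rows, cols, data.shape[1]))            # one prototype per node
        grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
        for t in range(iters):
            x = data[rng.integers(len(data))]
            # best matching unit (BMU): node whose prototype is closest to the sample
            d = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            lr = lr0 * np.exp(-t / iters)                            # decaying learning rate
            sigma = sigma0 * np.exp(-t / iters)                      # shrinking neighbourhood
            h = np.exp(-np.sum((grid - np.array(bmu)) ** 2, axis=2) / (2 * sigma ** 2))
            weights += lr * h[..., None] * (x - weights)             # pull neighbourhood toward x
        return weights

    def map_to_node(weights, x):
        """Classify a spectrum by the SOM node (cluster) it falls into."""
        d = np.linalg.norm(weights - x, axis=2)
        return np.unravel_index(np.argmin(d), d.shape)

    spectra = np.random.default_rng(1).random((300, 20))             # stand-in "spectra"
    som = train_som(spectra)
    print("node assigned to the first spectrum:", map_to_node(som, spectra[0]))
    ```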

  20. Exploring the impact of wavelet-based denoising in the classification of remote sensing hyperspectral images

    NASA Astrophysics Data System (ADS)

    Quesada-Barriuso, Pablo; Heras, Dora B.; Argüello, Francisco

    2016-10-01

    The classification of remote sensing hyperspectral images for land cover applications is a very active topic. In the case of supervised classification, Support Vector Machines (SVMs) play a dominant role. Recently, the Extreme Learning Machine algorithm (ELM) has also been extensively used. The classification scheme previously published by the authors, called WT-EMP, introduces spatial information into the classification process by means of an Extended Morphological Profile (EMP) that is created from features extracted by wavelets. In addition, the hyperspectral image is denoised in the 2-D spatial domain, also using wavelets, and it is joined to the EMP via a stacked vector. In this paper, the scheme is improved to achieve two goals. The first is to reduce the classification time while preserving the accuracy of the classification by using ELM instead of SVM. The second is to improve the accuracy by performing not only a 2-D denoising for every spectral band, but also an additional prior 1-D spectral-signature denoising applied to each pixel vector of the image. For each denoising step, the image is transformed by applying a 1-D or 2-D wavelet transform, and then a NeighShrink thresholding is applied. Improvements in terms of classification accuracy are obtained, especially for images with close regions in the classification reference map, because in these cases the accuracy of the classification at the edges between classes is more relevant.
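
    The 1-D spectral-signature denoising step can be illustrated as follows: wavelet-decompose each pixel vector, threshold the detail coefficients, and reconstruct. Plain soft thresholding with a universal threshold is used below as a simplified stand-in for the paper's NeighShrink thresholding, and the 2-D per-band spatial stage is omitted; the synthetic "spectrum" is a placeholder.

    ```python
    # Hedged sketch of per-pixel 1-D wavelet denoising (simplified: universal soft
    # threshold instead of NeighShrink; no 2-D spatial stage).
    import numpy as np
    import pywt

    def denoise_spectrum(spectrum, wavelet="db4", level=3):
        coeffs = pywt.wavedec(spectrum, wavelet, level=level)
        # noise level estimated from the finest-scale detail coefficients
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thr = sigma * np.sqrt(2 * np.log(len(spectrum)))
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[: len(spectrum)]

    # Toy usage on a noisy synthetic "pixel vector" of 200 spectral bands.
    rng = np.random.default_rng(0)
    bands = np.linspace(0, 4 * np.pi, 200)
    noisy = np.sin(bands) + 0.3 * rng.normal(size=bands.size)
    clean = denoise_spectrum(noisy)
    print("residual std before/after:", np.std(noisy - np.sin(bands)).round(3),
          np.std(clean - np.sin(bands)).round(3))
    ```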

  1. Extending a field-based Sonoran desert vegetation classification to a regional scale using optical and microwave satellite imagery

    NASA Astrophysics Data System (ADS)

    Shupe, Scott Marshall

    2000-10-01

    Vegetation mapping in arid regions facilitates ecological studies and land management, and provides a record to which future land changes can be compared. Accurate and representative mapping of desert vegetation requires a sound field sampling program and a methodology to transform the data collected into a representative classification system. Time and cost constraints require that a remote sensing approach be used if such a classification system is to be applied on a regional scale. However, desert vegetation may be sparse and thus difficult to sense at typical satellite resolutions, especially given the problem of soil reflectance. This study was designed to address these concerns by conducting vegetation mapping research using field and satellite data from the US Army Yuma Proving Ground (USYPG) in Southwest Arizona. Line and belt transect data from the Army's Land Condition Trend Analysis (LCTA) Program were transformed into relative cover and relative density classification schemes using cluster analysis. Ordination analysis of the same data produced two- and three-dimensional graphs on which the homogeneity of each vegetation class could be examined. It was found that the use of correspondence analysis (CA), detrended correspondence analysis (DCA), and non-metric multidimensional scaling (NMS) ordination methods was superior to the use of any single ordination method for helping to clarify between-class and within-class relationships in vegetation composition. Analysis of these between-class and within-class relationships was of key importance in examining how well the relative cover and relative density schemes characterize the USYPG vegetation. Using these two classification schemes as reference data, maximum likelihood and artificial neural net classifications were then performed on a coregistered dataset consisting of a summer Landsat Thematic Mapper (TM) image, one spring and one summer ERS-1 microwave image, and elevation, slope, and aspect layers. Classifications using a combination of ERS-1 imagery and elevation, slope, and aspect data were superior to classifications carried out using Landsat TM data alone. In all classification iterations, the highest classification accuracy was consistently obtained by using a combination of Landsat TM, ERS-1, and elevation, slope, and aspect data. Maximum likelihood classification accuracy was higher than artificial neural net classification accuracy in all cases.
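
    Gaussian maximum-likelihood classification of a stacked per-pixel feature vector (e.g. TM bands, ERS-1 backscatter, terrain layers) is equivalent to quadratic discriminant analysis with per-class covariances. The sketch below illustrates that formulation on synthetic training pixels; it is not the study's processing chain, and the class count and feature stack are assumptions.

    ```python
    # Hedged sketch of per-pixel Gaussian maximum-likelihood classification over a
    # stacked feature set; QDA with per-class covariances is the standard equivalent.
    # Training pixels below are synthetic placeholders.
    import numpy as np
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

    rng = np.random.default_rng(0)
    n_per_class, n_features = 300, 9                 # e.g. 6 TM bands + 1 ERS-1 + 2 terrain layers
    means = rng.uniform(0, 1, size=(4, n_features))  # four hypothetical vegetation classes
    X = np.vstack([rng.normal(m, 0.15, size=(n_per_class, n_features)) for m in means])
    y = np.repeat(np.arange(4), n_per_class)

    mlc = QuadraticDiscriminantAnalysis(store_covariance=True).fit(X, y)
    print("resubstitution accuracy of the toy classifier:", round(mlc.score(X, y), 3))
    ```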

  2. A Job Classification Scheme for Health Manpower

    PubMed Central

    Weiss, Jeffrey H.

    1968-01-01

    The Census Bureau's occupational classification scheme and concept of the “health services industry” are inadequate tools for analysis of the changing job structure of health manpower. In an attempt to remedy their inadequacies, a new analytical framework—drawing upon the work of James Scoville on the job content of the U.S. economy—was devised. The first stage in formulating this new framework was to determine which jobs should be considered health jobs. The overall health care job family was designed to encompass jobs in which the primary technical focus or function is oriented toward the provision of health services. There are two dimensions to the job classification scheme presented here. The first describes each job in terms of job content; relative income data and minimum education and training requirements were employed as surrogate measures. By this means, health care jobs were grouped by three levels of job content: high, medium, and low. The other dimension describes each job in terms of its technical focus or function; by this means, health care jobs were grouped into nine job families. PMID:5673666

  3. Exploiting unsupervised and supervised classification for segmentation of the pathological lung in CT

    NASA Astrophysics Data System (ADS)

    Korfiatis, P.; Kalogeropoulou, C.; Daoussis, D.; Petsas, T.; Adonopoulos, A.; Costaridou, L.

    2009-07-01

    Delineation of lung fields in the presence of diffuse lung parenchymal diseases (DLPDs), such as interstitial pneumonias (IP), challenges segmentation algorithms. To deal with IP patterns affecting the lung border, an automated image texture classification scheme is proposed. The proposed segmentation scheme is based on supervised texture classification between lung tissue (normal and abnormal) and surrounding tissue (pleura and thoracic wall) in the lung border region. This region is coarsely defined around an initial estimate of the lung border, provided by means of Markov Random Field modeling and morphological operations. Subsequently, a support vector machine classifier was trained to distinguish between the above two classes of tissue, using textural features of the gray-scale and wavelet domains. Seventeen patients diagnosed with IP secondary to connective tissue diseases were examined. Segmentation performance in terms of overlap was 0.924±0.021, and for shape differentiation the mean, rms and maximum distances were 1.663±0.816, 2.334±1.574 and 8.0515±6.549 mm, respectively. An accurate, automated scheme is proposed for segmenting abnormal lung fields in HRCT affected by IP.

  4. A Classification Scheme for Analyzing Mobile Apps Used to Prevent and Manage Disease in Late Life

    PubMed Central

    Wang, Aiguo; Lu, Xin; Chen, Hongtu; Li, Changqun; Levkoff, Sue

    2014-01-01

    Background There are several mobile apps that offer tools for disease prevention and management among older adults, and promote health behaviors that could potentially reduce or delay the onset of disease. A classification scheme that categorizes apps could be useful to both older adult app users and app developers. Objective The objective of our study was to build and evaluate the effectiveness of a classification scheme that classifies mobile apps available for older adults in the “Health & Fitness” category of the iTunes App Store. Methods We constructed a classification scheme for mobile apps according to three dimensions: (1) the Precede-Proceed Model (PPM), which classifies mobile apps in terms of predisposing, enabling, and reinforcing factors for behavior change; (2) health care process, specifically prevention versus management of disease; and (3) health conditions, including physical health and mental health. Content analysis was conducted by the research team on health and fitness apps designed specifically for older adults, as well as those applicable to older adults, released during the months of June and August 2011 and August 2012. Face validity was assessed by a different group of individuals, who were not related to the study. A reliability analysis was conducted to confirm the accuracy of the coding scheme of the sample apps in this study. Results After applying sample inclusion and exclusion criteria, a total of 119 apps were included in the study sample, of which 26/119 (21.8%) were released in June 2011, 45/119 (37.8%) in August 2011, and 48/119 (40.3%) in August 2012. Face validity was determined by interviewing 11 people, who agreed that this scheme accurately reflected the nature of this application. The entire study sample was successfully coded, demonstrating satisfactory inter-rater reliability by two independent coders (95.8% initial concordance and 100% concordance after consensus was reached). The apps included in the study sample were more likely to be used for the management of disease than prevention of disease (109/119, 91.6% vs 15/119, 12.6%). More apps contributed to physical health rather than mental health (81/119, 68.1% vs 47/119, 39.5%). Enabling apps (114/119, 95.8%) were more common than reinforcing (20/119, 16.8%) or predisposing apps (10/119, 8.4%). Conclusions The findings, including face validity and inter-rater reliability, support the integrity of the proposed classification scheme for categorizing mobile apps for older adults in the “Health and Fitness” category available in the iTunes App Store. Using the proposed classification system, older adult app users would be better positioned to identify apps appropriate for their needs, and app developers would be able to obtain the distributions of available mobile apps for health-related concerns of older adults more easily. PMID:25098687
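
    The inter-rater reliability reported above can be illustrated with a small Python sketch that computes percent concordance and, as a common complementary measure, Cohen's kappa. The category labels below are hypothetical examples, not data from the study.

        from collections import Counter

        def percent_agreement(coder_a, coder_b):
            return sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)

        def cohens_kappa(coder_a, coder_b):
            n = len(coder_a)
            po = percent_agreement(coder_a, coder_b)                      # observed agreement
            ca, cb = Counter(coder_a), Counter(coder_b)
            pe = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)  # chance agreement
            return (po - pe) / (1 - pe)

        coder_a = ["enabling", "enabling", "reinforcing", "predisposing"]   # hypothetical codes
        coder_b = ["enabling", "enabling", "reinforcing", "enabling"]
        print(percent_agreement(coder_a, coder_b), cohens_kappa(coder_a, coder_b))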

  5. Branch classification: A new mechanism for improving branch predictor performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, P.Y.; Hao, E.; Patt, Y.

    There is wide agreement that one of the most significant impediments to the performance of current and future pipelined superscalar processors is the presence of conditional branches in the instruction stream. Speculative execution is one solution to the branch problem, but speculative work is discarded if a branch is mispredicted. To be effective, speculative execution requires a very accurate branch predictor; 95% accuracy is not good enough. This paper proposes branch classification, a methodology for building more accurate branch predictors. Branch classification allows an individual branch instruction to be associated with the branch predictor best suited to predict its direction. Using this approach, a hybrid branch predictor can be constructed such that each component branch predictor predicts those branches for which it is best suited. To demonstrate the usefulness of branch classification, an example classification scheme is given and a new hybrid predictor is built based on this scheme which achieves a higher prediction accuracy than any branch predictor previously reported in the literature.
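
    A toy Python sketch of the branch-classification idea follows: branches are classified by a profiled taken-rate, strongly biased branches receive a fixed static prediction, and the remainder fall back to a per-branch 2-bit saturating counter. The 95% bias threshold and the choice of component predictors are illustrative assumptions, not the paper's actual scheme.

        class TwoBitCounter:
            def __init__(self):
                self.state = 2                          # start weakly taken
            def predict(self):
                return self.state >= 2
            def update(self, taken):
                self.state = min(3, self.state + 1) if taken else max(0, self.state - 1)

        class ClassifiedPredictor:
            def __init__(self, taken_rate_by_branch, bias=0.95):
                # Class 1: strongly biased branches -> fixed static prediction.
                self.static = {pc: rate >= 0.5 for pc, rate in taken_rate_by_branch.items()
                               if rate >= bias or rate <= 1.0 - bias}
                # Class 2: mixed-behaviour branches -> per-branch dynamic counter.
                self.dynamic = {}
            def predict(self, pc):
                if pc in self.static:
                    return self.static[pc]
                return self.dynamic.setdefault(pc, TwoBitCounter()).predict()
            def update(self, pc, taken):
                if pc not in self.static:
                    self.dynamic.setdefault(pc, TwoBitCounter()).update(taken)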

  6. Discriminative Hierarchical K-Means Tree for Large-Scale Image Classification.

    PubMed

    Chen, Shizhi; Yang, Xiaodong; Tian, Yingli

    2015-09-01

    A key challenge in large-scale image classification is how to achieve efficiency in terms of both computation and memory without compromising classification accuracy. The learning-based classifiers achieve the state-of-the-art accuracies, but have been criticized for computational complexity that grows linearly with the number of classes. The nonparametric nearest neighbor (NN)-based classifiers naturally handle large numbers of categories, but incur prohibitively expensive computation and memory costs. In this brief, we present a novel classification scheme, the discriminative hierarchical K-means tree (D-HKTree), which combines the advantages of both learning-based and NN-based classifiers. The complexity of the D-HKTree grows only sublinearly with the number of categories, which is much better than the recent hierarchical support vector machine-based methods. The memory requirement is an order of magnitude less than that of the recent Naïve Bayesian NN-based approaches. The proposed D-HKTree classification scheme is evaluated on several challenging benchmark databases and achieves state-of-the-art accuracies with significantly lower computation cost and memory requirements.

  7. Evaluation of several schemes for classification of remotely sensed data: Their parameters and performance. [Foster County, North Dakota; Grant County, Kansas; Iroquois County, Illinois, Tippecanoe County, Indiana; and Pottawattamie and Shelby Counties, Iowa

    NASA Technical Reports Server (NTRS)

    Scholz, D.; Fuhs, N.; Hixson, M.; Akiyama, T. (Principal Investigator)

    1979-01-01

    The author has identified the following significant results. Data sets for corn, soybeans, winter wheat, and spring wheat were used to evaluate the following schemes for crop identification: (1) per point Gaussian maximum likelihood classifier; (2) per point sum of normal densities classifier; (3) per point linear classifier; (4) per point Gaussian maximum likelihood decision tree classifier; and (5) texture sensitive per field Gaussian maximum likelihood classifier. Test site location and classifier both had significant effects on classification accuracy of small grains; classifiers did not differ significantly in overall accuracy, with the majority of the difference among classifiers being attributed to training method rather than to the classification algorithm applied. The complexity of use and computer costs for the classifiers varied significantly. A linear classification rule which assigns each pixel to the class whose mean is closest in Euclidean distance was the easiest for the analyst and cost the least per classification.
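
    The linear rule singled out above, assigning each pixel to the class whose mean is nearest in Euclidean distance, can be sketched in a few lines of Python; the band means and pixel values below are made-up numbers for illustration.

        import numpy as np

        def nearest_mean_classify(pixels, class_means):
            # pixels: (n_pixels, n_bands); class_means: (n_classes, n_bands)
            d2 = ((pixels[:, None, :] - class_means[None, :, :]) ** 2).sum(axis=2)
            return d2.argmin(axis=1)

        means = np.array([[40.0, 30.0, 20.0, 90.0],    # hypothetical per-class band means
                          [55.0, 45.0, 35.0, 60.0]])
        pixels = np.array([[42.0, 31.0, 22.0, 88.0],
                           [54.0, 44.0, 33.0, 62.0]])
        print(nearest_mean_classify(pixels, means))    # -> [0 1]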

  8. ERTS-1 data applications to Minnesota forest land use classification

    NASA Technical Reports Server (NTRS)

    Sizer, J. E. (Principal Investigator); Eller, R. G.; Meyer, M. P.; Ulliman, J. J.

    1973-01-01

    The author has identified the following significant results. Color-combined ERTS-1 MSS spectral slices were analyzed to determine the maximum (repeatable) level of meaningful forest resource classification data visually attainable by skilled forest photointerpreters for the following purposes: (1) periodic updating of the Minnesota Land Management Information System (MLMIS) statewide computerized land use data bank, and (2) to provide first-stage forest resources survey data for large area forest land management planning. Controlled tests were made of two forest classification schemes by experienced professional foresters with special photointerpretation training and experience. The test results indicate it is possible to discriminate the MLMIS forest class from the MLMIS nonforest classes, but that it is not possible, under average circumstances, to further stratify the forest classification into species components with any degree of reliability with ERTS-1 imagery. An ongoing test of the resulting classification scheme involves the interpretation and mapping of the south half of Itasca County, Minnesota, with ERTS-1 imagery. This map is undergoing field checking by on-the-ground field cooperators, whose evaluation will be completed in the fall of 1973.

  9. A fast and efficient segmentation scheme for cell microscopic image.

    PubMed

    Lebrun, G; Charrier, C; Lezoray, O; Meurie, C; Cardot, H

    2007-04-27

    Microscopic cellular image segmentation schemes must be efficient for reliable analysis and fast in order to process huge quantities of images. Recent studies have focused on improving segmentation quality. Several segmentation schemes have good quality, but their processing time is too expensive to deal with a great number of images per day. For segmentation schemes based on pixel classification, the classifier design is crucial since it is the part that requires most of the processing time necessary to segment an image. The main contribution of this work is focused on how to reduce the complexity of decision functions produced by support vector machines (SVM) while preserving recognition rate. Vector quantization is used in order to reduce the inherent redundancy present in huge pixel databases (i.e. images with expert pixel segmentation). Hybrid color space design is also used in order to improve the data set size reduction rate and the recognition rate. A new decision function quality criterion is defined to select a good trade-off between recognition rate and processing time of the pixel decision function. The first results of this study show that fast and efficient pixel classification with SVM is possible. Moreover, posterior class pixel probability estimation is easy to compute with the Platt method. A new segmentation scheme using probabilistic pixel classification has therefore been developed. This scheme has several free parameters whose automatic selection must be dealt with, but criteria for evaluating segmentation quality are not well adapted for cell segmentation, especially when comparison with expert pixel segmentation must be achieved. Another important contribution of this paper is the definition of a new quality criterion for the evaluation of cell segmentation. The results presented here show that selecting the free parameters of the segmentation scheme by optimisation of the new cell segmentation quality criterion produces efficient cell segmentation.
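
    A condensed Python sketch of the pipeline described above, under simplifying assumptions: k-means vector quantization reduces the labelled pixel database to class prototypes, an SVM is trained on the prototypes, and Platt scaling (probability=True in scikit-learn) supplies posterior pixel probabilities. The codebook size and kernel are placeholders, and the hybrid colour-space step is omitted.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.svm import SVC

        def train_reduced_svm(pixels, labels, codebook_size=64):
            X, y = [], []
            for c in np.unique(labels):
                km = KMeans(n_clusters=codebook_size, n_init=10).fit(pixels[labels == c])
                X.append(km.cluster_centers_)               # prototypes replace raw training pixels
                y.append(np.full(codebook_size, c))
            svm = SVC(kernel="rbf", probability=True)       # probability=True enables Platt scaling
            svm.fit(np.vstack(X), np.concatenate(y))
            return svm

        # svm = train_reduced_svm(training_pixels, training_labels)
        # class_probabilities = svm.predict_proba(image_pixels)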

  10. Alternative temporal classification of long Gamma Ray Bursts

    NASA Astrophysics Data System (ADS)

    Alejandro Vasquez, Nicolas; Baquero, Andres; Andrade, David

    2015-08-01

    In order to increase our understanding of Gamma Ray Bursts, many classification attempts have been proposed. Starting with the canonical classification into long and short GRBs, alternative classifications taking into account the cosmological origin of GRBs have been analyzed. In the present work we propose an alternative classification based on two temporal estimators: the Auto Correlation Function (ACF) of the light curves and the emission time, which considers the time during which the burst engine is active. The chosen time estimators reflect the internal evolution and internal structure of the GRB. Using a sample of 61 bright GRBs with known redshift detected by the Swift satellite, we propose a bimodal distribution of long bursts. The two types of bursts have different internal structure, suggesting different progenitors.

  11. Sensitivity and Specificity of the World Health Organization Dengue Classification Schemes for Severe Dengue Assessment in Children in Rio de Janeiro

    PubMed Central

    Macedo, Gleicy A.; Gonin, Michelle Luiza C.; Pone, Sheila M.; Cruz, Oswaldo G.; Nobre, Flávio F.; Brasil, Patrícia

    2014-01-01

    Background The clinical definition of severe dengue fever remains a challenge for researchers in hyperendemic areas like Brazil. The ability of the traditional (1997) as well as the revised (2009) World Health Organization (WHO) dengue case classification schemes to detect severe dengue cases was evaluated in 267 children admitted to hospital with laboratory-confirmed dengue. Principal Findings Using the traditional scheme, 28.5% of patients could not be assigned to any category, while the revised scheme categorized all patients. Intensive therapeutic interventions were used as the reference standard to evaluate the ability of both the traditional and revised schemes to detect severe dengue cases. Analyses of the classified cases (n = 183) demonstrated that the revised scheme had better sensitivity (86.8%, P<0.001), while the traditional scheme had better specificity (93.4%, P<0.001) for the detection of severe forms of dengue. Conclusions/Significance This improved sensitivity of the revised scheme allows for better case capture and increased ICU admission, which may aid pediatricians in avoiding deaths due to severe dengue among children, but, in turn, it may also result in the misclassification of the patients' condition as severe, reflected in the observed lower positive predictive value (61.6%, P<0.001) when compared with the traditional scheme (82.6%, P<0.001). The inclusion of unusual dengue manifestations in the revised scheme has not shifted the emphasis from the most important aspects of dengue disease and the major factors contributing to fatality in this study: shock with consequent organ dysfunction. PMID:24777054
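
    For reference, the kind of sensitivity/specificity/PPV computation used in this evaluation can be written as a short Python function over binary arrays; the five-patient vectors below are dummy data, not the study sample.

        import numpy as np

        def diagnostic_metrics(predicted_severe, reference_severe):
            p = np.asarray(predicted_severe, dtype=bool)
            r = np.asarray(reference_severe, dtype=bool)
            tp, fp = (p & r).sum(), (p & ~r).sum()
            fn, tn = (~p & r).sum(), (~p & ~r).sum()
            return {"sensitivity": tp / (tp + fn),
                    "specificity": tn / (tn + fp),
                    "ppv": tp / (tp + fp)}

        # Dummy data: scheme-positive status vs. reference standard (intensive intervention).
        print(diagnostic_metrics([1, 1, 0, 1, 0], [1, 0, 0, 1, 1]))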

  12. Sensitivity and specificity of the World Health Organization dengue classification schemes for severe dengue assessment in children in Rio de Janeiro.

    PubMed

    Macedo, Gleicy A; Gonin, Michelle Luiza C; Pone, Sheila M; Cruz, Oswaldo G; Nobre, Flávio F; Brasil, Patrícia

    2014-01-01

    The clinical definition of severe dengue fever remains a challenge for researchers in hyperendemic areas like Brazil. The ability of the traditional (1997) as well as the revised (2009) World Health Organization (WHO) dengue case classification schemes to detect severe dengue cases was evaluated in 267 children admitted to hospital with laboratory-confirmed dengue. Using the traditional scheme, 28.5% of patients could not be assigned to any category, while the revised scheme categorized all patients. Intensive therapeutic interventions were used as the reference standard to evaluate the ability of both the traditional and revised schemes to detect severe dengue cases. Analyses of the classified cases (n = 183) demonstrated that the revised scheme had better sensitivity (86.8%, P<0.001), while the traditional scheme had better specificity (93.4%, P<0.001) for the detection of severe forms of dengue. This improved sensitivity of the revised scheme allows for better case capture and increased ICU admission, which may aid pediatricians in avoiding deaths due to severe dengue among children, but, in turn, it may also result in the misclassification of the patients' condition as severe, reflected in the observed lower positive predictive value (61.6%, P<0.001) when compared with the traditional scheme (82.6%, P<0.001). The inclusion of unusual dengue manifestations in the revised scheme has not shifted the emphasis from the most important aspects of dengue disease and the major factors contributing to fatality in this study: shock with consequent organ dysfunction.

  13. A learning scheme for reach to grasp movements: on EMG-based interfaces using task specific motion decoding models.

    PubMed

    Liarokapis, Minas V; Artemiadis, Panagiotis K; Kyriakopoulos, Kostas J; Manolakos, Elias S

    2013-09-01

    A learning scheme based on random forests is used to discriminate between different reach-to-grasp movements in 3-D space, based on the myoelectric activity of human muscles of the upper arm and the forearm. Task specificity for motion decoding is introduced at two different levels: the subspace to move toward and the object to be grasped. The discrimination between the different reach-to-grasp strategies is accomplished with machine learning techniques for classification. The classification decision is then used to trigger an EMG-based, task-specific motion decoding model. Task-specific models manage to outperform "general" models, providing better estimation accuracy. Thus, the proposed scheme takes advantage of a framework incorporating both a classifier and a regressor that cooperate advantageously in order to split the task space. The proposed learning scheme can easily be applied to a series of EMG-based interfaces that must operate in real time, providing data-driven capabilities for multiclass problems that occur in complex everyday-life environments.
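
    A schematic Python version of the two-stage idea, assuming the feature matrices are NumPy arrays: a random-forest classifier selects the task, and a task-specific regressor (here a ridge model as a stand-in for the paper's decoding models) estimates the motion.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import Ridge

        class TaskSpecificDecoder:
            def __init__(self):
                self.classifier = RandomForestClassifier(n_estimators=200)
                self.decoders = {}
            def fit(self, emg, task_labels, motion):
                # Stage 1: learn to recognise the task from EMG features.
                self.classifier.fit(emg, task_labels)
                # Stage 2: one motion-decoding regressor per task.
                for task in np.unique(task_labels):
                    idx = task_labels == task
                    self.decoders[task] = Ridge().fit(emg[idx], motion[idx])
            def predict(self, emg_window):
                task = self.classifier.predict(emg_window)[0]        # emg_window: shape (1, n_features)
                return task, self.decoders[task].predict(emg_window)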

  14. The reliability of axis V of the multiaxial classification scheme.

    PubMed

    van Goor-Lambo, G

    1987-07-01

    In a reliability study concerning axis V (abnormal psychosocial situations) of the Multiaxial classification scheme for psychiatric disorders in childhood and adolescence, it was found that the level of agreement in scoring was adequate for only 2 out of 12 categories. A proposal for a modification of axis V was made, including a differentiation and regrouping of the categories and an adjustment of the descriptions in the glossary. With this modification of axis V another reliability study was carried out, in which the level of agreement in scoring was adequate for 12 out of 16 categories.

  15. Analysis of DSN software anomalies

    NASA Technical Reports Server (NTRS)

    Galorath, D. D.; Hecht, H.; Hecht, M.; Reifer, D. J.

    1981-01-01

    A categorized data base of software errors which were discovered during the various stages of development and operational use of the Deep Space Network DSN/Mark 3 System was developed. A study team identified several existing error classification schemes (taxonomies), prepared a detailed annotated bibliography of the error taxonomy literature, and produced a new classification scheme which was tuned to the DSN anomaly reporting system and encapsulated the work of others. Based upon the DSN/RCI error taxonomy, error data on approximately 1000 reported DSN/Mark 3 anomalies were analyzed, interpreted and classified. The error data were then summarized and histograms were produced highlighting key tendencies.

  16. Nosology, ontology and promiscuous realism.

    PubMed

    Binney, Nicholas

    2015-06-01

    Medics may consider worrying about their metaphysics and ontology to be a waste of time. I will argue here that this is not the case. Promiscuous realism is a metaphysical position which holds that multiple, equally valid, classification schemes should be applied to objects (such as patients) to capture different aspects of their complex and heterogeneous nature. As medics at the bedside may need to capture different aspects of their patients' problems, they may need to use multiple classification schemes (multiple nosologies), and thus consider adopting a different metaphysics to the one commonly in use. © 2014 John Wiley & Sons, Ltd.

  17. Understanding Homicide-Suicide.

    PubMed

    Knoll, James L

    2016-12-01

    Homicide-suicide is the phenomenon in which an individual kills 1 or more people and commits suicide. Research on homicide-suicide has been hampered by a lack of an accepted classification scheme and reliance on media reports. Mass murder-suicide is gaining increasing attention particularly in the United States. This article reviews the research and literature on homicide-suicide, proposing a standard classification scheme. Preventive methods are discussed and sociocultural factors explored. For a more accurate and complete understanding of homicide-suicide, it is argued that future research should use the full psychological autopsy approach, to include collateral interviews. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Assessment of skeletal maturity in scoliosis patients to determine clinical management: a new classification scheme using distal radius and ulna radiographs.

    PubMed

    Luk, Keith D K; Saw, Lim Beng; Grozman, Samuel; Cheung, Kenneth M C; Samartzis, Dino

    2014-02-01

    Assessment of skeletal maturity in patients with adolescent idiopathic scoliosis (AIS) is important to guide clinical management. Understanding growth peak and cessation is crucial to determine clinical observational intervals, timing to initiate or end bracing therapy, and when to instrument and fuse. The commonly used clinical or radiologic methods to assess skeletal maturity are still deficient in predicting the growth peak and cessation among adolescents, and bone age is too complicated to apply. To address these concerns, we describe a new distal radius and ulna (DRU) classification scheme to assess skeletal maturity. A prospective study. One hundred fifty young, female AIS patients with hand x-rays and no previous history of spine surgery from a single institute were assessed. Radius and ulna plain radiographs, and various anthropometric parameters were assessed. We identified various stages of radius and ulna epiphysis maturity, which were graded as R1-R11 for the radius and U1-U9 for the ulna. The bone age, development of sexual characteristics, standing height, sitting height, arm span, radius length, and tibia length were studied prospectively at each stage of these epiphysis changes. Standing height, sitting height, and arm span growth were at their peak during stages R7 (mean, 11.4 years old) and U5 (mean, 11.0 years old). The long bone growths also demonstrated a common peak at R7 and U5. Cessation of height and arm span growth was noted after stages R10 (mean, 15.6 years old) and U9 (mean, 17.3 years old). The new DRU classification is a practical and easy-to-use scheme that can provide skeletal maturation status. The classification bears a close relationship to the adolescent growth spurt and the cessation of growth, and may have tremendous utility in improving clinical decision-making in the conservative and operative management of scoliosis patients. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. National Library of Medicine Classification: A Scheme for the Shelf Arrangement of Books in a Field of Medicine and Its Related Sciences. Fourth Edition.

    ERIC Educational Resources Information Center

    Wiggins, Emilie, Ed.

    Outlined is the National Library of Medicine classification system for medicine and related sciences. In this system each preclinical science, such as human anatomy, biochemistry or pathology, and each medical subject, such as infectious diseases or pediatrics, receives a two-letter classification. Under each of these main headings numbered minor…

  20. Human Factors Engineering. Student Supplement,

    DTIC Science & Technology

    1981-08-01

    ... a job. TASK TAXONOMY: A classification scheme for the different levels of activities in a system, i.e., job - task - sub-task, etc. TASK ANALYSIS ... with the classification of learning objectives by learning category so as to identify learning guidelines necessary for optimum learning ... the sequencing of all dependent tasks ... the classification of learning objectives by learning category and the identification of

  1. A combined reconstruction-classification method for diffuse optical tomography.

    PubMed

    Hiltunen, P; Prince, S J D; Arridge, S

    2009-11-07

    We present a combined classification and reconstruction algorithm for diffuse optical tomography (DOT). DOT is a nonlinear ill-posed inverse problem. Therefore, some regularization is needed. We present a mixture of Gaussians prior, which regularizes the DOT reconstruction step. During each iteration, the parameters of a mixture model are estimated. These associate each reconstructed pixel with one of several classes based on the current estimate of the optical parameters. This classification is exploited to form a new prior distribution to regularize the reconstruction step and update the optical parameters. The algorithm can be described as an iteration between an optimization scheme with zeroth-order variable mean and variance Tikhonov regularization and an expectation-maximization scheme for estimation of the model parameters. We describe the algorithm in a general Bayesian framework. Results from simulated test cases and phantom measurements show that the algorithm enhances the contrast of the reconstructed images with good spatial accuracy. The probabilistic classifications of each image contain only a few misclassified pixels.
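
    The classification step can be pictured with a short Python sketch, assuming a one-dimensional absorption image and scikit-learn's GaussianMixture: the mixture responsibilities yield a per-pixel prior mean and variance that could feed the next regularized reconstruction update. The class count and variable names are illustrative only.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        def classification_prior(mu_a_estimate, n_classes=3):
            x = np.asarray(mu_a_estimate).reshape(-1, 1)         # current per-pixel absorption estimate
            gmm = GaussianMixture(n_components=n_classes).fit(x)
            resp = gmm.predict_proba(x)                          # soft class memberships per pixel
            prior_mean = resp @ gmm.means_.ravel()               # expected class mean for each pixel
            prior_var = resp @ gmm.covariances_.reshape(n_classes)
            return prior_mean, prior_var                         # feeds the next regularized update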

  2. Diffusion of Zonal Variables Using Node-Centered Diffusion Solver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, T B

    2007-08-06

    Tom Kaiser [1] has done some preliminary work to use the node-centered diffusion solver (originally developed by T. Palmer [2]) in Kull for diffusion of zonal variables such as electron temperature. To avoid numerical diffusion, Tom used a scheme developed by Shestakov et al. [3] and found their scheme could, in the vicinity of steep gradients, decouple nearest-neighbor zonal sub-meshes, leading to 'alternating-zone' (red-black mode) errors. Tom extended their scheme to couple the sub-meshes with appropriately chosen artificial diffusion and thereby solved the 'alternating-zone' problem. Because the choice of the artificial diffusion coefficient can be very delicate, it is desirable to use a scheme that does not require the artificial diffusion but is still able to avoid both numerical diffusion and the 'alternating-zone' problem. In this document we present such a scheme.

  3. Family Traits of Galaxies: From the Tuning Fork to a Physical Classification in a Multi-Wavelength Context

    NASA Astrophysics Data System (ADS)

    Rampazzo, Roberto; D'Onofrio, Mauro; Zaggia, Simone; Elmegreen, Debra M.; Laurikainen, Eija; Duc, Pierre-Alain; Gallart, Carme; Fraix-Burnet, Didier

    At the time of the Great Debate, nebulæ were recognized to have different morphologies, and the first classifications, sometimes only descriptive, were attempted. These early classification systems are well documented in Allan Sandage's 2005 review (Sandage 2005). That review emphasized the debt, in terms of continuity of forms of spiral galaxies, owed by Hubble's classification scheme to the Reynolds system proposed in 1920 (Reynolds, 1920).

  4. An evaluation of supervised classifiers for indirectly detecting salt-affected areas at irrigation scheme level

    NASA Astrophysics Data System (ADS)

    Muller, Sybrand Jacobus; van Niekerk, Adriaan

    2016-07-01

    Soil salinity often leads to reduced crop yield and quality and can render soils barren. Irrigated areas are particularly at risk due to intensive cultivation and secondary salinization caused by waterlogging. Regular monitoring of salt accumulation in irrigation schemes is needed to keep its negative effects under control. The dynamic spatial and temporal characteristics of remote sensing can provide a cost-effective solution for monitoring salt accumulation at irrigation scheme level. This study evaluated a range of pan-fused SPOT-5 derived features (spectral bands, vegetation indices, image textures and image transformations) for classifying salt-affected areas in two distinctly different irrigation schemes in South Africa, namely Vaalharts and Breede River. The relationships between the input features and electrical conductivity measurements were investigated using regression modelling (stepwise linear regression, partial least squares regression, curve fit regression modelling) and supervised classification (maximum likelihood, nearest neighbour, decision tree analysis, support vector machine and random forests). Classification and regression trees and random forests were used to select the most important features for differentiating salt-affected and unaffected areas. The results showed that the regression analyses produced weak models (R² < 0.4). Better results were achieved using the supervised classifiers, but the algorithms tended to over-estimate salt-affected areas. A key finding was that none of the feature sets or classification algorithms stood out as being superior for monitoring salt accumulation at irrigation scheme level. This was attributed to the large variations in the spectral responses of different crop types at different growing stages, coupled with their individual tolerances to saline conditions.
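
    The feature-ranking step mentioned above might look roughly like the following Python sketch, using a random forest's impurity-based importances; the feature names in the commented call are hypothetical.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def rank_salinity_features(X, y, feature_names):
            rf = RandomForestClassifier(n_estimators=500, class_weight="balanced").fit(X, y)
            order = np.argsort(rf.feature_importances_)[::-1]
            return [(feature_names[i], float(rf.feature_importances_[i])) for i in order]

        # ranked = rank_salinity_features(features, salt_affected_labels,
        #                                 ["NDVI", "band3_texture", "PC1", "brightness"])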

  5. A/T/N: An unbiased descriptive classification scheme for Alzheimer disease biomarkers

    PubMed Central

    Bennett, David A.; Blennow, Kaj; Carrillo, Maria C.; Feldman, Howard H.; Frisoni, Giovanni B.; Hampel, Harald; Jagust, William J.; Johnson, Keith A.; Knopman, David S.; Petersen, Ronald C.; Scheltens, Philip; Sperling, Reisa A.; Dubois, Bruno

    2016-01-01

    Biomarkers have become an essential component of Alzheimer disease (AD) research and because of the pervasiveness of AD pathology in the elderly, the same biomarkers are used in cognitive aging research. A number of current issues suggest that an unbiased descriptive classification scheme for these biomarkers would be useful. We propose the “A/T/N” system in which 7 major AD biomarkers are divided into 3 binary categories based on the nature of the pathophysiology that each measures. “A” refers to the value of a β-amyloid biomarker (amyloid PET or CSF Aβ42); “T,” the value of a tau biomarker (CSF phospho tau, or tau PET); and “N,” biomarkers of neurodegeneration or neuronal injury ([18F]-fluorodeoxyglucose–PET, structural MRI, or CSF total tau). Each biomarker category is rated as positive or negative. An individual score might appear as A+/T+/N−, or A+/T−/N−, etc. The A/T/N system includes the new modality tau PET. It is agnostic to the temporal ordering of mechanisms underlying AD pathogenesis. It includes all individuals in any population regardless of the mix of biomarker findings and therefore is suited to population studies of cognitive aging. It does not specify disease labels and thus is not a diagnostic classification system. It is a descriptive system for categorizing multidomain biomarker findings at the individual person level in a format that is easy to understand and use. Given the present lack of consensus among AD specialists on terminology across the clinically normal to dementia spectrum, a biomarker classification scheme will have broadest acceptance if it is independent from any one clinically defined diagnostic scheme. PMID:27371494
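
    A trivial Python sketch shows how the A/T/N label for an individual could be assembled from three dichotomized biomarker values; the cut-points and their directions below are placeholders, not published thresholds.

        def atn_profile(amyloid, tau, neurodegeneration,
                        cuts=(1.1, 1.3, 1.2), positive_if_above=(True, True, True)):
            flags = []
            for value, cut, above in zip((amyloid, tau, neurodegeneration), cuts, positive_if_above):
                positive = value > cut if above else value < cut
                flags.append("+" if positive else "-")
            return "A{}/T{}/N{}".format(*flags)

        print(atn_profile(1.4, 1.1, 1.5))   # -> "A+/T-/N+"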

  6. Taxonomy and Classification Scheme for Artificial Space Objects

    DTIC Science & Technology

    2013-09-01

    ... filter UBV and spectroscopic measurements) and albedo (including polarimetry). Earliest classifications of asteroids [17] were based on the filter ... similarities of the asteroid colors to K0 to K2V stars. The first more complete asteroid taxonomy was based on a synthesis of polarimetry, radiometry, and

  7. A Critical Review of Mode of Action (MOA) Assignment Classifications for Ecotoxicology

    EPA Science Inventory

    There are various structure-based classification schemes to categorize chemicals based on mode of action (MOA) which have been applied for both eco and human health toxicology. With increasing calls to assess thousands of chemicals, some of which have little available informatio...

  8. Solar wind classification from a machine learning perspective

    NASA Astrophysics Data System (ADS)

    Heidrich-Meisner, V.; Wimmer-Schweingruber, R. F.

    2017-12-01

    It is a very well known fact that the ubiquitous solar wind comes in at least two varieties, the slow solar wind and the coronal hole wind. The simplified view of two solar wind types has been frequently challenged. Existing solar wind categorization schemes rely mainly on different combinations of the solar wind proton speed, the O and C charge state ratios, the Alfvén speed, the expected proton temperature and the specific proton entropy. In available solar wind classification schemes, solar wind from stream interaction regions is often considered either as coronal hole wind or slow solar wind, although its plasma properties are different compared to "pure" coronal hole or slow solar wind. As shown in Neugebauer et al. (2016), even if only two solar wind types are assumed, available solar wind categorization schemes differ considerably for intermediate solar wind speeds. Thus, the decision boundary between the coronal hole and the slow solar wind is so far not well defined. In this situation, a machine learning approach to solar wind classification can provide an additional perspective. We apply a well-known machine learning method, k-means, to the task of solar wind classification in order to answer the following questions: (1) How many solar wind types can reliably be identified in our data set, comprised of ten years of solar wind observations from the Advanced Composition Explorer (ACE)? (2) Which combinations of solar wind parameters are particularly useful for solar wind classification? Potential subtypes of slow solar wind are of particular interest because they can provide hints of respective different source regions or release mechanisms of slow solar wind.
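
    A compact Python sketch of the k-means workflow outlined above: standardize the chosen solar wind parameters, cluster for a range of k, and compare silhouette scores to judge how many wind types the data support. The parameter selection and the use of the silhouette score are assumptions made for illustration, not details taken from the abstract.

        from sklearn.preprocessing import StandardScaler
        from sklearn.cluster import KMeans
        from sklearn.metrics import silhouette_score

        def solar_wind_kmeans(features, k_values=range(2, 7)):
            # features: rows = observations, columns = e.g. proton speed,
            # O and C charge-state ratios, specific proton entropy (assumed selection).
            X = StandardScaler().fit_transform(features)
            scores = {}
            for k in k_values:
                labels = KMeans(n_clusters=k, n_init=10).fit_predict(X)
                scores[k] = silhouette_score(X, labels)      # higher = better-separated clusters
            return scores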

  9. Characterization of palmprints by wavelet signatures via directional context modeling.

    PubMed

    Zhang, Lei; Zhang, David

    2004-06-01

    The palmprint is one of the most reliable physiological characteristics that can be used to distinguish between individuals. Current palmprint-based systems are more user friendly, more cost effective, and require fewer data signatures than traditional fingerprint-based identification systems. The principal lines and wrinkles captured in a low-resolution palmprint image provide more than enough information to uniquely identify an individual. This paper presents a palmprint identification scheme that characterizes a palmprint using a set of statistical signatures. The palmprint is first transformed into the wavelet domain, and the directional context of each wavelet subband is defined and computed in order to collect the predominant coefficients of its principal lines and wrinkles. A set of statistical signatures, which includes gravity center, density, spatial dispersivity and energy, is then defined to characterize the palmprint with the selected directional context values. A classification and identification scheme based on these signatures is subsequently developed. This scheme exploits the features of principal lines and prominent wrinkles sufficiently and achieves satisfactory results. Compared with the line-segments-matching or interesting-points-matching based palmprint verification schemes, the proposed scheme uses a much smaller amount of data signatures. It also provides a convenient classification strategy and more accurate identification.

  10. An integrated modelling and multicriteria analysis approach to managing nitrate diffuse pollution: 2. A case study for a chalk catchment in England.

    PubMed

    Koo, B K; O'Connell, P E

    2006-04-01

    The site-specific land use optimisation methodology, suggested by the authors in the first part of this two-part paper, has been applied to the River Kennet catchment at Marlborough, Wiltshire, UK, for a case study. The Marlborough catchment (143 km²) is an agriculture-dominated rural area over a deep chalk aquifer that is vulnerable to nitrate pollution from agricultural diffuse sources. For evaluation purposes, the catchment was discretised into a network of 1 km × 1 km grid cells. For each of the arable-land grid cells, seven land use alternatives (four arable-land alternatives and three grassland alternatives) were evaluated for their environmental and economic potential. For environmental evaluation, nitrate leaching rates of land use alternatives were estimated using SHETRAN simulations and groundwater pollution potential was evaluated using the DRASTIC index. For economic evaluation, economic gross margins were estimated using a simple agronomic model based on nitrogen response functions and agricultural land classification grades. In order to see whether the site-specific optimisation is efficient at the catchment scale, land use optimisation was carried out for four optimisation schemes (i.e. using four sets of criterion weights). Consequently, four land use scenarios were generated and the site-specifically optimised land use scenario was evaluated as the best compromise solution between long-term nitrate pollution and agronomy at the catchment scale.

  11. Differential diagnosis of suspected multiple sclerosis: a consensus approach

    PubMed Central

    Miller, DH; Weinshenker, BG; Filippi, M; Banwell, BL; Cohen, JA; Freedman, MS; Galetta, SL; Hutchinson, M; Johnson, RT; Kappos, L; Kira, J; Lublin, FD; McFarland, HF; Montalban, X; Panitch, H; Richert, JR; Reingold, SC; Polman, CH

    2008-01-01

    Background and objectives Diagnosis of multiple sclerosis (MS) requires exclusion of diseases that could better explain the clinical and paraclinical findings. A systematic process for exclusion of alternative diagnoses has not been defined. An International Panel of MS experts developed consensus perspectives on MS differential diagnosis. Methods Using available literature and consensus, we developed guidelines for MS differential diagnosis, focusing on exclusion of potential MS mimics, diagnosis of common initial isolated clinical syndromes, and differentiating between MS and non-MS idiopathic inflammatory demyelinating diseases. Results We present recommendations for 1) clinical and paraclinical red flags suggesting alternative diagnoses to MS; 2) more precise definition of “clinically isolated syndromes” (CIS), often the first presentations of MS or its alternatives; 3) algorithms for diagnosis of three common CISs related to MS in the optic nerves, brainstem, and spinal cord; and 4) a classification scheme and diagnosis criteria for idiopathic inflammatory demyelinating disorders of the central nervous system. Conclusions Differential diagnosis leading to MS or alternatives is complex and a strong evidence base is lacking. Consensus-determined guidelines provide a practical path for diagnosis and will be useful for the non-MS specialist neurologist. Recommendations are made for future research to validate and support these guidelines. Guidance on the differential diagnosis process when MS is under consideration will enhance diagnostic accuracy and precision. PMID:18805839

  12. Classification of diffuse lung diseases: why and how.

    PubMed

    Hansell, David M

    2013-09-01

    The understanding of complex lung diseases, notably the idiopathic interstitial pneumonias and small airways diseases, owes as much to repeated attempts over the years to classify them as to any single conceptual breakthrough. One of the many benefits of a successful classification scheme is that it allows workers, within and between disciplines, to be clear that they are discussing the same disease. This may be of particular importance in the recruitment of individuals for a clinical trial that requires a standardized and homogeneous study population. Different specialties require fundamentally different things from a classification: for epidemiologic studies, a classification that requires categorization of individuals according to histopathologic pattern is not usually practicable. Conversely, a scheme that simply divides diffuse parenchymal disease into inflammatory and noninflammatory categories is unlikely to further the understanding about the pathogenesis of disease. Thus, for some disease groupings, for example, pulmonary vasculopathies, there may be several appropriate classifications, each with its merits and demerits. There has been an interesting shift in the past few years, from the accepted primacy of histopathology as the sole basis on which the classification of parenchymal lung disease has rested, to new ways of considering how these entities relate to each other. Some inventive thinking has resulted in new classifications that undoubtedly benefit patients and clinicians in their endeavor to improve management and outcome. The challenge of understanding the logic behind current classifications and their shortcomings are explored in various examples of lung diseases.

  13. Video Games: Instructional Potential and Classification.

    ERIC Educational Resources Information Center

    Nawrocki, Leon H.; Winner, Janet L.

    1983-01-01

    Intended to provide a framework and impetus for future investigations of video games, this paper summarizes activities investigating the instructional use of such games, observations by the authors, and a proposed classification scheme and a paradigm to assist in the preliminary selection of instructional video games. Nine references are listed.…

  14. Fabric wrinkle characterization and classification using modified wavelet coefficients and optimized support-vector-machine classifier

    USDA-ARS?s Scientific Manuscript database

    This paper presents a novel wrinkle evaluation method that uses modified wavelet coefficients and an optimized support-vector-machine (SVM) classification scheme to characterize and classify wrinkle appearance of fabric. Fabric images were decomposed with the wavelet transform (WT), and five parame...

  15. Mode of Action (MOA) Assignment Classifications for Ecotoxicology: Evaluation of Available Methods

    EPA Science Inventory

    There are various structure-based classification schemes to categorize chemicals based on mode of action (MOA) which have been applied for both eco and human toxicology. With increasing calls to assess 1000s of chemicals, some of which have little available information other tha...

  16. Surveillance system and method having an operating mode partitioned fault classification model

    NASA Technical Reports Server (NTRS)

    Bickford, Randall L. (Inventor)

    2005-01-01

    A system and method which partitions a parameter estimation model, a fault detection model, and a fault classification model for a process surveillance scheme into two or more coordinated submodels together providing improved diagnostic decision making for at least one determined operating mode of an asset.

  17. Structure D'Ensemble, Multiple Classification, Multiple Seriation and Amount of Irrelevant Information

    ERIC Educational Resources Information Center

    Hamel, B. Remmo; Van Der Veer, M. A. A.

    1972-01-01

    A significant positive correlation between multiple classification and multiple seriation was found in testing 65 children aged 6 to 8 years at the stage of concrete operations. This is interpreted as support for the existence of a structure d'ensemble of operational schemes in the period of concrete operations. (Authors)

  18. Cluster analysis based on dimensional information with applications to feature selection and classification

    NASA Technical Reports Server (NTRS)

    Eigen, D. J.; Fromm, F. R.; Northouse, R. A.

    1974-01-01

    A new clustering algorithm is presented that is based on dimensional information. The algorithm includes an inherent feature selection criterion, which is discussed. Further, a heuristic method for choosing the proper number of intervals for a frequency distribution histogram, a feature necessary for the algorithm, is presented. The algorithm, although usable as a stand-alone clustering technique, is then utilized as a global approximator. Local clustering techniques and configuration of a global-local scheme are discussed, and finally the complete global-local and feature selector configuration is shown in application to a real-time adaptive classification scheme for the analysis of remote sensed multispectral scanner data.

  19. GRB 060614: a Fake Short Gamma-Ray Burst

    NASA Astrophysics Data System (ADS)

    Caito, L.; Bernardini, M. G.; Bianco, C. L.; Dainotti, M. G.; Guida, R.; Ruffini, R.

    2008-05-01

    The explosion of GRB 060614 produced a deep break in the GRB scenario and opened new horizons of investigation because it cannot be traced back to any traditional classification scheme. In fact, it has features of both long bursts and short bursts and, above all, it is the first case of a long-duration, nearby GRB without any bright associated Ib/c supernova. We will show that, in our canonical GRB scenario [1], this "anomalous" situation finds a natural interpretation and allows us to discuss a possible variation to the traditional classification scheme, introducing the distinction between "genuine" and "fake" short bursts.

  20. GRB 060614: a progress report

    NASA Astrophysics Data System (ADS)

    Caito, L.; Bernardini, M. G.; Bianco, C. L.; Dainotti, M. G.; Guida, R.; Ruffini, R.

    2008-01-01

    The explosion of GRB 060614, detected by the Swift satellite, produced a deep break in the GRB scenario, opening new horizons of investigation, because it cannot be traced back to any traditional classification scheme. In fact, it manifests peculiarities of both long bursts and short bursts. Above all, it is the first case of a long-duration, nearby GRB without any bright associated Ib/c supernova. We will show that, in our canonical GRB scenario [1], this "anomalous" situation finds a natural interpretation and allows us to discuss a possible variation to the traditional classification scheme, introducing the distinction between "genuine" and "fake" short bursts.

  1. A search for space energy alternatives

    NASA Technical Reports Server (NTRS)

    Gilbreath, W. P.; Billman, K. W.

    1978-01-01

    This paper takes a look at a number of schemes for converting radiant energy in space to useful energy for man. These schemes are possible alternatives to the currently most studied solar power satellite concept. Possible primary collection and conversion devices discussed include the space particle flux devices, solar windmills, photovoltaic devices, photochemical cells, photoemissive converters, heat engines, dielectric energy conversion, electrostatic generators, plasma solar collectors, and thermionic schemes. Transmission devices reviewed include lasers and masers.

  2. Mapping of rock types using a joint approach by combining the multivariate statistics, self-organizing map and Bayesian neural networks: an example from IODP 323 site

    NASA Astrophysics Data System (ADS)

    Karmakar, Mampi; Maiti, Saumen; Singh, Amrita; Ojha, Maheswar; Maity, Bhabani Sankar

    2017-07-01

    Modeling and classification of the subsurface lithology is very important to understand the evolution of the earth system. However, precise classification and mapping of lithology using a single framework are difficult due to the complexity and the nonlinearity of the problem, driven by limited core sample information. Here, we implement a joint approach by combining unsupervised and supervised methods in a single framework for better classification and mapping of rock types. In the unsupervised method, we use principal component analysis (PCA), K-means cluster analysis (K-means), dendrogram analysis, Fuzzy C-means (FCM) cluster analysis and the self-organizing map (SOM). In the supervised method, we use Bayesian neural networks (BNN) optimized by the Hybrid Monte Carlo (HMC) (BNN-HMC) and the scaled conjugate gradient (SCG) (BNN-SCG) techniques. We use P-wave velocity, density, neutron porosity, resistivity and gamma ray logs of well U1343E of the Integrated Ocean Drilling Program (IODP) Expedition 323 in the Bering Sea slope region. While the SOM algorithm allows us to visualize the clustering results in the spatial domain, the combined classification schemes (supervised and unsupervised) uncover the different patterns of lithology, such as clayey-silt, diatom-silt and silty-clay, from an un-cored section of the drilled hole. In addition, the BNN approach is capable of estimating uncertainty in the predictive modeling of the three types of rocks over the entire lithology section at site U1343. The alternating succession of clayey-silt, diatom-silt and silty-clay may be representative of crustal inhomogeneity in general and thus could be a basis for detailed study related to the productivity of methane gas in the oceans worldwide. Moreover, at 530 m depth below seafloor (DSF), the transition from Pliocene to Pleistocene could be linked to lithological alternation between the clayey-silt and the diatom-silt. The present results could provide the basis for a detailed study to gain deeper insight into the Bering Sea's sediment deposition and sequence.

  3. Planetree health information services: public access to the health information people want.

    PubMed Central

    Cosgrove, T L

    1994-01-01

    In July 1981, the Planetree Health Resource Center opened on the San Francisco campus of California Pacific Medical Center (Pacific Presbyterian Medical Center). Planetree was founded on the belief that access to information can empower people and help them face health and medical challenges. The Health Resource Center was created to provide medical library and health information resources to the general public. Over the last twelve years, Planetree has tried to develop a consumer health library collection and information service that is responsive to the needs and interests of a diverse public. In an effort to increase accessibility to the medical literature, a consumer health library classification scheme was created for the organization of library materials. The scheme combines the specificity and sophistication of the National Library of Medicine classification scheme with the simplicity of common lay terminology. PMID:8136762

  4. User oriented ERTS-1 images. [vegetation identification in Canada through image enhancement

    NASA Technical Reports Server (NTRS)

    Shlien, S.; Goodenough, D.

    1974-01-01

    Photographic reproductions of ERTS-1 images are capable of displaying only a portion of the total information available from the multispectral scanner. Methods are being developed to generate ERTS-1 images oriented towards special users such as agriculturists, foresters, and hydrologists by applying image enhancement techniques and interactive statistical classification schemes. Spatial boundaries and linear features can be emphasized and delineated using simple filters. Linear and nonlinear transformations can be applied to the spectral data to emphasize certain ground information. An automatic classification scheme was developed to identify particular ground cover classes such as fallow, grain, rape seed or various vegetation covers. The scheme applies the maximum likelihood decision rule to the spectral information and classifies the ERTS-1 image on a pixel by pixel basis. Preliminary results indicate that the classifier has limited success in distinguishing crops, but is well adapted for identifying different types of vegetation.

  5. Deep learning aided decision support for pulmonary nodules diagnosing: a review

    PubMed Central

    Yang, Yixin; Feng, Xiaoyi; Chi, Wenhao; Li, Zhengyang; Duan, Wenzhe; Liu, Haiping; Liang, Wenhua; Wang, Wei; Chen, Ping

    2018-01-01

    Deep learning techniques have recently emerged as promising decision-support approaches to automatically analyze medical images for different clinical diagnosing purposes. Computer-assisted diagnosing of pulmonary nodules has received considerable theoretical, computational, and empirical research work, and numerous methods have been developed in the past five decades for detection and classification of pulmonary nodules on different formats of images including chest radiographs, computed tomography (CT), and positron emission tomography. The recent remarkable and significant progress in deep learning for pulmonary nodules achieved in both academia and the industry has demonstrated that deep learning techniques seem to be promising alternative decision support schemes to effectively tackle the central issues in pulmonary nodules diagnosing, including feature extraction, nodule detection, false-positive reduction, and benign-malignant classification for the huge volume of chest scan data. The main goal of this investigation is to provide a comprehensive state-of-the-art review of the deep learning aided decision support for pulmonary nodules diagnosing. As far as the authors know, this is the first time that a review has been devoted exclusively to deep learning techniques for pulmonary nodules diagnosing. PMID:29780633

  6. A Sensory 3D Map of the Odor Description Space Derived from a Comparison of Numeric Odor Profile Databases.

    PubMed

    Zarzo, Manuel

    2015-06-01

    Many authors have proposed different schemes of odor classification, which are useful to aid the complex task of describing smells. However, reaching a consensus on a particular classification seems difficult because our psychophysical space of odor description is a continuum and is not clustered into well-defined categories. An alternative approach is to describe the perceptual space of odors as a low-dimensional coordinate system. This idea was first proposed by Crocker and Henderson in 1927, who suggested using numeric profiles based on 4 dimensions: "fragrant," "acid," "burnt," and "caprylic." In the present work, the odor profiles of 144 aroma chemicals were compared by means of statistical regression with comparable numeric odor profiles obtained from 2 databases, enabling a plausible interpretation of the 4 dimensions. Based on the results and taking into account comparable 2D sensory maps of odor descriptors from the literature, a 3D sensory map (odor cube) has been drawn up to improve understanding of the similarities and dissimilarities of the odor descriptors most frequently used in fragrance chemistry. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. An analysis of USSPACECOM's space surveillance network sensor tasking methodology

    NASA Astrophysics Data System (ADS)

    Berger, Jeff M.; Moles, Joseph B.; Wilsey, David G.

    1992-12-01

    This study provides the basis for the development of a cost/benefit assessment model to determine the effects of alterations to the Space Surveillance Network (SSN) on orbital element (OE) set accuracy. It provides a review of current methods used by NORAD and the SSN to gather and process observations, an alternative to the current Gabbard classification method, and the development of a model to determine the effects of observation rate and correction interval on OE set accuracy. The proposed classification scheme is based on satellite J2 perturbations. Specifically, classes were established based on mean motion, eccentricity, and inclination since J2 perturbation effects are functions of only these elements. Model development began by creating representative sensor observations using a highly accurate orbital propagation model. These observations were compared to predicted observations generated using the NORAD Simplified General Perturbation (SGP4) model and differentially corrected using a Bayes sequential estimation algorithm. A 10-run Monte Carlo analysis was performed using this model on 12 satellites using 16 different observation rate/correction interval combinations. An ANOVA and confidence interval analysis of the results show that this model does demonstrate the differences in steady state position error based on varying observation rate and correction interval.
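
    The abstract does not give the class boundaries, so the sketch below only illustrates the idea of the proposed J2-based scheme: binning satellites on mean motion, eccentricity and inclination. The bin edges are hypothetical.

    ```python
    # Hypothetical binning on the three elements that drive J2 perturbation effects.
    def classify_orbit(mean_motion_rev_per_day, eccentricity, inclination_deg):
        """Return a coarse (mean motion, eccentricity, inclination) class label."""
        n_class = ("low" if mean_motion_rev_per_day < 2.0
                   else "medium" if mean_motion_rev_per_day < 10.0 else "high")
        e_class = "near_circular" if eccentricity < 0.1 else "eccentric"
        i_class = ("low_incl" if inclination_deg < 30.0
                   else "mid_incl" if inclination_deg < 70.0 else "high_incl")
        return (n_class, e_class, i_class)

    # Example: a GPS-like orbit (about 2 rev/day, near-circular, ~55 deg inclination).
    print(classify_orbit(2.0, 0.01, 55.0))
    ```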

  8. An Efficient Variable Length Coding Scheme for an IID Source

    NASA Technical Reports Server (NTRS)

    Cheung, K. -M.

    1995-01-01

    A scheme is examined for using two alternating Huffman codes to encode a discrete independent and identically distributed source with a dominant symbol. This combined strategy, or alternating runlength Huffman (ARH) coding, was found to be more efficient than ordinary Huffman coding in certain circumstances.
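
    The abstract gives no implementation detail, so the sketch below only illustrates the underlying idea: replace runs of the dominant symbol by run-length symbols and Huffman-code the result. The actual ARH scheme alternates between two Huffman codes; this single-code version is a simplification.

    ```python
    import heapq
    from collections import Counter

    def huffman_code(symbols):
        """Build a Huffman code (symbol -> bitstring) from a list of symbols."""
        counts = Counter(symbols)
        if len(counts) == 1:                      # degenerate single-symbol case
            return {next(iter(counts)): "0"}
        heap = [[freq, i, [sym, ""]] for i, (sym, freq) in enumerate(counts.items())]
        heapq.heapify(heap)
        next_id = len(heap)
        while len(heap) > 1:
            lo = heapq.heappop(heap)
            hi = heapq.heappop(heap)
            for pair in lo[2:]:
                pair[1] = "0" + pair[1]
            for pair in hi[2:]:
                pair[1] = "1" + pair[1]
            heapq.heappush(heap, [lo[0] + hi[0], next_id] + lo[2:] + hi[2:])
            next_id += 1
        return {sym: code for sym, code in heap[0][2:]}

    def run_length_symbols(data, dominant):
        """Turn runs of the dominant symbol into run-length symbols."""
        out, run = [], 0
        for s in data:
            if s == dominant:
                run += 1
            else:
                out.append(("run", run))
                out.append(("lit", s))
                run = 0
        out.append(("run", run))
        return out

    data = "AAAABAAAAACAAAB"                      # 'A' is the dominant symbol
    rl = run_length_symbols(data, "A")
    code = huffman_code(rl)
    encoded = "".join(code[s] for s in rl)
    print(len(encoded), "bits for", len(data), "source symbols")
    ```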

  9. Cloud cover determination in polar regions from satellite imagery

    NASA Technical Reports Server (NTRS)

    Barry, R. G.; Maslanik, J. A.; Key, J. R.

    1987-01-01

    Attention is given to defining the spectral and spatial characteristics of clouds and surface conditions in the polar regions, and to creating calibrated, geometrically correct data sets suitable for quantitative analysis. Ways are explored in which this information can be applied to cloud classifications as new methods or as extensions to existing classification schemes. A methodology is developed that uses automated techniques to merge Advanced Very High Resolution Radiometer (AVHRR) and Scanning Multichannel Microwave Radiometer (SMMR) data, and to apply first-order calibration and zenith angle corrections to the AVHRR imagery. Cloud cover and surface types are manually interpreted, and manual methods are used to define relatively pure training areas to describe the textural and multispectral characteristics of clouds over several surface conditions. The effects of viewing angle and bidirectional reflectance differences are studied for several classes, and the effectiveness of some key components of existing classification schemes is tested.

  10. Cluster designs to assess the prevalence of acute malnutrition by lot quality assurance sampling: a validation study by computer simulation.

    PubMed

    Olives, Casey; Pagano, Marcello; Deitchler, Megan; Hedt, Bethany L; Egge, Kari; Valadez, Joseph J

    2009-04-01

    Traditional lot quality assurance sampling (LQAS) methods require simple random sampling to guarantee valid results. However, cluster sampling has been proposed to reduce the number of random starting points. This study uses simulations to examine the classification error of two such designs, a 67x3 (67 clusters of three observations) and a 33x6 (33 clusters of six observations) sampling scheme, to assess the prevalence of global acute malnutrition (GAM). Further, we explore the use of a 67x3 sequential sampling scheme for LQAS classification of GAM prevalence. Results indicate that, for independent clusters with moderate intracluster correlation for the GAM outcome, the three sampling designs maintain approximate validity for LQAS analysis. Sequential sampling can substantially reduce the average sample size that is required for data collection. The presence of intercluster correlation can dramatically impact the classification error associated with LQAS analysis.
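
    A minimal Monte Carlo sketch of this kind of validation study is shown below. The prevalence values, intracluster correlation, and decision threshold are hypothetical placeholders, and the beta-binomial cluster model is an assumption, not necessarily the generating model used by the authors.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def simulate_lqas(true_prev, icc, n_clusters=67, per_cluster=3,
                      threshold=25, n_sims=5000):
        """Return the fraction of simulations that classify prevalence as 'high'."""
        # Beta-binomial clusters: cluster-level prevalence varies around true_prev
        # with spread controlled by the intracluster correlation (icc).
        if icc > 0:
            a = true_prev * (1 - icc) / icc
            b = (1 - true_prev) * (1 - icc) / icc
            cluster_prev = rng.beta(a, b, size=(n_sims, n_clusters))
        else:
            cluster_prev = np.full((n_sims, n_clusters), true_prev)
        cases = rng.binomial(per_cluster, cluster_prev).sum(axis=1)
        return np.mean(cases > threshold)     # hypothetical rule: > threshold cases

    # Classification probabilities at two hypothetical true GAM prevalences.
    print("P(classify high | prev=10%):", simulate_lqas(0.10, 0.05))
    print("P(classify high | prev=15%):", simulate_lqas(0.15, 0.05))
    ```

    Running the two calls for a grid of prevalences would trace out the operating characteristic curve from which the classification errors of the design can be read off.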

  11. Classification of Palmprint Using Principal Line

    NASA Astrophysics Data System (ADS)

    Prasad, Munaga V. N. K.; Kumar, M. K. Pramod; Sharma, Kuldeep

    In this paper, a new classification scheme for palmprint is proposed. Palmprint is one of the reliable physiological characteristics that can be used to authenticate an individual. Palmprint classification provides an important indexing mechanism in a very large palmprint database. Here, the palmprint database is initially categorized into two groups, a right-hand group and a left-hand group. Each group is then further classified based on the distance traveled by the principal line, i.e., the heart line. During pre-processing, a rectangular Region of Interest (ROI), in which only the heart line is present, is extracted. The ROI is then divided into 6 regions and, depending upon the regions through which the heart line traverses, the palmprint is classified accordingly. Consequently, the scheme allows 64 categories for each group, forming a total of 128 possible categories. The technique proposed in this paper includes only 15 such categories, and it classifies not more than 20.96% of the images into a single category.

  12. Classification of topological phonons in linear mechanical metamaterials

    PubMed Central

    Süsstrunk, Roman

    2016-01-01

    Topological phononic crystals, like their electronic counterparts, are characterized by a bulk–edge correspondence where the interior of a material dictates the existence of stable surface or boundary modes. In the mechanical setup, such surface modes can be used for various applications such as wave guiding, vibration isolation, or the design of static properties such as stable floppy modes where parts of a system move freely. Here, we provide a classification scheme of topological phonons based on local symmetries. We import and adapt the classification of noninteracting electron systems and embed it into the mechanical setup. Moreover, we provide an extensive set of examples that illustrate our scheme and can be used to generate models in unexplored symmetry classes. Our work unifies the vast recent literature on topological phonons and paves the way to future applications of topological surface modes in mechanical metamaterials. PMID:27482105

  13. Restoration of Wavelet-Compressed Images and Motion Imagery

    DTIC Science & Technology

    2004-01-01


  14. Cross-mapping the ICNP with NANDA, HHCC, Omaha System and NIC for unified nursing language system development. International Classification for Nursing Practice. International Council of Nurses. North American Nursing Diagnosis Association. Home Health Care Classification. Nursing Interventions Classification.

    PubMed

    Hyun, S; Park, H A

    2002-06-01

    Nursing language plays an important role in describing and defining nursing phenomena and nursing actions. There are numerous vocabularies describing nursing diagnoses, interventions and outcomes in nursing. However, the lack of a standardized unified nursing language is considered a problem for further development of the discipline of nursing. In an effort to unify the nursing languages, the International Council of Nurses (ICN) has proposed the International Classification for Nursing Practice (ICNP) as a unified nursing language system. The purpose of this study was to evaluate the inclusiveness and expressiveness of the ICNP terms by cross-mapping them with the existing nursing terminologies, specifically the North American Nursing Diagnosis Association (NANDA) taxonomy I, the Omaha System, the Home Health Care Classification (HHCC) and the Nursing Interventions Classification (NIC). Nine hundred and seventy-four terms from these four classifications were cross-mapped with the ICNP terms. This was performed in accordance with the Guidelines for Composing a Nursing Diagnosis and Guidelines for Composing a Nursing Intervention, which were suggested by the ICNP development team. An expert group verified the results. The ICNP Phenomena Classification described 87.5% of the NANDA diagnoses, 89.7% of the HHCC diagnoses and 72.7% of the Omaha System problem classification scheme. The ICNP Action Classification described 79.4% of the NIC interventions, 80.6% of the HHCC interventions and 71.4% of the Omaha System intervention scheme. The results of this study suggest that the ICNP has a sound starting structure for a unified nursing language system and can be used to describe most of the existing terminologies. Recommendations for the addition of terms to the ICNP are provided.

  15. Determining the saliency of feature measurements obtained from images of sedimentary organic matter for use in its classification

    NASA Astrophysics Data System (ADS)

    Weller, Andrew F.; Harris, Anthony J.; Ware, J. Andrew; Jarvis, Paul S.

    2006-11-01

    The classification of sedimentary organic matter (OM) images can be improved by determining the saliency of image analysis (IA) features measured from them. Knowing the saliency of IA feature measurements means that only the most significant discriminating features need be used in the classification process. This is an important consideration for classification techniques such as artificial neural networks (ANNs), where too many features can lead to the 'curse of dimensionality'. The classification scheme adopted in this work is a hybrid of morphologically and texturally descriptive features from previous manual classification schemes. Some of these descriptive features are assigned to IA features, along with several others built into the IA software (Halcon) to ensure that a valid cross-section is available. After an image is captured and segmented, a total of 194 features are measured for each particle. To reduce this number to a more manageable magnitude, the SPSS AnswerTree Exhaustive CHAID (χ² automatic interaction detector) classification tree algorithm is used to establish each measurement's saliency as a classification discriminator. In the case of continuous data as used here, the F-test is used as opposed to the published algorithm. The F-test checks various statistical hypotheses about the variance of groups of IA feature measurements obtained from the particles to be classified. The aim is to reduce the number of features required to perform the classification without reducing its accuracy. In the best-case scenario, 194 inputs are reduced to 8, with a subsequent multi-layer back-propagation ANN recognition rate of 98.65%. This paper demonstrates the ability of the algorithm to reduce noise, help overcome the curse of dimensionality, and facilitate an understanding of the saliency of IA features as discriminators for sedimentary OM classification.
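
    A compact sketch of the general workflow (univariate F-test feature selection followed by a back-propagation network) is given below using scikit-learn. The synthetic data stand in for the 194 image-analysis measurements, and the exact CHAID/F-test procedure of the paper is not reproduced.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Synthetic stand-in: 194 features, of which only a few are informative.
    X, y = make_classification(n_samples=600, n_features=194, n_informative=8,
                               n_classes=3, n_clusters_per_class=1, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = make_pipeline(
        StandardScaler(),
        SelectKBest(score_func=f_classif, k=8),   # keep 8 salient features, as in the paper
        MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
    )
    model.fit(X_train, y_train)
    print("test accuracy:", model.score(X_test, y_test))
    ```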

  16. Looking at Citations: Using Corpora in English for Academic Purposes.

    ERIC Educational Resources Information Center

    Thompson, Paul; Tribble, Chris

    2001-01-01

    Presents a classification scheme and the results of applying this scheme to the coding of academic texts in a corpus. The texts are doctoral theses from agricultural botany and agricultural economics departments. Results lead to a comparison of the citation practices of writers in different disciplines and the different rhetorical practices of…

  17. An unsupervised classification technique for multispectral remote sensing data.

    NASA Technical Reports Server (NTRS)

    Su, M. Y.; Cummings, R. E.

    1973-01-01

    Description of a two-part clustering technique consisting of (a) a sequential statistical clustering, which is essentially a sequential variance analysis, and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum-likelihood classification techniques.
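
    The following sketch illustrates the two-part idea with scikit-learn: a crude sequential pass proposes initial centroids, which then seed an iterative K-means refinement. The sequential step here is a simple distance-threshold grouping standing in for the paper's sequential variance analysis, and the data are synthetic.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs

    X, _ = make_blobs(n_samples=500, centers=4, n_features=4, random_state=0)

    def sequential_initial_clusters(X, distance_threshold=5.0):
        """Crude sequential pass: start a new centroid whenever a sample is far from all existing ones."""
        centroids = [X[0]]
        for x in X[1:]:
            d = np.linalg.norm(np.array(centroids) - x, axis=1)
            if d.min() > distance_threshold:
                centroids.append(x)
        return np.array(centroids)

    # Stage (a): initial clusters; stage (b): iterative K-means refinement.
    init_centroids = sequential_initial_clusters(X)
    kmeans = KMeans(n_clusters=len(init_centroids), init=init_centroids,
                    n_init=1, random_state=0).fit(X)
    print("initial clusters:", len(init_centroids),
          "-> refined labels:", np.unique(kmeans.labels_))
    ```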

  18. Unsupervised classification of earth resources data.

    NASA Technical Reports Server (NTRS)

    Su, M. Y.; Jayroe, R. R., Jr.; Cummings, R. E.

    1972-01-01

    A new clustering technique is presented. It consists of two parts: (a) a sequential statistical clustering which is essentially a sequential variance analysis and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by the existing supervised maximum likelihood classification technique.

  19. Characterising Complex Enzyme Reaction Data

    PubMed Central

    Rahman, Syed Asad; Thornton, Janet M.

    2016-01-01

    The relationship between enzyme-catalysed reactions and the Enzyme Commission (EC) number, the widely accepted classification scheme used to characterise enzyme activity, is complex and, with the rapid increase in our knowledge of the reactions catalysed by enzymes, needs revisiting. We present a manual and computational analysis to investigate this complexity and find that almost one-third of all known EC numbers are linked to more than one reaction in the secondary reaction databases (e.g., KEGG). Although this complexity is often resolved by defining generic, alternative and partial reactions, we have also found individual EC numbers with more than one reaction catalysing different types of bond changes. This analysis adds a new dimension to our understanding of enzyme function and might be useful for the accurate annotation of enzyme function and for studying the changes in enzyme function during evolution. PMID:26840640

  20. Minimum Disclosure Counting for the Alternative Vote

    NASA Astrophysics Data System (ADS)

    Wen, Roland; Buckland, Richard

    Although there is a substantial body of work on preventing bribery and coercion of voters in cryptographic election schemes for plurality electoral systems, there are few attempts to construct such schemes for preferential electoral systems. The problem is that preferential systems are prone to bribery and coercion via subtle signature attacks during counting. We introduce a minimum disclosure counting scheme for the alternative vote preferential system. Minimum disclosure provides protection from signature attacks by revealing only the winning candidate.

  1. Classification of cryocoolers

    NASA Technical Reports Server (NTRS)

    Walker, G.

    1985-01-01

    A great diversity of methods and mechanisms has been devised to effect cryogenic refrigeration. The basic parameters and considerations affecting the selection of a particular system are reviewed, and a classification scheme for mechanical cryocoolers is presented. Important distinguishing features are whether or not a regenerative heat exchanger and valves are incorporated, and the method used to achieve the pressure variation.

  2. A satellite rainfall retrieval technique over northern Algeria based on the probability of rainfall intensities classification from MSG-SEVIRI

    NASA Astrophysics Data System (ADS)

    Lazri, Mourad; Ameur, Soltane

    2016-09-01

    In this paper, an algorithm based on the classification of the probability of rainfall intensities has been developed for rainfall estimation from Meteosat Second Generation/Spinning Enhanced Visible and Infrared Imager (MSG-SEVIRI). The classification scheme uses various spectral parameters of SEVIRI that provide information about cloud top temperature and optical and microphysical cloud properties. The presented method is developed and trained for the north of Algeria. The calibration of the method is carried out using, as a reference, rain classification fields derived from radar for the rainy season from November 2006 to March 2007. Rainfall rates are assigned to rain areas previously identified and classified according to the precipitation formation processes. The comparisons between satellite-derived precipitation estimates and validation data show that the developed scheme performs reasonably well. Indeed, the correlation coefficient is significant (r = 0.87). The values of POD, POFD and FAR are 80%, 13% and 25%, respectively. Also, for a rainfall estimation of about 614 mm, the RMSD, bias, MAD and PD are 102.06 mm, 2.18 mm, 68.07 mm and 12.58, respectively.
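
    The verification scores quoted above are standard; the sketch below shows how POD, POFD, FAR and the continuous error measures would be computed from a contingency table and from paired estimates and observations. The counts and rainfall values are hypothetical (the contingency counts are merely chosen to roughly match the quoted percentages).

    ```python
    import numpy as np

    def categorical_scores(hits, misses, false_alarms, correct_negatives):
        pod = hits / (hits + misses)                               # probability of detection
        pofd = false_alarms / (false_alarms + correct_negatives)   # probability of false detection
        far = false_alarms / (hits + false_alarms)                 # false alarm ratio
        return pod, pofd, far

    def continuous_scores(estimated, observed):
        diff = estimated - observed
        rmsd = np.sqrt(np.mean(diff ** 2))                         # root-mean-square difference
        bias = np.mean(diff)
        mad = np.mean(np.abs(diff))
        return rmsd, bias, mad

    # Hypothetical contingency counts and rain amounts, for illustration only.
    print(categorical_scores(hits=160, misses=40, false_alarms=53, correct_negatives=347))
    rng = np.random.default_rng(2)
    observed = rng.gamma(2.0, 5.0, size=200)
    estimated = observed + rng.normal(0, 3.0, size=200)
    print(continuous_scores(estimated, observed))
    ```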

  3. Texture as a basis for acoustic classification of substrate in the nearshore region

    NASA Astrophysics Data System (ADS)

    Dennison, A.; Wattrus, N. J.

    2016-12-01

    Substrate type at two locations in Lake Superior is segmented and classified using multivariate statistical processing of textural measures derived from shallow-water, high-resolution multibeam bathymetric data. During a multibeam sonar survey, both bathymetric and backscatter data are collected. It is well documented that the statistical characteristics of a sonar backscatter mosaic depend on substrate type. While classifying the bottom type on the basis of backscatter alone can accurately predict and map bottom type, it lacks the ability to resolve and capture fine textural details, an important factor in many habitat mapping studies. Statistical processing can capture the pertinent details about the bottom type that are rich in textural information. Further multivariate statistical processing can then isolate characteristic features and provide the basis for an accurate classification scheme. Preliminary results from an analysis of bathymetric data and ground-truth samples collected from the Amnicon River, Superior, Wisconsin, and the Lester River, Duluth, Minnesota, demonstrate the ability to process and develop a novel classification scheme of the bottom type in two geomorphologically distinct areas.

  4. Classifying machinery condition using oil samples and binary logistic regression

    NASA Astrophysics Data System (ADS)

    Phillips, J.; Cripps, E.; Lau, John W.; Hodkiewicz, M. R.

    2015-08-01

    The era of big data has resulted in an explosion of condition monitoring information. The result is an increasing motivation to automate the costly and time consuming human elements involved in the classification of machine health. When working with industry it is important to build an understanding, and hence some trust, in the classification scheme for those who use the analysis to initiate maintenance tasks. Typical "black box" approaches such as artificial neural networks (ANN) and support vector machines (SVM) are difficult to interpret. In contrast, this paper argues that logistic regression offers easy interpretability to industry experts, providing insight into the drivers of the human classification process and the ramifications of potential misclassification. Of course, accuracy is of foremost importance in any automated classification scheme, so we also provide a comparative study based on the predictive performance of logistic regression, ANN and SVM. A real world oil analysis data set from engines on mining trucks is presented, and using cross-validation we demonstrate that logistic regression out-performs the ANN and SVM approaches in terms of prediction for healthy/not healthy engines.
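
    A minimal sketch of such a comparison with scikit-learn is shown below, using synthetic stand-ins for the oil-analysis features and default model settings rather than the configurations used in the paper.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Synthetic healthy/not-healthy engine data (imbalanced, 10 hypothetical features).
    X, y = make_classification(n_samples=400, n_features=10, n_informative=5,
                               weights=[0.7, 0.3], random_state=0)

    models = {
        "logistic regression": LogisticRegression(max_iter=1000),
        "ANN": MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0),
        "SVM": SVC(kernel="rbf"),
    }
    for name, clf in models.items():
        scores = cross_val_score(make_pipeline(StandardScaler(), clf), X, y, cv=5)
        print(f"{name}: mean cross-validated accuracy = {scores.mean():.3f}")

    # A fitted logistic model also exposes interpretable coefficients (log-odds per
    # standardised feature), which is the interpretability argument of the paper.
    ```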

  5. Using two classification schemes to develop vegetation indices of biological integrity for wetlands in West Virginia, USA.

    PubMed

    Veselka, Walter; Rentch, James S; Grafton, William N; Kordek, Walter S; Anderson, James T

    2010-11-01

    Bioassessment methods for wetlands, and other bodies of water, have been developed worldwide to measure and quantify changes in "biological integrity." These assessments are based on a classification system meant to ensure appropriate comparisons between wetland types. Using a local site-specific disturbance gradient, we built vegetation indices of biological integrity (Veg-IBIs) based on two commonly used wetland classification systems in the USA: one based on vegetative structure and the other based on a wetland's position in a landscape and sources of water. The resulting class-specific Veg-IBIs were comprised of 1-5 metrics that varied in their sensitivity to the disturbance gradient (R² = 0.14-0.65). Moreover, the sensitivity to the disturbance gradient increased as metrics from each of the two classification schemes were combined (added). Using this information to monitor natural and created wetlands will help natural resource managers track changes in the biological integrity of wetlands in response to anthropogenic disturbance and will allow the use of vegetative communities to set ecological performance standards for mitigation banks.

  6. Pāhoehoe, `a`ā, and block lava: an illustrated history of the nomenclature

    NASA Astrophysics Data System (ADS)

    Harris, Andrew J. L.; Rowland, Scott K.; Villeneuve, Nicolas; Thordarson, Thor

    2017-01-01

    Lava flows occur worldwide, and throughout history, various cultures (and geologists) have described flows based on their surface textures. As a result, surface morphology-based nomenclature schemes have been proposed in most languages to aid in the classification and distinction of lava surface types. One of the first to be published was likely the nine-class, Italian-language description-based classification proposed by Mario Gemmellaro in 1858. By far, the most commonly used terms to describe lava surfaces today are not descriptive but, instead, are merely words, specifically the Hawaiian words `a`ā (rough brecciated basalt lava) and pāhoehoe (smooth glassy basalt lava), plus block lava (thick brecciated lavas that are typically more silicic than basalt). `A`ā and pāhoehoe were introduced into the Western geological vocabulary by American geologists working in Hawai`i during the 1800s. They and other nineteenth century geologists proposed formal lava-type classification schemes for scientific use, and most of them used the Hawaiian words. In 1933, Ruy Finch added the third lava type, block lava, to the classification scheme, with the tripartite system being formalized in 1953 by Gordon Macdonald. More recently, particularly since the 1980s and based largely on studies of lava flow interiors, a number of sub-types and transitional forms of all three major lava types have been defined. This paper reviews the early history of the development of the pāhoehoe, `a`ā, and block lava-naming system and presents a new descriptive classification so as to break out the three parental lava types into their many morphological sub-types.

  7. Break-even cost of cloning in genetic improvement of dairy cattle.

    PubMed

    Dematawewa, C M; Berger, P J

    1998-04-01

    Twelve different models for alternative progeny-testing schemes based on genetic and economic gains were compared. The first 10 alternatives were considered to be optimally operating progeny-testing schemes. Alternatives 1 to 5 considered the following combinations of technologies: 1) artificial insemination, 2) artificial insemination with sexed semen, 3) artificial insemination with embryo transfer, 4) artificial insemination and embryo transfer with few bulls as sires, and 5) artificial insemination, embryo transfer, and sexed semen with few bulls, respectively. Alternatives 6 to 12 considered cloning from dams. Alternatives 11 and 12 considered a regular progeny-testing scheme that had selection gains (intensity x accuracy x genetic standard deviation) of 890, 300, 600, and 89 kg, respectively, for the four paths. The sums of the generation intervals of the four paths were 19 yr for the first 8 alternatives and 19.5, 22, 29, and 29.5 yr for alternatives 9 to 12, respectively. Rates of genetic gain in milk yield for alternatives 1 to 5 were 257, 281, 316, 327, and 340 kg/yr, respectively. The rate of gain for other alternatives increased as number of clones increased. The use of three records per clone increased both accuracy and generation interval of a path. Cloning was highly beneficial for progeny-testing schemes with lower intensity and accuracy of selection. The discounted economic gain (break-even cost) per clone was the highest ($84) at current selection levels using sexed semen and three records on clones of the dam. The total cost associated with cloning has to be below $84 for cloning to be an economically viable option.

  8. TFOS DEWS II Definition and Classification Report.

    PubMed

    Craig, Jennifer P; Nichols, Kelly K; Akpek, Esen K; Caffery, Barbara; Dua, Harminder S; Joo, Choun-Ki; Liu, Zuguo; Nelson, J Daniel; Nichols, Jason J; Tsubota, Kazuo; Stapleton, Fiona

    2017-07-01

    The goals of the TFOS DEWS II Definition and Classification Subcommittee were to create an evidence-based definition and a contemporary classification system for dry eye disease (DED). The new definition recognizes the multifactorial nature of dry eye as a disease where loss of homeostasis of the tear film is the central pathophysiological concept. Ocular symptoms, as a broader term that encompasses reports of discomfort or visual disturbance, feature in the definition and the key etiologies of tear film instability, hyperosmolarity, and ocular surface inflammation and damage were determined to be important for inclusion in the definition. In the light of new data, neurosensory abnormalities were also included in the definition for the first time. In the classification of DED, recent evidence supports a scheme based on the pathophysiology where aqueous deficient and evaporative dry eye exist as a continuum, such that elements of each are considered in diagnosis and management. Central to the scheme is a positive diagnosis of DED with signs and symptoms, and this is directed towards management to restore homeostasis. The scheme also allows consideration of various related manifestations, such as non-obvious disease involving ocular surface signs without related symptoms, including neurotrophic conditions where dysfunctional sensation exists, and cases where symptoms exist without demonstrable ocular surface signs, including neuropathic pain. This approach is not intended to override clinical assessment and judgment but should prove helpful in guiding clinical management and research.

  9. Taxonomy of breast cancer based on normal cell phenotype predicts outcome

    PubMed Central

    Santagata, Sandro; Thakkar, Ankita; Ergonul, Ayse; Wang, Bin; Woo, Terri; Hu, Rong; Harrell, J. Chuck; McNamara, George; Schwede, Matthew; Culhane, Aedin C.; Kindelberger, David; Rodig, Scott; Richardson, Andrea; Schnitt, Stuart J.; Tamimi, Rulla M.; Ince, Tan A.

    2014-01-01

    Accurate classification is essential for understanding the pathophysiology of a disease and can inform therapeutic choices. For hematopoietic malignancies, a classification scheme based on the phenotypic similarity between tumor cells and normal cells has been successfully used to define tumor subtypes; however, use of normal cell types as a reference by which to classify solid tumors has not been widely emulated, in part due to more limited understanding of epithelial cell differentiation compared with hematopoiesis. To provide a better definition of the subtypes of epithelial cells comprising the breast epithelium, we performed a systematic analysis of a large set of breast epithelial markers in more than 15,000 normal breast cells, which identified 11 differentiation states for normal luminal cells. We then applied information from this analysis to classify human breast tumors based on normal cell types into 4 major subtypes, HR0–HR3, which were differentiated by vitamin D, androgen, and estrogen hormone receptor (HR) expression. Examination of 3,157 human breast tumors revealed that these HR subtypes were distinct from the current classification scheme, which is based on estrogen receptor, progesterone receptor, and human epidermal growth factor receptor 2. Patient outcomes were best when tumors expressed all 3 hormone receptors (subtype HR3) and worst when they expressed none of the receptors (subtype HR0). Together, these data provide an ontological classification scheme associated with patient survival differences and provide actionable insights for treating breast tumors. PMID:24463450

  10. A Classification Scheme for Glaciological AVA Responses

    NASA Astrophysics Data System (ADS)

    Booth, A.; Emir, E.

    2014-12-01

    A classification scheme is proposed for amplitude vs. angle (AVA) responses as an aid to the interpretation of seismic reflectivity in glaciological research campaigns. AVA responses are a powerful tool in characterising the material properties of glacier ice and its substrate. However, before interpreting AVA data, careful true amplitude processing is required to constrain basal reflectivity and compensate amplitude decay mechanisms, including anelastic attenuation and spherical divergence. These fundamental processing steps can be difficult to design in cases of noisy data, e.g. where a target reflection is contaminated by surface wave energy (in the case of shallow glaciers) or by energy reflected from out of the survey plane. AVA methods have equally powerful usage in estimating the fluid fill of potential hydrocarbon reservoirs. However, such applications seldom use true amplitude data and instead consider qualitative AVA responses using a well-defined classification scheme. Such schemes are often defined in terms of the characteristics of best-fit responses to the observed reflectivity, e.g. the intercept (I) and gradient (G) of a linear approximation to the AVA data. The position of the response on a cross-plot of I and G then offers a diagnostic attribute for certain fluid types. We investigate the advantages in glaciology of emulating this practice, and develop a cross-plot based on the 3-term Shuey AVA approximation (using I, G, and a curvature term C). Model AVA curves define a clear lithification trend: AVA responses to stiff (lithified) substrates fall discretely into one quadrant of the cross-plot, with positive I and negative G, whereas those to fluid-rich substrates plot diagonally opposite (in the negative I and positive G quadrant). The remaining quadrants are unoccupied by plausible single-layer responses and may therefore be diagnostic of complex thin-layer reflectivity, and the magnitude and polarity of the C term serves as a further indicator of fluid content. The use of the AVA cross-plot is explored for seismic data from European Arctic glaciers, including Storglaciären and Midtre Lovénbreen, with additional examples from other published sources. The classification scheme should provide a useful reference for the initial assessment of a glaciological AVA response.
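
    The cross-plot idea can be illustrated with a least-squares fit of the three-term Shuey approximation, R(θ) ≈ I + G sin²θ + C(tan²θ − sin²θ), followed by a quadrant test on I and G. The reflectivity picks below are synthetic, not data from the cited surveys, and the quadrant labels simply restate the lithified/fluid-rich trend described in the abstract.

    ```python
    import numpy as np

    def fit_shuey(angles_deg, reflectivity):
        """Least-squares fit of the 3-term Shuey approximation, returning (I, G, C)."""
        theta = np.radians(angles_deg)
        design = np.column_stack([
            np.ones_like(theta),
            np.sin(theta) ** 2,
            np.tan(theta) ** 2 - np.sin(theta) ** 2,
        ])
        (intercept, gradient, curvature), *_ = np.linalg.lstsq(design, reflectivity, rcond=None)
        return intercept, gradient, curvature

    def crossplot_quadrant(intercept, gradient):
        """Classify the AVA response by the signs of intercept (I) and gradient (G)."""
        if intercept > 0 and gradient < 0:
            return "stiff (lithified) substrate quadrant"
        if intercept < 0 and gradient > 0:
            return "fluid-rich substrate quadrant"
        return "off-trend (possible thin-layer response)"

    angles = np.array([0, 5, 10, 15, 20, 25, 30])
    picks = 0.25 - 0.15 * np.sin(np.radians(angles)) ** 2   # synthetic stiff-bed response
    I, G, C = fit_shuey(angles, picks)
    print(I, G, C, "->", crossplot_quadrant(I, G))
    ```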

  11. The EpiOcular™ Eye Irritation Test is the Method of Choice for the In Vitro Eye Irritation Testing of Agrochemical Formulations: Correlation Analysis of EpiOcular Eye Irritation Test and BCOP Test Data According to the UN GHS, US EPA and Brazil ANVISA Classification Schemes.

    PubMed

    Kolle, Susanne N; Rey Moreno, Maria Cecilia; Mayer, Winfried; van Cott, Andrew; van Ravenzwaay, Bennard; Landsiedel, Robert

    2015-07-01

    The Bovine Corneal Opacity and Permeability (BCOP) test is commonly used for the identification of severe ocular irritants (GHS Category 1), but it is not recommended for the identification of ocular irritants (GHS Category 2). The incorporation of human reconstructed tissue model-based tests into a tiered test strategy to identify ocular non-irritants and replace the Draize rabbit eye irritation test has been suggested (OECD TG 405). The value of the EpiOcular™ Eye Irritation Test (EIT) for the prediction of ocular non-irritants (GHS No Category) has been demonstrated, and an OECD Test Guideline (TG) was drafted in 2014. The purpose of this study was to evaluate whether the BCOP test, in conjunction with corneal histopathology (as suggested for the evaluation of the depth of injury) and/or the EpiOcular-EIT, could be used to predict the eye irritation potential of agrochemical formulations according to the UN GHS, US EPA and Brazil ANVISA classification schemes. We have assessed opacity, permeability and histopathology in the BCOP assay, and relative tissue viability in the EpiOcular-EIT, for 97 agrochemical formulations with available in vivo eye irritation data. By using the OECD TG 437 protocol for liquids, the BCOP test did not result in sufficient correct predictions of severe ocular irritants for any of the three classification schemes. The lack of sensitivity could be improved somewhat by the inclusion of corneal histopathology, but the relative viability in the EpiOcular-EIT clearly outperformed the BCOP test for all three classification schemes. The predictive capacity of the EpiOcular-EIT for ocular non-irritants (UN GHS No Category) for the 97 agrochemical formulations tested (91% sensitivity, 72% specificity and 82% accuracy for UN GHS classification) was comparable to that obtained in the formal validation exercise underlying the OECD draft TG. We therefore conclude that the EpiOcular-EIT is currently the best in vitro method for the prediction of the eye irritation potential of liquid agrochemical formulations.

  12. Automated identification of sleep states from EEG signals by means of ensemble empirical mode decomposition and random under sampling boosting.

    PubMed

    Hassan, Ahnaf Rashik; Bhuiyan, Mohammed Imamul Hassan

    2017-03-01

    Automatic sleep staging is essential for alleviating the burden on physicians of analyzing a large volume of data by visual inspection. It is also a precondition for making an automated sleep monitoring system feasible, and computerized sleep scoring will expedite large-scale data analysis in sleep research. Nevertheless, most existing work on sleep staging is based on multichannel or multiple physiological signals, which are uncomfortable for the user and hinder the feasibility of an in-home sleep monitoring device, so a successful and reliable computer-assisted sleep staging scheme is yet to emerge. In this work, we propose a single-channel EEG-based algorithm for computerized sleep scoring. In the proposed algorithm, we decompose EEG signal segments using Ensemble Empirical Mode Decomposition (EEMD) and extract various statistical moment-based features. The effectiveness of EEMD and the statistical features is investigated, and statistical analysis is performed for feature selection. A recently proposed classification technique, Random Under-Sampling Boosting (RUSBoost), is introduced for sleep stage classification; to the best of the authors' knowledge, this is the first implementation of EEMD in conjunction with RUSBoost. The proposed feature extraction scheme's performance is investigated for various choices of classification models, and the algorithmic performance of our scheme is evaluated against contemporary works in the literature. The performance of the proposed method is comparable to or better than that of the state-of-the-art approaches. The proposed algorithm gives 88.07%, 83.49%, 92.66%, 94.23%, and 98.15% for 6-state to 2-state classification of sleep stages on the Sleep-EDF database. Our experimental outcomes reveal that RUSBoost outperforms other classification models for the feature extraction framework presented in this work, and the algorithm demonstrates high detection accuracy for the sleep states S1 and REM. Statistical moment-based features in the EEMD domain distinguish the sleep states successfully. The automated sleep scoring scheme proposed herein can reduce the burden on clinicians, contribute to the device implementation of a sleep monitoring system, and benefit sleep research.

  13. Creating a Canonical Scientific and Technical Information Classification System for NCSTRL+

    NASA Technical Reports Server (NTRS)

    Tiffany, Melissa E.; Nelson, Michael L.

    1998-01-01

    The purpose of this paper is to describe the new subject classification system for the NCSTRL+ project. NCSTRL+ is a canonical digital library (DL) based on the Networked Computer Science Technical Report Library (NCSTRL). The current NCSTRL+ classification system uses the NASA Scientific and Technical (STI) subject classifications, which has a bias towards the aerospace, aeronautics, and engineering disciplines. Examination of other scientific and technical information classification systems showed similar discipline-centric weaknesses. Traditional, library-oriented classification systems represented all disciplines, but were too generalized to serve the needs of a scientific and technically oriented digital library. Lack of a suitable existing classification system led to the creation of a lightweight, balanced, general classification system that allows the mapping of more specialized classification schemes into the new framework. We have developed the following classification system to give equal weight to all STI disciplines, while being compact and lightweight.

  14. On the convergence of nonconvex minimization methods for image recovery.

    PubMed

    Xiao, Jin; Ng, Michael Kwok-Po; Yang, Yu-Fei

    2015-05-01

    Nonconvex nonsmooth regularization method has been shown to be effective for restoring images with neat edges. Fast alternating minimization schemes have also been proposed and developed to solve the nonconvex nonsmooth minimization problem. The main contribution of this paper is to show the convergence of these alternating minimization schemes, based on the Kurdyka-Łojasiewicz property. In particular, we show that the iterates generated by the alternating minimization scheme converge to a critical point of this nonconvex nonsmooth objective function. We also extend the analysis to the nonconvex nonsmooth regularization model with box constraints, and obtain similar convergence results for the related minimization algorithm. Numerical examples are given to illustrate our convergence analysis.

  15. The Evolution of Complex Microsurgical Midface Reconstruction: A Classification Scheme and Reconstructive Algorithm.

    PubMed

    Alam, Daniel; Ali, Yaseen; Klem, Christopher; Coventry, Daniel

    2016-11-01

    Orbito-malar reconstruction after oncological resection represents one of the most challenging facial reconstructive procedures. Until the last few decades, rehabilitation was typically prosthesis based with a limited role for surgery. The advent of microsurgical techniques allowed large-volume tissue reconstitution from a distant donor site, revolutionizing the potential approaches to these defects. The authors report a novel surgery-based algorithm and a classification scheme for complete midface reconstruction with a foundation in the Gillies principles of like-to-like reconstruction and with a significant role of computer-aided virtual planning. With this approach, the authors have been able to achieve significantly better patient outcomes.

  16. Introduction to the Apollo collections: Part 2: Lunar breccias

    NASA Technical Reports Server (NTRS)

    Mcgee, P. E.; Simonds, C. H.; Warner, J. L.; Phinney, W. C.

    1979-01-01

    Basic petrographic, chemical and age data for a representative suite of lunar breccias are presented for students and potential lunar sample investigators. Emphasis is on sample description and data presentation. Samples are listed, together with a classification scheme based on matrix texture and mineralogy and on the nature and abundance of glass present both in the matrix and as clasts. An outline of the classification scheme describes the characteristic features of each of the breccia groups. The cratering process, which describes the sequence of events immediately following an impact event, is discussed, with emphasis on the thermal and material transport processes affecting the two major components of lunar breccias (clastic debris and fused material).

  17. Robust Transmission of H.264/AVC Streams Using Adaptive Group Slicing and Unequal Error Protection

    NASA Astrophysics Data System (ADS)

    Thomos, Nikolaos; Argyropoulos, Savvas; Boulgouris, Nikolaos V.; Strintzis, Michael G.

    2006-12-01

    We present a novel scheme for the transmission of H.264/AVC video streams over lossy packet networks. The proposed scheme exploits the error-resilient features of the H.264/AVC codec and employs Reed-Solomon codes to protect the streams effectively. A novel technique for adaptive classification of macroblocks into three slice groups is also proposed. The optimal classification of macroblocks and the optimal channel rate allocation are achieved by iterating two interdependent steps. Dynamic programming techniques are used for the channel rate allocation process in order to reduce complexity. Simulations clearly demonstrate the superiority of the proposed method over other recent algorithms for transmission of H.264/AVC streams.

  18. The Why, What, and Impact of GPA at Oxford Brookes University

    ERIC Educational Resources Information Center

    Andrews, Matthew

    2016-01-01

    This paper examines the introduction at Oxford Brookes University of a Grade Point Average (GPA) scheme alongside the traditional honours degree classification. It considers the reasons for the introduction of GPA, the way in which the scheme was implemented, and offers an insight into the impact of GPA at Brookes. Finally, the paper considers…

  19. 46 CFR 8.260 - Revocation of classification society recognition.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    46 CFR 8.260 (Shipping, Vol. 1, 2010-10-01) — Vessel Inspection Alternatives; Recognition of a Classification Society; Revocation of classification society recognition: A recognized classification society which fails to maintain the minimum...

  20. 46 CFR 8.260 - Revocation of classification society recognition.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    46 CFR 8.260 (Shipping, Vol. 1, 2011-10-01) — Vessel Inspection Alternatives; Recognition of a Classification Society; Revocation of classification society recognition: A recognized classification society which fails to maintain the minimum...

  1. Maximizing the Predictive Value of Production Rules

    DTIC Science & Technology

    1988-08-31

    Results are reported on data sets previously analyzed in the AI literature using alternative classification techniques.

  2. Consensus embedding: theory, algorithms and application to segmentation and classification of biomedical data

    PubMed Central

    2012-01-01

    Background Dimensionality reduction (DR) enables the construction of a lower dimensional space (embedding) from a higher dimensional feature space while preserving object-class discriminability. However several popular DR approaches suffer from sensitivity to choice of parameters and/or presence of noise in the data. In this paper, we present a novel DR technique known as consensus embedding that aims to overcome these problems by generating and combining multiple low-dimensional embeddings, hence exploiting the variance among them in a manner similar to ensemble classifier schemes such as Bagging. We demonstrate theoretical properties of consensus embedding which show that it will result in a single stable embedding solution that preserves information more accurately as compared to any individual embedding (generated via DR schemes such as Principal Component Analysis, Graph Embedding, or Locally Linear Embedding). Intelligent sub-sampling (via mean-shift) and code parallelization are utilized to provide for an efficient implementation of the scheme. Results Applications of consensus embedding are shown in the context of classification and clustering as applied to: (1) image partitioning of white matter and gray matter on 10 different synthetic brain MRI images corrupted with 18 different combinations of noise and bias field inhomogeneity, (2) classification of 4 high-dimensional gene-expression datasets, (3) cancer detection (at a pixel-level) on 16 image slices obtained from 2 different high-resolution prostate MRI datasets. In over 200 different experiments concerning classification and segmentation of biomedical data, consensus embedding was found to consistently outperform both linear and non-linear DR methods within all applications considered. Conclusions We have presented a novel framework termed consensus embedding which leverages ensemble classification theory within dimensionality reduction, allowing for application to a wide range of high-dimensional biomedical data classification and segmentation problems. Our generalizable framework allows for improved representation and classification in the context of both imaging and non-imaging data. The algorithm offers a promising solution to problems that currently plague DR methods, and may allow for extension to other areas of biomedical data analysis. PMID:22316103
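
    A much-simplified sketch of the consensus idea follows: several embeddings are generated from perturbed feature subsets, their pairwise distances are pooled, and the pooled distances are re-embedded. It omits the paper's intelligent sub-sampling, embedding-strength selection and parallelization, and the dataset is a generic stand-in.

    ```python
    import numpy as np
    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA
    from sklearn.manifold import MDS
    from sklearn.metrics import pairwise_distances

    X, _ = load_digits(return_X_y=True)
    X = X[:200]                                   # keep the example small
    rng = np.random.default_rng(0)

    # Pool pairwise distances implied by several PCA embeddings of perturbed views.
    consensus = np.zeros((len(X), len(X)))
    n_embeddings = 10
    for _ in range(n_embeddings):
        feats = rng.choice(X.shape[1], size=40, replace=False)   # random feature subset
        emb = PCA(n_components=3).fit_transform(X[:, feats])
        consensus += pairwise_distances(emb)
    consensus /= n_embeddings

    # Embed the consensus distance matrix into a single low-dimensional space.
    final = MDS(n_components=3, dissimilarity="precomputed",
                random_state=0).fit_transform(consensus)
    print(final.shape)
    ```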

  3. Regional assessment of lake ecological states using Landsat: A classification scheme for alkaline-saline, flamingo lakes in the East African Rift Valley

    NASA Astrophysics Data System (ADS)

    Tebbs, E. J.; Remedios, J. J.; Avery, S. T.; Rowland, C. S.; Harper, D. M.

    2015-08-01

    In situ reflectance measurements and Landsat satellite imagery were combined to develop an optical classification scheme for alkaline-saline lakes in the Eastern Rift Valley. The classification allows the ecological state and consequent value, in this case to Lesser Flamingos, to be determined using Landsat satellite imagery. Lesser Flamingos depend on a network of 15 alkaline-saline lakes in the East African Rift Valley, where they feed by filtering cyanobacteria and benthic diatoms from the lakes' waters. The classification developed here was based on a decision tree which used the reflectance in Landsat ETM+ bands 2-4 to assign one of six classes: low phytoplankton biomass; suspended sediment-dominated; microphytobenthos; high cyanobacterial biomass; cyanobacterial scum and bleached cyanobacterial scum. The classification accuracy was 77% when verified against in situ measurements. Classified imagery and time series were produced for selected lakes, which show the different ecological behaviours of these complex systems. The results have highlighted the importance to flamingos of the food resources offered by the extremely remote Lake Logipi. This study has demonstrated the potential of high spatial resolution, low spectral resolution sensors for providing ecologically valuable information at a regional scale, for alkaline-saline lakes and similar hypereutrophic inland waters.
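
    The published decision tree assigns classes from thresholds on ETM+ bands 2-4; the sketch below shows the general form of such a tree with hypothetical thresholds, not the calibrated values of the study.

    ```python
    # Hypothetical band-threshold decision tree for the six lake classes.
    def classify_lake_pixel(b2, b3, b4):
        """Return one of six classes from green (b2), red (b3) and NIR (b4) reflectance."""
        if b4 > 0.35 and b3 > 0.30:
            return "bleached cyanobacterial scum"
        if b4 > 0.25:
            return "cyanobacterial scum"
        if b4 > 0.10 and b4 > b3:
            return "high cyanobacterial biomass"
        if b3 > b2:
            return "suspended sediment-dominated"
        if b4 > 0.05:
            return "microphytobenthos"
        return "low phytoplankton biomass"

    # Classify a few hypothetical pixels.
    for pixel in [(0.04, 0.03, 0.02), (0.08, 0.07, 0.15), (0.10, 0.20, 0.30)]:
        print(pixel, "->", classify_lake_pixel(*pixel))
    ```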

  4. Computer-aided Classification of Mammographic Masses Using Visually Sensitive Image Features

    PubMed Central

    Wang, Yunzhi; Aghaei, Faranak; Zarafshani, Ali; Qiu, Yuchen; Qian, Wei; Zheng, Bin

    2017-01-01

    Purpose To develop a new computer-aided diagnosis (CAD) scheme that computes visually sensitive image features routinely used by radiologists, in order to build a machine learning classifier and distinguish between malignant and benign breast masses detected on digital mammograms. Methods An image dataset including 301 breast masses was retrospectively selected. From each segmented mass region, we computed image features that mimic five categories of visually sensitive features routinely used by radiologists in reading mammograms. We then selected five optimal features in the five feature categories and applied logistic regression models for classification. A new CAD interface was also designed to show lesion segmentation, computed feature values and the classification score. Results Areas under the ROC curve (AUC) were 0.786±0.026 and 0.758±0.027 when classifying mass regions depicted on the two view images, respectively. By fusing the classification scores computed from the two regions, the AUC increased to 0.806±0.025. Conclusion This study demonstrated a new approach to developing a CAD scheme based on five visually sensitive image features. Combined with a "visual aid" interface, CAD results may be more easily explained to observers and may increase their confidence in CAD-generated classification results, compared with conventional CAD approaches that involve many complicated and visually insensitive texture features. PMID:27911353
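
    A minimal sketch of the two-view classification and score-fusion step is given below with scikit-learn; the five features are synthetic stand-ins for the visually sensitive features, and the fusion is a simple average of the two view-level scores.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    # Synthetic stand-ins for 301 masses with 5 features per view; the second view
    # is a noisy copy of the first to mimic correlated measurements.
    X_view1, y = make_classification(n_samples=301, n_features=5, n_informative=4,
                                     n_redundant=0, random_state=0)
    X_view2 = X_view1 + rng.normal(0, 0.5, size=X_view1.shape)

    idx_train, idx_test = train_test_split(np.arange(len(y)), random_state=0)

    scores = []
    for X_view in (X_view1, X_view2):
        clf = LogisticRegression(max_iter=1000).fit(X_view[idx_train], y[idx_train])
        scores.append(clf.predict_proba(X_view[idx_test])[:, 1])
        print("single-view AUC:", roc_auc_score(y[idx_test], scores[-1]))

    fused = np.mean(scores, axis=0)            # fuse the two view-level scores
    print("fused AUC:", roc_auc_score(y[idx_test], fused))
    ```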

  5. Wittgenstein's philosophy and a dimensional approach to the classification of mental disorders -- a preliminary scheme.

    PubMed

    Mackinejad, Kioumars; Sharifi, Vandad

    2006-01-01

    In this paper the importance of Wittgenstein's philosophical ideas for the justification of a dimensional approach to the classification of mental disorders is discussed. Some of his basic concepts in his Philosophical Investigations, such as 'family resemblances', 'grammar' and 'language-game' and their relations to the concept of mental disorder are explored.

  6. Classification Scheme for Items in CAAT.

    ERIC Educational Resources Information Center

    Epstein, Marion G.

    In planning the development of the system for computer assisted assembly of tests, it was agreed at the outset that one of the basic requirements for the successful initiation of any such system would be the development of a detailed item content classification system. The design of the system for classifying item content is a key element in…

  7. Mutual information-based analysis of JPEG2000 contexts.

    PubMed

    Liu, Zhen; Karam, Lina J

    2005-04-01

    Context-based arithmetic coding has been widely adopted in image and video compression and is a key component of the new JPEG2000 image compression standard. In this paper, the contexts used in JPEG2000 are analyzed using the mutual information, which is closely related to the compression performance. We first show that, when combining the contexts, the mutual information between the contexts and the encoded data will decrease unless the conditional probability distributions of the combined contexts are the same. Given I, the initial number of contexts, and F, the final desired number of contexts, there are S(I, F) possible context classification schemes where S(I, F) is called the Stirling number of the second kind. The optimal classification scheme is the one that gives the maximum mutual information. Instead of using an exhaustive search, the optimal classification scheme can be obtained through a modified generalized Lloyd algorithm with the relative entropy as the distortion metric. For binary arithmetic coding, the search complexity can be reduced by using dynamic programming. Our experimental results show that the JPEG2000 contexts capture the correlations among the wavelet coefficients very well. At the same time, the number of contexts used as part of the standard can be reduced without loss in the coding performance.
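
    The central information-theoretic point can be reproduced numerically: merging two contexts leaves the mutual information unchanged only when their conditional distributions are identical, and lowers it otherwise. The small joint distribution below is a hypothetical example, not one taken from JPEG2000.

    ```python
    import numpy as np

    def mutual_information(joint):
        """I(C;X) in bits for a joint probability table of shape (contexts, symbols)."""
        pc = joint.sum(axis=1, keepdims=True)
        px = joint.sum(axis=0, keepdims=True)
        ratio = np.divide(joint, pc * px, out=np.ones_like(joint), where=joint > 0)
        return float(np.sum(joint * np.log2(ratio)))

    def merge_contexts(joint, i, j):
        """Combine contexts i and j (assumes i < j) into a single context."""
        merged = np.delete(joint, j, axis=0)
        merged[i] = joint[i] + joint[j]
        return merged

    # Three contexts with p(X=1|C) of 0.1, 0.1 and 0.8, respectively.
    joint = np.array([[0.27, 0.03],
                      [0.27, 0.03],
                      [0.08, 0.32]])
    print("original I(C;X):          ", mutual_information(joint))
    print("merge identical contexts: ", mutual_information(merge_contexts(joint, 0, 1)))
    print("merge different contexts: ", mutual_information(merge_contexts(joint, 1, 2)))
    ```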

  8. A Rapid Approach to Modeling Species-Habitat Relationships

    NASA Technical Reports Server (NTRS)

    Carter, Geoffrey M.; Breinger, David R.; Stolen, Eric D.

    2005-01-01

    A growing number of species require conservation or management efforts. Success of these activities requires knowledge of the species' occurrence pattern. Species-habitat models developed from GIS data sources are commonly used to predict species occurrence, but commonly used data sources are often developed for purposes other than predicting species occurrence and are of inappropriate scale; the techniques used to extract predictor variables are often time consuming, cannot be repeated easily, and thus cannot efficiently reflect changing conditions. We used digital orthophotographs and a grid cell classification scheme to develop an efficient technique to extract predictor variables. We combined our classification scheme with a priori hypothesis development using expert knowledge and a previously published habitat suitability index, and used an objective model selection procedure to choose candidate models. We were able to classify a large area (57,000 ha) in a fraction of the time that would be required to map vegetation and were able to test models at varying scales using a windowing process. Interpretation of the selected models confirmed existing knowledge of factors important to Florida scrub-jay habitat occupancy. The potential uses and advantages of using a grid cell classification scheme in conjunction with expert knowledge or a habitat suitability index (HSI) and an objective model selection procedure are discussed.

  9. Parameter diagnostics of phases and phase transition learning by neural networks

    NASA Astrophysics Data System (ADS)

    Suchsland, Philippe; Wessel, Stefan

    2018-05-01

    We present an analysis of neural network-based machine learning schemes for phases and phase transitions in theoretical condensed matter research, focusing on neural networks with a single hidden layer. Such shallow neural networks were previously found to be efficient in classifying phases and locating phase transitions of various basic model systems. In order to rationalize the emergence of the classification process and for identifying any underlying physical quantities, it is feasible to examine the weight matrices and the convolutional filter kernels that result from the learning process of such shallow networks. Furthermore, we demonstrate how the learning-by-confusing scheme can be used, in combination with a simple threshold-value classification method, to diagnose the learning parameters of neural networks. In particular, we study the classification process of both fully-connected and convolutional neural networks for the two-dimensional Ising model with extended domain wall configurations included in the low-temperature regime. Moreover, we consider the two-dimensional XY model and contrast the performance of the learning-by-confusing scheme and convolutional neural networks trained on bare spin configurations to the case of preprocessed samples with respect to vortex configurations. We discuss these findings in relation to similar recent investigations and possible further applications.

  10. Psychological Features and Their Relationship to Movement-Based Subgroups in People Living With Low Back Pain.

    PubMed

    Karayannis, Nicholas V; Jull, Gwendolen A; Nicholas, Michael K; Hodges, Paul W

    2018-01-01

    To determine the distribution of higher psychological risk features within movement-based subgroups for people with low back pain (LBP). Cross-sectional observational study. Participants were recruited from physiotherapy clinics and community advertisements. Measures were collected at a university outpatient-based physiotherapy clinic. People (N=102) seeking treatment for LBP. Participants were subgrouped according to 3 classification schemes: Mechanical Diagnosis and Treatment (MDT), Treatment-Based Classification (TBC), and O'Sullivan Classification (OSC). Questionnaires were used to categorize low-, medium-, and high-risk features based on depression, anxiety, and stress (Depression, Anxiety, and Stress Scale-21 Items); fear avoidance (Fear-Avoidance Beliefs Questionnaire); catastrophizing and coping (Pain-Related Self-Symptoms Scale); and self-efficacy (Pain Self-Efficacy Questionnaire). Psychological risk profiles were compared between movement-based subgroups within each scheme. Scores across all questionnaires revealed that most patients had low psychological risk profiles, but there were instances of higher (range, 1%-25%) risk profiles within questionnaire components. The small proportion of individuals with higher psychological risk scores were distributed between subgroups across TBC, MDT, and OSC schemes. Movement-based subgrouping alone cannot inform on individuals with higher psychological risk features.

  11. 75 FR 56015 - Vessel Inspection Alternatives

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-15

    ... page 88, in Sec. 8.420, paragraph (c) is revised to read as follows: Sec. 8.420 Classification society authorization to participate in the Alternate Compliance Program. * * * * * (c) A recognized classification... (2) Must have performed a delegated function related to general vessel safety assessment, as defined...

  12. Computer classification of remotely sensed multispectral image data by extraction and classification of homogeneous objects

    NASA Technical Reports Server (NTRS)

    Kettig, R. L.

    1975-01-01

    A method of classification of digitized multispectral images is developed and experimentally evaluated on actual earth resources data collected by aircraft and satellite. The method is designed to exploit the characteristic dependence between adjacent states of nature that is neglected by the more conventional simple-symmetric decision rule. Thus contextual information is incorporated into the classification scheme. The principal reason for doing this is to improve the accuracy of the classification. For general types of dependence this would generally require more computation per resolution element than the simple-symmetric classifier. But when the dependence occurs in the form of redundancy, the elements can be classified collectively, in groups, thereby reducing the number of classifications required.

  13. Murmur intensity in adult dogs with pulmonic and subaortic stenosis reflects disease severity.

    PubMed

    Caivano, D; Dickson, D; Martin, M; Rishniw, M

    2018-03-01

    The aims of this study were to determine whether murmur intensity in adult dogs with pulmonic stenosis or subaortic stenosis reflects echocardiographic disease severity and to determine whether a six-level murmur grading scheme provides clinical advantages over a four-level scheme. In this retrospective multi-investigator study on adult dogs with pulmonic stenosis or subaortic stenosis, murmur intensity was compared to the echocardiographically determined pressure gradient across the affected valve. Disease severity, based on pressure gradients, was assessed between sequential murmur grades to identify redundancy in classification. A simplified four-level murmur intensity classification scheme ('soft', 'moderate', 'loud', 'palpable') was evaluated. In total, 284 dogs (153 with pulmonic stenosis, 131 with subaortic stenosis) were included; 55 dogs had soft, 59 had moderate, 72 had loud and 98 had palpable murmurs. Ninety-five dogs had mild stenosis, 46 had moderate stenosis, and 143 had severe stenosis. No dogs with soft murmurs of either pulmonic or subaortic stenosis had transvalvular pressure gradients greater than 50 mmHg. Dogs with loud or palpable murmurs mostly, but not always, had severe stenosis. Stenosis severity increased with increasing murmur intensity. The traditional six-level murmur grading scheme provided no additional clinical information compared with the four-level descriptive murmur grading scheme. A simplified descriptive four-level murmur grading scheme differentiated stenosis severity without loss of clinical information, compared to the traditional six-level scheme. Soft murmurs in dogs with pulmonic or subaortic stenosis are strongly indicative of mild lesions. Loud or palpable murmurs are strongly suggestive of severe stenosis. © 2017 British Small Animal Veterinary Association.

  14. Random forest feature selection, fusion and ensemble strategy: Combining multiple morphological MRI measures to discriminate among healthy elderly, MCI, cMCI and Alzheimer's disease patients: From the Alzheimer's Disease Neuroimaging Initiative (ADNI) database.

    PubMed

    Dimitriadis, S I; Liparas, Dimitris; Tsolaki, Magda N

    2018-05-15

    In the era of computer-assisted diagnostic tools for various brain diseases, Alzheimer's disease (AD) covers a large percentage of neuroimaging research, with the main scope being its use in daily practice. However, there has been no study attempting to simultaneously discriminate among Healthy Controls (HC), early mild cognitive impairment (MCI), late MCI (cMCI) and stable AD, using features derived from a single modality, namely MRI. Based on preprocessed MRI images from the organizers of a neuroimaging challenge, we attempted to quantify the prediction accuracy of multiple morphological MRI features to simultaneously discriminate among HC, MCI, cMCI and AD. We explored the efficacy of a novel scheme that includes multiple feature selections via Random Forest from subsets of the whole set of features (e.g. whole set, left/right hemisphere etc.), Random Forest classification using a fusion approach and ensemble classification via majority voting. From the ADNI database, 60 HC, 60 MCI, 60 cMCI and 60 AD subjects were used as a training set with known labels. An extra dataset of 160 subjects (HC: 40, MCI: 40, cMCI: 40 and AD: 40) was used as an external blind validation dataset to evaluate the proposed machine learning scheme. In the second blind dataset, we achieved a four-class classification accuracy of 61.9% by combining MRI-based features with a Random Forest-based Ensemble Strategy. We achieved the best classification accuracy of all teams that participated in this neuroimaging competition. The results demonstrate the effectiveness of the proposed scheme to simultaneously discriminate among four groups using morphological MRI features for the very first time in the literature. Hence, the proposed machine learning scheme can be used to define single and multi-modal biomarkers for AD.  Copyright © 2017 Elsevier B.V. All rights reserved.
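
    The general pattern described above, training Random Forests on different feature subsets and fusing their predictions by majority voting, can be sketched as follows. The data, the subset boundaries standing in for anatomical groupings, and the forest settings are all assumptions for illustration; they are not the ADNI features or the authors' exact selection and fusion rules.

```python
# Sketch of an ensemble of Random Forests trained on different feature
# subsets (e.g. "whole set", "left hemisphere", "right hemisphere"),
# fused by majority voting over the subset-specific predictions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=240, n_features=60, n_informative=20,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.33,
                                          random_state=0, stratify=y)

# hypothetical feature subsets standing in for anatomical groupings
subsets = {"whole": slice(0, 60), "left": slice(0, 30), "right": slice(30, 60)}

predictions = []
for name, cols in subsets.items():
    rf = RandomForestClassifier(n_estimators=300, random_state=0)
    rf.fit(X_tr[:, cols], y_tr)
    predictions.append(rf.predict(X_te[:, cols]))

# majority vote across the subset-specific forests
stacked = np.vstack(predictions)
vote = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, stacked)
print("ensemble accuracy:", (vote == y_te).mean())
```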

  15. Object-based delineation and classification of alluvial fans by application of mean-shift segmentation and support vector machines

    NASA Astrophysics Data System (ADS)

    Pipaud, Isabel; Lehmkuhl, Frank

    2017-09-01

    In the field of geomorphology, automated extraction and classification of landforms is one of the most active research areas. Until the late 2000s, this task had primarily been tackled using pixel-based approaches. As these methods consider pixels and pixel neighborhoods as the sole basic entities for analysis, they cannot account for the irregular boundaries of real-world objects. Object-based analysis frameworks emerging from the field of remote sensing have been proposed as an alternative approach, and were successfully applied in case studies falling in the domains of both general and specific geomorphology. In this context, the a-priori selection of scale parameters or bandwidths is crucial for the segmentation result, because inappropriate parametrization will either result in over-segmentation or insufficient segmentation. In this study, we describe a novel supervised method for delineation and classification of alluvial fans, and assess its applicability using an SRTM 1 arc-second DEM scene depicting a section of the north-eastern Mongolian Altai, located in northwest Mongolia. The approach is premised on the application of mean-shift segmentation and the use of a one-class support vector machine (SVM) for classification. To consider variability in terms of alluvial fan dimension and shape, segmentation is performed repeatedly for different weightings of the incorporated morphometric parameters as well as different segmentation bandwidths. The final classification layer is obtained by selecting, for each real-world object, the most appropriate segmentation result according to fuzzy membership values derived from the SVM classification. Our results show that mean-shift segmentation and SVM-based classification provide an effective framework for delineation and classification of a particular landform. Variable bandwidths and terrain parameter weightings were identified as being crucial for consideration of intra-class variability, and, in turn, for consistently high segmentation quality. Our analysis further reveals that incorporation of morphometric parameters quantifying specific morphological aspects of a landform is indispensable for developing an accurate classification scheme. Alluvial fans exhibiting accentuated composite morphologies were identified as a major challenge for automatic delineation, as they cannot be fully captured by a single segmentation run. There is, however, a high probability that this shortcoming can be overcome by enhancing the presented approach with a routine that merges fan sub-entities based on their spatial relationships.
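
    A bare-bones sketch of the two-stage idea, mean-shift segmentation at several bandwidths followed by one-class SVM scoring of segment descriptors, is given below. The per-pixel morphometric features, the segment descriptor (segment-mean features), and all parameter values are illustrative assumptions, not the authors' actual feature set or selection rule.

```python
# Sketch: (1) mean-shift segmentation of per-pixel morphometric features at
# several bandwidths, (2) one-class SVM scoring of segment-level descriptors
# against known alluvial-fan training segments.
import numpy as np
from sklearn.cluster import MeanShift
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)
n_pixels = 2000
# stand-in per-pixel morphometric features derived from a DEM
pixels = np.column_stack([rng.normal(size=n_pixels),      # e.g. slope
                          rng.normal(size=n_pixels)])     # e.g. curvature

def segment(features, bandwidth):
    """Mean-shift segmentation; returns one cluster label per pixel."""
    return MeanShift(bandwidth=bandwidth, bin_seeding=True).fit_predict(features)

def segment_descriptors(features, labels):
    """Mean feature vector per segment (a crude segment descriptor)."""
    return np.vstack([features[labels == k].mean(axis=0)
                      for k in np.unique(labels)])

# one-class SVM trained only on descriptors of known alluvial-fan segments
fan_training = rng.normal(loc=[1.0, 0.5], scale=0.3, size=(40, 2))
svm = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(fan_training)

# try several bandwidths; positive decision values flag fan-like segments
for bw in (0.5, 1.0, 2.0):
    labels = segment(pixels, bw)
    scores = svm.decision_function(segment_descriptors(pixels, labels))
    print(f"bandwidth {bw}: {np.sum(scores > 0)} candidate fan segments "
          f"out of {scores.size}")
```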

  16. 46 CFR 8.220 - Recognition of a classification society.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Recognition of a classification society. 8.220 Section 8... INSPECTION ALTERNATIVES Recognition of a Classification Society § 8.220 Recognition of a classification society. (a) A classification society must be recognized by the Commandant before it may receive statutory...

  17. 46 CFR 8.220 - Recognition of a classification society.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 1 2011-10-01 2011-10-01 false Recognition of a classification society. 8.220 Section 8... INSPECTION ALTERNATIVES Recognition of a Classification Society § 8.220 Recognition of a classification society. (a) A classification society must be recognized by the Commandant before it may receive statutory...

  18. 46 CFR 31.01-3 - Alternate compliance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... classification societies, including information for ordering copies of approved classification society rules and...; telephone (202) 372-1372; or fax (202) 372-1925. Approved classification society rules and supplements are...

  19. 46 CFR 31.01-3 - Alternate compliance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... classification societies, including information for ordering copies of approved classification society rules and...; telephone (202) 372-1372; or fax (202) 372-1925. Approved classification society rules and supplements are...

  20. An international phase 3 trial in head and neck cancer: quality of life and symptom results: EORTC 24954 on behalf of the EORTC Head and Neck and the EORTC Radiation Oncology Group.

    PubMed

    Bottomley, Andrew; Tridello, Gloria; Coens, Corneel; Rolland, Frederic; Tesselaar, Margot E T; Leemans, C Rene; Hupperets, Pierre; Licitra, Lisa; Vermorken, Jan B; Van Den Weyngaert, Danielle; Truc, Gilles; Barillot, Isabelle; Lefebvre, Jean-Louis

    2014-02-01

    The European Organization for Research and Treatment of Cancer (EORTC) 24954 phase 3 randomized clinical trial compared 2 schemes of combined chemotherapy for patients with resectable cancers of the hypopharynx and larynx: sequential induction chemotherapy and radiotherapy versus alternating chemoradiotherapy. The current study reports detailed effects of both treatment arms on health-related quality of life (HRQOL) and symptoms. A total of 450 patients aged 35 years to 76 years (World Health Organization performance status (WHO PS) ≤ 2) with untreated, resectable advanced squamous cell carcinoma of the larynx (tumor classification of T3-T4) or hypopharynx (tumor classification of T2-T3-T4) with regional lymph nodes in the neck classified as N0 to N2 with no metastases were randomized in this prospective phase 3 trial into either the sequential arm (control) or the alternating arm (experimental). QOL assessment was performed at randomization; at baseline; at 42 days; and at 6, 12, 24, 36, and 48 months. There were no observed differences with regard to the primary endpoint of Fatigue and secondary endpoint of Dyspnea. Significant differences were found in the secondary endpoints of Swallowing and Speech problems at 42 days after randomization in favor of patients in the sequential arm. Explanatory and sensitivity analysis revealed that the primary analysis favored the sequential arm, but the majority of differences in HRQOL did not exist at the end of treatment, and returned to baseline levels. In the current study, a trend toward worse scores was noted in the patients treated on the alternating chemoradiotherapy arm but very few differences reached the level of statistical significance. The HRQOL scores of the majority of patients returned to baseline after therapy. © 2013 American Cancer Society.

  1. A risk-based classification scheme for genetically modified foods. I: Conceptual development.

    PubMed

    Chao, Eunice; Krewski, Daniel

    2008-12-01

    The predominant paradigm for the premarket assessment of genetically modified (GM) foods reflects heightened public concern by focusing on foods modified by recombinant deoxyribonucleic acid (rDNA) techniques, while foods modified by other methods of genetic modification are generally not assessed for safety. To determine whether a GM product requires less or more regulatory oversight and testing, we developed and evaluated a risk-based classification scheme (RBCS) for crop-derived GM foods. The results of this research are presented in three papers. This paper describes the conceptual development of the proposed RBCS that focuses on two categories of adverse health effects: (1) toxic and antinutritional effects, and (2) allergenic effects. The factors that may affect the level of potential health risks of GM foods are identified. For each factor identified, criteria for differentiating health risk potential are developed. The extent to which a GM food satisfies applicable criteria for each factor is rated separately. A concern level for each category of health effects is then determined by aggregating the ratings for the factors using predetermined aggregation rules. An overview of the proposed scheme is presented, as well as the application of the scheme to a hypothetical GM food.

  2. Alternating Direction Implicit (ADI) schemes for a PDE-based image osmosis model

    NASA Astrophysics Data System (ADS)

    Calatroni, L.; Estatico, C.; Garibaldi, N.; Parisotto, S.

    2017-10-01

    We consider Alternating Direction Implicit (ADI) splitting schemes to efficiently compute the numerical solution of the PDE osmosis model considered by Weickert et al. in [10] for several imaging applications. The discretised scheme is shown to preserve analogous properties to the continuous model. The dimensional splitting strategy translates numerically into the solution of simple tridiagonal systems for which standard matrix factorisation techniques can be used to improve upon the performance of classical implicit methods, even for large time steps. Applications to the shadow removal problem are presented.

  3. [Total-body irradiation in non-Hodgkin's lymphomas as an alternative to chemotherapy].

    PubMed

    Rühl, U

    1977-05-01

    On the basis of previous experience and present results it can be stated that total-body irradiation is an effective therapeutic technique for the treatment of lymphocytic non-Hodgkin's lymphomas, including chronic lymphatic leukemia; first results from prospectively randomized studies even revealed a slight superiority of this method compared with the combined cytostatic therapy scheme (CVP) mostly applied at present. Particular advantages of total-body irradiation are its easy applicability, the relatively short time needed for treatment, and the lack of subjective secondary effects. Thus, ambulatory therapy can be performed without any difficulty. The only complication which may occur arises from myelotoxicity, which reaches its maximum only after the end of treatment. Careful follow-up of the patients is therefore indispensable. The indication for total-body irradiation in the treatment of non-Hodgkin's lymphomas depends on the objective findings, the stage of disease, and mainly on the histological classification.

  4. Threshold of toxicological concern values for non-genotoxic effects in industrial chemicals: re-evaluation of the Cramer classification.

    PubMed

    Kalkhof, H; Herzler, M; Stahlmann, R; Gundert-Remy, U

    2012-01-01

    The TTC concept employs available data from animal testing to derive a distribution of NOAELs. Taking a probabilistic view, the 5th percentile of the distribution is taken as a threshold value for toxicity. In this paper, we use 824 NOAELs from repeated dose toxicity studies of industrial chemicals to re-evaluate the currently employed TTC values, which have been derived for substances grouped according to the Cramer scheme (Cramer et al. in Food Cosm Toxicol 16:255-276, 1978) by Munro et al. (Food Chem Toxicol 34:829-867, 1996) and refined by Kroes and Kozianowski (Toxicol Lett 127:43-46, 2002), Kroes et al. 2000. In our data set, consisting of 756 NOAELs from 28-day repeated dose testing and 57 NOAELs from 90-day repeated dose testing, the experimental NOAEL had to be extrapolated to a chronic TTC using regulatory accepted extrapolation factors. The TTC values derived from our data set were higher than the currently used TTC values, confirming the safety of the latter. We analysed the predictive performance of the Cramer classification by comparing the classification by this tool with the guidance values for classification according to the Globally Harmonised System of classification and labelling of the United Nations (GHS). Nearly 90% of the chemicals were in Cramer class 3 and thus assumed to be highly toxic, compared with 22% according to the GHS. The Cramer classification underestimates the toxicity of chemicals in only 4.6% of cases. Hence, from a regulatory perspective, the Cramer classification scheme might be applied, as it overestimates the hazard of a chemical.

  5. Cluster designs to assess the prevalence of acute malnutrition by lot quality assurance sampling: a validation study by computer simulation

    PubMed Central

    Olives, Casey; Pagano, Marcello; Deitchler, Megan; Hedt, Bethany L; Egge, Kari; Valadez, Joseph J

    2009-01-01

    Traditional lot quality assurance sampling (LQAS) methods require simple random sampling to guarantee valid results. However, cluster sampling has been proposed to reduce the number of random starting points. This study uses simulations to examine the classification error of two such designs, a 67×3 (67 clusters of three observations) and a 33×6 (33 clusters of six observations) sampling scheme, to assess the prevalence of global acute malnutrition (GAM). Further, we explore the use of a 67×3 sequential sampling scheme for LQAS classification of GAM prevalence. Results indicate that, for independent clusters with moderate intracluster correlation for the GAM outcome, the three sampling designs maintain approximate validity for LQAS analysis. Sequential sampling can substantially reduce the average sample size that is required for data collection. The presence of intercluster correlation can dramatically impact the classification error associated with LQAS analysis. PMID:20011037
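
    The kind of simulation described above can be sketched in a few lines: draw cluster-level prevalences from a Beta distribution to induce intracluster correlation, sample 67 clusters of 3, apply an LQAS decision rule, and tally how often each true prevalence is classified as "high". The decision threshold and prevalence scenarios below are placeholders, not the values used in the study.

```python
# Monte-Carlo sketch of LQAS classification error under 67x3 cluster sampling.
# A Beta-binomial model induces intracluster correlation; the decision rule
# (classify "high GAM" if more than d positives out of n = 201) is illustrative.
import numpy as np

rng = np.random.default_rng(42)

def prob_classify_high(true_prev, icc, n_clusters=67, cluster_size=3,
                       decision_threshold=20, n_sims=5000):
    # Beta parameters giving mean = true_prev and intracluster correlation = icc
    a = true_prev * (1 - icc) / icc
    b = (1 - true_prev) * (1 - icc) / icc
    n_high = 0
    for _ in range(n_sims):
        p_clusters = rng.beta(a, b, size=n_clusters)       # cluster prevalences
        positives = rng.binomial(cluster_size, p_clusters).sum()
        n_high += positives > decision_threshold
    return n_high / n_sims

# classification probabilities at two hypothetical prevalences bracketing ~10%
print("P(classify high | prev =  5%):", prob_classify_high(0.05, icc=0.05))
print("P(classify high | prev = 15%):", prob_classify_high(0.15, icc=0.05))
```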

  6. Image-Based Airborne Sensors: A Combined Approach for Spectral Signatures Classification through Deterministic Simulated Annealing

    PubMed Central

    Guijarro, María; Pajares, Gonzalo; Herrera, P. Javier

    2009-01-01

    The increasing deployment of high-resolution airborne imaging sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during the flights. The classification of natural spectral signatures in images is one potential application. The current trend in classification is toward the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well-tested supervised parametric Bayesian estimator and Fuzzy Clustering. DSA is an optimization approach that minimizes an energy function. The main contribution of DSA is its ability to avoid local minima during the optimization process thanks to the annealing scheme. It outperforms the simple classifiers used in the combination and some combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm. PMID:22399989

  7. Integrating disparate lidar data at the national scale to assess the relationships between height above ground, land cover and ecoregions

    USGS Publications Warehouse

    Stoker, Jason M.; Cochrane, Mark A.; Roy, David P.

    2013-01-01

    With the acquisition of lidar data for over 30 percent of the US, it is now possible to assess the three-dimensional distribution of features at the national scale. This paper integrates over 350 billion lidar points from 28 disparate datasets into a national-scale database and evaluates whether height above ground is an important variable in the context of other national-scale layers, such as the US Geological Survey National Land Cover Database and the US Environmental Protection Agency ecoregions maps. While the results were not homoscedastic and the available data did not allow for a complete height census in any of the classes, it does appear that where lidar data were used, there were detectable differences in heights among many of these national classification schemes. This study supports the hypothesis that there were real, detectable differences in heights in certain national-scale classification schemes, despite height not being a variable used in any of the classification routines.

  8. Occupant detection using support vector machines with a polynomial kernel function

    NASA Astrophysics Data System (ADS)

    Destefanis, Eduardo A.; Kienzle, Eberhard; Canali, Luis R.

    2000-10-01

    When passengers or baby seats are badly positioned in car seats, an inflating air bag can injure or kill these individuals in an accident. A proposed solution is the use of range sensors to detect risky passenger and baby seat positions; such sensors allow the air bag inflation to be controlled. This work is concerned with the application of different classification schemes to a real-world problem and the optimization of a sensor as a function of classification performance. The sensor is constructed using a new technology called Photo-Mixer-Device (PMD). A systematic analysis of the occupant detection problem was made using real and virtual environments. The challenge is to find the best sensor geometry and to adapt a classification scheme under the current technological constraints. Passenger head position detection is also desirable. A couple of classifiers were combined in a simple configuration to reach this goal. Experiments and results are described.

  9. Healthy and Unhealthy Perfectionists among Academically Gifted Chinese Students in Hong Kong: Do Different Classification Schemes Make a Difference?

    ERIC Educational Resources Information Center

    Chan, David W.

    2010-01-01

    This study investigated the identification and distribution of perfectionist types with a sample of 111 academically gifted Chinese students aged 17 to 20 in Hong Kong. Three approaches to classification were employed. Apart from the direct questioning approach, the rational approach and the clustering approach classified students using their…

  10. Application of a hierarchical habitat unit classification system: stream habitat and salmonid distribution in Ward Creek, southeast Alaska.

    Treesearch

    M.D. Bryant; B.E. Wright; B.J. Davies

    1992-01-01

    A hierarchical classification system separating stream habitat into habitat units defined by stream morphology and hydrology was used in a pre-enhancement stream survey. The system separates habitat units into macrounits, mesounits, and microunits and includes a separate evaluation of instream cover that also uses the hierarchical scheme. This paper presents an...

  11. An evaluation of several different classification schemes - Their parameters and performance. [maximum likelihood decision for crop identification

    NASA Technical Reports Server (NTRS)

    Scholz, D.; Fuhs, N.; Hixson, M.

    1979-01-01

    The overall objective of this study was to apply and evaluate several of the currently available classification schemes for crop identification. The approaches examined were: (1) a per point Gaussian maximum likelihood classifier, (2) a per point sum of normal densities classifier, (3) a per point linear classifier, (4) a per point Gaussian maximum likelihood decision tree classifier, and (5) a texture sensitive per field Gaussian maximum likelihood classifier. Three agricultural data sets were used in the study: areas from Fayette County, Illinois, and Pottawattamie and Shelby Counties in Iowa. The segments were located in two distinct regions of the Corn Belt to sample variability in soils, climate, and agricultural practices.

  12. Identification and classification of known and putative antimicrobial compounds produced by a wide variety of Bacillales species.

    PubMed

    Zhao, Xin; Kuipers, Oscar P

    2016-11-07

    Gram-positive bacteria of the Bacillales are important producers of antimicrobial compounds that might be utilized for medical, food or agricultural applications. Thanks to the wide availability of whole genome sequence data and the development of specific genome mining tools, novel antimicrobial compounds, either ribosomally- or non-ribosomally produced, of various Bacillales species can be predicted and classified. Here, we provide a classification scheme of known and putative antimicrobial compounds in the specific context of Bacillales species. We identify and describe known and putative bacteriocins, non-ribosomally synthesized peptides (NRPs), polyketides (PKs) and other antimicrobials from 328 whole-genome sequenced strains of 57 species of Bacillales by using web based genome-mining prediction tools. We provide a classification scheme for these bacteriocins, update the findings of NRPs and PKs and investigate their characteristics and suitability for biocontrol by describing per class their genetic organization and structure. Moreover, we highlight the potential of several known and novel antimicrobials from various species of Bacillales. Our extended classification of antimicrobial compounds demonstrates that Bacillales provide a rich source of novel antimicrobials that can now readily be tapped experimentally, since many new gene clusters are identified.

  13. Flood Mapping in the Lower Mekong River Basin Using Daily MODIS Observations

    NASA Technical Reports Server (NTRS)

    Fayne, Jessica V.; Bolten, John D.; Doyle, Colin S.; Fuhrmann, Sven; Rice, Matthew T.; Houser, Paul R.; Lakshmi, Venkat

    2017-01-01

    In flat homogeneous terrain such as in Cambodia and Vietnam, the monsoon season brings significant and consistent flooding between May and November. To monitor flooding in the Lower Mekong region, the near real-time NASA Flood Extent Product (NASA-FEP) was developed using seasonal normalized difference vegetation index (NDVI) differences from the 250 m resolution Moderate Resolution Imaging Spectroradiometer (MODIS) sensor compared against daily observations. A percentage change interval classification relating to various stages of flooding might, however, be confusing to viewers or potential users, and therefore reduce product usage. To increase product usability through simplification, the classification intervals were compared with other commonly used change detection schemes to identify the change classification scheme that best delineates flooded areas. The percentage change method used in the NASA-FEP proved to be helpful in delineating flood boundaries compared to other change detection methods. The results of the accuracy assessments indicate that the -75% NDVI change interval can be reclassified to a descriptive 'flood' class. A binary system was used to simplify the interpretation of the NASA-FEP by removing extraneous information from lower interval change classes.
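
    The binary reclassification described above reduces to a simple per-pixel threshold on percentage NDVI change. The sketch below uses tiny synthetic arrays in place of 250 m MODIS rasters; only the -75% threshold comes from the abstract, everything else is illustrative.

```python
# Sketch of the binary flood reclassification: percentage NDVI change between
# a seasonal reference and a daily observation, with pixels at or below -75%
# change flagged as "flood".
import numpy as np

reference_ndvi = np.array([[0.62, 0.55, 0.70],
                           [0.48, 0.66, 0.59],
                           [0.71, 0.52, 0.64]])
daily_ndvi     = np.array([[0.10, 0.52, 0.12],
                           [0.45, 0.08, 0.55],
                           [0.15, 0.50, 0.60]])

percent_change = 100.0 * (daily_ndvi - reference_ndvi) / reference_ndvi
flood_mask = percent_change <= -75.0          # binary "flood" / "no flood"

print(np.round(percent_change, 1))
print(flood_mask.astype(int))
```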

  14. Automatic breast tissue density estimation scheme in digital mammography images

    NASA Astrophysics Data System (ADS)

    Menechelli, Renan C.; Pacheco, Ana Luisa V.; Schiabel, Homero

    2017-03-01

    Cases of breast cancer have increased substantially each year. Radiologists, however, are subject to subjectivity and interpretation failures that may affect the final diagnosis in this examination. High density of breast tissue is an important factor related to these failures. Thus, among other functions, some CADx (computer-aided diagnosis) schemes classify breasts according to the predominant density. To aid in such a procedure, this work describes automated software for classification and statistical reporting of the percentage distribution of breast tissue density, through analysis of sub-regions (ROIs) of the whole mammography image. Once the breast is segmented, the image is divided into regions from which texture features are extracted. An artificial neural network (MLP) was then used to categorize the ROIs. Experienced radiologists previously determined the density classification of the ROIs, which served as the reference for the software evaluation. In tests on a set of 400 images, the average accuracy was 88.7% for ROI classification and 83.25% for classification of whole-breast density into the 4 BI-RADS density classes. Furthermore, when considering only a simplified two-class division (high and low densities), the classifier accuracy reached 93.5%, with AUC = 0.95.

  15. Support vector machine and principal component analysis for microarray data classification

    NASA Astrophysics Data System (ADS)

    Astuti, Widi; Adiwijaya

    2018-03-01

    Cancer is a leading cause of death worldwide, although a significant proportion of cases can be cured if detected early. In recent decades, microarray technology has taken an important role in the diagnosis of cancer. Using data mining techniques, microarray data classification can be performed to improve the accuracy of cancer diagnosis compared to traditional techniques. Microarray data are characterized by small sample sizes but very high dimensionality, so the challenge for researchers is to provide solutions for microarray data classification with high performance in both accuracy and running time. This research proposes the use of Principal Component Analysis (PCA) as a dimensionality reduction method along with a Support Vector Machine (SVM), optimized through its kernel function, as the classifier for microarray data classification. The proposed scheme was applied to seven data sets using 5-fold cross validation, and evaluation and analysis were then conducted in terms of both accuracy and running time. The results showed that the scheme obtained 100% accuracy on the ovarian and lung cancer data when linear and cubic kernel functions were used. In terms of running time, PCA greatly reduced the running time for every data set.
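
    The scheme above is essentially a two-stage pipeline, PCA for dimensionality reduction followed by an SVM, evaluated with 5-fold cross validation. A minimal sketch is shown below on a synthetic "microarray-like" data set (few samples, many features); the 95% variance target and other settings are assumptions, not the paper's exact configuration.

```python
# Minimal PCA + SVM pipeline with 5-fold cross-validation, comparing a linear
# and a cubic (degree-3 polynomial) kernel, as a stand-in for the scheme above.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# synthetic stand-in: 100 samples, 2000 features (small n, huge dimension)
X, y = make_classification(n_samples=100, n_features=2000, n_informative=30,
                           random_state=0)

for kernel, extra in [("linear", {}), ("poly", {"degree": 3})]:   # "cubic" kernel
    model = make_pipeline(StandardScaler(),
                          PCA(n_components=0.95),   # keep 95% of the variance
                          SVC(kernel=kernel, **extra))
    scores = cross_val_score(model, X, y,
                             cv=StratifiedKFold(n_splits=5, shuffle=True,
                                                random_state=0))
    print(f"{kernel:6s} kernel: mean accuracy = {scores.mean():.3f}")
```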

  16. Creating a Taxonomy of Local Boards of Health Based on Local Health Departments’ Perspectives

    PubMed Central

    Shah, Gulzar H.; Sotnikov, Sergey; Leep, Carolyn J.; Ye, Jiali; Van Wave, Timothy W.

    2017-01-01

    Objectives To develop a local board of health (LBoH) classification scheme and empirical definitions to provide a coherent framework for describing variation in the LBoHs. Methods This study is based on data from the 2015 Local Board of Health Survey, conducted among a nationally representative sample of local health department administrators, with 394 responses. The classification development consisted of the following steps: (1) theoretically guided initial domain development, (2) mapping of the survey variables to the proposed domains, (3) data reduction using principal component analysis and group consensus, and (4) scale development and testing for internal consistency. Results The final classification scheme included 60 items across 6 governance function domains and an additional domain—LBoH characteristics and strengths, such as meeting frequency, composition, and diversity of information sources. Application of this classification strongly supports the premise that LBoHs differ in their performance of governance functions and in other characteristics. Conclusions The LBoH taxonomy provides an empirically tested standardized tool for classifying LBoHs from the viewpoint of local health department administrators. Future studies can use this taxonomy to better characterize the impact of LBoHs. PMID:27854524

  17. Detailed Quantitative Classifications of Galaxy Morphology

    NASA Astrophysics Data System (ADS)

    Nair, Preethi

    2018-01-01

    Understanding the physical processes responsible for the growth of galaxies is one of the key challenges in extragalactic astronomy. The assembly history of a galaxy is imprinted in a galaxy’s detailed morphology. The bulge-to-total ratio of galaxies, the presence or absence of bars, rings, spiral arms, tidal tails etc, all have implications for the past merger, star formation, and feedback history of a galaxy. However, current quantitative galaxy classification schemes are only useful for broad binning. They cannot classify or exploit the wide variety of galaxy structures seen in nature. Therefore, comparisons of observations with theoretical predictions of secular structure formation have only been conducted on small samples of visually classified galaxies. However large samples are needed to disentangle the complex physical processes of galaxy formation. With the advent of large surveys, like the Sloan Digital Sky Survey (SDSS) and the upcoming Large Synoptic Survey Telescope (LSST) and WFIRST, the problem of statistics will be resolved. However, the need for a robust quantitative classification scheme will still remain. Here I will present early results on promising machine learning algorithms that are providing detailed classifications, identifying bars, rings, multi-armed spiral galaxies, and Hubble type.

  18. Emotion recognition based on physiological changes in music listening.

    PubMed

    Kim, Jonghwa; André, Elisabeth

    2008-12-01

    Little attention has been paid so far to physiological signals for emotion recognition compared to audiovisual emotion channels such as facial expression or speech. This paper investigates the potential of physiological signals as reliable channels for emotion recognition. All essential stages of an automatic recognition system are discussed, from the recording of a physiological dataset to a feature-based multiclass classification. In order to collect a physiological dataset from multiple subjects over many weeks, we used a musical induction method which spontaneously leads subjects to real emotional states, without any deliberate lab setting. Four-channel biosensors were used to measure electromyogram, electrocardiogram, skin conductivity and respiration changes. A wide range of physiological features from various analysis domains, including time/frequency, entropy, geometric analysis, subband spectra, multiscale entropy, etc., is proposed in order to find the best emotion-relevant features and to correlate them with emotional states. The best features extracted are specified in detail and their effectiveness is proven by classification results. Classification of four musical emotions (positive/high arousal, negative/high arousal, negative/low arousal, positive/low arousal) is performed by using an extended linear discriminant analysis (pLDA). Furthermore, by exploiting a dichotomic property of the 2D emotion model, we develop a novel scheme of emotion-specific multilevel dichotomous classification (EMDC) and compare its performance with direct multiclass classification using the pLDA. Improved recognition accuracy of 95% and 70% for subject-dependent and subject-independent classification, respectively, is achieved by using the EMDC scheme.

  19. A prototype of mammography CADx scheme integrated to imaging quality evaluation techniques

    NASA Astrophysics Data System (ADS)

    Schiabel, Homero; Matheus, Bruno R. N.; Angelo, Michele F.; Patrocínio, Ana Claudia; Ventura, Liliane

    2011-03-01

    As all women over the age of 40 are recommended to undergo mammographic exams every two years, the demands on radiologists to evaluate mammographic images in short periods of time have increased considerably. As a tool to improve quality and accelerate analysis, CADe/Dx (computer-aided detection/diagnosis) schemes have been investigated, but very few complete CADe/Dx schemes have been developed and most are restricted to detection rather than diagnosis. The existing ones are usually tied to specific mammographic equipment (usually DR), which makes them very expensive. This paper therefore describes a prototype of a complete mammography CADx scheme developed by our research group and integrated with an imaging quality evaluation process. The basic structure consists of pre-processing modules based on image acquisition and digitization procedures (FFDM, CR or film + scanner), a segmentation tool to detect clustered microcalcifications and suspect masses, and a classification scheme that evaluates both the presence of microcalcification clusters and possible malignant masses based on their contour. The aim is to provide not only information on the detected structures but also a pre-report with a BI-RADS classification. At this time the system still lacks an interface integrating all the modules. Despite this, it is functional as a prototype for clinical practice testing, with results comparable to others reported in the literature.

  20. Divorcing Strain Classification from Species Names.

    PubMed

    Baltrus, David A

    2016-06-01

    Confusion about strain classification and nomenclature permeates modern microbiology. Although taxonomists have traditionally acted as gatekeepers of order, the numbers of, and speed at which, new strains are identified have outpaced the opportunity for professional classification for many lineages. Furthermore, the growth of bioinformatics and database-fueled investigations has placed metadata curation in the hands of researchers with little taxonomic experience. Here I describe practical challenges facing modern microbial taxonomy, provide an overview of complexities of classification for environmentally ubiquitous taxa like Pseudomonas syringae, and emphasize that classification can be independent of nomenclature. A move toward implementation of relational classification schemes based on inherent properties of whole genomes could provide sorely needed continuity in how strains are referenced across manuscripts and data sets. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Circulation Type Classifications and their nexus to Van Bebber's storm track Vb

    NASA Astrophysics Data System (ADS)

    Hofstätter, M.; Chimani, B.

    2012-04-01

    Circulation Type Classifications (CTCs) are tools to identify repetitive and predominantly stationary patterns of the atmospheric circulation over a certain area, with the purpose of enabling the recognition of specific characteristics in surface climate variables. Storm tracks, on the other hand, can be used to identify similar types of synoptic events from a non-stationary, kinematic perspective. Such a storm track classification for Europe was produced in the late 19th century by Van Bebber (1882, 1891), from which the famous types Vb and Vc/d have remained in use to the present day because of their association with major flooding events such as the August 2002 floods in Europe. In this work a systematic tracking procedure was developed to determine storm track types and their characteristics, especially for the Eastern Alpine Region, in the period 1961-2002, using ERA40 and ERA-Interim reanalyses. The focus is on cyclone tracks of type V as suggested by Van Bebber and on congeneric types. This new catalogue is used as a reference to verify the hypothesis of a certain coherence of storm track Vb with certain circulation types (e.g. Fricke and Kaminski, 2002). Selected objective and subjective classification schemes from the COST733 action (http://cost733.met.no/, Phillip et al. 2010) are used for this purpose, as well as the manual classification from ZAMG (Lauscher 1972 and 1985), in which storm track Vb has been classified explicitly on a daily basis since 1948. The latter scheme proves to be a valuable and unique data source for this question. Results show that no fewer than 146 storm tracks are identified as Vb between 1961 and 2002, whereas only three events could be found in the literature, pointing to considerable subjectivity and preconception regarding Vb storm tracks. The annual number of Vb storm tracks shows no significant trend over the 42 years, but large variations from year to year. The circulation type classification CAP27 (Cluster Analysis of Principal Components) is the best performing fully objective scheme tested here, showing the power to discriminate Vb events. Most of the other fully objective schemes perform far less well. The largest skill is seen in the subjective/manual CTCs, which emphasize relevant synoptic phenomena rather than purely mathematical criteria in the classification. The hypothesis of Fricke and Kaminski is clearly supported by this work: Vb storm tracks are embedded in one stationary circulation pattern or another, but to what extent depends on the specific characteristics of the CTC in question.

  2. Maxillectomy defects: a suggested classification scheme.

    PubMed

    Akinmoladun, V I; Dosumu, O O; Olusanya, A A; Ikusika, O F

    2013-06-01

    The term "maxillectomy" has been used to describe a variety of surgical procedures for a spectrum of diseases involving a diverse anatomical site. Hence, classifications of maxillectomy defects have often made communication difficult. This article highlights this problem, emphasises the need for a uniform system of classification and suggests a classification system which is simple and comprehensive. Articles related to this subject, especially those with specified classifications of maxillary surgical defects were sourced from the internet through Google, Scopus and PubMed using the search terms maxillectomy defects classification. A manual search through available literature was also done. The review of the materials revealed many classifications and modifications of classifications from the descriptive, reconstructive and prosthodontic perspectives. No globally acceptable classification exists among practitioners involved in the management of diseases in the mid-facial region. There were over 14 classifications of maxillary defects found in the English literature. Attempts made to address the inadequacies of previous classifications have tended to result in cumbersome and relatively complex classifications. A single classification that is based on both surgical and prosthetic considerations is most desirable and is hereby proposed.

  3. Mapping forest types in Worcester County, Maryland, using LANDSAT data

    NASA Technical Reports Server (NTRS)

    Burtis, J., Jr.; Witt, R. G.

    1981-01-01

    The feasibility of mapping Level 2 forest cover types for a county-sized area on Maryland's Eastern Shore was demonstrated. A Level 1 land use/land cover classification was carried out for all of Worcester County as well. A June 1978 LANDSAT scene was utilized in a classification which employed two software packages on different computers (IDIMS on an HP 3000 and ASTEP-II on a Univac 1108). A twelve category classification scheme was devised for the study area. Resulting products include black and white line printer maps, final color coded classification maps, digitally enhanced color imagery and tabulated acreage statistics for all land use and land cover types.

  4. Generalized Rainich conditions, generalized stress-energy conditions, and the Hawking-Ellis classification

    NASA Astrophysics Data System (ADS)

    Martín–Moruno, Prado; Visser, Matt

    2017-11-01

    The (generalized) Rainich conditions are algebraic conditions which are polynomial in the (mixed-component) stress-energy tensor. As such they are logically distinct from the usual classical energy conditions (NEC, WEC, SEC, DEC), and logically distinct from the usual Hawking-Ellis (Segré-Plebański) classification of stress-energy tensors (type I, type II, type III, type IV). There will of course be significant inter-connections between these classification schemes, which we explore in the current article. Overall, we shall argue that it is best to view the (generalized) Rainich conditions as a refinement of the classical energy conditions and the usual Hawking-Ellis classification.

  5. The Importance of Temporal and Spatial Vegetation Structure Information in Biotope Mapping Schemes: A Case Study in Helsingborg, Sweden

    NASA Astrophysics Data System (ADS)

    Gao, Tian; Qiu, Ling; Hammer, Mårten; Gunnarsson, Allan

    2012-02-01

    Temporal and spatial vegetation structure has an impact on biodiversity qualities. Yet current biotope mapping schemes incorporate these factors only to a limited extent. The purpose of this study is to evaluate the application of a modified biotope mapping scheme that includes temporal and spatial vegetation structure. A refined scheme was developed based on a biotope classification and applied to a green structure system in the city of Helsingborg in southern Sweden. It includes four parameters of vegetation structure: continuity of forest cover, age of dominant trees, horizontal structure, and vertical structure. The major green structure sites were determined by interpretation of panchromatic aerial photographs assisted by a field survey. A set of biotope maps was constructed on the basis of each level of the modified classification. The evaluation of the scheme focused on two aspects in particular: comparison of species richness between long-continuity and short-continuity forests, based on identification of woodland continuity using ancient woodland indicator (AWI) species and related historical documents, and the spatial distribution of animals in the green space in relation to vegetation structure. The results indicate that (1) with respect to forest continuity, verification against historical documents showed that the richness of AWI species was higher in long-continuity forests, that Simpson's diversity differed significantly between long- and short-continuity forests, and that total species richness and Shannon's diversity were markedly and very significantly higher in long-continuity forests; and (2) the spatial vegetation structure and age of stands influence the richness and abundance of avian fauna and rabbits, with distance to the nearest tree and shrub being a strong determinant of presence for these animal groups. It is concluded that continuity of forest cover, age of dominant trees, and horizontal and vertical structure of vegetation should now be included in urban biotope classifications.

  6. A proportional control scheme for high density force myography.

    PubMed

    Belyea, Alexander T; Englehart, Kevin B; Scheme, Erik J

    2018-08-01

    Force myography (FMG) has been shown to be a potentially higher-accuracy alternative to electromyography for pattern recognition based prosthetic control. Classification accuracy, however, is just one factor that affects the usability of a control system. Others, like the ability to start and stop, to coordinate dynamic movements, and to control the velocity of the device through some proportional control scheme, can be of equal importance. To impart effective fine control using FMG-based pattern recognition, it is important that a method of controlling the velocity of each motion be developed. In this work force myography data were collected from 14 able-bodied participants and one amputee participant as they performed a set of wrist and hand motions. The offline proportional control performance of a standard mean signal amplitude approach and a proposed regression-based alternative was compared. The impact of providing feedback during training, as well as the use of constrained or unconstrained hand and wrist contractions, was also evaluated. It is shown that the mean of rectified channel amplitudes approach commonly employed with electromyography does not translate to force myography. The proposed class-based regression proportional control approach is shown to significantly outperform this standard approach (p < 0.001), yielding R² correlation coefficients of 0.837 and 0.830 for constrained and unconstrained forearm contractions, respectively, for able-bodied participants. No significant difference (p = 0.693) was found in R² performance whether or not feedback was provided during training. The amputee subject achieved a classification accuracy of 83.4% ± 3.47%, demonstrating the ability to distinguish contractions well with FMG. In proportional control, the amputee participant achieved an R² of 0.375 for regression-based control during unconstrained contractions. This is lower than the unconstrained case for able-bodied subjects, possibly due to difficulty in visualizing contraction level modulation without feedback. This may be remedied by the use of a prosthetic limb that provides real-time feedback in the form of device speed. A novel class-specific regression-based approach to multi-class control is described and shown to provide an effective means of FMG-based proportional control.
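
    The core idea of class-specific regression proportional control, a classifier picks the motion class and a per-class regressor then maps the same features to a velocity command, can be sketched as follows. The synthetic FMG patterns, the LDA classifier, and the linear regressors are assumptions for illustration; they are not the authors' features, models, or training protocol.

```python
# Sketch of class-specific regression for proportional control: classify the
# motion from FMG features, then apply the regressor trained only on that
# class's data to estimate contraction intensity (the velocity command).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n_per_class, n_channels, classes = 200, 8, [0, 1, 2]   # e.g. 3 wrist/hand motions

X, y, intensity = [], [], []
for c in classes:
    pattern = rng.uniform(0.2, 1.0, size=n_channels)    # class-specific FMG pattern
    level = rng.uniform(0.1, 1.0, size=n_per_class)     # true contraction level
    X.append(level[:, None] * pattern
             + 0.02 * rng.normal(size=(n_per_class, n_channels)))
    y.append(np.full(n_per_class, c))
    intensity.append(level)
X, y, intensity = np.vstack(X), np.concatenate(y), np.concatenate(intensity)

classifier = LinearDiscriminantAnalysis().fit(X, y)
regressors = {c: LinearRegression().fit(X[y == c], intensity[y == c])
              for c in classes}

def proportional_output(sample):
    """Return (motion class, velocity command in [0, 1]) for one FMG frame."""
    c = int(classifier.predict(sample[None, :])[0])
    v = float(np.clip(regressors[c].predict(sample[None, :])[0], 0.0, 1.0))
    return c, v

print(proportional_output(X[5]), "true level:", round(float(intensity[5]), 2))
```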

  7. A Hybrid Semi-supervised Classification Scheme for Mining Multisource Geospatial Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vatsavai, Raju; Bhaduri, Budhendra L

    2011-01-01

    Supervised learning methods such as Maximum Likelihood (ML) are often used in land cover (thematic) classification of remote sensing imagery. The ML classifier relies exclusively on spectral characteristics of thematic classes whose statistical distributions (class conditional probability densities) are often overlapping. The spectral response distributions of thematic classes depend on many factors including elevation, soil types, and ecological zones. A second problem with statistical classifiers is the requirement of a large number of accurate training samples (10 to 30 times the number of dimensions), which are often costly and time consuming to acquire over large geographic regions. With the increasing availability of geospatial databases, it is possible to exploit the knowledge derived from these ancillary datasets to improve classification accuracies even when the class distributions are highly overlapping. Likewise, newer semi-supervised techniques can be adopted to improve the parameter estimates of the statistical model by utilizing a large number of easily available unlabeled training samples. Unfortunately, there is no convenient multivariate statistical model that can be employed for multisource geospatial databases. In this paper we present a hybrid semi-supervised learning algorithm that effectively exploits freely available unlabeled training samples from multispectral remote sensing images and also incorporates ancillary geospatial databases. We have conducted several experiments on real datasets, and our new hybrid approach shows over 25 to 35% improvement in overall classification accuracy over conventional classification schemes.
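
    The abstract's premise, that abundant unlabeled pixels can refine class models estimated from a small labeled sample, can be illustrated with a generic semi-supervised wrapper; this is emphatically not the authors' multisource hybrid algorithm, just a minimal stand-in using scikit-learn's self-training wrapper around a Gaussian naive-Bayes classifier on synthetic spectral-like features.

```python
# Generic semi-supervised sketch: a Gaussian naive-Bayes classifier wrapped in
# self-training, so that unlabeled samples (marked -1) help refine the class
# models learned from a small labeled subset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.semi_supervised import SelfTrainingClassifier

X, y = make_classification(n_samples=3000, n_features=6, n_informative=4,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0, stratify=y)

# keep only a handful of labels; mark the rest as unlabeled (-1)
y_semi = np.full_like(y_train, -1)
labeled_idx = np.random.default_rng(0).choice(len(y_train), size=60, replace=False)
y_semi[labeled_idx] = y_train[labeled_idx]

supervised = GaussianNB().fit(X_train[labeled_idx], y_train[labeled_idx])
semi = SelfTrainingClassifier(GaussianNB(), threshold=0.9).fit(X_train, y_semi)

print("supervised (60 labels only)   :", supervised.score(X_test, y_test))
print("semi-supervised (+ unlabeled) :", semi.score(X_test, y_test))
```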

  8. A novel transferable individual tree crown delineation model based on Fishing Net Dragging and boundary classification

    NASA Astrophysics Data System (ADS)

    Liu, Tao; Im, Jungho; Quackenbush, Lindi J.

    2015-12-01

    This study provides a novel approach to individual tree crown delineation (ITCD) from airborne Light Detection and Ranging (LiDAR) data in dense natural forests using two main steps: crown boundary refinement based on a proposed Fishing Net Dragging (FiND) method, and segment merging based on boundary classification. FiND starts with approximate tree crown boundaries derived using a traditional watershed method with Gaussian filtering and refines these boundaries using an algorithm that mimics how a fisherman drags a fishing net. Random forest machine learning is then used to classify boundary segments into two classes: boundaries between trees and boundaries between branches that belong to a single tree. Three groups of LiDAR-derived features, two from the pseudo waveform generated along crown boundaries and one from a canopy height model (CHM), were used in the classification. The proposed ITCD approach was tested using LiDAR data collected over a mountainous region in the Adirondack Park, NY, USA. Overall accuracy of boundary classification was 82.4%. Features derived from the CHM were generally more important in the classification than the features extracted from the pseudo waveform. A comprehensive accuracy assessment scheme for ITCD was also introduced, considering both the area of crown overlap and crown centroids. Accuracy assessment using this new scheme shows the proposed ITCD approach achieved overall accuracies of 74% and 78% for deciduous and mixed forest, respectively.

  9. The classification of phobic disorders.

    PubMed

    Sheehan, D V; Sheehan, K H

    The history of classification of phobic disorders is reviewed. Problems in the ability of current classification schemes to predict, control and describe the relationship between the symptoms and other phenomena are outlined. A new classification of phobic disorders is proposed based on the presence or absence of an endogenous anxiety syndrome with the phobias. The two categories of phobic disorder have a different clinical presentation and course, a different mean age of onset, distribution of age of onset, sex distribution, response to treatment modalities, GSR testing and habituation response. Empirical evidence supporting this proposal is cited. This classification has heuristic merit in guiding research efforts and discussions and in directing the clinician to a simple and practical solution of his patient's phobic disorder.

  10. A classification of open Gaussian dynamics

    NASA Astrophysics Data System (ADS)

    Grimmer, Daniel; Brown, Eric; Kempf, Achim; Mann, Robert B.; Martín-Martínez, Eduardo

    2018-06-01

    We introduce a classification scheme for the generators of bosonic open Gaussian dynamics, providing an instructive diagrammatic description of each type of dynamics. Using this classification, we discuss the consequences of imposing complete positivity on Gaussian dynamics. In particular, we show that non-symplectic operations must be active to allow for complete positivity. In addition, non-symplectic operations can, in fact, conserve the volume of phase space only if the restriction of complete positivity is lifted. We then discuss the implications for the relationship between information and energy flows in open quantum mechanics.

  11. Contemplating case mix: A primer on case mix classification and management.

    PubMed

    Costa, Andrew P; Poss, Jeffery W; McKillop, Ian

    2015-01-01

    Case mix classifications are the frameworks that underlie many healthcare funding schemes, including the so-called activity-based funding. Now more than ever, Canadian healthcare administrators are evaluating case mix-based funding and deciphering how they will influence their organization. Case mix is a topic fraught with technical jargon and largely relegated to government agencies or private industries. This article provides an abridged review of case mix classification as well as its implications for management in healthcare. © 2015 The Canadian College of Health Leaders.

  12. A three-parameter asteroid taxonomy

    NASA Technical Reports Server (NTRS)

    Tedesco, Edward F.; Williams, James G.; Matson, Dennis L.; Veeder, Glenn J.; Gradie, Jonathan C.

    1989-01-01

    Broadband U, V, and x photometry together with IRAS asteroid albedos have been used to construct an asteroid classification system. The system is based on three parameters (U-V and v-x color indices and visual geometric albedo), and it is able to place 96 percent of the present sample of 357 asteroids into 11 taxonomic classes. It is noted that all but one of these classes are analogous to those previously found using other classification schemes. The algorithm is shown to account for the observational uncertainties in each of the classification parameters.

  13. CANDELS Visual Classifications: Scheme, Data Release, and First Results

    NASA Technical Reports Server (NTRS)

    Kartaltepe, Jeyhan S.; Mozena, Mark; Kocevski, Dale; McIntosh, Daniel H.; Lotz, Jennifer; Bell, Eric F.; Faber, Sandy; Ferguson, Henry; Koo, David; Bassett, Robert; et al.

    2014-01-01

    We have undertaken an ambitious program to visually classify all galaxies in the five CANDELS fields down to H < 24.5 involving the dedicated efforts of 65 individual classifiers. Once completed, we expect to have detailed morphological classifications for over 50,000 galaxies spanning 0 < z < 4 over all the fields. Here, we present our detailed visual classification scheme, which was designed to cover a wide range of CANDELS science goals. This scheme includes the basic Hubble sequence types, but also includes a detailed look at mergers and interactions, the clumpiness of galaxies, k-corrections, and a variety of other structural properties. In this paper, we focus on the first field to be completed - GOODS-S, which has been classified at various depths. The wide area coverage spanning the full field (wide+deep+ERS) includes 7634 galaxies that have been classified by at least three different people. In the deep area of the field, 2534 galaxies have been classified by at least five different people at three different depths. With this paper, we release to the public all of the visual classifications in GOODS-S along with the Perl/Tk GUI that we developed to classify galaxies. We present our initial results here, including an analysis of our internal consistency and comparisons among multiple classifiers as well as a comparison to the Sérsic index. We find that the level of agreement among classifiers is quite good and depends on both the galaxy magnitude and the galaxy type, with disks showing the highest level of agreement and irregulars the lowest. A comparison of our classifications with the Sérsic index and rest-frame colors shows a clear separation between disk and spheroid populations. Finally, we explore morphological k-corrections between the V-band and H-band observations and find that a small fraction (84 galaxies in total) are classified as being very different between these two bands. These galaxies typically have very clumpy and extended morphology or are very faint in the V-band.

  14. Site classification of Indian strong motion network using response spectra ratios

    NASA Astrophysics Data System (ADS)

    Chopra, Sumer; Kumar, Vikas; Choudhury, Pallabee; Yadav, R. B. S.

    2018-03-01

    In the present study, we attempted to classify the Indian strong-motion sites, spread all over the Himalaya and adjoining region and located on varied geological formations, based on response spectral ratios. A total of 90 sites were classified based on 395 strong motion records from 94 earthquakes recorded at these sites. The magnitudes of these earthquakes are between 2.3 and 7.7, and the hypocentral distance in most cases is less than 50 km. The predominant period obtained from the response spectral ratios is used to classify these sites. It was found that the shape and predominant peaks of the spectra at these sites match those in Japan, Italy, Iran, and at some of the sites in Europe, and that the same classification scheme can be applied to the Indian strong motion network. We found that earlier schemes based on descriptions of near-surface geology, geomorphology, and topography were not able to capture the effect of sediment thickness. The sites are classified into seven classes (CL-I to CL-VII) with varying predominant periods and ranges, as proposed by Alessandro et al. (Bull Seismol Soc Am 102:680-695 2012). The effects of magnitude and hypocentral distance on the shape and predominant peaks were also studied and found to be very small. The classification scheme is robust and cost-effective and can be used in region-specific attenuation relationships to account for local site effects.
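
    A minimal sketch of picking a predominant period from horizontal-to-vertical response spectral ratios and mapping it to a site class. The period grid, the precomputed spectra, and the class boundaries are illustrative assumptions, not the published values.

      # Sketch: site classification from the peak of the averaged H/V response spectral ratio.
      import numpy as np

      periods = np.logspace(-2, 1, 200)                    # 0.01-10 s period grid

      def predominant_period(sa_horizontal, sa_vertical):
          """sa_*: (n_records, n_periods) response spectra; returns the H/V peak period."""
          hv = np.mean(sa_horizontal, axis=0) / np.mean(sa_vertical, axis=0)
          return periods[np.argmax(hv)]

      def site_class(tp, edges=(0.2, 0.4, 0.6, 0.8, 1.0, 1.5)):
          """Map a predominant period to one of seven classes CL-I..CL-VII
          (the boundary values here are placeholders)."""
          return "CL-" + "I II III IV V VI VII".split()[int(np.searchsorted(edges, tp))]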

  15. Reclassification: Rationale and Problems; Proceedings of a Conference on Reclassification held at the Center of Adult Education, University of Maryland, College Park, April 4 to 6, 1968.

    ERIC Educational Resources Information Center

    Perreault, Jean M., Ed.

    Several factors are involved in the decision to reclassify library collections and several problems and choices must be faced. The discussion of four classification schemes (Dewey Decimal, Library of Congress, Library of Congress subject-headings and Universal Decimal Classification) involved in the choices concerns their structure, currency,…

  16. Mandatory, Preferred, or Discretionary: How the Classification of Domestic Violence Warrantless Arrest Laws Impacts Their Estimated Effects on Intimate Partner Homicide

    ERIC Educational Resources Information Center

    Zeoli, April M.; Norris, Alexis; Brenner, Hannah

    2011-01-01

    Warrantless arrest laws for domestic violence (DV) are generally classified as discretionary, preferred, or mandatory, based on the level of power accorded to police in deciding whether to arrest. However, there is a lack of consensus in the literature regarding how each state's law should be categorized. Using three classification schemes, this…

  17. Formalizing Resources for Planning

    NASA Technical Reports Server (NTRS)

    Bedrax-Weiss, Tania; McGann, Conor; Ramakrishnan, Sailesh

    2003-01-01

    In this paper we present a classification scheme which circumscribes a large class of resources found in the real world. Building on the work of others we also define key properties of resources that allow formal expression of the proposed classification. Furthermore, operations that change the state of a resource are formalized. Together, properties and operations go a long way in formalizing the representation and reasoning aspects of resources for planning.

  18. CANDELS Visual Classifications: Scheme, Data Release, and First Results

    NASA Astrophysics Data System (ADS)

    Kartaltepe, Jeyhan S.; Mozena, Mark; Kocevski, Dale; McIntosh, Daniel H.; Lotz, Jennifer; Bell, Eric F.; Faber, Sandy; Ferguson, Henry; Koo, David; Bassett, Robert; Bernyk, Maksym; Blancato, Kirsten; Bournaud, Frederic; Cassata, Paolo; Castellano, Marco; Cheung, Edmond; Conselice, Christopher J.; Croton, Darren; Dahlen, Tomas; de Mello, Duilia F.; DeGroot, Laura; Donley, Jennifer; Guedes, Javiera; Grogin, Norman; Hathi, Nimish; Hilton, Matt; Hollon, Brett; Koekemoer, Anton; Liu, Nick; Lucas, Ray A.; Martig, Marie; McGrath, Elizabeth; McPartland, Conor; Mobasher, Bahram; Morlock, Alice; O'Leary, Erin; Peth, Mike; Pforr, Janine; Pillepich, Annalisa; Rosario, David; Soto, Emmaris; Straughn, Amber; Telford, Olivia; Sunnquist, Ben; Trump, Jonathan; Weiner, Benjamin; Wuyts, Stijn; Inami, Hanae; Kassin, Susan; Lani, Caterina; Poole, Gregory B.; Rizer, Zachary

    2015-11-01

    We have undertaken an ambitious program to visually classify all galaxies in the five CANDELS fields down to H < 24.5 involving the dedicated efforts of over 65 individual classifiers. Once completed, we expect to have detailed morphological classifications for over 50,000 galaxies spanning 0 < z < 4 over all the fields, with classifications from 3 to 5 independent classifiers for each galaxy. Here, we present our detailed visual classification scheme, which was designed to cover a wide range of CANDELS science goals. This scheme includes the basic Hubble sequence types, but also includes a detailed look at mergers and interactions, the clumpiness of galaxies, k-corrections, and a variety of other structural properties. In this paper, we focus on the first field to be completed—GOODS-S, which has been classified at various depths. The wide area coverage spanning the full field (wide+deep+ERS) includes 7634 galaxies that have been classified by at least three different people. In the deep area of the field, 2534 galaxies have been classified by at least five different people at three different depths. With this paper, we release to the public all of the visual classifications in GOODS-S along with the Perl/Tk GUI that we developed to classify galaxies. We present our initial results here, including an analysis of our internal consistency and comparisons among multiple classifiers as well as a comparison to the Sérsic index. We find that the level of agreement among classifiers is quite good (>70% across the full magnitude range) and depends on both the galaxy magnitude and the galaxy type, with disks showing the highest level of agreement (>50%) and irregulars the lowest (<10%). A comparison of our classifications with the Sérsic index and rest-frame colors shows a clear separation between disk and spheroid populations. Finally, we explore morphological k-corrections between the V-band and H-band observations and find that a small fraction (84 galaxies in total) are classified as being very different between these two bands. These galaxies typically have very clumpy and extended morphology or are very faint in the V-band.

  19. Control scheme for power modulation of a free piston Stirling engine

    DOEpatents

    Dhar, Manmohan

    1989-01-01

    The present invention relates to a control scheme for power modulation of a free-piston Stirling engine-linear alternator power generator system. The present invention includes connecting an autotransformer in series with a tuning capacitance between a linear alternator and a utility grid to maintain a constant displacement to piston stroke ratio and their relative phase angle over a wide range of operating conditions.

  20. Generalized interpretation scheme for arbitrary HR InSAR image pairs

    NASA Astrophysics Data System (ADS)

    Boldt, Markus; Thiele, Antje; Schulz, Karsten

    2013-10-01

    Land cover classification of remote sensing imagery is an important topic of research. For example, different applications require precise and fast information about the land cover of the imaged scenery (e.g., disaster management and change detection). Focusing on high resolution (HR) spaceborne remote sensing imagery, the user has the choice between passive and active sensor systems. Passive systems, such as multispectral sensors, have the disadvantage of being dependent on weather influences (fog, dust, clouds, etc.) and time of day, since they work in the visible part of the electromagnetic spectrum. Here, active systems like Synthetic Aperture Radar (SAR) provide improved capabilities. As an interactive method for analyzing HR InSAR image pairs, the CovAmCoh method was introduced in former studies. CovAmCoh represents the joint analysis of locality (coefficient of variation - Cov), backscatter (amplitude - Am) and temporal stability (coherence - Coh). It delivers information on the physical backscatter characteristics of imaged scene objects or structures and provides the opportunity to detect different classes of land cover (e.g., urban, rural, infrastructure and activity areas). As an example, railway tracks are easily distinguishable from other infrastructure due to their characteristic bluish coloring caused by the gravel between the sleepers. In consequence, imaged objects or structures have a characteristic appearance in CovAmCoh images, which allows the development of classification rules. In this paper, a generalized interpretation scheme for arbitrary InSAR image pairs using the CovAmCoh method is proposed. The scheme is based on analyzing the information content of typical CovAmCoh imagery using semisupervised k-means clustering. It is shown that eight classes model the main local information content of CovAmCoh images sufficiently and can be used as the basis for a classification scheme.
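
    A minimal sketch of the general idea: compute a coefficient-of-variation / amplitude / coherence composite for an InSAR pair and cluster it into eight classes. Plain k-means is used here as a stand-in for the semisupervised clustering mentioned above; window size and k are illustrative assumptions.

      # Sketch: CovAmCoh composite of an InSAR pair, clustered with k-means (k = 8).
      import numpy as np
      from scipy.ndimage import uniform_filter
      from sklearn.cluster import KMeans

      def covamcoh_stack(amp1, amp2, coherence, win=5):
          amp = 0.5 * (amp1 + amp2)
          mean = uniform_filter(amp, win)
          var = uniform_filter(amp ** 2, win) - mean ** 2
          cov = np.sqrt(np.clip(var, 0, None)) / (mean + 1e-9)   # coefficient of variation
          return np.dstack([cov, amp, coherence])                # CovAmCoh "RGB" cube

      def classify(stack, n_classes=8):
          h, w, c = stack.shape
          km = KMeans(n_clusters=n_classes, n_init=10, random_state=0)
          labels = km.fit_predict(stack.reshape(-1, c))
          return labels.reshape(h, w)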

  1. Moraine-dammed lake failures in Patagonia and assessment of outburst susceptibility in the Baker Basin

    NASA Astrophysics Data System (ADS)

    Iribarren Anacona, P.; Norton, K. P.; Mackintosh, A.

    2014-07-01

    Glacier retreat since the Little Ice Age has resulted in the development or expansion of hundreds of glacial lakes in Patagonia. Some of these lakes have produced large (≥ 10⁶ m³) Glacial Lake Outburst Floods (GLOFs) damaging inhabited areas. GLOF hazard studies in Patagonia have been mainly based on the analysis of short-term series (≤50 years) of flood data and until now no attempt has been made to identify the relative susceptibility of lakes to failure. Power schemes and associated infrastructure are planned for Patagonian basins that have historically been affected by GLOFs, and we now require a thorough understanding of the characteristics of dangerous lakes in order to assist with hazard assessment and planning. In this paper, the conditioning factors of 16 outbursts from moraine-dammed lakes in Patagonia were analysed. These data were used to develop a classification scheme designed to assess outburst susceptibility, based on image classification techniques, flow routing algorithms and the Analytical Hierarchy Process. This scheme was applied to the Baker Basin, Chile, where at least 7 moraine-dammed lakes have failed in historic time. We identified 386 moraine-dammed lakes in the Baker Basin of which 28 were classified with high or very high outburst susceptibility. Commonly, lakes with high outburst susceptibility are in contact with glaciers and have moderate (>8°) to steep (>15°) dam outlet slopes, akin to failed lakes in Patagonia. The proposed classification scheme is suitable for first-order GLOF hazard assessments in this region. However, rapidly changing glaciers in Patagonia make detailed analysis and monitoring of hazardous lakes and glaciated areas upstream from inhabited areas or critical infrastructure necessary, in order to better prepare for hazards emerging from an evolving cryosphere.
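
    A minimal sketch of deriving factor weights with the Analytical Hierarchy Process mentioned above: the principal eigenvector of a pairwise comparison matrix gives the weights, and a consistency ratio checks the judgments. The factors and judgment values are placeholders, not those used in the study.

      # Sketch: AHP weights (principal eigenvector) plus a consistency check.
      import numpy as np

      factors = ["glacier contact", "dam outlet slope", "lake growth", "avalanche path"]
      A = np.array([[1,   3,   5,   5],
                    [1/3, 1,   3,   3],
                    [1/5, 1/3, 1,   1],
                    [1/5, 1/3, 1,   1]], dtype=float)       # pairwise judgments (invented)

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()

      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)       # consistency index
      cr = ci / 0.90                             # random index RI is about 0.90 for n = 4
      print(dict(zip(factors, weights.round(3))), "CR =", round(float(cr), 3))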

  2. Moraine-dammed lake failures in Patagonia and assessment of outburst susceptibility in the Baker Basin

    NASA Astrophysics Data System (ADS)

    Iribarren Anacona, P.; Norton, K. P.; Mackintosh, A.

    2014-12-01

    Glacier retreat since the Little Ice Age has resulted in the development or expansion of hundreds of glacial lakes in Patagonia. Some of these lakes have produced large (≥ 10⁶ m³) Glacial Lake Outburst Floods (GLOFs) damaging inhabited areas. GLOF hazard studies in Patagonia have been mainly based on the analysis of short-term series (≤ 50 years) of flood data and until now no attempt has been made to identify the relative susceptibility of lakes to failure. Power schemes and associated infrastructure are planned for Patagonian basins that have historically been affected by GLOFs, and we now require a thorough understanding of the characteristics of dangerous lakes in order to assist with hazard assessment and planning. In this paper, the conditioning factors of 16 outbursts from moraine-dammed lakes in Patagonia were analysed. These data were used to develop a classification scheme designed to assess outburst susceptibility, based on image classification techniques, flow routing algorithms and the Analytical Hierarchy Process. This scheme was applied to the Baker Basin, Chile, where at least seven moraine-dammed lakes have failed in historic time. We identified 386 moraine-dammed lakes in the Baker Basin of which 28 were classified with high or very high outburst susceptibility. Commonly, lakes with high outburst susceptibility are in contact with glaciers and have moderate (> 8°) to steep (> 15°) dam outlet slopes, akin to failed lakes in Patagonia. The proposed classification scheme is suitable for first-order GLOF hazard assessments in this region. However, rapidly changing glaciers in Patagonia make detailed analysis and monitoring of hazardous lakes and glaciated areas upstream from inhabited areas or critical infrastructure necessary, in order to better prepare for hazards emerging from an evolving cryosphere.

  3. Diagnosis of breast masses from dynamic contrast-enhanced and diffusion-weighted MR: a machine learning approach.

    PubMed

    Cai, Hongmin; Peng, Yanxia; Ou, Caiwen; Chen, Minsheng; Li, Li

    2014-01-01

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is increasingly used for breast cancer diagnosis as a supplement to conventional imaging techniques. Combining diffusion-weighted imaging (DWI) with morphological and kinetic features from DCE-MRI to improve the discrimination of malignant from benign breast masses is rarely reported. The study comprised 234 female patients with 85 benign and 149 malignant lesions. Four distinct groups of features, coupled with pathological tests, were estimated to comprehensively characterize the pictorial properties of each lesion, which was obtained by a semi-automated segmentation method. A classical machine learning scheme, including feature subset selection and various classification schemes, was employed to build prognostic models, which served as a foundation for evaluating the combined effects of the multi-sided features for predicting the type of lesion. Various measurements, including cross validation and receiver operating characteristics, were used to quantify the diagnostic performance of each feature as well as of their combination. All seven features were found to be statistically different between the malignant and the benign groups, and their combination achieved the highest classification accuracy. The seven features include one pathological variable (age), one morphological variable (slope), three texture features (entropy, inverse difference and information correlation), one kinetic feature (SER) and one DWI feature (apparent diffusion coefficient, ADC). Together with the selected diagnostic features, various classical classification schemes were used to test their discrimination power through a cross-validation scheme. The averaged measurements of sensitivity, specificity, AUC and accuracy are 0.85, 0.89, 90.9% and 0.93, respectively. Multi-sided variables, which characterize the morphological, kinetic and pathological properties together with the DWI measurement of ADC, can dramatically improve the discriminatory power for breast lesions.
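
    A minimal sketch of the generic workflow described above (feature selection, a classical classifier, cross-validated ROC evaluation) using scikit-learn. The feature matrix, the number of selected features, and the choice of SVM are placeholders, not the authors' pipeline.

      # Sketch: feature selection + classifier + cross-validated ROC AUC.
      import numpy as np
      from sklearn.pipeline import Pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.feature_selection import SelectKBest, f_classif
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score, StratifiedKFold

      X = np.random.rand(234, 20)                  # 234 lesions x 20 candidate features
      y = np.random.randint(0, 2, 234)             # 0 = benign, 1 = malignant

      model = Pipeline([
          ("scale", StandardScaler()),
          ("select", SelectKBest(f_classif, k=7)),      # keep 7 discriminative features
          ("clf", SVC(kernel="rbf", probability=True)),
      ])
      auc = cross_val_score(model, X, y, scoring="roc_auc",
                            cv=StratifiedKFold(5, shuffle=True, random_state=0))
      print("cross-validated AUC: %.3f +/- %.3f" % (auc.mean(), auc.std()))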

  4. A cancelable biometric scheme based on multi-lead ECGs.

    PubMed

    Peng-Tzu Chen; Shun-Chi Wu; Jui-Hsuan Hsieh

    2017-07-01

    Biometric technologies offer great advantages over other recognition methods, but there are concerns that they may compromise the privacy of individuals. In this paper, an electrocardiogram (ECG)-based cancelable biometric scheme is proposed to relieve such concerns. In this scheme, distinct biometric templates for a given beat bundle are constructed via "subspace collapsing." To determine the identity of any unknown beat bundle, the multiple signal classification (MUSIC) algorithm, incorporating a "suppression and poll" strategy, is adopted. Unlike the existing cancelable biometric schemes, knowledge of the distortion transform is not required for recognition. Experiments with real ECGs from 285 subjects are presented to illustrate the efficacy of the proposed scheme. The best recognition rate of 97.58% was achieved under the test condition N_train = 10 and N_test = 10.

  5. Mechanization of Library Procedures in the Medium-Sized Medical Library: XIV. Correlations between National Library of Medicine Classification Numbers and MeSH Headings *

    PubMed Central

    Fenske, Ruth E.

    1972-01-01

    The purpose of this study was to determine the amount of correlation between National Library of Medicine classification numbers and MeSH headings in a body of cataloging which had already been done and then to find out which of two alternative methods of utilizing the correlation would be best. There was a correlation of 44.5% between classification numbers and subject headings in the data base studied, cataloging data covering 8,137 books. The results indicate that a subject heading index showing classification numbers would be the preferred method of utilization, because it would be more accurate than the alternative considered, an arrangement by classification numbers which would be consulted to obtain subject headings. PMID:16017607

  6. Classification image analysis: estimation and statistical inference for two-alternative forced-choice experiments

    NASA Technical Reports Server (NTRS)

    Abbey, Craig K.; Eckstein, Miguel P.

    2002-01-01

    We consider estimation and statistical hypothesis testing on classification images obtained from the two-alternative forced-choice experimental paradigm. We begin with a probabilistic model of task performance for simple forced-choice detection and discrimination tasks. Particular attention is paid to general linear filter models because these models lead to a direct interpretation of the classification image as an estimate of the filter weights. We then describe an estimation procedure for obtaining classification images from observer data. A number of statistical tests are presented for testing various hypotheses from classification images based on some more compact set of features derived from them. As an example of how the methods we describe can be used, we present a case study investigating detection of a Gaussian bump profile.
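
    A simplified sketch of a classification-image estimator for 2AFC data: averaging, over trials, the difference between the noise field shown in the chosen interval and that in the unchosen interval. This is a toy estimator under linear-observer assumptions, and the array names are illustrative; it is not the exact weighting derived in the paper.

      # Sketch: toy 2AFC classification-image estimator from per-trial noise fields.
      import numpy as np

      def classification_image(noise_chosen, noise_unchosen):
          """noise_*: (n_trials, h, w) arrays of the external noise in the interval
          the observer chose / did not choose; returns an estimate of the template."""
          return np.mean(noise_chosen - noise_unchosen, axis=0)

      # Pixel-wise significance (or tests on compact features derived from the image)
      # can then be assessed against the known external-noise variance.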

  7. A proposed radiographic classification scheme for congenital thoracic vertebral malformations in brachycephalic "screw-tailed" dog breeds.

    PubMed

    Gutierrez-Quintana, Rodrigo; Guevar, Julien; Stalin, Catherine; Faller, Kiterie; Yeamans, Carmen; Penderis, Jacques

    2014-01-01

    Congenital vertebral malformations are common in brachycephalic "screw-tailed" dog breeds such as French bulldogs, English bulldogs, Boston terriers, and pugs. The aim of this retrospective study was to determine whether a radiographic classification scheme developed for use in humans would be feasible for use in these dog breeds. Inclusion criteria were hospital admission between September 2009 and April 2013, neurologic examination findings available, diagnostic quality lateral and ventro-dorsal digital radiographs of the thoracic vertebral column, and at least one congenital vertebral malformation. Radiographs were retrieved and interpreted by two observers who were unaware of neurologic status. Vertebral malformations were classified based on a classification scheme modified from a previous human study and a consensus of both observers. Twenty-eight dogs met inclusion criteria (12 with neurologic deficits, 16 with no neurologic deficits). Congenital vertebral malformations affected 85/362 (23.5%) of thoracic vertebrae. Vertebral body formation defects were the most common (butterfly vertebrae 6.6%, ventral wedge-shaped vertebrae 5.5%, dorsal hemivertebrae 0.8%, and dorso-lateral hemivertebrae 0.5%). No lateral hemivertebrae or lateral wedge-shaped vertebrae were identified. The T7 vertebra was the most commonly affected (11/28 dogs), followed by T8 (8/28 dogs) and T12 (8/28 dogs). The number and type of vertebral malformations differed between groups (P = 0.01). Based on MRI, dorsal and dorso-lateral hemivertebrae were the cause of spinal cord compression in 5/12 (41.6%) of dogs with neurologic deficits. Findings indicated that a modified human radiographic classification system of vertebral malformations is feasible for use in future studies of brachycephalic "screw-tailed" dogs. © 2014 American College of Veterinary Radiology.

  8. Automated Classification of Thermal Infrared Spectra Using Self-organizing Maps

    NASA Technical Reports Server (NTRS)

    Roush, Ted L.; Hogan, Robert

    2006-01-01

    Existing and planned space missions to a variety of planetary and satellite surfaces produce an ever increasing volume of spectral data. Understanding the scientific informational content in this large data volume is a daunting task. Fortunately, various statistical approaches are available to assess such data sets. Here we discuss an automated classification scheme we have developed based on Kohonen Self-Organizing Maps (SOM). The SOM process produces an output layer where spectra having similar properties lie in close proximity to each other. One major effort is partitioning this output layer into appropriate regions. This is performed by defining closed regions based upon the strength of the boundaries between adjacent cells in the SOM output layer. We use the Davies-Bouldin index as a measure of the inter-class similarities and intra-class dissimilarities that determines the optimum partition of the output layer, and hence the number of SOM clusters. This allows us to identify the natural number of clusters formed from the spectral data. Mineral spectral libraries prepared at Arizona State University (ASU) and Johns Hopkins University (JHU) are used to test and evaluate the classification scheme. We label the library sample spectra in a hierarchical scheme with class, subclass, and mineral group names. We use a portion of the spectra to train the SOM, i.e., produce the output layer, while the remaining spectra are used to test the SOM. The test spectra are presented to the SOM output layer and assigned membership to the appropriate cluster. We then evaluate these assignments to assess the scientific meaning and accuracy of the derived SOM classes as they relate to the labels. We demonstrate that unsupervised classification by SOMs can be a useful component in autonomous systems designed to identify mineral species from reflectance and emissivity spectra in the thermal IR.
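
    A minimal sketch of the workflow described above: train a SOM on library spectra, partition its output layer, and use the Davies-Bouldin index to select the number of clusters. The minisom package and k-means partitioning of the codebook are stand-ins for the authors' own SOM code and boundary-based partitioning; the data are placeholders.

      # Sketch: SOM training, Davies-Bouldin-guided partitioning, and cluster assignment.
      import numpy as np
      from minisom import MiniSom
      from sklearn.cluster import KMeans
      from sklearn.metrics import davies_bouldin_score

      spectra = np.random.rand(500, 120)             # placeholder library spectra
      som = MiniSom(12, 12, spectra.shape[1], sigma=1.5, learning_rate=0.5, random_seed=0)
      som.train_random(spectra, 10000)

      codebook = som.get_weights().reshape(-1, spectra.shape[1])
      best_k, best_db, best_labels = None, np.inf, None
      for k in range(2, 15):
          labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(codebook)
          db = davies_bouldin_score(codebook, labels)
          if db < best_db:                            # lower Davies-Bouldin is better
              best_k, best_db, best_labels = k, db, labels

      # A test spectrum is assigned the cluster of its best-matching SOM cell:
      cell = som.winner(spectra[0])
      cluster = best_labels[cell[0] * 12 + cell[1]]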

  9. Alternative Path Communication in Wide-Scale Cluster-Tree Wireless Sensor Networks Using Inactive Periods

    PubMed Central

    Leão, Erico; Montez, Carlos; Moraes, Ricardo; Portugal, Paulo; Vasques, Francisco

    2017-01-01

    The IEEE 802.15.4/ZigBee cluster-tree topology is a suitable technology to deploy wide-scale Wireless Sensor Networks (WSNs). These networks are usually designed to support convergecast traffic, where all communication paths go through the PAN (Personal Area Network) coordinator. Nevertheless, peer-to-peer communication relationships may also be required for different types of WSN applications. That is the typical case of sensor and actuator networks, where local control loops must be closed using a reduced number of communication hops. The use of communication schemes optimised just for the support of convergecast traffic may result in higher network congestion and in a potentially higher number of communication hops. Within this context, this paper proposes an Alternative-Route Definition (ARounD) communication scheme for WSNs. The underlying idea of ARounD is to set up alternative communication paths between specific source and destination nodes, avoiding congested cluster-tree paths. These alternative paths consider shorter inter-cluster paths, using a set of intermediate nodes to relay messages during their inactive periods in the cluster-tree network. Simulation results show that the ARounD communication scheme can significantly decrease the end-to-end communication delay, when compared to the use of standard cluster-tree communication schemes. Moreover, the ARounD communication scheme is able to reduce the network congestion around the PAN coordinator, enabling the reduction of the number of message drops due to queue overflows in the cluster-tree network. PMID:28481245
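
    A minimal sketch of the routing idea described above: traffic normally funnels through the PAN coordinator along the cluster-tree, while an alternative route uses inter-cluster shortcut links that are only usable during the clusters' inactive periods. The toy topology, node names, and edge tags are illustrative assumptions.

      # Sketch: cluster-tree path vs. an alternative route that avoids the PAN coordinator.
      import networkx as nx

      G = nx.Graph()
      # cluster-tree edges (all traffic normally goes through the PAN coordinator "PAN")
      G.add_edges_from([("src", "C1"), ("C1", "PAN"), ("PAN", "C2"), ("C2", "dst")], kind="tree")
      # shortcut edges between neighbouring clusters, usable in their inactive periods
      G.add_edges_from([("C1", "C3"), ("C3", "C2")], kind="inactive")

      tree_path = nx.shortest_path(G, "src", "dst")        # goes through the PAN coordinator
      H = G.copy()
      H.remove_node("PAN")                                 # model congestion at the PAN
      around_path = nx.shortest_path(H, "src", "dst")      # alternative, ARounD-style route
      print(tree_path, around_path)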

  10. Alternative Path Communication in Wide-Scale Cluster-Tree Wireless Sensor Networks Using Inactive Periods.

    PubMed

    Leão, Erico; Montez, Carlos; Moraes, Ricardo; Portugal, Paulo; Vasques, Francisco

    2017-05-06

    The IEEE 802.15.4/ZigBee cluster-tree topology is a suitable technology to deploy wide-scale Wireless Sensor Networks (WSNs). These networks are usually designed to support convergecast traffic, where all communication paths go through the PAN (Personal Area Network) coordinator. Nevertheless, peer-to-peer communication relationships may also be required for different types of WSN applications. That is the typical case of sensor and actuator networks, where local control loops must be closed using a reduced number of communication hops. The use of communication schemes optimised just for the support of convergecast traffic may result in higher network congestion and in a potentially higher number of communication hops. Within this context, this paper proposes an Alternative-Route Definition (ARounD) communication scheme for WSNs. The underlying idea of ARounD is to set up alternative communication paths between specific source and destination nodes, avoiding congested cluster-tree paths. These alternative paths consider shorter inter-cluster paths, using a set of intermediate nodes to relay messages during their inactive periods in the cluster-tree network. Simulation results show that the ARounD communication scheme can significantly decrease the end-to-end communication delay, when compared to the use of standard cluster-tree communication schemes. Moreover, the ARounD communication scheme is able to reduce the network congestion around the PAN coordinator, enabling the reduction of the number of message drops due to queue overflows in the cluster-tree network.

  11. Computer-aided detection and diagnosis of masses and clustered microcalcifications from digital mammograms

    NASA Astrophysics Data System (ADS)

    Nishikawa, Robert M.; Giger, Maryellen L.; Doi, Kunio; Vyborny, Carl J.; Schmidt, Robert A.; Metz, Charles E.; Wu, Chris Y.; Yin, Fang-Fang; Jiang, Yulei; Huo, Zhimin; Lu, Ping; Zhang, Wei; Ema, Takahiro; Bick, Ulrich; Papaioannou, John; Nagel, Rufus H.

    1993-07-01

    We are developing an 'intelligent' workstation to assist radiologists in diagnosing breast cancer from mammograms. The hardware for the workstation will consist of a film digitizer, a high speed computer, a large volume storage device, a film printer, and 4 high resolution CRT monitors. The software for the workstation is a comprehensive package of automated detection and classification schemes. Two rule-based detection schemes have been developed, one for breast masses and the other for clustered microcalcifications. The sensitivity of both schemes is 85% with a false-positive rate of approximately 3.0 and 1.5 false detections per image, for the mass and cluster detection schemes, respectively. Computerized classification is performed by an artificial neural network (ANN). The ANN has a sensitivity of 100% with a specificity of 60%. Currently, the ANN, which is a three-layer, feed-forward network, requires as input ratings of 14 different radiographic features of the mammogram that were determined subjectively by a radiologist. We are in the process of developing automated techniques to objectively determine these 14 features. The workstation will be placed in the clinical reading area of the radiology department in the near future, where controlled clinical tests will be performed to measure its efficacy.
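
    A minimal sketch of the classification step described above: a small feed-forward network that maps 14 rated radiographic features to a malignancy probability. The data, network size, and use of scikit-learn's MLPClassifier are placeholders, not the authors' ANN.

      # Sketch: three-layer feed-forward classifier on 14 feature ratings.
      import numpy as np
      from sklearn.neural_network import MLPClassifier

      X = np.random.rand(300, 14)                  # 14 feature ratings per mammogram
      y = np.random.randint(0, 2, 300)             # 0 = benign, 1 = malignant
      ann = MLPClassifier(hidden_layer_sizes=(10,), activation="logistic",
                          max_iter=2000, random_state=0).fit(X, y)
      p_malignant = ann.predict_proba(X[:5])[:, 1]  # malignancy probabilities for 5 cases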

  12. A novel fractal image compression scheme with block classification and sorting based on Pearson's correlation coefficient.

    PubMed

    Wang, Jianji; Zheng, Nanning

    2013-09-01

    Fractal image compression (FIC) is an image coding technology based on the local similarity of image structure. It is widely used in many fields such as image retrieval, image denoising, image authentication, and encryption. FIC, however, suffers from the high computational complexity in encoding. Although many schemes are published to speed up encoding, they do not easily satisfy the encoding time or the reconstructed image quality requirements. In this paper, a new FIC scheme is proposed based on the fact that the affine similarity between two blocks in FIC is equivalent to the absolute value of Pearson's correlation coefficient (APCC) between them. First, all blocks in the range and domain pools are chosen and classified using an APCC-based block classification method to increase the matching probability. Second, by sorting the domain blocks with respect to APCCs between these domain blocks and a preset block in each class, the matching domain block for a range block can be searched in the selected domain set in which these APCCs are closer to APCC between the range block and the preset block. Experimental results show that the proposed scheme can significantly speed up the encoding process in FIC while preserving the reconstructed image quality well.
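
    A minimal sketch of the APCC idea described above: use the absolute Pearson correlation between a range block and candidate domain blocks to rank candidates. Block size, the block arrays, and the search strategy are illustrative assumptions; the full scheme additionally classifies and sorts domain blocks against preset reference blocks.

      # Sketch: absolute Pearson correlation (APCC) between image blocks.
      import numpy as np

      def apcc(block_a, block_b):
          a, b = block_a.ravel(), block_b.ravel()
          return abs(np.corrcoef(a, b)[0, 1])

      def best_match(range_block, domain_blocks):
          """Return the index and APCC of the best-matching domain block."""
          scores = np.array([apcc(range_block, d) for d in domain_blocks])
          return int(np.argmax(scores)), float(scores.max())

      # In the full scheme, domain blocks are first grouped and sorted by APCC against
      # preset reference blocks, so only a small, well-correlated subset is searched.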

  13. Infant Mortality: Development of a Proposed Update to the Dollfus Classification of Infant Deaths

    PubMed Central

    Dove, Melanie S.; Minnal, Archana; Damesyn, Mark; Curtis, Michael P.

    2015-01-01

    Objective Identifying infant deaths with common underlying causes and potential intervention points is critical to infant mortality surveillance and the development of prevention strategies. We constructed an International Classification of Diseases 10th Revision (ICD-10) parallel to the Dollfus cause-of-death classification scheme first published in 1990, which organized infant deaths by etiology and their amenability to prevention efforts. Methods Infant death records for 1996, dual-coded to the ICD Ninth Revision (ICD-9) and ICD-10, were obtained from the CDC public-use multiple-cause-of-death file on comparability between ICD-9 and ICD-10. We used the underlying cause of death to group 27,821 infant deaths into the nine categories of the ICD-9-based update to Dollfus' original coding scheme, published by Sowards in 1999. Comparability ratios were computed to measure concordance between ICD versions. Results The Dollfus classification system updated with ICD-10 codes had limited agreement with the 1999 modified classification system. Although prematurity, congenital malformations, Sudden Infant Death Syndrome, and obstetric conditions were the first through fourth most common causes of infant death under both systems, most comparability ratios were significantly different from one system to the other. Conclusion The Dollfus classification system can be adapted for use with ICD-10 codes to create a comprehensive, etiology-based profile of infant deaths. The potential benefits of using Dollfus logic to guide perinatal mortality reduction strategies, particularly to maternal and child health programs and other initiatives focused on improving infant health, warrant further examination of this method's use in perinatal mortality surveillance. PMID:26556935
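
    A minimal sketch of the two ingredients discussed above: mapping underlying-cause ICD-10 codes into broad etiologic categories and computing a comparability ratio between ICD-9- and ICD-10-based counts. The code-prefix mapping is a simplified illustration, not the published category definitions.

      # Sketch: ICD-10 prefix mapping to Dollfus-style categories and a comparability ratio.
      icd10_to_category = {
          "P07": "prematurity",
          "Q": "congenital malformations",
          "R95": "sudden infant death syndrome",
          "P01": "obstetric conditions",
      }

      def categorize(icd10_code):
          # check longer prefixes first so "P07" wins over a hypothetical "P"
          for prefix, category in sorted(icd10_to_category.items(), key=lambda kv: -len(kv[0])):
              if icd10_code.startswith(prefix):
                  return category
          return "other"

      def comparability_ratio(n_icd10, n_icd9):
          """Deaths assigned to a category under ICD-10 divided by those under ICD-9."""
          return n_icd10 / n_icd9 if n_icd9 else float("nan")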

  14. Classification of gravity-flow deposits and their significance for unconventional petroleum exploration, with a case study from the Triassic Yanchang Formation (southern Ordos Basin, China)

    NASA Astrophysics Data System (ADS)

    Fan, Aiping; Yang, Renchao; (Tom) van Loon, A. J.; Yin, Wei; Han, Zuozhen; Zavala, Carlos

    2018-08-01

    The ongoing exploration for shale oil and gas has focused sedimentological research on the transport and deposition mechanisms of fine-grained sediments, and more specifically on fine-grained mass-flow deposits. It appears, however, that no easily applicable classification scheme for gravity-flow deposits exists, and that such classifications almost exclusively deal with sandy and coarser sediments. Since the lack of a good classification system for fine-grained gravity-flow deposits hampers scientific communication and understanding, we propose a classification scheme based on the mud content in combination with the presumed transport mechanism. This results in twelve types of gravity-flow deposits. In order to show the practical applicability of this classification system, we apply it to the Triassic lacustrine Yanchang Formation in the southern Ordos Basin (China), which contains numerous slumps, debris-flow deposits, turbidites and hyperpycnites. The slumps and debrites occur mostly close to a delta front, and the turbidites and hyperpycnites extend over large areas from the delta slopes into the basin plain. The case study shows that (1) mud can not only be transported but also deposited under active hydrodynamic conditions; (2) fine-grained gravity-flow deposits constitute a significant part of the lacustrine mudstones and shales; and (3) muddy gravity flows are important for the transport and deposition of clastic particles, clay minerals and organic matter, and thus are important mechanisms in the generation of hydrocarbons, also largely determining the reservoir capability for unconventional petroleum.

  15. Functional Basis of Microorganism Classification.

    PubMed

    Zhu, Chengsheng; Delmont, Tom O; Vogel, Timothy M; Bromberg, Yana

    2015-08-01

    Correctly identifying nearest "neighbors" of a given microorganism is important in industrial and clinical applications where close relationships imply similar treatment. Microbial classification based on similarity of physiological and genetic organism traits (polyphasic similarity) is experimentally difficult and, arguably, subjective. Evolutionary relatedness, inferred from phylogenetic markers, facilitates classification but does not guarantee functional identity between members of the same taxon or lack of similarity between different taxa. Using over thirteen hundred sequenced bacterial genomes, we built a novel function-based microorganism classification scheme, functional-repertoire similarity-based organism network (FuSiON; flattened to fusion). Our scheme is phenetic, based on a network of quantitatively defined organism relationships across the known prokaryotic space. It correlates significantly with the current taxonomy, but the observed discrepancies reveal both (1) the inconsistency of functional diversity levels among different taxa and (2) an (unsurprising) bias towards prioritizing, for classification purposes, relatively minor traits of particular interest to humans. Our dynamic network-based organism classification is independent of the arbitrary pairwise organism similarity cut-offs traditionally applied to establish taxonomic identity. Instead, it reveals natural, functionally defined organism groupings and is thus robust in handling organism diversity. Additionally, fusion can use organism meta-data to highlight the specific environmental factors that drive microbial diversification. Our approach provides a complementary view to cladistic assignments and holds important clues for further exploration of microbial lifestyles. Fusion is a more practical fit for biomedical, industrial, and ecological applications, as many of these rely on understanding the functional capabilities of the microbes in their environment and are less concerned with phylogenetic descent.

  16. Treatment outcomes of saddle nose correction.

    PubMed

    Hyun, Sang Min; Jang, Yong Ju

    2013-01-01

    Many valuable classification schemes for saddle nose have been suggested that integrate clinical deformity and treatment; however, there is no consensus regarding the most suitable classification and surgical method for saddle nose correction. To present the clinical characteristics and treatment outcomes of saddle nose deformity and to propose a modified classification system to better characterize the variety of different saddle nose deformities. The retrospective study included 91 patients who underwent rhinoplasty for correction of saddle nose from April 1, 2003, through December 31, 2011, with a minimum follow-up of 8 months. Saddle nose was classified into 4 types according to a modified classification. Aesthetic outcomes were classified as excellent, good, fair, or poor. Patients underwent minor cosmetic concealment by dorsal augmentation (n = 8) or major septal reconstruction combined with dorsal augmentation (n = 83). Autologous costal cartilages were used in 40 patients (44%), and homologous costal cartilages were used in 5 patients (6%). According to postoperative assessment, 29 patients had excellent, 42 good, 18 fair, and 2 poor aesthetic outcomes. No statistical difference in surgical outcome according to saddle nose classification was observed. Eight patients underwent revision rhinoplasty owing to recurrence of saddle, wound infection, or warping of the costal cartilage used for dorsal augmentation. We introduce a modified saddle nose classification scheme that is simpler and better able to characterize the different deformities. Among 91 patients with saddle nose, 20 (22%) had unsuccessful outcomes (fair or poor) and 8 (9%) underwent subsequent revision rhinoplasty. Thus, management of saddle nose deformities remains challenging. Level of evidence: 4.

  17. Functional Basis of Microorganism Classification

    PubMed Central

    Zhu, Chengsheng; Delmont, Tom O.; Vogel, Timothy M.; Bromberg, Yana

    2015-01-01

    Correctly identifying nearest “neighbors” of a given microorganism is important in industrial and clinical applications where close relationships imply similar treatment. Microbial classification based on similarity of physiological and genetic organism traits (polyphasic similarity) is experimentally difficult and, arguably, subjective. Evolutionary relatedness, inferred from phylogenetic markers, facilitates classification but does not guarantee functional identity between members of the same taxon or lack of similarity between different taxa. Using over thirteen hundred sequenced bacterial genomes, we built a novel function-based microorganism classification scheme, functional-repertoire similarity-based organism network (FuSiON; flattened to fusion). Our scheme is phenetic, based on a network of quantitatively defined organism relationships across the known prokaryotic space. It correlates significantly with the current taxonomy, but the observed discrepancies reveal both (1) the inconsistency of functional diversity levels among different taxa and (2) an (unsurprising) bias towards prioritizing, for classification purposes, relatively minor traits of particular interest to humans. Our dynamic network-based organism classification is independent of the arbitrary pairwise organism similarity cut-offs traditionally applied to establish taxonomic identity. Instead, it reveals natural, functionally defined organism groupings and is thus robust in handling organism diversity. Additionally, fusion can use organism meta-data to highlight the specific environmental factors that drive microbial diversification. Our approach provides a complementary view to cladistic assignments and holds important clues for further exploration of microbial lifestyles. Fusion is a more practical fit for biomedical, industrial, and ecological applications, as many of these rely on understanding the functional capabilities of the microbes in their environment and are less concerned with phylogenetic descent. PMID:26317871

  18. Changing Patient Classification System for Hospital Reimbursement in Romania

    PubMed Central

    Radu, Ciprian-Paul; Chiriac, Delia Nona; Vladescu, Cristian

    2010-01-01

    Aim To evaluate the effects of the change in the diagnosis-related group (DRG) system on patient morbidity and hospital financial performance in the Romanian public health care system. Methods Three variables were assessed before and after the classification switch in July 2007: clinical outcomes, the case mix index, and hospital budgets, using the database of the National School of Public Health and Health Services Management, which contains data regularly received from hospitals reimbursed through the Romanian DRG scheme (291 in 2009). Results The lack of a Romanian system for the calculation of cost-weights imposed the necessity to use an imported system, which was criticized by some clinicians for not accurately reflecting resource consumption in Romanian hospitals. The new DRG classification system allowed a more accurate clinical classification. However, it also exposed a lack of physicians’ knowledge on diagnosing and coding procedures, which led to incorrect coding. Consequently, the reported hospital morbidity changed after the DRG switch, reflecting an increase in the national case mix index of 25% in 2009 (compared with 2007). Since hospitals received the same reimbursement over the first two years after the classification switch, the new DRG system led them sometimes to change patients' diagnoses in order to receive more funding. Conclusion Lack of oversight of hospital coding and reporting to the national reimbursement scheme allowed the increase in the case mix index. The complexity of the new classification system requires more resources (human and financial), better monitoring and evaluation, and improved legislation in order to achieve better hospital resource allocation and more efficient patient care. PMID:20564769
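
    A minimal sketch of the case-mix index referred to above: the average relative cost weight of the DRGs billed by a hospital, so that a rise in coded case complexity raises the index and, under activity-based funding, the reimbursement. The DRG codes and weights below are invented for illustration.

      # Sketch: case-mix index (CMI) as the mean DRG relative cost weight.
      drg_cost_weight = {"G01": 0.8, "G02": 1.2, "G03": 2.5}   # hypothetical DRG weights

      def case_mix_index(discharges):
          """discharges: list of DRG codes billed over a period."""
          weights = [drg_cost_weight[d] for d in discharges]
          return sum(weights) / len(weights)

      print(case_mix_index(["G01", "G01", "G02", "G03"]))      # 1.325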

  19. Changing patient classification system for hospital reimbursement in Romania.

    PubMed

    Radu, Ciprian-Paul; Chiriac, Delia Nona; Vladescu, Cristian

    2010-06-01

    To evaluate the effects of the change in the diagnosis-related group (DRG) system on patient morbidity and hospital financial performance in the Romanian public health care system. Three variables were assessed before and after the classification switch in July 2007: clinical outcomes, the case mix index, and hospital budgets, using the database of the National School of Public Health and Health Services Management, which contains data regularly received from hospitals reimbursed through the Romanian DRG scheme (291 in 2009). The lack of a Romanian system for the calculation of cost-weights imposed the necessity to use an imported system, which was criticized by some clinicians for not accurately reflecting resource consumption in Romanian hospitals. The new DRG classification system allowed a more accurate clinical classification. However, it also exposed a lack of physicians' knowledge on diagnosing and coding procedures, which led to incorrect coding. Consequently, the reported hospital morbidity changed after the DRG switch, reflecting an increase in the national case-mix index of 25% in 2009 (compared with 2007). Since hospitals received the same reimbursement over the first two years after the classification switch, the new DRG system led them sometimes to change patients' diagnoses in order to receive more funding. Lack of oversight of hospital coding and reporting to the national reimbursement scheme allowed the increase in the case-mix index. The complexity of the new classification system requires more resources (human and financial), better monitoring and evaluation, and improved legislation in order to achieve better hospital resource allocation and more efficient patient care.

  20. What Is the Reference? An Examination of Alternatives to the Reference Sources Used in IES TM-30-15

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Royer, Michael P.

    A study was undertaken to document the role of the reference illuminant in the IES TM-30-15 method for evaluating color rendition. TM-30-15 relies on a relative reference scheme; that is, the reference illuminant and test source always have the same correlated color temperature (CCT). The reference illuminant is a Planckian radiator, a model of daylight, or a combination of those two, depending on the exact CCT of the test source. Three alternative reference schemes were considered: 1) using all Planckian radiators or all daylight models; 2) using only one of ten possible illuminants (Planckian, daylight, or equal energy), regardless of the CCT of the test source; 3) using an off-Planckian reference illuminant (i.e., a source with a negative Duv). No reference scheme is inherently superior to another, with differences in metric values largely a result of small differences in gamut shape of the reference alternatives. While using any of the alternative schemes is more reasonable in the TM-30-15 evaluation framework than it was with the CIE CRI framework, the differences still ultimately manifest only as changes in interpretation of the results. References are employed in color rendering measures to provide a familiar point of comparison, not to establish an ideal source.
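
    A minimal sketch of a relative reference scheme of the kind discussed above: Planckian at low CCT, a daylight model at high CCT, and a blend in between. The 4500-5500 K transition follows the TM-30-15 convention as commonly described, but the thresholds and the spectrum-generating callables should be treated as assumptions here.

      # Sketch: selecting/blending the reference spectrum as a function of test-source CCT.
      def reference_spd(cct, planckian_spd, daylight_spd):
          """planckian_spd/daylight_spd: callables returning a spectrum (array) at a given CCT."""
          if cct <= 4500:
              return planckian_spd(cct)
          if cct >= 5500:
              return daylight_spd(cct)
          w = (cct - 4500) / 1000.0                     # linear blend weight in the transition
          return (1 - w) * planckian_spd(cct) + w * daylight_spd(cct)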

  1. In-vivo determination of chewing patterns using FBG and artificial neural networks

    NASA Astrophysics Data System (ADS)

    Pegorini, Vinicius; Zen Karam, Leandro; Rocha Pitta, Christiano S.; Ribeiro, Richardson; Simioni Assmann, Tangriani; Cardozo da Silva, Jean Carlos; Bertotti, Fábio L.; Kalinowski, Hypolito J.; Cardoso, Rafael

    2015-09-01

    This paper reports on pattern classification of the chewing process in ruminants. We propose a simplified signal processing scheme for optical fiber Bragg grating (FBG) sensors based on machine learning techniques. The FBG sensors measure the biomechanical forces during jaw movements, and an artificial neural network is responsible for the classification of the associated chewing pattern. In this study, three patterns associated with dietary supplement, hay and ryegrass were considered. Additionally, two other events important for ingestive behavior studies were monitored: rumination and idle periods. Experimental results show that the proposed approach to pattern classification is capable of differentiating the materials involved in the chewing process with a small classification error.

  2. Automated source classification of new transient sources

    NASA Astrophysics Data System (ADS)

    Oertel, M.; Kreikenbohm, A.; Wilms, J.; DeLuca, A.

    2017-10-01

    The EXTraS project harvests the hitherto unexplored temporal-domain information buried in the serendipitous data collected by the European Photon Imaging Camera (EPIC) onboard the ESA XMM-Newton mission since its launch. This includes a search for fast transients, missed by standard image analysis, and a search for and characterization of variability in hundreds of thousands of sources. We present an automated classification scheme for new transient sources in the EXTraS project. The method is as follows: classification features of a training sample are used to train machine learning algorithms (implemented in R; randomForest (Breiman, 2001) in supervised mode), which are then tested on a sample of sources of known class and used for classification.

  3. Development of a Procurement Task Classification Scheme.

    DTIC Science & Technology

    1987-12-01

    Office of Scientific Research, Arlington, Virginia, January 1970. Tornow, Walter W. and Pinto, Patrick R. "The Development of a Managerial Job..." classification. [Ref. 4:271] Numerical taxonomy proponents hold [Ref. 4:271] ... that the relationships of contiguity and similarity should be... solving. These primitive categories are based on a sorting of learning processes into classes that have obvious differences at the...

  4. USCS and the USDA Soil Classification System: Development of a Mapping Scheme

    DTIC Science & Technology

    2015-03-01

    important to human daily living. A variety of disciplines (geology, agriculture, engineering, etc.) require a systematic categorization of soil, detailing... it is often important to also consider parameters that indicate soil strength. Two important properties used for engineering-related problems are... that many textural classification systems were developed to meet specific needs. In agriculture, textural classification is used to determine crop

  5. Revealing how different spinors can be: The Lounesto spinor classification

    NASA Astrophysics Data System (ADS)

    Hoff da Silva, J. M.; Cavalcanti, R. T.

    2017-11-01

    This paper aims to give a coordinate-based introduction to the so-called Lounesto spinorial classification scheme. Among other results, it has evinced classes of spinors which fail to satisfy the Dirac equation. The underlying idea and the central aspects of such a spinorial categorization are introduced on an argumentative basis, after which we delve into a commented account of recent results obtained from (and within) this branch of research.

  6. Classification and overview of research in real-time imaging

    NASA Astrophysics Data System (ADS)

    Sinha, Purnendu; Gorinsky, Sergey V.; Laplante, Phillip A.; Stoyenko, Alexander D.; Marlowe, Thomas J.

    1996-10-01

    Real-time imaging has application in areas such as multimedia, virtual reality, medical imaging, and remote sensing and control. Recently, the imaging community has witnessed a tremendous growth in research and new ideas in these areas. To lend structure to this growth, we outline a classification scheme and provide an overview of current research in real-time imaging. For convenience, we have categorized references by research area and application.

  7. Use of circulation types classifications to evaluate AR4 climate models over the Euro-Atlantic region

    NASA Astrophysics Data System (ADS)

    Pastor, M. A.; Casado, M. J.

    2012-10-01

    This paper presents an evaluation of the multi-model simulations for the 4th Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) in terms of their ability to simulate the ERA40 circulation types over the Euro-Atlantic region in the winter season. Two classification schemes, k-means and SANDRA, have been considered to test the sensitivity of the evaluation results to the classification procedure. The assessment allows different rankings to be established according to the spatial and temporal features of the circulation types. Regarding temporal characteristics, in general, all AR4 models tend to underestimate the frequency of occurrence. The best model at simulating spatial characteristics is UKMO-HadGEM1, whereas CCSM3, UKMO-HadGEM1 and CGCM3.1(T63) are the best at simulating the temporal features, for both classification schemes. This result agrees with the AR4 model ranking obtained when the ability of the same AR4 models to simulate Euro-Atlantic variability modes was analysed. This study has proved the utility of applying such a synoptic climatology approach as a diagnostic tool for model assessment. The ability of the models to properly reproduce the position of ridges and troughs and the frequency of synoptic patterns will therefore improve our confidence in the response of models to future climate changes.

  8. An artificial intelligence based improved classification of two-phase flow patterns with feature extracted from acquired images.

    PubMed

    Shanthi, C; Pappa, N

    2017-05-01

    Flow pattern recognition is necessary to select design equations for finding operating details of the process and to perform computational simulations. Visual image processing can be used to automate the interpretation of patterns in two-phase flow. In this paper, an attempt has been made to improve the classification accuracy for the flow patterns of gas/liquid two-phase flow using fuzzy logic and a Support Vector Machine (SVM) with Principal Component Analysis (PCA). Videos of six different flow patterns, namely annular flow, bubble flow, churn flow, plug flow, slug flow and stratified flow, were recorded over a period and converted to 2D images for processing. The textural and shape features extracted using image processing are applied as inputs to various classification schemes, namely fuzzy logic, SVM and SVM with PCA, in order to identify the type of flow pattern. The results obtained are compared, and it is observed that SVM with features reduced using PCA gives better classification accuracy and is computationally less intensive than the two other existing schemes. The results of this study cover industrial application needs, including oil and gas and other gas-liquid two-phase flows. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
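
    A minimal sketch of the PCA + SVM variant described above, applied to texture/shape feature vectors extracted from flow-regime images. The feature data and the number of retained components are placeholders.

      # Sketch: scaling, PCA dimensionality reduction, and an RBF SVM for flow regimes.
      import numpy as np
      from sklearn.pipeline import Pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      X = np.random.rand(600, 30)                  # 30 texture/shape features per frame
      y = np.random.randint(0, 6, 600)             # 6 flow regimes (annular, bubble, ...)

      pipe = Pipeline([("scale", StandardScaler()),
                       ("pca", PCA(n_components=10)),
                       ("svm", SVC(kernel="rbf", C=10, gamma="scale"))])
      print("cross-validated accuracy:", cross_val_score(pipe, X, y, cv=5).mean())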

  9. The Adam Walsh Act: An Examination of Sex Offender Risk Classification Systems.

    PubMed

    Zgoba, Kristen M; Miner, Michael; Levenson, Jill; Knight, Raymond; Letourneau, Elizabeth; Thornton, David

    2016-12-01

    This study was designed to compare the Adam Walsh Act (AWA) classification tiers with actuarial risk assessment instruments and existing state classification schemes in their respective abilities to identify sex offenders at high risk to re-offend. Data from 1,789 adult sex offenders released from prison in four states were collected (Minnesota, New Jersey, Florida, and South Carolina). On average, the sexual recidivism rate was approximately 5% at 5 years and 10% at 10 years. AWA Tier 2 offenders had higher Static-99R scores and higher recidivism rates than Tier 3 offenders, and in Florida, these inverse correlations were statistically significant. Actuarial measures and existing state tier systems, in contrast, did a better job of identifying high-risk offenders and recidivists. As well, we examined the distribution of risk assessment scores within and across tier categories, finding that a majority of sex offenders fall into AWA Tier 3, but more than half score low or moderately low on the Static-99R. The results indicate that the AWA sex offender classification scheme is a poor indicator of relative risk and is likely to result in a system that is less effective in protecting the public than those currently implemented in the states studied. © The Author(s) 2015.

  10. The Libraries of Rio.

    ERIC Educational Resources Information Center

    Foster, Barbara

    1988-01-01

    Describes aspects of several libraries in Rio de Janeiro. Topics covered include library policies, budgets, periodicals and books in the collections, classification schemes used, and literary areas of interest to patrons. (6 references) (CLB)

  11. Hydrometeorological application of an extratropical cyclone classification scheme in the southern United States

    NASA Astrophysics Data System (ADS)

    Senkbeil, J. C.; Brommer, D. M.; Comstock, I. J.; Loyd, T.

    2012-07-01

    Extratropical cyclones (ETCs) in the southern United States are often overlooked when compared with tropical cyclones in the region and ETCs in the northern United States. Although southern ETCs are significant weather events, there is currently no operational scheme for identifying and discussing these nameless storms. In this research, we classified 84 ETCs (1970-2009). We manually identified five distinct formation regions and seven unique ETC types using statistical classification. The statistical classification employed principal components analysis and two methods of cluster analysis. Both manual and statistical storm types generally showed positive (negative) relationships with El Niño (La Niña). Manual storm types displayed precipitation swaths consistent with discrete storm tracks, which further legitimizes the existence of multiple modes of southern ETCs. Statistical storm types also displayed unique precipitation intensity swaths, but these swaths were less indicative of track location. It is hoped that by classifying southern ETCs into types, forecasters, hydrologists, and broadcast meteorologists will be better able to anticipate projected amounts of precipitation at their locations.

  12. Relevance popularity: A term event model based feature selection scheme for text classification.

    PubMed

    Feng, Guozhong; An, Baiguo; Yang, Fengqin; Wang, Han; Zhang, Libiao

    2017-01-01

    Feature selection is a practical approach for improving the performance of text classification methods by optimizing the feature subsets input to classifiers. In traditional feature selection methods such as information gain and chi-square, the number of documents that contain a particular term (i.e. the document frequency) is often used. However, the frequency with which a given term appears in each document has not been fully investigated, even though it is a promising feature for producing accurate classifications. In this paper, we propose a new feature selection scheme based on a term event multinomial naive Bayes probabilistic model. According to the model assumptions, the matching score function, which is based on the prediction probability ratio, can be factorized. Finally, we derive a feature selection measurement for each term after replacing inner parameters by their estimators. On a benchmark English text dataset (20 Newsgroups) and a Chinese text dataset (MPH-20), numerical experiments with two widely used text classifiers (naive Bayes and support vector machine) demonstrate that our method outperforms representative feature selection methods.
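
    An illustrative sketch only: the paper derives its own per-term measurement from a prediction-probability ratio, which is not reproduced here. The toy code below simply shows the kind of class-conditional term statistics a fitted multinomial naive Bayes model exposes and how a per-term selection score can be built from them; the corpus and the max-minus-min spread score are placeholders.

    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB

    docs = ["the cat sat on the mat", "dogs bark at the cat",
            "stocks fell sharply today", "markets rallied after the report"]
    labels = [0, 0, 1, 1]

    vec = CountVectorizer()
    counts = vec.fit_transform(docs)
    nb = MultinomialNB().fit(counts, labels)

    # log P(term | class) for every class and term
    log_p = nb.feature_log_prob_
    # score each term by the spread of its class-conditional log-probabilities
    score = log_p.max(axis=0) - log_p.min(axis=0)
    terms = vec.get_feature_names_out()
    print(sorted(zip(score, terms), reverse=True)[:5])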

  13. Fuzzy Classification of Ocean Color Satellite Data for Bio-optical Algorithm Constituent Retrievals

    NASA Technical Reports Server (NTRS)

    Campbell, Janet W.

    1998-01-01

    The ocean has traditionally been viewed as a two-class system. Morel and Prieur (1977) classified ocean water according to the dominant absorbent particle suspended in the water column. Case 1 is described as having a high concentration of phytoplankton (and detritus) relative to other particles. Conversely, case 2 is described as having inorganic particles, such as suspended sediments, in high concentrations. Little work has gone into the problem of mixing bio-optical models for these different water types. An approach is put forth here to blend bio-optical algorithms based on a fuzzy classification scheme. This scheme involves two procedures. First, a clustering procedure identifies classes and builds class statistics from in-situ optical measurements. Next, a classification procedure assigns satellite pixels partial memberships to these classes based on their ocean color reflectance signature. These membership assignments can then be used to weight retrievals from class-specific bio-optical algorithms. This technique is demonstrated with in-situ optical measurements and an image from the SeaWiFS ocean color satellite.
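
    A minimal sketch of the blending idea, not the paper's implementation: class means and covariances are assumed to come from clustering the in-situ optical measurements; each pixel then receives partial memberships proportional to the class-conditional Gaussian density of its reflectance vector, and those memberships weight the retrievals of class-specific bio-optical algorithms. All numbers and the two algorithms below are hypothetical placeholders.

    import numpy as np
    from scipy.stats import multivariate_normal

    class_means = [np.array([0.02, 0.015, 0.01]), np.array([0.05, 0.04, 0.03])]  # hypothetical
    class_covs  = [np.eye(3) * 1e-5, np.eye(3) * 1e-4]                           # hypothetical

    def memberships(reflectance):
        dens = np.array([multivariate_normal.pdf(reflectance, m, c)
                         for m, c in zip(class_means, class_covs)])
        return dens / dens.sum()            # partial memberships summing to 1

    def blended_retrieval(reflectance, algorithms):
        """Weight class-specific retrievals (e.g. chlorophyll) by fuzzy memberships."""
        w = memberships(reflectance)
        return float(np.dot(w, [alg(reflectance) for alg in algorithms]))

    # two placeholder class-specific retrieval algorithms
    algs = [lambda r: 10.0 * r[1] / r[0], lambda r: 2.0 * r[2] / r[0]]
    print(blended_retrieval(np.array([0.03, 0.02, 0.015]), algs))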

  14. ULTRA-SHARP nonoscillatory convection schemes for high-speed steady multidimensional flow

    NASA Technical Reports Server (NTRS)

    Leonard, B. P.; Mokhtari, Simin

    1990-01-01

    For convection-dominated flows, classical second-order methods are notoriously oscillatory and often unstable. For this reason, many computational fluid dynamicists have adopted various forms of (inherently stable) first-order upwinding over the past few decades. Although it is now well known that first-order convection schemes suffer from serious inaccuracies attributable to artificial viscosity or numerical diffusion under high convection conditions, these methods continue to enjoy widespread popularity for numerical heat transfer calculations, apparently due to a perceived lack of viable high accuracy alternatives. But alternatives are available. For example, nonoscillatory methods used in gasdynamics, including currently popular TVD schemes, can be easily adapted to multidimensional incompressible flow and convective transport. This, in itself, would be a major advance for numerical convective heat transfer, for example. But, as is shown, second-order TVD schemes form only a small, overly restrictive, subclass of a much more universal, and extremely simple, nonoscillatory flux-limiting strategy which can be applied to convection schemes of arbitrarily high order accuracy, while requiring only a simple tridiagonal ADI line-solver, as used in the majority of general purpose iterative codes for incompressible flow and numerical heat transfer. The new universal limiter and associated solution procedures form the so-called ULTRA-SHARP alternative for high resolution nonoscillatory multidimensional steady state high speed convective modelling.

  15. A back-fitting algorithm to improve real-time flood forecasting

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaojing; Liu, Pan; Cheng, Lei; Liu, Zhangjun; Zhao, Yan

    2018-07-01

    Real-time flood forecasting is important for decision-making with regards to flood control and disaster reduction. The conventional approach involves a postprocessor calibration strategy that first calibrates the hydrological model and then estimates errors. This procedure can simulate streamflow consistent with observations, but obtained parameters are not optimal. Joint calibration strategies address this issue by refining hydrological model parameters jointly with the autoregressive (AR) model. In this study, five alternative schemes are used to forecast floods. Scheme I uses only the hydrological model, while scheme II includes an AR model for error correction. In scheme III, differencing is used to remove non-stationarity in the error series. A joint inference strategy employed in scheme IV calibrates the hydrological and AR models simultaneously. The back-fitting algorithm, a basic approach for training an additive model, is adopted in scheme V to alternately recalibrate hydrological and AR model parameters. The performance of the five schemes is compared with a case study of 15 recorded flood events from China's Baiyunshan reservoir basin. Our results show that (1) schemes IV and V outperform scheme III during the calibration and validation periods and (2) scheme V is inferior to scheme IV in the calibration period, but provides better results in the validation period. Joint calibration strategies can therefore improve the accuracy of flood forecasting. Additionally, the back-fitting recalibration strategy produces weaker overcorrection and a more robust performance compared with the joint inference strategy.
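
    A minimal sketch of the scheme V idea (back-fitting) under toy assumptions: `simulate` is a hypothetical one-parameter stand-in for the hydrological model and the error model is first-order autoregressive. The loop alternately recalibrates the hydrological parameter on the AR-corrected objective and refits the AR coefficient to the residuals.

    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(1)
    rain = rng.gamma(2.0, 5.0, size=200)

    def simulate(k, forcing):
        """Toy linear-reservoir response (hypothetical stand-in for the real model)."""
        q = np.zeros_like(forcing)
        for t in range(1, len(forcing)):
            q[t] = (1 - k) * q[t - 1] + k * forcing[t]
        return q

    obs = simulate(0.30, rain) + rng.normal(0, 0.5, size=200)

    k, rho = 0.5, 0.0
    for _ in range(10):                                  # back-fitting iterations
        def objective(kk):
            res = obs - simulate(kk, rain)
            corrected = res[1:] - rho * res[:-1]         # remove the AR(1)-predictable part
            return np.sum(corrected ** 2)
        k = minimize_scalar(objective, bounds=(0.01, 0.99), method="bounded").x
        res = obs - simulate(k, rain)
        rho = np.dot(res[1:], res[:-1]) / np.dot(res[:-1], res[:-1])  # refit AR(1)
    print(f"k = {k:.3f}, AR coefficient = {rho:.3f}")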

  16. Evaluation of management measures of software development. Volume 1: Analysis summary

    NASA Technical Reports Server (NTRS)

    Page, J.; Card, D.; Mcgarry, F.

    1982-01-01

    The conceptual model, the data classification scheme, and the analytic procedures are explained. The analytic results are summarized and specific software measures for collection and monitoring are recommended.

  17. SVM feature selection based rotation forest ensemble classifiers to improve computer-aided diagnosis of Parkinson disease.

    PubMed

    Ozcift, Akin

    2012-08-01

    Parkinson disease (PD) is an age-related deterioration of certain nerve systems that affects movement, balance, and muscle control. PD is a common disease, affecting about 1% of people older than 60 years. A new classification scheme based on support vector machine (SVM) selected features to train rotation forest (RF) ensemble classifiers is presented for improving diagnosis of PD. The dataset contains records of voice measurements from 31 people, 23 of whom have PD, and each record is defined by 22 features. The diagnosis model first makes use of a linear SVM to select the ten most relevant of the 22 features. As a second step of the classification model, six different classifiers are trained with the subset of features. At the third step, the accuracies of the classifiers are improved by the RF ensemble classification strategy. The results of the experiments are evaluated using three metrics: classification accuracy (ACC), Kappa Error (KE) and Area under the Receiver Operating Characteristic (ROC) Curve (AUC). Performance measures of two base classifiers, i.e. KStar and IBk, demonstrated an apparent increase in PD diagnosis accuracy compared to similar studies in the literature. Overall, application of the RF ensemble classification scheme significantly improved PD diagnosis for 5 of the 6 classifiers. We obtained about 97% accuracy with the RF ensemble of the IBk (a K-Nearest Neighbor variant) algorithm, which is a quite high performance for Parkinson disease diagnosis.
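
    A sketch only, not the paper's implementation: features are ranked with a linear SVM and the top ten are kept, then a small rotation-forest-style ensemble is built in which each member sees the selected features rotated by a PCA fitted on a bootstrap sample. The synthetic data and the simplified per-member rotation (no feature partitioning) are assumptions for illustration.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.svm import LinearSVC
    from sklearn.decomposition import PCA
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=195, n_features=22, n_informative=10, random_state=0)
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

    svm = LinearSVC(C=1.0, dual=False).fit(Xtr, ytr)
    top10 = np.argsort(np.abs(svm.coef_[0]))[::-1][:10]     # SVM-based feature selection
    Xtr10, Xte10 = Xtr[:, top10], Xte[:, top10]

    rng = np.random.default_rng(0)
    ensemble = []
    for _ in range(10):
        idx = rng.choice(len(Xtr10), len(Xtr10), replace=True)
        rot = PCA().fit(Xtr10[idx])                          # rotation fitted per member
        tree = DecisionTreeClassifier().fit(rot.transform(Xtr10[idx]), ytr[idx])
        ensemble.append((rot, tree))

    votes = np.mean([t.predict(r.transform(Xte10)) for r, t in ensemble], axis=0)
    print("ensemble accuracy:", np.mean((votes > 0.5) == yte))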

  18. Synthesis and size classification of metal oxide nanoparticles for biomedical applications

    NASA Astrophysics Data System (ADS)

    Atsumi, Takashi; Jeyadevan, Balachandran; Sato, Yoshinori; Tamura, Kazuchika; Aiba, Setsuya; Tohji, Kazuyuki

    2004-12-01

    Magnetic nanoparticles are considered for biomedical applications such as the medium in magnetic resonance imaging, hyperthermia, drug delivery, and the purification or classification of DNA or viruses. The performance of magnetic nanoparticles in biomedical applications such as hyperthermia depends strongly on their magnetic properties, size and size distribution. We briefly describe the basic idea behind their use in drug delivery, magnetic separation and hyperthermia, and discuss the prerequisite properties of magnetic particles for biomedical applications. Finally, we report the synthesis and classification scheme used to prepare magnetite (Fe3O4) nanoparticles with a narrow size distribution for magnetic fluid hyperthermia.

  19. Development and application of operational techniques for the inventory and monitoring of resources and uses for the Texas coastal zone

    NASA Technical Reports Server (NTRS)

    Harwood, P. (Principal Investigator); Finley, R.; Mcculloch, S.; Marphy, D.; Hupp, B.

    1976-01-01

    The author has identified the following significant results. Image interpretation mapping techniques were successfully applied to test site 5, an area with a semi-arid climate. The land cover/land use classification required further modification. A new program, HGROUP, added to the ADP classification schedule provides a convenient method for examining the spectral similarity between classes. This capability greatly simplifies the task of combining 25-30 unsupervised subclasses into about 15 major classes that approximately correspond to the land use/land cover classification scheme.

  20. Significance of clustering and classification applications in digital and physical libraries

    NASA Astrophysics Data System (ADS)

    Triantafyllou, Ioannis; Koulouris, Alexandros; Zervos, Spiros; Dendrinos, Markos; Giannakopoulos, Georgios

    2015-02-01

    Applications of clustering and classification techniques can prove very significant in both digital and physical (paper-based) libraries. The most essential application, document classification and clustering, is crucial for the content that is produced and maintained in digital libraries, repositories, databases, social media, blogs etc., based on various tags and ontology elements, transcending the traditional library-oriented classification schemes. Other applications with a useful and beneficial role in the new digital library environment involve document routing, summarization and query expansion. Paper-based libraries can benefit as well, since classification combined with advanced material characterization techniques such as FTIR (Fourier Transform InfraRed spectroscopy) can be vital for the study and prevention of material deterioration. An improved two-level self-organizing clustering architecture is proposed in order to enhance the discrimination capacity of the learning space prior to classification, yielding promising results when applied to the above mentioned library tasks.

  1. 46 CFR 126.235 - Alternate compliance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... purposes of this section, a list of authorized classification societies, including information for ordering copies of approved classification society rules and supplements, is available from Commandant (CG-5212.... Approved classification society rules and supplements are incorporated by reference into 46 CFR 8.110(b...

  2. 46 CFR 126.235 - Alternate compliance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... purposes of this section, a list of authorized classification societies, including information for ordering copies of approved classification society rules and supplements, is available from Commandant (CG-5212.... Approved classification society rules and supplements are incorporated by reference into 46 CFR 8.110(b...

  3. 46 CFR 91.15-5 - Alternate compliance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... this section, a list of authorized classification societies, including information for ordering copies of approved classification society rules and supplements, is available from Commandant (CG-521), 2100.... Approved classification society rules and supplements are incorporated by reference into 46 CFR 8.110(b...

  4. 46 CFR 91.15-5 - Alternate compliance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... this section, a list of authorized classification societies, including information for ordering copies of approved classification society rules and supplements, is available from Commandant (CG-521), 2100.... Approved classification society rules and supplements are incorporated by reference into 46 CFR 8.110(b...

  5. The AJCC 8th Edition Staging System for Soft Tissue Sarcoma of the Extremities or Trunk: A Cohort Study of the SEER Database.

    PubMed

    Cates, Justin M M

    2018-02-01

    Background: The AJCC recently published the 8th edition of its cancer staging system. Significant changes were made to the staging algorithm for soft tissue sarcoma (STS) of the extremities or trunk, including the addition of 2 additional T (size) classifications in lieu of tumor depth and grouping lymph node metastasis (LNM) with distant metastasis as stage IV disease. Whether these changes improve staging system performance is questionable. Patients and Methods: This retrospective cohort analysis of 21,396 adult patients with STS of the extremity or trunk in the SEER database compares the AJCC 8th edition staging system with the 7th edition and a newly proposed staging algorithm using a variety of statistical techniques. The effect of tumor size on disease-specific survival was assessed by flexible, nonlinear Cox proportional hazard regression using restricted cubic splines and fractional polynomials. Results: The slope of covariate-adjusted log hazards for sarcoma-specific survival decreases for tumors >8 cm in greatest dimension, limiting prognostic information contributed by the new T4 classification in the AJCC 8th edition. Anatomic depth independently provides significant prognostic information. LNM is not equivalent to distant, non-nodal metastasis. Based on these findings, an alternative staging system is proposed and demonstrated to outperform both AJCC staging schemes. The analyses presented also disclose no evidence of improved clinical performance of the 8th edition compared with the previous edition. Conclusions: The AJCC 8th edition staging system for STS is no better than the previous 7th edition. Instead, a proposed staging system based on histologic grade, tumor size, and anatomic depth shows significantly higher predictive accuracy, with higher model concordance than either AJCC staging system. Changes to existing staging systems should improve the performance of prognostic models. Until such improvements are documented, AJCC committees should refrain from modifying established staging schemes. Copyright © 2018 by the National Comprehensive Cancer Network.

  6. Automated connectionist-geostatistical classification as an approach to identify sea ice and land ice types, properties and provinces

    NASA Astrophysics Data System (ADS)

    Goetz-Weiss, L. R.; Herzfeld, U. C.; Trantow, T.; Hunke, E. C.; Maslanik, J. A.; Crocker, R. I.

    2016-12-01

    An important problem in model-data comparison is the identification of parameters that can be extracted from observational data as well as used in numerical models, which are typically based on idealized physical processes. Here, we present a suite of approaches to the characterization and classification of sea ice and land ice types, properties and provinces based on several types of remote-sensing data. Applications are given not only to illustrate the approach but also to employ it in model evaluation and in understanding physical processes. (1) In a geostatistical characterization, spatial sea-ice properties in the Chukchi and Beaufort Seas and in Elsoon Lagoon are derived from analysis of RADARSAT and ERS-2 SAR data. (2) The analysis is taken further by utilizing multi-parameter feature vectors as inputs for unsupervised and supervised statistical classification, which facilitates classification of different sea-ice types. (3) Characteristic sea-ice parameters resulting from the classification can then be applied in model evaluation, as demonstrated for the ridging scheme of the Los Alamos sea ice model, CICE, using high-resolution altimeter and image data collected from unmanned aircraft over Fram Strait during the Characterization of Arctic Sea Ice Experiment (CASIE). The characteristic parameters chosen in this application are directly related to deformation processes, which also underlie the ridging scheme. (4) The method capable of the most complex classification tasks is the connectionist-geostatistical classification method. This approach has been developed to identify currently up to 18 different crevasse types in order to map the progression of the surge through the complex Bering-Bagley Glacier System, Alaska, in 2011-2014. The analysis utilizes airborne altimeter data, video image data and satellite image data. Results of the crevasse classification are compared to fracture modeling and found to match.

  7. Optimization of breast mass classification using sequential forward floating selection (SFFS) and a support vector machine (SVM) model

    PubMed Central

    Tan, Maxine; Pu, Jiantao; Zheng, Bin

    2014-01-01

    Purpose: Improving radiologists’ performance in classification between malignant and benign breast lesions is important to increase cancer detection sensitivity and reduce false-positive recalls. For this purpose, developing computer-aided diagnosis (CAD) schemes has been attracting research interest in recent years. In this study, we investigated a new feature selection method for the task of breast mass classification. Methods: We initially computed 181 image features based on mass shape, spiculation, contrast, presence of fat or calcifications, texture, isodensity, and other morphological features. From this large image feature pool, we used a sequential forward floating selection (SFFS)-based feature selection method to select relevant features, and analyzed their performance using a support vector machine (SVM) model trained for the classification task. On a database of 600 benign and 600 malignant mass regions of interest (ROIs), we performed the study using a ten-fold cross-validation method. Feature selection and optimization of the SVM parameters were conducted on the training subsets only. Results: The area under the receiver operating characteristic curve (AUC) = 0.805±0.012 was obtained for the classification task. The results also showed that the most frequently-selected features by the SFFS-based algorithm in 10-fold iterations were those related to mass shape, isodensity and presence of fat, which are consistent with the image features frequently used by radiologists in the clinical environment for mass classification. The study also indicated that accurately computing mass spiculation features from the projection mammograms was difficult, and failed to perform well for the mass classification task due to tissue overlap within the benign mass regions. Conclusions: In conclusion, this comprehensive feature analysis study provided new and valuable information for optimizing computerized mass classification schemes that may have potential to be useful as a “second reader” in future clinical practice. PMID:24664267
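
    A minimal sketch of SFFS-driven feature selection wrapped around an SVM, assuming the third-party mlxtend package is available; the synthetic data stand in for the paper's 181-feature mammography ROI set, and none of the authors' code is reproduced here.

    from sklearn.datasets import make_classification
    from sklearn.svm import SVC
    from mlxtend.feature_selection import SequentialFeatureSelector as SFFS

    X, y = make_classification(n_samples=300, n_features=40, n_informative=8, random_state=0)
    sffs = SFFS(SVC(kernel="rbf"),
                k_features=10,
                forward=True,
                floating=True,        # "floating" steps allow conditional removals
                scoring="roc_auc",
                cv=5)
    sffs = sffs.fit(X, y)
    print("selected feature indices:", sffs.k_feature_idx_)
    print("cross-validated AUC:", round(sffs.k_score_, 3))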

  8. Unstructured grids for sonic-boom analysis

    NASA Technical Reports Server (NTRS)

    Fouladi, Kamran

    1993-01-01

    A fast and efficient unstructured grid scheme is evaluated for sonic-boom applications. The scheme is used to predict the near-field pressure signatures of a body of revolution at several body lengths below the configuration, and those results are compared with experimental data. The introduction of the 'sonic-boom grid topology' to this scheme makes it well suited for sonic-boom applications, thus providing an alternative to conventional multiblock structured grid schemes.

  9. An efficient scheme for automatic web pages categorization using the support vector machine

    NASA Astrophysics Data System (ADS)

    Bhalla, Vinod Kumar; Kumar, Neeraj

    2016-07-01

    In the past few years, with the evolution of the Internet and related technologies, the number of Internet users has grown exponentially. These users demand access to relevant web pages within a fraction of a second. To achieve this goal, an efficient categorization of web page contents is required. Manual categorization of these billions of web pages with high accuracy is a challenging task, and most of the existing techniques reported in the literature are semi-automatic, so a higher level of accuracy cannot be achieved with them. To achieve these goals, this paper proposes an automatic categorization of web pages into domain categories. The proposed scheme is based on the identification of specific and relevant features of the web pages. In the proposed scheme, extraction and evaluation of features are done first, followed by filtering of the feature set for categorization of domain web pages. A feature extraction tool based on the HTML document object model of the web page is developed in the proposed scheme. Feature extraction and weight assignment are based on a collection of domain-specific keyword lists developed by considering various domain pages. Moreover, the keyword list is reduced on the basis of keyword ids, and stemming of keywords and tag text is performed to achieve a higher accuracy. An extensive feature set is generated to develop a robust classification technique. The proposed scheme was evaluated using a machine learning method in combination with feature extraction and statistical analysis, with a support vector machine kernel as the classification tool. The results obtained confirm the effectiveness of the proposed scheme in terms of its accuracy on different categories of web pages.

  10. Hydrological Climate Classification: Can We Improve on Köppen-Geiger?

    NASA Astrophysics Data System (ADS)

    Knoben, W.; Woods, R. A.; Freer, J. E.

    2017-12-01

    Classification is essential in the study of complex natural systems, yet hydrology so far has no formal way to structure the climate forcing which underlies hydrologic response. Various climate classification systems can be borrowed from other disciplines, but these are based on different organizing principles than a hydrological classification might use. From gridded global data we calculate a gridded aridity index, an aridity seasonality index and a rain-vs-snow index, which we use to cluster global locations into climate groups. We then define the membership degree of nearly 1100 catchments to each of our climate groups based on each catchment's climate and investigate the extent to which streamflow responses within each climate group are similar. We compare this climate classification approach with the often-used Köppen-Geiger classification, using statistical tests based on streamflow signature values. We find that three climate indices are sufficient to distinguish 18 different climate types world-wide. Climates tend to change gradually in space and catchments can thus belong to multiple climate groups, albeit with different degrees of membership. Streamflow responses within a climate group tend to be similar, regardless of the catchments' geographical proximity. A Wilcoxon two-sample test based on streamflow signature values for each climate group shows that the new classification can distinguish different flow regimes. The Köppen-Geiger approach uses 29 climate classes but is less able to differentiate streamflow regimes. Climate forcing exerts a strong control on typical hydrologic response, and both change gradually in space; this makes arbitrary hard boundaries in any classification scheme difficult to defend. Any hydrological classification should thus acknowledge these gradual changes in forcing. Catchment characteristics (soil or vegetation type, land use, etc.) can vary more quickly in space than climate does, which can explain streamflow differences between geographically close locations. In summary, this work shows that hydrology needs its own way to structure climate forcing, acknowledging that climates vary gradually on a global scale and explicitly including those climate aspects that drive seasonal changes in hydrologic regimes.
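
    A sketch of the general idea under stated assumptions (not the authors' method): three per-cell climate indices are assumed to be precomputed, k-means groups the cells into climate types, and a simple inverse-distance weight then gives any catchment a degree of membership in every group, mimicking the gradual-transition behaviour described above. The random indices and the membership formula are placeholders.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    indices = rng.uniform(0, 1, size=(5000, 3))      # placeholder [aridity, seasonality, snow fraction]

    km = KMeans(n_clusters=18, n_init=10, random_state=0).fit(indices)

    def membership(catchment_indices):
        d = np.linalg.norm(km.cluster_centers_ - catchment_indices, axis=1)
        w = 1.0 / (d + 1e-9)
        return w / w.sum()                           # soft membership over the 18 groups

    print(membership(np.array([0.2, 0.7, 0.1])).round(3))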

  11. Types of Crude Oil

    EPA Pesticide Factsheets

    The petroleum industry often classifies these types by geographical source, but the classification scheme here is more useful in a spill cleanup scenario. It indicates general toxicity, physical state, and changes caused by time and weathering.

  12. Transport on Riemannian manifold for functional connectivity-based classification.

    PubMed

    Ng, Bernard; Dressler, Martin; Varoquaux, Gaël; Poline, Jean Baptiste; Greicius, Michael; Thirion, Bertrand

    2014-01-01

    We present a Riemannian approach for classifying fMRI connectivity patterns before and after intervention in longitudinal studies. A fundamental difficulty with using connectivity as features is that covariance matrices live on the positive semi-definite cone, which renders their elements inter-related. The implicit independent feature assumption in most classifier learning algorithms is thus violated. In this paper, we propose a matrix whitening transport for projecting the covariance estimates onto a common tangent space to reduce the statistical dependencies between their elements. We show on real data that our approach provides significantly higher classification accuracy than directly using Pearson's correlation. We further propose a non-parametric scheme for identifying significantly discriminative connections from classifier weights. Using this scheme, a number of neuroanatomically meaningful connections are found, whereas no significant connections are detected with pure permutation testing.
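
    A minimal sketch of the tangent-space idea, a generic implementation rather than the authors' code: each covariance matrix is whitened by a reference covariance, mapped through the matrix logarithm, and vectorised before being passed to an ordinary linear classifier. The toy connectivity matrices and the grand-mean reference are assumptions for illustration.

    import numpy as np
    from scipy.linalg import sqrtm, logm, inv

    def tangent_features(covs, C_ref):
        W = inv(np.real(sqrtm(C_ref)))        # reference whitener C_ref^(-1/2)
        feats = []
        for C in covs:
            S = logm(W @ C @ W)               # project onto the tangent space at C_ref
            iu = np.triu_indices_from(S)
            feats.append(np.real(S[iu]))      # vectorise the upper triangle
        return np.array(feats)

    rng = np.random.default_rng(0)
    covs = [np.cov(rng.normal(size=(10, 200))) for _ in range(20)]   # toy connectivity matrices
    C_ref = np.mean(covs, axis=0)             # e.g. grand-mean covariance as reference point
    X = tangent_features(covs, C_ref)
    print(X.shape)                            # (20, 55) -> ready for an SVM / logistic model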

  13. Heart Rate Variability Dynamics for the Prognosis of Cardiovascular Risk

    PubMed Central

    Ramirez-Villegas, Juan F.; Lam-Espinosa, Eric; Ramirez-Moreno, David F.; Calvo-Echeverry, Paulo C.; Agredo-Rodriguez, Wilfredo

    2011-01-01

    Statistical, spectral, multi-resolution and non-linear methods were applied to heart rate variability (HRV) series linked with classification schemes for the prognosis of cardiovascular risk. A total of 90 HRV records were analyzed: 45 from healthy subjects and 45 from cardiovascular risk patients. A total of 52 features from all the analysis methods were evaluated using standard two-sample Kolmogorov-Smirnov test (KS-test). The results of the statistical procedure provided input to multi-layer perceptron (MLP) neural networks, radial basis function (RBF) neural networks and support vector machines (SVM) for data classification. These schemes showed high performances with both training and test sets and many combinations of features (with a maximum accuracy of 96.67%). Additionally, there was a strong consideration for breathing frequency as a relevant feature in the HRV analysis. PMID:21386966

  14. [Who benefits from systemic therapy with a reflecting team?].

    PubMed

    Höger, C; Temme, M; Geiken, G

    1994-03-01

    In an evaluation study we investigated the effectiveness of the reflecting team approach compared to eclectic child psychiatric treatment in an outpatient setting and the indications for each type of treatment. The relationship between treatment outcome and diagnostic data obtained with the Multi-axial Classification Scheme was examined in 22 families treated with the reflecting team approach and in a second group of families matched on all important sociodemographic and diagnostic variables but receiving eclectic treatment. No difference was found between the two groups regarding symptom improvement or changes in family functioning. Regarding satisfaction with treatment, the reflecting team approach was superior to the eclectic modality. In the reflecting team group parental mental disorder and inadequate intra-familial communication (according to the new fifth axis of the Multi-axial Classification Scheme) had a negative effect on outcome.

  15. Adaptive skin detection based on online training

    NASA Astrophysics Data System (ADS)

    Zhang, Ming; Tang, Liang; Zhou, Jie; Rong, Gang

    2007-11-01

    Skin is a widely used cue for porn image classification. Most conventional methods are off-line training schemes. They usually use a fixed boundary to segment skin regions in images and are effective only under restricted conditions, e.g., good lighting and a single skin tone. This paper presents an adaptive online training scheme for skin detection which can handle these tough cases. In our approach, skin detection is considered as a classification problem on a Gaussian mixture model. For each image, a human face is detected and the face color is used to establish a primary estimate of the skin color distribution. Then an adaptive online training algorithm is used to find the real boundary between skin color and background color in the current image. Experimental results on 450 images showed that the proposed method is more robust in general situations than conventional ones.
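
    A sketch of the per-image adaptation idea under simplifying assumptions: pixels inside a detected face box seed a Gaussian mixture of skin colour for this particular image, and the whole image is then scored against it. Face detection itself and the paper's online refinement loop are not reproduced; the image, box and threshold below are placeholders.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    image = rng.integers(0, 256, size=(120, 160, 3)).astype(float)   # placeholder RGB image
    face_box = (30, 60, 40, 80)              # (row0, row1, col0, col1), hypothetical detector output

    r0, r1, c0, c1 = face_box
    face_pixels = image[r0:r1, c0:c1].reshape(-1, 3)
    skin_gmm = GaussianMixture(n_components=3, covariance_type="full",
                               random_state=0).fit(face_pixels)

    scores = skin_gmm.score_samples(image.reshape(-1, 3)).reshape(image.shape[:2])
    skin_mask = scores > np.percentile(scores, 70)   # per-image threshold instead of a fixed boundary
    print(skin_mask.mean())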

  16. Fault diagnosis for analog circuits utilizing time-frequency features and improved VVRKFA

    NASA Astrophysics Data System (ADS)

    He, Wei; He, Yigang; Luo, Qiwu; Zhang, Chaolong

    2018-04-01

    This paper proposes a novel scheme for analog circuit fault diagnosis utilizing features extracted from the time-frequency representations of signals and an improved vector-valued regularized kernel function approximation (VVRKFA). First, the cross-wavelet transform is employed to yield the energy-phase distribution of the fault signals over the time and frequency domain. Since the distribution is high-dimensional, a supervised dimensionality reduction technique—the bilateral 2D linear discriminant analysis—is applied to build a concise feature set from the distributions. Finally, VVRKFA is utilized to locate the fault. In order to improve the classification performance, the quantum-behaved particle swarm optimization technique is employed to gradually tune the learning parameter of the VVRKFA classifier. The experimental results for the analog circuit faults classification have demonstrated that the proposed diagnosis scheme has an advantage over other approaches.

  17. A Support Vector Machine-Based Gender Identification Using Speech Signal

    NASA Astrophysics Data System (ADS)

    Lee, Kye-Hwan; Kang, Sang-Ick; Kim, Deok-Hwan; Chang, Joon-Hyuk

    We propose an effective voice-based gender identification method using a support vector machine (SVM). The SVM is a binary classification algorithm that separates two groups by finding an optimal nonlinear boundary in a feature space and is known to yield high classification performance. In the present work, we compare the identification performance of the SVM with that of a Gaussian mixture model (GMM)-based method using the mel frequency cepstral coefficients (MFCC). A novel approach incorporating a feature fusion scheme based on a combination of the MFCC and the fundamental frequency is proposed with the aim of improving the performance of gender identification. Experimental results demonstrate that the gender identification performance using the SVM is significantly better than that of the GMM-based scheme. Moreover, the performance is substantially improved when the proposed feature fusion technique is applied.
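
    A sketch of the feature-fusion idea only, assuming per-utterance MFCC matrices and fundamental-frequency (F0) tracks have already been produced by a front end; the fused vector simply concatenates MFCC statistics with F0 statistics before the SVM. The synthetic features and F0 ranges below are placeholders, not the paper's data or exact fusion rule.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    def fuse(mfcc, f0):
        """mfcc: (frames, 13), f0: (frames,) -> one fused utterance-level vector."""
        return np.concatenate([mfcc.mean(axis=0), mfcc.std(axis=0),
                               [f0.mean(), f0.std()]])

    rng = np.random.default_rng(0)
    X, y = [], []
    for label, f0_centre in [(0, 120.0), (1, 210.0)]:   # placeholder male/female F0 ranges
        for _ in range(50):
            mfcc = rng.normal(size=(200, 13)) + label
            f0 = rng.normal(f0_centre, 15.0, size=200)
            X.append(fuse(mfcc, f0))
            y.append(label)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(np.array(X), y)
    print(clf.score(np.array(X), y))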

  18. 76 FR 35181 - Wireless Backhaul; Further Inquiry Into Fixed Service Sharing of the 6875-7125 and 12700-13200...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-16

    ... continuation of electronic newsgathering operations, and the appropriate channelization scheme, coordination... also sought comment on alternative channelization schemes. Several commenters, including FWCC and...

  19. The Future of Classification in Wheelchair Sports; Can Data Science and Technological Advancement Offer an Alternative Point of View?

    PubMed

    van der Slikke, Rienk M A; Bregman, Daan J J; Berger, Monique A M; de Witte, Annemarie M H; Veeger, Dirk-Jan H E J

    2017-11-01

    Classification is a defining factor for competition in wheelchair sports, but it is a delicate and time-consuming process with often questionable validity. New inertial sensor based measurement methods applied in match play and field tests allow for more precise and objective estimates of the impairment effect on wheelchair mobility performance. We evaluated whether these measures could offer an alternative point of view for classification. Six standard wheelchair mobility performance outcomes of different classification groups were measured in match play (n=29), as well as best possible performance in a field test (n=47). In match results, a clear relationship between classification and performance level is shown, with increased performance outcomes in each adjacent higher classification group. Three outcomes differed significantly between the low and mid-class groups, and one between the mid and high-class groups. In best performance (field test), a split between the low and mid-class groups is evident (5 out of 6 outcomes differed significantly), but there is hardly any difference between the mid and high-class groups. This observed split was confirmed by cluster analysis, revealing the existence of only two performance based clusters. The use of inertial sensor technology to obtain objective measures of wheelchair mobility performance, combined with a standardized field test, brought alternative views for evidence based classification. The results of this approach provided arguments for a reduced number of classes in wheelchair basketball. Future use of inertial sensors in match play and in field testing could enhance evaluation of classification guidelines as well as individual athlete performance.

  20. Women’s preferences for alternative financial incentive schemes for breastfeeding: A discrete choice experiment

    PubMed Central

    Anokye, Nana; de Bekker-Grob, Esther W.; Higgins, Ailish; Relton, Clare; Strong, Mark; Fox-Rushby, Julia

    2018-01-01

    Background Increasing breastfeeding rates have been associated with reductions in disease in babies and mothers as well as in related costs. ‘Nourishing Start for Health (NoSH)’, a financial incentive scheme, has been proposed as a potentially effective way to increase both the number of mothers breastfeeding and the duration of breastfeeding. Aims To establish women’s relative preferences for different aspects of a financial incentive scheme for breastfeeding and to identify the importance of scheme characteristics on the probability of participation in an incentive scheme. Methods A discrete choice experiment (DCE) obtained information on alternative specifications of the NoSH scheme designed to promote continued breastfeeding until at least 6 weeks after birth. Four attributes framed alternative scheme designs: value of the incentive; minimum breastfeeding duration required to receive the incentive; method of verifying breastfeeding; type of incentive. Three versions of the DCE questionnaire, each containing 8 different choice sets, provided 24 choice sets for analysis. The questionnaire was mailed to 2,531 women in the South Yorkshire Cohort (SYC) aged 16–45 years in IMD quintiles 3–5. The analytic approach considered conditional and mixed effects logistic models to account for preference heterogeneity that may be associated with variation in effects mediated by respondents’ characteristics. Results 564 women completed the questionnaire, a response rate of 22%. Most of the included attributes were found to affect utility and therefore the probability of participating in the incentive scheme. Higher rewards were preferred, although the type of incentive also significantly affected women’s preferences on average. We found evidence for preference heterogeneity based on individual characteristics that mediated preferences for an incentive scheme. Conclusions Although participants’ opinion in our sample was mixed, financial incentives for breastfeeding may be an acceptable and effective instrument to change behaviour. However, individual characteristics could mediate the effect and should therefore be considered when developing and targeting future interventions. PMID:29649245

  1. Synthesis of Potential Trypanocides

    DTIC Science & Technology

    1987-12-01

    [Abstract text garbled by extraction of the report documentation page; recoverable fragments refer to ring structures 2-4, the introduction of a -CH:CH- group between the phenyl ring and its 4'-substituent, and the incorporation of heteroaromatic rings (imidazole, thiazole, pyridine) into ether-linked and vinyl-linked structures (Scheme 1).]

  2. Some Complexity Results About Packet Radio Networks

    DTIC Science & Technology

    1983-03-01

    [Abstract text garbled by extraction of the report documentation page; recoverable fragments mention time-division multiple-access schemes and figures illustrating conflicting transmissions in a packet radio network (PRN).]

  3. High-order asynchrony-tolerant finite difference schemes for partial differential equations

    NASA Astrophysics Data System (ADS)

    Aditya, Konduri; Donzis, Diego A.

    2017-12-01

    Synchronizations of processing elements (PEs) in massively parallel simulations, which arise due to communication or load imbalances between PEs, significantly affect the scalability of scientific applications. We have recently proposed a method based on finite-difference schemes to solve partial differential equations in an asynchronous fashion - synchronization between PEs is relaxed at a mathematical level. While standard schemes can maintain their stability in the presence of asynchrony, their accuracy is drastically affected. In this work, we present a general methodology to derive asynchrony-tolerant (AT) finite difference schemes of arbitrary order of accuracy, which can maintain their accuracy when synchronizations are relaxed. We show that there are several choices available in selecting a stencil to derive these schemes and discuss their effect on numerical and computational performance. We provide a simple classification of schemes based on the stencil and derive schemes that are representative of different classes. Their numerical error is rigorously analyzed within a statistical framework to obtain the overall accuracy of the solution. Results from numerical experiments are used to validate the performance of the schemes.
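
    A toy illustration only, not the AT schemes themselves: a 1D explicit diffusion update in which the value at a subdomain interface may lag by a random number of time steps, mimicking relaxed synchronization between PEs. The standard central difference remains stable but loses accuracy, which is the error the paper's asynchrony-tolerant stencils are designed to remove (not shown here); all problem parameters are placeholders.

    import numpy as np

    nx, nt, alpha = 64, 400, 1.0
    dx = 1.0 / nx
    dt = 0.2 * dx ** 2
    x = np.linspace(0, 1, nx, endpoint=False)
    u = np.sin(2 * np.pi * x)
    history = [u.copy()]                     # past time levels available at the interface
    rng = np.random.default_rng(0)

    for _ in range(nt):
        u_eff = u.copy()
        delay = rng.integers(0, min(3, len(history)))
        u_eff[nx // 2] = history[-1 - delay][nx // 2]    # interface value may be stale
        lap = (np.roll(u_eff, -1) - 2 * u_eff + np.roll(u_eff, 1)) / dx ** 2
        u = u + dt * alpha * lap
        history.append(u.copy())

    exact = np.sin(2 * np.pi * x) * np.exp(-alpha * (2 * np.pi) ** 2 * nt * dt)
    print("max error with stale interface value:", np.abs(u - exact).max())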

  4. Mode of Action (MOA) Assignment Classifications for Ecotoxicology: An Evaluation of approaches

    EPA Science Inventory

    The mode of toxic action (MOA) is recognized as a key determinant of chemical toxicity and as an alternative to chemical class-based predictive toxicity modeling. However, MOA classification has never been standardized in ecotoxicology, and a comprehensive comparison of classific...

  5. 46 CFR 8.240 - Application for recognition.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ALTERNATIVES Recognition of a Classification Society § 8.240 Application for recognition. (a) A classification society must apply for recognition in writing to the Commandant (CG-521). (b) An application must indicate which specific authority the classification society seeks to have delegated. (c) Upon verification from...

  6. 46 CFR 8.240 - Application for recognition.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ALTERNATIVES Recognition of a Classification Society § 8.240 Application for recognition. (a) A classification society must apply for recognition in writing to the Commandant (CG-521). (b) An application must indicate which specific authority the classification society seeks to have delegated. (c) Upon verification from...

  7. 46 CFR 71.15-5 - Alternate compliance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... list of authorized classification societies, including information for ordering copies of approved classification society rules and supplements, is available from Commandant (CG-521), 2100 2nd St. SW., Stop 7126, Washington, DC 20593-7126; telephone (202) 372-1372; or fax (202) 372-1925. Approved classification society...

  8. 46 CFR 71.15-5 - Alternate compliance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... list of authorized classification societies, including information for ordering copies of approved classification society rules and supplements, is available from Commandant (CG-521), 2100 2nd St. SW., Stop 7126, Washington, DC 20593-7126; telephone (202) 372-1372; or fax (202) 372-1925. Approved classification society...

  9. [Evaluation of traditional pathological classification at molecular classification era for gastric cancer].

    PubMed

    Yu, Yingyan

    2014-01-01

    Histopathological classification is in a pivotal position in both basic research and the clinical diagnosis and treatment of gastric cancer. Currently, there are different classification systems in basic science and in clinical application. In the medical literature, different classifications are used, including the Lauren and WHO systems, which has confused many researchers. The Lauren classification was proposed half a century ago but is still used worldwide. It has the advantages of being simple and easy to apply, with prognostic significance. The WHO classification scheme is better than the Lauren classification in that it is continuously revised according to progress in gastric cancer research, and it is routinely used in the clinical and pathological diagnosis of common scenarios. With the progress of genomics, transcriptomics, proteomics and metabolomics research, molecular classification of gastric cancer has become a current hot topic. The traditional therapeutic approach based on the phenotypic characteristics of gastric cancer will most likely be replaced by a gene-variation mode. Gene-targeted therapy against the same molecular variation seems more reasonable than traditional chemical treatment based on the same morphological change.

  10. The Retrieval of Information in an Elementary School Library Media Center: An Alternative Method of Classification in the Common School Library, Amherst, Massachusetts.

    ERIC Educational Resources Information Center

    Cooper, Linda

    1997-01-01

    Discusses the problems encountered by elementary school children in retrieving information from a library catalog, either the traditional card catalog or an OPAC (online public access catalog). An alternative system of classification using colors and symbols is described that was developed in the Common School (Amherst, Massachusetts). (Author/LRW)

  11. Benchmarking wastewater treatment plants under an eco-efficiency perspective.

    PubMed

    Lorenzo-Toja, Yago; Vázquez-Rowe, Ian; Amores, María José; Termes-Rifé, Montserrat; Marín-Navarro, Desirée; Moreira, María Teresa; Feijoo, Gumersindo

    2016-10-01

    The new ISO 14045 framework is expected to slowly start shifting the definition of eco-efficiency toward a life-cycle perspective, using Life Cycle Assessment (LCA) as the environmental impact assessment method together with a system value assessment method for the economic analysis. In the present study, a set of 22 wastewater treatment plants (WWTPs) in Spain were analyzed on the basis of eco-efficiency criteria, using LCA and Life Cycle Costing (LCC) as a system value assessment method. The study is intended to be useful to decision-makers in the wastewater treatment sector, since the combined method provides an alternative scheme for analyzing the relationship between environmental impacts and costs. Two midpoint impact categories, global warming and eutrophication potential, as well as an endpoint single score indicator were used for the environmental assessment, while LCC was used for value assessment. Results demonstrated that substantial differences can be observed between different WWTPs depending on a wide range of factors such as plant configuration, plant size or even legal discharge limits. Based on these results the benchmarking of wastewater treatment facilities was performed by creating a specific classification and certification scheme. The proposed eco-label for the WWTPs rating is based on the integration of the three environmental indicators and an economic indicator calculated within the study under the eco-efficiency new framework. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Force analysis of magnetic bearings with power-saving controls

    NASA Technical Reports Server (NTRS)

    Johnson, Dexter; Brown, Gerald V.; Inman, Daniel J.

    1992-01-01

    Most magnetic bearing control schemes use a bias current with a superimposed control current to linearize the relationship between the control current and the force it delivers. For most operating conditions, the existence of the bias current requires more power than alternative methods that do not use conventional bias. Two such methods are examined which diminish or eliminate bias current. In the typical bias control scheme it is found that for a harmonic control force command into a voltage limited transconductance amplifier, the desired force output is obtained only up to certain combinations of force amplitude and frequency. Above these values, the force amplitude is reduced and a phase lag occurs. The power saving alternative control schemes typically exhibit such deficiencies at even lower command frequencies and amplitudes. To assess the severity of these effects, a time history analysis of the force output is performed for the bias method and the alternative methods. Results of the analysis show that the alternative approaches may be viable. The various control methods examined were mathematically modeled using nondimensionalized variables to facilitate comparison of the various methods.
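
    A worked sketch of the conventional bias linearization mentioned above, in generic textbook form rather than the paper's model: with opposing electromagnets driven by bias plus/minus control current, the net force is approximately linear in the control current around the centred position, at the cost of continuous bias power. The force constant, gap and bias value below are hypothetical.

    k, g, i_bias = 1e-6, 0.5e-3, 2.0     # hypothetical force constant, nominal gap (m), bias (A)

    def net_force(i_ctrl, x=0.0):
        """Differential drive: F = k[(ib+ic)^2/(g-x)^2 - (ib-ic)^2/(g+x)^2]."""
        return k * ((i_bias + i_ctrl) ** 2 / (g - x) ** 2
                    - (i_bias - i_ctrl) ** 2 / (g + x) ** 2)

    def linearized_force(i_ctrl):
        """Small-signal approximation at x = 0: F ~ 4*k*ib*ic / g^2."""
        return 4 * k * i_bias * i_ctrl / g ** 2

    # at the centred position the two agree; a small displacement shows the
    # negative-stiffness contribution that the linear model ignores
    for ic in (0.1, 0.5, 1.0):
        print(ic, round(net_force(ic, x=0.1e-3), 2), round(linearized_force(ic), 2))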

  13. Concept relationship editor: a visual interface to support the assertion of synonymy relationships between taxonomic classifications

    NASA Astrophysics Data System (ADS)

    Craig, Paul; Kennedy, Jessie

    2008-01-01

    An increasingly common approach taken by taxonomists to define the relationships between taxa in alternative hierarchical classifications is to use a set-based notation which states the relationship between two taxa from alternative classifications. Textual recording of these relationships is cumbersome and difficult for taxonomists to manage. While text-based GUI tools which ease the process are beginning to appear, these have several limitations. Interactive visual tools offer greater potential to allow taxonomists to explore the taxa in these hierarchies and specify such relationships. This paper describes the Concept Relationship Editor, an interactive visualisation tool designed to support the assertion of relationships between taxonomic classifications. The tool operates using an interactive space-filling adjacency layout which allows users to expand multiple lists of taxa with common parents so they can explore and assert relationships between two classifications.

  14. A Review of Major Nursing Vocabularies and the Extent to Which They Have the Characteristics Required for Implementation in Computer-based Systems

    PubMed Central

    Henry, Suzanne Bakken; Warren, Judith J.; Lange, Linda; Button, Patricia

    1998-01-01

    Building on the work of previous authors, the Computer-based Patient Record Institute (CPRI) Work Group on Codes and Structures has described features of a classification scheme for implementation within a computer-based patient record. The authors of the current study reviewed the evaluation literature related to six major nursing vocabularies (the North American Nursing Diagnosis Association Taxonomy 1, the Nursing Interventions Classification, the Nursing Outcomes Classification, the Home Health Care Classification, the Omaha System, and the International Classification for Nursing Practice) to determine the extent to which the vocabularies include the CPRI features. None of the vocabularies met all criteria. The Omaha System, Home Health Care Classification, and International Classification for Nursing Practice each included five features. Criteria not fully met by any systems were clear and non-redundant representation of concepts, administrative cross-references, syntax and grammar, synonyms, uncertainty, context-free identifiers, and language independence. PMID:9670127

  15. Overview of classification systems in peripheral artery disease.

    PubMed

    Hardman, Rulon L; Jazaeri, Omid; Yi, J; Smith, M; Gupta, Rajan

    2014-12-01

    Peripheral artery disease (PAD), secondary to atherosclerotic disease, is currently the leading cause of morbidity and mortality in the western world. While PAD is common, it is estimated that the majority of patients with PAD are undiagnosed and undertreated. The challenge to the treatment of PAD is to accurately diagnose the symptoms and determine treatment for each patient. The varied presentations of peripheral vascular disease have led to numerous classification schemes throughout the literature. Consistent grading of patients leads to both objective criteria for treating patients and a baseline for clinical follow-up. Reproducible classification systems are also important in clinical trials and when comparing medical, surgical, and endovascular treatment paradigms. This article reviews the various classification systems for PAD and advantages to each system.

  16. Unsupervised classification of remote multispectral sensing data

    NASA Technical Reports Server (NTRS)

    Su, M. Y.

    1972-01-01

    The new unsupervised classification technique for classifying multispectral remote sensing data, which can be either from a multispectral scanner or from digitized color-separation aerial photographs, consists of two parts: (a) a sequential statistical clustering, which is a one-pass sequential variance analysis, and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. Applications of the technique using an IBM-7094 computer on multispectral data sets over Purdue's Flight Line C-1 and the Yellowstone National Park test site have been accomplished. Comparisons between the classification maps produced by the unsupervised technique and by the supervised maximum likelihood technique indicate that the classification accuracies are in agreement.
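
    A sketch of the composite idea, not the original 1972 code: a one-pass sequential clustering produces initial clusters whose centres then seed an iterative K-means refinement. The synthetic data and distance threshold are placeholders.

    import numpy as np
    from sklearn.cluster import KMeans

    def sequential_pass(X, threshold):
        """Assign each sample to the nearest existing cluster, or open a new one."""
        centres, counts = [], []
        for x in X:
            if centres:
                d = np.linalg.norm(np.array(centres) - x, axis=1)
                j = int(np.argmin(d))
                if d[j] < threshold:
                    counts[j] += 1
                    centres[j] = centres[j] + (x - centres[j]) / counts[j]   # running mean
                    continue
            centres.append(x.astype(float))
            counts.append(1)
        return np.array(centres)

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(c, 0.3, size=(200, 4)) for c in (0.0, 2.0, 4.0)])
    init = sequential_pass(X, threshold=1.0)
    km = KMeans(n_clusters=len(init), init=init, n_init=1).fit(X)   # stage 2: iterative refinement
    print(len(init), "initial clusters;", np.bincount(km.labels_))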

  17. 46 CFR 8.410 - Applicability.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Compliance Program § 8.410 Applicability. This subpart applies to: (a) Recognized classification societies... recognized classification society that is authorized by the Coast Guard to participate in the Alternate...

  18. 46 CFR 8.410 - Applicability.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Compliance Program § 8.410 Applicability. This subpart applies to: (a) Recognized classification societies... recognized classification society that is authorized by the Coast Guard to participate in the Alternate...

  19. Update and validation of the Society for Vascular Surgery wound, ischemia, and foot infection threatened limb classification system.

    PubMed

    Mills, Joseph L

    2014-03-01

    The diagnosis of critical limb ischemia, first defined in 1982, was intended to delineate a patient cohort with a threatened limb and at risk for amputation due to severe peripheral arterial disease. The influence of diabetes and its associated neuropathy on the pathogenesis of the threatened limb was excluded as a comorbidity, despite its known contribution to amputation risk. The Fontaine and Rutherford classifications of limb ischemia severity have also been used to predict amputation risk and the likelihood of tissue healing. The dramatic increase in the prevalence of diabetes mellitus and the expanding techniques of arterial revascularization have prompted modification of peripheral arterial disease classification schemes to improve outcomes analysis for patients with threatened limbs. The diabetic patient with foot ulceration and infection is at risk for limb loss, with abnormal arterial perfusion as only one determinant of outcome. The wound extent and severity of infection also impact the likelihood of limb loss. To better predict amputation risk, the Society for Vascular Surgery Lower Extremity Guidelines Committee developed a classification of the threatened lower extremity that reflects these important clinical considerations. Risk stratification is based on three major factors that impact amputation risk and clinical management: wound, ischemia, and foot infection. This classification scheme is relevant to the patient with critical limb ischemia because many are also diabetic. Implementation of the wound, ischemia, and foot infection classification system in critical limb ischemia patients is recommended and should assist the clinician in more meaningful analysis of outcomes for the various forms of wound care and arterial revascularization procedures required in this challenging patient population. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Classification of product inspection items using nonlinear features

    NASA Astrophysics Data System (ADS)

    Talukder, Ashit; Casasent, David P.; Lee, H.-W.

    1998-03-01

    Automated processing and classification of real-time x-ray images of randomly oriented touching pistachio nuts is discussed. The ultimate objective is the development of a system for automated non-invasive detection of defective product items on a conveyor belt. This approach involves two main steps: preprocessing and classification. Preprocessing locates individual items and segments ones that touch using a modified watershed algorithm. The second stage involves extraction of features that allow discrimination between damaged and clean items (pistachio nuts). This feature extraction and classification stage is the new aspect of this paper. We use a new nonlinear feature extraction scheme called the maximum representation and discriminating feature (MRDF) extraction method to compute nonlinear features that are used as inputs to a classifier. The MRDF is shown to provide better classification and a better ROC (receiver operating characteristic) curve than other methods.

  1. A Unified Classification Framework for FP, DP and CP Data at X-Band in Southern China

    NASA Astrophysics Data System (ADS)

    Xie, Lei; Zhang, Hong; Li, Hongzhong; Wang, Chao

    2015-04-01

    The main objective of this paper is to introduce a unified framework for crop classification in Southern China using data in fully polarimetric (FP), dual-pol (DP) and compact polarimetric (CP) modes. TerraSAR-X data acquired over the Leizhou Peninsula, South China are used in our experiments. The study site involves four main crops (rice, banana, sugarcane and eucalyptus). Through exploring the similarities between data in these three modes, a knowledge-based characteristic space is created and the unified framework is presented. The overall classification accuracies are about 95% for data in the FP and coherent HH/VV modes and about 91% in the CP modes, which suggests that the proposed classification scheme is effective and promising. Compared with the Wishart Maximum Likelihood (ML) classifier, the proposed method exhibits higher classification accuracy.

  2. Speech Enhancement based on the Dominant Classification Between Speech and Noise Using Feature Data in Spectrogram of Observation Signal

    NASA Astrophysics Data System (ADS)

    Nomura, Yukihiro; Lu, Jianming; Sekiya, Hiroo; Yahagi, Takashi

    This paper presents a speech enhancement method based on classification between speech-dominant and noise-dominant components. In our system, a new scheme for classifying speech-dominant and noise-dominant bands is proposed. The proposed classification uses the standard deviation of the spectrum of the observation signal in each band. We introduce two oversubtraction factors, one for speech-dominant and one for noise-dominant bands, and spectral subtraction is carried out after the classification. The proposed method is tested on several noise types from the Noisex-92 database. Evaluation by segmental SNR, the Itakura-Saito distance measure, inspection of spectrograms, and listening tests shows that the proposed system is effective in reducing background noise. Moreover, the enhanced speech from our system contains less musical noise and distortion than that of conventional systems.
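
    As a rough illustration of the band-wise dominance decision plus oversubtraction described above, the sketch below applies spectral subtraction with two oversubtraction factors chosen per band from the normalized spread of the band spectrum. The band split, threshold and factor values are assumptions, not the paper's settings.

        # Minimal spectral-subtraction sketch with a per-band speech/noise dominance
        # decision; thresholds and oversubtraction factors are illustrative assumptions.
        import numpy as np

        def enhance(frame_spec, noise_est, std_threshold=0.5,
                    alpha_speech=2.0, alpha_noise=4.0, floor=0.02):
            """frame_spec, noise_est: magnitude spectra of one frame (1-D arrays)."""
            bands = np.array_split(np.arange(frame_spec.size), 8)   # 8 uniform bands (assumed)
            enhanced = np.empty_like(frame_spec)
            for idx in bands:
                band = frame_spec[idx]
                # a band whose spectrum varies strongly is treated as speech-dominant
                speech_dominant = np.std(band) / (np.mean(band) + 1e-12) > std_threshold
                alpha = alpha_speech if speech_dominant else alpha_noise
                sub = band - alpha * noise_est[idx]
                enhanced[idx] = np.maximum(sub, floor * band)        # spectral floor
            return enhanced

        spec = np.abs(np.fft.rfft(np.random.default_rng(0).normal(size=512)))
        noise = np.full_like(spec, spec.mean() * 0.3)                # crude noise estimate
        print(enhance(spec, noise)[:5])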

  3. Assessment of current virotherapeutic application schemes: “hit hard and early” versus “killing softly”?

    PubMed Central

    Ruf, Benjamin; Lauer, Ulrich M

    2015-01-01

    Over the past two decades, a considerable number of oncolytic vector families have entered numerous clinical trials. However, to date, the field has not come to a common understanding regarding the best possible ways to administer oncolytic viruses to cancer patients. This is mainly due to the fact that clinical trials designed for head-to-head comparisons (such as using two different virotherapeutics originating from two distinct virus families applied via identical routes in the same types of cancer) are still missing. Hence, there is no consensus (i) on the best route of virotherapeutic administration (e.g., systemic versus intratumoral), (ii) on the virus dosages to be applied, (iii) on dosing intervals, and (iv) on the numbers of repetitive courses of virus administration. As the detailed comparison of clinical virotherapy trial regimens is time-consuming and complex, we here present an overview of current state-of-the-art virotherapeutic application schemes. Notably, our comprehensive assessment culminates in two broad classifications of virotherapeutic strategies, i.e., “hit hard and early” versus “killing softly”. In order to find out which of these two gross alternatives might be most successful for each tumor entity, we suggest the implementation of phase 1/2 studies that primarily aim at repetitive sampling and analysis of tumor samples in cancer patients treated with oncolytic viruses, reading out (i) virus-specific, (ii) tumor-specific as well as (iii) immunotherapeutic parameters. On this basis, a rational design of significantly improved virotherapeutic application schemes should be possible in the future. PMID:27119110

  4. Wilderness ecology: a method of sampling and summarizing data for plant community classification.

    Treesearch

    Lewis F. Ohmann; Robert R. Ream

    1971-01-01

    Presents a flexible sampling scheme that researchers and land managers may use in surveying and classifying plant communities of forest lands. Includes methods, data sheets, and computer summarization printouts.

  5. Investigating Elementary Teachers' Conceptions of Animal Classification

    ERIC Educational Resources Information Center

    Burgoon, Jacob N.; Duran, Emilio

    2012-01-01

    Numerous studies have been conducted regarding alternative conceptions about animal diversity and classification, many of which have used a cross-age approach to investigate how students' conceptions change over time. None of these studies, however, have investigated teachers' conceptions of animal classification. This study was intended to…

  6. 46 CFR 189.15-5 - Alternate compliance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., a list of authorized classification societies, including information for ordering copies of approved classification society rules and supplements, is available from Commandant (CG-521), 2100 2nd St., SW., Stop 7126, Washington, DC 20593-7126; telephone (202) 372-1371; or fax (202) 372-1925. Approved classification society...

  7. 46 CFR 189.15-5 - Alternate compliance.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., a list of authorized classification societies, including information for ordering copies of approved classification society rules and supplements, is available from Commandant (CG-521), 2100 2nd St., SW., Stop 7126, Washington, DC 20593-7126; telephone (202) 372-1371; or fax (202) 372-1925. Approved classification society...

  8. Hybrid analysis of multiaxis electromagnetic data for discrimination of munitions and explosives of concern

    USGS Publications Warehouse

    Friedel, M.J.; Asch, T.H.; Oden, C.

    2012-01-01

    The remediation of land containing munitions and explosives of concern, otherwise known as unexploded ordnance, is an ongoing problem facing the U.S. Department of Defense and similar agencies worldwide that have used or are transferring training ranges or munitions disposal areas to civilian control. The expense associated with cleanup of land previously used for military training and war provides impetus for research towards enhanced discrimination of buried unexploded ordnance. Towards reducing that expense, a multiaxis electromagnetic induction data collection and software system, called ALLTEM, was designed and tested with support from the U.S. Department of Defense Environmental Security Technology Certification Program. ALLTEM is an on-time time-domain system that uses a continuous triangle-wave excitation to measure the target-step response rather than traditional impulse response. The system cycles through three orthogonal transmitting loops and records a total of 19 different transmitting and receiving loop combinations with a nominal spatial data sampling interval of 20 cm. Recorded data are pre-processed and then used in a hybrid discrimination scheme involving both data-driven and numerical classification techniques. The data-driven classification scheme is accomplished in three steps. First, field observations are used to train a type of unsupervised artificial neural network, a self-organizing map (SOM). Second, the SOM is used to simultaneously estimate target parameters (depth, azimuth, inclination, item type and weight) by iterative minimization of the topographic error vectors. Third, the target classification is accomplished by evaluating histograms of the estimated parameters. The numerical classification scheme is also accomplished in three steps. First, the Biot–Savart law is used to model the primary magnetic fields from the transmitter coils and the secondary magnetic fields generated by currents induced in the target materials in the ground. Second, the target response is modelled by three orthogonal dipoles from prolate, oblate and triaxial ellipsoids with one long axis and two shorter axes. Each target consists of all three dipoles. Third, unknown target parameters are determined by comparing modelled to measured target responses. By comparing the rms error among the self-organizing map and numerical classification results, we achieved greater than 95 per cent detection and correct classification of the munitions and explosives of concern at the direct fire and indirect fire test areas at the UXO Standardized Test Site at the Aberdeen Proving Ground, Maryland in 2010.

  9. Hybrid analysis of multiaxis electromagnetic data for discrimination of munitions and explosives of concern

    NASA Astrophysics Data System (ADS)

    Friedel, M. J.; Asch, T. H.; Oden, C.

    2012-08-01

    The remediation of land containing munitions and explosives of concern, otherwise known as unexploded ordnance, is an ongoing problem facing the U.S. Department of Defense and similar agencies worldwide that have used or are transferring training ranges or munitions disposal areas to civilian control. The expense associated with cleanup of land previously used for military training and war provides impetus for research towards enhanced discrimination of buried unexploded ordnance. Towards reducing that expense, a multiaxis electromagnetic induction data collection and software system, called ALLTEM, was designed and tested with support from the U.S. Department of Defense Environmental Security Technology Certification Program. ALLTEM is an on-time time-domain system that uses a continuous triangle-wave excitation to measure the target-step response rather than traditional impulse response. The system cycles through three orthogonal transmitting loops and records a total of 19 different transmitting and receiving loop combinations with a nominal spatial data sampling interval of 20 cm. Recorded data are pre-processed and then used in a hybrid discrimination scheme involving both data-driven and numerical classification techniques. The data-driven classification scheme is accomplished in three steps. First, field observations are used to train a type of unsupervised artificial neural network, a self-organizing map (SOM). Second, the SOM is used to simultaneously estimate target parameters (depth, azimuth, inclination, item type and weight) by iterative minimization of the topographic error vectors. Third, the target classification is accomplished by evaluating histograms of the estimated parameters. The numerical classification scheme is also accomplished in three steps. First, the Biot-Savart law is used to model the primary magnetic fields from the transmitter coils and the secondary magnetic fields generated by currents induced in the target materials in the ground. Second, the target response is modelled by three orthogonal dipoles from prolate, oblate and triaxial ellipsoids with one long axis and two shorter axes. Each target consists of all three dipoles. Third, unknown target parameters are determined by comparing modelled to measured target responses. By comparing the rms error among the self-organizing map and numerical classification results, we achieved greater than 95 per cent detection and correct classification of the munitions and explosives of concern at the direct fire and indirect fire test areas at the UXO Standardized Test Site at the Aberdeen Proving Ground, Maryland in 2010.
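
    The data-driven step in the hybrid scheme above rests on a self-organizing map trained on field observations. The sketch below is a compact NumPy SOM, not the ALLTEM processing chain: the grid size, learning schedule and the assumption that each observation is a 19-element vector of loop-combination amplitudes are illustrative only.

        # Compact self-organizing map sketch (NumPy only); grid size, learning schedule
        # and feature layout are illustrative assumptions, not the ALLTEM pipeline.
        import numpy as np

        def train_som(data, grid=(10, 10), n_iter=2000, lr0=0.5, sigma0=3.0, seed=0):
            rng = np.random.default_rng(seed)
            h, w = grid
            weights = rng.normal(size=(h, w, data.shape[1]))
            coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
            for t in range(n_iter):
                x = data[rng.integers(len(data))]
                dist = np.linalg.norm(weights - x, axis=-1)
                bmu = np.unravel_index(np.argmin(dist), dist.shape)   # best-matching unit
                lr = lr0 * np.exp(-t / n_iter)
                sigma = sigma0 * np.exp(-t / n_iter)
                d2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
                nb = np.exp(-d2 / (2 * sigma ** 2))                   # neighbourhood weights
                weights += lr * nb[..., None] * (x - weights)
            return weights

        def map_observation(weights, x):
            """Return the grid cell whose prototype best matches one EM observation."""
            dist = np.linalg.norm(weights - x, axis=-1)
            return np.unravel_index(np.argmin(dist), dist.shape)

        obs = np.random.default_rng(1).normal(size=(500, 19))   # hypothetical observation vectors
        som = train_som(obs)
        print(map_observation(som, obs[0]))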

  10. Composite prognostic models across the non-alcoholic fatty liver disease spectrum: Clinical application in developing countries

    PubMed Central

    Lückhoff, Hilmar K; Kruger, Frederik C; Kotze, Maritha J

    2015-01-01

    Heterogeneity in clinical presentation, histological severity, prognosis and therapeutic outcomes characteristic of non-alcoholic fatty liver disease (NAFLD) necessitates the development of scientifically sound classification schemes to assist clinicians in stratifying patients into meaningful prognostic subgroups. The need to replace invasive liver biopsy, the standard method whereby NAFLD is diagnosed, graded and staged, with biomarkers of histological injury severity led to the development of composite prognostic models as potentially viable surrogate alternatives. In the present article, we review existing scoring systems used to (1) confirm the presence of undiagnosed hepatosteatosis; (2) distinguish between simple steatosis and non-alcoholic steatohepatitis (NASH); and (3) predict advanced hepatic fibrosis, with particular emphasis on the role of NAFLD as an independent cardio-metabolic risk factor. In addition, the incorporation of functional genomic markers and application of emerging imaging technologies are discussed as a means to improve the diagnostic accuracy and predictive performance of promising composite models found to be most appropriate for widespread clinical adoption. PMID:26019735

  11. Classifying aerosol type using in situ surface spectral aerosol optical properties

    NASA Astrophysics Data System (ADS)

    Schmeisser, Lauren; Andrews, Elisabeth; Ogren, John A.; Sheridan, Patrick; Jefferson, Anne; Sharma, Sangeeta; Kim, Jeong Eun; Sherman, James P.; Sorribas, Mar; Kalapov, Ivo; Arsov, Todor; Angelov, Christo; Mayol-Bracero, Olga L.; Labuschagne, Casper; Kim, Sang-Woo; Hoffer, András; Lin, Neng-Huei; Chia, Hao-Ping; Bergin, Michael; Sun, Junying; Liu, Peng; Wu, Hao

    2017-10-01

    Knowledge of aerosol size and composition is important for determining radiative forcing effects of aerosols, identifying aerosol sources and improving aerosol satellite retrieval algorithms. The ability to extrapolate aerosol size and composition, or type, from intensive aerosol optical properties can help expand the current knowledge of spatiotemporal variability in aerosol type globally, particularly where chemical composition measurements do not exist concurrently with optical property measurements. This study uses medians of the scattering Ångström exponent (SAE), absorption Ångström exponent (AAE) and single scattering albedo (SSA) from 24 stations within the NOAA/ESRL Federated Aerosol Monitoring Network to infer aerosol type using previously published aerosol classification schemes. Three methods are implemented to obtain a best estimate of dominant aerosol type at each station using aerosol optical properties. The first method plots station medians into an AAE vs. SAE plot space, so that a unique combination of intensive properties corresponds with an aerosol type. The second typing method expands on the first by introducing a multivariate cluster analysis, which aims to group stations with similar optical characteristics and thus similar dominant aerosol type. The third and final classification method pairs 3-day backward air mass trajectories with median aerosol optical properties to explore the relationship between trajectory origin (proxy for likely aerosol type) and aerosol intensive parameters, while allowing for multiple dominant aerosol types at each station. The three aerosol classification methods have some common, and thus robust, results. In general, estimating dominant aerosol type using optical properties is best suited for site locations with a stable and homogeneous aerosol population, particularly continental polluted (carbonaceous aerosol), marine polluted (carbonaceous aerosol mixed with sea salt) and continental dust/biomass sites (dust and carbonaceous aerosol); however, current classification schemes perform poorly when predicting dominant aerosol type at remote marine and Arctic sites and at stations with more complex locations and topography where variable aerosol populations are not well represented by median optical properties. Although the aerosol classification methods presented here provide new ways to reduce ambiguity in typing schemes, there is more work needed to find aerosol typing methods that are useful for a larger range of geographic locations and aerosol populations.
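
    To make the first typing method concrete, the toy function below assigns a coarse aerosol type from where a station's median falls in AAE-versus-SAE space. The boundary values are rough rules of thumb chosen for illustration; they are not the thresholds of the published classification schemes used in the study.

        # Rough illustrative AAE/SAE typing thresholds; NOT the published scheme's boundaries.
        def aerosol_type(sae, aae):
            """Coarse aerosol type from median scattering/absorption Angstrom exponents."""
            if sae < 1.0 and aae > 1.5:
                return "dust-dominated (large, wavelength-dependent absorbers)"
            if sae >= 1.5 and aae > 1.5:
                return "biomass burning / brown carbon (small, strongly absorbing)"
            if sae >= 1.5 and aae <= 1.5:
                return "polluted continental / black-carbon mixture (small particles)"
            return "sea salt or mixed large-particle aerosol"

        print(aerosol_type(sae=0.6, aae=2.1))   # hypothetical station median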

  12. DIF Trees: Using Classification Trees to Detect Differential Item Functioning

    ERIC Educational Resources Information Center

    Vaughn, Brandon K.; Wang, Qiu

    2010-01-01

    A nonparametric tree classification procedure is used to detect differential item functioning for items that are dichotomously scored. Classification trees are shown to be an alternative procedure to detect differential item functioning other than the use of traditional Mantel-Haenszel and logistic regression analysis. A nonparametric…

  13. 46 CFR 8.440 - Vessel enrollment in the Alternate Compliance Program.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... classification society and accepted by the Coast Guard, the cognizant OCMI may decline to issue a certificate of... recognized classification society authorized by the Coast Guard to determine compliance with applicable international treaties and agreements, the classification society's class rules, and the U.S. supplement...

  14. 46 CFR 8.440 - Vessel enrollment in the Alternate Compliance Program.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... classification society and accepted by the Coast Guard, the cognizant OCMI may decline to issue a certificate of... recognized classification society authorized by the Coast Guard to determine compliance with applicable international treaties and agreements, the classification society's class rules, and the U.S. Supplement...

  15. 46 CFR 8.440 - Vessel enrollment in the Alternate Compliance Program.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... classification society and accepted by the Coast Guard, the cognizant OCMI may decline to issue a certificate of... recognized classification society authorized by the Coast Guard to determine compliance with applicable international treaties and agreements, the classification society's class rules, and the U.S. supplement...

  16. 46 CFR 8.440 - Vessel enrollment in the Alternate Compliance Program.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... issuance or renewal of a COI may submit the vessel for classification, plan review and inspection by a recognized classification society authorized by the Coast Guard to determine compliance with applicable international treaties and agreements, the classification society's class rules, and the U.S. Supplement...

  17. 46 CFR 8.440 - Vessel enrollment in the Alternate Compliance Program.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... issuance or renewal of a COI may submit the vessel for classification, plan review and inspection by a recognized classification society authorized by the Coast Guard to determine compliance with applicable international treaties and agreements, the classification society's class rules, and the U.S. Supplement...

  18. A comprehensive catalogue and classification of human thermal climate indices

    NASA Astrophysics Data System (ADS)

    de Freitas, C. R.; Grigorieva, E. A.

    2015-01-01

    The very large number of human thermal climate indices that have been proposed over the past 100 years or so is a manifestation of the perceived importance within the scientific community of the thermal environment and the desire to quantify it. Schemes used differ in approach according to the number of variables taken into account, the rationale employed, the relative sophistication of the underlying body-atmosphere heat exchange theory and the particular design for application. They also vary considerably in type and quality, as well as in several other aspects. Reviews appear in the literature, but they cover a limited number of indices. A project that produces a comprehensive documentation, classification and overall evaluation of the full range of existing human thermal climate indices has never been attempted. This paper deals with documentation and classification. A subsequent report will focus on evaluation. Here a comprehensive register of 162 thermal indices is assembled and a sorting scheme devised that groups them according to eight primary classification classes. It is the first stage in a project to organise and evaluate the full range of all human thermal climate indices. The work, when completed, will make it easier for users to reflect on the merits of all available thermal indices. It will be simpler to locate and compare indices and decide which is most appropriate for a particular application or investigation.

  19. Construction of an Yucatec Maya soil classification and comparison with the WRB framework

    PubMed Central

    2010-01-01

    Background: Mayas living in southeast Mexico have used soils for millennia and thus provide a good example for understanding soil-culture relationships and for exploring the ways indigenous people name and classify the soils of their territory. This paper presents an attempt to organize the Maya soil knowledge into a soil classification scheme and compares the latter with the World Reference Base for Soil Resources (WRB). Methods: Several participative soil surveys were carried out in the period 2000-2009 with the help of bilingual Maya-Spanish-speaking farmers. A multilingual soil database was built with 315 soil profile descriptions. Results: On the basis of the diagnostic soil properties and the soil nomenclature used by Maya farmers, a soil classification scheme with a hierarchic, dichotomous and open structure was constructed, organized in groups and qualifiers in a fashion similar to that of the WRB system. Maya soil properties were used at the same categorical levels as similar diagnostic properties are used in the WRB system. Conclusions: The Maya soil classification (MSC) is a natural system based on key properties, such as relief position, rock types, size and quantity of stones, color of topsoil and subsoil, depth, water dynamics, and plant-supporting processes. The MSC addresses the soil properties of surficial and subsurficial horizons, and uses plant communities as qualifiers in some cases. The MSC is more accurate than the WRB for classifying Leptosols. PMID:20152047

  20. Construction of an Yucatec Maya soil classification and comparison with the WRB framework.

    PubMed

    Bautista, Francisco; Zinck, J Alfred

    2010-02-13

    Mayas living in southeast Mexico have used soils for millennia and thus provide a good example for understanding soil-culture relationships and for exploring the ways indigenous people name and classify the soils of their territory. This paper presents an attempt to organize the Maya soil knowledge into a soil classification scheme and compares the latter with the World Reference Base for Soil Resources (WRB). Several participative soil surveys were carried out in the period 2000-2009 with the help of bilingual Maya-Spanish-speaking farmers. A multilingual soil database was built with 315 soil profile descriptions. On the basis of the diagnostic soil properties and the soil nomenclature used by Maya farmers, a soil classification scheme with a hierarchic, dichotomous and open structure was constructed, organized in groups and qualifiers in a fashion similar to that of the WRB system. Maya soil properties were used at the same categorical levels as similar diagnostic properties are used in the WRB system. The Maya soil classification (MSC) is a natural system based on key properties, such as relief position, rock types, size and quantity of stones, color of topsoil and subsoil, depth, water dynamics, and plant-supporting processes. The MSC addresses the soil properties of surficial and subsurficial horizons, and uses plant communities as qualifiers in some cases. The MSC is more accurate than the WRB for classifying Leptosols.

  1. A comprehensive catalogue and classification of human thermal climate indices.

    PubMed

    de Freitas, C R; Grigorieva, E A

    2015-01-01

    The very large number of human thermal climate indices that have been proposed over the past 100 years or so is a manifestation of the perceived importance within the scientific community of the thermal environment and the desire to quantify it. Schemes used differ in approach according to the number of variables taken into account, the rationale employed, the relative sophistication of the underlying body-atmosphere heat exchange theory and the particular design for application. They also vary considerably in type and quality, as well as in several other aspects. Reviews appear in the literature, but they cover a limited number of indices. A project that produces a comprehensive documentation, classification and overall evaluation of the full range of existing human thermal climate indices has never been attempted. This paper deals with documentation and classification. A subsequent report will focus on evaluation. Here a comprehensive register of 162 thermal indices is assembled and a sorting scheme devised that groups them according to eight primary classification classes. It is the first stage in a project to organise and evaluate the full range of all human thermal climate indices. The work, when completed, will make it easier for users to reflect on the merits of all available thermal indices. It will be simpler to locate and compare indices and decide which is most appropriate for a particular application or investigation.

  2. A Deep Learning Scheme for Motor Imagery Classification based on Restricted Boltzmann Machines.

    PubMed

    Lu, Na; Li, Tengfei; Ren, Xiaodong; Miao, Hongyu

    2017-06-01

    Motor imagery classification is an important topic in brain-computer interface (BCI) research that enables the recognition of a subject's intention to, e.g., implement prosthesis control. The brain dynamics of motor imagery are usually measured by electroencephalography (EEG) as nonstationary time series of low signal-to-noise ratio. Although a variety of methods have been previously developed to learn EEG signal features, the deep learning idea has rarely been explored to generate new representations of EEG features and achieve further performance improvement for motor imagery classification. In this study, a novel deep learning scheme based on the restricted Boltzmann machine (RBM) is proposed. Specifically, frequency-domain representations of EEG signals, obtained via fast Fourier transform (FFT) and wavelet packet decomposition (WPD), are used to train three RBMs. These RBMs are then stacked up with an extra output layer to form a four-layer neural network, which is named the frequential deep belief network (FDBN). The output layer employs softmax regression to accomplish the classification task. The conjugate gradient method and backpropagation are used to fine-tune the FDBN. Extensive and systematic experiments have been performed on public benchmark datasets, and the results show that the performance improvement of FDBN over other selected state-of-the-art methods is statistically significant. Several findings that may be of significant interest to the BCI community are also presented in this article.
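
    The FDBN itself stacks three RBMs under a softmax output and fine-tunes the whole network; as a loose, much smaller analogue, the sketch below chains scikit-learn's BernoulliRBM feature layer into a logistic-regression (softmax) output on FFT representations. The synthetic data, layer size and scaling step are assumptions, not the paper's architecture or datasets.

        # Loose analogue of an RBM feature layer under a softmax output (scikit-learn);
        # data, layer size and preprocessing are illustrative assumptions only.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.neural_network import BernoulliRBM
        from sklearn.pipeline import Pipeline
        from sklearn.preprocessing import MinMaxScaler

        rng = np.random.default_rng(0)
        X_time = rng.normal(size=(200, 256))            # hypothetical EEG epochs
        y = rng.integers(0, 2, size=200)                # hypothetical motor imagery labels

        X_freq = np.abs(np.fft.rfft(X_time, axis=1))    # frequency-domain representation

        model = Pipeline([
            ("scale", MinMaxScaler()),                  # RBM expects inputs in [0, 1]
            ("rbm", BernoulliRBM(n_components=64, learning_rate=0.05, n_iter=20, random_state=0)),
            ("softmax", LogisticRegression(max_iter=1000)),
        ])
        model.fit(X_freq, y)
        print("training accuracy (synthetic data):", model.score(X_freq, y))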

  3. Evidences in Neurological Surgery and a Cutting Edge Classification of the Trigeminocardiac Reflex: A Systematic Review.

    PubMed

    Leon-Ariza, Daniel S; Leon-Ariza, Juan S; Nangiana, Jasvinder; Grau, Gabriel Vargas; Leon-Sarmiento, Fidias E; Quiñones-Hinojosa, Alfredo

    2018-06-05

    The trigeminocardiac reflex (TCR) is characterized by bradycardia, a decrease of mean arterial blood pressure (MABP), and, sometimes, asystole during surgery. We critically reviewed TCR studies and devised a novel classification scheme for assessing the reflex. A comprehensive systematic literature review was performed using the PubMed, MEDLINE, Web of Science, EMBASE, and Scielo databases. Eligible studies were extracted based on stringent inclusion and exclusion criteria. Statistical analyses were used to assess cardiovascular variables. TCR was classified according to morphophysiological aspects involved with reflex elicitation. In total, 575 patients were included in this study. TCR was found in 8.9% of patients. The reflex was more often triggered by interventions made within the anterior cranial fossa. The maxillary branch (type II in the new classification) was the most prevalent nerve branch found to trigger the TCR. Heart rate (HR) and MABP were similarly altered (p = 0.06; F = 0.3912809), covaried with age (p = 0.012; F = 9.302), and were inversely correlated with each other (r = -0.27). TCR is a critical cardiovascular phenomenon that must be quickly identified and efficiently classified, and that should trigger vigilance. Prompt therapeutic measures during neurosurgical procedures should be carefully addressed to avoid unwanted complications. Accurate categorization using the new classification scheme will help to improve understanding and guide the management of TCR in the perioperative period. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. Land use and land cover classification for rural residential areas in China using soft-probability cascading of multifeatures

    NASA Astrophysics Data System (ADS)

    Zhang, Bin; Liu, Yueyan; Zhang, Zuyu; Shen, Yonglin

    2017-10-01

    A multifeature soft-probability cascading scheme is proposed to solve the problem of land use and land cover (LULC) classification of rural residential areas in China using high-spatial-resolution images. The proposed method is used to build midlevel LULC features. Local features are frequently used as low-level feature descriptors in midlevel feature learning methods; however, spectral and textural features, which are very effective low-level features, are often neglected. Moreover, the dictionary used for sparse coding is acquired in an unsupervised manner, which reduces the discriminative power of the midlevel features. Thus, we propose to learn supervised features based on sparse coding, a support vector machine (SVM) classifier, and a conditional random field (CRF) model to utilize the different effective low-level features and improve the discriminability of midlevel feature descriptors. First, three kinds of typical low-level features, namely, dense scale-invariant feature transform, gray-level co-occurrence matrix, and spectral features, are extracted separately. Second, combined with sparse coding and the SVM classifier, the probabilities of the different LULC classes are inferred to build supervised feature descriptors. Finally, the CRF model, which consists of two parts, a unary potential and a pairwise potential, is employed to construct the LULC classification map. Experimental results show that the proposed classification scheme achieves impressive performance, with a total accuracy of about 87%.

  5. Hyperspectral Image Classification via Multitask Joint Sparse Representation and Stepwise MRF Optimization.

    PubMed

    Yuan, Yuan; Lin, Jianzhe; Wang, Qi

    2016-12-01

    Hyperspectral image (HSI) classification is a crucial issue in remote sensing. Accurate classification benefits a large number of applications such as land use analysis and marine resource utilization. But high data correlation brings difficulty to reliable classification, especially for HSI with abundant spectral information. Furthermore, the traditional methods often fail to adequately consider the spatial coherency of HSI, which also limits the classification performance. To address these inherent obstacles, a novel spectral-spatial classification scheme is proposed in this paper. The proposed method mainly focuses on multitask joint sparse representation (MJSR) and a stepwise Markov random field framework, which constitute the two main contributions of this procedure. First, the MJSR not only reduces the spectral redundancy, but also retains necessary correlation in the spectral field during classification. Second, the stepwise optimization further explores the spatial correlation that significantly enhances the classification accuracy and robustness. In terms of several universal quality evaluation indexes, the experimental results on the Indian Pines and Pavia University datasets demonstrate the superiority of our method compared with state-of-the-art competitors.

  6. Cooperative Learning for Distributed In-Network Traffic Classification

    NASA Astrophysics Data System (ADS)

    Joseph, S. B.; Loo, H. R.; Ismail, I.; Andromeda, T.; Marsono, M. N.

    2017-04-01

    Inspired by the concept of autonomic distributed/decentralized network management schemes, we consider the issue of information exchange among distributed network nodes to improve network performance and promote scalability for in-network monitoring. In this paper, we propose a cooperative learning algorithm for propagation and synchronization of network information among autonomic distributed network nodes for online traffic classification. The results show that network nodes with sharing capability perform better, with a higher average accuracy of 89.21% (sharing data) and 88.37% (sharing clusters), compared to 88.06% for nodes without cooperative learning capability. The overall performance indicates that cooperative learning is promising for distributed in-network traffic classification.

  7. Malware distributed collection and pre-classification system using honeypot technology

    NASA Astrophysics Data System (ADS)

    Grégio, André R. A.; Oliveira, Isabela L.; Santos, Rafael D. C.; Cansian, Adriano M.; de Geus, Paulo L.

    2009-04-01

    Malware has become a major threat in recent years due to the ease of spread through the Internet. Malware detection has become difficult with the use of compression, polymorphic methods and techniques to detect and disable security software. Those and other obfuscation techniques pose a problem for detection and classification schemes that analyze malware behavior. In this paper we propose a distributed architecture to improve malware collection using different honeypot technologies to increase the variety of malware collected. We also present a daemon tool developed to grab malware distributed through spam and a pre-classification technique that uses antivirus technology to separate malware into generic classes.

  8. Interpolation Hermite Polynomials For Finite Element Method

    NASA Astrophysics Data System (ADS)

    Gusev, Alexander; Vinitsky, Sergue; Chuluunbaatar, Ochbadrakh; Chuluunbaatar, Galmandakh; Gerdt, Vladimir; Derbov, Vladimir; Góźdź, Andrzej; Krassovitskiy, Pavel

    2018-02-01

    We describe a new algorithm for the analytic calculation of high-order Hermite interpolation polynomials of the simplex and give their classification. A typical example of a triangle element, to be used in high-accuracy finite element schemes, is given.
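
    The record above concerns Hermite interpolation polynomials on simplices for finite element bases; as a minimal one-dimensional analogue only, the sketch below builds a piecewise-cubic Hermite interpolant that matches prescribed values and first derivatives at the nodes, using SciPy. The test function is an assumption for illustration.

        # One-dimensional Hermite interpolation analogue (matches values and first
        # derivatives at the nodes); not the simplex construction of the paper.
        import numpy as np
        from scipy.interpolate import CubicHermiteSpline

        x = np.array([0.0, 0.5, 1.0])
        y = np.sin(np.pi * x)                 # nodal values (test function, assumed)
        dy = np.pi * np.cos(np.pi * x)        # nodal first derivatives

        p = CubicHermiteSpline(x, y, dy)
        print(p(0.25), np.sin(np.pi * 0.25))  # interpolant vs. exact value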

  9. Fundamental Concepts in the Teaching of Chemistry.

    ERIC Educational Resources Information Center

    Loeffler, Paul A.

    1989-01-01

    Presented is a simple, unified approach to chemical nomenclature which employs the distinction between the terms chemical substance and chemical species. The classification of matter and chemical nomenclature are used as examples to illustrate this scheme. (CW)

  10. Environmental classification scheme for Pontis.

    DOT National Transportation Integrated Search

    1994-01-01

    In an effort to comply with the federal mandate for bridge management systems, many states have chosen to implement an existing system rather than develop their own. One such system is Pontis, the network-level bridge management system developed thro...

  11. Mapping forest vegetation with ERTS-1 MSS data and automatic data processing techniques

    NASA Technical Reports Server (NTRS)

    Messmore, J.; Copeland, G. E.; Levy, G. F.

    1975-01-01

    This study was undertaken with the intent of elucidating the forest mapping capabilities of ERTS-1 MSS data when analyzed with the aid of LARS' automatic data processing techniques. The site for this investigation was the Great Dismal Swamp, a 210,000 acre wilderness area located on the Middle Atlantic coastal plain. Due to inadequate ground truth information on the distribution of vegetation within the swamp, an unsupervised classification scheme was utilized. Initially pictureprints, resembling low resolution photographs, were generated in each of the four ERTS-1 channels. Data found within rectangular training fields was then clustered into 13 spectral groups and defined statistically. Using a maximum likelihood classification scheme, the unknown data points were subsequently classified into one of the designated training classes. Training field data was classified with a high degree of accuracy (greater than 95%), and progress is being made towards identifying the mapped spectral classes.

  12. Mapping forest vegetation with ERTS-1 MSS data and automatic data processing techniques

    NASA Technical Reports Server (NTRS)

    Messmore, J.; Copeland, G. E.; Levy, G. F.

    1975-01-01

    This study was undertaken with the intent of elucidating the forest mapping capabilities of ERTS-1 MSS data when analyzed with the aid of LARS' automatic data processing techniques. The site for this investigation was the Great Dismal Swamp, a 210,000 acre wilderness area located on the Middle Atlantic coastal plain. Due to inadequate ground truth information on the distribution of vegetation within the swamp, an unsupervised classification scheme was utilized. Initially pictureprints, resembling low resolution photographs, were generated in each of the four ERTS-1 channels. Data found within rectangular training fields was then clustered into 13 spectral groups and defined statistically. Using a maximum likelihood classification scheme, the unknown data points were subsequently classified into one of the designated training classes. Training field data was classified with a high degree of accuracy (greater than 95 percent), and progress is being made towards identifying the mapped spectral classes.
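
    The cluster-then-maximum-likelihood idea in the abstracts above can be sketched as follows: group training-field pixels into spectral classes, fit a Gaussian to each class, and assign unknown pixels to the most likely class. The band count, number of clusters and synthetic data below are assumptions for illustration, not the LARS processing actually used.

        # Sketch of unsupervised clustering followed by maximum likelihood assignment;
        # band count, cluster count and data are synthetic assumptions.
        import numpy as np
        from scipy.stats import multivariate_normal
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        train_pixels = rng.normal(size=(5000, 4))        # 4 spectral bands, training-field pixels
        unknown_pixels = rng.normal(size=(1000, 4))

        km = KMeans(n_clusters=13, n_init=10, random_state=0).fit(train_pixels)

        models = []                                      # one Gaussian per spectral cluster
        for k in range(13):
            members = train_pixels[km.labels_ == k]
            models.append(multivariate_normal(members.mean(axis=0),
                                              np.cov(members.T) + 1e-6 * np.eye(4)))

        log_lik = np.column_stack([m.logpdf(unknown_pixels) for m in models])
        assigned = log_lik.argmax(axis=1)                # maximum likelihood class per pixel
        print(np.bincount(assigned, minlength=13))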

  13. Complex versus simple models: ion-channel cardiac toxicity prediction.

    PubMed

    Mistry, Hitesh B

    2018-01-01

    There is growing interest in applying detailed mathematical models of the heart for ion-channel-related cardiac toxicity prediction. However, there is debate as to whether such complex models are required. Here, an assessment of the predictive performance of two established large-scale biophysical cardiac models and a simple linear model, Bnet, was conducted. Three ion-channel data-sets were extracted from the literature. Each compound was assigned a cardiac risk category using two different classification schemes based on information within CredibleMeds. The predictive performance of each model within each data-set for each classification scheme was assessed via a leave-one-out cross validation. Overall, the Bnet model performed as well as the leading cardiac models in two of the data-sets and outperformed both cardiac models on the latest one. These results highlight the importance of benchmarking complex versus simple models but also encourage the development of simple models.
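
    The comparison above hinges on leave-one-out cross validation over per-compound ion-channel features. The sketch below shows only that evaluation loop with a generic linear classifier standing in for the models; the data and classifier are hypothetical and this is not the published Bnet formula.

        # Leave-one-out cross-validation sketch over per-compound ion-channel features;
        # the data and classifier are hypothetical stand-ins, not the published Bnet model.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_score

        rng = np.random.default_rng(0)
        X = rng.uniform(0, 1, size=(60, 3))    # assumed fractional block of three ion channels
        y = rng.integers(0, 2, size=60)        # assumed binary risk category per compound

        scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=LeaveOneOut())
        print("LOO accuracy (synthetic data):", scores.mean())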

  14. The role of man in flight experiment payload missions. Volume 1: Results

    NASA Technical Reports Server (NTRS)

    Malone, T. B.

    1973-01-01

    It is pointed out that a controversy exists concerning the required role of man, and his attendant skills and levels of skills, for Sortie Lab operations. As a result, a study was conducted to generate a taxonomy of candidate crew roles which would: (1) be applicable across all experiments, and (2) be usable for Sortie scientists and engineers in determination of level of skill as well as type of skill. Nine basic roles were identified in the study, and the tasks associated with each were developed from a functional description of a generalized in-flight experiment. The functional analysis comprised the baseline for establishment of crew roles, with roles being defined as combinations of tasks, associated skills, and knowledges. A role classification scheme was developed in which the functions and tasks identified were allocated to each of the nine role types. This classification scheme is presented together with the significant results of the study.

  15. Improved biliary detection and diagnosis through intelligent machine analysis.

    PubMed

    Logeswaran, Rajasvaran

    2012-09-01

    This paper reports on work undertaken to improve automated detection of bile ducts in magnetic resonance cholangiopancreatography (MRCP) images, with the objective of conducting preliminary classification of the images for diagnosis. The proposed I-BDeDIMA (Improved Biliary Detection and Diagnosis through Intelligent Machine Analysis) scheme is a multi-stage framework consisting of successive phases of image normalization, denoising, structure identification, object labeling, feature selection and disease classification. A combination of multiresolution wavelet, dynamic intensity thresholding, segment-based region growing, region elimination, statistical analysis and neural networks, is used in this framework to achieve good structure detection and preliminary diagnosis. Tests conducted on over 200 clinical images with known diagnosis have shown promising results of over 90% accuracy. The scheme outperforms related work in the literature, making it a viable framework for computer-aided diagnosis of biliary diseases. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  16. Spatial variations in US poverty: beyond metropolitan and non-metropolitan.

    PubMed

    Wang, Man; Kleit, Rachel Garshick; Cover, Jane; Fowler, Christopher S

    2012-01-01

    Because poverty in rural and urban areas of the US often has different causes, correlates and solutions, effective anti-poverty policies depend on a thorough understanding of the ruralness or urbanness of specific places. This paper compares several widely used classification schemes and the varying magnitudes of poverty that they reveal in the US. The commonly used ‘metropolitan/non-metropolitan’ distinction obscures important socioeconomic differences among metropolitan areas, making our understanding of the geography of poverty imprecise. Given the number and concentration of poor people living in mixed-rural and rural counties in metropolitan regions, researchers and policy-makers need to pay more nuanced attention to the opportunities and constraints such individuals face. A cross-classification of the Office of Management and Budget’s metro system with a nuanced RUDC scheme is the most effective for revealing the geographical complexities of poverty within metropolitan areas.

  17. Alternating direction implicit methods for parabolic equations with a mixed derivative

    NASA Technical Reports Server (NTRS)

    Beam, R. M.; Warming, R. F.

    1980-01-01

    Alternating direction implicit (ADI) schemes for two-dimensional parabolic equations with a mixed derivative are constructed by using the class of all A(0)-stable linear two-step methods in conjunction with the method of approximate factorization. The mixed derivative is treated with an explicit two-step method which is compatible with an implicit A(0)-stable method. The parameter space for which the resulting ADI schemes are second-order accurate and unconditionally stable is determined. Some numerical examples are given.

  18. Alternating direction implicit methods for parabolic equations with a mixed derivative

    NASA Technical Reports Server (NTRS)

    Beam, R. M.; Warming, R. F.

    1979-01-01

    Alternating direction implicit (ADI) schemes for two-dimensional parabolic equations with a mixed derivative are constructed by using the class of all A(0)-stable linear two-step methods in conjunction with the method of approximate factorization. The mixed derivative is treated with an explicit two-step method which is compatible with an implicit A(0)-stable method. The parameter space for which the resulting ADI schemes are second-order accurate and unconditionally stable is determined. Some numerical examples are given.
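
    For orientation, a generic approximately factored step of this kind for $u_t = a\,u_{xx} + b\,u_{yy} + c\,u_{xy}$, with the mixed derivative handled explicitly, has the structure below. This is only a sketch of the approximate-factorization idea, not the specific A(0)-stable two-step family analyzed in the reports.

        \[
        \bigl(I - \theta\,\Delta t\,a\,\delta_{xx}\bigr)\bigl(I - \theta\,\Delta t\,b\,\delta_{yy}\bigr)\,u^{n+1}
          = \bigl[I + (1-\theta)\,\Delta t\,\bigl(a\,\delta_{xx} + b\,\delta_{yy}\bigr)\bigr]\,u^{n}
            + \Delta t\,c\,\delta_{xy}\,\tilde{u},
        \qquad \tilde{u} = 2u^{n} - u^{n-1},
        \]

    where $\delta_{xx}$, $\delta_{yy}$, $\delta_{xy}$ are second-difference operators and $\theta$ is the implicit weight. Each factor requires only one-dimensional (tridiagonal) solves; the factored left-hand side differs from the unfactored scheme by an $O(\Delta t^2)$ cross term, which is the approximate-factorization error, and the two-level extrapolation $\tilde{u}$ supplies the explicit two-step treatment of the mixed derivative.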

  19. Stationary Wavelet Transform and AdaBoost with SVM Based Pathological Brain Detection in MRI Scanning.

    PubMed

    Nayak, Deepak Ranjan; Dash, Ratnakar; Majhi, Banshidhar

    2017-01-01

    This paper presents an automatic classification system for segregating pathological brains from normal brains in magnetic resonance imaging scans. The proposed system employs a contrast-limited adaptive histogram equalization scheme to enhance the diseased region in brain MR images. A two-dimensional stationary wavelet transform (SWT) is harnessed to extract features from the preprocessed images. The feature vector is constructed using the energy and entropy values computed from the level-2 SWT coefficients. Then, the relevant and uncorrelated features are selected using a symmetric uncertainty ranking filter. Subsequently, the selected features are given as input to the proposed AdaBoost with support vector machine (SVM) classifier, where SVM is used as the base classifier of the AdaBoost algorithm. To validate the proposed system, three standard MR image datasets, Dataset-66, Dataset-160, and Dataset-255, have been utilized. Results from five runs of stratified k-fold cross validation indicate that the suggested scheme offers better performance than other existing schemes in terms of accuracy and number of features. The proposed system achieves perfect classification on Dataset-66 and Dataset-160, whereas for Dataset-255 an accuracy of 99.45% is achieved. Copyright © Bentham Science Publishers.
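
    As a schematic of the feature-plus-ensemble pipeline described above, the sketch below computes energy and entropy of level-2 stationary wavelet sub-bands with PyWavelets and feeds them to scikit-learn's AdaBoost with SVM base learners (scikit-learn >= 1.2 for the estimator keyword). The image size, wavelet, toy class offset and labels are assumptions, not the paper's datasets or settings.

        # Sketch of SWT energy/entropy features feeding AdaBoost with SVM base learners;
        # data, wavelet choice and sizes are illustrative assumptions.
        import numpy as np
        import pywt
        from sklearn.ensemble import AdaBoostClassifier
        from sklearn.svm import SVC

        def swt_features(image, wavelet="haar", level=2):
            """Energy and entropy of level-2 SWT sub-bands of one slice."""
            feats = []
            for approx, details in pywt.swt2(image, wavelet, level=level):
                for band in (approx, *details):
                    e = band.ravel() ** 2
                    feats.append(e.mean())                            # sub-band energy
                    p = e / (e.sum() + 1e-12)
                    feats.append(-(p * np.log(p + 1e-12)).sum())      # sub-band entropy
            return np.array(feats)

        rng = np.random.default_rng(0)
        labels = rng.integers(0, 2, size=40)                          # synthetic normal/pathological labels
        images = rng.normal(size=(40, 64, 64)) + labels[:, None, None] * 0.5   # separable toy slices

        X = np.array([swt_features(im) for im in images])
        clf = AdaBoostClassifier(estimator=SVC(kernel="rbf", probability=True),
                                 n_estimators=20, random_state=0).fit(X, labels)
        print("training accuracy (toy data):", clf.score(X, labels))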

  20. Nationwide classification of forest types of India using remote sensing and GIS.

    PubMed

    Reddy, C Sudhakar; Jha, C S; Diwakar, P G; Dadhwal, V K

    2015-12-01

    India, a mega-diverse country, possesses a wide range of climate and vegetation types along with a varied topography. The present study has classified forest types of India based on multi-season IRS Resourcesat-2 Advanced Wide Field Sensor (AWiFS) data. The study has characterized 29 land use/land cover classes including 14 forest types and seven scrub types. A hybrid classification approach has been used for the classification of forest types. The classification of vegetation has been carried out based on ecological rule bases, followed by Champion and Seth's (1968) scheme of forest types in India. The present classification scheme has been compared with the available global and national level land cover products. The natural vegetation cover was estimated to be 29.36% of the total geographical area of India. The predominant forest types of India are tropical dry deciduous and tropical moist deciduous. Of the total forest cover, tropical dry deciduous forests occupy an area of 217,713 km² (34.80%), followed by 207,649 km² (33.19%) under tropical moist deciduous forests, 48,295 km² (7.72%) under tropical semi-evergreen forests and 47,192 km² (7.54%) under tropical wet evergreen forests. The study has produced comprehensive vegetation cover and forest type maps based on inputs critical in defining the various categories of vegetation and forest types. This spatially explicit database will be highly useful for studies related to changes in various forest types, carbon stocks, climate-vegetation modeling and biogeochemical cycles.
