Sample records for objective classification scheme

  1. OBJECTIVE METEOROLOGICAL CLASSIFICATION SCHEME DESIGNED TO ELUCIDATE OZONE'S DEPENDENCE ON METEOROLOGY

    EPA Science Inventory

    This paper utilizes a two-stage clustering approach as part of an objective classification scheme designed to elucidate O3's dependence on meteorology. When applied to ten years (1981-1990) of meteorological data for Birmingham, Alabama, the classification scheme identified seven ...
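
    The abstract names a two-stage clustering approach without detailing it. As a hedged illustration of how such a scheme can group days into meteorological regimes, the sketch below seeds two cluster centres from the data and refines them with a plain k-means pass; the (temperature, wind) day records and the two-regime setup are invented for illustration, not taken from the paper.

```python
# Hypothetical sketch of a two-stage clustering of meteorological days:
# stage 1 seeds cluster centres from the data, stage 2 refines them with
# plain k-means. Features and regimes are invented for illustration.

def kmeans(points, centres, iters=20):
    """Plain k-means on lists of (temperature, wind) feature tuples."""
    for _ in range(iters):
        groups = [[] for _ in centres]
        for p in points:
            # assign each day to its nearest centre (squared distance)
            i = min(range(len(centres)),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centres[c])))
            groups[i].append(p)
        # recompute each centre as the mean of its assigned days
        centres = [
            tuple(sum(x) / len(g) for x in zip(*g)) if g else c
            for g, c in zip(groups, centres)
        ]
    return centres, groups

# Two synthetic "regimes": hot/stagnant days and cool/windy days.
days = [(30 + i % 3, 1 + i % 2) for i in range(10)] + \
       [(15 + i % 3, 8 + i % 2) for i in range(10)]
centres, groups = kmeans(days, centres=[days[0], days[-1]])
```

    With real station data, the first stage would more typically be a hierarchical clustering used to choose the number of regimes before the k-means refinement.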

  2. Classification schemes for knowledge translation interventions: a practical resource for researchers.

    PubMed

    Slaughter, Susan E; Zimmermann, Gabrielle L; Nuspl, Megan; Hanson, Heather M; Albrecht, Lauren; Esmail, Rosmin; Sauro, Khara; Newton, Amanda S; Donald, Maoliosa; Dyson, Michele P; Thomson, Denise; Hartling, Lisa

    2017-12-06

    As implementation science advances, the number of interventions to promote the translation of evidence into healthcare, health systems, or health policy is growing. Accordingly, classification schemes for these knowledge translation (KT) interventions have emerged. A recent scoping review identified 51 classification schemes of KT interventions to integrate evidence into healthcare practice; however, the review did not evaluate the quality of the classification schemes or provide detailed information to assist researchers in selecting a scheme for their context and purpose. This study aimed to further examine and assess the quality of these classification schemes of KT interventions, and provide information to aid researchers when selecting a classification scheme. We abstracted the following information from each of the original 51 classification scheme articles: authors' objectives; purpose of the scheme and field of application; socioecologic level (individual, organizational, community, system); adaptability (broad versus specific); target group (patients, providers, policy-makers); intent (policy, education, practice); and purpose (dissemination versus implementation). Two reviewers independently evaluated the methodological quality of the development of each classification scheme using an adapted version of the AGREE II tool. Based on these assessments, two independent reviewers reached consensus about whether or not to recommend each scheme for researcher use. Of the 51 original classification schemes, we excluded seven that were not specific classification schemes, were not accessible, or were duplicates. Of the remaining 44 classification schemes, nine were not recommended. Of the 35 recommended classification schemes, ten focused on behaviour change and six focused on population health. Many schemes (n = 29) addressed practice considerations. Fewer schemes addressed educational or policy objectives. 
Twenty-five classification schemes had broad applicability, six were specific, and four had elements of both. Twenty-three schemes targeted health providers, nine targeted both patients and providers and one targeted policy-makers. Most classification schemes were intended for implementation rather than dissemination. Thirty-five classification schemes of KT interventions were developed and reported with sufficient rigour to be recommended for use by researchers interested in KT in healthcare. Our additional categorization and quality analysis will aid in selecting suitable classification schemes for research initiatives in the field of implementation science.

  3. The search for structure - Object classification in large data sets [for astronomers]

    NASA Technical Reports Server (NTRS)

    Kurtz, Michael J.

    1988-01-01

    Research concerning object classification schemes is reviewed, focusing on large data sets. Classification techniques are discussed, including syntactic, decision theoretic methods, fuzzy techniques, and stochastic and fuzzy grammars. Consideration is given to the automation of MK classification (Morgan and Keenan, 1973) and other problems associated with the classification of spectra. In addition, the classification of galaxies is examined, including the problems of systematic errors, blended objects, galaxy types, and galaxy clusters.

  4. What's in a Name? A Comparison of Methods for Classifying Predominant Type of Maltreatment

    ERIC Educational Resources Information Center

    Lau, A.S.; Leeb, R.T.; English, D.; Graham, J.C.; Briggs, E.C.; Brody, K.E.; Marshall, J.M.

    2005-01-01

    Objective: The primary aim of the study was to identify a classification scheme for determining the predominant type of maltreatment in a child's history that best predicts differences in developmental outcomes. Method: Three different predominant-type classification schemes were examined in a sample of 519 children with a history of alleged…

  5. MeMoVolc report on classification and dynamics of volcanic explosive eruptions

    NASA Astrophysics Data System (ADS)

    Bonadonna, C.; Cioni, R.; Costa, A.; Druitt, T.; Phillips, J.; Pioli, L.; Andronico, D.; Harris, A.; Scollo, S.; Bachmann, O.; Bagheri, G.; Biass, S.; Brogi, F.; Cashman, K.; Dominguez, L.; Dürig, T.; Galland, O.; Giordano, G.; Gudmundsson, M.; Hort, M.; Höskuldsson, A.; Houghton, B.; Komorowski, J. C.; Küppers, U.; Lacanna, G.; Le Pennec, J. L.; Macedonio, G.; Manga, M.; Manzella, I.; Vitturi, M. de'Michieli; Neri, A.; Pistolesi, M.; Polacci, M.; Ripepe, M.; Rossi, E.; Scheu, B.; Sulpizio, R.; Tripoli, B.; Valade, S.; Valentine, G.; Vidal, C.; Wallenstein, N.

    2016-11-01

    Classifications of volcanic eruptions were first introduced in the early twentieth century mostly based on qualitative observations of eruptive activity, and over time, they have gradually been developed to incorporate more quantitative descriptions of the eruptive products from both deposits and observations of active volcanoes. Progress in physical volcanology, and increased capability in monitoring, measuring and modelling of explosive eruptions, have highlighted shortcomings in the way we classify eruptions and triggered a debate around the need for eruption classification and the advantages and disadvantages of existing classification schemes. Here, we (i) review and assess existing classification schemes, focussing on subaerial eruptions; (ii) summarize the fundamental processes that drive and parameters that characterize explosive volcanism; (iii) identify and prioritize the main research that will improve the understanding, characterization and classification of volcanic eruptions and (iv) provide a roadmap for producing a rational and comprehensive classification scheme. In particular, classification schemes need to be objective-driven and simple enough to permit scientific exchange and promote transfer of knowledge beyond the scientific community. Schemes should be comprehensive and encompass a variety of products, eruptive styles and processes, including, for example, lava flows, pyroclastic density currents, gas emissions and cinder cone or caldera formation. Open questions, processes and parameters that need to be addressed and better characterized in order to develop more comprehensive classification schemes and to advance our understanding of volcanic eruptions include conduit processes and dynamics, abrupt transitions in eruption regime, unsteadiness, eruption energy and energy balance.

  6. Physiotherapy movement based classification approaches to low back pain: comparison of subgroups through review and developer/expert survey.

    PubMed

    Karayannis, Nicholas V; Jull, Gwendolen A; Hodges, Paul W

    2012-02-20

    Several classification schemes, each with its own philosophy and categorizing method, subgroup low back pain (LBP) patients with the intent to guide treatment. Physiotherapy-derived schemes usually have a movement impairment focus, but the extent to which other biological, psychological, and social factors of pain are encompassed requires exploration. Furthermore, within the prevailing 'biological' domain, the overlap of subgrouping strategies within the orthopaedic examination remains unexplored. The aim of this study was to review and clarify, through developer/expert survey, the theoretical basis and content of physical movement classification schemes, determine their relative reliability and similarities/differences, and consider the extent of incorporation of the bio-psycho-social framework within the schemes. A database search for relevant articles related to LBP and subgrouping or classification was conducted. Five dominant movement-based schemes were identified: Mechanical Diagnosis and Treatment (MDT), Treatment Based Classification (TBC), Pathoanatomic Based Classification (PBC), Movement System Impairment Classification (MSI), and O'Sullivan Classification System (OCS) schemes. Data were extracted and a survey sent to the classification scheme developers/experts to clarify operational criteria, reliability, decision-making, and converging/diverging elements between schemes. Survey results were integrated into the review and approval obtained for accuracy. Considerable diversity exists between schemes in how movement informs subgrouping and in the consideration of broader neurosensory, cognitive, emotional, and behavioural dimensions of LBP. Despite differences in assessment philosophy, a common element lies in their objective to identify a movement pattern related to a pain reduction strategy. 
Two dominant movement paradigms emerge: (i) loading strategies (MDT, TBC, PBC) aimed at eliciting a phenomenon of centralisation of symptoms; and (ii) modified movement strategies (MSI, OCS) targeted towards documenting the movement impairments associated with the pain state. Schemes vary in the extent to which loading strategies are pursued, the assessment of movement dysfunction, and advocated treatment approaches. A biomechanical assessment predominates in the majority of schemes (MDT, PBC, MSI); certain psychosocial aspects (fear-avoidance) are considered in the TBC scheme; and certain neurophysiologic (central versus peripherally mediated pain states) and psychosocial (cognitive and behavioural) aspects are considered in the OCS scheme.

  7. Human Factors Engineering. Student Supplement,

    DTIC Science & Technology

    1981-08-01

    a job TASK TAXONOMY A classification scheme for the different levels of activities in a system, i.e., job - task - sub-task, etc. TASK ANALYSIS ... with the classification of learning objectives by learning category so as to identify ... Phase III guidelines necessary for optimum learning ... the sequencing of all dependent tasks ... the classification of learning objectives by learning category and the identification of

  8. Underwater target classification using wavelet packets and neural networks.

    PubMed

    Azimi-Sadjadi, M R; Yao, D; Huang, Q; Dobeck, G J

    2000-01-01

    In this paper, a new subband-based classification scheme is developed for classifying underwater mines and mine-like targets from the acoustic backscattered signals. The system consists of a feature extractor using wavelet packets in conjunction with linear predictive coding (LPC), a feature selection scheme, and a backpropagation neural-network classifier. The data set used for this study consists of the backscattered signals from six different objects: two mine-like targets and four nontargets for several aspect angles. Simulation results on ten different noisy realizations and for signal-to-noise ratio (SNR) of 12 dB are presented. The receiver operating characteristic (ROC) curve of the classifier generated based on these results demonstrated excellent classification performance of the system. The generalization ability of the trained network was demonstrated by computing the error and classification rate statistics on a large data set. A multiaspect fusion scheme was also adopted in order to further improve the classification performance.
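
    As a rough sketch of the subband idea behind this classifier, the code below splits a signal into low/high subbands with a one-level Haar transform, uses subband energies as features, and classifies with a nearest-mean rule. The Haar split and nearest-mean rule are deliberate simplifications standing in for the paper's wavelet packets, LPC features, and backpropagation network; the toy "echo" signals are invented.

```python
# Simplified subband-energy classifier: Haar subbands stand in for
# wavelet packets, nearest-mean stands in for the neural network.

def haar_subbands(sig):
    """One-level Haar transform: pairwise averages (low) and differences (high)."""
    lo = [(a + b) / 2 for a, b in zip(sig[::2], sig[1::2])]
    hi = [(a - b) / 2 for a, b in zip(sig[::2], sig[1::2])]
    return lo, hi

def features(sig):
    """Energy in each subband as a 2-D feature vector."""
    lo, hi = haar_subbands(sig)
    return (sum(x * x for x in lo), sum(x * x for x in hi))

def nearest_mean(feat, means):
    """Assign the label whose class-mean feature vector is closest."""
    return min(means, key=lambda k: sum((a - b) ** 2
                                        for a, b in zip(feat, means[k])))

# Toy "echoes": smooth return (target-like) vs. oscillating return (clutter).
smooth = [1.0] * 8
ripple = [1.0, -1.0] * 4
means = {"target": features(smooth), "clutter": features(ripple)}
label = nearest_mean(features([0.9] * 8), means)
```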

  9. The impact of catchment source group classification on the accuracy of sediment fingerprinting outputs.

    PubMed

    Pulley, Simon; Foster, Ian; Collins, Adrian L

    2017-06-01

    The objective classification of sediment source groups is at present an under-investigated aspect of source tracing studies, which has the potential to statistically improve discrimination between sediment sources and reduce uncertainty. This paper investigates this potential using three different source group classification schemes. The first classification scheme was simple surface and subsurface groupings (Scheme 1). The tracer signatures were then used in a two-step cluster analysis to identify the sediment source groupings naturally defined by the tracer signatures (Scheme 2). The cluster source groups were then modified by splitting each one into a surface and subsurface component to suit catchment management goals (Scheme 3). The schemes were tested using artificial mixtures of sediment source samples. Controlled corruptions were made to some of the mixtures to mimic the potential causes of tracer non-conservatism present when using tracers in natural fluvial environments. It was determined how accurately the known proportions of sediment sources in the mixtures were identified after unmixing modelling using the three classification schemes. The cluster analysis derived source groups (2) significantly increased tracer variability ratios (inter-/intra-source group variability) (up to 2122%, median 194%) compared to the surface and subsurface groupings (1). As a result, the composition of the artificial mixtures was identified an average of 9.8% more accurately on the 0-100% contribution scale. It was found that the cluster groups could be reclassified into a surface and subsurface component (3) with no significant increase in composite uncertainty (a 0.1% increase over Scheme 2). The far smaller effects of simulated tracer non-conservatism for the cluster analysis based schemes (2 and 3) were primarily attributed to the increased inter-group variability producing a far larger sediment source signal than the non-conservatism noise (1). 
Modified cluster analysis based classification methods have the potential to reduce composite uncertainty significantly in future source tracing studies.
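
    The inter-/intra-source variability ratio that drives the comparison above can be illustrated with a toy computation. The statistic below (between-group range of group means over mean within-group range) is a simplified stand-in for whatever exact ratio the paper uses, and the tracer numbers are invented.

```python
# Illustrative tracer variability ratio: how well separated the source
# groups are relative to their internal spread. Higher is better for
# unmixing. Numbers are invented; the paper's exact statistic may differ.

def mean(xs):
    return sum(xs) / len(xs)

def variability_ratio(groups):
    """groups: list of lists of tracer concentrations, one list per source."""
    group_means = [mean(g) for g in groups]
    inter = max(group_means) - min(group_means)      # between-group spread
    intra = mean([max(g) - min(g) for g in groups])  # mean within-group spread
    return inter / intra

surface_subsurface = [[10, 30, 50], [20, 40, 60]]  # weakly separated groups
cluster_groups = [[10, 12, 14], [48, 50, 52]]      # well separated groups
```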

  10. Sunspot Pattern Classification using PCA and Neural Networks (Poster)

    NASA Technical Reports Server (NTRS)

    Rajkumar, T.; Thompson, D. E.; Slater, G. L.

    2005-01-01

    The sunspot classification scheme presented in this paper is treated as a 2-D classification problem on archived datasets, and is not a real-time system. As a first step, it mirrors the Zuerich/McIntosh historical classification system and reproduces classification of sunspot patterns based on preprocessing and neural net training datasets. Ultimately, the project intends to move from more rudimentary schemes to develop spatial-temporal-spectral classes derived by correlating spatial and temporal variations in various wavelengths to the brightness fluctuation spectrum of the sun in those wavelengths. Once the approach is generalized, the focus will naturally move from a 2-D to an n-D classification, where "n" includes time and frequency. Here, the 2-D perspective refers both to the actual SOHO Michelson Doppler Imager (MDI) images that are processed and to the fact that a 2-D matrix is created from each image during preprocessing. The 2-D matrix is the result of running Principal Component Analysis (PCA) over the selected dataset images, and the resulting matrices and their eigenvalues are the objects that are stored in a database, classified, and compared. These matrices are indexed according to the standard McIntosh classification scheme.
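
    The PCA preprocessing step can be sketched in a few lines: flatten each image to a vector and extract the top principal component by power iteration. This is a minimal stand-in for the paper's full PCA over MDI images; the 2x2 "images" below are invented.

```python
# Minimal PCA sketch: find the top principal component of a set of
# flattened images via power iteration on the (implicit) covariance.

def top_component(vectors, iters=50):
    n = len(vectors[0])
    means = [sum(v[i] for v in vectors) / len(vectors) for i in range(n)]
    centred = [[v[i] - means[i] for i in range(n)] for v in vectors]
    w = [1.0] * n
    for _ in range(iters):
        # multiply w by the covariance matrix X^T X implicitly:
        # project each centred vector onto w, then re-accumulate
        proj = [sum(c[i] * w[i] for i in range(n)) for c in centred]
        w = [sum(p * c[i] for p, c in zip(proj, centred)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        w = [x / norm for x in w]
    return w

# Toy 2x2 "images" (flattened) varying only along the first two pixels.
imgs = [[0, 0, 1, 1], [2, 2, 1, 1], [4, 4, 1, 1], [6, 6, 1, 1]]
pc1 = top_component(imgs)
```

    With real data, the stored matrix for each image would collect its projections onto the leading components rather than just the first.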

  11. Taxonomy and Classification Scheme for Artificial Space Objects

    DTIC Science & Technology

    2013-09-01

    filter (UBV) and spectroscopic measurements) and albedo (including polarimetry). Earliest classifications of asteroids [17] were based on the filter... similarities of the asteroid colors to K0 to K2V stars. The first more complete asteroid taxonomy was based on a synthesis of polarimetry, radiometry, and

  12. Classification of basic facilities for high-rise residential: A survey from 100 housing scheme in Kajang area

    NASA Astrophysics Data System (ADS)

    Ani, Adi Irfan Che; Sairi, Ahmad; Tawil, Norngainy Mohd; Wahab, Siti Rashidah Hanum Abd; Razak, Muhd Zulhanif Abd

    2016-08-01

    High demand for housing and limited land in town areas have increased the provision of high-rise residential schemes. This type of housing has different owners but shares the same land lot and common facilities. Thus, maintenance works for the buildings and common facilities must be well organized. The purpose of this paper is to identify and classify basic facilities for high-rise residential buildings, in the hope of improving the management of such schemes. The method adopted is a survey of 100 high-rise residential schemes, ranging from affordable to high-cost housing, selected by snowball sampling. The scope of this research is the Kajang area, which is being rapidly developed with high-rise housing. The objective of the survey is to list all facilities in every sampled scheme. The results confirmed that the 11 pre-determined classifications hold true and can provide a realistic classification for high-rise residential schemes. This paper proposes a redefinition of the facilities provided, to create a better management system and give a clear definition of the type of high-rise residential building based on its facilities.

  13. A color-coded vision scheme for robotics

    NASA Technical Reports Server (NTRS)

    Johnson, Kelley Tina

    1991-01-01

    Most vision systems for robotic applications rely entirely on the extraction of information from gray-level images. Humans, however, regularly depend on color to discriminate between objects. Therefore, the inclusion of color in a robot vision system seems a natural extension of the existing gray-level capabilities. A method for robot object recognition using a color-coding classification scheme is discussed. The scheme is based on an algebraic system in which a two-dimensional color image is represented as a polynomial of two variables. The system is then used to find the color contour of objects. In a controlled environment, such as that of the in-orbit space station, a particular class of objects can thus be quickly recognized by its color.

  14. Cross-ontological analytics for alignment of different classification schemes

    DOEpatents

    Posse, Christian; Sanfilippo, Antonio P; Gopalan, Banu; Riensche, Roderick M; Baddeley, Robert L

    2010-09-28

    Quantification of the similarity between nodes in multiple electronic classification schemes is provided by automatically identifying relationships and similarities between nodes within and across the electronic classification schemes. Quantifying the similarity between a first node in a first electronic classification scheme and a second node in a second electronic classification scheme involves finding a third node in the first electronic classification scheme, wherein a first product value of an inter-scheme similarity value between the second and third nodes and an intra-scheme similarity value between the first and third nodes is a maximum. A fourth node in the second electronic classification scheme can be found, wherein a second product value of an inter-scheme similarity value between the first and fourth nodes and an intra-scheme similarity value between the second and fourth nodes is a maximum. The maximum between the first and second product values represents a measure of similarity between the first and second nodes.
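
    The patent's similarity measure can be transcribed almost directly from the abstract: the similarity of node n1 (scheme A) to node n2 (scheme B) is the larger of two max-product searches over "bridge" nodes, one in each scheme. The similarity tables below are illustrative numbers, not values from the patent.

```python
# Cross-scheme node similarity as described in the abstract: take the
# best bridge node in each scheme, multiplying inter-scheme similarity
# by intra-scheme similarity, and keep the larger of the two maxima.

def cross_similarity(n1, n2, scheme_a, scheme_b, intra, inter):
    # best bridge n3 in scheme A: inter(n3, n2) * intra(n1, n3)
    p1 = max(inter(n3, n2) * intra(n1, n3) for n3 in scheme_a)
    # best bridge n4 in scheme B: inter(n1, n4) * intra(n2, n4)
    p2 = max(inter(n1, n4) * intra(n2, n4) for n4 in scheme_b)
    return max(p1, p2)

# Toy schemes with two nodes each; similarity scores are made up.
A, B = ["a1", "a2"], ["b1", "b2"]
INTRA = {("a1", "a1"): 1.0, ("a1", "a2"): 0.5, ("a2", "a2"): 1.0,
         ("b1", "b1"): 1.0, ("b1", "b2"): 0.2, ("b2", "b2"): 1.0}
INTER = {("a1", "b1"): 0.9, ("a1", "b2"): 0.1,
         ("a2", "b1"): 0.3, ("a2", "b2"): 0.8}

def intra(x, y):
    return INTRA.get((x, y), INTRA.get((y, x)))

def inter(x, y):
    return INTER.get((x, y), INTER.get((y, x)))

sim = cross_similarity("a1", "b1", A, B, intra, inter)
```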

  15. Nosology, ontology and promiscuous realism.

    PubMed

    Binney, Nicholas

    2015-06-01

    Medics may consider worrying about their metaphysics and ontology to be a waste of time. I will argue here that this is not the case. Promiscuous realism is a metaphysical position which holds that multiple, equally valid, classification schemes should be applied to objects (such as patients) to capture different aspects of their complex and heterogeneous nature. As medics at the bedside may need to capture different aspects of their patients' problems, they may need to use multiple classification schemes (multiple nosologies), and thus consider adopting a different metaphysics to the one commonly in use.

  16. A Rapid Approach to Modeling Species-Habitat Relationships

    NASA Technical Reports Server (NTRS)

    Carter, Geoffrey M.; Breinger, David R.; Stolen, Eric D.

    2005-01-01

    A growing number of species require conservation or management efforts. Success of these activities requires knowledge of the species' occurrence pattern. Species-habitat models developed from GIS data sources are commonly used to predict species occurrence, but those data sources are often developed for purposes other than predicting species occurrence and are of inappropriate scale, and the techniques used to extract predictor variables are often time consuming, cannot be repeated easily, and thus cannot efficiently reflect changing conditions. We used digital orthophotographs and a grid cell classification scheme to develop an efficient technique to extract predictor variables. We combined our classification scheme with a priori hypothesis development using expert knowledge and a previously published habitat suitability index, and used an objective model selection procedure to choose candidate models. We were able to classify a large area (57,000 ha) in a fraction of the time that would be required to map vegetation and were able to test models at varying scales using a windowing process. Interpretation of the selected models confirmed existing knowledge of factors important to Florida scrub-jay habitat occupancy. The potential uses and advantages of using a grid cell classification scheme in conjunction with expert knowledge or a habitat suitability index (HSI) and an objective model selection procedure are discussed.

  17. 15 CFR 921.3 - National Estuarine Research Reserve System biogeographic classification scheme and estuarine...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... System biogeographic classification scheme and estuarine typologies. 921.3 Section 921.3 Commerce and... biogeographic classification scheme and estuarine typologies. (a) National Estuarine Research Reserves are... classification scheme based on regional variations in the nation's coastal zone has been developed. The...

  18. 15 CFR 921.3 - National Estuarine Research Reserve System biogeographic classification scheme and estuarine...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... System biogeographic classification scheme and estuarine typologies. 921.3 Section 921.3 Commerce and... biogeographic classification scheme and estuarine typologies. (a) National Estuarine Research Reserves are... classification scheme based on regional variations in the nation's coastal zone has been developed. The...

  19. 15 CFR 921.3 - National Estuarine Research Reserve System biogeographic classification scheme and estuarine...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... System biogeographic classification scheme and estuarine typologies. 921.3 Section 921.3 Commerce and... biogeographic classification scheme and estuarine typologies. (a) National Estuarine Research Reserves are... classification scheme based on regional variations in the nation's coastal zone has been developed. The...

  20. 15 CFR 921.3 - National Estuarine Research Reserve System biogeographic classification scheme and estuarine...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... System biogeographic classification scheme and estuarine typologies. 921.3 Section 921.3 Commerce and... biogeographic classification scheme and estuarine typologies. (a) National Estuarine Research Reserves are... classification scheme based on regional variations in the nation's coastal zone has been developed. The...

  21. 15 CFR 921.3 - National Estuarine Research Reserve System biogeographic classification scheme and estuarine...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... System biogeographic classification scheme and estuarine typologies. 921.3 Section 921.3 Commerce and... biogeographic classification scheme and estuarine typologies. (a) National Estuarine Research Reserves are... classification scheme based on regional variations in the nation's coastal zone has been developed. The...

  22. A Computerized English-Spanish Correlation Index to Five Biomedical Library Classification Schemes Based on MeSH*

    PubMed Central

    Muench, Eugene V.

    1971-01-01

    A computerized English/Spanish correlation index to five biomedical library classification schemes and computerized English/Spanish, Spanish/English listings of MeSH are described. The index was accomplished by supplying appropriate classification numbers of five classification schemes (National Library of Medicine; Library of Congress; Dewey Decimal; Cunningham; Boston Medical) to MeSH and a Spanish translation of MeSH. The data were keypunched, merged on magnetic tape, and sorted in a computer alphabetically by English and Spanish subject headings and sequentially by classification number. Some benefits and uses of the index are: a complete index to classification schemes based on MeSH terms; a tool for conversion of classification numbers when reclassifying collections; a Spanish index and a crude Spanish translation of five classification schemes; a data base for future applications, e.g., automatic classification. Other classification schemes, such as the UDC, and translations of MeSH into other languages can be added. PMID:5172471

  23. Object links in the repository

    NASA Technical Reports Server (NTRS)

    Beck, Jon; Eichmann, David

    1991-01-01

    Some of the architectural ramifications of extending the Eichmann/Atkins lattice-based classification scheme to encompass the assets of the full life-cycle of software development are explored. In particular, we wish to consider a model which provides explicit links between objects in addition to the edges connecting classification vertices in the standard lattice. The model we consider uses object-oriented terminology. Thus, the lattice is viewed as a data structure which contains class objects which exhibit inheritance. A description of the types of objects in the repository is presented, followed by a discussion of how they interrelate. We discuss features of the object-oriented model which support these objects and their links, and consider behavior which an implementation of the model should exhibit. Finally, we indicate some thoughts on implementing a prototype of this repository architecture.

  24. Improved biliary detection and diagnosis through intelligent machine analysis.

    PubMed

    Logeswaran, Rajasvaran

    2012-09-01

    This paper reports on work undertaken to improve automated detection of bile ducts in magnetic resonance cholangiopancreatography (MRCP) images, with the objective of conducting preliminary classification of the images for diagnosis. The proposed I-BDeDIMA (Improved Biliary Detection and Diagnosis through Intelligent Machine Analysis) scheme is a multi-stage framework consisting of successive phases of image normalization, denoising, structure identification, object labeling, feature selection and disease classification. A combination of multiresolution wavelet, dynamic intensity thresholding, segment-based region growing, region elimination, statistical analysis and neural networks, is used in this framework to achieve good structure detection and preliminary diagnosis. Tests conducted on over 200 clinical images with known diagnosis have shown promising results of over 90% accuracy. The scheme outperforms related work in the literature, making it a viable framework for computer-aided diagnosis of biliary diseases.

  25. A learning scheme for reach to grasp movements: on EMG-based interfaces using task specific motion decoding models.

    PubMed

    Liarokapis, Minas V; Artemiadis, Panagiotis K; Kyriakopoulos, Kostas J; Manolakos, Elias S

    2013-09-01

    A learning scheme based on random forests is used to discriminate between different reach to grasp movements in 3-D space, based on the myoelectric activity of human muscles of the upper-arm and the forearm. Task specificity for motion decoding is introduced at two different levels: the subspace to move toward and the object to be grasped. The discrimination between the different reach to grasp strategies is accomplished with machine learning techniques for classification. The classification decision is then used to trigger an EMG-based task-specific motion decoding model. Task-specific models manage to outperform "general" models, providing better estimation accuracy. Thus, the proposed scheme takes advantage of a framework incorporating both a classifier and a regressor that cooperate advantageously in order to split the task space. The proposed learning scheme can easily be applied to a series of EMG-based interfaces that must operate in real time, providing data-driven capabilities for the multiclass problems that occur in complex, everyday-life environments.
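
    The scheme's key idea (classify the task first, then hand the signal to a decoder trained for that task) can be illustrated minimally as follows. The threshold classifier and linear "decoders" below are toy stand-ins for the paper's random forest and task-specific EMG decoding models; all names and numbers are invented.

```python
# Classify-then-decode sketch: a classifier picks the task, which
# triggers a task-specific decoding model for the continuous estimate.

def classify_task(emg_energy):
    # stand-in classifier: a simple threshold instead of a random forest
    return "precision_grasp" if emg_energy < 5.0 else "power_grasp"

DECODERS = {
    # task-specific "motion decoders": toy linear maps from EMG energy
    # to a motion estimate (e.g. grip aperture)
    "precision_grasp": lambda e: 0.5 * e,
    "power_grasp": lambda e: 1.0 * e + 2.0,
}

def decode(emg_energy):
    task = classify_task(emg_energy)          # discrete classification
    return task, DECODERS[task](emg_energy)   # task-specific regression

task, aperture = decode(8.0)
```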

  26. Diagnostic classification scheme in Iranian breast cancer patients using a decision tree.

    PubMed

    Malehi, Amal Saki

    2014-01-01

    The objective of this study was to determine a diagnostic classification scheme using a decision tree based model. The study was conducted as a retrospective case-control study in Imam Khomeini hospital in Tehran during 2001 to 2009. Data, including demographic and clinical-pathological characteristics, were uniformly collected from 624 females: 312 of them referred with a positive diagnosis of breast cancer (cases) and 312 healthy women (controls). The decision tree was implemented to develop a diagnostic classification scheme using CART 6.0 software. The AUC (area under the curve) was measured as the overall performance of the diagnostic classification of the decision tree. Five variables were identified as main risk factors of breast cancer, and six subgroups as high risk. The results indicated that increasing age, low age at menarche, single or divorced status, irregular menarche pattern and family history of breast cancer are the important diagnostic factors in Iranian breast cancer patients. The sensitivity and specificity of the analysis were 66% and 86.9% respectively. The high AUC (0.82) also showed an excellent classification and diagnostic performance of the model. A decision tree based model appears to be suitable for identifying risk factors and high or low risk subgroups. It can also assist clinicians in making a decision, since it can identify underlying prognostic relationships, and the model is very explicit and easy to understand.
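
    The reported sensitivity and specificity can be illustrated by computing them for a toy decision rule. The two-split stump below (hypothetical age and family-history splits) is not the paper's fitted CART tree, and the records are invented; it only shows how the two statistics fall out of a tree's predictions on labelled cases and controls.

```python
# Toy decision rule plus sensitivity/specificity computation.
# The splits and data are invented; only the mechanics match the paper.

def predict(age, family_history):
    # hypothetical stump: flag older age, otherwise fall back on history
    if age >= 50:
        return 1  # predict case
    return 1 if family_history else 0

def sens_spec(records):
    """records: (age, family_history, label) with label 1=case, 0=control."""
    tp = sum(1 for a, f, y in records if y == 1 and predict(a, f) == 1)
    fn = sum(1 for a, f, y in records if y == 1 and predict(a, f) == 0)
    tn = sum(1 for a, f, y in records if y == 0 and predict(a, f) == 0)
    fp = sum(1 for a, f, y in records if y == 0 and predict(a, f) == 1)
    return tp / (tp + fn), tn / (tn + fp)

data = [(60, False, 1), (55, True, 1), (40, True, 1), (35, False, 1),
        (30, False, 0), (45, False, 0), (52, False, 0), (38, True, 0)]
sensitivity, specificity = sens_spec(data)
```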

  27. A new classification of glaucomas

    PubMed Central

    Bordeianu, Constantin-Dan

    2014-01-01

    Purpose To suggest a new glaucoma classification that is pathogenic, etiologic, and clinical. Methods After discussing the logical pathway used in criteria selection, the paper presents the new classification and compares it with the classification currently in use, that is, the one issued by the European Glaucoma Society in 2008. Results The paper proves that the new classification is clear (being based on a coherent and consistently followed set of criteria), is comprehensive (framing all forms of glaucoma), and helps in understanding the disease (in that it uses a logical framing system). The great advantage is that it facilitates therapeutic decision making, in that it offers direct therapeutic suggestions and avoids errors leading to disasters. Moreover, the scheme remains open to any new development. Conclusion The suggested classification is a pathogenic, etiologic, and clinical classification that fulfills the conditions of an ideal classification. The suggested classification is the first classification in which the main criterion is consistently used for the first 5 to 7 crossings until its differentiation capabilities are exhausted. Then, secondary criteria (etiologic and clinical) pick up the relay until each form finds its logical place in the scheme. In order to avoid unclear aspects, the genetic criterion is no longer used, being replaced by age, one of the clinical criteria. The suggested classification brings only benefits to all categories of ophthalmologists: the beginners will have a tool to better understand the disease and to ease their decision making, whereas the experienced doctors will have their practice simplified. For all doctors, errors leading to therapeutic disasters will be less likely to happen. 
Finally, researchers will have the object of their work gathered in the group of glaucoma with unknown or uncertain pathogenesis, whereas the results of their work will easily find a logical place in the scheme, as the suggested classification remains open to any new development. PMID:25246759

  8. Circulation Type Classifications and their nexus to Van Bebber's storm track Vb

    NASA Astrophysics Data System (ADS)

    Hofstätter, M.; Chimani, B.

    2012-04-01

Circulation Type Classifications (CTCs) are tools to identify repetitive and predominantly stationary patterns of the atmospheric circulation over a certain area, with the purpose of enabling the recognition of specific characteristics in surface climate variables. Storm tracks, on the other hand, can be used to identify similar types of synoptic events from a non-stationary, kinematic perspective. Such a storm track classification for Europe was compiled in the late 19th century by Van Bebber (1882, 1891), of which the famous types Vb and Vc/d remain in use to the present day because of their association with major flooding events, such as the August 2002 floods in Europe. In this work a systematic tracking procedure has been developed to determine storm track types and their characteristics, especially for the Eastern Alpine Region, in the period 1961-2002, using the ERA40 and ERA-Interim reanalyses. The focus is on cyclone tracks of type V, as suggested by Van Bebber, and congeneric types. This new catalogue is used as a reference to verify the hypothesis of a certain coherence of storm track Vb with certain circulation types (e.g. Fricke and Kaminski, 2002). Selected objective and subjective classification schemes from the COST733 action (http://cost733.met.no/, Philipp et al. 2010) are used for this purpose, as well as the manual classification from ZAMG (Lauscher 1972 and 1985), in which storm track Vb has been classified explicitly on a daily basis since 1948. The latter scheme proves to be a valuable and unique data source for this question. Results show that no fewer than 146 storm tracks are identified as Vb between 1961 and 2002, whereas only three events could be found in the literature, pointing to considerable subjectivity and preconception regarding Vb storm tracks. The annual number of Vb storm tracks does not show any significant trend over the last 42 years, but varies strongly from year to year. 
The circulation type classification CAP27 (Cluster Analysis of Principal Components) is the best performing fully objective scheme tested herein, showing the power to discriminate Vb events; most of the other fully objective schemes perform far less well. The greatest skill is shown by the subjective/manual CTCs, which capture relevant synoptic phenomena rather than emphasizing mathematical criteria in the classification. The hypothesis of Fricke and Kaminski can be supported by this work: Vb storm tracks are included in one or another stationary circulation pattern, but to what extent depends on the specific characteristics of the CTC in question.

  9. Computer classification of remotely sensed multispectral image data by extraction and classification of homogeneous objects

    NASA Technical Reports Server (NTRS)

    Kettig, R. L.

    1975-01-01

A method of classification of digitized multispectral images is developed and experimentally evaluated on actual earth resources data collected by aircraft and satellite. The method is designed to exploit the characteristic dependence between adjacent states of nature that is neglected by the more conventional simple-symmetric decision rule. Thus contextual information is incorporated into the classification scheme. The principal reason for doing this is to improve the accuracy of the classification. For general types of dependence this would generally require more computation per resolution element than the simple-symmetric classifier. But when the dependence occurs in the form of redundancy, the elements can be classified collectively, in groups, thereby reducing the number of classifications required.

  10. A Classification Scheme for Analyzing Mobile Apps Used to Prevent and Manage Disease in Late Life

    PubMed Central

    Wang, Aiguo; Lu, Xin; Chen, Hongtu; Li, Changqun; Levkoff, Sue

    2014-01-01

Background There are several mobile apps that offer tools for disease prevention and management among older adults, and promote health behaviors that could potentially reduce or delay the onset of disease. A classification scheme that categorizes apps could be useful to both older adult app users and app developers. Objective The objective of our study was to build and evaluate the effectiveness of a classification scheme that classifies mobile apps available for older adults in the “Health & Fitness” category of the iTunes App Store. Methods We constructed a classification scheme for mobile apps according to three dimensions: (1) the Precede-Proceed Model (PPM), which classifies mobile apps in terms of predisposing, enabling, and reinforcing factors for behavior change; (2) health care process, specifically prevention versus management of disease; and (3) health conditions, including physical health and mental health. Content analysis was conducted by the research team on health and fitness apps designed specifically for older adults, as well as those applicable to older adults, released in June 2011, August 2011, and August 2012. Face validity was assessed by a different group of individuals, who were not related to the study. A reliability analysis was conducted to confirm the accuracy of the coding scheme of the sample apps in this study. Results After applying sample inclusion and exclusion criteria, a total of 119 apps were included in the study sample, of which 26/119 (21.8%) were released in June 2011, 45/119 (37.8%) in August 2011, and 48/119 (40.3%) in August 2012. Face validity was determined by interviewing 11 people, who agreed that the scheme accurately reflected the nature of these applications. The entire study sample was successfully coded, demonstrating satisfactory inter-rater reliability by two independent coders (95.8% initial concordance and 100% concordance after consensus was reached). 
The apps included in the study sample were more likely to be used for the management of disease than prevention of disease (109/119, 91.6% vs 15/119, 12.6%). More apps contributed to physical health rather than mental health (81/119, 68.1% vs 47/119, 39.5%). Enabling apps (114/119, 95.8%) were more common than reinforcing (20/119, 16.8%) or predisposing apps (10/119, 8.4%). Conclusions The findings, including face validity and inter-rater reliability, support the integrity of the proposed classification scheme for categorizing mobile apps for older adults in the “Health and Fitness” category available in the iTunes App Store. Using the proposed classification system, older adult app users would be better positioned to identify apps appropriate for their needs, and app developers would be able to obtain the distributions of available mobile apps for health-related concerns of older adults more easily. PMID:25098687

  11. An analysis of the synoptic and climatological applicability of circulation type classifications for Ireland

    NASA Astrophysics Data System (ADS)

    Broderick, Ciaran; Fealy, Rowan

    2013-04-01

Circulation type classifications (CTCs) compiled as part of the COST733 Action, entitled 'Harmonisation and Application of Weather Type Classifications for European Regions', are examined for their synoptic and climatological applicability to Ireland based on their ability to characterise surface temperature and precipitation. In all, 16 different objective classification schemes, representative of four different methodological approaches to circulation typing (optimization algorithms, threshold-based methods, eigenvector techniques and leader algorithms), are considered. Several statistical metrics which variously quantify the ability of CTCs to discretize daily data into well-defined homogeneous groups are used to evaluate and compare different approaches to synoptic typing. The records from 14 meteorological stations located across the island of Ireland are used in the study. The results indicate that while it was not possible to identify a single optimum classification or approach to circulation typing - conditional on the location and surface variables considered - a number of general assertions regarding the performance of different schemes can be made. The findings for surface temperature indicate that those classifications based on predefined thresholds (e.g. Litynski, GrossWetterTypes and original Lamb Weather Type) perform well, as do the Kruizinga and Lund classification schemes. Similarly, for precipitation, predefined-type classifications return high skill scores, as do classifications derived using an optimization procedure (e.g. SANDRA, Self-Organizing Maps and K-Means clustering). For both temperature and precipitation the results generally indicate that the classifications perform best for the winter season, reflecting the closer coupling between large-scale circulation and surface conditions during this period. 
In contrast to the findings for temperature, spatial patterns in the performance of the classifications were more evident for precipitation. For this variable, the more westerly synoptic stations, open to zonal airflow and less influenced by regional-scale forcings, generally exhibited a stronger link with the large-scale circulation.

  12. Method of center localization for objects containing concentric arcs

    NASA Astrophysics Data System (ADS)

    Kuznetsova, Elena G.; Shvets, Evgeny A.; Nikolaev, Dmitry P.

    2015-02-01

This paper proposes a method for automatic center localization of objects containing concentric arcs. The method utilizes structure tensor analysis and a voting scheme optimized with the Fast Hough Transform. Two applications of the proposed method are considered: (i) wheel tracking in a video-based system for automatic vehicle classification and (ii) tree growth ring analysis on a tree cross-cut image.

  13. Generalized interpretation scheme for arbitrary HR InSAR image pairs

    NASA Astrophysics Data System (ADS)

    Boldt, Markus; Thiele, Antje; Schulz, Karsten

    2013-10-01

Land cover classification of remote sensing imagery is an important topic of research. Many applications, such as disaster management and change detection, require precise and fast information about the land cover of the imaged scenery. Focusing on high resolution (HR) spaceborne remote sensing imagery, the user has the choice between passive and active sensor systems. Passive systems, such as multispectral sensors, have the disadvantage of being dependent on weather conditions (fog, dust, clouds, etc.) and time of day, since they work in the visible part of the electromagnetic spectrum. Here, active systems like Synthetic Aperture Radar (SAR) provide improved capabilities. The CovAmCoh method was introduced in former studies as an interactive method for analyzing HR InSAR image pairs. CovAmCoh represents the joint analysis of locality (coefficient of variation - Cov), backscatter (amplitude - Am) and temporal stability (coherence - Coh). It delivers information on the physical backscatter characteristics of imaged scene objects or structures and provides the opportunity to detect different classes of land cover (e.g., urban, rural, infrastructure and activity areas). For example, railway tracks are easily distinguishable from other infrastructure due to their characteristic bluish coloring caused by the gravel between the sleepers. In consequence, imaged objects or structures have a characteristic appearance in CovAmCoh images, which allows the development of classification rules. In this paper, a generalized interpretation scheme for arbitrary InSAR image pairs using the CovAmCoh method is proposed. The scheme is based on analyzing the information content of typical CovAmCoh imagery using semi-supervised k-means clustering. It is shown that eight classes model the main local information content of CovAmCoh images sufficiently and can be used as the basis for a classification scheme.
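As a rough illustration of the clustering step described above, the following sketch applies a minimal k-means implementation to synthetic per-pixel (Cov, Am, Coh) feature triples. The data, the value of k, and the deterministic initialisation are assumptions made for the demo; the paper itself derives eight classes from real CovAmCoh imagery with a semi-supervised variant.

```python
# Minimal k-means sketch in the spirit of the clustering step described
# above. Synthetic (Cov, Am, Coh) triples, k=2, and deterministic stride
# initialisation are demo assumptions, not the paper's processing chain.
import numpy as np

def kmeans(x, k, iters=50):
    # deterministic stride initialisation keeps the sketch reproducible
    centers = x[:: max(1, len(x) // k)][:k].copy()
    for _ in range(iters):
        # assign each feature vector to its nearest center
        labels = ((x[:, None] - centers[None]) ** 2).sum(-1).argmin(axis=1)
        # move each center to the mean of its members
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean(axis=0)
    return labels, centers

# two well-separated synthetic "land cover" blobs in (Cov, Am, Coh) space
rng = np.random.default_rng(1)
x = np.vstack([rng.normal(0.2, 0.02, (50, 3)),
               rng.normal(0.8, 0.02, (50, 3))])
labels, centers = kmeans(x, k=2)
```

On real imagery one would run this per pixel over the three CovAmCoh channels with k = 8 and then attach semantic labels (urban, rural, infrastructure, etc.) to the resulting clusters.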

  14. An evaluation of several different classification schemes - Their parameters and performance. [maximum likelihood decision for crop identification

    NASA Technical Reports Server (NTRS)

    Scholz, D.; Fuhs, N.; Hixson, M.

    1979-01-01

    The overall objective of this study was to apply and evaluate several of the currently available classification schemes for crop identification. The approaches examined were: (1) a per point Gaussian maximum likelihood classifier, (2) a per point sum of normal densities classifier, (3) a per point linear classifier, (4) a per point Gaussian maximum likelihood decision tree classifier, and (5) a texture sensitive per field Gaussian maximum likelihood classifier. Three agricultural data sets were used in the study: areas from Fayette County, Illinois, and Pottawattamie and Shelby Counties in Iowa. The segments were located in two distinct regions of the Corn Belt to sample variability in soils, climate, and agricultural practices.

  15. On Classification in the Study of Failure, and a Challenge to Classifiers

    NASA Technical Reports Server (NTRS)

    Wasson, Kimberly S.

    2003-01-01

    Classification schemes are abundant in the literature of failure. They serve a number of purposes, some more successfully than others. We examine several classification schemes constructed for various purposes relating to failure and its investigation, and discuss their values and limits. The analysis results in a continuum of uses for classification schemes, that suggests that the value of certain properties of these schemes is dependent on the goals a classification is designed to forward. The contrast in the value of different properties for different uses highlights a particular shortcoming: we argue that while humans are good at developing one kind of scheme: dynamic, flexible classifications used for exploratory purposes, we are not so good at developing another: static, rigid classifications used to trap and organize data for specific analytic goals. Our lack of strong foundation in developing valid instantiations of the latter impedes progress toward a number of investigative goals. This shortcoming and its consequences pose a challenge to researchers in the study of failure: to develop new methods for constructing and validating static classification schemes of demonstrable value in promoting the goals of investigations. We note current productive activity in this area, and outline foundations for more.

  16. Learning viewpoint invariant object representations using a temporal coherence principle.

    PubMed

    Einhäuser, Wolfgang; Hipp, Jörg; Eggert, Julian; Körner, Edgar; König, Peter

    2005-07-01

Invariant object recognition is arguably one of the major challenges for contemporary machine vision systems. In contrast, the mammalian visual system performs this task virtually effortlessly. How can we exploit our knowledge of the biological system to improve artificial systems? Our understanding of the mammalian early visual system has been augmented by the discovery that general coding principles could explain many aspects of neuronal response properties. How can such schemes be transferred to system-level performance? In the present study we train cells on a particular variant of the general principle of temporal coherence, the "stability" objective. These cells are trained on unlabeled real-world images without a teaching signal. We show that after training, the cells form a representation that is largely independent of the viewpoint from which the stimulus is viewed. This finding includes generalization to previously unseen viewpoints. The achieved representation is better suited for viewpoint-invariant object classification than the cells' input patterns. This ability to facilitate viewpoint-invariant classification is maintained even if training and classification take place in the presence of a distractor object that is also unlabeled. In summary, we show that unsupervised learning using a general coding principle facilitates the classification of real-world objects that are not segmented from the background and undergo complex, non-isomorphic transformations.
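The temporal coherence idea can be illustrated with a toy objective: penalise fast variation of a unit's output relative to its overall variance, so that slowly varying (temporally coherent) responses score better. This is a hedged sketch of the general principle only; the exact "stability" objective used in the study may differ in detail.

```python
# Hedged sketch of a temporal-coherence ("stability") objective: the mean
# squared temporal derivative of a unit's output, normalised by its variance.
# Lower values mean the response varies slowly over time.
import numpy as np

def stability_objective(y):
    # mean squared frame-to-frame change, relative to overall variance
    return np.mean(np.diff(y) ** 2) / np.var(y)

t = np.linspace(0, 2 * np.pi, 200)
slow = np.sin(t)          # slowly varying response: temporally coherent
fast = np.sin(20 * t)     # rapidly varying response: penalised
print(stability_objective(slow) < stability_objective(fast))
```

A learning rule that minimises such an objective over natural image sequences favours units whose outputs change slowly as the stimulus transforms, which is the intuition behind viewpoint-invariant representations.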

  17. Proposed new classification scheme for chemical injury to the human eye.

    PubMed

    Bagley, Daniel M; Casterton, Phillip L; Dressler, William E; Edelhauser, Henry F; Kruszewski, Francis H; McCulley, James P; Nussenblatt, Robert B; Osborne, Rosemarie; Rothenstein, Arthur; Stitzel, Katherine A; Thomas, Karluss; Ward, Sherry L

    2006-07-01

    Various ocular alkali burn classification schemes have been published and used to grade human chemical eye injuries for the purpose of identifying treatments and forecasting outcomes. The ILSI chemical eye injury classification scheme was developed for the additional purpose of collecting detailed human eye injury data to provide information on the mechanisms associated with chemical eye injuries. This information will have clinical application, as well as use in the development and validation of new methods to assess ocular toxicity. A panel of ophthalmic researchers proposed the new classification scheme based upon current knowledge of the mechanisms of eye injury, and their collective clinical and research experience. Additional ophthalmologists and researchers were surveyed to critique the scheme. The draft scheme was revised, and the proposed scheme represents the best consensus from at least 23 physicians and scientists. The new scheme classifies chemical eye injury into five categories based on clinical signs, symptoms, and expected outcomes. Diagnostic classification is based primarily on two clinical endpoints: (1) the extent (area) of injury at the limbus, and (2) the degree of injury (area and depth) to the cornea. The new classification scheme provides a uniform system for scoring eye injury across chemical classes, and provides enough detail for the clinician to collect data that will be relevant to identifying the mechanisms of ocular injury.

  18. Guidelines for a priori grouping of species in hierarchical community models

    USGS Publications Warehouse

    Pacifici, Krishna; Zipkin, Elise; Collazo, Jaime; Irizarry, Julissa I.; DeWan, Amielle A.

    2014-01-01

    Recent methodological advances permit the estimation of species richness and occurrences for rare species by linking species-level occurrence models at the community level. The value of such methods is underscored by the ability to examine the influence of landscape heterogeneity on species assemblages at large spatial scales. A salient advantage of community-level approaches is that parameter estimates for data-poor species are more precise as the estimation process borrows from data-rich species. However, this analytical benefit raises a question about the degree to which inferences are dependent on the implicit assumption of relatedness among species. Here, we assess the sensitivity of community/group-level metrics, and individual-level species inferences given various classification schemes for grouping species assemblages using multispecies occurrence models. We explore the implications of these groupings on parameter estimates for avian communities in two ecosystems: tropical forests in Puerto Rico and temperate forests in northeastern United States. We report on the classification performance and extent of variability in occurrence probabilities and species richness estimates that can be observed depending on the classification scheme used. We found estimates of species richness to be most precise and to have the best predictive performance when all of the data were grouped at a single community level. Community/group-level parameters appear to be heavily influenced by the grouping criteria, but were not driven strictly by total number of detections for species. We found different grouping schemes can provide an opportunity to identify unique assemblage responses that would not have been found if all of the species were analyzed together. 
We suggest three guidelines: (1) classification schemes should be determined based on study objectives; (2) model selection should be used to quantitatively compare different classification approaches; and (3) sensitivity of results to different classification approaches should be assessed. These guidelines should help researchers apply hierarchical community models in the most effective manner.

  19. Development of a methodology for classifying software errors

    NASA Technical Reports Server (NTRS)

    Gerhart, S. L.

    1976-01-01

    A mathematical formalization of the intuition behind classification of software errors is devised and then extended to a classification discipline: Every classification scheme should have an easily discernible mathematical structure and certain properties of the scheme should be decidable (although whether or not these properties hold is relative to the intended use of the scheme). Classification of errors then becomes an iterative process of generalization from actual errors to terms defining the errors together with adjustment of definitions according to the classification discipline. Alternatively, whenever possible, small scale models may be built to give more substance to the definitions. The classification discipline and the difficulties of definition are illustrated by examples of classification schemes from the literature and a new study of observed errors in published papers of programming methodologies.

  20. FIELD TESTS OF GEOGRAPHICALLY-DEPENDENT VS. THRESHOLD-BASED WATERSHED CLASSIFICATION SCHEMES IN THE GREAT LAKES BASIN

    EPA Science Inventory

    We compared classification schemes based on watershed storage (wetland + lake area/watershed area) and forest fragmentation with a geographically-based classification scheme for two case studies involving 1) Lake Superior tributaries and 2) watersheds of riverine coastal wetlands...

  2. Enriching User-Oriented Class Associations for Library Classification Schemes.

    ERIC Educational Resources Information Center

    Pu, Hsiao-Tieh; Yang, Chyan

    2003-01-01

    Explores the possibility of adding user-oriented class associations to hierarchical library classification schemes. Analyses a log of book circulation records from a university library in Taiwan and shows that classification schemes can be made more adaptable by analyzing circulation patterns of similar users. (Author/LRW)

  3. 15 CFR Appendix I to Part 921 - Biogeographic Classification Scheme

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 3 2014-01-01 2014-01-01 false Biogeographic Classification Scheme I Appendix I to Part 921 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade... Part 921—Biogeographic Classification Scheme Acadian 1. Northern of Maine (Eastport to the Sheepscot...

  4. 15 CFR Appendix I to Part 921 - Biogeographic Classification Scheme

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 15 Commerce and Foreign Trade 3 2013-01-01 2013-01-01 false Biogeographic Classification Scheme I Appendix I to Part 921 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade... Part 921—Biogeographic Classification Scheme Acadian 1. Northern of Maine (Eastport to the Sheepscot...

  5. 15 CFR Appendix I to Part 921 - Biogeographic Classification Scheme

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 15 Commerce and Foreign Trade 3 2012-01-01 2012-01-01 false Biogeographic Classification Scheme I Appendix I to Part 921 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade... Part 921—Biogeographic Classification Scheme Acadian 1. Northern of Maine (Eastport to the Sheepscot...

  6. 15 CFR Appendix I to Part 921 - Biogeographic Classification Scheme

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Biogeographic Classification Scheme I Appendix I to Part 921 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade... Part 921—Biogeographic Classification Scheme Acadian 1. Northern of Maine (Eastport to the Sheepscot...

  7. 15 CFR Appendix I to Part 921 - Biogeographic Classification Scheme

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 3 2011-01-01 2011-01-01 false Biogeographic Classification Scheme I Appendix I to Part 921 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade... Part 921—Biogeographic Classification Scheme Acadian 1. Northern of Maine (Eastport to the Sheepscot...

  8. A Classification Methodology and Retrieval Model to Support Software Reuse

    DTIC Science & Technology

    1988-01-01

Dewey Decimal Classification (DDC 18), an enumerative scheme, occupies 40 pages [Buchanan 1979]. Langridge [1973] states that the facets listed in the...sense of historical importance or widespread use. The schemes are: Dewey Decimal Classification (DDC), Universal Decimal Classification (UDC...

  9. Classification of close binary systems by Svechnikov

    NASA Astrophysics Data System (ADS)

    Dryomova, G. N.

The paper presents a historical overview of classification schemes for eclipsing variable stars, highlighting the advantages of Svechnikov's classification scheme, which is widely appreciated for close binary systems owing to the simplicity and brevity of its classification criteria.

  10. A classification of errors in lay comprehension of medical documents.

    PubMed

    Keselman, Alla; Smith, Catherine Arnott

    2012-12-01

    Emphasis on participatory medicine requires that patients and consumers participate in tasks traditionally reserved for healthcare providers. This includes reading and comprehending medical documents, often but not necessarily in the context of interacting with Personal Health Records (PHRs). Research suggests that while giving patients access to medical documents has many benefits (e.g., improved patient-provider communication), lay people often have difficulty understanding medical information. Informatics can address the problem by developing tools that support comprehension; this requires in-depth understanding of the nature and causes of errors that lay people make when comprehending clinical documents. The objective of this study was to develop a classification scheme of comprehension errors, based on lay individuals' retellings of two documents containing clinical text: a description of a clinical trial and a typical office visit note. While not comprehensive, the scheme can serve as a foundation of further development of a taxonomy of patients' comprehension errors. Eighty participants, all healthy volunteers, read and retold two medical documents. A data-driven content analysis procedure was used to extract and classify retelling errors. The resulting hierarchical classification scheme contains nine categories and 23 subcategories. The most common error made by the participants involved incorrectly recalling brand names of medications. Other common errors included misunderstanding clinical concepts, misreporting the objective of a clinical research study and physician's findings during a patient's visit, and confusing and misspelling clinical terms. A combination of informatics support and health education is likely to improve the accuracy of lay comprehension of medical documents. Published by Elsevier Inc.

  11. State of the Art in the Cramer Classification Scheme and ...

    EPA Pesticide Factsheets

Slide presentation at the SOT FDA Colloquium on State of the Art in the Cramer Classification Scheme and Threshold of Toxicological Concern in College Park, MD.

  12. Overview of classification systems in peripheral artery disease.

    PubMed

    Hardman, Rulon L; Jazaeri, Omid; Yi, J; Smith, M; Gupta, Rajan

    2014-12-01

    Peripheral artery disease (PAD), secondary to atherosclerotic disease, is currently the leading cause of morbidity and mortality in the western world. While PAD is common, it is estimated that the majority of patients with PAD are undiagnosed and undertreated. The challenge to the treatment of PAD is to accurately diagnose the symptoms and determine treatment for each patient. The varied presentations of peripheral vascular disease have led to numerous classification schemes throughout the literature. Consistent grading of patients leads to both objective criteria for treating patients and a baseline for clinical follow-up. Reproducible classification systems are also important in clinical trials and when comparing medical, surgical, and endovascular treatment paradigms. This article reviews the various classification systems for PAD and advantages to each system.

  13. Wing Classification in the Virtual Research Center

    NASA Technical Reports Server (NTRS)

    Campbell, William H.

    1999-01-01

The Virtual Research Center (VRC) is a Web site that hosts a database of documents organized to allow teams of scientists and engineers to store and maintain documents. A number of other workgroup-related capabilities are provided. My tasks as a NASA/ASEE Summer Faculty Fellow included developing a scheme for classifying the workgroups that use the VRC according to the various Divisions within NASA Enterprises. To this end I developed a plan to use several CGI Perl scripts to gather classification information from the leaders of the workgroups, and to display all the workgroups within a specified classification. I designed, implemented, and partially tested scripts that can be used to do the classification. I was also asked to consider directions for future development of the VRC. I think that the VRC can use XML to advantage. XML is a markup language with designer tags that can be used to build meaning into documents. An investigation of how CORBA, an object-oriented object request broker included with JDK 1.2, might be used also seems justified.

  14. Content-based unconstrained color logo and trademark retrieval with color edge gradient co-occurrence histograms

    NASA Astrophysics Data System (ADS)

    Phan, Raymond; Androutsos, Dimitrios

    2008-01-01

In this paper, we present a logo and trademark retrieval system for unconstrained color image databases that extends the Color Edge Co-occurrence Histogram (CECH) object detection scheme. We introduce more accurate information into the CECH by incorporating color edge detection based on vector order statistics. This produces a more accurate representation of edges in color images than the simple color pixel difference classification of edges used in the CECH. Our proposed method thus relies on edge gradient information, and as such we call it the Color Edge Gradient Co-occurrence Histogram (CEGCH). We use this as the main mechanism of our unconstrained color logo and trademark retrieval scheme. Results illustrate that the proposed retrieval system retrieves logos and trademarks with good accuracy, and outperforms the CECH object detection scheme with higher precision and recall.
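The general mechanism behind CECH/CEGCH-style descriptors, counting co-occurrences of quantised edge orientations at a fixed spatial offset, can be sketched as follows. The bin count, the offset, and the use of a grayscale gradient are simplifying assumptions for the demo; the paper's descriptor operates on color edge gradients derived from vector order statistics.

```python
# Sketch of a co-occurrence histogram over quantised edge orientations,
# the general mechanism behind CECH/CEGCH-style descriptors. Bin count,
# offset, and the grayscale gradient are demo assumptions.
import numpy as np

def edge_cooccurrence_histogram(img, bins=8, offset=(0, 1)):
    gy, gx = np.gradient(img.astype(float))
    ang = np.mod(np.arctan2(gy, gx), np.pi)        # orientation in [0, pi)
    q = np.minimum((ang / np.pi * bins).astype(int), bins - 1)
    dy, dx = offset                                 # non-negative offsets only
    h = np.zeros((bins, bins), dtype=float)
    a = q[: q.shape[0] - dy, : q.shape[1] - dx]     # reference pixels
    b = q[dy:, dx:]                                 # co-occurring neighbours
    np.add.at(h, (a.ravel(), b.ravel()), 1.0)
    return h / h.sum()                              # normalised histogram

# uniform ramp image: every pixel has the same gradient orientation,
# so all the mass falls in a single cell of the histogram
img = np.arange(16, dtype=float).reshape(4, 4)
h = edge_cooccurrence_histogram(img)
```

A retrieval system would compute such a histogram per image (or per candidate window) and compare histograms, e.g. by intersection, to rank matches.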

  15. Machine learning in infrared object classification - an all-sky selection of YSO candidates

    NASA Astrophysics Data System (ADS)

    Marton, Gabor; Zahorecz, Sarolta; Toth, L. Viktor; Magnus McGehee, Peregrine; Kun, Maria

    2015-08-01

    Object classification is a fundamental and challenging problem in the era of big data. I will discuss up-to-date methods and their application to the classification of infrared point sources. We analysed the ALLWISE catalogue, the most recent public source catalogue of the Wide-field Infrared Survey Explorer (WISE), to compile a reliable list of Young Stellar Object (YSO) candidates. We tested and compared both classical and up-to-date statistical methods to discriminate source types such as extragalactic objects, evolved stars, main sequence stars, objects related to the interstellar medium, and YSO candidates by using their mid-IR WISE properties and associated near-IR 2MASS data. In this particular classification problem, the Support Vector Machine (SVM), a class of supervised learning algorithms, turned out to be the best tool. As a result, we classify Class I and II YSOs with >90% accuracy, while the fraction of contaminating extragalactic objects remains well below 1%, based on the number of known objects listed in the SIMBAD and VizieR databases. We compare our results to other classification schemes from the literature and show that the SVM outperforms methods that apply linear cuts in colour-colour and colour-magnitude space. Our homogeneous YSO candidate catalogue can serve as an excellent pathfinder for future detailed observations of individual objects and as a starting point for statistical studies that aim to add pieces to the big picture of star formation theory.
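
    A minimal sketch of the kind of SVM-based discrimination described above, on synthetic stand-ins for two infrared colours (the class means, scatter, and training settings are invented for illustration; the actual work used ALLWISE/2MASS photometry and a full SVM implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for two mid-IR colours: YSOs redder than field stars.
yso  = rng.normal([1.5, 2.5], 0.3, size=(200, 2))
star = rng.normal([0.2, 0.5], 0.3, size=(200, 2))
X = np.vstack([yso, star])
y = np.hstack([np.ones(200), -np.ones(200)])

# Linear SVM trained by sub-gradient descent on the regularized hinge loss.
w, b = np.zeros(2), 0.0
lam, lr = 1e-3, 0.02
for _ in range(1000):
    viol = y * (X @ w + b) < 1                          # margin violators
    grad_w = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / len(X)
    grad_b = -y[viol].sum() / len(X)
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = (np.sign(X @ w + b) == y).mean()
```

    Unlike a fixed linear colour cut, the SVM places the boundary to maximize the margin between the two populations, which is the advantage the abstract reports.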

  16. Multimedia

    NASA Technical Reports Server (NTRS)

    Kaye, Karen

    1993-01-01

    Multimedia initiative objectives for the NASA Scientific and Technical Information (STI) program are described. A multimedia classification scheme was developed and the types of non-print media currently in use are inventoried. The NASA STI Program multimedia initiative is driven by a changing user population and technical requirements in the areas of publications, dissemination, and user and management support.

  17. Polsar Land Cover Classification Based on Hidden Polarimetric Features in Rotation Domain and Svm Classifier

    NASA Astrophysics Data System (ADS)

    Tao, C.-S.; Chen, S.-W.; Li, Y.-Z.; Xiao, S.-P.

    2017-09-01

    Land cover classification is an important application for polarimetric synthetic aperture radar (PolSAR) data utilization. Roll-invariant polarimetric features such as H / Ani / α / Span are commonly adopted in PolSAR land cover classification. However, the target orientation diversity effect makes PolSAR image understanding and interpretation difficult. Using only the roll-invariant polarimetric features may introduce ambiguity in the interpretation of targets' scattering mechanisms and limit the subsequent classification accuracy. To address this problem, this work first focuses on hidden polarimetric feature mining in the rotation domain along the radar line of sight, using the recently reported uniform polarimetric matrix rotation theory and the visualization and characterization tool of the polarimetric coherence pattern. The former rotates the acquired polarimetric matrix along the radar line of sight and fully describes the rotation characteristics of each entry of the matrix; sets of new polarimetric features are derived to describe the hidden scattering information of the target in the rotation domain. The latter extends the traditional polarimetric coherence at a given rotation angle to the whole rotation domain for complete interpretation, and a visualization and characterization tool is established to derive new polarimetric features for hidden information exploration. Then, a classification scheme is developed combining both the selected new hidden polarimetric features in the rotation domain and the commonly used roll-invariant polarimetric features with a support vector machine (SVM) classifier. Comparison experiments based on AIRSAR and multi-temporal UAVSAR data demonstrate that, compared with the conventional classification scheme which uses only the roll-invariant polarimetric features, the proposed classification scheme achieves both higher classification accuracy and better robustness.
For AIRSAR data, the overall classification accuracy with the proposed classification scheme is 94.91%, while that with the conventional classification scheme is 93.70%. Moreover, for multi-temporal UAVSAR data, the averaged overall classification accuracy with the proposed classification scheme is up to 97.08%, much higher than the 87.79% from the conventional classification scheme. For multi-temporal PolSAR data, the proposed classification scheme also achieves better robustness. The comparison studies clearly demonstrate that mining and utilizing hidden polarimetric features and information in the rotation domain can provide added benefits for PolSAR land cover classification and a new vision for PolSAR image interpretation and application.
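
    The rotation-domain idea can be sketched as follows: a Pauli-basis coherency matrix T is rotated about the radar line of sight, and a polarimetric coherence is evaluated at each rotation angle, giving a "coherence pattern" from which scalar features (maximum, mean, angular width, etc.) can be taken. This is a simplified illustration of the general mechanism, using the standard rotation of the 3x3 coherency matrix, not the paper's specific feature set:

```python
import numpy as np

def rotate_T3(T, theta):
    """Rotate a 3x3 Pauli coherency matrix about the radar line of sight."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    R = np.array([[1, 0, 0],
                  [0, c, s],
                  [0, -s, c]], dtype=complex)
    return R @ T @ R.conj().T

def coherence_pattern(T, i, j, n=180):
    """|gamma_ij| as a function of rotation angle (a 'coherence pattern')."""
    thetas = np.linspace(-np.pi / 2, np.pi / 2, n)
    vals = []
    for th in thetas:
        Tr = rotate_T3(T, th)
        vals.append(abs(Tr[i, j]) / np.sqrt(Tr[i, i].real * Tr[j, j].real))
    return thetas, np.array(vals)
```

    Scalar features of the pattern can then be fed to an SVM alongside the usual roll-invariant features; note that Span (the trace of T) is unchanged by the rotation.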

  18. A Proposal to Develop Interactive Classification Technology

    NASA Technical Reports Server (NTRS)

    deBessonet, Cary

    1998-01-01

    Research for the first year was oriented towards: 1) the design of an interactive classification tool (ICT); and 2) the development of an appropriate theory of inference for use in ICT technology. The general objective was to develop a theory of classification that could accommodate a diverse array of objects, including events and their constituent objects. Throughout this report, the term "object" is to be interpreted in a broad sense to cover any kind of object, including living beings, non-living physical things, events, even ideas and concepts. The idea was to produce a theory that could serve as the uniting fabric of a base technology capable of being implemented in a variety of automated systems. The decision was made to employ two technologies under development by the principal investigator, namely, SMS (Symbolic Manipulation System) and SL (Symbolic Language) [see deBessonet, 1991, for detailed descriptions of SMS and SL]. The plan was to enhance and modify these technologies for use in an ICT environment. As a means of giving focus and direction to the proposed research, the investigators decided to design an interactive, classificatory tool for use in building accessible knowledge bases for selected domains. Accordingly, the proposed research was divisible into tasks that included: 1) the design of technology for classifying domain objects and for building knowledge bases from the results automatically; 2) the development of a scheme of inference capable of drawing upon previously processed classificatory schemes and knowledge bases; and 3) the design of a query/search module for accessing the knowledge bases built by the inclusive system.
The interactive tool for classifying domain objects was to be designed initially for textual corpora with a view to having the technology eventually be used in robots to build sentential knowledge bases that would be supported by inference engines specially designed for the natural or man-made environments in which the robots would be called upon to operate.

  19. Creating a Taxonomy of Local Boards of Health Based on Local Health Departments’ Perspectives

    PubMed Central

    Shah, Gulzar H.; Sotnikov, Sergey; Leep, Carolyn J.; Ye, Jiali; Van Wave, Timothy W.

    2017-01-01

    Objectives To develop a local board of health (LBoH) classification scheme and empirical definitions to provide a coherent framework for describing variation in the LBoHs. Methods This study is based on data from the 2015 Local Board of Health Survey, conducted among a nationally representative sample of local health department administrators, with 394 responses. The classification development consisted of the following steps: (1) theoretically guided initial domain development, (2) mapping of the survey variables to the proposed domains, (3) data reduction using principal component analysis and group consensus, and (4) scale development and testing for internal consistency. Results The final classification scheme included 60 items across 6 governance function domains and an additional domain—LBoH characteristics and strengths, such as meeting frequency, composition, and diversity of information sources. Application of this classification strongly supports the premise that LBoHs differ in their performance of governance functions and in other characteristics. Conclusions The LBoH taxonomy provides an empirically tested standardized tool for classifying LBoHs from the viewpoint of local health department administrators. Future studies can use this taxonomy to better characterize the impact of LBoHs. PMID:27854524
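
    The data-reduction step (3) can be illustrated with a principal component analysis via the SVD. Everything below is synthetic: the respondent count, item count, and the two "latent governance factors" are invented for the sketch; the study itself worked from the 2015 Local Board of Health Survey items.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic survey: 50 respondents x 8 items driven by 2 latent factors.
latent = rng.normal(size=(50, 2))
loadings = rng.normal(size=(2, 8))
X = latent @ loadings + 0.1 * rng.normal(size=(50, 8))

# PCA by SVD on the centered item matrix (the data-reduction step).
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / (s**2).sum()     # variance share per component
scores = Xc @ Vt[:2].T              # respondents projected onto 2 components
```

    In the study, components retained at this stage were further refined by group consensus before scale development and internal-consistency testing.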

  20. A supervised 'lesion-enhancement' filter by use of a massive-training artificial neural network (MTANN) in computer-aided diagnosis (CAD).

    PubMed

    Suzuki, Kenji

    2009-09-21

    Computer-aided diagnosis (CAD) has been an active area of study in medical image analysis. A filter for the enhancement of lesions plays an important role in improving the sensitivity and specificity of CAD schemes. Such a filter enhances objects similar to the model employed in the filter; e.g., a blob-enhancement filter based on the Hessian matrix enhances sphere-like objects. Actual lesions, however, often differ from a simple model; e.g., a lung nodule is generally modeled as a solid sphere, but there are nodules of various shapes and with internal inhomogeneities, such as a nodule with spiculations and ground-glass opacity. Thus, conventional filters often fail to enhance actual lesions. Our purpose in this study was to develop a supervised filter for the enhancement of actual lesions (as opposed to a lesion model) by use of a massive-training artificial neural network (MTANN) in a CAD scheme for detection of lung nodules in CT. The MTANN filter was trained with actual nodules in CT images to enhance actual patterns of nodules. By use of the MTANN filter, the sensitivity and specificity of our CAD scheme were improved substantially. With a database of 69 lung cancers, nodule candidate detection by the MTANN filter achieved a 97% sensitivity with 6.7 false positives (FPs) per section, whereas nodule candidate detection by a difference-image technique achieved a 96% sensitivity with 19.3 FPs per section. Classification-MTANNs were applied for further reduction of the FPs; they removed 60% of the FPs with a loss of one true positive, thus achieving a 96% sensitivity with 2.7 FPs per section. Overall, with our CAD scheme based on the MTANN filter and classification-MTANNs, an 84% sensitivity with 0.5 FPs per section was achieved.

  1. A Classification Scheme for Smart Manufacturing Systems’ Performance Metrics

    PubMed Central

    Lee, Y. Tina; Kumaraguru, Senthilkumaran; Jain, Sanjay; Robinson, Stefanie; Helu, Moneer; Hatim, Qais Y.; Rachuri, Sudarsan; Dornfeld, David; Saldana, Christopher J.; Kumara, Soundar

    2017-01-01

    This paper proposes a classification scheme for performance metrics for smart manufacturing systems. The discussion focuses on three such metrics: agility, asset utilization, and sustainability. For each of these metrics, we discuss classification themes, which we then use to develop a generalized classification scheme. In addition to the themes, we discuss a conceptual model that may form the basis for the information necessary for performance evaluations. Finally, we present future challenges in developing robust, performance-measurement systems for real-time, data-intensive enterprises. PMID:28785744

  2. Classification of product inspection items using nonlinear features

    NASA Astrophysics Data System (ADS)

    Talukder, Ashit; Casasent, David P.; Lee, H.-W.

    1998-03-01

    Automated processing and classification of real-time x-ray images of randomly oriented touching pistachio nuts is discussed. The ultimate objective is the development of a system for automated non-invasive detection of defective product items on a conveyor belt. This approach involves two main steps: preprocessing and classification. Preprocessing locates individual items and segments ones that touch using a modified watershed algorithm. The second stage involves extraction of features that allow discrimination between damaged and clean items (pistachio nuts). This feature extraction and classification stage is the new aspect of this paper. We use a new nonlinear feature extraction scheme called the maximum representation and discriminating feature (MRDF) extraction method to compute nonlinear features that are used as inputs to a classifier. The MRDF is shown to provide better classification and a better ROC (receiver operating characteristic) curve than other methods.

  3. A Unified Classification Framework for FP, DP and CP Data at X-Band in Southern China

    NASA Astrophysics Data System (ADS)

    Xie, Lei; Zhang, Hong; Li, Hongzhong; Wang, Chao

    2015-04-01

    The main objective of this paper is to introduce a unified framework for crop classification in Southern China using data in fully polarimetric (FP), dual-pol (DP), and compact polarimetric (CP) modes. TerraSAR-X data acquired over the Leizhou Peninsula, South China, are used in our experiments. The study site involves four main crops (rice, banana, sugarcane, and eucalyptus). By exploring the similarities between data in these three modes, a knowledge-based characteristic space is created and the unified framework is presented. The overall classification accuracies are about 95% for data in the FP and coherent HH/VV modes and about 91% in CP modes, which suggests that the proposed classification scheme is effective and promising. Compared with the Wishart Maximum Likelihood (ML) classifier, the proposed method exhibits higher classification accuracy.

  4. CLASSIFICATION FRAMEWORK FOR COASTAL ECOSYSTEM RESPONSES TO AQUATIC STRESSORS

    EPA Science Inventory

    Many classification schemes have been developed to group ecosystems based on similar characteristics. To date, however, no single scheme has addressed coastal ecosystem responses to multiple stressors. We developed a classification framework for coastal ecosystems to improve the ...

  5. Clinical Application of Esophageal High-resolution Manometry in the Diagnosis of Esophageal Motility Disorders.

    PubMed

    van Hoeij, Froukje B; Bredenoord, Albert J

    2016-01-31

    Esophageal high-resolution manometry (HRM) is replacing conventional manometry in the clinical evaluation of patients with esophageal symptoms, especially dysphagia. The introduction of HRM gave rise to new objective metrics and recognizable patterns of esophageal motor function, requiring a new classification scheme: the Chicago classification. HRM measurements are more detailed and more easily performed compared to conventional manometry. The visual presentation of acquired data improved the analysis and interpretation of esophageal motor function. This led to a more sensitive, accurate, and objective analysis of esophageal motility. In this review we discuss how HRM changed the way we define and categorize esophageal motility disorders. Moreover, we discuss the clinical applications of HRM for each esophageal motility disorder separately.

  6. Clinical Application of Esophageal High-resolution Manometry in the Diagnosis of Esophageal Motility Disorders

    PubMed Central

    van Hoeij, Froukje B; Bredenoord, Albert J

    2016-01-01

    Esophageal high-resolution manometry (HRM) is replacing conventional manometry in the clinical evaluation of patients with esophageal symptoms, especially dysphagia. The introduction of HRM gave rise to new objective metrics and recognizable patterns of esophageal motor function, requiring a new classification scheme: the Chicago classification. HRM measurements are more detailed and more easily performed compared to conventional manometry. The visual presentation of acquired data improved the analysis and interpretation of esophageal motor function. This led to a more sensitive, accurate, and objective analysis of esophageal motility. In this review we discuss how HRM changed the way we define and categorize esophageal motility disorders. Moreover, we discuss the clinical applications of HRM for each esophageal motility disorder separately. PMID:26631942

  7. A comparative study for chest radiograph image retrieval using binary texture and deep learning classification.

    PubMed

    Anavi, Yaron; Kogan, Ilya; Gelbart, Elad; Geva, Ofer; Greenspan, Hayit

    2015-08-01

    In this work various approaches are investigated for X-ray image retrieval and specifically chest pathology retrieval. Given a query image taken from a data set of 443 images, the objective is to rank images according to similarity. Different features, including binary features, texture features, and deep learning (CNN) features are examined. In addition, two approaches are investigated for the retrieval task. One approach is based on the distance of image descriptors using the above features (hereon termed the "descriptor"-based approach); the second approach ("classification"-based approach) is based on a probability descriptor, generated by a pair-wise classification of each two classes (pathologies) and their decision values using an SVM classifier. Best results are achieved using deep learning features in a classification scheme.

  8. Taxonomy of asteroids. [according to polarimetric, spectrophotometric, radiometric, and UBV photometric data

    NASA Technical Reports Server (NTRS)

    Bowell, E.; Chapman, C. R.; Gradie, J. C.; Zellner, B.; Morrison, D.

    1978-01-01

    A taxonomic system for asteroids is discussed which is based on seven directly observable parameters from polarimetry, spectrophotometry, radiometry, and UBV photometry. The classification scheme is entirely empirical and independent of specific mineralogical interpretations. Five broad classes (designated C, S, M, E, and R), as well as an 'unclassifiable' designation, are defined on the basis of observational data for 523 asteroids. Computer-generated type classifications and derived diameters are given for the 523 asteroids, and the application of the classification procedure is illustrated. Of the 523 asteroids classified, 190 are identified as C objects, 141 as S type, 13 as type M, three as type E, three as type R, 55 as unclassifiable, and 118 as ambiguous. The present taxonomic system is compared with several other asteroid classification systems.

  9. exprso: an R-package for the rapid implementation of machine learning algorithms.

    PubMed

    Quinn, Thomas; Tylee, Daniel; Glatt, Stephen

    2016-01-01

    Machine learning plays a major role in many scientific investigations. However, non-expert programmers may struggle to implement the elaborate pipelines necessary to build highly accurate and generalizable models. We introduce exprso, a new R package that is an intuitive machine learning suite designed specifically for non-expert programmers. Built initially for the classification of high-dimensional data, exprso uses an object-oriented framework to encapsulate a number of common analytical methods into a series of interchangeable modules. This includes modules for feature selection, classification, high-throughput parameter grid-searching, elaborate cross-validation schemes (e.g., Monte Carlo and nested cross-validation), ensemble classification, and prediction. In addition, exprso also supports multi-class classification (through the 1-vs-all generalization of binary classifiers) and the prediction of continuous outcomes.

  10. THE ROLE OF WATERSHED CLASSIFICATION IN DIAGNOSING CAUSES OF BIOLOGICAL IMPAIRMENT

    EPA Science Inventory

    We compared classification schemes based on watershed storage (wetland + lake area/watershed area) and forest fragmentation with a geographically-based classification scheme for two case studies involving 1) Lake Superior tributaries and 2) watersheds of riverine coastal wetlands ...

  11. Selective classification for improved robustness of myoelectric control under nonideal conditions.

    PubMed

    Scheme, Erik J; Englehart, Kevin B; Hudgins, Bernard S

    2011-06-01

    Recent literature in pattern recognition-based myoelectric control has highlighted a disparity between classification accuracy and the usability of upper limb prostheses. This paper suggests that the conventionally defined classification accuracy may be idealistic and may not reflect true clinical performance. Herein, a novel myoelectric control system based on a selective multiclass one-versus-one classification scheme, capable of rejecting unknown data patterns, is introduced. This scheme is shown to outperform nine other popular classifiers when compared using conventional classification accuracy as well as a form of leave-one-out analysis that may be more representative of real prosthetic use. Additionally, the classification scheme allows for real-time, independent adjustment of individual class-pair boundaries making it flexible and intuitive for clinical use.
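
    The rejection idea can be sketched with a toy version of the selective one-versus-one scheme: pairwise contests between classes, with a label returned only when one class wins all of its contests decisively, and rejection otherwise. The nearest-centroid "classifiers" and the margin threshold below are placeholders for illustration; the actual system trains per-pair classifiers on myoelectric features.

```python
import numpy as np

def fit_centroids(X, y):
    """Per-class mean feature vectors (stand-ins for per-pair classifiers)."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def selective_predict(x, centroids, margin=1.0):
    """One-vs-one voting with rejection.

    A label is returned only if that class wins every one of its pairwise
    contests by at least `margin`; otherwise the pattern is rejected (None),
    which for a prosthesis maps to 'do not move'."""
    classes = sorted(centroids)
    wins = {c: 0 for c in classes}
    for i, a in enumerate(classes):
        for b in classes[i + 1:]:
            da = np.linalg.norm(x - centroids[a])
            db = np.linalg.norm(x - centroids[b])
            if db - da >= margin:
                wins[a] += 1
            elif da - db >= margin:
                wins[b] += 1
    for c in classes:
        if wins[c] == len(classes) - 1:
            return c
    return None
```

    Rejecting ambiguous patterns trades a little conventional accuracy for fewer unintended activations, which is the usability gap the abstract highlights.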

  12. A classification scheme for edge-localized modes based on their probability distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shabbir, A. (E-mail: aqsa.shabbir@ugent.be; Max Planck Institute for Plasma Physics, D-85748 Garching); Hornung, G.

    We present here an automated classification scheme which is particularly well suited to scenarios where the parameters have significant uncertainties or are stochastic quantities. To this end, the parameters are modeled with probability distributions in a metric space and classification is conducted using the notion of nearest neighbors. The presented framework is then applied to the classification of type I and type III edge-localized modes (ELMs) from a set of carbon-wall plasmas at JET. This provides a fast, standardized classification of ELM types which is expected to significantly reduce the effort of ELM experts in identifying ELM types. Further, the classification scheme is general and can be applied to various other plasma phenomena as well.
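
    A minimal sketch of nearest-neighbor classification between probability distributions, here with the Hellinger metric and synthetic histograms standing in for ELM parameter distributions (the distributions, bins, and labels are invented; the actual work used JET ELM data and a fuller metric-space formulation):

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete probability distributions."""
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def nn_classify(hist, references):
    """1-nearest-neighbour classification in the space of distributions."""
    return min(references, key=lambda ref: hellinger(hist, ref[0]))[1]

rng = np.random.default_rng(3)
bins = np.linspace(0, 10, 31)

def to_hist(samples):
    h = np.histogram(samples, bins=bins)[0].astype(float)
    return h / h.sum()

# Synthetic stand-ins for one ELM parameter under two ELM types.
references = [
    (to_hist(rng.normal(5.0, 1.0, 2000)), "type I"),
    (to_hist(rng.normal(2.0, 0.5, 2000)), "type III"),
]
query = to_hist(rng.normal(5.0, 1.0, 500))   # unseen type-I-like event set
label = nn_classify(query, references)
```

    Working with whole distributions rather than point estimates is what makes the scheme robust to parameter uncertainty.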

  13. Mapping Mangrove Density from Rapideye Data in Central America

    NASA Astrophysics Data System (ADS)

    Son, Nguyen-Thanh; Chen, Chi-Farn; Chen, Cheng-Ru

    2017-06-01

    Mangrove forests provide a wide range of socioeconomic and ecological services for coastal communities. Extensive aquaculture development in mangrove waters in many developing countries has persistently ignored the services of mangrove ecosystems, leading to unintended environmental consequences. Monitoring the current status and distribution of mangrove forests is deemed important for evaluating forest management strategies. This study aims to delineate the density distribution of mangrove forests in the Gulf of Fonseca, Central America from Rapideye data using support vector machines (SVM). The data collected in 2012 for density classification of mangrove forests were processed using four different band combination schemes: scheme-1 (bands 1-3, 5, excluding the red-edge band 4), scheme-2 (bands 1-5), scheme-3 (bands 1-3, 5, incorporating the normalized difference vegetation index, NDVI), and scheme-4 (bands 1-3, 5, incorporating the normalized difference red-edge index, NDRI). We also tested whether the Rapideye red-edge band contributes to improved classification results. Three main steps of data processing were employed: (1) data pre-processing, (2) image classification, and (3) accuracy assessment, to evaluate the contribution of the red-edge band to the accuracy of the classification results across the four schemes. Comparison of the classification maps with the ground reference data indicated slightly higher accuracy for schemes 2 and 4. The overall accuracies and Kappa coefficients were 97% and 0.95 for scheme-2 and 96.9% and 0.95 for scheme-4, respectively.
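
    The two auxiliary indices are simple band ratios. A sketch, assuming the usual definitions NDVI = (NIR - Red)/(NIR + Red) and the red-edge analogue NDRI = (NIR - RedEdge)/(NIR + RedEdge), with Rapideye's band order (1 blue, 2 green, 3 red, 4 red-edge, 5 NIR); the reflectance values below are invented:

```python
import numpy as np

def norm_diff(a, b):
    """Generic normalized difference index, safe where a + b == 0."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    denom = a + b
    return np.where(denom != 0, (a - b) / np.where(denom == 0, 1, denom), 0.0)

# Hypothetical per-pixel reflectances for two pixels.
red      = np.array([0.10, 0.08])
red_edge = np.array([0.25, 0.20])
nir      = np.array([0.50, 0.45])

ndvi = norm_diff(nir, red)        # scheme-3 auxiliary feature
ndri = norm_diff(nir, red_edge)   # scheme-4 auxiliary feature
```

    Each index is then stacked with bands 1-3 and 5 as an extra input layer for the SVM.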

  14. Realistic Expectations for Rock Identification.

    ERIC Educational Resources Information Center

    Westerback, Mary Elizabeth; Azer, Nazmy

    1991-01-01

    Presents a rock classification scheme for use by beginning students. The scheme is based on rock textures (glassy, crystalline, clastic, and organic framework) and observable structures (vesicles and graded bedding). Discusses problems in other rock classification schemes which may produce confusion, misidentification, and anxiety. (10 references)…

  15. A Philosophical Approach to Describing Science Content: An Example From Geologic Classification.

    ERIC Educational Resources Information Center

    Finley, Fred N.

    1981-01-01

    Examines how research of philosophers of science may be useful to science education researchers and curriculum developers in the development of descriptions of science content related to classification schemes. Provides examples of concept analysis of two igneous rock classification schemes. (DS)

  16. Simple adaptive sparse representation based classification schemes for EEG based brain-computer interface applications.

    PubMed

    Shin, Younghak; Lee, Seungchan; Ahn, Minkyu; Cho, Hohyun; Jun, Sung Chan; Lee, Heung-No

    2015-11-01

    One of the main problems related to electroencephalogram (EEG) based brain-computer interface (BCI) systems is the non-stationarity of the underlying EEG signals. This results in the deterioration of the classification performance during experimental sessions. Therefore, adaptive classification techniques are required for EEG based BCI applications. In this paper, we propose simple adaptive sparse representation based classification (SRC) schemes. Supervised and unsupervised dictionary update techniques for new test data and a dictionary modification method by using the incoherence measure of the training data are investigated. The proposed methods are very simple and additional computation for the re-training of the classifier is not needed. The proposed adaptive SRC schemes are evaluated using two BCI experimental datasets. The proposed methods are assessed by comparing classification results with the conventional SRC and other adaptive classification methods. On the basis of the results, we find that the proposed adaptive schemes show relatively improved classification accuracy as compared to conventional methods without requiring additional computation. Copyright © 2015 Elsevier Ltd. All rights reserved.
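
    The core of a sparse representation based classification (SRC) scheme can be sketched in two steps: sparse-code a test sample over a dictionary whose columns (atoms) carry class labels, then assign the class whose atoms reconstruct the sample with the smallest residual. The orthogonal matching pursuit coder below is a common stand-in for the l1 solvers used in practice; the dictionary and labels are synthetic, not EEG data:

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal matching pursuit: greedy k-sparse code of y over D."""
    x = np.zeros(D.shape[1])
    resid, idx = y.astype(float).copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(D.T @ resid))))
        coef, *_ = np.linalg.lstsq(D[:, idx], y, rcond=None)
        resid = y - D[:, idx] @ coef
    x[idx] = coef
    return x

def src_classify(D, labels, y, k=2):
    """Assign y the class with the smallest class-restricted residual."""
    x = omp(D, y, k)
    residuals = {c: np.linalg.norm(y - D @ np.where(labels == c, x, 0.0))
                 for c in np.unique(labels)}
    return min(residuals, key=residuals.get)

rng = np.random.default_rng(4)
D = rng.normal(size=(20, 10))
D /= np.linalg.norm(D, axis=0)           # unit-norm atoms
labels = np.array([0] * 5 + [1] * 5)      # atoms 0-4 class 0, atoms 5-9 class 1
y = D[:, 7]                               # a sample lying in class 1's span
```

    The adaptive variants in the paper update the dictionary columns with new test data, which changes D between sessions without re-training a separate classifier.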

  17. Validation of a selective ensemble-based classification scheme for myoelectric control using a three-dimensional Fitts' Law test.

    PubMed

    Scheme, Erik J; Englehart, Kevin B

    2013-07-01

    When controlling a powered upper limb prosthesis, it is important to know not only how to move the device, but also when not to move. A novel approach to pattern recognition control, using a selective multiclass one-versus-one classification scheme, has been shown to be capable of rejecting unintended motions. This method was shown to outperform other popular classification schemes when presented with muscle contractions that did not correspond to desired actions. In this work, a 3-D Fitts' Law test is proposed as a suitable alternative to virtual limb environments for evaluating real-time myoelectric control performance. The test is used to compare the selective approach to a state-of-the-art linear discriminant analysis (LDA) classification scheme. The framework is shown to obey Fitts' Law for both control schemes, producing linear regression fits with high coefficients of determination (R(2) > 0.936). Additional performance metrics focused on quality of control are discussed and incorporated in the evaluation. Using this framework, the selective classification scheme is shown to produce significantly higher efficiency and completion rates, and significantly lower overshoot and stopping distances, with no significant difference in throughput.
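
    The Fitts' Law analysis itself is straightforward to reproduce. The sketch below uses the Shannon formulation ID = log2(D/W + 1), fits MT = a + b*ID by least squares, and reports R(2) and a simple throughput estimate; the target distances, widths, and movement times are invented numbers, not data from the study:

```python
import numpy as np

def index_of_difficulty(D, W):
    """Shannon formulation of the Fitts index of difficulty (bits)."""
    return np.log2(np.asarray(D, float) / np.asarray(W, float) + 1)

# Hypothetical target distances/widths and measured movement times (s).
D  = np.array([4.0, 8.0, 16.0, 32.0])
W  = np.array([2.0, 2.0, 2.0, 2.0])
MT = np.array([0.55, 0.78, 1.02, 1.29])

ID = index_of_difficulty(D, W)
b, a = np.polyfit(ID, MT, 1)            # MT = a + b * ID
pred = a + b * ID
ss_res = np.sum((MT - pred) ** 2)
ss_tot = np.sum((MT - MT.mean()) ** 2)
r2 = 1 - ss_res / ss_tot                # coefficient of determination
throughput = (ID / MT).mean()           # bits per second
```

    A control scheme "obeys" Fitts' Law when this fit is tight, which is what the reported R(2) > 0.936 values indicate.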

  18. Computer-aided detection and diagnosis of masses and clustered microcalcifications from digital mammograms

    NASA Astrophysics Data System (ADS)

    Nishikawa, Robert M.; Giger, Maryellen L.; Doi, Kunio; Vyborny, Carl J.; Schmidt, Robert A.; Metz, Charles E.; Wu, Chris Y.; Yin, Fang-Fang; Jiang, Yulei; Huo, Zhimin; Lu, Ping; Zhang, Wei; Ema, Takahiro; Bick, Ulrich; Papaioannou, John; Nagel, Rufus H.

    1993-07-01

    We are developing an 'intelligent' workstation to assist radiologists in diagnosing breast cancer from mammograms. The hardware for the workstation will consist of a film digitizer, a high speed computer, a large volume storage device, a film printer, and 4 high resolution CRT monitors. The software for the workstation is a comprehensive package of automated detection and classification schemes. Two rule-based detection schemes have been developed, one for breast masses and the other for clustered microcalcifications. The sensitivity of both schemes is 85% with a false-positive rate of approximately 3.0 and 1.5 false detections per image, for the mass and cluster detection schemes, respectively. Computerized classification is performed by an artificial neural network (ANN). The ANN has a sensitivity of 100% with a specificity of 60%. Currently, the ANN, which is a three-layer, feed-forward network, requires as input ratings of 14 different radiographic features of the mammogram that were determined subjectively by a radiologist. We are in the process of developing automated techniques to objectively determine these 14 features. The workstation will be placed in the clinical reading area of the radiology department in the near future, where controlled clinical tests will be performed to measure its efficacy.

  19. THE WESTERN LAKE SUPERIOR COMPARATIVE WATERSHED FRAMEWORK: A FIELD TEST OF GEOGRAPHICALLY-DEPENDENT VS. THRESHOLD-BASED GEOGRAPHICALLY-INDEPENDENT CLASSIFICATION

    EPA Science Inventory

    Stratified random selection of watersheds allowed us to compare geographically-independent classification schemes based on watershed storage (wetland + lake area/watershed area) and forest fragmentation with a geographically-based classification scheme within the Northern Lakes a...

  20. Reconsideration of the scheme of the international classification of functioning, disability and health: incentives from the Netherlands for a global debate.

    PubMed

    Heerkens, Yvonne F; de Weerd, Marjolein; Huber, Machteld; de Brouwer, Carin P M; van der Veen, Sabina; Perenboom, Rom J M; van Gool, Coen H; Ten Napel, Huib; van Bon-Martens, Marja; Stallinga, Hillegonda A; van Meeteren, Nico L U

    2018-03-01

    The ICF (International Classification of Functioning, Disability and Health) framework, used worldwide to describe 'functioning' and 'disability' and including the ICF scheme (a visualization of functioning as the result of interaction between a health condition and contextual factors), needs reconsideration. The purpose of this article is to discuss alternative ICF schemes. The ICF was reconsidered via a literature review and discussions with 23 Dutch ICF experts, and 26 experts were invited to rank the three resulting alternative schemes. The literature review provided five themes: 1) societal developments; 2) health and research influences; 3) conceptualization of health; 4) models/frameworks of health and disability; and 5) ICF criticism (e.g., the position of 'health condition' at the top and the role of 'contextual factors'). The experts concluded that the ICF scheme gives the impression that the medical perspective, rather than the biopsychosocial perspective, is dominant. Three alternative ICF schemes were ranked by 16 (62%) experts, resulting in one preferred scheme. There is a need for a new ICF scheme that better reflects the ICF framework, for further (inter)national consideration. These Dutch schemes should be reviewed on a global scale, to develop a scheme that is more consistent with current and foreseen developments and changing ideas on health. Implications for Rehabilitation: We propose that policy makers at the community, regional, and (inter)national levels consider using the alternative schemes of the International Classification of Functioning, Disability and Health in their plans to promote the functioning and health of their citizens, and that researchers and teachers incorporate the alternative schemes into their research and education to emphasize the biopsychosocial paradigm.
We propose setting up an international Delphi procedure, involving citizens (including patients) and experts in healthcare, occupational care, research, education, policy, and planning, to reach consensus on an alternative scheme of the International Classification of Functioning, Disability and Health. We recommend discussing the alternatives to the present scheme in the current update and revision process within the World Health Organization, as part of the discussion on the future of the ICF framework (including its ontology, title, and relation to the International Classification of Diseases). We also recommend revising the definition of personal factors, drafting a list of personal factors that can be used in policy making, clinical practice, research, and education, and putting effort into revising the present list of environmental factors to make it more useful in, e.g., occupational health care.

  1. A kernel-based novelty detection scheme for the ultra-fast detection of chirp-evoked Auditory Brainstem Responses.

    PubMed

    Corona-Strauss, Farah I; Delb, Wolfgang; Schick, Bernhard; Strauss, Daniel J

    2010-01-01

    Auditory Brainstem Responses (ABRs) are used as an objective method for the diagnosis and quantification of hearing loss. Many methods for the automatic recognition of ABRs have been developed, but none of them includes the individual measurement setup in the analysis. The purpose of this work was to design a fast recognition scheme for chirp-evoked ABRs that is adjusted to the individual measurement condition using spontaneous electroencephalographic activity (SA). For the classification, the kernel-based novelty detection scheme used features based on inter-sweep instantaneous phase synchronization as well as energy and entropy relations in the time-frequency domain. This method discriminated SA from stimulations above the hearing threshold with a minimum number of sweeps, i.e., 200 individual responses. It is concluded that the proposed paradigm, processing procedures and stimulation techniques improve the detection of ABRs in terms of the degree of objectivity, i.e., automation of the procedure, and measurement time.
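    As a rough illustration of the inter-sweep phase-synchronization features mentioned above, the sketch below computes a phase-locking value (PLV) across sweeps from an FFT-based analytic signal. This is a generic construction, not the authors' exact feature set; all function names and parameters are ours.

```python
import numpy as np

def analytic_signal(x):
    # FFT-based analytic signal (Hilbert transform); assumes even length
    n = len(x)
    spec = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = h[n // 2] = 1.0
    h[1:n // 2] = 2.0
    return np.fft.ifft(spec * h)

def phase_locking_value(sweeps):
    # sweeps: array (n_sweeps, n_samples); PLV per sample across sweeps.
    # PLV near 1 = phases aligned over sweeps (a response is present);
    # PLV near 0 = random phases (spontaneous activity only).
    phases = np.angle([analytic_signal(s) for s in sweeps])
    return np.abs(np.exp(1j * phases).mean(axis=0))
```

    With 200 sweeps, as in the abstract, the expected PLV of pure noise is roughly 1/sqrt(200) ≈ 0.07, so a phase-locked response stands out clearly.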

  2. Towards a Collaborative Intelligent Tutoring System Classification Scheme

    ERIC Educational Resources Information Center

    Harsley, Rachel

    2014-01-01

    This paper presents a novel classification scheme for Collaborative Intelligent Tutoring Systems (CITS), an emergent research field. The three emergent classifications of CITS are unstructured, semi-structured, and fully structured. While all three types of CITS offer opportunities to improve student learning gains, the full extent to which these…

  3. CNN universal machine as a classification platform: an ART-like clustering algorithm.

    PubMed

    Bálya, David

    2003-12-01

    Fast and robust classification of feature vectors is a crucial task in a number of real-time systems. A cellular neural/nonlinear network universal machine (CNN-UM) can be very efficient as a feature detector. The next step is to post-process the results for object recognition. This paper shows how a robust classification scheme based on adaptive resonance theory (ART) can be mapped to the CNN-UM. Moreover, this mapping is general enough to include different types of feed-forward neural networks. The designed analogic CNN algorithm is capable of classifying the extracted feature vectors while keeping the advantages of ART networks, such as robust, plastic and fault-tolerant behaviors. An analogic algorithm is presented for unsupervised classification with tunable sensitivity and automatic new class creation. The algorithm is extended to supervised classification. The presented binary feature vector classification is implemented on the existing standard CNN-UM chips for fast classification. The experimental evaluation shows promising performance, with 100% accuracy on the training set.
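    The ART behaviors described above (tunable sensitivity, automatic new class creation) can be sketched for binary feature vectors as a simplified ART1-style loop. This is not the analogic CNN-UM algorithm itself; the vigilance value and function names are illustrative.

```python
import numpy as np

def art_cluster(vectors, vigilance=0.7):
    # Simplified ART1-style clustering of binary feature vectors.
    # vigilance tunes sensitivity: a poor prototype match spawns a new class.
    prototypes, labels = [], []
    for v in vectors:
        label = None
        for k, p in enumerate(prototypes):
            match = np.logical_and(v, p).sum() / max(v.sum(), 1)
            if match >= vigilance:                    # vigilance (resonance) test
                prototypes[k] = np.logical_and(v, p)  # fast learning: intersect
                label = k
                break
        if label is None:                             # automatic new class creation
            prototypes.append(v.astype(bool).copy())
            label = len(prototypes) - 1
        labels.append(label)
    return labels, prototypes
```

    Raising the vigilance parameter produces more, finer-grained classes; lowering it merges inputs into fewer, coarser prototypes.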

  4. A new classification scheme of European cold-water coral habitats: Implications for ecosystem-based management of the deep sea

    NASA Astrophysics Data System (ADS)

    Davies, J. S.; Guillaumont, B.; Tempera, F.; Vertino, A.; Beuck, L.; Ólafsdóttir, S. H.; Smith, C. J.; Fosså, J. H.; van den Beld, I. M. J.; Savini, A.; Rengstorf, A.; Bayle, C.; Bourillet, J.-F.; Arnaud-Haond, S.; Grehan, A.

    2017-11-01

    Cold-water corals (CWC) can form complex structures which provide refuge, nursery grounds and physical support for a diversity of other living organisms. However, irrespective of such ecological significance, CWCs remain vulnerable to human pressures such as fishing, pollution, ocean acidification and global warming. Providing coherent and representative conservation of vulnerable marine ecosystems including CWCs is one of the aims of the Marine Protected Areas networks being implemented across European seas and oceans under the EC Habitats Directive, the Marine Strategy Framework Directive and the OSPAR Convention. In order to adequately represent ecosystem diversity, these initiatives require a standardised habitat classification that organises the variety of biological assemblages and provides consistent and functional criteria to map them across European Seas. One such classification system, EUNIS, enables a broad-level classification of the deep sea based on abiotic and geomorphological features. More detailed lower biotope-related levels are currently under-developed, particularly with regard to deep-water habitats (>200 m depth). This paper proposes a hierarchical CWC biotope classification scheme that could be incorporated into existing classification schemes such as EUNIS. The scheme was developed within the EU FP7 project CoralFISH to capture the variability of CWC habitats identified using a wealth of seafloor imagery datasets from across the Northeast Atlantic and Mediterranean. Depending on the resolution of the imagery being interpreted, this hierarchical scheme allows data to be recorded from broad CWC biotope categories down to detailed taxonomy-based levels, thereby providing a flexible yet valuable information level for management.
    The CWC biotope classification scheme identifies 81 biotopes and highlights the limitations of the classification framework and guidance provided by EUNIS, the EC Habitats Directive, OSPAR and FAO, which largely underrepresent CWC habitats.

  5. Map Classification: A Comparison of Schemes with Special Reference to the Continent of Africa. Occasional Papers, Number 154.

    ERIC Educational Resources Information Center

    Merrett, Christopher E.

    This guide to the theory and practice of map classification begins with a discussion of the filing of maps and the function of map classification based on area and theme as illustrated by four maps of Africa. The description of the various classification systems which follows is divided into book schemes with provision for maps (including Dewey…

  6. Predominant-period site classification for response spectra prediction equations in Italy

    USGS Publications Warehouse

    Di Alessandro, Carola; Bonilla, Luis Fabian; Boore, David M.; Rovelli, Antonio; Scotti, Oona

    2012-01-01

    We propose a site‐classification scheme based on the predominant period of the site, as determined from the average horizontal‐to‐vertical (H/V) spectral ratios of ground motion. Our scheme extends Zhao et al. (2006) classifications by adding two classes, the most important of which is defined by flat H/V ratios with amplitudes less than 2. The proposed classification is investigated by using 5%‐damped response spectra from Italian earthquake records. We select a dataset of 602 three‐component analog and digital recordings from 120 earthquakes recorded at 214 seismic stations within a hypocentral distance of 200 km. Selected events are in the moment‐magnitude range 4.0≤Mw≤6.8 and focal depths from a few kilometers to 46 km. We computed H/V ratios for these data and used them to classify each site into one of six classes. We then investigate the impact of this classification scheme on empirical ground‐motion prediction equations (GMPEs) by comparing its performance with that of the conventional rock/soil classification. Although the adopted approach results in only a small reduction of the overall standard deviation, the use of H/V spectral ratios in site classification does capture the signature of sites with flat frequency‐response, as well as deep and shallow‐soil profiles, characterized by long‐ and short‐period resonance, respectively; in addition, the classification scheme is relatively quick and inexpensive, which is an advantage over schemes based on measurements of shear‐wave velocity.
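    The classification step described above can be sketched as follows, assuming the H/V spectral ratios have already been computed on a grid of periods. The period-band boundaries and class names below are hypothetical placeholders, not the scheme's actual classes, apart from the added "flat" class for sites whose H/V amplitudes stay below 2.

```python
import numpy as np

# Hypothetical period bands (seconds); the paper's actual class
# boundaries differ. "flat" mirrors the added low-amplitude class.
BANDS = [(0.0, 0.2, "SC-I"), (0.2, 0.4, "SC-II"),
         (0.4, 0.6, "SC-III"), (0.6, float("inf"), "SC-IV")]

def classify_site(periods, h_spectrum, v_spectrum, flat_threshold=2.0):
    # Average horizontal-to-vertical spectral ratio per period
    hv = np.asarray(h_spectrum) / np.asarray(v_spectrum)
    if hv.max() < flat_threshold:
        return "flat"                  # no clear resonance peak
    t_predominant = periods[np.argmax(hv)]   # predominant period of the site
    for lo, hi, name in BANDS:
        if lo <= t_predominant < hi:
            return name
```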

  7. Classification scheme for phenomenological universalities in growth problems in physics and other sciences.

    PubMed

    Castorina, P; Delsanto, P P; Guiot, C

    2006-05-12

    A classification in universality classes of broad categories of phenomenologies, belonging to physics and other disciplines, may be very useful for a cross fertilization among them and for the purpose of pattern recognition and interpretation of experimental data. We present here a simple scheme for the classification of nonlinear growth problems. The success of the scheme in predicting and characterizing the well known Gompertz, West, and logistic models, suggests to us the study of a hitherto unexplored class of nonlinear growth problems.
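    For reference, the two best-known growth laws named above can be written in closed form. A minimal sketch, with symbols and default parameters chosen by us:

```python
import numpy as np

def gompertz(t, n0=0.1, a=1.0):
    # Solution of dN/dt = a * N * ln(1/N); saturates at N = 1
    return np.exp(np.log(n0) * np.exp(-a * t))

def logistic(t, n0=0.1, a=1.0):
    # Solution of dN/dt = a * N * (1 - N); saturates at N = 1
    return 1.0 / (1.0 + (1.0 / n0 - 1.0) * np.exp(-a * t))
```

    Both curves start at n0 and saturate at the same carrying capacity, but approach it with different shapes, which is what a phenomenological classification of growth problems distinguishes.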

  8. Objectification of Orthodontic Treatment Needs: Does the Classification of Malocclusions or a History of Orthodontic Treatment Matter?

    PubMed

    Kozanecka, Anna; Sarul, Michał; Kawala, Beata; Antoszewska-Smith, Joanna

    2016-01-01

    Orthodontic classifications make it possible to give an accurate diagnosis but do not indicate an objective orthodontic treatment need. In order to evaluate the need for treatment, it is necessary to use such indicators as the IOTN. The aim of the study was to find (i) relationships between individual diagnosis and objective recommendations for treatment and (ii) an answer to the question whether and which occlusal anomalies play an important role in the objectification of treatment needs. Two hundred three 18-year-old adolescents (104 girls, 99 boys) were examined. In order to recognize occlusal anomalies, the classifications proposed by Orlik-Grzybowska and Ackerman-Proffit were used. The occlusal anomalies were divided into three categories: belonging to both classifications, typical of the Orlik-Grzybowska classification, and typical of the Ackerman-Proffit classification. In order to determine the objective need for orthodontic treatment, the Dental Health Component (DHC) of the IOTN was used. The occurrence of the following malocclusions covered by both classifications, namely abnormal overjet, crossbite and Angle's class, had a statistically significant (p < 0.05) impact on an increase of treatment needs in the subjects (DHC > 3). As for the classification by Orlik-Grzybowska, dental malpositions and canine class significantly affected the need for orthodontic treatment, while in the case of the Ackerman-Proffit scheme, it was asymmetry and crowding. There was no statistically significant correlation between past orthodontic treatment and current orthodontic treatment need. IOTN may be affected by a greater number of occlusal anomalies than was assumed. Orthodontic treatment received in the past slightly reduces the need for treatment in 18-year-olds.

  9. Enhancing Vocabulary Acquisition through Reading: A Hierarchy of Text-Related Exercise Types.

    ERIC Educational Resources Information Center

    Wesche, M.; Paribakht, T. Sima

    This paper describes a classification scheme developed to examine the effects of extensive reading on primary and second language vocabulary acquisition and reports on an experiment undertaken to test the model scheme. The classification scheme represents a hypothesized hierarchy of the degree and type of mental processing required by various…

  10. Using Simulations to Investigate the Longitudinal Stability of Alternative Schemes for Classifying and Identifying Children with Reading Disabilities

    ERIC Educational Resources Information Center

    Schatschneider, Christopher; Wagner, Richard K.; Hart, Sara A.; Tighe, Elizabeth L.

    2016-01-01

    The present study employed data simulation techniques to investigate the 1-year stability of alternative classification schemes for identifying children with reading disabilities. Classification schemes investigated include low performance, unexpected low performance, dual-discrepancy, and a rudimentary form of constellation model of reading…

  11. Mammogram classification scheme using 2D-discrete wavelet and local binary pattern for detection of breast cancer

    NASA Astrophysics Data System (ADS)

    Adi Putra, Januar

    2018-04-01

    In this paper, we propose a new mammogram classification scheme to classify breast tissue as normal or abnormal. A feature matrix is generated by applying the Local Binary Pattern to all the detail coefficients from the 2D-DWT of the region of interest (ROI) of a mammogram. Feature selection is then performed to reduce the dimensionality of the data and discard irrelevant features; in this paper, the F-test and T-test are applied to the extracted features to select the relevant ones. The best features are used in a Neural Network classifier for classification. In this research we use the MIAS and DDSM databases. In addition to the suggested scheme, competing schemes are also simulated for comparative analysis. It is observed that the proposed scheme performs better with respect to accuracy, specificity and sensitivity. Based on the experiments, the proposed scheme achieves an accuracy of up to 92.71%, while the lowest accuracy obtained is 77.08%.
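    A minimal, dependency-free sketch of the two feature-extraction stages described above: a one-level 2D Haar DWT followed by an 8-neighbour LBP histogram on a detail subband. The study's actual wavelet, LBP variant and parameters may differ; this only illustrates the pipeline shape.

```python
import numpy as np

def haar_dwt2(img):
    # One-level 2D Haar DWT; image dimensions must be even.
    rows_lo = (img[0::2, :] + img[1::2, :]) / 2
    rows_hi = (img[0::2, :] - img[1::2, :]) / 2
    ll = (rows_lo[:, 0::2] + rows_lo[:, 1::2]) / 2  # approximation
    lh = (rows_lo[:, 0::2] - rows_lo[:, 1::2]) / 2  # horizontal detail
    hl = (rows_hi[:, 0::2] + rows_hi[:, 1::2]) / 2  # vertical detail
    hh = (rows_hi[:, 0::2] - rows_hi[:, 1::2]) / 2  # diagonal detail
    return ll, lh, hl, hh

def lbp_histogram(x):
    # 8-neighbour local binary pattern histogram (256 bins) of a 2D array.
    c = x[1:-1, 1:-1]
    h, w = c.shape
    code = np.zeros((h, w), dtype=int)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        nb = x[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
        code |= (nb >= c).astype(int) << bit
    return np.bincount(code.ravel(), minlength=256)
```

    In the proposed scheme, histograms like these, computed for each detail subband of the ROI, would be concatenated into the feature matrix before F-test/T-test selection.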

  12. Consensus embedding: theory, algorithms and application to segmentation and classification of biomedical data

    PubMed Central

    2012-01-01

    Background Dimensionality reduction (DR) enables the construction of a lower dimensional space (embedding) from a higher dimensional feature space while preserving object-class discriminability. However several popular DR approaches suffer from sensitivity to choice of parameters and/or presence of noise in the data. In this paper, we present a novel DR technique known as consensus embedding that aims to overcome these problems by generating and combining multiple low-dimensional embeddings, hence exploiting the variance among them in a manner similar to ensemble classifier schemes such as Bagging. We demonstrate theoretical properties of consensus embedding which show that it will result in a single stable embedding solution that preserves information more accurately as compared to any individual embedding (generated via DR schemes such as Principal Component Analysis, Graph Embedding, or Locally Linear Embedding). Intelligent sub-sampling (via mean-shift) and code parallelization are utilized to provide for an efficient implementation of the scheme. Results Applications of consensus embedding are shown in the context of classification and clustering as applied to: (1) image partitioning of white matter and gray matter on 10 different synthetic brain MRI images corrupted with 18 different combinations of noise and bias field inhomogeneity, (2) classification of 4 high-dimensional gene-expression datasets, (3) cancer detection (at a pixel-level) on 16 image slices obtained from 2 different high-resolution prostate MRI datasets. In over 200 different experiments concerning classification and segmentation of biomedical data, consensus embedding was found to consistently outperform both linear and non-linear DR methods within all applications considered. 
Conclusions We have presented a novel framework termed consensus embedding which leverages ensemble classification theory within dimensionality reduction, allowing for application to a wide range of high-dimensional biomedical data classification and segmentation problems. Our generalizable framework allows for improved representation and classification in the context of both imaging and non-imaging data. The algorithm offers a promising solution to problems that currently plague DR methods, and may allow for extension to other areas of biomedical data analysis. PMID:22316103
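    The core idea, combining many weak low-dimensional embeddings into one consensus representation, can be sketched by averaging pairwise distances over repeated PCA runs on random feature subsets. This omits the paper's mean-shift sub-sampling and parallelization, and all parameter values are illustrative.

```python
import numpy as np

def weak_embedding(X, ndim, rng, feat_frac=0.6):
    # PCA on a random feature subset: one "weak" base embedding
    n_feat = max(ndim, int(feat_frac * X.shape[1]))
    cols = rng.choice(X.shape[1], n_feat, replace=False)
    Xs = X[:, cols] - X[:, cols].mean(axis=0)
    _, _, vt = np.linalg.svd(Xs, full_matrices=False)
    return Xs @ vt[:ndim].T

def consensus_distances(X, n_embeddings=10, ndim=2, seed=0):
    # Consensus: average pairwise distances over many weak embeddings,
    # exploiting variance among them in the spirit of Bagging.
    rng = np.random.default_rng(seed)
    D = np.zeros((len(X), len(X)))
    for _ in range(n_embeddings):
        E = weak_embedding(X, ndim, rng)
        D += np.linalg.norm(E[:, None, :] - E[None, :, :], axis=-1)
    return D / n_embeddings
```

    The consensus distance matrix can then feed any distance-based classifier or clustering step, which is how such a scheme supports both segmentation and classification.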

  13. A recursive field-normalized bibliometric performance indicator: an application to the field of library and information science.

    PubMed

    Waltman, Ludo; Yan, Erjia; van Eck, Nees Jan

    2011-10-01

    Two commonly used ideas in the development of citation-based research performance indicators are the idea of normalizing citation counts based on a field classification scheme and the idea of recursive citation weighing (like in PageRank-inspired indicators). We combine these two ideas in a single indicator, referred to as the recursive mean normalized citation score indicator, and we study the validity of this indicator. Our empirical analysis shows that the proposed indicator is highly sensitive to the field classification scheme that is used. The indicator also has a strong tendency to reinforce biases caused by the classification scheme. Based on these observations, we advise against the use of indicators in which the idea of normalization based on a field classification scheme and the idea of recursive citation weighing are combined.
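    The combination of field normalization and recursive citation weighting can be sketched as a fixed-point iteration. The mixing rule and the alpha parameter below are our assumptions for illustration, not the indicator's published definition; the sketch does show why results depend strongly on the field partition, since normalization is redone within each field at every step.

```python
import numpy as np

def recursive_mncs(cites, fields, alpha=0.5, iters=50):
    # cites[i, j] = 1 if paper i cites paper j.
    # A paper's value mixes a baseline with the values of its citing
    # papers, then is normalized so every field's mean value equals 1.
    n = len(fields)
    v = np.ones(n)
    for _ in range(iters):
        received = cites.T @ v                  # value-weighted citations
        new = (1 - alpha) + alpha * received    # hypothetical mixing rule
        for f in set(fields):
            mask = np.array([x == f for x in fields])
            new[mask] = new[mask] / new[mask].mean()  # field normalization
        v = new
    return v
```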

  14. Infant Mortality: Development of a Proposed Update to the Dollfus Classification of Infant Deaths

    PubMed Central

    Dove, Melanie S.; Minnal, Archana; Damesyn, Mark; Curtis, Michael P.

    2015-01-01

    Objective Identifying infant deaths with common underlying causes and potential intervention points is critical to infant mortality surveillance and the development of prevention strategies. We constructed an International Classification of Diseases 10th Revision (ICD-10) parallel to the Dollfus cause-of-death classification scheme first published in 1990, which organized infant deaths by etiology and their amenability to prevention efforts. Methods Infant death records for 1996, dual-coded to the ICD Ninth Revision (ICD-9) and ICD-10, were obtained from the CDC public-use multiple-cause-of-death file on comparability between ICD-9 and ICD-10. We used the underlying cause of death to group 27,821 infant deaths into the nine categories of the ICD-9-based update to Dollfus' original coding scheme, published by Sowards in 1999. Comparability ratios were computed to measure concordance between ICD versions. Results The Dollfus classification system updated with ICD-10 codes had limited agreement with the 1999 modified classification system. Although prematurity, congenital malformations, Sudden Infant Death Syndrome, and obstetric conditions were the first through fourth most common causes of infant death under both systems, most comparability ratios were significantly different from one system to the other. Conclusion The Dollfus classification system can be adapted for use with ICD-10 codes to create a comprehensive, etiology-based profile of infant deaths. The potential benefits of using Dollfus logic to guide perinatal mortality reduction strategies, particularly to maternal and child health programs and other initiatives focused on improving infant health, warrant further examination of this method's use in perinatal mortality surveillance. PMID:26556935
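    The two computational ingredients of the method, mapping ICD-10 underlying-cause codes into Dollfus-style categories and computing comparability ratios between revisions, can be sketched as follows. The prefix-to-category map is illustrative only, not the study's actual crosswalk.

```python
# Hypothetical ICD-10 prefix -> Dollfus-style category map (illustrative)
CATEGORY_BY_PREFIX = {
    "P07": "Prematurity",
    "Q":   "Congenital malformations",
    "R95": "Sudden Infant Death Syndrome",
    "P01": "Obstetric conditions",
}

def dollfus_category(icd10_code):
    # Assign an underlying-cause code to an etiology-based category
    for prefix, category in CATEGORY_BY_PREFIX.items():
        if icd10_code.startswith(prefix):
            return category
    return "Other"

def comparability_ratio(deaths_icd10, deaths_icd9):
    # Deaths assigned to a category under ICD-10 per death under ICD-9;
    # a ratio near 1 indicates good concordance between revisions.
    return deaths_icd10 / deaths_icd9
```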

  15. An efficient fully unsupervised video object segmentation scheme using an adaptive neural-network classifier architecture.

    PubMed

    Doulamis, A; Doulamis, N; Ntalianis, K; Kollias, S

    2003-01-01

    In this paper, an unsupervised video object (VO) segmentation and tracking algorithm is proposed based on an adaptable neural-network architecture. The proposed scheme comprises: 1) a VO tracking module and 2) an initial VO estimation module. Object tracking is handled as a classification problem and implemented through an adaptive network classifier, which provides better results compared to conventional motion-based tracking algorithms. Network adaptation is accomplished through an efficient and cost effective weight updating algorithm, providing a minimum degradation of the previous network knowledge and taking into account the current content conditions. A retraining set is constructed and used for this purpose based on initial VO estimation results. Two different scenarios are investigated. The first concerns extraction of human entities in video conferencing applications, while the second exploits depth information to identify generic VOs in stereoscopic video sequences. Human face/ body detection based on Gaussian distributions is accomplished in the first scenario, while segmentation fusion is obtained using color and depth information in the second scenario. A decision mechanism is also incorporated to detect time instances for weight updating. Experimental results and comparisons indicate the good performance of the proposed scheme even in sequences with complicated content (object bending, occlusion).

  16. Investigations on classification categories for wetlands of Chesapeake Bay using remotely sensed data

    NASA Technical Reports Server (NTRS)

    Williamson, F. S. L.

    1974-01-01

    The use of remote sensors to determine the characteristics of the wetlands of the Chesapeake Bay and surrounding areas is discussed. The objectives of the program are stated as follows: (1) to use data and remote sensing techniques developed from studies of Rhode River, West River, and South River salt marshes to develop a wetland classification scheme useful in other regions of the Chesapeake Bay and to evaluate the classification system with respect to vegetation types, marsh physiography, man-induced perturbation, and salinity; and (2) to develop a program using remote sensing techniques, for the extension of the classification to Chesapeake Bay salt marshes and to coordinate this program with the goals of the Chesapeake Research Consortium and the states of Maryland and Virginia. Maps of the Chesapeake Bay areas are developed from aerial photographs to display the wetland structure and vegetation.

  17. A Hybrid Classification System for Heart Disease Diagnosis Based on the RFRS Method.

    PubMed

    Liu, Xiao; Wang, Xiaoli; Su, Qiang; Zhang, Mo; Zhu, Yanhong; Wang, Qiugen; Wang, Qian

    2017-01-01

    Heart disease is one of the most common diseases in the world. The objective of this study is to aid the diagnosis of heart disease using a hybrid classification system based on the ReliefF and Rough Set (RFRS) method. The proposed system contains two subsystems: the RFRS feature selection system and a classification system with an ensemble classifier. The first system includes three stages: (i) data discretization, (ii) feature extraction using the ReliefF algorithm, and (iii) feature reduction using the heuristic Rough Set reduction algorithm that we developed. In the second system, an ensemble classifier is proposed based on the C4.5 classifier. The Statlog (Heart) dataset, obtained from the UCI database, was used for experiments. A maximum classification accuracy of 92.59% was achieved according to a jackknife cross-validation scheme. The results demonstrate that the performance of the proposed system is superior to the performances of previously reported classification techniques.
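    Stage (ii) above, ReliefF feature scoring, can be sketched as follows. This simplified version uses a single nearest hit and a single nearest miss per sampled instance rather than the k-nearest form usually applied; features that separate classes while staying stable within a class accumulate high weights.

```python
import numpy as np

def relieff_scores(X, y, n_samples=100, seed=0):
    # Simplified ReliefF: for each sampled instance, reward features that
    # differ from the nearest miss (other class) and penalize features
    # that differ from the nearest hit (same class).
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    span = X.max(axis=0) - X.min(axis=0) + 1e-12  # per-feature range
    for i in rng.integers(0, len(X), n_samples):
        d = np.abs(X - X[i]).sum(axis=1)
        d[i] = np.inf                              # exclude the instance itself
        same, diff = y == y[i], y != y[i]
        hit = np.flatnonzero(same & (d == d[same].min()))[0]
        miss = np.flatnonzero(diff & (d == d[diff].min()))[0]
        w += (np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])) / span
    return w / n_samples
```

    In a pipeline like the one described, the top-weighted features would then pass to the Rough Set reduction stage before classification.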

  18. Change classification in SAR time series: a functional approach

    NASA Astrophysics Data System (ADS)

    Boldt, Markus; Thiele, Antje; Schulz, Karsten; Hinz, Stefan

    2017-10-01

    Change detection represents a broad field of research in SAR remote sensing, comprising many different approaches. Besides the simple recognition of change areas, analysis of the type, category or class of the change areas is at least as important for creating a comprehensive result. Conventional strategies for change classification are based on supervised or unsupervised land-use/land-cover classifications. The main drawback of such approaches is that the quality of the classification result depends directly on the selection of training and reference data. Additionally, supervised processing methods require an experienced operator who capably selects the training samples. This training step is not necessary when using unsupervised strategies, but meaningful reference data must nevertheless be available for identifying the resulting classes. Consequently, an experienced operator is indispensable. In this study, an innovative concept for the classification of changes in SAR time series data is proposed. In contrast to the traditional strategies outlined above, it requires no training data. Moreover, the method can be applied by an operator who does not yet have detailed knowledge of the scenery; this knowledge is provided by the algorithm. The final step of the procedure, whose main aspect is the iterative optimization of an initial class scheme with respect to the categorized change objects, is the assignment of these objects to the final classes. This assignment step is the subject of this paper.
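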

  19. Defining functional biomes and monitoring their change globally.

    PubMed

    Higgins, Steven I; Buitenwerf, Robert; Moncrieff, Glenn R

    2016-11-01

    Biomes are important constructs for organizing understanding of how the world's major terrestrial ecosystems differ from one another and for monitoring change in these ecosystems. Yet existing biome classification schemes have been criticized for being overly subjective and for explicitly or implicitly invoking climate. We propose a new biome map and classification scheme that uses information on (i) an index of vegetation productivity, (ii) whether the minimum of vegetation activity is in the driest or coldest part of the year, and (iii) vegetation height. Although biomes produced on the basis of this classification show a strong spatial coherence, they show little congruence with existing biome classification schemes. Our biome map provides an alternative classification scheme for comparing the biogeochemical rates of terrestrial ecosystems. We use this new biome classification scheme to analyse the patterns of biome change observed over recent decades. Overall, 13% to 14% of analysed pixels shifted in biome state over the 30-year study period. A wide range of biome transitions were observed. For example, biomes with tall vegetation and minimum vegetation activity in the cold season shifted to higher-productivity biome states. Biomes with short vegetation and low seasonality shifted to seasonally moisture-limited biome states. Our findings and method provide a new source of data for rigorously monitoring global vegetation change, analysing drivers of vegetation change and for benchmarking models of terrestrial ecosystem function. © 2016 John Wiley & Sons Ltd.
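    The three-trait decision rule described above can be sketched as a simple classifier. The numeric thresholds below are hypothetical stand-ins for the paper's calibrated cut-points.

```python
def classify_biome(productivity, minimum_in_dry_season, height_m,
                   productivity_split=0.5, tall_threshold_m=5.0):
    # Rule-based sketch of the three-trait scheme:
    # (i) productivity index, (ii) timing of the vegetation-activity
    # minimum, (iii) vegetation height. Thresholds are illustrative.
    prod = "high-productivity" if productivity >= productivity_split else "low-productivity"
    season = "dry-season minimum" if minimum_in_dry_season else "cold-season minimum"
    stature = "tall" if height_m >= tall_threshold_m else "short"
    return f"{prod}, {stature}, {season}"
```

    Applied per pixel and per year, such rules allow biome-state transitions to be counted directly, which is how shifts like those reported (13% to 14% of pixels over 30 years) can be monitored.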

  20. Etiological classification of ischemic stroke in young patients: a comparative study of TOAST, CCS, and ASCO.

    PubMed

    Gökçal, Elif; Niftaliyev, Elvin; Asil, Talip

    2017-09-01

    Analysis of stroke subtypes is important for making treatment decisions and prognostic evaluations. The TOAST classification system is most commonly used, but the CCS and ASCO classification systems might be more useful to identify stroke etiologies in young patients whose strokes have a wide range of different causes. In this manuscript, we aim to compare the differences in subtype classification between TOAST, CCS, and ASCO in young stroke patients. The TOAST, CCS, and ASCO classification schemes were applied to 151 patients with ischemic stroke aged 18-49 years, and the proportion of subtypes classified by each scheme was compared. For comparison, determined etiologies were defined as cases with evident and probable subtypes when using the CCS scheme and cases with grade 1 and 2 subtypes but no other grade 1 subtype when using the ASCO scheme. The McNemar test with Bonferroni correction was used to assess significance. By TOAST, 41.1% of patients' stroke etiology was classified as undetermined etiology, 19.2% as cardioembolic, 13.2% as large artery atherosclerosis, 11.3% as small vessel occlusion, and 15.2% as other causes. Compared with TOAST, both CCS and ASCO assigned fewer patients to the undetermined etiology group (30.5% p < 0.001 and 26.5% p < 0.001, respectively) and assigned more patients to the small vessel occlusion category (19.9%, p < 0.001, and 21.9%, p < 0.001, respectively). Additionally, both schemes assigned more patients to the large artery atherosclerosis group (15.9 and 16.6%, respectively). The proportion of patients assigned to either the cardioembolic or the other causes etiology did not differ significantly between the three schemes. Application of the CCS and ASCO classification schemes in young stroke patients seems feasible, and using both schemes may result in fewer patients being classified as undetermined etiology. New studies with more patients and a prospective design are needed to explore this topic further.
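    The significance test used above can be sketched directly: a continuity-corrected McNemar statistic on the discordant pairs (patients classified differently by the two schemes), with a Bonferroni correction for the three pairwise scheme comparisons. Function names and the chi-square tail computation are ours.

```python
from math import erf, sqrt

def mcnemar(b, c):
    # b, c: discordant-pair counts (classified one way by scheme A only,
    # the other way by scheme B only); continuity-corrected statistic.
    chi2 = (abs(b - c) - 1) ** 2 / (b + c)
    # Chi-square(1 df) tail via the standard normal: P(X > chi2) = 2*(1 - Phi(sqrt(chi2)))
    z = sqrt(chi2)
    p = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))
    return chi2, p

def bonferroni(p, n_tests=3):
    # Corrected p-value for the three pairwise scheme comparisons
    return min(1.0, p * n_tests)
```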

  1. Development and application of a new comprehensive image-based classification scheme for coastal and benthic environments along the southeast Florida continental shelf

    NASA Astrophysics Data System (ADS)

    Makowski, Christopher

    The coastal (terrestrial) and benthic environments along the southeast Florida continental shelf show a unique biophysical succession of marine features from a highly urbanized, developed coastal region in the north (i.e. northern Miami-Dade County) to a protective marine sanctuary in the southeast (i.e. Florida Keys National Marine Sanctuary). However, a standard bio-geomorphological classification scheme for this area of coastal and benthic environments is lacking. The purpose of this study was to test whether new parameters integrating geomorphological components with dominant biological covers could be developed and applied across multiple remote sensing platforms as an innovative way to identify, interpret, and classify diverse coastal and benthic environments along the southeast Florida continental shelf. An ordered, manageable hierarchical classification scheme was developed to incorporate the categories of Physiographic Realm, Morphodynamic Zone, Geoform, Landform, Dominant Surface Sediment, and Dominant Biological Cover. Six different remote sensing platforms (i.e. five multi-spectral satellite image sensors and one high-resolution aerial orthoimagery) were acquired, delineated according to the new classification scheme, and compared to determine optimal formats for classifying the study area. Cognitive digital classification at a nominal scale of 1:6000 proved to be more accurate than autoclassification programs and was therefore used to differentiate coastal marine environments based on spectral reflectance characteristics, such as color, tone, saturation, pattern, and texture of the seafloor topology. In addition, attribute tables were created in conjunction with interpretations to quantify and compare the spatial relationships between classificatory units. IKONOS-2 satellite imagery was determined to be the optimal platform for applying the hierarchical classification scheme.
However, each remote sensing platform had beneficial properties depending on research goals, logistical restrictions, and financial support. This study concluded that a new hierarchical comprehensive classification scheme for identifying coastal marine environments along the southeast Florida continental shelf could be achieved by integrating geomorphological features with biological coverages. This newly developed scheme, which can be applied across multiple remote sensing platforms with GIS software, establishes an innovative classification protocol to be used in future research studies.
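    The six-level hierarchy described above can be represented as a simple record type that emits a classification label down to any chosen level of detail, which mirrors how the scheme lets coarse imagery stop at broad categories while high-resolution imagery records the full path. The example values are illustrative, not drawn from the study's data.

```python
from dataclasses import dataclass, astuple

@dataclass
class BenthicUnit:
    # Six hierarchy levels from the proposed scheme, broad to detailed
    physiographic_realm: str
    morphodynamic_zone: str
    geoform: str
    landform: str
    dominant_sediment: str
    dominant_bio_cover: str

    def label(self, depth=6):
        # Classification string down to a chosen level of detail
        return " > ".join(astuple(self)[:depth])
```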

  2. Demonstration of Advanced EMI Models for Live-Site UXO Discrimination at Former Camp Butner, North Carolina

    DTIC Science & Technology

    2012-05-01

    Table-of-contents fragments recovered from the report indicate sections on classification using template matching (2.3.3), details of classification schemes (2.4), and the Camp Butner TEMTADS data inversion and classification scheme (2.4.1).

  3. Transporter taxonomy - a comparison of different transport protein classification schemes.

    PubMed

    Viereck, Michael; Gaulton, Anna; Digles, Daniela; Ecker, Gerhard F

    2014-06-01

    Currently, there are more than 800 well-characterized human membrane transport proteins (including channels and transporters), and it is estimated that about 10% (approx. 2000) of all human genes are related to transport. Membrane transport proteins are of interest as potential drug targets, for drug delivery, and as a cause of side effects and drug–drug interactions. In light of the development of Open PHACTS, which provides an open pharmacological space, we analyzed selected membrane transport protein classification schemes (Transporter Classification Database, ChEMBL, IUPHAR/BPS Guide to Pharmacology, and Gene Ontology) for their ability to serve as a basis for pharmacology-driven protein classification. A comparison of these membrane transport protein classification schemes using a set of clinically relevant transporters as a use case reveals the strengths and weaknesses of the different taxonomy approaches.

  4. A scheme for a flexible classification of dietary and health biomarkers.

    PubMed

    Gao, Qian; Praticò, Giulia; Scalbert, Augustin; Vergères, Guy; Kolehmainen, Marjukka; Manach, Claudine; Brennan, Lorraine; Afman, Lydia A; Wishart, David S; Andres-Lacueva, Cristina; Garcia-Aloy, Mar; Verhagen, Hans; Feskens, Edith J M; Dragsted, Lars O

    2017-01-01

    Biomarkers are an efficient means to examine intakes or exposures and their biological effects and to assess system susceptibility. Aided by novel profiling technologies, the biomarker research field is undergoing rapid development and new putative biomarkers are continuously emerging in the scientific literature. However, the existing concepts for classification of biomarkers in the dietary and health area may be ambiguous, leading to uncertainty about their application. In order to better understand the potential of biomarkers and to communicate their use and application, it is imperative to have a solid scheme for biomarker classification that will provide a well-defined ontology for the field. In this manuscript, we provide an improved scheme for biomarker classification based on their intended use rather than the technology or outcomes (six subclasses are suggested: food compound intake biomarkers (FCIBs), food or food component intake biomarkers (FIBs), dietary pattern biomarkers (DPBs), food compound status biomarkers (FCSBs), effect biomarkers, physiological or health state biomarkers). The application of this scheme is described in detail for the dietary and health area and is compared with previous biomarker classification for this field of research.
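The six subclasses proposed in the abstract can be sketched as a simple enumeration; the `classify` lookup below is purely illustrative (the keys are invented shorthand, not part of the published scheme):

```python
from enum import Enum

class BiomarkerClass(Enum):
    """Six subclasses of the proposed scheme, keyed by abbreviation."""
    FCIB = "food compound intake biomarker"
    FIB = "food or food component intake biomarker"
    DPB = "dietary pattern biomarker"
    FCSB = "food compound status biomarker"
    EFFECT = "effect biomarker"
    HEALTH = "physiological or health state biomarker"

def classify(intended_use: str) -> BiomarkerClass:
    """Toy lookup by intended-use keyword (hypothetical keys)."""
    lookup = {
        "compound intake": BiomarkerClass.FCIB,
        "food intake": BiomarkerClass.FIB,
        "dietary pattern": BiomarkerClass.DPB,
        "compound status": BiomarkerClass.FCSB,
        "effect": BiomarkerClass.EFFECT,
        "health state": BiomarkerClass.HEALTH,
    }
    return lookup[intended_use]

print(classify("dietary pattern"))  # → BiomarkerClass.DPB
```

The point of the scheme, mirrored here, is that classification keys on intended use rather than on measurement technology or outcome.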

  5. An efficient scheme for automatic web pages categorization using the support vector machine

    NASA Astrophysics Data System (ADS)

    Bhalla, Vinod Kumar; Kumar, Neeraj

    2016-07-01

    In the past few years, with the evolution of the Internet and related technologies, the number of Internet users has grown exponentially. These users demand access to relevant web pages within a fraction of a second, which requires efficient categorization of web page contents. Manual categorization of billions of web pages with high accuracy is an infeasible task, and most existing techniques reported in the literature are semi-automatic and cannot reach high accuracy. This paper therefore proposes automatic categorization of web pages into domain categories. The proposed scheme is based on the identification of specific, relevant features of the web pages: features are first extracted and evaluated, and the feature set is then filtered for categorization of domain web pages. A feature extraction tool based on the HTML document object model of the web page was developed for the scheme. Feature extraction and weight assignment are based on a domain-specific keyword list compiled from various domain pages; the keyword list is reduced on the basis of keyword ids, and stemming of keywords and tag text is applied to improve accuracy. An extensive feature set is generated to develop a robust classification technique. The proposed scheme was evaluated using machine learning in combination with feature extraction and statistical analysis, with a support vector machine as the classification tool. The results confirm the effectiveness of the proposed scheme in terms of accuracy across different categories of web pages.
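A minimal sketch of this kind of pipeline (text features feeding an SVM classifier) using scikit-learn; the tiny corpus and category labels are invented for illustration, and TF-IDF stands in for the paper's DOM-based keyword weighting:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Toy training data: page text and domain labels (illustrative only).
pages = [
    "buy cheap flights hotel booking travel deals",
    "breaking news election government policy",
    "python tutorial code function variable loop",
    "football match score league goal team",
]
labels = ["travel", "news", "programming", "sports"]

# TF-IDF features (with simple stop-word filtering) feed a linear SVM.
clf = make_pipeline(TfidfVectorizer(stop_words="english"), LinearSVC())
clf.fit(pages, labels)

print(clf.predict(["live score of the league match"]))  # → ['sports']
```

In practice the feature vectors would come from domain keyword lists and DOM tag text, as described in the abstract, rather than raw TF-IDF.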

  6. Model Validation and Site Characterization for Early Deployment MHK Sites and Establishment of Wave Classification Scheme

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kilcher, Levi F

    Model Validation and Site Characterization for Early Deployment Marine and Hydrokinetic Energy Sites and Establishment of Wave Classification Scheme, a presentation from the Water Power Technologies Office Peer Review, FY14-FY16.

  7. Mathematical model of blasting schemes management in mining operations in presence of random disturbances

    NASA Astrophysics Data System (ADS)

    Kazakova, E. I.; Medvedev, A. N.; Kolomytseva, A. O.; Demina, M. I.

    2017-11-01

    The paper presents a mathematical model for managing blasting schemes in mining operations in the presence of random disturbances. Based on the lemmas and theorems proved, a stable control functional is formulated. A universal classification of blasting schemes is developed, with the following main classification attributes: the orientation in plan of the charging-well rows relative to the rock block; the presence of cuts in the blasting scheme; the separation of the well series into elements; and the blasting sequence. A periodic regularity in the transition from one short-delay blasting scheme to another is proved.

  8. Object linking in repositories

    NASA Technical Reports Server (NTRS)

    Eichmann, David (Editor); Beck, Jon; Atkins, John; Bailey, Bill

    1992-01-01

    This topic is covered in three sections. The first section explores some of the architectural ramifications of extending the Eichmann/Atkins lattice-based classification scheme to encompass the assets of the full life cycle of software development. A model is considered that provides explicit links between objects in addition to the edges connecting classification vertices in the standard lattice. The second section gives a description of the efforts to implement the repository architecture using a commercially available object-oriented database management system. Some of the features of this implementation are described, and some of the next steps to be taken to produce a working prototype of the repository are pointed out. In the final section, it is argued that design and instantiation of reusable components have competing criteria (design-for-reuse strives for generality, design-with-reuse strives for specificity) and that providing mechanisms for each can be complementary rather than antagonistic. In particular, it is demonstrated how program slicing techniques can be applied to customization of reusable components.

  9. Toward semantic-based retrieval of visual information: a model-based approach

    NASA Astrophysics Data System (ADS)

    Park, Youngchoon; Golshani, Forouzan; Panchanathan, Sethuraman

    2002-07-01

    This paper centers on the problem of automated visual content classification. To enable classification-based image or visual-object retrieval, we propose a new image representation scheme called the visual context descriptor (VCD): a multidimensional vector in which each element represents the frequency of a unique visual property of an image or region. VCD utilizes predetermined quality dimensions (i.e., types of features and quantization levels) and semantic model templates mined a priori. Not only observed visual cues but also contextually relevant visual features are proportionally incorporated in the VCD. The contextual relevance of a visual cue to a semantic class is determined by correlation analysis of ground-truth samples. Such co-occurrence analysis of visual cues requires transforming a real-valued visual feature vector (e.g., color histogram, Gabor texture) into discrete events (like terms in text). Good-features-to-track, the rule of thirds, iterative k-means clustering, and TSVQ are used to transform feature vectors into unified symbolic representations called visual terms. Similarity-based visual cue frequency estimation is also proposed to ensure correct model learning and matching, since sparse sample data otherwise makes frequency estimation of visual cues unstable. The proposed method naturally allows integration of heterogeneous visual, temporal, or spatial cues in a single classification or matching framework, and can easily be integrated into a semantic knowledge base such as a thesaurus or ontology. Robust semantic visual model template creation and object-based image retrieval are demonstrated based on the proposed content description scheme.
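The quantization step described (real-valued feature vectors mapped to discrete visual terms, then counted into a frequency vector) resembles a standard bag-of-visual-words construction; a sketch with scikit-learn k-means, where the random feature vectors are stand-ins for real image descriptors:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Stand-in for local feature descriptors extracted from training images.
train_descriptors = rng.normal(size=(200, 8))

# Quantize descriptor space into a small codebook of "visual terms".
n_terms = 16
codebook = KMeans(n_clusters=n_terms, n_init=10, random_state=0).fit(train_descriptors)

def visual_term_histogram(descriptors: np.ndarray) -> np.ndarray:
    """Map each descriptor to its nearest visual term and count frequencies."""
    terms = codebook.predict(descriptors)
    hist = np.bincount(terms, minlength=n_terms).astype(float)
    return hist / hist.sum()  # normalized frequency vector (cf. the VCD)

image_descriptors = rng.normal(size=(50, 8))
h = visual_term_histogram(image_descriptors)
print(h.shape)  # (16,)
```

The paper's VCD additionally spreads counts to contextually relevant terms via co-occurrence statistics; this sketch shows only the plain frequency vector.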

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mainzer, A.; Masiero, J.; Hand, E.

    The NEOWISE data set offers the opportunity to study the variations in albedo for asteroid classification schemes based on visible and near-infrared observations for a large sample of minor planets. We have determined the albedos for nearly 1900 asteroids classified by the Tholen, Bus, and Bus-DeMeo taxonomic classification schemes. We find that the S-complex spans a broad range of bright albedos, partially overlapping the low albedo C-complex at small sizes. As expected, the X-complex covers a wide range of albedos. The multiwavelength infrared coverage provided by NEOWISE allows determination of the reflectivity at 3.4 and 4.6 μm relative to the visible albedo. The direct computation of the reflectivity at 3.4 and 4.6 μm enables a new means of comparing the various taxonomic classes. Although C, B, D, and T asteroids all have similarly low visible albedos, the D and T types can be distinguished from the C and B types by examining their relative reflectance at 3.4 and 4.6 μm. All of the albedo distributions are strongly affected by selection biases against small, low albedo objects, as all objects selected for taxonomic classification were chosen according to their visible light brightness. Due to these strong selection biases, we are unable to determine whether or not there are correlations between size, albedo, and space weathering. We argue that the current set of classified asteroids makes any such correlations difficult to verify. A sample of taxonomically classified asteroids drawn without significant albedo bias is needed in order to perform such an analysis.

  11. Classification of childhood epilepsies in a tertiary pediatric neurology clinic using a customized classification scheme from the international league against epilepsy 2010 report.

    PubMed

    Khoo, Teik-Beng

    2013-01-01

    In its 2010 report, the International League Against Epilepsy Commission on Classification and Terminology made a number of changes to the organization, terminology, and classification of seizures and epilepsies. This study aims to test the usefulness of the revised classification scheme on children with epilepsies aged 0 to 18 years. Of 527 patients, 75.1% had only one type of seizure, the commonest being focal seizures (61.9%). A specific electroclinical syndrome diagnosis could be made in 27.5%, and only 2.1% had a distinctive constellation. In this cohort, 46.9% had an underlying structural, metabolic, or genetic etiology; important causes included pre-/perinatal insults, malformation of cortical development, intracranial infections, and neurocutaneous syndromes. However, 23.5% of the patients were classified as having "epilepsies of unknown cause." The revised classification scheme is generally useful for pediatric patients, but some local customization is required to make it more inclusive and clinically meaningful.

  12. Toward an endovascular internal carotid artery classification system.

    PubMed

    Shapiro, M; Becske, T; Riina, H A; Raz, E; Zumofen, D; Jafar, J J; Huang, P P; Nelson, P K

    2014-02-01

    Does the world need another ICA classification scheme? We believe so. The purpose of the proposed angiography-driven classification is to optimize description of the carotid artery from the endovascular perspective. A review of existing, predominantly surgically driven classifications is performed, and a new scheme, based on a study of the NYU aneurysm angiographic and cross-sectional databases, is proposed. Seven segments are named: cervical, petrous, cavernous, paraophthalmic, posterior communicating, choroidal, and terminus. This nomenclature recognizes the intrinsic uncertainty in precise angiographic and cross-sectional localization of aneurysms adjacent to the dural rings, regarding all lesions distal to the cavernous segment as potentially intradural. Rather than subdividing the various transitional, ophthalmic, and hypophyseal aneurysm subtypes, as necessitated by their varied surgical approaches and risks, the proposed classification emphasizes their common endovascular treatment features, while recognizing that many complex, trans-segmental, and fusiform aneurysms not readily classifiable in presently available, saccular-aneurysm-driven schemes are increasingly being addressed by endovascular means. We believe this classification may find utility in standardizing nomenclature for outcome tracking, treatment trials, and physician communication.

  13. Planetary Taxonomy: Label Round Bodies "Worlds"

    NASA Astrophysics Data System (ADS)

    Margot, Jean-Luc; Levison, H. F.

    2009-05-01

    The classification of planetary bodies is as important to Astronomy as taxonomy is to other sciences. The etymological, historical, and IAU definitions of planet rely on a dynamical criterion, but some authors prefer a geophysical criterion based on "roundness". Although the former criterion is superior when it comes to classifying newly discovered objects, the conflict need not exist if we agree to identify the subset of "round" planetary objects as "worlds". This addition to the taxonomy would conveniently recognize that "round" objects such as Earth, Europa, Titan, Triton, and Pluto share some common planetary-type processes regardless of their distance from the host star. Some of these worlds are planets, others are not. Defining how round is round and handling the inevitable transition objects are non-trivial tasks. Because images at sufficient resolution are not available for the overwhelming majority of newly discovered objects, the degree of roundness is not a directly observable property and is inherently problematic as a basis for classification. We can tolerate some uncertainty in establishing the "world" status of a newly discovered object, and still establish its planet or satellite status with existing dynamical criteria. Because orbital parameters are directly observable, and because mass can often be measured either from orbital perturbations or from the presence of companions, the dynamics provide a robust and practical planet classification scheme. It may also be possible to determine which bodies are dynamically dominant from observations of the population magnitude/size distribution.

  14. Do thoraco-lumbar spinal injuries classification systems exhibit lower inter- and intra-observer agreement than other fractures classifications?: A comparison using fractures of the trochanteric area of the proximal femur as contrast model.

    PubMed

    Urrutia, Julio; Zamora, Tomas; Klaber, Ianiv; Carmona, Maximiliano; Palma, Joaquin; Campos, Mauricio; Yurac, Ratko

    2016-04-01

    It has been postulated that the complex patterns of spinal injuries have prevented adequate agreement using thoraco-lumbar spinal injuries (TLSI) classifications; however, limb fracture classifications have also shown variable agreements. This study compared agreement using two TLSI classifications with agreement using two classifications of fractures of the trochanteric area of the proximal femur (FTAPF). Six evaluators classified the radiographs and computed tomography scans of 70 patients with acute TLSI using the Denis and the new AO Spine thoraco-lumbar injury classifications. Additionally, six evaluators classified the radiographs of 70 patients with FTAPF using the Tronzo and the AO schemes. Six weeks later, all cases were presented in a random sequence for repeat assessment. The Kappa coefficient (κ) was used to determine agreement. Inter-observer agreement: For TLSI, using the AOSpine classification, the mean κ was 0.62 (0.57-0.66) considering fracture types, and 0.55 (0.52-0.57) considering sub-types; using the Denis classification, κ was 0.62 (0.59-0.65). For FTAPF, with the AO scheme, the mean κ was 0.58 (0.54-0.63) considering fracture types and 0.31 (0.28-0.33) considering sub-types; for the Tronzo classification, κ was 0.54 (0.50-0.57). Intra-observer agreement: For TLSI, using the AOSpine scheme, the mean κ was 0.77 (0.72-0.83) considering fracture types, and 0.71 (0.67-0.76) considering sub-types; for the Denis classification, κ was 0.76 (0.71-0.81). For FTAPF, with the AO scheme, the mean κ was 0.75 (0.69-0.81) considering fracture types and 0.45 (0.39-0.51) considering sub-types; for the Tronzo classification, κ was 0.64 (0.58-0.70). Using the main types of AO classifications, inter- and intra-observer agreement of TLSI were comparable to agreement evaluating FTAPF; including sub-types, inter- and intra-observer agreement evaluating TLSI were significantly better than assessing FTAPF. 
Inter- and intra-observer agreements using the Denis classification were also significantly better than agreement using the Tronzo scheme. Copyright © 2015 Elsevier Ltd. All rights reserved.
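Agreement statistics of this kind can be computed with scikit-learn's Cohen's kappa; the two rating vectors below are invented for illustration (the studies' actual κ values come from their own data):

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical fracture-type codes assigned by two observers to ten cases.
observer_a = ["A", "A", "B", "C", "B", "A", "C", "B", "A", "C"]
observer_b = ["A", "B", "B", "C", "B", "A", "C", "B", "A", "A"]

# Kappa corrects the raw agreement rate (here 8/10) for chance agreement.
kappa = cohen_kappa_score(observer_a, observer_b)
print(round(kappa, 3))  # → 0.697
```

Values near 0.6 are conventionally read as "substantial" agreement, which is the banding the two spine studies use.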

  15. A comparative agreement evaluation of two subaxial cervical spine injury classification systems: the AOSpine and the Allen and Ferguson schemes.

    PubMed

    Urrutia, Julio; Zamora, Tomas; Campos, Mauricio; Yurac, Ratko; Palma, Joaquin; Mobarec, Sebastian; Prada, Carlos

    2016-07-01

    We performed an agreement study using two subaxial cervical spine classification systems: the AOSpine and the Allen and Ferguson (A&F) classifications. We sought to determine which scheme allows better agreement by different evaluators and by the same evaluator on different occasions. Complete imaging studies of 65 patients with subaxial cervical spine injuries were classified by six evaluators (three spine sub-specialists and three senior orthopaedic surgery residents) using the AOSpine subaxial cervical spine classification system and the A&F scheme. The cases were displayed in a random sequence after a 6-week interval for repeat evaluation. The Kappa coefficient (κ) was used to determine inter- and intra-observer agreement. Inter-observer: considering the main AO injury types, the agreement was substantial for the AOSpine classification [κ = 0.61 (0.57-0.64)]; using AO sub-types, the agreement was moderate [κ = 0.57 (0.54-0.60)]. For the A&F classification, the agreement [κ = 0.46 (0.42-0.49)] was significantly lower than using the AOSpine scheme. Intra-observer: the agreement was substantial considering injury types [κ = 0.68 (0.62-0.74)] and considering sub-types [κ = 0.62 (0.57-0.66)]. Using the A&F classification, the agreement was also substantial [κ = 0.66 (0.61-0.71)]. No significant differences were observed between spine surgeons and orthopaedic residents in the overall inter- and intra-observer agreement, or in the inter- and intra-observer agreement of specific type of injuries. The AOSpine classification (using the four main injury types or at the sub-types level) allows a significantly better agreement than the A&F classification. The A&F scheme does not allow reliable communication between medical professionals.

  16. Developing a contributing factor classification scheme for Rasmussen's AcciMap: Reliability and validity evaluation.

    PubMed

    Goode, N; Salmon, P M; Taylor, N Z; Lenné, M G; Finch, C F

    2017-10-01

    One factor potentially limiting the uptake of Rasmussen's (1997) Accimap method by practitioners is the lack of a contributing factor classification scheme to guide accident analyses. This article evaluates the intra- and inter-rater reliability and criterion-referenced validity of a classification scheme developed to support the use of Accimap by led outdoor activity (LOA) practitioners. The classification scheme has two levels: the system level describes the actors, artefacts and activity context in terms of 14 codes; the descriptor level breaks the system level codes down into 107 specific contributing factors. The study involved 11 LOA practitioners using the scheme on two separate occasions to code a pre-determined list of contributing factors identified from four incident reports. Criterion-referenced validity was assessed by comparing the codes selected by LOA practitioners to those selected by the method creators. Mean intra-rater reliability scores at the system (M = 83.6%) and descriptor (M = 74%) levels were acceptable. Mean inter-rater reliability scores were not consistently acceptable for both coding attempts at the system level (M(T1) = 68.8%; M(T2) = 73.9%), and were poor at the descriptor level (M(T1) = 58.5%; M(T2) = 64.1%). Mean criterion-referenced validity scores at the system level were acceptable (M(T1) = 73.9%; M(T2) = 75.3%). However, they were not consistently acceptable at the descriptor level (M(T1) = 67.6%; M(T2) = 70.8%). Overall, the results indicate that the classification scheme does not currently satisfy reliability and validity requirements, and that further work is required. The implications for the design and development of contributing factor classification schemes are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Comparing the performance of flat and hierarchical Habitat/Land-Cover classification models in a NATURA 2000 site

    NASA Astrophysics Data System (ADS)

    Gavish, Yoni; O'Connell, Jerome; Marsh, Charles J.; Tarantino, Cristina; Blonda, Palma; Tomaselli, Valeria; Kunin, William E.

    2018-02-01

    The increasing need for high quality Habitat/Land-Cover (H/LC) maps has triggered considerable research into novel machine-learning based classification models. In many cases, H/LC classes follow pre-defined hierarchical classification schemes (e.g., CORINE), in which fine H/LC categories are thematically nested within more general categories. However, none of the existing machine-learning algorithms account for this pre-defined hierarchical structure. Here we introduce a novel Random Forest (RF) based application of hierarchical classification, which fits a separate local classification model at every branching point of the thematic tree and then integrates all the local models into a single global prediction. We applied the hierarchical RF approach in a NATURA 2000 site in Italy, using two land-cover (CORINE, FAO-LCCS) and one habitat classification scheme (EUNIS) that differ from one another in the shape of the class hierarchy. For all three classification schemes, both the hierarchical model and a flat model alternative provided accurate predictions, with kappa values mostly above 0.9 (despite using only 2.2-3.2% of the study area as training cells). The flat approach slightly outperformed the hierarchical models when the hierarchy was relatively simple, while the hierarchical model worked better under more complex thematic hierarchies. Most misclassifications came from habitat pairs that are thematically distant yet spectrally similar. In two of the three classification schemes, the additional constraints of the hierarchical model resulted in fewer such serious misclassifications relative to the flat model. The hierarchical model also provided valuable information on variable importance, which can shed light on "black-box" machine learning algorithms like RF. We suggest various ways by which hierarchical classification models can increase the accuracy and interpretability of H/LC classification maps.
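The local-model-per-branching-point idea can be sketched with scikit-learn random forests on a toy two-level hierarchy (synthetic data; the real models operate on remote-sensing features and CORINE/EUNIS class trees):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic data: a coarse class (0/1), each with two fine subclasses.
X = rng.normal(size=(400, 5))
coarse = (X[:, 0] > 0).astype(int)
fine = (X[:, 1] > 0).astype(int)  # subclass within each coarse class

# One local model at the root predicts the coarse class...
root = RandomForestClassifier(random_state=0).fit(X, coarse)
# ...and one local model per branch predicts the fine subclass.
branches = {
    c: RandomForestClassifier(random_state=0).fit(X[coarse == c], fine[coarse == c])
    for c in (0, 1)
}

def predict_hierarchical(x: np.ndarray) -> tuple:
    """Route a sample down the thematic tree: coarse first, then fine."""
    c = int(root.predict(x.reshape(1, -1))[0])
    f = int(branches[c].predict(x.reshape(1, -1))[0])
    return c, f

c, f = predict_hierarchical(X[0])
print(c, f)
```

The flat alternative would instead train a single forest directly on the four combined (coarse, fine) labels; the abstract compares exactly these two designs.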

  18. Computer-aided diagnosis system: a Bayesian hybrid classification method.

    PubMed

    Calle-Alonso, F; Pérez, C J; Arias-Nicolás, J P; Martín, J

    2013-10-01

    A novel method to classify multi-class biomedical objects is presented. The method is based on a hybrid approach that combines pairwise comparison, Bayesian regression, and the k-nearest neighbor technique. It can be applied in a fully automatic way or in a relevance feedback framework. In the latter case, information obtained from both an expert and the automatic classification is used iteratively to improve the results until a certain accuracy level is achieved; the learning process is then finished and new classifications can be performed automatically. The method has been applied in two biomedical contexts, following the same cross-validation schemes as the original studies. The first refers to cancer diagnosis, leading to an accuracy of 77.35% versus the 66.37% originally obtained. The second considers the diagnosis of pathologies of the vertebral column, where the original method achieves accuracies ranging from 76.5% to 96.7% and from 82.3% to 97.1% in two different cross-validation schemes. Even with no supervision, the proposed method reaches 96.71% and 97.32% in these two cases; using a supervised framework, the achieved accuracy is 97.74%. Furthermore, all abnormal cases were correctly classified. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  19. Retinal vessel tortuosity measures and their applications.

    PubMed

    Kalitzeos, Angelos A; Lip, Gregory Y H; Heitmar, Rebekka

    2013-01-01

    Structural retinal vascular characteristics, such as vessel calibers, tortuosity and bifurcation angles are increasingly quantified in an objective manner, slowly replacing subjective qualitative disease classification schemes. This paper provides an overview of the current methodologies and calculations used to compute retinal vessel tortuosity. We set out the different parameter calculations and provide an insight into the clinical applications, while critically reviewing its pitfalls and shortcomings. Copyright © 2012 Elsevier Ltd. All rights reserved.

  20. Acoustic signature recognition technique for Human-Object Interactions (HOI) in persistent surveillance systems

    NASA Astrophysics Data System (ADS)

    Alkilani, Amjad; Shirkhodaie, Amir

    2013-05-01

    Handling, manipulation, and placement of objects in the environment, hereon called Human-Object Interaction (HOI), generate sounds. Such sounds are readily identifiable by human hearing. However, in the presence of background environment noise, recognition of faint HOI sounds is challenging, though vital for improving multi-modality sensor data fusion in Persistent Surveillance Systems (PSS). Identification of HOI sound signatures can serve as a precursor to detection of pertinent threats that other sensor modalities may fail to detect. In this paper, we present a robust method for detection and classification of HOI events via clustering of features extracted from training HOI acoustic sound waves. In this approach, salient sound events are first identified and segmented from the background via a sound-energy tracking method. After segmentation, the frequency spectral pattern of each sound event is modeled and its features are extracted to form a feature vector for training. To reduce the dimensionality of the training feature space, a Principal Component Analysis (PCA) technique is employed. To expedite classification of test feature vectors, kd-tree and Random Forest classifiers are trained on the training sound waves; each classifier employs a different similarity-distance matching technique. The performance of the classifiers is compared on a batch of training HOI acoustic signatures. Furthermore, to facilitate semantic annotation of acoustic sound events, a scheme based on Transducer Markup Language (TML) is proposed. The results demonstrate that the proposed approach is both reliable and effective, and can be extended to future PSS applications.
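The PCA-then-classifier stage described above can be sketched with scikit-learn; the random vectors stand in for spectral features of segmented sound events, and a kd-tree backed k-NN stands in for the paper's kd-tree classifier:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Stand-ins for spectral feature vectors of segmented sound events,
# with three hypothetical HOI sound classes.
X = rng.normal(size=(120, 40))
y = rng.integers(0, 3, size=120)

# PCA reduces dimensionality before a distance-based classifier.
model = make_pipeline(
    PCA(n_components=10),
    KNeighborsClassifier(n_neighbors=5, algorithm="kd_tree"),
)
model.fit(X, y)
pred = model.predict(X[:5])
print(pred.shape)  # (5,)
```

A Random Forest could be dropped in place of the k-NN step to mirror the paper's second classifier; the PCA front end is shared.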

  1. New Course Design: Classification Schemes and Information Architecture.

    ERIC Educational Resources Information Center

    Weinberg, Bella Hass

    2002-01-01

    Describes a course developed at St. John's University (New York) in the Division of Library and Information Science that relates traditional classification schemes to information architecture and Web sites. Highlights include functional aspects of information architecture, that is, the way content is structured; assignments; student reactions; and…

  2. Classification and global distribution of ocean precipitation types based on satellite passive microwave signatures

    NASA Astrophysics Data System (ADS)

    Gautam, Nitin

    The main objectives of this thesis are to develop a robust statistical method for the classification of ocean precipitation based on physical properties to which the SSM/I is sensitive, and to examine how these properties vary globally and seasonally. A two-step approach is adopted for classifying oceanic precipitation from multispectral SSM/I data: (1) we subjectively define precipitation classes using a priori information about the precipitating system and its possible distinct signatures in SSM/I data, such as scattering by ice particles aloft in the precipitating cloud, emission by liquid rain water below the freezing level, and the difference of polarization at 19 GHz (an indirect measure of optical depth); (2) we then develop an objective classification scheme that is found to reproduce the subjective classification with high accuracy. This hybrid strategy allows us to use the characteristics of the data to define and encode classes and helps retain the physical interpretation of the classes. Classification methods based on the k-nearest neighbor technique and on a neural network are developed to objectively classify six precipitation classes; the neural network method is found to yield high accuracy for all precipitation classes. An inversion method based on a minimum variance approach is used to retrieve gross microphysical properties of these precipitation classes, such as column-integrated liquid water path, column-integrated ice water path, and column-integrated rain water path. The classification method is then applied to two years (1991-92) of SSM/I data to examine and document the seasonal and global distribution of precipitation frequency for each of the six objectively defined classes. The characteristics of the distribution are found to be consistent with the assumptions used in defining the six precipitation classes and with well-known climatological patterns of precipitation regions.
The seasonal and global distribution of these six classes is also compared with the earlier results obtained from Comprehensive Ocean Atmosphere Data Sets (COADS). It is found that the gross pattern of the distributions obtained from SSM/I and COADS data match remarkably well with each other.
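The objective stage of such a two-step strategy can be illustrated with a minimal k-nearest-neighbour classifier. This is a sketch only: the two-channel "brightness temperature" features and the two class names below are invented for illustration, not actual SSM/I values or the six classes of the thesis.

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Majority vote among the k labelled training vectors nearest
    to `query` (Euclidean distance)."""
    nearest = sorted((math.dist(x, query), label) for x, label in train)[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Invented 2-channel features for two hypothetical precipitation classes.
train = [
    ((250.0, 180.0), "convective"),
    ((255.0, 175.0), "convective"),
    ((270.0, 240.0), "stratiform"),
    ((268.0, 245.0), "stratiform"),
]
print(knn_classify(train, (252.0, 178.0)))  # convective
```

The neural-network variant reported to perform best would replace the vote with a trained network, but the data flow from encoded classes to objective labels is the same.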

  3. Single-Grasp Object Classification and Feature Extraction with Simple Robot Hands and Tactile Sensors.

    PubMed

    Spiers, Adam J; Liarokapis, Minas V; Calli, Berk; Dollar, Aaron M

    2016-01-01

    Classical robotic approaches to tactile object identification often involve rigid mechanical grippers, dense sensor arrays, and exploratory procedures (EPs). Though EPs are a natural method for humans to acquire object information, evidence also exists for meaningful tactile property inference from brief, non-exploratory motions (a 'haptic glance'). In this work, we implement tactile object identification and feature extraction techniques on data acquired during a single, unplanned grasp with a simple, underactuated robot hand equipped with inexpensive barometric pressure sensors. Our methodology utilizes two cooperating schemes based on an advanced machine learning technique (random forests) and parametric methods that estimate object properties. The available data are limited to actuator positions (one per two-link finger) and force sensor values (eight per finger). The schemes are able to work both independently and collaboratively, depending on the task scenario. When collaborating, the results of each method contribute to the other, improving the overall result in a synergistic fashion. Unlike prior work, the proposed approach does not require object exploration, re-grasping, grasp release, or force modulation, and works for arbitrary object start positions and orientations. Due to these factors, the technique may be integrated into practical robotic grasping scenarios without adding time or manipulation overheads.
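A random-forest-style classifier over such sparse grasp features can be sketched with bootstrap-aggregated decision stumps, a deliberately minimal stand-in for the full random forests named in the abstract, not the authors' implementation. The grasp features (final actuator position, mean fingertip force) and object classes below are invented.

```python
import random
from collections import Counter

def majority(labels):
    return Counter(labels).most_common(1)[0][0] if labels else None

def best_stump(data):
    """Greedy one-split tree: pick the (feature, threshold) pair that
    minimises training misclassifications; each side predicts the
    majority label of the points falling on it."""
    best, best_err = None, float("inf")
    for f in range(len(data[0][0])):
        for x, _ in data:
            thr = x[f]
            left = [y for xi, y in data if xi[f] <= thr]
            right = [y for xi, y in data if xi[f] > thr]
            lm, rm = majority(left), majority(right)
            err = (sum(y != lm for y in left)
                   + sum(y != rm for y in right))
            if err < best_err:
                best_err = err
                best = (f, thr, lm, rm if rm is not None else lm)
    return best

def fit_forest(data, n_trees=25, seed=1):
    """Bootstrap aggregation: each 'tree' is a stump trained on a
    resampled copy of the data; prediction is a majority vote."""
    rng = random.Random(seed)
    return [best_stump([rng.choice(data) for _ in data])
            for _ in range(n_trees)]

def predict(forest, x):
    votes = Counter(lm if x[f] <= thr else rm
                    for f, thr, lm, rm in forest)
    return votes.most_common(1)[0][0]

# Invented single-grasp features: (final actuator position, mean
# fingertip force); object classes are illustrative.
grasps = [((0.20, 1.5), "soft"), ((0.22, 1.6), "soft"),
          ((0.25, 1.4), "soft"), ((0.58, 3.9), "hard"),
          ((0.60, 4.0), "hard"), ((0.65, 4.2), "hard")]
forest = fit_forest(grasps)
print(predict(forest, (0.21, 1.5)), predict(forest, (0.62, 4.1)))
```

Full random forests grow deeper trees and subsample features at every split; the bootstrap-plus-vote structure shown here is the part that carries over.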

  4. Enhancing Vocabulary Acquisition Through Reading: A Hierarchy of Text-Related Exercise Types.

    ERIC Educational Resources Information Center

    Paribakht, T. Sima; Wesche, Marjorie

    1996-01-01

    Presents a classification scheme for reading-related exercises advocated in English-as-a-Foreign-Language textbooks. The scheme proposes a hierarchy of the degree and type of mental processing required by various vocabulary exercises. The categories of classification are selective attention, recognition, manipulation, interpretation and…

  5. Lagrangian methods of cosmic web classification

    NASA Astrophysics Data System (ADS)

    Fisher, J. D.; Faltenbacher, A.; Johnson, M. S. T.

    2016-05-01

    The cosmic web defines the large-scale distribution of matter we see in the Universe today. Classifying the cosmic web into voids, sheets, filaments and nodes allows one to explore structure formation and the role environmental factors have on halo and galaxy properties. While existing studies of cosmic web classification concentrate on grid-based methods, this work explores a Lagrangian approach where the V-web algorithm proposed by Hoffman et al. is implemented with techniques borrowed from smoothed particle hydrodynamics. The Lagrangian approach allows one to classify individual objects (e.g. particles or haloes) based on properties of their nearest neighbours in an adaptive manner. It can be applied directly to a halo sample which dramatically reduces computational cost and potentially allows an application of this classification scheme to observed galaxy samples. Finally, the Lagrangian nature admits a straightforward inclusion of the Hubble flow negating the necessity of a visually defined threshold value which is commonly employed by grid-based classification methods.
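The V-web rule underlying this approach labels each object by how many eigenvalues of its velocity shear tensor exceed a threshold: 0, 1, 2, or 3 eigenvalues above it map to void, sheet, filament, or knot. A minimal sketch of that rule follows; the SPH-style neighbour smoothing used to build the tensor is omitted, and the threshold value shown is illustrative.

```python
import numpy as np

CLASSES = ["void", "sheet", "filament", "knot"]

def vweb_class(shear_tensor, threshold=0.44):
    """Count eigenvalues of the (symmetric) velocity shear tensor
    above `threshold`; the count indexes the cosmic web class."""
    eigvals = np.linalg.eigvalsh(np.asarray(shear_tensor, dtype=float))
    return CLASSES[int(np.sum(eigvals > threshold))]

# A diagonal (already principal-axis) tensor with two eigenvalues
# above the threshold classifies as a filament.
print(vweb_class(np.diag([0.9, 0.6, -0.3])))  # filament
```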

  6. Comparing ecoregional classifications for natural areas management in the Klamath Region, USA

    USGS Publications Warehouse

    Sarr, Daniel A.; Duff, Andrew; Dinger, Eric C.; Shafer, Sarah L.; Wing, Michael; Seavy, Nathaniel E.; Alexander, John D.

    2015-01-01

    We compared three existing ecoregional classification schemes (Bailey, Omernik, and World Wildlife Fund) with two derived schemes (Omernik Revised and Climate Zones) to explore their effectiveness in explaining species distributions and to better understand natural resource geography in the Klamath Region, USA. We analyzed presence/absence data derived from digital distribution maps for trees, amphibians, large mammals, small mammals, migrant birds, and resident birds using three statistical analyses of classification accuracy (Analysis of Similarity, Canonical Analysis of Principal Coordinates, and Classification Strength). The classifications were roughly comparable in classification accuracy, with Omernik Revised showing the best overall performance. Trees showed the strongest fidelity to the classifications, and large mammals showed the weakest fidelity. We discuss the implications for regional biogeography and describe how intermediate resolution ecoregional classifications may be appropriate for use as natural areas management domains.

  7. Semi-automatic classification of glaciovolcanic landforms: An object-based mapping approach based on geomorphometry

    NASA Astrophysics Data System (ADS)

    Pedersen, G. B. M.

    2016-02-01

    A new object-oriented approach is developed to classify glaciovolcanic landforms (Procedure A) and their landform elements boundaries (Procedure B). It utilizes the principle that glaciovolcanic edifices are geomorphometrically distinct from lava shields and plains (Pedersen and Grosse, 2014), and the approach is tested on data from Reykjanes Peninsula, Iceland. The outlined procedures utilize slope and profile curvature attribute maps (20 m/pixel) and the classified results are evaluated quantitatively through error matrix maps (Procedure A) and visual inspection (Procedure B). In procedure A, the highest obtained accuracy is 94.1%, but even simple mapping procedures provide good results (> 90% accuracy). Successful classification of glaciovolcanic landform element boundaries (Procedure B) is also achieved and this technique has the potential to delineate the transition from intraglacial to subaerial volcanic activity in orthographic view. This object-oriented approach based on geomorphometry overcomes issues with vegetation cover, which has been typically problematic for classification schemes utilizing spectral data. Furthermore, it handles complex edifice outlines well and is easily incorporated into a GIS environment, where results can be edited or fused with other mapping results. The approach outlined here is designed to map glaciovolcanic edifices within the Icelandic neovolcanic zone but may also be applied to similar subaerial or submarine volcanic settings, where steep volcanic edifices are surrounded by flat plains.

  8. Monitoring nanotechnology using patent classifications: an overview and comparison of nanotechnology classification schemes

    NASA Astrophysics Data System (ADS)

    Jürgens, Björn; Herrero-Solana, Victor

    2017-04-01

    Patents are an essential information source used to monitor, track, and analyze nanotechnology. When searching for nanotechnology-related patents, a keyword search is often incomplete and struggles to cover such an interdisciplinary discipline. Patent classification schemes can yield far better results, since classes are assigned by experts who categorize the patent documents according to their technology. In this paper, we present the most important classifications for searching nanotechnology patents and analyze how nanotechnology is covered in the main patent classification systems used in search systems today: the International Patent Classification (IPC), the United States Patent Classification (USPC), and the Cooperative Patent Classification (CPC). We conclude that nanotechnology has significantly better patent coverage in the CPC, since considerably more nanotechnology documents were retrieved than with the other classifications, and we therefore recommend its use for all professionals involved in nanotechnology patent searches.

  9. The Nutraceutical Bioavailability Classification Scheme: Classifying Nutraceuticals According to Factors Limiting their Oral Bioavailability.

    PubMed

    McClements, David Julian; Li, Fang; Xiao, Hang

    2015-01-01

    The oral bioavailability of a health-promoting dietary component (nutraceutical) may be limited by various physicochemical and physiological phenomena: liberation from food matrices, solubility in gastrointestinal fluids, interaction with gastrointestinal components, chemical degradation or metabolism, and epithelium cell permeability. Nutraceutical bioavailability can therefore be improved by designing food matrices that control their bioaccessibility (B*), absorption (A*), and transformation (T*) within the gastrointestinal tract (GIT). This article reviews the major factors influencing the gastrointestinal fate of nutraceuticals, and then uses this information to develop a new scheme to classify the major factors limiting nutraceutical bioavailability: the nutraceutical bioavailability classification scheme (NuBACS). This new scheme is analogous to the biopharmaceutical classification scheme (BCS) used by the pharmaceutical industry to classify drug bioavailability, but it contains additional factors important for understanding nutraceutical bioavailability in foods. The article also highlights potential strategies for increasing the oral bioavailability of nutraceuticals based on their NuBACS designation (B*A*T*).
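The B*A*T* designation can be read as a compact label recording which steps limit a nutraceutical's bioavailability. A toy encoder follows, assuming here that '+' marks a step that is not limiting and '-' one that is; the published notation also attaches mechanistic subscripts, so this illustrates the labelling idea only.

```python
def nubacs_designation(bioaccessible, absorbed, resists_transformation):
    """Compose a NuBACS-style B*A*T* label from three booleans
    (True = this step does not limit bioavailability). Illustrative
    encoding only; the published notation is richer."""
    flags = (("B", bioaccessible), ("A", absorbed),
             ("T", resists_transformation))
    return "".join(f"{tag}({'+' if ok else '-'})" for tag, ok in flags)

# A nutraceutical that is bioaccessible and chemically stable but
# poorly absorbed:
print(nubacs_designation(True, False, True))  # B(+)A(-)T(+)
```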

  10. Application of a 5-tiered scheme for standardized classification of 2,360 unique mismatch repair gene variants in the InSiGHT locus-specific database.

    PubMed

    Thompson, Bryony A; Spurdle, Amanda B; Plazzer, John-Paul; Greenblatt, Marc S; Akagi, Kiwamu; Al-Mulla, Fahd; Bapat, Bharati; Bernstein, Inge; Capellá, Gabriel; den Dunnen, Johan T; du Sart, Desiree; Fabre, Aurelie; Farrell, Michael P; Farrington, Susan M; Frayling, Ian M; Frebourg, Thierry; Goldgar, David E; Heinen, Christopher D; Holinski-Feder, Elke; Kohonen-Corish, Maija; Robinson, Kristina Lagerstedt; Leung, Suet Yi; Martins, Alexandra; Moller, Pal; Morak, Monika; Nystrom, Minna; Peltomaki, Paivi; Pineda, Marta; Qi, Ming; Ramesar, Rajkumar; Rasmussen, Lene Juel; Royer-Pokora, Brigitte; Scott, Rodney J; Sijmons, Rolf; Tavtigian, Sean V; Tops, Carli M; Weber, Thomas; Wijnen, Juul; Woods, Michael O; Macrae, Finlay; Genuardi, Maurizio

    2014-02-01

    The clinical classification of hereditary sequence variants identified in disease-related genes directly affects clinical management of patients and their relatives. The International Society for Gastrointestinal Hereditary Tumours (InSiGHT) undertook a collaborative effort to develop, test and apply a standardized classification scheme to constitutional variants in the Lynch syndrome-associated genes MLH1, MSH2, MSH6 and PMS2. Unpublished data submission was encouraged to assist in variant classification and was recognized through microattribution. The scheme was refined by multidisciplinary expert committee review of the clinical and functional data available for variants, applied to 2,360 sequence alterations, and disseminated online. Assessment using validated criteria altered classifications for 66% of 12,006 database entries. Clinical recommendations based on transparent evaluation are now possible for 1,370 variants that were not obviously protein truncating from nomenclature. This large-scale endeavor will facilitate the consistent management of families suspected to have Lynch syndrome and demonstrates the value of multidisciplinary collaboration in the curation and classification of variants in public locus-specific databases.
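Five-tier schemes of this family (following the IARC classification that InSiGHT builds on) are commonly quoted as posterior-probability-of-pathogenicity bands. The sketch below uses those commonly cited cut-offs; they are illustrative here and do not capture the InSiGHT committee's full qualitative criteria.

```python
def five_tier_class(p):
    """Map a posterior probability of pathogenicity to a 5-tier
    class, using cut-offs commonly quoted for IARC-style schemes
    (illustrative; the InSiGHT criteria are more detailed)."""
    if p > 0.99:
        return 5, "pathogenic"
    if p >= 0.95:
        return 4, "likely pathogenic"
    if p >= 0.05:
        return 3, "uncertain"
    if p >= 0.001:
        return 2, "likely not pathogenic"
    return 1, "not pathogenic"

print(five_tier_class(0.97))  # (4, 'likely pathogenic')
```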

  11. Cheese Classification, Characterization, and Categorization: A Global Perspective.

    PubMed

    Almena-Aliste, Montserrat; Mietton, Bernard

    2014-02-01

    Cheese is one of the most fascinating, complex, and diverse foods enjoyed today. Three elements constitute the cheese ecosystem: ripening agents, consisting of enzymes and microorganisms; the composition of the fresh cheese; and the environmental conditions during aging. These factors determine and define not only the sensory quality of the final cheese product but also the vast diversity of cheeses produced worldwide. How we define and categorize cheese is a complicated matter. There are various approaches to cheese classification, and a global approach for classification and characterization is needed. We review current cheese classification schemes and the limitations inherent in each of the schemes described. While some classification schemes are based on microbiological criteria, others rely on descriptions of the technologies used for cheese production. The goal of this review is to present an overview of comprehensive and practical integrative classification models in order to better describe cheese diversity and the fundamental differences within cheeses, as well as to connect fundamental technological, microbiological, chemical, and sensory characteristics to contribute to an overall characterization of the main families of cheese, including the expanding world of American artisanal cheeses.

  12. New KF-PP-SVM classification method for EEG in brain-computer interfaces.

    PubMed

    Yang, Banghua; Han, Zhijun; Zan, Peng; Wang, Qian

    2014-01-01

    Classification methods are a crucial direction in the current study of brain-computer interfaces (BCIs). To improve the classification accuracy for electroencephalogram (EEG) signals, a novel KF-PP-SVM (kernel Fisher, posterior probability, and support vector machine) classification method is developed. Its detailed process entails the use of common spatial patterns to obtain features, from which the within-class scatter is calculated. The scatter is then added into a radial basis function to construct a new kernel, which is integrated into the SVM to obtain a new classification model. Finally, the output of the SVM is calculated based on posterior probability to obtain the final recognition result. To evaluate the effectiveness of the proposed KF-PP-SVM method, EEG data collected in the laboratory are processed with four different classification schemes (KF-PP-SVM, KF-SVM, PP-SVM, and SVM). The results showed that the overall average improvements arising from the use of the KF-PP-SVM scheme over the KF-SVM, PP-SVM, and SVM schemes are 2.49%, 5.83%, and 6.49%, respectively.
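The core idea of folding within-class scatter into an RBF kernel can be sketched as follows. The scalar scatter summary and the exact way it modulates the kernel width are simplifications of the paper's formulation, shown only to make the construction concrete.

```python
import numpy as np

def within_class_scatter(X, y):
    """Scalar summary of within-class spread: averaged sum of squared
    deviations from each class mean (a simplification of the full
    scatter matrix used in kernel Fisher analysis)."""
    s = 0.0
    for c in np.unique(y):
        Xc = X[y == c]
        s += np.sum((Xc - Xc.mean(axis=0)) ** 2)
    return s / len(X)

def kf_rbf_kernel(X, y, gamma=1.0):
    """RBF Gram matrix whose width is modulated by the within-class
    scatter of the labelled training data."""
    s = within_class_scatter(X, y)
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq / (1.0 + s))

X = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [1.1, 1.0]])
y = np.array([0, 0, 1, 1])
K = kf_rbf_kernel(X, y)
print(K.shape)  # (4, 4)
```

A Gram matrix like this one can then be supplied to an SVM that accepts precomputed kernels.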

  13. Application of a five-tiered scheme for standardized classification of 2,360 unique mismatch repair gene variants lodged on the InSiGHT locus-specific database

    PubMed Central

    Plazzer, John-Paul; Greenblatt, Marc S.; Akagi, Kiwamu; Al-Mulla, Fahd; Bapat, Bharati; Bernstein, Inge; Capellá, Gabriel; den Dunnen, Johan T.; du Sart, Desiree; Fabre, Aurelie; Farrell, Michael P.; Farrington, Susan M.; Frayling, Ian M.; Frebourg, Thierry; Goldgar, David E.; Heinen, Christopher D.; Holinski-Feder, Elke; Kohonen-Corish, Maija; Robinson, Kristina Lagerstedt; Leung, Suet Yi; Martins, Alexandra; Moller, Pal; Morak, Monika; Nystrom, Minna; Peltomaki, Paivi; Pineda, Marta; Qi, Ming; Ramesar, Rajkumar; Rasmussen, Lene Juel; Royer-Pokora, Brigitte; Scott, Rodney J.; Sijmons, Rolf; Tavtigian, Sean V.; Tops, Carli M.; Weber, Thomas; Wijnen, Juul; Woods, Michael O.; Macrae, Finlay; Genuardi, Maurizio

    2015-01-01

    Clinical classification of sequence variants identified in hereditary disease genes directly affects clinical management of patients and their relatives. The International Society for Gastrointestinal Hereditary Tumours (InSiGHT) undertook a collaborative effort to develop, test and apply a standardized classification scheme to constitutional variants in the Lynch Syndrome genes MLH1, MSH2, MSH6 and PMS2. Unpublished data submission was encouraged to assist variant classification, and recognized by microattribution. The scheme was refined by multidisciplinary expert committee review of clinical and functional data available for variants, applied to 2,360 sequence alterations, and disseminated online. Assessment using validated criteria altered classifications for 66% of 12,006 database entries. Clinical recommendations based on transparent evaluation are now possible for 1,370 variants not obviously protein-truncating from nomenclature. This large-scale endeavor will facilitate consistent management of suspected Lynch Syndrome families, and demonstrates the value of multidisciplinary collaboration for curation and classification of variants in public locus-specific databases. PMID:24362816

  14. Improved opponent color local binary patterns: an effective local image descriptor for color texture classification

    NASA Astrophysics Data System (ADS)

    Bianconi, Francesco; Bello-Cerezo, Raquel; Napoletano, Paolo

    2018-01-01

    Texture classification plays a major role in many computer vision applications. Local binary patterns (LBP) encoding schemes have largely been proven to be very effective for this task. Improved LBP (ILBP) are conceptually simple, easy to implement, and highly effective LBP variants based on a point-to-average thresholding scheme instead of a point-to-point one. We propose the use of this encoding scheme for extracting intra- and interchannel features for color texture classification. We experimentally evaluated the resulting improved opponent color LBP alone and in concatenation with the ILBP of the local color contrast map on a set of image classification tasks over 9 datasets of generic color textures and 11 datasets of biomedical textures. The proposed approach outperformed other grayscale and color LBP variants in nearly all the datasets considered and proved competitive even against image features from last generation convolutional neural networks, particularly for the classification of biomedical images.
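The point-to-average thresholding that distinguishes ILBP from classic LBP fits in a few lines. The sketch below handles one 3x3 grayscale patch; per-channel and opponent-channel application, histogramming over the image, and rotation handling are omitted.

```python
def ilbp_code(patch):
    """ILBP for a single 3x3 patch: threshold all nine pixels
    (centre included) against the patch mean, rather than comparing
    the eight neighbours to the centre pixel as in classic LBP,
    giving a 9-bit code."""
    flat = [v for row in patch for v in row]
    mean = sum(flat) / 9.0
    code = 0
    for bit, v in enumerate(flat):
        if v >= mean:
            code |= 1 << bit
    return code

patch = [[10, 10, 10],
         [10, 50, 10],
         [10, 10, 10]]
print(ilbp_code(patch))  # 16: only the centre pixel exceeds the mean
```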

  15. TFM classification and staging of oral submucous fibrosis: A new proposal.

    PubMed

    Arakeri, Gururaj; Thomas, Deepak; Aljabab, Abdulsalam S; Hunasgi, Santosh; Rai, Kirthi Kumar; Hale, Beverley; Fonseca, Felipe Paiva; Gomez, Ricardo Santiago; Rahimi, Siavash; Merkx, Matthias A W; Brennan, Peter A

    2018-04-01

    We have evaluated the rationale of existing grading and staging schemes for oral submucous fibrosis (OSMF) based on how they are categorized, and we propose a novel classification and staging scheme. A total of 300 OSMF patients were evaluated for agreement between functional, clinical, and histopathological staging. Bilateral biopsies were assessed in 25 patients to evaluate any differences in histopathological staging of OSMF within the same mouth. The extent of clinician agreement for categorized staging data was evaluated using Cohen's weighted kappa analysis. Cross-tabulation was performed on categorical grading data to understand the intercorrelation, and unweighted kappa analysis was used to assess bilateral grade agreement. Probabilities of less than 0.05 were considered significant. Data were analyzed using SPSS Statistics (version 25.0, IBM, USA). Low agreement was found between all the stages (K = 0.312, 0.167, 0.152), reflecting the independent nature of the trismus, clinical, and histopathological components of OSMF. Following this analysis, a three-component classification scheme (TFM classification) was developed that describes the severity of each component independently, grouping them with a novel three-tier staging scheme as a guide to the treatment plan. The proposed classification and staging could be useful for effective communication, categorization, recording of data and prognosis, and guiding treatment plans. Furthermore, the classification considers OSMF malignant transformation in detail. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
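The weighted kappa statistic used for the stage-agreement analysis is straightforward to compute directly. A sketch for two raters' ordinal stage labels in 0..k-1, with linear disagreement weights; the three-stage ratings shown are invented, not the study's data.

```python
import numpy as np

def weighted_kappa(a, b, k):
    """Cohen's linearly weighted kappa for two raters' ordinal labels
    in 0..k-1 (disagreement weights |i-j|/(k-1))."""
    O = np.zeros((k, k))
    for i, j in zip(a, b):
        O[i, j] += 1
    O /= O.sum()                                  # observed proportions
    E = np.outer(O.sum(axis=1), O.sum(axis=0))    # chance expectation
    i, j = np.indices((k, k))
    W = np.abs(i - j) / (k - 1)                   # linear weights
    return 1.0 - (W * O).sum() / (W * E).sum()

# Identical stage ratings give kappa = 1.
print(weighted_kappa([0, 1, 2, 2], [0, 1, 2, 2], 3))  # 1.0
```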

  16. Meeting the Cool Neighbors. XII. An Optically Anchored Analysis of the Near-infrared Spectra of L Dwarfs

    NASA Astrophysics Data System (ADS)

    Cruz, Kelle L.; Núñez, Alejandro; Burgasser, Adam J.; Abrahams, Ellianna; Rice, Emily L.; Reid, I. Neill; Looper, Dagny

    2018-01-01

    Discrepancies between competing optical and near-infrared (NIR) spectral typing systems for L dwarfs have motivated us to search for a classification scheme that ties the optical and NIR schemes together, and addresses complexities in the spectral morphology. We use new and extant optical and NIR spectra to compile a sample of 171 L dwarfs, including 27 low-gravity β and γ objects, with spectral coverage from 0.6–2.4 μm. We present 155 new low-resolution NIR spectra and 19 new optical spectra. We utilize a method for analyzing NIR spectra that partially removes the broad-band spectral slope and reveals similarities in the absorption features between objects of the same optical spectral type. Using the optical spectra as an anchor, we generate near-infrared spectral average templates for L0–L8, L0–L4γ, and L0–L1β type dwarfs. These templates reveal that NIR spectral morphologies are correlated with the optical types. They also show the range of spectral morphologies spanned by each spectral type. We compare low-gravity and field-gravity templates to provide recommendations on the minimum required observations for credibly classifying low-gravity spectra using low-resolution NIR data. We use the templates to evaluate the existing NIR spectral standards and propose new ones where appropriate. Finally, we build on the work of Kirkpatrick et al. to provide a spectral typing method that is tied to the optical and can be used when only H or K band data are available. The methods we present here provide resolutions to several long-standing issues with classifying L dwarf spectra and could also be the foundation for a spectral classification scheme for cloudy exoplanets.

  17. Atmospheric circulation classification comparison based on wildfires in Portugal

    NASA Astrophysics Data System (ADS)

    Pereira, M. G.; Trigo, R. M.

    2009-04-01

    Atmospheric circulation classifications are not a simple description of atmospheric states but a tool to understand and interpret atmospheric processes and to model the relation between atmospheric circulation and surface climate and other related variables (Huth et al., 2008). Classifications were initially developed for weather forecasting purposes; however, with the progress in computer processing capability, new and more robust objective methods were developed and applied to large datasets, making atmospheric circulation classification one of the most important fields in synoptic and statistical climatology. Classification studies have been extensively used in climate change studies (e.g. reconstructed past climates, recent observed changes, and future climates), in bioclimatological research (e.g. relating human mortality to climatic factors), and in a wide variety of synoptic climatological applications (e.g. comparison between datasets, air pollution, snow avalanches, wine quality, fish captures, and forest fires). Likewise, atmospheric circulation classifications are important for studying the role of weather in wildfire occurrence in Portugal, because daily synoptic variability is the most important driver of local weather conditions (Pereira et al., 2005). In particular, the objective classification scheme developed by Trigo and DaCamara (2000) to classify the atmospheric circulation affecting Portugal has proved quite useful in discriminating the occurrence and development of wildfires, as well as the distribution over Portugal of surface climatic variables with an impact on wildfire activity, such as maximum and minimum temperature and precipitation. 
This work aims to present: (i) an overview of the existing circulation classifications for the Iberian Peninsula, and (ii) the results of a comparison of these atmospheric circulation classifications based on their relation to wildfires and relevant meteorological variables. To achieve these objectives we consider the main classifications for Iberia developed within the framework of COST Action 733 (Huth et al., 2008). This European project aims to provide a wide range of atmospheric circulation classifications for Europe and its sub-regions (http://www.cost733.org/), with the ambitious objective of assessing, comparing, and classifying all relevant weather situations in Europe. Pereira, M.G., et al. (2005) "Synoptic patterns associated with large summer forest fires in Portugal". Agricultural and Forest Meteorology, 129, 11-25. Huth, R., et al. (2008) "Classifications of atmospheric circulation patterns: recent advances and applications". Trends and Directions in Climate Research: Ann. N.Y. Acad. Sci., 1146, 105-152. doi:10.1196/annals.1446.019. Trigo, R.M., DaCamara, C. (2000) "Circulation weather types and their impact on the precipitation regime in Portugal". Int. J. Climatology, 20, 1559-1581.

  18. Discovery of User-Oriented Class Associations for Enriching Library Classification Schemes.

    ERIC Educational Resources Information Center

    Pu, Hsiao-Tieh

    2002-01-01

    Presents a user-based approach to exploring the possibility of adding user-oriented class associations to hierarchical library classification schemes. Classes not grouped in the same subject hierarchies yet relevant to users' knowledge are obtained by analyzing a log book of a university library's circulation records, using collaborative filtering…

  19. Social Constructivism: Botanical Classification Schemes of Elementary School Children.

    ERIC Educational Resources Information Center

    Tull, Delena

    The assertion that there is a social component to children's construction of knowledge about natural phenomena is supported by evidence from an examination of children's classification schemes for plants. An ethnographic study was conducted with nine sixth grade children in central Texas. The children classified plants in the outdoors, in a…

  20. A Classification Scheme for Career Education Resource Materials.

    ERIC Educational Resources Information Center

    Koontz, Ronald G.

    The introductory section of the paper expresses its purpose: to devise a classification scheme for career education resource material, which will be used to develop the USOE Office of Career Education Resource Library and will be disseminated to interested State departments of education and local school districts to assist them in classifying…

  1. An Alternative Classification Scheme for Teaching Performance Incentives Using a Factor Analytic Approach.

    ERIC Educational Resources Information Center

    Mertler, Craig A.

    This study attempted to (1) expand the dichotomous classification scheme typically used by educators and researchers to describe teaching incentives and (2) offer administrators and teachers an alternative framework within which to develop incentive systems. Elementary, middle, and high school teachers in Ohio rated 10 commonly instituted teaching…

  2. A Classification Scheme for Adult Education. Education Libraries Bulletin, Supplement Twelve.

    ERIC Educational Resources Information Center

    Greaves, Monica A., Comp.

    This classification scheme, based on the 'facet formula' theory of Ranganathan, is designed primarily for the library of the National Institute of Adult Education in London, England. Kinds of persons being educated (educands), methods and problems of education, specific countries, specific organizations, and forms in which the information is…

  3. A Computer Oriented Scheme for Coding Chemicals in the Field of Biomedicine.

    ERIC Educational Resources Information Center

    Bobka, Marilyn E.; Subramaniam, J.B.

    The chemical coding scheme of the Medical Coding Scheme (MCS), developed for use in the Comparative Systems Laboratory (CSL), is outlined and evaluated in this report. The chemical coding scheme provides a classification scheme and encoding method for drugs and chemical terms. Using the scheme complicated chemical structures may be expressed…

  4. A Noise-Filtered Under-Sampling Scheme for Imbalanced Classification.

    PubMed

    Kang, Qi; Chen, XiaoShuang; Li, SiSi; Zhou, MengChu

    2017-12-01

    Under-sampling is a popular data preprocessing method for dealing with class imbalance problems, with the purposes of balancing datasets to achieve a high classification rate and avoiding the bias toward majority class examples. It always uses the full minority data in a training dataset. However, some noisy minority examples may reduce the performance of classifiers. In this paper, a new under-sampling scheme is proposed that incorporates a noise filter before executing resampling. To verify its efficiency, this scheme is implemented on top of four popular under-sampling methods, i.e., Undersampling + Adaboost, RUSBoost, UnderBagging, and EasyEnsemble, and evaluated through benchmarks and significance analysis. Furthermore, this paper also summarizes the relationship between algorithm performance and imbalance ratio. Experimental results indicate that the proposed scheme can significantly improve the original under-sampling-based methods in terms of three popular metrics for imbalanced classification, i.e., the area under the curve (AUC), F-measure, and G-mean.
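The scheme's two stages, filter noisy minority examples and then under-sample the majority class, can be sketched with a simple k-NN disagreement rule standing in for the paper's noise filter. The feature vectors and label names below are invented.

```python
import math
import random

def nearest_labels(x, data, k):
    """Labels of the k nearest other points (Euclidean distance)."""
    return [lab for _, lab in sorted(
        (math.dist(x, xi), lab) for xi, lab in data)[1:k + 1]]

def noise_filtered_undersample(data, minority, k=3, seed=0):
    """Drop minority points none of whose k nearest neighbours share
    their label (treated as noise), then randomly under-sample the
    majority class down to the cleaned minority size."""
    min_pts = [(x, y) for x, y in data if y == minority]
    maj_pts = [(x, y) for x, y in data if y != minority]
    clean_min = [(x, y) for x, y in min_pts
                 if minority in nearest_labels(x, data, k)]
    rng = random.Random(seed)
    sampled = rng.sample(maj_pts, min(len(maj_pts), len(clean_min)))
    return clean_min + sampled

data = ([((0.0, 0.0), "min"), ((0.1, 0.1), "min"),
         ((0.2, 0.0), "min"), ((5.0, 5.0), "min")] +   # last one noisy
        [((4.0, 4.0), "maj"), ((4.5, 5.0), "maj"), ((5.0, 4.5), "maj"),
         ((5.5, 5.5), "maj"), ((6.0, 5.0), "maj"), ((4.0, 6.0), "maj")])
balanced = noise_filtered_undersample(data, "min")
print(len(balanced))  # 6: three cleaned minority + three majority
```

In the paper the cleaned, balanced set then feeds the four boosting/bagging under-sampling ensembles; any classifier can be trained on `balanced` here.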

  5. Object-based delineation and classification of alluvial fans by application of mean-shift segmentation and support vector machines

    NASA Astrophysics Data System (ADS)

    Pipaud, Isabel; Lehmkuhl, Frank

    2017-09-01

    In the field of geomorphology, automated extraction and classification of landforms is one of the most active research areas. Until the late 2000s, this task had primarily been tackled using pixel-based approaches. As these methods consider pixels and pixel neighborhoods as the sole basic entities for analysis, they cannot account for the irregular boundaries of real-world objects. Object-based analysis frameworks emerging from the field of remote sensing have been proposed as an alternative approach, and were successfully applied in case studies falling in the domains of both general and specific geomorphology. In this context, the a priori selection of scale parameters or bandwidths is crucial for the segmentation result, because inappropriate parametrization will result in either over-segmentation or under-segmentation. In this study, we describe a novel supervised method for the delineation and classification of alluvial fans, and assess its applicability using an SRTM 1″ (1 arc-second) DEM scene depicting a section of the north-eastern Mongolian Altai in northwest Mongolia. The approach is premised on the application of mean-shift segmentation and the use of a one-class support vector machine (SVM) for classification. To account for variability in alluvial fan dimension and shape, segmentation is performed repeatedly for different weightings of the incorporated morphometric parameters as well as different segmentation bandwidths. The final classification layer is obtained by selecting, for each real-world object, the most appropriate segmentation result according to fuzzy membership values derived from the SVM classification. Our results show that mean-shift segmentation and SVM-based classification provide an effective framework for the delineation and classification of a particular landform. 
Variable bandwidths and terrain parameter weightings were identified as crucial for capturing intra-class variability and, in turn, for consistently high segmentation quality. Our analysis further reveals that incorporating morphometric parameters that quantify specific morphological aspects of a landform is indispensable for developing an accurate classification scheme. Alluvial fans exhibiting accentuated composite morphologies were identified as a major challenge for automatic delineation, as they cannot be fully captured by a single segmentation run. There is, however, a high probability that this shortcoming can be overcome by enhancing the presented approach with a routine that merges fan sub-entities based on their spatial relationships.
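Mean-shift segmentation groups observations by the kernel-density mode they converge to. A minimal 1-D flat-kernel version over invented per-pixel slope values follows; the study operates on multi-attribute DEM data with varied bandwidths and adds the one-class SVM stage on top.

```python
def mean_shift_1d(values, bandwidth, iters=50):
    """Minimal flat-kernel mean shift in 1-D: repeatedly move each
    point's mode estimate to the mean of the original values within
    `bandwidth`; points converging to the same mode form one segment."""
    modes = list(values)
    for _ in range(iters):
        new_modes = []
        for m in modes:
            window = [v for v in values if abs(v - m) <= bandwidth]
            new_modes.append(sum(window) / len(window))
        modes = new_modes
    labels, centers = [], []
    for m in modes:
        for idx, c in enumerate(centers):
            if abs(m - c) <= bandwidth / 2:
                labels.append(idx)
                break
        else:
            centers.append(m)
            labels.append(len(centers) - 1)
    return labels, centers

# Invented per-pixel slope values (degrees): a flat plain and a
# steeper fan surface separate into two segments.
slopes = [1.0, 1.2, 0.8, 1.1, 8.0, 8.5, 7.9, 8.2]
labels, centers = mean_shift_1d(slopes, bandwidth=2.0)
print(labels)  # [0, 0, 0, 0, 1, 1, 1, 1]
```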

  6. Maximum likelihood estimation of label imperfections and its use in the identification of mislabeled patterns

    NASA Technical Reports Server (NTRS)

    Chittineni, C. B.

    1979-01-01

    The problem of estimating label imperfections and the use of the estimates in identifying mislabeled patterns is presented. Expressions for the maximum likelihood estimates of classification errors and a priori probabilities are derived from the classification of a set of labeled patterns. Expressions are also given for the asymptotic variances of the probability of correct classification and the proportions. Simple models are developed for imperfections in the labels and for classification errors, and are used in the formulation of a maximum likelihood estimation scheme. Schemes are presented for the identification of mislabeled patterns in terms of thresholds on the discriminant functions for both two-class and multiclass cases. Expressions are derived for the probability that the imperfect label identification scheme will result in a wrong decision, and are used in computing thresholds. The results of practical applications of these techniques in the processing of remotely sensed multispectral data are presented.
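    The thresholding idea for flagging mislabeled patterns can be sketched as follows (two-class case; the discriminant scores, labels, and threshold value are illustrative, not the paper's derived expressions):

    ```python
    # Sketch: a pattern whose discriminant score strongly opposes its
    # assigned label (beyond a threshold) is flagged as possibly
    # mislabeled. Illustrative data and threshold.

    def flag_mislabeled(samples, threshold):
        """samples: list of (score, label), label in {+1, -1};
        a positive score favors class +1."""
        flagged = []
        for i, (score, label) in enumerate(samples):
            # Large score opposing the given label -> suspect label.
            if label * score < -threshold:
                flagged.append(i)
        return flagged

    data = [(2.3, +1), (-1.8, -1), (-2.5, +1), (0.2, -1)]
    print(flag_mislabeled(data, threshold=1.0))  # [2]
    ```

    In the paper, the threshold is not arbitrary: it is computed from the derived probability that flagging results in a wrong decision.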

  7. A Standardised Vocabulary for Identifying Benthic Biota and Substrata from Underwater Imagery: The CATAMI Classification Scheme

    PubMed Central

    Jordan, Alan; Rees, Tony; Gowlett-Holmes, Karen

    2015-01-01

    Imagery collected by still and video cameras is an increasingly important tool for minimal impact, repeatable observations in the marine environment. Data generated from imagery include identification, annotation and quantification of biological subjects and environmental features within an image. To be long-lived and useful beyond their project-specific initial purpose, and to maximize their utility across studies and disciplines, marine imagery data should use a standardised vocabulary of defined terms. This would enable the compilation of regional, national and/or global data sets from multiple sources, contributing to broad-scale management studies and the development of automated annotation algorithms. The classification scheme developed under the Collaborative and Automated Tools for Analysis of Marine Imagery (CATAMI) project provides such a vocabulary. The CATAMI classification scheme introduces standardised, Australia-wide acknowledged terminology for annotating benthic substrates and biota in marine imagery. It combines coarse-level taxonomy and morphology, and is a flexible, hierarchical classification that bridges the gap between habitat/biotope characterisation and taxonomy, acknowledging limitations when describing biological taxa through imagery. It is fully described, documented, and maintained through curated online databases, and can be applied across benthic image collection methods, annotation platforms and scoring methods. Following its release in 2013, the CATAMI classification scheme was taken up by a wide variety of users, including government, academia and industry. This rapid acceptance highlights the scheme's utility and its potential to facilitate broad-scale multidisciplinary studies of marine ecosystems when applied globally. Here we present the CATAMI classification scheme, describe its conception and features, and discuss its utility and the opportunities as well as challenges arising from its use. PMID:26509918

  8. A new Fourier transform based CBIR scheme for mammographic mass classification: a preliminary invariance assessment

    NASA Astrophysics Data System (ADS)

    Gundreddy, Rohith Reddy; Tan, Maxine; Qui, Yuchen; Zheng, Bin

    2015-03-01

    The purpose of this study is to develop and test a new content-based image retrieval (CBIR) scheme that achieves higher reproducibility when implemented in an interactive computer-aided diagnosis (CAD) system, without significantly reducing lesion classification performance. This is a new Fourier transform based CBIR algorithm that determines the image similarity of two regions of interest (ROI) based on the difference of the average regional image pixel value distributions in the two Fourier transform mapped images under comparison. A reference image database involving 227 ROIs depicting verified soft-tissue breast lesions was used. For each testing ROI, the queried lesion center was systematically shifted from 10 to 50 pixels to simulate inter-user variation in querying a suspicious lesion center when using an interactive CAD system. The lesion classification performance and reproducibility as the queried lesion center shifted were assessed and compared among three CBIR schemes based on Fourier transform, mutual information and Pearson correlation. Each CBIR scheme retrieved the 10 most similar reference ROIs and computed a likelihood score of the queried ROI depicting a malignant lesion. The experimental results showed that the three CBIR schemes yielded very comparable lesion classification performance as measured by the areas under ROC curves, with p-values greater than 0.498. However, the CBIR scheme using the Fourier transform yielded the highest invariance to both queried lesion center shift and lesion size change. This study demonstrated the feasibility of improving the robustness of interactive CAD systems by adding a new Fourier transform based image feature to CBIR schemes.
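    A minimal sketch of the general idea, comparing Fourier-magnitude distributions of two 1-D "ROIs" in pure Python (the actual scheme operates on 2-D mammographic regions; the data and normalization here are illustrative):

    ```python
    # Sketch: similarity of two regions via the difference of their
    # normalized Fourier-magnitude distributions. Illustrative only.
    import cmath

    def dft_magnitudes(signal):
        """Magnitude spectrum via a direct discrete Fourier transform."""
        n = len(signal)
        mags = []
        for k in range(n):
            s = sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
            mags.append(abs(s))
        return mags

    def fourier_distance(roi_a, roi_b):
        """Mean absolute difference of normalized magnitude spectra."""
        ma, mb = dft_magnitudes(roi_a), dft_magnitudes(roi_b)
        sa, sb = sum(ma), sum(mb)
        na = [m / sa for m in ma]
        nb = [m / sb for m in mb]
        return sum(abs(a - b) for a, b in zip(na, nb)) / len(na)

    roi1 = [1, 2, 3, 4, 3, 2, 1, 0]
    roi2 = [1, 2, 3, 4, 3, 2, 1, 0]   # identical region -> distance 0
    roi3 = [0, 4, 0, 4, 0, 4, 0, 4]   # very different spectrum
    print(fourier_distance(roi1, roi2))       # 0.0
    print(fourier_distance(roi1, roi3) > 0)   # True
    ```

    Because the magnitude spectrum is insensitive to spatial shifts of the content, a similarity measure built on it is plausibly more robust to queried-center shift than pixel-space correlation, which is the invariance property the study tests.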

  9. The classification of anxiety and hysterical states. Part I. Historical review and empirical delineation.

    PubMed

    Sheehan, D V; Sheehan, K H

    1982-08-01

    The history of the classification of anxiety, hysterical, and hypochondriacal disorders is reviewed. Problems in the ability of current classification schemes to predict, control, and describe the relationship between the symptoms and other phenomena are outlined. Existing classification schemes failed the first test of a good classification model--that of providing categories that are mutually exclusive. The independence of these diagnostic categories from each other does not appear to hold up on empirical testing. In the absence of inherently mutually exclusive categories, further empirical investigation of these classes is obstructed since statistically valid analysis of the nominal data and any useful multivariate analysis would be difficult if not impossible. It is concluded that the existing classifications are unsatisfactory and require some fundamental reconceptualization.

  10. Evaluation of effectiveness of wavelet based denoising schemes using ANN and SVM for bearing condition classification.

    PubMed

    Vijay, G S; Kumar, H S; Srinivasa Pai, P; Sriram, N S; Rao, Raj B K N

    2012-01-01

    Wavelet-based denoising has proven its ability to denoise bearing vibration signals by improving the signal-to-noise ratio (SNR) and reducing the root-mean-square error (RMSE). In this paper, seven wavelet-based denoising schemes have been evaluated based on the performance of an Artificial Neural Network (ANN) and a Support Vector Machine (SVM) for bearing condition classification. The work consists of two parts. In the first part, a synthetic signal simulating a defective bearing vibration signal with Gaussian noise was subjected to these denoising schemes, and the best scheme based on the SNR and the RMSE was identified. In the second part, vibration signals collected from a customized Rolling Element Bearing (REB) test rig for four bearing conditions were subjected to these denoising schemes. Several time and frequency domain features were extracted from the denoised signals, out of which a few sensitive features were selected using Fisher's Criterion (FC). The extracted features were used to train and test the ANN and the SVM. The best denoising scheme identified, based on the classification performances of the ANN and the SVM, was found to be the same as the one obtained using the synthetic signal.
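    The two figures of merit named above can be sketched in plain Python (toy signals; a hypothetical small residual error stands in for a denoiser's output):

    ```python
    # Sketch of the SNR (in dB) and RMSE metrics used to rank the
    # denoising schemes, computed against a clean reference signal.
    import math

    def rmse(clean, denoised):
        """Root-mean-square error between reference and denoised signal."""
        return math.sqrt(sum((c - d) ** 2 for c, d in zip(clean, denoised))
                         / len(clean))

    def snr_db(clean, denoised):
        """Ratio of signal power to residual-noise power, in decibels."""
        signal_power = sum(c ** 2 for c in clean)
        noise_power = sum((c - d) ** 2 for c, d in zip(clean, denoised))
        return 10 * math.log10(signal_power / noise_power)

    clean    = [0.0, 1.0, 0.0, -1.0] * 8
    denoised = [c + 0.05 for c in clean]    # small residual error
    print(round(rmse(clean, denoised), 3))  # 0.05
    print(snr_db(clean, denoised) > 20)     # True: little residual noise
    ```

    A scheme that raises SNR and lowers RMSE on the synthetic signal would, per the study's finding, also tend to yield the best ANN/SVM classification on real bearing data.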

  11. Applying the Methodology of the Community College Classification Scheme to the Public Master's Colleges and Universities Sector

    ERIC Educational Resources Information Center

    Kinkead, J. Clint.; Katsinas, Stephen G.

    2011-01-01

    This work brings forward the geographically-based classification scheme for the public Master's Colleges and Universities sector. Using the same methodology developed by Katsinas and Hardy (2005) to classify community colleges, this work classifies Master's Colleges and Universities. This work has four major findings and conclusions. First, a…

  12. Data Analytics for Smart Parking Applications.

    PubMed

    Piovesan, Nicola; Turi, Leo; Toigo, Enrico; Martinez, Borja; Rossi, Michele

    2016-09-23

    We consider real-life smart parking systems where parking lot occupancy data are collected from field sensor devices and sent to backend servers for further processing and usage for applications. Our objective is to make these data useful to end users, such as parking managers, and, ultimately, to citizens. To this end, we concoct and validate an automated classification algorithm having two objectives: (1) outlier detection: to detect sensors with anomalous behavioral patterns, i.e., outliers; and (2) clustering: to group the parking sensors exhibiting similar patterns into distinct clusters. We first analyze the statistics of real parking data, obtaining suitable simulation models for parking traces. We then consider a simple classification algorithm based on the empirical complementary distribution function of occupancy times and show its limitations. Hence, we design a more sophisticated algorithm exploiting unsupervised learning techniques (self-organizing maps). These are tuned following a supervised approach using our trace generator and are compared against other clustering schemes, namely expectation maximization, k-means clustering and DBSCAN, considering six months of data from a real sensor deployment. Our approach is found to be superior in terms of classification accuracy, while also being capable of identifying all of the outliers in the dataset.
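    The simple baseline the authors describe, classifying sensors by the empirical complementary distribution function (ECCDF) of occupancy times, can be sketched as follows (occupancy values, grid, and thresholds are illustrative):

    ```python
    # Sketch: characterize each parking sensor by the ECCDF of its
    # occupancy times, then compare sensors by the sup-norm distance
    # between their ECCDFs. Illustrative toy data.

    def eccdf(samples, grid):
        """Estimate P(X > t) at each t in grid."""
        n = len(samples)
        return [sum(1 for x in samples if x > t) / n for t in grid]

    def eccdf_distance(s1, s2, grid):
        e1, e2 = eccdf(s1, grid), eccdf(s2, grid)
        return max(abs(a - b) for a, b in zip(e1, e2))

    grid = [0, 10, 20, 40, 80]               # minutes
    normal_a = [5, 12, 18, 25, 30, 22]       # typical occupancy times
    normal_b = [6, 11, 19, 27, 28, 24]
    stuck    = [90, 95, 120, 110, 130, 100]  # anomalous: always "occupied"

    print(eccdf_distance(normal_a, normal_b, grid) < 0.2)  # similar sensors
    print(eccdf_distance(normal_a, stuck, grid) > 0.5)     # outlier candidate
    ```

    The abstract notes this simple approach has limitations, which motivates the self-organizing-map classifier the authors ultimately adopt.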

  13. Data Analytics for Smart Parking Applications

    PubMed Central

    Piovesan, Nicola; Turi, Leo; Toigo, Enrico; Martinez, Borja; Rossi, Michele

    2016-01-01

    We consider real-life smart parking systems where parking lot occupancy data are collected from field sensor devices and sent to backend servers for further processing and usage for applications. Our objective is to make these data useful to end users, such as parking managers, and, ultimately, to citizens. To this end, we concoct and validate an automated classification algorithm having two objectives: (1) outlier detection: to detect sensors with anomalous behavioral patterns, i.e., outliers; and (2) clustering: to group the parking sensors exhibiting similar patterns into distinct clusters. We first analyze the statistics of real parking data, obtaining suitable simulation models for parking traces. We then consider a simple classification algorithm based on the empirical complementary distribution function of occupancy times and show its limitations. Hence, we design a more sophisticated algorithm exploiting unsupervised learning techniques (self-organizing maps). These are tuned following a supervised approach using our trace generator and are compared against other clustering schemes, namely expectation maximization, k-means clustering and DBSCAN, considering six months of data from a real sensor deployment. Our approach is found to be superior in terms of classification accuracy, while also being capable of identifying all of the outliers in the dataset. PMID:27669259

  14. Cleaning and Cleanliness Measurement of Additive Manufactured Parts

    NASA Technical Reports Server (NTRS)

    Mitchell, Mark A.; Raley, Randy

    2016-01-01

    The successful acquisition and utilization of piece parts and assemblies for contamination sensitive applications requires application of cleanliness acceptance criteria. Contamination can be classified using many different schemes. One common scheme is classification as organic, ionic and particulate contaminants. These may be present in and on the surface of solid components and assemblies or may be dispersed in various gaseous or liquid media. This discussion will focus on insoluble particle contamination on the surfaces of piece parts and assemblies. Cleanliness of parts can be controlled using two strategies, referred to as gross cleanliness and precision cleanliness. Under a gross cleanliness strategy acceptance is based on visual cleanliness. This approach introduces a number of concerns that render it unsuitable for controlling cleanliness of high technology products. Under the precision cleanliness strategy, subjective, visual assessment of cleanliness is replaced by objective measurement of cleanliness. When a precision cleanliness strategy is adopted there naturally arises the question: How clean is clean enough? The methods for establishing objective cleanliness acceptance limits will be discussed.

  15. A new classification scheme for periodontal and peri-implant diseases and conditions - Introduction and key changes from the 1999 classification.

    PubMed

    G Caton, Jack; Armitage, Gary; Berglundh, Tord; Chapple, Iain L C; Jepsen, Søren; S Kornman, Kenneth; L Mealey, Brian; Papapanou, Panos N; Sanz, Mariano; S Tonetti, Maurizio

    2018-06-01

    A classification scheme for periodontal and peri-implant diseases and conditions is necessary for clinicians to properly diagnose and treat patients as well as for scientists to investigate etiology, pathogenesis, natural history, and treatment of the diseases and conditions. This paper summarizes the proceedings of the World Workshop on the Classification of Periodontal and Peri-implant Diseases and Conditions. The workshop was co-sponsored by the American Academy of Periodontology (AAP) and the European Federation of Periodontology (EFP) and included expert participants from all over the world. Planning for the conference, which was held in Chicago on November 9 to 11, 2017, began in early 2015. An organizing committee from the AAP and EFP commissioned 19 review papers and four consensus reports covering relevant areas in periodontology and implant dentistry. The authors were charged with updating the 1999 classification of periodontal diseases and conditions and developing a similar scheme for peri-implant diseases and conditions. Reviewers and workgroups were also asked to establish pertinent case definitions and to provide diagnostic criteria to aid clinicians in the use of the new classification. All findings and recommendations of the workshop were agreed to by consensus. This introductory paper presents an overview for the new classification of periodontal and peri-implant diseases and conditions, along with a condensed scheme for each of four workgroup sections, but readers are directed to the pertinent consensus reports and review papers for a thorough discussion of the rationale, criteria, and interpretation of the proposed classification. Changes to the 1999 classification are highlighted and discussed. Although the intent of the workshop was to base classification on the strongest available scientific evidence, lower level evidence and expert opinion were inevitably used whenever sufficient research data were unavailable. 
The scope of this workshop was to align and update the classification scheme to the current understanding of periodontal and peri-implant diseases and conditions. This introductory overview presents the schematic tables for the new classification of periodontal and peri-implant diseases and conditions and briefly highlights changes made to the 1999 classification. It cannot present the wealth of information included in the reviews, case definition papers, and consensus reports that has guided the development of the new classification, and reference to the consensus and case definition papers is necessary to provide a thorough understanding of its use for either case management or scientific investigation. Therefore, it is strongly recommended that the reader use this overview as an introduction to these subjects. Accessing this publication online will allow the reader to use the links in this overview and the tables to view the source papers (Table 1). © 2018 American Academy of Periodontology and European Federation of Periodontology.

  16. A new classification scheme for periodontal and peri-implant diseases and conditions - Introduction and key changes from the 1999 classification.

    PubMed

    G Caton, Jack; Armitage, Gary; Berglundh, Tord; Chapple, Iain L C; Jepsen, Søren; S Kornman, Kenneth; L Mealey, Brian; Papapanou, Panos N; Sanz, Mariano; S Tonetti, Maurizio

    2018-06-01

    A classification scheme for periodontal and peri-implant diseases and conditions is necessary for clinicians to properly diagnose and treat patients as well as for scientists to investigate etiology, pathogenesis, natural history, and treatment of the diseases and conditions. This paper summarizes the proceedings of the World Workshop on the Classification of Periodontal and Peri-implant Diseases and Conditions. The workshop was co-sponsored by the American Academy of Periodontology (AAP) and the European Federation of Periodontology (EFP) and included expert participants from all over the world. Planning for the conference, which was held in Chicago on November 9 to 11, 2017, began in early 2015. An organizing committee from the AAP and EFP commissioned 19 review papers and four consensus reports covering relevant areas in periodontology and implant dentistry. The authors were charged with updating the 1999 classification of periodontal diseases and conditions and developing a similar scheme for peri-implant diseases and conditions. Reviewers and workgroups were also asked to establish pertinent case definitions and to provide diagnostic criteria to aid clinicians in the use of the new classification. All findings and recommendations of the workshop were agreed to by consensus. This introductory paper presents an overview for the new classification of periodontal and peri-implant diseases and conditions, along with a condensed scheme for each of four workgroup sections, but readers are directed to the pertinent consensus reports and review papers for a thorough discussion of the rationale, criteria, and interpretation of the proposed classification. Changes to the 1999 classification are highlighted and discussed. Although the intent of the workshop was to base classification on the strongest available scientific evidence, lower level evidence and expert opinion were inevitably used whenever sufficient research data were unavailable. 
The scope of this workshop was to align and update the classification scheme to the current understanding of periodontal and peri-implant diseases and conditions. This introductory overview presents the schematic tables for the new classification of periodontal and peri-implant diseases and conditions and briefly highlights changes made to the 1999 classification. It cannot present the wealth of information included in the reviews, case definition papers, and consensus reports that has guided the development of the new classification, and reference to the consensus and case definition papers is necessary to provide a thorough understanding of its use for either case management or scientific investigation. Therefore, it is strongly recommended that the reader use this overview as an introduction to these subjects. Accessing this publication online will allow the reader to use the links in this overview and the tables to view the source papers (Table 1). © 2018 American Academy of Periodontology and European Federation of Periodontology.

  17. Probing the Dusty Stellar Populations of the Local Volume Galaxies with JWST/MIRI

    NASA Astrophysics Data System (ADS)

    Jones, Olivia C.; Meixner, Margaret; Justtanont, Kay; Glasse, Alistair

    2017-05-01

    The Mid-Infrared Instrument (MIRI) for the James Webb Space Telescope (JWST) will revolutionize our understanding of infrared stellar populations in the Local Volume. Using the rich Spitzer-IRS spectroscopic data set and spectral classifications from the Surveying the Agents of Galaxy Evolution (SAGE)-Spectroscopic survey of more than 1000 objects in the Magellanic Clouds, the Grid of Red Supergiant and Asymptotic Giant Branch Star Models (GRAMS), and the grid of YSO models by Robitaille et al., we calculate the expected flux densities and colors in the MIRI broadband filters for prominent infrared stellar populations. We use these fluxes to explore the JWST/MIRI colors and magnitudes for composite stellar population studies of Local Volume galaxies. MIRI color classification schemes are presented; these diagrams provide a powerful means of identifying young stellar objects, evolved stars, and extragalactic background galaxies in Local Volume galaxies with a high degree of confidence. Finally, we examine which filter combinations are best for selecting populations of sources based on their JWST colors.

  18. Land cover

    USGS Publications Warehouse

    Jorgenson, Janet C.; Joria, Peter C.; Douglas, David C.; Douglas, David C.; Reynolds, Patricia E.; Rhode, E.B.

    2002-01-01

    Documenting the distribution of land-cover types on the Arctic National Wildlife Refuge coastal plain is the foundation for impact assessment and mitigation of potential oil exploration and development. Vegetation maps facilitate wildlife studies by allowing biologists to quantify the availability of important wildlife habitats, investigate the relationships between animal locations and the distribution or juxtaposition of habitat types, and assess or extrapolate habitat characteristics across regional areas. To meet the needs of refuge managers and biologists, satellite imagery was chosen as the most cost-effective method for mapping the large, remote landscape of the 1002 Area. The objectives of our study were the following: 1) evaluate a vegetation classification scheme for use in mapping; 2) determine optimal methods for producing a satellite-based vegetation map that adequately met the needs of the wildlife research and management objectives; 3) produce a digital vegetation map for the Arctic Refuge coastal plain using Landsat Thematic Mapper (TM) satellite imagery, existing geobotanical classifications, ground data, and aerial photographs; and 4) perform an accuracy assessment of the map.

  19. Automatic classification of protein structures using physicochemical parameters.

    PubMed

    Mohan, Abhilash; Rao, M Divya; Sunderrajan, Shruthi; Pennathur, Gautam

    2014-09-01

    Protein classification is the first step to functional annotation; SCOP and Pfam databases are currently the most relevant protein classification schemes. However, the disproportion in the number of three dimensional (3D) protein structures generated versus their classification into relevant superfamilies/families emphasizes the need for automated classification schemes. Predicting function of novel proteins based on sequence information alone has proven to be a major challenge. The present study focuses on the use of physicochemical parameters in conjunction with machine learning algorithms (Naive Bayes, Decision Trees, Random Forest and Support Vector Machines) to classify proteins into their respective SCOP superfamily/Pfam family, using sequence derived information. Spectrophores™, a 1D descriptor of the 3D molecular field surrounding a structure was used as a benchmark to compare the performance of the physicochemical parameters. The machine learning algorithms were modified to select features based on information gain for each SCOP superfamily/Pfam family. The effect of combining physicochemical parameters and spectrophores on classification accuracy (CA) was studied. Machine learning algorithms trained with the physicochemical parameters consistently classified SCOP superfamilies and Pfam families with a classification accuracy above 90%, while spectrophores performed with a CA of around 85%. Feature selection improved classification accuracy for both physicochemical parameters and spectrophores based machine learning algorithms. Combining both attributes resulted in a marginal loss of performance. Physicochemical parameters were able to classify proteins from both schemes with classification accuracy ranging from 90-96%. These results suggest the usefulness of this method in classifying proteins from amino acid sequences.
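    Information-gain feature selection, as used to modify the learners per superfamily/family, can be sketched for binarized features (toy data; not the study's implementation):

    ```python
    # Sketch: rank a (binarized) feature by the reduction in class-label
    # entropy it produces when the data are split on it. Toy values only.
    import math

    def entropy(labels):
        """Shannon entropy of a label list, in bits."""
        n = len(labels)
        probs = [labels.count(c) / n for c in set(labels)]
        return -sum(p * math.log2(p) for p in probs)

    def information_gain(feature, labels):
        """feature: list of 0/1 values aligned with labels."""
        gain = entropy(labels)
        for v in (0, 1):
            subset = [l for f, l in zip(feature, labels) if f == v]
            if subset:
                gain -= len(subset) / len(labels) * entropy(subset)
        return gain

    labels  = ['a', 'a', 'b', 'b']
    perfect = [0, 0, 1, 1]   # splits the classes exactly
    useless = [0, 1, 0, 1]   # independent of the class
    print(information_gain(perfect, labels))  # 1.0
    print(information_gain(useless, labels))  # 0.0
    ```

    Ranking physicochemical descriptors this way, and keeping only the high-gain ones per family, is consistent with the accuracy improvement the study reports from feature selection.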

  20. A classification scheme for risk assessment methods.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stamp, Jason Edwin; Campbell, Philip LaRoche

    2004-08-01

    This report presents a classification scheme for risk assessment methods. This scheme, like all classification schemes, provides meaning by imposing a structure that identifies relationships. Our scheme is based on two orthogonal aspects--level of detail, and approach. The resulting structure is shown in Table 1 and is explained in the body of the report. Each cell in the table represents a different arrangement of strengths and weaknesses; those arrangements shift gradually as one moves through the table, each cell optimal for a particular situation. The intention of this report is to enable informed use of the methods so that the method chosen is optimal for a given situation. This report imposes structure on the set of risk assessment methods in order to reveal their relationships and thus optimize their usage. We present a two-dimensional structure in the form of a matrix, using three abstraction levels for the rows and three approaches for the columns. For each of the nine cells in the matrix we identify the method type by name and example. The matrix helps the user understand: (1) what to expect from a given method, (2) how it relates to other methods, and (3) how best to use it. The matrix, with type names in the cells, is introduced in Table 2 of the report. Unless otherwise stated we use the word 'method' in this report to refer to a 'risk assessment method', though oftentimes we use the full phrase. The terms 'risk assessment' and 'risk management' are close enough that we do not attempt to distinguish them in this report. The remainder of this report is organized as follows.
In Section 2 we provide context for this report--what a 'method' is and where it fits. In Section 3 we present background for our classification scheme--what other schemes we have found, the fundamental nature of methods, and their necessary incompleteness. In Section 4 we present our classification scheme in the form of a matrix, then present an analogy that should provide an understanding of the scheme, concluding with an explanation of the two dimensions and the nine types in our scheme. In Section 5 we present examples of each of our classification types. In Section 6 we present conclusions.
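    The report's 3x3 organization (abstraction level x approach) lends itself to a simple lookup structure. The level names, approach names, and cell contents below are generic placeholders, not the report's own labels:

    ```python
    # Sketch of a two-dimensional method-selection matrix keyed on
    # (abstraction level, approach). All names are hypothetical.

    RISK_METHOD_MATRIX = {
        ("high", "temporal"):   "scenario-based analysis",
        ("high", "functional"): "mission-impact analysis",
        ("high", "structural"): "architecture review",
        ("mid",  "temporal"):   "attack-tree analysis",
        ("mid",  "functional"): "failure-mode analysis",
        ("mid",  "structural"): "dependency mapping",
        ("low",  "temporal"):   "event-log analysis",
        ("low",  "functional"): "control-by-control audit",
        ("low",  "structural"): "configuration scanning",
    }

    def choose_method(level, approach):
        """Look up the method type for one cell of the matrix."""
        return RISK_METHOD_MATRIX[(level, approach)]

    print(choose_method("mid", "functional"))  # failure-mode analysis
    ```

    The point of the structure, as the report argues, is that each cell carries a distinct strength/weakness profile, so selection reduces to locating the situation in the matrix.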

  1. Classification of instability after reverse shoulder arthroplasty guides surgical management and outcomes.

    PubMed

    Abdelfattah, Adham; Otto, Randall J; Simon, Peter; Christmas, Kaitlyn N; Tanner, Gregory; LaMartina, Joey; Levy, Jonathan C; Cuff, Derek J; Mighell, Mark A; Frankle, Mark A

    2018-04-01

    Revision of unstable reverse shoulder arthroplasty (RSA) remains a significant challenge. The purpose of this study was to determine the reliability of a new treatment-guiding classification for instability after RSA, to describe the clinical outcomes of patients stabilized operatively, and to identify those at higher risk of recurrence. All patients undergoing revision for instability after RSA were identified at our institution. Demographic, clinical, radiographic, and intraoperative data were collected. A classification was developed using all identified causes of instability after RSA and allocating them to 1 of 3 defined treatment-guiding categories. Eight surgeons reviewed all data and applied the classification scheme to each case. Interobserver and intraobserver reliability was used to evaluate the classification scheme. Preoperative clinical outcomes were compared with final follow-up in stabilized shoulders. Forty-three revision cases in 34 patients met the inclusion criteria for the study. Five patients remained unstable after revision. Persistent instability occurred most commonly with persistent deltoid dysfunction and postoperative acromial fractures, but also in 1 case of soft tissue impingement. Twenty-one patients remained stable at a minimum 2 years of follow-up and had significant improvement in clinical outcome scores and range of motion. Reliability of the classification scheme showed substantial interobserver and almost perfect intraobserver agreement among all participants (κ = 0.699 and κ = 0.851, respectively). Instability after RSA can be successfully treated with revision surgery using the reliable treatment-guiding classification scheme presented herein. However, more understanding is needed for patients at greater risk of recurrent instability after revision surgery. Copyright © 2017 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.
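    The reliability figures quoted (κ = 0.699 interobserver, κ = 0.851 intraobserver) are kappa agreement statistics; a minimal two-rater Cohen's kappa computation on toy ratings looks like this:

    ```python
    # Sketch: Cohen's kappa = (observed agreement - chance agreement)
    # / (1 - chance agreement). Toy ratings, not study data.

    def cohens_kappa(rater1, rater2):
        n = len(rater1)
        categories = set(rater1) | set(rater2)
        observed = sum(1 for a, b in zip(rater1, rater2) if a == b) / n
        # Chance agreement from each rater's marginal category frequencies.
        expected = sum((rater1.count(c) / n) * (rater2.count(c) / n)
                       for c in categories)
        return (observed - expected) / (1 - expected)

    r1 = [1, 1, 2, 2, 3, 3, 1, 2]  # category assignments by rater 1
    r2 = [1, 1, 2, 2, 3, 1, 1, 2]  # category assignments by rater 2
    print(round(cohens_kappa(r1, r2), 3))  # 0.805
    ```

    On the conventional Landis-Koch scale, values of 0.61-0.80 are read as "substantial" and 0.81-1.00 as "almost perfect" agreement, matching the terms used in the abstract.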

  2. A Three-Phase Decision Model of Computer-Aided Coding for the Iranian Classification of Health Interventions (IRCHI).

    PubMed

    Azadmanjir, Zahra; Safdari, Reza; Ghazisaeedi, Marjan; Mokhtaran, Mehrshad; Kameli, Mohammad Esmail

    2017-06-01

    Accurate coded data in healthcare are critical. Computer-Assisted Coding (CAC) is an effective tool for improving clinical coding, particularly when a new classification is being developed and implemented. However, determining the appropriate development method requires considering the specifications of existing CAC systems, the requirements of each type, the available infrastructure, and the classification scheme itself. The aim of the study was to develop a decision model for determining the accurate code of each medical intervention in the Iranian Classification of Health Interventions (IRCHI) that can be implemented as a suitable CAC system. First, a sample of existing CAC systems was reviewed. Then the feasibility of each type of CAC was examined with regard to the prerequisites for its implementation. In the next step, a proper model was proposed according to the structure of the classification scheme and implemented as an interactive system. There is a significant relationship between the level of assistance a CAC system provides and its integration with electronic medical documents. Implementation of fully automated CAC systems is currently impossible due to the immature development of electronic medical records and problems in using natural language for medical documentation. Therefore, a model was proposed for a semi-automated CAC system based on the hierarchical relationships between entities in the classification scheme and on decision logic that specifies the characters of a code step by step through a web-based interactive user interface. The model is composed of three phases that select the Target, Action and Means, respectively, for an intervention. The proposed model suited the current status of clinical documentation and coding in Iran, as well as the structure of the new classification scheme. Our results show it was practical; however, the model needs to be evaluated in the next stage of the research.

  3. Twenty five years of beach monitoring in Hong Kong: A re-examination of the beach water quality classification scheme from a comparative and global perspective.

    PubMed

    Thoe, W; Lee, Olive H K; Leung, K F; Lee, T; Ashbolt, Nicholas J; Yang, Ron R; Chui, Samuel H K

    2018-06-01

    Hong Kong's beach water quality classification scheme, used effectively for >25 years in protecting public health, was first established through local epidemiological studies in the late 1980s, in which Escherichia coli (E. coli) was identified as the most suitable faecal indicator bacterium. To review and further substantiate the scheme's robustness, a performance check was carried out to classify the water quality of 37 major local beaches in Hong Kong during four bathing seasons (March-October) from 2010 to 2013. Given the enterococci and E. coli data collected, beach classification by the local scheme was found to be in line with the prominent international benchmarks recommended by the World Health Organization and the European Union. Local bacteriological studies over the last 15 years further confirmed that E. coli is a more suitable faecal indicator bacterium than enterococci in the local context. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Update on diabetes classification.

    PubMed

    Thomas, Celeste C; Philipson, Louis H

    2015-01-01

    This article highlights the difficulties in creating a definitive classification of diabetes mellitus in the absence of a complete understanding of the pathogenesis of the major forms. This brief review shows the evolving nature of the classification of diabetes mellitus. No classification scheme is ideal, and all have some overlap and inconsistencies. The only form of diabetes that can be accurately diagnosed by DNA sequencing, monogenic diabetes, remains undiagnosed in more than 90% of the individuals whose diabetes is caused by one of the known gene mutations. The point of classification, or taxonomy, of disease should be to give insight into both pathogenesis and treatment. It remains a source of frustration that all schemes of diabetes mellitus continue to fall short of this goal. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Identification of terrain cover using the optimum polarimetric classifier

    NASA Technical Reports Server (NTRS)

    Kong, J. A.; Swartz, A. A.; Yueh, H. A.; Novak, L. M.; Shin, R. T.

    1988-01-01

    A systematic approach for the identification of terrain media such as vegetation canopy, forest, and snow-covered fields is developed using the optimum polarimetric classifier. The covariance matrices for various terrain cover are computed from theoretical models of random medium by evaluating the scattering matrix elements. The optimal classification scheme makes use of a quadratic distance measure and is applied to classify a vegetation canopy consisting of both trees and grass. Experimentally measured data are used to validate the classification scheme. Analytical and Monte Carlo simulated classification errors using the fully polarimetric feature vector are compared with classification based on single features which include the phase difference between the VV and HH polarization returns. It is shown that the full polarimetric results are optimal and provide better classification performance than single feature measurements.
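
    A rough sketch of the quadratic distance measure follows. For zero-mean Gaussian feature vectors, the optimum classifier assigns a measurement x to the class minimizing d_k(x) = x^T C_k^{-1} x + ln det(C_k); the covariance matrices below are invented for illustration, whereas the paper derives them from random-medium scattering models:

```python
import numpy as np

def quadratic_distance(x, cov):
    # Gaussian log-likelihood distance (up to a constant) for zero-mean x.
    return x @ np.linalg.inv(cov) @ x + np.log(np.linalg.det(cov))

def classify(x, covariances):
    """Return the index of the class (terrain type) with minimum distance."""
    return int(np.argmin([quadratic_distance(x, C) for C in covariances]))

# Illustrative covariances for two "terrain" classes (not from the paper).
C_trees = np.array([[2.0, 0.5], [0.5, 1.0]])
C_grass = np.array([[0.5, 0.0], [0.0, 0.5]])

label = classify(np.array([1.5, 0.8]), [C_trees, C_grass])
```

    The fully polarimetric case works the same way, only with complex scattering-matrix feature vectors and Hermitian covariance matrices.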

  6. A proposed classification scheme for Ada-based software products

    NASA Technical Reports Server (NTRS)

    Cernosek, Gary J.

    1986-01-01

    As the requirements for producing software in the Ada language become a reality for projects such as the Space Station, a great amount of Ada-based program code will begin to emerge. Recognizing the potential for varying levels of quality to result in Ada programs, what is needed is a classification scheme that describes the quality of a software product whose source code exists in Ada form. A 5-level classification scheme is proposed that attempts to decompose this potentially broad spectrum of quality which Ada programs may possess. The number of classes and their corresponding names are not as important as the mere fact that there needs to be some set of criteria from which to evaluate programs existing in Ada. An exact criteria for each class is not presented, nor are any detailed suggestions of how to effectively implement this quality assessment. The idea of Ada-based software classification is introduced and a set of requirements from which to base further research and development is suggested.

  7. Application of NEPA to nuclear weapons production, storage, and testing Weinberger v. Catholic Action of Hawaii/Peace Education Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sauber, A.J.

    The National Environmental Policy Act (NEPA) requirement of environmental impact statements for the testing of military equipment, specifically nuclear weapons, conflicts with national security objectives. The author examines NEPA and the Freedom of Information Act (FOIA) in terms of the environmental effects of weapons testing and the relevant case law. The Supreme Court's decision in Catholic Action of Hawaii/Peace Education Project sought to resolve the conflict by distinguishing between a project which is contemplated and one which is proposed. The classification scheme embodied in the FOIA exemption for national security may cause unwarranted frustration of NEPA's goals. The author outlines a new classification system and review mechanism that could curb military abuse in this area.

  8. Advanced Steel Microstructural Classification by Deep Learning Methods.

    PubMed

    Azimi, Seyed Majid; Britz, Dominik; Engstler, Michael; Fritz, Mario; Mücklich, Frank

    2018-02-01

    The inner structure of a material is called its microstructure. It stores the genesis of a material and determines all its physical and chemical properties. While microstructural characterization is widespread and well known, microstructural classification is mostly done manually by human experts, which gives rise to uncertainties due to subjectivity. Since the microstructure can be a combination of different phases or constituents with complex substructures, its automatic classification is very challenging, and only a few prior studies exist. Prior works focused on features designed and engineered by experts, and classified microstructures separately from the feature extraction step. Recently, Deep Learning methods have shown strong performance in vision applications by learning the features from data together with the classification step. In this work, we propose a Deep Learning method for microstructural classification, applied to certain microstructural constituents of low carbon steel. The method employs pixel-wise segmentation via a Fully Convolutional Neural Network (FCNN) accompanied by a max-voting scheme. Our system achieves 93.94% classification accuracy, drastically outperforming the state-of-the-art method's 48.89% accuracy. Beyond the strong performance of our method, this line of research offers a more robust and, above all, objective approach to the difficult task of steel quality assessment.
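
    The max-voting step can be sketched as follows; the constituent labels and the grouping of pixels into an object are illustrative, and the FCNN segmentation itself is omitted:

```python
from collections import Counter

def max_vote(pixel_labels):
    """Assign an object the majority class among its per-pixel labels."""
    return Counter(pixel_labels).most_common(1)[0][0]

# An "object" whose pixels a segmentation network mostly labelled "pearlite"
# (labels are illustrative microstructural constituents):
obj = ["pearlite"] * 7 + ["martensite"] * 2 + ["bainite"]
```

    Voting over all of an object's pixels makes the final object-level label robust to scattered per-pixel segmentation errors.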

  9. Multidimensional classification of magma types for altered igneous rocks and application to their tectonomagmatic discrimination and igneous provenance of siliciclastic sediments

    NASA Astrophysics Data System (ADS)

    Verma, Surendra P.; Rivera-Gómez, M. Abdelaly; Díaz-González, Lorena; Pandarinath, Kailasa; Amezcua-Valdez, Alejandra; Rosales-Rivera, Mauricio; Verma, Sanjeet K.; Quiroz-Ruiz, Alfredo; Armstrong-Altrin, John S.

    2017-05-01

    A new multidimensional scheme consistent with the International Union of Geological Sciences (IUGS) is proposed for the classification of igneous rocks in terms of four magma types: ultrabasic, basic, intermediate, and acid. Our procedure is based on an extensive database of the major element compositions of 33,868 relatively fresh rock samples having a multinormal distribution (initial database of 37,215 samples). The multinormality of the database, in terms of log-ratios of the samples, was ascertained by a new computer program, DOMuDaF, in which the discordancy test was applied at the 99.9% confidence level. The isometric log-ratio (ilr) transformation was used, providing overall percent correct classifications of 88.7%, 75.8%, 88.0%, and 80.9% for ultrabasic, basic, intermediate, and acid rocks, respectively. Given its known mathematical and uncertainty-propagation properties, this transformation could be adopted for routine applications. Incorrect classifications occurred mainly between "neighbouring" magma types, e.g., basic for ultrabasic and vice versa. Some of these misclassifications do not have any effect on multidimensional tectonic discrimination. For efficient application of this multidimensional scheme, a new computer program MagClaMSys_ilr (MagClaMSys - Magma Classification Major-element based System) was written, which is available for on-line processing at http://tlaloc.ier.unam.mx/index.html. The classification scheme was tested with newly compiled data for relatively fresh Neogene igneous rocks and was found to be consistent with the conventional IUGS procedure. The new scheme was successfully applied to inter-laboratory data for three geochemical reference materials (basalts JB-1 and JB-1a, and andesite JA-3) from Japan and showed that the inferred magma types are consistent with the rock names (basic for basalts JB-1 and JB-1a and intermediate for andesite JA-3). 
The scheme was also successfully applied to five case studies of older Archaean to Mesozoic igneous rocks. Similar or more reliable results were obtained from existing tectonomagmatic discrimination diagrams when used in conjunction with the new computer program as compared to the IUGS scheme. The application to three case studies of igneous provenance of sedimentary rocks was demonstrated as a novel approach. Finally, we show that the new scheme is more robust for post-emplacement compositional changes than the conventional IUGS procedure.
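
    A minimal sketch of the ilr transformation for compositional data, for one standard choice of basis: z_i = sqrt(i/(i+1)) * ln(geometric_mean(x_1..x_i) / x_{i+1}), for i = 1..D-1. The composition values below are illustrative; this is not the MagClaMSys_ilr implementation itself:

```python
import math

def ilr(parts):
    """Isometric log-ratio coordinates of a D-part composition (D-1 values)."""
    z = []
    for i in range(1, len(parts)):
        # Geometric mean of the first i parts.
        gmean = math.exp(sum(math.log(p) for p in parts[:i]) / i)
        z.append(math.sqrt(i / (i + 1)) * math.log(gmean / parts[i]))
    return z

# A 3-part composition (e.g. normalized major-element fractions, illustrative)
coords = ilr([0.5, 0.3, 0.2])   # yields two ilr coordinates
```

    The ilr coordinates live in unconstrained real space, which is why standard multivariate statistics (and linear discriminant boundaries) can be applied to them, unlike raw closed compositions.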

  10. Automated flaw detection scheme for cast austenitic stainless steel weld specimens using Hilbert-Huang transform of ultrasonic phased array data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khan, Tariq; Majumdar, Shantanu; Udpa, Lalita

    2012-05-17

    The objective of this work is to develop processing algorithms to detect and localize flaws using ultrasonic phased-array data. Data was collected on cast austenitic stainless steel (CASS) weld specimens on loan from the U.S. nuclear power industry's Pressurized Water Reactor Owners Group (PWROG) traveling specimen set. Each specimen consists of a centrifugally cast stainless steel (CCSS) pipe section welded to a statically cast (SCSS) or wrought (WRSS) section. The paper presents a novel automated flaw detection and localization scheme using low frequency ultrasonic phased array inspection signals from the weld and heat-affected zone of the base materials. The major steps of the overall scheme are preprocessing and region of interest (ROI) detection, followed by the Hilbert-Huang transform (HHT) of A-scans in the detected ROIs. The HHT offers a time-frequency-energy distribution for each ROI. The accumulation of energy in a particular frequency band is used as a classification feature for the particular ROI.
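
    The band-energy feature can be sketched as below. This is a simplified FFT-based stand-in: the paper accumulates energy in the Hilbert-Huang time-frequency-energy distribution, whose empirical mode decomposition step is omitted here, and the signal and band edges are synthetic:

```python
import numpy as np

def band_energy(signal, fs, f_lo, f_hi):
    """Total spectral energy of `signal` between f_lo and f_hi (Hz)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return float(spectrum[mask].sum())

fs = 1000.0                               # sampling rate, Hz (synthetic)
t = np.arange(0, 1, 1 / fs)
ascan = np.sin(2 * np.pi * 50 * t)        # synthetic 50 Hz "A-scan"
feat = band_energy(ascan, fs, 40, 60)     # energy concentrated in 40-60 Hz
```

    An ROI whose A-scans concentrate energy in the flaw-indicative band would score high on this feature, which is then fed to the classifier.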

  11. The morphology of faint galaxies in Medium Deep Survey images using WFPC2

    NASA Technical Reports Server (NTRS)

    Griffiths, R. E.; Casertano, S.; Ratnatunga, K. U.; Neuschaefer, L. W.; Ellis, R. S.; Gilmore, G. F.; Glazebrook, K.; Santiago, B.; Huchra, J. P.; Windhorst, R. A.

    1994-01-01

    First results from Hubble Space Telescope (HST) Medium Deep Survey images taken with Wide Field/Planetary Camera-2 (WFPC2) demonstrate that galaxy classifications can be reliably performed to magnitudes I814 approximately less than 22.0 in the F814W band. Published spectroscopic surveys to this depth indicate a mean redshift of approximately 0.5. We have classified over 200 galaxies in nine WFPC2 fields according to a basic morphological scheme. The majority of these faint galaxies appear to be similar to regular Hubble-sequence examples observed at low redshift. To the precision of our classification scheme, the relative proportion of spheroidal and disk systems of normal appearance is as expected from nearby samples, indicating that the bulk of the local galaxy population was in place at half the Hubble time. However, the most intriguing result is the relatively high proportion (approximately 40%) of objects which are in some way anomalous, and which may be of relevance in understanding the origin of the familiar excess population of faint galaxies established by others. These diverse objects include apparently interacting pairs whose multiple structure is only revealed with HST's angular resolution, galaxies with superluminous star-forming regions, diffuse low surface brightness galaxies of various forms, and compact galaxies. These anomalous galaxies contribute a substantial fraction of the excess counts at our limiting magnitude, and may provide insights into the 'faint blue galaxy' problem.

  12. Classification of proteins: available structural space for molecular modeling.

    PubMed

    Andreeva, Antonina

    2012-01-01

    The wealth of available protein structural data provides unprecedented opportunity to study and better understand the underlying principles of protein folding and protein structure evolution. A key to achieving this lies in the ability to analyse these data and to organize them in a coherent classification scheme. Over the past years several protein classifications have been developed that aim to group proteins based on their structural relationships. Some of these classification schemes explore the concept of structural neighbourhood (structural continuum), whereas other utilize the notion of protein evolution and thus provide a discrete rather than continuum view of protein structure space. This chapter presents a strategy for classification of proteins with known three-dimensional structure. Steps in the classification process along with basic definitions are introduced. Examples illustrating some fundamental concepts of protein folding and evolution with a special focus on the exceptions to them are presented.

  13. Inter-sectoral costs and benefits of mental health prevention: towards a new classification scheme.

    PubMed

    Drost, Ruben M W A; Paulus, Aggie T G; Ruwaard, Dirk; Evers, Silvia M A A

    2013-12-01

    Many preventive interventions for mental disorders have costs and benefits that spill over to sectors outside the healthcare sector. Little is known about these "inter-sectoral costs and benefits" (ICBs) of prevention. However, to achieve an efficient allocation of scarce resources, insights on ICBs are indispensable. The main aim was to identify the ICBs related to the prevention of mental disorders and provide a sector-specific classification scheme for these ICBs. Using PubMed, a literature search was conducted for ICBs of mental disorders and related (psycho)social effects. A policy perspective was used to build the scheme's structure, which was adapted to the outcomes of the literature search. In order to validate the scheme's international applicability inside and outside the mental health domain, semi-structured interviews were conducted with (inter)national experts in the broad fields of health promotion and disease prevention. The searched-for items appeared in a total of 52 studies. The ICBs found were classified in one of four sectors: "Education", "Labor and Social Security", "Household and Leisure" or "Criminal Justice System". Psycho(social) effects were placed in a separate section under "Individual and Family". Based on interviews, the scheme remained unadjusted, apart from adding a population-based dimension. This is the first study which offers a sector-specific classification of ICBs. Given the explorative nature of the study, no guidelines on sector-specific classification of ICBs were available. Nevertheless, the classification scheme was acknowledged by an international audience and could therefore provide added value to researchers and policymakers in the field of mental health economics and prevention. The identification and classification of ICBs offers decision makers supporting information on how to optimally allocate scarce resources with respect to preventive interventions for mental disorders. 
By exploring a new area of research, which has remained largely unexplored until now, the current study has an added value as it may form the basis for the development of a tool which can be used to calculate the ICBs of specific mental health related preventive interventions.

  14. A physical classification scheme for blazars

    NASA Astrophysics Data System (ADS)

    Landt, Hermine; Padovani, Paolo; Perlman, Eric S.; Giommi, Paolo

    2004-06-01

    Blazars are currently separated into BL Lacertae objects (BL Lacs) and flat spectrum radio quasars based on the strength of their emission lines. This is performed rather arbitrarily by defining a diagonal line in the Ca H&K break value-equivalent width plane, following Marchã et al. We readdress this problem and put the classification scheme for blazars on firm physical grounds. We study ~100 blazars and radio galaxies from the Deep X-ray Radio Blazar Survey (DXRBS) and 2-Jy radio survey and find a significant bimodality for the narrow emission line [OIII]λ5007. This suggests the presence of two physically distinct classes of radio-loud active galactic nuclei (AGN). We show that all radio-loud AGN, blazars and radio galaxies, can be effectively separated into weak- and strong-lined sources using the [OIII]λ5007-[OII]λ3727 equivalent width plane. This plane allows one to disentangle orientation effects from intrinsic variations in radio-loud AGN. Based on DXRBS, the strongly beamed sources of the new class of weak-lined radio-loud AGN are made up of BL Lacs at the ~75 per cent level, whereas those of the strong-lined radio-loud AGN include mostly (~97 per cent) quasars.

  15. FR-type radio sources in COSMOS: relation of radio structure to size, accretion modes and large-scale environment

    NASA Astrophysics Data System (ADS)

    Vardoulaki, Eleni; Faustino Jimenez Andrade, Eric; Delvecchio, Ivan; Karim, Alexander; Smolčić, Vernesa; Magnelli, Benjamin; Bertoldi, Frank; Schinnerer, Eva; Sargent, Mark; Finoguenov, Alexis; VLA COSMOS Team

    2018-01-01

    The radio sources associated with active galactic nuclei (AGN) can exhibit a variety of radio structures, from simple to more complex, giving rise to a variety of classification schemes. The question which still remains open, given deeper surveys revealing new populations of radio sources, is whether this plethora of radio structures can be attributed to the physical properties of the host or to the environment. Here we present an analysis of the radio structure of radio-selected AGN from the VLA-COSMOS Large Project at 3 GHz (JVLA-COSMOS; Smolčić et al.) in relation to: 1) their linear projected size, 2) their Eddington ratio, and 3) the environment their hosts lie within. We classify these as FRI (jet-like) and FRII (lobe-like) based on the FR-type classification scheme, and compare them to a sample of jet-less radio AGN in JVLA-COSMOS. We measure their linear projected sizes using a semi-automatic machine learning technique. Their Eddington ratios are calculated from X-ray data available for COSMOS. As environmental probes we take the X-ray groups (hundreds of kpc) and the density fields (~Mpc scale) in COSMOS. We find that FRII radio sources are on average larger than FRIs, in agreement with the literature. But contrary to past studies, we find no dichotomy in FR objects in JVLA-COSMOS with respect to their Eddington ratios, as on average they exhibit similar values. Furthermore, our results show that the large-scale environment does not explain the observed dichotomy between lobe- and jet-like FR-type objects, as both types are found in similar environments, but it does affect the shape of the radio structure, introducing bends for objects closer to the centre of an X-ray group.

  16. Classifying GRB 170817A/GW170817 in a Fermi duration-hardness plane

    NASA Astrophysics Data System (ADS)

    Horváth, I.; Tóth, B. G.; Hakkila, J.; Tóth, L. V.; Balázs, L. G.; Rácz, I. I.; Pintér, S.; Bagoly, Z.

    2018-03-01

    GRB 170817A, associated with the LIGO-Virgo GW170817 neutron-star merger event, lacks the short duration and hard spectrum of a Short gamma-ray burst (GRB) expected from long-standing classification models. Correctly identifying the class to which this burst belongs requires comparison with other GRBs detected by the Fermi GBM. The aim of our analysis is to classify Fermi GRBs and to test whether or not GRB 170817A belongs—as suggested—to the Short GRB class. The Fermi GBM catalog provides a large database with many measured variables that can be used to explore gamma-ray burst classification. We use statistical techniques to look for clustering in a sample of 1298 gamma-ray bursts described by duration and spectral hardness. Classification of the detected bursts shows that GRB 170817A most likely belongs to the Intermediate, rather than the Short GRB class. We discuss this result in light of theoretical neutron-star merger models and existing GRB classification schemes. It appears that GRB classification schemes may not yet be linked to appropriate theoretical models, and that theoretical models may not yet adequately account for known GRB class properties. We conclude that GRB 170817A may not fit into a simple phenomenological classification scheme.
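
    The clustering step can be sketched with a simple k-means on synthetic points in the (log duration, log hardness) plane. The actual study applies model-based clustering to 1298 Fermi bursts, so everything below (the data, the number of clusters, the initialization) is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
# Two synthetic burst populations in (log duration, log hardness):
short = rng.normal([-0.5, 0.3], 0.2, size=(50, 2))   # short/hard-like
long_ = rng.normal([1.5, -0.1], 0.2, size=(50, 2))   # long/soft-like
X = np.vstack([short, long_])

def kmeans(X, init_idx, iters=20):
    """Plain k-means with deterministic initial centers (indices into X)."""
    centers = X[list(init_idx)]
    for _ in range(iters):
        labels = ((X[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
        centers = np.array([X[labels == j].mean(0)
                            for j in range(len(centers))])
    return labels, centers

labels, centers = kmeans(X, (0, 50))
```

    A three-component fit, with the middle component capturing the Intermediate class, is the kind of model under which a burst like GRB 170817A can fall outside the Short group.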

  17. Diffuse Lung Disease in Biopsied Children 2 to 18 Years of Age. Application of the chILD Classification Scheme.

    PubMed

    Fan, Leland L; Dishop, Megan K; Galambos, Csaba; Askin, Frederic B; White, Frances V; Langston, Claire; Liptzin, Deborah R; Kroehl, Miranda E; Deutsch, Gail H; Young, Lisa R; Kurland, Geoffrey; Hagood, James; Dell, Sharon; Trapnell, Bruce C; Deterding, Robin R

    2015-10-01

    Children's Interstitial and Diffuse Lung Disease (chILD) is a heterogeneous group of disorders that is challenging to categorize. In a previous study, a classification scheme was successfully applied to children 0 to 2 years of age who underwent lung biopsies for chILD. This classification scheme has not been evaluated in children 2 to 18 years of age. This multicenter interdisciplinary study sought to describe the spectrum of biopsy-proven chILD in North America and to apply a previously reported classification scheme to children 2 to 18 years of age. Mortality and risk factors for mortality were also assessed. Patients 2 to 18 years of age who underwent lung biopsies for diffuse lung disease at 12 North American institutions were included. Demographic and clinical data were collected and described. The lung biopsies were reviewed by pediatric lung pathologists with expertise in diffuse lung disease and were classified by the chILD classification scheme. Logistic regression was used to determine risk factors for mortality. A total of 191 cases were included in the final analysis. The number of biopsies varied by center (5-49 biopsies; mean, 15.8) and by age (2-18 yr; mean, 10.6 yr). The most common classification category in this cohort was Disorders of the Immunocompromised Host (40.8%), and the least common was Disorders of Infancy (4.7%). Immunocompromised patients suffered the highest mortality (52.8%). Additional associations with mortality included mechanical ventilation, worse clinical status at time of biopsy, tachypnea, hemoptysis, and crackles. Pulmonary hypertension was found to be a risk factor for mortality, but only in immunocompetent patients. In patients 2 to 18 years of age who underwent lung biopsies for diffuse lung disease, there were far fewer diagnoses prevalent in infancy and more overlap with adult diagnoses. 
Immunocompromised patients with diffuse lung disease who underwent lung biopsies had less than 50% survival at time of last follow-up.

  18. The Semantic Management of Environmental Resources within the Interoperable Context of the EuroGEOSS: Alignment of GEMET and the GEOSS SBAs

    NASA Astrophysics Data System (ADS)

    Cialone, Claudia; Stock, Kristin

    2010-05-01

    EuroGEOSS is a European Commission-funded project. It aims to improve scientific understanding of the complex mechanisms that drive changes affecting our planet, and to identify and establish interoperable arrangements between environmental information systems. These systems would be sustained and operated by organizations with a clear mandate and resources, and rendered available following the specifications of existing frameworks such as GEOSS (the Global Earth Observation System of Systems) and INSPIRE (the Infrastructure for Spatial Information in the European Community). The EuroGEOSS project's infrastructure focuses on three thematic areas: forestry, drought and biodiversity. One of the important activities in the project is the retrieval, parsing and harmonization of the large amount of heterogeneous environmental data available at local, regional and global levels across these strategic areas. The challenge is to render it semantically and technically interoperable in a simple way. An initial step in achieving this semantic and technical interoperability involves the selection of appropriate classification schemes (for example, thesauri, ontologies and controlled vocabularies) to describe the resources in the EuroGEOSS framework. These classifications become a crucial part of the interoperable framework scaffolding because they allow data providers to describe their resources and thus support resource discovery, execution and orchestration of varying levels of complexity. However, at present, given the diverse range of environmental thesauri, controlled vocabularies and ontologies and the large number of resources provided by project participants, the selection of appropriate classification schemes involves a number of considerations. First of all, there is the semantic difficulty of selecting classification schemes that contain concepts that are relevant to each thematic area. 
    Secondly, EuroGEOSS is intended to accommodate a number of existing environmental projects (for example, GEOSS and INSPIRE). This requirement imposes constraints on the selection. Thirdly, the selected classification scheme or group of schemes (if more than one) must be capable of alignment (establishing different kinds of mappings between concepts, hence preserving the original knowledge schemes intact) or merging (the creation of another unique ontology from the original ontological sources) (Pérez-Gómez et al., 2004). Last but not least, there is the issue of including multi-lingual schemes that are based on free, open standards (non-proprietary). Using these selection criteria, we aim to support open and convenient data discovery and exchange for users who speak different languages (particularly the European ones, given the broad scope of EuroGEOSS). In order to support the project, we have developed a solution that employs two classification schemes: the Societal Benefit Areas (SBAs), the upper-level environmental categorization developed for the GEOSS project, and the GEneral Multilingual Environmental Thesaurus (GEMET), a general environmental thesaurus whose conceptual structure has already been integrated with the spatial data themes proposed by the INSPIRE project. The latter seems to provide the spatial data keywords relevant to the INSPIRE Directive (JRC, 2008). In this way, we provide users with a basic set of concepts to support resource description and discovery in the thematic areas while supporting the requirements of INSPIRE and GEOSS. Furthermore, the use of only two classification schemes, together with the fact that the SBAs are very general categories while GEMET includes much more detailed, yet still top-level, concepts, makes alignment an achievable task. Alignment was selected over merging because it leaves the existing classification schemes intact and requires only a simple activity of defining mappings from GEMET to the SBAs. 
    In order to accomplish this task we are developing a simple, automated, open-source application to assist thematic experts in defining the mappings between concepts in the two classification schemes. The application will then generate SKOS mappings (exactMatch, closeMatch, broadMatch, narrowMatch, relatedMatch) based on the thematic experts' selections, linking concepts in GEMET to the SBAs (including both the general Societal Benefit Areas and their subcategories). Once these mappings are defined and the SKOS files generated, resource providers will be able to select concepts from either GEMET or the SBAs (or a mixture) to describe their resources, and discovery approaches will support selection of concepts from either classification scheme, also returning results classified using the other scheme. While the focus of our work has been on the SBAs and GEMET, we also plan to provide a method for resource providers to further extend the semantic infrastructure by defining alignments to new classification schemes if these are required to support particular specialized thematic areas that are not covered by GEMET. In this way, the approach is flexible and suited to the general scope of EuroGEOSS, allowing specialists to increase at will the level of semantic quality and specificity of data in the initial infrastructural skeleton of the project. References: Joint Research Centre (JRC), 2008. INSPIRE Metadata Editor User Guide. Pérez-Gómez, A., Fernandez-Lopez, M., Corcho, O., 2004. Ontological Engineering: With Examples from the Areas of Knowledge Management, e-Commerce and the Semantic Web. Springer: London.
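
    The mapping generation described above can be sketched as emitting N-Triples with SKOS mapping predicates. The SKOS namespace is the standard W3C one, but the concept URIs below are invented for illustration and are not the real GEMET or SBA identifiers:

```python
# Sketch: emit one SKOS mapping triple per expert-selected concept pair.
SKOS = "http://www.w3.org/2004/02/skos/core#"
MATCH_TYPES = ("exactMatch", "closeMatch", "broadMatch",
               "narrowMatch", "relatedMatch")

def skos_triple(source_uri, relation, target_uri):
    """Render an N-Triples line for one concept-to-concept mapping."""
    if relation not in MATCH_TYPES:
        raise ValueError(f"unknown SKOS mapping relation: {relation}")
    return f"<{source_uri}> <{SKOS}{relation}> <{target_uri}> ."

# Hypothetical GEMET concept mapped to a hypothetical SBA category:
mapping = skos_triple(
    "http://example.org/gemet/concept/biodiversity",
    "broadMatch",
    "http://example.org/geoss/sba/Biodiversity",
)
```

    A discovery service can then follow these triples in either direction, which is what lets a query phrased in SBA terms also return resources described with GEMET concepts.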

  19. Sensitivity analysis of the GEMS soil organic carbon model to land cover land use classification uncertainties under different climate scenarios in Senegal

    USGS Publications Warehouse

    Dieye, A.M.; Roy, David P.; Hanan, N.P.; Liu, S.; Hansen, M.; Toure, A.

    2012-01-01

    Spatially explicit land cover land use (LCLU) change information is needed to drive biogeochemical models that simulate soil organic carbon (SOC) dynamics. Such information is increasingly being mapped using remotely sensed satellite data, with classification schemes and uncertainties constrained by the sensing system, classification algorithms and land cover schemes. In this study, automated LCLU classification of multi-temporal Landsat satellite data was used to assess the sensitivity of SOC modeled by the Global Ensemble Biogeochemical Modeling System (GEMS). The GEMS was run for an area of 1560 km2 in Senegal under three climate change scenarios, with LCLU maps generated using different Landsat classification approaches. This research provides a method to estimate the variability of SOC, specifically the SOC uncertainty due to satellite classification errors, which we show is dependent not only on the LCLU classification errors but also on where the LCLU classes occur relative to the other GEMS model inputs.

  20. Approach for a Clinically Useful Comprehensive Classification of Vascular and Neural Aspects of Diabetic Retinal Disease

    PubMed Central

    Abramoff, Michael D.; Fort, Patrice E.; Han, Ian C.; Jayasundera, K. Thiran; Sohn, Elliott H.; Gardner, Thomas W.

    2018-01-01

    The Early Treatment Diabetic Retinopathy Study (ETDRS) and other standardized classification schemes have laid a foundation for tremendous advances in the understanding and management of diabetic retinopathy (DR). However, technological advances in optics and image analysis, especially optical coherence tomography (OCT), OCT angiography (OCTa), and ultra-widefield imaging, as well as new discoveries in diabetic retinal neuropathy (DRN), are exposing the limitations of ETDRS and other classification systems to completely characterize retinal changes in diabetes, which we term diabetic retinal disease (DRD). While it may be most straightforward to add axes to existing classification schemes, as diabetic macular edema (DME) was added as an axis to earlier DR classifications, doing so may make these classifications increasingly complicated and thus clinically intractable. Therefore, we propose future research efforts to develop a new, comprehensive, and clinically useful classification system that will identify multimodal biomarkers to reflect the complex pathophysiology of DRD and accelerate the development of therapies to prevent vision-threatening DRD. PMID:29372250

  2. Local Laplacian Coding From Theoretical Analysis of Local Coding Schemes for Locally Linear Classification.

    PubMed

    Pang, Junbiao; Qin, Lei; Zhang, Chunjie; Zhang, Weigang; Huang, Qingming; Yin, Baocai

    2015-12-01

    Local coordinate coding (LCC) is a framework for approximating a Lipschitz-smooth function by combining linear functions into a nonlinear one. For locally linear classification, LCC requires a coding scheme that largely determines the nonlinear approximation ability, posing two main challenges: 1) locality, so that faraway anchors have smaller influence on the current data point, and 2) flexibility, balancing the reconstruction of the current data point against that locality. In this paper, we address the problem through a theoretical analysis of the simplest local coding schemes, i.e., local Gaussian coding and local Student coding, and propose local Laplacian coding (LPC) to achieve both locality and flexibility. We apply LPC within locally linear classifiers to solve diverse classification tasks. Performance comparable to or exceeding that of state-of-the-art methods demonstrates the effectiveness of the proposed method.
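The abstract contrasts local coding schemes whose anchor weights decay with distance. A minimal sketch of the simplest such scheme, local Gaussian coding, assuming scalar data, illustrative anchor values, and a normalized Gaussian kernel (none of which are specified in the record):

```python
import math

def local_gaussian_coding(x, anchors, sigma=1.0):
    """Code a 1-D point x against a set of anchor points.

    Each anchor receives a weight that decays with its squared
    distance from x (a Gaussian kernel), so faraway anchors have
    smaller influence; weights are normalized to sum to 1.
    """
    raw = [math.exp(-((x - a) ** 2) / (2 * sigma ** 2)) for a in anchors]
    total = sum(raw)
    return [w / total for w in raw]

codes = local_gaussian_coding(0.1, anchors=[0.0, 1.0, 5.0])
# the nearest anchor (0.0) dominates; the faraway anchor (5.0) gets ~0 weight
```

The locality/flexibility trade-off the paper analyzes amounts to the shape of this decay; a heavier-tailed kernel (Student or Laplacian) shrinks faraway weights less aggressively.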

  3. Toward functional classification of neuronal types.

    PubMed

    Sharpee, Tatyana O

    2014-09-17

    How many types of neurons are there in the brain? This basic neuroscience question remains unsettled despite many decades of research. Classification schemes have been proposed based on anatomical, electrophysiological, or molecular properties. However, different schemes do not always agree with each other. This raises the question of whether one can classify neurons based on their function directly. For example, among sensory neurons, can a classification scheme be devised that is based on their role in encoding sensory stimuli? Here, theoretical arguments are outlined for how this can be achieved using information theory by looking at optimal numbers of cell types and paying attention to two key properties: correlations between inputs and noise in neural responses. This theoretical framework could help to map the hierarchical tree relating different neuronal classes within and across species. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. Mapping Successional Stages in a Wet Tropical Forest Using Landsat ETM+ and Forest Inventory Data

    NASA Technical Reports Server (NTRS)

    Goncalves, Fabio G.; Yatskov, Mikhail; dos Santos, Joao Roberto; Treuhaft, Robert N.; Law, Beverly E.

    2010-01-01

    In this study, we test whether an existing classification technique based on the integration of Landsat ETM+ and forest inventory data enables detailed characterization of successional stages in a wet tropical forest site. The specific objectives were: (1) to map forest age classes across the La Selva Biological Station in Costa Rica; and (2) to quantify uncertainties in the proposed approach in relation to field data and existing vegetation maps. Although significant relationships between vegetation height entropy (a surrogate for forest age) and ETM+ data were detected, the classification scheme tested in this study was not suitable for characterizing spatial variation in age at La Selva, as evidenced by the error matrix and the low Kappa coefficient (12.9%). Factors affecting the performance of the classification at this particular study site include the smooth transition in vegetation structure between intermediate and advanced successional stages, and the low sensitivity of NDVI to variations in vertical structure at high biomass levels.

  5. Cross-label Suppression: a Discriminative and Fast Dictionary Learning with Group Regularization.

    PubMed

    Wang, Xiudong; Gu, Yuantao

    2017-05-10

    This paper addresses image classification through learning a compact and discriminative dictionary efficiently. Given a structured dictionary with each atom (a column of the dictionary matrix) associated with a label, we propose a cross-label suppression constraint to enlarge the difference among representations for different classes. Meanwhile, we introduce group regularization to enforce representations to preserve the label properties of the original samples, meaning that representations for the same class are encouraged to be similar. Owing to the cross-label suppression, we do not resort to the frequently used ℓ0-norm or ℓ1-norm for coding, and obtain computational efficiency without losing discriminative power for categorization. Moreover, two simple classification schemes are also developed to take full advantage of the learnt dictionary. Extensive experiments on six data sets covering face recognition, object categorization, scene classification, texture recognition and sport action categorization are conducted, and the results show that the proposed approach can outperform many recently proposed dictionary learning algorithms in both recognition accuracy and computational efficiency.

  6. Applying graphics user interface to group technology classification and coding at the Boeing Aerospace Company

    NASA Astrophysics Data System (ADS)

    Ness, P. H.; Jacobson, H.

    1984-10-01

    The thrust of 'group technology' is the exploitation of similarities in component design and manufacturing process plans to achieve assembly-line-flow cost efficiencies for small batch production. The systematic method devised for identifying similarities in component geometry and processing steps is a coding and classification scheme implemented on interactive CAD/CAM systems. This coding and classification scheme exploits significant increases in computer processing power, allowing rapid searches and retrievals on the basis of a 30-digit code together with user-friendly computer graphics.

  7. Computer-aided classification of breast masses using contrast-enhanced digital mammograms

    NASA Astrophysics Data System (ADS)

    Danala, Gopichandh; Aghaei, Faranak; Heidari, Morteza; Wu, Teresa; Patel, Bhavika; Zheng, Bin

    2018-02-01

    By taking advantage of both mammography and breast MRI, contrast-enhanced digital mammography (CEDM) has emerged as a promising new imaging modality to improve the efficacy of breast cancer screening and diagnosis. The primary objective of this study is to develop and evaluate a new computer-aided detection and diagnosis (CAD) scheme for CEDM images to classify between malignant and benign breast masses. A CEDM dataset consisting of 111 patients (33 benign and 78 malignant) was retrospectively assembled. Each case includes two types of images, namely low-energy (LE) and dual-energy subtracted (DES) images. First, the CAD scheme applied a hybrid segmentation method to automatically segment masses depicted on LE and DES images separately. Optimal segmentation results from DES images were also mapped to LE images and vice versa. Next, a set of 109 quantitative image features related to mass shape and density heterogeneity was initially computed. Last, four multilayer perceptron-based machine learning classifiers, integrated with a correlation-based feature subset evaluator and a leave-one-case-out cross-validation method, were built to classify mass regions depicted on LE and DES images, respectively. Initially, when the CAD scheme was applied to the original segmentations of DES and LE images, the areas under the ROC curves were 0.7585±0.0526 and 0.7534±0.0470, respectively. After optimal segmentation mapping from DES to LE images, the AUC value of the CAD scheme significantly increased to 0.8477±0.0376 (p<0.01). Since DES images eliminate the overlapping effect of dense breast tissue on lesions, segmentation accuracy was significantly improved compared to regular mammograms, and the study demonstrated that computer-aided classification of breast masses using CEDM images yielded higher performance.

  8. Mental Task Classification Scheme Utilizing Correlation Coefficient Extracted from Interchannel Intrinsic Mode Function.

    PubMed

    Rahman, Md Mostafizur; Fattah, Shaikh Anowarul

    2017-01-01

    In view of the recent increase in brain-computer interface (BCI) based applications, efficient classification of various mental tasks has become increasingly important. Effective classification requires an efficient feature extraction scheme, for which the proposed method utilizes the interchannel relationships among electroencephalogram (EEG) data. It is expected that the correlation obtained from different combinations of channels will differ across mental tasks, which can be exploited to extract distinctive features. The empirical mode decomposition (EMD) technique is employed on a test EEG signal obtained from a channel, which provides a number of intrinsic mode functions (IMFs), and correlation coefficients are extracted from interchannel IMF data. Simultaneously, different statistical features are also obtained from each IMF. Finally, the feature matrix is formed from the interchannel correlation features and the intrachannel statistical features of the selected IMFs of the EEG signal. Different kernels of the support vector machine (SVM) classifier are used to carry out the classification task. An EEG dataset containing ten different combinations of five different mental tasks is utilized to demonstrate the classification performance, and a very high level of accuracy is achieved by the proposed scheme compared to existing methods.
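The feature construction described above (interchannel IMF correlations plus intrachannel statistics) can be sketched as follows, assuming the IMFs have already been produced by an EMD routine; the array shapes and the choice of variance as the statistical feature are illustrative assumptions, not the paper's exact feature set:

```python
import numpy as np

def interchannel_imf_features(imfs_ch1, imfs_ch2):
    """Build a feature vector from two channels' IMFs.

    imfs_ch1, imfs_ch2: arrays of shape (n_imfs, n_samples), e.g. as
    returned by an EMD routine (the decomposition itself is assumed
    to be done elsewhere).  Features are the Pearson correlation
    between corresponding IMFs of the two channels (interchannel),
    plus the variance of each IMF (intrachannel statistic).
    """
    feats = []
    for a, b in zip(imfs_ch1, imfs_ch2):
        feats.append(np.corrcoef(a, b)[0, 1])   # interchannel correlation
    for imf in np.vstack([imfs_ch1, imfs_ch2]):
        feats.append(np.var(imf))               # intrachannel statistic
    return np.array(feats)
```

The resulting vectors, stacked across trials, would form the feature matrix fed to the SVM classifier.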

  9. Comparative Study of SVM Methods Combined with Voxel Selection for Object Category Classification on fMRI Data

    PubMed Central

    Song, Sutao; Zhan, Zhichao; Long, Zhiying; Zhang, Jiacai; Yao, Li

    2011-01-01

    Background Support vector machine (SVM) has been widely used as an accurate and reliable method to decipher brain patterns from functional MRI (fMRI) data. Previous studies have not found a clear benefit for non-linear (polynomial kernel) SVM over linear SVM. Here, a more effective non-linear SVM using a radial basis function (RBF) kernel is compared with linear SVM. Unlike traditional studies, which focused either on the evaluation of different types of SVM or on voxel selection methods alone, we aimed to investigate the overall performance of linear and RBF SVM for fMRI classification together with voxel selection schemes, in terms of classification accuracy and computation time. Methodology/Principal Findings Six different voxel selection methods were employed to decide which voxels of fMRI data would be included in SVM classifiers with linear and RBF kernels in classifying 4-category objects. The overall performances of the voxel selection and classification methods were then compared. Results showed that: (1) voxel selection had an important impact on the classification accuracy of the classifiers: in a relatively low-dimensional feature space, RBF SVM outperformed linear SVM significantly, whereas in a relatively high-dimensional space, linear SVM performed better than its counterpart; (2) considering classification accuracy and computation time holistically, linear SVM with relatively more voxels as features, and RBF SVM with a small set of voxels (after PCA), achieved better accuracy in less time. Conclusions/Significance The present work provides the first empirical comparison of linear and RBF SVM for classification of fMRI data combined with voxel selection methods. Based on the findings, if only classification accuracy is of concern, RBF SVM with an appropriately small set of voxels and linear SVM with relatively more voxels are the two suggested solutions; if computation time matters more, RBF SVM with a relatively small set of voxels, keeping part of the principal components as features, is the better choice. PMID:21359184
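As a rough illustration of the filter-style voxel selection this kind of study builds on, here is a univariate Fisher-score ranking in NumPy; the score formula and top-k cutoff are generic assumptions, not one of the six specific methods the paper evaluates:

```python
import numpy as np

def select_voxels(X, y, k):
    """Rank voxels by a univariate Fisher score and keep the top k.

    X: (n_trials, n_voxels) fMRI activation matrix; y: binary labels.
    Fisher score = (difference of class means)^2 / (sum of class
    variances): voxels whose responses separate the two conditions
    well score high and are retained as classifier features.
    """
    X0, X1 = X[y == 0], X[y == 1]
    num = (X0.mean(axis=0) - X1.mean(axis=0)) ** 2
    den = X0.var(axis=0) + X1.var(axis=0) + 1e-12  # avoid divide-by-zero
    score = num / den
    keep = np.argsort(score)[::-1][:k]             # top-k voxel indices
    return keep, X[:, keep]
```

The reduced matrix `X[:, keep]` would then be fed to a linear or RBF SVM, which is where the dimensionality-dependent trade-off reported above comes in.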

  11. Local classification: Locally weighted-partial least squares-discriminant analysis (LW-PLS-DA).

    PubMed

    Bevilacqua, Marta; Marini, Federico

    2014-08-01

    The possibility of devising a simple, flexible and accurate non-linear classification method, by extending the locally weighted partial least squares (LW-PLS) approach to cases where the algorithm is used in a discriminant way (partial least squares discriminant analysis, PLS-DA), is presented. In particular, to assess which category an unknown sample belongs to, the proposed algorithm operates by identifying which training objects are most similar to the one to be predicted and building a PLS-DA model using only those calibration samples. Moreover, the influence of the selected training samples on the local model can be further modulated by adopting a non-uniform, distance-based weighting scheme which allows the farthest calibration objects to have less impact than the closest ones. The performance of the proposed locally weighted-partial least squares-discriminant analysis (LW-PLS-DA) algorithm has been tested on three simulated data sets characterized by varying degrees of non-linearity: in all cases, a classification accuracy higher than 99% on external validation samples was achieved. Moreover, when applied to a real data set (classification of rice varieties) characterized by a high degree of non-linearity, the proposed method provided an average correct classification rate of about 93% on the test set. These preliminary results indicate that the performance of the proposed LW-PLS-DA approach is comparable to, and in some cases better than, that obtained by other non-linear methods (k nearest neighbors, kernel-PLS-DA and, in the case of rice, counterpropagation neural networks). Copyright © 2014 Elsevier B.V. All rights reserved.
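The local-model construction described above (select the most similar calibration objects, then down-weight the farthest ones) can be sketched as follows; for brevity a distance-weighted class vote stands in for fitting the local PLS-DA model, so this illustrates only the selection-and-weighting step, on hypothetical data:

```python
import numpy as np

def locally_weighted_classify(x, X_train, y_train, n_local=5):
    """Locally weighted classification in the spirit of LW-PLS-DA.

    The n_local calibration objects most similar to x are selected,
    and each gets a distance-based weight so the farthest selected
    samples have less impact than the closest ones.  A weighted
    class vote stands in for fitting a local PLS-DA model on the
    selected samples (assumption: the selection/weighting step, not
    PLS itself, is what is illustrated here).
    """
    d = np.linalg.norm(X_train - x, axis=1)
    local = np.argsort(d)[:n_local]
    w = 1.0 / (1.0 + d[local])          # non-uniform, distance-based weights
    votes = {}
    for i, wi in zip(local, w):
        votes[y_train[i]] = votes.get(y_train[i], 0.0) + wi
    return max(votes, key=votes.get)
```

Because a fresh local model is built per query sample, the method adapts to non-linear class boundaries at the cost of repeated model fitting at prediction time.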

  12. Land cover classification of VHR airborne images for citrus grove identification

    NASA Astrophysics Data System (ADS)

    Amorós López, J.; Izquierdo Verdiguier, E.; Gómez Chova, L.; Muñoz Marí, J.; Rodríguez Barreiro, J. Z.; Camps Valls, G.; Calpe Maravilla, J.

    Managing land resources using remote sensing techniques is becoming a common practice. However, data analysis procedures should satisfy the high accuracy levels demanded by users (public or private companies and governments) in order to be extensively used. This paper presents a multi-stage classification scheme to update the citrus Geographical Information System (GIS) of the Comunidad Valenciana region (Spain). Spain is the leading citrus fruit producer in Europe and the fourth largest in the world. In particular, citrus fruits represent 67% of the agricultural production in this region, with a total production of 4.24 million tons (campaign 2006-2007). The citrus GIS inventory, created in 2001, needs to be regularly updated in order to monitor changes quickly enough, and allow appropriate policy making and citrus production forecasting. Automatic methods are proposed in this work to facilitate this update, whose processing scheme is summarized as follows. First, an object-oriented feature extraction process is carried out for each cadastral parcel from very high spatial resolution aerial images (0.5 m). Next, several automatic classifiers (decision trees, artificial neural networks, and support vector machines) are trained and combined to improve the final classification accuracy. Finally, the citrus GIS is automatically updated if a high enough level of confidence, based on the agreement between classifiers, is achieved. This is the case for 85% of the parcels, and accuracy results exceed 94%. The remaining parcels are classified by expert photo-interpreters in order to guarantee the high accuracy demanded by policy makers.

  13. Chondrule formation, metamorphism, brecciation, an important new primary chondrule group, and the classification of chondrules

    NASA Technical Reports Server (NTRS)

    Sears, Derek W. G.; Shaoxiong, Huang; Benoit, Paul H.

    1995-01-01

    The recently proposed compositional classification scheme for meteoritic chondrules divides the chondrules into groups depending on the composition of their two major phases, olivine (or pyroxene) and the mesostasis, both of which are genetically important. The scheme is here applied to discussions of three topics: the petrographic classification of Roosevelt County 075 (the least-metamorphosed H chondrite known), brecciation (an extremely important and ubiquitous process probably experienced by greater than 40% of all unequilibrated ordinary chondrites), and the group A5 chondrules in the least metamorphosed ordinary chondrites, which have many similarities to chondrules in the highly metamorphosed 'equilibrated' chondrites. Since composition provides insights into both the primary formation properties of the chondrules and the effects of metamorphism on the entire assemblage, it is possible to determine the petrographic type of RC075 as 3.1 with unique certainty. Similarly, the new scheme can be applied to individual chondrules without knowledge of the petrographic type of the host chondrite, which makes it especially suitable for studying breccias. Finally, the new scheme has revealed the existence of chondrules not identified by previous techniques and which appear to be extremely important. Like group A1 and A2 chondrules (but unlike group B1 chondrules), the primitive group A5 chondrules did not supercool during formation, but unlike group A1 and A2 chondrules (and like group B1 chondrules) they did not suffer volatile loss and reduction during formation. It is concluded that the compositional classification scheme provides important new insights into the formation and history of chondrules and chondrites which would be overlooked by previous schemes.

  14. Carnegie's New Community Engagement Classification: Affirming Higher Education's Role in Community

    ERIC Educational Resources Information Center

    Driscoll, Amy

    2009-01-01

    In 2005, the Carnegie Foundation for the Advancement of Teaching (CFAT) stirred the higher education world with the announcement of a new classification for institutions that engage with community. The classification, community engagement, is the first in a set of planned classification schemes resulting from the foundation's reexamination of the…

  15. Computation offloading for real-time health-monitoring devices.

    PubMed

    Kalantarian, Haik; Sideris, Costas; Tuan Le; Hosseini, Anahita; Sarrafzadeh, Majid

    2016-08-01

    Among the major challenges in the development of real-time wearable health monitoring systems is optimizing battery life. One of the major techniques by which this objective can be achieved is computation offloading, in which portions of the computation are partitioned between the device and other resources such as a server or cloud. In this paper, we describe a novel dynamic computation offloading scheme for real-time wearable health monitoring devices that adjusts the partitioning of data between the wearable device and the mobile application as a function of desired classification accuracy.

  16. Low Temperature Performance of High-Speed Neural Network Circuits

    NASA Technical Reports Server (NTRS)

    Duong, T.; Tran, M.; Daud, T.; Thakoor, A.

    1995-01-01

    Artificial neural networks, derived from their biological counterparts, offer a new and enabling computing paradigm especially suitable for such tasks as image and signal processing with feature classification/object recognition, global optimization, and adaptive control. When implemented in fully parallel electronic hardware, they offer an orders-of-magnitude speed advantage. The basic building blocks of the new architecture are the processing elements, called neurons, implemented as nonlinear operational amplifiers with a sigmoidal transfer function, interconnected through weighted connections, called synapses, implemented using circuitry for weight storage and multiply functions in an analog, digital, or hybrid scheme.

  17. Classification for Estuarine Ecosystems: A Review and Comparison of Selected Classification Schemes

    EPA Science Inventory

    Estuarine scientists have devoted considerable effort to classifying coastal, estuarine and marine environments and their watersheds, for a variety of purposes. These classifications group systems with similarities – most often in physical and hydrodynamic properties – in order ...

  18. Comprehensive 4-stage categorization of bicuspid aortic valve leaflet morphology by cardiac MRI in 386 patients.

    PubMed

    Murphy, I G; Collins, J; Powell, A; Markl, M; McCarthy, P; Malaisrie, S C; Carr, J C; Barker, A J

    2017-08-01

    Bicuspid aortic valve (BAV) disease is heterogeneous and related to valve dysfunction and aortopathy. Appropriate follow-up and surveillance of patients with BAV may depend on correct phenotypic categorization. Multiple classification schemes exist; however, a need remains to comprehensively capture commissure fusion, leaflet asymmetry, and valve orifice orientation. Our aim was to develop a BAV classification scheme for use at MRI, to ascertain the frequency of different phenotypes and the consistency of BAV classification. The BAV classification scheme builds on the Sievers surgical BAV classification, adding valve orifice orientation, partial leaflet fusion and leaflet asymmetry. A single observer successfully applied this classification to 386 of 398 cardiac MRI studies. Repeatability of categorization was ascertained with intraobserver and interobserver kappa scores. Sensitivity and specificity of MRI findings were determined from operative reports, where available. Fusion of the right and left leaflets accounted for over half of all cases. Partial leaflet fusion was seen in 46% of patients. Good interobserver agreement was seen for orientation of the valve opening (κ = 0.90), type (κ = 0.72) and presence of partial fusion (κ = 0.83, p < 0.0001). Retrospective review of operative notes showed sensitivity and specificity of 90% and 93% for orientation, and 73% and 87% for Sievers type. The proposed BAV classification schema was assessed by MRI for its reliability in classifying valve morphology, in addition to illustrating the wide heterogeneity of leaflet size, orifice orientation, and commissural fusion. The classification may be helpful in further understanding the relationship between valve morphology, flow derangement and aortopathy.

  19. A new classification scheme of plastic wastes based upon recycling labels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Özkan, Kemal, E-mail: kozkan@ogu.edu.tr; Ergin, Semih, E-mail: sergin@ogu.edu.tr; Işık, Şahin, E-mail: sahini@ogu.edu.tr

    Highlights: • PET, HPDE or PP types of plastics are considered. • An automated classification of plastic bottles based on feature extraction and classification methods is performed. • The decision mechanism consists of PCA, Kernel PCA, FLDA, SVD and Laplacian Eigenmaps methods. • SVM is selected to achieve the classification task and a majority voting technique is used. - Abstract: Since recycling of materials is widely assumed to be environmentally and economically beneficial, reliable sorting and processing of waste packaging materials such as plastics is very important for recycling with high efficiency. An automated system that can quickly categorize these materials is certainly needed to obtain maximum classification while maintaining high throughput. In this paper, first of all, photographs of the plastic bottles were taken and several preprocessing steps were carried out. The first preprocessing step is to extract the plastic area of a bottle from the background. Then, morphological image operations are implemented. These operations are edge detection, noise removal, hole removal, image enhancement, and image segmentation. These morphological operations can generally be defined in terms of combinations of erosion and dilation. The effects of bottle color as well as label are eliminated using these operations. Secondly, the pixel-wise intensity values of the plastic bottle images have been used together with the most popular subspace and statistical feature extraction methods to construct the feature vectors in this study. Only three types of plastics are considered because they occur far more commonly than the other plastic types in the world.
    The decision mechanism consists of five different feature extraction methods, namely Principal Component Analysis (PCA), Kernel PCA (KPCA), Fisher’s Linear Discriminant Analysis (FLDA), Singular Value Decomposition (SVD) and Laplacian Eigenmaps (LEMAP), and uses a simple experimental setup with a camera and homogeneous backlighting. Because it yields a global solution to the classification problem, the Support Vector Machine (SVM) is selected to achieve the classification task, and a majority voting technique is used as the decision mechanism. This technique weights each classification result equally and assigns the given plastic object to the class on which most classification results agree. The proposed classification scheme provides a high accuracy rate, and it is also able to run in real-time applications. It can automatically classify plastic bottle types with approximately 90% recognition accuracy. Besides this, the proposed methodology yields approximately 96% classification rate for the separation of PET from non-PET plastic types. It also gives 92% accuracy for the categorization of non-PET plastic types into HPDE or PP.
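The equally weighted majority voting step described above, combining one SVM result per feature extraction method, can be sketched in a few lines; the label strings are illustrative:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-feature-space classifier outputs by majority vote.

    predictions: list of class labels, one from each SVM trained on a
    different feature extraction (e.g. PCA, KPCA, FLDA, SVD, LEMAP).
    Each result is weighted equally; the object is assigned to the
    class most of the classifiers agree on.
    """
    return Counter(predictions).most_common(1)[0][0]

majority_vote(["PET", "PET", "PP", "PET", "HPDE"])  # -> "PET"
```

With five voters, ties are rare but possible for three or more classes; `Counter.most_common` then breaks the tie by insertion order, a detail a production system would want to handle explicitly.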

  20. Inter-rater reliability of a modified version of Delitto et al.’s classification-based system for low back pain: a pilot study

    PubMed Central

    Apeldoorn, Adri T.; van Helvoirt, Hans; Ostelo, Raymond W.; Meihuizen, Hanneke; Kamper, Steven J.; van Tulder, Maurits W.; de Vet, Henrica C. W.

    2016-01-01

    Study design Observational inter-rater reliability study. Objectives To examine: (1) the inter-rater reliability of a modified version of Delitto et al.’s classification-based algorithm for patients with low back pain; (2) the influence of different levels of familiarity with the system; and (3) the inter-rater reliability of algorithm decisions in patients who clearly fit into a subgroup (clear classifications) and those who do not (unclear classifications). Methods Patients were examined twice on the same day by two of three participating physical therapists with different levels of familiarity with the system. Patients were classified into one of four classification groups. Raters were blind to each other’s classification decision. In order to quantify the inter-rater reliability, percentages of agreement and Cohen’s Kappa were calculated. Results A total of 36 patients were included (clear classification n = 23; unclear classification n = 13). The overall rate of agreement was 53% and the Kappa value was 0.34 [95% confidence interval (CI): 0.11–0.57], which indicated only fair inter-rater reliability. Inter-rater reliability for patients with a clear classification (agreement 52%, Kappa value 0.29) was not higher than for patients with an unclear classification (agreement 54%, Kappa value 0.33). Familiarity with the system (i.e. trained with written instructions and previous research experience with the algorithm) did not improve the inter-rater reliability. Conclusion Our pilot study challenges the inter-rater reliability of the classification procedure in clinical practice. Therefore, more knowledge is needed about factors that affect the inter-rater reliability, in order to improve the clinical applicability of the classification scheme. PMID:27559279
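The two agreement statistics reported above, percentage agreement and Cohen’s Kappa, can be computed as follows; the rater labels in the test are hypothetical, not the study’s data:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Percentage agreement and Cohen's kappa for two raters.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e is the agreement expected by chance from each
    rater's marginal label frequencies.  Kappa of 0.21-0.40 is
    conventionally read as "fair" agreement, as in the study above.
    """
    n = len(rater1)
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return p_o, (p_o - p_e) / (1 - p_e)
```

This is why 53% raw agreement can still yield a kappa of only 0.34: with four classification groups of uneven prevalence, a substantial share of that agreement is expected by chance alone.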

  1. Centrifuge: rapid and sensitive classification of metagenomic sequences

    PubMed Central

    Song, Li; Breitwieser, Florian P.

    2016-01-01

    Centrifuge is a novel microbial classification engine that enables rapid, accurate, and sensitive labeling of reads and quantification of species on desktop computers. The system uses an indexing scheme based on the Burrows-Wheeler transform (BWT) and the Ferragina-Manzini (FM) index, optimized specifically for the metagenomic classification problem. Centrifuge requires a relatively small index (4.2 GB for 4078 bacterial and 200 archaeal genomes) and classifies sequences at very high speed, allowing it to process the millions of reads from a typical high-throughput DNA sequencing run within a few minutes. Together, these advances enable timely and accurate analysis of large metagenomics data sets on conventional desktop computers. Because of its space-optimized indexing schemes, Centrifuge also makes it possible to index the entire NCBI nonredundant nucleotide sequence database (a total of 109 billion bases) with an index size of 69 GB, in contrast to k-mer-based indexing schemes, which require far more extensive space. PMID:27852649
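The Burrows-Wheeler transform underlying Centrifuge’s FM index can be illustrated on a toy string; this naive sorted-rotations construction is for exposition only (real indexes over genome collections use far more space- and time-efficient suffix-array algorithms):

```python
def bwt(text):
    """Burrows-Wheeler transform via sorted rotations.

    Appends a terminator '$' (lexicographically smallest), sorts all
    rotations of the string, and reads off the last column.  The
    transform groups similar contexts together, which is what makes
    the FM index both compressible and searchable.
    """
    s = text + "$"
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations)

bwt("banana")  # -> "annb$aa"
```

An FM index stores this last column (plus small rank tables) instead of the text itself, which is how Centrifuge fits thousands of genomes into a few gigabytes while still supporting exact substring matching.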

  2. Modern radiosurgical and endovascular classification schemes for brain arteriovenous malformations.

    PubMed

    Tayebi Meybodi, Ali; Lawton, Michael T

    2018-05-04

    Stereotactic radiosurgery (SRS) and endovascular techniques are commonly used for treating brain arteriovenous malformations (bAVMs). They are usually used as ancillary techniques to microsurgery but may also be used as solitary treatment options. Careful patient selection requires a clear estimate of the treatment efficacy and complication rates for the individual patient. As such, classification schemes are an essential part of the patient-selection paradigm for each treatment modality. While the Spetzler-Martin grading system and its subsequent modifications are commonly used for microsurgical outcome prediction for bAVMs, the same system(s) may not be easily applicable to SRS and endovascular therapy. Several radiosurgery- and endovascular-based grading scales have been proposed for bAVMs. However, a comprehensive review of these systems, including a discussion of their relative advantages and disadvantages, is missing. This paper is dedicated to modern classification schemes designed for SRS and endovascular techniques.

  3. Stokes space modulation format classification based on non-iterative clustering algorithm for coherent optical receivers.

    PubMed

    Mai, Xiaofeng; Liu, Jie; Wu, Xiong; Zhang, Qun; Guo, Changjian; Yang, Yanfu; Li, Zhaohui

    2017-02-06

    A Stokes-space modulation format classification (MFC) technique is proposed for coherent optical receivers by using a non-iterative clustering algorithm. In the clustering algorithm, two simple parameters are calculated to help find the density peaks of the data points in Stokes space and no iteration is required. Correct MFC can be realized in numerical simulations among PM-QPSK, PM-8QAM, PM-16QAM, PM-32QAM and PM-64QAM signals within practical optical signal-to-noise ratio (OSNR) ranges. The performance of the proposed MFC algorithm is also compared with those of other schemes based on clustering algorithms. The simulation results show that good classification performance can be achieved using the proposed MFC scheme with moderate time complexity. Proof-of-concept experiments are finally implemented to demonstrate MFC among PM-QPSK/16QAM/64QAM signals, which confirm the feasibility of our proposed MFC scheme.
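
    The abstract does not name its two clustering parameters, but a common non-iterative density-peak formulation (in the style of Rodriguez and Laio) computes, for every point, a local density and the distance to the nearest denser point; cluster centers stand out as having both values large. A stdlib sketch under that assumption:

```python
import math

def density_peaks(points, d_c):
    """Per point: (rho, delta) = local density within radius d_c, and
    distance to the nearest point of higher density. No iteration needed;
    cluster centers are the points where both values are large."""
    n = len(points)
    dist = [[math.dist(p, q) for q in points] for p in points]
    rho = [sum(1 for j in range(n) if j != i and dist[i][j] < d_c)
           for i in range(n)]
    delta = []
    for i in range(n):
        higher = [dist[i][j] for j in range(n) if rho[j] > rho[i]]
        # by convention, a locally densest point gets its largest distance
        delta.append(min(higher) if higher else max(dist[i]))
    return rho, delta

# Two clusters: a dense one near the origin, a sparse one near (5, 5).
pts = [(0, 0), (0.1, 0), (0, 0.1), (0.1, 0.1), (5, 5), (5.1, 5)]
rho, delta = density_peaks(pts, d_c=0.5)
```

    In the MFC setting the points would be Stokes-space samples, and the number of detected peaks distinguishes the constellation.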

  4. Towards biological plausibility of electronic noses: A spiking neural network based approach for tea odour classification.

    PubMed

    Sarkar, Sankho Turjo; Bhondekar, Amol P; Macaš, Martin; Kumar, Ritesh; Kaur, Rishemjit; Sharma, Anupma; Gulati, Ashu; Kumar, Amod

    2015-11-01

    The paper presents a novel encoding scheme for neuronal code generation for odour recognition using an electronic nose (EN). This scheme is based on channel encoding using multiple Gaussian receptive fields superimposed over the temporal EN responses. The encoded data is further applied to a spiking neural network (SNN) for pattern classification. Two forms of SNN, a back-propagation based SpikeProp and a dynamic evolving SNN are used to learn the encoded responses. The effects of information encoding on the performance of SNNs have been investigated. Statistical tests have been performed to determine the contribution of the SNN and the encoding scheme to overall odour discrimination. The approach has been implemented in odour classification of orthodox black tea (Kangra-Himachal Pradesh Region) thereby demonstrating a biomimetic approach for EN data analysis. Copyright © 2015 Elsevier Ltd. All rights reserved.
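
    Channel encoding with overlapping Gaussian receptive fields can be sketched as follows; the evenly spaced centers and the width heuristic are illustrative choices, not the paper's exact parameterization. Each activation in [0, 1] can then be mapped to a spike time (stronger activation, earlier spike) for the SNN:

```python
import math

def grf_encode(x, lo, hi, m, gamma=1.5):
    """Encode scalar x as m graded activations from overlapping Gaussian
    receptive fields spanning [lo, hi] (population/channel encoding)."""
    centers = [lo + i * (hi - lo) / (m - 1) for i in range(m)]
    sigma = (hi - lo) / (gamma * (m - 1))  # width heuristic (illustrative)
    return [math.exp(-((x - c) ** 2) / (2 * sigma ** 2)) for c in centers]

# A sensor reading halfway through the range excites the middle field most:
acts = grf_encode(0.5, 0.0, 1.0, m=5)
spike_times = [round((1 - a) * 100) for a in acts]  # activation -> spike latency
```

    Applying this to each time sample of an electronic-nose response channel yields the spatio-temporal spike pattern that SpikeProp-style networks consume.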

  5. Authentication of bee pollen grains in bright-field microscopy by combining one-class classification techniques and image processing.

    PubMed

    Chica, Manuel

    2012-11-01

    A novel method for authenticating pollen grains in bright-field microscopic images is presented in this work. The new method has clear uses in many application fields, such as the bee-keeping sector, where laboratory experts need to identify fraudulent bee pollen samples against local known pollen types. Our system is based on image processing and one-class classification to reject unknown pollen grain objects. The latter classification technique allows us to tackle the major difficulty of the problem: the existence of many possible fraudulent pollen types and the impossibility of modeling all of them. Different one-class classification paradigms are compared to study the most suitable technique for solving the problem. In addition, feature selection algorithms are applied to reduce the complexity and increase the accuracy of the models. For each local pollen type, a one-class classifier is trained and aggregated into a multiclassifier model. This multiclassification scheme combines the output of all the one-class classifiers into a unique final response. The proposed method is validated by authenticating pollen grains belonging to different Spanish bee pollen types. The overall accuracy of the system in classifying fraudulent microscopic pollen grain objects is 92.3%. The system is able to rapidly reject pollen grains that belong to nonlocal pollen types, reducing laboratory work and effort. The number of possible applications of this authentication method in the microscopy research field is extensive. Copyright © 2012 Wiley Periodicals, Inc.

  6. [The establishment, development and application of classification approach of freshwater phytoplankton based on the functional group: a review].

    PubMed

    Yang, Wen; Zhu, Jin-Yong; Lu, Kai-Hong; Wan, Li; Mao, Xiao-Hua

    2014-06-01

    Appropriate schemes for classification of freshwater phytoplankton are prerequisites and important tools for revealing phytoplanktonic succession and studying freshwater ecosystems. An alternative approach, the functional group of freshwater phytoplankton, has been proposed and developed due to the deficiencies of Linnaean and molecular identification in ecological applications. The functional group of phytoplankton is a classification scheme based on autecology. In this study, the theoretical basis and classification criteria of the functional group (FG), morpho-functional group (MFG) and morphology-based functional group (MBFG) schemes were summarized, together with their merits and demerits. FG was considered the optimal classification approach for aquatic ecology research and aquatic environment evaluation. The application status of FG was introduced, and the evaluation standards and problems of two FG-based approaches to water quality assessment, the Q and QR index methods, were briefly discussed.

  7. Addition of Histology to the Paris Classification of Pediatric Crohn Disease Alters Classification of Disease Location.

    PubMed

    Fernandes, Melissa A; Verstraete, Sofia G; Garnett, Elizabeth A; Heyman, Melvin B

    2016-02-01

    The aim of the study was to investigate the value of microscopic findings in the classification of pediatric Crohn disease (CD) by determining whether classification of disease changes significantly with inclusion of histologic findings. Sixty patients were randomly selected from a cohort of patients studied at the Pediatric Inflammatory Bowel Disease Clinic at the University of California, San Francisco Benioff Children's Hospital. Two physicians independently reviewed the electronic health records of the included patients to determine the Paris classification for each patient, first by adhering to present guidelines and then by including microscopic findings. Macroscopic and combined disease location classifications were discordant in 34 patients (56.6%), with no statistically significant differences between groups. Interobserver agreement was higher for the combined classification (κ = 0.73, 95% confidence interval 0.65-0.82) than when classification was limited to macroscopic findings (κ = 0.53, 95% confidence interval 0.40-0.58). When evaluating the proximal upper gastrointestinal tract (Paris L4a), interobserver agreement was better for the macroscopic than for the combined classification. Disease extent classifications differed significantly when comparing isolated macroscopic findings (Paris classification) with the combined scheme that included microscopy. Further studies are needed to determine which scheme provides a more accurate representation of disease extent.

  8. A Three-Phase Decision Model of Computer-Aided Coding for the Iranian Classification of Health Interventions (IRCHI)

    PubMed Central

    Azadmanjir, Zahra; Safdari, Reza; Ghazisaeedi, Marjan; Mokhtaran, Mehrshad; Kameli, Mohammad Esmail

    2017-01-01

    Introduction: Accurately coded data in healthcare are critical. Computer-Assisted Coding (CAC) is an effective tool for improving clinical coding, particularly when a new classification is being developed and implemented. However, determining the appropriate development method requires considering the specifications of existing CAC systems, the requirements of each type, the available infrastructure, and the classification scheme itself. Aim: The aim of the study was to develop a decision model for determining the accurate code of each medical intervention in the Iranian Classification of Health Interventions (IRCHI) that can be implemented as a suitable CAC system. Methods: First, a sample of existing CAC systems was reviewed. Then the feasibility of each type of CAC was examined with regard to the prerequisites for its implementation. In the next step, a suitable model was proposed according to the structure of the classification scheme and implemented as an interactive system. Results: There is a significant relationship between the level of assistance a CAC system provides and its integration with electronic medical documents. Implementing a fully automated CAC system is currently impossible due to the immature development of the electronic medical record and problems in the language used for medical documentation. A model was therefore proposed to develop a semi-automated CAC system based on hierarchical relationships between entities in the classification scheme, together with decision logic that specifies the characters of a code step by step through a web-based interactive user interface. The model is composed of three phases, selecting the Target, Action, and Means of an intervention, respectively. Conclusion: The proposed model suits the current status of clinical documentation and coding in Iran as well as the structure of the new classification scheme. Our results show it is practical; however, the model needs to be evaluated in the next stage of the research. PMID:28883671

  9. Classification and reduction of pilot error

    NASA Technical Reports Server (NTRS)

    Rogers, W. H.; Logan, A. L.; Boley, G. D.

    1989-01-01

    Human error is a primary or contributing factor in about two-thirds of commercial aviation accidents worldwide. With the ultimate goal of reducing pilot error accidents, this contract effort is aimed at understanding the factors underlying error events and reducing the probability of certain types of errors by modifying underlying factors such as flight deck design and procedures. A review of the literature relevant to error classification was conducted. Classification includes categorizing types of errors, the information processing mechanisms and factors underlying them, and identifying factor-mechanism-error relationships. The classification scheme developed by Jens Rasmussen was adopted because it provided a comprehensive yet basic error classification shell, or structure, that could easily accommodate the addition of details on domain-specific factors. For these purposes, factors specific to the aviation environment were incorporated. Hypotheses concerning the relationships among a small number of underlying factors, information processing mechanisms, and error types identified in the classification scheme were formulated. ASRS data were reviewed and a simulation experiment was performed to evaluate and quantify the hypotheses.

  10. A Visual Basic program to plot sediment grain-size data on ternary diagrams

    USGS Publications Warehouse

    Poppe, L.J.; Eliason, A.H.

    2008-01-01

    Sedimentologic datasets are typically large and compiled into tables or databases, but pure numerical information can be difficult to understand and interpret. Thus, scientists commonly use graphical representations to reduce complexities, recognize trends and patterns in the data, and develop hypotheses. Of the graphical techniques, one of the most common methods used by sedimentologists is to plot the basic gravel, sand, silt, and clay percentages on equilateral triangular diagrams. This means of presenting data is simple and facilitates rapid classification of sediments and comparison of samples. The original classification scheme developed by Shepard (1954) used a single ternary diagram with sand, silt, and clay in the corners and 10 categories to graphically show the relative proportions among these three grades within a sample. This scheme, however, did not allow for sediments with significant amounts of gravel. Therefore, Shepard's classification scheme was later modified by the addition of a second ternary diagram with two categories to account for gravel and gravelly sediment (Schlee, 1973). The system devised by Folk (1954, 1974) is also based on two triangular diagrams, but it has 21 categories and uses the term mud (defined as silt plus clay). Patterns within the triangles of both systems differ, as does the emphasis placed on gravel. For example, in the system described by Shepard, gravelly sediments have more than 10% gravel; in Folk's system, slightly gravelly sediments have as little as 0.01% gravel.
    Folk's classification scheme stresses gravel because its concentration is a function of the highest current velocity at the time of deposition, as is the maximum grain size of the detritus that is available; Shepard's classification scheme emphasizes the ratios of sand, silt, and clay because they reflect sorting and reworking (Poppe et al., 2005). The program described herein (SEDPLOT) generates verbal equivalents and ternary diagrams to characterize sediment grain-size distributions. It is written in Microsoft Visual Basic 6.0 and provides a window to facilitate program execution. The inputs for the sediment fractions are percentages of gravel, sand, silt, and clay in the Wentworth (1922) grade scale, and the program permits the user to select output in either the Shepard (1954) classification scheme, modified as described above, or the Folk (1954, 1974) scheme. Users select options primarily with mouse-click events and through interactive dialogue boxes. This program is intended as a companion to other Visual Basic software we have developed to process sediment data (Poppe et al., 2003, 2004).
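
    A simplified classifier in the spirit of the modified Shepard (1954) scheme can be written directly from the percentage rules; the exact diagram boundaries (including the 50% gravel cut used here) are a sketch, not taken verbatim from the published ternary diagrams:

```python
def classify_shepard(gravel, sand, silt, clay):
    """Sketch of a Shepard-style verbal classifier with the Schlee (1973)
    gravel amendment. Percentages should sum to ~100; boundary values are
    illustrative approximations of the ternary diagram."""
    if gravel >= 10:  # Schlee's second diagram handles gravelly sediments
        return "gravel" if gravel >= 50 else "gravelly sediment"
    # renormalize the sand/silt/clay fractions to 100%
    total = sand + silt + clay
    sand, silt, clay = (100 * x / total for x in (sand, silt, clay))
    parts = {"sand": sand, "silt": silt, "clay": clay}
    major = max(parts, key=parts.get)
    if parts[major] >= 75:          # corner classes of the triangle
        return major
    if min(parts.values()) >= 20:   # central class: all three significant
        return "sand-silt-clay"
    minor = min(parts, key=parts.get)
    others = sorted((k for k in parts if k != minor), key=parts.get, reverse=True)
    adj = {"sand": "sandy", "silt": "silty", "clay": "clayey"}
    return f"{adj[others[1]]} {others[0]}"
```

    For example, a sample with 60% sand, 35% silt, and 5% clay falls in the "silty sand" field.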

  11. Semantic Shot Classification in Sports Video

    NASA Astrophysics Data System (ADS)

    Duan, Ling-Yu; Xu, Min; Tian, Qi

    2003-01-01

    In this paper, we present a unified framework for semantic shot classification in sports videos. Unlike previous approaches, which focus on clustering by aggregating shots with similar low-level features, the proposed scheme makes use of domain knowledge of a specific sport to perform top-down video shot classification, including identification of video shot classes for each sport, and supervised learning and classification of the given sports video with low-level and middle-level features extracted from the video. It is observed that for each sport we can predefine a small number of semantic shot classes, about 5~10, which cover 90~95% of sports broadcasting video. With the supervised learning method, we can map the low-level features to middle-level semantic video shot attributes such as dominant object motion (a player), camera motion patterns, and court shape. On the basis of an appropriate fusion of those middle-level attributes, we classify video shots into the predefined video shot classes, each of which has a clear semantic meaning. The proposed method has been tested on four types of sports videos: tennis, basketball, volleyball, and soccer. Good classification accuracy of 85~95% has been achieved. With correctly classified sports video shots, further structural and temporal analysis, such as event detection, video skimming, and tables of contents, will be greatly facilitated.

  12. Robust BMPM training based on second-order cone programming and its application in medical diagnosis.

    PubMed

    Peng, Xiang; King, Irwin

    2008-01-01

    The Biased Minimax Probability Machine (BMPM) constructs a classifier that deals with imbalanced learning tasks. It provides a worst-case bound on the probability of misclassification of future data points based on reliable estimates of the means and covariance matrices of the classes from the training data samples, and achieves promising performance. In this paper, we develop a novel and important extension of the training algorithm for BMPM that is based on Second-Order Cone Programming (SOCP). Moreover, we apply the biased classification model to medical diagnosis problems to demonstrate its usefulness. By removing some crucial assumptions in the original solution to this model, we make the new method more accurate and robust. We outline the theoretical derivation of the biased classification model and reformulate it into an SOCP problem that can be efficiently solved with a global-optimum guarantee. We evaluate our proposed SOCP-based BMPM (BMPMSOCP) scheme in comparison with traditional solutions on medical diagnosis tasks where the objective is to improve the sensitivity (the accuracy of the more important class, say "ill" samples) rather than the overall accuracy of classification. Empirical results show that our method is more effective and robust at handling imbalanced classification problems than traditional classification approaches, including the original Fractional Programming-based BMPM (BMPMFP).

  13. Probing the Dusty Stellar Populations of the Local Volume Galaxies with JWST /MIRI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Olivia C.; Meixner, Margaret; Justtanont, Kay

    The Mid-Infrared Instrument (MIRI) for the James Webb Space Telescope (JWST) will revolutionize our understanding of infrared stellar populations in the Local Volume. Using the rich Spitzer-IRS spectroscopic data set and spectral classifications from the Surveying the Agents of Galaxy Evolution (SAGE)-Spectroscopic survey of more than 1000 objects in the Magellanic Clouds, the Grid of Red Supergiant and Asymptotic Giant Branch Star Models (GRAMS), and the grid of YSO models by Robitaille et al., we calculate the expected flux densities and colors in the MIRI broadband filters for prominent infrared stellar populations. We use these fluxes to explore the JWST/MIRI colors and magnitudes for composite stellar population studies of Local Volume galaxies. MIRI color classification schemes are presented; these diagrams provide a powerful means of identifying young stellar objects, evolved stars, and extragalactic background galaxies in Local Volume galaxies with a high degree of confidence. Finally, we examine which filter combinations are best for selecting populations of sources based on their JWST colors.

  14. A new classification scheme of plastic wastes based upon recycling labels.

    PubMed

    Özkan, Kemal; Ergin, Semih; Işık, Şahin; Işıklı, Idil

    2015-01-01

    Since recycling of materials is widely assumed to be environmentally and economically beneficial, reliable sorting and processing of waste packaging materials such as plastics is very important for recycling with high efficiency. An automated system that can quickly categorize these materials is needed to obtain maximum classification accuracy while maintaining high throughput. In this paper, photographs of plastic bottles were first taken and several preprocessing steps carried out. The first preprocessing step extracts the plastic area of a bottle from the background. Then morphological image operations are applied: edge detection, noise removal, hole removal, image enhancement, and image segmentation. These morphological operations can generally be defined as combinations of erosion and dilation, and they eliminate the effects of bottle color and labels. Secondly, the pixel-wise intensity values of the plastic bottle images are used together with the most popular subspace and statistical feature extraction methods to construct the feature vectors. Only three types of plastic are considered, because they occur far more frequently than the other plastic types worldwide. The system employs five different feature extraction methods, Principal Component Analysis (PCA), Kernel PCA (KPCA), Fisher's Linear Discriminant Analysis (FLDA), Singular Value Decomposition (SVD), and Laplacian Eigenmaps (LEMAP), and uses a simple experimental setup with a camera and homogeneous backlighting. Because it provides a global solution to the classification problem, a Support Vector Machine (SVM) is selected to perform the classification, and a majority-voting technique is used as the decision mechanism. This technique weights each classification result equally and assigns the given plastic object to the class on which most classification results agree. The proposed classification scheme provides a high accuracy rate and is able to run in real-time applications. It can automatically classify plastic bottle types with approximately 90% recognition accuracy. Besides this, the proposed methodology yields approximately 96% classification accuracy for the separation of PET from non-PET plastic types, and 92% accuracy for the categorization of non-PET plastic types into HDPE or PP. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. The Classification of Hysteria and Related Disorders: Historical and Phenomenological Considerations

    PubMed Central

    North, Carol S.

    2015-01-01

    This article examines the history of the conceptualization of dissociative, conversion, and somatoform syndromes in relation to one another, chronicles efforts to classify these and other phenomenologically-related psychopathology in the American diagnostic system for mental disorders, and traces the subsequent divergence in opinions of dissenting sectors on classification of these disorders. This article then considers the extensive phenomenological overlap across these disorders in empirical research, and from this foundation presents a new model for the conceptualization of these disorders. The classification of disorders formerly known as hysteria and phenomenologically-related syndromes has long been contentious and unsettled. Examination of the long history of the conceptual difficulties, which remain inherent in existing classification schemes for these disorders, can help to address the continuing controversy. This review clarifies the need for a major conceptual revision of the current classification of these disorders. A new phenomenologically-based classification scheme for these disorders is proposed that is more compatible with the agnostic and atheoretical approach to diagnosis of mental disorders used by the current classification system. PMID:26561836

  16. An on-line BCI for control of hand grasp sequence and holding using adaptive probabilistic neural network.

    PubMed

    Hazrati, Mehrnaz Kh; Erfanian, Abbas

    2008-01-01

    This paper presents a new EEG-based Brain-Computer Interface (BCI) for on-line control of the sequence of hand grasping and holding in a virtual reality environment. The goal of this research is to develop an interaction technique that will allow the BCI to be effective in real-world scenarios for hand grasp control. Moreover, for consistency of the man-machine interface, it is desirable that the intended movement be what the subject imagines. For this purpose, we developed an on-line BCI based on the classification of EEG associated with imagination of the movement of hand grasping and the resting state. A classifier based on a probabilistic neural network (PNN) was introduced for classifying the EEG. The PNN is a feedforward neural network that realizes the Bayes decision discriminant function by estimating the probability density function using mixtures of Gaussian kernels. Two types of classification schemes were considered here for on-line hand control: adaptive and static. In contrast to static classification, the adaptive classifier was continuously updated on-line during recording. The experimental evaluation on six subjects on different days demonstrated that the static scheme can achieve a classification accuracy as high as that obtained by the adaptive scheme. In the best case, an average classification accuracy of 93.0% and 85.8% was obtained using the adaptive and static schemes, respectively. The results obtained from more than 1500 trials on six subjects showed that an interactive virtual reality environment can be used as an effective tool for subject training in BCI.
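
    The PNN described above realizes a Bayes discriminant by summing Gaussian kernels over stored training samples. A minimal sketch (the 2-D feature values, class names, and smoothing parameter sigma are illustrative, not the paper's EEG features):

```python
import math

def pnn_classify(x, train, sigma=0.1):
    """Probabilistic neural network: Parzen-window class-density estimates
    built from Gaussian kernels; returns the class with the largest estimate."""
    scores = {}
    for label, samples in train.items():
        # average Gaussian kernel response of the query to each stored sample
        scores[label] = sum(
            math.exp(-sum((a - b) ** 2 for a, b in zip(x, s)) / (2 * sigma ** 2))
            for s in samples
        ) / len(samples)
    return max(scores, key=scores.get)

# Toy 2-D features standing in for EEG-derived features (illustrative values):
train = {"rest":  [(0.0, 0.0), (0.1, 0.0)],
         "grasp": [(1.0, 1.0), (0.9, 1.0)]}
label = pnn_classify((0.05, 0.0), train)
```

    The adaptive scheme in the study corresponds to appending newly recorded, labeled trials to `train` during the session, which a kernel-memory classifier like this supports trivially.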

  17. Classification of Instructional Programs: 2000 Edition.

    ERIC Educational Resources Information Center

    Morgan, Robert L.; Hunt, E. Stephen

    This third revision of the Classification of Instructional Programs (CIP) updates and modifies education program classifications, providing a taxonomic scheme that supports the accurate tracking, assessment, and reporting of field of study and program completions activity. This edition has also been adopted as the standard field of study taxonomy…

  18. Attribution of local climate zones using a multitemporal land use/land cover classification scheme

    NASA Astrophysics Data System (ADS)

    Wicki, Andreas; Parlow, Eberhard

    2017-04-01

    Worldwide, the number of people living in an urban environment exceeds the rural population, with an increasing tendency. Especially in relation to global climate change, cities play a major role given the impact of extreme heat waves on their populations. For urban planners, it is important to know which types of urban structures are beneficial for a comfortable urban climate and which actions can be taken to improve urban climate conditions. It is therefore essential to distinguish not only between urban and rural environments, but also between different levels of urban densification. To compare built-up types across different cities worldwide, Stewart and Oke developed the concept of local climate zones (LCZ), defined by morphological characteristics. The original LCZ scheme often has considerable problems when adapted to European cities with historical city centers, including narrow streets and irregular patterns. In this study, a method to bridge the gap between a classical land use/land cover (LULC) classification and the LCZ scheme is presented. Multitemporal Landsat 8 data are used to create a high-accuracy LULC map, which is linked to the LCZ by morphological parameters derived from a high-resolution digital surface model and cadastral data. A bijective combination of the different classification schemes could not be achieved completely, due to overlapping threshold values and the spatially homogeneous distribution of morphological parameters, but the attribution of LCZ to the LULC classification was successful.

  19. Empirical and modeled synoptic cloud climatology of the Arctic Ocean

    NASA Technical Reports Server (NTRS)

    Barry, R. G.; Newell, J. P.; Schweiger, A.; Crane, R. G.

    1986-01-01

    A set of cloud cover data was developed for the Arctic during the climatically important spring/early summer transition months. In parallel with the determination of mean monthly cloud conditions, data for different synoptic pressure patterns were also composited as a means of evaluating the role of synoptic variability in Arctic cloud regimes. In order to carry out this analysis, a synoptic classification scheme was developed for the Arctic using an objective typing procedure. A second major objective was to analyze model output of pressure fields and cloud parameters from a control run of the Goddard Institute for Space Studies climate model for the same area and to intercompare the synoptic climatology of the model with that based on the observational data.

  20. Prediction of cause of death from forensic autopsy reports using text classification techniques: A comparative study.

    PubMed

    Mujtaba, Ghulam; Shuib, Liyana; Raj, Ram Gopal; Rajandram, Retnagowri; Shaikh, Khairunisa

    2018-07-01

    Automatic text classification techniques are useful for classifying plaintext medical documents. This study aims to automatically predict the cause of death from free-text forensic autopsy reports by comparing various schemes for feature extraction, term weighting or feature value representation, text classification, and feature reduction. For the experiments, autopsy reports belonging to eight different causes of death were collected, preprocessed, and converted into 43 master feature vectors using various schemes for feature extraction, representation, and reduction. Six different text classification techniques were applied to these 43 master feature vectors to construct a classification model that can predict the cause of death. Finally, classification model performance was evaluated using four performance measures: overall accuracy, macro precision, macro F-measure, and macro recall. From the experiments, it was found that unigram features obtained the highest performance compared to bigram, trigram, and hybrid-gram features. Among the feature representation schemes, term frequency and term frequency with inverse document frequency obtained similar and better results than binary frequency and normalized term frequency with inverse document frequency. The chi-square feature reduction approach outperformed the Pearson correlation and information gain approaches. Finally, among the text classification algorithms, the support vector machine classifier outperformed random forest, Naive Bayes, k-nearest neighbor, decision tree, and ensemble-voted classifiers. Our results and comparisons hold practical importance and serve as references for future work. Moreover, the comparison outputs will serve as a state-of-the-art reference for comparing future proposals with existing automated text classification techniques. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
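
    Of the feature-representation schemes compared, term frequency with inverse document frequency is easy to sketch from scratch. Tokenization here is naive whitespace splitting, and the example documents are invented; real preprocessing of autopsy reports would be considerably more involved:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Unigram term-frequency / inverse-document-frequency vectors, one of
    the feature-weighting schemes compared in the study (sparse dicts)."""
    tokenized = [doc.lower().split() for doc in docs]
    n = len(tokenized)
    # document frequency: in how many documents does each term appear?
    df = Counter(term for doc in tokenized for term in set(doc))
    idf = {t: math.log(n / df[t]) for t in df}
    vectors = []
    for doc in tokenized:
        tf = Counter(doc)
        vectors.append({t: tf[t] / len(doc) * idf[t] for t in tf})
    return vectors

# Terms shared across documents are down-weighted relative to rare ones:
vectors = tfidf_vectors(["heart failure acute", "heart attack", "drowning accident"])
```

    Feature reduction (e.g. the chi-square scoring the study found best) would then rank these unigram features per class before the vectors are fed to an SVM.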

  1. DREAM: Classification scheme for dialog acts in clinical research query mediation.

    PubMed

    Hoxha, Julia; Chandar, Praveen; He, Zhe; Cimino, James; Hanauer, David; Weng, Chunhua

    2016-02-01

    Clinical data access involves complex but opaque communication between medical researchers and query analysts. Understanding such communication is indispensable for designing intelligent human-machine dialog systems that automate query formulation. This study investigates email communication and proposes a novel scheme for classifying dialog acts in clinical research query mediation. We analyzed 315 email messages exchanged in the communication for 20 data requests obtained from three institutions. The messages were segmented into 1333 utterance units. Through a rigorous process, we developed a classification scheme and applied it for dialog act annotation of the extracted utterances. Evaluation results with high inter-annotator agreement demonstrate the reliability of this scheme. This dataset is used to contribute preliminary understanding of dialog acts distribution and conversation flow in this dialog space. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. A Classification Scheme for Young Stellar Objects Using the WIDE-FIELD INFRARED SURVEY EXPLORER ALLWISE Catalog: Revealing Low-Density Star Formation in the Outer Galaxy

    NASA Technical Reports Server (NTRS)

    Koenig, X. P.; Leisawitz, D. T.

    2014-01-01

    We present an assessment of the performance of WISE and the AllWISE data release in a section of the Galactic Plane. We lay out an approach to increasing the reliability of point source photometry extracted from the AllWISE catalog in Galactic Plane regions using parameters provided in the catalog. We use the resulting catalog to construct a new, revised young star detection and classification scheme combining WISE and 2MASS near and mid-infrared colors and magnitudes and test it in a section of the Outer Milky Way. The clustering properties of the candidate Class I and II stars using a nearest neighbor density calculation and the two-point correlation function suggest that the majority of stars do form in massive star forming regions, and any isolated mode of star formation is at most a small fraction of the total star forming output of the Galaxy. We also show that the isolated component may be very small and could represent the tail end of a single mechanism of star formation in line with models of molecular cloud collapse with supersonic turbulence and not a separate mode all to itself.

  3. Local classifier weighting by quadratic programming.

    PubMed

    Cevikalp, Hakan; Polikar, Robi

    2008-10-01

    It has been widely accepted that the classification accuracy can be improved by combining outputs of multiple classifiers. However, how to combine multiple classifiers with various (potentially conflicting) decisions is still an open problem. A rich collection of classifier combination procedures -- many of which are heuristic in nature -- has been developed for this goal. In this brief, we describe a dynamic approach to combine classifiers that have expertise in different regions of the input space. To this end, we use local classifier accuracy estimates to weight classifier outputs. Specifically, we estimate local recognition accuracies of classifiers near a query sample by utilizing its nearest neighbors, and then use these estimates to find the best weights of classifiers to label the query. The problem is formulated as a convex quadratic optimization problem, which returns optimal nonnegative classifier weights with respect to the chosen objective function, and the weights ensure that locally most accurate classifiers are weighted more heavily for labeling the query sample. Experimental results on several data sets indicate that the proposed weighting scheme outperforms other popular classifier combination schemes, particularly on problems with complex decision boundaries. Hence, the results indicate that local classification-accuracy-based combination techniques are well suited for decision making when the classifiers are trained by focusing on different regions of the input space.
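The local weighting idea can be sketched as follows; note this simplification normalizes local accuracies directly instead of solving the paper's quadratic program, and the data and function names are invented for illustration:

```python
import numpy as np

def local_weights(X_val, y_val, preds_val, x_query, k=5):
    """Estimate each classifier's accuracy on the k validation samples
    nearest to the query, then normalize the accuracies into weights.
    (The paper obtains the weights by solving a convex quadratic program;
    normalized local accuracies are a simplified stand-in.)"""
    nn = np.argsort(np.linalg.norm(X_val - x_query, axis=1))[:k]
    acc = np.array([(p[nn] == y_val[nn]).mean() for p in preds_val])
    if acc.sum() == 0:
        return np.full(len(preds_val), 1.0 / len(preds_val))
    return acc / acc.sum()

def weighted_vote(query_preds, weights, n_classes):
    """Combine hard classifier decisions for one query by weighted voting."""
    scores = np.zeros(n_classes)
    for label, w in zip(query_preds, weights):
        scores[label] += w
    return int(scores.argmax())
```

A classifier that is accurate in the query's neighborhood dominates the vote there, even if it performs poorly elsewhere in the input space.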

  4. Using methods from the data mining and machine learning literature for disease classification and prediction: A case study examining classification of heart failure sub-types

    PubMed Central

    Austin, Peter C.; Tu, Jack V.; Ho, Jennifer E.; Levy, Daniel; Lee, Douglas S.

    2014-01-01

    Objective Physicians classify patients into those with or without a specific disease. Furthermore, there is often interest in classifying patients according to disease etiology or subtype. Classification trees are frequently used to classify patients according to the presence or absence of a disease. However, classification trees can suffer from limited accuracy. In the data-mining and machine learning literature, alternate classification schemes have been developed. These include bootstrap aggregation (bagging), boosting, random forests, and support vector machines. Study design and Setting We compared the performance of these classification methods with those of conventional classification trees to classify patients with heart failure according to the following sub-types: heart failure with preserved ejection fraction (HFPEF) vs. heart failure with reduced ejection fraction (HFREF). We also compared the ability of these methods to predict the probability of the presence of HFPEF with that of conventional logistic regression. Results We found that modern, flexible tree-based methods from the data mining literature offer substantial improvement in prediction and classification of heart failure sub-type compared to conventional classification and regression trees. However, conventional logistic regression had superior performance for predicting the probability of the presence of HFPEF compared to the methods proposed in the data mining literature. Conclusion The use of tree-based methods offers superior performance over conventional classification and regression trees for predicting and classifying heart failure subtypes in a population-based sample of patients from Ontario. However, these methods do not offer substantial improvements over logistic regression for predicting the presence of HFPEF. PMID:23384592
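A minimal sketch of bootstrap aggregation (bagging), one of the data-mining methods compared; decision stumps stand in for full classification trees, and the toy data are invented, not the Ontario heart-failure cohort:

```python
import numpy as np

def fit_stump(X, y):
    """One-level decision tree: pick the (feature, threshold, polarity)
    with the lowest misclassification rate."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            pred = (X[:, j] > t).astype(int)
            for flipped in (False, True):
                p = 1 - pred if flipped else pred
                err = (p != y).mean()
                if best is None or err < best[0]:
                    best = (err, j, t, flipped)
    return best[1:]

def predict_stump(stump, X):
    j, t, flipped = stump
    p = (X[:, j] > t).astype(int)
    return 1 - p if flipped else p

def bagging(X, y, n_trees=25, seed=0):
    """Bootstrap aggregation: fit each stump on a resampled copy of the data."""
    rng = np.random.default_rng(seed)
    stumps = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(X), len(X))  # bootstrap resample
        stumps.append(fit_stump(X[idx], y[idx]))
    return stumps

def predict_bagging(stumps, X):
    votes = np.mean([predict_stump(s, X) for s in stumps], axis=0)
    return (votes >= 0.5).astype(int)
```

Averaging many weak learners fitted on resampled data reduces the variance that makes single classification trees unstable, which is the mechanism behind the improvement the study reports.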

  5. Acute Oral Toxicity of Trimethylolethane Trinitrate (TMETN) in Sprague- Dawley Rats

    DTIC Science & Technology

    1989-07-01

    According to the classification scheme of Hodge and Sterner, these results indicate that TMETN is a slightly toxic compound. The reported MLD of 1027.4 ± 63.7 mg/kg in female Sprague-Dawley rats places TMETN in the "slightly toxic" range of the Hodge and Sterner system. KEY WORDS: Acute Oral Toxicity.

  6. NASA Scope and Subject Category Guide

    NASA Technical Reports Server (NTRS)

    2011-01-01

    This guide provides a simple, effective tool to assist aerospace information analysts and database builders in the high-level subject classification of technical materials. Each of the 76 subject categories comprising the classification scheme is presented with a description of category scope, a listing of subtopics, cross references, and an indication of particular areas of NASA interest. The guide also includes an index of nearly 3,000 specific research topics cross referenced to the subject categories. The portable document format (PDF) version of the guide contains links in the index from each input subject to its corresponding categories. In addition to subject classification, the guide can serve as an aid to searching databases that use the classification scheme, and is also an excellent selection guide for those involved in the acquisition of aerospace literature. The CD-ROM contains both HTML and PDF versions.

  7. Detection of concealed cars in complex cargo X-ray imagery using Deep Learning.

    PubMed

    Jaccard, Nicolas; Rogers, Thomas W; Morton, Edward J; Griffin, Lewis D

    2017-01-01

    Non-intrusive inspection systems based on X-ray radiography techniques are routinely used at transport hubs to ensure the conformity of cargo content with the supplied shipping manifest. As trade volumes increase and regulations become more stringent, manual inspection by trained operators is less and less viable due to low throughput. Machine vision techniques can assist operators in their task by automating parts of the inspection workflow. Since cars are routinely involved in trafficking, export fraud, and tax evasion schemes, they represent an attractive target for automated detection and flagging for subsequent inspection by operators. We describe the development and evaluation of a novel method for the automated detection of cars in complex X-ray cargo imagery. X-ray cargo images from a stream-of-commerce dataset were classified using a window-based scheme. The limited number of car images was addressed by using an oversampling scheme. Different Convolutional Neural Network (CNN) architectures were compared with well-established bag-of-words approaches. In addition, robustness to concealment was evaluated by projection of objects into car images. CNN approaches outperformed all other methods evaluated, achieving 100% car image classification rate for a false positive rate of 1-in-454. Cars that were partially or completely obscured by other goods, a modus operandi frequently adopted by criminals, were correctly detected. We believe that this level of performance suggests that the method is suitable for deployment in the field. It is expected that the generic object detection workflow described can be extended to other object classes given the availability of suitable training data.

  8. A risk-based classification scheme for genetically modified foods. III: Evaluation using a panel of reference foods.

    PubMed

    Chao, Eunice; Krewski, Daniel

    2008-12-01

    This paper presents an exploratory evaluation of four functional components of a proposed risk-based classification scheme (RBCS) for crop-derived genetically modified (GM) foods in a concordance study. Two independent raters assigned concern levels to 20 reference GM foods using a rating form based on the proposed RBCS. The four components of evaluation were: (1) degree of concordance, (2) distribution across concern levels, (3) discriminating ability of the scheme, and (4) ease of use. At least one of the 20 reference foods was assigned to each of the possible concern levels, demonstrating the ability of the scheme to identify GM foods of different concern with respect to potential health risk. There was reasonably good concordance between the two raters for the three separate parts of the RBCS. The raters agreed that the criteria in the scheme were sufficiently clear in discriminating reference foods into different concern levels, and that with some experience, the scheme was reasonably easy to use. Specific issues and suggestions for improvements identified in the concordance study are discussed.

  9. Poisoning by Herbs and Plants: Rapid Toxidromic Classification and Diagnosis.

    PubMed

    Diaz, James H

    2016-03-01

    The American Association of Poison Control Centers has continued to report approximately 50,000 telephone calls or 8% of incoming calls annually related to plant exposures, mostly in children. Although the frequency of plant ingestions in children is related to the presence of popular species in households, adolescents may experiment with hallucinogenic plants; and trekkers and foragers may misidentify poisonous plants as edible. Since plant exposures have continued at a constant rate, the objectives of this review were (1) to review the epidemiology of plant poisonings; and (2) to propose a rapid toxidromic classification system for highly toxic plant ingestions for field use by first responders in comparison to current classification systems. Internet search engines were queried to identify and select peer-reviewed articles on plant poisonings using the key words in order to classify plant poisonings into four specific toxidromes: cardiotoxic, neurotoxic, cytotoxic, and gastrointestinal-hepatotoxic. A simple toxidromic classification system of plant poisonings may permit rapid diagnoses of highly toxic versus less toxic and nontoxic plant ingestions both in households and outdoors; direct earlier management of potentially serious poisonings; and reduce costly inpatient evaluations for inconsequential plant ingestions. The current textbook classification schemes for plant poisonings were complex in comparison to the rapid classification system; and were based on chemical nomenclatures and pharmacological effects, and not on clearly presenting toxidromes. Validation of the rapid toxidromic classification system as compared to existing chemical classification systems for plant poisonings will require future adoption and implementation of the toxidromic system by its intended users. Copyright © 2016 Wilderness Medical Society. Published by Elsevier Inc. All rights reserved.

  10. A new local-global approach for classification.

    PubMed

    Peres, R T; Pedreira, C E

    2010-09-01

    In this paper, we propose a new local-global pattern classification scheme that combines supervised and unsupervised approaches, taking advantage of both local and global environments. We understand global methods as those concerned with constructing a model for the whole problem space using the totality of the available observations. Local methods focus on subregions of the space, possibly using an appropriately selected subset of the sample. In the proposed method, the sample is first divided into local cells using a vector quantization unsupervised algorithm, the LBG (Linde-Buzo-Gray). In a second stage, the resulting assemblage of much easier problems is solved locally with a scheme inspired by Bayes' rule. Four classification methods were implemented for comparison with the proposed scheme: Learning Vector Quantization (LVQ), feedforward neural networks, Support Vector Machine (SVM), and k-Nearest Neighbors. These four methods and the proposed scheme were applied to eleven datasets: two controlled experiments plus nine publicly available datasets from the UCI repository. The proposed method showed quite competitive performance when compared to these classical and widely used classifiers. Our method is simple to understand and implement and is based on very intuitive concepts. Copyright 2010 Elsevier Ltd. All rights reserved.
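A sketch of the local-global idea under simplifying assumptions: plain Lloyd's k-means stands in for the LBG quantizer, and the local stage reduces to smoothed per-cell class frequencies; this is an illustration, not the authors' implementation:

```python
import numpy as np

def lloyd_quantizer(X, k, iters=25):
    """Plain Lloyd's k-means as a stand-in for the LBG vector quantizer
    (LBG itself grows the codebook by iterative splitting)."""
    centers = X[:k].astype(float).copy()
    for _ in range(iters):
        cells = ((X[:, None, :] - centers) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (cells == j).any():
                centers[j] = X[cells == j].mean(0)
    return centers

def fit_cells(X, y, centers, n_classes):
    """Local stage: per-cell class frequencies P(class | cell), with
    Laplace smoothing, in the spirit of the Bayes-rule-inspired step."""
    cells = ((X[:, None, :] - centers) ** 2).sum(-1).argmin(1)
    counts = np.ones((len(centers), n_classes))  # Laplace smoothing
    for c, label in zip(cells, y):
        counts[c, label] += 1
    return counts / counts.sum(1, keepdims=True)

def predict_cells(Xq, centers, posteriors):
    """Assign each query to its nearest cell and take the cell's top class."""
    cells = ((Xq[:, None, :] - centers) ** 2).sum(-1).argmin(1)
    return posteriors[cells].argmax(1)
```

The unsupervised stage carves the space into cells; each cell then poses a much easier local classification problem, which is the division of labor the abstract describes.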

  11. A novel encoding scheme for effective biometric discretization: Linearly Separable Subcode.

    PubMed

    Lim, Meng-Hui; Teoh, Andrew Beng Jin

    2013-02-01

    Separability in a code is crucial in guaranteeing a decent Hamming-distance separation among the codewords. In multibit biometric discretization where a code is used for quantization-intervals labeling, separability is necessary for preserving distance dissimilarity when feature components are mapped from a discrete space to a Hamming space. In this paper, we examine separability of Binary Reflected Gray Code (BRGC) encoding and reveal its inadequacy in tackling interclass variation during the discrete-to-binary mapping, leading to a tradeoff between classification performance and entropy of binary output. To overcome this drawback, we put forward two encoding schemes exhibiting full-ideal and near-ideal separability capabilities, known as Linearly Separable Subcode (LSSC) and Partially Linearly Separable Subcode (PLSSC), respectively. These encoding schemes convert the conventional entropy-performance tradeoff into an entropy-redundancy tradeoff in the increase of code length. Extensive experimental results vindicate the superiority of our schemes over the existing encoding schemes in discretization performance. This opens up possibilities of achieving much greater classification performance with high output entropy.
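The separability contrast can be made concrete: adjacent BRGC codewords differ in one bit, but Hamming distance does not grow with index distance, whereas a unary (thermometer) code has exactly the linear distance property described; whether this matches the paper's precise LSSC construction is not claimed here:

```python
def brgc(i, width):
    """Binary Reflected Gray Code of index i, as a bit string."""
    return format(i ^ (i >> 1), f"0{width}b")

def thermometer(i, width):
    """Unary ('thermometer') code: Hamming distance between any two
    codewords equals the difference of their indices."""
    return "1" * i + "0" * (width - i)

def hamming(a, b):
    """Number of positions at which two equal-length strings differ."""
    return sum(x != y for x, y in zip(a, b))
```

With three bits, labels 0 and 3 are mapped by BRGC to codewords only one bit apart, so a large interval distance collapses to a small Hamming distance; the unary code avoids this at the cost of longer codewords, the entropy-redundancy tradeoff mentioned above.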

  12. Interpretation for scales of measurement linking with abstract algebra

    PubMed Central

    2014-01-01

    The Stevens classification of levels of measurement involves four types of scale: “Nominal”, “Ordinal”, “Interval” and “Ratio”. This classification has been used widely in medical fields and has accomplished an important role in composition and interpretation of scale. With this classification, levels of measurements appear organized and validated. However, a group theory-like systematization beckons as an alternative because of its logical consistency and unexceptional applicability in the natural sciences but which may offer great advantages in clinical medicine. According to this viewpoint, the Stevens classification is reformulated within an abstract algebra-like scheme; ‘Abelian modulo additive group’ for “Ordinal scale” accompanied with ‘zero’, ‘Abelian additive group’ for “Interval scale”, and ‘field’ for “Ratio scale”. Furthermore, a vector-like display arranges a mixture of schemes describing the assessment of patient states. With this vector-like notation, data-mining and data-set combination is possible on a higher abstract structure level based upon a hierarchical-cluster form. Using simple examples, we show that operations acting on the corresponding mixed schemes of this display allow for a sophisticated means of classifying, updating, monitoring, and prognosis, where better data mining/data usage and efficacy is expected. PMID:24987515

  13. Interpretation for scales of measurement linking with abstract algebra.

    PubMed

    Sawamura, Jitsuki; Morishita, Shigeru; Ishigooka, Jun

    2014-01-01

    The Stevens classification of levels of measurement involves four types of scale: "Nominal", "Ordinal", "Interval" and "Ratio". This classification has been used widely in medical fields and has accomplished an important role in composition and interpretation of scale. With this classification, levels of measurements appear organized and validated. However, a group theory-like systematization beckons as an alternative because of its logical consistency and unexceptional applicability in the natural sciences but which may offer great advantages in clinical medicine. According to this viewpoint, the Stevens classification is reformulated within an abstract algebra-like scheme; 'Abelian modulo additive group' for "Ordinal scale" accompanied with 'zero', 'Abelian additive group' for "Interval scale", and 'field' for "Ratio scale". Furthermore, a vector-like display arranges a mixture of schemes describing the assessment of patient states. With this vector-like notation, data-mining and data-set combination is possible on a higher abstract structure level based upon a hierarchical-cluster form. Using simple examples, we show that operations acting on the corresponding mixed schemes of this display allow for a sophisticated means of classifying, updating, monitoring, and prognosis, where better data mining/data usage and efficacy is expected.

  14. A new scheme for urban impervious surface classification from SAR images

    NASA Astrophysics Data System (ADS)

    Zhang, Hongsheng; Lin, Hui; Wang, Yunpeng

    2018-05-01

    Urban impervious surfaces have been recognized as a significant indicator for various environmental and socio-economic studies. There is an increasingly urgent demand for timely and accurate monitoring of the impervious surfaces with satellite technology from local to global scales. In the past decades, optical remote sensing has been widely employed for this task with various techniques. However, there are still a range of challenges, e.g. handling cloud contamination on optical data. Therefore, the Synthetic Aperture Radar (SAR) was introduced for the challenging task because it is uniquely all-time- and all-weather-capable. Nevertheless, with an increasing number of SAR data applied, the methodology used for impervious surfaces classification remains unchanged from the methods used for optical datasets. This shortcoming has prevented the community from fully exploring the potential of using SAR data for impervious surfaces classification. We proposed a new scheme that is comparable to the well-known and fundamental Vegetation-Impervious surface-Soil (V-I-S) model for mapping urban impervious surfaces. Three scenes of fully polarimetric Radarsat-2 data for the cities of Shenzhen, Hong Kong and Macau were employed to test and validate the proposed methodology. Experimental results indicated that the overall accuracy and Kappa coefficient were 96.00% and 0.8808 in Shenzhen, 93.87% and 0.8307 in Hong Kong and 97.48% and 0.9354 in Macau, indicating the applicability and great potential of the new scheme for impervious surfaces classification using polarimetric SAR data. Comparison with the traditional scheme indicated that this new scheme was able to improve the overall accuracy by up to 4.6% and Kappa coefficient by up to 0.18.
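The two accuracy measures reported can be computed from a confusion matrix as follows (a generic sketch, not tied to the paper's data):

```python
import numpy as np

def accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's kappa from a confusion matrix
    (rows = reference classes, columns = mapped classes). Kappa corrects
    the observed agreement for the agreement expected by chance."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    p_observed = np.trace(cm) / n                           # overall accuracy
    p_chance = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2
    return p_observed, (p_observed - p_chance) / (1 - p_chance)
```

This is why kappa (e.g. 0.8808 for Shenzhen) is always lower than overall accuracy (96.00%): chance agreement is subtracted out before normalizing.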

  15. FORUM: A Suggestion for an Improved Vegetation Scheme for Local and Global Mapping and Monitoring.

    PubMed

    ADAMS

    1999-01-01

    Understanding of global ecological problems is at least partly dependent on clear assessments of vegetation change, and such assessment is always dependent on the use of a vegetation classification scheme. Use of satellite remotely sensed data is the only practical means of carrying out any global-scale vegetation mapping exercise, but if the resulting maps are to be useful to most ecologists and conservationists, they must be closely tied to clearly defined features of vegetation on the ground. Furthermore, much of the mapping that does take place involves more local-scale description of field sites; for purposes of cost and practicality, such studies usually do not involve remote sensing using satellites. There is a need for a single scheme that integrates the smallest to the largest scale in a way that is meaningful to most environmental scientists. Existing schemes are unsatisfactory for this task; they are ambiguous, unnecessarily complex, and their categories do not correspond to common-sense definitions. In response to these problems, a simple structural-physiognomically based scheme with 23 fundamental categories is proposed here for mapping and monitoring on any scale, from local to global. The fundamental categories each subdivide into more specific structural categories for more detailed mapping, but all the categories can be used throughout the world and at any scale, allowing intercomparison between regions. The next stage in the process will be to obtain the views of as many people working in as many different fields as possible, to see whether the proposed scheme suits their needs and how it should be modified. With a few modifications, such a scheme could easily be appended to an existing land cover classification scheme, such as the FAO system, greatly increasing the usefulness and accessibility of the results of the land cover classification. KEY WORDS: Vegetation scheme; Mapping; Monitoring; Land cover

  16. Discrimination of tooth layers and dental restorative materials using cutting sounds.

    PubMed

    Zakeri, Vahid; Arzanpour, Siamak; Chehroudi, Babak

    2015-03-01

    Dental restoration begins with removing caries and affected tissues with air-turbine rotary cutting handpieces, and later restoring the lost tissues with appropriate restorative materials to retain the functionality. Most restoration materials eventually fail as they age and need to be replaced. One of the difficulties in replacing failing restorations is discerning the boundary of restorative materials, which causes inadvertent removal of healthy tooth layers. Developing an objective and sensor-based method is a promising approach to monitor dental restorative operations and to prevent excessive tooth losses. This paper has analyzed cutting sounds of an air-turbine handpiece to discriminate between tooth layers and two commonly used restorative materials, amalgam and composite. Support vector machines were employed for classification, and the averaged short-time Fourier transform coefficients were selected as the features. The classifier performance was evaluated from different aspects such as the number of features, feature scaling methods, classification schemes, and utilized kernels. The total classification accuracies were 89% and 92% for cases involving composite and amalgam materials, respectively. The obtained results indicated the feasibility and effectiveness of the proposed method.
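A minimal sketch of the feature extraction described, averaged short-time Fourier transform magnitudes, assuming a mono signal array; the frame and hop sizes are illustrative, not the paper's settings:

```python
import numpy as np

def mean_stft_features(signal, frame=256, hop=128):
    """Short-time Fourier transform magnitudes averaged over frames:
    one fixed-length spectral feature vector per recording, suitable
    as input to a classifier such as an SVM."""
    window = np.hanning(frame)
    frames = np.array([signal[i:i + frame] * window
                       for i in range(0, len(signal) - frame + 1, hop)])
    return np.abs(np.fft.rfft(frames, axis=1)).mean(axis=0)
```

Averaging over frames discards timing but keeps the spectral signature of the cutting sound, which is what distinguishes enamel, dentin, amalgam, and composite under the drill.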

  17. Active learning methods for interactive image retrieval.

    PubMed

    Gosselin, Philippe Henri; Cord, Matthieu

    2008-07-01

    Active learning methods have been considered with increased interest in the statistical learning community. Initially developed within a classification framework, a lot of extensions are now being proposed to handle multimedia applications. This paper provides algorithms within a statistical framework to extend active learning for online content-based image retrieval (CBIR). The classification framework is presented with experiments to compare several powerful classification techniques in this information retrieval context. Focusing on interactive methods, active learning strategy is then described. The limitations of this approach for CBIR are emphasized before presenting our new active selection process RETIN. First, as any active method is sensitive to the boundary estimation between classes, the RETIN strategy carries out a boundary correction to make the retrieval process more robust. Second, the criterion of generalization error to optimize the active learning selection is modified to better represent the CBIR objective of database ranking. Third, a batch processing of images is proposed. Our strategy leads to a fast and efficient active learning scheme to retrieve sets of online images (query concept). Experiments on large databases show that the RETIN method performs well in comparison to several other active strategies.
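A generic uncertainty-sampling skeleton (not RETIN itself, which adds boundary correction and batch selection); the nearest-centroid scorer is an invented stand-in for a real classifier:

```python
import numpy as np

def nearest_centroid_scores(X, labeled_idx, y):
    """Signed distance to the midpoint of the two class centroids: a crude
    stand-in for a trained classifier's decision function."""
    c0 = X[[i for i in labeled_idx if y[i] == 0]].mean(axis=0)
    c1 = X[[i for i in labeled_idx if y[i] == 1]].mean(axis=0)
    w = c1 - c0
    b = (c0 + c1) / 2
    return (X - b) @ w

def active_learning(X, y, seed_idx, budget):
    """Uncertainty sampling: repeatedly ask the oracle to label the
    unlabeled point closest to the current decision boundary."""
    labeled = list(seed_idx)
    for _ in range(budget):
        scores = nearest_centroid_scores(X, labeled, y)
        unlabeled = [i for i in range(len(X)) if i not in labeled]
        pick = min(unlabeled, key=lambda i: abs(scores[i]))
        labeled.append(pick)  # oracle reveals y[pick]
    return labeled
```

Labeling the most ambiguous points first refines the boundary fastest, the same intuition the RETIN strategy builds on before correcting for unstable boundary estimates.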

  18. What are the most fire-dangerous atmospheric circulations in the Eastern-Mediterranean? Analysis of the synoptic wildfire climatology.

    PubMed

    Paschalidou, A K; Kassomenos, P A

    2016-01-01

    Wildfire management is closely linked to robust forecasts of changes in wildfire risk related to meteorological conditions. This link can be bridged either through fire weather indices or through statistical techniques that directly relate atmospheric patterns to wildfire activity. In the present work the COST-733 classification schemes are applied in order to link wildfires in Greece with synoptic circulation patterns. The analysis reveals that the majority of wildfire events can be explained by a small number of specific synoptic circulations, hence reflecting the synoptic climatology of wildfires. All 8 classification schemes used show that the most fire-dangerous conditions in Greece are characterized by a combination of high atmospheric pressure systems located N to NW of Greece, coupled with lower pressures located over the very Eastern part of the Mediterranean, an atmospheric pressure pattern closely linked to the local Etesian winds over the Aegean Sea. During these events, the atmospheric pressure has been reported to be anomalously high, while anomalously low 500hPa geopotential heights and negative total water column anomalies were also observed. Among the various classification schemes used, the 2 Principal Component Analysis-based classifications, namely the PCT and the PXE, as well as the Leader Algorithm classification LND proved to be the best options, in terms of their ability to isolate the vast majority of fire events in a small number of classes with increased frequency of occurrence. It is estimated that these 3 schemes, in combination with medium-range to seasonal climate forecasts, could be used by wildfire risk managers to provide increased wildfire prediction accuracy. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Computer-aided diagnosis of pulmonary diseases using x-ray darkfield radiography

    NASA Astrophysics Data System (ADS)

    Einarsdóttir, Hildur; Yaroshenko, Andre; Velroyen, Astrid; Bech, Martin; Hellbach, Katharina; Auweter, Sigrid; Yildirim, Önder; Meinel, Felix G.; Eickelberg, Oliver; Reiser, Maximilian; Larsen, Rasmus; Kjær Ersbøll, Bjarne; Pfeiffer, Franz

    2015-12-01

    In this work we develop a computer-aided diagnosis (CAD) scheme for classification of pulmonary disease for grating-based x-ray radiography. In addition to conventional transmission radiography, the grating-based technique provides a dark-field imaging modality, which utilizes the scattering properties of the x-rays. This modality has shown great potential for diagnosing early stage emphysema and fibrosis in mouse lungs in vivo. The CAD scheme is developed to assist radiologists and other medical experts to develop new diagnostic methods when evaluating grating-based images. The scheme consists of three stages: (i) automatic lung segmentation; (ii) feature extraction from lung shape and dark-field image intensities; (iii) classification between healthy, emphysema and fibrosis lungs. A study of 102 mice was conducted with 34 healthy, 52 emphysema and 16 fibrosis subjects. Each image was manually annotated to build an experimental dataset. System performance was assessed by: (i) determining the quality of the segmentations; (ii) validating emphysema and fibrosis recognition by a linear support vector machine using leave-one-out cross-validation. In terms of segmentation quality, we obtained an overlap percentage (Ω) 92.63  ±  3.65%, Dice Similarity Coefficient (DSC) 89.74  ±  8.84% and Jaccard Similarity Coefficient 82.39  ±  12.62%. For classification, the accuracy, sensitivity and specificity of diseased lung recognition were 100%. Classification between emphysema and fibrosis resulted in an accuracy of 93%, whilst the sensitivity was 94% and specificity 88%. In addition to the automatic classification of lungs, deviation maps created by the CAD scheme provide a visual aid for medical experts to further assess the severity of pulmonary disease in the lung, and highlight affected regions.
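The segmentation-quality measures used above can be computed for binary masks as follows (generic definitions, not the study's code):

```python
import numpy as np

def overlap_metrics(seg, ref):
    """Dice and Jaccard similarity coefficients between two binary masks.
    Dice = 2|A∩B| / (|A|+|B|); Jaccard = |A∩B| / |A∪B|."""
    seg, ref = np.asarray(seg, bool), np.asarray(ref, bool)
    inter = (seg & ref).sum()
    dice = 2 * inter / (seg.sum() + ref.sum())
    jaccard = inter / (seg | ref).sum()
    return dice, jaccard
```

The two are monotonically related (Jaccard = Dice / (2 - Dice)), which is why the reported DSC of 89.74% corresponds to the lower Jaccard value of 82.39%.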

  20. Centrifuge: rapid and sensitive classification of metagenomic sequences.

    PubMed

    Kim, Daehwan; Song, Li; Breitwieser, Florian P; Salzberg, Steven L

    2016-12-01

    Centrifuge is a novel microbial classification engine that enables rapid, accurate, and sensitive labeling of reads and quantification of species on desktop computers. The system uses an indexing scheme based on the Burrows-Wheeler transform (BWT) and the Ferragina-Manzini (FM) index, optimized specifically for the metagenomic classification problem. Centrifuge requires a relatively small index (4.2 GB for 4078 bacterial and 200 archaeal genomes) and classifies sequences at very high speed, allowing it to process the millions of reads from a typical high-throughput DNA sequencing run within a few minutes. Together, these advances enable timely and accurate analysis of large metagenomics data sets on conventional desktop computers. Because of its space-optimized indexing schemes, Centrifuge also makes it possible to index the entire NCBI nonredundant nucleotide sequence database (a total of 109 billion bases) with an index size of 69 GB, in contrast to k-mer-based indexing schemes, which require far more extensive space. © 2016 Kim et al.; Published by Cold Spring Harbor Laboratory Press.
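Centrifuge's index rests on the Burrows-Wheeler transform; a toy BWT via naive rotation sorting (real indexers use suffix-array construction rather than this quadratic approach) behaves like:

```python
def bwt(text):
    """Burrows-Wheeler transform: the last column of the lexicographically
    sorted rotations of text plus a sentinel. The FM-index adds rank and
    checkpoint structures on top of this string to support fast search."""
    text += "$"  # sentinel, lexicographically smallest character
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return "".join(r[-1] for r in rotations)
```

The transform groups identical characters from similar contexts together (e.g. the run of "a"s below), which is what makes the index both compressible and searchable.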

  1. "Interactive Classification Technology"

    NASA Technical Reports Server (NTRS)

    deBessonet, Cary

    1999-01-01

    The investigators are upgrading a knowledge representation language called SL (Symbolic Language) and an automated reasoning system called SMS (Symbolic Manipulation System) to enable the technologies to be used in automated reasoning and interactive classification systems. The overall goals of the project are: a) the enhancement of the representation language SL to accommodate multiple perspectives and a wider range of meaning; b) the development of a sufficient set of operators to enable the interpreter of SL to handle representations of basic cognitive acts; and c) the development of a default inference scheme to operate over SL notation as it is encoded. As to particular goals, the first-year work plan focused on inferencing and representation issues, including: 1) the development of higher-level cognitive/classification functions and conceptual models for use in inferencing and decision making; 2) the specification of a more detailed scheme of defaults and the enrichment of SL notation to accommodate the scheme; and 3) the adoption of additional perspectives for inferencing.

  2. Combination of support vector machine, artificial neural network and random forest for improving the classification of convective and stratiform rain using spectral features of SEVIRI data

    NASA Astrophysics Data System (ADS)

    Lazri, Mourad; Ameur, Soltane

    2018-05-01

    A model combining three classifiers, namely Support vector machine, Artificial neural network and Random forest (SAR) is designed for improving the classification of convective and stratiform rain. This model (SAR model) has been trained and then tested on datasets derived from MSG-SEVIRI (Meteosat Second Generation-Spinning Enhanced Visible and Infrared Imager). Well-classified, mid-classified and misclassified pixels are determined from the combination of the three classifiers. Mid-classified and misclassified pixels, which are considered unreliable, are reclassified by using a novel training of the developed scheme. In this novel training, only the input data corresponding to the pixels in question are used. This whole process is repeated a second time and applied to mid-classified and misclassified pixels separately. Learning and validation of the developed scheme are realized against co-located data observed by ground radar. The developed scheme outperformed the different classifiers used separately and reached 97.40% overall classification accuracy.

  3. Microtopographic characterization of ice-wedge polygon landscape in Barrow, Alaska: a digital map of troughs, rims, centers derived from high resolution (0.25 m) LiDAR data

    DOE Data Explorer

    Gangodagamage, Chandana; Wullschleger, Stan

    2014-07-03

    The dataset represents a microtopographic characterization of the ice-wedge polygon landscape in Barrow, Alaska. Three microtopographic features are delineated using a 0.25 m high-resolution digital elevation dataset derived from LiDAR. Troughs, rims, and centers are the three categories in this classification scheme. The polygon troughs are the surface expression of the ice wedges and lie at lower elevations than the polygon interior. The elevated shoulders of the polygon interior immediately adjacent to the troughs are the polygon rims for low-center polygons. In the case of high-center polygons, these features are the topographic highs. In this classification scheme, both topographic highs and rims are treated as polygon rims. The next version of the dataset will include a more refined classification scheme with separate classes for rims and topographic highs. The interior part of the polygon, just inside the polygon rims, constitutes the polygon centers.
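    A toy version of such a trough/center/rim labelling can be written as a relative-elevation rule. The offsets below and the use of a simple mean reference surface are illustrative assumptions only; the actual dataset was produced from 0.25 m LiDAR with a more refined workflow.

    ```python
    def label_cells(elev, low=-0.1, high=0.1):
        """Label each elevation (m) by its offset from the mean surface.

        Cells well below the mean become troughs, cells well above become rims,
        and the rest become centers (assumed thresholds, for illustration).
        """
        mean = sum(elev) / len(elev)
        labels = []
        for z in elev:
            d = z - mean
            if d < low:
                labels.append("trough")
            elif d > high:
                labels.append("rim")
            else:
                labels.append("center")
        return labels

    print(label_cells([4.3, 4.5, 4.7, 4.5]))  # -> ['trough', 'center', 'rim', 'center']
    ```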

  4. A curricula-based comparison of biomedical and health informatics programs in the USA

    PubMed Central

    Hemminger, Bradley M

    2011-01-01

    Objective The field of Biomedical and Health Informatics (BMHI) continues to define itself, and there are many educational programs offering ‘informatics’ degrees with varied foci. The goal of this study was to develop a scheme for the systematic comparison of programs across the entire BMHI spectrum and to identify commonalities among informatics curricula. Design Guided by several published competency sets, a grounded theory approach was used to develop a program/curricula categorization scheme based on the descriptions of 636 courses offered by 73 public health, nursing, health, medical, and bioinformatics programs in the USA. The scheme was then used to compare the programs in the aforementioned five informatics disciplines. Results The authors developed a Course-Based Informatics Program Categorization (CBIPC) scheme that can be used both to classify coursework for any BMHI educational program and to compare programs from the same or related disciplines. The application of the CBIPC scheme to the analysis of public health, nursing, health, medical, and bioinformatics programs reveals distinct intradisciplinary curricular patterns and a common core of courses across the entire BMHI education domain. Limitations The study is based on descriptions of courses from university web pages. Thus, it is limited to sampling courses at one moment in time, and classification for the coding scheme is based primarily on course titles and course descriptions. Conclusion The CBIPC scheme combines empirical data about educational curricula from diverse informatics programs with several published competency sets. It also provides a foundation for discussion of BMHI education as a whole and can help define subdisciplinary competencies. PMID:21292707

  5. Dewey Decimal Classification for U. S. Conn: An Advantage?

    ERIC Educational Resources Information Center

    Marek, Kate

    This paper examines the use of the Dewey Decimal Classification (DDC) system at the U. S. Conn Library at Wayne State College (WSC) in Nebraska. Several developments in the last 20 years which have eliminated the trend toward reclassification of academic library collections from DDC to the Library of Congress (LC) classification scheme are…

  6. A Global Classification System for Catchment Hydrology

    NASA Astrophysics Data System (ADS)

    Woods, R. A.

    2004-05-01

    It is a shocking state of affairs - there is no underpinning scientific taxonomy of catchments. There are widely used global classification systems for climate, river morphology, lakes and wetlands, but for river catchments there exists only a plethora of inconsistent, incomplete regional schemes. By proceeding without a common taxonomy for catchments, freshwater science has missed one of its key developmental stages, and has leapt from definition of phenomena to experiments, theories and models, without the theoretical framework of a classification. I propose the development of a global hierarchical classification system for physical aspects of river catchments, to help underpin physical science in the freshwater environment and provide a solid foundation for classification of river ecosystems. Such a classification scheme can open completely new vistas in hydrology: for example it will be possible to (i) rationally transfer experimental knowledge of hydrological processes between basins anywhere in the world, provided they belong to the same class; (ii) perform meaningful meta-analyses in order to reconcile studies that show inconsistent results; and (iii) generate new testable hypotheses which involve locations worldwide.

  7. Heuristic pattern correction scheme using adaptively trained generalized regression neural networks.

    PubMed

    Hoya, T; Chambers, J A

    2001-01-01

    In many pattern classification problems, an intelligent neural system is required that can incrementally learn newly encountered but misclassified patterns while maintaining good classification performance on the past patterns stored in the network. In this paper, a heuristic pattern correction scheme is proposed using adaptively trained generalized regression neural networks (GRNNs). The scheme is based upon both a network growing and a dual-stage shrinking mechanism. In the network growing phase, a subset of the misclassified patterns in each incoming data set is iteratively added to the network until all the patterns in the incoming data set are classified correctly. The redundancy introduced in the growing phase is then removed in the dual-stage network shrinking. Both long- and short-term memory models, motivated by biological studies of the brain, are considered in the network shrinking. The learning capability of the proposed scheme is investigated through extensive simulation studies.
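    The basic GRNN output underlying such a scheme is a Gaussian-kernel weighted average over the stored patterns (Nadaraya-Watson regression). The sketch below shows only that prediction step; the paper's growing and dual-stage shrinking mechanisms are not reproduced, and the pattern/target values are made up for illustration.

    ```python
    import math

    def grnn_predict(x, patterns, targets, sigma=1.0):
        """GRNN output: average of stored targets, weighted by a Gaussian
        kernel of the squared distance between x and each stored pattern."""
        weights = [
            math.exp(-sum((a - b) ** 2 for a, b in zip(x, p)) / (2 * sigma ** 2))
            for p in patterns
        ]
        total = sum(weights)
        return sum(w * t for w, t in zip(weights, targets)) / total

    patterns = [(0.0, 0.0), (1.0, 1.0)]
    targets = [0.0, 1.0]
    # Query near the first stored pattern: output is pulled toward its target.
    print(grnn_predict((0.0, 0.0), patterns, targets, sigma=0.5))
    ```

    Network growing then amounts to appending misclassified (pattern, target) pairs to these lists, and shrinking to pruning redundant ones.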

  8. Particle-size distribution models for the conversion of Chinese data to FAO/USDA system.

    PubMed

    Shangguan, Wei; Dai, YongJiu; García-Gutiérrez, Carlos; Yuan, Hua

    2014-01-01

    We investigated eleven particle-size distribution (PSD) models to determine the models most appropriate for describing the PSDs of 16,349 Chinese soil samples. These data are based on three soil texture classification schemes: one ISSS (International Society of Soil Science) scheme with four data points and two Katschinski schemes with five and six data points, respectively. The adjusted coefficient of determination (r²), Akaike's information criterion (AIC), and the geometric mean error ratio (GMER) were used to evaluate model performance. The soil data were converted to the USDA (United States Department of Agriculture) standard using the PSD models and the fractal concept. The performance of the PSD models was affected by soil texture and by the fraction scheme of the classification, and it also varied with the clay content of the soils. The Anderson, Fredlund, modified logistic growth, Skaggs, and Weibull models performed best.
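    The conversion problem can be illustrated with a much simpler stand-in for the fitted PSD models: log-linear interpolation of a sparse cumulative distribution to a USDA boundary diameter. The four data points and percentages below are invented for illustration; the paper evaluates parametric models (Weibull, Fredlund, etc.) rather than this interpolation.

    ```python
    import math

    def interp_cumulative(diam_um, cum_pct, target_um):
        """Log-linear interpolation of cumulative % finer at a target diameter.

        diam_um must be sorted ascending; target_um must lie inside its range.
        """
        points = list(zip(diam_um, cum_pct))
        for (d0, p0), (d1, p1) in zip(points, points[1:]):
            if d0 <= target_um <= d1:
                frac = (math.log(target_um) - math.log(d0)) / (math.log(d1) - math.log(d0))
                return p0 + frac * (p1 - p0)
        raise ValueError("target outside data range")

    # ISSS-style points: % finer than 2, 20, 200, 2000 um (illustrative values)
    diams = [2.0, 20.0, 200.0, 2000.0]
    cums = [25.0, 55.0, 90.0, 100.0]
    silt_sand_usda = interp_cumulative(diams, cums, 50.0)  # USDA 50-um cut point
    print(round(silt_sand_usda, 1))  # -> 68.9
    ```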

  9. Semantic Labelling of Ultra Dense Mls Point Clouds in Urban Road Corridors Based on Fusing Crf with Shape Priors

    NASA Astrophysics Data System (ADS)

    Yao, W.; Polewski, P.; Krzystek, P.

    2017-09-01

    In this paper, a labelling method for the semantic analysis of ultra-high-point-density MLS data (up to 4000 points/m2) in urban road corridors is developed, based on combining a conditional random field (CRF) for the context-based classification of 3D point clouds with shape priors. The CRF uses a random forest (RF) to generate the unary potentials of nodes and a variant of the contrast-sensitive Potts model for the pairwise potentials of node edges. The classification is founded on various geometric features derived from covariance matrices and a local accumulation map of spatial coordinates based on local neighbourhoods. Meanwhile, in order to cope with the ultra-high point density, a plane-based region-growing method combined with a rule-based classifier is first applied to fix semantic labels for man-made objects. Once these points, which usually account for the majority of the data, are pre-labeled, the CRF classifier can be solved by optimizing the discriminative probability for nodes within a subgraph structure that excludes the pre-labeled nodes. The process can be viewed as an evidence-fusion step that infers a degree of belief for point labelling from different sources. The MLS data used for this study were acquired by a vehicle-borne Z+F phase-based laser scanner, which permits the generation of a point cloud with an ultra-high sampling rate and accuracy. The test sites are parts of Munich City, which is assumed to consist of seven object classes, including impervious surfaces, trees, building roofs/facades, low vegetation, vehicles and poles. The competitive classification performance can be explained by diverse factors: for example, the above-ground height highlights the vertical dimension of houses, trees and even cars; it is also attributable to the decision-level fusion of the graph-based contextual classification approach with shape priors. The use of context-based classification methods mainly contributed to smoothing the labelling by removing outliers and to improvements in underrepresented object classes. In addition, the routine operation of context-based classification for such high-density MLS data becomes much more efficient, comparable to non-contextual classification schemes.

  10. Are we in the dark ages of environmental toxicology?

    PubMed

    McCarty, L S

    2013-12-01

    Environmental toxicology is judged to be in a "dark ages" period due to longstanding limitations in the implementation of the simple conceptual model that is the basis of current aquatic toxicity testing protocols. Fortunately, the environmental regulatory revolution of the last half-century is not substantially compromised, as past regulatory guidance was designed to deal with limited amounts of relatively poor-quality toxicity data. However, as regulatory objectives have substantially increased in breadth and depth, aquatic toxicity data derived with old testing methods are no longer adequate. In the near term, explicit model description and routine assumption validation should be mandatory. Updated testing methods could provide some improvements in toxicological data quality. A thorough reevaluation of toxicity testing objectives and methods, resulting in substantially revised standard testing methods plus a comprehensive scheme for the classification of modes/mechanisms of toxic action, should be the long-term objective. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Log-ratio transformed major element based multidimensional classification for altered High-Mg igneous rocks

    NASA Astrophysics Data System (ADS)

    Verma, Surendra P.; Rivera-Gómez, M. Abdelaly; Díaz-González, Lorena; Quiroz-Ruiz, Alfredo

    2016-12-01

    A new multidimensional classification scheme consistent with the chemical classification of the International Union of Geological Sciences (IUGS) is proposed for the nomenclature of High-Mg altered rocks. Our procedure is based on an extensive database of major element (SiO2, TiO2, Al2O3, Fe2O3t, MnO, MgO, CaO, Na2O, K2O, and P2O5) compositions of a total of 33,868 (920 High-Mg and 32,948 "Common") relatively fresh igneous rock samples. The database consisting of these multinormally distributed samples in terms of their isometric log-ratios was used to propose a set of 11 discriminant functions and 6 diagrams to facilitate High-Mg rock classification. The multinormality required by linear discriminant and canonical analysis was ascertained by a new computer program DOMuDaF. One multidimensional function can distinguish the High-Mg and Common igneous rocks with high percent success values of about 86.4% and 98.9%, respectively. Similarly, from 10 discriminant functions the High-Mg rocks can also be classified as one of the four rock types (komatiite, meimechite, picrite, and boninite), with high success values of about 88%-100%. Satisfactory functioning of this new classification scheme was confirmed by seven independent tests. Five further case studies involving application to highly altered rocks illustrate the usefulness of our proposal. A computer program HMgClaMSys was written to efficiently apply the proposed classification scheme, which will be available for online processing of igneous rock compositional data. Monte Carlo simulation modeling and mass-balance computations confirmed the robustness of our classification with respect to analytical errors and postemplacement compositional changes.

  12. Adaptive video-based vehicle classification technique for monitoring traffic.

    DOT National Transportation Integrated Search

    2015-08-01

    This report presents a methodology for extracting two vehicle features, vehicle length and number of axles, in order to classify vehicles from video based on the Federal Highway Administration's (FHWA) recommended vehicle classification scheme....

  13. Stygoregions – a promising approach to a bioregional classification of groundwater systems

    PubMed Central

    Stein, Heide; Griebler, Christian; Berkhoff, Sven; Matzke, Dirk; Fuchs, Andreas; Hahn, Hans Jürgen

    2012-01-01

    Linked to diverse biological processes, groundwater ecosystems deliver essential services to mankind, the most important of which is the provision of drinking water. In contrast to surface waters, ecological aspects of groundwater systems are ignored by current European Union and national legislation. Groundwater management and protection measures refer exclusively to its good physicochemical and quantitative status. Current initiatives to develop ecologically sound integrative assessment schemes that take groundwater fauna into account depend on an initial classification of subsurface bioregions. In a large-scale survey, the regional and biogeographical distribution patterns of groundwater-dwelling invertebrates were examined for many parts of Germany. Following an exploratory approach, our results underline that the distribution patterns of invertebrates in groundwater are not in accordance with any existing bioregional classification system established for surface habitats. In consequence, we propose to develop a new classification scheme for groundwater ecosystems based on stygoregions. PMID:22993698

  14. Modern classification and outcome predictors of surgery in patients with brain arteriovenous malformations.

    PubMed

    Tayebi Meybodi, Ali; Lawton, Michael T

    2018-02-23

    Brain arteriovenous malformations (bAVM) are challenging lesions. Part of this challenge stems from the immense diversity of these lesions in shape, location, anatomy, and physiology. This diversity has called for a variety of treatment modalities, of which microsurgical resection prevails as the mainstay of treatment. As such, outcome prediction and management strategy rely mainly on unraveling the nature of these complex tangles and the ways each lesion responds to various therapeutic modalities. This strategy requires the ability to decipher each lesion through accurate and efficient categorization. Therefore, classification schemes are essential parts of treatment planning and outcome prediction. This article summarizes the different surgical classification schemes and outcome predictors proposed for bAVMs.

  15. Medical X-ray Image Hierarchical Classification Using a Merging and Splitting Scheme in Feature Space.

    PubMed

    Fesharaki, Nooshin Jafari; Pourghassem, Hossein

    2013-07-01

    Due to the daily mass production and widespread variation of medical X-ray images, it is necessary to classify these images for searching and retrieval purposes, especially for content-based medical image retrieval systems. In this paper, a medical X-ray image hierarchical classification structure based on a novel merging and splitting scheme and using shape and texture features is proposed. In the first level of the proposed structure, to improve classification performance, classes that are similar in shape content are grouped into general overlapped classes based on merging measures and shape features. In the next levels of the structure, the overlapped classes are split into smaller classes based on the classification performance of a combination of shape and texture features, or of texture features only. This procedure continues through the last levels until all classes are formed separately. Moreover, to optimize the feature vector in the proposed structure, we use an orthogonal forward selection algorithm with a Mahalanobis class separability measure for feature selection and reduction. In other words, according to the complexity and inter-class distance of each class, a sub-space of the feature space is selected at each level, and then a supervised merging and splitting scheme is applied to form the hierarchical classification. The proposed structure is evaluated on a database of 2158 medical X-ray images in 18 classes (the IMAGECLEF 2005 database), and an accuracy rate of 93.6% is obtained in the last level of the hierarchical structure for the 18-class classification problem.

  16. Ecosystem classifications based on summer and winter conditions.

    PubMed

    Andrew, Margaret E; Nelson, Trisalyn A; Wulder, Michael A; Hobart, George W; Coops, Nicholas C; Farmer, Carson J Q

    2013-04-01

    Ecosystem classifications map an area into relatively homogenous units for environmental research, monitoring, and management. However, their effectiveness is rarely tested. Here, three classifications are (1) defined and characterized for Canada along summertime productivity (moderate-resolution imaging spectrometer fraction of absorbed photosynthetically active radiation) and wintertime snow conditions (special sensor microwave/imager snow water equivalent), independently and in combination, and (2) comparatively evaluated to determine the ability of each classification to represent the spatial and environmental patterns of alternative schemes, including the Canadian ecozone framework. All classifications depicted similar patterns across Canada, but detailed class distributions differed. Class spatial characteristics varied with environmental conditions within classifications, but were comparable between classifications. There was moderate correspondence between classifications. The strongest association was between productivity classes and ecozones. The classification along both productivity and snow balanced these two sets of variables, yielding intermediate levels of association in all pairwise comparisons. Despite relatively low spatial agreement between classifications, they successfully captured patterns of the environmental conditions underlying alternate schemes (e.g., snow classes explained variation in productivity and vice versa). The performance of ecosystem classifications and the relevance of their input variables depend on the environmental patterns and processes used for applications and evaluation. Productivity or snow regimes, as constructed here, may be desirable when summarizing patterns controlled by summer- or wintertime conditions, respectively, or of climate change responses. General purpose ecosystem classifications should include both sets of drivers. 
Classifications should be carefully, quantitatively, and comparatively evaluated relative to a particular application prior to their implementation as monitoring and assessment frameworks.

  17. Cleaning and Cleanliness Measurement of Additive Manufactured Parts

    NASA Technical Reports Server (NTRS)

    Welker, Roger W.; Mitchell, Mark A.

    2015-01-01

    The successful acquisition and utilization of piece parts and assemblies for contamination-sensitive applications requires the application of cleanliness acceptance criteria. Contamination can be classified using many different schemes; one common scheme distinguishes organic, ionic, and particulate contaminants. These may be present in and on the surface of solid components and assemblies or may be dispersed in various gaseous or liquid media. This discussion focuses on insoluble particle contamination on the surface of piece parts and assemblies. Cleanliness of parts can be controlled using two strategies, referred to as gross cleanliness and precision cleanliness. Under a gross cleanliness strategy, acceptance is based on visual cleanliness. This approach introduces a number of concerns that render it unsuitable for controlling the cleanliness of high-technology products. Under the precision cleanliness strategy, subjective visual assessment of cleanliness is replaced by objective measurement. When a precision cleanliness strategy is adopted, there naturally arises the question: how clean is clean enough? The six commonly used methods for establishing objective cleanliness acceptance limits are discussed, with special emphasis on multiple extraction, a technique that has been demonstrated for additively manufactured parts.

  18. Weather types and the regime of wildfires in Portugal

    NASA Astrophysics Data System (ADS)

    Pereira, M. G.; Trigo, R. M.; Dacamara, C. C.

    2009-04-01

    An objective classification scheme, as developed by Trigo and DaCamara (2000), was applied to classify the daily atmospheric circulation affecting Portugal between 1980 and 2007 into a set of 10 basic weather types (WTs). The classification scheme relies on a set of atmospheric circulation indices, namely southerly flow (SF), westerly flow (WF), total flow (F), southerly shear vorticity (ZS), westerly shear vorticity (ZW) and total vorticity (Z). The weather-typing approach, together with surface meteorological variables (e.g. intensity and direction of the geostrophic wind, maximum and minimum temperature, and precipitation), was then associated with wildfire events as recorded in the official Portuguese fire database, which contains information on each fire that occurred in the 18 districts of Continental Portugal within the same period (>450,000 events). The objective of this study is to explore the dependence of wildfire activity on weather and climate and to evaluate the potential of WTs to discriminate among recorded wildfires with respect to their occurrence and development. Results show that days characterised by surface flow with an eastern component (i.e. NE, E and SE) account for a high percentage of daily burnt area, as opposed to surface westerly flow (NW, W and SW), which represents about a quarter of the total number of days but accounts for only a very low percentage of active fires and burnt area. Meteorological variables such as minimum and maximum temperatures, which are closely associated with surface wind intensity and direction, also show a good ability to discriminate between the different types of fire events. Trigo R.M., DaCamara C. (2000) "Circulation Weather Types and their impact on the precipitation regime in Portugal". Int J of Climatology, 20, 1559-1581.
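    A weather-typing rule of this kind can be sketched from the flow indices alone. The eight 45-degree directional sectors and the vorticity-dominance test (|Z| > 2F gives a pure cyclonic or anticyclonic type) below are simplified assumptions in the spirit of circulation weather-type schemes, not the exact thresholds of Trigo and DaCamara (2000).

    ```python
    import math

    DIRS = ['N', 'NE', 'E', 'SE', 'S', 'SW', 'W', 'NW']

    def weather_type(SF, WF, Z):
        """Assign a weather type from southerly flow SF, westerly flow WF,
        and total vorticity Z (assumed sign convention: Z > 0 cyclonic)."""
        F = math.hypot(SF, WF)  # total flow strength
        if abs(Z) > 2 * F:      # vorticity dominates the flow
            return 'C' if Z > 0 else 'A'
        # Direction the geostrophic flow comes FROM, clockwise from north,
        # then binned into 45-degree sectors.
        deg = (math.degrees(math.atan2(WF, SF)) + 180.0) % 360.0
        return DIRS[int((deg + 22.5) % 360 // 45)]

    print(weather_type(SF=0.0, WF=10.0, Z=0.0))  # westerly flow -> 'W'
    print(weather_type(SF=0.0, WF=0.5, Z=8.0))   # strong positive vorticity -> 'C'
    ```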

  19. A Visual Basic program to classify sediments based on gravel-sand-silt-clay ratios

    USGS Publications Warehouse

    Poppe, L.J.; Eliason, A.H.; Hastings, M.E.

    2003-01-01

    Nomenclature describing size distributions is important to geologists because grain size is the most basic attribute of sediments. Traditionally, geologists have divided sediments into four size fractions that include gravel, sand, silt, and clay, and classified these sediments based on ratios of the various proportions of the fractions. Definitions of these fractions have long been standardized to the grade scale described by Wentworth (1922), and two main classification schemes have been adopted to describe the approximate relationship between the size fractions. Specifically, according to the Wentworth grade scale gravel-sized particles have a nominal diameter of ⩾2.0 mm; sand-sized particles have nominal diameters from <2.0 mm to ⩾62.5 μm; silt-sized particles have nominal diameters from <62.5 to ⩾4.0 μm; and clay is <4.0 μm. As for sediment classification, most sedimentologists use one of the systems described either by Shepard (1954) or Folk (1954, 1974). The original scheme devised by Shepard (1954) utilized a single ternary diagram with sand, silt, and clay in the corners to graphically show the relative proportions among these three grades within a sample. This scheme, however, does not allow for sediments with significant amounts of gravel. Therefore, Shepard's classification scheme (Fig. 1) was subsequently modified by the addition of a second ternary diagram to account for the gravel fraction (Schlee, 1973). The system devised by Folk (1954, 1974) is also based on two triangular diagrams (Fig. 2), but it has 23 major categories, and uses the term mud (defined as silt plus clay). The patterns within the triangles of both systems differ, as does the emphasis placed on gravel. For example, in the system described by Shepard, gravelly sediments have more than 10% gravel; in Folk's system, slightly gravelly sediments have as little as 0.01% gravel. 
Folk's classification scheme stresses gravel because its concentration is a function of the highest current velocity at the time of deposition, together with the maximum grain size of the detritus that is available; Shepard's classification scheme emphasizes the ratios of sand, silt, and clay because they reflect sorting and reworking (Poppe et al., 2000).
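    The kind of ratio-based lookup the Visual Basic program performs can be sketched in a few lines. The rules below (the 10% gravel cutoff from Shepard, a 75% dominance threshold for end-member classes, and two-fraction mixed names) are a deliberately simplified stand-in for the full ternary-diagram fields of Shepard (1954) and Schlee (1973).

    ```python
    # Adjective forms for the subordinate fraction in mixed classes.
    ADJ = {"sand": "sandy", "silt": "silty", "clay": "clayey"}

    def classify_shepard(gravel, sand, silt, clay):
        """Very simplified Shepard-style sediment name from percent fractions."""
        if abs(gravel + sand + silt + clay - 100.0) > 0.01:
            raise ValueError("fractions must sum to 100%")
        if gravel > 10.0:
            return "gravelly sediment"  # would be refined on the gravel ternary diagram
        fractions = {"sand": sand, "silt": silt, "clay": clay}
        name = max(fractions, key=fractions.get)
        if fractions[name] >= 75.0:
            return name                  # end-member field, e.g. 'sand'
        second = max((k for k in fractions if k != name), key=fractions.get)
        return f"{ADJ[second]} {name}"   # mixed field, e.g. 'silty sand'

    print(classify_shepard(0.0, 80.0, 15.0, 5.0))    # -> 'sand'
    print(classify_shepard(0.0, 60.0, 30.0, 10.0))   # -> 'silty sand'
    print(classify_shepard(20.0, 50.0, 20.0, 10.0))  # -> 'gravelly sediment'
    ```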

  20. Etiologic classification of TIA and minor stroke by A-S-C-O and causative classification system as compared to TOAST reduces the proportion of patients categorized as cause undetermined.

    PubMed

    Desai, Jamsheed A; Abuzinadah, Ahmad R; Imoukhuede, Oje; Bernbaum, Manya L; Modi, Jayesh; Demchuk, Andrew M; Coutts, Shelagh B

    2014-01-01

    The sorting of patients based on the underlying pathophysiology is central to preventing recurrent stroke after a transient ischemic attack or minor stroke (TIA-MS). The causative classification of stroke (CCS) and the A-S-C-O (A for atherosclerosis, S for small vessel disease, C for cardiac source, O for other cause) classification schemes have recently been developed. These systems have not been specifically applied to the TIA-MS population. We hypothesized that both CCS and A-S-C-O would increase the proportion of patients with a definitive etiologic mechanism for TIA-MS as compared with TOAST. Patients were analyzed from the CATCH study. A single stroke physician assigned all patients to an etiologic subtype using published algorithms for TOAST, CCS and ASCO. We compared the proportions in the various categories for each classification scheme and then assessed the association with stroke progression or recurrence. The TOAST, CCS and A-S-C-O classification schemes were applied to 469 TIA-MS patients. When compared with TOAST, both CCS (58.0 vs. 65.3%; p < 0.0001) and ASCO grade 1 or 2 (37.5 vs. 65.3%; p < 0.0001) assigned fewer patients as cause undetermined. CCS had increased assignment of cardioembolism (+3.8%, p = 0.0001) as compared with TOAST. ASCO grade 1 or 2 had increased assignment of cardioembolism (+8.5%, p < 0.0001), large artery atherosclerosis (+14.9%, p < 0.0001) and small artery occlusion (+4.3%, p < 0.0001) as compared with TOAST. Compared with CCS, using ASCO resulted in a 20.5% absolute reduction in patients assigned to the 'cause undetermined' category (p < 0.0001). Patients who had multiple high-risk etiologies by either the CCS or ASCO classification, or an ASCO undetermined classification, had a higher chance of having a recurrent event. Both the CCS and ASCO schemes reduce the proportion of TIA and minor stroke patients classified as 'cause undetermined.' ASCO resulted in the fewest patients classified as cause undetermined. 
Stroke recurrence after TIA-MS is highest in patients with multiple high-risk etiologies or cryptogenic stroke classified by ASCO. © 2014 S. Karger AG, Basel.

  1. Development of a Hazard Classification Scheme for Substances Used in the Fraudulent Adulteration of Foods.

    PubMed

    Everstine, Karen; Abt, Eileen; McColl, Diane; Popping, Bert; Morrison-Rowe, Sara; Lane, Richard W; Scimeca, Joseph; Winter, Carl; Ebert, Andrew; Moore, Jeffrey C; Chin, Henry B

    2018-01-01

    Food fraud, the intentional misrepresentation of the true identity of a food product or ingredient for economic gain, is a threat to consumer confidence and public health and has received increased attention from both regulators and the food industry. Following updates to food safety certification standards and publication of new U.S. regulatory requirements, we undertook a project to (i) develop a scheme to classify food fraud-related adulterants based on their potential health hazard and (ii) apply this scheme to the adulterants in a database of 2,970 food fraud records. The classification scheme was developed by a panel of experts in food safety and toxicology from the food industry, academia, and the U.S. Food and Drug Administration. Categories and subcategories were created through an iterative process of proposal, review, and validation using a subset of substances known to be associated with the fraudulent adulteration of foods. Once developed, the scheme was applied to the adulterants in the database. The resulting scheme included three broad categories: 1, potentially hazardous adulterants; 2, adulterants that are unlikely to be hazardous; and 3, unclassifiable adulterants. Categories 1 and 2 consisted of seven subcategories intended to further define the range of hazard potential for adulterants. Application of the scheme to the 1,294 adulterants in the database resulted in 45% of adulterants classified in category 1 (potentially hazardous). Twenty-seven percent of the 1,294 adulterants had a history of causing consumer illness or death, were associated with safety-related regulatory action, or were classified as allergens. These results reinforce the importance of including a consideration of food fraud-related adulterants in food safety systems. This classification scheme supports food fraud mitigation efforts and hazard identification as required in the U.S. Food Safety Modernization Act Preventive Controls Rules.

  2. Functional traits, convergent evolution, and periodic tables of niches.

    PubMed

    Winemiller, Kirk O; Fitzgerald, Daniel B; Bower, Luke M; Pianka, Eric R

    2015-08-01

    Ecology is often said to lack general theories sufficiently predictive for applications. Here, we examine the concept of a periodic table of niches and feasibility of niche classification schemes from functional trait and performance data. Niche differences and their influence on ecological patterns and processes could be revealed effectively by first performing data reduction/ordination analyses separately on matrices of trait and performance data compiled according to logical associations with five basic niche 'dimensions', or aspects: habitat, life history, trophic, defence and metabolic. Resultant patterns then are integrated to produce interpretable niche gradients, ordinations and classifications. Degree of scheme periodicity would depend on degrees of niche conservatism and convergence causing species clustering across multiple niche dimensions. We analysed a sample data set containing trait and performance data to contrast two approaches for producing niche schemes: species ordination within niche gradient space, and niche categorisation according to trait-value thresholds. Creation of niche schemes useful for advancing ecological knowledge and its applications will depend on research that produces functional trait and performance datasets directly related to niche dimensions along with criteria for data standardisation and quality. As larger databases are compiled, opportunities will emerge to explore new methods for data reduction, ordination and classification. © 2015 The Authors. Ecology Letters published by CNRS and John Wiley & Sons Ltd.

  3. Using dual classifications in the development of avian wetland indices of biological integrity for wetlands in West Virginia, USA.

    PubMed

    Veselka, Walter; Anderson, James T; Kordek, Walter S

    2010-05-01

    Considerable resources are being used to develop and implement bioassessment methods for wetlands to ensure that "biological integrity" is maintained under the United States Clean Water Act. Previous research has demonstrated that avian composition is susceptible to human impairments at multiple spatial scales. Using a site-specific disturbance gradient, we built avian wetland indices of biological integrity (AW-IBI) specific to two wetland classification schemes, one based on vegetative structure and the other based on the wetland's position in the landscape and sources of water. The resulting class-specific AW-IBI comprised one to four metrics that varied in their sensitivity to the disturbance gradient. Some of these metrics were specific to only one of the classification schemes, whereas others could discriminate varying levels of disturbance regardless of classification scheme. Overall, all of the derived biological indices specific to the vegetative structure-based classes of wetlands had a significant relation with the disturbance gradient; however, the biological index derived for floodplain wetlands exhibited a more consistent response to a local disturbance gradient. We suspect that the consistency of this response is due to the inherent nature of the connectivity of available habitat in floodplain wetlands.

  4. A Critical Review of Mode of Action (MOA) Assignment ...

    EPA Pesticide Factsheets

    There are various structure-based classification schemes to categorize chemicals based on mode of action (MOA), which have been applied in both ecological and human health toxicology. With increasing calls to assess thousands of chemicals, some of which have little available information other than structure, a clear understanding of how each of these MOA schemes was devised, what information they are based on, and the limitations of each approach is critical. Several groups are developing low-tier methods to more easily classify or assess chemicals, using approaches such as the ecological threshold of concern (eco-TTC) and chemical activity. Evaluation of these approaches and determination of their domain of applicability is partly dependent on the MOA classification that is used. The most commonly used MOA classification schemes for ecotoxicology include Verhaar and Russom (included in ASTER), both of which are used to predict acute aquatic toxicity MOA. Verhaar is a QSAR-based system that classifies chemicals into one of 4 classes, with a 5th class specified for those chemicals that are not classified in the other 4. ASTER/Russom includes 8 classifications: narcotics (3 groups), oxidative phosphorylation uncouplers, respiratory inhibitors, electrophiles/proelectrophiles, AChE inhibitors, or CNS seizure agents. Other methodologies include TEST (Toxicity Estimation Software Tool), a computational chemistry-based application that allows prediction to one of 5 broad MOA

  5. Global land cover mapping: a review and uncertainty analysis

    USGS Publications Warehouse

    Congalton, Russell G.; Gu, Jianyu; Yadav, Kamini; Thenkabail, Prasad S.; Ozdogan, Mutlu

    2014-01-01

    Given the advances in remotely sensed imagery and associated technologies, several global land cover maps have been produced in recent times including IGBP DISCover, UMD Land Cover, Global Land Cover 2000 and GlobCover 2009. However, the utility of these maps for specific applications has often been hampered due to considerable amounts of uncertainties and inconsistencies. A thorough review of these global land cover projects including evaluating the sources of error and uncertainty is prudent and enlightening. Therefore, this paper describes our work in which we compared, summarized and conducted an uncertainty analysis of the four global land cover mapping projects using an error budget approach. The results showed that the classification scheme and the validation methodology had the highest error contribution and implementation priority. A comparison of the classification schemes showed that there are many inconsistencies between the definitions of the map classes. This is especially true for the mixed type classes for which thresholds vary for the attributes/discriminators used in the classification process. Examination of these four global mapping projects provided quite a few important lessons for the future global mapping projects including the need for clear and uniform definitions of the classification scheme and an efficient, practical, and valid design of the accuracy assessment.

  6. A soft computing scheme incorporating ANN and MOV energy in fault detection, classification and distance estimation of EHV transmission line with FSC.

    PubMed

    Khadke, Piyush; Patne, Nita; Singh, Arvind; Shinde, Gulab

    2016-01-01

    In this article, a novel and accurate scheme for fault detection, classification and fault distance estimation for a fixed series compensated transmission line is proposed. The proposed scheme is based on an artificial neural network (ANN) and metal oxide varistor (MOV) energy, employing the Levenberg-Marquardt training algorithm. The novelty of this scheme is the use of the MOV energy signals of fixed series capacitors (FSC) as input to train the ANN; such an approach has not been used in earlier fault analysis algorithms. The proposed scheme uses only single-end measurements of the MOV energy signals in all three phases over one cycle from the occurrence of a fault. These MOV energy signals are then fed as input to the ANN for fault distance estimation. The feasibility and reliability of the proposed scheme have been evaluated for all ten fault types in the test power system model, at different fault inception angles and over numerous fault locations. Real transmission system parameters of the 3-phase 400 kV Wardha-Aurangabad transmission line (400 km) with 40% FSC at the Power Grid Wardha Substation, India, are considered for this research. Extensive simulation experiments show that the proposed scheme provides quite accurate results, demonstrating a complete protection scheme with high accuracy, simplicity and robustness.

  7. Evaluation of host and viral factors associated with severe dengue based on the 2009 WHO classification.

    PubMed

    Pozo-Aguilar, Jorge O; Monroy-Martínez, Verónica; Díaz, Daniel; Barrios-Palacios, Jacqueline; Ramos, Celso; Ulloa-García, Armando; García-Pillado, Janet; Ruiz-Ordaz, Blanca H

    2014-12-11

    Dengue fever (DF) is the most prevalent arthropod-borne viral disease affecting humans. The World Health Organization (WHO) proposed a revised classification in 2009 to enable the more effective identification of cases of severe dengue (SD). This was designed primarily as a clinical tool, but it also enables cases of SD to be differentiated into three specific subcategories (severe vascular leakage, severe bleeding, and severe organ dysfunction). However, no study has addressed whether this classification has an advantage in estimating factors associated with the progression of disease severity or dengue pathogenesis. We therefore evaluated, during a dengue outbreak, the risk factors that could contribute to the development of SD according to the 2009 WHO classification. A prospective cross-sectional study was performed during an epidemic of dengue in 2009 in Chiapas, Mexico. Data were analyzed for host and viral factors associated with dengue cases, using the 1997 and 2009 WHO classifications. The cost-benefit ratio (CBR) was also estimated. The sensitivity of the 1997 WHO classification for determining SD was 75%, and the specificity was 97.7%; for the 2009 scheme, these were 100% and 81.1%, respectively. The 2009 classification showed a higher benefit (537%) with a lower cost (10.2%) than the 1997 WHO scheme. A secondary antibody response was strongly associated with SD. Early viral load was higher in cases of SD than in those with DF. Logistic regression analysis identified predictive SD factors (secondary infection, disease phase, viral load) within the 2009 classification. However, within the 1997 scheme it was not possible to differentiate risk factors between DF and dengue hemorrhagic fever or dengue shock syndrome. The critical clinical stage for determining SD progression was the transition from fever to defervescence, in which plasma leakage can occur. The clinical phenotype of SD is influenced by host (secondary response) and viral (viral load) factors.
The 2009 WHO classification showed greater sensitivity to identify SD in real time. Timely identification of SD enables accurate early decisions, allowing proper management of health resources for the benefit of patients at risk for SD. This is possible based on the 2009 WHO classification.
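
    The diagnostic-accuracy figures above follow from standard confusion-matrix definitions; a minimal sketch (the counts below are hypothetical, chosen only to reproduce the reported 1997-scheme percentages, since the abstract does not give raw case numbers):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts reproducing 75% sensitivity and 97.7% specificity
sens, spec = sensitivity_specificity(tp=3, fn=1, tn=43, fp=1)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
```

    The 2009 scheme's pattern (100% sensitivity, 81.1% specificity) is the typical trade-off of a screening-oriented classification: no SD case is missed, at the cost of more non-severe cases flagged for observation.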

  8. COMPARISON OF GEOGRAPHIC CLASSIFICATION SCHEMES FOR MID-ATLANTIC STREAM FISH ASSEMBLAGES

    EPA Science Inventory

    Understanding the influence of geographic factors in structuring fish assemblages is crucial to developing a comprehensive assessment of stream conditions. We compared the classification strengths (CS) of geographic groups (ecoregions and catchments), stream order, and groups bas...

  9. Sorting Potatoes for Miss Bonner.

    ERIC Educational Resources Information Center

    Herreid, Clyde Freeman

    1998-01-01

    Discusses the basis of a classification scheme for types of case studies. Four major classification headings are identified: (1) individual assignment; (2) lecture; (3) discussion; and (4) small group activities. Describes each heading from the point of view of several teaching methods. (DDR)

  10. SOM Classification of Martian TES Data

    NASA Technical Reports Server (NTRS)

    Hogan, R. C.; Roush, T. L.

    2002-01-01

    A classification scheme based on unsupervised self-organizing maps (SOM) is described. Results from its application to the ASU mineral spectral database are presented. Applications to the Martian Thermal Emission Spectrometer data are discussed. Additional information is contained in the original extended abstract.

  11. Exploring the impact of wavelet-based denoising in the classification of remote sensing hyperspectral images

    NASA Astrophysics Data System (ADS)

    Quesada-Barriuso, Pablo; Heras, Dora B.; Argüello, Francisco

    2016-10-01

    The classification of remote sensing hyperspectral images for land cover applications is a very active research topic. In the case of supervised classification, Support Vector Machines (SVMs) play a dominant role. Recently, the Extreme Learning Machine (ELM) algorithm has been extensively used. The classification scheme previously published by the authors, called WT-EMP, introduces spatial information into the classification process by means of an Extended Morphological Profile (EMP) that is created from features extracted by wavelets. In addition, the hyperspectral image is denoised in the 2-D spatial domain, also using wavelets, and is joined to the EMP via a stacked vector. In this paper, the scheme is improved to achieve two goals. The first is to reduce the classification time while preserving the accuracy of the classification by using ELM instead of SVM. The second is to improve the accuracy results by performing not only a 2-D denoising for every spectral band, but also a prior additional 1-D spectral-signature denoising applied to each pixel vector of the image. For each denoising step, the image is transformed by applying a 1-D or 2-D wavelet transform, and a NeighShrink thresholding is then applied. Improvements in terms of classification accuracy are obtained, especially for images with close regions in the classification reference map, because in these cases the accuracy of the classification at the edges between classes is more relevant.
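
    The 1-D spectral-signature denoising step can be sketched with a single-level Haar transform and plain soft thresholding; this is a simplified stand-in for the paper's method (NeighShrink additionally weights each coefficient by the energy of its neighbours, and the full scheme also applies 2-D transforms per band):

```python
import numpy as np

def haar_soft_denoise(signal, thresh):
    """One-level Haar decomposition, soft-threshold the detail
    coefficients, then reconstruct (orthonormal, so energy is preserved)."""
    x = np.asarray(signal, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass: pairwise averages
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass: pairwise differences
    detail = np.sign(detail) * np.maximum(np.abs(detail) - thresh, 0.0)
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / np.sqrt(2)  # inverse Haar transform
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 4 * np.pi, 64))     # smooth "spectral signature"
noisy = clean + 0.2 * rng.standard_normal(64)
denoised = haar_soft_denoise(noisy, thresh=0.3)
print(np.mean((noisy - clean) ** 2) > np.mean((denoised - clean) ** 2))
```

    The soft threshold suppresses the noise-dominated high-pass coefficients while the smooth signal, concentrated in the low-pass band, passes through largely intact.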

  12. Classification of extraterrestrial civilizations

    NASA Astrophysics Data System (ADS)

    Tang, Tong B.; Chang, Grace

    1991-06-01

    A scheme of classification of extraterrestrial intelligence (ETI) communities based on the scope of energy accessible to the civilization in question is proposed as an alternative to the Kardashev (1964) scheme that includes three types of civilization, as determined by their levels of energy expenditure. The proposed scheme includes six classes: (1) a civilization that runs essentially on energy exerted by individual beings or by domesticated lower life forms, (2) harnessing of natural sources on planetary surface with artificial constructions, like water wheels and wind sails, (3) energy from fossils and fissionable isotopes, mined beneath the planet surface, (4) exploitation of nuclear fusion on a large scale, whether on the planet, in space, or from primary solar energy, (5) extensive use of antimatter for energy storage, and (6) energy from spacetime, perhaps via the action of naked singularities.

  13. Extending a field-based Sonoran desert vegetation classification to a regional scale using optical and microwave satellite imagery

    NASA Astrophysics Data System (ADS)

    Shupe, Scott Marshall

    2000-10-01

    Vegetation mapping in arid regions facilitates ecological studies and land management, and provides a record to which future land changes can be compared. Accurate and representative mapping of desert vegetation requires a sound field sampling program and a methodology to transform the data collected into a representative classification system. Time and cost constraints require that a remote sensing approach be used if such a classification system is to be applied on a regional scale. However, desert vegetation may be sparse and thus difficult to sense at typical satellite resolutions, especially given the problem of soil reflectance. This study was designed to address these concerns by conducting vegetation mapping research using field and satellite data from the US Army Yuma Proving Ground (USYPG) in Southwest Arizona. Line and belt transect data from the Army's Land Condition Trend Analysis (LCTA) Program were transformed into relative cover and relative density classification schemes using cluster analysis. Ordination analysis of the same data produced two- and three-dimensional graphs on which the homogeneity of each vegetation class could be examined. It was found that the combined use of correspondence analysis (CA), detrended correspondence analysis (DCA), and non-metric multidimensional scaling (NMS) ordination methods was superior to the use of any single ordination method for helping to clarify between-class and within-class relationships in vegetation composition. Analysis of these between-class and within-class relationships was of key importance in examining how well the relative cover and relative density schemes characterize the USYPG vegetation. Using these two classification schemes as reference data, maximum likelihood and artificial neural net classifications were then performed on a coregistered dataset consisting of a summer Landsat Thematic Mapper (TM) image, one spring and one summer ERS-1 microwave image, and elevation, slope, and aspect layers.
Classifications using a combination of ERS-1 imagery and elevation, slope, and aspect data were superior to classifications carried out using Landsat TM data alone. In all classification iterations it was consistently found that the highest classification accuracy was obtained by using a combination of Landsat TM, ERS-1, and elevation, slope, and aspect data. Maximum likelihood classification accuracy was found to be higher than artificial neural net classification in all cases.

  14. Characterizing the population of Asteroids in Cometary Orbits (ACOs)

    NASA Astrophysics Data System (ADS)

    Tancredi, Gonzalo; Licandro, Javier; Alí-Lagoa, Victor; Martino, Silvia; Vieira Monteiro, Filipe; Silva, Jose Sergio; Lazzaro, Daniela

    2015-08-01

    The classification criterion between asteroids and comets has evolved in recent decades, but the main phenomenological distinction remains unchanged: comets are active objects, presenting gas and dust ejection from the surface at some point in their orbits, while asteroids are inert objects that do not show any kind of large-scale gas and dust ejection. To identify the transitional objects, several classification schemes based on the orbital elements have been used, usually based on the Tisserand parameter (TJ). Tancredi (2014) presented a much more restrictive criterion to identify ACOs, which ensures that the objects have a dynamical evolution similar to the population of periodic comets. After applying these criteria to the sample of over half a million asteroids already discovered, we obtain 316 ACOs, further classified into subclasses similar to the cometary classification: 203 objects belong to the Jupiter Family group; 72 objects are classified as Centaurs; and 56 objects have Halley-type orbits (also known as Damocloids). These are the best-known extinct/dormant comet candidates from a dynamical point of view. We study the physical properties of this sample of ACOs. Two results will be presented. First, we look for the ACOs detected by NASA's WISE mission and, by fitting a thermal model to their observations, derive the effective diameter, beaming parameter and visible geometric albedo, using the method described in Alí-Lagoa et al. (2013). We obtain these parameters for 37 of the 203 ACOs in JFC orbits and 13 of the 56 Damocloids. We also compute the cumulative size distributions (CSDs) of these populations and compare them with the CSDs of Jupiter Family comets and Centaurs. Second, we have been monitoring the observable ACOs from 12/2014 to 06/2015. Every other month we select all the ACOs with elongations >90 deg and estimated magnitudes V<21, and attempt to observe them with the 1 m IMPACTON telescope of the Observatório Astronômico do Sertão de Itaparica (OASI). By comparing the photometric profiles of the ACOs with background stars, we try to detect some hint of cometary activity. Over 20 ACOs have been observed during these six months.
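
    The TJ-based screening mentioned above can be illustrated with the classical Tisserand cuts at 2 and 3; note that this is the conventional coarse rule, not Tancredi's stricter criteria, and the orbital elements in the example are illustrative:

```python
import math

A_JUP = 5.2038  # Jupiter's semi-major axis in au (assumed value)

def tisserand_jupiter(a, e, inc_deg):
    """Tisserand parameter with respect to Jupiter:
    T_J = a_J/a + 2*sqrt((a/a_J)*(1 - e^2))*cos(i)."""
    return (A_JUP / a
            + 2.0 * math.sqrt((a / A_JUP) * (1.0 - e ** 2))
            * math.cos(math.radians(inc_deg)))

def coarse_class(a, e, inc_deg):
    """Coarse dynamical class from the classical T_J cuts; Tancredi (2014)
    imposes additional, more restrictive conditions on top of this."""
    tj = tisserand_jupiter(a, e, inc_deg)
    if tj < 2.0:
        return "Halley-type / Damocloid"
    if tj < 3.0:
        return "Jupiter-family-like"
    return "asteroidal"

# Example: an eccentric orbit crossing Jupiter's region
print(coarse_class(a=3.6, e=0.6, inc_deg=12.0))
```

    Objects falling below the asteroidal cut despite lacking activity are exactly the ACO candidates the survey targets.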

  15. A Job Classification Scheme for Health Manpower

    PubMed Central

    Weiss, Jeffrey H.

    1968-01-01

    The Census Bureau's occupational classification scheme and concept of the “health services industry” are inadequate tools for analysis of the changing job structure of health manpower. In an attempt to remedy their inadequacies, a new analytical framework—drawing upon the work of James Scoville on the job content of the U.S. economy—was devised. The first stage in formulating this new framework was to determine which jobs should be considered health jobs. The overall health care job family was designed to encompass jobs in which the primary technical focus or function is oriented toward the provision of health services. There are two dimensions to the job classification scheme presented here. The first describes each job in terms of job content; relative income data and minimum education and training requirements were employed as surrogate measures. By this means, health care jobs were grouped by three levels of job content: high, medium, and low. The other dimension describes each job in terms of its technical focus or function; by this means, health care jobs were grouped into nine job families. PMID:5673666

  16. Exploiting unsupervised and supervised classification for segmentation of the pathological lung in CT

    NASA Astrophysics Data System (ADS)

    Korfiatis, P.; Kalogeropoulou, C.; Daoussis, D.; Petsas, T.; Adonopoulos, A.; Costaridou, L.

    2009-07-01

    Delineation of lung fields in the presence of diffuse lung parenchymal diseases (DLPDs), such as interstitial pneumonias (IP), challenges segmentation algorithms. To deal with IP patterns affecting the lung border, an automated image texture classification scheme is proposed. The proposed segmentation scheme is based on supervised texture classification between lung tissue (normal and abnormal) and surrounding tissue (pleura and thoracic wall) in the lung border region. This region is coarsely defined around an initial estimate of the lung border, provided by means of Markov Random Field modeling and morphological operations. Subsequently, a support vector machine classifier was trained to distinguish between the above two classes of tissue, using textural features of the gray-scale and wavelet domains. Seventeen patients diagnosed with IP secondary to connective tissue diseases were examined. Segmentation performance in terms of overlap was 0.924±0.021, and for shape differentiation the mean, rms and maximum distances were 1.663±0.816, 2.334±1.574 and 8.0515±6.549 mm, respectively. An accurate, automated scheme is proposed for segmenting abnormal lung fields in HRCT affected by IP.

  17. The future of transposable element annotation and their classification in the light of functional genomics - what we can learn from the fables of Jean de la Fontaine?

    PubMed

    Arensburger, Peter; Piégu, Benoît; Bigot, Yves

    2016-01-01

    Transposable element (TE) science has been significantly influenced by the pioneering ideas of David Finnegan near the end of the last century, as well as by the classification systems that were subsequently developed. Today, whole genome TE annotation is mostly done using tools that were developed to aid gene annotation rather than to specifically study TEs. We argue that further progress in the TE field is impeded both by current TE classification schemes and by a failure to recognize that TE biology is fundamentally different from that of multicellular organisms. Novel genome wide TE annotation methods are helping to redefine our understanding of TE sequence origins and evolution. We briefly discuss some of these new methods as well as ideas for possible alternative classification schemes. Our hope is to encourage the formation of a society to organize a larger debate on these questions and to promote the adoption of standards for annotation and an improved TE classification.

  18. Branch classification: A new mechanism for improving branch predictor performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, P.Y.; Hao, E.; Patt, Y.

    There is wide agreement that one of the most significant impediments to the performance of current and future pipelined superscalar processors is the presence of conditional branches in the instruction stream. Speculative execution is one solution to the branch problem, but speculative work is discarded if a branch is mispredicted. To be effective, speculative execution therefore requires a very accurate branch predictor; 95% accuracy is not good enough. This paper proposes branch classification, a methodology for building more accurate branch predictors. Branch classification allows an individual branch instruction to be associated with the branch predictor best suited to predict its direction. Using this approach, a hybrid branch predictor can be constructed such that each component branch predictor predicts those branches for which it is best suited. To demonstrate the usefulness of branch classification, an example classification scheme is given and a new hybrid predictor is built based on this scheme which achieves a higher prediction accuracy than any branch predictor previously reported in the literature.
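
    A toy simulation of the idea: a hypothetical classification rule (profile-measured bias, not the paper's actual scheme) steers each branch to the component predictor suited to it, so heavily biased branches stop competing with data-dependent ones for dynamic predictor state:

```python
class TwoBit:
    """Saturating 2-bit counter: adapts to locally biased behaviour."""
    def __init__(self):
        self.state = 2  # weakly taken
    def predict(self):
        return self.state >= 2
    def update(self, taken):
        self.state = min(3, self.state + 1) if taken else max(0, self.state - 1)

class StaticTaken:
    """Static predictor for branches classified as heavily biased taken."""
    def predict(self):
        return True
    def update(self, taken):
        pass

def classify(profile):
    """Toy rule: branches taken >= 95% of the time in a profiling run get
    the static predictor; the rest get a dynamic 2-bit counter."""
    return StaticTaken() if sum(profile) / len(profile) >= 0.95 else TwoBit()

# Hypothetical branch streams: a loop back-edge (almost always taken)
# and a data-dependent branch (alternating outcomes).
streams = {
    "loop": [True] * 99 + [False],
    "alt":  [True, False] * 50,
}
predictors = {pc: classify(outcomes[:20]) for pc, outcomes in streams.items()}
acc = {}
for pc, outcomes in streams.items():
    correct = 0
    for taken in outcomes:
        correct += predictors[pc].predict() == taken
        predictors[pc].update(taken)
    acc[pc] = correct / len(outcomes)
print(acc)
```

    In a real hybrid predictor the components would be table-based (e.g. gshare and per-address counters) and the class assignment would come from compile-time or run-time classification, but the division of labour is the same.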

  19. Discriminative Hierarchical K-Means Tree for Large-Scale Image Classification.

    PubMed

    Chen, Shizhi; Yang, Xiaodong; Tian, Yingli

    2015-09-01

    A key challenge in large-scale image classification is how to achieve efficiency in terms of both computation and memory without compromising classification accuracy. The learning-based classifiers achieve the state-of-the-art accuracies, but have been criticized for computational complexity that grows linearly with the number of classes. The nonparametric nearest neighbor (NN)-based classifiers naturally handle large numbers of categories, but incur prohibitively expensive computation and memory costs. In this brief, we present a novel classification scheme, the discriminative hierarchical K-means tree (D-HKTree), which combines the advantages of both learning-based and NN-based classifiers. The complexity of the D-HKTree grows only sublinearly with the number of categories, which is much better than the recent hierarchical support vector machine-based methods. The memory requirement is an order of magnitude less than that of the recent Naïve Bayesian NN-based approaches. The proposed D-HKTree classification scheme is evaluated on several challenging benchmark databases and achieves state-of-the-art accuracies, with significantly lower computation cost and memory requirements.
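
    The tree construction and sublinear query can be sketched as follows; this is a generic hierarchical k-means tree without the discriminative weighting that distinguishes the D-HKTree, and all data are synthetic:

```python
import numpy as np

def kmeans(X, k, iters=10, seed=0):
    """Plain Lloyd's algorithm, just enough to split a node's samples."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

def build_tree(X, y, k=2, leaf_size=4):
    """Recursive k-means partitioning; leaves keep their labelled samples.
    (The D-HKTree additionally learns discriminative node weights, omitted.)"""
    if len(X) <= leaf_size:
        return {"leaf": True, "X": X, "y": y}
    centers, labels = kmeans(X, k)
    keep = [j for j in range(k) if np.any(labels == j)]
    if len(keep) < 2:  # k-means failed to split; stop recursing
        return {"leaf": True, "X": X, "y": y}
    return {"leaf": False, "centers": centers[keep],
            "children": [build_tree(X[labels == j], y[labels == j], k, leaf_size)
                         for j in keep]}

def query(node, x):
    """Descend to the nearest leaf (cost grows with tree depth, not dataset
    size), then classify by the nearest neighbour inside that leaf."""
    while not node["leaf"]:
        j = int(np.argmin(((node["centers"] - x) ** 2).sum(-1)))
        node = node["children"][j]
    i = int(np.argmin(((node["X"] - x) ** 2).sum(-1)))
    return node["y"][i]

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
tree = build_tree(X, y)
print(query(tree, np.array([2.9, 3.1])), query(tree, np.array([0.1, -0.2])))
```

    Each query touches only one root-to-leaf path plus one leaf's samples, which is the source of the sublinear scaling with category count.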

  20. Evaluation of several schemes for classification of remotely sensed data: Their parameters and performance. [Foster County, North Dakota; Grant County, Kansas; Iroquois County, Illinois, Tippecanoe County, Indiana; and Pottawattamie and Shelby Counties, Iowa

    NASA Technical Reports Server (NTRS)

    Scholz, D.; Fuhs, N.; Hixson, M.; Akiyama, T. (Principal Investigator)

    1979-01-01

    The author has identified the following significant results. Data sets for corn, soybeans, winter wheat, and spring wheat were used to evaluate the following schemes for crop identification: (1) per-point Gaussian maximum likelihood classifier; (2) per-point sum of normal densities classifier; (3) per-point linear classifier; (4) per-point Gaussian maximum likelihood decision tree classifier; and (5) texture-sensitive per-field Gaussian maximum likelihood classifier. Test site location and classifier both had significant effects on the classification accuracy of small grains; classifiers did not differ significantly in overall accuracy, with the majority of the difference among classifiers being attributed to training method rather than to the classification algorithm applied. The complexity of use and computer costs for the classifiers varied significantly. A linear classification rule which assigns each pixel to the class whose mean is closest in Euclidean distance was the easiest for the analyst and cost the least per classification.
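
    The linear rule singled out above, assigning each pixel to the class whose mean is nearest in Euclidean distance, can be sketched directly (the band values below are illustrative, not the study's data):

```python
import numpy as np

class NearestMean:
    """Minimum-distance-to-class-mean rule. It is linear in the features:
    comparing ||x - m_k||^2 reduces to argmax of m_k.x - ||m_k||^2 / 2."""
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.means = np.array([X[y == c].mean(axis=0) for c in self.classes])
        return self
    def predict(self, X):
        d2 = ((X[:, None, :] - self.means[None, :, :]) ** 2).sum(-1)
        return self.classes[np.argmin(d2, axis=1)]

# Toy 2-band "pixels" for two crop classes (hypothetical reflectances)
X = np.array([[0.2, 0.8], [0.25, 0.75], [0.8, 0.3], [0.85, 0.2]])
y = np.array(["corn", "corn", "wheat", "wheat"])
clf = NearestMean().fit(X, y)
print(clf.predict(np.array([[0.3, 0.7], [0.9, 0.25]])))
```

    Training needs only per-class means, which is why this rule was the cheapest and easiest of the five schemes evaluated.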

  1. ERTS-1 data applications to Minnesota forest land use classification

    NASA Technical Reports Server (NTRS)

    Sizer, J. E. (Principal Investigator); Eller, R. G.; Meyer, M. P.; Ulliman, J. J.

    1973-01-01

    The author has identified the following significant results. Color-combined ERTS-1 MSS spectral slices were analyzed to determine the maximum (repeatable) level of meaningful forest resource classification data visually attainable by skilled forest photointerpreters for the following purposes: (1) periodic updating of the Minnesota Land Management Information System (MLMIS) statewide computerized land use data bank, and (2) to provide first-stage forest resources survey data for large area forest land management planning. Controlled tests were made of two forest classification schemes by experienced professional foresters with special photointerpretation training and experience. The test results indicate it is possible to discriminate the MLMIS forest class from the MLMIS nonforest classes, but that it is not possible, under average circumstances, to further stratify the forest classification into species components with any degree of reliability with ERTS-1 imagery. An ongoing test of the resulting classification scheme involves the interpretation, and mapping, of the south half of Itasca County, Minnesota, with ERTS-1 imagery. This map is undergoing field checking by on-the-ground field cooperators, whose evaluation will be completed in the fall of 1973.

  2. A fast and efficient segmentation scheme for cell microscopic image.

    PubMed

    Lebrun, G; Charrier, C; Lezoray, O; Meurie, C; Cardot, H

    2007-04-27

    Microscopic cellular image segmentation schemes must be efficient for reliable analysis and fast enough to process huge quantities of images. Recent studies have focused on improving segmentation quality. Several segmentation schemes achieve good quality, but their processing time is too expensive to deal with a great number of images per day. For segmentation schemes based on pixel classification, the classifier design is crucial, since it is the component that requires most of the processing time necessary to segment an image. The main contribution of this work is focused on how to reduce the complexity of the decision functions produced by support vector machines (SVM) while preserving recognition rate. Vector quantization is used in order to reduce the inherent redundancy present in huge pixel databases (i.e. images with expert pixel segmentation). Hybrid color space design is also used in order to improve the data set size reduction rate and the recognition rate. A new decision function quality criterion is defined to select a good trade-off between the recognition rate and the processing time of the pixel decision function. The first results of this study show that fast and efficient pixel classification with SVM is possible. Moreover, posterior class pixel probability estimation is easy to compute with Platt's method. A new segmentation scheme using probabilistic pixel classification has therefore been developed. It has several free parameters whose automatic selection must be dealt with, but existing criteria for evaluating segmentation quality are not well adapted for cell segmentation, especially when comparison with expert pixel segmentation must be achieved. Another important contribution in this paper is the definition of a new quality criterion for the evaluation of cell segmentation. The results presented here show that selecting the free parameters of the segmentation scheme by optimisation of the new cell segmentation quality criterion produces efficient cell segmentation.
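
    The vector quantization step, compressing an expert-labelled pixel database into a small per-class codebook before classifier training, might look like the sketch below (synthetic pixel data; the paper pairs this with an SVM and hybrid color spaces, both omitted here):

```python
import numpy as np

def quantize_class(pixels, m, iters=15, seed=0):
    """Lloyd's k-means as a vector quantizer: replace a class's pixel set
    by m codewords, removing redundancy before classifier training."""
    rng = np.random.default_rng(seed)
    codes = pixels[rng.choice(len(pixels), m, replace=False)].astype(float)
    for _ in range(iters):
        assign = np.argmin(((pixels[:, None] - codes) ** 2).sum(-1), axis=1)
        for j in range(m):
            if np.any(assign == j):
                codes[j] = pixels[assign == j].mean(axis=0)
    return codes

# Hypothetical expert-labelled RGB pixels (thousands per class in practice)
rng = np.random.default_rng(2)
background = rng.normal([200, 200, 210], 10, (5000, 3))
nucleus = rng.normal([90, 60, 140], 12, (5000, 3))
codebooks = {label: quantize_class(px, m=16)
             for label, px in [("background", background), ("nucleus", nucleus)]}
print({k: v.shape for k, v in codebooks.items()})
```

    Training an SVM on 32 codewords instead of 10,000 raw pixels yields far fewer support vectors, and hence a much cheaper per-pixel decision function at segmentation time.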

  3. Applying a deep learning based CAD scheme to segment and quantify visceral and subcutaneous fat areas from CT images

    NASA Astrophysics Data System (ADS)

    Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; Moore, Kathleen; Liu, Hong; Zheng, Bin

    2017-03-01

    Abdominal obesity is strongly associated with a number of diseases, and accurate assessment of the subtypes of adipose tissue volume plays a significant role in predicting disease risk, diagnosis and prognosis. The objective of this study is to develop and evaluate a new computer-aided detection (CAD) scheme based on deep learning models to automatically segment subcutaneous fat areas (SFA) and visceral fat areas (VFA) depicted on CT images. A dataset of CT images from 40 patients was retrospectively collected and equally divided into two independent groups (i.e. training and testing groups). The new CAD scheme consisted of two sequential convolutional neural networks (CNNs), namely the Selection-CNN and the Segmentation-CNN. The Selection-CNN was trained using 2,240 CT slices to automatically select CT slices belonging to abdomen areas, and the Segmentation-CNN was trained using 84,000 fat-pixel patches to classify fat pixels as belonging to SFA or VFA. Data from the testing group were then used to evaluate the performance of the optimized CAD scheme. Compared with manually labelled results, the classification accuracy of CT slice selection by the Selection-CNN was 95.8%, while the accuracy of fat pixel segmentation using the Segmentation-CNN was 96.8%. This study therefore demonstrated the feasibility of using a deep learning based CAD scheme to recognize the human abdominal section from CT scans and to segment SFA and VFA from CT slices, with high agreement with subjective segmentation results.

  4. A comparison of resampling schemes for estimating model observer performance with small ensembles

    NASA Astrophysics Data System (ADS)

    Elshahaby, Fatma E. A.; Jha, Abhinav K.; Ghaly, Michael; Frey, Eric C.

    2017-09-01

    In the objective assessment of image quality, an ensemble of images is used to compute the first- and second-order statistics of the data. Often, only a finite number of images is available, leading to statistical variability in numerical observer performance. Resampling-based strategies can help overcome this issue. In this paper, we compared different combinations of resampling schemes (leave-one-out (LOO) and half-train/half-test (HT/HT)) and model observers (the conventional channelized Hotelling observer (CHO), the channelized linear discriminant (CLD), and the channelized quadratic discriminant). Observer performance was quantified by the area under the ROC curve (AUC). For a binary classification task and for each observer, the AUC value for an ensemble size of 2000 samples per class served as a gold standard for that observer. Results indicated that each observer yielded a different performance depending on the ensemble size and the resampling scheme. For a small ensemble size, the combination [CHO, HT/HT] produced more accurate rankings than the combination [CHO, LOO]. Using the LOO scheme, the CLD and CHO had similar performance for large ensembles; however, the CLD outperformed the CHO and gave more accurate rankings for smaller ensembles. As the ensemble size decreased, the performance of the [CHO, LOO] combination deteriorated seriously, as opposed to the [CLD, LOO] combination. Thus, it may be desirable to use the CLD with the LOO scheme when only a smaller ensemble is available.
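    For intuition, a Hotelling-type linear observer and its AUC can be computed directly from first- and second-order statistics. This minimal numpy sketch (the function name and the Gaussian toy data are ours, not the paper's) trains the template on one half of the ensemble and scores the other half, i.e. a half-train/half-test split:

```python
import numpy as np

def hotelling_auc(train0, train1, test0, test1):
    """Train a Hotelling observer and score its AUC on held-out data.
    The Hotelling template is w = S^-1 (mu1 - mu0), where S is the average
    within-class covariance; AUC is the Mann-Whitney estimate."""
    mu0, mu1 = train0.mean(0), train1.mean(0)
    S = 0.5 * (np.cov(train0.T) + np.cov(train1.T))
    w = np.linalg.solve(S, mu1 - mu0)
    t0, t1 = test0 @ w, test1 @ w
    # P(score of a class-1 sample exceeds score of a class-0 sample)
    return (t1[:, None] > t0[None, :]).mean()

rng = np.random.default_rng(0)
d = 5  # e.g. five channel outputs
a = rng.normal(0.0, 1.0, (200, d))   # signal-absent ensemble
b = rng.normal(0.8, 1.0, (200, d))   # signal-present ensemble
# HT/HT split: train the template on one half, estimate AUC on the other
auc = hotelling_auc(a[:100], b[:100], a[100:], b[100:])
```

    An LOO variant would instead refit the template once per left-out image and pool the resulting scores.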

  5. Regional shape-based feature space for segmenting biomedical images using neural networks

    NASA Astrophysics Data System (ADS)

    Sundaramoorthy, Gopal; Hoford, John D.; Hoffman, Eric A.

    1993-07-01

    In biomedical images, structures of interest, particularly soft tissue structures such as the heart, airways, and bronchial and arterial trees, often have grey-scale and textural characteristics similar to other structures in the image, making them difficult to segment using only grey-scale and texture information. However, these objects can be visually recognized by their unique shapes and sizes. In this paper we discuss what we believe to be a novel, simple scheme for extracting features based on regional shapes. To test the effectiveness of these features for image segmentation (classification), we use an artificial neural network and a statistical cluster analysis technique. The proposed shape-based feature extraction algorithm computes regional shape vectors (RSVs) for all pixels that meet a certain threshold criterion. The distance from each such pixel to a boundary is computed in 8 directions (or in 26 directions for a 3-D image). Together, these 8 (or 26) values form the pixel's (or voxel's) RSV. All RSVs from an image are used to train a multi-layer perceptron neural network, which uses these features to 'learn' a suitable classification strategy. To clearly distinguish the desired object from other objects within an image, several examples from inside and outside the desired object are used for training. Several examples, both synthetic and actual biomedical images, are presented to illustrate the strengths and weaknesses of our algorithm. Future extensions to the algorithm are also discussed.
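    The RSV computation lends itself to a short sketch. This 2-D version (function and variable names are ours) walks outward from a pixel in the 8 compass directions, counting steps until the intensity drops below the threshold or the image edge is reached:

```python
import numpy as np

def regional_shape_vector(img, y, x, thresh):
    """Compute the 8-direction regional shape vector (RSV) for pixel (y, x):
    in each compass direction, the number of steps before a pixel falls
    below the threshold (a boundary) or leaves the image."""
    dirs = [(-1, 0), (-1, 1), (0, 1), (1, 1),
            (1, 0), (1, -1), (0, -1), (-1, -1)]
    rsv = []
    for dy, dx in dirs:
        d = 0
        yy, xx = y + dy, x + dx
        while (0 <= yy < img.shape[0] and 0 <= xx < img.shape[1]
               and img[yy, xx] >= thresh):
            d += 1
            yy += dy
            xx += dx
        rsv.append(d)
    return np.array(rsv)

# 7x7 bright square in a dark image: the centre pixel is 3 steps from
# the boundary in every direction, so its RSV is [3, 3, 3, 3, 3, 3, 3, 3]
img = np.zeros((9, 9))
img[1:8, 1:8] = 1.0
rsv = regional_shape_vector(img, 4, 4, 0.5)
```

    The 3-D case is analogous with 26 directions; the resulting vectors are what the perceptron is trained on.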

  6. Stratification of pseudoprogression and true progression of glioblastoma multiforme based on longitudinal diffusion tensor imaging without segmentation

    PubMed Central

    Qian, Xiaohua; Tan, Hua; Zhang, Jian; Zhao, Weilin; Chan, Michael D.; Zhou, Xiaobo

    2016-01-01

    Purpose: Pseudoprogression (PsP) can mimic true tumor progression (TTP) on magnetic resonance imaging in patients with glioblastoma multiforme (GBM). The phenotypical similarity between PsP and TTP makes it challenging for physicians to distinguish these entities. So far, no approved biomarkers or computer-aided diagnosis systems have been used clinically for this purpose. Methods: To address this challenge, the authors developed an objective classification system for PsP and TTP based on longitudinal diffusion tensor imaging. A novel spatio-temporal discriminative dictionary learning scheme was proposed to differentiate PsP and TTP, thereby avoiding segmentation of the region of interest. The authors constructed a novel discriminative sparse matrix with the classification-oriented dictionary learning approach by excluding the shared features of the two categories, so that the pooled features captured the subtle difference between PsP and TTP. The most discriminating features were then identified from the pooled features by their feature scoring system. Finally, the authors stratified patients with GBM into PsP and TTP by a support vector machine approach. Tenfold cross-validation (CV) and the area under the receiver operating characteristic curve (AUC) were used to assess the robustness of the developed system. Results: The average accuracy and AUC values after ten rounds of tenfold CV were 0.867 and 0.92, respectively. The authors also assessed the effects of different methods and factors (such as data types, pooling techniques, and dimensionality reduction approaches) on the performance of their classification system, in order to identify the best-performing configuration. Conclusions: The proposed objective classification system without segmentation achieved a desirable and reliable performance in differentiating PsP from TTP. Thus, the developed approach is expected to advance the clinical research and diagnosis of PsP and TTP. PMID:27806598

  7. Sensitivity and Specificity of the World Health Organization Dengue Classification Schemes for Severe Dengue Assessment in Children in Rio de Janeiro

    PubMed Central

    Macedo, Gleicy A.; Gonin, Michelle Luiza C.; Pone, Sheila M.; Cruz, Oswaldo G.; Nobre, Flávio F.; Brasil, Patrícia

    2014-01-01

    Background: The clinical definition of severe dengue fever remains a challenge for researchers in hyperendemic areas like Brazil. The ability of the traditional (1997) as well as the revised (2009) World Health Organization (WHO) dengue case classification schemes to detect severe dengue cases was evaluated in 267 children admitted to hospital with laboratory-confirmed dengue. Principal Findings: Using the traditional scheme, 28.5% of patients could not be assigned to any category, while the revised scheme categorized all patients. Intensive therapeutic interventions were used as the reference standard to evaluate the ability of both the traditional and revised schemes to detect severe dengue cases. Analyses of the classified cases (n = 183) demonstrated that the revised scheme had better sensitivity (86.8%, P<0.001), while the traditional scheme had better specificity (93.4%, P<0.001) for the detection of severe forms of dengue. Conclusions/Significance: This improved sensitivity of the revised scheme allows for better case capture and increased ICU admission, which may aid pediatricians in avoiding deaths due to severe dengue among children, but, in turn, it may also result in the misclassification of the patients' condition as severe, reflected in the observed lower positive predictive value (61.6%, P<0.001) when compared with the traditional scheme (82.6%, P<0.001). The inclusion of unusual dengue manifestations in the revised scheme has not shifted the emphasis from the most important aspects of dengue disease and the major factors contributing to fatality in this study: shock with consequent organ dysfunction. PMID:24777054

  8. Sensitivity and specificity of the World Health Organization dengue classification schemes for severe dengue assessment in children in Rio de Janeiro.

    PubMed

    Macedo, Gleicy A; Gonin, Michelle Luiza C; Pone, Sheila M; Cruz, Oswaldo G; Nobre, Flávio F; Brasil, Patrícia

    2014-01-01

    The clinical definition of severe dengue fever remains a challenge for researchers in hyperendemic areas like Brazil. The ability of the traditional (1997) as well as the revised (2009) World Health Organization (WHO) dengue case classification schemes to detect severe dengue cases was evaluated in 267 children admitted to hospital with laboratory-confirmed dengue. Using the traditional scheme, 28.5% of patients could not be assigned to any category, while the revised scheme categorized all patients. Intensive therapeutic interventions were used as the reference standard to evaluate the ability of both the traditional and revised schemes to detect severe dengue cases. Analyses of the classified cases (n = 183) demonstrated that the revised scheme had better sensitivity (86.8%, P<0.001), while the traditional scheme had better specificity (93.4%, P<0.001) for the detection of severe forms of dengue. This improved sensitivity of the revised scheme allows for better case capture and increased ICU admission, which may aid pediatricians in avoiding deaths due to severe dengue among children, but, in turn, it may also result in the misclassification of the patients' condition as severe, reflected in the observed lower positive predictive value (61.6%, P<0.001) when compared with the traditional scheme (82.6%, P<0.001). The inclusion of unusual dengue manifestations in the revised scheme has not shifted the emphasis from the most important aspects of dengue disease and the major factors contributing to fatality in this study: shock with consequent organ dysfunction.
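    The quantities reported above (sensitivity, specificity, positive predictive value) follow directly from a 2×2 confusion table against the reference standard. A minimal sketch, using invented counts purely for illustration (not the study's actual data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and positive predictive value (PPV)
    from the counts of a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),   # severe cases correctly flagged
        "specificity": tn / (tn + fp),   # non-severe cases correctly cleared
        "ppv": tp / (tp + fp),           # flagged cases that are truly severe
    }

# Hypothetical counts for a scheme evaluated on 100 children
m = diagnostic_metrics(tp=40, fp=10, fn=5, tn=45)
```

    The study's observed trade-off (higher sensitivity, lower PPV for the revised scheme) is exactly the pattern these formulas produce when a scheme flags more cases overall.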

  9. The reliability of axis V of the multiaxial classification scheme.

    PubMed

    van Goor-Lambo, G

    1987-07-01

    In a reliability study concerning axis V (abnormal psychosocial situations) of the Multiaxial classification scheme for psychiatric disorders in childhood and adolescence, it was found that the level of agreement in scoring was adequate for only 2 out of 12 categories. A proposal for a modification of axis V was made, including a differentiation and regrouping of the categories and an adjustment of the descriptions in the glossary. With this modification of axis V another reliability study was carried out, in which the level of agreement in scoring was adequate for 12 out of 16 categories.
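    Agreement studies of this kind are usually quantified with a chance-corrected statistic; the abstract does not state which one was used, so the following Cohen's kappa sketch (pure stdlib, names ours) is illustrative only:

```python
def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa: observed agreement between two raters, corrected
    for the agreement expected by chance from their marginal frequencies."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    cats = set(labels_a) | set(labels_b)
    po = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    pe = sum((labels_a.count(c) / n) * (labels_b.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)

# identical ratings give kappa = 1; agreement at chance level gives kappa = 0
k_perfect = cohens_kappa([1, 1, 2, 2], [1, 1, 2, 2])
k_chance = cohens_kappa([1, 1, 2, 2], [1, 2, 1, 2])
```

    Reliability studies like the one above would compute such a statistic per category and call agreement "adequate" above some preset cut-off.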

  10. Analysis of DSN software anomalies

    NASA Technical Reports Server (NTRS)

    Galorath, D. D.; Hecht, H.; Hecht, M.; Reifer, D. J.

    1981-01-01

    A categorized database of software errors discovered during the various stages of development and operational use of the Deep Space Network DSN/Mark 3 System was developed. A study team identified several existing error classification schemes (taxonomies), prepared a detailed annotated bibliography of the error taxonomy literature, and produced a new classification scheme which was tuned to the DSN anomaly reporting system and encapsulated the work of others. Based upon the DSN/RCI error taxonomy, error data on approximately 1000 reported DSN/Mark 3 anomalies were analyzed, interpreted and classified. The error data were then summarized, and histograms were produced highlighting key tendencies.

  11. Understanding Homicide-Suicide.

    PubMed

    Knoll, James L

    2016-12-01

    Homicide-suicide is the phenomenon in which an individual kills one or more people and then commits suicide. Research on homicide-suicide has been hampered by the lack of an accepted classification scheme and by reliance on media reports. Mass murder-suicide is gaining increasing attention, particularly in the United States. This article reviews the research and literature on homicide-suicide, proposing a standard classification scheme. Preventive methods are discussed and sociocultural factors explored. For a more accurate and complete understanding of homicide-suicide, it is argued that future research should use the full psychological autopsy approach, to include collateral interviews. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Assessment of skeletal maturity in scoliosis patients to determine clinical management: a new classification scheme using distal radius and ulna radiographs.

    PubMed

    Luk, Keith D K; Saw, Lim Beng; Grozman, Samuel; Cheung, Kenneth M C; Samartzis, Dino

    2014-02-01

    Assessment of skeletal maturity in patients with adolescent idiopathic scoliosis (AIS) is important to guide clinical management. Understanding the growth peak and growth cessation is crucial for determining clinical observation intervals, the timing to initiate or end bracing therapy, and when to instrument and fuse. The commonly used clinical and radiologic methods of assessing skeletal maturity remain deficient in predicting the growth peak and cessation among adolescents, and bone age is too complicated to apply. To address these concerns, we describe a new distal radius and ulna (DRU) classification scheme for assessing skeletal maturity. In this prospective study, one hundred fifty young female AIS patients from a single institution, with hand x-rays and no previous history of spine surgery, were assessed. Plain radiographs of the radius and ulna, as well as various anthropometric parameters, were evaluated. We identified various stages of radius and ulna epiphysis maturity, graded R1-R11 for the radius and U1-U9 for the ulna. Bone age, development of sexual characteristics, standing height, sitting height, arm span, radius length, and tibia length were studied prospectively at each stage of these epiphyseal changes. Standing height, sitting height, and arm span growth were at their peak during stages R7 (mean, 11.4 years old) and U5 (mean, 11.0 years old). Long bone growth also demonstrated a common peak at R7 and U5. Cessation of height and arm span growth was noted after stages R10 (mean, 15.6 years old) and U9 (mean, 17.3 years old). The new DRU classification is a practical and easy-to-use scheme that can indicate skeletal maturation status, and it correlates closely with the adolescent growth spurt and the cessation of growth. This classification may have tremendous utility in improving clinical decision-making in the conservative and operative management of scoliosis patients. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. National Library of Medicine Classification: A Scheme for the Shelf Arrangement of Books in a Field of Medicine and Its Related Sciences. Fourth Edition.

    ERIC Educational Resources Information Center

    Wiggins, Emilie, Ed.

    Outlined is the National Library of Medicine classification system for medicine and related sciences. In this system each preclinical science, such as human anatomy, biochemistry or pathology, and each medical subject, such as infectious diseases or pediatrics, receives a two-letter classification. Under each of these main headings numbered minor…

  14. A combined reconstruction-classification method for diffuse optical tomography.

    PubMed

    Hiltunen, P; Prince, S J D; Arridge, S

    2009-11-07

    We present a combined classification and reconstruction algorithm for diffuse optical tomography (DOT). DOT is a nonlinear ill-posed inverse problem. Therefore, some regularization is needed. We present a mixture of Gaussians prior, which regularizes the DOT reconstruction step. During each iteration, the parameters of a mixture model are estimated. These associate each reconstructed pixel with one of several classes based on the current estimate of the optical parameters. This classification is exploited to form a new prior distribution to regularize the reconstruction step and update the optical parameters. The algorithm can be described as an iteration between an optimization scheme with zeroth-order variable mean and variance Tikhonov regularization and an expectation-maximization scheme for estimation of the model parameters. We describe the algorithm in a general Bayesian framework. Results from simulated test cases and phantom measurements show that the algorithm enhances the contrast of the reconstructed images with good spatial accuracy. The probabilistic classifications of each image contain only a few misclassified pixels.
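    The EM iteration at the heart of such a mixture-of-Gaussians scheme is compact enough to sketch in 1-D. This is our own simplified stand-in: the paper's algorithm alternates a step like this with a Tikhonov-regularized reconstruction step, which is omitted here:

```python
import numpy as np

def gmm_em_1d(x, k=2, iters=50):
    """Fit a 1-D Gaussian mixture by EM; returns (weights, means, stds, resp).
    resp[i, j] is the posterior probability that point i belongs to class j,
    i.e. the soft classification used to build the prior."""
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))   # spread-out initial means
    sigma = np.full(k, x.std())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities (per-point posterior class probabilities)
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
               / (sigma * np.sqrt(2 * np.pi))
        r = dens / dens.sum(1, keepdims=True)
        # M-step: re-estimate mixture parameters from the soft assignments
        nk = r.sum(0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(0) / nk)
    return pi, mu, sigma, r

# Toy "optical parameter" values drawn from two tissue classes
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.2, 0.05, 300), rng.normal(1.0, 0.1, 300)])
pi, mu, sigma, r = gmm_em_1d(x)
```

    In the combined algorithm, the responsibilities `r` define the class-dependent prior mean and variance used to regularize the next reconstruction update.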

  15. Automated reuseable components system study results

    NASA Technical Reports Server (NTRS)

    Gilroy, Kathy

    1989-01-01

    The Automated Reusable Components System (ARCS) was developed under a Phase 1 Small Business Innovative Research (SBIR) contract for the U.S. Army CECOM. The objectives of the ARCS program were: (1) to investigate issues associated with automated reuse of software components, identify alternative approaches, and select promising technologies, and (2) to develop tools that support component classification and retrieval. The approach followed was to research emerging techniques and experimental applications associated with reusable software libraries, to investigate the more mature information retrieval technologies for applicability, and to investigate the applicability of specialized technologies to improve the effectiveness of a reusable component library. Various classification schemes and retrieval techniques were identified and evaluated for potential application in an automated library system for reusable components. Strategies for library organization and management, component submittal and storage, and component search and retrieval were developed. A prototype ARCS was built to demonstrate the feasibility of automating the reuse process. The prototype was created using a subset of the classification and retrieval techniques that were investigated. The demonstration system was exercised and evaluated using reusable Ada components selected from the public domain. A requirements specification for a production-quality ARCS was also developed.

  16. Reading the lesson: eliciting requirements for a mammography training application

    NASA Astrophysics Data System (ADS)

    Hartswood, M.; Blot, L.; Taylor, P.; Anderson, S.; Procter, R.; Wilkinson, L.; Smart, L.

    2009-02-01

    Demonstrations of a prototype training tool were used to elicit requirements for an intelligent training system for screening mammography. The prototype allowed senior radiologists (mentors) to select cases from a distributed database of images to meet the specific training requirements of junior colleagues (trainees), and then provided automated feedback in response to trainees' attempts at interpretation. The tool was demonstrated to radiologists and radiographers working in the breast screening service at four evaluation sessions. Participants highlighted ease of selecting cases that can deliver specific learning objectives as important for delivering effective training. To usefully structure a large data set of training images, we undertook a classification exercise on mentor-authored free-text 'learning points' attached to training cases obtained from two screening centres (n=333 and n=129, respectively). We were able to adduce a hierarchy of abstract categories representing classes of lesson that groups of cases were intended to convey (e.g., temporal change, misleading juxtapositions, position of lesion, typical/atypical presentation, and so on). In this paper we present the method used to devise this classification, the classification scheme itself, initial user feedback, and our plans to incorporate it into a software tool to aid case selection.

  17. Real-time classification of auditory sentences using evoked cortical activity in humans

    NASA Astrophysics Data System (ADS)

    Moses, David A.; Leonard, Matthew K.; Chang, Edward F.

    2018-06-01

    Objective. Recent research has characterized the anatomical and functional basis of speech perception in the human auditory cortex. These advances have made it possible to decode speech information from activity in brain regions like the superior temporal gyrus, but no published work has demonstrated this ability in real-time, which is necessary for neuroprosthetic brain-computer interfaces. Approach. Here, we introduce a real-time neural speech recognition (rtNSR) software package, which was used to classify spoken input from high-resolution electrocorticography signals in real-time. We tested the system with two human subjects implanted with electrode arrays over the lateral brain surface. Subjects listened to multiple repetitions of ten sentences, and rtNSR classified what was heard in real-time from neural activity patterns using direct sentence-level and HMM-based phoneme-level classification schemes. Main results. We observed single-trial sentence classification accuracies of 90% or higher for each subject with less than 7 minutes of training data, demonstrating the ability of rtNSR to use cortical recordings to perform accurate real-time speech decoding in a limited vocabulary setting. Significance. Further development and testing of the package with different speech paradigms could influence the design of future speech neuroprosthetic applications.

  18. Neyman-Pearson classification algorithms and NP receiver operating characteristics

    PubMed Central

    Tong, Xin; Feng, Yang; Li, Jingyi Jessica

    2018-01-01

    In many binary classification applications, such as disease diagnosis and spam detection, practitioners commonly face the need to limit type I error (that is, the conditional probability of misclassifying a class 0 observation as class 1) so that it remains below a desired threshold. To address this need, the Neyman-Pearson (NP) classification paradigm is a natural choice; it minimizes type II error (that is, the conditional probability of misclassifying a class 1 observation as class 0) while enforcing an upper bound, α, on the type I error. Despite its century-long history in hypothesis testing, the NP paradigm has not been well recognized and implemented in classification schemes. Common practices that directly limit the empirical type I error to no more than α do not satisfy the type I error control objective because the resulting classifiers are likely to have type I errors much larger than α, and the NP paradigm has not been properly implemented in practice. We develop the first umbrella algorithm that implements the NP paradigm for all scoring-type classification methods, such as logistic regression, support vector machines, and random forests. Powered by this algorithm, we propose a novel graphical tool for NP classification methods: NP receiver operating characteristic (NP-ROC) bands motivated by the popular ROC curves. NP-ROC bands will help choose α in a data-adaptive way and compare different NP classifiers. We demonstrate the use and properties of the NP umbrella algorithm and NP-ROC bands, available in the R package nproc, through simulation and real data studies. PMID:29423442

  19. Neyman-Pearson classification algorithms and NP receiver operating characteristics.

    PubMed

    Tong, Xin; Feng, Yang; Li, Jingyi Jessica

    2018-02-01

    In many binary classification applications, such as disease diagnosis and spam detection, practitioners commonly face the need to limit type I error (that is, the conditional probability of misclassifying a class 0 observation as class 1) so that it remains below a desired threshold. To address this need, the Neyman-Pearson (NP) classification paradigm is a natural choice; it minimizes type II error (that is, the conditional probability of misclassifying a class 1 observation as class 0) while enforcing an upper bound, α, on the type I error. Despite its century-long history in hypothesis testing, the NP paradigm has not been well recognized and implemented in classification schemes. Common practices that directly limit the empirical type I error to no more than α do not satisfy the type I error control objective because the resulting classifiers are likely to have type I errors much larger than α, and the NP paradigm has not been properly implemented in practice. We develop the first umbrella algorithm that implements the NP paradigm for all scoring-type classification methods, such as logistic regression, support vector machines, and random forests. Powered by this algorithm, we propose a novel graphical tool for NP classification methods: NP receiver operating characteristic (NP-ROC) bands motivated by the popular ROC curves. NP-ROC bands will help choose α in a data-adaptive way and compare different NP classifiers. We demonstrate the use and properties of the NP umbrella algorithm and NP-ROC bands, available in the R package nproc, through simulation and real data studies.
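    The order-statistic rule behind the umbrella algorithm can be sketched in a few lines. Given scores of held-out class-0 data, it returns the smallest order statistic whose probability of violating the type I error bound α is at most a tolerance δ. This is a Python stand-in with our own naming; the authors' implementation is the R package nproc:

```python
import math

def np_threshold(scores0, alpha=0.05, delta=0.05):
    """Choose a threshold t so that classifying 'score > t' as class 1
    satisfies P(type I error > alpha) <= delta, via the k-th order statistic
    of held-out class-0 scores."""
    s = sorted(scores0)
    n = len(s)
    for k in range(1, n + 1):
        # P(Binomial(n, 1 - alpha) >= k): the chance that the k-th order
        # statistic falls below the true (1 - alpha)-quantile of class 0
        viol = sum(math.comb(n, j) * (1 - alpha) ** j * alpha ** (n - j)
                   for j in range(k, n + 1))
        if viol <= delta:
            return s[k - 1]
    raise ValueError("too few class-0 scores to guarantee the bound")

# With n = 100 and alpha = delta = 0.05 the rule picks the 99th smallest
# score, well above the naive empirical 95th percentile
thr = np_threshold([i / 100 for i in range(100)])
```

    This illustrates why naive empirical thresholding fails: guaranteeing the bound with high probability requires a more conservative cut than the empirical α-quantile.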

  20. Family Traits of Galaxies: From the Tuning Fork to a Physical Classification in a Multi-Wavelength Context

    NASA Astrophysics Data System (ADS)

    Rampazzo, Roberto; D'Onofrio, Mauro; Zaggia, Simone; Elmegreen, Debra M.; Laurikainen, Eija; Duc, Pierre-Alain; Gallart, Carme; Fraix-Burnet, Didier

    At the time of the Great Debate, nebulæ were recognized to have different morphologies, and first classifications, sometimes only descriptive, were attempted. These early classification systems are well documented in Allan Sandage's 2005 review (Sandage 2005), which emphasized the debt, in terms of the continuity of forms of spiral galaxies, that Hubble's classification scheme owes to the Reynolds system proposed in 1920 (Reynolds, 1920).

  1. Multi-dimensional classification of GABAergic interneurons with Bayesian network-modeled label uncertainty.

    PubMed

    Mihaljević, Bojan; Bielza, Concha; Benavides-Piccione, Ruth; DeFelipe, Javier; Larrañaga, Pedro

    2014-01-01

    Interneuron classification is an important and long-debated topic in neuroscience. A recent study provided a data set of digitally reconstructed interneurons classified by 42 leading neuroscientists according to a pragmatic classification scheme composed of five categorical variables, namely the interneuron type and four features of axonal morphology. From this data set we have now learned a model which can classify interneurons, on the basis of their axonal morphometric parameters, into these five descriptive variables simultaneously. Because of differences in opinion among the neuroscientists, especially regarding neuronal type, for many interneurons we lacked a unique, agreed-upon classification which we could use to guide model learning. Instead, we guided model learning with a probability distribution over the neuronal type and the axonal features, obtained for each interneuron from the neuroscientists' classification choices. We conveniently encoded such probability distributions with Bayesian networks, calling them label Bayesian networks (LBNs), and developed a method to predict them. This method predicts an LBN by forming a probabilistic consensus among the LBNs of the interneurons most similar to the one being classified. We used 18 axonal morphometric parameters as predictor variables, 13 of which we introduce in this paper as quantitative counterparts to the categorical axonal features. We were able to accurately predict interneuronal LBNs. Furthermore, when extracting crisp (i.e., non-probabilistic) predictions from the predicted LBNs, our method outperformed related work on interneuron classification. Our results indicate that our method is adequate for multi-dimensional classification of interneurons with probabilistic labels. Moreover, the introduced morphometric parameters are good predictors of interneuron type and the four features of axonal morphology, and thus may serve as objective counterparts to the subjective, categorical axonal features.
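    The consensus idea (average the probabilistic labels of the most similar cells) can be illustrated with a plain nearest-neighbour sketch. The names and toy data are ours, and real LBNs encode richer dependencies than the flat distributions used here:

```python
import numpy as np

def consensus_label_distribution(query, X, label_dists, k=3):
    """Predict a probabilistic label for `query` by averaging the label
    distributions of its k nearest neighbours in morphometric feature space
    (a simplified stand-in for the paper's LBN consensus)."""
    d = np.linalg.norm(X - query, axis=1)
    nearest = np.argsort(d)[:k]
    return label_dists[nearest].mean(axis=0)

# 4 training cells with 2 morphometric features and 3 possible neuronal types
X = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [1.1, 1.0]])
label_dists = np.array([[0.9, 0.1, 0.0],
                        [0.8, 0.2, 0.0],
                        [0.0, 0.3, 0.7],
                        [0.1, 0.2, 0.7]])
p = consensus_label_distribution(np.array([0.05, 0.0]), X, label_dists, k=2)
```

    A crisp prediction, as in the paper's evaluation, is then just the argmax of the consensus distribution.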

  2. Visual classification of feral cat Felis silvestris catus vocalizations.

    PubMed

    Owens, Jessica L; Olsen, Mariana; Fontaine, Amy; Kloth, Christopher; Kershenbaum, Arik; Waller, Sara

    2017-06-01

    Cat vocal behavior, in particular the vocal and social behavior of feral cats, is poorly understood, as are the differences between feral and fully domestic cats. The relationship between feral cat social and vocal behavior is important because of the markedly different ecology of feral and domestic cats, and enhanced comprehension of the repertoire and potential information content of feral cat calls can provide both a better understanding of the domestication and socialization process and improved welfare for feral cats undergoing adoption. Previous studies have used conflicting classification schemes for cat vocalizations, often relying on onomatopoeic or popular descriptions of call types (e.g., "miow"). We studied the vocalizations of 13 unaltered domestic cats that complied with our behavioral definition used to distinguish feral cats from domestic ones. A total of 71 acoustic units were extracted and visually analyzed for the construction of a hierarchical classification of vocal sounds based on acoustic properties. We identified 3 major categories (tonal, pulse, and broadband) that further break down into 8 subcategories, and show a high degree of reliability when sounds are classified blindly by independent observers (Fleiss' kappa = 0.863). Due to the limited behavioral contexts in this study, additional subcategories of cat vocalizations may be identified in the future, but our hierarchical classification system allows for the addition of new categories and new subcategories as they are described. This study shows that cat vocalizations are diverse and complex, and provides an objective and reliable classification system that can be used in future studies.

  3. An evaluation of supervised classifiers for indirectly detecting salt-affected areas at irrigation scheme level

    NASA Astrophysics Data System (ADS)

    Muller, Sybrand Jacobus; van Niekerk, Adriaan

    2016-07-01

    Soil salinity often leads to reduced crop yield and quality and can render soils barren. Irrigated areas are particularly at risk due to intensive cultivation and secondary salinization caused by waterlogging. Regular monitoring of salt accumulation in irrigation schemes is needed to keep its negative effects under control, and the spatial and temporal coverage of remote sensing can provide a cost-effective means of monitoring salt accumulation at irrigation scheme level. This study evaluated a range of pan-fused SPOT-5 derived features (spectral bands, vegetation indices, image textures and image transformations) for classifying salt-affected areas in two distinctly different irrigation schemes in South Africa, namely Vaalharts and Breede River. The relationship between the input features and electrical conductivity measurements was investigated using regression modelling (stepwise linear regression, partial least squares regression, curve-fit regression modelling) and supervised classification (maximum likelihood, nearest neighbour, decision tree analysis, support vector machine and random forests). Classification and regression trees and random forests were used to select the most important features for differentiating salt-affected and unaffected areas. The results showed that the regression analyses produced weak models (R² < 0.4). Better results were achieved using the supervised classifiers, but the algorithms tended to over-estimate salt-affected areas. A key finding was that none of the feature sets or classification algorithms stood out as superior for monitoring salt accumulation at irrigation scheme level. This was attributed to the large variations in the spectral responses of different crop types at different growing stages, coupled with their individual tolerances to saline conditions.

  4. A/T/N: An unbiased descriptive classification scheme for Alzheimer disease biomarkers

    PubMed Central

    Bennett, David A.; Blennow, Kaj; Carrillo, Maria C.; Feldman, Howard H.; Frisoni, Giovanni B.; Hampel, Harald; Jagust, William J.; Johnson, Keith A.; Knopman, David S.; Petersen, Ronald C.; Scheltens, Philip; Sperling, Reisa A.; Dubois, Bruno

    2016-01-01

    Biomarkers have become an essential component of Alzheimer disease (AD) research and because of the pervasiveness of AD pathology in the elderly, the same biomarkers are used in cognitive aging research. A number of current issues suggest that an unbiased descriptive classification scheme for these biomarkers would be useful. We propose the “A/T/N” system in which 7 major AD biomarkers are divided into 3 binary categories based on the nature of the pathophysiology that each measures. “A” refers to the value of a β-amyloid biomarker (amyloid PET or CSF Aβ42); “T,” the value of a tau biomarker (CSF phospho tau, or tau PET); and “N,” biomarkers of neurodegeneration or neuronal injury ([18F]-fluorodeoxyglucose–PET, structural MRI, or CSF total tau). Each biomarker category is rated as positive or negative. An individual score might appear as A+/T+/N−, or A+/T−/N−, etc. The A/T/N system includes the new modality tau PET. It is agnostic to the temporal ordering of mechanisms underlying AD pathogenesis. It includes all individuals in any population regardless of the mix of biomarker findings and therefore is suited to population studies of cognitive aging. It does not specify disease labels and thus is not a diagnostic classification system. It is a descriptive system for categorizing multidomain biomarker findings at the individual person level in a format that is easy to understand and use. Given the present lack of consensus among AD specialists on terminology across the clinically normal to dementia spectrum, a biomarker classification scheme will have broadest acceptance if it is independent from any one clinically defined diagnostic scheme. PMID:27371494
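    Because the scheme is purely descriptive, an A/T/N profile string can be generated mechanically from three binary ratings. A trivial sketch (the function name is ours):

```python
def atn_profile(amyloid, tau, neurodegeneration):
    """Render three binary biomarker ratings as an A/T/N profile string,
    e.g. A+/T+/N- for amyloid-positive, tau-positive, neurodegeneration-negative."""
    sign = lambda positive: "+" if positive else "-"
    return f"A{sign(amyloid)}/T{sign(tau)}/N{sign(neurodegeneration)}"

profile = atn_profile(amyloid=True, tau=True, neurodegeneration=False)
```

    Since each of the three categories is binary, the scheme partitions any study population into 2³ = 8 possible profiles without imposing a diagnosis.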

  5. A Critical Review of Mode of Action (MOA) Assignment Classifications for Ecotoxicology

    EPA Science Inventory

    There are various structure-based classification schemes to categorize chemicals based on mode of action (MOA) which have been applied for both eco and human health toxicology. With increasing calls to assess thousands of chemicals, some of which have little available informatio...

  6. Solar wind classification from a machine learning perspective

    NASA Astrophysics Data System (ADS)

    Heidrich-Meisner, V.; Wimmer-Schweingruber, R. F.

    2017-12-01

The ubiquitous solar wind is well known to come in at least two varieties, the slow solar wind and the coronal hole wind, although this simplified two-type view has been frequently challenged. Existing solar wind categorization schemes rely mainly on different combinations of the solar wind proton speed, the O and C charge state ratios, the Alfvén speed, the expected proton temperature and the specific proton entropy. In available solar wind classification schemes, solar wind from stream interaction regions is often assigned either to coronal hole wind or to slow solar wind, although its plasma properties differ from those of "pure" coronal hole or slow solar wind. As shown in Neugebauer et al. (2016), even if only two solar wind types are assumed, available solar wind categorization schemes differ considerably for intermediate solar wind speeds. Thus, the decision boundary between the coronal hole and the slow solar wind is so far not well defined. In this situation, a machine learning approach to solar wind classification can provide an additional perspective. We apply a well-known machine learning method, k-means, to the task of solar wind classification in order to answer the following questions: (1) How many solar wind types can reliably be identified in our data set, comprised of ten years of solar wind observations from the Advanced Composition Explorer (ACE)? (2) Which combinations of solar wind parameters are particularly useful for solar wind classification? Potential subtypes of slow solar wind are of particular interest because they can provide hints of different source regions or release mechanisms of slow solar wind.
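A minimal sketch of the k-means step on two solar wind parameters. The two-feature choice (proton speed and a charge-state ratio) and the toy values are illustrative assumptions; the paper explores many parameter combinations, and real features would be normalized so that speed does not dominate the distance.

```python
import random

# Minimal k-means sketch for grouping solar wind samples by plasma
# parameters (here: proton speed [km/s] and a normalized charge-state
# ratio). Toy data and feature choice are illustrative only.

def kmeans(points, k, iters=50, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center (squared Euclidean).
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # Recompute each center as its cluster mean (keep it if empty).
        centers = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centers[i]
            for i, c in enumerate(clusters)
        ]
    return centers, clusters

# Toy samples: slow wind is slower with higher charge-state ratios,
# coronal hole wind faster with lower ratios.
samples = [(350, 0.9), (380, 0.8), (400, 1.0),
           (650, 0.1), (700, 0.15), (720, 0.2)]
centers, clusters = kmeans(samples, k=2)
print(sorted(round(c[0]) for c in centers))
```

In practice the interesting question above is the choice of k itself: rerunning with k = 3, 4, … and inspecting cluster stability is how potential slow-wind subtypes would show up.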

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Apellániz, J. Maíz; Sota, A.; Alfaro, E. J.

This is the third installment of the Galactic O-Star Spectroscopic Survey (GOSSS), a massive spectroscopic survey of Galactic O stars, based on new homogeneous, high signal-to-noise ratio, R ∼ 2500 digital observations selected from the Galactic O-Star Catalog. In this paper, we present 142 additional stellar systems with O stars from both hemispheres, bringing the total of O-type systems published within the project to 590. Among the new objects, there are 20 new O stars. We also identify 11 new double-lined spectroscopic binaries, 6 of which are of O+O type and 5 of O+B type, and an additional new triple-lined spectroscopic binary of O+O+B type. We also revise some of the previous GOSSS classifications, present some egregious examples of stars erroneously classified as O-type in the past, introduce the use of luminosity class IV at spectral types O4-O5.5, and adapt the classification scheme to the work of Arias et al.

  8. Characterization of palmprints by wavelet signatures via directional context modeling.

    PubMed

    Zhang, Lei; Zhang, David

    2004-06-01

The palmprint is one of the most reliable physiological characteristics that can be used to distinguish between individuals. Current palmprint-based systems are more user friendly, more cost effective, and require fewer data signatures than traditional fingerprint-based identification systems. The principal lines and wrinkles captured in a low-resolution palmprint image provide more than enough information to uniquely identify an individual. This paper presents a palmprint identification scheme that characterizes a palmprint using a set of statistical signatures. The palmprint is first transformed into the wavelet domain, and the directional context of each wavelet subband is defined and computed in order to collect the predominant coefficients of its principal lines and wrinkles. A set of statistical signatures, which includes gravity center, density, spatial dispersivity and energy, is then defined to characterize the palmprint with the selected directional context values. A classification and identification scheme based on these signatures is subsequently developed. This scheme fully exploits the features of principal lines and prominent wrinkles and achieves satisfactory results. Compared with palmprint verification schemes based on line-segment matching or interesting-point matching, the proposed scheme uses a much smaller amount of data signatures. It also provides a convenient classification strategy and more accurate identification.

  9. Weather types in the South Shetlands (Antarctica) using a circulation type approach

    NASA Astrophysics Data System (ADS)

    Mora, Carla; João Rocha, Maria; Dutra, Emanuel; Trigo, Isabel; Vieira, Gonçalo; Fragoso, Marcelo; Ramos, Miguel

    2010-05-01

Weather types in the South Shetlands (Antarctica) were defined using an automated method based on the Lamb Weather Type classification scheme (Jones et al., 1993). This is an objective classification originally developed for the British Isles (Jones et al., 1993) and also applied with good results to southeast (Goodess and Palutikof, 1998) and northwest Spain (Lorenzo et al., 2009), Portugal (Trigo and DaCamara, 2000) and Greece (Maheras et al., 2004). Daily atmospheric circulation in the South Shetlands region from 1989 to 2009 was classified using a 16-node grid of sea level pressure data from ERA-Interim. The classification is obtained through the comparison of the magnitudes of the directional and rotational components of the geostrophic flow. Basic circulation types were combined into 10 groups of weather types: four directional types (NW, N, S and SW), three anticyclonic types (A, ASW and ANW), and three cyclonic types (C, CSW and CNW). Westerly flow and cyclonic circulation are the most frequent events throughout the year. The sea level pressure field for each weather type is presented and the synoptic characteristics are described. The analysis is based on ERA-Interim fields, including mean sea level pressure, precipitation, cloud cover, humidity and air temperature. Snow thickness modelled using HTESSEL is also considered. Analysis of variance (ANOVA) and multivariate analysis (principal component analysis) are applied to evaluate the characteristics of each weather type. This circulation-type approach has shown good results in the past for the downscaling of precipitation in other regions, and we are interested in evaluating the possibilities that the classification offers for downscaling precipitation, as well as snow and air temperature. For this we will be using observational data at test sites on Livingston and Deception islands. We are also motivated by the possibility of using the circulation-type approach as a predictor in statistical downscaling.
References: Goodess CM, Palutikof JP. 1998. Development of daily rainfall scenarios for southeast Spain using a circulation-type approach to downscaling. International Journal of Climatology 10: 1051-1083. Jones PD, Hulme M, Briffa KR. 1993. A comparison of Lamb circulation types with an objective classification scheme. International Journal of Climatology 13: 655-663. Lorenzo MN, Iglesias I, Taboada JJ, Gómez-Gesteira M. 2009. Relationship between monthly rainfall in northwest Iberian Peninsula and North Atlantic sea surface temperature. International Journal of Climatology. Maheras P, Tolika K, Anagnostopoulou C, Vafiadis M, Patrikas I, Flocas H. 2004. On the relationship between circulation types and changes in rainfall variability in Greece. International Journal of Climatology 24: 1695-1712. Trigo RM, DaCamara C. 2000. Circulation weather types and their influence on the precipitation regime in Portugal. International Journal of Climatology 20: 1559-1581.
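The decision rules behind this kind of automated Lamb typing can be sketched compactly. The thresholds follow the usual Jenkinson-Collison formulation of the scheme (directional when flow dominates, pure cyclonic/anticyclonic when rotation dominates, hybrid in between); computing the westerly flow W, southerly flow S and vorticity Z from the 16-node sea level pressure grid is omitted here.

```python
import math

# Jenkinson-Collison style rules for automated Lamb weather typing:
# the flow is summarized by westerly (W) and southerly (S) geostrophic
# components and a vorticity term (Z); the relative magnitude of
# rotation vs. directional flow selects the type. Deriving W, S, Z
# from gridded SLP is assumed done upstream.

DIRS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def wind_direction(w: float, s: float) -> str:
    """8-point compass direction the geostrophic flow comes FROM."""
    deg = (math.degrees(math.atan2(w, s)) + 180.0) % 360.0
    return DIRS[int((deg + 22.5) % 360 // 45)]

def lamb_type(w: float, s: float, z: float) -> str:
    f = math.hypot(w, s)              # total directional flow strength
    if abs(z) < f:                    # flow dominates: directional type
        return wind_direction(w, s)
    rot = "C" if z > 0 else "A"       # cyclonic vs anticyclonic
    if abs(z) > 2 * f:                # rotation dominates: pure C or A
        return rot
    return rot + wind_direction(w, s) # hybrid, e.g. "CSW" or "ANW"

print(lamb_type(10, 0, 2))   # W
print(lamb_type(3, 3, 5))    # CSW
```

Grouping the resulting 26-odd raw types into the 10 groups used above (four directional, three anticyclonic, three cyclonic) is then a simple lookup.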

  10. Classification of diffuse lung diseases: why and how.

    PubMed

    Hansell, David M

    2013-09-01

    The understanding of complex lung diseases, notably the idiopathic interstitial pneumonias and small airways diseases, owes as much to repeated attempts over the years to classify them as to any single conceptual breakthrough. One of the many benefits of a successful classification scheme is that it allows workers, within and between disciplines, to be clear that they are discussing the same disease. This may be of particular importance in the recruitment of individuals for a clinical trial that requires a standardized and homogeneous study population. Different specialties require fundamentally different things from a classification: for epidemiologic studies, a classification that requires categorization of individuals according to histopathologic pattern is not usually practicable. Conversely, a scheme that simply divides diffuse parenchymal disease into inflammatory and noninflammatory categories is unlikely to further the understanding about the pathogenesis of disease. Thus, for some disease groupings, for example, pulmonary vasculopathies, there may be several appropriate classifications, each with its merits and demerits. There has been an interesting shift in the past few years, from the accepted primacy of histopathology as the sole basis on which the classification of parenchymal lung disease has rested, to new ways of considering how these entities relate to each other. Some inventive thinking has resulted in new classifications that undoubtedly benefit patients and clinicians in their endeavor to improve management and outcome. The challenge of understanding the logic behind current classifications and their shortcomings are explored in various examples of lung diseases.

  11. Video Games: Instructional Potential and Classification.

    ERIC Educational Resources Information Center

    Nawrocki, Leon H.; Winner, Janet L.

    1983-01-01

    Intended to provide a framework and impetus for future investigations of video games, this paper summarizes activities investigating the instructional use of such games, observations by the authors, and a proposed classification scheme and a paradigm to assist in the preliminary selection of instructional video games. Nine references are listed.…

  12. Fabric wrinkle characterization and classification using modified wavelet coefficients and optimized support-vector-machine classifier

    USDA-ARS?s Scientific Manuscript database

    This paper presents a novel wrinkle evaluation method that uses modified wavelet coefficients and an optimized support-vector-machine (SVM) classification scheme to characterize and classify wrinkle appearance of fabric. Fabric images were decomposed with the wavelet transform (WT), and five parame...

  13. Mode of Action (MOA) Assignment Classifications for Ecotoxicology: Evaluation of Available Methods

    EPA Science Inventory

    There are various structure-based classification schemes to categorize chemicals based on mode of action (MOA) which have been applied for both eco and human toxicology. With increasing calls to assess 1000s of chemicals, some of which have little available information other tha...

  14. Surveillance system and method having an operating mode partitioned fault classification model

    NASA Technical Reports Server (NTRS)

    Bickford, Randall L. (Inventor)

    2005-01-01

    A system and method which partitions a parameter estimation model, a fault detection model, and a fault classification model for a process surveillance scheme into two or more coordinated submodels together providing improved diagnostic decision making for at least one determined operating mode of an asset.

  15. Structure D'Ensemble, Multiple Classification, Multiple Seriation and Amount of Irrelevant Information

    ERIC Educational Resources Information Center

    Hamel, B. Remmo; Van Der Veer, M. A. A.

    1972-01-01

A significant positive correlation between multiple classification and multiple seriation was found in testing 65 children aged 6 to 8 years at the stage of concrete operations. This is interpreted as support for the existence of a structure d'ensemble of operational schemes in the period of concrete operations. (Authors)

  16. Statistical analysis of textural features for improved classification of oral histopathological images.

    PubMed

    Muthu Rama Krishnan, M; Shah, Pratik; Chakraborty, Chandan; Ray, Ajoy K

    2012-04-01

The objective of this paper is to provide an improved technique which can assist oncopathologists in the correct screening of oral precancerous conditions, especially oral submucous fibrosis (OSF), with significant accuracy on the basis of collagen fibres in the sub-epithelial connective tissue. The proposed scheme is composed of collagen fibre segmentation, textural feature extraction and selection, screening performance enhancement under Gaussian transformation, and finally classification. In this study, collagen fibres are segmented on R, G, B color channels using a back-propagation neural network from 60 normal and 59 OSF histological images, followed by histogram specification for reducing the stain intensity variation. Textural features of the collagen area are then extracted using fractal approaches, viz. differential box counting and the Brownian motion curve. Feature selection is done using the Kullback-Leibler (KL) divergence criterion, and the screening performance is evaluated based on various statistical tests to confirm Gaussian nature. Here, the screening performance is enhanced under Gaussian transformation of the non-Gaussian features using a hybrid distribution. Moreover, the routine screening is designed based on two statistical classifiers, viz. Bayesian classification and support vector machines (SVM), to classify normal and OSF. It is observed that SVM with a linear kernel function provides better classification accuracy (91.64%) as compared to the Bayesian classifier. The addition of fractal features of collagen under Gaussian transformation improves the Bayesian classifier's performance from 80.69% to 90.75%. Results are studied and discussed.
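The KL-divergence feature-selection step can be sketched as follows. As a simplification consistent with the abstract's Gaussian-transformation theme, each feature is assumed Gaussian within each class, so the divergence between the normal and OSF class distributions has a closed form; features are then ranked by symmetric divergence. The toy feature values are invented for illustration.

```python
import math

# Rank features by a Kullback-Leibler divergence criterion, assuming
# Gaussian class-conditional distributions per feature (a simplification).

def gaussian_kl(mu_p, sd_p, mu_q, sd_q):
    """KL(p || q) between two univariate Gaussians, closed form."""
    return math.log(sd_q / sd_p) + (sd_p**2 + (mu_p - mu_q)**2) / (2 * sd_q**2) - 0.5

def symmetric_kl(class_a, class_b):
    """Symmetric KL divergence between Gaussian fits of two samples."""
    def fit(xs):
        mu = sum(xs) / len(xs)
        var = sum((x - mu) ** 2 for x in xs) / (len(xs) - 1)
        return mu, math.sqrt(var)
    (ma, sa), (mb, sb) = fit(class_a), fit(class_b)
    return gaussian_kl(ma, sa, mb, sb) + gaussian_kl(mb, sb, ma, sa)

# Toy example: feature 1 separates the two classes far better than
# feature 2, so it receives the larger divergence score.
normal_f1, osf_f1 = [1.0, 1.1, 0.9, 1.05], [2.0, 2.1, 1.9, 2.05]
normal_f2, osf_f2 = [1.0, 1.2, 0.8, 1.1], [1.05, 1.15, 0.9, 1.0]
assert symmetric_kl(normal_f1, osf_f1) > symmetric_kl(normal_f2, osf_f2)
```

Keeping only the top-ranked features before training the Bayesian or SVM classifier is the selection step the abstract describes.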

  17. Performance evaluation of object based greenhouse detection from Sentinel-2 MSI and Landsat 8 OLI data: A case study from Almería (Spain)

    NASA Astrophysics Data System (ADS)

    Novelli, Antonio; Aguilar, Manuel A.; Nemmaoui, Abderrahim; Aguilar, Fernando J.; Tarantino, Eufemia

    2016-10-01

This paper shows the first comparison between data from the Sentinel-2 (S2) Multi Spectral Instrument (MSI) and the Landsat 8 (L8) Operational Land Imager (OLI) aimed at greenhouse detection. Two scenes closely related in time, one for each sensor, were classified by using Object Based Image Analysis and Random Forest (RF). The RF input consisted of several object-based features computed from spectral bands, including mean values, spectral indices and textural features. S2 and L8 data comparisons were also extended using a common segmentation dataset extracted from VHR WorldView-2 (WV2) imagery to test differences due only to their specific spectral contribution. The best band combinations to perform segmentation were found through a modified version of the Euclidean Distance 2 index. Four different RF classification schemes were considered, achieving 89.1%, 91.3%, 90.9% and 93.4% as the best overall accuracies, respectively, evaluated over the whole study area.

  18. Semi-Automated Classification of Seafloor Data Collected on the Delmarva Inner Shelf

    NASA Astrophysics Data System (ADS)

    Sweeney, E. M.; Pendleton, E. A.; Brothers, L. L.; Mahmud, A.; Thieler, E. R.

    2017-12-01

    We tested automated classification methods on acoustic bathymetry and backscatter data collected by the U.S. Geological Survey (USGS) and National Oceanic and Atmospheric Administration (NOAA) on the Delmarva inner continental shelf to efficiently and objectively identify sediment texture and geomorphology. Automated classification techniques are generally less subjective and take significantly less time than manual classification methods. We used a semi-automated process combining unsupervised and supervised classification techniques to characterize seafloor based on bathymetric slope and relative backscatter intensity. Statistical comparison of our automated classification results with those of a manual classification conducted on a subset of the acoustic imagery indicates that our automated method was highly accurate (95% total accuracy and 93% Kappa). Our methods resolve sediment ridges, zones of flat seafloor and areas of high and low backscatter. We compared our classification scheme with mean grain size statistics of samples collected in the study area and found that strong correlations between backscatter intensity and sediment texture exist. High backscatter zones are associated with the presence of gravel and shells mixed with sand, and low backscatter areas are primarily clean sand or sand mixed with mud. Slope classes further elucidate textural and geomorphologic differences in the seafloor, such that steep slopes (>0.35°) with high backscatter are most often associated with the updrift side of sand ridges and bedforms, whereas low slope with high backscatter correspond to coarse lag or shell deposits. Low backscatter and high slopes are most often found on the downdrift side of ridges and bedforms, and low backscatter and low slopes identify swale areas and sand sheets. We found that poor acoustic data quality was the most significant cause of inaccurate classification results, which required additional user input to mitigate. 
Our method worked well along the primarily sandy Delmarva inner continental shelf, and outlines a method that can be used to efficiently and consistently produce surficial geologic interpretations of the seafloor from ground-truthed geophysical or hydrographic data.
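The slope/backscatter pairings described above amount to a small rule table. In this sketch, the 0.35° slope break comes from the abstract, while the backscatter threshold is a hypothetical placeholder for whatever high/low split the unsupervised stage produced.

```python
# Rule-based sketch of the slope/backscatter seafloor classes described
# above. SLOPE_BREAK_DEG is from the abstract; BACKSCATTER_BREAK is an
# illustrative placeholder, not a value from the study.

SLOPE_BREAK_DEG = 0.35
BACKSCATTER_BREAK = -25.0  # dB; hypothetical split

def seafloor_class(slope_deg: float, backscatter_db: float) -> str:
    steep = slope_deg > SLOPE_BREAK_DEG
    bright = backscatter_db > BACKSCATTER_BREAK
    if steep and bright:
        return "updrift ridge/bedform flank"
    if not steep and bright:
        return "coarse lag or shell deposit"
    if steep and not bright:
        return "downdrift ridge/bedform flank"
    return "swale or sand sheet"

print(seafloor_class(0.5, -20.0))  # steep + high backscatter
print(seafloor_class(0.1, -30.0))  # flat + low backscatter
```

Applied per grid cell over the bathymetry and backscatter rasters, rules like these reproduce the supervised stage of the semi-automated workflow.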

  19. Cluster analysis based on dimensional information with applications to feature selection and classification

    NASA Technical Reports Server (NTRS)

    Eigen, D. J.; Fromm, F. R.; Northouse, R. A.

    1974-01-01

    A new clustering algorithm is presented that is based on dimensional information. The algorithm includes an inherent feature selection criterion, which is discussed. Further, a heuristic method for choosing the proper number of intervals for a frequency distribution histogram, a feature necessary for the algorithm, is presented. The algorithm, although usable as a stand-alone clustering technique, is then utilized as a global approximator. Local clustering techniques and configuration of a global-local scheme are discussed, and finally the complete global-local and feature selector configuration is shown in application to a real-time adaptive classification scheme for the analysis of remote sensed multispectral scanner data.
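The abstract does not spell out its heuristic for choosing the number of histogram intervals; as an illustrative stand-in, the sketch below uses Sturges' rule, a classical choice for bin count, together with the frequency-distribution histogram it feeds.

```python
import math

# Illustrative stand-in for the histogram-interval heuristic: Sturges'
# rule for the number of bins, plus a simple frequency histogram.

def sturges_bins(n: int) -> int:
    """Sturges' rule: ceil(log2 n) + 1 bins for n samples."""
    return math.ceil(math.log2(n)) + 1

def histogram(data, bins):
    """Counts per equal-width bin over [min(data), max(data)]."""
    lo, hi = min(data), max(data)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for x in data:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    return counts

data = [1, 2, 2, 3, 3, 3, 4, 4, 5, 9]
print(sturges_bins(len(data)))                    # 5 bins for 10 samples
print(histogram(data, sturges_bins(len(data))))
```

Any comparable heuristic could be substituted; the clustering algorithm only needs a reasonable, data-driven bin count.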

  20. GRB 060614: a Fake Short Gamma-Ray Burst

    NASA Astrophysics Data System (ADS)

    Caito, L.; Bernardini, M. G.; Bianco, C. L.; Dainotti, M. G.; Guida, R.; Ruffini, R.

    2008-05-01

    The explosion of GRB 060614 produced a deep break in the GRB scenario and opened new horizons of investigation because it can't be traced back to any traditional scheme of classification. In fact, it has features both of long bursts and of short bursts and, above all, it is the first case of long duration near GRB without any bright Ib/c associated Supernova. We will show that, in our canonical GRB scenario [1], this ``anomalous'' situation finds a natural interpretation and allows us to discuss a possible variation to the traditional classification scheme, introducing the distinction between ``genuine'' and ``fake'' short bursts.

  1. GRB 060614: a progress report

    NASA Astrophysics Data System (ADS)

    Caito, L.; Bernardini, M. G.; Bianco, C. L.; Dainotti, M. G.; Guida, R.; Ruffini, R.

    2008-01-01

The explosion of GRB 060614, detected by the Swift satellite, produced a deep break in the GRB scenario opening new horizons of investigation, because it can't be traced back to any traditional scheme of classification. In fact, it manifests peculiarities both of long bursts and of short bursts. Above all, it is the first case of long duration near GRB without any bright Ib/c associated Supernova. We will show that, in our canonical GRB scenario [1], this ``anomalous'' situation finds a natural interpretation and allows us to discuss a possible variation to the traditional classification scheme, introducing the distinction between ``genuine'' and ``fake'' short bursts.

  2. An objective daily Weather Type classification for Iberia since 1850; patterns, trends, variability and impact in precipitation

    NASA Astrophysics Data System (ADS)

    Ramos, A. M.; Trigo, R. M.; Lorenzo, M. N.; Vaquero, J. M.; Gallego, M. C.; Valente, M. A.; Gimeno, L.

    2009-04-01

In recent years a large number of automated classifications of atmospheric circulation patterns have been published covering the entire European continent or specific sub-regions (Huth et al., 2008). This generalized use of objective classifications results from their relatively straightforward computation but, crucially, from their capacity to provide a simple description of typical synoptic conditions as well as of their climatic and environmental impact. For this purpose, the vast majority of authors have employed the Reanalysis datasets from either the NCEP/NCAR or the ECMWF projects. However, both these widely used datasets suffer from important caveats, namely their restricted temporal coverage, which is limited to the last six decades (NCEP/NCAR since 1948 and ECMWF since 1958). This limitation has been partially mitigated by the recent availability of continuous daily mean sea level pressure data obtained within the European project EMULATE, which extended the historic records over the extra-tropical Atlantic and Europe (70°-25° N by 70° W-50° E) for the period 1850 to the present (Ansell et al., 2006). Here we have used the extended EMULATE dataset to construct an automated version of the Lamb Weather Type (WT) classification scheme (Jones et al., 1993) adapted for the center of the Iberian Peninsula. We have identified 10 basic WTs (Cyclonic, Anticyclonic and 8 directional types) following a methodology similar to that previously adopted by Trigo and DaCamara (2000, for Portugal) and Lorenzo et al. (2008, for Galicia, northwestern Iberia). We have evaluated trends in the monthly/seasonal frequency of each WT for the entire period and for several shorter periods. Finally, we use the long-term precipitation time series from Lisbon (recently digitized) and Cadiz (southern Spain) to evaluate the impact of each WT on the precipitation regime.
It is shown that the Anticyclonic (A) type, although being the most frequent class in winter, gives a rather small contribution to the winter precipitation amount observed on a daily basis. On the other hand, the three wettest WTs, namely the Cyclonic (C), South-westerly (SW) and Westerly (W) types, together representing roughly a third of all winter days, account for more than 60% of the observed daily precipitation. It is shown that the large inter-annual variability of precipitation in both cities is highly related to the corresponding inter-annual variability of the wet WTs. References: Ansell, T. J. et al. (2006) Daily mean sea level pressure reconstructions for the European - North Atlantic region for the period 1850-2003. Journal of Climate 19: 2717-2742, doi:10.1175/JCLI3775.1. Huth R., Beck C., Philipp A., Demuzere M., Ustrnul Z., Cahynová M., Kyselý J., Tveito O.E. (2008) Classifications of atmospheric circulation patterns: recent advances and applications. Trends and Directions in Climate Research: Ann. N.Y. Acad. Sci. 1146: 105-152. Jones, P. D., M. Hulme, K. R. Briffa (1993) A comparison of Lamb circulation types with an objective classification scheme. Int. J. Climatol. 13: 655-663. Lorenzo M.N., Taboada J.J. and Gimeno L. (2008) Links between circulation weather types and teleconnection patterns and their influence on precipitation patterns in Galicia (NW Spain). Int. J. Climatol. DOI: 10.1002/joc.1646. Trigo R.M. and Da Camara C.C. (2000) Circulation weather types and their influence on the precipitation regime in Portugal. Int. J. Climatol. 20: 1559-1581.

  3. Classification in Astronomy: Past and Present

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric

    2012-03-01

    Astronomers have always classified celestial objects. The ancient Greeks distinguished between asteros, the fixed stars, and planetos, the roving stars. The latter were associated with the Gods and, starting with Plato in his dialog Timaeus, provided the first mathematical models of celestial phenomena. Giovanni Hodierna classified nebulous objects, seen with a Galilean refractor telescope in the mid-seventeenth century into three classes: "Luminosae," "Nebulosae," and "Occultae." A century later, Charles Messier compiled a larger list of nebulae, star clusters and galaxies, but did not attempt a classification. Classification of comets was a significant enterprise in the 19th century: Alexander (1850) considered two groups based on orbit sizes, Lardner (1853) proposed three groups of orbits, and Barnard (1891) divided them into two classes based on morphology. Aside from the segmentation of the bright stars into constellations, most stellar classifications were based on colors and spectral properties. During the 1860s, the pioneering spectroscopist Angelo Secchi classified stars into five classes: white, yellow, orange, carbon stars, and emission line stars. After many debates, the stellar spectral sequence was refined by the group at Harvard into the familiar OBAFGKM spectral types, later found to be a sequence on surface temperature (Cannon 1926). The spectral classification is still being extended with recent additions of O2 hot stars (Walborn et al. 2002) and L and T brown dwarfs (Kirkpatrick 2005). Townley (1913) reviews 30 years of variable star classification, emerging with six classes with five subclasses. The modern classification of variable stars has about 80 (sub)classes, and is still under debate (Samus 2009). Shortly after his confirmation that some nebulae are external galaxies, Edwin Hubble (1926) proposed his famous bifurcated classification of galaxy morphologies with three classes: ellipticals, spirals, and irregulars. 
These classes are still used today with many refinements by Gerard de Vaucouleurs and others. Supernovae, nearly all of which are found in external galaxies, have a complicated classification scheme: Type I with subtypes Ia, Ib, Ic, Ib/c pec, and Type II with subtypes IIb, IIL, IIP, and IIn (Turatto 2003). The classification is based on elemental abundances in optical spectra and on optical light curve shapes. Tadhunter (2009) presents a three-dimensional classification of active galactic nuclei involving radio power, emission line width, and nuclear luminosity. These taxonomies have played enormously important roles in the development of astronomy, yet all were developed using heuristic methods. Many are based on qualitative and subjective assessments of spatial, temporal, or spectral properties. A qualitative, morphological approach to astronomical studies was explicitly promoted by Zwicky (1957). Other classifications are based on quantitative criteria, but these criteria were developed by subjective examination of training datasets. For example, starburst galaxies are discriminated from narrow-line Seyfert galaxies by a curved line in a diagram of the ratios of four emission lines (Veilleux and Osterbrock 1987). Class II young stellar objects have been defined by a rectangular region in a mid-infrared color-color diagram (Allen et al. 2004). Short and hard gamma-ray bursts are discriminated by a dip in the distribution of burst durations (Kouveliotou et al. 2000). In no case was a statistical or algorithmic procedure used to define the classes.

  4. Planetree health information services: public access to the health information people want.

    PubMed Central

    Cosgrove, T L

    1994-01-01

    In July 1981, the Planetree Health Resource Center opened on the San Francisco campus of California Pacific Medical Center (Pacific Presbyterian Medical Center). Planetree was founded on the belief that access to information can empower people and help them face health and medical challenges. The Health Resource Center was created to provide medical library and health information resources to the general public. Over the last twelve years, Planetree has tried to develop a consumer health library collection and information service that is responsive to the needs and interests of a diverse public. In an effort to increase accessibility to the medical literature, a consumer health library classification scheme was created for the organization of library materials. The scheme combines the specificity and sophistication of the National Library of Medicine classification scheme with the simplicity of common lay terminology. PMID:8136762

  5. User oriented ERTS-1 images. [vegetation identification in Canada through image enhancement

    NASA Technical Reports Server (NTRS)

    Shlien, S.; Goodenough, D.

    1974-01-01

Photographic reproductions of ERTS-1 images are capable of displaying only a portion of the total information available from the multispectral scanner. Methods are being developed to generate ERTS-1 images oriented towards special users such as agriculturists, foresters, and hydrologists by applying image enhancement techniques and interactive statistical classification schemes. Spatial boundaries and linear features can be emphasized and delineated using simple filters. Linear and nonlinear transformations can be applied to the spectral data to emphasize certain ground information. An automatic classification scheme was developed to identify particular ground cover classes such as fallow, grain, rape seed or various vegetation covers. The scheme applies the maximum likelihood decision rule to the spectral information and classifies the ERTS-1 image on a pixel by pixel basis. Preliminary results indicate that the classifier has limited success in distinguishing crops, but is well adapted for identifying different types of vegetation.
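The per-pixel maximum likelihood decision rule can be sketched directly. Each ground-cover class is modeled as a Gaussian over the spectral bands (diagonal covariance here for brevity, a simplification of the full rule) and a pixel is assigned to the class with the highest log-likelihood. The spectral signatures are invented illustrative values, not ERTS-1 statistics.

```python
import math

# Maximum likelihood pixel classification sketch: Gaussian per class
# with per-band (diagonal) variances; the pixel goes to the class
# maximizing the log-likelihood. Toy signatures only.

def log_likelihood(pixel, mean, var):
    return sum(
        -0.5 * (math.log(2 * math.pi * v) + (x - m) ** 2 / v)
        for x, m, v in zip(pixel, mean, var)
    )

def classify_pixel(pixel, classes):
    """classes: {name: (per-band means, per-band variances)}."""
    return max(classes, key=lambda c: log_likelihood(pixel, *classes[c]))

# Hypothetical two-band spectral signatures for three cover classes.
classes = {
    "fallow": ([40.0, 20.0], [9.0, 9.0]),
    "grain":  ([60.0, 55.0], [16.0, 16.0]),
    "rape":   ([80.0, 70.0], [16.0, 25.0]),
}
print(classify_pixel([62.0, 53.0], classes))
```

In the full scheme, class means and covariances come from training pixels supplied interactively by the user, and the rule is applied to every pixel of the scene.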

  6. Three learning phases for radial-basis-function networks.

    PubMed

    Schwenker, F; Kestler, H A; Palm, G

    2001-05-01

In this paper, learning algorithms for radial basis function (RBF) networks are discussed. Whereas multilayer perceptrons (MLP) are typically trained with backpropagation algorithms, starting the training procedure with a random initialization of the MLP's parameters, an RBF network may be trained in many different ways. We categorize these RBF training methods into one-, two-, and three-phase learning schemes. Two-phase RBF learning is a very common learning scheme. The two layers of an RBF network are learnt separately; first the RBF layer is trained, including the adaptation of centers and scaling parameters, and then the weights of the output layer are adapted. RBF centers may be trained by clustering, vector quantization and classification tree algorithms, and the output layer by supervised learning (through gradient descent or pseudo inverse solution). Results from numerical experiments of RBF classifiers trained by two-phase learning are presented in three completely different pattern recognition applications: (a) the classification of 3D visual objects; (b) the recognition of hand-written digits (2D objects); and (c) the categorization of high-resolution electrocardiograms given as a time series (1D objects) and as a set of features extracted from these time series. In these applications, it can be observed that the performance of RBF classifiers trained with two-phase learning can be improved through a third backpropagation-like training phase of the RBF network, adapting the whole set of parameters (RBF centers, scaling parameters, and output layer weights) simultaneously. This, we call three-phase learning in RBF networks. A practical advantage of two- and three-phase learning in RBF networks is the possibility to use unlabeled training data for the first training phase. Support vector (SV) learning in RBF networks is a different learning approach.
SV learning can be considered, in this context of learning, as a special type of one-phase learning, where only the output layer weights of the RBF network are calculated, and the RBF centers are restricted to be a subset of the training data. Numerical experiments with several classifier schemes including k-nearest-neighbor, learning vector quantization and RBF classifiers trained through two-phase, three-phase and support vector learning are given. The performance of the RBF classifiers trained through SV learning and three-phase learning is superior to the results of two-phase learning, but SV learning often leads to complex network structures, since the number of support vectors is not a small fraction of the total number of data points.
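
    As a concrete illustration of the two-phase scheme described above (unsupervised placement of RBF centers, then a pseudo-inverse solution for the output weights), here is a minimal NumPy sketch on synthetic two-class data; the data, center count, and scaling parameter are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data: two well-separated Gaussian blobs (illustrative)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(3, 0.5, (50, 2))])
y = np.hstack([np.zeros(50), np.ones(50)])

def kmeans(X, k, iters=20):
    """Phase 1 (unsupervised): place RBF centers with a naive k-means."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return centers

centers = kmeans(X, k=4)
sigma = 1.0  # shared scaling parameter (illustrative)

def phi(X):
    """Hidden-layer activations: Gaussian RBFs around the learned centers."""
    return np.exp(-((X[:, None] - centers) ** 2).sum(-1) / (2 * sigma ** 2))

# Phase 2 (supervised): output weights by the pseudo-inverse solution
W = np.linalg.pinv(phi(X)) @ y

pred = (phi(X) @ W > 0.5).astype(int)
print("training accuracy:", (pred == y).mean())
```

    A third phase, as described in the abstract, would then fine-tune centers, scaling parameters, and output weights jointly by gradient descent.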

  7. Cloud cover determination in polar regions from satellite imagery

    NASA Technical Reports Server (NTRS)

    Barry, R. G.; Maslanik, J. A.; Key, J. R.

    1987-01-01

    The spectral and spatial characteristics of clouds and surface conditions in the polar regions are defined, and calibrated, geometrically correct data sets suitable for quantitative analysis are created. Ways are explored in which this information can be applied to cloud classifications, either as new methods or as extensions to existing classification schemes. A methodology is developed that uses automated techniques to merge Advanced Very High Resolution Radiometer (AVHRR) and Scanning Multichannel Microwave Radiometer (SMMR) data, and to apply first-order calibration and zenith angle corrections to the AVHRR imagery. Cloud cover and surface types are manually interpreted, and manual methods are used to define relatively pure training areas to describe the textural and multispectral characteristics of clouds over several surface conditions. The effects of viewing angle and bidirectional reflectance differences are studied for several classes, and the effectiveness of some key components of existing classification schemes is tested.

  8. Cluster designs to assess the prevalence of acute malnutrition by lot quality assurance sampling: a validation study by computer simulation.

    PubMed

    Olives, Casey; Pagano, Marcello; Deitchler, Megan; Hedt, Bethany L; Egge, Kari; Valadez, Joseph J

    2009-04-01

    Traditional lot quality assurance sampling (LQAS) methods require simple random sampling to guarantee valid results. However, cluster sampling has been proposed to reduce the number of random starting points. This study uses simulations to examine the classification error of two such designs, a 67x3 (67 clusters of three observations) and a 33x6 (33 clusters of six observations) sampling scheme, to assess the prevalence of global acute malnutrition (GAM). Further, we explore the use of a 67x3 sequential sampling scheme for LQAS classification of GAM prevalence. Results indicate that, for independent clusters with moderate intracluster correlation for the GAM outcome, the three sampling designs maintain approximate validity for LQAS analysis. Sequential sampling can substantially reduce the average sample size required for data collection. The presence of intercluster correlation can dramatically impact the classification error associated with LQAS analysis.
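
    The 67x3 design discussed above can be explored with a small Monte Carlo sketch. The decision rule d, the intracluster-correlation value, and the beta-binomial cluster model below are illustrative assumptions, not the authors' simulation code.

```python
import numpy as np

rng = np.random.default_rng(1)

def classify_rate(true_prev, d, n_clusters=67, m=3, rho=0.1, reps=5000):
    """Fraction of simulated 67x3 surveys classified 'high GAM prevalence',
    i.e. with total positive count >= d. Beta-binomial clusters mimic
    intracluster correlation rho (hypothetical model)."""
    a = true_prev * (1 - rho) / rho
    b = (1 - true_prev) * (1 - rho) / rho
    p = rng.beta(a, b, size=(reps, n_clusters))   # cluster-level prevalence
    counts = rng.binomial(m, p).sum(axis=1)       # total positives per survey
    return (counts >= d).mean()

# With an illustrative rule d = 25, areas at 10% GAM are rarely flagged while
# areas at 20% usually are; the overlap drives the LQAS classification error.
print(classify_rate(0.10, d=25), classify_rate(0.20, d=25))
```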

  9. Classification of Palmprint Using Principal Line

    NASA Astrophysics Data System (ADS)

    Prasad, Munaga V. N. K.; Kumar, M. K. Pramod; Sharma, Kuldeep

    In this paper, a new classification scheme for palmprint is proposed. Palmprint is one of the reliable physiological characteristics that can be used to authenticate an individual. Palmprint classification provides an important indexing mechanism in a very large palmprint database. Here, the palmprint database is initially categorized into two groups, a right-hand group and a left-hand group. Then, each group is further classified based on the distance traveled by the principal line, i.e., the heart line. During preprocessing, a rectangular Region of Interest (ROI) in which only the heart line is present is extracted. The ROI is then divided into 6 regions and, depending upon the regions that the heart line traverses, the palmprint is classified accordingly. Consequently, our scheme allows 64 categories for each group, forming a total of 128 possible categories. In practice, the proposed technique populates only 15 such categories and classifies no more than 20.96% of the images into any single category.
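
    The count of 64 categories per group follows from the heart line either entering or not entering each of the 6 ROI regions (2^6 = 64, doubled to 128 by the left-hand/right-hand split). A minimal sketch of such an encoding, with a hypothetical bit-per-region code:

```python
from itertools import combinations

def category(regions_traversed):
    """Encode which of the 6 ROI regions the heart line traverses as a 6-bit code."""
    code = 0
    for r in regions_traversed:  # regions numbered 0..5
        code |= 1 << r
    return code

# Every subset of the 6 regions gets a distinct category: 2**6 = 64 per hand
all_codes = {category(c) for n in range(7) for c in combinations(range(6), n)}
print(len(all_codes))  # 64
```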

  10. Classification of topological phonons in linear mechanical metamaterials

    PubMed Central

    Süsstrunk, Roman

    2016-01-01

    Topological phononic crystals, like their electronic counterparts, are characterized by a bulk–edge correspondence where the interior of a material dictates the existence of stable surface or boundary modes. In the mechanical setup, such surface modes can be used for various applications such as wave guiding, vibration isolation, or the design of static properties such as stable floppy modes where parts of a system move freely. Here, we provide a classification scheme of topological phonons based on local symmetries. We import and adapt the classification of noninteracting electron systems and embed it into the mechanical setup. Moreover, we provide an extensive set of examples that illustrate our scheme and can be used to generate models in unexplored symmetry classes. Our work unifies the vast recent literature on topological phonons and paves the way to future applications of topological surface modes in mechanical metamaterials. PMID:27482105

  11. WIRED for EC: New White Dwarfs with WISE Infrared Excesses and New Classification Schemes from the Edinburgh–Cape Blue Object Survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dennihy, E.; Clemens, J. C.; Dunlap, B. H.

    We present a simple method for identifying candidate white dwarf systems with dusty exoplanetary debris based on a single-temperature blackbody model fit to the infrared excess. We apply this technique to a sample of Southern Hemisphere white dwarfs from the recently completed Edinburgh–Cape Blue Object Survey and identify four new promising dusty debris disk candidates. We demonstrate the efficacy of our selection method by recovering three of the four Spitzer-confirmed dusty debris disk systems in our sample. Further investigation using archival high-resolution imaging shows that Spitzer data of the unrecovered fourth object is likely contaminated by a line-of-sight object that either led to a misclassification as a dusty disk in the literature or is confounding our method. Finally, in our diagnostic plot, we show that dusty white dwarfs, which also host gaseous debris, lie along a boundary of our dusty debris disk region, providing clues to the origin and evolution of these especially interesting systems.
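
    The single-temperature blackbody fit can be sketched as follows; the wavelengths, synthetic fluxes, and grid-search fit are illustrative stand-ins, not the authors' pipeline.

```python
import numpy as np

# Physical constants (SI)
h, c, kB = 6.626e-34, 2.998e8, 1.381e-23

def planck(wav_m, T):
    """Planck spectral radiance B_lambda(T) (overall units are arbitrary here)."""
    return (2 * h * c**2 / wav_m**5) / np.expm1(h * c / (wav_m * kB * T))

# Synthetic "infrared excess" at WISE-like W1/W2 wavelengths from a 1000 K
# blackbody with an unknown scale factor (all values illustrative)
wav = np.array([3.4e-6, 4.6e-6])
obs = 1e-12 * planck(wav, 1000.0)

# Single-temperature fit: with a free scale factor, the ratio obs/B(T) is
# constant at the true T, so minimize its normalized spread over a T grid
def misfit(T):
    r = obs / planck(wav, T)
    return np.var(r / r.mean())

Ts = np.linspace(300.0, 2000.0, 2000)
best_T = Ts[np.argmin([misfit(T) for T in Ts])]
print(best_T)  # close to 1000 K
```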

  12. Application of ERTS-1 imagery to the study of caribou movements and winter dispersal in relation to prevailing snowcover

    NASA Technical Reports Server (NTRS)

    Lent, P. C. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. A multiband classification scheme was applied to ERTS-1 MSS digital tape data in a portion of the Yukon Flats area. Primary analytic objectives of mapping the extent of recent wildfire burns and mature forest were realized, illustrating applications to moose and caribou biology. Additionally, the analysis indicated the presence of new lakes as well as the disappearance of lakes present in 1956. Because this is an important waterfowl production area, similar analyses may have significant application potential to waterfowl biology for rapid updating of habitat information. Further field confirmation of this finding is required.

  13. Restoration of Wavelet-Compressed Images and Motion Imagery

    DTIC Science & Technology

    2004-01-01

  14. Cross-mapping the ICNP with NANDA, HHCC, Omaha System and NIC for unified nursing language system development. International Classification for Nursing Practice. International Council of Nurses. North American Nursing Diagnosis Association. Home Health Care Classification. Nursing Interventions Classification.

    PubMed

    Hyun, S; Park, H A

    2002-06-01

    Nursing language plays an important role in describing and defining nursing phenomena and nursing actions. There are numerous vocabularies describing nursing diagnoses, interventions and outcomes in nursing. However, the lack of a standardized unified nursing language is considered a problem for further development of the discipline of nursing. In an effort to unify the nursing languages, the International Council of Nurses (ICN) has proposed the International Classification for Nursing Practice (ICNP) as a unified nursing language system. The purpose of this study was to evaluate the inclusiveness and expressiveness of the ICNP terms by cross-mapping them with the existing nursing terminologies, specifically the North American Nursing Diagnosis Association (NANDA) taxonomy I, the Omaha System, the Home Health Care Classification (HHCC) and the Nursing Interventions Classification (NIC). Nine hundred and seventy-four terms from these four classifications were cross-mapped with the ICNP terms. This was performed in accordance with the Guidelines for Composing a Nursing Diagnosis and Guidelines for Composing a Nursing Intervention, which were suggested by the ICNP development team. An expert group verified the results. The ICNP Phenomena Classification described 87.5% of the NANDA diagnoses, 89.7% of the HHCC diagnoses and 72.7% of the Omaha System problem classification scheme. The ICNP Action Classification described 79.4% of the NIC interventions, 80.6% of the HHCC interventions and 71.4% of the Omaha System intervention scheme. The results of this study suggest that the ICNP has a sound starting structure for a unified nursing language system and can be used to describe most of the existing terminologies. Recommendations for the addition of terms to the ICNP are provided.

  15. Determining the saliency of feature measurements obtained from images of sedimentary organic matter for use in its classification

    NASA Astrophysics Data System (ADS)

    Weller, Andrew F.; Harris, Anthony J.; Ware, J. Andrew; Jarvis, Paul S.

    2006-11-01

    The classification of sedimentary organic matter (OM) images can be improved by determining the saliency of image analysis (IA) features measured from them. Knowing the saliency of IA feature measurements means that only the most significant discriminating features need be used in the classification process. This is an important consideration for classification techniques such as artificial neural networks (ANNs), where too many features can lead to the 'curse of dimensionality'. The classification scheme adopted in this work is a hybrid of morphologically and texturally descriptive features from previous manual classification schemes. Some of these descriptive features are assigned to IA features, along with several others built into the IA software (Halcon) to ensure that a valid cross-section is available. After an image is captured and segmented, a total of 194 features are measured for each particle. To reduce this number to a more manageable magnitude, the SPSS AnswerTree Exhaustive CHAID (χ² automatic interaction detector) classification tree algorithm is used to establish each measurement's saliency as a classification discriminator. In the case of continuous data as used here, the F-test is used as opposed to the published algorithm. The F-test checks various statistical hypotheses about the variance of groups of IA feature measurements obtained from the particles to be classified. The aim is to reduce the number of features required to perform the classification without reducing its accuracy. In the best-case scenario, 194 inputs are reduced to 8, with a subsequent multi-layer back-propagation ANN recognition rate of 98.65%. This paper demonstrates the ability of the algorithm to reduce noise, help overcome the curse of dimensionality, and facilitate an understanding of the saliency of IA features as discriminators for sedimentary OM classification.
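
    The F-test saliency idea (comparing between-group to within-group variance of a feature across classes) can be sketched with a one-way ANOVA F statistic; the toy features below are synthetic stand-ins for the 194 IA measurements.

```python
import numpy as np

rng = np.random.default_rng(2)

def anova_f(feature, labels):
    """One-way ANOVA F statistic: between-group vs. within-group variance."""
    groups = [feature[labels == g] for g in np.unique(labels)]
    grand = feature.mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_b = len(groups) - 1
    df_w = len(feature) - len(groups)
    return (ss_between / df_b) / (ss_within / df_w)

# Toy stand-ins: one informative feature (class-shifted mean) and one noise
labels = np.repeat([0, 1], 100)
informative = rng.normal(labels.astype(float), 1.0)
noise = rng.normal(0.0, 1.0, 200)
print(anova_f(informative, labels) > anova_f(noise, labels))
```

    Ranking features by F and keeping only the top scorers is the same pruning idea that reduces 194 inputs to 8 before ANN training.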

  16. Looking at Citations: Using Corpora in English for Academic Purposes.

    ERIC Educational Resources Information Center

    Thompson, Paul; Tribble, Chris

    2001-01-01

    Presents a classification scheme and the results of applying this scheme to the coding of academic texts in a corpus. The texts are doctoral theses from agricultural botany and agricultural economics departments. Results lead to a comparison of the citation practices of writers in different disciplines and the different rhetorical practices of…

  17. An unsupervised classification technique for multispectral remote sensing data.

    NASA Technical Reports Server (NTRS)

    Su, M. Y.; Cummings, R. E.

    1973-01-01

    Description of a two-part clustering technique consisting of (a) a sequential statistical clustering, which is essentially a sequential variance analysis, and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum-likelihood classification techniques.
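
    The two-part technique can be sketched as follows; the threshold-based sequential step is a simplified stand-in for the sequential variance analysis, and all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

def sequential_clusters(X, threshold):
    """Part (a), simplified: start a new cluster whenever a point is farther
    than `threshold` from every existing cluster center."""
    centers = [X[0]]
    for x in X[1:]:
        if min(np.linalg.norm(x - c) for c in centers) > threshold:
            centers.append(x)
    return np.array(centers)

def kmeans_refine(X, centers, iters=10):
    """Part (b): refine the initial clusters with generalized K-means."""
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(len(centers))])
    return centers, labels

# Two synthetic "spectral classes"; part (a) seeds them, part (b) refines
X = np.vstack([rng.normal(0, 0.3, (40, 2)), rng.normal(4, 0.3, (40, 2))])
centers, labels = kmeans_refine(X, sequential_clusters(X, threshold=2.0))
print(len(centers))  # two clusters recovered
```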

  18. Unsupervised classification of earth resources data.

    NASA Technical Reports Server (NTRS)

    Su, M. Y.; Jayroe, R. R., Jr.; Cummings, R. E.

    1972-01-01

    A new clustering technique is presented. It consists of two parts: (a) a sequential statistical clustering, which is essentially a sequential variance analysis, and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that of existing supervised maximum likelihood classification techniques.

  19. Classification of cryocoolers

    NASA Technical Reports Server (NTRS)

    Walker, G.

    1985-01-01

    A great diversity of methods and mechanisms were devised to effect cryogenic refrigeration. The basic parameters and considerations affecting the selection of a particular system are reviewed. A classification scheme for mechanical cryocoolers is presented. An important distinguishing feature is the incorporation or not of a regenerative heat exchanger, of valves, and of the method for achieving a pressure variation.

  20. A satellite rainfall retrieval technique over northern Algeria based on the probability of rainfall intensities classification from MSG-SEVIRI

    NASA Astrophysics Data System (ADS)

    Lazri, Mourad; Ameur, Soltane

    2016-09-01

    In this paper, an algorithm based on the probability of rainfall intensities classification for rainfall estimation from Meteosat Second Generation/Spinning Enhanced Visible and Infrared Imager (MSG-SEVIRI) has been developed. The classification scheme uses various spectral parameters of SEVIRI that provide information about cloud top temperature and optical and microphysical cloud properties. The presented method is developed and trained for the north of Algeria. The calibration of the method is carried out using rain classification fields derived from radar as a reference for the rainy season from November 2006 to March 2007. Rainfall rates are assigned to rain areas previously identified and classified according to the precipitation formation processes. Comparisons between satellite-derived precipitation estimates and validation data show that the developed scheme performs reasonably well. Indeed, the correlation coefficient is significant (r = 0.87). The values of POD, POFD and FAR are 80%, 13% and 25%, respectively. Also, for a rainfall estimate of about 614 mm, the RMSD, Bias, MAD and PD are 102.06 mm, 2.18 mm, 68.07 mm and 12.58, respectively.
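
    The categorical scores quoted above (POD, POFD, FAR) come from a 2x2 contingency table of estimated vs. observed rain. A minimal sketch, with counts chosen only to reproduce similar values (not the paper's data):

```python
# Contingency table entries: hits (a), false alarms (b),
# misses (c), correct negatives (d)
def pod(a, b, c, d):
    """Probability of detection: hits over all observed rain events."""
    return a / (a + c)

def far(a, b, c, d):
    """False alarm ratio: false alarms over all rain estimates."""
    return b / (a + b)

def pofd(a, b, c, d):
    """Probability of false detection: false alarms over all dry events."""
    return b / (b + d)

a, b, c, d = 80, 27, 20, 180  # illustrative counts only
print(round(pod(a, b, c, d), 2),
      round(far(a, b, c, d), 2),
      round(pofd(a, b, c, d), 2))
```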

  1. Texture as a basis for acoustic classification of substrate in the nearshore region

    NASA Astrophysics Data System (ADS)

    Dennison, A.; Wattrus, N. J.

    2016-12-01

    Segmentation and classification of substrate type from two locations in Lake Superior are predicted using multivariate statistical processing of textural measures derived from shallow-water, high-resolution multibeam bathymetric data. During a multibeam sonar survey, both bathymetric and backscatter data are collected. It is well documented that the statistical character of a sonar backscatter mosaic depends on substrate type. While classifying the bottom type on the basis of backscatter alone can accurately predict and map bottom type, it lacks the ability to resolve and capture fine textural details, an important factor in many habitat mapping studies. Statistical processing can capture the pertinent details about the bottom type that are rich in textural information. Further multivariate statistical processing can then isolate characteristic features and provide the basis for an accurate classification scheme. Preliminary results from an analysis of bathymetric data and ground-truth samples collected from the Amnicon River, Superior, Wisconsin, and the Lester River, Duluth, Minnesota, demonstrate the ability to process and develop a novel classification scheme of the bottom type in two geomorphologically distinct areas.

  2. Classifying machinery condition using oil samples and binary logistic regression

    NASA Astrophysics Data System (ADS)

    Phillips, J.; Cripps, E.; Lau, John W.; Hodkiewicz, M. R.

    2015-08-01

    The era of big data has resulted in an explosion of condition monitoring information. The result is an increasing motivation to automate the costly and time-consuming human elements involved in the classification of machine health. When working with industry, it is important to build an understanding of, and hence some trust in, the classification scheme for those who use the analysis to initiate maintenance tasks. Typically, "black box" approaches such as artificial neural networks (ANN) and support vector machines (SVM) are difficult to interpret. In contrast, this paper argues that logistic regression offers easy interpretability to industry experts, providing insight into the drivers of the human classification process and into the ramifications of potential misclassification. Of course, accuracy is of foremost importance in any automated classification scheme, so we also provide a comparative study based on the predictive performance of logistic regression, ANN and SVM. A real-world oil analysis data set from engines on mining trucks is presented, and using cross-validation we demonstrate that logistic regression out-performs the ANN and SVM approaches in terms of prediction for healthy/not-healthy engines.
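
    The interpretability argument can be illustrated with a bare-bones logistic regression fit by gradient ascent; the "iron concentration" feature and its coefficient are synthetic, not the paper's oil-analysis data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-in: one wear-metal concentration drives the health class
iron = rng.normal(0.0, 1.0, 300)
p_true = 1 / (1 + np.exp(-(2.0 * iron - 0.5)))
y = rng.binomial(1, p_true)

# Logistic regression by plain gradient ascent on the log-likelihood
X = np.column_stack([np.ones_like(iron), iron])
w = np.zeros(2)
for _ in range(5000):
    p = 1 / (1 + np.exp(-X @ w))
    w += 0.1 * X.T @ (y - p) / len(y)

# Interpretability: exp(coefficient) is the odds ratio per unit increase,
# something an industry expert can sanity-check directly
print("odds ratio per unit iron:", np.exp(w[1]))
```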

  3. Using two classification schemes to develop vegetation indices of biological integrity for wetlands in West Virginia, USA.

    PubMed

    Veselka, Walter; Rentch, James S; Grafton, William N; Kordek, Walter S; Anderson, James T

    2010-11-01

    Bioassessment methods for wetlands, and other bodies of water, have been developed worldwide to measure and quantify changes in "biological integrity." These assessments are based on a classification system, meant to ensure appropriate comparisons between wetland types. Using a local site-specific disturbance gradient, we built vegetation indices of biological integrity (Veg-IBIs) based on two commonly used wetland classification systems in the USA: One based on vegetative structure and the other based on a wetland's position in a landscape and sources of water. The resulting class-specific Veg-IBIs were comprised of 1-5 metrics that varied in their sensitivity to the disturbance gradient (R2=0.14-0.65). Moreover, the sensitivity to the disturbance gradient increased as metrics from each of the two classification schemes were combined (added). Using this information to monitor natural and created wetlands will help natural resource managers track changes in biological integrity of wetlands in response to anthropogenic disturbance and allows the use of vegetative communities to set ecological performance standards for mitigation banks.

  4. Pāhoehoe, `a`ā, and block lava: an illustrated history of the nomenclature

    NASA Astrophysics Data System (ADS)

    Harris, Andrew J. L.; Rowland, Scott K.; Villeneuve, Nicolas; Thordarson, Thor

    2017-01-01

    Lava flows occur worldwide, and throughout history, various cultures (and geologists) have described flows based on their surface textures. As a result, surface morphology-based nomenclature schemes have been proposed in most languages to aid in the classification and distinction of lava surface types. One of the first to be published was likely the nine-class, Italian-language description-based classification proposed by Mario Gemmellaro in 1858. By far, the most commonly used terms to describe lava surfaces today are not descriptive but, instead, are merely words, specifically the Hawaiian words `a`ā (rough brecciated basalt lava) and pāhoehoe (smooth glassy basalt lava), plus block lava (thick brecciated lavas that are typically more silicic than basalt). `A`ā and pāhoehoe were introduced into the Western geological vocabulary by American geologists working in Hawai`i during the 1800s. They and other nineteenth century geologists proposed formal lava-type classification schemes for scientific use, and most of them used the Hawaiian words. In 1933, Ruy Finch added the third lava type, block lava, to the classification scheme, with the tripartite system being formalized in 1953 by Gordon Macdonald. More recently, particularly since the 1980s and based largely on studies of lava flow interiors, a number of sub-types and transitional forms of all three major lava types have been defined. This paper reviews the early history of the development of the pāhoehoe, `a`ā, and block lava-naming system and presents a new descriptive classification so as to break out the three parental lava types into their many morphological sub-types.

  5. Validating the Danish adaptation of the World Health Organization's International Classification for Patient Safety classification of patient safety incident types

    PubMed Central

    Mikkelsen, Kim Lyngby; Thommesen, Jacob; Andersen, Henning Boje

    2013-01-01

    Objectives Validation of a Danish patient safety incident classification adapted from the World Health Organization's International Classification for Patient Safety (ICPS-WHO). Design Thirty-three hospital safety management experts classified 58 safety incident cases selected to represent all types and subtypes of the Danish adaptation of the ICPS (ICPS-DK). Outcome Measures Two measures of inter-rater agreement: kappa and intra-class correlation (ICC). Results The average number of incident types used per case per rater was 2.5. The mean ICC was 0.521 (range: 0.199–0.809) and the mean kappa was 0.513 (range: 0.193–0.804). Kappa and ICC showed high correlation (r = 0.99). An inverse correlation was found between the prevalence of a type and inter-rater reliability. Results are discussed according to four factors known to determine inter-rater agreement: skill and motivation of raters; clarity of case descriptions; clarity of the operational definitions of the types and the instructions guiding the coding process; adequacy of the underlying classification scheme. Conclusions The incident types of the ICPS-DK are adequate, exhaustive and well suited for classifying and structuring incident reports. With a mean kappa a little above 0.5, the inter-rater agreement of the classification system is considered ‘fair’ to ‘good’. The wide variation in inter-rater reliability, and the low reliability and poor discrimination among the highly prevalent incident types, suggest that for these types precisely defined incident sub-types may be preferred. This evaluation of the reliability and usability of WHO's ICPS should be useful for healthcare administrations that consider or are in the process of adapting the ICPS. PMID:23287641
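
    The kappa statistic used above measures agreement between two raters corrected for chance; it can be computed as follows. The two rater sequences are illustrative, not the study's data.

```python
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa: (observed - chance agreement) / (1 - chance agreement)."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.union1d(r1, r2)
    po = (r1 == r2).mean()                                       # observed
    pe = sum((r1 == c).mean() * (r2 == c).mean() for c in cats)  # by chance
    return (po - pe) / (1 - pe)

# Two raters coding 10 incidents into types A/B (illustrative data only)
a = ["A", "A", "B", "B", "A", "B", "A", "A", "B", "A"]
b = ["A", "A", "B", "A", "A", "B", "B", "A", "B", "A"]
print(round(cohens_kappa(a, b), 3))
```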

  6. TFOS DEWS II Definition and Classification Report.

    PubMed

    Craig, Jennifer P; Nichols, Kelly K; Akpek, Esen K; Caffery, Barbara; Dua, Harminder S; Joo, Choun-Ki; Liu, Zuguo; Nelson, J Daniel; Nichols, Jason J; Tsubota, Kazuo; Stapleton, Fiona

    2017-07-01

    The goals of the TFOS DEWS II Definition and Classification Subcommittee were to create an evidence-based definition and a contemporary classification system for dry eye disease (DED). The new definition recognizes the multifactorial nature of dry eye as a disease where loss of homeostasis of the tear film is the central pathophysiological concept. Ocular symptoms, as a broader term that encompasses reports of discomfort or visual disturbance, feature in the definition and the key etiologies of tear film instability, hyperosmolarity, and ocular surface inflammation and damage were determined to be important for inclusion in the definition. In the light of new data, neurosensory abnormalities were also included in the definition for the first time. In the classification of DED, recent evidence supports a scheme based on the pathophysiology where aqueous deficient and evaporative dry eye exist as a continuum, such that elements of each are considered in diagnosis and management. Central to the scheme is a positive diagnosis of DED with signs and symptoms, and this is directed towards management to restore homeostasis. The scheme also allows consideration of various related manifestations, such as non-obvious disease involving ocular surface signs without related symptoms, including neurotrophic conditions where dysfunctional sensation exists, and cases where symptoms exist without demonstrable ocular surface signs, including neuropathic pain. This approach is not intended to override clinical assessment and judgment but should prove helpful in guiding clinical management and research. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Taxonomy of breast cancer based on normal cell phenotype predicts outcome

    PubMed Central

    Santagata, Sandro; Thakkar, Ankita; Ergonul, Ayse; Wang, Bin; Woo, Terri; Hu, Rong; Harrell, J. Chuck; McNamara, George; Schwede, Matthew; Culhane, Aedin C.; Kindelberger, David; Rodig, Scott; Richardson, Andrea; Schnitt, Stuart J.; Tamimi, Rulla M.; Ince, Tan A.

    2014-01-01

    Accurate classification is essential for understanding the pathophysiology of a disease and can inform therapeutic choices. For hematopoietic malignancies, a classification scheme based on the phenotypic similarity between tumor cells and normal cells has been successfully used to define tumor subtypes; however, use of normal cell types as a reference by which to classify solid tumors has not been widely emulated, in part due to more limited understanding of epithelial cell differentiation compared with hematopoiesis. To provide a better definition of the subtypes of epithelial cells comprising the breast epithelium, we performed a systematic analysis of a large set of breast epithelial markers in more than 15,000 normal breast cells, which identified 11 differentiation states for normal luminal cells. We then applied information from this analysis to classify human breast tumors based on normal cell types into 4 major subtypes, HR0–HR3, which were differentiated by vitamin D, androgen, and estrogen hormone receptor (HR) expression. Examination of 3,157 human breast tumors revealed that these HR subtypes were distinct from the current classification scheme, which is based on estrogen receptor, progesterone receptor, and human epidermal growth factor receptor 2. Patient outcomes were best when tumors expressed all 3 hormone receptors (subtype HR3) and worst when they expressed none of the receptors (subtype HR0). Together, these data provide an ontological classification scheme associated with patient survival differences and provide actionable insights for treating breast tumors. PMID:24463450

  8. Multi-stage robust scheme for citrus identification from high resolution airborne images

    NASA Astrophysics Data System (ADS)

    Amorós-López, Julia; Izquierdo Verdiguier, Emma; Gómez-Chova, Luis; Muñoz-Marí, Jordi; Zoilo Rodríguez-Barreiro, Jorge; Camps-Valls, Gustavo; Calpe-Maravilla, Javier

    2008-10-01

    Identification of land cover types is one of the most critical activities in remote sensing. Nowadays, managing land resources by using remote sensing techniques is becoming a common procedure to speed up the process while reducing costs. However, data analysis procedures should satisfy the accuracy figures demanded by institutions and governments for further administrative actions. This paper presents a methodological scheme to update the citrus Geographical Information System (GIS) of the Comunidad Valenciana autonomous region (Spain). The proposed approach introduces a multi-stage automatic scheme to reduce visual photointerpretation and ground validation tasks. First, an object-oriented feature extraction process is carried out for each cadastral parcel from very high spatial resolution (VHR) images (0.5 m) acquired in the visible and near infrared. Next, several automatic classifiers (decision trees, multilayer perceptron, and support vector machines) are trained and combined to improve the final accuracy of the results. The proposed strategy fulfills the high accuracy demanded by policy makers by combining automatic classification methods with available visual photointerpretation resources. A level of confidence based on the agreement between classifiers enables effective management by fixing the number of parcels to be reviewed. The proposed methodology can be applied to similar problems and applications.
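
    A minimal sketch of the agreement-based confidence idea: a parcel is accepted automatically only when all classifiers agree, otherwise it is queued for visual photointerpretation. Function and label names are hypothetical, not the authors' implementation.

```python
def triage(predictions_per_parcel):
    """Split parcels into auto-accepted (unanimous classifiers) and
    review-queued (any disagreement) sets."""
    auto, review = [], []
    for parcel_id, preds in predictions_per_parcel.items():
        (auto if len(set(preds)) == 1 else review).append(parcel_id)
    return auto, review

# Three classifier votes per parcel (illustrative)
preds = {"p1": ["citrus", "citrus", "citrus"],
         "p2": ["citrus", "non-citrus", "citrus"],
         "p3": ["non-citrus"] * 3}
auto, review = triage(preds)
print(auto, review)  # ['p1', 'p3'] ['p2']
```

    Tightening or loosening the agreement rule is what lets the workload (number of parcels sent to review) be fixed in advance.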

  9. A Classification Scheme for Glaciological AVA Responses

    NASA Astrophysics Data System (ADS)

    Booth, A.; Emir, E.

    2014-12-01

    A classification scheme is proposed for amplitude vs. angle (AVA) responses as an aid to the interpretation of seismic reflectivity in glaciological research campaigns. AVA responses are a powerful tool in characterising the material properties of glacier ice and its substrate. However, before interpreting AVA data, careful true-amplitude processing is required to constrain basal reflectivity and compensate amplitude decay mechanisms, including anelastic attenuation and spherical divergence. These fundamental processing steps can be difficult to design in cases of noisy data, e.g. where a target reflection is contaminated by surface wave energy (in the case of shallow glaciers) or by energy reflected from out of the survey plane. AVA methods are equally powerful for estimating the fluid fill of potential hydrocarbon reservoirs. However, such applications seldom use true-amplitude data and instead consider qualitative AVA responses using a well-defined classification scheme. Such schemes are often defined in terms of the characteristics of best-fit responses to the observed reflectivity, e.g. the intercept (I) and gradient (G) of a linear approximation to the AVA data. The position of the response on a cross-plot of I and G then offers a diagnostic attribute for certain fluid types. We investigate the advantages in glaciology of emulating this practice, and develop a cross-plot based on the 3-term Shuey AVA approximation (using I, G, and a curvature term C). Model AVA curves define a clear lithification trend: AVA responses to stiff (lithified) substrates fall discretely into one quadrant of the cross-plot, with positive I and negative G, whereas those to fluid-rich substrates plot diagonally opposite (in the negative I and positive G quadrant). 
The remaining quadrants are unoccupied by plausible single-layer responses and may therefore be diagnostic of complex thin-layer reflectivity, and the magnitude and polarity of the C term serves as a further indicator of fluid content. The use of the AVA cross-plot is explored for seismic data from European Arctic glaciers, including Storglaciären and Midtre Lovénbreen, with additional examples from other published sources. The classification scheme should provide a useful reference for the initial assessment of a glaciological AVA response.
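The cross-plot logic lends itself to a short sketch. The quadrant labels follow the abstract, while the function names and example values are illustrative:

```python
import math

def shuey_reflectivity(intercept, gradient, curvature, theta_deg):
    """3-term Shuey approximation: R(theta) = I + G*sin^2(theta) + C*(tan^2(theta) - sin^2(theta))."""
    t = math.radians(theta_deg)
    s2 = math.sin(t) ** 2
    return intercept + gradient * s2 + curvature * (math.tan(t) ** 2 - s2)

def classify_quadrant(intercept, gradient):
    """Map an (I, G) pair to the cross-plot quadrants described in the abstract."""
    if intercept > 0 and gradient < 0:
        return "stiff (lithified) substrate"
    if intercept < 0 and gradient > 0:
        return "fluid-rich substrate"
    return "possible thin-layer / complex reflectivity"
```

In practice (I, G, C) would be obtained by least-squares fitting the Shuey form to picked reflection amplitudes over a range of incidence angles.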

  10. Non-invasive classification of gas-liquid two-phase horizontal flow regimes using an ultrasonic Doppler sensor and a neural network

    NASA Astrophysics Data System (ADS)

    Musa Abbagoni, Baba; Yeung, Hoi

    2016-08-01

    The identification of flow pattern is a key issue in multiphase flow, which is encountered in the petrochemical industry, yet gas-liquid flow regimes are difficult to identify objectively. This paper presents the feasibility of a clamp-on instrument for objective flow regime classification of two-phase flow using an ultrasonic Doppler sensor and an artificial neural network, which records and processes the ultrasonic signals reflected from the two-phase flow. Experimental data are obtained on a horizontal test rig with a total pipe length of 21 m and 5.08 cm internal diameter carrying air-water two-phase flow under slug, elongated bubble, stratified-wavy and stratified flow regimes. Multilayer perceptron neural networks (MLPNNs) are used to develop the classification model. The classifier requires input features that are representative of the signals; ultrasound signal features are extracted by applying both power spectral density (PSD) and discrete wavelet transform (DWT) methods to the flow signals. A '1-of-C' coding scheme was adopted to classify the extracted features into one of four flow regime categories. To improve the performance of the flow regime classifier, a second-level neural network was incorporated, using the outputs of the first-level networks as input features. Combining the two models yielded a neural network that achieved higher accuracy than either single model. Classification accuracies are evaluated for both the PSD and the DWT features. The success rates of the two models are: (1) with PSD features, the classifier misclassified 3 of 24 test datasets (87.5% accuracy); (2) with DWT features, the network misclassified only one data point, classifying the flow patterns with 95.8% accuracy.
These results demonstrate that a clamp-on ultrasonic sensor can support flow regime classification in industrial practice. The approach is considerably more promising than competing techniques because the sensor is non-invasive and non-radioactive.
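The '1-of-C' coding and its decoding can be sketched as follows; the regime names come from the abstract, while the function names are illustrative:

```python
# The four flow regime categories listed in the abstract.
FLOW_REGIMES = ["slug", "elongated bubble", "stratified-wavy", "stratified"]

def one_of_c_encode(label):
    """1-of-C target vector: one output node per flow regime category."""
    return [1.0 if r == label else 0.0 for r in FLOW_REGIMES]

def decode_regime(outputs):
    """Assign the regime whose network output node is largest."""
    best = max(range(len(outputs)), key=lambda i: outputs[i])
    return FLOW_REGIMES[best]

def success_rate(n_correct, n_total):
    """Classification success rate in percent."""
    return 100.0 * n_correct / n_total
```

With DWT features the network misclassified 1 of 24 test datasets, i.e. success_rate(23, 24) ≈ 95.8%, matching the figure reported above.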

  11. Analysis of terrestrial and Martian volcanic compositions using thermal emission spectroscopy

    NASA Astrophysics Data System (ADS)

    Wyatt, Michael Bruce

    2002-11-01

    This dissertation comprises four separate parts that address the Mars Global Surveyor (MGS) Thermal Emission Spectrometer (TES) investigation objective of determining and mapping the composition and distribution of surface minerals and rocks on Mars from orbit. In Part 1, laboratory thermal infrared spectra (5-25 μm, at 2 cm-1 spectral sampling), deconvolved modal mineralogies, and derived mineral and bulk rock chemistries of basalt, basaltic andesite, andesite, and dacite were used to evaluate and revise volcanic rock classification schemes. Multiple steps of classification were required to distinguish volcanic rocks, reflecting the mineralogic diversity and continuum of compositions that exists in volcanic rock types. In Part 2, laboratory spectral data were convolved to TES 10 cm-1 sampling to ascertain whether adequate results for volcanic rock classification can be obtained with lower spectral resolution, comparable to that obtained from Mars orbit. Modeled spectra, modeled modal mineralogies, and derived bulk rock chemistries at low (10 cm-1) spectral sampling provide good matches to measured and high (2 cm-1) spectral sampling modeled values. These results demonstrate the feasibility of using similar techniques and classification schemes for the interpretation of terrestrial laboratory samples and TES-resolution data. In Part 3, new deconvolved mineral abundances from TES data and terrestrial basalts using a spectral end-member set representing minerals common in unaltered and low-temperature aqueously altered basalts were used to reclassify Martian surface lithologies. The new formulations maintain the dominance of unaltered basalt in the southern highlands, but indicate the northern lowlands can be interpreted as weathered basalt.
The coincidence between locations of altered basalt and a previously suggested northern ocean basin implies that lowland plains materials may be basalts altered under submarine conditions and/or weathered basaltic sediment transported into this depocenter. In Part 4, results from the previous parts are applied to examine the distribution of TES-derived surface compositions in the Oxia Palus region on Mars through high-spatial resolution mapping. Features of interest within Oxia Palus include volcanic/sedimentary materials in southern Acidalia Planitia, low-albedo crater floors and wind streaks in western Arabia Terra, and channel outflow deposits of the Mars Pathfinder (MP) landing site.

  12. VizieR Online Data Catalog: Catalogue of Stellar Spectral Classifications (Skiff, 2003)

    NASA Astrophysics Data System (ADS)

    Skiff, B. A.

    2003-07-01

    This file contains spectral classifications for stars collected from the literature, serving as a continuation of the compilations produced by the Jascheks, by Kennedy, and by Buscombe. The source of each spectral type is indicated by a standard 19-digit bibcode citation. The stars are identified either by the name used in each publication or by a valid SIMBAD identifier. Some effort has been made to determine accurate (~1" or better) coordinates for equinox J2000, and these serve as a secondary identifier. Magnitudes are provided as an indication of brightness, but these data are not necessarily accurate, as they often derive from photographic photometry or rough estimates. The classifications include MK types as well as types not strictly on the MK system (white dwarfs, Wolf-Rayet, etc), and in addition simple HD-style temperature types. Luminosity classes in the early Mount Wilson style (e.g. 'd' for dwarf, 'g' for giant) and other similar schemes have been converted to modern notation. Since a citation is provided for each entry, the source paper should be consulted for details about classification schemes, spectral dispersion, and instrumentation used. The file includes only spectral types determined from spectra (viz. line and band strengths or ratios), omitting those determined from photometry (e.g. DDO, Vilnius) or inferred from broadband colors or bulk spectral energy distributions. The catalogue includes for the first time results from many large-scale objective-prism spectral surveys done at Case, Stockholm, Crimea, Abastumani, and elsewhere. The stars in these surveys were usually identified only on charts or by other indirect means, and have been overlooked heretofore because of the difficulty in recovering the stars. More complete results from these separate publications, including notes and identifications, have been made available to the CDS, and are kept at the Lowell Observatory ftp area (ftp://ftp.lowell.edu/pub/bas/starcats). 
Not all of these stars are present in SIMBAD. As a 'living catalogue', an attempt will be made to keep up with current literature, and to extend the indexing of citations back in time. (1 data file).

  13. Visual classification of feral cat Felis silvestris catus vocalizations

    PubMed Central

    Owens, Jessica L.; Olsen, Mariana; Fontaine, Amy; Kloth, Christopher; Kershenbaum, Arik

    2017-01-01

    Abstract Cat vocal behavior, in particular, the vocal and social behavior of feral cats, is poorly understood, as are the differences between feral and fully domestic cats. The relationship between feral cat social and vocal behavior is important because of the markedly different ecology of feral and domestic cats, and enhanced comprehension of the repertoire and potential information content of feral cat calls can provide both better understanding of the domestication and socialization process, and improved welfare for feral cats undergoing adoption. Previous studies have used conflicting classification schemes for cat vocalizations, often relying on onomatopoeic or popular descriptions of call types (e.g., “miow”). We studied the vocalizations of 13 unaltered domestic cats that met our behavioral definition distinguishing feral cats from domestic ones. A total of 71 acoustic units were extracted and visually analyzed for the construction of a hierarchical classification of vocal sounds, based on acoustic properties. We identified 3 major categories (tonal, pulse, and broadband) that further break down into 8 subcategories, and show a high degree of reliability when sounds are classified blindly by independent observers (Fleiss’ Kappa K = 0.863). Due to the limited behavioral contexts in this study, additional subcategories of cat vocalizations may be identified in the future, but our hierarchical classification system allows for the addition of new categories and new subcategories as they are described. This study shows that cat vocalizations are diverse and complex, and provides an objective and reliable classification system that can be used in future studies. PMID:29491992
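Fleiss' kappa, used above to quantify agreement among independent observers, can be computed from a table of per-item category counts. A minimal sketch (the example tables below are hypothetical, not the study's data):

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for a table whose rows are items and columns are
    category counts per item; every row must sum to the same number of raters."""
    n_items = len(ratings)
    n_raters = sum(ratings[0])
    n_cats = len(ratings[0])
    # overall proportion of all ratings falling in each category
    p_j = [sum(row[j] for row in ratings) / (n_items * n_raters) for j in range(n_cats)]
    # observed agreement for each item
    per_item = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
                for row in ratings]
    p_bar = sum(per_item) / n_items          # mean observed agreement
    p_exp = sum(p * p for p in p_j)          # chance agreement
    return (p_bar - p_exp) / (1 - p_exp)
```

A value of 1 indicates perfect agreement and 0 indicates chance-level agreement; the study's K = 0.863 therefore reflects high reliability.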

  14. The EpiOcular™ Eye Irritation Test is the Method of Choice for the In Vitro Eye Irritation Testing of Agrochemical Formulations: Correlation Analysis of EpiOcular Eye Irritation Test and BCOP Test Data According to the UN GHS, US EPA and Brazil ANVISA Classification Schemes.

    PubMed

    Kolle, Susanne N; Rey Moreno, Maria Cecilia; Mayer, Winfried; van Cott, Andrew; van Ravenzwaay, Bennard; Landsiedel, Robert

    2015-07-01

    The Bovine Corneal Opacity and Permeability (BCOP) test is commonly used for the identification of severe ocular irritants (GHS Category 1), but it is not recommended for the identification of ocular irritants (GHS Category 2). The incorporation of human reconstructed tissue model-based tests into a tiered test strategy to identify ocular non-irritants and replace the Draize rabbit eye irritation test has been suggested (OECD TG 405). The value of the EpiOcular™ Eye Irritation Test (EIT) for the prediction of ocular non-irritants (GHS No Category) has been demonstrated, and an OECD Test Guideline (TG) was drafted in 2014. The purpose of this study was to evaluate whether the BCOP test, in conjunction with corneal histopathology (as suggested for the evaluation of the depth of the injury) and/or the EpiOcular-EIT, could be used to predict the eye irritation potential of agrochemical formulations according to the UN GHS, US EPA and Brazil ANVISA classification schemes. We have assessed opacity, permeability and histopathology in the BCOP assay, and relative tissue viability in the EpiOcular-EIT, for 97 agrochemical formulations with available in vivo eye irritation data. By using the OECD TG 437 protocol for liquids, the BCOP test did not result in sufficient correct predictions of severe ocular irritants for any of the three classification schemes. The lack of sensitivity could be improved somewhat by the inclusion of corneal histopathology, but the relative viability in the EpiOcular-EIT clearly outperformed the BCOP test for all three classification schemes. The predictive capacity of the EpiOcular-EIT for ocular non-irritants (UN GHS No Category) for the 97 agrochemical formulations tested (91% sensitivity, 72% specificity and 82% accuracy for UN GHS classification) was comparable to that obtained in the formal validation exercise underlying the OECD draft TG.
We therefore conclude that the EpiOcular-EIT is currently the best in vitro method for the prediction of the eye irritation potential of liquid agrochemical formulations. 2015 FRAME.
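The sensitivity, specificity and accuracy figures quoted above derive from a 2x2 confusion table; a minimal sketch with hypothetical counts:

```python
def binary_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and accuracy from confusion-table counts
    (tp = true positives, fn = false negatives, tn = true negatives, fp = false positives)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy
```

For an in vitro screen intended to identify non-irritants, high sensitivity matters most, since a false negative would let an irritant pass untested.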

  15. Automated identification of sleep states from EEG signals by means of ensemble empirical mode decomposition and random under sampling boosting.

    PubMed

    Hassan, Ahnaf Rashik; Bhuiyan, Mohammed Imamul Hassan

    2017-03-01

    Automatic sleep staging is essential for alleviating the burden on physicians of analyzing a large volume of data by visual inspection. It is also a precondition for making an automated sleep monitoring system feasible. Further, computerized sleep scoring will expedite large-scale data analysis in sleep research. Nevertheless, most of the existing works on sleep staging are based on multichannel or multiple physiological signals, which are uncomfortable for the user and hinder the feasibility of an in-home sleep monitoring device. So, a successful and reliable computer-assisted sleep staging scheme is yet to emerge. In this work, we propose a single-channel EEG-based algorithm for computerized sleep scoring. In the proposed algorithm, we decompose EEG signal segments using Ensemble Empirical Mode Decomposition (EEMD) and extract various statistical moment based features. The effectiveness of EEMD and statistical features are investigated. Statistical analysis is performed for feature selection. A newly proposed classification technique, namely random undersampling boosting (RUSBoost), is introduced for sleep stage classification. This is the first implementation of EEMD in conjunction with RUSBoost to the best of the authors' knowledge. The proposed feature extraction scheme's performance is investigated for various choices of classification models. The algorithmic performance of our scheme is evaluated against contemporary works in the literature. The performance of the proposed method is comparable to or better than that of the state-of-the-art ones. The proposed algorithm achieves 88.07%, 83.49%, 92.66%, 94.23%, and 98.15% accuracy for 6-state to 2-state classification of sleep stages on the Sleep-EDF database. Our experimental outcomes reveal that RUSBoost outperforms other classification models for the feature extraction framework presented in this work. Besides, the algorithm proposed in this work demonstrates high detection accuracy for the sleep states S1 and REM.
Statistical moment based features in the EEMD domain distinguish the sleep states successfully and efficaciously. The automated sleep scoring scheme proposed herein can ease the burden on clinicians, contribute to the device implementation of a sleep monitoring system, and benefit sleep research. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
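The statistical-moment feature extraction can be sketched as below. The EEMD decomposition itself is elided; the function would be applied to each intrinsic mode function (here represented simply as a list of samples), and the function name is illustrative:

```python
def statistical_moments(x):
    """Mean, variance, skewness and kurtosis of one signal segment
    (e.g. an EEMD intrinsic mode function of a single-channel EEG epoch)."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    sd = var ** 0.5
    # guard against constant segments, where skewness/kurtosis are undefined
    skew = sum((v - mean) ** 3 for v in x) / (n * sd ** 3) if sd else 0.0
    kurt = sum((v - mean) ** 4 for v in x) / (n * sd ** 4) if sd else 0.0
    return [mean, var, skew, kurt]
```

Concatenating these four moments across all intrinsic mode functions of an epoch yields the feature vector passed to the RUSBoost classifier.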

  16. Creating a Canonical Scientific and Technical Information Classification System for NCSTRL+

    NASA Technical Reports Server (NTRS)

    Tiffany, Melissa E.; Nelson, Michael L.

    1998-01-01

    The purpose of this paper is to describe the new subject classification system for the NCSTRL+ project. NCSTRL+ is a canonical digital library (DL) based on the Networked Computer Science Technical Report Library (NCSTRL). The current NCSTRL+ classification system uses the NASA Scientific and Technical Information (STI) subject classifications, which have a bias towards the aerospace, aeronautics, and engineering disciplines. Examination of other scientific and technical information classification systems showed similar discipline-centric weaknesses. Traditional, library-oriented classification systems represented all disciplines, but were too generalized to serve the needs of a scientific and technically oriented digital library. Lack of a suitable existing classification system led to the creation of a lightweight, balanced, general classification system that allows the mapping of more specialized classification schemes into the new framework. We have therefore developed a classification system that gives equal weight to all STI disciplines while remaining compact and lightweight.

  17. Sampling for area estimation: A comparison of full-frame sampling with the sample segment approach. [Kansas

    NASA Technical Reports Server (NTRS)

    Hixson, M. M.; Bauer, M. E.; Davis, B. J.

    1979-01-01

    The effect of sampling on the accuracy (precision and bias) of crop area estimates made from classifications of LANDSAT MSS data was investigated. Full-frame classifications of wheat and non-wheat for eighty counties in Kansas were repetitively sampled to simulate alternative sampling plans. Four sampling schemes involving different numbers of samples and different size sampling units were evaluated. The precision of the wheat area estimates increased as the segment size decreased and the number of segments was increased. Although the average bias associated with the various sampling schemes was not significantly different, the maximum absolute bias was directly related to sampling unit size.
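The repeated-sampling simulation can be sketched with a one-dimensional toy stand-in for a full-frame classification; the segment geometry, labels and function names are illustrative:

```python
import random

def segment_estimates(pixels, n_segments, seg_size, rng):
    """Wheat-fraction estimates from random contiguous segments of a classified strip.

    `pixels` is a 0/1 wheat map: a one-dimensional toy stand-in for drawing
    sample segments from a full-frame LANDSAT classification.
    """
    estimates = []
    for _ in range(n_segments):
        start = rng.randrange(len(pixels) - seg_size + 1)
        seg = pixels[start:start + seg_size]
        estimates.append(sum(seg) / seg_size)
    return estimates

def area_estimate(estimates):
    """Mean of the per-segment estimates."""
    return sum(estimates) / len(estimates)
```

Repeating this for several (n_segments, seg_size) combinations and comparing the spread of the estimates against the full-frame truth mimics the precision/bias comparison described above.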

  18. The Evolution of Complex Microsurgical Midface Reconstruction: A Classification Scheme and Reconstructive Algorithm.

    PubMed

    Alam, Daniel; Ali, Yaseen; Klem, Christopher; Coventry, Daniel

    2016-11-01

    Orbito-malar reconstruction after oncological resection represents one of the most challenging facial reconstructive procedures. Until the last few decades, rehabilitation was typically prosthesis based with a limited role for surgery. The advent of microsurgical techniques allowed large-volume tissue reconstitution from a distant donor site, revolutionizing the potential approaches to these defects. The authors report a novel surgery-based algorithm and a classification scheme for complete midface reconstruction with a foundation in the Gillies principles of like-to-like reconstruction and with a significant role of computer-aided virtual planning. With this approach, the authors have been able to achieve significantly better patient outcomes. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Introduction to the Apollo collections: Part 2: Lunar breccias

    NASA Technical Reports Server (NTRS)

    Mcgee, P. E.; Simonds, C. H.; Warner, J. L.; Phinney, W. C.

    1979-01-01

    Basic petrographic, chemical and age data for a representative suite of lunar breccias are presented for students and potential lunar sample investigators. Emphasis is on sample description and data presentation. Samples are listed, together with a classification scheme based on matrix texture and mineralogy and the nature and abundance of glass present both in the matrix and as clasts. A summary of the classification scheme describes the characteristic features of each of the breccia groups. The cratering process, comprising the sequence of events immediately following an impact, is discussed, especially the thermal and material transport processes affecting the two major components of lunar breccias (clastic debris and fused material).

  20. Robust Transmission of H.264/AVC Streams Using Adaptive Group Slicing and Unequal Error Protection

    NASA Astrophysics Data System (ADS)

    Thomos, Nikolaos; Argyropoulos, Savvas; Boulgouris, Nikolaos V.; Strintzis, Michael G.

    2006-12-01

    We present a novel scheme for the transmission of H.264/AVC video streams over lossy packet networks. The proposed scheme exploits the error-resilient features of H.264/AVC codec and employs Reed-Solomon codes to protect effectively the streams. A novel technique for adaptive classification of macroblocks into three slice groups is also proposed. The optimal classification of macroblocks and the optimal channel rate allocation are achieved by iterating two interdependent steps. Dynamic programming techniques are used for the channel rate allocation process in order to reduce complexity. Simulations clearly demonstrate the superiority of the proposed method over other recent algorithms for transmission of H.264/AVC streams.
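The unequal error protection idea, assigning more Reed-Solomon parity to the more important slice groups, can be illustrated with a greedy proportional allocation; this is a simple stand-in for the paper's iterative dynamic-programming optimisation, and the function name is hypothetical:

```python
def allocate_parity(importance, total_parity):
    """Split a budget of Reed-Solomon parity packets across slice groups
    proportionally to their importance weights (greedy stand-in for the
    dynamic-programming channel rate allocation)."""
    total_imp = sum(importance)
    alloc = [int(total_parity * w / total_imp) for w in importance]
    # hand out any remainder to the most important groups first
    rest = total_parity - sum(alloc)
    for i in sorted(range(len(importance)), key=lambda i: -importance[i])[:rest]:
        alloc[i] += 1
    return alloc
```

In the actual scheme, macroblock classification and rate allocation are iterated until the expected end-to-end distortion stops decreasing.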

  1. The Why, What, and Impact of GPA at Oxford Brookes University

    ERIC Educational Resources Information Center

    Andrews, Matthew

    2016-01-01

    This paper examines the introduction at Oxford Brookes University of a Grade Point Average (GPA) scheme alongside the traditional honours degree classification. It considers the reasons for the introduction of GPA, the way in which the scheme was implemented, and offers an insight into the impact of GPA at Brookes. Finally, the paper considers…

  2. Regional assessment of lake ecological states using Landsat: A classification scheme for alkaline-saline, flamingo lakes in the East African Rift Valley

    NASA Astrophysics Data System (ADS)

    Tebbs, E. J.; Remedios, J. J.; Avery, S. T.; Rowland, C. S.; Harper, D. M.

    2015-08-01

    In situ reflectance measurements and Landsat satellite imagery were combined to develop an optical classification scheme for alkaline-saline lakes in the Eastern Rift Valley. The classification allows the ecological state and consequent value, in this case to Lesser Flamingos, to be determined using Landsat satellite imagery. Lesser Flamingos depend on a network of 15 alkaline-saline lakes in the East African Rift Valley, where they feed by filtering cyanobacteria and benthic diatoms from the lakes' waters. The classification developed here was based on a decision tree which used the reflectance in Landsat ETM+ bands 2-4 to assign one of six classes: low phytoplankton biomass; suspended sediment-dominated; microphytobenthos; high cyanobacterial biomass; cyanobacterial scum and bleached cyanobacterial scum. The classification accuracy was 77% when verified against in situ measurements. Classified imagery and time series were produced for selected lakes, which show the different ecological behaviours of these complex systems. The results have highlighted the importance to flamingos of the food resources offered by the extremely remote Lake Logipi. This study has demonstrated the potential of high spatial resolution, low spectral resolution sensors for providing ecologically valuable information at a regional scale, for alkaline-saline lakes and similar hypereutrophic inland waters.
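A decision tree over Landsat ETM+ bands 2-4 of the kind described might look as follows; the thresholds and branch order are illustrative placeholders, not the published rules:

```python
def classify_lake_pixel(b2, b3, b4):
    """Toy decision tree over Landsat ETM+ reflectances (b2 = green, b3 = red, b4 = NIR)
    returning one of the six classes named in the abstract. Thresholds are invented
    for illustration only."""
    if b4 > 0.25:                      # strong NIR: floating surface scum
        return "cyanobacterial scum" if b3 < b2 else "bleached cyanobacterial scum"
    if b2 > 0.15 and b3 > 0.15:        # bright in green and red: turbid water
        return "suspended sediment-dominated"
    if b4 > 0.10:
        return "high cyanobacterial biomass"
    if b3 > 0.08:
        return "microphytobenthos"
    return "low phytoplankton biomass"
```

Applying such a per-pixel rule across an image stack directly yields the classified imagery and time series described above.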

  3. Computer-aided Classification of Mammographic Masses Using Visually Sensitive Image Features

    PubMed Central

    Wang, Yunzhi; Aghaei, Faranak; Zarafshani, Ali; Qiu, Yuchen; Qian, Wei; Zheng, Bin

    2017-01-01

    Purpose To develop a new computer-aided diagnosis (CAD) scheme that computes visually sensitive image features routinely used by radiologists to develop a machine learning classifier and distinguish between the malignant and benign breast masses detected from digital mammograms. Methods An image dataset including 301 breast masses was retrospectively selected. From each segmented mass region, we computed image features that mimic five categories of visually sensitive features routinely used by radiologists in reading mammograms. We then selected five optimal features in the five feature categories and applied logistic regression models for classification. A new CAD interface was also designed to show lesion segmentation, computed feature values and classification score. Results Areas under ROC curves (AUC) were 0.786±0.026 and 0.758±0.027 when classifying mass regions depicted on the two view images, respectively. By fusing classification scores computed from the two regions, AUC increased to 0.806±0.025. Conclusion This study demonstrated a new approach to developing a CAD scheme based on five visually sensitive image features. Combined with a “visual aid” interface, CAD results may be much more easily explainable to observers and may increase their confidence in CAD-generated classification results, compared with conventional CAD approaches that involve many complicated and visually insensitive texture features. PMID:27911353
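The per-view classification and two-view score fusion can be sketched as follows; the weights, bias and the simple averaging rule are illustrative assumptions, not the fitted model:

```python
import math

def logistic_score(features, weights, bias):
    """Malignancy likelihood from a logistic regression over the five selected
    features (weights and bias here are placeholders, not the trained values)."""
    z = bias + sum(w * f for w, f in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

def fuse_views(score_view1, score_view2):
    """Fuse the classification scores computed from the two view images."""
    return 0.5 * (score_view1 + score_view2)
```

Averaging the two per-view scores is one plausible fusion rule consistent with the reported AUC improvement from single-view to fused scores.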

  4. Wittgenstein's philosophy and a dimensional approach to the classification of mental disorders -- a preliminary scheme.

    PubMed

    Mackinejad, Kioumars; Sharifi, Vandad

    2006-01-01

    In this paper the importance of Wittgenstein's philosophical ideas for the justification of a dimensional approach to the classification of mental disorders is discussed. Some of his basic concepts in his Philosophical Investigations, such as 'family resemblances', 'grammar' and 'language-game' and their relations to the concept of mental disorder are explored.

  5. Classification Scheme for Items in CAAT.

    ERIC Educational Resources Information Center

    Epstein, Marion G.

    In planning the development of the system for computer assisted assembly of tests, it was agreed at the outset that one of the basic requirements for the successful initiation of any such system would be the development of a detailed item content classification system. The design of the system for classifying item content is a key element in…

  6. Mutual information-based analysis of JPEG2000 contexts.

    PubMed

    Liu, Zhen; Karam, Lina J

    2005-04-01

    Context-based arithmetic coding has been widely adopted in image and video compression and is a key component of the new JPEG2000 image compression standard. In this paper, the contexts used in JPEG2000 are analyzed using the mutual information, which is closely related to the compression performance. We first show that, when combining the contexts, the mutual information between the contexts and the encoded data will decrease unless the conditional probability distributions of the combined contexts are the same. Given I, the initial number of contexts, and F, the final desired number of contexts, there are S(I, F) possible context classification schemes where S(I, F) is called the Stirling number of the second kind. The optimal classification scheme is the one that gives the maximum mutual information. Instead of using an exhaustive search, the optimal classification scheme can be obtained through a modified generalized Lloyd algorithm with the relative entropy as the distortion metric. For binary arithmetic coding, the search complexity can be reduced by using dynamic programming. Our experimental results show that the JPEG2000 contexts capture the correlations among the wavelet coefficients very well. At the same time, the number of contexts used as part of the standard can be reduced without loss in the coding performance.
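The two quantities central to this analysis, the mutual information between contexts and encoded data, and the Stirling number S(I, F) counting the candidate context classifications, can both be computed directly. A minimal sketch (function names and example tables are illustrative):

```python
import math

def mutual_information(joint):
    """I(C; X) from a joint count table: rows = contexts, columns = symbol values."""
    total = sum(sum(row) for row in joint)
    n_cols = len(joint[0])
    p_x = [sum(row[j] for row in joint) / total for j in range(n_cols)]
    mi = 0.0
    for row in joint:
        p_c = sum(row) / total
        for j, n in enumerate(row):
            if n:
                p_cx = n / total
                mi += p_cx * math.log2(p_cx / (p_c * p_x[j]))
    return mi

def stirling2(n, k):
    """Stirling number of the second kind: ways to partition n contexts into k classes."""
    if k == 0 or k > n:
        return 1 if n == k == 0 else 0
    if k == n or k == 1:
        return 1
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)
```

Merging two contexts whose rows are proportional (identical conditional distributions) leaves the mutual information unchanged; merging contexts with different conditionals reduces it, which is why the optimal classification maximises I over the S(I, F) candidates.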

  7. Parameter diagnostics of phases and phase transition learning by neural networks

    NASA Astrophysics Data System (ADS)

    Suchsland, Philippe; Wessel, Stefan

    2018-05-01

    We present an analysis of neural network-based machine learning schemes for phases and phase transitions in theoretical condensed matter research, focusing on neural networks with a single hidden layer. Such shallow neural networks were previously found to be efficient in classifying phases and locating phase transitions of various basic model systems. In order to rationalize the emergence of the classification process and for identifying any underlying physical quantities, it is feasible to examine the weight matrices and the convolutional filter kernels that result from the learning process of such shallow networks. Furthermore, we demonstrate how the learning-by-confusing scheme can be used, in combination with a simple threshold-value classification method, to diagnose the learning parameters of neural networks. In particular, we study the classification process of both fully-connected and convolutional neural networks for the two-dimensional Ising model with extended domain wall configurations included in the low-temperature regime. Moreover, we consider the two-dimensional XY model and contrast the performance of the learning-by-confusing scheme and convolutional neural networks trained on bare spin configurations to the case of preprocessed samples with respect to vortex configurations. We discuss these findings in relation to similar recent investigations and possible further applications.

  8. Psychological Features and Their Relationship to Movement-Based Subgroups in People Living With Low Back Pain.

    PubMed

    Karayannis, Nicholas V; Jull, Gwendolen A; Nicholas, Michael K; Hodges, Paul W

    2018-01-01

    To determine the distribution of higher psychological risk features within movement-based subgroups for people with low back pain (LBP). Cross-sectional observational study. Participants were recruited from physiotherapy clinics and community advertisements. Measures were collected at a university outpatient-based physiotherapy clinic. People (N=102) seeking treatment for LBP. Participants were subgrouped according to 3 classification schemes: Mechanical Diagnosis and Treatment (MDT), Treatment-Based Classification (TBC), and O'Sullivan Classification (OSC). Questionnaires were used to categorize low-, medium-, and high-risk features based on depression, anxiety, and stress (Depression, Anxiety, and Stress Scale-21 Items); fear avoidance (Fear-Avoidance Beliefs Questionnaire); catastrophizing and coping (Pain-Related Self-Symptoms Scale); and self-efficacy (Pain Self-Efficacy Questionnaire). Psychological risk profiles were compared between movement-based subgroups within each scheme. Scores across all questionnaires revealed that most patients had low psychological risk profiles, but there were instances of higher (range, 1%-25%) risk profiles within questionnaire components. The small proportion of individuals with higher psychological risk scores were distributed across subgroups in the TBC, MDT, and OSC schemes. Movement-based subgrouping alone therefore cannot identify individuals with higher psychological risk features. Copyright © 2017 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  9. 14 CFR Section 6 - Objective Classification of Balance Sheet Elements

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Objective Classification of Balance Sheet... AIR CARRIERS Balance Sheet Classifications Section 6 Objective Classification of Balance Sheet...) Record here all general and working funds available on demand as of the date of the balance sheet which...

  10. 14 CFR Section 6 - Objective Classification of Balance Sheet Elements

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Objective Classification of Balance Sheet... AIR CARRIERS Balance Sheet Classifications Section 6 Objective Classification of Balance Sheet...) Record here all general and working funds available on demand as of the date of the balance sheet which...

  11. Relationship of plasma N-terminal pro-brain natriuretic peptide concentrations to heart failure classification and cause of respiratory distress in dogs using a 2nd generation ELISA assay.

    PubMed

    Fox, P R; Oyama, M A; Hezzell, M J; Rush, J E; Nguyenba, T P; DeFrancesco, T C; Lehmkuhl, L B; Kellihan, H B; Bulmer, B; Gordon, S G; Cunningham, S M; MacGregor, J; Stepien, R L; Lefbom, B; Adin, D; Lamb, K

    2015-01-01

    Cardiac biomarkers provide objective data that augments clinical assessment of heart disease (HD). Determine the utility of plasma N-terminal pro-brain natriuretic peptide concentration [NT-proBNP] measured by a 2nd generation canine ELISA assay to discriminate cardiac from noncardiac respiratory distress and evaluate HD severity. Client-owned dogs (n = 291). Multicenter, cross-sectional, prospective investigation. Medical history, physical examination, echocardiography, and thoracic radiography classified 113 asymptomatic dogs (group 1, n = 39 without HD; group 2, n = 74 with HD), and 178 with respiratory distress (group 3, n = 104 respiratory disease, either with or without concurrent HD; group 4, n = 74 with congestive heart failure [CHF]). HD severity was graded using International Small Animal Cardiac Health Council (ISACHC) and ACVIM Consensus (ACVIM-HD) schemes without knowledge of [NT-proBNP] results. Receiver-operating characteristic curve analysis assessed the capacity of [NT-proBNP] to discriminate between dogs with cardiac and noncardiac respiratory distress. Multivariate general linear models containing key clinical variables tested associations between [NT-proBNP] and HD severity. Plasma [NT-proBNP] (median; IQR) was higher in CHF dogs (5,110; 2,769-8,466 pmol/L) compared to those with noncardiac respiratory distress (1,287; 672-2,704 pmol/L; P < .0001). A cut-off >2,447 pmol/L discriminated CHF from noncardiac respiratory distress (81.1% sensitivity; 73.1% specificity; area under curve, 0.84). A multivariate model comprising left atrial to aortic ratio, heart rate, left ventricular diameter at end-systole, and the ACVIM-HD scheme most accurately associated average plasma [NT-proBNP] with HD severity. Plasma [NT-proBNP] was useful for discriminating CHF from noncardiac respiratory distress. Average plasma [NT-proBNP] increased significantly as a function of HD severity using the ACVIM-HD classification scheme.
Copyright © 2014 by the American College of Veterinary Internal Medicine.
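The cut-off evaluation described above reduces to a sensitivity/specificity computation at a fixed biomarker threshold. A minimal sketch follows; the NT-proBNP values are hypothetical stand-ins, not the study data.

```python
import numpy as np

def cutoff_performance(values, labels, cutoff):
    """Sensitivity and specificity of a 'biomarker > cutoff' test.
    labels: 1 = CHF, 0 = noncardiac respiratory distress."""
    values = np.asarray(values, dtype=float)
    labels = np.asarray(labels, dtype=int)
    predicted = values > cutoff
    tp = np.sum(predicted & (labels == 1))   # CHF correctly flagged
    fn = np.sum(~predicted & (labels == 1))  # CHF missed
    tn = np.sum(~predicted & (labels == 0))  # noncardiac correctly cleared
    fp = np.sum(predicted & (labels == 0))   # noncardiac wrongly flagged
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical NT-proBNP concentrations (pmol/L).
chf = [5110, 2769, 8466, 3100, 6000]
noncardiac = [1287, 672, 2704, 900, 2500]
sens, spec = cutoff_performance(chf + noncardiac,
                                [1] * len(chf) + [0] * len(noncardiac),
                                cutoff=2447)
```

Sweeping `cutoff` over all observed values and plotting sensitivity against 1 − specificity would trace the ROC curve used in the study.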

  12. Murmur intensity in adult dogs with pulmonic and subaortic stenosis reflects disease severity.

    PubMed

    Caivano, D; Dickson, D; Martin, M; Rishniw, M

    2018-03-01

The aims of this study were to determine whether murmur intensity in adult dogs with pulmonic stenosis or subaortic stenosis reflects echocardiographic disease severity and to determine whether a six-level murmur grading scheme provides clinical advantages over a four-level scheme. In this retrospective multi-investigator study on adult dogs with pulmonic stenosis or subaortic stenosis, murmur intensity was compared to the echocardiographically determined pressure gradient across the affected valve. Disease severity, based on pressure gradients, was assessed between sequential murmur grades to identify redundancy in classification. A simplified four-level murmur intensity classification scheme ('soft', 'moderate', 'loud', 'palpable') was evaluated. In total, 284 dogs (153 with pulmonic stenosis, 131 with subaortic stenosis) were included; 55 dogs had soft, 59 had moderate, 72 had loud and 98 had palpable murmurs. Ninety-five dogs had mild stenosis, 46 had moderate stenosis, and 143 had severe stenosis. No dogs with soft murmurs of either pulmonic or subaortic stenosis had transvalvular pressure gradients greater than 50 mmHg. Dogs with loud or palpable murmurs mostly, but not always, had severe stenosis. Stenosis severity increased with increasing murmur intensity. The traditional six-level murmur grading scheme provided no additional clinical information beyond the four-level descriptive murmur grading scheme. A simplified descriptive four-level murmur grading scheme differentiated stenosis severity without loss of clinical information, compared to the traditional six-level scheme. Soft murmurs in dogs with pulmonic or subaortic stenosis are strongly indicative of mild lesions. Loud or palpable murmurs are strongly suggestive of severe stenosis. © 2017 British Small Animal Veterinary Association.

  13. Random forest feature selection, fusion and ensemble strategy: Combining multiple morphological MRI measures to discriminate among healthy elderly, MCI, cMCI and Alzheimer's disease patients: From the Alzheimer's disease neuroimaging initiative (ADNI) database.

    PubMed

    Dimitriadis, S I; Liparas, Dimitris; Tsolaki, Magda N

    2018-05-15

In the era of computer-assisted diagnostic tools for various brain diseases, Alzheimer's disease (AD) covers a large percentage of neuroimaging research, with the main scope being its use in daily practice. However, there has been no study attempting to simultaneously discriminate among Healthy Controls (HC), early mild cognitive impairment (MCI), late MCI (cMCI) and stable AD, using features derived from a single modality, namely MRI. Based on preprocessed MRI images from the organizers of a neuroimaging challenge, we attempted to quantify the prediction accuracy of multiple morphological MRI features to simultaneously discriminate among HC, MCI, cMCI and AD. We explored the efficacy of a novel scheme that includes multiple feature selections via Random Forest from subsets of the whole set of features (e.g. whole set, left/right hemisphere etc.), Random Forest classification using a fusion approach and ensemble classification via majority voting. From the ADNI database, 60 HC, 60 MCI, 60 cMCI and 60 AD subjects were used as a training set with known labels. An extra dataset of 160 subjects (HC: 40, MCI: 40, cMCI: 40 and AD: 40) was used as an external blind validation dataset to evaluate the proposed machine learning scheme. On this blind dataset, we achieved a four-class classification accuracy of 61.9% by combining MRI-based features with a Random Forest-based Ensemble Strategy. We achieved the best classification accuracy of all teams that participated in this neuroimaging competition. The results demonstrate the effectiveness of the proposed scheme to simultaneously discriminate among four groups using morphological MRI features for the very first time in the literature. Hence, the proposed machine learning scheme can be used to define single and multi-modal biomarkers for AD. Copyright © 2017 Elsevier B.V. All rights reserved.
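The per-subset training plus majority-vote ensemble can be sketched in a few lines. To keep the sketch dependency-free, nearest-centroid base learners stand in for the paper's Random Forests, and the data are toy samples, not ADNI features.

```python
import numpy as np

def fit_centroids(X, y):
    """Nearest-centroid base learner (stand-in for a Random Forest)."""
    classes = np.unique(y)
    return classes, np.array([X[y == c].mean(axis=0) for c in classes])

def predict_centroids(model, X):
    classes, centroids = model
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return classes[np.argmin(d, axis=1)]

def ensemble_majority_vote(X_tr, y_tr, X_te, feature_subsets):
    """Train one base learner per feature subset, combine by majority vote."""
    votes = np.stack([predict_centroids(fit_centroids(X_tr[:, cols], y_tr),
                                        X_te[:, cols])
                      for cols in feature_subsets])
    final = []
    for col in votes.T:                       # one column per test sample
        vals, counts = np.unique(col, return_counts=True)
        final.append(vals[np.argmax(counts)])
    return np.array(final)

# Toy 4-class data with 6 "morphological" features (class c centered at c).
rng = np.random.default_rng(0)
X_tr = np.vstack([rng.normal(c, 0.3, (20, 6)) for c in range(4)])
y_tr = np.repeat(np.arange(4), 20)
X_te = np.vstack([rng.normal(c, 0.3, (5, 6)) for c in range(4)])
y_te = np.repeat(np.arange(4), 5)
# Whole set plus two hemisphere-like subsets, echoing the paper's splits.
subsets = [[0, 1, 2, 3, 4, 5], [0, 1, 2], [3, 4, 5]]
accuracy = (ensemble_majority_vote(X_tr, y_tr, X_te, subsets) == y_te).mean()
```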

  14. Mapping shallow waters habitats using OBIA by applying several approaches of depth invariant index in North Kepulauan Seribu

    NASA Astrophysics Data System (ADS)

    Siregar, V. P.; Agus, S. B.; Subarno, T.; Prabowo, N. W.

    2018-05-01

The availability of satellite imagery at a variety of spatial resolutions, both freely available and commercial, has made remote sensing technology a practical option. Variability of the water column is one of the factors affecting interpretation when mapping marine shallow waters. This study aimed to evaluate the influence of water column correction (depth-invariant index, DII) on the accuracy of shallow water habitat classification using OBIA. The study was conducted in the north of Kepulauan Seribu, around Harapan Island and its surrounding areas. Habitat class schemes were based on field observations, which were then used to build habitat classes on satellite imagery. The water column correction was applied to the three pairs of SPOT-7 multispectral bands, which were subsequently used in object-based classification. Satellite image classification was performed with four different inputs: (i) DII-transformed bands from a single band pair (B1B2), (ii) multiple band pairs (B1B2, B1B3, and B2B3), (iii) a combination of the multiple band pairs and the initial bands, and (iv) the initial bands only. The accuracy tests of the four inputs gave Overall Accuracy and Kappa statistics of 55.84 and 0.48; 68.53 and 0.64; 78.68 and 0.76; and 77.66 and 0.74, respectively. The best results for shallow water benthic classification at this study site were therefore obtained using the combination of DII-transformed and initial bands.
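The depth-invariant index for a band pair follows Lyzenga's formulation, DII = ln(Li) − (ki/kj)·ln(Lj), with the attenuation-coefficient ratio estimated from log radiances over a uniform bottom at varying depth. A minimal sketch with synthetic radiances (illustrative attenuation values, not SPOT-7 calibration):

```python
import numpy as np

def attenuation_ratio(band_i, band_j):
    """Estimate ki/kj (Lyzenga's method) from log radiances sampled over a
    single bottom type at varying depth."""
    xi, xj = np.log(band_i), np.log(band_j)
    a = (np.var(xi, ddof=1) - np.var(xj, ddof=1)) / (2 * np.cov(xi, xj)[0, 1])
    return a + np.sqrt(a * a + 1)

def depth_invariant_index(band_i, band_j):
    """DII = ln(Li) - (ki/kj) * ln(Lj) for one band pair."""
    return np.log(band_i) - attenuation_ratio(band_i, band_j) * np.log(band_j)

# Synthetic radiances: same bottom, depth varying from 1 m to 10 m,
# attenuation coefficients 0.1 and 0.2 per metre (made-up values).
z = np.linspace(1.0, 10.0, 50)
band_i = 5.0 * np.exp(-0.1 * z)
band_j = 3.0 * np.exp(-0.2 * z)
dii = depth_invariant_index(band_i, band_j)   # constant despite depth change
```

Applying this to the pairs B1B2, B1B3 and B2B3 yields the three DII layers used as classification inputs (ii) and (iii) above.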

  15. A reevaluation of the costs of heart failure and its implications for allocation of health resources in the United States.

    PubMed

    Voigt, Jeff; Sasha John, M; Taylor, Andrew; Krucoff, Mitchell; Reynolds, Matthew R; Michael Gibson, C

    2014-05-01

    The annual cost of heart failure (HF) is estimated at $39.2 billion. This has been acknowledged to underestimate the true costs for care. The objective of this analysis is to more accurately assess these costs. Publicly available data sources were used. Cost calculations incorporated relevant factors such as Medicare hospital cost-to-charge ratios, reimbursement from both government and private insurance, and out-of-pocket expenditures. A recently published Atherosclerosis Risk in Communities (ARIC) HF scheme was used to adjust the HF classification scheme. Costs were calculated with HF as the primary diagnosis (HF in isolation, or HFI) or HF as one of the diagnoses/part of a disease milieu (HF syndrome, or HFS). Total direct costs for HF were calculated at $60.2 billion (HFI) and $115.4 billion (HFS). Indirect costs were $10.6 billion for both. Costs attributable to HF may represent a much larger burden to US health care than what is commonly referenced. These revised and increased costs have implications for policy makers.
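The cost-to-charge adjustment mentioned above is simple arithmetic: billed charges are scaled by a facility's Medicare cost-to-charge ratio (CCR) to estimate actual cost. The figures below are made up for illustration, not drawn from the analysis.

```python
def estimated_cost(billed_charges, cost_to_charge_ratio):
    """Convert hospital billed charges to an estimated actual cost using a
    Medicare cost-to-charge ratio (CCR)."""
    return billed_charges * cost_to_charge_ratio

# Hypothetical HF hospitalization: $50,000 billed, CCR of 0.30.
cost = estimated_cost(50_000, 0.30)   # roughly $15,000
```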

  16. Mastering cognitive development theory in computer science education

    NASA Astrophysics Data System (ADS)

    Gluga, Richard; Kay, Judy; Lister, Raymond; Simon; Kleitman, Sabina

    2013-03-01

To design an effective computer science curriculum, educators require a systematic method of classifying the difficulty level of learning activities and assessment tasks. This is important for curriculum design and implementation and for communication between educators. Different educators must be able to use the method consistently, so that classified activities and assessments are comparable across the subjects of a degree, and, ideally, comparable across institutions. One widespread approach to supporting this is to write learning objects in terms of Bloom's Taxonomy. This, or other such classifications, is likely to be more effective if educators can use them consistently, in the way experts would use them. To this end, we present the design and evaluation of our online interactive web-based tutorial system, which can be configured and used to offer training in different classification schemes. We report on results from three evaluations. First, 17 computer science educators completed a tutorial on using Bloom's Taxonomy to classify programming examination questions. Second, 20 computer science educators completed a Neo-Piagetian tutorial. Third, we compared the inter-rater reliability scores of computer science educators classifying programming questions using Bloom's Taxonomy, before and after taking our tutorial. Based on the results from these evaluations, we discuss the effectiveness of our tutorial system design for teaching computer science educators how to systematically and consistently classify programming examination questions. We also discuss the suitability of Bloom's Taxonomy and Neo-Piagetian theory for achieving this goal. The Bloom's and Neo-Piagetian tutorials are made available as a community resource. 
The contributions of this paper are the following: the tutorial system for learning classification schemes for the purpose of coding the difficulty of computing learning materials; its evaluation; new insights into the consistency that computing educators can achieve using Bloom; and first insights into the use of Neo-Piagetian theory by a group of classifiers.

  17. A multistage approach to improve performance of computer-aided detection of pulmonary embolisms depicted on CT images: preliminary investigation.

    PubMed

    Park, Sang Cheol; Chapman, Brian E; Zheng, Bin

    2011-06-01

This study developed a computer-aided detection (CAD) scheme for pulmonary embolism (PE) detection and investigated several approaches to improve CAD performance. In the study, 20 computed tomography examinations with various lung diseases were selected, which included 44 verified PE lesions. The proposed CAD scheme consists of five basic steps: 1) lung segmentation; 2) PE candidate extraction using an intensity mask and tobogganing region growing; 3) PE candidate feature extraction; 4) false-positive (FP) reduction using an artificial neural network (ANN); and 5) a multifeature-based k-nearest neighbor for positive/negative classification. In this study, we also investigated the following additional methods to improve CAD performance: 1) grouping 2-D detected features into a single 3-D object; 2) selecting features with a genetic algorithm (GA); and 3) limiting the number of allowed suspicious lesions to be cued in one examination. The results showed that 1) the CAD scheme using tobogganing, an ANN, and the grouping method achieved a maximum detection sensitivity of 79.2%; 2) the maximum scoring method achieved superior performance over the other scoring fusion methods; 3) the GA was able to delete "redundant" features and further improve CAD performance; and 4) limiting the maximum number of cued lesions in an examination reduced the FP rate by a factor of 5.3. Combining these approaches, the CAD scheme achieved 63.2% detection sensitivity with 18.4 FP lesions per examination. The study suggested that performance of CAD schemes for PE detection depends on many factors that include 1) optimizing the 2-D region grouping and scoring methods; 2) selecting the optimal feature set; and 3) limiting the number of allowed cueing lesions per examination.
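Two of the steps above are easy to sketch: the kNN positive/negative classification (step 5) and the per-examination cueing limit. The feature space and candidates below are toy stand-ins, not the study's CT-derived features.

```python
import numpy as np

def knn_scores(train_feats, train_labels, cand_feats, k=5):
    """Score each PE candidate by the fraction of its k nearest training
    candidates that are true lesions (a sketch of the kNN step)."""
    d = np.linalg.norm(cand_feats[:, None, :] - train_feats[None, :, :], axis=2)
    nearest = np.argsort(d, axis=1)[:, :k]
    return train_labels[nearest].mean(axis=1)

def cue_top_n(scores, n_max):
    """Cue only the n_max most suspicious lesions in one examination."""
    cued = np.zeros(len(scores), dtype=bool)
    cued[np.argsort(scores)[::-1][:n_max]] = True
    return cued

# Toy 2-D feature space: true lesions cluster near (1, 1), FPs near (0, 0).
rng = np.random.default_rng(1)
train = np.vstack([rng.normal(1, 0.1, (10, 2)), rng.normal(0, 0.1, (10, 2))])
labels = np.array([1] * 10 + [0] * 10)
candidates = np.array([[0.95, 1.05], [0.02, -0.03], [0.5, 0.5]])
scores = knn_scores(train, labels, candidates, k=5)
cued = cue_top_n(scores, n_max=2)
```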

  18. A risk-based classification scheme for genetically modified foods. I: Conceptual development.

    PubMed

    Chao, Eunice; Krewski, Daniel

    2008-12-01

    The predominant paradigm for the premarket assessment of genetically modified (GM) foods reflects heightened public concern by focusing on foods modified by recombinant deoxyribonucleic acid (rDNA) techniques, while foods modified by other methods of genetic modification are generally not assessed for safety. To determine whether a GM product requires less or more regulatory oversight and testing, we developed and evaluated a risk-based classification scheme (RBCS) for crop-derived GM foods. The results of this research are presented in three papers. This paper describes the conceptual development of the proposed RBCS that focuses on two categories of adverse health effects: (1) toxic and antinutritional effects, and (2) allergenic effects. The factors that may affect the level of potential health risks of GM foods are identified. For each factor identified, criteria for differentiating health risk potential are developed. The extent to which a GM food satisfies applicable criteria for each factor is rated separately. A concern level for each category of health effects is then determined by aggregating the ratings for the factors using predetermined aggregation rules. An overview of the proposed scheme is presented, as well as the application of the scheme to a hypothetical GM food.
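The rate-then-aggregate structure described above can be sketched as follows. The factor names and ratings are hypothetical, and `max` is only a stand-in aggregation rule; the paper's predetermined rules are not reproduced here.

```python
def concern_level(factor_ratings, aggregate=max):
    """Aggregate per-factor health-risk ratings (1 = low .. 3 = high) into
    a single concern level for one category of adverse health effect."""
    return aggregate(factor_ratings.values())

# Hypothetical ratings for the two categories in the proposed RBCS.
toxicity = {"nature_of_modification": 2, "dietary_exposure": 3,
            "host_plant_history": 1}
allergenicity = {"source_of_gene": 1, "sequence_similarity": 1,
                 "digestibility": 2}
levels = (concern_level(toxicity), concern_level(allergenicity))
```

A higher concern level would then map to a greater degree of regulatory oversight and testing for that GM food.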

  19. Dynamic Computation Offloading for Low-Power Wearable Health Monitoring Systems.

    PubMed

    Kalantarian, Haik; Sideris, Costas; Mortazavi, Bobak; Alshurafa, Nabil; Sarrafzadeh, Majid

    2017-03-01

    The objective of this paper is to describe and evaluate an algorithm to reduce power usage and increase battery lifetime for wearable health-monitoring devices. We describe a novel dynamic computation offloading scheme for real-time wearable health monitoring devices that adjusts the partitioning of data processing between the wearable device and mobile application as a function of desired classification accuracy. By making the correct offloading decision based on current system parameters, we show that we are able to reduce system power by as much as 20%. We demonstrate that computation offloading can be applied to real-time monitoring systems, and yields significant power savings. Making correct offloading decisions for health monitoring devices can extend battery life and improve adherence.
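The core offloading decision reduces to choosing the lowest-power partitioning that still meets the desired classification accuracy. A minimal sketch follows; the partition names, power figures and accuracies are invented for illustration, not taken from the paper.

```python
def choose_partition(partitions, min_accuracy):
    """Pick the data-processing split with the lowest power draw that still
    meets the desired classification accuracy."""
    feasible = [p for p in partitions if p["accuracy"] >= min_accuracy]
    if not feasible:
        raise ValueError("no partitioning meets the accuracy target")
    return min(feasible, key=lambda p: p["power_mw"])

# Hypothetical splits between the wearable and the mobile application.
options = [
    {"name": "all_on_wearable",  "power_mw": 95,  "accuracy": 0.90},
    {"name": "offload_features", "power_mw": 76,  "accuracy": 0.88},
    {"name": "offload_raw",      "power_mw": 110, "accuracy": 0.92},
]
best = choose_partition(options, min_accuracy=0.85)
```

Re-running the decision as system parameters change (battery level, link quality, accuracy target) is what makes the scheme dynamic.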

  20. VizieR Online Data Catalog: GALAH semi-automated classification scheme (Traven+, 2017)

    NASA Astrophysics Data System (ADS)

    Traven, G.; Matijevic, G.; Zwitter, T.; Zerjal, M.; Kos, J.; Asplund, M.; Bland-Hawthorn, J.; Casey, A. R.; de Silva, G.; Freeman, K.; Lin, J.; Martell, S. L.; Schlesinger, K. J.; Sharma, S.; Simpson, J. D.; Zucker, D. B.; Anguiano, B.; da Costa, G.; Duong, L.; Horner, J.; Hyde, E. A.; Kafle, P. R.; Munari, U.; Nataf, D.; Navin, C. A.; Reid, W.; Ting, Y.-S.

    2017-04-01

    The GALactic Archaeology with HERMES (GALAH) survey was the main driver for the construction of Hermes (High Efficiency and Resolution Multi-Element Spectrograph), a fiber-fed multi-object spectrograph on the 3.9m Anglo-Australian Telescope. Its spectral resolving power (R) is about 28000, and there is also an R=45000 mode using a slit mask. Hermes has four simultaneous non-contiguous spectral arms centered at 4800, 5761, 6610, and 7740Å, covering about 1000Å in total, including Hα and Hβ lines. About 300000 spectra have been taken to date, including various calibration exposures. However, we concentrate on ~210000 spectra recorded before 2016 January 30. We devise a custom classification procedure which is based on two independently developed methods, the novel dimensionality reduction technique t-SNE (t-distributed stochastic neighbor embedding; van der Maaten & Hinton 2008, Journal of Machine Learning Research 9, 2579) and the renowned clustering algorithm DBSCAN (Ester+ 1996, Proc. 2nd Int. Conf. on KDD, 226 ed. E. Simoudis, J. Han, and U. Fayyad). (4 data files).
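The clustering stage of the pipeline can be illustrated with a minimal DBSCAN (Ester+ 1996). The 2-D points below stand in for a t-SNE projection of the spectra, which is omitted here; this is a sketch, not the GALAH implementation.

```python
import numpy as np

def dbscan(X, eps, min_pts):
    """Minimal DBSCAN: returns one cluster id per point, -1 for noise."""
    n = len(X)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    neighbors = [np.flatnonzero(dist[i] <= eps) for i in range(n)]
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or len(neighbors[i]) < min_pts:
            continue                     # already assigned, or not a core point
        labels[i] = cluster
        queue = list(neighbors[i])
        while queue:                     # grow the cluster from the seed point
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster
                if len(neighbors[j]) >= min_pts:
                    queue.extend(neighbors[j])
        cluster += 1
    return labels

# Two dense groups of projected "spectra" plus one outlier.
blob1 = [[0, 0], [0, 1], [1, 0], [1, 1], [0.5, 0.5]]
blob2 = [[10, 10], [10, 11], [11, 10], [11, 11], [10.5, 10.5]]
points = np.array(blob1 + blob2 + [[50, 50]], dtype=float)
labels = dbscan(points, eps=2.0, min_pts=3)
```

Points labelled -1 (noise) are the kind of peculiar objects a semi-automated scheme would flag for human inspection.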

  1. Threshold of toxicological concern values for non-genotoxic effects in industrial chemicals: re-evaluation of the Cramer classification.

    PubMed

    Kalkhof, H; Herzler, M; Stahlmann, R; Gundert-Remy, U

    2012-01-01

The TTC concept employs available data from animal testing to derive a distribution of NOAELs. Taking a probabilistic view, the 5th percentile of the distribution is taken as a threshold value for toxicity. In this paper, we use 824 NOAELs from repeated dose toxicity studies of industrial chemicals to re-evaluate the currently employed TTC values, which have been derived for substances grouped according to the Cramer scheme (Cramer et al. in Food Cosm Toxicol 16:255-276, 1978) by Munro et al. (Food Chem Toxicol 34:829-867, 1996) and refined by Kroes and Kozianowski (Toxicol Lett 127:43-46, 2002) and Kroes et al. (2000). In our data set, consisting of 756 NOAELs from 28-day repeated dose testing and 57 NOAELs from 90-day repeated dose testing, the experimental NOAELs had to be extrapolated to chronic values using regulatory accepted extrapolation factors. The TTC values derived from our data set were higher than the currently used TTC values, confirming the safety of the latter. We analysed the predictive value of the Cramer classification by comparing the classification by this tool with the guidance values for classification according to the Globally Harmonised System of classification and labelling of the United Nations (GHS). Nearly 90% of the chemicals were in Cramer class 3 and thus assumed to be highly toxic, compared to 22% according to the GHS. The Cramer classification underestimated the toxicity of chemicals in only 4.6% of cases. Hence, from a regulatory perspective, the Cramer classification scheme might be applied, as it overestimates the hazard of a chemical.
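The probabilistic step described above, extrapolating NOAELs to chronic values and taking the 5th percentile of the resulting distribution, can be sketched directly. The NOAELs and the extrapolation factor of 2 below are hypothetical illustrations, not the paper's data set.

```python
import numpy as np

def ttc_from_noaels(noaels_mg_per_kg, extrapolation_factor=1.0, percentile=5):
    """5th percentile of an adjusted NOAEL distribution: the probabilistic
    threshold value used in the TTC concept."""
    adjusted = np.asarray(noaels_mg_per_kg, dtype=float) / extrapolation_factor
    return float(np.percentile(adjusted, percentile))

# Hypothetical 28-day NOAELs (mg/kg bw/day), extrapolated to chronic
# exposure with an assumed subchronic-to-chronic factor of 2.
noaels = [10, 25, 50, 100, 200, 400, 5, 80, 150, 300]
ttc = ttc_from_noaels(noaels, extrapolation_factor=2.0)
```

In the actual re-evaluation this would be done separately per Cramer class, with the class-specific distributions of adjusted NOAELs.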

  2. Progressively expanded neural network for automatic material identification in hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Paheding, Sidike

The science of hyperspectral remote sensing focuses on the exploitation of the spectral signatures of various materials to enhance capabilities including object detection, recognition, and material characterization. Hyperspectral imagery (HSI) has been extensively used for object detection and identification applications since it provides plentiful spectral information to uniquely identify materials by their reflectance spectra. HSI-based object detection algorithms can be generally classified into stochastic and deterministic approaches. Deterministic approaches are comparatively simple to apply since they are usually based on direct spectral similarity measures such as spectral angle or spectral correlation. In contrast, stochastic algorithms require statistical modeling and estimation for the target and non-target classes. Over the decades, many single-class object detection methods have been proposed in the literature; however, deterministic multiclass object detection in HSI has not been explored. In this work, we propose a deterministic multiclass object detection scheme, named class-associative spectral fringe-adjusted joint transform correlation. The human brain is capable of simultaneously processing high volumes of multi-modal data received every second of the day. In contrast, a machine sees input data simply as random binary numbers. Although machines are computationally efficient, they are inferior when it comes to data abstraction and interpretation. Thus, mimicking the learning strength of the human brain is a current trend in artificial intelligence. In this work, we present a biologically inspired neural network, named progressively expanded neural network (PEN Net), based on a nonlinear transformation of input neurons to a feature space for better pattern differentiation. In PEN Net, discrete fixed excitations are disassembled and scattered in the feature space as a nonlinear line. Each disassembled element on the line corresponds to a pattern with similar features. Unlike conventional neural networks, where hidden neurons need to be iteratively adjusted to achieve better accuracy, our proposed PEN Net does not require hidden-neuron tuning, which yields better computational efficiency, and it has also shown superior performance in HSI classification tasks compared to the state of the art. Spectral-spatial HSI classification frameworks have shown greater strength than spectral-only methods. In our final proposed technique, PEN Net is incorporated with multiscale spatial features (i.e., the multiscale complete local binary pattern) to perform spectral-spatial classification of HSI. Several experiments demonstrate the excellent performance of our proposed technique compared to more recently developed approaches.
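PEN Net's exact expansion is not reproduced here; the sketch below uses a generic fixed polynomial expansion of the input neurons followed by a closed-form least-squares readout, to illustrate the "no iterative hidden-neuron tuning" idea on toy two-band spectra.

```python
import numpy as np

def expand(X, orders=(1, 2, 3)):
    """Fixed nonlinear expansion of the input neurons into a feature space
    (generic polynomial basis -- a stand-in for PEN Net's expansion)."""
    return np.hstack([X ** p for p in orders])

def fit_readout(X, y, n_classes):
    """Closed-form least-squares readout: no hidden neurons to tune."""
    W, *_ = np.linalg.lstsq(expand(X), np.eye(n_classes)[y], rcond=None)
    return W

def classify(W, X):
    return np.argmax(expand(X) @ W, axis=1)

# Toy two-class "spectra" with two bands each.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.2, (20, 2)), rng.normal(1, 0.2, (20, 2))])
y = np.repeat([0, 1], 20)
W = fit_readout(X, y, n_classes=2)
train_acc = (classify(W, X) == y).mean()
```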

  3. The Stillbirth Classification System for the Safe Passage Study: Incorporating Mechanism, Etiology, and Recurrence

    PubMed Central

    Boyd, Theonia K.; Wright, Colleen A.; Odendaal, Hein J.; Elliott, Amy J.; Sens, Mary Ann; Folkerth, Rebecca D.; Roberts, Drucilla J.; Kinney, Hannah C.

    2017-01-01

    OBJECTIVE Describe the classification system for the assignment of the cause of death for stillbirth in the Safe Passage Study, an international, multi-institutional, prospective analysis conducted by the NIAAA/NICHD funded PASS Network (The Prenatal Alcohol in SIDS and Stillbirth (PASS) Research Network). The study mission is to determine the role of prenatal alcohol and/or cigarette smoke exposure in adverse pregnancy outcomes, including stillbirth, in a high-risk cohort of 12,000 maternal/fetal dyads. METHODS The PASS Network classification system is based upon 5 ‘sites of origin’ for cause of stillbirth (Fetal, Placental, Maternal, External/Environmental, or Undetermined), further subdivided into mechanism subcategories (e.g., Placental Perfusion Failure). Both site of origin and mechanism stratification are employed to assign an ultimate cause of death. Each PASS stillbirth (n=19) in the feasibility study was assigned a cause of death, and status of sporadic versus recurrent. Adjudication involved review of the maternal and obstetrical records, and fetal autopsy and placental findings, with complete consensus in each case. Two published classification systems, i.e., INCODE and ReCoDe, were used for comparison. RESULTS Causes of stillbirth classified were: fetal (n=5, 26%), placental (n=10, 53%), external (n=1, 5%), and undetermined (n=3, 16%). Nine cases (47%) had placental causes of death due to maternal disorders that carry recurrence risks. There was complete agreement for the cause of death across the three classification systems in 26% of cases, and a combination of partial or complete agreement in 68% of cases. Complete vs. partial agreements were predicated upon the classification schemes used for comparison. CONCLUSIONS The proposed PASS system is a user-friendly classification system that provides comparable information to previously published systems. 
Advantages include its simplicity, mechanistic formulations, tight clinicopathologic integration, provision for an undetermined category, and its wide applicability for use by perinatal mortality review boards with access to information routinely collected during clinicopathologic evaluations. PMID:27116324

  4. Iatrogenic Bone and Soft Tissue Trauma in Robotic-Arm Assisted Total Knee Arthroplasty Compared With Conventional Jig-Based Total Knee Arthroplasty: A Prospective Cohort Study and Validation of a New Classification System.

    PubMed

    Kayani, Babar; Konan, Sujith; Pietrzak, Jurek R T; Haddad, Fares S

    2018-03-27

    The objective of this study was to compare macroscopic bone and soft tissue injury between robotic-arm assisted total knee arthroplasty (RA-TKA) and conventional jig-based total knee arthroplasty (CJ-TKA) and create a validated classification system for reporting iatrogenic bone and periarticular soft tissue injury after TKA. This study included 30 consecutive CJ-TKAs followed by 30 consecutive RA-TKAs performed by a single surgeon. Intraoperative photographs of the femur, tibia, and periarticular soft tissues were taken before implantation of prostheses. Using these outcomes, the macroscopic soft tissue injury (MASTI) classification system was developed to grade iatrogenic bone and soft tissue injuries. Interobserver and Intraobserver validity of the proposed classification system was assessed. Patients undergoing RA-TKA had reduced medial soft tissue injury in both passively correctible (P < .05) and noncorrectible varus deformities (P < .05); more pristine femoral (P < .05) and tibial (P < .05) bone resection cuts; and improved MASTI scores compared to CJ-TKA (P < .05). There was high interobserver (intraclass correlation coefficient 0.92 [95% confidence interval: 0.88-0.96], P < .05) and intraobserver agreement (intraclass correlation coefficient 0.94 [95% confidence interval: 0.92-0.97], P < .05) of the proposed MASTI classification system. There is reduced bone and periarticular soft tissue injury in patients undergoing RA-TKA compared to CJ-TKA. The proposed MASTI classification system is a reproducible grading scheme for describing iatrogenic bone and soft tissue injury in TKA. RA-TKA is associated with reduced bone and soft tissue injury compared with conventional jig-based TKA. The proposed MASTI classification may facilitate further research correlating macroscopic soft tissue injury during TKA to long-term clinical and functional outcomes. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Cluster designs to assess the prevalence of acute malnutrition by lot quality assurance sampling: a validation study by computer simulation

    PubMed Central

    Olives, Casey; Pagano, Marcello; Deitchler, Megan; Hedt, Bethany L; Egge, Kari; Valadez, Joseph J

    2009-01-01

    Traditional lot quality assurance sampling (LQAS) methods require simple random sampling to guarantee valid results. However, cluster sampling has been proposed to reduce the number of random starting points. This study uses simulations to examine the classification error of two such designs, a 67×3 (67 clusters of three observations) and a 33×6 (33 clusters of six observations) sampling scheme to assess the prevalence of global acute malnutrition (GAM). Further, we explore the use of a 67×3 sequential sampling scheme for LQAS classification of GAM prevalence. Results indicate that, for independent clusters with moderate intracluster correlation for the GAM outcome, the three sampling designs maintain approximate validity for LQAS analysis. Sequential sampling can substantially reduce the average sample size that is required for data collection. The presence of intercluster correlation can impact dramatically the classification error that is associated with LQAS analysis. PMID:20011037
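The simulation idea can be sketched for the 67×3 design under the simplest case of independent observations (no intracluster correlation, which the study shows matters; the decision threshold of 25 cases below is a made-up illustration).

```python
import numpy as np

def lqas_high_rate(prevalence, n_clusters, per_cluster, threshold, n_sim=2000):
    """Fraction of simulated surveys classified 'high GAM prevalence'
    (observed cases > threshold). Observations are treated as independent,
    so each survey total is a single binomial draw -- a simplification."""
    rng = np.random.default_rng(42)
    n = n_clusters * per_cluster
    cases = rng.binomial(n, prevalence, size=n_sim)
    return float(np.mean(cases > threshold))

# 67x3 design (201 children), hypothetical decision threshold of 25 cases.
p_when_high = lqas_high_rate(0.15, 67, 3, threshold=25)  # true prevalence 15%
p_when_low = lqas_high_rate(0.05, 67, 3, threshold=25)   # true prevalence 5%
```

Introducing intracluster correlation (e.g. drawing a per-cluster prevalence from a beta distribution before the binomial draw) inflates the variance of the survey totals, which is exactly the mechanism behind the classification-error increase the study reports.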

  6. Image-Based Airborne Sensors: A Combined Approach for Spectral Signatures Classification through Deterministic Simulated Annealing

    PubMed Central

    Guijarro, María; Pajares, Gonzalo; Herrera, P. Javier

    2009-01-01

The increasing technology of high-resolution image airborne sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during the flights. The classification of natural spectral signatures in images is one potential application. The current trend in classification is toward the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well tested supervised parametric Bayesian estimator and the Fuzzy Clustering. The DSA is an optimization approach, which minimizes an energy function. The main contribution of DSA is its ability to avoid local minima during the optimization process thanks to the annealing scheme. It outperforms the simple classifiers used for the combination and some combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm. PMID:22399989

  7. Classifying quantum entanglement through topological links

    NASA Astrophysics Data System (ADS)

    Quinta, Gonçalo M.; André, Rui

    2018-04-01

We propose an alternative classification scheme for quantum entanglement based on topological links. This is done by identifying a nonrigid ring with a particle, attributing the act of cutting and removing a ring to the operation of tracing out the particle, and associating linked rings with entangled particles. This analogy naturally leads us to a classification of multipartite quantum entanglement based on all possible distinct links for a given number of rings. To determine all different possibilities, we develop a formalism that associates any link to a polynomial, with each polynomial thereby defining a distinct equivalence class. To demonstrate the use of this classification scheme, we choose qubit quantum states as our example of a physical system. A possible procedure to obtain qubit states from the polynomials is also introduced, providing an example state for each link class. We apply the formalism to the quantum systems of three and four qubits and demonstrate the potential of these tools in the context of qubit networks.

  8. Integrating disparate lidar data at the national scale to assess the relationships between height above ground, land cover and ecoregions

    USGS Publications Warehouse

    Stoker, Jason M.; Cochrane, Mark A.; Roy, David P.

    2013-01-01

With the acquisition of lidar data for over 30 percent of the US, it is now possible to assess the three-dimensional distribution of features at the national scale. This paper integrates over 350 billion lidar points from 28 disparate datasets into a national-scale database and evaluates whether height above ground is an important variable in the context of other national-scale layers, such as the US Geological Survey National Land Cover Database and the US Environmental Protection Agency ecoregions maps. While the results were not homoscedastic and the available data did not allow for a complete height census in any of the classes, it does appear that where lidar data were used, there were detectable differences in heights among many of these national classification schemes. This study supports the hypothesis that there were real, detectable differences in heights in certain national-scale classification schemes, despite height not being a variable used in any of the classification routines.

  9. Occupant detection using support vector machines with a polynomial kernel function

    NASA Astrophysics Data System (ADS)

    Destefanis, Eduardo A.; Kienzle, Eberhard; Canali, Luis R.

    2000-10-01

Air bag deployment in the presence of poorly positioned passengers or baby seats can injure or kill these occupants in the event of an accident. A proposed solution is the use of range sensors to detect risky passenger and baby seat positions, allowing airbag inflation to be controlled accordingly. This work is concerned with the application of different classification schemes to a real-world problem and the optimization of a sensor as a function of classification performance. The sensor is constructed using a new technology called the Photo-Mixer-Device (PMD). A systematic analysis of the occupant detection problem was made using real and virtual environments. The challenge is to find the best sensor geometry and to adapt a classification scheme under the current technological constraints. Passenger head position detection is also a desirable goal, and a pair of classifiers was combined in a simple configuration to reach it. Experiments and results are described.
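A polynomial kernel lets a linear method separate classes that are not linearly separable in the raw range-sensor features. The sketch below uses a kernel perceptron (a simple stand-in for the SVM trainer, sharing the same kernel trick) on an XOR-style toy layout; none of the data reflect actual PMD measurements.

```python
import numpy as np

def poly_kernel(x, y, degree=2, c=1.0):
    """Polynomial kernel K(x, y) = (x . y + c)^degree."""
    return (np.dot(x, y) + c) ** degree

def train_kernel_perceptron(X, y, degree=2, epochs=20):
    """Kernel perceptron: mistake-driven updates in the kernel-induced
    feature space (simpler than SVM training, same kernel trick)."""
    n = len(X)
    K = np.array([[poly_kernel(X[i], X[j], degree) for j in range(n)]
                  for i in range(n)])
    alpha = np.zeros(n)
    for _ in range(epochs):
        for i in range(n):
            if np.sign(np.sum(alpha * y * K[:, i])) != y[i]:
                alpha[i] += 1            # count one more mistake on sample i
    return alpha, K

# XOR-style layout: linearly inseparable, separable with degree 2.
X = np.array([[1, 1], [-1, -1], [1, -1], [-1, 1]], dtype=float)
y = np.array([1, 1, -1, -1])
alpha, K = train_kernel_perceptron(X, y)
train_preds = np.sign(K @ (alpha * y))
```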

  10. Classification of Uxo by Principal Dipole Polarizability

    NASA Astrophysics Data System (ADS)

    Kappler, K. N.

    2010-12-01

Data acquired by multiple-transmitter, multiple-receiver time-domain electromagnetic devices show great potential for determining geometric and compositional information about near-surface conductive targets. We present an analysis of data from one such system: the Berkeley Unexploded-ordnance Discriminator (BUD). BUD data are succinctly reduced by processing the multi-static data matrices to obtain magnetic dipole polarizability matrices for each time gate. When viewed over all time gates, the projections of the data onto the principal polarizability axes yield so-called polarizability curves. These curves are especially well suited to discriminating between subsurface conductivity anomalies corresponding to rotationally symmetric objects and those corresponding to irregularly shaped objects. The curves have previously been successfully employed as library elements in a pattern recognition scheme aimed at discriminating harmless scrap metal from dangerous intact unexploded ordnance. However, previous polarizability-curve matching methods have only been applied at field sites known a priori to be contaminated by a single type of ordnance, and furthermore, the particular ordnance present in the subsurface was known to be large; thus signal amplitude was a key element in the discrimination process. The work presented here applies feature-based pattern classification techniques to BUD field data where more than 20 categories of object are present. Data soundings from a calibration grid at the Yuma, AZ proving ground are used in a cross-validation study to calibrate the pattern recognition method. The resultant method is then applied to a blind test grid. Results indicate that when lone UXO are present and SNR is reasonably high, polarizability-curve matching successfully discriminates UXO from scrap metal even when a broad range of objects is present.

  11. Healthy and Unhealthy Perfectionists among Academically Gifted Chinese Students in Hong Kong: Do Different Classification Schemes Make a Difference?

    ERIC Educational Resources Information Center

    Chan, David W.

    2010-01-01

    This study investigated the identification and distribution of perfectionist types with a sample of 111 academically gifted Chinese students aged 17 to 20 in Hong Kong. Three approaches to classification were employed. Apart from the direct questioning approach, the rational approach and the clustering approach classified students using their…

  12. Application of a hierarchical habitat unit classification system: stream habitat and salmonid distribution in Ward Creek, southeast Alaska.

    Treesearch

    M.D. Bryant; B.E. Wright; B.J. Davies

    1992-01-01

    A hierarchical classification system separating stream habitat into habitat units defined by stream morphology and hydrology was used in a pre-enhancement stream survey. The system separates habitat units into macrounits, mesounits, and micro- units and includes a separate evaluation of instream cover that also uses the hierarchical scheme. This paper presents an...

  13. Identification and classification of known and putative antimicrobial compounds produced by a wide variety of Bacillales species.

    PubMed

    Zhao, Xin; Kuipers, Oscar P

    2016-11-07

    Gram-positive bacteria of the Bacillales are important producers of antimicrobial compounds that might be utilized for medical, food or agricultural applications. Thanks to the wide availability of whole genome sequence data and the development of specific genome mining tools, novel antimicrobial compounds, either ribosomally- or non-ribosomally produced, of various Bacillales species can be predicted and classified. Here, we provide a classification scheme of known and putative antimicrobial compounds in the specific context of Bacillales species. We identify and describe known and putative bacteriocins, non-ribosomally synthesized peptides (NRPs), polyketides (PKs) and other antimicrobials from 328 whole-genome sequenced strains of 57 species of Bacillales by using web based genome-mining prediction tools. We provide a classification scheme for these bacteriocins, update the findings of NRPs and PKs and investigate their characteristics and suitability for biocontrol by describing per class their genetic organization and structure. Moreover, we highlight the potential of several known and novel antimicrobials from various species of Bacillales. Our extended classification of antimicrobial compounds demonstrates that Bacillales provide a rich source of novel antimicrobials that can now readily be tapped experimentally, since many new gene clusters are identified.

  14. Flood Mapping in the Lower Mekong River Basin Using Daily MODIS Observations

    NASA Technical Reports Server (NTRS)

    Fayne, Jessica V.; Bolten, John D.; Doyle, Colin S.; Fuhrmann, Sven; Rice, Matthew T.; Houser, Paul R.; Lakshmi, Venkat

    2017-01-01

    In flat homogeneous terrain such as in Cambodia and Vietnam, the monsoon season brings significant and consistent flooding between May and November. To monitor flooding in the Lower Mekong region, the near real-time NASA Flood Extent Product (NASA-FEP) was developed using seasonal normalized difference vegetation index (NDVI) differences from the 250 m resolution Moderate Resolution Imaging Spectroradiometer (MODIS) sensor compared to daily observations. However, a percentage-change interval classification relating to various stages of flooding might be confusing to viewers or potential users, thereby reducing product usage. To increase product usability through simplification, the classification intervals were compared with other commonly used change detection schemes to identify the change classification scheme that best delineates flooded areas. The percentage-change method used in the NASA-FEP proved more helpful in delineating flood boundaries than the other change detection methods. The results of the accuracy assessments indicate that the -75% NDVI change interval can be reclassified to a descriptive 'flood' class. A binary system was used to simplify interpretation of the NASA-FEP by removing extraneous information from lower-interval change classes.
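
    The binary reclassification step described above can be sketched in a few lines: threshold a grid of NDVI percentage changes at -75% to produce a flood/no-flood map. The example grid and variable names are hypothetical.

```python
import numpy as np

# Hypothetical NDVI percentage-change grid (negative = vegetation loss).
# Following the abstract, cells at or below -75% change are relabeled with
# a single descriptive "flood" class; everything else is "no flood".
ndvi_change_pct = np.array([
    [-90.0, -60.0,  -5.0],
    [-80.0, -75.0,  10.0],
    [-20.0, -76.5, -74.9],
])

FLOOD_THRESHOLD = -75.0
flood_mask = ndvi_change_pct <= FLOOD_THRESHOLD  # binary flood map

print(flood_mask.astype(int))
print("flooded pixels:", int(flood_mask.sum()))  # → 4
```

    Collapsing the multi-interval product to a single mask is exactly the kind of simplification the abstract argues improves usability for non-specialist users.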

  15. Automatic breast tissue density estimation scheme in digital mammography images

    NASA Astrophysics Data System (ADS)

    Menechelli, Renan C.; Pacheco, Ana Luisa V.; Schiabel, Homero

    2017-03-01

    Cases of breast cancer have increased substantially each year. However, radiologists are subject to subjectivity and interpretation failures that may affect the final diagnosis in this examination, and high breast tissue density is an important factor related to these failures. Thus, among other functions, some CADx (computer-aided diagnosis) schemes classify breasts according to the predominant density. To aid in this procedure, this work describes automated software for classification and statistical reporting of the percentage change in breast tissue density through analysis of subregions (ROIs) of the whole mammography image. Once the breast is segmented, the image is divided into regions from which texture features are extracted; an MLP artificial neural network is then used to categorize the ROIs. Experienced radiologists previously determined the density classification of the ROIs, which served as the reference for evaluating the software. In tests on a set of 400 images, average accuracy was 88.7% for ROI classification and 83.25% for classification of whole-breast density into the four BI-RADS density classes. Furthermore, when considering only a simplified two-class division (high and low density), classifier accuracy reached 93.5%, with AUC = 0.95.
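
    A minimal sketch of the ROI-classification step, assuming two hand-made texture features and synthetic data in place of the paper's mammography ROIs; the network size and feature values are illustrative, not the authors' configuration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical texture features for 200 ROIs: low-density vs high-density
# tissue, each ROI described by (mean intensity, contrast).
low  = rng.normal([0.3, 0.8], 0.08, size=(100, 2))
high = rng.normal([0.7, 0.3], 0.08, size=(100, 2))
X = np.vstack([low, high])
y = np.array([0] * 100 + [1] * 100)   # 0 = low density, 1 = high density

# A small MLP, as in the abstract's ROI categorization stage.
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                    random_state=0).fit(X, y)
print(clf.score(X, y))  # training accuracy on well-separated classes
```

    In the real system each ROI vote would then be aggregated over the breast to produce a whole-image BI-RADS density estimate.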

  16. Support vector machine and principal component analysis for microarray data classification

    NASA Astrophysics Data System (ADS)

    Astuti, Widi; Adiwijaya

    2018-03-01

    Cancer is a leading cause of death worldwide, although a significant proportion of cases can be cured if detected early. In recent decades, microarray technology has taken an important role in the diagnosis of cancer. Using data mining techniques, microarray data classification can improve the accuracy of cancer diagnosis compared to traditional techniques. Microarray data are characterized by small sample sizes and very high dimensionality, which challenges researchers to provide classification solutions with high performance in both accuracy and running time. This research proposes the use of Principal Component Analysis (PCA) as a dimension reduction method along with a Support Vector Machine (SVM), optimized through its kernel function, as a classifier for microarray data. The proposed scheme was applied to seven data sets using 5-fold cross-validation and then evaluated and analyzed in terms of both accuracy and running time. The results show that the scheme obtained 100% accuracy on the Ovarian and Lung Cancer data when linear and cubic kernel functions were used. In terms of running time, PCA greatly reduced the running time for every data set.
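
    The PCA-then-SVM pipeline with 5-fold cross-validation can be sketched with scikit-learn. The abstract's microarray data sets are not bundled with the library, so a built-in high-dimensional dataset stands in; the component count and kernel are illustrative choices, not the paper's tuned settings.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in high-dimensional dataset (569 samples, 30 features).
X, y = load_breast_cancer(return_X_y=True)

# Scale, reduce dimension with PCA, then classify with a kernel SVM,
# scored with 5-fold cross-validation as in the abstract.
pipe = make_pipeline(StandardScaler(), PCA(n_components=10),
                     SVC(kernel="linear"))
scores = cross_val_score(pipe, X, y, cv=5)
print(round(scores.mean(), 3))
```

    Putting PCA inside the pipeline ensures the components are refit on each training fold, avoiding information leakage into the held-out fold.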

  17. Detailed Quantitative Classifications of Galaxy Morphology

    NASA Astrophysics Data System (ADS)

    Nair, Preethi

    2018-01-01

    Understanding the physical processes responsible for the growth of galaxies is one of the key challenges in extragalactic astronomy. The assembly history of a galaxy is imprinted in its detailed morphology. The bulge-to-total ratio of galaxies and the presence or absence of bars, rings, spiral arms, tidal tails, etc., all have implications for the past merger, star formation, and feedback history of a galaxy. However, current quantitative galaxy classification schemes are only useful for broad binning; they cannot classify or exploit the wide variety of galaxy structures seen in nature. Therefore, comparisons of observations with theoretical predictions of secular structure formation have only been conducted on small samples of visually classified galaxies, whereas large samples are needed to disentangle the complex physical processes of galaxy formation. With the advent of large surveys, like the Sloan Digital Sky Survey (SDSS) and the upcoming Large Synoptic Survey Telescope (LSST) and WFIRST, the problem of statistics will be resolved. However, the need for a robust quantitative classification scheme will remain. Here I present early results on promising machine learning algorithms that provide detailed classifications, identifying bars, rings, multi-armed spiral galaxies, and Hubble type.

  18. Emotion recognition based on physiological changes in music listening.

    PubMed

    Kim, Jonghwa; André, Elisabeth

    2008-12-01

    Little attention has been paid so far to physiological signals for emotion recognition compared to audiovisual emotion channels such as facial expression or speech. This paper investigates the potential of physiological signals as reliable channels for emotion recognition. All essential stages of an automatic recognition system are discussed, from the recording of a physiological dataset to a feature-based multiclass classification. In order to collect a physiological dataset from multiple subjects over many weeks, we used a musical induction method which spontaneously leads subjects to real emotional states, without any deliberate lab setting. Four-channel biosensors were used to measure electromyogram, electrocardiogram, skin conductivity and respiration changes. A wide range of physiological features from various analysis domains, including time/frequency, entropy, geometric analysis, subband spectra, multiscale entropy, etc., is proposed in order to find the best emotion-relevant features and to correlate them with emotional states. The best features extracted are specified in detail and their effectiveness is proven by classification results. Classification of four musical emotions (positive/high arousal, negative/high arousal, negative/low arousal, positive/low arousal) is performed by using an extended linear discriminant analysis (pLDA). Furthermore, by exploiting a dichotomic property of the 2D emotion model, we develop a novel scheme of emotion-specific multilevel dichotomous classification (EMDC) and compare its performance with direct multiclass classification using the pLDA. Improved recognition accuracy of 95% and 70% for subject-dependent and subject-independent classification, respectively, is achieved by using the EMDC scheme.
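
    The dichotomous idea behind EMDC (first split on one emotion axis, then classify within each branch) can be sketched with plain LDA on synthetic 2D arousal/valence clusters; the paper's pLDA extension and real physiological features are replaced here by illustrative stand-ins.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis as LDA

rng = np.random.default_rng(1)

# Hypothetical 2-feature data for the four emotion classes laid out on the
# 2D emotion model: (arousal, valence) cluster centers.
centers = {0: (1, 1), 1: (1, -1), 2: (-1, -1), 3: (-1, 1)}
X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in centers.values()])
y = np.repeat([0, 1, 2, 3], 50)

# Stage 1: high vs low arousal (classes 0,1 are high arousal).
# Stage 2: separate valence within each arousal branch.
arousal = (y <= 1).astype(int)
stage1 = LDA().fit(X, arousal)
stage2 = {a: LDA().fit(X[arousal == a], y[arousal == a]) for a in (0, 1)}

def predict(x):
    a = int(stage1.predict(x.reshape(1, -1))[0])
    return stage2[a].predict(x.reshape(1, -1))[0]

preds = np.array([predict(x) for x in X])
print(round(float((preds == y).mean()), 2))
```

    Each binary split faces an easier decision boundary than a direct 4-class classifier, which is the intuition the abstract exploits with its multilevel scheme.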

  19. Identification and mapping of natural vegetation on a coastal site using a Worldview-2 satellite image.

    PubMed

    Rapinel, Sébastien; Clément, Bernard; Magnanon, Sylvie; Sellin, Vanessa; Hubert-Moy, Laurence

    2014-11-01

    Identification and mapping of natural vegetation are major issues for biodiversity management and conservation. Remotely sensed data with very high spatial resolution are currently used to study vegetation, but most satellite sensors are limited to four spectral bands, which is insufficient to identify some natural vegetation formations. The study objectives are to discriminate natural vegetation and identify natural vegetation formations using a Worldview-2 satellite image. The classification of the Worldview-2 image and ancillary thematic data was performed using a hybrid pixel-based and object-oriented approach. A hierarchical scheme with three levels was implemented, from land cover at field scale to vegetation formation. This method was applied to a 48 km² site on the French Atlantic coast which includes a classified NATURA 2000 dune and marsh system. The classification accuracy was very high, with Kappa indices of 0.90 and 0.74 at the land cover and vegetation formation levels, respectively. These results show that Worldview-2 images are suitable for identifying natural vegetation. Vegetation maps derived from Worldview-2 images are more detailed than existing ones and provide a useful medium for the environmental management of vulnerable areas. The approach used to map natural vegetation is reproducible for wider application by environmental managers. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. A Pareto-based Ensemble with Feature and Instance Selection for Learning from Multi-Class Imbalanced Datasets.

    PubMed

    Fernández, Alberto; Carmona, Cristobal José; José Del Jesus, María; Herrera, Francisco

    2017-09-01

    Imbalanced classification refers to problems with an uneven distribution of instances among classes. Correct modeling becomes harder still when instances are located in overlapping regions of feature space. Current solutions for both issues often focus on the binary case, as multi-class datasets require additional effort to address. In this research, we overcome these problems by combining feature and instance selection. Feature selection simplifies the overlapping areas, easing the generation of rules to distinguish among the classes, while selecting instances from all classes addresses the imbalance itself by finding the most appropriate class distribution for the learning task and possibly removing noise and difficult borderline examples. To obtain an optimal joint set of features and instances, we embedded the search for both in a Multi-Objective Evolutionary Algorithm, using the C4.5 decision tree as the baseline classifier in this wrapper approach. The multi-objective scheme offers a double advantage: the search space becomes broader, and it can provide a set of different solutions from which to build an ensemble of classifiers. This proposal has been contrasted with several state-of-the-art solutions for imbalanced classification, showing excellent results in both binary and multi-class problems.

  1. A prototype of mammography CADx scheme integrated to imaging quality evaluation techniques

    NASA Astrophysics Data System (ADS)

    Schiabel, Homero; Matheus, Bruno R. N.; Angelo, Michele F.; Patrocínio, Ana Claudia; Ventura, Liliane

    2011-03-01

    As all women over the age of 40 are advised to undergo mammographic exams every two years, the demands on radiologists to evaluate mammographic images in short periods of time have increased considerably. As tools to improve quality and accelerate analysis, CADe/Dx (computer-aided detection/diagnosis) schemes have been investigated, but very few complete CADe/Dx schemes have been developed, and most are restricted to detection rather than diagnosis. The existing ones are usually tied to specific mammographic equipment (usually DR), which makes them very expensive. This paper therefore describes a prototype of a complete mammography CADx scheme, developed by our research group, integrated with an imaging quality evaluation process. The basic structure consists of pre-processing modules based on image acquisition and digitization procedures (FFDM, CR, or film + scanner), a segmentation tool to detect clustered microcalcifications and suspect masses, and a classification scheme that evaluates both the presence of microcalcification clusters and possibly malignant masses based on their contours. The aim is to provide not only information on the detected structures but also a pre-report with a BI-RADS classification. At this time the system still lacks an interface integrating all the modules. Despite this, it is functional as a prototype for clinical practice testing, with results comparable to others reported in the literature.

  2. Divorcing Strain Classification from Species Names.

    PubMed

    Baltrus, David A

    2016-06-01

    Confusion about strain classification and nomenclature permeates modern microbiology. Although taxonomists have traditionally acted as gatekeepers of order, the numbers of, and speed at which, new strains are identified has outpaced the opportunity for professional classification for many lineages. Furthermore, the growth of bioinformatics and database-fueled investigations have placed metadata curation in the hands of researchers with little taxonomic experience. Here I describe practical challenges facing modern microbial taxonomy, provide an overview of complexities of classification for environmentally ubiquitous taxa like Pseudomonas syringae, and emphasize that classification can be independent of nomenclature. A move toward implementation of relational classification schemes based on inherent properties of whole genomes could provide sorely needed continuity in how strains are referenced across manuscripts and data sets. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. The 2016 revision of the WHO Classification of Central Nervous System Tumours: retrospective application to a cohort of diffuse gliomas.

    PubMed

    Rogers, Te Whiti; Toor, Gurvinder; Drummond, Katharine; Love, Craig; Field, Kathryn; Asher, Rebecca; Tsui, Alpha; Buckland, Michael; Gonzales, Michael

    2018-03-01

    The classification of central nervous system tumours has more recently been shaped by a focus on molecular pathology rather than histopathology. We re-classified 82 glial tumours according to the molecular-genetic criteria of the 2016 revision of the World Health Organization (WHO) Classification of Tumours of the Central Nervous System. Initial diagnoses and grading were based on the morphological criteria of the 2007 WHO scheme. Because of the impression of an oligodendroglial component on initial histological assessment, each tumour was tested for co-deletion of chromosomes 1p and 19q and mutations of isocitrate dehydrogenase (IDH-1 and 2) genes. Additionally, expression of proteins encoded by alpha-thalassemia X-linked mental retardation (ATRX) and TP53 genes was assessed by immunohistochemistry. We found that all but two tumours could be assigned to a specific category in the 2016 revision. The most common change in diagnosis was from oligoastrocytoma to specifically astrocytoma or oligodendroglioma. Analysis of progression free survival (PFS) for WHO grade II and III tumours showed that the objective criteria of the 2016 revision separated diffuse gliomas into three distinct molecular categories: chromosome 1p/19q co-deleted/IDH mutant, intact 1p/19q/IDH mutant and IDH wild type. No significant difference in PFS was found when comparing IDH mutant grade II and III tumours suggesting that IDH status is more informative than tumour grade. The segregation into distinct molecular sub-types that is achieved by the 2016 revision provides an objective evidence base for managing patients with grade II and III diffuse gliomas based on prognosis.

  4. Land Cover Analysis by Using Pixel-Based and Object-Based Image Classification Method in Bogor

    NASA Astrophysics Data System (ADS)

    Amalisana, Birohmatin; Rokhmatullah; Hernina, Revi

    2017-12-01

    The advantage of image classification is that it provides information about the earth's surface, such as land cover and its time-series changes. Nowadays, pixel-based image classification is commonly performed with a variety of algorithms, such as minimum distance, parallelepiped, maximum likelihood, and Mahalanobis distance. On the other hand, land cover classification can also be acquired using object-based image classification, which uses image segmentation driven by parameters such as scale, shape, colour, smoothness, and compactness. This research aims to compare the land cover classification results, and the change detection derived from them, between the parallelepiped pixel-based method and an object-based classification method. The study area is Bogor, observed over a 20-year period from 1996 to 2016. This region is known as an urban area undergoing continuous change due to rapid development, which makes its time-series land cover information particularly valuable.

  5. Maxillectomy defects: a suggested classification scheme.

    PubMed

    Akinmoladun, V I; Dosumu, O O; Olusanya, A A; Ikusika, O F

    2013-06-01

    The term "maxillectomy" has been used to describe a variety of surgical procedures for a spectrum of diseases involving a diverse anatomical site. Hence, classifications of maxillectomy defects have often made communication difficult. This article highlights this problem, emphasises the need for a uniform system of classification and suggests a classification system which is simple and comprehensive. Articles related to this subject, especially those with specified classifications of maxillary surgical defects were sourced from the internet through Google, Scopus and PubMed using the search terms maxillectomy defects classification. A manual search through available literature was also done. The review of the materials revealed many classifications and modifications of classifications from the descriptive, reconstructive and prosthodontic perspectives. No globally acceptable classification exists among practitioners involved in the management of diseases in the mid-facial region. There were over 14 classifications of maxillary defects found in the English literature. Attempts made to address the inadequacies of previous classifications have tended to result in cumbersome and relatively complex classifications. A single classification that is based on both surgical and prosthetic considerations is most desirable and is hereby proposed.

  6. The use of morphological characteristics and texture analysis in the identification of tissue composition in prostatic neoplasia.

    PubMed

    Diamond, James; Anderson, Neil H; Bartels, Peter H; Montironi, Rodolfo; Hamilton, Peter W

    2004-09-01

    Quantitative examination of prostate histology offers clues in the diagnostic classification of lesions and in the prediction of response to treatment and prognosis. To facilitate the collection of quantitative data, the development of machine vision systems is necessary. This study explored the use of imaging for identifying tissue abnormalities in prostate histology. Medium-power histological scenes were recorded from whole-mount radical prostatectomy sections at ×40 objective magnification and assessed by a pathologist as exhibiting stroma, normal tissue (non-neoplastic epithelial component), or prostatic carcinoma (PCa). A machine vision system was developed that divided the scenes into subregions of 100 × 100 pixels and subjected each to image-processing techniques. Analysis of morphological characteristics allowed the identification of normal tissue. Analysis of image texture demonstrated that Haralick feature 4 was the most suitable for discriminating stroma from PCa. Using these morphological and texture measurements, it was possible to define a classification scheme for each subregion. The machine vision system is designed to integrate these classification rules and generate digital maps of tissue composition from the classification of subregions; 79.3% of subregions were correctly classified. Established classification rates have demonstrated the validity of the methodology on small scenes; a logical extension was to apply the methodology to whole-slide images via scanning technology. The machine vision system is capable of classifying these images. The machine vision system developed in this project facilitates the exploration of morphological and texture characteristics in quantifying tissue composition. It also illustrates the potential of quantitative methods to provide highly discriminatory information in the automated identification of prostatic lesions using computer vision.
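
    Haralick texture features are computed from a gray-level co-occurrence matrix (GLCM). The numpy sketch below builds the GLCM for a single pixel offset and computes the contrast feature on two toy patches; contrast is one of Haralick's features, chosen here for simplicity, and is not necessarily the paper's "feature 4".

```python
import numpy as np

def glcm(img, levels, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for one pixel offset."""
    P = np.zeros((levels, levels))
    h, w = img.shape
    for i in range(h - dy):
        for j in range(w - dx):
            P[img[i, j], img[i + dy, j + dx]] += 1
    return P / P.sum()

def haralick_contrast(P):
    """Contrast: sum over (i - j)^2 * P[i, j], a Haralick-style feature."""
    idx = np.arange(P.shape[0])
    return float(((idx[:, None] - idx[None, :]) ** 2 * P).sum())

flat   = np.zeros((8, 8), dtype=int)          # uniform "stroma-like" patch
checks = np.indices((8, 8)).sum(0) % 2 * 3    # high-frequency checkerboard
print(haralick_contrast(glcm(flat, 4)))   # 0.0: no gray-level transitions
print(haralick_contrast(glcm(checks, 4))) # 9.0: every neighbor differs by 3
```

    Sliding this computation over 100 × 100 subregions yields one texture value per subregion, which is the kind of per-tile measurement the classification rules in the abstract operate on.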

  7. An assessment of commonly employed satellite-based remote sensors for mapping mangrove species in Mexico using an NDVI-based classification scheme.

    PubMed

    Valderrama-Landeros, L; Flores-de-Santiago, F; Kovacs, J M; Flores-Verdugo, F

    2017-12-14

    Optimizing the classification accuracy of a mangrove forest map is of utmost importance for conservation practitioners. Mangrove forest mapping using satellite-based remote sensing techniques is by far the most common method of classification currently used, given the logistical difficulties of fieldwork in these forested wetlands. However, there is now an abundance of satellite sensors from which to choose, which has led to substantially different estimates of mangrove forest location and extent, with particular concern for degraded systems. The objective of this study was to assess the accuracy of mangrove forest classification using different remotely sensed data sources (i.e., Landsat-8, SPOT-5, Sentinel-2, and WorldView-2) for a system located along the Pacific coast of Mexico. Specifically, we examined a stressed semiarid mangrove forest that offers a variety of conditions, such as dead areas, degraded stands, healthy mangroves, and very dense mangrove island formations. The results indicated that Landsat-8 (30 m per pixel) had the lowest overall accuracy at 64% and WorldView-2 (1.6 m per pixel) the highest at 93%, while the SPOT-5 and Sentinel-2 classifications (10 m per pixel) were very similar, with accuracies of 75 and 78%, respectively. In comparison to WorldView-2, the other sensors overestimated the extent of Laguncularia racemosa and underestimated the extent of Rhizophora mangle. Higher spatial resolution can be particularly important in mapping the small mangrove islands that often occur in degraded mangrove systems.

  8. Mapping forest types in Worcester County, Maryland, using LANDSAT data

    NASA Technical Reports Server (NTRS)

    Burtis, J., Jr.; Witt, R. G.

    1981-01-01

    The feasibility of mapping Level 2 forest cover types for a county-sized area on Maryland's Eastern Shore was demonstrated. A Level 1 land use/land cover classification was carried out for all of Worcester County as well. A June 1978 LANDSAT scene was utilized in a classification which employed two software packages on different computers (IDIMS on an HP 3000 and ASTEP-II on a Univac 1108). A twelve category classification scheme was devised for the study area. Resulting products include black and white line printer maps, final color coded classification maps, digitally enhanced color imagery and tabulated acreage statistics for all land use and land cover types.

  9. Generalized Rainich conditions, generalized stress-energy conditions, and the Hawking-Ellis classification

    NASA Astrophysics Data System (ADS)

    Martín-Moruno, Prado; Visser, Matt

    2017-11-01

    The (generalized) Rainich conditions are algebraic conditions which are polynomial in the (mixed-component) stress-energy tensor. As such they are logically distinct from the usual classical energy conditions (NEC, WEC, SEC, DEC), and logically distinct from the usual Hawking-Ellis (Segré-Plebański) classification of stress-energy tensors (type I, type II, type III, type IV). There will of course be significant inter-connections between these classification schemes, which we explore in the current article. Overall, we shall argue that it is best to view the (generalized) Rainich conditions as a refinement of the classical energy conditions and the usual Hawking-Ellis classification.

  10. Computerized quantitative evaluation of mammographic accreditation phantom images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Yongbum; Tsai, Du-Yih; Shinohara, Norimitsu

    2010-12-15

    Purpose: The objective was to develop and investigate an automated scoring scheme for American College of Radiology (ACR) mammographic accreditation phantom (RMI 156, Middleton, WI) images. Methods: The developed method consisted of background subtraction, determination of regions of interest, classification of fiber and mass objects by Mahalanobis distance, detection of specks by template matching, and rule-based scoring. Fifty-one phantom images were collected from 51 facilities for this study (one facility provided one image). A medical physicist and two radiologic technologists also scored the images, and the human and computerized scores were compared. Results: In terms of meeting the ACR's criteria, the accuracies of the developed method for computerized evaluation of fiber, mass, and speck objects were 90%, 80%, and 98%, respectively. Contingency table analysis revealed a significant association between observer and computer scores for microcalcifications (p<5%) but not for masses and fibers. Conclusions: The developed method may achieve a stable assessment of the visibility of test objects in mammographic accreditation phantom images with respect to whether an image meets the ACR's criteria, although there is room for improvement in the approach for fiber and mass objects.
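
    Classification by Mahalanobis distance, as used here for the fiber and mass objects, amounts to assigning each candidate to the class whose feature distribution it lies closest to. A minimal sketch with hypothetical two-feature class statistics (the feature names and numbers are illustrative, not the paper's):

```python
import numpy as np

def mahalanobis_label(x, class_stats):
    """Assign x to the class with the smallest Mahalanobis distance.

    class_stats: dict mapping label -> (mean vector, covariance matrix).
    """
    best, best_d = None, np.inf
    for label, (mu, cov) in class_stats.items():
        d = np.sqrt((x - mu) @ np.linalg.inv(cov) @ (x - mu))
        if d < best_d:
            best, best_d = label, d
    return best

# Hypothetical 2-feature statistics (e.g. object length, mean contrast)
# for the two phantom object types separated by this distance.
stats = {
    "fiber": (np.array([8.0, 0.2]), np.diag([4.0, 0.01])),
    "mass":  (np.array([3.0, 0.6]), np.diag([1.0, 0.04])),
}
print(mahalanobis_label(np.array([7.0, 0.25]), stats))  # → fiber
```

    Unlike Euclidean distance, the Mahalanobis form rescales each feature by its class variance, so a feature measured in small units does not dominate one measured in large units.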

  11. 14 CFR Section 16 - Objective Classification-Discontinued Operations

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Objective Classification-Discontinued Operations Section 16 Section 16 Aeronautics and Space OFFICE OF THE SECRETARY, DEPARTMENT OF TRANSPORTATION... AIR CARRIERS Profit and Loss Classification Section 16 Objective Classification—Discontinued...

  12. The Importance of Temporal and Spatial Vegetation Structure Information in Biotope Mapping Schemes: A Case Study in Helsingborg, Sweden

    NASA Astrophysics Data System (ADS)

    Gao, Tian; Qiu, Ling; Hammer, Mårten; Gunnarsson, Allan

    2012-02-01

    Temporal and spatial vegetation structure has an impact on biodiversity qualities, yet current biotope mapping schemes incorporate these factors only to a limited extent. The purpose of this study is to evaluate the application of a modified biotope mapping scheme that includes temporal and spatial vegetation structure. A refined scheme was developed based on a biotope classification and applied to a green structure system in Helsingborg city in southern Sweden. It includes four parameters of vegetation structure: continuity of forest cover, age of dominant trees, horizontal structure, and vertical structure. The major green structure sites were determined by interpretation of panchromatic aerial photographs assisted by a field survey, and a set of biotope maps was constructed for each level of the modified classification. The evaluation of the scheme addressed two aspects in particular: comparison of species richness between long-continuity and short-continuity forests, based on identification of woodland continuity using ancient woodland indicator (AWI) species and related historical documents, and the spatial distribution of animals in the green space in relation to vegetation structure. The results indicate that (1) as verified against historical documents, the richness of AWI species was higher in long-continuity forests; Simpson's diversity differed significantly between long- and short-continuity forests, and total species richness and Shannon's diversity were markedly higher in long-continuity forests, a very significant difference; and (2) the spatial structure and age of stands influence the richness and abundance of avian fauna and rabbits, with distance to the nearest tree and shrub a strong determinant of the presence of these animal groups. It is concluded that continuity of forest cover, age of dominant trees, and the horizontal and vertical structure of vegetation should now be included in urban biotope classifications.

  13. A Hybrid Semi-supervised Classification Scheme for Mining Multisource Geospatial Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vatsavai, Raju; Bhaduri, Budhendra L

    2011-01-01

    Supervised learning methods such as Maximum Likelihood (ML) are often used in land cover (thematic) classification of remote sensing imagery. The ML classifier relies exclusively on spectral characteristics of thematic classes whose statistical distributions (class-conditional probability densities) are often overlapping. The spectral response distributions of thematic classes depend on many factors including elevation, soil types, and ecological zones. A second problem with statistical classifiers is the requirement of a large number of accurate training samples (10 to 30 |dimensions|), which are often costly and time consuming to acquire over large geographic regions. With the increasing availability of geospatial databases, it is possible to exploit the knowledge derived from these ancillary datasets to improve classification accuracies even when the class distributions are highly overlapping. Likewise, newer semi-supervised techniques can be adopted to improve the parameter estimates of the statistical model by utilizing a large number of easily available unlabeled training samples. Unfortunately, there is no convenient multivariate statistical model that can be employed for multisource geospatial databases. In this paper we present a hybrid semi-supervised learning algorithm that effectively exploits freely available unlabeled training samples from multispectral remote sensing images and also incorporates ancillary geospatial databases. We have conducted several experiments on real datasets, and our new hybrid approach shows a 25 to 35% improvement in overall classification accuracy over conventional classification schemes.
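The core semi-supervised idea described above, improving the parameter estimates of a statistical class model with unlabeled samples, can be sketched with a toy one-dimensional EM over two Gaussian classes. This is a minimal illustration of the general technique, not the authors' hybrid multisource algorithm; the class names and sample values are invented.

```python
import math

def gauss_pdf(x, mu, sigma):
    """Gaussian density, used as the class-conditional model."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def semi_supervised_em(labeled, unlabeled, sigma=1.0, iters=20):
    """Refine per-class means with EM: labeled points keep their hard labels,
    while unlabeled points contribute soft, posterior-weighted counts."""
    # Initialise each class mean from the labeled samples only.
    mus = {c: sum(xs) / len(xs) for c, xs in labeled.items()}
    classes = sorted(mus)
    for _ in range(iters):
        num = {c: sum(labeled[c]) for c in classes}
        den = {c: float(len(labeled[c])) for c in classes}
        for x in unlabeled:
            # E-step: posterior responsibility of each class for x.
            p = {c: gauss_pdf(x, mus[c], sigma) for c in classes}
            z = sum(p.values())
            for c in classes:
                w = p[c] / z
                num[c] += w * x
                den[c] += w
        # M-step: posterior-weighted mean update.
        mus = {c: num[c] / den[c] for c in classes}
    return mus
```

With a couple of labeled samples per class (e.g. `{'water': [0.1, 0.3], 'forest': [4.0, 4.2]}`) and a pool of unlabeled values, the refined means move toward the combined labeled-plus-unlabeled evidence, which is exactly the effect the paper exploits at larger scale.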

  14. A novel transferable individual tree crown delineation model based on Fishing Net Dragging and boundary classification

    NASA Astrophysics Data System (ADS)

    Liu, Tao; Im, Jungho; Quackenbush, Lindi J.

    2015-12-01

    This study provides a novel approach to individual tree crown delineation (ITCD) using airborne Light Detection and Ranging (LiDAR) data in dense natural forests using two main steps: crown boundary refinement based on a proposed Fishing Net Dragging (FiND) method, and segment merging based on boundary classification. FiND starts with approximate tree crown boundaries derived using a traditional watershed method with Gaussian filtering and refines these boundaries using an algorithm that mimics how a fisherman drags a fishing net. Random forest machine learning is then used to classify boundary segments into two classes: boundaries between trees and boundaries between branches that belong to a single tree. Three groups of LiDAR-derived features (two from the pseudo waveform generated along the crown boundaries and one from a canopy height model (CHM)) were used in the classification. The proposed ITCD approach was tested using LiDAR data collected over a mountainous region in the Adirondack Park, NY, USA. Overall accuracy of boundary classification was 82.4%. Features derived from the CHM were generally more important in the classification than the features extracted from the pseudo waveform. A comprehensive accuracy assessment scheme for ITCD was also introduced that considers both the area of crown overlap and crown centroids. Accuracy assessment using this new scheme shows that the proposed ITCD approach achieved overall accuracies of 74% and 78% for deciduous and mixed forests, respectively.

  15. A Two-Step Classification Approach to Distinguishing Similar Objects in Mobile LIDAR Point Clouds

    NASA Astrophysics Data System (ADS)

    He, H.; Khoshelham, K.; Fraser, C.

    2017-09-01

    Nowadays, lidar is widely used in cultural heritage documentation, urban modeling, and driverless car technology for its fast and accurate 3D scanning ability. However, full exploitation of the potential of point cloud data for efficient and automatic object recognition remains elusive. Recently, feature-based methods have become very popular in object recognition on account of their good performance in capturing object details. Compared with global features describing the whole shape of the object, local features recording the fractional details are more discriminative and are applicable for object classes with considerable similarity. In this paper, we propose a two-step classification approach based on point feature histograms and the bag-of-features method for automatic recognition of similar objects in mobile lidar point clouds. Lamp posts, street lights and traffic signs are grouped into one category in the first-step classification because of their mutual similarity, compared with trees and vehicles. A finer classification of lamp posts, street lights and traffic signs, based on the result of the first-step classification, is implemented in the second step. The proposed two-step classification approach is shown to yield a considerable improvement over the conventional one-step classification approach.
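A minimal sketch of the two-step idea follows, with a nearest-centroid classifier standing in for the paper's point-feature-histogram / bag-of-features pipeline. The 2-D feature vectors (roughly "height" and "width") and the centroid values are invented for illustration.

```python
import math

def nearest(x, centroids):
    """Return the label whose centroid is closest in Euclidean distance."""
    return min(centroids, key=lambda c: math.dist(x, centroids[c]))

def two_step_classify(x, coarse_centroids, fine_centroids):
    """Step 1: coarse label (pole-like vs. tree vs. vehicle).
    Step 2: only pole-like objects are refined into
    lamp post / street light / traffic sign."""
    label = nearest(x, coarse_centroids)
    if label == "pole-like":
        label = nearest(x, fine_centroids)
    return label

# Hypothetical class centroids in a toy (height, width) feature space.
COARSE = {"pole-like": [8.0, 0.5], "tree": [10.0, 6.0], "vehicle": [2.0, 2.0]}
FINE = {"lamp post": [9.0, 0.4], "street light": [7.0, 0.6], "traffic sign": [3.0, 0.8]}
```

The design point is that the second-step classifier only ever sees objects already judged pole-like, so its decision boundaries can be tuned to the subtle differences among the three similar classes.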

  16. Hologram representation of design data in an expert system knowledge base

    NASA Technical Reports Server (NTRS)

    Shiva, S. G.; Klon, Peter F.

    1988-01-01

    A novel representational scheme for design object descriptions is presented. An abstract notion of modules and signals is developed as a conceptual foundation for the scheme. This abstraction relates the objects to the meaning of system descriptions. Anchored on this abstraction, a representational model which incorporates dynamic semantics for these objects is presented. This representational model is called a hologram scheme since it represents dual level information, namely, structural and semantic. The benefits of this scheme are presented.

  17. The classification of phobic disorders.

    PubMed

    Sheehan, D V; Sheehan, K H

    The history of classification of phobic disorders is reviewed. Problems in the ability of current classification schemes to predict, control and describe the relationship between the symptoms and other phenomena are outlined. A new classification of phobic disorders is proposed based on the presence or absence of an endogenous anxiety syndrome with the phobias. The two categories of phobic disorder have a different clinical presentation and course, a different mean age of onset, distribution of age of onset, sex distribution, response to treatment modalities, GSR testing and habituation response. Empirical evidence supporting this proposal is cited. This classification has heuristic merit in guiding research efforts and discussions and in directing the clinician to a simple and practical solution of his patient's phobic disorder.

  18. A new brain-computer interface design using fuzzy ARTMAP.

    PubMed

    Palaniappan, Ramaswamy; Paramesran, Raveendran; Nishida, Shogo; Saiwaki, Naoki

    2002-09-01

    This paper proposes a new brain-computer interface (BCI) design using a fuzzy ARTMAP (FA) neural network, as well as an application of the design. The objective of this BCI-FA design is to classify the best three of the five available mental tasks for each subject using power spectral density (PSD) values of electroencephalogram (EEG) signals. These PSD values are extracted using the Wiener-Khinchine and autoregressive methods. Ten experiments employing different triplets of mental tasks are studied for each subject. The findings show that the average BCI-FA outputs for four subjects gave less than 6% error using the best triplets of mental tasks identified from the classification performances of FA. This implies that the BCI-FA can be successfully used with a tri-state switching device. As an application, a proposed tri-state Morse code scheme could be utilized to translate the outputs of this BCI-FA design into English letters. In this scheme, the three BCI-FA outputs correspond to a dot and a dash, the two basic Morse code symbols, and a space to denote the end (or beginning) of a dot or a dash. The construction of English letters using this tri-state Morse code scheme is determined only by the sequence of mental tasks and is independent of the time duration of each mental task. This is especially useful for constructing letters that are represented as multiple dots or dashes. This combination of the BCI-FA design and the tri-state Morse code scheme could be developed into a communication system for paralyzed patients.
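The tri-state decoding could be sketched as below. The mapping of classifier outputs to symbols (0 = dot, 1 = dash, 2 = space) and the use of the space output as a letter delimiter are simplifying assumptions made for illustration, not the paper's exact protocol.

```python
# Small excerpt of the international Morse code table.
MORSE = {".-": "A", "-...": "B", "-.-.": "C", ".": "E", "..": "I",
         "---": "O", "...": "S", "-": "T"}

def decode(states):
    """Translate a sequence of tri-state BCI outputs into letters.
    states: iterable of 0 (dot), 1 (dash), 2 (separator).
    Note the result depends only on the sequence of states, never on
    how long each mental task was sustained."""
    letters, current = [], ""
    for s in states:
        if s == 0:
            current += "."
        elif s == 1:
            current += "-"
        else:  # separator: close the current letter, if any
            if current:
                letters.append(MORSE.get(current, "?"))
                current = ""
    if current:  # flush the trailing letter
        letters.append(MORSE.get(current, "?"))
    return "".join(letters)
```

For example, the state stream dot-dot-dot, space, dash-dash-dash, space, dot-dot-dot decodes to "SOS".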

  19. A classification of open Gaussian dynamics

    NASA Astrophysics Data System (ADS)

    Grimmer, Daniel; Brown, Eric; Kempf, Achim; Mann, Robert B.; Martín-Martínez, Eduardo

    2018-06-01

    We introduce a classification scheme for the generators of bosonic open Gaussian dynamics, providing an instructive diagrammatic description of each type of dynamics. Using this classification, we discuss the consequences of imposing complete positivity on Gaussian dynamics. In particular, we show that non-symplectic operations must be active to allow for complete positivity. In addition, non-symplectic operations can, in fact, conserve the volume of phase space only if the restriction of complete positivity is lifted. We then discuss the implications for the relationship between information and energy flows in open quantum mechanics.

  20. Contemplating case mix: A primer on case mix classification and management.

    PubMed

    Costa, Andrew P; Poss, Jeffery W; McKillop, Ian

    2015-01-01

    Case mix classifications are the frameworks that underlie many healthcare funding schemes, including so-called activity-based funding. Now more than ever, Canadian healthcare administrators are evaluating case mix-based funding and deciphering how it will influence their organizations. Case mix is a topic fraught with technical jargon and largely relegated to government agencies or private industries. This article provides an abridged review of case mix classification as well as its implications for management in healthcare. © 2015 The Canadian College of Health Leaders.

  1. A three-parameter asteroid taxonomy

    NASA Technical Reports Server (NTRS)

    Tedesco, Edward F.; Williams, James G.; Matson, Dennis L.; Veeder, Glenn J.; Gradie, Jonathan C.

    1989-01-01

    Broadband U, V, and x photometry together with IRAS asteroid albedos have been used to construct an asteroid classification system. The system is based on three parameters (U-V and v-x color indices and visual geometric albedo), and it is able to place 96 percent of the present sample of 357 asteroids into 11 taxonomic classes. It is noted that all but one of these classes are analogous to those previously found using other classification schemes. The algorithm is shown to account for the observational uncertainties in each of the classification parameters.

  2. 14 CFR Section 17 - Objective Classification-Extraordinary Items

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Objective Classification-Extraordinary Items Section 17 Section 17 Aeronautics and Space OFFICE OF THE SECRETARY, DEPARTMENT OF TRANSPORTATION... AIR CARRIERS Profit and Loss Classification Section 17 Objective Classification—Extraordinary Items...

  3. CANDELS Visual Classifications: Scheme, Data Release, and First Results

    NASA Technical Reports Server (NTRS)

    Kartaltepe, Jeyhan S.; Mozena, Mark; Kocevski, Dale; McIntosh, Daniel H.; Lotz, Jennifer; Bell, Eric F.; Faber, Sandy; Ferguson, Henry; Koo, David; Bassett, Robert; hide

    2014-01-01

    We have undertaken an ambitious program to visually classify all galaxies in the five CANDELS fields down to H <24.5 involving the dedicated efforts of 65 individual classifiers. Once completed, we expect to have detailed morphological classifications for over 50,000 galaxies spanning 0 < z < 4 over all the fields. Here, we present our detailed visual classification scheme, which was designed to cover a wide range of CANDELS science goals. This scheme includes the basic Hubble sequence types, but also includes a detailed look at mergers and interactions, the clumpiness of galaxies, k-corrections, and a variety of other structural properties. In this paper, we focus on the first field to be completed - GOODS-S, which has been classified at various depths. The wide area coverage spanning the full field (wide+deep+ERS) includes 7634 galaxies that have been classified by at least three different people. In the deep area of the field, 2534 galaxies have been classified by at least five different people at three different depths. With this paper, we release to the public all of the visual classifications in GOODS-S along with the Perl/Tk GUI that we developed to classify galaxies. We present our initial results here, including an analysis of our internal consistency and comparisons among multiple classifiers as well as a comparison to the Sersic index. We find that the level of agreement among classifiers is quite good and depends on both the galaxy magnitude and the galaxy type, with disks showing the highest level of agreement and irregulars the lowest. A comparison of our classifications with the Sersic index and restframe colors shows a clear separation between disk and spheroid populations. Finally, we explore morphological k-corrections between the V-band and H-band observations and find that a small fraction (84 galaxies in total) are classified as being very different between these two bands. 
These galaxies typically have very clumpy and extended morphology or are very faint in the V-band.

  4. Site classification of Indian strong motion network using response spectra ratios

    NASA Astrophysics Data System (ADS)

    Chopra, Sumer; Kumar, Vikas; Choudhury, Pallabee; Yadav, R. B. S.

    2018-03-01

    In the present study, we attempted to classify the Indian strong motion sites spread all over the Himalaya and adjoining region, located on varied geological formations, based on response spectral ratios. A total of 90 sites were classified based on 395 strong motion records from 94 earthquakes recorded at these sites. The magnitudes of these earthquakes lie between 2.3 and 7.7, and the hypocentral distance in most cases is less than 50 km. The predominant period obtained from the response spectral ratios is used to classify these sites. It was found that the shapes and predominant peaks of the spectra at these sites match those in Japan, Italy, Iran, and at some sites in Europe, and that the same classification scheme can be applied to the Indian strong motion network. We found that earlier schemes based on descriptions of near-surface geology, geomorphology, and topography were not able to capture the effect of sediment thickness. The sites are classified into seven classes (CL-I to CL-VII) with varying predominant periods and ranges, as proposed by Alessandro et al. (Bull Seismol Soc Am 102:680-695, 2012). The effects of magnitude and hypocentral distance on the shape and predominant peaks were also studied and found to be very small. The classification scheme is robust and cost-effective and can be used in region-specific attenuation relationships to account for local site effects.
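The classification step reduces to picking the period at which the spectral ratio peaks and binning it. A minimal sketch is below; the period bins are hypothetical placeholders, since the actual CL-I to CL-VII boundaries of Alessandro et al. (2012) are not reproduced here.

```python
def predominant_period(periods, ratios):
    """Period at which the response spectral ratio peaks."""
    i = max(range(len(ratios)), key=ratios.__getitem__)
    return periods[i]

# Hypothetical upper period bounds (seconds) for illustration only;
# the real class boundaries come from Alessandro et al. (2012).
BINS = [(0.2, "CL-I"), (0.4, "CL-II"), (0.6, "CL-III"),
        (0.8, "CL-IV"), (1.0, "CL-V"), (1.5, "CL-VI")]

def site_class(tp):
    """Assign a site class from the predominant period tp."""
    for upper, label in BINS:
        if tp < upper:
            return label
    return "CL-VII"
```

A site whose spectral ratio peaks at 0.3 s would fall in the second bin under these placeholder boundaries.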

  5. Reclassification: Rationale and Problems; Proceedings of a Conference on Reclassification held at the Center of Adult Education, University of Maryland, College Park, April 4 to 6, 1968.

    ERIC Educational Resources Information Center

    Perreault, Jean M., Ed.

    Several factors are involved in the decision to reclassify library collections and several problems and choices must be faced. The discussion of four classification schemes (Dewey Decimal, Library of Congress, Library of Congress subject-headings and Universal Decimal Classification) involved in the choices concerns their structure, currency,…

  6. Mandatory, Preferred, or Discretionary: How the Classification of Domestic Violence Warrantless Arrest Laws Impacts Their Estimated Effects on Intimate Partner Homicide

    ERIC Educational Resources Information Center

    Zeoli, April M.; Norris, Alexis; Brenner, Hannah

    2011-01-01

    Warrantless arrest laws for domestic violence (DV) are generally classified as discretionary, preferred, or mandatory, based on the level of power accorded to police in deciding whether to arrest. However, there is a lack of consensus in the literature regarding how each state's law should be categorized. Using three classification schemes, this…

  7. Formalizing Resources for Planning

    NASA Technical Reports Server (NTRS)

    Bedrax-Weiss, Tania; McGann, Conor; Ramakrishnan, Sailesh

    2003-01-01

    In this paper we present a classification scheme which circumscribes a large class of resources found in the real world. Building on the work of others we also define key properties of resources that allow formal expression of the proposed classification. Furthermore, operations that change the state of a resource are formalized. Together, properties and operations go a long way in formalizing the representation and reasoning aspects of resources for planning.

  8. CANDELS Visual Classifications: Scheme, Data Release, and First Results

    NASA Astrophysics Data System (ADS)

    Kartaltepe, Jeyhan S.; Mozena, Mark; Kocevski, Dale; McIntosh, Daniel H.; Lotz, Jennifer; Bell, Eric F.; Faber, Sandy; Ferguson, Harry; Koo, David; Bassett, Robert; Bernyk, Maksym; Blancato, Kirsten; Bournaud, Frederic; Cassata, Paolo; Castellano, Marco; Cheung, Edmond; Conselice, Christopher J.; Croton, Darren; Dahlen, Tomas; de Mello, Duilia F.; DeGroot, Laura; Donley, Jennifer; Guedes, Javiera; Grogin, Norman; Hathi, Nimish; Hilton, Matt; Hollon, Brett; Koekemoer, Anton; Liu, Nick; Lucas, Ray A.; Martig, Marie; McGrath, Elizabeth; McPartland, Conor; Mobasher, Bahram; Morlock, Alice; O'Leary, Erin; Peth, Mike; Pforr, Janine; Pillepich, Annalisa; Rosario, David; Soto, Emmaris; Straughn, Amber; Telford, Olivia; Sunnquist, Ben; Trump, Jonathan; Weiner, Benjamin; Wuyts, Stijn; Inami, Hanae; Kassin, Susan; Lani, Caterina; Poole, Gregory B.; Rizer, Zachary

    2015-11-01

    We have undertaken an ambitious program to visually classify all galaxies in the five CANDELS fields down to H < 24.5 involving the dedicated efforts of over 65 individual classifiers. Once completed, we expect to have detailed morphological classifications for over 50,000 galaxies spanning 0 < z < 4 over all the fields, with classifications from 3 to 5 independent classifiers for each galaxy. Here, we present our detailed visual classification scheme, which was designed to cover a wide range of CANDELS science goals. This scheme includes the basic Hubble sequence types, but also includes a detailed look at mergers and interactions, the clumpiness of galaxies, k-corrections, and a variety of other structural properties. In this paper, we focus on the first field to be completed—GOODS-S, which has been classified at various depths. The wide area coverage spanning the full field (wide+deep+ERS) includes 7634 galaxies that have been classified by at least three different people. In the deep area of the field, 2534 galaxies have been classified by at least five different people at three different depths. With this paper, we release to the public all of the visual classifications in GOODS-S along with the Perl/Tk GUI that we developed to classify galaxies. We present our initial results here, including an analysis of our internal consistency and comparisons among multiple classifiers as well as a comparison to the Sérsic index. We find that the level of agreement among classifiers is quite good (>70% across the full magnitude range) and depends on both the galaxy magnitude and the galaxy type, with disks showing the highest level of agreement (>50%) and irregulars the lowest (<10%). A comparison of our classifications with the Sérsic index and rest-frame colors shows a clear separation between disk and spheroid populations. 
Finally, we explore morphological k-corrections between the V-band and H-band observations and find that a small fraction (84 galaxies in total) are classified as being very different between these two bands. These galaxies typically have very clumpy and extended morphology or are very faint in the V-band.

  9. Object-Based Random Forest Classification of Land Cover from Remotely Sensed Imagery for Industrial and Mining Reclamation

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Luo, M.; Xu, L.; Zhou, X.; Ren, J.; Zhou, J.

    2018-04-01

    The RF method based on grid-search parameter optimization achieved a classification accuracy of 88.16 % in the classification of images with multiple feature variables. This accuracy was higher than that of SVM and ANN under the same feature variables. In terms of efficiency, the RF classification method also performs better than SVM and ANN and is more capable of handling multidimensional feature variables. The RF method combined with an object-based analysis approach improved the classification accuracy further. The multiresolution segmentation approach, with ESP-based scale parameter optimization, was used to obtain six scales for image segmentation; when the segmentation scale was 49, the classification accuracy reached its highest value of 89.58 %. The accuracy of object-based RF classification was thus 1.42 % higher than that of pixel-based classification (88.16 %). Therefore, the RF classification method combined with an object-based analysis approach can achieve relatively high accuracy in the classification and extraction of land use information for industrial and mining reclamation areas. Moreover, interpretation of remotely sensed imagery using the proposed method can provide technical support and a theoretical reference for remote sensing monitoring of land reclamation.
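The grid-search step can be sketched generically. Here `score` is a placeholder for whatever is being maximized (in the paper's setting, cross-validated random forest accuracy); the parameter names and values below are illustrative, not the study's actual grid.

```python
from itertools import product

def grid_search(score, grid):
    """Exhaustive grid search: evaluate score(**params) for every
    combination of values in `grid` and return the best (score, params)."""
    best = None
    for values in product(*grid.values()):
        params = dict(zip(grid.keys(), values))
        s = score(**params)
        if best is None or s > best[0]:
            best = (s, params)
    return best
```

For example, with a stand-in scoring function peaking at `n_trees=100, max_depth=8`, `grid_search` recovers that combination from a 3x3 grid.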

  10. Moraine-dammed lake failures in Patagonia and assessment of outburst susceptibility in the Baker Basin

    NASA Astrophysics Data System (ADS)

    Iribarren Anacona, P.; Norton, K. P.; Mackintosh, A.

    2014-07-01

    Glacier retreat since the Little Ice Age has resulted in the development or expansion of hundreds of glacial lakes in Patagonia. Some of these lakes have produced large (≥ 10⁶ m³) Glacial Lake Outburst Floods (GLOFs) damaging inhabited areas. GLOF hazard studies in Patagonia have been based mainly on the analysis of short-term series (≤ 50 years) of flood data, and until now no attempt has been made to identify the relative susceptibility of lakes to failure. Power schemes and associated infrastructure are planned for Patagonian basins that have historically been affected by GLOFs, and we now require a thorough understanding of the characteristics of dangerous lakes in order to assist with hazard assessment and planning. In this paper, the conditioning factors of 16 outbursts from moraine-dammed lakes in Patagonia were analysed. These data were used to develop a classification scheme designed to assess outburst susceptibility, based on image classification techniques, flow routing algorithms and the Analytical Hierarchy Process. This scheme was applied to the Baker Basin, Chile, where at least seven moraine-dammed lakes have failed in historical times. We identified 386 moraine-dammed lakes in the Baker Basin, of which 28 were classified with high or very high outburst susceptibility. Commonly, lakes with high outburst susceptibility are in contact with glaciers and have moderate (>8°) to steep (>15°) dam outlet slopes, akin to failed lakes in Patagonia. The proposed classification scheme is suitable for first-order GLOF hazard assessments in this region. However, rapidly changing glaciers in Patagonia make detailed analysis and monitoring of hazardous lakes and glaciated areas upstream from inhabited areas or critical infrastructure necessary, in order to better prepare for hazards emerging from an evolving cryosphere.
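The Analytical Hierarchy Process step can be sketched as follows: susceptibility factors are weighted by the principal eigenvector of a pairwise comparison matrix, found here by power iteration. The factor names and pairwise judgments in the example are hypothetical, not taken from the paper.

```python
def ahp_weights(M, iters=50):
    """Principal-eigenvector weights of an AHP pairwise comparison
    matrix M (M[i][j] = how much more important factor i is than j),
    computed by power iteration and normalized to sum to 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Hypothetical judgments for three factors, e.g.
# [glacier contact, dam outlet slope, lake area]:
M = [[1, 3, 5],
     [1/3, 1, 3],
     [1/5, 1/3, 1]]
```

With these judgments, glacier contact receives the largest weight, followed by dam outlet slope, then lake area; the weights would then scale the factor scores that feed the susceptibility classes.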

  11. Moraine-dammed lake failures in Patagonia and assessment of outburst susceptibility in the Baker Basin

    NASA Astrophysics Data System (ADS)

    Iribarren Anacona, P.; Norton, K. P.; Mackintosh, A.

    2014-12-01

    Glacier retreat since the Little Ice Age has resulted in the development or expansion of hundreds of glacial lakes in Patagonia. Some of these lakes have produced large (≥ 10⁶ m³) Glacial Lake Outburst Floods (GLOFs) damaging inhabited areas. GLOF hazard studies in Patagonia have been based mainly on the analysis of short-term series (≤ 50 years) of flood data, and until now no attempt has been made to identify the relative susceptibility of lakes to failure. Power schemes and associated infrastructure are planned for Patagonian basins that have historically been affected by GLOFs, and we now require a thorough understanding of the characteristics of dangerous lakes in order to assist with hazard assessment and planning. In this paper, the conditioning factors of 16 outbursts from moraine-dammed lakes in Patagonia were analysed. These data were used to develop a classification scheme designed to assess outburst susceptibility, based on image classification techniques, flow routing algorithms and the Analytical Hierarchy Process. This scheme was applied to the Baker Basin, Chile, where at least seven moraine-dammed lakes have failed in historical times. We identified 386 moraine-dammed lakes in the Baker Basin, of which 28 were classified with high or very high outburst susceptibility. Commonly, lakes with high outburst susceptibility are in contact with glaciers and have moderate (> 8°) to steep (> 15°) dam outlet slopes, akin to failed lakes in Patagonia. The proposed classification scheme is suitable for first-order GLOF hazard assessments in this region. However, rapidly changing glaciers in Patagonia make detailed analysis and monitoring of hazardous lakes and glaciated areas upstream from inhabited areas or critical infrastructure necessary, in order to better prepare for hazards emerging from an evolving cryosphere.

  12. Diagnosis of breast masses from dynamic contrast-enhanced and diffusion-weighted MR: a machine learning approach.

    PubMed

    Cai, Hongmin; Peng, Yanxia; Ou, Caiwen; Chen, Minsheng; Li, Li

    2014-01-01

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is increasingly used for breast cancer diagnosis as a supplement to conventional imaging techniques. Combining morphological features from diffusion-weighted imaging (DWI) with kinetic features from DCE-MRI to improve the power to discriminate malignant from benign breast masses is rarely reported. The study comprised 234 female patients with 85 benign and 149 malignant lesions. Four distinct groups of features, coupled with pathological tests, were estimated to comprehensively characterize the pictorial properties of each lesion, which was obtained by a semi-automated segmentation method. A classical machine learning scheme, including feature subset selection and various classification schemes, was employed to build a prognostic model, which served as a foundation for evaluating the combined effects of the multi-sided features in predicting the type of lesion. Various measurements, including cross validation and receiver operating characteristics, were used to quantify the diagnostic performance of each feature as well as their combination. Seven features were all found to be statistically different between the malignant and benign groups, and their combination achieved the highest classification accuracy. The seven features comprise one pathological variable (age), one morphological variable (slope), three texture features (entropy, inverse difference, and information correlation), one kinetic feature (SER), and one DWI feature (apparent diffusion coefficient, ADC). Together with the selected diagnostic features, various classical classification schemes were used to test their discrimination power through a cross validation scheme. The averaged measurements of sensitivity, specificity, AUC and accuracy are 0.85, 0.89, 90.9% and 0.93, respectively. 
Multi-sided variables that characterize the morphological, kinetic, and pathological properties, together with the DWI measurement of ADC, can dramatically improve the discriminatory power for breast lesions.
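The cross validation used to quantify diagnostic performance can be sketched generically. Here `train_and_score` is a placeholder for fitting any classifier on the training folds and scoring it on the held-out fold; the fold assignment by simple interleaving is one common choice, not necessarily the study's.

```python
def k_fold_cross_val(data, labels, k, train_and_score):
    """Average score over k folds: each fold is held out once for
    testing while the remaining folds are used for training."""
    n = len(data)
    folds = [list(range(i, n, k)) for i in range(k)]  # interleaved folds
    scores = []
    for test_idx in folds:
        test_set = set(test_idx)
        train_x = [data[i] for i in range(n) if i not in test_set]
        train_y = [labels[i] for i in range(n) if i not in test_set]
        test_x = [data[i] for i in test_idx]
        test_y = [labels[i] for i in test_idx]
        scores.append(train_and_score(train_x, train_y, test_x, test_y))
    return sum(scores) / k
```

Each lesion is scored exactly once while held out, so the averaged accuracy estimates generalization rather than fit to the training set.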

  13. A cancelable biometric scheme based on multi-lead ECGs.

    PubMed

    Peng-Tzu Chen; Shun-Chi Wu; Jui-Hsuan Hsieh

    2017-07-01

    Biometric technologies offer great advantages over other recognition methods, but there are concerns that they may compromise the privacy of individuals. In this paper, an electrocardiogram (ECG)-based cancelable biometric scheme is proposed to relieve such concerns. In this scheme, distinct biometric templates for a given beat bundle are constructed via "subspace collapsing." To determine the identity of any unknown beat bundle, the multiple signal classification (MUSIC) algorithm, incorporating a "suppression and poll" strategy, is adopted. Unlike the existing cancelable biometric schemes, knowledge of the distortion transform is not required for recognition. Experiments with real ECGs from 285 subjects are presented to illustrate the efficacy of the proposed scheme. The best recognition rate of 97.58 % was achieved under the test condition N_train = 10 and N_test = 10.

  14. A proposed radiographic classification scheme for congenital thoracic vertebral malformations in brachycephalic "screw-tailed" dog breeds.

    PubMed

    Gutierrez-Quintana, Rodrigo; Guevar, Julien; Stalin, Catherine; Faller, Kiterie; Yeamans, Carmen; Penderis, Jacques

    2014-01-01

    Congenital vertebral malformations are common in brachycephalic "screw-tailed" dog breeds such as French bulldogs, English bulldogs, Boston terriers, and pugs. The aim of this retrospective study was to determine whether a radiographic classification scheme developed for use in humans would be feasible for use in these dog breeds. Inclusion criteria were hospital admission between September 2009 and April 2013, neurologic examination findings available, diagnostic quality lateral and ventro-dorsal digital radiographs of the thoracic vertebral column, and at least one congenital vertebral malformation. Radiographs were retrieved and interpreted by two observers who were unaware of neurologic status. Vertebral malformations were classified based on a classification scheme modified from a previous human study and a consensus of both observers. Twenty-eight dogs met inclusion criteria (12 with neurologic deficits, 16 with no neurologic deficits). Congenital vertebral malformations affected 85/362 (23.5%) of thoracic vertebrae. Vertebral body formation defects were the most common (butterfly vertebrae 6.6%, ventral wedge-shaped vertebrae 5.5%, dorsal hemivertebrae 0.8%, and dorso-lateral hemivertebrae 0.5%). No lateral hemivertebrae or lateral wedge-shaped vertebrae were identified. The T7 vertebra was the most commonly affected (11/28 dogs), followed by T8 (8/28 dogs) and T12 (8/28 dogs). The number and type of vertebral malformations differed between groups (P = 0.01). Based on MRI, dorsal, and dorso-lateral hemivertebrae were the cause of spinal cord compression in 5/12 (41.6%) of dogs with neurologic deficits. Findings indicated that a modified human radiographic classification system of vertebral malformations is feasible for use in future studies of brachycephalic "screw-tailed" dogs. © 2014 American College of Veterinary Radiology.

  15. Automated Classification of Thermal Infrared Spectra Using Self-organizing Maps

    NASA Technical Reports Server (NTRS)

    Roush, Ted L.; Hogan, Robert

    2006-01-01

    Existing and planned space missions to a variety of planetary and satellite surfaces produce an ever increasing volume of spectral data. Understanding the scientific information content of this large data volume is a daunting task. Fortunately, various statistical approaches are available to assess such data sets. Here we discuss an automated classification scheme we have developed based on Kohonen self-organizing maps (SOMs). The SOM process produces an output layer where spectra having similar properties lie in close proximity to each other. One major effort is partitioning this output layer into appropriate regions. This is performed by defining closed regions based upon the strength of the boundaries between adjacent cells in the SOM output layer. We use the Davies-Bouldin index as a measure of the inter-class similarities and intra-class dissimilarities that determines the optimum partition of the output layer, and hence the number of SOM clusters. This allows us to identify the natural number of clusters formed from the spectral data. Mineral spectral libraries prepared at Arizona State University (ASU) and Johns Hopkins University (JHU) are used to test and evaluate the classification scheme. We label the library sample spectra in a hierarchical scheme with class, subclass, and mineral group names. We use a portion of the spectra to train the SOM, i.e., produce the output layer, while the remaining spectra are used to test the SOM. The test spectra are presented to the SOM output layer and assigned membership to the appropriate cluster. We then evaluate these assignments to assess the scientific meaning and accuracy of the derived SOM classes as they relate to the labels. We demonstrate that unsupervised classification by SOMs can be a useful component in autonomous systems designed to identify mineral species from reflectance and emissivity spectra in the thermal IR.
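The Davies-Bouldin criterion used to pick the optimal partition can be sketched directly from its definition: the average, over clusters, of the worst-case ratio of within-cluster scatter to between-centroid distance, with lower values indicating a better partition. This is a generic implementation of the index, not the paper's SOM-specific code.

```python
import math

def davies_bouldin(clusters):
    """Davies-Bouldin index for a partition given as a list of clusters,
    each a list of points (tuples). Lower values mean better-separated,
    more compact clusters."""
    def centroid(pts):
        return tuple(sum(c) / len(c) for c in zip(*pts))
    def scatter(pts, ctr):
        # Mean distance of the cluster's points to its centroid.
        return sum(math.dist(p, ctr) for p in pts) / len(pts)
    ctrs = [centroid(c) for c in clusters]
    scat = [scatter(c, ctr) for c, ctr in zip(clusters, ctrs)]
    k = len(clusters)
    total = 0.0
    for i in range(k):
        # Worst-case similarity of cluster i to any other cluster.
        total += max((scat[i] + scat[j]) / math.dist(ctrs[i], ctrs[j])
                     for j in range(k) if j != i)
    return total / k
```

Scanning candidate partitions of the SOM output layer and keeping the one with the minimum index is how the "natural" number of clusters is selected.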

  16. Differentiating chronic post-traumatic stress disorder patients from healthy subjects using objective and subjective sleep-related parameters.

    PubMed

    Tahmasian, Masoud; Jamalabadi, Hamidreza; Abedini, Mina; Ghadami, Mohammad R; Sepehry, Amir A; Knight, David C; Khazaie, Habibolah

    2017-05-22

    Sleep disturbance is common in chronic post-traumatic stress disorder (PTSD). However, prior work has demonstrated that there are inconsistencies between subjective and objective assessments of sleep disturbance in PTSD. Therefore, we investigated whether subjective or objective sleep assessment has greater clinical utility to differentiate PTSD patients from healthy subjects. Further, we evaluated whether the combination of subjective and objective methods improves the accuracy of classification into patient versus healthy groups, which has important diagnostic implications. We recruited 32 chronic war-induced PTSD patients and 32 age- and gender-matched healthy subjects to participate in this study. Subjective (i.e. from three self-reported sleep questionnaires) and objective sleep-related data (i.e. from actigraphy scores) were collected from each participant. Subjective, objective, and combined (subjective and objective) sleep data were then analyzed using support vector machine classification. The classification accuracy, sensitivity, and specificity for subjective variables were 89.2%, 89.3%, and 89%, respectively. The classification accuracy, sensitivity, and specificity for objective variables were 65%, 62.3%, and 67.8%, respectively. The classification accuracy, sensitivity, and specificity for the aggregate variables (combination of subjective and objective variables) were 91.6%, 93.0%, and 90.3%, respectively. Our findings indicate that classification accuracy using subjective measurements is superior to objective measurements and the combination of both assessments appears to improve the classification accuracy for differentiating PTSD patients from healthy individuals. Copyright © 2017 Elsevier B.V. All rights reserved.
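    A hedged sketch of the comparison described above: a support vector machine is trained on synthetic "subjective" and "objective" feature sets (the group sizes follow the abstract; the feature dimensions, effect sizes, and random data are assumptions), and accuracy, sensitivity, and specificity are computed from cross-validated predictions.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_predict
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    n = 64  # 32 patients and 32 controls, as in the abstract
    y = np.r_[np.ones(32), np.zeros(32)]

    # Hypothetical features: "subjective" questionnaire scores separate the
    # groups more strongly than "objective" actigraphy scores.
    subjective = rng.normal(0, 1, (n, 3)) + 1.5 * y[:, None]
    objective = rng.normal(0, 1, (n, 2)) + 0.4 * y[:, None]

    def metrics(X, y):
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        pred = cross_val_predict(clf, X, y, cv=8)
        tp = np.sum((pred == 1) & (y == 1))
        tn = np.sum((pred == 0) & (y == 0))
        acc = (tp + tn) / len(y)
        sens = tp / np.sum(y == 1)   # sensitivity: patients correctly flagged
        spec = tn / np.sum(y == 0)   # specificity: controls correctly cleared
        return acc, sens, spec

    for name, X in [("subjective", subjective),
                    ("objective", objective),
                    ("combined", np.hstack([subjective, objective]))]:
        print(name, metrics(X, y))
    ```

    With real questionnaire and actigraphy variables in place of the synthetic arrays, the same loop reproduces the study's three-way comparison.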

  17. Object-based land cover classification based on fusion of multifrequency SAR data and THAICHOTE optical imagery

    NASA Astrophysics Data System (ADS)

    Sukawattanavijit, Chanika; Srestasathiern, Panu

    2017-10-01

    Land Use and Land Cover (LULC) information is significant for observing and evaluating environmental change. LULC classification using remotely sensed data is a technique widely employed at global and local scales, particularly in urban areas, which have diverse land cover types that are essential components of the urban terrain and ecosystem. At present, object-based image analysis (OBIA) is becoming widely popular for land cover classification using high-resolution imagery. COSMO-SkyMed SAR data were fused with THAICHOTE (namely, THEOS: Thailand Earth Observation Satellite) optical data for land cover classification using an object-based approach. This paper presents a comparison between object-based and pixel-based approaches to image fusion. The per-pixel method, support vector machines (SVM), was applied to the fused image based on Principal Component Analysis (PCA). The object-based classification was applied to the fused images to separate land cover classes using a nearest neighbor (NN) classifier. Finally, accuracy assessment was carried out by comparing the land cover maps generated from the fused image dataset and from the THAICHOTE image. The object-based fusion of COSMO-SkyMed with THAICHOTE images demonstrated the best classification accuracies, well over 85%. The results show that object-based data fusion provides higher land cover classification accuracy than per-pixel data fusion.

  18. A novel fractal image compression scheme with block classification and sorting based on Pearson's correlation coefficient.

    PubMed

    Wang, Jianji; Zheng, Nanning

    2013-09-01

    Fractal image compression (FIC) is an image coding technology based on the local similarity of image structure. It is widely used in many fields such as image retrieval, image denoising, image authentication, and encryption. FIC, however, suffers from the high computational complexity in encoding. Although many schemes are published to speed up encoding, they do not easily satisfy the encoding time or the reconstructed image quality requirements. In this paper, a new FIC scheme is proposed based on the fact that the affine similarity between two blocks in FIC is equivalent to the absolute value of Pearson's correlation coefficient (APCC) between them. First, all blocks in the range and domain pools are chosen and classified using an APCC-based block classification method to increase the matching probability. Second, by sorting the domain blocks with respect to APCCs between these domain blocks and a preset block in each class, the matching domain block for a range block can be searched in the selected domain set in which these APCCs are closer to APCC between the range block and the preset block. Experimental results show that the proposed scheme can significantly speed up the encoding process in FIC while preserving the reconstructed image quality well.
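    The APCC similarity at the heart of this scheme is easy to state in code. The sketch below is illustrative, not the authors' implementation: it shows that a domain block related to a range block by an affine map has APCC 1 (the condition FIC block matching exploits), while an unrelated block scores low.

    ```python
    import numpy as np

    def apcc(a, b):
        """Absolute Pearson correlation coefficient between two equal-size blocks."""
        a = a.ravel() - a.mean()
        b = b.ravel() - b.mean()
        denom = np.sqrt((a @ a) * (b @ b))
        return 0.0 if denom == 0 else abs((a @ b) / denom)

    rng = np.random.default_rng(2)
    range_block = rng.random((8, 8))

    # A domain block that is an affine map of the range block matches perfectly;
    # an unrelated block matches poorly.
    affine_domain = 3.0 * range_block - 1.0
    random_domain = rng.random((8, 8))

    print(round(apcc(range_block, affine_domain), 6))  # 1.0
    print(apcc(range_block, random_domain))            # small
    ```

    Classifying and sorting domain blocks by APCC against a fixed preset block, as the paper proposes, then narrows the search to blocks whose APCC values are close to that of the range block.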

  19. An improved fault detection classification and location scheme based on wavelet transform and artificial neural network for six phase transmission line using single end data only.

    PubMed

    Koley, Ebha; Verma, Khushaboo; Ghosh, Subhojit

    2015-01-01

    Restrictions on right of way and increasing power demand have boosted the development of six-phase transmission. It offers a viable alternative for transmitting more power without major modification to the existing structure of the three-phase double-circuit transmission system. In spite of these advantages, the low acceptance of the six-phase system is attributed to the unavailability of a proper protection scheme. The complexity arising from the large number of possible faults in six-phase lines makes protection quite challenging. The proposed work presents a hybrid wavelet transform and modular artificial neural network based fault detector, classifier and locator for six-phase lines using single-end data only. The standard deviations of the approximate coefficients of the voltage and current signals, obtained using the discrete wavelet transform, are applied as input to the modular artificial neural network for fault classification and location. The proposed scheme has been tested for all 120 types of shunt faults with variation in location, fault resistance, and fault inception angle. The variation in power system parameters, viz. short-circuit capacity of the source and its X/R ratio, voltage, frequency, and CT saturation, has also been investigated. The results confirm the effectiveness and reliability of the proposed protection scheme, which makes it suitable for real-time implementation.
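    As a rough illustration of the feature-extraction step only (not the authors' implementation: a Haar DWT is hard-coded here to stay dependency-free, and the six-phase signals and fault transient are synthetic assumptions), the standard deviation of the approximation coefficients rises on the faulted phase:

    ```python
    import numpy as np

    def haar_approx(signal, levels=3):
        """Approximation coefficients after `levels` one-level Haar DWT steps."""
        a = np.asarray(signal, dtype=float)
        for _ in range(levels):
            if len(a) % 2:               # pad odd-length signals to even length
                a = np.append(a, a[-1])
            a = (a[0::2] + a[1::2]) / np.sqrt(2.0)
        return a

    def fault_features(voltages, currents, levels=3):
        """One std-dev feature per phase signal, as in the abstract's scheme."""
        return np.array([np.std(haar_approx(s, levels))
                         for s in list(voltages) + list(currents)])

    # Synthetic six-phase example; the same arrays stand in for both voltage
    # and current channels purely for illustration.
    t = np.linspace(0, 0.1, 512)
    healthy = [np.sin(2 * np.pi * 50 * t + k * np.pi / 3) for k in range(6)]
    faulted = [h.copy() for h in healthy]
    faulted[0][200:260] += 5.0          # crude fault transient on phase 0

    f_h = fault_features(healthy, healthy)
    f_f = fault_features(faulted, faulted)
    print(f_f[0] > f_h[0])  # the fault raises the phase-0 feature
    ```

    In the paper these twelve features feed a modular ANN; here they are simply printed to show the discriminative signal.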

  20. GIS coupled Multiple Criteria based Decision Support for Classification of Urban Coastal Areas in India

    NASA Astrophysics Data System (ADS)

    Dhiman, R.; Kalbar, P.; Inamdar, A. B.

    2017-12-01

    Coastal area classification in India is a challenge for federal and state government agencies due to a fragile institutional framework, unclear directions in the implementation of coastal regulations, and violations at the private and government levels. This work is an attempt to improve the objectivity of existing classification methods so as to synergize ecological systems and socioeconomic development in coastal cities. We developed a Geographic Information System coupled Multi-Criteria Decision Making (GIS-MCDM) approach to classify urban coastal areas, in which utility functions are used to transform coastal features into quantitative membership values after assessing the sensitivity of the urban coastal ecosystem. Furthermore, these membership values for coastal features are applied in different weighting schemes to derive a Coastal Area Index (CAI), which classifies coastal areas into four distinct categories, viz. 1) No Development Zone, 2) Highly Sensitive Zone, 3) Moderately Sensitive Zone and 4) Low Sensitive Zone, based on the sensitivity of the urban coastal ecosystem. Mumbai, a coastal megacity in India, is used as a case study to demonstrate the proposed method. Finally, uncertainty analysis using a Monte Carlo approach is carried out to validate the sensitivity of the CAI under multiple specific scenarios. Results of the CAI method show a clear demarcation of coastal areas in the GIS environment based on ecological sensitivity. The CAI provides better decision support for federal and state level agencies to classify urban coastal areas according to the regional requirements for coastal resources, considering resilience and sustainable development. The CAI method will strengthen the existing institutional framework for decision making in the classification of urban coastal areas, where the most effective coastal management options can be proposed.
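    A minimal sketch of the CAI idea, assuming hypothetical membership values, weights, and zone thresholds (none of which are taken from the paper):

    ```python
    # Hypothetical membership values (0-1) for coastal features of one map cell,
    # and one possible weighting scheme; both are illustrative only.
    memberships = {"mangrove": 0.9, "erosion": 0.7, "builtup": 0.2, "slope": 0.4}
    weights     = {"mangrove": 0.4, "erosion": 0.3, "builtup": 0.2, "slope": 0.1}

    # Weighted-sum Coastal Area Index for this cell.
    cai = sum(weights[f] * memberships[f] for f in memberships)

    def classify(cai):
        """Map a Coastal Area Index to the four zones (thresholds assumed)."""
        if cai >= 0.75: return "No Development Zone"
        if cai >= 0.50: return "Highly Sensitive Zone"
        if cai >= 0.25: return "Moderately Sensitive Zone"
        return "Low Sensitive Zone"

    print(round(cai, 2), classify(cai))  # 0.65 Highly Sensitive Zone
    ```

    Running this per GIS cell, and resampling the weights Monte-Carlo style, mirrors the paper's index derivation and its uncertainty analysis in miniature.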

  1. Classification of gravity-flow deposits and their significance for unconventional petroleum exploration, with a case study from the Triassic Yanchang Formation (southern Ordos Basin, China)

    NASA Astrophysics Data System (ADS)

    Fan, Aiping; Yang, Renchao; (Tom) van Loon, A. J.; Yin, Wei; Han, Zuozhen; Zavala, Carlos

    2018-08-01

    The ongoing exploration for shale oil and gas has focused sedimentological research on the transport and deposition mechanisms of fine-grained sediments, and more specifically on fine-grained mass-flow deposits. It appears, however, that no easily applicable classification scheme for gravity-flow deposits exists, and that such classifications almost exclusively deal with sandy and coarser sediments. Since the lack of a good classification system for fine-grained gravity-flow deposits hampers scientific communication and understanding, we propose a classification scheme based on mud content in combination with the presumed transport mechanism. This results in twelve types of gravity-flow deposits. In order to show the practical applicability of this classification system, we apply it to the Triassic lacustrine Yanchang Formation in the southern Ordos Basin (China), which contains numerous slumps, debris-flow deposits, turbidites and hyperpycnites. The slumps and debrites occur mostly close to a delta front, and the turbidites and hyperpycnites extend over large areas from the delta slopes into the basin plain. The case study shows that (1) mud can not only be transported but also deposited under active hydrodynamic conditions; (2) fine-grained gravity-flow deposits constitute a significant part of the lacustrine mudstones and shales; and (3) muddy gravity flows are important for the transport and deposition of clastic particles, clay minerals and organic matter, and are thus important mechanisms in the generation of hydrocarbons, also largely determining the reservoir capability for unconventional petroleum.
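    A toy encoding of such a two-axis scheme, assuming three mud-content grades crossed with four transport mechanisms to reach the twelve types (the grade boundaries and labels are hypothetical, not the authors' definitions):

    ```python
    # Illustrative sketch only: the abstract combines mud content with transport
    # mechanism to yield twelve deposit types; the class boundaries and labels
    # here are assumptions.
    MUD_CLASSES = ["sandy", "muddy-sandy", "muddy"]          # assumed 3 grades
    MECHANISMS = ["slump", "debris flow", "turbidity current", "hyperpycnal flow"]

    def deposit_type(mud_fraction, mechanism):
        if not 0.0 <= mud_fraction <= 1.0:
            raise ValueError("mud_fraction must be in [0, 1]")
        if mechanism not in MECHANISMS:
            raise ValueError(f"unknown mechanism: {mechanism}")
        grade = MUD_CLASSES[min(int(mud_fraction * 3), 2)]   # thirds of [0, 1]
        return f"{grade} {mechanism} deposit"

    print(deposit_type(0.8, "turbidity current"))  # muddy turbidity current deposit
    print(len(MUD_CLASSES) * len(MECHANISMS))      # 12 combinations
    ```

    The value of such a lookup is purely communicative: any observed bed maps to exactly one of the twelve named types.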

  2. Functional Basis of Microorganism Classification.

    PubMed

    Zhu, Chengsheng; Delmont, Tom O; Vogel, Timothy M; Bromberg, Yana

    2015-08-01

    Correctly identifying nearest "neighbors" of a given microorganism is important in industrial and clinical applications where close relationships imply similar treatment. Microbial classification based on similarity of physiological and genetic organism traits (polyphasic similarity) is experimentally difficult and, arguably, subjective. Evolutionary relatedness, inferred from phylogenetic markers, facilitates classification but does not guarantee functional identity between members of the same taxon or lack of similarity between different taxa. Using over thirteen hundred sequenced bacterial genomes, we built a novel function-based microorganism classification scheme, functional-repertoire similarity-based organism network (FuSiON; flattened to fusion). Our scheme is phenetic, based on a network of quantitatively defined organism relationships across the known prokaryotic space. It correlates significantly with the current taxonomy, but the observed discrepancies reveal both (1) the inconsistency of functional diversity levels among different taxa and (2) an (unsurprising) bias towards prioritizing, for classification purposes, relatively minor traits of particular interest to humans. Our dynamic network-based organism classification is independent of the arbitrary pairwise organism similarity cut-offs traditionally applied to establish taxonomic identity. Instead, it reveals natural, functionally defined organism groupings and is thus robust in handling organism diversity. Additionally, fusion can use organism meta-data to highlight the specific environmental factors that drive microbial diversification. Our approach provides a complementary view to cladistic assignments and holds important clues for further exploration of microbial lifestyles. 
Fusion is a more practical fit for biomedical, industrial, and ecological applications, as many of these rely on understanding the functional capabilities of the microbes in their environment and are less concerned with phylogenetic descent.

  3. Treatment outcomes of saddle nose correction.

    PubMed

    Hyun, Sang Min; Jang, Yong Ju

    2013-01-01

    Many valuable classification schemes for saddle nose have been suggested that integrate clinical deformity and treatment; however, there is no consensus regarding the most suitable classification and surgical method for saddle nose correction. To present clinical characteristics and treatment outcomes of saddle nose deformity and to propose a modified classification system to better characterize the variety of different saddle nose deformities. The retrospective study included 91 patients who underwent rhinoplasty for correction of saddle nose from April 1, 2003, through December 31, 2011, with a minimum follow-up of 8 months. Saddle nose was classified into 4 types according to a modified classification. Aesthetic outcomes were classified as excellent, good, fair, or poor. Patients underwent minor cosmetic concealment by dorsal augmentation (n = 8) or major septal reconstruction combined with dorsal augmentation (n = 83). Autologous costal cartilages were used in 40 patients (44%), and homologous costal cartilages were used in 5 patients (6%). According to postoperative assessment, 29 patients had excellent, 42 patients had good, 18 patients had fair, and 2 patients had poor aesthetic outcomes. No statistical difference in surgical outcome according to saddle nose classification was observed. Eight patients underwent revision rhinoplasty owing to recurrence of the saddle deformity, wound infection, or warping of the costal cartilage used for dorsal augmentation. We introduce a modified saddle nose classification scheme that is simpler and better able to characterize different deformities. Among 91 patients with saddle nose, 20 (22%) had unsuccessful outcomes (fair or poor) and 8 (9%) underwent subsequent revision rhinoplasty. Thus, management of saddle nose deformities remains challenging.

  4. Functional Basis of Microorganism Classification

    PubMed Central

    Zhu, Chengsheng; Delmont, Tom O.; Vogel, Timothy M.; Bromberg, Yana

    2015-01-01

    Correctly identifying nearest “neighbors” of a given microorganism is important in industrial and clinical applications where close relationships imply similar treatment. Microbial classification based on similarity of physiological and genetic organism traits (polyphasic similarity) is experimentally difficult and, arguably, subjective. Evolutionary relatedness, inferred from phylogenetic markers, facilitates classification but does not guarantee functional identity between members of the same taxon or lack of similarity between different taxa. Using over thirteen hundred sequenced bacterial genomes, we built a novel function-based microorganism classification scheme, functional-repertoire similarity-based organism network (FuSiON; flattened to fusion). Our scheme is phenetic, based on a network of quantitatively defined organism relationships across the known prokaryotic space. It correlates significantly with the current taxonomy, but the observed discrepancies reveal both (1) the inconsistency of functional diversity levels among different taxa and (2) an (unsurprising) bias towards prioritizing, for classification purposes, relatively minor traits of particular interest to humans. Our dynamic network-based organism classification is independent of the arbitrary pairwise organism similarity cut-offs traditionally applied to establish taxonomic identity. Instead, it reveals natural, functionally defined organism groupings and is thus robust in handling organism diversity. Additionally, fusion can use organism meta-data to highlight the specific environmental factors that drive microbial diversification. Our approach provides a complementary view to cladistic assignments and holds important clues for further exploration of microbial lifestyles. 
Fusion is a more practical fit for biomedical, industrial, and ecological applications, as many of these rely on understanding the functional capabilities of the microbes in their environment and are less concerned with phylogenetic descent. PMID:26317871

  5. Changing Patient Classification System for Hospital Reimbursement in Romania

    PubMed Central

    Radu, Ciprian-Paul; Chiriac, Delia Nona; Vladescu, Cristian

    2010-01-01

    Aim To evaluate the effects of the change in the diagnosis-related group (DRG) system on patient morbidity and hospital financial performance in the Romanian public health care system. Methods Three variables were assessed before and after the classification switch in July 2007: clinical outcomes, the case mix index, and hospital budgets, using the database of the National School of Public Health and Health Services Management, which contains data regularly received from hospitals reimbursed through the Romanian DRG scheme (291 in 2009). Results The lack of a Romanian system for the calculation of cost-weights imposed the necessity to use an imported system, which was criticized by some clinicians for not accurately reflecting resource consumption in Romanian hospitals. The new DRG classification system allowed a more accurate clinical classification. However, it also exposed a lack of physicians’ knowledge on diagnosing and coding procedures, which led to incorrect coding. Consequently, the reported hospital morbidity changed after the DRG switch, reflecting an increase in the national case mix index of 25% in 2009 (compared with 2007). Since hospitals received the same reimbursement over the first two years after the classification switch, the new DRG system led them sometimes to change patients' diagnoses in order to receive more funding. Conclusion Lack of oversight of hospital coding and reporting to the national reimbursement scheme allowed the increase in the case mix index. The complexity of the new classification system requires more resources (human and financial), better monitoring and evaluation, and improved legislation in order to achieve better hospital resource allocation and more efficient patient care. PMID:20564769

  6. Changing patient classification system for hospital reimbursement in Romania.

    PubMed

    Radu, Ciprian-Paul; Chiriac, Delia Nona; Vladescu, Cristian

    2010-06-01

    To evaluate the effects of the change in the diagnosis-related group (DRG) system on patient morbidity and hospital financial performance in the Romanian public health care system. Three variables were assessed before and after the classification switch in July 2007: clinical outcomes, the case mix index, and hospital budgets, using the database of the National School of Public Health and Health Services Management, which contains data regularly received from hospitals reimbursed through the Romanian DRG scheme (291 in 2009). The lack of a Romanian system for the calculation of cost-weights imposed the necessity to use an imported system, which was criticized by some clinicians for not accurately reflecting resource consumption in Romanian hospitals. The new DRG classification system allowed a more accurate clinical classification. However, it also exposed a lack of physicians' knowledge on diagnosing and coding procedures, which led to incorrect coding. Consequently, the reported hospital morbidity changed after the DRG switch, reflecting an increase in the national case-mix index of 25% in 2009 (compared with 2007). Since hospitals received the same reimbursement over the first two years after the classification switch, the new DRG system led them sometimes to change patients' diagnoses in order to receive more funding. Lack of oversight of hospital coding and reporting to the national reimbursement scheme allowed the increase in the case-mix index. The complexity of the new classification system requires more resources (human and financial), better monitoring and evaluation, and improved legislation in order to achieve better hospital resource allocation and more efficient patient care.

  7. Haptic Classification of Common Objects: Knowledge-Driven Exploration.

    ERIC Educational Resources Information Center

    Lederman, Susan J.; Klatzky, Roberta L.

    1990-01-01

    Theoretical and empirical issues relating to haptic exploration and the representation of common objects during haptic classification were investigated in 3 experiments involving a total of 112 college students. Results are discussed in terms of a computational model of human haptic object classification with implications for dextrous robot…

  8. Objects Classification by Learning-Based Visual Saliency Model and Convolutional Neural Network.

    PubMed

    Li, Na; Zhao, Xinbo; Yang, Yongjia; Zou, Xiaochun

    2016-01-01

    Humans can easily classify different kinds of objects, whereas it is quite difficult for computers. As a challenging problem, object classification has been receiving extensive interest with broad prospects. Inspired by neuroscience, the concept of deep learning was proposed. The convolutional neural network (CNN), as one of the methods of deep learning, can be used to solve classification problems. However, most deep learning methods, including CNN, ignore the human visual information processing mechanism at work when a person classifies objects. Therefore, in this paper, inspired by the complete process by which humans classify different kinds of objects, we bring forth a new classification method which combines a visual attention model and a CNN. Firstly, we use the visual attention model to simulate the human visual selection mechanism. Secondly, we use the CNN to simulate how humans select features, and extract the local features of the selected areas. Finally, our classification method not only depends on those local features but also adds human semantic features to classify objects. Our classification method has apparent advantages in biological plausibility. Experimental results demonstrated that our method improves classification efficiency significantly.

  9. Professor Krystyna Kotełko and her contribution to the study of Proteus endotoxin.

    PubMed

    Różalski, Antoni W

    2018-04-01

    Professor Krystyna Kotełko worked as a microbiologist at the University of Łódź (Poland). Her main object of study was the LPS (endotoxin) of opportunistic urinary pathogens from the genus Proteus. She demonstrated, for the first time, the presence of uronic acids and amino acids, as well as two heptoses (L-glycero-D-manno-heptose and D-glycero-D-manno-heptose) and hexosamines in Proteus LPS, and developed a classification scheme sorting Proteus LPS into chemotypes. Prof. Kotełko also initiated studies on the chemical structure of the Proteus O-specific polysaccharide and investigations of the serological specificity of this part of the LPS, as well as of its core region. She also analysed the virulence factors of these bacteria, such as haemolysin and invasiveness.

  10. Extrasolar planetary systems.

    NASA Technical Reports Server (NTRS)

    Huang, S.-S.

    1973-01-01

    The terms 'planet' and 'planet-like objects' are defined. The observational search for extrasolar planetary systems is described, as performable by earthbound optical telescopes, by space probes, by long baseline radio interferometry, and finally by inference from the reception of signals sent by intelligent beings in other worlds. It is shown that any planetary system must be preceded by a rotating disk of gas and dust around a central mass. A brief review of the theories of the formation of the solar system is given, along with a proposed scheme for classification of these theories. The evidence for magnetic activity in the early stages of stellar evolution is presented. The magnetic braking theories of solar and stellar rotation are discussed, and an estimate is made for the frequency of occurrence of planetary systems in the universe.

  11. Drug-induced sedation endoscopy (DISE) classification systems: a systematic review and meta-analysis.

    PubMed

    Dijemeni, Esuabom; D'Amone, Gabriele; Gbati, Israel

    2017-12-01

    Drug-induced sedation endoscopy (DISE) classification systems have been used to assess anatomical findings of upper airway obstruction, to decide and plan surgical treatment, and to act as predictors of surgical treatment outcome in obstructive sleep apnoea management. The first objective is to identify whether there is a universally accepted DISE grading and classification system for analysing DISE findings. The second objective is to identify whether there is one DISE grading and classification treatment planning framework for deciding the appropriate surgical treatment for obstructive sleep apnoea (OSA). The third objective is to identify whether there is one DISE grading and classification treatment outcome framework for determining the likelihood of success of a given OSA surgical intervention. A systematic review was performed to identify new and significantly modified DISE classification systems: their concepts, advantages and disadvantages. Fourteen studies proposing a new DISE classification system and three studies proposing a significantly modified DISE classification were identified. None of the studies were based on randomised controlled trials. DISE is an objective method for visualising upper airway obstruction. However, the classification and assessment of clinical findings based on DISE remain highly subjective, as evidenced by the increasing number of DISE classification systems, and this creates a growing divergence in surgical treatment planning and treatment outcome. Further research on a universally accepted objective DISE assessment is critically needed.

  12. In-vivo determination of chewing patterns using FBG and artificial neural networks

    NASA Astrophysics Data System (ADS)

    Pegorini, Vinicius; Zen Karam, Leandro; Rocha Pitta, Christiano S.; Ribeiro, Richardson; Simioni Assmann, Tangriani; Cardozo da Silva, Jean Carlos; Bertotti, Fábio L.; Kalinowski, Hypolito J.; Cardoso, Rafael

    2015-09-01

    This paper reports the process of pattern classification of the chewing process of ruminants. We propose a simplified signal processing scheme for optical fiber Bragg grating (FBG) sensors based on machine learning techniques. The FBG sensors measure the biomechanical forces during jaw movements and an artificial neural network is responsible for the classification of the associated chewing pattern. In this study, three patterns associated to dietary supplement, hay and ryegrass were considered. Additionally, two other important events for ingestive behavior studies were monitored, rumination and idle period. Experimental results show that the proposed approach for pattern classification has been capable of differentiating the materials involved in the chewing process with a small classification error.

  13. Automated source classification of new transient sources

    NASA Astrophysics Data System (ADS)

    Oertel, M.; Kreikenbohm, A.; Wilms, J.; DeLuca, A.

    2017-10-01

    The EXTraS project harvests the hitherto unexplored temporal domain information buried in the serendipitous data collected by the European Photon Imaging Camera (EPIC) onboard the ESA XMM-Newton mission since its launch. This includes a search for fast transients, missed by standard image analysis, and a search and characterization of variability in hundreds of thousands of sources. We present an automated classification scheme for new transient sources in the EXTraS project. The method is as follows: source classification features of a training sample are used to train machine learning algorithms (performed in R; randomForest (Breiman, 2001) in supervised mode) which are then tested on a sample of known source classes and used for classification.

  14. An evaluation of sampling and full enumeration strategies for Fisher Jenks classification in big data settings

    USGS Publications Warehouse

    Rey, Sergio J.; Stephens, Philip A.; Laura, Jason R.

    2017-01-01

    Large data contexts present a number of challenges to optimal choropleth map classifiers. Application of optimal classifiers to a sample of the attribute space is one proposed solution. The properties of alternative sampling-based classification methods are examined through a series of Monte Carlo simulations. The impacts of spatial autocorrelation, number of desired classes, and form of sampling are shown to have significant impacts on the accuracy of map classifications. Tradeoffs between improved speed of the sampling approaches and loss of accuracy are also considered. The results suggest the possibility of guiding the choice of classification scheme as a function of the properties of large data sets.
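    The sampling idea above can be sketched as follows. Fisher-Jenks minimizes within-class variance, which in one dimension is the k-means objective, so for this illustration k-means on a 1-D attribute stands in for an optimal classifier (an assumption: k-means is a heuristic, not the exact Fisher-Jenks dynamic program); breaks computed on a small sample land close to those from full enumeration.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)
    # A "large" attribute vector with three value regimes.
    values = np.concatenate([rng.normal(10, 1, 4000),
                             rng.normal(50, 2, 4000),
                             rng.normal(90, 3, 4000)])

    def class_breaks(x, k):
        """Fisher-Jenks-style breaks via 1-D k-means (same variance criterion)."""
        km = KMeans(n_clusters=k, n_init=5, random_state=0).fit(x.reshape(-1, 1))
        centers = np.sort(km.cluster_centers_.ravel())
        return (centers[:-1] + centers[1:]) / 2  # midpoints of sorted centers

    full_breaks = class_breaks(values, 3)
    sample_breaks = class_breaks(rng.choice(values, 1000, replace=False), 3)
    print(np.max(np.abs(full_breaks - sample_breaks)))  # small: sample ≈ full
    ```

    The speed-versus-accuracy tradeoff the paper quantifies is visible even here: the sample run fits one twelfth of the data yet recovers nearly identical class breaks.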

  15. Automatic ICD-10 multi-class classification of cause of death from plaintext autopsy reports through expert-driven feature selection

    PubMed Central

    Mujtaba, Ghulam; Shuib, Liyana; Raj, Ram Gopal; Rajandram, Retnagowri; Shaikh, Khairunisa; Al-Garadi, Mohammed Ali

    2017-01-01

    Objectives Widespread implementation of electronic databases has improved the accessibility of plaintext clinical information for supplementary use. Numerous machine learning techniques, such as supervised machine learning approaches or ontology-based approaches, have been employed to obtain useful information from plaintext clinical data. This study proposes an automatic multi-class classification system to predict accident-related causes of death from plaintext autopsy reports through expert-driven feature selection with supervised automatic text classification decision models. Methods Accident-related autopsy reports were obtained from one of the largest hospitals in Kuala Lumpur. These reports belong to nine different accident-related causes of death. A master feature vector was prepared by extracting features from the collected autopsy reports using unigrams with lexical categorization. This master feature vector was used to detect the cause of death [according to the International Classification of Diseases, version 10 (ICD-10)] through five automated feature selection schemes, the proposed expert-driven approach, five subset sizes of features, and five machine learning classifiers. Model performance was evaluated using macro-averaged precision, recall, and F-measure, as well as accuracy and area under the ROC curve. Four baselines were used to compare the results with the proposed system. Results Random forest and J48 decision models parameterized using expert-driven feature selection yielded the highest evaluation measures (approaching 85% to 90% for most metrics) using a feature subset size of 30. The proposed system also showed approximately 14% to 16% improvement in overall accuracy compared with the existing techniques and four baselines. Conclusion The proposed system is feasible and practical to use for automatic classification of ICD-10-coded cause of death from autopsy reports. 
The proposed system assists pathologists to accurately and rapidly determine underlying cause of death based on autopsy findings. Furthermore, the proposed expert-driven feature selection approach and the findings are generally applicable to other kinds of plaintext clinical reports. PMID:28166263
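A minimal sketch of the kind of pipeline such a system might use, assuming a toy corpus and hypothetical cause-of-death labels (the paper's actual features, ICD-10 codes, and decision models are not reproduced here): unigram bag-of-words counts feeding a multinomial naive Bayes classifier with add-one smoothing.

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """Train a multinomial naive Bayes model on (text, label) pairs."""
    class_docs = defaultdict(int)          # documents per class
    class_terms = defaultdict(Counter)     # unigram counts per class
    vocab = set()
    for text, label in docs:
        tokens = text.lower().split()
        class_docs[label] += 1
        class_terms[label].update(tokens)
        vocab.update(tokens)
    return class_docs, class_terms, vocab

def predict_nb(model, text):
    class_docs, class_terms, vocab = model
    total_docs = sum(class_docs.values())
    best_label, best_score = None, float("-inf")
    for label in class_docs:
        counts = class_terms[label]
        total = sum(counts.values())
        # log prior + log likelihood with add-one (Laplace) smoothing
        score = math.log(class_docs[label] / total_docs)
        for tok in text.lower().split():
            score += math.log((counts[tok] + 1) / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Toy corpus with hypothetical cause-of-death labels.
corpus = [
    ("blunt force trauma head vehicle collision", "transport_accident"),
    ("vehicle collision multiple fractures", "transport_accident"),
    ("water in lungs submersion", "drowning"),
    ("submersion asphyxia water", "drowning"),
]
model = train_nb(corpus)
print(predict_nb(model, "vehicle collision trauma"))  # -> transport_accident
```

A production system would of course use the expert-selected feature subset and stronger decision models (random forest, J48) described in the abstract; this sketch only shows the unigram-features-to-classifier shape of the task.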

  16. Development of a Procurement Task Classification Scheme.

    DTIC Science & Technology

    1987-12-01

    Office of Scientific Research, Arlington, Virginia, January 1970. Tornow, Walter W. and Pinto, Patrick R. "The Development of a Managerial Job...classification. [Ref. 4:27] Numerical taxonomy proponents hold [Ref. 4:27], ... that the relationships of contiguity and similarity should be...solving. These primitive categories are based on a sorting of learning processes into classes that have obvious differences at the

  17. USCS and the USDA Soil Classification System: Development of a Mapping Scheme

    DTIC Science & Technology

    2015-03-01

    important to human daily living. A variety of disciplines (geology, agriculture, engineering, etc.) require a systematic categorization of soil, detailing...it is often important to also consider parameters that indicate soil strength. Two important properties used for engineering-related problems are...that many textural classification systems were developed to meet specific needs. In agriculture, textural classification is used to determine crop

  18. Revealing how different spinors can be: The Lounesto spinor classification

    NASA Astrophysics Data System (ADS)

    Hoff da Silva, J. M.; Cavalcanti, R. T.

    2017-11-01

    This paper aims to give a coordinate-based introduction to the so-called Lounesto spinorial classification scheme. Among other results, it has evinced classes of spinors which fail to satisfy the Dirac equation. The underlying idea and the central aspects of such spinorial categorization are introduced on an argumentative basis, after which we delve into a commented account of recent results obtained from (and within) this branch of research.

  19. Classification and overview of research in real-time imaging

    NASA Astrophysics Data System (ADS)

    Sinha, Purnendu; Gorinsky, Sergey V.; Laplante, Phillip A.; Stoyenko, Alexander D.; Marlowe, Thomas J.

    1996-10-01

    Real-time imaging has application in areas such as multimedia, virtual reality, medical imaging, and remote sensing and control. Recently, the imaging community has witnessed a tremendous growth in research and new ideas in these areas. To lend structure to this growth, we outline a classification scheme and provide an overview of current research in real-time imaging. For convenience, we have categorized references by research area and application.

  20. Use of circulation types classifications to evaluate AR4 climate models over the Euro-Atlantic region

    NASA Astrophysics Data System (ADS)

    Pastor, M. A.; Casado, M. J.

    2012-10-01

    This paper presents an evaluation of the multi-model simulations for the 4th Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) in terms of their ability to simulate the ERA40 circulation types over the Euro-Atlantic region in the winter season. Two classification schemes, k-means and SANDRA, have been considered to test the sensitivity of the evaluation results to the classification procedure. The assessment establishes different rankings according to the spatial and temporal features of the circulation types. Regarding temporal characteristics, in general, all AR4 models tend to underestimate the frequency of occurrence. The best model simulating spatial characteristics is the UKMO-HadGEM1, whereas CCSM3, UKMO-HadGEM1 and CGCM3.1(T63) are the best simulating the temporal features, for both classification schemes. This result agrees with the AR4 models ranking obtained when analysing the ability of the same AR4 models to simulate Euro-Atlantic variability modes. This study has proved the utility of applying such a synoptic climatology approach as a diagnostic tool for model assessment. The ability of the models to properly reproduce the position of ridges and troughs and the frequency of synoptic patterns will therefore improve our confidence in the response of models to future climate changes.
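The core of such an evaluation can be sketched as follows, under simplifying assumptions (toy 2-D "daily fields", two pre-computed circulation-type centroids, Euclidean assignment): classify each day into a circulation type and compare frequency-of-occurrence statistics between a reference (ERA40-like) series and a model series.

```python
def assign_types(fields, centroids):
    """Assign each daily field to the nearest circulation-type centroid
    (Euclidean distance), as in a k-means style classification."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return [min(range(len(centroids)), key=lambda k: dist(f, centroids[k]))
            for f in fields]

def type_frequencies(labels, n_types):
    """Frequency of occurrence of each circulation type."""
    n = len(labels)
    return [labels.count(k) / n for k in range(n_types)]

# Toy 2-D "fields" and two hypothetical circulation-type centroids.
centroids = [(0.0, 0.0), (5.0, 5.0)]
era40 = [(0.2, 0.1), (4.8, 5.1), (0.1, -0.2), (5.2, 4.9)]
model = [(0.3, 0.2), (0.2, 0.1), (4.9, 5.0), (0.0, 0.1)]
f_ref = type_frequencies(assign_types(era40, centroids), 2)
f_mod = type_frequencies(assign_types(model, centroids), 2)
bias = [m - r for m, r in zip(f_mod, f_ref)]  # frequency-of-occurrence bias
print(f_ref, f_mod, bias)
```

Here the model overpopulates type 0 relative to the reference, mirroring the abstract's finding that models tend to misrepresent frequency of occurrence; real circulation types would be computed from gridded sea-level pressure or geopotential fields.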

  1. Entropy-based gene ranking without selection bias for the predictive classification of microarray data.

    PubMed

    Furlanello, Cesare; Serafini, Maria; Merler, Stefano; Jurman, Giuseppe

    2003-11-06

    We describe the E-RFE method for gene ranking, which is useful for the identification of markers in the predictive classification of array data. The method supports a practical modeling scheme designed to avoid the construction of classification rules based on the selection of too small gene subsets (an effect known as the selection bias, in which the estimated predictive errors are too optimistic due to testing on samples already considered in the feature selection process). With E-RFE, we speed up the recursive feature elimination (RFE) with SVM classifiers by eliminating chunks of uninteresting genes using an entropy measure of the SVM weights distribution. An optimal subset of genes is selected according to a two-strata model evaluation procedure: modeling is replicated by an external stratified-partition resampling scheme, and, within each run, an internal K-fold cross-validation is used for E-RFE ranking. Also, the optimal number of genes can be estimated according to the saturation of Zipf's law profiles. Without a decrease of classification accuracy, E-RFE allows a speed-up factor of 100 with respect to standard RFE, while improving on alternative parametric RFE reduction strategies. Thus, a process for gene selection and error estimation is made practical, ensuring control of the selection bias, and providing additional diagnostic indicators of gene importance.
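The entropy-driven chunk elimination at the heart of E-RFE can be illustrated in miniature. This is a hedged sketch, not the authors' implementation: the binning, the entropy threshold, and the "drop half" rule are illustrative choices standing in for the paper's exact schedule.

```python
import math

def weight_entropy(weights, bins=10):
    """Shannon entropy (bits) of the distribution of |weights|
    over equal-width bins."""
    mags = [abs(w) for w in weights]
    lo, hi = min(mags), max(mags)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for m in mags:
        idx = min(int((m - lo) / width), bins - 1)
        counts[idx] += 1
    n = len(mags)
    return -sum(c / n * math.log(c / n, 2) for c in counts if c)

def chunk_to_drop(weights, max_entropy_frac=0.5):
    """Drop many features at once while the SVM weight distribution is
    concentrated (low entropy); fall back to one-at-a-time elimination,
    as in standard RFE, when weights are spread out."""
    h = weight_entropy(weights)
    h_max = math.log(len(weights), 2)  # entropy upper bound
    if h < max_entropy_frac * h_max:
        # most weights are near zero: drop the lowest-|w| half
        ranked = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
        return ranked[: len(weights) // 2]
    return [min(range(len(weights)), key=lambda i: abs(weights[i]))]

peaked = [0.01] * 18 + [5.0, 4.0]  # a few dominant weights, many tiny ones
print(len(chunk_to_drop(peaked)))  # drops a large chunk (10 features)
```

This is what buys the reported speed-up over standard RFE: early iterations, where most genes carry negligible weight, remove whole chunks instead of single features.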

  2. An artificial intelligence based improved classification of two-phase flow patterns with feature extracted from acquired images.

    PubMed

    Shanthi, C; Pappa, N

    2017-05-01

    Flow pattern recognition is necessary to select design equations for finding operating details of the process and to perform computational simulations. Visual image processing can be used to automate the interpretation of patterns in two-phase flow. In this paper, an attempt has been made to improve the classification accuracy of the flow pattern of gas/liquid two-phase flow using fuzzy logic and Support Vector Machine (SVM) with Principal Component Analysis (PCA). The videos of six different types of flow patterns, namely annular flow, bubble flow, churn flow, plug flow, slug flow and stratified flow, are recorded for a period and converted to 2D images for processing. The textural and shape features extracted using image processing are applied as inputs to various classification schemes, namely fuzzy logic, SVM and SVM with PCA, in order to identify the type of flow pattern. The results obtained are compared, and it is observed that SVM with features reduced using PCA gives better classification accuracy and is less computationally intensive than the other two schemes. The results of this study cover industrial application needs, including oil and gas and other gas-liquid two-phase flows. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
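The PCA-then-classify pipeline can be sketched as below. Assumptions are flagged: the data are synthetic stand-ins for texture/shape features, and a nearest-class-centroid rule stands in for the SVM stage (the dimensionality-reduction step is the point being illustrated).

```python
import numpy as np

rng = np.random.default_rng(0)

def pca_fit(X, k):
    """Principal components: top-k eigenvectors of the covariance matrix."""
    mean = X.mean(axis=0)
    cov = np.cov((X - mean).T)
    vals, vecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    comps = vecs[:, ::-1][:, :k]      # top-k directions
    return mean, comps

def pca_transform(X, mean, comps):
    return (X - mean) @ comps

def nearest_mean_classify(Xtr, ytr, Xte):
    """Minimal stand-in for the SVM stage: nearest class centroid."""
    labels = sorted(set(ytr))
    cents = {c: Xtr[np.array(ytr) == c].mean(axis=0) for c in labels}
    return [min(labels, key=lambda c: np.linalg.norm(x - cents[c]))
            for x in Xte]

# Two synthetic "flow pattern" classes in a 5-D texture-feature space.
X0 = rng.normal(0.0, 0.3, (30, 5))
X1 = rng.normal(2.0, 0.3, (30, 5))
X = np.vstack([X0, X1])
y = [0] * 30 + [1] * 30
mean, comps = pca_fit(X, 2)           # reduce 5-D features to 2-D
Z = pca_transform(X, mean, comps)
pred = nearest_mean_classify(Z, y, Z)
print(sum(p == t for p, t in zip(pred, y)) / len(y))  # near 1.0
```

Reducing the feature space before classification is what makes the PCA variant less computationally intensive while, as the abstract reports, preserving (or improving) accuracy.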

  3. The Adam Walsh Act: An Examination of Sex Offender Risk Classification Systems.

    PubMed

    Zgoba, Kristen M; Miner, Michael; Levenson, Jill; Knight, Raymond; Letourneau, Elizabeth; Thornton, David

    2016-12-01

    This study was designed to compare the Adam Walsh Act (AWA) classification tiers with actuarial risk assessment instruments and existing state classification schemes in their respective abilities to identify sex offenders at high risk to re-offend. Data from 1,789 adult sex offenders released from prison in four states were collected (Minnesota, New Jersey, Florida, and South Carolina). On average, the sexual recidivism rate was approximately 5% at 5 years and 10% at 10 years. AWA Tier 2 offenders had higher Static-99R scores and higher recidivism rates than Tier 3 offenders, and in Florida, these inverse correlations were statistically significant. Actuarial measures and existing state tier systems, in contrast, did a better job of identifying high-risk offenders and recidivists. As well, we examined the distribution of risk assessment scores within and across tier categories, finding that a majority of sex offenders fall into AWA Tier 3, but more than half score low or moderately low on the Static-99R. The results indicate that the AWA sex offender classification scheme is a poor indicator of relative risk and is likely to result in a system that is less effective in protecting the public than those currently implemented in the states studied. © The Author(s) 2015.

  4. The Libraries of Rio.

    ERIC Educational Resources Information Center

    Foster, Barbara

    1988-01-01

    Describes aspects of several libraries in Rio de Janeiro. Topics covered include library policies, budgets, periodicals and books in the collections, classification schemes used, and literary areas of interest to patrons. (6 references) (CLB)

  5. Object Detection and Classification by Decision-Level Fusion for Intelligent Vehicle Systems.

    PubMed

    Oh, Sang-Il; Kang, Hang-Bong

    2017-01-22

    To understand driving environments effectively, it is important to achieve accurate detection and classification of objects detected by sensor-based intelligent vehicle systems. Object detection is performed for the localization of objects, whereas object classification recognizes object classes from detected object regions. For accurate object detection and classification, fusing information from multiple sensors into the representation and perception processes is necessary. In this paper, we propose a new object-detection and classification method using decision-level fusion. We fuse the classification outputs from independent unary classifiers, such as 3D point clouds and image data, using a convolutional neural network (CNN). The unary classifiers for the two sensors are CNNs with five layers, which use more than two pre-trained convolutional layers to consider local to global features as data representation. To represent data using convolutional layers, we apply region of interest (ROI) pooling to the outputs of each layer on the object candidate regions generated using object proposal generation to realize color flattening and semantic grouping for charge-coupled device and Light Detection And Ranging (LiDAR) sensors. We evaluate our proposed method on a KITTI benchmark dataset to detect and classify three object classes: cars, pedestrians and cyclists. The evaluation results show that the proposed method achieves better performance than the previous methods. Our proposed method extracted approximately 500 proposals on a 1226 × 370 image, whereas the original selective search method extracted approximately 10^6 × n proposals. We obtained classification performance with 77.72% mean average precision over the entirety of the classes in the moderate detection level of the KITTI benchmark dataset.
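Decision-level fusion itself reduces to combining per-class scores from the independent unary classifiers. A minimal sketch, with hypothetical confidence values and equal weights standing in for the paper's learned fusion:

```python
def fuse_decisions(scores_a, scores_b, w_a=0.5, w_b=0.5):
    """Decision-level fusion: weighted average of per-class scores from
    two independent unary classifiers, then argmax over classes."""
    fused = {c: w_a * scores_a[c] + w_b * scores_b[c] for c in scores_a}
    return max(fused, key=fused.get), fused

# Hypothetical per-class confidences for one detected region.
lidar_scores = {"car": 0.5, "pedestrian": 0.3, "cyclist": 0.2}
image_scores = {"car": 0.2, "pedestrian": 0.7, "cyclist": 0.1}
label, fused = fuse_decisions(lidar_scores, image_scores)
print(label)  # -> pedestrian
```

The design point is that each sensor's classifier can be trained and tuned independently; fusion only touches their output distributions, not their internal representations.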

  7. Hydrometeorological application of an extratropical cyclone classification scheme in the southern United States

    NASA Astrophysics Data System (ADS)

    Senkbeil, J. C.; Brommer, D. M.; Comstock, I. J.; Loyd, T.

    2012-07-01

    Extratropical cyclones (ETCs) in the southern United States are often overlooked when compared with tropical cyclones in the region and ETCs in the northern United States. Although southern ETCs are significant weather events, there is currently no operational scheme for identifying and discussing these nameless storms. In this research, we classified 84 ETCs (1970-2009). We manually identified five distinct formation regions and seven unique ETC types using statistical classification. Statistical classification employed principal components analysis and two methods of cluster analysis. Both manual and statistical storm types generally showed positive (negative) relationships with El Niño (La Niña). Manual storm types displayed precipitation swaths consistent with discrete storm tracks, which further legitimizes the existence of multiple modes of southern ETCs. Statistical storm types also displayed unique precipitation intensity swaths, but these swaths were less indicative of track location. It is hoped that by classifying southern ETCs into types, forecasters, hydrologists, and broadcast meteorologists will be able to better anticipate projected amounts of precipitation at their locations.

  8. Relevance popularity: A term event model based feature selection scheme for text classification.

    PubMed

    Feng, Guozhong; An, Baiguo; Yang, Fengqin; Wang, Han; Zhang, Libiao

    2017-01-01

    Feature selection is a practical approach for improving the performance of text classification methods by optimizing the feature subsets input to classifiers. In traditional feature selection methods such as information gain and chi-square, the number of documents that contain a particular term (i.e. the document frequency) is often used. However, the frequency of a given term appearing in each document has not been fully investigated, even though it is a promising feature to produce accurate classifications. In this paper, we propose a new feature selection scheme based on a term-event multinomial naive Bayes probabilistic model. According to the model assumptions, the matching score function, which is based on the prediction probability ratio, can be factorized. Finally, we derive a feature selection measurement for each term after replacing inner parameters by their estimators. On a benchmark English text dataset (20 Newsgroups) and a Chinese text dataset (MPH-20), numerical experiments with two widely used text classifiers (naive Bayes and support vector machine) demonstrate that our method outperforms representative feature selection methods.
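The flavor of a term-event (term-frequency-aware) selection measure can be sketched as follows. This is an illustrative score, not the paper's exact factorized derivation: terms are ranked by the absolute log-ratio of their smoothed class-conditional term probabilities under a multinomial model.

```python
import math
from collections import Counter

def term_scores(docs_pos, docs_neg):
    """Rank terms by |log P(t|pos) - log P(t|neg)| under a term-event
    multinomial model with add-one smoothing. Unlike document-frequency
    measures, every occurrence of a term in a document contributes."""
    pos, neg = Counter(), Counter()
    for d in docs_pos:
        pos.update(d.split())
    for d in docs_neg:
        neg.update(d.split())
    vocab = set(pos) | set(neg)
    n_pos, n_neg = sum(pos.values()), sum(neg.values())
    scores = {t: abs(math.log((pos[t] + 1) / (n_pos + len(vocab)))
                     - math.log((neg[t] + 1) / (n_neg + len(vocab))))
              for t in vocab}
    return sorted(scores, key=scores.get, reverse=True)

ranked = term_scores(
    ["goal match striker", "match goal referee"],
    ["election vote senate", "vote election policy"],
)
print(ranked[:4])  # the four class-discriminative terms rank first
```

Terms that occur repeatedly in one class and rarely in the other get high scores, which is the intuition the abstract contrasts with plain document-frequency-based measures.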

  9. Fuzzy Classification of Ocean Color Satellite Data for Bio-optical Algorithm Constituent Retrievals

    NASA Technical Reports Server (NTRS)

    Campbell, Janet W.

    1998-01-01

    The ocean has been traditionally viewed as a two-class system. Morel and Prieur (1977) classified ocean water according to the dominant absorbent particle suspended in the water column. Case 1 is described as having a high concentration of phytoplankton (and detritus) relative to other particles. Conversely, case 2 is described as having inorganic particles such as suspended sediments in high concentrations. Little work has gone into the problem of mixing bio-optical models for these different water types. An approach is put forth here to blend bio-optical algorithms based on a fuzzy classification scheme. This scheme involves two procedures. First, a clustering procedure identifies classes and builds class statistics from in-situ optical measurements. Next, a classification procedure assigns satellite pixels partial memberships to these classes based on their ocean color reflectance signature. These membership assignments can be used as the basis for weighting retrievals from class-specific bio-optical algorithms. This technique is demonstrated with in-situ optical measurements and an image from the SeaWiFS ocean color satellite.
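The blending step can be sketched with a fuzzy c-means style membership function. Everything here is a toy stand-in: 2-D "reflectance" features, two class centers, and constant class-specific retrieval algorithms; real memberships would come from the in-situ class statistics.

```python
def fuzzy_memberships(x, centers, m=2.0):
    """Fuzzy c-means style memberships: inverse-distance weights to each
    class center, normalized so the partial memberships sum to 1."""
    dists = [max(sum((a - b) ** 2 for a, b in zip(x, c)) ** 0.5, 1e-12)
             for c in centers]
    inv = [d ** (-2.0 / (m - 1)) for d in dists]
    s = sum(inv)
    return [v / s for v in inv]

def blended_retrieval(x, centers, algorithms):
    """Blend class-specific algorithm outputs by membership weight."""
    u = fuzzy_memberships(x, centers)
    return sum(w * alg(x) for w, alg in zip(u, algorithms))

# Hypothetical 2-D reflectance feature with two water-class centers.
centers = [(0.0, 0.0), (1.0, 1.0)]   # e.g. case 1 vs. case 2 cluster means
case1_alg = lambda x: 10.0           # toy class-specific retrievals
case2_alg = lambda x: 20.0
print(blended_retrieval((0.1, 0.1), centers, [case1_alg, case2_alg]))
```

A pixel close to the case 1 center is retrieved almost entirely by the case 1 algorithm, while pixels between the classes get a smooth mixture, which is the point of fuzzy rather than hard classification.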

  10. Segmentation and object-oriented classification of wetlands in a karst Florida landscape using multi-season Landsat-7 ETM+ Imagery

    EPA Science Inventory

    Segmentation and object-oriented processing of single-season and multi-season Landsat-7 ETM+ data was utilized for the classification of wetlands in a 1560 km2 study area of north central Florida. This segmentation and object-oriented classification outperformed the traditional ...

  11. Evaluation of management measures of software development. Volume 1: Analysis summary

    NASA Technical Reports Server (NTRS)

    Page, J.; Card, D.; Mcgarry, F.

    1982-01-01

    The conceptual model, the data classification scheme, and the analytic procedures are explained. The analytic results are summarized and specific software measures for collection and monitoring are recommended.

  12. A review of supervised object-based land-cover image classification

    NASA Astrophysics Data System (ADS)

    Ma, Lei; Li, Manchun; Ma, Xiaoxue; Cheng, Liang; Du, Peijun; Liu, Yongxue

    2017-08-01

    Object-based image classification for land-cover mapping purposes using remote-sensing imagery has attracted significant attention in recent years. Numerous studies conducted over the past decade have investigated a broad array of sensors, feature selection, classifiers, and other factors of interest. However, these research results have not yet been synthesized to provide coherent guidance on the effect of different supervised object-based land-cover classification processes. In this study, we first construct a database with 28 fields using qualitative and quantitative information extracted from 254 experimental cases described in 173 scientific papers. Second, the results of the meta-analysis are reported, including general characteristics of the studies (e.g., the geographic range of relevant institutes, preferred journals) and the relationships between factors of interest (e.g., spatial resolution and study area or optimal segmentation scale, accuracy and number of targeted classes), especially with respect to the classification accuracy of different sensors, segmentation scale, training set size, supervised classifiers, and land-cover types. Third, useful data on supervised object-based image classification are determined from the meta-analysis. For example, we find that supervised object-based classification is currently experiencing rapid advances, while development of the fuzzy technique is limited in the object-based framework. Furthermore, spatial resolution correlates with the optimal segmentation scale and study area, and Random Forest (RF) shows the best performance in object-based classification. The area-based accuracy assessment method can obtain stable classification performance, and indicates a strong correlation between accuracy and training set size, while the accuracy of the point-based method is likely to be unstable due to mixed objects. 
In addition, the overall accuracy benefits from higher spatial resolution images (e.g., unmanned aerial vehicle) or agricultural sites where it also correlates with the number of targeted classes. More than 95.6% of studies involve an area less than 300 ha, and the spatial resolution of images is predominantly between 0 and 2 m. Furthermore, we identify some methods that may advance supervised object-based image classification. For example, deep learning and type-2 fuzzy techniques may further improve classification accuracy. Lastly, scientists are strongly encouraged to report results of uncertainty studies to further explore the effects of varied factors on supervised object-based image classification.

  13. SVM feature selection based rotation forest ensemble classifiers to improve computer-aided diagnosis of Parkinson disease.

    PubMed

    Ozcift, Akin

    2012-08-01

    Parkinson disease (PD) is an age-related deterioration of certain nerve systems, which affects movement, balance, and muscle control of clients. PD is one of the common diseases which affect 1% of people older than 60 years. A new classification scheme based on support vector machine (SVM) selected features to train rotation forest (RF) ensemble classifiers is presented for improving diagnosis of PD. The dataset contains records of voice measurements from 31 people, 23 with PD, and each record in the dataset is defined by 22 features. The diagnosis model first makes use of a linear SVM to select the ten most relevant of the 22 features. As a second step of the classification model, six different classifiers are trained with the subset of features. Subsequently, at the third step, the accuracies of classifiers are improved by the utilization of an RF ensemble classification strategy. The results of the experiments are evaluated using three metrics: classification accuracy (ACC), Kappa Error (KE) and Area under the Receiver Operating Characteristic (ROC) Curve (AUC). Performance measures of two base classifiers, i.e. KStar and IBk, demonstrated an apparent increase in PD diagnosis accuracy compared to similar studies in the literature. Overall, application of the RF ensemble classification scheme significantly improved PD diagnosis for 5 of the 6 classifiers. Numerically, we obtained about 97% accuracy with an RF ensemble of the IBk (a K-Nearest Neighbor variant) algorithm, which is quite high performance for Parkinson disease diagnosis.
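Two building blocks of this scheme can be sketched simply: ranking features by the absolute weights of a (given) linear model, and combining several classifiers' predictions by vote. The weights and predictions below are hypothetical, and majority voting stands in for the rotation forest's more elaborate ensemble construction.

```python
from collections import Counter

def top_k_by_weight(weights, k):
    """Rank features by absolute weight of a trained linear model and
    keep the k most relevant, as in SVM-based feature selection."""
    return sorted(range(len(weights)), key=lambda i: abs(weights[i]),
                  reverse=True)[:k]

def majority_vote(predictions):
    """Combine per-classifier label lists (one list per classifier) by
    majority vote over each sample."""
    return [Counter(col).most_common(1)[0][0] for col in zip(*predictions)]

# Hypothetical linear-SVM weights over 22 voice features: keep the top 10.
w = [0.02, 1.3, -0.9, 0.05, 0.7, -0.1, 0.6, 0.01, -1.1, 0.3,
     0.04, 0.8, -0.2, 0.9, 0.03, -0.5, 0.4, 0.06, -0.07, 1.0, 0.2, -0.6]
selected = top_k_by_weight(w, 10)
# Hypothetical predictions from three ensemble members on three samples.
preds = [["pd", "healthy", "pd"], ["pd", "pd", "pd"], ["healthy", "pd", "pd"]]
print(majority_vote(preds))  # -> ['pd', 'pd', 'pd']
```

Feature selection shrinks the 22-dimensional voice-measurement space before training, and ensembling then smooths out individual classifiers' errors, the two effects the abstract credits for the accuracy gain.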

  14. Object based image analysis for the classification of the growth stages of Avocado crop, in Michoacán State, Mexico

    NASA Astrophysics Data System (ADS)

    Gao, Yan; Marpu, Prashanth; Morales Manila, Luis M.

    2014-11-01

    This paper assesses the suitability of 8-band Worldview-2 (WV2) satellite data and an object-based random forest algorithm for the classification of avocado growth stages in Mexico. We tested both pixel-based classification with minimum distance (MD) and maximum likelihood (MLC) and object-based classification with the Random Forest (RF) algorithm for this task. Training samples and verification data were selected by visually interpreting the WV2 images for seven thematic classes: fully grown, middle stage, and early stage of avocado crops, bare land, two types of natural forests, and water body. To examine the contribution of the four new spectral bands of the WV2 sensor, all the tested classifications were carried out with and without the four new spectral bands. Classification accuracy assessment results show that object-based classification with the RF algorithm obtained higher overall accuracy (93.06%) than the pixel-based MD (69.37%) and MLC (64.03%) methods. For both pixel-based and object-based methods, the classifications with the four new spectral bands obtained higher accuracy than those without (object-based RF: 93.06% vs. 83.59%; pixel-based MD: 69.37% vs. 67.2%; pixel-based MLC: 64.03% vs. 36.05%), suggesting that the four new spectral bands of the WV2 sensor contributed to the increase in classification accuracy.

  15. Synthesis and size classification of metal oxide nanoparticles for biomedical applications

    NASA Astrophysics Data System (ADS)

    Atsumi, Takashi; Jeyadevan, Balachandran; Sato, Yoshinori; Tamura, Kazuchika; Aiba, Setsuya; Tohji, Kazuyuki

    2004-12-01

    Magnetic nanoparticles are considered for biomedical applications, such as the medium in magnetic resonance imaging, hyperthermia, drug delivery, and the purification or classification of DNA or viruses. The performance of magnetic nanoparticles in biomedical applications such as hyperthermia depends strongly on their magnetic properties, size, and size distribution. We briefly describe the basic idea behind their use in drug delivery, magnetic separation, and hyperthermia and discuss the prerequisite properties of magnetic particles for biomedical applications. Finally, we report a synthesis and size-classification scheme to prepare magnetite (Fe3O4) nanoparticles with a narrow size distribution for magnetic fluid hyperthermia.

  16. Development and application of operational techniques for the inventory and monitoring of resources and uses for the Texas coastal zone

    NASA Technical Reports Server (NTRS)

    Harwood, P. (Principal Investigator); Finley, R.; Mcculloch, S.; Marphy, D.; Hupp, B.

    1976-01-01

    The author has identified the following significant results. Image interpretation mapping techniques were successfully applied to test site 5, an area with a semi-arid climate. The land cover/land use classification required further modification. A new program, HGROUP, added to the ADP classification schedule provides a convenient method for examining the spectral similarity between classes. This capability greatly simplifies the task of combining 25-30 unsupervised subclasses into about 15 major classes that approximately correspond to the land use/land cover classification scheme.

  17. Significance of clustering and classification applications in digital and physical libraries

    NASA Astrophysics Data System (ADS)

    Triantafyllou, Ioannis; Koulouris, Alexandros; Zervos, Spiros; Dendrinos, Markos; Giannakopoulos, Georgios

    2015-02-01

    Applications of clustering and classification techniques can prove very significant in both digital and physical (paper-based) libraries. The most essential application, document classification and clustering, is crucial for the content that is produced and maintained in digital libraries, repositories, databases, social media, blogs etc., based on various tags and ontology elements, transcending the traditional library-oriented classification schemes. Other applications with a very useful and beneficial role in the new digital library environment involve document routing, summarization and query expansion. Paper-based libraries can benefit as well, since classification combined with advanced material characterization techniques such as FTIR (Fourier Transform InfraRed spectroscopy) can be vital for the study and prevention of material deterioration. An improved two-level self-organizing clustering architecture is proposed in order to enhance the discrimination capacity of the learning space, prior to classification, yielding promising results when applied to the above mentioned library tasks.

  18. Initial study of Schroedinger eigenmaps for spectral target detection

    NASA Astrophysics Data System (ADS)

    Dorado-Munoz, Leidy P.; Messinger, David W.

    2016-08-01

    Spectral target detection refers to the process of searching for a specific material with a known spectrum over a large area containing materials with different spectral signatures. Traditional target detection methods in hyperspectral imagery (HSI) require assuming that the data fit some statistical or geometric model and, based on that model, estimating parameters to define a hypothesis test, where one class (i.e., the target class) is chosen over the other classes (i.e., the background class). Nonlinear manifold learning methods such as Laplacian eigenmaps (LE) have extensively shown their potential use in HSI processing, specifically in classification or segmentation. Recently, Schroedinger eigenmaps (SE), which is built upon LE, has been introduced as a semisupervised classification method. In SE, the former Laplacian operator is replaced by the Schroedinger operator. The Schroedinger operator includes, by definition, a potential term V that steers the transformation in certain directions, improving the separability between classes. In this regard, we propose a methodology for target detection that is not based on the traditional schemes and that does not need the estimation of statistical or geometric parameters. This method is based on SE, where the potential term V is taken into consideration to include the prior knowledge about the target class and to steer the transformation in directions where the target location in the new space is known and the separability between target and background is augmented. An initial study of how SE can be used in a target detection scheme for HSI is shown here. In-scene pixel and spectral signature detection approaches are presented. The HSI data used comprise various target panels for testing simultaneous detection of multiple objects with different complexities.
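The Schroedinger operator construction can be sketched on a tiny graph. This is a minimal illustration, not the paper's detection pipeline: a 4-node chain stands in for a pixel similarity graph, and the diagonal potential V encodes prior knowledge that node 0 is the target.

```python
import numpy as np

def schroedinger_embedding(W, potential, alpha=1.0, dim=2):
    """Eigendecomposition of L + alpha*V, where L = D - W is the graph
    Laplacian and V is a diagonal potential term encoding prior target
    knowledge; eigenvalues are returned in ascending order."""
    D = np.diag(W.sum(axis=1))
    L = D - W
    V = np.diag(potential)
    vals, vecs = np.linalg.eigh(L + alpha * V)
    return vals, vecs[:, :dim]

# Tiny 4-node chain graph; node 0 is the presumed target pixel.
W = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
v = np.array([1.0, 0.0, 0.0, 0.0])  # potential only on the target node
vals_le, _ = schroedinger_embedding(W, np.zeros(4))  # plain Laplacian eigenmaps
vals_se, emb = schroedinger_embedding(W, v)
print(vals_le[0], vals_se[0])  # ~0 for LE vs. strictly positive for SE
```

With no potential the smallest eigenvalue is zero (the constant eigenvector of a connected graph); a nonzero potential breaks that degeneracy, which is the mechanism by which V "steers" the embedding relative to the target.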

  19. Characterizing the degree of convective clustering using radar reflectivity and its application to evaluating model simulations

    NASA Astrophysics Data System (ADS)

    Cheng, W. Y.; Kim, D.; Rowe, A.; Park, S.

    2017-12-01

    Despite the impact of mesoscale convective organization on the properties of convection (e.g., mixing between updrafts and environment), parameterizing the degree of convective organization has only recently been attempted in cumulus parameterization schemes (e.g., Unified Convection Scheme UNICON). Additionally, challenges remain in determining the degree of convective organization from observations and in comparing directly with the organization metrics in model simulations. This study addresses the need to objectively quantify the degree of mesoscale convective organization using high quality S-PolKa radar data from the DYNAMO field campaign. One of the most noticeable aspects of mesoscale convective organization in radar data is the degree of convective clustering, which can be characterized by the number and size distribution of convective echoes and the distance between them. We propose a method of defining contiguous convective echoes (CCEs) using precipitating convective echoes identified by a rain type classification algorithm. Two classification algorithms, Steiner et al. (1995) and Powell et al. (2016), are tested and evaluated against high-resolution WRF simulations to determine which method better represents the degree of convective clustering. Our results suggest that the CCEs based on Powell et al.'s algorithm better represent the dynamical properties of the convective updrafts and thus provide the basis of a metric for convective organization. Furthermore, through a comparison with the observational data, the WRF simulations driven by the DYNAMO large-scale forcing, similarly applied to UNICON Single Column Model simulations, will allow us to evaluate the ability of both WRF and UNICON to simulate convective clustering. This evaluation is based on the physical processes that are explicitly represented in WRF and UNICON, including the mechanisms leading to convective clustering, and the feedback to the convective properties.

  20. Automated connectionist-geostatistical classification as an approach to identify sea ice and land ice types, properties and provinces

    NASA Astrophysics Data System (ADS)

    Goetz-Weiss, L. R.; Herzfeld, U. C.; Trantow, T.; Hunke, E. C.; Maslanik, J. A.; Crocker, R. I.

    2016-12-01

    An important problem in model-data comparison is the identification of parameters that can be extracted from observational data as well as used in numerical models, which are typically based on idealized physical processes. Here, we present a suite of approaches to characterization and classification of sea ice and land ice types, properties and provinces based on several types of remote-sensing data. Applications are given not only to illustrate the approach, but to employ it in model evaluation and in understanding physical processes. (1) In a geostatistical characterization, spatial sea-ice properties in the Chukchi and Beaufort Seas and in Elson Lagoon are derived from analysis of RADARSAT and ERS-2 SAR data. (2) The analysis is taken further by utilizing multi-parameter feature vectors as inputs for unsupervised and supervised statistical classification, which facilitates classification of different sea-ice types. (3) Characteristic sea-ice parameters resulting from the classification can then be applied in model evaluation, as demonstrated for the ridging scheme of the Los Alamos sea ice model, CICE, using high-resolution altimeter and image data collected from unmanned aircraft over Fram Strait during the Characterization of Arctic Sea Ice Experiment (CASIE). The characteristic parameters chosen in this application are directly related to deformation processes, which also underlie the ridging scheme. (4) The method capable of the most complex classification tasks is the connectionist-geostatistical classification method. This approach has been developed to identify currently up to 18 different crevasse types in order to map progression of the surge through the complex Bering-Bagley Glacier System, Alaska, in 2011-2014. The analysis utilizes airborne altimeter data, video image data and satellite image data. Results of the crevasse classification are compared to fracture modeling and found to match.

  1. Optimization of breast mass classification using sequential forward floating selection (SFFS) and a support vector machine (SVM) model

    PubMed Central

    Tan, Maxine; Pu, Jiantao; Zheng, Bin

    2014-01-01

    Purpose: Improving radiologists’ performance in classification between malignant and benign breast lesions is important to increase cancer detection sensitivity and reduce false-positive recalls. For this purpose, developing computer-aided diagnosis (CAD) schemes has been attracting research interest in recent years. In this study, we investigated a new feature selection method for the task of breast mass classification. Methods: We initially computed 181 image features based on mass shape, spiculation, contrast, presence of fat or calcifications, texture, isodensity, and other morphological features. From this large image feature pool, we used a sequential forward floating selection (SFFS)-based feature selection method to select relevant features, and analyzed their performance using a support vector machine (SVM) model trained for the classification task. On a database of 600 benign and 600 malignant mass regions of interest (ROIs), we performed the study using a ten-fold cross-validation method. Feature selection and optimization of the SVM parameters were conducted on the training subsets only. Results: An area under the receiver operating characteristic curve (AUC) of 0.805±0.012 was obtained for the classification task. The results also showed that the features most frequently selected by the SFFS-based algorithm across the ten cross-validation folds were those related to mass shape, isodensity and presence of fat, which are consistent with the image features frequently used by radiologists in the clinical environment for mass classification. The study also indicated that accurately computing mass spiculation features from the projection mammograms was difficult, and that these features performed poorly for the mass classification task due to tissue overlap within the benign mass regions. 
Conclusions: In conclusion, this comprehensive feature analysis study provided new and valuable information for optimizing computerized mass classification schemes that may have potential to be useful as a “second reader” in future clinical practice. PMID:24664267
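
    A hedged sketch of the wrapper-style selection the abstract describes, using scikit-learn on synthetic data. Note that scikit-learn's `SequentialFeatureSelector` implements plain forward selection without the floating (backtracking) step of true SFFS, and all dataset parameters here are invented:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for the mass-ROI feature matrix (the real study
# used 181 morphology/texture features over 1200 ROIs).
X, y = make_classification(n_samples=200, n_features=20,
                           n_informative=4, random_state=0)

svm = SVC(kernel="rbf", C=1.0)

# Plain sequential forward selection: scikit-learn does not implement
# the "floating" backtracking step of true SFFS, so this simplifies
# the paper's selector.
sfs = SequentialFeatureSelector(svm, n_features_to_select=4,
                                direction="forward", cv=5)
sfs.fit(X, y)
selected = np.flatnonzero(sfs.get_support())

# Score the selected subset with cross-validation, mirroring the
# paper's practice of tuning only on training folds.
acc = cross_val_score(svm, X[:, selected], y, cv=10).mean()
print(selected, round(acc, 3))
```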

  2. Analysis of parenchymal patterns using conspicuous spatial frequency features in mammograms applied to the BI-RADS density rating scheme

    NASA Astrophysics Data System (ADS)

    Perconti, Philip; Loew, Murray

    2006-03-01

    Automatic classification of the density of breast parenchyma is shown using a measure that is correlated with human observer performance, and compared against the BI-RADS density rating. Increasingly popular in the United States, the Breast Imaging Reporting and Data System (BI-RADS) is used to draw attention to the increased screening difficulty associated with greater breast density; however, the BI-RADS rating scheme is subjective and is not intended as an objective measure of breast density. Thus, while popular, BI-RADS does not define density classes using a standardized measure, which leads to increased variability among observers. The adaptive thresholding technique is a more quantitative approach for assessing percentage breast density, but considerable reader interaction is required. We calculate an objective density rating that is derived using a measure of local feature salience. Previously, this measure was shown to correlate well with radiologists' localization and discrimination of true-positive and true-negative regions of interest. Using conspicuous spatial frequency features, an objective density rating is obtained and correlated with adaptive thresholding and the subjectively ascertained BI-RADS density ratings. Using 100 cases obtained from the University of South Florida's DDSM database, we show that an automated breast density measure can be derived that is correlated with the interactive thresholding method for continuous percentage breast density, but not with the BI-RADS density rating categories for the selected cases. Comparison between interactive thresholding and the new salience percentage density resulted in a Pearson correlation of 76.7%. Using a four-category scale equivalent to the BI-RADS density categories, a Spearman correlation coefficient of 79.8% was found.
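
    The two correlation statistics reported above can be computed mechanically. The data below are synthetic stand-ins (not the DDSM cases), meant only to show how a continuous thresholding density, a salience-style density, and a binned four-category rating would be compared:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-case density scores for 100 cases: an interactive
# thresholding percentage and an automated salience-based percentage
# that tracks it with some noise.
threshold_density = rng.uniform(5, 80, size=100)
salience_density = threshold_density + rng.normal(0, 10, size=100)

# A BI-RADS-style four-category rating obtained by hard binning.
birads = np.digitize(threshold_density, bins=[25, 50, 75]) + 1

# Pearson for the two continuous measures, Spearman against the
# ordinal categories, as in the abstract's comparison.
r_pearson, _ = stats.pearsonr(threshold_density, salience_density)
rho, _ = stats.spearmanr(birads, salience_density)
print(round(r_pearson, 2), round(rho, 2))
```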

  3. Hydrological Climate Classification: Can We Improve on Köppen-Geiger?

    NASA Astrophysics Data System (ADS)

    Knoben, W.; Woods, R. A.; Freer, J. E.

    2017-12-01

    Classification is essential in the study of complex natural systems, yet hydrology so far has no formal way to structure the climate forcing which underlies hydrologic response. Various climate classification systems can be borrowed from other disciplines but these are based on different organizing principles than a hydrological classification might use. From gridded global data we calculate a gridded aridity index, an aridity seasonality index and a rain-vs-snow index, which we use to cluster global locations into climate groups. We then define the membership degree of nearly 1100 catchments to each of our climate groups based on each catchment's climate and investigate the extent to which streamflow responses within each climate group are similar. We compare this climate classification approach with the often-used Köppen-Geiger classification, using statistical tests based on streamflow signature values. We find that three climate indices are sufficient to distinguish 18 different climate types world-wide. Climates tend to change gradually in space and catchments can thus belong to multiple climate groups, albeit with different degrees of membership. Streamflow responses within a climate group tend to be similar, regardless of the catchments' geographical proximity. A Wilcoxon two-sample test based on streamflow signature values for each climate group shows that the new classification can distinguish different flow regimes using this classification scheme. The Köppen-Geiger approach uses 29 climate classes but is less able to differentiate streamflow regimes. Climate forcing exerts a strong control on typical hydrologic response and both change gradually in space. This makes arbitrary hard boundaries in any classification scheme difficult to defend. Any hydrological classification should thus acknowledge these gradual changes in forcing. 
Catchment characteristics (soil or vegetation type, land use, etc.) can vary more quickly in space than climate does, which can explain streamflow differences between geographically close locations. Summarizing, this work shows that hydrology needs its own way to structure climate forcing, acknowledging that climates vary gradually on a global scale and explicitly including those climate aspects that drive seasonal changes in hydrologic regimes.
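
    A minimal sketch of the classification idea: cluster locations on the three climate indices, then assign fuzzy membership degrees. The synthetic indices, the use of k-means, and the inverse-distance membership rule are all assumptions for illustration, not the authors' exact method:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic grid cells described by three climate indices in the spirit
# of the abstract: aridity, aridity seasonality, and a rain-vs-snow index.
arid = np.concatenate([rng.normal(0.3, 0.05, 50), rng.normal(1.5, 0.10, 50)])
season = rng.uniform(0, 1, 100)
snow = np.concatenate([rng.normal(0.0, 0.02, 50), rng.normal(0.6, 0.05, 50)])
X = np.column_stack([arid, season, np.clip(snow, 0, 1)])

# A few iterations of plain k-means on the index vectors (a stand-in
# for whatever clustering the authors used). Deterministic init: one
# seed point from each synthetic regime.
k = 2
centers = X[[0, 50]].copy()
for _ in range(20):
    labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([X[labels == j].mean(0) for j in range(k)])

# Fuzzy degree of membership via inverse-distance weights, echoing the
# finding that catchments can belong to several climate groups at once.
d = np.sqrt(((X[:, None] - centers) ** 2).sum(-1)) + 1e-9
membership = (1 / d) / (1 / d).sum(1, keepdims=True)
print(np.bincount(labels), membership.sum(1)[:3])
```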

  4. Types of Crude Oil

    EPA Pesticide Factsheets

    The petroleum industry often classifies these types by geographical source, but the classification scheme here is more useful in a spill cleanup scenario. It indicates general toxicity, physical state, and changes caused by time and weathering.

  5. Transport on Riemannian manifold for functional connectivity-based classification.

    PubMed

    Ng, Bernard; Dressler, Martin; Varoquaux, Gaël; Poline, Jean Baptiste; Greicius, Michael; Thirion, Bertrand

    2014-01-01

    We present a Riemannian approach for classifying fMRI connectivity patterns before and after intervention in longitudinal studies. A fundamental difficulty with using connectivity as features is that covariance matrices live on the positive semi-definite cone, which renders their elements inter-related. The implicit independent feature assumption in most classifier learning algorithms is thus violated. In this paper, we propose a matrix whitening transport for projecting the covariance estimates onto a common tangent space to reduce the statistical dependencies between their elements. We show on real data that our approach provides significantly higher classification accuracy than directly using Pearson's correlation. We further propose a non-parametric scheme for identifying significantly discriminative connections from classifier weights. Using this scheme, a number of neuroanatomically meaningful connections are found, whereas no significant connections are detected with pure permutation testing.
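
    The tangent-space projection the abstract describes can be sketched as whitening each covariance by a reference point and taking a matrix logarithm. The Euclidean-mean reference and the toy covariances below are simplifying assumptions:

```python
import numpy as np

def inv_sqrtm(C):
    # Inverse matrix square root of an SPD matrix via eigendecomposition.
    w, V = np.linalg.eigh(C)
    return (V / np.sqrt(w)) @ V.T

def logm_spd(C):
    # Matrix logarithm of an SPD matrix.
    w, V = np.linalg.eigh(C)
    return (V * np.log(w)) @ V.T

rng = np.random.default_rng(0)

# Toy SPD "connectivity" matrices: sample covariances of short signals.
covs = []
for _ in range(6):
    A = rng.normal(size=(50, 4))
    covs.append(A.T @ A / 50)

# Reference point for the tangent space: here simply the Euclidean
# mean of the covariances (one common choice among several).
C_ref = sum(covs) / len(covs)
W = inv_sqrtm(C_ref)

# Whiten each covariance by the reference, then take the matrix log.
# The results live in a common vector space where the inter-element
# dependencies of the SPD cone are much weaker.
tangent = [logm_spd(W @ C @ W) for C in covs]

# Vectorize upper triangles into classifier-ready feature vectors.
iu = np.triu_indices(4)
features = np.array([T[iu] for T in tangent])
print(features.shape)
```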

  6. VTOL shipboard letdown guidance system analysis

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.; Karmali, M. S.

    1983-01-01

    Alternative letdown guidance strategies are examined for landing of a VTOL aircraft onboard a small aviation ship under adverse environmental conditions. Off-line computer simulation of the shipboard landing task is utilized for assessing the relative merits of the proposed guidance schemes. The touchdown performance of a nominal constant rate of descent (CROD) letdown strategy serves as a benchmark for ranking the performance of the alternative letdown schemes. Analysis of ship motion time histories indicates the existence of an alternating sequence of quiescent and rough motions called lulls and swells. A real-time algorithm for lull/swell classification based upon ship motion pattern features is developed. The classification algorithm is used to command a go/no-go signal to indicate the initiation and termination of an acceptable landing window. Simulation results show that such a go/no-go pattern-based letdown guidance strategy improves touchdown performance.
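
    The lull/swell go/no-go idea can be sketched with a moving-variance feature on synthetic heave data; the signal model, window length, and threshold are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic ship heave sampled at 2 Hz: 120 s blocks alternating
# between quiescent "lulls" (small amplitude) and rough "swells".
t = np.arange(0, 600, 0.5)
amplitude = np.where((t // 120) % 2 == 0, 0.2, 1.0)
heave = amplitude * np.sin(2 * np.pi * 0.1 * t) + rng.normal(0, 0.05, t.size)

# A moving mean-square of recent motion drives the go/no-go flag:
# low recent motion energy opens an acceptable landing window.
win = 60                        # 60 samples = 30 s at 2 Hz
energy = np.convolve(heave ** 2, np.ones(win) / win, mode="same")
go = energy < 0.1               # hypothetical energy threshold

# With three lull blocks out of five, the window should be open
# roughly half the time (edges are blurred by the moving window).
print(round(go.mean(), 2))
```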

  7. Heart Rate Variability Dynamics for the Prognosis of Cardiovascular Risk

    PubMed Central

    Ramirez-Villegas, Juan F.; Lam-Espinosa, Eric; Ramirez-Moreno, David F.; Calvo-Echeverry, Paulo C.; Agredo-Rodriguez, Wilfredo

    2011-01-01

    Statistical, spectral, multi-resolution and non-linear methods were applied to heart rate variability (HRV) series linked with classification schemes for the prognosis of cardiovascular risk. A total of 90 HRV records were analyzed: 45 from healthy subjects and 45 from cardiovascular risk patients. A total of 52 features from all the analysis methods were evaluated using the standard two-sample Kolmogorov-Smirnov test (KS-test). The results of the statistical procedure provided input to multi-layer perceptron (MLP) neural networks, radial basis function (RBF) neural networks and support vector machines (SVM) for data classification. These schemes showed high performance with both training and test sets and many combinations of features (with a maximum accuracy of 96.67%). Additionally, the results highlighted breathing frequency as a relevant feature in HRV analysis. PMID:21386966
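
    The KS-test screening step can be illustrated in a few lines; the synthetic two-group "features" below stand in for the 52 HRV-derived features:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(2)

# Synthetic HRV-style features: 45 healthy vs 45 at-risk subjects.
# Feature 0 differs between groups; feature 1 is pure noise.
healthy = np.column_stack([rng.normal(50, 5, 45), rng.normal(0, 1, 45)])
at_risk = np.column_stack([rng.normal(35, 5, 45), rng.normal(0, 1, 45)])

# Screen each feature with the two-sample Kolmogorov-Smirnov test and
# keep those whose distributions differ significantly between groups;
# the surviving features would feed the MLP/RBF/SVM classifiers.
keep = []
for j in range(healthy.shape[1]):
    stat, p = ks_2samp(healthy[:, j], at_risk[:, j])
    if p < 0.05:
        keep.append(j)
print(keep)
```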

  8. [Who benefits from systemic therapy with a reflecting team?].

    PubMed

    Höger, C; Temme, M; Geiken, G

    1994-03-01

    In an evaluation study we investigated the effectiveness of the reflecting team approach compared to eclectic child psychiatric treatment in an outpatient setting and the indications for each type of treatment. The relationship between treatment outcome and diagnostic data obtained with the Multi-axial Classification Scheme was examined in 22 families treated with the reflecting team approach and in a second group of families matched on all important sociodemographic and diagnostic variables but receiving eclectic treatment. No difference was found between the two groups regarding symptom improvement or changes in family functioning. Regarding satisfaction with treatment, the reflecting team approach was superior to the eclectic modality. In the reflecting team group parental mental disorder and inadequate intra-familial communication (according to the new fifth axis of the Multi-axial Classification Scheme) had a negative effect on outcome.

  9. Adaptive skin detection based on online training

    NASA Astrophysics Data System (ADS)

    Zhang, Ming; Tang, Liang; Zhou, Jie; Rong, Gang

    2007-11-01

    Skin is a widely used cue for porn image classification. Most conventional methods are off-line training schemes. They usually use a fixed boundary to segment skin regions in the images and are effective only under restricted conditions, e.g., good lighting and a single skin tone. This paper presents an adaptive online training scheme for skin detection which can handle these tough cases. In our approach, skin detection is considered as a classification problem on a Gaussian mixture model. For each image, the human face is detected and the face color is used to establish a primary estimate of the skin color distribution. Then an adaptive online training algorithm is used to find the real boundary between skin color and background color in the current image. Experimental results on 450 images showed that the proposed method is more robust in general situations than the conventional ones.
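
    A minimal sketch of the face-seeded skin model: pixels are scored by Mahalanobis distance to a Gaussian fitted on face pixels. The 2-channel chroma space, the synthetic pixel clusters, and the fixed distance cutoff are illustrative assumptions (the paper refines the boundary by online training):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy image in a 2-channel chroma space: skin pixels cluster around
# (150, 120), background around (80, 200).
skin = rng.normal([150, 120], 8, size=(300, 2))
background = rng.normal([80, 200], 8, size=(700, 2))
pixels = np.vstack([skin, background])

# "Face" pixels (from a detector) seed the skin-color model, echoing
# the paper's per-image primary estimate instead of a fixed boundary.
face = rng.normal([150, 120], 8, size=(100, 2))
mu = face.mean(0)
cov = np.cov(face.T)

# Classify by Mahalanobis distance to the face-seeded Gaussian; the
# cutoff (here 3) is what the online training step would refine.
inv = np.linalg.inv(cov)
d2 = np.einsum("ij,jk,ik->i", pixels - mu, inv, pixels - mu)
is_skin = d2 < 3.0 ** 2

recall = is_skin[:300].mean()      # fraction of true skin recovered
false_pos = is_skin[300:].mean()   # background wrongly called skin
print(round(recall, 2), round(false_pos, 2))
```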

  10. Fault diagnosis for analog circuits utilizing time-frequency features and improved VVRKFA

    NASA Astrophysics Data System (ADS)

    He, Wei; He, Yigang; Luo, Qiwu; Zhang, Chaolong

    2018-04-01

    This paper proposes a novel scheme for analog circuit fault diagnosis utilizing features extracted from the time-frequency representations of signals and an improved vector-valued regularized kernel function approximation (VVRKFA). First, the cross-wavelet transform is employed to yield the energy-phase distribution of the fault signals over the time and frequency domain. Since the distribution is high-dimensional, a supervised dimensionality reduction technique—the bilateral 2D linear discriminant analysis—is applied to build a concise feature set from the distributions. Finally, VVRKFA is utilized to locate the fault. In order to improve the classification performance, the quantum-behaved particle swarm optimization technique is employed to gradually tune the learning parameter of the VVRKFA classifier. The experimental results for the analog circuit faults classification have demonstrated that the proposed diagnosis scheme has an advantage over other approaches.

  11. A Support Vector Machine-Based Gender Identification Using Speech Signal

    NASA Astrophysics Data System (ADS)

    Lee, Kye-Hwan; Kang, Sang-Ick; Kim, Deok-Hwan; Chang, Joon-Hyuk

    We propose an effective voice-based gender identification method using a support vector machine (SVM). The SVM is a binary classification algorithm that classifies two groups by finding a nonlinear decision boundary in a feature space and is known to yield high classification performance. In the present work, we compare the identification performance of the SVM with that of a Gaussian mixture model (GMM)-based method using the mel-frequency cepstral coefficients (MFCC). A novel approach incorporating a feature fusion scheme based on a combination of the MFCC and the fundamental frequency is proposed with the aim of improving the performance of gender identification. Experimental results demonstrate that the gender identification performance using the SVM is significantly better than that of the GMM-based scheme. Moreover, the performance is substantially improved when the proposed feature fusion technique is applied.
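
    The feature-fusion idea (appending the fundamental frequency to cepstral features before the SVM) can be sketched on synthetic data. The class-dependent F0 distributions and all other numbers are assumptions, and scikit-learn's `SVC` stands in for the authors' SVM:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(4)

# Synthetic stand-ins: 12 "MFCC" dimensions that separate the classes
# weakly, plus a fundamental-frequency (F0) feature that separates
# them strongly (roughly 120 Hz vs 210 Hz, a typical male/female split).
n = 200
y = rng.integers(0, 2, n)
mfcc = rng.normal(0.0, 1.0, (n, 12)) + 0.3 * y[:, None]
f0 = np.where(y == 1, rng.normal(210, 25, n), rng.normal(120, 25, n))

# Feature fusion: append (standardized) F0 to the cepstral features.
fused = np.column_stack([mfcc, (f0 - f0.mean()) / f0.std()])

acc_mfcc = cross_val_score(SVC(), mfcc, y, cv=5).mean()
acc_fused = cross_val_score(SVC(), fused, y, cv=5).mean()
print(round(acc_mfcc, 2), round(acc_fused, 2))
```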

  12. A data set for evaluating the performance of multi-class multi-object video tracking

    NASA Astrophysics Data System (ADS)

    Chakraborty, Avishek; Stamatescu, Victor; Wong, Sebastien C.; Wigley, Grant; Kearney, David

    2017-05-01

    One of the challenges in evaluating multi-object video detection, tracking and classification systems is having publicly available data sets with which to compare different systems. However, the measures of performance for tracking and classification are different. Data sets that are suitable for evaluating tracking systems may not be appropriate for classification. Tracking video data sets typically only have ground truth track IDs, while classification video data sets only have ground truth class-label IDs. The former identifies the same object over multiple frames, while the latter identifies the type of object in individual frames. This paper describes an advancement of the ground truth meta-data for the DARPA Neovision2 Tower data set to allow the evaluation of both tracking and classification. The ground truth data sets presented in this paper contain unique object IDs across 5 different classes of object (Car, Bus, Truck, Person, Cyclist) for 24 videos of 871 image frames each. In addition to the object IDs and class labels, the ground truth data also contains the original bounding box coordinates together with new bounding boxes in instances where un-annotated objects were present. The unique IDs are maintained during occlusions between multiple objects or when objects re-enter the field of view. This will provide: a solid foundation for evaluating the performance of multi-object tracking of different types of objects, a straightforward comparison of tracking system performance using the standard Multi Object Tracking (MOT) framework, and classification performance using the Neovision2 metrics. These data have been hosted publicly.

  13. Synthesis of Potential Trypanocides

    DTIC Science & Technology

    1987-12-01

    Structural variations included introduction of a -CH=CH- group between the phenyl ring and its 4'-substituent (ring structures 2-4), and incorporation of heteroaromatic rings, including imidazole, thiazole, and pyridine, into ether-linked and vinyl-linked structures (Scheme 1).

  14. Some Complexity Results About Packet Radio Networks

    DTIC Science & Technology

    1983-03-01

    Fig. 1. Situations in a PRN for which (c,d) conflicts with (a,b).

  15. High-order asynchrony-tolerant finite difference schemes for partial differential equations

    NASA Astrophysics Data System (ADS)

    Aditya, Konduri; Donzis, Diego A.

    2017-12-01

    Synchronizations of processing elements (PEs) in massively parallel simulations, which arise due to communication or load imbalances between PEs, significantly affect the scalability of scientific applications. We have recently proposed a method based on finite-difference schemes to solve partial differential equations in an asynchronous fashion - synchronization between PEs is relaxed at a mathematical level. While standard schemes can maintain their stability in the presence of asynchrony, their accuracy is drastically affected. In this work, we present a general methodology to derive asynchrony-tolerant (AT) finite difference schemes of arbitrary order of accuracy, which can maintain their accuracy when synchronizations are relaxed. We show that there are several choices available in selecting a stencil to derive these schemes and discuss their effect on numerical and computational performance. We provide a simple classification of schemes based on the stencil and derive schemes that are representative of different classes. Their numerical error is rigorously analyzed within a statistical framework to obtain the overall accuracy of the solution. Results from numerical experiments are used to validate the performance of the schemes.
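
    The core machinery behind deriving finite-difference schemes of a chosen order, including asynchrony-tolerant ones, is solving a small Taylor-moment system for the stencil weights. A sketch, with an arbitrary wider stencil as an example (not one of the paper's actual AT stencils):

```python
import numpy as np
from math import factorial

def fd_weights(stencil, order):
    """Weights w with sum(w_j * f(x0 + s_j*h)) / h**order ~ f^(order)(x0).

    Solves the Taylor-moment system sum_j w_j * s_j**k = order! for
    k == order and 0 otherwise, for k = 0..len(stencil)-1.
    """
    s = np.asarray(stencil, dtype=float)
    V = np.vander(s, increasing=True).T   # V[k, j] = s_j**k
    d = np.zeros(len(s))
    d[order] = factorial(order)
    return np.linalg.solve(V, d)

# Second derivative on the standard symmetric stencil {-1, 0, 1}.
w = fd_weights([-1, 0, 1], 2)             # classic [1, -2, 1]

# A wider stencil of the kind AT derivations draw on when some
# neighbor values are delayed and extra points restore the order.
w_wide = fd_weights([-2, -1, 0, 2], 2)

# Accuracy check on f = sin at x0 = 0.5 (exact f'' is -sin(x0)).
x0, h = 0.5, 1e-3
approx = sum(wi * np.sin(x0 + si * h) for wi, si in zip(w, [-1, 0, 1])) / h**2
err = abs(approx - (-np.sin(x0)))
print(w, err)
```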

  16. Multilevel Green's function interpolation method for scattering from composite metallic and dielectric objects.

    PubMed

    Shi, Yan; Wang, Hao Gang; Li, Long; Chan, Chi Hou

    2008-10-01

    A multilevel Green's function interpolation method based on two kinds of multilevel partitioning schemes (the quasi-2D and the hybrid partitioning scheme) is proposed for analyzing electromagnetic scattering from objects comprising both conducting and dielectric parts. The problem is formulated using the surface integral equation for homogeneous dielectric and conducting bodies. A quasi-2D multilevel partitioning scheme is devised to improve the efficiency of the Green's function interpolation. In contrast to previous multilevel partitioning schemes, noncubic groups are introduced to discretize the whole EM structure in this quasi-2D multilevel partitioning scheme. Based on the detailed analysis of the dimension of the group in this partitioning scheme, a hybrid quasi-2D/3D multilevel partitioning scheme is proposed to effectively handle objects with fine local structures. Selection criteria for some key parameters relating to the interpolation technique are given. The proposed algorithm is ideal for the solution of problems involving objects such as missiles, microstrip antenna arrays, photonic bandgap structures, etc. Numerical examples are presented to show that CPU time is between O(N) and O(N log N) while the computer memory requirement is O(N).

  17. [Evaluation of traditional pathological classification at molecular classification era for gastric cancer].

    PubMed

    Yu, Yingyan

    2014-01-01

    Histopathological classification is in a pivotal position in both basic research and clinical diagnosis and treatment of gastric cancer. Currently, there are different classification systems in basic science and clinical application. In the medical literature, different classifications are used, including the Lauren and WHO systems, which has confused many researchers. The Lauren classification was proposed half a century ago, but is still used worldwide. It shows many advantages: it is simple, easy to apply, and of prognostic significance. The WHO classification scheme is better than the Lauren classification in that it is continuously revised according to progress in gastric cancer research, and it is routinely used in clinical and pathological diagnosis. Along with progress in genomics, transcriptomics, proteomics and metabolomics research, molecular classification of gastric cancer has become a topic of intense current interest. The traditional therapeutic approach based on phenotypic characteristics of gastric cancer will most likely be replaced with one based on gene variation. Gene-targeted therapy directed at the same molecular variation seems more reasonable than traditional chemical treatment based on the same morphological change.

  18. An Automated Scheme for the Large-Scale Survey of Herbig-Haro Objects

    NASA Astrophysics Data System (ADS)

    Deng, Licai; Yang, Ji; Zheng, Zhongyuan; Jiang, Zhaoji

    2001-04-01

    Owing to their spectral properties, Herbig-Haro (HH) objects can be discovered using photometric methods through a combination of filters, sampling the characteristic spectral lines and the nearby continuum. The data are commonly processed through direct visual inspection of the images. To make data reduction more efficient and the results more uniform and complete, an automated searching scheme for HH objects is developed to manipulate the images using IRAF. This approach helps to extract images with only intrinsic HH emissions. By using this scheme, the pointlike stellar sources and extended nebulous sources with continuum emission can be eliminated from the original images. The objects with only characteristic HH emission become prominent and can be easily picked up. In this paper our scheme is illustrated by a sample field and has been applied to our surveys for HH objects.
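
    The photometric search can be sketched as continuum subtraction followed by thresholding: sources present in both the emission-line and continuum frames cancel, leaving pure line emitters. The toy frames below are synthetic and the 5-sigma cutoff is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy survey field: a narrowband (line) frame and a nearby-continuum
# frame. Stars show up in both; a pure-emission HH knot only in the
# line frame.
shape = (64, 64)
sky = rng.normal(10.0, 1.0, shape)
star = np.zeros(shape); star[20, 20] = 200.0
hh_knot = np.zeros(shape); hh_knot[40, 45] = 80.0

line_frame = sky + star + hh_knot + rng.normal(0, 0.5, shape)
cont_frame = sky + star + rng.normal(0, 0.5, shape)

# Continuum subtraction: stellar and continuum-nebula sources cancel,
# leaving intrinsic line emission, which a simple threshold flags.
diff = line_frame - cont_frame
candidates = np.argwhere(diff > 5 * diff.std())
print(candidates)
```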

  19. A Review of Major Nursing Vocabularies and the Extent to Which They Have the Characteristics Required for Implementation in Computer-based Systems

    PubMed Central

    Henry, Suzanne Bakken; Warren, Judith J.; Lange, Linda; Button, Patricia

    1998-01-01

    Building on the work of previous authors, the Computer-based Patient Record Institute (CPRI) Work Group on Codes and Structures has described features of a classification scheme for implementation within a computer-based patient record. The authors of the current study reviewed the evaluation literature related to six major nursing vocabularies (the North American Nursing Diagnosis Association Taxonomy 1, the Nursing Interventions Classification, the Nursing Outcomes Classification, the Home Health Care Classification, the Omaha System, and the International Classification for Nursing Practice) to determine the extent to which the vocabularies include the CPRI features. None of the vocabularies met all criteria. The Omaha System, Home Health Care Classification, and International Classification for Nursing Practice each included five features. Criteria not fully met by any systems were clear and non-redundant representation of concepts, administrative cross-references, syntax and grammar, synonyms, uncertainty, context-free identifiers, and language independence. PMID:9670127

  20. Auto-SEIA: simultaneous optimization of image processing and machine learning algorithms

    NASA Astrophysics Data System (ADS)

    Negro Maggio, Valentina; Iocchi, Luca

    2015-02-01

    Object classification from images is an important task for machine vision and it is a crucial ingredient for many computer vision applications, ranging from security and surveillance to marketing. Image based object classification techniques properly integrate image processing and machine learning (i.e., classification) procedures. In this paper we present a system for automatic simultaneous optimization of algorithms and parameters for object classification from images. More specifically, the proposed system is able to process a dataset of labelled images and to return a best configuration of image processing and classification algorithms and of their parameters with respect to the accuracy of classification. Experiments with real public datasets are used to demonstrate the effectiveness of the developed system.
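
    In the same spirit, though far simpler than Auto-SEIA, a single scikit-learn grid search can jointly optimize a processing step and classifier parameters for accuracy; the pipeline, dataset, and grids here are assumptions:

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

# Stand-in dataset of labelled "images" (flattened feature vectors).
X, y = make_classification(n_samples=150, n_features=30, random_state=0)

# One search over both the processing step (PCA dimensionality) and
# the classifier hyperparameter, optimizing accuracy jointly.
pipe = Pipeline([("reduce", PCA()), ("clf", SVC())])
grid = {"reduce__n_components": [5, 10, 20],
        "clf__C": [0.1, 1.0, 10.0]}
search = GridSearchCV(pipe, grid, cv=5).fit(X, y)
print(search.best_params_, round(search.best_score_, 2))
```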
