A Two-Stream Deep Fusion Framework for High-Resolution Aerial Scene Classification.
Yu, Yunlong; Liu, Fuxian
2018-01-01
One of the challenging problems in understanding high-resolution remote sensing images is aerial scene classification. A well-designed feature representation method and classifier can improve classification accuracy. In this paper, we construct a new two-stream deep architecture for aerial scene classification. First, we use two pretrained convolutional neural networks (CNNs) as feature extractors to learn deep features from the original aerial image and from the aerial image processed by saliency detection, respectively. Second, two feature fusion strategies are adopted to fuse the two different types of deep convolutional features extracted by the original RGB stream and the saliency stream. Finally, we use the extreme learning machine (ELM) classifier for final classification with the fused features. The effectiveness of the proposed architecture is tested on four challenging datasets: the UC-Merced dataset with 21 scene categories, the WHU-RS dataset with 19 scene categories, the AID dataset with 30 scene categories, and the NWPU-RESISC45 dataset with 45 challenging scene categories. The experimental results demonstrate that our architecture achieves a significant classification accuracy improvement over all state-of-the-art references. PMID:29581722
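A minimal sketch of the fusion-plus-ELM stage described in the abstract above, assuming pre-extracted CNN features for the two streams (random arrays stand in for them here), concatenation as one of the two fusion strategies, and a basic extreme learning machine (random hidden layer plus a ridge-regression readout). This illustrates the general recipe only, not the authors' implementation.

    import numpy as np

    rng = np.random.default_rng(0)

    def elm_fit(X, y, n_hidden=500, reg=1e-3):
        """Train a basic extreme learning machine: random hidden layer + ridge readout."""
        n_classes = y.max() + 1
        W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights, kept fixed
        b = rng.normal(size=n_hidden)
        H = np.tanh(X @ W + b)                        # hidden activations
        T = np.eye(n_classes)[y]                      # one-hot targets
        beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ T)
        return W, b, beta

    def elm_predict(X, W, b, beta):
        return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

    # Stand-ins for the two deep feature streams (in the paper these come from
    # pretrained CNNs applied to the RGB image and its saliency-processed version).
    rgb_feats = rng.normal(size=(200, 4096))
    sal_feats = rng.normal(size=(200, 4096))
    labels = rng.integers(0, 21, size=200)            # e.g. the 21 UC-Merced classes

    fused = np.concatenate([rgb_feats, sal_feats], axis=1)   # concatenation fusion
    W, b, beta = elm_fit(fused, labels)
    print("train accuracy:", (elm_predict(fused, W, b, beta) == labels).mean())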
32 CFR 2001.14 - Classification challenges.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 6 2013-07-01 2013-07-01 false Classification challenges. 2001.14 Section 2001... Classification § 2001.14 Classification challenges. (a) Challenging classification. Authorized holders, including authorized holders outside the classifying agency, who want to challenge the classification status of...
32 CFR 2001.14 - Classification challenges.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 6 2014-07-01 2014-07-01 false Classification challenges. 2001.14 Section 2001... Classification § 2001.14 Classification challenges. (a) Challenging classification. Authorized holders, including authorized holders outside the classifying agency, who want to challenge the classification status of...
32 CFR 2001.14 - Classification challenges.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 6 2012-07-01 2012-07-01 false Classification challenges. 2001.14 Section 2001... Classification § 2001.14 Classification challenges. (a) Challenging classification. Authorized holders, including authorized holders outside the classifying agency, who want to challenge the classification status of...
32 CFR 2001.14 - Classification challenges.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 6 2011-07-01 2011-07-01 false Classification challenges. 2001.14 Section 2001... Classification § 2001.14 Classification challenges. (a) Challenging classification. Authorized holders, including authorized holders outside the classifying agency, who want to challenge the classification status of...
On Classification in the Study of Failure, and a Challenge to Classifiers
NASA Technical Reports Server (NTRS)
Wasson, Kimberly S.
2003-01-01
Classification schemes are abundant in the literature of failure. They serve a number of purposes, some more successfully than others. We examine several classification schemes constructed for various purposes relating to failure and its investigation, and discuss their values and limits. The analysis results in a continuum of uses for classification schemes that suggests that the value of certain properties of these schemes is dependent on the goals a classification is designed to forward. The contrast in the value of different properties for different uses highlights a particular shortcoming: we argue that while humans are good at developing one kind of scheme (dynamic, flexible classifications used for exploratory purposes), we are not so good at developing another (static, rigid classifications used to trap and organize data for specific analytic goals). Our lack of a strong foundation for developing valid instantiations of the latter impedes progress toward a number of investigative goals. This shortcoming and its consequences pose a challenge to researchers in the study of failure: to develop new methods for constructing and validating static classification schemes of demonstrable value in promoting the goals of investigations. We note current productive activity in this area, and outline foundations for more.
Cunningham, Barbara Jane; Hidecker, Mary Jo Cooley; Thomas-Stonell, Nancy; Rosenbaum, Peter
2018-05-01
In this paper, we present our experiences - both successes and challenges - in implementing evidence-based classification tools into clinical practice. We also make recommendations for others wanting to promote the uptake and application of new research-based assessment tools. We first describe classification systems and the benefits of using them in both research and practice. We then present a theoretical framework from Implementation Science to report strategies we have used to implement two research-based classification tools into practice. We also illustrate some of the challenges we have encountered by reporting results from an online survey investigating 58 speech-language pathologists' knowledge and use of the Communication Function Classification System (CFCS), a new tool to classify children's functional communication skills. We offer recommendations for researchers wanting to promote the uptake of new tools in clinical practice. Specifically, we identify structural, organizational, innovation, practitioner, and patient-related factors that we recommend researchers address in the design of implementation interventions. Roles and responsibilities of both researchers and clinicians in making implementation science a success are presented. Implications for rehabilitation: Promoting uptake of new and evidence-based tools into clinical practice is challenging. Implementation science can help researchers to close the knowledge-to-practice gap. Using concrete examples, we discuss our experiences in implementing evidence-based classification tools into practice within a theoretical framework. Recommendations are provided for researchers wanting to implement new tools in clinical practice. Implications for researchers and clinicians are presented.
Tacke, Ulrich; den Hollander, Bjørnar; Simojoki, Kaarlo; Korpi, Esa R; Pihlainen, Katja; Alho, Hannu
2011-01-01
Designer drugs are synthetic psychotropic drugs which are marketed as "legal drugs". Their emergence, rapid spread and unpredictable effects have challenged health care and substance abuse services. The slow process of classifying an abusable drug has left too many opportunities for designer drugs to spread. Once a certain substance receives an illegal drug classification, dealers and users usually move to another, slightly different molecule that is still legal. In Finland, the Narcotics Act has been amended to the effect that the addition of a new substance to the illegal drug list does not require an amendment to the law.
49 CFR 8.17 - Classification challenges.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 1 2011-10-01 2011-10-01 false Classification challenges. 8.17 Section 8.17 Transportation Office of the Secretary of Transportation CLASSIFIED INFORMATION: CLASSIFICATION/DECLASSIFICATION/ACCESS Classification/Declassification of Information § 8.17 Classification challenges. (a) Authorized...
49 CFR 8.17 - Classification challenges.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 49 Transportation 1 2013-10-01 2013-10-01 false Classification challenges. 8.17 Section 8.17 Transportation Office of the Secretary of Transportation CLASSIFIED INFORMATION: CLASSIFICATION/DECLASSIFICATION/ACCESS Classification/Declassification of Information § 8.17 Classification challenges. (a) Authorized...
49 CFR 8.17 - Classification challenges.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 49 Transportation 1 2014-10-01 2014-10-01 false Classification challenges. 8.17 Section 8.17 Transportation Office of the Secretary of Transportation CLASSIFIED INFORMATION: CLASSIFICATION/DECLASSIFICATION/ACCESS Classification/Declassification of Information § 8.17 Classification challenges. (a) Authorized...
49 CFR 8.17 - Classification challenges.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 49 Transportation 1 2012-10-01 2012-10-01 false Classification challenges. 8.17 Section 8.17 Transportation Office of the Secretary of Transportation CLASSIFIED INFORMATION: CLASSIFICATION/DECLASSIFICATION/ACCESS Classification/Declassification of Information § 8.17 Classification challenges. (a) Authorized...
49 CFR 8.17 - Classification challenges.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 1 2010-10-01 2010-10-01 false Classification challenges. 8.17 Section 8.17 Transportation Office of the Secretary of Transportation CLASSIFIED INFORMATION: CLASSIFICATION/DECLASSIFICATION/ACCESS Classification/Declassification of Information § 8.17 Classification challenges. (a) Authorized...
28 CFR 17.30 - Classification challenges.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Classification challenges. 17.30 Section... ACCESS TO CLASSIFIED INFORMATION Classified Information § 17.30 Classification challenges. (a) Authorized... is improperly classified or unclassified are encouraged and expected to challenge the classification...
28 CFR 17.30 - Classification challenges.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 28 Judicial Administration 1 2013-07-01 2013-07-01 false Classification challenges. 17.30 Section... ACCESS TO CLASSIFIED INFORMATION Classified Information § 17.30 Classification challenges. (a) Authorized... is improperly classified or unclassified are encouraged and expected to challenge the classification...
28 CFR 17.30 - Classification challenges.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 28 Judicial Administration 1 2012-07-01 2012-07-01 false Classification challenges. 17.30 Section... ACCESS TO CLASSIFIED INFORMATION Classified Information § 17.30 Classification challenges. (a) Authorized... is improperly classified or unclassified are encouraged and expected to challenge the classification...
28 CFR 17.30 - Classification challenges.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 28 Judicial Administration 1 2011-07-01 2011-07-01 false Classification challenges. 17.30 Section... ACCESS TO CLASSIFIED INFORMATION Classified Information § 17.30 Classification challenges. (a) Authorized... is improperly classified or unclassified are encouraged and expected to challenge the classification...
28 CFR 17.30 - Classification challenges.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Classification challenges. 17.30 Section... ACCESS TO CLASSIFIED INFORMATION Classified Information § 17.30 Classification challenges. (a) Authorized... is improperly classified or unclassified are encouraged and expected to challenge the classification...
Stones, Catherine; Knapp, Peter; Closs, S Jose
2016-01-01
This article discusses the challenges of visually representing pain qualities in pictogram design. An existing set of 12 pictograms designed for people with literacy problems was evaluated to understand more about misunderstandings of pictogram interpretation. Two sets of university students from different disciplines were asked to interpret the pictograms, and a novel classification system was developed to categorise answer types, as ‘location’, ‘affective’, ‘temporal’ or ‘literal’. Several design recommendations are made as a result that will help improve the design of pain pictograms as a whole as well as guide designers of related pictogram work. We demonstrate how, through the robust classification of incorrect responses, it is possible to extract useful comprehension error patterns to inform future design. PMID:27867507
6 CFR 7.30 - Classification challenges.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 6 Domestic Security 1 2012-01-01 2012-01-01 false Classification challenges. 7.30 Section 7.30... INFORMATION Classified Information § 7.30 Classification challenges. (a) Authorized holders of information... classified are encouraged and expected to challenge the classification status of that information pursuant to...
6 CFR 7.30 - Classification challenges.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 6 Domestic Security 1 2013-01-01 2013-01-01 false Classification challenges. 7.30 Section 7.30... INFORMATION Classified Information § 7.30 Classification challenges. (a) Authorized holders of information... classified are encouraged and expected to challenge the classification status of that information pursuant to...
6 CFR 7.30 - Classification challenges.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 6 Domestic Security 1 2014-01-01 2014-01-01 false Classification challenges. 7.30 Section 7.30... INFORMATION Classified Information § 7.30 Classification challenges. (a) Authorized holders of information... classified are encouraged and expected to challenge the classification status of that information pursuant to...
6 CFR 7.30 - Classification challenges.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 6 Domestic Security 1 2010-01-01 2010-01-01 false Classification challenges. 7.30 Section 7.30... INFORMATION Classified Information § 7.30 Classification challenges. (a) Authorized holders of information... classified are encouraged and expected to challenge the classification status of that information pursuant to...
6 CFR 7.30 - Classification challenges.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 6 Domestic Security 1 2011-01-01 2011-01-01 false Classification challenges. 7.30 Section 7.30... INFORMATION Classified Information § 7.30 Classification challenges. (a) Authorized holders of information... classified are encouraged and expected to challenge the classification status of that information pursuant to...
22 CFR 9.8 - Classification challenges.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 22 Foreign Relations 1 2012-04-01 2012-04-01 false Classification challenges. 9.8 Section 9.8 Foreign Relations DEPARTMENT OF STATE GENERAL SECURITY INFORMATION REGULATIONS § 9.8 Classification... classification status is improper are expected and encouraged to challenge the classification status of the...
22 CFR 9.8 - Classification challenges.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 22 Foreign Relations 1 2013-04-01 2013-04-01 false Classification challenges. 9.8 Section 9.8 Foreign Relations DEPARTMENT OF STATE GENERAL SECURITY INFORMATION REGULATIONS § 9.8 Classification... classification status is improper are expected and encouraged to challenge the classification status of the...
22 CFR 9.8 - Classification challenges.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 22 Foreign Relations 1 2014-04-01 2014-04-01 false Classification challenges. 9.8 Section 9.8 Foreign Relations DEPARTMENT OF STATE GENERAL SECURITY INFORMATION REGULATIONS § 9.8 Classification... classification status is improper are expected and encouraged to challenge the classification status of the...
22 CFR 9.8 - Classification challenges.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Classification challenges. 9.8 Section 9.8 Foreign Relations DEPARTMENT OF STATE GENERAL SECURITY INFORMATION REGULATIONS § 9.8 Classification... classification status is improper are expected and encouraged to challenge the classification status of the...
22 CFR 9.8 - Classification challenges.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Classification challenges. 9.8 Section 9.8 Foreign Relations DEPARTMENT OF STATE GENERAL SECURITY INFORMATION REGULATIONS § 9.8 Classification... classification status is improper are expected and encouraged to challenge the classification status of the...
Real-time classification of vehicles by type within infrared imagery
NASA Astrophysics Data System (ADS)
Kundegorski, Mikolaj E.; Akçay, Samet; Payen de La Garanderie, Grégoire; Breckon, Toby P.
2016-10-01
Real-time classification of vehicles into sub-category types poses a significant challenge within infra-red imagery due to the high levels of intra-class variation in thermal vehicle signatures caused by aspects of design, current operating duration and ambient thermal conditions. Despite these challenges, infra-red sensing offers significant generalized target object detection advantages in terms of all-weather operation and invariance to visual camouflage techniques. This work investigates the accuracy of a number of real-time object classification approaches for this task within the wider context of an existing initial object detection and tracking framework. Specifically we evaluate the use of traditional feature-driven bag of visual words and histogram of oriented gradient classification approaches against modern convolutional neural network architectures. Furthermore, we use classical photogrammetry, within the context of current target detection and classification techniques, as a means of approximating 3D target position within the scene based on this vehicle type classification. Based on photogrammetric estimation of target position, we then illustrate the use of regular Kalman filter based tracking operating on actual 3D vehicle trajectories. Results are presented using a conventional thermal-band infra-red (IR) sensor arrangement where targets are tracked over a range of evaluation scenarios.
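The "regular Kalman filter based tracking operating on actual 3D vehicle trajectories" mentioned above can be illustrated with a generic constant-velocity filter over photogrammetric position estimates; the time step and noise covariances below are assumptions for the sketch, not values from the paper.

    import numpy as np

    dt = 0.1                                    # frame interval in seconds (assumed)
    F = np.eye(6)                               # constant-velocity state transition
    F[:3, 3:] = dt * np.eye(3)                  # position += velocity * dt
    H = np.hstack([np.eye(3), np.zeros((3, 3))])    # only the 3D position is observed
    Q = 0.01 * np.eye(6)                        # process noise (assumed)
    R = 0.5 * np.eye(3)                         # photogrammetric measurement noise (assumed)

    x = np.zeros(6)                             # state: [x, y, z, vx, vy, vz]
    P = np.eye(6)

    def kalman_step(x, P, z):
        """One predict/update cycle for a single 3D position measurement z."""
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(6) - K @ H) @ P
        return x, P

    # Example: track a target moving at 1 m/s along x with noisy position estimates.
    rng = np.random.default_rng(1)
    for t in range(50):
        z = np.array([t * dt * 1.0, 0.0, 0.0]) + rng.normal(scale=0.5, size=3)
        x, P = kalman_step(x, P, z)
    print("estimated velocity:", x[3:])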
Solt, Illés; Tikk, Domonkos; Gál, Viktor; Kardkovács, Zsolt T.
2009-01-01
Objective: Automated and disease-specific classification of textual clinical discharge summaries is of great importance in human life science, as it helps physicians conduct medical studies by providing statistically relevant data for analysis. This can be further facilitated if, at the labeling of discharge summaries, semantic labels are also extracted from text, such as whether a given disease is present, absent, questionable in a patient, or is unmentioned in the document. The authors present a classification technique that successfully solves the semantic classification task. Design: The authors introduce a context-aware rule-based semantic classification technique for use on clinical discharge summaries. The classification is performed in successive steps. First, some misleading parts are removed from the text; then the text is partitioned into positive, negative, and uncertain context segments; then a sequence of binary classifiers is applied to assign the appropriate semantic labels. Measurement: For evaluation the authors used the documents of the i2b2 Obesity Challenge and adopted its evaluation measures, F1-macro and F1-micro. Results: On the two subtasks of the Obesity Challenge (textual and intuitive classification) the system performed very well, achieved an F1-macro = 0.80 for the textual and F1-macro = 0.67 for the intuitive tasks, and obtained second place at the textual and first place at the intuitive subtasks of the challenge. Conclusions: The authors show in the paper that a simple rule-based classifier can tackle the semantic classification task more successfully than machine learning techniques, if the training data are limited and some semantic labels are very sparse. PMID:19390101
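A toy version of the context-aware rule idea above (partitioning text by negation and uncertainty cues before assigning present/absent/questionable/unmentioned labels) is sketched below; the cue lists and precedence order are illustrative assumptions, far simpler than the published system.

    import re

    # Illustrative trigger terms; the published system uses much richer rules.
    NEGATION = ("no evidence of", "denies", "without", "no ")
    UNCERTAIN = ("possible", "questionable", "rule out", "suspected")

    def classify_mention(sentence, disease):
        """Label one disease mention: present / absent / questionable / unmentioned."""
        s = sentence.lower()
        if disease not in s:
            return "unmentioned"
        if any(cue in s for cue in UNCERTAIN):
            return "questionable"
        if any(cue in s for cue in NEGATION):
            return "absent"
        return "present"

    def classify_document(text, disease):
        """Split the summary into segments and aggregate mention-level labels."""
        labels = [classify_mention(s, disease) for s in re.split(r"[.;\n]", text)]
        for label in ("present", "questionable", "absent"):   # precedence order (assumed)
            if label in labels:
                return label
        return "unmentioned"

    print(classify_document("The patient denies chest pain. Possible obesity noted.", "obesity"))
    # -> questionable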
Long-range dismount activity classification: LODAC
NASA Astrophysics Data System (ADS)
Garagic, Denis; Peskoe, Jacob; Liu, Fang; Cuevas, Manuel; Freeman, Andrew M.; Rhodes, Bradley J.
2014-06-01
Continuous classification of dismount types (including gender, age, ethnicity) and their activities (such as walking, running) evolving over space and time is challenging. Limited sensor resolution (often exacerbated as a function of platform standoff distance) and clutter from shadows in dense target environments, unfavorable environmental conditions, and the normal properties of real data all contribute to the challenge. The unique and innovative aspect of our approach is a synthesis of multimodal signal processing with incremental non-parametric, hierarchical Bayesian machine learning methods to create a new kind of target classification architecture. This architecture is designed from the ground up to optimally exploit correlations among the multiple sensing modalities (multimodal data fusion) and rapidly and continuously learns (online self-tuning) patterns of distinct classes of dismounts given little a priori information. This increases classification performance in the presence of challenges posed by anti-access/area denial (A2/AD) sensing. To fuse multimodal features, Long-range Dismount Activity Classification (LODAC) develops a novel statistical information theoretic approach for multimodal data fusion that jointly models multimodal data (i.e., a probabilistic model for cross-modal signal generation) and discovers the critical cross-modal correlations by identifying components (features) with maximal mutual information (MI) which is efficiently estimated using non-parametric entropy models. LODAC develops a generic probabilistic pattern learning and classification framework based on a new class of hierarchical Bayesian learning algorithms for efficiently discovering recurring patterns (classes of dismounts) in multiple simultaneous time series (sensor modalities) at multiple levels of feature granularity.
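The mutual-information-based feature screening that the abstract above relies on can be approximated with scikit-learn's non-parametric estimator, as in this sketch on synthetic multimodal features; the data and the two-modality split are assumptions, not the LODAC implementation.

    import numpy as np
    from sklearn.feature_selection import mutual_info_classif

    rng = np.random.default_rng(0)
    # Stand-in multimodal feature matrix: e.g. columns 0-9 from one sensor modality,
    # columns 10-19 from another.
    X = rng.normal(size=(300, 20))
    y = (X[:, 3] + 0.5 * X[:, 12] > 0).astype(int)   # label depends on features from both modalities

    mi = mutual_info_classif(X, y, random_state=0)    # non-parametric MI estimate per feature
    top = np.argsort(mi)[::-1][:5]
    print("features ranked by mutual information with the class:", top)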
Classifications, applications, and design challenges of drones: A review
NASA Astrophysics Data System (ADS)
Hassanalian, M.; Abdelkefi, A.
2017-05-01
Nowadays, there is a growing need for flying drones with diverse capabilities for both civilian and military applications. There is also a significant interest in the development of novel drones which can autonomously fly in different environments and locations and can perform various missions. In the past decade, the broad spectrum of applications of these drones has received the most attention, which has led to the invention of various types of drones with different sizes and weights. In this review paper, we identify a novel classification of flying drones that ranges from unmanned air vehicles to smart dusts at both ends of this spectrum, with their newly defined applications. Design and fabrication challenges of micro drones, existing methods for increasing their endurance, and various navigation and control approaches are discussed in detail. Limitations of the existing drones, proposed solutions for the next generation of drones, and recommendations are also presented and discussed.
Weakly supervised classification in high energy physics
Dery, Lucio Mwinmaarong; Nachman, Benjamin; Rubbo, Francesco; ...
2017-05-01
As machine learning algorithms become increasingly sophisticated to exploit subtle features of the data, they often become more dependent on simulations. This paper presents a new approach called weakly supervised classification in which class proportions are the only input into the machine learning algorithm. Using one of the most challenging binary classification tasks in high energy physics, quark versus gluon tagging, we show that weakly supervised classification can match the performance of fully supervised algorithms. Furthermore, by design, the new algorithm is insensitive to any mis-modeling of discriminating features in the data by the simulation. Weakly supervised classification is a general procedure that can be applied to a wide variety of learning problems to boost performance and robustness when detailed simulations are not reliable or not available.
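A small numpy sketch of the weak-supervision idea, learning from label proportions: a logistic model is fit so that each bag's mean predicted probability matches its known class proportion. The squared-error proportion loss and the synthetic bags are assumptions for illustration, not the authors' exact formulation.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Synthetic data with hidden per-sample labels; the learner only ever sees
    # bag-level class proportions, which is the premise of weakly supervised classification.
    d, n_bags, bag_size = 5, 40, 100
    true_w = rng.normal(size=d)
    X_bags, proportions, hidden_labels = [], [], []
    for _ in range(n_bags):
        Xb = rng.normal(loc=rng.normal(scale=0.5, size=d), size=(bag_size, d))
        yb = (Xb @ true_w > 0).astype(float)
        X_bags.append(Xb)
        hidden_labels.append(yb)
        proportions.append(yb.mean())             # the only supervision available

    # Fit a logistic model so that each bag's mean predicted probability
    # matches its known class proportion (squared-error proportion loss).
    w = np.zeros(d)
    lr = 0.1
    for _ in range(3000):
        grad = np.zeros(d)
        for Xb, p in zip(X_bags, proportions):
            q = sigmoid(Xb @ w)
            grad += 2 * (q.mean() - p) * (q * (1 - q)) @ Xb / len(Xb)
        w -= lr * grad

    X_all = np.vstack(X_bags)
    y_all = np.concatenate(hidden_labels)
    acc = ((sigmoid(X_all @ w) > 0.5) == y_all).mean()
    print("per-sample accuracy learned from bag proportions only:", round(float(acc), 3))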
76 FR 59031 - Classification Challenge Regulations
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-23
... CENTRAL INTELLIGENCE AGENCY 32 CFR Part 1907 Classification Challenge Regulations AGENCY: Central Intelligence Agency. ACTION: Final rule. SUMMARY: Consistent with Executive Order 13526, the Central Intelligence Agency (CIA) has undertaken and completed a review of its public Classification Challenge...
Bulk Magnetization Effects in EMI-Based Classification and Discrimination
2012-04-01
...response adds to classification performance and (2) develop a comprehensive understanding of the engineering challenges of primary field cancellation that can support a...
Cardot, J-M; Garcia Arieta, A; Paixao, P; Tasevska, I; Davit, B
2016-07-01
The US-FDA recently posted a draft guideline for industry recommending procedures necessary to obtain a biowaiver for immediate-release oral dosage forms based on the Biopharmaceutics Classification System (BCS). This review compares the present FDA BCS biowaiver approach with the existing European Medicines Agency (EMA) approach, with an emphasis on similarities, difficulties, and shared challenges. Some specifics of the current EMA BCS guideline are compared with those in the recently published draft US-FDA BCS guideline. In particular, similarities and differences in the EMA versus US-FDA approaches to establishing drug solubility, permeability, dissolution, and formulation suitability for a BCS biowaiver are critically reviewed. Several case studies are presented to illustrate (i) the challenges of applying for BCS biowaivers for global registration in the face of differences in the EMA and US-FDA BCS biowaiver criteria, as well as (ii) the challenges inherent in applying for BCS class I or III designation that are common to both jurisdictions.
NASA Astrophysics Data System (ADS)
Verechagin, V.; Kris, R.; Schwarzband, I.; Milstein, A.; Cohen, B.; Shkalim, A.; Levy, S.; Price, D.; Bal, E.
2018-03-01
Over the years, mask and wafer defect dispositioning has become an increasingly challenging and time-consuming task. With design rules getting smaller, OPC getting more complex and scanner illumination taking on free-form shapes, the ability of a user to perform accurate and repeatable classification of defects detected by mask inspection tools into pass/fail bins is decreasing. The critical challenges of mask defect metrology for small nodes (< 30 nm) were reviewed in [1]. While Critical Dimension (CD) variation measurement is still the method of choice for determining a mask defect's future impact on wafer, the high complexity of OPCs combined with high variability in pattern shapes poses a challenge for any automated CD variation measurement method. In this study, a novel approach for measurement generalization is presented. CD variation assessment performance is evaluated on multiple different complex-shape patterns and is benchmarked against an existing qualified measurement methodology.
Hartling, Lisa; Bond, Kenneth; Santaguida, P Lina; Viswanathan, Meera; Dryden, Donna M
2011-08-01
To develop and test a study design classification tool. We contacted relevant organizations and individuals to identify tools used to classify study designs and ranked these using predefined criteria. The highest ranked tool was a design algorithm developed, but no longer advocated, by the Cochrane Non-Randomized Studies Methods Group; this was modified to include additional study designs and decision points. We developed a reference classification for 30 studies; 6 testers applied the tool to these studies. Interrater reliability (Fleiss' κ) and accuracy against the reference classification were assessed. The tool was further revised and retested. Initial reliability was fair among the testers (κ=0.26) and the reference standard raters (κ=0.33). Testing after revisions showed improved reliability (κ=0.45, moderate agreement) with improved, but still low, accuracy. The most common disagreements were whether the study design was experimental (5 of 15 studies), and whether there was a comparison of any kind (4 of 15 studies). Agreement was higher among testers who had completed graduate level training versus those who had not. The moderate reliability and low accuracy may be due to a lack of clarity and comprehensiveness of the tool, inadequate reporting of the studies, and variability in tester characteristics. The results may not be generalizable to all published studies, as the test studies were selected because they had posed challenges for previous reviewers with respect to their design classification. Application of such a tool should be accompanied by training, pilot testing, and context-specific decision rules. Copyright © 2011 Elsevier Inc. All rights reserved.
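For reference, the Fleiss' κ statistic used above for multiple raters can be computed from a subjects-by-categories table of rating counts; the counts in this sketch are hypothetical.

    import numpy as np

    def fleiss_kappa(counts):
        """Fleiss' kappa for an (n_subjects, n_categories) matrix of rating counts.

        counts[i, j] = number of raters assigning subject i to category j;
        every subject must be rated by the same number of raters.
        """
        counts = np.asarray(counts, dtype=float)
        n_raters = counts.sum(axis=1)[0]
        p_j = counts.sum(axis=0) / counts.sum()                      # category proportions
        P_i = np.sum(counts * (counts - 1), axis=1) / (n_raters * (n_raters - 1))
        P_bar = P_i.mean()                                           # observed agreement
        P_e = np.sum(p_j ** 2)                                       # chance agreement
        return (P_bar - P_e) / (1 - P_e)

    # Hypothetical example: 6 raters classifying 5 studies into 3 design categories.
    counts = np.array([
        [6, 0, 0],
        [3, 3, 0],
        [1, 4, 1],
        [0, 2, 4],
        [2, 2, 2],
    ])
    print("Fleiss' kappa:", round(fleiss_kappa(counts), 3))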
Cabezas-Cruz, Alejandro; de la Fuente, José
2015-04-01
Classification of bacteria is challenging due to the lack of a theory-based framework. In addition, the adaptation of bacteria to ecological niches often results in selection of strains with diverse virulence, pathogenicity and transmission characteristics. Bacterial strain diversity presents challenges for taxonomic classification, which in turn impacts the ability to develop accurate diagnostics and effective vaccines. Over the past decade, the worldwide diversity of Anaplasma marginale, an economically important tick-borne pathogen of cattle, has become apparent. The extent of A. marginale strain diversity, formerly underappreciated, has contributed to the challenges of classification which, in turn, likely impacts the design and development of improved vaccines. Notably, the A. marginale surface protein 1a (MSP1a) is a model molecule for these studies because it serves as a marker for strain identity, is both an adhesin necessary for infection of cells and an immuno-reactive protein, and is also an indicator of the evolution of strain diversity. Herein, we discuss a molecular taxonomic approach for classification of A. marginale strain diversity. Taxonomic analysis of this important molecule provides the opportunity to understand A. marginale strain diversity as it relates to geographic and ecological factors and to the development of effective vaccines for control of bovine anaplasmosis worldwide. Copyright © 2015 Elsevier GmbH. All rights reserved.
Balanced vs. Imbalanced Training Data: Classifying RapidEye Data with Support Vector Machines
NASA Astrophysics Data System (ADS)
Ustuner, M.; Sanli, F. B.; Abdikan, S.
2016-06-01
The accuracy of supervised image classification is highly dependent upon several factors such as the design of the training set (sample selection, composition, purity and size), the resolution of the input imagery and landscape heterogeneity. The design of the training set is still a challenging issue since the sensitivity of the classifier algorithm at the learning stage differs for the same dataset. In this paper, the classification of RapidEye imagery with balanced and imbalanced training data for mapping crop types was addressed. Classification with imbalanced training data may result in low accuracy in some scenarios. Support Vector Machine (SVM), Maximum Likelihood (ML) and Artificial Neural Network (ANN) classifications were implemented here to classify the data. To evaluate the influence of balanced and imbalanced training data on image classification algorithms, three different training datasets were created. Two balanced datasets, which have 70 and 100 pixels for each class of interest, and one imbalanced dataset, in which each class has a different number of pixels, were used in the classification stage. Results demonstrate that the ML and ANN classifications are affected by imbalanced training data, resulting in a reduction in accuracy (from 90.94% to 85.94% for ML and from 91.56% to 88.44% for ANN), while SVM is not affected significantly (from 94.38% to 94.69%) and is even slightly improved. Our results highlight that SVM is proven to be a very robust, consistent and effective classifier, as it performs very well under both balanced and imbalanced training data situations. Furthermore, the training stage should be precisely and carefully designed for the needs of the adopted classifier.
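The balanced-versus-imbalanced comparison above can be reproduced in outline with scikit-learn; synthetic features stand in for the RapidEye pixel data, a small neural network plays the role of the ANN, the maximum-likelihood classifier is omitted, and the per-class sample sizes are assumptions.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.svm import SVC

    # Synthetic stand-in for per-pixel crop features (the paper uses RapidEye bands).
    X, y = make_classification(n_samples=4000, n_features=5, n_informative=4,
                               n_redundant=0, n_classes=4, n_clusters_per_class=1,
                               random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5,
                                                        stratify=y, random_state=0)

    def subsample(y_train, sizes):
        """Pick a fixed number of training pixels per class (balanced or imbalanced)."""
        return np.concatenate([np.flatnonzero(y_train == c)[:n] for c, n in enumerate(sizes)])

    for name, sizes in [("balanced", [100, 100, 100, 100]),
                        ("imbalanced", [200, 100, 50, 20])]:
        idx = subsample(y_train, sizes)
        for clf in (SVC(kernel="rbf", C=10, gamma="scale"),
                    MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)):
            acc = clf.fit(X_train[idx], y_train[idx]).score(X_test, y_test)
            print(f"{name:10s} {type(clf).__name__:13s} accuracy = {acc:.3f}")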
Semi-Supervised Marginal Fisher Analysis for Hyperspectral Image Classification
NASA Astrophysics Data System (ADS)
Huang, H.; Liu, J.; Pan, Y.
2012-07-01
The problem of learning with both labeled and unlabeled examples arises frequently in hyperspectral image (HSI) classification. Marginal Fisher analysis, however, is a supervised method and cannot be directly applied to semi-supervised classification. In this paper, we propose a novel method, called semi-supervised marginal Fisher analysis (SSMFA), to process HSI of natural scenes, which uses a combination of semi-supervised learning and manifold learning. In SSMFA, a new difference-based optimization objective function with unlabeled samples has been designed. SSMFA preserves the manifold structure of labeled and unlabeled samples in addition to separating labeled samples in different classes from each other. The semi-supervised method has an analytic form of the globally optimal solution, which can be computed by eigendecomposition. Classification experiments with a challenging HSI task demonstrate that this method outperforms current state-of-the-art HSI classification methods.
10 CFR 1045.39 - Challenging classification and declassification determinations.
Code of Federal Regulations, 2011 CFR
2011-01-01
..., a further appeal may be made to the Chief Health, Safety and Security Officer. (c) Classification... 10 Energy 4 2011-01-01 2011-01-01 false Challenging classification and declassification determinations. 1045.39 Section 1045.39 Energy DEPARTMENT OF ENERGY (GENERAL PROVISIONS) NUCLEAR CLASSIFICATION...
10 CFR 1045.39 - Challenging classification and declassification determinations.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., a further appeal may be made to the Chief Health, Safety and Security Officer. (c) Classification... 10 Energy 4 2010-01-01 2010-01-01 false Challenging classification and declassification determinations. 1045.39 Section 1045.39 Energy DEPARTMENT OF ENERGY (GENERAL PROVISIONS) NUCLEAR CLASSIFICATION...
2017-04-17
Keywords: Cyberphysical Systems, Formal Methods, Requirements Patterns, AADL, Assume Guarantee Reasoning Environment. ... Rockwell Collins has been addressing these challenges by developing compositional reasoning methods that permit the verification of systems that exceed...
5 CFR 1312.11 - Challenges to classifications.
Code of Federal Regulations, 2011 CFR
2011-01-01
..., DOWNGRADING, DECLASSIFICATION AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION Classification and Declassification of National Security Information § 1312.11 Challenges to classifications. OMB employees are... anonymous, in which case the question may be directed to the EOP Security Officer. ...
5 CFR 1312.11 - Challenges to classifications.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., DOWNGRADING, DECLASSIFICATION AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION Classification and Declassification of National Security Information § 1312.11 Challenges to classifications. OMB employees are... anonymous, in which case the question may be directed to the EOP Security Officer. ...
32 CFR 2700.14 - Challenges to classification.
Code of Federal Regulations, 2011 CFR
2011-07-01
... NEGOTIATIONS SECURITY INFORMATION REGULATIONS Original Classification § 2700.14 Challenges to classification. If holders of classified information believe the information is improperly or unnecessarily... the OMSN Information Security Oversight Committee, established pursuant to § 2700.51. Action on such...
Wan, Shixiang; Duan, Yucong; Zou, Quan
2017-09-01
Predicting the subcellular localization of proteins is an important and challenging problem. Traditional experimental approaches are often expensive and time-consuming. Consequently, a growing number of research efforts employ a series of machine learning approaches to predict the subcellular location of proteins. There are two main challenges among the state-of-the-art prediction methods. First, most of the existing techniques are designed to deal with multi-class rather than multi-label classification, which ignores connections between multiple labels. In reality, multiple locations of particular proteins imply vital and unique biological significance that deserves special focus and cannot be ignored. Second, techniques for handling imbalanced data in multi-label classification problems are necessary but have never been employed. To solve these two issues, we have developed an ensemble multi-label classifier called HPSLPred, which can be applied for multi-label classification with an imbalanced protein source. For convenience, a user-friendly webserver has been established at http://server.malab.cn/HPSLPred. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
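The general pattern behind such a system, binary relevance with per-label rebalancing, is sketched below; the features, labels, resampling rule and base learner are all illustrative assumptions and not the HPSLPred implementation.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    def balanced_resample(X, y):
        """Downsample the majority class so each binary problem is roughly balanced."""
        pos, neg = np.flatnonzero(y == 1), np.flatnonzero(y == 0)
        k = min(len(pos), len(neg))
        idx = np.concatenate([rng.choice(pos, k, replace=False),
                              rng.choice(neg, k, replace=False)])
        return X[idx], y[idx]

    # Stand-in protein features and multi-label targets (e.g. subcellular locations).
    X = rng.normal(size=(600, 50))
    Y = (rng.random((600, 4)) < [0.5, 0.2, 0.1, 0.05]).astype(int)   # imbalanced labels

    models = []
    for j in range(Y.shape[1]):                   # one binary classifier per label
        Xj, yj = balanced_resample(X, Y[:, j])
        models.append(LogisticRegression(max_iter=1000).fit(Xj, yj))

    def predict_labels(x):
        """Return the set of labels whose classifiers fire for a single sample x."""
        return [j for j, m in enumerate(models) if m.predict(x.reshape(1, -1))[0] == 1]

    print(predict_labels(X[0]))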
Dictionary learning-based CT detection of pulmonary nodules
NASA Astrophysics Data System (ADS)
Wu, Panpan; Xia, Kewen; Zhang, Yanbo; Qian, Xiaohua; Wang, Ge; Yu, Hengyong
2016-10-01
Segmentation of lung features is one of the most important steps for computer-aided detection (CAD) of pulmonary nodules with computed tomography (CT). However, irregular shapes, complicated anatomical background and poor pulmonary nodule contrast make CAD a very challenging problem. Here, we propose a novel scheme for feature extraction and classification of pulmonary nodules through dictionary learning from training CT images, which does not require accurately segmented pulmonary nodules. Specifically, two classification-oriented dictionaries and one background dictionary are learnt to solve a two-category problem. In terms of the classification-oriented dictionaries, we calculate sparse coefficient matrices to extract intrinsic features for pulmonary nodule classification. The support vector machine (SVM) classifier is then designed to optimize the performance. Our proposed methodology is evaluated with the lung image database consortium and image database resource initiative (LIDC-IDRI) database, and the results demonstrate that the proposed strategy is promising.
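A sketch of the class-specific dictionary idea above using scikit-learn's MiniBatchDictionaryLearning: one dictionary is learned per class, the sparse codes under both dictionaries form the feature vector, and an SVM does the final classification. Patch extraction and all CT-specific steps are omitted; random arrays stand in for nodule and background patches, and all parameters are assumptions.

    import numpy as np
    from sklearn.decomposition import MiniBatchDictionaryLearning
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Random stand-ins for vectorized CT patches: class 0 = nodule, class 1 = background.
    patches = {0: rng.normal(loc=0.5, size=(200, 64)),
               1: rng.normal(loc=-0.5, size=(200, 64))}

    # Learn one small dictionary per class (classification-oriented dictionaries).
    dicts = {c: MiniBatchDictionaryLearning(n_components=16, alpha=1.0,
                                            random_state=0).fit(Xc)
             for c, Xc in patches.items()}

    def sparse_features(X):
        """Concatenate the sparse coefficients of X under every class dictionary."""
        return np.hstack([d.transform(X) for d in dicts.values()])

    X = np.vstack(list(patches.values()))
    y = np.array([0] * 200 + [1] * 200)
    clf = SVC(kernel="rbf", gamma="scale").fit(sparse_features(X), y)
    print("training accuracy:", clf.score(sparse_features(X), y))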
A Discriminative Approach to EEG Seizure Detection
Johnson, Ashley N.; Sow, Daby; Biem, Alain
2011-01-01
Seizures are abnormal sudden discharges in the brain with signatures represented in electroencephalograms (EEG). The efficacy of the application of speech processing techniques to discriminate between seizure and non-seizure states in EEGs is reported. The approach accounts for the challenges of unbalanced datasets (seizure and non-seizure), while also showing a system capable of real-time seizure detection. The Minimum Classification Error (MCE) algorithm, which is a discriminative learning algorithm in wide use in speech processing, is applied and compared with conventional classification techniques that have already been applied to the discrimination between seizure and non-seizure states in the literature. The system is evaluated on multi-channel EEG recordings from 22 pediatric patients. Experimental results show that the application of speech processing techniques and MCE compares favorably with conventional classification techniques in terms of classification performance, while requiring less computational overhead. The results strongly suggest the possibility of deploying the designed system at the bedside. PMID:22195192
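The core of MCE training, a smoothed misclassification measure pushed through a sigmoid and minimized by gradient descent, can be illustrated with a two-class linear-discriminant sketch; the synthetic features and hyperparameters below are assumptions, not the paper's EEG pipeline.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Synthetic two-class features (stand-ins for EEG frame features).
    n, d = 400, 8
    X = np.vstack([rng.normal(loc=0.4, size=(n, d)), rng.normal(loc=-0.4, size=(n, d))])
    y = np.array([0] * n + [1] * n)

    W = np.zeros((2, d))           # linear discriminants g_k(x) = W[k] @ x
    lr, gamma = 0.05, 2.0          # gamma controls the smoothness of the MCE sigmoid
    for _ in range(200):
        for xi, yi in zip(X, y):
            other = 1 - yi
            d_k = -W[yi] @ xi + W[other] @ xi     # misclassification measure
            l = sigmoid(gamma * d_k)              # smoothed 0/1 loss
            g = gamma * l * (1 - l)               # d(loss)/d(d_k)
            W[yi] += lr * g * xi                  # raise the true-class discriminant
            W[other] -= lr * g * xi               # lower the competing discriminant

    pred = np.argmax(X @ W.T, axis=1)
    print("training error rate:", (pred != y).mean())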
Learning to Predict Combinatorial Structures
NASA Astrophysics Data System (ADS)
Vembu, Shankar
2009-12-01
The major challenge in designing a discriminative learning algorithm for predicting structured data is to address the computational issues arising from the exponential size of the output space. Existing algorithms make different assumptions to ensure efficient, polynomial time estimation of model parameters. For several combinatorial structures, including cycles, partially ordered sets, permutations and other graph classes, these assumptions do not hold. In this thesis, we address the problem of designing learning algorithms for predicting combinatorial structures by introducing two new assumptions: (i) The first assumption is that a particular counting problem can be solved efficiently. The consequence is a generalisation of the classical ridge regression for structured prediction. (ii) The second assumption is that a particular sampling problem can be solved efficiently. The consequence is a new technique for designing and analysing probabilistic structured prediction models. These results can be applied to solve several complex learning problems including but not limited to multi-label classification, multi-category hierarchical classification, and label ranking.
Histopathological Image Classification using Discriminative Feature-oriented Dictionary Learning
Vu, Tiep Huu; Mousavi, Hojjat Seyed; Monga, Vishal; Rao, Ganesh; Rao, UK Arvind
2016-01-01
In histopathological image analysis, feature extraction for classification is a challenging task due to the diversity of histology features suitable for each problem as well as presence of rich geometrical structures. In this paper, we propose an automatic feature discovery framework via learning class-specific dictionaries and present a low-complexity method for classification and disease grading in histopathology. Essentially, our Discriminative Feature-oriented Dictionary Learning (DFDL) method learns class-specific dictionaries such that under a sparsity constraint, the learned dictionaries allow representing a new image sample parsimoniously via the dictionary corresponding to the class identity of the sample. At the same time, the dictionary is designed to be poorly capable of representing samples from other classes. Experiments on three challenging real-world image databases: 1) histopathological images of intraductal breast lesions, 2) mammalian kidney, lung and spleen images provided by the Animal Diagnostics Lab (ADL) at Pennsylvania State University, and 3) brain tumor images from The Cancer Genome Atlas (TCGA) database, reveal the merits of our proposal over state-of-the-art alternatives. Moreover, we demonstrate that DFDL exhibits a more graceful decay in classification accuracy against the number of training images which is highly desirable in practice where generous training is often not available. PMID:26513781
Sørensen, Lauge; Nielsen, Mads
2018-05-15
The International Challenge for Automated Prediction of MCI from MRI data offered independent, standardized comparison of machine learning algorithms for multi-class classification of normal control (NC), mild cognitive impairment (MCI), converting MCI (cMCI), and Alzheimer's disease (AD) using brain imaging and general cognition. We proposed to use an ensemble of support vector machines (SVMs) that combined bagging without replacement and feature selection. SVM is the most commonly used algorithm in multivariate classification of dementia, and it was therefore valuable to evaluate the potential benefit of ensembling this type of classifier. The ensemble SVM, using either a linear or a radial basis function (RBF) kernel, achieved multi-class classification accuracies of 55.6% and 55.0% in the challenge test set (60 NC, 60 MCI, 60 cMCI, 60 AD), resulting in a third place in the challenge. Similar feature subset sizes were obtained for both kernels, and the most frequently selected MRI features were the volumes of the two hippocampal subregions left presubiculum and right subiculum. Post-challenge analysis revealed that enforcing a minimum number of selected features and increasing the number of ensemble classifiers improved classification accuracy up to 59.1%. The ensemble SVM outperformed single SVM classifications consistently in the challenge test set. Ensemble methods using bagging and feature selection can improve the performance of the commonly applied SVM classifier in dementia classification. This resulted in competitive classification accuracies in the International Challenge for Automated Prediction of MCI from MRI data. Copyright © 2018 Elsevier B.V. All rights reserved.
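In scikit-learn terms, bagging without replacement around a feature-selection-plus-SVM base learner looks roughly like the following; the synthetic data, subset sizes and kernel choice are assumptions rather than the challenge configuration.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import BaggingClassifier
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Synthetic stand-in for MRI volumes plus cognitive scores, 4 diagnostic classes.
    X, y = make_classification(n_samples=400, n_features=300, n_informative=30,
                               n_classes=4, n_clusters_per_class=1, random_state=0)

    # Base learner: feature selection + linear SVM; ensemble: bagging without replacement.
    base = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=25),
                         SVC(kernel="linear", C=1))
    ensemble = BaggingClassifier(base, n_estimators=50, max_samples=0.8,
                                 bootstrap=False, random_state=0)

    print("ensemble CV accuracy:", cross_val_score(ensemble, X, y, cv=5).mean().round(3))
    print("single SVM CV accuracy:", cross_val_score(base, X, y, cv=5).mean().round(3))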
Eberle, Annika; Bhatt, Arpit; Zhang, Yimin; Heath, Garvin
2017-06-06
Advanced biofuel production facilities (biorefineries), such as those envisioned by the United States (U.S.) Renewable Fuel Standard and U.S. Department of Energy's research and development programs, often lack historical air pollutant emissions data, which can pose challenges for obtaining air emission permits that are required for construction and operation. To help fill this knowledge gap, we perform a thorough regulatory analysis and use engineering process designs to assess the applicability of federal air regulations and quantify air pollutant emissions for two feasibility-level biorefinery designs. We find that without additional emission-control technologies both biorefineries would likely be required to obtain major source permits under the Clean Air Act's New Source Review program. The permitting classification (so-called "major" or "minor") has implications for the time and effort required for permitting and therefore affects the cost of capital and the fuel selling price. Consequently, we explore additional technically feasible emission-control technologies and process modifications that have the potential to reduce emissions to achieve a minor source permitting classification. Our analysis of air pollutant emissions and controls can assist biorefinery developers with the air permitting process and inform regulatory agencies about potential permitting pathways for novel biorefinery designs.
Evaluation of new antiemetic agents and definition of antineoplastic agent emetogenicity--an update.
Grunberg, Steven M; Osoba, David; Hesketh, Paul J; Gralla, Richard J; Borjeson, Sussanne; Rapoport, Bernardo L; du Bois, Andreas; Tonato, Maurizio
2005-02-01
Development of effective antiemetic therapy depends upon an understanding of both the antiemetic agents and the emetogenic challenges these agents are designed to address. New potential antiemetic agents should be studied in an orderly manner, proceeding from phase I to phase II open-label trials and then to randomized double-blind phase III trials comparing new agents and regimens to best standard therapy. Use of placebos in place of antiemetic therapy against highly or moderately emetogenic chemotherapy is unacceptable. Nausea and vomiting should be evaluated separately and for both the acute and delayed periods. Defining the emetogenicity of new antineoplastic agents is a challenge, since such data are often not reliably recorded during early drug development. A four-level classification system is proposed for emetogenicity of intravenous antineoplastic agents. A separate four-level classification system for emetogenicity of oral antineoplastic agents, which are often given over an extended period of time, is also proposed.
Pelay-Gimeno, Marta; Glas, Adrian; Koch, Oliver; Grossmann, Tom N
2015-01-01
Protein–protein interactions (PPIs) are involved at all levels of cellular organization, thus making the development of PPI inhibitors extremely valuable. The identification of selective inhibitors is challenging because of the shallow and extended nature of PPI interfaces. Inhibitors can be obtained by mimicking peptide binding epitopes in their bioactive conformation. For this purpose, several strategies have been evolved to enable a projection of side chain functionalities in analogy to peptide secondary structures, thereby yielding molecules that are generally referred to as peptidomimetics. Herein, we introduce a new classification of peptidomimetics (classes A–D) that enables a clear assignment of available approaches. Based on this classification, the Review summarizes strategies that have been applied for the structure-based design of PPI inhibitors through stabilizing or mimicking turns, β-sheets, and helices. PMID:26119925
32 CFR 2001.14 - Classification challenges.
Code of Federal Regulations, 2010 CFR
2010-07-01
... any more specific than to question why information is or is not classified, or is classified at a... a classification challenge to information that has been the subject of a challenge within the past two years, or that is the subject of pending litigation, the agency is not required to process the...
Energy-aware embedded classifier design for real-time emotion analysis.
Padmanabhan, Manoj; Murali, Srinivasan; Rincon, Francisco; Atienza, David
2015-01-01
Detection and classification of human emotions from multiple bio-signals has a wide variety of applications. Though electronic devices that acquire multiple body signals are available in the market today, classifying human emotions in real time within the tight energy budgets of wearable embedded systems is a big challenge. In this paper we present an embedded classifier for real-time emotion classification. We propose a system that operates in different energy-budgeted modes, depending on the available energy, where each mode is constrained by an operating energy bound. The classifier has an offline training phase where feature selection is performed for each operating mode, with an energy-budget-aware algorithm that we propose. Across the different operating modes, the classification accuracy ranges from 95% to 75% and from 89% to 70% for arousal and valence, respectively. The accuracy is traded off for less power consumption, which results in an increased battery life of up to 7.7 times (from 146.1 to 1126.9 hours).
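One simple way to realize an energy-budget-aware feature selection, as a sketch: rank candidate features by usefulness per unit energy cost and add them greedily until the mode's budget is exhausted. The feature names, scores and costs below are made up for illustration and are not from the paper.

    def budget_aware_selection(features, budget):
        """Greedily pick features by usefulness-per-energy until the budget is spent.

        features: dict name -> (usefulness_score, energy_cost); all values illustrative.
        """
        chosen, spent = [], 0.0
        ranked = sorted(features.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
        for name, (score, cost) in ranked:
            if spent + cost <= budget:
                chosen.append(name)
                spent += cost
        return chosen, spent

    # Hypothetical bio-signal features with (usefulness, energy cost in mJ per window).
    features = {
        "hr_mean":         (0.9, 1.0),
        "hrv_rmssd":       (0.8, 2.5),
        "gsr_peaks":       (0.7, 1.5),
        "resp_rate":       (0.5, 1.2),
        "eeg_alpha_power": (0.95, 8.0),
    }

    for mode, budget in [("low-power", 3.0), ("normal", 8.0), ("full", 15.0)]:
        chosen, spent = budget_aware_selection(features, budget)
        print(f"{mode:10s} budget={budget:4.1f} mJ -> {chosen} (cost {spent} mJ)")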
Hand Rehabilitation Robotics on Poststroke Motor Recovery
2017-01-01
The recovery of hand function is one of the most challenging topics in stroke rehabilitation. Although robot-assisted therapy has achieved some good results in recent decades, the development of hand rehabilitation robotics has been left behind. Existing reviews of hand rehabilitation robotics focus either on the mechanical design from the designers' view or on the training paradigms from the clinicians' view, while these two parts are interconnected and both important for designers and clinicians. In this review, we explore the current literature surrounding hand rehabilitation robots to help designers make better choices among varied components and thus promote the application of hand rehabilitation robots. First, an overview of hand rehabilitation robotics is provided to give a general view of the relationships between subjects, rehabilitation theories, hand rehabilitation robots, and their evaluation. Second, the state of the art in hand rehabilitation robotics is introduced in detail according to the classification of the hardware system and the training paradigm. As a result, the discussion gives the available arguments behind the classification and a comprehensive overview of hand rehabilitation robotics. PMID:29230081
Nagasawa, Shinji; Al-Naamani, Eman; Saeki, Akinori
2018-05-17
Owing to the diverse chemical structures, organic photovoltaic (OPV) applications with a bulk heterojunction framework have greatly evolved over the last two decades, which has produced numerous organic semiconductors exhibiting improved power conversion efficiencies (PCEs). Despite the recent fast progress in materials informatics and data science, data-driven molecular design of OPV materials remains challenging. We report a screening of conjugated molecules for polymer-fullerene OPV applications by supervised learning methods (artificial neural network (ANN) and random forest (RF)). Approximately 1000 experimental parameters including PCE, molecular weight, and electronic properties are manually collected from the literature and subjected to machine learning with digitized chemical structures. Contrary to the low correlation coefficient in ANN, RF yields an acceptable accuracy, which is twice that of random classification. We demonstrate the application of RF screening for the design, synthesis, and characterization of a conjugated polymer, which facilitates a rapid development of optoelectronic materials.
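A schematic of the random-forest screening step described above: binary fingerprint bits stand in for the digitized chemical structures, and the label plays the role of a PCE-above-threshold class. Everything about the data here is synthetic; only the modelling pattern mirrors the abstract.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # Stand-in for digitized chemical structures: binary fingerprint bits per polymer,
    # plus a label such as "PCE above a chosen threshold" (all synthetic here).
    n_mols, n_bits = 1000, 512
    X = (rng.random((n_mols, n_bits)) < 0.1).astype(int)
    weights = rng.normal(size=n_bits) * (rng.random(n_bits) < 0.05)   # few informative bits
    y = (X @ weights + rng.normal(scale=0.5, size=n_mols) > 0).astype(int)

    rf = RandomForestClassifier(n_estimators=300, random_state=0)
    print("cross-validated accuracy:", cross_val_score(rf, X, y, cv=5).mean().round(3))

    # Feature importances point to the fingerprint bits (substructures) driving the prediction.
    rf.fit(X, y)
    print("top fingerprint bits:", np.argsort(rf.feature_importances_)[::-1][:5])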
Pervasive mobile healthcare systems for chronic disease monitoring.
Huzooree, Geshwaree; Kumar Khedo, Kavi; Joonas, Noorjehan
2017-05-01
Pervasive mobile healthcare systems have the potential to improve healthcare and the quality of life of chronic disease patients through continuous monitoring. Recently, many articles on pervasive mobile healthcare systems focusing on health monitoring using wireless technologies have been published. The main aim of this review is to evaluate state-of-the-art pervasive mobile healthcare systems to identify the major technical requirements and design challenges associated with realizing such a system. A systematic literature review was conducted over the IEEE Xplore Digital Library to evaluate 20 pervasive mobile healthcare systems out of 683 articles from 2011 to 2016. The classification of the pervasive mobile healthcare systems and other important factors are discussed. Potential opportunities and challenges are pointed out for the further deployment of effective pervasive mobile healthcare systems. This article helps researchers in health informatics to take a holistic view of pervasive mobile healthcare systems and points out new technological trends and design challenges that researchers have to consider when designing such systems for better adoption, usability, and seamless integration.
Advanced Steel Microstructural Classification by Deep Learning Methods.
Azimi, Seyed Majid; Britz, Dominik; Engstler, Michael; Fritz, Mario; Mücklich, Frank
2018-02-01
The inner structure of a material is called its microstructure. It stores the genesis of a material and determines all its physical and chemical properties. While microstructural characterization is widespread and well established, microstructural classification is mostly done manually by human experts, which gives rise to uncertainties due to subjectivity. Since a microstructure can be a combination of different phases or constituents with complex substructures, its automatic classification is very challenging, and only a few prior studies exist. Prior works focused on features designed and engineered by experts and classified microstructures separately from the feature extraction step. Recently, Deep Learning methods have shown strong performance in vision applications by learning the features from data together with the classification step. In this work, we propose a Deep Learning method for microstructural classification, demonstrated on selected microstructural constituents of low-carbon steel. The method employs pixel-wise segmentation via a Fully Convolutional Neural Network (FCNN) accompanied by a max-voting scheme. Our system achieves 93.94% classification accuracy, drastically outperforming the state-of-the-art method's 48.89%. Beyond this strong performance, the approach offers a more robust and, above all, objective way to tackle the difficult task of steel quality appraisal.
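A minimal sketch of the max-voting step only: given per-pixel class predictions from a segmentation network, the object (microstructure region) is assigned the class most often predicted over its pixels. The FCNN itself and the segmentation masks are assumed, not reproduced.

```python
# Max-voting over pixel-wise class predictions (illustrative sketch).
import numpy as np

def max_vote(pixel_classes, object_mask):
    """pixel_classes: HxW array of predicted class ids.
    object_mask: boolean HxW mask of the region of interest."""
    votes = np.bincount(pixel_classes[object_mask].ravel())
    return int(np.argmax(votes))

pred = np.array([[0, 1, 1],
                 [1, 1, 2],
                 [2, 1, 1]])
mask = np.ones_like(pred, dtype=bool)
print(max_vote(pred, mask))   # -> 1, the majority class inside the region
```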
Apeldoorn, Adri T.; van Helvoirt, Hans; Ostelo, Raymond W.; Meihuizen, Hanneke; Kamper, Steven J.; van Tulder, Maurits W.; de Vet, Henrica C. W.
2016-01-01
Study design Observational inter-rater reliability study. Objectives To examine: (1) the inter-rater reliability of a modified version of Delitto et al.'s classification-based algorithm for patients with low back pain; (2) the influence of different levels of familiarity with the system; and (3) the inter-rater reliability of algorithm decisions in patients who clearly fit into a subgroup (clear classifications) and those who do not (unclear classifications). Methods Patients were examined twice on the same day by two of three participating physical therapists with different levels of familiarity with the system. Patients were classified into one of four classification groups. Raters were blind to the others' classification decision. In order to quantify the inter-rater reliability, percentages of agreement and Cohen's Kappa were calculated. Results A total of 36 patients were included (clear classification n = 23; unclear classification n = 13). The overall rate of agreement was 53% and the Kappa value was 0.34 [95% confidence interval (CI): 0.11–0.57], which indicated only fair inter-rater reliability. Inter-rater reliability for patients with a clear classification (agreement 52%, Kappa value 0.29) was not higher than for patients with an unclear classification (agreement 54%, Kappa value 0.33). Familiarity with the system (i.e. trained with written instructions and previous research experience with the algorithm) did not improve the inter-rater reliability. Conclusion Our pilot study challenges the inter-rater reliability of the classification procedure in clinical practice. Therefore, more knowledge is needed about factors that affect the inter-rater reliability, in order to improve the clinical applicability of the classification scheme. PMID:27559279
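A minimal sketch of the agreement statistics used in the study: percentage agreement and Cohen's Kappa between two raters. The classification labels below are invented for illustration; the study used four treatment-based classification groups.

```python
# Percentage agreement and Cohen's Kappa between two raters (illustrative).
import numpy as np
from sklearn.metrics import cohen_kappa_score

rater_a = np.array(["manip", "stab", "exercise", "traction", "stab", "manip"])
rater_b = np.array(["manip", "exercise", "exercise", "traction", "manip", "manip"])

agreement = np.mean(rater_a == rater_b)          # raw proportion of agreement
kappa = cohen_kappa_score(rater_a, rater_b)      # chance-corrected agreement
print(f"agreement = {agreement:.2f}, kappa = {kappa:.2f}")
```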
Challenges of interoperability using HL7 v3 in Czech healthcare.
Nagy, Miroslav; Preckova, Petra; Seidl, Libor; Zvarova, Jana
2010-01-01
The paper describes several classification systems that could improve patient safety through semantic interoperability among contemporary electronic health record systems (EHR-Ss) with support of the HL7 v3 standard. We describe a proposal and a pilot implementation of a semantic interoperability platform (SIP) interconnecting current EHR-Ss by using HL7 v3 messages and concept mappings onto the most widely used classification systems. The increasing number of classification systems and nomenclatures requires the design of various conversion tools for translating between the main classification systems. We present the so-called LIM filler module and the HL7 broker, which are parts of the SIP and play the role of such conversion tools. The analysis of the suitability and usability of individual terminological thesauri was started by mapping the clinical contents of the Minimal Data Model for Cardiology (MDMC) to various terminological classification systems. A nation-wide implementation of the SIP would include adopting and translating international coding systems and nomenclatures, and developing implementation guidelines facilitating the migration from national standards to international ones. Our research showed that the creation of such a platform is feasible; however, it will require a huge effort to fully adapt the Czech healthcare system to the European environment.
A predictive machine learning approach for microstructure optimization and materials design
NASA Astrophysics Data System (ADS)
Liu, Ruoqian; Kumar, Abhishek; Chen, Zhengzhang; Agrawal, Ankit; Sundararaghavan, Veera; Choudhary, Alok
2015-06-01
This paper addresses an important materials engineering question: how can one identify the complete space (or as much of it as possible) of microstructures that are theoretically predicted to yield the desired combination of properties demanded by a selected application? We present a problem involving the design of magnetoelastic Fe-Ga alloy microstructures for enhanced elastic, plastic and magnetostrictive properties. While theoretical models for computing properties given the microstructure are known for this alloy, inverting these relationships to obtain microstructures that lead to desired properties is challenging, primarily due to the high dimensionality of the microstructure space, the multi-objective design requirement and the non-uniqueness of solutions. These challenges render traditional search-based optimization methods ineffective in terms of both search efficiency and result optimality. In this paper, a route to addressing these challenges using a machine learning methodology is proposed. A systematic framework consisting of random data generation, feature selection and classification algorithms is developed. Experiments with five design problems that involve identification of microstructures satisfying both linear and nonlinear property constraints show that our framework outperforms traditional optimization methods, with the average running time reduced by as much as 80% and with optimality that would not be achieved otherwise.
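A minimal sketch of the three-stage framework named above (random data generation, feature selection, classification), using an invented stand-in property model and descriptor count rather than the Fe-Ga property models used in the paper.

```python
# Random sampling -> feature selection -> classification (illustrative sketch).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(2000, 30))           # random microstructure descriptors
prop = X[:, 0] ** 2 + np.sin(3 * X[:, 5])        # stand-in property model (assumed)
y = (prop > np.quantile(prop, 0.8)).astype(int)  # 1 = meets the design target

model = make_pipeline(SelectKBest(f_classif, k=10),
                      RandomForestClassifier(n_estimators=200, random_state=0))
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())
```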
Khalilzadeh, Omid; Baerlocher, Mark O; Shyn, Paul B; Connolly, Bairbre L; Devane, A Michael; Morris, Christopher S; Cohen, Alan M; Midia, Mehran; Thornton, Raymond H; Gross, Kathleen; Caplin, Drew M; Aeron, Gunjan; Misra, Sanjay; Patel, Nilesh H; Walker, T Gregory; Martinez-Salazar, Gloria; Silberzweig, James E; Nikolic, Boris
2017-10-01
To develop a new adverse event (AE) classification for interventional radiology (IR) procedures and evaluate its clinical, research, and educational value compared with the existing Society of Interventional Radiology (SIR) classification via an SIR member survey. A new AE classification was developed by members of the Standards of Practice Committee of the SIR. Subsequently, a survey was created by a group of 18 members from the SIR Standards of Practice Committee and Service Lines. Twelve clinical AE case scenarios were generated that encompassed a broad spectrum of IR procedures and potential AEs. Survey questions were designed to evaluate the following domains: educational and research value, accountability for intraprocedural challenges, consistency of AE reporting, unambiguity, and potential for incorporation into an existing quality-assurance framework. For each AE scenario, the survey participants were instructed to answer questions about the proposed and existing SIR classifications. SIR members were invited via online survey links; 68 of the 140 members surveyed participated. Answers on the new and existing classifications were evaluated and compared statistically. Overall comparison between the two surveys was performed by generalized linear modeling. The proposed AE classification received superior evaluations in terms of consistency of reporting (P < .05) and potential for incorporation into an existing quality-assurance framework (P < .05). Respondents gave a higher overall rating to the educational and research value of the new classification compared with the existing one (P < .05). This study proposed an AE classification system that outperformed the existing SIR classification in the studied domains. Copyright © 2017 SIR. Published by Elsevier Inc. All rights reserved.
Classifying the Diversity of Bus Mapping Systems
NASA Astrophysics Data System (ADS)
Said, Mohd Shahmy Mohd; Forrest, David
2018-05-01
This study represents the first stage of an investigation into the nature of different approaches to mapping bus routes and bus networks, and how they may best be applied in different public transport situations. In many cities, bus services represent an important facet of easing traffic congestion and reducing pollution. However, with the entrenched car culture in many countries, persuading people to change their mode of transport is a major challenge. To promote this modal shift, people need to know what services are available and where (and when) they go. Bus service maps provide an invaluable element of suitable public transport information, but are often overlooked by transport planners and under-researched by cartographers. The method consists of creating a map evaluation form and assessing published bus network maps. The analyses combine quantitative and qualitative evaluation of various aspects of cartographic design and classification. This paper focuses on the resulting classification, which is illustrated by a series of examples. This classification will facilitate more in-depth investigations into the details of cartographic design for such maps and help direct areas for user evaluation.
Lyons-Weiler, James; Pelikan, Richard; Zeh, Herbert J; Whitcomb, David C; Malehorn, David E; Bigbee, William L; Hauskrecht, Milos
2005-01-01
Peptide profiles generated using SELDI/MALDI time of flight mass spectrometry provide a promising source of patient-specific information with high potential impact on the early detection and classification of cancer and other diseases. The new profiling technology comes, however, with numerous challenges and concerns. Particularly important are concerns of reproducibility of classification results and their significance. In this work we describe a computational validation framework, called PACE (Permutation-Achieved Classification Error), that lets us assess, for a given classification model, the significance of the Achieved Classification Error (ACE) on the profile data. The framework compares the performance statistic of the classifier on true data samples and checks if these are consistent with the behavior of the classifier on the same data with randomly reassigned class labels. A statistically significant ACE increases our belief that a discriminative signal was found in the data. The advantage of PACE analysis is that it can be easily combined with any classification model and is relatively easy to interpret. PACE analysis does not protect researchers against confounding in the experimental design, or other sources of systematic or random error. We use PACE analysis to assess significance of classification results we have achieved on a number of published data sets. The results show that many of these datasets indeed possess a signal that leads to a statistically significant ACE.
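A minimal sketch of a PACE-style significance check: compare the classifier's achieved accuracy on the true labels with its distribution under random label permutations. Here scikit-learn's permutation_test_score plays that role, and the data are synthetic stand-ins for peptide-profile features; the classifier, dimensions, and permutation count are assumptions.

```python
# Permutation test of the achieved classification performance (illustrative).
import numpy as np
from sklearn.model_selection import StratifiedKFold, permutation_test_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 50))
y = (X[:, :5].sum(axis=1) + rng.normal(0, 1, 120) > 0).astype(int)

score, perm_scores, p_value = permutation_test_score(
    SVC(kernel="linear"), X, y,
    cv=StratifiedKFold(5), n_permutations=200, random_state=0)

print(f"accuracy on true labels: {score:.3f}")
print(f"mean accuracy on permuted labels: {perm_scores.mean():.3f}")
print(f"p-value (significance of the achieved error): {p_value:.4f}")
```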
NASA Astrophysics Data System (ADS)
Srinivasan, Yeshwanth; Hernes, Dana; Tulpule, Bhakti; Yang, Shuyu; Guo, Jiangling; Mitra, Sunanda; Yagneswaran, Sriraja; Nutter, Brian; Jeronimo, Jose; Phillips, Benny; Long, Rodney; Ferris, Daron
2005-04-01
Automated segmentation and classification of diagnostic markers in medical imagery are challenging tasks. Numerous algorithms for segmentation and classification based on statistical approaches of varying complexity are found in the literature. However, the design of an efficient and automated algorithm for precise classification of desired diagnostic markers is extremely image-specific. The National Library of Medicine (NLM), in collaboration with the National Cancer Institute (NCI), is creating an archive of 60,000 digitized color images of the uterine cervix. NLM is developing tools for the analysis and dissemination of these images over the Web for the study of visual features correlated with precancerous neoplasia and cancer. To enable indexing of images of the cervix, it is essential to develop algorithms for the segmentation of regions of interest, such as acetowhitened regions, and automatic identification and classification of regions exhibiting mosaicism and punctation. Success of such algorithms depends, primarily, on the selection of relevant features representing the region of interest. We present color and geometric features based statistical classification and segmentation algorithms yielding excellent identification of the regions of interest. The distinct classification of the mosaic regions from the non-mosaic ones has been obtained by clustering multiple geometric and color features of the segmented sections using various morphological and statistical approaches. Such automated classification methodologies will facilitate content-based image retrieval from the digital archive of uterine cervix and have the potential of developing an image based screening tool for cervical cancer.
Naïve and Robust: Class-Conditional Independence in Human Classification Learning
ERIC Educational Resources Information Center
Jarecki, Jana B.; Meder, Björn; Nelson, Jonathan D.
2018-01-01
Humans excel in categorization. Yet from a computational standpoint, learning a novel probabilistic classification task involves severe computational challenges. The present paper investigates one way to address these challenges: assuming class-conditional independence of features. This feature independence assumption simplifies the inference…
32 CFR 1802.26 - Notification of decision and prohibition on adverse action.
Code of Federal Regulations, 2011 CFR
2011-07-01
... NATIONAL COUNTERINTELLIGENCE CENTER CHALLENGES TO CLASSIFICATION OF DOCUMENTS BY AUTHORIZED HOLDERS... regard to the challenge and that an appeal of the decision may be made to the Interagency Security Classification Appeals Panel (ISCAP) established pursuant to § 5.4 of this Order. ...
32 CFR 1802.26 - Notification of decision and prohibition on adverse action.
Code of Federal Regulations, 2010 CFR
2010-07-01
... NATIONAL COUNTERINTELLIGENCE CENTER CHALLENGES TO CLASSIFICATION OF DOCUMENTS BY AUTHORIZED HOLDERS... regard to the challenge and that an appeal of the decision may be made to the Interagency Security Classification Appeals Panel (ISCAP) established pursuant to § 5.4 of this Order. ...
NASA Astrophysics Data System (ADS)
Benini, Luca
2017-06-01
The "internet of everything" envisions trillions of connected objects loaded with high-bandwidth sensors requiring massive amounts of local signal processing, fusion, pattern extraction and classification. From the computational viewpoint, the challenge is formidable and can be addressed only by pushing computing fabrics toward massive parallelism and brain-like energy efficiency levels. CMOS technology can still take us a long way toward this goal, but technology scaling is losing steam. Energy efficiency improvement will increasingly hinge on architecture, circuits, design techniques such as heterogeneous 3D integration, mixed-signal preprocessing, event-based approximate computing and non-Von-Neumann architectures for scalable acceleration.
Approach to design neural cryptography: a generalized architecture and a heuristic rule.
Mu, Nankun; Liao, Xiaofeng; Huang, Tingwen
2013-06-01
Neural cryptography, a type of public key exchange protocol, is widely considered an effective method for sharing a common secret key between two neural networks over public channels. How to design neural cryptography remains a great challenge. In this paper, in order to provide an approach to solving this challenge, a generalized network architecture and a significant heuristic rule are designed. The proposed generic framework is named the tree state classification machine (TSCM), which extends and unifies the existing structures, i.e., the tree parity machine (TPM) and the tree committee machine (TCM). Furthermore, we carefully study and find that the heuristic rule can improve the security of TSCM-based neural cryptography. Therefore, TSCM and the heuristic rule can guide us in designing a great number of effective neural cryptography candidates, among which it is possible to achieve more secure instances. Significantly, in the light of TSCM and the heuristic rule, we further show that our designed neural cryptography outperforms TPM (the most secure model at present) in security. Finally, a series of numerical simulation experiments is provided to verify the validity and applicability of our results.
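A minimal sketch of neural key exchange with a classical tree parity machine (TPM), the baseline structure that TSCM generalizes. The parameters K, N, L and the Hebbian update follow the standard TPM construction; this is an illustration of the underlying mechanism, not the TSCM architecture or the paper's heuristic rule.

```python
# Tree parity machine synchronization over a public channel (illustrative).
import numpy as np

K, N, L = 3, 10, 4                                   # hidden units, inputs, weight bound
rng = np.random.default_rng(0)

class TPM:
    def __init__(self):
        self.w = rng.integers(-L, L + 1, size=(K, N))

    def output(self, x):
        sigma = np.sign((self.w * x).sum(axis=1))
        sigma[sigma == 0] = -1
        self.sigma = sigma
        return int(np.prod(sigma))

    def hebbian(self, x, tau):
        for k in range(K):
            if self.sigma[k] == tau:                 # update only agreeing hidden units
                self.w[k] = np.clip(self.w[k] + tau * x[k], -L, L)

a, b = TPM(), TPM()
steps = 0
while not np.array_equal(a.w, b.w):
    x = rng.choice([-1, 1], size=(K, N))             # public random input
    ta, tb = a.output(x), b.output(x)
    if ta == tb:                                     # learn only when outputs agree
        a.hebbian(x, ta)
        b.hebbian(x, tb)
    steps += 1
print("synchronized after", steps, "steps; shared weights:", a.w.ravel()[:8], "...")
```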
15 CFR 2008.7 - Challenges to classification.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 15 Commerce and Foreign Trade 3 2011-01-01 2011-01-01 false Challenges to classification. 2008.7 Section 2008.7 Commerce and Foreign Trade Regulations Relating to Foreign Trade Agreements OFFICE OF THE UNITED STATES TRADE REPRESENTATIVE REGULATIONS TO IMPLEMENT E.O. 12065; OFFICE OF THE UNITED STATES TRADE...
15 CFR 2008.7 - Challenges to classification.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 15 Commerce and Foreign Trade 3 2013-01-01 2013-01-01 false Challenges to classification. 2008.7 Section 2008.7 Commerce and Foreign Trade Regulations Relating to Foreign Trade Agreements OFFICE OF THE UNITED STATES TRADE REPRESENTATIVE REGULATIONS TO IMPLEMENT E.O. 12065; OFFICE OF THE UNITED STATES TRADE...
15 CFR 2008.7 - Challenges to classification.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 15 Commerce and Foreign Trade 3 2014-01-01 2014-01-01 false Challenges to classification. 2008.7 Section 2008.7 Commerce and Foreign Trade Regulations Relating to Foreign Trade Agreements OFFICE OF THE UNITED STATES TRADE REPRESENTATIVE REGULATIONS TO IMPLEMENT E.O. 12065; OFFICE OF THE UNITED STATES TRADE...
15 CFR 2008.7 - Challenges to classification.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 15 Commerce and Foreign Trade 3 2012-01-01 2012-01-01 false Challenges to classification. 2008.7 Section 2008.7 Commerce and Foreign Trade Regulations Relating to Foreign Trade Agreements OFFICE OF THE UNITED STATES TRADE REPRESENTATIVE REGULATIONS TO IMPLEMENT E.O. 12065; OFFICE OF THE UNITED STATES TRADE...
15 CFR 2008.7 - Challenges to classification.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Challenges to classification. 2008.7 Section 2008.7 Commerce and Foreign Trade Regulations Relating to Foreign Trade Agreements OFFICE OF THE UNITED STATES TRADE REPRESENTATIVE REGULATIONS TO IMPLEMENT E.O. 12065; OFFICE OF THE UNITED STATES TRADE...
NASA Astrophysics Data System (ADS)
Agarwal, Smriti; Bisht, Amit Singh; Singh, Dharmendra; Pathak, Nagendra Prasad
2014-12-01
Millimetre wave (MMW) imaging is gaining tremendous interest among researchers, with potential applications in security checks, standoff personnel screening, automotive collision avoidance, and more. Current state-of-the-art imaging techniques, viz. microwave and X-ray imaging, suffer from low resolution and harmful ionizing radiation, respectively. In contrast, MMW imaging operates at lower power and is non-ionizing, hence medically safe. Despite these favourable attributes, MMW imaging faces several challenges: it is still a relatively unexplored area and lacks a suitable imaging methodology for extracting complete target information. With these challenges in view, an active MMW imaging radar system at 60 GHz was designed for standoff imaging. A C-scan (horizontal and vertical scanning) methodology was developed that provides a cross-range resolution of 8.59 mm. The paper further details a suitable target identification and classification methodology. For identification of regular-shaped targets, a mean-standard deviation based segmentation technique was formulated and validated on a different target shape. For classification, a probability density function based target material discrimination methodology was proposed and validated on a different dataset. Lastly, a novel artificial neural network based, scale- and rotation-invariant image reconstruction methodology is proposed to counter image distortions caused by noise, rotation or scale variations. Once trained with sample images, the designed neural network automatically takes care of these deformations and successfully reconstructs the corrected image for the test targets. The techniques developed in this paper are tested and validated using four different regular shapes, viz. rectangle, square, triangle and circle.
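A minimal sketch of mean/standard-deviation based segmentation of a scanned image, under the assumption that target pixels depart from the background statistics by more than k standard deviations; the factor k and the synthetic reflectivity values are illustrative.

```python
# Mean/standard-deviation thresholding of a synthetic C-scan (illustrative).
import numpy as np

def segment_mean_std(image, k=2.0):
    mu, sigma = image.mean(), image.std()
    return np.abs(image - mu) > k * sigma          # boolean target mask

rng = np.random.default_rng(0)
scan = rng.normal(0.2, 0.05, size=(64, 64))        # synthetic background reflectivity
scan[20:30, 25:45] += 0.6                          # synthetic rectangular target
mask = segment_mean_std(scan)
print("target pixels found:", int(mask.sum()))
```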
Mapping of Coral Reef Environment in the Arabian Gulf Using Multispectral Remote Sensing
NASA Astrophysics Data System (ADS)
Ben-Romdhane, H.; Marpu, P. R.; Ghedira, H.; Ouarda, T. B. M. J.
2016-06-01
Coral reefs of the Arabian Gulf are subject to several pressures, thus requiring conservation actions. Well-designed conservation plans involve efficient mapping and monitoring systems. Satellite remote sensing is a cost-effective tool for seafloor mapping at large scales. Multispectral remote sensing of coastal habitats, like those of the Arabian Gulf, presents a special challenge due to their complexity and heterogeneity. The present study evaluates the potential of the multispectral sensor DubaiSat-2 for mapping benthic communities of the United Arab Emirates. We propose to use a spectral-spatial method that includes multilevel segmentation, nonlinear feature analysis and ensemble learning methods. A Support Vector Machine (SVM) is used for comparison of classification performance. Comparative data were derived from the habitat maps published by the Environment Agency-Abu Dhabi. The spectral-spatial method produced 96.41% mapping accuracy, while the SVM classification is assessed to be 94.17% accurate. The adoption of these methods can help achieve well-designed coastal management plans in the region.
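A minimal sketch of the SVM comparison baseline: classifying benthic habitat classes from per-pixel multispectral band values. The band count, class names, kernel settings, and synthetic data are assumptions; the DubaiSat-2 bands and training polygons are not reproduced here.

```python
# Per-pixel SVM habitat classification on synthetic band values (illustrative).
import numpy as np
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(3)
n = 900
bands = rng.normal(size=(n, 4))                     # 4 spectral bands (assumed)
labels = rng.integers(0, 3, n)                      # 0=coral, 1=sand, 2=seagrass (assumed)
bands += np.eye(4)[:3][labels] * 1.5                # give each class a separable mean

X_tr, X_te, y_tr, y_te = train_test_split(bands, labels, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```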
Code of Federal Regulations, 2010 CFR
2010-04-01
... of, amendment of, accounting of disclosures of, or challenge to classification of records. 171.52..., amendment of, accounting of disclosures of, or challenge to classification of records. (a) Right of... records, amendment of records, accounting of disclosures of records, or any authorized holder of...
A Survey Of Architectural Approaches for Managing Embedded DRAM and Non-volatile On-chip Caches
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittal, Sparsh; Vetter, Jeffrey S; Li, Dong
Recent trends of CMOS scaling and the increasing number of on-chip cores have led to a large increase in the size of on-chip caches. Since SRAM has low density and consumes a large amount of leakage power, its use in designing on-chip caches has become more challenging. To address this issue, researchers are exploring the use of several emerging memory technologies, such as embedded DRAM, spin transfer torque RAM, resistive RAM, phase change RAM and domain wall memory. In this paper, we survey the architectural approaches proposed for designing memory systems and, specifically, caches with these emerging memory technologies. To highlight their similarities and differences, we present a classification of these technologies and architectural approaches based on their key characteristics. We also briefly summarize the challenges in using these technologies for architecting caches. We believe that this survey will help the readers gain insights into the emerging memory device technologies, and their potential use in designing future computing systems.
Wang, Liansheng; Li, Shusheng; Chen, Rongzhen; Liu, Sze-Yu; Chen, Jyh-Cheng
2016-01-01
Accurate segmentation and classification of the different anatomical structures of teeth from medical images plays an essential role in many clinical applications. Usually, the anatomical structures of teeth are manually labelled by experienced clinical doctors, which is time consuming. However, automatic segmentation and classification is a challenging task because the anatomical structures and surroundings of the tooth in medical images are rather complex. Therefore, in this paper, we propose an effective framework designed to segment the tooth with a Selective Binary and Gaussian Filtering Regularized Level Set (GFRLS) method improved by fully utilizing three-dimensional (3D) information, and to classify the tooth by employing an unsupervised learning Pulse Coupled Neural Networks (PCNN) model. In order to evaluate the proposed method, experiments were conducted on different datasets of mandibular molars, and the experimental results show that our method achieves better accuracy and robustness compared to four other state-of-the-art clustering methods.
Wang, Liansheng; Li, Shusheng; Chen, Rongzhen; Liu, Sze-Yu; Chen, Jyh-Cheng
2017-04-01
Accurate classification of the different anatomical structures of teeth from medical images provides crucial information for stress analysis in dentistry. Usually, the anatomical structures of teeth are manually labeled by experienced clinical doctors, which is time consuming. However, automatic segmentation and classification is a challenging task because the anatomical structures and surroundings of the tooth in medical images are rather complex. Therefore, in this paper, we propose an effective framework designed to segment the tooth with a Selective Binary and Gaussian Filtering Regularized Level Set (GFRLS) method improved by fully utilizing 3-dimensional (3D) information, and to classify the tooth by employing unsupervised learning, i.e., the k-means++ method. In order to evaluate the proposed method, experiments were conducted on extensive datasets of mandibular molars. The experimental results show that our method achieves higher accuracy and robustness compared to three other clustering methods. Copyright © 2016 Elsevier Ltd. All rights reserved.
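A minimal sketch of the unsupervised classification step using k-means++ initialization, with invented voxel features and an assumed three anatomical classes; the level-set segmentation that would supply the real voxels is not shown.

```python
# k-means++ clustering of synthetic tooth-voxel features (illustrative).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# synthetic voxel features, e.g. [intensity, distance-to-surface] (assumed)
enamel = rng.normal([2.0, 0.2], 0.1, size=(300, 2))
dentin = rng.normal([1.2, 0.8], 0.1, size=(300, 2))
pulp   = rng.normal([0.4, 1.5], 0.1, size=(300, 2))
voxels = np.vstack([enamel, dentin, pulp])

km = KMeans(n_clusters=3, init="k-means++", n_init=10, random_state=0).fit(voxels)
print("cluster sizes:", np.bincount(km.labels_))
```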
Kauppi, Jukka-Pekka; Martikainen, Kalle; Ruotsalainen, Ulla
2010-12-01
The central purpose of passive signal intercept receivers is to perform automatic categorization of unknown radar signals. Currently, there is an urgent need to develop intelligent classification algorithms for these devices due to the emerging complexity of radar waveforms. In particular, multifunction radars (MFRs), capable of performing several simultaneous tasks by utilizing complex, dynamically varying scheduled waveforms, are a major challenge for automatic pattern classification systems. To assist recognition of complex radar emissions in modern intercept receivers, we have developed a novel method to recognize dynamically varying pulse repetition interval (PRI) modulation patterns emitted by MFRs. We use robust feature extraction and classifier design techniques to support recognition in unpredictable real-world signal environments. We classify received pulse trains hierarchically, which allows unambiguous detection of the subpatterns using a sliding window. The accuracy, robustness and reliability of the technique are demonstrated with extensive simulations using both static and dynamically varying PRI modulation patterns. Copyright © 2010 Elsevier Ltd. All rights reserved.
Kim, Jeong Tae; Kim, Youn Hwan; Ghanem, Ali M
2015-11-01
Complex defects present structural and functional challenges to reconstructive surgeons. Compared with multiple free flaps or staged reconstruction, the use of chimeric flaps to reconstruct such defects has many advantages, such as a reduced number of operative procedures, reduced donor site morbidity, and preservation of recipient vessels. With the increased popularity of perforator flaps, chimeric flap harvest and design has benefited from the 'perforator concept', enabling more versatile and better reconstructive solutions. This article discusses perforator-based chimeric flaps and presents a practice-based classification system that incorporates the perforator flap concept into "Perforator Chimerism". The authors analyzed the variety of chimeric patterns used in 31 consecutive cases to present an illustrative case series and their new classification system. Accordingly, chimeric flaps are classified into four types. Type I: Classical Chimerism, Type II: Anastomotic Chimerism, Type III: Perforator Chimerism and Type IV: Mixed Chimerism. Type I is based on specific source vessel anatomy, whilst Type II requires microvascular anastomosis to create the chimeric reconstructive solution. Type III chimeric flaps utilize the perforator concept to raise two tissue components without microvascular anastomosis between them. Type IV chimeric flaps are mixed-type flaps comprising any combination of Types I to III. Incorporation of the perforator concept in planning and designing chimeric flaps has allowed safe, effective and aesthetically superior reconstruction of complex defects. The new classification system helps reconstructive surgeons and trainees understand chimeric flap design, facilitating effective incorporation of this important reconstructive technique into the armamentarium of the reconstruction toolbox. Copyright © 2015 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.
A predictive machine learning approach for microstructure optimization and materials design
Liu, Ruoqian; Kumar, Abhishek; Chen, Zhengzhang; ...
2015-06-23
This paper addresses an important materials engineering question: How can one identify the complete space (or as much of it as possible) of microstructures that are theoretically predicted to yield the desired combination of properties demanded by a selected application? We present a problem involving design of magnetoelastic Fe-Ga alloy microstructure for enhanced elastic, plastic and magnetostrictive properties. While theoretical models for computing properties given the microstructure are known for this alloy, inversion of these relationships to obtain microstructures that lead to desired properties is challenging, primarily due to the high dimensionality of microstructure space, multi-objective design requirement and non-uniqueness of solutions. These challenges render traditional search-based optimization methods incompetent in terms of both searching efficiency and result optimality. In this paper, a route to address these challenges using a machine learning methodology is proposed. A systematic framework consisting of random data generation, feature selection and classification algorithms is developed. In conclusion, experiments with five design problems that involve identification of microstructures that satisfy both linear and nonlinear property constraints show that our framework outperforms traditional optimization methods with the average running time reduced by as much as 80% and with optimality that would not be achieved otherwise.
Addressing current challenges in cancer immunotherapy with mathematical and computational modelling.
Konstorum, Anna; Vella, Anthony T; Adler, Adam J; Laubenbacher, Reinhard C
2017-06-01
The goal of cancer immunotherapy is to boost a patient's immune response to a tumour. Yet, the design of an effective immunotherapy is complicated by various factors, including a potentially immunosuppressive tumour microenvironment, immune-modulating effects of conventional treatments and therapy-related toxicities. These complexities can be incorporated into mathematical and computational models of cancer immunotherapy that can then be used to aid in rational therapy design. In this review, we survey modelling approaches under the umbrella of the major challenges facing immunotherapy development, which encompass tumour classification, optimal treatment scheduling and combination therapy design. Although overlapping, each challenge has presented unique opportunities for modellers to make contributions using analytical and numerical analysis of model outcomes, as well as optimization algorithms. We discuss several examples of models that have grown in complexity as more biological information has become available, showcasing how model development is a dynamic process interlinked with the rapid advances in tumour-immune biology. We conclude the review with recommendations for modellers both with respect to methodology and biological direction that might help keep modellers at the forefront of cancer immunotherapy development. © 2017 The Author(s).
Williams, Hywel D; Sassene, Philip; Kleberg, Karen; Calderone, Marilyn; Igonin, Annabel; Jule, Eduardo; Vertommen, Jan; Blundell, Ross; Benameur, Hassan; Müllertz, Anette; Porter, Christopher J H; Pouton, Colin W
2014-08-01
The Lipid Formulation Classification System Consortium looks to develop standardized in vitro tests and to generate much-needed performance criteria for lipid-based formulations (LBFs). This article highlights the value of performing a second, more stressful digestion test to identify LBFs near a performance threshold and to facilitate lead formulation selection in instances where several LBF prototypes perform adequately under standard digestion conditions (but where further discrimination is necessary). Stressed digestion tests can be designed based on an understanding of the factors that affect LBF performance, including the degree of supersaturation generated on dispersion/digestion. Stresses evaluated included decreasing LBF concentration (↓LBF), increasing bile salt, and decreasing pH. Their capacity to stress LBFs was dependent on LBF composition and drug type: ↓LBF was a stressor to medium-chain glyceride-rich LBFs but not to more hydrophilic surfactant-rich LBFs, whereas decreasing pH stressed tolfenamic acid LBFs but not fenofibrate LBFs. Lastly, a new Performance Classification System that is independent of LBF composition is proposed to promote standardized LBF comparisons, encourage robust LBF development, and facilitate dialogue with the regulatory authorities. This classification system is based on the concept that performance evaluations across three in vitro tests, designed to subject a LBF to progressively more challenging conditions, will enable effective LBF discrimination and performance grading. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
NASA Technical Reports Server (NTRS)
Andrews, Alison E.
1987-01-01
An approach to analyzing CFD knowledge-based systems is proposed which is based, in part, on the concept of knowledge-level analysis. Consideration is given to the expert cooling fan design system, the PAN AIR knowledge system, grid adaptation, and expert zonal grid generation. These AI/CFD systems demonstrate that current AI technology can be successfully applied to well-formulated problems that are solved by means of classification or selection of preenumerated solutions.
Engineering and Design: Rock Mass Classification Data Requirements for Rippability
1983-06-30
Engineering and Design: Rock Mass Classification Data Requirements for Rippability. Engineer Technical Letter 1110-2-282. Distribution Restriction Statement: Approved for public release. This ETL contains information on data requirements for rock mass classification related to rippability.
WFIRST: Microlensing Analysis Data Challenge
NASA Astrophysics Data System (ADS)
Street, Rachel; WFIRST Microlensing Science Investigation Team
2018-01-01
WFIRST will produce thousands of high-cadence, high-photometric-precision lightcurves of microlensing events, from which a wealth of planetary and stellar systems will be discovered. However, the analysis of such lightcurves has historically been very time consuming and expensive in both labor and computing facilities. This poses a potential bottleneck to deriving the full science potential of the WFIRST mission. To address this problem, the WFIRST Microlensing Science Investigation Team is designing a series of data challenges to stimulate research on outstanding problems of microlensing analysis. These range from the classification and modeling of triple-lens events to methods for efficiently yet thoroughly searching a high-dimensional parameter space for the best-fitting models.
Vehicle detection in aerial surveillance using dynamic Bayesian networks.
Cheng, Hsu-Yung; Weng, Chih-Chia; Chen, Yi-Ying
2012-04-01
We present an automatic vehicle detection system for aerial surveillance. In this system, we depart from the existing frameworks for vehicle detection in aerial surveillance, which are either region based or sliding-window based, and design a pixelwise classification method for vehicle detection. The novelty lies in the fact that, despite performing pixelwise classification, relations among neighboring pixels in a region are preserved in the feature extraction process. We consider features including vehicle colors and local features. For vehicle color extraction, we utilize a color transform to separate vehicle colors and nonvehicle colors effectively. For edge detection, we apply moment preserving to adjust the thresholds of the Canny edge detector automatically, which increases the adaptability and accuracy of detection in various aerial images. Afterward, a dynamic Bayesian network (DBN) is constructed for classification. We convert regional local features into quantitative observations that can be referenced when applying pixelwise classification via the DBN. Experiments were conducted on a wide variety of aerial videos. The results demonstrate the flexibility and good generalization ability of the proposed method on a challenging dataset with aerial surveillance images taken at different heights and under different camera angles.
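A minimal sketch of adaptive Canny thresholding. The paper derives the thresholds by moment preserving; here a common median-based heuristic stands in for that step, so this is an illustrative substitute rather than the method actually used.

```python
# Canny edge detection with data-driven thresholds (median heuristic, illustrative).
import cv2
import numpy as np

def auto_canny(gray, sigma=0.33):
    m = np.median(gray)
    lower = int(max(0, (1.0 - sigma) * m))
    upper = int(min(255, (1.0 + sigma) * m))
    return cv2.Canny(gray, lower, upper)

rng = np.random.default_rng(0)
gray = rng.normal(120, 10, (100, 100)).clip(0, 255).astype(np.uint8)  # synthetic background
cv2.rectangle(gray, (30, 30), (70, 60), 200, -1)                      # synthetic bright "vehicle"
edges = auto_canny(gray)
print("edge pixels:", int((edges > 0).sum()))
```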
NASA Astrophysics Data System (ADS)
Arya, Ankit S.; Anderson, Derek T.; Bethel, Cindy L.; Carruth, Daniel
2013-05-01
A vision system was designed for people detection to support SWAT team members operating in challenging environments such as low-to-no light, smoke, etc. When mounted on a mobile robot platform, the vision system will enable the robot to function as an effective member of the SWAT team: providing surveillance information, making first contact with suspects, and providing safe entry for team members. The vision task is challenging because SWAT team members are typically concealed, carry various equipment such as shields, and perform tactical and stealthy maneuvers. Occlusion is a particular challenge because team members operate in close proximity to one another. An uncooled electro-optical/long-wave infrared (EO/LWIR) camera, operating from 7.5 to 13.5 μm, was used. A unique thermal dataset of SWAT team members from multiple teams performing tactical maneuvers was collected during monthly training exercises. Our approach consisted of two stages: an object detector trained on people to find candidate windows, and a secondary feature extraction, multi-kernel (MK) aggregation and classification step to distinguish between SWAT team members and civilians. Two types of thermal features, local and global, are presented based on maximally stable extremal region (MSER) blob detection. Support vector machine (SVM) classification results of approximately 70-93% for SWAT team member detection are reported, based on exploring different combinations of visual information in the training data.
46 CFR 503.58 - Appeals of denials of mandatory declassification review requests.
Code of Federal Regulations, 2011 CFR
2011-10-01
... PUBLIC INFORMATION Information Security Program § 503.58 Appeals of denials of mandatory declassification... Interagency Security Classification Appeals Panel. The appeal should be addressed to, Executive Secretary, Interagency Security Classification Appeals Panel, Attn: Classification Challenge Appeals, c/o Information...
46 CFR 503.58 - Appeals of denials of mandatory declassification review requests.
Code of Federal Regulations, 2010 CFR
2010-10-01
... PUBLIC INFORMATION Information Security Program § 503.58 Appeals of denials of mandatory declassification... Security Classification Appeals Panel. The appeal should be addressed to, Executive Secretary, Interagency Security Classification Appeals Panel, Attn: Classification Challenge Appeals, c/o Information Security...
A review of classification algorithms for EEG-based brain–computer interfaces: a 10 year update
NASA Astrophysics Data System (ADS)
Lotte, F.; Bougrain, L.; Cichocki, A.; Clerc, M.; Congedo, M.; Rakotomamonjy, A.; Yger, F.
2018-06-01
Objective. Most current electroencephalography (EEG)-based brain–computer interfaces (BCIs) are based on machine learning algorithms. There is a large diversity of classifier types that are used in this field, as described in our 2007 review paper. Now, approximately ten years after this review publication, many new algorithms have been developed and tested to classify EEG signals in BCIs. The time is therefore ripe for an updated review of EEG classification algorithms for BCIs. Approach. We surveyed the BCI and machine learning literature from 2007 to 2017 to identify the new classification approaches that have been investigated to design BCIs. We synthesize these studies in order to present such algorithms, to report how they were used for BCIs, what were the outcomes, and to identify their pros and cons. Main results. We found that the recently designed classification algorithms for EEG-based BCIs can be divided into four main categories: adaptive classifiers, matrix and tensor classifiers, transfer learning and deep learning, plus a few other miscellaneous classifiers. Among these, adaptive classifiers were demonstrated to be generally superior to static ones, even with unsupervised adaptation. Transfer learning can also prove useful although the benefits of transfer learning remain unpredictable. Riemannian geometry-based methods have reached state-of-the-art performances on multiple BCI problems and deserve to be explored more thoroughly, along with tensor-based methods. Shrinkage linear discriminant analysis and random forests also appear particularly useful for small training samples settings. On the other hand, deep learning methods have not yet shown convincing improvement over state-of-the-art BCI methods. Significance. This paper provides a comprehensive overview of the modern classification algorithms used in EEG-based BCIs, presents the principles of these methods and guidelines on when and how to use them. It also identifies a number of challenges to further advance EEG classification in BCI.
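A minimal sketch of one of the classifiers the review singles out for small training samples, shrinkage LDA, using scikit-learn's automatic (Ledoit-Wolf) shrinkage; the synthetic features stand in for EEG band-power features, and the trial and feature counts are assumptions.

```python
# Shrinkage LDA on a small-sample, high-dimensional synthetic dataset (illustrative).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_trials, n_features = 60, 40                      # few trials, many features (assumed)
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 2, n_trials)
X[y == 1, :5] += 0.8                               # weak class-dependent shift

lda = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")  # Ledoit-Wolf shrinkage
print("CV accuracy:", cross_val_score(lda, X, y, cv=5).mean())
```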
Ecosystem services provided by a complex coastal region: challenges of classification and mapping.
Sousa, Lisa P; Sousa, Ana I; Alves, Fátima L; Lillebø, Ana I
2016-03-11
A variety of ecosystem services classification systems and mapping approaches is available in the scientific and technical literature, and these need to be selected and adapted when applied to complex territories (e.g., at the interface between water and land, estuary and sea). This paper provides a framework for addressing ecosystem services in complex coastal regions. The roadmap comprises the definition of the exact geographic boundaries of the study area; the use of CICES (Common International Classification of Ecosystem Services) for ecosystem services identification and classification; and the definition of qualitative indicators that serve as the basis for mapping the ecosystem services. Due to its complexity, the Ria de Aveiro coastal region was selected as the case study, presenting an opportunity to explore the application of such approaches at a regional scale. The main challenges of implementing the proposed roadmap, together with its advantages, are discussed in this research. The results highlight the importance of considering both the connectivity of natural systems and the complexity of the governance framework; the flexibility and robustness of CICES, but also the challenges of applying it at a regional scale; and the challenges regarding ecosystem services mapping.
Ecosystem services provided by a complex coastal region: challenges of classification and mapping
Sousa, Lisa P.; Sousa, Ana I.; Alves, Fátima L.; Lillebø, Ana I.
2016-01-01
A variety of ecosystem services classification systems and mapping approaches is available in the scientific and technical literature, and these need to be selected and adapted when applied to complex territories (e.g., at the interface between water and land, estuary and sea). This paper provides a framework for addressing ecosystem services in complex coastal regions. The roadmap comprises the definition of the exact geographic boundaries of the study area; the use of CICES (Common International Classification of Ecosystem Services) for ecosystem services identification and classification; and the definition of qualitative indicators that serve as the basis for mapping the ecosystem services. Due to its complexity, the Ria de Aveiro coastal region was selected as the case study, presenting an opportunity to explore the application of such approaches at a regional scale. The main challenges of implementing the proposed roadmap, together with its advantages, are discussed in this research. The results highlight the importance of considering both the connectivity of natural systems and the complexity of the governance framework; the flexibility and robustness of CICES, but also the challenges of applying it at a regional scale; and the challenges regarding ecosystem services mapping. PMID:26964892
Normalization of relative and incomplete temporal expressions in clinical narratives.
Sun, Weiyi; Rumshisky, Anna; Uzuner, Ozlem
2015-09-01
To improve the normalization of relative and incomplete temporal expressions (RI-TIMEXes) in clinical narratives. We analyzed the RI-TIMEXes in temporally annotated corpora and propose two hypotheses regarding the normalization of RI-TIMEXes in the clinical narrative domain: the anchor point hypothesis and the anchor relation hypothesis. We annotated the RI-TIMEXes in three corpora to study the characteristics of RI-TIMEXes in different domains. This informed the design of our RI-TIMEX normalization system for the clinical domain, which consists of an anchor point classifier, an anchor relation classifier, and a rule-based RI-TIMEX text span parser. We experimented with different feature sets and performed an error analysis for each system component. The annotation confirmed the hypotheses that we can simplify the RI-TIMEX normalization task using two multi-label classifiers. Our system achieves anchor point classification, anchor relation classification, and rule-based parsing accuracy of 74.68%, 87.71%, and 57.2% (82.09% under relaxed matching criteria), respectively, on the held-out test set of the 2012 i2b2 temporal relation challenge. Experiments with feature sets reveal some interesting findings; for example, the verbal tense feature does not inform anchor relation classification in clinical narratives as much as the tokens near the RI-TIMEX. Error analysis showed that underrepresented anchor point and anchor relation classes are difficult to detect. We formulate the RI-TIMEX normalization problem as a pair of multi-label classification problems. Considering only RI-TIMEX extraction and normalization, the system achieves a statistically significant improvement over the RI-TIMEX results of the best systems in the 2012 i2b2 challenge. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Brain Decoding-Classification of Hand Written Digits from fMRI Data Employing Bayesian Networks
Yargholi, Elahe'; Hossein-Zadeh, Gholam-Ali
2016-01-01
We are frequently exposed to handwritten digits 0–9 in modern life. Success in decoding-classification of handwritten digits helps us understand the corresponding brain mechanisms and processes and assists substantially in designing more efficient brain–computer interfaces. However, all digits belong to the same semantic category, and the similarity in appearance of handwritten digits makes this decoding-classification a challenging problem. In the present study, for the first time, an augmented naïve Bayes classifier is used to classify functional Magnetic Resonance Imaging (fMRI) measurements to decode the handwritten digits, taking advantage of brain connectivity information in the decoding-classification. fMRI was recorded from three healthy participants, with an age range of 25–30. Results in different brain lobes (frontal, occipital, parietal, and temporal) show that utilizing connectivity information significantly improves decoding-classification, and the capabilities of the different brain lobes in decoding-classification of handwritten digits were compared. In addition, in each lobe the most contributing areas and brain connectivities were determined, and connectivities with short distances between their endpoints were found to be more efficient. Moreover, a data-driven method was applied to investigate the similarity of brain areas in responding to stimuli, revealing both similarly active areas and active mechanisms during this experiment. An interesting finding was that during the experiment of watching handwritten digits there were several active networks (visual, working memory, motor, and language processing), but the most task-relevant one was the language processing network, according to the voxel selection. PMID:27468261
Automatic intelligibility classification of sentence-level pathological speech
Kim, Jangwon; Kumar, Naveen; Tsiartas, Andreas; Li, Ming; Narayanan, Shrikanth S.
2014-01-01
Pathological speech usually refers to the condition of speech distortion resulting from atypicalities in voice and/or in the articulatory mechanisms owing to disease, illness or other physical or biological insult to the production system. Although automatic evaluation of speech intelligibility and quality could come in handy in these scenarios to assist experts in diagnosis and treatment design, the many sources and types of variability often make it a very challenging computational processing problem. In this work we propose novel sentence-level features to capture abnormal variation in the prosodic, voice quality and pronunciation aspects in pathological speech. In addition, we propose a post-classification posterior smoothing scheme which refines the posterior of a test sample based on the posteriors of other test samples. Finally, we perform feature-level fusions and subsystem decision fusion for arriving at a final intelligibility decision. The performances are tested on two pathological speech datasets, the NKI CCRT Speech Corpus (advanced head and neck cancer) and the TORGO database (cerebral palsy or amyotrophic lateral sclerosis), by evaluating classification accuracy without overlapping subjects’ data among training and test partitions. Results show that the feature sets of each of the voice quality subsystem, prosodic subsystem, and pronunciation subsystem, offer significant discriminating power for binary intelligibility classification. We observe that the proposed posterior smoothing in the acoustic space can further reduce classification errors. The smoothed posterior score fusion of subsystems shows the best classification performance (73.5% for unweighted, and 72.8% for weighted, average recalls of the binary classes). PMID:25414544
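A minimal sketch of post-classification posterior smoothing in the spirit described above: each test sample's posterior is blended with the posteriors of its nearest test neighbours in feature space. The neighbourhood size, mixing weight, and use of Euclidean feature distance are assumptions, not the paper's exact scheme.

```python
# Smoothing classifier posteriors across test samples (illustrative sketch).
import numpy as np
from sklearn.neighbors import NearestNeighbors

def smooth_posteriors(features, posteriors, k=5, alpha=0.5):
    """posteriors: (n_samples, n_classes) raw classifier outputs."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(features)
    _, idx = nn.kneighbors(features)               # idx[:, 0] is the sample itself
    neighbour_mean = posteriors[idx[:, 1:]].mean(axis=1)
    return alpha * posteriors + (1 - alpha) * neighbour_mean

rng = np.random.default_rng(0)
feats = rng.normal(size=(20, 6))                   # acoustic features of test samples
post = rng.dirichlet([1, 1], size=20)              # raw 2-class posteriors
print(smooth_posteriors(feats, post)[:3])
```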
Cancer classification in the genomic era: five contemporary problems.
Song, Qingxuan; Merajver, Sofia D; Li, Jun Z
2015-10-19
Classification is an everyday instinct as well as a full-fledged scientific discipline. Throughout the history of medicine, disease classification is central to how we develop knowledge, make diagnosis, and assign treatment. Here, we discuss the classification of cancer and the process of categorizing cancer subtypes based on their observed clinical and biological features. Traditionally, cancer nomenclature is primarily based on organ location, e.g., "lung cancer" designates a tumor originating in lung structures. Within each organ-specific major type, finer subgroups can be defined based on patient age, cell type, histological grades, and sometimes molecular markers, e.g., hormonal receptor status in breast cancer or microsatellite instability in colorectal cancer. In the past 15+ years, high-throughput technologies have generated rich new data regarding somatic variations in DNA, RNA, protein, or epigenomic features for many cancers. These data, collected for increasingly large tumor cohorts, have provided not only new insights into the biological diversity of human cancers but also exciting opportunities to discover previously unrecognized cancer subtypes. Meanwhile, the unprecedented volume and complexity of these data pose significant challenges for biostatisticians, cancer biologists, and clinicians alike. Here, we review five related issues that represent contemporary problems in cancer taxonomy and interpretation. (1) How many cancer subtypes are there? (2) How can we evaluate the robustness of a new classification system? (3) How are classification systems affected by intratumor heterogeneity and tumor evolution? (4) How should we interpret cancer subtypes? (5) Can multiple classification systems co-exist? While related issues have existed for a long time, we will focus on those aspects that have been magnified by the recent influx of complex multi-omics data. Exploration of these problems is essential for data-driven refinement of cancer classification and the successful application of these concepts in precision medicine.
Kirshblum, S C; Biering-Sorensen, F; Betz, R; Burns, S; Donovan, W; Graves, D E; Johansen, M; Jones, L; Mulcahey, M J; Rodriguez, G M; Schmidt-Read, M; Steeves, J D; Tansey, K; Waring, W
2014-03-01
The International Standards for the Neurological Classification of Spinal Cord Injury (ISNCSCI) is routinely used to determine the levels of injury and to classify the severity of the injury. Questions are often posed to the International Standards Committee of the American Spinal Injury Association regarding the classification. The committee felt that disseminating some of the challenging questions posed, as well as the responses, would be of benefit for professionals utilizing the ISNCSCI. Case scenarios that were submitted to the committee are presented with the responses as well as the thought processes considered by the committee members. The importance of this documentation is to clarify some points as well as update the SCI community regarding possible revisions that will be needed in the future based upon some rules that require clarification.
Serpentine Robots for Inspection Tasks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choset, Howie
2003-09-11
Serpentine robots are snake-like devices that can use their internal degrees of freedom to thread through tightly packed volumes, accessing locations that people or conventional machinery cannot. These devices are ideally suited for minimally invasive inspection tasks where the surrounding areas do not have to be disturbed. Applications for these devices therefore include inspection of underground tanks and other storage facilities for classification purposes. This work deals with the design, construction, and control of a serpentine robot. One challenge lies in developing a device that can lift itself in three dimensions, which is necessary for the inspection tasks. The other challenge, in control, deals with coordinating all of the internal degrees of freedom to enact purposeful motion.
14 CFR 21.93 - Classification of changes in type design.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Classification of changes in type design... TRANSPORTATION AIRCRAFT CERTIFICATION PROCEDURES FOR PRODUCTS AND PARTS Changes to Type Certificates § 21.93 Classification of changes in type design. (a) In addition to changes in type design specified in paragraph (b) of...
Image classification of unlabeled malaria parasites in red blood cells.
Zheng Zhang; Ong, L L Sharon; Kong Fang; Matthew, Athul; Dauwels, Justin; Ming Dao; Asada, Harry
2016-08-01
This paper presents a method to detect unlabeled malaria parasites in red blood cells. The current "gold standard" for malaria diagnosis is microscopic examination of thick blood smear, a time consuming process requiring extensive training. Our goal is to develop an automated process to identify malaria infected red blood cells. Major issues in automated analysis of microscopy images of unstained blood smears include overlapping cells and oddly shaped cells. Our approach creates robust templates to detect infected and uninfected red cells. Histogram of Oriented Gradients (HOG) features are extracted from templates and used to train a classifier offline. Next, the Viola-Jones object detection framework is applied to detect infected and uninfected red cells and the image background. Results show our approach outperforms classification approaches with PCA features by 50% and cell detection algorithms applying Hough transforms by 24%. The majority of related work is designed to automatically detect stained parasites in blood smears where the cells are fixed. Although it is more challenging to design algorithms for unstained parasites, our methods will allow analysis of parasite progression in live cells under different drug treatments.
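The template-plus-classifier step described above can be illustrated with a minimal sketch: HOG descriptors from fixed-size cell patches feeding a linear SVM. The patch size, HOG parameters, and synthetic data below are placeholders and are not the authors' settings.

```python
# Minimal sketch of the classification step: HOG features from cell-sized
# patches feed a linear SVM (infected vs. uninfected). Patch size, HOG
# parameters, and the random data are illustrative only.
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

def hog_features(patches):
    """Extract HOG descriptors from a stack of grayscale patches."""
    return np.array([
        hog(p, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
        for p in patches
    ])

rng = np.random.default_rng(0)
patches = rng.random((200, 64, 64))       # placeholder 64x64 cell patches
labels = rng.integers(0, 2, size=200)     # 1 = infected, 0 = uninfected

clf = LinearSVC(C=1.0).fit(hog_features(patches), labels)
print(clf.predict(hog_features(patches[:5])))
```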
Learning optimal embedded cascades.
Saberian, Mohammad Javad; Vasconcelos, Nuno
2012-10-01
The problem of automatic and optimal design of embedded object detector cascades is considered. Two main challenges are identified: optimization of the cascade configuration and optimization of individual cascade stages, so as to achieve the best tradeoff between classification accuracy and speed, under a detection rate constraint. Two novel boosting algorithms are proposed to address these problems. The first, RCBoost, formulates boosting as a constrained optimization problem which is solved with a barrier penalty method. The constraint is the target detection rate, which is met at all iterations of the boosting process. This enables the design of embedded cascades of known configuration without extensive cross validation or heuristics. The second, ECBoost, searches over cascade configurations to achieve the optimal tradeoff between classification risk and speed. The two algorithms are combined into an overall boosting procedure, RCECBoost, which optimizes both the cascade configuration and its stages under a detection rate constraint, in a fully automated manner. Extensive experiments in face, car, pedestrian, and panda detection show that the resulting detectors achieve an accuracy versus speed tradeoff superior to those of previous methods.
Design of Multistable Origami Structures
NASA Astrophysics Data System (ADS)
Gillman, Andrew; Fuchi, Kazuko; Bazzan, Giorgio; Reich, Gregory; Alyanak, Edward; Buskohl, Philip
Origami is being transformed from an art to a mathematically robust method for device design in a variety of scientific applications. These structures often require multiple stable configurations, e.g., for efficient, well-controlled deployment. However, the discovery of origami structures with mechanical instabilities is challenging given the complex geometric nonlinearities and the large design space to investigate. To address this challenge, we have developed a topology optimization framework for discovering origami fold patterns that realize stable and metastable positions. The objective function targets both the desired stable positions and nonlinear loading profiles of specific vertices in the origami structure. Multistable compliant structures have been shown to offer advantages in their stability and efficiency, and certain origami fold patterns exhibit multistable behavior. Building on previous work on single-vertex multistability analysis, e.g., the waterbomb origami pattern, we are expanding the solution set of multistable mechanisms to include multiple vertices and a broader set of reference configurations. Collectively, these results enable an initial classification of geometry-induced mechanical instabilities that can be programmed into active material systems. This work was supported by the Air Force Office of Scientific Research.
Shahid, Mohammad; Shahzad Cheema, Muhammad; Klenner, Alexander; Younesi, Erfan; Hofmann-Apitius, Martin
2013-03-01
Systems pharmacological modeling of drug mode of action for the next generation of multitarget drugs may open new routes for drug design and discovery. Computational methods are widely used in this context, amongst which support vector machines (SVM) have proven successful in addressing the challenge of classifying drugs with similar features. We have applied such an SVM-based approach, namely SVM-based recursive feature elimination (SVM-RFE). We use the approach to predict the pharmacological properties of drugs widely used against complex neurodegenerative disorders (NDD) and to build an in-silico computational model for the binary classification of NDD drugs from other drugs. Application of an SVM-RFE model to a set of drugs successfully classified NDD drugs from non-NDD drugs and resulted in an overall accuracy of ∼80% under 10-fold cross-validation, using the 40 top-ranked molecular descriptors selected out of a total of 314 descriptors. Moreover, the SVM-RFE method outperformed linear discriminant analysis (LDA)-based feature selection and classification. The model reduced the multidimensional descriptor space of drugs dramatically and predicted NDD drugs with high accuracy, while avoiding overfitting. Based on these results, NDD-specific focused libraries of drug-like compounds can be designed and existing NDD-specific drugs can be characterized by a well-defined set of molecular descriptors. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
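A minimal SVM-RFE sketch follows, assuming a synthetic descriptor matrix; the counts 314 and 40 simply mirror the numbers quoted in the abstract, and the RFE step size is an arbitrary choice for speed.

```python
# Sketch of SVM-based recursive feature elimination (SVM-RFE) for a binary
# drug classification task on a synthetic 314-descriptor matrix.
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 314))      # 300 compounds x 314 molecular descriptors
y = rng.integers(0, 2, size=300)     # 1 = NDD drug, 0 = other

model = make_pipeline(
    StandardScaler(),
    RFE(SVC(kernel="linear"), n_features_to_select=40, step=10),  # keep top 40
    SVC(kernel="linear"),
)
scores = cross_val_score(model, X, y, cv=10)
print("10-fold accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```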
Cascaded deep decision networks for classification of endoscopic images
NASA Astrophysics Data System (ADS)
Murthy, Venkatesh N.; Singh, Vivek; Sun, Shanhui; Bhattacharya, Subhabrata; Chen, Terrence; Comaniciu, Dorin
2017-02-01
Both traditional and wireless capsule endoscopes can generate tens of thousands of images for each patient. It is desirable to have the majority of irrelevant images filtered out by automatic algorithms during an offline review process or to have automatic indication of highly suspicious areas during online guidance. This also applies to the newly invented endomicroscopy, where online indication of tumor classification plays a significant role. Image classification is a standard pattern recognition problem and is well studied in the literature. However, performance on the challenging endoscopic images still has room for improvement. In this paper, we present a novel Cascaded Deep Decision Network (CDDN) to improve image classification performance over standard deep neural network-based methods. During the learning phase, CDDN automatically builds a network which discards samples that are classified with high confidence scores by a previously trained network and concentrates only on the challenging samples which would be handled by the subsequent expert shallow networks. We validate CDDN using two different types of endoscopic imaging, which includes a polyp classification dataset and a tumor classification dataset. From both datasets we show that CDDN can outperform other methods by about 10%. In addition, CDDN can also be applied to other image classification problems.
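The cascading idea (confident samples resolved early, hard samples deferred to a later expert) can be sketched with any probabilistic classifiers; the models below stand in for the deep networks of CDDN and the confidence split is an arbitrary illustrative choice.

```python
# Generic confidence-cascade sketch: a first-stage classifier keeps its
# high-confidence predictions; uncertain samples are deferred to a second,
# more expressive "expert" model.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=30, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

stage1 = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
conf_tr = stage1.predict_proba(X_tr).max(axis=1)
threshold = np.quantile(conf_tr, 0.5)            # defer the least confident half
stage2 = RandomForestClassifier(random_state=0).fit(X_tr[conf_tr < threshold],
                                                    y_tr[conf_tr < threshold])

def cascade_predict(X):
    p1 = stage1.predict_proba(X)
    pred = p1.argmax(axis=1)
    defer = p1.max(axis=1) < threshold           # hard samples go to stage 2
    if defer.any():
        pred[defer] = stage2.predict(X[defer])
    return pred

print("cascade accuracy:", (cascade_predict(X_te) == y_te).mean())
```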
Vector quantizer designs for joint compression and terrain categorization of multispectral imagery
NASA Technical Reports Server (NTRS)
Gorman, John D.; Lyons, Daniel F.
1994-01-01
Two vector quantizer designs for compression of multispectral imagery and their impact on terrain categorization performance are evaluated. The mean-squared error (MSE) and classification performance of the two quantizers are compared, and it is shown that a simple two-stage design minimizing MSE subject to a constraint on classification performance has a significantly better classification performance than a standard MSE-based tree-structured vector quantizer followed by maximum likelihood classification. This improvement in classification performance is obtained with minimal loss in MSE performance. The results show that it is advantageous to tailor compression algorithm designs to the required data exploitation tasks. Applications of joint compression/classification include compression for the archival or transmission of Landsat imagery that is later used for land utility surveys and/or radiometric analysis.
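A toy illustration of evaluating a quantizer on both criteria follows; it uses a flat k-means codebook and a majority-vote label per codeword, not the tree-structured or constrained designs of the paper, and the multispectral data are synthetic.

```python
# Sketch of scoring a k-means vector quantizer jointly on mean-squared
# reconstruction error and on how well codewords preserve class labels.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
pixels = rng.normal(size=(5000, 6))          # 6-band "multispectral" vectors
labels = (pixels[:, 0] + pixels[:, 1] > 0).astype(int)   # toy terrain classes

vq = KMeans(n_clusters=64, n_init=4, random_state=0).fit(pixels)
codes = vq.predict(pixels)
recon = vq.cluster_centers_[codes]

mse = np.mean((pixels - recon) ** 2)
# label each codeword by the majority class of the pixels it quantizes
codeword_label = np.array([np.bincount(labels[codes == k]).argmax()
                           for k in range(vq.n_clusters)])
acc = (codeword_label[codes] == labels).mean()
print(f"MSE = {mse:.4f}, codeword-label agreement = {acc:.3f}")
```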
ERIC Educational Resources Information Center
Briggs, Derek C.; Circi, Ruhan
2017-01-01
Artificial Neural Networks (ANNs) have been proposed as a promising approach for the classification of students into different levels of a psychological attribute hierarchy. Unfortunately, because such classifications typically rely upon internally produced item response patterns that have not been externally validated, the instability of ANN…
De Antonio, M; Dogan, C; Hamroun, D; Mati, M; Zerrouki, S; Eymard, B; Katsahian, S; Bassez, G
2016-10-01
The broad clinical spectrum of myotonic dystrophy type 1 (DM1) creates particular challenges for both medical care and design of clinical trials. Clinical onset spans a continuum from birth to late adulthood, with symptoms that are highly variable in both severity and nature of the affected organ systems. In the literature, this complex phenotype is divided into three grades (mild, classic, and severe) and four or five main clinical categories (congenital, infantile/juvenile, adult-onset and late-onset forms), according to symptom severity and age of onset, respectively. However, these classifications are still under discussion with no consensus thus far. While some specific clinical features have been primarily reported in some forms of the disease, there are no clear distinctions. As a consequence, no modifications in the management of healthcare or the design of clinical studies have been proposed based on the clinical form of DM1. The present study has used the DM-Scope registry to assess, in a large cohort of DM1 patients, the robustness of a classification divided into five clinical forms. Our main aim was to describe the disease spectrum and investigate features of each clinical form. The five subtypes were compared by distribution of CTG expansion size, and the occurrence and onset of the main symptoms of DM1. Analyses validated the relevance of a five-grade model for DM1 classification. Patients were classified as: congenital (n=93, 4.5%); infantile (n=303, 14.8%); juvenile (n=628, 30.7%); adult (n=694, 34.0%); and late-onset (n=326, 15.9%). Our data show that the assumption of a continuum from congenital to the late-onset form is valid, and also highlights disease features specific to individual clinical forms of DM1 in terms of symptom occurrence and chronology throughout the disease course. These results support the use of the five-grade model for disease classification, and the distinct clinical profiles suggest that age of onset and clinical form may be key criteria in the design of clinical trials when considering DM1 health management and research. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Design of a Multi-Sensor Cooperation Travel Environment Perception System for Autonomous Vehicle
Chen, Long; Li, Qingquan; Li, Ming; Zhang, Liang; Mao, Qingzhou
2012-01-01
This paper describes the environment perception system designed for the intelligent vehicle SmartV-II, which won the 2010 Future Challenge. The system uses the cooperation of multiple lasers and cameras to realize several functions necessary for autonomous navigation: road curb detection, lane detection and traffic sign recognition. Multiple single-scan lasers are integrated to detect the road curb based on a Z-variance method. Vision-based lane detection is realized by a two-scan method combined with an image model. A Haar-like feature-based method is applied for traffic sign detection, and SURF matching is used for sign classification. The experimental results validate the effectiveness of the proposed algorithms and of the whole system.
Computer Classification of Triangles and Quadrilaterals--A Challenging Application
ERIC Educational Resources Information Center
Dennis, J. Richard
1978-01-01
Two computer exercises involving the classification of geometric figures are given. The mathematics required is relatively simple but comes from several areas--synthetic geometry, analytic geometry, and linear algebra. (MN)
49 CFR 1248.100 - Commodity classification designated.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 9 2011-10-01 2011-10-01 false Commodity classification designated. 1248.100... STATISTICS Commodity Code § 1248.100 Commodity classification designated. Commencing with reports for the..., reports of commodity statistics required to be made to the Board, shall be based on the commodity codes...
Palmer, Michael J; Mercieca-Bebber, Rebecca; King, Madeleine; Calvert, Melanie; Richardson, Harriet; Brundage, Michael
2018-02-01
Missing patient-reported outcome data can lead to biased results, to loss of power to detect between-treatment differences, and to research waste. Awareness of factors may help researchers reduce missing patient-reported outcome data through study design and trial processes. The aim was to construct a Classification Framework of factors associated with missing patient-reported outcome data in the context of comparative studies. The first step in this process was informed by a systematic review. Two databases (MEDLINE and CINAHL) were searched from inception to March 2015 for English articles. Inclusion criteria were (a) relevant to patient-reported outcomes, (b) discussed missing data or compliance in prospective medical studies, and (c) examined predictors or causes of missing data, including reasons identified in actual trial datasets and reported on cover sheets. Two reviewers independently screened titles and abstracts. Discrepancies were discussed with the research team prior to finalizing the list of eligible papers. In completing the systematic review, four particular challenges to synthesizing the extracted information were identified. To address these challenges, operational principles were established by consensus to guide the development of the Classification Framework. A total of 6027 records were screened. In all, 100 papers were eligible and included in the review. Of these, 57% focused on cancer, 23% did not specify disease, and 20% reported for patients with a variety of non-cancer conditions. In total, 40% of the papers offered a descriptive analysis of possible factors associated with missing data, but some papers used other methods. In total, 663 excerpts of text (units), each describing a factor associated with missing patient-reported outcome data, were extracted verbatim. Redundant units were identified and sequestered. Similar units were grouped, and an iterative process of consensus among the investigators was used to reduce these units to a list of factors that met the guiding principles. The list was organized on a framework, using an iterative consensus-based process. The resultant Classification Framework is a summary of the factors associated with missing patient-reported outcome data described in the literature. It consists of 5 components (instrument, participant, centre, staff, and study) and 46 categories, each with one or more sub-categories or examples. A systematic review of the literature revealed 46 unique categories of factors associated with missing patient-reported outcome data, organized into 5 main component groups. The Classification Framework may assist researchers to improve the design of new randomized clinical trials and to implement procedures to reduce missing patient-reported outcome data. Further research using the Classification Framework to inform quantitative analyses of missing patient-reported outcome data in existing clinical trials and to inform qualitative inquiry of research staff is planned.
Information extraction with object based support vector machines and vegetation indices
NASA Astrophysics Data System (ADS)
Ustuner, Mustafa; Abdikan, Saygin; Balik Sanli, Fusun
2016-07-01
Information extraction from remote sensing data is important for policy and decision makers, as the extracted information provides base layers for many real-world applications. Classification of remotely sensed data is one of the most common methods of extracting information; however, it remains challenging because several factors affect classification accuracy. The resolution of the imagery, the number and homogeneity of land cover classes, the purity of the training data and the characteristics of the adopted classifiers are just some of these factors. Object-based image classification has some advantages over pixel-based classification for high-resolution images, since it uses geometry and structure information in addition to spectral information. Vegetation indices are also commonly used in the classification process, since they provide additional spectral information for vegetation, forestry and agricultural areas. In this study, the impacts of the Normalized Difference Vegetation Index (NDVI) and Normalized Difference Red Edge Index (NDRE) on the classification accuracy of RapidEye imagery were investigated. Object-based Support Vector Machines were implemented for the classification of crop types in a study area located in the Aegean region of Turkey. Results demonstrated that the incorporation of NDRE increased the overall classification accuracy from 79.96% to 86.80%, whereas NDVI decreased it from 79.96% to 78.90%. Moreover, object-based classification with RapidEye data gives promising results for crop type mapping and analysis.
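The two indices are simple band ratios that can be appended to the per-pixel (or per-object) feature vector; the sketch below assumes a RapidEye-like band order (blue, green, red, red-edge, NIR), which is an assumption rather than a detail taken from the paper.

```python
# Band-ratio indices used as extra features for object-based SVM
# classification of RapidEye imagery. Band order is assumed.
import numpy as np

def ndvi(nir, red):
    return (nir - red) / (nir + red + 1e-9)

def ndre(nir, red_edge):
    return (nir - red_edge) / (nir + red_edge + 1e-9)

rng = np.random.default_rng(3)
img = rng.uniform(0.0, 1.0, size=(5, 100, 100))   # 5-band reflectance stack
red, red_edge, nir = img[2], img[3], img[4]

# Stack spectral bands with the two indices as the per-pixel feature vector
features = np.dstack([img.transpose(1, 2, 0), ndvi(nir, red), ndre(nir, red_edge)])
print(features.shape)   # (100, 100, 7): 5 bands + NDVI + NDRE
```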
Reverse Shoulder Arthroplasty Prosthesis Design Classification System.
Routman, Howard D; Flurin, Pierre-Henri; Wright, Thomas W; Zuckerman, Joseph D; Hamilton, Matthew A; Roche, Christopher P
2015-12-01
Multiple different reverse total shoulder arthroplasty (rTSA) prosthesis designs are available in the global marketplace for surgeons to perform this growing procedure. Subtle differences in rTSA prosthesis design parameters have been shown to have significant biomechanical impact and clinical consequences. We propose an rTSA prosthesis design classification system to objectively identify and categorize different designs based upon their specific glenoid and humeral prosthetic characteristics for the purpose of standardizing nomenclature that will help the orthopaedic surgeon determine which combination of design configurations best suit a given clinical scenario. The impact of each prosthesis classification type on shoulder muscle length and deltoid wrapping are also described to illustrate how each prosthesis classification type impacts these biomechanical parameters.
ERIC Educational Resources Information Center
Williams, Florence M.
This report addresses the feasibility of changing the classification of library materials in the Chicago Public School libraries from the Dewey Decimal classification system (DDC) to the Library of Congress system (LC), thus patterning the city school libraries after the Chicago Public Library and strengthening the existing close relationship…
Structural health monitoring feature design by genetic programming
NASA Astrophysics Data System (ADS)
Harvey, Dustin Y.; Todd, Michael D.
2014-09-01
Structural health monitoring (SHM) systems provide real-time damage and performance information for civil, aerospace, and other high-capital or life-safety critical structures. Conventional data processing involves pre-processing and extraction of low-dimensional features from in situ time series measurements. The features are then input to a statistical pattern recognition algorithm to perform the relevant classification or regression task necessary to facilitate decisions by the SHM system. Traditional design of signal processing and feature extraction algorithms can be an expensive and time-consuming process requiring extensive system knowledge and domain expertise. Genetic programming, a heuristic program search method from evolutionary computation, was recently adapted by the authors to perform automated, data-driven design of signal processing and feature extraction algorithms for statistical pattern recognition applications. The proposed method, called Autofead, is particularly suitable to handle the challenges inherent in algorithm design for SHM problems where the manifestation of damage in structural response measurements is often unclear or unknown. Autofead mines a training database of response measurements to discover information-rich features specific to the problem at hand. This study provides experimental validation on three SHM applications including ultrasonic damage detection, bearing damage classification for rotating machinery, and vibration-based structural health monitoring. Performance comparisons with common feature choices for each problem area are provided demonstrating the versatility of Autofead to produce significant algorithm improvements on a wide range of problems.
32 CFR 1907.26 - Notification of decision and prohibition on adverse action.
Code of Federal Regulations, 2010 CFR
2010-07-01
... made to the Interagency Security Classification Appeals Panel (ISCAP) established pursuant to § 5.4 of... CENTRAL INTELLIGENCE AGENCY CHALLENGES TO CLASSIFICATION OF DOCUMENTS BY AUTHORIZED HOLDERS PURSUANT TO ...
32 CFR 1907.26 - Notification of decision and prohibition on adverse action.
Code of Federal Regulations, 2011 CFR
2011-07-01
... made to the Interagency Security Classification Appeals Panel (ISCAP) established pursuant to § 5.4 of... CENTRAL INTELLIGENCE AGENCY CHALLENGES TO CLASSIFICATION OF DOCUMENTS BY AUTHORIZED HOLDERS PURSUANT TO ...
Baldominos, Alejandro; Saez, Yago; Isasi, Pedro
2018-04-23
Human activity recognition is a challenging problem for context-aware systems and applications. It is gaining interest due to the ubiquity of different sensor sources, wearable smart objects, ambient sensors, etc. This task is usually approached as a supervised machine learning problem, where a label is to be predicted given some input data, such as the signals retrieved from different sensors. For tackling the human activity recognition problem in sensor network environments, in this paper we propose the use of deep learning (convolutional neural networks) to perform activity recognition using the publicly available OPPORTUNITY dataset. Instead of manually choosing a suitable topology, we will let an evolutionary algorithm design the optimal topology in order to maximize the classification F1 score. After that, we will also explore the performance of committees of the models resulting from the evolutionary process. Results analysis indicates that the proposed model was able to perform activity recognition within a heterogeneous sensor network environment, achieving very high accuracies when tested with new sensor data. Based on all conducted experiments, the proposed neuroevolutionary system has proved to be able to systematically find a classification model which is capable of outperforming previous results reported in the state-of-the-art, showing that this approach is useful and improves upon previously manually-designed architectures.
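A minimal 1D-CNN for windowed sensor data is sketched below in the spirit of the evolved topologies; the layer sizes are arbitrary (not the result of the evolutionary search), and the OPPORTUNITY-like dimensions (113 channels, window of 64, 18 classes) are assumptions for illustration.

```python
# Minimal 1D-CNN sketch for windowed sensor data (human activity recognition).
import numpy as np
import tensorflow as tf

n_channels, window, n_classes = 113, 64, 18   # OPPORTUNITY-like dimensions (assumed)
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, n_channels)),
    tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

X = np.random.rand(256, window, n_channels).astype("float32")   # toy windows
y = np.random.randint(0, n_classes, size=256)                   # toy labels
model.fit(X, y, epochs=1, batch_size=32, verbose=0)
```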
Designing a Classification System for Internet Offenders: Doing Cognitive Distortions
ERIC Educational Resources Information Center
Hundersmarck, Steven F.; Durkin, Keith F.; Delong, Ronald L.
2007-01-01
Televised features such as NBC's "To Catch a Predator" have highlighted the growing problem posed by Internet sexual predators. This paper reports on the authors' attempts in designing a classification system for Internet offenders. The classification system was designed based on existing theory, understanding the nature of Internet offenders and…
Considerations of Unmanned Aircraft Classification for Civil Airworthiness Standards
NASA Technical Reports Server (NTRS)
Maddalon, Jeffrey M.; Hayhurst, Kelly J.; Morris, A. Terry; Verstynen, Harry A.
2013-01-01
The use of unmanned aircraft in the National Airspace System (NAS) has been characterized as the next great step forward in the evolution of civil aviation. Although use of unmanned aircraft systems (UAS) in military and public service operations is proliferating, civil use of UAS remains limited in the United States today. This report focuses on one particular regulatory challenge: classifying UAS to assign airworthiness standards. Classification is useful for ensuring that meaningful differences in design are accommodated by certification to different standards, and that aircraft with similar risk profiles are held to similar standards. This paper provides observations related to how the current regulations for classifying manned aircraft, based on dimensions of aircraft class and operational aircraft categories, could apply to UAS. This report finds that existing aircraft classes are well aligned with the types of UAS that currently exist; however, the operational categories are more difficult to align to proposed UAS use in the NAS. Specifically, the factors used to group manned aircraft into similar risk profiles do not necessarily capture all relevant UAS risks. UAS classification is investigated through gathering approaches to classification from a broad spectrum of organizations, and then identifying and evaluating the classification factors from these approaches. This initial investigation concludes that factors in addition to those currently used today to group manned aircraft for the purpose of assigning airworthiness standards will be needed to adequately capture risks associated with UAS and their operations.
Whittaker, Gary R; André, Nicole M; Millet, Jean Kaoru
2018-01-01
The difficulties related to virus taxonomy have been amplified by recent advances in next-generation sequencing and metagenomics, prompting the field to revisit the question of what constitutes a useful viral classification. Here, taking a challenging classification found in coronaviruses, we argue that consideration of biological properties in addition to sequence-based demarcations is critical for generating useful taxonomy that recapitulates complex evolutionary histories. Within the Alphacoronavirus genus, the Alphacoronavirus 1 species encompasses several biologically distinct viruses. We carried out functionally based phylogenetic analysis, centered on the spike gene, which encodes the main surface antigen and primary driver of tropism and pathogenesis. Within the Alphacoronavirus 1 species, we identify clade A (encompassing serotype I feline coronavirus [FCoV] and canine coronavirus [CCoV]) and clade B (grouping serotype II FCoV and CCoV and transmissible gastroenteritis virus [TGEV]-like viruses). We propose this clade designation, along with the newly proposed Alphacoronavirus 2 species, as an improved way to classify the Alphacoronavirus genus. IMPORTANCE Our work focuses on improving the classification of the Alphacoronavirus genus. The Alphacoronavirus 1 species groups viruses of veterinary importance that infect distinct mammalian hosts and includes canine and feline coronaviruses and transmissible gastroenteritis virus. It is the prototype species of the Alphacoronavirus genus; however, it encompasses biologically distinct viruses. To better characterize this prototypical species, we performed phylogenetic analyses based on the sequences of the spike protein, one of the main determinants of tropism and pathogenesis, and reveal the existence of two subgroups or clades that fit with previously established serotype demarcations. We propose a new clade designation to better classify Alphacoronavirus 1 members.
10 CFR 1045.17 - Classification levels.
Code of Federal Regulations, 2014 CFR
2014-01-01
... classification include detailed technical descriptions of critical features of a nuclear explosive design that... classification include designs for specific weapon components (not revealing critical features), key features of uranium enrichment technologies, or specifications of weapon materials. (3) Confidential. The Director of...
10 CFR 1045.17 - Classification levels.
Code of Federal Regulations, 2013 CFR
2013-01-01
... classification include detailed technical descriptions of critical features of a nuclear explosive design that... classification include designs for specific weapon components (not revealing critical features), key features of uranium enrichment technologies, or specifications of weapon materials. (3) Confidential. The Director of...
10 CFR 1045.17 - Classification levels.
Code of Federal Regulations, 2011 CFR
2011-01-01
... classification include detailed technical descriptions of critical features of a nuclear explosive design that... classification include designs for specific weapon components (not revealing critical features), key features of uranium enrichment technologies, or specifications of weapon materials. (3) Confidential. The Director of...
10 CFR 1045.17 - Classification levels.
Code of Federal Regulations, 2012 CFR
2012-01-01
... classification include detailed technical descriptions of critical features of a nuclear explosive design that... classification include designs for specific weapon components (not revealing critical features), key features of uranium enrichment technologies, or specifications of weapon materials. (3) Confidential. The Director of...
Hydrologic Landscape Characterization for the Pacific Northwest, USA
Hydrologic classification can help address some of the challenges facing catchment hydrology. Wigington et al. (2013) developed a hydrologic landscape (HL) approach to classification that was applied to the state of Oregon. Several characteristics limited its applicability outs...
SOCIAL MEDIA MINING SHARED TASK WORKSHOP.
Sarker, Abeed; Nikfarjam, Azadeh; Gonzalez, Graciela
2016-01-01
Social media has evolved into a crucial resource for obtaining large volumes of real-time information. The promise of social media has been realized by the public health domain, and recent research has addressed some important challenges in that domain by utilizing social media data. Tasks such as monitoring flu trends, viral disease outbreaks, medication abuse, and adverse drug reactions are some examples of studies where data from social media have been exploited. The focus of this workshop is to explore solutions to three important natural language processing challenges for domain-specific social media text: (i) text classification, (ii) information extraction, and (iii) concept normalization. To explore different approaches to solving these problems on social media data, we designed a shared task which was open to participants globally. We designed three tasks using our in-house annotated Twitter data on adverse drug reactions. Task 1 involved automatic classification of adverse drug reaction assertive user posts; Task 2 focused on extracting specific adverse drug reaction mentions from user posts; and Task 3, which was slightly ill-defined due to the complex nature of the problem, involved normalizing user mentions of adverse drug reactions to standardized concept IDs. A total of 11 teams participated, and a total of 24 (18 for Task 1, and 6 for Task 2) system runs were submitted. Following the evaluation of the systems, and an assessment of their innovation/novelty, we accepted 7 descriptive manuscripts for publication--5 for Task 1 and 2 for Task 2. We provide descriptions of the tasks, data, and participating systems in this paper.
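A baseline for Task 1 (binary classification of ADR-assertive posts) can be sketched with a bag-of-words pipeline; the example posts below are invented and the shared-task Twitter data are not reproduced.

```python
# Baseline sketch for binary ADR classification of short user posts:
# TF-IDF n-grams with logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "this med gave me a horrible headache all day",
    "started the new pill today, feeling fine so far",
    "cant sleep at all since i began taking it",
    "picked up my prescription at the pharmacy",
]
labels = [1, 0, 1, 0]   # 1 = mentions an adverse drug reaction

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(posts, labels)
print(clf.predict(["this drug makes me so dizzy"]))
```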
A Classification Scheme for Smart Manufacturing Systems’ Performance Metrics
Lee, Y. Tina; Kumaraguru, Senthilkumaran; Jain, Sanjay; Robinson, Stefanie; Helu, Moneer; Hatim, Qais Y.; Rachuri, Sudarsan; Dornfeld, David; Saldana, Christopher J.; Kumara, Soundar
2017-01-01
This paper proposes a classification scheme for performance metrics for smart manufacturing systems. The discussion focuses on three such metrics: agility, asset utilization, and sustainability. For each of these metrics, we discuss classification themes, which we then use to develop a generalized classification scheme. In addition to the themes, we discuss a conceptual model that may form the basis for the information necessary for performance evaluations. Finally, we present future challenges in developing robust performance-measurement systems for real-time, data-intensive enterprises. PMID:28785744
Applying graph theory to protein structures: an atlas of coiled coils.
Heal, Jack W; Bartlett, Gail J; Wood, Christopher W; Thomson, Andrew R; Woolfson, Derek N
2018-05-02
To understand protein structure, folding and function fully and to design proteins de novo reliably, we must learn from natural protein structures that have been characterised experimentally. The number of protein structures available is large and growing exponentially, which makes this task challenging. Indeed, computational resources are becoming increasingly important for classifying and analysing this resource. Here, we use tools from graph theory to define an atlas classification scheme for automatically categorising certain protein substructures. Focusing on the α-helical coiled coils, which are ubiquitous protein-structure and protein-protein interaction motifs, we present a suite of computational resources designed for analysing these assemblies. iSOCKET enables interactive analysis of side-chain packing within proteins to identify coiled coils automatically and with considerable user control. Applying a graph theory-based atlas classification scheme to structures identified by iSOCKET gives the Atlas of Coiled Coils, a fully automated, updated overview of extant coiled coils. The utility of this approach is illustrated with the first formal classification of an emerging subclass of coiled coils called α-helical barrels. Furthermore, in the Atlas, the known coiled-coil universe is presented alongside a partial enumeration of the 'dark matter' of coiled-coil structures; i.e., those coiled-coil architectures that are theoretically possible but have not been observed to date, and thus present defined targets for protein design. iSOCKET is available as part of the open-source GitHub repository associated with this work (https://github.com/woolfson-group/isocket). This repository also contains all the data generated when classifying the protein graphs. The Atlas of Coiled Coils is available at: http://coiledcoils.chm.bris.ac.uk/atlas/app.
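The graph-based classification idea can be illustrated in a few lines: helices become nodes, helix-helix packing contacts become edges, and an assembly is categorised by matching its graph against reference topologies. The contact lists and reference set below are invented and do not come from iSOCKET output.

```python
# Toy illustration of graph-based classification of coiled-coil packing
# using isomorphism tests against a small set of reference topologies.
import networkx as nx

references = {
    "dimer": nx.complete_graph(2),
    "trimer": nx.cycle_graph(3),
    "4-helix barrel": nx.cycle_graph(4),
}

def classify(contacts):
    g = nx.Graph(contacts)                 # edges = helix-helix interfaces
    for name, ref in references.items():
        if nx.is_isomorphic(g, ref):
            return name
    return "unclassified (possible 'dark matter' topology)"

print(classify([(0, 1), (1, 2), (2, 3), (3, 0)]))   # -> 4-helix barrel
```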
Kinematic design considerations for minimally invasive surgical robots: an overview.
Kuo, Chin-Hsing; Dai, Jian S; Dasgupta, Prokar
2012-06-01
Kinematic design is a predominant phase in the design of robotic manipulators for minimally invasive surgery (MIS). However, an extensive overview of the kinematic design issues for MIS robots is not yet available to both mechanisms and robotics communities. Hundreds of archival reports and articles on robotic systems for MIS are reviewed and studied. In particular, the kinematic design considerations and mechanism development described in the literature for existing robots are focused on. The general kinematic design goals, design requirements, and design preferences for MIS robots are defined. An MIS-specialized mechanism, namely the remote center-of-motion (RCM) mechanism, is revisited and studied. Accordingly, based on the RCM mechanism types, a classification for MIS robots is provided. A comparison between eight different RCM types is given. Finally, several open challenges for the kinematic design of MIS robotic manipulators are discussed. This work provides a detailed survey of the kinematic design of MIS robots, addresses the research opportunity in MIS robots for kinematicians, and clarifies the kinematic point of view to MIS robots as a reference for the medical community. Copyright © 2012 John Wiley & Sons, Ltd.
A hybrid cost-sensitive ensemble for imbalanced breast thermogram classification.
Krawczyk, Bartosz; Schaefer, Gerald; Woźniak, Michał
2015-11-01
Early recognition of breast cancer, the most commonly diagnosed form of cancer in women, is of crucial importance, given that it leads to significantly improved chances of survival. Medical thermography, which uses an infrared camera for thermal imaging, has been demonstrated as a particularly useful technique for early diagnosis, because it detects smaller tumors than the standard modality of mammography. In this paper, we analyse breast thermograms by extracting features describing bilateral symmetries between the two breast areas, and present a classification system for decision making. Clearly, the costs associated with missing a cancer case are much higher than those for mislabelling a benign case. At the same time, datasets contain significantly fewer malignant cases than benign ones. Standard classification approaches fail to consider either of these aspects. In this paper, we introduce a hybrid cost-sensitive classifier ensemble to address this challenging problem. Our approach entails a pool of cost-sensitive decision trees which assign a higher misclassification cost to the malignant class, thereby boosting its recognition rate. A genetic algorithm is employed for simultaneous feature selection and classifier fusion. As an optimisation criterion, we use a combination of misclassification cost and diversity to achieve both a high sensitivity and a heterogeneous ensemble. Furthermore, we prune our ensemble by discarding classifiers that contribute minimally to the decision making. For a challenging dataset of about 150 thermograms, our approach achieves an excellent sensitivity of 83.10%, while maintaining a high specificity of 89.44%. This not only signifies improved recognition of malignant cases, it also statistically outperforms other state-of-the-art algorithms designed for imbalanced classification, and hence provides an effective approach for analysing breast thermograms. Our proposed hybrid cost-sensitive ensemble can facilitate a highly accurate early diagnostic of breast cancer based on thermogram features. It overcomes the difficulties posed by the imbalanced distribution of patients in the two analysed groups. Copyright © 2015 Elsevier B.V. All rights reserved.
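A stripped-down sketch of the cost-sensitive ensemble idea follows: each tree penalises malignant misclassifications more heavily via class weights and sees a random feature subset, with predictions majority-voted. The feature values, class weights, and ensemble size are illustrative only, and the genetic-algorithm fusion and pruning steps are omitted.

```python
# Cost-sensitive tree ensemble sketch for imbalanced thermogram features.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(4)
X = rng.normal(size=(150, 30))                   # symmetry features per thermogram
y = (rng.random(150) < 0.25).astype(int)         # 1 = malignant (minority class)

trees, subsets = [], []
for i in range(15):
    cols = rng.choice(X.shape[1], size=10, replace=False)
    t = DecisionTreeClassifier(class_weight={0: 1, 1: 5}, random_state=i)
    trees.append(t.fit(X[:, cols], y))
    subsets.append(cols)

votes = np.mean([t.predict(X[:, c]) for t, c in zip(trees, subsets)], axis=0)
pred = (votes >= 0.5).astype(int)
sensitivity = (pred[y == 1] == 1).mean()
specificity = (pred[y == 0] == 0).mean()
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```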
Brain-computer interfacing under distraction: an evaluation study
NASA Astrophysics Data System (ADS)
Brandl, Stephanie; Frølich, Laura; Höhne, Johannes; Müller, Klaus-Robert; Samek, Wojciech
2016-10-01
Objective. While motor-imagery based brain-computer interfaces (BCIs) have been studied over many years by now, most of these studies have taken place in controlled lab settings. Bringing BCI technology into everyday life is still one of the main challenges in this field of research. Approach. This paper systematically investigates BCI performance under 6 types of distractions that mimic out-of-lab environments. Main results. We report results of 16 participants and show that the performance of the standard common spatial patterns (CSP) + regularized linear discriminant analysis classification pipeline drops significantly in this ‘simulated’ out-of-lab setting. We then investigate three methods for improving the performance: (1) artifact removal, (2) ensemble classification, and (3) a 2-step classification approach. While artifact removal does not enhance the BCI performance significantly, both ensemble classification and the 2-step classification combined with CSP significantly improve the performance compared to the standard procedure. Significance. Systematically analyzing out-of-lab scenarios is crucial when bringing BCI into everyday life. Algorithms must be adapted to overcome nonstationary environments in order to tackle real-world challenges.
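A bare-bones CSP + LDA pipeline is sketched below: spatial filters from a generalised eigendecomposition of the two class covariance matrices, log-variance features, and an LDA classifier. The trial data are synthetic, and the artifact-removal, ensemble, and 2-step variants discussed above are not shown.

```python
# Minimal CSP + LDA sketch for two-class motor-imagery classification.
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def class_cov(trials):
    # trials: (n_trials, n_channels, n_samples)
    covs = [t @ t.T / np.trace(t @ t.T) for t in trials]
    return np.mean(covs, axis=0)

def fit_csp(trials_a, trials_b, n_filters=6):
    ca, cb = class_cov(trials_a), class_cov(trials_b)
    _, vecs = eigh(ca, ca + cb)                  # generalised eigenvectors
    return np.hstack([vecs[:, :n_filters // 2], vecs[:, -n_filters // 2:]])

def features(trials, W):
    z = np.einsum("ck,nks->ncs", W.T, trials)    # spatially filtered trials
    return np.log(z.var(axis=2))                 # log band-power per filter

rng = np.random.default_rng(5)
left, right = rng.normal(size=(40, 22, 250)), rng.normal(size=(40, 22, 250))
W = fit_csp(left, right)
X = np.vstack([features(left, W), features(right, W)])
y = np.r_[np.zeros(40), np.ones(40)]
print(LinearDiscriminantAnalysis().fit(X, y).score(X, y))
```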
NASA Astrophysics Data System (ADS)
Gevaert, C. M.; Persello, C.; Sliuzas, R.; Vosselman, G.
2016-06-01
Unmanned Aerial Vehicles (UAVs) are capable of providing very high resolution and up-to-date information to support informal settlement upgrading projects. In order to provide accurate basemaps, urban scene understanding through the identification and classification of buildings and terrain is imperative. However, common characteristics of informal settlements such as small, irregular buildings with heterogeneous roof material and large presence of clutter challenge state-of-the-art algorithms. Especially the dense buildings and steeply sloped terrain cause difficulties in identifying elevated objects. This work investigates how 2D radiometric and textural features, 2.5D topographic features, and 3D geometric features obtained from UAV imagery can be integrated to obtain a high classification accuracy in challenging classification problems for the analysis of informal settlements. It compares the utility of pixel-based and segment-based features obtained from an orthomosaic and DSM with point-based and segment-based features extracted from the point cloud to classify an unplanned settlement in Kigali, Rwanda. Findings show that the integration of 2D and 3D features leads to higher classification accuracies.
Discriminative Hierarchical K-Means Tree for Large-Scale Image Classification.
Chen, Shizhi; Yang, Xiaodong; Tian, Yingli
2015-09-01
A key challenge in large-scale image classification is how to achieve efficiency in terms of both computation and memory without compromising classification accuracy. The learning-based classifiers achieve the state-of-the-art accuracies, but have been criticized for the computational complexity that grows linearly with the number of classes. The nonparametric nearest neighbor (NN)-based classifiers naturally handle large numbers of categories, but incur prohibitively expensive computation and memory costs. In this brief, we present a novel classification scheme, i.e., discriminative hierarchical K-means tree (D-HKTree), which combines the advantages of both learning-based and NN-based classifiers. The complexity of the D-HKTree only grows sublinearly with the number of categories, which is much better than the recent hierarchical support vector machines-based methods. The memory requirement is the order of magnitude less than the recent Naïve Bayesian NN-based approaches. The proposed D-HKTree classification scheme is evaluated on several challenging benchmark databases and achieves the state-of-the-art accuracies, while with significantly lower computation cost and memory requirement.
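A minimal hierarchical k-means tree is sketched below: data are recursively partitioned with k-means and a query descends to a leaf labelled by its majority class. The discriminative re-weighting that distinguishes D-HKTree is not reproduced; branching factor and leaf size are arbitrary.

```python
# Minimal hierarchical k-means tree for sublinear-cost classification.
import numpy as np
from sklearn.cluster import KMeans

class HKTree:
    def __init__(self, branching=4, min_leaf=20):
        self.branching, self.min_leaf = branching, min_leaf

    def fit(self, X, y):
        if len(X) <= self.min_leaf or len(np.unique(y)) == 1:
            self.km = None
            self.label = np.bincount(y).argmax()   # leaf: majority class
            return self
        self.km = KMeans(n_clusters=self.branching, n_init=4).fit(X)
        self.children = [
            HKTree(self.branching, self.min_leaf).fit(X[self.km.labels_ == k],
                                                      y[self.km.labels_ == k])
            for k in range(self.branching)
        ]
        return self

    def predict_one(self, x):
        node = self
        while node.km is not None:                 # descend to a leaf
            node = node.children[int(node.km.predict(x[None, :])[0])]
        return node.label

rng = np.random.default_rng(6)
X = rng.normal(size=(1000, 16))
y = rng.integers(0, 10, size=1000)
tree = HKTree().fit(X, y)
print(tree.predict_one(X[0]), y[0])
```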
Classifying psychosis--challenges and opportunities.
Gaebel, Wolfgang; Zielasek, Jürgen; Cleveland, Helen-Rose
2012-12-01
Within the efforts to revise ICD-10 and DSM-IV-TR, work groups on the classification of psychotic disorders appointed by the World Health Organization (WHO) and the American Psychiatric Association (APA) have proposed several changes to the corresponding classification criteria of schizophrenia and other psychotic disorders in order to increase the clinical utility, reliability and validity of these diagnoses. These proposed revisions are subject to field trials with the objective of studying whether they will lead to an improvement of the classification systems in comparison to their previous versions. Both a challenge and an opportunity, the APA and WHO have also considered harmonizing between the two classifications. The current status of both suggests that this goal can only be met in part. The main proposed revisions include changes to the number and types of symptoms of schizophrenia, the replacement of existing schizophrenia subtypes with dimensional assessments or symptom specifiers, different modifications of the criteria for schizoaffective disorder, a reorganization of the delusional disorders and the acute and transient psychotic disorders in ICD-11, as well as the revision of course and psychomotor symptoms/catatonia specifiers in both classification systems.
Houyel, Lucile; Khoshnood, Babak; Anderson, Robert H; Lelong, Nathalie; Thieulin, Anne-Claire; Goffinet, François; Bonnet, Damien
2011-10-03
Classification of the overall spectrum of congenital heart defects (CHD) has always been challenging, in part because of the diversity of the cardiac phenotypes, but also because of the oft-complex associations. The purpose of our study was to establish a comprehensive and easy-to-use classification of CHD for clinical and epidemiological studies based on the long list of the International Paediatric and Congenital Cardiac Code (IPCCC). We coded each individual malformation using six-digit codes from the long list of IPCCC. We then regrouped all lesions into 10 categories and 23 subcategories according to a multi-dimensional approach encompassing anatomic, diagnostic and therapeutic criteria. This anatomic and clinical classification of congenital heart disease (ACC-CHD) was then applied to data acquired from a population-based cohort of patients with CHD in France, made up of 2867 cases (82% live births, 1.8% stillbirths and 16.2% pregnancy terminations). The majority of cases (79.5%) could be identified with a single IPCCC code. The category "Heterotaxy, including isomerism and mirror-imagery" was the only one that typically required more than one code for identification of cases. The two largest categories were "ventricular septal defects" (52%) and "anomalies of the outflow tracts and arterial valves" (20% of cases). Our proposed classification is not new, but rather a regrouping of the known spectrum of CHD into a manageable number of categories based on anatomic and clinical criteria. The classification is designed to use the code numbers of the long list of IPCCC but can accommodate ICD-10 codes. Its exhaustiveness, simplicity, and anatomic basis make it useful for clinical and epidemiologic studies, including those aimed at assessment of risk factors and outcomes.
Faruki, Hawazin; Mayhew, Gregory M; Fan, Cheng; Wilkerson, Matthew D; Parker, Scott; Kam-Morgan, Lauren; Eisenberg, Marcia; Horten, Bruce; Hayes, D Neil; Perou, Charles M; Lai-Goldman, Myla
2016-06-01
Context: A histologic classification of lung cancer subtypes is essential in guiding therapeutic management. Objective: To complement morphology-based classification of lung tumors, a previously developed lung subtyping panel (LSP) of 57 genes was tested using multiple public fresh-frozen gene-expression data sets and a prospectively collected set of formalin-fixed, paraffin-embedded lung tumor samples. Design: The LSP gene-expression signature was evaluated in multiple lung cancer gene-expression data sets totaling 2177 patients collected from 4 platforms: Illumina RNAseq (San Diego, California), Agilent (Santa Clara, California) and Affymetrix (Santa Clara) microarrays, and quantitative reverse transcription-polymerase chain reaction. Gene centroids were calculated for each of 3 genomic-defined subtypes: adenocarcinoma, squamous cell carcinoma, and neuroendocrine, the latter of which encompassed both small cell carcinoma and carcinoid. Classification by LSP into 3 subtypes was evaluated in both fresh-frozen and formalin-fixed, paraffin-embedded tumor samples, and agreement with the original morphology-based diagnosis was determined. Results: The LSP-based classifications demonstrated overall agreement with the original clinical diagnosis ranging from 78% (251 of 322) to 91% (492 of 538 and 869 of 951) in the fresh-frozen public data sets and 84% (65 of 77) in the formalin-fixed, paraffin-embedded data set. The LSP performance was independent of tissue-preservation method and gene-expression platform. Secondary, blinded pathology review of formalin-fixed, paraffin-embedded samples demonstrated concordance of 82% (63 of 77) with the original morphology diagnosis. Conclusions: The LSP gene-expression signature is a reproducible and objective method for classifying lung tumors and demonstrates good concordance with morphology-based classification across multiple data sets. The LSP panel can supplement morphologic assessment of lung cancers, particularly when classification by standard methods is challenging.
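The centroid-based subtype call can be sketched as follows: compute a mean expression profile per subtype over the 57 signature genes, then assign each new tumor to the nearest centroid. Expression values are simulated, the distance metric is an assumption, and the real LSP gene list and training cohorts are not reproduced.

```python
# Sketch of nearest-centroid subtype classification over a 57-gene signature.
import numpy as np
from sklearn.neighbors import NearestCentroid

rng = np.random.default_rng(7)
n_genes = 57
subtypes = np.array(["adenocarcinoma", "squamous", "neuroendocrine"])

X_train = rng.normal(size=(300, n_genes))              # log expression, 300 tumors
y_train = subtypes[rng.integers(0, 3, size=300)]

clf = NearestCentroid(metric="euclidean").fit(X_train, y_train)
X_new = rng.normal(size=(5, n_genes))                  # new FFPE or frozen samples
print(clf.predict(X_new))
```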
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Huiying; Hou, Zhangshuan; Huang, Maoyi
The Community Land Model (CLM) represents physical, chemical, and biological processes of the terrestrial ecosystems that interact with climate across a range of spatial and temporal scales. As CLM includes numerous sub-models and associated parameters, the high-dimensional parameter space presents a formidable challenge for quantifying uncertainty and improving Earth system predictions needed to assess environmental changes and risks. This study aims to evaluate the potential of transferring hydrologic model parameters in CLM through sensitivity analyses and classification across watersheds from the Model Parameter Estimation Experiment (MOPEX) in the United States. The sensitivity of CLM-simulated water and energy fluxes to hydrological parameters across 431 MOPEX basins is first examined using an efficient stochastic sampling-based sensitivity analysis approach. Linear, interaction, and high-order nonlinear impacts are all identified via statistical tests and stepwise backward removal parameter screening. The basins are then classified according to their parameter sensitivity patterns (internal attributes), as well as their hydrologic indices/attributes (external hydrologic factors) separately, using a principal component analysis (PCA) and expectation-maximization (EM)-based clustering approach. Similarities and differences among the parameter sensitivity-based classification system (S-Class), the hydrologic indices-based classification (H-Class), and the Köppen climate classification system (K-Class) are discussed. Within each S-Class with similar parameter sensitivity characteristics, similar inversion modeling setups can be used for parameter calibration, and the parameters and their contribution or significance to water and energy cycling may also be more transferrable. This classification study provides guidance on identifiable parameters, and on parameterization and inverse model design for CLM, but the methodology is applicable to other models. Inverting parameters at representative sites belonging to the same class can significantly reduce parameter calibration efforts.
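The basin-classification step can be sketched with a PCA reduction of each basin's parameter-sensitivity profile followed by an EM-fitted Gaussian mixture; the sensitivity values, number of retained components, and cluster count below are placeholders rather than the study's settings.

```python
# Sketch of PCA + EM (Gaussian mixture) clustering of basin sensitivity profiles.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(8)
sensitivities = rng.random((431, 20))   # 431 MOPEX basins x 20 parameter scores

Z = PCA(n_components=5).fit_transform(StandardScaler().fit_transform(sensitivities))
gmm = GaussianMixture(n_components=6, covariance_type="full", random_state=0).fit(Z)
s_class = gmm.predict(Z)                # sensitivity-based class (S-Class) per basin
print(np.bincount(s_class))             # basins per class
```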
Tayebi Meybodi, Ali; Lawton, Michael T
2018-02-23
Brain arteriovenous malformations (bAVM) are challenging lesions. Part of this challenge stems from the infinite diversity of these lesions regarding shape, location, anatomy, and physiology. This diversity has called on a variety of treatment modalities for these lesions, of which microsurgical resection prevails as the mainstay of treatment. As such, outcome prediction and managing strategy mainly rely on unraveling the nature of these complex tangles and ways each lesion responds to various therapeutic modalities. This strategy needs the ability to decipher each lesion through accurate and efficient categorization. Therefore, classification schemes are essential parts of treatment planning and outcome prediction. This article summarizes different surgical classification schemes and outcome predictors proposed for bAVMs.
Can surgical simulation be used to train detection and classification of neural networks?
Zisimopoulos, Odysseas; Flouty, Evangello; Stacey, Mark; Muscroft, Sam; Giataganas, Petros; Nehme, Jean; Chow, Andre; Stoyanov, Danail
2017-10-01
Computer-assisted interventions (CAI) aim to increase the effectiveness, precision and repeatability of procedures to improve surgical outcomes. The presence and motion of surgical tools are a key information input for CAI surgical phase recognition algorithms. Vision-based tool detection and recognition approaches are an attractive solution and can be designed to take advantage of the powerful deep learning paradigm that is rapidly advancing image recognition and classification. The challenge for such algorithms is the availability and quality of labelled data used for training. In this Letter, surgical simulation is used to train tool detection and segmentation based on deep convolutional neural networks and generative adversarial networks. The authors experiment with two network architectures for image segmentation in tool classes commonly encountered during cataract surgery. A commercially available simulator is used to create a simulated cataract dataset for training models prior to performing transfer learning on real surgical data. To the best of the authors' knowledge, this is the first attempt to train deep learning models for surgical instrument detection on simulated data while demonstrating promising generalisation to real data. Results indicate that simulated data does have some potential for training advanced classification methods for CAI systems.
Unsupervised Biomedical Named Entity Recognition: Experiments with Clinical and Biological Texts
Zhang, Shaodian; Elhadad, Nóemie
2013-01-01
Named entity recognition is a crucial component of biomedical natural language processing, enabling information extraction and ultimately reasoning over and knowledge discovery from text. Much progress has been made in the design of rule-based and supervised tools, but they are often genre and task dependent. As such, adapting them to different genres of text or identifying new types of entities requires major effort in re-annotation or rule development. In this paper, we propose an unsupervised approach to extracting named entities from biomedical text. We describe a stepwise solution to tackle the challenges of entity boundary detection and entity type classification without relying on any handcrafted rules, heuristics, or annotated data. A noun phrase chunker followed by a filter based on inverse document frequency extracts candidate entities from free text. Classification of candidate entities into categories of interest is carried out by leveraging principles from distributional semantics. Experiments show that our system, especially the entity classification step, yields competitive results on two popular biomedical datasets of clinical notes and biological literature, and outperforms a baseline dictionary match approach. Detailed error analysis provides a road map for future work. PMID:23954592
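The inverse-document-frequency filter mentioned in this abstract can be illustrated with a toy sketch; the corpus, the candidate phrases, and the 0.3 threshold below are invented for illustration and are not the authors' data or settings.

```python
# Toy sketch of an IDF filter for candidate entities: phrases that occur in
# too many documents are treated as generic and dropped. Illustrative only.
import math
from collections import Counter

docs = [
    {"chest pain", "aspirin", "patient", "follow up"},
    {"patient", "metformin", "type 2 diabetes"},
    {"patient", "chest pain", "troponin"},
]
candidates = {"chest pain", "aspirin", "patient", "metformin", "troponin"}

df = Counter(p for doc in docs for p in doc)               # document frequency per phrase
idf = {p: math.log(len(docs) / df[p]) for p in candidates}

threshold = 0.3                                            # hypothetical cut-off
kept = [p for p in candidates if idf[p] > threshold]
print(sorted(kept))                                        # 'patient' is filtered out as too frequent
```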
Al-Sahaf, Harith; Zhang, Mengjie; Johnston, Mark
2016-01-01
In the computer vision and pattern recognition fields, image classification represents an important yet difficult task. It is a challenge to build effective computer models to replicate the remarkable ability of the human visual system, which relies on only one or a few instances to learn a completely new class or an object of a class. Recently we proposed two genetic programming (GP) methods, one-shot GP and compound-GP, that aim to evolve a program for the task of binary classification in images. The two methods are designed to use only one or a few instances per class to evolve the model. In this study, we investigate these two methods in terms of performance, robustness, and complexity of the evolved programs. We use ten data sets that vary in difficulty to evaluate these two methods. We also compare them with two other GP and six non-GP methods. The results show that one-shot GP and compound-GP outperform or achieve results comparable to competitor methods. Moreover, the features extracted by these two methods improve the performance of other classifiers with handcrafted features and those extracted by a recently developed GP-based method in most cases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hannan, M.A., E-mail: hannan@eng.ukm.my; Abdulla Al Mamun, Md., E-mail: md.abdulla@siswa.ukm.edu.my; Hussain, Aini, E-mail: aini@eng.ukm.my
Highlights: • Classification of available technologies for SWM systems into four core categories. • Organization of technology-based SWM systems into three main groups. • Summary of SWM systems with target application, methodology and functional domain. • Issues and challenges are highlighted for further design of a sustainable system. - Abstract: Against the backdrop of rapid advancement, information and communication technology (ICT) has become an inevitable part of planning and designing modern solid waste management (SWM) systems. This study presents a critical review of the existing ICTs and their usage in SWM systems to unfold the issues and challenges of using integrated technology-based systems. To plan, monitor, collect and manage solid waste, the ICTs are divided into four categories: spatial technologies, identification technologies, data acquisition technologies and data communication technologies. The ICT-based SWM systems classified in this paper are based on the first three technologies, while the fourth one is employed by almost every system. This review may guide the reader through the basics of available ICTs and their application in SWM to facilitate the search for planning and design of a sustainable new system.
Improving Student Question Classification
ERIC Educational Resources Information Center
Heiner, Cecily; Zachary, Joseph L.
2009-01-01
Students in introductory programming classes often articulate their questions and information needs incompletely. Consequently, the automatic classification of student questions to provide automated tutorial responses is a challenging problem. This paper analyzes 411 questions from an introductory Java programming course by reducing the natural…
This paper utilizes a two-stage clustering approach as part of an objective classification scheme designed to elucidate O3's dependence on meteorology. When applied to ten years (1981-1990) of meteorological data for Birmingham, Alabama, the classification scheme identified seven ...
Onboard Classification of Hyperspectral Data on the Earth Observing One Mission
NASA Technical Reports Server (NTRS)
Chien, Steve; Tran, Daniel; Schaffer, Steve; Rabideau, Gregg; Davies, Ashley Gerard; Doggett, Thomas; Greeley, Ronald; Ip, Felipe; Baker, Victor; Doubleday, Joshua;
2009-01-01
Remote-sensed hyperspectral data represents significant challenges in downlink due to its large data volumes. This paper describes a research program designed to process hyperspectral data products onboard spacecraft to (a) reduce data downlink volumes and (b) decrease latency to provide key data products (often by enabling use of lower data rate communications systems). We describe efforts to develop onboard processing to study volcanoes, floods, and cryosphere, using the Hyperion hyperspectral imager and onboard processing for the Earth Observing One (EO-1) mission as well as preliminary work targeting the Hyperspectral Infrared Imager (HyspIRI) mission.
Electroencephalography (EEG) Based Control in Assistive Mobile Robots: A Review
NASA Astrophysics Data System (ADS)
Krishnan, N. Murali; Mariappan, Muralindran; Muthukaruppan, Karthigayan; Hijazi, Mohd Hanafi Ahmad; Kitt, Wong Wei
2016-03-01
Recently, EEG based control in assistive robot usage has been gradually increasing in the biomedical field, giving a quality, stress-free life to disabled and elderly people. This study reviews the deployment of EEG based control in assistive robots, especially for those in need and the neurologically disabled. The main objective of this paper is to describe the methods used for (i) EEG data acquisition and signal preprocessing, (ii) feature extraction and (iii) signal classification. Besides that, this study presents the specific research challenges in the design of these control systems and future research directions.
Resnik, Linda J; Allen, Susan M
2007-01-01
This pilot study used the framework of the World Health Organization's International Classification of Functioning, Disability and Health (ICF) to understand the challenges faced by Operation Enduring Freedom (OEF) and Operation Iraqi Freedom (OIF) veterans as they reintegrate into the community. We conducted semistructured interviews with 14 injured veterans, 12 caregivers, and 14 clinicians. We used ICF taxonomy to code data and identify issues. We identified challenges in the following ICF domains: learning and applying knowledge; general tasks and demands; communication; mobility; self-care; domestic life; interpersonal interactions; major life areas; and community, social, and civic life. We found many similarities between the challenges faced by veterans with and without polytraumatic injuries, although veterans with polytraumatic injuries faced challenges of greater magnitude. Identifying community reintegration challenges early and promoting reintegration are important mandates for the Department of Veterans Affairs. The findings of this study are useful in understanding the needs of OEF/OIF veterans.
High-order distance-based multiview stochastic learning in image classification.
Yu, Jun; Rui, Yong; Tang, Yuan Yan; Tao, Dacheng
2014-12-01
How do we find all images in a larger set of images which have a specific content? Or estimate the position of a specific object relative to the camera? Image classification methods, like support vector machine (supervised) and transductive support vector machine (semi-supervised), are invaluable tools for the applications of content-based image retrieval, pose estimation, and optical character recognition. However, these methods can only handle images represented by a single feature. In many cases, different features (or multiview data) can be obtained, and how to efficiently utilize them is a challenge. It is inappropriate for the traditional concatenation scheme to link features of different views into a long vector, because each view has its own statistical properties and physical interpretation. In this paper, we propose a high-order distance-based multiview stochastic learning (HD-MSL) method for image classification. HD-MSL effectively combines varied features into a unified representation and integrates the labeling information based on a probabilistic framework. In comparison with the existing strategies, our approach adopts the high-order distance obtained from the hypergraph to replace pairwise distance in estimating the probability matrix of data distribution. In addition, the proposed approach can automatically learn a combination coefficient for each view, which plays an important role in utilizing the complementary information of multiview data. An alternating optimization is designed to solve the objective functions of HD-MSL and obtain the view combination coefficients and classification scores simultaneously. Experiments on two real world datasets demonstrate the effectiveness of HD-MSL in image classification.
NASA Astrophysics Data System (ADS)
Fluet-Chouinard, E.; Lehner, B.; Aires, F.; Prigent, C.; McIntyre, P. B.
2017-12-01
Global surface water maps have improved in spatial and temporal resolution through various remote sensing methods: open water extents with compiled Landsat archives and inundation with topographically downscaled multi-sensor retrievals. These time-series capture variations through time of open water and inundation without discriminating between hydrographic features (e.g. lakes, reservoirs, river channels and wetland types), as other databases have done as static representations. Available data sources present the opportunity to generate a comprehensive map and typology of aquatic environments (deepwater and wetlands) that improves on earlier digitized inventories and maps. The challenge of classifying surface waters globally is to distinguish wetland types with meaningful characteristics or proxies (hydrology, water chemistry, soils, vegetation) while accommodating the limitations of remote sensing data. We present a new wetland classification scheme designed for global application and produce a map of aquatic ecosystem types globally using state-of-the-art remote sensing products. Our classification scheme combines open water extent and expands it with downscaled multi-sensor inundation data to capture the maximal vegetated wetland extent. The hierarchical structure of the classification is modified from the Cowardin system (1979) developed for the USA. The first-level classification is based on a combination of landscape position and water source (e.g. lacustrine, riverine, palustrine, coastal and artificial), while the second level represents the hydrologic regime (e.g. perennial, seasonal, intermittent and waterlogged). Class-specific descriptors can further detail the wetland types with soils and vegetation cover. Our globally consistent nomenclature and top-down mapping allows for direct comparison across biogeographic regions and for upscaling biogeochemical fluxes as well as other landscape-level functions.
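A minimal sketch of the two-level typology this abstract describes might look like the following; the specific level-1/level-2 combinations encoded here are illustrative examples, not the complete published scheme.

```python
# Tiny sketch of a two-level wetland typology: level 1 encodes landscape
# position / water source, level 2 the hydrologic regime. Combinations listed
# are examples only, not the full classification scheme.
WETLAND_TYPOLOGY = {
    "lacustrine": ["perennial", "seasonal"],
    "riverine":   ["perennial", "seasonal", "intermittent"],
    "palustrine": ["seasonal", "intermittent", "waterlogged"],
    "coastal":    ["perennial", "seasonal"],
    "artificial": ["perennial", "seasonal"],
}

def classify(level1, level2):
    """Return a combined class code if the combination is part of the scheme."""
    if level2 in WETLAND_TYPOLOGY.get(level1, []):
        return f"{level1}/{level2}"
    raise ValueError(f"unknown combination: {level1}/{level2}")

print(classify("riverine", "intermittent"))
```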
Telemedicine for Developing Countries. A Survey and Some Design Issues.
Combi, Carlo; Pozzani, Gabriele; Pozzi, Giuseppe
2016-11-02
Developing countries need telemedicine applications that help in many situations: when physicians are few relative to the population, when specialized physicians are not available, and when patients and physicians in rural villages need assistance in the delivery of health care. Moreover, the requirements of telemedicine applications for developing countries are somewhat more demanding than for developed countries. Indeed, further social, organizational, and technical aspects need to be considered for successful telemedicine applications in developing countries. We consider all the major telemedicine projects devoted to developing countries, as described in the relevant scientific literature. On the basis of such literature, we want to define a specific taxonomy that allows a proper classification and a fast overview of telemedicine projects in developing countries. Moreover, by considering both the literature and some recent direct experiences, we want to complete this overview by discussing some design issues to be taken into consideration when developing telemedicine software systems. We considered and reviewed the major conferences and journals in depth, and looked for reports on telemedicine projects. We provide the reader with a survey of the main projects and systems, from which we derived a taxonomy of features of telemedicine systems for developing countries. We also propose and discuss some classification criteria for design issues, based on the lessons learned in this research area. We highlight some challenges and recommendations to be considered when designing a telemedicine system for developing countries.
Gender classification from video under challenging operating conditions
NASA Astrophysics Data System (ADS)
Mendoza-Schrock, Olga; Dong, Guozhu
2014-06-01
The literature is abundant with papers on gender classification research. However, the majority of such research is based on the assumption that there is enough resolution so that the subject's face can be resolved. Hence the majority of the research is actually in the face recognition and facial feature area. A gap exists for gender classification under challenging operating conditions (different seasonal conditions, different clothing, etc.) and when the subject's face cannot be resolved due to lack of resolution. The Seasonal Weather and Gender (SWAG) Database is a novel database that contains subjects walking through a scene under operating conditions that span a calendar year. This paper exploits a subset of that database, the SWAG One dataset, using data mining techniques, traditional classifiers (e.g., Naïve Bayes, Support Vector Machine), and traditional (Canny edge detection, etc.) and non-traditional (height/width ratios, etc.) feature extractors to achieve high correct gender classification rates (greater than 85%). Another novelty includes exploiting frame differentials.
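The general flavour of this pipeline (simple hand-crafted features fed to a traditional classifier) can be sketched as below; the feature distributions, labels, and the choice of Gaussian Naïve Bayes are synthetic stand-ins, not the SWAG One data or the paper's exact setup.

```python
# Hedged sketch of a traditional feature + traditional classifier pipeline:
# a height/width ratio and an edge-density feature per silhouette, fed to a
# Naive Bayes classifier. All feature values and labels are synthetic.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 200
height_width_ratio = np.concatenate([rng.normal(2.6, 0.2, n), rng.normal(2.3, 0.2, n)])
edge_density = np.concatenate([rng.normal(0.30, 0.05, n), rng.normal(0.34, 0.05, n)])
X = np.column_stack([height_width_ratio, edge_density])
y = np.array([0] * n + [1] * n)       # placeholder gender labels

clf = GaussianNB()
print(cross_val_score(clf, X, y, cv=5).mean())   # classification rate on the toy data
```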
Phillips, Charles D
2015-01-01
Case-mix classification and payment systems help assure that persons with similar needs receive similar amounts of care resources, which is a major equity concern for consumers, providers, and programs. Although health service programs for adults regularly use case-mix payment systems, programs providing health services to children and youth rarely use such models. This research utilized Medicaid home care expenditures and assessment data on 2,578 children receiving home care in one large state in the USA. Using classification and regression tree analyses, a case-mix model for long-term pediatric home care was developed. The Pediatric Home Care/Expenditure Classification Model (P/ECM) grouped children and youth in the study sample into 24 groups, explaining 41% of the variance in annual home care expenditures. The P/ECM creates the possibility of a more equitable, and potentially more effective, allocation of home care resources among children and youth facing serious health care challenges.
Phillips, Charles D.
2015-01-01
Case-mix classification and payment systems help assure that persons with similar needs receive similar amounts of care resources, which is a major equity concern for consumers, providers, and programs. Although health service programs for adults regularly use case-mix payment systems, programs providing health services to children and youth rarely use such models. This research utilized Medicaid home care expenditures and assessment data on 2,578 children receiving home care in one large state in the USA. Using classification and regression tree analyses, a case-mix model for long-term pediatric home care was developed. The Pediatric Home Care/Expenditure Classification Model (P/ECM) grouped children and youth in the study sample into 24 groups, explaining 41% of the variance in annual home care expenditures. The P/ECM creates the possibility of a more equitable, and potentially more effective, allocation of home care resources among children and youth facing serious health care challenges. PMID:26740744
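A classification-and-regression-tree case-mix model of the kind described above can be sketched with a shallow regression tree whose leaves act as payment groups; the assessment items, expenditure formula, and 24-leaf cap below are illustrative assumptions only.

```python
# Illustrative sketch of a CART-style case-mix model: a regression tree groups
# children by assessment items and predicts annual home care expenditures per
# leaf (group). All data here are synthetic placeholders.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(2)
n = 2578
X = np.column_stack([
    rng.integers(0, 2, n),            # e.g. ventilator dependence (0/1), hypothetical item
    rng.integers(0, 5, n),            # e.g. ADL support score, hypothetical item
    rng.integers(0, 2, n),            # e.g. skilled nursing need (0/1), hypothetical item
])
y = 5000 + 20000 * X[:, 0] + 3000 * X[:, 1] + rng.normal(0, 2000, n)

tree = DecisionTreeRegressor(max_leaf_nodes=24, random_state=0).fit(X, y)
groups = tree.apply(X)                # leaf index serves as the case-mix group
print(len(np.unique(groups)), "groups, R^2 =", round(tree.score(X, y), 2))
```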
Coalition Warfare: the Leadership Challenges
2011-05-19
Approved for public release; distribution is unlimited. A monograph by Colonel Mark J. Thornhill. Keywords: multinational operations, leadership challenges, leadership attributes, unity of command. Security classification: Unclassified.
Railroad Classification Yard Technology Manual. Volume I : Yard Design Methods
DOT National Transportation Integrated Search
1981-02-01
This volume documents the procedures and methods associated with the design of railroad classification yards. Subjects include: site location, economic analysis, yard capacity analysis, design of flat yards, overall configuration of hump yards, hump ...
Synthetic biology routes to bio-artificial intelligence
Zaikin, Alexey; Saka, Yasushi; Romano, M. Carmen; Giuraniuc, Claudiu V.; Kanakov, Oleg; Laptyeva, Tetyana
2016-01-01
The design of synthetic gene networks (SGNs) has advanced to the extent that novel genetic circuits are now being tested for their ability to recapitulate archetypal learning behaviours first defined in the fields of machine and animal learning. Here, we discuss the biological implementation of a perceptron algorithm for linear classification of input data. An expansion of this biological design that encompasses cellular ‘teachers’ and ‘students’ is also examined. We also discuss implementation of Pavlovian associative learning using SGNs and present an example of such a scheme and in silico simulation of its performance. In addition to designed SGNs, we also consider the option to establish conditions in which a population of SGNs can evolve diversity in order to better contend with complex input data. Finally, we compare recent ethical concerns in the field of artificial intelligence (AI) and the future challenges raised by bio-artificial intelligence (BI). PMID:27903825
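The perceptron algorithm for linear classification referenced in this abstract is, in its classical software form, only a few lines; the sketch below is that generic algorithm on toy data, not the biological SGN implementation.

```python
# Classic perceptron for linear classification: weights are nudged only on
# misclassified points. Inputs and labels below are toy values.
import numpy as np

def perceptron_train(X, y, epochs=20, lr=0.1):
    """Train a perceptron; labels y must be in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:    # misclassified: update weights and bias
                w += lr * yi * xi
                b += lr * yi
    return w, b

X = np.array([[0.1, 0.2], [0.2, 0.1], [0.8, 0.9], [0.9, 0.7]])
y = np.array([-1, -1, 1, 1])
w, b = perceptron_train(X, y)
print(np.sign(X @ w + b))                 # recovers the training labels
```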
Hierarchical micro-architectures of electrodes for energy storage
NASA Astrophysics Data System (ADS)
Yue, Yuan; Liang, Hong
2015-06-01
The design of electrodes for electrochemical energy storage devices, particularly lithium-ion batteries (LIBs) and supercapacitors (SCs), is of extraordinary importance in optimizing electrochemical performance. Regardless of the materials used, the architecture of electrodes is crucial for charge transport efficiency and electrochemical interactions. This report provides a critical review of prototype architectural designs and micro- and nano-material properties of LIB and SC electrodes. An alternative classification criterion is proposed that divides reported hierarchical architectures into two categories: aligned and unaligned structures. The structures were evaluated, and it was found that the aligned architectures are superior to the unaligned in the following characteristics: 1) highly organized charge pathways, 2) tunable interspaces between architecture units, and 3) current collectors in good electrical contact with the electrodes, prepared along with them. Based on these findings, challenges and potential routes to resolve them are provided for future development.
NASA Astrophysics Data System (ADS)
Yang, Hongbin; Sun, Lixia; Li, Weihua; Liu, Guixia; Tang, Yun
2018-02-01
For a drug, safety is always the most important issue, including a variety of toxicities and adverse drug effects, which should be evaluated in preclinical and clinical trial phases. This review article first briefly introduces the computational methods used in the prediction of chemical toxicity for drug design, including machine learning methods and structural alerts. Machine learning methods have been widely applied in qualitative classification and quantitative regression studies, while structural alerts can be regarded as a complementary tool for lead optimization. The emphasis of this article is on the recent progress of predictive models built for various toxicities. Available databases and web servers are also provided. Though the methods and models are very helpful for drug design, there are still some challenges and limitations to be addressed for drug safety assessment in the future.
Yang, Hongbin; Sun, Lixia; Li, Weihua; Liu, Guixia; Tang, Yun
2018-01-01
During drug development, safety is always the most important issue, including a variety of toxicities and adverse drug effects, which should be evaluated in preclinical and clinical trial phases. This review article first briefly introduces the computational methods used in the prediction of chemical toxicity for drug design, including machine learning methods and structural alerts. Machine learning methods have been widely applied in qualitative classification and quantitative regression studies, while structural alerts can be regarded as a complementary tool for lead optimization. The emphasis of this article is on the recent progress of predictive models built for various toxicities. Available databases and web servers are also provided. Though the methods and models are very helpful for drug design, there are still some challenges and limitations to be addressed for drug safety assessment in the future. PMID:29515993
Maitre, Nathalie L; Ballard, Roberta A; Ellenberg, Jonas H; Davis, Stephanie D; Greenberg, James M; Hamvas, Aaron; Pryhuber, Gloria S
2015-05-01
Bronchopulmonary dysplasia (BPD) is the most common respiratory consequence of premature birth and contributes to significant short- and long-term morbidity, mortality and resource utilization. Initially defined as a radiographic, clinical and histopathological entity, the chronic lung disease known as BPD has evolved as obstetrical and neonatal care have improved the survival of lower gestational age infants. Now, definitions based on the need for supplementary oxygen at 28 days and/or 36 weeks provide a useful reference point in the neonatal intensive-care unit (NICU), but are no longer based on histopathological findings, and are neither designed to predict longer term respiratory consequences nor to study the evolution of a multifactorial disease. The aims of this review are to critically examine the evolution of the diagnosis of BPD and the challenges inherent to current classifications. We found that the increasing use of respiratory support strategies that administer ambient air without supplementary oxygen confounds oxygen-based definitions of BPD. Furthermore, lack of reproducible, genetic, biochemical and physiological biomarkers limits the ability to identify an impending BPD for early intervention, quantify disease severity for standardized classification and approaches and reliably predict the long-term outcomes. More comprehensive, multidisciplinary approaches to overcome these challenges involve longitudinal observation of extremely preterm infants, not only those with BPD, using genetic, environmental, physiological and clinical data as well as large databases of patient samples. The Prematurity and Respiratory Outcomes Program (PROP) will provide such a framework to address these challenges through high-resolution characterization of both NICU and post-NICU discharge outcomes.
2017-01-23
of classification technologies for Munitions Response (MR). This demonstration was designed to evaluate advanced classification methodology at the... advanced electromagnetic induction sensors and static, cued surveys to classify anomalies as either targets of interest (TOI) or non-TOI. Static data...
Strength Analysis on Ship Ladder Using Finite Element Method
NASA Astrophysics Data System (ADS)
Budianto; Wahyudi, M. T.; Dinata, U.; Ruddianto; Eko P., M. M.
2018-01-01
The design of a ship's structure should follow the rules of the applicable classification standards. In this case, the design of a ladder (staircase) on a ferry ship must be reviewed against the loads experienced during ship operations, both while sailing and during port operations. The classification rules in ship design refer to the calculation of structural components described in the classification calculation method and can be analysed using the finite element method. The classification regulations used in the design of the ferry ship are those of BKI (Bureau of Classification Indonesia), so the specification of material composition and mechanical properties should refer to the classification of the vessel in question. The analysis of this structure used a structural analysis software package based on the finite element method. The structural analysis of the ladder showed that it can withstand a 140 kg load under static, dynamic, and impact conditions, and the resulting safety factors indicate that the structure is safe without being excessively strong.
An important challenge for an integrative approach to developmental systems toxicology is associating putative molecular initiating events (MIEs), cell signaling pathways, cell function and modeled fetal exposure kinetics. We have developed a chemical classification model based o...
Gao, Chao; Sun, Hanbo; Wang, Tuo; Tang, Ming; Bohnen, Nicolaas I; Müller, Martijn L T M; Herman, Talia; Giladi, Nir; Kalinin, Alexandr; Spino, Cathie; Dauer, William; Hausdorff, Jeffrey M; Dinov, Ivo D
2018-05-08
In this study, we apply a multidisciplinary approach to investigate falls in PD patients using clinical, demographic and neuroimaging data from two independent initiatives (University of Michigan and Tel Aviv Sourasky Medical Center). Using machine learning techniques, we construct predictive models to discriminate fallers and non-fallers. Through controlled feature selection, we identified the most salient predictors of patient falls including gait speed, Hoehn and Yahr stage, postural instability and gait difficulty-related measurements. The model-based and model-free analytical methods we employed included logistic regression, random forests, support vector machines, and XGBoost. The reliability of the forecasts was assessed by internal statistical (5-fold) cross validation as well as by external out-of-bag validation. Four specific challenges were addressed in the study: Challenge 1, develop a protocol for harmonizing and aggregating complex, multisource, and multi-site Parkinson's disease data; Challenge 2, identify salient predictive features associated with specific clinical traits, e.g., patient falls; Challenge 3, forecast patient falls and evaluate the classification performance; and Challenge 4, predict tremor dominance (TD) vs. posture instability and gait difficulty (PIGD). Our findings suggest that, compared to other approaches, model-free machine learning based techniques provide a more reliable clinical outcome forecasting of falls in Parkinson's patients, for example, with a classification accuracy of about 70-80%.
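The model comparison workflow described here (several classifiers scored by 5-fold cross-validation) can be sketched as follows; the synthetic gait-speed, Hoehn and Yahr, and PIGD features and the toy fall-risk rule are assumptions, not the study's data.

```python
# Sketch of a cross-validated comparison of a model-based and a model-free
# classifier, in the spirit of the abstract above. All features are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 300
gait_speed = rng.normal(1.0, 0.2, n)      # m/s, placeholder
hoehn_yahr = rng.integers(1, 5, n)        # disease stage, placeholder
pigd_score = rng.normal(0.5, 0.2, n)      # PIGD-related measure, placeholder
X = np.column_stack([gait_speed, hoehn_yahr, pigd_score])
# Toy rule: fall risk loosely driven by slower gait and higher stage.
y = ((1.2 - gait_speed) + 0.2 * hoehn_yahr + rng.normal(0, 0.2, n) > 0.6).astype(int)

for model in (LogisticRegression(max_iter=1000), RandomForestClassifier(random_state=0)):
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(type(model).__name__, round(acc, 3))
```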
NASA Astrophysics Data System (ADS)
Mohammadimanesh, F.; Salehi, B.; Mahdianpari, M.; Homayouni, S.
2016-06-01
Polarimetric Synthetic Aperture Radar (PolSAR) imagery is a complex multi-dimensional dataset, which is an important source of information for various natural resource and environmental classification and monitoring applications. PolSAR imagery produces valuable information by observing scattering mechanisms from different natural and man-made objects. Land cover mapping using PolSAR data classification is one of the most important applications of SAR remote sensing earth observations, and it has gained increasing attention in recent years. However, one of the most challenging aspects of classification is selecting features with maximum discrimination capability. To address this challenge, a statistical approach based on the Fisher Linear Discriminant Analysis (FLDA) and the incorporation of physical interpretation of PolSAR data into classification is proposed in this paper. After pre-processing of PolSAR data, including speckle reduction, the H/α classification is used in order to classify the basic scattering mechanisms. Then, a new method for feature weighting, based on the fusion of FLDA and physical interpretation, is implemented. This method is shown to increase the classification accuracy as well as the between-class discrimination in the final Wishart classification. The proposed method was applied to a full polarimetric C-band RADARSAT-2 data set from the Avalon area, Newfoundland and Labrador, Canada. This imagery was acquired in June 2015 and covers various types of wetlands, including bogs, fens, marshes and shallow water. The results were compared with the standard Wishart classification, and an improvement of about 20% was achieved in the overall accuracy. This method provides an opportunity for operational wetland classification in northern latitudes with high accuracy using only SAR polarimetric data.
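One simple way to realize Fisher-ratio-style feature weighting, loosely in the spirit of the FLDA-based weighting described above, is sketched below; the two toy classes and three features are invented, and the paper's actual fusion with physical interpretation is not reproduced.

```python
# Hedged sketch of Fisher-ratio feature weighting: each feature is scored by
# between-class separation over within-class scatter, then normalized into a
# weight. Classes and feature values here are synthetic placeholders.
import numpy as np

def fisher_ratio(x_a, x_b):
    """Two-class Fisher discriminant ratio for a single feature."""
    return (x_a.mean() - x_b.mean()) ** 2 / (x_a.var() + x_b.var() + 1e-12)

rng = np.random.default_rng(4)
bog = rng.normal([0.2, 0.5, 0.8], 0.1, size=(100, 3))     # toy class A samples
marsh = rng.normal([0.6, 0.5, 0.3], 0.1, size=(100, 3))   # toy class B samples

weights = np.array([fisher_ratio(bog[:, j], marsh[:, j]) for j in range(3)])
weights /= weights.sum()
print(weights)   # the middle feature gets near-zero weight: it does not discriminate
```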
Protocol Design Challenges in the Detection of Awareness in Aware Subjects Using EEG Signals.
Henriques, J; Gabriel, D; Grigoryeva, L; Haffen, E; Moulin, T; Aubry, R; Pazart, L; Ortega, J-P
2016-10-01
Recent studies have evidenced serious difficulties in detecting covert awareness with electroencephalography-based techniques, both in unresponsive patients and in healthy control subjects. This work reproduces the protocol design of two recent mental imagery studies with a larger group comprising 20 healthy volunteers. The main goal is to assess whether modifications in the signal extraction techniques, training-testing/cross-validation routines, and hypotheses evoked in the statistical analysis can provide solutions to the serious difficulties documented in the literature. The lack of robustness in the results calls for a further search for alternative protocols more suitable for machine learning classification and for better performing signal treatment techniques. Specific recommendations are made using the findings in this work. © EEG and Clinical Neuroscience Society (ECNS) 2014.
Application of the SNoW machine learning paradigm to a set of transportation imaging problems
NASA Astrophysics Data System (ADS)
Paul, Peter; Burry, Aaron M.; Wang, Yuheng; Kozitsky, Vladimir
2012-01-01
Machine learning methods have been successfully applied to image object classification problems where there is clear distinction between classes and where a comprehensive set of training samples and ground truth are readily available. The transportation domain is an area where machine learning methods are particularly applicable, since the classification problems typically have well defined class boundaries and, due to high traffic volumes in most applications, massive amounts of roadway data are available. Though these classes tend to be well defined, the particular image noise and variations can be challenging. Another challenge is the extremely high accuracy typically required in most traffic applications. Incorrect assignment of fines or tolls due to imaging mistakes is not acceptable in most applications. For the front seat vehicle occupancy detection problem, classification amounts to determining whether one face (driver only) or two faces (driver + passenger) are detected in the front seat of a vehicle on a roadway. For automatic license plate recognition, the classification problem is a type of optical character recognition problem encompassing multi-class classification. The SNoW machine learning classifier using local SMQT features is shown to be successful in these two transportation imaging applications.
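SNoW is built around networks of Winnow-like linear units over sparse Boolean features; the sketch below shows only the generic Winnow multiplicative update on toy data, and is not the SNoW system or the SMQT feature pipeline.

```python
# Generic Winnow-style learner (toy sketch, not the SNoW implementation):
# weights are multiplied up on missed positives and divided down on false
# positives. Features, labels, and the threshold are illustrative.
import numpy as np

def winnow_train(X, y, alpha=2.0, epochs=10):
    """Multiplicative-update learner for Boolean features; y in {0, 1}."""
    n_features = X.shape[1]
    w = np.ones(n_features)
    theta = n_features / 2.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = int(xi @ w >= theta)
            if pred == 0 and yi == 1:          # promotion on missed positive
                w[xi == 1] *= alpha
            elif pred == 1 and yi == 0:        # demotion on false positive
                w[xi == 1] /= alpha
    return w, theta

X = np.array([[1, 0, 1, 0], [1, 1, 0, 0], [0, 0, 1, 1], [0, 1, 0, 1]])
y = np.array([1, 1, 0, 0])                     # toy target: depends on feature 0
w, theta = winnow_train(X, y)
print((X @ w >= theta).astype(int))
```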
Code of Federal Regulations, 2010 CFR
2010-07-01
..., NARA may forward the challenge directly to the ISCAP. NARA must forward the challenge within 60 days of... may forward the appeal directly to the ISCAP. NARA must forward the challenge within 60 days of the...
Code of Federal Regulations, 2011 CFR
2011-07-01
..., NARA may forward the challenge directly to the ISCAP. NARA must forward the challenge within 60 days of... may forward the appeal directly to the ISCAP. NARA must forward the challenge within 60 days of the...
DNA methylation-based classification of central nervous system tumours.
Capper, David; Jones, David T W; Sill, Martin; Hovestadt, Volker; Schrimpf, Daniel; Sturm, Dominik; Koelsche, Christian; Sahm, Felix; Chavez, Lukas; Reuss, David E; Kratz, Annekathrin; Wefers, Annika K; Huang, Kristin; Pajtler, Kristian W; Schweizer, Leonille; Stichel, Damian; Olar, Adriana; Engel, Nils W; Lindenberg, Kerstin; Harter, Patrick N; Braczynski, Anne K; Plate, Karl H; Dohmen, Hildegard; Garvalov, Boyan K; Coras, Roland; Hölsken, Annett; Hewer, Ekkehard; Bewerunge-Hudler, Melanie; Schick, Matthias; Fischer, Roger; Beschorner, Rudi; Schittenhelm, Jens; Staszewski, Ori; Wani, Khalida; Varlet, Pascale; Pages, Melanie; Temming, Petra; Lohmann, Dietmar; Selt, Florian; Witt, Hendrik; Milde, Till; Witt, Olaf; Aronica, Eleonora; Giangaspero, Felice; Rushing, Elisabeth; Scheurlen, Wolfram; Geisenberger, Christoph; Rodriguez, Fausto J; Becker, Albert; Preusser, Matthias; Haberler, Christine; Bjerkvig, Rolf; Cryan, Jane; Farrell, Michael; Deckert, Martina; Hench, Jürgen; Frank, Stephan; Serrano, Jonathan; Kannan, Kasthuri; Tsirigos, Aristotelis; Brück, Wolfgang; Hofer, Silvia; Brehmer, Stefanie; Seiz-Rosenhagen, Marcel; Hänggi, Daniel; Hans, Volkmar; Rozsnoki, Stephanie; Hansford, Jordan R; Kohlhof, Patricia; Kristensen, Bjarne W; Lechner, Matt; Lopes, Beatriz; Mawrin, Christian; Ketter, Ralf; Kulozik, Andreas; Khatib, Ziad; Heppner, Frank; Koch, Arend; Jouvet, Anne; Keohane, Catherine; Mühleisen, Helmut; Mueller, Wolf; Pohl, Ute; Prinz, Marco; Benner, Axel; Zapatka, Marc; Gottardo, Nicholas G; Driever, Pablo Hernáiz; Kramm, Christof M; Müller, Hermann L; Rutkowski, Stefan; von Hoff, Katja; Frühwald, Michael C; Gnekow, Astrid; Fleischhack, Gudrun; Tippelt, Stephan; Calaminus, Gabriele; Monoranu, Camelia-Maria; Perry, Arie; Jones, Chris; Jacques, Thomas S; Radlwimmer, Bernhard; Gessi, Marco; Pietsch, Torsten; Schramm, Johannes; Schackert, Gabriele; Westphal, Manfred; Reifenberger, Guido; Wesseling, Pieter; Weller, Michael; Collins, Vincent Peter; Blümcke, Ingmar; Bendszus, Martin; Debus, Jürgen; Huang, Annie; Jabado, Nada; Northcott, Paul A; Paulus, Werner; Gajjar, Amar; Robinson, Giles W; Taylor, Michael D; Jaunmuktane, Zane; Ryzhova, Marina; Platten, Michael; Unterberg, Andreas; Wick, Wolfgang; Karajannis, Matthias A; Mittelbronn, Michel; Acker, Till; Hartmann, Christian; Aldape, Kenneth; Schüller, Ulrich; Buslei, Rolf; Lichter, Peter; Kool, Marcel; Herold-Mende, Christel; Ellison, David W; Hasselblatt, Martin; Snuderl, Matija; Brandner, Sebastian; Korshunov, Andrey; von Deimling, Andreas; Pfister, Stefan M
2018-03-22
Accurate pathological diagnosis is crucial for optimal management of patients with cancer. For the approximately 100 known tumour types of the central nervous system, standardization of the diagnostic process has been shown to be particularly challenging, with substantial inter-observer variability in the histopathological diagnosis of many tumour types. Here we present a comprehensive approach for the DNA methylation-based classification of central nervous system tumours across all entities and age groups, and demonstrate its application in a routine diagnostic setting. We show that the availability of this method may have a substantial impact on diagnostic precision compared to standard methods, resulting in a change of diagnosis in up to 12% of prospective cases. For broader accessibility, we have designed a free online classifier tool, the use of which does not require any additional onsite data processing. Our results provide a blueprint for the generation of machine-learning-based tumour classifiers across other cancer entities, with the potential to fundamentally transform tumour pathology.
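Purely as a conceptual sketch (not the authors' classifier), a random forest over methylation-like features can report a per-class score and abstain below a confidence cut-off, which is one way a diagnostic tool might flag cases with no confident match; the data, class names, and 0.8 threshold are illustrative.

```python
# Conceptual sketch only: a random forest over methylation-like features that
# reports a class score and abstains below a confidence cut-off. All data,
# class labels, and the 0.8 threshold are invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(5)
n_per_class, n_probes = 50, 200
X = np.vstack([rng.beta(2, 5, (n_per_class, n_probes)),      # beta values in [0, 1]
               rng.beta(5, 2, (n_per_class, n_probes))])
y = np.array(["class_A"] * n_per_class + ["class_B"] * n_per_class)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
for p in clf.predict_proba(X[:3]):
    best = p.argmax()
    label = clf.classes_[best] if p[best] >= 0.8 else "no confident match"
    print(label, round(p[best], 2))
```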
Evaluation of Waveform Structure Features on Time Domain Target Recognition under Cross Polarization
NASA Astrophysics Data System (ADS)
Selver, M. A.; Seçmen, M.; Zoral, E. Y.
2016-08-01
Classification of aircraft targets from scattered electromagnetic waves is a challenging application, which suffers from aspect angle dependency. In order to eliminate the adverse effects of aspect angle, various strategies were developed, including techniques that rely on the extraction of several features and the design of suitable classification systems to process them. Recently, a hierarchical method, which uses features that take advantage of the waveform structure of the scattered signals, was introduced and shown to produce effective results. However, this approach has been applied only to special cases that consider a single planar component of the electric field, which causes no cross-polarization at the observation point. In this study, two small-scale aircraft models, Boeing-747 and DC-10, are selected as the targets, and various polarizations are used to analyse the cross-polarization effects on the system performance of the aforementioned method. The results reveal the advantages and the shortcomings of using waveform structures in time-domain target identification.
Object-based detection of vehicles using combined optical and elevation data
NASA Astrophysics Data System (ADS)
Schilling, Hendrik; Bulatov, Dimitri; Middelmann, Wolfgang
2018-02-01
The detection of vehicles is an important and challenging topic that is relevant for many applications. In this work, we present a workflow that utilizes optical and elevation data to detect vehicles in remotely sensed urban data. This workflow consists of three consecutive stages: candidate identification, classification, and single vehicle extraction. Unlike in most previous approaches, fusion of both data sources is strongly pursued at all stages. While the first stage utilizes the fact that most man-made objects are rectangular in shape, the second and third stages employ machine learning techniques combined with specific features. The stages are designed to handle multiple sensor input, which results in a significant improvement. A detailed evaluation shows the benefits of our workflow, which includes hand-tailored features; even in comparison with classification approaches based on Convolutional Neural Networks, which are state of the art in computer vision, we could obtain a comparable or superior performance (F1 score of 0.96-0.94).
7 CFR 51.2836 - Size classifications.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 2 2012-01-01 2012-01-01 false Size classifications. 51.2836 Section 51.2836...) Size Classifications § 51.2836 Size classifications. The size of onions may be specified in accordance with one of the following classifications. Size designation Minimum diameter Inches Millimeters Maximum...
7 CFR 51.2836 - Size classifications.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 2 2013-01-01 2013-01-01 false Size classifications. 51.2836 Section 51.2836...-Granex-Grano and Creole Types) Size Classifications § 51.2836 Size classifications. The size of onions may be specified in accordance with one of the following classifications. Size designation Minimum...
7 CFR 51.2836 - Size classifications.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 2 2014-01-01 2014-01-01 false Size classifications. 51.2836 Section 51.2836...-Granex-Grano and Creole Types) Size Classifications § 51.2836 Size classifications. The size of onions may be specified in accordance with one of the following classifications. Size designation Minimum...
7 CFR 51.2836 - Size classifications.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Size classifications. 51.2836 Section 51.2836...) Size Classifications § 51.2836 Size classifications. The size of onions may be specified in accordance with one of the following classifications. Size designation Minimum diameter Inches Millimeters Maximum...
7 CFR 51.2836 - Size classifications.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 2 2011-01-01 2011-01-01 false Size classifications. 51.2836 Section 51.2836...) Size Classifications § 51.2836 Size classifications. The size of onions may be specified in accordance with one of the following classifications. Size designation Minimum diameter Inches Millimeters Maximum...
22 CFR 9.5 - Original classification authority.
Code of Federal Regulations, 2010 CFR
2010-04-01
... classification authority. (a) Authority for original classification of information as Top Secret may be exercised... Notice dated May 26, 2000. (b) Authority for original classification of information as Secret or..., 2000. In the absence of the Secret or Confidential classification authority, the person designated to...
Demirci, Oguz; Clark, Vincent P; Magnotta, Vincent A; Andreasen, Nancy C; Lauriello, John; Kiehl, Kent A; Pearlson, Godfrey D; Calhoun, Vince D
2008-09-01
Functional magnetic resonance imaging (fMRI) is a fairly new technique that has the potential to characterize and classify brain disorders such as schizophrenia. It has the possibility of playing a crucial role in designing objective prognostic/diagnostic tools, but also presents numerous challenges to analysis and interpretation. Classification provides results for individual subjects, rather than results related to group differences. This is a more complicated endeavor that must be approached more carefully, and efficient methods should be developed to draw generalized and valid conclusions from high-dimensional data with a limited number of subjects, especially for heterogeneous disorders whose pathophysiology is unknown. Numerous research efforts have been reported in the field using fMRI activation of schizophrenia patients and healthy controls. However, the results are usually not generalizable to larger data sets and require careful definition of the techniques used both in designing algorithms and in reporting prediction accuracies. In this review paper, we survey a number of previous reports and also identify possible biases (e.g., cross-validation, class size) in class comparison/prediction problems. Some suggestions to improve the effectiveness of the presentation of prediction accuracy results are provided. We also present our own results using a projection pursuit algorithm followed by an application of independent component analysis proposed in an earlier study. We classify schizophrenia versus healthy controls using fMRI data of 155 subjects from two sites obtained during three different tasks. The results are compared in order to investigate the effectiveness of each task, and differences between patients with schizophrenia and healthy controls were investigated.
Integrating Human and Machine Intelligence in Galaxy Morphology Classification Tasks
NASA Astrophysics Data System (ADS)
Beck, Melanie Renee
The large flood of data flowing from observatories presents significant challenges to astronomy and cosmology, challenges that will only be magnified by projects currently under development. Growth in both volume and velocity of astrophysics data is accelerating: whereas the Sloan Digital Sky Survey (SDSS) has produced 60 terabytes of data in the last decade, the upcoming Large Synoptic Survey Telescope (LSST) plans to register 30 terabytes per night starting in the year 2020. Additionally, the Euclid Mission will acquire imaging for 5 × 10^7 resolvable galaxies. The field of galaxy evolution faces a particularly challenging future as complete understanding often cannot be reached without analysis of detailed morphological galaxy features. Historically, morphological analysis has relied on visual classification by astronomers, accessing the human brain's capacity for advanced pattern recognition. However, this accurate but inefficient method falters when confronted with many thousands (or millions) of images. In the SDSS era, efforts to automate morphological classifications of galaxies (e.g., Conselice et al., 2000; Lotz et al., 2004) are reasonably successful and can distinguish between elliptical and disk-dominated galaxies with accuracies of 80%. While this is statistically very useful, a key problem with these methods is that they often cannot say which 80% of their samples are accurate. Furthermore, when confronted with the more complex task of identifying key substructure within galaxies, automated classification algorithms begin to fail. The Galaxy Zoo project uses a highly innovative approach to solving the scalability problem of visual classification. Displaying images of SDSS galaxies to volunteers via a simple and engaging web interface, www.galaxyzoo.org asks people to classify images by eye. Within the first year hundreds of thousands of members of the general public had classified each of the 1 million SDSS galaxies an average of 40 times. Galaxy Zoo thus solved both the visual classification problem of time efficiency and improved accuracy by producing a distribution of independent classifications for each galaxy. While crowd-sourced galaxy classifications have proven their worth, challenges remain before establishing this method as a critical and standard component of the data processing pipelines for the next generation of surveys. In particular, though innovative, crowd-sourcing techniques do not have the capacity to handle the data volume and rates expected in the next generation of surveys. Automated algorithms will be delegated to handle the majority of the classification tasks, freeing citizen scientists to contribute their efforts on subtler and more complex assignments. This thesis presents a solution through an integration of visual and automated classifications, preserving the best features of both human and machine. We demonstrate the effectiveness of such a system through a re-analysis of visual galaxy morphology classifications collected during the Galaxy Zoo 2 (GZ2) project. We reprocess the top-level question of the GZ2 decision tree with a Bayesian classification aggregation algorithm dubbed SWAP, originally developed for the Space Warps gravitational lens project. Through a simple binary classification scheme we increase the classification rate nearly 5-fold, classifying 226,124 galaxies in 92 days of GZ2 project time while reproducing labels derived from GZ2 classification data with 95.7% accuracy.
We next combine this with a Random Forest machine learning algorithm that learns on a suite of non-parametric morphology indicators widely used for automated morphologies. We develop a decision engine that delegates tasks between human and machine and demonstrate that the combined system provides a factor of 11.4 increase in the classification rate, classifying 210,803 galaxies in just 32 days of GZ2 project time with 93.1% accuracy. As the Random Forest algorithm requires a minimal amount of computational cost, this result has important implications for galaxy morphology identification tasks in the era of Euclid and other large-scale surveys.
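A greatly simplified, SWAP-style Bayesian vote aggregation can be sketched as a per-subject posterior update driven by each volunteer's estimated skill; the skills, votes, and binary smooth/featured task below are illustrative stand-ins for the actual SWAP bookkeeping.

```python
# Simplified sketch of Bayesian aggregation of volunteer votes: each volunteer
# has per-class skill estimates, and each vote updates a galaxy's posterior
# probability of being "smooth". All skills and votes are invented.
def swap_update(prior, vote, skill_smooth, skill_featured):
    """Posterior P(smooth) after one vote ('smooth' or 'featured')."""
    if vote == "smooth":
        like_s, like_f = skill_smooth, 1.0 - skill_featured
    else:
        like_s, like_f = 1.0 - skill_smooth, skill_featured
    num = like_s * prior
    return num / (num + like_f * (1.0 - prior))

p = 0.5                                   # uninformative prior for one galaxy
votes = [("smooth", 0.9, 0.8), ("smooth", 0.7, 0.6), ("featured", 0.6, 0.9)]
for vote, s_skill, f_skill in votes:
    p = swap_update(p, vote, s_skill, f_skill)
    print(round(p, 3))
# A galaxy would be retired once p crosses an acceptance or rejection threshold.
```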
Railroad classification yard design methodology study Elkhart Yard Rehabilitation : a case study
DOT National Transportation Integrated Search
1980-02-01
This interim report documents the application of a railroad classification yard design methodology to CONRAIL's Elkhart Yard Rehabilitation. This case study effort represents Phase 2 of a larger effort to develop a yard design methodology, and ...
1984-12-01
Prepared for the Air Force Office of Scientific Research under Grant No. AFOSR 82-0322, December 1984. Report documentation page; security classification: Unclassified.
The Cool and Belkin Faceted Classification of Information Interactions Revisited
ERIC Educational Resources Information Center
Huvila, Isto
2010-01-01
Introduction: The complexity of human information activity is a challenge for both practice and research in information sciences and information management. Literature presents a wealth of approaches to analytically structure and make sense of human information activity including a faceted classification model of information interactions published…
Computational Support for Early Elicitation and Classification of Tone
ERIC Educational Resources Information Center
Bird, Steven; Lee, Haejoong
2014-01-01
Investigating a tone language involves careful transcription of tone on words and phrases. This is challenging when the phonological categories--the tones or melodies--have not been identified. Effects such as coarticulation, sandhi, and phrase-level prosody appear as obstacles to early elicitation and classification of tone. This article presents…
9 CFR 146.7 - Terminology and classification; general.
Code of Federal Regulations, 2011 CFR
2011-01-01
... General Provisions § 146.7 Terminology and classification; general. The official classification terms defined in §§ 146.8 and 146.9 and the various designs illustrative of the official classifications... 9 Animals and Animal Products 1 2011-01-01 2011-01-01 false Terminology and classification...
22 CFR 9.5 - Original classification authority.
Code of Federal Regulations, 2012 CFR
2012-04-01
..., 2000. In the absence of the Secret or Confidential classification authority, the person designated to... 22 Foreign Relations 1 2012-04-01 2012-04-01 false Original classification authority. 9.5 Section... classification authority. (a) Authority for original classification of information as Top Secret may be exercised...
22 CFR 9.5 - Original classification authority.
Code of Federal Regulations, 2013 CFR
2013-04-01
..., 2000. In the absence of the Secret or Confidential classification authority, the person designated to... 22 Foreign Relations 1 2013-04-01 2013-04-01 false Original classification authority. 9.5 Section... classification authority. (a) Authority for original classification of information as Top Secret may be exercised...
22 CFR 9.5 - Original classification authority.
Code of Federal Regulations, 2014 CFR
2014-04-01
..., 2000. In the absence of the Secret or Confidential classification authority, the person designated to... 22 Foreign Relations 1 2014-04-01 2014-04-01 false Original classification authority. 9.5 Section... classification authority. (a) Authority for original classification of information as Top Secret may be exercised...
ERIC Educational Resources Information Center
Wyse, Adam E.; Babcock, Ben
2016-01-01
A common suggestion made in the psychometric literature for fixed-length classification tests is that one should design tests so that they have maximum information at the cut score. Designing tests in this way is believed to maximize the classification accuracy and consistency of the assessment. This article uses simulated examples to illustrate…
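As a worked example of the "maximum information at the cut score" idea (not taken from the article), item information under a two-parameter logistic IRT model is I(theta) = a^2 * P(theta) * (1 - P(theta)) and peaks where item difficulty matches the ability level; the cut score and item parameters below are hypothetical.

```python
# Worked example under a 2PL IRT model: items whose difficulty b lies near the
# cut score contribute the most information there. Parameters are hypothetical.
import numpy as np

def item_information(theta, a, b):
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))   # 2PL probability of a correct response
    return a ** 2 * p * (1.0 - p)

cut_score = 0.5                                   # hypothetical cut score on the theta scale
items = [(1.2, 0.5), (0.8, -1.0), (1.5, 0.6)]     # (discrimination a, difficulty b)
for a, b in items:
    print(f"a={a}, b={b}: info at cut = {item_information(cut_score, a, b):.3f}")
```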
Rajagopal, Rekha; Ranganathan, Vidhyapriya
2018-06-05
Automation in cardiac arrhythmia classification helps medical professionals make accurate decisions about the patient's health. The aim of this work was to design a hybrid classification model to classify cardiac arrhythmias. The design phase of the classification model comprises the following stages: preprocessing of the cardiac signal by eliminating detail coefficients that contain noise, feature extraction through Daubechies wavelet transform, and arrhythmia classification using a collaborative decision from the K nearest neighbor classifier (KNN) and a support vector machine (SVM). The proposed model is able to classify 5 arrhythmia classes as per the ANSI/AAMI EC57: 1998 classification standard. Level 1 of the proposed model involves classification using the KNN and the classifier is trained with examples from all classes. Level 2 involves classification using an SVM and is trained specifically to classify overlapped classes. The final classification of a test heartbeat pertaining to a particular class is done using the proposed KNN/SVM hybrid model. The experimental results demonstrated that the average sensitivity of the proposed model was 92.56%, the average specificity 99.35%, the average positive predictive value 98.13%, the average F-score 94.5%, and the average accuracy 99.78%. The results obtained using the proposed model were compared with the results of discriminant, tree, and KNN classifiers. The proposed model is able to achieve a high classification accuracy.
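The two-level decision logic described above can be sketched roughly as follows; the synthetic feature clusters, the assumed set of overlapped classes, and the specific KNN/SVM settings are illustrative, not the paper's trained model.

```python
# Rough sketch of a two-level KNN/SVM hybrid: level 1 is a KNN over all
# classes; beats assigned to an "overlapped" class are re-classified by an SVM
# trained only on those classes. Data and the overlapped set are synthetic.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(6)
centers = {0: [0, 0], 1: [3, 3], 2: [3.4, 3.2]}   # classes 1 and 2 overlap
X = np.vstack([rng.normal(c, 0.5, (100, 2)) for c in centers.values()])
y = np.repeat(list(centers), 100)
overlapped = {1, 2}                                # assumed overlapped classes

knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)
mask = np.isin(y, list(overlapped))
svm = SVC(kernel="rbf").fit(X[mask], y[mask])      # level 2: overlapped classes only

def hybrid_predict(samples):
    level1 = knn.predict(samples)
    redo = np.isin(level1, list(overlapped))
    if redo.any():
        level1[redo] = svm.predict(samples[redo])
    return level1

print(hybrid_predict(X[:5]), y[:5])
```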
Improvement in defect classification efficiency by grouping disposition for reticle inspection
NASA Astrophysics Data System (ADS)
Lai, Rick; Hsu, Luke T. H.; Chang, Peter; Ho, C. H.; Tsai, Frankie; Long, Garrett; Yu, Paul; Miller, John; Hsu, Vincent; Chen, Ellison
2005-11-01
As the lithography design rule of IC manufacturing continues to migrate toward more advanced technology nodes, the mask error enhancement factor (MEEF) increases and necessitates the use of aggressive OPC features. These aggressive OPC features pose challenges to reticle inspection due to high false detection rates, which make defect classification time-consuming and impact the throughput of mask manufacturing. Moreover, higher MEEF leads to stricter mask defect capture criteria, so new-generation reticle inspection tools are equipped with better detection capability. Hence, mask process induced defects, which were once undetectable, are now detected, increasing the total defect count. Therefore, how to review and characterize reticle defects efficiently is becoming more significant. A new defect review system called ReviewSmart has been developed based on the concept of defect grouping disposition. The review system intelligently bins repeating or similar defects into defect groups and thus allows operators to review massive numbers of defects more efficiently. Compared to the conventional defect review method, ReviewSmart not only reduces defect classification time and human judgment error, but also eliminates the desensitization that was formerly inevitable. In this study, we attempt to explore the most efficient use of ReviewSmart by evaluating various defect binning conditions. The optimal binning conditions are obtained and have been verified for fidelity qualification through inspection reports (IRs) of production masks. The experimental results help achieve the best defect classification efficiency when using ReviewSmart in mask manufacturing and development.
Liu, Aiming; Liu, Quan; Ai, Qingsong; Xie, Yi; Chen, Anqi
2017-01-01
Motor Imagery (MI) electroencephalography (EEG) is widely studied for its non-invasiveness, easy availability, portability, and high temporal resolution. As for MI EEG signal processing, the high dimensions of features represent a research challenge. It is necessary to eliminate redundant features, which not only create an additional overhead of managing the space complexity, but also might include outliers, thereby reducing classification accuracy. The firefly algorithm (FA) can adaptively select the best subset of features, and improve classification accuracy. However, the FA is easily entrapped in a local optimum. To solve this problem, this paper proposes a method of combining the firefly algorithm and learning automata (LA) to optimize feature selection for motor imagery EEG. We employed a method of combining common spatial pattern (CSP) and local characteristic-scale decomposition (LCD) algorithms to obtain a high dimensional feature set, and classified it by using the spectral regression discriminant analysis (SRDA) classifier. Both the fourth brain–computer interface competition data and real-time data acquired in our designed experiments were used to verify the validation of the proposed method. Compared with genetic and adaptive weight particle swarm optimization algorithms, the experimental results show that our proposed method effectively eliminates redundant features, and improves the classification accuracy of MI EEG signals. In addition, a real-time brain–computer interface system was implemented to verify the feasibility of our proposed methods being applied in practical brain–computer interface systems. PMID:29117100
Liu, Aiming; Chen, Kun; Liu, Quan; Ai, Qingsong; Xie, Yi; Chen, Anqi
2017-11-08
Motor Imagery (MI) electroencephalography (EEG) is widely studied for its non-invasiveness, easy availability, portability, and high temporal resolution. As for MI EEG signal processing, the high dimensions of features represent a research challenge. It is necessary to eliminate redundant features, which not only create an additional overhead of managing the space complexity, but also might include outliers, thereby reducing classification accuracy. The firefly algorithm (FA) can adaptively select the best subset of features, and improve classification accuracy. However, the FA is easily entrapped in a local optimum. To solve this problem, this paper proposes a method of combining the firefly algorithm and learning automata (LA) to optimize feature selection for motor imagery EEG. We employed a method of combining common spatial pattern (CSP) and local characteristic-scale decomposition (LCD) algorithms to obtain a high dimensional feature set, and classified it by using the spectral regression discriminant analysis (SRDA) classifier. Both the fourth brain-computer interface competition data and real-time data acquired in our designed experiments were used to verify the validation of the proposed method. Compared with genetic and adaptive weight particle swarm optimization algorithms, the experimental results show that our proposed method effectively eliminates redundant features, and improves the classification accuracy of MI EEG signals. In addition, a real-time brain-computer interface system was implemented to verify the feasibility of our proposed methods being applied in practical brain-computer interface systems.
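As a rough illustration of the feature-selection idea described above, the sketch below runs a binary firefly search over feature masks, scoring each mask by cross-validated accuracy. It is a minimal sketch only: the learning-automata component and the CSP/LCD feature extraction are omitted, scikit-learn's LDA stands in for the SRDA classifier, and the function names and parameters are illustrative rather than the authors' implementation.

```python
# Illustrative binary firefly search for EEG feature selection (not the paper's code).
# Brightness = cross-validated accuracy of a linear classifier on the selected features;
# LDA stands in here for the SRDA classifier used in the paper.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def brightness(X, y, mask):
    if mask.sum() == 0:
        return 0.0
    return cross_val_score(LinearDiscriminantAnalysis(), X[:, mask.astype(bool)], y, cv=5).mean()

def firefly_select(X, y, n_fireflies=10, n_iter=30, gamma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    pop = rng.integers(0, 2, size=(n_fireflies, d))      # random binary feature masks
    light = np.array([brightness(X, y, m) for m in pop])
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if light[j] > light[i]:                   # firefly i moves toward brighter j
                    dist = np.count_nonzero(pop[i] != pop[j])
                    beta = np.exp(-gamma * (dist / d) ** 2)
                    flip = rng.random(d) < beta           # copy bits from j with prob. beta
                    pop[i] = np.where(flip, pop[j], pop[i])
                    jitter = rng.random(d) < 0.05         # small mutation to keep exploring
                    pop[i] = np.where(jitter, 1 - pop[i], pop[i])
                    light[i] = brightness(X, y, pop[i])
    best = int(np.argmax(light))
    return pop[best].astype(bool), light[best]

# Typical call on a trials x features matrix of extracted EEG features:
# mask, score = firefly_select(X, y)
```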
Zur, Moran; Hanson, Allison S; Dahan, Arik
2014-09-30
While the solubility parameter is fairly straightforward when assigning BCS classification, the intestinal permeability (Peff) is more complex than generally recognized. In this paper we emphasize this complexity through the analysis of codeine, a commonly used antitussive/analgesic drug. Codeine was previously classified as a low-permeability compound, based on its lower LogP compared to metoprolol, a marker for the low-high permeability class boundary. In contrast, a high fraction of dose absorbed (Fabs) was reported for codeine, which challenges the generally recognized Peff-Fabs correlation. The purpose of this study was to clarify this ambiguity through elucidation of codeine's BCS solubility/permeability class membership. Codeine's BCS solubility class was determined, and its intestinal permeability throughout the small intestine was investigated, both in vitro and in vivo in rats. Codeine was found to be unequivocally a high-solubility compound. All in vitro studies indicated that codeine's permeability is higher than metoprolol's. In vivo studies in rats showed similar permeability for both drugs throughout the entire small intestine. In conclusion, codeine was found to be a BCS Class I compound. No Peff-Fabs discrepancy is involved in its absorption; rather, it reflects the risk of assigning BCS classification based merely on limited physicochemical characteristics. A thorough investigation using multiple experimental methods is prudent before assigning a BCS classification, to avoid misjudgment in various settings, e.g., drug discovery, formulation design, drug development and regulation. Copyright © 2013 Elsevier B.V. All rights reserved.
Railroad classification yard design methodology study : East Deerfield Yard, a case study
DOT National Transportation Integrated Search
1980-02-01
This interim report documents the application of a railroad classification yard design methodology to Boston and Maine's East Deerfield Yard Rehabilitation. This case study effort represents Phase 2 of a larger effort to develop a yard design methodol...
32 CFR 2700.12 - Criteria for and level of original classification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... classification are authorized—“Top Secret,” “Secret,” “Confidential.” No other classification designation is... classification. 2700.12 Section 2700.12 National Defense Other Regulations Relating to National Defense OFFICE FOR MICRONESIAN STATUS NEGOTIATIONS SECURITY INFORMATION REGULATIONS Original Classification § 2700.12...
32 CFR 2700.12 - Criteria for and level of original classification.
Code of Federal Regulations, 2012 CFR
2012-07-01
... classification are authorized—“Top Secret,” “Secret,” “Confidential.” No other classification designation is... classification. 2700.12 Section 2700.12 National Defense Other Regulations Relating to National Defense OFFICE FOR MICRONESIAN STATUS NEGOTIATIONS SECURITY INFORMATION REGULATIONS Original Classification § 2700.12...
32 CFR 2700.12 - Criteria for and level of original classification.
Code of Federal Regulations, 2013 CFR
2013-07-01
... classification are authorized—“Top Secret,” “Secret,” “Confidential.” No other classification designation is... classification. 2700.12 Section 2700.12 National Defense Other Regulations Relating to National Defense OFFICE FOR MICRONESIAN STATUS NEGOTIATIONS SECURITY INFORMATION REGULATIONS Original Classification § 2700.12...
Validation of a new classification system for interprosthetic femoral fractures.
Pires, Robinson Esteves Santos; Silveira, Marcelo Peixoto Sena; Resende, Alessandra Regina da Silva; Junior, Egidio Oliveira Santana; Campos, Tulio Vinicius Oliveira; Santos, Leandro Emilio Nascimento; Balbachevsky, Daniel; Andrade, Marco Antônio Percope de
2017-07-01
Interprosthetic femoral fracture (IFF) incidence is gradually increasing as the population is progressively ageing. However, treatment remains challenging due to several contributing factors, such as poor bone quality, patient comorbidities, a small interprosthetic fragment, and prosthesis instability. An effective and specific classification system is essential to optimize treatment management, thereby diminishing complication rates. This study aims to validate a previously described classification system for interprosthetic femoral fractures. Copyright © 2017 Elsevier Ltd. All rights reserved.
A model-based test for treatment effects with probabilistic classifications.
Cavagnaro, Daniel R; Davis-Stober, Clintin P
2018-05-21
Within modern psychology, computational and statistical models play an important role in describing a wide variety of human behavior. Model selection analyses are typically used to classify individuals according to the model(s) that best describe their behavior. These classifications are inherently probabilistic, which presents challenges for performing group-level analyses, such as quantifying the effect of an experimental manipulation. We answer this challenge by presenting a method for quantifying treatment effects in terms of distributional changes in model-based (i.e., probabilistic) classifications across treatment conditions. The method uses hierarchical Bayesian mixture modeling to incorporate classification uncertainty at the individual level into the test for a treatment effect at the group level. We illustrate the method with several worked examples, including a reanalysis of the data from Kellen, Mata, and Davis-Stober (2017), and analyze its performance more generally through simulation studies. Our simulations show that the method is both more powerful and less prone to type-1 errors than Fisher's exact test when classifications are uncertain. In the special case where classifications are deterministic, we find a near-perfect power-law relationship between the Bayes factor, derived from our method, and the p value obtained from Fisher's exact test. We provide code in an online supplement that allows researchers to apply the method to their own data. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Statistical Analyses of Femur Parameters for Designing Anatomical Plates.
Wang, Lin; He, Kunjin; Chen, Zhengming
2016-01-01
Femur parameters are key prerequisites for scientifically designing anatomical plates. Meanwhile, individual differences in femurs present a challenge to designing well-fitting anatomical plates. Therefore, to design anatomical plates more scientifically, analyses of femur parameters with statistical methods were performed in this study. The specific steps were as follows. First, taking eight anatomical femur parameters as variables, 100 femur samples were classified into three classes with factor analysis and Q-type cluster analysis. Second, based on the mean parameter values of the three classes of femurs, three sizes of average anatomical plates corresponding to the three classes of femurs were designed. Finally, based on Bayes discriminant analysis, a new femur could be assigned to the proper class. Thereafter, the average anatomical plate suitable for that new femur was selected from the three available sizes of plates. Experimental results showed that the classification of femurs was quite reasonable based on the anatomical aspects of the femurs. For instance, three sizes of condylar buttress plates were designed, 20 new femurs were assigned to their proper classes, and the suitable condylar buttress plates were then determined and selected.
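The final assignment step lends itself to a short sketch: fit a discriminant classifier on the clustered femur samples, then predict the class, and hence the plate size, for a new femur. This is only an outline of the workflow; scikit-learn's linear discriminant analysis stands in for the Bayes discriminant analysis used in the study, and the measurement arrays are random placeholders for the eight anatomical parameters.

```python
# Assigning a new femur to one of three classes with a discriminant classifier
# (workflow sketch; LDA stands in for Bayes discriminant analysis).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 8))           # placeholder: 100 femurs x 8 anatomical parameters
y = rng.integers(0, 3, size=100)        # placeholder: class labels from the cluster analysis

clf = LinearDiscriminantAnalysis().fit(X, y)
new_femur = rng.normal(size=(1, 8))     # 8 parameters measured on a new femur
plate_size = clf.predict(new_femur)[0]  # index of the average anatomical plate to select
print("use plate size", plate_size)
```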
Divorcing Strain Classification from Species Names.
Baltrus, David A
2016-06-01
Confusion about strain classification and nomenclature permeates modern microbiology. Although taxonomists have traditionally acted as gatekeepers of order, the number of new strains identified, and the speed at which they are identified, have outpaced the opportunity for professional classification for many lineages. Furthermore, the growth of bioinformatics and database-fueled investigations has placed metadata curation in the hands of researchers with little taxonomic experience. Here I describe practical challenges facing modern microbial taxonomy, provide an overview of the complexities of classification for environmentally ubiquitous taxa like Pseudomonas syringae, and emphasize that classification can be independent of nomenclature. A move toward implementation of relational classification schemes based on inherent properties of whole genomes could provide sorely needed continuity in how strains are referenced across manuscripts and data sets. Copyright © 2016 Elsevier Ltd. All rights reserved.
Detection and Classification of UXO Using Unmanned Undersea Electromagnetic Sensing Platforms
NASA Astrophysics Data System (ADS)
Schultz, G.; Keranen, J.; McNinch, J.; Miller, J.
2017-12-01
Important seafloor applications, including mine countermeasures, unexploded ordnance (UXO) surveys, salvage, and underwater hazards, require the detection, geo-registration, and characterization of man-made targets on, or below, the seafloor. Investigations in littoral environments can be time-consuming and expensive due to the challenges of accurately tracking underwater assets, the difficulty of quick or effective site reconnaissance activities, high levels of clutter in nearshore areas, and lack of situational awareness and real-time feedback to operators. Consequently, a high payoff exists for effective methods using sensor and data fusion, feature extraction, and effective payload integration and deployment for improved assessments of littoral infrastructure. We present results from multiple technology research, development, and demonstration projects over the last three years focused on advancing seafloor target detection, tracking, and classification for specific environmental and defense missions. We focus on challenges overcome in integrating and testing new miniaturized passive magnetic and controlled-source electromagnetic sensors on a variety of remotely and autonomously operated sensing platforms (ROVs, AUVs and bottom crawling systems). In particular, we present aspects of the design, development, and testing of array configurations of miniaturized atomic magnetometers/gradiometers and multi-dimensional electromagnetic (EM) sensor arrays. Results from nearshore (surf zone and marsh in North Carolina) and littoral experiments (bays and reef areas of the Florida Gulf and Florida Keys) are presented.
Madison, Matthew J; Bradshaw, Laine P
2015-06-01
Diagnostic classification models are psychometric models that aim to classify examinees according to their mastery or non-mastery of specified latent characteristics. These models are well-suited for providing diagnostic feedback on educational assessments because of their practical efficiency and increased reliability when compared with other multidimensional measurement models. A priori specifications of which latent characteristics or attributes are measured by each item are a core element of the diagnostic assessment design. This item-attribute alignment, expressed in a Q-matrix, precedes and supports any inference resulting from the application of the diagnostic classification model. This study investigates the effects of Q-matrix design on classification accuracy for the log-linear cognitive diagnosis model. Results indicate that classification accuracy, reliability, and convergence rates improve when the Q-matrix contains isolated information from each measured attribute.
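To make the Q-matrix idea concrete, the toy example below encodes which attributes each item measures and checks whether every attribute is measured in isolation by at least one item, the design property the study links to improved classification accuracy. The items and attributes here are hypothetical.

```python
# A toy Q-matrix: rows are items, columns are attributes; a 1 means the item
# measures that attribute. (Illustrative only; not tied to any real assessment.)
import numpy as np

Q = np.array([
    [1, 0, 0],   # item 1 measures attribute A only
    [0, 1, 0],   # item 2 measures attribute B only
    [1, 1, 0],   # item 3 measures A and B
    [0, 0, 1],   # item 4 measures attribute C only
    [1, 0, 1],   # item 5 measures A and C
])

# Design check: every attribute should be measured in isolation by at least one item.
isolated = (Q.sum(axis=1) == 1)                       # items measuring exactly one attribute
covered = np.flatnonzero(Q[isolated].sum(axis=0) > 0)
print("attributes measured in isolation by at least one item:", [int(a) for a in covered])
```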
McElroy, L. M.; Woods, D. M.; Yanes, A. F.; Skaro, A. I.; Daud, A.; Curtis, T.; Wymore, E.; Holl, J. L.; Abecassis, M. M.; Ladner, D. P.
2016-01-01
Objective Efforts to improve patient safety are challenged by the lack of universally agreed upon terms. The International Classification for Patient Safety (ICPS) was developed by the World Health Organization for this purpose. This study aimed to test the applicability of the ICPS to a surgical population. Design A web-based safety debriefing was sent to clinicians involved in surgical care of abdominal organ transplant patients. A multidisciplinary team of patient safety experts, surgeons and researchers used the data to develop a system of classification based on the ICPS. Disagreements were reconciled via consensus, and a codebook was developed for future use by researchers. Results A total of 320 debriefing responses were used for the initial review and codebook development. In total, the 320 debriefing responses contained 227 patient safety incidents (range: 0–7 per debriefing) and 156 contributing factors/hazards (0–5 per response). The most common severity classification was ‘reportable circumstance,’ followed by ‘near miss.’ The most common incident types were ‘resources/organizational management,’ followed by ‘medical device/equipment.’ Several aspects of surgical care were encompassed by more than one classification, including operating room scheduling, delays in care, trainee-related incidents, interruptions and handoffs. Conclusions This study demonstrates that a framework for patient safety can be applied to facilitate the organization and analysis of surgical safety data. Several unique aspects of surgical care require consideration, and by using a standardized framework for describing concepts, research findings can be compared and disseminated across surgical specialties. The codebook is intended for use as a framework for other specialties and institutions. PMID:26803539
NASA Astrophysics Data System (ADS)
Anwer, Rao Muhammad; Khan, Fahad Shahbaz; van de Weijer, Joost; Molinier, Matthieu; Laaksonen, Jorma
2018-04-01
Designing discriminative powerful texture features robust to realistic imaging conditions is a challenging computer vision problem with many applications, including material recognition and analysis of satellite or aerial imagery. In the past, most texture description approaches were based on dense orderless statistical distribution of local features. However, most recent approaches to texture recognition and remote sensing scene classification are based on Convolutional Neural Networks (CNNs). The de facto practice when learning these CNN models is to use RGB patches as input with training performed on large amounts of labeled data (ImageNet). In this paper, we show that Local Binary Patterns (LBP) encoded CNN models, codenamed TEX-Nets, trained using mapped coded images with explicit LBP based texture information provide complementary information to the standard RGB deep models. Additionally, two deep architectures, namely early and late fusion, are investigated to combine the texture and color information. To the best of our knowledge, we are the first to investigate Binary Patterns encoded CNNs and different deep network fusion architectures for texture recognition and remote sensing scene classification. We perform comprehensive experiments on four texture recognition datasets and four remote sensing scene classification benchmarks: UC-Merced with 21 scene categories, WHU-RS19 with 19 scene classes, RSSCN7 with 7 categories and the recently introduced large scale aerial image dataset (AID) with 30 aerial scene types. We demonstrate that TEX-Nets provide complementary information to standard RGB deep model of the same network architecture. Our late fusion TEX-Net architecture always improves the overall performance compared to the standard RGB network on both recognition problems. Furthermore, our final combination leads to consistent improvement over the state-of-the-art for remote sensing scene classification.
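A simple way to experiment with texture-coded inputs of this kind is to compute a Local Binary Pattern map and stack it alongside the RGB channels before feeding a CNN. The sketch below does that with scikit-image; it only loosely mirrors the idea of mapped coded images and is not the TEX-Net architecture or its fusion schemes.

```python
# Sketch: build an LBP-coded texture channel to feed a CNN alongside the RGB image.
import numpy as np
from skimage.color import rgb2gray
from skimage.feature import local_binary_pattern

def rgb_plus_lbp(rgb_image, points=8, radius=1):
    """Return an H x W x 4 array: the RGB image plus a normalized uniform-LBP channel."""
    gray = rgb2gray(rgb_image)                                   # H x W, values in [0, 1]
    lbp = local_binary_pattern(gray, points, radius, method="uniform")
    lbp = lbp / lbp.max()                                        # scale LBP codes to [0, 1]
    return np.dstack([rgb_image.astype(np.float32), lbp.astype(np.float32)])

# Example with a random stand-in for an aerial scene patch
patch = np.random.rand(224, 224, 3).astype(np.float32)
print(rgb_plus_lbp(patch).shape)    # (224, 224, 4)
```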
A Novel Multi-Class Ensemble Model for Classifying Imbalanced Biomedical Datasets
NASA Astrophysics Data System (ADS)
Bikku, Thulasi; Sambasiva Rao, N., Dr; Rao, Akepogu Ananda, Dr
2017-08-01
This paper mainly focuses on developing a Hadoop-based framework for feature selection and classification models to classify high-dimensionality data in heterogeneous biomedical databases. Wide-ranging research has been performed in the fields of machine learning, big data, and data mining for identifying patterns. The main challenge is extracting useful features generated from diverse biological systems. The proposed model can be used for predicting diseases in various applications and identifying the features relevant to particular diseases. Given the exponential growth of biomedical repositories such as PubMed and Medline, an accurate predictive model is essential for knowledge discovery in a Hadoop environment. Extracting key features from unstructured documents often leads to uncertain results due to outliers and missing values. In this paper, we propose a two-phase map-reduce framework with a text preprocessor and a classification model. In the first phase, a mapper-based preprocessing method is designed to eliminate irrelevant features, missing values, and outliers from the biomedical data. In the second phase, a map-reduce based multi-class ensemble decision tree model is designed and applied to the preprocessed mapper data to improve the true positive rate and computational time. The experimental results on complex biomedical datasets show that the performance of our proposed Hadoop-based multi-class ensemble model significantly outperforms state-of-the-art baselines.
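The two-phase design can be illustrated with a small in-memory analogue: a "mapper"-style cleaning step that drops incomplete or outlying records, followed by a bootstrap ensemble of decision trees that votes on new samples. This is a schematic sketch under those assumptions, not Hadoop/MapReduce code and not the authors' ensemble model.

```python
# In-memory analogue of a two-phase pipeline: clean records, then ensemble-vote.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def map_clean(X, y, z_max=3.0):
    """Phase 1 ('mapper'): drop rows with missing values or features beyond z_max std devs."""
    keep = ~np.isnan(X).any(axis=1)
    X, y = X[keep], y[keep]
    z = np.abs((X - X.mean(axis=0)) / X.std(axis=0))
    keep = (z < z_max).all(axis=1)
    return X[keep], y[keep]

def ensemble_vote(X, y, X_new, n_trees=25, seed=0):
    """Phase 2 ('reducer'): bootstrap an ensemble of trees and majority-vote on new samples."""
    rng = np.random.default_rng(seed)
    votes = np.zeros((n_trees, len(X_new)), dtype=int)
    for t in range(n_trees):
        idx = rng.integers(0, len(X), len(X))          # bootstrap sample
        votes[t] = DecisionTreeClassifier(random_state=t).fit(X[idx], y[idx]).predict(X_new)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

# Toy usage with placeholder data containing some missing values
X = np.random.rand(500, 10); X[::50, 0] = np.nan
y = np.random.randint(0, 3, 500)
Xc, yc = map_clean(X, y)
print(ensemble_vote(Xc, yc, Xc[:5]))
```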
Quantum Ensemble Classification: A Sampling-Based Learning Control Approach.
Chen, Chunlin; Dong, Daoyi; Qi, Bo; Petersen, Ian R; Rabitz, Herschel
2017-06-01
Quantum ensemble classification (QEC) has significant applications in discrimination of atoms (or molecules), separation of isotopes, and quantum information extraction. However, quantum mechanics forbids deterministic discrimination among nonorthogonal states. The classification of inhomogeneous quantum ensembles is very challenging, since there exist variations in the parameters characterizing the members within different classes. In this paper, we recast QEC as a supervised quantum learning problem. A systematic classification methodology is presented by using a sampling-based learning control (SLC) approach for quantum discrimination. The classification task is accomplished via simultaneously steering members belonging to different classes to their corresponding target states (e.g., mutually orthogonal states). First, a new discrimination method is proposed for two similar quantum systems. Then, an SLC method is presented for QEC. Numerical results demonstrate the effectiveness of the proposed approach for the binary classification of two-level quantum ensembles and the multiclass classification of multilevel quantum ensembles.
Quality-of-care research in mental health: responding to the challenge.
McGlynn, E A; Norquist, G S; Wells, K B; Sullivan, G; Liberman, R P
1988-01-01
Quality-of-care research in mental health is in the developmental stages, which affords an opportunity to take an integrative approach, building on principles from efficacy, effectiveness, quality assessment, and quality assurance research. We propose an analytic strategy for designing research on the quality of mental health services using an adaptation of the structure, process, and outcome classification scheme. As a concrete illustration of our approach, we discuss research on a particular target population: patients with chronic schizophrenia. Future research should focus on developing models of treatment, establishing criteria and standards for outcomes and processes, and gathering data on community practices.
14 CFR Sec. 19-4 - Service classes.
Code of Federal Regulations, 2010 CFR
2010-01-01
... a composite of first class, coach, and mixed passenger/cargo service. The following classifications... integral part of services performed pursuant to published flight schedules. The following classifications... Classifications Sec. 19-4 Service classes. The statistical classifications are designed to reflect the operating...
14 CFR Sec. 19-4 - Service classes.
Code of Federal Regulations, 2011 CFR
2011-01-01
... a composite of first class, coach, and mixed passenger/cargo service. The following classifications... integral part of services performed pursuant to published flight schedules. The following classifications... Classifications Sec. 19-4 Service classes. The statistical classifications are designed to reflect the operating...
Data Science for Imbalanced Data: Methods and Applications
ERIC Educational Resources Information Center
Johnson, Reid A.
2016-01-01
Data science is a broad, interdisciplinary field concerned with the extraction of knowledge or insights from data, with the classification of data as a core, fundamental task. One of the most persistent challenges faced when performing classification is the class imbalance problem. Class imbalance refers to when the frequency with which each class…
Hussain, Shaista; Basu, Arindam
2016-01-01
The development of power-efficient neuromorphic devices presents the challenge of designing spike pattern classification algorithms which can be implemented on low-precision hardware and can also achieve state-of-the-art performance. In our pursuit of meeting this challenge, we present a pattern classification model which uses a sparse connection matrix and exploits the mechanism of nonlinear dendritic processing to achieve high classification accuracy. A rate-based structural learning rule for multiclass classification is proposed which modifies a connectivity matrix of binary synaptic connections by choosing the best “k” out of “d” inputs to make connections on every dendritic branch (k << d). Because learning only modifies connectivity, the model is well suited for implementation in neuromorphic systems using address-event representation (AER). We develop an ensemble method which combines several dendritic classifiers to achieve enhanced generalization over individual classifiers. We have two major findings: (1) Our results demonstrate that an ensemble created with classifiers comprising a moderate number of dendrites performs better than both ensembles of perceptrons and of complex dendritic trees. (2) In order to determine the moderate number of dendrites required for a specific classification problem, a two-step solution is proposed. First, an adaptive approach is proposed which scales the relative size of the dendritic trees of neurons for each class. It works by progressively adding dendrites with a fixed number of synapses to the network, thereby allocating synaptic resources according to the complexity of the given problem. As a second step, theoretical capacity calculations are used to convert each neuronal dendritic tree to its optimal topology, where dendrites of each class are assigned different numbers of synapses. The performance of the model is evaluated on classification of handwritten digits from the benchmark MNIST dataset and compared with other spike classifiers. We show that our system can achieve classification accuracy within 1-2% of other reported spike-based classifiers while using far fewer synaptic resources (only 7%) than other methods. Further, an ensemble classifier created with adaptively learned sizes can attain an accuracy of 96.4%, which is on par with the best reported performance of spike-based classifiers. Moreover, the proposed method achieves this by using about 20% of the synapses used by other spike algorithms. We also present results of applying our algorithm to classify the MNIST-DVS dataset collected from a real spike-based image sensor and show results comparable to the best reported ones (88.1% accuracy). For VLSI implementations, we show that the reduced synaptic memory can save up to 4X area compared to conventional crossbar topologies. Finally, we also present a biologically realistic spike-based version for calculating the correlations required by the structural learning rule and demonstrate the correspondence between the rate-based and spike-based methods of learning. PMID:27065782
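The structural rule of choosing the best k out of d inputs per dendritic branch can be caricatured with a rate-based sketch: rank inputs by how well their firing rates correlate with membership in the target class and connect each branch to the top k. This is a loose reading for illustration only; it does not reproduce the paper's learning dynamics, and the parameter names are hypothetical.

```python
# Sketch of a k-of-d structural rule: connect each branch to the k inputs whose
# firing rates correlate best with the target class (illustrative, rate-based only).
import numpy as np

def choose_connections(rates, labels, target_class, n_branches=5, k=4, seed=0):
    """rates: samples x d input firing rates; returns an n_branches x d binary matrix."""
    rng = np.random.default_rng(seed)
    y = (labels == target_class).astype(float)
    d = rates.shape[1]
    # correlation of each input with membership in the target class
    corr = np.array([np.corrcoef(rates[:, i], y)[0, 1] for i in range(d)])
    conn = np.zeros((n_branches, d), dtype=int)
    for b in range(n_branches):
        noisy = corr + 0.01 * rng.standard_normal(d)   # break ties differently per branch
        conn[b, np.argsort(noisy)[-k:]] = 1
    return conn

rates = np.random.rand(200, 50)          # 200 samples x 50 input rates (placeholder)
labels = np.random.randint(0, 10, 200)   # e.g. digit labels
print(choose_connections(rates, labels, target_class=3).sum(axis=1))  # k ones per branch
```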
Data handling and representation of freeform surfaces
NASA Astrophysics Data System (ADS)
Steinkopf, Ralf; Dick, Lars; Kopf, Tino; Gebhardt, Andreas; Risse, Stefan; Eberhardt, Ramona
2011-10-01
Freeform surfaces enable innovative optics. They are not limited by axis symmetry and hence they are almost free in design. They are used to reduce the installation space and enhance the performance of optical elements. State-of-the-art optical design tools use powerful algorithms to simulate freeform surfaces. New mathematical approaches are also under development /1/. In consequence, new optical designs /2/ are pushing the development of manufacturing processes, and novel types of datasets have to proceed through the process chain /3/. The complexity of these data poses a huge challenge for data handling. Because of the asymmetrical and 3-dimensional surfaces of freeforms, large data volumes have to be created, trimmed, extended and fitted. All these processes must be performed without losing the accuracy of the original design data. Additionally, manifold types of geometries result in different kinds of mathematical representations of freeform surfaces, and furthermore the CAD/CAM tools in use deal with a set of spatial transport formats. These are all reasons why manufacture-oriented approaches to freeform data handling are not yet sufficiently developed. This paper suggests a classification of freeform surfaces based on the manufacturing methods offered by diamond machining. The different manufacturing technologies, ranging from servo-turning to shaping, require a differentiated approach to data handling. The usage of analytical descriptions in the form of splines and polynomials as well as the application of discrete descriptions like point clouds is shown in relation to the previously made classification. Advantages and disadvantages of freeform representations are discussed. Aspects of data handling between different process steps are pointed out and suitable exchange formats for freeform data are proposed. The described approach offers the possibility of efficient data handling from optical design to systems in novel optics.
5 CFR 1312.4 - Classified designations.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Top Secret. This classification shall be applied only to information the unauthorized disclosure of... original classification authority is able to identify or describe. (2) Secret. This classification shall be...
Classifying diseases and remedies in ethnomedicine and ethnopharmacology.
Staub, Peter O; Geck, Matthias S; Weckerle, Caroline S; Casu, Laura; Leonti, Marco
2015-11-04
Ethnopharmacology focuses on the understanding of local and indigenous use of medicines and therefore an emic approach is inevitable. Often, however, standard biomedical disease classifications are used to describe and analyse local diseases and remedies. Standard classifications might be a valid tool for cross-cultural comparisons and bioprospecting purposes but are not suitable to understand the local perception of disease and use of remedies. Different standard disease classification systems exist but their suitability for cross-cultural comparisons of ethnomedical data has never been assessed. Depending on the research focus, (I) ethnomedical, (II) cross-cultural, and (III) bioprospecting, we provide suggestions for the use of specific classification systems. We analyse three different standard biomedical classification systems (the International Classification of Diseases (ICD); the Economic Botany Data Collection Standard (EBDCS); and the International Classification of Primary Care (ICPC)), and discuss their value for categorizing diseases of ethnomedical systems and their suitability for cross-cultural research in ethnopharmacology. Moreover, based on the biomedical uses of all approved plant derived biomedical drugs, we propose a biomedical therapy-based classification system as a guide for the discovery of drugs from ethnopharmacological sources. Widely used standards, such as the International Classification of Diseases (ICD) by the WHO and the Economic Botany Data Collection Standard (EBDCS) are either technically challenging due to a categorisation system based on clinical examinations, which are usually not possible during field research (ICD) or lack clear biomedical criteria combining disorders and medical effects in an imprecise and confusing way (EBDCS). The International Classification of Primary Care (ICPC), also accepted by the WHO, has more in common with ethnomedical reality than the ICD or the EBDCS, as the categories are designed according to patient's perceptions and are less influenced by clinical medicine. Since diagnostic tools are not required, medical ethnobotanists and ethnopharmacologists can easily classify reported symptoms and complaints with the ICPC in one of the "chapters" based on 17 body systems, psychological and social problems. Also the biomedical uses of plant-derived drugs are classifiable into 17 broad organ- and therapy-based use-categories but can easily be divided into more specific subcategories. Depending on the research focus (I-III) we propose the following classification systems: I. Ethnomedicine: Ethnomedicine is culture-bound and local classifications have to be understood from an emic perspective. Consequently, the application of prefabricated, "one-size fits all" biomedical classification schemes is of limited value. II. Cross-cultural analysis: The ICPC is a suitable standard that can be applied but modified as required. III. Bioprospecting: We suggest a biomedical therapy-driven classification system with currently 17 use-categories based on biomedical uses of all approved plant derived natural product drugs. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
McMahon, Richard
2018-03-01
A recently blossoming historiographical literature recognizes that physical anthropologists allied with scholars of diverse aspects of society and history to racially classify European peoples over a period of about a hundred years. They created three successive race classification coalitions - ethnology, from around 1840; anthropology, from the 1850s; and interwar raciology - each of which successively disintegrated. The present genealogical study argues that representing these coalitions as 'transdisciplinary' can enrich our understanding of challenges to disciplinary specialization. This is especially the case for the less well-studied nineteenth century, when disciplines and challenges to disciplinary specialization were both gradually emerging. Like Marxism or structuralism, race classification was a holistic interpretive framework, which, at its most ambitious, aimed to structure the human sciences as a whole. It resisted the organization of academia and knowledge into disciplines with separate organizational institutions and research practices. However, the 'transdisciplinarity' of this nationalistic project also bridged emerging borderlines between science and politics. I ascribe race classification's simultaneous longevity and instability to its complex and intricately entwined processes of political and interdisciplinary coalition building. Race classification's politically useful conclusions helped secure public support for institutionalizing the coalition's component disciplines. Institutionalization in turn stimulated disciplines to professionalize. They emphasized disciplinary boundaries and insisted on apolitical science, thus ultimately undermining the 'transdisciplinary' project.
Emotion models for textual emotion classification
NASA Astrophysics Data System (ADS)
Bruna, O.; Avetisyan, H.; Holub, J.
2016-11-01
This paper deals with textual emotion classification, which has gained attention in recent years. Emotion classification is used in user experience, product evaluation, national security, and tutoring applications. It attempts to detect the emotional content in the input text and, based on different approaches, establish what kind of emotional content is present, if any. Textual emotion classification is the most difficult to handle, since it relies mainly on linguistic resources and introduces many challenges in assigning text to an emotion represented by a proper model. A crucial part of each emotion detector is the emotion model. The focus of this paper is to introduce the emotion models used for classification. Categorical and dimensional models of emotion are explained and some more advanced approaches are mentioned.
Classification of male lower torso for underwear design
NASA Astrophysics Data System (ADS)
Cheng, Z.; Kuzmichev, V. E.
2017-10-01
By means of scanning technology we have obtained new information about the morphology of male bodies and have revised the classification of men’s underwear by adapting it to consumer demands. To build the new classification in accordance with the characteristic factors of the male lower torso, we developed an underwear design method that yields accurate and comfortable products for consumers.
Cell-based therapy technology classifications and translational challenges
Mount, Natalie M.; Ward, Stephen J.; Kefalas, Panos; Hyllner, Johan
2015-01-01
Cell therapies offer the promise of treating and altering the course of diseases which cannot be addressed adequately by existing pharmaceuticals. Cell therapies are a diverse group across cell types and therapeutic indications and have been an active area of research for many years but are now strongly emerging through translation and towards successful commercial development and patient access. In this article, we present a description of a classification of cell therapies on the basis of their underlying technologies rather than the more commonly used classification by cell type because the regulatory path and manufacturing solutions are often similar within a technology area due to the nature of the methods used. We analyse the progress of new cell therapies towards clinical translation, examine how they are addressing the clinical, regulatory, manufacturing and reimbursement requirements, describe some of the remaining challenges and provide perspectives on how the field may progress for the future. PMID:26416686
14 CFR 1203.412 - Classification guides.
Code of Federal Regulations, 2010 CFR
2010-01-01
... of the classification designations (i.e., Top Secret, Secret or Confidential) apply to the identified... writing by an official with original Top Secret classification authority; the identity of the official...
Krause, Fabian G; Di Silvestro, Matthew; Penner, Murray J; Wing, Kevin J; Glazebrook, Mark A; Daniels, Timothy R; Lau, Johnny T C; Younger, Alastair S E
2012-02-01
End-stage ankle arthritis is operatively treated with numerous designs of total ankle replacement and different techniques for ankle fusion. For superior comparison of these procedures, outcome research requires a classification system to stratify patients appropriately. A postoperative 4-type classification system was designed by 6 fellowship-trained foot and ankle surgeons. Four surgeons reviewed blinded patient profiles and radiographs on 2 occasions to determine the interobserver and intraobserver reliability of the classification. Excellent interobserver reliability (κ = .89) and intraobserver reproducibility (κ = .87) were demonstrated for the postoperative classification system. In conclusion, the postoperative Canadian Orthopaedic Foot and Ankle Society (COFAS) end-stage ankle arthritis classification system appears to be a valid tool to evaluate the outcome of patients operated for end-stage ankle arthritis.
Self-organizing ontology of biochemically relevant small molecules
2012-01-01
Background The advent of high-throughput experimentation in biochemistry has led to the generation of vast amounts of chemical data, necessitating the development of novel analysis, characterization, and cataloguing techniques and tools. Recently, a movement to publically release such data has advanced biochemical structure-activity relationship research, while providing new challenges, the biggest being the curation, annotation, and classification of this information to facilitate useful biochemical pattern analysis. Unfortunately, the human resources currently employed by the organizations supporting these efforts (e.g. ChEBI) are expanding linearly, while new useful scientific information is being released in a seemingly exponential fashion. Compounding this, currently existing chemical classification and annotation systems are not amenable to automated classification, formal and transparent chemical class definition axiomatization, facile class redefinition, or novel class integration, thus further limiting chemical ontology growth by necessitating human involvement in curation. Clearly, there is a need for the automation of this process, especially for novel chemical entities of biological interest. Results To address this, we present a formal framework based on Semantic Web technologies for the automatic design of chemical ontology which can be used for automated classification of novel entities. We demonstrate the automatic self-assembly of a structure-based chemical ontology based on 60 MeSH and 40 ChEBI chemical classes. This ontology is then used to classify 200 compounds with an accuracy of 92.7%. We extend these structure-based classes with molecular feature information and demonstrate the utility of our framework for classification of functionally relevant chemicals. Finally, we discuss an iterative approach that we envision for future biochemical ontology development. Conclusions We conclude that the proposed methodology can ease the burden of chemical data annotators and dramatically increase their productivity. We anticipate that the use of formal logic in our proposed framework will make chemical classification criteria more transparent to humans and machines alike and will thus facilitate predictive and integrative bioactivity model development. PMID:22221313
The SpeX Prism Library Analysis Toolkit: Design Considerations and First Results
NASA Astrophysics Data System (ADS)
Burgasser, Adam J.; Aganze, Christian; Escala, Ivana; Lopez, Mike; Choban, Caleb; Jin, Yuhui; Iyer, Aishwarya; Tallis, Melisa; Suarez, Adrian; Sahi, Maitrayee
2016-01-01
Various observational and theoretical spectral libraries now exist for galaxies, stars, planets and other objects, which have proven useful for classification, interpretation, simulation and model development. Effective use of these libraries relies on analysis tools, which are often left to users to develop. In this poster, we describe a program to develop a combined spectral data repository and Python-based analysis toolkit for low-resolution spectra of very low mass dwarfs (late M, L and T dwarfs), which enables visualization, spectral index analysis, classification, atmosphere model comparison, and binary modeling for nearly 2000 library spectra and user-submitted data. The SpeX Prism Library Analysis Toolkit (SPLAT) is being constructed as a collaborative, student-centered, learning-through-research model with high school, undergraduate and graduate students and regional science teachers, who populate the database and build the analysis tools through quarterly challenge exercises and summer research projects. In this poster, I describe the design considerations of the toolkit, its current status and development plan, and report the first published results led by undergraduate students. The combined data and analysis tools are ideal for characterizing cool stellar and exoplanetary atmospheres (including direct exoplanetary spectra observations by Gemini/GPI, VLT/SPHERE, and JWST), and the toolkit design can be readily adapted for other spectral datasets as well.This material is based upon work supported by the National Aeronautics and Space Administration under Grant No. NNX15AI75G. SPLAT code can be found at https://github.com/aburgasser/splat.
Telemedicine for Developing Countries
Combi, Carlo; Pozzani, Gabriele
2016-01-01
Summary Background Developing countries need telemedicine applications that help in many situations: when physicians are few relative to the population, when specialized physicians are not available, and when patients and physicians in rural villages need assistance in the delivery of health care. Moreover, the requirements of telemedicine applications for developing countries are somewhat more demanding than for developed countries. Indeed, further social, organizational, and technical aspects need to be considered for successful telemedicine applications in developing countries. Objective We consider all the major projects in telemedicine devoted to developing countries, as described in the relevant scientific literature. On the basis of such literature, we want to define a specific taxonomy that allows a proper classification and a fast overview of telemedicine projects in developing countries. Moreover, by considering both the literature and some recent direct experiences, we want to complete this overview by discussing some design issues to be taken into consideration when developing telemedicine software systems. Methods We considered and reviewed the major conferences and journals in depth, and looked for reports on telemedicine projects. Results We provide the reader with a survey of the main projects and systems, from which we derived a taxonomy of features of telemedicine systems for developing countries. We also propose and discuss some classification criteria for design issues, based on the lessons learned in this research area. Conclusions We highlight some challenges and recommendations to be considered when designing a telemedicine system for developing countries. PMID:27803948
Online clustering algorithms for radar emitter classification.
Liu, Jun; Lee, Jim P Y; Senior; Li, Lingjie; Luo, Zhi-Quan; Wong, K Max
2005-08-01
Radar emitter classification is a special application of data clustering for classifying unknown radar emitters from received radar pulse samples. The main challenges of this task are the high dimensionality of radar pulse samples, small sample group size, and closely located radar pulse clusters. In this paper, two new online clustering algorithms are developed for radar emitter classification: One is model-based using the Minimum Description Length (MDL) criterion and the other is based on competitive learning. Computational complexity is analyzed for each algorithm and then compared. Simulation results show the superior performance of the model-based algorithm over competitive learning in terms of better classification accuracy, flexibility, and stability.
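A minimal version of the competitive-learning variant can be written in a few lines: each incoming pulse feature vector either nudges the nearest cluster center toward it or, if it is far from all existing centers, spawns a new cluster. The threshold and learning rate below are illustrative, and the sketch omits the MDL-based model selection used by the first algorithm.

```python
# Minimal online competitive-learning clusterer for streaming pulse feature vectors
# (an illustration of the second algorithm family, not the authors' code).
import numpy as np

class OnlineCompetitiveClusterer:
    def __init__(self, new_cluster_dist=2.0, lr=0.1):
        self.centers = []                      # one center per discovered emitter class
        self.new_cluster_dist = new_cluster_dist
        self.lr = lr

    def update(self, pulse):
        pulse = np.asarray(pulse, dtype=float)
        if not self.centers:
            self.centers.append(pulse.copy())
            return 0
        dists = [np.linalg.norm(pulse - c) for c in self.centers]
        winner = int(np.argmin(dists))
        if dists[winner] > self.new_cluster_dist:          # far from all centers: new emitter
            self.centers.append(pulse.copy())
            return len(self.centers) - 1
        self.centers[winner] += self.lr * (pulse - self.centers[winner])  # move the winner
        return winner

clusterer = OnlineCompetitiveClusterer()
for pulse in np.random.rand(100, 4):           # streaming 4-D pulse descriptors (placeholder)
    clusterer.update(pulse)
print("discovered clusters:", len(clusterer.centers))
```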
Science Planning and Orbit Classification for Solar Probe Plus
NASA Astrophysics Data System (ADS)
Kusterer, M. B.; Fox, N. J.; Rodgers, D. J.; Turner, F. S.
2016-12-01
There are a number of challenges for the Science Planning Team (SPT) of the Solar Probe Plus (SPP) Mission. Since SPP is using a decoupled payload operations approach, tight coordination between the mission operations and payload teams will be required. The payload teams must manage the volume of data that they write to the spacecraft solid-state recorders (SSR) for their individual instruments for downlink to the ground. Making this process more difficult, the geometry of the celestial bodies and the spacecraft during some of the SPP mission orbits causes limited uplink and downlink opportunities. The payload teams will also be required to coordinate power-on opportunities, command uplink opportunities, and data transfers from instrument memory to the spacecraft SSR with the operations team. The SPT also intends to coordinate observations with other spacecraft and ground-based systems. To solve these challenges, detailed orbit activity planning is required in advance for each orbit. An orbit planning process is being created to facilitate the coordination of spacecraft and payload activities for each orbit. An interactive Science Planning Tool is being designed to integrate the payload data volume and priority allocations, spacecraft ephemeris, attitude, downlink and uplink schedules, and spacecraft and payload activities. It will be used during science planning to select the instrument data priorities and data volumes that satisfy the orbit data volume constraints and the power-on, command uplink, and data transfer time periods. To aid in the initial stages of science planning we have created an orbit classification scheme based on downlink availability and significant science events. Different types of challenges arise in the management of science data driven by orbital geometry and operational constraints, and this scheme attempts to identify the patterns that emerge.
14 CFR 1203.701 - Classification.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 5 2013-01-01 2013-01-01 false Classification. 1203.701 Section 1203.701... Government Information § 1203.701 Classification. (a) Foreign government information that is classified by a foreign entity shall either retain its original classification designation or be marked with a United...
14 CFR 1203.701 - Classification.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 5 2012-01-01 2012-01-01 false Classification. 1203.701 Section 1203.701... Government Information § 1203.701 Classification. (a) Foreign government information that is classified by a foreign entity shall either retain its original classification designation or be marked with a United...
Estimating Classification Consistency and Accuracy for Cognitive Diagnostic Assessment
ERIC Educational Resources Information Center
Cui, Ying; Gierl, Mark J.; Chang, Hua-Hua
2012-01-01
This article introduces procedures for the computation and asymptotic statistical inference for classification consistency and accuracy indices specifically designed for cognitive diagnostic assessments. The new classification indices can be used as important indicators of the reliability and validity of classification results produced by…
14 CFR 1203.701 - Classification.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 5 2011-01-01 2010-01-01 true Classification. 1203.701 Section 1203.701... Government Information § 1203.701 Classification. (a) Foreign government information that is classified by a foreign entity shall either retain its original classification designation or be marked with a United...
Typecasting catchments: Classification, directionality, and the pursuit of universality
NASA Astrophysics Data System (ADS)
Smith, Tyler; Marshall, Lucy; McGlynn, Brian
2018-02-01
Catchment classification poses a significant challenge to hydrology and hydrologic modeling, restricting widespread transfer of knowledge from well-studied sites. The identification of important physical, climatological, or hydrologic attributes (to varying degrees depending on application/data availability) has traditionally been the focus for catchment classification. Classification approaches are regularly assessed with regard to their ability to provide suitable hydrologic predictions - commonly by transferring fitted hydrologic parameters from a data-rich catchment to a data-poor catchment deemed similar by the classification. While such approaches to hydrology's grand challenges are intuitive, they often ignore the most uncertain aspect of the process - the model itself. We explore catchment classification and parameter transferability and the concept of universal donor/acceptor catchments. We identify the implications of the assumption that the transfer of parameters between "similar" catchments is reciprocal (i.e., non-directional). These concepts are considered through three case studies situated across multiple gradients that include model complexity, process description, and site characteristics. Case study results highlight that some catchments are more successfully used as donor catchments and others are better suited as acceptor catchments. These results were observed for both black-box and process-consistent hydrologic models, as well as for differing levels of catchment similarity. Therefore, we suggest that similarity does not adequately satisfy the underlying assumptions being made in parameter regionalization approaches regardless of model appropriateness. Furthermore, we suggest that the directionality of parameter transfer is an important factor in determining the success of parameter regionalization approaches.
Deep Recurrent Neural Networks for Supernovae Classification
NASA Astrophysics Data System (ADS)
Charnock, Tom; Moss, Adam
2017-03-01
We apply deep recurrent neural networks, which are capable of learning complex sequential information, to classify supernovae (code available at https://github.com/adammoss/supernovae). The observational time and filter fluxes are used as inputs to the network, but since the inputs are agnostic, additional data such as host galaxy information can also be included. Using the Supernovae Photometric Classification Challenge (SPCC) data, we find that deep networks are capable of learning about light curves; however, the performance of the network is highly sensitive to the amount of training data. For a training size of 50% of the representational SPCC data set (around 10^4 supernovae) we obtain a type-Ia versus non-type-Ia classification accuracy of 94.7%, an area under the Receiver Operating Characteristic curve (AUC) of 0.986 and an SPCC figure-of-merit F1 = 0.64. When using only the data for the early-epoch challenge defined by the SPCC, we achieve a classification accuracy of 93.1%, an AUC of 0.977, and F1 = 0.58, results almost as good as with the whole light curve. By employing bidirectional neural networks, we can acquire impressive classification results between supernovae types I, II and III at an accuracy of 90.4% and AUC of 0.974. We also apply a pre-trained model to obtain classification probabilities as a function of time and show that it can give early indications of supernova type. Our method is competitive with existing algorithms and has applications for future large-scale photometric surveys.
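The general recipe of feeding per-epoch times and filter fluxes into a bidirectional recurrent network that outputs class probabilities can be sketched in a few lines of Keras. Layer sizes, the zero-padding and masking scheme, and the random stand-in data below are illustrative assumptions, not the published architecture or training setup.

```python
# Sketch of a recurrent light-curve classifier in Keras (inspired by the approach above).
import numpy as np
import tensorflow as tf

n_timesteps, n_features, n_classes = 100, 5, 3   # e.g. time plus fluxes in 4 filters; types I/II/III

model = tf.keras.Sequential([
    tf.keras.Input(shape=(n_timesteps, n_features)),
    tf.keras.layers.Masking(mask_value=0.0),                    # ignore zero-padded epochs
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(16)),
    tf.keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# Zero-padded sequences stand in for real, irregularly sampled light curves.
X = np.random.rand(32, n_timesteps, n_features).astype("float32")
y = np.random.randint(0, n_classes, size=32)
model.fit(X, y, epochs=1, verbose=0)
print(model.predict(X[:1]))      # class probabilities for one light curve
```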
10 CFR 1045.39 - Challenging classification and declassification determinations.
Code of Federal Regulations, 2013 CFR
2013-01-01
... holder of an RD or FRD document who, in good faith, believes that the RD or FRD document has an improper... classified the document. (b) Agencies shall establish procedures under which authorized holders of RD and FRD... involving RD or FRD may be appealed to the Director of Classification. In the case of FRD and RD related...
10 CFR 1045.39 - Challenging classification and declassification determinations.
Code of Federal Regulations, 2012 CFR
2012-01-01
... holder of an RD or FRD document who, in good faith, believes that the RD or FRD document has an improper... classified the document. (b) Agencies shall establish procedures under which authorized holders of RD and FRD... involving RD or FRD may be appealed to the Director of Classification. In the case of FRD and RD related...
10 CFR 1045.39 - Challenging classification and declassification determinations.
Code of Federal Regulations, 2014 CFR
2014-01-01
... holder of an RD or FRD document who, in good faith, believes that the RD or FRD document has an improper... classified the document. (b) Agencies shall establish procedures under which authorized holders of RD and FRD... involving RD or FRD may be appealed to the Director of Classification. In the case of FRD and RD related...
2014-09-30
This ONR grant promotes the development and application of advanced machine learning techniques for the detection and classification of marine mammal sounds. The objective is to engage a broad community of data scientists in the development and application of these techniques.
ERIC Educational Resources Information Center
Okhremtchouk, I.; Levine-Smith, J.; Clark, Adam T.
2018-01-01
In this article we unpack the obstacles and opportunities associated with language minority student classification practices and, more specifically, English language learners' reclassification to fluent proficient status. First, we discuss classification permanency for language minority students. Second, we provide an overview of national…
ERIC Educational Resources Information Center
Härtinger, Stefan; Clarke, Nigel
2016-01-01
Developing skills for searching the patent literature is an essential element of chemical information literacy programs at the university level. The present article creates awareness of patents as a rich source of chemical information. Patent classification is introduced as a key-component in comprehensive search strategies. The free Espacenet…
Optimal design of a bank of spatio-temporal filters for EEG signal classification.
Higashi, Hiroshi; Tanaka, Toshihisa
2011-01-01
The spatial weights for electrodes, called the common spatial pattern (CSP), are known to be effective in EEG signal classification for motor imagery based brain computer interfaces (MI-BCI). To achieve accurate classification with CSP, the frequency filter should be properly designed, and several methods for designing the filter have been proposed. However, the existing methods cannot account for multiple brain activities described by different frequency bands and different spatial patterns, such as activities of the mu and beta rhythms. In order to efficiently extract these brain activities, we propose a method to design multiple filters and spatial weights that extract the desired brain activity. The proposed method designs finite impulse response (FIR) filters and the associated spatial weights by optimizing an objective function that is a natural extension of CSP. Moreover, we show in a classification experiment that the bank of FIR filters designed by introducing an orthogonality constraint into the objective function can extract good discriminative features. The experimental results also suggest that the proposed method can automatically detect and extract brain activities related to motor imagery.
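For reference, the baseline CSP computation that the proposed method extends can be written compactly as a generalized eigendecomposition of the two class-average covariance matrices, followed by log-variance feature extraction. The sketch below is that standard baseline, not the joint FIR filter and spatial weight optimization proposed in the paper.

```python
# Minimal CSP computation for two-class EEG trials (standard baseline, not the paper's method).
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_pairs=3):
    """trials_*: lists of (channels x samples) arrays; returns 2*n_pairs spatial filters."""
    def mean_cov(trials):
        covs = []
        for x in trials:
            c = x @ x.T
            covs.append(c / np.trace(c))       # normalize each trial covariance by total power
        return np.mean(covs, axis=0)
    Ca, Cb = mean_cov(trials_a), mean_cov(trials_b)
    # generalized eigenproblem Ca w = lambda (Ca + Cb) w
    vals, vecs = eigh(Ca, Ca + Cb)
    order = np.argsort(vals)                   # extreme eigenvalues are most discriminative
    picks = np.r_[order[:n_pairs], order[-n_pairs:]]
    return vecs[:, picks].T                    # filters x channels

def csp_features(trial, W):
    z = W @ trial
    var = z.var(axis=1)
    return np.log(var / var.sum())             # log-variance features for the classifier

# Example with random two-class trials (8 channels x 256 samples each)
trials_a = [np.random.randn(8, 256) for _ in range(20)]
trials_b = [np.random.randn(8, 256) for _ in range(20)]
W = csp_filters(trials_a, trials_b)
print(csp_features(trials_a[0], W))
```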
Available Tools and Challenges Classifying Cutting-Edge and Historical Astronomical Documents
NASA Astrophysics Data System (ADS)
Lagerstrom, Jill
2015-08-01
The STScI Library assists the Science Policies Division in evaluating and choosing scientific keywords and categories for proposals for the Hubble Space Telescope mission and the upcoming James Webb Space Telescope mission. In addition we are often faced with the question “what is the shape of the astronomical literature?” However, subject classification in astronomy has not been cultivated in recent times. This talk will address the available tools and challenges of classifying cutting-edge as well as historical astronomical documents. In the process, we will give an overview of current and upcoming practices of subject classification in astronomy.
Biosonar-inspired technology: goals, challenges and insights.
Müller, Rolf; Kuc, Roman
2007-12-01
Bioinspired engineering based on biosonar systems in nature is reviewed and discussed in terms of the merits of different approaches and their results: biosonar systems are attractive technological paragons because of their capabilities, built-in task-specific knowledge, intelligent system integration and diversity. Insights from the diverse set of sensing tasks solved by bats are relevant to a wide range of application areas such as sonar, biomedical ultrasound, non-destructive testing, sensors for autonomous systems and wireless communication. Challenges in the design of bioinspired sonar systems are posed by transducer performance, actuation for sensor mobility, design, actuation and integration of beamforming baffle shapes, echo encoding for signal processing, estimation algorithms and their implementations, as well as system integration and feedback control. The discussed examples of experimental systems have capabilities that include localization and tracking using binaural and multiple-band hearing as well as self-generated dynamic cues, classification of small deterministic and large random targets, beamforming with bioinspired baffle shapes, neuromorphic spike processing, artifact rejection in sonar maps and passing range estimation. In future research, bioinspired engineering could capitalize on some of its strengths to serve as a model system for basic automation methodologies for the bioinspired engineering process.
NASA Astrophysics Data System (ADS)
Buongiorno Nardelli, Marco
2015-03-01
High-Throughput Quantum-Mechanics computation of materials properties by ab initio methods has become the foundation of an effective approach to materials design, discovery and characterization. This data-driven approach to materials science currently presents the most promising path to the development of advanced technological materials that could solve or mitigate important social and economic challenges of the 21st century. In particular, the rapid proliferation of computational data on materials properties presents the possibility to complement and extend materials property databases where the experimental data is lacking and difficult to obtain. Enhanced repositories such as AFLOWLIB open novel opportunities for structure discovery and optimization, including uncovering of unsuspected compounds, metastable structures and correlations between various properties. The practical realization of these opportunities depends on the design of efficient algorithms for electronic structure simulations of realistic material systems, the systematic compilation and classification of the generated data, and its presentation in an easily accessed form to the materials science community, the primary mission of the AFLOW consortium. This work was supported by ONR-MURI under Contract N00014-13-1-0635 and the Duke University Center for Materials Genomics.
Zhou, Fuqun; Zhang, Aining
2016-01-01
Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2–3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle the two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of two Random Forests features: variable importance and outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about half of the variables, we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data. PMID:27792152
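A minimal sketch of the variable-importance idea with scikit-learn, using a synthetic stand-in for the MODIS composite stack; the variable counts, labels, and the decision to keep the top half of the variables are illustrative assumptions rather than the authors' exact workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a time-series stack, e.g. 23 ten-day composites x 2 bands = 46 variables.
rng = np.random.default_rng(1)
X = rng.standard_normal((2000, 46))
y = (X[:, 5] + X[:, 20] + 0.5 * rng.standard_normal(2000) > 0).astype(int)   # toy land-cover labels

rf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X, y)
ranked = np.argsort(rf.feature_importances_)[::-1]        # variables ordered by importance
top_half = ranked[: X.shape[1] // 2]

full = cross_val_score(RandomForestClassifier(n_estimators=200, random_state=1), X, y, cv=5).mean()
half = cross_val_score(RandomForestClassifier(n_estimators=200, random_state=1), X[:, top_half], y, cv=5).mean()
print(f"accuracy with all variables: {full:.3f}, with the top half: {half:.3f}")
```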
Kainz, Philipp; Pfeiffer, Michael; Urschler, Martin
2017-01-01
Segmentation of histopathology sections is a necessary preprocessing step for digital pathology. Due to the large variability of biological tissue, machine learning techniques have shown superior performance over conventional image processing methods. Here we present our deep neural network-based approach for segmentation and classification of glands in tissue of benign and malignant colorectal cancer, which was developed to participate in the GlaS@MICCAI2015 colon gland segmentation challenge. We use two distinct deep convolutional neural networks (CNN) for pixel-wise classification of Hematoxylin-Eosin stained images. While the first classifier separates glands from background, the second classifier identifies gland-separating structures. In a subsequent step, a figure-ground segmentation based on weighted total variation produces the final segmentation result by regularizing the CNN predictions. We present both quantitative and qualitative segmentation results on the recently released and publicly available Warwick-QU colon adenocarcinoma dataset associated with the GlaS@MICCAI2015 challenge and compare our approach to the simultaneously developed other approaches that participated in the same challenge. On two test sets, we demonstrate our segmentation performance and show that we achieve a tissue classification accuracy of 98% and 95%, making use of the inherent capability of our system to distinguish between benign and malignant tissue. Our results show that deep learning approaches can yield highly accurate and reproducible results for biomedical image analysis, with the potential to significantly improve the quality and speed of medical diagnoses.
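As a rough illustration of the final regularization step, the sketch below smooths a (synthetic) CNN foreground-probability map with an unweighted total-variation denoiser from scikit-image and thresholds it; the challenge entry itself uses a weighted total-variation formulation, so this is only an approximation of the idea.

```python
import numpy as np
from skimage.restoration import denoise_tv_chambolle

# Synthetic stand-in for the gland-probability map produced by the first CNN.
rng = np.random.default_rng(2)
yy, xx = np.mgrid[0:128, 0:128]
prob_gland = ((xx - 64) ** 2 + (yy - 64) ** 2 < 30 ** 2).astype(float)
prob_gland = np.clip(prob_gland + 0.3 * rng.standard_normal(prob_gland.shape), 0.0, 1.0)

# TV regularization removes speckle in the prediction while preserving gland boundaries;
# thresholding the smoothed map gives the figure-ground segmentation.
smoothed = denoise_tv_chambolle(prob_gland, weight=0.2)
segmentation = smoothed > 0.5
```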
Kainz, Philipp; Pfeiffer, Michael
2017-01-01
Segmentation of histopathology sections is a necessary preprocessing step for digital pathology. Due to the large variability of biological tissue, machine learning techniques have shown superior performance over conventional image processing methods. Here we present our deep neural network-based approach for segmentation and classification of glands in tissue of benign and malignant colorectal cancer, which was developed to participate in the GlaS@MICCAI2015 colon gland segmentation challenge. We use two distinct deep convolutional neural networks (CNN) for pixel-wise classification of Hematoxylin-Eosin stained images. While the first classifier separates glands from background, the second classifier identifies gland-separating structures. In a subsequent step, a figure-ground segmentation based on weighted total variation produces the final segmentation result by regularizing the CNN predictions. We present both quantitative and qualitative segmentation results on the recently released and publicly available Warwick-QU colon adenocarcinoma dataset associated with the GlaS@MICCAI2015 challenge and compare our approach to the simultaneously developed other approaches that participated in the same challenge. On two test sets, we demonstrate our segmentation performance and show that we achieve a tissue classification accuracy of 98% and 95%, making use of the inherent capability of our system to distinguish between benign and malignant tissue. Our results show that deep learning approaches can yield highly accurate and reproducible results for biomedical image analysis, with the potential to significantly improve the quality and speed of medical diagnoses. PMID:29018612
Comparison of seven protocols to identify fecal contamination sources using Escherichia coli
Stoeckel, D.M.; Mathes, M.V.; Hyer, K.E.; Hagedorn, C.; Kator, H.; Lukasik, J.; O'Brien, T. L.; Fenger, T.W.; Samadpour, M.; Strickler, K.M.; Wiggins, B.A.
2004-01-01
Microbial source tracking (MST) uses various approaches to classify fecal-indicator microorganisms to source hosts. Reproducibility, accuracy, and robustness of seven phenotypic and genotypic MST protocols were evaluated by use of Escherichia coli from an eight-host library of known-source isolates and a separate, blinded challenge library. In reproducibility tests, measuring each protocol's ability to reclassify blinded replicates, only one (pulsed-field gel electrophoresis; PFGE) correctly classified all test replicates to host species; three protocols classified 48-62% correctly, and the remaining three classified fewer than 25% correctly. In accuracy tests, measuring each protocol's ability to correctly classify new isolates, ribotyping with EcoRI and PvuII approached 100% correct classification but only 6% of isolates were classified; four of the other six protocols (antibiotic resistance analysis, PFGE, and two repetitive-element PCR protocols) achieved better than random accuracy rates when 30-100% of challenge isolates were classified. In robustness tests, measuring each protocol's ability to recognize isolates from nonlibrary hosts, three protocols correctly classified 33-100% of isolates as "unknown origin," whereas four protocols classified all isolates to a source category. A relevance test, summarizing interpretations for a hypothetical water sample containing 30 challenge isolates, indicated that false-positive classifications would hinder interpretations for most protocols. Study results indicate that more representation in known-source libraries and better classification accuracy would be needed before field application. Thorough reliability assessment of classification results is crucial before and during application of MST protocols.
Zhou, Fuqun; Zhang, Aining
2016-10-25
Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2-3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle the two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of two Random Forests features: variable importance and outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about half of the variables, we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.
5 CFR 1312.4 - Classified designations.
Code of Federal Regulations, 2014 CFR
2014-01-01
... describe. (3) Confidential. This classification shall be applied only to information the unauthorized... 1312.4 Administrative Personnel OFFICE OF MANAGEMENT AND BUDGET OMB DIRECTIVES CLASSIFICATION, DOWNGRADING, DECLASSIFICATION AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION Classification and...
5 CFR 1312.4 - Classified designations.
Code of Federal Regulations, 2012 CFR
2012-01-01
... describe. (3) Confidential. This classification shall be applied only to information the unauthorized... 1312.4 Administrative Personnel OFFICE OF MANAGEMENT AND BUDGET OMB DIRECTIVES CLASSIFICATION, DOWNGRADING, DECLASSIFICATION AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION Classification and...
5 CFR 1312.4 - Classified designations.
Code of Federal Regulations, 2013 CFR
2013-01-01
... describe. (3) Confidential. This classification shall be applied only to information the unauthorized... 1312.4 Administrative Personnel OFFICE OF MANAGEMENT AND BUDGET OMB DIRECTIVES CLASSIFICATION, DOWNGRADING, DECLASSIFICATION AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION Classification and...
Review article: A systematic review of emergency department incident classification frameworks.
Murray, Matthew; McCarthy, Sally
2018-06-01
As in any part of the hospital system, safety incidents can occur in the ED. These incidents arguably have a distinct character, as the ED involves unscheduled flows of urgent patients who require disparate services. To aid understanding of safety issues and support risk management of the ED, a comparison of published ED specific incident classification frameworks was performed. A review of emergency medicine, health management and general medical publications, using Ovid SP to interrogate Medline (1976-2016) was undertaken to identify any type of taxonomy or classification-like framework for ED related incidents. These frameworks were then analysed and compared. The review identified 17 publications containing an incident classification framework. Comparison of factors and themes making up the classification constituent elements revealed some commonality, but no overall consistency, nor evolution towards an ideal framework. Inconsistency arises from differences in the evidential basis and design methodology of classifications, with design itself being an inherently subjective process. It was not possible to identify an 'ideal' incident classification framework for ED risk management, and there is significant variation in the selection of categories used by frameworks. The variation in classification could risk an unbalanced emphasis in findings through application of a particular framework. Design of an ED specific, ideal incident classification framework should be informed by a much wider range of theories of how organisations and systems work, in addition to clinical and human factors. © 2017 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
14 CFR 298.3 - Classification.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Classification. 298.3 Section 298.3... REGULATIONS EXEMPTIONS FOR AIR TAXI AND COMMUTER AIR CARRIER OPERATIONS General § 298.3 Classification. (a) There is hereby established a classification of air carriers, designated as “air taxi operators,” which...
14 CFR 298.3 - Classification.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Classification. 298.3 Section 298.3... REGULATIONS EXEMPTIONS FOR AIR TAXI AND COMMUTER AIR CARRIER OPERATIONS General § 298.3 Classification. (a) There is hereby established a classification of air carriers, designated as “air taxi operators,” which...
14 CFR 298.3 - Classification.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Classification. 298.3 Section 298.3... REGULATIONS EXEMPTIONS FOR AIR TAXI AND COMMUTER AIR CARRIER OPERATIONS General § 298.3 Classification. (a) There is hereby established a classification of air carriers, designated as “air taxi operators,” which...
14 CFR 298.3 - Classification.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Classification. 298.3 Section 298.3... REGULATIONS EXEMPTIONS FOR AIR TAXI AND COMMUTER AIR CARRIER OPERATIONS General § 298.3 Classification. (a) There is hereby established a classification of air carriers, designated as “air taxi operators,” which...
14 CFR 298.3 - Classification.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Classification. 298.3 Section 298.3... REGULATIONS EXEMPTIONS FOR AIR TAXI AND COMMUTER AIR CARRIER OPERATIONS General § 298.3 Classification. (a) There is hereby established a classification of air carriers, designated as “air taxi operators,” which...
Hierarchical structure for audio-video based semantic classification of sports video sequences
NASA Astrophysics Data System (ADS)
Kolekar, M. H.; Sengupta, S.
2005-07-01
A hierarchical structure for sports event classification based on audio and video content analysis is proposed in this paper. Compared with event classification in other games, cricket is very challenging and largely unexplored. We successfully solve the cricket video classification problem using a six-level hierarchical structure. The first level performs event detection based on audio energy and the Zero Crossing Rate (ZCR) of the short-time audio signal. In the subsequent levels, we classify the events based on video features using a Hidden Markov Model implemented through Dynamic Programming (HMM-DP) with color or motion as the likelihood function. For some of the game-specific decisions, a rule-based classification is also performed. The proposed hierarchical structure can easily be applied to other sports. Our results are very promising and represent a step towards addressing semantic classification problems in general.
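The first-level audio cues can be computed in a few lines; the sketch below shows frame-wise short-time energy and zero-crossing rate on a synthetic signal, with the frame length, hop size, and event threshold being illustrative assumptions.

```python
import numpy as np

def short_time_energy_zcr(signal, frame_len=1024, hop=512):
    """Frame-wise short-time energy and zero-crossing rate."""
    energies, zcrs = [], []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        energies.append(np.sum(frame ** 2) / frame_len)
        zcrs.append(np.mean(np.abs(np.diff(np.sign(frame))) > 0))   # fraction of sign changes
    return np.array(energies), np.array(zcrs)

# Toy audio: frames whose energy exceeds a threshold would be flagged as candidate events.
sr = 16000
t = np.arange(2 * sr) / sr
audio = np.sin(2 * np.pi * 440 * t) * (t > 1.0) + 0.02 * np.random.default_rng(3).standard_normal(2 * sr)
energy, zcr = short_time_energy_zcr(audio)
candidate_frames = np.where(energy > 0.1)[0]
```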
28 CFR 570.35 - Transfer furlough eligibility requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... facility based on the inmate's security designation and custody classification at the time of transfer. (d... security designation and custody classification at the time of transfer. ... 28 Judicial Administration 2 2012-07-01 2012-07-01 false Transfer furlough eligibility...
28 CFR 570.35 - Transfer furlough eligibility requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... facility based on the inmate's security designation and custody classification at the time of transfer. (d... security designation and custody classification at the time of transfer. ... 28 Judicial Administration 2 2011-07-01 2011-07-01 false Transfer furlough eligibility...
28 CFR 570.35 - Transfer furlough eligibility requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... facility based on the inmate's security designation and custody classification at the time of transfer. (d... security designation and custody classification at the time of transfer. ... 28 Judicial Administration 2 2014-07-01 2014-07-01 false Transfer furlough eligibility...
28 CFR 570.35 - Transfer furlough eligibility requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... facility based on the inmate's security designation and custody classification at the time of transfer. (d... security designation and custody classification at the time of transfer. ... 28 Judicial Administration 2 2013-07-01 2013-07-01 false Transfer furlough eligibility...
Wu, Haiming; Zhang, Jian; Ngo, Huu Hao; Guo, Wenshan; Hu, Zhen; Liang, Shuang; Fan, Jinlin; Liu, Hai
2015-01-01
Constructed wetlands (CWs) have been used as a green technology to treat various wastewaters for several decades. CWs offer a land-intensive but low-energy alternative to conventional treatment systems with fewer operational requirements, especially for small communities and remote locations. However, the sustainable operation and successful application of these systems remains a challenge. Hence, this paper aims to provide and inspire sustainable solutions for the performance and application of CWs by giving a comprehensive review of CWs' application and recent developments in their sustainable design and operation for wastewater treatment. Firstly, a brief summary of the definition, classification and application of current CWs is presented. The design parameters and operational conditions of CWs, including plant species, substrate types, water depth, hydraulic load, hydraulic retention time and feeding mode, related to sustainable operation for wastewater treatment are then discussed. Lastly, future research directions for improving the stability and sustainability of CWs are highlighted. Copyright © 2014 Elsevier Ltd. All rights reserved.
Hyperspectral image classification based on local binary patterns and PCANet
NASA Astrophysics Data System (ADS)
Yang, Huizhen; Gao, Feng; Dong, Junyu; Yang, Yang
2018-04-01
Hyperspectral image classification has been well acknowledged as one of the challenging tasks of hyperspectral data processing. In this paper, we propose a novel hyperspectral image classification framework based on local binary pattern (LBP) features and PCANet. In the proposed method, linear prediction error (LPE) is first employed to select a subset of informative bands, and LBP is utilized to extract texture features. Then, the spectral and texture features are stacked into a high-dimensional vector. Next, the extracted features of a specified position are transformed into a 2-D image. The obtained images of all pixels are fed into PCANet for classification. Experimental results on a real hyperspectral dataset demonstrate the effectiveness of the proposed method.
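A small sketch of the texture side of the pipeline, using scikit-image's local_binary_pattern on a synthetic band; the neighbourhood size, the use of uniform codes, and the histogram length are illustrative assumptions, and the LPE band selection and PCANet stages are not reproduced here.

```python
import numpy as np
from skimage.feature import local_binary_pattern

rng = np.random.default_rng(4)
band = rng.random((145, 145))                       # stand-in for one informative band

# Uniform LBP with 8 neighbours at radius 1 produces codes in {0, ..., 9};
# a histogram of these codes is the texture descriptor stacked with the spectral vector.
lbp = local_binary_pattern(band, P=8, R=1, method="uniform")
hist, _ = np.histogram(lbp, bins=np.arange(0, 11), density=True)
texture_feature = hist                              # length-10 descriptor for this band
```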
Rey, Sergio J.; Stephens, Philip A.; Laura, Jason R.
2017-01-01
Large data contexts present a number of challenges to optimal choropleth map classifiers. Application of optimal classifiers to a sample of the attribute space is one proposed solution. The properties of alternative sampling-based classification methods are examined through a series of Monte Carlo simulations. Spatial autocorrelation, the number of desired classes, and the form of sampling are shown to have significant impacts on the accuracy of map classifications. Tradeoffs between the improved speed of the sampling approaches and the loss of accuracy are also considered. The results suggest the possibility of guiding the choice of classification scheme as a function of the properties of large data sets.
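The sampling idea can be illustrated with simple quantile breaks, which stand in for the optimal classifiers studied in the paper; the sample size, number of classes, and attribute distribution below are assumptions made only for the sketch.

```python
import numpy as np

rng = np.random.default_rng(5)
attribute = rng.lognormal(mean=3.0, sigma=1.0, size=1_000_000)   # a large attribute vector to map
k = 5                                                            # desired number of classes

# Class breaks from the full data versus from a 1% sample.
full_breaks = np.quantile(attribute, np.linspace(0, 1, k + 1)[1:-1])
sample = rng.choice(attribute, size=10_000, replace=False)
sample_breaks = np.quantile(sample, np.linspace(0, 1, k + 1)[1:-1])

# Accuracy of the sampled classification: share of observations assigned to the same class.
full_classes = np.digitize(attribute, full_breaks)
sample_classes = np.digitize(attribute, sample_breaks)
agreement = np.mean(full_classes == sample_classes)
print(f"class agreement between full and sampled breaks: {agreement:.3f}")
```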
Development and evaluation of a study design typology for human research.
Carini, Simona; Pollock, Brad H; Lehmann, Harold P; Bakken, Suzanne; Barbour, Edward M; Gabriel, Davera; Hagler, Herbert K; Harper, Caryn R; Mollah, Shamim A; Nahm, Meredith; Nguyen, Hien H; Scheuermann, Richard H; Sim, Ida
2009-11-14
A systematic classification of study designs would be useful for researchers, systematic reviewers, readers, and research administrators, among others. As part of the Human Studies Database Project, we developed the Study Design Typology to standardize the classification of study designs in human research. We then performed a multiple observer masked evaluation of active research protocols in four institutions according to a standardized protocol. Thirty-five protocols were classified by three reviewers each into one of nine high-level study designs for interventional and observational research (e.g., N-of-1, Parallel Group, Case Crossover). Rater classification agreement was moderately high for the 35 protocols (Fleiss' kappa = 0.442) and higher still for the 23 quantitative studies (Fleiss' kappa = 0.463). We conclude that our typology shows initial promise for reliably distinguishing study design types for quantitative human research.
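For readers unfamiliar with the agreement statistic, the sketch below computes Fleiss' kappa with statsmodels on hypothetical ratings (35 protocols, 3 reviewers, 9 design categories); the simulated rating matrix is an assumption and does not reproduce the study's data.

```python
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Hypothetical data: 35 protocols, each classified by 3 reviewers into one of 9 design codes (0-8).
rng = np.random.default_rng(6)
base = rng.integers(0, 9, size=(35, 1))
ratings = np.where(rng.random((35, 3)) < 0.7, base, rng.integers(0, 9, size=(35, 3)))

# aggregate_raters turns the subject-by-rater matrix into subject-by-category counts.
counts, _ = aggregate_raters(ratings)
print(f"Fleiss' kappa: {fleiss_kappa(counts):.3f}")
```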
NASA Astrophysics Data System (ADS)
Ilehag, R.; Schenk, A.; Hinz, S.
2017-08-01
This paper presents a concept for the classification of facade elements based on the material and geometry of the elements, in addition to the thermal radiation of the facade, using a multimodal Unmanned Aerial Vehicle (UAV) system. Once the concept is finalized and functional, the workflow can be used for building energy demand estimation by exploiting existing methods for estimating the heat transfer coefficient and the transmitted heat loss. The multimodal system consists of a thermal, a hyperspectral and an optical sensor, which can be operated from a UAV. We present the challenges of working with sensors that operate in different spectral ranges and have different technical specifications, such as radiometric and geometric resolution. We address the different approaches to data fusion, such as image registration, the generation of 3D models by image matching, and the means for classification based on either the geometry of the object or the pixel values. As a first step towards realizing the concept, the result of a geometric calibration with a designed multimodal calibration pattern is presented.
Mapping benthic macroalgal communities in the coastal zone using CHRIS-PROBA mode 2 images
NASA Astrophysics Data System (ADS)
Casal, G.; Kutser, T.; Domínguez-Gómez, J. A.; Sánchez-Carnero, N.; Freire, J.
2011-09-01
The ecological importance of benthic macroalgal communities in coastal ecosystems has been recognised worldwide, and the application of remote sensing to study these communities presents certain advantages with respect to in situ methods. The present study used three CHRIS-PROBA images to analyse the distribution of macroalgal communities in the Seno de Corcubión (NW Spain). The use of this sensor represents a challenge, given that its design, build and deployment programme is intended to follow the "faster, better, cheaper" principle. To assess the application of this sensor to macroalgal mapping, two types of classifications were carried out: Maximum Likelihood and Spectral Angle Mapper (SAM). The Maximum Likelihood classifier showed positive results, reaching overall accuracy percentages higher than 90% and kappa coefficients higher than 0.80 for the bottom classes shallow submerged sand, deep submerged sand, macroalgae at less than 5 m, and macroalgae between 5 and 10 m depth. The differentiation among macroalgal groups using SAM classifications showed positive results for green seaweeds, although the differentiation between brown and red algae was not clear in the study area.
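The SAM rule itself is compact: a pixel is assigned to the class whose reference spectrum makes the smallest spectral angle with it. The sketch below uses toy spectra and made-up class names; the band count and reference spectra are assumptions.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral Angle Mapper: angle (radians) between a pixel spectrum and a reference spectrum."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Toy spectra: classify a pixel as the class whose reference spectrum has the smallest angle.
rng = np.random.default_rng(7)
references = {"green algae": rng.random(18), "brown algae": rng.random(18), "sand": rng.random(18)}
pixel = references["green algae"] + 0.05 * rng.standard_normal(18)
label = min(references, key=lambda name: spectral_angle(pixel, references[name]))
print(label)
```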
Raman, M R Gauthama; Somu, Nivethitha; Kirthivasan, Kannan; Sriram, V S Shankar
2017-08-01
Over the past few decades, the design of an intelligent Intrusion Detection System (IDS) remains an open challenge to the research community. Continuous efforts by researchers have resulted in the development of several learning models based on Artificial Neural Networks (ANN) to improve the performance of IDSs. However, there exists a tradeoff between the stability of the ANN architecture and the detection rate for less frequent attacks. This paper presents a novel approach based on the Helly property of hypergraphs and an Arithmetic Residue-based Probabilistic Neural Network (HG AR-PNN) to address the classification problem in IDS. The Helly property of the hypergraph was exploited for the identification of the optimal feature subset, and the arithmetic residue of the optimal feature subset was used to train the PNN. The performance of HG AR-PNN was evaluated using the KDD CUP 1999 intrusion dataset. Experimental results prove the dominance of the HG AR-PNN classifier over existing classifiers with respect to stability and an improved detection rate for less frequent attacks. Copyright © 2017 Elsevier Ltd. All rights reserved.
Caridakis, G; Karpouzis, K; Drosopoulos, A; Kollias, S
2012-12-01
Modeling and recognizing spatiotemporal, as opposed to static input, is a challenging task since it incorporates input dynamics as part of the problem. The vast majority of existing methods tackle the problem as an extension of the static counterpart, using dynamics, such as input derivatives, at feature level and adopting artificial intelligence and machine learning techniques originally designed for solving problems that do not specifically address the temporal aspect. The proposed approach deals with temporal and spatial aspects of the spatiotemporal domain in a discriminative as well as coupling manner. Self Organizing Maps (SOM) model the spatial aspect of the problem and Markov models its temporal counterpart. Incorporation of adjacency, both in training and classification, enhances the overall architecture with robustness and adaptability. The proposed scheme is validated both theoretically, through an error propagation study, and experimentally, on the recognition of individual signs, performed by different, native Greek Sign Language users. Results illustrate the architecture's superiority when compared to Hidden Markov Model techniques and variations both in terms of classification performance and computational cost. Copyright © 2012 Elsevier Ltd. All rights reserved.
Reflected scatterometry for noninvasive interrogation of bacterial colonies
NASA Astrophysics Data System (ADS)
Kim, Huisung; Doh, Iyll-Joon; Sturgis, Jennifer; Bhunia, Arun K.; Robinson, J. Paul; Bae, Euiwon
2016-10-01
A phenotyping of bacterial colonies on agar plates using forward-scattering diffraction-pattern analysis provided promising classification of several different bacteria such as Salmonella, Vibrio, Listeria, and E. coli. Since the technique is based on forward-scattering phenomena, light transmittance of both the colony and the medium is critical to ensure quality data. However, numerous microorganisms and their growth media allow only limited light penetration and render the forward-scattering measurement a challenging task. For example, yeast, Lactobacillus, mold, and several soil bacteria form colorful and dense colonies that obstruct most of the incoming light passing through them. Moreover, blood agar, which is widely utilized in the clinical field, completely blocks the incident coherent light source used in forward scatterometry. We present a newly designed reflection scatterometer and validation of the resolving power of the instrument. The reflectance-type instrument can acquire backward elastic scatter patterns for both highly opaque media and colonies and has been tested with three different bacterial genera grown on blood agar plates. Cross-validation results show a classification rate above 90% for four genera.
Robust pattern decoding in shape-coded structured light
NASA Astrophysics Data System (ADS)
Tang, Suming; Zhang, Xu; Song, Zhan; Song, Lifang; Zeng, Hai
2017-09-01
Decoding is a challenging and complex problem in a coded structured light system. In this paper, a robust pattern decoding method is proposed for shape-coded structured light, in which the pattern is designed as a grid with embedded geometric shapes. Our decoding method makes advancements at three steps. First, a multi-template feature detection algorithm is introduced to detect the feature points, which are the intersections of pairs of orthogonal grid-lines. Second, pattern element identification is modelled as a supervised classification problem, and a deep neural network is applied for the accurate classification of pattern elements. Before that, a training dataset is established, which contains a large number of pattern elements with various blurring and distortions. Third, an error correction mechanism based on epipolar, coplanarity and topological constraints is presented to reduce false matches. In the experiments, several complex objects, including a human hand, are chosen to test the accuracy and robustness of the proposed method. The experimental results show that our decoding method not only has high decoding accuracy, but is also strongly robust to surface color and complex textures.
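One part of the error-correction stage, the epipolar check, can be sketched as follows: candidate correspondences whose algebraic epipolar residual is large are rejected. The fundamental matrix, point coordinates, and tolerance below are made-up values for illustration only.

```python
import numpy as np

def epipolar_residual(F, x_left, x_right):
    """Algebraic epipolar residual |x_right^T F x_left| for homogeneous image points."""
    return abs(float(x_right @ F @ x_left))

# F would come from calibration of the projector-camera pair (assumed known here).
F = np.array([[0.0, -1e-4, 1e-2],
              [1e-4, 0.0, -2e-2],
              [-1e-2, 2e-2, 0.0]])
candidate_matches = [
    (np.array([120.0, 85.0, 1.0]), np.array([131.0, 86.0, 1.0])),    # consistent match
    (np.array([40.0, 200.0, 1.0]), np.array([10.0, 400.0, 1.0])),    # likely false match
]

# Matches whose residual exceeds a tolerance are discarded before triangulation.
kept = [(xl, xr) for xl, xr in candidate_matches if epipolar_residual(F, xl, xr) < 0.5]
```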
Evolutionary Algorithm Based Feature Optimization for Multi-Channel EEG Classification.
Wang, Yubo; Veluvolu, Kalyana C
2017-01-01
Most BCI systems that rely on EEG signals employ Fourier-based methods for time-frequency decomposition for feature extraction. The band-limited multiple Fourier linear combiner is well suited to such band-limited signals due to its real-time applicability. Despite the improved performance of these techniques in two-channel settings, their application to multi-channel EEG is not straightforward and remains challenging. As more channels become available, a spatial filter is required to eliminate noise and preserve the useful information. Moreover, multi-channel EEG adds high dimensionality to the frequency feature space, so feature selection is required to stabilize the performance of the classifier. In this paper, we develop a new method based on an Evolutionary Algorithm (EA) to solve these two problems simultaneously. The real-valued EA encodes both the spatial filter estimates and the feature selection into its solution and optimizes it with respect to the classification error. Three Fourier-based designs are tested in this paper. Our results show that the combination of the Fourier-based method with the covariance matrix adaptation evolution strategy (CMA-ES) has the best overall performance.
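The joint encoding can be illustrated with a deliberately simple (mu, lambda)-style evolution strategy rather than the CMA-ES used by the authors: each candidate solution concatenates channel weights with feature-selection logits, and its fitness is the cross-validated classification error. All data, dimensions, and hyperparameters below are toy assumptions.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(8)
n_trials, n_channels, n_freq = 120, 8, 16
X = rng.standard_normal((n_trials, n_channels, n_freq))       # per-channel frequency features
y = rng.integers(0, 2, n_trials)
X[y == 1, 2, 5] += 1.0                                         # plant a weak class difference

def fitness(sol):
    """Decode [channel weights | feature logits], build features, return CV error."""
    w = sol[:n_channels]
    mask = sol[n_channels:] > 0                                # positive logit = feature selected
    if not mask.any():
        return 1.0
    feats = np.tensordot(X, w, axes=([1], [0]))[:, mask]       # spatially weighted, then selected
    return 1.0 - cross_val_score(LinearDiscriminantAnalysis(), feats, y, cv=3).mean()

# A very small (mu, lambda) evolution strategy over the joint encoding.
pop = rng.standard_normal((20, n_channels + n_freq))
for _ in range(15):
    errors = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(errors)[:5]]                      # keep the 5 best solutions
    pop = np.repeat(parents, 4, axis=0) + 0.3 * rng.standard_normal((20, n_channels + n_freq))
best = pop[np.argmin([fitness(p) for p in pop])]
```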
A topological approach for protein classification
Cang, Zixuan; Mu, Lin; Wu, Kedi; ...
2015-11-04
Here, protein function and dynamics are closely related to protein sequence and structure. However, prediction of protein function and dynamics from sequence and structure remains a fundamental challenge in molecular biology. Protein classification, which is typically done by measuring the similarity between proteins based on protein sequence or physical information, serves as a crucial step toward the understanding of protein function and dynamics.
Informal settlement classification using point-cloud and image-based features from UAV data
NASA Astrophysics Data System (ADS)
Gevaert, C. M.; Persello, C.; Sliuzas, R.; Vosselman, G.
2017-03-01
Unmanned Aerial Vehicles (UAVs) are capable of providing very high resolution and up-to-date information to support informal settlement upgrading projects. In order to provide accurate basemaps, urban scene understanding through the identification and classification of buildings and terrain is imperative. However, common characteristics of informal settlements such as small, irregular buildings with heterogeneous roof material and large presence of clutter challenge state-of-the-art algorithms. Furthermore, it is of interest to analyse which fundamental attributes are suitable for describing these objects in different geographic locations. This work investigates how 2D radiometric and textural features, 2.5D topographic features, and 3D geometric features obtained from UAV imagery can be integrated to obtain a high classification accuracy in challenging classification problems for the analysis of informal settlements. UAV datasets from informal settlements in two different countries are compared in order to identify salient features for specific objects in heterogeneous urban environments. Findings show that the integration of 2D and 3D features leads to an overall accuracy of 91.6% and 95.2% respectively for informal settlements in Kigali, Rwanda and Maldonado, Uruguay.
Classification of kidney and liver tissue using ultrasound backscatter data
NASA Astrophysics Data System (ADS)
Aalamifar, Fereshteh; Rivaz, Hassan; Cerrolaza, Juan J.; Jago, James; Safdar, Nabile; Boctor, Emad M.; Linguraru, Marius G.
2015-03-01
Ultrasound (US) tissue characterization provides valuable information for the initialization of automatic segmentation algorithms, and can further provide complementary information for the diagnosis of pathologies. US tissue characterization is challenging due to the presence of various types of image artifacts and dependence on the sonographer's skills. One way of overcoming this challenge is to characterize images based on the distribution of the backscatter data derived from the interaction between US waves and tissue. The goal of this work is to classify liver versus kidney tissue in 3D volumetric US data using the distribution of backscattered US data recovered from the end-user displayed B-mode image available in clinical systems. To this end, we first propose the computation of a large set of features based on the homodyned-K distribution of the speckle as well as the correlation coefficients between small patches in 3D images. We then utilize the random forests framework to select the most important features for classification. Experiments on in-vivo 3D US data from nine pediatric patients with hydronephrosis showed an average accuracy of 94% for the classification of liver and kidney tissues, demonstrating the potential of this work to assist in the classification and segmentation of abdominal soft tissue.
C-fuzzy variable-branch decision tree with storage and classification error rate constraints
NASA Astrophysics Data System (ADS)
Yang, Shiueng-Bien
2009-10-01
The C-fuzzy decision tree (CFDT), which is based on the fuzzy C-means algorithm, has recently been proposed. The CFDT is grown by selecting the nodes to be split according to its classification error rate. However, the CFDT design does not consider the classification time taken to classify the input vector. Thus, the CFDT can be improved. We propose a new C-fuzzy variable-branch decision tree (CFVBDT) with storage and classification error rate constraints. The design of the CFVBDT consists of two phases: growing and pruning. The CFVBDT is grown by selecting the nodes to be split according to the classification error rate and the classification time in the decision tree. Additionally, the pruning method selects the nodes to prune based on the storage requirement and the classification time of the CFVBDT. Furthermore, the number of branches of each internal node is variable in the CFVBDT. Experimental results indicate that the proposed CFVBDT outperforms the CFDT and other methods.
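Since the tree's nodes are built around fuzzy C-means clustering, a compact reference implementation may help; the sketch below is a plain fuzzy C-means written with numpy on toy 2-D data, not the CFVBDT growing and pruning procedure itself, and the fuzzifier m, cluster count and iteration limit are assumptions.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy C-means: returns cluster centers and the membership matrix U (n_samples x c)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)                     # memberships of each sample sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # Standard membership update: u_ik proportional to dist_ik^(-2/(m-1)).
        U = 1.0 / (dist ** (2 / (m - 1)) * np.sum(dist ** (-2 / (m - 1)), axis=1, keepdims=True))
    return centers, U

rng = np.random.default_rng(9)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2)), rng.normal((0, 5), 1, (50, 2))])
centers, U = fuzzy_c_means(X)
hard_labels = U.argmax(axis=1)       # a node of a C-fuzzy tree would route samples by membership
```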
32 CFR 2001.12 - Duration of classification.
Code of Federal Regulations, 2011 CFR
2011-07-01
... classification authority shall follow the sequence listed in paragraphs (a)(1)(i), (ii), and (iii) of this... to the sequence in paragraph (a)(1) of this section are as follows: (i) If an original classification... and shall be designated with the following marking, “50X1-HUM;” or (ii) If an original classification...
32 CFR 2001.12 - Duration of classification.
Code of Federal Regulations, 2010 CFR
2010-07-01
... classification authority shall follow the sequence listed in paragraphs (a)(1)(i), (ii), and (iii) of this... to the sequence in paragraph (a)(1) of this section are as follows: (i) If an original classification... and shall be designated with the following marking, “50X1-HUM;” or (ii) If an original classification...
NASA Astrophysics Data System (ADS)
Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y.; Drake, Steven K.; Gucek, Marjan; Suffredini, Anthony F.; Sacks, David B.; Yu, Yi-Kuo
2016-02-01
Correct and rapid identification of microorganisms is the key to the success of many important applications in health and safety, including, but not limited to, infection treatment, food safety, and biodefense. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is challenging correct microbial identification because of the large number of choices present. To properly disentangle candidate microbes, one needs to go beyond apparent morphology or simple 'fingerprinting'; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptidome profiles of microbes to better separate them and by designing an analysis method that yields accurate statistical significance. Here, we present an analysis pipeline that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using MS/MS data of 81 samples, each composed of a single known microorganism, that the proposed pipeline can correctly identify microorganisms at least at the genus and species levels. We have also shown that the proposed pipeline computes accurate statistical significances, i.e., E-values for identified peptides and unified E-values for identified microorganisms. The proposed analysis pipeline has been implemented in MiCId, a freely available software for Microorganism Classification and Identification. MiCId is available for download at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.
NASA Technical Reports Server (NTRS)
Fisher, Kevin; Chang, Chein-I
2009-01-01
Progressive band selection (PBS) reduces spectral redundancy without significant loss of information, thereby reducing hyperspectral image data volume and processing time. Used onboard a spacecraft, it can also reduce image downlink time. PBS prioritizes an image's spectral bands according to priority scores that measure their significance to a specific application. Then it uses one of three methods to select an appropriate number of the most useful bands. Key challenges for PBS include selecting an appropriate criterion to generate band priority scores, and determining how many bands should be retained in the reduced image. The image's Virtual Dimensionality (VD), once computed, is a reasonable estimate of the latter. We describe the major design details of PBS and test PBS in a land classification experiment.
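The abstract does not specify a particular priority criterion, so the sketch below uses per-band variance purely as a stand-in score to show the prioritize-then-truncate structure of PBS; the cube dimensions and the number of retained bands (which PBS would take from the estimated VD) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(10)
cube = rng.random((100, 100, 224))                      # rows x cols x spectral bands (toy data)

# Stand-in priority score: per-band variance (the real criterion is application-specific).
scores = cube.reshape(-1, cube.shape[2]).var(axis=0)
priority = np.argsort(scores)[::-1]                     # bands ordered from most to least useful

n_keep = 30                                             # in PBS this would come from the estimated VD
reduced_cube = cube[:, :, np.sort(priority[:n_keep])]   # progressively reduced image
```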
Lambert, Thomas; Nahler, Alexander; Rohla, Miklos; Reiter, Christian; Grund, Michael; Kammler, Jürgen; Blessberger, Hermann; Kypta, Alexander; Kellermair, Jörg; Schwarz, Stefan; Starnawski, Jennifer A; Lichtenauer, Michael; Weiss, Thomas W; Huber, Kurt; Steinwender, Clemens
2016-10-01
Defining an adequate endpoint for renal denervation trials represents a major challenge. A high inter-individual and intra-individual variability of blood pressure levels as well as a partial or total non-adherence on antihypertensive drugs hamper treatment evaluations after renal denervation. Blood pressure measurements at a single point in time as used as primary endpoint in most clinical trials on renal denervation, might not be sufficient to discriminate between patients who do or do not respond to renal denervation. We compared the traditional responder classification (defined as systolic 24-hour blood pressure reduction of -5mmHg six months after renal denervation) with a novel definition of an ideal respondership (based on a 24h blood pressure reduction at no point in time, one, or all follow-up timepoints). We were able to re-classify almost a quarter of patients. Blood pressure variability was substantial in patients traditionally defined as responders. On the other hand, our novel classification of an ideal respondership seems to be clinically superior in discriminating sustained from pseudo-response to renal denervation. Based on our observations, we recommend that the traditional response classification should be reconsidered and possibly strengthened by using a composite endpoint of 24h-BP reductions at different follow-up-visits. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hoell, Simon; Omenzetter, Piotr
2017-04-01
The increasing demand for carbon-neutral energy in a challenging economic environment is a driving factor for erecting ever larger wind turbines in harsh environments, using novel wind turbine blade (WTB) designs characterized by high flexibility and lower buckling capacity. To counteract the resulting increase in operation and maintenance costs, efficient structural health monitoring systems can be employed to prevent dramatic failures and to schedule maintenance actions according to the true structural state. This paper presents a novel methodology for classifying structural damage using vibrational responses from a single sensor. The method is based on statistical classification using Bayes' theorem and an advanced statistic, which allows the performance to be controlled by varying the number of samples that represent the current state. This is done for multivariate damage-sensitive features (DSFs) defined as partial autocorrelation coefficients (PACCs) estimated from vibrational responses and principal component analysis scores from PACCs. Additionally, optimal DSFs are composed not only for damage classification but also for damage detection based on binary statistical hypothesis testing, where feature selections are found with a fast forward procedure. The method is applied to laboratory experiments with a small-scale WTB with wind-like excitation and non-destructive damage scenarios. The obtained results demonstrate the advantages of the proposed procedure and are promising for future applications of vibration-based structural health monitoring in WTBs.
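The damage-sensitive features can be sketched directly with statsmodels' partial autocorrelation estimator followed by PCA compression; the synthetic response records, lag count, and component count are illustrative assumptions, and the Bayesian classification stage is not reproduced.

```python
import numpy as np
from statsmodels.tsa.stattools import pacf
from sklearn.decomposition import PCA

# Toy vibration responses from a single sensor: one row per measurement record.
rng = np.random.default_rng(11)
records = np.array([np.convolve(rng.standard_normal(2048), np.ones(5) / 5, mode="same")
                    for _ in range(40)])

# Damage-sensitive features: partial autocorrelation coefficients of each record,
# optionally compressed with PCA before the statistical classification step.
paccs = np.array([pacf(r, nlags=20)[1:] for r in records])   # drop lag 0 (always 1)
scores = PCA(n_components=5).fit_transform(paccs)
```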
sw-SVM: sensor weighting support vector machines for EEG-based brain-computer interfaces.
Jrad, N; Congedo, M; Phlypo, R; Rousseau, S; Flamary, R; Yger, F; Rakotomamonjy, A
2011-10-01
In many machine learning applications, like brain-computer interfaces (BCI), high-dimensional sensor array data are available. Sensor measurements are often highly correlated, and the signal-to-noise ratio is not homogeneously spread across sensors. Thus, collected data are highly variable and discrimination tasks are challenging. In this work, we focus on sensor weighting as an efficient tool to improve the classification procedure. We present an approach integrating sensor weighting into the classification framework. Sensor weights are considered as hyper-parameters to be learned by a support vector machine (SVM). The resulting sensor weighting SVM (sw-SVM) is designed to satisfy a margin criterion, that is, the generalization error. Experimental studies on two data sets are presented, a P300 data set and an error-related potential (ErrP) data set. For the P300 data set (BCI competition III), for which a large number of trials is available, the sw-SVM proves to perform equivalently with respect to the ensemble SVM strategy that won the competition. For the ErrP data set, for which a small number of trials are available, the sw-SVM shows superior performance compared to three state-of-the-art approaches. Results suggest that the sw-SVM promises to be useful in event-related potential classification, even with a small number of training trials.
Zhu, Lianzhang; Chen, Leiming; Zhao, Dehai
2017-01-01
Accurate emotion recognition from speech is important for applications like smart health care, smart entertainment, and other smart services. High accuracy emotion recognition from Chinese speech is challenging due to the complexities of the Chinese language. In this paper, we explore how to improve the accuracy of speech emotion recognition, including speech signal feature extraction and emotion classification methods. Five types of features are extracted from a speech sample: mel frequency cepstrum coefficient (MFCC), pitch, formant, short-term zero-crossing rate and short-term energy. By comparing statistical features with deep features extracted by a Deep Belief Network (DBN), we attempt to find the best features to identify the emotion status for speech. We propose a novel classification method that combines DBN and SVM (support vector machine) instead of using only one of them. In addition, a conjugate gradient method is applied to train DBN in order to speed up the training process. Gender-dependent experiments are conducted using an emotional speech database created by the Chinese Academy of Sciences. The results show that DBN features can reflect emotion status better than artificial features, and our new classification approach achieves an accuracy of 95.8%, which is higher than using either DBN or SVM separately. Results also show that DBN can work very well for small training databases if it is properly designed. PMID:28737705
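The hand-crafted feature side of the pipeline can be sketched with librosa and scikit-learn; the synthetic "utterances", label scheme, and the plain SVM (without the DBN feature learning or the conjugate-gradient training discussed in the paper) are assumptions made to keep the example self-contained.

```python
import numpy as np
import librosa
from sklearn.svm import SVC

def clip_features(y, sr):
    """Mean MFCCs plus mean short-term zero-crossing rate and energy for one utterance."""
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)
    zcr = librosa.feature.zero_crossing_rate(y).mean()
    energy = librosa.feature.rms(y=y).mean()
    return np.concatenate([mfcc, [zcr, energy]])

# Synthetic utterances stand in for labeled emotional speech clips.
sr = 16000
rng = np.random.default_rng(12)
clips = [np.sin(2 * np.pi * (200 + 50 * k) * np.arange(sr) / sr) + 0.05 * rng.standard_normal(sr)
         for k in range(20)]
labels = [k % 2 for k in range(20)]                      # e.g. 0 = neutral, 1 = angry

X = np.array([clip_features(c, sr) for c in clips])
clf = SVC(kernel="rbf").fit(X, labels)
```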
Optical Neural Classification Of Binary Patterns
NASA Astrophysics Data System (ADS)
Gustafson, Steven C.; Little, Gordon R.
1988-05-01
Binary pattern classification that may be implemented using optical hardware and neural network algorithms is considered. Pattern classification problems that have no concise description (as in classifying handwritten characters) or no concise computation (as in NP-complete problems) are expected to be particularly amenable to this approach. For example, optical processors that efficiently classify binary patterns in accordance with their Boolean function complexity might be designed. As a candidate for such a design, an optical neural network model is discussed that is designed for binary pattern classification and that consists of an optical resonator with a dynamic multiplex-recorded reflection hologram and a phase conjugate mirror with thresholding and gain. In this model, learning or training examples of binary patterns may be recorded on the hologram such that one bit in each pattern marks the pattern class. Any input pattern, including one with an unknown class or marker bit, will be modified by a large number of parallel interactions with the reflection hologram and nonlinear mirror. After perhaps several seconds and 100 billion interactions, a steady-state pattern may develop with a marker bit that represents a minimum-Boolean-complexity classification of the input pattern. Computer simulations are presented that illustrate progress in understanding the behavior of this model and in developing a processor design that could have commanding and enduring performance advantages compared to current pattern classification techniques.
Xu, Yun; Muhamadali, Howbeer; Sayqal, Ali; Dixon, Neil; Goodacre, Royston
2016-10-28
Partial least squares (PLS) is one of the most commonly used supervised modelling approaches for analysing multivariate metabolomics data. PLS is typically employed as either a regression model (PLS-R) or a classification model (PLS-DA). However, in metabolomics studies it is common to investigate multiple, potentially interacting, factors simultaneously following a specific experimental design. Such data often cannot be considered as a "pure" regression or a classification problem. Nevertheless, these data have often still been treated as a regression or classification problem and this could lead to ambiguous results. In this study, we investigated the feasibility of designing a hybrid target matrix Y that better reflects the experimental design than simple regression or binary class membership coding commonly used in PLS modelling. The new design of Y coding was based on the same principle used by structural modelling in machine learning techniques. Two real metabolomics datasets were used as examples to illustrate how the new Y coding can improve the interpretability of the PLS model compared to classic regression/classification coding.
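The idea of a design-reflecting target matrix can be illustrated with scikit-learn's PLSRegression: instead of a single regression target or one-hot class membership, Y gets one column per experimental factor (plus an interaction column here). The toy data, factor coding, and component count are assumptions and do not reproduce the authors' specific Y construction.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Toy metabolomics matrix: 40 samples x 200 metabolite features from a 2x2 design
# (factor A: control/treated, factor B: timepoint 1/2), 10 samples per cell.
rng = np.random.default_rng(13)
X = rng.standard_normal((40, 200))
a = np.repeat([0, 0, 1, 1], 10)          # factor A level per sample
b = np.repeat([0, 1, 0, 1], 10)          # factor B level per sample
X[a == 1, :20] += 1.0                    # effect of factor A on the first metabolites
X[b == 1, 20:40] += 0.5                  # effect of factor B on the next ones

# Hybrid target matrix: one column per factor plus an interaction column, rather than
# a single regression target or binary class membership coding.
Y = np.column_stack([a, b, a * b]).astype(float)

pls = PLSRegression(n_components=3).fit(X, Y)
loadings = pls.x_weights_                # inspect which features drive each design factor
```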
NASA Astrophysics Data System (ADS)
Sun, Z.; Xu, Y.; Hoegner, L.; Stilla, U.
2018-05-01
In this work, we propose a classification method designed for the labeling of MLS point clouds, with detrended geometric features extracted from the points of the supervoxel-based local context. To achieve the analysis of complex 3D urban scenes, the acquired points of the scene should be tagged with individual labels of different classes. Thus, assigning a unique label to the points of an object that belong to the same category plays an essential role in the entire 3D scene analysis workflow. Although plenty of studies in this field have been reported, this task is still challenging. Specifically, in this work: 1) A novel geometric feature extraction method, detrending the redundant and non-salient information in the local context, is proposed, which is proved to be effective for extracting local geometric features from the 3D scene. 2) Instead of using individual points as the basic element, the supervoxel-based local context is designed to encapsulate the geometric characteristics of points, providing a flexible and robust solution for feature extraction. 3) Experiments using a complex urban scene with manually labeled ground truth are conducted, and the performance of the proposed method is analyzed with respect to different methods.
Bodden, Carina; Siestrup, Sophie; Palme, Rupert; Kaiser, Sylvia; Sachser, Norbert; Richter, S Helene
2018-01-15
According to current guidelines on animal experiments, a prospective assessment of the severity of each procedure is mandatory. However, so far, the classification of procedures into different severity categories mainly relies on theoretic considerations, since it is not entirely clear which of the various procedures compromise the welfare of animals, or, to what extent. Against this background, a systematic empirical investigation of the impact of each procedure, including behavioral testing, seems essential. Therefore, the present study was designed to elucidate the effects of repeated versus single testing on mouse welfare, using one of the most commonly used paradigms for behavioral phenotyping in behavioral neuroscience, the open-field test. In an independent groups design, laboratory mice (Mus musculus f. domestica) experienced either repeated, single, or no open-field testing - procedures that are assigned to different severity categories. Interestingly, testing experiences did not affect fecal corticosterone metabolites, body weights, elevated plus-maze or home cage behavior differentially. Thus, with respect to the assessed endocrinological, physical, and behavioral outcome measures, no signs of compromised welfare could be detected in mice that were tested in the open-field repeatedly, once, or, not at all. These findings challenge current classification guidelines and may, furthermore, stimulate systematic research on the severity of single procedures involving living animals. Copyright © 2017 Elsevier B.V. All rights reserved.
14 CFR 1203.800 - Delegations.
Code of Federal Regulations, 2010 CFR
2010-01-01
... paragraph (b)(3) of the section are authorized to declassify top Secret security classification assignments... which NASA has original classification authority. (b) Designated officials—(1) TOP SECRET Classification... this section. (3) Declassification Authority, Top Secret Assignments over 25 years Old. (i) Agency...
Termination Criteria for Computerized Classification Testing
ERIC Educational Resources Information Center
Thompson, Nathan A.
2011-01-01
Computerized classification testing (CCT) is an approach to designing tests with intelligent algorithms, similar to adaptive testing, but specifically designed for the purpose of classifying examinees into categories such as "pass" and "fail." Like adaptive testing for point estimation of ability, the key component is the…
28 CFR 523.34 - How can I challenge DCEGT award decisions?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 28 Judicial Administration 2 2010-07-01 2010-07-01 false How can I challenge DCEGT award decisions? 523.34 Section 523.34 Judicial Administration BUREAU OF PRISONS, DEPARTMENT OF JUSTICE INMATE ADMISSION, CLASSIFICATION, AND TRANSFER COMPUTATION OF SENTENCE District of Columbia Educational Good Time...
4 CFR 21.5 - Protest issues not for consideration.
Code of Federal Regulations, 2010 CFR
2010-01-01
... official to file a protest or not to file a protest in connection with a public-private competition. [61 FR... business size standards and North American Industry Classification System (NAICS) standards. Challenges of established size standards or the size status of particular firms, and challenges of the selected NAICS code...
28 CFR 523.34 - How can I challenge DCEGT award decisions?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 28 Judicial Administration 2 2011-07-01 2011-07-01 false How can I challenge DCEGT award decisions? 523.34 Section 523.34 Judicial Administration BUREAU OF PRISONS, DEPARTMENT OF JUSTICE INMATE ADMISSION, CLASSIFICATION, AND TRANSFER COMPUTATION OF SENTENCE District of Columbia Educational Good Time...
28 CFR 523.34 - How can I challenge DCEGT award decisions?
Code of Federal Regulations, 2013 CFR
2013-07-01
... 28 Judicial Administration 2 2013-07-01 2013-07-01 false How can I challenge DCEGT award decisions? 523.34 Section 523.34 Judicial Administration BUREAU OF PRISONS, DEPARTMENT OF JUSTICE INMATE ADMISSION, CLASSIFICATION, AND TRANSFER COMPUTATION OF SENTENCE District of Columbia Educational Good Time...
28 CFR 523.34 - How can I challenge DCEGT award decisions?
Code of Federal Regulations, 2014 CFR
2014-07-01
... 28 Judicial Administration 2 2014-07-01 2014-07-01 false How can I challenge DCEGT award decisions? 523.34 Section 523.34 Judicial Administration BUREAU OF PRISONS, DEPARTMENT OF JUSTICE INMATE ADMISSION, CLASSIFICATION, AND TRANSFER COMPUTATION OF SENTENCE District of Columbia Educational Good Time...
28 CFR 523.34 - How can I challenge DCEGT award decisions?
Code of Federal Regulations, 2012 CFR
2012-07-01
... 28 Judicial Administration 2 2012-07-01 2012-07-01 false How can I challenge DCEGT award decisions? 523.34 Section 523.34 Judicial Administration BUREAU OF PRISONS, DEPARTMENT OF JUSTICE INMATE ADMISSION, CLASSIFICATION, AND TRANSFER COMPUTATION OF SENTENCE District of Columbia Educational Good Time...
Towards the use of similarity distances to music genre classification: A comparative study.
Goienetxea, Izaro; Martínez-Otzeta, José María; Sierra, Basilio; Mendialdua, Iñigo
2018-01-01
Music genre classification is a challenging research problem, for which open questions remain regarding the classification approach, music piece representation, distances between/within genres, and so on. In this paper an investigation of the classification of generated music pieces is performed, based on the idea that, if closely related known pieces are grouped into different sets -or clusters- and a new song is then generated automatically so that it is somehow "inspired" by each set, the new song will be more likely to be classified as belonging to the set that inspired it, based on the same distance used to separate the clusters. Different music piece representations and distances among pieces are used; the obtained results are promising and indicate the appropriateness of the approach even in such a subjective area as music genre classification.
Towards the use of similarity distances to music genre classification: A comparative study
Martínez-Otzeta, José María; Sierra, Basilio; Mendialdua, Iñigo
2018-01-01
Music genre classification is a challenging research problem, for which open questions remain regarding the classification approach, music piece representation, distances between/within genres, and so on. In this paper an investigation of the classification of generated music pieces is performed, based on the idea that, if closely related known pieces are grouped into different sets –or clusters– and a new song is then generated automatically so that it is somehow “inspired” by each set, the new song will be more likely to be classified as belonging to the set that inspired it, based on the same distance used to separate the clusters. Different music piece representations and distances among pieces are used; the obtained results are promising and indicate the appropriateness of the approach even in such a subjective area as music genre classification. PMID:29444160
Congenital Differences of the Upper Extremity: Classification and Treatment Principles
2011-01-01
For hand surgeons, the treatment of children with congenital differences of the upper extremity is challenging because of the diverse spectrum of conditions encountered, but the task is also rewarding because it provides surgeons with the opportunity to impact a child's growth and development. An ideal classification of congenital differences of the upper extremity would reflect the full spectrum of morphologic abnormalities and encompass etiology, a guide to treatment, and provide prognoses. In this report, I review current classification systems and discuss their contradictions and limitations. In addition, I present a modified classification system and provide treatment principles. As our understanding of the etiology of congenital differences of the upper extremity increases and as experience of treating difficult cases accumulates, even an ideal classification system and optimal treatment strategies will undoubtedly continue to evolve. PMID:21909463
Challenges in the automated classification of variable stars in large databases
NASA Astrophysics Data System (ADS)
Graham, Matthew; Drake, Andrew; Djorgovski, S. G.; Mahabal, Ashish; Donalek, Ciro
2017-09-01
With ever-increasing numbers of astrophysical transient surveys, new facilities and archives of astronomical time series, time domain astronomy is emerging as a mainstream discipline. However, the sheer volume of data alone - hundreds of observations for hundreds of millions of sources - necessitates advanced statistical and machine learning methodologies for scientific discovery: characterization, categorization, and classification. Whilst these techniques are slowly entering the astronomer's toolkit, their application to astronomical problems is not without its issues. In this paper, we will review some of the challenges posed by trying to identify variable stars in large data collections, including appropriate feature representations, dealing with uncertainties, establishing ground truths, and simple discrete classes.
George E. Host; Carl W. Ramm; Eunice A. Padley; Kurt S. Pregitzer; James B. Hart; David T. Cleland
1992-01-01
Presents technical documentation for development of an Ecological Classification System for the Manistee National Forest in northwest Lower Michigan, and suggests procedures applicable to other ecological land classification projects. Includes discussion of sampling design, field data collection, data summarization and analyses, development of classification units,...
EOG and EMG: two important switches in automatic sleep stage classification.
Estrada, E; Nazeran, H; Barragan, J; Burk, J R; Lucas, E A; Behbehani, K
2006-01-01
Sleep is a natural periodic state of rest for the body, in which the eyes are usually closed and consciousness is completely or partially lost. In this investigation we used the EOG and EMG signals acquired from 10 patients undergoing overnight polysomnography with their sleep stages determined by expert sleep specialists based on RK rules. Differentiation between Stage 1, Awake and REM stages challenged a well-trained neural network classifier to distinguish between classes when only EEG-derived signal features were used. To meet this challenge and improve the classification rate, extra features extracted from EOG and EMG signals were fed to the classifier. In this study, two simple feature extraction algorithms were applied to EOG and EMG signals. The statistics of the results were calculated and displayed in an easy-to-visualize fashion to observe tendencies for each sleep stage. Inclusion of these features shows great promise for improving the classification rate towards the target rate of 100%.
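To make the feature-augmentation idea above concrete, the following minimal Python sketch appends simple per-epoch EOG/EMG statistics to precomputed EEG features before training a small neural-network classifier. The choice of statistics, the network size, and all names are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch (not the authors' code): appending simple EOG/EMG statistics
# to EEG-derived features before training a neural-network sleep-stage classifier.
import numpy as np
from sklearn.neural_network import MLPClassifier

def channel_stats(sig):
    """Simple per-epoch statistics for one signal channel (hypothetical feature set)."""
    return np.array([sig.mean(), sig.std(), np.abs(np.diff(sig)).mean()])

def build_features(eeg_feats, eog_epochs, emg_epochs):
    """Concatenate precomputed EEG features with EOG/EMG statistics, epoch by epoch."""
    extra = np.array([np.hstack([channel_stats(eog), channel_stats(emg)])
                      for eog, emg in zip(eog_epochs, emg_epochs)])
    return np.hstack([eeg_feats, extra])

# eeg_feats: (n_epochs, n_eeg_features); eog_epochs/emg_epochs: lists of 1-D arrays
# y: expert sleep-stage labels (Awake, Stage 1, REM, ...)
# clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000).fit(build_features(eeg_feats, eog_epochs, emg_epochs), y)
```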
Age and gender classification in the wild with unsupervised feature learning
NASA Astrophysics Data System (ADS)
Wan, Lihong; Huo, Hong; Fang, Tao
2017-03-01
Inspired by unsupervised feature learning (UFL) within the self-taught learning framework, we propose a method based on UFL, convolution representation, and part-based dimensionality reduction to handle facial age and gender classification, which are two challenging problems under unconstrained circumstances. First, UFL is introduced to learn selective receptive fields (filters) automatically by applying whitening transformation and spherical k-means on random patches collected from unlabeled data. The learning process is fast and has no hyperparameters to tune. Then, the input image is convolved with these filters to obtain filtering responses on which local contrast normalization is applied. Average pooling and feature concatenation are then used to form global face representation. Finally, linear discriminant analysis with part-based strategy is presented to reduce the dimensions of the global representation and to improve classification performances further. Experiments on three challenging databases, namely, Labeled faces in the wild, Gallagher group photos, and Adience, demonstrate the effectiveness of the proposed method relative to that of state-of-the-art approaches.
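The filter-learning step described above (whitening followed by spherical k-means on random patches) can be sketched compactly. The version below is a simplified illustration under assumed patch shapes and parameter values, not the authors' code.

```python
# A simplified sketch of unsupervised filter learning via ZCA whitening + spherical k-means
# on random patches, as described in the abstract above; parameters are illustrative.
import numpy as np

def learn_filters(patches, n_filters=32, n_iter=10, eps=1e-2):
    """patches: (n_patches, patch_dim) random patches drawn from unlabeled images."""
    X = patches - patches.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    U, S, _ = np.linalg.svd(cov)
    W = U @ np.diag(1.0 / np.sqrt(S + eps)) @ U.T          # ZCA whitening matrix
    Xw = X @ W
    D = Xw[np.random.choice(len(Xw), n_filters, replace=False)]
    D /= np.linalg.norm(D, axis=1, keepdims=True)           # unit-norm initial filters
    for _ in range(n_iter):                                  # spherical k-means updates
        labels = (Xw @ D.T).argmax(axis=1)
        for k in range(n_filters):
            members = Xw[labels == k]
            if len(members):
                D[k] = members.sum(axis=0)
        D /= np.linalg.norm(D, axis=1, keepdims=True) + 1e-12
    return W, D   # whitening matrix and learned filters (receptive fields)
```

The learned filters would then be convolved with each image, followed by local contrast normalization and average pooling, as the abstract describes.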
77 FR 30087 - Air Quality Designations for the 2008 Ozone National Ambient Air Quality Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-21
...This rule establishes initial air quality designations for most areas in the United States, including areas of Indian country, for the 2008 primary and secondary national ambient air quality standards (NAAQS) for ozone. The designations for several counties in Illinois, Indiana, and Wisconsin that the EPA is considering for inclusion in the Chicago nonattainment area will be designated in a subsequent action, no later than May 31, 2012. Areas designated as nonattainment are also being classified by operation of law according to the severity of their air quality problems. The classification categories are Marginal, Moderate, Serious, Severe, and Extreme. The EPA is establishing the air quality thresholds that define the classifications in a separate rule that the EPA is signing and publishing in the Federal Register on the same schedule as these designations. In accordance with that separate rule, six nonattainment areas in California are being reclassified to a higher classification.
Takasaki, Hiroshi; Okuyama, Kousuke; Rosedale, Richard
2017-02-01
Mechanical Diagnosis and Therapy (MDT) is used in the treatment of extremity problems. Classifying clinical problems is one method of providing effective treatment to a target population. Classification reliability is a key factor to determine the precise clinical problem and to direct an appropriate intervention. To explore inter-examiner reliability of the MDT classification for extremity problems in three reliability designs: 1) vignette reliability using surveys with patient vignettes, 2) concurrent reliability, where multiple assessors decide a classification by observing someone's assessment, 3) successive reliability, where multiple assessors independently assess the same patient at different times. Systematic review with data synthesis in a quantitative format. Agreement of MDT subgroups was examined using the Kappa value, with the operational definition of acceptable reliability set at ≥ 0.6. The level of evidence was determined considering the methodological quality of the studies. Six studies were included and all studies met the criteria for high quality. Kappa values for the vignette reliability design (five studies) were ≥ 0.7. There was data from two cohorts in one study for the concurrent reliability design and the Kappa values ranged from 0.45 to 1.0. Kappa values for the successive reliability design (data from three cohorts in one study) were < 0.6. The current review found strong evidence of acceptable inter-examiner reliability of MDT classification for extremity problems in the vignette reliability design, limited evidence of acceptable reliability in the concurrent reliability design and unacceptable reliability in the successive reliability design. Copyright © 2017 Elsevier Ltd. All rights reserved.
From landscape to domain: Soils role in landscape classifications
USDA-ARS?s Scientific Manuscript database
Soil landscape classifications are designed to divide landscapes into units with significance for the provisioning and regulating of ecosystem services and the development of conservation plans for natural resources. More specifically, such classifications serve as the basis for stratifying manageme...
ESTCP Pilot Program - Classification Approaches in Munitions Response
2008-11-17
Electromagnetic induction sensors detect ferrous and nonferrous metallic objects and can be effective in geology that challenges magnetometers. EM...harmless metallic objects or geology. Application of technology to separate the munitions from other objects, known as classification, offers the potential...detectable signals are excavated. Many of these detections do not correspond to munitions, but rather to other harmless metallic objects or geology, termed
A Social Media Based Index of Mental Well-Being in College Campuses.
Bagroy, Shrey; Kumaraguru, Ponnurangam; De Choudhury, Munmun
2017-05-01
Psychological distress in the form of depression, anxiety and other mental health challenges among college students is a growing health concern. Dearth of accurate, continuous, and multi-campus data on mental well-being presents significant challenges to intervention and mitigation efforts in college campuses. We examine the potential of social media as a new "barometer" for quantifying the mental well-being of college populations. Utilizing student-contributed data in Reddit communities of over 100 universities, we first build and evaluate a transfer learning based classification approach that can detect mental health expressions with 97% accuracy. Thereafter, we propose a robust campus-specific Mental Well-being Index: MWI. We find that MWI is able to reveal meaningful temporal patterns of mental well-being in campuses, and to assess how their expressions relate to university attributes like size, academic prestige, and student demographics. We discuss the implications of our work for improving counselor efforts, and in the design of tools that can enable better assessment of the mental health climate of college campuses.
Multi-Temporal Classification and Change Detection Using Uav Images
NASA Astrophysics Data System (ADS)
Makuti, S.; Nex, F.; Yang, M. Y.
2018-05-01
In this paper different methodologies for the classification and change detection of UAV image blocks are explored. UAV is not only the cheapest platform for image acquisition but it is also the easiest platform to operate in repeated data collections over a changing area like a building construction site. Two change detection techniques have been evaluated in this study: the pre-classification and the post-classification algorithms. These methods are based on three main steps: feature extraction, classification and change detection. A set of state-of-the-art features has been used in the tests: colour features (HSV), textural features (GLCM) and 3D geometric features. For classification purposes Conditional Random Field (CRF) has been used: the unary potential was determined using the Random Forest algorithm while the pairwise potential was defined by the fully connected CRF. In the performed tests, different feature configurations and settings have been considered to assess the performance of these methods in such a challenging task. Experimental results showed that the post-classification approach outperforms the pre-classification change detection method. This was analysed using the overall accuracy, whereby the post-classification approach achieved an accuracy of up to 62.6% and the pre-classification change detection an accuracy of 46.5%. These results represent a first useful indication for future work and developments.
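The post-classification idea above can be illustrated with a short sketch: classify each epoch independently from a per-pixel feature stack, then flag pixels whose labels differ. A Random Forest stands in for the paper's full CRF pipeline, and all names are assumptions.

```python
# A minimal sketch of post-classification change detection: classify each epoch
# separately, then compare the label maps pixel by pixel. Illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def classify_epoch(features, train_mask, train_labels):
    """features: (H, W, F) per-pixel stack (e.g., HSV + GLCM + 3-D features)."""
    H, W, F = features.shape
    X = features.reshape(-1, F)
    rf = RandomForestClassifier(n_estimators=200, n_jobs=-1)
    rf.fit(X[train_mask.ravel()], train_labels)
    return rf.predict(X).reshape(H, W)

def change_map(labels_t1, labels_t2):
    """Binary map: True where the class label changed between the two epochs."""
    return labels_t1 != labels_t2
```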
Overview of classification systems in peripheral artery disease.
Hardman, Rulon L; Jazaeri, Omid; Yi, J; Smith, M; Gupta, Rajan
2014-12-01
Peripheral artery disease (PAD), secondary to atherosclerotic disease, is currently the leading cause of morbidity and mortality in the western world. While PAD is common, it is estimated that the majority of patients with PAD are undiagnosed and undertreated. The challenge to the treatment of PAD is to accurately diagnose the symptoms and determine treatment for each patient. The varied presentations of peripheral vascular disease have led to numerous classification schemes throughout the literature. Consistent grading of patients leads to both objective criteria for treating patients and a baseline for clinical follow-up. Reproducible classification systems are also important in clinical trials and when comparing medical, surgical, and endovascular treatment paradigms. This article reviews the various classification systems for PAD and advantages to each system.
Boroda, A M
2004-03-01
Current clinical gynecology considers pathological states of endometrium (PSE) as one of the most challenging issues of the day. Many questions of etiology, pathogenesis, diagnostics, and treatment of PSE are still under discussion. There is currently no single generally agreed classification of PSE. Morphological classification remains the most widely used one, but the morphological changes occurring in the endometrium do not reflect the wide variety of disorders related to these pathological states. A new clinicopathogenetic classification of PSE was proposed, which is based on choosing the optimal treatment with the functional state of the disease taken into account. This classification helps to perceive the problem as a whole and to choose functionally based treatment for each patient.
Mapping ecological states in a complex environment
NASA Astrophysics Data System (ADS)
Steele, C. M.; Bestelmeyer, B.; Burkett, L. M.; Ayers, E.; Romig, K.; Slaughter, A.
2013-12-01
The vegetation of northern Chihuahuan Desert rangelands is sparse, heterogeneous and for most of the year, consists of a large proportion of non-photosynthetic material. The soils in this area are spectrally bright and variable in their reflectance properties. Both factors provide challenges to the application of remote sensing for estimating canopy variables (e.g., leaf area index, biomass, percentage canopy cover, primary production). Additionally, with reference to current paradigms of rangeland health assessment, remotely-sensed estimates of canopy variables have limited practical use to the rangeland manager if they are not placed in the context of ecological site and ecological state. To address these challenges, we created a multifactor classification system based on the USDA-NRCS ecological site schema and associated state-and-transition models to map ecological states on desert rangelands in southern New Mexico. Applying this system using per-pixel image processing techniques and multispectral, remotely sensed imagery raised other challenges. Per-pixel image classification relies upon the spectral information in each pixel alone; there is no reference to the spatial context of the pixel and its relationship with its neighbors. Ecological state classes may have direct relevance to managers but the non-unique spectral properties of different ecological state classes in our study area mean that per-pixel classification of multispectral data performs poorly in discriminating between different ecological states. We found that image interpreters who are familiar with the landscape and its associated ecological site descriptions perform better than per-pixel classification techniques in assigning ecological states. However, two important issues affect manual classification methods: subjectivity of interpretation and reproducibility of results. An alternative to per-pixel classification and manual interpretation is object-based image analysis. Object-based image analysis provides a platform for classification that more closely resembles human recognition of objects within a remotely sensed image. The analysis presented here compares multiple thematic maps created for test locations on the USDA-ARS Jornada Experimental Range ranch. Three study sites in different pastures, each 300 ha in size, were selected for comparison on the basis of their ecological site type ('Clayey', 'Sandy' and a combination of both) and the degree of complexity of vegetation cover. Thematic maps were produced for each study site using (i) manual interpretation of digital aerial photography (by five independent interpreters); (ii) object-oriented, decision-tree classification of fine and moderate spatial resolution imagery (Quickbird; Landsat Thematic Mapper) and (iii) ground survey. To identify areas of uncertainty, we compared agreement in location, areal extent and class assignation between 5 independently produced, manually-digitized ecological state maps and with the map created from ground survey. Location, areal extent and class assignation of the map produced by object-oriented classification was also assessed with reference to the ground survey map.
An Annotated Bibliography on Operator Mental Workload Assessment
1980-03-26
The descriptors associated with each citation designate the general workload classification, the specific workload classification, the type of...systems, with all of their advanced sensors and avionics, must be compatible with the capabilities and limitations of the aircrew. During the design ...constructs or models was included only if mental workload was at least potentially assessable from the constructs or models. C. Experimental design. A
LUNGx Challenge for computerized lung nodule classification
Armato, Samuel G.; Drukker, Karen; Li, Feng; ...
2016-12-19
The purpose of this work is to describe the LUNGx Challenge for the computerized classification of lung nodules on diagnostic computed tomography (CT) scans as benign or malignant and report the performance of participants’ computerized methods along with that of six radiologists who participated in an observer study performing the same Challenge task on the same dataset. The Challenge provided sets of calibration and testing scans, established a performance assessment process, and created an infrastructure for case dissemination and result submission. We present ten groups that applied their own methods to 73 lung nodules (37 benign and 36 malignant) that were selected to achieve approximate size matching between the two cohorts. Area under the receiver operating characteristic curve (AUC) values for these methods ranged from 0.50 to 0.68; only three methods performed statistically better than random guessing. The radiologists’ AUC values ranged from 0.70 to 0.85; three radiologists performed statistically better than the best-performing computer method. The LUNGx Challenge compared the performance of computerized methods in the task of differentiating benign from malignant lung nodules on CT scans, placed in the context of the performance of radiologists on the same task. Lastly, the continued public availability of the Challenge cases will provide a valuable resource for the medical imaging research community.
LUNGx Challenge for computerized lung nodule classification
Armato, Samuel G.; Drukker, Karen; Li, Feng; Hadjiiski, Lubomir; Tourassi, Georgia D.; Engelmann, Roger M.; Giger, Maryellen L.; Redmond, George; Farahani, Keyvan; Kirby, Justin S.; Clarke, Laurence P.
2016-01-01
Abstract. The purpose of this work is to describe the LUNGx Challenge for the computerized classification of lung nodules on diagnostic computed tomography (CT) scans as benign or malignant and report the performance of participants’ computerized methods along with that of six radiologists who participated in an observer study performing the same Challenge task on the same dataset. The Challenge provided sets of calibration and testing scans, established a performance assessment process, and created an infrastructure for case dissemination and result submission. Ten groups applied their own methods to 73 lung nodules (37 benign and 36 malignant) that were selected to achieve approximate size matching between the two cohorts. Area under the receiver operating characteristic curve (AUC) values for these methods ranged from 0.50 to 0.68; only three methods performed statistically better than random guessing. The radiologists’ AUC values ranged from 0.70 to 0.85; three radiologists performed statistically better than the best-performing computer method. The LUNGx Challenge compared the performance of computerized methods in the task of differentiating benign from malignant lung nodules on CT scans, placed in the context of the performance of radiologists on the same task. The continued public availability of the Challenge cases will provide a valuable resource for the medical imaging research community. PMID:28018939
LUNGx Challenge for computerized lung nodule classification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Armato, Samuel G.; Drukker, Karen; Li, Feng
The purpose of this work is to describe the LUNGx Challenge for the computerized classification of lung nodules on diagnostic computed tomography (CT) scans as benign or malignant and report the performance of participants’ computerized methods along with that of six radiologists who participated in an observer study performing the same Challenge task on the same dataset. The Challenge provided sets of calibration and testing scans, established a performance assessment process, and created an infrastructure for case dissemination and result submission. We present ten groups that applied their own methods to 73 lung nodules (37 benign and 36 malignant) that were selected to achieve approximate size matching between the two cohorts. Area under the receiver operating characteristic curve (AUC) values for these methods ranged from 0.50 to 0.68; only three methods performed statistically better than random guessing. The radiologists’ AUC values ranged from 0.70 to 0.85; three radiologists performed statistically better than the best-performing computer method. The LUNGx Challenge compared the performance of computerized methods in the task of differentiating benign from malignant lung nodules on CT scans, placed in the context of the performance of radiologists on the same task. Lastly, the continued public availability of the Challenge cases will provide a valuable resource for the medical imaging research community.
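The AUC comparisons reported above can be reproduced in spirit with a short sketch: compute each method's AUC and use a bootstrap interval to check whether it beats random guessing (AUC = 0.5). This is illustrative only and is not the Challenge's official scoring code; all names are assumptions.

```python
# A small illustrative sketch: AUC with a bootstrap confidence interval, used to judge
# whether a scoring method performs better than chance on a benign/malignant task.
import numpy as np
from sklearn.metrics import roc_auc_score

def auc_with_bootstrap(y_true, scores, n_boot=2000, seed=0):
    y_true, scores = np.asarray(y_true), np.asarray(scores)
    rng = np.random.default_rng(seed)
    auc = roc_auc_score(y_true, scores)
    boots = []
    n = len(y_true)
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        if len(np.unique(y_true[idx])) < 2:
            continue  # a resample must contain both classes
        boots.append(roc_auc_score(y_true[idx], scores[idx]))
    lo, hi = np.percentile(boots, [2.5, 97.5])
    return auc, (lo, hi)   # "better than chance" if the interval excludes 0.5
```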
Cox, Emily; Martin, Bradley C; Van Staa, Tjeerd; Garbe, Edeltraut; Siebert, Uwe; Johnson, Michael L
2009-01-01
The goal of comparative effectiveness analysis is to examine the relationship between two variables: treatment or exposure, and effectiveness or outcome. Unlike data obtained through randomized controlled trials, researchers face greater challenges with causal inference in observational studies. Recognizing these challenges, a task force was formed to develop a guidance document on methodological approaches to address these biases. The task force was commissioned and a Chair was selected by the International Society for Pharmacoeconomics and Outcomes Research Board of Directors in October 2007. This report, the second of three reported in this issue of the Journal, discusses the inherent biases when using secondary data sources for comparative effectiveness analysis and provides methodological recommendations to help mitigate these biases. The task force report provides recommendations and tools for researchers to mitigate threats to validity from bias and confounding in measurement of exposure and outcome. Recommendations on study design included: the need for a data analysis plan with causal diagrams; detailed attention to classification bias in the definition of exposure and clinical outcome; careful and appropriate use of restriction; and extreme care to identify and control for confounding factors, including time-dependent confounding. Design of nonrandomized studies of comparative effectiveness faces several daunting issues, including measurement of exposure and outcome challenged by misclassification and confounding. Causal diagrams and restriction are two techniques that can improve the theoretical basis for analyzing treatment effects in study populations of more homogeneity, with reduced loss of generalizability.
Filtering big data from social media--Building an early warning system for adverse drug reactions.
Yang, Ming; Kiang, Melody; Shang, Wei
2015-04-01
Adverse drug reactions (ADRs) are believed to be a leading cause of death in the world. Pharmacovigilance systems are aimed at early detection of ADRs. With the popularity of social media, Web forums and discussion boards have become important sources of data for consumers to share their drug use experience, and as a result may provide useful information on drugs and their adverse reactions. In this study, we propose an automated ADR-related post filtering mechanism using text classification methods. In real-life settings, ADR-related messages are highly distributed in social media, while non-ADR-related messages are unspecific and topically diverse. It is expensive to manually label a large amount of ADR-related messages (positive examples) and non-ADR-related messages (negative examples) to train classification systems. To mitigate this challenge, we examine the use of a partially supervised learning classification method to automate the process. We propose a novel pharmacovigilance system leveraging a Latent Dirichlet Allocation modeling module and a partially supervised classification approach. We select drugs with more than 500 threads of discussion, and collect all the original posts and comments on these drugs using an automatic Web spidering program as the text corpus. Various classifiers were trained by varying the number of positive examples and the number of topics. The trained classifiers were applied to 3000 posts published over 60 days. Top-ranked posts from each classifier were pooled and the resulting set of 300 posts was reviewed by a domain expert to evaluate the classifiers. Compared to alternative approaches using supervised learning methods and three general purpose partially supervised learning methods, our approach performs significantly better in terms of precision, recall, and the F measure (the harmonic mean of precision and recall), based on a computational experiment using online discussion threads from Medhelp. Our design provides satisfactory performance in identifying ADR-related posts for post-marketing drug surveillance. The overall design of our system also points out a potentially fruitful direction for building other early warning systems that need to filter big data from social media networks. Copyright © 2015 Elsevier Inc. All rights reserved.
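A highly simplified sketch of the pipeline described above is given below: LDA topic features for forum posts, then a classifier trained from a small labeled positive set with the unlabeled posts treated as provisional negatives. This is a crude surrogate for the paper's partially supervised method; names and parameters are illustrative.

```python
# A simplified sketch (not the authors' system): LDA topic features + a classifier
# trained on a few positives and provisional negatives, used to rank candidate posts.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression

def rank_candidate_posts(labeled_pos, unlabeled, n_topics=50):
    texts = labeled_pos + unlabeled
    counts = CountVectorizer(max_features=5000, stop_words="english").fit_transform(texts)
    topics = LatentDirichletAllocation(n_components=n_topics, random_state=0).fit_transform(counts)
    # Treat unlabeled posts as provisional negatives (a rough PU-learning stand-in).
    y = [1] * len(labeled_pos) + [0] * len(unlabeled)
    clf = LogisticRegression(max_iter=1000).fit(topics, y)
    # Rank the unlabeled posts by predicted probability of being ADR-related.
    return clf.predict_proba(topics[len(labeled_pos):])[:, 1]
```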
49 CFR 1248.100 - Commodity classification designated.
Code of Federal Regulations, 2010 CFR
2010-10-01
... STATISTICS Commodity Code § 1248.100 Commodity classification designated. Commencing with reports for the..., reports of commodity statistics required to be made to the Board, shall be based on the commodity codes... Statistics, 1963, issued by the Bureau of the Budget, and on additional codes 411 through 462 shown in § 1248...
Challenges of a community based pragmatic, randomised controlled trial of weight loss maintenance.
Randell, Elizabeth; McNamara, Rachel; Shaw, Christine; Espinasse, Aude; Simpson, Sharon Anne
2015-12-18
Randomised controlled trials (RCTs) have a reputation for being inherently difficult to deliver as planned and often face unforeseen challenges and delays, particularly in relation to organisational and governance difficulties, participant interest, constraints due to allocation of costs, local investigator interest and lengthy bureaucracy. Recruitment is often difficult and the challenges faced often impact on the cost and delivery of a successful trial within the funded period. This paper reflects upon the challenges faced in delivering a pragmatic RCT of weight loss maintenance in a community setting and suggests some potential solutions. The weight loss maintenance in adults trial aimed to evaluate the impact of a 12 month, individually tailored weight maintenance intervention on BMI 3 years from randomisation. Participants were recruited primarily from participant identification centres (PICs)-GP surgeries, exercise on referral schemes and slimming world. The intervention was delivered in community settings. A recruitment strategy implementation plan was drafted to address and monitor poor recruitment. Delays in opening and recruitment were experienced early on. Some were beyond the control of the study team such as; disagreement over allocation of national health service costs and PIC classification as well as difficulties in securing support from research networks. That the intervention was delivered in community settings was often at the root of these issues. Key items to address at the design stage of future trials include feasibility of eligibility criteria. The most effective element of the recruitment implementation plan was to refocus sources of recruitment and target only those who could fulfil the eligibility criteria immediately. Learnings from this trial should be kept in mind by those designing similar studies in the future. Considering potential governance, cost and research network support implications at the design stage of pragmatic trials of any community-based complex intervention is paramount. The appropriateness and viability of inclusion criteria also require careful consideration as does use of a targeted advertising strategy. ISRCTN35774128, 12/01/2010.
Differences in forest area classification based on tree tally from variable- and fixed-radius plots
David Azuma; Vicente J. Monleon
2011-01-01
In forest inventory, it is not enough to formulate a definition; it is also necessary to define the "measurement procedure." In the classification of forestland by dominant cover type, the measurement design (the plot) can affect the outcome of the classification. We present results of a simulation study comparing classification of the dominant cover type...
NASA Astrophysics Data System (ADS)
Wozniak, Breann M.
The purpose of this study was to examine the effect of process-oriented guided-inquiry learning (POGIL) on non-majors college biology students' understanding of biological classification. This study addressed an area of science instruction, POGIL in the non-majors college biology laboratory, which has yet to be qualitatively and quantitatively researched. A concurrent triangulation mixed methods approach was used. Students' understanding of biological classification was measured in two areas: scores on pre and posttests (consisting of 11 multiple choice questions), and conceptions of classification as elicited in pre and post interviews and instructor reflections. Participants were Minnesota State University, Mankato students enrolled in BIOL 100 Summer Session. One section was taught with the traditional curriculum (n = 6) and the other section in the POGIL curriculum (n = 10) developed by the researcher. Three students from each section were selected to take part in pre and post interviews. There were no significant differences within each teaching method (p < .05). There was a tendency of difference in the means. The POGIL group may have scored higher on the posttest (M = 8.830 +/- .477 vs. M = 7.330 +/- .330; z =-1.729, p = .084) and the traditional group may have scored higher on the pretest than the posttest (M = 8.333 +/- .333 vs M = 7.333 +/- .333; z = -1.650 , p = .099). Two themes emerged after the interviews and instructor reflections: 1) After instruction students had a more extensive understanding of classification in three areas: vocabulary terms, physical characteristics, and types of evidence used to classify. Both groups extended their understanding, but only POGIL students could explain how molecular evidence is used in classification. 2) The challenges preventing students from understanding classification were: familiar animal categories and aquatic habitats, unfamiliar organisms, combining and subdividing initial groupings, and the hierarchical nature of classification. The POGIL students were the only group to surpass these challenges after the teaching intervention. This study shows that POGIL is an effective technique at eliciting students' misconceptions, and addressing these misconceptions, leading to an increase in student understanding of biological classification.
Classification of Regional Ionospheric Disturbances Based on Support Vector Machines
NASA Astrophysics Data System (ADS)
Begüm Terzi, Merve; Arikan, Feza; Arikan, Orhan; Karatay, Secil
2016-07-01
The ionosphere is an anisotropic, inhomogeneous, time-varying and spatio-temporally dispersive medium whose parameters can almost always be estimated only through indirect measurements. Geomagnetic, gravitational, solar or seismic activities cause variations of the ionosphere at various spatial and temporal scales. This complex spatio-temporal variability is challenging to identify because disturbances span extensive scales in period, duration, amplitude and frequency. Since geomagnetic and solar indices such as Disturbance storm time (Dst), F10.7 solar flux, Sun Spot Number (SSN), Auroral Electrojet (AE), Kp and W-index provide information about variability on a global scale, identification and classification of regional disturbances poses a challenge. The main aim of this study is to identify the regional effects of global geomagnetic storms and classify them according to their risk levels. For this purpose, Total Electron Content (TEC) estimated from GPS receivers, which is one of the major parameters of the ionosphere, is used together with solar and geomagnetic indices to model the regional and local variability that differs from global activity. In this work, for the automated classification of regional disturbances, a classification technique based on Support Vector Machines (SVM), a robust and widely used machine learning method, is proposed. SVM is a supervised learning model for classification with an associated learning algorithm that analyzes the data and recognizes patterns. In addition to performing linear classification, SVM can efficiently perform nonlinear classification by embedding the data into higher-dimensional feature spaces. Performance of the developed classification technique is demonstrated for the midlatitude ionosphere over Anatolia using TEC estimates generated from GPS data provided by the Turkish National Permanent GPS Network (TNPGN-Active) for the solar maximum year of 2011. By applying the developed classification technique to the Global Ionospheric Map (GIM) TEC data provided by the NASA Jet Propulsion Laboratory (JPL), it is shown that SVM can be a suitable learning method to detect anomalies in TEC variations. This study is supported by TUBITAK 114E541 project as a part of the Scientific and Technological Research Projects Funding Program (1001).
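The SVM classification step described above can be sketched in a few lines, assuming TEC-derived feature vectors and quiet/disturbed (or risk-level) labels have already been prepared; all names and parameter values are hypothetical.

```python
# A minimal sketch of an RBF-kernel SVM for labeling TEC-derived feature vectors.
# Feature extraction from GPS-TEC time series is assumed to be done beforehand.
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def train_tec_classifier(X_tec_features, y_disturbance_class):
    """X: (n_samples, n_features) TEC features; y: disturbance/risk-level labels."""
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
    scores = cross_val_score(model, X_tec_features, y_disturbance_class, cv=5)
    model.fit(X_tec_features, y_disturbance_class)
    return model, scores.mean()   # fitted model and cross-validated accuracy
```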
Schneider, Bruce A.; Avivi-Reich, Meital; Mozuraitis, Mindaugas
2015-01-01
A number of statistical textbooks recommend using an analysis of covariance (ANCOVA) to control for the effects of extraneous factors that might influence the dependent measure of interest. However, it is not generally recognized that serious problems of interpretation can arise when the design contains comparisons of participants sampled from different populations (classification designs). Designs that include a comparison of younger and older adults, or a comparison of musicians and non-musicians, are examples of classification designs. In such cases, estimates of differences among groups can be contaminated by differences in the covariate population means across groups. A second problem of interpretation will arise if the experimenter fails to center the covariate measures (subtracting the mean covariate score from each covariate score) whenever the design contains within-subject factors. Unless the covariate measures on the participants are centered, estimates of within-subject factors are distorted, and significant increases in Type I error rates, and/or losses in power can occur when evaluating the effects of within-subject factors. This paper: (1) alerts potential users of ANCOVA to the need to center the covariate measures when the design contains within-subject factors, and (2) indicates how they can avoid biases when one cannot assume that the expected value of the covariate measure is the same for all of the groups in a classification design. PMID:25954230
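The centering step recommended above is simple to show in code: subtract the grand mean of the covariate before fitting a model with within-subject factors. The column names below are hypothetical.

```python
# A minimal sketch of grand-mean centering a covariate, as recommended above.
import pandas as pd

def center_covariate(df: pd.DataFrame, covariate: str) -> pd.DataFrame:
    out = df.copy()
    out[covariate + "_centered"] = out[covariate] - out[covariate].mean()
    return out

# e.g., df = center_covariate(df, "hearing_threshold")
# then use "hearing_threshold_centered" as the covariate in the repeated-measures model.
```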
Reduction from cost-sensitive ordinal ranking to weighted binary classification.
Lin, Hsuan-Tien; Li, Ling
2012-05-01
We present a reduction framework from ordinal ranking to binary classification. The framework consists of three steps: extracting extended examples from the original examples, learning a binary classifier on the extended examples with any binary classification algorithm, and constructing a ranker from the binary classifier. Based on the framework, we show that a weighted 0/1 loss of the binary classifier upper-bounds the mislabeling cost of the ranker, both error-wise and regret-wise. Our framework allows not only the design of good ordinal ranking algorithms based on well-tuned binary classification approaches, but also the derivation of new generalization bounds for ordinal ranking from known bounds for binary classification. In addition, our framework unifies many existing ordinal ranking algorithms, such as perceptron ranking and support vector ordinal regression. When compared empirically on benchmark data sets, some of our newly designed algorithms enjoy advantages in terms of both training speed and generalization performance over existing algorithms. In addition, the newly designed algorithms lead to better cost-sensitive ordinal ranking performance, as well as improved listwise ranking performance.
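The three-step reduction described above can be sketched compactly: build threshold-extended binary examples, train any binary classifier on them, and recover a rank by counting positive predictions. The sketch below assumes ordinal labels in {1, ..., K} and the absolute cost, under which all extended examples receive equal weight (a general cost matrix would supply per-example weights); it is an illustration, not the authors' implementation.

```python
# A compact sketch of reducing ordinal ranking to binary classification via
# threshold-extended examples; rank = 1 + number of thresholds predicted as exceeded.
import numpy as np
from sklearn.linear_model import LogisticRegression

def extend(X, y=None, K=5):
    """Create (x, k) extended examples for thresholds k = 1..K-1."""
    n = X.shape[0]
    Xe, ye = [], []
    for k in range(1, K):
        onehot = np.zeros(K - 1)
        onehot[k - 1] = 1.0
        Xe.append(np.hstack([X, np.tile(onehot, (n, 1))]))   # append threshold encoding
        if y is not None:
            ye.append((np.asarray(y) > k).astype(int))        # binary label: "rank above k?"
    Xe = np.vstack(Xe)
    return (Xe, np.concatenate(ye)) if y is not None else Xe

def fit_rank(X, y, K=5):
    Xe, ye = extend(X, y, K)
    return LogisticRegression(max_iter=1000).fit(Xe, ye)      # any binary classifier works

def predict_rank(clf, X, K=5):
    n = X.shape[0]
    votes = clf.predict(extend(X, None, K)).reshape(K - 1, n)
    return 1 + votes.sum(axis=0)                               # count of thresholds exceeded
```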
Invertebrate Iridoviruses: A Glance over the Last Decade
Özcan, Orhan; Ilter-Akulke, Ayca Zeynep; Scully, Erin D.; Özgen, Arzu
2018-01-01
Members of the family Iridoviridae (iridovirids) are large dsDNA viruses that infect both invertebrate and vertebrate ectotherms and whose symptoms range in severity from minor reductions in host fitness to systemic disease and large-scale mortality. Several characteristics have been useful for classifying iridoviruses; however, novel strains are continuously being discovered and, in many cases, reliable classification has been challenging. Further impeding classification, invertebrate iridoviruses (IIVs) can occasionally infect vertebrates; thus, host range is often not a useful criterion for classification. In this review, we discuss the current classification of iridovirids, focusing on genomic and structural features that distinguish vertebrate and invertebrate iridovirids and viral factors linked to host interactions in IIV6 (Invertebrate iridescent virus 6). In addition, we show for the first time how complete genome sequences of viral isolates can be leveraged to improve classification of new iridovirid isolates and resolve ambiguous relations. Improved classification of the iridoviruses may facilitate the identification of genus-specific virulence factors linked with diverse host phenotypes and host interactions. PMID:29601483
Invertebrate Iridoviruses: A Glance over the Last Decade.
İnce, İkbal Agah; Özcan, Orhan; Ilter-Akulke, Ayca Zeynep; Scully, Erin D; Özgen, Arzu
2018-03-30
Members of the family Iridoviridae (iridovirids) are large dsDNA viruses that infect both invertebrate and vertebrate ectotherms and whose symptoms range in severity from minor reductions in host fitness to systemic disease and large-scale mortality. Several characteristics have been useful for classifying iridoviruses; however, novel strains are continuously being discovered and, in many cases, reliable classification has been challenging. Further impeding classification, invertebrate iridoviruses (IIVs) can occasionally infect vertebrates; thus, host range is often not a useful criterion for classification. In this review, we discuss the current classification of iridovirids, focusing on genomic and structural features that distinguish vertebrate and invertebrate iridovirids and viral factors linked to host interactions in IIV6 (Invertebrate iridescent virus 6). In addition, we show for the first time how complete genome sequences of viral isolates can be leveraged to improve classification of new iridovirid isolates and resolve ambiguous relations. Improved classification of the iridoviruses may facilitate the identification of genus-specific virulence factors linked with diverse host phenotypes and host interactions.
The Effect of Normalization in Violence Video Classification Performance
NASA Astrophysics Data System (ADS)
Ali, Ashikin; Senan, Norhalina
2017-08-01
Data pre-processing is an important part of data mining, and normalization is a pre-processing stage for almost any problem, especially in video classification. Video classification is challenging because of heterogeneous content, large variations in video quality and the complex semantic meanings of the concepts involved. A thorough pre-processing stage that includes normalization therefore helps the robustness of classification performance. Normalization scales all numeric variables into a certain range so that they are more meaningful for the later phases of the data mining techniques used. This paper examines the effect of two normalization techniques, namely Min-max normalization and Z-score, on the classification rate of violence video classification using a Multi-layer perceptron (MLP) classifier. With Min-max normalization to the range [0,1] the accuracy is almost 98%, with Min-max normalization to the range [-1,1] the accuracy is 59%, and with Z-score the accuracy is 50%.
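For reference, the two pre-processing schemes compared above are shown below as a minimal sketch; variable names are illustrative.

```python
# A minimal sketch of the two normalization schemes compared above.
import numpy as np

def min_max(X, lo=0.0, hi=1.0):
    """Scale each feature (column) linearly into [lo, hi], e.g., [0, 1] or [-1, 1]."""
    xmin, xmax = X.min(axis=0), X.max(axis=0)
    return lo + (X - xmin) * (hi - lo) / (xmax - xmin + 1e-12)

def z_score(X):
    """Standardize each feature to zero mean and unit variance."""
    return (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)
```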
Ambert, Kyle H; Cohen, Aaron M
2009-01-01
OBJECTIVE Free-text clinical reports serve as an important part of patient care management and clinical documentation of patient disease and treatment status. Free-text notes are commonplace in medical practice, but remain an under-used source of information for clinical and epidemiological research, as well as personalized medicine. The authors explore the challenges associated with automatically extracting information from clinical reports using their submission to the Integrating Informatics with Biology and the Bedside (i2b2) 2008 Natural Language Processing Obesity Challenge Task. DESIGN A text mining system for classifying patient comorbidity status, based on the information contained in clinical reports. The approach of the authors incorporates a variety of automated techniques, including hot-spot filtering, negated concept identification, zero-vector filtering, weighting by inverse class-frequency, and error-correcting output codes with linear support vector machines. MEASUREMENTS Performance was evaluated in terms of the macroaveraged F1 measure. RESULTS The automated system performed well against manual expert rule-based systems, finishing fifth in the Challenge's intuitive task, and 13th in the textual task. CONCLUSIONS The system demonstrates that effective comorbidity status classification by an automated system is possible.
Wilkerson, Richard C; Linton, Yvonne-Marie; Fonseca, Dina M; Schultz, Ted R; Price, Dana C; Strickman, Daniel A
2015-01-01
The tribe Aedini (Family Culicidae) contains approximately one-quarter of the known species of mosquitoes, including vectors of deadly or debilitating disease agents. This tribe contains the genus Aedes, which is one of the three most familiar genera of mosquitoes. During the past decade, Aedini has been the focus of a series of extensive morphology-based phylogenetic studies published by Reinert, Harbach, and Kitching (RH&K). Those authors created 74 new, elevated or resurrected genera from what had been the single genus Aedes, almost tripling the number of genera in the entire family Culicidae. The proposed classification is based on subjective assessments of the "number and nature of the characters that support the branches" subtending particular monophyletic groups in the results of cladistic analyses of a large set of morphological characters of representative species. To gauge the stability of RH&K's generic groupings we reanalyzed their data with unweighted parsimony jackknife and maximum-parsimony analyses, with and without ordering 14 of the characters as in RH&K. We found that their phylogeny was largely weakly supported and their taxonomic rankings failed priority and other useful taxon-naming criteria. Consequently, we propose simplified aedine generic designations that 1) restore a classification system that is useful for the operational community; 2) enhance the ability of taxonomists to accurately place new species into genera; 3) maintain the progress toward a natural classification based on monophyletic groups of species; and 4) correct the current classification system that is subject to instability as new species are described and existing species more thoroughly defined. We do not challenge the phylogenetic hypotheses generated by the above-mentioned series of morphological studies. However, we reduce the ranks of the genera and subgenera of RH&K to subgenera or informal species groups, respectively, to preserve stability as new data become available.
Wilkerson, Richard C.; Linton, Yvonne-Marie; Fonseca, Dina M.; Schultz, Ted R.; Price, Dana C.; Strickman, Daniel A.
2015-01-01
The tribe Aedini (Family Culicidae) contains approximately one-quarter of the known species of mosquitoes, including vectors of deadly or debilitating disease agents. This tribe contains the genus Aedes, which is one of the three most familiar genera of mosquitoes. During the past decade, Aedini has been the focus of a series of extensive morphology-based phylogenetic studies published by Reinert, Harbach, and Kitching (RH&K). Those authors created 74 new, elevated or resurrected genera from what had been the single genus Aedes, almost tripling the number of genera in the entire family Culicidae. The proposed classification is based on subjective assessments of the “number and nature of the characters that support the branches” subtending particular monophyletic groups in the results of cladistic analyses of a large set of morphological characters of representative species. To gauge the stability of RH&K’s generic groupings we reanalyzed their data with unweighted parsimony jackknife and maximum-parsimony analyses, with and without ordering 14 of the characters as in RH&K. We found that their phylogeny was largely weakly supported and their taxonomic rankings failed priority and other useful taxon-naming criteria. Consequently, we propose simplified aedine generic designations that 1) restore a classification system that is useful for the operational community; 2) enhance the ability of taxonomists to accurately place new species into genera; 3) maintain the progress toward a natural classification based on monophyletic groups of species; and 4) correct the current classification system that is subject to instability as new species are described and existing species more thoroughly defined. We do not challenge the phylogenetic hypotheses generated by the above-mentioned series of morphological studies. However, we reduce the ranks of the genera and subgenera of RH&K to subgenera or informal species groups, respectively, to preserve stability as new data become available. PMID:26226613
Supernova Photometric Lightcurve Classification
NASA Astrophysics Data System (ADS)
Zaidi, Tayeb; Narayan, Gautham
2016-01-01
This is a preliminary report on photometric supernova classification. We first explore the properties of supernova light curves, and attempt to restructure the unevenly sampled and sparse data from assorted datasets to allow for processing and classification. The data was primarily drawn from the Dark Energy Survey (DES) simulated data, created for the Supernova Photometric Classification Challenge. This poster shows a method for producing a non-parametric representation of the light curve data, and applying a Random Forest classifier algorithm to distinguish between supernovae types. We examine the impact of Principal Component Analysis to reduce the dimensionality of the dataset, for future classification work. The classification code will be used in a stage of the ANTARES pipeline, created for use on the Large Synoptic Survey Telescope alert data and other wide-field surveys. The final figure-of-merit for the DES data in the r band was 60% for binary classification (Type I vs II). Zaidi was supported by the NOAO/KPNO Research Experiences for Undergraduates (REU) Program which is funded by the National Science Foundation Research Experiences for Undergraduates Program (AST-1262829).
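The classification stage described above (optional PCA followed by a Random Forest on light-curve features) can be sketched as a short pipeline. Feature extraction from the unevenly sampled light curves is assumed to be done already; names and parameters are illustrative, not the poster's code.

```python
# A minimal sketch: PCA for dimensionality reduction feeding a Random Forest classifier.
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier

def build_sn_classifier(n_components=20):
    return make_pipeline(PCA(n_components=n_components),
                         RandomForestClassifier(n_estimators=500, n_jobs=-1))

# clf = build_sn_classifier().fit(X_train_features, y_train_types)  # e.g., Type Ia vs. non-Ia
```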
NASA Astrophysics Data System (ADS)
Teffahi, Hanane; Yao, Hongxun; Belabid, Nasreddine; Chaib, Souleyman
2018-02-01
Satellite images with very high spatial resolution have recently been widely used in image classification, which has become a challenging task in the remote sensing field. Due to a number of limitations such as the redundancy of features and the high dimensionality of the data, different classification methods have been proposed for remote sensing image classification, particularly methods using feature extraction techniques. This paper proposes a simple, efficient method exploiting the capability of extended multi-attribute profiles (EMAP) combined with a sparse autoencoder (SAE) for remote sensing image classification. The proposed method classifies various remote sensing datasets, including hyperspectral and multispectral images, by extracting spatial and spectral features based on the combination of EMAP and SAE and feeding them to a kernel support vector machine (SVM) for classification. Experiments on the hyperspectral "Houston data" and the multispectral "Washington DC data" show that the new scheme achieves better feature learning performance than the primitive features, traditional classifiers and an ordinary autoencoder, and has the potential to achieve higher classification accuracy in a short running time.
Comparison of Classifier Architectures for Online Neural Spike Sorting.
Saeed, Maryam; Khan, Amir Ali; Kamboh, Awais Mehmood
2017-04-01
High-density, intracranial recordings from micro-electrode arrays need to undergo spike sorting in order to associate the recorded neuronal spikes with particular neurons. This involves spike detection, feature extraction, and classification. To reduce the data transmission and power requirements, on-chip real-time processing is becoming very popular. However, high computational resources are required for classifiers in on-chip spike sorters, making scalability a great challenge. In this review paper, we analyze several popular classifiers and propose five new hardware architectures using an off-chip training, on-chip classification approach. These include support vector classification, fuzzy C-means classification, self-organizing maps classification, moving-centroid K-means classification, and cosine distance classification. The performance of these architectures is analyzed in terms of accuracy and resource requirements. We establish that the neural-network-based Self-Organizing Maps classifier offers the most viable solution. A spike sorter based on the Self-Organizing Maps classifier requires only 7.83% of the computational resources of the best-reported spike sorter, hierarchical adaptive means, while offering 3% better accuracy at 7 dB SNR.
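Of the classifier families listed above, the cosine distance scheme is the simplest to sketch: off-chip training computes class centroids, and on-chip classification reduces to dot products against normalized centroids. The code below is an illustrative software model only, not one of the paper's hardware architectures.

```python
# A minimal sketch of cosine-distance (nearest-centroid) classification of spike features.
import numpy as np

def train_centroids(features, labels):
    """Off-chip step: one unit-norm centroid per putative neuron (class)."""
    classes = np.unique(labels)
    C = np.array([features[labels == c].mean(axis=0) for c in classes])
    C /= np.linalg.norm(C, axis=1, keepdims=True)
    return classes, C

def classify_spikes(features, classes, C):
    """On-chip step: assign each spike to the centroid with highest cosine similarity."""
    F = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-12)
    return classes[(F @ C.T).argmax(axis=1)]
```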
Designing and Implementation of River Classification Assistant Management System
NASA Astrophysics Data System (ADS)
Zhao, Yinjun; Jiang, Wenyuan; Yang, Rujun; Yang, Nan; Liu, Haiyan
2018-03-01
In an earlier publication, we proposed a new Decision Classifier (DCF) for classifying Chinese rivers based on their structures. To expand, enhance and promote the application of the DCF, we built a computer system to support river classification, named the River Classification Assistant Management System. Built on the ArcEngine and ArcServer platforms, the system implements functions such as data management, river network extraction, river classification, and publication of results under a combined Client/Server and Browser/Server framework.
Detection of Alzheimer's disease using group lasso SVM-based region selection
NASA Astrophysics Data System (ADS)
Sun, Zhuo; Fan, Yong; Lelieveldt, Boudewijn P. F.; van de Giessen, Martijn
2015-03-01
Alzheimer's disease (AD) is one of the most frequent forms of dementia and an increasingly challenging public health problem. In the last two decades, structural magnetic resonance imaging (MRI) has shown potential in distinguishing patients with Alzheimer's disease from elderly controls (CN). To obtain AD-specific biomarkers, previous research used either statistical testing to find statistically significantly different regions between the two clinical groups, or l1 sparse learning to select isolated features in the image domain. In this paper, we propose a new framework that uses structural MRI to simultaneously distinguish the two clinical groups and find the biomarkers of AD, using a group lasso support vector machine (SVM). The group lasso term (mixed l1-l2 norm) introduces anatomical information from the image domain into the feature domain, such that the resulting set of selected voxels is more meaningful than with the l1 sparse SVM. Because of large inter-structure size variation, we introduce a group-specific normalization factor to deal with the structure size bias. Experiments have been performed on a well-designed AD vs. CN dataset to validate our method. Compared to the l1 sparse SVM approach, our method achieved better classification performance and a more meaningful biomarker selection. When we vary the training set, the regions selected by our method were more stable than those of the l1 sparse SVM. Classification experiments showed that our group normalization led to higher classification accuracy with fewer selected regions than the non-normalized method. Compared to the state-of-the-art AD vs. CN classification methods, our approach not only obtains a high accuracy on the same dataset, but more importantly, we simultaneously find the brain anatomies that are closely related to the disease.
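For readers unfamiliar with the mixed-norm penalty mentioned above, a hedged sketch of the kind of objective involved is shown below; the notation is illustrative and not taken from the paper, and the exact form of the group weight is an assumption.

```latex
% Group-lasso-regularized SVM: hinge loss plus an l2 penalty per anatomical group g,
% where w_g is the sub-vector of weights for group g and omega_g is a group-specific
% normalization factor (e.g., related to the number of voxels in the structure).
\min_{\mathbf{w},\,b}\;\; \sum_{i=1}^{N} \max\!\bigl(0,\; 1 - y_i\,(\mathbf{w}^\top \mathbf{x}_i + b)\bigr)
\;+\; \lambda \sum_{g=1}^{G} \omega_g \,\lVert \mathbf{w}_g \rVert_2
```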
Biobehavioral Correlates of Depression in Reaction to Mental and Physical Challenge
2007-03-07
... reactivity to challenge with potential positive effects on quality of life for individuals with depression.
Land use/cover classification in the Brazilian Amazon using satellite images.
Lu, Dengsheng; Batistella, Mateus; Li, Guiying; Moran, Emilio; Hetrick, Scott; Freitas, Corina da Costa; Dutra, Luciano Vieira; Sant'anna, Sidnei João Siqueira
2012-09-01
Land use/cover classification is one of the most important applications in remote sensing. However, mapping accurate land use/cover spatial distribution is a challenge, particularly in moist tropical regions, due to the complex biophysical environment and limitations of remote sensing data per se. This paper reviews a decade of experiments related to land use/cover classification in the Brazilian Amazon. Through comprehensive analysis of the classification results, it is concluded that spatial information inherent in remote sensing data plays an essential role in improving land use/cover classification. Incorporating suitable textural images into multispectral bands and using segmentation-based methods are valuable ways to improve land use/cover classification, especially for high spatial resolution images. Data fusion of multi-resolution images within optical sensor data is vital for visual interpretation, but may not improve classification performance. In contrast, integration of optical and radar data did improve classification performance when the proper data fusion method was used. Of the classification algorithms available, the maximum likelihood classifier is still an important method for providing reasonably good accuracy, but nonparametric algorithms, such as classification tree analysis, have the potential to provide better results, although they often require more time to achieve parameter optimization. Proper use of hierarchical methods is fundamental for developing accurate land use/cover classification, mainly from historical remotely sensed data.
Land use/cover classification in the Brazilian Amazon using satellite images
Lu, Dengsheng; Batistella, Mateus; Li, Guiying; Moran, Emilio; Hetrick, Scott; Freitas, Corina da Costa; Dutra, Luciano Vieira; Sant’Anna, Sidnei João Siqueira
2013-01-01
Land use/cover classification is one of the most important applications in remote sensing. However, mapping accurate land use/cover spatial distribution is a challenge, particularly in moist tropical regions, due to the complex biophysical environment and limitations of remote sensing data per se. This paper reviews a decade of experiments related to land use/cover classification in the Brazilian Amazon. Through comprehensive analysis of the classification results, it is concluded that spatial information inherent in remote sensing data plays an essential role in improving land use/cover classification. Incorporating suitable textural images into multispectral bands and using segmentation-based methods are valuable ways to improve land use/cover classification, especially for high spatial resolution images. Data fusion of multi-resolution images within optical sensor data is vital for visual interpretation, but may not improve classification performance. In contrast, integration of optical and radar data did improve classification performance when the proper data fusion method was used. Of the classification algorithms available, the maximum likelihood classifier is still an important method for providing reasonably good accuracy, but nonparametric algorithms, such as classification tree analysis, have the potential to provide better results, although they often require more time to achieve parameter optimization. Proper use of hierarchical methods is fundamental for developing accurate land use/cover classification, mainly from historical remotely sensed data. PMID:24353353
The Landscape of long non-coding RNA classification
St Laurent, Georges; Wahlestedt, Claes; Kapranov, Philipp
2015-01-01
Advances in the depth and quality of transcriptome sequencing have revealed many new classes of long non-coding RNAs (lncRNAs). lncRNA classification has mushroomed to accommodate these new findings, even though the real dimensions and complexity of the non-coding transcriptome remain unknown. Although evidence of functionality of specific lncRNAs continues to accumulate, conflicting, confusing, and overlapping terminology has fostered ambiguity and lack of clarity in the field in general. The lack of a fundamental, unambiguous conceptual classification framework results in a number of challenges in the annotation and interpretation of non-coding transcriptome data. It might also undermine the integration of new genomic methods and datasets in the effort to unravel the functions of lncRNAs. Here, we review existing lncRNA classifications, nomenclature, and terminology. We then describe the conceptual guidelines that have emerged for their classification and functional annotation based on expanding and more comprehensive use of large systems biology-based datasets. PMID:25869999
Zhe Fan; Zhong Wang; Guanglin Li; Ruomei Wang
2016-08-01
Motion classification systems based on surface electromyography (sEMG) pattern recognition have achieved good results under experimental conditions, but clinical implementation and practical application remain a challenge. Many factors contribute to the difficulty of clinical use of EMG-based dexterous control. The most obvious and important is noise in the EMG signal caused by electrode shift, muscle fatigue, motion artifact, the inherent instability of the signal, and biological signals such as the electrocardiogram. In this paper, a novel method based on Canonical Correlation Analysis (CCA) was developed to eliminate the reduction in classification accuracy caused by electrode shift. The average classification accuracy of our method was above 95% for the healthy subjects. In the process, we verified the influence of electrode shift on motion classification accuracy and found a strong correlation (correlation coefficient > 0.9) between the shifted-position data and the normal-position data.
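A rough sketch of the idea of using CCA to align shifted-electrode features with normal-position features before classification is shown below; the simulated features, channel counts, and the LDA back-end are placeholders rather than the paper's pipeline.

```python
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Placeholder sEMG feature matrices: rows are repetitions of the same motions
# recorded at the normal electrode position and after an electrode shift.
n_trials, n_feats, n_classes = 300, 16, 5
labels = rng.integers(0, n_classes, n_trials)
class_means = rng.normal(0, 2, size=(n_classes, n_feats))
X_normal = class_means[labels] + rng.normal(0, 0.5, size=(n_trials, n_feats))
# Simulate the shift as an unknown linear distortion plus extra noise.
A = rng.normal(0, 0.4, size=(n_feats, n_feats)) + np.eye(n_feats)
X_shift = X_normal @ A + rng.normal(0, 0.5, size=(n_trials, n_feats))

# Learn a shared canonical space from paired normal/shifted calibration trials.
cca = CCA(n_components=8)
cca.fit(X_shift[:200], X_normal[:200])

# Train on canonical scores of normal-position data, test on shifted data.
train_scores = cca.transform(X_shift[:200], X_normal[:200])[1]
test_scores = cca.transform(X_shift[200:])
clf = LinearDiscriminantAnalysis().fit(train_scores, labels[:200])
print("accuracy on shifted trials:", clf.score(test_scores, labels[200:]))
```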
The Use and Abuse of Diagnostic/Classification Criteria
June, Rayford R.; Aggarwal, Rohit
2015-01-01
In rheumatic diseases, classification criteria have been developed to identify well-defined homogeneous cohorts for clinical research. Although they are commonly used in clinical practice, their use may not be appropriate for routine diagnostic care. Classification criteria are being revised with improved methodology and a better understanding of disease pathophysiology, but they still may not encompass all the unique clinical situations encountered when diagnosing heterogeneous, rare, evolving rheumatic diseases. Developing diagnostic criteria is challenging primarily because universal application is difficult, given significant differences in the prevalence of rheumatic diseases across geographical areas and clinic settings. Despite these shortcomings, the clinician can still use classification criteria for understanding the disease and as a guide for diagnosis, with a few caveats. We present the limits of current classification criteria, describe their use and abuse in clinical practice, and discuss how they should be applied with caution in the clinic. PMID:26096094
Uav-Based Crops Classification with Joint Features from Orthoimage and Dsm Data
NASA Astrophysics Data System (ADS)
Liu, B.; Shi, Y.; Duan, Y.; Wu, W.
2018-04-01
Accurate crop classification remains a challenging task due to the phenomena of the same crop exhibiting different spectra and different crops sharing the same spectrum. Recently, the UAV-based remote sensing approach has gained popularity, not only for its high spatial and temporal resolution, but also for its ability to obtain spectral and spatial data at the same time. This paper focuses on how to take full advantage of spatial and spectral features to improve crop classification accuracy, based on a UAV platform equipped with a general digital camera. Texture and spatial features extracted from the RGB orthoimage and the digital surface model of the monitoring area are analysed and integrated within an SVM classification framework. Extensive experimental results indicate that the overall classification accuracy improves dramatically, from 72.9% to 94.5%, when the spatial features are included, which verifies the feasibility and effectiveness of the proposed method.
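A minimal sketch of the joint-feature idea, concatenating orthoimage-derived and DSM-derived features ahead of an SVM; the feature names and the synthetic labels are illustrative assumptions, not the paper's data.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic per-object features standing in for the paper's inputs: RGB
# statistics and texture from the orthoimage, plus height features from the
# DSM. Labels are constructed so that crop height carries real information.
n = 1000
spectral = rng.normal(size=(n, 3))     # e.g. mean R, G, B per object
texture = rng.normal(size=(n, 4))      # e.g. GLCM contrast, entropy, ...
height = rng.normal(size=(n, 2))       # e.g. relative height, roughness
labels = 2 * (spectral[:, 0] > 0) + (height[:, 0] > 0)   # four synthetic crop classes

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))

acc_spec = cross_val_score(svm, spectral, labels, cv=5).mean()
acc_joint = cross_val_score(svm, np.hstack([spectral, texture, height]), labels, cv=5).mean()
print(f"spectral only: {acc_spec:.2f}   joint spectral+texture+DSM: {acc_joint:.2f}")
```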
Li, Juntao; Wang, Yanyan; Jiang, Tao; Xiao, Huimin; Song, Xuekun
2018-05-09
Diagnosing acute leukemia is the necessary prerequisite to treating it. Multi-class classification of acute leukemia gene expression data, covering B-cell acute lymphoblastic leukemia (BALL), T-cell acute lymphoblastic leukemia (TALL), and acute myeloid leukemia (AML), aids diagnosis. However, selecting cancer-causing genes is a challenging problem in performing multi-classification. In this paper, weighted gene co-expression networks are employed to divide the genes into groups. Based on these groups, a new regularized multinomial regression with an overlapping group lasso penalty (MROGL) is presented to simultaneously perform multi-classification and select gene groups. By applying this method to three-class acute leukemia data, the grouped genes that work synergistically are identified, and the overlapping genes shared by different groups are also highlighted. Moreover, MROGL outperforms five other methods in multi-classification accuracy. Copyright © 2017. Published by Elsevier B.V.
Lin, Dongyun; Sun, Lei; Toh, Kar-Ann; Zhang, Jing Bo; Lin, Zhiping
2018-05-01
Automated biomedical image classification must contend with high levels of noise, image blur, illumination variation, and complicated geometric correspondence among various categorical biomedical patterns in practice. To handle these challenges, we propose a two-stage cascade method for biomedical image classification. At stage 1, we propose a confidence-score-based classification rule with a reject option for a preliminary decision using the support vector machine (SVM). The testing images going through stage 1 are separated into two groups based on their confidence scores. Testing images with sufficiently high confidence scores are classified at stage 1, while the others, with low confidence scores, are rejected and fed to stage 2. At stage 2, the rejected images from stage 1 are first processed by a subspace analysis technique called eigenfeature regularization and extraction (ERE), and then classified by another SVM trained in the transformed subspace learned by ERE. At both stages, images are represented by two types of local features, SIFT and SURF, which are encoded using various bag-of-words (BoW) models to handle biomedical patterns with and without geometric correspondence, respectively. Extensive experiments evaluate the proposed method on three benchmark real-world biomedical image datasets. The proposed method significantly outperforms several competing state-of-the-art methods in terms of classification accuracy. Copyright © 2018 Elsevier Ltd. All rights reserved.
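A compact sketch of the two-stage cascade with a confidence-based reject option; here PCA stands in for the ERE subspace step and the digits dataset stands in for biomedical images, so the numbers are illustrative only.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Stage 1: SVM with a confidence-based reject option. Stage 2: another SVM
# trained in a transformed subspace (PCA as a stand-in for ERE).
X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

stage1 = SVC(kernel="rbf", gamma="scale", probability=True).fit(X_tr, y_tr)
stage2 = make_pipeline(PCA(n_components=30), SVC(kernel="rbf", gamma="scale")).fit(X_tr, y_tr)

threshold = 0.9                                  # confidence needed to decide at stage 1
proba = stage1.predict_proba(X_te)
confident = proba.max(axis=1) >= threshold

pred = np.empty_like(y_te)
pred[confident] = stage1.classes_[proba[confident].argmax(axis=1)]
if (~confident).any():                           # rejected images go to stage 2
    pred[~confident] = stage2.predict(X_te[~confident])

print("fraction decided at stage 1:", confident.mean())
print("overall accuracy           :", (pred == y_te).mean())
```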
Al-Masni, Mohammed A; Al-Antari, Mugahed A; Park, Jeong-Min; Gi, Geon; Kim, Tae-Yeon; Rivera, Patricio; Valarezo, Edwin; Choi, Mun-Taek; Han, Seung-Moo; Kim, Tae-Seong
2018-04-01
Automatic detection and classification of masses in mammograms remain a major challenge and play a crucial role in assisting radiologists with accurate diagnosis. In this paper, we propose a novel Computer-Aided Diagnosis (CAD) system based on a regional deep learning technique, a ROI-based Convolutional Neural Network (CNN) called You Only Look Once (YOLO). Although most previous studies only deal with classification of masses, our proposed YOLO-based CAD system handles detection and classification simultaneously in one framework. The proposed CAD system contains four main stages: preprocessing of mammograms, feature extraction utilizing deep convolutional networks, mass detection with confidence, and finally mass classification using Fully Connected Neural Networks (FC-NNs). In this study, we utilized 600 original mammograms from the Digital Database for Screening Mammography (DDSM) and 2,400 augmented mammograms, with the locations and types of the masses, to train and test our CAD system. The trained YOLO-based CAD system detects the masses and then classifies them as benign or malignant. Our results with five-fold cross-validation show that the proposed CAD system detects the mass location with an overall accuracy of 99.7% and distinguishes between benign and malignant lesions with an overall accuracy of 97%. Our proposed system even works on challenging breast cancer cases where the masses lie over the pectoral muscles or in dense regions. Copyright © 2018 Elsevier B.V. All rights reserved.
VizieR Online Data Catalog: LAMOST-Kepler MKCLASS spectral classification (Gray+, 2016)
NASA Astrophysics Data System (ADS)
Gray, R. O.; Corbally, C. J.; De Cat, P.; Fu, J. N.; Ren, A. B.; Shi, J. R.; Luo, A. L.; Zhang, H. T.; Wu, Y.; Cao, Z.; Li, G.; Zhang, Y.; Hou, Y.; Wang, Y.
2016-07-01
The data for the LAMOST-Kepler project are supplied by the Large Sky Area Multi Object Fiber Spectroscopic Telescope (LAMOST, also known as the Guo Shou Jing Telescope). This unique astronomical instrument is located at the Xinglong observatory in China, and combines a large aperture (4 m) telescope with a 5° circular field of view (Wang et al. 1996ApOpt..35.5155W). Our role in this project is to supply accurate two-dimensional spectral types for the observed targets. The large number of spectra obtained for this project (101086) makes traditional visual classification techniques impractical, so we have utilized the MKCLASS code to perform these classifications. The MKCLASS code (Gray & Corbally 2014AJ....147...80G, v1.07 http://www.appstate.edu/~grayro/mkclass/), an expert system designed to classify blue-violet spectra on the MK Classification system, was employed to produce the spectral classifications reported in this paper. MKCLASS was designed to reproduce the steps skilled human classifiers employ in the classification process. (2 data files).
Simon, A.; Doyle, M.; Kondolf, M.; Shields, F.D.; Rhoads, B.; Grant, G.; Fitzpatrick, F.; Juracek, K.; McPhillips, M.; MacBroom, J.
2005-01-01
Over the past 10 years the Rosgen classification system and its associated methods of "natural channel design" have become synonymous (to many without prior knowledge of the field) with the term "stream restoration" and the science of fluvial geomorphology. Since the mid 1990s, this classification approach has become widely, and perhaps dominantly adopted by governmental agencies, particularly those funding restoration projects. For example, in a request for proposals for the restoration of Trout Creek in Montana, the Natural Resources Conservation Service required "experience in the use and application of a stream classification system and its implementation." Similarly, classification systems have been used in evaluation guides for riparian areas and U.S. Forest Service management plans. Most notably, many highly trained geomorphologists and hydraulic engineers are often held suspect, or even thought incorrect, if their approach does not include reference to or application of a classification system. This, combined with the para-professional training provided by some involved in "natural channel design" empower individuals and groups with limited backgrounds in stream and watershed sciences to engineer wholesale re-patterning of stream reaches using 50-year old technology that was never intended for engineering design. At Level I, the Rosgen classification system consists of eight or nine major stream types, based on hydraulic-geometry relations and four other measures of channel shape to distinguish the dimensions of alluvial stream channels as a function of the bankfull stage. Six classes of the particle size of the boundary sediments are used to further sub-divide each of the major stream types, resulting in 48 or 54 stream types. Aside from the difficulty in identifying bankfull stage, particularly in incising channels, and the issue of sampling from two distinct populations (beds and banks) to classify the boundary sediments, the classification provides a consistent and reproducible means for practitioners to describe channel morphology although difficulties have been encountered in lower-gradient stream systems. Use of the scheme to communicate between users or as a conceptual model, however, has not justified its use for engineering design or for predicting river behavior; its use for designing mitigation projects, therefore, seems beyond its technical scope. Copyright ASCE 2005.
Comparative Analysis of Haar and Daubechies Wavelet for Hyper Spectral Image Classification
NASA Astrophysics Data System (ADS)
Sharif, I.; Khare, S.
2014-11-01
With the number of channels in the hundreds instead of the tens, hyperspectral imagery possesses much richer spectral information than multispectral imagery. The increased dimensionality of such hyperspectral data poses a challenge to current techniques for analyzing the data, and conventional classification methods may not be useful without dimension-reduction pre-processing. Dimension reduction has therefore become a significant part of hyperspectral image processing. This paper presents a comparative analysis of the efficacy of Haar and Daubechies wavelets for dimensionality reduction in image classification. Spectral data reduction using wavelet decomposition is useful because it preserves the distinctions among spectral signatures. Daubechies wavelets optimally capture polynomial trends, while the Haar wavelet is discontinuous and resembles a step function. The performance of these wavelets is compared in terms of classification accuracy and time complexity. The paper shows that wavelet reduction yields more separable classes and better or comparable classification accuracy. In the context of the dimensionality-reduction algorithm, the classification performance of Daubechies wavelets is better than that of the Haar wavelet, although Daubechies wavelets take more time than the Haar wavelet. The experimental results demonstrate that the classification system consistently provides over 84% classification accuracy.
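A small sketch of the comparison, reducing each synthetic spectrum to its wavelet approximation coefficients with PyWavelets and scoring an SVM on the reduced bands; the data, decomposition level, and classifier are assumptions, not the paper's experimental setup.

```python
import numpy as np
import pywt
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Placeholder hyperspectral pixels: 200 spectral bands, two classes whose
# spectra differ by a smooth broad feature plus band-wise noise.
n_pixels, n_bands = 400, 200
bands = np.linspace(0, 1, n_bands)
labels = rng.integers(0, 2, n_pixels)
signatures = np.where(labels[:, None] == 0,
                      np.exp(-((bands - 0.3) ** 2) / 0.01),
                      np.exp(-((bands - 0.7) ** 2) / 0.01))
X = signatures + rng.normal(0, 0.3, size=(n_pixels, n_bands))

def wavelet_reduce(X, wavelet, level=4):
    """Keep only the approximation coefficients of each spectrum."""
    return pywt.wavedec(X, wavelet, level=level, axis=1)[0]

svm = SVC(kernel="rbf", gamma="scale")
for name in ("haar", "db4"):
    X_red = wavelet_reduce(X, name)
    acc = cross_val_score(svm, X_red, labels, cv=5).mean()
    print(f"{name:>4}: {X.shape[1]} -> {X_red.shape[1]} bands, accuracy {acc:.3f}")
```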
NASA Technical Reports Server (NTRS)
Shull, Sarah A.; Gralla, Erica L.; deWeck, Olivier L.; Shishko, Robert
2006-01-01
One of the major logistical challenges in human space exploration is asset management. This paper presents observations on the practice of asset management in support of human space flight to date and discusses a functional-based supply classification and a framework for an integrated database that could be used to improve asset management and logistics for human missions to the Moon, Mars and beyond.
Learning From Short Text Streams With Topic Drifts.
Li, Peipei; He, Lu; Wang, Haiyan; Hu, Xuegang; Zhang, Yuhong; Li, Lei; Wu, Xindong
2017-09-18
Short text streams such as search snippets and micro-blogs have become popular on the Web with the emergence of social media. Unlike traditional normal text streams, these data present the characteristics of short length, weak signal, high volume, high velocity, topic drift, etc. Short text stream classification is hence a very challenging and significant task. However, this challenge has received little attention from the research community. Therefore, a new feature extension approach is proposed for short text stream classification with the help of a large-scale semantic network obtained from a Web corpus. It is built on an incremental ensemble classification model for efficiency. First, more semantic contexts based on the senses of terms in short texts are introduced to make up for the data sparsity using the open semantic network, in which all terms are disambiguated by their semantics to reduce the impact of noise. Second, a concept cluster-based topic drift detection method is proposed to effectively track hidden topic drifts. Finally, extensive studies demonstrate that, compared to several well-known concept drift detection methods for data streams, our approach can detect topic drifts effectively, and that it handles short text streams effectively while maintaining efficiency compared to several state-of-the-art short text classification approaches.
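The incremental-ensemble backbone of such a classifier can be sketched as follows: train one lightweight base learner per chunk of the stream, keep only the most recent k learners, and predict by majority vote. The semantic-network feature extension and the concept-cluster drift detector are not reproduced; the hashing vectorizer and naive Bayes base learners are stand-ins.

```python
from collections import deque
import numpy as np
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.naive_bayes import MultinomialNB

# Minimal incremental-ensemble backbone for short text streams.
vectorizer = HashingVectorizer(n_features=2 ** 16, alternate_sign=False)
ensemble = deque(maxlen=5)                      # the k most recent base classifiers

def update(texts, labels):
    clf = MultinomialNB().fit(vectorizer.transform(texts), labels)
    ensemble.append(clf)

def predict(texts):
    X = vectorizer.transform(texts)
    votes = np.stack([clf.predict(X) for clf in ensemble]).astype(int)
    # Majority vote across ensemble members, per text.
    return np.array([np.bincount(col).argmax() for col in votes.T])

# Toy stream with a topic drift: the 'spam' class switches vocabulary in chunk 3.
chunks = [
    (["cheap flights deal", "win a free phone", "movie tonight fun", "great film review"], [1, 1, 0, 0]),
    (["free prize claim now", "discount flights sale", "the film was boring", "new movie trailer"], [1, 1, 0, 0]),
    (["crypto coin airdrop", "token giveaway join now", "movie cast announced", "film festival opens"], [1, 1, 0, 0]),
]
for texts, labels in chunks:
    update(texts, labels)
print(predict(["claim your free token", "film review tonight"]))
```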
UAS Detection Classification and Neutralization: Market Survey 2015
DOE Office of Scientific and Technical Information (OSTI.GOV)
Birch, Gabriel Carisle; Griffin, John Clark; Erdman, Matthew Kelly
The purpose of this document is to briefly frame the challenges of detecting low, slow, and small (LSS) unmanned aerial systems (UAS). The conclusion drawn from internal discussions and external reports is the following: detection of LSS UAS is a challenging problem that cannot be achieved with a single detection modality for all potential targets. Classification of LSS UAS, especially classification in the presence of background clutter (e.g., urban environment) or other non-threatening targets (e.g., birds), is under-explored. Though information on available technologies is sparse, many of the existing options for UAS detection appear to be in their infancy (when compared to more established ground-based air defense systems for larger and/or faster threats). Companies currently providing or developing technologies to combat the UAS safety and security problem are certainly worth investigating; however, no company has provided the statistical evidence necessary to support robust detection, identification, and/or neutralization of LSS UAS targets. The results of a market survey are included that highlight potential commercial entities that could contribute technology to assist in the detection, classification, and neutralization of LSS UAS. This survey found no clear and obvious commercial solution, though recommendations are given for further investigation of several potential systems.
Discrete Event Simulation for the Analysis of Artillery Fired Projectiles from Shore
2017-06-01
... to deny freedom of navigation (area denial) and stop an amphibious naval convoy (anti-access). Results from a designed experiment indicate artillery systems provide commanders a limited area denial capability, and should be employed where naval forces are ...
Corps of Engineers Hydraulic Design Criteria. Volume I
1977-01-01
Corps of Engineers Hydraulic Design Criteria, Classification Index (excerpt): S000-General (000 Physical Constants, 001 Fluid Properties, 010 Open Channel Flow, 020 Free Overflow, 030 Pressure Flow); 100-Spillways (... Dissipation, 113 Erosion below Spillways, 120 Chute Spillways, 121 Approach Channel, 122 Ogee Crests, 123 Spillway Chutes, 124 Spillway Stilling Basins, 125 ... Spillway Exit Channel).
46 CFR 108.109 - Classification society standards.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 46 Shipping 4 2011-10-01 2011-10-01 false Classification society standards. 108.109 Section 108.109 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS DESIGN AND EQUIPMENT General § 108.109 Classification society standards. (a) Any person who desires to...
46 CFR 108.109 - Classification society standards.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 46 Shipping 4 2012-10-01 2012-10-01 false Classification society standards. 108.109 Section 108.109 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS DESIGN AND EQUIPMENT General § 108.109 Classification society standards. (a) Any person who desires to...
46 CFR 108.109 - Classification society standards.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 46 Shipping 4 2010-10-01 2010-10-01 false Classification society standards. 108.109 Section 108.109 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS DESIGN AND EQUIPMENT General § 108.109 Classification society standards. (a) Any person who desires to...
46 CFR 108.109 - Classification society standards.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 46 Shipping 4 2014-10-01 2014-10-01 false Classification society standards. 108.109 Section 108.109 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS DESIGN AND EQUIPMENT General § 108.109 Classification society standards. (a) Any person who desires to...
46 CFR 108.109 - Classification society standards.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 46 Shipping 4 2013-10-01 2013-10-01 false Classification society standards. 108.109 Section 108.109 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS DESIGN AND EQUIPMENT General § 108.109 Classification society standards. (a) Any person who desires to...
Evaluation of change detection techniques for monitoring coastal zone environments
NASA Technical Reports Server (NTRS)
Weismiller, R. A. (Principal Investigator); Kristof, S. J.; Scholz, D. K.; Anuta, P. E.; Momin, S. M.
1977-01-01
The author has identified the following significant results. Four change detection techniques were designed and implemented for evaluation: (1) post classification comparison change detection, (2) delta data change detection, (3) spectral/temporal change classification, and (4) layered spectral/temporal change classification. The post classification comparison technique reliably identified areas of change and was used as the standard for qualitatively evaluating the other three techniques. The layered spectral/temporal change classification and the delta data change detection results generally agreed with the post classification comparison technique results; however, many small areas of change were not identified. Major discrepancies existed between the post classification comparison and spectral/temporal change detection results.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-11
.... FDA-2008-N-0163] (formerly Docket No. 2001N-0067) RIN 0910-AG21 Dental Devices: Classification of Dental Amalgam, Reclassification of Dental Mercury, Designation of Special Controls for Dental Amalgam... the Federal Register of August 4, 2009 (74 FR 38686) which classified dental amalgam as a class II...
An automated approach to the design of decision tree classifiers
NASA Technical Reports Server (NTRS)
Argentiero, P.; Chin, P.; Beaudet, P.
1980-01-01
The classification of large-dimensional data sets arising from the merging of remote sensing data with more traditional forms of ancillary data is considered. Decision tree classification, a popular approach to the problem, is characterized by the property that samples are subjected to a sequence of decision rules before they are assigned to a unique class. An automated technique for effective decision tree design which relies only on a priori statistics is presented. The procedure utilizes a set of two-dimensional canonical transforms and Bayes table look-up decision rules. An optimal design at each node is derived based on the associated decision table. A procedure for computing the global probability of correct classification is also provided. An example is given in which class statistics obtained from an actual LANDSAT scene are used as input to the program. The resulting decision tree design has an associated probability of correct classification of 0.76, compared to the theoretically optimal 0.79 probability of correct classification associated with a full-dimensional Bayes classifier. Recommendations for future research are included.
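For contrast with the a-priori-statistics design described above, a modern sample-based sketch is shown below: samples drawn from assumed Gaussian class models are used to fit and print a small decision tree; the class statistics are invented for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Draw samples from assumed Gaussian class models (stand-ins for the a priori
# class statistics) and fit a shallow decision tree to them.
rng = np.random.default_rng(0)
class_means = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])   # assumed statistics
X = np.vstack([rng.multivariate_normal(m, np.eye(2), 200) for m in class_means])
y = np.repeat([0, 1, 2], 200)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["band_1", "band_2"]))
print("training accuracy:", tree.score(X, y))
```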
Virtual design and construction of plumbing systems
NASA Astrophysics Data System (ADS)
Filho, João Bosco P. Dantas; Angelim, Bruno Maciel; Guedes, Joana Pimentel; de Castro, Marcelo Augusto Farias; Neto, José de Paula Barros
2016-12-01
Traditionally, the design coordination process is carried out by overlaying and comparing 2D drawings made by different project participants. Detecting information errors from a composite drawing is especially challenging and error prone. This procedure usually leaves many design errors undetected until construction begins, which typically leads to rework. Correcting conflict issues that were not identified during the design and coordination phase reduces the overall productivity of everyone involved in the construction process, and the identification of construction issues in the field generates Requests for Information (RFIs), which are one cause of delays. The application of Virtual Design and Construction (VDC) tools to the coordination process can bring significant value to architecture, structure, and mechanical, electrical, and plumbing (MEP) designs in terms of a reduced number of undetected errors and requests for information. This paper focuses on evaluating requests for information (RFIs) associated with the water/sanitary facilities of a BIM model. It is thus expected to improve water/sanitary facility designs, as well as to help the virtual construction team notice and identify design problems. This is an exploratory and descriptive research study using a qualitative methodology. The study classifies RFIs into six categories: correction, omission, validation of information, modification, divergence of information, and verification. The results demonstrate VDC's contribution to improving plumbing system designs. Recommendations are suggested to identify and avoid these RFI types in the plumbing system design process or during virtual construction.
Sullivan, Mary C.; Msall, Michael E.; Miller, Robin J.
2012-01-01
Purpose: The purpose of this study was to comprehensively examine physical, neurological, and psychological health in a U.S. sample of 180 infants followed to age 17. Design & Methods: The World Health Organization International Classification of Functioning, Disability and Health model framed the health-related domains and contextual factors. Assessments included growth, chronic conditions, neurological status, and psychological health. Results: Physical health, growth, and neurological outcomes were poorer in the preterm groups. Minor neurological impairment was related to integrative function. Preterm survivors reported higher rates of depression, anxiety, and inattention/hyperactivity. Practice Implications: Complex health challenges confront preterm survivors in late adolescence, suggesting the necessity of continued health surveillance. PMID:22734876
Alsalem, M A; Zaidan, A A; Zaidan, B B; Hashim, M; Madhloom, H T; Azeez, N D; Alsyisuf, S
2018-05-01
Acute leukaemia diagnosis is a field requiring automated solutions, tools and methods, and the ability to facilitate early detection and even prediction. Many studies have focused on the automatic detection and classification of acute leukaemia and its subtypes to enable highly accurate diagnosis. This study aimed to review and analyse the literature on the detection and classification of acute leukaemia. To improve understanding of the field's various contextual aspects, the published studies were characterised in terms of motivation, the open challenges that confronted researchers, and the recommendations presented to researchers to enhance this vital research area. We systematically searched all articles about the classification and detection of acute leukaemia, as well as their evaluation and benchmarking, in three main databases: ScienceDirect, Web of Science and IEEE Xplore, from 2007 to 2017. These indices were considered sufficiently extensive to encompass our field of literature. Based on our inclusion and exclusion criteria, 89 articles were selected. Most studies (58/89) focused on methods or algorithms for acute leukaemia classification, a number of papers (22/89) covered systems developed for the detection or diagnosis of acute leukaemia, and a few papers (5/89) presented evaluation and comparative studies. The smallest portion (4/89) of articles comprised reviews and surveys. Research areas in medical-image classification vary, but they are all equally vital. We expect this systematic review to help highlight current research opportunities and thus extend and create additional research fields. Copyright © 2018 Elsevier B.V. All rights reserved.
Guo, Yang; Liu, Shuhui; Li, Zhanhuai; Shang, Xuequn
2018-04-11
The classification of cancer subtypes is of great importance to cancer diagnosis and therapy. Many supervised learning approaches have been applied to cancer subtype classification in the past few years, especially deep learning based approaches. Recently, the deep forest model has been proposed as an alternative to deep neural networks for learning hyper-representations using cascades of ensemble decision trees, and it has been shown to have competitive or even better performance than deep neural networks to some extent. However, the standard deep forest model may face overfitting and ensemble-diversity challenges when dealing with small-sample-size, high-dimensional biology data. In this paper, we propose a deep learning model, called BCDForest, to address cancer subtype classification on small-scale biology datasets; it can be viewed as a modification of the standard deep forest model. BCDForest differs from the standard deep forest model in two main ways. First, a multi-class-grained scanning method is proposed to train multiple binary classifiers to encourage ensemble diversity, while the fitting quality of each classifier is considered in representation learning. Second, we propose a boosting strategy to emphasize more important features in the cascade forests, thus propagating the benefits of discriminative features among cascade layers to improve classification performance. Systematic comparison experiments on both microarray and RNA-Seq gene expression datasets demonstrate that our method consistently outperforms state-of-the-art methods in cancer subtype classification. The multi-class-grained scanning and boosting strategies in our model provide an effective solution to ease the overfitting challenge and improve the robustness of the deep forest model on small-scale data. Our model provides a useful approach to the classification of cancer subtypes by applying deep learning to high-dimensional, small-scale biology data.
New tools for evaluating LQAS survey designs
2014-01-01
Lot Quality Assurance Sampling (LQAS) surveys have become increasingly popular in global health care applications. Incorporating Bayesian ideas into LQAS survey design, such as using reasonable prior beliefs about the distribution of an indicator, can improve the selection of design parameters and decision rules. In this paper, a joint frequentist and Bayesian framework is proposed for evaluating LQAS classification accuracy and informing survey design parameters. Simple software tools are provided for calculating the positive and negative predictive value of a design with respect to an underlying coverage distribution and the selected design parameters. These tools are illustrated using a data example from two consecutive LQAS surveys measuring Oral Rehydration Solution (ORS) preparation. Using the survey tools, the dependence of classification accuracy on benchmark selection and the width of the ‘grey region’ are clarified in the context of ORS preparation across seven supervision areas. Following the completion of an LQAS survey, estimation of the distribution of coverage across areas facilitates quantifying classification accuracy and can help guide intervention decisions. PMID:24528928
New tools for evaluating LQAS survey designs.
Hund, Lauren
2014-02-15
Lot Quality Assurance Sampling (LQAS) surveys have become increasingly popular in global health care applications. Incorporating Bayesian ideas into LQAS survey design, such as using reasonable prior beliefs about the distribution of an indicator, can improve the selection of design parameters and decision rules. In this paper, a joint frequentist and Bayesian framework is proposed for evaluating LQAS classification accuracy and informing survey design parameters. Simple software tools are provided for calculating the positive and negative predictive value of a design with respect to an underlying coverage distribution and the selected design parameters. These tools are illustrated using a data example from two consecutive LQAS surveys measuring Oral Rehydration Solution (ORS) preparation. Using the survey tools, the dependence of classification accuracy on benchmark selection and the width of the 'grey region' are clarified in the context of ORS preparation across seven supervision areas. Following the completion of an LQAS survey, estimation of the distribution of coverage across areas facilitates quantifying classification accuracy and can help guide intervention decisions.
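A small sketch of the kind of calculation these tools perform: given a design (n, d), benchmarks, and an assumed Beta prior for coverage, integrate the binomial classification probabilities against the prior to get the design's positive and negative predictive values. The benchmarks and prior below are illustrative assumptions, not values from the paper.

```python
from scipy.stats import binom, beta
from scipy.integrate import quad

# Evaluate an LQAS design (sample size n, decision rule d) against an assumed
# Beta prior for coverage p: classify an area as "adequate" if at least d of
# the n sampled individuals prepare ORS correctly. The benchmarks p_u and p_l
# and the Beta(5, 3) prior are illustrative assumptions.
def lqas_predictive_values(n, d, p_u, p_l, prior=beta(5, 3)):
    prob_high = lambda p: binom.sf(d - 1, n, p)    # P(classified adequate | p)
    prob_low = lambda p: binom.cdf(d - 1, n, p)    # P(classified inadequate | p)
    f = prior.pdf

    num_ppv, _ = quad(lambda p: prob_high(p) * f(p), p_u, 1)
    den_ppv, _ = quad(lambda p: prob_high(p) * f(p), 0, 1)
    num_npv, _ = quad(lambda p: prob_low(p) * f(p), 0, p_l)
    den_npv, _ = quad(lambda p: prob_low(p) * f(p), 0, 1)
    return num_ppv / den_ppv, num_npv / den_npv

ppv, npv = lqas_predictive_values(n=19, d=13, p_u=0.8, p_l=0.5)
print(f"PPV = {ppv:.3f}, NPV = {npv:.3f}")
```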
ERIC Educational Resources Information Center
Harris, Christopher
2013-01-01
In this article the author explores how a new library classification system might be designed using some aspects of the Dewey Decimal Classification (DDC) and ideas from other systems to create something that works for school libraries in the year 2020. By examining what works well with the Dewey Decimal System, what features should be carried…
The Alaska vegetation classification.
L.A. Viereck; C.T. Dyrness; A.R. Batten; K.J. Wenzlick
1992-01-01
The Alaska vegetation classification presented here is a comprehensive, statewide system that has been under development since 1976. The classification is based, as much as possible, on the characteristics of the vegetation itself and is designed to categorize existing vegetation, not potential vegetation. A hierarchical system with five levels of resolution is used...
Non-parametric transient classification using adaptive wavelets
NASA Astrophysics Data System (ADS)
Varughese, Melvin M.; von Sachs, Rainer; Stephanou, Michael; Bassett, Bruce A.
2015-11-01
Classifying transients based on multiband light curves is a challenging but crucial problem in the era of GAIA and the Large Synoptic Survey Telescope, since the sheer volume of transients will make spectroscopic classification unfeasible. We present a non-parametric classifier that predicts the transient's class given training data. It implements two novel components: the use of the BAGIDIS wavelet methodology - a characterization of functional data using hierarchical wavelet coefficients - as well as the introduction of a ranked probability classifier on the wavelet coefficients that handles both the heteroscedasticity of the data and the potential non-representativity of the training set. The classifier is simple to implement, and a major advantage of the BAGIDIS wavelets is that they are translation invariant; hence, BAGIDIS does not need the light curves to be aligned to extract features. Further, BAGIDIS is non-parametric, so it can be used effectively in blind searches for new objects. We demonstrate the effectiveness of our classifier on the Supernova Photometric Classification Challenge, correctly classifying supernova light curves as Type Ia or non-Ia. We train our classifier on the spectroscopically confirmed subsample (which is not representative) and show that it works well for supernovae with observed light-curve time spans greater than 100 d (roughly 55 per cent of the data set). For such data, we obtain a Ia efficiency of 80.5 per cent and a purity of 82.4 per cent, yielding a highly competitive challenge score of 0.49. This indicates that our `model-blind' approach may be particularly suitable for the general classification of astronomical transients in the era of large synoptic sky surveys.
Chambert, Thierry A.; Waddle, J. Hardin; Miller, David A.W.; Walls, Susan; Nichols, James D.
2018-01-01
The development and use of automated species-detection technologies, such as acoustic recorders, for monitoring wildlife are rapidly expanding. Automated classification algorithms provide a cost- and time-effective means to process information-rich data, but often at the cost of additional detection errors. Appropriate methods are necessary to analyse such data while dealing with the different types of detection errors.We developed a hierarchical modelling framework for estimating species occupancy from automated species-detection data. We explore design and optimization of data post-processing procedures to account for detection errors and generate accurate estimates. Our proposed method accounts for both imperfect detection and false positive errors and utilizes information about both occurrence and abundance of detections to improve estimation.Using simulations, we show that our method provides much more accurate estimates than models ignoring the abundance of detections. The same findings are reached when we apply the methods to two real datasets on North American frogs surveyed with acoustic recorders.When false positives occur, estimator accuracy can be improved when a subset of detections produced by the classification algorithm is post-validated by a human observer. We use simulations to investigate the relationship between accuracy and effort spent on post-validation, and found that very accurate occupancy estimates can be obtained with as little as 1% of data being validated.Automated monitoring of wildlife provides opportunity and challenges. Our methods for analysing automated species-detection data help to meet key challenges unique to these data and will prove useful for many wildlife monitoring programs.
Intelligence system based classification approach for medical disease diagnosis
NASA Astrophysics Data System (ADS)
Sagir, Abdu Masanawa; Sathasivam, Saratha
2017-08-01
The prediction of breast cancer in women who have no signs or symptoms of the disease, as well as of survivability after surgery, has been a challenging problem for medical researchers. The decision about the presence or absence of disease often depends more on the physician's intuition, experience and skill in comparing current indicators with previous ones than on the knowledge-rich data hidden in a database, and making this decision is a crucial and challenging task. The goal is to predict patient condition using an adaptive neuro-fuzzy inference system (ANFIS) pre-processed by grid partitioning. To achieve an accurate diagnosis at this complex stage of symptom analysis, the physician may need an efficient diagnosis system. A framework is described for designing and evaluating the classification performance of two discrete ANFIS systems trained with hybrid learning algorithms, namely least-squares estimation combined with a modified Levenberg-Marquardt algorithm and with gradient descent, which could be used by physicians to accelerate the diagnosis process. The proposed method's performance was evaluated on training and test data from the Mammographic Mass and Haberman's Survival datasets, obtained from the benchmark collection of the University of California, Irvine (UCI) machine learning repository. The robustness of the performance measures, namely total accuracy, sensitivity and specificity, is examined. The proposed method achieves superior performance when compared to a conventional ANFIS based on the gradient descent algorithm and to some related existing methods. The software used for the implementation was MATLAB R2014a (version 8.3), executed on a PC with an Intel Pentium IV E7400 processor with 2.80 GHz speed and 2.0 GB of RAM.
NASA Astrophysics Data System (ADS)
Diamant, Idit; Shalhon, Moran; Goldberger, Jacob; Greenspan, Hayit
2016-03-01
Classification of clustered breast microcalcifications into benign and malignant categories is an extremely challenging task for computerized algorithms and expert radiologists alike. In this paper we present a novel method for feature selection based on mutual information (MI) criterion for automatic classification of microcalcifications. We explored the MI based feature selection for various texture features. The proposed method was evaluated on a standardized digital database for screening mammography (DDSM). Experimental results demonstrate the effectiveness and the advantage of using the MI-based feature selection to obtain the most relevant features for the task and thus to provide for improved performance as compared to using all features.
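A minimal sketch of MI-based feature selection feeding a classifier, using a generic tabular dataset as a stand-in for the microcalcification texture features; the choice of k and the SVM back-end are assumptions.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Mutual-information-based feature selection ahead of a classifier, using a
# generic tabular dataset as a stand-in for texture features from ROIs.
X, y = load_breast_cancer(return_X_y=True)

for k in (5, 10, X.shape[1]):
    model = make_pipeline(
        StandardScaler(),
        SelectKBest(score_func=mutual_info_classif, k=k),
        SVC(kernel="rbf", gamma="scale"),
    )
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"top {k:2d} features by MI: accuracy {acc:.3f}")
```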
Elliott, Caroline M.; Jacobson, Robert B.
2006-01-01
A multiscale geomorphic classification was established for the 39-mile, 59-mile, and adjacent segments of the Missouri National Recreational River administered by the National Park Service in South Dakota and Nebraska. The objective of the classification was to define naturally occurring clusters of geomorphic characteristics that would be indicative of discrete sets of geomorphic processes, with the intent that such a classification would be useful in river-management and rehabilitation decisions. The statistical classification was based on geomorphic characteristics of the river collected from 1999 orthophotography and the persistence of classified units was evaluated by comparison with similar datasets for 2003 and 2004 and by evaluating variation of bank erosion rates by geomorphic class. Changes in channel location and form were also explored using imagery and maps from 1993-2004, 1941 and 1894. The multivariate classification identified a hierarchy of naturally occurring clusters of reach-scale geomorphic characteristics. The simplest level of the hierarchy divides the river from segments into discrete reaches characterized by single and multithread channels and additional hierarchical levels established 4-part and 10-part classifications. The classification system presents a physical framework that can be applied to prioritization and design of bank stabilization projects, design of habitat rehabilitation projects, and stratification of monitoring and assessment sampling programs.
Spectral-spatial classification of hyperspectral imagery with cooperative game
NASA Astrophysics Data System (ADS)
Zhao, Ji; Zhong, Yanfei; Jia, Tianyi; Wang, Xinyu; Xu, Yao; Shu, Hong; Zhang, Liangpei
2018-01-01
Spectral-spatial classification is known to be an effective way to improve classification performance by integrating spectral information and spatial cues for hyperspectral imagery. In this paper, a game-theoretic spectral-spatial classification algorithm (GTA) using a conditional random field (CRF) model is presented, in which CRF is used to model the image considering the spatial contextual information, and a cooperative game is designed to obtain the labels. The algorithm establishes a one-to-one correspondence between image classification and game theory. The pixels of the image are considered as the players, and the labels are considered as the strategies in a game. Similar to the idea of soft classification, the uncertainty is considered to build the expected energy model in the first step. The local expected energy can be quickly calculated, based on a mixed strategy for the pixels, to establish the foundation for a cooperative game. Coalitions can then be formed by the designed merge rule based on the local expected energy, so that a majority game can be performed to make a coalition decision to obtain the label of each pixel. The experimental results on three hyperspectral data sets demonstrate the effectiveness of the proposed classification algorithm.
Wauters, Lauri D J; Miguel-Moragas, Joan San; Mommaerts, Maurice Y
2015-11-01
The aim of this study was to gain insight into the methodology of different computer-aided design and computer-aided manufacturing (CAD-CAM) applications for the reconstruction of cranio-maxillo-facial (CMF) defects. We reviewed and analyzed the available literature pertaining to CAD-CAM for use in CMF reconstruction and proposed a classification system for the techniques of implant design and manufacturing and of cutting, drilling, and/or guiding template design and manufacturing. The system consists of four classes (I-IV), which combine the techniques used for both the implant and the template to most accurately describe the methodology used. Our classification system can be widely applied and should facilitate communication and immediate understanding of the methodology of CAD-CAM applications for the reconstruction of CMF defects.
Charting the landscape of priority problems in psychiatry, part 1: classification and diagnosis.
Stephan, Klaas E; Bach, Dominik R; Fletcher, Paul C; Flint, Jonathan; Frank, Michael J; Friston, Karl J; Heinz, Andreas; Huys, Quentin J M; Owen, Michael J; Binder, Elisabeth B; Dayan, Peter; Johnstone, Eve C; Meyer-Lindenberg, Andreas; Montague, P Read; Schnyder, Ulrich; Wang, Xiao-Jing; Breakspear, Michael
2016-01-01
Contemporary psychiatry faces major challenges. Its syndrome-based disease classification is not based on mechanisms and does not guide treatment, which largely depends on trial and error. The development of therapies is hindered by ignorance of potential beneficiary patient subgroups. Neuroscientific and genetics research have yet to affect disease definitions or contribute to clinical decision making. In this challenging setting, what should psychiatric research focus on? In two companion papers, we present a list of problems nominated by clinicians and researchers from different disciplines as candidates for future scientific investigation of mental disorders. These problems are loosely grouped into challenges concerning nosology and diagnosis (this Personal View) and problems related to pathogenesis and aetiology (in the companion Personal View). Motivated by successful examples in other disciplines, particularly the list of Hilbert's problems in mathematics, this subjective and eclectic list of priority problems is intended for psychiatric researchers, helping to re-focus existing research and providing perspectives for future psychiatric science. Copyright © 2016 Elsevier Ltd. All rights reserved.
A new EMI system for detection and classification of challenging targets
NASA Astrophysics Data System (ADS)
Shubitidze, F.; Fernández, J. P.; Barrowes, B. E.; O'Neill, K.
2013-06-01
Advanced electromagnetic induction (EMI) sensors currently feature multi-axis illumination of targets and tri-axial vector sensing (e.g., MetalMapper), or exploit multi-static array data acquisition (e.g., TEMTADS). They produce data of high density, quality, and diversity, and have been combined with advanced EMI models to provide superb classification performance relative to the previous generation of single-axis, monostatic sensors. However, these advances have yet to significantly improve our ability to classify small, deep, and otherwise challenging targets. In particular, recent live-site discrimination studies at Camp Butner, NC and Camp Beale, CA have revealed that it is more challenging to detect and discriminate small munitions (with calibers ranging from 20 mm to 60 mm) than larger ones. In addition, a live-site test at the Massachusetts Military Reservation, MA highlighted the difficulties current sensors have in classifying large, deep, and overlapping targets with high confidence. There are two main approaches to overcoming these problems: 1) adapt advanced EMI models to the existing systems, and 2) improve the detection limits of current sensors by modifying their hardware. In this paper we demonstrate a combined software/hardware approach that will provide extended detection range and spatial resolution to next-generation EMI systems; we analyze and invert EMI data to extract classification features for small and deep targets; and we propose a new system that features a large transmitter coil.
9 CFR 78.41 - State/area classification.
Code of Federal Regulations, 2012 CFR
2012-01-01
... AGRICULTURE INTERSTATE TRANSPORTATION OF ANIMALS (INCLUDING POULTRY) AND ANIMAL PRODUCTS BRUCELLOSIS Designation of Brucellosis Areas § 78.41 State/area classification. (a) Class Free. Alabama, Alaska, Arizona...
9 CFR 78.41 - State/area classification.
Code of Federal Regulations, 2013 CFR
2013-01-01
... AGRICULTURE INTERSTATE TRANSPORTATION OF ANIMALS (INCLUDING POULTRY) AND ANIMAL PRODUCTS BRUCELLOSIS Designation of Brucellosis Areas § 78.41 State/area classification. (a) Class Free. Alabama, Alaska, Arizona...
9 CFR 78.41 - State/area classification.
Code of Federal Regulations, 2011 CFR
2011-01-01
... AGRICULTURE INTERSTATE TRANSPORTATION OF ANIMALS (INCLUDING POULTRY) AND ANIMAL PRODUCTS BRUCELLOSIS Designation of Brucellosis Areas § 78.41 State/area classification. (a) Class Free. Alabama, Alaska, Arizona...
Synthetic biology routes to bio-artificial intelligence.
Nesbeth, Darren N; Zaikin, Alexey; Saka, Yasushi; Romano, M Carmen; Giuraniuc, Claudiu V; Kanakov, Oleg; Laptyeva, Tetyana
2016-11-30
The design of synthetic gene networks (SGNs) has advanced to the extent that novel genetic circuits are now being tested for their ability to recapitulate archetypal learning behaviours first defined in the fields of machine and animal learning. Here, we discuss the biological implementation of a perceptron algorithm for linear classification of input data. An expansion of this biological design that encompasses cellular 'teachers' and 'students' is also examined. We also discuss implementation of Pavlovian associative learning using SGNs and present an example of such a scheme and in silico simulation of its performance. In addition to designed SGNs, we also consider the option to establish conditions in which a population of SGNs can evolve diversity in order to better contend with complex input data. Finally, we compare recent ethical concerns in the field of artificial intelligence (AI) and the future challenges raised by bio-artificial intelligence (BI). © 2016 The Author(s). This is an open access article published by Portland Press Limited on behalf of the Biochemical Society and distributed under the Creative Commons Attribution License 4.0 (CC BY).
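For reference, the perceptron algorithm for linear classification that the genetic circuits aim to recapitulate can be written in a few lines; the toy data standing in for input chemical concentrations is invented.

```python
import numpy as np

# A minimal perceptron for linear classification of input data, the algorithm
# whose genetic-circuit implementation is discussed in the paper.
def train_perceptron(X, y, epochs=20, lr=1.0):
    """X: (n, d) inputs, y: labels in {-1, +1}. Returns weights and bias."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:      # misclassified: update
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Toy usage: two linearly separable 'input concentration' patterns.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([1, 1], 0.3, (50, 2)), rng.normal([-1, -1], 0.3, (50, 2))])
y = np.repeat([1, -1], 50)
w, b = train_perceptron(X, y)
pred = np.sign(X @ w + b)
print("training accuracy:", (pred == y).mean())
```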
Semantic labeling of high-resolution aerial images using an ensemble of fully convolutional networks
NASA Astrophysics Data System (ADS)
Sun, Xiaofeng; Shen, Shuhan; Lin, Xiangguo; Hu, Zhanyi
2017-10-01
High-resolution remote sensing data classification has been a challenging and promising research topic in the remote sensing community. In recent years, with the rapid advances of deep learning, remarkable progress has been made in this field, facilitating a transition from hand-crafted feature design to automatic end-to-end learning. A deep fully convolutional network (FCN) based ensemble learning method is proposed to label high-resolution aerial images. To fully tap the potential of FCNs, both the Visual Geometry Group network and a deeper residual network, ResNet, are employed. Furthermore, to enlarge the training samples with diversity and gain better generalization, in addition to the commonly used data augmentation methods (e.g., rotation, multiscale, and aspect ratio) in the literature, aerial images from other datasets are also collected for cross-scene learning. Finally, we combine these learned models to form an effective FCN ensemble and refine the results using a fully connected conditional random field graph model. Experiments on the ISPRS 2-D Semantic Labeling Contest dataset show that our proposed end-to-end classification method achieves an overall accuracy of 90.7%, a state-of-the-art result in the field.
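The ensemble-fusion step (without the networks themselves or the CRF refinement) can be sketched as a weighted average of the per-pixel class-probability maps produced by the individual FCNs; the shapes and weights below are illustrative.

```python
import numpy as np

def ensemble_label_map(prob_maps, weights=None):
    """Fuse per-pixel class probabilities from several FCNs.

    prob_maps: list of arrays of shape (H, W, n_classes), one per model
    (e.g. VGG-based and ResNet-based FCNs, possibly at several scales).
    Returns the per-pixel argmax of the (weighted) average probabilities.
    """
    stack = np.stack(prob_maps)                       # (n_models, H, W, C)
    weights = np.ones(len(prob_maps)) if weights is None else np.asarray(weights, dtype=float)
    weights /= weights.sum()
    fused = np.tensordot(weights, stack, axes=1)      # (H, W, C)
    return fused.argmax(axis=-1)

# Toy usage with two random 'softmax' outputs over 6 ISPRS-style classes.
rng = np.random.default_rng(0)
H, W, C = 4, 4, 6
maps = [rng.dirichlet(np.ones(C), size=(H, W)) for _ in range(2)]
print(ensemble_label_map(maps, weights=[0.6, 0.4]))
```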
Classification methodology for tritiated waste requiring interim storage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cana, D.; Dall'ava, D.; Decanis, C.
2015-03-15
Fusion machines like the ITER experimental research facility will use tritium as fuel. Therefore, most of the solid radioactive waste will result not only from activation by 14 MeV neutrons, but also from contamination by tritium. As a consequence, optimizing the treatment process for waste containing tritium (tritiated waste) is a major challenge. This paper summarizes the studies conducted in France within the framework of the French national plan for the management of radioactive materials and waste. The paper recommends a reference program for managing this waste based on its sorting, treatment and packaging by the producer. It also recommends setting up a 50-year temporary storage facility to allow for tritium decay and designing future disposal facilities using tritiated radwaste characteristics as input data. This paper first describes this waste program and then details an optimized classification methodology which takes into account tritium decay over a 50-year storage period. The paper also describes a specific application for purely tritiated waste and discusses the set-up expected to be implemented for ITER decommissioning waste (current assumption). Comparison between this optimized approach and other viable detritiation techniques will be drawn. (authors)
Medulloblastoma in the Molecular Era
Miranda Kuzan-Fischer, Claudia; Juraschka, Kyle; Taylor, Michael D.
2018-01-01
Medulloblastoma is the most common malignant brain tumor of childhood and remains a major cause of cancer-related mortality in children. Significant scientific advancements have transformed the understanding of medulloblastoma, leading to the recognition of four distinct clinical and molecular subgroups, namely wingless (WNT), sonic hedgehog, group 3, and group 4. Subgroup classification combined with the recognition of subgroup-specific molecular alterations has also led to major changes in risk stratification of medulloblastoma patients, and these changes have begun to alter clinical trial design, in which the newly recognized subgroups are being incorporated as individualized treatment arms. Despite these recent advancements, identification of effective targeted therapies remains a challenge for several reasons. First, significant molecular heterogeneity exists within the four subgroups, meaning this classification system alone may not be sufficient to predict response to a particular therapy. Second, the majority of novel agents are currently tested at the time of recurrence, after which significant selective pressures have been exerted by radiation and chemotherapy. Recent studies demonstrate that tumor sub-clones exhibiting genetic divergence from the primary tumor exist within metastatic and recurrent tumor populations. Therefore, tumor resampling at the time of recurrence may become necessary to accurately select patients for personalized therapy. PMID:29742881
sEMG Signal Acquisition Strategy towards Hand FES Control.
Toledo-Peral, Cinthya Lourdes; Gutiérrez-Martínez, Josefina; Mercado-Gutiérrez, Jorge Airy; Martín-Vignon-Whaley, Ana Isabel; Vera-Hernández, Arturo; Leija-Salas, Lorenzo
2018-01-01
Due to damage of the nervous system, patients experience impediments in their daily life: severe fatigue, tremor or impaired hand dexterity, hemiparesis, or hemiplegia. Surface electromyography (sEMG) signal analysis is used to identify motion; however, standardization of electrode placement and classification of sEMG patterns are major challenges. This paper describes a technique used to acquire sEMG signals for five hand motion patterns from six able-bodied subjects using an array of recording and stimulation electrodes placed on the forearm and its effects over functional electrical stimulation (FES) and volitional sEMG combinations, in order to eventually control a sEMG-driven FES neuroprosthesis for upper limb rehabilitation. A two-part protocol was performed. First, personalized templates to place eight sEMG bipolar channels were designed; with these data, a universal template, called forearm electrode set (FELT), was built. Second, volitional and evoked movements were recorded during FES application. 95% classification accuracy was achieved using two sessions per movement. With the FELT, it was possible to perform FES and sEMG recordings simultaneously. Also, it was possible to extract the volitional and evoked sEMG from the raw signal, which is highly important for closed-loop FES control.
Medulloblastoma in the Molecular Era.
Miranda Kuzan-Fischer, Claudia; Juraschka, Kyle; Taylor, Michael D
2018-05-01
Medulloblastoma is the most common malignant brain tumor of childhood and remains a major cause of cancer-related mortality in children. Significant scientific advancements have transformed the understanding of medulloblastoma, leading to the recognition of four distinct clinical and molecular subgroups, namely wingless (WNT), sonic hedgehog, group 3, and group 4. Subgroup classification combined with the recognition of subgroup-specific molecular alterations has also led to major changes in risk stratification of medulloblastoma patients, and these changes have begun to alter clinical trial design, in which the newly recognized subgroups are being incorporated as individualized treatment arms. Despite these recent advancements, identification of effective targeted therapies remains a challenge for several reasons. First, significant molecular heterogeneity exists within the four subgroups, meaning this classification system alone may not be sufficient to predict response to a particular therapy. Second, the majority of novel agents are currently tested at the time of recurrence, after which significant selective pressures have been exerted by radiation and chemotherapy. Recent studies demonstrate that tumor sub-clones exhibiting genetic divergence from the primary tumor exist within metastatic and recurrent tumor populations. Therefore, tumor resampling at the time of recurrence may become necessary to accurately select patients for personalized therapy.
Data Modeling Challenges of Advanced Interoperability.
Blobel, Bernd; Oemig, Frank; Ruotsalainen, Pekka
2018-01-01
Progressive health paradigms, involving many different disciplines and combining multiple policy domains, require advanced interoperability solutions. This results in special challenges for modeling health systems. The paper discusses classification systems for data models and enterprise business architectures and compares them with the ISO Reference Architecture. On that basis, existing definitions, specifications and standards of data models for interoperability are evaluated and their limitations are discussed. Amendments to correctly use those models and to better meet the aforementioned challenges are offered.
An Automatic User-Adapted Physical Activity Classification Method Using Smartphones.
Li, Pengfei; Wang, Yu; Tian, Yu; Zhou, Tian-Shu; Li, Jing-Song
2017-03-01
In recent years, an increasing number of people have become concerned about their health. Most chronic diseases are related to lifestyle, and daily activity records can be used as an important indicator of health. Specifically, using advanced technology to automatically monitor actual activities can effectively prevent and manage chronic diseases. The data used in this paper were obtained from acceleration sensors and gyroscopes integrated in smartphones. We designed an efficient Adaboost-Stump classifier running on a smartphone to classify five common activities: cycling, running, sitting, standing, and walking, and achieved a satisfactory classification accuracy of 98%. We also designed an online learning method in which the classification model is continuously trained with actual data; the model parameters then become increasingly fitted to the specific user, which allows the classification accuracy to reach 95% under different use environments. In addition, this paper utilized the OpenCL framework to parallelize the program, which enhances the computing efficiency approximately ninefold.
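The Adaboost-Stump approach pairs boosting with one-level decision trees. A hedged sketch using scikit-learn is shown below; the feature matrix, labels, and hyperparameters are placeholders, and the smartphone implementation, online-learning step, and OpenCL parallelization are not reproduced. (Note: scikit-learn >= 1.2 uses the `estimator=` argument; older versions use `base_estimator=`.)

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

# Hypothetical feature vectors derived from accelerometer/gyroscope windows
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 12))        # e.g., mean/std/energy per axis
y = rng.integers(0, 5, size=1000)      # 5 activities: cycle, run, sit, stand, walk

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

stump = DecisionTreeClassifier(max_depth=1)      # a decision "stump"
clf = AdaBoostClassifier(estimator=stump, n_estimators=200, learning_rate=0.5)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```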
A soil map of a large watershed in China: applying digital soil mapping in a data sparse region
NASA Astrophysics Data System (ADS)
Barthold, F.; Blank, B.; Wiesmeier, M.; Breuer, L.; Frede, H.-G.
2009-04-01
Prediction of soil classes in data-sparse regions is a major research challenge. With the advent of machine learning, the possibilities to spatially predict soil classes have increased tremendously and given birth to new possibilities in soil mapping. Digital soil mapping is a research field that has been established during the last decades and has been accepted widely. We now need to develop tools to reduce the uncertainty in soil predictions. This is especially challenging in data-sparse regions. One approach to do this is to implement soil taxonomic distance as a classification error criterion in classification and regression trees (CART), as suggested by Minasny et al. (Geoderma 142 (2007) 285-293). This approach assumes that the classification error should be larger between soils that are more dissimilar, i.e. differ in a larger number of soil properties, and smaller between more similar soils. Our study area is the Xilin River Basin, which is located in central Inner Mongolia in China. It is characterized by semi-arid climate conditions and is representative of the naturally occurring steppe ecosystem. The study area comprises 3600 km². We applied a random, stratified sampling design after McKenzie and Ryan (Geoderma 89 (1999) 67-94) with land use and topography as stratifying variables. We defined 10 sampling classes; from each class 14 replicates were randomly drawn and sampled. The dataset was split into 100 soil profiles for training and 40 soil profiles for validation. We then applied classification and regression trees (CART) to quantify the relationships between soil classes and environmental covariates. The classification tree explained 75.5% of the variance with land use and geology as the most important predictor variables. Among the 8 soil classes that we predicted, the Kastanozems cover most of the area. They are predominantly found in steppe areas. However, even some of the soils at sand dune sites, which were thought to show only little soil formation, can be classified as Kastanozems. Besides the Kastanozems, Regosols are most common at the sand dune sites as well as at sites defined as bare soil, which are characterized by little or no vegetation. Gleysols are mostly found at sites in the vicinity of the Xilin river, which are connected to the groundwater. They can also be found in small valleys or depressions where sub-surface waters from neighboring areas collect. The richest soils are found in mountain meadow areas. Pedogenetic conditions here are most favorable and lead to the formation of Chernozems with deep humic Ah horizons. Other soil types that occur in the study area are Arenosols, Calcisols, Cambisols and Phaeozems. In addition, soil taxonomic distance is implemented into the decision tree procedure as a measure of classification error. The results of incorporating taxonomic distance as a loss function in the decision tree will be compared with the standard application of the decision tree.
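To illustrate the idea of weighting classification error by soil taxonomic distance (after Minasny et al.), the sketch below trains an ordinary classification tree and then scores its predictions with a taxonomic distance matrix instead of a 0/1 loss. The distance values, class names, and features are hypothetical; embedding the distance directly into the tree-splitting criterion, as proposed, would require a custom CART implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

classes = ["Kastanozem", "Regosol", "Gleysol", "Chernozem"]
# Hypothetical pairwise taxonomic distances (0 = identical class, 1 = maximally dissimilar)
D = np.array([[0.0, 0.4, 0.7, 0.3],
              [0.4, 0.0, 0.8, 0.6],
              [0.7, 0.8, 0.0, 0.5],
              [0.3, 0.6, 0.5, 0.0]])

rng = np.random.default_rng(1)
X = rng.normal(size=(140, 5))                # environmental covariates (land use, geology codes, ...)
y = rng.integers(0, len(classes), size=140)  # soil class indices

tree = DecisionTreeClassifier(max_depth=4).fit(X[:100], y[:100])
pred = tree.predict(X[100:])

plain_error = np.mean(pred != y[100:])       # conventional 0/1 error
taxonomic_error = np.mean(D[pred, y[100:]])  # error weighted by class dissimilarity
print(f"0/1 error: {plain_error:.2f}, mean taxonomic distance: {taxonomic_error:.2f}")
```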
Towards ontology-driven navigation of the lipid bibliosphere
Baker, Christopher JO; Kanagasabai, Rajaraman; Ang, Wee Tiong; Veeramani, Anitha; Low, Hong-Sang; Wenk, Markus R
2008-01-01
Background: The indexing of scientific literature and content is a relevant and contemporary requirement within life science information systems. Navigating information available in legacy formats continues to be a challenge both in enterprise and academic domains. The emergence of semantic web technologies and their fusion with artificial intelligence techniques has provided a new toolkit with which to address these data integration challenges. In the emerging field of lipidomics, such navigation challenges are barriers to the translation of scientific results into actionable knowledge, critical to the treatment of diseases such as Alzheimer's syndrome, Mycobacterium infections and cancer. Results: We present a literature-driven workflow involving document delivery and natural language processing steps generating tagged sentences containing lipid, protein and disease names, which are instantiated into a custom-designed lipid ontology. We describe the design challenges in capturing lipid nomenclature, the mandate of the ontology and its role as a query model in the navigation of the lipid bibliosphere. We illustrate the extent of the description logic-based A-box query capability provided by the instantiated ontology using a graphical query composer to query sentences describing lipid-protein and lipid-disease correlations. Conclusion: As scientists accept the need to readjust the manner in which we search for information and derive knowledge, we illustrate a system that can constrain the literature explosion and knowledge navigation problems. Specifically, we have focussed on solving this challenge for lipidomics researchers, who have to deal with the lack of standardized vocabulary, differing classification schemes, and a wide array of synonyms before being able to derive scientific insights. The use of the OWL-DL variant of the Web Ontology Language (OWL) and description logic reasoning is pivotal in this regard, providing the lipid scientist with advanced query access to the results of text mining algorithms instantiated into the ontology. The visual query paradigm assists in the adoption of this technology. PMID:18315858
Towards ontology-driven navigation of the lipid bibliosphere.
Baker, Christopher Jo; Kanagasabai, Rajaraman; Ang, Wee Tiong; Veeramani, Anitha; Low, Hong-Sang; Wenk, Markus R
2008-01-01
The indexing of scientific literature and content is a relevant and contemporary requirement within life science information systems. Navigating information available in legacy formats continues to be a challenge both in enterprise and academic domains. The emergence of semantic web technologies and their fusion with artificial intelligence techniques has provided a new toolkit with which to address these data integration challenges. In the emerging field of lipidomics, such navigation challenges are barriers to the translation of scientific results into actionable knowledge, critical to the treatment of diseases such as Alzheimer's syndrome, Mycobacterium infections and cancer. We present a literature-driven workflow involving document delivery and natural language processing steps generating tagged sentences containing lipid, protein and disease names, which are instantiated into a custom-designed lipid ontology. We describe the design challenges in capturing lipid nomenclature, the mandate of the ontology and its role as a query model in the navigation of the lipid bibliosphere. We illustrate the extent of the description logic-based A-box query capability provided by the instantiated ontology using a graphical query composer to query sentences describing lipid-protein and lipid-disease correlations. As scientists accept the need to readjust the manner in which we search for information and derive knowledge, we illustrate a system that can constrain the literature explosion and knowledge navigation problems. Specifically, we have focussed on solving this challenge for lipidomics researchers, who have to deal with the lack of standardized vocabulary, differing classification schemes, and a wide array of synonyms before being able to derive scientific insights. The use of the OWL-DL variant of the Web Ontology Language (OWL) and description logic reasoning is pivotal in this regard, providing the lipid scientist with advanced query access to the results of text mining algorithms instantiated into the ontology. The visual query paradigm assists in the adoption of this technology.
The Carnegie Classification of Institutions of Higher Education. 2000 Edition. A Technical Report.
ERIC Educational Resources Information Center
Carnegie Foundation for the Advancement of Teaching, Menlo Park, CA.
The Carnegie Classification of Institutions of Higher Education is the framework in which institutional diversity in United States higher education is commonly described. Developed in 1971, the Classification was designed to support research in higher education by identifying categories of colleges and universities that would be homogeneous with…
Variance estimates and confidence intervals for the Kappa measure of classification accuracy
M. A. Kalkhan; R. M. Reich; R. L. Czaplewski
1997-01-01
The Kappa statistic is frequently used to characterize the results of an accuracy assessment used to evaluate land use and land cover classifications obtained by remotely sensed data. This statistic allows comparisons of alternative sampling designs, classification algorithms, photo-interpreters, and so forth. In order to make these comparisons, it is...
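For reference, a short sketch of the Kappa computation with a large-sample variance approximation and normal-theory confidence interval is given below; the confusion matrix is a made-up example, and the variance formula is the simplified delta-method approximation rather than the full Fleiss-Cohen-Everitt expression.

```python
import numpy as np

def kappa_with_ci(conf_mat, z=1.96):
    """Cohen's kappa from an error (confusion) matrix, with an approximate 95% CI."""
    cm = np.asarray(conf_mat, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                                  # observed agreement
    pe = np.sum(cm.sum(axis=0) * cm.sum(axis=1)) / n**2    # chance agreement
    kappa = (po - pe) / (1.0 - pe)
    var = po * (1.0 - po) / (n * (1.0 - pe) ** 2)          # simplified large-sample approximation
    half = z * np.sqrt(var)
    return kappa, (kappa - half, kappa + half)

# Hypothetical 3-class land-cover confusion matrix (rows = mapped, cols = reference)
cm = [[48, 5, 2],
      [6, 40, 4],
      [3, 7, 35]]
k, ci = kappa_with_ci(cm)
print(f"kappa = {k:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```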
Comparative Analysis of RF Emission Based Fingerprinting Techniques for ZigBee Device Classification
quantify the differences in various RF fingerprinting techniques via comparative analysis of MDA/ML classification results. The findings herein demonstrate...correct classification rates followed by COR-DNA and then RF-DNA in most test cases and especially in low Eb/N0 ranges, where ZigBee is designed to operate.
ERIC Educational Resources Information Center
Cartwright, Kelly B.
2002-01-01
A reading-specific multiple classification task was designed that required children to classify printed words along phonological and semantic dimensions simultaneously. Reading-specific multiple classification skill made a unique contribution to children's reading comprehension over contributions made by age, domain-general multiple classification…
Linear Classification of Dairy Cattle. Slide Script.
ERIC Educational Resources Information Center
Sipiorski, James; Spike, Peter
This slide script, part of a series of slide scripts designed for use in vocational agriculture classes, deals with principles of the linear classification of dairy cattle. Included in the guide are narrations for use with 63 slides, which illustrate the following areas that are considered in the linear classification system: stature, strength,…
Optimizing spectral CT parameters for material classification tasks
NASA Astrophysics Data System (ADS)
Rigie, D. S.; La Rivière, P. J.
2016-06-01
In this work, we propose a framework for optimizing spectral CT imaging parameters and hardware design with regard to material classification tasks. Compared with conventional CT, many more parameters must be considered when designing spectral CT systems and protocols. These choices will impact material classification performance in a non-obvious, task-dependent way with direct implications for radiation dose reduction. In light of this, we adapt Hotelling Observer formalisms typically applied to signal detection tasks to the spectral CT, material-classification problem. The result is a rapidly computable metric that makes it possible to sweep out many system configurations, generating parameter optimization curves (POC’s) that can be used to select optimal settings. The proposed model avoids restrictive assumptions about the basis-material decomposition (e.g. linearity) and incorporates signal uncertainty with a stochastic object model. This technique is demonstrated on dual-kVp and photon-counting systems for two different, clinically motivated material classification tasks (kidney stone classification and plaque removal). We show that the POC’s predicted with the proposed analytic model agree well with those derived from computationally intensive numerical simulation studies.
Optimizing Spectral CT Parameters for Material Classification Tasks
Rigie, D. S.; La Rivière, P. J.
2017-01-01
In this work, we propose a framework for optimizing spectral CT imaging parameters and hardware design with regard to material classification tasks. Compared with conventional CT, many more parameters must be considered when designing spectral CT systems and protocols. These choices will impact material classification performance in a non-obvious, task-dependent way with direct implications for radiation dose reduction. In light of this, we adapt Hotelling Observer formalisms typically applied to signal detection tasks to the spectral CT, material-classification problem. The result is a rapidly computable metric that makes it possible to sweep out many system configurations, generating parameter optimization curves (POC’s) that can be used to select optimal settings. The proposed model avoids restrictive assumptions about the basis-material decomposition (e.g. linearity) and incorporates signal uncertainty with a stochastic object model. This technique is demonstrated on dual-kVp and photon-counting systems for two different, clinically motivated material classification tasks (kidney stone classification and plaque removal). We show that the POC’s predicted with the proposed analytic model agree well with those derived from computationally intensive numerical simulation studies. PMID:27227430
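As a hedged illustration of the kind of rapidly computable, observer-based metric described above, the sketch below evaluates a Hotelling-observer-style separability index between the measured data distributions of two material classes; the mean vectors and covariances are invented placeholders, not values or formulas taken from the paper.

```python
import numpy as np

def hotelling_separability(mu_a, mu_b, cov_a, cov_b):
    """Squared Hotelling-observer SNR for discriminating two classes with mean
    measurement vectors mu_a/mu_b and covariances cov_a/cov_b (equal priors assumed)."""
    d = mu_a - mu_b
    S = 0.5 * (cov_a + cov_b)                # average intra-class covariance
    return float(d @ np.linalg.solve(S, d))  # d^T S^{-1} d

# Hypothetical 2-channel (e.g., low/high kVp) measurements for two materials
mu_a, mu_b = np.array([1.00, 0.80]), np.array([0.95, 0.90])
cov_a = np.array([[0.010, 0.002], [0.002, 0.012]])
cov_b = np.array([[0.011, 0.001], [0.001, 0.010]])
print("SNR^2 =", hotelling_separability(mu_a, mu_b, cov_a, cov_b))
```

Sweeping such an index over candidate acquisition settings is one way to generate the kind of parameter optimization curves the abstract describes.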
Hierarchy-associated semantic-rule inference framework for classifying indoor scenes
NASA Astrophysics Data System (ADS)
Yu, Dan; Liu, Peng; Ye, Zhipeng; Tang, Xianglong; Zhao, Wei
2016-03-01
Typically, the initial task of classifying indoor scenes is challenging, because the spatial layout and decoration of a scene can vary considerably. Recent efforts at classifying object relationships commonly depend on the results of scene annotation and predefined rules, making classification inflexible. Furthermore, annotation results are easily affected by external factors. Inspired by human cognition, a scene-classification framework was proposed using the empirically based annotation (EBA) and a match-over rule-based (MRB) inference system. The semantic hierarchy of images is exploited by EBA to construct rules empirically for MRB classification. The problem of scene classification is divided into low-level annotation and high-level inference from a macro perspective. Low-level annotation involves detecting the semantic hierarchy and annotating the scene with a deformable-parts model and a bag-of-visual-words model. In high-level inference, hierarchical rules are extracted to train the decision tree for classification. The categories of testing samples are generated from the parts to the whole. Compared with traditional classification strategies, the proposed semantic hierarchy and corresponding rules reduce the effect of a variable background and improve the classification performance. The proposed framework was evaluated on a popular indoor scene dataset, and the experimental results demonstrate its effectiveness.
NASA Astrophysics Data System (ADS)
Fujita, Yusuke; Mitani, Yoshihiro; Hamamoto, Yoshihiko; Segawa, Makoto; Terai, Shuji; Sakaida, Isao
2017-03-01
Ultrasound imaging is a popular, non-invasive tool used in the diagnosis of liver disease. Cirrhosis is a chronic liver disease that can advance to liver cancer. Early detection and appropriate treatment are crucial to prevent liver cancer. However, ultrasound image analysis is very challenging because of the low signal-to-noise ratio of ultrasound images. To achieve higher classification performance, the selection of training regions of interest (ROIs) is very important because it affects classification accuracy. The purpose of our study is cirrhosis detection with high accuracy using liver ultrasound images. In our previous work, training ROI selection by MILBoost and multiple-ROI classification based on the product rule were proposed to achieve high classification performance. In this article, we propose a self-training method to select training ROIs effectively. Evaluation experiments were performed to assess the effect of self-training, using both manually and automatically selected ROIs. Experimental results show that self-training on manually selected ROIs achieved higher classification performance than other approaches, including our conventional methods. Manual ROI definition and sample selection are important for improving classification accuracy in cirrhosis detection using ultrasound images.
NASA Astrophysics Data System (ADS)
Cui, Binge; Ma, Xiudan; Xie, Xiaoyun; Ren, Guangbo; Ma, Yi
2017-03-01
The classification of hyperspectral images with few labeled samples is a major challenge that is difficult to meet unless some spatial characteristics can be exploited. In this study, we propose a novel spectral-spatial hyperspectral image classification method that exploits the spatial autocorrelation of hyperspectral images. First, image segmentation is performed on the hyperspectral image to assign each pixel to a homogeneous region. Second, the visible and infrared bands of the hyperspectral image are partitioned into multiple subsets of adjacent bands, and each subset is merged into one band. Recursive edge-preserving filtering, which utilizes the spectral information of neighborhood pixels, is performed on each merged band. Third, the resulting spectral and spatial feature band set is classified using the SVM classifier. Finally, bilateral filtering is performed to remove "salt-and-pepper" noise in the classification result. To preserve the spatial structure of the hyperspectral image, edge-preserving filtering is applied independently before and after the classification process. Experimental results on different hyperspectral images show that the proposed spectral-spatial classification approach is robust and achieves higher classification accuracy than state-of-the-art methods when the number of labeled samples is small.
Actionable exomic incidental findings in 6503 participants: challenges of variant classification
Amendola, Laura M.; Dorschner, Michael O.; Robertson, Peggy D.; Salama, Joseph S.; Hart, Ragan; Shirts, Brian H.; Murray, Mitzi L.; Tokita, Mari J.; Gallego, Carlos J.; Kim, Daniel Seung; Bennett, James T.; Crosslin, David R.; Ranchalis, Jane; Jones, Kelly L.; Rosenthal, Elisabeth A.; Jarvik, Ella R.; Itsara, Andy; Turner, Emily H.; Herman, Daniel S.; Schleit, Jennifer; Burt, Amber; Jamal, Seema M.; Abrudan, Jenica L.; Johnson, Andrew D.; Conlin, Laura K.; Dulik, Matthew C.; Santani, Avni; Metterville, Danielle R.; Kelly, Melissa; Foreman, Ann Katherine M.; Lee, Kristy; Taylor, Kent D.; Guo, Xiuqing; Crooks, Kristy; Kiedrowski, Lesli A.; Raffel, Leslie J.; Gordon, Ora; Machini, Kalotina; Desnick, Robert J.; Biesecker, Leslie G.; Lubitz, Steven A.; Mulchandani, Surabhi; Cooper, Greg M.; Joffe, Steven; Richards, C. Sue; Yang, Yaoping; Rotter, Jerome I.; Rich, Stephen S.; O’Donnell, Christopher J.; Berg, Jonathan S.; Spinner, Nancy B.; Evans, James P.; Fullerton, Stephanie M.; Leppig, Kathleen A.; Bennett, Robin L.; Bird, Thomas; Sybert, Virginia P.; Grady, William M.; Tabor, Holly K.; Kim, Jerry H.; Bamshad, Michael J.; Wilfond, Benjamin; Motulsky, Arno G.; Scott, C. Ronald; Pritchard, Colin C.; Walsh, Tom D.; Burke, Wylie; Raskind, Wendy H.; Byers, Peter; Hisama, Fuki M.; Rehm, Heidi; Nickerson, Debbie A.; Jarvik, Gail P.
2015-01-01
Recommendations for laboratories to report incidental findings from genomic tests have stimulated interest in such results. In order to investigate the criteria and processes for assigning the pathogenicity of specific variants and to estimate the frequency of such incidental findings in patients of European and African ancestry, we classified potentially actionable pathogenic single-nucleotide variants (SNVs) in all 4300 European- and 2203 African-ancestry participants sequenced by the NHLBI Exome Sequencing Project (ESP). We considered 112 gene-disease pairs selected by an expert panel as associated with medically actionable genetic disorders that may be undiagnosed in adults. The resulting classifications were compared to classifications from other clinical and research genetic testing laboratories, as well as with in silico pathogenicity scores. Among European-ancestry participants, 30 of 4300 (0.7%) had a pathogenic SNV and six (0.1%) had a disruptive variant that was expected to be pathogenic, whereas 52 (1.2%) had likely pathogenic SNVs. For African-ancestry participants, six of 2203 (0.3%) had a pathogenic SNV and six (0.3%) had an expected pathogenic disruptive variant, whereas 13 (0.6%) had likely pathogenic SNVs. Genomic Evolutionary Rate Profiling mammalian conservation score and the Combined Annotation Dependent Depletion summary score of conservation, substitution, regulation, and other evidence were compared across pathogenicity assignments and appear to have utility in variant classification. This work provides a refined estimate of the burden of adult onset, medically actionable incidental findings expected from exome sequencing, highlights challenges in variant classification, and demonstrates the need for a better curated variant interpretation knowledge base. PMID:25637381
Exploring Deep Learning and Transfer Learning for Colonic Polyp Classification
Uhl, Andreas; Wimmer, Georg; Häfner, Michael
2016-01-01
Recently, Deep Learning, especially through Convolutional Neural Networks (CNNs), has been widely used to enable the extraction of highly representative features. This is done among the network layers by filtering, selecting, and using these features in the last fully connected layers for pattern classification. However, CNN training for automated endoscopic image classification still poses a challenge due to the lack of large, publicly available annotated databases. In this work we explore Deep Learning for the automated classification of colonic polyps using different configurations for training CNNs from scratch (or full training) and distinct architectures of pretrained CNNs, tested on 8 HD-endoscopic image databases acquired using different modalities. We compare our results with some commonly used features for colonic polyp classification, and the good results suggest that features learned by CNNs trained from scratch and the “off-the-shelf” CNN features can be highly relevant for automated classification of colonic polyps. Moreover, we also show that the combination of classical features and “off-the-shelf” CNN features can be a good approach to further improve the results. PMID:27847543
Velidedeoglu, Mehmet; Arikan, Akif Enes; Uludag, Sezgin Server; Olgun, Deniz Cebi; Kilic, Fahrettin; Kapan, Metin
2015-05-01
Because it is a severe complication, iatrogenic bile duct injury remains a challenging issue for surgeons in gallbladder surgery. However, a commonly accepted classification describing the type of injury is not yet available. This study aims to evaluate the ability of six current classification systems to discriminate bile duct injury patterns. Twelve patients who were referred to our clinic because of iatrogenic bile duct injury after laparoscopic cholecystectomy were reviewed retrospectively. We described the type of injury for each patient according to the six current classifications. Nine patients underwent definitive biliary reconstruction. The Bismuth, Strasberg-Bismuth, Stewart-Way and Neuhaus classifications do not consider vascular involvement; the Siewert system does, but only for tangential lesions without structural loss of the duct and lesions with a structural defect of the hepatic or common bile duct. The Siewert, Neuhaus and Stewart-Way systems do not discriminate between lesions at or above the bifurcation of the hepatic duct. The Hannover classification may resolve the missing aspects of the other systems by describing additional vascular involvement and the location of the lesion at or above the bifurcation.
Scalable metagenomic taxonomy classification using a reference genome database
Ames, Sasha K.; Hysom, David A.; Gardner, Shea N.; Lloyd, G. Scott; Gokhale, Maya B.; Allen, Jonathan E.
2013-01-01
Motivation: Deep metagenomic sequencing of biological samples has the potential to recover otherwise difficult-to-detect microorganisms and accurately characterize biological samples with limited prior knowledge of sample contents. Existing metagenomic taxonomic classification algorithms, however, do not scale well to analyze large metagenomic datasets, and balancing classification accuracy with computational efficiency presents a fundamental challenge. Results: A method is presented to shift computational costs to an off-line computation by creating a taxonomy/genome index that supports scalable metagenomic classification. Scalable performance is demonstrated on real and simulated data to show accurate classification in the presence of novel organisms on samples that include viruses, prokaryotes, fungi and protists. Taxonomic classification of the previously published 150 giga-base Tyrolean Iceman dataset was found to take <20 h on a single node 40 core large memory machine and provide new insights on the metagenomic contents of the sample. Availability: Software was implemented in C++ and is freely available at http://sourceforge.net/projects/lmat Contact: allen99@llnl.gov Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23828782
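A toy illustration of the offline-index idea described above: k-mers from reference genomes are mapped to taxa once, and reads are then classified by looking up their k-mers and voting. The sequences, k-mer size, and tie-breaking rule below are simplistic placeholders; the actual LMAT software uses a far more elaborate taxonomy/genome index.

```python
from collections import Counter

def build_kmer_index(references, k=8):
    """references: dict taxon_name -> genome string. Returns k-mer -> set of taxa."""
    index = {}
    for taxon, seq in references.items():
        for i in range(len(seq) - k + 1):
            index.setdefault(seq[i:i + k], set()).add(taxon)
    return index

def classify_read(read, index, k=8):
    """Majority vote over the taxa hit by the read's k-mers ('unclassified' if no hits)."""
    votes = Counter()
    for i in range(len(read) - k + 1):
        for taxon in index.get(read[i:i + k], ()):
            votes[taxon] += 1
    return votes.most_common(1)[0][0] if votes else "unclassified"

refs = {"taxonA": "ACGTACGTGGCCTTAAGGCTA" * 3,
        "taxonB": "TTGACCGTAAGCTAGCTAACG" * 3}
index = build_kmer_index(refs)
print(classify_read("ACGTACGTGGCCTTAA", index))  # expected: taxonA
```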
Lung texture classification using bag of visual words
NASA Astrophysics Data System (ADS)
Asherov, Marina; Diamant, Idit; Greenspan, Hayit
2014-03-01
Interstitial lung diseases (ILD) refer to a group of more than 150 parenchymal lung disorders. High-Resolution Computed Tomography (HRCT) is the most essential imaging modality for ILD diagnosis. Nonetheless, classification of the various lung tissue patterns caused by ILD is still regarded as a challenging task. The current study focuses on the classification of the five most common categories of ILD lung tissue in HRCT images: normal, emphysema, ground glass, fibrosis and micronodules. The objective of the research is to classify an expert-given annotated region of interest (AROI) using a bag of visual words (BoVW) framework. The images are divided into small patches, and a collection of representative patches is defined as the visual words. This procedure, termed dictionary construction, is performed for each individual lung texture category. The assumption is that different lung textures are represented by different visual word distributions. The classification is performed using an SVM classifier with a histogram intersection kernel. In the experiments, we use a dataset of 1018 AROIs from 95 patients with leave-one-patient-out cross validation (LOPO CV). The classification accuracy obtained is close to 80%.
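A condensed sketch of the BoVW pipeline just described: patches are clustered into visual words with k-means, each AROI becomes a word histogram, and an SVM with a histogram intersection kernel does the classification. The patch extraction, dictionary size, and random data below are placeholders rather than the study's actual settings.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def histogram_intersection(A, B):
    """Kernel matrix K[i, j] = sum_k min(A[i, k], B[j, k])."""
    return np.array([[np.minimum(a, b).sum() for b in B] for a in A])

rng = np.random.default_rng(0)
n_words = 32
patches = rng.normal(size=(5000, 81))             # e.g., flattened 9x9 gray-level patches
codebook = KMeans(n_clusters=n_words, n_init=4, random_state=0).fit(patches)

def bovw_histogram(roi_patches):
    words = codebook.predict(roi_patches)
    hist = np.bincount(words, minlength=n_words).astype(float)
    return hist / hist.sum()                      # normalized word histogram per AROI

# Hypothetical AROIs: each is a set of patches; labels are tissue categories 0..4
X = np.array([bovw_histogram(rng.normal(size=(200, 81))) for _ in range(100)])
y = rng.integers(0, 5, size=100)

clf = SVC(kernel=histogram_intersection).fit(X, y)
print("training accuracy:", clf.score(X, y))
```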
NASA Astrophysics Data System (ADS)
Matikainen, L.; Karila, K.; Hyyppä, J.; Puttonen, E.; Litkey, P.; Ahokas, E.
2017-10-01
This article summarises our first results and experiences on the use of multispectral airborne laser scanner (ALS) data. Optech Titan multispectral ALS data over a large suburban area in Finland were acquired on three different dates in 2015-2016. We investigated the feasibility of the data from the first date for land cover classification and road mapping. Object-based analyses with segmentation and random forests classification were used. The potential of the data for change detection of buildings and roads was also demonstrated. The overall accuracy of land cover classification results with six classes was 96 % compared with validation points. The data also showed high potential for road detection, road surface classification and change detection. The multispectral intensity information appeared to be very important for automated classifications. Compared to passive aerial images, the intensity images have interesting advantages, such as the lack of shadows. Currently, we focus on analyses and applications with the multitemporal multispectral data. Important questions include, for example, the potential and challenges of the multitemporal data for change detection.
HEp-2 cell image classification method based on very deep convolutional networks with small datasets
NASA Astrophysics Data System (ADS)
Lu, Mengchi; Gao, Long; Guo, Xifeng; Liu, Qiang; Yin, Jianping
2017-07-01
Classification of Human Epithelial-2 (HEp-2) cell image staining patterns has been widely used to identify autoimmune diseases via the anti-nuclear antibody (ANA) test in the Indirect Immunofluorescence (IIF) protocol. Because manual testing is time consuming, subjective, and labor intensive, image-based Computer Aided Diagnosis (CAD) systems for HEp-2 cell classification are being developed. However, recently proposed methods mostly rely on manual feature extraction and achieve low accuracy. Moreover, the available benchmark datasets are small, which is not well suited to deep learning methods; this directly affects the accuracy of cell classification even after data augmentation. To address these issues, this paper presents a highly accurate automatic HEp-2 cell classification method for small datasets that utilizes very deep convolutional networks (VGGNet). Specifically, the proposed method consists of three main phases, namely image preprocessing, feature extraction, and classification. Moreover, an improved VGGNet is presented to address the challenges of small-scale datasets. Experimental results on two benchmark datasets demonstrate that the proposed method achieves superior accuracy compared with existing methods.
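A hedged PyTorch sketch of adapting a pretrained VGGNet to a small HEp-2 dataset: the convolutional features are frozen and only the final classifier layer is re-trained for the (assumed) six staining-pattern classes. The class count, layer choice, and hyperparameters are illustrative; the paper's "improved VGGNet" and preprocessing steps are not reproduced. (Older torchvision versions use `models.vgg16(pretrained=True)` instead of the `weights=` argument.)

```python
import torch
import torch.nn as nn
from torchvision import models

num_classes = 6  # assumed number of HEp-2 staining patterns

# Load VGG16 pretrained on ImageNet
model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)

# Freeze the convolutional feature extractor to cope with the small dataset
for p in model.features.parameters():
    p.requires_grad = False

# Replace the last fully connected layer with a new head for the HEp-2 classes
model.classifier[6] = nn.Linear(model.classifier[6].in_features, num_classes)

optimizer = torch.optim.SGD(model.classifier[6].parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of 224x224 RGB cell images
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_classes, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print("loss:", float(loss))
```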
Hripcsak, George; Knirsch, Charles; Zhou, Li; Wilcox, Adam; Melton, Genevieve B
2007-03-01
Data mining in electronic medical records may facilitate clinical research, but much of the structured data may be miscoded, incomplete, or non-specific. The exploitation of narrative data using natural language processing may help, although nesting, varying granularity, and repetition remain challenges. In a study of community-acquired pneumonia using electronic records, these issues led to poor classification. Limiting queries to accurate, complete records led to vastly reduced, possibly biased samples. We exploited knowledge latent in the electronic records to improve classification. A similarity metric was used to cluster cases. We defined discordance as the degree to which cases within a cluster give different answers for some query that addresses a classification task of interest. Cases with higher discordance are more likely to be incorrectly classified, and can be reviewed manually to adjust the classification, improve the query, or estimate the likely accuracy of the query. In a study of pneumonia--in which the ICD9-CM coding was found to be very poor--the discordance measure was statistically significantly correlated with classification correctness (.45; 95% CI .15-.62).
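A simplified sketch of the discordance idea described above: cases are clustered by a similarity metric, and each cluster's discordance is the degree to which its members answer a binary classification query differently. The features, clustering choice (k-means), and discordance definition (one minus the majority fraction) are assumptions made for illustration, not the authors' exact formulation.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_discordance(features, query_answers, n_clusters=5):
    """Returns a discordance score per cluster and the cluster id of each case."""
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(features)
    scores = {}
    for c in range(n_clusters):
        answers = query_answers[labels == c]   # e.g., 1 = "has pneumonia" per query
        majority = max(answers.mean(), 1 - answers.mean())
        scores[c] = 1.0 - majority             # 0 = full agreement, 0.5 = maximal disagreement
    return scores, labels

rng = np.random.default_rng(0)
features = rng.normal(size=(300, 20))          # hypothetical per-case similarity features
answers = rng.integers(0, 2, size=300)         # hypothetical query results per case
scores, labels = cluster_discordance(features, answers)
print(scores)  # clusters with high discordance are candidates for manual review
```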
Integration of heterogeneous features for remote sensing scene classification
NASA Astrophysics Data System (ADS)
Wang, Xin; Xiong, Xingnan; Ning, Chen; Shi, Aiye; Lv, Guofang
2018-01-01
Scene classification is one of the most important issues in remote sensing (RS) image processing. We find that features from different channels (shape, spectral, texture, etc.), levels (low-level and middle-level), or perspectives (local and global) can provide various properties for RS images, and we therefore propose a heterogeneous feature framework to extract and integrate heterogeneous features of different types for RS scene classification. The proposed method is composed of three modules: (1) heterogeneous feature extraction, where three heterogeneous feature types, called DS-SURF-LLC, mean-Std-LLC, and MS-CLBP, are calculated; (2) heterogeneous feature fusion, where multiple kernel learning (MKL) is utilized to integrate the heterogeneous features; and (3) an MKL support vector machine classifier for RS scene classification. The proposed method is extensively evaluated on three challenging benchmark datasets (a 6-class dataset, a 12-class dataset, and a 21-class dataset), and the experimental results show that it leads to good classification performance and produces informative features for describing RS image scenes. Moreover, the integration of heterogeneous features outperforms some state-of-the-art features on RS scene classification tasks.
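To make the fusion step concrete, here is a simplified sketch in which each heterogeneous feature type gets its own RBF kernel and the kernels are combined as a weighted sum fed to a precomputed-kernel SVM. True multiple kernel learning optimizes the weights jointly with the classifier; the fixed weights, feature dimensions, and data below are placeholders.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 200
# Three hypothetical feature types per scene (stand-ins for DS-SURF-LLC, mean-Std-LLC, MS-CLBP)
feats = [rng.normal(size=(n, d)) for d in (128, 64, 54)]
y = rng.integers(0, 6, size=n)               # 6 scene categories (assumed)

weights = [0.5, 0.3, 0.2]                    # fixed weights; MKL would learn these
K = sum(w * rbf_kernel(F, gamma=1.0 / F.shape[1]) for w, F in zip(weights, feats))

train, test = np.arange(150), np.arange(150, n)
clf = SVC(kernel="precomputed").fit(K[np.ix_(train, train)], y[train])
acc = clf.score(K[np.ix_(test, train)], y[test])
print("held-out accuracy:", acc)
```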
Taxonomy for Common-Cause Failure Vulnerability and Mitigation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Richard Thomas; Korsah, Kofi; Mullens, James Allen
2015-09-01
Applying current guidance and practices for common-cause failure (CCF) mitigation to digital instrumentation and control (I&C) systems has proven problematic, and the regulatory environment has been unpredictable. The potential for CCF vulnerability inhibits I&C modernization, thereby challenging the long-term sustainability of existing plants. For new plants and advanced reactor concepts, concern about CCF vulnerability in highly integrated digital I&C systems imposes a design burden that results in higher costs and increased complexity. The regulatory uncertainty in determining which mitigation strategies will be acceptable (e.g., what diversity is needed and how much is sufficient) drives designers to adopt complicated, costly solutions devised for existing plants. To address the conditions that constrain the transition to digital I&C technology by the US nuclear industry, crosscutting research is needed to resolve uncertainty, demonstrate necessary characteristics, and establish an objective basis for qualification of digital technology for nuclear power plant (NPP) I&C applications. To fulfill this research need, Oak Ridge National Laboratory is investigating mitigation of CCF vulnerability for nuclear-qualified applications. The outcome of this research is expected to contribute to a fundamentally sound, comprehensive basis to qualify digital technology for nuclear power applications. This report documents the development of a CCF taxonomy. The basis for the CCF taxonomy was generated by determining consistent terminology and establishing a classification approach. The terminology is based on definitions from standards, guides, and relevant nuclear power industry technical reports. The classification approach is derived from identified classification schemes focused on I&C systems and key characteristics, including failure modes. The CCF taxonomy provides the basis for a systematic organization of key systems aspects relevant to analyzing the potential for CCF vulnerability and the suitability of mitigation techniques. Development of an effective CCF taxonomy will help to provide a framework for establishing the objective analysis and assessment capabilities desired to facilitate rigorous identification of fault types and triggers that are the fundamental elements of CCF.
Deep learning for tumor classification in imaging mass spectrometry.
Behrmann, Jens; Etmann, Christian; Boskamp, Tobias; Casadonte, Rita; Kriegsmann, Jörg; Maaß, Peter
2018-04-01
Tumor classification using imaging mass spectrometry (IMS) data has a high potential for future applications in pathology. Due to the complexity and size of the data, automated feature extraction and classification steps are required to fully process the data. Since mass spectra exhibit certain structural similarities to image data, deep learning may offer a promising strategy for classification of IMS data as it has been successfully applied to image classification. Methodologically, we propose an adapted architecture based on deep convolutional networks to handle the characteristics of mass spectrometry data, as well as a strategy to interpret the learned model in the spectral domain based on a sensitivity analysis. The proposed methods are evaluated on two algorithmically challenging tumor classification tasks and compared to a baseline approach. Competitiveness of the proposed methods is shown on both tasks by studying the performance via cross-validation. Moreover, the learned models are analyzed by the proposed sensitivity analysis revealing biologically plausible effects as well as confounding factors of the considered tasks. Thus, this study may serve as a starting point for further development of deep learning approaches in IMS classification tasks. https://gitlab.informatik.uni-bremen.de/digipath/Deep_Learning_for_Tumor_Classification_in_IMS. jbehrmann@uni-bremen.de or christianetmann@uni-bremen.de. Supplementary data are available at Bioinformatics online.
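A minimal PyTorch sketch of a 1D convolutional network of the kind adapted here for mass spectra; the number of channels, spectrum length, and layer sizes are invented placeholders, and the sensitivity-analysis interpretation step is not shown.

```python
import torch
import torch.nn as nn

class SpectrumCNN(nn.Module):
    """Small 1D CNN mapping an m/z intensity vector to tumor-class logits."""
    def __init__(self, spectrum_len=2000, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=9, padding=4), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=9, padding=4), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.classifier = nn.Linear(32 * (spectrum_len // 16), n_classes)

    def forward(self, x):            # x: (batch, 1, spectrum_len)
        h = self.features(x)
        return self.classifier(h.flatten(1))

model = SpectrumCNN()
spectra = torch.randn(8, 1, 2000)    # dummy batch of preprocessed spectra
logits = model(spectra)
print(logits.shape)                  # torch.Size([8, 2])
```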
Wong, Wai Keat; Shetty, Subhaschandra
2017-08-01
Parotidectomy remains the mainstay of treatment for both benign and malignant lesions of the parotid gland. There exists a wide range of possible surgical options in parotidectomy in terms of the extent of parotid tissue removed. There is an increasing need for uniformity of terminology resulting from growing interest in modifications of the conventional parotidectomy. A standardized classification system for describing the extent of parotidectomy is therefore of paramount importance. Recently, the European Salivary Gland Society (ESGS) proposed a novel classification system for parotidectomy. The aim of this study is to evaluate this system. The classification system proposed by the ESGS was critically re-evaluated and modified to increase its accuracy and acceptability. Modifications mainly focused on subdividing Levels I and II into IA, IB, IIA, and IIB. From June 2006 to June 2016, 126 patients underwent 130 parotidectomies at our hospital. The classification system was tested in that cohort of patients. While the ESGS classification system is comprehensive, it does not cover all possibilities. The addition of Sublevels IA, IB, IIA, and IIB may help to address some of the clinical situations seen and is clinically relevant. We aim to test the modified classification system for partial parotidectomy to address some of the challenges mentioned.
The 2015 WHO Classification of Tumors of the Thymus: Continuity and Changes
Marx, Alexander; Chan, John K.C.; Coindre, Jean-Michel; Detterbeck, Frank; Girard, Nicolas; Harris, Nancy L.; Jaffe, Elaine S.; Kurrer, Michael O.; Marom, Edith M.; Moreira, Andre L.; Mukai, Kiyoshi; Orazi, Attilio; Ströbel, Philipp
2015-01-01
This overview of the 4th edition of the WHO Classification of thymic tumors has two aims. First, to comprehensively list the established and new tumour entities and variants that are described in the new WHO Classification of thymic epithelial tumors, germ cell tumors, lymphomas, dendritic cell and myeloid neoplasms, and soft tissue tumors of the thymus and mediastinum; second, to highlight major differences in the new WHO Classification that result from the progress that has been made since the 3rd edition in 2004 at immunohistochemical, genetic and conceptual levels. Refined diagnostic criteria for type A, AB, B1–B3 thymomas and thymic squamous cell carcinoma are given and will hopefully improve the reproducibility of the classification and its clinical relevance. The clinical perspective of the classification has been strengthened by involving experts from radiology, thoracic surgery and oncology; by incorporating state-of-the-art PET/CT images; and by depicting prototypic cytological specimens. This makes the thymus section of the new WHO Classification of Tumours of the Lung, Pleura, Thymus and Heart a valuable tool for pathologists, cytologists and clinicians alike. The impact of the new WHO Classification on therapeutic decisions is exemplified in this overview for thymic epithelial tumors and mediastinal lymphomas, and future perspectives and challenges are discussed. PMID:26295375
Continuous robust sound event classification using time-frequency features and deep learning
Song, Yan; Xiao, Wei; Phan, Huy
2017-01-01
The automatic detection and recognition of sound events by computers is a requirement for a number of emerging sensing and human computer interaction technologies. Recent advances in this field have been achieved by machine learning classifiers working in conjunction with time-frequency feature representations. This combination has achieved excellent accuracy for classification of discrete sounds. The ability to recognise sounds under real-world noisy conditions, called robust sound event classification, is an especially challenging task that has attracted recent research attention. Another aspect of real-world conditions is the classification of continuous, occluded or overlapping sounds, rather than classification of short isolated sound recordings. This paper addresses the classification of noise-corrupted, occluded, overlapped, continuous sound recordings. It first proposes a standard evaluation task for such sounds based upon a common existing method for evaluating isolated sound classification. It then benchmarks several high performing isolated sound classifiers to operate with continuous sound data by incorporating an energy-based event detection front end. Results are reported for each tested system using the new task, to provide the first analysis of their performance for continuous sound event detection. In addition, it proposes and evaluates a novel Bayesian-inspired front end for the segmentation and detection of continuous sound recordings prior to classification. PMID:28892478
Continuous robust sound event classification using time-frequency features and deep learning.
McLoughlin, Ian; Zhang, Haomin; Xie, Zhipeng; Song, Yan; Xiao, Wei; Phan, Huy
2017-01-01
The automatic detection and recognition of sound events by computers is a requirement for a number of emerging sensing and human computer interaction technologies. Recent advances in this field have been achieved by machine learning classifiers working in conjunction with time-frequency feature representations. This combination has achieved excellent accuracy for classification of discrete sounds. The ability to recognise sounds under real-world noisy conditions, called robust sound event classification, is an especially challenging task that has attracted recent research attention. Another aspect of real-world conditions is the classification of continuous, occluded or overlapping sounds, rather than classification of short isolated sound recordings. This paper addresses the classification of noise-corrupted, occluded, overlapped, continuous sound recordings. It first proposes a standard evaluation task for such sounds based upon a common existing method for evaluating isolated sound classification. It then benchmarks several high performing isolated sound classifiers to operate with continuous sound data by incorporating an energy-based event detection front end. Results are reported for each tested system using the new task, to provide the first analysis of their performance for continuous sound event detection. In addition, it proposes and evaluates a novel Bayesian-inspired front end for the segmentation and detection of continuous sound recordings prior to classification.
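The energy-based detection front end mentioned in both versions of this abstract can be sketched as simple frame-energy thresholding over a continuous recording; the frame length, hop, and threshold rule below are illustrative assumptions rather than the authors' exact settings.

```python
import numpy as np

def detect_events(signal, sr, frame_ms=25, hop_ms=10, thresh_db=-30.0):
    """Return (start_s, end_s) segments whose frame energy exceeds a threshold
    relative to the loudest frame; segments would then be passed to a classifier."""
    frame, hop = int(sr * frame_ms / 1000), int(sr * hop_ms / 1000)
    n_frames = 1 + max(0, (len(signal) - frame) // hop)
    energy_db = np.array([
        10 * np.log10(np.mean(signal[i * hop:i * hop + frame] ** 2) + 1e-12)
        for i in range(n_frames)
    ])
    active = energy_db > (energy_db.max() + thresh_db)
    segments, start = [], None
    for i, a in enumerate(active):
        if a and start is None:
            start = i
        elif not a and start is not None:
            segments.append((start * hop / sr, (i * hop + frame) / sr))
            start = None
    if start is not None:
        segments.append((start * hop / sr, len(signal) / sr))
    return segments

sr = 16000
t = np.arange(sr * 2) / sr
sig = 0.01 * np.random.default_rng(0).normal(size=sr * 2)
sig[sr // 2:sr] += 0.5 * np.sin(2 * np.pi * 440 * t[sr // 2:sr])   # one loud event
print(detect_events(sig, sr))   # roughly [(0.5, 1.0)]
```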
77 FR 37346 - Export Control Reform Transition Plan
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-21
... review the appropriate Export Control Classification Number (ECCN) to determine the classification of their item. Licensees who are unsure of the proper ECCN designation may request a Commodity...
NASA Technical Reports Server (NTRS)
Bowell, E.; Chapman, C. R.; Gradie, J. C.; Zellner, B.; Morrison, D.
1978-01-01
A taxonomic system for asteroids is discussed which is based on seven directly observable parameters from polarimetry, spectrophotometry, radiometry, and UBV photometry. The classification scheme is entirely empirical and independent of specific mineralogical interpretations. Five broad classes (designated C, S, M, E, and R), as well as an 'unclassifiable' designation, are defined on the basis of observational data for 523 asteroids. Computer-generated type classifications and derived diameters are given for the 523 asteroids, and the application of the classification procedure is illustrated. Of the 523 asteroids classified, 190 are identified as C objects, 141 as S type, 13 as type M, three as type E, three as type R, 55 as unclassifiable, and 118 as ambiguous. The present taxonomic system is compared with several other asteroid classification systems.
NASA Astrophysics Data System (ADS)
Setiyoko, A.; Dharma, I. G. W. S.; Haryanto, T.
2017-01-01
Multispectral and hyperspectral data acquired from satellite sensors can detect various objects on the earth, supporting modeling from low to high scales. These data are increasingly being used to produce geospatial information for rapid analysis by running feature extraction or classification processes. Applying the best-suited model for this data mining is still challenging because of issues regarding accuracy and computational cost. The aim of this research is to develop a better understanding of object feature extraction and classification applied to satellite images by systematically reviewing related recent research projects. The method used in this research is based on the PRISMA statement. After deriving important points from trusted sources, pixel-based and texture-based feature extraction techniques emerge as promising techniques to be analyzed further in the recent development of feature extraction and classification.
An Optimization-based Framework to Learn Conditional Random Fields for Multi-label Classification
Naeini, Mahdi Pakdaman; Batal, Iyad; Liu, Zitao; Hong, CharmGil; Hauskrecht, Milos
2015-01-01
This paper studies the multi-label classification problem in which data instances are associated with multiple, possibly high-dimensional, label vectors. This problem is especially challenging when labels are dependent and one cannot decompose the problem into a set of independent classification problems. To address the problem and properly represent label dependencies, we propose and study a pairwise conditional random field (CRF) model. We develop a new approach for learning the structure and parameters of the CRF from data. The approach maximizes the pseudo likelihood of observed labels and relies on fast proximal gradient descent for learning the structure and limited-memory BFGS for learning the parameters of the model. Empirical results on several datasets show that our approach outperforms several multi-label classification baselines, including recently published state-of-the-art methods. PMID:25927015
Ethical challenges in developing drugs for psychiatric disorders.
Carrier, Felix; Banayan, David; Boley, Randy; Karnik, Niranjan
2017-05-01
As the classification of mental disorders advances towards a disease model as promoted by the National Institute of Mental Health (NIMH) Research Domain Criteria (RDoC), there is hope that a more thorough neurobiological understanding of mental illness may allow clinicians and researchers to determine treatment efficacy with less diagnostic variability. This paradigm shift has presented a variety of ethical issues to be considered in the development of psychiatric drugs. These challenges are not limited to informed consent practices, industry funding, and placebo use. The consideration for alternative research models and the quality of research design also present ethical challenges in the development of psychiatric drugs. The imperatives to create valid and sound research that justify the human time, cost, risk and use of limited resources must also be considered. Clinical innovation and consideration for special populations are also important aspects to take into account. Based on the breadth of these ethical concerns, it is particularly important that scientific questions regarding the development of psychiatric drugs be answered collaboratively by a variety of stakeholders. As the field expands, new ethical considerations will be raised with increased focus on genetic markers, personalized medicine, patient-centered outcomes research, and tension over funding. We suggest that innovation in trial design is necessary to better reflect practices in clinical settings and that there must be an emphasized focus on expanding the transparency of consent processes, regard for suicidality, and care in working with special populations to support the goal of developing sound psychiatric drug therapies. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Thomas C. Edwards; D. Richard Cutler; Niklaus E. Zimmermann; Linda Geiser; Gretchen G. Moisen
2006-01-01
We evaluated the effects of probabilistic (hereafter DESIGN) and non-probabilistic (PURPOSIVE) sample surveys on resultant classification tree models for predicting the presence of four lichen species in the Pacific Northwest, USA. Models derived from both survey forms were assessed using an independent data set (EVALUATION). Measures of accuracy as gauged by...
ERIC Educational Resources Information Center
Güyer, Tolga; Aydogdu, Seyhmus
2016-01-01
This study suggests a classification model and an e-learning system based on this model for all instructional theories, approaches, models, strategies, methods, and technics being used in the process of instructional design that constitutes a direct or indirect resource for educational technology based on the theory of intuitionistic fuzzy sets…
Rosenthal, E T; Bowles, K R; Pruss, D; van Kan, A; Vail, P J; McElroy, H; Wenstrup, R J
2015-12-01
Based on current consensus guidelines and standard practice, many genetic variants detected in clinical testing are classified as disease causing based on their predicted impact on the normal expression or function of the gene in the absence of additional data. However, our laboratory has identified a subset of such variants in hereditary cancer genes for which compelling contradictory evidence emerged after the initial evaluation following the first observation of the variant. Three representative examples of variants in BRCA1, BRCA2 and MSH2 that are predicted to disrupt splicing, prematurely truncate the protein, or remove the start codon were evaluated for pathogenicity by analyzing clinical data with multiple classification algorithms. Available clinical data for all three variants contradicts the expected pathogenic classification. These variants illustrate potential pitfalls associated with standard approaches to variant classification as well as the challenges associated with monitoring data, updating classifications, and reporting potentially contradictory interpretations to the clinicians responsible for translating test outcomes to appropriate clinical action. It is important to address these challenges now as the model for clinical testing moves toward the use of large multi-gene panels and whole exome/genome analysis, which will dramatically increase the number of genetic variants identified. © 2015 The Authors. Clinical Genetics published by John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Innovative Acoustic Sensor Technologies for Leak Detection in Challenging Pipe Types
2016-12-30
Abstract fragment: ...correlation features to detect and pinpoint leaks in challenging pipe types, as well as metallic pipes. Subject terms: leak detection; acoustic correlation; water distribution systems.
Another Challenge for Africa: Ethnic Stability
2011-03-24
Abstract fragment: Ethnicity is a cultural, deep-seated personal matter to the African people. Still, ethnicity is a key foundational challenge for Africa to adequately address... history, geography, political system, economics, demographics, culture, and languages. From a U.S. perspective, most know of Africa from news...
Biometric Authentication for Gender Classification Techniques: A Review
NASA Astrophysics Data System (ADS)
Mathivanan, P.; Poornima, K.
2017-12-01
One of the challenging biometric authentication applications is gender identification and age classification, which captures gait from a distance and analyzes physical information about the subject, such as gender, race, and emotional state. Most gender identification techniques have focused only on the frontal pose of the subject, the image size, and the type of database used in the process. The study also classifies the different feature extraction processes, such as Principal Component Analysis (PCA) and Local Directional Pattern (LDP), that are used to extract the authentication features of a person. This paper aims to analyze different gender classification techniques in order to evaluate the strengths and weaknesses of existing gender identification algorithms. This, in turn, supports the development of a novel gender classification algorithm with lower computational cost and higher accuracy. In this paper, an overview and classification of different gender identification techniques is first presented, and these techniques are compared with other existing human identification systems in terms of performance.
Motivation Classification and Grade Prediction for MOOCs Learners
Xu, Bin; Yang, Dan
2016-01-01
While MOOCs offer educational data on a new scale, many educators see great potential in the big data, which include detailed activity records of every learner. A learner's behavior, such as whether he or she will drop out of the course, can be predicted. Providing an effective, economical, and scalable method to detect cheating on tests, such as the use of a surrogate exam-taker, is a challenging problem. In this paper, we present a grade prediction method that uses student activity features to predict whether a learner may get a certification if he/she takes a test. The method consists of two-step classifications: motivation classification (MC) and grade classification (GC). The MC divides all learners into three groups: certification earning, video watching, and course sampling. The GC then predicts whether a certification-earning learner will obtain a certification. Our experiment shows that the proposed method can fit the classification model at a fine scale and that it is possible to identify a surrogate exam-taker. PMID:26884747
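A minimal sketch of the two-step MC-then-GC pipeline described above, using scikit-learn random forests on synthetic activity features; the feature layout, group encoding, and model choices are illustrative assumptions rather than the authors' implementation.

```python
# Two-stage classification sketch: motivation group first, then certification outcome.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 5))                 # stand-in activity features (videos, quizzes, ...)
motivation = rng.integers(0, 3, size=600)     # 0=certification earning, 1=video watching, 2=sampling
earned = (rng.random(600) < 0.4).astype(int)  # certification outcome (meaningful for group 0)

# Step 1: motivation classification (MC) over all learners.
mc = RandomForestClassifier(random_state=0).fit(X, motivation)

# Step 2: grade classification (GC) trained only on certification-earning learners.
mask = motivation == 0
gc = RandomForestClassifier(random_state=0).fit(X[mask], earned[mask])

def predict(x_row):
    group = mc.predict(x_row.reshape(1, -1))[0]
    if group != 0:
        return group, None                    # not predicted to pursue certification
    return group, gc.predict(x_row.reshape(1, -1))[0]

print(predict(X[0]))
```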
A Case Study in Integrating Multiple E-commerce Standards via Semantic Web Technology
NASA Astrophysics Data System (ADS)
Yu, Yang; Hillman, Donald; Setio, Basuki; Heflin, Jeff
Internet business-to-business transactions present great challenges in merging information from different sources. In this paper we describe a project to integrate four representative commercial classification systems with the Federal Cataloging System (FCS). The FCS is used by the US Defense Logistics Agency to name, describe and classify all items under inventory control by the DoD. Our approach uses the ECCMA Open Technical Dictionary (eOTD) as a common vocabulary to accommodate all different classifications. We create a semantic bridging ontology between each classification and the eOTD to describe their logical relationships in OWL DL. The essential idea is that since each classification has formal definitions in a common vocabulary, we can use subsumption to automatically integrate them, thus mitigating the need for pairwise mappings. Furthermore our system provides an interactive interface to let users choose and browse the results and more importantly it can translate catalogs that commit to these classifications using compiled mapping results.
Zhang, Y N
2017-01-01
Parkinson's disease (PD) is primarily diagnosed by clinical examinations, such as a walking test, a handwriting test, and MRI diagnostics. In this paper, we propose a machine-learning-based PD telediagnosis method for smartphones. Classification of PD using speech records is a challenging task because the classification accuracy is still below doctor level. Here we demonstrate automatic classification of PD using time-frequency features, stacked autoencoders (SAE), and a K-nearest-neighbor (KNN) classifier. The KNN classifier can produce promising classification results from the useful representations learned by the SAE. Empirical results show that the proposed method achieves better performance in all tested cases across classification tasks, demonstrating that machine learning is capable of classifying PD with a level of competence comparable to a doctor. We conclude that a smartphone can therefore potentially provide low-cost PD diagnostic care. This paper also gives an implementation on a browser/server system and reports the running time cost. Both advantages and disadvantages of the proposed telediagnosis system are discussed. PMID:29075547
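The representation-learning-plus-KNN idea can be sketched as follows; a single ReLU autoencoder layer (trained to reconstruct its input) stands in for the stacked autoencoders, and the synthetic matrix X is a placeholder for time-frequency speech features extracted upstream.

```python
# Learned representation + KNN sketch (stand-in for the SAE+KNN pipeline).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 40))            # stand-in for time-frequency features
y = rng.integers(0, 2, size=200)          # 0 = healthy, 1 = PD (synthetic labels)

# One autoencoder layer: reconstruct X through a 16-unit ReLU bottleneck.
ae = MLPRegressor(hidden_layer_sizes=(16,), activation="relu",
                  max_iter=2000, random_state=0).fit(X, X)

def encode(data):
    # Hidden-layer activations of the trained autoencoder.
    return np.maximum(0.0, data @ ae.coefs_[0] + ae.intercepts_[0])

knn = KNeighborsClassifier(n_neighbors=5).fit(encode(X), y)
print(knn.predict(encode(X[:3])))
```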
Rule-guided human classification of Volunteered Geographic Information
NASA Astrophysics Data System (ADS)
Ali, Ahmed Loai; Falomir, Zoe; Schmid, Falko; Freksa, Christian
2017-05-01
During the last decade, web technologies and location sensing devices have evolved generating a form of crowdsourcing known as Volunteered Geographic Information (VGI). VGI acted as a platform of spatial data collection, in particular, when a group of public participants are involved in collaborative mapping activities: they work together to collect, share, and use information about geographic features. VGI exploits participants' local knowledge to produce rich data sources. However, the resulting data inherits problematic data classification. In VGI projects, the challenges of data classification are due to the following: (i) data is likely prone to subjective classification, (ii) remote contributions and flexible contribution mechanisms in most projects, and (iii) the uncertainty of spatial data and non-strict definitions of geographic features. These factors lead to various forms of problematic classification: inconsistent, incomplete, and imprecise data classification. This research addresses classification appropriateness. Whether the classification of an entity is appropriate or inappropriate is related to quantitative and/or qualitative observations. Small differences between observations may be not recognizable particularly for non-expert participants. Hence, in this paper, the problem is tackled by developing a rule-guided classification approach. This approach exploits data mining techniques of Association Classification (AC) to extract descriptive (qualitative) rules of specific geographic features. The rules are extracted based on the investigation of qualitative topological relations between target features and their context. Afterwards, the extracted rules are used to develop a recommendation system able to guide participants to the most appropriate classification. The approach proposes two scenarios to guide participants towards enhancing the quality of data classification. An empirical study is conducted to investigate the classification of grass-related features like forest, garden, park, and meadow. The findings of this study indicate the feasibility of the proposed approach.
On validation of the rain climatic zone designations for Nigeria
NASA Astrophysics Data System (ADS)
Obiyemi, O. O.; Ibiyemi, T. S.; Ojo, J. S.
2017-07-01
In this paper, validation of rain climatic zone classifications for Nigeria is presented based on global radio-climatic models by the International Telecommunication Union-Radiocommunication (ITU-R) and Crane. Rain rate estimates deduced from several ground-based measurements and those earlier estimated from the precipitation index on the Tropical Rainfall Measuring Mission (TRMM) were employed for the validation exercise. Although earlier classifications indicated that Nigeria falls into zones P, Q, N, and K for the ITU-R designations, and zones E and H for Crane's climatic zone designations, the results, however, confirmed that the rain climatic zones across Nigeria can only be classified into four, namely P, Q, M, and N for the ITU-R designations, while the designations by Crane exhibited only three zones, namely E, G, and H. The ITU-R classification was found to be more suitable for planning microwave and millimeter wave links across Nigeria. The research outcomes are vital in boosting the confidence level of system designers in using the ITU-R designations as presented in the map developed for the rain zone designations for estimating the attenuation induced by rain along satellite and terrestrial microwave links over Nigeria.
Olives, Casey; Valadez, Joseph J; Brooker, Simon J; Pagano, Marcello
2012-01-01
Originally a binary classifier, Lot Quality Assurance Sampling (LQAS) has proven to be a useful tool for classification of the prevalence of Schistosoma mansoni into multiple categories (≤10%, >10 and <50%, ≥50%), and semi-curtailed sampling has been shown to effectively reduce the number of observations needed to reach a decision. To date the statistical underpinnings for Multiple Category-LQAS (MC-LQAS) have not received full treatment. We explore the analytical properties of MC-LQAS, and validate its use for the classification of S. mansoni prevalence in multiple settings in East Africa. We outline MC-LQAS design principles and formulae for operating characteristic curves. In addition, we derive the average sample number for MC-LQAS when utilizing semi-curtailed sampling and introduce curtailed sampling in this setting. We also assess the performance of MC-LQAS designs with maximum sample sizes of n=15 and n=25 via a weighted kappa-statistic using S. mansoni data collected in 388 schools from four studies in East Africa. Overall performance of MC-LQAS classification was high (kappa-statistic of 0.87). In three of the studies, the kappa-statistic for a design with n=15 was greater than 0.75. In the fourth study, where these designs performed poorly (kappa-statistic less than 0.50), the majority of observations fell in regions where potential error is known to be high. Employment of semi-curtailed and curtailed sampling further reduced the sample size by as many as 0.5 and 3.5 observations per school, respectively, without increasing classification error. This work provides the needed analytics to understand the properties of MC-LQAS for assessing the prevalence of S. mansoni and shows that in most settings a sample size of 15 children provides a reliable classification of schools.
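The operating characteristics of a three-category LQAS rule reduce to binomial tail probabilities, as in this small sketch; the sample size and the two decision thresholds are illustrative choices, not the designs evaluated in the paper.

```python
# Operating-characteristic sketch for a three-category LQAS decision rule.
from scipy.stats import binom

def oc_probs(p, n=15, d_low=1, d_high=7):
    """Return P(classified low/medium/high) when the true prevalence is p."""
    p_low = binom.cdf(d_low, n, p)            # at most d_low positives observed
    p_high = 1.0 - binom.cdf(d_high, n, p)    # more than d_high positives observed
    return p_low, 1.0 - p_low - p_high, p_high

for p in (0.05, 0.30, 0.60):
    low, med, high = oc_probs(p)
    print(f"p={p:.2f}  P(low)={low:.3f}  P(med)={med:.3f}  P(high)={high:.3f}")
```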
Neural architecture design based on extreme learning machine.
Bueno-Crespo, Andrés; García-Laencina, Pedro J; Sancho-Gómez, José-Luis
2013-12-01
Selection of the optimal neural architecture to solve a pattern classification problem entails choosing the relevant input units, the number of hidden neurons, and the corresponding interconnection weights. This problem has been widely studied, but existing solutions usually involve excessive computational cost and do not provide a unique solution. This paper proposes a new technique to efficiently design the MultiLayer Perceptron (MLP) architecture for classification using the Extreme Learning Machine (ELM) algorithm. The proposed method provides high generalization capability and a unique solution for the architecture design. Moreover, the selected final network retains only those input connections that are relevant for the classification task. Experimental results show these advantages. Copyright © 2013 Elsevier Ltd. All rights reserved.
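The base ELM training step referred to above can be sketched in a few lines: random, untrained hidden-layer weights followed by a closed-form least-squares solve for the output weights. The architecture-selection procedure of the paper is not reproduced; the data and layer sizes here are arbitrary.

```python
# Minimal single-hidden-layer ELM: random input weights, pseudo-inverse output weights.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))
y = (X[:, 0] + X[:, 1] ** 2 > 1).astype(float)   # synthetic binary target

n_hidden = 50
W = rng.normal(size=(X.shape[1], n_hidden))      # random input weights, never trained
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                           # hidden-layer output matrix
beta = np.linalg.pinv(H) @ y                     # least-squares output weights

pred = (np.tanh(X @ W + b) @ beta > 0.5).astype(int)
print("training accuracy:", (pred == y).mean())
```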
Angle classification revisited 2: a modified Angle classification.
Katz, M I
1992-09-01
Edward Angle, in his classification of malocclusions, appears to have made Class I a range of abnormality, not a point of ideal occlusion. Current goals of orthodontic treatment, however, strive for the designation "Class I occlusion" to be synonymous with the point of ideal intermeshing and not a broad range. If contemporary orthodontists are to continue to use Class I as a goal, then it is appropriate that Dr. Angle's century-old classification be modified to be more precise.
ERIC Educational Resources Information Center
Fan, Xitao; Wang, Lin
The Monte Carlo study compared the performance of predictive discriminant analysis (PDA) and that of logistic regression (LR) for the two-group classification problem. Prior probabilities were used for classification, but the cost of misclassification was assumed to be equal. The study used a fully crossed three-factor experimental design (with…
A review of classification algorithms for EEG-based brain-computer interfaces.
Lotte, F; Congedo, M; Lécuyer, A; Lamarche, F; Arnaldi, B
2007-06-01
In this paper we review classification algorithms used to design brain-computer interface (BCI) systems based on electroencephalography (EEG). We briefly present the commonly employed algorithms and describe their critical properties. Based on the literature, we compare them in terms of performance and provide guidelines to choose the suitable classification algorithm(s) for a specific BCI.
Zhang, Jianhua; Li, Sunan; Wang, Rubin
2017-01-01
In this paper, we deal with the Mental Workload (MWL) classification problem based on the measured physiological data. First we discussed the optimal depth (i.e., the number of hidden layers) and parameter optimization algorithms for the Convolutional Neural Networks (CNN). The base CNNs designed were tested according to five classification performance indices, namely Accuracy, Precision, F-measure, G-mean, and required training time. Then we developed an Ensemble Convolutional Neural Network (ECNN) to enhance the accuracy and robustness of the individual CNN model. For the ECNN design, three model aggregation approaches (weighted averaging, majority voting and stacking) were examined and a resampling strategy was used to enhance the diversity of individual CNN models. The results of MWL classification performance comparison indicated that the proposed ECNN framework can effectively improve MWL classification performance and is featured by entirely automatic feature extraction and MWL classification, when compared with traditional machine learning methods.
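The three aggregation rules mentioned above can be sketched on predicted class-probability matrices; the number of base models, the number of classes, and the weights are illustrative, and a logistic-regression meta-learner stands in for the stacking step.

```python
# Ensemble aggregation sketch: weighted averaging, majority vote, and stacking.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Class probabilities from 3 base models for 100 samples and 4 workload classes.
probas = rng.dirichlet(np.ones(4), size=(3, 100))
y = rng.integers(0, 4, size=100)
weights = np.array([0.5, 0.3, 0.2])

avg_pred = np.tensordot(weights, probas, axes=1).argmax(axis=1)        # weighted averaging
vote_pred = np.apply_along_axis(
    lambda v: np.bincount(v, minlength=4).argmax(), 0, probas.argmax(axis=2))  # majority vote
meta_X = probas.transpose(1, 0, 2).reshape(100, -1)                    # stacked meta-features
stack_pred = LogisticRegression(max_iter=1000).fit(meta_X, y).predict(meta_X)  # stacking
print(avg_pred[:5], vote_pred[:5], stack_pred[:5])
```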
Simple-random-sampling-based multiclass text classification algorithm.
Liu, Wuying; Wang, Lin; Yi, Mianzhu
2014-01-01
Multiclass text classification (MTC) is a challenging issue, and the corresponding MTC algorithms can be used in many applications. The space-time overhead of these algorithms is a real concern in the era of big data. Through an investigation of the token frequency distribution in a Chinese web document collection, this paper reexamines the power law and proposes a simple-random-sampling-based MTC (SRSMTC) algorithm. Supported by a token-level memory that stores labeled documents, the SRSMTC algorithm uses a text retrieval approach to solve text classification problems. The experimental results on the TanCorp data set show that the SRSMTC algorithm can achieve state-of-the-art performance at greatly reduced space-time requirements.
Quantitative real-time analysis of collective cancer invasion and dissemination
NASA Astrophysics Data System (ADS)
Ewald, Andrew J.
2015-05-01
A grand challenge in biology is to understand the cellular and molecular basis of tissue and organ level function in mammals. The ultimate goals of such efforts are to explain how organs arise in development from the coordinated actions of their constituent cells and to determine how molecularly regulated changes in cell behavior alter the structure and function of organs during disease processes. Two major barriers stand in the way of achieving these goals: the relative inaccessibility of cellular processes in mammals and the daunting complexity of the signaling environment inside an intact organ in vivo. To overcome these barriers, we have developed a suite of tissue isolation, three dimensional (3D) culture, genetic manipulation, nanobiomaterials, imaging, and molecular analysis techniques to enable the real-time study of cell biology within intact tissues in physiologically relevant 3D environments. This manuscript introduces the rationale for 3D culture, reviews challenges to optical imaging in these cultures, and identifies current limitations in the analysis of complex experimental designs that could be overcome with improved imaging, imaging analysis, and automated classification of the results of experimental interventions.
Bioinformatic perspectives on NRPS/PKS megasynthases: advances and challenges.
Jenke-Kodama, Holger; Dittmann, Elke
2009-07-01
The increased understanding of both fundamental principles and mechanistic variations of NRPS/PKS megasynthases, along with the unprecedented availability of microbial sequences, has inspired a number of in silico studies of both enzyme families. The insights that can be extracted from these analyses go far beyond a rough classification of data and have turned bioinformatics into a frontier field of natural products research. As databases are flooded with NRPS/PKS gene sequences from microbial genomes and metagenomes, increasingly reliable structural prediction methods can help to uncover hidden treasures. Already, phylogenetic analyses have revealed that NRPS/PKS pathways should not simply be regarded as enzyme complexes specifically evolved to produce a selected natural product. Rather, they represent a collection of genetic options that allows biosynthetic pathways to be shuffled in a process of perpetual chemical innovation, and pathway diversification in nature can give impulses for the specificities, protein interactions, and genetic engineering of libraries of novel peptides and polyketides. The successful translation of the knowledge obtained from bioinformatic dissection of NRPS/PKS megasynthases into new techniques for drug discovery and design remains a challenge for the future.
Stable architectures for deep neural networks
NASA Astrophysics Data System (ADS)
Haber, Eldad; Ruthotto, Lars
2018-01-01
Deep neural networks have become invaluable tools for supervised machine learning, e.g. classification of text or images. While often offering superior results over traditional techniques and successfully expressing complicated patterns in data, deep architectures are known to be challenging to design and train such that they generalize well to new data. Critical issues with deep architectures are numerical instabilities in derivative-based learning algorithms commonly called exploding or vanishing gradients. In this paper, we propose new forward propagation techniques inspired by systems of ordinary differential equations (ODE) that overcome this challenge and lead to well-posed learning problems for arbitrarily deep networks. The backbone of our approach is our interpretation of deep learning as a parameter estimation problem of nonlinear dynamical systems. Given this formulation, we analyze stability and well-posedness of deep learning and use this new understanding to develop new network architectures. We relate the exploding and vanishing gradient phenomenon to the stability of the discrete ODE and present several strategies for stabilizing deep learning for very deep networks. While our new architectures restrict the solution space, several numerical experiments show their competitiveness with state-of-the-art networks.
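The ODE view of forward propagation can be illustrated with a forward-Euler residual sweep, Y_{j+1} = Y_j + h*tanh(Y_j K_j + b_j), i.e., one explicit Euler step of dY/dt = tanh(Y K(t) + b(t)); the weights below are random placeholders, and the paper's specific stable parameterizations are not reproduced.

```python
# Forward-Euler residual propagation: a ResNet-style layer as an ODE time step.
import numpy as np

rng = np.random.default_rng(0)
n_features, n_layers, h = 4, 10, 0.1
Y = rng.normal(size=(32, n_features))                     # a batch of inputs
K = rng.normal(size=(n_layers, n_features, n_features)) * 0.5
b = rng.normal(size=(n_layers, n_features)) * 0.1

for j in range(n_layers):                                 # march through "time"
    Y = Y + h * np.tanh(Y @ K[j] + b[j])                  # one forward-Euler step
print(Y.shape)
```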
Nonlinear Deep Kernel Learning for Image Annotation.
Jiu, Mingyuan; Sahbi, Hichem
2017-02-08
Multiple kernel learning (MKL) is a widely used technique for kernel design. Its principle consists in learning, for a given support vector classifier, the most suitable convex (or sparse) linear combination of standard elementary kernels. However, these combinations are shallow and often powerless to capture the actual similarity between highly semantic data, especially for challenging classification tasks such as image annotation. In this paper, we redefine multiple kernels using deep multi-layer networks. In this new contribution, a deep multiple kernel is recursively defined as a multi-layered combination of nonlinear activation functions, each of which involves a combination of several elementary or intermediate kernels, and results in a positive semi-definite deep kernel. We propose four different frameworks in order to learn the weights of these networks: supervised, unsupervised, kernel-based semi-supervised, and Laplacian-based semi-supervised. When plugged into support vector machines (SVMs), the resulting deep kernel networks show clear gains compared to several shallow kernels for the task of image annotation. Extensive experiments and analysis on the challenging ImageCLEF photo annotation benchmark, the COREL5k database, and the Banana dataset validate the effectiveness of the proposed method.
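For contrast with the deep kernels proposed in the paper, the shallow multiple-kernel baseline it builds on can be sketched as a fixed convex combination of elementary RBF kernels fed to an SVM as a precomputed Gram matrix; the kernels, weights, and data below are illustrative, and the paper's multi-layer kernel networks are not reproduced.

```python
# Shallow multiple-kernel baseline: convex combination of RBF kernels + precomputed-kernel SVM.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 10))
y = (X[:, 0] * X[:, 1] > 0).astype(int)

gammas, weights = [0.1, 1.0, 10.0], [0.5, 0.3, 0.2]      # elementary kernels and their weights
K = sum(w * rbf_kernel(X, X, gamma=g) for w, g in zip(weights, gammas))

clf = SVC(kernel="precomputed").fit(K, y)
print("training accuracy:", clf.score(K, y))
```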
A Social Media Based Index of Mental Well-Being in College Campuses
Bagroy, Shrey; Kumaraguru, Ponnurangam; De Choudhury, Munmun
2017-01-01
Psychological distress in the form of depression, anxiety and other mental health challenges among college students is a growing health concern. Dearth of accurate, continuous, and multi-campus data on mental well-being presents significant challenges to intervention and mitigation efforts in college campuses. We examine the potential of social media as a new “barometer” for quantifying the mental well-being of college populations. Utilizing student-contributed data in Reddit communities of over 100 universities, we first build and evaluate a transfer learning based classification approach that can detect mental health expressions with 97% accuracy. Thereafter, we propose a robust campus-specific Mental Well-being Index: MWI. We find that MWI is able to reveal meaningful temporal patterns of mental well-being in campuses, and to assess how their expressions relate to university attributes like size, academic prestige, and student demographics. We discuss the implications of our work for improving counselor efforts, and in the design of tools that can enable better assessment of the mental health climate of college campuses. PMID:28840202
Tsimmerman, Ia S
2008-01-01
The new International Classification of Chronic Pancreatitis (designated as M-ANNHEIM) proposed by a group of German specialists in late 2007 is reviewed. All its sections are subjected to analysis (risk group categories, clinical stages and phases, variants of clinical course, diagnostic criteria for "established" and "suspected" pancreatitis, instrumental methods and functional tests used in the diagnosis, evaluation of the severity of the disease using a scoring system, stages of elimination of pain syndrome). The new classification is compared with the earlier classification proposed by the author. Its merits and demerits are discussed.
Nutritional Management in Enterocutaneous Fistula. What is the evidence?
BADRASAWI, Manal; SHAHAR, Suzana; SAGAP, Ismail
2015-01-01
The management of enterocutaneous fistula (ECF) is challenging. It remains associated with morbidity and mortality, despite advancements in medical and surgical therapies. Early nutritional support using parenteral, enteral, or fistuloclysis routes is essential to reverse catabolism and replace nutrient, fluid, and electrolyte losses. This study aims to review the current literature on the management of ECF. Fistula classification has an impact on calorie and protein requirements. Early nutritional support with parenteral nutrition, enteral nutrition, or fistuloclysis played a significant role in the management outcome. Published literature on the nutritional management of ECF is mostly retrospective and lacks experimental design. Prospective studies do not investigate nutritional assessment or management experimentally. Individualising the nutritional management protocol was recommended due to the absence of management guidelines for ECF patients. PMID:26715903
Abnormal global and local event detection in compressive sensing domain
NASA Astrophysics Data System (ADS)
Wang, Tian; Qiao, Meina; Chen, Jie; Wang, Chuanyun; Zhang, Wenjia; Snoussi, Hichem
2018-05-01
Abnormal event detection, also known as anomaly detection, is a challenging task in security video surveillance. It is important to develop effective and robust movement representation models for global and local abnormal event detection that are resilient to factors such as occlusion and illumination change. In this paper, a new algorithm is proposed. It can locate abnormal events within a frame and detect globally abnormal frames. The proposed algorithm employs a sparse measurement matrix designed to represent the movement feature based on optical flow efficiently. The abnormal detection task is then formulated as a one-class classification problem, learned only from normal training samples. Experiments demonstrate that our algorithm performs well on the benchmark abnormal detection datasets against state-of-the-art methods.
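A sketch of the two ingredients described above: a random sparse measurement matrix compressing a (stand-in) motion feature vector, followed by a one-class classifier trained only on normal samples. Optical-flow feature extraction is assumed to happen upstream, and the data here is synthetic.

```python
# Compressive projection + one-class classification sketch for anomaly detection.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
d, m = 512, 64                                    # original / compressed feature dimensions
Phi = rng.choice([0.0, 1.0, -1.0], size=(m, d), p=[0.9, 0.05, 0.05])  # sparse measurement matrix

normal = rng.normal(size=(300, d))                # stand-in for normal motion features
test = np.vstack([rng.normal(size=(5, d)),        # normal-like samples
                  rng.normal(loc=3.0, size=(5, d))])  # anomalous samples

oc = OneClassSVM(nu=0.05, gamma="scale").fit(normal @ Phi.T)
print(oc.predict(test @ Phi.T))                   # +1 = normal, -1 = abnormal
```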
DOE Office of Scientific and Technical Information (OSTI.GOV)
Honorio, J.; Goldstein, R.; Honorio, J.
We propose a simple, well-grounded classification technique which is suited for group classification on brain fMRI data sets that have high dimensionality, a small number of subjects, high noise level, high subject variability, imperfect registration, and that capture subtle cognitive effects. We propose threshold-split region as a new feature selection method and majority vote as the classification technique. Our method does not require a predefined set of regions of interest. We use the average across sessions, only one feature per experimental condition, a feature independence assumption, and simple classifiers. The seemingly counter-intuitive approach of using a simple design is supported by signal processing and statistical theory. Experimental results in two block design data sets that capture brain function under distinct monetary rewards for cocaine addicted and control subjects show that our method exhibits increased generalization accuracy compared to commonly used feature selection and classification techniques.
NASA Astrophysics Data System (ADS)
Cardille, J. A.; Crowley, M.; Fortin, J. A.; Lee, J.; Perez, E.; Sleeter, B. M.; Thau, D.
2016-12-01
With the opening of the Landsat archive, researchers have a vast new data source teeming with imagery and potential. Beyond Landsat, data from other sensors is newly available as well: these include ALOS/PALSAR, Sentinel-1 and -2, MERIS, and many more. Google Earth Engine, developed to organize and provide analysis tools for these immense data sets, is an ideal platform for researchers trying to sift through huge image stacks. It offers nearly unlimited processing power and storage with a straightforward programming interface. Yet labeling land-cover change through time remains challenging given the current state of the art for interpreting remote sensing image sequences. Moreover, combining data from very different image platforms remains quite difficult. To address these challenges, we developed the BULC algorithm (Bayesian Updating of Land Cover), designed for the continuous updating of land-cover classifications through time in large data sets. The algorithm ingests data from any of the wide variety of earth-resources sensors; it maintains a running estimate of land-cover probabilities and the most probable class at all time points along a sequence of events. Here we compare BULC results from two study sites that witnessed considerable forest change in the last 40 years: the Pacific Northwest of the United States and the Mato Grosso region of Brazil. In Brazil, we incorporated rough classifications from more than 100 images of varying quality, mixing imagery from more than 10 different sensors. In the Pacific Northwest, we used BULC to identify forest changes due to logging and urbanization from 1973 to the present. Both regions had classification sequences that were better than many of the component days, effectively ignoring clouds and other unwanted noise while fusing the information contained on several platforms. As we leave remote sensing's data-poor era and enter a period with multiple looks at Earth's surface from multiple sensors over a short period of time, the BULC algorithm can help to sift through images of varying quality in Google Earth Engine to extract the most useful information for mapping the state and history of Earth's land cover.
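In the spirit of BULC, per-pixel class probabilities can be maintained and refreshed with Bayes' rule each time a new (noisy) classification arrives, using a confusion-matrix likelihood; the class set, accuracies, and event sequence below are illustrative, not the values used in these studies.

```python
# Toy Bayesian updating of land-cover class probabilities for one pixel.
import numpy as np

classes = ["forest", "cleared", "urban"]
prior = np.array([1 / 3, 1 / 3, 1 / 3])

# conf[i, j] = P(new image labels the pixel j | true class is i)  -- assumed accuracies
conf = np.array([[0.80, 0.15, 0.05],
                 [0.10, 0.80, 0.10],
                 [0.05, 0.15, 0.80]])

def update(prob, observed_label):
    likelihood = conf[:, observed_label]      # P(observation | each candidate true class)
    post = prob * likelihood
    return post / post.sum()

prob = prior
for obs in [0, 0, 1, 1, 1]:                   # sequence of per-image labels for this pixel
    prob = update(prob, obs)
    print(dict(zip(classes, prob.round(3))))
```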
NASA Astrophysics Data System (ADS)
Cardille, J. A.
2015-12-01
With the opening of the Landsat archive, researchers have a vast new data source teeming with imagery and potential. Beyond Landsat, data from other sensors is newly available as well: these include ALOS/PALSAR, Sentinel-1 and -2, MERIS, and many more. Google Earth Engine, developed to organize and provide analysis tools for these immense data sets, is an ideal platform for researchers trying to sift through huge image stacks. It offers nearly unlimited processing power and storage with a straightforward programming interface. Yet labeling forest change through time remains challenging given the current state of the art for interpreting remote sensing image sequences. Moreover, combining data from very different image platforms remains quite difficult. To address these challenges, we developed the BULC algorithm (Bayesian Updating of Land Cover), designed for the continuous updating of land-cover classifications through time in large data sets. The algorithm ingests data from any of the wide variety of earth-resources sensors; it maintains a running estimate of land-cover probabilities and the most probable class at all time points along a sequence of events. Here we compare BULC results from two study sites that witnessed considerable forest change in the last 40 years: the Pacific Northwest of the United States and the Mato Grosso region of Brazil. In Brazil, we incorporated rough classifications from more than 100 images of varying quality, mixing imagery from more than 10 different sensors. In the Pacific Northwest, we used BULC to identify forest changes due to logging and urbanization from 1973 to the present. Both regions had classification sequences that were better than many of the component days, effectively ignoring clouds and other unwanted signal while fusing the information contained on several platforms. As we leave remote sensing's data-poor era and enter a period with multiple looks at Earth's surface from multiple sensors over a short period of time, this algorithm may help to sift through images of varying quality in Google Earth Engine to extract the most useful information for mapping.
NASA Astrophysics Data System (ADS)
Montereale Gavazzi, G.; Madricardo, F.; Janowski, L.; Kruss, A.; Blondel, P.; Sigovini, M.; Foglini, F.
2016-03-01
Recent technological developments of multibeam echosounder systems (MBES) allow mapping of benthic habitats with unprecedented detail. MBES can now be employed in extremely shallow waters, challenging data acquisition (as these instruments were often designed for deeper waters) and data interpretation (honed on datasets with resolution sometimes orders of magnitude lower). With extremely high-resolution bathymetry and co-located backscatter data, it is now possible to map the spatial distribution of fine scale benthic habitats, even identifying the acoustic signatures of single sponges. In this context, it is necessary to understand which of the commonly used segmentation methods is best suited to account for such level of detail. At the same time, new sampling protocols for precisely geo-referenced ground truth data need to be developed to validate the benthic environmental classification. This study focuses on a dataset collected in a shallow (2-10 m deep) tidal channel of the Lagoon of Venice, Italy. Using 0.05-m and 0.2-m raster grids, we compared a range of classifications, both pixel-based and object-based approaches, including manual, Maximum Likelihood Classifier, Jenks Optimization clustering, textural analysis and Object Based Image Analysis. Through a comprehensive and accurately geo-referenced ground truth dataset, we were able to identify five different classes of the substrate composition, including sponges, mixed submerged aquatic vegetation, mixed detritic bottom (fine and coarse) and unconsolidated bare sediment. We computed estimates of accuracy (namely Overall, User, Producer Accuracies and the Kappa statistic) by cross tabulating predicted and reference instances. Overall, pixel based segmentations produced the highest accuracies and the accuracy assessment is strongly dependent on the number of classes chosen for the thematic output. Tidal channels in the Venice Lagoon are extremely important in terms of habitats and sediment distribution, particularly within the context of the new tidal barrier being built. However, they had remained largely unexplored until now, because of the surveying challenges. The application of this remote sensing approach, combined with targeted sampling, opens a new perspective in the monitoring of benthic habitats in view of a knowledge-based management of natural resources in shallow coastal areas.
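The accuracy estimates mentioned above follow directly from cross-tabulating predicted against reference classes, as in this sketch with synthetic labels.

```python
# Overall, producer's, and user's accuracy plus kappa from a confusion matrix.
import numpy as np
from sklearn.metrics import confusion_matrix, cohen_kappa_score

rng = np.random.default_rng(0)
reference = rng.integers(0, 5, size=200)                 # 5 substrate classes (synthetic)
predicted = np.where(rng.random(200) < 0.8, reference, rng.integers(0, 5, size=200))

cm = confusion_matrix(reference, predicted)              # rows = reference, columns = predicted
overall = np.trace(cm) / cm.sum()
producer = np.diag(cm) / cm.sum(axis=1)                  # per-class producer's accuracy
user = np.diag(cm) / cm.sum(axis=0)                      # per-class user's accuracy
print(overall, producer.round(2), user.round(2), cohen_kappa_score(reference, predicted))
```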
A Classification of Designated Logic Systems
1988-02-01
Front-matter fragment: cites Introduction to Logic (New York: Macmillan Publishing, 1972) and Grimaldi, Ralph P., Discrete and Combinatorial Mathematics (Reading: Addison-Wesley)... sound basis for understanding non-classical logic systems. I would like to thank the Air Force Institute of Technology for funding this research. Illustrations include: 1. Two Classifications of Designated Logic Systems; 2. Two Partitions of Two-valued Logic Systems; 3. Two...
Benchmarking protein classification algorithms via supervised cross-validation.
Kertész-Farkas, Attila; Dhir, Somdutta; Sonego, Paolo; Pacurar, Mircea; Netoteia, Sergiu; Nijveen, Harm; Kuzniar, Arnold; Leunissen, Jack A M; Kocsor, András; Pongor, Sándor
2008-04-24
Development and testing of protein classification algorithms are hampered by the fact that the protein universe is characterized by groups vastly different in the number of members, in average protein size, similarity within group, etc. Datasets based on traditional cross-validation (k-fold, leave-one-out, etc.) may not give reliable estimates on how an algorithm will generalize to novel, distantly related subtypes of the known protein classes. Supervised cross-validation, i.e., selection of test and train sets according to the known subtypes within a database has been successfully used earlier in conjunction with the SCOP database. Our goal was to extend this principle to other databases and to design standardized benchmark datasets for protein classification. Hierarchical classification trees of protein categories provide a simple and general framework for designing supervised cross-validation strategies for protein classification. Benchmark datasets can be designed at various levels of the concept hierarchy using a simple graph-theoretic distance. A combination of supervised and random sampling was selected to construct reduced size model datasets, suitable for algorithm comparison. Over 3000 new classification tasks were added to our recently established protein classification benchmark collection that currently includes protein sequence (including protein domains and entire proteins), protein structure and reading frame DNA sequence data. We carried out an extensive evaluation based on various machine-learning algorithms such as nearest neighbor, support vector machines, artificial neural networks, random forests and logistic regression, used in conjunction with comparison algorithms, BLAST, Smith-Waterman, Needleman-Wunsch, as well as 3D comparison methods DALI and PRIDE. The resulting datasets provide lower, and in our opinion more realistic estimates of the classifier performance than do random cross-validation schemes. A combination of supervised and random sampling was used to construct model datasets, suitable for algorithm comparison.
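Subtype-aware ("supervised") cross-validation can be sketched by holding out entire known subtypes, here using scikit-learn's GroupKFold as a convenient stand-in for the tree-guided selection described in the paper; the features, class labels, and subtype assignments are synthetic.

```python
# Hold out whole subtypes so evaluation measures generalization to unseen subtypes.
import numpy as np
from sklearn.model_selection import GroupKFold
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))                 # stand-in for sequence-derived features
superfamily = rng.integers(0, 2, size=300)     # class to predict
subtype = rng.integers(0, 6, size=300)         # known subtype within each class

for train, test in GroupKFold(n_splits=3).split(X, superfamily, groups=subtype):
    clf = KNeighborsClassifier().fit(X[train], superfamily[train])
    held_out = sorted({int(g) for g in subtype[test]})
    print(f"held-out subtypes {held_out}: accuracy {clf.score(X[test], superfamily[test]):.2f}")
```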
KineAssist: design and development of a robotic overground gait and balance therapy device.
Patton, James; Brown, David A; Peshkin, Michael; Santos-Munné, Julio J; Makhlin, Alex; Lewis, Ela; Colgate, Edward J; Schwandt, Doug
2008-01-01
Balance and mobility training consists of activities that carry a high risk for falling. The purpose of this article is to describe a novel robotic system for allowing challenging, yet safe, balance and mobility training in persons at high risk for falls. With no initial preconceptions of what device we would build, a user-needs analysis led us to focus on increasing the level of challenge to a patient's ability to maintain balance during gait training and also on maintaining direct involvement of a physical therapist (rather than attempting robotic replacement). The KineAssist is a robotic device for gait and balance training that has emerged from a unique design process of a start-up product of a small company and a team of therapists, engineers, mechanical design experts, and rehabilitation scientists. The KineAssist provides partial body weight support and postural control on the torso; allows many axes of motion of the trunk and pelvis; leaves the patient's legs accessible to a physical therapist's manipulation during walking; follows a patient's walking motions overground in forward, rotation, and sidestepping directions; and catches an individual who loses balance and begins to fall. Design and development of the KineAssist proceeded more rapidly in the context of a small company than would have been possible in most institutional research contexts. A prototype KineAssist has been constructed and has received US Food and Drug Administration (FDA) classification and institutional review board clearance for initial human studies. The acceptance of KineAssist will ultimately depend on improved patient outcomes, the use of this new tool by therapists, the ease of use of the system, and the recognition of the unique value it brings to therapeutic recovery.
Realization of Configurable One-Dimensional Reflectarray
2017-08-31
Abstract fragment: A fundamental challenge remains in dynamically controlling the steering of long wavelength radiation (λ > 8 μm) using metal nanostructures or metamaterials (with critical... Subject terms: dynamic, nanoribbons.
32 CFR 1907.03 - Contact for general information and requests.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., Agency Release Panel, Central Intelligence Agency, Washington, DC 20505. The commercial (non-secure... INTELLIGENCE AGENCY CHALLENGES TO CLASSIFICATION OF DOCUMENTS BY AUTHORIZED HOLDERS PURSUANT TO § 1.9 OF...
Predicting Flavonoid UGT Regioselectivity
Jackson, Rhydon; Knisley, Debra; McIntosh, Cecilia; Pfeiffer, Phillip
2011-01-01
Machine learning was applied to a challenging and biologically significant protein classification problem: the prediction of flavonoid UGT acceptor regioselectivity from primary sequence. Novel indices characterizing graphical models of residues were proposed and found to be widely distributed among existing amino acid indices and to cluster residues appropriately. UGT subsequences biochemically linked to regioselectivity were modeled as sets of index sequences. Several learning techniques incorporating these UGT models were compared with classifications based on standard sequence alignment scores. These techniques included an application of time series distance functions to protein classification. Time series distances defined on the index sequences were used in nearest neighbor and support vector machine classifiers. Additionally, Bayesian neural network classifiers were applied to the index sequences. The experiments identified improvements over the nearest neighbor and support vector machine classifications relying on standard alignment similarity scores, as well as strong correlations between specific subsequences and regioselectivities. PMID:21747849
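The use of a time-series distance on residue-index sequences can be sketched by passing a small dynamic-time-warping distance to a nearest-neighbour classifier as a custom metric; the index sequences and regioselectivity labels below are synthetic placeholders.

```python
# DTW distance on index sequences used as a custom metric for k-nearest-neighbour classification.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def dtw(a, b):
    """Classic O(n*m) dynamic-time-warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 25))        # per-residue index sequences (fixed length)
y = rng.integers(0, 2, size=60)      # two regioselectivity classes (synthetic)

knn = KNeighborsClassifier(n_neighbors=3, metric=dtw).fit(X, y)
print(knn.predict(X[:5]))
```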
ERIC Educational Resources Information Center
Funk, Kerri L.; Tseng, M. S.
Two groups of 32 educable mentally retarded children (ages 7 to 14 years) were compared as to their arithmetic and classification performances attributable to the presence or absence of a 4 1/2 week exposure to classification tasks. The randomized block pretest-posttest design was used. The experimental group and the control group were matched on…
ERIC Educational Resources Information Center
Kunina-Habenicht, Olga; Rupp, Andre A.; Wilhelm, Oliver
2012-01-01
Using a complex simulation study we investigated parameter recovery, classification accuracy, and performance of two item-fit statistics for correct and misspecified diagnostic classification models within a log-linear modeling framework. The basic manipulated test design factors included the number of respondents (1,000 vs. 10,000), attributes (3…
ERIC Educational Resources Information Center
International Federation of Library Associations and Institutions, London (England).
Five papers from the sessions of the International Federation of Library Associations and Institutions 1992 conference on classification, indexing, and cataloging are presented. Three papers deal with knowledge classification as it relates to database design, as it is practiced in India, and in a worldwide context. The remaining two papers focus…
Design of partially supervised classifiers for multispectral image data
NASA Technical Reports Server (NTRS)
Jeon, Byeungwoo; Landgrebe, David
1993-01-01
A partially supervised classification problem is addressed, especially when the class definition and corresponding training samples are provided a priori only for one particular class. In practical applications of pattern classification techniques, a frequently observed obstacle is the heavy, often nearly impossible, requirement for representative prior statistical characteristics of all classes in a given data set. Considering the effort, in both time and man-power, required to obtain a well-defined, exhaustive list of classes with a corresponding representative set of training samples, this 'partially' supervised capability would be very desirable, assuming adequate classifier performance can be obtained. Two different classification algorithms are developed to achieve simplicity in classifier design by reducing the requirement for prior statistical information without sacrificing significant classifying capability. The first is based on optimal significance testing, where the optimal acceptance probability is estimated directly from the data set. In the second approach, the partially supervised classification is treated as a problem of unsupervised clustering with initially one known cluster or class. A weighted unsupervised clustering procedure is developed to automatically define other classes and estimate their class statistics. The operational simplicity thus realized should make these partially supervised classification schemes very viable tools in pattern classification.
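The first algorithm described above can be caricatured as follows: model only the known class and accept a sample when its likelihood exceeds a threshold estimated from the training data itself. The Gaussian model and the 5th-percentile threshold are illustrative simplifications of the paper's optimal significance test, and the data is synthetic.

```python
# One-known-class acceptance test: Gaussian likelihood with a data-driven threshold.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
train = rng.normal(loc=0.0, size=(500, 4))           # pixels of the single known class
mean, cov = train.mean(axis=0), np.cov(train, rowvar=False)
pdf = multivariate_normal(mean, cov)

threshold = np.percentile(pdf.logpdf(train), 5)      # acceptance threshold from the training data

candidates = np.vstack([rng.normal(size=(5, 4)),     # class-like pixels
                        rng.normal(loc=4.0, size=(5, 4))])  # background pixels
print(pdf.logpdf(candidates) >= threshold)           # True = assigned to the known class
```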
Agent Collaborative Target Localization and Classification in Wireless Sensor Networks
Wang, Xue; Bi, Dao-wei; Ding, Liang; Wang, Sheng
2007-01-01
Wireless sensor networks (WSNs) are autonomous networks that have been frequently deployed to collaboratively perform target localization and classification tasks. Their autonomous and collaborative features resemble the characteristics of agents. Such similarities inspire the development of heterogeneous agent architecture for WSN in this paper. The proposed agent architecture views WSN as multi-agent systems and mobile agents are employed to reduce in-network communication. According to the architecture, an energy based acoustic localization algorithm is proposed. In localization, estimate of target location is obtained by steepest descent search. The search algorithm adapts to measurement environments by dynamically adjusting its termination condition. With the agent architecture, target classification is accomplished by distributed support vector machine (SVM). Mobile agents are employed for feature extraction and distributed SVM learning to reduce communication load. Desirable learning performance is guaranteed by combining support vectors and convex hull vectors. Fusion algorithms are designed to merge SVM classification decisions made from various modalities. Real world experiments with MICAz sensor nodes are conducted for vehicle localization and classification. Experimental results show the proposed agent architecture remarkably facilitates WSN designs and algorithm implementation. The localization and classification algorithms also prove to be accurate and energy efficient.
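The energy-based localization step can be sketched as steepest descent on the squared residual of an S/d^2 propagation model; the geometry, known source strength, fixed normalized step, and iteration count are illustrative simplifications (the adaptive termination condition described above is not reproduced).

```python
# Steepest-descent localization from sensor energy readings under a 1/d^2 decay model.
import numpy as np

rng = np.random.default_rng(0)
sensors = rng.uniform(0, 100, size=(8, 2))
true_source, S = np.array([40.0, 60.0]), 1e4
energies = S / np.sum((sensors - true_source) ** 2, axis=1)   # noise-free readings

def loss(x):
    pred = S / np.sum((sensors - x) ** 2, axis=1)
    return np.sum((pred - energies) ** 2)

x = np.array([50.0, 50.0])                                    # initial guess
eps = 1e-3
for _ in range(500):                                          # steepest-descent iterations
    grad = np.array([(loss(x + d) - loss(x - d)) / (2 * eps)
                     for d in (np.array([eps, 0.0]), np.array([0.0, eps]))])
    x -= 0.5 * grad / (np.linalg.norm(grad) + 1e-12)          # normalized descent step
print("estimate:", x.round(2), "truth:", true_source)
```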
Li, Zhan; Schaefer, Michael; Strahler, Alan; Schaaf, Crystal; Jupp, David
2018-04-06
The Dual-Wavelength Echidna Lidar (DWEL), a full waveform terrestrial laser scanner (TLS), has been used to scan a variety of forested and agricultural environments. From these scanning campaigns, we summarize the benefits and challenges given by DWEL's novel coaxial dual-wavelength scanning technology, particularly for the three-dimensional (3D) classification of vegetation elements. Simultaneous scanning at both 1064 nm and 1548 nm by DWEL instruments provides a new spectral dimension to TLS data that joins the 3D spatial dimension of lidar as an information source. Our point cloud classification algorithm explores the utilization of both spectral and spatial attributes of individual points from DWEL scans and highlights the strengths and weaknesses of each attribute domain. The spectral and spatial attributes for vegetation element classification each perform better in different parts of vegetation (canopy interior, fine branches, coarse trunks, etc.) and under different vegetation conditions (dead or live, leaf-on or leaf-off, water content, etc.). These environmental characteristics of vegetation, convolved with the lidar instrument specifications and lidar data quality, result in the actual capabilities of spectral and spatial attributes to classify vegetation elements in 3D space. The spectral and spatial information domains thus complement each other in the classification process. The joint use of both not only enhances the classification accuracy but also reduces its variance across the multiple vegetation types we have examined, highlighting the value of the DWEL as a new source of 3D spectral information. Wider deployment of the DWEL instruments is in practice currently held back by challenges in instrument development and the demands of data processing required by coaxial dual- or multi-wavelength scanning. But the simultaneous 3D acquisition of both spectral and spatial features, offered by new multispectral scanning instruments such as the DWEL, opens doors to study biophysical and biochemical properties of forested and agricultural ecosystems at more detailed scales.
Actionable exomic incidental findings in 6503 participants: challenges of variant classification.
Amendola, Laura M; Dorschner, Michael O; Robertson, Peggy D; Salama, Joseph S; Hart, Ragan; Shirts, Brian H; Murray, Mitzi L; Tokita, Mari J; Gallego, Carlos J; Kim, Daniel Seung; Bennett, James T; Crosslin, David R; Ranchalis, Jane; Jones, Kelly L; Rosenthal, Elisabeth A; Jarvik, Ella R; Itsara, Andy; Turner, Emily H; Herman, Daniel S; Schleit, Jennifer; Burt, Amber; Jamal, Seema M; Abrudan, Jenica L; Johnson, Andrew D; Conlin, Laura K; Dulik, Matthew C; Santani, Avni; Metterville, Danielle R; Kelly, Melissa; Foreman, Ann Katherine M; Lee, Kristy; Taylor, Kent D; Guo, Xiuqing; Crooks, Kristy; Kiedrowski, Lesli A; Raffel, Leslie J; Gordon, Ora; Machini, Kalotina; Desnick, Robert J; Biesecker, Leslie G; Lubitz, Steven A; Mulchandani, Surabhi; Cooper, Greg M; Joffe, Steven; Richards, C Sue; Yang, Yaoping; Rotter, Jerome I; Rich, Stephen S; O'Donnell, Christopher J; Berg, Jonathan S; Spinner, Nancy B; Evans, James P; Fullerton, Stephanie M; Leppig, Kathleen A; Bennett, Robin L; Bird, Thomas; Sybert, Virginia P; Grady, William M; Tabor, Holly K; Kim, Jerry H; Bamshad, Michael J; Wilfond, Benjamin; Motulsky, Arno G; Scott, C Ronald; Pritchard, Colin C; Walsh, Tom D; Burke, Wylie; Raskind, Wendy H; Byers, Peter; Hisama, Fuki M; Rehm, Heidi; Nickerson, Debbie A; Jarvik, Gail P
2015-03-01
Recommendations for laboratories to report incidental findings from genomic tests have stimulated interest in such results. In order to investigate the criteria and processes for assigning the pathogenicity of specific variants and to estimate the frequency of such incidental findings in patients of European and African ancestry, we classified potentially actionable pathogenic single-nucleotide variants (SNVs) in all 4300 European- and 2203 African-ancestry participants sequenced by the NHLBI Exome Sequencing Project (ESP). We considered 112 gene-disease pairs selected by an expert panel as associated with medically actionable genetic disorders that may be undiagnosed in adults. The resulting classifications were compared to classifications from other clinical and research genetic testing laboratories, as well as with in silico pathogenicity scores. Among European-ancestry participants, 30 of 4300 (0.7%) had a pathogenic SNV and six (0.1%) had a disruptive variant that was expected to be pathogenic, whereas 52 (1.2%) had likely pathogenic SNVs. For African-ancestry participants, six of 2203 (0.3%) had a pathogenic SNV and six (0.3%) had an expected pathogenic disruptive variant, whereas 13 (0.6%) had likely pathogenic SNVs. Genomic Evolutionary Rate Profiling mammalian conservation score and the Combined Annotation Dependent Depletion summary score of conservation, substitution, regulation, and other evidence were compared across pathogenicity assignments and appear to have utility in variant classification. This work provides a refined estimate of the burden of adult onset, medically actionable incidental findings expected from exome sequencing, highlights challenges in variant classification, and demonstrates the need for a better curated variant interpretation knowledge base. © 2015 Amendola et al.; Published by Cold Spring Harbor Laboratory Press.
Domain Adaptation for Alzheimer’s Disease Diagnostics
Wachinger, Christian; Reuter, Martin
2016-01-01
With the increasing prevalence of Alzheimer’s disease, research focuses on the early computer-aided diagnosis of dementia with the goal to understand the disease process, determine risk and preserving factors, and explore preventive therapies. By now, large amounts of data from multi-site studies have been made available for developing, training, and evaluating automated classifiers. Yet, their translation to the clinic remains challenging, in part due to their limited generalizability across different datasets. In this work, we describe a compact classification approach that mitigates overfitting by regularizing the multinomial regression with the mixed ℓ1/ℓ2 norm. We combine volume, thickness, and anatomical shape features from MRI scans to characterize neuroanatomy for the three-class classification of Alzheimer’s disease, mild cognitive impairment and healthy controls. We demonstrate high classification accuracy via independent evaluation within the scope of the CADDementia challenge. We, furthermore, demonstrate that variations between source and target datasets can substantially influence classification accuracy. The main contribution of this work addresses this problem by proposing an approach for supervised domain adaptation based on instance weighting. Integration of this method into our classifier allows us to assess different strategies for domain adaptation. Our results demonstrate (i) that training on only the target training set yields better results than the naïve combination (union) of source and target training sets, and (ii) that domain adaptation with instance weighting yields the best classification results, especially if only a small training component of the target dataset is available. These insights imply that successful deployment of systems for computer-aided diagnostics to the clinic depends not only on accurate classifiers that avoid overfitting, but also on a dedicated domain adaptation strategy. PMID:27262241
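Instance weighting for domain adaptation can be sketched with a domain discriminator whose "target-likeness" probabilities weight the source samples when the final classifier is fit; plain logistic regression stands in for the mixed ℓ1/ℓ2-regularized multinomial model, and the features are synthetic stand-ins for the neuroanatomical measures described above.

```python
# Instance-weighted training: weight source samples by estimated target-likeness.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
Xs, ys = rng.normal(size=(400, 10)), rng.integers(0, 3, size=400)         # large source set
Xt, yt = rng.normal(loc=0.5, size=(40, 10)), rng.integers(0, 3, size=40)  # small labeled target set

# Domain discriminator: 0 = source, 1 = target.
dom = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xs, Xt]), np.r_[np.zeros(len(Xs)), np.ones(len(Xt))])
w_src = dom.predict_proba(Xs)[:, 1]               # target-likeness of each source sample

clf = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xs, Xt]), np.r_[ys, yt],
    sample_weight=np.r_[w_src, np.ones(len(Xt))])
print("target accuracy:", clf.score(Xt, yt))
```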
UAV Research at NASA Langley: Towards Safe, Reliable, and Autonomous Operations
NASA Technical Reports Server (NTRS)
Davila, Carlos G.
2016-01-01
Unmanned Aerial Vehicles (UAV) are fundamental components in several aspects of research at NASA Langley, such as flight dynamics, mission-driven airframe design, airspace integration demonstrations, atmospheric science projects, and more. In particular, NASA Langley Research Center (Langley) is using UAVs to develop and demonstrate innovative capabilities that meet the autonomy and robotics challenges that are anticipated in science, space exploration, and aeronautics. These capabilities will enable new NASA missions such as asteroid rendezvous and retrieval (ARRM), Mars exploration, in-situ resource utilization (ISRU), pollution measurements in historically inaccessible areas, and the integration of UAVs into our everyday lives: all missions of increasing complexity, distance, pace, and/or accessibility. Building on decades of NASA experience and success in the design, fabrication, and integration of robust and reliable automated systems for space and aeronautics, the Langley Autonomy Incubator seeks to bridge the gap between automation and autonomy by enabling safe autonomous operations via onboard sensing and perception systems in both data-rich and data-deprived environments. The Autonomy Incubator is focused on the challenge of mobility and manipulation in dynamic and unstructured environments by integrating technologies such as computer vision, visual odometry, real-time mapping, path planning, object detection and avoidance, object classification, adaptive control, sensor fusion, machine learning, and natural human-machine teaming. These technologies are implemented in an architectural framework developed in-house for easy integration and interoperability of cutting-edge hardware and software.
NASA Astrophysics Data System (ADS)
Milner, G. Martin
2005-05-01
ChemSentry is a portable system used to detect, identify, and quantify chemical warfare (CW) agents. Electrochemical (EC) cell sensor technology is used for blood agents, and an array of surface acoustic wave (SAW) sensors is used for nerve and blister agents. The combination of the EC cell and the SAW array provides sufficient sensor information to detect, classify, and quantify all CW agents of concern using smaller, lighter, lower-cost units. Initial development of the SAW array and processing was a key challenge for ChemSentry, requiring several years of fundamental testing of polymers and coating methods to finalize the sensor array design in 2001. Following the finalization of the SAW array, nearly three years of intensive testing in both laboratory and field environments were required to gather sufficient data to fully understand the response characteristics. Virtually unbounded permutations of agent characteristics and environmental characteristics must be considered in order to operate against all agents and all environments of interest to the U.S. military and other potential users of ChemSentry. The resulting signal processing design, matched to this extensive body of measured data (over 8,000 agent challenges and 10,000 hours of ambient data), is considered to be a significant advance in the state of the art for CW agent detection.
Validating the performance of vehicle classification stations : executive summary report.
DOT National Transportation Integrated Search
2012-05-01
Vehicle classification data are used in many transportation applications, including pavement design, environmental impact studies, traffic control, and traffic safety. Typical of most developed countries, every state in the US maintains a networ...
Mining vehicle classifications from the Columbus Metropolitan Freeway Management System.
DOT National Transportation Integrated Search
2015-01-01
Vehicle classification data are used in many transportation applications, including pavement design, environmental impact studies, traffic control, and traffic safety. Ohio has over 200 permanent count stations, supplemented by many more short-t...
NASA Astrophysics Data System (ADS)
Warren, Sean N.; Kallu, Raj R.; Barnard, Chase K.
2016-11-01
Underground gold mines in Nevada are exploiting increasingly deeper ore bodies composed of weak to very weak rock masses. The Rock Mass Rating (RMR) classification system is widely used at underground gold mines in Nevada and is applicable in fair- to good-quality rock masses, but it is difficult to apply and loses reliability in very weak rock mass to soil-like material. Because very weak rock masses are transition materials that border engineering rock mass and soil classification systems, soil classification may sometimes be easier and more appropriate for providing insight into material behavior and properties. The Unified Soil Classification System (USCS) is the most likely choice for the classification of very weak rock mass to soil-like material because of its accepted use in tunnel engineering projects and its ability to predict soil-like material behavior underground. A correlation between the RMR and USCS systems was developed by comparing underground geotechnical RMR mapping to laboratory testing of bulk samples from the same locations, thereby assigning a numeric RMR value to the USCS classification that can be used in spreadsheet calculations and geostatistical analyses. The geotechnical classification system presented in this paper, including a USCS-RMR correlation, RMR rating equations, and the Geo-Pick Strike Index, is collectively introduced as the Weak Rock Mass Rating System (W-RMR). It is the authors' hope that this system will aid in the classification of weak rock masses and provide more usable design tools based on the RMR system. More broadly, the RMR-USCS correlation and the W-RMR system help define the transition between engineering soil and rock mass classification systems and may provide insight for geotechnical design in very weak rock masses.
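Once the numeric assignments are fixed, a USCS-RMR correlation lends itself to a simple lookup. The sketch below is purely illustrative: the per-class RMR values are placeholders, not the correlation published in the paper, and serve only to show how such a table could feed spreadsheet-style calculations or geostatistical analyses.

```python
# Illustrative sketch only: the paper's actual USCS-to-RMR correlation is not
# reproduced here; the numbers below are placeholders.
HYPOTHETICAL_USCS_TO_RMR = {
    "GW": 25, "GP": 24, "SW": 22, "SP": 21,   # placeholder values
    "SM": 20, "SC": 19, "ML": 18, "CL": 17,
    "MH": 16, "CH": 15,
}

def rmr_from_uscs(uscs_class: str) -> int:
    """Map a USCS class of very weak, soil-like material to a nominal RMR value."""
    try:
        return HYPOTHETICAL_USCS_TO_RMR[uscs_class.upper()]
    except KeyError:
        raise ValueError(f"Unknown or unsupported USCS class: {uscs_class!r}")
```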
Fainsinger, Robin L; Nekolaichuk, Cheryl L
2008-06-01
The purpose of this paper is to provide an overview of the development of a "TNM" cancer pain classification system for advanced cancer patients, the Edmonton Classification System for Cancer Pain (ECS-CP). Until we have a common international language to discuss cancer pain, understanding differences in clinical and research experience in opioid rotation and use remains problematic. The complexity of the cancer pain experience presents unique challenges for the classification of pain. To date, no universally accepted pain classification measure can accurately predict the complexity of pain management, particularly for patients with cancer pain that is difficult to treat. In response to this gap in clinical assessment, the Edmonton Staging System (ESS), a classification system for cancer pain, was developed. Difficulties in definitions and interpretation of some aspects of the ESS restricted acceptance and widespread use. Construct, inter-rater reliability, and predictive validity evidence have contributed to the development of the ECS-CP. The five features of the ECS-CP (Pain Mechanism, Incident Pain, Psychological Distress, Addictive Behavior, and Cognitive Function) have demonstrated value in predicting pain management complexity. The development of a standardized classification system that is comprehensive, prognostic, and simple to use could provide a common language for clinical management and research of cancer pain. An international study to assess the inter-rater reliability and predictive value of the ECS-CP is currently in progress.
NASA Astrophysics Data System (ADS)
Li, Mengmeng; Bijker, Wietske; Stein, Alfred
2015-04-01
Two main challenges are faced when classifying urban land cover from very high resolution satellite images: obtaining an optimal image segmentation and distinguishing buildings from other man-made objects. For optimal segmentation, this work proposes a hierarchical representation of an image by means of a Binary Partition Tree (BPT) and an unsupervised evaluation of image segmentations by energy minimization. For building extraction, we apply fuzzy sets to create a fuzzy landscape of shadows, which in turn involves a two-step procedure. The first step is a preliminary image classification at a fine segmentation level to generate vegetation and shadow information. The second step models the directional relationship between building and shadow objects to extract building information at the optimal segmentation level. We conducted the experiments on two datasets of Pléiades images from Wuhan City, China. To demonstrate its performance, the proposed classification is compared at the optimal segmentation level with Maximum Likelihood Classification and Support Vector Machine classification. The results show that the proposed classification produced the highest overall accuracies and kappa coefficients, and the smallest over-classification and under-classification geometric errors. We conclude first that integrating BPT with energy minimization offers an effective means for image segmentation. Second, we conclude that the directional relationship between building and shadow objects, represented by a fuzzy landscape, is important for building extraction.
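A fuzzy landscape of shadows can be approximated, under simplifying assumptions, as a directional fuzzy dilation of the shadow mask: membership decays with distance when moving from a shadow toward the object that casts it. The function below is an assumed sketch, not the authors' code, and ignores the border wrap-around introduced by np.roll for brevity.

```python
# Simplified fuzzy-landscape sketch (assumption: unit-step directional
# dilation with linear decay; border wrap-around from np.roll is ignored).
import numpy as np

def fuzzy_landscape(shadow_mask: np.ndarray, direction=(0, 1),
                    max_dist: int = 30) -> np.ndarray:
    """shadow_mask: binary array (1 = shadow); direction: unit step
    (d_row, d_col) pointing from the shadow toward the casting object."""
    dy, dx = direction
    landscape = shadow_mask.astype(float)
    shifted = shadow_mask.astype(float)
    for d in range(1, max_dist + 1):
        shifted = np.roll(shifted, shift=(dy, dx), axis=(0, 1))
        # Membership decreases linearly with distance from the shadow.
        landscape = np.maximum(landscape, shifted * (1.0 - d / max_dist))
    return landscape
```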
Multiclass classification of microarray data samples with a reduced number of genes
2011-01-01
Background: Multiclass classification of microarray data samples with a reduced number of genes is a rich and challenging problem in Bioinformatics research. The problem gets harder as the number of classes is increased. In addition, the performance of most classifiers is tightly linked to the effectiveness of mandatory gene selection methods. Critical to gene selection is the availability of estimates about the maximum number of genes that can be handled by any classification algorithm. Lack of such estimates may lead to either computationally demanding explorations of a search space with thousands of dimensions or classification models based on gene sets of unrestricted size. In the former case, unbiased but possibly overfitted classification models may arise. In the latter case, biased classification models unable to support statistically significant findings may be obtained. Results: A novel bound on the maximum number of genes that can be handled by binary classifiers in binary mediated multiclass classification algorithms of microarray data samples is presented. The bound suggests that high-dimensional binary output domains might favor the existence of accurate and sparse binary mediated multiclass classifiers for microarray data samples. Conclusions: A comprehensive experimental work shows that the bound is indeed useful to induce accurate and sparse multiclass classifiers for microarray data samples. PMID:21342522
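The bound itself is not reproduced here, but the setting it addresses can be sketched: a binary-mediated (one-vs-rest) multiclass scheme in which each binary classifier is restricted to a capped number of genes. In the scikit-learn sketch below the cap is a placeholder constant, not the paper's bound.

```python
# Hedged sketch: one-vs-rest multiclass classification where each binary
# sub-problem selects its own gene subset of bounded size (cap is a placeholder).
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import Pipeline
from sklearn.svm import LinearSVC

MAX_GENES_PER_BINARY_CLASSIFIER = 50  # placeholder cap, not the paper's bound

binary_learner = Pipeline([
    ("select", SelectKBest(score_func=f_classif,
                           k=MAX_GENES_PER_BINARY_CLASSIFIER)),
    ("clf", LinearSVC(C=1.0, max_iter=10000)),
])
# OneVsRestClassifier clones the pipeline per class, so gene selection is
# performed independently for each binary problem.
multiclass_model = OneVsRestClassifier(binary_learner)
# multiclass_model.fit(X_train, y_train); multiclass_model.predict(X_test)
```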
Using machine learning techniques to automate sky survey catalog generation
NASA Technical Reports Server (NTRS)
Fayyad, Usama M.; Roden, J. C.; Doyle, R. J.; Weir, Nicholas; Djorgovski, S. G.
1993-01-01
We describe the application of machine classification techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Palomar Observatory Sky Survey provides comprehensive photographic coverage of the northern celestial hemisphere. The photographic plates are being digitized into images containing on the order of 10^7 galaxies and 10^8 stars. Since the size of this data set precludes manual analysis and classification of objects, our approach is to develop a software system which integrates independently developed techniques for image processing and data classification. Image processing routines are applied to identify and measure features of sky objects. Selected features are used to determine the classification of each object. GID3* and O-BTree, two inductive learning techniques, are used to automatically learn classification decision trees from examples. We describe the techniques used, the details of our specific application, and the initial encouraging results which indicate that our approach is well-suited to the problem. The benefits of the approach are increased data reduction throughput, consistency of classification, and the automated derivation of classification rules that will form an objective, examinable basis for classifying sky objects. Furthermore, astronomers will be freed from the tedium of an intensely visual task to pursue more challenging analysis and interpretation problems given automatically cataloged data.
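As a rough illustration of the inductive-learning step, the sketch below uses scikit-learn's DecisionTreeClassifier as a stand-in for the GID3* and O-BTree learners; the feature names and data handling are assumptions made for the example, not details of the survey pipeline.

```python
# Sketch with DecisionTreeClassifier standing in for GID3*/O-BTree:
# image-derived feature vectors are used to learn a human-readable
# star/galaxy decision tree.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

FEATURE_NAMES = ["magnitude", "area", "ellipticity", "fwhm"]  # hypothetical

def train_object_classifier(features: np.ndarray, labels: np.ndarray):
    """features: (n_objects, len(FEATURE_NAMES)); labels: 'star' or 'galaxy'."""
    tree = DecisionTreeClassifier(criterion="entropy", max_depth=6)
    tree.fit(features, labels)
    # The learned rules are examinable, supporting an objective basis
    # for classifying sky objects.
    print(export_text(tree, feature_names=FEATURE_NAMES))
    return tree
```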
Framework for evaluating disease severity measures in older adults with comorbidity.
Boyd, Cynthia M; Weiss, Carlos O; Halter, Jeff; Han, K Carol; Ershler, William B; Fried, Linda P
2007-03-01
Accounting for the influence of concurrent conditions on health and functional status for both research and clinical decision-making purposes is especially important in older adults. Although approaches to classifying severity of individual diseases and conditions have been developed, the utility of these classification systems has not been evaluated in the presence of multiple conditions. We present a framework for evaluating severity classification systems for common chronic diseases. The framework evaluates: (a) the goal or purpose of the classification system; (b) the physiological and/or functional criteria for severity graduation; and (c) the potential reliability and validity of the system, balanced against the burden and costs associated with classification. Approaches to severity classification of individual diseases were not originally conceived for the study of comorbidity. Therefore, they vary greatly in terms of objectives, physiological systems covered, level of severity characterization, reliability and validity, and costs and burdens. Using different severity classification systems to account for differing levels of disease severity in a patient with multiple diseases, or assessing global disease burden, may be challenging. Most approaches to severity classification are not adequate to address comorbidity. Nevertheless, thoughtful use of some existing approaches and refinement of others may advance the study of comorbidity and diagnostic and therapeutic approaches to patients with multimorbidity.
Ensemble based on static classifier selection for automated diagnosis of Mild Cognitive Impairment.
Nanni, Loris; Lumini, Alessandra; Zaffonato, Nicolò
2018-05-15
Alzheimer's disease (AD) is the most common cause of neurodegenerative dementia in the elderly population. Scientific research is very active in the challenge of designing automated approaches to achieve an early and certain diagnosis. Recently an international competition among AD predictors has been organized: "A Machine learning neuroimaging challenge for automated diagnosis of Mild Cognitive Impairment" (MLNeCh). This competition is based on pre-processed sets of T1-weighted Magnetic Resonance Images (MRI) to be classified into four categories: stable AD, individuals with MCI who converted to AD, individuals with MCI who did not convert to AD, and healthy controls. In this work, we propose a method to perform early diagnosis of AD, which is evaluated on the MLNeCh dataset. Since the automatic classification of AD is based on feature vectors of high dimensionality, different techniques of feature selection/reduction are compared in order to avoid the curse-of-dimensionality problem; the classification method is then obtained as the combination of Support Vector Machines trained on different clusters of data extracted from the whole training set. The multi-classifier approach proposed in this work outperforms all the stand-alone methods tested in our experiments. The final ensemble is based on a set of classifiers, each trained on a different cluster of the training data. The proposed ensemble has the great advantage of performing well using a very reduced version of the data (the reduction factor is more than 90%). The MATLAB code for the ensemble of classifiers will be made publicly available to other researchers for future comparisons.
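The general recipe (dimensionality reduction, clustering of the training set, one SVM per cluster, combination of the members' outputs) can be sketched as follows. The concrete choices here (PCA, k-means, and probability averaging) are assumptions for illustration and are not claimed to match the paper's MATLAB implementation; the sketch also assumes every cluster contains samples of every class.

```python
# Hedged sketch: cluster the (reduced) training data and train one SVM per
# cluster, then average class probabilities at prediction time.
# Assumption: every cluster contains samples of every class, so all members
# share the same classes_ ordering.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.svm import SVC

class ClusteredSVMEnsemble:
    def __init__(self, n_clusters=5, n_components=50):
        self.reducer = PCA(n_components=n_components)
        self.clusterer = KMeans(n_clusters=n_clusters, n_init=10)
        self.members = []

    def fit(self, X, y):
        Z = self.reducer.fit_transform(X)        # feature reduction
        groups = self.clusterer.fit_predict(Z)   # partition the training set
        for g in np.unique(groups):
            svm = SVC(kernel="rbf", probability=True)
            svm.fit(Z[groups == g], y[groups == g])
            self.members.append(svm)
        return self

    def predict(self, X):
        Z = self.reducer.transform(X)
        # Average the members' class probabilities and take the argmax.
        probs = np.mean([m.predict_proba(Z) for m in self.members], axis=0)
        return self.members[0].classes_[np.argmax(probs, axis=1)]
```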
High-speed atomic force microscopy and peak force tapping control
NASA Astrophysics Data System (ADS)
Hu, Shuiqing; Mininni, Lars; Hu, Yan; Erina, Natalia; Kindt, Johannes; Su, Chanmin
2012-03-01
The ITRS Roadmap requires defect size measurement below 10 nanometers and challenging classifications for both blank and patterned wafers and masks. The atomic force microscope (AFM) is capable of providing metrology measurements in 3D at sub-nanometer accuracy, but it has long suffered from limited throughput and from slow topography imaging that carries no chemical information. This presentation focuses on two disruptive technology developments, namely high-speed AFM and quantitative nanomechanical mapping, which enable high-throughput measurement with the capability of identifying components through concurrent physical property imaging. The high-speed AFM technology has allowed the imaging speed to increase by 10-100 times without loss of data quality. Such improvement enables the speed of defect review on a wafer to increase from a few defects per hour to nearly 100 defects an hour, approaching the requirements of the ITRS Roadmap. Another technology development, Peak Force Tapping, substantially simplified the closed-loop system response, leading to self-optimization on the most challenging sample groups and generating expert-quality data. More importantly, AFM also simultaneously provides a series of mechanical property maps with nanometer spatial resolution during defect review. These nanomechanical maps (including elastic modulus, hardness, and surface adhesion) provide complementary information for elemental analysis, differentiate defect materials by their physical properties, and assist defect classification beyond topographic measurements. This paper will explain the key enabling technologies, namely high-speed tip-scanning AFM using an innovative flexure design and control algorithm. Another critical element is AFM control using Peak Force Tapping, in which the instantaneous tip-sample interaction force is measured and used to derive a full suite of physical properties at each imaging pixel. We will provide examples of defect review data on different wafers and media disks. A similar AFM-based defect review capability was also applied to EUV masks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhatt, Arpit H; Zhang, Yi Min
A biorefinery, considered a chemical process plant under the Clean Air Act permitting program, could be classified as a major or minor source based on the size of the facility and the magnitude of regulated pollutants emitted. Our previous analysis indicates that a biorefinery using a fast pyrolysis conversion process to produce finished gasoline and diesel blendstocks, with a capacity of processing 2,000 dry metric tons of biomass per day, would likely be classified as a major source because several regulated pollutants (such as particulate matter, sulfur dioxide, and nitrogen oxides) are estimated to exceed the 100 tons per year (tpy) major source threshold applicable to chemical process plants. Being subject to a major source classification could pose additional challenges associated with obtaining an air permit in a timely manner before the biorefinery can start its construction. Recent developments propose an alternative approach to utilizing bio-oil produced via the fast pyrolysis conversion process by shipping it to an existing petroleum refinery, where the raw bio-oil can be blended with petroleum-based feedstocks (e.g., vacuum gas oil) to produce gasoline and diesel blendstocks with renewable content. Without having to hydrotreat raw bio-oil, a biorefinery is likely to reduce its potential-to-emit (PTE) to below the 100 tpy major source threshold, and therefore expedite its permitting process. We compare the PTE estimates for the two biorefinery designs, with and without hydrotreating of bio-oils, examine the implications for potential air permit classification, and discuss the best available control technology requirements for the major source biorefinery utilizing a hydrotreating operation. Our analysis is expected to provide useful information to new biofuel project developers to identify opportunities to overcome challenges associated with air permitting.
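The major/minor determination reduces to comparing each regulated pollutant's potential-to-emit against the 100 tpy threshold. The snippet below is illustrative only; the emission figures are placeholders, not the study's PTE estimates.

```python
# Illustrative only: hypothetical potential-to-emit (PTE) values in tons per
# year compared against the 100 tpy major-source threshold; the numbers are
# placeholders, not the study's estimates.
MAJOR_SOURCE_THRESHOLD_TPY = 100.0

def classify_source(pte_tpy: dict) -> str:
    """Return 'major' if any regulated pollutant's PTE exceeds the threshold."""
    exceeds = any(v > MAJOR_SOURCE_THRESHOLD_TPY for v in pte_tpy.values())
    return "major" if exceeds else "minor"

# Hypothetical designs: with on-site hydrotreating vs. shipping raw bio-oil
# to an existing refinery (placeholder emission figures).
with_hydrotreating = {"PM": 120.0, "SO2": 150.0, "NOx": 110.0}
without_hydrotreating = {"PM": 60.0, "SO2": 40.0, "NOx": 70.0}
print(classify_source(with_hydrotreating))     # -> major
print(classify_source(without_hydrotreating))  # -> minor
```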
Automatic classification of protein structures using physicochemical parameters.
Mohan, Abhilash; Rao, M Divya; Sunderrajan, Shruthi; Pennathur, Gautam
2014-09-01
Protein classification is the first step to functional annotation; the SCOP and Pfam databases are currently the most relevant protein classification schemes. However, the disproportion between the number of three-dimensional (3D) protein structures generated and their classification into relevant superfamilies/families emphasizes the need for automated classification schemes. Predicting the function of novel proteins based on sequence information alone has proven to be a major challenge. The present study focuses on the use of physicochemical parameters in conjunction with machine learning algorithms (Naive Bayes, Decision Trees, Random Forest, and Support Vector Machines) to classify proteins into their respective SCOP superfamily/Pfam family, using sequence-derived information. Spectrophores™, a 1D descriptor of the 3D molecular field surrounding a structure, was used as a benchmark to compare the performance of the physicochemical parameters. The machine learning algorithms were modified to select features based on information gain for each SCOP superfamily/Pfam family. The effect of combining physicochemical parameters and spectrophores on classification accuracy (CA) was studied. Machine learning algorithms trained with the physicochemical parameters consistently classified SCOP superfamilies and Pfam families with a classification accuracy above 90%, while spectrophores performed with a CA of around 85%. Feature selection improved classification accuracy for both the physicochemical-parameter and spectrophore-based machine learning algorithms. Combining both attributes resulted in a marginal loss of performance. Physicochemical parameters were able to classify proteins from both schemes with classification accuracy ranging from 90% to 96%. These results suggest the usefulness of this method in classifying proteins from amino acid sequences.
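A minimal sketch of the described workflow, under stated assumptions, is shown below: mutual information serves as a proxy for information gain, and the physicochemical descriptor matrix X is assumed to be computed elsewhere.

```python
# Sketch only: per-family feature selection by mutual information (a proxy
# for information gain) followed by a Random Forest classifier.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

def build_superfamily_classifier(n_features: int = 30) -> Pipeline:
    """X: physicochemical descriptors per protein; y: SCOP superfamily
    (or Pfam family) labels. n_features is a placeholder selection size."""
    return Pipeline([
        ("select", SelectKBest(score_func=mutual_info_classif, k=n_features)),
        ("clf", RandomForestClassifier(n_estimators=500, random_state=0)),
    ])

# Hypothetical usage:
# scores = cross_val_score(build_superfamily_classifier(), X, y, cv=5)
# print("mean classification accuracy:", scores.mean())
```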