Sample records for classification procedure based

  1. Effects of Estimation Bias on Multiple-Category Classification with an IRT-Based Adaptive Classification Procedure

    ERIC Educational Resources Information Center

    Yang, Xiangdong; Poggio, John C.; Glasnapp, Douglas R.

    2006-01-01

    The effects of five ability estimators, that is, maximum likelihood estimator, weighted likelihood estimator, maximum a posteriori, expected a posteriori, and Owen's sequential estimator, on the performances of the item response theory-based adaptive classification procedure on multiple categories were studied via simulations. The following…

  2. Classifying breast cancer surgery: a novel, complexity-based system for oncological, oncoplastic and reconstructive procedures, and proof of principle by analysis of 1225 operations in 1166 patients.

    PubMed

    Hoffmann, Jürgen; Wallwiener, Diethelm

    2009-04-08

    One of the basic prerequisites for generating evidence-based data is the availability of classification systems. Attempts to date to classify breast cancer operations have focussed on specific problems, e.g. the avoidance of secondary corrective surgery for surgical defects, rather than taking a generic approach. Starting from an existing, simpler empirical scheme based on the complexity of breast surgical procedures, which was used in-house primarily in operative report-writing, a novel classification of ablative and breast-conserving procedures initially needed to be developed and elaborated systematically. To obtain proof of principle, a prospectively planned analysis of patient records for all major breast cancer-related operations performed at our breast centre in 2005 and 2006 was conducted using the new classification. Data were analysed using basic descriptive statistics such as frequency tables. A novel two-type, six-tier classification system comprising 12 main categories, 13 subcategories and 39 sub-subcategories of oncological, oncoplastic and reconstructive breast cancer-related surgery was successfully developed. Our system permitted unequivocal classification, without exception, of all 1225 procedures performed in 1166 breast cancer patients in 2005 and 2006. Breast cancer-related surgical procedures can be generically classified according to their surgical complexity. Analysis of all major procedures performed at our breast centre during the study period provides proof of principle for this novel classification system. We envisage various applications for this classification, including uses in randomised clinical trials, guideline development, specialist surgical training, continuing professional development as well as quality of care and public health research.

  3. On evaluating clustering procedures for use in classification

    NASA Technical Reports Server (NTRS)

    Pore, M. D.; Moritz, T. E.; Register, D. T.; Yao, S. S.; Eppler, W. G. (Principal Investigator)

    1979-01-01

    The problem of evaluating clustering algorithms and their respective computer programs for use in a preprocessing step for classification is addressed. In clustering for classification the probability of correct classification is suggested as the ultimate measure of accuracy on training data. A means of implementing this criterion and a measure of cluster purity are discussed. Examples are given. A procedure for cluster labeling that is based on cluster purity and sample size is presented.
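
    The cluster-labeling step described here, in which each cluster is assigned the class that dominates its labeled training samples provided the cluster is pure and large enough, can be sketched as follows. This is an illustrative reconstruction rather than the authors' implementation; the function name, the purity threshold and the minimum cluster size are assumptions.

```python
import numpy as np
from collections import Counter

def label_clusters(cluster_ids, class_labels, min_purity=0.6, min_size=10):
    """Assign a class label to each cluster using labeled training samples.

    cluster_ids  : cluster assignment of each training sample
    class_labels : known class of each training sample
    Returns {cluster: (label or None, purity, size)}; clusters that are too
    small or too mixed stay unlabeled (None).
    """
    cluster_ids = np.asarray(cluster_ids)
    class_labels = np.asarray(class_labels)
    result = {}
    for c in np.unique(cluster_ids):
        members = class_labels[cluster_ids == c]
        label, n_major = Counter(members.tolist()).most_common(1)[0]
        purity = n_major / len(members)
        keep = len(members) >= min_size and purity >= min_purity
        result[c] = (label if keep else None, purity, len(members))
    return result
```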

  4. Applications of remote sensing, volume 3

    NASA Technical Reports Server (NTRS)

    Landgrebe, D. A. (Principal Investigator)

    1977-01-01

    The author has identified the following significant results. Of the four change detection techniques (post-classification comparison, delta data, spectral/temporal, and layered spectral/temporal), the post-classification comparison was selected for further development. This selection was based upon the test performances of the four change detection methods, the straightforwardness of the procedures, and the output products desired. A standardized, modified supervised classification procedure for analyzing the Texas coastal zone data was compiled. This procedure was developed so that all quadrangles in the study area would be classified using similar analysis techniques, allowing meaningful comparisons and evaluations of the classifications.

  5. Item Selection for the Development of Parallel Forms from an IRT-Based Seed Test Using a Sampling and Classification Approach

    ERIC Educational Resources Information Center

    Chen, Pei-Hua; Chang, Hua-Hua; Wu, Haiyan

    2012-01-01

    Two sampling-and-classification-based procedures were developed for automated test assembly: the Cell Only and the Cell and Cube methods. A simulation study based on a 540-item bank was conducted to compare the performance of the procedures with the performance of a mixed-integer programming (MIP) method for assembling multiple parallel test…

  6. [Land cover classification of Four Lakes Region in Hubei Province based on MODIS and ENVISAT data].

    PubMed

    Xue, Lian; Jin, Wei-Bin; Xiong, Qin-Xue; Liu, Zhang-Yong

    2010-03-01

    Based on the differences of backscattering coefficient in ENVISAT ASAR data, a classification was made of the towns, waters, and vegetation-covered areas in the Four Lakes Region of Hubei Province. According to the local cropping systems and phenological characteristics in the region, and by using the discrepancies of the MODIS-NDVI index from late April to early May, the vegetation-covered areas were classified into croplands and non-croplands. The classification results based on the above-mentioned procedure were verified against the classification results based on ETM data with high spatial resolution. Based on the DEM data, the non-croplands were categorized into forest land and bottomland; and based on the discrepancies of mean monthly NDVI, the crops were identified as mid rice, late rice, and cotton, and the croplands were identified as paddy field and upland field. The land cover classification based on the MODIS data with low spatial resolution was basically consistent with that based on the ETM data with high spatial resolution, and the total error rate was about 13.15% when the classification results based on ETM data were taken as the standard. The above-mentioned procedures could therefore enable fast tracking of large-scale regional land cover classification and mapping.

  7. Inter-comparison of weather and circulation type classifications for hydrological drought development

    NASA Astrophysics Data System (ADS)

    Fleig, Anne K.; Tallaksen, Lena M.; Hisdal, Hege; Stahl, Kerstin; Hannah, David M.

    Classifications of weather and circulation patterns are often applied in research seeking to relate atmospheric state to surface environmental phenomena. However, numerous procedures have been applied to define the patterns, thus limiting comparability between studies. The COST733 Action “Harmonisation and Applications of Weather Type Classifications for European regions” tests 73 different weather type classifications (WTC) and their associated weather types (WTs) and compares the WTCs’ utility for various applications. The objective of this study is to evaluate the potential of these WTCs for analysis of regional hydrological drought development in north-western Europe. Hydrological drought is defined in terms of a Regional Drought Area Index (RDAI), which is based on deficits derived from daily river flow series. RDAI series (1964-2001) were calculated for four homogeneous regions in Great Britain and two in Denmark. For each region, WTs associated with hydrological drought development were identified based on antecedent and concurrent WT-frequencies for major drought events. The utility of the different WTCs for the study of hydrological drought development was evaluated, and the influence of WTC attributes, i.e. input variables, number of defined WTs and general classification concept, on WTC performance was assessed. The objective Grosswetterlagen (OGWL), the objective Second-Generation Lamb Weather Type Classification (LWT2) with 18 WTs and two implementations of the objective Wetterlagenklassifikation (WLK; with 40 and 28 WTs) outperformed all other WTCs. In general, WTCs with more WTs (⩾27) were found to perform better than WTCs with fewer (⩽18) WTs. The influence of input variables was not consistent across the different classification procedures, and the performance of a WTC was determined primarily by the classification procedure itself. Overall, classification procedures following the relatively simple general classification concept of predefining WTs based on thresholds performed better than those based on more sophisticated classification concepts such as deriving WTs by cluster analysis or artificial neural networks. In particular, PCA-based WTCs with 9 WTs and automated WTCs with a high number of predefined WTs (subjectively and threshold based) performed well. It is suggested that the explicit consideration of the air flow characteristics of meridionality, zonality and cyclonicity in the definition of WTs is a useful feature for a WTC when analysing regional hydrological drought development.

  8. Semi-automatic classification of glaciovolcanic landforms: An object-based mapping approach based on geomorphometry

    NASA Astrophysics Data System (ADS)

    Pedersen, G. B. M.

    2016-02-01

    A new object-oriented approach is developed to classify glaciovolcanic landforms (Procedure A) and their landform elements boundaries (Procedure B). It utilizes the principle that glaciovolcanic edifices are geomorphometrically distinct from lava shields and plains (Pedersen and Grosse, 2014), and the approach is tested on data from Reykjanes Peninsula, Iceland. The outlined procedures utilize slope and profile curvature attribute maps (20 m/pixel) and the classified results are evaluated quantitatively through error matrix maps (Procedure A) and visual inspection (Procedure B). In procedure A, the highest obtained accuracy is 94.1%, but even simple mapping procedures provide good results (> 90% accuracy). Successful classification of glaciovolcanic landform element boundaries (Procedure B) is also achieved and this technique has the potential to delineate the transition from intraglacial to subaerial volcanic activity in orthographic view. This object-oriented approach based on geomorphometry overcomes issues with vegetation cover, which has been typically problematic for classification schemes utilizing spectral data. Furthermore, it handles complex edifice outlines well and is easily incorporated into a GIS environment, where results can be edited or fused with other mapping results. The approach outlined here is designed to map glaciovolcanic edifices within the Icelandic neovolcanic zone but may also be applied to similar subaerial or submarine volcanic settings, where steep volcanic edifices are surrounded by flat plains.
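
    The quantitative evaluation through error matrices mentioned above reduces to a confusion matrix between reference and classified pixels, with overall accuracy (e.g. the 94.1% figure) given by the diagonal share. A generic sketch, not tied to the paper's data or software:

```python
import numpy as np

def error_matrix(reference, classified, classes):
    """Confusion (error) matrix: rows are reference classes, columns mapped classes."""
    idx = {c: i for i, c in enumerate(classes)}
    m = np.zeros((len(classes), len(classes)), dtype=int)
    for r, c in zip(reference, classified):
        m[idx[r], idx[c]] += 1
    return m

def overall_accuracy(m):
    """Share of correctly classified pixels (diagonal of the error matrix)."""
    return np.trace(m) / m.sum()
```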

  9. Land Cover Classification in a Complex Urban-Rural Landscape with Quickbird Imagery

    PubMed Central

    Moran, Emilio Federico.

    2010-01-01

    High spatial resolution images have been increasingly used for urban land use/cover classification, but the high spectral variation within the same land cover, the spectral confusion among different land covers, and the shadow problem often lead to poor classification performance based on the traditional per-pixel spectral-based classification methods. This paper explores approaches to improve urban land cover classification with Quickbird imagery. Traditional per-pixel spectral-based supervised classification, incorporation of textural images and multispectral images, spectral-spatial classifier, and segmentation-based classification are examined in a relatively new developing urban landscape, Lucas do Rio Verde in Mato Grosso State, Brazil. This research shows that use of spatial information during the image classification procedure, either through the integrated use of textural and spectral images or through the use of segmentation-based classification method, can significantly improve land cover classification performance. PMID:21643433

  10. An unsupervised classification approach for analysis of Landsat data to monitor land reclamation in Belmont county, Ohio

    NASA Technical Reports Server (NTRS)

    Brumfield, J. O.; Bloemer, H. H. L.; Campbell, W. J.

    1981-01-01

    Two unsupervised classification procedures for analyzing Landsat data used to monitor land reclamation in a surface mining area in east central Ohio are compared for agreement with data collected from the corresponding locations on the ground. One procedure is based on a traditional unsupervised-clustering/maximum-likelihood algorithm sequence that assumes spectral groupings in the Landsat data in n-dimensional space; the other is based on a nontraditional unsupervised-clustering/canonical-transformation/clustering algorithm sequence that not only assumes spectral groupings in n-dimensional space but also includes an additional feature-extraction technique. It is found that the nontraditional procedure provides an appreciable improvement in spectral groupings and apparently increases the level of accuracy in the classification of land cover categories.
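
    As a rough sketch of the traditional sequence described here, unsupervised clustering followed by a Gaussian maximum-likelihood classifier trained on the resulting spectral groupings, one might write the following; the cluster count, the use of scikit-learn, and QDA as the maximum-likelihood step are assumptions, not details from the report.

```python
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

def cluster_then_ml(pixels, n_clusters=20):
    """pixels: (n_samples, n_bands) array of multispectral values.

    Step 1: unsupervised clustering finds spectral groupings in n-space.
    Step 2: a per-class Gaussian maximum-likelihood classifier (QDA) is fit
            to those groupings and used to label every pixel.
    """
    clusters = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(pixels)
    ml = QuadraticDiscriminantAnalysis().fit(pixels, clusters)
    return ml.predict(pixels)
```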

  11. 40 CFR 152.164 - Classification procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 25 2013-07-01 2013-07-01 false Classification procedures. 152.164... PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.164 Classification procedures. (a) Grouping of products for classification purposes. In its discretion, the Agency may identify...

  12. 40 CFR 152.164 - Classification procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Classification procedures. 152.164... PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.164 Classification procedures. (a) Grouping of products for classification purposes. In its discretion, the Agency may identify...

  13. 40 CFR 152.164 - Classification procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Classification procedures. 152.164... PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.164 Classification procedures. (a) Grouping of products for classification purposes. In its discretion, the Agency may identify...

  14. 40 CFR 152.164 - Classification procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Classification procedures. 152.164... PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.164 Classification procedures. (a) Grouping of products for classification purposes. In its discretion, the Agency may identify...

  15. 40 CFR 152.164 - Classification procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Classification procedures. 152.164... PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.164 Classification procedures. (a) Grouping of products for classification purposes. In its discretion, the Agency may identify...

  16. Practical Procedures for Constructing Mastery Tests to Minimize Errors of Classification and to Maximize or Optimize Decision Reliability.

    ERIC Educational Resources Information Center

    Byars, Alvin Gregg

    The objectives of this investigation are to develop, describe, assess, and demonstrate procedures for constructing mastery tests to minimize errors of classification and to maximize decision reliability. The guidelines are based on conditions where item exchangeability is a reasonable assumption and the test constructor can control the number of…

  17. 9 CFR 147.6 - Procedure for determining the status of flocks reacting to tests for Mycoplasma gallisepticum...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Agency may examine reactors by the in vivo bio-assay, PCR-based procedures, and/or culture procedures before final determination of the flock status is made. (13) If the in vivo bio-assay, PCR-based... classification for which it was tested. (14) If the in vivo bio-assay, PCR-based procedures, or culture...

  18. Quantitative evaluation of variations in rule-based classifications of land cover in urban neighbourhoods using WorldView-2 imagery.

    PubMed

    Belgiu, Mariana; Drǎguţ, Lucian; Strobl, Josef

    2014-01-01

    The increasing availability of high resolution imagery has triggered the need for automated image analysis techniques, with reduced human intervention and reproducible analysis procedures. The knowledge gained in the past might be of use to achieving this goal, if systematically organized into libraries which would guide the image analysis procedure. In this study we aimed at evaluating the variability of digital classifications carried out by three experts who were all assigned the same interpretation task. Besides the three classifications performed by independent operators, we developed an additional rule-based classification that relied on the image classifications best practices found in the literature, and used it as a surrogate for libraries of object characteristics. The results showed statistically significant differences among all operators who classified the same reference imagery. The classifications carried out by the experts achieved satisfactory results when transferred to another area for extracting the same classes of interest, without modification of the developed rules.

  19. Quantitative evaluation of variations in rule-based classifications of land cover in urban neighbourhoods using WorldView-2 imagery

    PubMed Central

    Belgiu, Mariana; Drǎguţ, Lucian; Strobl, Josef

    2014-01-01

    The increasing availability of high resolution imagery has triggered the need for automated image analysis techniques, with reduced human intervention and reproducible analysis procedures. The knowledge gained in the past might be of use to achieving this goal, if systematically organized into libraries which would guide the image analysis procedure. In this study we aimed at evaluating the variability of digital classifications carried out by three experts who were all assigned the same interpretation task. Besides the three classifications performed by independent operators, we developed an additional rule-based classification that relied on the image classifications best practices found in the literature, and used it as a surrogate for libraries of object characteristics. The results showed statistically significant differences among all operators who classified the same reference imagery. The classifications carried out by the experts achieved satisfactory results when transferred to another area for extracting the same classes of interest, without modification of the developed rules. PMID:24623959

  20. Quantitative evaluation of variations in rule-based classifications of land cover in urban neighbourhoods using WorldView-2 imagery

    NASA Astrophysics Data System (ADS)

    Belgiu, Mariana; Drǎguţ, Lucian; Strobl, Josef

    2014-01-01

    The increasing availability of high resolution imagery has triggered the need for automated image analysis techniques, with reduced human intervention and reproducible analysis procedures. The knowledge gained in the past might be of use to achieving this goal, if systematically organized into libraries which would guide the image analysis procedure. In this study we aimed at evaluating the variability of digital classifications carried out by three experts who were all assigned the same interpretation task. Besides the three classifications performed by independent operators, we developed an additional rule-based classification that relied on the image classifications best practices found in the literature, and used it as a surrogate for libraries of object characteristics. The results showed statistically significant differences among all operators who classified the same reference imagery. The classifications carried out by the experts achieved satisfactory results when transferred to another area for extracting the same classes of interest, without modification of the developed rules.

  1. Small scale photo probability sampling and vegetation classification in southeast Arizona as an ecological base for resource inventory. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Johnson, J. R. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. The broad scale vegetation classification was developed for a 3,200 sq mile area in southeastern Arizona. The 31 vegetation types were derived from association tables which contained information taken at about 500 ground sites. The classification provided an information base that was suitable for use with small scale photography. A procedure was developed and tested for objectively comparing photo images. The procedure consisted of two parts, image groupability testing and image complexity testing. The Apollo and ERTS photos were compared for relative suitability as first stage stratification bases in two stage proportional probability sampling. High altitude photography was used in common at the second stage.

  2. California desert resource inventory using multispectral classification of digitally mosaicked Landsat frames

    NASA Technical Reports Server (NTRS)

    Bryant, N. A.; Mcleod, R. G.; Zobrist, A. L.; Johnson, H. B.

    1979-01-01

    Procedures for adjustment of brightness values between frames and the digital mosaicking of Landsat frames to standard map projections are developed for providing a continuous data base for multispectral thematic classification. A combination of local terrain variations in the Californian deserts and a global sampling strategy based on transects provided the framework for accurate classification throughout the entire geographic region.

  3. Application of GIS-based Procedure on Slopeland Use Classification and Identification

    NASA Astrophysics Data System (ADS)

    KU, L. C.; LI, M. C.

    2016-12-01

    In Taiwan, the "Slopeland Conservation and Utilization Act" regulates the management of the slopelands. It categorizes slopeland into land suitable for agriculture or animal husbandry, land suitable for forestry, and land requiring enhanced conservation, according to the environmental factors of average slope, effective soil depth, soil erosion and parent rock. Traditionally, investigation of these environmental factors requires extensive field work and has been confronted with many practical issues, such as non-evaluated cadastral parcels, evaluation results that depend on expert opinion, difficulties in field measurement and judgment, and long completion times. This study aimed to develop a GIS-based procedure to accelerate slopeland use classification and improve its quality. First, the environmental factors of the slopelands were analyzed with GIS and SPSS software; the analysis used the digital elevation model (DEM), soil depth map, land use map and satellite images. Second, 5% of the analyzed slopelands were selected for site investigations to correct the classification results. Finally, a second examination, on a further randomly selected 2% of the analyzed slopelands, was performed to evaluate accuracy. The results showed that the developed procedure is effective for slopeland use classification and identification. Keywords: Slopeland Use Classification, GIS, Management

  4. Differences in forest area classification based on tree tally from variable- and fixed-radius plots

    Treesearch

    David Azuma; Vicente J. Monleon

    2011-01-01

    In forest inventory, it is not enough to formulate a definition; it is also necessary to define the "measurement procedure." In the classification of forestland by dominant cover type, the measurement design (the plot) can affect the outcome of the classification. We present results of a simulation study comparing classification of the dominant cover type...

  5. 7 CFR 27.34 - Classification procedure.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 2 2014-01-01 2014-01-01 false Classification procedure. 27.34 Section 27.34... REGULATIONS COTTON CLASSIFICATION UNDER COTTON FUTURES LEGISLATION Regulations Classification and Micronaire Determinations § 27.34 Classification procedure. Classification shall proceed as rapidly as possible, but not...

  6. 7 CFR 27.34 - Classification procedure.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 2 2013-01-01 2013-01-01 false Classification procedure. 27.34 Section 27.34... REGULATIONS COTTON CLASSIFICATION UNDER COTTON FUTURES LEGISLATION Regulations Classification and Micronaire Determinations § 27.34 Classification procedure. Classification shall proceed as rapidly as possible, but not...

  7. 7 CFR 27.34 - Classification procedure.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Classification procedure. 27.34 Section 27.34... REGULATIONS COTTON CLASSIFICATION UNDER COTTON FUTURES LEGISLATION Regulations Classification and Micronaire Determinations § 27.34 Classification procedure. Classification shall proceed as rapidly as possible, but not...

  8. 7 CFR 27.34 - Classification procedure.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Classification procedure. 27.34 Section 27.34... REGULATIONS COTTON CLASSIFICATION UNDER COTTON FUTURES LEGISLATION Regulations Classification and Micronaire Determinations § 27.34 Classification procedure. Classification shall proceed as rapidly as possible, but not...

  9. Brief surgical procedure code lists for outcomes measurement and quality improvement in resource-limited settings.

    PubMed

    Liu, Charles; Kayima, Peter; Riesel, Johanna; Situma, Martin; Chang, David; Firth, Paul

    2017-11-01

    The lack of a classification system for surgical procedures in resource-limited settings hinders outcomes measurement and reporting. Existing procedure coding systems are prohibitively large and expensive to implement. We describe the creation and prospective validation of 3 brief procedure code lists applicable in low-resource settings, based on analysis of surgical procedures performed at Mbarara Regional Referral Hospital, Uganda's second largest public hospital. We reviewed operating room logbooks to identify all surgical operations performed at Mbarara Regional Referral Hospital during 2014. Based on the documented indication for surgery and procedure(s) performed, we assigned each operation up to 4 procedure codes from the International Classification of Diseases, 9th Revision, Clinical Modification. Coding of procedures was performed by 2 investigators, and a random 20% of procedures were coded by both investigators. These codes were aggregated to generate procedure code lists. During 2014, 6,464 surgical procedures were performed at Mbarara Regional Referral Hospital, to which we assigned 435 unique procedure codes. Substantial inter-rater reliability was achieved (κ = 0.7037). The 111 most common procedure codes accounted for 90% of all codes assigned, 180 accounted for 95%, and 278 accounted for 98%. We considered these sets of codes as 3 procedure code lists. In a prospective validation, we found that these lists described 83.2%, 89.2%, and 92.6% of surgical procedures performed at Mbarara Regional Referral Hospital during August to September of 2015, respectively. Empirically generated brief procedure code lists based on International Classification of Diseases, 9th Revision, Clinical Modification can be used to classify almost all surgical procedures performed at a Ugandan referral hospital. Such a standardized procedure coding system may enable better surgical data collection for administration, research, and quality improvement in resource-limited settings. Copyright © 2017 Elsevier Inc. All rights reserved.
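
    The tallying behind the three brief lists, ranking assigned codes by frequency and finding how many of the most common codes cover 90%, 95% and 98% of all assignments, is easy to reproduce in outline. The sketch below is an assumed reconstruction of that counting step, not the authors' code.

```python
from collections import Counter

def brief_code_lists(assigned_codes, targets=(0.90, 0.95, 0.98)):
    """assigned_codes: flat list of procedure codes, one entry per assignment.

    Returns {target: most frequent codes whose cumulative share of all
    assignments first reaches that target}. Targets must be ascending.
    """
    counts = Counter(assigned_codes).most_common()
    total = sum(n for _, n in counts)
    remaining = list(targets)
    lists, cum = {}, 0
    for i, (_, n) in enumerate(counts, start=1):
        cum += n
        while remaining and cum / total >= remaining[0]:
            lists[remaining.pop(0)] = [code for code, _ in counts[:i]]
    return lists
```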

  10. A conceptual weather-type classification procedure for the Philadelphia, Pennsylvania, area

    USGS Publications Warehouse

    McCabe, Gregory J.

    1990-01-01

    A simple method of weather-type classification, based on a conceptual model of pressure systems that pass through the Philadelphia, Pennsylvania, area, has been developed. The only inputs required for the procedure are daily mean wind direction and cloud cover, which are used to index the position of pressure systems and fronts relative to Philadelphia. Daily mean wind-direction and cloud-cover data recorded at Philadelphia, Pennsylvania, from January 1954 through August 1988 were used to categorize daily weather conditions. The conceptual weather types reflect changes in daily air and dew-point temperatures, and changes in monthly mean temperature and monthly and annual precipitation. The weather-type classification produced by using the conceptual model was similar to a classification produced by using a multivariate statistical classification procedure. Even though the conceptual weather types are derived from a small amount of data, they appear to account for the variability of daily weather patterns sufficiently to describe distinct weather conditions for use in environmental analyses of weather-sensitive processes.

  11. 78 FR 68983 - Cotton Futures Classification: Optional Classification Procedure

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-18

    ...-AD33 Cotton Futures Classification: Optional Classification Procedure AGENCY: Agricultural Marketing... regulations to allow for the addition of an optional cotton futures classification procedure--identified and known as ``registration'' by the U.S. cotton industry and the Intercontinental Exchange (ICE). In...

  12. 76 FR 60388 - Revision of Cotton Futures Classification Procedures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-29

    ...-005] RIN 0581-AD16 Revision of Cotton Futures Classification Procedures AGENCY: Agricultural Marketing... update the procedures for cotton futures quality classification services by using Smith-Doxey classification data in the cotton futures classification process. In addition, references to a separate and...

  13. 28 CFR 524.73 - Classification procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 2 2014-07-01 2014-07-01 false Classification procedures. 524.73 Section..., CLASSIFICATION, AND TRANSFER CLASSIFICATION OF INMATES Central Inmate Monitoring (CIM) System § 524.73 Classification procedures. (a) Initial assignment. Except as provided for in paragraphs (a) (1) through (4) of...

  14. 28 CFR 524.73 - Classification procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 28 Judicial Administration 2 2012-07-01 2012-07-01 false Classification procedures. 524.73 Section..., CLASSIFICATION, AND TRANSFER CLASSIFICATION OF INMATES Central Inmate Monitoring (CIM) System § 524.73 Classification procedures. (a) Initial assignment. Except as provided for in paragraphs (a) (1) through (4) of...

  15. 28 CFR 524.73 - Classification procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 28 Judicial Administration 2 2013-07-01 2013-07-01 false Classification procedures. 524.73 Section..., CLASSIFICATION, AND TRANSFER CLASSIFICATION OF INMATES Central Inmate Monitoring (CIM) System § 524.73 Classification procedures. (a) Initial assignment. Except as provided for in paragraphs (a) (1) through (4) of...

  16. 28 CFR 524.73 - Classification procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 28 Judicial Administration 2 2011-07-01 2011-07-01 false Classification procedures. 524.73 Section..., CLASSIFICATION, AND TRANSFER CLASSIFICATION OF INMATES Central Inmate Monitoring (CIM) System § 524.73 Classification procedures. (a) Initial assignment. Except as provided for in paragraphs (a) (1) through (4) of...

  17. Ranking of predictor variables based on effect size criterion provides an accurate means of automatically classifying opinion column articles

    NASA Astrophysics Data System (ADS)

    Legara, Erika Fille; Monterola, Christopher; Abundo, Cheryl

    2011-01-01

    We demonstrate an accurate procedure based on linear discriminant analysis that allows automatic authorship classification of opinion column articles. First, we extract the following stylometric features of 157 column articles from four authors: statistics on high frequency words, number of words per sentence, and number of sentences per paragraph. Then, by systematically ranking these features based on an effect size criterion, we show that we can achieve an average classification accuracy of 93% for the test set. In comparison, frequency-based ranking has an average accuracy of 80%. The highest possible average classification accuracy of our data merely relying on chance is ∼31%. By carrying out sensitivity analysis, we show that the effect size criterion is superior to frequency ranking because there exist low frequency words that significantly contribute to successful author discrimination. Consistent results are seen when the procedure is applied in classifying the undisputed Federalist papers of Alexander Hamilton and James Madison. To the best of our knowledge, this work is the first attempt at classifying opinion column articles, which, by virtue of being shorter (as compared to novels or short stories), are more prone to over-fitting issues. The near-perfect classification for the longer papers supports this claim. Our results provide an important insight into authorship attribution that has been overlooked in previous studies: that ranking discriminant variables based on word frequency counts is not necessarily an optimal procedure.
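
    A minimal sketch of the procedure outlined above, ranking stylometric features by an effect-size criterion and feeding the top-ranked ones to a linear discriminant classifier; eta-squared is used here as the effect-size measure and k is a free parameter, both assumptions rather than details taken from the paper.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def effect_size_ranking(X, y):
    """Rank features by eta-squared (between-author variance / total variance).

    X: (n_articles, n_features) stylometric features; y: author labels.
    Returns feature indices, strongest discriminators first.
    """
    grand = X.mean(axis=0)
    ss_total = ((X - grand) ** 2).sum(axis=0)
    ss_between = np.zeros(X.shape[1])
    for author in np.unique(y):
        group = X[y == author]
        ss_between += len(group) * (group.mean(axis=0) - grand) ** 2
    eta2 = ss_between / np.where(ss_total == 0, 1.0, ss_total)
    return np.argsort(eta2)[::-1]

def classify_top_k(X_train, y_train, X_test, k=20):
    """LDA on the k features with the largest effect size."""
    top = effect_size_ranking(X_train, y_train)[:k]
    lda = LinearDiscriminantAnalysis().fit(X_train[:, top], y_train)
    return lda.predict(X_test[:, top])
```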

  18. EEG-Based Brain-Computer Interface for Decoding Motor Imagery Tasks within the Same Hand Using Choi-Williams Time-Frequency Distribution

    PubMed Central

    Alwanni, Hisham; Baslan, Yara; Alnuman, Nasim; Daoud, Mohammad I.

    2017-01-01

    This paper presents an EEG-based brain-computer interface system for classifying eleven motor imagery (MI) tasks within the same hand. The proposed system utilizes the Choi-Williams time-frequency distribution (CWD) to construct a time-frequency representation (TFR) of the EEG signals. The constructed TFR is used to extract five categories of time-frequency features (TFFs). The TFFs are processed using a hierarchical classification model to identify the MI task encapsulated within the EEG signals. To evaluate the performance of the proposed approach, EEG data were recorded for eighteen intact subjects and four amputated subjects while imagining to perform each of the eleven hand MI tasks. Two performance evaluation analyses, namely channel- and TFF-based analyses, are conducted to identify the best subset of EEG channels and the TFFs category, respectively, that enable the highest classification accuracy between the MI tasks. In each evaluation analysis, the hierarchical classification model is trained using two training procedures, namely subject-dependent and subject-independent procedures. These two training procedures quantify the capability of the proposed approach to capture both intra- and inter-personal variations in the EEG signals for different MI tasks within the same hand. The results demonstrate the efficacy of the approach for classifying the MI tasks within the same hand. In particular, the classification accuracies obtained for the intact and amputated subjects are as high as 88.8% and 90.2%, respectively, for the subject-dependent training procedure, and 80.8% and 87.8%, respectively, for the subject-independent training procedure. These results suggest the feasibility of applying the proposed approach to control dexterous prosthetic hands, which can be of great benefit for individuals suffering from hand amputations. PMID:28832513

  19. An automatic agricultural zone classification procedure for crop inventory satellite images

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Kux, H. J.; Velasco, F. R. D.; Deoliveira, M. O. B.

    1982-01-01

    A classification procedure for assessing crop areal proportion in multispectral scanner imagery is discussed. The procedure is divided into four parts: labeling, classification, proportion estimation, and evaluation. The procedure also has the following characteristics: multitemporal classification; the need for only minimal field information; and the capability to verify automatic classification against analyst labeling. The processing steps and the main algorithms involved are discussed. An outlook on the future of this technology is also presented.

  20. A New Classification System for Unilateral Cleft Lip and Palate Infants to assist Presurgical Infant Orthopedics.

    PubMed

    Daigavane, P S; Hazarey, P V; Niranjane, P; Vasudevan, S D; Thombare, B R; Daigavane, S

    2015-01-01

    The proposed advantages of pre-surgical naso-alveolar moulding (PNAM) are easier primary lip repair, which heals under minimal tension, reducing scar formation and improving aesthetic results, in addition to reshaping of the alar cartilage and improvement of nasal symmetry. However, the anatomy and alveolar morphology vary for each cleft child, and the PNAM procedure differs accordingly. In an attempt to categorize unilateral cleft lip and palate cases according to their anatomical variations, a new classification system has been proposed. This classification aims to give insight into unilateral cleft morphology, on the basis of which modifications to the PNAM procedure could be made.

  1. A Partial Least Squares Based Procedure for Upstream Sequence Classification in Prokaryotes.

    PubMed

    Mehmood, Tahir; Bohlin, Jon; Snipen, Lars

    2015-01-01

    The upstream region of coding genes is important for several reasons, for instance for locating transcription factor binding sites and start-site initiation in genomic DNA. Motivated by a recently conducted study in which a multivariate approach was successfully applied to coding sequence modeling, we introduce a partial least squares (PLS) based procedure for the classification of true upstream prokaryotic sequences from background upstream sequences. The upstream sequences of coding genes conserved across genomes were considered in the analysis, where conserved coding genes were found using the pan-genome concept for each prokaryotic species considered. PLS uses a position specific scoring matrix (PSSM) to study the characteristics of the upstream region. Results obtained by the PLS-based method were compared with the Gini importance of random forest (RF) and with support vector machines (SVM), which are widely used methods for sequence classification. The upstream sequence classification performance was evaluated by cross validation, and the suggested approach identifies the prokaryotic upstream region significantly better than RF (p-value < 0.01) and SVM (p-value < 0.01). Further, the proposed method also produced results that concurred with known biological characteristics of the upstream region.
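
    A hedged sketch of PLS-based two-class discrimination of fixed-length sequences: the one-hot (position-specific indicator) encoding, the number of PLS components and the 0.5 decision threshold are illustrative assumptions, not details from the paper, which builds its features from PSSMs of pan-genome-conserved genes.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

BASES = "ACGT"

def encode(seqs):
    """Encode equal-length DNA sequences as a position-by-base indicator matrix."""
    length = len(seqs[0])
    X = np.zeros((len(seqs), 4 * length))
    for i, s in enumerate(seqs):
        for j, b in enumerate(s.upper()):
            if b in BASES:
                X[i, 4 * j + BASES.index(b)] = 1.0
    return X

def pls_da(train_seqs, train_is_upstream, test_seqs, n_components=5):
    """PLS-DA: regress a 0/1 class indicator on sequence features, threshold at 0.5."""
    pls = PLSRegression(n_components=n_components)
    pls.fit(encode(train_seqs), np.asarray(train_is_upstream, dtype=float))
    return (pls.predict(encode(test_seqs)).ravel() >= 0.5).astype(int)
```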

  2. Household-based trip survey procedure

    DOT National Transportation Integrated Search

    1993-02-22

    This memorandum describes the development of interim trip production cross-classification models for four home-based trip purposes: home-based work (HBW), home-based shopping and personal business (HBPB), home-based social and recreational (HBSR), an...

  3. Toward Automated Cochlear Implant Fitting Procedures Based on Event-Related Potentials.

    PubMed

    Finke, Mareike; Billinger, Martin; Büchner, Andreas

    Cochlear implants (CIs) restore hearing to the profoundly deaf by direct electrical stimulation of the auditory nerve. To provide an optimal electrical stimulation pattern the CI must be individually fitted to each CI user. To date, CI fitting is primarily based on subjective feedback from the user. However, not all CI users are able to provide such feedback, for example, small children. This study explores the possibility of using the electroencephalogram (EEG) to objectively determine if CI users are able to hear differences in tones presented to them, which has potential applications in CI fitting or closed loop systems. Deviant and standard stimuli were presented to 12 CI users in an active auditory oddball paradigm. The EEG was recorded in two sessions and classification of the EEG data was performed with shrinkage linear discriminant analysis. Also, the impact of CI artifact removal on classification performance and the possibility to reuse a trained classifier in future sessions were evaluated. Overall, classification performance was above chance level for all participants although performance varied considerably between participants. Also, artifacts were successfully removed from the EEG without impairing classification performance. Finally, reuse of the classifier causes only a small loss in classification performance. Our data provide first evidence that EEG can be automatically classified on single-trial basis in CI users. Despite the slightly poorer classification performance over sessions, classifier and CI artifact correction appear stable over successive sessions. Thus, classifier and artifact correction weights can be reused without repeating the set-up procedure in every session, which makes the technique easier applicable. With our present data, we can show successful classification of event-related cortical potential patterns in CI users. In the future, this has the potential to objectify and automate parts of CI fitting procedures.
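
    Shrinkage linear discriminant analysis of single trials, as named in the abstract, is available off the shelf; the flattened-epoch feature layout and the cross-validated scoring below are assumptions for illustration (the study evaluated across two recording sessions).

```python
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def shrinkage_lda_accuracy(epochs, labels, folds=5):
    """epochs: (n_trials, n_channels, n_times) EEG array; labels: deviant/standard per trial.

    Flatten each trial to a feature vector and score shrinkage LDA with
    stratified cross-validation.
    """
    X = epochs.reshape(len(epochs), -1)
    clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
    return cross_val_score(clf, X, labels, cv=folds).mean()
```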

  4. Automated riverine landscape characterization: GIS-based tools for watershed-scale research, assessment, and management.

    PubMed

    Williams, Bradley S; D'Amico, Ellen; Kastens, Jude H; Thorp, James H; Flotemersch, Joseph E; Thoms, Martin C

    2013-09-01

    River systems consist of hydrogeomorphic patches (HPs) that emerge at multiple spatiotemporal scales. Functional process zones (FPZs) are HPs that exist at the river valley scale and are important strata for framing whole-watershed research questions and management plans. Hierarchical classification procedures aid in HP identification by grouping sections of river based on their hydrogeomorphic character; however, collecting data required for such procedures with field-based methods is often impractical. We developed a set of GIS-based tools that facilitate rapid, low cost riverine landscape characterization and FPZ classification. Our tools, termed RESonate, consist of a custom toolbox designed for ESRI ArcGIS®. RESonate automatically extracts 13 hydrogeomorphic variables from readily available geospatial datasets and datasets derived from modeling procedures. An advanced 2D flood model, FLDPLN, designed for MATLAB® is used to determine valley morphology by systematically flooding river networks. When used in conjunction with other modeling procedures, RESonate and FLDPLN can assess the character of large river networks quickly and at very low costs. Here we describe tool and model functions in addition to their benefits, limitations, and applications.

  5. 42 CFR 412.60 - DRG classification and weighting factors.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 2 2013-10-01 2013-10-01 false DRG classification and weighting factors. 412.60... discharge is based, as appropriate, on the patient's age, sex, principal diagnosis (that is, the diagnosis...), secondary diagnoses, procedures performed, and discharge status. (2) Each discharge is assigned to only one...

  6. 42 CFR 412.60 - DRG classification and weighting factors.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 2 2012-10-01 2012-10-01 false DRG classification and weighting factors. 412.60... discharge is based, as appropriate, on the patient's age, sex, principal diagnosis (that is, the diagnosis...), secondary diagnoses, procedures performed, and discharge status. (2) Each discharge is assigned to only one...

  7. Automatic detection of sleep macrostructure based on a sensorized T-shirt.

    PubMed

    Bianchi, Anna M; Mendez, Martin O

    2010-01-01

    In the present work we apply a fully automatic procedure to the analysis of signals coming from a sensorized T-shirt, worn during the night, for sleep evaluation. The goodness and reliability of the signals recorded through the T-shirt were previously tested, while the employed algorithms for feature extraction and sleep classification were previously developed on standard ECG recordings, and the obtained classification was compared to the standard clinical practice based on polysomnography (PSG). In the present work we combined T-shirt recordings and automatic classification and could obtain reliable sleep profiles, i.e. the classification of sleep into WAKE, REM (rapid eye movement) and NREM stages, based on heart rate variability (HRV), respiration and movement signals.

  8. Fuzzy Classification of Ocean Color Satellite Data for Bio-optical Algorithm Constituent Retrievals

    NASA Technical Reports Server (NTRS)

    Campbell, Janet W.

    1998-01-01

    The ocean has traditionally been viewed as a two-class system. Morel and Prieur (1977) classified ocean water according to the dominant absorbent particle suspended in the water column. Case 1 is described as having a high concentration of phytoplankton (and detritus) relative to other particles. Conversely, case 2 is described as having inorganic particles such as suspended sediments in high concentrations. Little work has gone into the problem of mixing bio-optical models for these different water types. An approach is put forth here to blend bio-optical algorithms based on a fuzzy classification scheme. This scheme involves two procedures. First, a clustering procedure identifies classes and builds class statistics from in-situ optical measurements. Next, a classification procedure assigns satellite pixels partial memberships to these classes based on their ocean color reflectance signature. These membership assignments can be used as the basis for weighting retrievals from class-specific bio-optical algorithms. This technique is demonstrated with in-situ optical measurements and an image from the SeaWiFS ocean color satellite.
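
    The two-step scheme described here, building class statistics from in-situ optical measurements and then assigning each pixel partial memberships that weight class-specific retrievals, can be sketched with Gaussian class likelihoods. The Gaussian form and the likelihood normalization are assumptions about one plausible implementation, not the author's exact formulation.

```python
import numpy as np
from scipy.stats import multivariate_normal

def class_statistics(reflectance_by_class):
    """Step 1: mean vector and covariance matrix for each optical water class."""
    return {c: (r.mean(axis=0), np.cov(r, rowvar=False))
            for c, r in reflectance_by_class.items()}

def fuzzy_blend(pixel_reflectance, stats, retrievals):
    """Step 2: memberships from normalized class likelihoods, then a
    membership-weighted blend of class-specific algorithm outputs.

    retrievals: {class: constituent estimate from that class's algorithm}
    """
    likelihood = {c: multivariate_normal(mean=m, cov=S, allow_singular=True).pdf(pixel_reflectance)
                  for c, (m, S) in stats.items()}
    total = sum(likelihood.values())
    membership = {c: l / total for c, l in likelihood.items()}
    return sum(membership[c] * retrievals[c] for c in membership)
```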

  9. 48 CFR 245.201-73 - Security classification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Security classification... Procedures 245.201-73 Security classification. Follow the procedures at PGI 245.201-73 for security classification. ...

  10. 48 CFR 245.201-73 - Security classification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 3 2012-10-01 2012-10-01 false Security classification... Procedures 245.201-73 Security classification. Follow the procedures at PGI 245.201-73 for security classification. ...

  11. Segmentation of bone and soft tissue regions in digital radiographic images of extremities

    NASA Astrophysics Data System (ADS)

    Pakin, S. Kubilay; Gaborski, Roger S.; Barski, Lori L.; Foos, David H.; Parker, Kevin J.

    2001-07-01

    This paper presents an algorithm for segmentation of computed radiography (CR) images of extremities into bone and soft tissue regions. The algorithm is a region-based one in which the regions are constructed using a growing procedure with two different statistical tests. Following the growing process, a tissue classification procedure is employed. The purpose of the classification is to label each region as either bone or soft tissue. This binary classification goal is achieved by using a voting procedure that consists of clustering the regions in each neighborhood system into two classes. The voting procedure provides a crucial compromise between local and global analysis of the image, which is necessary due to strong exposure variations seen on the imaging plate. Also, because some regions are large enough that exposure variations can be observed across them, overlapping blocks must be used during classification. After the classification step, the resulting bone and soft tissue regions are refined by fitting a second-order surface to each tissue and re-evaluating the label of each region according to the distance between the region and the surfaces. The performance of the algorithm was tested on a variety of extremity images using manually segmented images as the gold standard. The experiments showed that our algorithm provided a bone boundary with an average area overlap of 90% compared to the gold standard.
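
    The refinement step described at the end, fitting a second-order intensity surface to each tissue class and re-evaluating regions by their distance to the surfaces, rests on an ordinary least-squares quadratic fit. A generic sketch of such a fit is given below; the exact parameterization used in the paper is not specified, so this form is an assumption.

```python
import numpy as np

def fit_quadratic_surface(x, y, z):
    """Least-squares fit of z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2 to the
    pixel coordinates (x, y) and intensities z of one tissue class."""
    A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs

def surface_residual(coeffs, x, y, z):
    """Mean distance of a region's intensities from the fitted tissue surface."""
    basis = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    return np.abs(z - basis @ coeffs).mean()
```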

  12. 77 FR 5379 - Revision of Cotton Futures Classification Procedures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-03

    ... 0581-AD16 Revision of Cotton Futures Classification Procedures AGENCY: Agricultural Marketing Service... for cotton futures quality classification services by using Smith-Doxey classification data in the cotton futures classification process. In addition, references to a separate and optional review of...

  13. Designing a training tool for imaging mental models

    NASA Technical Reports Server (NTRS)

    Dede, Christopher J.; Jayaram, Geetha

    1990-01-01

    The training process can be conceptualized as the student acquiring an evolutionary sequence of classification-problem-solving mental models. For example, a physician learns (1) classification systems for patient symptoms, diagnostic procedures, diseases, and therapeutic interventions and (2) interrelationships among these classifications (e.g., how to use diagnostic procedures to collect data about a patient's symptoms in order to identify the disease so that therapeutic measures can be taken). This project developed functional specifications for a computer-based tool, Mental Link, that allows the evaluative imaging of such mental models. The fundamental design approach underlying this representational medium is traversal of virtual cognition space. Typically intangible cognitive entities and links among them become visible as a three-dimensional web that represents a knowledge structure. The tool has a high degree of flexibility and customizability to allow extension to other types of uses, such as a front-end to an intelligent tutoring system, knowledge base, hypermedia system, or semantic network.

  14. Statistical Analysis of Q-matrix Based Diagnostic Classification Models

    PubMed Central

    Chen, Yunxiao; Liu, Jingchen; Xu, Gongjun; Ying, Zhiliang

    2014-01-01

    Diagnostic classification models have recently gained prominence in educational assessment, psychiatric evaluation, and many other disciplines. Central to the model specification is the so-called Q-matrix that provides a qualitative specification of the item-attribute relationship. In this paper, we develop theories on the identifiability for the Q-matrix under the DINA and the DINO models. We further propose an estimation procedure for the Q-matrix through the regularized maximum likelihood. The applicability of this procedure is not limited to the DINA or the DINO model and it can be applied to essentially all Q-matrix based diagnostic classification models. Simulation studies are conducted to illustrate its performance. Furthermore, two case studies are presented. The first case is a data set on fraction subtraction (educational application) and the second case is a subsample of the National Epidemiological Survey on Alcohol and Related Conditions concerning the social anxiety disorder (psychiatric application). PMID:26294801
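
    Central to the DINA model is the ideal response: an examinee answers an item correctly, up to slipping and guessing, only if they possess every attribute the Q-matrix requires for that item. A small self-contained sketch of this computation with made-up matrices (the estimation procedure itself, a regularized maximum likelihood over the Q-matrix, is not reproduced here):

```python
import numpy as np

def dina_ideal_response(Q, alpha):
    """Q: (items, attributes) 0/1 Q-matrix; alpha: (examinees, attributes) 0/1
    attribute profiles. Entry (i, j) is 1 iff examinee i has every attribute
    required by item j."""
    return (alpha @ Q.T >= Q.sum(axis=1)).astype(int)

# toy example: 3 items, 2 attributes, 2 examinees
Q = np.array([[1, 0], [0, 1], [1, 1]])
alpha = np.array([[1, 0], [1, 1]])
print(dina_ideal_response(Q, alpha))  # [[1 0 0]
                                      #  [1 1 1]]
```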

  15. 49 CFR 8.19 - Procedures for submitting and processing requests for classification reviews.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... for classification reviews. 8.19 Section 8.19 Transportation Office of the Secretary of Transportation CLASSIFIED INFORMATION: CLASSIFICATION/DECLASSIFICATION/ACCESS Classification/Declassification of Information § 8.19 Procedures for submitting and processing requests for classification reviews. (a) The Director...

  16. Do we need a new classification of parotid gland surgery?

    PubMed

    Wierzbicka, Małgorzata; Piwowarczyk, Krzysztof; Nogala, Hanna; Błaszczyńska, Marzena; Kosiedrowski, Michał; Mazurek, Cezary

    2016-06-30

    In February 2016 the European Salivary Gland Society (ESGS) presented and recommended a classification of parotidectomies based on the anatomical I-V level division of the parotid gland. The main goal of this paper is to present the new classification and to answer the question of whether it is more precise than the classic one. 607 patients (315 men, 292 women) were operated on for parotid tumours in a tertiary referral centre, the Department of Otolaryngology, Head and Neck Surgery, Medical University of Poznań (502 benign and 105 malignant tumours). Parotid surgery descriptions obtained by retrospective analysis of all operating protocols covering the years 2006-2015 were "translated" into the new classification proposed by the ESGS. Analysis of the operating protocols and fitting them into the new ESGS classification shows some discrepancies, in both benign and malignant tumours. In the re-evaluation of the 607 cases, for 94 procedures for benign tumours the only information available was that "surgery was performed within the superficial lobe"; thus, the new classification forces the surgeon to be much more precise than previously. In 3 cases the whole superficial lobe was removed together with the upper part of the deep lobe; because the classification lacks a parotidectomy I-II-IV category, it proved insufficient in these three cases. In 6 cases of ECD more than one parotid gland tumour was removed. Among malignant tumours, total parotidectomy was the predominant procedure. In 3 of 13 cases of expanded parotidectomy the temporomandibular joint (TMJ) was additionally removed, and it seems that the acronym TMJ should be included among the additional resected structures. It is also necessary to supplement the description of the treatment with casuistically resected anatomical structures for oncological purposes (RT planning) and follow-up imaging. Since 2015, Poland has maintained a National Cancer Registry of benign salivary gland tumours (https://guzyslinianek.pcss.pl). The new surgical anatomy and the classification based on it will be very helpful for unequivocal, yet brief and not laborious, reporting of procedures. To summarize, the classification is easy to use and precise, and it forces the surgeon to provide a detailed description while saving time. Although it is broad and accurate, it does not cover all clinically rare cases or multiple foci, and it does not contain key information about rupture of the tumour capsule, so it is necessary to complement the type of surgery with these annotations. The simple, clear and comprehensive classification is especially valuable for centres that keep registries. Thus, we are personally grateful for this new classification, which facilitates multicentre communication.

  17. 9 CFR 145.33 - Terminology and classification; flocks and products.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    .... Such action shall not be taken until a thorough investigation has been made by the Service and the.... gallisepticum as provided in § 145.14(b), or by a polymerase chain reaction (PCR)-based procedure approved by...(b) or by a polymerase chain reaction (PCR)-based procedure approved by the Department. If fewer than...

  18. An algorithm for the arithmetic classification of multilattices.

    PubMed

    Indelicato, Giuliana

    2013-01-01

    A procedure for the construction and the classification of monoatomic multilattices in arbitrary dimension is developed. The algorithm allows one to determine the location of the points of all monoatomic multilattices with a given symmetry, or to determine whether two assigned multilattices are arithmetically equivalent. This approach is based on ideas from integral matrix theory, in particular the reduction to the Smith normal form, and can be coded to provide a classification software package.
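
    The reduction to Smith normal form on which the algorithm relies is available in standard computer algebra systems. As a minimal illustration (using SymPy, and with equality of Smith normal forms standing in for the full arithmetic-equivalence test, which is a simplification):

```python
from sympy import Matrix, ZZ
from sympy.matrices.normalforms import smith_normal_form

def same_smith_form(A, B):
    """Compare Smith normal forms of two integer matrices, e.g. generator
    matrices describing two monoatomic multilattices."""
    return smith_normal_form(Matrix(A), domain=ZZ) == smith_normal_form(Matrix(B), domain=ZZ)

print(same_smith_form([[2, 0], [0, 4]], [[2, 4], [0, 4]]))  # True: both reduce to diag(2, 4)
```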

  19. Attention Recognition in EEG-Based Affective Learning Research Using CFS+KNN Algorithm.

    PubMed

    Hu, Bin; Li, Xiaowei; Sun, Shuting; Ratcliffe, Martyn

    2018-01-01

The research detailed in this paper focuses on the processing of Electroencephalography (EEG) data to identify attention during the learning process. The identification of affect using our procedures is integrated into a simulated distance learning system that provides feedback to the user with respect to attention and concentration. The authors propose a classification procedure that combines correlation-based feature selection (CFS) and a k-nearest-neighbor (KNN) data mining algorithm. To evaluate the CFS+KNN algorithm, it was tested against the CFS+C4.5 algorithm and other classification algorithms. The classification performance was measured 10 times with different 3-fold cross-validation partitions. The data were derived from 10 subjects while they were attempting to learn material in a simulated distance learning environment. A self-assessment model of self-report was used with a single valence to evaluate attention on 3 levels (high, neutral, low). It was found that CFS+KNN had a much better performance, giving the highest correct classification rate (CCR) of % for the valence dimension divided into three classes.
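    A CFS+KNN pipeline of this kind can be sketched as follows. The sketch uses a greedy forward search over the standard CFS merit and a 3-fold cross-validated KNN; the synthetic features are only a stand-in for EEG attention features, and the selection is done once on the full data for brevity, whereas in a real study it should be nested inside the cross-validation.

```python
# Hedged sketch of a CFS+KNN procedure: greedy forward selection using the CFS
# merit, then 3-fold cross-validated k-nearest-neighbour classification.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def cfs_forward(X, y, n_select=10):
    """Greedy CFS: maximise k*r_cf / sqrt(k + k*(k-1)*r_ff)."""
    n_feat = X.shape[1]
    r_cf = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_feat)])
    r_ff = np.abs(np.corrcoef(X, rowvar=False))
    selected = []
    while len(selected) < n_select:
        best_j, best_merit = None, -np.inf
        for j in range(n_feat):
            if j in selected:
                continue
            subset = selected + [j]
            k = len(subset)
            mean_cf = r_cf[subset].mean()
            mean_ff = (r_ff[np.ix_(subset, subset)][np.triu_indices(k, 1)].mean()
                       if k > 1 else 0.0)
            merit = k * mean_cf / np.sqrt(k + k * (k - 1) * mean_ff)
            if merit > best_merit:
                best_j, best_merit = j, merit
        selected.append(best_j)
    return selected

# Synthetic stand-in for EEG attention features (3 classes: high/neutral/low).
X, y = make_classification(n_samples=300, n_features=40, n_informative=8,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
features = cfs_forward(X, y, n_select=10)
cv = StratifiedKFold(n_splits=3, shuffle=True, random_state=0)
scores = cross_val_score(KNeighborsClassifier(n_neighbors=5), X[:, features], y, cv=cv)
print("CCR (3-fold CV): %.3f" % scores.mean())
```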

  20. Exploring Genome-Wide Expression Profiles Using Machine Learning Techniques.

    PubMed

    Kebschull, Moritz; Papapanou, Panos N

    2017-01-01

Although contemporary high-throughput -omics methods produce high-dimensional data, the resulting wealth of information is difficult to assess using traditional statistical procedures. Machine learning methods facilitate the detection of additional patterns, beyond the mere identification of lists of features that differ between groups. Here, we demonstrate the utility of (1) supervised classification algorithms in class validation, and (2) unsupervised clustering in class discovery. We use data from our previous work that described the transcriptional profiles of gingival tissue samples obtained from subjects suffering from chronic or aggressive periodontitis, (1) to test whether the two diagnostic entities were also characterized by differences on the molecular level, and (2) to search for a novel, alternative classification of periodontitis based on the tissue transcriptomes. Using machine learning technology, we provide evidence for diagnostic imprecision in the currently accepted classification of periodontitis, and demonstrate that a novel, alternative classification based on differences in gingival tissue transcriptomes is feasible. The outlined procedures allow for the unbiased interrogation of high-dimensional datasets for characteristic underlying classes, and are applicable to a broad range of -omics data.
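    The two-step workflow of class validation followed by class discovery can be sketched on synthetic expression-like data as below; the classifier, clustering method, and data dimensions are illustrative choices, not the pipeline used in the cited work.

```python
# Sketch: class validation via supervised classification and class discovery via
# unsupervised clustering, on synthetic high-dimensional "expression" data.
from sklearn.cluster import AgglomerativeClustering
from sklearn.datasets import make_classification
from sklearn.metrics import adjusted_rand_score
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

# Synthetic stand-in for tissue transcriptomes with two diagnostic labels.
X, diagnosis = make_classification(n_samples=120, n_features=500, n_informative=20,
                                   n_redundant=50, n_classes=2, random_state=1)

# (1) Class validation: can a classifier recover the existing clinical labels?
acc = cross_val_score(LinearSVC(C=0.01, dual=False), X, diagnosis, cv=5).mean()
print("Cross-validated accuracy for the existing classification: %.2f" % acc)

# (2) Class discovery: does an unsupervised grouping match those labels?
clusters = AgglomerativeClustering(n_clusters=2).fit_predict(X)
print("Agreement between clusters and labels (ARI): %.2f"
      % adjusted_rand_score(diagnosis, clusters))
```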

  1. Evaluation of results of US corn and soybeans exploratory experiment: Classification procedures verification test. [Missouri, Iowa, Indiana, and Illinois

    NASA Technical Reports Server (NTRS)

    Carnes, J. G.; Baird, J. E. (Principal Investigator)

    1980-01-01

    The classification procedure utilized in making crop proportion estimates for corn and soybeans using remotely sensed data was evaluated. The procedure was derived during the transition year of the Large Area Crop Inventory Experiment. Analysis of variance techniques were applied to classifications performed by 3 groups of analysts who processed 25 segments selected from 4 agrophysical units (APU's). Group and APU effects were assessed to determine factors which affected the quality of the classifications. The classification results were studied to determine the effectiveness of the procedure in producing corn and soybeans proportion estimates.

  2. Coding of procedures documented by general practitioners in Swedish primary care-an explorative study using two procedure coding systems

    PubMed Central

    2012-01-01

    Background Procedures documented by general practitioners in primary care have not been studied in relation to procedure coding systems. We aimed to describe procedures documented by Swedish general practitioners in electronic patient records and to compare them to the Swedish Classification of Health Interventions (KVÅ) and SNOMED CT. Methods Procedures in 200 record entries were identified, coded, assessed in relation to two procedure coding systems and analysed. Results 417 procedures found in the 200 electronic patient record entries were coded with 36 different Classification of Health Interventions categories and 148 different SNOMED CT concepts. 22.8% of the procedures could not be coded with any Classification of Health Interventions category and 4.3% could not be coded with any SNOMED CT concept. 206 procedure-concept/category pairs were assessed as a complete match in SNOMED CT compared to 10 in the Classification of Health Interventions. Conclusions Procedures documented by general practitioners were present in nearly all electronic patient record entries. Almost all procedures could be coded using SNOMED CT. Classification of Health Interventions covered the procedures to a lesser extent and with a much lower degree of concordance. SNOMED CT is a more flexible terminology system that can be used for different purposes for procedure coding in primary care. PMID:22230095

  3. Neural network approaches versus statistical methods in classification of multisource remote sensing data

    NASA Technical Reports Server (NTRS)

    Benediktsson, Jon A.; Swain, Philip H.; Ersoy, Okan K.

    1990-01-01

Neural network learning procedures and statistical classification methods are applied and compared empirically in the classification of multisource remote sensing and geographic data. Statistical multisource classification by means of a method based on Bayesian classification theory is also investigated and modified. The modifications permit control of the influence of the data sources involved in the classification process. Reliability measures are introduced to rank the quality of the data sources. The data sources are then weighted according to these rankings in the statistical multisource classification. Four data sources are used in experiments: Landsat MSS data and three forms of topographic data (elevation, slope, and aspect). Experimental results show that the two approaches have unique advantages and disadvantages in this classification application.

  4. Proposal for a new content model for the Austrian Procedure Catalogue.

    PubMed

    Neururer, Sabrina B; Pfeiffer, Karl P

    2013-01-01

    The Austrian Procedure Catalogue is used for procedure coding in Austria. Its architecture and content has some major weaknesses. The aim of this study is the presentation of a new potential content model for this classification system consisting of main characteristics of health interventions. It is visualized using a UML class diagram. Based on this proposition, an implementation of an ontology for procedure coding is planned.

  5. 10 CFR 1045.8 - Procedural exemptions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Classification. (b) A request for an exemption shall be made in writing to the Director of Classification and... DEPARTMENT OF ENERGY (GENERAL PROVISIONS) NUCLEAR CLASSIFICATION AND DECLASSIFICATION Program Management of the Restricted Data and Formerly Restricted Data Classification System § 1045.8 Procedural exemptions...

  6. 10 CFR 1045.8 - Procedural exemptions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Classification. (b) A request for an exemption shall be made in writing to the Director of Classification and... DEPARTMENT OF ENERGY (GENERAL PROVISIONS) NUCLEAR CLASSIFICATION AND DECLASSIFICATION Program Management of the Restricted Data and Formerly Restricted Data Classification System § 1045.8 Procedural exemptions...

  7. 77 FR 20503 - Revision of Cotton Classification Procedures for Determining Cotton Leaf Grade

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-05

    ...-AD19 Revision of Cotton Classification Procedures for Determining Cotton Leaf Grade AGENCY... amending the procedures for determining the official leaf grade for Upland and Pima cotton. The leaf grade is a part of the official classification which denotes cotton fiber quality used in cotton marketing...

  8. 42 CFR 416.167 - Basis of payment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... classification (APC) groups and payment weights. (1) ASC covered surgical procedures are classified using the APC... section, an ASC relative payment weight is determined based on the APC relative payment weight for each covered surgical procedure and covered ancillary service that has an applicable APC relative payment...

  9. Landsat TM Classifications For SAFIS Using FIA Field Plots

    Treesearch

    William H. Cooke; Andrew J. Hartsell

    2001-01-01

    Wall-to-wall Landsat Thematic Mapper (TM) classification efforts in Georgia require field validation. We developed a new crown modeling procedure based on Forest Health Monitoring (FHM) data to test Forest Inventory and Analysis (FIA) data. These models simulate the proportion of tree crowns that reflect light on a FIA subplot basis. We averaged subplot crown...

  10. Vessel Classification in Cosmo-Skymed SAR Data Using Hierarchical Feature Selection

    NASA Astrophysics Data System (ADS)

    Makedonas, A.; Theoharatos, C.; Tsagaris, V.; Anastasopoulos, V.; Costicoglou, S.

    2015-04-01

SAR-based ship detection and classification are important elements of maritime monitoring applications. Recently, high-resolution SAR data have opened new possibilities for achieving improved classification results. In this work, a hierarchical vessel classification procedure is presented, based on a robust feature extraction and selection scheme that utilizes scale, shape and texture features in a hierarchical way. Initially, different types of feature extraction algorithms are implemented in order to form the feature pool, able to represent the structure, material, orientation and other vessel type characteristics. A two-stage hierarchical feature selection algorithm is then used to discriminate civilian vessels in COSMO-SkyMed SAR images into three distinct types: cargos, small ships and tankers. In our analysis, scale and shape features are used to discriminate the smaller types of vessels present in the available SAR data, or shape-specific vessels. Then, the most informative texture and intensity features are incorporated in order to distinguish the civilian types with high accuracy. A feature selection procedure that uses heuristic measures based on the features' statistical characteristics, followed by an exhaustive search over feature sets formed by the most qualified features, is carried out in order to determine the most appropriate combination of features for the final classification. In our analysis, five COSMO-SkyMed SAR images with 2.2 m x 2.2 m resolution were used to analyse the detailed characteristics of these types of ships. A total of 111 ships with available AIS data were used in the classification process. The experimental results show that the method performs well in ship classification, with an overall accuracy reaching 83%. Further investigation of additional features and proper feature selection is currently in progress.
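    The two-stage hierarchical idea can be sketched as below: a first stage separates small ships from large vessels using scale/shape features, and a second stage separates cargos from tankers using texture/intensity features. The feature arrays, labels, and classifiers are synthetic placeholders, not the features or selection scheme of the paper.

```python
# Hedged sketch of a two-stage hierarchical vessel classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 111
labels = rng.choice(["cargo", "small", "tanker"], size=n)                 # ground truth (e.g. AIS)
shape_feats = rng.normal(size=(n, 4)) + (labels == "small")[:, None]      # scale/shape pool
texture_feats = rng.normal(size=(n, 6)) + (labels == "tanker")[:, None]   # texture/intensity pool

# Stage 1: small vs. large vessels, trained on shape features only.
stage1 = RandomForestClassifier(random_state=0).fit(shape_feats, labels == "small")

# Stage 2: cargo vs. tanker, trained on texture features of the large vessels only.
large = labels != "small"
stage2 = RandomForestClassifier(random_state=0).fit(texture_feats[large], labels[large])

def predict(shape_row, texture_row):
    if stage1.predict(shape_row.reshape(1, -1))[0]:
        return "small"
    return stage2.predict(texture_row.reshape(1, -1))[0]

preds = [predict(shape_feats[i], texture_feats[i]) for i in range(n)]
print("Training-set agreement: %.2f" % np.mean(np.array(preds) == labels))
```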

  11. 12 CFR 1229.12 - Procedures related to capital classification and other actions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Procedures related to capital classification and other actions. 1229.12 Section 1229.12 Banks and Banking FEDERAL HOUSING FINANCE AGENCY ENTITY REGULATIONS CAPITAL CLASSIFICATIONS AND PROMPT CORRECTIVE ACTION Federal Home Loan Banks § 1229.12 Procedures...

  12. 21 CFR 860.84 - Classification procedures for “old devices.”

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Classification procedures for “old devices.” (a) This subpart sets forth the procedures for the original... the appropriate classification panel organized and operated in accordance with section 513 (b) and (c... set forth in § 860.7 relating to the determination of safety and effectiveness; (2) Determines the...

  13. 48 CFR 245.201-71 - Security classification.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 3 2013-10-01 2013-10-01 false Security classification... Procedures 245.201-71 Security classification. Follow the procedures at PGI 245.201-71 for security classification. [76 FR 3537, Jan. 20, 2011. Redesignated and amended at 77 FR 76937, Dec. 31, 2012] ...

  14. 48 CFR 245.201-71 - Security classification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 3 2014-10-01 2014-10-01 false Security classification... Procedures 245.201-71 Security classification. Follow the procedures at PGI 245.201-71 for security classification. [76 FR 3537, Jan. 20, 2011. Redesignated and amended at 77 FR 76937, Dec. 31, 2012] ...

  15. Perceptual-motor skill learning in Gilles de la Tourette syndrome. Evidence for multiple procedural learning and memory systems.

    PubMed

    Marsh, Rachel; Alexander, Gerianne M; Packard, Mark G; Zhu, Hongtu; Peterson, Bradley S

    2005-01-01

    Procedural learning and memory systems likely comprise several skills that are differentially affected by various illnesses of the central nervous system, suggesting their relative functional independence and reliance on differing neural circuits. Gilles de la Tourette syndrome (GTS) is a movement disorder that involves disturbances in the structure and function of the striatum and related circuitry. Recent studies suggest that patients with GTS are impaired in performance of a probabilistic classification task that putatively involves the acquisition of stimulus-response (S-R)-based habits. Assessing the learning of perceptual-motor skills and probabilistic classification in the same samples of GTS and healthy control subjects may help to determine whether these various forms of procedural (habit) learning rely on the same or differing neuroanatomical substrates and whether those substrates are differentially affected in persons with GTS. Therefore, we assessed perceptual-motor skill learning using the pursuit-rotor and mirror tracing tasks in 50 patients with GTS and 55 control subjects who had previously been compared at learning a task of probabilistic classifications. The GTS subjects did not differ from the control subjects in performance of either the pursuit rotor or mirror-tracing tasks, although they were significantly impaired in the acquisition of a probabilistic classification task. In addition, learning on the perceptual-motor tasks was not correlated with habit learning on the classification task in either the GTS or healthy control subjects. These findings suggest that the differing forms of procedural learning are dissociable both functionally and neuroanatomically. The specific deficits in the probabilistic classification form of habit learning in persons with GTS are likely to be a consequence of disturbances in specific corticostriatal circuits, but not the same circuits that subserve the perceptual-motor form of habit learning.

  16. 48 CFR 970.5217-1 - Work for Others Program.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... selection is based on merit or peer review, the work involves basic or applied research to further advance... all Work for Others projects in accordance with the standards, policies, and procedures that apply to..., safeguards and classification procedures, and human and animal research regulations; (8) May subcontract...

  17. 21 CFR 860.130 - General procedures under section 513(e) of the act.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... SERVICES (CONTINUED) MEDICAL DEVICES MEDICAL DEVICE CLASSIFICATION PROCEDURES Reclassification § 860.130... reclassification proceedings under the act based upon new information. (b) A proceeding to reclassify a device... would provide reasonable assurance of the safety and effectiveness of the device and there is sufficient...

  18. Accreditation Standards: Policies, Procedures, and Criteria. Revised Edition.

    ERIC Educational Resources Information Center

    Association of Independent Colleges and Schools, Washington, DC.

    Statements of policies and procedures and evaluation criteria used by the Accrediting Commission of the Association of Independent Colleges and Schools are presented. The organization and function of the Accrediting Commission, the bases of eligibility for evaluation and accreditation of all types of institutions, and the general classification of…

  19. Two techniques for mapping and area estimation of small grains in California using Landsat digital data

    NASA Technical Reports Server (NTRS)

    Sheffner, E. J.; Hlavka, C. A.; Bauer, E. M.

    1984-01-01

    Two techniques have been developed for the mapping and area estimation of small grains in California from Landsat digital data. The two techniques are Band Ratio Thresholding, a semi-automated version of a manual procedure, and LCLS, a layered classification technique which can be fully automated and is based on established clustering and classification technology. Preliminary evaluation results indicate that the two techniques have potential for providing map products which can be incorporated into existing inventory procedures and automated alternatives to traditional inventory techniques and those which currently employ Landsat imagery.

  20. An Iterative Inference Procedure Applying Conditional Random Fields for Simultaneous Classification of Land Cover and Land Use

    NASA Astrophysics Data System (ADS)

    Albert, L.; Rottensteiner, F.; Heipke, C.

    2015-08-01

    Land cover and land use exhibit strong contextual dependencies. We propose a novel approach for the simultaneous classification of land cover and land use, where semantic and spatial context is considered. The image sites for land cover and land use classification form a hierarchy consisting of two layers: a land cover layer and a land use layer. We apply Conditional Random Fields (CRF) at both layers. The layers differ with respect to the image entities corresponding to the nodes, the employed features and the classes to be distinguished. In the land cover layer, the nodes represent super-pixels; in the land use layer, the nodes correspond to objects from a geospatial database. Both CRFs model spatial dependencies between neighbouring image sites. The complex semantic relations between land cover and land use are integrated in the classification process by using contextual features. We propose a new iterative inference procedure for the simultaneous classification of land cover and land use, in which the two classification tasks mutually influence each other. This helps to improve the classification accuracy for certain classes. The main idea of this approach is that semantic context helps to refine the class predictions, which, in turn, leads to more expressive context information. Thus, potentially wrong decisions can be reversed at later stages. The approach is designed for input data based on aerial images. Experiments are carried out on a test site to evaluate the performance of the proposed method. We show the effectiveness of the iterative inference procedure and demonstrate that a smaller size of the super-pixels has a positive influence on the classification result.
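    The mutual-refinement idea can be illustrated in a much-simplified form that omits the CRF machinery: two plain classifiers, one for land cover and one for land use, repeatedly feed their current class predictions back to each other as contextual features. The data, the coupling between the labels, and the number of iterations are purely illustrative.

```python
# Simplified illustration of iterative mutual refinement between a land-cover and
# a land-use classifier (the paper's CRF layers are replaced by plain classifiers;
# the data are synthetic).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic per-site features; land use is made to depend on land cover.
X, cover = make_classification(n_samples=1000, n_features=12, n_informative=6,
                               n_classes=3, random_state=0)
use = (cover + (X[:, 0] > 0)) % 3   # hypothetical coupled land-use label

Xtr, Xte, cov_tr, cov_te, use_tr, use_te = train_test_split(
    X, cover, use, test_size=0.3, random_state=0)

cover_clf = LogisticRegression(max_iter=1000)
use_clf = LogisticRegression(max_iter=1000)

# Initial context: zeros (no prediction from the other task yet).
ctx_use_tr = np.zeros((len(Xtr), 1))
ctx_use_te = np.zeros((len(Xte), 1))

for it in range(3):
    # Land-cover step uses the current land-use prediction as a context feature.
    cover_clf.fit(np.hstack([Xtr, ctx_use_tr]), cov_tr)
    pred_cov_tr = cover_clf.predict(np.hstack([Xtr, ctx_use_tr]))
    pred_cov_te = cover_clf.predict(np.hstack([Xte, ctx_use_te]))
    ctx_cover_tr, ctx_cover_te = pred_cov_tr[:, None], pred_cov_te[:, None]

    # Land-use step uses the refreshed land-cover prediction as context.
    use_clf.fit(np.hstack([Xtr, ctx_cover_tr]), use_tr)
    pred_use_tr = use_clf.predict(np.hstack([Xtr, ctx_cover_tr]))
    pred_use_te = use_clf.predict(np.hstack([Xte, ctx_cover_te]))
    ctx_use_tr, ctx_use_te = pred_use_tr[:, None], pred_use_te[:, None]

    print(f"iteration {it}: land cover {np.mean(pred_cov_te == cov_te):.2f}, "
          f"land use {np.mean(pred_use_te == use_te):.2f}")
```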

  1. 78 FR 54970 - Cotton Futures Classification: Optional Classification Procedure

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-09

    ... Service 7 CFR Part 27 [AMS-CN-13-0043] RIN 0581-AD33 Cotton Futures Classification: Optional... optional cotton futures classification procedure--identified and known as ``registration'' by the U.S. cotton industry and the Intercontinental Exchange (ICE). In response to requests from the U.S. cotton...

  2. Auto-SEIA: simultaneous optimization of image processing and machine learning algorithms

    NASA Astrophysics Data System (ADS)

    Negro Maggio, Valentina; Iocchi, Luca

    2015-02-01

    Object classification from images is an important task for machine vision and it is a crucial ingredient for many computer vision applications, ranging from security and surveillance to marketing. Image based object classification techniques properly integrate image processing and machine learning (i.e., classification) procedures. In this paper we present a system for automatic simultaneous optimization of algorithms and parameters for object classification from images. More specifically, the proposed system is able to process a dataset of labelled images and to return a best configuration of image processing and classification algorithms and of their parameters with respect to the accuracy of classification. Experiments with real public datasets are used to demonstrate the effectiveness of the developed system.
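    The simultaneous-optimization idea can be sketched with a scikit-learn Pipeline whose grid search spans both a preprocessing choice and classifier hyperparameters; the dataset and search grid below are illustrative and do not reproduce the Auto-SEIA system itself.

```python
# Sketch: jointly searching over an image-processing-like step (PCA with a
# variable number of components) and classifier hyperparameters, keeping the
# configuration with the best cross-validated accuracy.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

pipe = Pipeline([("reduce", PCA()), ("clf", SVC())])
grid = {
    "reduce__n_components": [10, 20, 40],   # "image processing" parameter
    "clf__C": [0.1, 1, 10],                 # classifier parameters
    "clf__gamma": ["scale", 0.01],
}
search = GridSearchCV(pipe, grid, cv=3, n_jobs=-1).fit(X, y)
print("best configuration:", search.best_params_)
print("best cross-validated accuracy: %.3f" % search.best_score_)
```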

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cavanaugh, J.E.; McQuarrie, A.D.; Shumway, R.H.

Conventional methods for discriminating between earthquakes and explosions at regional distances have concentrated on extracting specific features such as amplitude and spectral ratios from the waveforms of the P and S phases. We consider here an optimum nonparametric classification procedure derived from the classical approach to discriminating between two Gaussian processes with unequal spectra. Two robust variations based on the minimum discrimination information statistic and Renyi's entropy are also considered. We compare the optimum classification procedure with various amplitude and spectral ratio discriminants and show that its performance is superior when applied to a small population of 8 land-based earthquakes and 8 mining explosions recorded in Scandinavia. Several parametric characterizations of the notion of complexity based on modeling earthquakes and explosions as autoregressive or modulated autoregressive processes are also proposed and their performance compared with the nonparametric and feature extraction approaches.
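    The classical spectral discriminant that this work builds on (not its robust variants) can be sketched as follows: each class spectrum is estimated by averaging training periodograms, and a test record is assigned to the class minimising the Whittle-type spectral log-likelihood sum over frequencies of log S_k(f) + I(f)/S_k(f). Synthetic AR(1) series stand in for the seismic waveforms.

```python
# Hedged sketch of discrimination between two zero-mean Gaussian processes with
# unequal spectra, using averaged periodograms as class spectra.
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(phi, n=512):
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

def periodogram(x):
    n = len(x)
    return np.abs(np.fft.rfft(x))**2 / n

# Training sets: "earthquakes" (phi = 0.8) and "explosions" (phi = 0.2).
S_eq = np.mean([periodogram(simulate_ar1(0.8)) for _ in range(8)], axis=0) + 1e-12
S_ex = np.mean([periodogram(simulate_ar1(0.2)) for _ in range(8)], axis=0) + 1e-12

def neg_loglik(I, S):
    return np.sum(np.log(S) + I / S)

def classify(x):
    I = periodogram(x)
    return "earthquake" if neg_loglik(I, S_eq) < neg_loglik(I, S_ex) else "explosion"

tests = [("earthquake", simulate_ar1(0.8)) for _ in range(10)] + \
        [("explosion", simulate_ar1(0.2)) for _ in range(10)]
correct = sum(classify(x) == label for label, x in tests)
print(f"correctly classified {correct} of {len(tests)} test records")
```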

  4. 76 FR 13449 - Proposed Collection; Comment Request for Revenue Procedure 2009-41 (Rev. Proc. 2002-59 Is...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-11

    ... Classification Elections. DATES: Written comments should be received on or before May 10, 2011 to be assured of... Classification Elections. OMB Number: 1545-1771. Revenue Procedure Number: Revenue Procedure 2009-41. (Rev. Proc... Internal Revenue Code for an eligible entity that requests relief for a late classification election filed...

  5. Some sequential, distribution-free pattern classification procedures with applications

    NASA Technical Reports Server (NTRS)

    Poage, J. L.

    1971-01-01

    Some sequential, distribution-free pattern classification techniques are presented. The decision problem to which the proposed classification methods are applied is that of discriminating between two kinds of electroencephalogram responses recorded from a human subject: spontaneous EEG and EEG driven by a stroboscopic light stimulus at the alpha frequency. The classification procedures proposed make use of the theory of order statistics. Estimates of the probabilities of misclassification are given. The procedures were tested on Gaussian samples and the EEG responses.

  6. 12 CFR 605.502 - Program and procedures.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... procedures. (a) The Farm Credit Administration has no authority for the original classification of... classify information. (b) Derivative classification. “Derivative classification” means the incorporating... developed material consistent with the classification markings that apply to the source information...

  7. 12 CFR 605.502 - Program and procedures.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... procedures. (a) The Farm Credit Administration has no authority for the original classification of... classify information. (b) Derivative classification. “Derivative classification” means the incorporating... developed material consistent with the classification markings that apply to the source information...

  8. 12 CFR 605.502 - Program and procedures.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... procedures. (a) The Farm Credit Administration has no authority for the original classification of... classify information. (b) Derivative classification. “Derivative classification” means the incorporating... developed material consistent with the classification markings that apply to the source information...

  9. 12 CFR 605.502 - Program and procedures.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... procedures. (a) The Farm Credit Administration has no authority for the original classification of... classify information. (b) Derivative classification. “Derivative classification” means the incorporating... developed material consistent with the classification markings that apply to the source information...

  10. A comparison of unsupervised classification procedures on LANDSAT MSS data for an area of complex surface conditions in Basilicata, Southern Italy

    NASA Technical Reports Server (NTRS)

    Justice, C.; Townshend, J. (Principal Investigator)

    1981-01-01

Two unsupervised classification procedures were applied to ratioed and unratioed LANDSAT multispectral scanner data of an area of spatially complex vegetation and terrain. An objective accuracy assessment was undertaken on each classification and comparison was made of the classification accuracies. The two unsupervised procedures use the same clustering algorithm. In one procedure the entire area is clustered; in the other, a representative sample of the area is clustered and the resulting statistics are extrapolated to the remaining area using a maximum likelihood classifier. Explanation is given of the major steps in the classification procedures including image preprocessing; classification; interpretation of cluster classes; and accuracy assessment. Of the four classifications undertaken, the monocluster block approach on the unratioed data gave the highest accuracy of 80% for five coarse cover classes. This accuracy was increased to 84% by applying a 3 x 3 contextual filter to the classified image. A detailed description and partial explanation is provided for the major misclassification. The classification of the unratioed data produced higher percentage accuracies than for the ratioed data and the monocluster block approach gave higher accuracies than clustering the entire area. The monocluster block approach was additionally the most economical in terms of computing time.
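    The "cluster a representative sample, then extrapolate with a maximum likelihood classifier" procedure can be sketched as below; the KMeans/Gaussian combination and the synthetic 4-band pixels are illustrative stand-ins for the clustering algorithm and MSS data used in the study.

```python
# Sketch of cluster-then-extrapolate: an unsupervised clustering of a sample
# provides class statistics that a Gaussian maximum likelihood classifier then
# applies to the remaining pixels.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(0)
# Three spectral classes of synthetic 4-band pixels.
pixels = np.vstack([rng.normal(m, 1.0, size=(2000, 4)) for m in (0, 4, 8)])
rng.shuffle(pixels)

# Cluster only a representative sample of the scene.
sample = pixels[:600]
labels_sample = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(sample)

# Extrapolate: fit per-cluster Gaussian statistics and classify every pixel.
ml = QuadraticDiscriminantAnalysis().fit(sample, labels_sample)
labels_full = ml.predict(pixels)
print("pixels per cluster class:", np.bincount(labels_full))
```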

  11. Characterization of groups using composite kernels and multi-source fMRI analysis data: application to schizophrenia

    PubMed Central

    Castro, Eduardo; Martínez-Ramón, Manel; Pearlson, Godfrey; Sui, Jing; Calhoun, Vince D.

    2011-01-01

Pattern classification of brain imaging data can enable the automatic detection of differences in cognitive processes of specific groups of interest. Furthermore, it can also give neuroanatomical information related to the regions of the brain that are most relevant for detecting these differences by means of feature selection procedures, which are also well-suited to deal with the high dimensionality of brain imaging data. This work proposes the application of recursive feature elimination using a machine learning algorithm based on composite kernels to the classification of healthy controls and patients with schizophrenia. This framework, which evaluates nonlinear relationships between voxels, analyzes whole-brain fMRI data from an auditory task experiment that is segmented into anatomical regions and recursively eliminates the uninformative ones based on their relevance estimates, thus yielding the set of most discriminative brain areas for group classification. The collected data were processed using two analysis methods: the general linear model (GLM) and independent component analysis (ICA). GLM spatial maps as well as ICA temporal lobe and default mode component maps were then input to the classifier. A mean classification accuracy of up to 95% estimated with a leave-two-out cross-validation procedure was achieved by doing multi-source data classification. In addition, it is shown that the classification accuracy rate obtained by using multi-source data surpasses that reached by using single-source data, hence showing that this algorithm takes advantage of the complementary nature of GLM and ICA. PMID:21723948
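    A much-simplified stand-in for this approach is standard recursive feature elimination with a linear SVM over region-level features (the composite-kernel machinery is omitted), followed by cross-validated accuracy on the retained regions. The data are synthetic, and in practice the elimination should be nested inside the cross-validation rather than run once on all data as done here for brevity.

```python
# Sketch: recursive feature elimination over region-level features with a linear
# SVM, then cross-validated accuracy on the retained regions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# 60 "subjects" x 90 "anatomical regions" (e.g., region-averaged activation values).
X, y = make_classification(n_samples=60, n_features=90, n_informative=10,
                           random_state=0)

selector = RFE(SVC(kernel="linear", C=1.0), n_features_to_select=15, step=5).fit(X, y)
kept = np.where(selector.support_)[0]
print("retained regions:", kept)

acc = cross_val_score(SVC(kernel="linear", C=1.0), X[:, kept], y, cv=5).mean()
print("cross-validated accuracy on retained regions: %.2f" % acc)
```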

  12. Is There Room for Prevention? Examining the Effect of Outpatient Facility Type on the Risk of Surgical Site Infection.

    PubMed

    Parikh, Rishi; Pollock, Daniel; Sharma, Jyotirmay; Edwards, Jonathan

    2016-10-01

    OBJECTIVE We compared risk for surgical site infection (SSI) following surgical breast procedures among 2 patient groups: those whose procedures were performed in ambulatory surgery centers (ASCs) and those whose procedures were performed in hospital-based outpatient facilities. DESIGN Cohort study using National Healthcare Safety Network (NHSN) SSI data for breast procedures performed from 2010 to 2014. METHODS Unconditional multivariate logistic regression was used to examine the association between facility type and breast SSI, adjusting for American Society of Anesthesiologists (ASA) Physical Status Classification, patient age, and duration of procedure. Other potential adjustment factors examined were wound classification, anesthesia use, and gender. RESULTS Among 124,021 total outpatient breast procedures performed between 2010 and 2014, 110,987 procedure reports submitted to the NHSN provided complete covariate data and were included in the analysis. Breast procedures performed in ASCs carried a lower risk of SSI compared with those performed in hospital-based outpatient settings. For patients aged ≤51 years, the adjusted risk ratio was 0.36 (95% CI, 0.25-0.50) and for patients >51 years old, the adjusted risk ratio was 0.32 (95% CI, 0.21-0.49). CONCLUSIONS SSI risk following breast procedures was significantly lower among ASC patients than among hospital-based outpatients. These findings should be placed in the context of study limitations, including the possibility of incomplete ascertainment of SSIs and shortcomings in the data available to control for differences in patient case mix. Additional studies are needed to better understand the role of procedural settings in SSI risk following breast procedures and to identify prevention opportunities. Infect Control Hosp Epidemiol 2016;1-7.
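    The adjustment described under METHODS can be illustrated with a minimal sketch on simulated records: a logistic regression of SSI on facility type with ASA class, age and procedure duration as covariates. The variable names (asc, asa, age, duration), the simulated coefficients, and the reporting of an adjusted odds ratio (rather than the risk ratios reported in the study) are all illustrative assumptions.

```python
# Sketch of an adjusted logistic regression model for SSI risk by facility type.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "asc": rng.integers(0, 2, n),          # 1 = ambulatory surgery center
    "asa": rng.integers(1, 4, n),          # ASA physical status class
    "age": rng.normal(52, 12, n),
    "duration": rng.normal(90, 30, n),     # procedure duration in minutes
})
# Invented data-generating model for the simulated SSI outcome.
logit_p = -4.0 - 0.9 * df.asc + 0.3 * df.asa + 0.01 * (df.age - 52) + 0.005 * df.duration
df["ssi"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("ssi ~ asc + asa + age + duration", data=df).fit(disp=0)
print(model.summary())
print("adjusted odds ratio for ASC vs hospital-based:",
      round(float(np.exp(model.params["asc"])), 2))
```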

  13. Canonical Measure of Correlation (CMC) and Canonical Measure of Distance (CMD) between sets of data. Part 3. Variable selection in classification.

    PubMed

    Ballabio, Davide; Consonni, Viviana; Mauri, Andrea; Todeschini, Roberto

    2010-01-11

In multivariate regression and classification problems, variable selection is an important procedure used to select an optimal subset of variables, with the aim of producing more parsimonious and eventually more predictive models. Variable selection is often necessary when dealing with methodologies that produce thousands of variables, such as Quantitative Structure-Activity Relationships (QSARs) and highly dimensional analytical procedures. In this paper a novel method for variable selection for classification purposes is introduced. This method exploits the recently proposed Canonical Measure of Correlation between two sets of variables (CMC index). The CMC index is in this case calculated for two specific sets of variables, the former comprising the independent variables and the latter the unfolded class matrix. The CMC values, calculated by considering one variable at a time, can be sorted, yielding a ranking of the variables on the basis of their class discrimination capabilities. Alternatively, the CMC index can be calculated for all the possible combinations of variables and the variable subset with the maximal CMC can be selected, but this procedure is computationally more demanding and the classification performance of the selected subset is not always the best. The effectiveness of the CMC index in selecting variables with discriminative ability was compared with that of other well-known strategies for variable selection, such as Wilks' Lambda, the VIP index based on Partial Least Squares-Discriminant Analysis, and the selection provided by classification trees. A variable Forward Selection based on the CMC index was finally used in conjunction with Linear Discriminant Analysis. This approach was tested on several chemical data sets. The results obtained were encouraging.
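    A simplified stand-in for the single-variable ranking can be sketched as follows: for one variable against the unfolded (one-hot) class matrix, the first canonical correlation reduces to the multiple correlation of that variable with the class indicators, obtained here from the R² of an ordinary least-squares fit. This is an illustration of the ranking idea, not the published CMC implementation; the data are synthetic.

```python
# Rank variables by their canonical correlation with the unfolded class matrix
# (for a single variable this equals the multiple correlation coefficient).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LinearRegression

X, y = make_classification(n_samples=200, n_features=30, n_informative=5,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
C = np.eye(y.max() + 1)[y]   # unfolded (one-hot) class matrix

def cmc_like(xj, C):
    """Canonical correlation of one variable with the class matrix (= multiple R)."""
    r2 = LinearRegression().fit(C, xj).score(C, xj)
    return np.sqrt(max(r2, 0.0))

scores = np.array([cmc_like(X[:, j], C) for j in range(X.shape[1])])
ranking = np.argsort(scores)[::-1]
print("top 5 variables by class discrimination ability:", ranking[:5])
```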

  14. Completion of a Liver Surgery Complexity Score and Classification Based on an International Survey of Experts.

    PubMed

    Lee, Major K; Gao, Feng; Strasberg, Steven M

    2016-08-01

    Liver resections have classically been distinguished as "minor" or "major," based on number of segments removed. This is flawed because the number of segments resected alone does not convey the complexity of a resection. We recently developed a 3-tiered classification for the complexity of liver resections based on utility weighting by experts. This study aims to complete the earlier classification and to illustrate its application. Two surveys were administered to expert liver surgeons. Experts were asked to rate the difficulty of various open liver resections on a scale of 1 to 10. Statistical methods were then used to develop a complexity score for each procedure. Sixty-six of 135 (48.9%) surgeons responded to the earlier survey, and 66 of 122 (54.1%) responded to the current survey. In all, 19 procedures were rated. The lowest mean score of 1.36 (indicating least difficult) was given to peripheral wedge resection. Right hepatectomy with IVC reconstruction was deemed most difficult, with a score of 9.35. Complexity scores were similar for 9 procedures present in both surveys. Caudate resection, hepaticojejunostomy, and vascular reconstruction all increased the complexity of standard resections significantly. These data permit quantitative assessment of the difficulty of a variety of liver resections. The complexity scores generated allow for separation of liver resections into 3 categories of complexity (low complexity, medium complexity, and high complexity) on a quantitative basis. This provides a more accurate representation of the complexity of procedures in comparative studies. Copyright © 2016 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  15. A remote sensing based vegetation classification logic for global land cover analysis

    USGS Publications Warehouse

    Running, Steven W.; Loveland, Thomas R.; Pierce, Lars L.; Nemani, R.R.; Hunt, E. Raymond

    1995-01-01

    This article proposes a simple new logic for classifying global vegetation. The critical features of this classification are that 1) it is based on simple, observable, unambiguous characteristics of vegetation structure that are important to ecosystem biogeochemistry and can be measured in the field for validation, 2) the structural characteristics are remotely sensible so that repeatable and efficient global reclassifications of existing vegetation will be possible, and 3) the defined vegetation classes directly translate into the biophysical parameters of interest by global climate and biogeochemical models. A first test of this logic for the continental United States is presented based on an existing 1 km AVHRR normalized difference vegetation index database. Procedures for solving critical remote sensing problems needed to implement the classification are discussed. Also, some inferences from this classification to advanced vegetation biophysical variables such as specific leaf area and photosynthetic capacity useful to global biogeochemical modeling are suggested.

  16. Analysis of verbal communication during teaching in the operating room and the potentials for surgical training.

    PubMed

    Blom, E M; Verdaasdonk, E G G; Stassen, L P S; Stassen, H G; Wieringa, P A; Dankelman, J

    2007-09-01

    Verbal communication in the operating room during surgical procedures affects team performance, reflects individual skills, and is related to the complexity of the operation process. During the procedural training of surgeons (residents), feedback and guidance is given through verbal communication. A classification method based on structural analysis of the contents was developed to analyze verbal communication. This study aimed to evaluate whether a classification method for the contents of verbal communication in the operating room could provide insight into the teaching processes. Eight laparoscopic cholecystectomies were videotaped. Two entire cholecystectomies and the dissection phase of six additional procedures were analyzed by categorization of the communication in terms of type (4 categories: commanding, explaining, questioning, and miscellaneous) and content (9 categories: operation method, location, direction, instrument handling, visualization, anatomy and pathology, general, private, undefinable). The operation was divided into six phases: start, dissection, clipping, separating, control, closing. Classification of the communication during two entire procedures showed that each phase of the operation was dominated by different kinds of communication. A high percentage of explaining anatomy and pathology was found throughout the whole procedure except for the control and closing phases. In the dissection phases, 60% of verbal communication concerned explaining. These explaining communication events were divided as follows: 27% operation method, 19% anatomy and pathology, 25% location (positioning of the instrument-tissue interaction), 15% direction (direction of tissue manipulation), 11% instrument handling, and 3% other nonclassified instructions. The proposed classification method is feasible for analyzing verbal communication during surgical procedures. Communication content objectively reflects the interaction between surgeon and resident. This information can potentially be used to specify training needs, and may contribute to the evaluation of different training methods.

  17. Multiple Spectral-Spatial Classification Approach for Hyperspectral Data

    NASA Technical Reports Server (NTRS)

    Tarabalka, Yuliya; Benediktsson, Jon Atli; Chanussot, Jocelyn; Tilton, James C.

    2010-01-01

A new multiple classifier approach for spectral-spatial classification of hyperspectral images is proposed. Several classifiers are used independently to classify an image. For every pixel, if all the classifiers have assigned this pixel to the same class, the pixel is kept as a marker, i.e., a seed of the spatial region, with the corresponding class label. We propose to use spectral-spatial classifiers at the preliminary step of the marker selection procedure, each of them combining the results of a pixel-wise classification and a segmentation map. Different segmentation methods based on dissimilar principles lead to different classification results. Furthermore, a minimum spanning forest is built, where each tree is rooted on a classification-driven marker and forms a region in the spectral-spatial classification map. Experimental results are presented for two hyperspectral airborne images. The proposed method significantly improves classification accuracies, when compared to previously proposed classification techniques.
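    The marker-selection step alone can be sketched as below: several independently trained classifiers label each pixel, and pixels on which all of them agree are kept as markers. The segmentation and minimum-spanning-forest stages of the paper are not reproduced, and the synthetic "pixels" merely stand in for hyperspectral data.

```python
# Sketch of marker selection by unanimous agreement among independent classifiers.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=10, n_informative=6,
                           n_classes=4, random_state=0)
Xtr, Xpx, ytr, _ = train_test_split(X, y, test_size=0.8, random_state=0)

classifiers = [
    SVC().fit(Xtr, ytr),
    RandomForestClassifier(random_state=0).fit(Xtr, ytr),
    LogisticRegression(max_iter=1000).fit(Xtr, ytr),
]
preds = np.stack([clf.predict(Xpx) for clf in classifiers])   # (n_classifiers, n_pixels)
agree = np.all(preds == preds[0], axis=0)                     # unanimous pixels
markers = np.where(agree)[0]
print(f"{len(markers)} of {len(Xpx)} pixels kept as markers "
      f"({100 * agree.mean():.1f}% unanimous agreement)")
```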

  18. 75 FR 9955 - Labor Surplus Area Classification Under Executive Orders 12073 and 10582

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... DEPARTMENT OF LABOR Employment and Training Administration Labor Surplus Area Classification Under Executive Orders 12073 and 10582 AGENCY: Employment and Training Administration, Labor. ACTION: Notice... supplementary, eligibility, classification procedures and petition for exceptional circumstances procedure...

  19. Privacy Act System of Records: Federal Lead-Based Paint Program System of Records, EPA-54

    EPA Pesticide Factsheets

    Learn about the Federal Lead-Based Paint Program System of Records (FLPPSOR), including the security classification, individuals covered by the system, categories of records, routine uses of the records, and other security procedures.

  20. Shadow detection and removal in RGB VHR images for land use unsupervised classification

    NASA Astrophysics Data System (ADS)

    Movia, A.; Beinat, A.; Crosilla, F.

    2016-09-01

Nowadays, high resolution aerial images are widely available thanks to the diffusion of advanced technologies such as UAVs (Unmanned Aerial Vehicles) and new satellite missions. Although these developments offer new opportunities for accurate land use analysis and change detection, cloud and terrain shadows actually limit benefits and possibilities of modern sensors. Focusing on the problem of shadow detection and removal in VHR color images, the paper proposes new solutions and analyses how they can enhance common unsupervised classification procedures for identifying land use classes related to the CO2 absorption. To this aim, an improved fully automatic procedure has been developed for detecting image shadows using exclusively RGB color information, and avoiding user interaction. Results show a significant accuracy enhancement with respect to similar methods using RGB-based indices. Furthermore, novel solutions derived from Procrustes analysis have been applied to remove shadows and restore brightness in the images. In particular, two methods implementing the so-called "anisotropic Procrustes" and the "not-centered oblique Procrustes" algorithms have been developed and compared with the linear correlation correction method based on the Cholesky decomposition. To assess how shadow removal can enhance unsupervised classifications, results obtained with classical methods such as k-means, maximum likelihood, and self-organizing maps, have been compared to each other and with a supervised clustering procedure.
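    The linear correlation correction baseline mentioned above can be sketched as follows, assuming two already-identified pixel sets (sunlit and shadowed): the shadow pixels are whitened with the Cholesky factor of their own RGB covariance, re-coloured with that of the sunlit pixels, and shifted to the sunlit mean. The arrays below are random placeholders for real image regions, and this is only a sketch of the baseline, not of the Procrustes-based methods proposed in the paper.

```python
# Sketch of Cholesky-based linear correction of shadowed RGB pixels.
import numpy as np

rng = np.random.default_rng(0)
sunlit = rng.normal([150, 140, 120], [20, 18, 15], size=(5000, 3))   # lit RGB pixels
shadow = rng.normal([60, 55, 50], [10, 9, 8], size=(3000, 3))        # shadowed RGB pixels

mu_s, mu_l = shadow.mean(axis=0), sunlit.mean(axis=0)
L_s = np.linalg.cholesky(np.cov(shadow, rowvar=False))
L_l = np.linalg.cholesky(np.cov(sunlit, rowvar=False))

# x_corrected = mu_l + L_l @ inv(L_s) @ (x - mu_s): matches mean and covariance.
T = L_l @ np.linalg.inv(L_s)
restored = (shadow - mu_s) @ T.T + mu_l

print("restored mean:", restored.mean(axis=0).round(1))
print("sunlit mean:  ", mu_l.round(1))
```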

  1. Classification of longitudinal data through a semiparametric mixed-effects model based on lasso-type estimators.

    PubMed

    Arribas-Gil, Ana; De la Cruz, Rolando; Lebarbier, Emilie; Meza, Cristian

    2015-06-01

We propose a classification method for longitudinal data. The Bayes classifier is classically used to determine a classification rule where the underlying density in each class needs to be well modeled and estimated. This work is motivated by a real dataset of hormone levels measured at the early stages of pregnancy that can be used to predict normal versus abnormal pregnancy outcomes. The proposed model, which is a semiparametric linear mixed-effects model (SLMM), is a particular case of the semiparametric nonlinear mixed-effects class of models (SNMM) in which finite-dimensional (fixed effects and variance components) and infinite-dimensional (an unknown function) parameters have to be estimated. In SNMMs, maximum likelihood estimation is performed iteratively, alternating parametric and nonparametric procedures. However, if one can make the assumption that the random effects and the unknown function interact in a linear way, more efficient estimation methods can be used. Our contribution is the proposal of a unified estimation procedure based on a penalized EM-type algorithm. The Expectation and Maximization steps are explicit. In this latter step, the unknown function is estimated in a nonparametric fashion using a lasso-type procedure. A simulation study and an application on real data are performed. © 2015, The International Biometric Society.

  2. Auditory “bubbles”: Efficient classification of the spectrotemporal modulations essential for speech intelligibility

    PubMed Central

    Venezia, Jonathan H.; Hickok, Gregory; Richards, Virginia M.

    2016-01-01

    Speech intelligibility depends on the integrity of spectrotemporal patterns in the signal. The current study is concerned with the speech modulation power spectrum (MPS), which is a two-dimensional representation of energy at different combinations of temporal and spectral (i.e., spectrotemporal) modulation rates. A psychophysical procedure was developed to identify the regions of the MPS that contribute to successful reception of auditory sentences. The procedure, based on the two-dimensional image classification technique known as “bubbles” (Gosselin and Schyns (2001). Vision Res. 41, 2261–2271), involves filtering (i.e., degrading) the speech signal by removing parts of the MPS at random, and relating filter patterns to observer performance (keywords identified) over a number of trials. The result is a classification image (CImg) or “perceptual map” that emphasizes regions of the MPS essential for speech intelligibility. This procedure was tested using normal-rate and 2×-time-compressed sentences. The results indicated: (a) CImgs could be reliably estimated in individual listeners in relatively few trials, (b) CImgs tracked changes in spectrotemporal modulation energy induced by time compression, though not completely, indicating that “perceptual maps” deviated from physical stimulus energy, and (c) the bubbles method captured variance in intelligibility not reflected in a common modulation-based intelligibility metric (spectrotemporal modulation index or STMI). PMID:27586738

  3. Object-Based Point Cloud Analysis of Full-Waveform Airborne Laser Scanning Data for Urban Vegetation Classification

    PubMed Central

    Rutzinger, Martin; Höfle, Bernhard; Hollaus, Markus; Pfeifer, Norbert

    2008-01-01

Airborne laser scanning (ALS) is a remote sensing technique well-suited for 3D vegetation mapping and structure characterization because the emitted laser pulses are able to penetrate small gaps in the vegetation canopy. The backscattered echoes from the foliage, woody vegetation, the terrain, and other objects are detected, leading to a cloud of points. Higher echo densities (>20 echoes/m²) and additional classification variables from full-waveform (FWF) ALS data, namely echo amplitude, echo width and information on multiple echoes from one shot, offer new possibilities in classifying the ALS point cloud. Currently, FWF sensor information is hardly used for classification purposes. This contribution presents an object-based point cloud analysis (OBPA) approach, combining segmentation and classification of the 3D FWF ALS points designed to detect tall vegetation in urban environments. The definition of tall vegetation includes trees and shrubs but excludes grassland and herbage. In the applied procedure, FWF ALS echoes are segmented by a seeded region growing procedure. All echoes, sorted in descending order of surface roughness, are used as seed points. Segments are grown based on echo width homogeneity. Next, segment statistics (mean, standard deviation, and coefficient of variation) are calculated by aggregating echo features such as amplitude and surface roughness. For classification, a rule base is derived automatically from a training area using a statistical classification tree. To demonstrate our method, we present data from three sites with around 500,000 echoes each. The accuracy of the classified vegetation segments is evaluated for two independent validation sites. In a point-wise error assessment, where the classification is compared with manually classified 3D points, completeness and correctness better than 90% are reached for the validation sites. In comparison to many other algorithms, the proposed 3D point classification works on the original measurements directly, i.e. the acquired points. Gridding of the data, a process inherently coupled with loss of data and precision, is not necessary. The 3D properties provide especially good separability of building and terrain points, respectively, even when they are occluded by vegetation. PMID:27873771

  4. Possible world based consistency learning model for clustering and classifying uncertain data.

    PubMed

    Liu, Han; Zhang, Xianchao; Zhang, Xiaotong

    2018-06-01

The possible-world approach has been shown to be effective for handling various types of data uncertainty in uncertain data management. However, few uncertain data clustering and classification algorithms have been proposed based on possible worlds. Moreover, existing possible-world-based algorithms suffer from the following issues: (1) they deal with each possible world independently and ignore the consistency principle across different possible worlds; (2) they require an extra post-processing procedure to obtain the final result, so their effectiveness relies heavily on the post-processing method and their efficiency is also limited. In this paper, we propose a novel possible world based consistency learning model for uncertain data, which can be extended to both clustering and classification of uncertain data. This model utilizes the consistency principle to learn a consensus affinity matrix for uncertain data, which can make full use of the information across different possible worlds and then improve the clustering and classification performance. Meanwhile, this model imposes a new rank constraint on the Laplacian matrix of the consensus affinity matrix, thereby ensuring that the number of connected components in the consensus affinity matrix is exactly equal to the number of classes. This also means that the clustering and classification results can be directly obtained without any post-processing procedure. Furthermore, for the clustering and classification tasks, we derive efficient optimization methods to solve the proposed model. Experimental results on real benchmark datasets and real world uncertain datasets show that the proposed model outperforms the state-of-the-art uncertain data clustering and classification algorithms in effectiveness and performs competitively in efficiency. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Artificial neural network classification using a minimal training set - Comparison to conventional supervised classification

    NASA Technical Reports Server (NTRS)

    Hepner, George F.; Logan, Thomas; Ritter, Niles; Bryant, Nevin

    1990-01-01

Recent research has shown an artificial neural network (ANN) to be capable of pattern recognition and the classification of image data. This paper examines the potential for the application of neural network computing to satellite image processing. A second objective is to provide a preliminary comparison of conventional and ANN classification. An artificial neural network can be trained to do land-cover classification of satellite imagery using selected sites representative of each class in a manner similar to conventional supervised classification. One of the major problems associated with recognition and classification of patterns from remotely sensed data is the time and cost of developing a set of training sites. This research compares the use of an ANN back-propagation classification procedure with a conventional supervised maximum likelihood classification procedure using a minimal training set. When using a minimal training set, the neural network is able to provide a land-cover classification superior to the classification derived from the conventional classification procedure. This research is the foundation for developing application parameters for further prototyping of software and hardware implementations for artificial neural networks in satellite image and geographic information processing.
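    The comparison described above can be sketched on synthetic multispectral pixels: a small back-propagation network and a Gaussian maximum likelihood classifier are both trained on a deliberately minimal training set and evaluated on held-out pixels. The band means, class structure, and training-set size are illustrative choices, not those of the study.

```python
# Sketch: back-propagation neural network vs. Gaussian maximum likelihood
# classification with a minimal training set, on synthetic 4-band pixels.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
means = np.array([[40, 60, 50, 80], [70, 55, 90, 40], [55, 80, 60, 65]], float)
X = np.vstack([rng.normal(m, 6.0, size=(1000, 4)) for m in means])
y = np.repeat([0, 1, 2], 1000)

# Minimal training set: only 15 pixels per class.
Xtr, Xte, ytr, yte = train_test_split(X, y, train_size=45, stratify=y, random_state=0)

ann = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(Xtr, ytr)
mlc = QuadraticDiscriminantAnalysis().fit(Xtr, ytr)   # Gaussian maximum likelihood

print("ANN accuracy on held-out pixels: %.3f" % ann.score(Xte, yte))
print("ML  accuracy on held-out pixels: %.3f" % mlc.score(Xte, yte))
```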

  6. Maxillectomy defects: a suggested classification scheme.

    PubMed

    Akinmoladun, V I; Dosumu, O O; Olusanya, A A; Ikusika, O F

    2013-06-01

    The term "maxillectomy" has been used to describe a variety of surgical procedures for a spectrum of diseases involving a diverse anatomical site. Hence, classifications of maxillectomy defects have often made communication difficult. This article highlights this problem, emphasises the need for a uniform system of classification and suggests a classification system which is simple and comprehensive. Articles related to this subject, especially those with specified classifications of maxillary surgical defects were sourced from the internet through Google, Scopus and PubMed using the search terms maxillectomy defects classification. A manual search through available literature was also done. The review of the materials revealed many classifications and modifications of classifications from the descriptive, reconstructive and prosthodontic perspectives. No globally acceptable classification exists among practitioners involved in the management of diseases in the mid-facial region. There were over 14 classifications of maxillary defects found in the English literature. Attempts made to address the inadequacies of previous classifications have tended to result in cumbersome and relatively complex classifications. A single classification that is based on both surgical and prosthetic considerations is most desirable and is hereby proposed.

  7. Upper Kalamazoo watershed land cover inventory. [based on remote sensing

    NASA Technical Reports Server (NTRS)

    Richason, B., III; Enslin, W.

    1973-01-01

    Approximately 1000 square miles of the eastern portion of the watershed were inventoried based on remote sensing imagery. The classification scheme, imagery and interpretation procedures, and a cost analysis are discussed. The distributions of land cover within the area are tabulated.

  8. Parametric estimates for the receiver operating characteristic curve generalization for non-monotone relationships.

    PubMed

    Martínez-Camblor, Pablo; Pardo-Fernández, Juan C

    2017-01-01

Diagnostic procedures are based on establishing certain conditions and then checking if those conditions are satisfied by a given individual. When the diagnostic procedure is based on a continuous marker, this is equivalent to fixing a region or classification subset and then checking whether the observed value of the marker belongs to that region. The receiver operating characteristic curve is a valuable and popular tool to study and compare the diagnostic ability of a given marker. In addition, the area under the receiver operating characteristic curve is frequently used as an index of global discrimination ability. This paper revises and widens the scope of the receiver operating characteristic curve definition by setting the classification subsets on which the final decision is based in the spotlight of the analysis. We revise the definition of the receiver operating characteristic curve in terms of particular classes of classification subsets and then focus on a receiver operating characteristic curve generalization for situations in which both low and high values of the marker are associated with a higher probability of having the studied characteristic. Parametric and non-parametric estimators of the receiver operating characteristic curve generalization are investigated. Monte Carlo studies and real data examples illustrate their practical performance.

  9. Reconstruction Using Locoregional Flaps for Large Skull Base Defects.

    PubMed

    Hatano, Takaharu; Motomura, Hisashi; Ayabe, Shinobu

    2015-06-01

We present a modified locoregional flap for the reconstruction of large anterior skull base defects that should be reconstructed with a free flap according to Yano's algorithm. No classification of skull base defects had been proposed for a long time. Yano et al suggested a new classification in 2012. The Ib defect of Yano's classification extends horizontally from the cribriform plate to the orbital roof. According to Yano's algorithm for subsequent skull base reconstructive procedures, an Ib defect should be reconstructed with a free flap such as an anterolateral thigh free flap or rectus abdominis myocutaneous free flap. However, our modified locoregional flap has also enabled reconstruction of Ib defects. In this case series, we used a locoregional flap for Ib defects. No major postoperative complications occurred. We present our modified locoregional flap that enables reconstruction of Ib defects.

  10. Structural knowledge learning from maps for supervised land cover/use classification: Application to the monitoring of land cover/use maps in French Guiana

    NASA Astrophysics Data System (ADS)

    Bayoudh, Meriam; Roux, Emmanuel; Richard, Gilles; Nock, Richard

    2015-03-01

    The number of satellites and sensors devoted to Earth observation has grown considerably, delivering extensive data, especially images. At the same time, access to such data and to the tools needed to process them has improved considerably. In the presence of such a data flow, automatic image interpretation methods are needed, especially for the monitoring and prediction of environmental and societal changes in highly dynamic socio-environmental contexts. This can be accomplished via artificial intelligence. The approach described here relies on the induction of classification rules that explicitly take into account structural knowledge, using Aleph, an Inductive Logic Programming (ILP) system, combined with a multi-class classification procedure. This methodology was used to monitor changes in land cover/use of the French Guiana coastline. One hundred and fifty-eight classification rules were induced from 3 diachronic land cover/use maps comprising 38 classes. These rules are expressed in first-order logic, which makes them easily understandable by non-experts. A 10-fold cross-validation gave significant average values of 84.62%, 99.57% and 77.22% for classification accuracy, specificity and sensitivity, respectively. Our methodology could be beneficial for automatically classifying new objects and for facilitating object-based classification procedures.
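    The following sketch reproduces only the cross-validation bookkeeping reported above (fold-averaged accuracy, specificity and sensitivity); a decision tree and a synthetic feature table stand in for the ILP-induced rules and the land cover/use maps, which are not reproduced here.

      # 10-fold cross-validation with macro-averaged accuracy, specificity
      # and sensitivity; the classifier and data are stand-ins.
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import StratifiedKFold
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.metrics import confusion_matrix

      X, y = make_classification(n_samples=600, n_features=12, n_informative=8,
                                 n_classes=4, n_clusters_per_class=1, random_state=1)

      acc, spec, sens = [], [], []
      for tr, te in StratifiedKFold(n_splits=10, shuffle=True, random_state=1).split(X, y):
          clf = DecisionTreeClassifier(random_state=1).fit(X[tr], y[tr])
          cm = confusion_matrix(y[te], clf.predict(X[te]))
          tp = np.diag(cm)
          fn = cm.sum(axis=1) - tp
          fp = cm.sum(axis=0) - tp
          tn = cm.sum() - tp - fn - fp
          acc.append(tp.sum() / cm.sum())
          sens.append(np.mean(tp / (tp + fn)))   # macro-averaged sensitivity
          spec.append(np.mean(tn / (tn + fp)))   # macro-averaged specificity

      print(f"accuracy {np.mean(acc):.3f}  specificity {np.mean(spec):.3f}  "
            f"sensitivity {np.mean(sens):.3f}")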

  11. Laboratory evaluation of detectors of explosives' effluents

    DOT National Transportation Integrated Search

    1972-11-30

    This document contains the classification, technical description and laboratory evaluation of five commercial detectors for explosives' effluents. It includes an outline of operating principles, test and evaluation procedures. The evaluation is based...

  12. Perspectives on Machine Learning for Classification of Schizotypy Using fMRI Data.

    PubMed

    Madsen, Kristoffer H; Krohne, Laerke G; Cai, Xin-Lu; Wang, Yi; Chan, Raymond C K

    2018-03-15

    Functional magnetic resonance imaging is capable of estimating functional activation and connectivity in the human brain, and lately there has been increased interest in the use of these functional modalities combined with machine learning for the identification of psychiatric traits. While these methods bear great potential for early diagnosis and a better understanding of disease processes, there is a wide range of processing choices and pitfalls that may severely hamper interpretation and generalization performance unless carefully considered. In this perspective article, we aim to motivate the use of machine learning in schizotypy research. To this end, we describe common data processing steps while commenting on best practices and procedures. First, we outline the important role of schizotypy to motivate the need for reliable classification, and summarize the existing machine learning literature on schizotypy. Then, we describe procedures for the extraction of features based on fMRI data, including statistical parametric mapping, parcellation, complex network analysis, and decomposition methods, as well as classification with a special focus on support vector classification and deep learning. We provide more detailed descriptions and software as supplementary material. Finally, we present current challenges in machine learning for the classification of schizotypy and comment on future trends and perspectives.
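    A minimal sketch of the support vector classification step mentioned above, using synthetic "connectivity" features and keeping the scaling inside the cross-validation folds to avoid the leakage pitfall the article warns about; this is an illustration under assumed data, not the authors' pipeline.

      # Linear SVM on subject-level features, with preprocessing kept inside
      # the cross-validation folds via a pipeline.
      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC
      from sklearn.model_selection import StratifiedKFold, cross_val_score

      rng = np.random.default_rng(0)
      n_sub, n_feat = 80, 200                   # subjects x connectivity features
      X = rng.normal(size=(n_sub, n_feat))
      y = np.repeat([0, 1], n_sub // 2)         # e.g. low vs. high schizotypy
      X[y == 1, :10] += 0.8                     # weak group effect on 10 features

      clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
      cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
      scores = cross_val_score(clf, X, y, cv=cv)
      print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")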

  13. Molecular classification of pesticides including persistent organic pollutants, phenylurea and sulphonylurea herbicides.

    PubMed

    Torrens, Francisco; Castellano, Gloria

    2014-06-05

    Pesticide residues in wine were analyzed by liquid chromatography-tandem mass spectrometry, and the retentions were modelled by structure-property relationships. Bioplastic evolution is an evolutionary perspective that conjugates the effect of acquired characters with the principles of evolutionary indeterminacy, morphological determination and natural selection; its application to the design of a co-ordination index barely improves the correlations. Fractal dimensions and the partition coefficient differentiate the pesticides. The classification algorithms are based on information entropy and its production. The pesticides allow a structural classification by nonplanarity, by the number of O, S, N and Cl atoms and by the number of cycles; different behaviours depend on the number of cycles. The novelty of the approach is that the structural parameters are related to the retentions. When the procedures are applied to moderate-sized sets, an excessive number of results appears, compatible with the data suffering a combinatorial explosion; however, the equipartition conjecture selects the criterion resulting from classification between hierarchical trees. Information entropy permits classifying the compounds in agreement with principal component analyses. The periodic classification shows that pesticides in the same group present similar properties, while those also in the same period show maximum resemblance. The advantage of the classification is that it predicts the retentions of molecules not included in the categorization. The classification extends to phenyl/sulphonylureas, and the application will be to predict their retentions.

  14. The Use of the International Classification of Functioning, Disability and Health, Version for Children and Youth (ICF-CY), in Portuguese Special Education Assessment and Eligibility Procedures: The Professionals' Perceptions

    ERIC Educational Resources Information Center

    Sanches-Ferreira, Manuela; Silveira-Maia, Mónica; Alves, Sílvia

    2014-01-01

    Portugal was the first country to decree the mandatory use of the International Classification of Functioning, Disability and Health: Child and Youth (ICF-CY) framework for guiding the special education assessment process and for basing eligibility decision-making on students' functioning profiles--in contrast with traditional approaches centred on…

  15. Intrinsic Remediation Engineering Evaluation/Cost Analysis for Car Care Center at Bolling Air Force Base, Washington, DC

    DTIC Science & Technology

    1997-01-01

    supplemented using established literature values for similar aquifer materials. The groundwater sampling activities and analytical results from both... subsurface materials recovered. Observed soil classification types compared very favorably to the soil classifications determined by the CPT tests. ... Other similar substances were handled in a manner consistent with accepted safety procedures and standard operating practices. Well completion materials

  16. Automatic classification of scar tissue in late gadolinium enhancement cardiac MRI for the assessment of left-atrial wall injury after radiofrequency ablation

    PubMed Central

    Morris, Alan; Burgon, Nathan; McGann, Christopher; MacLeod, Robert; Cates, Joshua

    2013-01-01

    Radiofrequency ablation is a promising procedure for treating atrial fibrillation (AF) that relies on accurate lesion delivery in the left atrial (LA) wall for success. Late Gadolinium Enhancement MRI (LGE MRI) at three months post-ablation has proven effective for noninvasive assessment of the location and extent of scar formation, which are important factors for predicting patient outcome and planning of redo ablation procedures. We have developed an algorithm for automatic classification in LGE MRI of scar tissue in the LA wall and have evaluated accuracy and consistency compared to manual scar classifications by expert observers. Our approach clusters voxels based on normalized intensity and was chosen through a systematic comparison of the performance of multivariate clustering on many combinations of image texture. Algorithm performance was determined by overlap with ground truth, using multiple overlap measures, and the accuracy of the estimation of the total amount of scar in the LA. Ground truth was determined using the STAPLE algorithm, which produces a probabilistic estimate of the true scar classification from multiple expert manual segmentations. Evaluation of the ground truth data set was based on both inter- and intra-observer agreement, with variation among expert classifiers indicating the difficulty of scar classification for a given dataset. Our proposed automatic scar classification algorithm performs well for both scar localization and estimation of scar volume: for ground truth datasets considered easy, variability from the ground truth was low; for those considered difficult, variability from ground truth was on par with the variability across experts. PMID:24236224
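    A toy sketch of the two ingredients described above: clustering voxels on normalized intensity (here plain k-means on synthetic one-dimensional intensities, standing in for the multivariate clustering actually used) and scoring the result against a reference labelling with a Dice overlap.

      # Cluster wall voxels on normalized intensity, call the brighter cluster
      # "scar", and compare to a reference labelling with a Dice coefficient.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(42)
      normal = rng.normal(0.2, 0.08, 800)       # normalized LA-wall intensity
      scar = rng.normal(0.7, 0.10, 200)
      intensity = np.concatenate([normal, scar])
      truth = np.concatenate([np.zeros(800, bool), np.ones(200, bool)])

      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
          intensity.reshape(-1, 1))
      # the higher-intensity cluster is taken as scar
      scar_cluster = labels == np.argmax([intensity[labels == k].mean() for k in (0, 1)])

      dice = 2 * (scar_cluster & truth).sum() / (scar_cluster.sum() + truth.sum())
      print(f"Dice overlap with reference scar labelling: {dice:.3f}")
      print(f"estimated scar fraction: {scar_cluster.mean():.2%} (true 20%)")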

  17. Automatic classification of scar tissue in late gadolinium enhancement cardiac MRI for the assessment of left-atrial wall injury after radiofrequency ablation

    NASA Astrophysics Data System (ADS)

    Perry, Daniel; Morris, Alan; Burgon, Nathan; McGann, Christopher; MacLeod, Robert; Cates, Joshua

    2012-03-01

    Radiofrequency ablation is a promising procedure for treating atrial fibrillation (AF) that relies on accurate lesion delivery in the left atrial (LA) wall for success. Late Gadolinium Enhancement MRI (LGE MRI) at three months post-ablation has proven effective for noninvasive assessment of the location and extent of scar formation, which are important factors for predicting patient outcome and planning of redo ablation procedures. We have developed an algorithm for automatic classification in LGE MRI of scar tissue in the LA wall and have evaluated accuracy and consistency compared to manual scar classifications by expert observers. Our approach clusters voxels based on normalized intensity and was chosen through a systematic comparison of the performance of multivariate clustering on many combinations of image texture. Algorithm performance was determined by overlap with ground truth, using multiple overlap measures, and the accuracy of the estimation of the total amount of scar in the LA. Ground truth was determined using the STAPLE algorithm, which produces a probabilistic estimate of the true scar classification from multiple expert manual segmentations. Evaluation of the ground truth data set was based on both inter- and intra-observer agreement, with variation among expert classifiers indicating the difficulty of scar classification for a given dataset. Our proposed automatic scar classification algorithm performs well for both scar localization and estimation of scar volume: for ground truth datasets considered easy, variability from the ground truth was low; for those considered difficult, variability from ground truth was on par with the variability across experts.

  18. 32 CFR 1700.8 - Action on the request.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... recommendations, ODNI staff shall be guided by the procedures specified in § 1700.10 regarding confidential...) When the withholding is based in whole or in part on a security classification, the explanation shall...

  19. 32 CFR 1700.8 - Action on the request.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... recommendations, ODNI staff shall be guided by the procedures specified in § 1700.10 regarding confidential...) When the withholding is based in whole or in part on a security classification, the explanation shall...

  20. A revisitation of TRIX for trophic status assessment in the light of the European Water Framework Directive: application to Italian coastal waters.

    PubMed

    Pettine, Maurizio; Casentini, Barbara; Fazi, Stefano; Giovanardi, Franco; Pagnotta, Romano

    2007-09-01

    The trophic status classification of coastal waters at the European scale requires the availability of harmonised indicators and procedures. The composite trophic status index (TRIX) provides useful metrics for the assessment of the trophic status of coastal waters. It was originally developed for Italian coastal waters and then applied in many European seas (Adriatic, Tyrrhenian, Baltic, Black and Northern seas). The TRIX index does not comply with the classification procedure suggested by the WFD for two reasons: (a) it is based on an absolute trophic scale without any normalization to type-specific reference conditions; (b) it makes an ex ante aggregation of biological (Chl-a) and physico-chemical (oxygen, nutrients) quality elements, instead of an ex post integration of separate evaluations of biological and subsequent chemical quality elements. A revisitation of the TRIX index in the light of the European Water Framework Directive (WFD, 2000/60/EC) and new TRIX-derived tools are presented in this paper. A number of Italian coastal sites were grouped into different types based on a thorough analysis of their hydro-morphological conditions, and type-specific reference sites were selected. Unscaled TRIX values (UNTRIX) for reference and impacted sites have been calculated, and two alternative UNTRIX-based classification procedures are discussed. The proposed procedures, to be validated on a broader scale, provide users with simple tools that give an integrated view of nutrient enrichment and its effects on algal biomass (Chl-a) and on oxygen levels. This trophic evaluation, along with phytoplankton indicator species and algal blooms, contributes to the comprehensive assessment of phytoplankton, one of the biological quality elements in coastal waters.
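    The sketch below only illustrates the unscaled-aggregation idea behind UNTRIX, i.e. aggregating the logarithms of the four quality elements and comparing a site against the distribution at type-specific reference sites; the published TRIX coefficients, exact variables and class boundaries are not reproduced, and every number in the sketch is a placeholder.

      # Illustrative "unscaled" trophic aggregation and comparison with
      # hypothetical type-specific reference sites.  All values are placeholders.
      import numpy as np

      def untrix_like(chl_a, abs_do_dev, din, tp):
          """log10 aggregation of chlorophyll-a, |oxygen saturation deviation|,
          dissolved inorganic N and total P (all strictly positive)."""
          return np.log10(chl_a * abs_do_dev * din * tp)

      rng = np.random.default_rng(3)
      # hypothetical reference sites of the same water-body type
      ref = untrix_like(rng.lognormal(0.0, 0.3, 50), rng.lognormal(1.0, 0.3, 50),
                        rng.lognormal(1.5, 0.3, 50), rng.lognormal(1.2, 0.3, 50))

      site = untrix_like(chl_a=4.0, abs_do_dev=25.0, din=80.0, tp=30.0)
      percentile = (ref < site).mean() * 100
      print(f"site value {site:.2f}, {percentile:.0f}th percentile of reference sites")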

  1. DIF Trees: Using Classification Trees to Detect Differential Item Functioning

    ERIC Educational Resources Information Center

    Vaughn, Brandon K.; Wang, Qiu

    2010-01-01

    A nonparametric tree classification procedure is used to detect differential item functioning for items that are dichotomously scored. Classification trees are shown to be an alternative procedure to detect differential item functioning other than the use of traditional Mantel-Haenszel and logistic regression analysis. A nonparametric…
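    A sketch of the underlying idea (not the specific nonparametric procedure studied in the article): fit a classification tree that predicts a dichotomous item response from an ability proxy and group membership, and treat group-based splits as a flag for possible DIF; the data are simulated with a known uniform DIF effect.

      # Classification tree on [ability proxy, group]; importance attributed
      # to the group feature suggests differential item functioning.
      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(0)
      n = 2000
      theta = rng.normal(size=n)                  # latent ability
      group = rng.integers(0, 2, n)               # reference (0) vs. focal (1) group
      # item with uniform DIF: harder for the focal group at equal ability
      p = 1 / (1 + np.exp(-(theta - 0.7 * group)))
      response = rng.random(n) < p
      rest_score = theta + rng.normal(0, 0.5, n)  # observed ability proxy

      X = np.column_stack([rest_score, group])
      tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50,
                                    random_state=0).fit(X, response)
      print("feature importances [rest score, group]:",
            np.round(tree.feature_importances_, 3))
      # a clearly nonzero importance for "group" flags the item for DIF review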

  2. 18 CFR 3a.13 - Classification responsibility and procedure.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Classification responsibility and procedure. 3a.13 Section 3a.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES NATIONAL SECURITY INFORMATION Classification § 3a...

  3. 18 CFR 3a.13 - Classification responsibility and procedure.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Classification responsibility and procedure. 3a.13 Section 3a.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES NATIONAL SECURITY INFORMATION Classification § 3a...

  4. 18 CFR 3a.13 - Classification responsibility and procedure.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Classification responsibility and procedure. 3a.13 Section 3a.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES NATIONAL SECURITY INFORMATION Classification § 3a...

  5. 18 CFR 3a.13 - Classification responsibility and procedure.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Classification responsibility and procedure. 3a.13 Section 3a.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY GENERAL RULES NATIONAL SECURITY INFORMATION Classification § 3a...

  6. 10 CFR 1045.10 - Purpose and scope.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Government-wide policies and procedures concerning the classification and declassification of RD and FRD information. (b) This subpart establishes procedures for classification prohibitions for RD and FRD, describes authorities and procedures for identifying RD and FRD information, and specifies the policies and criteria DOE...

  7. 10 CFR 1045.10 - Purpose and scope.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Government-wide policies and procedures concerning the classification and declassification of RD and FRD information. (b) This subpart establishes procedures for classification prohibitions for RD and FRD, describes authorities and procedures for identifying RD and FRD information, and specifies the policies and criteria DOE...

  8. 10 CFR 1045.10 - Purpose and scope.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Government-wide policies and procedures concerning the classification and declassification of RD and FRD information. (b) This subpart establishes procedures for classification prohibitions for RD and FRD, describes authorities and procedures for identifying RD and FRD information, and specifies the policies and criteria DOE...

  9. Reverse Shoulder Arthroplasty Prosthesis Design Classification System.

    PubMed

    Routman, Howard D; Flurin, Pierre-Henri; Wright, Thomas W; Zuckerman, Joseph D; Hamilton, Matthew A; Roche, Christopher P

    2015-12-01

    Multiple different reverse total shoulder arthroplasty (rTSA) prosthesis designs are available in the global marketplace for surgeons to perform this growing procedure. Subtle differences in rTSA prosthesis design parameters have been shown to have significant biomechanical impact and clinical consequences. We propose an rTSA prosthesis design classification system to objectively identify and categorize different designs based upon their specific glenoid and humeral prosthetic characteristics. The purpose is to standardize nomenclature and thereby help the orthopaedic surgeon determine which combination of design configurations best suits a given clinical scenario. The impact of each prosthesis classification type on shoulder muscle length and deltoid wrapping is also described to illustrate how each classification type affects these biomechanical parameters.

  10. EL68D Wasteway Watershed Land-Cover Generation

    USGS Publications Warehouse

    Ruhl, Sheila; Usery, E. Lynn; Finn, Michael P.

    2007-01-01

    Classification of land cover from Landsat Enhanced Thematic Mapper Plus (ETM+) for the EL68D Wasteway Watershed in the State of Washington is documented. The procedures for classification include use of two ETM+ scenes in a simultaneous unsupervised classification process supported by extensive field data collection using Global Positioning System receivers and digital photos. The procedure resulted in a detailed classification at the individual crop species level.
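    A sketch of what "simultaneous unsupervised classification" of two co-registered scenes can look like in code: stack the bands of both dates into one pixel vector and cluster; synthetic rasters and k-means stand in for the ETM+ scenes and the clustering actually used.

      # Stack two co-registered multiband scenes and cluster the pixel vectors.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(7)
      rows, cols, bands = 100, 100, 6
      scene_june = rng.normal(size=(rows, cols, bands))
      scene_august = rng.normal(size=(rows, cols, bands))

      stacked = np.concatenate([scene_june, scene_august], axis=2)  # 12 bands
      pixels = stacked.reshape(-1, 2 * bands)

      labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(pixels)
      cluster_map = labels.reshape(rows, cols)   # clusters are later assigned to
      print(np.bincount(labels))                 # crop classes using field data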

  11. The normative structure of mathematization in systematic biology.

    PubMed

    Sterner, Beckett; Lidgard, Scott

    2014-06-01

    We argue that the mathematization of science should be understood as a normative activity of advocating for a particular methodology with its own criteria for evaluating good research. As a case study, we examine the mathematization of taxonomic classification in systematic biology. We show how mathematization is a normative activity by contrasting its distinctive features in numerical taxonomy in the 1960s with an earlier reform advocated by Ernst Mayr starting in the 1940s. Both Mayr and the numerical taxonomists sought to formalize the work of classification, but Mayr introduced a qualitative formalism based on human judgment for determining the taxonomic rank of populations, while the numerical taxonomists introduced a quantitative formalism based on automated procedures for computing classifications. The key contrast between Mayr and the numerical taxonomists is how they conceptualized the temporal structure of the workflow of classification, specifically where they allowed meta-level discourse about difficulties in producing the classification. Copyright © 2014. Published by Elsevier Ltd.

  12. A fingerprint classification algorithm based on combination of local and global information

    NASA Astrophysics Data System (ADS)

    Liu, Chongjin; Fu, Xiang; Bian, Junjie; Feng, Jufu

    2011-12-01

    Fingerprint recognition is one of the most important technologies in biometric identification and has been widely applied in commercial and forensic areas. Fingerprint classification, as a fundamental procedure in fingerprint recognition, can sharply decrease the number of comparisons required for fingerprint matching and improve the efficiency of fingerprint recognition. Most fingerprint classification algorithms are based on the number and position of singular points. Because singular-point detection methods commonly consider only local information, such classification algorithms are sensitive to noise. In this paper, we propose a novel fingerprint classification algorithm combining the local and global information of the fingerprint. First, we use local information to detect singular points and measure their quality, considering the orientation structure and image texture in adjacent areas. Furthermore, a global orientation model is adopted to measure the reliability of the singular-point group. Finally, the local quality and global reliability are weighted to classify the fingerprint. Experiments demonstrate the accuracy and effectiveness of our algorithm, especially for poor-quality fingerprint images.

  13. Mechanization of Library Procedures in the Medium-Sized Medical Library: XIV. Correlations between National Library of Medicine Classification Numbers and MeSH Headings *

    PubMed Central

    Fenske, Ruth E.

    1972-01-01

    The purpose of this study was to determine the amount of correlation between National Library of Medicine classification numbers and MeSH headings in a body of cataloging which had already been done and then to find out which of two alternative methods of utilizing the correlation would be best. There was a correlation of 44.5% between classification numbers and subject headings in the data base studied, cataloging data covering 8,137 books. The results indicate that a subject heading index showing classification numbers would be the preferred method of utilization, because it would be more accurate than the alternative considered, an arrangement by classification numbers which would be consulted to obtain subject headings. PMID:16017607

  14. Classification image analysis: estimation and statistical inference for two-alternative forced-choice experiments

    NASA Technical Reports Server (NTRS)

    Abbey, Craig K.; Eckstein, Miguel P.

    2002-01-01

    We consider estimation and statistical hypothesis testing on classification images obtained from the two-alternative forced-choice experimental paradigm. We begin with a probabilistic model of task performance for simple forced-choice detection and discrimination tasks. Particular attention is paid to general linear filter models because these models lead to a direct interpretation of the classification image as an estimate of the filter weights. We then describe an estimation procedure for obtaining classification images from observer data. A number of statistical tests are presented for testing various hypotheses from classification images based on some more compact set of features derived from them. As an example of how the methods we describe can be used, we present a case study investigating detection of a Gaussian bump profile.

  15. Adjustment of localized alveolar ridge defects by soft tissue transplantation to improve mucogingival esthetics: a proposal for clinical classification and an evaluation of procedures.

    PubMed

    Studer, S; Naef, R; Schärer, P

    1997-12-01

    Esthetically correct treatment of a localized alveolar ridge defect is a frequent prosthetic challenge. Such defects can be overcome not only by a variety of prosthetic means, but also by several periodontal surgical techniques, notably soft tissue augmentations. Preoperative classification of the localized alveolar ridge defect can be greatly useful in evaluating the prognosis and technical difficulties involved. A semiquantitative classification, dependent on the severity of vertical and horizontal dimensional loss, is proposed to supplement the recognized qualitative classification of a ridge defect. Various methods of soft tissue augmentation are evaluated, based on initial volumetric measurements. The roll flap technique is proposed when the problem is related to ridge quality (single-tooth defect with little horizontal and vertical loss). Larger defects in which a volumetric problem must be solved are corrected through the subepithelial connective tissue technique. Additional mucogingival problems (eg, insufficient gingival width, high frenum, gingival scarring, or tattoo) should not be corrected simultaneously with augmentation procedures. In these cases, the onlay transplant technique is favored.

  16. An automated approach to the design of decision tree classifiers

    NASA Technical Reports Server (NTRS)

    Argentiero, P.; Chin, P.; Beaudet, P.

    1980-01-01

    The classification of large dimensional data sets arising from the merging of remote sensing data with more traditional forms of ancillary data is considered. Decision tree classification, a popular approach to the problem, is characterized by the property that samples are subjected to a sequence of decision rules before they are assigned to a unique class. An automated technique for effective decision tree design which relies only on a priori statistics is presented. This procedure utilizes a set of two-dimensional canonical transforms and Bayes table look-up decision rules. An optimal design at each node is derived based on the associated decision table. A procedure for computing the global probability of correct classification is also provided. An example is given in which class statistics obtained from an actual LANDSAT scene are used as input to the program. The resulting decision tree design has an associated probability of correct classification of 0.76, compared with the theoretically optimal 0.79 probability of correct classification associated with a full-dimensional Bayes classifier. Recommendations for future research are included.

  17. Toward optimal feature and time segment selection by divergence method for EEG signals classification.

    PubMed

    Wang, Jie; Feng, Zuren; Lu, Na; Luo, Jing

    2018-06-01

    Feature selection plays an important role in the field of EEG-signal-based motor imagery pattern classification. It is a process that aims to select an optimal feature subset from the original set. Two significant advantages are lowering the computational burden, which speeds up the learning procedure, and removing redundant and irrelevant features, which improves classification performance. Therefore, feature selection is widely employed in the classification of EEG signals in practical brain-computer interface systems. In this paper, we present a novel statistical model to select the optimal feature subset based on the Kullback-Leibler divergence measure, and to automatically select the optimal subject-specific time segment. The proposed method comprises four successive stages: broad frequency-band filtering and common spatial pattern enhancement as preprocessing, feature extraction by an autoregressive model and log-variance, Kullback-Leibler divergence based optimal feature and time segment selection, and linear discriminant analysis classification. More importantly, this paper provides a potential framework for combining other feature extraction models and classification algorithms with the proposed method for EEG signals classification. Experiments on single-trial EEG signals from two public competition datasets not only demonstrate that the proposed method is effective in selecting discriminative features and time segments, but also show that the proposed method yields relatively better classification results in comparison with other competitive methods. Copyright © 2018 Elsevier Ltd. All rights reserved.
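    A simplified sketch of divergence-based feature selection: rank each feature by the symmetric Kullback-Leibler divergence between its class-conditional Gaussian fits, keep the top-ranked features and classify with LDA. The CSP/autoregressive feature extraction of the paper is not reproduced, the features are synthetic, and in practice the selection should be nested inside the cross-validation folds.

      # Rank features by symmetric KL divergence between per-class Gaussian
      # fits, keep the best ones, classify with LDA.
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      def sym_kl_gauss(x0, x1):
          m0, v0 = x0.mean(), x0.var() + 1e-12
          m1, v1 = x1.mean(), x1.var() + 1e-12
          d2 = (m0 - m1) ** 2
          return (v0 + d2) / (2 * v1) + (v1 + d2) / (2 * v0) - 1.0

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 40))             # 200 trials x 40 features
      y = np.repeat([0, 1], 100)
      X[y == 1, :5] += 1.0                       # 5 genuinely discriminative features

      scores = np.array([sym_kl_gauss(X[y == 0, j], X[y == 1, j])
                         for j in range(X.shape[1])])
      top = np.argsort(scores)[::-1][:8]         # keep the 8 highest-divergence features

      acc = cross_val_score(LinearDiscriminantAnalysis(), X[:, top], y, cv=5)
      print("selected features:", sorted(top.tolist()))
      print(f"LDA accuracy with selected features: {acc.mean():.2f}")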

  18. Pyrotechnic hazards classification and evaluation program. Phase 3, segments 1-4: Investigation of sensitivity test methods and procedures for pyrotechnic hazards evaluation and classification, part A

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The findings, conclusions, and recommendations relative to the investigations conducted to evaluate tests for classifying pyrotechnic materials and end items as to their hazard potential are presented. Information required to establish an applicable means of determining the potential hazards of pyrotechnics is described. Hazard evaluations are based on the peak overpressure or impulse resulting from the explosion as a function of distance from the source. Other hazard classification tests include dust ignition sensitivity, impact ignition sensitivity, spark ignition sensitivity, and differential thermal analysis.

  19. Controlling a human-computer interface system with a novel classification method that uses electrooculography signals.

    PubMed

    Wu, Shang-Lin; Liao, Lun-De; Lu, Shao-Wei; Jiang, Wei-Ling; Chen, Shi-An; Lin, Chin-Teng

    2013-08-01

    Electrooculography (EOG) signals can be used to control human-computer interface (HCI) systems, if properly classified. The ability to measure and process these signals may help HCI users to overcome many of the physical limitations and inconveniences in daily life. However, there are currently no effective multidirectional classification methods for monitoring eye movements. Here, we describe a classification method used in a wireless EOG-based HCI device for detecting eye movements in eight directions. This device includes wireless EOG signal acquisition components, wet electrodes and an EOG signal classification algorithm. The EOG classification algorithm is based on extracting features from the electrical signals corresponding to eight directions of eye movement (up, down, left, right, up-left, down-left, up-right, and down-right) and blinking. The recognition and processing of these eight different features were achieved in real-life conditions, demonstrating that this device can reliably measure the features of EOG signals. This system and its classification procedure provide an effective method for identifying eye movements. Additionally, it may be applied to study eye functions in real-life conditions in the near future.

  20. Group-theoretical approach to the construction of bases in 2^n-dimensional Hilbert space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, A.; Romero, J. L.; Klimov, A. B., E-mail: klimov@cencar.udg.mx

    2011-06-15

    We propose a systematic procedure to construct all the possible bases with definite factorization structure in 2^n-dimensional Hilbert space and discuss an algorithm for the determination of basis separability. The results are applied to the classification of bases for an n-qubit system.

  1. 76 FR 80278 - Revision of Cotton Classification Procedures for Determining Cotton Leaf Grade

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-23

    ...-0066] RIN 0581-AD19 Revision of Cotton Classification Procedures for Determining Cotton Leaf Grade... Pima cotton. The leaf grade is a part of the official classification which denotes cotton fiber quality used in cotton marketing and manufacturing of cotton products. Currently, the leaf grade is determined...

  2. Evaluation of SLAR and thematic mapper MSS data for forest cover mapping using computer-aided analysis techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator); Knowlton, D. J.; Dean, M. E.

    1981-01-01

    A set of training statistics for the 30 meter resolution simulated thematic mapper MSS data was generated based on land use/land cover classes. In addition to this supervised data set, a nonsupervised multicluster block of training statistics is being defined in order to compare the classification results and evaluate the effect of the different training selection methods on classification performance. Two test data sets, one defined using a stratified sampling procedure incorporating a grid system with dimensions of 50 lines by 50 columns, and another based on an analyst-supervised set of test fields, were used to evaluate the classifications of the TMS data. Training statistics were generated from the supervised training data set, and a per-point Gaussian maximum likelihood classification of the 1979 TMS data was obtained. The August 1980 MSS data were radiometrically adjusted. The SAR data were redigitized, and the SAR imagery was qualitatively analyzed.

  3. Refining Landsat classification results using digital terrain data

    USGS Publications Warehouse

    Miller, Wayne A.; Shasby, Mark

    1982-01-01

    Scientists at the U.S. Geological Survey's Earth Resources Observation Systems (EROS) Data Center have recently completed two land-cover mapping projects in which digital terrain data were used to refine Landsat classification results. Digital terrain data were incorporated into the Landsat classification process using two different procedures that required developing decision criteria either subjectively or quantitatively. The subjective procedure was used in a vegetation mapping project in Arizona, and the quantitative procedure was used in a forest-fuels mapping project in Montana. By incorporating digital terrain data into the Landsat classification process, more spatially accurate land-cover maps were produced for both projects.

  4. TAXONOMY OF MEDICAL DEVICES IN THE LOGIC OF HEALTH TECHNOLOGY ASSESSMENT.

    PubMed

    Henschke, Cornelia; Panteli, Dimitra; Perleth, Matthias; Busse, Reinhard

    2015-01-01

    The suitability of general HTA methodology for medical devices is gaining interest as a topic of scientific discourse. Given the broad range of medical devices, there might be differences between groups of devices that impact both the necessity and the methods of their assessment. Our aim is to develop a taxonomy that provides researchers and policy makers with an orientation tool on how to approach the assessment of different types of medical devices. Several classifications for medical devices based on varying rationales for different regulatory and reporting purposes were analyzed in detail to develop a comprehensive taxonomic model. The taxonomy is based on relevant aspects of existing classification schemes incorporating elements of risk and functionality. Its 9 × 6 matrix distinguishes between the diagnostic or therapeutic nature of devices and considers whether the medical device is directly used by patients, constitutes part of a specific procedure, or can be used for a variety of procedures. We considered the relevance of different device categories in regard to HTA to be considerably variable, ranging from high to low. Existing medical device classifications cannot be used for HTA as they are based on different underlying logics. The developed taxonomy combines different device classification schemes used for different purposes. It aims at providing decision makers with a tool enabling them to consider device characteristics in detail across more than one dimension. The placement of device groups in the matrix can provide decision support on the necessity of conducting a full HTA.

  5. New classification system for indications for endoscopic retrograde cholangiopancreatography predicts diagnoses and adverse events.

    PubMed

    Yuen, Nicholas; O'Shaughnessy, Pauline; Thomson, Andrew

    2017-12-01

    Indications for endoscopic retrograde cholangiopancreatography (ERCP) have received little attention, especially in scientific or objective terms. To review the prevailing ERCP indications in the literature, and to propose and evaluate a new ERCP indication system that relies on more objective pre-procedure parameters. An analysis was conducted on 1758 consecutive ERCP procedures, in which contemporaneous use was made of an a-priori indication system. Indications were based on objective pre-procedure parameters and divided into primary [cholangitis, clinical evidence of biliary leak, acute (biliary) pancreatitis, abnormal intraoperative cholangiogram (IOC), or change/removal of stent for benign/malignant disease] and secondary [a combination of two or three of: pain attributable to biliary disease ('P'), imaging evidence of biliary disease ('I'), and abnormal liver function tests (LFTs) ('L')]. A secondary indication was only used if a primary indication was not present. The relationship between this newly developed classification system and ERCP findings and adverse events was examined. The indications of cholangitis and positive IOC were predictive of choledocholithiasis at ERCP (101/154 and 74/141 procedures, respectively). With respect to secondary indications, only when all three of 'P', 'I', and 'L' were present was there a statistically significant association with choledocholithiasis (χ²(1) = 35.3, p < .001). Adverse events were associated with an unusual indication, leading to a greater risk of unplanned hospitalization (χ²(1) = 17.0, p < .001). An a-priori-based indication system for ERCP, which relies on pre-ERCP objective parameters, provides a more useful and scientific classification system than is currently available.
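    The association test reported above can be reproduced mechanically with a chi-squared test on a 2x2 table; the counts below are illustrative placeholders, not the study data.

      # Chi-squared test of association between "all three of P, I, L present"
      # and finding choledocholithiasis at ERCP (hypothetical counts).
      from scipy.stats import chi2_contingency

      table = [[60, 40],    # P+I+L present:    stones / no stones
               [45, 155]]   # fewer than three: stones / no stones
      chi2, p, dof, expected = chi2_contingency(table, correction=False)
      print(f"chi2({dof}) = {chi2:.1f}, p = {p:.4g}")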

  6. Classification of circulation type sequences applied to snow avalanches over the eastern Pyrenees (Andorra and Catalonia)

    NASA Astrophysics Data System (ADS)

    Esteban, Pere; Beck, Christoph; Philipp, Andreas

    2010-05-01

    Using data associated with accidents or damage caused by snow avalanches over the eastern Pyrenees (Andorra and Catalonia), several atmospheric circulation type catalogues have been obtained. For this purpose, different circulation type classification methods based on Principal Component Analysis (T-mode and S-mode using the extreme scores) and on optimization procedures (Improved K-means and SANDRA) were applied. Considering the characteristics of the phenomena studied, not only single-day circulation patterns were taken into account but also sequences of circulation types of varying length. Thus, different classifications with different numbers of types and for different sequence lengths were obtained using the different classification methods. Simple between-type variability, within-type variability and outlier detection procedures were applied to select the best result among the snow avalanche type classifications. Furthermore, days without occurrence of the hazards were also related to the avalanche centroids using pattern correlations, facilitating the calculation of the anomalies between hazardous and non-hazardous days, and also of the frequencies of occurrence of hazardous events for each circulation type. Finally, the catalogues considered statistically to be the best are evaluated using the avalanche forecasters' expert knowledge. A consistent explanation of snow avalanche occurrence by means of circulation sequences is obtained, but always considering results from classifications with different sequence lengths. This work has been developed in the framework of the COST Action 733 (Harmonisation and Applications of Weather Type Classifications for European regions).

  7. 43 CFR 2461.1 - Proposed classifications.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Proposed classifications. 2461.1 Section... MANAGEMENT, DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) BUREAU INITIATED CLASSIFICATION SYSTEM Multiple-Use Classification Procedures § 2461.1 Proposed classifications. (a) Proposed classifications will...

  8. 43 CFR 2461.4 - Changing classifications.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Changing classifications. 2461.4 Section... MANAGEMENT, DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) BUREAU INITIATED CLASSIFICATION SYSTEM Multiple-Use Classification Procedures § 2461.4 Changing classifications. Classifications may be changed...

  9. 43 CFR 2461.1 - Proposed classifications.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false Proposed classifications. 2461.1 Section... MANAGEMENT, DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) BUREAU INITIATED CLASSIFICATION SYSTEM Multiple-Use Classification Procedures § 2461.1 Proposed classifications. (a) Proposed classifications will...

  10. 43 CFR 2461.4 - Changing classifications.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false Changing classifications. 2461.4 Section... MANAGEMENT, DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) BUREAU INITIATED CLASSIFICATION SYSTEM Multiple-Use Classification Procedures § 2461.4 Changing classifications. Classifications may be changed...

  11. 43 CFR 2461.1 - Proposed classifications.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false Proposed classifications. 2461.1 Section... MANAGEMENT, DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) BUREAU INITIATED CLASSIFICATION SYSTEM Multiple-Use Classification Procedures § 2461.1 Proposed classifications. (a) Proposed classifications will...

  12. 43 CFR 2461.4 - Changing classifications.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 43 Public Lands: Interior 2 2014-10-01 2014-10-01 false Changing classifications. 2461.4 Section... MANAGEMENT, DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) BUREAU INITIATED CLASSIFICATION SYSTEM Multiple-Use Classification Procedures § 2461.4 Changing classifications. Classifications may be changed...

  13. 43 CFR 2461.1 - Proposed classifications.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 43 Public Lands: Interior 2 2014-10-01 2014-10-01 false Proposed classifications. 2461.1 Section... MANAGEMENT, DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) BUREAU INITIATED CLASSIFICATION SYSTEM Multiple-Use Classification Procedures § 2461.1 Proposed classifications. (a) Proposed classifications will...

  14. 43 CFR 2461.4 - Changing classifications.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false Changing classifications. 2461.4 Section... MANAGEMENT, DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) BUREAU INITIATED CLASSIFICATION SYSTEM Multiple-Use Classification Procedures § 2461.4 Changing classifications. Classifications may be changed...

  15. Multiple-Primitives Hierarchical Classification of Airborne Laser Scanning Data in Urban Areas

    NASA Astrophysics Data System (ADS)

    Ni, H.; Lin, X. G.; Zhang, J. X.

    2017-09-01

    A hierarchical classification method for Airborne Laser Scanning (ALS) data of urban areas is proposed in this paper. The method is composed of three stages, across which three types of primitives are utilized: smooth surfaces, rough surfaces, and individual points. In the first stage, the input ALS data are divided into smooth surfaces and rough surfaces by a step-wise point cloud segmentation method. In the second stage, classification based on smooth surfaces and rough surfaces is performed. Points in the smooth surfaces are first classified into ground and buildings based on semantic rules. Next, features of the rough surfaces are extracted, and points in the rough surfaces are classified into vegetation and vehicles based on the derived features and Random Forests (RF). In the third stage, point-based features are extracted for the ground points, and an individual-point classification procedure is performed to classify the ground points into bare land, artificial ground and greenbelt. Moreover, the shortcomings of existing studies are analyzed, and experiments show that the proposed method overcomes these shortcomings and handles more types of objects.
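    A sketch of the second-stage step described above, with a Random Forest separating vegetation from vehicles on per-segment features; the feature values and labels are synthetic placeholders rather than ALS data.

      # Random Forest on hypothetical rough-surface segment features
      # (height range, point density, planarity, echo ratio).
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(1)
      n = 400
      veg = np.column_stack([rng.normal(6, 2, n), rng.normal(20, 5, n),
                             rng.normal(0.3, 0.1, n), rng.normal(0.6, 0.1, n)])
      veh = np.column_stack([rng.normal(1.5, 0.4, n), rng.normal(35, 5, n),
                             rng.normal(0.7, 0.1, n), rng.normal(0.95, 0.05, n)])
      X = np.vstack([veg, veh])
      y = np.array([0] * n + [1] * n)            # 0 = vegetation, 1 = vehicle

      Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=1)
      rf = RandomForestClassifier(n_estimators=200, random_state=1).fit(Xtr, ytr)
      print(f"hold-out accuracy: {rf.score(Xte, yte):.2f}")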

  16. Recommendations for the classification of HIV associated neuromanifestations in the German DRG system.

    PubMed

    Evers, Stefan; Fiori, W; Brockmeyer, N; Arendt, G; Husstedt, I-W

    2005-09-12

    HIV associated neuromanifestations are of growing importance in the in-patient treatment of HIV infected patients. In Germany, all in-patients have to be coded according to the ICD-10 classification and the German DRG-system. We present recommendations on how to code the different primary and secondary neuromanifestations of HIV infection. These recommendations are based on the commentary of the German DRG procedures and are aimed at establishing uniform coding of neuromanifestations.

  17. Updating Allergy and/or Hypersensitivity Diagnostic Procedures in the WHO ICD-11 Revision.

    PubMed

    Tanno, Luciana Kase; Calderon, Moises A; Li, James; Casale, Thomas; Demoly, Pascal

    2016-01-01

    The classification of allergy and/or hypersensitivity conditions for the World Health Organization (WHO) International Classification of Diseases (ICD)-11 provides the appropriate corresponding codes for allergic diseases, assuming that the final diagnosis is correct. This classification should be linked to in vitro and in vivo diagnostic procedures. Considering the impact for our specialty, we decided to review the codification of these procedures in the ICD, aiming to establish a baseline and to suggest changes and/or submit new proposals. For that, we prepared a list of the relevant allergy and/or hypersensitivity diagnostic procedures that health care professionals deal with on a daily basis, based on the main current guidelines, and selected all possible and relevant corresponding terms from the ICD-10 (2015 version) and the ICD-11 β phase foundation (June 2015 version). More than 90% of the very specific and important diagnostic procedures currently used by the allergist community on a daily basis are missing. We also observed that some concepts routinely used by the allergist community are not fully recognized by other specialties. The whole scheme and the correspondence in the ICD-10 (2015 version) and ICD-11 foundation (June 2015 version) provided us with a clear picture of the missing or imprecise terms and of how they are scattered in the current ICD-11 framework, allowing us to submit new proposals to increase the visibility of allergy and/or hypersensitivity conditions and diagnostic procedures. Copyright © 2016 American Academy of Allergy, Asthma & Immunology. All rights reserved.

  18. Non-target adjacent stimuli classification improves performance of classical ERP-based brain computer interface

    NASA Astrophysics Data System (ADS)

    Ceballos, G. A.; Hernández, L. F.

    2015-04-01

    Objective. The classical ERP-based speller, or P300 Speller, is one of the most commonly used paradigms in the field of Brain Computer Interfaces (BCI). Several alterations to the visual stimulus presentation system have been developed to avoid unfavorable effects elicited by adjacent stimuli. However, there has been little, if any, regard for the useful information contained in responses to adjacent stimuli about the spatial location of target symbols. This paper aims to demonstrate that combining the classification of non-target adjacent stimuli with the standard classification (target versus non-target) significantly improves classical ERP-based speller efficiency. Approach. Four SWLDA classifiers were trained and combined with the standard classifier: the lower-row, upper-row, right-column and left-column classifiers. This new feature extraction procedure and the classification method were carried out on three open databases: the UAM P300 database (Universidad Autonoma Metropolitana, Mexico), BCI competition II (dataset IIb) and BCI competition III (dataset II). Main results. The inclusion of the classification of non-target adjacent stimuli improves target classification in the classical row/column paradigm. A gain in mean single-trial classification accuracy of 9.6% and an overall improvement of 25% in simulated spelling speed were achieved. Significance. We have provided further evidence that the ERPs produced by adjacent stimuli present discriminable features, which could provide additional information about the spatial location of intended symbols. This work motivates the search for information in peripheral stimulation responses to improve the performance of emerging visual ERP-based spellers.

  19. Multiclassifier system with hybrid learning applied to the control of bioprosthetic hand.

    PubMed

    Kurzynski, Marek; Krysmann, Maciej; Trajdos, Pawel; Wolczowski, Andrzej

    2016-02-01

    In this paper, the problem of recognizing intended hand movements for the control of a bioprosthetic hand is addressed. The proposed method is based on the recognition of electromyographic (EMG) and mechanomyographic (MMG) biosignals using a multiclassifier system (MCS) working in a two-level structure with a dynamic ensemble selection (DES) scheme and original concepts of a competence function. Additionally, feedback information coming from bioprosthesis sensors on correct/incorrect classification is applied to adjust the combining mechanism during MCS operation, through adaptive tuning of the competences of the base classifiers depending on their decisions. Three MCS systems operating in a decision tree structure and with different tuning algorithms are developed. In the MCS1 system, competence is uniformly allocated to each class belonging to the group indicated by the feedback signal. In the MCS2 system, the modification of competence depends on the node of the decision tree at which a correct/incorrect classification is made. In the MCS3 system, a randomized model of the classifier and the concept of cross-competence are used in the tuning procedure. Experimental investigations on real data, with a computer-simulated procedure for generating feedback signals, were performed. In these investigations, the classification accuracy of the MCS systems developed is compared, and the MCS systems are evaluated with respect to the effectiveness of the procedure for tuning competence. The results obtained indicate that modification of the competence of base classifiers during the working phase essentially improves the performance of the MCS system and that this improvement depends on the MCS system and tuning method used. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. 19 CFR 152.16 - Judicial changes in classification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... OF THE TREASURY (CONTINUED) CLASSIFICATION AND APPRAISEMENT OF MERCHANDISE Classification § 152.16 Judicial changes in classification. The following procedures apply to changes in classification made by... 19 Customs Duties 2 2010-04-01 2010-04-01 false Judicial changes in classification. 152.16 Section...

  1. High-speed potato grading and quality inspection based on a color vision system

    NASA Astrophysics Data System (ADS)

    Noordam, Jacco C.; Otten, Gerwoud W.; Timmermans, Toine J. M.; van Zwol, Bauke H.

    2000-03-01

    A high-speed machine vision system for the quality inspection and grading of potatoes has been developed. The vision system grades potatoes on size, shape and external defects such as greening, mechanical damage, rhizoctonia, silver scab, common scab, cracks and growth cracks. A 3-CCD line-scan camera inspects the potatoes in flight as they pass under the camera. The use of mirrors to obtain a 360-degree view and the absence of product holders guarantee a full view of each potato. To achieve the required capacity of 12 tons/hour, 11 SHARC Digital Signal Processors perform the image processing and classification tasks. The total capacity of the system is about 50 potatoes/sec. The color segmentation procedure uses Linear Discriminant Analysis (LDA) in combination with a Mahalanobis distance classifier to classify the pixels. The procedure for the detection of misshapen potatoes uses a Fourier-based shape classification technique. Features such as area, eccentricity and central moments are used to discriminate between similarly colored defects. Experiments with red and yellow skin-colored potatoes have shown that the system is robust and consistent in its classification.
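    A sketch of the colour-segmentation idea named above: project pixel colours with LDA and assign each pixel to the class with the smallest Mahalanobis distance in the projected space; the colour statistics are invented for illustration and this is not the production system.

      # LDA projection of RGB pixels followed by minimum-Mahalanobis-distance
      # assignment in the projected space.
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(0)
      means = {"skin": [170, 140, 90], "greening": [110, 150, 80], "scab": [90, 70, 50]}
      X = np.vstack([rng.normal(m, 12, size=(500, 3)) for m in means.values()])
      y = np.repeat(list(means), 500)

      lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)
      Z = lda.transform(X)

      # per-class mean and inverse covariance in the projected space
      stats = {}
      for c in means:
          zc = Z[y == c]
          stats[c] = (zc.mean(axis=0), np.linalg.inv(np.cov(zc, rowvar=False)))

      def classify(rgb):
          z = lda.transform(np.atleast_2d(rgb))[0]
          d = {c: (z - mu) @ icov @ (z - mu) for c, (mu, icov) in stats.items()}
          return min(d, key=d.get)

      print(classify([168, 142, 88]))   # near the "skin" colour statistics
      print(classify([108, 152, 82]))   # near the "greening" colour statistics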

  2. Multi-spectral brain tissue segmentation using automatically trained k-Nearest-Neighbor classification.

    PubMed

    Vrooman, Henri A; Cocosco, Chris A; van der Lijn, Fedde; Stokking, Rik; Ikram, M Arfan; Vernooij, Meike W; Breteler, Monique M B; Niessen, Wiro J

    2007-08-01

    Conventional k-Nearest-Neighbor (kNN) classification, which has been successfully applied to classify brain tissue in MR data, requires training on manually labeled subjects. This manual labeling is a laborious and time-consuming procedure. In this work, a new fully automated brain tissue classification procedure is presented, in which kNN training is automated. This is achieved by non-rigidly registering the MR data with a tissue probability atlas to automatically select training samples, followed by a post-processing step to keep the most reliable samples. The accuracy of the new method was compared to rigid registration-based training and to conventional kNN-based segmentation using training on manually labeled subjects for segmenting gray matter (GM), white matter (WM) and cerebrospinal fluid (CSF) in 12 data sets. Furthermore, for all classification methods, the performance was assessed when varying the free parameters. Finally, the robustness of the fully automated procedure was evaluated on 59 subjects. The automated training method using non-rigid registration with a tissue probability atlas was significantly more accurate than rigid registration. For both automated training using non-rigid registration and for the manually trained kNN classifier, the difference with the manual labeling by observers was not significantly larger than inter-observer variability for all tissue types. From the robustness study, it was clear that, given an appropriate brain atlas and optimal parameters, our new fully automated, non-rigid registration-based method gives accurate and robust segmentation results. A similarity index was used for comparison with manually trained kNN. The similarity indices were 0.93, 0.92 and 0.92, for CSF, GM and WM, respectively. It can be concluded that our fully automated method using non-rigid registration may replace manual segmentation, and thus that automated brain tissue segmentation without laborious manual training is feasible.
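    The similarity index used above for comparing automatic and manual segmentations is commonly defined as the Dice overlap; a minimal sketch on synthetic binary masks follows.

      # Dice overlap between two binary segmentation masks.
      import numpy as np

      def similarity_index(mask_a, mask_b):
          """Dice overlap: 2|A n B| / (|A| + |B|)."""
          a, b = np.asarray(mask_a, bool), np.asarray(mask_b, bool)
          denom = a.sum() + b.sum()
          return 1.0 if denom == 0 else 2.0 * (a & b).sum() / denom

      auto = np.zeros((64, 64), bool)
      auto[10:40, 10:40] = True           # automated tissue mask
      manual = np.zeros((64, 64), bool)
      manual[12:42, 11:41] = True         # manual tissue mask
      print(f"SI = {similarity_index(auto, manual):.2f}")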

  3. Deep convolutional neural network training enrichment using multi-view object-based analysis of Unmanned Aerial systems imagery for wetlands classification

    NASA Astrophysics Data System (ADS)

    Liu, Tao; Abd-Elrahman, Amr

    2018-05-01

    Deep convolutional neural networks (DCNNs) require massive training datasets to trigger their image classification power, while collecting training samples for remote sensing applications is usually an expensive process. When a DCNN is simply implemented with traditional object-based image analysis (OBIA) for the classification of an Unmanned Aerial Systems (UAS) orthoimage, its power may be undermined if the number of training samples is relatively small. This research aims to develop a novel OBIA classification approach that can take advantage of DCNNs by enriching the training dataset automatically using multi-view data. Specifically, this study introduces a Multi-View Object-based classification using Deep convolutional neural network (MODe) method to process UAS images for land cover classification. MODe conducts the classification on multi-view UAS images instead of directly on the orthoimage, and obtains the final results via a voting procedure. 10-fold cross-validation results show the mean overall classification accuracy increasing substantially, from 65.32% when the DCNN was applied to the orthoimage to 82.08% when MODe was implemented. This study also compared the performances of the support vector machine (SVM) and random forest (RF) classifiers with the DCNN under the traditional OBIA and the proposed multi-view OBIA frameworks. The results indicate that the advantage of the DCNN over traditional classifiers in terms of accuracy is more obvious when these classifiers were applied within the proposed multi-view OBIA framework than within the traditional OBIA framework.
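    A sketch of the voting step of the multi-view scheme: each image object receives one class prediction per view and the final label is the majority vote; the per-view predictions below are placeholders.

      # Majority vote over per-view class predictions for each image object.
      from collections import Counter

      def vote(per_view_labels):
          """Return the most frequent label among the per-view predictions."""
          return Counter(per_view_labels).most_common(1)[0][0]

      object_predictions = {
          "obj_01": ["marsh", "marsh", "open water", "marsh", "marsh"],
          "obj_02": ["upland", "shrub", "upland", "upland", "shrub"],
      }
      for obj, labels in object_predictions.items():
          print(obj, "->", vote(labels))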

  4. A detailed procedure for the use of small-scale photography in land use classification

    NASA Technical Reports Server (NTRS)

    Vegas, P. L.

    1974-01-01

    A procedure developed to produce accurate land use maps from available high-altitude, small-scale photography in a cost-effective manner is presented. An alternative procedure, for use when the capability for updating the resultant land use map is not required, is also presented. The technical approach is discussed in detail, and personnel and equipment needs are analyzed. Accuracy percentages are listed, and costs are cited. The experimental land use classification categories are explained, and a proposed national land use classification system is recommended.

  5. Automatic breast density classification using a convolutional neural network architecture search procedure

    NASA Astrophysics Data System (ADS)

    Fonseca, Pablo; Mendoza, Julio; Wainer, Jacques; Ferrer, Jose; Pinto, Joseph; Guerrero, Jorge; Castaneda, Benjamin

    2015-03-01

    Breast parenchymal density is considered a strong indicator of breast cancer risk and therefore useful for preventive tasks. Measurement of breast density is often qualitative and requires the subjective judgment of radiologists. Here we explore an automatic breast composition classification workflow based on convolutional neural networks for feature extraction in combination with a support vector machines classifier. This is compared to the assessments of seven experienced radiologists. The experiments yielded an average kappa value of 0.58 when using the mode of the radiologists' classifications as ground truth. Individual radiologist performance against this ground truth yielded kappa values between 0.56 and 0.79.
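
    A minimal sketch of the general feature-extraction-plus-SVM idea is given below, assuming the CNN feature vectors (cnn_features) and density labels have already been produced elsewhere; it is not the paper's architecture-search procedure, and all arrays are placeholders.

      # Rough sketch: train an SVM on CNN-derived features (features assumed precomputed).
      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(1)
      cnn_features = rng.normal(size=(200, 256))     # placeholder CNN feature vectors, one row per mammogram
      density_labels = rng.integers(0, 4, size=200)  # placeholder breast-composition categories

      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
      scores = cross_val_score(clf, cnn_features, density_labels, cv=5)
      print(scores.mean())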

  6. The utility of the diagnosis of pedophilia: a comparison of various classification procedures.

    PubMed

    Kingston, Drew A; Firestone, Philip; Moulden, Heather M; Bradford, John M

    2007-06-01

    This study examined the utility of the diagnosis of pedophilia in a sample of extra-familial child molesters assessed at a university teaching hospital between 1982 and 1992. Pedophilia was defined in one of four ways: (1) DSM diagnosis made by a psychiatrist; (2) deviant phallometric profile; (3) DSM diagnosis and a deviant phallometric profile; and, (4) high scores based on the Screening Scale for Pedophilic Interest (Seto & Lalumière, 2001). Demographic data, psychological tests, and offence history were obtained and group differences were analyzed along with the ability of certain variables to contribute uniquely to the classification of pedophilia. Results indicated that few significant differences existed on psychological measures between pedophilic and nonpedophilic extra-familial child molesters regardless of the classification system employed. Finally, results indicated that the procedures used to define pedophilia were not significantly related to one another. Results are discussed in terms of the utility of the diagnosis of pedophilia.

  7. Progress toward the determination of correct classification rates in fire debris analysis.

    PubMed

    Waddell, Erin E; Song, Emma T; Rinke, Caitlin N; Williams, Mary R; Sigman, Michael E

    2013-07-01

    Principal components analysis (PCA), linear discriminant analysis (LDA), and quadratic discriminant analysis (QDA) were used to develop a multistep classification procedure for determining the presence of ignitable liquid residue in fire debris and assigning any ignitable liquid residue present into the classes defined under the American Society for Testing and Materials (ASTM) E 1618-10 standard method. A multistep classification procedure was tested by cross-validation based on model data sets comprised of the time-averaged mass spectra (also referred to as total ion spectra) of commercial ignitable liquids and pyrolysis products from common building materials and household furnishings (referred to simply as substrates). Fire debris samples from laboratory-scale and field test burns were also used to test the model. The optimal model's true-positive rate was 81.3% for cross-validation samples and 70.9% for fire debris samples. The false-positive rate was 9.9% for cross-validation samples and 8.9% for fire debris samples. © 2013 American Academy of Forensic Sciences.
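
    A hedged sketch of one element of such a workflow, PCA for dimensionality reduction followed by LDA classification of total ion spectra, is shown below with synthetic stand-in data; it is not the multistep ASTM-class model described above.

      # Sketch of a PCA + LDA step for spectra (synthetic data, illustrative only).
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(2)
      spectra = rng.random((300, 500))        # placeholder time-averaged (total ion) mass spectra
      labels = rng.integers(0, 2, size=300)   # 1 = ignitable liquid residue present, 0 = substrate only

      model = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
      print(cross_val_score(model, spectra, labels, cv=5).mean())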

  8. [Diagnosis-related groups as an instrument to develop suitable case-based lump sums in hematology and oncology].

    PubMed

    Thalheimer, Markus

    2011-01-01

    In 2003 a new reimbursement system was established for German hospitals. The approximately 17 million inpatient cases per year are now reimbursed based on a per-case payment regarding diagnoses and procedures, which was developed from an internationally approved system. The aim was a better conformity of costs and efforts in in-patient cases. In the first 2 years after implementation, the German diagnosis-related group (DRG) system was not able to adequately represent the complex structures of treatment in hematological and oncological in-patients. By creating new diagnoses and procedures (International Classification of Diseases 10 (ICD-10) and Surgical Operations and Procedures Classification System (OPS) catalogues), generating new DRGs and better splitting of existing ones, the hematology and oncology field could be much better described in the following years. The implementation of about 70 'co-payment structures' for new and expensive drugs and procedures in oncology was also crucial. To reimburse innovations, an additional system of co-payments for innovations was established to bridge the time until innovations are represented within the DRG system itself. In summary, hematological and oncological in-patients, including cases with extraordinary costs, are meanwhile well mapped in the German reimbursement system. Any tendencies to rationing could thereby be avoided, as most of the established procedures and costly drugs are adequately represented in the DRG system. Copyright © 2011 S. Karger AG, Basel.

  9. Classification of wheat: Badhwar profile similarity technique

    NASA Technical Reports Server (NTRS)

    Austin, W. W.

    1980-01-01

    The Badhwar profile similarity classification technique, used successfully for the classification of corn, was applied to spring wheat classification. The software programs and procedures used to generate full-scene classifications are presented, and numerical results of the acreage estimates are given.

  10. Satellite land use acquisition and applications to hydrologic planning models

    NASA Technical Reports Server (NTRS)

    Algazi, V. R.; Suk, M.

    1977-01-01

    A developing operational procedure for use by the Corps of Engineers in the acquisition of land use information for hydrologic planning purposes is described. The operational conditions preclude the use of dedicated, interactive image processing facilities. Given these constraints, an approach to land use classification based on clustering seems promising and was explored in detail. The procedure is outlined, and examples of its application to two watersheds are given.

  11. Comparing and Combining Dichotomous and Polytomous Items with SPRT Procedure in Computerized Classification Testing.

    ERIC Educational Resources Information Center

    Lau, C. Allen; Wang, Tianyou

    The purposes of this study were to: (1) extend the sequential probability ratio testing (SPRT) procedure to polytomous item response theory (IRT) models in computerized classification testing (CCT); (2) compare polytomous items with dichotomous items using the SPRT procedure for their accuracy and efficiency; (3) study a direct approach in…

  12. Automatic interpretation of ERTS data for forest management

    NASA Technical Reports Server (NTRS)

    Kirvida, L.; Johnson, G. R.

    1973-01-01

    Automatic stratification of forested land from ERTS-1 data provides a valuable tool for resource management. The results are useful for wood product yield estimates, recreation and wildlife management, forest inventory, and forest condition monitoring. Automatic procedures based on both multispectral and spatial features are evaluated. With five classes, and training and testing on the same samples, a classification accuracy of 74% was achieved using the MSS multispectral features; when texture computed from 8 x 8 arrays was added, a classification accuracy of 99% was obtained.

  13. Single-particle cryo-EM using alignment by classification (ABC): the structure of Lumbricus terrestris haemoglobin.

    PubMed

    Afanasyev, Pavel; Seer-Linnemayr, Charlotte; Ravelli, Raimond B G; Matadeen, Rishi; De Carlo, Sacha; Alewijnse, Bart; Portugal, Rodrigo V; Pannu, Navraj S; Schatz, Michael; van Heel, Marin

    2017-09-01

    Single-particle cryogenic electron microscopy (cryo-EM) can now yield near-atomic resolution structures of biological complexes. However, the reference-based alignment algorithms commonly used in cryo-EM suffer from reference bias, limiting their applicability (also known as the 'Einstein from random noise' problem). Low-dose cryo-EM therefore requires robust and objective approaches to reveal the structural information contained in the extremely noisy data, especially when dealing with small structures. A reference-free pipeline is presented for obtaining near-atomic resolution three-dimensional reconstructions from heterogeneous ('four-dimensional') cryo-EM data sets. The methodologies integrated in this pipeline include a posteriori camera correction, movie-based full-data-set contrast transfer function determination, movie-alignment algorithms, (Fourier-space) multivariate statistical data compression and unsupervised classification, 'random-startup' three-dimensional reconstructions, four-dimensional structural refinements and Fourier shell correlation criteria for evaluating anisotropic resolution. The procedures exclusively use information emerging from the data set itself, without external 'starting models'. Euler-angle assignments are performed by angular reconstitution rather than by the inherently slower projection-matching approaches. The comprehensive 'ABC-4D' pipeline is based on the two-dimensional reference-free 'alignment by classification' (ABC) approach, where similar images in similar orientations are grouped by unsupervised classification. Some fundamental differences between X-ray crystallography versus single-particle cryo-EM data collection and data processing are discussed. The structure of the giant haemoglobin from Lumbricus terrestris at a global resolution of ∼3.8 Å is presented as an example of the use of the ABC-4D procedure.

  14. 49 CFR 8.19 - Procedures for submitting and processing requests for classification reviews.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... classification determination made by another department or agency, the Committee will immediately consult with... for classification reviews. 8.19 Section 8.19 Transportation Office of the Secretary of Transportation CLASSIFIED INFORMATION: CLASSIFICATION/DECLASSIFICATION/ACCESS Classification/Declassification of Information...

  15. CW-SSIM kernel based random forest for image classification

    NASA Astrophysics Data System (ADS)

    Fan, Guangzhe; Wang, Zhou; Wang, Jiheng

    2010-07-01

    The complex wavelet structural similarity (CW-SSIM) index has been proposed as a powerful image similarity metric that is robust to translation, scaling and rotation of images, but how to employ it in image classification applications has not been deeply investigated. In this paper, we incorporate CW-SSIM as a kernel function into a random forest learning algorithm. This leads to a novel image classification approach that does not require a feature extraction or dimension reduction stage at the front end. We use hand-written digit recognition as an example to demonstrate our algorithm. We compare the performance of the proposed approach with random forest learning based on other kernels, including the widely adopted Gaussian and inner product kernels. Empirical evidence shows that the proposed method is superior in its classification power. We also compared the proposed approach with the direct random forest method without a kernel and with the popular kernel-learning method, the support vector machine. Our test results based on both simulated and real-world data suggest that the proposed approach outperforms traditional methods without requiring a feature selection procedure.
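
    The kernelized random forest described above has no off-the-shelf implementation; as a loose analogue, the sketch below plugs a precomputed pairwise similarity matrix into a support vector machine, with a placeholder image_similarity function standing in for a real CW-SSIM index.

      # Loose analogue only: precomputed similarity kernel fed to an SVM (not the paper's method).
      import numpy as np
      from sklearn.svm import SVC

      def image_similarity(a, b):
          # Placeholder similarity in (0, 1]; a real CW-SSIM index would be used here.
          return float(np.exp(-np.mean((a - b) ** 2)))

      rng = np.random.default_rng(3)
      train_imgs = rng.random((50, 8, 8))            # placeholder digit patches
      test_imgs = rng.random((10, 8, 8))
      train_labels = rng.integers(0, 10, size=50)    # placeholder digit labels

      K_train = np.array([[image_similarity(a, b) for b in train_imgs] for a in train_imgs])
      K_test = np.array([[image_similarity(a, b) for b in train_imgs] for a in test_imgs])

      clf = SVC(kernel="precomputed").fit(K_train, train_labels)
      pred = clf.predict(K_test)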

  16. Stroke localization and classification using microwave tomography with k-means clustering and support vector machine.

    PubMed

    Guo, Lei; Abbosh, Amin

    2018-05-01

    To give stroke patients a chance of survival, the stroke type must be classified so that medication can be given within a few hours of the onset of symptoms. In this paper, a microwave-based stroke localization and classification framework is proposed. It is based on microwave tomography, k-means clustering, and a support vector machine (SVM) method. The dielectric profile of the brain is first calculated using the Born iterative method, and the amplitude of the dielectric profile is then taken as the input to k-means clustering. The resulting cluster descriptors are used as the feature vector for constructing and testing the SVM. A database of MRI-derived realistic head phantoms at different signal-to-noise ratios is used in the classification procedure. The performance of the proposed framework is evaluated using the receiver operating characteristic (ROC) curve. The results based on a two-dimensional framework show that 88% classification accuracy, with a sensitivity of 91% and a specificity of 87%, can be achieved. Bioelectromagnetics. 39:312-324, 2018. © 2018 Wiley Periodicals, Inc.
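
    A greatly simplified sketch of the clustering-then-classification idea (omitting the tomographic reconstruction) is given below; the dielectric profiles, labels and cluster-feature definition are hypothetical stand-ins.

      # Sketch: k-means summarises each reconstructed dielectric profile; cluster centres feed an SVM.
      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.svm import SVC

      rng = np.random.default_rng(4)
      profiles = rng.random((120, 1024))      # placeholder dielectric-amplitude maps (flattened)
      stroke_type = rng.integers(0, 2, 120)   # placeholder labels, e.g. 0 = ischaemic, 1 = haemorrhagic

      def cluster_features(profile, k=4):
          km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(profile.reshape(-1, 1))
          return np.sort(km.cluster_centers_.ravel())   # sorted cluster centres as the feature vector

      X = np.array([cluster_features(p) for p in profiles])
      clf = SVC(kernel="rbf").fit(X, stroke_type)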

  17. Taxonomy of asteroids. [according to polarimetric, spectrophotometric, radiometric, and UBV photometric data

    NASA Technical Reports Server (NTRS)

    Bowell, E.; Chapman, C. R.; Gradie, J. C.; Zellner, B.; Morrison, D.

    1978-01-01

    A taxonomic system for asteroids is discussed which is based on seven directly observable parameters from polarimetry, spectrophotometry, radiometry, and UBV photometry. The classification scheme is entirely empirical and independent of specific mineralogical interpretations. Five broad classes (designated C, S, M, E, and R), as well as an 'unclassifiable' designation, are defined on the basis of observational data for 523 asteroids. Computer-generated type classifications and derived diameters are given for the 523 asteroids, and the application of the classification procedure is illustrated. Of the 523 asteroids classified, 190 are identified as C objects, 141 as S type, 13 as type M, three as type E, three as type R, 55 as unclassifiable, and 118 as ambiguous. The present taxonomic system is compared with several other asteroid classification systems.

  18. Does ASA classification impact success rates of endovascular aneurysm repairs?

    PubMed

    Conners, Michael S; Tonnessen, Britt H; Sternbergh, W Charles; Carter, Glen; Yoselevitz, Moises; Money, Samuel R

    2002-09-01

    The purpose of this study was to evaluate the technical success, clinical success, postoperative complication rate, need for a secondary procedure, and mortality rate with endovascular aneurysm repair (EAR), based on the physical status classification scheme advocated by the American Society of Anesthesiologists (ASA). At a single institution 167 patients underwent attempted EAR. Query of a prospectively maintained database supplemented with a retrospective review of medical records was used to gather statistics pertaining to patient demographics and outcome. In patients selected for EAR on the basis of acceptable anatomy, technical and clinical success rates were not significantly different among the different ASA classifications. Importantly, postoperative complication and 30-day mortality rates do not appear to significantly differ among the different ASA classifications in this patient population.

  19. Using Landsat MSS data with soils information to identify wetland habitats

    NASA Technical Reports Server (NTRS)

    Ernst, C. L.; Hoffer, R. M.

    1981-01-01

    A previous study showed that certain fresh water wetland vegetation types can be spectrally separated when a maximum likelihood classification procedure is applied to Landsat spectral data. However, wetland and upland types which have similar vegetative life forms (e.g., upland hardwoods and hardwood swamps) are often confused because of spectral similarity. Therefore, the current investigation attempts to differentiate similar wetland and upland types by combining Landsat multispectral scanner (MSS) data with soils information. The Pigeon River area in northern Indiana used in the earlier study was also employed in this investigation. A layered classification algorithm which combined soils and spectral data was used to generate a wetland classification. The results of the spectral/soils wetland classification are compared to the previous classification that had been based on spectral data alone. The results indicate wetland habitat mapping can be improved by combining soils and other ancillary data with Landsat spectral data.

  20. 43 CFR 2461.2 - Classifications.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Classifications. 2461.2 Section 2461.2..., DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) BUREAU INITIATED CLASSIFICATION SYSTEM Multiple-Use Classification Procedures § 2461.2 Classifications. Not less than 60 days after publication of the...

  1. 43 CFR 2461.2 - Classifications.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 43 Public Lands: Interior 2 2014-10-01 2014-10-01 false Classifications. 2461.2 Section 2461.2..., DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) BUREAU INITIATED CLASSIFICATION SYSTEM Multiple-Use Classification Procedures § 2461.2 Classifications. Not less than 60 days after publication of the...

  2. 43 CFR 2461.2 - Classifications.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false Classifications. 2461.2 Section 2461.2..., DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) BUREAU INITIATED CLASSIFICATION SYSTEM Multiple-Use Classification Procedures § 2461.2 Classifications. Not less than 60 days after publication of the...

  3. 43 CFR 2461.2 - Classifications.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false Classifications. 2461.2 Section 2461.2..., DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) BUREAU INITIATED CLASSIFICATION SYSTEM Multiple-Use Classification Procedures § 2461.2 Classifications. Not less than 60 days after publication of the...

  4. Comparative Performance Analysis of Support Vector Machine, Random Forest, Logistic Regression and k-Nearest Neighbours in Rainbow Trout (Oncorhynchus Mykiss) Classification Using Image-Based Features

    PubMed Central

    Císař, Petr; Labbé, Laurent; Souček, Pavel; Pelissier, Pablo; Kerneis, Thierry

    2018-01-01

    The main aim of this study was to develop a new objective method for evaluating the impacts of different diets on live fish skin using image-based features. In total, one hundred and sixty rainbow trout (Oncorhynchus mykiss) were fed either a fish-meal based diet (80 fish) or a 100% plant-based diet (80 fish) and photographed using a consumer-grade digital camera. Twenty-three colour features and four texture features were extracted. Four different classification methods were used to evaluate fish diets: Random forest (RF), Support vector machine (SVM), Logistic regression (LR) and k-Nearest neighbours (k-NN). The SVM with a radial basis kernel provided the best classifier, with a correct classification rate (CCR) of 82% and a Kappa coefficient of 0.65. Although both the LR and RF methods were less accurate than the SVM, they achieved good classification with CCRs of 75% and 70%, respectively. The k-NN was the least accurate (40%) classification model. Overall, it can be concluded that consumer-grade digital cameras could be employed as fast, accurate and non-invasive sensors for classifying rainbow trout based on their diets. Furthermore, there was a close association between image-based features and the fish diet received during cultivation. These procedures can be used as non-invasive, accurate and precise approaches for monitoring fish status during cultivation by evaluating a diet's effects on fish skin. PMID:29596375

  5. Comparative Performance Analysis of Support Vector Machine, Random Forest, Logistic Regression and k-Nearest Neighbours in Rainbow Trout (Oncorhynchus Mykiss) Classification Using Image-Based Features.

    PubMed

    Saberioon, Mohammadmehdi; Císař, Petr; Labbé, Laurent; Souček, Pavel; Pelissier, Pablo; Kerneis, Thierry

    2018-03-29

    The main aim of this study was to develop a new objective method for evaluating the impacts of different diets on live fish skin using image-based features. In total, one hundred and sixty rainbow trout (Oncorhynchus mykiss) were fed either a fish-meal based diet (80 fish) or a 100% plant-based diet (80 fish) and photographed using a consumer-grade digital camera. Twenty-three colour features and four texture features were extracted. Four different classification methods were used to evaluate fish diets: Random forest (RF), Support vector machine (SVM), Logistic regression (LR) and k-Nearest neighbours (k-NN). The SVM with a radial basis kernel provided the best classifier, with a correct classification rate (CCR) of 82% and a Kappa coefficient of 0.65. Although both the LR and RF methods were less accurate than the SVM, they achieved good classification with CCRs of 75% and 70%, respectively. The k-NN was the least accurate (40%) classification model. Overall, it can be concluded that consumer-grade digital cameras could be employed as fast, accurate and non-invasive sensors for classifying rainbow trout based on their diets. Furthermore, there was a close association between image-based features and the fish diet received during cultivation. These procedures can be used as non-invasive, accurate and precise approaches for monitoring fish status during cultivation by evaluating a diet's effects on fish skin.

  6. [Complex surgical procedures in orthopedics and trauma surgery. A contribution to the proposal procedure for the DRG system in 2009].

    PubMed

    Flohé, S; Nabring, J; Luetkes, P; Nast-Kolb, D; Windolf, J

    2008-10-01

    Since the DRG system was introduced in 2003/2004 the system for remuneration has been continually modified in conjunction with input from specialized medical associations. As part of this development of the payment system, the criteria for classification of a diagnosis-related group were further expanded and new functions were added. This contribution addresses the importance of the complex surgical procedures as criteria for subdivision of the DRG case-based lump sums in orthopedics and trauma surgery.

  7. The developmental processes for NANDA International Nursing Diagnoses.

    PubMed

    Scroggins, Leann M

    2008-01-01

    This study aims to provide a step-by-step procedural guideline for the development of a nursing diagnosis that meets the necessary criteria for inclusion in the NANDA International and NNN classification systems. The guideline is based on the processes developed by the Diagnosis Development Committee of NANDA International and includes the necessary processes for development of Actual, Wellness, Health Promotion, and Risk nursing diagnoses. Definitions of Actual, Wellness, Health Promotion, and Risk nursing diagnoses along with inclusion criteria and taxonomy rules have been incorporated into the guideline to streamline the development and review processes for submitted diagnoses. A step-by-step procedural guideline will assist the submitter to move efficiently and effectively through the submission process, resulting in increased submissions and enhancement of the NANDA International and NNN classification systems.

  8. Adverse events following cervical manipulative therapy: consensus on classification among Dutch medical specialists, manual therapists, and patients.

    PubMed

    Kranenburg, Hendrikus A; Lakke, Sandra E; Schmitt, Maarten A; Van der Schans, Cees P

    2017-12-01

    The aim was to obtain consensus-based agreement on a classification system of adverse events (AE) following cervical spinal manipulation. The classification system should comprise clear definitions, include patients' and clinicians' perspectives, and have an acceptable number of categories. Design: A three-round Delphi study. Participants: Thirty Dutch participants (medical specialists, manual therapists, and patients) took part in an online survey. Procedure: Participants inventoried AE and were asked about their preferences for either a three- or a four-category classification system. The identified AE were classified by two analysts following the International Classification of Functioning, Disability and Health (ICF) and the International Classification of Diseases and Related Health Problems (ICD-10). Participants were asked to classify the severity of all AE in relation to time duration. Consensus was reached on a three-category classification system. There was strong consensus for 16 AE across all severities (no, minor, and major AE) and all three time durations (hours, days, weeks). The 16 AE included anxiety, flushing, skin rash, fainting, dizziness, coma, altered sensation, muscle tenderness, pain, increased pain during movement, radiating pain, dislocation, fracture, transient ischemic attack, stroke, and death. Mild to strong consensus was reached for 13 AE. A consensus-based classification system of AE is established which includes patients' and clinicians' perspectives and has three categories. The classification comprises a precise description of potential AE in accordance with internationally accepted classifications. After international validation, clinicians and researchers may use this AE classification system to report AE in clinical practice and research.

  9. 43 CFR 2450.4 - Protests: Initial classification decision.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Protests: Initial classification decision... CLASSIFICATION SYSTEM Petition-Application Procedures § 2450.4 Protests: Initial classification decision. (a) For a period of 30 days after the proposed classification decision has been served upon the parties...

  10. Optimal statistical damage detection and classification in an experimental wind turbine blade using minimum instrumentation

    NASA Astrophysics Data System (ADS)

    Hoell, Simon; Omenzetter, Piotr

    2017-04-01

    The increasing demand for carbon-neutral energy in a challenging economic environment is a driving factor for erecting ever larger wind turbines in harsh environments, using novel wind turbine blade (WTB) designs characterized by high flexibility and lower buckling capacity. To counteract the resulting increase in operation and maintenance costs, efficient structural health monitoring systems can be employed to prevent dramatic failures and to schedule maintenance actions according to the true structural state. This paper presents a novel methodology for classifying structural damage using vibrational responses from a single sensor. The method is based on statistical classification using Bayes' theorem and an advanced statistic, which allows the performance to be controlled by varying the number of samples that represent the current state. This is done for multivariate damage-sensitive features (DSFs) defined as partial autocorrelation coefficients (PACCs) estimated from vibrational responses and principal component analysis scores derived from the PACCs. Additionally, optimal DSFs are composed not only for damage classification but also for damage detection based on binary statistical hypothesis testing, where feature selections are found with a fast forward-selection procedure. The method is applied to laboratory experiments with a small-scale WTB with wind-like excitation and non-destructive damage scenarios. The obtained results demonstrate the advantages of the proposed procedure and are promising for future applications of vibration-based structural health monitoring of WTBs.
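
    Under stated assumptions, the feature pipeline can be sketched as follows: partial autocorrelation coefficients extracted from each vibration record are compressed with PCA and passed to a simple Gaussian naive Bayes classifier, used here only as a stand-in for the paper's Bayes-theorem-based statistic; the signals and labels are synthetic.

      # Sketch: PACC features -> PCA -> simple Bayesian classifier (illustrative stand-in).
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.naive_bayes import GaussianNB
      from statsmodels.tsa.stattools import pacf

      rng = np.random.default_rng(5)
      signals = rng.normal(size=(60, 2048))       # placeholder vibration responses from one sensor
      damage_state = rng.integers(0, 3, 60)       # placeholder damage classes

      def pacc_features(x, nlags=20):
          return pacf(x, nlags=nlags)[1:]         # drop lag 0, which is always 1

      X = np.array([pacc_features(s) for s in signals])
      X = PCA(n_components=5).fit_transform(X)
      clf = GaussianNB().fit(X, damage_state)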

  11. Use of feature extraction techniques for the texture and context information in ERTS imagery: Spectral and textural processing of ERTS imagery. [classification of Kansas land use

    NASA Technical Reports Server (NTRS)

    Haralick, R. H. (Principal Investigator); Bosley, R. J.

    1974-01-01

    The author has identified the following significant results. A procedure was developed to extract cross-band textural features from ERTS MSS imagery. Evolving from a single image texture extraction procedure which uses spatial dependence matrices to measure relative co-occurrence of nearest neighbor grey tones, the cross-band texture procedure uses the distribution of neighboring grey tone N-tuple differences to measure the spatial interrelationships, or co-occurrences, of the grey tone N-tuples present in a texture pattern. In both procedures, texture is characterized in such a way as to be invariant under linear grey tone transformations. However, the cross-band procedure complements the single image procedure by extracting texture information and spectral information contained in ERTS multi-images. Classification experiments show that when used alone, without spectral processing, the cross-band texture procedure extracts more information than the single image texture analysis. Results show an improvement in average correct classification from 86.2% to 88.8% for ERTS image no. 1021-16333 with the cross-band texture procedure. However, when used together with spectral features, the single image texture plus spectral features perform better than the cross-band texture plus spectral features, with an average correct classification of 93.8% and 91.6%, respectively.
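
    A present-day analogue of the single-image part of this idea, grey-level co-occurrence (spatial dependence) features computed over 8 x 8 windows, can be sketched with scikit-image as below; it does not reproduce the cross-band N-tuple procedure, and the input band is synthetic.

      # Sketch: co-occurrence texture features per 8 x 8 window (recent scikit-image assumed).
      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      rng = np.random.default_rng(6)
      band = rng.integers(0, 16, size=(64, 64), dtype=np.uint8)   # one quantised MSS band (placeholder)

      def window_texture(window):
          glcm = graycomatrix(window, distances=[1], angles=[0], levels=16, symmetric=True, normed=True)
          return [graycoprops(glcm, prop)[0, 0] for prop in ("contrast", "homogeneity", "energy")]

      features = [window_texture(band[r:r + 8, c:c + 8])
                  for r in range(0, 64, 8) for c in range(0, 64, 8)]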

  12. Object-based locust habitat mapping using high-resolution multispectral satellite data in the southern Aral Sea basin

    NASA Astrophysics Data System (ADS)

    Navratil, Peter; Wilps, Hans

    2013-01-01

    Three different object-based image classification techniques are applied to high-resolution satellite data for the mapping of the habitats of Asian migratory locust (Locusta migratoria migratoria) in the southern Aral Sea basin, Uzbekistan. A set of panchromatic and multispectral Système Pour l'Observation de la Terre-5 satellite images was spectrally enhanced by normalized difference vegetation index and tasseled cap transformation and segmented into image objects, which were then classified by three different classification approaches: a rule-based hierarchical fuzzy threshold (HFT) classification method was compared to a supervised nearest neighbor classifier and classification tree analysis by the quick, unbiased, efficient statistical trees algorithm. Special emphasis was laid on the discrimination of locust feeding and breeding habitats due to the significance of this discrimination for practical locust control. Field data on vegetation and land cover, collected at the time of satellite image acquisition, was used to evaluate classification accuracy. The results show that a robust HFT classifier outperformed the two automated procedures by 13% overall accuracy. The classification method allowed a reliable discrimination of locust feeding and breeding habitats, which is of significant importance for the application of the resulting data for an economically and environmentally sound control of locust pests because exact spatial knowledge on the habitat types allows a more effective surveying and use of pesticides.

  13. Weed Mapping in Early-Season Maize Fields Using Object-Based Analysis of Unmanned Aerial Vehicle (UAV) Images

    PubMed Central

    Peña, José Manuel; Torres-Sánchez, Jorge; de Castro, Ana Isabel; Kelly, Maggi; López-Granados, Francisca

    2013-01-01

    The use of remote imagery captured by unmanned aerial vehicles (UAV) has tremendous potential for designing detailed site-specific weed control treatments in early post-emergence, which was not possible previously with conventional airborne or satellite images. A robust and entirely automatic object-based image analysis (OBIA) procedure was developed on a series of UAV images using a six-band multispectral camera (visible and near-infrared range) with the ultimate objective of generating a weed map in an experimental maize field in Spain. The OBIA procedure combines several contextual, hierarchical and object-based features and consists of three consecutive phases: 1) classification of crop rows by application of a dynamic and auto-adaptive classification approach, 2) discrimination of crops and weeds on the basis of their relative positions with reference to the crop rows, and 3) generation of a weed infestation map in a grid structure. The estimation of weed coverage from the image analysis yielded satisfactory results. The relationship of estimated versus observed weed densities had a coefficient of determination of r² = 0.89 and a root mean square error of 0.02. A map of three categories of weed coverage was produced with an overall accuracy of 86%. In the experimental field, the area free of weeds was 23%, and the area with low weed coverage (<5% weeds) was 47%, which indicated a high potential for reducing herbicide application or other weed operations. The OBIA procedure computes multiple data and statistics derived from the classification outputs, which permits calculation of herbicide requirements and estimation of the overall cost of weed management operations in advance. PMID:24146963

  14. Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (UAV) images.

    PubMed

    Peña, José Manuel; Torres-Sánchez, Jorge; de Castro, Ana Isabel; Kelly, Maggi; López-Granados, Francisca

    2013-01-01

    The use of remote imagery captured by unmanned aerial vehicles (UAV) has tremendous potential for designing detailed site-specific weed control treatments in early post-emergence, which was not possible previously with conventional airborne or satellite images. A robust and entirely automatic object-based image analysis (OBIA) procedure was developed on a series of UAV images using a six-band multispectral camera (visible and near-infrared range) with the ultimate objective of generating a weed map in an experimental maize field in Spain. The OBIA procedure combines several contextual, hierarchical and object-based features and consists of three consecutive phases: 1) classification of crop rows by application of a dynamic and auto-adaptive classification approach, 2) discrimination of crops and weeds on the basis of their relative positions with reference to the crop rows, and 3) generation of a weed infestation map in a grid structure. The estimation of weed coverage from the image analysis yielded satisfactory results. The relationship of estimated versus observed weed densities had a coefficient of determination of r² = 0.89 and a root mean square error of 0.02. A map of three categories of weed coverage was produced with an overall accuracy of 86%. In the experimental field, the area free of weeds was 23%, and the area with low weed coverage (<5% weeds) was 47%, which indicated a high potential for reducing herbicide application or other weed operations. The OBIA procedure computes multiple data and statistics derived from the classification outputs, which permits calculation of herbicide requirements and estimation of the overall cost of weed management operations in advance.

  15. Application of visible and near-infrared spectroscopy to classification of Miscanthus species

    DOE PAGES

    Jin, Xiaoli; Chen, Xiaoling; Xiao, Liang; ...

    2017-04-03

    Here, the feasibility of visible and near infrared (NIR) spectroscopy as a tool to classify Miscanthus samples was explored. Three types of Miscanthus plants, namely M. sinensis, M. sacchariflorus and M. floridulus, were analyzed using a NIR spectrophotometer. Several classification models based on the NIR spectral data were developed using linear discriminant analysis (LDA), partial least squares (PLS), least squares support vector machine regression (LSSVR), radial basis functions (RBF) and a neural network (NN). Principal component analysis (PCA) gave only a rough classification with overlapping samples, while the Line_LSSVR, RBF_LSSVR and RBF_NN models gave almost the same calibration and validation results. Because Line_LSSVR is faster than RBF_LSSVR and RBF_NN, the Line_LSSVR model was selected as a representative. In our study, the Line_LSSVR model showed higher accuracy than the LDA and PLS models. Total correct classification rates of 87.79% and 96.51% were observed for the LDA and PLS models in the testing set, respectively, while Line_LSSVR showed a total correct classification rate of 99.42%. In the testing set, the Line_LSSVR model showed correct classification rates of 100%, 100% and 96.77% for M. sinensis, M. sacchariflorus and M. floridulus, respectively, assigning 99.42% of samples to the right groups and misclassifying only one M. floridulus sample. The results demonstrated that NIR spectra combined with a preliminary morphological classification could be an effective and reliable procedure for the classification of Miscanthus species.

  16. Application of visible and near-infrared spectroscopy to classification of Miscanthus species

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Xiaoli; Chen, Xiaoling; Xiao, Liang

    Here, the feasibility of visible and near infrared (NIR) spectroscopy as a tool to classify Miscanthus samples was explored. Three types of Miscanthus plants, namely M. sinensis, M. sacchariflorus and M. floridulus, were analyzed using a NIR spectrophotometer. Several classification models based on the NIR spectral data were developed using linear discriminant analysis (LDA), partial least squares (PLS), least squares support vector machine regression (LSSVR), radial basis functions (RBF) and a neural network (NN). Principal component analysis (PCA) gave only a rough classification with overlapping samples, while the Line_LSSVR, RBF_LSSVR and RBF_NN models gave almost the same calibration and validation results. Because Line_LSSVR is faster than RBF_LSSVR and RBF_NN, the Line_LSSVR model was selected as a representative. In our study, the Line_LSSVR model showed higher accuracy than the LDA and PLS models. Total correct classification rates of 87.79% and 96.51% were observed for the LDA and PLS models in the testing set, respectively, while Line_LSSVR showed a total correct classification rate of 99.42%. In the testing set, the Line_LSSVR model showed correct classification rates of 100%, 100% and 96.77% for M. sinensis, M. sacchariflorus and M. floridulus, respectively, assigning 99.42% of samples to the right groups and misclassifying only one M. floridulus sample. The results demonstrated that NIR spectra combined with a preliminary morphological classification could be an effective and reliable procedure for the classification of Miscanthus species.

  17. Application of visible and near-infrared spectroscopy to classification of Miscanthus species.

    PubMed

    Jin, Xiaoli; Chen, Xiaoling; Xiao, Liang; Shi, Chunhai; Chen, Liang; Yu, Bin; Yi, Zili; Yoo, Ji Hye; Heo, Kweon; Yu, Chang Yeon; Yamada, Toshihiko; Sacks, Erik J; Peng, Junhua

    2017-01-01

    The feasibility of visible and near infrared (NIR) spectroscopy as a tool to classify Miscanthus samples was explored in this study. Three types of Miscanthus plants, namely M. sinensis, M. sacchariflorus and M. floridulus, were analyzed using a NIR spectrophotometer. Several classification models based on the NIR spectral data were developed using linear discriminant analysis (LDA), partial least squares (PLS), least squares support vector machine regression (LSSVR), radial basis functions (RBF) and a neural network (NN). Principal component analysis (PCA) gave only a rough classification with overlapping samples, while the Line_LSSVR, RBF_LSSVR and RBF_NN models gave almost the same calibration and validation results. Because Line_LSSVR is faster than RBF_LSSVR and RBF_NN, the Line_LSSVR model was selected as a representative. In our study, the Line_LSSVR model showed higher accuracy than the LDA and PLS models. Total correct classification rates of 87.79% and 96.51% were observed for the LDA and PLS models in the testing set, respectively, while Line_LSSVR showed a total correct classification rate of 99.42%. In the testing set, the Line_LSSVR model showed correct classification rates of 100%, 100% and 96.77% for M. sinensis, M. sacchariflorus and M. floridulus, respectively, assigning 99.42% of samples to the right groups and misclassifying only one M. floridulus sample. The results demonstrated that NIR spectra combined with a preliminary morphological classification could be an effective and reliable procedure for the classification of Miscanthus species.

  18. Application of visible and near-infrared spectroscopy to classification of Miscanthus species

    PubMed Central

    Shi, Chunhai; Chen, Liang; Yu, Bin; Yi, Zili; Yoo, Ji Hye; Heo, Kweon; Yu, Chang Yeon; Yamada, Toshihiko; Sacks, Erik J.; Peng, Junhua

    2017-01-01

    The feasibility of visible and near infrared (NIR) spectroscopy as a tool to classify Miscanthus samples was explored in this study. Three types of Miscanthus plants, namely M. sinensis, M. sacchariflorus and M. floridulus, were analyzed using a NIR spectrophotometer. Several classification models based on the NIR spectral data were developed using linear discriminant analysis (LDA), partial least squares (PLS), least squares support vector machine regression (LSSVR), radial basis functions (RBF) and a neural network (NN). Principal component analysis (PCA) gave only a rough classification with overlapping samples, while the Line_LSSVR, RBF_LSSVR and RBF_NN models gave almost the same calibration and validation results. Because Line_LSSVR is faster than RBF_LSSVR and RBF_NN, the Line_LSSVR model was selected as a representative. In our study, the Line_LSSVR model showed higher accuracy than the LDA and PLS models. Total correct classification rates of 87.79% and 96.51% were observed for the LDA and PLS models in the testing set, respectively, while Line_LSSVR showed a total correct classification rate of 99.42%. In the testing set, the Line_LSSVR model showed correct classification rates of 100%, 100% and 96.77% for M. sinensis, M. sacchariflorus and M. floridulus, respectively, assigning 99.42% of samples to the right groups and misclassifying only one M. floridulus sample. The results demonstrated that NIR spectra combined with a preliminary morphological classification could be an effective and reliable procedure for the classification of Miscanthus species. PMID:28369059
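
    The LDA baseline reported in the Miscanthus studies above can be sketched as follows with synthetic spectra; the LSSVR models would need a dedicated implementation and are not shown, and the arrays and split are illustrative only.

      # Sketch of an LDA species classifier on NIR spectra (synthetic stand-in data).
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(7)
      spectra = rng.random((180, 700))     # placeholder absorbance spectra
      species = rng.integers(0, 3, 180)    # 0 = M. sinensis, 1 = M. sacchariflorus, 2 = M. floridulus

      X_tr, X_te, y_tr, y_te = train_test_split(spectra, species, test_size=0.3, random_state=0)
      lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
      print(lda.score(X_te, y_te))         # total correct classification rate on the test set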

  19. On the Implementation of a Land Cover Classification System for SAR Images Using Khoros

    NASA Technical Reports Server (NTRS)

    Medina Revera, Edwin J.; Espinosa, Ramon Vasquez

    1997-01-01

    The Synthetic Aperture Radar (SAR) sensor is widely used to record data about the ground under all atmospheric conditions. SAR-acquired images have very good resolution, which necessitates the development of a classification system that processes the SAR images to extract useful information for different applications. In this work, a complete system for land cover classification was designed and programmed using Khoros, a data-flow visual language environment, taking full advantage of the polymorphic data services that it provides. Image analysis was applied to SAR images to improve and automate the recognition and classification of different regions such as mountains and lakes. Both unsupervised and supervised classification utilities were used. The unsupervised classification routines included several classification/clustering algorithms such as K-means, ISO2, Weighted Minimum Distance, and the Localized Receptive Field (LRF) training/classifier. Different texture analysis approaches such as invariant moments, fractal dimension and second-order statistics were implemented for supervised classification of the images. The results and conclusions for SAR image classification using the various unsupervised and supervised procedures are presented based on their accuracy and performance.

  20. EUCLID: automatic classification of proteins in functional classes by their database annotations.

    PubMed

    Tamames, J; Ouzounis, C; Casari, G; Sander, C; Valencia, A

    1998-01-01

    A tool is described for the automatic classification of sequences into functional classes using their database annotations. The Euclid system is based on a simple learning procedure from examples provided by human experts. Euclid is freely available for academics at http://www.gredos.cnb.uam.es/EUCLID, with the corresponding dictionaries for the generation of three, eight and 14 functional classes. E-mail: valencia@cnb.uam.es. The results of the EUCLID classification of different genomes are available at http://www.sander.ebi.ac.uk/genequiz/. A detailed description of the different applications mentioned in the text is available at http://www.gredos.cnb.uam.es/EUCLID/Full_Paper.

  1. Nonlinear, non-stationary image processing technique for eddy current NDE

    NASA Astrophysics Data System (ADS)

    Yang, Guang; Dib, Gerges; Kim, Jaejoon; Zhang, Lu; Xin, Junjun; Udpa, Lalita

    2012-05-01

    Automatic analysis of eddy current (EC) data has facilitated the analysis of the large volumes of data generated in the inspection of steam generator tubes in nuclear power plants. The traditional procedure for the analysis of EC data includes data calibration, pre-processing, region of interest (ROI) detection, feature extraction and classification. Accurate ROI detection is enhanced by pre-processing, which involves reducing noise and other undesirable components as well as enhancing defect indications in the raw measurement. This paper presents the Hilbert-Huang Transform (HHT) for feature extraction and a support vector machine (SVM) for classification. The performance is shown to be significantly better than that of the existing rule-based classification approach used in industry.
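
    A heavily simplified sketch is given below: instead of a full Hilbert-Huang transform, the Hilbert envelope of each eddy-current trace supplies a few summary features for an SVM; traces and labels are synthetic stand-ins.

      # Simplified sketch: Hilbert-envelope features (not a full HHT) classified with an SVM.
      import numpy as np
      from scipy.signal import hilbert
      from sklearn.svm import SVC

      rng = np.random.default_rng(8)
      traces = rng.normal(size=(80, 512))     # placeholder EC signals from candidate ROIs
      is_defect = rng.integers(0, 2, 80)      # placeholder analyst labels

      def envelope_features(x):
          env = np.abs(hilbert(x))            # analytic-signal envelope
          return [env.max(), env.mean(), env.std()]

      X = np.array([envelope_features(t) for t in traces])
      clf = SVC(kernel="rbf").fit(X, is_defect)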

  2. 43 CFR 2462.1 - Publication of notice of, and public hearings on, proposed classification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... hearings on, proposed classification. 2462.1 Section 2462.1 Public Lands: Interior Regulations Relating to... (2000) BUREAU INITIATED CLASSIFICATION SYSTEM Disposal Classification Procedure: Over 2,560 Acres § 2462.1 Publication of notice of, and public hearings on, proposed classification. The authorized officer...

  3. 43 CFR 2462.2 - Publication of notice of classification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Publication of notice of classification... CLASSIFICATION SYSTEM Disposal Classification Procedure: Over 2,560 Acres § 2462.2 Publication of notice of classification. After having considered the comments received as the result of publication, the authorized...

  4. Feasibility and validity of International Classification of Diseases based case mix indices.

    PubMed

    Yang, Che-Ming; Reinke, William

    2006-10-06

    Severity of illness is an omnipresent confounder in health services research. Resource consumption can be applied as a proxy of severity. The most commonly cited hospital resource consumption measure is the case mix index (CMI) and the best-known illustration of the CMI is the Diagnosis Related Group (DRG) CMI used by Medicare in the U.S. For countries that do not have DRG type CMIs, the adjustment for severity has been troublesome for either reimbursement or research purposes. The research objective of this study is to ascertain the construct validity of CMIs derived from International Classification of Diseases (ICD) in comparison with DRG CMI. The study population included 551 acute care hospitals in Taiwan and 2,462,006 inpatient reimbursement claims. The 18th version of GROUPER, the Medicare DRG classification software, was applied to Taiwan's 1998 National Health Insurance (NHI) inpatient claim data to derive the Medicare DRG CMI. The same weighting principles were then applied to determine the ICD principal diagnoses and procedures based costliness and length of stay (LOS) CMIs. Further analyses were conducted based on stratifications according to teaching status, accreditation levels, and ownership categories. The best ICD-based substitute for the DRG costliness CMI (DRGCMI) is the ICD principal diagnosis costliness CMI (ICDCMI-DC) in general and in most categories with Spearman's correlation coefficients ranging from 0.938-0.462. The highest correlation appeared in the non-profit sector. ICD procedure costliness CMI (ICDCMI-PC) outperformed ICDCMI-DC only at the medical center level, which consists of tertiary care hospitals and is more procedure intensive. The results of our study indicate that an ICD-based CMI can quite fairly approximate the DRGCMI, especially ICDCMI-DC. Therefore, substituting ICDs for DRGs in computing the CMI ought to be feasible and valid in countries that have not implemented DRGs.

  5. The impact of ICD-9 revascularization procedure codes on estimates of racial disparities in ischemic stroke.

    PubMed

    Boan, Andrea D; Voeks, Jenifer H; Feng, Wuwei Wayne; Bachman, David L; Jauch, Edward C; Adams, Robert J; Ovbiagele, Bruce; Lackland, Daniel T

    2014-01-01

    The use of International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9) diagnostic codes can identify racial disparities in ischemic stroke hospitalizations; however, inclusion of revascularization procedure codes as acute stroke events may affect the magnitude of the risk difference. This study assesses the impact of excluding revascularization procedure codes in the ICD-9 definition of ischemic stroke, compared with the traditional inclusive definition, on racial disparity estimates for stroke incidence and recurrence. Patients discharged with a diagnosis of ischemic stroke (ICD-9 codes 433.00-434.91 and 436) were identified from a statewide inpatient discharge database from 2010 to 2012. Race-age specific disparity estimates of stroke incidence and recurrence and 1-year cumulative recurrent stroke rates were compared between the routinely used traditional classification and a modified classification of stroke that excluded primary ICD-9 cerebral revascularization procedures codes (38.12, 00.61, and 00.63). The traditional classification identified 7878 stroke hospitalizations, whereas the modified classification resulted in 18% fewer hospitalizations (n = 6444). The age-specific black to white rate ratios were significantly higher in the modified than in the traditional classification for stroke incidence (rate ratio, 1.50; 95% confidence interval [CI], 1.43-1.58 vs. rate ratio, 1.24; 95% CI, 1.18-1.30, respectively). In whites, the 1-year cumulative recurrence rate was significantly reduced by 46% (45-64 years) and 49% (≥ 65 years) in the modified classification, largely explained by a higher rate of cerebral revascularization procedures among whites. There were nonsignificant reductions of 14% (45-64 years) and 19% (≥ 65 years) among blacks. Including cerebral revascularization procedure codes overestimates hospitalization rates for ischemic stroke and significantly underestimates the racial disparity estimates in stroke incidence and recurrence. Copyright © 2014 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  6. Development of an automated ultrasonic testing system

    NASA Astrophysics Data System (ADS)

    Shuxiang, Jiao; Wong, Brian Stephen

    2005-04-01

    Non-Destructive Testing is necessary in areas where defects in structures emerge over time due to wear and tear and where structural integrity must be maintained. However, manual testing has many limitations: high training cost, a long training procedure and, worse, inconsistent test results. A prime objective of this project is to develop an automatic Non-Destructive testing system for a shaft of the wheel axle of a railway carriage. Various methods, such as neural networks, pattern recognition methods and knowledge-based systems, can be used for this artificial intelligence problem. In this paper, a statistical pattern recognition approach, the classification tree, is applied. Before feature selection, a thorough study of the ultrasonic signals produced was carried out. Based on this analysis, three signal processing methods were developed to enhance the ultrasonic signals: cross-correlation, zero-phase filtering and averaging. The aim of this step is to reduce the noise and make the signal characteristics more distinguishable. Four features are selected: (1) autoregressive model coefficients, (2) standard deviation, (3) Pearson correlation and (4) dispersion uniformity degree. A classification tree is then created and applied to recognize the peak positions and amplitudes. A local-maximum search is carried out before feature computation; this reduces computation time considerably in real-time testing. Based on this algorithm, a software package called SOFRA was developed to recognize the peaks, calibrate automatically and test a simulated shaft automatically.
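
    A hedged sketch of the feature-plus-tree idea follows: autoregressive coefficients obtained by least squares, together with the signal's standard deviation, feed a classification tree; the A-scans, labels and feature choices are illustrative, not SOFRA's.

      # Sketch: AR coefficients + standard deviation as features for a classification tree.
      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(9)
      a_scans = rng.normal(size=(100, 400))   # placeholder ultrasonic A-scan segments
      is_flaw = rng.integers(0, 2, 100)       # placeholder labels (flaw echo vs. geometry echo)

      def ar_coefficients(x, order=4):
          # Fit x[t] ~ a1*x[t-1] + ... + a_order*x[t-order] by ordinary least squares.
          X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
          y = x[order:]
          coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
          return coeffs

      features = np.array([np.r_[ar_coefficients(s), s.std()] for s in a_scans])
      tree = DecisionTreeClassifier(max_depth=4).fit(features, is_flaw)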

  7. 28 CFR 524.73 - Classification procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... that the affected inmate is notified in writing as promptly as possible of the classification and the...) Central Office Inmate Monitoring Section—reviews classification decisions for all future separation... involving Witness Security cases. (2) Regional Office—reviews CIM classification decisions for Disruptive...

  8. 18 CFR 1301.65 - Derivative classification.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... classification. 1301.65 Section 1301.65 Conservation of Power and Water Resources TENNESSEE VALLEY AUTHORITY PROCEDURES Protection of National Security Classified Information § 1301.65 Derivative classification. (a) In... classified, and the marking of newly developed material consistent with the classification markings that...

  9. 18 CFR 1301.65 - Derivative classification.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... classification. 1301.65 Section 1301.65 Conservation of Power and Water Resources TENNESSEE VALLEY AUTHORITY PROCEDURES Protection of National Security Classified Information § 1301.65 Derivative classification. (a) In... classified, and the marking of newly developed material consistent with the classification markings that...

  10. 18 CFR 1301.65 - Derivative classification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... classification. 1301.65 Section 1301.65 Conservation of Power and Water Resources TENNESSEE VALLEY AUTHORITY PROCEDURES Protection of National Security Classified Information § 1301.65 Derivative classification. (a) In... classified, and the marking of newly developed material consistent with the classification markings that...

  11. Inter-rater reliability of a modified version of Delitto et al.’s classification-based system for low back pain: a pilot study

    PubMed Central

    Apeldoorn, Adri T.; van Helvoirt, Hans; Ostelo, Raymond W.; Meihuizen, Hanneke; Kamper, Steven J.; van Tulder, Maurits W.; de Vet, Henrica C. W.

    2016-01-01

    Study design: Observational inter-rater reliability study. Objectives: To examine: (1) the inter-rater reliability of a modified version of Delitto et al.'s classification-based algorithm for patients with low back pain; (2) the influence of different levels of familiarity with the system; and (3) the inter-rater reliability of algorithm decisions in patients who clearly fit into a subgroup (clear classifications) and those who do not (unclear classifications). Methods: Patients were examined twice on the same day by two of three participating physical therapists with different levels of familiarity with the system. Patients were classified into one of four classification groups. Raters were blind to the others' classification decision. In order to quantify the inter-rater reliability, percentages of agreement and Cohen's Kappa were calculated. Results: A total of 36 patients were included (clear classification n = 23; unclear classification n = 13). The overall rate of agreement was 53% and the Kappa value was 0.34 [95% confidence interval (CI): 0.11-0.57], which indicated only fair inter-rater reliability. Inter-rater reliability for patients with a clear classification (agreement 52%, Kappa value 0.29) was not higher than for patients with an unclear classification (agreement 54%, Kappa value 0.33). Familiarity with the system (i.e. trained with written instructions and previous research experience with the algorithm) did not improve the inter-rater reliability. Conclusion: Our pilot study challenges the inter-rater reliability of the classification procedure in clinical practice. Therefore, more knowledge is needed about factors that affect the inter-rater reliability, in order to improve the clinical applicability of the classification scheme. PMID:27559279
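
    The agreement statistic used above can be computed directly with scikit-learn, as in the small sketch below; the two rating lists are hypothetical examples, not the study's data.

      # Sketch: Cohen's kappa between two raters' classification decisions (hypothetical labels).
      from sklearn.metrics import cohen_kappa_score

      rater_a = ["manipulation", "stabilisation", "specific exercise", "traction", "manipulation"]
      rater_b = ["manipulation", "specific exercise", "specific exercise", "traction", "stabilisation"]
      print(cohen_kappa_score(rater_a, rater_b))   # 1.0 = perfect agreement, 0 = chance-level agreement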

  12. Voxel-Based Neighborhood for Spatial Shape Pattern Classification of Lidar Point Clouds with Supervised Learning.

    PubMed

    Plaza-Leiva, Victoria; Gomez-Ruiz, Jose Antonio; Mandow, Anthony; García-Cerezo, Alfonso

    2017-03-15

    Improving the effectiveness of spatial shape feature classification from 3D lidar data is highly relevant because it is widely used as a fundamental step towards higher-level scene-understanding challenges in autonomous vehicles and terrestrial robots. In this sense, computing neighborhoods for points in dense scans becomes a costly process for both training and classification. This paper proposes a new general framework for implementing and comparing different supervised learning classifiers with a simple voxel-based neighborhood computation, where points in each non-overlapping voxel of a regular grid are assigned to the same class by considering features within a support region defined by the voxel itself. The contribution provides offline training and online classification procedures as well as five alternative feature vector definitions based on principal component analysis for scatter, tubular and planar shapes. Moreover, the feasibility of this approach is evaluated by implementing a neural network (NN) method previously proposed by the authors as well as three other supervised learning classifiers found in scene processing methods: support vector machines (SVM), Gaussian processes (GP), and Gaussian mixture models (GMM). A comparative performance analysis is presented using real point clouds from both natural and urban environments and two different 3D rangefinders (a tilting Hokuyo UTM-30LX and a Riegl). Classification performance metrics and processing time measurements confirm the benefits of the NN classifier and the feasibility of the voxel-based neighborhood.
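
    The feature vectors mentioned above are built from principal component analysis of the points supported by each voxel. The sketch below shows one common way to derive scatter, tubular and planar descriptors from covariance eigenvalues; the exact feature definitions are an assumption here, not the authors' implementation.

```python
# Sketch: eigenvalue-based shape descriptors for the 3D points inside one voxel.
# The linearity/planarity/scattering ratios are common PCA-derived features,
# used here as an illustrative stand-in for the paper's five feature vectors.
import numpy as np

def voxel_shape_features(points):
    """points: (N, 3) array of lidar returns assigned to one voxel."""
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    # Eigenvalues sorted in decreasing order: l1 >= l2 >= l3 >= 0.
    l1, l2, l3 = np.sort(np.linalg.eigvalsh(cov))[::-1]
    linearity = (l1 - l2) / l1      # high for tubular shapes (trunks, poles)
    planarity = (l2 - l3) / l1      # high for planar shapes (ground, walls)
    scattering = l3 / l1            # high for volumetric scatter (foliage)
    return np.array([linearity, planarity, scattering])

rng = np.random.default_rng(0)
plane_like = rng.normal(size=(200, 3)) * [1.0, 1.0, 0.02]   # thin slab of points
print(voxel_shape_features(plane_like))
```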

  13. Multidimensional classification of magma types for altered igneous rocks and application to their tectonomagmatic discrimination and igneous provenance of siliciclastic sediments

    NASA Astrophysics Data System (ADS)

    Verma, Surendra P.; Rivera-Gómez, M. Abdelaly; Díaz-González, Lorena; Pandarinath, Kailasa; Amezcua-Valdez, Alejandra; Rosales-Rivera, Mauricio; Verma, Sanjeet K.; Quiroz-Ruiz, Alfredo; Armstrong-Altrin, John S.

    2017-05-01

    A new multidimensional scheme consistent with the International Union of Geological Sciences (IUGS) is proposed for the classification of igneous rocks in terms of four magma types: ultrabasic, basic, intermediate, and acid. Our procedure is based on an extensive database of major element composition of a total of 33,868 relatively fresh rock samples having a multinormal distribution (initial database with 37,215 samples). The multinormal distribution of the database, in terms of log-ratios of the samples, was ascertained with a new computer program, DOMuDaF, in which the discordancy test was applied at the 99.9% confidence level. Isometric log-ratio (ilr) transformation was used to provide overall percent correct classification of 88.7%, 75.8%, 88.0%, and 80.9% for ultrabasic, basic, intermediate, and acid rocks, respectively. Given its known mathematical and uncertainty propagation properties, this transformation could be adopted for routine applications. The incorrect classification was mainly for the "neighbour" magma types, e.g., basic for ultrabasic and vice versa. Some of these misclassifications do not have any effect on multidimensional tectonic discrimination. For an efficient application of this multidimensional scheme, a new computer program MagClaMSys_ilr (MagClaMSys-Magma Classification Major-element based System) was written, which is available for on-line processing on http://tlaloc.ier.unam.mx/index.html. This classification scheme was tested with newly compiled data for relatively fresh Neogene igneous rocks and was found to be consistent with the conventional IUGS procedure. The new scheme was successfully applied to inter-laboratory data for three geochemical reference materials (basalts JB-1 and JB-1a, and andesite JA-3) from Japan and showed that the inferred magma types are consistent with the rock name (basic for basalts JB-1 and JB-1a and intermediate for andesite JA-3). The scheme was also successfully applied to five case studies of older Archaean to Mesozoic igneous rocks. Similar or more reliable results were obtained from existing tectonomagmatic discrimination diagrams when used in conjunction with the new computer program as compared to the IUGS scheme. The application to three case studies of igneous provenance of sedimentary rocks was demonstrated as a novel approach. Finally, we show that the new scheme is more robust against post-emplacement compositional changes than the conventional IUGS procedure.
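
    The isometric log-ratio (ilr) transformation used in this scheme maps a D-part composition to D-1 orthonormal coordinates. The sketch below is a generic textbook construction with a Helmert-type basis and illustrative oxide values; it is not the MagClaMSys_ilr or DOMuDaF software.

```python
# Sketch of an isometric log-ratio (ilr) transform for a major-element
# composition. Generic construction (Helmert-type orthonormal basis), not the
# MagClaMSys_ilr program described in the paper.
import numpy as np

def ilr(composition):
    """composition: 1-D array of D positive parts (need not sum to 1)."""
    x = np.asarray(composition, dtype=float)
    d = x.size
    # Sequential-binary-partition (Helmert-type) orthonormal basis, D x (D-1).
    basis = np.zeros((d, d - 1))
    for j in range(1, d):
        basis[:j, j - 1] = 1.0 / j
        basis[j, j - 1] = -1.0
        basis[:, j - 1] *= np.sqrt(j / (j + 1.0))
    # ilr coordinates: centred log-ratio vector projected onto the basis.
    clr = np.log(x) - np.log(x).mean()
    return clr @ basis

# Example: a simplified five-part composition (wt%, illustrative values only).
print(ilr([50.0, 15.0, 10.0, 8.0, 17.0]))
```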

  14. Restricted Boltzmann machines based oversampling and semi-supervised learning for false positive reduction in breast CAD.

    PubMed

    Cao, Peng; Liu, Xiaoli; Bao, Hang; Yang, Jinzhu; Zhao, Dazhe

    2015-01-01

    The false-positive reduction (FPR) is a crucial step in the computer aided detection system for the breast. The issues of imbalanced data distribution and the limitation of labeled samples complicate the classification procedure. To overcome these challenges, we propose oversampling and semi-supervised learning methods based on the restricted Boltzmann machines (RBMs) to solve the classification of imbalanced data with a few labeled samples. To evaluate the proposed method, we conducted a comprehensive performance study and compared its results with the commonly used techniques. Experiments on benchmark dataset of DDSM demonstrate the effectiveness of the RBMs based oversampling and semi-supervised learning method in terms of geometric mean (G-mean) for false positive reduction in Breast CAD.
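
    The geometric mean (G-mean) used as the evaluation metric above balances sensitivity and specificity, which is useful when the classes are imbalanced. A minimal sketch, with invented labels rather than DDSM results:

```python
# Sketch: geometric mean (G-mean) of sensitivity and specificity, the metric
# reported above for imbalanced false-positive-reduction experiments.
import numpy as np
from sklearn.metrics import confusion_matrix

def g_mean(y_true, y_pred):
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    sensitivity = tp / (tp + fn)   # true-positive rate on the minority class
    specificity = tn / (tn + fp)   # true-negative rate on the majority class
    return np.sqrt(sensitivity * specificity)

y_true = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0]        # illustrative labels only
y_pred = [1, 1, 0, 0, 0, 0, 0, 1, 0, 0]
print(f"G-mean = {g_mean(y_true, y_pred):.3f}")
```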

  15. Urban Image Classification: Per-Pixel Classifiers, Sub-Pixel Analysis, Object-Based Image Analysis, and Geospatial Methods. 10; Chapter

    NASA Technical Reports Server (NTRS)

    Myint, Soe W.; Mesev, Victor; Quattrochi, Dale; Wentz, Elizabeth A.

    2013-01-01

    Remote sensing methods used to generate base maps to analyze the urban environment rely predominantly on digital sensor data from space-borne platforms. This is due in part to new sources of high spatial resolution data covering the globe, a variety of multispectral and multitemporal sources, sophisticated statistical and geospatial methods, and compatibility with GIS data sources and methods. The goal of this chapter is to review the four groups of classification methods for digital sensor data from space-borne platforms: per-pixel, sub-pixel, object-based (spatial-based), and geospatial methods. Per-pixel methods are widely used methods that classify pixels into distinct categories based solely on the spectral and ancillary information within that pixel. They are used for everything from simple calculations of environmental indices (e.g., NDVI) to sophisticated expert systems that assign urban land covers. Researchers recognize, however, that even with the smallest pixel size the spectral information within a pixel is really a combination of multiple urban surfaces. Sub-pixel classification methods therefore aim to statistically quantify the mixture of surfaces to improve overall classification accuracy. While within-pixel variations exist, there is also significant evidence that groups of nearby pixels have similar spectral information and therefore belong to the same classification category. Object-oriented methods have emerged that group pixels prior to classification based on spectral similarity and spatial proximity. Classification accuracy using object-based methods shows significant success and promise for numerous urban applications. Like the object-oriented methods that recognize the importance of spatial proximity, geospatial methods for urban mapping also utilize neighboring pixels in the classification process. The primary difference, though, is that geostatistical methods (e.g., spatial autocorrelation methods) are utilized during both the pre- and post-classification steps. Within this chapter, each of the four approaches is described in terms of scale and accuracy in classifying urban land use and urban land cover, and for its range of urban applications. We present an overview of the four main classification groups in Figure 1, while Table 1 details the approaches with respect to classification requirements and procedures (e.g., reflectance conversion, steps before training sample selection, training samples, spatial approaches commonly used, classifiers, primary inputs for classification, output structures, number of output layers, and accuracy assessment). The chapter concludes with a brief summary of the methods reviewed and the challenges that remain in developing new classification methods for improving the efficiency and accuracy of mapping urban areas.

  16. 43 CFR 2450.3 - Proposed classification decision.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 43 Public Lands: Interior 2 2011-10-01 2011-10-01 false Proposed classification decision. 2450.3... MANAGEMENT, DEPARTMENT OF THE INTERIOR LAND RESOURCE MANAGEMENT (2000) PETITION-APPLICATION CLASSIFICATION SYSTEM Petition-Application Procedures § 2450.3 Proposed classification decision. (a) The State Director...

  17. 10 CFR 110.122 - Classification assistance.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Classification assistance. 110.122 Section 110.122 Energy... Procedures for Classified Information in Hearings § 110.122 Classification assistance. On the request of any... security classification of information and the protective requirements to be observed. ...

  18. 10 CFR 110.122 - Classification assistance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Classification assistance. 110.122 Section 110.122 Energy... Procedures for Classified Information in Hearings § 110.122 Classification assistance. On the request of any... security classification of information and the protective requirements to be observed. ...

  19. 10 CFR 110.122 - Classification assistance.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Classification assistance. 110.122 Section 110.122 Energy... Procedures for Classified Information in Hearings § 110.122 Classification assistance. On the request of any... security classification of information and the protective requirements to be observed. ...

  20. 10 CFR 110.122 - Classification assistance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Classification assistance. 110.122 Section 110.122 Energy... Procedures for Classified Information in Hearings § 110.122 Classification assistance. On the request of any... security classification of information and the protective requirements to be observed. ...

  1. 10 CFR 110.122 - Classification assistance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Classification assistance. 110.122 Section 110.122 Energy... Procedures for Classified Information in Hearings § 110.122 Classification assistance. On the request of any... security classification of information and the protective requirements to be observed. ...

  2. Analysis on Target Detection and Classification in LTE Based Passive Forward Scattering Radar.

    PubMed

    Raja Abdullah, Raja Syamsul Azmir; Abdul Aziz, Noor Hafizah; Abdul Rashid, Nur Emileen; Ahmad Salah, Asem; Hashim, Fazirulhisyam

    2016-09-29

    The passive bistatic radar (PBR) system can utilize the illuminator of opportunity to enhance radar capability. Utilizing the forward scattering technique and procedure in a specific mode of PBR can provide an improvement in target detection and classification. The system is known as passive Forward Scattering Radar (FSR). The passive FSR system can exploit the peculiar advantage of the enhancement in forward scatter radar cross section (FSRCS) for target detection. Thus, the aim of this paper is to show the feasibility of passive FSR for moving target detection and classification by experimental analysis and results. The signal source comes from the latest 4G Long-Term Evolution (LTE) base-station technology. A detailed explanation of the passive FSR receiver circuit, the detection scheme and the classification algorithm is given. In addition, the proposed passive FSR circuit employs the self-mixing technique at the receiver; hence the synchronization signal from the transmitter is not required. The experimental results confirm the passive FSR system's capability for ground target detection and classification. Furthermore, this paper illustrates the first classification result in the passive FSR system. The great potential of the passive FSR system opens a new research area in passive radar that can be used for diverse remote monitoring applications.

  3. Analysis on Target Detection and Classification in LTE Based Passive Forward Scattering Radar

    PubMed Central

    Raja Abdullah, Raja Syamsul Azmir; Abdul Aziz, Noor Hafizah; Abdul Rashid, Nur Emileen; Ahmad Salah, Asem; Hashim, Fazirulhisyam

    2016-01-01

    The passive bistatic radar (PBR) system can utilize the illuminator of opportunity to enhance radar capability. Utilizing the forward scattering technique and procedure in a specific mode of PBR can provide an improvement in target detection and classification. The system is known as passive Forward Scattering Radar (FSR). The passive FSR system can exploit the peculiar advantage of the enhancement in forward scatter radar cross section (FSRCS) for target detection. Thus, the aim of this paper is to show the feasibility of passive FSR for moving target detection and classification by experimental analysis and results. The signal source comes from the latest 4G Long-Term Evolution (LTE) base-station technology. A detailed explanation of the passive FSR receiver circuit, the detection scheme and the classification algorithm is given. In addition, the proposed passive FSR circuit employs the self-mixing technique at the receiver; hence the synchronization signal from the transmitter is not required. The experimental results confirm the passive FSR system’s capability for ground target detection and classification. Furthermore, this paper illustrates the first classification result in the passive FSR system. The great potential of the passive FSR system opens a new research area in passive radar that can be used for diverse remote monitoring applications. PMID:27690051

  4. Activities identification for activity-based cost/management applications of the diagnostics outpatient procedures.

    PubMed

    Alrashdan, Abdalla; Momani, Amer; Ababneh, Tamador

    2012-01-01

    One of the most challenging problems facing healthcare providers is to determine the actual cost for their procedures, which is important for internal accounting and price justification to insurers. The objective of this paper is to find suitable categories to identify the diagnostic outpatient medical procedures and translate them from functional orientation to process orientation. A hierarchal task tree is developed based on a classification schema of procedural activities. Each procedure is seen as a process consisting of a number of activities. This makes a powerful foundation for activity-based cost/management implementation and provides enough information to discover the value-added and non-value-added activities that assist in process improvement and eventually may lead to cost reduction. Work measurement techniques are used to identify the standard time of each activity at the lowest level of the task tree. A real case study at a private hospital is presented to demonstrate the proposed methodology. © 2011 National Association for Healthcare Quality.

  5. Automatic photointerpretation for plant species and stress identification (ERTS-A1)

    NASA Technical Reports Server (NTRS)

    Swanlund, G. D. (Principal Investigator); Kirvida, L.; Johnson, G. R.

    1973-01-01

    The author has identified the following significant results. Automatic stratification of forested land from ERTS-1 data provides a valuable tool for resource management. The results are useful for wood product yield estimates, recreation and wildlife management, forest inventory, and forest condition monitoring. Automatic procedures based on both multispectral and spatial features are evaluated. With five classes, training and testing on the same samples, classification accuracy of 74 percent was achieved using the MSS multispectral features. When adding texture computed from 8 x 8 arrays, classification accuracy of 90 percent was obtained.

  6. Rationale for classification of combustible gases, vapors and dusts with reference to the National Electrical Code

    NASA Astrophysics Data System (ADS)

    1982-07-01

    Serious reservations about the entire classification procedure of chemical compounds present in electrical equipment environments and the precepts on which it is based are discussed. Although some tests were conducted on selected key compounds, the committee primarily considered the chemical similarity of compounds and other known flammability properties and relied heavily on the experience and intuition of its members. The committee also recommended that the NEC grouping of dusts be changed in some ways and has reclassified dusts according to the modified version of the code.

  7. An analysis of the synoptic and climatological applicability of circulation type classifications for Ireland

    NASA Astrophysics Data System (ADS)

    Broderick, Ciaran; Fealy, Rowan

    2013-04-01

    Circulation type classifications (CTCs) compiled as part of the COST733 Action, entitled 'Harmonisation and Application of Weather Type Classifications for European Regions', are examined for their synoptic and climatological applicability to Ireland based on their ability to characterise surface temperature and precipitation. In all, 16 different objective classification schemes, representative of four different methodological approaches to circulation typing (optimization algorithms, threshold-based methods, eigenvector techniques and leader algorithms), are considered. Several statistical metrics which variously quantify the ability of CTCs to discretize daily data into well-defined homogeneous groups are used to evaluate and compare different approaches to synoptic typing. The records from 14 meteorological stations located across the island of Ireland are used in the study. The results indicate that while it was not possible to identify a single optimum classification or approach to circulation typing - conditional on the location and surface variables considered - a number of general assertions regarding the performance of different schemes can be made. The findings for surface temperature indicate that those classifications based on predefined thresholds (e.g. Litynski, GrossWetterTypes and original Lamb Weather Type) perform well, as do the Kruizinga and Lund classification schemes. Similarly, for precipitation, predefined-type classifications return high skill scores, as do those classifications derived using some optimization procedure (e.g. SANDRA, Self Organizing Maps and K-Means clustering). For both temperature and precipitation the results generally indicate that the classifications perform best for the winter season - reflecting the closer coupling between large-scale circulation and surface conditions during this period. In contrast to the findings for temperature, spatial patterns in the performance of classifications were more evident for precipitation. In the case of this variable, those more westerly synoptic stations open to zonal airflow and less influenced by regional-scale forcings generally exhibited a stronger link with large-scale circulation.

  8. iPcc: a novel feature extraction method for accurate disease class discovery and prediction

    PubMed Central

    Ren, Xianwen; Wang, Yong; Zhang, Xiang-Sun; Jin, Qi

    2013-01-01

    Gene expression profiling has gradually become a routine procedure for disease diagnosis and classification. In the past decade, many computational methods have been proposed, resulting in great improvements on various levels, including feature selection and algorithms for classification and clustering. In this study, we present iPcc, a novel method from the feature extraction perspective to further propel gene expression profiling technologies from bench to bedside. We define ‘correlation feature space’ for samples based on the gene expression profiles by iterative employment of Pearson’s correlation coefficient. Numerical experiments on both simulated and real gene expression data sets demonstrate that iPcc can greatly highlight the latent patterns underlying noisy gene expression data and thus greatly improve the robustness and accuracy of the algorithms currently available for disease diagnosis and classification based on gene expression profiles. PMID:23761440
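
    The "correlation feature space" idea can be illustrated by re-representing each sample through its Pearson correlations with all samples and iterating that mapping. The sketch below is one plausible reading of the procedure on synthetic data; it is not the authors' released iPcc code.

```python
# Sketch of a "correlation feature space": each sample is re-represented by its
# Pearson correlations with all samples, and the mapping is applied iteratively.
# Illustrative reading of the iPcc idea, not the authors' implementation.
import numpy as np

def correlation_feature_space(expr, iterations=2):
    """expr: (n_samples, n_genes) expression matrix."""
    features = np.asarray(expr, dtype=float)
    for _ in range(iterations):
        # np.corrcoef treats rows as variables, so this yields an
        # (n_samples x n_samples) matrix of pairwise Pearson correlations.
        features = np.corrcoef(features)
    return features

rng = np.random.default_rng(1)
expr = rng.normal(size=(20, 500))             # 20 samples, 500 genes (synthetic)
print(correlation_feature_space(expr).shape)  # -> (20, 20)
```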

  9. A targeted change-detection procedure by combining change vector analysis and post-classification approach

    NASA Astrophysics Data System (ADS)

    Ye, Su; Chen, Dongmei; Yu, Jie

    2016-04-01

    In remote sensing, conventional supervised change-detection methods usually require effective training data for multiple change types. This paper introduces a more flexible and efficient procedure that seeks to identify only the changes that users are interested in, hereafter referred to as "targeted change detection". Based on a one-class classifier "Support Vector Domain Description (SVDD)", a novel algorithm named "Three-layer SVDD Fusion (TLSF)" is developed specifically for targeted change detection. The proposed algorithm combines one-class classification generated from change vector maps, as well as before- and after-change images, in order to obtain a more reliable detection result. In addition, this paper introduces a detailed workflow for implementing this algorithm. This workflow has been applied to two case studies with different practical monitoring objectives: urban expansion and forest fire assessment. The experimental results of these two case studies show that the overall accuracy of our proposed algorithm is superior (Kappa statistics are 86.3% and 87.8% for Case 1 and 2, respectively), compared to applying SVDD to change vector analysis and post-classification comparison.
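
    The core ingredient above is a one-class classifier trained only on examples of the targeted change. The sketch below uses scikit-learn's OneClassSVM, which is closely related to SVDD with an RBF kernel, as a stand-in; it does not implement the three-layer TLSF fusion, and the features are synthetic.

```python
# Sketch: one-class classification of change-vector features, in the spirit of
# SVDD-based targeted change detection. OneClassSVM is used as a stand-in for
# SVDD; the two-dimensional features below are synthetic placeholders.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(2)
# Training pixels of the single change type of interest (synthetic features).
target_change = rng.normal(loc=[3.0, 1.0], scale=0.3, size=(150, 2))
# Scene pixels to screen: a mix of the target change and everything else.
scene = np.vstack([
    rng.normal(loc=[3.0, 1.0], scale=0.3, size=(30, 2)),
    rng.normal(loc=[0.0, 0.0], scale=1.0, size=(300, 2)),
])

model = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(target_change)
is_target = model.predict(scene) == 1   # +1 means inside the learned description
print(f"{is_target.sum()} of {len(scene)} pixels flagged as the targeted change")
```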

  10. Automatic road sign detection and classification based on support vector machines and HOG descriptors

    NASA Astrophysics Data System (ADS)

    Adam, A.; Ioannidis, C.

    2014-05-01

    This paper examines the detection and classification of road signs in color images acquired by a low-cost camera mounted on a moving vehicle. A new method for the detection and classification of road signs is proposed based on color-based detection, in order to locate regions of interest. Then, a circular Hough transform is applied to complete the detection, taking advantage of the shape properties of the road signs. The regions of interest are finally represented using HOG descriptors and are fed into trained Support Vector Machines (SVMs) in order to be recognized. For the training procedure, a database with several training examples depicting Greek road signs has been developed. Many experiments have been conducted and are presented to measure the efficiency of the proposed methodology, especially under adverse weather conditions and poor illumination. For the experiments, training datasets consisting of different numbers of examples were used, and the results are presented along with some possible extensions of this work.
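
    The recognition stage described above (HOG descriptors fed to trained SVMs) can be sketched with scikit-image and scikit-learn. The images, labels and parameter values below are placeholders, not the Greek road-sign database or the authors' settings.

```python
# Sketch: HOG descriptors fed to a linear SVM, mirroring the recognition stage
# outlined above. scikit-image and scikit-learn are generic stand-ins; the
# images here are synthetic placeholders for detected sign candidates.
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

def hog_features(gray_image):
    # 9 orientation bins, 8x8-pixel cells, 2x2-cell blocks: common HOG settings.
    return hog(gray_image, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm="L2-Hys")

rng = np.random.default_rng(3)
images = rng.random(size=(40, 64, 64))     # placeholder 64x64 grayscale crops
labels = np.repeat([0, 1], 20)             # two illustrative sign classes

X = np.array([hog_features(img) for img in images])
clf = SVC(kernel="linear", C=1.0).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```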

  11. Locally Weighted Score Estimation for Quantile Classification in Binary Regression Models

    PubMed Central

    Rice, John D.; Taylor, Jeremy M. G.

    2016-01-01

    One common use of binary response regression methods is classification based on an arbitrary probability threshold dictated by the particular application. Since this is given to us a priori, it is sensible to incorporate the threshold into our estimation procedure. Specifically, for the linear logistic model, we solve a set of locally weighted score equations, using a kernel-like weight function centered at the threshold. The bandwidth for the weight function is selected by cross validation of a novel hybrid loss function that combines classification error and a continuous measure of divergence between observed and fitted values; other possible cross-validation functions based on more common binary classification metrics are also examined. This work has much in common with robust estimation, but differs from previous approaches in this area in its focus on prediction, specifically classification into high- and low-risk groups. Simulation results are given showing the reduction in error rates that can be obtained with this method when compared with maximum likelihood estimation, especially under certain forms of model misspecification. Analysis of a melanoma data set is presented to illustrate the use of the method in practice. PMID:28018492
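
    The idea of concentrating estimation effort near the classification threshold can be mimicked by a weighted likelihood fit in which kernel weights are centered at the threshold. The sketch below is a rough stand-in under that assumption; it is not the authors' locally weighted score-equation solver, and the bandwidth is fixed rather than chosen by their cross-validation criterion.

```python
# Rough sketch of threshold-focused weighting: observations whose pilot fitted
# probability lies near the classification threshold receive the most weight in
# a second, weighted logistic fit. Illustrative only; not the paper's estimator.
import numpy as np
from sklearn.linear_model import LogisticRegression

def threshold_weighted_fit(X, y, threshold=0.3, bandwidth=0.15):
    pilot = LogisticRegression().fit(X, y)
    p_hat = pilot.predict_proba(X)[:, 1]
    # Gaussian kernel weight centered at the threshold of interest.
    w = np.exp(-0.5 * ((p_hat - threshold) / bandwidth) ** 2)
    return LogisticRegression().fit(X, y, sample_weight=w)

rng = np.random.default_rng(4)
X = rng.normal(size=(500, 3))
y = (rng.random(500) < 1 / (1 + np.exp(-(X @ [1.0, -0.5, 0.3])))).astype(int)
model = threshold_weighted_fit(X, y, threshold=0.3)
high_risk = model.predict_proba(X)[:, 1] >= 0.3
print("fraction classified high-risk:", high_risk.mean())
```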

  12. Master standard data quantity food production code. Macro elements for synthesizing production labor time.

    PubMed

    Matthews, M E; Waldvogel, C F; Mahaffey, M J; Zemel, P C

    1978-06-01

    Preparation procedures of standardized quantity formulas were analyzed for similarities and differences in production activities, and three entrée classifications were developed, based on these activities. Two formulas from each classification were selected, preparation procedures were divided into elements of production, and the MSD Quantity Food Production Code was applied. Macro elements not included in the existing Code were simulated, coded, assigned associated Time Measurement Units, and added to the MSD Quantity Food Production Code. Repeated occurrence of similar elements within production methods indicated that macro elements could be synthesized for use within one or more entrée classifications. Basic elements were grouped, simulated, and macro elements were derived. Macro elements were applied in the simulated production of 100 portions of each entrée formula. Total production time for each formula and average production time for each entrée classification were calculated. Application of macro elements indicated that this method of predetermining production time was feasible and could be adapted by quantity foodservice managers as a decision technique used to evaluate menu mix, production personnel schedules, and allocation of equipment usage. These macro elements could serve as a basis for further development and refinement of other macro elements which could be applied to a variety of menu item formulas.

  13. Estimating Classification Consistency and Accuracy for Cognitive Diagnostic Assessment

    ERIC Educational Resources Information Center

    Cui, Ying; Gierl, Mark J.; Chang, Hua-Hua

    2012-01-01

    This article introduces procedures for the computation and asymptotic statistical inference for classification consistency and accuracy indices specifically designed for cognitive diagnostic assessments. The new classification indices can be used as important indicators of the reliability and validity of classification results produced by…

  14. 32 CFR 1648.5 - Procedures during personal appearance before the local board.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... SELECTIVE SERVICE SYSTEM CLASSIFICATION BY LOCAL BOARD § 1648.5 Procedures during personal appearance before... classification; direct attention to any information in his file; and present such further information as he... prohibited in proceedings before the board. This does not prevent the registrant or Selective Service from...

  15. 26 CFR 301.7701-1 - Classification of organizations for federal tax purposes.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 18 2010-04-01 2010-04-01 false Classification of organizations for federal tax purposes. 301.7701-1 Section 301.7701-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) PROCEDURE AND ADMINISTRATION PROCEDURE AND ADMINISTRATION Definitions § 301.7701-1...

  16. [Nursing care in fluorescein angiography].

    PubMed

    Santos-Blanco, Feliciano

    2008-01-01

    Fluorescein angiography of the ocular fundus is a diagnostic technique to study retinal and choroidal circulation. This technique consists of parenteral administration of 500 mg of sodium fluorescein 10% and photographing the fluorescence in the eye vessels. Although this substance is fairly safe, it may also produce mild, moderate or severe local and/or general adverse reactions. The nursing process is routinely used in hospital units but not always in outpatient clinics, even though the use of invasive procedures with intravenous medication administration is common. Therefore, nurses, as those responsible for intravenous administration, should use the nursing process to guarantee the quality of care required by the patient. To do this, we describe an individualized care plan based on evaluation by Marjorie Gordon's functional health patterns, NANDA's nursing diagnoses Taxonomy II, the Nursing Outcomes Classification (NOC), the Nursing Interventions Classification (NIC) and potential complications of the procedure.

  17. CAMUR: Knowledge extraction from RNA-seq cancer data through equivalent classification rules.

    PubMed

    Cestarelli, Valerio; Fiscon, Giulia; Felici, Giovanni; Bertolazzi, Paola; Weitschek, Emanuel

    2016-03-01

    Nowadays, knowledge extraction methods from Next Generation Sequencing data are highly requested. In this work, we focus on RNA-seq gene expression analysis and specifically on case-control studies with rule-based supervised classification algorithms that build a model able to discriminate cases from controls. State-of-the-art algorithms compute a single classification model that contains few features (genes). On the contrary, our goal is to elicit a higher amount of knowledge by computing many classification models, and therefore to identify most of the genes related to the predicted class. We propose CAMUR, a new method that extracts multiple and equivalent classification models. CAMUR iteratively computes a rule-based classification model, calculates the power set of the genes present in the rules, iteratively eliminates those combinations from the data set, and performs the classification procedure again until a stopping criterion is verified. CAMUR includes an ad-hoc knowledge repository (database) and a querying tool. We analyze three different types of RNA-seq data sets (Breast, Head and Neck, and Stomach Cancer) from The Cancer Genome Atlas (TCGA) and we also validate CAMUR and its models on non-TCGA data. Our experimental results show the efficacy of CAMUR: we obtain several reliable equivalent classification models, from which the most frequent genes, their relationships, and the relation with a particular cancer are deduced. CAMUR is available at dmb.iasi.cnr.it/camur.php (contact: emanuel@iasi.cnr.it). Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  18. A statistical approach to root system classification

    PubMed Central

    Bodner, Gernot; Leitner, Daniel; Nakhforoosh, Alireza; Sobotik, Monika; Moder, Karl; Kaul, Hans-Peter

    2013-01-01

    Plant root systems have a key role in ecology and agronomy. In spite of the fast increase in root studies, there is still no classification that allows distinguishing among distinctive characteristics within the diversity of rooting strategies. Our hypothesis is that a multivariate approach for “plant functional type” identification in ecology can be applied to the classification of root systems. The classification method presented is based on a data-defined statistical procedure without a priori decision on the classifiers. The study demonstrates that principal component based rooting types provide efficient and meaningful multi-trait classifiers. The classification method is exemplified with simulated root architectures and morphological field data. Simulated root architectures showed that morphological attributes with spatial distribution parameters capture most distinctive features within root system diversity. While developmental type (tap vs. shoot-borne systems) is a strong, but coarse classifier, topological traits provide the most detailed differentiation among distinctive groups. Adequacy of commonly available morphologic traits for classification is supported by field data. The rooting types emerging from the measured data mainly distinguished diameter/weight-dominated and density-dominated types. Similarity of root systems within distinctive groups was the joint result of phylogenetic relation and environmental as well as human selection pressure. We concluded that the data-defined classification is appropriate for integration of knowledge obtained with different root measurement methods and at various scales. Currently, root morphology is the most promising basis for classification due to widely used common measurement protocols. To capture the details of root diversity, efforts in architectural measurement techniques are essential. PMID:23914200

  19. A statistical approach to root system classification.

    PubMed

    Bodner, Gernot; Leitner, Daniel; Nakhforoosh, Alireza; Sobotik, Monika; Moder, Karl; Kaul, Hans-Peter

    2013-01-01

    Plant root systems have a key role in ecology and agronomy. In spite of the fast increase in root studies, there is still no classification that allows distinguishing among distinctive characteristics within the diversity of rooting strategies. Our hypothesis is that a multivariate approach for "plant functional type" identification in ecology can be applied to the classification of root systems. The classification method presented is based on a data-defined statistical procedure without a priori decision on the classifiers. The study demonstrates that principal component based rooting types provide efficient and meaningful multi-trait classifiers. The classification method is exemplified with simulated root architectures and morphological field data. Simulated root architectures showed that morphological attributes with spatial distribution parameters capture most distinctive features within root system diversity. While developmental type (tap vs. shoot-borne systems) is a strong, but coarse classifier, topological traits provide the most detailed differentiation among distinctive groups. Adequacy of commonly available morphologic traits for classification is supported by field data. The rooting types emerging from the measured data mainly distinguished diameter/weight-dominated and density-dominated types. Similarity of root systems within distinctive groups was the joint result of phylogenetic relation and environmental as well as human selection pressure. We concluded that the data-defined classification is appropriate for integration of knowledge obtained with different root measurement methods and at various scales. Currently, root morphology is the most promising basis for classification due to widely used common measurement protocols. To capture the details of root diversity, efforts in architectural measurement techniques are essential.

  20. Supervised DNA Barcodes species classification: analysis, comparisons and results

    PubMed Central

    2014-01-01

    Background Specific fragments, coming from short portions of DNA (e.g., mitochondrial, nuclear, and plastid sequences), have been defined as DNA Barcode and can be used as markers for organisms of the main life kingdoms. Species classification with DNA Barcode sequences has been proven effective on different organisms. Indeed, specific gene regions have been identified as Barcode: COI in animals, rbcL and matK in plants, and ITS in fungi. The classification problem assigns an unknown specimen to a known species by analyzing its Barcode. This task has to be supported with reliable methods and algorithms. Methods In this work the efficacy of supervised machine learning methods to classify species with DNA Barcode sequences is shown. The Weka software suite, which includes a collection of supervised classification methods, is adopted to address the task of DNA Barcode analysis. Classifier families are tested on synthetic and empirical datasets belonging to the animal, fungus, and plant kingdoms. In particular, the function-based method Support Vector Machines (SVM), the rule-based RIPPER, the decision tree C4.5, and the Naïve Bayes method are considered. Additionally, the classification results are compared with respect to ad-hoc and well-established DNA Barcode classification methods. Results Software that converts DNA Barcode FASTA sequences to the Weka format is released, to adapt different input formats and to allow the execution of the classification procedure. The analysis of results on synthetic and real datasets shows that SVM and Naïve Bayes outperform on average the other considered classifiers, although they do not provide a human-interpretable classification model. Rule-based methods have slightly inferior classification performances, but deliver the species-specific positions and nucleotide assignments. On synthetic data the supervised machine learning methods obtain superior classification performances with respect to the traditional DNA Barcode classification methods. On empirical data their classification performances are at a comparable level to the other methods. Conclusions The classification analysis shows that supervised machine learning methods are promising candidates for successfully handling the DNA Barcoding species classification problem, obtaining excellent performances. To conclude, a powerful tool to perform species identification is now available to the DNA Barcoding community. PMID:24721333
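
    A common way to feed Barcode sequences to generic supervised classifiers is to encode them as k-mer count vectors. The sketch below uses scikit-learn (the paper itself uses the Weka suite) with toy sequences and hypothetical species labels.

```python
# Sketch: supervised classification of DNA Barcode sequences from k-mer counts.
# scikit-learn is used here as a stand-in for the Weka suite used in the paper;
# the toy sequences and species labels below are illustrative only.
from itertools import product
import numpy as np
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC

def kmer_counts(sequence, k=3):
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    index = {kmer: i for i, kmer in enumerate(kmers)}
    counts = np.zeros(len(kmers))
    for i in range(len(sequence) - k + 1):
        kmer = sequence[i:i + k]
        if kmer in index:                  # ignore ambiguous bases such as 'N'
            counts[index[kmer]] += 1
    return counts

sequences = ["ACGTACGTGGCCAATT" * 4, "TTGGCCAACGTACGAT" * 4,
             "ACGTACGTGGCCAAGT" * 4, "TTGGCCAACGTACGTT" * 4]
labels = ["species_A", "species_B", "species_A", "species_B"]

X = np.array([kmer_counts(s) for s in sequences])
print(MultinomialNB().fit(X, labels).predict(X))
print(LinearSVC().fit(X, labels).predict(X))
```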

  1. 40 CFR 11.5 - Procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL SECURITY CLASSIFICATION REGULATIONS..., safekeeping, accountability, transmission, disposition, and destruction of classification information and... shall conform with the National Security Council Directive of May 17, 1972, governing the classification...

  2. 40 CFR 11.5 - Procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL SECURITY CLASSIFICATION REGULATIONS..., safekeeping, accountability, transmission, disposition, and destruction of classification information and... shall conform with the National Security Council Directive of May 17, 1972, governing the classification...

  3. 40 CFR 11.5 - Procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL SECURITY CLASSIFICATION REGULATIONS..., safekeeping, accountability, transmission, disposition, and destruction of classification information and... shall conform with the National Security Council Directive of May 17, 1972, governing the classification...

  4. 40 CFR 11.5 - Procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GENERAL SECURITY CLASSIFICATION REGULATIONS..., safekeeping, accountability, transmission, disposition, and destruction of classification information and... shall conform with the National Security Council Directive of May 17, 1972, governing the classification...

  5. Object Classification Based on Analysis of Spectral Characteristics of Seismic Signal Envelopes

    NASA Astrophysics Data System (ADS)

    Morozov, Yu. V.; Spektor, A. A.

    2017-11-01

    A method for classifying moving objects having a seismic effect on the ground surface is proposed which is based on statistical analysis of the envelopes of received signals. The values of the components of the amplitude spectrum of the envelopes obtained applying Hilbert and Fourier transforms are used as classification criteria. Examples illustrating the statistical properties of spectra and the operation of the seismic classifier are given for an ensemble of objects of four classes (person, group of people, large animal, vehicle). It is shown that the computational procedures for processing seismic signals are quite simple and can therefore be used in real-time systems with modest requirements for computational resources.
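
    The classification criteria above are amplitude-spectrum components of the signal envelope, obtained by combining the Hilbert and Fourier transforms. Below is a minimal sketch of that feature extraction on a synthetic amplitude-modulated signal; it is not the authors' classifier.

```python
# Sketch: envelope spectrum of a seismic-like signal, following the idea of a
# Hilbert-transform envelope analysed with a Fourier transform. Generic
# illustration on synthetic data, not the paper's seismic classifier.
import numpy as np
from scipy.signal import hilbert

def envelope_spectrum(signal, fs):
    """Return frequencies and amplitude spectrum of the signal envelope."""
    envelope = np.abs(hilbert(signal))          # analytic-signal magnitude
    envelope -= envelope.mean()                 # drop the DC component
    spectrum = np.abs(np.fft.rfft(envelope)) / len(envelope)
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / fs)
    return freqs, spectrum

fs = 500.0                                       # sampling rate in Hz, illustrative
t = np.arange(0, 4.0, 1.0 / fs)
# Synthetic footstep-like signal: a 60 Hz carrier amplitude-modulated at 2 Hz.
signal = (1.0 + 0.8 * np.sin(2 * np.pi * 2.0 * t)) * np.sin(2 * np.pi * 60.0 * t)
freqs, spec = envelope_spectrum(signal, fs)
print("dominant envelope frequency: %.1f Hz" % freqs[np.argmax(spec)])
```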

  6. The Evolution of Complex Microsurgical Midface Reconstruction: A Classification Scheme and Reconstructive Algorithm.

    PubMed

    Alam, Daniel; Ali, Yaseen; Klem, Christopher; Coventry, Daniel

    2016-11-01

    Orbito-malar reconstruction after oncological resection represents one of the most challenging facial reconstructive procedures. Until the last few decades, rehabilitation was typically prosthesis based with a limited role for surgery. The advent of microsurgical techniques allowed large-volume tissue reconstitution from a distant donor site, revolutionizing the potential approaches to these defects. The authors report a novel surgery-based algorithm and a classification scheme for complete midface reconstruction with a foundation in the Gillies principles of like-to-like reconstruction and with a significant role of computer-aided virtual planning. With this approach, the authors have been able to achieve significantly better patient outcomes. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Medical X-ray Image Hierarchical Classification Using a Merging and Splitting Scheme in Feature Space.

    PubMed

    Fesharaki, Nooshin Jafari; Pourghassem, Hossein

    2013-07-01

    Due to the daily mass production and the widespread variation of medical X-ray images, it is necessary to classify these for searching and retrieval purposes, especially for content-based medical image retrieval systems. In this paper, a medical X-ray image hierarchical classification structure based on a novel merging and splitting scheme and using shape and texture features is proposed. In the first level of the proposed structure, to improve the classification performance, similar classes with regard to shape contents are grouped based on merging measures and shape features into the general overlapped classes. In the next levels of this structure, the overlapped classes are split into smaller classes based on the classification performance of a combination of shape and texture features or texture features only. Ultimately, in the last levels, this procedure is continued until all the classes have been formed separately. Moreover, to optimize the feature vector in the proposed structure, we use an orthogonal forward selection algorithm with a Mahalanobis class separability measure as a feature selection and reduction algorithm. In other words, according to the complexity and inter-class distance of each class, a sub-space of the feature space is selected in each level and then a supervised merging and splitting scheme is applied to form the hierarchical classification. The proposed structure is evaluated on a database consisting of 2158 medical X-ray images of 18 classes (IMAGECLEF 2005 database) and an accuracy rate of 93.6% in the last level of the hierarchical structure for an 18-class classification problem is obtained.

  8. A Model-Free Machine Learning Method for Risk Classification and Survival Probability Prediction.

    PubMed

    Geng, Yuan; Lu, Wenbin; Zhang, Hao Helen

    2014-01-01

    Risk classification and survival probability prediction are two major goals in survival data analysis since they play an important role in patients' risk stratification, long-term diagnosis, and treatment selection. In this article, we propose a new model-free machine learning framework for risk classification and survival probability prediction based on weighted support vector machines. The new procedure does not require any specific parametric or semiparametric model assumption on data, and is therefore capable of capturing nonlinear covariate effects. We use numerous simulation examples to demonstrate finite sample performance of the proposed method under various settings. Applications to a glioma tumor data and a breast cancer gene expression survival data are shown to illustrate the new methodology in real data analysis.

  9. Application of genomic technologies for characterization, typing, and detection of E. coli

    USDA-ARS?s Scientific Manuscript database

    Serotyping using polyclonal antibodies raised in rabbits has been the gold standard for classification of E. coli based on the O- (somatic) and H- (flagellar) antigens; however, problems associated with serotyping are that the procedure is time consuming and labor intensive, cross reactions among di...

  10. Brain tumor segmentation based on local independent projection-based classification.

    PubMed

    Huang, Meiyan; Yang, Wei; Wu, Yao; Jiang, Jun; Chen, Wufan; Feng, Qianjin

    2014-10-01

    Brain tumor segmentation is an important procedure for early tumor diagnosis and radiotherapy planning. Although numerous brain tumor segmentation methods have been presented, enhancing tumor segmentation methods is still challenging because brain tumor MRI images exhibit complex characteristics, such as high diversity in tumor appearance and ambiguous tumor boundaries. To address this problem, we propose a novel automatic tumor segmentation method for MRI images. This method treats tumor segmentation as a classification problem. Additionally, the local independent projection-based classification (LIPC) method is used to classify each voxel into different classes. A novel classification framework is derived by introducing the local independent projection into the classical classification model. Locality is important in the calculation of local independent projections for LIPC. Locality is also considered in determining whether local anchor embedding is more applicable in solving linear projection weights compared with other coding methods. Moreover, LIPC considers the data distribution of different classes by learning a softmax regression model, which can further improve classification performance. In this study, 80 brain tumor MRI images with ground truth data are used as training data and 40 images without ground truth data are used as testing data. The segmentation results of testing data are evaluated by an online evaluation tool. The average dice similarities of the proposed method for segmenting complete tumor, tumor core, and contrast-enhancing tumor on real patient data are 0.84, 0.685, and 0.585, respectively. These results are comparable to other state-of-the-art methods.

  11. One input-class and two input-class classifications for differentiating olive oil from other edible vegetable oils by use of the normal-phase liquid chromatography fingerprint of the methyl-transesterified fraction.

    PubMed

    Jiménez-Carvelo, Ana M; Pérez-Castaño, Estefanía; González-Casado, Antonio; Cuadros-Rodríguez, Luis

    2017-04-15

    A new method for differentiation of olive oil (independently of the quality category) from other vegetable oils (canola, safflower, corn, peanut, seeds, grapeseed, palm, linseed, sesame and soybean) has been developed. The analytical procedure for chromatographic fingerprinting of the methyl-transesterified fraction of each vegetable oil, using normal-phase liquid chromatography, is described and the chemometric strategies applied and discussed. Some chemometric methods, such as k-nearest neighbours (kNN), partial least squares-discriminant analysis (PLS-DA), support vector machine classification analysis (SVM-C), and soft independent modelling of class analogies (SIMCA), were applied to build classification models. Performance of the classification was evaluated and ranked using several classification quality metrics. The discriminant analysis, based on the use of one input-class (plus a dummy class), was applied for the first time in this study. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Singularity and Nonnormality in the Classification of Compositional Data

    USGS Publications Warehouse

    Bohling, Geoffrey C.; Davis, J.C.; Olea, R.A.; Harff, Jan

    1998-01-01

    Geologists may want to classify compositional data and express the classification as a map. Regionalized classification is a tool that can be used for this purpose, but it incorporates discriminant analysis, which requires the computation and inversion of a covariance matrix. Covariance matrices of compositional data always will be singular (noninvertible) because of the unit-sum constraint. Fortunately, discriminant analyses can be calculated using a pseudo-inverse of the singular covariance matrix; this is done automatically by some statistical packages such as SAS. Granulometric data from the Darss Sill region of the Baltic Sea is used to explore how the pseudo-inversion procedure influences discriminant analysis results, comparing the algorithm used by SAS to the more conventional Moore-Penrose algorithm. Logratio transforms have been recommended to overcome problems associated with analysis of compositional data, including singularity. A regionalized classification of the Darss Sill data after logratio transformation is different only slightly from one based on raw granulometric data, suggesting that closure problems do not influence severely regionalized classification of compositional data.
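
    The pseudo-inverse workaround discussed above can be illustrated directly: a Mahalanobis-type discriminant score computed with the Moore-Penrose pseudo-inverse remains well defined even when the covariance matrix of closed (unit-sum) data is singular. The sketch below uses synthetic compositional data, not the Darss Sill granulometry, and does not reproduce the SAS algorithm.

```python
# Sketch: a Mahalanobis-type discriminant score computed with the Moore-Penrose
# pseudo-inverse, so that a singular covariance matrix of compositional
# (unit-sum) data does not break the calculation. Illustrative only.
import numpy as np

def discriminant_scores(x, class_means, pooled_cov):
    """Squared Mahalanobis-type distance of sample x to each class mean."""
    cov_pinv = np.linalg.pinv(pooled_cov)      # works even when cov is singular
    return np.array([(x - m) @ cov_pinv @ (x - m) for m in class_means])

rng = np.random.default_rng(5)
# Synthetic compositional data: three parts (e.g. sand/silt/clay) summing to 1.
raw = rng.dirichlet([4, 3, 2], size=200)
labels = (raw[:, 0] > 0.5).astype(int)          # two illustrative facies groups
means = [raw[labels == k].mean(axis=0) for k in (0, 1)]
pooled_cov = np.cov(raw.T)                      # singular due to the unit-sum constraint
sample = raw[0]
print("assigned class:", int(np.argmin(discriminant_scores(sample, means, pooled_cov))))
```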

  13. Automotive System for Remote Surface Classification.

    PubMed

    Bystrov, Aleksandr; Hoare, Edward; Tran, Thuy-Yung; Clarke, Nigel; Gashinova, Marina; Cherniakov, Mikhail

    2017-04-01

    In this paper we shall discuss a novel approach to road surface recognition, based on the analysis of backscattered microwave and ultrasonic signals. The novelty of our method is sonar and polarimetric radar data fusion, extraction of features for separate swathes of illuminated surface (segmentation), and the use of a multi-stage artificial neural network for surface classification. The developed system consists of a 24 GHz radar and a 40 kHz ultrasonic sensor. The features are extracted from backscattered signals and then the procedures of principal component analysis and supervised classification are applied to the feature data. Special attention is paid to the multi-stage artificial neural network, which allows an overall increase in classification accuracy. The proposed technique was tested for recognition of a large number of real surfaces in different weather conditions with an average correct-classification accuracy of 95%. The obtained results thereby demonstrate that the use of the proposed system architecture and statistical methods allows for reliable discrimination of various road surfaces in real conditions.

  14. Automotive System for Remote Surface Classification

    PubMed Central

    Bystrov, Aleksandr; Hoare, Edward; Tran, Thuy-Yung; Clarke, Nigel; Gashinova, Marina; Cherniakov, Mikhail

    2017-01-01

    In this paper we shall discuss a novel approach to road surface recognition, based on the analysis of backscattered microwave and ultrasonic signals. The novelty of our method is sonar and polarimetric radar data fusion, extraction of features for separate swathes of illuminated surface (segmentation), and the use of a multi-stage artificial neural network for surface classification. The developed system consists of a 24 GHz radar and a 40 kHz ultrasonic sensor. The features are extracted from backscattered signals and then the procedures of principal component analysis and supervised classification are applied to the feature data. Special attention is paid to the multi-stage artificial neural network, which allows an overall increase in classification accuracy. The proposed technique was tested for recognition of a large number of real surfaces in different weather conditions with an average correct-classification accuracy of 95%. The obtained results thereby demonstrate that the use of the proposed system architecture and statistical methods allows for reliable discrimination of various road surfaces in real conditions. PMID:28368297

  15. Testing Multivariate Adaptive Regression Splines (MARS) as a Method of Land Cover Classification of TERRA-ASTER Satellite Images.

    PubMed

    Quirós, Elia; Felicísimo, Angel M; Cuartero, Aurora

    2009-01-01

    This work proposes a new method to classify multi-spectral satellite images based on multivariate adaptive regression splines (MARS) and compares this classification system with the more common parallelepiped and maximum likelihood (ML) methods. We apply the classification methods to the land cover classification of a test zone located in southwestern Spain. The basis of the MARS method and its associated procedures are explained in detail, and the area under the ROC curve (AUC) is compared for the three methods. The results show that the MARS method provides better results than the parallelepiped method in all cases, and it provides better results than the maximum likelihood method in 13 cases out of 17. These results demonstrate that the MARS method can be used in isolation or in combination with other methods to improve the accuracy of soil cover classification. The improvement is statistically significant according to the Wilcoxon signed rank test.

  16. High Dimensional Classification Using Features Annealed Independence Rules.

    PubMed

    Fan, Jianqing; Fan, Yingying

    2008-01-01

    Classification using high-dimensional features arises frequently in many contemporary statistical studies such as tumor classification using microarray or other high-throughput data. The impact of dimensionality on classification is still poorly understood. In a seminal paper, Bickel and Levina (2004) show that the Fisher discriminant performs poorly due to diverging spectra and they propose to use the independence rule to overcome the problem. We first demonstrate that even for the independence classification rule, classification using all the features can be as bad as random guessing due to noise accumulation in estimating population centroids in high-dimensional feature space. In fact, we demonstrate further that almost all linear discriminants can perform as badly as random guessing. Thus, it is critically important to select a subset of important features for high-dimensional classification, resulting in Features Annealed Independence Rules (FAIR). The conditions under which all the important features can be selected by the two-sample t-statistic are established. The choice of the optimal number of features, or equivalently the threshold value of the test statistics, is proposed based on an upper bound of the classification error. Simulation studies and real data analysis support our theoretical results and demonstrate convincingly the advantage of our new classification procedure.
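
    The two key ingredients above, two-sample t-statistic feature screening followed by an independence (diagonal-covariance) rule, can be sketched as follows. The number of retained features is fixed here rather than chosen by the paper's error-bound criterion, and the data are synthetic.

```python
# Sketch of the core FAIR ingredients: rank features by two-sample t-statistics,
# keep the top m, and classify with a diagonal-covariance (independence) rule.
# Illustrative implementation on synthetic data, not the authors' code.
import numpy as np

def two_sample_t(X, y):
    X0, X1 = X[y == 0], X[y == 1]
    num = X1.mean(axis=0) - X0.mean(axis=0)
    den = np.sqrt(X1.var(axis=0, ddof=1) / len(X1) + X0.var(axis=0, ddof=1) / len(X0))
    return num / den

def fair_classify(X_train, y_train, X_test, m=20):
    t = two_sample_t(X_train, y_train)
    keep = np.argsort(np.abs(t))[::-1][:m]       # m most discriminative features
    mu0 = X_train[y_train == 0][:, keep].mean(axis=0)
    mu1 = X_train[y_train == 1][:, keep].mean(axis=0)
    # Diagonal variances (a simple stand-in for pooled within-class variances).
    var = X_train[:, keep].var(axis=0, ddof=1)
    d0 = ((X_test[:, keep] - mu0) ** 2 / var).sum(axis=1)
    d1 = ((X_test[:, keep] - mu1) ** 2 / var).sum(axis=1)
    return (d1 < d0).astype(int)

rng = np.random.default_rng(6)
n, p = 100, 1000                                  # few samples, many features
X = rng.normal(size=(n, p))
y = rng.integers(0, 2, size=n)
X[y == 1, :10] += 1.0                             # only 10 features carry signal
print("training error:", np.mean(fair_classify(X, y, X) != y))
```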

  17. A neural network approach to cloud classification

    NASA Technical Reports Server (NTRS)

    Lee, Jonathan; Weger, Ronald C.; Sengupta, Sailes K.; Welch, Ronald M.

    1990-01-01

    It is shown that, using high-spatial-resolution data, very high cloud classification accuracies can be obtained with a neural network approach. A texture-based neural network classifier using only single-channel visible Landsat MSS imagery achieves an overall cloud identification accuracy of 93 percent. Cirrus can be distinguished from boundary layer cloudiness with an accuracy of 96 percent, without the use of an infrared channel. Stratocumulus is retrieved with an accuracy of 92 percent, cumulus at 90 percent. The use of the neural network does not improve cirrus classification accuracy. Rather, its main effect is in the improved separation between stratocumulus and cumulus cloudiness. While most cloud classification algorithms rely on linear parametric schemes, the present study is based on a nonlinear, nonparametric four-layer neural network approach. A three-layer neural network architecture, the nonparametric K-nearest neighbor approach, and the linear stepwise discriminant analysis procedure are compared. A notable finding is that significantly higher accuracies are attained with the nonparametric approaches using only 20 percent of the database as training data, compared to 67 percent of the database in the linear approach.

  18. A Model Assessment and Classification System for Men and Women in Correctional Institutions.

    ERIC Educational Resources Information Center

    Hellervik, Lowell W.; And Others

    The report describes a manpower assessment and classification system for criminal offenders directed towards making practical training and job classification decisions. The model is not concerned with custody classifications except as they affect occupational/training possibilities. The model combines traditional procedures of vocational…

  19. Evidence-based severity assessment: Impact of repeated versus single open-field testing on welfare in C57BL/6J mice.

    PubMed

    Bodden, Carina; Siestrup, Sophie; Palme, Rupert; Kaiser, Sylvia; Sachser, Norbert; Richter, S Helene

    2018-01-15

    According to current guidelines on animal experiments, a prospective assessment of the severity of each procedure is mandatory. However, so far, the classification of procedures into different severity categories mainly relies on theoretic considerations, since it is not entirely clear which of the various procedures compromise the welfare of animals, or, to what extent. Against this background, a systematic empirical investigation of the impact of each procedure, including behavioral testing, seems essential. Therefore, the present study was designed to elucidate the effects of repeated versus single testing on mouse welfare, using one of the most commonly used paradigms for behavioral phenotyping in behavioral neuroscience, the open-field test. In an independent groups design, laboratory mice (Mus musculus f. domestica) experienced either repeated, single, or no open-field testing - procedures that are assigned to different severity categories. Interestingly, testing experiences did not affect fecal corticosterone metabolites, body weights, elevated plus-maze or home cage behavior differentially. Thus, with respect to the assessed endocrinological, physical, and behavioral outcome measures, no signs of compromised welfare could be detected in mice that were tested in the open-field repeatedly, once, or, not at all. These findings challenge current classification guidelines and may, furthermore, stimulate systematic research on the severity of single procedures involving living animals. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Treatment of congenital vascular disorders: classification, step program, and therapeutic procedures

    NASA Astrophysics Data System (ADS)

    Philipp, Carsten M.; Poetke, Margitta; Engel-Murke, Frank; Waldschmidt, J.; Berlien, Hans-Peter

    1994-02-01

    Because of the different step programs concerning preoperative diagnostics and the onset of therapy for the various types of congenital vascular disorders (CVD), a clear classification is important. One has to distinguish the vascular malformations, including the port wine stain, from the true hemangiomas, which are vascular tumors. Because former classifications, mostly based on histological findings, showed little relevance to a clinical step program, we developed a descriptive classification that allows an early differentiation between the two groups of CVD. In most cases this can be achieved with a precise medical history of the onset and development of the disorder, a close look at the clinical signs, and Duplex ultrasound and MRI diagnostics. With this protocol and the case-adapted use of different lasers and laser techniques, we have not seen any severe complications such as skin necrosis or nerve lesions.

  1. Machine learning in soil classification.

    PubMed

    Bhattacharya, B; Solomatine, D P

    2006-03-01

    In a number of engineering problems, e.g. in geotechnics, petroleum engineering, etc., intervals of measured series data (signals) must be assigned a class while maintaining the constraint of contiguity, and standard classification methods can be inadequate. Classification in this case needs the involvement of an expert who observes the magnitude and trends of the signals in addition to any a priori information that might be available. In this paper, an approach for automating this classification procedure is presented. Firstly, a segmentation algorithm is developed and applied to segment the measured signals. Secondly, the salient features of these segments are extracted using the boundary energy method. Classifiers are then built on the measured data and the extracted features to assign classes to the segments; they employ Decision Trees, ANN and Support Vector Machines. The methodology was tested in classifying sub-surface soil using measured data from Cone Penetration Testing, and satisfactory results were obtained.
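    A hedged sketch of the segment-then-classify pipeline, with simple summary statistics standing in for the boundary energy features and synthetic data standing in for Cone Penetration Testing measurements:

```python
# Illustrative sketch (not the authors' code): classify contiguous signal segments
# with the three classifier families mentioned above.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

def segment_features(signal, boundaries):
    """Split a 1-D signal at the given boundary indices and summarise each segment."""
    feats = []
    for a, b in zip(boundaries[:-1], boundaries[1:]):
        seg = signal[a:b]
        feats.append([seg.mean(), seg.std(), seg.max() - seg.min(), np.mean(np.diff(seg))])
    return np.array(feats)

rng = np.random.default_rng(2)
signal = np.cumsum(rng.normal(size=1000))            # hypothetical CPT-like trace
boundaries = np.arange(0, 1001, 50)                  # fixed segmentation, for illustration only
X = segment_features(signal, boundaries)
y = rng.integers(0, 3, size=len(X))                  # placeholder soil classes

for clf in (DecisionTreeClassifier(), MLPClassifier(max_iter=2000), SVC()):
    print(type(clf).__name__, clf.fit(X, y).score(X, y))
```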

  2. Symbolic dynamic filtering and language measure for behavior identification of mobile robots.

    PubMed

    Mallapragada, Goutham; Ray, Asok; Jin, Xin

    2012-06-01

    This paper presents a procedure for behavior identification of mobile robots, which requires limited or no domain knowledge of the underlying process. While the features of robot behavior are extracted by symbolic dynamic filtering of the observed time series, the behavior patterns are classified based on language measure theory. The behavior identification procedure has been experimentally validated on a networked robotic test bed by comparison with commonly used tools, namely, principal component analysis for feature extraction and Bayesian risk analysis for pattern classification.
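    A minimal sketch of the symbolic dynamic filtering idea as described above; uniform partitioning is assumed here, and the paper's language-measure computation is not reproduced:

```python
# Sketch: partition the signal range into symbols, estimate the symbol transition
# matrix, and use its stationary distribution as a behaviour feature vector.
import numpy as np

def sdf_features(x, n_symbols=8):
    # Uniform partitioning of the signal range into n_symbols cells
    edges = np.linspace(x.min(), x.max(), n_symbols + 1)[1:-1]
    symbols = np.digitize(x, edges)
    # Estimate the state transition probability matrix
    P = np.zeros((n_symbols, n_symbols))
    for a, b in zip(symbols[:-1], symbols[1:]):
        P[a, b] += 1
    P /= np.maximum(P.sum(axis=1, keepdims=True), 1)
    # Stationary distribution: eigenvector of P.T for the eigenvalue closest to 1
    vals, vecs = np.linalg.eig(P.T)
    pi = np.abs(np.real(vecs[:, np.argmin(np.abs(vals - 1))]))
    return pi / pi.sum()

print(sdf_features(np.sin(np.linspace(0, 20, 2000))))
```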

  3. The changes in hazard classification and product notification procedures of the new European CLP and Cosmetics Regulations.

    PubMed

    de Groot, Ronald; Brekelmans, Pieter; Herremans, Joke; Meulenbelt, Jan

    2010-01-01

    The United Nations Globally Harmonized System of Classification and Labelling of Chemicals (UN-GHS) is developed to harmonize the criteria for hazard communication worldwide. The European Regulation on classification, labeling, and packaging of substances and mixtures [CLP Regulation (European Commission, EC) No 1272/2008] will align the existing European Union (EU) legislation to the UN-GHS. This CLP Regulation entered into force on January 20, 2009, and will, after a transitional period, replace the current rules on classification, labeling, and packaging for supply and use in Europe. Both old and new classifications will exist simultaneously until 2010 for substances and until 2015 for mixtures. The new hazard classification will introduce new health hazard classes and categories, with associated new hazard pictograms, signal words, Hazard (H)-statements, and Precautionary (P)-statements as labeling elements. Furthermore, the CLP Regulation will affect the notification of product information on hazardous products to poisons information centers (PICs). At this moment product notification widely varies in procedures and requirements across EU Member States. Article 45 of the CLP Regulation contains a provision stating that the EC will (by January 20, 2012) review the possibility of harmonizing product notification. The European Association of Poisons Centres and Clinical Toxicologists (EAPCCT) is recognized as an important stakeholder. For cosmetic products, the new Cosmetics Regulation will directly implement a new procedure for electronic cosmetic product notification in all EU Member States. Both the CLP Regulation and the Cosmetics Regulation will develop their own product notification procedure within different time frames. Harmonization of notification procedures for both product groups, especially a common electronic format, would be most effective from a cost-benefit viewpoint and would be welcomed by PICs.

  4. Guidelines for procedural pain in the newborn

    PubMed Central

    Lago, Paola; Garetti, Elisabetta; Merazzi, Daniele; Pieragostini, Luisa; Ancora, Gina; Pirelli, Anna; Bellieni, Carlo Valerio

    2009-01-01

    Despite accumulating evidence that procedural pain experienced by newborn infants may have acute and even long-term detrimental effects on their subsequent behaviour and neurological outcome, pain control and prevention remain controversial issues. Our aim was to develop guidelines based on evidence and clinical practice for preventing and controlling neonatal procedural pain in the light of the evidence-based recommendations contained in the SIGN classification. A panel of expert neonatologists used systematic review, data synthesis and open discussion to reach a consensus on the level of evidence supported by the literature or customs in clinical practice and to describe a global analgesic management, considering pharmacological, non-pharmacological, behavioural and environmental measures for each invasive procedure. There is strong evidence to support some analgesic measures, e.g. sucrose or breast milk for minor invasive procedures, and combinations of drugs for tracheal intubation. Many other pain control measures used during chest tube placement and removal, screening and treatment for retinopathy of prematurity (ROP), or for postoperative pain, are still based not on evidence, but on good practice or expert opinions. Conclusion: These guidelines should help improve health care professionals' awareness of the need to adequately manage procedural pain in neonates, based on the strongest evidence currently available. PMID:19484828

  5. The economic implications of a multimodal analgesic regimen for patients undergoing major orthopedic surgery: a comparative study of direct costs.

    PubMed

    Duncan, Christopher M; Hall Long, Kirsten; Warner, David O; Hebl, James R

    2009-01-01

    Total knee and total hip arthroplasty (THA) are 2 of the most common surgical procedures performed in the United States and represent the greatest single Medicare procedural expenditure. This study was designed to evaluate the economic impact of implementing a multimodal analgesic regimen (Total Joint Regional Anesthesia [TJRA] Clinical Pathway) on the estimated direct medical costs of patients undergoing lower extremity joint replacement surgery. An economic cost comparison was performed on Mayo Clinic patients (n = 100) undergoing traditional total knee or total hip arthroplasty using the TJRA Clinical Pathway. Study patients were matched 1:1 with historical controls undergoing similar procedures using traditional anesthetic (non-TJRA) techniques. Matching criteria included age, sex, surgeon, type of procedure, and American Society of Anesthesiologists (ASA) physical status (PS) classification. Hospital-based direct costs were collected for each patient and analyzed in standardized inflation-adjusted constant dollars using cost-to-charge ratios, wage indexes, and physician services valued using Medicare reimbursement rates. The estimated mean direct hospital costs were compared between groups, and a subgroup analysis was performed based on ASA PS classification. The estimated mean direct hospital costs were significantly reduced among TJRA patients when compared with controls (cost difference, 1999 dollars; 95% confidence interval, 584-3231 dollars; P = 0.0004). A significant reduction in hospital-based (Medicare Part A) costs accounted for the majority of the total cost savings. Use of a comprehensive, multimodal analgesic regimen (TJRA Clinical Pathway) in patients undergoing lower extremity joint replacement surgery provides a significant reduction in the estimated total direct medical costs. The reduction in mean cost is primarily associated with lower hospital-based (Medicare Part A) costs, with the greatest overall cost difference appearing among patients with significant comorbidities (ASA PS III-IV patients).

  6. Classification of parotidectomies: a proposal of the European Salivary Gland Society.

    PubMed

    Quer, M; Guntinas-Lichius, O; Marchal, F; Vander Poorten, V; Chevalier, D; León, X; Eisele, D; Dulguerov, P

    2016-10-01

    The objective of this study is to provide a comprehensive classification system for parotidectomy operations. Data sources include Medline publications, the authors' experience, and a consensus round table at the Third European Salivary Gland Society (ESGS) Meeting. The Medline database was searched with the terms "parotidectomy" and "definition". The various definitions of parotidectomy procedures and parotid gland subdivisions were extracted, previous classification systems were re-examined, and a new classification was proposed by consensus. The ESGS proposes to subdivide the parotid parenchyma into five levels: I (lateral superior), II (lateral inferior), III (deep inferior), IV (deep superior), V (accessory). A new classification is proposed in which the type of resection is divided into formal parotidectomy with facial nerve dissection and extracapsular dissection. Parotidectomies are further classified according to the levels removed, as well as the extra-parotid structures ablated. A new classification of parotidectomy procedures is proposed.

  7. Mössbauer investigations to characterize Fe lattice sites in sheet silicates and Peru Basin deep-sea sediments

    NASA Astrophysics Data System (ADS)

    Lougear, André; König, Iris; Trautwein, Alfred X.; Suess, Erwin

    A procedure to classify different Fe lattice sites, i.e., OH-group geometries, in the clay mineral content of deep-sea sediments was developed using Mössbauer spectroscopy at low temperature (77 K). This speciation is of interest with regard to the redox behavior, reactivity and color of marine sediments, since substantial iron redox transitions (associated with sediment color change) have been documented for the structural sheet silicate iron. Lattice site classification was achieved for the Fe(II) fraction, all of which is structural clay Fe(II) in the sediments under investigation. Whereas the major part of the Fe(III) is structural clay iron as well, there is a small Fe(III) fraction in oxide minerals. Therefore, further elaboration of the procedure would be required to also achieve lattice site classification for the Fe(III) fraction. Analysis of the Mössbauer spectra is based on computer fits, the input parameters of which were derived from a separate study of Fe(II)-rich pure chlorites. The procedure of classification is qualified to investigate, e.g., in laboratory experiments, the site-specific reaction rates and the effects on sediment color of iron redox transitions in the sheet silicate content of sediments. The new skills were successfully applied in environmental impact studies on the mining of polymetallic nodules from the Peru Basin deep-sea floor.

  8. Cognitive approaches for patterns analysis and security applications

    NASA Astrophysics Data System (ADS)

    Ogiela, Marek R.; Ogiela, Lidia

    2017-08-01

    This paper presents new opportunities for developing innovative solutions for semantic pattern classification and visual cryptography based on cognitive and bio-inspired approaches. Such techniques can be used to evaluate the meaning of analyzed patterns or encrypted information, and allow that meaning to be incorporated into the classification task or encryption process. They also allow crypto-biometric solutions to be used to extend personalized cryptography methodologies based on visual pattern analysis. In particular, the application of cognitive information systems to the semantic analysis of different patterns is presented, along with a novel application of such systems to visual secret sharing. Visual shares for divided information can be created with a threshold procedure, which may depend on personal ability to recognize image details visible on the divided images.

  9. Algorithms for Hyperspectral Endmember Extraction and Signature Classification with Morphological Dendritic Networks

    NASA Astrophysics Data System (ADS)

    Schmalz, M.; Ritter, G.

    Accurate multispectral or hyperspectral signature classification is key to the nonimaging detection and recognition of space objects. Additionally, signature classification accuracy depends on accurate spectral endmember determination [1]. Previous approaches to endmember computation and signature classification were based on linear operators or neural networks (NNs) expressed in terms of the algebra (R, +, ×) [1,2]. Unfortunately, class separation in these methods tends to be suboptimal, and the number of signatures that can be accurately classified often depends linearly on the number of NN inputs. This can lead to poor endmember distinction, as well as potentially significant classification errors in the presence of noise or densely interleaved signatures. In contrast to traditional NNs, autoassociative morphological memories (AMMs) are a construct similar to Hopfield autoassociative memories, defined on the (R, +, ∨, ∧) lattice algebra [3]. Unlimited storage and perfect recall of noiseless real-valued patterns has been proven for AMMs [4]. However, AMMs suffer from sensitivity to specific noise models, which can be characterized as erosive and dilative noise. On the other hand, the prior definition of a set of endmembers corresponds to material spectra lying on vertices of the minimum convex region covering the image data. These vertices can be characterized as morphologically independent patterns. It has further been shown that AMMs can be based on dendritic computation [3,6]. These techniques yield improved accuracy and class segmentation/separation ability in the presence of highly interleaved signature data. In this paper, we present a procedure for endmember determination based on AMM noise sensitivity, which employs morphological dendritic computation. We show that detected endmembers can be exploited by AMM-based classification techniques to achieve accurate signature classification in the presence of noise, closely spaced or interleaved signatures, and simulated camera optical distortions. In particular, we examine two critical cases: (1) classification of multiple closely spaced signatures that are difficult to separate using distance measures, and (2) classification of materials in simulated hyperspectral images of spaceborne satellites. In each case, test data are derived from a NASA database of space material signatures. Additional analysis pertains to computational complexity and noise sensitivity, which compare favorably with classical NN-based techniques.
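    For illustration, the basic autoassociative morphological memory construction over the (R, +, ∨, ∧) lattice algebra can be written in a few lines of NumPy; this follows the standard Ritter/Sussner formulation cited above and does not include the dendritic extension used in the paper:

```python
# Sketch of a basic autoassociative morphological memory (AMM) using max/min
# (lattice) operations; the dendritic computation of the paper is not reproduced.
import numpy as np

def build_wxx(X):
    """X: (n, k) matrix whose columns are the k stored patterns. Returns W_XX."""
    # W_XX[i, j] = min over stored patterns of (x_i - x_j)
    diffs = X[:, None, :] - X[None, :, :]            # shape (n, n, k)
    return diffs.min(axis=2)

def recall(W, x):
    """Max-plus product: (W boxplus x)_i = max_j (W[i, j] + x_j)."""
    return (W + x[None, :]).max(axis=1)

X = np.array([[1.0, 4.0], [2.0, 0.0], [3.0, 5.0]])   # two 3-dimensional patterns (columns)
W = build_wxx(X)
print(np.allclose(recall(W, X[:, 0]), X[:, 0]))       # perfect recall of a stored pattern
```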

  10. Voxel-Based Neighborhood for Spatial Shape Pattern Classification of Lidar Point Clouds with Supervised Learning

    PubMed Central

    Plaza-Leiva, Victoria; Gomez-Ruiz, Jose Antonio; Mandow, Anthony; García-Cerezo, Alfonso

    2017-01-01

    Improving the effectiveness of spatial shape features classification from 3D lidar data is very relevant because it is largely used as a fundamental step towards higher level scene understanding challenges of autonomous vehicles and terrestrial robots. In this sense, computing neighborhood for points in dense scans becomes a costly process for both training and classification. This paper proposes a new general framework for implementing and comparing different supervised learning classifiers with a simple voxel-based neighborhood computation where points in each non-overlapping voxel in a regular grid are assigned to the same class by considering features within a support region defined by the voxel itself. The contribution provides offline training and online classification procedures as well as five alternative feature vector definitions based on principal component analysis for scatter, tubular and planar shapes. Moreover, the feasibility of this approach is evaluated by implementing a neural network (NN) method previously proposed by the authors as well as three other supervised learning classifiers found in scene processing methods: support vector machines (SVM), Gaussian processes (GP), and Gaussian mixture models (GMM). A comparative performance analysis is presented using real point clouds from both natural and urban environments and two different 3D rangefinders (a tilting Hokuyo UTM-30LX and a Riegl). Classification performance metrics and processing time measurements confirm the benefits of the NN classifier and the feasibility of voxel-based neighborhood. PMID:28294963
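    A hedged sketch of eigenvalue-based shape descriptors for the points inside one voxel, one common way to encode scatter, tubular and planar structure; the paper's five alternative feature vector definitions are not reproduced here:

```python
# Sketch: principal-component (eigenvalue) shape features for one voxel's points.
import numpy as np

def voxel_shape_features(points):
    """points: (m, 3) array of lidar returns falling inside one voxel."""
    cov = np.cov(points.T)
    evals = np.sort(np.linalg.eigvalsh(cov))[::-1]     # lambda1 >= lambda2 >= lambda3 >= 0
    l1, l2, l3 = np.maximum(evals, 1e-12)
    linearity = (l1 - l2) / l1                         # high for tubular shapes
    planarity = (l2 - l3) / l1                         # high for planar shapes
    scattering = l3 / l1                               # high for volumetric scatter
    return np.array([linearity, planarity, scattering])

pts = np.random.default_rng(3).normal(size=(200, 3)) * [5.0, 1.0, 0.2]  # roughly planar cloud
print(voxel_shape_features(pts))
```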

  11. 40 CFR 152.160 - Scope.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.160 Scope. (a) Types of classification. A pesticide product may be unclassified, or it may be classified for restricted use or for...

  12. 40 CFR 152.160 - Scope.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.160 Scope. (a) Types of classification. A pesticide product may be unclassified, or it may be classified for restricted use or for...

  13. 40 CFR 152.160 - Scope.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.160 Scope. (a) Types of classification. A pesticide product may be unclassified, or it may be classified for restricted use or for...

  14. 40 CFR 152.160 - Scope.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.160 Scope. (a) Types of classification. A pesticide product may be unclassified, or it may be classified for restricted use or for...

  15. 40 CFR 152.160 - Scope.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.160 Scope. (a) Types of classification. A pesticide product may be unclassified, or it may be classified for restricted use or for...

  16. 50 CFR 540.3 - Procedures.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... the fact that the Commission does not have original classification authority and national security... conjunction with a transfer of functions, to the appropriate federal agency exercising original classification... contractor of the Commission originates information that is believed to require classification, the Executive...

  17. 50 CFR 540.3 - Procedures.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... the fact that the Commission does not have original classification authority and national security... conjunction with a transfer of functions, to the appropriate federal agency exercising original classification... contractor of the Commission originates information that is believed to require classification, the Executive...

  18. 50 CFR 540.3 - Procedures.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... the fact that the Commission does not have original classification authority and national security... conjunction with a transfer of functions, to the appropriate federal agency exercising original classification... contractor of the Commission originates information that is believed to require classification, the Executive...

  19. The ESHRE/ESGE consensus on the classification of female genital tract congenital anomalies.

    PubMed

    Grimbizis, Grigoris F; Gordts, Stephan; Di Spiezio Sardo, Attilio; Brucker, Sara; De Angelis, Carlo; Gergolet, Marco; Li, Tin-Chiu; Tanos, Vasilios; Brölmann, Hans; Gianaroli, Luca; Campo, Rudi

    2013-08-01

    What classification system is more suitable for the accurate, clear, simple and related to the clinical management categorization of female genital anomalies? The new ESHRE/ESGE classification system of female genital anomalies is presented. Congenital malformations of the female genital tract are common miscellaneous deviations from normal anatomy with health and reproductive consequences. Until now, three systems have been proposed for their categorization but all of them are associated with serious limitations. The European Society of Human Reproduction and Embryology (ESHRE) and the European Society for Gynaecological Endoscopy (ESGE) have established a common Working Group, under the name CONUTA (CONgenital UTerine Anomalies), with the goal of developing a new updated classification system. A scientific committee (SC) has been appointed to run the project, looking also for consensus within the scientists working in the field. The new system is designed and developed based on (i) scientific research through critical review of current proposals and preparation of an initial proposal for discussion between the experts, (ii) consensus measurement among the experts through the use of the DELPHI procedure and (iii) consensus development by the SC, taking into account the results of the DELPHI procedure and the comments of the experts. Almost 90 participants took part in the process of development of the ESHRE/ESGE classification system, contributing with their structured answers and comments. The ESHRE/ESGE classification system is based on anatomy. Anomalies are classified into the following main classes, expressing uterine anatomical deviations deriving from the same embryological origin: U0, normal uterus; U1, dysmorphic uterus; U2, septate uterus; U3, bicorporeal uterus; U4, hemi-uterus; U5, aplastic uterus; U6, for still unclassified cases. Main classes have been divided into sub-classes expressing anatomical varieties with clinical significance. Cervical and vaginal anomalies are classified independently into sub-classes having clinical significance. The ESHRE/ESGE classification of female genital anomalies seems to fulfill the expectations and the needs of the experts in the field, but its clinical value needs to be proved in everyday practice. The ESHRE/ESGE classification system of female genital anomalies could be used as a starting point for the development of guidelines for their diagnosis and treatment. None.

  20. 47 CFR 63.13 - Procedures for modifying regulatory classification of U.S. international carriers from dominant...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... classification of U.S. international carriers from dominant to non-dominant. 63.13 Section 63.13... for modifying regulatory classification of U.S. international carriers from dominant to non-dominant... in its application to demonstrate that it qualifies for non-dominant classification pursuant to § 63...

  1. 47 CFR 63.13 - Procedures for modifying regulatory classification of U.S. international carriers from dominant...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... classification of U.S. international carriers from dominant to non-dominant. 63.13 Section 63.13... for modifying regulatory classification of U.S. international carriers from dominant to non-dominant... in its application to demonstrate that it qualifies for non-dominant classification pursuant to § 63...

  2. 21 CFR 860.5 - Confidentiality and use of data and information submitted in connection with classification and...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... submitted in connection with classification and reclassification. 860.5 Section 860.5 Food and Drugs FOOD... DEVICE CLASSIFICATION PROCEDURES General § 860.5 Confidentiality and use of data and information submitted in connection with classification and reclassification. (a) This section governs the availability...

  3. 21 CFR 860.5 - Confidentiality and use of data and information submitted in connection with classification and...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... submitted in connection with classification and reclassification. 860.5 Section 860.5 Food and Drugs FOOD... DEVICE CLASSIFICATION PROCEDURES General § 860.5 Confidentiality and use of data and information submitted in connection with classification and reclassification. (a) This section governs the availability...

  4. 21 CFR 860.5 - Confidentiality and use of data and information submitted in connection with classification and...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... submitted in connection with classification and reclassification. 860.5 Section 860.5 Food and Drugs FOOD... DEVICE CLASSIFICATION PROCEDURES General § 860.5 Confidentiality and use of data and information submitted in connection with classification and reclassification. (a) This section governs the availability...

  5. 21 CFR 860.5 - Confidentiality and use of data and information submitted in connection with classification and...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... submitted in connection with classification and reclassification. 860.5 Section 860.5 Food and Drugs FOOD... DEVICE CLASSIFICATION PROCEDURES General § 860.5 Confidentiality and use of data and information submitted in connection with classification and reclassification. (a) This section governs the availability...

  6. 21 CFR 860.5 - Confidentiality and use of data and information submitted in connection with classification and...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... submitted in connection with classification and reclassification. 860.5 Section 860.5 Food and Drugs FOOD... DEVICE CLASSIFICATION PROCEDURES General § 860.5 Confidentiality and use of data and information submitted in connection with classification and reclassification. (a) This section governs the availability...

  7. Towards a robust framework for catchment classification

    NASA Astrophysics Data System (ADS)

    Deshmukh, A.; Samal, A.; Singh, R.

    2017-12-01

    Classification of catchments based on various measures of similarity has emerged as an important technique to understand regional-scale hydrologic behavior. Classification of catchment characteristics and/or streamflow response has been used to reveal which characteristics are more likely to explain the observed variability of hydrologic response. However, numerous algorithms for supervised or unsupervised classification are available, making it hard to identify the algorithm most suitable for the dataset at hand. Consequently, existing catchment classification studies vary significantly in the classification algorithms employed, with no previous attempt at understanding the degree of uncertainty in classification due to this algorithmic choice. This hinders the generalizability of interpretations related to hydrologic behavior. Our goal is to develop a protocol that can be followed while classifying hydrologic datasets. We focus on a framework for unsupervised classification and provide a step-by-step classification procedure. The steps include testing the clusterability of the original dataset prior to classification, feature selection, validation of the clustered data, and quantification of the similarity of two clusterings. We test several commonly available methods within this framework to understand the level of similarity of classification results across algorithms. We apply the proposed framework to recently developed datasets for India to analyze to what extent catchment properties can explain observed catchment response. Our testing dataset includes watershed characteristics for over 200 watersheds, comprising both natural (physio-climatic) and socio-economic characteristics. This framework allows us to understand the controls on observed hydrologic variability across India.
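    Two of the protocol steps, internal validation of a clustering and quantification of the similarity of two clusterings, can be sketched with standard tools (the catchment attributes below are synthetic placeholders, not the India dataset):

```python
# Sketch: validate a clustering with the silhouette score and quantify the
# agreement between two clustering algorithms with the adjusted Rand index.
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.metrics import silhouette_score, adjusted_rand_score

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 6))                          # placeholder catchment attributes

labels_km = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
labels_ac = AgglomerativeClustering(n_clusters=4).fit_predict(X)

print("silhouette (k-means):", silhouette_score(X, labels_km))
print("agreement between algorithms (ARI):", adjusted_rand_score(labels_km, labels_ac))
```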

  8. Accuracy assessment, using stratified plurality sampling, of portions of a LANDSAT classification of the Arctic National Wildlife Refuge Coastal Plain

    NASA Technical Reports Server (NTRS)

    Card, Don H.; Strong, Laurence L.

    1989-01-01

    An application of a classification accuracy assessment procedure is described for a vegetation and land cover map prepared by digital image processing of LANDSAT multispectral scanner data. A statistical sampling procedure called Stratified Plurality Sampling was used to assess the accuracy of portions of a map of the Arctic National Wildlife Refuge coastal plain. Results are tabulated as percent correct classification overall as well as per category with associated confidence intervals. Although values of percent correct were disappointingly low for most categories, the study was useful in highlighting sources of classification error and demonstrating shortcomings of the plurality sampling method.
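    A generic sketch of overall and per-category percent correct with approximate binomial confidence intervals from a confusion matrix; this is not the stratified plurality estimator itself, and the matrix below is hypothetical:

```python
# Sketch: accuracy per category with normal-approximation 95% confidence intervals.
import numpy as np

def accuracy_with_ci(confusion, z=1.96):
    confusion = np.asarray(confusion, dtype=float)
    n_total = confusion.sum()
    overall = np.trace(confusion) / n_total
    results = {"overall": (overall, z * np.sqrt(overall * (1 - overall) / n_total))}
    for i in range(confusion.shape[0]):
        n_i = confusion[i].sum()
        p_i = confusion[i, i] / n_i if n_i else np.nan
        half = z * np.sqrt(p_i * (1 - p_i) / n_i) if n_i else np.nan
        results[f"category_{i}"] = (p_i, half)
    return results

# Hypothetical 3-category confusion matrix (rows = reference, columns = map)
cm = [[40, 5, 5], [8, 30, 2], [6, 4, 20]]
for name, (p, half) in accuracy_with_ci(cm).items():
    print(f"{name}: {100 * p:.1f}% +/- {100 * half:.1f}%")
```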

  9. 40 CFR 260.33 - Procedures for variances from classification as a solid waste, for variances to be classified as...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... classification as a solid waste, for variances to be classified as a boiler, or for non-waste determinations. 260.33 Section 260.33 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) HAZARDOUS WASTE MANAGEMENT SYSTEM: GENERAL Rulemaking Petitions § 260.33 Procedures for variances...

  10. 40 CFR 260.33 - Procedures for variances from classification as a solid waste, for variances to be classified as...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... classification as a solid waste, for variances to be classified as a boiler, or for non-waste determinations. 260.33 Section 260.33 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) HAZARDOUS WASTE MANAGEMENT SYSTEM: GENERAL Rulemaking Petitions § 260.33 Procedures for variances...

  11. 40 CFR 260.33 - Procedures for variances from classification as a solid waste, for variances to be classified as...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... classification as a solid waste, for variances to be classified as a boiler, or for non-waste determinations. 260.33 Section 260.33 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) HAZARDOUS WASTE MANAGEMENT SYSTEM: GENERAL Rulemaking Petitions § 260.33 Procedures for variances...

  12. Crop Identification Technology Assessment for Remote Sensing (CITARS)

    NASA Technical Reports Server (NTRS)

    Bauer, M. E.; Cary, T. K.; Davis, B. J.; Swain, P. H.

    1975-01-01

    The results of classifications and experiments performed for the Crop Identification Technology Assessment for Remote Sensing (CITARS) project are summarized. Fifteen data sets were classified using two analysis procedures. One procedure used class weights while the other assumed equal probabilities of occurrence for all classes. In addition, 20 data sets were classified using training statistics from another segment or date. The results of both the local and non-local classifications in terms of classification and proportion estimation are presented. Several additional experiments are described which were performed to provide additional understanding of the CITARS results. These experiments investigated alternative analysis procedures, training set selection and size, effects of multitemporal registration, the spectral discriminability of corn, soybeans, and other, and analysis of aircraft multispectral data.

  13. A generalized procedure for analyzing sustained and dynamic vocal fold vibrations from laryngeal high-speed videos using phonovibrograms.

    PubMed

    Unger, Jakob; Schuster, Maria; Hecker, Dietmar J; Schick, Bernhard; Lohscheller, Jörg

    2016-01-01

    This work presents a computer-based approach to analyze the two-dimensional vocal fold dynamics of endoscopic high-speed videos, and constitutes an extension and generalization of a previously proposed wavelet-based procedure. While most approaches aim at analyzing sustained phonation conditions, the proposed method allows for a clinically adequate analysis of both dynamic and sustained phonation paradigms. The analysis procedure is based on a spatio-temporal visualization technique, the phonovibrogram, which facilitates the documentation of the visible laryngeal dynamics. From the phonovibrogram, a low-dimensional set of features is computed using a principal component analysis strategy that quantifies the type of vibration patterns, irregularity, lateral symmetry and synchronicity as a function of time. Two different test bench data sets are used to validate the approach: (I) 150 healthy and pathologic subjects examined during sustained phonation; (II) 20 healthy and pathologic subjects who were examined twice: during sustained phonation and during a glissando from a low to a higher fundamental frequency. In order to assess the discriminative power of the extracted features, a Support Vector Machine is trained to distinguish between physiologic and pathologic vibrations. The results for sustained phonation sequences are compared to the previous approach. Finally, the classification performance of the stationary analysis procedure is compared to the transient analysis of the glissando maneuver. For the first test bench the proposed procedure outperformed the previous approach (proposed feature set: accuracy 91.3%, sensitivity 80%, specificity 97%; previous approach: accuracy 89.3%, sensitivity 76%, specificity 96%). Comparing the classification performance on the second test bench further corroborates that analyzing transient paradigms provides clear additional diagnostic value (glissando maneuver: accuracy 90%, sensitivity 100%, specificity 80%; sustained phonation: accuracy 75%, sensitivity 80%, specificity 70%). The incorporation of parameters describing the temporal evolution of vocal fold vibration clearly improves the automatic identification of pathologic vibration patterns. Furthermore, incorporating a dynamic phonation paradigm provides additional valuable information about the underlying laryngeal dynamics that cannot be derived from sustained conditions. The proposed generalized approach provides better overall classification performance than the previous approach, and hence constitutes a new advantageous tool for an improved clinical diagnosis of voice disorders. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Ground Truth Sampling and LANDSAT Accuracy Assessment

    NASA Technical Reports Server (NTRS)

    Robinson, J. W.; Gunther, F. J.; Campbell, W. J.

    1982-01-01

    It is noted that the key factor in any accuracy assessment of remote sensing data is the method used for determining the ground truth, independent of the remote sensing data itself. The sampling and accuracy-assessment procedures developed for a nuclear power plant siting study are described. The purpose of the sampling procedure was to provide data for developing supervised classifications for two study sites and for assessing the accuracy of those and the other procedures used. The purpose of the accuracy assessment was to allow the comparison of the cost and accuracy of various classification procedures as applied to various data types.

  15. Topobathymetric LiDAR point cloud processing and landform classification in a tidal environment

    NASA Astrophysics Data System (ADS)

    Skovgaard Andersen, Mikkel; Al-Hamdani, Zyad; Steinbacher, Frank; Rolighed Larsen, Laurids; Brandbyge Ernstsen, Verner

    2017-04-01

    Historically it has been difficult to create high resolution Digital Elevation Models (DEMs) in land-water transition zones due to shallow water depth and often challenging environmental conditions. This gap of information has been reflected as a "white ribbon" with no data in the land-water transition zone. In recent years, the technology of airborne topobathymetric Light Detection and Ranging (LiDAR) has proven capable of filling out the gap by simultaneously capturing topographic and bathymetric elevation information, using only a single green laser. We collected green LiDAR point cloud data in the Knudedyb tidal inlet system in the Danish Wadden Sea in spring 2014. Creating a DEM from a point cloud requires the general processing steps of data filtering, water surface detection and refraction correction. However, there is no transparent and reproducible method for processing green LiDAR data into a DEM, specifically regarding the procedure of water surface detection and modelling. We developed a step-by-step procedure for creating a DEM from raw green LiDAR point cloud data, including a procedure for making a Digital Water Surface Model (DWSM) (see Andersen et al., 2017). Two different classification analyses were applied to the high resolution DEM: A geomorphometric and a morphological classification, respectively. The classification methods were originally developed for a small test area; but in this work, we have used the classification methods to classify the complete Knudedyb tidal inlet system. References Andersen MS, Gergely Á, Al-Hamdani Z, Steinbacher F, Larsen LR, Ernstsen VB (2017). Processing and performance of topobathymetric lidar data for geomorphometric and morphological classification in a high-energy tidal environment. Hydrol. Earth Syst. Sci., 21: 43-63, doi:10.5194/hess-21-43-2017. Acknowledgements This work was funded by the Danish Council for Independent Research | Natural Sciences through the project "Process-based understanding and prediction of morphodynamics in a natural coastal system in response to climate change" (Steno Grant no. 10-081102) and by the Geocenter Denmark through the project "Closing the gap! - Coherent land-water environmental mapping (LAWA)" (Grant no. 4-2015).

  16. Image processing and classification procedures for analysis of sub-decimeter imagery acquired with an unmanned aircraft over arid rangelands

    USDA-ARS?s Scientific Manuscript database

    Using five centimeter resolution images acquired with an unmanned aircraft system (UAS), we developed and evaluated an image processing workflow that included the integration of resolution-appropriate field sampling, feature selection, object-based image analysis, and processing approaches for UAS i...

  17. Development of a corn and soybean labeling procedure for use with profile parameter classification

    NASA Technical Reports Server (NTRS)

    Magness, E. R. (Principal Investigator)

    1982-01-01

    Some essential processes for the development of a green-number-based logic for identifying (labeling) crops in LANDSAT imagery are documented. The supporting data and subsequent conclusions that resulted from development of a specific labeling logic for corn and soybean crops in the United States are recorded.

  18. R package PRIMsrc: Bump Hunting by Patient Rule Induction Method for Survival, Regression and Classification

    PubMed Central

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil

    2015-01-01

    PRIMsrc is a novel implementation of a non-parametric bump hunting procedure, based on the Patient Rule Induction Method (PRIM), offering a unified treatment of outcome variables, including censored time-to-event (Survival), continuous (Regression) and discrete (Classification) responses. To fit the model, it uses a recursive peeling procedure with specific peeling criteria and stopping rules depending on the response. To validate the model, it provides an objective function based on prediction-error or other specific statistic, as well as two alternative cross-validation techniques, adapted to the task of decision-rule making and estimation in the three types of settings. PRIMsrc comes as an open source R package, including at this point: (i) a main function for fitting a Survival Bump Hunting model with various options allowing cross-validated model selection to control model size (#covariates) and model complexity (#peeling steps) and generation of cross-validated end-point estimates; (ii) parallel computing; (iii) various S3-generic and specific plotting functions for data visualization, diagnostic, prediction, summary and display of results. It is available on CRAN and GitHub. PMID:26798326
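    The core top-down peeling idea behind PRIM can be sketched in Python as follows; this is only an illustration of the peeling loop, not the PRIMsrc package, which additionally handles survival endpoints, cross-validation, model-size selection and plotting:

```python
# Minimal Python sketch of PRIM-style top-down peeling: repeatedly remove a
# fraction alpha from one side of one variable, keeping the cut that maximizes
# the mean response inside the remaining box.
import numpy as np

def prim_peel(X, y, alpha=0.1, min_support=0.05):
    """Return (lower, upper) bounds of a box that concentrates large y values."""
    n, p = X.shape
    lower, upper = X.min(0).astype(float), X.max(0).astype(float)
    inside = np.ones(n, dtype=bool)
    while inside.mean() > min_support:
        best = None
        for j in range(p):
            for side, q in (("low", alpha), ("high", 1 - alpha)):
                cut = np.quantile(X[inside, j], q)
                keep = inside & (X[:, j] >= cut if side == "low" else X[:, j] <= cut)
                if keep.sum() and (best is None or y[keep].mean() > best[0]):
                    best = (y[keep].mean(), j, side, cut, keep)
        _, j, side, cut, keep = best
        if side == "low":
            lower[j] = cut
        else:
            upper[j] = cut
        inside = keep
    return lower, upper

rng = np.random.default_rng(5)
X = rng.uniform(size=(500, 3))
y = (X[:, 0] > 0.7).astype(float) + rng.normal(scale=0.1, size=500)  # bump where x0 > 0.7
print(prim_peel(X, y))
```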

  19. 46 CFR 8.200 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Recognition of a Classification Society § 8.200 Purpose. This subpart establishes criteria and procedures for vessel classification societies to obtain recognition from the Coast Guard. This recognition is necessary in order for a classification society to become authorized to perform vessel inspection and...

  20. 46 CFR 8.200 - Purpose.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Recognition of a Classification Society § 8.200 Purpose. This subpart establishes criteria and procedures for vessel classification societies to obtain recognition from the Coast Guard. This recognition is necessary in order for a classification society to become authorized to perform vessel inspection and...

  1. Automatic Detection of Welding Defects using Deep Neural Network

    NASA Astrophysics Data System (ADS)

    Hou, Wenhui; Wei, Ye; Guo, Jie; Jin, Yi; Zhu, Chang'an

    2018-01-01

    In this paper, we propose an automatic detection scheme with three stages for weld defects in X-ray images. First, a preprocessing procedure is applied to each image to locate the weld region. Then a classification model, trained and tested on patches cropped from the X-ray images, is constructed based on a deep neural network; this model learns the intrinsic features of the images without extra computation. Finally, a sliding-window approach is used to scan whole images with the trained model. To evaluate the performance of the model, we carry out several experiments. The results demonstrate that the proposed classification model is effective for assessing weld joint quality.
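    A sketch of the third (sliding-window) stage under stated assumptions: `model` stands in for the trained patch classifier and simply exposes a `predict` method, and the defect label is assumed to be 1; the CNN training stage itself is not shown.

```python
# Sketch: scan an image with a fixed-size window and collect defect detections.
import numpy as np

def sliding_window_detect(image, model, patch=32, stride=16):
    """Return (row, col) coordinates of patches classified as defective."""
    hits = []
    h, w = image.shape
    for r in range(0, h - patch + 1, stride):
        for c in range(0, w - patch + 1, stride):
            window = image[r:r + patch, c:c + patch]
            if model.predict(window[None, ...])[0] == 1:   # 1 = defect class (assumed)
                hits.append((r, c))
    return hits

class ThresholdModel:
    """Stand-in for the trained network: flags unusually dark patches."""
    def predict(self, batch):
        return (batch.mean(axis=(1, 2)) < 0.3).astype(int)

img = np.random.default_rng(6).uniform(size=(128, 128))
img[40:60, 50:90] *= 0.2                                   # simulated dark defect region
print(sliding_window_detect(img, ThresholdModel())[:5])
```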

  2. Characterization and classification of South American land cover types using satellite data

    NASA Technical Reports Server (NTRS)

    Townshend, J. R. G.; Justice, C. O.; Kalb, V.

    1987-01-01

    Various methods are compared for carrying out land cover classifications of South America using multitemporal Advanced Very High Resolution Radiometer data. Fifty-two images of the normalized difference vegetation index (NDVI) from a 1-year period are used to generate multitemporal data sets. Three main approaches to land cover classification are considered, namely the use of the principal components transformed images, the use of a characteristic curves procedure based on NDVI values plotted against time, and finally application of the maximum likelihood rule to multitemporal data sets. Comparison of results from training sites indicates that the last approach yields the most accurate results. Despite the reliance on training site figures for performance assessment, the results are nevertheless extremely encouraging, with accuracies for several cover types exceeding 90 per cent.

  3. Field sampling and data analysis methods for development of ecological land classifications: an application on the Manistee National Forest.

    Treesearch

    George E. Host; Carl W. Ramm; Eunice A. Padley; Kurt S. Pregitzer; James B. Hart; David T. Cleland

    1992-01-01

    Presents technical documentation for development of an Ecological Classification System for the Manistee National Forest in northwest Lower Michigan, and suggests procedures applicable to other ecological land classification projects. Includes discussion of sampling design, field data collection, data summarization and analyses, development of classification units,...

  4. Object-based Classification for Detecting Landslides and Stochastic Procedure to landslide susceptibility maps - A Case at Baolai Village, SW Taiwan

    NASA Astrophysics Data System (ADS)

    Lin, Ying-Tong; Chang, Kuo-Chen; Yang, Ci-Jian

    2017-04-01

    As a result of global warming in the past decades, Taiwan has experienced more and more extreme typhoons with hazardous massive landslides. In this study, we use an object-oriented analysis method to classify landslide areas at Baolai village using Formosat-2 satellite images. Multiresolution segmentation was used to generate image objects, and hierarchical logic was used to classify five different kinds of features; the landslides were then assigned to different landslide types. In addition, we used a stochastic procedure to integrate the landslide susceptibility maps. The study assumed the extreme event of Typhoon Morakot (2009), during which precipitation reached 1991.5 mm in 5 days, together with the highest landslide susceptibility. The results show that the landslide area in the study area changed greatly; most landslides were caused by gully erosion producing dip-slope slides, or by stream erosion, especially at undercut banks. The landslide susceptibility maps indicate that old landslide areas have a high potential for renewed landslides in extreme events. This study demonstrates the changes in landslide area and in the landslide-susceptible areas. Keywords: Formosat-2, object-oriented, segmentation, classification, landslide, Baolai Village, SW Taiwan, FS

  5. Context aware decision support in neurosurgical oncology based on an efficient classification of endomicroscopic data.

    PubMed

    Li, Yachun; Charalampaki, Patra; Liu, Yong; Yang, Guang-Zhong; Giannarou, Stamatia

    2018-06-13

    Probe-based confocal laser endomicroscopy (pCLE) enables in vivo, in situ tissue characterisation without changes in the surgical setting and simplifies the oncological surgical workflow. The potential of this technique in identifying residual cancer tissue and improving resection rates of brain tumours has been recently verified in pilot studies. The interpretation of endomicroscopic information is challenging, particularly for surgeons who do not themselves routinely review histopathology. Also, the diagnosis can be examiner-dependent, leading to considerable inter-observer variability. Therefore, automatic tissue characterisation with pCLE would support the surgeon in establishing diagnosis as well as guide robot-assisted intervention procedures. The aim of this work is to propose a deep learning-based framework for brain tissue characterisation for context aware diagnosis support in neurosurgical oncology. An efficient representation of the context information of pCLE data is presented by exploring state-of-the-art CNN models with different tuning configurations. A novel video classification framework based on the combination of convolutional layers with long-range temporal recursion has been proposed to estimate the probability of each tumour class. The video classification accuracy is compared for different network architectures and data representation and video segmentation methods. We demonstrate the application of the proposed deep learning framework to classify Glioblastoma and Meningioma brain tumours based on endomicroscopic data. Results show significant improvement of our proposed image classification framework over state-of-the-art feature-based methods. The use of video data further improves the classification performance, achieving accuracy equal to 99.49%. This work demonstrates that deep learning can provide an efficient representation of pCLE data and accurately classify Glioblastoma and Meningioma tumours. The performance evaluation analysis shows the potential clinical value of the technique.
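    A hedged PyTorch sketch of the general idea of combining a convolutional encoder with long-range temporal recursion (here an LSTM) for video-level classification; the layer sizes, two-class head and input shape are illustrative assumptions, not the authors' architecture:

```python
# Sketch: per-frame CNN features fed to an LSTM, with a linear head on the
# final hidden state producing video-level class logits.
import torch
import torch.nn as nn

class CNNLSTMClassifier(nn.Module):
    def __init__(self, n_classes=2, feat_dim=64, hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
            nn.Flatten(), nn.Linear(32, feat_dim), nn.ReLU(),
        )
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, video):                 # video: (batch, frames, 1, H, W)
        b, t = video.shape[:2]
        feats = self.encoder(video.flatten(0, 1)).view(b, t, -1)
        _, (h_n, _) = self.lstm(feats)
        return self.head(h_n[-1])             # class logits per video

logits = CNNLSTMClassifier()(torch.randn(4, 10, 1, 64, 64))
print(logits.shape)                            # torch.Size([4, 2])
```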

  6. A language of health in action: Read Codes, classifications and groupings.

    PubMed Central

    Stuart-Buttle, C. D.; Read, J. D.; Sanderson, H. F.; Sutton, Y. M.

    1996-01-01

    A cornerstone of the Information Management and Technology Strategy of the National Health Service's (NHS) Executive is fully operational, person-based clinical information systems, from which flow all of the data needed for direct and indirect care of patients by healthcare providers, and local and national management of the NHS. The currency of these data flows is, firstly, Read-coded clinical terms; secondly, the classifications, namely the International Classification of Diseases and Related Health Problems, 10th Revision (ICD-10) and the Office of Population Censuses and Surveys Classification of Surgical Operations and Procedures, 4th Revision (OPCS-4); and thirdly, Healthcare Resource Groups and Health Benefit Groups, all of which together are called the "language of health", an essential element of the electronic clinical record. This paper briefly describes the three main constituents of the language, and how, together with person-based, fully operational clinical information systems, it enables more effective and efficient healthcare delivery. It also describes how the remaining projects of the IM&T Strategy complete the key components necessary to provide the systems that will enable the flow of person-based data, collected once at the point of care and shared amongst all legitimate users via the electronic patient record. PMID:8947631

  7. Combining machine learning and ontological data handling for multi-source classification of nature conservation areas

    NASA Astrophysics Data System (ADS)

    Moran, Niklas; Nieland, Simon; Tintrup gen. Suntrup, Gregor; Kleinschmit, Birgit

    2017-02-01

    Manual field surveys for nature conservation management are expensive and time-consuming and could be supplemented and streamlined by using Remote Sensing (RS). RS is critical to meet the requirements of existing laws such as the EU Habitats Directive (HabDir) and, more importantly, to meet future challenges. The full potential of RS has yet to be harnessed, as different nomenclatures and procedures hinder interoperability, comparison and provenance. Therefore, automated tools are needed to use RS data to produce comparable, empirical data outputs that lend themselves to data discovery and provenance. These issues are addressed by a novel, semi-automatic ontology-based classification method that uses machine learning algorithms and Web Ontology Language (OWL) ontologies and yields traceable, interoperable and observation-based classification outputs. The method was tested on European Union Nature Information System (EUNIS) grasslands in Rheinland-Palatinate, Germany. The developed methodology is a first step in developing observation-based ontologies in the field of nature conservation. The tests show promising results for the determination of the grassland indicators wetness and alkalinity, with an overall accuracy of 85% for alkalinity and 76% for wetness.

  8. Analytical Procedures for Testability.

    DTIC Science & Technology

    1983-01-01

    Cited reports include "Beat Internal Classifications" (AD: A018516) and "A System of Computer Aided Diagnosis with Blood Serum Chemistry Tests and Bayesian Statistics" (AD: 786284). A sequential classification procedure in a coronary care ward is evaluated, and the computer-aided diagnosis system based on blood serum chemistry tests and Bayesian statistics is applied in the toxicology field.

  9. Improved Sparse Multi-Class SVM and Its Application for Gene Selection in Cancer Classification

    PubMed Central

    Huang, Lingkang; Zhang, Hao Helen; Zeng, Zhao-Bang; Bushel, Pierre R.

    2013-01-01

    Background Microarray techniques provide promising tools for cancer diagnosis using gene expression profiles. However, molecular diagnosis based on high-throughput platforms presents great challenges due to the overwhelming number of variables versus the small sample size and the complex nature of multi-type tumors. Support vector machines (SVMs) have shown superior performance in cancer classification due to their ability to handle high dimensional low sample size data. The multi-class SVM algorithm of Crammer and Singer provides a natural framework for multi-class learning. Despite its effective performance, the procedure utilizes all variables without selection. In this paper, we propose to improve the procedure by imposing shrinkage penalties in learning to enforce solution sparsity. Results The original multi-class SVM of Crammer and Singer is effective for multi-class classification but does not conduct variable selection. We improved the method by introducing soft-thresholding type penalties to incorporate variable selection into multi-class classification for high dimensional data. The new methods were applied to simulated data and two cancer gene expression data sets. The results demonstrate that the new methods can select a small number of genes for building accurate multi-class classification rules. Furthermore, the important genes selected by the methods overlap significantly, suggesting general agreement among different variable selection schemes. Conclusions High accuracy and sparsity make the new methods attractive for cancer diagnostics with gene expression data and defining targets of therapeutic intervention. Availability: The source MATLAB code are available from http://math.arizona.edu/~hzhang/software.html. PMID:23966761
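    As a simpler stand-in for the proposed method (not the Crammer-Singer soft-thresholding algorithm of the paper), an L1-penalised one-vs-rest linear SVM already illustrates how a sparsity penalty selects a small gene subset; the data below are synthetic placeholders:

```python
# Sketch: sparse gene selection with an L1-penalised linear SVM (one-vs-rest).
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(7)
X = rng.normal(size=(60, 2000))                   # 60 samples, 2000 "genes"
y = rng.integers(0, 3, size=60)                   # three tumour classes
X[y == 1, :10] += 2.0                             # make a few genes informative
X[y == 2, 10:20] -= 2.0

clf = LinearSVC(penalty="l1", dual=False, C=0.1, max_iter=5000).fit(X, y)
selected = np.unique(np.nonzero(clf.coef_)[1])    # genes with any nonzero weight
print(f"{len(selected)} of {X.shape[1]} genes selected")
```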

  10. Classifying quantum entanglement through topological links

    NASA Astrophysics Data System (ADS)

    Quinta, Gonçalo M.; André, Rui

    2018-04-01

    We propose an alternative classification scheme for quantum entanglement based on topological links. This is done by identifying a nonrigid ring with a particle, attributing the act of cutting and removing a ring to the operation of tracing out the particle, and associating linked rings with entangled particles. This analogy naturally leads us to a classification of multipartite quantum entanglement based on all possible distinct links for a given number of rings. To determine all different possibilities, we develop a formalism that associates any link to a polynomial, with each polynomial thereby defining a distinct equivalence class. To demonstrate the use of this classification scheme, we choose qubit quantum states as our example of a physical system. A possible procedure to obtain qubit states from the polynomials is also introduced, providing an example state for each link class. We apply the formalism to the quantum systems of three and four qubits and demonstrate the potential of these tools in a context of qubit networks.

  11. Resting State EEG-based biometrics for individual identification using convolutional neural networks.

    PubMed

    Ma, Lan; Minett, James W; Blu, Thierry; Wang, William S-Y

    2015-08-01

    Biometrics is a growing field, which permits identification of individuals by means of unique physical features. Electroencephalography (EEG)-based biometrics utilizes the small intra-personal differences and large inter-personal differences between individuals' brainwave patterns. In the past, such methods have used features derived from manually-designed procedures for this purpose. Another possibility is to use convolutional neural networks (CNN) to automatically extract an individual's best and most unique neural features and conduct classification, using EEG data derived from both Resting State with Open Eyes (REO) and Resting State with Closed Eyes (REC). Results indicate that this CNN-based joint-optimized EEG-based biometric system yields a high degree of identification accuracy (88%) for 10-class classification. Furthermore, rich inter-personal differences can be found using a very low frequency band (0-2 Hz). Additionally, results suggest that the temporal window over which subjects can be individualized is less than 200 ms.
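    The paper's network architecture is not specified in the abstract; the following is a minimal, illustrative PyTorch sketch of a 1D CNN for multi-subject identification from EEG epochs. The channel count (64), epoch length (250 samples) and layer sizes are assumptions, not the authors' design.

```python
# Minimal sketch (assumptions: 64 EEG channels, 1-second epochs at 250 Hz,
# 10 subjects). This is an illustrative CNN, not the architecture of the paper.
import torch
import torch.nn as nn

class EEGBiometricCNN(nn.Module):
    def __init__(self, n_channels=64, n_subjects=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),  # temporal filtering
            nn.BatchNorm1d(32), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.BatchNorm1d(64), nn.ReLU(), nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, n_subjects)               # 10-class output

    def forward(self, x):                 # x: (batch, channels, time)
        z = self.features(x).squeeze(-1)
        return self.classifier(z)

model = EEGBiometricCNN()
epochs = torch.randn(8, 64, 250)          # a dummy batch of resting-state epochs
logits = model(epochs)
print(logits.shape)                       # torch.Size([8, 10])
```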

  12. Engagement Assessment Using EEG Signals

    NASA Technical Reports Server (NTRS)

    Li, Feng; Li, Jiang; McKenzie, Frederic; Zhang, Guangfan; Wang, Wei; Pepe, Aaron; Xu, Roger; Schnell, Thomas; Anderson, Nick; Heitkamp, Dean

    2012-01-01

    In this paper, we present methods to analyze and improve an EEG-based engagement assessment approach, consisting of data preprocessing, feature extraction and engagement state classification. During data preprocessing, spikes, baseline drift and saturation caused by recording devices in EEG signals are identified and eliminated, and a wavelet-based method is utilized to remove ocular and muscular artifacts in the EEG recordings. In feature extraction, power spectral densities in 1 Hz bins are calculated as features, and these features are analyzed using the Fisher score and one-way ANOVA. In the classification step, a committee classifier is trained based on the extracted features to assess engagement status. Finally, experimental results showed that significant differences exist in the extracted features among different subjects, and we implemented a feature normalization procedure to mitigate the differences, which significantly improved the engagement assessment performance.
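    A minimal sketch of the pipeline the abstract describes: Welch power spectral densities in roughly 1 Hz bins as features, a per-subject normalization step, and a committee (voting) classifier. The sampling rate, frequency range and the particular member classifiers are assumptions for illustration.

```python
# Sketch of the general pipeline: 1 Hz-bin PSD features, per-subject
# normalization, and a committee (voting) classifier. Sampling rate, band
# limits and the member classifiers are illustrative assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

fs = 256                                   # assumed sampling rate (Hz)

def psd_features(epoch, fmax=40):
    """epoch: (n_channels, n_samples) -> log-PSD in ~1 Hz bins up to fmax."""
    f, pxx = welch(epoch, fs=fs, nperseg=fs)    # nperseg=fs gives ~1 Hz resolution
    keep = f <= fmax
    return np.log(pxx[:, keep]).ravel()

def normalize_per_subject(features):
    """Z-score features within one subject to mitigate inter-subject differences."""
    mu, sd = features.mean(axis=0), features.std(axis=0) + 1e-12
    return (features - mu) / sd

# Dummy data: 100 epochs, 8 channels, 4 s each, binary engagement labels.
rng = np.random.default_rng(0)
epochs = rng.normal(size=(100, 8, 4 * fs))
labels = rng.integers(0, 2, size=100)

X = normalize_per_subject(np.stack([psd_features(e) for e in epochs]))
committee = VotingClassifier([
    ("lr", LogisticRegression(max_iter=1000)),
    ("svm", SVC(probability=True)),
    ("rf", RandomForestClassifier(n_estimators=100)),
], voting="soft")
committee.fit(X, labels)
print(committee.score(X, labels))
```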

  13. Formalizing the Austrian Procedure Catalogue: A 4-step methodological analysis approach.

    PubMed

    Neururer, Sabrina Barbara; Lasierra, Nelia; Peiffer, Karl Peter; Fensel, Dieter

    2016-04-01

    Due to the lack of an internationally accepted and adopted standard for coding health interventions, Austria has established its own country-specific procedure classification system - the Austrian Procedure Catalogue (APC). Even though the APC is an elaborate coding standard for medical procedures, it has shortcomings that limit its usability. In order to enhance usability and usefulness, especially for research purposes and e-health applications, we developed an ontologized version of the APC. In this paper we present a novel four-step approach for the ontology engineering process, which enables accurate extraction of relevant concepts for medical ontologies from written text. The proposed approach for formalizing the APC consists of the following four steps: (1) comparative pre-analysis, (2) definition analysis, (3) typological analysis, and (4) ontology implementation. The first step contained a comparison of the APC to other well-established or elaborate health intervention coding systems in order to identify strengths and weaknesses of the APC. In the second step, a list of definitions of medical terminology used in the APC was obtained. This list of definitions was used as input for Step 3, in which we identified the most important concepts to describe medical procedures using the qualitative typological analysis approach. The definition analysis as well as the typological analysis are well-known and effective methods used in social sciences, but not commonly employed in the computer science or ontology engineering domain. Finally, this list of concepts was used in Step 4 to formalize the APC. The pre-analysis highlighted the major shortcomings of the APC, such as the lack of formal definition, leading to implicitly available, but not directly accessible information (hidden data), or the poor procedural type classification. After performing the definition and subsequent typological analyses, we were able to identify the following main characteristics of health interventions: (1) Procedural type, (2) Anatomical site, (3) Medical device, (4) Pathology, (5) Access, (6) Body system, (7) Population, (8) Aim, (9) Discipline, (10) Technique, and (11) Body Function. These main characteristics were taken as input of classes for the formalization of the APC. We were also able to identify relevant relations between classes. The proposed four-step approach for formalizing the APC provides a novel, systematically developed, strong framework to semantically enrich procedure classifications. Although this methodology was designed to address the particularities of the APC, the included methods are based on generic analysis tasks, and therefore can be re-used to provide a systematic representation of other procedure catalogs or classification systems and hence contribute towards a universal alignment of such representations, if desired. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Geometry-based ensembles: toward a structural characterization of the classification boundary.

    PubMed

    Pujol, Oriol; Masip, David

    2009-06-01

    This paper introduces a novel binary discriminative learning technique based on the approximation of the nonlinear decision boundary by a piecewise linear smooth additive model. The decision border is geometrically defined by means of the characterizing boundary points-points that belong to the optimal boundary under a certain notion of robustness. Based on these points, a set of locally robust linear classifiers is defined and assembled by means of a Tikhonov regularized optimization procedure in an additive model to create a final lambda-smooth decision rule. As a result, a very simple and robust classifier with a strong geometrical meaning and nonlinear behavior is obtained. The simplicity of the method allows its extension to cope with some of today's machine learning challenges, such as online learning, large-scale learning or parallelization, with linear computational complexity. We validate our approach on the UCI database, comparing with several state-of-the-art classification techniques. Finally, we apply our technique in online and large-scale scenarios and in six real-life computer vision and pattern recognition problems: gender recognition based on face images, intravascular ultrasound tissue classification, speed traffic sign detection, Chagas' disease myocardial damage severity detection, old musical scores clef classification, and action recognition using 3D accelerometer data from a wearable device. The results are promising and this paper opens a line of research that deserves further attention.

  15. Crop identification technology assessment for remote sensing (CITARS). Volume 6: Data processing at the laboratory for applications of remote sensing

    NASA Technical Reports Server (NTRS)

    Bauer, M. E.; Cary, T. K.; Davis, B. J.; Swain, P. H.

    1975-01-01

    The results of classifications and experiments for the crop identification technology assessment for remote sensing are summarized. Using two analysis procedures, 15 data sets were classified. One procedure used class weights while the other assumed equal probabilities of occurrence for all classes. Additionally, 20 data sets were classified using training statistics from another segment or date. The classification and proportion estimation results of the local and nonlocal classifications are reported. Data also describe several other experiments to provide additional understanding of the results of the crop identification technology assessment for remote sensing. These experiments investigated alternative analysis procedures, training set selection and size, effects of multitemporal registration, spectral discriminability of corn, soybeans, and other, and analyses of aircraft multispectral data.

  16. Rocket propulsion hazard summary: Safety classification, handling experience and application to space shuttle payload

    NASA Technical Reports Server (NTRS)

    Pennington, D. F.; Man, T.; Persons, B.

    1977-01-01

    The DOT classification for transportation, the military classification for quantity distance, and hazard compatibility grouping used to regulate the transportation and storage of explosives are presented along with a discussion of tests used in determining sensitivity of propellants to an impact/shock environment in the absence of a large explosive donor. The safety procedures and requirements of a Scout launch vehicle, Western and Eastern Test Range, and the Minuteman, Delta, and Poseidon programs are reviewed and summarized. Requirements of the space transportation system safety program include safety reviews from the subsystem level to the completed payload. The Scout safety procedures will satisfy a portion of these requirements but additional procedures need to be implemented to comply with the safety requirements for Shuttle operation from the Eastern Test Range.

  17. Object-based analysis of multispectral airborne laser scanner data for land cover classification and map updating

    NASA Astrophysics Data System (ADS)

    Matikainen, Leena; Karila, Kirsi; Hyyppä, Juha; Litkey, Paula; Puttonen, Eetu; Ahokas, Eero

    2017-06-01

    During the last 20 years, airborne laser scanning (ALS), often combined with passive multispectral information from aerial images, has shown its high feasibility for automated mapping processes. The main benefits have been achieved in the mapping of elevated objects such as buildings and trees. Recently, the first multispectral airborne laser scanners have been launched, and active multispectral information is for the first time available for 3D ALS point clouds from a single sensor. This article discusses the potential of this new technology in map updating, especially in automated object-based land cover classification and change detection in a suburban area. For our study, Optech Titan multispectral ALS data over a suburban area in Finland were acquired. Results from an object-based random forests analysis suggest that the multispectral ALS data are very useful for land cover classification, considering both elevated classes and ground-level classes. The overall accuracy of the land cover classification results with six classes was 96% compared with validation points. The classes under study included building, tree, asphalt, gravel, rocky area and low vegetation. Compared to classification of single-channel data, the main improvements were achieved for ground-level classes. According to feature importance analyses, multispectral intensity features based on several channels were more useful than those based on one channel. Automatic change detection for buildings and roads was also demonstrated by utilising the new multispectral ALS data in combination with old map vectors. In change detection of buildings, an old digital surface model (DSM) based on single-channel ALS data was also used. Overall, our analyses suggest that the new data have high potential for further increasing the automation level in mapping. Unlike passive aerial imaging commonly used in mapping, the multispectral ALS technology is independent of external illumination conditions, and there are no shadows on intensity images produced from the data. These are significant advantages in developing automated classification and change detection procedures.
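    A minimal sketch of the object-based random forests step on segment-level features, assuming a simple attribute table per segment (mean height and per-channel intensity); the feature names are placeholders and the six classes follow the abstract.

```python
# Sketch of the object-based random forest step: each row is one image segment
# described by assumed features (mean height above ground and mean intensity
# per laser channel); the six class names follow the abstract.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

classes = ["building", "tree", "asphalt", "gravel", "rocky area", "low vegetation"]
rng = np.random.default_rng(1)

# Dummy segment table: mean height + mean intensity for three spectral channels.
segments = pd.DataFrame({
    "mean_height": rng.gamma(2.0, 2.0, 600),
    "intensity_ch1": rng.normal(size=600),
    "intensity_ch2": rng.normal(size=600),
    "intensity_ch3": rng.normal(size=600),
})
labels = rng.choice(classes, size=600)

rf = RandomForestClassifier(n_estimators=300, random_state=0)
scores = cross_val_score(rf, segments, labels, cv=5)
print(f"cross-validated overall accuracy: {scores.mean():.2f}")
```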

  18. 23 CFR 470.105 - Urban area boundaries and highway functional classification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... classification. 470.105 Section 470.105 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION... criteria and procedures are provided in the FHWA publication “Highway Functional Classification—Concepts... functional classification shall be mapped and submitted to the Federal Highway Administration (FHWA) for...

  19. 32 CFR 2001.11 - Original classification authority.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... classification authority. Agencies not possessing such authority shall forward requests to the Director of ISOO... authority. The Director of ISOO shall forward the request, along with the Director's recommendation, to the... level of original classification authority shall forward requests in accordance with the procedures of...

  20. Skin lesion computational diagnosis of dermoscopic images: Ensemble models based on input feature manipulation.

    PubMed

    Oliveira, Roberta B; Pereira, Aledir S; Tavares, João Manuel R S

    2017-10-01

    The number of deaths worldwide due to melanoma has risen in recent times, in part because melanoma is the most aggressive type of skin cancer. Computational systems have been developed to assist dermatologists in early diagnosis of skin cancer, or even to monitor skin lesions. However, there still remains a challenge to improve classifiers for the diagnosis of such skin lesions. The main objective of this article is to evaluate different ensemble classification models based on input feature manipulation to diagnose skin lesions. Input feature manipulation processes are based on feature subset selections from shape properties, colour variation and texture analysis to generate diversity for the ensemble models. Three subset selection models are presented here: (1) a subset selection model based on specific feature groups, (2) a correlation-based subset selection model, and (3) a subset selection model based on feature selection algorithms. Each ensemble classification model is generated using an optimum-path forest classifier and integrated with a majority voting strategy. The proposed models were applied on a set of 1104 dermoscopic images using a cross-validation procedure. The best results were obtained by the first ensemble classification model that generates a feature subset ensemble based on specific feature groups. The skin lesion diagnosis computational system achieved 94.3% accuracy, 91.8% sensitivity and 96.7% specificity. The input feature manipulation process based on specific feature subsets generated the greatest diversity for the ensemble classification model with very promising results. Copyright © 2017 Elsevier B.V. All rights reserved.
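    A minimal sketch of the first ensemble model described above: one classifier per feature group (shape, colour, texture) combined by majority voting. The paper uses optimum-path forest classifiers; a k-nearest-neighbour classifier is substituted here purely for availability, and the feature-group column ranges are assumptions.

```python
# Sketch of ensemble model (1) from the abstract: one classifier per feature
# group (shape, colour, texture) combined by majority voting. The paper uses
# optimum-path forest classifiers; k-NN stands in here only for illustration,
# and the feature-group column indices are assumptions.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 30))                        # 300 lesions, 30 features
y = rng.integers(0, 2, size=300)                      # 0 = benign, 1 = malignant
groups = {"shape": slice(0, 10), "colour": slice(10, 20), "texture": slice(20, 30)}

members = {}
for name, cols in groups.items():                     # train one member per feature subset
    members[name] = KNeighborsClassifier().fit(X[:, cols], y)

def ensemble_predict(X_new):
    votes = np.stack([clf.predict(X_new[:, groups[name]])
                      for name, clf in members.items()])
    return (votes.sum(axis=0) >= 2).astype(int)       # majority of the three binary votes

print(ensemble_predict(X[:5]))
```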

  1. Random forests for classification in ecology

    USGS Publications Warehouse

    Cutler, D.R.; Edwards, T.C.; Beard, K.H.; Cutler, A.; Hess, K.T.; Gibson, J.; Lawler, J.J.

    2007-01-01

    Classification procedures are some of the most widely used statistical methods in ecology. Random forests (RF) is a new and powerful statistical classifier that is well established in other disciplines but is relatively unknown in ecology. Advantages of RF compared to other statistical classifiers include (1) very high classification accuracy; (2) a novel method of determining variable importance; (3) ability to model complex interactions among predictor variables; (4) flexibility to perform several types of statistical data analysis, including regression, classification, survival analysis, and unsupervised learning; and (5) an algorithm for imputing missing values. We compared the accuracies of RF and four other commonly used statistical classifiers using data on invasive plant species presence in Lava Beds National Monument, California, USA, rare lichen species presence in the Pacific Northwest, USA, and nest sites for cavity nesting birds in the Uinta Mountains, Utah, USA. We observed high classification accuracy in all applications as measured by cross-validation and, in the case of the lichen data, by independent test data, when comparing RF to other common classification methods. We also observed that the variables that RF identified as most important for classifying invasive plant species coincided with expectations based on the literature. © 2007 by the Ecological Society of America.
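    A minimal sketch of the two RF capabilities the abstract highlights, cross-validated accuracy and variable importance, for a presence/absence problem; the predictor names and data are placeholders, not the study's.

```python
# Sketch of the two random-forest features the abstract highlights:
# cross-validated classification accuracy and variable importance, for a
# presence/absence problem. Predictor names are placeholders, not the study's.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
predictors = pd.DataFrame({
    "elevation": rng.normal(1500, 300, 400),
    "canopy_cover": rng.uniform(0, 1, 400),
    "distance_to_water": rng.exponential(200, 400),
})
presence = (predictors["canopy_cover"] + rng.normal(0, 0.2, 400) > 0.5).astype(int)

rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
print("CV accuracy:", cross_val_score(rf, predictors, presence, cv=5).mean())

rf.fit(predictors, presence)
print("OOB accuracy:", rf.oob_score_)
for name, imp in zip(predictors.columns, rf.feature_importances_):
    print(f"{name}: importance = {imp:.2f}")
```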

  2. Design of partially supervised classifiers for multispectral image data

    NASA Technical Reports Server (NTRS)

    Jeon, Byeungwoo; Landgrebe, David

    1993-01-01

    A partially supervised classification problem is addressed, especially when the class definition and corresponding training samples are provided a priori only for just one particular class. In practical applications of pattern classification techniques, a frequently observed characteristic is the heavy, often nearly impossible requirements on representative prior statistical class characteristics of all classes in a given data set. Considering the effort in both time and man-power required to have a well-defined, exhaustive list of classes with a corresponding representative set of training samples, this 'partially' supervised capability would be very desirable, assuming adequate classifier performance can be obtained. Two different classification algorithms are developed to achieve simplicity in classifier design by reducing the requirement of prior statistical information without sacrificing significant classifying capability. The first one is based on optimal significance testing, where the optimal acceptance probability is estimated directly from the data set. In the second approach, the partially supervised classification is considered as a problem of unsupervised clustering with initially one known cluster or class. A weighted unsupervised clustering procedure is developed to automatically define other classes and estimate their class statistics. The operational simplicity thus realized should make these partially supervised classification schemes very viable tools in pattern classification.
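    The two algorithms developed in the paper (optimal significance testing and weighted unsupervised clustering seeded with the known class) are not reproduced here. The sketch below only illustrates the "one labelled class" setting itself, using a one-class SVM as a stand-in decision rule learned from positive-class pixels alone; all data are simulated.

```python
# Illustration of the "only one class has training samples" setting. The paper's
# two algorithms (optimal significance testing; weighted clustering seeded with
# the known class) are not reproduced; a one-class SVM stands in to show how a
# decision rule can be learned from positive-class pixels alone.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
known_class_pixels = rng.normal(loc=[2.0, 2.0], scale=0.3, size=(200, 2))   # training: one class only
mixed_scene_pixels = np.vstack([
    rng.normal(loc=[2.0, 2.0], scale=0.3, size=(50, 2)),   # same class
    rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2)),   # everything else
])

detector = OneClassSVM(nu=0.05, gamma="scale").fit(known_class_pixels)
labels = detector.predict(mixed_scene_pixels)               # +1 = known class, -1 = other
print("fraction assigned to the known class:", (labels == 1).mean())
```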

  3. Statistical significance of task related deep brain EEG dynamic changes in the time-frequency domain.

    PubMed

    Chládek, J; Brázdil, M; Halámek, J; Plešinger, F; Jurák, P

    2013-01-01

    We present an off-line analysis procedure for exploring brain activity recorded from intra-cerebral electroencephalographic data (SEEG). The objective is to determine the statistical differences between different types of stimulations in the time-frequency domain. The procedure is based on computing relative signal power change and subsequent statistical analysis. An example of characteristic statistically significant event-related de/synchronization (ERD/ERS) detected across different frequency bands following different oddball stimuli is presented. The method is used for off-line functional classification of different brain areas.

  4. Brain-computer interfacing under distraction: an evaluation study

    NASA Astrophysics Data System (ADS)

    Brandl, Stephanie; Frølich, Laura; Höhne, Johannes; Müller, Klaus-Robert; Samek, Wojciech

    2016-10-01

    Objective. While motor-imagery based brain-computer interfaces (BCIs) have been studied over many years by now, most of these studies have taken place in controlled lab settings. Bringing BCI technology into everyday life is still one of the main challenges in this field of research. Approach. This paper systematically investigates BCI performance under 6 types of distractions that mimic out-of-lab environments. Main results. We report results of 16 participants and show that the performance of the standard common spatial patterns (CSP) + regularized linear discriminant analysis classification pipeline drops significantly in this ‘simulated’ out-of-lab setting. We then investigate three methods for improving the performance: (1) artifact removal, (2) ensemble classification, and (3) a 2-step classification approach. While artifact removal does not enhance the BCI performance significantly, both ensemble classification and the 2-step classification combined with CSP significantly improve the performance compared to the standard procedure. Significance. Systematically analyzing out-of-lab scenarios is crucial when bringing BCI into everyday life. Algorithms must be adapted to overcome nonstationary environments in order to tackle real-world challenges.
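    A minimal sketch of the standard CSP + LDA pipeline the abstract takes as its baseline. CSP is written out directly as a generalized eigendecomposition of the two class covariance matrices; the number of spatial filters and the dummy data dimensions are assumptions.

```python
# Sketch of the standard CSP + LDA pipeline referred to in the abstract. CSP is
# implemented as a generalized eigendecomposition of the two class covariance
# matrices; the number of spatial filters (6) and data sizes are assumptions.
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def class_covariance(epochs):
    """epochs: (n_trials, n_channels, n_samples) -> average spatial covariance."""
    covs = [e @ e.T / np.trace(e @ e.T) for e in epochs]
    return np.mean(covs, axis=0)

def csp_filters(epochs_a, epochs_b, n_filters=6):
    ca, cb = class_covariance(epochs_a), class_covariance(epochs_b)
    eigvals, eigvecs = eigh(ca, ca + cb)              # generalized eigenproblem
    order = np.argsort(eigvals)                       # extreme eigenvalues discriminate best
    pick = np.r_[order[: n_filters // 2], order[-n_filters // 2:]]
    return eigvecs[:, pick].T                         # (n_filters, n_channels)

def log_variance_features(epochs, W):
    projected = np.einsum("fc,tcs->tfs", W, epochs)
    return np.log(projected.var(axis=2))

# Dummy motor-imagery data: 40 left-hand and 40 right-hand trials, 22 channels.
rng = np.random.default_rng(0)
left, right = rng.normal(size=(40, 22, 500)), rng.normal(size=(40, 22, 500))

W = csp_filters(left, right)
X = np.vstack([log_variance_features(left, W), log_variance_features(right, W)])
y = np.r_[np.zeros(40), np.ones(40)]
print(LinearDiscriminantAnalysis().fit(X, y).score(X, y))
```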

  5. A real-time classification algorithm for EEG-based BCI driven by self-induced emotions.

    PubMed

    Iacoviello, Daniela; Petracca, Andrea; Spezialetti, Matteo; Placidi, Giuseppe

    2015-12-01

    The aim of this paper is to provide an efficient, parametric, general, and completely automatic real-time classification method for electroencephalography (EEG) signals obtained from self-induced emotions. The particular characteristics of the considered low-amplitude signals (a self-induced emotion produces a signal whose amplitude is about 15% of a really experienced emotion) require exploring and adapting strategies like the Wavelet Transform, Principal Component Analysis (PCA) and the Support Vector Machine (SVM) for signal processing, analysis and classification. Moreover, the method is intended to be used in a multi-emotion based Brain Computer Interface (BCI) and, for this reason, specific ad hoc precautions are adopted. The peculiarity of the brain activation requires ad-hoc signal processing by wavelet decomposition, and the definition of a set of features for signal characterization in order to discriminate different self-induced emotions. The proposed method is a two-stage algorithm, completely parameterized, aiming at multi-class classification, and may be considered in the framework of machine learning. The first stage, the calibration, is off-line and is devoted to signal processing, the determination of the features and the training of a classifier. The second stage, the real-time one, is the test on new data. The PCA theory is applied to avoid redundancy in the set of features, whereas the classification of the selected features, and therefore of the signals, is obtained by the SVM. Some experimental tests have been conducted on EEG signals proposing a binary BCI, based on the self-induced disgust produced by remembering an unpleasant odor. Since it has been shown in the literature that this emotion mainly involves the right hemisphere and in particular the T8 channel, the classification procedure is tested using just T8, though the average accuracy is also calculated and reported for the whole set of measured channels. The obtained classification results are encouraging, with a percentage of success that is, on average over the whole set of examined subjects, above 90%. Ongoing work is the application of the proposed procedure to map a large set of emotions with EEG and to establish the EEG headset with the minimal number of channels needed to allow the recognition of a significant range of emotions, both in the field of affective computing and in the development of auxiliary communication tools for subjects affected by severe disabilities. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
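    A minimal sketch of the calibration chain outlined above: wavelet decomposition of the single T8 channel, PCA to remove feature redundancy, and an SVM classifier. The wavelet family, decomposition level and the summary statistics used as features are illustrative assumptions, not the paper's exact parameterization.

```python
# Sketch of the calibration chain outlined in the abstract: wavelet
# decomposition of the single T8 channel, PCA to remove feature redundancy,
# and an SVM classifier. Wavelet family, decomposition level and the summary
# statistics used as features are assumptions for illustration.
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def wavelet_features(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    feats = []
    for c in coeffs:                                   # one sub-band per level
        feats += [np.mean(np.abs(c)), np.std(c), np.sum(c ** 2)]
    return np.array(feats)

# Dummy data: 60 single-channel (T8) epochs, binary self-induced-disgust labels.
rng = np.random.default_rng(0)
epochs_t8 = rng.normal(size=(60, 512))
labels = rng.integers(0, 2, size=60)

X = np.stack([wavelet_features(e) for e in epochs_t8])
clf = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel="rbf"))
clf.fit(X, labels)
print(clf.score(X, labels))
```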

  6. 76 FR 47531 - Approval of Classification Societies

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-05

    ... proposed rulemaking (NPRM) proposing application procedures and performance standards that classification... exempt from Coast Guard approval prior to working in the United States....

  7. Conflicts in wound classification of neonatal operations.

    PubMed

    Vu, Lan T; Nobuhara, Kerilyn K; Lee, Hanmin; Farmer, Diana L

    2009-06-01

    This study sought to determine the reliability of wound classification guidelines when applied to neonatal operations. This study is a cross-sectional web-based survey of pediatric surgeons. From a random sample of 22 neonatal operations, participants classified each operation as "clean," "clean-contaminated," "contaminated," or "dirty or infected," and specified duration of perioperative antibiotics as "none," "single preoperative," "24 hours," or ">24 hours." Unweighted kappa score was calculated to estimate interrater reliability. Overall interrater reliability for wound classification was poor (kappa = 0.30). The following operations were classified as clean: pyloromyotomy, resection of sequestration, resection of sacrococcygeal teratoma, oophorectomy, and immediate repair of omphalocele; as clean-contaminated: Ladd procedure, bowel resection for midgut volvulus and meconium peritonitis, fistula ligation of tracheoesophageal fistula, primary esophageal anastomosis of esophageal atresia, thoracic lobectomy, staged closure of gastroschisis, delayed repair and primary closure of omphalocele, perineal anoplasty and diverting colostomy for imperforate anus, anal pull-through for Hirschsprung disease, and colostomy closure; and as dirty: perforated necrotizing enterocolitis. There is poor consensus on how neonatal operations are classified based on contamination. An improved classification system will provide more accurate risk assessment for development of surgical site infections and identify neonates who would benefit from antibiotic prophylaxis.

  8. Land use classification using texture information in ERTS-A MSS imagery

    NASA Technical Reports Server (NTRS)

    Haralick, R. M. (Principal Investigator); Shanmugam, K. S.; Bosley, R.

    1973-01-01

    The author has identified the following significant results. Preliminary digital analysis of ERTS-1 MSS imagery reveals that the textural features of the imagery are very useful for land use classification. A procedure for extracting the textural features of ERTS-1 imagery is presented and the results of a land use classification scheme based on the textural features are also presented. The land use classification algorithm using textural features was tested on a 5100 square mile area covered by part of an ERTS-1 MSS band 5 image over the California coastline. The image covering this area was blocked into 648 subimages of size 8.9 square miles each. Based on a color composite of the image set, a total of 7 land use categories were identified. These land use categories are: coastal forest, woodlands, annual grasslands, urban areas, large irrigated fields, small irrigated fields, and water. The automatic classifier was trained to identify the land use categories using only the textural characteristics of the subimages; 75 percent of the subimages were assigned correct identifications. Since texture and spectral features provide completely different kinds of information, a significant increase in identification accuracy will take place when both features are used together.
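    In the same spirit, texture features are nowadays commonly derived from grey-level co-occurrence matrices. The sketch below uses scikit-image's graycomatrix/graycoprops (the function names in recent versions) on a dummy sub-image; the block size, grey-level count and chosen properties are assumptions, and this is not the exact 1973 procedure.

```python
# Sketch of texture-feature extraction in the spirit of the abstract, using
# grey-level co-occurrence matrices (Haralick-type features) from scikit-image.
# Block size, grey levels and the four properties are illustrative choices.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def texture_features(block, levels=32):
    """block: 2D uint8 sub-image, requantized to `levels` grey levels."""
    scaled = (block.astype(float) / 256 * levels).astype(np.uint8)
    glcm = graycomatrix(scaled, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.array([graycoprops(glcm, p).mean() for p in props])

# Dummy 8-bit sub-image standing in for one block of an MSS band-5 scene.
rng = np.random.default_rng(0)
subimage = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
print(texture_features(subimage))
```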

  9. Neuromuscular disease classification system

    NASA Astrophysics Data System (ADS)

    Sáez, Aurora; Acha, Begoña; Montero-Sánchez, Adoración; Rivas, Eloy; Escudero, Luis M.; Serrano, Carmen

    2013-06-01

    Diagnosis of neuromuscular diseases is based on subjective visual assessment of biopsies from patients by the pathologist specialist. A system for objective analysis and classification of muscular dystrophies and neurogenic atrophies through muscle biopsy images of fluorescence microscopy is presented. The procedure starts with an accurate segmentation of the muscle fibers using mathematical morphology and a watershed transform. A feature extraction step is carried out in two parts: 24 features that pathologists take into account to diagnose the diseases and 58 structural features that the human eye cannot see, based on the assumption that the biopsy is considered as a graph, where the nodes are represented by each fiber, and two nodes are connected if two fibers are adjacent. A feature selection using sequential forward selection and sequential backward selection methods, a classification using a Fuzzy ARTMAP neural network, and a study of grading the severity are performed on these two sets of features. A database consisting of 91 images was used: 71 images for the training step and 20 as the test. A classification error of 0% was obtained. It is concluded that the addition of features undetectable by the human visual inspection improves the categorization of atrophic patterns.

  10. An extension of the receiver operating characteristic curve and AUC-optimal classification.

    PubMed

    Takenouchi, Takashi; Komori, Osamu; Eguchi, Shinto

    2012-10-01

    While most proposed methods for solving classification problems focus on minimization of the classification error rate, we are interested in the receiver operating characteristic (ROC) curve, which provides more information about classification performance than the error rate does. The area under the ROC curve (AUC) is a natural measure for overall assessment of a classifier based on the ROC curve. We discuss a class of concave functions for AUC maximization in which a boosting-type algorithm including RankBoost is considered, and the Bayesian risk consistency and the lower bound of the optimum function are discussed. A procedure derived by maximizing a specific optimum function has high robustness, based on gross error sensitivity. Additionally, we focus on the partial AUC, which is the partial area under the ROC curve. For example, in medical screening, a high true-positive rate to the fixed lower false-positive rate is preferable and thus the partial AUC corresponding to lower false-positive rates is much more important than the remaining AUC. We extend the class of concave optimum functions for partial AUC optimality with the boosting algorithm. We investigated the validity of the proposed method through several experiments with data sets in the UCI repository.
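    A minimal sketch of the two quantities discussed above, the full AUC and the partial AUC restricted to low false-positive rates, evaluated with scikit-learn on a toy classifier; the boosting-type maximization itself is not reproduced.

```python
# Sketch of the two quantities the paper optimizes: the AUC and the partial AUC
# restricted to low false-positive rates (as in medical screening). Only the
# evaluation is shown; the boosting-type maximization itself is not reproduced.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = (X[:, 0] + 0.5 * rng.normal(size=500) > 0).astype(int)

scores = LogisticRegression().fit(X, y).decision_function(X)

auc = roc_auc_score(y, scores)
partial_auc = roc_auc_score(y, scores, max_fpr=0.1)   # standardized partial AUC, FPR <= 0.1
print(f"AUC = {auc:.3f}, partial AUC (FPR <= 0.1) = {partial_auc:.3f}")
```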

  11. Antibacterial Activity of Imidazolium-Based Ionic Liquids Investigated by QSAR Modeling and Experimental Studies.

    PubMed

    Hodyna, Diana; Kovalishyn, Vasyl; Rogalsky, Sergiy; Blagodatnyi, Volodymyr; Petko, Kirill; Metelytsia, Larisa

    2016-09-01

    Predictive QSAR models for the inhibitors of B. subtilis and Ps. aeruginosa among imidazolium-based ionic liquids were developed using literature data. The regression QSAR models were created through Artificial Neural Network and k-nearest neighbor procedures. The classification QSAR models were constructed using the WEKA-RF (random forest) method. The predictive ability of the models was tested by fivefold cross-validation, giving q(2) = 0.77-0.92 for regression models and an accuracy of 83-88% for classification models. Twenty synthesized samples of 1,3-dialkylimidazolium ionic liquids with predicted activity levels of antimicrobial potential were evaluated. For all asymmetric 1,3-dialkylimidazolium ionic liquids, only compounds containing at least one radical with an alkyl chain length of 12 carbon atoms showed high antibacterial activity. However, the activity of symmetric 1,3-dialkylimidazolium salts was found to have the opposite relationship with the length of the aliphatic radical, being maximum for compounds based on the 1,3-dioctylimidazolium cation. The obtained experimental results suggested that the application of classification QSAR models is more accurate for the prediction of the activity of new imidazolium-based ILs as potential antibacterials. © 2016 John Wiley & Sons A/S.

  12. Identifying complications of interventional procedures from UK routine healthcare databases: a systematic search for methods using clinical codes.

    PubMed

    Keltie, Kim; Cole, Helen; Arber, Mick; Patrick, Hannah; Powell, John; Campbell, Bruce; Sims, Andrew

    2014-11-28

    Several authors have developed and applied methods to routine data sets to identify the nature and rate of complications following interventional procedures. But, to date, there has been no systematic search for such methods. The objective of this article was to find, classify and appraise published methods, based on analysis of clinical codes, which used routine healthcare databases in a United Kingdom setting to identify complications resulting from interventional procedures. A literature search strategy was developed to identify published studies that referred, in the title or abstract, to the name or acronym of a known routine healthcare database and to complications from procedures or devices. The following data sources were searched in February and March 2013: Cochrane Methods Register, Conference Proceedings Citation Index - Science, Econlit, EMBASE, Health Management Information Consortium, Health Technology Assessment database, MathSciNet, MEDLINE, MEDLINE in-process, OAIster, OpenGrey, Science Citation Index Expanded and ScienceDirect. Of the eligible papers, those which reported methods using clinical coding were classified and summarised in tabular form using the following headings: routine healthcare database; medical speciality; method for identifying complications; length of follow-up; method of recording comorbidity. The benefits and limitations of each approach were assessed. From 3688 papers identified from the literature search, 44 reported the use of clinical codes to identify complications, from which four distinct methods were identified: 1) searching the index admission for specified clinical codes, 2) searching a sequence of admissions for specified clinical codes, 3) searching for specified clinical codes for complications from procedures and devices within the International Classification of Diseases 10th revision (ICD-10) coding scheme which is the methodology recommended by NHS Classification Service, and 4) conducting manual clinical review of diagnostic and procedure codes. The four distinct methods identifying complication from codified data offer great potential in generating new evidence on the quality and safety of new procedures using routine data. However the most robust method, using the methodology recommended by the NHS Classification Service, was the least frequently used, highlighting that much valuable observational data is being ignored.
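    A minimal sketch of method (2) from the review, searching a patient's sequence of admissions for specified clinical codes within a follow-up window after the index procedure, using pandas. The column names, the example ICD-10 code and the 90-day window are hypothetical and do not reflect any specific routine database schema.

```python
# Sketch of method (2) from the review: searching a patient's sequence of
# admissions for specified clinical codes within a follow-up window after the
# index procedure. Column names, the example ICD-10 code and the 90-day window
# are hypothetical; they do not come from any specific routine database schema.
import pandas as pd

admissions = pd.DataFrame({
    "patient_id": [1, 1, 2, 2, 3],
    "admission_date": pd.to_datetime(
        ["2013-01-05", "2013-02-10", "2013-01-20", "2013-09-01", "2013-03-15"]),
    "diagnosis_codes": [["K40"], ["T81.4"], ["K40"], ["T81.4"], ["K40"]],
})
index_dates = admissions.groupby("patient_id")["admission_date"].min().rename("index_date")
complication_codes = {"T81.4"}            # hypothetical: complication following a procedure
follow_up = pd.Timedelta(days=90)

merged = admissions.join(index_dates, on="patient_id")
in_window = (merged["admission_date"] > merged["index_date"]) & \
            (merged["admission_date"] <= merged["index_date"] + follow_up)
has_code = merged["diagnosis_codes"].apply(lambda codes: bool(complication_codes & set(codes)))

complicated_patients = merged.loc[in_window & has_code, "patient_id"].unique()
print(complicated_patients)               # patients with a coded complication within 90 days
```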

  13. A laboratory procedure for measuring and georeferencing soil colour

    NASA Astrophysics Data System (ADS)

    Marques-Mateu, A.; Balaguer-Puig, M.; Moreno-Ramon, H.; Ibanez-Asensio, S.

    2015-04-01

    Remote sensing and geospatial applications very often require ground truth data to assess outcomes from spatial analyses or environmental models. Those data sets, however, may be difficult to collect in proper format or may even be unavailable. In the particular case of soil colour the collection of reliable ground data can be cumbersome due to measuring methods, colour communication issues, and other practical factors which lead to a lack of standard procedure for soil colour measurement and georeferencing. In this paper we present a laboratory procedure that provides colour coordinates of georeferenced soil samples which become useful in later processing stages of soil mapping and classification from digital images. The procedure requires a laboratory setup consisting of a light booth and a trichromatic colorimeter, together with a computer program that performs colour measurement, storage, and colour space transformation tasks. Measurement tasks are automated by means of specific data logging routines which allow storing recorded colour data in a spatial format. A key feature of the system is the ability of transforming between physically-based colour spaces and the Munsell system which is still the standard in soil science. The working scheme pursues the automation of routine tasks whenever possible and the avoidance of input mistakes by means of a convenient layout of the user interface. The program can readily manage colour and coordinate data sets which eventually allow creating spatial data sets. All the tasks regarding data joining between colorimeter measurements and samples locations are executed by the software in the background, allowing users to concentrate on samples processing. As a result, we obtained a robust and fully functional computer-based procedure which has proven a very useful tool for sample classification or cataloging purposes as well as for integrating soil colour data with other remote sensed and spatial data sets.

  14. Key Issues in the Analysis of Remote Sensing Data: A report on the workshop

    NASA Technical Reports Server (NTRS)

    Swain, P. H. (Principal Investigator)

    1981-01-01

    The proceedings of a workshop assessing the state of the art in machine analysis of remotely sensed data are summarized. Areas discussed were: data bases, image registration, image preprocessing operations, map oriented considerations, advanced digital systems, artificial intelligence methods, image classification, and improved classifier training. Recommendations of areas for further research are presented.

  15. Effect of Content Knowledge on Angoff-Style Standard Setting Judgments

    ERIC Educational Resources Information Center

    Margolis, Melissa J.; Mee, Janet; Clauser, Brian E.; Winward, Marcia; Clauser, Jerome C.

    2016-01-01

    Evidence to support the credibility of standard setting procedures is a critical part of the validity argument for decisions made based on tests that are used for classification. One area in which there has been limited empirical study is the impact of standard setting judge selection on the resulting cut score. One important issue related to…

  16. Delinquency Level Classification Via the HEW Community Program Youth Impact Scales.

    ERIC Educational Resources Information Center

    Truckenmiller, James L.

    The former HEW National Strategy for Youth Development (NSYD) model was created as a community-based planning and procedural tool to promote youth development and prevent delinquency. To assess the predictive power of NSYD Impact Scales in classifying youths into low, medium, and high delinquency levels, male and female students aged 10-19 years…

  17. A neural network for noise correlation classification

    NASA Astrophysics Data System (ADS)

    Paitz, Patrick; Gokhberg, Alexey; Fichtner, Andreas

    2018-02-01

    We present an artificial neural network (ANN) for the classification of ambient seismic noise correlations into two categories, suitable and unsuitable for noise tomography. By using only a small manually classified data subset for network training, the ANN allows us to classify large data volumes with low human effort and to encode the valuable subjective experience of data analysts that cannot be captured by a deterministic algorithm. Based on a new feature extraction procedure that exploits the wavelet-like nature of seismic time-series, we efficiently reduce the dimensionality of noise correlation data while keeping the relevant features needed for automated classification. Using global- and regional-scale data sets, we show that classification errors of 20 per cent or less can be achieved when the network training is performed with as little as 3.5 per cent and 16 per cent of the data sets, respectively. Furthermore, the ANN trained on the regional data can be applied to the global data, and vice versa, without a significant increase of the classification error. An experiment in which four students manually classified the data revealed that the classification error they would assign to each other is substantially larger than the classification error of the ANN (>35 per cent). This indicates that reproducibility would be hampered more by human subjectivity than by imperfections of the ANN.

  18. Direct costs of emergency medical care: a diagnosis-based case-mix classification system.

    PubMed

    Baraff, L J; Cameron, J M; Sekhon, R

    1991-01-01

    To develop a diagnosis-based case mix classification system for emergency department patient visits based on direct costs of care designed for an outpatient setting. Prospective provider time study with collection of financial data from each hospital's accounts receivable system and medical information, including discharge diagnosis, from hospital medical records. Three community hospital EDs in Los Angeles County during selected times in 1984. Only direct costs of care were included: health care provider time, ED management and clerical personnel excluding registration, nonlabor ED expense including supplies, and ancillary hospital services. Indirect costs for hospitals and physicians, including depreciation and amortization, debt service, utilities, malpractice insurance, administration, billing, registration, and medical records were not included. Costs were derived by valuing provider time based on a formula using annual income or salary and fringe benefits, productivity and direct care factors, and using hospital direct cost to charge ratios. Physician costs were based on a national study of emergency physician income and excluded practice costs. Patients were classified into one of 216 emergency department groups (EDGs) on the basis of the discharge diagnosis, patient disposition, age, and the presence of a limited number of physician procedures. Total mean direct costs ranged from $23 for follow-up visit to $936 for trauma, admitted, with critical care procedure. The mean total direct costs for the 16,771 nonadmitted patients was $69. Of this, 34% was for ED costs, 45% was for ancillary service costs, and 21% was for physician costs. The mean total direct costs for the 1,955 admitted patients was $259. Of this, 23% was for ED costs, 63% was for ancillary service costs, and 14% was for physician costs. Laboratory and radiographic services accounted for approximately 85% of all ancillary service costs and 38% of total direct costs for nonadmitted patients versus 80% of ancillary service costs and 51% of total direct costs for admitted patients. We have developed a diagnosis-based case mix classification system for ED patient visits based on direct costs of care designed for an outpatient setting which, unlike diagnosis-related groups, includes the measurement of time-based cost for physician and nonphysician services. This classification system helps to define direct costs of hospital and physician emergency services by type of patient.

  19. The First AO Classification System for Fractures of the Craniomaxillofacial Skeleton: Rationale, Methodological Background, Developmental Process, and Objectives

    PubMed Central

    Audigé, Laurent; Cornelius, Carl-Peter; Di Ieva, Antonio; Prein, Joachim

    2014-01-01

    Validated trauma classification systems are the sole means to provide the basis for reliable documentation and evaluation of patient care, which will open the gateway to evidence-based procedures and healthcare in the coming years. With the support of AO Investigation and Documentation, a classification group was established to develop and evaluate a comprehensive classification system for craniomaxillofacial (CMF) fractures. Blueprints for fracture classification in the major constituents of the human skull were drafted and then evaluated by a multispecialty group of experienced CMF surgeons and a radiologist in a structured process during iterative agreement sessions. At each session, surgeons independently classified the radiological imaging of up to 150 consecutive cases with CMF fractures. During subsequent review meetings, all discrepancies in the classification outcome were critically appraised for clarification and improvement until consensus was reached. The resulting CMF classification system is structured in a hierarchical fashion with three levels of increasing complexity. The most elementary level 1 simply distinguishes four fracture locations within the skull: mandible (code 91), midface (code 92), skull base (code 93), and cranial vault (code 94). Levels 2 and 3 focus on further defining the fracture locations and for fracture morphology, achieving an almost individual mapping of the fracture pattern. This introductory article describes the rationale for the comprehensive AO CMF classification system, discusses the methodological framework, and provides insight into the experiences and interactions during the evaluation process within the core groups. The details of this system in terms of anatomy and levels are presented in a series of focused tutorials illustrated with case examples in this special issue of the Journal. PMID:25489387

  20. The First AO Classification System for Fractures of the Craniomaxillofacial Skeleton: Rationale, Methodological Background, Developmental Process, and Objectives.

    PubMed

    Audigé, Laurent; Cornelius, Carl-Peter; Di Ieva, Antonio; Prein, Joachim

    2014-12-01

    Validated trauma classification systems are the sole means to provide the basis for reliable documentation and evaluation of patient care, which will open the gateway to evidence-based procedures and healthcare in the coming years. With the support of AO Investigation and Documentation, a classification group was established to develop and evaluate a comprehensive classification system for craniomaxillofacial (CMF) fractures. Blueprints for fracture classification in the major constituents of the human skull were drafted and then evaluated by a multispecialty group of experienced CMF surgeons and a radiologist in a structured process during iterative agreement sessions. At each session, surgeons independently classified the radiological imaging of up to 150 consecutive cases with CMF fractures. During subsequent review meetings, all discrepancies in the classification outcome were critically appraised for clarification and improvement until consensus was reached. The resulting CMF classification system is structured in a hierarchical fashion with three levels of increasing complexity. The most elementary level 1 simply distinguishes four fracture locations within the skull: mandible (code 91), midface (code 92), skull base (code 93), and cranial vault (code 94). Levels 2 and 3 focus on further defining the fracture locations and for fracture morphology, achieving an almost individual mapping of the fracture pattern. This introductory article describes the rationale for the comprehensive AO CMF classification system, discusses the methodological framework, and provides insight into the experiences and interactions during the evaluation process within the core groups. The details of this system in terms of anatomy and levels are presented in a series of focused tutorials illustrated with case examples in this special issue of the Journal.

  1. Deep learning for hybrid EEG-fNIRS brain-computer interface: application to motor imagery classification.

    PubMed

    Chiarelli, Antonio Maria; Croce, Pierpaolo; Merla, Arcangelo; Zappasodi, Filippo

    2018-06-01

    Brain-computer interface (BCI) refers to procedures that link the central nervous system to a device. BCI was historically performed using electroencephalography (EEG). In the last years, encouraging results were obtained by combining EEG with other neuroimaging technologies, such as functional near infrared spectroscopy (fNIRS). A crucial step of BCI is brain state classification from recorded signal features. Deep artificial neural networks (DNNs) recently reached unprecedented complex classification outcomes. These performances were achieved through increased computational power, efficient learning algorithms, valuable activation functions, and restricted or back-fed neurons connections. By expecting significant overall BCI performances, we investigated the capabilities of combining EEG and fNIRS recordings with state-of-the-art deep learning procedures. We performed a guided left and right hand motor imagery task on 15 subjects with a fixed classification response time of 1 s and overall experiment length of 10 min. Left versus right classification accuracy of a DNN in the multi-modal recording modality was estimated and it was compared to standalone EEG and fNIRS and other classifiers. At a group level we obtained significant increase in performance when considering multi-modal recordings and DNN classifier with synergistic effect. BCI performances can be significantly improved by employing multi-modal recordings that provide electrical and hemodynamic brain activity information, in combination with advanced non-linear deep learning classification procedures.

  2. Deep learning for hybrid EEG-fNIRS brain–computer interface: application to motor imagery classification

    NASA Astrophysics Data System (ADS)

    Chiarelli, Antonio Maria; Croce, Pierpaolo; Merla, Arcangelo; Zappasodi, Filippo

    2018-06-01

    Objective. Brain–computer interface (BCI) refers to procedures that link the central nervous system to a device. BCI was historically performed using electroencephalography (EEG). In the last years, encouraging results were obtained by combining EEG with other neuroimaging technologies, such as functional near infrared spectroscopy (fNIRS). A crucial step of BCI is brain state classification from recorded signal features. Deep artificial neural networks (DNNs) recently reached unprecedented complex classification outcomes. These performances were achieved through increased computational power, efficient learning algorithms, valuable activation functions, and restricted or back-fed neurons connections. By expecting significant overall BCI performances, we investigated the capabilities of combining EEG and fNIRS recordings with state-of-the-art deep learning procedures. Approach. We performed a guided left and right hand motor imagery task on 15 subjects with a fixed classification response time of 1 s and overall experiment length of 10 min. Left versus right classification accuracy of a DNN in the multi-modal recording modality was estimated and it was compared to standalone EEG and fNIRS and other classifiers. Main results. At a group level we obtained significant increase in performance when considering multi-modal recordings and DNN classifier with synergistic effect. Significance. BCI performances can be significantly improved by employing multi-modal recordings that provide electrical and hemodynamic brain activity information, in combination with advanced non-linear deep learning classification procedures.

  3. [Guideline development for rehabilitation of breast cancer patients - phase 2: findings from the classification of therapeutic procedures, KTL-data-analysis].

    PubMed

    Domann, U; Brüggemann, S; Klosterhuis, H; Weis, J

    2007-08-01

    The aim of this project is the development of an evidence-based guideline for the rehabilitation of breast cancer patients, funded by the German Pension Insurance scheme. The project consists of four phases. This paper is focused on the 2nd phase, i.e., analysis of procedures in rehabilitation based on evidence-based therapeutic modules. As a result of a systematic literature review, 14 therapeutic modules were defined. From a total of 840 possible KTL Codes (Klassifikation Therapeutischer Leistungen, Classification of therapeutic procedures), 229 could be assigned to these modules. These analyses are based on 24685 patients in 57 rehabilitation clinics, who had been treated in 2003. For these modules the number of patients having received those interventions as well as the duration of the modules were calculated. The data were analysed with respect to the influence of age and comorbidity. Moreover, differences between rehabilitation clinics were investigated according to the category of interventions. Our findings show great variability in the use of the therapeutic modules. Therapeutic modules like Physiotherapy (91.6%), Training Therapy (85.2%) and Information (97.8%) are provided to most of the patients. Younger patients receive more treatments than older patients, and patients with higher comorbidity receive more Physiotherapy, Lymphoedema Therapy and Psychological Interventions than patients without comorbidities. Data analysis shows wide interindividual variability with regard to the therapeutic modules. This variability is related to age and comorbidity of the patients. Furthermore, great differences were found between the rehabilitation clinics concerning the use of the various interventions. This variability supports the necessity of developing and implementing an evidence-based guideline for the rehabilitation of breast cancer patients. The next step will be to discuss these findings with experts from science and clinical practice.

  4. Independent components analysis to increase efficiency of discriminant analysis methods (FDA and LDA): Application to NMR fingerprinting of wine.

    PubMed

    Monakhova, Yulia B; Godelmann, Rolf; Kuballa, Thomas; Mushtakova, Svetlana P; Rutledge, Douglas N

    2015-08-15

    Discriminant analysis (DA) methods, such as linear discriminant analysis (LDA) or factorial discriminant analysis (FDA), are well-known chemometric approaches for solving classification problems in chemistry. In most applications, principal components analysis (PCA) is used as the first step to generate orthogonal eigenvectors and the corresponding sample scores are utilized to generate discriminant features for the discrimination. Independent components analysis (ICA) based on the minimization of mutual information can be used as an alternative to PCA as a preprocessing tool for LDA and FDA classification. To illustrate the performance of this ICA/DA methodology, four representative nuclear magnetic resonance (NMR) data sets of wine samples were used. The classification was performed regarding grape variety, year of vintage and geographical origin. The average increase for ICA/DA in comparison with PCA/DA in the percentage of correct classification varied between 6±1% and 8±2%. The maximum increase in classification efficiency of 11±2% was observed for discrimination of the year of vintage (ICA/FDA) and geographical origin (ICA/LDA). The procedure to determine the number of extracted features (PCs, ICs) for the optimum DA models was discussed. The use of independent components (ICs) instead of principal components (PCs) resulted in improved classification performance of DA methods. The ICA/LDA method is preferable to ICA/FDA for recognition tasks based on NMR spectroscopic measurements. Copyright © 2015 Elsevier B.V. All rights reserved.
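    A minimal sketch of the ICA-as-preprocessing idea: independent component scores (here scikit-learn's FastICA, not the specific mutual-information-minimizing algorithm of the paper) feed an LDA classifier, compared side by side with PCA/LDA on simulated data standing in for the NMR fingerprints.

```python
# Sketch of the ICA/DA idea: extract independent components as a preprocessing
# step (scikit-learn's FastICA, not the paper's specific algorithm) and feed
# the component scores to LDA; PCA/LDA is shown for comparison. Simulated data
# stand in for the NMR fingerprints.
import numpy as np
from sklearn.decomposition import FastICA, PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 200))                  # 120 wine spectra, 200 spectral variables
y = rng.integers(0, 3, size=120)                 # e.g. three grape varieties
X[:, :5] += y[:, None] * 0.8                     # a few discriminative variables

ica_lda = make_pipeline(FastICA(n_components=10, random_state=0),
                        LinearDiscriminantAnalysis())
pca_lda = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())

for name, model in [("ICA/LDA", ica_lda), ("PCA/LDA", pca_lda)]:
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: cross-validated accuracy = {acc:.2f}")
```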

  5. A procedure to detect abnormal sensorimotor control in adolescents with idiopathic scoliosis.

    PubMed

    Pialasse, Jean-Philippe; Mercier, Pierre; Descarreaux, Martin; Simoneau, Martin

    2017-09-01

    This work identifies, among adolescents with idiopathic scoliosis, those demonstrating impaired sensorimotor control through a classification procedure comparing the amplitude of their vestibular-evoked postural responses. The sensorimotor control of healthy adolescents (n=17) and adolescents with idiopathic scoliosis (n=52) with either mild (Cobb angle≥15° and ≤30°) or severe (Cobb angle >30°) spine deformation was assessed through galvanic vestibular stimulation. A classification procedure sorted out adolescents with idiopathic scoliosis whether the amplitude of their vestibular-evoked postural response was dissimilar or similar to controls. Compared to controls, galvanic vestibular stimulation evoked larger postural response in adolescents with idiopathic scoliosis. Nonetheless, the classification procedure revealed that only 42.5% of all patients showed impaired sensorimotor control. Consequently, identifying patients with sensorimotor control impairment would allow to apply personalized treatments, help clinicians to establish prognosis and hopefully improve the condition of patients with adolescent idiopathic scoliosis. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Use of Binary Partition Tree and energy minimization for object-based classification of urban land cover

    NASA Astrophysics Data System (ADS)

    Li, Mengmeng; Bijker, Wietske; Stein, Alfred

    2015-04-01

    Two main challenges are faced when classifying urban land cover from very high resolution satellite images: obtaining an optimal image segmentation and distinguishing buildings from other man-made objects. For optimal segmentation, this work proposes a hierarchical representation of an image by means of a Binary Partition Tree (BPT) and an unsupervised evaluation of image segmentations by energy minimization. For building extraction, we apply fuzzy sets to create a fuzzy landscape of shadows, which in turn involves a two-step procedure. The first step is a preliminary image classification at a fine segmentation level to generate vegetation and shadow information. The second step models the directional relationship between building and shadow objects to extract building information at the optimal segmentation level. We conducted the experiments on two datasets of Pléiades images from Wuhan City, China. To demonstrate its performance, the proposed classification is compared at the optimal segmentation level with Maximum Likelihood Classification and Support Vector Machine classification. The results show that the proposed classification produced the highest overall accuracies and kappa coefficients, and the smallest over-classification and under-classification geometric errors. We conclude first that integrating BPT with energy minimization offers an effective means for image segmentation. Second, we conclude that the directional relationship between building and shadow objects represented by a fuzzy landscape is important for building extraction.

  7. 47 CFR 1.929 - Classification of filings as major or minor.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Classification of filings as major or minor. 1.929 Section 1.929 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE... Classification of filings as major or minor. Applications and amendments to applications for stations in the...

  8. Proposed Core Competencies and Empirical Validation Procedure in Competency Modeling: Confirmation and Classification.

    PubMed

    Baczyńska, Anna K; Rowiński, Tomasz; Cybis, Natalia

    2016-01-01

    Competency models provide insight into key skills which are common to many positions in an organization. Moreover, there is a range of competencies that is used by many companies. Researchers have developed core competency terminology to underline their cross-organizational value. The article presents a theoretical model of core competencies consisting of two main higher-order competencies called performance and entrepreneurship. Each of them consists of three elements: the performance competency includes cooperation, organization of work and goal orientation, while entrepreneurship includes innovativeness, calculated risk-taking and pro-activeness. However, there is a lack of empirical validation of competency concepts in organizations, and this would seem crucial for obtaining reliable results from organizational research. We propose a two-step empirical validation procedure: (1) confirmatory factor analysis, and (2) classification of employees. The sample consisted of 636 respondents (M = 44.5; SD = 15.1). Participants were administered a questionnaire developed for the study purpose. The reliability, measured by Cronbach's alpha, ranged from 0.60 to 0.83 for the six scales. Next, we tested the model using a confirmatory factor analysis. The two separate, single models of performance and entrepreneurial orientations fit the data quite well, while a complex model based on the two single concepts needs further research. In the classification of employees based on the two higher-order competencies, we obtained four main groups of employees. Their profiles relate to those found in the literature, including so-called niche finders and top performers. Some proposals for organizations are discussed.
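
    As a small illustration of the scale-reliability step mentioned above, the following sketch computes Cronbach's alpha for one scale from an items-by-respondents matrix. The data are synthetic and the five-item scale is an assumption for demonstration, not the questionnaire used in the article.

        import numpy as np

        def cronbach_alpha(items: np.ndarray) -> float:
            """items: 2D array, rows = respondents, columns = items of one scale."""
            k = items.shape[1]
            item_variances = items.var(axis=0, ddof=1)
            total_variance = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

        # Synthetic responses: 636 respondents, 5 items driven by one latent factor.
        rng = np.random.default_rng(0)
        latent = rng.normal(size=(636, 1))
        responses = latent + rng.normal(scale=1.0, size=(636, 5))
        print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")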

  9. International consensus for neuroblastoma molecular diagnostics: report from the International Neuroblastoma Risk Group (INRG) Biology Committee

    PubMed Central

    Ambros, P F; Ambros, I M; Brodeur, G M; Haber, M; Khan, J; Nakagawara, A; Schleiermacher, G; Speleman, F; Spitz, R; London, W B; Cohn, S L; Pearson, A D J; Maris, J M

    2009-01-01

    Neuroblastoma serves as a paradigm for utilising tumour genomic data for determining patient prognosis and treatment allocation. However, before the establishment of the International Neuroblastoma Risk Group (INRG) Task Force in 2004, international consensus on markers, methodology, and data interpretation did not exist, compromising the reliability of decisive genetic markers and inhibiting translational research efforts. The objectives of the INRG Biology Committee were to identify highly prognostic genetic aberrations to be included in the new INRG risk classification schema and to develop precise definitions, decisive biomarkers, and technique standardisation. The review of the INRG database (n=8800 patients) by the INRG Task Force finally enabled the identification of the most significant neuroblastoma biomarkers. In addition, the Biology Committee compared the standard operating procedures of different cooperative groups to arrive at international consensus for methodology, nomenclature, and future directions. Consensus was reached to include MYCN status, 11q23 allelic status, and ploidy in the INRG classification system on the basis of an evidence-based review of the INRG database. Standardised operating procedures for analysing these genetic factors were adopted, and criteria for proper nomenclature were developed. Neuroblastoma treatment planning is highly dependent on tumour cell genomic features, and it is likely that a comprehensive panel of DNA-based biomarkers will be used in future risk assignment algorithms applying genome-wide techniques. Consensus on methodology and interpretation is essential for uniform INRG classification and will greatly facilitate international and cooperative clinical and translational research studies. PMID:19401703

  10. 40 CFR 152.166 - Labeling of restricted use products.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152... formulation into other pesticide products) is not subject to the labeling requirements of this subpart. ...

  11. 40 CFR 152.166 - Labeling of restricted use products.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152... formulation into other pesticide products) is not subject to the labeling requirements of this subpart. ...

  12. 40 CFR 152.166 - Labeling of restricted use products.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152... formulation into other pesticide products) is not subject to the labeling requirements of this subpart. ...

  13. 40 CFR 152.166 - Labeling of restricted use products.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152... formulation into other pesticide products) is not subject to the labeling requirements of this subpart. ...

  14. 40 CFR 152.166 - Labeling of restricted use products.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152... formulation into other pesticide products) is not subject to the labeling requirements of this subpart. ...

  15. The influence of multispectral scanner spatial resolution on forest feature classification

    NASA Technical Reports Server (NTRS)

    Sadowski, F. G.; Malila, W. A.; Sarno, J. E.; Nalepka, R. F.

    1977-01-01

    Inappropriate spatial resolution and corresponding data processing techniques may be major causes for non-optimal forest classification results frequently achieved from multispectral scanner (MSS) data. Procedures and results of empirical investigations are studied to determine the influence of MSS spatial resolution on the classification of forest features into levels of detail or hierarchies of information that might be appropriate for nationwide forest surveys and detailed in-place inventories. Two somewhat different, but related studies are presented. The first consisted of establishing classification accuracies for several hierarchies of features as spatial resolution was progressively coarsened from (2 meters)² to (64 meters)². The second investigated the capabilities for specialized processing techniques to improve upon the results of conventional processing procedures for both coarse and fine resolution data.

  16. A higher order conditional random field model for simultaneous classification of land cover and land use

    NASA Astrophysics Data System (ADS)

    Albert, Lena; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    We propose a new approach for the simultaneous classification of land cover and land use considering spatial as well as semantic context. We apply a Conditional Random Field (CRF) consisting of a land cover layer and a land use layer. In the land cover layer of the CRF, the nodes represent super-pixels; in the land use layer, the nodes correspond to objects from a geospatial database. Intra-layer edges of the CRF model spatial dependencies between neighbouring image sites. All spatially overlapping sites in both layers are connected by inter-layer edges, which leads to higher order cliques modelling the semantic relation between all land cover and land use sites in the clique. A generic formulation of the higher order potential is proposed. In order to enable efficient inference in the two-layer higher order CRF, we propose an iterative inference procedure in which the two classification tasks mutually influence each other. We integrate contextual relations between land cover and land use in the classification process by using contextual features describing the complex dependencies of all nodes in a higher order clique. These features are incorporated in a discriminative classifier, which approximates the higher order potentials during the inference procedure. The approach is designed for input data based on aerial images. Experiments are carried out on two test sites to evaluate the performance of the proposed method. The experiments show that the classification results are improved compared to the results of a non-contextual classifier. For land cover classification, the result is much more homogeneous and the delineation of land cover segments is improved. For the land use classification, an improvement is mainly achieved for land use objects showing non-typical characteristics or similarities to other land use classes. Furthermore, we have shown that the size of the super-pixels has an influence not only on the level of detail of the classification result, but also on the degree of smoothing induced by the segmentation method, which is especially beneficial for land cover classes covering large, homogeneous areas.

  17. A phylogenomic approach to bacterial subspecies classification: proof of concept in Mycobacterium abscessus.

    PubMed

    Tan, Joon Liang; Khang, Tsung Fei; Ngeow, Yun Fong; Choo, Siew Woh

    2013-12-13

    Mycobacterium abscessus is a rapidly growing mycobacterium that is often associated with human infections. The taxonomy of this species has undergone several revisions and is still being debated. In this study, we sequenced the genomes of 12 M. abscessus strains and used phylogenomic analysis to perform subspecies classification. A data mining approach was used to rank and select informative genes based on the relative entropy metric for the construction of a phylogenetic tree. The resulting tree topology was similar to that generated using the concatenation of five classical housekeeping genes: rpoB, hsp65, secA, recA and sodA. Additional support for the reliability of the subspecies classification came from the analysis of erm41 and ITS gene sequences, single nucleotide polymorphism (SNP)-based classification and strain clustering demonstrated by a variable number tandem repeat (VNTR) assay and a multilocus sequence analysis (MLSA). We subsequently found that the concatenation of a minimal set of three median-ranked genes: DNA polymerase III subunit alpha (polC), 4-hydroxy-2-ketovalerate aldolase (Hoa) and cell division protein FtsZ (ftsZ), is sufficient to recover the same tree topology. PCR assays designed specifically for these genes showed that all three genes could be amplified in the reference strain of M. abscessus ATCC 19977T. This study provides proof of concept that a whole-genome sequence-based data mining approach can provide confirmatory evidence of the phylogenetic informativeness of existing markers, as well as lead to the discovery of a more economical and informative set of markers that produces similar subspecies classification in M. abscessus. The systematic procedure used in this study to choose the informative minimal set of gene markers can potentially be applied to species or subspecies classification of other bacteria.
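
    The gene-ranking step can be sketched only generically, since the abstract does not spell out how the relative entropy metric is computed. The hedged illustration below scores each hypothetical gene by the Kullback-Leibler divergence of its nucleotide composition from a pooled background and ranks genes by that score; the sequences are placeholders and this is one plausible reading of "relative entropy ranking", not the authors' pipeline.

        from collections import Counter
        import numpy as np
        from scipy.stats import entropy

        # Hypothetical gene sequences keyed by gene name (placeholders, not real data).
        genes = {
            "polC": "ATGGCGATCGATCGGCTAGCTAGGCTA",
            "hoa":  "ATGAAATTTGGGCCCAAATTTGGGAAA",
            "ftsZ": "ATGCGCGCGCGCGCATATATATATGCG",
        }

        def base_freqs(seq, alphabet="ACGT"):
            counts = Counter(seq)
            freqs = np.array([counts.get(b, 0) for b in alphabet], dtype=float)
            freqs += 1e-9          # avoid zero probabilities
            return freqs / freqs.sum()

        background = base_freqs("".join(genes.values()))

        # Rank genes by relative entropy (KL divergence) of their composition
        # against the pooled background; a higher score is treated as "more informative".
        scores = {name: entropy(base_freqs(seq), background) for name, seq in genes.items()}
        for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
            print(f"{name}: relative entropy = {score:.4f}")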

  18. Land cover mapping of North and Central America—Global Land Cover 2000

    USGS Publications Warehouse

    Latifovic, Rasim; Zhu, Zhi-Liang

    2004-01-01

    The Land Cover Map of North and Central America for the year 2000 (GLC 2000-NCA), prepared by NRCan/CCRS and USGS/EROS Data Centre (EDC) as a regional component of the Global Land Cover 2000 project, is the subject of this paper. A new mapping approach for transforming satellite observations acquired by the SPOT4/VEGETATION (VGT) sensor into land cover information is outlined. The procedure includes: (1) conversion of daily data into 10-day composites; (2) post-seasonal correction and refinement of apparent surface reflectance in the 10-day composite images; and (3) extraction of land cover information from the composite images. The pre-processing and mosaicking techniques developed and used in this study proved to be very effective in removing cloud contamination, BRDF effects, and noise in the Short Wave Infra-Red (SWIR). The GLC 2000-NCA land cover map is provided as a regional product with 28 land cover classes based on a modified Federal Geographic Data Committee National Vegetation Classification Standard (FGDC NVCS) classification system, and as part of a global product with 22 land cover classes based on the Land Cover Classification System (LCCS) of the Food and Agriculture Organisation. The map was compared on both areal and per-pixel bases over North and Central America to the International Geosphere–Biosphere Programme (IGBP) global land cover classification, the University of Maryland global land cover classification (UMd) and the Moderate Resolution Imaging Spectroradiometer (MODIS) global land cover classification produced by Boston University (BU). There was good agreement (79%) on the spatial distribution and areal extent of forest between GLC 2000-NCA and the other maps; however, GLC 2000-NCA provides additional information on the spatial distribution of forest types. The GLC 2000-NCA map was produced at the continental level incorporating specific needs of the region.

  19. 40 CFR 152.161 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.161 Definitions. In addition to the definitions in § 152.3, the following terms are defined for the purposes of this subpart: (a...

  20. 40 CFR 152.161 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.161 Definitions. In addition to the definitions in § 152.3, the following terms are defined for the purposes of this subpart: (a...

  1. Nearest Neighbor Classification of Stationary Time Series: An Application to Anesthesia Level Classification by EEG Analysis.

    DTIC Science & Technology

    1980-12-05

    classification procedures that are common in speech processing. The anesthesia level classification by EEG time series population screening problem example is in...formance. The use of the KL number type metric in NN rule classification, in a delete-one subject's-EEG-at-a-time KL-NN and KL-kNN classification of the...17 individual labeled EEG sample population using KL-NN and KL-kNN rules. The results obtained are shown in Table 1. The entries in the table indicate
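
    Although the fragment above is heavily truncated, the core idea of nearest-neighbour classification with a Kullback-Leibler-type distance between spectral estimates can be sketched. The snippet below is a hedged illustration on synthetic normalized "spectra" (the original EEG features, metric definition and delete-one cross-validation details are not reproduced); it plugs a symmetrized KL divergence into a k-NN classifier as a custom metric.

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import cross_val_score

        def sym_kl(p, q, eps=1e-12):
            """Symmetrized Kullback-Leibler divergence between two normalized spectra."""
            p = np.asarray(p) + eps
            q = np.asarray(q) + eps
            p, q = p / p.sum(), q / q.sum()
            return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

        # Synthetic normalized power spectra for two hypothetical "anesthesia levels".
        rng = np.random.default_rng(1)
        n_per_class, n_bins = 30, 16
        class0 = rng.dirichlet(np.linspace(5, 1, n_bins), size=n_per_class)
        class1 = rng.dirichlet(np.linspace(1, 5, n_bins), size=n_per_class)
        X = np.vstack([class0, class1])
        y = np.array([0] * n_per_class + [1] * n_per_class)

        knn = KNeighborsClassifier(n_neighbors=3, metric=sym_kl)
        print("cross-validated accuracy:", cross_val_score(knn, X, y, cv=5).mean())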

  2. Influence of leaching conditions for ecotoxicological classification of ash.

    PubMed

    Stiernström, S; Enell, A; Wik, O; Hemström, K; Breitholtz, M

    2014-02-01

    The Waste Framework Directive (WFD; 2008/98/EC) states that classification of hazardous ecotoxicological properties of wastes (i.e. criteria H-14), should be based on the Community legislation on chemicals (i.e. CLP Regulation 1272/2008). However, harmonizing the waste and chemical classification may involve drastic changes related to choice of leaching tests as compared to e.g. the current European standard for ecotoxic characterization of waste (CEN 14735). The primary aim of the present study was therefore to evaluate the influence of leaching conditions, i.e. pH (inherent pH (∼10), and 7), liquid to solid (L/S) ratio (10 and 1000 L/kg) and particle size (<4 mm, <1 mm, and <0.125 mm), for subsequent chemical analysis and ecotoxicity testing in relation to classification of municipal waste incineration bottom ash. The hazard potential, based on either comparisons between element levels in leachate and literature toxicity data or ecotoxicity testing of the leachates, was overall significantly higher at low particle size (<0.125 mm) as compared to particle fractions <1mm and <4mm, at pH 10 as compared to pH 7, and at L/S 10 as compared to L/S 1000. These results show that the choice of leaching conditions is crucial for H-14 classification of ash and must be carefully considered in deciding on future guidance procedures in Europe. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. [Medulloblastoma. Pathology].

    PubMed

    Siegfried, A; Delisle, M-B

    2018-04-24

    Medulloblastomas, embryonal neuroepithelial tumors that develop in the cerebellum or brain stem, are mainly observed in childhood. The treatment of these WHO-Grade IV tumors depends on stratifications that are usually based on postoperative data, histopathological subtype, tumor extension and the presence of MYC or NMYC amplifications. Recently, molecular biology studies based on new technologies (i.e. sequencing, transcriptomics, methylomics) have introduced genetic subtypes integrated into the latest WHO-2016 neuropathological classification. According to this classification, the three genetic groups WNT, SHH (with or without mutated TP53 gene) and non-WNT/non-SHH, the latter comprising subgroups 3 and 4, are recalled in this review. The contribution of immunohistochemistry to defining these groups is specified. The four histopathological groups are detailed in comparison to the WHO-2007 classification and the molecular data: classic medulloblastoma, desmoplastic/nodular medulloblastoma, medulloblastoma with extensive nodularity, and large cell/anaplastic medulloblastoma. The groups defined on genetic and histopathological grounds are not strictly concordant. Depending on the age of the patients, their correlations are different, as well as their role in the management and prognosis of these tumors. Other embryonal tumors, for which new classifications are in progress, and gliomas may be confused with medulloblastoma; the elements of the differential diagnosis of these entities are discussed. This evolution in classification fully justifies ongoing structuring procedures such as histopathological review (RENOCLIP) and the organization of molecular biology platforms. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  4. Improving the Reliability of Tinnitus Screening in Laboratory Animals.

    PubMed

    Jones, Aikeen; May, Bradford J

    2017-02-01

    Behavioral screening remains a contentious issue for animal studies of tinnitus. Most paradigms base a positive tinnitus test on an animal's natural tendency to respond to the "sound" of tinnitus as if it were an actual sound. As a result, animals with tinnitus are expected to display sound-conditioned behaviors when no sound is present or to miss gaps in background sounds because tinnitus "fills in the gap." Reliable confirmation of the behavioral indications of tinnitus can be problematic because the reinforcement contingencies of conventional discrimination tasks break down an animal's tendency to group tinnitus with sound. When responses in silence are rewarded, animals respond in silence regardless of their tinnitus status. When responses in silence are punished, animals stop responding. This study introduces stimulus classification as an alternative approach to tinnitus screening. Classification procedures train animals to respond to the common perceptual features that define a group of sounds (e.g., high pitch or narrow bandwidth). Our procedure trains animals to drink when they hear tinnitus and to suppress drinking when they hear other sounds. Animals with tinnitus are revealed by their tendency to drink in the presence of unreinforced probe sounds that share the perceptual features of the tinnitus classification. The advantages of this approach are illustrated by taking laboratory rats through a testing sequence that includes classification training, the experimental induction of tinnitus, and postinduction screening. Behavioral indications of tinnitus are interpreted and then verified by simulating a known tinnitus percept with objective sounds.

  5. Evidence and evidence gaps in the treatment of Eustachian tube dysfunction and otitis media

    PubMed Central

    Teschner, Magnus

    2016-01-01

    Evidence-based medicine is an approach to medical treatment intended to optimize patient-oriented decision-making on the basis of empirically proven effectiveness. For this purpose, a classification system has been established to categorize studies – and hence therapy options – in respect of associated evidence according to defined criteria. The Eustachian tube connects the nasopharynx with the middle ear cavity. Its key function is to ensure middle ear ventilation. Compromised ventilation results in inflammatory middle ear disorders. Numerous evidence-based therapy options are available for the treatment of impaired middle ear ventilation and otitis media, the main therapeutic approach being antibiotic treatment. More recent procedures such as balloon dilation of the Eustachian tube have also shown initial success but must undergo further evaluation with regard to evidence. There is, as yet, no evidence for some of the other long-established procedures. Owing to the multitude of variables, the classification of evidence levels for various treatment approaches calls for highly diversified assessment. Numerous evidence-based studies are therefore necessary in order to evaluate the evidence pertaining to existing and future therapy solutions for impaired middle ear ventilation and otitis media. If this need is addressed, a wealth of implications can be expected for therapeutic approaches in the years to come. PMID:28025605

  6. The ESHRE/ESGE consensus on the classification of female genital tract congenital anomalies†,‡

    PubMed Central

    Grimbizis, Grigoris F.; Gordts, Stephan; Di Spiezio Sardo, Attilio; Brucker, Sara; De Angelis, Carlo; Gergolet, Marco; Li, Tin-Chiu; Tanos, Vasilios; Brölmann, Hans; Gianaroli, Luca; Campo, Rudi

    2013-01-01

    STUDY QUESTION What classification system is more suitable for the accurate, clear, simple and related to the clinical management categorization of female genital anomalies? SUMMARY ANSWER The new ESHRE/ESGE classification system of female genital anomalies is presented. WHAT IS KNOWN ALREADY Congenital malformations of the female genital tract are common miscellaneous deviations from normal anatomy with health and reproductive consequences. Until now, three systems have been proposed for their categorization but all of them are associated with serious limitations. STUDY DESIGN, SIZE AND DURATION The European Society of Human Reproduction and Embryology (ESHRE) and the European Society for Gynaecological Endoscopy (ESGE) have established a common Working Group, under the name CONUTA (CONgenital UTerine Anomalies), with the goal of developing a new updated classification system. A scientific committee (SC) has been appointed to run the project, looking also for consensus within the scientists working in the field. PARTICIPANTS/MATERIALS, SETTING, METHODS The new system is designed and developed based on (i) scientific research through critical review of current proposals and preparation of an initial proposal for discussion between the experts, (ii) consensus measurement among the experts through the use of the DELPHI procedure and (iii) consensus development by the SC, taking into account the results of the DELPHI procedure and the comments of the experts. Almost 90 participants took part in the process of development of the ESHRE/ESGE classification system, contributing with their structured answers and comments. MAIN RESULTS AND THE ROLE OF CHANCE The ESHRE/ESGE classification system is based on anatomy. Anomalies are classified into the following main classes, expressing uterine anatomical deviations deriving from the same embryological origin: U0, normal uterus; U1, dysmorphic uterus; U2, septate uterus; U3, bicorporeal uterus; U4, hemi-uterus; U5, aplastic uterus; U6, for still unclassified cases. Main classes have been divided into sub-classes expressing anatomical varieties with clinical significance. Cervical and vaginal anomalies are classified independently into sub-classes having clinical significance. LIMITATIONS, REASONS FOR CAUTION The ESHRE/ESGE classification of female genital anomalies seems to fulfill the expectations and the needs of the experts in the field, but its clinical value needs to be proved in everyday practice. WIDER IMPLICATIONS OF THE FINDINGS The ESHRE/ESGE classification system of female genital anomalies could be used as a starting point for the development of guidelines for their diagnosis and treatment. STUDY FUNDING/COMPETING INTEREST(S) None. PMID:23771171

  7. Automatic classification of visual evoked potentials based on wavelet decomposition

    NASA Astrophysics Data System (ADS)

    Stasiakiewicz, Paweł; Dobrowolski, Andrzej P.; Tomczykiewicz, Kazimierz

    2017-04-01

    Diagnosis of the part of the visual system that is responsible for conducting compound action potentials is generally based on visual evoked potentials generated as a result of stimulation of the eye by an external light source. The condition of the patient's visual path is assessed by a set of parameters that describe the extremes, called waves, of the time-domain characteristic. The decision process is complex, and therefore the diagnosis depends significantly on the experience of the doctor. The authors developed a procedure - based on wavelet decomposition and linear discriminant analysis - that ensures automatic classification of visual evoked potentials. The algorithm enables an individual case to be assigned to the normal or the pathological class. The proposed classifier has a sensitivity of 96.4% at a 10.4% probability of false alarm in a group of 220 cases, and an area under the ROC curve equal to 0.96, which, from the medical point of view, is a very good result.
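
    The feature-extraction-plus-LDA pattern described above can be prototyped as follows. This is a hedged sketch on synthetic waveforms using PyWavelets band energies as features; the wavelet family, decomposition level and feature definition are assumptions, not the authors' exact settings.

        import numpy as np
        import pywt
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        def wavelet_energy_features(signal, wavelet="db4", level=5):
            """Relative energy of each wavelet decomposition band of one recording."""
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            energies = np.array([np.sum(c ** 2) for c in coeffs])
            return energies / energies.sum()

        # Synthetic "normal" and "pathological" evoked potentials (toy waveforms).
        rng = np.random.default_rng(0)
        t = np.linspace(0, 0.25, 256)
        normal = [np.exp(-((t - 0.10) / 0.01) ** 2) + 0.1 * rng.normal(size=t.size)
                  for _ in range(60)]
        patho = [0.6 * np.exp(-((t - 0.13) / 0.02) ** 2) + 0.1 * rng.normal(size=t.size)
                 for _ in range(60)]

        X = np.array([wavelet_energy_features(s) for s in normal + patho])
        y = np.array([0] * 60 + [1] * 60)
        lda = LinearDiscriminantAnalysis()
        print("cross-validated accuracy:", cross_val_score(lda, X, y, cv=5).mean())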

  8. 40 CFR 152.167 - Distribution and sale of restricted use products.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.167 Distribution and sale of restricted use products. Unless modified by the Agency, the...

  9. 40 CFR 152.167 - Distribution and sale of restricted use products.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.167 Distribution and sale of restricted use products. Unless modified by the Agency, the...

  10. 40 CFR 152.167 - Distribution and sale of restricted use products.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.167 Distribution and sale of restricted use products. Unless modified by the Agency, the...

  11. 40 CFR 152.167 - Distribution and sale of restricted use products.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.167 Distribution and sale of restricted use products. Unless modified by the Agency, the...

  12. 40 CFR 152.167 - Distribution and sale of restricted use products.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.167 Distribution and sale of restricted use products. Unless modified by the Agency, the...

  13. Classifying the Indication for Colonoscopy Procedures: A Comparison of NLP Approaches in a Diverse National Healthcare System.

    PubMed

    Patterson, Olga V; Forbush, Tyler B; Saini, Sameer D; Moser, Stephanie E; DuVall, Scott L

    2015-01-01

    In order to measure the level of utilization of colonoscopy procedures, identifying the primary indication for the procedure is required. Colonoscopies may be utilized not only for screening, but also for diagnostic or therapeutic purposes. To determine whether a colonoscopy was performed for screening, we created a natural language processing system to identify colonoscopy reports in the electronic medical record system and extract indications for the procedure. A rule-based model and three machine-learning models were created using 2,000 manually annotated clinical notes of patients cared for in the Department of Veterans Affairs. The performance of the models was measured and compared. Analysis of the models on a test set of 1,000 documents indicates that the performance of the rule-based system stays fairly constant when evaluated on the training and testing sets, whereas the machine-learning model without feature selection showed a significant decrease in performance. Therefore, a rule-based classification system appears to be more robust than a machine-learning system in cases where no feature selection is performed.
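
    The contrast between a rule-based classifier and a statistical text classifier can be sketched as follows. This is a hedged, minimal illustration: the indication keywords, example notes and model choice (TF-IDF plus logistic regression) are assumptions for demonstration, not the system described in the article.

        import re
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Hypothetical indication labels: "screening" vs "diagnostic".
        notes = [
            "average risk patient presents for screening colonoscopy",
            "colonoscopy for surveillance after prior polypectomy",
            "evaluation of iron deficiency anemia and hematochezia",
            "diagnostic colonoscopy for chronic diarrhea",
        ]
        labels = ["screening", "screening", "diagnostic", "diagnostic"]

        # Rule-based model: a few hand-written keyword rules (illustrative only).
        SCREENING_PATTERN = re.compile(r"\b(screening|surveillance|average risk)\b", re.I)

        def rule_based_indication(note: str) -> str:
            return "screening" if SCREENING_PATTERN.search(note) else "diagnostic"

        # Machine-learning model: bag-of-words features and a linear classifier.
        ml_model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
        ml_model.fit(notes, labels)

        new_note = "patient referred for screening colonoscopy, no symptoms"
        print("rule-based :", rule_based_indication(new_note))
        print("ML model   :", ml_model.predict([new_note])[0])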

  14. Global similarity predicts dissociation of classification and recognition: evidence questioning the implicit-explicit learning distinction in amnesia.

    PubMed

    Jamieson, Randall K; Holmes, Signy; Mewhort, D J K

    2010-11-01

    Dissociation of classification and recognition in amnesia is widely taken to imply 2 functional systems: an implicit procedural-learning system that is spared in amnesia and an explicit episodic-learning system that is compromised. We argue that both tasks reflect the global similarity of probes to memory. In classification, subjects sort unstudied grammatical exemplars from lures, whereas in recognition, they sort studied grammatical exemplars from lures. Hence, global similarity is necessarily greater in recognition than in classification. Moreover, a grammatical exemplar's similarity to studied exemplars is a nonlinear function of the integrity of the data in memory. Assuming that data integrity is better for control subjects than for subjects with amnesia, the nonlinear relation combined with the advantage for recognition over classification predicts the dissociation of recognition and classification. To illustrate the dissociation of recognition and classification in healthy undergraduates, we manipulated study time to vary the integrity of the data in memory and brought the dissociation under experimental control. We argue that the dissociation reflects a general cost in memory rather than a selective impairment of separate procedural and episodic systems. (c) 2010 APA, all rights reserved

  15. Decision Tree Repository and Rule Set Based Mingjiang River Estuarine Wetlands Classification

    NASA Astrophysics Data System (ADS)

    Zhang, W.; Li, X.; Xiao, W.

    2018-05-01

    Increasing urbanization and industrialization have led to wetland losses in the estuarine area of the Mingjiang River over the past three decades, and increasing attention has been given to producing wetland inventories using remote sensing and GIS technology. Due to inconsistent training sites and training samples, traditional pixel-based image classification methods cannot achieve comparable results across different organizations. Object-oriented image classification techniques show great potential to solve this problem, and Landsat moderate-resolution remote sensing images are widely used to fulfil this requirement. Firstly, standardized atmospheric correction and spectrally high-fidelity texture feature enhancement were conducted before implementing the object-oriented wetland classification method in eCognition. Secondly, we performed a multi-scale segmentation procedure, taking the scale, hue, shape, compactness and smoothness of the image into account to obtain appropriate parameters; using a region-merge algorithm starting from the single-pixel level, the optimal texture segmentation scale for the different types of features was confirmed. The segmented objects were then used as classification units to calculate spectral information such as the mean, maximum, minimum, brightness and normalized values. Spatial features such as the area, length, tightness and shape rule of the image objects, and texture features such as the mean, variance and entropy of the image objects, were used as classification features of the training samples. Based on the reference images and on-the-spot sampling points, typical training samples were selected uniformly and randomly for each type of ground object. The ranges of the spectral, texture and spatial characteristics of each feature type in each feature layer were used to create the decision tree repository. Finally, with the help of high-resolution reference images, a random sampling method was used to conduct the field investigation, achieving an overall accuracy of 90.31% and a Kappa coefficient of 0.88. The classification method based on decision tree threshold values and the rule set developed from the repository outperforms the traditional methodology. Our decision tree repository and rule-set-based object-oriented classification technique is an effective method for producing comparable and consistent wetland data sets.
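
    The rule-set idea of thresholding object features can be emulated with a standard decision tree learner. The sketch below is a hedged illustration on made-up per-object features (mean reflectance, brightness, area, texture entropy); the class names and the learned thresholds are assumptions and do not reproduce the authors' eCognition rule set.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        # Hypothetical per-object features: [mean_NIR, brightness, area_m2, entropy]
        rng = np.random.default_rng(0)
        water   = np.column_stack([rng.normal(0.05, 0.01, 50), rng.normal(0.10, 0.02, 50),
                                   rng.normal(5e4, 1e4, 50),  rng.normal(1.0, 0.2, 50)])
        marsh   = np.column_stack([rng.normal(0.30, 0.05, 50), rng.normal(0.25, 0.03, 50),
                                   rng.normal(2e4, 5e3, 50),  rng.normal(2.5, 0.3, 50)])
        mudflat = np.column_stack([rng.normal(0.15, 0.03, 50), rng.normal(0.35, 0.04, 50),
                                   rng.normal(3e4, 8e3, 50),  rng.normal(1.8, 0.3, 50)])

        X = np.vstack([water, marsh, mudflat])
        y = np.array(["water"] * 50 + ["marsh"] * 50 + ["mudflat"] * 50)

        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
        # The learned thresholds play the role of a transferable rule repository.
        print(export_text(tree, feature_names=["mean_NIR", "brightness", "area_m2", "entropy"]))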

  16. From 2D to 3D Supervised Segmentation and Classification for Cultural Heritage Applications

    NASA Astrophysics Data System (ADS)

    Grilli, E.; Dininno, D.; Petrucci, G.; Remondino, F.

    2018-05-01

    The digital management of architectural heritage information is still a complex problem, as a heritage object requires an integrated representation of various types of information in order to develop appropriate restoration or conservation strategies. Currently, there is extensive research focused on automatic procedures of segmentation and classification of 3D point clouds or meshes, which can accelerate the study of a monument and integrate it with heterogeneous information and attributes, useful to characterize and describe the surveyed object. The aim of this study is to propose an optimal, repeatable and reliable procedure to manage various types of 3D surveying data and associate them with heterogeneous information and attributes to characterize and describe the surveyed object. In particular, this paper presents an approach for classifying 3D heritage models, starting from the segmentation of their textures based on supervised machine learning methods. Experimental results run on three different case studies demonstrate that the proposed approach is effective and with many further potentials.

  17. Interannual drought index variations in Central Europe related to the large-scale atmospheric circulation—application and evaluation of statistical downscaling approaches based on circulation type classifications

    NASA Astrophysics Data System (ADS)

    Beck, Christoph; Philipp, Andreas; Jacobeit, Jucundus

    2015-08-01

    This contribution investigates the relationship between the large-scale atmospheric circulation and interannual variations of the standardized precipitation index (SPI) in Central Europe. To this end, circulation types (CT) have been derived from a variety of circulation type classifications (CTC) applied to daily sea level pressure (SLP) data and mean circulation indices of vorticity ( V), zonality ( Z) and meridionality ( M) have been calculated. Occurrence frequencies of CTs and circulation indices have been utilized as predictors within multiple regression models (MRM) for the estimation of gridded 3-month SPI values over Central Europe, for the period 1950 to 2010. CTC-based MRMs used in the analyses comprise variants concerning the basic method for CT classification, the number of CTs, the size and location of the spatial domain used for CTCs and the exclusive use of CT frequencies or the combined use of CT frequencies and mean circulation indices as predictors. Adequate MRM predictor combinations have been identified by applying stepwise multiple regression analyses within a resampling framework. The performance (robustness) of the resulting MRMs has been quantified based on a leave-one-out cross-validation procedure applying several skill scores. Furthermore, the relative importance of individual predictors has been estimated for each MRM. From these analyses, it can be stated that model skill is improved by (i) the consideration of vorticity characteristics within CTCs, (ii) a relatively small size of the spatial domain to which CTCs are applied and (iii) the inclusion of mean circulation indices. However, model skill exhibits distinct variations between seasons and regions. Whereas promising skill can be stated for the western and northwestern parts of the Central European domain, only unsatisfactory skill is reached in the more continental regions and particularly during summer. Thus, it can be concluded that the presented approaches feature the potential for the downscaling of Central European drought index variations from the large-scale circulation, at least for some regions. Further improvements of CTC-based approaches may be expected from the optimization of CTCs for explaining the SPI, e.g. via the inclusion of additional variables in the classification procedure.
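
    The statistical downscaling step described above (regressing a gridded drought index on circulation-type frequencies and circulation indices, with leave-one-out cross-validation) can be sketched generically. The code below is a hedged illustration on synthetic seasonal data; the predictor set, the plain linear regression and the explained-variance skill score are simplified stand-ins for the MRM framework in the paper.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict

        rng = np.random.default_rng(0)
        n_years, n_ct = 61, 9            # e.g. one season per year 1950-2010, 9 circulation types

        # Synthetic predictors: CT occurrence frequencies plus Z, M, V indices.
        ct_freq = rng.dirichlet(np.ones(n_ct), size=n_years)
        indices = rng.normal(size=(n_years, 3))
        X = np.hstack([ct_freq, indices])

        # Synthetic predictand: 3-month SPI at one grid cell, partly driven by X.
        beta = rng.normal(size=X.shape[1])
        spi = X @ beta + rng.normal(scale=0.5, size=n_years)

        # Leave-one-out cross-validated estimates and a simple skill score.
        pred = cross_val_predict(LinearRegression(), X, spi, cv=LeaveOneOut())
        skill = np.corrcoef(spi, pred)[0, 1] ** 2
        print(f"LOO cross-validated explained variance: {skill:.2f}")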

  18. A novel tree-based procedure for deciphering the genomic spectrum of clinical disease entities.

    PubMed

    Mbogning, Cyprien; Perdry, Hervé; Toussile, Wilson; Broët, Philippe

    2014-01-01

    Dissecting the genomic spectrum of clinical disease entities is a challenging task. Recursive partitioning (or classification trees) methods provide powerful tools for exploring complex interplay among genomic factors, with respect to a main factor, that can reveal hidden genomic patterns. To take confounding variables into account, the partially linear tree-based regression (PLTR) model has been recently published. It combines regression models and tree-based methodology. It is however computationally burdensome and not well suited for situations for which a large number of exploratory variables is expected. We developed a novel procedure that represents an alternative to the original PLTR procedure, and considered different selection criteria. A simulation study with different scenarios has been performed to compare the performances of the proposed procedure to the original PLTR strategy. The proposed procedure with a Bayesian Information Criterion (BIC) achieved good performances to detect the hidden structure as compared to the original procedure. The novel procedure was used for analyzing patterns of copy-number alterations in lung adenocarcinomas, with respect to Kirsten Rat Sarcoma Viral Oncogene Homolog gene (KRAS) mutation status, while controlling for a cohort effect. Results highlight two subgroups of pure or nearly pure wild-type KRAS tumors with particular copy-number alteration patterns. The proposed procedure with a BIC criterion represents a powerful and practical alternative to the original procedure. Our procedure performs well in a general framework and is simple to implement.

  19. Improving the Interpretability of Classification Rules Discovered by an Ant Colony Algorithm: Extended Results.

    PubMed

    Otero, Fernando E B; Freitas, Alex A

    2016-01-01

    Most ant colony optimization (ACO) algorithms for inducing classification rules use an ACO-based procedure to create a rule in a one-at-a-time fashion. An improved search strategy has been proposed in the cAnt-Miner[Formula: see text] algorithm, where an ACO-based procedure is used to create a complete list of rules (ordered rules), i.e., the ACO search is guided by the quality of a list of rules instead of an individual rule. In this paper we propose an extension of the cAnt-Miner[Formula: see text] algorithm to discover a set of rules (unordered rules). The main motivations for this work are to improve the interpretation of individual rules by discovering a set of rules and to evaluate the impact on the predictive accuracy of the algorithm. We also propose a new measure to evaluate the interpretability of the discovered rules, to mitigate the fact that the commonly used model size measure ignores how the rules are used to make a class prediction. Comparisons with state-of-the-art rule induction algorithms, support vector machines, and the cAnt-Miner[Formula: see text] producing ordered rules are also presented.

  20. 14 CFR 21.93 - Classification of changes in type design.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Classification of changes in type design... TRANSPORTATION AIRCRAFT CERTIFICATION PROCEDURES FOR PRODUCTS AND PARTS Changes to Type Certificates § 21.93 Classification of changes in type design. (a) In addition to changes in type design specified in paragraph (b) of...

  1. 10 CFR 9.61 - Procedures for processing requests for records exempt in whole or in part.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... whether it continues to warrant classification under criteria established by an Executive Order to be kept... classification under these criteria shall be declassified and made available to the individual. If the requested... classifying agency to review the information to ascertain if classification is still warranted. If the...

  2. 10 CFR 9.61 - Procedures for processing requests for records exempt in whole or in part.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... whether it continues to warrant classification under criteria established by an Executive Order to be kept... classification under these criteria shall be declassified and made available to the individual. If the requested... classifying agency to review the information to ascertain if classification is still warranted. If the...

  3. 10 CFR 9.61 - Procedures for processing requests for records exempt in whole or in part.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... whether it continues to warrant classification under criteria established by an Executive Order to be kept... classification under these criteria shall be declassified and made available to the individual. If the requested... classifying agency to review the information to ascertain if classification is still warranted. If the...

  4. 10 CFR 9.61 - Procedures for processing requests for records exempt in whole or in part.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... whether it continues to warrant classification under criteria established by an Executive Order to be kept... classification under these criteria shall be declassified and made available to the individual. If the requested... classifying agency to review the information to ascertain if classification is still warranted. If the...

  5. 10 CFR 9.61 - Procedures for processing requests for records exempt in whole or in part.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... whether it continues to warrant classification under criteria established by an Executive Order to be kept... classification under these criteria shall be declassified and made available to the individual. If the requested... classifying agency to review the information to ascertain if classification is still warranted. If the...

  6. Fast clustering algorithm for large ECG data sets based on CS theory in combination with PCA and K-NN methods.

    PubMed

    Balouchestani, Mohammadreza; Krishnan, Sridhar

    2014-01-01

    Long-term recording of electrocardiogram (ECG) signals plays an important role in health care systems for the diagnosis and treatment of heart diseases. Clustering and classification of the collected data are essential for detecting the concealed information of P-QRS-T waves in long-term ECG recordings. Currently used algorithms have their share of drawbacks: 1) clustering and classification cannot be done in real time; and 2) they suffer from huge energy consumption and sampling load. These drawbacks motivated us to develop a novel optimized clustering algorithm that can easily scan large ECG datasets and thereby enable low-power, long-term ECG recording. In this paper, we present an advanced K-means clustering algorithm based on Compressed Sensing (CS) theory as a random sampling procedure. Two dimensionality reduction methods, Principal Component Analysis (PCA) and the Linear Correlation Coefficient (LCC), followed by classification of the data using the K-Nearest Neighbours (K-NN) and Probabilistic Neural Network (PNN) classifiers, are then applied to the proposed algorithm. We show that the algorithm based on PCA features in combination with the K-NN classifier performs better than the other methods. The proposed algorithm outperforms existing algorithms by increasing classification accuracy by 11%. In addition, the proposed algorithm achieves classification accuracies of 99.98% and 99.83% for the K-NN and PNN classifiers, respectively, and a Receiver Operating Characteristic (ROC) area of 99.75%.
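
    The PCA-plus-K-NN combination singled out above is easy to prototype. The following hedged sketch applies it to synthetic heartbeat waveforms; the random projection standing in for compressed-sensing acquisition, the dimensionality choices and the data are all assumptions, not the authors' setup.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_beats, n_samples = 400, 180          # beats x samples per beat (synthetic)

        # Two synthetic beat morphologies with noise, standing in for ECG classes.
        t = np.linspace(0, 1, n_samples)
        normal  = np.exp(-((t - 0.5) / 0.05) ** 2)
        ectopic = np.exp(-((t - 0.4) / 0.10) ** 2) - 0.3 * np.exp(-((t - 0.7) / 0.05) ** 2)
        X = np.vstack([normal + 0.1 * rng.normal(size=(n_beats // 2, n_samples)),
                       ectopic + 0.1 * rng.normal(size=(n_beats // 2, n_samples))])
        y = np.array([0] * (n_beats // 2) + [1] * (n_beats // 2))

        # Random projection as a crude stand-in for compressed-sensing measurements.
        phi = rng.normal(size=(n_samples, 60)) / np.sqrt(60)
        X_cs = X @ phi

        model = make_pipeline(PCA(n_components=10), KNeighborsClassifier(n_neighbors=5))
        print("cross-validated accuracy:", cross_val_score(model, X_cs, y, cv=5).mean())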

  7. Text Classification for Organizational Researchers

    PubMed Central

    Kobayashi, Vladimer B.; Mol, Stefan T.; Berkers, Hannah A.; Kismihók, Gábor; Den Hartog, Deanne N.

    2017-01-01

    Organizations are increasingly interested in classifying texts or parts thereof into categories, as this enables more effective use of their information. Manual procedures for text classification work well for up to a few hundred documents. However, when the number of documents is larger, manual procedures become laborious, time-consuming, and potentially unreliable. Techniques from text mining facilitate the automatic assignment of text strings to categories, making classification expedient, fast, and reliable, which creates potential for its application in organizational research. The purpose of this article is to familiarize organizational researchers with text mining techniques from machine learning and statistics. We describe the text classification process in several roughly sequential steps, namely training data preparation, preprocessing, transformation, application of classification techniques, and validation, and provide concrete recommendations at each step. To help researchers develop their own text classifiers, the R code associated with each step is presented in a tutorial. The tutorial draws from our own work on job vacancy mining. We end the article by discussing how researchers can validate a text classification model and the associated output. PMID:29881249
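
    The article's tutorial is written in R; as a language-agnostic illustration of the same roughly sequential steps (training data preparation, preprocessing and transformation, application of a classification technique, and validation), a minimal Python sketch with assumed example documents is shown below.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import classification_report

        # Step 1: training data preparation (tiny, assumed example corpus).
        texts = ["seeking data analyst with SQL experience",
                 "hiring nurse for intensive care unit",
                 "machine learning engineer, Python required",
                 "registered nurse position, night shifts"] * 10
        labels = ["tech", "health", "tech", "health"] * 10

        # Steps 2-3: preprocessing and transformation (tokenisation + TF-IDF weights).
        # Step 4: classification technique (a linear model is a common baseline).
        model = make_pipeline(TfidfVectorizer(stop_words="english"), LogisticRegression())

        # Step 5: validation on a held-out split.
        X_tr, X_te, y_tr, y_te = train_test_split(texts, labels, test_size=0.25,
                                                  random_state=0, stratify=labels)
        model.fit(X_tr, y_tr)
        print(classification_report(y_te, model.predict(X_te)))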

  8. Representing and comparing protein structures as paths in three-dimensional space

    PubMed Central

    Zhi, Degui; Krishna, S Sri; Cao, Haibo; Pevzner, Pavel; Godzik, Adam

    2006-01-01

    Background Most existing formulations of protein structure comparison are based on detailed atomic-level descriptions of protein structures and bypass potential insights that arise from a higher-level abstraction. Results We propose a structure comparison approach based on a simplified representation of proteins that describes their three-dimensional path by local curvature along the generalized backbone of the polypeptide. We have implemented a dynamic programming procedure that aligns the curvatures of proteins by optimizing a defined summed turning-angle deviation measure. Conclusion Although our procedure does not directly optimize global structural similarity as measured by RMSD, our benchmarking results indicate that it can surprisingly well recover the structural similarity defined by structure classification databases and traditional structure alignment programs. In addition, our program can recognize similarities between structures with extensive conformation changes that are beyond the ability of traditional structure alignment programs. We demonstrate applications of the procedure in several contexts of structure comparison. An implementation of our procedure, CURVE, is available as a public webserver. PMID:17052359
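
    A hedged sketch of a dynamic-programming alignment of two turning-angle (curvature) sequences is shown below; the gap penalty, the absolute-difference cost and the toy angle values are assumptions, and the sketch returns only the optimal alignment cost, not the CURVE program's scoring or the full alignment.

        import numpy as np

        def align_curvatures(a, b, gap_penalty=0.5):
            """Global alignment of two angle sequences minimizing summed deviation."""
            n, m = len(a), len(b)
            dp = np.zeros((n + 1, m + 1))
            dp[:, 0] = np.arange(n + 1) * gap_penalty
            dp[0, :] = np.arange(m + 1) * gap_penalty
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    match = dp[i - 1, j - 1] + abs(a[i - 1] - b[j - 1])
                    dp[i, j] = min(match,
                                   dp[i - 1, j] + gap_penalty,
                                   dp[i, j - 1] + gap_penalty)
            return dp[n, m]

        # Toy turning-angle profiles (radians) along two generalized backbones.
        protein_a = [0.1, 0.3, 1.2, 1.1, 0.2, 0.1]
        protein_b = [0.1, 0.4, 1.1, 0.2, 0.1]
        print("alignment cost:", round(align_curvatures(protein_a, protein_b), 3))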

  9. Classification of red wine based on its protected designation of origin (PDO) using Laser-induced Breakdown Spectroscopy (LIBS).

    PubMed

    Moncayo, S; Rosales, J D; Izquierdo-Hornillos, R; Anzano, J; Caceres, J O

    2016-09-01

    This work reports on a simple and fast classification procedure for the quality control of red wines with a protected designation of origin (PDO) by means of the Laser-Induced Breakdown Spectroscopy (LIBS) technique combined with Neural Networks (NN), in order to strengthen quality assurance and address authenticity issues. A total of thirty-eight red wine samples from different PDOs were analyzed to detect fake wines and to avoid unfair competition in the market. LIBS is well known for not requiring sample preparation; however, in order to increase its analytical performance, a new sample preparation treatment based on prior liquid-to-solid transformation of the wine using a dry collagen gel was developed. The use of collagen pellets allowed successful classification results to be achieved, avoiding the limitations and difficulties of working with aqueous samples. The performance of the NN model was assessed by three validation procedures taking into account its sensitivity (internal validation), generalization ability and robustness (independent external validation). The results of the use of a spectroscopic technique coupled with a chemometric analysis (LIBS-NN) are discussed in terms of its potential use in the food industry, providing a methodology able to perform the quality control of alcoholic beverages. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. A Deep Learning Approach for Fault Diagnosis of Induction Motors in Manufacturing

    NASA Astrophysics Data System (ADS)

    Shao, Si-Yu; Sun, Wen-Jun; Yan, Ru-Qiang; Wang, Peng; Gao, Robert X.

    2017-11-01

    Extracting features from original signals is a key procedure for traditional fault diagnosis of induction motors, as it directly influences the performance of fault recognition. However, high-quality features require expert knowledge and human intervention. In this paper, a deep learning approach based on deep belief networks (DBN) is developed to learn features from the frequency distribution of vibration signals with the purpose of characterizing the working status of induction motors. It combines the feature extraction procedure with the classification task to achieve automated and intelligent fault diagnosis. The DBN model is built by stacking multiple units of restricted Boltzmann machines (RBM), and is trained using a layer-by-layer pre-training algorithm. Compared with traditional diagnostic approaches where feature extraction is needed, the presented approach has the ability to learn hierarchical representations, which are suitable for fault classification, directly from the frequency distribution of the measurement data. The structure of the DBN model is investigated, as the scale and depth of the DBN architecture directly affect its classification performance. An experimental study conducted on a machine fault simulator verifies the effectiveness of the deep learning approach for fault diagnosis of induction motors. This research proposes an intelligent diagnosis method for induction motors which utilizes a deep learning model to automatically learn features from sensor data and recognize working status.
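
    scikit-learn does not ship a full DBN, but a single restricted Boltzmann machine feeding a linear classifier gives a rough, hedged feel for the pre-training-plus-classification idea; the spectra below are synthetic and scaled to [0, 1] because BernoulliRBM expects inputs in that range. This is a simplified stand-in, not the authors' stacked DBN.

        import numpy as np
        from sklearn.neural_network import BernoulliRBM
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_per_class, n_bins = 100, 64
        freqs = np.arange(n_bins)

        def spectrum(peak_bin):
            """Synthetic normalized vibration spectrum with a dominant peak."""
            base = np.exp(-((freqs - peak_bin) / 3.0) ** 2)
            noisy = base + 0.05 * rng.random((n_per_class, n_bins))
            return noisy / noisy.max(axis=1, keepdims=True)   # scale to [0, 1]

        X = np.vstack([spectrum(12), spectrum(30), spectrum(48)])   # three fault states
        y = np.repeat([0, 1, 2], n_per_class)

        model = make_pipeline(
            BernoulliRBM(n_components=32, learning_rate=0.05, n_iter=20, random_state=0),
            LogisticRegression(max_iter=1000),
        )
        print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())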

  11. Fine-grained leukocyte classification with deep residual learning for microscopic images.

    PubMed

    Qin, Feiwei; Gao, Nannan; Peng, Yong; Wu, Zizhao; Shen, Shuying; Grudtsin, Artur

    2018-08-01

    Leukocyte classification and cytometry have wide applications in the medical domain; previous research has usually exploited machine learning techniques to classify leukocytes automatically. However, such methods are constrained by the limitations of earlier machine learning techniques: extracting distinctive features from raw microscopic images is difficult, the widely used SVM classifier has relatively few parameters to tune, and these methods cannot efficiently handle fine-grained classification cases in which the white blood cells fall into up to 40 categories. Based on deep learning theory, a systematic study of finer-grained leukocyte classification is conducted in this paper. A deep residual neural network based leukocyte classifier is constructed first; it can imitate the domain expert's cell recognition process and extract salient features robustly and automatically. The topology of the deep neural network classifier is then adjusted according to prior knowledge of the white blood cell test. After that, a microscopic image dataset with almost one hundred thousand labeled leukocytes belonging to 40 categories is built, and combined training strategies are adopted to give the designed classifier good generalization ability. The proposed deep residual neural network based classifier was tested on the microscopic image dataset with 40 leukocyte categories. It achieves a top-1 accuracy of 77.80% and a top-5 accuracy of 98.75% during the training procedure; the average accuracy on the test set is nearly 76.84%. This paper presents a fine-grained leukocyte classification method for microscopic images, based on deep residual learning theory and medical domain knowledge. Experimental results validate the feasibility and effectiveness of our approach. Extended experiments support that the fine-grained leukocyte classifier could be used in real medical applications, assist doctors in diagnosing diseases, and significantly reduce manual workload. Copyright © 2018 Elsevier B.V. All rights reserved.
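
    The core building block of residual learning is the identity skip connection. The sketch below is a hedged, generic residual block in PyTorch; the channel counts and layer choices are assumptions for illustration, not the authors' network.

        import torch
        import torch.nn as nn

        class ResidualBlock(nn.Module):
            """A basic residual block: output = ReLU(F(x) + x)."""
            def __init__(self, channels: int):
                super().__init__()
                self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
                self.bn1 = nn.BatchNorm2d(channels)
                self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
                self.bn2 = nn.BatchNorm2d(channels)
                self.relu = nn.ReLU(inplace=True)

            def forward(self, x: torch.Tensor) -> torch.Tensor:
                identity = x                      # skip connection
                out = self.relu(self.bn1(self.conv1(x)))
                out = self.bn2(self.conv2(out))
                return self.relu(out + identity)  # residual addition eases optimization

        # Toy usage: a batch of 8 RGB-like 64x64 "cell" crops passed through one block
        # after an initial convolution that sets the channel width.
        stem = nn.Conv2d(3, 32, kernel_size=3, padding=1)
        block = ResidualBlock(32)
        x = torch.randn(8, 3, 64, 64)
        print(block(stem(x)).shape)               # torch.Size([8, 32, 64, 64])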

  12. Place-classification analysis of community vulnerability to near-field tsunami threats in the U.S. Pacific Northwest (Invited)

    NASA Astrophysics Data System (ADS)

    Wood, N. J.; Jones, J.; Spielman, S.

    2013-12-01

    Near-field tsunami hazards are credible threats to many coastal communities throughout the world. Along the U.S. Pacific Northwest coast, low-lying areas could be inundated by a series of catastrophic tsunami waves that begin to arrive in a matter of minutes following a Cascadia subduction zone (CSZ) earthquake. This presentation summarizes analytical efforts to classify communities with similar characteristics of community vulnerability to tsunami hazards. This work builds on past State-focused inventories of community exposure to CSZ-related tsunami hazards in northern California, Oregon, and Washington. Attributes used in the classification, or cluster analysis, include demography of residents, spatial extent of the developed footprint based on mid-resolution land cover data, distribution of the local workforce, and the number and type of public venues, dependent-care facilities, and community-support businesses. Population distributions also are characterized by a function of travel time to safety, based on anisotropic, path-distance, geospatial modeling. We used an unsupervised-model-based clustering algorithm and a v-fold, cross-validation procedure (v=50) to identify the appropriate number of community types. We selected class solutions that provided the appropriate balance between parsimony and model fit. The goal of the vulnerability classification is to provide emergency managers with a general sense of the types of communities in tsunami hazard zones based on similar characteristics instead of only providing an exhaustive list of attributes for individual communities. This classification scheme can be then used to target and prioritize risk-reduction efforts that address common issues across multiple communities. The presentation will include a discussion of the utility of proposed place classifications to support regional preparedness and outreach efforts.
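
    A hedged sketch of the model-based clustering idea (choosing the number of community types by balancing parsimony and model fit) is shown below; it uses a Gaussian mixture model with the BIC as the selection criterion on synthetic community attributes, rather than the authors' specific algorithm and v-fold cross-validation setup.

        import numpy as np
        from sklearn.mixture import GaussianMixture
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        # Synthetic community attributes: residents, employees, businesses, travel time.
        communities = np.vstack([
            rng.normal([500, 200, 20, 10],    [100, 50, 5, 3],   size=(40, 4)),
            rng.normal([5000, 3000, 200, 25], [800, 400, 40, 5], size=(40, 4)),
            rng.normal([1500, 400, 60, 45],   [300, 100, 15, 8], size=(40, 4)),
        ])
        X = StandardScaler().fit_transform(communities)

        # Fit mixtures with increasing numbers of components and pick the lowest BIC.
        bics = {}
        for k in range(1, 8):
            gmm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(X)
            bics[k] = gmm.bic(X)
        best_k = min(bics, key=bics.get)
        print("BIC by k:", {k: round(v, 1) for k, v in bics.items()})
        print("selected number of community types:", best_k)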

  13. Single-particle cryo-EM using alignment by classification (ABC): the structure of Lumbricus terrestris haemoglobin

    PubMed Central

    Seer-Linnemayr, Charlotte; Ravelli, Raimond B. G.; Matadeen, Rishi; De Carlo, Sacha; Alewijnse, Bart; Portugal, Rodrigo V.; Pannu, Navraj S.; Schatz, Michael; van Heel, Marin

    2017-01-01

    Single-particle cryogenic electron microscopy (cryo-EM) can now yield near-atomic resolution structures of biological complexes. However, the reference-based alignment algorithms commonly used in cryo-EM suffer from reference bias, limiting their applicability (also known as the ‘Einstein from random noise’ problem). Low-dose cryo-EM therefore requires robust and objective approaches to reveal the structural information contained in the extremely noisy data, especially when dealing with small structures. A reference-free pipeline is presented for obtaining near-atomic resolution three-dimensional reconstructions from heterogeneous (‘four-dimensional’) cryo-EM data sets. The methodologies integrated in this pipeline include a posteriori camera correction, movie-based full-data-set contrast transfer function determination, movie-alignment algorithms, (Fourier-space) multivariate statistical data compression and unsupervised classification, ‘random-startup’ three-dimensional reconstructions, four-dimensional structural refinements and Fourier shell correlation criteria for evaluating anisotropic resolution. The procedures exclusively use information emerging from the data set itself, without external ‘starting models’. Euler-angle assignments are performed by angular reconstitution rather than by the inherently slower projection-matching approaches. The comprehensive ‘ABC-4D’ pipeline is based on the two-dimensional reference-free ‘alignment by classification’ (ABC) approach, where similar images in similar orientations are grouped by unsupervised classification. Some fundamental differences between X-ray crystallography versus single-particle cryo-EM data collection and data processing are discussed. The structure of the giant haemoglobin from Lumbricus terrestris at a global resolution of ∼3.8 Å is presented as an example of the use of the ABC-4D procedure. PMID:28989723

  14. Aggregative Learning Method and Its Application for Communication Quality Evaluation

    NASA Astrophysics Data System (ADS)

    Akhmetov, Dauren F.; Kotaki, Minoru

    2007-12-01

    In this paper, the so-called Aggregative Learning Method (ALM) is proposed to improve and simplify the learning and classification abilities of different data processing systems. It provides a universal basis for the design and analysis of a wide class of mathematical models. A procedure was elaborated for time series model reconstruction and analysis in the linear and nonlinear cases. Data approximation accuracy (during the learning phase) and data classification quality (during the recall phase) are estimated from the introduced statistical parameters. The validity and efficiency of the proposed approach have been demonstrated through its application to monitoring of wireless communication quality, namely for a Fixed Wireless Access (FWA) system. Low memory and computation resources were shown to be needed for realization of the procedure, especially for the data classification (recall) stage. Characterized by high computational efficiency and a simple decision making procedure, the derived approaches can be useful for simple and reliable real-time surveillance and control system design.

  15. Users manual for the US baseline corn and soybean segment classification procedure

    NASA Technical Reports Server (NTRS)

    Horvath, R.; Colwell, R. (Principal Investigator); Hay, C.; Metzler, M.; Mykolenko, O.; Odenweller, J.; Rice, D.

    1981-01-01

    A user's manual for the classification component of the FY-81 U.S. Corn and Soybean Pilot Experiment in the Foreign Commodity Production Forecasting Project of AgRISTARS is presented. This experiment is one of several major experiments in AgRISTARS designed to measure and advance the remote sensing technologies for cropland inventory. The classification procedure discussed is designed to produce segment proportion estimates for corn and soybeans in the U.S. Corn Belt (Iowa, Indiana, and Illinois) using LANDSAT data. The estimates are produced by an integrated Analyst/Machine procedure. The Analyst selects acquisitions, participates in stratification, and assigns crop labels to selected samples. In concert with the Analyst, the machine digitally preprocesses LANDSAT data to remove external effects, stratifies the data into field-like units and into spectrally similar groups, statistically samples the data for Analyst labeling, and combines the labeled samples into a final estimate.

  16. 40 CFR 152.171 - Restrictions other than those relating to use by certified applicators.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.171 Restrictions other than those relating to use by certified applicators...

  17. 40 CFR 152.171 - Restrictions other than those relating to use by certified applicators.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.171 Restrictions other than those relating to use by certified applicators...

  18. 40 CFR 152.171 - Restrictions other than those relating to use by certified applicators.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.171 Restrictions other than those relating to use by certified applicators...

  19. 40 CFR 152.171 - Restrictions other than those relating to use by certified applicators.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.171 Restrictions other than those relating to use by certified applicators...

  20. Osteochondroma of the mandibular condyle: a classification system based on computed tomographic appearances.

    PubMed

    Chen, Min-jie; Yang, Chi; Qiu, Ya-ting; Zhou, Qin; Huang, Dong; Shi, Hui-min

    2014-09-01

    The objectives of this study were to introduce a classification of osteochondroma of the mandibular condyle based on computed tomographic images and to present our treatment experience. From January 2002 to December 2012, a total of 61 patients with condylar osteochondroma were treated in our division. Both clinical and radiologic aspects were reviewed. The average follow-up period was 24.3 months, with a range of 6 to 120 months. Two types of condylar osteochondroma were identified: type 1 (protruding expansion) in 50 patients (82.0%) and type 2 (globular expansion) in 11 patients (18.0%). Type 1 condylar osteochondroma presented in 5 forms: anterior/anteromedial (58%), posterior/posteromedial (6%), medial (16%), lateral (6%), and gigantic (14%). Local resection was performed on patients with type 1 condylar osteochondroma. Subtotal or total condylectomy with costochondral graft reconstruction, with or without orthognathic surgery, was performed on patients with type 2 condylar osteochondroma. During the follow-up period, tumor recurrence, condylar absorption, and new deformity were not detected, and the patients largely reattained facial symmetry. Preoperative classification based on computed tomographic images will help surgeons choose the suitable surgical procedure to treat condylar osteochondroma.

  1. Are paediatric operations evidence based? A prospective analysis of general surgery practice in a teaching paediatric hospital.

    PubMed

    Zani-Ruttenstock, Elke; Zani, Augusto; Bullman, Emma; Lapidus-Krol, Eveline; Pierro, Agostino

    2015-01-01

    Paediatric surgical practice should be based upon solid scientific evidence. A study in 1998 (Baraldini et al., Pediatr Surg Int) indicated that only a quarter of paediatric operations were supported by the then gold standard of evidence based medicine (EBM), which was defined by randomized controlled trials (RCTs). The aim of the current study was to re-evaluate paediatric surgical practice 16 years after the previous study in a larger cohort of patients. A prospective observational study was performed in a tertiary level teaching hospital for children. The study was approved by the local research ethics board. All diagnostic and therapeutic procedures requiring a general anaesthetic carried out over a 4-week period (24 Feb 2014-22 Mar 2014) under the general surgery service or involving a general paediatric surgeon were included in the study. PubMed and EMBASE were used to search the literature for the highest level of evidence supporting the recorded procedures. Evidence was classified according to the Oxford Centre for Evidence Based Medicine (OCEBM) 2009 system as well as according to the classification used by Baraldini et al. Results were compared using the χ2 test; P < 0.05 was considered statistically significant. During the study period, 126 operations (36 different types) were performed on 118 patients. According to the OCEBM classification, 62 procedures (49 %) were supported by systematic reviews of multiple homogeneous RCTs (level 1a), 13 (10 %) by individual RCTs (level 1b), 5 (4 %) by systematic reviews of cohort studies (level 2a), 11 (9 %) by individual cohort studies, 1 (1 %) by a systematic review of case-control studies (level 3a), 14 (11 %) by case-control studies (level 3b), 9 (7 %) by case series (level 4), and 11 procedures (9 %) were based on expert opinion or deemed self-evident interventions (level 5). A high level of evidence (OCEBM level 1a or 1b, or level I according to Baraldini et al. PSI 1998) supported 75 (60 %) operations in the current study compared to 18 (26 %) in the 1998 study (P < 0.0001). The present study shows that nowadays a remarkable number of paediatric surgical procedures are supported by a high level of evidence. Despite this improvement in evidence-based paediatric surgical practice, more than a third of the procedures still lack sufficient evidence-based literature support. More RCTs are warranted to support and direct paediatric surgery practice according to the principles of EBM.

  2. Log-ratio transformed major element based multidimensional classification for altered High-Mg igneous rocks

    NASA Astrophysics Data System (ADS)

    Verma, Surendra P.; Rivera-Gómez, M. Abdelaly; Díaz-González, Lorena; Quiroz-Ruiz, Alfredo

    2016-12-01

    A new multidimensional classification scheme consistent with the chemical classification of the International Union of Geological Sciences (IUGS) is proposed for the nomenclature of High-Mg altered rocks. Our procedure is based on an extensive database of major element (SiO2, TiO2, Al2O3, Fe2O3t, MnO, MgO, CaO, Na2O, K2O, and P2O5) compositions of a total of 33,868 (920 High-Mg and 32,948 "Common") relatively fresh igneous rock samples. The database consisting of these multinormally distributed samples in terms of their isometric log-ratios was used to propose a set of 11 discriminant functions and 6 diagrams to facilitate High-Mg rock classification. The multinormality required by linear discriminant and canonical analysis was ascertained by a new computer program DOMuDaF. One multidimensional function can distinguish the High-Mg and Common igneous rocks with high percent success values of about 86.4% and 98.9%, respectively. Similarly, from 10 discriminant functions the High-Mg rocks can also be classified as one of the four rock types (komatiite, meimechite, picrite, and boninite), with high success values of about 88%-100%. Satisfactory functioning of this new classification scheme was confirmed by seven independent tests. Five further case studies involving application to highly altered rocks illustrate the usefulness of our proposal. A computer program HMgClaMSys was written to efficiently apply the proposed classification scheme, which will be available for online processing of igneous rock compositional data. Monte Carlo simulation modeling and mass-balance computations confirmed the robustness of our classification with respect to analytical errors and postemplacement compositional changes.
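    The sketch below illustrates the general log-ratio plus discriminant-analysis workflow in the spirit of the abstract: an isometric log-ratio (ilr) transform of closed major-element compositions followed by linear discriminant analysis. The synthetic compositions and the particular ilr balances are illustrative assumptions; these are not the published HMgClaMSys discriminant functions.

```python
# Hedged sketch: isometric log-ratio (ilr) transform of 10-part compositions, then LDA.
# Data and balance ordering are illustrative, not the published discriminant functions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def ilr(parts: np.ndarray) -> np.ndarray:
    """Isometric log-ratio transform of closed compositions (rows sum to 1)."""
    D = parts.shape[1]
    logs = np.log(parts)
    out = np.empty((parts.shape[0], D - 1))
    for i in range(1, D):
        gm = logs[:, :i].mean(axis=1)           # log geometric mean of the first i parts
        out[:, i - 1] = np.sqrt(i / (i + 1)) * (gm - logs[:, i])
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic 10-oxide compositions for two rock groups (stand-ins for the database).
    comp = rng.dirichlet(np.ones(10), size=200)
    labels = rng.integers(0, 2, size=200)       # 0 = "Common", 1 = "High-Mg" (synthetic)
    lda = LinearDiscriminantAnalysis().fit(ilr(comp), labels)
    print("training accuracy:", lda.score(ilr(comp), labels))
```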

  3. Feature Selection Has a Large Impact on One-Class Classification Accuracy for MicroRNAs in Plants.

    PubMed

    Yousef, Malik; Saçar Demirci, Müşerref Duygu; Khalifa, Waleed; Allmer, Jens

    2016-01-01

    MicroRNAs (miRNAs) are short RNA sequences involved in posttranscriptional gene regulation. Their experimental analysis is complicated and, therefore, needs to be supplemented with computational miRNA detection. Currently computational miRNA detection is mainly performed using machine learning, and in particular two-class classification. For machine learning, the miRNAs need to be parametrized, and more than 700 features have been described. Positive training examples for machine learning are readily available, but negative data are hard to come by. Therefore, it seems preferable to use one-class classification instead of two-class classification. Previously, we were able to almost reach two-class classification accuracy using one-class classifiers. In this work, we employ feature selection procedures in conjunction with one-class classification and show that there is up to a 36% difference in accuracy among these feature selection methods. The best feature set allowed the training of a one-class classifier which achieved an average accuracy of ~95.6%, thereby outperforming previous two-class-based plant miRNA detection approaches by about 0.5%. We believe that this can be improved upon in the future by rigorous filtering of the positive training examples and by improving current feature clustering algorithms to better target pre-miRNA feature selection.
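    A minimal sketch of one-class classification for pre-miRNA detection trained only on positive examples, assuming scikit-learn. The variance-threshold filter and the One-Class SVM are assumptions standing in for the feature-selection methods and one-class classifiers compared in the paper.

```python
# Hedged sketch: filter-style feature selection followed by a One-Class SVM trained
# only on positive (known pre-miRNA) examples; the feature matrix is synthetic.
import numpy as np
from sklearn.feature_selection import VarianceThreshold
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM
from sklearn.pipeline import make_pipeline

def train_one_class(positive_features: np.ndarray):
    """Train on positives only; negatives are not required."""
    model = make_pipeline(
        VarianceThreshold(threshold=1e-3),      # drop near-constant features
        StandardScaler(),
        OneClassSVM(kernel="rbf", nu=0.1),      # nu bounds the training outlier fraction
    )
    return model.fit(positive_features)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    pos = rng.normal(0.0, 1.0, size=(500, 50))  # 500 positives, 50 of ~700 features (toy)
    candidates = rng.normal(0.5, 1.5, size=(20, 50))
    clf = train_one_class(pos)
    print(clf.predict(candidates))              # +1 = predicted pre-miRNA, -1 = rejected
```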

  4. Automated analysis of food-borne pathogens using a novel microbial cell culture, sensing and classification system.

    PubMed

    Xiang, Kun; Li, Yinglei; Ford, William; Land, Walker; Schaffer, J David; Congdon, Robert; Zhang, Jing; Sadik, Omowunmi

    2016-02-21

    We hereby report the design and implementation of an Autonomous Microbial Cell Culture and Classification (AMC(3)) system for rapid detection of food pathogens. Traditional food testing methods require multistep procedures and long incubation periods, and are thus prone to human error. AMC(3) introduces a "one click" approach to the detection and classification of pathogenic bacteria. Once the cultured materials are prepared, all operations are automatic. AMC(3) is an integrated sensor array platform in a microbial fuel cell system composed of a multi-potentiostat, an automated data collection system (Python program, Yocto Maxi-coupler electromechanical relay module) and a powerful classification program. The classification scheme consists of a Probabilistic Neural Network (PNN), Support Vector Machines (SVM) and a General Regression Neural Network (GRNN) oracle-based system. Differential Pulse Voltammetry (DPV) is performed on standard or unknown samples. Then, using preset feature extractions and quality control, accepted data are analyzed by the intelligent classification system. In a typical use, thirty-two extracted features were analyzed to correctly classify the following pathogens: Escherichia coli ATCC#25922, Escherichia coli ATCC#11775, and Staphylococcus epidermidis ATCC#12228. An accuracy of 85.4% was recorded for unknown samples, within a shorter time period than the industry standard of 24 hours.

  5. The morphology and classification of α ganglion cells in the rat retinae: a fractal analysis study.

    PubMed

    Jelinek, Herbert F; Ristanović, Dušan; Milošević, Nebojša T

    2011-09-30

    Rat retinal ganglion cells have been proposed to consist of a varying number of subtypes. Dendritic morphology is an essential aspect of classification and a necessary step toward understanding structure-function relationships of retinal ganglion cells. This study aimed to use a heuristic classification procedure in combination with box-counting analysis to classify the alpha ganglion cells in the rat retinae based on their dendritic branching pattern and to investigate morphological changes with retinal eccentricity. The cells could be divided into two groups according to the complexity of their dendritic branching pattern: cells with a simple dendritic pattern (box dimension lower than 1.390) and cells with a complex dendritic pattern (box dimension higher than 1.390). Both groups were further divided into two subtypes based on their stratification within the inner plexiform layer. In the present study we have shown that the alpha rat RGCs can be further classified by their dendritic branching complexity; this extends previous reports that fractal analysis can be successfully used in neuronal classification, and in particular that the fractal dimension represents a robust and sensitive tool for the classification of retinal ganglion cells. A hypothesis of the possible functional significance of our classification scheme is also discussed. Copyright © 2011 Elsevier B.V. All rights reserved.
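    A short sketch of the box-counting (fractal) dimension used above to grade dendritic complexity. The binary-image representation, the box sizes, and the use of the abstract's 1.390 threshold as a simple/complex cut-off on synthetic data are illustrative assumptions.

```python
# Hedged sketch of a box-counting dimension estimate for a binarized dendritic arbor.
import numpy as np

def box_counting_dimension(binary_img: np.ndarray) -> float:
    """Estimate the box-counting dimension of a 2-D binary image."""
    n = min(binary_img.shape)
    sizes = [2 ** k for k in range(1, int(np.log2(n)))]
    counts = []
    for s in sizes:
        # Count boxes of side s that contain at least one foreground pixel.
        h = binary_img.shape[0] // s * s
        w = binary_img.shape[1] // s * s
        blocks = binary_img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
    # N(s) ~ s^(-D), so D is the slope of log N versus log(1/s).
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    img = rng.random((256, 256)) < 0.05          # stand-in for a traced dendritic arbor
    d = box_counting_dimension(img)
    print("simple" if d < 1.390 else "complex", round(d, 3))
```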

  6. Mining disease fingerprints from within genetic pathways.

    PubMed

    Nabhan, Ahmed Ragab; Sarkar, Indra Neil

    2012-01-01

    Mining biological networks can be an effective means to uncover system level knowledge out of micro level associations, such as encapsulated in genetic pathways. Analysis of human disease genetic pathways can lead to the identification of major mechanisms that may underlie disorders at an abstract functional level. The focus of this study was to develop an approach for structural pattern analysis and classification of genetic pathways of diseases. A probabilistic model was developed to capture characteristic components ('fingerprints') of functionally annotated pathways. A probability estimation procedure of this model searched for fingerprints in each disease pathway while improving probability estimates of model parameters. The approach was evaluated on data from the Kyoto Encyclopedia of Genes and Genomes (consisting of 56 pathways across seven disease categories). Based on the achieved average classification accuracy of up to ~77%, the findings suggest that these fingerprints may be used for classification and discovery of genetic pathways.

  7. Mining Disease Fingerprints From Within Genetic Pathways

    PubMed Central

    Nabhan, Ahmed Ragab; Sarkar, Indra Neil

    2012-01-01

    Mining biological networks can be an effective means to uncover system level knowledge out of micro level associations, such as encapsulated in genetic pathways. Analysis of human disease genetic pathways can lead to the identification of major mechanisms that may underlie disorders at an abstract functional level. The focus of this study was to develop an approach for structural pattern analysis and classification of genetic pathways of diseases. A probabilistic model was developed to capture characteristic components (‘fingerprints’) of functionally annotated pathways. A probability estimation procedure of this model searched for fingerprints in each disease pathway while improving probability estimates of model parameters. The approach was evaluated on data from the Kyoto Encyclopedia of Genes and Genomes (consisting of 56 pathways across seven disease categories). Based on the achieved average classification accuracy of up to ∼77%, the findings suggest that these fingerprints may be used for classification and discovery of genetic pathways. PMID:23304411

  8. Classification and reporting of severity experienced by animals used in scientific procedures: FELASA/ECLAM/ESLAV Working Group report.

    PubMed

    Smith, David; Anderson, David; Degryse, Anne-Dominique; Bol, Carla; Criado, Ana; Ferrara, Alessia; Franco, Nuno Henrique; Gyertyan, Istvan; Orellana, Jose M; Ostergaard, Grete; Varga, Orsolya; Voipio, Hanna-Marja

    2018-02-01

    Directive 2010/63/EU introduced requirements for the classification of the severity of procedures to be applied during the project authorisation process to use animals in scientific procedures and also to report actual severity experienced by each animal used in such procedures. These requirements offer opportunities during the design, conduct and reporting of procedures to consider the adverse effects of procedures and how these can be reduced to minimize the welfare consequences for the animals. Better recording and reporting of adverse effects should also help in highlighting priorities for refinement of future similar procedures and benchmarking good practice. Reporting of actual severity should help inform the public of the relative severity of different areas of scientific research and, over time, may show trends regarding refinement. Consistency of assignment of severity categories across Member States is a key requirement, particularly if re-use is considered, or the safeguard clause is to be invoked. The examples of severity classification given in Annex VIII are limited in number, and have little descriptive power to aid assignment. Additionally, the examples given often relate to the procedure and do not attempt to assess the outcome, such as adverse effects that may occur. The aim of this report is to deliver guidance on the assignment of severity, both prospectively and at the end of a procedure. A number of animal models, in current use, have been used to illustrate the severity assessment process from inception of the project, through monitoring during the course of the procedure to the final assessment of actual severity at the end of the procedure (Appendix 1).

  9. Classification and reporting of severity experienced by animals used in scientific procedures: FELASA/ECLAM/ESLAV Working Group report

    PubMed Central

    Smith, David; Anderson, David; Degryse, Anne-Dominique; Bol, Carla; Criado, Ana; Ferrara, Alessia; Gyertyan, Istvan; Orellana, Jose M; Ostergaard, Grete; Varga, Orsolya; Voipio, Hanna-Marja

    2018-01-01

    Directive 2010/63/EU introduced requirements for the classification of the severity of procedures to be applied during the project authorisation process to use animals in scientific procedures and also to report actual severity experienced by each animal used in such procedures. These requirements offer opportunities during the design, conduct and reporting of procedures to consider the adverse effects of procedures and how these can be reduced to minimize the welfare consequences for the animals. Better recording and reporting of adverse effects should also help in highlighting priorities for refinement of future similar procedures and benchmarking good practice. Reporting of actual severity should help inform the public of the relative severity of different areas of scientific research and, over time, may show trends regarding refinement. Consistency of assignment of severity categories across Member States is a key requirement, particularly if re-use is considered, or the safeguard clause is to be invoked. The examples of severity classification given in Annex VIII are limited in number, and have little descriptive power to aid assignment. Additionally, the examples given often relate to the procedure and do not attempt to assess the outcome, such as adverse effects that may occur. The aim of this report is to deliver guidance on the assignment of severity, both prospectively and at the end of a procedure. A number of animal models, in current use, have been used to illustrate the severity assessment process from inception of the project, through monitoring during the course of the procedure to the final assessment of actual severity at the end of the procedure (Appendix 1). PMID:29359995

  10. Statistical methods and neural network approaches for classification of data from multiple sources

    NASA Technical Reports Server (NTRS)

    Benediktsson, Jon Atli; Swain, Philip H.

    1990-01-01

    Statistical methods for classification of data from multiple data sources are investigated and compared to neural network models. A general problem with using conventional multivariate statistical approaches to classify data of multiple types is that a multivariate distribution cannot be assumed for the classes in the data sources. Another common problem with statistical classification methods is that the data sources are not equally reliable. This means that the data sources need to be weighted according to their reliability, but most statistical classification methods do not have a mechanism for this. This research focuses on statistical methods which can overcome these problems: a method of statistical multisource analysis and consensus theory. Reliability measures for weighting the data sources in these methods are suggested and investigated. Secondly, this research focuses on neural network models. The neural networks are distribution free, since no prior knowledge of the statistical distribution of the data is needed. This is an obvious advantage over most statistical classification methods. The neural networks also automatically take care of how much weight each data source should have. On the other hand, their training process is iterative and can take a very long time. Methods to speed up the training procedure are introduced and investigated. Experimental results of classification using both neural network models and statistical methods are given, and the approaches are compared based on these results.
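    A minimal sketch of a reliability-weighted consensus combination of per-source classifiers (a linear opinion pool) in the spirit of the consensus-theoretic approach described above. The logistic-regression source classifiers, the reliability weights, and the toy data are assumptions, not the paper's exact method.

```python
# Hedged sketch: per-source classifiers combined by a reliability-weighted opinion pool.
import numpy as np
from sklearn.linear_model import LogisticRegression

def consensus_predict(sources, weights):
    """sources: list of (model, X) pairs; weights: per-source reliabilities summing to 1."""
    pooled = sum(w * m.predict_proba(X) for (m, X), w in zip(sources, weights))
    return pooled.argmax(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    y = rng.integers(0, 3, size=300)
    X1 = rng.normal(size=(300, 4)) + y[:, None]          # e.g. multispectral source
    X2 = rng.normal(size=(300, 2)) + 0.5 * y[:, None]    # e.g. topographic source (noisier)
    m1 = LogisticRegression(max_iter=500).fit(X1, y)
    m2 = LogisticRegression(max_iter=500).fit(X2, y)
    pred = consensus_predict([(m1, X1), (m2, X2)], weights=[0.7, 0.3])
    print("pooled accuracy:", (pred == y).mean())
```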

  11. A new feature extraction method for signal classification applied to cord dorsum potentials detection

    PubMed Central

    Vidaurre, D.; Rodríguez, E. E.; Bielza, C.; Larrañaga, P.; Rudomin, P.

    2012-01-01

    In the spinal cord of the anesthetized cat, spontaneous cord dorsum potentials (CDPs) appear synchronously along the lumbo-sacral segments. These CDPs have different shapes and magnitudes. Previous work has indicated that some CDPs appear to be specially associated with the activation of spinal pathways that lead to primary afferent depolarization and presynaptic inhibition. Visual detection and classification of these CDPs provides relevant information on the functional organization of the neural networks involved in the control of sensory information and allows the characterization of the changes produced by acute nerve and spinal lesions. We now present a novel feature extraction approach for signal classification, applied to CDP detection. The method is based on an intuitive procedure. We first remove by convolution the noise from the CDPs recorded in each given spinal segment. Then, we assign a coefficient for each main local maximum of the signal using its amplitude and distance to the most important maximum of the signal. These coefficients will be the input for the subsequent classification algorithm. In particular, we employ gradient boosting classification trees. This combination of approaches allows a faster and more accurate discrimination of CDPs than is obtained by other methods. PMID:22929924

  12. A new feature extraction method for signal classification applied to cord dorsum potential detection.

    PubMed

    Vidaurre, D; Rodríguez, E E; Bielza, C; Larrañaga, P; Rudomin, P

    2012-10-01

    In the spinal cord of the anesthetized cat, spontaneous cord dorsum potentials (CDPs) appear synchronously along the lumbo-sacral segments. These CDPs have different shapes and magnitudes. Previous work has indicated that some CDPs appear to be specially associated with the activation of spinal pathways that lead to primary afferent depolarization and presynaptic inhibition. Visual detection and classification of these CDPs provides relevant information on the functional organization of the neural networks involved in the control of sensory information and allows the characterization of the changes produced by acute nerve and spinal lesions. We now present a novel feature extraction approach for signal classification, applied to CDP detection. The method is based on an intuitive procedure. We first remove by convolution the noise from the CDPs recorded in each given spinal segment. Then, we assign a coefficient for each main local maximum of the signal using its amplitude and distance to the most important maximum of the signal. These coefficients will be the input for the subsequent classification algorithm. In particular, we employ gradient boosting classification trees. This combination of approaches allows a faster and more accurate discrimination of CDPs than is obtained by other methods.
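    A minimal sketch of the feature-extraction and classification idea described in the two records above: convolution smoothing, main local maxima, amplitude/distance coefficients, and gradient boosting classification trees. The window length, the coefficient formula, and the data are assumptions, not the authors' exact implementation.

```python
# Hedged sketch: smooth each CDP by convolution, take its main local maxima, build
# coefficients from amplitude and distance to the dominant maximum, classify with
# gradient boosting trees.
import numpy as np
from scipy.signal import find_peaks
from sklearn.ensemble import GradientBoostingClassifier

def cdp_features(signal: np.ndarray, n_peaks: int = 5, win: int = 11) -> np.ndarray:
    smooth = np.convolve(signal, np.ones(win) / win, mode="same")   # denoise by convolution
    peaks, props = find_peaks(smooth, height=0)
    if peaks.size == 0:
        return np.zeros(n_peaks)
    order = np.argsort(props["peak_heights"])[::-1][:n_peaks]       # main local maxima
    main = peaks[order[0]]                                          # dominant maximum
    coeffs = [props["peak_heights"][i] / (1.0 + abs(peaks[i] - main)) for i in order]
    return np.pad(np.array(coeffs), (0, n_peaks - len(coeffs)))

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    X = np.array([cdp_features(rng.normal(size=512)) for _ in range(200)])
    y = rng.integers(0, 2, size=200)                                # CDP class labels (toy)
    clf = GradientBoostingClassifier().fit(X, y)
    print("training accuracy:", clf.score(X, y))
```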

  13. Techniques for capturing expert knowledge - An expert systems/hypertext approach

    NASA Technical Reports Server (NTRS)

    Lafferty, Larry; Taylor, Greg; Schumann, Robin; Evans, Randy; Koller, Albert M., Jr.

    1990-01-01

    The knowledge-acquisition strategy developed for the Explosive Hazards Classification (EHC) Expert System is described in which expert systems and hypertext are combined, and broad applications are proposed. The EHC expert system is based on rapid prototyping in which primary knowledge acquisition from experts is not emphasized; the explosive hazards technical bulletin, technical guidance, and minimal interviewing are used to develop the knowledge-based system. Hypertext is used to capture the technical information with respect to four issues including procedural, materials, test, and classification issues. The hypertext display allows the integration of multiple knowledge representations such as clarifications or opinions, and thereby allows the performance of a broad range of tasks on a single machine. Among other recommendations, it is suggested that the integration of hypertext and expert systems makes the resulting synergistic system highly efficient.

  14. Automated classification of Permanent Scatterers time-series based on statistical characterization tests

    NASA Astrophysics Data System (ADS)

    Berti, Matteo; Corsini, Alessandro; Franceschini, Silvia; Iannacone, Jean Pascal

    2013-04-01

    The application of space-borne synthetic aperture radar interferometry has progressed, over the last two decades, from the pioneering use of single interferograms for analyzing changes on the earth's surface to the development of advanced multi-interferogram techniques to analyze any sort of natural phenomenon that involves movement of the ground. The success of multi-interferogram techniques in the analysis of natural hazards such as landslides and subsidence is widely documented in the scientific literature and demonstrated by the consensus among the end-users. Despite the great potential of this technique, radar interpretation of slope movements is generally based on the sole analysis of average displacement velocities, while the information embedded in multi-interferogram time series is often overlooked if not completely neglected. The underuse of PS time series is probably due to the detrimental effect of residual atmospheric errors, which leave the PS time series with erratic, irregular fluctuations that are often difficult to interpret, and also to the difficulty of performing a visual, supervised analysis of the time series for a large dataset. In this work we present a procedure for the automatic classification of PS time series based on a series of statistical characterization tests. The procedure classifies the time series into six distinctive target trends (0=uncorrelated; 1=linear; 2=quadratic; 3=bilinear; 4=discontinuous without constant velocity; 5=discontinuous with change in velocity) and retrieves for each trend a series of descriptive parameters that can be used efficiently to characterize the temporal changes of ground motion. The classification algorithms were developed and tested using an ENVISAT dataset available in the frame of the EPRS-E project (Extraordinary Plan of Environmental Remote Sensing) of the Italian Ministry of Environment (track "Modena", Northern Apennines). This dataset was generated using standard processing, so the time series are typically affected by a significant noise-to-signal ratio. The results of the analysis show that even with such a rough-quality dataset, our automated classification procedure can greatly improve radar interpretation of mass movements. In general, uncorrelated PS (type 0) are concentrated in flat areas such as fluvial terraces and valley bottoms, and along stable watershed divides; linear PS (type 1) are mainly located on slopes (both inside and outside mapped landslides) or near the edge of scarps or steep slopes; non-linear PS (types 2 to 5) typically fall inside landslide deposits or in the surrounding areas. The spatial distribution of classified PS allows the detection of deformation phenomena not visible from the average velocity alone, and provides important information on the temporal evolution of the phenomena, such as acceleration, deceleration, seasonal fluctuations, and abrupt or continuous changes of the displacement rate. Based on these encouraging results, we integrated all the classification algorithms into a Graphical User Interface (called PSTime), which is freely available as a standalone application.
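    A reduced sketch of statistical trend classification for a single displacement time series, assuming NumPy/SciPy: nested least-squares fits compared with F-tests separate uncorrelated, linear, and quadratic trends. The significance threshold and the three-class reduction of the six target trends are assumptions; the full PSTime logic is not reproduced.

```python
# Hedged sketch: classify one displacement time series by nested polynomial fits
# and F-tests (uncorrelated vs. linear vs. quadratic only).
import numpy as np
from scipy import stats

def classify_trend(t: np.ndarray, d: np.ndarray, alpha: float = 0.05) -> str:
    lin = np.polyfit(t, d, 1)
    quad = np.polyfit(t, d, 2)
    rss0 = np.sum((d - np.mean(d)) ** 2)                 # constant model
    rss1 = np.sum((d - np.polyval(lin, t)) ** 2)
    rss2 = np.sum((d - np.polyval(quad, t)) ** 2)
    n = len(t)
    # Does the linear term significantly improve on the constant model?
    f1 = (rss0 - rss1) / (rss1 / (n - 2))
    if stats.f.sf(f1, 1, n - 2) > alpha:
        return "uncorrelated"
    # Does the quadratic term significantly improve on the linear model?
    f2 = (rss1 - rss2) / (rss2 / (n - 3))
    return "quadratic" if stats.f.sf(f2, 1, n - 3) < alpha else "linear"

if __name__ == "__main__":
    t = np.linspace(0, 3, 60)                            # years
    rng = np.random.default_rng(6)
    print(classify_trend(t, -8.0 * t + rng.normal(0, 1, 60)))        # -> linear
    print(classify_trend(t, -2.0 * t ** 2 + rng.normal(0, 1, 60)))   # -> quadratic
```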

  15. Case-based statistical learning applied to SPECT image classification

    NASA Astrophysics Data System (ADS)

    Górriz, Juan M.; Ramírez, Javier; Illán, I. A.; Martínez-Murcia, Francisco J.; Segovia, Fermín.; Salas-Gonzalez, Diego; Ortiz, A.

    2017-03-01

    Statistical learning and decision theory play a key role in many areas of science and engineering. Some examples include time series regression and prediction, optical character recognition, signal detection in communications, and biomedical applications for diagnosis and prognosis. This paper deals with the topic of learning from biomedical image data in the classification problem. In a typical scenario we have a training set that is employed to fit a prediction model or learner and a testing set to which the learner is applied in order to predict the outcome for new, unseen patterns. Both processes are usually completely separated to avoid over-fitting and due to the fact that, in practice, the unseen new objects (testing set) have unknown outcomes. However, the outcome takes one of a discrete set of values, as in the binary diagnosis problem. Thus, assumptions on these outcome values could be established to obtain the most likely prediction model at the training stage, which could improve the overall classification accuracy on the testing set, or at least keep its performance at the level of the selected statistical classifier. In this sense, a novel case-based learning (c-learning) procedure is proposed which combines hypothesis testing from a discrete set of expected outcomes and a cross-validated classification stage.

  16. Entropy-based gene ranking without selection bias for the predictive classification of microarray data.

    PubMed

    Furlanello, Cesare; Serafini, Maria; Merler, Stefano; Jurman, Giuseppe

    2003-11-06

    We describe the E-RFE method for gene ranking, which is useful for the identification of markers in the predictive classification of array data. The method supports a practical modeling scheme designed to avoid the construction of classification rules based on the selection of too small gene subsets (an effect known as the selection bias, in which the estimated predictive errors are too optimistic due to testing on samples already considered in the feature selection process). With E-RFE, we speed up the recursive feature elimination (RFE) with SVM classifiers by eliminating chunks of uninteresting genes using an entropy measure of the SVM weights distribution. An optimal subset of genes is selected according to a two-strata model evaluation procedure: modeling is replicated by an external stratified-partition resampling scheme, and, within each run, an internal K-fold cross-validation is used for E-RFE ranking. Also, the optimal number of genes can be estimated according to the saturation of Zipf's law profiles. Without a decrease of classification accuracy, E-RFE allows a speed-up factor of 100 with respect to standard RFE, while improving on alternative parametric RFE reduction strategies. Thus, a process for gene selection and error estimation is made practical, ensuring control of the selection bias, and providing additional diagnostic indicators of gene importance.
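    A hedged sketch of entropy-accelerated recursive feature elimination in the spirit of E-RFE: a linear SVM is refitted at each step, and the entropy of its weight-magnitude distribution sets the size of the chunk of low-weight genes to discard. The chunk-size rule, the binning, and the synthetic data are assumptions, not the published E-RFE schedule.

```python
# Hedged sketch of entropy-guided chunked RFE with a linear SVM.
import numpy as np
from sklearn.svm import LinearSVC

def entropy_rfe(X: np.ndarray, y: np.ndarray, n_keep: int = 20, bins: int = 10):
    idx = np.arange(X.shape[1])
    while idx.size > n_keep:
        w = np.abs(LinearSVC(C=1.0, max_iter=5000).fit(X[:, idx], y).coef_).ravel()
        p, _ = np.histogram(w, bins=bins)
        p = p[p > 0] / p.sum()
        h = -(p * np.log(p)).sum() / np.log(bins)       # normalized entropy in [0, 1]
        # Low entropy => many uninformative genes => drop a big chunk; never below n_keep.
        chunk = max(1, min(idx.size - n_keep, int((1.0 - h) * idx.size * 0.5)))
        idx = idx[np.argsort(w)[chunk:]]                # keep the higher-weight genes
    return idx

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    X = rng.normal(size=(60, 500))                      # 60 samples, 500 genes (synthetic)
    y = (X[:, :5].sum(axis=1) > 0).astype(int)          # only 5 genes carry signal
    print(entropy_rfe(X, y))
```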

  17. Acquisition and processing of advanced sensor data for ERW and UXO detection and classification

    NASA Astrophysics Data System (ADS)

    Schultz, Gregory M.; Keranen, Joe; Miller, Jonathan S.; Shubitidze, Fridon

    2014-06-01

    The remediation of explosive remnants of war (ERW) and associated unexploded ordnance (UXO) has seen improvements through the injection of modern technological advances and streamlined standard operating procedures. However, reliable and cost-effective detection and geophysical mapping of sites contaminated with UXO such as cluster munitions, abandoned ordnance, and improvised explosive devices rely on the ability to discriminate hazardous items from metallic clutter. In addition to anthropogenic clutter, handheld and vehicle-based metal detector systems are plagued by natural geologic and environmental noise in many post conflict areas. We present new and advanced electromagnetic induction (EMI) technologies including man-portable and towed EMI arrays and associated data processing software. While these systems feature vastly different form factors and transmit-receive configurations, they all exhibit several fundamental traits that enable successful classification of EMI anomalies. Specifically, multidirectional sampling of scattered magnetic fields from targets and corresponding high volume of unique data provide rich information for extracting useful classification features for clutter rejection analysis. The quality of classification features depends largely on the extent to which the data resolve unique physics-based parameters. To date, most of the advanced sensors enable high quality inversion by producing data that are extremely rich in spatial content through multi-angle illumination and multi-point reception.

  18. Surgical manual of the Korean Gynecologic Oncology Group: classification of hysterectomy and lymphadenectomy

    PubMed Central

    Choi, Chel Hun; Chun, Yi Kyeong

    2017-01-01

    The Surgery Treatment Modality Committee of the Korean Gynecologic Oncologic Group (KGOG) has determined to develop a surgical manual to facilitate clinical trials and to improve communication between investigators by standardizing and precisely describing operating procedures. The literature on anatomic terminology, identification of surgical components, and surgical techniques were reviewed and discussed in depth to develop a surgical manual for gynecologic oncology. The surgical procedures provided here represent the minimum requirements for participating in a clinical trial. These procedures should be described in the operation record form, and the pathologic findings obtained from the procedures should be recorded in the pathologic report form. Here, we focused on radical hysterectomy and lymphadenectomy, and we developed a KGOG classification for those conditions. PMID:27670259

  19. Surgical manual of the Korean Gynecologic Oncology Group: classification of hysterectomy and lymphadenectomy.

    PubMed

    Lee, Maria; Choi, Chel Hun; Chun, Yi Kyeong; Kim, Yun Hwan; Lee, Kwang Beom; Lee, Shin Wha; Shim, Seung Hyuk; Song, Yong Jung; Roh, Ju Won; Chang, Suk Joon; Lee, Jong Min

    2017-01-01

    The Surgery Treatment Modality Committee of the Korean Gynecologic Oncologic Group (KGOG) has determined to develop a surgical manual to facilitate clinical trials and to improve communication between investigators by standardizing and precisely describing operating procedures. The literature on anatomic terminology, identification of surgical components, and surgical techniques were reviewed and discussed in depth to develop a surgical manual for gynecologic oncology. The surgical procedures provided here represent the minimum requirements for participating in a clinical trial. These procedures should be described in the operation record form, and the pathologic findings obtained from the procedures should be recorded in the pathologic report form. Here, we focused on radical hysterectomy and lymphadenectomy, and we developed a KGOG classification for those conditions.

  20. 40 CFR 152.161 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.161 Definitions. In addition to... use means any pesticide application that occurs outside enclosed manmade structures or the...

  1. 40 CFR 152.161 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.161 Definitions. In addition to... use means any pesticide application that occurs outside enclosed manmade structures or the...

  2. 40 CFR 152.161 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS PESTICIDE REGISTRATION AND CLASSIFICATION PROCEDURES Classification of Pesticides § 152.161 Definitions. In addition to... use means any pesticide application that occurs outside enclosed manmade structures or the...

  3. Some Dimensions of Auditory Sonar Signal Perception and Their Relationships to Target Classification

    DTIC Science & Technology

    1981-02-13

    a priori how the sample of experimental stimuli related to the classification stereotypes of experienced sonar personnel; Question 6 was addressed by... projections on some of the experimentally identified dimensions are associated with a high degree of classification success, but signals that lack strong... [report-contents fragments: Hypotheses; Procedure; Experimental Stimuli]

  4. Classification and mensuration of LACIE segments

    NASA Technical Reports Server (NTRS)

    Heydorn, R. P.; Bizzell, R. M.; Quirein, J. A.; Abotteen, K. M.; Sumner, C. A. (Principal Investigator)

    1979-01-01

    The theory of classification methods and the functional steps in the manual training process used in the three phases of LACIE are discussed. The major problems that arose in using a procedure for manually training a classifier and a method of machine classification are discussed to reveal the motivation that led to a redesign for the third LACIE phase.

  5. The mapping of marsh vegetation using aircraft multispectral scanner data. [in Louisiana

    NASA Technical Reports Server (NTRS)

    Butera, M. K.

    1975-01-01

    A test was conducted to determine if salinity regimes in coastal marshland could be mapped and monitored by the identification and classification of marsh vegetative species from aircraft multispectral scanner data. The data was acquired at 6.1 km (20,000 ft.) on October 2, 1974, over a test area in the coastal marshland of southern Louisiana including fresh, intermediate, brackish, and saline zones. The data was classified by vegetational species using a supervised, spectral pattern recognition procedure. Accuracies of training sites ranged from 67% to 96%. Marsh zones based on free soil water salinity were determined from the species classification to demonstrate a practical use for mapping marsh vegetation.

  6. Pattern of Cortical Fracture following Corticotomy for Distraction Osteogenesis.

    PubMed

    Luvan, M; Kanthan, S R; Roshan, G; Saw, A

    2015-11-01

    Corticotomy is an essential procedure for deformity correction, and many techniques have been described. However, there is no proper classification of the fracture pattern resulting from corticotomies to enable such studies to be conducted. We performed a retrospective study of corticotomy fracture patterns in 44 patients (34 tibias and 10 femurs) operated on for various indications. We identified four distinct fracture patterns, classified as Types I through IV based on fracture propagation following percutaneous corticotomy: Type I, a transverse fracture; Type II, a transverse fracture with a winglet; Type III, the presence of a butterfly fragment; and Type IV, fracture propagation to a fixation point. No significant correlation was noted between the fracture pattern and the underlying pathology or region of corticotomy.

  7. Maturity assessment of harumanis mango using thermal camera sensor

    NASA Astrophysics Data System (ADS)

    Sa'ad, F. S. A.; Shakaff, A. Y. Md.; Zakaria, A.; Abdullah, A. H.; Ibrahim, M. F.

    2017-03-01

    The perceived quality of fruits such as mangoes is greatly dependent on many parameters, such as ripeness, shape and size, and is influenced by other factors such as harvesting time. Unfortunately, manual fruit grading has several drawbacks, including subjectivity, tediousness and inconsistency. Automating the procedure, together with developing a new classification technique, may solve these problems. This paper presents novel work on using infrared imaging as a tool for quality monitoring of Harumanis mangoes. The histogram of the infrared image was used to distinguish and classify the level of ripeness of the fruits, based on the colour spectrum, by week. The proposed approach using thermal data was able to achieve 90.5% correct classification.

  8. Morphological and wavelet features towards sonographic thyroid nodules evaluation.

    PubMed

    Tsantis, Stavros; Dimitropoulos, Nikos; Cavouras, Dionisis; Nikiforidis, George

    2009-03-01

    This paper presents a computer-based classification scheme that utilizes various morphological and novel wavelet-based features for malignancy risk evaluation of thyroid nodules in ultrasonography. The study comprised ultrasound images from 85 patients that were cytologically confirmed (54 low-risk and 31 high-risk). A set of 20 features (12 based on the nodule boundary shape and 8 based on wavelet local maxima located within each nodule) was generated. Two powerful pattern recognition algorithms (support vector machines and probabilistic neural networks) were designed and developed in order to quantify the discriminative power of the introduced features. A comparative study was also conducted in order to estimate the impact speckle had on the classification procedure. The diagnostic sensitivity and specificity of both classifiers were assessed by means of receiver operating characteristic (ROC) analysis. In the speckle-free feature set, the area under the ROC curve was 0.96 for the support vector machines classifier, whereas for the probabilistic neural networks it was 0.91. In the feature set with speckle, the corresponding areas under the ROC curves were 0.88 and 0.86, respectively, for the two classifiers. The proposed features can increase classification accuracy and decrease the rate of missed and misdiagnosed cases in thyroid cancer control.

  9. 47 CFR 36.380 - Other billing and collecting expense.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... JURISDICTIONAL SEPARATIONS PROCEDURES; STANDARD PROCEDURES FOR SEPARATING TELECOMMUNICATIONS PROPERTY COSTS... Customer Operations Expenses § 36.380 Other billing and collecting expense. (a) This classification...

  10. Sensory classification of table olives using an electronic tongue: Analysis of aqueous pastes and brines.

    PubMed

    Marx, Ítala; Rodrigues, Nuno; Dias, Luís G; Veloso, Ana C A; Pereira, José A; Drunkler, Deisy A; Peres, António M

    2017-01-01

    Table olives are highly appreciated and consumed worldwide. Several aspects are used for trade category classification, the sensory assessment of negative defects present in the olives and brines being one of the most important. The trade category quality classification must follow the International Olive Council directives, requiring the organoleptic assessment of defects by a trained sensory panel. However, the training process is a hard, complex and sometimes subjective task, and the low number of samples that can be evaluated per day is a major drawback considering the real needs of the olive industry. In this context, the development of electronic tongues as taste sensors for defects' sensory evaluation is of utmost relevance. So, an electronic tongue was used for table olive classification according to the presence and intensity of negative defects. Linear discriminant models were established based on sub-sets of sensor signals selected by a simulated annealing algorithm. The predictive potential of the novel approach was first demonstrated for standard solutions of chemical compounds that mimic butyric, putrid and zapateria defects (≥93% for cross-validation procedures). Then its applicability was verified using reference table olive/brine solution samples identified with a single intense negative attribute, namely butyric, musty, putrid, zapateria or winey-vinegary defects (≥93% for cross-validation procedures). Finally, the E-tongue coupled with the same chemometric approach was applied to classify table olive samples according to the trade commercial categories (extra, 1st choice, 2nd choice and unsuitable for consumption) and an additional quality category (extra free of defects), established based on sensory analysis data. Despite the heterogeneity of the samples studied and the number of different sensory defects perceived, the predictive linear discriminant model established showed sensitivities greater than 86%. The overall performance achieved shows that the electrochemical device could be used as a taste sensor for the successful organoleptic trade classification of table olives, allowing a preliminary quality assessment that could facilitate, in the future, the complex task of sensory panelists. Copyright © 2016 Elsevier B.V. All rights reserved.
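    A minimal sketch of subset selection by simulated annealing scored with cross-validated linear discriminant analysis, in the spirit of the E-tongue study above. The cooling schedule, subset size, and synthetic sensor data are assumptions, not the authors' chemometric implementation.

```python
# Hedged sketch: simulated-annealing search over sensor-signal subsets, each candidate
# scored by 4-fold cross-validated LDA accuracy.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def anneal_select(X, y, k=8, steps=300, t0=1.0, seed=0):
    rng = np.random.default_rng(seed)
    subset = rng.choice(X.shape[1], size=k, replace=False)
    score = cross_val_score(LinearDiscriminantAnalysis(), X[:, subset], y, cv=4).mean()
    best_subset, best_score = subset.copy(), score
    for step in range(steps):
        cand = subset.copy()
        new = rng.integers(X.shape[1])
        while new in cand:                               # avoid duplicate signals
            new = rng.integers(X.shape[1])
        cand[rng.integers(k)] = new                      # swap one sensor signal
        cand_score = cross_val_score(LinearDiscriminantAnalysis(), X[:, cand], y, cv=4).mean()
        t = t0 * (1.0 - step / steps)                    # linear cooling
        if cand_score > score or rng.random() < np.exp((cand_score - score) / max(t, 1e-6)):
            subset, score = cand, cand_score
            if score > best_score:
                best_subset, best_score = subset.copy(), score
    return best_subset, best_score

if __name__ == "__main__":
    rng = np.random.default_rng(8)
    y = rng.integers(0, 4, size=120)                     # 4 trade categories (toy)
    X = rng.normal(size=(120, 40)) + 0.8 * y[:, None] * (np.arange(40) < 5)
    subset, cv_acc = anneal_select(X, y)
    print(sorted(subset.tolist()), round(cv_acc, 3))
```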

  11. Three-Dimensional Road Network by Fusion of Polarimetric and Interferometric SAR Data

    NASA Technical Reports Server (NTRS)

    Gamba, P.; Houshmand, B.

    1998-01-01

    In this paper a fuzzy classification procedure is applied to polarimetric radar measurements, and street pixels are detected. These data are successively grouped into consistent roads by means of a dynamic programming approach based on the fuzzy membership function values. Further fusion of the 2D road network extracted and 3D TOPSAR measurements provides a powerful way to analyze urban infrastructures.

  12. Early Detection of Breast Cancer Using Molecular Beacons

    DTIC Science & Technology

    2008-01-01

    a molecular beacon (MB)-based approach for direct examination of gene expression in viable and fixed cells (2, 3). The objective of the proposed study... can be distinguished from normal cells (dark) (Figure 1) (2, 3, 8). Recently, a class of new fluorescent-emitting nanoparticles, semiconductor... morphological classification. This method may offer a simple and fast procedure to detect biomarker gene expression in clinical samples. Our study results

  13. Reliable Early Classification on Multivariate Time Series with Numerical and Categorical Attributes

    DTIC Science & Technology

    2015-05-22

    design a procedure of feature extraction in REACT named MEG (Mining Equivalence classes with shapelet Generators) based on the concept of... Equivalence Classes Mining [12, 15]. MEG can efficiently and effectively generate the discriminative features. In addition, several strategies are proposed... technique of parallel computing [4] to propose a process of parallel MEG for substantially reducing the computational overhead of discovering shapelet

  14. Motor Oil Classification using Color Histograms and Pattern Recognition Techniques.

    PubMed

    Ahmadi, Shiva; Mani-Varnosfaderani, Ahmad; Habibi, Biuck

    2018-04-20

    Motor oil classification is important for quality control and the identification of oil adulteration. In this work, we propose a simple, rapid, inexpensive and nondestructive approach based on image analysis and pattern recognition techniques for the classification of nine different types of motor oils according to their corresponding color histograms. For this, we applied color histograms in different color spaces, such as red green blue (RGB), grayscale, and hue saturation intensity (HSI), in order to extract features that can help with the classification procedure. These color histograms and their combinations were used as input for model development and then were statistically evaluated using linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), and support vector machine (SVM) techniques. Here, two common solutions for solving a multiclass classification problem were applied: (1) transformation to binary classification problems using a one-against-all (OAA) approach and (2) extension from binary classifiers to a single globally optimized multilabel classification model. In the OAA strategy, LDA, QDA, and SVM reached up to 97% in terms of accuracy, sensitivity, and specificity for both the training and test sets. In the extension from the binary case, despite good performance by the SVM classification model, QDA and LDA provided better results, up to 92% for RGB-grayscale-HSI color histograms and up to 93% for the HSI color map, respectively. In order to reduce the number of independent variables for modeling, a principal component analysis algorithm was used. Our results suggest that the proposed method is promising for the identification and classification of different types of motor oils.
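    A minimal sketch of the color-histogram feature extraction and classification idea above, assuming scikit-learn: per-channel RGB plus grayscale histograms as features, evaluated with LDA, QDA, and a one-against-all SVM. The bin counts, the grayscale conversion, and the synthetic images are assumptions.

```python
# Hedged sketch: color-histogram features from toy "oil photos", compared across
# LDA, QDA and a one-against-all SVM with 3-fold cross-validation.
import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                            QuadraticDiscriminantAnalysis)
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def color_histogram(img: np.ndarray, bins: int = 16) -> np.ndarray:
    """Concatenate per-channel RGB histograms plus a grayscale histogram."""
    feats = [np.histogram(img[..., c], bins=bins, range=(0, 256))[0] for c in range(3)]
    gray = img.mean(axis=2)
    feats.append(np.histogram(gray, bins=bins, range=(0, 256))[0])
    return np.concatenate(feats) / img[..., 0].size     # normalise by pixel count

if __name__ == "__main__":
    rng = np.random.default_rng(9)
    # Synthetic stand-ins for oil-sample photos: 9 classes with shifted color means.
    y = np.repeat(np.arange(9), 12)
    imgs = [np.clip(rng.normal(60 + 15 * c, 25, size=(32, 32, 3)), 0, 255) for c in y]
    X = np.array([color_histogram(im) for im in imgs])
    for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                      ("QDA", QuadraticDiscriminantAnalysis(reg_param=0.1)),
                      ("SVM (OAA)", SVC(decision_function_shape="ovr"))]:
        print(name, cross_val_score(clf, X, y, cv=3).mean().round(3))
```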

  15. Proposed Core Competencies and Empirical Validation Procedure in Competency Modeling: Confirmation and Classification

    PubMed Central

    Baczyńska, Anna K.; Rowiński, Tomasz; Cybis, Natalia

    2016-01-01

    Competency models provide insight into key skills which are common to many positions in an organization. Moreover, there is a range of competencies that is used by many companies. Researchers have developed core competency terminology to underline their cross-organizational value. The article presents a theoretical model of core competencies consisting of two main higher-order competencies called performance and entrepreneurship. Each of them consists of three elements: the performance competency includes cooperation, organization of work and goal orientation, while entrepreneurship includes innovativeness, calculated risk-taking and pro-activeness. However, there is a lack of empirical validation of competency concepts in organizations, and this would seem crucial for obtaining reliable results from organizational research. We propose a two-step empirical validation procedure: (1) confirmatory factor analysis, and (2) classification of employees. The sample consisted of 636 respondents (M = 44.5; SD = 15.1). Participants were administered a questionnaire developed for the purpose of the study. The reliability, measured by Cronbach's alpha, ranged from 0.60 to 0.83 for the six scales. Next, we tested the model using a confirmatory factor analysis. The two separate, single models of performance and entrepreneurial orientations fit the data quite well, while a complex model based on the two single concepts needs further research. In the classification of employees based on the two higher-order competencies we obtained four main groups of employees. Their profiles relate to those found in the literature, including so-called niche finders and top performers. Some proposals for organizations are discussed. PMID:27014111

  16. Railroad Classification Yard Technology Manual. Volume I : Yard Design Methods

    DOT National Transportation Integrated Search

    1981-02-01

    This volume documents the procedures and methods associated with the design of railroad classification yards. Subjects include: site location, economic analysis, yard capacity analysis, design of flat yards, overall configuration of hump yards, hump ...

  17. An evaluation of the transferability of cross classification trip generation models.

    DOT National Transportation Integrated Search

    1978-01-01

    This report describes the results of the application in Virginia of the trip generation procedures described in the Federal Highway Administration report entitled Trip Generation Analysis and published in 1975. Cross classification models, disaggrega...

  18. Classification of Instructional Programs: 2000 Edition.

    ERIC Educational Resources Information Center

    Education Statistics Quarterly, 2002

    2002-01-01

    Describes the methods, processes, and procedures used to develop the Classification of Instructional Programs 2000 (CIP:2000), the National Center for Education Statistics taxonomy of instructional programs, and provides information on the CIP's structure, contents, and organization. (SLD)

  19. High-Altitude Electromagnetic Pulse (HEMP) Testing

    DTIC Science & Technology

    2011-11-10

    Security Classification Guide (SCG). b. The HEMP simulation facility shall have a measured map of the peak amplitude waveform of the ... Quadripartite Standardization Agreement; s, sec: second; SCG: security classification guide; SN: serial number; SOP: Standard Operating Procedure

  20. 46 CFR Sec. 18 - Group classification.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION A-NATIONAL SHIPPING AUTHORITY PROCEDURE FOR ACCOMPLISHMENT OF VESSEL REPAIRS UNDER NATIONAL SHIPPING AUTHORITY MASTER LUMP SUM REPAIR CONTRACT-NSA-LUMPSUMREP... inserted thereon: Number Classification 41 Maintenance Repairs (deck, engine and stewards department...

  1. 46 CFR Sec. 18 - Group classification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION A-NATIONAL SHIPPING AUTHORITY PROCEDURE FOR ACCOMPLISHMENT OF VESSEL REPAIRS UNDER NATIONAL SHIPPING AUTHORITY MASTER LUMP SUM REPAIR CONTRACT-NSA-LUMPSUMREP... inserted thereon: Number Classification 41 Maintenance Repairs (deck, engine and stewards department...

  2. 46 CFR Sec. 18 - Group classification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION A-NATIONAL SHIPPING AUTHORITY PROCEDURE FOR ACCOMPLISHMENT OF VESSEL REPAIRS UNDER NATIONAL SHIPPING AUTHORITY MASTER LUMP SUM REPAIR CONTRACT-NSA-LUMPSUMREP... inserted thereon: Number Classification 41 Maintenance Repairs (deck, engine and stewards department...

  3. 46 CFR Sec. 18 - Group classification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION A-NATIONAL SHIPPING AUTHORITY PROCEDURE FOR ACCOMPLISHMENT OF VESSEL REPAIRS UNDER NATIONAL SHIPPING AUTHORITY MASTER LUMP SUM REPAIR CONTRACT-NSA-LUMPSUMREP... inserted thereon: Number Classification 41 Maintenance Repairs (deck, engine and stewards department...

  4. 46 CFR Sec. 18 - Group classification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION A-NATIONAL SHIPPING AUTHORITY PROCEDURE FOR ACCOMPLISHMENT OF VESSEL REPAIRS UNDER NATIONAL SHIPPING AUTHORITY MASTER LUMP SUM REPAIR CONTRACT-NSA-LUMPSUMREP... inserted thereon: Number Classification 41 Maintenance Repairs (deck, engine and stewards department...

  5. The Interrater and Intrarater Agreement of a Modified Neer Classification System and Associated Treatment Choice for Lateral Clavicle Fractures.

    PubMed

    Cho, Chul-Hyun; Oh, Joo Han; Jung, Gu-Hee; Moon, Gi-Hyuk; Rhyou, In Hyeok; Yoon, Jong Pil; Lee, Ho Min

    2015-10-01

    As there is substantial variation in the classification and diagnosis of lateral clavicle fractures, proper management can be challenging. Although the Neer classification system modified by Craig has been widely used, no study has assessed its validity through inter- and intrarater agreement. To determine the inter- and intrarater agreement of the modified Neer classification system and associated treatment choice for lateral clavicle fractures and to assess whether 3-dimensional computed tomography (3D CT) improves the level of agreement. Cohort study (diagnosis); Level of evidence, 3. Nine experienced shoulder specialists and 9 orthopaedic fellows evaluated 52 patients with lateral clavicle fractures, completing fracture typing according to the modified Neer classification system and selecting a treatment choice for each case. Web-based assessment was performed using plain radiographs only, followed by the addition of 3D CT images 2 weeks later. This procedure was repeated 4 weeks later. Fleiss κ values were calculated to estimate the inter- and intrarater agreement. Based on plain radiographs only, the inter- and intrarater agreement of the modified Neer classification system were regarded as fair (κ = 0.344) and moderate (κ = 0.496), respectively; the inter- and intrarater agreement of treatment choice were both regarded as moderate (κ = 0.465 and 0.555, respectively). Based on the plain radiographs and 3D CT images, the inter- and intrarater agreement of the classification system were regarded as fair (κ = 0.317) and moderate (κ = 0.508), respectively; the inter- and intrarater agreement of treatment choice were regarded as moderate (κ = 0.463) and substantial (κ = 0.623), respectively. There were no significant differences in the level of agreement between the plain radiographs only and plain radiographs plus 3D CT images for any κ values (all P > .05). The level of interrater agreement of the modified Neer classification system for lateral clavicle fractures was fair. Additional 3D CT did not improve the overall level of interrater or intrarater agreement of the modified Neer classification system or associated treatment choice. To eliminate a common source of disagreement among surgeons, a new classification system to focus on unclassifiable fracture types is needed. © 2015 The Author(s).

  6. Fluorescent marker-based and marker-free discrimination between healthy and cancerous human tissues using hyper-spectral imaging

    NASA Astrophysics Data System (ADS)

    Arnold, Thomas; De Biasio, Martin; Leitner, Raimund

    2015-06-01

    Two problems are addressed in this paper: (i) the fluorescent marker-based and (ii) the marker-free discrimination between healthy and cancerous human tissues. For both applications the performance of hyper-spectral methods is quantified. Fluorescent marker-based tissue classification uses a number of fluorescent markers to dye specific parts of a human cell. The challenge is that the emission spectra of the fluorescent dyes overlap considerably. They are, furthermore, disturbed by the inherent auto-fluorescence of human tissue. This results in ambiguities and decreased image contrast, causing difficulties for the treatment decision. The higher spectral resolution introduced by tunable-filter-based spectral imaging in combination with spectral unmixing techniques results in an improvement of the image contrast and therefore more reliable information for the physician to make the treatment decision. Marker-free tissue classification is based solely on the subtle spectral features of human tissue without the use of artificial markers. The challenge in this case is that the spectral differences between healthy and cancerous tissues are subtle and embedded in intra- and inter-patient variations of these features. The contributions of this paper are (i) the evaluation of hyper-spectral imaging in combination with spectral unmixing techniques for fluorescence marker-based tissue classification, and (ii) the evaluation of spectral imaging for marker-free intra-surgery tissue classification. Within this paper, we consider real hyper-spectral fluorescence and endoscopy data sets to emphasize the practical capability of the proposed methods. It is shown that the combination of spectral imaging with multivariate statistical methods can improve the sensitivity and specificity of the detection and the staging of cancerous tissues compared to standard procedures.
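
    The spectral unmixing step mentioned above can be illustrated with a minimal non-negative least-squares sketch, assuming known endmember spectra for two dyes and tissue auto-fluorescence; the Gaussian endmembers and band grid are synthetic placeholders, not measured spectra.

        # Hedged sketch of linear spectral unmixing for fluorescence data: each pixel
        # spectrum is modelled as a non-negative mix of known endmember spectra
        # (dye emissions plus tissue auto-fluorescence). Endmembers here are synthetic.
        import numpy as np
        from scipy.optimize import nnls

        n_bands = 40
        wavelengths = np.linspace(500, 700, n_bands)

        def gaussian(mu, sigma):
            return np.exp(-0.5 * ((wavelengths - mu) / sigma) ** 2)

        # columns: dye 1, dye 2, auto-fluorescence (assumed reference spectra)
        E = np.column_stack([gaussian(540, 12), gaussian(600, 15), gaussian(560, 60)])

        pixel = 0.7 * E[:, 0] + 0.2 * E[:, 2] + 0.01 * np.random.default_rng(2).normal(size=n_bands)
        abundances, _ = nnls(E, pixel)
        print("estimated abundances (dye1, dye2, autofluorescence):", abundances.round(2))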

  7. Early differential processing of material images: Evidence from ERP classification.

    PubMed

    Wiebel, Christiane B; Valsecchi, Matteo; Gegenfurtner, Karl R

    2014-06-24

    Investigating the temporal dynamics of natural image processing using event-related potentials (ERPs) has a long tradition in object recognition research. In a classical Go-NoGo task two characteristic effects have been emphasized: an early task independent category effect and a later task-dependent target effect. Here, we set out to use this well-established Go-NoGo paradigm to study the time course of material categorization. Material perception has gained more and more interest over the years as its importance in natural viewing conditions has been ignored for a long time. In addition to analyzing standard ERPs, we conducted a single trial ERP pattern analysis. To validate this procedure, we also measured ERPs in two object categories (people and animals). Our linear classification procedure was able to largely capture the overall pattern of results from the canonical analysis of the ERPs and even extend it. We replicate the known target effect (differential Go-NoGo potential at frontal sites) for the material images. Furthermore, we observe task-independent differential activity between the two material categories as early as 140 ms after stimulus onset. Using our linear classification approach, we show that material categories can be differentiated consistently based on the ERP pattern in single trials around 100 ms after stimulus onset, independent of the target-related status. This strengthens the idea of early differential visual processing of material categories independent of the task, probably due to differences in low-level image properties and suggests pattern classification of ERP topographies as a strong instrument for investigating electrophysiological brain activity. © 2014 ARVO.
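
    A minimal sketch of single-trial linear classification of ERP patterns is shown below, assuming a regularized LDA on the channel values at one latency; the synthetic trials, channel count and shrinkage settings are illustrative assumptions, not the authors' exact pipeline.

        # Hedged sketch of single-trial linear classification of ERP topographies:
        # an LDA classifier with cross-validation on the channel pattern at one
        # post-stimulus latency. Data are synthetic stand-ins for the recorded trials.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)
        n_trials, n_channels = 400, 64
        y = rng.integers(0, 2, size=n_trials)               # two material categories
        X = rng.normal(size=(n_trials, n_channels))
        X[y == 1] += 0.3                                    # small class-dependent shift

        clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
        acc = cross_val_score(clf, X, y, cv=10)
        print(f"decoding accuracy at this latency: {acc.mean():.2f}")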

  8. Predictive value of hippocampal MR imaging-based high-dimensional mapping in mesial temporal epilepsy: preliminary findings.

    PubMed

    Hogan, R E; Wang, L; Bertrand, M E; Willmore, L J; Bucholz, R D; Nassif, A S; Csernansky, J G

    2006-01-01

    We objectively assessed surface structural changes of the hippocampus in mesial temporal sclerosis (MTS) and assessed the ability of large-deformation high-dimensional mapping (HDM-LD) to demonstrate hippocampal surface symmetry and predict group classification of MTS in right and left MTS groups compared with control subjects. Using eigenvector field analysis of HDM-LD segmentations of the hippocampus, we compared the symmetry of changes in the right and left MTS groups with a group of 15 matched controls. To assess the ability of HDM-LD to predict group classification, eigenvectors were selected by a logistic regression procedure when comparing the MTS group with control subjects. Multivariate analysis of variance on the coefficients from the first 9 eigenvectors accounted for 75% of the total variance between groups. The first 3 eigenvectors showed the largest differences between the control group and each of the MTS groups, but with eigenvector 2 showing the greatest difference in the MTS groups. Reconstruction of the hippocampal deformation vector fields due solely to eigenvector 2 shows symmetrical patterns in the right and left MTS groups. A "leave-one-out" (jackknife) procedure correctly predicted group classification in 14 of 15 (93.3%) left MTS subjects and all 15 right MTS subjects. Analysis of principal dimensions of hippocampal shape change suggests that MTS, after accounting for normal right-left asymmetries, affects the right and left hippocampal surface structure very symmetrically. Preliminary analysis using HDM-LD shows it can predict group classification of MTS and control hippocampi in this well-defined population of patients with MTS and mesial temporal lobe epilepsy (MTLE).
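
    The leave-one-out prediction step can be sketched as follows, assuming a logistic regression on a handful of eigenvector coefficients with one subject held out per fold; the simulated coefficients and group sizes only mimic the 15-versus-15 design and are not the study's data.

        # Hedged sketch of the "leave-one-out" group prediction: logistic regression on
        # the first few eigenvector coefficients, one subject held out per fold.
        # Coefficients are simulated; the real study used HDM-LD shape eigenvectors.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_score

        rng = np.random.default_rng(4)
        X = rng.normal(size=(30, 9))            # 15 MTS + 15 controls, 9 eigenvector coefficients
        y = np.repeat([0, 1], 15)
        X[y == 1, 1] += 1.5                     # eigenvector 2 carries most of the group difference

        acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=LeaveOneOut())
        print("leave-one-out accuracy:", acc.mean())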

  9. A novel approach for honey pollen profile assessment using an electronic tongue and chemometric tools.

    PubMed

    Dias, Luís G; Veloso, Ana C A; Sousa, Mara E B C; Estevinho, Letícia; Machado, Adélio A S C; Peres, António M

    2015-11-05

    Nowadays the main honey producing countries require accurate labeling of honey before commercialization, including floral classification. Traditionally, this classification is made by melissopalynology analysis, an accurate but time-consuming task requiring laborious sample pre-treatment and high-skilled technicians. In this work the potential use of a potentiometric electronic tongue for pollinic assessment is evaluated, using monofloral and polyfloral honeys. The results showed that after splitting honeys according to color (white, amber and dark), the novel methodology enabled quantifying the relative percentage of the main pollens (Castanea sp., Echium sp., Erica sp., Eucaliptus sp., Lavandula sp., Prunus sp., Rubus sp. and Trifolium sp.). Multiple linear regression models were established for each type of pollen, based on the best sensors' sub-sets selected using the simulated annealing algorithm. To minimize the overfitting risk, a repeated K-fold cross-validation procedure was implemented, ensuring that at least 10-20% of the honeys were used for internal validation. With this approach, a minimum average determination coefficient of 0.91 ± 0.15 was obtained. Also, the proposed technique enabled the correct classification of 92% and 100% of monofloral and polyfloral honeys, respectively. The quite satisfactory performance of the novel procedure for quantifying the relative pollen frequency may envisage its applicability for honey labeling and geographical origin identification. Nevertheless, this approach is not a full alternative to the traditional melissopalynologic analysis; it may be seen as a practical complementary tool for preliminary honey floral classification, leaving only problematic cases for pollinic evaluation. Copyright © 2015 Elsevier B.V. All rights reserved.
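
    A minimal sketch of the validation idea, a multiple linear regression per pollen type scored with repeated K-fold cross-validation, is given below; the sensor matrix, pollen percentages and fold settings are synthetic assumptions chosen only to show the hold-out proportion.

        # Hedged sketch: a multiple linear regression per pollen type, scored with
        # repeated K-fold cross-validation so 10-20% of honeys are always held out.
        # Sensor signals and pollen percentages are synthetic.
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import RepeatedKFold, cross_val_score

        rng = np.random.default_rng(5)
        X = rng.normal(size=(60, 8))                                   # selected e-tongue sensor signals
        y = X @ rng.normal(size=8) + rng.normal(scale=0.3, size=60)    # stand-in for relative pollen frequency

        cv = RepeatedKFold(n_splits=5, n_repeats=10, random_state=5)   # ~20% held out per fold
        r2 = cross_val_score(LinearRegression(), X, y, cv=cv, scoring="r2")
        print(f"mean cross-validated R^2: {r2.mean():.2f} +/- {r2.std():.2f}")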

  10. Diabetic foot surgery: classifying patients to predict complications.

    PubMed

    Bevilacqua, Nicholas J; Rogers, Lee C; Armstrong, David G

    2008-01-01

    The purpose of this article is to describe a classification of diabetic foot surgery performed in the absence of critical limb ischaemia. The basis of this classification is centred on three fundamental variables that are present in the assessment of risk and indication: (1) presence or absence of neuropathy (the loss of protective sensation); (2) presence or absence of an open wound; (3) presence or absence of acute limb-threatening infection. The conceptual framework for this classification is to define distinct classes of surgery in an order of theoretically increasing risk for high-level amputation. These include: Class I: elective diabetic foot surgery (procedures performed to treat a painful deformity in a patient without the loss of protective sensation); Class II: prophylactic (procedure performed to reduce the risk of ulceration or reulceration in a person with the loss of protective sensation but without an open wound); Class III: curative (procedure performed to assist in healing an open wound); and Class IV: emergency (procedure performed to limit the progression of acute infection). The presence of critical ischaemia in any of these classes of surgery should prompt a vascular evaluation to consider (1) the urgency of the procedure being considered and (2) possible revascularization prior to or temporally concomitant with the procedure. It is our hope that this system begins a dialogue amongst physicians and surgeons which can ultimately facilitate communication, enhance perspective, and improve care.

  11. AgRISTARS: Foreign commodity production forecasting. The 1980 US corn and soybeans exploratory experiment

    NASA Technical Reports Server (NTRS)

    Malin, J. T.; Carnes, J. G. (Principal Investigator)

    1981-01-01

    The U.S. corn and soybeans exploratory experiment is described which consisted of evaluations of two technology components of a production forecasting system: classification procedures (crop labeling and proportion estimation at the level of a sampling unit) and sampling and aggregation procedures. The results from the labeling evaluations indicate that the corn and soybeans labeling procedure works very well in the U.S. corn belt with full season (after tasseling) LANDSAT data. The procedure should be readily adaptable to corn and soybeans labeling required for subsequent exploratory experiments or pilot tests. The machine classification procedures evaluated in this experiment were not effective in improving the proportion estimates. The corn proportions produced by the machine procedures had a large bias when the bias correction was not performed. This bias was caused by the manner in which the machine procedures handled spectrally impure pixels. The simulation test indicated that the weighted aggregation procedure performed quite well. Although further work can be done to improve both the simulation tests and the aggregation procedure, the results of this test show that the procedure should serve as a useful baseline procedure in future exploratory experiments and pilot tests.

  12. Global Similarity Predicts Dissociation of Classification and Recognition: Evidence Questioning the Implicit-Explicit Learning Distinction in Amnesia

    ERIC Educational Resources Information Center

    Jamieson, Randall K.; Holmes, Signy; Mewhort, D. J. K.

    2010-01-01

    Dissociation of classification and recognition in amnesia is widely taken to imply 2 functional systems: an implicit procedural-learning system that is spared in amnesia and an explicit episodic-learning system that is compromised. We argue that both tasks reflect the global similarity of probes to memory. In classification, subjects sort…

  13. 40 CFR 260.33 - Procedures for variances from classification as a solid waste, for variances to be classified as...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... classification as a solid waste, for variances to be classified as a boiler, or for non-waste determinations. 260.33 Section 260.33 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES... from classification as a solid waste, for variances to be classified as a boiler, or for non-waste...

  14. 40 CFR 260.33 - Procedures for variances from classification as a solid waste, for variances to be classified as...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... classification as a solid waste, for variances to be classified as a boiler, or for non-waste determinations. 260.33 Section 260.33 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES... from classification as a solid waste, for variances to be classified as a boiler, or for non-waste...

  15. Proposal of a New Adverse Event Classification by the Society of Interventional Radiology Standards of Practice Committee.

    PubMed

    Khalilzadeh, Omid; Baerlocher, Mark O; Shyn, Paul B; Connolly, Bairbre L; Devane, A Michael; Morris, Christopher S; Cohen, Alan M; Midia, Mehran; Thornton, Raymond H; Gross, Kathleen; Caplin, Drew M; Aeron, Gunjan; Misra, Sanjay; Patel, Nilesh H; Walker, T Gregory; Martinez-Salazar, Gloria; Silberzweig, James E; Nikolic, Boris

    2017-10-01

    To develop a new adverse event (AE) classification for the interventional radiology (IR) procedures and evaluate its clinical, research, and educational value compared with the existing Society of Interventional Radiology (SIR) classification via an SIR member survey. A new AE classification was developed by members of the Standards of Practice Committee of the SIR. Subsequently, a survey was created by a group of 18 members from the SIR Standards of Practice Committee and Service Lines. Twelve clinical AE case scenarios were generated that encompassed a broad spectrum of IR procedures and potential AEs. Survey questions were designed to evaluate the following domains: educational and research values, accountability for intraprocedural challenges, consistency of AE reporting, unambiguity, and potential for incorporation into existing quality-assurance framework. For each AE scenario, the survey participants were instructed to answer questions about the proposed and existing SIR classifications. SIR members were invited via online survey links, and 68 members participated among 140 surveyed. Answers on new and existing classifications were evaluated and compared statistically. Overall comparison between the two surveys was performed by generalized linear modeling. The proposed AE classification received superior evaluations in terms of consistency of reporting (P < .05) and potential for incorporation into existing quality-assurance framework (P < .05). Respondents gave a higher overall rating to the educational and research value of the new compared with the existing classification (P < .05). This study proposed an AE classification system that outperformed the existing SIR classification in the studied domains. Copyright © 2017 SIR. Published by Elsevier Inc. All rights reserved.

  16. 5 CFR 1312.1 - Purpose and authority.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., DOWNGRADING, DECLASSIFICATION AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION Classification and Declassification of National Security Information § 1312.1 Purpose and authority. This subpart sets forth the procedures for the classification and declassification of national security information in the possession of...

  17. Notification: Follow-up Review of EPA’s Classification of National Security Information

    EPA Pesticide Factsheets

    Project #OPE-FY15-0057, July 20, 2015. The EPA OIG plans to begin preliminary research on the OARM actions taken to improve policies and procedures related to the classification of national security information.

  18. 5 CFR 1312.36 - Appeal procedure.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Administrative Personnel OFFICE OF MANAGEMENT AND BUDGET OMB DIRECTIVES CLASSIFICATION, DOWNGRADING... is declassified and otherwise releasable. If continued classification is required, the requestor... normally be made within 60 working days following receipt. If additional time is needed, the requestor will...

  19. 5 CFR 1312.36 - Appeal procedure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Administrative Personnel OFFICE OF MANAGEMENT AND BUDGET OMB DIRECTIVES CLASSIFICATION, DOWNGRADING... is declassified and otherwise releasable. If continued classification is required, the requestor... normally be made within 60 working days following receipt. If additional time is needed, the requestor will...

  20. Towards automation of palynology 1: analysis of pollen shape and ornamentation using simple geometric measures, derived from scanning electron microscope images

    NASA Astrophysics Data System (ADS)

    Treloar, W. J.; Taylor, G. E.; Flenley, J. R.

    2004-12-01

    This is the first of a series of papers on the theme of automated pollen analysis. The automation of pollen analysis could result in numerous advantages for the reconstruction of past environments, with larger data sets made practical, objectivity and fine resolution sampling. There are also applications in apiculture and medicine. Previous work on the classification of pollen using texture measures has been successful with small numbers of pollen taxa. However, as the number of pollen taxa to be identified increases, more features may be required to achieve a successful classification. This paper describes the use of simple geometric measures to augment the texture measures. The feasibility of this new approach is tested using scanning electron microscope (SEM) images of 12 taxa of fresh pollen taken from reference material collected on Henderson Island, Polynesia. Pollen images were captured directly from a SEM connected to a PC. A threshold grey-level was set and binary images were then generated. Pollen edges were then located and the boundaries were traced using a chain coding system. A number of simple geometric variables were calculated directly from the chain code of the pollen and a variable selection procedure was used to choose the optimal subset to be used for classification. The efficiency of these variables was tested using a leave-one-out classification procedure. The system successfully split the original 12 taxa sample into five sub-samples containing no more than six pollen taxa each. The further subdivision of echinate pollen types was then attempted with a subset of four pollen taxa. A set of difference codes was constructed for a range of displacements along the chain code. From these difference codes probability variables were calculated. A variable selection procedure was again used to choose the optimal subset of probabilities that may be used for classification. The efficiency of these variables was again tested using a leave-one-out classification procedure. The proportion of correctly classified pollen ranged from 81% to 100% depending on the subset of variables used. The best set of variables had an overall classification rate averaging at about 95%. This is comparable with the classification rates from the earlier texture analysis work for other types of pollen. Copyright
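
    A hedged sketch of the two feature families described above, simple geometric measures and difference-code probabilities computed directly from an 8-connected Freeman chain code, is shown below; the short chain code is a made-up boundary, not a pollen outline from the paper.

        # Hedged sketch of two ideas from the paper: a simple geometric measure and
        # difference-code probabilities computed directly from an 8-connected chain code.
        # The chain code here is a small made-up boundary, not a real pollen outline.
        import numpy as np

        chain = np.array([0, 0, 1, 2, 2, 3, 4, 4, 5, 6, 6, 7])   # 8-direction Freeman code

        # perimeter: even codes are axis steps (length 1), odd codes are diagonals (sqrt 2)
        step_len = np.where(chain % 2 == 0, 1.0, np.sqrt(2.0))
        perimeter = step_len.sum()

        # difference code for displacement 1: turning between consecutive boundary steps
        diff = (chain[1:] - chain[:-1]) % 8
        probs = np.bincount(diff, minlength=8) / diff.size       # probability variables

        print("perimeter estimate:", round(perimeter, 2))
        print("difference-code probabilities:", probs.round(2))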

  1. Morbidity Assessment in Surgery: Refinement Proposal Based on a Concept of Perioperative Adverse Events

    PubMed Central

    Kazaryan, Airazat M.; Røsok, Bård I.; Edwin, Bjørn

    2013-01-01

    Background. Morbidity is a cornerstone assessing surgical treatment; nevertheless surgeons have not reached extensive consensus on this problem. Methods and Findings. Clavien, Dindo, and Strasberg with coauthors (1992, 2004, 2009, and 2010) made significant efforts to the standardization of surgical morbidity (Clavien-Dindo-Strasberg classification, last revision, the Accordion classification). However, this classification includes only postoperative complications and has two principal shortcomings: disregard of intraoperative events and confusing terminology. Postoperative events have a major impact on patient well-being. However, intraoperative events should also be recorded and reported even if they do not evidently affect the patient's postoperative well-being. The term surgical complication applied in the Clavien-Dindo-Strasberg classification may be regarded as an incident resulting in a complication caused by technical failure of surgery, in contrast to the so-called medical complications. Therefore, the term surgical complication contributes to misinterpretation of perioperative morbidity. The term perioperative adverse events comprising both intraoperative unfavourable incidents and postoperative complications could be regarded as better alternative. In 2005, Satava suggested a simple grading to evaluate intraoperative surgical errors. Based on that approach, we have elaborated a 3-grade classification of intraoperative incidents so that it can be used to grade intraoperative events of any type of surgery. Refinements have been made to the Accordion classification of postoperative complications. Interpretation. The proposed systematization of perioperative adverse events utilizing the combined application of two appraisal tools, that is, the elaborated classification of intraoperative incidents on the basis of the Satava approach to surgical error evaluation together with the modified Accordion classification of postoperative complication, appears to be an effective tool for comprehensive assessment of surgical outcomes. This concept was validated in regard to various surgical procedures. Broad implementation of this approach will promote the development of surgical science and practice. PMID:23762627

  2. 28 CFR 527.31 - Procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Procedures. 527.31 Section 527.31 Judicial Administration BUREAU OF PRISONS, DEPARTMENT OF JUSTICE INMATE ADMISSION, CLASSIFICATION, AND TRANSFER TRANSFERS Transfer of Inmates to State Agents for Production on State Writs § 527.31 Procedures...

  3. Tutor-led teaching of procedural skills in the skills lab: Complexity, relevance and teaching competence from the medical teacher, tutor and student perspective.

    PubMed

    Lauter, Jan; Branchereau, Sylvie; Herzog, Wolfgang; Bugaj, Till Johannes; Nikendei, Christoph

    2017-05-01

    In current medical curricula, the transfer of procedural skills has received increasing attention. Skills lab learning and tutor-led teaching have become an inherent part of all medical curricula at German medical faculties. In 2011, the initial basis for the classification of clinical skills in medical school was created by the German Association for Medical Education (GMA) Committee's consensus statement on procedural skills. As a recommendation for medical curricula, the National Competency-based Catalogue of Learning Objectives (NKLM, 2015) lists procedural skills according to their curriculum integration and competency level. However, classification in regard to the perceived complexity, relevance, or teaching competency is still lacking. The present study aimed to investigate procedural skills taught at the Medical Faculty of Heidelberg in regard to their complexity, relevance, and required teaching skills. To achieve this aim (1) the specific procedural skills in terms of complexity, that is, the degree of difficulty, and (2) the perceived relevance of taught procedural skills for studying and subsequent medical profession as well as (3) the personal preparation and required teaching skills were assessed in medical teachers, tutors and students. During the winter semester 2014/2015, the evaluations of all medical teachers, student tutors, and medical students in the skills lab teaching departments of internal medicine, surgery, pediatrics, gynecology, and otorhinolaryngology at the Medical Faculty of Heidelberg were assessed via a quantitative cross-sectional questionnaire survey using 7-point Likert scales. The questionnaire comprised four item sets concerning 1) demographic details, 2) procedural skill complexity, 3) practical relevance, and 4) required preparation and teaching skills. Descriptive, quantitative analysis was used for questionnaire data. The survey included the data from 17 of 20 physicians (return rate: 85 %), 10 of 10 student tutors (return rate: 100 %) and a total of 406 of 691 students (return rate: 58.8 %). In terms of complexity and relevance, no major differences between medical teachers, tutors, and students were found. Procedural skills, assigned to the competence level of final year medical education in the NKLM, were also perceived as more complex than other skills. All skills were considered equally relevant, and student tutors were seen to have equally competent teaching skills as experienced medical teachers. This study largely underpins the NKLM's classification of procedural skills. The complexity assessment allows for conclusions to be drawn as to which skills are perceived to require particularly intensive training. Finally, our study corroborates extant findings that student tutors are apt at teaching procedural skills if they have been properly trained. Copyright © 2017. Published by Elsevier GmbH.

  4. Is overall similarity classification less effortful than single-dimension classification?

    PubMed

    Wills, Andy J; Milton, Fraser; Longmore, Christopher A; Hester, Sarah; Robinson, Jo

    2013-01-01

    It is sometimes argued that the implementation of an overall similarity classification is less effortful than the implementation of a single-dimension classification. In the current article, we argue that the evidence securely in support of this view is limited, and report additional evidence in support of the opposite proposition--overall similarity classification is more effortful than single-dimension classification. Using a match-to-standards procedure, Experiments 1A, 1B and 2 demonstrate that concurrent load reduces the prevalence of overall similarity classification, and that this effect is robust to changes in the concurrent load task employed, the level of time pressure experienced, and the short-term memory requirements of the classification task. Experiment 3 demonstrates that participants who produced overall similarity classifications from the outset have larger working memory capacities than those who produced single-dimension classifications initially, and Experiment 4 demonstrates that instructions to respond meticulously increase the prevalence of overall similarity classification.

  5. Combined texture feature analysis of segmentation and classification of benign and malignant tumour CT slices.

    PubMed

    Padma, A; Sukanesh, R

    2013-01-01

    A computer software system is designed for the segmentation and classification of benign from malignant tumour slices in brain computed tomography (CT) images. This paper presents a method to find and select both the dominant run length and co-occurrence texture features of the region of interest (ROI) of the tumour region of each slice to be segmented by Fuzzy c-means clustering (FCM) and to evaluate the performance of support vector machine (SVM)-based classifiers in classifying benign and malignant tumour slices. Two hundred and six tumour-confirmed CT slices are considered in this study. A total of 17 texture features are extracted by a feature extraction procedure, and six features are selected using Principal Component Analysis (PCA). This study constructed the SVM-based classifier with the selected features and compared the segmentation results with the experienced radiologist-labelled ground truth (target). Quantitative analysis between ground truth and segmented tumour is presented in terms of segmentation accuracy, segmentation error and overlap similarity measures such as the Jaccard index. The classification performance of the SVM-based classifier with the same selected features is also evaluated using a 10-fold cross-validation method. The proposed system shows that some newly found texture features have an important contribution in classifying benign and malignant tumour slices efficiently and accurately with less computational time. The experimental results showed that the proposed system is able to achieve the highest segmentation and classification effectiveness as measured by the Jaccard index, sensitivity and specificity.
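
    The classification stage can be sketched as follows, assuming standardized texture features reduced to six principal components and fed to an RBF SVM scored by 10-fold cross-validation; the feature values and labels are synthetic placeholders for the 206 CT slices.

        # Hedged sketch of the classification stage: PCA keeps six components of the
        # 17 texture features, then an SVM is scored with 10-fold cross-validation.
        # Feature values and labels are synthetic placeholders for the CT slices.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(6)
        X = rng.normal(size=(206, 17))          # run-length + co-occurrence features per slice
        y = rng.integers(0, 2, size=206)        # 0 = benign, 1 = malignant (synthetic)

        clf = make_pipeline(StandardScaler(), PCA(n_components=6), SVC(kernel="rbf"))
        acc = cross_val_score(clf, X, y, cv=10)
        print(f"10-fold accuracy: {acc.mean():.2f}")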

  6. Machine-based classification of ADHD and nonADHD participants using time/frequency features of event-related neuroelectric activity.

    PubMed

    Öztoprak, Hüseyin; Toycan, Mehmet; Alp, Yaşar Kemal; Arıkan, Orhan; Doğutepe, Elvin; Karakaş, Sirel

    2017-12-01

    Attention-deficit/hyperactivity disorder (ADHD) is the most frequent diagnosis among children who are referred to psychiatry departments. Although ADHD was discovered at the beginning of the 20th century, its diagnosis is still confronted with many problems. A novel classification approach that discriminates ADHD and nonADHD groups over the time-frequency domain features of event-related potential (ERP) recordings that are taken during Stroop task is presented. Time-Frequency Hermite-Atomizer (TFHA) technique is used for the extraction of high resolution time-frequency domain features that are highly localized in time-frequency domain. Based on an extensive investigation, Support Vector Machine-Recursive Feature Elimination (SVM-RFE) was used to obtain the best discriminating features. When the best three features were used, the classification accuracy for the training dataset reached 98%, and the use of five features further improved the accuracy to 99.5%. The accuracy was 100% for the testing dataset. Based on extensive experiments, the delta band emerged as the most contributing frequency band and statistical parameters emerged as the most contributing feature group. The classification performance of this study suggests that TFHA can be employed as an auxiliary component of the diagnostic and prognostic procedures for ADHD. The features obtained in this study can potentially contribute to the neuroelectrical understanding and clinical diagnosis of ADHD. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
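
    A minimal SVM-RFE sketch in the spirit of the feature-selection step is given below, assuming a linear SVM ranking simulated time-frequency features; the feature matrix, labels and the number of retained features are illustrative assumptions.

        # Hedged sketch of SVM-RFE feature selection on time-frequency features: a
        # linear SVM ranks features and recursive elimination keeps the best few.
        # The ERP-derived feature matrix is simulated here.
        import numpy as np
        from sklearn.feature_selection import RFE
        from sklearn.svm import SVC

        rng = np.random.default_rng(7)
        X = rng.normal(size=(120, 50))                 # stand-in for TFHA time-frequency features
        y = rng.integers(0, 2, size=120)               # ADHD vs nonADHD (synthetic)
        X[:, :3] += y[:, None] * 1.0                   # make three features informative

        selector = RFE(SVC(kernel="linear"), n_features_to_select=5, step=1).fit(X, y)
        print("selected feature indices:", np.flatnonzero(selector.support_))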

  7. Selection of effective cocrystals former for dissolution rate improvement of active pharmaceutical ingredients based on lipoaffinity index.

    PubMed

    Cysewski, Piotr; Przybyłek, Maciej

    2017-09-30

    A new theoretical screening procedure was proposed for the appropriate selection of potential cocrystal formers possessing the ability to enhance the dissolution rates of drugs. The procedure relies on a training set comprising 102 positive and 17 negative cases of cocrystals found in the literature. Despite the fact that the only available data were of qualitative character, the statistical analysis performed using binary classification allowed quantitative criteria to be formulated. Among the 3679 molecular descriptors considered, the relative value of the lipoaffinity index, expressed as the difference between the values calculated for the active compound and the excipient, was found to be the most appropriate measure for discriminating positive and negative cases. Assuming 5% precision, the applied classification criterion led to the inclusion of 70% of the positive cases in the final prediction. Since the lipoaffinity index is a molecular descriptor computed using only 2D information about a chemical structure, its estimation is straightforward and computationally inexpensive. The inclusion of an additional criterion quantifying the cocrystallization probability leads to the following conjunction criteria, H_mix < -0.18 and ΔLA > 3.61, allowing for the identification of dissolution rate enhancers. The screening procedure was applied to find the most promising coformers of such drugs as Iloperidone, Ritonavir, Carbamazepine and Enthenzamide. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Differential prioritization between relevance and redundancy in correlation-based feature selection techniques for multiclass gene expression data.

    PubMed

    Ooi, Chia Huey; Chetty, Madhu; Teng, Shyh Wei

    2006-06-23

    Due to the large number of genes in a typical microarray dataset, feature selection looks set to play an important role in reducing noise and computational cost in gene expression-based tissue classification while improving accuracy at the same time. Surprisingly, this does not appear to be the case for all multiclass microarray datasets. The reason is that many feature selection techniques applied on microarray datasets are either rank-based and hence do not take into account correlations between genes, or are wrapper-based, which require high computational cost, and often yield difficult-to-reproduce results. In studies where correlations between genes are considered, attempts to establish the merit of the proposed techniques are hampered by evaluation procedures which are less than meticulous, resulting in overly optimistic estimates of accuracy. We present two realistically evaluated correlation-based feature selection techniques which incorporate, in addition to the two existing criteria involved in forming a predictor set (relevance and redundancy), a third criterion called the degree of differential prioritization (DDP). DDP functions as a parameter to strike the balance between relevance and redundancy, providing our techniques with the novel ability to differentially prioritize the optimization of relevance against redundancy (and vice versa). This ability proves useful in producing optimal classification accuracy while using reasonably small predictor set sizes for nine well-known multiclass microarray datasets. For multiclass microarray datasets, especially the GCM and NCI60 datasets, DDP enables our filter-based techniques to produce accuracies better than those reported in previous studies which employed similarly realistic evaluation procedures.
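
    The relevance-redundancy trade-off can be illustrated with the hedged sketch below, which scores a candidate predictor set by a power-weighted combination of mean relevance and anti-redundancy with an exponent standing in for the DDP parameter; the functional form and the numbers are assumptions, not the paper's exact formulation.

        # Hedged sketch of the relevance-redundancy trade-off: a predictor set is scored
        # by a power-weighted combination of mean relevance and mean anti-redundancy,
        # with the exponent alpha standing in for the degree of differential
        # prioritization (DDP). The exact functional form in the paper may differ.
        import numpy as np

        def set_score(relevance, redundancy, alpha):
            """relevance: per-gene relevance scores; redundancy: mean pairwise correlation."""
            antiredundancy = 1.0 - redundancy
            return relevance.mean() ** alpha * antiredundancy ** (1.0 - alpha)

        relevance = np.array([0.8, 0.7, 0.6])
        for alpha in (0.3, 0.5, 0.7):      # small alpha favours low redundancy, large favours relevance
            print(alpha, round(set_score(relevance, redundancy=0.4, alpha=alpha), 3))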

  9. Applying aerial digital photography as a spectral remote sensing technique for macrophytic cover assessment in small rural streams

    NASA Astrophysics Data System (ADS)

    Anker, Y.; Hershkovitz, Y.; Gasith, A.; Ben-Dor, E.

    2011-12-01

    Although remote sensing of fluvial ecosystems is well developed, the tradeoff between spectral and spatial resolutions prevents its application in small streams (<3 m width). In the current study, a remote sensing approach for the monitoring and research of small stream ecosystems was developed. The method is based on differentiation between two indicative vegetation species out of the ecosystem flora. When studied, the channel was covered mostly by a filamentous green alga (Cladophora glomerata) and watercress (Nasturtium officinale), so these species were chosen as indicative; nonetheless, common reed (Phragmites australis) was also classified in order to exclude it from the stream ROI. The procedure included: A. For both section and habitat scale classifications, acquisition of aerial digital RGB datasets. B. For section scale classification, hyperspectral (HSR) dataset acquisition. C. For calibration, HSR reflectance measurements of specific ground targets, in close proximity to each dataset acquisition swath. D. For habitat scale classification, manual, in-stream flora grid transect classification. The digital RGB datasets were converted to reflectance units by spectral calibration against colored reference plates. These red, green, blue, white, and black EVA foam reference plates were measured by an ASD field spectrometer and each was given a spectral value. Each spectral value was later applied to the spectral calibration and radiometric correction of the spectral RGB (SRGB) cube. Spectral calibration of the HSR dataset was done using the empirical line method, based on reference values of progressive grey scale targets. Differentiation between the vegetation species was done by supervised classification both for the HSR and for the SRGB datasets. This procedure was done using the Spectral Angle Mapper function with the spectral pattern of each vegetation species as a spectral end member. For comparison, the HSR dataset was first degraded to 17 bands with the same spectral range as the RGB dataset and also to a dataset with 3 equivalent bands. Comparison between the two remote sensing techniques and between the SRGB classification and the in-situ transects indicates that: A. Stream vegetation classification resolution is about 4 cm by the SRGB method compared to about 1 m by HSR. Moreover, this resolution is also higher than that of the manual grid transect classification. B. The SRGB method is by far the most cost-efficient. The combination of spectral information (rather than the cognitive color) and the high spatial resolution of aerial photography provides noise filtration and better sub-water detection capabilities than the HSR technique. C. Only the SRGB method applies at both habitat and section scales; hence, its application, together with in-situ grid transects for validation, may be optimal for use in similar scenarios.
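
    Since the entry above names the Spectral Angle Mapper explicitly, a minimal sketch of that step is given below; the endmember reflectance vectors and the pixel are toy values, not the field spectra measured with the ASD spectrometer.

        # Hedged sketch of the Spectral Angle Mapper step used for both HSR and SRGB
        # cubes: each pixel is assigned to the vegetation endmember with the smallest
        # spectral angle. Endmember spectra here are illustrative, not field spectra.
        import numpy as np

        def spectral_angle(pixel, endmember):
            cos = np.dot(pixel, endmember) / (np.linalg.norm(pixel) * np.linalg.norm(endmember))
            return np.arccos(np.clip(cos, -1.0, 1.0))

        endmembers = {                      # toy reflectance vectors (e.g., R, G, B bands)
            "Cladophora": np.array([0.10, 0.45, 0.15]),
            "Nasturtium": np.array([0.12, 0.35, 0.20]),
            "Phragmites": np.array([0.25, 0.40, 0.30]),
        }

        pixel = np.array([0.11, 0.43, 0.16])
        angles = {name: spectral_angle(pixel, e) for name, e in endmembers.items()}
        print("classified as:", min(angles, key=angles.get))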

  10. Use of an automatic procedure for determination of classes of land use in the Teste Araras area of the peripheral Paulist depression

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Lombardo, M. A.; Valeriano, D. D.

    1981-01-01

    An evaluation of the multispectral image analyzer (system Image 1-100), using automatic classification, is presented. The region studied is the Araras test area of the peripheral Paulista depression. The automatic classification was carried out using the maximum likelihood (MAXVER) classification system. The following classes were established: urban area, bare soil, sugar cane, citrus culture (oranges), pastures, and reforestation. The classification matrix of the test sites indicates that the percentage of correct classification varied between 63% and 100%.
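
    A hedged sketch of a maximum-likelihood classifier of the MAXVER type is shown below, assuming each land-use class is modelled by a multivariate Gaussian fitted on training pixels; the class statistics and band values are synthetic, not data from the analyzer described above.

        # Hedged sketch of a maximum-likelihood (MAXVER-style) classifier: each class is
        # modelled as a multivariate Gaussian fitted on training pixels, and a pixel is
        # assigned to the class with the highest likelihood. Data are synthetic.
        import numpy as np
        from scipy.stats import multivariate_normal

        rng = np.random.default_rng(8)
        classes = {
            "sugar cane": rng.normal([40, 80, 30, 120], 5, size=(100, 4)),
            "bare soil":  rng.normal([90, 70, 60, 50], 5, size=(100, 4)),
            "pasture":    rng.normal([60, 75, 45, 90], 5, size=(100, 4)),
        }
        models = {name: multivariate_normal(X.mean(axis=0), np.cov(X.T)) for name, X in classes.items()}

        pixel = np.array([42, 79, 31, 118])
        print("assigned class:", max(models, key=lambda name: models[name].logpdf(pixel)))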

  11. [Nursing service certification. Norm UNE-EN-ISO 9001-2008].

    PubMed

    Salazar de la Guerra, R; Ferrer Arnedo, C; Labrador Domínguez, M J; Sangregorio Matesanz, A

    2014-01-01

    To certify the nursing services using a quality management system, taking an international standard as a reference, and based on a continuous improvement process. The standard was revised, and the Quality Management System documentation was updated, consisting of a Quality Manual and 7 control procedures. All the existing procedures were coded in accordance with the documentation control process. Each operational procedure was associated with a set of indicators which permitted to know the results obtained, analyze the deviations and to implement further improvements. The system was implemented successfully. Twenty-eight care procedures and eleven procedures concerning techniques were incorporated into the management system. Thirty indicators were established that allowed the whole process to be monitored. All patients were assigned to a nurse in their clinical notes and all of them had a personalized Care Plan according to planning methodology using North American Nursing Diagnosis Association (NANDA), Nursing Interventions Classification (NIC) and Nursing Outcomes Classification (NOC) international rankings. The incidence of falls, as well as the incidence of chronic skin wounds, was low, taking into account the characteristics of the patient and the duration of the stay (mean=35.87 days). The safety indicators had a high level of compliance, with 90% of patients clearly identified and 100% with hygiene protocol. The confidence rating given to the nurses was 91%. The certification enabled the quality of the service to be improved using a structured process, analyzing the results, dealing with non-conformities and introducing improvements. Copyright © 2014 SECA. Published by Elsevier Espana. All rights reserved.

  12. The Analysis of Object-Based Change Detection in Mining Area: a Case Study with Pingshuo Coal Mine

    NASA Astrophysics Data System (ADS)

    Zhang, M.; Zhou, W.; Li, Y.

    2017-09-01

    Accurate information on mining land use and land cover change is crucial for monitoring and environmental change studies. In this paper, RapidEye remote sensing imagery (Map 2012) and SPOT7 remote sensing imagery (Map 2015) of the Pingshuo Mining Area are selected to monitor changes, combining object-based classification with the change vector analysis method; we also used R for mining land classification in high-resolution remote sensing imagery and found open source software to be feasible and flexible for this task. The results show that (1) the classification of reclaimed mining land has higher precision: the overall accuracy and kappa coefficient of the classification of the change region map were 86.67 % and 89.44 %. Object-based classification and change vector analysis, which are of great significance for improving monitoring accuracy, can evidently be used to monitor mining land, especially reclaimed mining land; (2) the vegetation area changed from 46 % to 40 % of the total area from 2012 to 2015, and most of it was transformed into arable land. The sum of arable land and vegetation area increased from 51 % to 70 %; meanwhile, built-up land increased to a certain degree and part of the water area was transformed into arable land, although the extent of these two changes is not pronounced. The results illustrate the transformation of the reclaimed mining area; at the same time, some land is still being converted to mining land, which shows that the mine is still operating and that mining land use and land cover change is a dynamic process.
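
    A minimal sketch of the change vector analysis step is given below, assuming two co-registered multiband images and a simple mean-plus-two-standard-deviations threshold on the change magnitude; the images and the threshold rule are illustrative assumptions.

        # Hedged sketch of change vector analysis on two co-registered dates: the change
        # magnitude per pixel is the Euclidean norm of the band-wise difference, and a
        # threshold separates "change" from "no change". Images are random stand-ins.
        import numpy as np

        rng = np.random.default_rng(9)
        bands, h, w = 4, 100, 100
        img_2012 = rng.random((bands, h, w))
        img_2015 = img_2012 + rng.normal(scale=0.05, size=(bands, h, w))
        img_2015[:, 40:60, 40:60] += 0.4                  # simulate a reclaimed patch

        magnitude = np.linalg.norm(img_2015 - img_2012, axis=0)
        threshold = magnitude.mean() + 2 * magnitude.std()
        change_mask = magnitude > threshold
        print("changed pixels:", int(change_mask.sum()), "of", h * w)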

  13. An innovative recycling process to obtain pure polyethylene and polypropylene from household waste.

    PubMed

    Serranti, Silvia; Luciani, Valentina; Bonifazi, Giuseppe; Hu, Bin; Rem, Peter C

    2015-01-01

    An innovative recycling process, based on magnetic density separation (MDS) and hyperspectral imaging (HSI), to obtain high quality polypropylene and polyethylene as secondary raw materials, is presented. In more detail, MDS was applied to two different polyolefin mixtures coming from household waste. The quality of the two separated PP and PE streams, in terms of purity, was evaluated by a classification procedure based on HSI working in the near infrared range (1000-1700 nm). The classification model was built using known PE and PP samples as the training set. The results obtained by HSI were compared with those obtained by classical density analysis carried out in the laboratory on the same polymers. The results obtained by MDS and the quality assessment of the plastic products by HSI showed that the combined action of these two technologies is a valid solution that can be implemented at industrial level. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Fuzzy ontologies for semantic interpretation of remotely sensed images

    NASA Astrophysics Data System (ADS)

    Djerriri, Khelifa; Malki, Mimoun

    2015-10-01

    Object-based image classification consists in the assignment of objects that share similar attributes to object categories. To perform such a task the remote sensing expert uses their personal knowledge, which is rarely formalized. Ontologies have been proposed as a solution to represent domain knowledge agreed by domain experts in a formal and machine readable language. Classical ontology languages are not appropriate to deal with imprecision or vagueness in knowledge. Fortunately, Description Logics for the semantic web have been enhanced by various approaches to handle such knowledge. This paper presents the extension of traditional ontology-based interpretation with a fuzzy ontology of the main land-cover classes (vegetation, built-up areas, water bodies, shadow, clouds, forests) for objects in Landsat8-OLI scenes. A good classification of image objects was obtained and the results highlight the potential of the method to be replicated over time and space in the perspective of transferability of the procedure.

  15. On Biometrics With Eye Movements.

    PubMed

    Zhang, Youming; Juhola, Martti

    2017-09-01

    Eye movements are a relatively novel data source for biometric identification. When video cameras applied to eye tracking become smaller and more efficient, this data source could offer interesting opportunities for the development of eye movement biometrics. In this paper, we study primarily biometric identification, seen as a multiclass classification task, and secondarily biometric verification, considered as binary classification. Our research is based on the saccadic eye movement signal measurements from 109 young subjects. In order to test the data measured, we use a procedure of biometric identification according to the one-versus-one (subject) principle. In a development from our previous research, which also involved biometric verification based on saccadic eye movements, we now apply another eye movement tracker device with a higher sampling frequency of 250 Hz. The results obtained are good, with correct identification rates of 80-90% at their best.

  16. Evaluation of spatial filtering on the accuracy of wheat area estimate

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Moreira, M. A.; Chen, S. C.; Delima, A. M.

    1982-01-01

    A 3 x 3 pixel spatial filter for postclassification was used for wheat classification to evaluate the effects of this procedure on the accuracy of area estimation using LANDSAT digital data obtained from a single pass. Quantitative analyses were carried out in five test sites (approx 40 sq km each) and t tests showed that filtering with threshold values significantly decreased errors of commission and omission. In area estimation filtering improved the overestimate of 4.5% to 2.7% and the root-mean-square error decreased from 126.18 ha to 107.02 ha. Extrapolating the same procedure of automatic classification using spatial filtering for postclassification to the whole study area, the accuracy in area estimate was improved from the overestimate of 10.9% to 9.7%. It is concluded that when single pass LANDSAT data is used for crop identification and area estimation the postclassification procedure using a spatial filter provides a more accurate area estimate by reducing classification errors.
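
    The post-classification filtering idea can be illustrated with the simplified sketch below, which applies a plain 3 x 3 majority filter to a labelled map; the synthetic label map and the omission of the threshold values used in the study are assumptions for demonstration only.

        # Hedged sketch of the 3 x 3 post-classification filter: each pixel in the
        # classified map is replaced by the majority label in its neighbourhood, which
        # removes isolated misclassified pixels. The label map here is synthetic.
        import numpy as np
        from scipy.ndimage import generic_filter

        rng = np.random.default_rng(10)
        labels = np.ones((50, 50), dtype=int)                 # 1 = wheat, 0 = non-wheat
        labels[rng.random((50, 50)) < 0.05] = 0               # salt-and-pepper errors

        def majority(window):
            return np.bincount(window.astype(int)).argmax()

        filtered = generic_filter(labels, majority, size=3)
        print("isolated pixels removed:", int((labels != filtered).sum()))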

  17. Prediction of Protein-Protein Interaction Sites with Machine-Learning-Based Data-Cleaning and Post-Filtering Procedures.

    PubMed

    Liu, Guang-Hui; Shen, Hong-Bin; Yu, Dong-Jun

    2016-04-01

    Accurately predicting protein-protein interaction sites (PPIs) is currently a hot topic because it has been demonstrated to be very useful for understanding disease mechanisms and designing drugs. Machine-learning-based computational approaches have been broadly utilized and demonstrated to be useful for PPI prediction. However, directly applying traditional machine learning algorithms, which often assume that samples in different classes are balanced, often leads to poor performance because of the severe class imbalance that exists in the PPI prediction problem. In this study, we propose a novel method for improving PPI prediction performance by relieving the severity of class imbalance using a data-cleaning procedure and reducing predicted false positives with a post-filtering procedure: First, a machine-learning-based data-cleaning procedure is applied to remove those marginal targets, which may potentially have a negative effect on training a model with a clear classification boundary, from the majority samples to relieve the severity of class imbalance in the original training dataset; then, a prediction model is trained on the cleaned dataset; finally, an effective post-filtering procedure is further used to reduce potential false positive predictions. Stringent cross-validation and independent validation tests on benchmark datasets demonstrated the efficacy of the proposed method, which exhibits highly competitive performance compared with existing state-of-the-art sequence-based PPIs predictors and should supplement existing PPI prediction methods.
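
    A hedged, much-simplified sketch of the two ideas, data cleaning of marginal majority-class samples and post-filtering of isolated positive predictions, is given below; the classifier choice, probability threshold and neighbour rule are assumptions, not the authors' implementation.

        # Hedged, simplified sketch of the two ideas: (1) data cleaning that drops
        # marginal majority-class (non-interface) samples before training, and (2) a
        # post-filter that removes isolated positive predictions along the sequence.
        # Thresholds and the window rule are illustrative assumptions.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(11)
        X = rng.normal(size=(2000, 20))
        y = (rng.random(2000) < 0.15).astype(int)            # severe class imbalance
        X[y == 1, :3] += 1.0

        # (1) data cleaning: remove majority samples the preliminary model finds ambiguous
        prelim = RandomForestClassifier(n_estimators=50, random_state=11).fit(X, y)
        p = prelim.predict_proba(X)[:, 1]
        keep = (y == 1) | (p < 0.3)                          # drop marginal negatives
        model = RandomForestClassifier(n_estimators=200, random_state=11).fit(X[keep], y[keep])

        # (2) post-filter: a predicted interface residue must have a predicted neighbour
        pred = model.predict(X[:200])                        # treat these 200 samples as one sequence
        neighbours = np.convolve(pred, np.ones(3, dtype=int), mode="same")
        filtered = np.where((pred == 1) & (neighbours >= 2), 1, 0)
        print("positives before/after post-filter:", int(pred.sum()), int(filtered.sum()))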

  18. Sources of error in estimating truck traffic from automatic vehicle classification data

    DOT National Transportation Integrated Search

    1998-10-01

    Truck annual average daily traffic estimation errors resulting from sample classification counts are computed in this paper under two scenarios. One scenario investigates an improper factoring procedure that may be used by highway agencies. The study...

  19. Using machine learning classifiers to assist healthcare-related decisions: classification of electronic patient records.

    PubMed

    Pollettini, Juliana T; Panico, Sylvia R G; Daneluzzi, Julio C; Tinós, Renato; Baranauskas, José A; Macedo, Alessandra A

    2012-12-01

    Surveillance Levels (SLs) are categories for medical patients (used in Brazil) that represent different types of medical recommendations. SLs are defined according to risk factors and the medical and developmental history of patients. Each SL is associated with specific educational and clinical measures. The objective of the present paper was to verify computer-aided, automatic assignment of SLs. The present paper proposes a computer-aided approach for automatic recommendation of SLs. The approach is based on the classification of information from patient electronic records. For this purpose, a software architecture composed of three layers was developed. The architecture is formed by a classification layer that includes a linguistic module and machine learning classification modules. The classification layer allows for the use of different classification methods, including the use of preprocessed, normalized language data drawn from the linguistic module. We report the verification and validation of the software architecture in a Brazilian pediatric healthcare institution. The results indicate that selection of attributes can have a great effect on the performance of the system. Nonetheless, our automatic recommendation of surveillance level can still benefit from improvements in processing procedures when the linguistic module is applied prior to classification. Results from our efforts can be applied to different types of medical systems. The results of systems supported by the framework presented in this paper may be used by healthcare and governmental institutions to improve healthcare services in terms of establishing preventive measures and alerting authorities about the possibility of an epidemic.

  20. Development of Subscale Fast Cookoff Test (PREPRINT)

    DTIC Science & Technology

    2006-09-21

    The hazards classification procedures have been harmonized with both the UN Test and Criteria Manual for UN Series 1...aimed at the development of a sub-scale alternate test protocol to the external fire test currently required for final hazards classification (HC) of an ordnance system. The specific goal of this part of the task was

  1. Noncoding sequence classification based on wavelet transform analysis: part II

    NASA Astrophysics Data System (ADS)

    Paredes, O.; Strojnik, M.; Romo-Vázquez, R.; Vélez-Pérez, H.; Ranta, R.; Garcia-Torales, G.; Scholl, M. K.; Morales, J. A.

    2017-09-01

    DNA sequences in the human genome can be divided into coding and noncoding sequences. We hypothesize that the characteristic periodicities of noncoding sequences are related to their function. We describe a procedure to identify these characteristic periodicities using wavelet analysis. Our results show that three groups of noncoding sequences, each with a different biological function, may be differentiated by their wavelet coefficients within a specific frequency range.
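
    As a rough illustration of this kind of analysis (not the authors' pipeline), a nucleotide sequence can be mapped to a numeric indicator signal and examined with a continuous wavelet transform; the AT/GC encoding, wavelet choice, and scales below are assumptions.

    # Sketch: map a DNA string to a numeric indicator signal and inspect periodicities
    # with a continuous wavelet transform (PyWavelets).
    import numpy as np
    import pywt

    sequence = "ATGCGCGCGATATATATGCGCGCTTAGGCCATATATAT"   # toy noncoding fragment
    indicator = np.array([1.0 if base in "AT" else 0.0 for base in sequence])  # simple AT/GC encoding

    scales = np.arange(1, 32)
    coeffs, freqs = pywt.cwt(indicator, scales, "morl")   # Morlet wavelet, one row per scale

    # Average wavelet power per scale; peaks hint at characteristic periodicities.
    power = (np.abs(coeffs) ** 2).mean(axis=1)
    best = int(np.argmax(power))
    print(f"strongest response at scale {scales[best]}, period of about {1.0 / freqs[best]:.1f} bases")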

  2. Implicit structured sequence learning: an fMRI study of the structural mere-exposure effect

    PubMed Central

    Folia, Vasiliki; Petersson, Karl Magnus

    2014-01-01

    In this event-related fMRI study, we investigated the effect of 5 days of implicit acquisition on preference classification by means of an artificial grammar learning (AGL) paradigm based on the structural mere-exposure effect and preference classification using a simple right-linear unification grammar. This allowed us to investigate implicit AGL in a proper learning design by including baseline measurements prior to grammar exposure. After 5 days of implicit acquisition, the fMRI results showed activations in a network of brain regions including the inferior frontal (centered on BA 44/45) and the medial prefrontal regions (centered on BA 8/32). Importantly, and central to this study, the inclusion of a naive preference fMRI baseline measurement allowed us to conclude that these fMRI findings were the intrinsic outcomes of the learning process itself and not a reflection of a preexisting functionality recruited during classification, independent of acquisition. Support for the implicit nature of the knowledge utilized during preference classification on day 5 comes from the fact that the basal ganglia, associated with implicit procedural learning, were activated during classification, while the medial temporal lobe system, associated with explicit declarative memory, was consistently deactivated. Thus, preference classification in combination with structural mere-exposure can be used to investigate structural sequence processing (syntax) in unsupervised AGL paradigms with proper learning designs. PMID:24550865

  3. Implicit structured sequence learning: an fMRI study of the structural mere-exposure effect.

    PubMed

    Folia, Vasiliki; Petersson, Karl Magnus

    2014-01-01

    In this event-related fMRI study, we investigated the effect of 5 days of implicit acquisition on preference classification by means of an artificial grammar learning (AGL) paradigm based on the structural mere-exposure effect and preference classification using a simple right-linear unification grammar. This allowed us to investigate implicit AGL in a proper learning design by including baseline measurements prior to grammar exposure. After 5 days of implicit acquisition, the fMRI results showed activations in a network of brain regions including the inferior frontal (centered on BA 44/45) and the medial prefrontal regions (centered on BA 8/32). Importantly, and central to this study, the inclusion of a naive preference fMRI baseline measurement allowed us to conclude that these fMRI findings were the intrinsic outcomes of the learning process itself and not a reflection of a preexisting functionality recruited during classification, independent of acquisition. Support for the implicit nature of the knowledge utilized during preference classification on day 5 comes from the fact that the basal ganglia, associated with implicit procedural learning, were activated during classification, while the medial temporal lobe system, associated with explicit declarative memory, was consistently deactivated. Thus, preference classification in combination with structural mere-exposure can be used to investigate structural sequence processing (syntax) in unsupervised AGL paradigms with proper learning designs.

  4. Change classification in SAR time series: a functional approach

    NASA Astrophysics Data System (ADS)

    Boldt, Markus; Thiele, Antje; Schulz, Karsten; Hinz, Stefan

    2017-10-01

    Change detection is a broad field of research in SAR remote sensing comprising many different approaches. Beyond simply recognizing change areas, analyzing the type, category, or class of those areas is at least as important for a comprehensive result. Conventional strategies for change classification are based on supervised or unsupervised land use / land cover classifications. The main drawback of such approaches is that the quality of the classification result depends directly on the selection of training and reference data. In addition, supervised methods require an experienced operator who can capably select the training samples. This training step is unnecessary in unsupervised strategies, but meaningful reference data must still be available to identify the resulting classes, so an experienced operator remains indispensable. In this study, an innovative concept for the classification of changes in SAR time series data is proposed. In contrast to the traditional strategies above, it requires no training data, and it can be applied by an operator who does not yet have detailed knowledge of the scene; this knowledge is provided by the algorithm. The final step of the procedure, whose main aspect is the iterative optimization of an initial class scheme with respect to the categorized change objects, assigns these objects to the final classes. This assignment step is the subject of this paper.
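
    The paper's iterative class-scheme optimization is not spelled out in the abstract; as a loose stand-in, the sketch below clusters change objects described by simple features and then merges near-duplicate clusters, which is one generic way such a scheme can be refined without training data. The feature set, cluster count, and merge threshold are assumptions.

    # Generic sketch: cluster change objects by simple features, then merge
    # clusters whose centroids are nearly identical (a crude "class scheme" refinement).
    import numpy as np
    from sklearn.cluster import KMeans
    from scipy.spatial.distance import pdist, squareform

    rng = np.random.default_rng(0)
    # Hypothetical per-object features: area, mean backscatter change, duration of change.
    objects = rng.normal(size=(200, 3))

    kmeans = KMeans(n_clusters=8, n_init=10, random_state=0).fit(objects)
    centroids = kmeans.cluster_centers_

    # Merge clusters whose centroids are closer than a threshold.
    dist = squareform(pdist(centroids))
    merge_with = np.arange(len(centroids))
    for i in range(len(centroids)):
        for j in range(i + 1, len(centroids)):
            if dist[i, j] < 0.8:                 # illustrative merge threshold
                merge_with[j] = merge_with[i]
    labels = merge_with[kmeans.labels_]
    print("initial classes:", len(centroids), "-> merged classes:", len(np.unique(labels)))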

  5. Measures of Agreement Between Many Raters for Ordinal Classifications

    PubMed Central

    Nelson, Kerrie P.; Edwards, Don

    2015-01-01

    Screening and diagnostic procedures often require a physician's subjective interpretation of a patient's test result using an ordered categorical scale to define the patient's disease severity. Due to the wide variability observed between physicians’ ratings, many large-scale studies have been conducted to quantify agreement between multiple experts’ ordinal classifications in common diagnostic procedures such as mammography. However, very few statistical approaches are available to assess agreement in these large-scale settings. Existing summary measures of agreement rely on extensions of Cohen's kappa [1-5]. These are prone to prevalence and marginal distribution issues, become increasingly complex for more than three experts, or are not easily implemented. Here we propose a model-based approach to assess agreement in large-scale studies based upon a framework of ordinal generalized linear mixed models. A summary measure of agreement is proposed for multiple experts assessing the same sample of patients’ test results according to an ordered categorical scale. This measure avoids some of the key flaws associated with Cohen's kappa and its extensions. Simulation studies are conducted to demonstrate the validity of the approach with comparison to commonly used agreement measures. The proposed methods are easily implemented using the software package R and are applied to two large-scale cancer agreement studies. PMID:26095449
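
    The proposed model-based measure itself is not specified in the abstract; for context, the baseline it aims to improve on, Cohen's kappa (in its weighted form for ordinal scales), can be computed as below. The two raters' toy ratings are fabricated for illustration only.

    # Baseline agreement measure the paper seeks to improve on: weighted Cohen's kappa
    # for two raters assigning ordinal categories (toy data, illustrative only).
    from sklearn.metrics import cohen_kappa_score

    rater_a = [1, 2, 2, 3, 4, 4, 5, 1, 2, 3]   # ordinal severity ratings, rater A
    rater_b = [1, 2, 3, 3, 4, 5, 5, 1, 1, 3]   # same patients rated by rater B

    kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
    print(f"quadratic-weighted kappa: {kappa:.2f}")
    # Kappa's value shifts with category prevalence and marginal distributions,
    # which is one motivation for model-based alternatives for many raters.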

  6. Application of the International Classification of Functioning, Disability, and Health-Children and Youth Version (ICF-CY) to cleft lip and palate.

    PubMed

    Neumann, Sandra; Romonath, Roswitha

    2012-05-01

    In recent health policy discussions, the World Health Organization has urged member states to implement the International Classification of Functioning, Disability, and Health: Children and Youth Version in their clinical practice and research. The purpose of this study was to identify codes from the International Classification of Functioning, Disability, and Health: Children and Youth Version relevant for use among children with cleft lip and/or palate, thereby highlighting the potential value of these codes for interprofessional cleft palate-craniofacial teams. The scope of recent published research in the area of cleft lip and/or palate was reviewed and compared with meaningful terms identified from the International Classification of Functioning, Disability, and Health: Children and Youth Version. In a five-step procedure, a consensus-based list of terms was developed that was linked separately to International Classification of Functioning, Disability, and Health: Children and Youth Version categories and codes. This provided a first draft of a core set for use in the cleft lip and/or palate field. Adopting International Classification of Functioning, Disability, and Health: Children and Youth Version domains in cleft lip and/or palate may aid experts in identifying appropriate starting points for assessment, counseling, and therapy. When used as a clinical tool, it encourages health care professionals to go beyond treatment and outcome perspectives that are focused solely on the child and to include the children's environment and their familial/societal context. In order to establish improved, evidence-based interdisciplinary treatments for children with cleft lip and/or palate, more studies are needed that seek to identify all the influencing conditions of activities, children's participation, and barriers/facilitators in their environments.

  7. Automated detection of radioisotopes from an aircraft platform by pattern recognition analysis of gamma-ray spectra.

    PubMed

    Dess, Brian W; Cardarelli, John; Thomas, Mark J; Stapleton, Jeff; Kroutil, Robert T; Miller, David; Curry, Timothy; Small, Gary W

    2018-03-08

    A generalized methodology was developed for automating the detection of radioisotopes from gamma-ray spectra collected from an aircraft platform using sodium-iodide detectors. Employing data provided by the U.S. Environmental Protection Agency Airborne Spectral Photometric Environmental Collection Technology (ASPECT) program, multivariate classification models based on nonparametric linear discriminant analysis were developed for application to spectra that were preprocessed through a combination of altitude-based scaling and digital filtering. Training sets of spectra for use in building classification models were assembled from a combination of background spectra collected in the field and synthesized spectra obtained by superimposing laboratory-collected spectra of target radioisotopes onto field backgrounds. This approach eliminated the need for field experimentation with radioactive sources when building classification models. Through a bi-Gaussian modeling procedure, the discriminant scores that served as the outputs from the classification models were related to associated confidence levels. This provided an easily interpreted result regarding the presence or absence of the signature of a specific radioisotope in each collected spectrum. Using this approach, classifiers were built for cesium-137 (137Cs) and cobalt-60 (60Co), two radioisotopes that are of interest in airborne radiological monitoring applications. The optimized classifiers were tested with field data collected from a set of six geographically diverse sites, three of which contained either 137Cs, 60Co, or both. When the optimized classification models were applied, the overall percentages of correct classifications for spectra collected at these sites were 99.9 and 97.9% for the 60Co and 137Cs classifiers, respectively.
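
    As a rough sketch of the training-set synthesis and discriminant-analysis idea described above (not the ASPECT program's actual code), laboratory source signatures can be superimposed on field background spectra and a linear discriminant model fit to the result; scikit-learn's standard LDA stands in here for the paper's nonparametric variant, and all arrays and scalings are simulated placeholders.

    # Sketch: build a training set by adding a source signature to measured backgrounds,
    # then fit a linear discriminant classifier and score new spectra.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(1)
    n_channels = 256

    # Simulated field background spectra and a simulated 137Cs-like photopeak signature.
    backgrounds = rng.poisson(lam=20.0, size=(300, n_channels)).astype(float)
    signature = np.exp(-0.5 * ((np.arange(n_channels) - 170) / 4.0) ** 2) * 60.0

    # Synthesize "source present" spectra by superimposing the signature on backgrounds.
    with_source = backgrounds[:150] + signature
    X = np.vstack([backgrounds[150:], with_source])
    y = np.array([0] * 150 + [1] * 150)          # 0 = background only, 1 = source present

    lda = LinearDiscriminantAnalysis().fit(X, y)

    # Score a new spectrum; the discriminant score could then be mapped to a
    # confidence level (the paper uses a bi-Gaussian model of the score distributions).
    new_spectrum = backgrounds[0] + 0.5 * signature
    print("decision score:", lda.decision_function([new_spectrum])[0])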

  8. Proposed Standardized Neurological Endpoints for Cardiovascular Clinical Trials: An Academic Research Consortium Initiative.

    PubMed

    Lansky, Alexandra J; Messé, Steven R; Brickman, Adam M; Dwyer, Michael; Bart van der Worp, H; Lazar, Ronald M; Pietras, Cody G; Abrams, Kevin J; McFadden, Eugene; Petersen, Nils H; Browndyke, Jeffrey; Prendergast, Bernard; Ng, Vivian G; Cutlip, Donald E; Kapadia, Samir; Krucoff, Mitchell W; Linke, Axel; Scala Moy, Claudia; Schofer, Joachim; van Es, Gerrit-Anne; Virmani, Renu; Popma, Jeffrey; Parides, Michael K; Kodali, Susheel; Bilello, Michel; Zivadinov, Robert; Akar, Joseph; Furie, Karen L; Gress, Daryl; Voros, Szilard; Moses, Jeffrey; Greer, David; Forrest, John K; Holmes, David; Kappetein, Arie P; Mack, Michael; Baumbach, Andreas

    2018-05-14

    Surgical and catheter-based cardiovascular procedures and adjunctive pharmacology have an inherent risk of neurological complications. The current diversity of neurological endpoint definitions and ascertainment methods in clinical trials has led to uncertainties in the neurological risk attributable to cardiovascular procedures and inconsistent evaluation of therapies intended to prevent or mitigate neurological injury. Benefit-risk assessment of such procedures should be based on well-defined neurological outcomes that are ascertained with consistent methods and capture the full spectrum of neurovascular injury and its clinical effect. The Neurologic Academic Research Consortium is an international collaboration intended to establish consensus on the definition, classification, and assessment of neurological endpoints applicable to clinical trials of a broad range of cardiovascular interventions. Systematic application of the proposed definitions and assessments will improve our ability to evaluate the risks of cardiovascular procedures and the safety and effectiveness of preventive therapies.

  9. Classification in Astronomy: Past and Present

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric

    2012-03-01

    Astronomers have always classified celestial objects. The ancient Greeks distinguished between asteros, the fixed stars, and planetos, the roving stars. The latter were associated with the Gods and, starting with Plato in his dialog Timaeus, provided the first mathematical models of celestial phenomena. Giovanni Hodierna classified nebulous objects, seen with a Galilean refractor telescope in the mid-seventeenth century, into three classes: "Luminosae," "Nebulosae," and "Occultae." A century later, Charles Messier compiled a larger list of nebulae, star clusters and galaxies, but did not attempt a classification. Classification of comets was a significant enterprise in the 19th century: Alexander (1850) considered two groups based on orbit sizes, Lardner (1853) proposed three groups of orbits, and Barnard (1891) divided them into two classes based on morphology. Aside from the segmentation of the bright stars into constellations, most stellar classifications were based on colors and spectral properties. During the 1860s, the pioneering spectroscopist Angelo Secchi classified stars into five classes: white, yellow, orange, carbon stars, and emission line stars. After many debates, the stellar spectral sequence was refined by the group at Harvard into the familiar OBAFGKM spectral types, later found to be a sequence on surface temperature (Cannon 1926). The spectral classification is still being extended with recent additions of O2 hot stars (Walborn et al. 2002) and L and T brown dwarfs (Kirkpatrick 2005). Townley (1913) reviews 30 years of variable star classification, emerging with six classes with five subclasses. The modern classification of variable stars has about 80 (sub)classes, and is still under debate (Samus 2009). Shortly after his confirmation that some nebulae are external galaxies, Edwin Hubble (1926) proposed his famous bifurcated classification of galaxy morphologies with three classes: ellipticals, spirals, and irregulars. These classes are still used today with many refinements by Gerard de Vaucouleurs and others. Supernovae, nearly all of which are found in external galaxies, have a complicated classification scheme: Type I with subtypes Ia, Ib, Ic, Ib/c pec, and Type II with subtypes IIb, IIL, IIP, and IIn (Turatto 2003). The classification is based on elemental abundances in optical spectra and on optical light curve shapes. Tadhunter (2009) presents a three-dimensional classification of active galactic nuclei involving radio power, emission line width, and nuclear luminosity. These taxonomies have played enormously important roles in the development of astronomy, yet all were developed using heuristic methods. Many are based on qualitative and subjective assessments of spatial, temporal, or spectral properties. A qualitative, morphological approach to astronomical studies was explicitly promoted by Zwicky (1957). Other classifications are based on quantitative criteria, but these criteria were developed by subjective examination of training datasets. For example, starburst galaxies are discriminated from narrow-line Seyfert galaxies by a curved line in a diagram of the ratios of four emission lines (Veilleux and Osterbrock 1987). Class II young stellar objects have been defined by a rectangular region in a mid-infrared color-color diagram (Allen et al. 2004). Short and hard gamma-ray bursts are discriminated by a dip in the distribution of burst durations (Kouveliotou et al. 2000). In no case was a statistical or algorithmic procedure used to define the classes.

  10. Procedures for gathering ground truth information for a supervised approach to a computer-implemented land cover classification of LANDSAT-acquired multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Joyce, A. T.

    1978-01-01

    Procedures for gathering ground truth information for a supervised approach to a computer-implemented land cover classification of LANDSAT acquired multispectral scanner data are provided in a step by step manner. Criteria for determining size, number, uniformity, and predominant land cover of training sample sites are established. Suggestions are made for the organization and orientation of field team personnel, the procedures used in the field, and the format of the forms to be used. Estimates are made of the probable expenditures in time and costs. Examples of ground truth forms and definitions and criteria of major land cover categories are provided in appendixes.

  11. Study of distributed learning as a solution to category proliferation in Fuzzy ARTMAP based neural systems.

    PubMed

    Parrado-Hernández, Emilio; Gómez-Sánchez, Eduardo; Dimitriadis, Yannis A

    2003-09-01

    An evaluation of distributed learning as a means to attenuate the category proliferation problem in Fuzzy ARTMAP based neural systems is carried out, from both qualitative and quantitative points of view. The study involves two original winner-take-all (WTA) architectures, Fuzzy ARTMAP and FasArt, and their distributed versions, dARTMAP and dFasArt. A qualitative analysis of the distributed learning properties of dARTMAP is made, focusing on the new elements introduced to endow Fuzzy ARTMAP with distributed learning. In addition, a quantitative study on a selected set of classification problems points out that problems have to present certain features in their output classes in order to noticeably reduce the number of recruited categories and achieve an acceptable classification accuracy. As part of this analysis, distributed learning was successfully adapted to a member of the Fuzzy ARTMAP family, FasArt, and similar procedures can be used to extend distributed learning capabilities to other Fuzzy ARTMAP based systems.

  12. Pattern of Cortical Fracture following Corticotomy for Distraction Osteogenesis

    PubMed Central

    Luvan, M; Roshan, G; Saw, A

    2015-01-01

    Corticotomy is an essential procedure for deformity correction, and many techniques have been described. However, there is no established classification of the fracture patterns resulting from corticotomies on which studies could be based. We performed a retrospective study of corticotomy fracture patterns in 44 patients (34 tibias and 10 femurs) treated for various indications. We identified four distinct fracture patterns, classified as Types I through IV based on fracture propagation following percutaneous corticotomy: Type I, transverse fracture; Type II, transverse fracture with a winglet; Type III, presence of a butterfly fragment; and Type IV, fracture propagation to a fixation point. No significant correlation was noted between the fracture pattern and the underlying pathology or region of corticotomy. PMID:28611907

  13. Relationships among classes of self-oscillating transistor parallel inverters. [for power conditioning applications

    NASA Technical Reports Server (NTRS)

    Wilson, T. G.; Lee, F. C. Y.; Burns, W. W., III; Owen, H. A., Jr.

    1975-01-01

    It has recently been shown in the literature that many dc-to-square-wave parallel inverters widely used in power-conditioning applications can be grouped into one of two families, each characterized by an equivalent RLC network. Based on this approach, a classification procedure is presented for self-oscillating parallel inverters that makes evident the natural relationships between various inverter configurations. By utilizing concepts from the basic theory of negative-resistance oscillators and the principle of duality as applied to nonlinear networks, a chain of relationships is established that enables a methodical transfer of knowledge gained about one family of inverters to any of the other families in the classification array.
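
    The abstract does not give the equivalent networks, but the duality step it relies on can be illustrated generically: under network duality with a chosen normalization resistance, inductors map to capacitors, capacitors to inductors, and resistances to their normalized reciprocals. The element values below are arbitrary examples, not taken from the paper.

    # Generic network-duality mapping with normalization resistance R0:
    #   L -> C' = L / R0**2,  C -> L' = C * R0**2,  R -> R' = R0**2 / R.
    # This only illustrates how element values transfer between dual RLC equivalent circuits.
    R0 = 10.0   # normalization resistance (ohms), chosen freely

    def dual_elements(L, C, R, r0=R0):
        """Return the element values of the dual RLC network."""
        return {
            "L_dual (H)": C * r0 ** 2,    # a capacitor becomes an inductor
            "C_dual (F)": L / r0 ** 2,    # an inductor becomes a capacitor
            "R_dual (ohm)": r0 ** 2 / R,  # resistance maps to its normalized reciprocal
        }

    print(dual_elements(L=1e-3, C=4.7e-6, R=25.0))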

  14. Extending cluster Lot Quality Assurance Sampling designs for surveillance programs

    PubMed Central

    Hund, Lauren; Pagano, Marcello

    2014-01-01

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance based on the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than simple random sampling. By applying survey sampling results to the binary classification procedure, we develop a simple and flexible non-parametric procedure to incorporate clustering effects into the LQAS sample design to appropriately inflate the sample size, accommodating finite numbers of clusters in the population when relevant. We use this framework to then discuss principled selection of survey design parameters in longitudinal surveillance programs. We apply this framework to design surveys to detect rises in malnutrition prevalence in nutrition surveillance programs in Kenya and South Sudan, accounting for clustering within villages. By combining historical information with data from previous surveys, we design surveys to detect spikes in the childhood malnutrition rate. PMID:24633656
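
    The paper's exact sample-size formula is not given in the abstract; a standard way to inflate a simple-random-sample LQAS size for clustering is the design effect DEFF = 1 + (m - 1) * ICC, sketched below with made-up parameter values.

    # Sketch: inflate an LQAS sample size for two-stage cluster sampling using a
    # design effect (DEFF). All parameter values are illustrative assumptions.
    import math

    n_srs = 96          # sample size required under simple random sampling
    cluster_size = 10   # children sampled per village (m)
    icc = 0.05          # intra-cluster correlation of the malnutrition indicator

    deff = 1 + (cluster_size - 1) * icc      # design effect for equal cluster sizes
    n_cluster = math.ceil(n_srs * deff)      # inflated total sample size
    n_villages = math.ceil(n_cluster / cluster_size)

    print(f"DEFF = {deff:.2f}; need {n_cluster} children across {n_villages} villages")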

  15. Classification System for Individualized Treatment of Adult Buried Penis Syndrome.

    PubMed

    Tausch, Timothy J; Tachibana, Isamu; Siegel, Jordan A; Hoxworth, Ronald; Scott, Jeremy M; Morey, Allen F

    2016-09-01

    The authors present their experience with reconstructive strategies for men with various manifestations of adult buried penis syndrome, and propose a comprehensive anatomical classification system and treatment algorithm based on pathologic changes in the penile skin and involvement of neighboring abdominal and/or scrotal components. The authors reviewed all patients who underwent reconstruction of adult buried penis syndrome at their referral center between 2007 and 2015. Patients were stratified by location and severity of involved anatomical components. Procedures performed, demographics, comorbidities, and clinical outcomes were reviewed. Fifty-six patients underwent reconstruction of buried penis at the authors' center from 2007 to 2015. All procedures began with a ventral penile release. If the uncovered penile skin was determined to be viable, a phalloplasty was performed by anchoring penoscrotal skin to the proximal shaft, and the ventral shaft skin defect was closed with scrotal flaps. In more complex patients with circumferential nonviable penile skin, the penile skin was completely excised and replaced with a split-thickness skin graft. Complex patients with severe abdominal lipodystrophy required adjacent tissue transfer. For cases of genital lymphedema, the procedure involved complete excision of the lymphedematous tissue, and primary closure with or without a split-thickness skin graft, also often involving the scrotum. The authors' overall success rate was 88 percent (49 of 56), defined as resolution of symptoms without the need for additional procedures. Successful correction of adult buried penis often necessitates an interdisciplinary, multimodal approach. Therapeutic, IV.

  16. Policy Agenda for the Next Decade: Creating a Path for Graceful Evolution and Harmonized Classifications and Terminologies Used for Encoding Health Information in Electronic Environments

    PubMed Central

    Foley, Margaret M; Glenn, Regina M; Meli, Peggy L; Scichilone, Rita A

    2009-01-01

    Introduction Health information management (HIM) professionals' involvement with disease classification and nomenclature in the United States can be traced back to the early 20th century. In 1914, Grace Whiting Myers, the founder of the association known today as the American Health Information Management Association (AHIMA), served on the Committee on Uniform Nomenclature, which developed a disease classification system based upon etiological groupings. The profession's expertise and leadership in the collection, classification, and reporting of health data has continued since then. For example, in the early 1960s, another HIM professional (a medical record librarian) served as the associate editor of the fifth edition of the Standard Nomenclature of Disease (SNDO), a forerunner of the widely used clinical terminology, Systematized Nomenclature of Medicine Clinical Terms (SNOMED-CT). During the same period in history, the medical record professionals working in hospitals throughout the country were responsible for manually collecting and reporting disease and procedure information from medical records using SNDO.1 Because coded data have played a pivotal role in the ability to record and share health information through the years, creating the appropriate policy framework for the graceful evolution and harmonization of classification systems and clinical terminologies is essential. PMID:20169015

  17. Evaluating the statistical performance of less applied algorithms in classification of worldview-3 imagery data in an urbanized landscape

    NASA Astrophysics Data System (ADS)

    Ranaie, Mehrdad; Soffianian, Alireza; Pourmanafi, Saeid; Mirghaffari, Noorollah; Tarkesh, Mostafa

    2018-03-01

    Over the past decade, analyzing remotely sensed imagery has become one of the most common and widely used procedures in environmental studies, with supervised image classification techniques playing a central role. Using a high-resolution WorldView-3 image acquired over a mixed urbanized landscape in Iran, three less commonly applied image classification methods (bagged CART, stochastic gradient boosting, and a neural network with feature extraction) were tested and compared with two prevalent methods: random forest and a support vector machine with a linear kernel. Each method was run ten times, and three validation techniques were used to estimate the accuracy statistics: cross-validation, independent validation, and validation against the full set of training data. Moreover, the statistical significance of differences between the classification methods was assessed using ANOVA and Tukey tests. In general, the results showed that random forest, by a marginal difference over bagged CART and stochastic gradient boosting, was the best-performing method, whereas under independent validation there was no significant difference between the performances of the classification methods. It should finally be noted that the neural network with feature extraction and the linear support vector machine had better processing speed than the others.
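
    A minimal sketch of this kind of classifier comparison (not the study's actual data, features, or settings), using repeated cross-validation on a synthetic multi-class dataset:

    # Sketch: compare several classifiers with repeated cross-validation on toy data.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier, BaggingClassifier
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.svm import SVC
    from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

    X, y = make_classification(n_samples=600, n_features=8, n_classes=3,
                               n_informative=5, random_state=0)

    models = {
        "bagged CART": BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0),
        "gradient boosting": GradientBoostingClassifier(random_state=0),
        "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
        "linear SVM": SVC(kernel="linear"),
    }

    cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=2, random_state=0)
    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=cv)
        print(f"{name:18s} mean accuracy {scores.mean():.3f} (+/- {scores.std():.3f})")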

  18. A Single-Channel EOG-Based Speller.

    PubMed

    He, Shenghong; Li, Yuanqing

    2017-11-01

    Electrooculography (EOG) signals, which can be used to infer the intentions of a user based on eye movements, are widely used in human-computer interface (HCI) systems. Most existing EOG-based HCI systems incorporate a limited number of commands because they generally associate different commands with a few different types of eye movements, such as looking up, down, left, or right. This paper presents a novel single-channel EOG-based HCI that allows users to spell asynchronously by only blinking. Forty buttons corresponding to 40 characters displayed to the user via a graphical user interface are intensified in a random order. To select a button, the user must blink his/her eyes in synchrony as the target button is flashed. Two data processing procedures, specifically support vector machine (SVM) classification and waveform detection, are combined to detect eye blinks. During detection, we simultaneously feed the feature vectors extracted from the ongoing EOG signal into the SVM classification and waveform detection modules. Decisions are made based on the results of the SVM classification and waveform detection. Three online experiments were conducted with eight healthy subjects. We achieved an average accuracy of 94.4% and a response time of 4.14 s for selecting a character in synchronous mode, as well as an average accuracy of 93.43% and a false positive rate of 0.03/min in the idle state in asynchronous mode. The experimental results, therefore, demonstrated the effectiveness of this single-channel EOG-based speller.
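
    The paper's exact features and thresholds are not given in the abstract; the sketch below shows one generic way an SVM decision and a simple waveform (amplitude) check can be combined into a single blink decision over a one-second window. All signal parameters and thresholds are invented for illustration.

    # Sketch: combine an SVM classifier with a simple waveform check to flag eye blinks
    # in a single-channel EOG window (toy signals, illustrative thresholds).
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(2)
    fs, win = 250, 250   # 250 Hz sampling, 1-second windows

    def make_window(blink):
        noise = rng.normal(0, 5, win)
        if blink:          # add a blink-like positive deflection
            t = np.arange(win)
            noise += 120 * np.exp(-0.5 * ((t - 125) / 15.0) ** 2)
        return noise

    def features(x):
        return [x.max() - x.min(), x.std(), np.abs(np.diff(x)).max()]

    # Train the SVM on labeled blink / non-blink windows.
    X = np.array([features(make_window(b)) for b in ([True] * 60 + [False] * 60)])
    y = np.array([1] * 60 + [0] * 60)
    svm = SVC(kernel="rbf").fit(X, y)

    def detect_blink(window):
        svm_says_blink = svm.predict([features(window)])[0] == 1
        waveform_ok = window.max() > 80          # crude waveform (amplitude) criterion
        return svm_says_blink and waveform_ok    # both procedures must agree

    print(detect_blink(make_window(True)), detect_blink(make_window(False)))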

  19. Success criteria in pediatric neuroendoscopic procedures. Proposal for classification of results after 67 operations.

    PubMed

    Ros, Bienvenido; Romero, Lorena; Ibáñez, Guillermo; Iglesias, Sara; Rius, Francisca; Pérez, Sandra; Arráez, Miguel A

    2012-05-01

    Controversial issues exist concerning criteria for patient selection and long-term success in pediatric neuroendoscopic procedures. We designed a classification of success grades applicable to high-pressure and chronic hydrocephalus and also to those cases in which different endoscopic maneuvers are performed during the same procedure. We then evaluated the success rate and complications in our series. A total of 59 patients underwent 67 neuroendoscopic procedures between January 2003 and January 2011. A retrospective study was made of the preoperative history, operative reports, and postoperative imaging findings and medical records. A 5-grade scale was developed to assess the type of success depending on clinical and radiological data. Complications related to the surgical procedure were also recorded. Two patients were excluded from the success analysis due to insufficient follow-up time. The final results for the first procedures in 57 patients were complete and permanent success (grade I) in 49.1%, complete but transitory success (grade II) in 10.5%, partial success (grade III) in 12.3%, doubtful success (grade IV) in 5.3%, and failure (grade V) in 22.8%. In eight cases a second procedure followed the failure of the first: grade I success was achieved in seven cases (87.5%) and grade V in one case (12.5%). The highest success rates were achieved in cases of hydrocephalus caused by tumors or arachnoid cysts and the lowest in slit ventricle syndrome. A common classification of degrees of success, such as that proposed here, would aid the development of comparative and cooperative studies.

  20. Annual update of data for estimating ESALs.

    DOT National Transportation Integrated Search

    2006-10-01

    A revised procedure for estimating equivalent single axle loads (ESALs) was developed in 1985. This procedure used weight, classification, and traffic volume data collected by the Transportation Cabinet's Division of Planning. Annual updates of data...
