Science.gov

Sample records for accurate exposure classification

  1. TIME-INTEGRATED EXPOSURE MEASURES TO IMPROVE THE PREDICTIVE POWER OF EXPOSURE CLASSIFICATION FOR EPIDEMIOLOGIC STUDIES

    EPA Science Inventory

    Accurate exposure classification tools are required to link exposure with health effects in epidemiological studies. Although long-term integrated exposure measurements are a critical component of exposure assessment, the ability to include these measurements into epidemiologic...

  2. Accurate phylogenetic classification of variable-length DNA fragments.

    PubMed

    McHardy, Alice Carolyn; Martín, Héctor García; Tsirigos, Aristotelis; Hugenholtz, Philip; Rigoutsos, Isidore

    2007-01-01

    Metagenome studies have retrieved vast amounts of sequence data from a variety of environments leading to new discoveries and insights into the uncultured microbial world. Except for very simple communities, the encountered diversity has made fragment assembly and the subsequent analysis a challenging problem. A taxonomic characterization of metagenomic fragments is required for a deeper understanding of shotgun-sequenced microbial communities, but success has mostly been limited to sequences containing phylogenetic marker genes. Here we present PhyloPythia, a composition-based classifier that combines higher-level generic clades from a set of 340 completed genomes with sample-derived population models. Extensive analyses on synthetic and real metagenome data sets showed that PhyloPythia allows the accurate classification of most sequence fragments across all considered taxonomic ranks, even for unknown organisms. The method requires no more than 100 kb of training sequence for the creation of accurate models of sample-specific populations and can assign fragments ≥1 kb with high specificity.

  3. Automatic classification and accurate size measurement of blank mask defects

    NASA Astrophysics Data System (ADS)

    Bhamidipati, Samir; Paninjath, Sankaranarayanan; Pereira, Mark; Buck, Peter

    2015-07-01

    ...complexity of defects encountered. The variety arises due to factors such as defect nature, size, shape and composition, and the optical phenomena occurring around the defect. This paper focuses on preliminary characterization results, in terms of classification and size estimation, obtained by the Calibre MDPAutoClassify tool on a variety of mask blank defects. It primarily highlights the challenges faced in achieving the results with reference to the variety of defects observed on blank mask substrates and the underlying complexities which make accurate defect size measurement an important and challenging task.

  4. How accurate are the European Union's classifications of chemical substances.

    PubMed

    Rudén, Christina; Hansson, Sven Ove

    2003-09-30

    The European Commission has decided on harmonized classifications for a large number of individual chemicals according to its own directive for classification and labeling of dangerous substances. We have compared the harmonized classifications for acute oral toxicity to the acute oral toxicity data available in the RTECS database. Of the 992 substances eligible for this comparison, 15% were assigned too low a danger class and 8% too high a danger class according to the RTECS data. Due to insufficient transparency (scientific documentation of the classification decisions is not available), the causes of this discrepancy can only be hypothesized. We propose that the scientific motivations of future classifications be published and that the apparent over- and underclassifications in the present system be either explained or rectified, according to the facts of the matter.

  5. Accurate phylogenetic classification of DNA fragments based on sequence composition

    SciTech Connect

    McHardy, Alice C.; Garcia Martin, Hector; Tsirigos, Aristotelis; Hugenholtz, Philip; Rigoutsos, Isidore

    2006-05-01

    Metagenome studies have retrieved vast amounts of sequence out of a variety of environments, leading to novel discoveries and great insights into the uncultured microbial world. Except for very simple communities, diversity makes sequence assembly and analysis a very challenging problem. To understand the structure and function of microbial communities, a taxonomic characterization of the obtained sequence fragments is highly desirable, yet currently limited mostly to those sequences that contain phylogenetic marker genes. We show that for clades at the rank of domain down to genus, sequence composition allows the very accurate phylogenetic characterization of genomic sequence. We developed a composition-based classifier, PhyloPythia, for de novo phylogenetic sequence characterization and have trained it on a data set of 340 genomes. By extensive evaluation experiments we show that the method is accurate across all taxonomic ranks considered, even for sequences that originate from novel organisms and are as short as 1 kb. Application to two metagenome datasets obtained from samples of phosphorus-removing sludge showed that the method allows the accurate classification at genus level of most sequence fragments from the dominant populations, while at the same time correctly characterizing even larger parts of the samples at higher taxonomic levels.
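
    A minimal sketch of the composition-based idea, k-mer frequency vectors fed to a discriminative classifier, may make this concrete. It is not the PhyloPythia implementation; the fragments, clade labels, k-mer size, and choice of SVM are illustrative assumptions:

      # Sketch: composition-based taxonomic classification of DNA fragments.
      # K, the training fragments, and the labels are placeholder assumptions.
      from itertools import product
      import numpy as np
      from sklearn.svm import SVC

      K = 4
      KMERS = {"".join(p): i for i, p in enumerate(product("ACGT", repeat=K))}

      def kmer_profile(seq):
          """Normalized k-mer frequency vector of a DNA fragment."""
          counts = np.zeros(len(KMERS))
          for i in range(len(seq) - K + 1):
              idx = KMERS.get(seq[i:i + K])
              if idx is not None:  # skip k-mers containing ambiguous bases
                  counts[idx] += 1
          total = counts.sum()
          return counts / total if total else counts

      # Hypothetical training fragments with known clade labels.
      train_seqs = ["ACGTACGTACGTGGCC" * 8, "TTGGCCAATTGGCCAA" * 8]
      train_labels = ["CladeA", "CladeB"]

      X = np.array([kmer_profile(s) for s in train_seqs])
      clf = SVC(kernel="linear").fit(X, train_labels)
      print(clf.predict([kmer_profile("ACGTACGTACGTGGCC" * 4)]))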

  6. Accurate Classification of RNA Structures Using Topological Fingerprints

    PubMed Central

    Li, Kejie; Gribskov, Michael

    2016-01-01

    While RNAs are well known to possess complex structures, functionally similar RNAs often have little sequence similarity. Although the exact size and spacing of base-paired regions vary, functionally similar RNAs have pronounced similarity in the arrangement, or topology, of base-paired stems. Furthermore, predicted RNA structures often lack pseudoknots (a crucial aspect of biological activity) and are only partially correct or incomplete. A topological approach addresses all of these difficulties. In this work we describe each RNA structure as a graph that can be converted to a topological spectrum (RNA fingerprint). The set of subgraphs in an RNA structure, its RNA fingerprint, can be compared with the fingerprints of other RNA structures to identify and correctly classify functionally related RNAs. Topologically similar RNAs can be identified even when a large fraction, up to 30%, of the stems are omitted, indicating that highly accurate structures are not necessary. We investigate the performance of the RNA fingerprint approach on a set of eight highly curated RNA families, with diverse sizes and functions, containing pseudoknots, and with little sequence similarity, an especially difficult test set. In spite of the difficult test set, the RNA fingerprint approach is very successful (ROC AUC > 0.95). Due to the inclusion of pseudoknots, the RNA fingerprint approach covers a wider range of possible structures than methods based only on secondary structure, and its tolerance for incomplete structures suggests that it can be applied even to predicted structures. Source code is freely available at https://github.rcac.purdue.edu/mgribsko/XIOS_RNA_fingerprint. PMID:27755571

  7. Accurate and interpretable classification of microspectroscopy pixels using artificial neural networks.

    PubMed

    Manescu, Petru; Jong Lee, Young; Camp, Charles; Cicerone, Marcus; Brady, Mary; Bajcsy, Peter

    2017-04-01

    This paper addresses the problem of classifying materials from microspectroscopy at a pixel level. The challenges lie in identifying discriminatory spectral features and obtaining accurate and interpretable models relating spectra and class labels. We approach the problem by designing a supervised classifier from a tandem of Artificial Neural Network (ANN) models that identify relevant features in raw spectra and achieve high classification accuracy. The tandem of ANN models is meshed with classification rule extraction methods to lower the model complexity and to achieve interpretability of the resulting model. The contribution of the work is in designing each ANN model based on the microspectroscopy hypothesis that a discriminatory feature of a certain target class is composed of a linear combination of spectra. The novelty lies in meshing ANN and decision rule models into a tandem configuration to achieve accurate and interpretable classification results. The proposed method was evaluated using a set of broadband coherent anti-Stokes Raman scattering (BCARS) microscopy cell images (600,000 pixel-level spectra) and a reference four-class rule-based model previously created by biochemical experts. The generated classification rule-based model was on average 85% accurate as measured by the DICE pixel label similarity metric, and on average 96% similar to the reference rules as measured by the vector cosine metric.

  8. Accurate crop classification using hierarchical genetic fuzzy rule-based systems

    NASA Astrophysics Data System (ADS)

    Topaloglou, Charalampos A.; Mylonas, Stelios K.; Stavrakoudis, Dimitris G.; Mastorocostas, Paris A.; Theocharis, John B.

    2014-10-01

    This paper investigates the effectiveness of an advanced classification system for accurate crop classification using very high resolution (VHR) satellite imagery. Specifically, a recently proposed genetic fuzzy rule-based classification system (GFRBCS) is employed, namely, the Hierarchical Rule-based Linguistic Classifier (HiRLiC). HiRLiC's model comprises a small set of simple IF-THEN fuzzy rules, easily interpretable by humans. One of its most important attributes is that its learning algorithm requires minimum user interaction, since the most important learning parameters affecting the classification accuracy are determined by the learning algorithm automatically. HiRLiC is applied in a challenging crop classification task, using a SPOT5 satellite image over an intensively cultivated area in a lake-wetland ecosystem in northern Greece. A rich set of higher-order spectral and textural features is derived from the initial bands of the (pan-sharpened) image, resulting in an input space comprising 119 features. The experimental analysis proves that HiRLiC compares favorably to other interpretable classifiers in the literature, both in terms of structural complexity and classification accuracy. Its testing accuracy was very close to that obtained by complex state-of-the-art classification systems, such as the support vector machine (SVM) and random forest (RF) classifiers. Nevertheless, visual inspection of the derived classification maps shows that HiRLiC is characterized by higher generalization properties, providing more homogeneous classifications than the competitors. Moreover, the runtime requirements for producing the thematic map were orders of magnitude lower than those of the competitors.

  9. Change in BMI Accurately Predicted by Social Exposure to Acquaintances

    PubMed Central

    Oloritun, Rahman O.; Ouarda, Taha B. M. J.; Moturu, Sai; Madan, Anmol; Pentland, Alex (Sandy); Khayal, Inas

    2013-01-01

    Research has mostly focused on obesity rather than on processes of BMI change more generally, although these may be key factors that lead to obesity. Studies have suggested that obesity is affected by social ties. However, these studies used survey-based data collection techniques that may be biased toward selecting only close friends and relatives. In this study, mobile phone sensing techniques were used to routinely capture social interaction data in an undergraduate dorm. By automating the capture of social interaction data, the limitations of self-reported social exposure data are avoided. This study attempts to understand and develop a model that best describes the change in BMI using social interaction data. We evaluated a cohort of 42 college students in a co-located university dorm, using social interaction data automatically captured via mobile phones together with survey-based health-related information. We determined the most predictive variables for change in BMI using the least absolute shrinkage and selection operator (LASSO) method. The selected variables, together with gender, healthy diet category, and ability to manage stress, were used to build multiple linear regression models that estimate the effect of exposure and individual factors on change in BMI. We identified the best model using the Akaike Information Criterion (AIC) and R². This study found a model that explains 68% (p<0.0001) of the variation in change in BMI. The model combined social interaction data, especially from acquaintances, and personal health-related information to explain change in BMI. This is the first study taking into account both interactions with different levels of social contact and personal health-related information. Social interactions with acquaintances accounted for more than half the variation in change in BMI. This suggests the importance of not only individual health information but also of social interactions with the people we are exposed to, even people we may not consider close friends.
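
    The pipeline described here (LASSO screening followed by an ordinary least squares model scored by AIC) can be sketched as follows; the predictor matrix and BMI-change values are synthetic placeholders, not study data:

      # Sketch: LASSO variable selection, then an OLS model scored by AIC.
      # Data are synthetic; dimensions and coefficients are assumptions.
      import numpy as np
      import statsmodels.api as sm
      from sklearn.linear_model import LassoCV

      rng = np.random.default_rng(0)
      X = rng.normal(size=(42, 10))          # 42 students, 10 candidate predictors
      beta = np.array([1.5, 0, 0, -2.0, 0, 0, 0, 0.8, 0, 0])
      y = X @ beta + rng.normal(scale=0.5, size=42)   # change in BMI (synthetic)

      lasso = LassoCV(cv=5).fit(X, y)
      selected = np.flatnonzero(lasso.coef_)  # predictors surviving shrinkage

      ols = sm.OLS(y, sm.add_constant(X[:, selected])).fit()
      print("selected columns:", selected)
      print("AIC:", ols.aic, "R^2:", ols.rsquared)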

  10. Accurate measurement of RF exposure from emerging wireless communication systems

    NASA Astrophysics Data System (ADS)

    Letertre, Thierry; Monebhurrun, Vikass; Toffano, Zeno

    2013-04-01

    Isotropic broadband probes or spectrum analyzers (SAs) may be used for the measurement of rapidly varying electromagnetic fields generated by emerging wireless communication systems. In this paper, this problem is investigated by comparing the responses measured by two different isotropic broadband probes typically used to perform electric field (E-field) evaluations. The broadband probes are submitted to signals with variable duty cycles (DC) and crest factors (CF), either with or without Orthogonal Frequency Division Multiplexing (OFDM) modulation, but with the same root-mean-square (RMS) power. The two probes do not provide sufficiently accurate results for deterministic signals such as Worldwide Interoperability for Microwave Access (WiMAX) or Long Term Evolution (LTE), or for non-deterministic signals such as Wireless Fidelity (WiFi). The legacy measurement protocols should be adapted to cope with the emerging wireless communication technologies based on the OFDM modulation scheme. This is not easily achieved except when the statistics of the RF emission are well known. In this case the measurement errors are shown to be systematic, and a correction factor or calibration can be applied to obtain a good approximation of the total RMS power.
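
    As a toy illustration of why the signal statistics matter (not drawn from the paper): for a simple on/off burst of known duty cycle, a peak-biased field reading can be corrected to an RMS value because power scales with the duty cycle:

      # Toy example: correcting a peak-biased E-field reading to an RMS value
      # for a burst signal with known duty cycle. Numbers are assumptions.
      import math

      def rms_from_peak(e_peak_v_per_m, duty_cycle):
          """RMS field of an on/off burst: power scales with duty cycle,
          so the field scales with its square root."""
          return e_peak_v_per_m * math.sqrt(duty_cycle)

      reading = 6.0  # V/m, probe responding to the burst peak (assumed)
      for dc in (1.0, 0.5, 0.1):
          print(f"duty cycle {dc:>4}: RMS ~ {rms_from_peak(reading, dc):.2f} V/m")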

  11. Accurate Detection of Dysmorphic Nuclei Using Dynamic Programming and Supervised Classification.

    PubMed

    Verschuuren, Marlies; De Vylder, Jonas; Catrysse, Hannes; Robijns, Joke; Philips, Wilfried; De Vos, Winnok H

    2017-01-01

    A vast array of pathologies is typified by the presence of nuclei with an abnormal morphology. Dysmorphic nuclear phenotypes feature dramatic size changes or foldings, but also entail much subtler deviations such as nuclear protrusions called blebs. Due to their unpredictable size, shape and intensity, dysmorphic nuclei are often not accurately detected in standard image analysis routines. To enable accurate detection of dysmorphic nuclei in confocal and widefield fluorescence microscopy images, we have developed an automated segmentation algorithm, called Blebbed Nuclei Detector (BleND), which relies on two-pass thresholding for initial nuclear contour detection, and an optimal path finding algorithm, based on dynamic programming, for refining these contours. Using a robust error metric, we show that our method matches manual segmentation in terms of precision and outperforms state-of-the-art nuclear segmentation methods. Its high performance allowed for building and integrating a robust classifier that recognizes dysmorphic nuclei with an accuracy above 95%. The combined segmentation-classification routine is bound to facilitate nucleus-based diagnostics and enable real-time recognition of dysmorphic nuclei in intelligent microscopy workflows.
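
    A rough sketch of the two-pass thresholding idea, a coarse global threshold followed by per-object local re-thresholding, is shown below. It is inspired by, but does not reproduce, the BleND pipeline; the image is synthetic and scikit-image is assumed available:

      # Sketch: two-pass thresholding for object contour detection.
      # Pass 1 proposes objects globally; pass 2 re-thresholds each object
      # locally. Synthetic image; not the published BleND algorithm.
      import numpy as np
      from skimage.filters import threshold_otsu
      from skimage.measure import label, regionprops

      img = np.zeros((64, 64))
      img[20:40, 20:40] = 1.0                        # one bright "nucleus"
      img += 0.1 * np.random.default_rng(0).random(img.shape)

      mask1 = img > threshold_otsu(img)              # pass 1: global threshold
      refined = np.zeros_like(mask1)
      for region in regionprops(label(mask1)):
          r0, c0, r1, c1 = region.bbox               # pass 2: local threshold
          patch = img[r0:r1, c0:c1]
          refined[r0:r1, c0:c1] |= patch > threshold_otsu(patch)
      print("pixels pass 1:", mask1.sum(), "pass 2:", refined.sum())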

  12. GPD: a graph pattern diffusion kernel for accurate graph classification with applications in cheminformatics.

    PubMed

    Smalter, Aaron; Huan, Jun Luke; Jia, Yi; Lushington, Gerald

    2010-01-01

    Graph data mining is an active research area. Graphs are general modeling tools to organize information from heterogeneous sources and have been applied in many scientific, engineering, and business fields. With the fast accumulation of graph data, building highly accurate predictive models for graph data emerges as a new challenge that has not been fully explored in the data mining community. In this paper, we demonstrate a novel technique called graph pattern diffusion (GPD) kernel. Our idea is to leverage existing frequent pattern discovery methods and to explore the application of kernel classifier (e.g., support vector machine) in building highly accurate graph classification. In our method, we first identify all frequent patterns from a graph database. We then map subgraphs to graphs in the graph database and use a process we call "pattern diffusion" to label nodes in the graphs. Finally, we designed a graph alignment algorithm to compute the inner product of two graphs. We have tested our algorithm using a number of chemical structure data. The experimental results demonstrate that our method is significantly better than competing methods such as those kernel functions based on paths, cycles, and subgraphs.

  13. Accurate Detection of Dysmorphic Nuclei Using Dynamic Programming and Supervised Classification

    PubMed Central

    Verschuuren, Marlies; De Vylder, Jonas; Catrysse, Hannes; Robijns, Joke; Philips, Wilfried

    2017-01-01

    A vast array of pathologies is typified by the presence of nuclei with an abnormal morphology. Dysmorphic nuclear phenotypes feature dramatic size changes or foldings, but also entail much subtler deviations such as nuclear protrusions called blebs. Due to their unpredictable size, shape and intensity, dysmorphic nuclei are often not accurately detected in standard image analysis routines. To enable accurate detection of dysmorphic nuclei in confocal and widefield fluorescence microscopy images, we have developed an automated segmentation algorithm, called Blebbed Nuclei Detector (BleND), which relies on two-pass thresholding for initial nuclear contour detection, and an optimal path finding algorithm, based on dynamic programming, for refining these contours. Using a robust error metric, we show that our method matches manual segmentation in terms of precision and outperforms state-of-the-art nuclear segmentation methods. Its high performance allowed for building and integrating a robust classifier that recognizes dysmorphic nuclei with an accuracy above 95%. The combined segmentation-classification routine is bound to facilitate nucleus-based diagnostics and enable real-time recognition of dysmorphic nuclei in intelligent microscopy workflows. PMID:28125723

  14. Sunlight exposure assessment: can we accurately assess vitamin D exposure from sunlight questionnaires?

    PubMed

    McCarty, Catherine A

    2008-04-01

    The purpose of this review is to summarize the peer-reviewed literature in relation to sunlight exposure assessment and the validity of using sunlight exposure questionnaires to quantify vitamin D status. There is greater variability in personal ultraviolet (UV) light exposure as the result of personal behavior than as the result of ambient UV light exposure. Although statistically significant, the correlation coefficients for the relation between personal report of sun exposure and ambient UV light measured by dosimetry (assessment of radiation dose) are relatively low. Moreover, the few studies to assess the relation between sunlight measures and serum 25-hydroxyvitamin D show low correlations. These low correlations may not be surprising given that personal factors like melanin content in skin and age also influence cutaneous synthesis of vitamin D. In summary, sunlight exposure questionnaires currently provide imprecise estimates of vitamin D status. Research should be directed to develop more objective, nonintrusive, and economical measures of sunlight exposure to quantify personal vitamin D status.

  15. Novel serologic biomarkers provide accurate estimates of recent Plasmodium falciparum exposure for individuals and communities.

    PubMed

    Helb, Danica A; Tetteh, Kevin K A; Felgner, Philip L; Skinner, Jeff; Hubbard, Alan; Arinaitwe, Emmanuel; Mayanja-Kizza, Harriet; Ssewanyana, Isaac; Kamya, Moses R; Beeson, James G; Tappero, Jordan; Smith, David L; Crompton, Peter D; Rosenthal, Philip J; Dorsey, Grant; Drakeley, Christopher J; Greenhouse, Bryan

    2015-08-11

    Tools to reliably measure Plasmodium falciparum (Pf) exposure in individuals and communities are needed to guide and evaluate malaria control interventions. Serologic assays can potentially produce precise exposure estimates at low cost; however, current approaches based on responses to a few characterized antigens are not designed to estimate exposure in individuals. Pf-specific antibody responses differ by antigen, suggesting that selection of antigens with defined kinetic profiles will improve estimates of Pf exposure. To identify novel serologic biomarkers of malaria exposure, we evaluated responses to 856 Pf antigens by protein microarray in 186 Ugandan children, for whom detailed Pf exposure data were available. Using data-adaptive statistical methods, we identified combinations of antibody responses that maximized information on an individual's recent exposure. Responses to three novel Pf antigens accurately classified whether an individual had been infected within the last 30, 90, or 365 d (cross-validated area under the curve = 0.86-0.93), whereas responses to six antigens accurately estimated an individual's malaria incidence in the prior year. Cross-validated incidence predictions for individuals in different communities provided accurate stratification of exposure between populations and suggest that precise estimates of community exposure can be obtained from sampling a small subset of that community. In addition, serologic incidence predictions from cross-sectional samples characterized heterogeneity within a community similarly to 1 y of continuous passive surveillance. Development of simple ELISA-based assays derived from the successful selection strategy outlined here offers the potential to generate rich epidemiologic surveillance data that will be widely accessible to malaria control programs.

  16. Photometric brown-dwarf classification. I. A method to identify and accurately classify large samples of brown dwarfs without spectroscopy

    NASA Astrophysics Data System (ADS)

    Skrzypek, N.; Warren, S. J.; Faherty, J. K.; Mortlock, D. J.; Burgasser, A. J.; Hewett, P. C.

    2015-02-01

    Aims: We present a method, named photo-type, to identify and accurately classify L and T dwarfs onto the standard spectral classification system using photometry alone. This enables the creation of large and deep homogeneous samples of these objects efficiently, without the need for spectroscopy. Methods: We created a catalogue of point sources with photometry in 8 bands, ranging from 0.75 to 4.6 μm, selected from an area of 3344 deg², by combining SDSS, UKIDSS LAS, and WISE data. Sources with 13.0 < J < 17.5 and J - K > 0.8 were then classified by comparison against template colours of quasars, stars, and brown dwarfs. The L and T templates, spectral types L0 to T8, were created by identifying previously known sources with spectroscopic classifications, and fitting polynomial relations between colour and spectral type. Results: Of the 192 known L and T dwarfs with reliable photometry in the surveyed area and magnitude range, 189 are recovered by our selection and classification method. We have quantified the accuracy of the classification method both externally, with spectroscopy, and internally, by creating synthetic catalogues and accounting for the uncertainties. We find that, brighter than J = 17.5, photo-type classifications are accurate to one spectral sub-type, and are therefore competitive with spectroscopic classifications. The resultant catalogue of 1157 L and T dwarfs will be presented in a companion paper.
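
    The classification step can be illustrated as a chi-square match of observed colours against per-type template colours; the template values, colour indices, and errors below are invented placeholders, not the paper's calibration:

      # Sketch: template-based "photo-type" classification. Pick the spectral
      # type whose template colours minimize chi^2 against the observations.
      # Template colours and errors below are made-up placeholders.
      import numpy as np

      types = ["L0", "L5", "T0", "T5"]
      # Hypothetical template colours per type (rows) in 3 colour indices.
      templates = np.array([[1.2, 0.8, 0.3],
                            [1.6, 1.1, 0.5],
                            [1.3, 0.2, -0.1],
                            [0.4, -0.3, -0.4]])

      def classify(colours, errors):
          chi2 = (((colours - templates) / errors) ** 2).sum(axis=1)
          return types[int(np.argmin(chi2))], chi2.min()

      obs = np.array([1.55, 1.05, 0.45])
      err = np.array([0.08, 0.10, 0.12])
      print(classify(obs, err))   # -> ('L5', ...)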

  17. A novel sulfur mustard (HD) vapor inhalation exposure system for accurate inhaled dose delivery

    PubMed Central

    Perry, Mark R.; Benson, Eric M.; Kohne, Jonathon W.; Plahovinsak, Jennifer L.; Babin, Michael C.; Platoff, Gennady E.; Yeung, David T.

    2014-01-01

    Introduction A custom designed HD exposure system was used to deliver controlled inhaled doses to an animal model through an endotracheal tube. Methods Target HD vapor challenges were generated by a temperature controlled bubbler/aerosol trap, while concentration was monitored near real-time by gas chromatography. Animal breathing parameters were monitored real-time by an in-line pneumotach, pressure transducer, and Buxco pulmonary analysis computer/software. For each exposure, the challenge atmosphere was allowed to stabilize at the desired concentration while the anesthetized animal was provided humidity controlled clean air. Once the target concentration was achieved and stable, a portion of the challenge atmosphere was drawn past the endotracheal tube, where the animal inhaled the exposure ad libitum. During the exposure, HD vapor concentration and animal weight were used to calculate the needed inhaled volume to achieve the target inhaled dose (μg/kg). The exposures were halted when the inhaled volume was achieved. Results The exposure system successfully controlled HD concentrations from 22.2 to 278 mg/m3 and accurately delivered inhaled doses between 49.3 and 1120 μg/kg with actual administered doses being within 4% of the target level. Discussion This exposure system administers specific HD inhaled doses to evaluate physiological effects and for evaluation of potential medical countermeasure treatments. PMID:25291290
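
    The dosing arithmetic the abstract alludes to is simple: the inhaled volume needed to reach a target dose follows from the chamber concentration and the animal's weight. The numbers below are invented:

      # Inhaled-volume arithmetic for a target dose; example values invented.
      def required_inhaled_volume_L(target_dose_ug_per_kg, weight_kg, conc_mg_per_m3):
          total_dose_ug = target_dose_ug_per_kg * weight_kg
          conc_ug_per_L = conc_mg_per_m3      # 1 mg/m^3 equals 1 ug/L
          return total_dose_ug / conc_ug_per_L

      # e.g., 100 ug/kg target, 10 kg animal, 50 mg/m^3 chamber concentration
      print(required_inhaled_volume_L(100, 10, 50), "L")   # -> 20.0 L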

  18. IDENTIFICATION OF TIME-INTEGRATED SAMPLING AND MEASUREMENT TECHNIQUES TO SUPPORT HUMAN EXPOSURE STUDIES

    EPA Science Inventory

    Accurate exposure classification tools are required to link exposure with health effects in epidemiological studies. Long-term, time-integrated exposure measures would be desirable to address the problem of developing appropriate residential childhood exposure classifications. ...

  19. Classification algorithms with multi-modal data fusion could accurately distinguish neuromyelitis optica from multiple sclerosis.

    PubMed

    Eshaghi, Arman; Riyahi-Alam, Sadjad; Saeedi, Roghayyeh; Roostaei, Tina; Nazeri, Arash; Aghsaei, Aida; Doosti, Rozita; Ganjgahi, Habib; Bodini, Benedetta; Shakourirad, Ali; Pakravan, Manijeh; Ghana'ati, Hossein; Firouznia, Kavous; Zarei, Mojtaba; Azimi, Amir Reza; Sahraian, Mohammad Ali

    2015-01-01

    Neuromyelitis optica (NMO) exhibits substantial similarities to multiple sclerosis (MS) in clinical manifestations and imaging results and has long been considered a variant of MS. With the advent of a specific biomarker in NMO, known as anti-aquaporin 4, this assumption has changed; however, the differential diagnosis remains challenging and it is still not clear whether a combination of neuroimaging and clinical data could be used to aid clinical decision-making. Computer-aided diagnosis is a rapidly evolving process that holds great promise to facilitate objective differential diagnoses of disorders that show similar presentations. In this study, we aimed to use a powerful method for multi-modal data fusion, known as multi-kernel learning, and performed automatic diagnosis of subjects. We included 30 patients with NMO, 25 patients with MS and 35 healthy volunteers and performed multi-modal imaging with T1-weighted high resolution scans, diffusion tensor imaging (DTI) and resting-state functional MRI (fMRI). In addition, subjects underwent clinical examinations and cognitive assessments. We included 18 a priori predictors from neuroimaging, clinical and cognitive measures in the initial model. We used 10-fold cross-validation to learn the importance of each modality, train and finally test the model performance. The mean accuracy in differentiating between MS and NMO was 88%, where visible white matter lesion load, normal appearing white matter (DTI) and functional connectivity had the most important contributions to the final classification. In a multi-class classification problem we distinguished between all 3 groups (MS, NMO and healthy controls) with an average accuracy of 84%. In this classification, visible white matter lesion load, functional connectivity, and cognitive scores were the 3 most important modalities. Our work provides preliminary evidence that computational tools can be used to help make an objective differential diagnosis of NMO and MS.
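
    A simplified illustration of kernel fusion: per-modality kernels combined with weights and passed to an SVM as a precomputed kernel. True multi-kernel learning optimizes the weights; here they are fixed assumptions, and the feature arrays are random placeholders:

      # Sketch: weighted-sum kernel fusion fed to a precomputed-kernel SVM.
      # Modality features, labels, and kernel weights are placeholders.
      import numpy as np
      from sklearn.svm import SVC
      from sklearn.metrics.pairwise import rbf_kernel

      rng = np.random.default_rng(4)
      n = 90
      mri = rng.normal(size=(n, 30))   # e.g., lesion-load features (placeholder)
      dti = rng.normal(size=(n, 20))   # DTI features (placeholder)
      y = rng.integers(0, 2, size=n)   # NMO vs MS (placeholder labels)

      w = (0.6, 0.4)                   # fixed modality weights (assumption)
      K = w[0] * rbf_kernel(mri) + w[1] * rbf_kernel(dti)
      clf = SVC(kernel="precomputed").fit(K, y)
      print("train acc:", clf.score(K, y))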

  20. Number of signatures necessary for accurate classification. [for multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Richardson, W.; Pentland, A.; Crane, R.; Horwitz, H.

    1976-01-01

    This paper presents a procedure for determining the number of signatures to use in classifying multispectral scanner data. A large initial set of signatures is obtained by clustering the training points within each category (such as 'wheat' or 'other') to be recognized. These clusters are then combined into broader signatures by a program that considers each pair of signatures within a category, combines the best pair in the light of certain criteria, saves the combined signature and repeats the procedure until there is one signature for each category. The result is a collection of sets of signatures, one set for each number between the number of initial clusters and the number of categories. With the aid of statistics such as an estimate of the probability of misclassification between categories, the user can choose the smallest set satisfying his requirements for classification accuracy.
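
    The iterative pairwise-merging idea can be sketched as follows; the distance measure (plain Euclidean between cluster means) and the data are simplifying assumptions, not the authors' combining criteria:

      # Sketch: repeatedly merge the closest pair of cluster signatures within
      # a category until one remains, recording each intermediate set.
      import numpy as np

      def merge_signatures(means, counts):
          """means: list of 1-D arrays; counts: samples per cluster."""
          means, counts = list(means), list(counts)
          sets = [list(means)]
          while len(means) > 1:
              # closest pair of signatures (Euclidean, for simplicity)
              d = {(i, j): np.linalg.norm(means[i] - means[j])
                   for i in range(len(means)) for j in range(i + 1, len(means))}
              i, j = min(d, key=d.get)
              ci, cj = counts[i], counts[j]
              merged = (ci * means[i] + cj * means[j]) / (ci + cj)
              for k in sorted((i, j), reverse=True):
                  del means[k], counts[k]
              means.append(merged)
              counts.append(ci + cj)
              sets.append(list(means))
          return sets   # one signature set per cardinality

      sigs = [np.array([0., 1.]), np.array([0.2, 1.1]), np.array([5., 5.])]
      for s in merge_signatures(sigs, [10, 12, 8]):
          print(len(s), "signatures")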

  1. The International Classification of Headache Disorders: accurate diagnosis of orofacial pain?

    PubMed

    Benoliel, R; Birman, N; Eliav, E; Sharav, Y

    2008-07-01

    The aim was to apply diagnostic criteria, as published by the International Headache Society (IHS), to the diagnosis of orofacial pain. Data on a total of 328 consecutive patients with orofacial pain were collected over a period of 2 years. The orofacial pain clinic routinely employs criteria published by the IHS, the American Academy of Orofacial Pain (AAOP) and the Research Diagnostic Criteria for Temporomandibular Disorders (RDCTMD). Employing IHS criteria, 184 patients were successfully diagnosed (56%), including 34 with persistent idiopathic facial pain. In the remaining 144 we applied AAOP/RDCTMD criteria and diagnosed 120 as masticatory myofascial pain (MMP), resulting in a diagnostic efficiency of 92.7% (304/328) when applying the three classifications (IHS, AAOP, RDCTMD). Employing further published criteria, 23 patients were diagnosed as neurovascular orofacial pain (NVOP, facial migraine) and one as a neuropathy secondary to connective tissue disease. All the patients were therefore allocated to predefined diagnoses. MMP is clearly defined by AAOP and the RDCTMD. However, NVOP is not defined by any of the above classification systems. The features of MMP and NVOP are presented and analysed with calculations for positive (PPV) and negative predictive values (NPV). In MMP the combination of facial pain aggravated by jaw movement, and the presence of three or more tender muscles, resulted in a PPV = 0.82 and an NPV = 0.86. For NVOP the combination of facial pain, throbbing quality, autonomic and/or systemic features and attack duration of > 60 min gave a PPV = 0.71 and an NPV = 0.95. Expansion of the IHS system is needed so as to integrate more orofacial pain syndromes.
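
    As a reminder of how the reported predictive values are defined, the helper below computes them from a 2x2 diagnostic table; the counts are invented so the outputs match the MMP figures, they are not the study's data:

      # Definitions of the predictive values; counts below are illustrative.
      def predictive_values(tp, fp, fn, tn):
          ppv = tp / (tp + fp)   # P(condition | test positive)
          npv = tn / (tn + fn)   # P(no condition | test negative)
          return ppv, npv

      print(predictive_values(tp=82, fp=18, fn=14, tn=86))  # -> (0.82, 0.86)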

  2. Accurate determination of screw position in treating fifth metatarsal base fractures to shorten radiation exposure time

    PubMed Central

    Wang, Xu; Zhang, Chao; Wang, Chen; Huang, Jia Zhang; Ma, Xin

    2016-01-01

    INTRODUCTION Anatomical markers can help to guide lag screw placement during surgery for internal fixation of fifth metatarsal base fractures. This study aimed to identify the optimal anatomical markers and thus reduce radiation exposure. METHODS A total of 50 patients in Huashan Hospital, Shanghai, China, who underwent oblique foot radiography in the lateral position were randomly selected. The angles between the fifth metatarsal axis and cuboid articular surface were measured to determine the optimal lag screw placement relative to anatomical markers. RESULTS The line connecting the styloid process of the fifth metatarsal base with the second metatarsophalangeal (MTP) joint intersected with the fifth metatarsal base fracture line at an angle of 86.85° ± 5.44°. The line connecting the fifth metatarsal base styloid with the third and fourth MTP joints intersected with the fracture line at angles of 93.28° ± 5.24° and 100.95° ± 5.00°, respectively. The proximal articular surface of the fifth metatarsal base intersected with the line connecting the styloid process of the fifth metatarsal base with the second, third and fourth MTP joints at angles of 24.02° ± 4.77°, 30.79° ± 4.53° and 38.08° ± 4.54°, respectively. CONCLUSION The fifth metatarsal base styloid and third MTP joint can be used as anatomical markers for lag screw placement in fractures involving the fifth tarsometatarsal joint. The connection line, which is normally perpendicular to the fracture line, provides sufficient mechanical stability to facilitate accurate screw placement. The use of these anatomical markers could help to reduce unnecessary radiation exposure for patients and medical staff. PMID:26767892

  3. A High Resolution/Accurate Mass (HRAM) Data-Dependent MS3 Neutral Loss Screening, Classification, and Relative Quantitation Methodology for Carbonyl Compounds in Saliva

    NASA Astrophysics Data System (ADS)

    Dator, Romel; Carrà, Andrea; Maertens, Laura; Guidolin, Valeria; Villalta, Peter W.; Balbo, Silvia

    2016-10-01

    Reactive carbonyl compounds (RCCs) are ubiquitous in the environment and are generated endogenously as a result of various physiological and pathological processes. These compounds can react with biological molecules, inducing deleterious processes believed to be at the basis of their toxic effects. Several of these compounds are implicated in neurotoxic processes, aging disorders, and cancer. Therefore, a method characterizing exposures to these chemicals will provide insights into how they may influence overall health and contribute to disease pathogenesis. Here, we have developed a high resolution accurate mass (HRAM) screening strategy allowing simultaneous identification and relative quantitation of DNPH-derivatized carbonyls in human biological fluids. The screening strategy involves the diagnostic neutral loss of hydroxyl radical triggering MS3 fragmentation, which is only observed in positive ionization mode of DNPH-derivatized carbonyls. Unique fragmentation pathways were used to develop a classification scheme for characterizing known and unanticipated/unknown carbonyl compounds present in saliva. Furthermore, a relative quantitation strategy was implemented to assess variations in the levels of carbonyl compounds before and after exposure using deuterated d3-DNPH. This relative quantitation method was tested on human samples before and after exposure to specific amounts of alcohol. Nano-electrospray ionization (nano-ESI) in positive mode afforded excellent sensitivity, with on-column detection limits at high-attomole levels. To the best of our knowledge, this is the first report of a method using HRAM neutral loss screening of carbonyl compounds. In addition, the method allows simultaneous characterization and relative quantitation of DNPH-derivatized compounds using nano-ESI in positive mode.

  4. Response to “Accurate Risk-Based Chemical Screening Relies on Robust Exposure Estimates”

    EPA Science Inventory

    This is a correspondence (letter to the editor) with reference to comments by Rudel and Perovich on the article "Integration of Dosimetry, Exposure, and High-Throughput Screening Data in Chemical Toxicity Assessment". Article Reference: SI # 238882

  5. Evaluation of air quality zone classification methods based on ambient air concentration exposure.

    PubMed

    Freeman, Brian; McBean, Ed; Gharabaghi, Bahram; Thé, Jesse

    2017-05-01

    Air quality zones are used by regulatory authorities to implement ambient air standards in order to protect human health. Air quality measurements at discrete air monitoring stations are critical tools to determine whether an air quality zone complies with local air quality standards or is noncompliant. This study presents a novel approach for evaluation of air quality zone classification methods by breaking the concentration distribution of a pollutant measured at an air monitoring station into compliance and exceedance probability density functions (PDFs) and then using Monte Carlo analysis with the Central Limit Theorem to estimate long-term exposure. The purpose of this paper is to compare the risk associated with selecting one ambient air classification approach over another by testing the possible exposure an individual living within a zone may face. The chronic daily intake (CDI) is utilized to compare different pollutant exposures over the classification duration of 3 years between two classification methods. Historical data collected from air monitoring stations in Kuwait are used to build representative models of 1-hr NO2 and 8-hr O3 within a zone that meets the compliance requirements of each method. The first method, the "3 Strike" method, is a conservative winner-take-all approach common to most compliance classification methods, while the second, the 99% Rule method, allows for more robust analyses and incorporates long-term trends. A Monte Carlo analysis is used to model the CDI for each pollutant and each method with the zone at a single station and with multiple stations. The model assumes that the zone is already in compliance with air quality standards over the 3 years under the different classification methodologies. The model shows that while the CDI of the two methods differs by 2.7% over the exposure period for the single station case, the large number of samples taken over the duration period impacts the sensitivity of...
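
    A hedged sketch of the Monte Carlo idea: sample daily concentrations from a two-part (compliance/exceedance) model, average over the 3-year period, and convert to a chronic daily intake. All distributions and intake parameters are assumptions:

      # Monte Carlo sketch of a chronic daily intake (CDI) estimate from a
      # two-part concentration model. All parameters are assumptions.
      import numpy as np

      rng = np.random.default_rng(1)
      N, DAYS = 2_000, 3 * 365          # trials, 3-year classification period

      def daily_conc(n):
          """Mix of a compliance PDF and a rarer exceedance PDF (lognormals)."""
          exceed = rng.random(n) < 0.02
          c = rng.lognormal(mean=3.0, sigma=0.4, size=n)   # compliant days
          c[exceed] = rng.lognormal(mean=4.2, sigma=0.3, size=exceed.sum())
          return c  # ug/m^3

      IR, BW = 20.0, 70.0               # inhalation m^3/day, body weight kg
      cdi = np.array([daily_conc(DAYS).mean() for _ in range(N)]) * IR / BW
      print(f"CDI mean {cdi.mean():.1f}, "
            f"95th pct {np.percentile(cdi, 95):.1f} ug/kg-day")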

  6. Classification

    ERIC Educational Resources Information Center

    Clary, Renee; Wandersee, James

    2013-01-01

    In this article, Renee Clary and James Wandersee describe the beginnings of "Classification," which lies at the very heart of science and depends upon pattern recognition. Clary and Wandersee approach patterns by first telling the story of the "Linnaean classification system," introduced by Carl Linnaeus (1707-1778), who is…

  7. A non-contact method based on multiple signal classification algorithm to reduce the measurement time for accurately heart rate detection.

    PubMed

    Bechet, P; Mitran, R; Munteanu, M

    2013-08-01

    Non-contact methods for the assessment of vital signs are of great interest for specialists due to the benefits obtained in both medical and special applications, such as those for surveillance, monitoring, and search and rescue. This paper investigates the possibility of implementing a digital processing algorithm based on the MUSIC (Multiple Signal Classification) parametric spectral estimation in order to reduce the observation time needed to accurately measure the heart rate. It demonstrates that, by proper dimensioning the signal subspace, the MUSIC algorithm can be optimized in order to accurately assess the heart rate during an 8-28 s time interval. The validation of the processing algorithm performance was achieved by minimizing the mean error of the heart rate after performing simultaneous comparative measurements on several subjects. In order to calculate the error the reference value of heart rate was measured using a classic measurement system through direct contact.
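
    A compact sketch of MUSIC-based frequency estimation applied to a synthetic pulse signal; the window length, model order, and signal are assumptions rather than the paper's settings:

      # Sketch: MUSIC frequency estimation recovering a heart-rate tone from
      # a short noisy record. Signal and dimensions are placeholders.
      import numpy as np

      rng = np.random.default_rng(0)
      fs, T = 50.0, 10.0                    # sample rate (Hz), 10 s record
      t = np.arange(0, T, 1 / fs)
      f_heart = 1.2                         # 72 bpm, synthetic target
      x = np.sin(2 * np.pi * f_heart * t) + 0.5 * rng.standard_normal(t.size)

      M, P = 40, 2                          # covariance order, signal-subspace dim
      snaps = np.array([x[i:i + M] for i in range(x.size - M)])
      R = snaps.T @ snaps / snaps.shape[0]  # sample covariance matrix

      w, V = np.linalg.eigh(R)              # eigenvalues in ascending order
      En = V[:, : M - P]                    # noise subspace

      freqs = np.linspace(0.5, 3.0, 500)    # 30-180 bpm search band
      k = np.arange(M)
      pseudo = [1.0 / np.linalg.norm(En.conj().T
                @ np.exp(-2j * np.pi * f / fs * k)) ** 2 for f in freqs]
      print("estimated heart rate: %.1f bpm"
            % (freqs[int(np.argmax(pseudo))] * 60))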

  8. A non-contact method based on multiple signal classification algorithm to reduce the measurement time for accurately heart rate detection

    NASA Astrophysics Data System (ADS)

    Bechet, P.; Mitran, R.; Munteanu, M.

    2013-08-01

    Non-contact methods for the assessment of vital signs are of great interest for specialists due to the benefits obtained in both medical and special applications, such as those for surveillance, monitoring, and search and rescue. This paper investigates the possibility of implementing a digital processing algorithm based on the MUSIC (Multiple Signal Classification) parametric spectral estimation in order to reduce the observation time needed to accurately measure the heart rate. It demonstrates that, by proper dimensioning the signal subspace, the MUSIC algorithm can be optimized in order to accurately assess the heart rate during an 8-28 s time interval. The validation of the processing algorithm performance was achieved by minimizing the mean error of the heart rate after performing simultaneous comparative measurements on several subjects. In order to calculate the error the reference value of heart rate was measured using a classic measurement system through direct contact.

  9. Exposure assessment within a Total Diet Study: a comparison of the use of the pan-European classification system FoodEx-1 with national food classification systems.

    PubMed

    Akhandaf, Y; Van Klaveren, J; De Henauw, S; Van Donkersgoed, G; Van Gorcum, T; Papadopoulos, A; Sirot, V; Kennedy, M; Pinchen, H; Ruprich, J; Rehurkova, I; Perelló, G; Sioen, I

    2015-04-01

    A Total Diet Study (TDS) consists of selecting, collecting and preparing commonly consumed foods purchased at retail level and analysing them for harmful and/or beneficial chemical substances. A food classification system is needed to link food consumption data with the contaminant concentration data obtained in the TDS for the exposure assessment. In this study a comparison was made between the use of national food classification systems and the use of FoodEx-1, developed and recommended by the European Food Safety Authority (EFSA). The work was performed using data from six European countries: Belgium, Czech Republic, France, The Netherlands, Spain and the UK. For each population, exposure to contaminant A (organic compounds) and/or contaminant B (inorganic compound) was assessed by the Monte Carlo Risk Assessment (MCRA) software using the national classification system and FoodEx-1 for food consumption data and for TDS laboratory results. Minimal differences between the two approaches were observed, for both contaminant A and contaminant B. In general, risk assessment will be similar for both approaches; however, this is not guaranteed. FoodEx-1 proved to be a valuable hierarchical classification system for harmonising exposure assessment based on existing TDS results throughout Europe.

  10. Solar ultraviolet and the occupational radiant exposure of Queensland school teachers: A comparative study between teaching classifications and behavior patterns.

    PubMed

    Downs, Nathan J; Harrison, Simone L; Chavez, Daniel R Garzon; Parisi, Alfio V

    2016-05-01

    Classroom teachers located in Queensland, Australia, are exposed to high levels of ambient solar ultraviolet as part of the occupational requirement to provide supervision of children during lunch and break times. We investigated the relationship between periods of outdoor occupational radiant exposure and available ambient solar radiation across different teaching classifications and schools relative to the daily occupational solar ultraviolet radiation (HICNIRP) protection standard of 30 J/m². Self-reported daily sun exposure habits (n=480) and personal radiant exposures were monitored using calibrated polysulphone dosimeters (n=474) in 57 teaching staff from 6 different schools located in tropical north and southern Queensland. Daily radiant exposure patterns among teaching groups were compared to the ambient UV-Index. Personal sun exposures were stratified among teaching classifications, school location, school ownership (government vs non-government), and type (primary vs secondary). Median daily radiant exposures were 15 J/m² and 5 J/m² HICNIRP for schools located in northern and southern Queensland, respectively. Of the 474 analyzed dosimeter-days, 23.0% were found to exceed the solar radiation protection standard, with the highest prevalence found among physical education teachers (57.4% of dosimeter-days), followed by teacher aides (22.6% of dosimeter-days) and classroom teachers (18.1% of dosimeter-days). In Queensland, peak outdoor exposure times of teaching staff correspond with periods of extreme UV-Index. The daily occupational HICNIRP radiant exposure standard was exceeded in all schools and in all teaching classifications.

  11. MetaShot: an accurate workflow for taxon classification of host-associated microbiome from shotgun metagenomic data.

    PubMed

    Fosso, B; Santamaria, M; D'Antonio, M; Lovero, D; Corrado, G; Vizza, E; Passaro, N; Garbuglia, A R; Capobianchi, M R; Crescenzi, M; Valiente, G; Pesole, G

    2017-01-27

    Shotgun metagenomics by high-throughput sequencing may allow deep and accurate characterization of host-associated total microbiomes, including bacteria, viruses, protists and fungi. However, the analysis of such sequencing data is still extremely challenging in terms of both overall accuracy and computational efficiency, and current methodologies show substantial variability in misclassification rate and resolution at lower taxonomic ranks or are limited to specific life domains (e.g. only bacteria). We present here MetaShot, a workflow for assessing the total microbiome composition from host-associated shotgun sequence data, and show its overall optimal accuracy performance by analyzing both simulated and real datasets.

  12. Fast, Simple and Accurate Handwritten Digit Classification by Training Shallow Neural Network Classifiers with the 'Extreme Learning Machine' Algorithm.

    PubMed

    McDonnell, Mark D; Tissera, Migel D; Vladusich, Tony; van Schaik, André; Tapson, Jonathan

    2015-01-01

    Recent advances in training deep (multi-layer) architectures have inspired a renaissance in neural network use. For example, deep convolutional networks are becoming the default option for difficult tasks on large datasets, such as image and speech recognition. However, here we show that error rates below 1% on the MNIST handwritten digit benchmark can be replicated with shallow non-convolutional neural networks. This is achieved by training such networks using the 'Extreme Learning Machine' (ELM) approach, which also enables a very rapid training time (~10 minutes). Adding distortions, as is common practice for MNIST, reduces error rates even further. Our methods are also shown to be capable of achieving less than 5.5% error rates on the NORB image database. To achieve these results, we introduce several enhancements to the standard ELM algorithm, which individually and in combination can significantly improve performance. The main innovation is to ensure each hidden-unit operates only on a randomly sized and positioned patch of each image. This form of random 'receptive field' sampling of the input ensures the input weight matrix is sparse, with about 90% of weights equal to zero. Furthermore, combining our methods with a small number of iterations of a single-batch backpropagation method can significantly reduce the number of hidden-units required to achieve a particular performance. Our close-to-state-of-the-art results for MNIST and NORB suggest that the ease of use and accuracy of the ELM algorithm for designing a single-hidden-layer neural network classifier should cause it to be given greater consideration either as a standalone method for simpler problems, or as the final classification stage in deep neural networks applied to more difficult problems.
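
    The core of the ELM approach is small enough to sketch directly: a fixed random hidden layer with output weights solved in closed form by least squares. The receptive-field sampling and distortions from the paper are omitted, and the data are toy placeholders:

      # Minimal Extreme Learning Machine: random hidden layer, output weights
      # solved by least squares. Toy data; not the paper's MNIST setup.
      import numpy as np

      def elm_train(X, Y, n_hidden=512, seed=0):
          rng = np.random.default_rng(seed)
          W = rng.normal(size=(X.shape[1], n_hidden))   # fixed random weights
          b = rng.normal(size=n_hidden)
          H = np.tanh(X @ W + b)                        # hidden activations
          beta, *_ = np.linalg.lstsq(H, Y, rcond=None)  # closed-form solve
          return W, b, beta

      def elm_predict(X, W, b, beta):
          return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

      # toy 2-class problem with one-hot targets
      rng = np.random.default_rng(1)
      X = np.vstack([rng.normal(-1, 1, (100, 20)), rng.normal(1, 1, (100, 20))])
      labels = np.repeat([0, 1], 100)
      W, b, beta = elm_train(X, np.eye(2)[labels])
      print("train acc:", (elm_predict(X, W, b, beta) == labels).mean())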

  13. Classification.

    PubMed

    Tuxhorn, Ingrid; Kotagal, Prakash

    2008-07-01

    In this article, we review the practical approach and diagnostic relevance of current seizure and epilepsy classification concepts and principles as a basic framework for good management of patients with epileptic seizures and epilepsy. Inaccurate generalizations about terminology, diagnosis, and treatment may be the single most important factor, next to an inadequately obtained history, that determines the misdiagnosis and mismanagement of patients with epilepsy. A stepwise signs and symptoms approach for diagnosis, evaluation, and management along the guidelines of the International League Against Epilepsy and definitions of epileptic seizures and epilepsy syndromes offers a state-of-the-art clinical approach to managing patients with epilepsy.

  14. Malingering in toxic exposure: classification accuracy of reliable digit span and WAIS-III Digit Span scaled scores.

    PubMed

    Greve, Kevin W; Springer, Steven; Bianchini, Kevin J; Black, F William; Heinly, Matthew T; Love, Jeffrey M; Swift, Douglas A; Ciota, Megan A

    2007-03-01

    This study examined the sensitivity and false-positive error rate of reliable digit span (RDS) and the WAIS-III Digit Span (DS) scaled score in persons alleging toxic exposure and determined whether error rates differed from published rates in traumatic brain injury (TBI) and chronic pain (CP). Data were obtained from the files of 123 persons referred for neuropsychological evaluation related to alleged exposure to environmental and industrial substances. Malingering status was determined using the criteria of Slick, Sherman, and Iverson (1999). The sensitivity and specificity of RDS and DS in toxic exposure are consistent with those observed in TBI and CP. These findings support the use of these malingering indicators in cases of alleged toxic exposure and suggest that the classification accuracy data of indicators derived from studies of TBI patients may also be validly applied to cases of alleged toxic exposure.
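
    Reliable Digit Span is conventionally computed as the longest forward span plus the longest backward span on which both trials are passed; a small helper (with invented trial data) makes the definition concrete:

      # Reliable Digit Span: longest forward span plus longest backward span
      # with both trials correct. Trial data below are invented.
      def reliable_digit_span(forward_trials, backward_trials):
          """Each argument maps span length -> (trial1_correct, trial2_correct)."""
          def longest(trials):
              spans = [n for n, (t1, t2) in trials.items() if t1 and t2]
              return max(spans, default=0)
          return longest(forward_trials) + longest(backward_trials)

      fwd = {4: (True, True), 5: (True, True), 6: (True, False)}
      bwd = {3: (True, True), 4: (False, True)}
      print(reliable_digit_span(fwd, bwd))  # -> 5 + 3 = 8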

  15. Classification

    NASA Technical Reports Server (NTRS)

    Oza, Nikunj C.

    2011-01-01

    A supervised learning task involves constructing a mapping from input data (normally described by several features) to the appropriate outputs. Within supervised learning, one type of task is a classification learning task, in which each output is one or more classes to which the input belongs. In supervised learning, a set of training examples (examples with known output values) is used by a learning algorithm to generate a model. This model is intended to approximate the mapping between the inputs and outputs. This model can be used to generate predicted outputs for inputs that have not been seen before. For example, we may have data consisting of observations of sunspots. In a classification learning task, our goal may be to learn to classify sunspots into one of several types. Each example may correspond to one candidate sunspot with various measurements or just an image. A learning algorithm would use the supplied examples to generate a model that approximates the mapping between each supplied set of measurements and the type of sunspot. This model can then be used to classify previously unseen sunspots based on the candidate's measurements. This chapter discusses methods to perform machine learning, with examples involving astronomy.
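
    In the spirit of the chapter's sunspot illustration, a minimal supervised-classification example; the features and labels are invented stand-ins:

      # Toy supervised classification: fit on labeled examples, predict unseen.
      from sklearn.tree import DecisionTreeClassifier

      # each row: [area, max_magnetic_field, n_umbrae] for one candidate sunspot
      X_train = [[120, 2500, 1], [300, 3200, 3], [80, 1800, 1], [260, 3000, 2]]
      y_train = ["simple", "complex", "simple", "complex"]   # sunspot type

      model = DecisionTreeClassifier().fit(X_train, y_train)
      print(model.predict([[150, 2700, 2]]))   # classify an unseen sunspot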

  16. Actual and Potential Radiation Exposures in Digital Radiology: Analysis of Cumulative Data, Implications to Worker Classification and Occupational Exposure Monitoring.

    PubMed

    Kortesniemi, Mika; Siiskonen, Teemu; Kelaranta, Anna; Lappalainen, Kimmo

    2016-04-21

    Radiation worker categorization and exposure monitoring are principal functions of occupational radiation safety. The aim of this study was to use the actual occupational exposure data in a large university hospital to estimate the frequency and magnitude of potential exposures in radiology. The additional aim was to propose a revised categorization and exposure monitoring practice based on the potential exposures. The cumulative probability distribution was calculated from the normalized integral of the probability density function fitted to the exposure data. Conformity of the probabilistic model was checked against 16 years of national monitoring data. The estimated probabilities to exceed annual effective dose limits of 1 mSv, 6 mSv and 20 mSv were 1:1000, 1:20 000 and 1:200 000, respectively. Thus, it is very unlikely that the class A categorization limit of 6 mSv could be exceeded, even in interventional procedures, with modern equipment and appropriate working methods. Therefore, all workers in diagnostic and interventional radiology could be systematically categorized into class B. Furthermore, current personal monitoring practice could be replaced by use of active personal dosemeters that offer more effective and flexible means to optimize working methods.
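
    The probabilistic machinery described, fitting a PDF to recorded doses and reading exceedance probabilities from its tail, can be sketched as follows; the lognormal choice and the synthetic doses are assumptions:

      # Sketch: fit a PDF to annual doses, read tail exceedance probabilities.
      # Distribution choice and the synthetic doses are assumptions.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      doses_mSv = rng.lognormal(mean=-2.5, sigma=1.0, size=5000)  # synthetic

      shape, loc, scale = stats.lognorm.fit(doses_mSv, floc=0)
      for limit in (1.0, 6.0, 20.0):
          p = stats.lognorm.sf(limit, shape, loc=loc, scale=scale)
          print(f"P(annual dose > {limit} mSv) = {p:.2e}")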

  17. Is the hair nicotine level a more accurate biomarker of environmental tobacco smoke exposure than urine cotinine?

    PubMed Central

    Al-Delaimy, W; Crane, J; Woodward, A

    2002-01-01

    Study objective: The aim of this study was to compare the two biomarkers of exposure to environmental tobacco smoke (ETS): urine cotinine and hair nicotine, using questionnaires as the standard. Design: A cross-sectional study of children consecutively admitted to hospital for lower respiratory illnesses during the period of the study. Settings: Three regional hospitals in the larger Wellington area, New Zealand. Participants: Children aged 3–27 months and admitted to the above hospitals during August 1997 to October 1998. A total of 322 children provided 297 hair samples and 158 urine samples. Main results: Hair nicotine levels were better able to discriminate the groups of children according to their household's smoking habits at home (no smokers, smoke only outside the home, smoke inside the house) than urine cotinine (Kruskal-Wallis; χ²=142.14 and χ²=49.5, respectively; p<0.0001). Furthermore, hair nicotine levels were more strongly correlated with the number of smokers in the house, and the number of cigarettes smoked by parents and other members of the child's household. Hair nicotine was better related to the questionnaire variables of smoking in a multivariate regression model (r²=0.55) than urine cotinine (r²=0.31). Conclusions: In this group of young children, hair nicotine was a more precise biomarker of exposure to ETS than urine cotinine levels, using questionnaire reports as the reference. Both biomarkers indicate that smoking outside the house limits ETS exposure of children but does not eliminate it. PMID:11801622

  18. A Stochastic Method for Balancing Item Exposure Rates in Computerized Classification Tests

    ERIC Educational Resources Information Center

    Huebner, Alan; Li, Zhushan

    2012-01-01

    Computerized classification tests (CCTs) classify examinees into categories such as pass/fail, master/nonmaster, and so on. This article proposes the use of stochastic methods from sequential analysis to address item overexposure, a practical concern in operational CCTs. Item overexposure is traditionally dealt with in CCTs by the Sympson-Hetter…

  19. Encoding negative events under stress: high subjective arousal is related to accurate emotional memory despite misinformation exposure.

    PubMed

    Hoscheidt, Siobhan M; LaBar, Kevin S; Ryan, Lee; Jacobs, W Jake; Nadel, Lynn

    2014-07-01

    Stress at encoding affects memory processes, typically enhancing, or preserving, memory for emotional information. These effects have interesting implications for eyewitness accounts, which in real-world contexts typically involve encoding an aversive event under stressful conditions followed by potential exposure to misinformation. The present study investigated memory for a negative event encoded under stress and subsequent misinformation endorsement. Healthy young adults participated in a between-groups design with three experimental sessions conducted 48 h apart. Session one consisted of a psychosocial stress induction (or control task) followed by incidental encoding of a negative slideshow. During session two, participants were asked questions about the slideshow, during which a random subgroup was exposed to misinformation. Memory for the slideshow was tested during the third session. Assessment of memory accuracy across stress and no-stress groups revealed that stress induced just prior to encoding led to significantly better memory for the slideshow overall. The classic misinformation effect was also observed: participants exposed to misinformation were significantly more likely to endorse false information during memory testing. In the stress group, however, memory accuracy and misinformation effects were moderated by arousal experienced during encoding of the negative event. Misinformed-stress group participants who reported that the negative slideshow elicited high arousal during encoding were less likely to endorse misinformation for the most aversive phase of the story. Furthermore, these individuals showed better memory for components of the aversive slideshow phase that had been directly misinformed. Results from the current study provide evidence that stress and high subjective arousal elicited by a negative event act concomitantly during encoding to enhance emotional memory such that the most aversive aspects of the event are well remembered and...

  20. Genomic Models of Short-Term Exposure Accurately Predict Long-Term Chemical Carcinogenicity and Identify Putative Mechanisms of Action

    PubMed Central

    Gusenleitner, Daniel; Auerbach, Scott S.; Melia, Tisha; Gómez, Harold F.; Sherr, David H.; Monti, Stefano

    2014-01-01

    Background Despite an overall decrease in incidence of and mortality from cancer, about 40% of Americans will be diagnosed with the disease in their lifetime, and around 20% will die of it. Current approaches to test carcinogenic chemicals adopt the 2-year rodent bioassay, which is costly and time-consuming. As a result, fewer than 2% of the chemicals on the market have actually been tested. However, evidence accumulated to date suggests that gene expression profiles from model organisms exposed to chemical compounds reflect underlying mechanisms of action, and that these toxicogenomic models could be used in the prediction of chemical carcinogenicity. Results In this study, we used a rat-based microarray dataset from the NTP DrugMatrix Database to test the ability of toxicogenomics to model carcinogenicity. We analyzed 1,221 gene-expression profiles obtained from rats treated with 127 well-characterized compounds, including genotoxic and non-genotoxic carcinogens. We built a classifier that predicts a chemical's carcinogenic potential with an AUC of 0.78, and validated it on an independent dataset from the Japanese Toxicogenomics Project consisting of 2,065 profiles from 72 compounds. Finally, we identified differentially expressed genes associated with chemical carcinogenesis, and developed novel data-driven approaches for the molecular characterization of the response to chemical stressors. Conclusion Here, we validate a toxicogenomic approach to predict carcinogenicity and provide strong evidence that, with a larger set of compounds, we should be able to improve the sensitivity and specificity of the predictions. We found that the prediction of carcinogenicity is tissue-dependent and that the results also confirm and expand upon previous studies implicating DNA damage, the peroxisome proliferator-activated receptor, the aryl hydrocarbon receptor, and regenerative pathology in the response to carcinogen exposure. PMID:25058030
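
    A hedged sketch of the train-then-validate pattern the abstract describes, with a generic random forest standing in for the paper's classifier and synthetic profiles in place of the DrugMatrix data:

    ```python
    # Train a stand-in classifier on synthetic "expression profiles" and
    # report AUC on an independent set, mirroring the workflow above.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(200, 50))    # profiles (synthetic)
    y_train = rng.integers(0, 2, size=200)  # carcinogen / non-carcinogen labels
    X_test  = rng.normal(size=(80, 50))     # independent validation set
    y_test  = rng.integers(0, 2, size=80)

    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
    print(f"validation AUC = {auc:.2f}")
    ```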

  1. A Single-Exposure Dual-Energy Computed Radiography Technique for Improved Nodule Detection and Classification in Chest Imaging

    NASA Astrophysics Data System (ADS)

    Zink, Frank Edward

    The detection and classification of pulmonary nodules is of great interest in chest radiography. Nodules are often indicative of primary cancer, and their detection is particularly important in asymptomatic patients. The ability to classify nodules as calcified or non-calcified is important because calcification is a positive indicator that a nodule is benign. Dual-energy methods offer the potential to improve both the detection and classification of nodules by allowing the formation of material-selective images. Tissue-selective images can improve detection by virtue of the elimination of obscuring rib structure. Bone-selective images are essentially calcium images, allowing classification of the nodule. A dual-energy technique is introduced which uses a computed radiography system to acquire dual-energy chest radiographs in a single exposure. All aspects of the dual-energy technique are described, with particular emphasis on the scatter-correction, beam-hardening correction, and noise-reduction algorithms. The adaptive noise-reduction algorithm employed improves the material-selective signal-to-noise ratio by up to a factor of seven with minimal sacrifice in selectivity. A clinical comparison study is described, undertaken to compare the dual-energy technique to conventional chest radiography for the tasks of nodule detection and classification. Observer performance data were collected using the Free Response Observer Characteristic (FROC) method and the bi-normal Alternative FROC (AFROC) performance model. Results of the comparison study, analyzed using two common multiple-observer statistical models, showed that the dual-energy technique was superior to conventional chest radiography for the detection of nodules at a statistically significant level (p < .05). Discussion of the comparison study emphasizes the unique combination of data collection and analysis techniques employed, as well as the limitations of comparison techniques in the larger context of technology
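
    Material-selective dual-energy images are commonly formed by weighted subtraction of log-transformed low- and high-energy acquisitions; a minimal sketch, with random stand-in images and an illustrative weight that would in practice be calibrated per system:

    ```python
    # Weighted log subtraction: choosing w to cancel bone yields a
    # tissue-selective image, and vice versa. Images and w are stand-ins.
    import numpy as np

    def selective_image(low_kvp, high_kvp, w):
        return np.log(high_kvp) - w * np.log(low_kvp)

    low  = np.random.uniform(0.2, 1.0, size=(256, 256))  # raw low-energy image
    high = np.random.uniform(0.2, 1.0, size=(256, 256))  # raw high-energy image
    tissue = selective_image(low, high, w=0.45)          # w = 0.45 is illustrative
    ```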

  2. Evidence that bisphenol A (BPA) can be accurately measured without contamination in human serum and urine, and that BPA causes numerous hazards from multiple routes of exposure.

    PubMed

    vom Saal, Frederick S; Welshons, Wade V

    2014-12-01

    There is extensive evidence that bisphenol A (BPA) is related to a wide range of adverse health effects based on both human and experimental animal studies. However, a number of regulatory agencies have ignored all hazard findings. Reports of high levels of unconjugated (bioactive) serum BPA in dozens of human biomonitoring studies have also been rejected based on the prediction that the findings are due to assay contamination and that virtually all ingested BPA is rapidly converted to inactive metabolites. NIH and industry-sponsored round robin studies have demonstrated that serum BPA can be accurately assayed without contamination, while the FDA lab has acknowledged uncontrolled assay contamination. In reviewing the published BPA biomonitoring data, we find that assay contamination is, in fact, well controlled in most labs, and cannot be used as the basis for discounting evidence that significant and virtually continuous exposure to BPA must be occurring from multiple sources.

  3. Evidence that bisphenol A (BPA) can be accurately measured without contamination in human serum and urine, and that BPA causes numerous hazards from multiple routes of exposure

    PubMed Central

    vom Saal, Frederick S.; Welshons, Wade V.

    2016-01-01

    There is extensive evidence that bisphenol A (BPA) is related to a wide range of adverse health effects based on both human and experimental animal studies. However, a number of regulatory agencies have ignored all hazard findings. Reports of high levels of unconjugated (bioactive) serum BPA in dozens of human biomonitoring studies have also been rejected based on the prediction that the findings are due to assay contamination and that virtually all ingested BPA is rapidly converted to inactive metabolites. NIH and industry-sponsored round robin studies have demonstrated that serum BPA can be accurately assayed without contamination, while the FDA lab has acknowledged uncontrolled assay contamination. In reviewing the published BPA biomonitoring data, we find that assay contamination is, in fact, well controlled in most labs, and cannot be used as the basis for discounting evidence that significant and virtually continuous exposure to BPA must be occurring from multiple sources. PMID:25304273

  4. Fall, classification, and exposure history of the Mifflin L5 chondrite

    NASA Astrophysics Data System (ADS)

    Kita, Noriko T.; Welten, Kees C.; Valley, John W.; Spicuzza, Michael J.; Nakashima, Daisuke; Tenner, Travis J.; Ushikubo, Takayuki; MacPherson, Glenn J.; Welzenbach, Linda; Heck, Philipp R.; Davis, Andrew M.; Meier, Matthias M. M.; Wieler, Rainer; Caffee, Marc W.; Laubenstein, Matthias; Nishiizumi, Kunihiko

    2013-04-01

    The Mifflin meteorite fell on the night of April 14, 2010, in southwestern Wisconsin. A bright fireball was observed throughout a wide area of the midwestern United States. The petrography, mineral compositions, and oxygen isotope ratios indicate that the meteorite is an L5 chondrite fragmental breccia with light/dark structure. The meteorite shows a low shock stage of S2, although some shock-melted veins are present. The U,Th-He age is 0.7 Ga, and the K-Ar age is 1.8 Ga, indicating that Mifflin might have been heated at the time of the 470 Ma L-chondrite parent body breakup and that the U,Th-He and K-Ar ages were partially reset. The cosmogenic radionuclide data indicate that Mifflin was exposed to cosmic rays while its radius was 30-65 cm. Assuming this exposure geometry, a cosmic-ray exposure age of 25 ± 3 Ma is calculated from cosmogenic noble gas concentrations. The low 22Ne/21Ne ratio may, however, indicate a two-stage exposure with a longer first-stage exposure at high shielding. Mifflin is unusual in having a low radiogenic gas content combined with a low shock stage and no evidence of late-stage annealing; this inconsistency remains unexplained.

  5. EXPOSURE ASSESSMENT METHODS DEVELOPMENT PILOTS FOR THE NATIONAL CHILDREN'S STUDY

    EPA Science Inventory

    Accurate exposure classification tools are needed to link exposure with health effects. EPA began methods development pilot studies in 2000 to address general questions about exposures and outcome measures. Selected pilot studies are highlighted in this poster. The “Literature Re...

  6. ICGA-PSO-ELM approach for accurate multiclass cancer classification resulting in reduced gene sets in which genes encoding secreted proteins are highly represented.

    PubMed

    Saraswathi, Saras; Sundaram, Suresh; Sundararajan, Narasimhan; Zimmermann, Michael; Nilsen-Hamilton, Marit

    2011-01-01

    A combination of Integer-Coded Genetic Algorithm (ICGA) and Particle Swarm Optimization (PSO), coupled with the neural-network-based Extreme Learning Machine (ELM), is used for gene selection and cancer classification. ICGA is used with PSO-ELM to select an optimal set of genes, which is then used to build a classifier to develop an algorithm (ICGA-PSO-ELM) that can handle sparse data and sample imbalance. We evaluate the performance of ICGA-PSO-ELM and compare our results with existing methods in the literature. An investigation into the functions of the selected genes, using a systems biology approach, revealed that many of the identified genes are involved in cell signaling and proliferation. An analysis of these gene sets shows a larger representation of genes that encode secreted proteins than found in randomly selected gene sets. Secreted proteins constitute a major means by which cells interact with their surroundings. Mounting biological evidence has identified the tumor microenvironment as a critical factor that determines tumor survival and growth. Thus, the genes identified by this study that encode secreted proteins might provide important insights into the nature of the critical biological features in the microenvironment of each tumor type that allow these cells to thrive and proliferate.

  7. Radiochromic film dosimetry with flatbed scanners: A fast and accurate method for dose calibration and uniformity correction with single film exposure

    SciTech Connect

    Menegotti, L.; Delana, A.; Martignano, A.

    2008-07-15

    Film dosimetry is an attractive tool for dose distribution verification in intensity modulated radiotherapy (IMRT). A critical aspect of radiochromic film dosimetry is the scanner used for the readout of the film: the output needs to be calibrated in dose response and corrected for pixel-value and spatially dependent nonuniformity caused by light scattering, and these procedures can take a long time. A method for fast and accurate calibration and uniformity correction for radiochromic film dosimetry is presented: a single film exposure is used for both calibration and correction. Gafchromic EBT films were read with two flatbed charge-coupled-device scanners (Epson V750 and 1680Pro). The accuracy of the method is investigated with specific dose patterns and an IMRT beam. Comparisons with a two-dimensional array of ionization chambers using an 18×18 cm² open field and an inverse-pyramid dose pattern show an increase in the percentage of points passing the gamma analysis (tolerance parameters of 3% and 3 mm): from 55% and 64% for the 1680Pro and V750 scanners, respectively, to 94% for both scanners for the 18×18 cm² open field, and from 76% and 75% to 91% for the inverse-pyramid pattern. Application to an IMRT beam also shows better gamma index results, passing from 88% and 86% for the two scanners, respectively, to 94% for both. The number of points and the dose range considered for correction and calibration appear appropriate for use in IMRT verification. The method proved to be fast, corrected the nonuniformity properly, and has been adopted for routine clinical IMRT dose verification.
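
    For reference, a brute-force sketch of the gamma analysis used above (global 3%/3 mm criteria by default); it is suitable only for small synthetic checks, not for clinical use:

    ```python
    # Global 2D gamma: a point passes if some nearby reference point keeps
    # (dose diff / tol)^2 + (distance / DTA)^2 <= 1.
    import numpy as np

    def gamma_pass_rate(ref, meas, pixel_mm, dose_tol=0.03, dist_mm=3.0):
        ny, nx = ref.shape
        r = int(np.ceil(dist_mm / pixel_mm))   # search radius in pixels
        norm = dose_tol * ref.max()            # global dose criterion
        passed = 0
        for i in range(ny):
            for j in range(nx):
                best = np.inf
                for di in range(-r, r + 1):
                    for dj in range(-r, r + 1):
                        ii, jj = i + di, j + dj
                        if 0 <= ii < ny and 0 <= jj < nx:
                            dd = (meas[i, j] - ref[ii, jj]) / norm
                            dr = pixel_mm * np.hypot(di, dj) / dist_mm
                            best = min(best, dd * dd + dr * dr)
                passed += best <= 1.0
        return passed / (ny * nx)

    ref = np.ones((20, 20)); meas = 1.02 * ref        # uniform 2% offset
    print(gamma_pass_rate(ref, meas, pixel_mm=1.0))   # -> 1.0 (all points pass)
    ```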

  8. Fast, Simple and Accurate Handwritten Digit Classification by Training Shallow Neural Network Classifiers with the ‘Extreme Learning Machine’ Algorithm

    PubMed Central

    McDonnell, Mark D.; Tissera, Migel D.; Vladusich, Tony; van Schaik, André; Tapson, Jonathan

    2015-01-01

    Recent advances in training deep (multi-layer) architectures have inspired a renaissance in neural network use. For example, deep convolutional networks are becoming the default option for difficult tasks on large datasets, such as image and speech recognition. However, here we show that error rates below 1% on the MNIST handwritten digit benchmark can be replicated with shallow non-convolutional neural networks. This is achieved by training such networks using the ‘Extreme Learning Machine’ (ELM) approach, which also enables a very rapid training time (∼ 10 minutes). Adding distortions, as is common practice for MNIST, reduces error rates even further. Our methods are also shown to be capable of achieving less than 5.5% error rates on the NORB image database. To achieve these results, we introduce several enhancements to the standard ELM algorithm, which individually and in combination can significantly improve performance. The main innovation is to ensure each hidden unit operates only on a randomly sized and positioned patch of each image. This form of random ‘receptive field’ sampling of the input ensures the input weight matrix is sparse, with about 90% of weights equal to zero. Furthermore, combining our methods with a small number of iterations of a single-batch backpropagation method can significantly reduce the number of hidden units required to achieve a particular performance. Our close-to-state-of-the-art results for MNIST and NORB suggest that the ease of use and accuracy of the ELM algorithm for designing a single-hidden-layer neural network classifier should cause it to be given greater consideration, either as a standalone method for simpler problems or as the final classification stage in deep neural networks applied to more difficult problems. PMID:26262687
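
    A minimal single-hidden-layer ELM in the spirit of the abstract: random, untrained input weights and output weights solved by least squares. The receptive-field sparsification and distortions described above are omitted.

    ```python
    # Bare-bones ELM: H = tanh(X W) with random W, then solve H beta ~= Y.
    import numpy as np

    def elm_train(X, Y, n_hidden=200, seed=0):
        rng = np.random.default_rng(seed)
        W = rng.normal(size=(X.shape[1], n_hidden))   # random, never trained
        H = np.tanh(X @ W)                            # hidden activations
        beta, *_ = np.linalg.lstsq(H, Y, rcond=None)  # least-squares output weights
        return W, beta

    def elm_predict(X, W, beta):
        return np.tanh(X @ W) @ beta                  # class scores

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 20))
    Y = np.eye(3)[rng.integers(0, 3, 500)]            # one-hot labels
    W, beta = elm_train(X, Y)
    labels = elm_predict(X, W, beta).argmax(axis=1)   # predicted classes
    ```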

  9. [Comparison and application of two risk assessment methods for occupational lead exposure risk classification in a lead-acid battery enterprise].

    PubMed

    Chen, H F; Yao, Z H; Yan, X H; Zhao, L; Wang, S; Lin, J; Huang, H L

    2017-02-20

    Objective: To apply and compare two risk assessment methods for occupational lead exposure risk classification in a lead-acid battery enterprise. Methods: In April 2013, an occupational health survey was carried out in a lead-acid battery enterprise. Lead fume and lead dust were tested in the workplace. The risk assessment index system for occupational chemical hazards that was established and optimized by the research group (referred to as the "optimized index system"), as well as the Singapore semi-quantitative risk assessment model, was used for occupational lead exposure risk classification in the lead-acid battery enterprise. The two risk classification results were analyzed and compared. Results: In the lead fume risk classification results, the optimized index system classified the raw material group and foundry group workshops as Class I hazardous and the assembling group workshop as Class II hazardous. The Singapore semi-quantitative risk assessment model classified the raw material group workshop as high risk and the foundry group and assembling group workshops as extremely high risk. In the lead dust risk classification results, the optimized index system classified the raw material group workshop as Class I hazardous, while the plate painting group, plate cutting group, and assembling group workshops were classified as Class II hazardous. The Singapore semi-quantitative risk assessment model classified the raw material group workshop as medium risk, the plate painting group and plate cutting group workshops as high risk, and the assembling group workshop as extremely high risk. Conclusion: There are some differences between the two methods in the risk assessment of occupational lead exposure. By comparison, the optimized index system is more reasonable and feasible, and it is highly operable.
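
    As a loose illustration only: the Singapore semi-quantitative model referred to above is commonly described as taking risk to be the square root of a hazard rating times an exposure rating, each on a 1-5 scale. The formula and bands below are a hedged sketch; the official guidance should be consulted for the authoritative procedure.

    ```python
    # Hypothetical sketch of a semi-quantitative risk score and banding.
    import math

    def semi_quantitative_risk(hazard_rating, exposure_rating):
        """hazard_rating, exposure_rating: integers on a 1-5 scale (assumed)."""
        risk = math.sqrt(hazard_rating * exposure_rating)
        bands = {1: "negligible", 2: "low", 3: "medium", 4: "high", 5: "very high"}
        return risk, bands[min(5, round(risk))]

    print(semi_quantitative_risk(4, 5))   # e.g. (4.47..., 'high')
    ```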

  10. Comparing diagnostic classification of neurobehavioral disorder associated with prenatal alcohol exposure with the Canadian fetal alcohol spectrum disorder guidelines: a cohort study

    PubMed Central

    Sanders, James L.; Breen, Rebecca E. Hudson; Netelenbos, Nicole

    2017-01-01

    Background: Diagnostic criteria have recently been introduced in the Diagnostic and Statistical Manual of Mental Disorders, 5th edition (DSM-5), for neurobehavioral disorder associated with prenatal alcohol exposure (ND-PAE). The purpose of this study is to assess the classification of this condition using the Canadian fetal alcohol spectrum disorder (FASD) multidisciplinary diagnostic guidelines as the standard of comparison. First, classification of ND-PAE was compared with Canadian FASD diagnoses of fetal alcohol syndrome (FAS), partial FAS (pFAS) and alcohol-related neurodevelopmental disorder. Second, classification of ND-PAE was compared with FAS and pFAS only, a criterion for which includes facial features highly predictive of prenatal alcohol exposure and effects. Methods: Eighty-two patients underwent multidisciplinary clinical evaluations using the Canadian FASD diagnostic guidelines between 2011 and 2015. Two clinicians independently reviewed patient files for evidence of diagnostic criteria for ND-PAE, applying an impairment cut-off of 2 or more standard deviations below the mean, or clinically significant impairment in the absence of standardized norm-referenced measures. Results: Good interrater reliability was established between clinicians (κ = 0.79). Classifications of ND-PAE and Canadian FASD diagnoses, including alcohol-related neurodevelopmental disorder, were moderately correlated (Cramér V [82] = 0.44, p < 0.01). However, ND-PAE possessed low sensitivity in FASD identification. Further, there was no correlation between ND-PAE and FAS/pFAS classifications (Cramér V [82] = 0.05, p > 0.05). Interpretation: Although there is considerable overlap between the two sets of criteria, ND-PAE was less likely to identify patients with FASD. Although the neurobehavioral domains assessed by ND-PAE are supported in research, its diagnostic structure restricts the identification of FASD at the impairment threshold of 2 or more standard deviations. A
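
    The agreement statistic reported above is Cramér's V, computed from the chi-square of a contingency table; a sketch with invented counts:

    ```python
    # Cramér's V for a 2x2 table of ND-PAE vs. FASD classifications
    # (counts are illustrative, not the study's data).
    import numpy as np
    from scipy.stats import chi2_contingency

    table = np.array([[30, 10],    # rows: ND-PAE met / not met
                      [12, 30]])   # cols: FASD diagnosis yes / no
    chi2, p, dof, _ = chi2_contingency(table)
    n = table.sum()
    v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))
    print(f"Cramér's V = {v:.2f}, p = {p:.3f}")
    ```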

  11. Classification of personal exposure to radio frequency electromagnetic fields (RF-EMF) for epidemiological research: Evaluation of different exposure assessment methods.

    PubMed

    Frei, Patrizia; Mohler, Evelyn; Bürgi, Alfred; Fröhlich, Jürg; Neubauer, Georg; Braun-Fahrländer, Charlotte; Röösli, Martin

    2010-10-01

    The use of personal exposure meters (exposimeters) has been recommended for measuring personal exposure to radio frequency electromagnetic fields (RF-EMF) from environmental far-field sources in everyday life. However, it is unclear to what extent exposimeter readings are affected by measurements taken while personal mobile and cordless phones are in use. In addition, the use of exposimeters in large epidemiological studies is limited by high costs and the large effort required of study participants. In the current analysis we aimed to investigate the impact of personal phone use on exposimeter readings and to evaluate different exposure assessment methods potentially useful in epidemiological studies. We collected personal exposimeter measurements during one week and diary data from 166 study participants. Moreover, we collected spot measurements in the participants' bedrooms and data on self-estimated exposure, assessed residential exposure to fixed-site transmitters by calculating the geo-coded distance and the mean RF-EMF from a geospatial propagation model, and developed an exposure prediction model based on the propagation model and exposure-relevant behavior. The mean personal exposure was 0.13 mW/m² when measurements during personal phone calls were excluded, and 0.15 mW/m² when such measurements were included. The Spearman correlation with personal exposure (without personal phone calls) was 0.42 (95%-CI: 0.29 to 0.55) for the spot measurements, -0.03 (95%-CI: -0.18 to 0.12) for the geo-coded distance, 0.28 (95%-CI: 0.14 to 0.42) for the geospatial propagation model, 0.50 (95%-CI: 0.37 to 0.61) for the full exposure prediction model and 0.06 (95%-CI: -0.10 to 0.21) for self-estimated exposure. In conclusion, personal exposure measured with exposimeters correlated best with the full exposure prediction model and the spot measurements. Self-estimated exposure and geo-coded distance turned out to be poor surrogates for personal exposure.
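
    A sketch of the correlation analysis described above: ranking a surrogate measure against personal exposimeter readings with Spearman's rho, on synthetic, loosely exposure-shaped data.

    ```python
    # Spearman rank correlation between personal exposure and a surrogate.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)
    personal = rng.lognormal(mean=-2.0, sigma=1.0, size=166)   # mW/m^2, synthetic
    surrogate = personal * rng.lognormal(mean=0.0, sigma=0.8, size=166)
    rho, p = spearmanr(personal, surrogate)
    print(f"Spearman rho = {rho:.2f} (p = {p:.3g})")
    ```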

  12. Accurate Arabic Script Language/Dialect Classification

    DTIC Science & Technology

    2014-01-01

    language processing (NLP) tools and models only function well when applied to the language for which they were designed or trained. Thus, it is...and MSA, NLP tools trained using MSA data collections tend to perform poorly on dialectal text (Rambow et al., 2005). Thus, it is important to

  13. Improvement of the Cramer classification for oral exposure using the database TTC RepDose - A strategy description

    EPA Science Inventory

    The present report describes a strategy to refine the current Cramer classification of the TTC concept using a broad database (DB) termed TTC RepDose. Cramer classes 1-3 overlap to some extent, indicating a need for a better separation of structural classes likely to be toxic, mo...

  14. Malingering in Toxic Exposure. Classification Accuracy of Reliable Digit Span and WAIS-III Digit Span Scaled Scores

    ERIC Educational Resources Information Center

    Greve, Kevin W.; Springer, Steven; Bianchini, Kevin J.; Black, F. William; Heinly, Matthew T.; Love, Jeffrey M.; Swift, Douglas A.; Ciota, Megan A.

    2007-01-01

    This study examined the sensitivity and false-positive error rate of reliable digit span (RDS) and the WAIS-III Digit Span (DS) scaled score in persons alleging toxic exposure and determined whether error rates differed from published rates in traumatic brain injury (TBI) and chronic pain (CP). Data were obtained from the files of 123 persons…

  15. Ozone exposure and cardiovascular-related mortality in the Canadian Census Health and Environment Cohort (CANCHEC) by spatial synoptic classification zone.

    PubMed

    Cakmak, Sabit; Hebbern, Chris; Vanos, Jennifer; Crouse, Dan L; Burnett, Rick

    2016-07-01

    Our objective is to analyse the association between long-term ozone exposure and cardiovascular-related mortality while accounting for climate, location, and socioeconomic factors. We assigned subjects with 16 years of follow-up in the Canadian Census Health and Environment Cohort (CanCHEC) to one of seven regions based on spatial synoptic classification (SSC) weather types and examined the interaction of exposure to both fine particulate matter (PM2.5) and ground-level ozone with cause of death using survival analysis, while adjusting for socioeconomic characteristics and individual confounders. Correlations between ozone and PM2.5 varied across SSC zones from -0.02 to 0.7. Comparing zones using the most populated SSC zone as a reference, a 10 ppb increase in ozone exposure was associated with increases in hazard ratios (HRs) that ranged from 1.007 (95% CI 0.99, 1.015) to 1.03 (95% CI 1.02, 1.041) for cardiovascular disease, from 1.013 (95% CI 0.996, 1.03) to 1.058 (95% CI 1.034, 1.082) for cerebrovascular disease, and 1.02 (95% CI 1.006, 1.034) for ischemic heart disease. HRs remained significant after adjustment for PM2.5. Long-term exposure to ozone is related to an increased risk of mortality from cardiovascular and cerebrovascular diseases; the risk varies by location across Canada and is not attenuated by adjustment for PM2.5. This research shows that the SSC can be used to define geographic regions, and it demonstrates the importance of accounting for that spatial variability when studying the long-term health effects of air pollution.
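
    A hedged sketch of the survival-analysis step: a Cox proportional hazards fit of mortality against an ozone increment with a co-pollutant adjustment. The lifelines package is one common choice; the data frame here is synthetic, and the study's covariate set is far richer.

    ```python
    # Cox model on synthetic follow-up data; hazard ratio per 10 ppb ozone.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(2)
    df = pd.DataFrame({
        "years":   rng.uniform(1, 16, 500),     # follow-up time
        "died":    rng.integers(0, 2, 500),     # event indicator
        "ozone10": rng.normal(3.0, 1.0, 500),   # ozone exposure in 10 ppb units
        "pm25":    rng.normal(8.0, 2.0, 500),   # co-pollutant adjustment
    })
    cph = CoxPHFitter().fit(df, duration_col="years", event_col="died")
    print(cph.hazard_ratios_["ozone10"])        # HR per 10 ppb increase
    ```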

  16. Challenges in prosthesis classification.

    PubMed

    Robertsson, Otto; Mendenhall, Stan; Paxton, Elizabeth W; Inacio, Maria C S; Graves, Stephen

    2011-12-21

    Accurate prosthesis classification is critical for total joint arthroplasty surveillance and assessment of comparative effectiveness. Historically, prosthesis classification was based solely on the names of the prosthesis manufacturers. As a result, prosthesis designs changed without corresponding name changes, and other prostheses' names changed over time without substantial design modifications. As the number of prostheses on the market for total joint arthroplasty increased, catalog and lot numbers associated with prosthesis descriptions were introduced by manufacturers. Currently, these catalog and lot numbers are not standardized, and there is no consensus on categorizing them into brands or subbrands. Classification of the attributes of a prosthesis also varies, limiting comparisons of prostheses across studies and reports. The development of a universal prosthesis classification system would standardize prosthesis classification and enhance total joint arthroplasty research collaboration worldwide. This is a current area of focus for the International Consortium of Orthopaedic Registries (ICOR).

  17. Temporal context in floristic classification

    NASA Astrophysics Data System (ADS)

    Fitzgerald, R. W.; Lees, B. G.

    1996-11-01

    Multi-temporal remote sensing data present a number of significant problems for the statistical and spatial competence of a classifier. Ideally, a classifier of multi-temporal data should be temporally invariant: it must have the capacity to account for variations in season, growth cycle, and radiometric and atmospheric conditions at any point in time when classifying the land cover. This paper tests two methods of creating a temporally invariant classifier based on the pattern recognition capabilities of a neural network. A suite of twelve multi-temporal datasets spread over 5 yr, along with a comprehensive mix of environmental variables, is fused into floristic classification images by the neural network. Uncertainties in the classifications are addressed explicitly with a confidence mask generated from the fuzzy membership values output by the neural network. These confidence masks are used to produce constrained classification images. The overall accuracy achieved for a study site containing highly disturbed undulating terrain averages 60%. The first method of training, sequential learning of temporal context, is tested by examining the step-by-step evolution of the sequential training process. This reveals that the sequential classifier may not have learned about time, because time was constant during each network training session. It also suggests that there are optimal times during the annual cycle to train the classifier for particular floristic classes. The second method of training the classifier is randomised exposure to the entire temporal training suite. Time was now a fluctuating input variable during the network training process. This method produced the most spatially accurate results. The performance of this classifier as a temporally invariant classifier is tested amongst four multi-temporal datasets with encouraging results: the classifier consistently achieved an overall accuracy of 60%. The pairwise predicted

  18. Accurate Finite Difference Algorithms

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.

    1996-01-01

    Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single-step explicit methods; they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high-order, high-resolution algorithms can produce accurate results after O(10^6) periods of propagation with eight grid points per wavelength.

  19. Accurate monotone cubic interpolation

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1991-01-01

    Monotone piecewise cubic interpolants are simple and effective. They are generally third-order accurate, except near strict local extrema, where accuracy degenerates to second order due to the monotonicity constraint. Algorithms for piecewise cubic interpolants that preserve monotonicity as well as uniform third- and fourth-order accuracy are presented. The gain in accuracy is obtained by relaxing the monotonicity constraint in a geometric framework in which the median function plays a crucial role.
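
    For comparison, a standard monotone piecewise-cubic (PCHIP) interpolant is readily available in SciPy; note that the paper's uniformly third- and fourth-order variants are not what SciPy implements.

    ```python
    # PCHIP preserves monotonicity of the data, with no overshoot.
    import numpy as np
    from scipy.interpolate import PchipInterpolator

    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([0.0, 0.1, 0.9, 1.0, 1.0])   # monotone data with a sharp rise
    f = PchipInterpolator(x, y)
    print(f(np.linspace(0.0, 4.0, 9)))        # interpolated values stay monotone
    ```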

  20. Short Time Exposure (STE) test in conjunction with Bovine Corneal Opacity and Permeability (BCOP) assay including histopathology to evaluate correspondence with the Globally Harmonized System (GHS) eye irritation classification of textile dyes.

    PubMed

    Oliveira, Gisele Augusto Rodrigues; Ducas, Rafael do Nascimento; Teixeira, Gabriel Campos; Batista, Aline Carvalho; Oliveira, Danielle Palma; Valadares, Marize Campos

    2015-09-01

    Eye irritation evaluation is mandatory for predicting health risks to consumers exposed to textile dyes. The two dyes, Reactive Orange 16 (RO16) and Reactive Green 19 (RG19), are classified as Category 2A (irritating to eyes) under the UN Globally Harmonized System of classification (UN GHS), according to the Draize test. On the other hand, animal welfare considerations and the enforcement of a new regulation in the EU are drawing much attention to reducing or replacing animal experiments with alternative methods. This study evaluated the eye irritation of the two dyes RO16 and RG19 by combining the Short Time Exposure (STE) and Bovine Corneal Opacity and Permeability (BCOP) assays and then comparing the results with in vivo data from the GHS classification. The STE test (first-level screening) categorized both dyes as GHS Category 1 (severe irritant). In the BCOP assay, dye RG19 was also classified as GHS Category 1, while for dye RO16 the GHS outcome was "no prediction can be made". Both dyes caused damage to the corneal tissue, as confirmed by histopathological analysis. Our findings demonstrated that the STE test did not contribute to a better conclusion about the eye irritation potential of the dyes when used in conjunction with the BCOP test. Adding histopathology to the BCOP test could be an appropriate tool for a more meaningful prediction of the eye irritation potential of dyes.

  1. Contextual classification of multispectral image data: Approximate algorithm

    NASA Technical Reports Server (NTRS)

    Tilton, J. C. (Principal Investigator)

    1980-01-01

    A computationally less intensive approximation is presented to a classification algorithm that incorporates spatial context information in a general, statistical manner. The classifications produced are nearly as accurate.

  2. [The assessment of exposure to and the activity of the manual lifting of patients in wards: methods, procedures, the exposure index (MAPO) and classification criteria. Movimentazione e Assistenza Pazienti Ospedalizzati (Lifting and Assistance to Hospitalized Patients)].

    PubMed

    Menoni, O; Ricci, M G; Panciera, D; Occhipinti, E

    1999-01-01

    Since a method for quantifying exposure to patient handling in hospital wards is lacking, the authors describe and propose a model for identifying the main risk factors in this type of occupational exposure: presence of disabled patients, staff engaged in manual handling of patients, structure of the working environment, equipment and aids for moving patients, and training of workers appropriate to the specific risk. For each factor a procedure for identification and assessment is proposed that is easily applicable in practice. The authors also propose a formula for the calculation of a condensed exposure index (MAPO Index), which brings the various factors together. The exposure index, which requires further detailed study and validation, makes it possible in practice to plan preventive and health measures according to a specific order of priority, thus complying with the requirements of Chapter V of Law 626/94. From a practical point of view, in the present state of knowledge, risk is deemed negligible for MAPO Index values between 0 and 1.5, average for values between 1.51 and 5, and high for values exceeding 5.
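
    The banding proposed above is simple to state in code; a one-function sketch of the published cut-offs:

    ```python
    def mapo_band(mapo_index):
        """MAPO risk bands: 0-1.5 negligible, 1.51-5 average, >5 high."""
        if mapo_index <= 1.5:
            return "negligible"
        return "average" if mapo_index <= 5 else "high"

    print(mapo_band(2.3))   # -> 'average'
    ```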

  3. New hierarchical classification of food items for the assessment of exposure to packaging migrants: use of hub codes for different food groups.

    PubMed

    Northing, P; Oldring, P K T; Castle, L; Mason, P A S S

    2009-04-01

    This paper describes development work undertaken to expand the capabilities of an existing two-dimensional probabilistic modelling approach for assessing dietary exposure to chemicals migrating out of food contact materials. A new three-level hub-coding system has been devised for coding different food groups with regard to their consumption by individuals. The hub codes can be used at three different levels representing high, medium and low levels of aggregation of individual food items. The hub codes were developed because they have greater relevance to packaging migration than coding used (largely and historically) for nutritional purposes. The hub codes will also assist pan-Europeanization of the exposure model in the future, when up to 27 or more different food coding systems from the European Union Member States will have to be assimilated into the modelling approach. The applicability of the model with the new coding system has been tested by incorporating newly released 2001 UK consumption data. The example used was exposure to a hypothetical migrant from coated metal packaging for foodstuffs. When working at the three hierarchical levels, it was found that the tiered approach gave conservative estimates at the cruder level of refinement and a more realistic assessment as the refinement progressed. The work overall revealed that changes in eating habits over time had a relatively small impact on estimates of exposure. More important are changes over time in packaging usage, packaging composition and migration levels. For countries like the UK, which has sophisticated food consumption data, it is uncertainties in these other areas that need to be addressed by new data collection.

  4. Integrated classification of inflammatory myopathies.

    PubMed

    Allenbach, Y; Benveniste, O; Goebel, H-H; Stenzel, W

    2017-02-01

    Inflammatory myopathies comprise a multitude of diverse diseases, most often occurring in complex clinical settings. To ensure accurate diagnosis, multidisciplinary expertise is required. Here, we propose a comprehensive myositis classification that incorporates clinical, morphological and molecular data as well as autoantibody profile. This review focuses on recent advances in myositis research, in particular, the correlation between autoantibodies and morphological or clinical phenotypes that can be used as the basis for an 'integrated' classification system.

  5. Accurate quantum chemical calculations

    NASA Technical Reports Server (NTRS)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.

    1989-01-01

    An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics are discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed: these are then applied to a number of chemical and spectroscopic problems; to transition metals; and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.

  6. Multiple Sparse Representations Classification

    PubMed Central

    Plenge, Esben; Klein, Stefan S.; Niessen, Wiro J.; Meijering, Erik

    2015-01-01

    Sparse representations classification (SRC) is a powerful technique for pixelwise classification of images and is increasingly being used for a wide variety of image analysis tasks. The method uses sparse representation and learned redundant dictionaries to classify image pixels. In this empirical study we propose to further leverage the redundancy of the learned dictionaries to achieve a more accurate classifier. In conventional SRC, each image pixel is associated with a small patch surrounding it. Using these patches, a dictionary is trained for each class in a supervised fashion. Commonly, redundant/overcomplete dictionaries are trained and image patches are sparsely represented by a linear combination of only a few of the dictionary elements. Given a set of trained dictionaries, a new patch is sparse coded using each of them, and subsequently assigned to the class whose dictionary yields the minimum residual energy. We propose a generalization of this scheme. The method, which we call multiple sparse representations classification (mSRC), is based on the observation that an overcomplete, class-specific dictionary is capable of generating multiple accurate and independent estimates of a patch belonging to the class. So instead of finding a single sparse representation of a patch for each dictionary, we find multiple, and the corresponding residual energies provide an enhanced statistic which is used to improve classification. We demonstrate the efficacy of mSRC for three example applications: pixelwise classification of texture images, lumen segmentation in carotid artery magnetic resonance imaging (MRI), and bifurcation point detection in carotid artery MRI. We compare our method with conventional SRC, K-nearest neighbor, and support vector machine classifiers. The results show that mSRC outperforms SRC and the other reference methods. In addition, we present an extensive evaluation of the effect of the main mSRC parameters: patch size, dictionary size, and
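
    A compact sketch of the conventional SRC decision rule that mSRC generalizes: sparse-code a patch against each class dictionary and assign the class with the smallest residual energy. The dictionaries here are random unit-norm stand-ins rather than learned ones.

    ```python
    # SRC decision rule via orthogonal matching pursuit residuals.
    import numpy as np
    from sklearn.linear_model import orthogonal_mp

    def src_classify(patch, dictionaries, n_nonzero=5):
        residuals = []
        for D in dictionaries:                        # one dictionary per class
            coef = orthogonal_mp(D, patch, n_nonzero_coefs=n_nonzero)
            residuals.append(np.linalg.norm(patch - D @ coef))
        return int(np.argmin(residuals))              # class with least residual

    rng = np.random.default_rng(3)
    dicts = []
    for _ in range(3):                                # 3 classes, 8x8 patches
        D = rng.normal(size=(64, 256))
        dicts.append(D / np.linalg.norm(D, axis=0))   # unit-norm atoms for OMP
    print(src_classify(rng.normal(size=64), dicts))
    ```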

  7. An accurate registration technique for distorted images

    NASA Technical Reports Server (NTRS)

    Delapena, Michele; Shaw, Richard A.; Linde, Peter; Dravins, Dainis

    1990-01-01

    Accurate registration of International Ultraviolet Explorer (IUE) images is crucial because the variability of the geometrical distortions that are introduced by the SEC-Vidicon cameras ensures that raw science images are never perfectly aligned with the Intensity Transfer Functions (ITFs) (i.e., graded floodlamp exposures that are used to linearize and normalize the camera response). A technique for precisely registering IUE images which uses a cross correlation of the fixed pattern that exists in all raw IUE images is described.

  8. BIOACCESSIBILITY TESTS ACCURATELY ESTIMATE ...

    EPA Pesticide Factsheets

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from five Pb-contaminated Superfund sites had relative bioavailabilities from 33%-63%, with a mean of about 50%. Treatment of two of the soils with P significantly reduced the bioavailability of Pb. The bioaccessibility of the Pb in the test soils was then measured in six in vitro tests and regressed on bioavailability. They were: the “Relative Bioavailability Leaching Procedure” (RBALP) at pH 1.5, the same test conducted at pH 2.5, the “Ohio State University In vitro Gastrointestinal” method (OSU IVG), the “Urban Soil Bioaccessible Lead Test”, the modified “Physiologically Based Extraction Test” and the “Waterfowl Physiologically Based Extraction Test.” All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the RBALP pH 2.5 and OSU IVG tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Ad

  9. Accurate spectral color measurements

    NASA Astrophysics Data System (ADS)

    Hiltunen, Jouni; Jaeaeskelaeinen, Timo; Parkkinen, Jussi P. S.

    1999-08-01

    Surface color measurement is of importance in a very wide range of industrial applications including paint, paper, printing, photography, textiles, plastics and so on. For demanding color measurements a spectral approach is often needed. One can measure a color spectrum with a spectrophotometer using calibrated standard samples as a reference. Because it is impossible to define absolute color values of a sample, we always work with approximations. The human eye can perceive color differences as small as 0.5 CIELAB units and thus distinguish millions of colors. This 0.5 unit difference should be the goal for precise color measurements. This limit is not a problem if we only want to measure the color difference between two samples, but accuracy problems arise if we also want exact color coordinate values at the same time. The values from two instruments can be astonishingly different. The accuracy of the instrument used in color measurement may depend on various errors such as photometric non-linearity, wavelength error, integrating sphere dark level error, and integrating sphere error in both specular included and specular excluded modes. Thus correction formulas should be used to get more accurate results. Another question is how many channels, i.e. wavelengths, we use to measure a spectrum. It is obvious that the sampling interval should be short to get more precise results. Furthermore, the result we get is always a compromise of measuring time, conditions and cost. Sometimes we have to use a portable system, or the shape and size of the samples make it impossible to use sensitive equipment. In this study a small set of calibrated color tiles measured with the Perkin Elmer Lambda 18 and the Minolta CM-2002 spectrophotometers are compared. In the paper we explain the typical error sources of spectral color measurements and show what accuracy demands a good colorimeter should meet.

  10. Classification of Instructional Programs: 2000 Edition.

    ERIC Educational Resources Information Center

    Morgan, Robert L.; Hunt, E. Stephen

    This third revision of the Classification of Instructional Programs (CIP) updates and modifies education program classifications, providing a taxonomic scheme that supports the accurate tracking, assessment, and reporting of field of study and program completions activity. This edition has also been adopted as the standard field of study taxonomy…

  11. The Influence of Second-Hand Cigarette Smoke Exposure during Childhood and Active Cigarette Smoking on Crohn’s Disease Phenotype Defined by the Montreal Classification Scheme in a Western Cape Population, South Africa

    PubMed Central

    Chivese, Tawanda; Esterhuizen, Tonya M.; Basson, Abigail Raffner

    2015-01-01

    Background Smoking may worsen the disease outcomes in patients with Crohn’s disease (CD), however the effect of exposure to second-hand cigarette smoke during childhood is unclear. In South Africa, no such literature exists. The aim of this study was to investigate whether disease phenotype, at time of diagnosis of CD, was associated with exposure to second-hand cigarette during childhood and active cigarette smoking habits. Methods A cross sectional examination of all consecutive CD patients seen during the period September 2011-January 2013 at 2 large inflammatory bowel disease centers in the Western Cape, South Africa was performed. Data were collected via review of patient case notes, interviewer-administered questionnaire and clinical examination by the attending gastroenterologist. Disease phenotype (behavior and location) was evaluated at time of diagnosis, according to the Montreal Classification scheme. In addition, disease behavior was stratified as ‘complicated’ or ‘uncomplicated’, using predefined definitions. Passive cigarette smoke exposure was evaluated during 3 age intervals: 0–5, 6–10, and 11–18 years. Results One hundred and ninety four CD patients were identified. Cigarette smoking during the 6 months prior to, or at time of diagnosis was significantly associated with ileo-colonic (L3) disease (RRR = 3.63; 95%CI, 1.32–9.98, p = 0.012) and ileal (L1) disease (RRR = 3.54; 95%CI, 1.06–11.83, p = 0.040) compared with colonic disease. In smokers, childhood passive cigarette smoke exposure during the 0–5 years age interval was significantly associated with ileo-colonic CD location (RRR = 21.3; 95%CI, 1.16–391.55, p = 0.040). No significant association between smoking habits and disease behavior at diagnosis, whether defined by the Montreal scheme, or stratified as ‘complicated’ vs ‘uncomplicated’, was observed. Conclusion Smoking habits were associated with ileo-colonic (L3) and ileal (L1) disease at time of diagnosis in

  12. Classification of spatially unresolved objects

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F.; Horwitz, H. M.; Hyde, P. D.; Morgenstern, J. P.

    1972-01-01

    A proportion estimation technique for the classification of multispectral scanner images is reported that uses data point averaging to extract and compute estimated proportions for a single average data point in order to classify spatially unresolved areas. Example extraction calculations of spectral signatures for bare soil, weeds, alfalfa, and barley prove quite accurate.

  13. Comparison of two-level and three-level classifications of moisture-damaged dwellings in relation to health effects.

    PubMed

    Haverinen, U; Husman, T; Vahteristo, M; Koskinen, O; Moschandreas, D; Nevalainen, A; Pekkanen, J

    2001-09-01

    A total of 630 randomly selected dwellings were surveyed for visible signs of moisture damage by civil engineers, and questionnaire responses were collected from the occupants (a total of 1,017 adults) to analyse the association between moisture damage and occupant health. A three-level grading system was developed, which took into account the number of damage sites in buildings and estimated the severity of the damage. In the present study, this grading system was tested as an improved model of moisture damage-related exposure in comparison to a conventional two-category system: based on independent, technical criteria it also allowed dose-response to be estimated. The questionnaire probed 28 individual health symptoms, based on earlier reported associations with building moisture and mould-related exposure. Criteria in evaluating the goodness of the selected exposure model were (1) dose-responsiveness and (2) higher risk compared to a two-level classification. Dose-responsiveness was observed with the three-level classification in 7, higher risk in 10, and both criteria in 5 out of 28 health symptoms. Two-level classification had higher risk in 4 health symptoms. Dose-dependent risk increases for respiratory infections and lower respiratory symptoms, and recurrent irritative and skin symptoms were observed with the three-level classification using symptom score variables. Although the results did not unambiguously support the three-level model, they underline the importance of developing more accurate exposure models in assessing the severity of moisture damage.
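
    One way to read the dose-response criterion above in code: regress a symptom indicator on the graded exposure and check whether the odds ratio per grade step exceeds one. A sketch on synthetic data, without the study's covariate adjustment:

    ```python
    # Logistic dose-response check for a three-level exposure grade.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    grade = rng.integers(0, 3, 1000)                   # 0 = no damage; 1, 2 = severity
    p_true = 1 / (1 + np.exp(-(-1.5 + 0.4 * grade)))   # risk rises with grade
    symptom = rng.binomial(1, p_true)

    X = sm.add_constant(grade.astype(float))
    fit = sm.Logit(symptom, X).fit(disp=0)
    print(np.exp(fit.params[1]))                       # odds ratio per grade step
    ```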

  14. Learning classification trees

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1991-01-01

    Algorithms for learning classification trees have had successes in artificial intelligence and statistics over many years. How a tree learning algorithm can be derived from Bayesian decision theory is outlined. This introduces Bayesian techniques for splitting, smoothing, and tree averaging. The splitting rule turns out to be similar to Quinlan's information gain splitting rule, while smoothing and averaging replace pruning. Comparative experiments with reimplementations of a minimum encoding approach, Quinlan's C4, and Breiman et al.'s CART show the full Bayesian algorithm is consistently as good as, or more accurate than, these other approaches, though at a computational price.

  15. Classification Analysis.

    ERIC Educational Resources Information Center

    Ball, Geoffrey H.

    Sorting things into groups is a basic intellectual task that allows people to simplify with minimal reduction in information. Classification techniques, which include both clustering and discrimination, provide step-by-step computer-based procedures for sorting things based on notions of generalized similarity and on the "class description"…

  16. [A new method of determining ages: chronological classification].

    PubMed

    Gubry, P

    1983-01-01

    Two methods of determining ages, the historical calendar method and the classification method, are compared using data for Chad from the UDEAC-TCHAD demographic survey. Complementary processing shows that the classification method is more accurate for those born in villages, while the historical method is more accurate for those born elsewhere. (summary in ENG)

  17. Neuromuscular disease classification system

    NASA Astrophysics Data System (ADS)

    Sáez, Aurora; Acha, Begoña; Montero-Sánchez, Adoración; Rivas, Eloy; Escudero, Luis M.; Serrano, Carmen

    2013-06-01

    Diagnosis of neuromuscular diseases is based on subjective visual assessment of patient biopsies by a specialist pathologist. A system for objective analysis and classification of muscular dystrophies and neurogenic atrophies from fluorescence microscopy images of muscle biopsies is presented. The procedure starts with an accurate segmentation of the muscle fibers using mathematical morphology and a watershed transform. A feature extraction step is carried out in two parts: 24 features that pathologists take into account to diagnose the diseases, and 58 structural features invisible to the human eye, based on treating the biopsy as a graph in which the nodes represent fibers and two nodes are connected if the corresponding fibers are adjacent. Feature selection using sequential forward and sequential backward selection methods, classification using a Fuzzy ARTMAP neural network, and a study of severity grading are performed on these two sets of features. A database of 91 images was used: 71 for training and 20 for testing. A classification error of 0% was obtained. It is concluded that adding features undetectable by human visual inspection improves the categorization of atrophic patterns.

  18. Multisensor classification of sedimentary rocks

    NASA Technical Reports Server (NTRS)

    Evans, Diane

    1988-01-01

    A comparison is made between linear discriminant analysis and supervised classification results based on signatures from the Landsat TM, the Thermal Infrared Multispectral Scanner (TIMS), and airborne SAR, alone and combined into extended spectral signatures for seven sedimentary rock units exposed on the margin of the Wind River Basin, Wyoming. Results from a linear discriminant analysis showed that training-area classification accuracies based on the multisensor data were improved an average of 15 percent over TM alone, 24 percent over TIMS alone, and 46 percent over SAR alone, with similar improvement resulting when supervised multisensor classification maps were compared to supervised, individual sensor classification maps. When training area signatures were used to map spectrally similar materials in an adjacent area, the average classification accuracy improved 19 percent using the multisensor data over TM alone, 2 percent over TIMS alone, and 11 percent over SAR alone. It is concluded that certain sedimentary lithologies may be accurately mapped using a single sensor, but classification of a variety of rock types can be improved using multisensor data sets that are sensitive to different characteristics such as mineralogy and surface roughness.

  19. Historical limitations of determinant based exposure groupings in the rubber manufacturing industry

    PubMed Central

    Vermeulen, R; Kromhout, H

    2005-01-01

    Aims: To study the validity of using a cross-sectional industry-wide exposure survey to develop exposure groupings for epidemiological purposes that extend beyond the time period in which the exposure data were collected. Methods: Exposure determinants were used to group workers into high, medium, and low exposure groups. The contrast of this grouping and of other commonly used grouping schemes based on plant and department was estimated and compared within this exposure survey and a previously conducted survey within the same industry (and factories). Results: Grouping of inhalable and dermal exposure based on exposure determinants resulted in the highest, but still modest, contrast (ε ∼ 0.3). Classifying subjects based on a combination of plant and department resulted in a slightly lower contrast (ε ∼ 0.2). If the determinant-based grouping derived from the 1997 exposure survey was used to classify workers in the 1988 survey, the average contrast decreased significantly for both exposures (ε ∼ 0.1). On the contrary, the exposure classification based on plant and department increased in contrast (from ε ∼ 0.2 to ε ∼ 0.3) and retained its relative ranking over time. Conclusions: Although determinant-based groupings seem to result in more efficient groupings within a cross-sectional survey, they have to be used with caution as they might yield significantly less contrast beyond the studied population or time period. It is concluded that a classification based on plant and department might be more desirable for retrospective studies in the rubber manufacturing industry, as it seems to have more historical relevance and is most likely more accurately recorded historically than information on exposure determinants in a particular industry. PMID:16234406

  20. Accurate shear measurement with faint sources

    SciTech Connect

    Zhang, Jun; Foucaud, Sebastien; Luo, Wentao E-mail: walt@shao.ac.cn

    2015-01-01

    For cosmic shear to become an accurate cosmological probe, systematic errors in the shear measurement method must be unambiguously identified and corrected for. Previous work in this series has demonstrated that cosmic shears can be measured accurately in Fourier space in the presence of background noise and finite pixel size, without assumptions on the morphologies of the galaxy and PSF. The remaining major source of error is source Poisson noise, due to the finiteness of the source photon number. This problem is particularly important for faint galaxies in space-based weak lensing measurements, and for ground-based images of short exposure times. In this work, we propose a simple and rigorous way of removing the shear bias from the source Poisson noise. Our noise treatment can be generalized for images made of multiple exposures through MultiDrizzle. This is demonstrated with the SDSS and COSMOS/ACS data. With a large ensemble of mock galaxy images of unrestricted morphologies, we show that our shear measurement method can achieve sub-percent level accuracy even for images of signal-to-noise ratio less than 5 in general, making it the most promising technique for cosmic shear measurement in the ongoing and upcoming large-scale galaxy surveys.

  1. Sparse and accurate high resolution SAR imaging

    NASA Astrophysics Data System (ADS)

    Vu, Duc; Zhao, Kexin; Rowe, William; Li, Jian

    2012-05-01

We investigate the use of an adaptive method, the Iterative Adaptive Approach (IAA), in combination with a maximum a posteriori (MAP) estimate to reconstruct high resolution SAR images that are both sparse and accurate. IAA is a nonparametric weighted least squares algorithm that is robust and user parameter-free. IAA has been shown to reconstruct SAR images with excellent side lobe suppression and high resolution enhancement. We first reconstruct the SAR images using IAA, and then we enforce sparsity by using MAP with a sparsity inducing prior. By coupling these two methods, we can produce sparse and accurate high resolution images that are conducive to feature extraction and target classification applications. In addition, we show how IAA can be made computationally efficient without sacrificing accuracy, a desirable property for SAR applications where problem sizes are quite large. We demonstrate the success of our approach using the Air Force Research Lab's "Gotcha Volumetric SAR Data Set Version 1.0" challenge dataset. With the widely used FFT, individual vehicles contained in the scene are barely recognizable due to the poor resolution and high side lobes of the FFT; with our approach, clear edges, boundaries, and textures of the vehicles are obtained.
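    For readers unfamiliar with IAA, the following is a textbook-style one-dimensional sketch of the iteration (matched-filter initialization, model covariance, weighted-least-squares amplitude update); it is not the authors' SAR implementation, and the grid and scene are invented.

```python
# Illustrative 1-D Iterative Adaptive Approach (IAA), the nonparametric
# weighted-least-squares estimator the abstract builds on.
import numpy as np

def iaa(y, A, n_iter=10):
    """y: (m,) data snapshot; A: (m, k) steering/dictionary matrix."""
    p = np.abs(A.conj().T @ y) ** 2 / A.shape[0] ** 2   # matched-filter init
    for _ in range(n_iter):
        R = (A * p) @ A.conj().T                        # model covariance
        Ri = np.linalg.inv(R + 1e-9 * np.eye(len(y)))   # small ridge for stability
        RiA = Ri @ A
        alpha = (RiA.conj() * y[:, None]).sum(0) / (A.conj() * RiA).sum(0)
        p = np.abs(alpha) ** 2                          # updated power spectrum
    return p

m, k = 32, 256
grid = np.linspace(0, 1, k, endpoint=False)
A = np.exp(2j * np.pi * np.arange(m)[:, None] * grid[None, :])
y = A[:, 40] + 0.5 * A[:, 140]          # two point scatterers
print(np.argsort(iaa(y, A))[-2:])       # peaks near bins 40 and 140
```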

  2. Accurate Evaluation of Quantum Integrals

    NASA Technical Reports Server (NTRS)

    Galant, D. C.; Goorvitch, D.; Witteborn, Fred C. (Technical Monitor)

    1995-01-01

Combining an appropriate finite difference method with Richardson extrapolation results in a simple, highly accurate numerical method for solving the Schrödinger equation. Important results are that error estimates are provided and that one can extrapolate expectation values, rather than the wavefunctions, to obtain highly accurate expectation values. We discuss the eigenvalues and the error growth in repeated Richardson extrapolation, and show that expectation values calculated on a crude mesh can be extrapolated to high accuracy.
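    A minimal sketch of the underlying idea, assuming a second-order central-difference discretization: solve a one-dimensional eigenproblem on two meshes and Richardson-extrapolate the eigenvalue. The harmonic oscillator stands in for the paper's problems.

```python
# Richardson extrapolation of a finite-difference Schrödinger eigenvalue.
# Harmonic oscillator stand-in; exact ground-state energy is 0.5.
import numpy as np

def ground_state_energy(n_points):
    x, h = np.linspace(-8.0, 8.0, n_points, retstep=True)
    main = 1.0 / h**2 + 0.5 * x**2            # kinetic diagonal + potential
    off = -0.5 / h**2 * np.ones(n_points - 1)  # central-difference off-diagonal
    H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.eigvalsh(H)[0]

e_h, e_h2 = ground_state_energy(201), ground_state_energy(401)  # meshes h, h/2
e_rich = (4 * e_h2 - e_h) / 3      # cancels the O(h^2) error term
print(e_h, e_h2, e_rich)           # e_rich lies closest to the exact 0.5
```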

  3. Remote Sensing Information Classification

    NASA Technical Reports Server (NTRS)

    Rickman, Douglas L.

    2008-01-01

    This viewgraph presentation reviews the classification of Remote Sensing data in relation to epidemiology. Classification is a way to reduce the dimensionality and precision to something a human can understand. Classification changes SCALAR data into NOMINAL data.

  4. Classification and knowledge

    NASA Technical Reports Server (NTRS)

    Kurtz, Michael J.

    1989-01-01

    Automated procedures to classify objects are discussed. The classification problem is reviewed, and the relation of epistemology and classification is considered. The classification of stellar spectra and of resolved images of galaxies is addressed.

  5. Hyperspectral image classification using functional data analysis.

    PubMed

    Li, Hong; Xiao, Guangrun; Xia, Tian; Tang, Y Y; Li, Luoqing

    2014-09-01

The large number of spectral bands acquired by hyperspectral imaging sensors allows us to better distinguish many subtle objects and materials. Unlike classical hyperspectral image classification methods set in the multivariate analysis framework, the method proposed in this paper uses functional data analysis (FDA) for accurate classification of hyperspectral images. The central idea of FDA is to treat multivariate data as continuous functions. From this perspective, the spectral curve of each pixel in a hyperspectral image is naturally viewed as a function. This is beneficial for making full use of the abundant spectral information, and the relationship between adjacent spectral measurements can also be exploited in a natural way. Functional principal component analysis is applied to solve the classification problem for these functions. Experimental results on three hyperspectral images show that the proposed method achieves higher classification accuracies than several state-of-the-art hyperspectral image classification methods.
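    A rough sketch of the FDA viewpoint, with the smoothing step omitted: each pixel's spectrum is treated as a sampled function, functional principal component scores are extracted, and a classifier operates on the scores. The basis, classifier, and synthetic spectra are our assumptions, not the paper's exact pipeline.

```python
# Functional-PCA-style hyperspectral classification on synthetic spectra.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
bands = np.linspace(0, 1, 120)               # 120 spectral bands
n = 300
labels = rng.integers(0, 3, n)               # three land-cover classes
# class-dependent smooth spectral curves plus noise
spectra = (np.sin(2 * np.pi * (labels[:, None] + 1) * bands[None, :])
           + 0.3 * rng.normal(size=(n, 120)))

clf = make_pipeline(PCA(n_components=8), SVC())   # fPCA scores -> classifier
print(cross_val_score(clf, spectra, labels, cv=5).mean())
```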

  6. Feature utility in polarimetric radar image classification

    NASA Technical Reports Server (NTRS)

    Cumming, Ian G.; Van Zyl, Jakob J.

    1989-01-01

    The information content in polarimetric SAR images is examined, and the polarimetric image variables containing the information that is important to the classification of terrain features in the images are determined. It is concluded that accurate classification can be done when just over half of the image variables are retained. A reduction in image data dimensionality gives storage savings, and can lead to the improvement of classifier performance. In addition, it is shown that a simplified radar system with only phase-calibrated CO-POL or SINGLE TX channels can give classification performance which approaches that of a fully polarimetric radar.

  7. Learning accurate very fast decision trees from uncertain data streams

    NASA Astrophysics Data System (ADS)

    Liang, Chunquan; Zhang, Yang; Shi, Peng; Hu, Zhengguo

    2015-12-01

Most existing works on data stream classification assume that the streaming data are precise and definite. This assumption, however, does not always hold in practice, since data uncertainty is ubiquitous in data stream applications due to imprecise measurement, missing values, privacy protection, etc. The goal of this paper is to learn accurate decision tree models from uncertain data streams for classification analysis. Building on very fast decision tree (VFDT) algorithms, we propose an algorithm for constructing an uncertain VFDT tree with classifiers at the tree leaves (uVFDTc). The uVFDTc algorithm exploits uncertain information effectively and efficiently in both the learning and the classification phases. In the learning phase, it uses Hoeffding bound theory to learn from uncertain data streams and quickly yields reasonable decision trees. In the classification phase, it uses uncertain naive Bayes (UNB) classifiers at the tree leaves to improve classification performance. Experimental results on both synthetic and real-life datasets demonstrate the strong ability of uVFDTc to classify uncertain data streams. The use of UNB at the tree leaves improves the performance of uVFDTc, especially its any-time property, the benefit of exploiting uncertain information, and the robustness against uncertainty.
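    The split test at the heart of VFDT-style learners can be stated compactly: with probability 1 − δ, the observed mean of a statistic with range R after n examples lies within ε = sqrt(R² ln(1/δ) / 2n) of its true mean, so a leaf splits once the gain gap between the two best attributes exceeds ε. A minimal sketch, with hypothetical gain values:

```python
# Hoeffding-bound split decision as used by VFDT-style stream learners.
import math

def hoeffding_epsilon(R, delta, n):
    return math.sqrt(R * R * math.log(1.0 / delta) / (2.0 * n))

best_gain, second_gain = 0.42, 0.30          # hypothetical split gains
n_seen, R, delta = 500, math.log2(3), 1e-6   # R: range of info gain, 3 classes
eps = hoeffding_epsilon(R, delta, n_seen)
if best_gain - second_gain > eps:
    print(f"split now (gap {best_gain - second_gain:.2f} > eps {eps:.2f})")
else:
    print(f"keep accumulating examples (eps {eps:.2f})")
```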

  8. Evaluation of rock mass classification schemes: a case study from the Bowen Basin, Australia

    NASA Astrophysics Data System (ADS)

    Brook, Martin; Hebblewhite, Bruce; Mitra, Rudrajit

    2016-04-01

The development of an accurate engineering geological model and adequate knowledge of the spatial variation in rock mass conditions are important prerequisites for slope stability analyses, tunnel design, mine planning and risk management. Rock mass classification schemes such as Rock Mass Rating (RMR), Coal Mine Roof Rating (CMRR), the Q-system and the Roof Strength Index (RSI) have been used for a range of engineering geological applications, including transport tunnels, "hard rock" mining and underground and open-cut coal mines. Often, rock mass classification schemes have been evaluated on subaerial exposures, where weathering has affected joint characteristics and intact strength. In contrast, the focus of this evaluation of the above classification schemes is an underground coal mine in the Bowen Basin, central Queensland, Australia, 15 km east of the town of Moranbah. Rock mass classification was undertaken at 68 sites across the mine. Both the target coal seam and the overlying rock show marked spatial variability in terms of RMR, CMRR and Q, but RSI showed limited sensitivity to changes in rock mass condition. Relationships were developed between different parameters with varying degrees of success. A mine-wide analysis of faulting was undertaken and compared with the in situ stress field and local-scale measurements of joints and cleats. While there are no unequivocal relationships between rock mass classification parameters and faulting, a central graben zone shows heterogeneous rock mass properties. The corollary is that if geological features can be accurately defined by remote sensing technologies, this can assist in predicting rock mass conditions and in risk management ahead of development and construction.

  9. CHILDREN'S DIETARY EXPOSURES TO CHEMICAL CONTAMINANTS

    EPA Science Inventory

    The Food Quality Protection Act of 1996 requires EPA to more accurately assess children's aggregate exposures to environmental contaminants. Children have unstructured eating behaviors which cause excess exposures as a result of their activities. Determining total dietary intak...

  10. Potency values from the local lymph node assay: application to classification, labelling and risk assessment.

    PubMed

    Loveless, S E; Api, A-M; Crevel, R W R; Debruyne, E; Gamer, A; Jowsey, I R; Kern, P; Kimber, I; Lea, L; Lloyd, P; Mehmood, Z; Steiling, W; Veenstra, G; Woolhiser, M; Hennes, C

    2010-02-01

Hundreds of chemicals are contact allergens, but there remains a need to accurately identify and characterise skin sensitising hazards. The purpose of this review was fourfold. First, when using the local lymph node assay (LLNA), consider whether an exposure concentration (EC3 value) lower than 100% can be defined and used as a threshold criterion for classification and labelling. Second, is there any reason to revise the recommendation of a previous ECETOC Task Force regarding specific EC3 values used for sub-categorisation of substances based upon potency? Third, what recommendations can be made regarding classification and labelling of preparations under GHS? Finally, consider how to integrate LLNA data into risk assessment and provide a rationale for using concentration responses and corresponding no-effect concentrations. Although skin sensitising chemicals having high EC3 values may represent only relatively low risks to humans, it is not currently possible to define an EC3 value below 100% that would serve as an appropriate threshold for classification and labelling. The conclusion drawn from reviewing the use of distinct categories for characterising contact allergens was that the most appropriate, science-based classification of contact allergens according to potency is one in which four sub-categories are identified: 'extreme', 'strong', 'moderate' and 'weak'. Since draining lymph node cell proliferation is related causally and quantitatively to potency, LLNA EC3 values are recommended for determination of a no expected sensitisation induction level, which represents the first step in quantitative risk assessment.

  11. Hyperspectral image classification for mapping agricultural tillage practices

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An efficient classification framework for mapping agricultural tillage practice using hyperspectral remote sensing imagery is proposed, which has the potential to be implemented practically to provide rapid, accurate, and objective surveying data for precision agricultural management and appraisal f...

  12. Supervised Machine Learning for Classification of the Electrophysiological Effects of Chronotropic Drugs on Human Induced Pluripotent Stem Cell-Derived Cardiomyocytes.

    PubMed

    Heylman, Christopher; Datta, Rupsa; Sobrino, Agua; George, Steven; Gratton, Enrico

    2015-01-01

    Supervised machine learning can be used to predict which drugs human cardiomyocytes have been exposed to. Using electrophysiological data collected from human cardiomyocytes with known exposure to different drugs, a supervised machine learning algorithm can be trained to recognize and classify cells that have been exposed to an unknown drug. Furthermore, the learning algorithm provides information on the relative contribution of each data parameter to the overall classification. Probabilities and confidence in the accuracy of each classification may also be determined by the algorithm. In this study, the electrophysiological effects of β-adrenergic drugs, propranolol and isoproterenol, on cardiomyocytes derived from human induced pluripotent stem cells (hiPS-CM) were assessed. The electrophysiological data were collected using high temporal resolution 2-photon microscopy of voltage sensitive dyes as a reporter of membrane voltage. The results demonstrate the ability of our algorithm to accurately assess, classify, and predict hiPS-CM membrane depolarization following exposure to chronotropic drugs.

  14. Classification issues related to neuropathic trigeminal pain.

    PubMed

    Zakrzewska, Joanna M

    2004-01-01

    The goal of a classification system of medical conditions is to facilitate accurate communication, to ensure that each condition is described uniformly and universally and that all data banks for the storage and retrieval of research and clinical data related to the conditions are consistent. Classification entails deciding which kinds of diagnostic entities should be recognized and how to order them in a meaningful way. Currently there are 3 major pain classification systems of relevance to orofacial pain: The International Association for the Study of Pain classification system, the International Headache Society classification system, and the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD). All use different methodologies, and only the RDC/TMD take into account social and psychologic factors in the classification of conditions. Classification systems need to be reliable, valid, comprehensive, generalizable, and flexible, and they need to be tested using consensus views of experts as well as the available literature. There is an urgent need for a robust classification system for neuropathic trigeminal pain.

  15. Contribution of various microenvironments to the daily personal exposure to ultrafine particles: Personal monitoring coupled with GPS tracking

    NASA Astrophysics Data System (ADS)

    Bekö, Gabriel; Kjeldsen, Birthe Uldahl; Olsen, Yulia; Schipperijn, Jasper; Wierzbicka, Aneta; Karottki, Dorina Gabriela; Toftum, Jørn; Loft, Steffen; Clausen, Geo

    2015-06-01

Exposure to ultrafine particles (UFP) may have adverse health effects. Central monitoring stations do not represent personal exposure to UFP accurately, and few studies have previously focused on personal UFP exposure. Sixty non-smoking residents of Copenhagen, Denmark were asked to carry a backpack equipped with a portable monitor continuously recording particle number concentrations (PN), in order to measure real-time individual exposure over a period of ~48 h. A GPS logger was carried along with the particle monitor and allowed us to estimate the contribution of UFP exposure occurring in various microenvironments (residence, during active and passive transport, other indoor and outdoor environments) to the total daily exposure. On average, the fractional contribution of each microenvironment to the daily integrated personal exposure roughly corresponded to the fraction of the day the subjects spent in that microenvironment. The home environment accounted for 50% of the daily personal exposure. Indoor environments other than home or vehicles contributed ~40%. The highest median UFP concentration was observed during passive transport (vehicles). However, being in transit or outdoors contributed 5% or less to the daily exposure. Additionally, the subjects recorded in a diary the periods when they were at home. With this approach, 66% of the total daily exposure was attributable to the home environment. The subjects spent 28% more time at home according to the diary than according to the GPS. These results may indicate limitations of using diaries, but also possible inaccuracy and misclassification in the GPS data.

  16. Nanotechnology and Exposure Science

    PubMed Central

    LIOY, PAUL J.; NAZARENKO, YEVGEN; HAN, TAE WON; LIOY, MARY JEAN; MAINELIS, GEDIMINAS

    2014-01-01

    This article discusses the gaps in our understanding of human exposures to nanoparticles stemming from the use of nanotechnology-based consumer products by the general public. It also describes a series of steps that could be taken to characterize such exposures. The suggested steps include classification of the nanotechnology-based products, simulation of realistic exposure patterns, characterization of emissions, analysis of the duration of activities resulting in exposures, and consideration of the bioaccessibility of nanoparticles. In addition, we present a preliminary study with nanotechnology-based cosmetic powders where particle release was studied under realistic powder application conditions. The data demonstrated that when nanotechnology-based cosmetic powders were used, there was a potential for inhaling airborne particles ranging in size from tens of nanometers to tens of micrometers. PMID:21222382

  17. Classification of Raynaud's disease based on angiographic features.

    PubMed

    Kim, Youn Hwan; Ng, Siew-Weng; Seo, Heung Seok; Chang Ahn, Hee

    2011-11-01

Accurate diagnosis and timely management are crucial to avoid ischaemic consequences in Raynaud's disease. There is, however, no objective classification of this disorder to guide surgical planning in refractory cases. We propose a new classification system to achieve this. From 2003 to 2009, we treated 178 patients (351 hands) who underwent surgical intervention due to an ischaemic consequence. We analysed the angiographic features of the arterial supply of the hand at three levels: (1) radial or ulnar, (2) palmar arch and common digital and (3) digital vessels. Subsequent surgical interventions were tailored according to disease type, and included combinations of digital sympathectomy, balloon angioplasty and end-to-end interposition venous or arterial grafting. We classified Raynaud's disease into six types: types I and II involve the radial or ulnar arteries. Type I (27.3%) showed complete occlusion, while type II (26.2%) involved partial occlusion. Type IIIa (27.1%) showed tortuous, narrowed or stenosed common digital and digital vessels. Type IIIb (1.4%) is a subset involving the digital vessel of the index finger, related to prolonged exposure to vibration. Types IV and V showed global involvement from the main to the digital vessels. Type IV (13.7%) showed diffuse tortuosity, narrowing and stenosis. Type V (4.3%) is the most severe, with paucity of vessels and very scant flow. Nearly half (47%) of the patients had associated systemic disease. This new classification provides objective and valuable information for decision making regarding the choice of surgical procedure for patients with Raynaud's disease in whom conservative therapy has failed.

  18. Influence of pansharpening techniques in obtaining accurate vegetation thematic maps

    NASA Astrophysics Data System (ADS)

    Ibarrola-Ulzurrun, Edurne; Gonzalo-Martin, Consuelo; Marcello-Ruiz, Javier

    2016-10-01

In recent decades natural resources have declined, making it important to develop reliable methodologies for their management. The appearance of very high resolution sensors offers a practical and cost-effective means for good environmental management. In this context, the quality of the available information needs to be improved in order to obtain reliably classified images. Pansharpening enhances the spatial resolution of the multispectral bands by incorporating information from the panchromatic image. The main goal of this study is to apply pixel-based and object-based classification techniques to imagery fused with different pansharpening algorithms and to evaluate the thematic maps generated, which serve to obtain accurate information for the conservation of natural resources. A vulnerable, heterogeneous ecosystem in the Canary Islands (Spain), Teide National Park, was chosen, and WorldView-2 high resolution imagery was employed. The classes considered of interest were set by the National Park conservation managers. Seven pansharpening techniques (GS, FIHS, HCS, MTF-based, Wavelet 'à trous' and Weighted Wavelet 'à trous' through Fractal Dimension Maps) were chosen to improve the data quality for analysis of the vegetation classes. Different classification algorithms were then applied using pixel-based and object-based approaches, and an accuracy assessment of the resulting thematic maps was performed. The highest classification accuracy was obtained by applying a Support Vector Machine classifier, with the object-based approach, to the image fused by the Weighted Wavelet 'à trous' through Fractal Dimension Maps method. Finally, we highlight the difficulty of classification in the Teide ecosystem due to its heterogeneity and the small size of the species. It is thus important to obtain accurate thematic maps for further studies in the management and conservation of natural resources.
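    Of the listed methods, fast IHS (FIHS) is the simplest to state: inject the difference between the panchromatic band and the multispectral intensity into each upsampled band. The sketch below shows that substitution only; real pipelines add registration, histogram matching, and band weighting, and the arrays here are random stand-ins.

```python
# Minimal FIHS-style pansharpening sketch.
import numpy as np

def fihs_pansharpen(ms_up, pan):
    """ms_up: (H, W, B) multispectral upsampled to the pan grid; pan: (H, W)."""
    intensity = ms_up.mean(axis=2)                 # simple unweighted intensity
    return ms_up + (pan - intensity)[:, :, None]   # inject detail into each band

ms_up = np.random.rand(64, 64, 4)    # stand-in for upsampled multispectral bands
pan = np.random.rand(64, 64)         # stand-in panchromatic image
sharp = fihs_pansharpen(ms_up, pan)
print(sharp.shape)                   # (64, 64, 4)
```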

  19. On numerically accurate finite element

    NASA Technical Reports Server (NTRS)

    Nagtegaal, J. C.; Parks, D. M.; Rice, J. R.

    1974-01-01

A general criterion for testing a mesh with topologically similar repeat units is given, and the analysis shows that only a few conventional element types and arrangements are, or can be made, suitable for computations in the fully plastic range. Further, a new variational principle, which can easily be incorporated into an existing finite element program, is presented. This allows accurate computations to be made even for element designs that would not normally be suitable. Numerical results are given for three plane strain problems, namely pure bending of a beam, a thick-walled tube under pressure, and a deep double-edge-cracked tensile specimen. The effects of various element designs and of the new variational procedure are illustrated. Elastic-plastic computations at finite strain are also discussed.

  20. Fast and accurate estimation for astrophysical problems in large databases

    NASA Astrophysics Data System (ADS)

    Richards, Joseph W.

    2010-10-01

    A recent flood of astronomical data has created much demand for sophisticated statistical and machine learning tools that can rapidly draw accurate inferences from large databases of high-dimensional data. In this Ph.D. thesis, methods for statistical inference in such databases will be proposed, studied, and applied to real data. I use methods for low-dimensional parametrization of complex, high-dimensional data that are based on the notion of preserving the connectivity of data points in the context of a Markov random walk over the data set. I show how this simple parameterization of data can be exploited to: define appropriate prototypes for use in complex mixture models, determine data-driven eigenfunctions for accurate nonparametric regression, and find a set of suitable features to use in a statistical classifier. In this thesis, methods for each of these tasks are built up from simple principles, compared to existing methods in the literature, and applied to data from astronomical all-sky surveys. I examine several important problems in astrophysics, such as estimation of star formation history parameters for galaxies, prediction of redshifts of galaxies using photometric data, and classification of different types of supernovae based on their photometric light curves. Fast methods for high-dimensional data analysis are crucial in each of these problems because they all involve the analysis of complicated high-dimensional data in large, all-sky surveys. Specifically, I estimate the star formation history parameters for the nearly 800,000 galaxies in the Sloan Digital Sky Survey (SDSS) Data Release 7 spectroscopic catalog, determine redshifts for over 300,000 galaxies in the SDSS photometric catalog, and estimate the types of 20,000 supernovae as part of the Supernova Photometric Classification Challenge. Accurate predictions and classifications are imperative in each of these examples because these estimates are utilized in broader inference problems

  1. Classification Shell Game.

    ERIC Educational Resources Information Center

    Etzold, Carol

    1983-01-01

Discusses shell classification exercises. Through keying, students advanced from the "I know what a shell looks like" stage to become involved in the classification process: observing, labeling, making decisions about categories, and identifying marine animals. (Author/JN)

  2. Aircraft Operations Classification System

    NASA Technical Reports Server (NTRS)

    Harlow, Charles; Zhu, Weihong

    2001-01-01

    Accurate data is important in the aviation planning process. In this project we consider systems for measuring aircraft activity at airports. This would include determining the type of aircraft such as jet, helicopter, single engine, and multiengine propeller. Some of the issues involved in deploying technologies for monitoring aircraft operations are cost, reliability, and accuracy. In addition, the system must be field portable and acceptable at airports. A comparison of technologies was conducted and it was decided that an aircraft monitoring system should be based upon acoustic technology. A multimedia relational database was established for the study. The information contained in the database consists of airport information, runway information, acoustic records, photographic records, a description of the event (takeoff, landing), aircraft type, and environmental information. We extracted features from the time signal and the frequency content of the signal. A multi-layer feed-forward neural network was chosen as the classifier. Training and testing results were obtained. We were able to obtain classification results of over 90 percent for training and testing for takeoff events.
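    A hypothetical sketch of the pipeline described above: spectral band-energy features extracted from acoustic clips, fed to a feed-forward neural network. The feature design, class labels, and synthetic recordings are our assumptions, not the project's actual data.

```python
# Acoustic aircraft-type classification sketch: FFT band energies -> MLP.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

def band_energies(signal, n_bands=16):
    spec = np.abs(np.fft.rfft(signal)) ** 2
    return np.log([b.sum() for b in np.array_split(spec, n_bands)])

# stand-in recordings: 4 aircraft classes x 50 events, 1-second clips
X = np.array([band_energies(rng.normal(size=8000) * (c + 1))
              for c in range(4) for _ in range(50)])
y = np.repeat(np.arange(4), 50)

Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
print(net.fit(Xtr, ytr).score(Xte, yte))
```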

  3. Cirrhosis classification based on texture classification of random features.

    PubMed

    Liu, Hui; Shao, Ying; Guo, Dongmei; Zheng, Yuanjie; Zhao, Zuowei; Qiu, Tianshuang

    2014-01-01

Accurate staging of hepatic cirrhosis is important in investigating its causes and slowing down its effects. Computer-aided diagnosis (CAD) can provide doctors with an alternative second opinion and help them choose a specific treatment based on an accurate cirrhosis stage. MRI has many advantages, including high resolution for soft tissue, no radiation, and multiparameter imaging modalities; in this paper, multisequence MRIs, including T1-weighted, T2-weighted, arterial, portal venous, and equilibrium phase, are therefore applied. However, existing CAD does not meet the clinical needs of cirrhosis staging, and few researchers are currently working on it. Cirrhosis is characterized by the presence of widespread fibrosis and regenerative nodules in the liver, leading to different texture patterns at different stages, so extracting texture features is the primary task. Compared with typical gray level cooccurrence matrix (GLCM) features, texture classification from random features provides an effective alternative; we adopt it and propose CCTCRF for triple classification (normal, early, and middle-and-advanced stage). CCTCRF needs no strong assumptions apart from the sparsity of the image, captures sufficient texture information, uses a concise and effective process, and makes case decisions with high accuracy. Experimental results illustrate its satisfying performance, which is also compared with a typical neural network using GLCM features.
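    As a rough sketch of texture classification from random features (not the paper's exact CCTCRF design): project image patches onto random filters, pool the responses into a feature vector, and classify. The synthetic "textures" below differ only in smoothness.

```python
# Random-feature texture classification sketch.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
patch, n_filters = 8, 64
filters = rng.normal(size=(n_filters, patch * patch))  # random projections

def features(img):
    # non-overlapping 8x8 patches -> random projections -> mean pooling
    h, w = img.shape
    patches = (img[:h - h % patch, :w - w % patch]
               .reshape(h // patch, patch, w // patch, patch)
               .swapaxes(1, 2).reshape(-1, patch * patch))
    return np.abs(patches @ filters.T).mean(axis=0)

def make_image(stage):
    # synthetic textures: higher "stage" means smoother spatial structure
    img = rng.normal(size=(64, 64))
    for _ in range(stage):
        img = (img + np.roll(img, 1, 0) + np.roll(img, 1, 1)) / 3
    return img

X = np.array([features(make_image(s)) for s in (0, 1, 2) for _ in range(30)])
y = np.repeat([0, 1, 2], 30)
print(cross_val_score(SVC(), X, y, cv=5).mean())
```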

  4. Accurate ab Initio Spin Densities.

    PubMed

    Boguslawski, Katharina; Marti, Konrad H; Legeza, Ors; Reiher, Markus

    2012-06-12

We present an approach for the calculation of spin density distributions for molecules that require very large active spaces for a qualitatively correct description of their electronic structure. Our approach is based on the density-matrix renormalization group (DMRG) algorithm to calculate the spin density matrix elements as a basic quantity for the spatially resolved spin density distribution. The spin density matrix elements are directly determined from the second-quantized elementary operators optimized by the DMRG algorithm. As an analytic convergence criterion for the spin density distribution, we employ our recently developed sampling-reconstruction scheme [J. Chem. Phys. 2011, 134, 224101] to build an accurate complete-active-space configuration-interaction (CASCI) wave function from the optimized matrix product states. The spin density matrix elements can then also be determined as an expectation value employing the reconstructed wave function expansion. Furthermore, the explicit reconstruction of a CASCI-type wave function provides insight into chemically interesting features of the molecule under study such as the distribution of α and β electrons in terms of Slater determinants, CI coefficients, and natural orbitals. The methodology is applied to an iron nitrosyl complex which we have identified as a challenging system for standard approaches [J. Chem. Theory Comput. 2011, 7, 2740].

  5. Accurate, reproducible measurement of blood pressure.

    PubMed Central

    Campbell, N R; Chockalingam, A; Fodor, J G; McKay, D W

    1990-01-01

    The diagnosis of mild hypertension and the treatment of hypertension require accurate measurement of blood pressure. Blood pressure readings are altered by various factors that influence the patient, the techniques used and the accuracy of the sphygmomanometer. The variability of readings can be reduced if informed patients prepare in advance by emptying their bladder and bowel, by avoiding over-the-counter vasoactive drugs the day of measurement and by avoiding exposure to cold, caffeine consumption, smoking and physical exertion within half an hour before measurement. The use of standardized techniques to measure blood pressure will help to avoid large systematic errors. Poor technique can account for differences in readings of more than 15 mm Hg and ultimately misdiagnosis. Most of the recommended procedures are simple and, when routinely incorporated into clinical practice, require little additional time. The equipment must be appropriate and in good condition. Physicians should have a suitable selection of cuff sizes readily available; the use of the correct cuff size is essential to minimize systematic errors in blood pressure measurement. Semiannual calibration of aneroid sphygmomanometers and annual inspection of mercury sphygmomanometers and blood pressure cuffs are recommended. We review the methods recommended for measuring blood pressure and discuss the factors known to produce large differences in blood pressure readings. PMID:2192791

  6. Radiation exposure and pregnancy.

    PubMed

    Labant, Amy; Silva, Christina

    2014-01-01

    Radiological exposure from nuclear power reactor accidents, transportation of nuclear waste accidents, industrial accidents, or terrorist activity may be a remote possibility, but it could happen. Nurses must be prepared to evaluate and treat pregnant women and infants who have been exposed to radiation, and to have an understanding of the health consequences of a nuclear or radiological incident. Pregnant women and infants are a special group of patients who need consideration when exposed to radiation. Initial care requires thorough assessment and decisions regarding immediate care needs. Ongoing care is based on type and extent of radiation exposure. With accurate, comprehensive information and education, nurses will be better prepared to help mitigate the effects of radiation exposure to pregnant women and infants following a radiological incident. Information about radiation, health effects of prenatal radiation exposure, assessment, patient care, and treatment of pregnant women and infants are presented.

  7. Automatic classification of blank substrate defects

    NASA Astrophysics Data System (ADS)

    Boettiger, Tom; Buck, Peter; Paninjath, Sankaranarayanan; Pereira, Mark; Ronald, Rob; Rost, Dan; Samir, Bhamidipati

    2014-10-01

Mask preparation stages are crucial in mask manufacturing, since the mask will later act as a template for a considerable number of dies on a wafer. Defects on the initial blank substrate, and on subsequently cleaned and coated substrates, can have a profound impact on the usability of the finished mask. This emphasizes the need for early and accurate identification of blank substrate defects and of the risk they pose to the patterned reticle. While Automatic Defect Classification (ADC) is a well-developed technology for inspection and analysis of defects on patterned wafers and masks in the semiconductor industry, ADC for mask blanks is still in the early stages of adoption and development. Calibre ADC is a powerful analysis tool for fast, accurate, consistent and automatic classification of defects on mask blanks. Accurate, automated classification of mask blanks leads to better usability of blanks by enabling defect avoidance technologies during mask writing. Detailed information on blank defects can help to select appropriate job-decks to be written on the mask by defect avoidance tools [1][4][5]. Smart algorithms separate critical defects from the potentially large number of non-critical or false defects detected at various stages during mask blank preparation. Mechanisms used by Calibre ADC to identify and characterize defects include defect location and size, signal polarity (dark, bright) in both transmitted and reflected review images, and the separation of defect signals from background noise in defect images. The Calibre ADC engine then uses a decision tree to translate this information into a defect classification code. Using this automated process improves classification accuracy, repeatability and speed, while avoiding the subjectivity of human judgment inherent in the alternative of manual defect classification by trained personnel [2]. This paper focuses on the results from the evaluation of the Automatic Defect Classification (ADC) product at MP Mask
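    A toy illustration of the decision-tree step described above, with invented feature names, values, and class codes: attributes such as defect size and signal polarity in transmitted and reflected images map to a classification code.

```python
# Hypothetical decision-tree defect classifier on invented blank-defect features.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# columns: size_nm, polarity_transmitted (+1 bright / -1 dark), polarity_reflected
X = np.array([[120, -1, -1], [300, -1, -1], [ 80, +1, +1],
              [250, +1, -1], [400, +1, -1], [ 60, -1, +1]])
y = ["pit", "pit", "particle", "bump", "bump", "particle"]

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["size_nm", "pol_tx", "pol_rx"]))
print(tree.predict([[150, -1, -1]]))   # -> likely "pit"
```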

  8. Adaptive training using an artificial neural network and EEG metrics for within- and cross-task workload classification.

    PubMed

    Baldwin, Carryl L; Penaranda, B N

    2012-01-02

Adaptive training using neurophysiological measures requires efficient classification of mental workload in real time as a learner encounters new and increasingly difficult levels of tasks. Previous investigations have shown that artificial neural networks (ANNs) can accurately classify workload, but only when trained on neurophysiological exemplars from experienced operators on specific tasks. The present study examined classification accuracies for ANNs trained on electroencephalographic (EEG) activity recorded while participants performed the same (within-task) and different (cross-task) tasks for short periods of time with little or no prior exposure to the tasks. Participants performed three working memory tasks at two difficulty levels, with order of task and difficulty level counterbalanced. Within-task classification accuracies were high when ANNs were trained on exemplars from the same task or a set containing the to-be-classified task (M=87.1% and 85.3%, respectively). Cross-task classification accuracies were significantly lower (average 44.8%), indicating consistent systematic misclassification for certain tasks in some individuals. Results are discussed in terms of their implications for developing neurophysiologically driven adaptive training platforms.
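    The within- vs cross-task gap can be illustrated with a toy experiment: train an ANN on invented band-power features from one "task" and test it on a distribution-shifted second task. Everything here (feature layout, shift size) is assumed for illustration.

```python
# Within- vs cross-task workload classification with an MLP on synthetic EEG features.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)

def make_task(shift):
    """120 trials x 20 band-power features; 'shift' mimics task-specific EEG."""
    y = rng.integers(0, 2, 120)                       # low vs high workload
    X = rng.normal(size=(120, 20)) + y[:, None] + shift
    return X, y

(Xa, ya), (Xb, yb) = make_task(0.0), make_task(1.5)   # tasks A and B
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(Xa[:80], ya[:80])
print("within-task:", net.score(Xa[80:], ya[80:]))    # typically high
print("cross-task: ", net.score(Xb, yb))              # degrades with the shift
```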

  9. Accurate three-dimensional documentation of distinct sites

    NASA Astrophysics Data System (ADS)

    Singh, Mahesh K.; Dutta, Ashish; Subramanian, Venkatesh K.

    2017-01-01

One of the most critical aspects of documenting distinct sites is acquiring detailed and accurate range information. Several three-dimensional (3-D) acquisition techniques are available, but each has its own limitations. This paper presents a range data fusion method with the aim of enhancing the descriptive content of the entire 3-D reconstructed model. A kernel function is introduced for supervised classification of the range data using a kernelized support vector machine. The classification method is based on the local saliency features of the acquired range data. The range data acquired from heterogeneous range sensors are transformed into a defined common reference frame. Based on the segmentation criterion, the fusion of range data is performed by integrating the finer regions of range data acquired from a laser range scanner with the coarser regions of the Kinect's range data. After fusion, the Delaunay triangulation algorithm is applied to generate a highly accurate, realistic 3-D model of the scene. Finally, experimental results show the robustness of the proposed approach.

  10. Active learning framework with iterative clustering for bioimage classification.

    PubMed

    Kutsuna, Natsumaro; Higaki, Takumi; Matsunaga, Sachihiro; Otsuki, Tomoshi; Yamaguchi, Masayuki; Fujii, Hirofumi; Hasezawa, Seiichiro

    2012-01-01

Advances in imaging systems have yielded a flood of images into the research field. A semi-automated facility can reduce the laborious task of classifying this large number of images. Here we report the development of a novel framework applicable to bioimage classification, CARTA (Clustering-Aided Rapid Training Agent), that facilitates annotation and feature selection. CARTA comprises an active learning algorithm combined with a genetic algorithm and a self-organizing map. The framework provides an easy and interactive annotation method and accurate classification. CARTA enables classification of subcellular localization, mitotic phases and discrimination of apoptosis in images of plant and human cells with an accuracy level greater than or equal to that of human annotators. CARTA can be applied to classification of magnetic resonance images of cancer cells or multicolour time-course images after surgery. Furthermore, CARTA can support development of customized features for classification, high-throughput phenotyping and application of various classification schemes depending on the user's purpose.

  11. Hierarchical image classification in the bioscience literature.

    PubMed

    Kim, Daehyun; Yu, Hong

    2009-11-14

    Our previous work has shown that images appearing in bioscience articles can be classified into five types: Gel-Image, Image-of-Thing, Graph, Model, and Mix. For this paper, we explored and analyzed features strongly associated with each image type and developed a hierarchical image classification approach for classifying an image into one of the five types. First, we applied texture features to separate images into two groups: 1) a texture group comprising Gel Image, Image-of-Thing, and Mix, and 2) a non-texture group comprising Graph and Model. We then applied entropy, skewness, and uniformity for the first group, and edge difference, uniformity, and smoothness for the second group to classify images into specific types. Our results show that hierarchical image classification accurately divided images into the two groups during the initial classification and that the overall accuracy of the image classification was higher than that of our previous approach. In particular, the recall of hierarchical image classification was greatly improved due to the high accuracy of the initial classification.

  12. Prostate segmentation by sparse representation based classification

    PubMed Central

    Gao, Yaozong; Liao, Shu; Shen, Dinggang

    2012-01-01

    Purpose: The segmentation of prostate in CT images is of essential importance to external beam radiotherapy, which is one of the major treatments for prostate cancer nowadays. During the radiotherapy, the prostate is radiated by high-energy x rays from different directions. In order to maximize the dose to the cancer and minimize the dose to the surrounding healthy tissues (e.g., bladder and rectum), the prostate in the new treatment image needs to be accurately localized. Therefore, the effectiveness and efficiency of external beam radiotherapy highly depend on the accurate localization of the prostate. However, due to the low contrast of the prostate with its surrounding tissues (e.g., bladder), the unpredicted prostate motion, and the large appearance variations across different treatment days, it is challenging to segment the prostate in CT images. In this paper, the authors present a novel classification based segmentation method to address these problems. Methods: To segment the prostate, the proposed method first uses sparse representation based classification (SRC) to enhance the prostate in CT images by pixel-wise classification, in order to overcome the limitation of poor contrast of the prostate images. Then, based on the classification results, previous segmented prostates of the same patient are used as patient-specific atlases to align onto the current treatment image and the majority voting strategy is finally adopted to segment the prostate. In order to address the limitations of the traditional SRC in pixel-wise classification, especially for the purpose of segmentation, the authors extend SRC from the following four aspects: (1) A discriminant subdictionary learning method is proposed to learn a discriminant and compact representation of training samples for each class so that the discriminant power of SRC can be increased and also SRC can be applied to the large-scale pixel-wise classification. (2) The L1 regularized sparse coding is replaced by
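    For context, here is a bare-bones sketch of sparse representation based classification in its classic form: code a test sample over a dictionary of training samples with an L1 solver and assign the class whose atoms give the smallest reconstruction residual. This illustrates plain SRC, not the authors' extended, segmentation-oriented variant; Lasso and the synthetic subspace data are stand-ins.

```python
# Classic SRC: sparse code over class-labeled dictionary, residual-based decision.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(4)
d, per_class, n_classes = 30, 20, 3
bases = [np.linalg.qr(rng.normal(size=(d, 5)))[0] for _ in range(n_classes)]
D = np.hstack([B @ rng.normal(size=(5, per_class)) for B in bases])
D /= np.linalg.norm(D, axis=0)                    # unit-norm training atoms
labels = np.repeat(np.arange(n_classes), per_class)

y = bases[1] @ rng.normal(size=5)                 # test sample from class 1
y /= np.linalg.norm(y)
x = Lasso(alpha=0.01, max_iter=100000).fit(D, y).coef_   # sparse code

resid = [np.linalg.norm(y - D[:, labels == c] @ x[labels == c])
         for c in range(n_classes)]
print("predicted class:", int(np.argmin(resid)))  # expect 1
```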

  13. Security classification of information

    SciTech Connect

    Quist, A.S.

    1993-04-01

This document is the second of a planned four-volume work that comprehensively discusses the security classification of information. The main focus of Volume 2 is on the principles for classification of information. Included herein are descriptions of the two major types of information that governments classify for national security reasons (subjective and objective information), guidance to use when determining whether information under consideration for classification is controlled by the government (a necessary requirement for classification to be effective), information disclosure risks and benefits (the benefits and costs of classification), standards to use when balancing information disclosure risks and benefits, guidance for assigning classification levels (Top Secret, Secret, or Confidential) to classified information, guidance for determining how long information should be classified (classification duration), classification of associations of information, classification of compilations of information, and principles for declassifying and downgrading information. Rules or principles of certain areas of our legal system (e.g., trade secret law) are sometimes mentioned to provide added support to some of those classification principles.

  14. Recursive heuristic classification

    NASA Technical Reports Server (NTRS)

    Wilkins, David C.

    1994-01-01

The author describes a new problem-solving approach called recursive heuristic classification, whereby a subproblem of heuristic classification is itself formulated and solved by heuristic classification. This allows the construction of more knowledge-intensive classification programs in a way that yields a clean organization. Further, standard knowledge acquisition and learning techniques for heuristic classification can be used to create, refine, and maintain the knowledge base associated with the recursively called classification expert system. The method of recursive heuristic classification was used in the Minerva blackboard shell for heuristic classification. Minerva recursively calls itself every problem-solving cycle to solve the important blackboard scheduler task, which involves assigning a desirability rating to alternative problem-solving actions. Knowing these ratings is critical to the use of an expert system as a component of a critiquing or apprenticeship tutoring system. One innovation of this research is a method called dynamic heuristic classification, which allows selection among dynamically generated classification categories instead of requiring them to be pre-enumerated.

  15. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2013-07-01 2013-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  16. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2012-07-01 2012-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  17. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2010-07-01 2010-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  18. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2014-07-01 2014-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  19. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2011-07-01 2011-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  20. Classification of Stellar Spectra

    NASA Astrophysics Data System (ADS)

    Garrison, R.; Murdin, P.

    2000-11-01

    How does a scientist approach the problem of trying to understand countless billions of objects? One of the first steps is to organize the data and set up a classification scheme which can provide the best insights into the nature of the objects. Perception and insight are the main purposes of classification. In astronomy, where there are `billions and billions' of stars, classification is an ong...

  1. Classiology and soil classification

    NASA Astrophysics Data System (ADS)

    Rozhkov, V. A.

    2012-03-01

Classiology can be defined as a science studying the principles and rules of classification of objects of any nature. The development of the theory of classification and of particular methods for classifying objects are the main challenges of classiology; to a certain extent, they are close to the challenges of pattern recognition. The methodology of classiology integrates a wide range of methods and approaches: from expert judgment to formal logic, multivariate statistics, and informatics. Soil classification assumes generalization of the available data and practical experience, formalization of our notions about soils, and their representation in the form of an information system. As an information system, soil classification is designed to predict the maximum number of a soil's properties from the position of that soil in the classification space. The existing soil classification systems do not completely satisfy the principles of classiology. Violations of the logical basis, poor structuring, low integrity, and an inadequate level of formalization make these systems verbal schemes rather than classification systems sensu stricto. The concept of classification as a listing (enumeration) of objects makes it possible to introduce the notion of the information base of a classification. For soil objects, this is the database of soil indices (properties) that might be applied for generating target-oriented soil classification systems. Mathematical methods enlarge the prognostic capacity of classification systems; they can be applied to assess the quality of these systems and to recognize new soil objects to be included in the existing systems. The application of particular principles and rules of classiology for soil classification purposes is discussed in this paper.

  2. Exposure Nomographs

    NASA Astrophysics Data System (ADS)

    Zissell, Ronald E.

    Correct exposure times may be determined from nomographs relating signal-to-noise ratio, exposure time, color, seeing, and magnitude. The equations needed to construct the nomographs are developed. Calibration techniques are discussed.

  3. Multilayer Perceptrons for Classification

    DTIC Science & Technology

    1992-03-01

Two applications are considered. The first concerns predicting Air Force pilot retention/separation rates for input to force projection (personnel inventory) models; methodologies for predicting these rates were explored, specifically the multilayer perceptron. The second application concerns the classification of Armor Piercing Incendiary (API…

  4. DOE LLW classification rationale

    SciTech Connect

    Flores, A.Y.

    1991-09-16

This report presents the rationale behind the US Department of Energy's classification of low-level radioactive waste (LLW). It is based on the Nuclear Regulatory Commission's classification system. DOE site operators met to review the qualifications and characteristics of the classification systems. They evaluated performance objectives, developed waste classification tables, and compiled dose limits on the waste. A goal of the LLW classification system was to allow each disposal site the freedom to develop limits on radionuclide inventories and concentrations according to its own site-specific characteristics. This goal was achieved with the adoption of a performance objectives system based on a performance assessment, with site-specific environmental conditions and engineered disposal systems.

  6. Improved Surgical Site Infection (SSI) rate through accurately assessed surgical wounds.

    PubMed

    John, Honeymol; Nimeri, Abdelrahman; Ellahham, Samer

    2015-01-01

Sheikh Khalifa Medical City's (SKMC) Surgery Institute was identified as a high outlier in Surgical Site Infections (SSI) based on the American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) Semi-Annual Report (SAR) in January 2012. The aim of this project was to improve SSI rates through accurate wound classification. We identified SSI rate reduction as a performance improvement and safety priority at SKMC, a tertiary referral center, and used the ACS NSQIP best practice guidelines as a guide. ACS NSQIP is a clinical registry that provides risk-adjusted clinical outcome reports every six months. The rates of SSI are reported as an observed/expected ratio; the expected ratio is calculated from the risk factors of the patients, which include wound classification. We established a multidisciplinary SSI taskforce whose members included the ACS NSQIP team, quality, surgeons, nurses, infection control, IT, pharmacy, and microbiology, chaired by a colorectal surgeon. The taskforce focused on five areas: pre-op showering and hair removal, skin antisepsis, prophylactic antibiotics, peri-operative maintenance of glycaemia, and normothermia. We planned audits to evaluate our wound classification and our SSI rates based on the SAR. Our expected SSI rates in general surgery and the whole department were 2.52% and 1.70% respectively, while our observed SSI rates were 4.68% and 3.57% respectively, giving us high outlier status with odds ratios of 1.72 and 2.03. Wound classification was identified as an area of concern: for example, wound classifications were preoperatively selected based on the default wound classification of the booked procedure in the Electronic Medical Record (EMR), which led to underclassification of wounds on many occasions. A total of 998 cases were reviewed, our rate of incorrect wound classification

  8. Photometric Supernova Classification with Machine Learning

    NASA Astrophysics Data System (ADS)

    Lochner, Michelle; McEwen, Jason D.; Peiris, Hiranya V.; Lahav, Ofer; Winter, Max K.

    2016-08-01

    Automated photometric supernova classification has become an active area of research in recent years in light of current and upcoming imaging surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope, given that spectroscopic confirmation of type for all supernovae discovered will be impossible. Here, we develop a multi-faceted classification pipeline, combining existing and new approaches. Our pipeline consists of two stages: extracting descriptive features from the light curves and classification using a machine learning algorithm. Our feature extraction methods vary from model-dependent techniques, namely SALT2 fits, to more independent techniques that fit parametric models to curves, to a completely model-independent wavelet approach. We cover a range of representative machine learning algorithms, including naive Bayes, k-nearest neighbors, support vector machines, artificial neural networks, and boosted decision trees (BDTs). We test the pipeline on simulated multi-band DES light curves from the Supernova Photometric Classification Challenge. Using the commonly used area under the curve (AUC) of the Receiver Operating Characteristic as a metric, we find that the SALT2 fits and the wavelet approach, with the BDTs algorithm, each achieve an AUC of 0.98, where 1 represents perfect classification. We find that a representative training set is essential for good classification, whatever the feature set or algorithm, with implications for spectroscopic follow-up. Importantly, we find that by using either the SALT2 or the wavelet feature sets with a BDT algorithm, accurate classification is possible purely from light curve data, without the need for any redshift information.
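
    As a rough illustration of the second stage of such a pipeline (features in, classifier and AUC out), the sketch below trains boosted decision trees and scores them with the ROC AUC. The feature matrix is a randomly generated stand-in, not actual SALT2 or wavelet coefficients.

      import numpy as np
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.normal(size=(2000, 20))   # stand-in light-curve features
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=2000) > 0).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      bdt = GradientBoostingClassifier().fit(X_tr, y_tr)        # boosted decision trees
      auc = roc_auc_score(y_te, bdt.predict_proba(X_te)[:, 1])  # 1.0 = perfect
      print(f"AUC = {auc:.3f}")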

  9. Structured sparse models for classification

    NASA Astrophysics Data System (ADS)

    Castrodad, Alexey

    The main focus of this thesis is the modeling and classification of high dimensional data using structured sparsity. Sparse models, where data is assumed to be well represented as a linear combination of a few elements from a dictionary, have gained considerable attention in recent years, and their use has led to state-of-the-art results in many signal and image processing tasks. The success of sparse modeling is largely due to its ability to efficiently use the redundancy of the data and find its underlying structure. In a classification setting, we capitalize on this advantage to properly model and separate the structure of the classes. We design and validate modeling solutions to challenging problems arising in computer vision and remote sensing. We propose both supervised and unsupervised schemes for the modeling of human actions from motion imagery under a wide variety of acquisition conditions. In the supervised case, the main goal is to classify the human actions in the video given a predefined set of actions to learn from. In the unsupervised case, the main goal is to analyze the spatio-temporal dynamics of the individuals in the scene without having any prior information on the actions themselves. We also propose a model for remotely sensed hyperspectral imagery, where the main goal is to perform automatic spectral source separation and mapping at the subpixel level. Finally, we present a sparse model for sensor fusion to exploit the common structure and enforce collaboration of hyperspectral with LiDAR data for better mapping capabilities. In all these scenarios, we demonstrate that these data can be expressed as a combination of atoms from a class-structured dictionary. This data representation becomes essentially a "mixture of classes," and by directly exploiting the sparse codes, one can attain highly accurate classification performance with relatively unsophisticated classifiers.
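
    The "mixture of classes" idea admits a compact illustration: build a dictionary whose atoms are grouped by class, sparse-code a test signal against it, and assign the class whose coefficient block carries the most energy. The sketch below uses orthogonal matching pursuit on synthetic atoms; it is a simplified stand-in, not the thesis's learned dictionaries.

      import numpy as np
      from sklearn.linear_model import orthogonal_mp

      rng = np.random.default_rng(8)
      dim, per_class = 30, 40
      # Class-structured dictionary: the first 40 atoms come from class 0,
      # the next 40 from class 1 (synthetic, well-separated populations).
      D = np.hstack([rng.normal(loc=1.0, size=(dim, per_class)),
                     rng.normal(loc=-1.0, size=(dim, per_class))])
      D /= np.linalg.norm(D, axis=0)                  # unit-norm atoms

      x = rng.normal(loc=1.0, size=dim)               # test signal, class-0-like
      code = orthogonal_mp(D, x, n_nonzero_coefs=5)   # sparse code over all atoms

      # Classify by where the sparse code's energy concentrates.
      energy = [np.abs(code[:per_class]).sum(), np.abs(code[per_class:]).sum()]
      print("Predicted class:", int(np.argmax(energy)))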

  10. Multisensor Target Detection And Classification

    NASA Astrophysics Data System (ADS)

    Ruck, Dennis W.; Rogers, Steven K.; Mills, James P.; Kabrisky, Matthew

    1988-08-01

    In this paper a new approach to the detection and classification of tactical targets using a multifunction laser radar sensor is developed. Targets of interest are tanks, jeeps, trucks, and other vehicles. Doppler images are segmented by developing a new technique which compensates for spurious Doppler returns. Relative range images are segmented using an approach based on range gradients. The resultant shapes in the segmented images are then classified using Zernike moment invariants as shape descriptors. Two classification decision rules are implemented: a classical statistical nearest-neighbor approach and a multilayer perceptron architecture. The Doppler segmentation algorithm was applied to a set of 180 real sensor images. An accurate segmentation was obtained for 89 percent of the images. The new Doppler segmentation proved to be a robust method, and the moment invariants were effective in discriminating the tactical targets. Tanks were classified correctly 86 percent of the time. The most important result of this research is the demonstration of the use of a new information processing architecture for image processing applications.

  11. ANALYSIS OF DISCRIMINATING FACTORS IN HUMAN ACTIVITIES THAT AFFECT EXPOSURE

    EPA Science Inventory

    Accurately modeling exposure to particulate matter (PM) and other pollutants ultimately involves the utilization of human location-activity databases to assist in understanding the potential variability of microenvironmental exposures. This paper critically considers and stati...

  12. A novel ensemble machine learning for robust microarray data classification.

    PubMed

    Peng, Yonghong

    2006-06-01

    Microarray data analysis and classification have demonstrated convincingly that they provide an effective methodology for the diagnosis of diseases and cancers. Although much research has been performed on applying machine learning techniques to microarray data classification during the past years, it has been shown that conventional machine learning techniques have intrinsic drawbacks in achieving accurate and robust classifications. This paper presents a novel ensemble machine learning approach for the development of robust microarray data classification. Different from conventional ensemble learning techniques, the presented approach begins by generating a pool of candidate base classifiers via gene sub-sampling, and then selects a subset of appropriate base classifiers, based on classifier clustering, to construct the classification committee. Experimental results have demonstrated that the classifiers constructed by the proposed method outperform not only the classifiers generated by conventional machine learning but also those generated by two widely used conventional ensemble learning methods (bagging and boosting).
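
    A condensed sketch of the pool-generation step (gene sub-sampling) follows; the paper's clustering-based selection of committee members is approximated here by simple majority voting over the whole pool, so this is illustrative only.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(1)
      X = rng.normal(size=(100, 500))               # 100 samples x 500 synthetic "genes"
      y = (X[:, :5].sum(axis=1) > 0).astype(int)    # class driven by a few genes

      # Pool of candidate base classifiers, each trained on a random gene subset.
      pool = []
      for _ in range(30):
          genes = rng.choice(500, size=50, replace=False)
          pool.append((genes, DecisionTreeClassifier(random_state=0).fit(X[:, genes], y)))

      def committee_predict(X_new):
          # Majority vote; the paper instead clusters the pool and keeps
          # one representative classifier per cluster.
          votes = np.stack([clf.predict(X_new[:, genes]) for genes, clf in pool])
          return (votes.mean(axis=0) > 0.5).astype(int)

      print("Training accuracy:", (committee_predict(X) == y).mean())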

  13. Behavior Based Social Dimensions Extraction for Multi-Label Classification

    PubMed Central

    Li, Le; Xu, Junyi; Xiao, Weidong; Ge, Bin

    2016-01-01

    Classification based on social dimensions is commonly used to handle the multi-label classification task in heterogeneous networks. However, traditional methods, which mostly rely on the community detection algorithms to extract the latent social dimensions, produce unsatisfactory performance when community detection algorithms fail. In this paper, we propose a novel behavior based social dimensions extraction method to improve the classification performance in multi-label heterogeneous networks. In our method, nodes’ behavior features, instead of community memberships, are used to extract social dimensions. By introducing Latent Dirichlet Allocation (LDA) to model the network generation process, nodes’ connection behaviors with different communities can be extracted accurately, which are applied as latent social dimensions for classification. Experiments on various public datasets reveal that the proposed method can obtain satisfactory classification results in comparison to other state-of-the-art methods on smaller social dimensions. PMID:27049849
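
    The LDA step can be sketched by treating each node's connection counts as a "document" over neighbors and taking the inferred topic (community) mixture as its social dimensions. The sketch below uses a synthetic network and single-label stand-in targets rather than the paper's multi-label datasets.

      import numpy as np
      from sklearn.decomposition import LatentDirichletAllocation
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(2)
      n_nodes = 200
      adjacency = rng.poisson(0.2, size=(n_nodes, n_nodes))  # synthetic edge counts

      # Model each node's connection behavior as a mixture of latent communities;
      # the mixture proportions become the social dimensions.
      lda = LatentDirichletAllocation(n_components=5, random_state=0)
      social_dims = lda.fit_transform(adjacency)

      labels = rng.integers(0, 2, size=n_nodes)     # invented node labels
      clf = LogisticRegression().fit(social_dims, labels)
      print("Social dimensions:", social_dims.shape)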

  14. The revised lung adenocarcinoma classification-an imaging guide.

    PubMed

    Gardiner, Natasha; Jogai, Sanjay; Wallis, Adam

    2014-10-01

    Advances in our understanding of the pathology, radiology and clinical behaviour of peripheral lung adenocarcinomas facilitated a more robust terminology and classification of these lesions. The International Association for the Study of Lung Cancer/American Thoracic Society/European Respiratory Society (IASLC/ATS/ERS) classification introduced new terminology to better reflect this heterogeneous group of adenocarcinomas formerly known as bronchoalveolar cell carcinoma (BAC). There is now a clear distinction between pre-invasive, minimally invasive and frankly invasive lesions. The radiographic appearance of these ranges from pure ground glass nodules to solid mass lesions. Radiologists must be aware of the new classification in order to work alongside multidisciplinary colleagues to allow accurate staging and treatment. This article reviews the new classification of lung adenocarcinomas. Management options of these lesions with particular focus on radiological implications of the new classification will be reviewed.

  15. Intraregional classification of wine via ICP-MS elemental fingerprinting.

    PubMed

    Coetzee, P P; van Jaarsveld, F P; Vanhaecke, F

    2014-12-01

    The feasibility of elemental fingerprinting in the classification of wines according to their provenance vineyard soil was investigated in the relatively small geographical area of a single wine district. Results for the Stellenbosch wine district (Western Cape Wine Region, South Africa), comprising an area of less than 1,000 km(2), suggest that classification of wines from different estates (120 wines from 23 estates) is indeed possible using accurate elemental data and multivariate statistical analysis based on a combination of principal component analysis, cluster analysis, and discriminant analysis. This is the first study to demonstrate the successful classification of wines at estate level in a single wine district in South Africa. The elements B, Ba, Cs, Cu, Mg, Rb, Sr, Tl and Zn were identified as suitable indicators. White and red wines were grouped in separate data sets to allow successful classification of wines. Correlation between wine classification and soil type distributions in the area was observed.
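
    A minimal version of the statistical chain (standardize the element concentrations, compress with PCA, separate with discriminant analysis) might look as follows; the data are random stand-ins and only a handful of estates are simulated.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(3)
      n_wines, n_elements = 120, 9   # nine indicator elements, as in the study
      estate = rng.integers(0, 6, size=n_wines)
      X = rng.normal(size=(n_wines, n_elements)) + 0.5 * estate[:, None]

      model = make_pipeline(StandardScaler(), PCA(n_components=5),
                            LinearDiscriminantAnalysis())
      model.fit(X, estate)
      print("Training accuracy:", model.score(X, estate))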

  16. Classification of earth terrain using polarimetric synthetic aperture radar images

    NASA Technical Reports Server (NTRS)

    Lim, H. H.; Swartz, A. A.; Yueh, H. A.; Kong, J. A.; Shin, R. T.; Van Zyl, J. J.

    1989-01-01

    Supervised and unsupervised classification techniques are developed and used to classify the earth terrain components from SAR polarimetric images of San Francisco Bay and Traverse City, Michigan. The supervised techniques include the Bayes classifiers, normalized polarimetric classification, and simple feature classification using discriminants such as the absolute and normalized magnitude response of individual receiver channel returns and the phase difference between receiver channels. An algorithm is developed as an unsupervised technique which classifies terrain elements based on the relationship between the orientation angle and the handedness of the transmitting and receiving polarization states. It is found that supervised classification produces the best results when accurate classifier training data are used, while unsupervised classification may be applied when training data are not available.

  17. Behavior Based Social Dimensions Extraction for Multi-Label Classification.

    PubMed

    Li, Le; Xu, Junyi; Xiao, Weidong; Ge, Bin

    2016-01-01

    Classification based on social dimensions is commonly used to handle the multi-label classification task in heterogeneous networks. However, traditional methods, which mostly rely on the community detection algorithms to extract the latent social dimensions, produce unsatisfactory performance when community detection algorithms fail. In this paper, we propose a novel behavior based social dimensions extraction method to improve the classification performance in multi-label heterogeneous networks. In our method, nodes' behavior features, instead of community memberships, are used to extract social dimensions. By introducing Latent Dirichlet Allocation (LDA) to model the network generation process, nodes' connection behaviors with different communities can be extracted accurately, which are applied as latent social dimensions for classification. Experiments on various public datasets reveal that the proposed method can obtain satisfactory classification results in comparison to other state-of-the-art methods on smaller social dimensions.

  18. Target classification algorithm based on feature aided tracking

    NASA Astrophysics Data System (ADS)

    Zhan, Ronghui; Zhang, Jun

    2013-03-01

    An effective target classification algorithm based on feature aided tracking (FAT) is proposed, using the length of the target (target extent) as the classification information. To implement the algorithm, the Rao-Blackwellised unscented Kalman filter (RBUKF) is used to jointly estimate the kinematic state and target extent; meanwhile, the joint probability data association (JPDA) algorithm is exploited to implement multi-target data association aided by target down-range extent. Simulation results under different conditions show that the presented algorithm is both accurate and robust, and that it is suitable for tracking and classifying closely spaced targets in dense clutter.

  19. Methodology for hyperspectral image classification using novel neural network

    SciTech Connect

    Subramanian, S.; Gat, N.; Sheffield, M.; Barhen, J.; Toomarian, N.

    1997-04-01

    A novel feed-forward neural network is used to classify hyperspectral data from the AVIRIS sensor. The network applies an alternating direction singular value decomposition technique to achieve rapid training times (a few seconds per class). Very few samples (10-12) are required for training. 100% accurate classification is obtained using test data sets. The methodology combines this rapid-training neural network with data reduction and maximal feature separation techniques, such as principal component analysis and simultaneous diagonalization of covariance matrices, for rapid and accurate classification of large hyperspectral images. The results are compared to those of standard statistical classifiers. 21 refs., 3 figs., 5 tabs.

  20. Military Exposures

    MedlinePlus

    ... Chemicals (Agent Orange, contaminated water…) Radiation (nuclear weapons, X-rays…) Air Pollutants (burn pit smoke, dust…) Occupational Hazards (asbestos, lead…) Warfare Agents (chemical and biological weapons) Exposure ...

  1. RELIABILITY OF BIOMARKERS OF PESTICIDE EXPOSURE AMONG CHILDREN AND ADULTS IN CTEPP OHIO

    EPA Science Inventory

    Urinary biomarkers offer the potential for providing an efficient tool for exposure classification by reflecting the aggregate of all exposure routes. Substantial variability observed in urinary pesticide metabolite concentrations over short periods of time, however, has cast so...

  2. Personal Exposure Monitoring Wearing Protocol Compliance: An Initial Assessment of Quantitative Measurements

    EPA Science Inventory

    Personal exposure sampling provides the most accurate and representative assessment of exposure to a pollutant, but only if measures are implemented to minimize exposure misclassification and reduce confounders that may cause misinterpretation of the collected data. Poor complian...

  3. Classification criteria for rheumatoid arthritis.

    PubMed

    MacGregor, A J

    1995-05-01

    incorporate features of past disease activity, for example by allowing deformity to substitute for swelling and by incorporating data on the past occurrence of rheumatoid factor and rheumatoid nodules. Developments in the immunology and genetics of RA may in the future provide more accurate tools for classification and may lead to recognition of more precise disease subsets. At present, however, the 1987 ARA criteria provide the most appropriate basis for case recognition in both clinic and population-based studies.

  4. Does Shamblin's classification predict postoperative morbidity in carotid body tumors? A proposal to modify Shamblin's classification.

    PubMed

    Luna-Ortiz, Kuauhyama; Rascon-Ortiz, Mario; Villavicencio-Valencia, Veronica; Herrera-Gomez, Angel

    2006-02-01

    The objective of this study was to analyze the possible correlation between Shamblin's classification and post-surgical morbidity in the treatment of carotid body tumors (CBTs). Seventy-two patients with carotid body tumors were seen over a 22-year period. Twenty-three patients were excluded as they did not comply with the criteria of the objectives. All patients were grouped according to Shamblin's classification. We propose a modification to this classification and make a comparison by analyzing the surgical time and bleeding, as well as the neurological and vascular damage. We resected 50 CBTs in 49 patients, ranging in age from 18 to 73 years. Three groups were formed: group I with 8 (16%) patients, group II with 17 (34%) and group III with 24 (49%). Post-surgical neurological damage was observed in one patient (12.5%) from group I, in six (35%) from group II and in nine patients (37.5%) from group III. Vascular sacrifice had to be performed in 21% of class II tumors and in 8.7% of class III. None of the class I tumors required vascular sacrifice. No statistically significant difference existed for vascular or neurological risk in relation to Shamblin's classification. However, when analyzed according to the classification proposed herein, there was a correlation between Shamblin's classification and vascular sacrifice (P = 0.001). There was a statistically significant correlation between the original Shamblin and the modified Shamblin regarding surgical time and bleeding. Shamblin's classification predicts only vascular morbidity. Neurological morbidity is not reflected in it and only reflects the surgeon's experience with CBT resections. Surgical time and bleeding are directly related to the Shamblin as it reflects the size of tumors in relation to the blood vessels. Shamblin's classification must be modified to be more objective so that the international reports can accurately reflect the morbidity related to it.

  5. EXPOSURE ANALYSIS

    EPA Science Inventory

    This proceedings chapter presents the state of the science regarding the evaluation of exposure as it relates to WQC, SQGs, and wildlife criteria (WC). . . . In summary, the exposure workgroup concluded the following: There should be greater use of mechanistic models to predict b...

  6. [Classification of cardiomyopathy].

    PubMed

    Asakura, Masanori; Kitakaze, Masafumi

    2014-01-01

    Cardiomyopathy is a group of cardiovascular diseases with poor prognosis. Some patients with dilated cardiomyopathy need heart transplantation due to severe heart failure. Some patients with hypertrophic cardiomyopathy die unexpectedly due to malignant ventricular arrhythmias. The various phenotypes of the cardiomyopathies reflect a heterogeneous group of diseases. The classification of cardiomyopathies is important and indispensable in the clinical setting. However, their classification has not been established, because the causes of cardiomyopathies have not been fully elucidated. We usually use the definition and classification offered by the WHO/ISFC task force in 1995. Recently, several new definitions and classifications of the cardiomyopathies have been published by the American Heart Association, the European Society of Cardiology and the Japanese Circulation Society.

  7. Delaware Alternative Classifications

    ERIC Educational Resources Information Center

    Miller, Jay

    1975-01-01

    This article discusses the species designation and taxonomies of Delaware and Algonkian and presents eight classifications of taxa by form, habitat, color, movement, sound, use, relationship, and appearance. Relevant research is also reviewed. (CLK)

  8. Postprocessing classification images

    NASA Technical Reports Server (NTRS)

    Kan, E. P.

    1979-01-01

    Program cleans up remote-sensing maps. It can be used with existing image-processing software. Remapped images closely resemble familiar resource information maps and can replace or supplement classification images not postprocessed by this program.

  9. Adaptive Gaussian Pattern Classification

    DTIC Science & Technology

    1988-08-01

    redundant model of the data to be used in classification. There are two classes of learning, or adaptation schemes. The first, unsupervised learning... [remainder of the record is OCR residue from the report's reference list and cover page, which identify it as Technical Document 1335, August 1988, by C. E. Priebe and D. J. Marchette]

  10. Progressive Classification Using Support Vector Machines

    NASA Technical Reports Server (NTRS)

    Wagstaff, Kiri; Kocurek, Michael

    2009-01-01

    An algorithm for progressive classification of data, analogous to progressive rendering of images, makes it possible to compromise between speed and accuracy. This algorithm uses support vector machines (SVMs) to classify data. An SVM is a machine learning algorithm that builds a mathematical model of the desired classification concept by identifying the critical data points, called support vectors. Coarse approximations to the concept require only a few support vectors, while precise, highly accurate models require far more support vectors. Once the model has been constructed, the SVM can be applied to new observations. The cost of classifying a new observation is proportional to the number of support vectors in the model. When computational resources are limited, an SVM of the appropriate complexity can be produced. However, if the constraints are not known when the model is constructed, or if they can change over time, a method for adaptively responding to the current resource constraints is required. This capability is particularly relevant for spacecraft (or any other real-time systems) that perform onboard data analysis. The new algorithm enables the fast, interactive application of an SVM classifier to a new set of data. The classification process achieved by this algorithm is characterized as progressive because a coarse approximation to the true classification is generated rapidly and thereafter iteratively refined. The algorithm uses two SVMs: (1) a fast, approximate one and (2) a slow, highly accurate one. New data are initially classified by the fast SVM, producing a baseline approximate classification. For each classified data point, the algorithm calculates a confidence index that indicates the likelihood that it was classified correctly in the first pass. Next, the data points are sorted by their confidence indices and progressively reclassified by the slower, more accurate SVM, starting with the items most likely to be incorrectly classified. The user
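
    The two-SVM refinement loop described above can be sketched directly: classify everything with the cheap model, rank points by margin distance, and re-classify the least confident ones with the expensive model until the budget runs out. The dataset and budget below are invented.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.svm import SVC, LinearSVC

      X_train, y_train = make_classification(n_samples=500, random_state=0)
      X_new, _ = make_classification(n_samples=200, random_state=1)

      fast = LinearSVC().fit(X_train, y_train)        # coarse, cheap model
      slow = SVC(kernel="rbf").fit(X_train, y_train)  # accurate, expensive model

      labels = fast.predict(X_new)                    # first-pass labels
      confidence = np.abs(fast.decision_function(X_new))  # distance from margin

      budget = 50   # how many slow evaluations current resources allow
      for i in np.argsort(confidence)[:budget]:       # least confident first
          labels[i] = slow.predict(X_new[i:i + 1])[0]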

  11. Supernova Photometric Lightcurve Classification

    NASA Astrophysics Data System (ADS)

    Zaidi, Tayeb; Narayan, Gautham

    2016-01-01

    This is a preliminary report on photometric supernova classification. We first explore the properties of supernova light curves, and attempt to restructure the unevenly sampled and sparse data from assorted datasets to allow for processing and classification. The data was primarily drawn from the Dark Energy Survey (DES) simulated data, created for the Supernova Photometric Classification Challenge. This poster shows a method for producing a non-parametric representation of the light curve data, and applying a Random Forest classifier algorithm to distinguish between supernovae types. We examine the impact of Principal Component Analysis to reduce the dimensionality of the dataset, for future classification work. The classification code will be used in a stage of the ANTARES pipeline, created for use on the Large Synoptic Survey Telescope alert data and other wide-field surveys. The final figure-of-merit for the DES data in the r band was 60% for binary classification (Type I vs II). Zaidi was supported by the NOAO/KPNO Research Experiences for Undergraduates (REU) Program which is funded by the National Science Foundation Research Experiences for Undergraduates Program (AST-1262829).
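
    A toy version of the classification step (dimensionality reduction followed by a Random Forest) is shown below on synthetic, regularly resampled light curves; real inputs would be the non-parametric light-curve representations described above.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(4)
      flux = rng.normal(size=(1000, 60))              # one resampled light curve per row
      sn_type = (flux[:, :5].mean(axis=1) > 0).astype(int)  # invented Type I vs II labels

      model = make_pipeline(PCA(n_components=10),     # reduce dimensionality first
                            RandomForestClassifier(random_state=0))
      model.fit(flux, sn_type)
      print("Training accuracy:", model.score(flux, sn_type))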

  12. Separation of Benign and Malicious Network Events for Accurate Malware Family Classification

    DTIC Science & Technology

    2015-09-28

    malware at runtime, including memory access patterns, network traces, OS system calls, file system changes, and registry modifications. Static...statistical features extracted from either static binaries [2–4], or from artifacts collected during the execution of the binary [5, 6]. These traces...same features extracted from mixed traffic that is collected on user machines, since these machines generate traffic containing both malware- and

  13. Leveraging IMU data for accurate exercise performance classification and musculoskeletal injury risk screening.

    PubMed

    Whelan, Darragh; O'Reilly, Martin; Huang, Bingquan; Giggins, Oonagh; Kechadi, Tahar; Caulfield, Brian

    2016-08-01

    Inertial measurement units (IMUs) are becoming increasingly prevalent as a method for low cost and portable biomechanical analysis. However, to date they have not been accepted into routine clinical practice, often because of a failure to translate the data collected by the sensors into meaningful and actionable information for end users. This paper outlines the work completed by our group in attempting to achieve this. We discuss the conceptual framework involved in our work, describe the methodological approach taken in analysing sensor signals, and discuss possible application models. Our work indicates that IMU based systems have the potential to bridge the gap between laboratory and clinical movement analysis. Future studies will focus on collecting a diverse range of movement data and using more sophisticated data analysis techniques to refine systems.

  14. Using minimum DNA marker loci for accurate population classification in rice (Oryza sativa L.)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Using few DNA markers to classify genetic background of a germplasm pool will help breeders make a quick decision while saving time and resources. WHICHLOCI is a computer program that selects the best combination of loci for population assignment through empiric analysis of molecular marker data. Th...

  15. DiScRIBinATE: a rapid method for accurate taxonomic classification of metagenomic sequences

    PubMed Central

    2010-01-01

    Background In metagenomic sequence data, the majority of sequences/reads originate from new or partially characterized genomes, whose corresponding sequences are absent from existing reference databases. Since taxonomic assignment of reads is based on their similarity to sequences from known organisms, the presence of reads originating from new organisms poses a major challenge to taxonomic binning methods. The recently published SOrt-ITEMS algorithm uses an elaborate work-flow to assign reads originating from hitherto unknown genomes with significant accuracy and specificity. Nevertheless, a significant proportion of reads still get misclassified. Besides, the use of an alignment-based orthology step (for improving the specificity of assignments) increases the total binning time of SOrt-ITEMS. Results In this paper, we introduce a rapid binning approach called DiScRIBinATE (Distance Score Ratio for Improved Binning And Taxonomic Estimation). DiScRIBinATE replaces the orthology approach of SOrt-ITEMS with a quicker 'alignment-free' approach. We demonstrate that incorporating this approach reduces binning time by half without any loss in the specificity and accuracy of assignments. Besides, a novel reclassification strategy incorporated in DiScRIBinATE reduces the overall misclassification rate to around 3 - 7%. This misclassification rate is 1.5 - 3 times lower than that of SOrt-ITEMS, and 3 - 30 times lower than that of MEGAN. Conclusions A significant reduction in binning time, coupled with a superior assignment accuracy (as compared to existing binning methods), indicates the immense applicability of the proposed algorithm in rapidly mapping the taxonomic diversity of large metagenomic samples with high accuracy and specificity. Availability The program is available on request from the authors. PMID:21106121

  16. Spatial Pattern Classification for More Accurate Forecasting of Variable Energy Resources

    NASA Astrophysics Data System (ADS)

    Novakovskaia, E.; Hayes, C.; Collier, C.

    2014-12-01

    The accuracy of solar and wind forecasts is becoming increasingly essential as grid operators continue to integrate additional renewable generation onto the electric grid. Forecast errors affect rate payers, grid operators, wind and solar plant maintenance crews and energy traders through increases in prices, project downtime or lost revenue. While extensive and beneficial efforts have been undertaken in recent years to improve physical weather models for a broad spectrum of applications, these improvements have generally not been sufficient to meet the accuracy demands of system planners. For renewables, these models are often used in conjunction with additional statistical models utilizing both meteorological observations and the power generation data. Forecast accuracy can be dependent on specific weather regimes for a given location. To account for these dependencies it is important that parameterizations used in statistical models change as the regime changes. An automated tool, based on an artificial neural network model, has been developed to identify different weather regimes as they impact power output forecast accuracy at wind or solar farms. In this study, improvements in forecast accuracy were analyzed for varying time horizons for wind farms and utility-scale PV plants located in different geographical regions.

  17. Accurate estimation of sigma(exp 0) using AIRSAR data

    NASA Technical Reports Server (NTRS)

    Holecz, Francesco; Rignot, Eric

    1995-01-01

    During recent years signature analysis, classification, and modeling of Synthetic Aperture Radar (SAR) data as well as estimation of geophysical parameters from SAR data have received a great deal of interest. An important requirement for the quantitative use of SAR data is the accurate estimation of the backscattering coefficient sigma(exp 0). In terrain with relief variations radar signals are distorted due to the projection of the scene topography into the slant range-Doppler plane. The effect of these variations is to change the physical size of the scattering area, leading to errors in the radar backscatter values and incidence angle. For this reason the local incidence angle, derived from sensor position and Digital Elevation Model (DEM) data must always be considered. Especially in the airborne case, the antenna gain pattern can be an additional source of radiometric error, because the radar look angle is not known precisely as a result of the aircraft motions and the local surface topography. Consequently, radiometric distortions due to the antenna gain pattern must also be corrected for each resolution cell, by taking into account aircraft displacements (position and attitude) and position of the backscatter element, defined by the DEM data. In this paper, a method to derive an accurate estimation of the backscattering coefficient using NASA/JPL AIRSAR data is presented. The results are evaluated in terms of geometric accuracy, radiometric variations of sigma(exp 0), and precision of the estimated forest biomass.

  18. Temporal variation of traffic on highways and the development of accurate temporal allocation factors for air pollution analyses

    PubMed Central

    Batterman, Stuart; Cook, Richard; Justin, Thomas

    2015-01-01

    Traffic activity encompasses the number, mix, speed and acceleration of vehicles on roadways. The temporal pattern and variation of traffic activity reflects vehicle use, congestion and safety issues, and it represents a major influence on emissions and concentrations of traffic-related air pollutants. Accurate characterization of vehicle flows is critical in analyzing and modeling urban and local-scale pollutants, especially in near-road environments and traffic corridors. This study describes methods to improve the characterization of temporal variation of traffic activity. Annual, monthly, daily and hourly temporal allocation factors (TAFs), which describe the expected temporal variation in traffic activity, were developed using four years of hourly traffic activity data recorded at 14 continuous counting stations across the Detroit, Michigan, U.S. region. Five sites also provided vehicle classification. TAF-based models provide a simple means to apportion annual average estimates of traffic volume to hourly estimates. The analysis shows the need to separate TAFs for total and commercial vehicles, and weekdays, Saturdays, Sundays and observed holidays. Using either site-specific or urban-wide TAFs, nearly all of the variation in historical traffic activity at the street scale could be explained; unexplained variation was attributed to adverse weather, traffic accidents and construction. The methods and results presented in this paper can improve air quality dispersion modeling of mobile sources, and can be used to evaluate and model temporal variation in ambient air quality monitoring data and exposure estimates. PMID:25844042
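
    In one common multiplicative form, TAFs apportion an annual average to a specific hour as annual volume x month factor x day-of-week factor x hour-of-day share. The factor values below are invented; the paper derives its factors from four years of count-station data.

      # Hypothetical temporal allocation of annual average daily traffic (AADT).
      aadt = 45000.0        # vehicles/day, annual average
      monthly_taf = 1.05    # e.g., July runs above the annual mean
      day_taf = 0.85        # e.g., Sundays run below the weekday mean
      hourly_taf = 0.068    # share of the day's traffic in the 8-9 a.m. hour

      hourly_volume = aadt * monthly_taf * day_taf * hourly_taf
      print(f"Estimated hourly volume: {hourly_volume:.0f} vehicles")
      # Per the analysis above, separate factor sets would be maintained for
      # total vs. commercial vehicles and for weekdays, Saturdays, Sundays
      # and holidays.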

  19. Temporal variation of traffic on highways and the development of accurate temporal allocation factors for air pollution analyses.

    PubMed

    Batterman, Stuart; Cook, Richard; Justin, Thomas

    2015-04-01

    Traffic activity encompasses the number, mix, speed and acceleration of vehicles on roadways. The temporal pattern and variation of traffic activity reflects vehicle use, congestion and safety issues, and it represents a major influence on emissions and concentrations of traffic-related air pollutants. Accurate characterization of vehicle flows is critical in analyzing and modeling urban and local-scale pollutants, especially in near-road environments and traffic corridors. This study describes methods to improve the characterization of temporal variation of traffic activity. Annual, monthly, daily and hourly temporal allocation factors (TAFs), which describe the expected temporal variation in traffic activity, were developed using four years of hourly traffic activity data recorded at 14 continuous counting stations across the Detroit, Michigan, U.S. region. Five sites also provided vehicle classification. TAF-based models provide a simple means to apportion annual average estimates of traffic volume to hourly estimates. The analysis shows the need to separate TAFs for total and commercial vehicles, and weekdays, Saturdays, Sundays and observed holidays. Using either site-specific or urban-wide TAFs, nearly all of the variation in historical traffic activity at the street scale could be explained; unexplained variation was attributed to adverse weather, traffic accidents and construction. The methods and results presented in this paper can improve air quality dispersion modeling of mobile sources, and can be used to evaluate and model temporal variation in ambient air quality monitoring data and exposure estimates.

  20. Temporal variation of traffic on highways and the development of accurate temporal allocation factors for air pollution analyses

    NASA Astrophysics Data System (ADS)

    Batterman, Stuart; Cook, Richard; Justin, Thomas

    2015-04-01

    Traffic activity encompasses the number, mix, speed and acceleration of vehicles on roadways. The temporal pattern and variation of traffic activity reflects vehicle use, congestion and safety issues, and it represents a major influence on emissions and concentrations of traffic-related air pollutants. Accurate characterization of vehicle flows is critical in analyzing and modeling urban and local-scale pollutants, especially in near-road environments and traffic corridors. This study describes methods to improve the characterization of temporal variation of traffic activity. Annual, monthly, daily and hourly temporal allocation factors (TAFs), which describe the expected temporal variation in traffic activity, were developed using four years of hourly traffic activity data recorded at 14 continuous counting stations across the Detroit, Michigan, U.S. region. Five sites also provided vehicle classification. TAF-based models provide a simple means to apportion annual average estimates of traffic volume to hourly estimates. The analysis shows the need to separate TAFs for total and commercial vehicles, and weekdays, Saturdays, Sundays and observed holidays. Using either site-specific or urban-wide TAFs, nearly all of the variation in historical traffic activity at the street scale could be explained; unexplained variation was attributed to adverse weather, traffic accidents and construction. The methods and results presented in this paper can improve air quality dispersion modeling of mobile sources, and can be used to evaluate and model temporal variation in ambient air quality monitoring data and exposure estimates.

  1. Iris Image Classification Based on Hierarchical Visual Codebook.

    PubMed

    Sun, Zhenan; Zhang, Hui; Tan, Tieniu; Wang, Jianyu

    2014-06-01

    Iris recognition as a reliable method for personal identification has been well-studied with the objective to assign the class label of each iris image to a unique subject. In contrast, iris image classification aims to classify an iris image to an application specific category, e.g., iris liveness detection (classification of genuine and fake iris images), race classification (e.g., classification of iris images of Asian and non-Asian subjects), coarse-to-fine iris identification (classification of all iris images in the central database into multiple categories). This paper proposes a general framework for iris image classification based on texture analysis. A novel texture pattern representation method called Hierarchical Visual Codebook (HVC) is proposed to encode the texture primitives of iris images. The proposed HVC method is an integration of two existing Bag-of-Words models, namely Vocabulary Tree (VT), and Locality-constrained Linear Coding (LLC). The HVC adopts a coarse-to-fine visual coding strategy and takes advantage of both VT and LLC for accurate and sparse representation of iris texture. Extensive experimental results demonstrate that the proposed iris image classification method achieves state-of-the-art performance for iris liveness detection, race classification, and coarse-to-fine iris identification. A comprehensive fake iris image database simulating four types of iris spoof attacks is developed as the benchmark for research of iris liveness detection.
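
    For orientation, a flat bag-of-words baseline (the scheme the HVC refines with its coarse-to-fine hierarchy) can be sketched in a few lines: cluster local texture descriptors into a visual vocabulary, then encode each image as a histogram of visual-word counts. The descriptors below are random stand-ins.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(7)
      # Stand-ins for local iris texture descriptors (50 per image, 16-dim).
      descriptors = [rng.normal(size=(50, 16)) for _ in range(100)]

      vocab = KMeans(n_clusters=32, n_init=10, random_state=0).fit(np.vstack(descriptors))
      histograms = np.array([np.bincount(vocab.predict(d), minlength=32)
                             for d in descriptors])
      print(histograms.shape)   # (100, 32): one texture signature per image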

  2. Mill profiler machines soft materials accurately

    NASA Technical Reports Server (NTRS)

    Rauschl, J. A.

    1966-01-01

    Mill profiler machines bevels, slots, and grooves in soft materials, such as styrofoam phenolic-filled cores, to any desired thickness. A single operator can accurately control cutting depths in contour or straight line work.

  3. Polysilicon gate and polysilicon wire CD/EPE defect detection and classification through process window

    NASA Astrophysics Data System (ADS)

    Andrews, Scott; Volk, William; Su, Bo; Du, Hong; Kumar, Bhavaniprasad; Pulusuri, Ramanamurthy; Vikram, Abhishek; Li, Xiaochun; Chen, Shaoyun

    2006-10-01

    As technology advances towards the 45nm node and beyond, optical lithography faces increased challenges, and resolution enhancement techniques (RET) are imperative for multiple process layers, the poly-silicon gate layer in particular. With RET implementation and optical proximity correction (OPC) techniques, the mask layout deviates further from the design-intended layout. For an OPC-decorated design database, it is important to verify before mask making that the OPC is free of design-related defects and provides a reasonable process window for a given process, to ensure manufacturability. For the poly-silicon gate layer, due to tight CD control requirements, the demand for accurate lithography process simulation is even greater. As hyper-NA immersion exposure systems become available, accurate resist image computation that accounts for mask topography effects and partially polarized illumination on poly-silicon gate layers through the process window is a necessity. In this work, we show simulation results of DesignScan on an advanced poly-silicon gate layer using a logic-based customer database. The active layer database is used to separate poly-silicon gate regions from poly-silicon wire regions. Sensitive CD and edge placement error (EPE) detectors are used to identify design-related defects through the lithography process window. The detector sensitivities can be adjusted based on feature sizes and their geometry (gates of different targets or wires, corners, and line ends). This customization of geometry classification and detector sensitivity is critical to achieve the desired through-process-window inspections. With this capability, process window inspections show how CD/EPE changes as a function of exposure dose and defocus, with fast results and efficient review and disposition. Accurate process window assessment using CD variation is obtained.

  4. A Novel Vehicle Classification Using Embedded Strain Gauge Sensors.

    PubMed

    Zhang, Wenbin; Wang, Qi; Suo, Chunguang

    2008-11-05

    This paper presents a new vehicle classification method and develops a traffic monitoring detector to provide reliable vehicle classification to aid traffic management systems. The basic principle of this approach is based on measuring the dynamic strain caused by vehicles across pavement to obtain the corresponding vehicle parameters - wheelbase and number of axles - to then accurately classify the vehicle. A system prototype with five embedded strain sensors was developed to validate the accuracy and effectiveness of the classification method. According to the special arrangement of the sensors and the different time a vehicle arrived at the sensors one can estimate the vehicle's speed accurately, corresponding to the estimated vehicle wheelbase and number of axles. Because of measurement errors and vehicle characteristics, there is a lot of overlap between vehicle wheelbase patterns. Therefore, directly setting up a fixed threshold for vehicle classification often leads to low-accuracy results. Using the machine learning pattern recognition method to deal with this problem is believed to be one of the most effective tools. In this study, support vector machines (SVMs) were used to integrate the classification features extracted from the strain sensors to automatically classify vehicles into five types, ranging from small vehicles to combination trucks, along the lines of the Federal Highway Administration vehicle classification guide. Test bench and field experiments will be introduced in this paper. Two support vector machines classification algorithms (one-against-all, one-against-one) are used to classify single sensor data and multiple sensor combination data. Comparison of the two classification method results shows that the classification accuracy is very close using single data or multiple data. Our results indicate that using multiclass SVM-based fusion multiple sensor data significantly improves the results of a single sensor data, which is trained on the whole
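
    The two multiclass strategies compared above can be reproduced in outline with scikit-learn, using invented wheelbase, axle and speed features in place of the strain-gauge measurements (note that scikit-learn's SVC is itself trained one-against-one internally):

      import numpy as np
      from sklearn.multiclass import OneVsRestClassifier
      from sklearn.svm import SVC

      rng = np.random.default_rng(5)
      n = 400
      vehicle_class = rng.integers(0, 5, size=n)      # five FHWA-style classes
      X = np.column_stack([
          2.5 + 1.2 * vehicle_class + rng.normal(scale=0.3, size=n),  # wheelbase (m)
          2 + 2 * (vehicle_class >= 3),                               # axle count
          rng.normal(80, 10, size=n),                                 # speed (km/h)
      ])

      ovo = SVC().fit(X, vehicle_class)                       # one-against-one
      ovr = OneVsRestClassifier(SVC()).fit(X, vehicle_class)  # one-against-all
      print("OvO:", ovo.score(X, vehicle_class), "OvR:", ovr.score(X, vehicle_class))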

  5. A Novel Vehicle Classification Using Embedded Strain Gauge Sensors

    PubMed Central

    Zhang, Wenbin; Wang, Qi; Suo, Chunguang

    2008-01-01

    This paper presents a new vehicle classification method and develops a traffic monitoring detector to provide reliable vehicle classification to aid traffic management systems. The basic principle of this approach is based on measuring the dynamic strain caused by vehicles across pavement to obtain the corresponding vehicle parameters – wheelbase and number of axles – to then accurately classify the vehicle. A system prototype with five embedded strain sensors was developed to validate the accuracy and effectiveness of the classification method. According to the special arrangement of the sensors and the different time a vehicle arrived at the sensors one can estimate the vehicle's speed accurately, corresponding to the estimated vehicle wheelbase and number of axles. Because of measurement errors and vehicle characteristics, there is a lot of overlap between vehicle wheelbase patterns. Therefore, directly setting up a fixed threshold for vehicle classification often leads to low-accuracy results. Using the machine learning pattern recognition method to deal with this problem is believed to be one of the most effective tools. In this study, support vector machines (SVMs) were used to integrate the classification features extracted from the strain sensors to automatically classify vehicles into five types, ranging from small vehicles to combination trucks, along the lines of the Federal Highway Administration vehicle classification guide. Test bench and field experiments will be introduced in this paper. Two support vector machines classification algorithms (one-against-all, one-against-one) are used to classify single sensor data and multiple sensor combination data. Comparison of the two classification method results shows that the classification accuracy is very close using single data or multiple data. Our results indicate that using multiclass SVM-based fusion multiple sensor data significantly improves the results of a single sensor data, which is trained on the

  6. What should an ideal spinal injury classification system consist of? A methodological review and conceptual proposal for future classifications

    PubMed Central

    Audigé, Laurent; Hanson, Beate; Chapman, Jens R.; Hosman, Allard J. F.

    2010-01-01

    Since Böhler published the first categorization of spinal injuries based on plain radiographic examinations in 1929, numerous classifications have been proposed. Despite all these efforts, however, only a few have been tested for reliability and validity. This methodological, conceptual review concludes that a spinal injury classification system should be clinically relevant, reliable and accurate. The clinical relevance of a classification is directly related to its content validity. The ideal content of a spinal injury classification should only include injury characteristics of the vertebral column, is primarily based on the increasingly routinely performed CT imaging, and is clearly distinctive from severity scales and treatment algorithms. Clearly defined observation and conversion criteria are crucial determinants of classification systems’ reliability and accuracy. Ideally, two principle spinal injury characteristics should be easy to discern on diagnostic images: the specific location and morphology of the injured spinal structure. Given the current evidence and diagnostic imaging technology, descriptions of the mechanisms of injury and ligamentous injury should not be included in a spinal injury classification. The presence of concomitant neurologic deficits can be integrated in a spinal injury severity scale, which in turn can be considered in a spinal injury treatment algorithm. Ideally, a validation pathway of a spinal injury classification system should be completed prior to its clinical and scientific implementation. This review provides a methodological concept which might be considered prior to the synthesis of new or modified spinal injury classifications. PMID:20464432

  7. Accurate identification of periodic oscillations buried in white or colored noise using fast orthogonal search.

    PubMed

    Chon, K H

    2001-06-01

    We use a previously introduced fast orthogonal search algorithm to detect sinusoidal frequency components buried in either white or colored noise. We show that the method outperforms the correlogram, modified covariance autoregressive (MODCOVAR) and multiple-signal classification (MUSIC) methods. The fast orthogonal search method achieves accurate detection of sinusoids even with signal-to-noise ratios as low as -10 dB, and is superior at detecting sinusoids buried in 1/f noise. Since the utilized method accurately detects sinusoids even under colored noise, it can be used to extract a 1/f noise process observed in physiological signals such as heart rate and renal blood pressure and flow data.
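
    The core idea (greedily adding the sinusoid that best reduces the residual) can be demonstrated with a brute-force stand-in; true fast orthogonal search gains its speed by orthogonalizing the candidate terms, which this sketch omits.

      import numpy as np

      def greedy_sinusoid_search(y, fs, n_components, n_candidates=500):
          """At each step, fit every candidate frequency's sine/cosine pair to
          the residual and keep the one with the largest error reduction."""
          t = np.arange(len(y)) / fs
          freqs = np.linspace(0.5, fs / 2 - 0.5, n_candidates)
          residual, found = y.astype(float).copy(), []
          for _ in range(n_components):
              best = None
              for f in freqs:
                  A = np.column_stack([np.sin(2 * np.pi * f * t),
                                       np.cos(2 * np.pi * f * t)])
                  coef, *_ = np.linalg.lstsq(A, residual, rcond=None)
                  err = np.sum((residual - A @ coef) ** 2)
                  if best is None or err < best[0]:
                      best = (err, f, A @ coef)
              found.append(best[1])
              residual = residual - best[2]
          return found

      fs, t = 100.0, np.arange(1000) / 100.0
      y = np.sin(2 * np.pi * 7 * t) + 0.5 * np.sin(2 * np.pi * 19 * t)
      y += np.random.default_rng(0).normal(size=t.size)   # additive white noise
      print(greedy_sinusoid_search(y, fs, 2))             # expect ~7 and ~19 Hz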

  8. Concepts of Classification and Taxonomy Phylogenetic Classification

    NASA Astrophysics Data System (ADS)

    Fraix-Burnet, D.

    2016-05-01

    Phylogenetic approaches to classification have been heavily developed in biology by bioinformaticians. But these techniques have applications in other fields, in particular in linguistics. Their main characteristic is to search for relationships between the objects or species under study, instead of grouping them by similarity. They are thus rather well suited to any kind of evolutionary object. For nearly fifteen years, astrocladistics has explored the use of Maximum Parsimony (or cladistics) for astronomical objects like galaxies or globular clusters. In this lesson we will learn how it works.

  9. ELODIE: A spectrograph for accurate radial velocity measurements.

    NASA Astrophysics Data System (ADS)

    Baranne, A.; Queloz, D.; Mayor, M.; Adrianzyk, G.; Knispel, G.; Kohler, D.; Lacroix, D.; Meunier, J.-P.; Rimbaud, G.; Vin, A.

    1996-10-01

    The fibre-fed echelle spectrograph of Observatoire de Haute-Provence, ELODIE, is presented. This instrument has been in operation since the end of 1993 on the 1.93 m telescope. ELODIE is designed as an updated version of the cross-correlation spectrometer CORAVEL, to perform very accurate radial velocity measurements such as needed in the search, by Doppler shift, for brown-dwarfs or giant planets orbiting around nearby stars. In one single exposure a spectrum at a resolution of 42000 (λ/{DELTA}λ) ranging from 3906A to 6811A is recorded on a 1024x1024 CCD. This performance is achieved by using a tanθ=4 echelle grating and a combination of a prism and a grism as cross-disperser. An automatic on-line data treatment reduces all the ELODIE echelle spectra and computes cross-correlation functions. The instrument design and the data reduction algorithms are described in this paper. The efficiency and accuracy of the instrument and its long term instrumental stability allow us to measure radial velocities with an accuracy better than 15m/s for stars up to 9th magnitude in less than 30 minutes exposure time. Observations of 16th magnitude stars are also possible, yielding velocities with an accuracy of about 1km/s. For classic spectroscopic studies (S/N>100) 9th magnitude stars can be observed in one hour exposure time.
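
    The cross-correlation principle behind such velocity measurements fits in a few lines: shift a template spectrum across trial velocities, correlate it with the observation, and read the velocity off the correlation peak. The single-line spectra below are synthetic toys; a real reduction works order by order with a line mask.

      import numpy as np

      c = 299792.458                                   # speed of light, km/s
      wave = np.linspace(5000.0, 5100.0, 4000)         # wavelength grid (Angstrom)
      template = 1.0 - 0.8 * np.exp(-0.5 * ((wave - 5050.0) / 0.1) ** 2)

      v_true = 12.3                                    # km/s, to be recovered
      observed = np.interp(wave, wave * (1 + v_true / c), template)

      velocities = np.linspace(-50, 50, 2001)
      ccf = [np.sum(observed * np.interp(wave, wave * (1 + v / c), template))
             for v in velocities]
      print("Estimated RV:", velocities[int(np.argmax(ccf))], "km/s")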

  10. Segmentation Assisted Food Classification for Dietary Assessment.

    PubMed

    Zhu, Fengqing; Bosch, Marc; Schap, Tusarebecca; Khanna, Nitin; Ebert, David S; Boushey, Carol J; Delp, Edward J

    2011-01-24

    Accurate methods and tools to assess food and nutrient intake are essential for the association between diet and health. Preliminary studies have indicated that the use of a mobile device with a built-in camera to obtain images of the food consumed may provide a less burdensome and more accurate method for dietary assessment. We are developing methods to identify food items using a single image acquired from the mobile device. Our goal is to automatically determine the regions in an image where a particular food is located (segmentation) and correctly identify the food type based on its features (classification or food labeling). Images of foods are segmented using Normalized Cuts based on intensity and color. Color and texture features are extracted from each segmented food region. Classification decisions for each segmented region are made using support vector machine methods. The segmentation of each food region is refined based on feedback from the output of classifier to provide more accurate estimation of the quantity of food consumed.
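
    An end-to-end miniature of the segment-then-classify idea is sketched below; SLIC superpixels stand in for the Normalized Cuts segmentation, the image and region labels are synthetic, and the features are just per-region colour statistics.

      import numpy as np
      from skimage.segmentation import slic
      from sklearn.svm import SVC

      rng = np.random.default_rng(6)
      image = rng.random((120, 120, 3))     # stand-in for a meal photograph

      segments = slic(image, n_segments=20, start_label=0)  # candidate food regions

      # One feature vector per region: mean colour and colour spread.
      feats = np.array([np.concatenate([image[segments == s].mean(axis=0),
                                        image[segments == s].std(axis=0)])
                        for s in np.unique(segments)])

      labels = rng.integers(0, 3, size=len(feats))  # invented food-type labels
      svm = SVC().fit(feats, labels)                # classify each segmented region
      print(svm.predict(feats)[:5])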

  11. Segmentation assisted food classification for dietary assessment

    NASA Astrophysics Data System (ADS)

    Zhu, Fengqing; Bosch, Marc; Schap, TusaRebecca; Khanna, Nitin; Ebert, David S.; Boushey, Carol J.; Delp, Edward J.

    2011-03-01

    Accurate methods and tools to assess food and nutrient intake are essential for the association between diet and health. Preliminary studies have indicated that the use of a mobile device with a built-in camera to obtain images of the food consumed may provide a less burdensome and more accurate method for dietary assessment. We are developing methods to identify food items using a single image acquired from the mobile device. Our goal is to automatically determine the regions in an image where a particular food is located (segmentation) and correctly identify the food type based on its features (classification or food labeling). Images of foods are segmented using Normalized Cuts based on intensity and color. Color and texture features are extracted from each segmented food region. Classification decisions for each segmented region are made using support vector machine methods. The segmentation of each food region is refined based on feedback from the output of classifier to provide more accurate estimation of the quantity of food consumed.

  12. Implementing the North American Industry Classification System at BLS.

    ERIC Educational Resources Information Center

    Walker, James A.; Murphy, John B.

    2001-01-01

    The United States, Canada, and Mexico developed the North American Industry Classification System, which captures new and emerging industries, uses a unified concept to define industries, and is a consistent and comparable tool for measuring the nations' economies. Despite initial conversion difficulties, the new system will be a more accurate way…

  13. Preprocessing remotely-sensed data for efficient analysis and classification

    SciTech Connect

    Kelly, P.M.; White, J.M.

    1993-02-01

    Interpreting remotely-sensed data typically requires expensive, specialized computing machinery capable of storing and manipulating large amounts of data quickly. In this paper, we present a method for accurately analyzing and categorizing remotely-sensed data on much smaller, less expensive platforms. Data size is reduced in such a way as to permit an efficient, interactive method of data classification.

  14. Role of electronmicroscopy in the classification of lupus nephritis.

    PubMed

    Pirani, C L; Olesnicky, L

    1982-07-01

    The role of transmission electron microscopy in the clarification of the pleomorphic lesions of lupus nephritis is reviewed. Emphasis is placed on the WHO classification of glomerular lesions and on the importance of electron microscopy, particularly for the precise identification of the milder forms of glomerular involvement. This is essential for the proper therapeutic management of patients with lupus nephritis and to establish a more accurate prognosis.

  15. Convolutional Neural Networks for patient-specific ECG classification.

    PubMed

    Kiranyaz, Serkan; Ince, Turker; Hamila, Ridha; Gabbouj, Moncef

    2015-01-01

    We propose a fast and accurate patient-specific electrocardiogram (ECG) classification and monitoring system using an adaptive implementation of 1D Convolutional Neural Networks (CNNs) that can fuse feature extraction and classification into a unified learner. In this way, a dedicated CNN will be trained for each patient by using relatively small common and patient-specific training data and thus it can also be used to classify long ECG records such as Holter registers in a fast and accurate manner. Alternatively, such a solution can conveniently be used for real-time ECG monitoring and early alert system on a light-weight wearable device. The experimental results demonstrate that the proposed system achieves a superior classification performance for the detection of ventricular ectopic beats (VEB) and supraventricular ectopic beats (SVEB).
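
    A minimal 1D CNN along these lines is sketched below in PyTorch; the paper's adaptive, patient-specific training procedure is not reproduced, and the beat length and class count are assumptions.

      import torch
      import torch.nn as nn

      # Fused feature extraction + classification for one 128-sample beat.
      model = nn.Sequential(
          nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
          nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
          nn.Flatten(),
          nn.Linear(32 * 32, 3),   # e.g., normal vs. VEB vs. SVEB
      )

      beats = torch.randn(8, 1, 128)   # a batch of 8 synthetic beats
      print(model(beats).shape)        # torch.Size([8, 3])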

  16. X-ray exposure sensor and controller

    NASA Technical Reports Server (NTRS)

    Berdahl, C. Martin (Inventor)

    1977-01-01

    An exposure controller for x-ray equipment is provided, comprising a portable and accurate sensor that can be placed adjacent to and directly beneath the area of interest on an x-ray plate. The sensor measures the amount of exposure received by that area and turns off the x-ray equipment when the exposure for the particular area of interest reaches the value that yields an optimally exposed x-ray plate.

  17. Commission 45: Spectral Classification

    NASA Astrophysics Data System (ADS)

    Giridhar, Sunetra; Gray, Richard O.; Corbally, Christopher J.; Bailer-Jones, Coryn A. L.; Eyer, Laurent; Irwin, Michael J.; Kirkpatrick, J. Davy; Majewski, Steven; Minniti, Dante; Nordström, Birgitta

    This report gives an update of developments (since the last General Assembly at Prague) in the areas that are of relevance to the commission. In addition to numerous papers, a new monograph entitled Stellar Spectral Classification with Richard Gray and Chris Corbally as leading authors will be published by Princeton University Press as part of their Princeton Series in Astrophysics in April 2009. This book is an up-to-date and encyclopedic review of stellar spectral classification across the H-R diagram, including the traditional MK system in the blue-violet, recent extensions into the ultraviolet and infrared, the newly defined L-type and T-type spectral classes, as well as spectral classification of carbon stars, S-type stars, white dwarfs, novae, supernovae and Wolf-Rayet stars.

  18. Information gathering for CLP classification.

    PubMed

    Marcello, Ida; Giordano, Felice; Costamagna, Francesca Marina

    2011-01-01

    Regulation 1272/2008 includes provisions for two types of classification: harmonised classification and self-classification. The harmonised classification of substances is decided at Community level, and a list of harmonised classifications is included in Annex VI of the classification, labelling and packaging (CLP) Regulation. If a chemical substance is not included in the harmonised classification list, it must be self-classified on the basis of available information, according to the requirements of Annex I of the CLP Regulation. CLP stipulates that harmonised classification be performed for substances that are carcinogenic, mutagenic or toxic to reproduction (CMR substances), for respiratory sensitisers category 1, and for other hazard classes on a case-by-case basis. The first step of classification is the gathering of available and relevant information. This paper presents the procedure for gathering information and obtaining data; data quality is also discussed.

  19. Classifications for carcinogenesis of antitumoral drugs.

    PubMed

    Binetti, R; Costamagna, F M; Marcello, I

    2003-12-01

    The aim of this review is to support medical staff engaged in tumor therapy with the carcinogenicity, mutagenicity and developmental-toxicity classifications assigned to a large number of chemotherapeutic drugs by national and international agencies; it also gives the rationale for these classifications and the few cases in which the classification varies between, for example, the European Union and the United States of America. A large list of such drugs, producers, commercial names, CAS numbers and chemical names is reported. This list is subject to change given the rapid development of this field: many drugs are withdrawn and many more are introduced into clinical practice. The list is updated to summer 2003 and retains many drugs which have more than one use or have limited use. Protecting the medical personnel who use antitumor chemotherapeutics may require retrospective epidemiological investigations, and obsolete drugs are of importance for some of the past exposures.

  20. A New Classification Based on the Kaban's Modification for Surgical Management of Craniofacial Microsomia

    PubMed Central

    Madrid, Jose Rolando Prada; Montealegre, Giovanni; Gomez, Viviana

    2010-01-01

    In medicine, classifications are designed to describe accurately and reliably all anatomic and structural components, establish a prognosis, and guide a given treatment. Classifications should be universally useful, facilitating communication between health professionals and the formulation of management protocols. In many situations, and particularly with craniofacial microsomia, there have been many different classifications that do not achieve this goal. In fact, when there are so many classifications, one can conclude that no single one accomplishes all these ends and defines a treatment protocol. It is our intent to present a new classification, based on Pruzansky's classification as later modified by Kaban, to determine treatment protocols based on the degree of osseous deficiency present in the body, ramus, and temporomandibular joint. Different mandibular defects are presented in two patients with craniofacial microsomia types III and IV according to our classification, with the corresponding management proposed for each type and adequate functional results. PMID:22110812

  1. Ensemble polarimetric SAR image classification based on contextual sparse representation

    NASA Astrophysics Data System (ADS)

    Zhang, Lamei; Wang, Xiao; Zou, Bin; Qiao, Zhijun

    2016-05-01

    Polarimetric SAR image interpretation has become one of the most active research topics, and the construction of reasonable and effective image classification techniques is of key importance. Sparse representation describes data using the most succinct combination of atoms from an over-complete dictionary, and its advantages have also been confirmed in the field of PolSAR classification. Like any single classifier, however, it is imperfect in several respects. Ensemble learning is therefore introduced to address this issue: a number of different learners are trained, and their individual outputs are combined to obtain more accurate and stable results. This paper presents a polarimetric SAR image classification method based on ensemble learning over sparse representations to achieve better classification.
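    The sketch below illustrates the two ingredients named above: sparse-representation classification (assign the class whose dictionary atoms best reconstruct the sample) and a simple ensemble by majority vote over random feature subsets. Dictionary size, feature dimensions, and the subspace-sampling scheme are assumptions for illustration, not the paper's design.

      import numpy as np
      from sklearn.linear_model import OrthogonalMatchingPursuit

      def src_predict(D, labels, x, n_nonzero=5):
          # Classify x by sparse coding over dictionary D (columns = atoms),
          # then pick the class whose atoms give the smallest residual.
          omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero, fit_intercept=False).fit(D, x)
          code = omp.coef_
          residuals = []
          for c in np.unique(labels):
              part = np.where(labels == c, code, 0.0)   # keep only class-c coefficients
              residuals.append(np.linalg.norm(x - D @ part))
          return np.unique(labels)[int(np.argmin(residuals))]

      rng = np.random.default_rng(1)
      D = rng.normal(size=(30, 60))                     # 60 training atoms, 30 features
      D /= np.linalg.norm(D, axis=0)                    # SRC assumes unit-norm atoms
      labels = np.repeat([0, 1, 2], 20)
      x = D[:, 5] + 0.05 * rng.normal(size=30)          # noisy copy of a class-0 atom

      # Ensemble: run SRC on several random feature subsets and take a vote.
      votes = []
      for _ in range(5):
          idx = rng.choice(30, size=20, replace=False)
          votes.append(src_predict(D[idx], labels, x[idx]))
      print(max(set(votes), key=votes.count))           # majority vote, expect 0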

  2. Land Cover Classification Using ALOS Imagery For Penang, Malaysia

    NASA Astrophysics Data System (ADS)

    Sim, C. K.; Abdullah, K.; MatJafri, M. Z.; Lim, H. S.

    2014-02-01

    This paper presents the potential of integrating optical and radar remote sensing data to improve automatic land cover mapping. The analysis involved standard image processing and consists of spectral signature extraction and the application of a statistical decision rule to identify land cover categories. A maximum likelihood classifier is utilized to determine the different land cover categories. Ground reference data from sites throughout the study area were collected for training and validation. The land cover information was extracted from the digital data using the PCI Geomatica 10.3.2 software package. The variations in classification accuracy due to a number of radar image processing techniques are studied. The relationship between the processing window and the land classification is also investigated. The classification accuracies from the optical and radar feature combinations are studied. Our research finds that fusion of radar and optical data significantly improved classification accuracy. This study indicates that land cover/use can be mapped accurately by using this approach.
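    A maximum likelihood classifier of the kind used here can be written compactly: each land cover class is modeled by the mean and covariance of its training pixels, and each pixel is assigned to the class with the highest Gaussian log-likelihood. The band count and class names below are placeholders, not the study's data.

      import numpy as np
      from scipy.stats import multivariate_normal

      def fit_mlc(X, y):
          # Per-class mean and covariance from training pixels X (n, bands).
          return {c: (X[y == c].mean(axis=0), np.cov(X[y == c], rowvar=False))
                  for c in np.unique(y)}

      def predict_mlc(models, X):
          # Assign each pixel to the class with maximum Gaussian log-likelihood.
          ll = np.column_stack([
              multivariate_normal(mean=m, cov=S, allow_singular=True).logpdf(X)
              for m, S in models.values()])
          classes = np.array(list(models.keys()))
          return classes[np.argmax(ll, axis=1)]

      rng = np.random.default_rng(2)
      # Hypothetical 4-band pixels (e.g., optical bands plus radar backscatter).
      X = np.vstack([rng.normal(loc=c, size=(100, 4)) for c in range(3)])
      y = np.repeat(["water", "forest", "urban"], 100)
      models = fit_mlc(X, y)
      print(predict_mlc(models, X[:5]))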

  3. Classification of cardiac patient states using artificial neural networks

    PubMed Central

    Kannathal, N; Acharya, U Rajendra; Lim, Choo Min; Sadasivan, PK; Krishnan, SM

    2003-01-01

    Electrocardiogram (ECG) is a nonstationary signal; therefore, the disease indicators may occur at random on the time scale. This may require that the patient be kept under observation for long intervals in the intensive care unit for accurate diagnosis. The present study examined the classification of the states of patients with certain diseases in the intensive care unit using their ECG and an artificial neural network (ANN) classification system. The states were classified as normal, abnormal or life-threatening. Seven significant features extracted from the ECG were fed as input parameters to the ANN for classification. Three neural network techniques, namely back propagation, self-organizing maps and radial basis functions, were used for classification of the patient states. The ANN classifier in this case was observed to be correct in approximately 99% of the test cases. This result was further improved by taking 13 ECG features as input for the ANN classifier. PMID:19649222
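    A minimal stand-in for the back-propagation variant is sketched below: seven ECG-derived features in, three patient states out. The synthetic feature values are placeholders for the study's actual ECG features.

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(3)
      X = rng.normal(size=(300, 7))                    # 7 ECG features per record
      y = rng.integers(0, 3, size=300)                 # 0=normal, 1=abnormal, 2=life-threatening

      # A small feedforward network trained by back-propagation.
      net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
      net.fit(X, y)
      print(net.predict(X[:5]), net.score(X, y))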

  4. Identification of an Efficient Gene Expression Panel for Glioblastoma Classification

    PubMed Central

    Zelaya, Ivette; Laks, Dan R.; Zhao, Yining; Kawaguchi, Riki; Gao, Fuying; Kornblum, Harley I.; Coppola, Giovanni

    2016-01-01

    We present here a novel genetic algorithm-based random forest (GARF) modeling technique that enables a reduction in the complexity of large gene disease signatures to highly accurate, greatly simplified gene panels. When applied to 803 glioblastoma multiforme samples, this method allowed the 840-gene Verhaak et al. gene panel (the standard in the field) to be reduced to a 48-gene classifier, while retaining 90.91% classification accuracy, and outperforming the best available alternative methods. Additionally, using this approach we produced a 32-gene panel which allows for better consistency between RNA-seq and microarray-based classifications, improving cross-platform classification retention from 69.67% to 86.07%. A webpage producing these classifications is available at http://simplegbm.semel.ucla.edu. PMID:27855170
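    A toy version of the genetic-algorithm-plus-random-forest idea is sketched below: a population of candidate gene panels is evolved, with cross-validated random forest accuracy as the fitness. Population size, mutation scheme, and the synthetic expression matrix are all assumptions; the actual GARF operators are not reproduced here.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(4)
      n_samples, n_genes, panel_size = 120, 200, 8
      X = rng.normal(size=(n_samples, n_genes))
      y = (X[:, :3].sum(axis=1) > 0).astype(int)       # labels driven by 3 genes

      def fitness(panel):
          # Cross-validated accuracy of a random forest on the panel's genes.
          rf = RandomForestClassifier(n_estimators=50, random_state=0)
          return cross_val_score(rf, X[:, panel], y, cv=3).mean()

      # Initialize a population of random gene panels.
      pop = [rng.choice(n_genes, size=panel_size, replace=False) for _ in range(12)]
      for generation in range(10):
          survivors = sorted(pop, key=fitness, reverse=True)[:6]
          children = []
          for parent in survivors:
              child = parent.copy()
              new_gene = int(rng.integers(n_genes))
              if new_gene not in child:                # point mutation, genes stay distinct
                  child[rng.integers(panel_size)] = new_gene
              children.append(child)
          pop = survivors + children

      best = max(pop, key=fitness)
      print(sorted(best.tolist()), round(fitness(best), 3))  # should recover genes 0-2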

  5. [New WHO-classification of lung and pleural tumors].

    PubMed

    Wagenaar, S S

    1999-05-08

    A new classification of the World Health Organization (WHO) of lung and pleural tumours will be published presently. Compared with the previous edition of 1981 the changed parts more accurately reflect the available therapeutic choices and the prognostic characteristics of the different tumour types. The classification is based on conventional light-microscopical typing. Additional techniques (from histochemistry, immune histochemistry, electron microscopy and molecular biology) have not yet decisive influence on tumour typing. The dichotomy between small-cell and large-cell carcinomas is too simplistic, as the group of large-cell carcinomas is heterogeneous, and further differentiation leads to identification of tumour types with distinct therapeutic options and prognostic characteristics. There are new criteria for the classification of neuroendocrine tumours, such as the mitotic index. It is recommended to use the newly revised classification for diagnostic purposes, epidemiology and biologic studies.

  6. Computerized Classification Testing under Practical Constraints with a Polytomous Model.

    ERIC Educational Resources Information Center

    Lau, C. Allen; Wang, Tianyou

    A study was conducted to extend the sequential probability ratio testing (SPRT) procedure with the polytomous model under some practical constraints in computerized classification testing (CCT), such as methods to control item exposure rate, and to study the effects of other variables, including item information algorithms, test difficulties, item…
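    For reference, the core SPRT decision rule is easy to state in code. The sketch below uses the simpler dichotomous item model (fixed correct-response probabilities for masters and non-masters); the study above works with a polytomous model and adds item exposure control on top.

      import math
      import random

      def sprt(responses, p_master=0.8, p_nonmaster=0.5, alpha=0.05, beta=0.05):
          # Wald's thresholds: classify when the log-likelihood ratio leaves them.
          upper = math.log((1 - beta) / alpha)     # classify as master above this
          lower = math.log(beta / (1 - alpha))     # classify as non-master below this
          llr = 0.0
          for i, correct in enumerate(responses, start=1):
              p1 = p_master if correct else 1 - p_master
              p0 = p_nonmaster if correct else 1 - p_nonmaster
              llr += math.log(p1 / p0)
              if llr >= upper:
                  return "master", i               # i items were enough to decide
              if llr <= lower:
                  return "non-master", i
          return "undecided", len(responses)

      random.seed(0)
      simulated = [random.random() < 0.8 for _ in range(50)]   # a true master
      print(sprt(simulated))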

  7. Land use/cover classification in the Brazilian Amazon using satellite images.

    PubMed

    Lu, Dengsheng; Batistella, Mateus; Li, Guiying; Moran, Emilio; Hetrick, Scott; Freitas, Corina da Costa; Dutra, Luciano Vieira; Sant'anna, Sidnei João Siqueira

    2012-09-01

    Land use/cover classification is one of the most important applications in remote sensing. However, mapping accurate land use/cover spatial distribution is a challenge, particularly in moist tropical regions, due to the complex biophysical environment and limitations of remote sensing data per se. This paper reviews experiments related to land use/cover classification in the Brazilian Amazon for a decade. Through comprehensive analysis of the classification results, it is concluded that spatial information inherent in remote sensing data plays an essential role in improving land use/cover classification. Incorporation of suitable textural images into multispectral bands and use of segmentation-based method are valuable ways to improve land use/cover classification, especially for high spatial resolution images. Data fusion of multi-resolution images within optical sensor data is vital for visual interpretation, but may not improve classification performance. In contrast, integration of optical and radar data did improve classification performance when the proper data fusion method was used. Of the classification algorithms available, the maximum likelihood classifier is still an important method for providing reasonably good accuracy, but nonparametric algorithms, such as classification tree analysis, have the potential to provide better results. However, they often require more time to achieve parametric optimization. Proper use of hierarchical-based methods is fundamental for developing accurate land use/cover classification, mainly from historical remotely sensed data.

  8. Land use/cover classification in the Brazilian Amazon using satellite images

    PubMed Central

    Lu, Dengsheng; Batistella, Mateus; Li, Guiying; Moran, Emilio; Hetrick, Scott; Freitas, Corina da Costa; Dutra, Luciano Vieira; Sant’Anna, Sidnei João Siqueira

    2013-01-01

    Land use/cover classification is one of the most important applications in remote sensing. However, mapping accurate land use/cover spatial distribution is a challenge, particularly in moist tropical regions, due to the complex biophysical environment and limitations of remote sensing data per se. This paper reviews experiments related to land use/cover classification in the Brazilian Amazon for a decade. Through comprehensive analysis of the classification results, it is concluded that spatial information inherent in remote sensing data plays an essential role in improving land use/cover classification. Incorporation of suitable textural images into multispectral bands and use of segmentation-based method are valuable ways to improve land use/cover classification, especially for high spatial resolution images. Data fusion of multi-resolution images within optical sensor data is vital for visual interpretation, but may not improve classification performance. In contrast, integration of optical and radar data did improve classification performance when the proper data fusion method was used. Of the classification algorithms available, the maximum likelihood classifier is still an important method for providing reasonably good accuracy, but nonparametric algorithms, such as classification tree analysis, have the potential to provide better results. However, they often require more time to achieve parametric optimization. Proper use of hierarchical-based methods is fundamental for developing accurate land use/cover classification, mainly from historical remotely sensed data. PMID:24353353

  9. Hybrid Classification of Pulmonary Nodules

    NASA Astrophysics Data System (ADS)

    Lee, S. L. A.; Kouzani, A. Z.; Hu, E. J.

    Automated classification of lung nodules is challenging because of the variation in the shape and size of lung nodules, as well as the associated differences in their images. Ensemble-based learners have demonstrated the potential for good performance. Random forests are employed for pulmonary nodule classification: each tree in the forest produces a classification decision, and an integrated output is calculated. A classification-aided-by-clustering approach is proposed to improve lung nodule classification performance. Three experiments are performed using the LIDC lung image database of 32 cases. The classification performance and execution times are presented and discussed.
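    One simple reading of a classification-aided-by-clustering scheme is sketched below: cluster the nodule feature space first, then give the random forest the cluster assignment as an extra feature. The synthetic features stand in for the LIDC-derived ones, and this is an interpretation of the idea, not the authors' exact pipeline.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(5)
      X = rng.normal(size=(200, 10))               # nodule shape/intensity features
      y = (X[:, 0] + X[:, 1] > 0).astype(int)      # 1 = nodule, 0 = non-nodule

      # Unsupervised clustering provides an auxiliary feature for the forest.
      kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
      X_aug = np.column_stack([X, kmeans.labels_])

      rf = RandomForestClassifier(n_estimators=100, random_state=0)
      print("plain:", cross_val_score(rf, X, y, cv=5).mean())
      print("aided:", cross_val_score(rf, X_aug, y, cv=5).mean())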

  10. Flotation classification of ultrafine particles -- A novel classification approach

    SciTech Connect

    Qiu Guanzhou; Luo Lin; Hu Yuehua; Xu Jin; Wang Dianzuo

    1995-12-31

    This paper introduces a novel classification approach, named the flotation classification approach, which works by controlling interactions between particles. It differs considerably from conventional classification processes that rely on mechanical forces. In the present tests, micro-bubble flotation technology is grafted onto hydro-classification. Selective aggregation and dispersion of ultrafine particles are achieved by governing the interactions in the classification process. A series of laboratory classification tests on −44 μm kaolin has been conducted in a classification column. As a result, about 92% recovery of the minus 2 μm size fraction of kaolin in the final product is obtained. In addition, two criteria for the classification are established. Finally, a principle of classifying and controlling the interactions between particles is discussed in terms of surface thermodynamics and hydrodynamics.

  11. Rockfall exposures in Montserrat mountain

    NASA Astrophysics Data System (ADS)

    Fontquerni Gorchs, Sara; Vilaplana Fernández, Joan Manuel; Guinau Sellés, Marta; Jesús Royán Cordero, Manuel

    2015-04-01

    This study presents a methodology for analyzing the exposure level at 1:25,000 scale and the results obtained by applying it to a large part of the Montaña de Montserrat Natural Park, for vehicles both with and without considering their occupants. The development of this proposal is part of an ongoing study that examines in more depth the analysis of rockfall risk exposure at different scales and in different natural and social contexts. The methodology evaluates the rockfall exposure level as the product of the frequency of occurrence of the event and an exposure function of the vulnerable element, at 1:25,000 scale, although the working scale of the study was 1:10,000. The proposed methodology is based on six phases: 1- identification, classification and inventory of every element potentially at risk; 2- zoning of the frequency of occurrence of the event in the studied area; 3- design of the exposure function for each studied element; 4- calculation of the exposure index, defined as the product of the frequency of occurrence and the exposure function of the vulnerable element, through GIS analysis with ArcGIS software (ESRI); 5- derivation of the exposure level by grouping the numerical values of the exposure index into categories; 6- production of the exposure zoning map (phases 2-5 are sketched numerically below). The types of vulnerable elements considered in the study are: vehicles in motion, people in vehicles in motion, people on paths, permanent elements, and people in buildings. Each type groups all elements with the same characteristics, and an exposure function has been designed for each of them. For the exposure calculation, two groups of elements have been considered: first the group of elements with no people involved, and then the same group with people involved. This is a first comprehensive and synthetic work on rockfall exposure on the Montserrat
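    The numeric core of phases 2-5 is just a cell-wise product followed by a grouping into levels, as the toy rasters below show; the frequency values, exposure-function values, and level breaks are invented.

      import numpy as np

      frequency = np.array([[0.1, 0.4], [0.7, 0.9]])       # phase 2: frequency zoning
      exposure_fn = np.array([[0.2, 0.8], [0.5, 1.0]])     # phase 3: element exposure function

      exposure_index = frequency * exposure_fn              # phase 4: cell-wise product
      levels = np.digitize(exposure_index, bins=[0.1, 0.3, 0.6])  # phase 5: 4 classes
      print(exposure_index)
      print(levels)                                         # 0 = low ... 3 = very high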

  12. Indigenous vs. International soil classification system in Ohangwena Region, Namibia

    NASA Astrophysics Data System (ADS)

    Prudat, Brice; Kuhn, Nikolaus J.; Bloemertz, Lena

    2014-05-01

    This poster presents soil diversity in North-Central Namibia, with a focus on soil fertility. It aims to show the correspondences and differences between an international and an indigenous soil classification system. International classifications, like the World Reference Base for Soil Resources (WRB), are very helpful tools for sharing information in soil science and agriculture. However, these classifications are meaningful for understanding large-scale soil processes, and local specificities cannot be understood and differentiated. On the other hand, the knowledge that farmers have of cultivated soils is very accurate and adapted to local agricultural use. However, their knowledge must be properly defined and translated to be used by scientists; once it can be read by scientists, it provides a very powerful tool for soil mapping and characterization. Analysis so far has focused on the area of Ondobe (30 km west of Eenhana, Ohangwena region). This area is located between two major systems, the Cuvelai floodplain to the west and the Kalahari woodlands to the east. While all the cultivated soils of this region would be classified as Arenosols (WRB), the local classification differentiates five major soil types (Omutunda, Ehenge, Omufitu, Elondo, Ehenene). In the WRB classification, these soils correspond roughly to specific Arenosols: respectively Hypereutric, Albic, Haplic, Rubic and Salic Arenosols. Further work will evaluate the local variation within each indigenous soil type. Hierarchical classification using soil field descriptors will be used to create statistical soil groups (see the sketch below); these new groups will then be compared to each classification system.
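    The planned hierarchical grouping could look like the sketch below: profiles clustered on numeric field descriptors and the tree cut into a fixed number of statistical soil groups. The descriptors and values are hypothetical.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      rng = np.random.default_rng(6)
      # Columns: e.g., topsoil pH, sand fraction, organic matter, color value.
      profiles = rng.normal(size=(15, 4))

      Z = linkage(profiles, method="ward")             # agglomerative clustering
      groups = fcluster(Z, t=5, criterion="maxclust")  # cut the tree into 5 groups
      print(groups)                                    # group label per profile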

  13. Equivalent Diagnostic Classification Models

    ERIC Educational Resources Information Center

    Maris, Gunter; Bechger, Timo

    2009-01-01

    Rupp and Templin (2008) do a good job of describing the ever-expanding landscape of Diagnostic Classification Models (DCM). In many ways, their review article clearly points to some of the questions that need to be answered before DCMs can become part of the psychometric practitioner's toolkit. Apart from the issues mentioned in this article that…

  14. Shark Teeth Classification

    ERIC Educational Resources Information Center

    Brown, Tom; Creel, Sally; Lee, Velda

    2009-01-01

    On a recent autumn afternoon at Harmony Leland Elementary in Mableton, Georgia, students in a fifth-grade science class investigated the essential process of classification--the act of putting things into groups according to some common characteristics or attributes. While they may have honed these skills earlier in the week by grouping their own…

  15. The Classification Conundrum.

    ERIC Educational Resources Information Center

    Granger, Charles R.

    1983-01-01

    Argues against the five-kingdom scheme of classification as using inconsistent criteria, ending up with divisions that are forced, not natural. Advocates an approach using cell type/complexity and modification of the metabolic machinery, recommending the five-kingdom scheme as starting point for class discussion on taxonomy and its conceptual…

  16. Improving Student Question Classification

    ERIC Educational Resources Information Center

    Heiner, Cecily; Zachary, Joseph L.

    2009-01-01

    Students in introductory programming classes often articulate their questions and information needs incompletely. Consequently, the automatic classification of student questions to provide automated tutorial responses is a challenging problem. This paper analyzes 411 questions from an introductory Java programming course by reducing the natural…

  17. Soil Classification and Treatment.

    ERIC Educational Resources Information Center

    Clemson Univ., SC. Vocational Education Media Center.

    This instructional unit was designed to enable students, primarily at the secondary level, to (1) classify soils according to current capability classifications of the Soil Conservation Service, (2) select treatments needed for a given soil class according to current recommendations provided by the Soil Conservation Service, and (3) interpret a…

  18. The Biglan Classification Revisited.

    ERIC Educational Resources Information Center

    Stoecker, Judith L.

    This study replicated previous research that tested the validity of A. Biglan's classification scheme, a theoretical framework for empirically examining the differences among academic disciplines and classifying them according to three dimensions (hard-soft, pure-applied, life-nonlife). In addition, new data were used to attempt continued…

  19. Accurate pointing of tungsten welding electrodes

    NASA Technical Reports Server (NTRS)

    Ziegelmeier, P.

    1971-01-01

    Thoriated tungsten is pointed accurately and quickly using sodium nitrite. The point produced is smooth, and no effort is necessary to hold the tungsten rod concentric. The chemically produced point can be used several times longer than ground points. This method reduces the time and cost of preparing tungsten electrodes.

  20. Refining Time-Activity Classification of Human Subjects Using the Global Positioning System

    PubMed Central

    Hu, Maogui; Li, Wei; Li, Lianfa; Houston, Douglas; Wu, Jun

    2016-01-01

    Background: Detailed spatial location information is important for accurately estimating personal exposure to air pollution. The Global Positioning System (GPS) has been widely used to track personal paths and activities. Previous researchers have developed time-activity classification models based on GPS data, but most were developed for specific regions. An adaptive model for time-location classification can be applied widely to air pollution studies that use GPS to track individual-level time-activity patterns. Methods: Time-activity data were collected for seven days using GPS loggers and accelerometers from thirteen adult participants from Southern California under free-living conditions. We developed an automated model based on random forests to classify major time-activity patterns (i.e., indoor, outdoor-static, outdoor-walking, and in-vehicle travel). Sensitivity analysis was conducted to examine the contribution of the accelerometer data and the supplemental spatial data (i.e., roadway and tax parcel data) to the accuracy of time-activity classification. Our model was evaluated using both leave-one-fold-out and leave-one-subject-out methods. Results: Maximum speeds in averaging time intervals of 7 and 5 minutes, and distance to primary highways with limited access, were found to be the three most important variables in the classification model. Leave-one-fold-out cross-validation showed an overall accuracy of 99.71%. Sensitivities varied from 84.62% (outdoor walking) to 99.90% (indoor). Specificities varied from 96.33% (indoor) to 99.98% (outdoor static). The exclusion of accelerometer and ambient light sensor variables caused a slight loss in sensitivity for outdoor walking, but little loss in overall accuracy. However, leave-one-subject-out cross-validation showed considerable loss in sensitivity for the outdoor-static and outdoor-walking conditions. Conclusions: The random forests classification model can achieve high accuracy for the four major time
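    The contrast between the two validation schemes reported above can be reproduced schematically: ordinary k-fold cross-validation versus leave-one-subject-out, in which each participant's data are held out in turn. The simulated features and subject ids below are placeholders for the GPS/accelerometer variables.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

      rng = np.random.default_rng(7)
      X = rng.normal(size=(1300, 6))          # e.g., max speed, distance to highway
      y = rng.integers(0, 4, size=1300)       # indoor / outdoor-static / walking / in-vehicle
      subjects = np.repeat(np.arange(13), 100)

      rf = RandomForestClassifier(n_estimators=100, random_state=0)
      kfold_acc = cross_val_score(rf, X, y, cv=5).mean()
      # Leave-one-subject-out: every fold holds out one participant entirely.
      loso_acc = cross_val_score(rf, X, y, cv=LeaveOneGroupOut(), groups=subjects).mean()
      print(round(kfold_acc, 3), round(loso_acc, 3))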

  1. Efficient Fingercode Classification

    NASA Astrophysics Data System (ADS)

    Sun, Hong-Wei; Law, Kwok-Yan; Gollmann, Dieter; Chung, Siu-Leung; Li, Jian-Bin; Sun, Jia-Guang

    In this paper, we present an efficient fingerprint classification algorithm, an essential component in many critical security application systems, e.g., systems in the e-government and e-finance domains. Fingerprint identification is one of the most important security requirements in homeland security systems such as personnel screening and anti-money-laundering. The problem of fingerprint identification involves searching (matching) the fingerprint of a person against the fingerprints of all registered persons. To enhance performance and reliability, a common approach is to reduce the search space by first classifying the fingerprints and then performing the search within the respective class. Jain et al. proposed a fingerprint classification algorithm based on a two-stage classifier, which uses a K-nearest neighbor classifier in its first stage. The algorithm is based on the fingercode representation, an encoding of fingerprints that has been demonstrated to be an effective fingerprint biometric scheme because of its ability to capture both local and global details in a fingerprint image. We enhance this approach by improving the efficiency of the K-nearest neighbor classifier for fingercode-based fingerprint classification. Our research first investigates various fast search algorithms in vector quantization (VQ) and their potential application to fingerprint classification, and then proposes two efficient algorithms based on the pyramid-based search algorithms in VQ. Experimental results on DB1 of FVC 2004 demonstrate that our algorithms can outperform the full search algorithm and the original pyramid-based search algorithms in terms of computational efficiency without sacrificing accuracy.
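    The first-stage K-nearest-neighbor step operates on fixed-length fingercode vectors, as in the minimal sketch below. The 192-dimensional random codes and five-class labels are stand-ins, and the fast pyramid-based search the paper develops is replaced here by the library's default neighbor index.

      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier

      rng = np.random.default_rng(8)
      codes = rng.normal(size=(500, 192))              # gallery of fingercode vectors
      classes = rng.integers(0, 5, size=500)           # e.g., arch, whorl, loop types

      knn = KNeighborsClassifier(n_neighbors=5, metric="euclidean").fit(codes, classes)
      query = rng.normal(size=(1, 192))                # fingercode of the probe print
      print(knn.predict(query), knn.kneighbors(query, return_distance=False))

    In a real system the predicted class narrows the gallery before the expensive full matching stage, which is the search-space reduction the abstract describes.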

  2. Comparison of Cramer classification between Toxtree, the OECD QSAR Toolbox and expert judgment.

    PubMed

    Bhatia, Sneha; Schultz, Terry; Roberts, David; Shen, Jie; Kromidas, Lambros; Marie Api, Anne

    2015-02-01

    The Threshold of Toxicological Concern (TTC) is a pragmatic approach to risk assessment. In the absence of data, it sets levels of human exposure that are considered to pose no appreciable risk to human health. The Cramer decision tree is used extensively to determine these exposure thresholds by categorizing non-carcinogenic chemicals into three structural classes. Assigning an accurate Cramer class to a material is therefore a crucial step in preserving the integrity of the risk assessment. In this study the Cramer class of over 1000 fragrance materials across diverse chemical classes was determined using Toxtree (TT), the OECD QSAR Toolbox (TB), and expert judgment. Discordance was observed between TT and the TB: a total of 165 materials (16%) showed different results from the two programs. The overall concordance for Cramer classification between TT and expert judgment is 83%, while the concordance between the TB and expert judgment is 77%. Amines, lactones and heterocycles have the lowest percent agreement with expert judgment for TT and the TB. For amines, the expert judgment agreement is 45% for TT and 55% for the TB. For heterocycles, the expert judgment agreement is 55% for both TT and the TB. For lactones, the expert judgment agreement is 56% for TT and 50% for the TB. Additional analyses were conducted to determine the concordance within various chemical classes. Critical checkpoints in the decision tree are identified. Strategies and guidance on determining the Cramer class for various chemical classes are discussed.

  3. Incorporating potency into EU classification for carcinogenicity and reproductive toxicity.

    PubMed

    Hennes, C; Batke, M; Bomann, W; Duhayon, S; Kosemund, K; Politano, V; Stinchcombe, S; Doe, J

    2014-11-01

    Although risk assessment, assessing the potential harm of each particular exposure of a substance, is desirable, it is not feasible in many situations. Risk assessment uses a process of hazard identification, hazard characterisation, and exposure assessment as its components. In the absence of risk assessment, the purpose of classification is to give broad guidance (through the label) on the suitability of a chemical in a range of use situations. Hazard classification in the EU is a process involving identification of the hazards of a substance, followed by comparison of those hazards (including degree of hazard) with defined criteria. Classification should therefore give guidance on degree of hazard as well as hazard identification. Potency is the most important indicator of degree of hazard and should therefore be included in classification. This is done for acute lethality and general toxicity by classifying on dose required to cause the effect. The classification in the EU for carcinogenicity and reproductive toxicity does not discriminate across the wide range of potencies seen (6 orders of magnitude) for carcinogenicity and for developmental toxicity and fertility. Therefore potency should be included in the classification process. The methodology in the EU guidelines for classification for deriving specific concentration limits is a rigorous process for assigning substances which cause tumours or developmental toxicity and infertility in experimental animals to high, medium or low degree of hazard categories by incorporating potency. Methods are suggested on how the degree of hazard so derived could be used in the EU classification process to improve hazard communication and in downstream risk management.

  4. [Classification of primary bone tumors].

    PubMed

    Dominok, G W; Frege, J

    1986-01-01

    An expanded classification for bone tumors is presented, based on the well-known international classification as well as on earlier systems. The current status and future trends in this area are discussed.

  5. Automatic Classification in Information Retrieval.

    ERIC Educational Resources Information Center

    van Rijsbergen, C. J.

    1978-01-01

    Addresses the application of automatic classification methods to the problems associated with computerized document retrieval. Different kinds of classifications are described, and both document and term clustering methods are discussed. References and notes are provided. (Author/JD)

  6. Classification: Purposes, Principles, Progress, Prospects

    ERIC Educational Resources Information Center

    Sokal, Robert R.

    1974-01-01

    Clustering and other new techniques have changed classificatory principles and practice in many sciences. Discussed are definitions, purposes of classification, principles of classification, and recent trends. (Author/RH)

  7. EXPOSURE ANALYSIS

    EPA Science Inventory

    This proceedings chapter will discuss the state-of-the-science regarding the evaluation of exposure as it relates to water quality criteria (WQC), sediment quality guidelines (SQG), and wildlife criteria (WC). Throughout this discussion, attempts are made to identify the methods ...

  8. Free classification of American English dialects by native and non-native listeners

    PubMed Central

    Clopper, Cynthia G.; Bradlow, Ann R.

    2009-01-01

    Most second language acquisition research focuses on linguistic structures, and less research has examined the acquisition of sociolinguistic patterns. The current study explored the perceptual classification of regional dialects of American English by native and non-native listeners using a free classification task. Results revealed similar classification strategies for the native and non-native listeners. However, the native listeners were more accurate overall than the non-native listeners. In addition, the non-native listeners were less able to make use of constellations of cues to accurately classify the talkers by dialect. However, the non-native listeners were able to attend to cues that were either phonologically or sociolinguistically relevant in their native language. These results suggest that non-native listeners can use information in the speech signal to classify talkers by regional dialect, but that their lack of signal-independent cultural knowledge about variation in the second language leads to less accurate classification performance. PMID:20161400

  9. Reading List in Classification Theory.

    ERIC Educational Resources Information Center

    Richmond, Phyllis A.

    The reading list contains 180 references and serves as an introduction to classification research literature. While the list includes major viewpoints on basic work in the field of classification as well as related areas, it excludes critical literature. The major divisions of the reading list are: (1) Definition: What is Classification, (2)…

  10. Feedback about More Accurate versus Less Accurate Trials: Differential Effects on Self-Confidence and Activation

    ERIC Educational Resources Information Center

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-01-01

    One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On Day 1, participants performed a golf putting task under one of…

  11. PROCEDURES FOR ACCURATE PRODUCTION OF COLOR IMAGES FROM SATELLITE OR AIRCRAFT MULTISPECTRAL DIGITAL DATA.

    USGS Publications Warehouse

    Duval, Joseph S.

    1985-01-01

    Because the display and interpretation of satellite and aircraft remote-sensing data make extensive use of color film products, accurate reproduction of the color images is important. To achieve accurate color reproduction, the exposure and chemical processing of the film must be monitored and controlled. By using a combination of sensitometry, densitometry, and transfer functions that control film response curves, all of the different steps in the making of film images can be monitored and controlled. Because a sensitometer produces a calibrated exposure, the resulting step wedge can be used to monitor the chemical processing of the film. Step wedges put on film by image recording machines provide a means of monitoring the film exposure and color balance of the machines.

  12. An accurate method of extracting fat droplets in liver images for quantitative evaluation

    NASA Astrophysics Data System (ADS)

    Ishikawa, Masahiro; Kobayashi, Naoki; Komagata, Hideki; Shinoda, Kazuma; Yamaguchi, Masahiro; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie

    2015-03-01

    The steatosis in liver pathological tissue images is a promising indicator of nonalcoholic fatty liver disease (NAFLD) and of the possible risk of hepatocellular carcinoma (HCC). The resulting values are also important for ensuring automatic and accurate classification of HCC images, because the existence of many fat droplets is likely to create errors in quantifying the morphological features used in the process. In this study we propose a method that can automatically detect and exclude regions with many fat droplets by using feature values of color, shape, and the arrangement of cell nuclei. We implement the method and confirm that it can accurately detect fat droplets and quantify the fat droplet ratio of actual images. This investigation also clarifies the effective characteristics that contribute to accurate detection.

  13. ASSESSING EXPOSURE CLASSIFICATION IN THE AGRICULTURAL HEALTH STUDY

    EPA Science Inventory

    The Agricultural Health Study (AHS) is a prospective epidemiologic study examining cancer and non-cancer health outcomes for over 55,000 pesticide applicators and 34,000 spouses in Iowa and North Carolina. Questionnaires were used to collect information about the use of specific ...

  14. Assessing exposure in epidemiologic studies to disinfection by-products in drinking water: report from an international workshop.

    PubMed Central

    Arbuckle, Tye E; Hrudey, Steve E; Krasner, Stuart W; Nuckols, Jay R; Richardson, Susan D; Singer, Philip; Mendola, Pauline; Dodds, Linda; Weisel, Clifford; Ashley, David L; Froese, Kenneth L; Pegram, Rex A; Schultz, Irvin R; Reif, John; Bachand, Annette M; Benoit, Frank M; Lynberg, Michele; Poole, Charles; Waller, Kirsten

    2002-01-01

    The inability to accurately assess exposure has been one of the major shortcomings of epidemiologic studies of disinfection by-products (DBPs) in drinking water. A number of contributing factors include a) limited information on the identity, occurrence, toxicity, and pharmacokinetics of the many DBPs that can be formed from chlorine, chloramine, ozone, and chlorine dioxide disinfection; b) the complex chemical interrelationships between DBPs and other parameters within a municipal water distribution system; and c) difficulties obtaining accurate and reliable information on personal activity and water consumption patterns. In May 2000, an international workshop was held to bring together various disciplines to develop better approaches for measuring DBP exposure for epidemiologic studies. The workshop reached consensus about the clear need to involve relevant disciplines (e.g., chemists, engineers, toxicologists, biostatisticians and epidemiologists) as partners in developing epidemiologic studies of DBPs in drinking water. The workshop concluded that greater collaboration of epidemiologists with water utilities and regulators should be encouraged in order to make regulatory monitoring data more useful for epidemiologic studies. Similarly, exposure classification categories in epidemiologic studies should be chosen to make results useful for regulatory or policy decision making. PMID:11834463

  15. Accurate Guitar Tuning by Cochlear Implant Musicians

    PubMed Central

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task. PMID:24651081

  16. New model accurately predicts reformate composition

    SciTech Connect

    Ancheyta-Juarez, J.; Aguilar-Rodriguez, E. )

    1994-01-31

    Although naphtha reforming is a well-known process, the evolution of catalyst formulations, as well as new trends in gasoline specifications, has led to rapid evolution of the process, including reactor design, regeneration mode, and operating conditions. Mathematical modeling of the reforming process is an increasingly important tool. It is fundamental to the proper design of new reactors and the revamp of existing ones. Modeling can be used to optimize operating conditions, analyze the effects of process variables, and enhance unit performance. Instituto Mexicano del Petroleo has developed a model of the catalytic reforming process that accurately predicts reformate composition at the higher-severity conditions at which new reformers are being designed. The new AA model is more accurate than previous proposals because it takes into account the effects of temperature and pressure on the rate constants of each chemical reaction.

  17. Accurate colorimetric feedback for RGB LED clusters

    NASA Astrophysics Data System (ADS)

    Man, Kwong; Ashdown, Ian

    2006-08-01

    We present an empirical model of LED emission spectra that is applicable to both InGaN and AlInGaP high-flux LEDs, and which accurately predicts their relative spectral power distributions over a wide range of LED junction temperatures. We further demonstrate with laboratory measurements that changes in LED spectral power distribution with temperature can be accurately predicted with first- or second-order equations. This provides the basis for a real-time colorimetric feedback system for RGB LED clusters that can maintain the chromaticity of white light at constant intensity to within ±0.003 Δuv over a range of 45 degrees Celsius, and to within 0.01 Δuv when dimmed over an intensity range of 10:1.
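    The first- or second-order temperature prediction mentioned above amounts to a small polynomial fit, as in the sketch below; the junction temperatures and chromaticity readings are invented for illustration.

      import numpy as np

      temps = np.array([25.0, 35.0, 45.0, 55.0, 65.0])             # junction temp, °C
      u_prime = np.array([0.2120, 0.2126, 0.2135, 0.2147, 0.2162])  # measured u' chromaticity

      coeffs = np.polyfit(temps, u_prime, deg=2)   # second-order model of the drift
      predicted = np.polyval(coeffs, 50.0)         # expected u' at a new operating point
      print(round(float(predicted), 4))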

  18. Accurate guitar tuning by cochlear implant musicians.

    PubMed

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼ 30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task.

  19. An Accurate, Simplified Model of Intrabeam Scattering

    SciTech Connect

    Bane, Karl LF

    2002-05-23

    Beginning with the general Bjorken-Mtingwa solution for intrabeam scattering (IBS), we derive an accurate, greatly simplified model of IBS, valid for high-energy beams in normal storage ring lattices. In addition, we show that, under the same conditions, a modified version of Piwinski's IBS formulation (where η²_{x,y}/β_{x,y} has been replaced by H_{x,y}) asymptotically approaches the result of Bjorken-Mtingwa.
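    For reference, the dispersion invariant H substituted for η²/β is conventionally written in Twiss parameters as follows; this is the standard form, quoted here as background rather than from the report itself:

      % Dispersion invariant (curly-H) in terms of the Twiss parameters
      % alpha, beta, gamma and the dispersion eta and its slope eta'.
      \mathcal{H}_{x,y} \;=\; \gamma_{x,y}\,\eta_{x,y}^{2}
        \;+\; 2\,\alpha_{x,y}\,\eta_{x,y}\,\eta'_{x,y}
        \;+\; \beta_{x,y}\,\eta'^{\,2}_{x,y}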

  20. On accurate determination of contact angle

    NASA Technical Reports Server (NTRS)

    Concus, P.; Finn, R.

    1992-01-01

    Methods are proposed that exploit a microgravity environment to obtain highly accurate measurement of contact angle. These methods, which are based on our earlier mathematical results, do not require detailed measurement of a liquid free-surface, as they incorporate discontinuous or nearly-discontinuous behavior of the liquid bulk in certain container geometries. Physical testing is planned in the forthcoming IML-2 space flight and in related preparatory ground-based experiments.

  1. Fast Image Texture Classification Using Decision Trees

    NASA Technical Reports Server (NTRS)

    Thompson, David R.

    2011-01-01

    Texture analysis would permit improved autonomous, onboard science data interpretation for adaptive navigation, sampling, and downlink decisions. These analyses would assist with terrain analysis and instrument placement in both macroscopic and microscopic image data products. Unfortunately, most state-of-the-art texture analysis demands computationally expensive convolutions of filters involving many floating-point operations. This makes them infeasible for radiation-hardened computers and spaceflight hardware. A new method approximates traditional texture classification of each image pixel with a fast decision-tree classifier. The classifier uses image features derived from simple filtering operations involving integer arithmetic. The texture analysis method is therefore amenable to implementation on FPGA (field-programmable gate array) hardware. Image features based on the "integral image" transform produce descriptive and efficient texture descriptors. Training the decision tree on a set of training data yields a classification scheme that produces reasonable approximations of optimal "texton" analysis at a fraction of the computational cost. A decision-tree learning algorithm employing the traditional k-means criterion of inter-cluster variance is used to learn tree structure from training data. The result is an efficient and accurate summary of surface morphology in images. This work is an evolutionary advance that unites several previous algorithms (k-means clustering, integral images, decision trees) and applies them to a new problem domain (morphology analysis for autonomous science during remote exploration). Advantages include order-of-magnitude improvements in runtime, feasibility for FPGA hardware, and significant improvements in texture classification accuracy.
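    The integral-image trick that makes these features cheap is shown below: a summed-area table gives any box sum in four lookups, and a shallow decision tree classifies pixels from a local-energy feature. The two-texture toy image and window size are assumptions; the flight implementation's texton-approximation features are richer.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      def integral_image(img):
          # Summed-area table with a zero top row/left column for easy lookups.
          ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
          ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
          return ii

      def box_sum(ii, r, c, half):
          # Sum of img over the window [r-half, r+half] x [c-half, c+half]
          # from four table lookups (constant time per pixel).
          return (ii[r + half + 1, c + half + 1] - ii[r - half, c + half + 1]
                  - ii[r + half + 1, c - half] + ii[r - half, c - half])

      rng = np.random.default_rng(9)
      # Two synthetic textures: smooth (sigma=1) left half, rough (sigma=3) right.
      img = np.hstack([rng.normal(0, 1, (64, 32)), rng.normal(0, 3, (64, 32))])
      ii_sq = integral_image(img ** 2)          # local energy via the integral image

      half = 3
      X, y = [], []
      for r in range(half, 64 - half):
          for c in range(half, 64 - half):
              energy = box_sum(ii_sq, r, c, half) / (2 * half + 1) ** 2
              X.append([energy])
              y.append(0 if c < 32 else 1)      # true texture label per pixel

      tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
      print(tree.score(X, y))                   # near-perfect on this toy pair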

  2. Tree Classification Software

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1993-01-01

    This paper introduces the IND Tree Package to prospective users. IND does supervised learning using classification trees. This learning task is a basic tool used in the development of diagnosis, monitoring and expert systems. The IND Tree Package was developed as part of a NASA project to semi-automate the development of data analysis and modelling algorithms using artificial intelligence techniques. The IND Tree Package integrates features from CART and C4 with newer Bayesian and minimum encoding methods for growing classification trees and graphs. The IND Tree Package also provides an experimental control suite on top. The newer features give improved probability estimates often required in diagnostic and screening tasks. The package comes with a manual, Unix 'man' entries, and a guide to tree methods and research. The IND Tree Package is implemented in C under Unix and was beta-tested at university and commercial research laboratories in the United States.

  3. The KFM, A Homemade Yet Accurate and Dependable Fallout Meter

    SciTech Connect

    Kearny, C.H.

    2001-11-20

    The KFM is a homemade fallout meter that can be made using only materials, tools, and skills found in millions of American homes. It is an accurate and dependable electroscope-capacitor. The KFM, in conjunction with its attached table and a watch, is designed for use as a rate meter. Its attached table relates observed differences in the separations of its two leaves (before and after exposures at the listed time intervals) to the dose rates during exposures of these time intervals. In this manner dose rates from 30 mR/hr up to 43 R/hr can be determined with an accuracy of ±25%. A KFM can be charged with any one of the three expedient electrostatic charging devices described. Due to the use of anhydrite (made by heating gypsum from wallboard) inside a KFM and the expedient ''dry-bucket'' in which it can be charged when the air is very humid, this instrument always can be charged and used to obtain accurate measurements of gamma radiation no matter how high the relative humidity. The heart of this report is the step-by-step illustrated instructions for making and using a KFM. These instructions have been improved after each successive field test. The majority of the untrained test families, adequately motivated by cash bonuses offered for success and guided only by these written instructions, have succeeded in making and using a KFM. NOTE: ''The KFM, A Homemade Yet Accurate and Dependable Fallout Meter'' was published as an Oak Ridge National Laboratory report in 1979. Some of the materials originally suggested for suspending the leaves of the Kearny Fallout Meter (KFM) are no longer available. Because of changes in the manufacturing process, other materials (e.g., sewing thread, unwaxed dental floss) may not have the insulating capability to work properly. Oak Ridge National Laboratory has not tested any of the suggestions provided in the preface of the report, but they have been used by other groups. When using these instructions, the builder can verify the

  4. Ectoparasites and classification.

    PubMed

    Hopla, C E; Durden, L A; Keirans, J E

    1994-12-01

    The authors present an introductory overview of the principal groups of ectoparasites (flukes, leeches, crustaceans, insects, arachnids, lampreys and vampire bats) associated with domestic animals. Currently-accepted higher-level classifications are outlined for these parasites. Almost all significant ectoparasites of domestic animals are invertebrates, the majority being arthropods (crustaceans, insects and arachnids). Some of these ectoparasites are of particular importance as vectors of pathogens. Many ectoparasite species are host-specific, and vector species typically transmit characteristic pathogens.

  5. Granular loess classification based

    SciTech Connect

    Browzin, B.S.

    1985-05-01

    This paper discusses how loess might be identified by two index properties: the granulometric composition and the dry unit weight. These two indices are necessary but not always sufficient for identification of loess. On the basis of analyses of samples from three continents, it was concluded that the 0.01-0.5-mm fraction deserves the name loessial fraction. Based on the loessial fraction concept, a granulometric classification of loess is proposed. A triangular chart is used to classify loess.

  6. Cautious Collective Classification

    DTIC Science & Technology

    2009-12-01

    Addresses collective classification tasks, such as predicting the topic of a publication or the group membership of a person (Koller et al., 2007). © 2009 Luke K. McDowell, Kalyan Moy Gupta, and David W. Aha. Approved for public release; distribution unlimited.

  7. Repulsive-SVDD Classification

    DTIC Science & Technology

    2015-05-22

    Support vector data description (SVDD) is a well-known kernel method that constructs a … for the proposed method. Keywords: Repulsive SVDD; support vector data description; support vector machine; classification. … estimate the support of a high-dimensional distribution. Although this method uses a maximal-margin hyperplane instead of a hypersphere to separate the…

  8. Exposure chamber

    DOEpatents

    Moss, Owen R.; Briant, James K.

    1983-01-01

    An exposure chamber includes an imperforate casing having a fluid inlet at the top and an outlet at the bottom. A single vertical series of imperforate trays is provided. Each tray is spaced on all sides from the chamber walls. Baffles adjacent some of the trays restrict and direct the flow to give partial flow back and forth across the chambers and downward flow past the lowermost pan adjacent a central plane of the chamber.

  9. High Frequency QRS ECG Accurately Detects Cardiomyopathy

    NASA Technical Reports Server (NTRS)

    Schlegel, Todd T.; Arenare, Brian; Poulin, Gregory; Moser, Daniel R.; Delgado, Reynolds

    2005-01-01

    High frequency (HF, 150-250 Hz) analysis over the entire QRS interval of the ECG is more sensitive than conventional ECG for detecting myocardial ischemia. However, the accuracy of HF QRS ECG for detecting cardiomyopathy is unknown. We obtained simultaneous resting conventional and HF QRS 12-lead ECGs in 66 patients with cardiomyopathy (EF = 23.2 ± 6.1%, mean ± SD) and in 66 age- and gender-matched healthy controls using PC-based ECG software recently developed at NASA. The single most accurate ECG parameter for detecting cardiomyopathy was an HF QRS morphological score that takes into consideration the total number and severity of reduced amplitude zones (RAZs) present plus the clustering of RAZs together in contiguous leads. This RAZ score had an area under the receiver operator curve (ROC) of 0.91, and was 88% sensitive, 82% specific and 85% accurate for identifying cardiomyopathy at an optimum score cut-off of 140 points. Although conventional ECG parameters such as the QRS and QTc intervals were also significantly longer in patients than controls (P < 0.001, BBBs excluded), these conventional parameters were less accurate (area under the ROC = 0.77 and 0.77, respectively) than HF QRS morphological parameters for identifying underlying cardiomyopathy. The total amplitude of the HF QRS complexes, as measured by summed root mean square voltages (RMSVs), also differed between patients and controls (33.8 ± 11.5 vs. 41.5 ± 13.6 mV, respectively, P < 0.003), but this parameter was even less accurate in distinguishing the two groups (area under ROC = 0.67) than the HF QRS morphologic and conventional ECG parameters. Diagnostic accuracy was optimal (86%) when the RAZ score from the HF QRS ECG and the QTc interval from the conventional ECG were used simultaneously with cut-offs of ≥40 points and ≥445 ms, respectively. In conclusion 12-lead HF QRS ECG employing

  10. Spatial Mutual Information Based Hyperspectral Band Selection for Classification

    PubMed Central

    2015-01-01

    The amount of information involved in hyperspectral imaging is large. Hyperspectral band selection is a popular method for reducing dimensionality. Several information-based measures, such as mutual information, have been proposed to reduce information redundancy among spectral bands. Unfortunately, mutual information does not take into account the spatial dependency between adjacent pixels in images, thus reducing its robustness as a similarity measure. In this paper, we propose a new band selection method based on spatial mutual information. As a validation criterion, a supervised classification method using a support vector machine (SVM) is used. Experimental results on the classification of hyperspectral datasets show that the proposed method can achieve more accurate results. PMID:25918742
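
    A minimal sketch of a band-selection loop in this spirit, assuming a hyperspectral cube of shape (rows, cols, bands) and a reference class map; the spatial component is approximated here by mean-filtering each band before a histogram-based mutual information estimate, and the greedy relevance-minus-redundancy scheme is illustrative rather than the paper's exact estimator:

        import numpy as np
        from scipy.ndimage import uniform_filter

        def mutual_information(a, b, bins=64):
            """Histogram-based mutual information between two images."""
            hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
            pxy = hist / hist.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

        def spatial_mi(a, b, size=3, bins=64):
            """MI on locally averaged images, so each pixel also carries
            information about its neighbourhood (a crude spatial-MI proxy)."""
            return mutual_information(uniform_filter(a, size), uniform_filter(b, size), bins)

        def select_bands(cube, reference, k=10, size=3):
            """Greedy selection: prefer bands with high spatial MI against the
            reference map, penalising redundancy with already-selected bands."""
            n_bands = cube.shape[2]
            relevance = [spatial_mi(cube[:, :, i], reference, size) for i in range(n_bands)]
            selected = []
            for _ in range(min(k, n_bands)):
                def score(i):
                    redundancy = max((spatial_mi(cube[:, :, i], cube[:, :, j], size)
                                      for j in selected), default=0.0)
                    return relevance[i] - redundancy
                selected.append(max((i for i in range(n_bands) if i not in selected), key=score))
            return selected

    The selected bands would then be fed to an SVM classifier, as in the paper's validation step.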

  11. The new revised classification of acute pancreatitis 2012.

    PubMed

    Sarr, Michael G; Banks, Peter A; Bollen, Thomas L; Dervenis, Christos; Gooszen, Hein G; Johnson, Colin D; Tsiotos, Gregory G; Vege, Santhi Swaroop

    2013-06-01

    This study aims to update the 1991 Atlanta Classification of acute pancreatitis, to standardize the terminology and reporting of the disease and its complications. Important features of this classification have incorporated new insights into the disease learned over the last 20 years, including the recognition that acute pancreatitis and its complications constitute a dynamic process with two phases, early and late. The accurate and consistent description of acute pancreatitis will help to improve the stratification and reporting of new methods of care of acute pancreatitis across different practices, geographic areas, and countries.

  12. Phylogenetic classification and identification of bacteria by mass spectrometry.

    PubMed

    Freiwald, Anja; Sauer, Sascha

    2009-01-01

    Bacteria are a convenient source of intrinsic marker proteins, which can be detected efficiently by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry. The patterns of protein masses observed can be used for accurate classification and identification of bacteria. Key to the reliability of the method is a robust and standardized procedure for sample preparation, including bacterial culturing, chemical treatment for bacterial cell wall disruption and protein extraction, and mass spectrometry analysis. The protocol is an excellent alternative to classical microbiological classification and identification procedures, requiring minimal sample preparation effort and cost. Without cell culturing, the protocol generally takes <1 h.

  13. Overview of the International Classification of Vestibular Disorders.

    PubMed

    Bisdorff, Alexandre R; Staab, Jeffrey P; Newman-Toker, David E

    2015-08-01

    Classifications and definitions are essential to facilitate communication; promote accurate diagnostic criteria; develop, test, and use effective therapies; and specify knowledge gaps. This article describes the development of the International Classification of Vestibular Disorders (ICVD) initiative. It describes its history, scope, and goals. The Bárány Society has played a central role in organizing the ICVD by establishing internal development processes and outreach to other scientific societies. The ICVD is organized in four layers. The current focus is on disorders with a high epidemiologic importance, such as Menière disease, benign paroxysmal positional vertigo, vestibular migraine, and behavioral aspects of vestibular disorders.

  14. Classification of mental disorders*

    PubMed Central

    Stengel, E.

    1959-01-01

    One of the fundamental difficulties in devising a classification of mental disorders is the lack of agreement among psychiatrists regarding the concepts upon which it should be based: diagnoses can rarely be verified objectively and the same or similar conditions are described under a confusing variety of names. This situation militates against the ready exchange of ideas and experiences and hampers progress. As a first step towards remedying this state of affairs, the author of the article below has undertaken a critical survey of existing classifications. He shows how some of the difficulties created by lack of knowledge regarding pathology and etiology may be overcome by the use of “operational definitions” and outlines the basic principles on which he believes a generally acceptable international classification might be constructed. If this can be done it should lead to a greater measure of agreement regarding the value of specific treatments for mental disorders and greatly facilitate a broad epidemiological approach to psychiatric research. PMID:13834299

  15. Seismic event classification system

    DOEpatents

    Dowla, Farid U.; Jarpe, Stephen P.; Maurer, William

    1994-01-01

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities.
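
    A minimal sketch of the preprocessing chain described above, assuming a 1-D seismic trace sampled at rate fs; the spectrogram, a quantile threshold, and the 2-D FFT magnitude stand in for the time-frequency distribution, its binary representation, and the shift-invariant representation (function and parameter names are illustrative, not from the patent):

        import numpy as np
        from scipy.signal import spectrogram

        def shift_invariant_representation(signal, fs, threshold_quantile=0.75):
            """Time-frequency distribution -> binary image -> |2-D FFT|.
            The FFT magnitude is invariant to shifts of the binary
            time-frequency image, as required before the SONN stage."""
            _, _, tfd = spectrogram(signal, fs=fs, nperseg=128)
            binary = (tfd > np.quantile(tfd, threshold_quantile)).astype(float)
            return np.abs(np.fft.fft2(binary))

    The flattened result would then be fed to a self-organizing neural network (a Kohonen map or an ART network) that groups similar events.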

  16. Histologic classification of gliomas.

    PubMed

    Perry, Arie; Wesseling, Pieter

    2016-01-01

    Gliomas form a heterogeneous group of tumors of the central nervous system (CNS) and are traditionally classified based on histologic type and malignancy grade. Most gliomas, the diffuse gliomas, show extensive infiltration in the CNS parenchyma. Diffuse gliomas can be further typed as astrocytic, oligodendroglial, or rare mixed oligodendroglial-astrocytic of World Health Organization (WHO) grade II (low grade), III (anaplastic), or IV (glioblastoma). Other gliomas generally have a more circumscribed growth pattern, with pilocytic astrocytomas (WHO grade I) and ependymal tumors (WHO grade I, II, or III) as the most frequent representatives. This chapter provides an overview of the histology of all glial neoplasms listed in the WHO 2016 classification, including the less frequent "nondiffuse" gliomas and mixed neuronal-glial tumors. For multiple decades the histologic diagnosis of these tumors formed a useful basis for assessment of prognosis and therapeutic management. However, it is now fully clear that information on the molecular underpinnings often allows for a more robust classification of (glial) neoplasms. Indeed, in the WHO 2016 classification, histologic and molecular findings are integrated in the definition of several gliomas. As such, this chapter and Chapter 6 are highly interrelated and neither should be considered in isolation.

  17. Algorithm for reaction classification.

    PubMed

    Kraut, Hans; Eiblmaier, Josef; Grethe, Guenter; Löw, Peter; Matuszczyk, Heinz; Saller, Heinz

    2013-11-25

    Reaction classification has important applications, and many approaches to classification have been applied. Our own algorithm tests all maximum common substructures (MCS) between all reactant and product molecules in order to find an atom mapping containing the minimum chemical distance (MCD). Recent publications have concluded that new MCS algorithms need to be compared with existing methods in a reproducible environment, preferably on a generalized test set, yet the number of test sets available is small, and they are not truly representative of the range of reactions that occur in real reaction databases. We have designed a challenging test set of reactions and are making it publicly available and usable with InfoChem's software or other classification algorithms. We supply a representative set of example reactions, grouped into different levels of difficulty, from a large number of reaction databases that chemists actually encounter in practice, in order to demonstrate the basic requirements for a mapping algorithm to detect the reaction centers in a consistent way. We invite the scientific community to contribute to the future extension and improvement of this data set, to achieve the goal of a common standard.

  18. Seismic event classification system

    DOEpatents

    Dowla, F.U.; Jarpe, S.P.; Maurer, W.

    1994-12-13

    In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities. 21 figures.

  19. New classification of natural breeding habitats for Neotropical anophelines in the Yanomami Indian Reserve, Amazon Region, Brazil and a new larval sampling methodology.

    PubMed

    Sánchez-Ribas, Jordi; Oliveira-Ferreira, Joseli; Rosa-Freitas, Maria Goreti; Trilla, Lluís; Silva-do-Nascimento, Teresa Fernandes

    2015-09-01

    Here we present the first in a series of articles about the ecology of immature stages of anophelines in the Brazilian Yanomami area. We propose a new larval habitat classification and a new larval sampling methodology. We also report some preliminary results illustrating the applicability of the methodology based on data collected in the Brazilian Amazon rainforest in a longitudinal study of two remote Yanomami communities, Parafuri and Toototobi. In these areas, we mapped and classified 112 natural breeding habitats located in low-order river systems based on their association with river flood pulses, seasonality and exposure to sun. Our classification rendered seven types of larval habitats: lakes associated with the river, which are subdivided into oxbow lakes and nonoxbow lakes, flooded areas associated with the river, flooded areas not associated with the river, rainfall pools, small forest streams, medium forest streams and rivers. The methodology for larval sampling was based on the accurate quantification of the effective breeding area, taking into account the area of the perimeter and subtypes of microenvironments present per larval habitat type using a laser range finder and a small portable inflatable boat. The new classification and new sampling methodology proposed herein may be useful in vector control programs.

  20. New classification of natural breeding habitats for Neotropical anophelines in the Yanomami Indian Reserve, Amazon Region, Brazil and a new larval sampling methodology

    PubMed Central

    Sánchez-Ribas, Jordi; Oliveira-Ferreira, Joseli; Rosa-Freitas, Maria Goreti; Trilla, Lluís; Silva-do-Nascimento, Teresa Fernandes

    2015-01-01

    Here we present the first in a series of articles about the ecology of immature stages of anophelines in the Brazilian Yanomami area. We propose a new larval habitat classification and a new larval sampling methodology. We also report some preliminary results illustrating the applicability of the methodology based on data collected in the Brazilian Amazon rainforest in a longitudinal study of two remote Yanomami communities, Parafuri and Toototobi. In these areas, we mapped and classified 112 natural breeding habitats located in low-order river systems based on their association with river flood pulses, seasonality and exposure to sun. Our classification rendered seven types of larval habitats: lakes associated with the river, which are subdivided into oxbow lakes and nonoxbow lakes, flooded areas associated with the river, flooded areas not associated with the river, rainfall pools, small forest streams, medium forest streams and rivers. The methodology for larval sampling was based on the accurate quantification of the effective breeding area, taking into account the area of the perimeter and subtypes of microenvironments present per larval habitat type using a laser range finder and a small portable inflatable boat. The new classification and new sampling methodology proposed herein may be useful in vector control programs. PMID:26517655

  1. Improved Hierarchical Optimization-Based Classification of Hyperspectral Images Using Shape Analysis

    NASA Technical Reports Server (NTRS)

    Tarabalka, Yuliya; Tilton, James C.

    2012-01-01

    A new spectral-spatial method for classification of hyperspectral images is proposed. The HSegClas method is based on the integration of probabilistic classification and shape analysis within the hierarchical step-wise optimization algorithm. First, probabilistic support vector machine classification is applied. Then, at each iteration, the two neighboring regions with the smallest Dissimilarity Criterion (DC) are merged, and classification probabilities are recomputed. An important contribution of this work is the estimation of the DC between regions as a function of statistical, classification and geometrical (area and rectangularity) features. Experimental results are presented on a 102-band ROSIS image of the Center of Pavia, Italy. The developed approach yields more accurate classification results when compared to previously proposed methods.
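
    A minimal sketch of the merging loop, assuming regions come from an initial probabilistic classification; the dissimilarity criterion dc and the merge_regions update (which would recompute classification probabilities) are left abstract here, since in HSegClas the DC combines statistical, classification and geometrical features:

        def hierarchical_merge(regions, neighbors, dc, merge_regions, stop_at=1):
            """regions: dict id -> region data; neighbors: set of (i, j) tuples
            with i < j; dc(a, b) -> float; merge_regions(a, b) -> merged data."""
            while len(regions) > stop_at and neighbors:
                # Merge the neighboring pair with the smallest dissimilarity.
                i, j = min(neighbors, key=lambda p: dc(regions[p[0]], regions[p[1]]))
                regions[i] = merge_regions(regions[i], regions.pop(j))
                # Redirect j's adjacencies to i and drop the merged pair itself.
                neighbors = {tuple(sorted((i if a == j else a, i if b == j else b)))
                             for (a, b) in neighbors if {a, b} != {i, j}}
                neighbors = {p for p in neighbors if p[0] != p[1]}
            return regions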

  2. Comparisons of neural networks to standard techniques for image classification and correlation

    NASA Technical Reports Server (NTRS)

    Paola, Justin D.; Schowengerdt, Robert A.

    1994-01-01

    Neural network techniques for multispectral image classification and spatial pattern detection are compared to the standard techniques of maximum-likelihood classification and spatial correlation. For a Landsat scene of Tucson, Arizona, the neural network produced a more accurate classification than maximum-likelihood. Some of the errors in the maximum-likelihood classification are illustrated using decision region and class probability density plots. As expected, the main drawback to the neural network method is the long time required for the training stage. The network was trained using several different hidden layer sizes to optimize both the classification accuracy and training speed, and it was found that one node per class was optimal. The performance improved when 3x3 local windows of image data were entered into the net. This modification introduces texture into the classification without explicit calculation of a texture measure. Larger windows were successfully used for the detection of spatial features in Landsat and Magellan synthetic aperture radar imagery.
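
    A minimal sketch of the 3x3 windowing step, assuming a multispectral image array of shape (rows, cols, bands); each pixel's feature vector is its full 3x3 neighbourhood across all bands, which is what introduces texture without an explicit texture measure:

        import numpy as np

        def window_features(image, size=3):
            """Stack each pixel's size x size neighbourhood (all bands) into one
            feature vector; edges are handled by reflective padding."""
            pad = size // 2
            padded = np.pad(image, ((pad, pad), (pad, pad), (0, 0)), mode="reflect")
            rows, cols, bands = image.shape
            feats = np.empty((rows, cols, size * size * bands))
            for r in range(rows):
                for c in range(cols):
                    feats[r, c] = padded[r:r + size, c:c + size, :].ravel()
            # One row per pixel, ready for a neural network or other classifier.
            return feats.reshape(rows * cols, -1)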

  3. Multispectral rock-type separation and classification.

    SciTech Connect

    Moya, Mary M.; Fogler, Robert Joseph; Paskaleva, Biliana; Hayat, Majeed M.

    2004-06-01

    This paper explores the possibility of separating and classifying remotely-sensed multispectral data from rocks and minerals into seven geological rock-type groups. These groups are extracted from the general categories of metamorphic, igneous and sedimentary rocks. The study is performed under ideal conditions for which the data is generated according to laboratory hyperspectral data for the members, which are, in turn, passed through the Multi-spectral Thermal Imager (MTI) filters, yielding 15 bands. The main challenge in separability is the small size of the training data sets, which initially did not permit direct application of Bayesian decision theory. To enable Bayesian classification, the original training data is linearly perturbed with the addition of minerals, vegetation, soil, water and other valid impurities. As a result, the size of the training data is significantly increased and accurate estimates of the covariance matrices are achieved. In addition, a set of reduced (five) linearly-extracted canonical features that are optimal in providing the most important information about the data is determined. An alternative nonlinear feature-selection method is also employed based on spectral indices comprising a small subset of all possible ratios between bands. By applying three optimization strategies, combinations of two and three ratios are found that provide reliable separability and classification between all seven groups according to the Bhattacharyya distance. To set a benchmark to which the MTI capability in rock classification can be compared, an optimization strategy is performed for the selection of optimal multispectral filters, other than the MTI filters, and an improvement in classification is predicted.
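
    A minimal sketch of the separability criterion named above, assuming Gaussian class models with means m1, m2 and covariances S1, S2; this is the standard Bhattacharyya distance, which the study uses to rank feature combinations:

        import numpy as np

        def bhattacharyya(m1, S1, m2, S2):
            """Bhattacharyya distance between two Gaussian class models."""
            S = 0.5 * (S1 + S2)
            dm = m2 - m1
            term1 = 0.125 * dm @ np.linalg.solve(S, dm)
            _, logdet_S = np.linalg.slogdet(S)
            _, logdet_S1 = np.linalg.slogdet(S1)
            _, logdet_S2 = np.linalg.slogdet(S2)
            term2 = 0.5 * (logdet_S - 0.5 * (logdet_S1 + logdet_S2))
            return term1 + term2

    Band ratios or canonical features that maximise the minimum pairwise distance across the seven rock-type groups would be the ones retained.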

  4. IRIS COLOUR CLASSIFICATION SCALES--THEN AND NOW.

    PubMed

    Grigore, Mariana; Avram, Alina

    2015-01-01

    Eye colour is one of the most obvious phenotypic traits of an individual. Since the first documented classification scale developed in 1843, there have been numerous attempts to classify iris colour. In past centuries, iris colour classification scales have had various colour categories and mostly relied on comparison of an individual's eye with painted glass eyes. Once photography techniques were refined, standard iris photographs replaced painted eyes, but this did not solve the problem of painted/printed colour variability over time. Early clinical scales were easy to use, but lacked objectivity and were not standardised or statistically tested for reproducibility. The era of automated iris colour classification systems came with technological development. Spectrophotometry, digital analysis of high-resolution iris images, hyperspectral analysis of the real human iris and dedicated iris colour analysis software have all accomplished objective, accurate iris colour classification, but are quite expensive and limited in use to research environments. Iris colour classification systems have evolved continuously due to their use in a wide range of studies, especially in the fields of anthropology, epidemiology and genetics. Despite the wide range of existing scales, up until the present there has been no generally accepted iris colour classification scale.

  5. Accurate upwind methods for the Euler equations

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1993-01-01

    A new class of piecewise linear methods for the numerical solution of the one-dimensional Euler equations of gas dynamics is presented. These methods are uniformly second-order accurate, and can be considered as extensions of Godunov's scheme. With an appropriate definition of monotonicity preservation for the case of linear convection, it can be shown that they preserve monotonicity. Similar to Van Leer's MUSCL scheme, they consist of two key steps: a reconstruction step followed by an upwind step. For the reconstruction step, a monotonicity constraint that preserves uniform second-order accuracy is introduced. Computational efficiency is enhanced by devising a criterion that detects the 'smooth' part of the data where the constraint is redundant. The concept and coding of the constraint are simplified by the use of the median function. A slope steepening technique, which has no effect in smooth regions and can resolve a contact discontinuity in four cells, is described. As for the upwind step, existing and new methods are applied in a manner slightly different from those in the literature. These methods are derived by approximating the Euler equations via linearization and diagonalization. At a 'smooth' interface, Harten, Lax, and Van Leer's one-intermediate-state model is employed. A modification of this model that can resolve contact discontinuities is presented. Near a discontinuity, either this modified model or a more accurate one, namely Roe's flux-difference splitting, is used. The current presentation of Roe's method, via the conceptually simple flux-vector splitting, not only establishes a connection between the two splittings, but also leads to an admissibility correction with no conditional statement, and an efficient approximation to Osher's approximate Riemann solver. These reconstruction and upwind steps result in schemes that are uniformly second-order accurate and economical in smooth regions, and yield high resolution at discontinuities.
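
    A minimal sketch of one median-based monotonicity constraint, assuming cell averages u on a uniform grid: the classical minmod limiter written with the median function, using the identity minmod(a, b) = median(0, a, b). Huynh's actual constraint is designed to keep uniform second-order accuracy at smooth extrema and differs in detail:

        import numpy as np

        def median3(a, b, c):
            """Elementwise median of three arrays."""
            return np.maximum(np.minimum(a, b), np.minimum(np.maximum(a, b), c))

        def minmod_slopes(u):
            """Monotone limited slopes for piecewise-linear reconstruction."""
            a = u[1:-1] - u[:-2]   # backward differences
            b = u[2:] - u[1:-1]    # forward differences
            return median3(np.zeros_like(a), a, b)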

  6. Accurate measurement of unsteady state fluid temperature

    NASA Astrophysics Data System (ADS)

    Jaremkiewicz, Magdalena

    2017-03-01

    In this paper, two accurate methods for determining the transient fluid temperature are presented. Measurements were conducted for boiling water, since its temperature is known. At the beginning the thermometers are at ambient temperature; they are then immediately immersed in saturated water. The measurements were carried out with two thermometers of different construction but with the same housing outer diameter, equal to 15 mm. One of them is a K-type industrial thermometer widely available commercially. The temperature indicated by the thermometer was corrected by treating the thermometer as a first- or second-order inertia device. A new thermometer design was proposed and also used to measure the temperature of boiling water. Its characteristic feature is a cylinder-shaped housing with the sheathed thermocouple located at its center. The temperature of the fluid was determined based on measurements taken in the axis of the solid cylindrical element (housing) using the inverse space marching method. Measurements of the transient temperature of air flowing through a wind tunnel using the same thermometers were also carried out. The proposed measurement technique provides more accurate results compared with measurements using industrial thermometers in conjunction with a simple temperature correction using a first- or second-order inertial thermometer model. By comparing the results, it was demonstrated that the new thermometer allows the fluid temperature to be obtained much faster and with higher accuracy than the industrial thermometer. Accurate measurements of fast-changing fluid temperature are possible due to the low-inertia thermometer and the fast space marching method applied for solving the inverse heat conduction problem.
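
    A minimal sketch of the simple correction mentioned above, treating the industrial thermometer as a first-order inertia device with time constant tau, so that T_fluid ≈ T_indicated + tau·dT_indicated/dt; the inverse space marching method used with the new thermometer is more involved and is not reproduced here:

        import numpy as np

        def first_order_correction(t, T_indicated, tau):
            """Reconstruct fluid temperature from a first-order sensor model.
            Differentiation amplifies noise, so in practice T_indicated
            should be smoothed before this step."""
            dTdt = np.gradient(T_indicated, t)
            return T_indicated + tau * dTdt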

  7. The first accurate description of an aurora

    NASA Astrophysics Data System (ADS)

    Schröder, Wilfried

    2006-12-01

    As technology has advanced, the scientific study of auroral phenomena has increased by leaps and bounds. A look back at the earliest descriptions of aurorae offers interesting insight into how medieval scholars viewed the subjects that we study. Although there are earlier fragmentary references in the literature, the first accurate description of the aurora borealis appears to be that published by the German Catholic scholar Konrad von Megenberg (1309-1374) in his book Das Buch der Natur (The Book of Nature). The book was written between 1349 and 1350.

  8. Determining accurate distances to nearby galaxies

    NASA Astrophysics Data System (ADS)

    Bonanos, Alceste Zoe

    2005-11-01

    Determining accurate distances to nearby or distant galaxies is a conceptually very simple, yet practically complicated, task. Presently, distances to nearby galaxies are only known to an accuracy of 10-15%. The current anchor galaxy of the extragalactic distance scale is the Large Magellanic Cloud, which has large (10-15%) systematic uncertainties associated with it, because of its morphology, its non-uniform reddening and the unknown metallicity dependence of the Cepheid period-luminosity relation. This work aims to determine accurate distances to some nearby galaxies, and subsequently help reduce the error in the extragalactic distance scale and the Hubble constant H0. In particular, this work presents the first distance determination of the DIRECT Project to M33 with detached eclipsing binaries. DIRECT aims to obtain a new anchor galaxy for the extragalactic distance scale by measuring direct, accurate (to 5%) distances to two Local Group galaxies, M31 and M33, with detached eclipsing binaries. It involves a massive variability survey of these galaxies and subsequent photometric and spectroscopic follow-up of the detached binaries discovered. In this work, I also present a catalog of variable stars discovered in one of the DIRECT fields, M31Y, which includes 41 eclipsing binaries. Additionally, we derive the distance to the Draco Dwarf Spheroidal galaxy, with ~100 RR Lyrae stars found in our first CCD variability study of this galaxy. A "hybrid" method of discovering Cepheids with ground-based telescopes is described next. It involves applying the image subtraction technique to images obtained from ground-based telescopes and then following them up with the Hubble Space Telescope to derive Cepheid period-luminosity distances. By re-analyzing ESO Very Large Telescope data on M83 (NGC 5236), we demonstrate that this method is much more powerful for detecting variability, especially in crowded fields. I finally present photometry for the Wolf-Rayet binary WR 20a

  9. New law requires 'medically accurate' lesson plans.

    PubMed

    1999-09-17

    The California Legislature has passed a bill requiring that all textbooks and materials used to teach about AIDS be medically accurate and objective. Statements made within the curriculum must be supported by research conducted in compliance with scientific methods and published in peer-reviewed journals. Some of the current lesson plans were found to contain scientifically unsupported and biased information. In addition, the bill requires material to be "free of racial, ethnic, or gender biases." The legislation is supported by a wide range of interests, but opposed by the California Right to Life Education Fund, which believes it discredits abstinence-only material.

  10. Algorithms for Hyperspectral Signature Classification in Non-resolved Object Characterization Using Tabular Nearest Neighbor Encoding

    NASA Astrophysics Data System (ADS)

    Schmalz, M.; Key, G.

    Accurate spectral signature classification is key to the nonimaging detection and recognition of spaceborne objects. In classical hyperspectral recognition applications, signature classification accuracy depends on accurate spectral endmember determination [1]. However, in selected automatic target recognition (ATR) applications, it is possible to circumvent the endmember detection problem by employing a Bayesian classifier. Previous approaches to Bayesian classification of spectral signatures have been rule-based, or predicated on a priori parameterized information obtained from offline training, as in the case of neural networks [1,2]. Unfortunately, class separation and classifier refinement results in these methods tend to be suboptimal, and the number of signatures that can be accurately classified often depends linearly on the number of inputs. This can lead to potentially significant classification errors in the presence of noise or densely interleaved signatures. In this paper, we present an emerging technology for nonimaging spectral signature classification based on a highly accurate but computationally efficient search engine called Tabular Nearest Neighbor Encoding (TNE) [3]. Based on prior results, TNE can optimize its classifier performance to track input nonergodicities, as well as yield measures of confidence or caution for evaluation of classification results. Unlike neural networks, TNE does not have a hidden intermediate data structure (e.g., the neural net weight matrix). Instead, TNE generates and exploits a user-accessible data structure called the agreement map (AM), which can be manipulated by Boolean logic operations to effect accurate classifier refinement algorithms. This allows the TNE programmer or user to determine parameters for classification accuracy, and to mathematically analyze the signatures for which TNE did not obtain classification matches. This dual approach to analysis (i.e., correct vs. incorrect classification) has been shown to

  11. Noise-Tolerant Hyperspectral Signature Classification in Unresolved Object Detection with Adaptive Tabular Nearest Neighbor Encoding

    NASA Astrophysics Data System (ADS)

    Schmalz, M.; Key, G.

    Accurate spectral signature classification is a crucial step in the nonimaging detection and recognition of spaceborne objects. In classical hyperspectral recognition applications, especially where linear mixing models are employed, signature classification accuracy depends on accurate spectral endmember discrimination. In selected automatic target recognition (ATR) applications, previous non-adaptive techniques for signature classification have yielded class separation and classifier refinement results that tend to be suboptimal. In practice, the number of signatures accurately classified often depends linearly on the number of inputs. This can lead to potentially severe classification errors in the presence of noise or densely interleaved signatures. In this paper, we present an enhancement of an emerging technology for nonimaging spectral signature classification based on a highly accurate, efficient search engine called Tabular Nearest Neighbor Encoding (TNE). Adaptive TNE can optimize its classifier performance to track input nonergodicities and yield measures of confidence or caution for evaluation of classification results. Unlike neural networks, TNE does not have a hidden intermediate data structure (e.g., a neural net weight matrix). Instead, TNE generates and exploits a user-accessible data structure called the agreement map (AM), which can be manipulated by Boolean logic operations to effect accurate classifier refinement through programmable algorithms. The open architecture and programmability of TNE's pattern-space (AM) processing allows a TNE developer to determine the qualitative and quantitative reasons for classification accuracy, as well as characterize in detail the signatures for which TNE does not obtain classification matches, and why such mis-matches occur. In this study AM-based classification has been modified to partially compensate for input statistical changes, in response to performance metrics such as probability of correct classification (Pd

  12. Fully Automatic System for Accurate Localisation and Analysis of Cephalometric Landmarks in Lateral Cephalograms

    PubMed Central

    Lindner, Claudia; Wang, Ching-Wei; Huang, Cheng-Ta; Li, Chung-Hsing; Chang, Sheng-Wei; Cootes, Tim F.

    2016-01-01

    Cephalometric tracing is a standard analysis tool for orthodontic diagnosis and treatment planning. The aim of this study was to develop and validate a fully automatic landmark annotation (FALA) system for finding cephalometric landmarks in lateral cephalograms and its application to the classification of skeletal malformations. Digital cephalograms of 400 subjects (age range: 7–76 years) were available. All cephalograms had been manually traced by two experienced orthodontists with 19 cephalometric landmarks, and eight clinical parameters had been calculated for each subject. A FALA system to locate the 19 landmarks in lateral cephalograms was developed. The system was evaluated via comparison to the manual tracings, and the automatically located landmarks were used for classification of the clinical parameters. The system achieved an average point-to-point error of 1.2 mm, and 84.7% of landmarks were located within the clinically accepted precision range of 2.0 mm. The automatic landmark localisation performance was within the inter-observer variability between two clinical experts. The automatic classification achieved an average classification accuracy of 83.4% which was comparable to an experienced orthodontist. The FALA system rapidly and accurately locates and analyses cephalometric landmarks in lateral cephalograms, and has the potential to significantly improve the clinical work flow in orthodontic treatment. PMID:27645567

  13. Criminal exposure.

    PubMed

    1999-09-03

    In August, an HIV-positive man pleaded guilty to sexually assaulting a 14-year-old boy. The sleeping boy awoke to find [name removed] sexually assaulting him while watching a pornographic video. [Name removed] pleaded guilty to assault with intent to rape a child. In addition, [name removed] received three counts of indecent assault and battery on a child, and exposure of pornographic material to a minor. [Name removed] will remain on probation for five years, although the prosecution had recommended sentencing [name removed] to four or five years in prison. The boy continues to be tested for HIV.

  14. Classification of LiDAR Data with Point Based Classification Methods

    NASA Astrophysics Data System (ADS)

    Yastikli, N.; Cetin, Z.

    2016-06-01

    LiDAR is one of the most effective systems for three-dimensional (3D) data collection over wide areas. Nowadays, airborne LiDAR data are used frequently in various applications, such as object extraction, 3D modelling, change detection and map revision, with increasing point density and accuracy. The classification of LiDAR points is the first step of the LiDAR data processing chain and should be handled properly, since 3D city modelling, building extraction, DEM generation and similar applications directly use the classified point clouds. Different classification methods can be seen in recent research, and most studies work with a gridded LiDAR point cloud. In grid-based processing of LiDAR data, the loss of characteristic points in the point cloud (especially for vegetation and buildings) or the loss of height accuracy during the interpolation stage is inevitable. In this case, a possible solution is to use the raw point cloud data for classification, to avoid data and accuracy loss in the gridding process. In this study, the point-based classification possibilities of the LiDAR point cloud are investigated to obtain more accurate classes. Automatic point-based approaches, based on hierarchical rules, have been proposed to achieve ground, building and vegetation classes using the raw LiDAR point cloud data. In the proposed approaches, every single LiDAR point is analyzed according to features such as height, multi-return, etc., and then automatically assigned to the class to which it belongs. The use of the un-gridded point cloud in the proposed point-based classification process helped in the determination of more realistic rule sets. Detailed parameter analyses have been performed to obtain the most appropriate parameters in the rule sets to achieve accurate classes. Hierarchical rule sets were created for the proposed Approach 1 (using selected spatial-based and echo-based features) and Approach 2 (using only selected spatial-based features
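
    A minimal sketch of a hierarchical, point-based rule set in the spirit of the approaches above, assuming each point carries a height above ground and echo information; the thresholds and rules are illustrative, not the paper's:

        def classify_point(height_above_ground, return_number, num_returns,
                           ground_tol=0.2, building_min=2.5):
            """Assign a single LiDAR point to ground / vegetation / building
            using simple spatial-based and echo-based rules."""
            if height_above_ground <= ground_tol:
                return "ground"
            # Multiple returns suggest penetrable cover such as vegetation.
            if num_returns > 1 and return_number < num_returns:
                return "vegetation"
            if height_above_ground >= building_min and num_returns == 1:
                return "building"
            return "vegetation"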

  15. Accurate taxonomic assignment of short pyrosequencing reads.

    PubMed

    Clemente, José C; Jansson, Jesper; Valiente, Gabriel

    2010-01-01

    Ambiguities in the taxonomy-dependent assignment of pyrosequencing reads are usually resolved by mapping each read to the lowest common ancestor in a reference taxonomy of all those sequences that match the read. This conservative approach has the drawback of mapping a read to a possibly large clade that may also contain many sequences not matching the read. A more accurate taxonomic assignment of short reads can be made by mapping each read to the node in the reference taxonomy that provides the best precision and recall. We show that given a suffix array for the sequences in the reference taxonomy, a short read can be mapped to the node of the reference taxonomy with the best combined value of precision and recall in time linear in the size of the taxonomy subtree rooted at the lowest common ancestor of the matching sequences. An accurate taxonomic assignment of short reads can thus be made with about the same efficiency as when mapping each read to the lowest common ancestor of all matching sequences in a reference taxonomy. We demonstrate the effectiveness of our approach on several metagenomic datasets of marine and gut microbiota.
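
    A minimal sketch of the node-selection idea, assuming the reference taxonomy is given as parent pointers (leaves included) and, for each node, the set of leaf sequences below it; precision is the fraction of a node's leaves that match the read, recall is the fraction of matching leaves it covers, and the node with the best F-measure wins. The suffix-array machinery that makes this linear in the subtree size is omitted:

        def best_assignment(parent, leaves_under, matching_leaves):
            """parent: node -> parent (root maps to None);
            leaves_under: node -> set of leaf sequences below it;
            matching_leaves: set of leaves matching the read."""
            candidates = set()
            for leaf in matching_leaves:          # ancestors of matching leaves
                node = leaf
                while node is not None:
                    candidates.add(node)
                    node = parent[node]

            def f_measure(node):
                covered = leaves_under[node] & matching_leaves
                if not covered:
                    return 0.0
                precision = len(covered) / len(leaves_under[node])
                recall = len(covered) / len(matching_leaves)
                return 2 * precision * recall / (precision + recall)

            return max(candidates, key=f_measure)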

  16. Accurate pose estimation for forensic identification

    NASA Astrophysics Data System (ADS)

    Merckx, Gert; Hermans, Jeroen; Vandermeulen, Dirk

    2010-04-01

    In forensic authentication, one aims to identify the perpetrator among a series of suspects or distractors. A fundamental problem in any recognition system that aims for identification of subjects in a natural scene is the lack of constraints on viewing and imaging conditions. In forensic applications, identification proves even more challenging, since most surveillance footage is of abysmal quality. In this context, robust methods for pose estimation are paramount. In this paper we will therefore present a new pose estimation strategy for very low quality footage. Our approach uses 3D-2D registration of a textured 3D face model with the surveillance image to obtain accurate far field pose alignment. Starting from an inaccurate initial estimate, the technique uses novel similarity measures based on the monogenic signal to guide a pose optimization process. We will illustrate the descriptive strength of the introduced similarity measures by using them directly as a recognition metric. Through validation, using both real and synthetic surveillance footage, our pose estimation method is shown to be accurate, and robust to lighting changes and image degradation.

  17. Accurate basis set truncation for wavefunction embedding

    NASA Astrophysics Data System (ADS)

    Barnes, Taylor A.; Goodpaster, Jason D.; Manby, Frederick R.; Miller, Thomas F.

    2013-07-01

    Density functional theory (DFT) provides a formally exact framework for performing embedded subsystem electronic structure calculations, including DFT-in-DFT and wavefunction theory-in-DFT descriptions. In the interest of efficiency, it is desirable to truncate the atomic orbital basis set in which the subsystem calculation is performed, thus avoiding high-order scaling with respect to the size of the MO virtual space. In this study, we extend a recently introduced projection-based embedding method [F. R. Manby, M. Stella, J. D. Goodpaster, and T. F. Miller III, J. Chem. Theory Comput. 8, 2564 (2012)], 10.1021/ct300544e to allow for the systematic and accurate truncation of the embedded subsystem basis set. The approach is applied to both covalently and non-covalently bound test cases, including water clusters and polypeptide chains, and it is demonstrated that errors associated with basis set truncation are controllable to well within chemical accuracy. Furthermore, we show that this approach allows for switching between accurate projection-based embedding and DFT embedding with approximate kinetic energy (KE) functionals; in this sense, the approach provides a means of systematically improving upon the use of approximate KE functionals in DFT embedding.

  18. Identification of Microorganisms by High Resolution Tandem Mass Spectrometry with Accurate Statistical Significance

    NASA Astrophysics Data System (ADS)

    Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y.; Drake, Steven K.; Gucek, Marjan; Suffredini, Anthony F.; Sacks, David B.; Yu, Yi-Kuo

    2016-02-01

    Correct and rapid identification of microorganisms is the key to the success of many important applications in health and safety, including, but not limited to, infection treatment, food safety, and biodefense. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is challenging correct microbial identification because of the large number of choices present. To properly disentangle candidate microbes, one needs to go beyond apparent morphology or simple 'fingerprinting'; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptidome profiles of microbes to better separate them and by designing an analysis method that yields accurate statistical significance. Here, we present an analysis pipeline that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using MS/MS data of 81 samples, each composed of a single known microorganism, that the proposed pipeline can correctly identify microorganisms at least at the genus and species levels. We have also shown that the proposed pipeline computes accurate statistical significances, i.e., E-values for identified peptides and unified E-values for identified microorganisms. The proposed analysis pipeline has been implemented in MiCId, a freely available software for Microorganism Classification and Identification. MiCId is available for download at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.

  19. Interactive Classification Technology

    NASA Technical Reports Server (NTRS)

    deBessonet, Cary

    2000-01-01

    The investigators upgraded a knowledge representation language called SL (Symbolic Language) and an automated reasoning system called SMS (Symbolic Manipulation System) to enable the more effective use of the technologies in automated reasoning and interactive classification systems. The overall goals of the project were: 1) the enhancement of the representation language SL to accommodate a wider range of meaning; 2) the development of a default inference scheme to operate over SL notation as it is encoded; and 3) the development of an interpreter for SL that would handle representations of some basic cognitive acts and perspectives.

  20. Classification of degenerative arthritis.

    PubMed Central

    Mitchell, N. S.; Cruess, R. L.

    1977-01-01

    It is suggested that the former division of degenerative arthritis into idiopathic types and those secondary to some disease process is no longer valid. Recent studies have indicated that abnormal concentrations of force on cartilage lead to the development of this disease. A classification is presented that is based on the assumption that the process is initiated by abnormal concentrations of force on normal cartilage matrix, normal concentrations of force on abnormal cartilage matrix or normal concentrations of force on normal cartilage matrix that is supported by bone of abnormal consistency. PMID:907947

  1. Retinal Connectomics: Towards Complete, Accurate Networks

    PubMed Central

    Marc, Robert E.; Jones, Bryan W.; Watt, Carl B.; Anderson, James R.; Sigulinsky, Crystal; Lauritzen, Scott

    2013-01-01

    Connectomics is a strategy for mapping complex neural networks based on high-speed automated electron optical imaging, computational assembly of neural data volumes, web-based navigational tools to explore 10^12–10^15 byte (terabyte to petabyte) image volumes, and annotation and markup tools to convert images into rich networks with cellular metadata. These collections of network data and associated metadata, analyzed using tools from graph theory and classification theory, can be merged with classical systems theory, giving a more completely parameterized view of how biologic information processing systems are implemented in retina and brain. Networks have two separable features: topology and connection attributes. The first findings from connectomics strongly validate the idea that the topologies of complete retinal networks are far more complex than the simple schematics that emerged from classical anatomy. In particular, connectomics has permitted an aggressive refactoring of the retinal inner plexiform layer, demonstrating that network function cannot be simply inferred from stratification; exposing the complex geometric rules for inserting different cells into a shared network; revealing unexpected bidirectional signaling pathways between mammalian rod and cone systems; documenting selective feedforward systems, novel candidate signaling architectures, new coupling motifs, and the highly complex architecture of the mammalian AII amacrine cell. This is but the beginning, as the underlying principles of connectomics are readily transferrable to non-neural cell complexes and provide new contexts for assessing intercellular communication. PMID:24016532

  2. Experience and perspectives on the classification of organic mental disorders.

    PubMed

    Kopelman, Michael D; Fleminger, Simon

    2002-01-01

    The official diagnostic classification systems have been increasingly employed in the last few years, and this is true of both ICD-10 and DSM-IV. We will propose a few principles which should be considered when revisions are attempted. Our existing classifications should be simplified, but new syndromes incorporated where they have pathological justification. Links to other specialist diagnostic classifications should be made (e.g. in epilepsy, sleep disorders, dementias) wherever possible. A broader range of 'Neuropsychiatric Disorders' should be incorporated, including alcohol-related organic disorders, head injury, sleep disorders, if possible including the 'psychogenic syndromes'. Progressive, degenerative disorders need to be clearly distinguished from non-progressive syndromes, and some gradation of severity needs to be built into the classificatory system. Finally, the definitions need to be concise and accurate.

  3. Link prediction boosted psychiatry disorder classification for functional connectivity network

    NASA Astrophysics Data System (ADS)

    Li, Weiwei; Mei, Xue; Wang, Hao; Zhou, Yu; Huang, Jiashuang

    2017-02-01

    A functional connectivity network (FCN) is an effective tool in psychiatric disorder classification, and represents the cross-correlation of regional blood-oxygenation-level-dependent signals. However, an FCN is often incomplete, suffering from missing and spurious edges. To accurately classify psychiatric disorders and healthy controls with an incomplete FCN, we first 'repair' the FCN with link prediction, and then extract the clustering coefficients as features to build a weak classifier for every FCN. Finally, we apply a boosting algorithm to combine these weak classifiers to improve classification accuracy. Our method was tested on three psychiatric disorder datasets, covering Alzheimer's Disease, Schizophrenia and Attention Deficit Hyperactivity Disorder. The experimental results show our method not only significantly improves the classification accuracy, but also efficiently reconstructs the incomplete FCN.
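
    A minimal sketch of the pipeline, assuming networkx graphs for the per-subject FCNs; the common-neighbours score used here is one standard link-prediction heuristic (the paper's predictor may differ), and the resulting clustering-coefficient vectors would then be combined with an off-the-shelf booster such as sklearn's AdaBoostClassifier:

        import networkx as nx
        import numpy as np

        def repair_fcn(G, n_add=10):
            """Add the n_add most likely missing edges, scored by the
            number of common neighbours of each unconnected pair."""
            H = G.copy()
            non_edges = list(nx.non_edges(G))
            scores = [len(list(nx.common_neighbors(G, u, v))) for u, v in non_edges]
            for idx in np.argsort(scores)[::-1][:n_add]:
                H.add_edge(*non_edges[idx])
            return H

        def clustering_features(G):
            """Per-node clustering coefficients of the repaired FCN,
            as a fixed-length feature vector (nodes sorted for alignment)."""
            cc = nx.clustering(repair_fcn(G))
            return np.array([cc[n] for n in sorted(G.nodes())])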

  4. The Comprehensive AOCMF Classification System: Radiological Issues and Systematic Approach

    PubMed Central

    Buitrago-Téllez, Carlos H.; Cornelius, Carl-Peter; Prein, Joachim; Kunz, Christoph; Ieva, Antonio di; Audigé, Laurent

    2014-01-01

    The AOCMF Classification Group developed a hierarchical three-level craniomaxillofacial (CMF) classification system with increasing level of complexity and details. The basic level 1 system differentiates fracture location in the mandible (code 91), midface (code 92), skull base (code 93), and cranial vault (code 94); the levels 2 and 3 focus on defining fracture location and morphology within more detailed regions and subregions. Correct imaging acquisition, systematic analysis, and interpretation according to the anatomic and surgical relevant structures in the CMF regions are essential for an accurate, reproducible, and comprehensive diagnosis of CMF fractures using that system. Basic principles for radiographic diagnosis are based on conventional plain films, multidetector computed tomography, and magnetic resonance imaging. In this tutorial, the radiological issues according to each level of the classification are described. PMID:25489396

  5. Robust tissue classification for reproducible wound assessment in telemedicine environments

    NASA Astrophysics Data System (ADS)

    Wannous, Hazem; Treuillet, Sylvie; Lucas, Yves

    2010-04-01

    In telemedicine environments, a standardized and reproducible assessment of wounds, using a simple handheld digital camera, is an essential requirement. However, to ensure robust tissue classification, particular attention must be paid to the complete design of the color processing chain. We introduce the key steps, including color correction, merging of expert labeling, and segmentation-driven classification based on support vector machines. The tool thus developed ensures stability under lighting-condition, viewpoint, and camera changes, to achieve accurate and robust classification of skin tissues. Clinical tests demonstrate that such an advanced tool, which forms part of a complete 3-D and color wound assessment system, significantly improves the monitoring of the healing process. It achieves an overlap score of 79.3% against 69.1% for a single expert, after mapping onto the medical reference developed from image labeling by a college of experts.

  6. Tongue fissure extraction and classification using hyperspectral imaging technology.

    PubMed

    Li, Qingli; Wang, Yiting; Liu, Hongying; Sun, Zhen; Liu, Zhi

    2010-04-10

    Tongue fissures, an important feature on the tongue surface, may be pathologically related to some diseases. Most existing tongue fissure extraction methods use tongue images captured by traditional charge-coupled device cameras. However, these conventional methods cannot be used for an accurate analysis of the tongue surface due to the limited information in the images. To solve this, a hyperspectral tongue imager is used to capture tongue images instead of a digital camera. New algorithms for automatic tongue fissure extraction and classification, based on hyperspectral images, are presented. Both spectral and spatial information of the tongue surface is used to segment the tongue body and extract tongue fissures. Then a classification algorithm based on a hidden Markov model is used to classify tongue fissures into 12 typical categories. Results of the experiment show that the new method performs well in terms of correct classification rates.

  7. Evaluation of several classification schemes for mapping forest cover types in Michigan

    NASA Technical Reports Server (NTRS)

    Hudson, W. D.

    1987-01-01

    Landsat MSS data were evaluated for mapping forest cover types in the northern Lower Peninsula of Michigan. The study examined seasonal variations, interpretation procedures and vegetation composition/distribution and their effect on overall classification accuracy and the ability to identify individual pine species. Photographic images were used for visual interpretations, while digital analysis was performed using a common (ERDAS) microcomputer image processing system. The classification schemes were evaluated using contingency tables and were ranked using the KAPPA statistic. The various classification schemes ranked differently according to study site location. Visual interpretation procedures ranked most or least accurate depending on the spatial distribution and complexity of the forest cover. Supervised classification techniques were more accurate than unsupervised clustering over all sites and seasons. Maximum likelihood classification of June data was superior to any digital classification technique applied to February data. The study indicates that classification accuracy is more dependent on the composition and distribution of forests in the northern Lower Peninsula of Michigan than on the selection of a particular classification scheme.
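
    A minimal sketch of the ranking statistic used above: Cohen's kappa computed from a classification contingency (confusion) matrix, which corrects raw agreement for the agreement expected by chance:

        import numpy as np

        def cohens_kappa(confusion):
            """kappa = (observed agreement - chance agreement) / (1 - chance)."""
            confusion = np.asarray(confusion, dtype=float)
            total = confusion.sum()
            observed = np.trace(confusion) / total
            chance = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / total**2
            return (observed - chance) / (1 - chance)

        # Two cover types with a strong diagonal:
        print(cohens_kappa([[45, 5], [10, 40]]))  # 0.70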

  8. EVALUATING EXCESS DIETARY EXPOSURE OF YOUNG CHILDREN EATING IN CONTAMINATED ENVIRONMENTS

    EPA Science Inventory

    The United States' Food Quality Protection Act of 1996 requires more accurate assessment of children's aggregate exposures to environmental contaminants. Since children have unstructured eating behaviors, their excess exposures, caused by eating activities, become an importan...

  9. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, D.D.

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube, and a chamber for containing a charge of high-pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high-pressure gas to purge the sight tube of airborne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  10. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, Douglas D.

    1985-01-01

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube, and a chamber for containing a charge of high-pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high-pressure gas to purge the sight tube of airborne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  11. LSM: perceptually accurate line segment merging

    NASA Astrophysics Data System (ADS)

    Hamid, Naila; Khan, Nazar

    2016-11-01

    Existing line segment detectors tend to break up perceptually distinct line segments into multiple segments. We propose an algorithm for merging such broken segments to recover the original perceptually accurate line segments. The algorithm proceeds by grouping line segments on the basis of angular and spatial proximity. Then those line segment pairs within each group that satisfy unique, adaptive mergeability criteria are successively merged to form a single line segment. This process is repeated until no more line segments can be merged. We also propose a method for quantitative comparison of line segment detection algorithms. Results on the York Urban dataset show that our merged line segments are closer to human-marked ground-truth line segments compared to state-of-the-art line segment detection algorithms.
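
    A minimal sketch of the grouping test, assuming segments are stored as endpoint pairs; two segments are candidates for merging when their orientations and endpoint gap fall below fixed thresholds, whereas the paper's mergeability criteria are unique and adaptive:

        import math

        def orientation(seg):
            (x1, y1), (x2, y2) = seg
            return math.atan2(y2 - y1, x2 - x1) % math.pi   # undirected angle

        def endpoint_gap(s1, s2):
            return min(math.dist(p, q) for p in s1 for q in s2)

        def mergeable(s1, s2, max_angle=math.radians(5), max_gap=10.0):
            d = abs(orientation(s1) - orientation(s2))
            d = min(d, math.pi - d)                          # wrap-around
            return d <= max_angle and endpoint_gap(s1, s2) <= max_gap

        def merge(s1, s2):
            """Replace two mergeable segments by their farthest endpoint pair."""
            pts = list(s1) + list(s2)
            return max(((p, q) for p in pts for q in pts),
                       key=lambda pq: math.dist(*pq))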

  12. Highly accurate articulated coordinate measuring machine

    DOEpatents

    Bieg, Lothar F.; Jokiel, Jr., Bernhard; Ensz, Mark T.; Watson, Robert D.

    2003-12-30

    Disclosed is a highly accurate articulated coordinate measuring machine, comprising a revolute joint, comprising a circular encoder wheel, having an axis of rotation; a plurality of marks disposed around at least a portion of the circumference of the encoder wheel; bearing means for supporting the encoder wheel, while permitting free rotation of the encoder wheel about the wheel's axis of rotation; and a sensor, rigidly attached to the bearing means, for detecting the motion of at least some of the marks as the encoder wheel rotates; a probe arm, having a proximal end rigidly attached to the encoder wheel, and having a distal end with a probe tip attached thereto; and coordinate processing means, operatively connected to the sensor, for converting the output of the sensor into a set of cylindrical coordinates representing the position of the probe tip relative to a reference cylindrical coordinate system.

  13. Practical aspects of spatially high accurate methods

    NASA Technical Reports Server (NTRS)

    Godfrey, Andrew G.; Mitchell, Curtis R.; Walters, Robert W.

    1992-01-01

    The computational qualities of high order spatially accurate methods for the finite volume solution of the Euler equations are presented. Two dimensional essentially non-oscillatory (ENO), k-exact, and 'dimension by dimension' ENO reconstruction operators are discussed and compared in terms of reconstruction and solution accuracy, computational cost and oscillatory behavior in supersonic flows with shocks. Inherent steady state convergence difficulties are demonstrated for adaptive stencil algorithms. An exact solution to the heat equation is used to determine reconstruction error, and the computational intensity is reflected in operation counts. Standard MUSCL differencing is included for comparison. Numerical experiments presented include the Ringleb flow for numerical accuracy and a shock reflection problem. A vortex-shock interaction demonstrates the ability of the ENO scheme to excel in simulating unsteady high-frequency flow physics.
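
    As a point of reference for what a reconstruction operator does, here is a standard second-order MUSCL reconstruction with a minmod limiter on a 1-D scalar field; the paper's ENO and k-exact operators are higher-order generalizations of this idea, not reproduced here.

```python
# Standard MUSCL reconstruction with a minmod limiter on a 1-D scalar field;
# a generic illustration of cell-interface reconstruction, not the paper's
# ENO or k-exact operators.
import numpy as np

def minmod(a, b):
    # zero when neighboring slopes disagree in sign; else the smaller one
    return np.where(a * b > 0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

def muscl_interface_states(u):
    s = np.zeros_like(u)                           # limited slope per cell
    s[1:-1] = minmod(u[1:-1] - u[:-2], u[2:] - u[1:-1])
    uL = u[:-1] + 0.5 * s[:-1]   # left state at interface i+1/2
    uR = u[1:] - 0.5 * s[1:]     # right state at interface i+1/2
    return uL, uR

u = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])       # a discrete step
uL, uR = muscl_interface_states(u)
print(uL, uR)   # the limiter keeps the reconstruction non-oscillatory
```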

  14. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  15. Obtaining accurate translations from expressed sequence tags.

    PubMed

    Wasmuth, James; Blaxter, Mark

    2009-01-01

    The genomes of an increasing number of species are being investigated through the generation of expressed sequence tags (ESTs). However, ESTs are prone to sequencing errors and typically define incomplete transcripts, making downstream annotation difficult. Annotation would be greatly improved with robust polypeptide translations. Many current solutions for EST translation require a large number of full-length gene sequences for training purposes, a resource that is not available for the majority of EST projects. As part of our ongoing EST programs investigating these "neglected" genomes, we have developed a polypeptide prediction pipeline, prot4EST. It incorporates freely available software to produce final translations that are more accurate than those derived from any single method. We describe how this integrated approach goes a long way to overcoming the deficit in training data.

  16. Micron Accurate Absolute Ranging System: Range Extension

    NASA Technical Reports Server (NTRS)

    Smalley, Larry L.; Smith, Kely L.

    1999-01-01

    The purpose of this research is to investigate Fresnel diffraction as a means of obtaining absolute distance measurements with micron or greater accuracy. It is believed that such a system would prove useful to the Next Generation Space Telescope (NGST) as a non-intrusive, non-contact measuring system for use with secondary concentrator station-keeping systems. The present research attempts to validate past experiments and develop ways to apply the phenomenon of Fresnel diffraction to micron-accurate measurement. This report discusses past research on the phenomenon and the basis of the use of Fresnel diffraction for distance metrology. The apparatus used in the recent investigations, the experimental procedures, and preliminary results are discussed in detail. Continued research and equipment requirements for extending the effective range of the Fresnel diffraction system are also described.

  17. Accurate radio positions with the Tidbinbilla interferometer

    NASA Technical Reports Server (NTRS)

    Batty, M. J.; Gulkis, S.; Jauncey, D. L.; Rayner, P. T.

    1979-01-01

    The Tidbinbilla interferometer (Batty et al., 1977) is designed specifically to provide accurate radio position measurements of compact radio sources in the Southern Hemisphere with high sensitivity. The interferometer uses the 26-m and 64-m antennas of the Deep Space Network at Tidbinbilla, near Canberra. The two antennas are separated by 200 m on a north-south baseline. By utilizing the existing antennas and the low-noise traveling-wave masers at 2.29 GHz, it has been possible to produce a high-sensitivity instrument with a minimum of capital expenditure. The north-south baseline ensures that a good range of UV coverage is obtained, so that sources lying in the declination range between about -80 and +30 deg may be observed with nearly orthogonal projected baselines of no less than about 1000 lambda. The instrument also provides high-accuracy flux density measurements for compact radio sources.

  18. Magnetic ranging tool accurately guides replacement well

    SciTech Connect

    Lane, J.B.; Wesson, J.P.

    1992-12-21

    This paper reports on magnetic ranging surveys and directional drilling technology which accurately guided a replacement well bore to intersect a leaking gas storage well with casing damage. The second well bore was then used to pump cement into the original leaking casing shoe. The repair well bore kicked off from the surface hole, bypassed casing damage in the middle of the well, and intersected the damaged well near the casing shoe. The repair well was subsequently completed in the gas storage zone near the original well bore, salvaging the valuable bottom hole location in the reservoir. This method would prevent the loss of storage gas, and it would prevent a potential underground blowout that could permanently damage the integrity of the storage field.

  19. Holistic facial expression classification

    NASA Astrophysics Data System (ADS)

    Ghent, John; McDonald, J.

    2005-06-01

    This paper details a procedure for classifying facial expressions. This is a growing and relatively new type of problem within computer vision. One of the fundamental problems in previous approaches to classifying facial expressions is the lack of a consistent method of measuring expression. This paper addresses that problem through the computation of the Facial Expression Shape Model (FESM). This statistical model of facial expression is based on an anatomical analysis of facial expression called the Facial Action Coding System (FACS). We use the term Action Unit (AU) to describe a movement of one or more muscles of the face, and all expressions can be described using the AUs defined by FACS. The shape model is calculated by marking the face with 122 landmark points. We use Principal Component Analysis (PCA) to analyse how the landmark points move with respect to each other and to lower the dimensionality of the problem. Using the FESM in conjunction with Support Vector Machines (SVM), we classify facial expressions. SVMs are a powerful machine learning technique based on optimisation theory. This project is largely concerned with statistical models, machine learning techniques and psychological tools used in the classification of facial expression. This holistic approach to expression classification provides a means for a level of interaction with a computer that is a significant step forward in human-computer interaction.
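
    A minimal sketch of the statistical pipeline just described, PCA over flattened landmark coordinates feeding an SVM, with random stand-in data in place of real 122-point annotations:

```python
# Sketch of the PCA + SVM pipeline on landmark coordinates; the data here
# are random stand-ins for real annotated faces, and the component count
# and kernel are illustrative choices.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples, n_landmarks = 200, 122
X = rng.normal(size=(n_samples, n_landmarks * 2))   # (x, y) per landmark
y = rng.integers(0, 6, size=n_samples)              # six basic expressions

clf = make_pipeline(PCA(n_components=20), SVC(kernel="rbf"))
clf.fit(X[:150], y[:150])
print("held-out accuracy:", clf.score(X[150:], y[150:]))
```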

  20. The Transporter Classification Database

    PubMed Central

    Saier, Milton H.; Reddy, Vamsee S.; Tamang, Dorjee G.; Västermark, Åke

    2014-01-01

    The Transporter Classification Database (TCDB; http://www.tcdb.org) serves as a common reference point for transport protein research. The database contains more than 10 000 non-redundant proteins that represent all currently recognized families of transmembrane molecular transport systems. Proteins in TCDB are organized in a five level hierarchical system, where the first two levels are the class and subclass, the second two are the family and subfamily, and the last one is the transport system. Superfamilies that contain multiple families are included as hyperlinks to the five tier TC hierarchy. TCDB includes proteins from all types of living organisms and is the only transporter classification system that is both universal and recognized by the International Union of Biochemistry and Molecular Biology. It has been expanded by manual curation, contains extensive text descriptions providing structural, functional, mechanistic and evolutionary information, is supported by unique software and is interconnected to many other relevant databases. TCDB is of increasing usefulness to the international scientific community and can serve as a model for the expansion of database technologies. This manuscript describes an update of the database descriptions previously featured in NAR database issues. PMID:24225317
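
    The five-tier numbering described above (class.subclass.family.subfamily.system, e.g. 2.A.1.1.1) maps directly onto a small parser; the field names below simply mirror the hierarchy the record describes.

```python
# Minimal parser for a five-tier TC number; field names mirror the
# class/subclass/family/subfamily/system hierarchy described in the record.
from typing import NamedTuple

class TCNumber(NamedTuple):
    tc_class: str    # e.g. "2"
    subclass: str    # e.g. "A"
    family: str      # e.g. "1"
    subfamily: str
    system: str

def parse_tc(tc: str) -> TCNumber:
    parts = tc.split(".")
    if len(parts) != 5:
        raise ValueError(f"expected 5 dot-separated fields, got {tc!r}")
    return TCNumber(*parts)

print(parse_tc("2.A.1.1.1"))
```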

  1. Supply chain planning classification

    NASA Astrophysics Data System (ADS)

    Hvolby, Hans-Henrik; Trienekens, Jacques; Bonde, Hans

    2001-10-01

    Industry experiences a need to shift focus from internal production planning towards planning in the supply network. In this respect, customer-oriented thinking is becoming almost a common good amongst companies in the supply network. An increase in the use of information technology is needed to enable companies to better tune their production planning with customers and suppliers. Information technology opportunities and supply chain planning systems enable companies to monitor and control their supplier network. In spite of these developments, most links in today's supply chains make individual plans, because real demand information is not available throughout the chain. The current systems and processes of the supply chains are not designed to meet the requirements now placed upon them. For long-term relationships with suppliers and customers, an integrated decision-making process is needed in order to obtain a satisfactory result for all parties, especially when customized production and short lead times are in focus. An effective value chain makes inventory available and visible among the value chain members, minimizes response time and optimizes the total inventory value held throughout the chain. In this paper a supply chain planning classification grid is presented, based on current manufacturing classifications and supply chain planning initiatives.

  2. Molecular classification of gliomas.

    PubMed

    Masui, Kenta; Mischel, Paul S; Reifenberger, Guido

    2016-01-01

    The identification of distinct genetic and epigenetic profiles in different types of gliomas has revealed novel diagnostic, prognostic, and predictive molecular biomarkers for refinement of glioma classification and improved prediction of therapy response and outcome. Therefore, the new (2016) World Health Organization (WHO) classification of tumors of the central nervous system breaks with the traditional principle of diagnosis based on histologic criteria only and incorporates molecular markers. This will involve a multilayered approach combining histologic features and molecular information in an "integrated diagnosis". We review the current state of diagnostic molecular markers for gliomas, focusing on isocitrate dehydrogenase 1 or 2 (IDH1/IDH2) gene mutation, α-thalassemia/mental retardation syndrome X-linked (ATRX) gene mutation, 1p/19q co-deletion and telomerase reverse transcriptase (TERT) promoter mutation in adult tumors, as well as v-raf murine sarcoma viral oncogene homolog B1 (BRAF) and H3 histone family 3A (H3F3A) aberrations in pediatric gliomas. We also outline prognostic and predictive molecular markers, including O6-methylguanine-DNA methyltransferase (MGMT) promoter methylation, and discuss the potential clinical relevance of biologic glioblastoma subtypes defined by integration of multiomics data. Commonly used methods for individual marker detection as well as novel large-scale DNA methylation profiling and next-generation sequencing approaches are discussed. Finally, we illustrate how advances in molecular diagnostics affect novel strategies of targeted therapy, thereby raising new challenges and identifying new leads for personalized treatment of glioma patients.

  3. Classification of Physical Activity

    PubMed Central

    Turksoy, Kamuran; Paulino, Thiago Marques Luz; Zaharieva, Dessi P.; Yavelberg, Loren; Jamnik, Veronica; Riddell, Michael C.; Cinar, Ali

    2015-01-01

    Physical activity has a wide range of effects on glucose concentrations in type 1 diabetes (T1D) depending on the type (ie, aerobic, anaerobic, mixed) and duration of activity performed. This variability in glucose responses to physical activity makes the development of artificial pancreas (AP) systems challenging. Automatic detection of exercise type and intensity, and its classification as aerobic or anaerobic would provide valuable information to AP control algorithms. This can be achieved by using a multivariable AP approach where biometric variables are measured and reported to the AP at high frequency. We developed a classification system that identifies, in real time, the exercise intensity and its reliance on aerobic or anaerobic metabolism and tested this approach using clinical data collected from 5 persons with T1D and 3 individuals without T1D in a controlled laboratory setting using a variety of common types of physical activity. The classifier had an average sensitivity of 98.7% for physiological data collected over a range of exercise modalities and intensities in these subjects. The classifier will be added as a new module to the integrated multivariable adaptive AP system to enable the detection of aerobic and anaerobic exercise for enhancing the accuracy of insulin infusion strategies during and after exercise. PMID:26443291
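
    A hedged sketch of the core idea, classifying exercise windows as aerobic or anaerobic from high-frequency biometric variables; the features, distributions and model below are illustrative stand-ins, not the study's clinical variables or classifier:

```python
# Illustrative aerobic/anaerobic classifier on synthetic biometric features;
# the feature set and data are assumptions, not the study's measurements.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
# columns: heart rate [bpm], accelerometer energy, galvanic skin response
X = np.vstack([
    rng.normal([120, 2.0, 0.4], [10, 0.5, 0.1], size=(100, 3)),  # aerobic
    rng.normal([160, 5.0, 0.8], [10, 0.5, 0.1], size=(100, 3)),  # anaerobic
])
y = np.array([0] * 100 + [1] * 100)   # 0 = aerobic, 1 = anaerobic

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict([[150, 4.5, 0.7]]))   # classify a new 3-feature window
```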

  4. "Interactive Classification Technology"

    NASA Technical Reports Server (NTRS)

    deBessonet, Cary

    1999-01-01

    The investigators are upgrading a knowledge representation language called SL (Symbolic Language) and an automated reasoning system called SMS (Symbolic Manipulation System) to enable the technologies to be used in automated reasoning and interactive classification systems. The overall goals of the project are: a) the enhancement of the representation language SL to accommodate multiple perspectives and a wider range of meaning; b) the development of a sufficient set of operators to enable the interpreter of SL to handle representations of basic cognitive acts; and c) the development of a default inference scheme to operate over SL notation as it is encoded. As to particular goals, the first-year work plan focused on inferencing and representation issues, including: 1) the development of higher-level cognitive/classification functions and conceptual models for use in inferencing and decision making; 2) the specification of a more detailed scheme of defaults and the enrichment of SL notation to accommodate the scheme; and 3) the adoption of additional perspectives for inferencing.

  5. Waste classification sampling plan

    SciTech Connect

    Landsman, S.D.

    1998-05-27

    The purpose of this sampling plan is to explain the method used to collect and analyze the data necessary to verify and/or determine the radionuclide content of the B-Cell decontamination and decommissioning waste stream, so that the correct waste classification for the waste stream can be made, and to collect samples for studies of decontamination methods that could be used to remove fixed contamination present on the waste. The scope of this plan is to establish the technical basis for collecting samples and compiling quantitative data on the radioactive constituents present in waste generated during deactivation activities in B-Cell. Sampling and radioisotopic analysis will be performed on the fixed layers of contamination present on structural material and internal surfaces of process piping and tanks. In addition, dose rate measurements on existing waste material will be performed to determine the fraction of dose rate attributable to both removable and fixed contamination. Samples will also be collected to support studies of decontamination methods that are effective in removing the fixed contamination present on the waste. Sampling performed under this plan will meet criteria established in BNF-2596, Data Quality Objectives for the B-Cell Waste Stream Classification Sampling, J. M. Barnett, May 1998.

  6. The high cost of accurate knowledge.

    PubMed

    Sutcliffe, Kathleen M; Weber, Klaus

    2003-05-01

    Many business thinkers believe it's the role of senior managers to scan the external environment to monitor contingencies and constraints, and to use that precise knowledge to modify the company's strategy and design. As these thinkers see it, managers need accurate and abundant information to carry out that role. According to that logic, it makes sense to invest heavily in systems for collecting and organizing competitive information. Another school of pundits contends that, since today's complex information often isn't precise anyway, it's not worth going overboard with such investments. In other words, it's not the accuracy and abundance of information that should matter most to top executives--rather, it's how that information is interpreted. After all, the role of senior managers isn't just to make decisions; it's to set direction and motivate others in the face of ambiguities and conflicting demands. Top executives must interpret information and communicate those interpretations--they must manage meaning more than they must manage information. So which of these competing views is the right one? Research conducted by academics Sutcliffe and Weber found that how accurate senior executives are about their competitive environments is indeed less important for strategy and corresponding organizational changes than the way in which they interpret information about their environments. Investments in shaping those interpretations, therefore, may create a more durable competitive advantage than investments in obtaining and organizing more information. And what kinds of interpretations are most closely linked with high performance? Their research suggests that high performers respond positively to opportunities, yet they aren't overconfident in their abilities to take advantage of those opportunities.

  7. Hydrologic landscape regionalisation using deductive classification and random forests.

    PubMed

    Brown, Stuart C; Lester, Rebecca E; Versace, Vincent L; Fawcett, Jonathon; Laurenson, Laurie

    2014-01-01

    Landscape classification and hydrological regionalisation studies are being increasingly used in ecohydrology to aid in the management and research of aquatic resources. We present a methodology for classifying hydrologic landscapes based on spatial environmental variables by employing non-parametric statistics and hybrid image classification. Our approach differed from previous classifications which have required the use of an a priori spatial unit (e.g. a catchment) which necessarily results in the loss of variability that is known to exist within those units. The use of a simple statistical approach to identify an appropriate number of classes eliminated the need for large amounts of post-hoc testing with different number of groups, or the selection and justification of an arbitrary number. Using statistical clustering, we identified 23 distinct groups within our training dataset. The use of a hybrid classification employing random forests extended this statistical clustering to an area of approximately 228,000 km2 of south-eastern Australia without the need to rely on catchments, landscape units or stream sections. This extension resulted in a highly accurate regionalisation at both 30-m and 2.5-km resolution, and a less-accurate 10-km classification that would be more appropriate for use at a continental scale. A smaller case study, of an area covering 27,000 km2, demonstrated that the method preserved the intra- and inter-catchment variability that is known to exist in local hydrology, based on previous research. Preliminary analysis linking the regionalisation to streamflow indices is promising suggesting that the method could be used to predict streamflow behaviour in ungauged catchments. Our work therefore simplifies current classification frameworks that are becoming more popular in ecohydrology, while better retaining small-scale variability in hydrology, thus enabling future attempts to explain and visualise broad-scale hydrologic trends at the scale of

  8. Hydrologic Landscape Regionalisation Using Deductive Classification and Random Forests

    PubMed Central

    Brown, Stuart C.; Lester, Rebecca E.; Versace, Vincent L.; Fawcett, Jonathon; Laurenson, Laurie

    2014-01-01

    Landscape classification and hydrological regionalisation studies are being increasingly used in ecohydrology to aid in the management and research of aquatic resources. We present a methodology for classifying hydrologic landscapes based on spatial environmental variables by employing non-parametric statistics and hybrid image classification. Our approach differed from previous classifications which have required the use of an a priori spatial unit (e.g. a catchment) which necessarily results in the loss of variability that is known to exist within those units. The use of a simple statistical approach to identify an appropriate number of classes eliminated the need for large amounts of post-hoc testing with different number of groups, or the selection and justification of an arbitrary number. Using statistical clustering, we identified 23 distinct groups within our training dataset. The use of a hybrid classification employing random forests extended this statistical clustering to an area of approximately 228,000 km2 of south-eastern Australia without the need to rely on catchments, landscape units or stream sections. This extension resulted in a highly accurate regionalisation at both 30-m and 2.5-km resolution, and a less-accurate 10-km classification that would be more appropriate for use at a continental scale. A smaller case study, of an area covering 27,000 km2, demonstrated that the method preserved the intra- and inter-catchment variability that is known to exist in local hydrology, based on previous research. Preliminary analysis linking the regionalisation to streamflow indices is promising suggesting that the method could be used to predict streamflow behaviour in ungauged catchments. Our work therefore simplifies current classification frameworks that are becoming more popular in ecohydrology, while better retaining small-scale variability in hydrology, thus enabling future attempts to explain and visualise broad-scale hydrologic trends at the scale of
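
    A minimal sketch of the hybrid scheme described in the two records above: unsupervised clustering labels a training sample, and a random forest extends those labels across the wider region. The features and data are placeholders; the study derived 23 groups from real spatial environmental variables.

```python
# Hybrid regionalisation sketch: statistical clustering on a training
# sample, then a random forest extends the cluster labels to new pixels.
# Data and variable counts are synthetic placeholders.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
X_train = rng.normal(size=(500, 6))      # 6 spatial environmental variables
X_region = rng.normal(size=(5000, 6))    # pixels of the wider region

labels = KMeans(n_clusters=23, n_init=10, random_state=0).fit_predict(X_train)
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X_train, labels)

region_classes = rf.predict(X_region)    # regionalisation without catchments
print(np.bincount(region_classes))
```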

  9. SPIDERz: SuPport vector classification for IDEntifying Redshifts

    NASA Astrophysics Data System (ADS)

    Jones, Evan; Singal, J.

    2016-08-01

    SPIDERz (SuPport vector classification for IDEntifying Redshifts) applies powerful support vector machine (SVM) optimization and statistical learning techniques to custom data sets to obtain accurate photometric redshift (photo-z) estimations. It is written for the IDL environment and can be applied to traditional data sets consisting of photometric band magnitudes, or alternatively to data sets with additional galaxy parameters (such as shape information) to investigate potential correlations between the extra galaxy parameters and redshift.
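
    In the spirit of SPIDERz (which itself is written in IDL), a sketch of SVM-based photo-z estimation that bins redshift into classes and trains on band magnitudes; the data and bin edges are synthetic placeholders.

```python
# SVM photo-z sketch: bin true redshifts into classes and train an SVM on
# photometric band magnitudes. Data, band model and bins are placeholders.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(7)
n = 1000
z = rng.uniform(0.0, 2.0, n)                     # true redshifts
mags = np.column_stack([20 + 2 * z + rng.normal(0, 0.3, n)
                        for _ in range(5)])      # 5 photometric bands

z_bins = np.digitize(z, bins=np.arange(0.2, 2.0, 0.2))   # class per z bin
clf = SVC(kernel="rbf", C=10.0).fit(mags[:800], z_bins[:800])
acc = (clf.predict(mags[800:]) == z_bins[800:]).mean()
print(f"bin accuracy: {acc:.2f}")
```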

  10. Estimating equations for biomarker based exposure estimation under non-steady-state conditions.

    PubMed

    Bartell, Scott M; Johnson, Wesley O

    2011-06-13

    Unrealistic steady-state assumptions are often used to estimate toxicant exposure rates from biomarkers. A biomarker may instead be modeled as a weighted sum of historical time-varying exposures. Estimating equations are derived for a zero-inflated gamma distribution for daily exposures with a known exposure frequency. Simulation studies suggest that the estimating equations can provide accurate estimates of exposure magnitude at any reasonable sample size, and reasonable estimates of the exposure variance at larger sample sizes.
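
    The generative model described, zero-inflated gamma daily exposures feeding a biomarker that is a weighted sum of exposure history, is straightforward to simulate; the decay weights and parameters below are assumptions for illustration.

```python
# Simulate zero-inflated gamma daily exposures and a biomarker formed as a
# weighted sum of history; all parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
n_days = 365
p_exposed = 0.3                    # known exposure frequency
shape, scale = 2.0, 1.5            # gamma parameters on exposed days

exposed = rng.random(n_days) < p_exposed
dose = np.where(exposed, rng.gamma(shape, scale, n_days), 0.0)

# biomarker today = sum of past doses decayed with an assumed 30-day half-life
halflife = 30.0
w = 0.5 ** (np.arange(n_days)[::-1] / halflife)   # weight per past day
biomarker = np.sum(w * dose)
print(f"biomarker level: {biomarker:.2f}")
```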

  11. Approaching system equilibrium with accurate or not accurate feedback information in a two-route system

    NASA Astrophysics Data System (ADS)

    Zhao, Xiao-mei; Xie, Dong-fan; Li, Qi

    2015-02-01

    With the development of intelligent transport systems, advanced information feedback strategies have been developed to reduce traffic congestion and enhance capacity. However, previous strategies provide accurate information to travelers, and our simulation results show that accurate information can have negative effects, especially when the feedback is delayed: travelers prefer the route reported to be in the best condition, but delayed information reflects past rather than current traffic conditions. Travelers therefore make wrong routing decisions, decreasing capacity, increasing oscillations, and driving the system away from equilibrium. To avoid these negative effects, bounded rationality is taken into account by introducing a boundedly rational threshold BR. When the difference between the two routes is less than BR, the routes are chosen with equal probability. Bounded rationality helps improve efficiency in terms of capacity, oscillation, and the gap from the system equilibrium.
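
    A toy simulation of the boundedly rational choice rule: below the threshold BR the two routes are equally likely, otherwise the faster one is chosen. The congestion dynamics and all parameters are illustrative assumptions, not the paper's model.

```python
# Toy two-route simulation with a boundedly rational threshold BR; the
# congestion feedback rule and constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
BR = 2.0                       # boundedly rational threshold [min]
t = np.array([10.0, 10.0])     # reported travel times on routes A, B

counts = np.zeros(2)
for step in range(10000):
    if abs(t[0] - t[1]) < BR:
        choice = rng.integers(2)          # indifferent: coin flip
    else:
        choice = int(np.argmin(t))        # strictly prefer the faster route
    counts[choice] += 1
    # congestion feedback: the chosen route slows, the other recovers
    t[choice] += 0.01
    t[1 - choice] = max(10.0, t[1 - choice] - 0.01)

print("route shares:", counts / counts.sum())
```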

  12. [A accurate identification method for Chinese materia medica--systematic identification of Chinese materia medica].

    PubMed

    Wang, Xue-Yong; Liao, Cai-Li; Liu, Si-Qi; Liu, Chun-Sheng; Shao, Ai-Juan; Huang, Lu-Qi

    2013-05-01

    This paper puts forward a more accurate identification method for Chinese materia medica (CMM), the systematic identification of Chinese materia medica (SICMM), which may resolve difficulties in CMM identification that ordinary traditional approaches cannot. The concepts, mechanisms and methods of SICMM are systematically introduced, and its feasibility is demonstrated by experiments. The establishment of SICMM will solve problems in CMM identification not only at the level of phenotypic characters such as morphology, microstructure and chemical constituents, but also in the further discovery of the evolution and classification of species, subspecies and populations of medicinal plants. The establishment of SICMM will advance the identification of CMM and open a broader space for study.

  13. Higher order accurate partial implicitization: An unconditionally stable fourth-order-accurate explicit numerical technique

    NASA Technical Reports Server (NTRS)

    Graves, R. A., Jr.

    1975-01-01

    The previously obtained second-order-accurate partial implicitization numerical technique used in the solution of fluid dynamic problems was modified with little complication to achieve fourth-order accuracy. The Von Neumann stability analysis demonstrated the unconditional linear stability of the technique. The order of the truncation error was deduced from the Taylor series expansions of the linearized difference equations and was verified by numerical solutions to Burgers' equation. For comparison, results were also obtained for Burgers' equation using a second-order-accurate partial-implicitization scheme, as well as the fourth-order scheme of Kreiss.

  14. [Systematic classification and community research techniques of arbuscular mycorrhizal fungi: a review].

    PubMed

    Liu, Yong-Jun; Feng, Hu-Yuan

    2010-06-01

    Arbuscular mycorrhizal fungi (AMF) are an important component of natural ecosystems, forming symbioses with plant roots. The traditional AMF classification is mainly based on the morphological identification of soil asexual spores, which has some limitations for the taxonomy of AMF. Advanced molecular techniques make the classification of AMF more accurate and scientific, and can improve the taxonomy of AMF established on the basis of morphological identification. The community research of AMF is mainly based on species classification, and has two kinds of investigation methods, i.e., spore morphological identification and molecular analysis. This paper reviewed the research progress in the systematic classification and community research techniques of AMF, with a focus on molecular techniques in community analysis of AMF. It was considered that using morphological and molecular methods together would contribute to the accurate investigation of AMF communities and facilitate the improvement of AMF taxonomy.

  15. Automatically high accurate and efficient photomask defects management solution for advanced lithography manufacture

    NASA Astrophysics Data System (ADS)

    Zhu, Jun; Chen, Lijun; Ma, Lantao; Li, Dejian; Jiang, Wei; Pan, Lihong; Shen, Huiting; Jia, Hongmin; Hsiang, Chingyun; Cheng, Guojie; Ling, Li; Chen, Shijie; Wang, Jun; Liao, Wenkui; Zhang, Gary

    2014-04-01

    Defect review is a time-consuming job, and human error makes its results inconsistent. Defects located in don't-care areas, such as dark regions, do not hurt yield and need not be reviewed, whereas defects in critical areas, such as clear regions, can impact yield dramatically and demand closer review. With decreasing integrated circuit dimensions, thousands of mask defects or more are typically detected during a single inspection, and traditional manual or simple classification approaches cannot meet efficiency and accuracy requirements. This paper presents an automatic defect management and classification solution using the image output of Lasertec inspection equipment and Anchor-pattern-centric image processing technology. The system can handle large numbers of defects with quick and accurate classification. Our experiments include Die-to-Die and Single-Die modes, with classification accuracies of 87.4% and 93.3%, respectively. No critical or printable defects were missed in our test cases; the rates of missed classifications were 0.25% in Die-to-Die mode and 0.24% in Single-Die mode, which is encouraging and acceptable for application on a production line. Results can be exported and reloaded into the inspection machine for further review, which helps users validate uncertain defects with clear, magnified images when the captured images do not provide enough information for a judgment. The system effectively reduces expensive inline defect review time. As a fully inline automated defect management solution, it is compatible with the current inspection approach and can be integrated with optical simulation, including scoring functions, to guide wafer-level defect inspection.

  16. 45 CFR 601.5 - Derivative classification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 3 2012-10-01 2012-10-01 false Derivative classification. 601.5 Section 601.5... CLASSIFICATION AND DECLASSIFICATION OF NATIONAL SECURITY INFORMATION § 601.5 Derivative classification. Distinct... classification guide, need not possess original classification authority. (a) If a person who applies...

  17. 46 CFR 503.55 - Derivative classification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 9 2010-10-01 2010-10-01 false Derivative classification. 503.55 Section 503.55... Security Program § 503.55 Derivative classification. (a) In accordance with Part 2 of Executive Order 12958... derivative classification. (1) Derivative classification includes the classification of information based...

  18. 45 CFR 601.5 - Derivative classification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 3 2014-10-01 2014-10-01 false Derivative classification. 601.5 Section 601.5... CLASSIFICATION AND DECLASSIFICATION OF NATIONAL SECURITY INFORMATION § 601.5 Derivative classification. Distinct... classification guide, need not possess original classification authority. (a) If a person who applies...

  19. 5 CFR 1312.7 - Derivative classification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 3 2012-01-01 2012-01-01 false Derivative classification. 1312.7 Section... Declassification of National Security Information § 1312.7 Derivative classification. A derivative classification... another agency or classification authority. The application of derivative classification markings is...

  20. 6 CFR 7.26 - Derivative classification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 6 Domestic Security 1 2010-01-01 2010-01-01 false Derivative classification. 7.26 Section 7.26... INFORMATION Classified Information § 7.26 Derivative classification. (a) Derivative classification is defined... classification guides. (c) Persons who apply derivative classification markings shall observe...

  1. 45 CFR 601.5 - Derivative classification.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 3 2013-10-01 2013-10-01 false Derivative classification. 601.5 Section 601.5... CLASSIFICATION AND DECLASSIFICATION OF NATIONAL SECURITY INFORMATION § 601.5 Derivative classification. Distinct... classification guide, need not possess original classification authority. (a) If a person who applies...

  2. 6 CFR 7.26 - Derivative classification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 6 Domestic Security 1 2014-01-01 2014-01-01 false Derivative classification. 7.26 Section 7.26... INFORMATION Classified Information § 7.26 Derivative classification. (a) Derivative classification is defined... classification guides. (c) Persons who apply derivative classification markings shall observe...

  3. 46 CFR 503.55 - Derivative classification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 9 2011-10-01 2011-10-01 false Derivative classification. 503.55 Section 503.55... Security Program § 503.55 Derivative classification. (a) In accordance with Part 2 of Executive Order 13526... derivative classification. (1) Derivative classification includes the classification of information based...

  4. 6 CFR 7.26 - Derivative classification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 6 Domestic Security 1 2011-01-01 2011-01-01 false Derivative classification. 7.26 Section 7.26... INFORMATION Classified Information § 7.26 Derivative classification. (a) Derivative classification is defined... classification guides. (c) Persons who apply derivative classification markings shall observe...

  5. 6 CFR 7.26 - Derivative classification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 6 Domestic Security 1 2013-01-01 2013-01-01 false Derivative classification. 7.26 Section 7.26... INFORMATION Classified Information § 7.26 Derivative classification. (a) Derivative classification is defined... classification guides. (c) Persons who apply derivative classification markings shall observe...

  6. 46 CFR 503.55 - Derivative classification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 9 2012-10-01 2012-10-01 false Derivative classification. 503.55 Section 503.55... Security Program § 503.55 Derivative classification. (a) In accordance with Part 2 of Executive Order 13526... derivative classification. (1) Derivative classification includes the classification of information based...

  7. 5 CFR 1312.7 - Derivative classification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 3 2011-01-01 2011-01-01 false Derivative classification. 1312.7 Section... Declassification of National Security Information § 1312.7 Derivative classification. A derivative classification... another agency or classification authority. The application of derivative classification markings is...

  8. 46 CFR 503.55 - Derivative classification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 9 2014-10-01 2014-10-01 false Derivative classification. 503.55 Section 503.55... Security Program § 503.55 Derivative classification. (a) In accordance with Part 2 of Executive Order 13526... derivative classification. (1) Derivative classification includes the classification of information based...

  9. 46 CFR 503.55 - Derivative classification.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 9 2013-10-01 2013-10-01 false Derivative classification. 503.55 Section 503.55... Security Program § 503.55 Derivative classification. (a) In accordance with Part 2 of Executive Order 13526... derivative classification. (1) Derivative classification includes the classification of information based...

  10. Text mining for improved exposure assessment

    PubMed Central

    Baker, Simon; Silins, Ilona; Guo, Yufan; Stenius, Ulla; Korhonen, Anna; Berglund, Marika

    2017-01-01

    Chemical exposure assessments are based on information collected via different methods, such as biomonitoring, personal monitoring, environmental monitoring and questionnaires. The vast amount of chemical-specific exposure information available from web-based databases, such as PubMed, is undoubtedly a great asset to the scientific community. However, manual retrieval of relevant published information is an extremely time consuming task and overviewing the data is nearly impossible. Here, we present the development of an automatic classifier for chemical exposure information. First, nearly 3700 abstracts were manually annotated by an expert in exposure sciences according to a taxonomy exclusively created for exposure information. Natural Language Processing (NLP) techniques were used to extract semantic and syntactic features relevant to chemical exposure text. Using these features, we trained a supervised machine learning algorithm to automatically classify PubMed abstracts according to the exposure taxonomy. The resulting classifier demonstrates good performance in the intrinsic evaluation. We also show that the classifier improves information retrieval of chemical exposure data compared to keyword-based PubMed searches. Case studies demonstrate that the classifier can be used to assist researchers by facilitating information retrieval and classification, enabling data gap recognition and overviewing available scientific literature using chemical-specific publication profiles. Finally, we identify challenges to be addressed in future development of the system. PMID:28257498

  11. Text mining for improved exposure assessment.

    PubMed

    Larsson, Kristin; Baker, Simon; Silins, Ilona; Guo, Yufan; Stenius, Ulla; Korhonen, Anna; Berglund, Marika

    2017-01-01

    Chemical exposure assessments are based on information collected via different methods, such as biomonitoring, personal monitoring, environmental monitoring and questionnaires. The vast amount of chemical-specific exposure information available from web-based databases, such as PubMed, is undoubtedly a great asset to the scientific community. However, manual retrieval of relevant published information is an extremely time consuming task and overviewing the data is nearly impossible. Here, we present the development of an automatic classifier for chemical exposure information. First, nearly 3700 abstracts were manually annotated by an expert in exposure sciences according to a taxonomy exclusively created for exposure information. Natural Language Processing (NLP) techniques were used to extract semantic and syntactic features relevant to chemical exposure text. Using these features, we trained a supervised machine learning algorithm to automatically classify PubMed abstracts according to the exposure taxonomy. The resulting classifier demonstrates good performance in the intrinsic evaluation. We also show that the classifier improves information retrieval of chemical exposure data compared to keyword-based PubMed searches. Case studies demonstrate that the classifier can be used to assist researchers by facilitating information retrieval and classification, enabling data gap recognition and overviewing available scientific literature using chemical-specific publication profiles. Finally, we identify challenges to be addressed in future development of the system.
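
    A minimal sketch of such a supervised abstract classifier, with TF-IDF features standing in for the paper's richer semantic and syntactic NLP features; the training snippets and taxonomy labels are fabricated placeholders.

```python
# Sketch of a supervised classifier for exposure abstracts; TF-IDF features
# stand in for the paper's NLP features, and the tiny corpus and taxonomy
# labels are fabricated placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

abstracts = [
    "urinary cadmium measured by biomonitoring in smelter workers",
    "questionnaire on household pesticide use and dietary intake",
    "personal air monitoring of benzene during refueling",
    "food frequency questionnaire for dietary mercury exposure",
]
labels = ["biomonitoring", "questionnaire", "personal monitoring",
          "questionnaire"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(abstracts, labels)
print(clf.predict(["blood lead biomonitoring in children"]))
```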

  12. Metabolism of pesticides after dermal exposure to amphibians

    EPA Science Inventory

    Understanding how pesticide exposure to non-target species influences toxicity is necessary to accurately assess the ecological risks these compounds pose. Aquatic, terrestrial, and arboreal amphibians are often exposed to pesticides during their agricultural application resultin...

  13. Classification-based reasoning

    NASA Technical Reports Server (NTRS)

    Gomez, Fernando; Segami, Carlos

    1991-01-01

    A representation formalism for N-ary relations, quantification, and definition of concepts is described. Three types of conditions are associated with the concepts: (1) necessary and sufficient properties, (2) contingent properties, and (3) necessary properties. Also explained is how complex chains of inferences can be accomplished by representing existentially quantified sentences, and concepts denoted by restrictive relative clauses as classification hierarchies. The representation structures that make possible the inferences are explained first, followed by the reasoning algorithms that draw the inferences from the knowledge structures. All the ideas explained have been implemented and are part of the information retrieval component of a program called Snowy. An appendix contains a brief session with the program.

  14. Classification of extraterrestrial civilizations

    NASA Astrophysics Data System (ADS)

    Tang, Tong B.; Chang, Grace

    1991-06-01

    A scheme for classifying extraterrestrial intelligence (ETI) communities based on the scope of energy accessible to the civilization in question is proposed as an alternative to the Kardashev (1964) scheme, which comprises three types of civilization determined by their levels of energy expenditure. The proposed scheme includes six classes: (1) a civilization that runs essentially on energy exerted by individual beings or by domesticated lower life forms; (2) harnessing of natural sources on the planetary surface with artificial constructions, like water wheels and wind sails; (3) energy from fossils and fissionable isotopes, mined beneath the planet surface; (4) exploitation of nuclear fusion on a large scale, whether on the planet, in space, or from primary solar energy; (5) extensive use of antimatter for energy storage; and (6) energy from spacetime, perhaps via the action of naked singularities.

  15. Subspace ensembles for classification

    NASA Astrophysics Data System (ADS)

    Sun, Shiliang; Zhang, Changshui

    2007-11-01

    Ensemble learning constitutes one of the principal current directions in machine learning and data mining. In this paper, we explore subspace ensembles for classification by manipulating different feature subspaces. Commencing with the nature of ensemble efficacy, we probe into the microcosmic meaning of ensemble diversity, and propose to use region partitioning and region weighting to implement effective subspace ensembles. Individual classifiers possessing eminent performance on a partitioned region, reflected by high neighborhood accuracies, are deemed to contribute largely to this region, and are assigned large weights in determining the labels of instances in this area. A robust algorithm, "Sena", that incarnates this mechanism is presented; it is insensitive to the number of nearest neighbors chosen to calculate neighborhood accuracies. The algorithm exhibits improved performance over the well-known ensembles of bagging, AdaBoost and random subspace. Its effectiveness with varying base classifiers is also investigated.
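
    For comparison, a plain random-subspace ensemble is easy to assemble; note that this generic sketch implements only feature subsampling, not Sena's region partitioning and neighborhood-accuracy weighting.

```python
# Generic random-subspace ensemble (feature subsampling without bootstrap
# resampling of instances); a baseline, not the paper's Sena algorithm.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=400, n_features=40, n_informative=10,
                           random_state=0)
# bootstrap=False plus max_features < 1.0 yields the random subspace method
ens = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                        bootstrap=False, max_features=0.5, random_state=0)
print(cross_val_score(ens, X, y, cv=5).mean())
```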

  16. Bayesian classification theory

    NASA Technical Reports Server (NTRS)

    Hanson, Robin; Stutz, John; Cheeseman, Peter

    1991-01-01

    The task of inferring a set of classes and class descriptions most likely to explain a given data set can be placed on a firm theoretical foundation using Bayesian statistics. Within this framework and using various mathematical and algorithmic approximations, the AutoClass system searches for the most probable classifications, automatically choosing the number of classes and complexity of class descriptions. A simpler version of AutoClass has been applied to many large real data sets, has discovered new independently-verified phenomena, and has been released as a robust software package. Recent extensions allow attributes to be selectively correlated within particular classes, and allow classes to inherit or share model parameters through a class hierarchy. We summarize the mathematical foundations of AutoClass.
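
    AutoClass searches over full Bayesian posteriors; a common lightweight stand-in for "automatically choosing the number of classes" is a Gaussian mixture scored by an information criterion, sketched here with BIC.

```python
# Choose the number of classes for a Gaussian mixture by minimizing BIC;
# a lightweight stand-in for AutoClass's Bayesian model search, not a
# reimplementation of it.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)),
               rng.normal(5, 1, (100, 2)),
               rng.normal((0, 6), 1, (100, 2))])   # 3 true clusters

bics = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
        for k in range(1, 8)}
best_k = min(bics, key=bics.get)
print("chosen number of classes:", best_k)   # expect 3
```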

  17. [Classification of lymphedema].

    PubMed

    Lazareth, I

    2002-06-01

    The classification of lymphedema is debated because authors disagree about the underlying pathophysiology. Kinmonth divided all cases into primary and secondary lymphedema. Three types of primary lymphedema have been recognized: congenita, praecox and tarda. Secondary lymphedema develops as a consequence of disruption or obstruction of the lymphatic pathways. Iatrogenic lymphedema is caused by surgery and/or radiation therapy. Post-infectious lymphedema is mainly caused by filariasis in tropical areas, and by cellulitis in Western countries. Neoplastic diseases (breast and prostatic cancer, Kaposi's sarcoma) are a major cause of secondary lymphedema. Less frequent etiologies are rheumatoid lymphedema, pathomimic lymphedema and pretibial myxoedema. Reduced lymphatic drainage is associated with severe chronic venous insufficiency and contributes to leg swelling and the risk of infection.

  18. Robotic Rock Classification

    NASA Technical Reports Server (NTRS)

    Hebert, Martial

    1999-01-01

    This report describes a three-month research program undertaken jointly by the Robotics Institute at Carnegie Mellon University and Ames Research Center as part of Ames' Joint Research Initiative (JRI). The work was conducted at the Ames Research Center by Mr. Liam Pedersen, a graduate student in the CMU Ph.D. program in Robotics, under the supervision of Dr. Ted Roush at the Space Science Division of the Ames Research Center from May 15, 1999 to August 15, 1999. Dr. Martial Hebert is Mr. Pedersen's research adviser at CMU and is Principal Investigator of this Grant. The goal of this project is to investigate and implement methods suitable for a robotic rover to autonomously identify rocks and minerals in its vicinity, and to statistically characterize the local geological environment. Although the primary sensors for these tasks are a reflection spectrometer and color camera, the goal is to create a framework under which data from multiple sensors, and multiple readings on the same object, can be combined in a principled manner. Furthermore, it is envisioned that knowledge of the local area, either a priori or gathered by the robot, will be used to improve classification accuracy. The key results obtained during this project are: the continuation of the development of a rock classifier; the development of theoretical statistical methods; the development of methods for evaluating and selecting sensors; and experimentation with data mining techniques on the Ames spectral library. The results of this work are being applied at CMU, in particular in the context of the Winter 99 Antarctica expedition, in which the classification techniques will be used on the Nomad robot. Conversely, the software developed based on those techniques will continue to be made available to NASA Ames, and the data collected from the Nomad experiments will also be made available.

  19. Classification of Rainbows

    NASA Astrophysics Data System (ADS)

    Ricard, J. L.; Peter, A. L.; Barckicke, J.

    2015-12-01

    Rainbows are the most beautiful and most spectacular optical atmospheric phenomenon. Humphreys (1964) pointedly noted that "the 'explanations' generally given of the rainbow [in textbooks] may well be said to explain beautifully that which does not occur, and to leave unexplained that which does" ... "The records of close observations of rainbows soon show that not even the colors are always the same". Textbooks stress that the main factor affecting the aspect of the rainbow is the radius of the water droplets. In his well-known textbook "The Nature of Light and Colour in the Open Air", Minnaert (1954) gives the chief features of the rainbow depending on the diameter of the drops producing it. For this study, we have gathered hundreds of pictures of primary bows and sorted them into classes, defined in such a way that rainbows belonging to the same class look similar. Our results are surprising and do not confirm Minnaert's classification. In practice, the size of the water droplets is only a minor factor controlling the overall aspect of the rainbow. The main factor appears to be the height of the sun above the horizon. At sunset, the width of the red band increases, while the widths of the other colour bands decrease. The orange, violet, blue and green bands disappear completely, in this order. At the end, the primary bow is mainly red and slightly yellow. [Figure: contrast-enhanced photograph of a primary bow, prepared by Andrew Dunn.]

  20. Land-use classification using ASTER data and self-organized neutral networks

    NASA Astrophysics Data System (ADS)

    Jianwen, Ma; Bagan, Hasi

    2005-11-01

    Operationally, AVHRR and TM/ETM+ data have been used with supervised maximum likelihood classification (MLH) to depict land use changes in Beijing, providing basic maps for planning and development. With the rapid growth of the city, it is helpful to work with higher resolution data, while new classification algorithms produce more accurate land use maps. In this paper, data from the newer ASTER sensor and the Kohonen self-organized neural network feature map (KSOM) were tested. The KSOM classified 7% more accurately than the maximum likelihood algorithm in general, and 50% more accurately for the classes 'residential area' and 'roads'. The results suggest that ASTER data and Kohonen self-organized neural network classification can be used as alternative data and method in an operational land use update system.

  1. Does a pneumotach accurately characterize voice function?

    NASA Astrophysics Data System (ADS)

    Walters, Gage; Krane, Michael

    2016-11-01

    A study is presented which addresses how a pneumotach might adversely affect clinical measurements of voice function. A pneumotach is a device, typically a mask worn over the mouth, used to measure time-varying glottal volume flow. By measuring the time-varying difference in pressure across a known aerodynamic resistance element in the mask, the glottal volume flow waveform is estimated. Because it adds aerodynamic resistance to the vocal system, there is some concern that using a pneumotach may not accurately portray the behavior of the voice. To test this hypothesis, experiments were performed in a simplified airway model with the principal dimensions of an adult human upper airway. A compliant constriction, fabricated from silicone rubber, modeled the vocal folds. Variations of transglottal pressure, time-averaged volume flow, model vocal fold vibration amplitude, and radiated sound with subglottal pressure were measured, with and without the pneumotach in place, and differences noted. The authors acknowledge the support of NIH Grant 2R01DC005642-10A1.

  2. Accurate thermoplasmonic simulation of metallic nanoparticles

    NASA Astrophysics Data System (ADS)

    Yu, Da-Miao; Liu, Yan-Nan; Tian, Fa-Lin; Pan, Xiao-Min; Sheng, Xin-Qing

    2017-01-01

    Thermoplasmonics leads to enhanced heat generation due to the localized surface plasmon resonances. The measurement of heat generation is fundamentally a complicated task, which necessitates the development of theoretical simulation techniques. In this paper, an efficient and accurate numerical scheme is proposed for applications with complex metallic nanostructures. Light absorption and temperature increase are, respectively, obtained by solving the volume integral equation (VIE) and the steady-state heat diffusion equation through the method of moments (MoM). Previously, methods based on surface integral equations (SIEs) were utilized to obtain light absorption. However, computing light absorption from the equivalent current is as expensive as O(NsNv), where Ns and Nv, respectively, denote the number of surface and volumetric unknowns. Our approach reduces the cost to O(Nv) by using VIE. The accuracy, efficiency and capability of the proposed scheme are validated by multiple simulations. The simulations show that our proposed method is more efficient than the approach based on SIEs under comparable accuracy, especially when many incident fields are of interest. The simulations also indicate that the temperature profile can be tuned by several factors, such as the geometry configuration of the array, the beam direction, and the light wavelength.

  3. Accurate method for computing correlated color temperature.

    PubMed

    Li, Changjun; Cui, Guihua; Melgosa, Manuel; Ruan, Xiukai; Zhang, Yaoju; Ma, Long; Xiao, Kaida; Luo, M Ronnier

    2016-06-27

    For the correlated color temperature (CCT) of a light source to be estimated, a nonlinear optimization problem must be solved. In all previous methods available to compute CCT, the objective function has only been approximated, and their predictions have achieved limited accuracy. For example, different unacceptable CCT values have been predicted for light sources located on the same isotemperature line. In this paper, we propose to compute CCT using the Newton method, which requires the first and second derivatives of the objective function. Following the current recommendation by the International Commission on Illumination (CIE) for the computation of tristimulus values (summations at 1 nm steps from 360 nm to 830 nm), the objective function and its first and second derivatives are explicitly given and used in our computations. Comprehensive tests demonstrate that the proposed method, together with an initial estimation of CCT using Robertson's method [J. Opt. Soc. Am. 58, 1528-1535 (1968)], gives highly accurate predictions, with errors below 0.0012 K, for light sources with CCTs ranging from 500 K to 10^6 K.
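
    The Newton update the paper applies to the CCT objective is T ← T − f′(T)/f″(T). The sketch below uses numerical derivatives and a toy quadratic objective in place of the real CIE color-difference computation, which the paper derives explicitly.

```python
# Generic Newton minimization of a 1-D objective f(T) using central-difference
# derivatives; the quadratic objective is a placeholder for the paper's
# CIE color-difference function along the Planckian locus.
def newton_minimize(f, T0, h=1e-3, tol=1e-9, max_iter=50):
    T = T0
    for _ in range(max_iter):
        d1 = (f(T + h) - f(T - h)) / (2 * h)           # first derivative
        d2 = (f(T + h) - 2 * f(T) + f(T - h)) / h**2   # second derivative
        step = d1 / d2
        T -= step
        if abs(step) < tol:
            break
    return T

f = lambda T: (T - 6504.0) ** 2 + 1.0    # toy objective, minimum at 6504 K
print(newton_minimize(f, T0=5000.0))     # quadratic: converges in a few steps
```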

  4. Accurate Theoretical Thermochemistry for Fluoroethyl Radicals.

    PubMed

    Ganyecz, Ádám; Kállay, Mihály; Csontos, József

    2017-02-09

    An accurate coupled-cluster (CC) based model chemistry was applied to calculate reliable thermochemical quantities for hydrofluorocarbon derivatives including the radicals 1-fluoroethyl (CH3-CHF), 1,1-difluoroethyl (CH3-CF2), 2-fluoroethyl (CH2F-CH2), 1,2-difluoroethyl (CH2F-CHF), 2,2-difluoroethyl (CHF2-CH2), 2,2,2-trifluoroethyl (CF3-CH2), 1,2,2,2-tetrafluoroethyl (CF3-CHF), and pentafluoroethyl (CF3-CF2). The model chemistry used contains iterative triple and perturbative quadruple excitations in CC theory, as well as scalar relativistic and diagonal Born-Oppenheimer corrections. To obtain heats of formation with better than chemical accuracy, perturbative quadruple excitations and scalar relativistic corrections were indispensable. Their contributions to the heats of formation steadily increase with the number of fluorine atoms in the radical, reaching 10 kJ/mol for CF3-CF2. When discrepancies were found between the experimental values and ours, it was always possible to resolve the issue by recalculating the experimental result with currently recommended auxiliary data. For each radical studied here, this study delivers the best available heat of formation and entropy data.

  5. Accurate, reliable prototype earth horizon sensor head

    NASA Technical Reports Server (NTRS)

    Schwarz, F.; Cohen, H.

    1973-01-01

    The design and performance of an accurate and reliable prototype earth sensor head (ARPESH) are described. The ARPESH employs a detection logic 'locator' concept and horizon sensor mechanization which should lead to high-accuracy horizon sensing that is minimally degraded by spatial or temporal variations in sensing attitude from a satellite in orbit around the earth at altitudes near 500 km. An accuracy of horizon location to within 0.7 km has been predicted, independent of meteorological conditions; this corresponds to an error of 0.015 deg at 500 km altitude. Laboratory evaluation of the sensor indicates that this accuracy is achieved. First, the basic operating principles of ARPESH are described; next, detailed design and construction data are presented; and finally the performance of the sensor is reported under laboratory conditions in which the sensor is installed in a simulator that permits it to scan over a blackbody source against a background representing the earth-space interface for various equivalent planet temperatures.

  6. Accurate methods for large molecular systems.

    PubMed

    Gordon, Mark S; Mullin, Jonathan M; Pruitt, Spencer R; Roskop, Luke B; Slipchenko, Lyudmila V; Boatz, Jerry A

    2009-07-23

    Three exciting new methods that address the accurate prediction of processes and properties of large molecular systems are discussed. The systematic fragmentation method (SFM) and the fragment molecular orbital (FMO) method both decompose a large molecular system (e.g., protein, liquid, zeolite) into small subunits (fragments) in very different ways that are designed to both retain the high accuracy of the chosen quantum mechanical level of theory while greatly reducing the demands on computational time and resources. Each of these methods is inherently scalable and is therefore eminently capable of taking advantage of massively parallel computer hardware while retaining the accuracy of the corresponding electronic structure method from which it is derived. The effective fragment potential (EFP) method is a sophisticated approach for the prediction of nonbonded and intermolecular interactions. Therefore, the EFP method provides a way to further reduce the computational effort while retaining accuracy by treating the far-field interactions in place of the full electronic structure method. The performance of the methods is demonstrated using applications to several systems, including benzene dimer, small organic species, pieces of the alpha helix, water, and ionic liquids.

  7. Accurate equilibrium structures for piperidine and cyclohexane.

    PubMed

    Demaison, Jean; Craig, Norman C; Groner, Peter; Écija, Patricia; Cocinero, Emilio J; Lesarri, Alberto; Rudolph, Heinz Dieter

    2015-03-05

    Extended and improved microwave (MW) measurements are reported for the isotopologues of piperidine. New ground state (GS) rotational constants are fitted to MW transitions with quartic centrifugal distortion constants taken from ab initio calculations. Predicate values for the geometric parameters of piperidine and cyclohexane are found from a high level of ab initio theory including adjustments for basis set dependence and for correlation of the core electrons. Equilibrium rotational constants are obtained from GS rotational constants corrected for vibration-rotation interactions and electronic contributions. Equilibrium structures for piperidine and cyclohexane are fitted by the mixed estimation method. In this method, structural parameters are fitted concurrently to predicate parameters (with appropriate uncertainties) and moments of inertia (with uncertainties). The new structures are regarded as being accurate to 0.001 Å and 0.2°. Comparisons are made between bond parameters in equatorial piperidine and cyclohexane. Another interesting result of this study is that a structure determination is an effective way to check the accuracy of the ground state experimental rotational constants.
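
    The mixed estimation step can be written schematically, assuming the usual weighted least-squares reading of the method (an assumption, not a formula quoted from the paper): the structural parameters r_i are fitted simultaneously to the ab initio predicate values rho_i and to the equilibrium moments of inertia I_j, each datum weighted by its assigned uncertainty,

      \min_{\mathbf{r}}\;
      \sum_{i} \frac{\left(r_i - \rho_i\right)^{2}}{\sigma_{\rho_i}^{2}}
      \;+\;
      \sum_{j} \frac{\left(I_j^{\mathrm{obs}} - I_j(\mathbf{r})\right)^{2}}{\sigma_{I_j}^{2}}.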

  8. Accurate lineshape spectroscopy and the Boltzmann constant

    PubMed Central

    Truong, G.-W.; Anstie, J. D.; May, E. F.; Stace, T. M.; Luiten, A. N.

    2015-01-01

    Spectroscopy has an illustrious history delivering serendipitous discoveries and providing a stringent testbed for new physical predictions, including applications from trace materials detection, to understanding the atmospheres of stars and planets, and even constraining cosmological models. Reaching fundamental-noise limits permits optimal extraction of spectroscopic information from an absorption measurement. Here, we demonstrate a quantum-limited spectrometer that delivers high-precision measurements of the absorption lineshape. These measurements yield a very accurate value of the excited-state (6P1/2) hyperfine splitting in Cs and reveal a breakdown in the well-known Voigt spectral profile. We develop a theoretical model that accounts for this breakdown, explaining the observations to within the shot-noise limit. Our model enables us to infer the thermal velocity dispersion of the Cs vapour with an uncertainty of 35 p.p.m. within an hour. This allows us to determine a value for Boltzmann's constant with a precision of 6 p.p.m., and an uncertainty of 71 p.p.m. PMID:26465085
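
    The link from the measured lineshape to Boltzmann's constant can be sketched under the standard assumptions of Doppler-broadening thermometry (a textbook relation, not quoted from the paper): the 1/e half-width of the Gaussian Doppler profile fixes k_B once the transition frequency nu_0, the Cs atomic mass m, and the vapour temperature T are known,

      \frac{\Delta\nu_D}{\nu_0} \;=\; \sqrt{\frac{2 k_B T}{m c^2}}
      \qquad\Longrightarrow\qquad
      k_B \;=\; \frac{m c^2}{2T}\left(\frac{\Delta\nu_D}{\nu_0}\right)^{2}.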

  9. Accurate upper body rehabilitation system using kinect.

    PubMed

    Sinha, Sanjana; Bhowmick, Brojeshwar; Chakravarty, Kingshuk; Sinha, Aniruddha; Das, Abhijit

    2016-08-01

    The growing importance of Kinect as a tool for clinical assessment and rehabilitation is due to its portability, low cost and markerless system for human motion capture. However, the accuracy of Kinect in measuring three-dimensional body joint center locations often fails to meet clinical standards of accuracy when compared to marker-based motion capture systems such as Vicon. The length of the body segment connecting any two joints, measured as the distance between three-dimensional Kinect skeleton joint coordinates, has been observed to vary with time. The orientation of the line connecting adjoining Kinect skeletal coordinates has also been seen to differ from the actual orientation of the physical body segment. Hence we have proposed an optimization method that utilizes Kinect Depth and RGB information to search for the joint center location that satisfies constraints on body segment length as well as orientation. An experimental study has been carried out on ten healthy participants performing upper body range of motion exercises. The results report a 72% reduction in body segment length variance and a 2° improvement in Range of Motion (ROM) angle, enabling more accurate measurements for upper limb exercises.
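
    A minimal sketch of the bone-length constraint described above (not the authors' full Depth/RGB optimization): given a parent joint and a noisy child-joint estimate, the child is projected back onto a sphere of the known segment length, removing the frame-to-frame variation in limb length. All names and values are illustrative.

      import numpy as np

      def enforce_segment_length(parent, child_noisy, length):
          """Project a noisy child-joint estimate so that the
          parent-to-child segment has the known, fixed length."""
          parent = np.asarray(parent, float)
          d = np.asarray(child_noisy, float) - parent
          norm = np.linalg.norm(d)
          if norm == 0.0:
              return parent              # degenerate case: leave at parent
          return parent + d * (length / norm)

      # Example: shoulder at origin, elbow estimate drifted to ~0.34 m
      # while the calibrated upper-arm length is 0.33 m.
      elbow = enforce_segment_length([0, 0, 0], [0.30, 0.15, 0.08], 0.33)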

  10. Noninvasive hemoglobin monitoring: how accurate is enough?

    PubMed

    Rice, Mark J; Gravenstein, Nikolaus; Morey, Timothy E

    2013-10-01

    Evaluating the accuracy of medical devices has traditionally been a blend of statistical analyses, at times without contextualizing the clinical application. There have been a number of recent publications on the accuracy of a continuous noninvasive hemoglobin measurement device, the Masimo Radical-7 Pulse Co-oximeter, focusing on the traditional statistical metrics of bias and precision. In this review, which contains material presented at the Innovations and Applications of Monitoring Perfusion, Oxygenation, and Ventilation (IAMPOV) Symposium at Yale University in 2012, we critically investigated these metrics as applied to the new technology, exploring what is required of a noninvasive hemoglobin monitor and whether the conventional statistics adequately answer our questions about clinical accuracy. We discuss the glucose error grid, well known in the glucose monitoring literature, and describe an analogous version for hemoglobin monitoring. This hemoglobin error grid can be used to evaluate the required clinical accuracy (±g/dL) of a hemoglobin measurement device to provide more conclusive evidence on whether to transfuse an individual patient. The important decision to transfuse a patient usually requires both an accurate hemoglobin measurement and a physiologic reason to elect transfusion. It is our opinion that the published accuracy data of the Masimo Radical-7 is not good enough to make the transfusion decision.

  11. Fast and accurate exhaled breath ammonia measurement.

    PubMed

    Solga, Steven F; Mudalel, Matthew L; Spacek, Lisa A; Risby, Terence H

    2014-06-11

    This exhaled breath ammonia method uses a fast and highly sensitive spectroscopic method known as quartz enhanced photoacoustic spectroscopy (QEPAS) that uses a quantum cascade based laser. The monitor is coupled to a sampler that measures mouth pressure and carbon dioxide. The system is temperature controlled and specifically designed to address the reactivity of this compound. The sampler provides immediate feedback to the subject and the technician on the quality of the breath effort. Together with the quick response time of the monitor, this system is capable of accurately measuring exhaled breath ammonia representative of deep lung systemic levels. Because the system is easy to use and produces real time results, it has enabled experiments to identify factors that influence measurements. For example, mouth rinse and oral pH reproducibly and significantly affect results and therefore must be controlled. Temperature and mode of breathing are other examples. As our understanding of these factors evolves, error is reduced, and clinical studies become more meaningful. This system is very reliable and individual measurements are inexpensive. The sampler is relatively inexpensive and quite portable, but the monitor is neither. This limits options for some clinical studies and provides a rationale for future innovations.

  12. Accurate Fission Data for Nuclear Safety

    NASA Astrophysics Data System (ADS)

    Solders, A.; Gorelov, D.; Jokinen, A.; Kolhinen, V. S.; Lantz, M.; Mattera, A.; Penttilä, H.; Pomp, S.; Rakopoulos, V.; Rinta-Antila, S.

    2014-05-01

    The Accurate fission data for nuclear safety (AlFONS) project aims at high precision measurements of fission yields, using the renewed IGISOL mass separator facility in combination with a new high current light ion cyclotron at the University of Jyväskylä. The 30 MeV proton beam will be used to create fast and thermal neutron spectra for the study of neutron induced fission yields. Thanks to a series of mass separating elements, culminating with the JYFLTRAP Penning trap, it is possible to achieve a mass resolving power on the order of a few hundred thousand. In this paper we present the experimental setup and the design of a neutron converter target for IGISOL. The goal is to have a flexible design. For studies of exotic nuclei far from stability, a high neutron flux (10^12 neutrons/s) at energies of 1-30 MeV is desired, while for reactor applications neutron spectra that resemble those of thermal and fast nuclear reactors are preferred. It is also desirable to be able to produce (semi-)monoenergetic neutrons for benchmarking and to study the energy dependence of fission yields. The scientific program is extensive and was planned to start in 2013 with a measurement of isomeric yield ratios of proton induced fission in uranium. This will be followed by studies of independent yields of thermal and fast neutron induced fission of various actinides.

  13. Applications of feature selection. [development of classification algorithms for LANDSAT data

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr.

    1976-01-01

    The use of satellite-acquired (LANDSAT) multispectral scanner (MSS) data to conduct an inventory of some crop of economic interest such as wheat over a large geographical area is considered in relation to the development of accurate and efficient algorithms for data classification. The dimension of the measurement space and the computational load for a classification algorithm are increased by the use of multitemporal measurements. Feature selection/combination techniques used to reduce the dimensionality of the problem are described.

  14. Exposure chamber

    DOEpatents

    Moss, Owen R.

    1980-01-01

    A chamber for exposing animals, plants, or materials to air containing gases or aerosols is so constructed that catch pans for animal excrement, for example, serve to aid the uniform distribution of air throughout the chamber instead of constituting obstacles as has been the case in prior animal exposure chambers. The chamber comprises the usual imperforate top, bottom and side walls. Within the chamber, cages and their associated pans are arranged in two columns. The pans are spaced horizontally from the walls of the chamber in all directions. Corresponding pans of the two columns are also spaced horizontally from each other. Preferably the pans of one column are also spaced vertically from corresponding pans of the other column. Air is introduced into the top of the chamber and withdrawn from the bottom. The general flow of air is therefore vertical. The effect of the horizontal pans is based on the fact that a gas flowing past the edge of a flat plate that is perpendicular to the flow forms a wave on the upstream side of the plate. Air flows downwardly between the chamber walls and the outer edges of the pan. It also flows downwardly between the inner edges of the pans of the two columns. It has been found that when the air carries aerosol particles, these particles are substantially uniformly distributed throughout the chamber.

  15. The molecular classification of hereditary endocrine diseases.

    PubMed

    Ye, Lei; Ning, Guang

    2015-12-01

    Hereditary endocrine diseases are an important group of diseases with great heterogeneity. The current classification for hereditary endocrine disease is mostly based upon anatomy, which is helpful for pathophysiological interpretation, but does not address the pathogenic variability associated with different underlying genetic causes. Identification of an endocrinopathy-associated genetic alteration provides evidence for differential diagnosis, discovery of non-classical disease, and the potential for earlier diagnosis and targeted therapy. Molecular diagnosis should be routinely applied when managing patients with suspicion of hereditary disease. To enhance the accurate diagnosis and treatment of patients with hereditary endocrine diseases, we propose categorization of endocrine diseases into three groups based upon the function of the mutant gene: cell differentiation, hormone synthesis and action, and tumorigenesis. Each category was further grouped according to the specific gene function. We believe that this format would facilitate practice of precision medicine in the field of hereditary endocrine diseases.

  16. Automatic classification of time-variable X-ray sources

    SciTech Connect

    Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M.

    2014-05-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy of the training data is ∼97% on a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source, but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.
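
    A hedged sketch of the classification-margin idea used above for anomaly screening, with scikit-learn's RandomForestClassifier as a stand-in for the authors' pipeline: the margin of a source is the gap between its two highest class probabilities, and small margins flag ambiguous or potentially anomalous sources.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      def classify_with_margins(X_train, y_train, X_unknown):
          rf = RandomForestClassifier(n_estimators=500, random_state=0)
          rf.fit(X_train, y_train)
          proba = rf.predict_proba(X_unknown)       # per-class probabilities
          top2 = np.sort(proba, axis=1)[:, -2:]     # two largest per source
          margin = top2[:, 1] - top2[:, 0]          # small margin = ambiguous
          labels = rf.classes_[np.argmax(proba, axis=1)]
          return labels, margin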

  17. Image classification via object-aware holistic superpixel selection.

    PubMed

    Wang, Zilei; Feng, Jiashi; Yan, Shuicheng; Xi, Hongsheng

    2013-11-01

    In this paper, we propose an object-aware holistic superpixel selection (HPS) method to automatically select the discriminative superpixels of an image for image classification purposes. By considering only the selected superpixels, the interference of cluttered background on the object can be alleviated effectively and thus the classification performance is significantly enhanced. In particular, for an image, HPS first selects the superpixels discriminative for a certain class, which together match the object template of this class well; these superpixels compose a class-specific matching region. By performing such superpixel selection for each of the several most probable classes, HPS generates multiple class-specific matching regions for a single image. Then, HPS merges these matching regions into an integral object region by exploiting their pixel-level intersection information. Finally, this object region, instead of the original image, is used for image classification. An appealing advantage of HPS is the ability to alleviate the interference of cluttered background without requiring the object to be segmented out accurately. We evaluate the proposed HPS on four challenging image classification benchmark datasets: Oxford-IIIT PET 37, Caltech-UCSD Birds 200, Caltech 101, and PASCAL VOC 2011. The experimental results consistently show that the proposed HPS can remarkably improve the classification performance.

  18. Measurement error in air pollution exposure assessment.

    PubMed

    Navidi, W; Lurmann, F

    1995-01-01

    The exposure of an individual to an air pollutant can be assessed indirectly, with a "microenvironmental" approach, or directly with a personal sampler. Both methods of assessment are subject to measurement error, which can cause considerable bias in estimates of health effects. If the exposure estimates are unbiased and the measurement error is nondifferential, the bias in a linear model can be corrected when the variance of the measurement error is known. Unless the measurement error is quite large, estimates of health effects based on individual exposures appear to be more accurate than those based on ambient levels.
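
    The bias correction mentioned above can be made concrete with the classical additive measurement-error model (a textbook sketch, not taken from the paper): for observed exposure X = Z + U, with nondifferential error U independent of the true exposure Z, the naive regression slope is attenuated by the reliability ratio lambda and can be corrected when Var(U) is known,

      \hat{\beta}_{\mathrm{naive}} \;\xrightarrow{\;p\;}\; \lambda\,\beta,
      \qquad
      \lambda \;=\; \frac{\operatorname{Var}(Z)}{\operatorname{Var}(Z) + \operatorname{Var}(U)},
      \qquad
      \hat{\beta}_{\mathrm{corrected}} \;=\; \hat{\beta}_{\mathrm{naive}} / \hat{\lambda}.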

  19. Classification of seizures and epilepsy.

    PubMed

    Riviello, James J

    2003-07-01

    The management of seizures and epilepsy begins with forming a differential diagnosis, making the diagnosis, and then classifying seizure type and epileptic syndrome. Classification guides treatment, including ancillary testing, management, prognosis, and if needed, selection of the appropriate antiepileptic drug (AED). Many AEDs are available, and certain seizure types or epilepsy syndromes respond to specific AEDs. The identification of the genetics, molecular basis, and pathophysiologic mechanisms of epilepsy has resulted from classification of specific epileptic syndromes. The classification system used by the International League Against Epilepsy is periodically revised. The proposed revision changes the classification emphasis from the anatomic origin of seizures (focal vs generalized) to seizure semiology (ie, the signs or clinical manifestations). Modified systems have been developed for specific circumstances (eg, neonatal seizures, infantile seizures, status epilepticus, and epilepsy surgery). This article reviews seizure and epilepsy classification, emphasizing new data.

  20. Vietnamese Document Representation and Classification

    NASA Astrophysics Data System (ADS)

    Nguyen, Giang-Son; Gao, Xiaoying; Andreae, Peter

    Vietnamese is very different from English, and little research has been done on Vietnamese document classification, or indeed on any kind of Vietnamese language processing; only a few small corpora are available for research. We created a large Vietnamese text corpus with about 18,000 documents, and manually classified them based on different criteria such as topics and styles, giving several classification tasks of different difficulty levels. This paper introduces a new syllable-based document representation at the morphological level of the language for efficient classification. We tested the representation on our corpus with different classification tasks using six classification algorithms and two feature selection techniques. Our experiments show that the new representation is effective for Vietnamese categorization, and suggest that the best performance can be achieved using a syllable-pair document representation, an SVM with a polynomial kernel as the learning algorithm, and information gain and an external dictionary for feature selection.
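
    A hedged sketch of the best-performing configuration reported above, syllable-pair features with a polynomial-kernel SVM, using scikit-learn as a stand-in for the authors' tooling. Treating whitespace-separated syllables as tokens, syllable pairs are simply token bigrams; the polynomial degree is an assumption.

      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import SVC

      # Syllable pairs = bigrams over whitespace-split syllables.
      model = make_pipeline(
          CountVectorizer(analyzer="word", ngram_range=(2, 2),
                          token_pattern=r"\S+"),
          SVC(kernel="poly", degree=2),   # degree is an assumption
      )
      # model.fit(train_docs, train_labels); model.predict(test_docs)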

  1. Comparing prediction models for radiographic exposures

    NASA Astrophysics Data System (ADS)

    Ching, W.; Robinson, J.; McEntee, M. F.

    2015-03-01

    During radiographic exposures the milliampere-seconds (mAs), kilovoltage peak (kVp) and source-to-image distance can be adjusted for variations in patient thicknesses. Several exposure adjustment systems have been developed to assist with this selection. This study compares the accuracy of four systems to predict the required mAs for pelvic radiographs taken on a direct digital radiography system (DDR). Sixty radiographs were obtained by adjusting mAs to compensate for varying combinations of source-to-image distance (SID), kVp and patient thicknesses. The 25% rule, the DuPont Bit System and the DigiBit system were compared to determine which of these three most accurately predicted the mAs required for an increase in patient thickness. Similarly, the 15% rule, the DuPont Bit System and the DigiBit system were compared for an increase in kVp. The exposure index (EI) was used as an indication of exposure to the DDR. For each exposure combination the mAs was adjusted until an EI of 1500 ± 2% was achieved. The 25% rule was the most accurate at predicting the mAs required for an increase in patient thickness, with 53% of the mAs predictions correct. The DigiBit system was the most accurate at predicting mAs needed for changes in kVp, with 33% of predictions correct. This study demonstrated that the 25% rule and DigiBit system were the most accurate predictors of mAs required for an increase in patient thickness and kVp respectively. The DigiBit system worked well in both scenarios as it is a single exposure adjustment system that considers a variety of exposure factors.
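
    The two adjustment rules compared above are commonly stated as follows: the 25% rule increases mAs by 25% for every additional centimetre of tissue, and the 15% rule treats a 15% increase in kVp as equivalent to doubling the exposure, so the mAs can be halved. The sketch below encodes those common statements; the paper's exact formulations may differ.

      import math

      def mas_for_thickness(mas, extra_cm):
          """25% rule: +25% mAs per extra centimetre of tissue."""
          return mas * 1.25 ** extra_cm

      def mas_for_kvp(mas, kvp_old, kvp_new):
          """15% rule: each +15% step in kVp doubles exposure,
          so the mAs is halved per step (fractional steps allowed)."""
          steps = math.log(kvp_new / kvp_old) / math.log(1.15)
          return mas / (2 ** steps)

      # Example: 20 mAs and 4 cm of extra tissue -> 20 * 1.25**4 ≈ 48.8 mAs.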

  2. Comparisons of different methods for debris covered glacier classification

    NASA Astrophysics Data System (ADS)

    Tiwari, R. K.; Garg, P. K.; Saini, V.; Shukla, A.

    2016-05-01

    The paper compares different methods for mapping debris-covered glaciers. A supervised classification method, the Maximum Likelihood Classifier (MLC), was tested on different data sets for deriving the glacier area, alongside a semi-automated method, the Hierarchical Knowledge Based Classifier (HKBC). All results were assessed for accuracy, processing time and complexity, and were also tested against a manually digitized boundary. The results suggest that MLC, when used with ancillary data such as geo-morphometric parameters and a temperature image, takes slightly more time than HKBC owing to a larger amount of post-processing, but the output is satisfactory (89% overall accuracy). The time taken by the different classification approaches differs significantly, ranging from 1-2 hours for MLC to 5-10 hours for manual digitization. Depending on the classification method, some to a large amount of post-processing is always required to achieve a crisp glacial boundary. A classical classifier such as maximum likelihood is less time consuming, but its post-processing takes more time than HKBC's. Another factor important for better accuracy is prior knowledge of the glacier terrain. In the knowledge-based classification method, crisp rules must first be established and are later used during classification; without this pre-classification exercise the accuracy may decrease significantly. Establishing the rules is a time-consuming procedure (2-3 hours in this case), but only a minimal amount of post-processing is then required. When thermal and geo-morphometric data are used synergistically, the classified glacier boundaries are more crisp and accurate.

  3. Single-trial EEG RSVP classification using convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Shamwell, Jared; Lee, Hyungtae; Kwon, Heesung; Marathe, Amar R.; Lawhern, Vernon; Nothwang, William

    2016-05-01

    Traditionally, Brain-Computer Interfaces (BCI) have been explored as a means to return function to paralyzed or otherwise debilitated individuals. An emerging use for BCIs is in human-autonomy sensor fusion where physiological data from healthy subjects is combined with machine-generated information to enhance the capabilities of artificial systems. While human-autonomy fusion of physiological data and computer vision have been shown to improve classification during visual search tasks, to date these approaches have relied on separately trained classification models for each modality. We aim to improve human-autonomy classification performance by developing a single framework that builds codependent models of human electroencephalograph (EEG) and image data to generate fused target estimates. As a first step, we developed a novel convolutional neural network (CNN) architecture and applied it to EEG recordings of subjects classifying target and non-target image presentations during a rapid serial visual presentation (RSVP) image triage task. The low signal-to-noise ratio (SNR) of EEG inherently limits the accuracy of single-trial classification and when combined with the high dimensionality of EEG recordings, extremely large training sets are needed to prevent overfitting and achieve accurate classification from raw EEG data. This paper explores a new deep CNN architecture for generalized multi-class, single-trial EEG classification across subjects. We compare classification performance from the generalized CNN architecture trained across all subjects to the individualized XDAWN, HDCA, and CSP neural classifiers which are trained and tested on single subjects. Preliminary results show that our CNN meets and slightly exceeds the performance of the other classifiers despite being trained across subjects.
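
    An illustrative (not the authors') single-trial EEG CNN in PyTorch, assuming trials shaped (channels x time): a spatial convolution across electrodes followed by a temporal convolution, a common pattern for RSVP target detection. All layer sizes are assumptions.

      import torch
      import torch.nn as nn

      class EEGCNNSketch(nn.Module):
          """Minimal CNN for (batch, 1, n_channels, n_samples) EEG trials."""
          def __init__(self, n_channels=64, n_samples=256, n_classes=2):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(1, 16, (n_channels, 1)),           # spatial filter
                  nn.ReLU(),
                  nn.Conv2d(16, 16, (1, 15), stride=(1, 3)),   # temporal filter
                  nn.ReLU(),
                  nn.AdaptiveAvgPool2d((1, 8)),
              )
              self.classifier = nn.Linear(16 * 8, n_classes)   # target/non-target

          def forward(self, x):
              return self.classifier(self.features(x).flatten(1))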

  4. FPGA Coprocessor for Accelerated Classification of Images

    NASA Technical Reports Server (NTRS)

    Pingree, Paula J.; Scharenbroich, Lucas J.; Werne, Thomas A.

    2008-01-01

    An effort related to that described in the preceding article focuses on developing a spaceborne processing platform for fast and accurate onboard classification of image data, a critical part of modern satellite image processing. The approach again has been to exploit the versatility of the recently developed hybrid Virtex-4FX field-programmable gate array (FPGA) to run diverse science applications on embedded processors while taking advantage of the reconfigurable hardware resources of the FPGAs. In this case, the FPGA serves as a coprocessor that implements legacy C-language support-vector-machine (SVM) image-classification algorithms to detect and identify natural phenomena such as flooding, volcanic eruptions, and sea-ice break-up. The FPGA provides hardware acceleration, giving greater onboard processing capability than previously demonstrated in software. The original C-language program, demonstrated on an imaging instrument aboard the Earth Observing-1 (EO-1) satellite, implements a linear-kernel SVM algorithm for classifying parts of the images as snow, water, ice, land, cloud, or unclassified. Current onboard processors, such as on EO-1, have limited computing power and extremely limited active storage capability, and are no longer considered state-of-the-art. Using commercially available software that translates C-language programs into hardware description language (HDL) files, the legacy C-language program, and two newly formulated programs for a more capable expanded-linear-kernel and a more accurate polynomial-kernel SVM algorithm, have been implemented in the Virtex-4FX FPGA. In tests, the FPGA implementations have exhibited significant speedups over conventional software implementations running on general-purpose hardware.

  5. A Simple Algorithm for Population Classification

    PubMed Central

    Hu, Peng; Hsieh, Ming-Hua; Lei, Ming-Jie; Cui, Bin; Chiu, Sung-Kay; Tzeng, Chi-Meng

    2016-01-01

    A single-nucleotide polymorphism (SNP) is a variation in the DNA sequence that occurs when a single nucleotide in the genome differs across members of the same species. Variations in the DNA sequences of humans are associated with human diseases, which makes SNPs a key that may open the door to personalized medicine. SNPs can also be used for human identification and forensic applications. Compared to short tandem repeat (STR) loci, SNPs have much lower statistical testing power for individual recognition, because there are only three possible genotypes for each SNP marker, but they may provide sufficient information to identify the population to which a certain sample belongs. In this report, using eight SNP markers for 641 samples, we performed a standard statistical classification procedure and found that 86% of the samples could be classified accurately under a two-population model. This study suggests the potential use of SNPs in population classification with a small number (n ≤ 8) of genetic markers for forensic screening, biodiversity and disaster victim identification. PMID:27030001
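
    A hedged sketch of a "standard statistical classification procedure" on SNP data, assuming genotypes are coded as minor-allele counts (0/1/2) for the eight markers; linear discriminant analysis is a stand-in for whatever classifier the authors used, and the data here are synthetic placeholders.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.integers(0, 3, size=(641, 8))   # 641 samples x 8 SNP genotypes
      y = rng.integers(0, 2, size=641)        # two-population labels

      acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=10).mean()
      print(f"10-fold CV accuracy: {acc:.2f}")   # ~0.5 on random labels; the
      # paper reports 86% correct classification on real genotype data.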

  6. 45 CFR 601.2 - Classification authority.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION CLASSIFICATION AND DECLASSIFICATION OF NATIONAL SECURITY INFORMATION § 601.2 Classification authority. The... a Foundation employee develops information that appears to warrant classification because of...

  7. 45 CFR 601.5 - Derivative classification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION CLASSIFICATION AND DECLASSIFICATION OF NATIONAL SECURITY INFORMATION § 601.5 Derivative classification. Distinct from “original” classification is the determination that information is in substance the same...

  8. 45 CFR 601.5 - Derivative classification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION CLASSIFICATION AND DECLASSIFICATION OF NATIONAL SECURITY INFORMATION § 601.5 Derivative classification. Distinct from “original” classification is the determination that information is in substance the same...

  9. 45 CFR 601.2 - Classification authority.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION CLASSIFICATION AND DECLASSIFICATION OF NATIONAL SECURITY INFORMATION § 601.2 Classification authority. The... a Foundation employee develops information that appears to warrant classification because of...

  10. Land cover classification using random forest with genetic algorithm-based parameter optimization

    NASA Astrophysics Data System (ADS)

    Ming, Dongping; Zhou, Tianning; Wang, Min; Tan, Tian

    2016-07-01

    Land cover classification based on remote sensing imagery is an important means to monitor, evaluate, and manage land resources. However, it requires robust classification methods that allow accurate mapping of complex land cover categories. Random forest (RF) is a powerful machine-learning classifier that can be used in land remote sensing. However, two important parameters of RF classification, namely, the number of trees and the number of variables tried at each split, affect classification accuracy. Thus, optimal parameter selection is an unavoidable problem in RF-based image classification. This study uses the genetic algorithm (GA) to optimize the two parameters of RF to produce optimal land cover classification accuracy. HJ-1B CCD2 image data are used to classify six different land cover categories in Changping, Beijing, China. Experimental results show that GA-RF can avoid arbitrariness in the selection of parameters. The experiments also compare land cover classification results obtained with the GA-RF method, the traditional RF method (with default parameters), and the support vector machine method. Relative to the traditional RF and SVM methods, GA-RF improved classification accuracy by 1.02% and 6.64%, respectively. The comparison results show that GA-RF is a feasible solution for land cover classification without compromising accuracy or incurring excessive time.
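
    A minimal sketch of GA-tuned RF, assuming the two parameters named above (number of trees and variables tried per split) are searched with a simple generational GA. The population size, mutation-only variation, and out-of-bag accuracy as the fitness function are illustrative choices, not the paper's.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      def ga_tune_rf(X, y, gens=10, pop=12, seed=0):
          rng = np.random.default_rng(seed)
          # Genome = (n_estimators, max_features).
          genomes = np.column_stack([rng.integers(50, 501, pop),
                                     rng.integers(1, X.shape[1] + 1, pop)])
          def fitness(g):
              rf = RandomForestClassifier(n_estimators=int(g[0]),
                                          max_features=int(g[1]),
                                          oob_score=True, random_state=seed)
              return rf.fit(X, y).oob_score_    # out-of-bag accuracy
          for _ in range(gens):
              scores = np.array([fitness(g) for g in genomes])
              parents = genomes[np.argsort(scores)[-(pop // 2):]]  # selection
              children = parents.copy()          # mutation only, no crossover
              children[:, 0] = (children[:, 0]
                                + rng.integers(-50, 51, len(children))).clip(10, 1000)
              children[:, 1] = (children[:, 1]
                                + rng.integers(-2, 3, len(children))).clip(1, X.shape[1])
              genomes = np.vstack([parents, children])
          return genomes[np.argmax([fitness(g) for g in genomes])]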

  11. Perspectives on next steps in classification of oro-facial pain - part 1: role of ontology.

    PubMed

    Ceusters, W; Michelotti, A; Raphael, K G; Durham, J; Ohrbach, R

    2015-12-01

    The purpose of this study was to review existing principles of oro-facial pain classifications and to specify design recommendations for a new system that would reflect recent insights in biomedical classification systems, terminologies and ontologies. The study was initiated by a symposium organised by the International RDC/TMD Consortium Network in March 2013, to which the present authors contributed. The following areas are addressed: problems with current classification approaches, status of the ontological basis of pain disorders, insufficient diagnostic aids and biomarkers for pain disorders, exploratory nature of current pain terminology and classification systems, and problems with prevailing classification methods from an ontological perspective. Four recommendations for addressing these problems are as follows: (i) develop a hypothesis-driven classification structure built on principles that ensure to our best understanding an accurate description of the relations among all entities involved in oro-facial pain disorders; (ii) take into account the physiology and phenomenology of oro-facial pain disorders to adequately represent both domains including psychosocial entities in a classification system; (iii) plan at the beginning for field-testing at strategic development stages; and (iv) consider how the classification system will be implemented. Implications in relation to the specific domains of psychosocial factors and biomarkers for inclusion into an oro-facial pain classification system are described in two separate papers.

  12. Perspectives on Next Steps in Classification of Orofacial Pain – Part 1: Role of Ontology

    PubMed Central

    Ceusters, Werner; Michelotti, Ambra; Raphael, Karen G.; Durham, Justin; Ohrbach, Richard

    2016-01-01

    The purpose of this paper is to review existing principles of orofacial pain classifications and to specify design recommendations for a new system that would reflect recent insights in biomedical classification systems, terminologies and ontologies. The paper was initiated by a symposium organized by the International RDC/TMD Consortium Network in March 2013, to which the present authors contributed. The following areas are addressed: problems with current classification approaches, status of the ontological basis of pain disorders, insufficient diagnostic aids and biomarkers for pain disorders, exploratory nature of current pain terminology and classification systems, and problems with prevailing classification methods from an ontological perspective. Four recommendations for addressing these problems are: 1) develop a hypothesis-driven classification structure built on principles that ensure to our best understanding an accurate description of the relations among all entities involved in orofacial pain disorders; 2) take into account the physiology and phenomenology of orofacial pain disorders in order to adequately represent both domains including psychosocial entities in a classification system; 3) plan at the beginning for field-testing at strategic development stages; and 4) consider how the classification system will be implemented. Implications in relation to the specific domains of psychosocial factors and biomarkers for inclusion into an orofacial pain classification system are described in two separate papers. PMID:26212927

  13. Accurate orbit propagation with planetary close encounters

    NASA Astrophysics Data System (ADS)

    Baù, Giulio; Milani Comparetti, Andrea; Guerra, Francesca

    2015-08-01

    We tackle the problem of accurately propagating the motion of those small bodies that undergo close approaches with a planet. The literature is lacking on this topic and the reliability of the numerical results is not sufficiently discussed. The high-frequency components of the perturbation generated by a close encounter make the propagation particularly challenging, both from the point of view of the dynamical stability of the formulation and of the numerical stability of the integrator. In our approach a fixed step-size and order multistep integrator is combined with a regularized formulation of the perturbed two-body problem. When the propagated object enters the region of influence of a celestial body, the latter becomes the new primary body of attraction. Moreover, the formulation and the step-size will also be changed if necessary. We present: 1) the restarter procedure applied to the multistep integrator whenever the primary body is changed; 2) new analytical formulae for setting the step-size (given the order of the multistep, formulation and initial osculating orbit) in order to control the accumulation of the local truncation error and guarantee the numerical stability during the propagation; 3) a new definition of the region of influence in the phase space. We test the propagator with some real asteroids subject to the gravitational attraction of the planets, the Yarkovsky and relativistic perturbations. Our goal is to show that the proposed approach improves on the performance of both the propagator implemented in the OrbFit software package (which is currently used by the NEODyS service) and the propagator represented by a variable step-size and order multistep method combined with Cowell's formulation (i.e. direct integration of position and velocity in either the physical or a fictitious time).

  14. Important Nearby Galaxies without Accurate Distances

    NASA Astrophysics Data System (ADS)

    McQuinn, Kristen

    2014-10-01

    The Spitzer Infrared Nearby Galaxies Survey (SINGS) and its offspring programs (e.g., THINGS, HERACLES, KINGFISH) have resulted in a fundamental change in our view of star formation and the ISM in galaxies, and together they represent the most complete multi-wavelength data set yet assembled for a large sample of nearby galaxies. These great investments of observing time have been dedicated to the goal of understanding the interstellar medium, the star formation process, and, more generally, galactic evolution at the present epoch. Nearby galaxies provide the basis upon which we interpret the distant universe, and the SINGS sample represents the best studied nearby galaxies. Accurate distances are fundamental to interpreting observations of galaxies. Surprisingly, many of the SINGS spiral galaxies have numerous distance estimates resulting in confusion. We can rectify this situation for 8 of the SINGS spiral galaxies within 10 Mpc at a very low cost through measurements of the tip of the red giant branch. The proposed observations will provide an accuracy of better than 0.1 in distance modulus. Our sample includes such well-known galaxies as M51 (the Whirlpool), M63 (the Sunflower), M104 (the Sombrero), and M74 (the archetypal grand design spiral). We are also proposing coordinated parallel WFC3 UV observations of the central regions of the galaxies, rich with high-mass UV-bright stars. As a secondary science goal we will compare the resolved UV stellar populations with integrated UV emission measurements used in calibrating star formation rates. Our observations will complement the growing HST UV atlas of high resolution images of nearby galaxies.

  15. Accurate paleointensities - the multi-method approach

    NASA Astrophysics Data System (ADS)

    de Groot, Lennart

    2016-04-01

    The accuracy of models describing rapid changes in the geomagnetic field over the past millennia critically depends on the availability of reliable paleointensity estimates. Over the past decade, methods to derive paleointensities from lavas (the only recorder of the geomagnetic field that is available all over the globe and through geologic times) have seen significant improvements, and various alternative techniques were proposed. The 'classical' Thellier-style approach was optimized and selection criteria were defined in the 'Standard Paleointensity Definitions' (Paterson et al, 2014). The Multispecimen approach was validated, and the importance of additional tests and criteria to assess Multispecimen results must be emphasized. Recently, a non-heating, relative paleointensity technique was proposed, the pseudo-Thellier protocol, which shows great potential in both accuracy and efficiency but currently lacks a solid theoretical underpinning. Here I present work using all three of the aforementioned paleointensity methods on suites of young lavas taken from the volcanic islands of Hawaii, La Palma, Gran Canaria, Tenerife, and Terceira. Many of the sampled cooling units are <100 years old; the actual field strength at the time of cooling is therefore reasonably well known. Rather intuitively, flows that produce coherent results from two or more different paleointensity methods yield the most accurate estimates of the paleofield. Furthermore, the results for some flows pass the selection criteria for one method but fail in other techniques. Scrutinizing and combining all acceptable results yielded reliable paleointensity estimates for 60-70% of all sampled cooling units, an exceptionally high success rate. This 'multi-method paleointensity approach' therefore has high potential to provide the much-needed paleointensities to improve geomagnetic field models for the Holocene.

  16. Accurate glucose detection in a small etalon

    NASA Astrophysics Data System (ADS)

    Martini, Joerg; Kuebler, Sebastian; Recht, Michael; Torres, Francisco; Roe, Jeffrey; Kiesel, Peter; Bruce, Richard

    2010-02-01

    We are developing a continuous glucose monitor for subcutaneous long-term implantation. This detector contains a double-chamber Fabry-Perot etalon that measures the differential refractive index (RI) between a reference and a measurement chamber at 850 nm. The etalon chambers have wavelength-dependent transmission maxima which depend linearly on the RI of their contents. An RI difference of Δn = 1.5×10⁻⁶ changes the spectral position of a transmission maximum by 1 pm in our measurement. By sweeping the wavelength of a single-mode Vertical-Cavity Surface-Emitting Laser (VCSEL) linearly in time and detecting the maximum transmission peaks of the etalon, we are able to measure the RI of a liquid. We have demonstrated an accuracy of Δn = ±3.5×10⁻⁶ over a Δn range of 0 to 1.75×10⁻⁴, and an accuracy of 2% over a Δn range of 1.75×10⁻⁴ to 9.8×10⁻⁴. The accuracy is primarily limited by the reference measurement. The RI difference between the etalon chambers is made specific to glucose by the competitive, reversible release of Concanavalin A (ConA) from an immobilized dextran matrix. The matrix, and the ConA bound to it, is positioned outside the optical detection path. ConA is released from the matrix by reacting with glucose and diffuses into the optical path to change the RI in the etalon. Factors such as temperature affect the RI in the measurement and detection chambers equally and so do not affect the differential measurement. A typical standard deviation in RI is ±1.4×10⁻⁶ over the range 32°C to 42°C. The detector enables an accurate glucose-specific concentration measurement.

  17. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Astrophysics Data System (ADS)

    Wheeler, K.; Knuth, K.; Castle, P.

    2005-12-01

    and IKONOS imagery and the 3-D volume estimates. The combination of these then allows for a rapid and hopefully very accurate estimation of biomass.

  18. How flatbed scanners upset accurate film dosimetry

    NASA Astrophysics Data System (ADS)

    van Battum, L. J.; Huizenga, H.; Verdaasdonk, R. M.; Heukelom, S.

    2016-01-01

    Film is an excellent dosimeter for verification of dose distributions due to its high spatial resolution. Irradiated film can be digitized with low-cost, transmission, flatbed scanners. However, a disadvantage is their lateral scan effect (LSE): a scanner readout change over its lateral scan axis. Although anisotropic light scattering was presented as the origin of the LSE, this paper presents an alternative cause. To this end, the LSE of two flatbed scanners (Epson 1680 Expression Pro and Epson 10000XL) and Gafchromic film (EBT, EBT2, EBT3) was investigated, focusing on three effects: cross talk, optical path length and polarization. Cross talk was examined using triangular sheets of various optical densities. The optical path length effect was studied using absorptive and reflective neutral density filters with well-defined optical characteristics (OD range 0.2-2.0). Linear polarizer sheets were used to investigate light polarization on the CCD signal in the absence and presence of (un)irradiated Gafchromic film. Film dose values ranged from 0.2 to 9 Gy, i.e. an optical density range of 0.25 to 1.1. Measurements were performed in the scanner's transmission mode, with red-green-blue channels. The LSE was found to depend on scanner construction and film type. Its magnitude depends on dose: for 9 Gy it increases up to 14% at the maximum lateral position. Cross talk was only significant in high contrast regions, up to 2% for very small fields. The optical path length effect introduced by film on the scanner causes a 3% effect for pixels at the extreme lateral position. Light polarization due to film and the scanner's optical mirror system is the main contributor, different in magnitude for the red, green and blue channels. We concluded that any Gafchromic EBT type film scanned with a flatbed scanner will face these optical effects. Accurate dosimetry therefore requires correction of the LSE, i.e. determination of the LSE per color channel and per dose delivered to the film.
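
    A hedged sketch of one practical correction implied by the conclusion above: characterize the scanner's lateral response per color channel with a uniformly irradiated film, fit a smooth gain curve along the lateral axis, and divide subsequent scans by it. The polynomial fit and the per-channel, per-dose calibration are assumptions, not the paper's protocol.

      import numpy as np

      def fit_lateral_gain(flat_scan, deg=4):
          """flat_scan: 2-D array (rows x lateral columns) of one color
          channel from a uniformly irradiated film; returns a per-column
          gain normalized to the scanner's central position."""
          profile = flat_scan.mean(axis=0)              # mean lateral response
          cols = np.arange(profile.size)
          smooth = np.polyval(np.polyfit(cols, profile, deg), cols)
          return smooth / smooth[profile.size // 2]

      def correct_scan(scan, gain):
          return scan / gain[np.newaxis, :]             # undo the lateral effect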

  19. Towards Accurate Application Characterization for Exascale (APEX)

    SciTech Connect

    Hammond, Simon David

    2015-09-01

    Sandia National Laboratories has been engaged in hardware and software codesign activities for a number of years; indeed, it might be argued that prototyping of clusters as far back as the CPLANT machines, and many large capability resources including ASCI Red and RedStorm, were examples of codesigned solutions. As the research supporting our codesign activities has moved closer to investigating on-node runtime behavior, a natural hunger has grown for detailed analysis of both hardware and algorithm performance from the perspective of low-level operations. The Application Characterization for Exascale (APEX) LDRD was a project conceived to address some of these concerns. The research was primarily intended to focus on generating accurate and reproducible low-level performance metrics using tools that could scale to production-class code bases. Alongside this research was an advocacy and analysis role associated with evaluating tools for production use, working with leading industry vendors to develop and refine solutions required by our code teams, and directly engaging with production code developers to form a context for the application analysis and a bridge to the research community within Sandia. On each of these accounts significant progress has been made, particularly, as this report will cover, in the low-level analysis of operations for important classes of algorithms. This report summarizes the development of a collection of tools under the APEX research program and leaves to other SAND and L2 milestone reports the description of codesign progress with Sandia's production users/developers.

  20. Automatic Classification of Marine Mammals with Speaker Classification Methods.

    PubMed

    Kreimeyer, Roman; Ludwig, Stefan

    2016-01-01

    We present an automatic acoustic classifier for marine mammals based on human speaker classification methods, as an element of a passive acoustic monitoring (PAM) tool. This work is part of the Protection of Marine Mammals (PoMM) project under the framework of the European Defense Agency (EDA), carried out jointly by the Research Department for Underwater Acoustics and Geophysics (FWG), Bundeswehr Technical Centre (WTD 71) and Kiel University. The automatic classification should support sonar operators in the risk mitigation process before and during sonar exercises with a reliable automatic classification result.
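
    A hedged sketch of the speaker-classification recipe transferred to marine mammals: MFCC features plus one Gaussian mixture model (GMM) per species, scored by average log-likelihood. The library choices (librosa, scikit-learn) and all parameter values are assumptions; this record does not describe the PoMM classifier's internals.

      import numpy as np
      import librosa
      from sklearn.mixture import GaussianMixture

      def mfcc_features(path, sr=48000, n_mfcc=20):
          y, sr = librosa.load(path, sr=sr)
          return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T  # frames x coeffs

      def train_species_models(clips_by_species, n_components=8):
          return {species: GaussianMixture(n_components).fit(
                      np.vstack([mfcc_features(p) for p in paths]))
                  for species, paths in clips_by_species.items()}

      def classify_clip(path, models):
          feats = mfcc_features(path)
          return max(models, key=lambda s: models[s].score(feats))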

  1. Improving Human-Machine Cooperative Classification Via Cognitive Theories of Similarity.

    PubMed

    Roads, Brett D; Mozer, Michael C

    2016-07-22

    Acquiring perceptual expertise is slow and effortful. However, untrained novices can accurately make difficult classification decisions (e.g., skin-lesion diagnosis) by reformulating the task as similarity judgment. Given a query image and a set of reference images, individuals are asked to select the best matching reference. When references are suitably chosen, the procedure yields an implicit classification of the query image. To optimize reference selection, we develop and evaluate a predictive model of similarity-based choice. The model builds on existing psychological literature and accommodates stochastic, dynamic shifts of attention among visual feature dimensions. We perform a series of human experiments with two stimulus types (rectangles, faces) and nine classification tasks to validate the model and to demonstrate the model's potential to boost performance. Our system achieves high accuracy for participants who are naive as to the classification task, even when the classification task switches from trial to trial.
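
    One classical building block consistent with the description above (an assumption about the model family, grounded in the generalized context model and the Shepard-Luce choice rule rather than in this paper's exact equations): similarity decays exponentially with attention-weighted distance, and the probability of choosing a reference is its similarity normalized over the reference set.

      import numpy as np

      def choice_probabilities(query, references, attention, c=1.0):
          """Shepard-Luce choice over a reference set for one query item.
          query: (d,) features; references: (n, d); attention: (d,) weights."""
          dist = np.abs(references - query) @ attention   # weighted city-block
          sim = np.exp(-c * dist)                         # Shepard similarity
          return sim / sim.sum()                          # Luce choice rule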

  2. Congenital malformations of the female genital tract: the need for a new classification system.

    PubMed

    Grimbizis, Grigoris F; Campo, Rudi

    2010-07-01

    Current proposals for classifying female genital anomalies seem to be associated with limitations in effective categorization, creating the need for a new classification system that is as simple as possible, clear and accurate in its definitions, comprehensive, and correlated with patients' clinical presentation, prognosis, and treatment on an evidence-based foundation. Although creating a new classification system is not an easy task, it is feasible when taking into account the experience gained from applying the existing classification systems, mainly that of the American Fertility Society.

  3. Quantitative sports and functional classification (QSFC) for disabled people with spasticity

    PubMed Central

    Khalili, M

    2004-01-01

    Although sports classification for cerebral palsy has been in use for several years, it is complicated both for training and for scoring. People with cerebral palsy are difficult to fit into classification systems that are appropriate for other disability groups. The aim of this report is to describe the development of a framework for a simple quantitative classification of cerebral palsy. It was designed to be easily understood by all who are investigating, treating, training, coaching, and working with spastic disabled people. The scoring system is accurate and quick, so long as the definitions of items listed are adhered to. PMID:15155434

  4. Environmental endocrine disruptors: A proposed classification scheme

    SciTech Connect

    Fur, P.L. de; Roberts, J.

    1995-12-31

    A number of chemicals known to act on animal systems through the endocrine system have been termed environmental endocrine disruptors. This group includes some of the PCBs and TCDDs, as well as lead, mercury and a large number of pesticides. The common feature is that the chemicals interact with endogenous endocrine systems at the cellular and/or molecular level to alter normal processes that are controlled or regulated by hormones. Although the existence of artificial or environmental estrogens (e.g. chlordecone and DES) has been known for some time, recent data indicate that this phenomenon is widespread. Indeed, anti-androgens have been held responsible for reproductive dysfunction in alligator populations in Florida. The significance of endocrine disruption was also recognized by pesticide manufacturers when insect growth regulators were developed to interfere with hormonal control of growth. Controlling, regulating or managing these chemicals depends in no small part on the ability to identify, screen or otherwise know that a chemical is an endocrine disruptor. Two possible classification schemes are: using the effects caused in an animal, or animals, as an exposure indicator; and using a known screen for the point of contact with the animal. The former would require extensive knowledge of cause-and-effect relationships in dozens of animal groups; the latter would require a screening tool comparable to an estrogen binding assay. The authors present a possible classification based on chemicals known to disrupt estrogenic, androgenic and ecdysone-regulated hormonal systems.

  5. Manifold learning for robust classification of hyperspectral data

    NASA Astrophysics Data System (ADS)

    Kim, Wonkook

    Accurate land cover classification that ensures robust mapping under diverse acquisition conditions is important in environmental studies, where the identification of land cover changes and their quantification have critical implications for management practices, the functioning of ecosystems, and the impact of climate. While remote sensing data have served as a useful tool for large-scale monitoring of the earth, hyperspectral data offer an enhanced capability for more accurate land cover classification. However, constructing a robust classification framework for hyperspectral data poses issues that stem from inherent properties of hyperspectral data, including highly correlated spectral bands, high dimensionality of data, nonlinear spectral responses, and nonstationarity of samples in space and time. This dissertation addresses these issues by leveraging the concept of manifolds. A manifold is a nonlinear low-dimensional subspace that is supported by data samples. Manifolds can be exploited in developing robust feature extraction and classification methods that are pertinent to the aforementioned issues. In this dissertation, various manifold learning algorithms that are widely used in the machine learning community are investigated for the classification of hyperspectral data. The performance of global and local manifold learning methods is investigated in terms of (a) parameter values, (b) the number of features retained, and (c) scene characteristics of the hyperspectral data. An empirical study involving several data sets with diverse characteristics is outlined in Chapter 3. Results indicate that the manifold coordinates produce generally higher classification accuracies than those obtained by linear feature extraction methods when they are used with proper settings. Chapter 4 addresses two limitations in manifold learning, (a) heavy computational requirements and (b) lack of attention to spatial context, which limit the applicability
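
    A hedged sketch of the manifold-learning-plus-classification idea, with Isomap standing in for the global manifold learning methods the dissertation surveys; the classifier and all parameter values are illustrative.

      from sklearn.manifold import Isomap
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.pipeline import make_pipeline

      # Hyperspectral pixels: X is (n_pixels, n_bands); y holds class labels.
      model = make_pipeline(
          Isomap(n_neighbors=10, n_components=15),   # nonlinear manifold coords
          KNeighborsClassifier(n_neighbors=5),
      )
      # model.fit(X_train, y_train); model.score(X_test, y_test)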

  6. 5 CFR 1312.7 - Derivative classification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., DOWNGRADING, DECLASSIFICATION AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION Classification and Declassification of National Security Information § 1312.7 Derivative classification. A derivative...

  7. 5 CFR 1312.7 - Derivative classification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., DOWNGRADING, DECLASSIFICATION AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION Classification and Declassification of National Security Information § 1312.7 Derivative classification. A derivative...

  8. 5 CFR 1312.7 - Derivative classification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., DOWNGRADING, DECLASSIFICATION AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION Classification and Declassification of National Security Information § 1312.7 Derivative classification. A derivative...

  9. Environmental exposure measurement in cancer epidemiology

    PubMed Central

    2009-01-01

    Environmental exposures, used in the broadest sense of lifestyle, infections, radiation, natural and man-made chemicals and occupation, are a major cause of human cancer. However, the precise contribution of specific risk factors and their interaction, both with each other and with genotype, continues to be difficult to elucidate. This is partially due to limitations in accurately measuring exposure, with the subsequent risk of misclassification. One of the primary challenges of molecular cancer epidemiology therefore is to improve exposure assessment. Progress has been made with biomarkers such as carcinogens and their metabolites, DNA and protein adducts and mutations measured in various tissues and body fluids. Nevertheless, much remains to be accomplished in order to establish aetiology and provide the evidence base for public health decisions. This review considers some of the principles behind the application of exposure biomarkers in cancer epidemiology. It also demonstrates how the same biomarkers can contribute both to establishing the biological plausibility of associations between exposure and disease and to providing valuable endpoints in intervention studies. The potential of new technologies such as transcriptomics, proteomics and metabonomics to provide a step change in environmental exposure assessment is discussed. An increasing recognition of the role of epigenetic changes in carcinogenesis presents a fresh challenge, as alterations in DNA methylation, histone modification and microRNA in response to environmental exposures demand a new generation of exposure biomarkers. The overall importance of this area of research is brought into sharp relief by the large prospective cohort studies (e.g. UK Biobank) which need accurate exposure measurement in order to shed light on the complex gene:environment interactions underlying common chronic disorders including cancer. It is suggested that a concerted effort is now required, with appropriate funding, to develop and

  10. Rapid infrared mapping for highly accurate automated histology in Barrett's oesophagus.

    PubMed

    Old, O J; Lloyd, G R; Nallala, J; Isabelle, M; Almond, L M; Shepherd, N A; Kendall, C A; Shore, A C; Barr, H; Stone, N

    2016-10-07

    Barrett's oesophagus (BE) is a premalignant condition that can progress to oesophageal adenocarcinoma. Endoscopic surveillance aims to identify potential progression at an early, treatable stage, but generates large numbers of tissue biopsies. Fourier transform infrared (FTIR) mapping was used to develop an automated histology tool for detection of BE and Barrett's neoplasia in tissue biopsies. 22 oesophageal tissue samples were collected from 19 patients. Contiguous frozen tissue sections were taken for pathology review and FTIR imaging. 45 mid-IR images were measured on an Agilent 620 FTIR microscope with an Agilent 670 spectrometer. Each image covering a 140 μm × 140 μm region was measured in 5 minutes, using a 1.1 μm² pixel size and 64 scans per pixel. Principal component fed linear discriminant analysis was used to build classification models based on spectral differences, which were then tested using leave-one-sample-out cross validation. Key biochemical differences were identified by their spectral signatures: high glycogen content was seen in normal squamous (NSQ) tissue, high glycoprotein content was observed in glandular BE tissue, and high DNA content in dysplasia/adenocarcinoma samples. Classification of normal squamous samples versus 'abnormal' samples (any stage of Barrett's) was performed with 100% sensitivity and specificity. Neoplastic Barrett's (dysplasia or adenocarcinoma) was identified with 95.6% sensitivity and 86.4% specificity. Highly accurate pathology classification can be achieved with FTIR measurement of frozen tissue sections in a clinically applicable timeframe.
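
    A minimal sketch of the analysis pattern described, principal component analysis feeding linear discriminant analysis with leave-one-sample-out cross-validation, assuming synthetic stand-in spectra and hypothetical sample groupings rather than the study's FTIR data:

    ```python
    # Minimal sketch: PCA-fed LDA with leave-one-sample-out cross-validation.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(220, 400))             # 220 spectra x 400 wavenumbers
    y = rng.integers(0, 2, size=220)            # 0 = normal squamous, 1 = Barrett's
    sample_ids = rng.integers(0, 22, size=220)  # spectra grouped by tissue sample

    model = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
    # Leaving one sample (group) out at a time avoids testing on spectra from
    # a tissue sample that also contributed training spectra.
    scores = cross_val_score(model, X, y, groups=sample_ids, cv=LeaveOneGroupOut())
    print("per-sample accuracies:", scores.round(2))
    ```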

  11. Successional stage of biological soil crusts: an accurate indicator of ecohydrological condition

    USGS Publications Warehouse

    Belnap, Jayne; Wilcox, Bradford P.; Van Scoyoc, Matthew V.; Phillips, Susan L.

    2013-01-01

    Biological soil crusts are a key component of many dryland ecosystems. Following disturbance, biological soil crusts will recover in stages. Recently, a simple classification of these stages has been developed, largely on the basis of external features of the crusts, which reflects their level of development (LOD). The classification system has six LOD classes, from low (1) to high (6). To determine whether the LOD of a crust is related to its ecohydrological function, we used rainfall simulation to evaluate differences in infiltration, runoff, and erosion among crusts in the various LODs, across a range of soil depths and with different wetting pre-treatments. We found large differences between the lowest and highest LODs, with runoff and erosion being greatest from the lowest LOD. Under dry antecedent conditions, about 50% of the water applied ran off the lowest LOD plots, whereas less than 10% ran off the plots of the two highest LODs. Similarly, sediment loss was 400 g m⁻² from the lowest LOD and almost zero from the higher LODs. We scaled up the results from these simulations using the Rangeland Hydrology and Erosion Model. Modelling results indicate that erosion increases dramatically as slope length and gradient increase, especially beyond the threshold values of 10 m for slope length and 10% for slope gradient. Our findings confirm that the LOD classification is a quick, easy, nondestructive, and accurate index of hydrological condition and should be incorporated in field and modelling assessments of ecosystem health.

  12. Characterization of population exposure to organochlorines: a cluster analysis application.

    PubMed

    Guimarães, Raphael Mendonça; Asmus, Carmen Ildes Rodrigues Fróes; Burdorf, Alex

    2013-06-01

    This study aimed to present the results of a cluster analysis application for characterizing population exposure to organochlorines through variables related to time and exposure dose. Characteristics of 354 subjects in a population exposed to organochlorine pesticide residues, related to time and exposure dose, were subjected to cluster analysis to separate them into subgroups. We performed hierarchical cluster analysis, with the aggregation strategy accomplished by Ward's method; to evaluate classification accuracy, intra-group and inter-group variability were compared by ANOVA for each dimension. Variables associated with exposure and routes of contamination were used to create the clusters. Information on the estimated intake doses of the compound was used to weight the exposure-time values for each route, so as to obtain proxy values for exposure intensity. The results showed three clusters: cluster 1 (n = 45) with characteristics of greatest exposure, cluster 2 (n = 103) with intermediate exposure, and cluster 3 (n = 206) with least exposure. Bivariate analyses performed between the groups showed statistically significant differences. This study demonstrated the applicability of cluster analysis for categorizing populations exposed to organochlorines, and also points to the relevance of typological studies, typical of environmental epidemiology, that may contribute to a better classification of subjects exposed to chemical agents and a wider understanding of etiological, preventive and therapeutic aspects of contamination.
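
    A minimal sketch of the clustering step described, Ward-linkage hierarchical clustering on dose-weighted exposure variables followed by per-dimension ANOVA; the variables and intake-dose weights are hypothetical placeholders:

    ```python
    # Minimal sketch: Ward hierarchical clustering cut into three groups.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.stats import f_oneway

    rng = np.random.default_rng(0)
    exposure_time = rng.exponential(size=(354, 3))  # time on 3 contamination routes
    intake_dose = rng.uniform(0.5, 2.0, size=3)     # estimated intake dose per route
    X = exposure_time * intake_dose                 # proxy for exposure intensity

    Z = linkage(X, method="ward")
    labels = fcluster(Z, t=3, criterion="maxclust")  # three clusters

    # ANOVA per dimension: inter-group versus intra-group variability.
    for j in range(X.shape[1]):
        groups = [X[labels == k, j] for k in (1, 2, 3)]
        print(f"route {j}: F = {f_oneway(*groups).statistic:.1f}")
    ```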

  13. Effective Feature Selection for Classification of Promoter Sequences.

    PubMed

    K, Kouser; P G, Lavanya; Rangarajan, Lalitha; K, Acharya Kshitish

    2016-01-01

    Exploring novel computational methods for making sense of biological data has been not only a necessity but also productive. Part of this trend is the search for more efficient in silico methods/tools for the analysis of promoters, the parts of DNA sequences that are involved in regulating the expression of genes into other functional molecules. Promoter regions vary greatly in their function based on the sequence of nucleotides and the arrangement of protein-binding short regions called motifs. In fact, the regulatory nature of promoters seems to be largely driven by the selective presence and/or the arrangement of these motifs. Here, we explore computational classification of promoter sequences based on the pattern of motif distributions, as such classification can pave a new way for functional analysis of promoters and for discovering the functionally crucial motifs. We make use of Position Specific Motif Matrix (PSMM) features to explore the possibility of accurately classifying promoter sequences with some of the popular classification techniques. Classification results on the complete feature set are low, perhaps due to the huge number of features. We propose two ways of reducing features. Our test results show improvement in the classification output after the reduction of features. The results also show that decision trees outperform SVM (Support Vector Machine), KNN (K Nearest Neighbor) and the ensemble classifier LibD3C, particularly with reduced features. The proposed feature selection methods outperform some of the popular feature transformation methods such as PCA and SVD. Also, the methods proposed are as accurate as MRMR (a feature selection method) but much faster than MRMR. Such methods could be useful for categorizing new promoters and exploring the regulatory mechanisms of gene expression in complex eukaryotic species.

  14. Effective Feature Selection for Classification of Promoter Sequences

    PubMed Central

    K., Kouser; P. G., Lavanya; Rangarajan, Lalitha; K., Acharya Kshitish

    2016-01-01

    Exploring novel computational methods for making sense of biological data has been not only a necessity but also productive. Part of this trend is the search for more efficient in silico methods/tools for the analysis of promoters, the parts of DNA sequences that are involved in regulating the expression of genes into other functional molecules. Promoter regions vary greatly in their function based on the sequence of nucleotides and the arrangement of protein-binding short regions called motifs. In fact, the regulatory nature of promoters seems to be largely driven by the selective presence and/or the arrangement of these motifs. Here, we explore computational classification of promoter sequences based on the pattern of motif distributions, as such classification can pave a new way for functional analysis of promoters and for discovering the functionally crucial motifs. We make use of Position Specific Motif Matrix (PSMM) features to explore the possibility of accurately classifying promoter sequences with some of the popular classification techniques. Classification results on the complete feature set are low, perhaps due to the huge number of features. We propose two ways of reducing features. Our test results show improvement in the classification output after the reduction of features. The results also show that decision trees outperform SVM (Support Vector Machine), KNN (K Nearest Neighbor) and the ensemble classifier LibD3C, particularly with reduced features. The proposed feature selection methods outperform some of the popular feature transformation methods such as PCA and SVD. Also, the methods proposed are as accurate as MRMR (a feature selection method) but much faster than MRMR. Such methods could be useful for categorizing new promoters and exploring the regulatory mechanisms of gene expression in complex eukaryotic species. PMID:27978541
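
    A minimal sketch of the comparison described in the two records above, feature reduction followed by a decision tree versus an SVM; the PSMM feature matrix is simulated, and SelectKBest stands in for the paper's own reduction methods:

    ```python
    # Minimal sketch: reduce a large motif-feature matrix, then compare classifiers.
    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.random((300, 5000))       # 300 promoters x 5000 motif-position features
    y = rng.integers(0, 3, size=300)  # three hypothetical promoter classes

    X_red = SelectKBest(f_classif, k=100).fit_transform(X, y)  # feature reduction

    for name, clf in [("tree", DecisionTreeClassifier(random_state=0)),
                      ("svm", SVC())]:
        print(name, cross_val_score(clf, X_red, y, cv=5).mean().round(3))
    ```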

  15. Classification of protein crystallization imagery.

    PubMed

    Zhu, Xiaoqing; Sun, Shaohua; Bern, Marshall

    2004-01-01

    We investigate automatic classification of protein crystallization imagery, and evaluate the performance of several modern mathematical tools when applied to the problem. For feature extraction, we try a combination of geometric and texture features; for classification algorithms, the support vector machine (SVM) is compared with an automatic decision-tree classifier. Experimental results from 520 images are presented for the binary classification problem: separating successful trials from failed attempts. The best false positive and false negative rates are at 14.6% and 9.6% respectively, achieved by feeding both sets of features to the decision-tree classifier with boosting.
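
    A minimal sketch of the comparison described, an SVM versus a boosted decision-tree classifier on combined geometric and texture features; the feature values are synthetic placeholders for the paper's image descriptors:

    ```python
    # Minimal sketch: SVM versus boosted decision tree for binary image labels.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(520, 40))    # 520 images x 40 geometric/texture features
    y = rng.integers(0, 2, size=520)  # 1 = successful trial, 0 = failed attempt

    svm = SVC(kernel="rbf")
    boosted_tree = AdaBoostClassifier(DecisionTreeClassifier(max_depth=3),
                                      n_estimators=100, random_state=0)
    for name, clf in [("SVM", svm), ("boosted tree", boosted_tree)]:
        print(name, cross_val_score(clf, X, y, cv=5).mean().round(3))
    ```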

  16. Towards Automatic Classification of Neurons

    PubMed Central

    Armañanzas, Rubén; Ascoli, Giorgio A.

    2015-01-01

    The classification of neurons into types has been much debated since the inception of modern neuroscience. Recent experimental advances are accelerating the pace of data collection. The resulting information growth of morphological, physiological, and molecular properties encourages efforts to automate neuronal classification by powerful machine learning techniques. We review state-of-the-art analysis approaches and availability of suitable data and resources, highlighting prominent challenges and opportunities. The effective solution of the neuronal classification problem will require continuous development of computational methods, high-throughput data production, and systematic metadata organization to enable cross-lab integration. PMID:25765323

  17. Music classification with MPEG-7

    NASA Astrophysics Data System (ADS)

    Crysandt, Holger; Wellhausen, Jens

    2003-01-01

    Driven by the increasing amount of music available electronically, the need for, and possibility of, automatic classification systems for music becomes more and more important. Currently most search engines for music are based on textual descriptions such as artist and/or title. This paper presents a system for automatic music description, classification and visualization for a set of songs. The system is designed to extract significant features of a piece of music in order to find songs of a similar genre or with similar sound characteristics. The description is done with the help of MPEG-7 only. The classification and visualization are done with the self-organizing map algorithm.
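
    A minimal sketch of the self-organizing map step, in plain NumPy, assuming random stand-ins for the per-song MPEG-7 feature vectors:

    ```python
    # Minimal sketch: a self-organizing map over per-song feature vectors.
    import numpy as np

    rng = np.random.default_rng(0)
    songs = rng.normal(size=(200, 16))  # 200 songs x 16 audio features (stand-ins)
    grid_h, grid_w, dim = 8, 8, songs.shape[1]
    weights = rng.normal(size=(grid_h, grid_w, dim))
    coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                                  indexing="ij"), axis=-1)

    for epoch in range(20):
        lr = 0.5 * (1 - epoch / 20)            # decaying learning rate
        sigma = 3.0 * (1 - epoch / 20) + 0.5   # decaying neighbourhood radius
        for x in songs[rng.permutation(len(songs))]:
            # Best-matching unit: the map cell whose weights are closest to x.
            bmu = np.unravel_index(np.argmin(((weights - x) ** 2).sum(-1)),
                                   (grid_h, grid_w))
            d2 = ((coords - np.array(bmu)) ** 2).sum(-1)
            h = np.exp(-d2 / (2 * sigma ** 2))[..., None]  # neighbourhood kernel
            weights += lr * h * (x - weights)

    # Songs mapped to the same or nearby cells have similar sound characteristics.
    cell = np.unravel_index(np.argmin(((weights - songs[0]) ** 2).sum(-1)),
                            (grid_h, grid_w))
    print("song 0 maps to cell", cell)
    ```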

  18. Absolute classification with unsupervised clustering

    NASA Technical Reports Server (NTRS)

    Jeon, Byeungwoo; Landgrebe, D. A.

    1992-01-01

    An absolute classification algorithm is proposed in which the class definition through training samples or otherwise is required only for a particular class of interest. The absolute classification is considered as a problem of unsupervised clustering when one cluster is known initially. The definitions and statistics of the other classes are automatically developed through the weighted unsupervised clustering procedure, which is developed to keep the cluster corresponding to the class of interest from losing its identity as the class of interest. Once all the classes are developed, a conventional relative classifier such as the maximum-likelihood classifier is used in the classification.
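
    A minimal sketch of the idea, clustering in which only the class of interest is defined by training samples; seeding one k-means centroid from those samples is a simplification of the paper's weighted unsupervised procedure:

    ```python
    # Minimal sketch: clustering with one cluster anchored to the known class.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (100, 5)),    # unknown structure
                   rng.normal(4, 1, (100, 5))])
    training_samples = rng.normal(4, 1, (20, 5))  # samples of the class of interest

    init = np.vstack([training_samples.mean(axis=0),             # known-class centroid
                      X[rng.choice(len(X), 2, replace=False)]])  # others from data
    labels = KMeans(n_clusters=3, init=init, n_init=1).fit_predict(X)
    print("cluster sizes:", np.bincount(labels))
    ```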

  19. Classification of Rainbows

    NASA Astrophysics Data System (ADS)

    Adams, Peter; Ricard, Jean; Barckicke, Jean

    2016-04-01

    Rainbows are the most beautiful and most spectacular optical atmospheric phenomenon. Humphreys (1964) pointedly noted that "the 'explanations' generally given of the rainbow [in textbooks] may well be said to explain beautifully that which does not occur, and to leave unexplained that which does" . . . "The records of close observations of rainbows soon show that not even the colors are always the same". Textbooks stress that the main factor affecting the aspect of the rainbow is the radius of the water droplets. In his well-known textbook entitled "The Nature of Light & Colour in the Open Air", Minnaert (1954) gives the chief features of the rainbow depending on the diameter of the drops producing it. For this study, we have gathered hundreds of pictures of primary bows. We sort the pictures into classes, defined in such a way that rainbows belonging to the same class look similar. Our results are surprising and do not confirm Minnaert's classification. In practice, the size of the water droplets is only a minor factor controlling the overall aspect of the rainbow. The main factor appears to be the height of the sun above the horizon. At sunset, the width of the red band increases, while the width of the other bands of colours decreases. The orange, the violet, the blue and the green bands disappear completely, in this order. At the end, the primary bow is mainly red and slightly yellow. [Figure: picture taken from the CNRM in Toulouse after a summer storm (Jean Ricard).]

  20. Radar clutter classification

    NASA Astrophysics Data System (ADS)

    Stehwien, Wolfgang

    1989-11-01

    The problem of classifying radar clutter as found on air traffic control radar systems is studied. An algorithm based on Bayes decision theory and the parametric maximum a posteriori probability classifier is developed to perform this classification automatically. This classifier employs a quadratic discriminant function and is optimum for feature vectors that are distributed according to the multivariate normal density. Separable clutter classes are most likely to arise from the analysis of the Doppler spectrum. Specifically, a feature set based on the complex reflection coefficients of the lattice prediction error filter is proposed. The classifier is tested using data recorded from L-band air traffic control radars. The Doppler spectra of these data are examined; the properties of the feature set computed using these data are studied in terms of both the marginal and multivariate statistics. Several strategies involving different numbers of features, class assignments, and data set pretesting according to Doppler frequency and signal to noise ratio were evaluated before settling on a workable algorithm. Final results are presented in terms of experimental misclassification rates and simulated and classified plane position indicator displays.
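
    A minimal sketch of the parametric classifier described: the quadratic discriminant function for multivariate-normal classes, evaluated per clutter class with equal priors. The feature vectors are random stand-ins for the lattice reflection coefficients:

    ```python
    # Minimal sketch: quadratic discriminant g_i(x) = -0.5*ln|S_i|
    # - 0.5*(x - m_i)' S_i^{-1} (x - m_i) + ln P(w_i) for Gaussian classes.
    import numpy as np

    def fit_class(X):
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False)
        return mu, np.linalg.inv(cov), np.linalg.slogdet(cov)[1]

    def discriminant(x, mu, cov_inv, logdet, log_prior):
        d = x - mu
        return -0.5 * logdet - 0.5 * d @ cov_inv @ d + log_prior

    rng = np.random.default_rng(0)
    classes = {"ground": rng.normal(0, 1, (500, 6)),   # per-class feature sets
               "weather": rng.normal(1, 2, (500, 6))}
    params = {k: fit_class(v) for k, v in classes.items()}

    x = rng.normal(1, 2, 6)  # feature vector to classify
    scores = {k: discriminant(x, *p, np.log(0.5)) for k, p in params.items()}
    print(max(scores, key=scores.get))  # class with the largest discriminant
    ```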

  1. Molecular Classification of Medulloblastoma

    PubMed Central

    KIJIMA, Noriyuki; KANEMURA, Yonehiro

    2016-01-01

    Medulloblastoma (MB) is one of the most frequent malignant brain tumors in children. The current standard treatment regimen consists of surgical resection, craniospinal irradiation, and adjuvant chemotherapy. Although these treatments have the potential to increase the survival of 70–80% of patients with MB, they are also associated with serious treatment-induced morbidity. The current risk stratification of MB is based on clinical factors, including age at presentation, metastatic status, and the presence of residual tumor following resection. In addition, recent genomic studies indicate that MB consists of at least four distinct molecular subgroups: WNT, sonic hedgehog (SHH), Group 3, and Group 4. WNT and SHH MBs are characterized by aberrations in the WNT and SHH signaling pathways, respectively. WNT MB has the best prognosis compared to the other MBs, while SHH MB has an intermediate prognosis. The underlying signaling pathways associated with Group 3 and 4 MBs have not been identified. Group 3 MB is frequently associated with metastasis, resulting in a poor prognosis, while Group 4 is sometimes associated with metastasis and has an intermediate prognosis. Group 4 is the most frequent MB and represents 35% of all MBs. These findings suggest that MB is a heterogeneous disease, and that MB subgroups have distinct molecular, demographic, and clinical characteristics. The molecular classification of MBs is redefining the risk stratification of patients with MB, and has the potential to identify new therapeutic strategies for the treatment of MB. PMID:27238212

  2. Integrating Exposure into Chemical Alternatives Assessment ...

    EPA Pesticide Factsheets

    Most alternatives assessments (AA) published to date are largely hazard-based rankings, and as such may not represent a fully informed consideration of the advantages and disadvantages of possible alternatives. With an assessment goal of identifying an alternative chemical that is more sustainable, other attributes beyond hazard are also important, including exposure, risk, life-cycle impacts, performance, cost, and social responsibility. Building on the 2014 recommendations by the U.S. National Academy of Sciences to improve AA decisions by including comparative exposure assessment, the HESI Sustainable Chemical Alternatives Technical Committee, which consists of scientists from academia, industry, government, and NGOs, has developed a qualitative comparative exposure approach. Conducting such a comparison can screen for alternatives that are expected to have a higher exposure potential, which could trigger a higher-tiered, more quantitative exposure assessment on the alternatives being considered. This talk will demonstrate an approach for including chemical- and product-related exposure information in a qualitative AA comparison. Starting from existing hazard AAs, a series of four chemical-product application scenarios were examined to test the concept, to understand the effort required, and to determine the value of exposure data in AA decision-making. The group has developed a classification approach for ingredient and product parameters to support comparisons

  3. Thermal Spore Exposure Vessels

    NASA Technical Reports Server (NTRS)

    Beaudet, Robert A.; Kempf, Michael; Kirschner, Larry

    2006-01-01

    Thermal spore exposure vessels (TSEVs) are laboratory containers designed for use in measuring rates of death or survival of microbial spores at elevated temperatures. A major consideration in the design of a TSEV is minimizing thermal mass in order to minimize heating and cooling times. This is necessary in order to minimize the number of microbes killed before and after exposure at the test temperature, so that the results of the test accurately reflect the effect of the test temperature. A typical prototype TSEV (see figure) includes a flat-bottomed stainless-steel cylinder 4 in. (10.16 cm) long, 0.5 in. (1.27 cm) in diameter, having a wall thickness of 0.010 ± 0.002 in. (0.254 ± 0.051 mm). Microbial spores are deposited in the bottom of the cylinder, then the top of the cylinder is closed with a sterile rubber stopper. Hypodermic needles are used to puncture the rubber stopper to evacuate the inside of the cylinder or to purge the inside of the cylinder with a gas. In a typical application, the inside of the cylinder is purged with dry nitrogen prior to a test. During a test, the lower portion of the cylinder is immersed in a silicone-oil bath that has been preheated to and maintained at the test temperature. Test temperatures up to 220 C have been used. Because the spores are in direct contact with the thin cylinder wall, they quickly become heated to the test temperature.

  4. Classification of microphthalmos and coloboma.

    PubMed Central

    Warburg, M

    1993-01-01

    A new classification of microphthalmos and coloboma is proposed to bring order to the complexity of clinical and aetiological heterogeneity of these conditions. A phenotypic classification is presented which may help the clinician to give a systematic description of the anomalies. The phenotype does not predict the aetiology but a systematic description of ocular and systemic anomalies improves syndrome identification. There are two major classes, total and partial microphthalmos, and a subclassification which follows the embryology of the anomalies. The aetiological classification consists of three classes: (1) genetic (monogenic and chromosomal), (2) prenatally acquired (teratological agents and intrauterine deformations), and (3) associations. Genetic disorders give rise to malformations; prenatally acquired anomalies are disruptions or deformations. The aetiological classification can be applied to other congenital birth defects and improves counselling of families. Recurrence risks vary considerably between the classes. PMID:8411053

  5. Poisson Structures:. Towards a Classification

    NASA Astrophysics Data System (ADS)

    Grabowski, J.; Marmo, G.; Perelomov, A. M.

    In the present note we give an explicit description of a certain class of Poisson structures. The methods lead to a classification of Poisson structures in low dimensions and suggest a possible approach for higher dimensions.

  6. Classification of spacetimes with symmetry

    NASA Astrophysics Data System (ADS)

    Hicks, Jesse W.

    Spacetimes with symmetry play a critical role in Einstein's Theory of General Relativity. Missing from the literature is a correct, usable, and computer-accessible classification of such spacetimes. This dissertation fills this gap; specifically, we: i) give a new and different approach to the classification of spacetimes with symmetry using modern methods and tools such as the Schmidt method and computer algebra systems, resulting in ninety-two spacetimes; ii) create digital databases of the classification for easy access and use by researchers; iii) create software to classify any spacetime metric with symmetry against the new database; iv) compare the results of our classification with those of Petrov and find that Petrov missed six cases and incorrectly normalized a significant number of metrics; v) classify spacetimes with symmetry in the book Exact Solutions of Einstein's Field Equations, Second Edition by Stephani, Kramer, MacCallum, Hoenselaers, and Herlt and in Komrakov's paper Einstein-Maxwell equation on four-dimensional homogeneous spaces using the new software.

  7. Spectroscopic classification of supernova candidates

    NASA Astrophysics Data System (ADS)

    Hodgkin, S. T.; Hall, A.; Fraser, M.; Campbell, H.; Wyrzykowski, L.; Kostrzewa-Rutkowska, Z.; Pietro, N.

    2014-09-01

    We report the spectroscopic classification of four supernovae at the 2.5m Isaac Newton Telescope on La Palma, using the Intermediate Dispersion Spectrograph and the R300V grating (3500-8000 Ang; ~6 Ang resolution).

  8. CLASSIFICATION FRAMEWORK FOR COASTAL SYSTEMS

    EPA Science Inventory

    U.S. Environmental Protection Agency. Classification Framework for Coastal Systems. EPA/600/R-04/061. U.S. Environmental Protection Agency, National Health and Environmental Effects Research Laboratory, Atlantic Ecology Division, Narragansett, RI, Gulf Ecology Division, Gulf Bree...

  9. Critical Evaluation of Headache Classifications

    PubMed Central

    ÖZGE, Aynur

    2013-01-01

    Transforming a subjective sensation like headache into an objective state, and establishing a common language for this complaint, which can be both a symptom and a disease in itself, have kept investigators busy for years. Each recommendation proposed has brought along a set of patients who do not meet the criteria. While work has continued toward an almost ideal and fully comprehensive classification, criticisms that it is withdrawing from daily practice have come to the fore. In this article, the classification adventure of scientists working in the area of headache is summarized. More specifically, the two classifications made by the International Headache Society (IHS), and the point reached in relation to the third classification, which is still being worked on, are discussed together with headache subtypes. It is presented with the wish and belief that it will contribute to readers and young investigators who are interested in this subject.

  10. Handwriting Classification in Forensic Science.

    ERIC Educational Resources Information Center

    Ansell, Michael

    1979-01-01

    Considers systems for the classification of handwriting features, discusses computer storage of information about handwriting features, and summarizes recent studies that give an idea of the range of forensic handwriting research. (GT)

  11. ASSESSING CHILDREN'S EXPOSURES TO PESTICIDES: AN IMPORTANT APPLICATION OF THE STOCHASTIC HUMAN EXPOSURE AND DOSE SIMULATION MODEL (SHEDS)

    EPA Science Inventory

    Accurately quantifying human exposures and doses of various populations to environmental pollutants is critical for the Agency to assess and manage human health risks. For example, the Food Quality Protection Act of 1996 (FQPA) requires EPA to consider aggregate human exposure ...

  12. 77 FR 3800 - Accurate NDE & Inspection, LLC; Confirmatory Order

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-25

    ... COMMISSION Accurate NDE & Inspection, LLC; Confirmatory Order In the Matter of Accurate NDE & Docket: 150... request ADR with the NRC in an attempt to resolve issues associated with this matter. In response, on August 9, 2011, Accurate NDE requested ADR to resolve this matter with the NRC. On September 28,...

  13. On the accurate estimation of gap fraction during daytime with digital cover photography

    NASA Astrophysics Data System (ADS)

    Hwang, Y. R.; Ryu, Y.; Kimm, H.; Macfarlane, C.; Lang, M.; Sonnentag, O.

    2015-12-01

    Digital cover photography (DCP) has emerged as an indirect method to obtain gap fraction accurately. Thus far, however, the intervention of subjectivity, such as determining the camera relative exposure value (REV) and the threshold in the histogram, has hindered accurate computation of gap fraction. Here we propose a novel method that enables us to measure gap fraction accurately during daytime under various sky conditions by DCP. The method computes gap fraction using a single unsaturated DCP raw image that is corrected for scattering effects by canopies, together with a sky image reconstructed from the raw-format image. To test the sensitivity of the derived gap fraction to diverse REVs, solar zenith angles and canopy structures, we took photos at one-hour intervals between sunrise and midday under dense and sparse canopies with REV 0 to -5. The method showed little variation of gap fraction across different REVs in both dense and sparse canopies over a diverse range of solar zenith angles. A perforated-panel experiment, used to test the accuracy of the estimated gap fraction, confirmed that the method produced accurate and consistent gap fractions across different hole sizes, gap fractions and solar zenith angles. These findings highlight that the novel method opens new opportunities to estimate gap fraction accurately during daytime from sparse to dense canopies, which will be useful in monitoring LAI precisely and validating satellite remote sensing LAI products efficiently.
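
    A minimal sketch of the core computation, gap fraction as the proportion of above-threshold (sky) pixels in a canopy image; the fixed threshold here is a stand-in for the paper's automated, scattering-corrected procedure:

    ```python
    # Minimal sketch: gap fraction from a thresholded canopy image.
    import numpy as np

    rng = np.random.default_rng(0)
    raw = np.clip(rng.normal(0.2, 0.1, (1000, 1000)), 0, 1)  # canopy pixels
    sky = rng.uniform(0.8, 1.0, size=raw.shape)              # bright sky pixels
    mask = rng.random(raw.shape) < 0.15                      # 15% true gaps
    image = np.where(mask, sky, raw)                         # simulated photo

    threshold = 0.5                     # stand-in for a histogram-based threshold
    gap_fraction = (image > threshold).mean()
    print(f"estimated gap fraction: {gap_fraction:.3f}")     # ~0.15 expected
    ```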

  14. Cholangiocarcinoma: increasing burden of classifications

    PubMed Central

    Cardinale, Vincenzo; Bragazzi, Maria Consiglia; Carpino, Guido; Torrice, Alessia; Fraveto, Alice; Gentile, Raffaele; Pasqualino, Vincenzo; Melandro, Fabio; Aliberti, Camilla; Bastianelli, Carlo; Brunelli, Roberto; Berloco, Pasquale Bartolomeo; Gaudio, Eugenio

    2013-01-01

    Cholangiocarcinoma (CCA) is a very heterogeneous cancer from any point of view, including epidemiology, risk factors, morphology, pathology, molecular pathology, modalities of growth and clinical features. Given this heterogeneity, a uniform classification respecting the epidemiologic, pathologic and clinical needs is currently lacking. In this manuscript we discussed the different proposed classifications of CCA in relation with recent advances in pathophysiology and biology of this cancer. PMID:24570958

  15. Product Work Classification and Coding

    DTIC Science & Technology

    1986-06-01

    detail is much more useful in planning steel welding processes. In this regard remember that mild steel, HSLA steel, and high-yield steel (e.g. HY80)...manufacturing facility. In Figure 2.3-2, a classification and coding system for steel parts is shown. This classification and coding system sorts steel parts...system would provide a shop which produced steel parts with a means of organizing parts. Rather than attempting to manage all of its parts as a single

  16. Classification of the acanthocephala.

    PubMed

    Amin, Omar M

    2013-09-01

    In 1985, Amin presented a new system for the classification of the Acanthocephala in Crompton and Nickol's (1985) book 'Biology of the Acanthocephala' and recognized the concepts of Meyer (1931, 1932, 1933) and Van Cleave (1936, 1941, 1947, 1948, 1949, 1951, 1952). This system became the standard for the taxonomy of this group and remains so to date. Many changes have taken place and many new genera and species, as well as higher taxa, have been described since. An updated version of the 1985 scheme incorporating new concepts in molecular taxonomy, gene sequencing and phylogenetic studies is presented. The hierarchy has undergone a total face lift with Amin's (1987) addition of a new class, Polyacanthocephala (and a new order and family) to remove inconsistencies in the class Palaeacanthocephala. Amin and Ha (2008) added a third order (and a new family) to the Palaeacanthocephala, Heteramorphida, which combines features from the palaeacanthocephalan families Polymorphidae and Heteracanthocephalidae. Other families and subfamilies have been added but some have been eliminated, e.g. the three subfamilies of Arhythmacanthidae: Arhythmacanthinae Yamaguti, 1935; Neoacanthocephaloidinae Golvan, 1960; and Paracanthocephaloidinae Golvan, 1969. Amin (1985) listed 22 families, 122 genera and 903 species (4, 4 and 14 families; 13, 28 and 81 genera; 167, 167 and 569 species in Archiacanthocephala, Eoacanthocephala and Palaeacanthocephala, respectively). The number of taxa listed in the present treatment is 26 families (18% increase), 157 genera (29%), and 1298 species (44%) (4, 4 and 16; 18, 29 and 106; 189, 255 and 845, in the same order), which also includes 1 family, 1 genus and 4 species in the class Polyacanthocephala Amin, 1987, and 3 genera and 5 species in the fossil family Zhijinitidae.

  17. Using checklists and algorithms to improve qualitative exposure judgment accuracy.

    PubMed

    Arnold, Susan F; Stenzel, Mark; Drolet, Daniel; Ramachandran, Gurumurthy

    2016-01-01

    Most exposure assessments are conducted without the aid of robust personal exposure data and are based instead on qualitative inputs such as education and experience, training, documentation on the process chemicals, tasks and equipment, and other information. Qualitative assessments determine whether there is any follow-up, and influence the type that occurs, such as quantitative sampling, worker training, and implementing exposure and risk management measures. Accurate qualitative exposure judgments ensure appropriate follow-up that in turn ensures appropriate exposure management. Studies suggest that qualitative judgment accuracy is low. A qualitative exposure assessment Checklist tool was developed to guide the application of a set of heuristics to aid decision making. Practicing hygienists (n = 39) and novice industrial hygienists (n = 8) were recruited for a study evaluating the influence of the Checklist on exposure judgment accuracy. Participants generated 85 pre-training judgments and 195 Checklist-guided judgments. Pre-training judgment accuracy was low (33%) and not statistically significantly different from random chance. A tendency for IHs to underestimate the true exposure was observed. Exposure judgment accuracy improved significantly (p < 0.001) to 63% when aided by the Checklist. Qualitative judgments guided by the Checklist tool were categorically accurate or over-estimated the true exposure by one category 70% of the time. The overall magnitude of exposure judgment precision also improved following training. Fleiss' κ, evaluating inter-rater agreement between novice assessors, was fair to moderate (κ = 0.39). Cohen's weighted and unweighted κ were good to excellent for novice (0.77 and 0.80) and practicing IHs (0.73 and 0.89), respectively. Checklist judgment accuracy was similar to quantitative exposure judgment accuracy observed in studies of similar design using personal exposure measurements, suggesting that the tool could be useful in
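
    A minimal sketch of the agreement statistics reported, Cohen's unweighted and weighted kappa between judgment categories, using fabricated category labels; the study's own data are not reproduced here:

    ```python
    # Minimal sketch: Cohen's kappa for categorical exposure judgments.
    from sklearn.metrics import cohen_kappa_score

    truth  = [0, 1, 2, 3, 2, 1, 0, 2, 3, 1]  # "true" exposure categories
    judged = [0, 1, 2, 2, 2, 0, 0, 2, 3, 1]  # checklist-guided judgments

    # Weighted kappa credits near-misses (off by one category) more than
    # unweighted kappa, which treats all disagreements equally.
    print("unweighted kappa:", cohen_kappa_score(truth, judged))
    print("weighted kappa:  ", cohen_kappa_score(truth, judged, weights="linear"))
    ```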

  18. Classification of Salivary Gland Neoplasms.

    PubMed

    Bradley, Patrick J

    2016-01-01

    Presently, there is no universal 'working' classification system acceptable to all clinicians involved in the diagnosis and management of patients with salivary gland neoplasms. The most recent World Health Organization Classification of Tumours: Head and Neck Tumours (Salivary Glands) (2005) for benign and malignant neoplasms represents the consensus of current knowledge and is considered the standard pathological classification based on which series should be reported. The TNM classification of salivary gland malignancies has stood the test of time, and using the stage groupings remains the current standard for reporting treated patients' outcomes. Many developments in molecular and genetic methods in the meantime have identified a number of new entities, and new findings for several of the well-established salivary malignancies need to be considered for inclusion in any new classification system. All clinicians involved in the diagnosis, assessment and treatment of patients with salivary gland neoplasms must understand and respect the need for the various classification systems, enabling them to work within a multidisciplinary clinical team environment.

  19. Evaluation of the contribution of LiDAR data and postclassification procedures to object-based classification accuracy

    NASA Astrophysics Data System (ADS)

    Styers, Diane M.; Moskal, L. Monika; Richardson, Jeffrey J.; Halabisky, Meghan A.

    2014-01-01

    Object-based image analysis (OBIA) is becoming an increasingly common method for producing land use/land cover (LULC) classifications in urban areas. In order to produce the most accurate LULC map, LiDAR data and postclassification procedures are often employed, but their relative contributions to accuracy are unclear. We examined the contribution of LiDAR data and postclassification procedures to increase classification accuracies over using imagery alone and assessed sources of error along an ecologically complex urban-to-rural gradient in Olympia, Washington. Overall classification accuracy and user's and producer's accuracies for individual classes were evaluated. The addition of LiDAR data to the OBIA classification resulted in an 8.34% increase in overall accuracy, while manual postclassification to the imagery+LiDAR classification improved accuracy by only an additional 1%. Sources of error in this classification were largely due to edge effects, from which multiple different types of errors result.

  20. Classification of US hydropower dams by their modes of operation

    SciTech Connect

    McManamay, Ryan A.; Oigbokie, II, Clement O.; Kao, Shih -Chieh; Bevelhimer, Mark S.

    2016-02-19

    A key challenge to understanding ecohydrologic responses to dam regulation is the absence of a universally transferable classification framework for how dams operate. In the present paper, we develop a classification system to organize the modes of operation (MOPs) for U.S. hydropower dams and powerplants. To determine the full diversity of MOPs, we mined federal documents, open-access data repositories, and internet sources. We then used CART classification trees to predict MOPs based on physical characteristics, regulation, and project generation. Finally, we evaluated how much variation MOPs explained in sub-daily discharge patterns for stream gages downstream of hydropower dams. After reviewing information for 721 dams and 597 power plants, we developed a 2-tier hierarchical classification based on 1) the storage and control of flows to powerplants, and 2) the presence of a diversion around the natural stream bed. This resulted in nine tier-1 MOPs representing a continuum of operations from strictly peaking, to reregulating, to run-of-river, and two tier-2 MOPs, representing diversion and integral dam-powerhouse configurations. Although MOPs differed in physical characteristics and energy production, classification trees had low accuracies (<62%), which suggested accurate evaluations of MOPs may require individual attention. MOPs and dam storage explained 20% of the variation in downstream subdaily flow characteristics and showed consistent alterations in subdaily flow patterns from reference streams. Lastly, this standardized classification scheme is important for future research including estimating reservoir operations for large-scale hydrologic models and evaluating project economics, environmental impacts, and mitigation.

  1. Classification of US hydropower dams by their modes of operation

    DOE PAGES

    McManamay, Ryan A.; Oigbokie, II, Clement O.; Kao, Shih -Chieh; ...

    2016-02-19

    A key challenge to understanding ecohydrologic responses to dam regulation is the absence of a universally transferable classification framework for how dams operate. In the present paper, we develop a classification system to organize the modes of operation (MOPs) for U.S. hydropower dams and powerplants. To determine the full diversity of MOPs, we mined federal documents, open-access data repositories, and internet sources. We then used CART classification trees to predict MOPs based on physical characteristics, regulation, and project generation. Finally, we evaluated how much variation MOPs explained in sub-daily discharge patterns for stream gages downstream of hydropower dams. After reviewing information for 721 dams and 597 power plants, we developed a 2-tier hierarchical classification based on 1) the storage and control of flows to powerplants, and 2) the presence of a diversion around the natural stream bed. This resulted in nine tier-1 MOPs representing a continuum of operations from strictly peaking, to reregulating, to run-of-river, and two tier-2 MOPs, representing diversion and integral dam-powerhouse configurations. Although MOPs differed in physical characteristics and energy production, classification trees had low accuracies (<62%), which suggested accurate evaluations of MOPs may require individual attention. MOPs and dam storage explained 20% of the variation in downstream subdaily flow characteristics and showed consistent alterations in subdaily flow patterns from reference streams. Lastly, this standardized classification scheme is important for future research including estimating reservoir operations for large-scale hydrologic models and evaluating project economics, environmental impacts, and mitigation.
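
    A minimal sketch of the CART step described in the two records above, predicting a dam's mode of operation from physical and generation attributes with a classification tree; the attributes and labels are hypothetical placeholders for the paper's dam inventory:

    ```python
    # Minimal sketch: a classification tree predicting MOP classes.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n = 721
    X = np.column_stack([rng.lognormal(3, 1, n),  # storage (hypothetical units)
                         rng.lognormal(2, 1, n),  # installed capacity
                         rng.integers(0, 2, n)])  # diversion present?
    mop = rng.integers(0, 9, n)                   # nine tier-1 MOP classes

    tree = DecisionTreeClassifier(max_depth=5, random_state=0)
    print("CV accuracy:", cross_val_score(tree, X, mop, cv=5).mean().round(2))
    # On random stand-in data accuracy is near chance; the paper likewise
    # reported accuracies below 62% on real attributes.
    ```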

  2. Expert review for GHS classification of chemicals on health effects.

    PubMed

    Morita, Takeshi; Morikawa, Kaoru

    2011-01-01

    Intoxication as a result of chemical accidents is a major issue in industrial health. The Globally Harmonized System of Classification and Labelling of Chemicals (GHS) provides a framework for hazard communication on chemicals using labelling or safety data sheets. The GHS is expected to reduce the number of chemical accidents by communicating the hazards posed and prompting safety measures to be taken. One issue that may be a barrier to effective implementation of the GHS is the discrepancy in GHS classifications of chemicals across countries/regions. The main reasons are the differences in information sources used and in the expertise of the people making the classification (Classifiers). The GHS calls for expert judgment in a weight-of-evidence (WOE) approach when applying the classification criteria. A WOE approach is an assessment method that considers all available information bearing on the determination of toxicity. The quality and consistency of the data, study design, mechanism or mode of action, dose-effect relationships and biological relevance should be taken into account. Therefore, expert review is necessary to classify chemicals accurately. However, the GHS does not provide any information on the required level of expertise of the Classifiers, the definition of who qualifies as an expert, evaluation methods for WOE or data quality, or the timing of expert judgment and the need for updating/re-classification as new information becomes available. In this paper, key methods and issues in expert reviews are discussed. Examples of expert reviews and recommendations for harmonized classification are also presented.

  3. CREST--classification resources for environmental sequence tags.

    PubMed

    Lanzén, Anders; Jørgensen, Steffen L; Huson, Daniel H; Gorfer, Markus; Grindhaug, Svenn Helge; Jonassen, Inge; Øvreås, Lise; Urich, Tim

    2012-01-01

    Sequencing of taxonomic or phylogenetic markers is becoming a fast and efficient method for studying environmental microbial communities. This has resulted in a steadily growing collection of marker sequences, most notably of the small-subunit (SSU) ribosomal RNA gene, and an increased understanding of microbial phylogeny, diversity and community composition patterns. However, to utilize these large datasets together with new sequencing technologies, a reliable and flexible system for taxonomic classification is critical. We developed CREST (Classification Resources for Environmental Sequence Tags), a set of resources and tools for generating and utilizing custom taxonomies and reference datasets for classification of environmental sequences. CREST uses an alignment-based classification method with the lowest common ancestor algorithm. It also uses explicit rank similarity criteria to reduce false positives and identify novel taxa. We implemented this method in a web server, a command line tool and the graphical user interfaced program MEGAN. Further, we provide the SSU rRNA reference database and taxonomy SilvaMod, derived from the publicly available SILVA SSURef, for classification of sequences from bacteria, archaea and eukaryotes. Using cross-validation and environmental datasets, we compared the performance of CREST and SilvaMod to the RDP Classifier. We also utilized Greengenes as a reference database, both with CREST and the RDP Classifier. These analyses indicate that CREST performs better than alignment-free methods with higher recall rate (sensitivity) as well as precision, and with the ability to accurately identify most sequences from novel taxa. Classification using SilvaMod performed better than with Greengenes, particularly when applied to environmental sequences. CREST is freely available under a GNU General Public License (v3) from http://apps.cbu.uib.no/crest and http://lcaclassifier.googlecode.com.
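
    A minimal sketch of the lowest-common-ancestor rule at the heart of CREST's classification method, with a toy taxonomy and fabricated alignment hits; CREST's actual reference databases and rank-similarity criteria are not modelled here:

    ```python
    # Minimal sketch: assign a query to the lowest common ancestor of its hits.
    def lca(paths):
        """Each path is a root-to-taxon list; return the deepest shared prefix."""
        shared = []
        for ranks in zip(*paths):
            if len(set(ranks)) > 1:
                break
            shared.append(ranks[0])
        return shared[-1] if shared else "root"

    taxonomy = {
        "E. coli":     ["Bacteria", "Proteobacteria", "Enterobacteriaceae", "E. coli"],
        "S. enterica": ["Bacteria", "Proteobacteria", "Enterobacteriaceae", "S. enterica"],
        "B. subtilis": ["Bacteria", "Firmicutes", "Bacillaceae", "B. subtilis"],
    }

    hits = ["E. coli", "S. enterica"]        # reference taxa aligned to the query
    print(lca([taxonomy[h] for h in hits]))  # -> Enterobacteriaceae
    ```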

  4. Multi-View Acoustic Sizing and Classification of Individual Fish

    NASA Astrophysics Data System (ADS)

    Roberts, P. L. D.; Jaffe, J. S.

    Estimating biophysical parameters of fish populations in situ such as size, orientation, shape, and taxa is a fundamental goal in oceanography. Towards this end, acoustics is a natural choice due to its rapid, non-invasive capabilities. Here, multi-view methods are explored for classification, size and orientation estimation, and 2D image reconstruction for individual fish. Size- and shape-based classification using multi-view data is shown to be accurate (~10% error) using kernel methods and discriminant analysis. For species-based classification in the absence of significant differences in size or shape, multi-view methods offer significant (~40%) reduction in error, but absolute error rates remain high (~20%) due to the lack of discriminant information in acoustic scatter. Length and orientation estimation are investigated using a parameter-based approach with a simple ellipsoidal scattering model. Good accuracy is obtained when the views span the full 360°. When the span is limited to less than 60°, incorporating a prior constraint on possible body shapes can lead to reduced uncertainty in the estimated parameters. Finally, using views that span the full 360°, sparse Bayesian learning coupled with a conventional Radon transform yields accurate two-dimensional, projected images of the fish.
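
    A minimal sketch of the multi-view idea for species classification, concatenating echo features from several views before discriminant analysis; the echo features are random stand-ins, so accuracies are near chance and only the pipeline shape is illustrative:

    ```python
    # Minimal sketch: single-view versus multi-view discriminant classification.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_fish, n_views, n_feat = 200, 8, 12
    echoes = rng.normal(size=(n_fish, n_views, n_feat))  # acoustic-scatter stand-ins
    species = rng.integers(0, 2, size=n_fish)

    single_view = echoes[:, 0, :]            # features from one view only
    multi_view = echoes.reshape(n_fish, -1)  # all views concatenated

    for name, X in [("single-view", single_view), ("multi-view", multi_view)]:
        acc = cross_val_score(LinearDiscriminantAnalysis(), X, species, cv=5).mean()
        print(name, round(acc, 3))
    ```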

  5. 37 CFR 2.85 - Classification schedules.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... international classification pursuant to § 2.85(e)(3). (b) Prior United States classification system. Section 6... pursuant to § 2.85(e)(3); or (2) The registration was issued under a classification system prior to that... application. (e) Changes to Nice Agreement. The international classification system changes...

  6. 28 CFR 17.26 - Derivative classification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 28 Judicial Administration 1 2013-07-01 2013-07-01 false Derivative classification. 17.26 Section... ACCESS TO CLASSIFIED INFORMATION Classified Information § 17.26 Derivative classification. (a) Persons... documents or classification guides. (b) Persons who apply derivative classification markings shall...

  7. 12 CFR 403.4 - Derivative classification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 5 2013-01-01 2013-01-01 false Derivative classification. 403.4 Section 403.4... SAFEGUARDING OF NATIONAL SECURITY INFORMATION § 403.4 Derivative classification. (a) Use of derivative classification. (1) Unlike original classification which is an initial determination, derivative...

  8. 28 CFR 17.26 - Derivative classification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Derivative classification. 17.26 Section... ACCESS TO CLASSIFIED INFORMATION Classified Information § 17.26 Derivative classification. (a) Persons... documents or classification guides. (b) Persons who apply derivative classification markings shall...

  9. 28 CFR 17.26 - Derivative classification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 28 Judicial Administration 1 2012-07-01 2012-07-01 false Derivative classification. 17.26 Section... ACCESS TO CLASSIFIED INFORMATION Classified Information § 17.26 Derivative classification. (a) Persons... documents or classification guides. (b) Persons who apply derivative classification markings shall...

  10. 12 CFR 403.4 - Derivative classification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 5 2014-01-01 2014-01-01 false Derivative classification. 403.4 Section 403.4... SAFEGUARDING OF NATIONAL SECURITY INFORMATION § 403.4 Derivative classification. (a) Use of derivative classification. (1) Unlike original classification which is an initial determination, derivative...

  11. 12 CFR 403.4 - Derivative classification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 4 2011-01-01 2011-01-01 false Derivative classification. 403.4 Section 403.4... SAFEGUARDING OF NATIONAL SECURITY INFORMATION § 403.4 Derivative classification. (a) Use of derivative classification. (1) Unlike original classification which is an initial determination, derivative...

  12. 12 CFR 403.4 - Derivative classification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 5 2012-01-01 2012-01-01 false Derivative classification. 403.4 Section 403.4... SAFEGUARDING OF NATIONAL SECURITY INFORMATION § 403.4 Derivative classification. (a) Use of derivative classification. (1) Unlike original classification which is an initial determination, derivative...

  13. 28 CFR 17.26 - Derivative classification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Derivative classification. 17.26 Section... ACCESS TO CLASSIFIED INFORMATION Classified Information § 17.26 Derivative classification. (a) Persons... documents or classification guides. (b) Persons who apply derivative classification markings shall...

  14. 28 CFR 17.26 - Derivative classification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 28 Judicial Administration 1 2011-07-01 2011-07-01 false Derivative classification. 17.26 Section... ACCESS TO CLASSIFIED INFORMATION Classified Information § 17.26 Derivative classification. (a) Persons... documents or classification guides. (b) Persons who apply derivative classification markings shall...

  15. Prototype Expert System for Climate Classification.

    ERIC Educational Resources Information Center

    Harris, Clay

    Many students find climate classification laborious and time-consuming and, through lack of repetition, fail to grasp the details of classification. This paper describes an expert system for climate classification being developed at Middle Tennessee State University. Topics include: (1) an introduction to the nature of classification,…

  16. Current terminology and diagnostic classification schemes.

    PubMed

    Okeson, J P

    1997-01-01

    This article reviews the current terminology and classification schemes available for temporomandibular disorders. The origin of each term is presented, and the classification schemes that have been offered for temporomandibular disorders are briefly reviewed. Several important classifications are presented in more detail, with mention of advantages and disadvantages. Final recommendations are provided for future direction in the area of classification schemes.

  17. 32 CFR 2001.14 - Classification challenges.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 6 2011-07-01 2011-07-01 false Classification challenges. 2001.14 Section 2001.14 National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT... Classification § 2001.14 Classification challenges. (a) Challenging classification. Authorized holders,...

  18. 32 CFR 2001.14 - Classification challenges.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 6 2012-07-01 2012-07-01 false Classification challenges. 2001.14 Section 2001.14 National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT... Classification § 2001.14 Classification challenges. (a) Challenging classification. Authorized holders,...

  19. 32 CFR 2001.14 - Classification challenges.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Classification challenges. 2001.14 Section 2001.14 National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT... Classification § 2001.14 Classification challenges. (a) Challenging classification. Authorized holders,...

  20. 32 CFR 2001.21 - Original classification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 6 2011-07-01 2011-07-01 false Original classification. 2001.21 Section 2001.21... Markings § 2001.21 Original classification. (a) Primary markings. At the time of original classification... classification markings. See § 2001.24(j) for specific guidance. (e) Date of origin of document. The date...

  1. 32 CFR 2001.21 - Original classification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 6 2014-07-01 2014-07-01 false Original classification. 2001.21 Section 2001.21... Markings § 2001.21 Original classification. (a) Primary markings. At the time of original classification... classification markings. See § 2001.24(j) for specific guidance. (e) Date of origin of document. The date...

  2. 32 CFR 2001.14 - Classification challenges.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 6 2013-07-01 2013-07-01 false Classification challenges. 2001.14 Section 2001.14 National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT... Classification § 2001.14 Classification challenges. (a) Challenging classification. Authorized holders,...

  3. 32 CFR 2001.21 - Original classification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 6 2012-07-01 2012-07-01 false Original classification. 2001.21 Section 2001.21... Markings § 2001.21 Original classification. (a) Primary markings. At the time of original classification... classification markings. See § 2001.24(j) for specific guidance. (e) Date of origin of document. The date...

  4. 32 CFR 2001.14 - Classification challenges.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 6 2014-07-01 2014-07-01 false Classification challenges. 2001.14 Section 2001.14 National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT... Classification § 2001.14 Classification challenges. (a) Challenging classification. Authorized holders,...

  5. HIV classification using coalescent theory

    SciTech Connect

    Zhang, Ming; Letiner, Thomas K; Korber, Bette T

    2008-01-01

    Algorithms for subtype classification and breakpoint detection of HIV-1 sequences are based on a classification system of HIV-1. Hence, their quality depends heavily on this system. Owing to the history of its creation, the current HIV-1 nomenclature contains inconsistencies, for example: the phylogenetic distance between subtypes B and D is remarkably small compared with other pairs of subtypes and is, in fact, more like the distance between a pair of sub-subtypes (Robertson et al., 2000); subtypes E and I no longer exist, since they were discovered to be composed of recombinants (Robertson et al., 2000); it is currently debated whether -- instead of CRF02 being a recombinant of subtypes A and G -- subtype G should be designated as a circulating recombinant form (CRF) and CRF02 as a subtype (Abecasis et al., 2007); and there are 8 complete and over 400 partial HIV genomes in the LANL database that belong neither to a subtype nor to a CRF (denoted by U). Moreover, the current classification system is somewhat arbitrary, like all complex classification systems that were created manually. It is therefore desirable to derive the classification system of HIV systematically by an algorithm. Of course, this problem is not restricted to HIV but applies to all fast-mutating and recombining viruses. Our work addresses the simpler subproblem of scoring classifications of given input sequences of some virus species (a classification denotes a partition of the input sequences into several subtypes and CRFs). To this end, we reconstruct ancestral recombination graphs (ARGs) of the input sequences under restrictions determined by the given classification. These restrictions are imposed in order to ensure that the reconstructed ARGs do not contradict the classification under consideration. Then, we find the ARG with maximal probability by means of Markov Chain Monte Carlo methods. The probability of the most probable ARG is interpreted as a score for the classification. To our

  6. Hydrocarbon exposure and chronic renal disease.

    PubMed

    Asal, N R; Cleveland, H L; Kaufman, C; Nsa, W; Nelson, D I; Nelson, R Y; Lee, E T; Kingsley, B

    1996-01-01

    The study objective was to investigate further the potential role of long-term exposure to hydrocarbons (HCs) in the development of idiopathic chronic glomerulopathy (ICG) using a more refined measurement of HC exposure. A total of 321 pairs of cases and controls, matched by age, gender, and geographical area, were assembled. A detailed questionnaire was blindly administered to cases and controls to collect information on occupational and medical history and sociodemographic data. By integrating quantified measurements of HC exposure from a variety of sources with each subject's occupational history, a lifetime HC exposure score could be estimated and expressed in parts per million (ppm). Cases had an hydrocarbon exposure mean score of 165 ppm (median 48 ppm) as compared to 162 ppm (median 43 ppm) for controls (P = 0.757). When using hydrocarbon exposure as a dichotomous variable with a cutoff point at 100 ppm, cases had a higher proportion of exposed than controls, but the difference was not statistically significant at the 0.05 level, even after controlling for possible confounders through logistic regression. Subgroup analyses showed mixed results. In most subgroups differences between cases and controls tended to become significant when hydrocarbon was used as a dichotomous variable. Results from this study do not sufficiently support the hypothesized association of HC exposure and ICG in general. Subgroup analyses need further investigations. Efforts to generate accurate estimates of lifetime HC exposure should be emphasized for future investigations.

  7. Classification of cell death

    PubMed Central

    Kroemer, G; Galluzzi, L; Vandenabeele, P; Abrams, J; Alnemri, ES; Baehrecke, EH; Blagosklonny, MV; El-Deiry, WS; Golstein, P; Green, DR; Hengartner, M; Knight, RA; Kumar, S; Lipton, SA; Malorni, W; Nuñez, G; Peter, ME; Tschopp, J; Yuan, J; Piacentini, M; Zhivotovsky, B; Melino, G

    2009-01-01

    Different types of cell death are often defined by morphological criteria, without a clear reference to precise biochemical mechanisms. The Nomenclature Committee on Cell Death (NCCD) proposes unified criteria for the definition of cell death and of its different morphologies, while formulating several caveats against the misuse of words and concepts that slow down progress in the area of cell death research. Authors, reviewers and editors of scientific periodicals are invited to abandon expressions like ‘percentage apoptosis’ and to replace them with more accurate descriptions of the biochemical and cellular parameters that are actually measured. Moreover, at the present stage, it should be accepted that caspase-independent mechanisms can cooperate with (or substitute for) caspases in the execution of lethal signaling pathways and that ‘autophagic cell death’ is a type of cell death occurring together with (but not necessarily by) autophagic vacuolization. This study details the 2009 recommendations of the NCCD on the use of cell death-related terminology including ‘entosis’, ‘mitotic catastrophe’, ‘necrosis’, ‘necroptosis’ and ‘pyroptosis’. PMID:18846107

  8. Magnetic classification of meteorites

    NASA Astrophysics Data System (ADS)

    Rochette, P.; Sagnotti, L.; Consolmagno, G.; Denise, M.; Folco, L.; Gattacceca, J.; Osete, M.; Pesonen, L.

    2003-04-01

    either a different type of meteorite or even terrestrial rock. In the collections studied, several percent of the specimens appeared to be misidentified. Moreover, magnetic measurements can be performed directly in the field, by hand or by robot, opening new possibilities for screening meteorites on cold or hot desert surfaces. P. Rochette et al., Magnetic Classification of stony meteorites: 1. Ordinary chondrites. Met. Planet. Sci., in press.

  9. A Hybrid Classification Scheme for Mining Multisource Geospatial Data

    SciTech Connect

    Vatsavai, Raju; Bhaduri, Budhendra L

    2007-01-01

    Supervised learning methods such as Maximum Likelihood (ML) are often used in land cover (thematic) classification of remote sensing imagery. The ML classifier relies exclusively on the spectral characteristics of thematic classes, whose statistical distributions often overlap. The spectral response distributions of thematic classes are dependent on many factors including elevation, soil types, and atmospheric conditions present at the time of data acquisition. A second problem with statistical classifiers is the requirement of a large number of accurate training samples, which are often costly and time consuming to acquire over large geographic regions. With the increasing availability of geospatial databases, it is possible to exploit the knowledge derived from these ancillary datasets to improve classification accuracies even when the class distributions are highly overlapping. Likewise, newer semi-supervised techniques can be adopted to improve the parameter estimates of the statistical model by utilizing the large number of easily available unlabeled training samples. Unfortunately, there is no convenient multivariate statistical model that can be employed for multisource geospatial databases. In this paper we present a hybrid semi-supervised learning algorithm that effectively exploits freely available unlabeled training samples from multispectral remote sensing images and also incorporates ancillary geospatial databases. We have conducted several experiments on real datasets, and our new hybrid approach shows over 15% improvement in classification accuracy over conventional classification schemes.
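
    The semi-supervised idea above, letting plentiful unlabeled pixels sharpen a model trained on few labeled samples, can be sketched in a few lines. The code below is a generic self-training loop written with scikit-learn, not the authors' hybrid algorithm; the classifier choice, confidence threshold, and function name are illustrative assumptions.

        import numpy as np
        from sklearn.naive_bayes import GaussianNB

        def self_train(X_lab, y_lab, X_unlab, threshold=0.95, rounds=5):
            # Fit on labeled data, then repeatedly pseudo-label the unlabeled
            # samples the model is most confident about and refit.
            clf = GaussianNB().fit(X_lab, y_lab)
            for _ in range(rounds):
                if len(X_unlab) == 0:
                    break
                proba = clf.predict_proba(X_unlab)
                keep = proba.max(axis=1) >= threshold
                if not keep.any():
                    break
                X_lab = np.vstack([X_lab, X_unlab[keep]])
                y_lab = np.concatenate(
                    [y_lab, clf.classes_[proba[keep].argmax(axis=1)]])
                X_unlab = X_unlab[~keep]
                clf = GaussianNB().fit(X_lab, y_lab)
            return clf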

  10. Advances in Spectral-Spatial Classification of Hyperspectral Images

    NASA Technical Reports Server (NTRS)

    Fauvel, Mathieu; Tarabalka, Yuliya; Benediktsson, Jon Atli; Chanussot, Jocelyn; Tilton, James C.

    2012-01-01

    Recent advances in spectral-spatial classification of hyperspectral images are presented in this paper. Several techniques are investigated for combining both spatial and spectral information. Spatial information is extracted at the object (set of pixels) level rather than at the conventional pixel level. Mathematical morphology is first used to derive the morphological profile of the image, which includes characteristics about the size, orientation and contrast of the spatial structures present in the image. Then the morphological neighborhood is defined and used to derive additional features for classification. Classification is performed with support vector machines using the available spectral information and the extracted spatial information. Spatial post-processing is next investigated to build more homogeneous and spatially consistent thematic maps. To that end, three presegmentation techniques are applied to define regions that are used to regularize the preliminary pixel-wise thematic map. Finally, a multiple classifier system is defined to produce relevant markers that are exploited to segment the hyperspectral image with the minimum spanning forest algorithm. Experimental results conducted on three real hyperspectral images with different spatial and spectral resolutions and corresponding to various contexts are presented. They highlight the importance of spectral-spatial strategies for the accurate classification of hyperspectral images and validate the proposed methods.
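
    As a rough illustration of the morphological-profile idea described above, the sketch below stacks openings and closings of one band at several scales and trains a pixel-wise SVM. The structuring-element radii and the convention that label 0 marks unlabeled pixels are assumptions, not the authors' settings.

        import numpy as np
        from skimage.morphology import disk, opening, closing
        from sklearn.svm import SVC

        def morphological_profile(band, radii=(1, 2, 4)):
            # Stack the band with its openings/closings at increasing scales.
            layers = [band]
            for r in radii:
                se = disk(r)
                layers += [opening(band, se), closing(band, se)]
            return np.stack(layers, axis=-1)   # H x W x (1 + 2*len(radii))

        def train_pixelwise_svm(profile, labels):
            # One training sample per labeled pixel; label 0 = unlabeled.
            X = profile.reshape(-1, profile.shape[-1])
            y = labels.ravel()
            mask = y > 0
            return SVC(kernel="rbf").fit(X[mask], y[mask])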

  11. Impact of Information based Classification on Network Epidemics

    PubMed Central

    Mishra, Bimal Kumar; Haldar, Kaushik; Sinha, Durgesh Nandini

    2016-01-01

    Formulating mathematical models for accurate approximation of malicious propagation in a network is a difficult process because of our inherent lack of understanding of several underlying physical processes that intrinsically characterize the broader picture. The aim of this paper is to understand the impact of available information on the control of malicious network epidemics. A 1-n-n-1 type differential epidemic model is proposed, where the differentiality allows a symptom-based classification. This is the first attempt to add such a classification to the existing epidemic framework. The model is incorporated into a five-class system called the DifEpGoss architecture. Analysis reveals an epidemic threshold, based on which the long-term behavior of the system is analyzed. In this work, three real network datasets with 22002, 22469 and 22607 undirected edges, respectively, are used. The datasets show that the classification-based prevention built into the model can play a useful role in containing network epidemics. Further simulation-based experiments employ a three-category classification of attack and defense strengths, which allows us to consider 27 different possibilities. These experiments further corroborate the utility of the proposed model. The paper concludes with several interesting results. PMID:27329348
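
    The epidemic-threshold analysis mentioned above follows the standard pattern of compartmental models. The sketch below integrates a plain SIR model, not the paper's 1-n-n-1 DifEpGoss system, just to show how a threshold parameter separates dying-out from spreading epidemics; all rate values are assumed.

        import numpy as np
        from scipy.integrate import odeint

        def sir(y, t, beta, gamma):
            S, I, R = y
            return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

        beta, gamma = 0.5, 0.2      # assumed infection / recovery rates
        R0 = beta / gamma           # epidemic threshold: outbreak if R0 > 1
        t = np.linspace(0, 100, 1000)
        S, I, R = odeint(sir, [0.99, 0.01, 0.0], t, args=(beta, gamma)).T
        print(f"R0 = {R0:.2f}; peak infected fraction = {I.max():.3f}")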

  12. Central Texas coastal classification maps - Aransas Pass to Mansfield Channel

    USGS Publications Warehouse

    Morton, Robert A.; Peterson, Russell L.

    2006-01-01

    The primary purpose of the USGS National Assessment of Coastal Change Project is to provide accurate representations of pre-storm ground conditions for areas that are designated high priority because they have dense populations or valuable resources that are at risk from storm waves. A secondary purpose of the project is to develop a geomorphic (land feature) coastal classification that, with only minor modification, can be applied to most coastal regions in the United States. A Coastal Classification Map describing local geomorphic features is the first step toward determining the hazard vulnerability of an area. The Coastal Classification Maps of the National Assessment of Coastal Change Project present ground conditions such as beach width, dune elevations, overwash potential, and density of development. In order to complete a hazard-vulnerability assessment, that information must be integrated with other information, such as prior storm impacts and beach stability. The Coastal Classification Maps provide much of the basic information for such an assessment and represent a critical component of a storm-impact forecasting capability.

  13. Impact of Information based Classification on Network Epidemics

    NASA Astrophysics Data System (ADS)

    Mishra, Bimal Kumar; Haldar, Kaushik; Sinha, Durgesh Nandini

    2016-06-01

    Formulating mathematical models for accurate approximation of malicious propagation in a network is a difficult process because of our inherent lack of understanding of several underlying physical processes that intrinsically characterize the broader picture. The aim of this paper is to understand the impact of available information on the control of malicious network epidemics. A 1-n-n-1 type differential epidemic model is proposed, where the differentiality allows a symptom-based classification. This is the first attempt to add such a classification to the existing epidemic framework. The model is incorporated into a five-class system called the DifEpGoss architecture. Analysis reveals an epidemic threshold, based on which the long-term behavior of the system is analyzed. In this work, three real network datasets with 22002, 22469 and 22607 undirected edges, respectively, are used. The datasets show that the classification-based prevention built into the model can play a useful role in containing network epidemics. Further simulation-based experiments employ a three-category classification of attack and defense strengths, which allows us to consider 27 different possibilities. These experiments further corroborate the utility of the proposed model. The paper concludes with several interesting results.

  14. Using Copula Distributions to Support More Accurate Imaging-Based Diagnostic Classifiers for Neuropsychiatric Disorders

    PubMed Central

    Bansal, Ravi; Hao, Xuejun; Liu, Jun; Peterson, Bradley S.

    2014-01-01

    Many investigators have tried to apply machine learning techniques to magnetic resonance images (MRIs) of the brain in order to diagnose neuropsychiatric disorders. Usually the number of brain imaging measures (such as measures of cortical thickness and measures of local surface morphology) derived from the MRIs (i.e., their dimensionality) has been large (e.g. >10) relative to the number of participants who provide the MRI data (<100). Sparse data in a high dimensional space increases the variability of the classification rules that machine learning algorithms generate, thereby limiting the validity, reproducibility, and generalizability of those classifiers. The accuracy and stability of the classifiers can improve significantly if the multivariate distributions of the imaging measures can be estimated accurately. To accurately estimate the multivariate distributions using sparse data, we propose to estimate first the univariate distributions of imaging data and then combine them using a Copula to generate more accurate estimates of their multivariate distributions. We then sample the estimated Copula distributions to generate dense sets of imaging measures and use those measures to train classifiers. We hypothesize that the dense sets of brain imaging measures will generate classifiers that are stable to variations in brain imaging measures, thereby improving the reproducibility, validity, and generalizability of diagnostic classification algorithms in imaging datasets from clinical populations. In our experiments, we used both computer-generated and real-world brain imaging datasets to assess the accuracy of multivariate Copula distributions in estimating the corresponding multivariate distributions of real-world imaging data. Our experiments showed that diagnostic classifiers generated using imaging measures sampled from the Copula were significantly more accurate and more reproducible than were the classifiers generated using either the real-world imaging
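
    A minimal sketch of the Gaussian-copula construction described above: estimate each margin empirically, couple the margins through a correlation matrix of normal scores, then draw dense synthetic samples. Function and variable names are illustrative; the authors' estimation details may differ.

        import numpy as np
        from scipy.stats import norm

        def copula_sample(X, n_samples, seed=0):
            # X: n x d matrix of real imaging measures (rows = participants).
            rng = np.random.default_rng(seed)
            n, d = X.shape
            # 1. Rank-transform each margin to (0, 1), then to normal scores.
            u = (np.argsort(np.argsort(X, axis=0), axis=0) + 0.5) / n
            z = norm.ppf(u)
            corr = np.corrcoef(z, rowvar=False)     # copula correlation
            # 2. Draw correlated normals and push them back through each
            #    margin's empirical quantile function.
            z_new = rng.multivariate_normal(np.zeros(d), corr, size=n_samples)
            u_new = norm.cdf(z_new)
            return np.column_stack(
                [np.quantile(X[:, j], u_new[:, j]) for j in range(d)])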

  15. An accurate and practical method for inference of weak gravitational lensing from galaxy images

    NASA Astrophysics Data System (ADS)

    Bernstein, Gary M.; Armstrong, Robert; Krawiec, Christina; March, Marisa C.

    2016-07-01

    We demonstrate highly accurate recovery of weak gravitational lensing shear using an implementation of the Bayesian Fourier Domain (BFD) method proposed by Bernstein & Armstrong, extended to correct for selection biases. The BFD formalism is rigorously correct for Nyquist-sampled, background-limited, uncrowded images of background galaxies. BFD does not assign shapes to galaxies, instead compressing the pixel data D into a vector of moments M, such that we have an analytic expression for the probability P(M|g) of obtaining the observations with gravitational lensing distortion g along the line of sight. We implement an algorithm for conducting BFD's integrations over the population of unlensed source galaxies which measures ≈10 galaxies s^-1 core^-1 with good scaling properties. Initial tests of this code on ≈10^9 simulated lensed galaxy images recover the simulated shear to a fractional accuracy of m = (2.1 ± 0.4) × 10^-3, substantially more accurate than has been demonstrated previously for any generally applicable method. Deep sky exposures generate a sufficiently accurate approximation to the noiseless, unlensed galaxy population distribution assumed as input to BFD. Potential extensions of the method include simultaneous measurement of magnification and shear; multiple-exposure, multiband observations; and joint inference of photometric redshifts and lensing tomography.

  16. Inhalation exposure of animals.

    PubMed Central

    Phalen, R F

    1976-01-01

    Relative advantages and disadvantages and important design criteria for various exposure methods are presented. Five types of exposures are discussed: whole-body chambers, head-only exposures, nose or mouth-only methods, lung-only exposures, and partial-lung exposures. Design considerations covered include: air cleaning and conditioning; construction materials; losses of exposure materials; evenness of exposure; sampling biases; animal observation and care; noise and vibration control, safe exhausts, chamber loading, reliability, pressure fluctuations; neck seals, masks, animal restraint methods; and animal comfort. Ethical considerations in use of animals in inhalation experiments are also discussed. PMID:1017420

  17. Medium wave exposure characterisation using exposure quotients.

    PubMed

    Paniagua, Jesús M; Rufo, Montaña; Jiménez, Antonio; Antolín, Alicia; Pinar, Iván

    2010-06-01

    One of the aspects considered in the International Commission on Non-Ionizing Radiation Protection guidelines is that, in situations of simultaneous exposure to fields of different frequencies, exposure quotients for thermal and electrical stimulation effects should be examined. The aim of the present work was to analyse the electromagnetic radiation levels and exposure quotients for exposure to multiple-frequency sources in the vicinity of medium wave radio broadcasting antennas. The measurements were made with a spectrum analyser and a monopole antenna. Kriging interpolation was used to prepare contour maps and to estimate the levels in the towns and villages of the zone. The results showed the exposure quotient criterion based on electrical stimulation effects to be more stringent than those based on thermal effects or power density levels. Improvement of dosimetry evaluations requires the spectral components of the radiation to be quantified, followed by application of the criteria for exposure to multiple-frequency sources.
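
    The two summation rules examined above can be written down directly. In this sketch the measured field strengths and reference levels are placeholder numbers, not ICNIRP values or the study's measurements; it only illustrates why the linear (stimulation) sum is typically more stringent than the quadratic (thermal) sum.

        def stimulation_quotient(E, E_limit):
            # Electrical-stimulation effects add linearly in field strength.
            return sum(e / lim for e, lim in zip(E, E_limit))

        def thermal_quotient(E, E_limit):
            # Thermal effects add in power, i.e. quadratically in field strength.
            return sum((e / lim) ** 2 for e, lim in zip(E, E_limit))

        E = [0.8, 1.5, 0.3]           # measured field strengths per emitter, V/m
        E_limit = [87.0, 87.0, 28.0]  # placeholder reference levels, V/m
        for name, q in [("stimulation", stimulation_quotient(E, E_limit)),
                        ("thermal", thermal_quotient(E, E_limit))]:
            print(f"{name} quotient = {q:.4f} (compliant if <= 1)")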

  18. Power quality disturbance classification based on wavelet transform and self-organizing learning neural network

    NASA Astrophysics Data System (ADS)

    Ding, Guangbin; Liu, Lin

    2006-11-01

    A novel approach to power quality (PQ) disturbance classification based on the wavelet transform (WT) and a self-organizing learning array (SOLAR) system is proposed. The wavelet transform is used to extract feature vectors for various PQ disturbances, since the WT accurately localizes the characteristics of a signal in both the time and frequency domains. These feature vectors are then applied to a SOLAR system for training and disturbance pattern classification. Comparison with a classic neural network shows that SOLAR has better data-driven learning and local-interconnection performance. Results of the proposed method are compared with those of an existing method, and the proposed method provides accurate classification results. On the basis of a hypothesis test of the averages, it is shown that the choice of wavelet makes no statistically significant difference in PQ disturbance classification performance, and the relationship between the wavelet decomposition level and classification performance is discussed. The simulation results demonstrate that the proposed method offers a new way to identify and classify dynamic power quality disturbances.
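
    A minimal sketch of WT-based feature extraction of the kind described above, using PyWavelets: the energy of the detail coefficients at each decomposition level becomes one feature. The wavelet family ('db4') and the number of levels are assumptions, not the paper's choices.

        import numpy as np
        import pywt

        def wavelet_energy_features(signal, wavelet="db4", level=6):
            # coeffs[0] is the approximation; coeffs[1:] are detail levels.
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            # Energy per detail level localizes a disturbance in scale.
            return np.array([np.sum(c ** 2) for c in coeffs[1:]])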

  19. EEG Channel Selection Using Particle Swarm Optimization for the Classification of Auditory Event-Related Potentials

    PubMed Central

    Hokari, Haruhide

    2014-01-01

    Brain-machine interfaces (BMI) rely on the accurate classification of event-related potentials (ERPs) and their performance greatly depends on the appropriate selection of classifier parameters and features from dense-array electroencephalography (EEG) signals. Moreover, in order to achieve a portable and more compact BMI for practical applications, it is also desirable to use a system capable of accurate classification using information from as few EEG channels as possible. In the present work, we propose a method for classifying P300 ERPs using a combination of Fisher Discriminant Analysis (FDA) and a multiobjective hybrid real-binary Particle Swarm Optimization (MHPSO) algorithm. Specifically, the algorithm searches for the set of EEG channels and classifier parameters that simultaneously maximize the classification accuracy and minimize the number of used channels. The performance of the method is assessed through offline analyses on datasets of auditory ERPs from sound discrimination experiments. The proposed method achieved a higher classification accuracy than that achieved by traditional methods while also using fewer channels. It was also found that the number of channels used for classification can be significantly reduced without greatly compromising the classification accuracy. PMID:24982944
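
    The channel-selection search can be caricatured with a toy binary particle swarm. The sketch below collapses the paper's multiobjective hybrid real-binary PSO with FDA into a single penalized objective with ordinary LDA, so everything here (penalty weight, flip rate, classifier, feature layout) is an assumption for illustration only.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        def fitness(mask, X, y, penalty=0.01):
            # X: trials x channels feature matrix (one feature per channel).
            # Accuracy of LDA on the selected channels, minus a per-channel cost.
            if not mask.any():
                return -np.inf
            acc = cross_val_score(LinearDiscriminantAnalysis(),
                                  X[:, mask], y, cv=5).mean()
            return acc - penalty * mask.sum()

        def pso_select(X, y, n_particles=20, iters=30, flip=0.1, seed=0):
            rng = np.random.default_rng(seed)
            pos = rng.random((n_particles, X.shape[1])) < 0.5
            best_fit, best_mask = -np.inf, None
            for _ in range(iters):
                for p in pos:
                    f = fitness(p, X, y)
                    if f > best_fit:
                        best_fit, best_mask = f, p.copy()
                # "Velocity" step collapsed to: pull each bit toward the
                # global best half the time, plus random flips to explore.
                pos = np.where(rng.random(pos.shape) < 0.5, best_mask, pos)
                pos = pos ^ (rng.random(pos.shape) < flip)
            return best_mask, best_fit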

  20. Suicide Surveillance in the U.S. Military: Reporting and Classification Biases in Rate Calculations

    ERIC Educational Resources Information Center

    Carr, Joel R.; Hoge, Charles W.; Gardner, John; Potter, Robert

    2004-01-01

    The military has a well-defined population with suicide prevention programs that have been recognized as possible models for civilian suicide prevention efforts. Monitoring prevention programs requires accurate reporting. In civilian settings, several studies have confirmed problems in the reporting and classification of suicides. This analysis…

  1. Utility of the Beta-Multinomial Distribution in Multiple Classification Scheme.

    ERIC Educational Resources Information Center

    Peng, Chao-Ying Joanne

    This study is an attempt to answer the following research question: can the reliability of a criterion-referenced test be accurately determined according to a multiple classification of the student's performance? Specifically, the study pursues the beta-multinomial model, which postulates the probability distribution of an examinee's degree of…

  2. Very High Resolution Classification of Sentinel-1A Data Using Segmentation and Texture Analysis

    NASA Astrophysics Data System (ADS)

    Korosov, Anton A.; Park, Jeong-Won

    2016-08-01

    An algorithm for classification of sea ice, water and other surface types in Sentinel-1A SAR data has been developed based on thermal noise correction, segmentation, texture features and support vector machines. The algorithm was tested on several SAR images and proves to be accurate (95% true positive hits) and to have very high resolution (100 m pixel size).
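
    A sketch of one plausible texture step: grey-level co-occurrence (GLCM) statistics computed per image patch, later fed to an SVM. The quantization depth, offsets, and chosen properties are assumptions, not the authors' published configuration.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        def glcm_features(patch, levels=32):
            # patch: 2-D array of backscatter values scaled to [0, 1].
            q = (patch * (levels - 1)).astype(np.uint8)
            glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                                levels=levels, symmetric=True, normed=True)
            # Stack a few standard GLCM statistics into one feature vector.
            return np.hstack([graycoprops(glcm, prop).ravel()
                              for prop in ("contrast", "homogeneity", "energy")])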

  3. Factors Affecting the Item Parameter Estimation and Classification Accuracy of the DINA Model

    ERIC Educational Resources Information Center

    de la Torre, Jimmy; Hong, Yuan; Deng, Weiling

    2010-01-01

    To better understand the statistical properties of the deterministic inputs, noisy "and" gate cognitive diagnosis (DINA) model, the impact of several factors on the quality of the item parameter estimates and classification accuracy was investigated. Results of the simulation study indicate that the fully Bayes approach is most accurate when the…

  4. Automatic Detection and Classification of Coronal Mass Ejections

    NASA Astrophysics Data System (ADS)

    Qu, Ming; Shih, Frank Y.; Jing, Ju; Wang, Haimin

    2006-09-01

    We present an automatic algorithm to detect, characterize, and classify coronal mass ejections (CMEs) in Large Angle Spectrometric Coronagraph (LASCO) C2 and C3 images. The algorithm includes three steps: (1) production of running difference images of LASCO C2 and C3; (2) characterization of CME properties such as intensity, height, angular width of span, and speed; and (3) classification of strong, medium, and weak CMEs on the basis of that characterization. In this work, image enhancement, segmentation, and morphological methods are used to detect and characterize CME regions. In addition, Support Vector Machine (SVM) classifiers are combined with the CME properties to distinguish strong CMEs from weak ones. The real-time CME detection and classification results are recorded in a database available to the public. Compared with the two available CME catalogs, SOHO/LASCO and CACTus, we achieve accurate and fast detection of strong CMEs and most weak CMEs.
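
    Step (1) is simple enough to show directly: each frame has its predecessor subtracted so that moving CME fronts stand out against the nearly static corona. The array layout is an assumption.

        import numpy as np

        def running_difference(frames):
            # frames: (T, H, W) array of time-ordered, exposure-normalized
            # coronagraph images; bright/dark pairs in the result trace motion.
            return frames[1:] - frames[:-1]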

  5. Non-Destructive Classification Approaches for Equilbrated Ordinary Chondrites

    NASA Technical Reports Server (NTRS)

    Righter, K.; Harrington, R.; Schroeder, C.; Morris, R. V.

    2013-01-01

    Classification of meteorites is most effectively carried out by petrographic and mineralogic studies of thin sections, but a rapid and accurate classification technique for the many samples collected in dense collection areas (hot and cold deserts) is of great interest. Oil immersion techniques have been used to classify a large proportion of the US Antarctic meteorite collections since the mid-1980s [1]. This approach has allowed rapid characterization of thousands of samples over time, but nonetheless utilizes a piece of the sample that has been ground to grains or a powder. In order to compare a few non-destructive techniques with the standard approaches, we have characterized a group of chondrites from the Larkman Nunatak region using magnetic susceptibility and Moessbauer spectroscopy.

  6. Integrating Clonal Selection and Deterministic Sampling for Efficient Associative Classification

    PubMed Central

    Elsayed, Samir A. Mohamed; Rajasekaran, Sanguthevar; Ammar, Reda A.

    2013-01-01

    Traditional Associative Classification (AC) algorithms typically search for all possible association rules to find a representative subset of those rules. Since the search space of such rules may grow exponentially as the support threshold decreases, the rule-discovery process can be computationally expensive. One effective way to tackle this problem is to directly find a set of high-stakes association rules that potentially builds a highly accurate classifier. This paper introduces AC-CS, an AC algorithm that integrates the clonal selection of the immune system along with deterministic data sampling. Upon picking a representative sample of the original data, it proceeds in an evolutionary fashion to populate only rules that are likely to yield good classification accuracy. Empirical results on several real datasets show that the approach generates dramatically fewer rules than traditional AC algorithms. In addition, the proposed approach is significantly more efficient than traditional AC algorithms while achieving a competitive accuracy. PMID:24500504

  7. New decision support tool for acute lymphoblastic leukemia classification

    NASA Astrophysics Data System (ADS)

    Madhukar, Monica; Agaian, Sos; Chronopoulos, Anthony T.

    2012-03-01

    In this paper, we build a new decision support tool to improve the choice of treatment intensity in childhood ALL. The developed system includes several methods to accurately measure cell properties in microscope blood film images. The blood images undergo a series of pre-processing steps, which include color correlation and contrast enhancement. By performing K-means clustering on the resultant images, the nuclei of the cells under consideration are obtained. Shape features and texture features are then extracted for classification. The system is further tested on the classification of spectra measured from the cell nuclei in blood samples in order to distinguish normal cells from those affected by Acute Lymphoblastic Leukemia. The results show that the proposed system robustly segments and classifies acute lymphoblastic leukemia from complete microscopic blood images.
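
    The K-means segmentation step can be sketched as clustering pixel colors and keeping the darkest cluster as nuclei (stained nuclei are usually the darkest objects in a blood film). The number of clusters and the darkest-cluster heuristic are assumptions for illustration.

        import numpy as np
        from sklearn.cluster import KMeans

        def segment_nuclei(image, k=3):
            # image: H x W x 3 float RGB array after pre-processing.
            pixels = image.reshape(-1, 3)
            km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pixels)
            darkest = km.cluster_centers_.sum(axis=1).argmin()  # nuclei cluster
            return (km.labels_ == darkest).reshape(image.shape[:2])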

  8. The Photometric Classification Server of PanSTARRS1

    NASA Astrophysics Data System (ADS)

    Saglia, R. P.

    2008-12-01

    The Panoramic Survey Telescope and Rapid Response System 1 (PanSTARRS1) project is on its way to starting science operations in early 2009. In the next 3.5 years it will produce a grizy survey of 3/4 of the sky, 2 mag deeper than Sloan. The Photometric Classification Server is responsible for classifying each object as a star, galaxy or quasar, based on multiband photometry. Accordingly, it should also deliver accurate and timely stellar parameters or photometric redshifts. Several science projects rely on the output of the server, from transit planet searches and transient detections to the structure of the Milky Way, high-redshift quasars, galaxy evolution, cosmological shear, baryonic oscillations and galaxy cluster searches.

  9. Star-galaxy classification using deep convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Kim, Edward J.; Brunner, Robert J.

    2017-02-01

    Most existing star-galaxy classifiers use the reduced summary information from catalogues, requiring careful feature extraction and selection. The latest advances in machine learning that use deep convolutional neural networks (ConvNets) allow a machine to automatically learn the features directly from the data, minimizing the need for input from human experts. We present a star-galaxy classification framework that uses deep ConvNets directly on the reduced, calibrated pixel values. Using data from the Sloan Digital Sky Survey and the Canada-France-Hawaii Telescope Lensing Survey, we demonstrate that ConvNets are able to produce accurate and well-calibrated probabilistic classifications that are competitive with conventional machine learning techniques. Future advances in deep learning may bring more success with current and forthcoming photometric surveys, such as the Dark Energy Survey and the Large Synoptic Survey Telescope, because deep neural networks require very little manual feature engineering.
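
    A minimal sketch of the kind of network meant here, written in PyTorch: a small ConvNet mapping multiband pixel cutouts to two-class logits. The architecture, band count, and layer sizes are illustrative assumptions, not the authors' model.

        import torch
        import torch.nn as nn

        class StarGalaxyNet(nn.Module):
            def __init__(self, bands=5):           # e.g. ugriz cutouts (assumed)
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(bands, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                )
                self.head = nn.Sequential(
                    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 2),
                )

            def forward(self, x):                   # x: (N, bands, H, W)
                return self.head(self.features(x))  # logits; softmax gives probs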

  10. Pathologic classification of diabetic nephropathy.

    PubMed

    Tervaert, Thijs W Cohen; Mooyaart, Antien L; Amann, Kerstin; Cohen, Arthur H; Cook, H Terence; Drachenberg, Cinthia B; Ferrario, Franco; Fogo, Agnes B; Haas, Mark; de Heer, Emile; Joh, Kensuke; Noël, Laure H; Radhakrishnan, Jai; Seshan, Surya V; Bajema, Ingeborg M; Bruijn, Jan A

    2010-04-01

    Although pathologic classifications exist for several renal diseases, including IgA nephropathy, focal segmental glomerulosclerosis, and lupus nephritis, a uniform classification for diabetic nephropathy is lacking. Our aim, commissioned by the Research Committee of the Renal Pathology Society, was to develop a consensus classification combining type 1 and type 2 diabetic nephropathies. Such a classification should discriminate lesions by degree of severity and be easy to use internationally in clinical practice. We divide diabetic nephropathy into four hierarchical glomerular lesions with a separate evaluation for degrees of interstitial and vascular involvement. Biopsies diagnosed as diabetic nephropathy are classified as follows: Class I, glomerular basement membrane thickening: isolated glomerular basement membrane thickening and only mild, nonspecific changes by light microscopy that do not meet the criteria of classes II through IV. Class II, mesangial expansion, mild (IIa) or severe (IIb): glomeruli classified as mild or severe mesangial expansion but without nodular sclerosis (Kimmelstiel-Wilson lesions) or global glomerulosclerosis in more than 50% of glomeruli. Class III, nodular sclerosis (Kimmelstiel-Wilson lesions): at least one glomerulus with nodular increase in mesangial matrix (Kimmelstiel-Wilson) without changes described in class IV. Class IV, advanced diabetic glomerulosclerosis: more than 50% global glomerulosclerosis with other clinical or pathologic evidence that sclerosis is attributable to diabetic nephropathy. Good interobserver reproducibility for the four classes of DN was shown (intraclass correlation coefficient = 0.84) in a test of this classification.
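
    Because the class definitions above are an explicit decision rubric, they can be encoded directly. The sketch below only illustrates the ordering of the criteria; actual grading requires microscopy by a pathologist, and the simplified inputs are assumptions.

        def dn_glomerular_class(gbm_thickened, mesangial_expansion,
                                kw_nodules, global_sclerosis_pct):
            # Checks run from most to least advanced, mirroring the hierarchy.
            if global_sclerosis_pct > 50:
                return "IV"                  # advanced glomerulosclerosis
            if kw_nodules:
                return "III"                 # Kimmelstiel-Wilson lesions
            if mesangial_expansion == "severe":
                return "IIb"
            if mesangial_expansion == "mild":
                return "IIa"
            if gbm_thickened:
                return "I"                   # isolated GBM thickening
            return "unclassified"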

  11. Confined Aerosol Jet in Fiber Classification and Dustiness Measurement

    NASA Astrophysics Data System (ADS)

    Dubey, Prahit

    The focus of this dissertation is the numerical analysis of confined aerosol jets used in fiber classification and dustiness measurement. Of relevance to the present work are two devices, namely, the Baron Fiber Classifier (BFC), and the Venturi Dustiness Tester (VDT). The BFC is a device used to length-separate fibers, important for toxicological research. The Flow Combination Section (FCS) of this device consists of an upstream region, where an aerosol of uncharged fibers is introduced in the form of an annular jet, in-between two sheath flows. Length-separation occurs by dielectrophoresis, downstream of the FCS in the Fiber Classification Section (FClS). In its standard operation, BFC processes only small quantities of fibers. In order to increase its throughput, higher aerosol flow rates must be considered. The goal of the present investigation is to understand the interaction of sheath and aerosol flows inside the FCS, and to identify possible limits to increasing aerosol flow rates using Computational Fluid Dynamics (CFD). Simulations involve solution of Navier-Stokes equations for axisymmetric and 3D models of the FCS for six different flow rates, and a pure aerodynamic treatment of the aerosol jet. The results show that the geometry of the FCS, and the two sheath flows, are successful in preventing the emergence of vortices in the FCS for aerosol-to-sheath flow inlet velocity ratios below ≈ 50. For larger aerosol-to-sheath flow inlet velocity ratios, two vortices are formed, one near the inner cylinder and one near the outer cylinder. The VDT is a novel device for measuring the dustiness of powders, relevant for dust management and controlling hazardous exposure. It uses just 10 mg of the test powder for its operation, during which the powder is aerosolized and turbulently dispersed (Re = 19,900) for 1.5s into a 5.7 liter chamber; the aerosol is then gently sampled (Re = 2050) for 240s through two filters located at the chamber top. Pump-driven suction at

  12. Classification of iris colour: review and refinement of a classification schema.

    PubMed

    Mackey, David A; Wilkinson, Colleen H; Kearns, Lisa S; Hewitt, Alex W

    2011-07-01

    Eye colour or, more accurately, iris colour is one of the most obvious physical characteristics of a person. European parents frequently ask the colour of their newborn's eyes, only to see the iris change dramatically during their child's first year of life. Genetic and epidemiological findings have uncovered further details about the basis for iris colour, which may have important implications for further research and treatment of some eye diseases and ocular characteristics. Surprisingly there is no widely recognized classification system for eye colour. An added difficulty when trying to devise an international system is that subtle differences in colour description exist between languages (e.g. hazel vs. auburn). We reviewed the recent and very early literature pertaining to eye colour classification. Recent genetic investigations of eye colour have tended to either use simple (three-category grading systems) or more complex digital colour grading. We present a nine-category grading system. Categories in this novel schema include: (i) light blue; (ii) darker blue; (iii) blue with brown peripupillary ring; (iv) green; (v) green with brown iris ring; (vi) peripheral green central brown; (vii) brown with some peripheral green; (viii) brown; and (ix) dark brown. Although different observers may categorize a person's eye colour differently, it is generally only by an adjacent category. We also describe a continuum of iris pigmentation from a small ring of brown around the pupil to almost complete brown with small peripheral flecks. Digital publishing and assessment of iris colour will result in more standardized classification of iris colour and investigation of its role in eye disease.

  13. Comparing Features for Classification of MEG Responses to Motor Imagery

    PubMed Central

    Halme, Hanna-Leena; Parkkonen, Lauri

    2016-01-01

    Background Motor imagery (MI) with real-time neurofeedback could be a viable approach, e.g., in rehabilitation of cerebral stroke. Magnetoencephalography (MEG) noninvasively measures electric brain activity at high temporal resolution and is well-suited for recording oscillatory brain signals. MI is known to modulate 10- and 20-Hz oscillations in the somatomotor system. In order to provide accurate feedback to the subject, the most relevant MI-related features should be extracted from MEG data. In this study, we evaluated several MEG signal features for discriminating between left- and right-hand MI and between MI and rest. Methods MEG was measured from nine healthy participants imagining either left- or right-hand finger tapping according to visual cues. Data preprocessing, feature extraction and classification were performed offline. The evaluated MI-related features were power spectral density (PSD), Morlet wavelets, short-time Fourier transform (STFT), common spatial patterns (CSP), filter-bank common spatial patterns (FBCSP), spatio-spectral decomposition (SSD), and combined SSD+CSP, CSP+PSD, CSP+Morlet, and CSP+STFT. We also compared four classifiers applied to single trials using 5-fold cross-validation for evaluating the classification accuracy and its possible dependence on the classification algorithm. In addition, we estimated the inter-session left-vs-right accuracy for each subject. Results The SSD+CSP combination yielded the best accuracy in both left-vs-right (mean 73.7%) and MI-vs-rest (mean 81.3%) classification. CSP+Morlet yielded the best mean accuracy in inter-session left-vs-right classification (mean 69.1%). There were large inter-subject differences in classification accuracy, and the level of the 20-Hz suppression correlated significantly with the subjective MI-vs-rest accuracy. Selection of the classification algorithm had only a minor effect on the results. Conclusions We obtained good accuracy in sensor-level decoding of MI from single
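
    Of the compared features, common spatial patterns (CSP) has a particularly compact core: spatial filters come from a generalized eigendecomposition of the two classes' average covariance matrices. The sketch below shows that core; the trial layout and number of filters are assumptions.

        import numpy as np
        from scipy.linalg import eigh

        def csp_filters(X1, X2, n_filters=6):
            # X1, X2: (trials, channels, samples) arrays for the two classes.
            C1 = np.mean([np.cov(t) for t in X1], axis=0)
            C2 = np.mean([np.cov(t) for t in X2], axis=0)
            # Generalized eigenproblem C1 w = lambda (C1 + C2) w; eigenvectors
            # at both extremes maximize variance for one class vs. the other.
            vals, vecs = eigh(C1, C1 + C2)
            order = np.argsort(vals)
            pick = np.r_[order[:n_filters // 2], order[-(n_filters // 2):]]
            return vecs[:, pick].T               # (n_filters, channels)

        # The usual classifier input is then the log-variance of each
        # spatially filtered trial: np.log(np.var(W @ trial, axis=1)).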

  14. Multi-temporal surface classification of high mountain environments

    NASA Astrophysics Data System (ADS)

    Fritzmann, Patrick; Bollmann, Erik; Sailer, Rudolf; Stötter, Johann; Vetter, Michael

    2010-05-01

    Particularly in high mountain environments, data acquisitions covering large areas are often impossible or can be very time consuming and quite frequently dangerous. Since 2001, airborne laser scanning (ALS) campaigns have been carried out regularly at glaciologically relevant dates at Hintereisferner (Tyrol, Austria), resulting in a unique multi-temporal data set of the same high mountain area. Previous studies showed that ALS is a well suited method for glacier monitoring. Besides the geometric attributes of the surface, the backscattered signal energy, often referred to as intensity, is contained in the resulting data set. These intensities can be used to classify the surface, which until now has been done only on small-scale test areas at Hintereisferner. In this study a novel classification method based only on gridded intensity data, covering an area of 36 km², was developed to determine the main surface types (ice & water bodies, snow, firn, rock and vegetation). First, a correction model was applied to reduce effects caused by the scanning geometry as well as atmospheric and topographic influences. Furthermore, to homogenize the multi-temporal data set, a normalization method had to be developed to make the intensities of the different data sets comparable, which is a basic requirement for a multi-temporal surface classification. The classification system is based on a statistical approach: thresholds for each surface class were determined on a single raster and then used to classify other intensity rasters. To evaluate the surface classification, ground truth data derived from ortho-images and field campaigns were used. The classification results show that in particular ice and water bodies can be accurately classified from intensity data, while other surface classes are less reliably detected due to various factors influencing the intensity. The most limiting factor is the variability in surface reflectance. In particular the grain size of snow and firn surfaces
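
    Once intensities are corrected and normalized, the statistical thresholding step reduces to binning a raster against class boundaries. The threshold values, class ordering, and normalization range below are placeholders, not values from the study.

        import numpy as np

        # Placeholder intensity thresholds separating the five surface classes.
        THRESHOLDS = [0.15, 0.35, 0.55, 0.75]
        CLASSES = np.array(["ice/water", "snow", "firn", "rock", "vegetation"])

        def classify_intensity(raster):
            # raster: 2-D array of corrected, normalized intensities in [0, 1].
            return CLASSES[np.digitize(raster, THRESHOLDS)]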

  15. Learning semantic histopathological representation for basal cell carcinoma classification

    NASA Astrophysics Data System (ADS)

    Gutiérrez, Ricardo; Rueda, Andrea; Romero, Eduardo

    2013-03-01

    Diagnosis of a histopathology glass slide is a complex process that involves accurate recognition of several structures, their function in the tissue and their relation with other structures. The way in which the pathologist represents the image content and the relations between those objects yields better and more accurate diagnoses. Therefore, an appropriate semantic representation of the image content will be useful in several analysis tasks such as cancer classification, tissue retrieval and histopathological image analysis, among others. Nevertheless, automatically recognizing those structures and extracting their inner semantic meaning are still very challenging tasks. In this paper we introduce a new semantic representation that allows histopathological concepts to be described in a form suitable for classification. The approach identifies local concepts using dictionary learning, i.e., the algorithm learns the most representative atoms from a set of randomly sampled patches, and then models the spatial relations among them by counting co-occurrences between atoms while penalizing the spatial distance. The proposed approach was compared with a bag-of-features representation in a tissue classification task. For this purpose, 240 histological microscopical fields of view, 24 per tissue class, were collected. These images fed one Support Vector Machine classifier per class, using 120 images as the training set and the remaining ones for testing, maintaining the same proportion of each concept in the training and test sets. The classification results, averaged over 100 random partitions of training and test sets, show that our approach is on average almost 6% more sensitive than the bag-of-features representation.

  16. Classification of road surface profiles

    SciTech Connect

    Rouillard, V.; Bruscella, B.; Sek, M.

    2000-02-01

    This paper introduces a universal classification methodology for discretely sampled sealed bituminous road profile data for the study of shock and vibrations related to the road transportation process. Data representative of a wide variety of Victorian (Australia) road profiles were used to develop a universal classification methodology with special attention to their non-Gaussian and nonstationary properties. This resulted in the design of computer software to automatically detect and extract transient events from the road spatial acceleration data as well as to identify segments of the constant RMS level enabling transients to be analyzed separately from the underlying road process. Nine universal classification parameters are introduced to describe road profile spatial acceleration based on the statistical characteristics of the transient amplitude and stationary RMS segments. Results from this study are aimed at the areas of road transport simulation as well as road surface characterization.
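
    The core of the transient/stationary split described above can be sketched as a moving-RMS detector: samples far above the local RMS level are flagged as transients and analyzed separately from the underlying road process. The window length and crossing factor are assumptions.

        import numpy as np

        def split_transients(accel, window=256, k=3.0):
            # accel: 1-D spatial-acceleration signal sampled along the road.
            pad = window // 2
            padded = np.pad(accel.astype(float) ** 2, pad, mode="edge")
            kernel = np.ones(window) / window
            rms = np.sqrt(np.convolve(padded, kernel, mode="same")[pad:-pad])
            transient = np.abs(accel) > k * rms   # k*RMS crossing = transient
            return transient, rms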

  17. The revised classification of eukaryotes

    PubMed Central

    Adl, Sina M.; Simpson, Alastair. G.; Lane, Christopher E.; Lukeš, Julius; Bass, David; Bowser, Samuel S.; Brown, Matt; Burki, Fabien; Dunthorn, Micah; Hampl, Vladimir; Heiss, Aaron; Hoppenrath, Mona; Lara, Enrique; leGall, Line; Lynn, Denis H.; McManus, Hilary; Mitchell, Edward A. D.; Mozley-Stanridge, Sharon E.; Parfrey, Laura Wegener; Pawlowski, Jan; Rueckert, Sonja; Shadwick, Lora; Schoch, Conrad; Smirnov, Alexey; Spiegel, Frederick W.

    2012-01-01

    This revision of the classification of eukaryotes, which updates that of Adl et al. (2005), retains an emphasis on the protists and incorporates changes since 2005 that have resolved nodes and branches in phylogenetic trees. Whereas the previous revision was successful in re-introducing name stability to the classification, this revision provides a classification for lineages that were then still unresolved. The supergroups have withstood phylogenetic hypothesis testing with some modifications, but despite some progress, problematic nodes at the base of the eukaryotic tree still remain to be statistically resolved. Looking forward, subsequent transformations to our understanding of the diversity of life will be from the discovery of novel lineages in previously under-sampled areas and from environmental genomic information. PMID:23020233

  18. Galaxy Classification without Feature Extraction

    NASA Astrophysics Data System (ADS)

    Polsterer, K. L.; Gieseke, F.; Kramer, O.

    2012-09-01

    The automatic classification of galaxies according to the different Hubble types is a widely studied problem in the field of astronomy. The complexity of this task led to projects like Galaxy Zoo which try to obtain labeled data based on visual inspection by humans. Many automatic classification frameworks are based on artificial neural networks (ANN) in combination with a feature extraction step in the pre-processing phase. These approaches rely on labeled catalogs for training the models. The small size of the typically used training sets, however, limits the generalization performance of the resulting models. In this work, we present a straightforward application of support vector machines (SVM) for this type of classification task. The conducted experiments indicate that using a sufficient number of labeled objects provided by the EFIGI catalog leads to high-quality models. In contrast to standard approaches, no additional feature extraction is required.

  19. Classification systems for stalking behavior.

    PubMed

    Racine, Christopher; Billick, Stephen

    2014-01-01

    Stalking is a complex behavioral phenomenon that is unique in that it necessarily involves a prolonged dyadic relationship between a perpetrator and a victim. Since criminalization of stalking behavior in the 1990s, different conceptual typologies have attempted to classify this behavior to assess risk and aid in management decisions. The authors reviewed the current literature regarding the most recent and accepted stalking classification systems. The three predominant stalker typologies currently in use include Zona's stalker-victim types, Mullen's stalker typology, and the RECON stalker typology. Of these, the RECON classification system alone was developed in an attempt to separate stalkers into groups based on previously known risk factors for behaviorally based phenomena such as propensity for violence. Understanding and simplifying these classification systems may enhance the potential that new research will lead to evidence-based management and treatment strategies in the stalking situation.

  20. Hierarchical modeling for image classification

    NASA Technical Reports Server (NTRS)

    Likens, W.; Maw, K.

    1982-01-01

    As part of the California Integrated Remote Sensing System's (CIRSS) San Bernardino County Project, the use of data layers from a geographic information system (GIS) as an integral part of the Landsat image classification process was investigated. Through a hierarchical modeling technique, elevation, aspect, land use, vegetation, and growth management data from the project's data base were used to guide class labeling decisions in a 1976 Landsat MSS land cover classification. A similar model, incorporating 1976-1979 Landsat spectral change data in addition to other data base elements, was used in the classification of a 1979 Landsat image. The resultant Landsat products were integrated as additional layers into the data base for use in growth management, fire hazard, and hydrological modeling.

  1. Corpus Callosum MR Image Classification

    NASA Astrophysics Data System (ADS)

    Elsayed, A.; Coenen, F.; Jiang, C.; García-Fiñana, M.; Sluming, V.

    An approach to classifying Magnetic Resonance (MR) image data is described. The specific application is the classification of MRI scan data according to the nature of the corpus callosum; however, the approach has more general applicability. A variation of the “spectral segmentation with multi-scale graph decomposition” mechanism is introduced. The result of the segmentation is stored in a quad-tree data structure to which a weighted variation (also developed by the authors) of the gSpan algorithm is applied to identify frequent sub-trees. As a result, the images are expressed as a set of frequent sub-trees. There may be a great many of these, and thus a decision-tree-based feature reduction technique is applied before classification takes place. The results show that the proposed approach performs both efficiently and effectively, obtaining a classification accuracy of over 95% in the case of the given application.

  2. Classification of hyperlipidaemias and hyperlipoproteinaemias*

    PubMed Central

    1970-01-01

    Many studies of atherosclerosis have indicated hyperlipidaemia as a predisposing factor to vascular disease. The relationship holds even for mild degrees of hyperlipidaemia, a fact that underlines the importance of this category of disorders. Both primary and secondary hyperlipidaemias represent such a variety of abnormalities that an internationally acceptable provisional classification is highly desirable in order to facilitate communication between scientists with different backgrounds. The present memorandum presents such a classification; it briefly describes the criteria for diagnosis of the main types of hyperlipidaemia as well as the methods of their determination. Because lipoproteins offer more information than analysis of plasma lipids (most of the plasma lipids being bound to various proteins), the classification is based on lipoprotein analyses by electrophoresis and ultracentrifugation. Simpler methods, however, such as the observation of plasma and measurements of cholesterol and triglycerides, are used to the fullest possible extent in determining the lipoprotein patterns. PMID:4930042

  3. The revised classification of eukaryotes.

    PubMed

    Adl, Sina M; Simpson, Alastair G B; Lane, Christopher E; Lukeš, Julius; Bass, David; Bowser, Samuel S; Brown, Matthew W; Burki, Fabien; Dunthorn, Micah; Hampl, Vladimir; Heiss, Aaron; Hoppenrath, Mona; Lara, Enrique; Le Gall, Line; Lynn, Denis H; McManus, Hilary; Mitchell, Edward A D; Mozley-Stanridge, Sharon E; Parfrey, Laura W; Pawlowski, Jan; Rueckert, Sonja; Shadwick, Lora; Schoch, Conrad L; Smirnov, Alexey; Spiegel, Frederick W

    2012-09-01

    This revision of the classification of eukaryotes, which updates that of Adl et al. [J. Eukaryot. Microbiol. 52 (2005) 399], retains an emphasis on the protists and incorporates changes since 2005 that have resolved nodes and branches in phylogenetic trees. Whereas the previous revision was successful in re-introducing name stability to the classification, this revision provides a classification for lineages that were then still unresolved. The supergroups have withstood phylogenetic hypothesis testing with some modifications, but despite some progress, problematic nodes at the base of the eukaryotic tree still remain to be statistically resolved. Looking forward, subsequent transformations to our understanding of the diversity of life will be from the discovery of novel lineages in previously under-sampled areas and from environmental genomic information.

  4. Hazard classification of chemicals inducing haemolytic anaemia: An EU regulatory perspective.

    PubMed

    Muller, Andre; Jacobsen, Helene; Healy, Edel; McMickan, Sinead; Istace, Fréderique; Blaude, Marie-Noëlle; Howden, Peter; Fleig, Helmut; Schulte, Agnes

    2006-08-01

    Haemolytic anaemia is often induced following prolonged exposure to chemical substances. Currently, under EU Council Directive 67/548/EEC, substances which induce such effects are classified as dangerous and assigned the risk phrase R48 'Danger of serious damage to health by prolonged exposure.' Whilst the general classification criteria for this endpoint are outlined in Annex VI of this Directive, they do not provide specific information to assess haemolytic anaemia. This review produced by the EU Working Group on Haemolytic Anaemia provides a toxicological assessment of haemolytic anaemia and proposes criteria that can be used in the assessment for classification of substances which induce such effects. An overview of the primary and secondary effects of haemolytic anaemia which can occur in rodent repeated dose toxicity studies is given. A detailed analysis of the toxicological significance of such effects is then performed and correlated with the general classification criteria used for this endpoint. This review intends to give guidance when carrying out an assessment for classification for this endpoint and to allow for better transparency in the decision-making process on when to classify based on the presence of haemolytic anaemia in repeated dose toxicity studies. The extended classification criteria for haemolytic anaemia outlined in this review were accepted by the EU Commission Working Group on the Classification and Labelling of Dangerous Substances in September 2004.

  5. The diagnosis and classification of the cryoglobulinemic syndrome.

    PubMed

    Damoiseaux, Jan

    2014-01-01

    Cryoglobulinemia refers to the presence of reversible precipitation of immunoglobulins in the blood upon exposure to reduced (body) temperature. In clinical practice, the detection and typing of cryoglobulins appear to be poorly standardized. A consensus protocol for detection and typing of cryoglobulins, as extracted from the literature, is presented. Items that require further standardization are discussed. Cryoglobulins may cause clinical symptoms due to either obstruction of the small blood vessels, in particular in the extremities that are more exposed to the cold, or vascular inflammation due to the deposition of immune complexes. Only for the latter situation, i.e. cryoglobulinemic vasculitis in the presence of mixed-type cryoglobulinemia, preliminary classification criteria have been defined. Besides the presence of cryoglobulins, at least two of three items have to be present. These items comprise a restricted set of self-reported clinical history findings, typical clinical manifestations, and specified laboratory findings. These classification criteria await further validation in an independent patient cohort.

  6. Accurate classification of 29 objects detected in the 39 month Palermo Swift/BAT hard X-ray catalogue

    NASA Astrophysics Data System (ADS)

    Parisi, P.; Masetti, N.; Jiménez-Bailón, E.; Chavushyan, V.; Palazzi, E.; Landi, R.; Malizia, A.; Bassani, L.; Bazzano, A.; Bird, A. J.; Charles, P. A.; Galaz, G.; Mason, E.; McBride, V. A.; Minniti, D.; Morelli, L.; Schiavone, F.; Ubertini, P.

    2012-09-01

    Through an optical campaign performed at four telescopes located in the northern and the southern hemispheres, plus archival data from two on-line sky surveys, we obtained optical spectroscopy for 29 counterparts of unclassified or poorly studied hard X-ray emitting objects detected with the Swift/Burst Alert Telescope (BAT) and listed in the 39 month Palermo catalogue. All these objects also have observations taken with the Swift/X-ray Telescope (XRT) or XMM-European Photon Imaging Camera (EPIC) which not only allow us to pinpoint their optical counterpart, but also to study their X-ray spectral properties (column density, power-law photon index, and 2-10 keV flux). We find that 28 sources in our sample are active galactic nuclei (AGNs); 7 are classified as type 1, while 21 are of type 2; the remaining object is a Galactic cataclysmic variable. Among our type 1 AGNs, we find 5 objects of intermediate Seyfert type (1.2-1.9) and one narrow-line Seyfert 1 galaxy; for 4 out of 7 sources, we are able to estimate the central black hole mass. Three of the type 2 AGNs of our sample display optical features typical of low-ionization nuclear emission-line regions (LINERs) and one is a likely Compton-thick AGN. All galaxies classified in this work are relatively nearby objects since their redshifts lie in the range 0.008-0.075; the only Galactic object found lies at an estimated distance of 90 pc. We also investigate the optical versus X-ray emission ratio of the galaxies of our sample to test the AGN unified model. For these galaxies, we also compare the X-ray absorption (caused by gas) with the optical reddening (caused by dust): we find that for most of our sources, specifically those of type 1.9-2.0, the former is higher than the latter, confirming earlier results of Maiolino and collaborators; this is possibly due to the properties of dust in the circumnuclear obscuring torus of the AGN. Based on observations obtained from the following observatories: the Astronomical Observatory of Bologna in Loiano (Italy), ESO-La Silla Observatory (Chile) under programme 083.D-0110, Observatorio Astronómico Nacional (San Pedro Mártir, Mexico), and the South African Astronomical Observatory (South Africa). The spectra are only available at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/545/A101

  7. DEMONSTRATION OF HUMAN EXPOSURE TOOLS

    EPA Science Inventory

    The Human Exposure and Atmospheric Sciences Division (HEASD) of the National Exposure Research Laboratory (NERL) conducts research on exposure measurements, human activity patterns, exposure and dose models, and cumulative exposures critical for the Agency to make scientificall...

  8. A classification of ecological boundaries

    USGS Publications Warehouse

    Strayer, D.L.; Power, M.E.; Fagan, W.F.; Pickett, S.T.A.; Belnap, J.

    2003-01-01

    Ecologists use the term boundary to refer to a wide range of real and conceptual structures. Because imprecise terminology may impede the search for general patterns and theories about ecological boundaries, we present a classification of the attributes of ecological boundaries to aid in communication and theory development. Ecological boundaries may differ in their origin and maintenance, their spatial structure, their function, and their temporal dynamics. A classification system based on these attributes should help ecologists determine whether boundaries are truly comparable. This system can be applied when comparing empirical studies, comparing theories, and testing theoretical predictions against empirical results.

  9. Deep learning for image classification

    NASA Astrophysics Data System (ADS)

    McCoppin, Ryan; Rizki, Mateen

    2014-06-01

    This paper provides an overview of deep learning and introduces several of its subfields, including a tutorial on convolutional neural networks. Traditional methods for learning image features are compared to deep learning techniques. In addition, we present our preliminary classification results, our basic implementation of a convolutional restricted Boltzmann machine on the Modified National Institute of Standards and Technology database (MNIST), and we explain how to use deep learning networks to assist in our development of a robust gender classification system.
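
    To make the record's topic concrete, below is a minimal sketch of a small convolutional network for MNIST-shaped input (28x28 grayscale), written in PyTorch. The architecture, layer sizes, and the random-tensor forward pass are illustrative assumptions, not the implementation described in the paper (which uses a convolutional restricted Boltzmann machine).

```python
# Minimal sketch of a small convolutional network for 28x28 grayscale
# digits (MNIST-shaped input); the architecture is illustrative only.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, n_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 28x28 -> 28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 7x7
        )
        self.classifier = nn.Linear(32 * 7 * 7, n_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Forward pass on a random batch shaped like MNIST (N, 1, 28, 28).
logits = SmallCNN()(torch.randn(8, 1, 28, 28))
print(logits.shape)  # torch.Size([8, 10])
```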

  10. 78 FR 54970 - Cotton Futures Classification: Optional Classification Procedure

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-09

    ... submitted for quality verification meet more restrictive quality requirements and age parameters set by ICE... "registration" by the U.S. cotton industry and the Intercontinental Exchange (ICE). In response to requests from the U.S. cotton industry and ICE, AMS proposes to offer a futures classification option...

  11. Comparison of linear, nonlinear, and feature selection methods for EEG signal classification.

    PubMed

    Garrett, Deon; Peterson, David A; Anderson, Charles W; Thaut, Michael H

    2003-06-01

    The reliable operation of brain-computer interfaces (BCIs) based on spontaneous electroencephalogram (EEG) signals requires accurate classification of multichannel EEG. The design of EEG representations and classifiers for BCI remains an open research question whose difficulty stems from the need to extract complex spatial and temporal patterns from noisy multidimensional time series obtained from EEG measurements. The high-dimensional and noisy nature of EEG may limit the advantage of nonlinear classification methods over linear ones. This paper reports the results of a linear classifier (linear discriminant analysis) and two nonlinear classifiers (neural networks and support vector machines) applied to the classification of spontaneous EEG during five mental tasks, showing that nonlinear classifiers produce only slightly better classification results. An approach to feature selection based on genetic algorithms is also presented, with preliminary results of application to EEG during finger movement.
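
    A minimal sketch of the linear-versus-nonlinear comparison, assuming scikit-learn and a synthetic, noisy, high-dimensional dataset standing in for multichannel EEG features; the classifier settings are placeholders rather than the paper's configuration.

```python
# Sketch comparing a linear classifier (LDA) with two nonlinear ones
# (MLP, SVM) on synthetic noisy high-dimensional data standing in for
# multichannel EEG features; dataset and settings are assumptions.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=60, n_informative=10,
                           n_classes=5, flip_y=0.1, random_state=0)

for name, clf in [
    ("LDA", LinearDiscriminantAnalysis()),
    ("MLP", MLPClassifier(hidden_layer_sizes=(50,), max_iter=2000, random_state=0)),
    ("SVM", SVC(kernel="rbf")),
]:
    pipe = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f}")
```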

  12. A comparative study on manifold learning of hyperspectral data for land cover classification

    NASA Astrophysics Data System (ADS)

    Ozturk, Ceyda Nur; Bilgin, Gokhan

    2015-03-01

    This paper focuses on the land cover classification problem by employing a number of manifold learning algorithms in the feature extraction phase, then by running single and ensemble classifiers in the modeling phase. Manifolds are learned on training samples selected randomly within the available data, while the transformation of the remaining test samples is realized, for linear and nonlinear methods respectively, via the learnt mappings and a radial-basis function neural network based interpolation method. The classification accuracies of the original data and the embedded manifolds are investigated with several classifiers. Experimental results on a 200-band hyperspectral image indicated that the support vector machine was the best classifier for most of the methods, being nearly as accurate as the best classification rate of the original data. Furthermore, our modified version of the random subspace classifier could even outperform the classification accuracy of the original data for the local Fisher's discriminant analysis method, despite a considerable decrease in the extrinsic dimension.
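
    The train/transform pattern described above can be sketched as follows, assuming scikit-learn; the digits dataset stands in for hyperspectral pixels, and Isomap is just one of several manifold learners a study like this could employ.

```python
# Sketch of the train/transform pattern: learn a manifold embedding on
# training samples, map held-out samples via the learnt mapping, and
# compare classifier accuracy against the original features. The digits
# dataset stands in for hyperspectral pixels (an assumption).
from sklearn.datasets import load_digits
from sklearn.manifold import Isomap
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

svm = SVC().fit(X_tr, y_tr)
print("original features:", svm.score(X_te, y_te))

iso = Isomap(n_components=10).fit(X_tr)       # manifold learnt on training data
svm_iso = SVC().fit(iso.transform(X_tr), y_tr)
print("Isomap embedding: ", svm_iso.score(iso.transform(X_te), y_te))
```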

  13. Joint Hierarchical Category Structure Learning and Large-Scale Image Classification.

    PubMed

    Qu, Yanyun; Lin, Li; Shen, Fumin; Lu, Chang; Wu, Yang; Xie, Yuan; Tao, Dacheng

    2016-10-05

    We investigate the scalable image classification problem with a large number of categories. Hierarchical visual data structures are helpful for improving the efficiency and performance of large-scale multi-class classification. We propose a novel image classification method based on learning hierarchical interclass structures. Specifically, we first design a fast algorithm to compute the similarity metric between categories, based on which a visual tree is constructed by hierarchical spectral clustering. Using the learned visual tree, a test sample label is efficiently predicted by searching for the best path over the entire tree. The proposed method is extensively evaluated on the ILSVRC2010 and Caltech 256 benchmark datasets. Experimental results show that our method obtains significantly better category hierarchies than other state-of-the-art visual tree-based methods and, therefore, much more accurate classification.
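
    A hedged sketch of the tree-building step: categories are recursively split in two by spectral clustering on a precomputed similarity matrix. The RBF similarity between random category prototypes is an assumption; the paper's similarity metric and clustering details may differ.

```python
# Sketch of building a binary "visual tree" by recursively splitting the
# set of categories with spectral clustering on a precomputed similarity
# matrix between category prototypes; the similarity measure (RBF kernel
# on class means) is an assumption.
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.metrics.pairwise import rbf_kernel

def build_tree(categories, sim):
    if len(categories) <= 2:
        return list(categories)
    labels = SpectralClustering(n_clusters=2,
                                affinity="precomputed",
                                random_state=0).fit_predict(sim)
    if len(set(labels)) < 2:                  # degenerate split: stop here
        return list(categories)
    left = [c for c, l in zip(categories, labels) if l == 0]
    right = [c for c, l in zip(categories, labels) if l == 1]
    idx = {c: i for i, c in enumerate(categories)}
    sub = lambda grp: sim[np.ix_([idx[c] for c in grp], [idx[c] for c in grp])]
    return [build_tree(left, sub(left)), build_tree(right, sub(right))]

rng = np.random.default_rng(0)
means = rng.normal(size=(8, 16))              # one prototype per category
tree = build_tree(list(range(8)), rbf_kernel(means))
print(tree)
```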

  14. Marker-Based Hierarchical Segmentation and Classification Approach for Hyperspectral Imagery

    NASA Technical Reports Server (NTRS)

    Tarabalka, Yuliya; Tilton, James C.; Benediktsson, Jon Atli; Chanussot, Jocelyn

    2011-01-01

    The Hierarchical SEGmentation (HSEG) algorithm, which is a combination of hierarchical step-wise optimization and spectral clustering, has given good performance for hyperspectral image analysis. This technique produces at its output a hierarchical set of image segmentations. The automated selection of a single segmentation level is often necessary. We propose and investigate the use of automatically selected markers for this purpose. In this paper, a novel Marker-based HSEG (M-HSEG) method for spectral-spatial classification of hyperspectral images is proposed. First, pixelwise classification is performed and the most reliably classified pixels are selected as markers, with the corresponding class labels. Then, a novel constrained marker-based HSEG algorithm is applied, resulting in a spectral-spatial classification map. The experimental results show that the proposed approach yields accurate segmentation and classification maps, and thus is attractive for hyperspectral image analysis.
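
    The marker-selection step can be illustrated with a short sketch: keep as markers only the pixels whose maximum posterior probability exceeds a confidence threshold. The 0.95 threshold and the mock posteriors below are assumptions.

```python
# Sketch of the marker-selection step: run a pixelwise classifier that
# outputs class probabilities, then keep as markers only the pixels whose
# maximum posterior exceeds a confidence threshold (threshold value is an
# assumption).
import numpy as np

def select_markers(proba, threshold=0.95):
    """proba: (n_pixels, n_classes) posterior probabilities."""
    best = proba.max(axis=1)
    labels = proba.argmax(axis=1)
    markers = np.where(best >= threshold, labels, -1)  # -1 = not a marker
    return markers

rng = np.random.default_rng(0)
proba = rng.dirichlet(alpha=[0.3] * 4, size=10)   # mock posteriors, 4 classes
print(select_markers(proba))
```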

  15. EXPOSURE FACTORS RESEARCH | Science Inventory | US ...

    EPA Pesticide Factsheets

    The Exposure Factors Handbook provides a summary of the available statistical data on various factors used in assessing human exposure including drinking water consumption, soil ingestion rates, inhalation rates, dermal factors including skin area and soil adherence factors, consumption of fruits and vegetables, fish, meats, dairy products, homegrown foods, breast milk intake, human activity factors, consumer product use, and residential characteristics. The development of the Exposure Factors Handbook (EPA/600/P-95/002Fa-Fc) has brought to light the importance of the availability of the most up-to-date and accurate data on exposure factors used in assessing exposure to contaminants in the environment. During the preparation of the Exposure Factors Handbook, many data gaps were identified. In order to fill those data gaps, further research is being conducted. Results from this research will assist in keeping the Handbook updated. There are 8 tasks under this project area: 1. Analysis of food consumption using the 1994-96 CSFII USDA food consumption survey to update the food consumption chapter in the EFH. 2. Chemical and statistical analysis of data from a soil ingestion study in Washington state to develop better estimates of soil ingestion by both adults and children. 3. Develop methodology to fit distributions to the data for selected factors in the Exposure Factors Handbook to be used in probabilistic assessments. 4. Conduct analysis of fat intake data using th

  16. Comparison of two Classification methods (MLC and SVM) to extract land use and land cover in Johor Malaysia

    NASA Astrophysics Data System (ADS)

    Rokni Deilmai, B.; Ahmad, B. Bin; Zabihi, H.

    2014-06-01

    Mapping is essential for the analysis of land use and land cover, which influence many environmental processes and properties. For the creation of land cover maps, it is important to minimize error, because errors will propagate into later analyses based on these maps. The reliability of land cover maps derived from remotely sensed data depends on an accurate classification. In this study, we analyzed multispectral data using two different classifiers, a Maximum Likelihood Classifier (MLC) and a Support Vector Machine (SVM). To pursue this aim, Landsat Thematic Mapper data and identical field-based training sample datasets in Johor, Malaysia were used for each classification method, yielding five land cover classes: forest, oil palm, urban area, water, and rubber. Classification results indicate that SVM was more accurate than MLC. With demonstrated capability to produce reliable cover results, SVM methods should be especially useful for land cover classification.
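
    A sketch of the comparison, assuming scikit-learn: per-class Gaussian maximum-likelihood classification is emulated with quadratic discriminant analysis under equal priors, and the synthetic five-class data stands in for the Landsat pixels.

```python
# Sketch comparing a Gaussian maximum-likelihood classifier (QDA with
# equal priors behaves as per-class Gaussian ML) against an SVM; the
# synthetic five-class data stands in for Landsat TM pixels.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=1000, n_features=6, n_informative=5,
                           n_redundant=0, n_classes=5, random_state=0)

mlc = QuadraticDiscriminantAnalysis(priors=[0.2] * 5)   # per-class Gaussian ML
svm = SVC(kernel="rbf")
print("MLC:", cross_val_score(mlc, X, y, cv=5).mean())
print("SVM:", cross_val_score(svm, X, y, cv=5).mean())
```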

  17. [Algorithm for assessment of exposure to asbestos].

    PubMed

    Martines, V; Fioravanti, M; Anselmi, A; Attili, F; Battaglia, D; Cerratti, D; Ciarrocca, M; D'Amelio, R; De Lorenzo, G; Ferrante, E; Gaudioso, F; Mascia, E; Rauccio, A; Siena, S; Palitti, T; Tucci, L; Vacca, D; Vigliano, R; Zelano, V; Tomei, F; Sancini, A

    2010-01-01

    There is no universally approved method in the scientific literature to identify subjects exposed to asbestos and divide them into classes according to intensity of exposure. The aim of our work is to develop an algorithm based on occupational anamnestic information provided by a large group of workers. The algorithm discriminates, in a probabilistic way, the risk of exposure by attributing a code to each worker (ELSA code--work-estimated exposure to asbestos). The ELSA code has been obtained through a synthesis of the information that the international scientific literature identifies as most predictive for the onset of asbestos-related abnormalities. Four dimensions are analyzed and described: 1) present and/or past occupation; 2) type of materials and equipment used in performing the working activity; 3) environment where these activities are carried out; 4) period of time when the activities are performed. Although the information is collected subjectively, the decision procedure is objective and is based on a systematic evaluation of asbestos exposure. From the combination of the four identified dimensions it is possible to obtain 108 ELSA codes, divided into three typological profiles of estimated exposure risk. The application of the algorithm offers some advantages compared with other methods used for identifying individuals exposed to asbestos: 1) it can be computed for both present and past exposure to asbestos; 2) the classification of workers exposed to asbestos using the ELSA code is more detailed than that obtained with a Job Exposure Matrix (JEM), because the ELSA code takes into account other risk indicators besides those considered in the JEM. This algorithm was developed for a project sponsored by the Italian Armed Forces and is also adaptable to other work settings in which it may be necessary to assess the risk of asbestos exposure.
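
    A purely hypothetical sketch of an ELSA-like scoring scheme follows; the ordinal scales and cut-offs are invented for illustration and are not the published algorithm.

```python
# Hypothetical sketch of an ELSA-like code: each of the four dimensions
# (occupation, materials, environment, period) is scored on an ordinal
# scale and the combination is mapped to one of three exposure profiles.
# All scales and cut-offs below are invented for illustration.
def elsa_like_code(occupation, materials, environment, period):
    """Each argument is an ordinal score 0 (no risk) .. 2 (high risk)."""
    code = (occupation, materials, environment, period)
    total = sum(code)
    if total >= 6:
        profile = "high estimated exposure"
    elif total >= 3:
        profile = "intermediate estimated exposure"
    else:
        profile = "low estimated exposure"
    return code, profile

print(elsa_like_code(occupation=2, materials=1, environment=2, period=1))
# ((2, 1, 2, 1), 'high estimated exposure')
```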

  18. A New Item Selection Procedure for Mixed Item Type in Computerized Classification Testing.

    ERIC Educational Resources Information Center

    Lau, C. Allen; Wang, Tianyou

    This paper proposes a new Information-Time index as the basis for item selection in computerized classification testing (CCT) and investigates how this new item selection algorithm can help improve test efficiency for item pools with mixed item types. It also investigates how practical constraints such as item exposure rate control, test…

  19. Raster Vs. Point Cloud LiDAR Data Classification

    NASA Astrophysics Data System (ADS)

    El-Ashmawy, N.; Shaker, A.

    2014-09-01

    Airborne Laser Scanning systems with light detection and ranging (LiDAR) technology are among the fast and accurate 3D point data acquisition techniques. Generating accurate digital terrain and/or surface models (DTM/DSM) is the main application of collecting LiDAR range data. Recently, LiDAR range and intensity data have been used for land cover classification applications. Range and intensity data (the strength of the backscattered signals measured by the LiDAR system) are affected by the flying height, the ground elevation, the scanning angle and the physical characteristics of the object surfaces. These effects may lead to an uneven distribution of the point cloud or to gaps that may affect the classification process. Researchers have investigated the conversion of LiDAR range point data to raster images for terrain modelling. Interpolation techniques have been used to achieve the best representation of surfaces and to fill the gaps between the LiDAR footprints. Interpolation methods have also been investigated for generating LiDAR range and intensity image data for land cover classification applications. In this paper, a different approach is followed to classify the LiDAR data (range and intensity) for land cover mapping. The methodology relies on classifying the point cloud data based on range and intensity and then converting the classified points into a raster image. The gaps in the data are filled based on the classes of the nearest neighbour. Land cover maps are produced using two approaches: (a) the conventional raster image data based on point interpolation; and (b) the proposed point data classification. A study area covering an urban district in Burnaby, British Columbia, Canada, is selected to compare the results of the two approaches. Five different land cover classes can be distinguished in that area: buildings, roads and parking areas, trees, low vegetation (grass), and bare soil. The results show that an improvement of around 10 % in the
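
    The classify-then-rasterize idea can be sketched as follows, assuming NumPy/SciPy; the intensity thresholds, grid resolution, and toy point cloud are placeholders.

```python
# Sketch of the classify-then-rasterize idea: label each LiDAR point from
# its intensity attribute, then fill every raster cell with the class of
# the nearest classified point; thresholds and grid size are assumptions.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(2000, 2))          # point positions (m)
intensity = rng.uniform(0, 1, size=2000)

# Toy pointwise classification from intensity alone.
classes = np.digitize(intensity, bins=[0.33, 0.66])  # labels 0, 1, 2

# Rasterize: nearest-neighbour fill assigns gaps the nearest point's class.
gx, gy = np.mgrid[0:100:1.0, 0:100:1.0]
raster = griddata(xy, classes, (gx, gy), method="nearest")
print(raster.shape)  # (100, 100)
```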

  20. Extraction and classification of objects in multispectral images

    NASA Technical Reports Server (NTRS)

    Robertson, T. V.

    1973-01-01

    Presented here is an algorithm that partitions a digitized multispectral image into parts that correspond to objects in the scene being sensed. The algorithm partitions an image into successively smaller rectangles and produces a partition that tends to minimize a criterion function. Supervised and unsupervised classification techniques can be applied to partitioned images. This partition-then-classify approach is used to process images sensed from aircraft and the ERTS-1 satellite, and the method is shown to give relatively accurate results in classifying agricultural areas and extracting urban areas.
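
    A quadtree-style sketch of the partition idea follows; note that the paper's algorithm splits an image into successively smaller rectangles so as to minimize a criterion function, whereas this stand-in simply stops splitting when a per-block variance threshold is met.

```python
# Sketch of partition-then-classify: recursively split an image into
# rectangles while a homogeneity criterion (here, block variance) stays
# above a threshold; the criterion and threshold are assumptions.
import numpy as np

def partition(img, r0, r1, c0, c1, out, max_var=0.02, min_size=4):
    block = img[r0:r1, c0:c1]
    if block.var() <= max_var or min(r1 - r0, c1 - c0) <= min_size:
        out.append((r0, r1, c0, c1))          # homogeneous: keep rectangle
        return
    rm, cm = (r0 + r1) // 2, (c0 + c1) // 2   # split into four rectangles
    for rr in ((r0, rm), (rm, r1)):
        for cc in ((c0, cm), (cm, c1)):
            partition(img, rr[0], rr[1], cc[0], cc[1], out, max_var, min_size)

rng = np.random.default_rng(0)
img = rng.random((64, 64))
img[:32, :32] = 0.5                            # one homogeneous "field"
rects = []
partition(img, 0, 64, 0, 64, rects)
print(len(rects), "rectangles")
```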

  1. Evaluation of AMOEBA: a spectral-spatial classification method

    USGS Publications Warehouse

    Jenson, Susan K.; Loveland, Thomas R.; Bryant, J.

    1982-01-01

    Multispectral remotely sensed images have been treated as arbitrary multivariate spectral data for purposes of clustering and classifying. However, the spatial properties of image data can also be exploited. AMOEBA is a clustering and classification method that is based on a spatially derived model for image data. In an evaluation test, Landsat data were classified with both AMOEBA and a widely used spectral classifier. The test showed that irrigated crop types can be classified as accurately with the AMOEBA method as with the generally used spectral method ISOCLS; the AMOEBA method, however, requires less computer time.

  2. Supervised ensemble classification of Kepler variable stars

    NASA Astrophysics Data System (ADS)

    Bass, G.; Borne, K.

    2016-07-01

    Variable star analysis and classification is an important task in the understanding of stellar features and processes. While historically classifications have been done manually by highly skilled experts, the recent and rapid expansion in the quantity and quality of data has demanded new techniques, most notably automatic classification through supervised machine learning. We present an expansion of existing work in the field by analysing variable stars in the Kepler field using an ensemble approach, combining multiple characterization and classification techniques to produce improved classification rates. Classifications for each of the roughly 150 000 stars observed by Kepler are produced, separating the stars into one of 14 variable star classes.
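
    A minimal sketch of a supervised ensemble over extracted light-curve features, assuming scikit-learn; the soft-voting combination and the synthetic 14-class features are stand-ins for the paper's characterization pipeline.

```python
# Sketch of an ensemble classifier over extracted light-curve features
# (period, amplitude, etc.); the features and the soft-voting ensemble
# are assumptions standing in for the paper's combination scheme.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=2000, n_features=20, n_informative=12,
                           n_classes=14, n_clusters_per_class=1, random_state=0)

ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("lr", LogisticRegression(max_iter=5000)),
                ("knn", KNeighborsClassifier())],
    voting="soft",
)
print(cross_val_score(ensemble, X, y, cv=5).mean())
```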

  3. Classification of forest land attributes using multi-source remotely sensed data

    NASA Astrophysics Data System (ADS)

    Pippuri, Inka; Suvanto, Aki; Maltamo, Matti; Korhonen, Kari T.; Pitkänen, Juho; Packalen, Petteri

    2016-02-01

    The aim of the study was to (1) examine the classification of forest land using airborne laser scanning (ALS) data, satellite images and sample plots of the Finnish National Forest Inventory (NFI) as training data and (2) identify the best performing metrics for classifying forest land attributes. Six different schemes of forest land classification were studied: land use/land cover (LU/LC) classification using both national classes and FAO (Food and Agriculture Organization of the United Nations) classes, main type, site type, peat land type and drainage status. Of special interest was testing different ALS-based surface metrics in the classification of forest land attributes. Field data consisted of 828 NFI plots collected in 2008-2012 in southern Finland, and remotely sensed data were from summer 2010. Multinomial logistic regression was used as the classification method. Classification of LU/LC classes was highly accurate (kappa values 0.90 and 0.91), but the classification of site type, peat land type and drainage status also succeeded moderately well (kappa values 0.51, 0.69 and 0.52). ALS-based surface metrics were found to be the most important predictor variables in the classification of LU/LC class, main type and drainage status. In the best classification models of forest site types, both spectral metrics from satellite data and point cloud metrics from ALS were used. In turn, in the classification of peat land types, ALS point cloud metrics played the most important role. Results indicated that the prediction of site type and forest land category could be incorporated into the stand level forest management inventory system in Finland.
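
    The classification step can be sketched with scikit-learn's multinomial logistic regression and a kappa score, using synthetic predictors as a stand-in for the ALS and spectral metrics.

```python
# Sketch of the classification step: multinomial logistic regression over
# mixed ALS and spectral predictors; the synthetic predictors stand in
# for the NFI plot metrics (an assumption).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import cohen_kappa_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=828, n_features=15, n_informative=8,
                           n_classes=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# The default lbfgs solver fits a multinomial loss for multiclass targets.
clf = LogisticRegression(max_iter=5000).fit(X_tr, y_tr)
print("kappa:", cohen_kappa_score(y_te, clf.predict(X_te)))
```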

  4. Indoor transformer stations and ELF magnetic field exposure: use of transformer structural characteristics to improve exposure assessment.

    PubMed

    Okokon, Enembe Oku; Roivainen, Päivi; Kheifets, Leeka; Mezei, Gabor; Juutilainen, Jukka

    2014-01-01

    Previous studies have shown that populations of multiapartment buildings with indoor transformer stations may serve as a basis for improved epidemiological studies on the relationship between childhood leukaemia and extremely-low-frequency (ELF) magnetic fields (MFs). This study investigated whether classification based on structural characteristics of the transformer stations would improve ELF MF exposure assessment. The data included MF measurements in apartments directly above transformer stations ("exposed" apartments) in 30 buildings in Finland, and reference apartments in the same buildings. Transformer structural characteristics (type and location of low-voltage conductors) were used to classify exposed apartments into high-exposure (HE) and intermediate-exposure (IE) categories. An exposure gradient was observed: both the time-average MF and time above a threshold (0.4 μT) were highest in the HE apartments and lowest in the reference apartments, showing a statistically significant trend. The differences between HE and IE apartments, however, were not statistically significant. A simulation exercise showed that the three-category classification did not perform better than a two-category classification (exposed and reference apartments) in detecting the existence of an increased risk. However, data on the structural characteristics of transformers is potentially useful for evaluating exposure-response relationship.

  5. Ranking of predictor variables based on effect size criterion provides an accurate means of automatically classifying opinion column articles

    NASA Astrophysics Data System (ADS)

    Legara, Erika Fille; Monterola, Christopher; Abundo, Cheryl

    2011-01-01

    We demonstrate an accurate procedure based on linear discriminant analysis that allows automatic authorship classification of opinion column articles. First, we extract the following stylometric features of 157 column articles from four authors: statistics on high-frequency words, the number of words per sentence, and the number of sentences per paragraph. Then, by systematically ranking these features based on an effect size criterion, we show that we can achieve an average classification accuracy of 93% for the test set. In comparison, frequency-size-based ranking has an average accuracy of 80%. By contrast, the average classification accuracy achievable on our data by chance alone is ∼31%. By carrying out sensitivity analysis, we show that the effect size criterion is superior to frequency ranking because there exist low-frequency words that significantly contribute to successful author discrimination. Consistent results are seen when the procedure is applied to classifying the undisputed Federalist papers of Alexander Hamilton and James Madison. To the best of our knowledge, this work is the first attempt at classifying opinion column articles, which, by virtue of being shorter (as compared to novels or short stories), are more prone to over-fitting issues. The near-perfect classification for the longer papers supports this claim. Our results provide an important insight on authorship attribution that has been overlooked in previous studies: that ranking discriminant variables based on word frequency counts is not necessarily an optimal procedure.
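
    A sketch of effect-size-based feature ranking followed by linear discriminant analysis; eta squared from a one-way ANOVA is used here as the effect-size criterion, and the synthetic features stand in for the stylometric statistics (both are assumptions about the paper's exact choices).

```python
# Sketch of effect-size-based feature ranking: score each stylometric
# feature by a one-way ANOVA effect size (eta squared) across authors,
# keep the top-k, and classify with linear discriminant analysis.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=157, n_features=50, n_informative=8,
                           n_classes=4, n_clusters_per_class=1, random_state=0)

def eta_squared(x, y):
    grand = x.mean()
    ss_between = sum(len(x[y == g]) * (x[y == g].mean() - grand) ** 2
                     for g in np.unique(y))
    ss_total = ((x - grand) ** 2).sum()
    return ss_between / ss_total

scores = np.array([eta_squared(X[:, j], y) for j in range(X.shape[1])])
top = np.argsort(scores)[::-1][:10]           # top-10 features by effect size
print(cross_val_score(LinearDiscriminantAnalysis(), X[:, top], y, cv=5).mean())
```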

  6. Classification of luminaire color using CCDs

    NASA Astrophysics Data System (ADS)

    McMenemy, Karen; Niblock, James

    2006-02-01

    The International Civil Aviation Organization (ICAO) is the regulatory body for airports. ICAO standards dictate that luminaires used within an airport landing lighting pattern must have a color as defined within the 1931 color chart of the Commission Internationale de l'Eclairage (CIE). Currently, visual checks are used to ensure luminaires are operating at the right color within the pattern. That is, during an approach to an airport the pilot must visually check each luminaire within the landing pattern. These visual tests are combined with on-the-spot meter reading tests. This method is not accurate, and it is impossible to assess the color of every luminaire. This paper presents a novel, automated method for assessing the color of luminaires using low-cost single-chip CCD video camera technology. Images taken from a camera placed within an aircraft cockpit during a normal approach to an airport are post-processed to determine the color of each luminaire in the pattern. In essence, the pixel coverage and total RGB level for each luminaire within the pattern must be extracted and tracked throughout the complete image sequence, and an average RGB value is used to predict the luminaire color. This prediction is based on a novel pixel model derived to determine the minimum pixel coverage required to accurately predict the color of an imaged luminaire. An analysis of how many pixels are required for color recognition and position within a CIE color chart is given and proved empirically. From the analysis it is found that a minimum diameter of four pixels is required for color recognition of the major luminaire types within the airport landing pattern. The number of pixels required for classification of the color is then derived. This is important as the luminaires are far away when imaged and may cover only a few pixels, since a large area must be viewed by the camera. The classification is then verified by laboratory-based experiments with different

  7. Influence of nuclei segmentation on breast cancer malignancy classification

    NASA Astrophysics Data System (ADS)

    Jelen, Lukasz; Fevens, Thomas; Krzyzak, Adam

    2009-02-01

    Breast cancer is one of the most deadly cancers affecting middle-aged women. Accurate diagnosis and prognosis are crucial to reduce the high death rate. Nowadays there are numerous diagnostic tools for breast cancer diagnosis. In this paper we discuss the role of nuclei segmentation from fine needle aspiration biopsy (FNA) slides and its influence on malignancy classification. Classification of malignancy plays a very important role during the diagnosis process of breast cancer. Out of all cancer diagnostic tools, FNA slides provide the most valuable information about the cancer malignancy grade, which helps to choose an appropriate treatment. This process involves assessing numerous nuclear features, and therefore precise segmentation of nuclei is very important. In this work we compare three powerful segmentation approaches and test their impact on the classification of breast cancer malignancy. The studied approaches involve level set segmentation, fuzzy c-means segmentation and textural segmentation based on the co-occurrence matrix. Segmented nuclei were used to extract nuclear features for malignancy classification. For classification purposes, four different classifiers were trained and tested with the previously extracted features. The compared classifiers are the Multilayer Perceptron (MLP), Self-Organizing Maps (SOM), Principal Component-based Neural Network (PCA) and Support Vector Machines (SVM). The presented results show that level set segmentation yields the best results of the three compared approaches and leads to good feature extraction, with the lowest average error rate of 6.51% over the four different classifiers. The best performance was recorded for the multilayer perceptron, with an error rate of 3.07% using fuzzy c-means segmentation.

  8. Classification of Birds and Bats Using Flight Tracks

    SciTech Connect

    Cullinan, Valerie I.; Matzner, Shari; Duberstein, Corey A.

    2015-05-01

    Classification of birds and bats that use areas targeted for offshore wind farm development, and the inference of their behavior, are essential to evaluating the potential effects of development. The current approach to assessing the number and distribution of birds at sea involves transect surveys using trained individuals in boats or airplanes or using high-resolution imagery. These approaches are costly and have safety concerns. Based on a limited annotated library extracted from a single-camera thermal video, we provide a framework for building models that classify birds and bats and their associated behaviors. As an example, we developed a discriminant model for theoretical flight paths and applied it to data (N = 64 tracks) extracted from 5-min video clips. The agreement between model- and observer-classified path types was initially only 41%, but it increased to 73% when small-scale jitter was censored and path types were combined. Classification of 46 tracks of bats, swallows, gulls, and terns on average was 82% accurate, based on a jackknife cross-validation. Model classification of bats and terns (N = 4 and 2, respectively) was 94% and 91% correct, respectively; however, the variance associated with the tracks from these targets is poorly estimated. Model classification of gulls and swallows (N ≥ 18) was on average 73% and 85% correct, respectively. The models developed here should be considered preliminary because they are based on a small data set both in terms of the numbers of species and the identified flight tracks. Future classification models would be greatly improved by including a measure of distance between the camera and the target.
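
    The discriminant model with jackknife validation can be sketched as follows, assuming scikit-learn; the synthetic track features are placeholders for the real flight-path descriptors.

```python
# Sketch of the jackknife (leave-one-out) validation of a discriminant
# model for flight-track features; the feature set and class counts are
# assumptions standing in for the real track descriptors.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

X, y = make_classification(n_samples=46, n_features=6, n_informative=4,
                           n_classes=4, n_clusters_per_class=1, random_state=0)

acc = cross_val_score(LinearDiscriminantAnalysis(), X, y,
                      cv=LeaveOneOut()).mean()
print(f"jackknife accuracy: {acc:.2f}")
```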

  9. Parkinson's disease classification using gait analysis via deterministic learning.

    PubMed

    Zeng, Wei; Liu, Fenglin; Wang, Qinghui; Wang, Ying; Ma, Limin; Zhang, Yu

    2016-10-28

    Gait analysis plays an important role in maintaining the well-being of human mobility and health care, and is a valuable tool for obtaining quantitative information on motor deficits in Parkinson's disease (PD). In this paper, we propose a method to classify (diagnose) patients with PD and healthy control subjects using gait analysis via deterministic learning theory. The classification approach consists of two phases: a training phase and a classification phase. In the training phase, gait characteristics represented by the gait dynamics are derived from the vertical ground reaction forces under the usual, self-selected paces of the subjects. The gait dynamics underlying the gait patterns of healthy controls and PD patients are locally and accurately approximated by radial basis function (RBF) neural networks, and the obtained knowledge of the approximated gait dynamics is stored in constant RBF networks. The gait patterns of healthy controls and PD patients constitute a training set. In the classification phase, a bank of dynamical estimators is constructed for all the training gait patterns, with prior knowledge of the gait dynamics represented by the constant RBF networks embedded in the estimators. By comparing the set of estimators with a test gait pattern of a certain PD patient to be classified (diagnosed), a set of classification errors is generated. The average L1 norms of the errors are taken as the classification measure between the dynamics of the training gait patterns and the dynamics of the test PD gait pattern according to the smallest-error principle. When the gait patterns of 93 PD patients and 73 healthy controls are classified with the five-fold cross-validation method, the accuracy, sensitivity and specificity of the results are 96.39%, 96.77% and 95.89%, respectively. Based on the results, it may be claimed that the features and the classifiers used in the present study could effectively separate the gait patterns between the groups of PD patients and healthy

  10. Youth activity spaces and daily exposure to tobacco outlets.

    PubMed

    Lipperman-Kreda, Sharon; Morrison, Christopher; Grube, Joel W; Gaidus, Andrew

    2015-07-01

    We explored whether exposure to tobacco outlets in youths' broader activity spaces differs from that obtained using traditional geographic measures of exposure to tobacco outlets within buffers around homes and schools. Youths completed an initial survey and daily text-prompted surveys, and carried GPS-enabled phones for one week. GPS locations were geocoded, and activity spaces were constructed by joining sequential points. We calculated the number of tobacco outlets around these polylines and around homes and schools. Results suggest that activity spaces provide a more accurate measure of tobacco outlet exposure than traditional measures. Assessing tobacco outlet exposure within activity spaces may yield significant information to advance the field.
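
    The contrast between the two exposure measures can be sketched with Shapely geometry operations; the coordinates, buffer radius, and outlet locations below are invented for illustration and assume a projected metre grid.

```python
# Sketch contrasting the two exposure measures: outlets within a buffer
# of the home point versus within a buffer of the whole GPS track
# (polyline); all coordinates and the 200 m radius are assumptions.
from shapely.geometry import LineString, Point

home = Point(0, 0)
track = LineString([(0, 0), (400, 250), (900, 300), (1200, 800)])  # one day
outlets = [Point(50, 40), Point(450, 260), Point(1100, 750), Point(-300, 900)]

home_zone = home.buffer(200)       # 200 m buffer around home
space_zone = track.buffer(200)     # 200 m buffer around the activity space

print("home exposure:", sum(o.within(home_zone) for o in outlets))
print("activity-space exposure:", sum(o.within(space_zone) for o in outlets))
```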

  11. Fusion of Hyperspectral and Vhr Multispectral Image Classifications in Urban Areas

    NASA Astrophysics Data System (ADS)

    Hervieu, Alexandre; Le Bris, Arnaud; Mallet, Clément

    2016-06-01

    An energy-based approach is proposed for classification decision fusion in urban areas using multispectral and hyperspectral imagery at distinct spatial resolutions. Hyperspectral data provides a great ability to discriminate land-cover classes, while multispectral data, usually at higher spatial resolution, makes possible a more accurate spatial delineation of the classes. Hence, the aim here is to achieve the most accurate classification maps by taking advantage of both data sources at the decision level: the spectral properties of the hyperspectral data and the geometrical resolution of the multispectral images. More specifically, the proposed method takes into account probability class membership maps in order to improve the classification fusion process. Such probability maps are available using standard classification techniques such as Random Forests or Support Vector Machines. Classification probability maps are integrated into an energy framework where minimization of a given energy leads to better classification maps. The energy is minimized using a graph-cut method called quadratic pseudo-boolean optimization (QPBO) with α-expansion. A first model is proposed that gives satisfactory results in terms of classification results and visual interpretation. This model is compared to a standard Potts model adapted to the considered problem. Finally, the model is enhanced by integrating the spatial contrast observed in the data source of higher spatial resolution (i.e., the multispectral image). Results obtained using the proposed energy-based decision fusion process are shown on two urban multispectral/hyperspectral datasets. An improvement of 2-3% is observed with respect to a Potts formulation and of 3-8% compared to a single hyperspectral-based classification.

  12. Global classification and coding of hypersensitivity diseases - An EAACI - WAO survey, strategic paper and review.

    PubMed

    Demoly, P; Tanno, L K; Akdis, C A; Lau, S; Calderon, M A; Santos, A F; Sanchez-Borges, M; Rosenwasser, L J; Pawankar, R; Papadopoulos, N G

    2014-05-01

    Hypersensitivity diseases are not adequately coded in the International Classification of Diseases (ICD)-10, resulting in misclassification, low visibility of these conditions, and reduced accuracy of official statistics. To call attention to the inadequacy of the ICD-10 in relation to allergic and hypersensitivity diseases, and to contribute to improvements in the forthcoming revision of the ICD, a web-based global survey of healthcare professionals' attitudes toward allergic disorders classification was proposed to the members of the European Academy of Allergy and Clinical Immunology (EAACI) (individuals) and the World Allergy Organization (WAO) (representatives responding on behalf of national societies), launched via the internet and circulated for 6 weeks. In all, 612 members from 144 countries across all six World Health Organization (WHO) global regions answered the survey. ICD-10 is the most used classification worldwide, but it was not considered appropriate for clinical practice by the majority of participants. The majority indicated the EAACI-WAO classification as being easier and more accurate in daily practice. They saw the need for a diagnostic system useful for nonallergists and endorsed the possibility of a global, cross-culturally applicable classification system of allergic disorders. This first and most broadly international survey ever conducted of health professionals' attitudes toward allergic disorders classification supports the need to update the current classifications of allergic diseases and can be useful to the WHO in improving the clinical utility of the classification and its global acceptability for the revised ICD-11.

  13. Learning ECOC Code Matrix for Multiclass Classification with Application to Glaucoma Diagnosis.

    PubMed

    Bai, Xiaolong; Niwas, Swamidoss Issac; Lin, Weisi; Ju, Bing-Feng; Kwoh, Chee Keong; Wang, Lipo; Sng, Chelvin C; Aquino, Maria C; Chew, Paul T K

    2016-04-01

    Classification of different mechanisms of angle closure glaucoma (ACG) is important for medical diagnosis. Error-correcting output code (ECOC) is an effective approach for multiclass classification. In this study, we propose a new ensemble learning method based on ECOC with application to classification of four ACG mechanisms. The dichotomizers in ECOC are first optimized individually to increase their accuracy and diversity (or interdependence), which is beneficial to the ECOC framework. Specifically, the best feature set is determined for each possible dichotomizer, and a wrapper approach is applied to evaluate the classification accuracy of each dichotomizer on the training dataset using cross-validation. The separability of the ECOC codes is maximized by selecting a set of competitive dichotomizers according to a new criterion, in which a regularization term is introduced in consideration of the binary classification performance of each selected dichotomizer. The proposed method is experimentally applied to classifying four ACG mechanisms. The eye images of 152 glaucoma patients are collected using anterior segment optical coherence tomography (AS-OCT) and then segmented, from which 84 features are extracted. The weighted average classification accuracy of the proposed method is 87.65% based on the results of leave-one-out cross-validation (LOOCV), which is much better than that of the other existing ECOC methods. The proposed method achieves accurate classification of four ACG mechanisms, which is promising for application in the diagnosis of glaucoma.
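
    The ECOC framework itself (without the paper's dichotomizer optimization and selection steps) can be sketched with scikit-learn's OutputCodeClassifier; the data below is a synthetic stand-in for the 84 AS-OCT features.

```python
# Sketch of the ECOC multiclass framework using scikit-learn's
# OutputCodeClassifier; the base SVM and synthetic four-class data are
# assumptions, not the paper's optimized dichotomizer ensemble.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.multiclass import OutputCodeClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=152, n_features=84, n_informative=10,
                           n_classes=4, n_clusters_per_class=1, random_state=0)

ecoc = OutputCodeClassifier(estimator=SVC(), code_size=2.0, random_state=0)
print(cross_val_score(ecoc, X, y, cv=5).mean())
```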

  14. Hyperspectral Image Classification via Multitask Joint Sparse Representation and Stepwise MRF Optimization.

    PubMed

    Yuan, Yuan; Lin, Jianzhe; Wang, Qi

    2016-12-01

    Hyperspectral image (HSI) classification is a crucial issue in remote sensing. Accurate classification benefits a large number of applications such as land use analysis and marine resource utilization. But high data correlation brings difficulty to reliable classification, especially for HSI with abundant spectral information. Furthermore, the traditional methods often fail to well consider the spatial coherency of HSI, which also limits the classification performance. To address these inherent obstacles, a novel spectral-spatial classification scheme is proposed in this paper. The proposed method mainly focuses on multitask joint sparse representation (MJSR) and a stepwise Markov random field framework, which are the two main contributions of this procedure. First, the MJSR not only reduces the spectral redundancy, but also retains necessary correlation in the spectral field during classification. Second, the stepwise optimization further explores the spatial correlation that significantly enhances the classification accuracy and robustness. As far as several universal quality evaluation indexes are concerned, the experimental results on Indian Pines and Pavia University demonstrate the superiority of our method compared with the state-of-the-art competitors.

  15. A predictable and accurate technique with elastomeric impression materials.

    PubMed

    Barghi, N; Ontiveros, J C

    1999-08-01

    A method for obtaining more predictable and accurate final impressions with polyvinylsiloxane impression materials in conjunction with stock trays is proposed and tested. Heavy impression material is used in advance for construction of a modified custom tray, while extra-light material is used for obtaining a more accurate final impression.

  16. Tube dimpling tool assures accurate dip-brazed joints

    NASA Technical Reports Server (NTRS)

    Beuyukian, C. S.; Heisman, R. M.

    1968-01-01

    Portable, hand-held dimpling tool assures accurate brazed joints between tubes of different diameters. Prior to brazing, the tool performs precise dimpling and nipple forming and also provides control and accurate measuring of the height of nipples and depth of dimples so formed.

  17. Functions in Biological Kind Classification

    ERIC Educational Resources Information Center

    Lombrozo, Tania; Rehder, Bob

    2012-01-01

    Biological traits that serve functions, such as a zebra's coloration (for camouflage) or a kangaroo's tail (for balance), seem to have a special role in conceptual representations for biological kinds. In five experiments, we investigate whether and why functional features are privileged in biological kind classification. Experiment 1…

  18. Discriminant Analysis for Content Classification.

    ERIC Educational Resources Information Center

    Williams, John H., Jr.

    A series of experiments was performed to investigate the effectiveness and utility of automatically classifying documents through the use of multiple discriminant functions. Classification is accomplished by computing the distance from the mean vector of each category to the vector of observed frequencies of a document and assigning the document…

  19. Value of Personnel Classification Information.

    ERIC Educational Resources Information Center

    Abellera, James W.; And Others

    The study outlines the development of a methodology for meaningfully estimating the value of classification information used by the Air Force to make selection and job assignment decisions which lead to the satisfaction of first-term enlisted manpower requirements. The methodology, called the optimal allocation strategy, is employed to solve a…

  20. Classification from PRECIS: Some Possibilities

    ERIC Educational Resources Information Center

    Richmond, Phyllis A.

    1976-01-01

    The PREserved Context Index System (PRECIS) developed for subject indexing for the British National Bibliography is discussed as a basis for various studies relating to classification which could be made from its initial phrases, strings, entries, and back-up structure. (Author)