Space Object Classification Using Fused Features of Time Series Data
NASA Astrophysics Data System (ADS)
Jia, B.; Pham, K. D.; Blasch, E.; Shen, D.; Wang, Z.; Chen, G.
In this paper, a fused feature vector consisting of raw time series and texture feature information is proposed for space object classification. The time series data include historical orbit trajectories and asteroid light curves. The texture features are derived from recurrence plots using Gabor filters, for both unsupervised and supervised learning algorithms. Simulation results show that classification algorithms using the fused feature vector achieve better performance than those using raw time series or texture features alone.
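The fused-feature idea above can be sketched in a few lines: concatenate the raw series with simple texture statistics of a thresholded recurrence plot. The median-distance threshold and the two crude statistics below are illustrative assumptions, not the paper's Gabor-filter pipeline.

```python
# Sketch: fused feature vector = raw time series values + texture statistics
# of a binary recurrence plot. Threshold choice and statistics are assumptions.

def recurrence_plot(series, eps):
    """Binary recurrence matrix: R[i][j] = 1 if |x_i - x_j| <= eps."""
    n = len(series)
    return [[1 if abs(series[i] - series[j]) <= eps else 0 for j in range(n)]
            for i in range(n)]

def texture_stats(rp):
    """Crude texture descriptors: recurrence rate and off-diagonal density."""
    n = len(rp)
    rr = sum(sum(row) for row in rp) / (n * n)                 # recurrence rate
    diag = sum(rp[i][i + 1] for i in range(n - 1)) / (n - 1)   # 1-off diagonal
    return [rr, diag]

def fused_features(series):
    # Use the median pairwise distance as the recurrence threshold (assumption).
    dists = sorted(abs(a - b) for i, a in enumerate(series) for b in series[i + 1:])
    eps = dists[len(dists) // 2] if dists else 0.0
    rp = recurrence_plot(series, eps)
    return list(series) + texture_stats(rp)    # raw values + texture features

feat = fused_features([0.0, 1.0, 0.0, 1.0, 0.0])
```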
Acosta-Mesa, Héctor-Gabriel; Rechy-Ramírez, Fernando; Mezura-Montes, Efrén; Cruz-Ramírez, Nicandro; Hernández Jiménez, Rodolfo
2014-06-01
In this work, we present a novel application of time series discretization using evolutionary programming for the classification of precancerous cervical lesions. The approach optimizes the number of intervals into which the length and amplitude of the time series should be compressed, preserving the important information for classification purposes. Using evolutionary programming, the search for a good discretization scheme is guided by a cost function which considers three criteria: the entropy regarding the classification, the complexity measured as the number of different strings needed to represent the complete data set, and the compression rate assessed as the length of the discrete representation. This discretization approach is evaluated using time series data based on temporal patterns observed during a classical test used in cervical cancer detection; the classification accuracy reached by our method is compared with the well-known time series discretization algorithm SAX and the dimensionality reduction method PCA. Statistical analysis of the classification accuracy shows that the discrete representation is as efficient as the complete raw representation for the present application, reducing the dimensionality of the time series length by 97%. This representation is also very competitive in terms of classification accuracy when compared with similar approaches. Copyright © 2014 Elsevier Inc. All rights reserved.
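The SAX baseline mentioned above can be illustrated briefly: z-normalize the series, compress its length with Piecewise Aggregate Approximation (PAA), and quantize the amplitude into symbols. The breakpoints below approximate a 3-symbol alphabet; the evolutionary search over interval counts is not reproduced.

```python
# Minimal SAX-style discretization: z-normalization + PAA + symbol mapping.
# Breakpoints are illustrative approximations for a 3-letter alphabet.

def znorm(series):
    n = len(series)
    mu = sum(series) / n
    sd = (sum((x - mu) ** 2 for x in series) / n) ** 0.5 or 1.0
    return [(x - mu) / sd for x in series]

def paa(series, segments):
    """Mean value of each of `segments` equal-length chunks."""
    n = len(series)
    return [sum(series[i * n // segments:(i + 1) * n // segments]) /
            (n // segments) for i in range(segments)]

def sax(series, segments, breakpoints=(-0.43, 0.43)):
    """Map each PAA mean to a symbol ('a' < 'b' < 'c' for 3 intervals)."""
    out = []
    for v in paa(znorm(series), segments):
        k = sum(1 for b in breakpoints if v > b)   # count breakpoints below v
        out.append("abc"[k])
    return "".join(out)

word = sax([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0], segments=4)
```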
Tissue classification using depth-dependent ultrasound time series analysis: in-vitro animal study
NASA Astrophysics Data System (ADS)
Imani, Farhad; Daoud, Mohammad; Moradi, Mehdi; Abolmaesumi, Purang; Mousavi, Parvin
2011-03-01
Time series analysis of ultrasound radio-frequency (RF) signals has been shown to be an effective tissue classification method. Previous studies of this method for tissue differentiation at both high and clinical frequencies have been reported. In this paper, analysis of RF time series is extended to improve tissue classification at clinical frequencies by including novel features extracted from the time series spectrum. The primary feature examined is the Mean Central Frequency (MCF), computed for regions of interest (ROIs) in the tissue extending along the axial axis of the transducer. In addition, the intercept and slope of a line fitted to the MCF values of the RF time series as a function of depth are included. To evaluate the accuracy of the new features, an in vitro animal study was performed using three tissue types: bovine muscle, bovine liver, and chicken breast, for which perfect two-way classification was achieved. The results show statistically significant improvements over the classification accuracies with previously reported features.
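The Mean Central Frequency can be sketched as the power-weighted average of frequency over the signal's spectrum. The naive DFT, bin indexing, and DC exclusion below are assumptions for illustration; the depth profile of such MCF values would then be fitted with a line whose slope and intercept serve as features.

```python
# Illustrative MCF: power-weighted mean frequency of the positive spectrum.
import cmath
import math

def power_spectrum(x):
    """Naive DFT magnitude-squared for the positive-frequency bins."""
    n = len(x)
    spec = []
    for k in range(n // 2 + 1):
        s = sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
        spec.append(abs(s) ** 2)
    return spec

def mean_central_frequency(x, fs=1.0):
    """MCF = sum(f * P(f)) / sum(P(f)), skipping the DC bin (assumption)."""
    p = power_spectrum(x)
    freqs = [k * fs / len(x) for k in range(len(p))]
    total = sum(p[1:])
    return sum(f * q for f, q in zip(freqs[1:], p[1:])) / total

# A pure sinusoid at 2 cycles per 8 samples has MCF equal to its frequency.
sig = [math.cos(2 * math.pi * 2 * t / 8) for t in range(8)]
mcf = mean_central_frequency(sig)
```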
1980-12-05
classification procedures that are common in speech processing. The anesthesia level classification by EEG time series population screening problem serves as an example. The Kullback-Leibler (KL) number type metric was used in NN rule classification, in a delete-one-subject's-EEG-at-a-time KL-NN and KL-kNN classification of the 17 individually labeled EEG sample population. The results obtained are shown in Table 1. The entries in the table indicate
NASA Astrophysics Data System (ADS)
Nakada, Tomohiro; Takadama, Keiki; Watanabe, Shigeyoshi
This paper proposes a classification method that uses Bayesian analysis to classify time series data from an agent-based simulation of the international emissions trading market, and compares it with a Discrete Fourier Transform (DFT) analytical method. The purpose is to demonstrate analytical methods that map time series data such as market prices. These analytical methods revealed the following results: (1) the classification methods indicate the distance of the mapping from the time series data, which is easier to understand and draw inferences from than the raw time series; (2) the methods can analyze uncertain time series data, including stationary and non-stationary processes, using distances obtained via agent-based simulation; and (3) the Bayesian analytical method can distinguish a 1% difference in the emission reduction targets of agents.
New Features for Neuron Classification.
Hernández-Pérez, Leonardo A; Delgado-Castillo, Duniel; Martín-Pérez, Rainer; Orozco-Morales, Rubén; Lorenzo-Ginori, Juan V
2018-04-28
This paper addresses the problem of obtaining new neuron features capable of improving the results of neuron classification. Most studies on neuron classification using morphological features have been based on Euclidean geometry. Here, three one-dimensional (1D) time series are instead derived from the three-dimensional (3D) structure of the neuron, and a spatial time series is then constructed from which the features are calculated. Digitally reconstructed neurons were separated into control and pathological sets, which are related to three categories of alterations caused by epilepsy, Alzheimer's disease (long and local projections), and ischemia. These neuron sets were then subjected to supervised classification and the results were compared considering three sets of features: morphological, features obtained from the time series, and a combination of both. The best results were obtained using features from the time series, which outperformed the classification using only morphological features, showing higher correct classification rates with differences of 5.15%, 3.75%, and 5.33% for epilepsy and Alzheimer's disease (long and local projections), respectively. The morphological features were better for the ischemia set, with a difference of 3.05%. Features related to the time series, such as variance, Spearman auto-correlation, partial auto-correlation, mutual information, and local minima and maxima, exhibited the best performance. We also compared different evaluators, among which ReliefF was ranked best.
Monitoring of tissue ablation using time series of ultrasound RF data.
Imani, Farhad; Wu, Mark Z; Lasso, Andras; Burdette, Everett C; Daoud, Mohammad; Fitchinger, Gabor; Abolmaesumi, Purang; Mousavi, Parvin
2011-01-01
This paper is the first report on the monitoring of tissue ablation using ultrasound RF echo time series. We calculate frequency- and time-domain features of time series of RF echoes from stationary tissue and transducer, and correlate them with ablated and non-ablated tissue properties. We combine these features in a nonlinear classification framework and demonstrate up to 99% classification accuracy in distinguishing ablated and non-ablated regions of tissue, in areas as small as 12 mm² in size. We also demonstrate significant improvement of ablated tissue classification using RF time series compared to the conventional approach of using single RF scan lines. The results of this study suggest RF echo time series as a promising approach for monitoring ablation and capturing the changes in tissue microstructure that result from heat-induced necrosis.
Drunk driving detection based on classification of multivariate time series.
Li, Zhenlong; Jin, Xue; Zhao, Xiaohua
2015-09-01
This paper addresses the problem of detecting drunk driving based on classification of multivariate time series. First, driving performance measures were collected from a test in a driving simulator located in the Traffic Research Center, Beijing University of Technology. Lateral position and steering angle were used to detect drunk driving. Second, multivariate time series analysis was performed to extract the features. A piecewise linear representation was used to represent the multivariate time series, and a bottom-up algorithm was employed to segment them. The slope and time interval of each segment were extracted as the features for classification. Third, a support vector machine classifier was used to classify the driver's state into two classes (normal or drunk) according to the extracted features. The proposed approach achieved an accuracy of 80.0%. Drunk driving detection based on the analysis of multivariate time series is feasible and effective. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.
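The bottom-up segmentation step can be sketched as follows: start from fine-grained segments and repeatedly merge the adjacent pair with the lowest linear-fit error until the desired number of segments remains, then keep each segment's duration and slope as features. The error measure and stopping rule here are simplifying assumptions, and the SVM stage is omitted.

```python
# Bottom-up piecewise linear segmentation producing (duration, slope) features.

def fit_error(ys):
    """Sum of squared residuals and slope of the least-squares line through ys."""
    n = len(ys)
    xs = list(range(n))
    mx, my = sum(xs) / n, sum(ys) / n
    denom = sum((x - mx) ** 2 for x in xs) or 1.0
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / denom
    err = sum((y - (my + slope * (x - mx))) ** 2 for x, y in zip(xs, ys))
    return err, slope

def bottom_up(series, k):
    """Greedily merge adjacent segments until only k segments remain."""
    segs = [series[i:i + 2] for i in range(0, len(series), 2)]  # initial pairs
    while len(segs) > k:
        costs = [fit_error(segs[i] + segs[i + 1])[0] for i in range(len(segs) - 1)]
        i = costs.index(min(costs))          # cheapest merge wins
        segs[i:i + 2] = [segs[i] + segs[i + 1]]
    return [(len(s), fit_error(s)[1]) for s in segs]  # (duration, slope)

# A rising-then-falling ramp should split into one up-slope and one down-slope.
features = bottom_up([0, 1, 2, 3, 4, 3, 2, 1], k=2)
```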
Estimation of different data compositions for early-season crop type classification.
Hao, Pengyu; Wu, Mingquan; Niu, Zheng; Wang, Li; Zhan, Yulin
2018-01-01
Timely and accurate crop type distribution maps are important inputs for crop yield estimation and production forecasting, as multi-temporal images can observe phenological differences among crops. Therefore, time series remote sensing data are essential for crop type mapping, and image composition has commonly been used to improve the quality of the image time series. However, the optimal composition period is unclear, as long composition periods (such as compositions lasting half a year) are less informative while short composition periods lead to information redundancy and missing pixels. In this study, we initially acquired daily 30 m Normalized Difference Vegetation Index (NDVI) time series by fusing MODIS, Landsat, Gaofen and Huanjing (HJ) NDVI, and then composited the NDVI time series using four strategies (daily, 8-day, 16-day, and 32-day). We used Random Forest to identify crop types and evaluated the classification performances of the NDVI time series generated from the four composition strategies in two study regions in Xinjiang, China. Results indicated that crop classification performance improved as crop separabilities and classification accuracies increased, and classification uncertainties dropped in the green-up stage of the crops. When using the daily NDVI time series, overall accuracies saturated at 113-day and 116-day in Bole and Luntai, and the saturated overall accuracies (OAs) were 86.13% and 91.89%, respectively. Cotton could be identified 40-60 days and 35-45 days before harvest in Bole and Luntai, respectively, when using the daily, 8-day and 16-day composition NDVI time series, since both producer's accuracies (PAs) and user's accuracies (UAs) were higher than 85%. Among the four compositions, the daily NDVI time series generated the highest classification accuracies.
Although the 8-day, 16-day and 32-day compositions had similar saturated overall accuracies (around 85% in Bole and 83% in Luntai), the 8-day and 16-day compositions achieved these accuracies around 155-day in Bole and 133-day in Luntai, which were earlier than the 32-day composition (170-day in both Bole and Luntai). Therefore, when the daily NDVI time series cannot be acquired, the 16-day composition is recommended in this study.
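A common way to build an n-day composite is the maximum-value composite (MVC): within each window, keep the largest NDVI per pixel to suppress cloud-contaminated observations. Whether the authors used MVC or mean compositing is not stated in the abstract, so this is a generic sketch.

```python
# Maximum-value compositing of a daily NDVI series (None = missing day).

def composite(ndvi_daily, period):
    """Collapse a daily NDVI list into per-window maxima."""
    out = []
    for start in range(0, len(ndvi_daily), period):
        window = [v for v in ndvi_daily[start:start + period] if v is not None]
        out.append(max(window) if window else None)   # None if whole window missing
    return out

daily = [0.20, None, 0.35, 0.30, 0.10, 0.42, 0.38, None]
four_day = composite(daily, 4)   # one value per 4-day window
```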
NASA Astrophysics Data System (ADS)
Eckert, Sandra
2016-08-01
The SPOT-5 Take 5 campaign provided SPOT time series data of an unprecedented spatial and temporal resolution. We analysed 29 scenes acquired between May and September 2015 of a semi-arid region in the foothills of Mount Kenya, with two aims: first, to distinguish rainfed from irrigated cropland and cropland from natural vegetation covers, which show similar reflectance patterns; and second, to identify individual crop types. We tested several input data sets in different combinations: the spectral bands and the normalized difference vegetation index (NDVI) time series, principal components of NDVI time series, and selected NDVI time series statistics. For the classification we used random forests (RF). In the test differentiating rainfed cropland, irrigated cropland, and natural vegetation covers, the best classification accuracies were achieved using spectral bands. For the differentiation of crop types, we analysed the phenology of selected crop types based on NDVI time series. First results are promising.
Featureless classification of light curves
NASA Astrophysics Data System (ADS)
Kügler, S. D.; Gianniotis, N.; Polsterer, K. L.
2015-08-01
In the era of rapidly increasing amounts of time series data, classification of variable objects has become the main objective of time-domain astronomy. Classification of irregularly sampled time series is particularly difficult because the data cannot be represented naturally as a vector which can be directly fed into a classifier. In the literature, various statistical features serve as vector representations. In this work, we represent time series by a density model. The density model captures all the information available, including measurement errors. Hence, we view this model as a generalization of the static features, which can be derived directly, e.g. as moments, from the density. Similarity between each pair of time series is quantified by the distance between their respective models. Classification is performed on the obtained distance matrix. In the numerical experiments, we use data from the OGLE (Optical Gravitational Lensing Experiment) and ASAS (All Sky Automated Survey) surveys and demonstrate that the proposed representation performs on par with the best currently used feature-based approaches. The density representation preserves all static information present in the observational data, in contrast to a less-complete description by features. The density representation is an upper boundary in terms of information made available to the classifier. Consequently, the predictive power of the proposed classification depends only on the choice of similarity measure and classifier. Due to its principled nature, we advocate that this new approach of representing time series has potential in tasks beyond classification, e.g. unsupervised learning.
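The density-based idea can be sketched with deliberately simple stand-ins: represent each (possibly irregularly sampled) series by a normalized value histogram, measure pairwise distance between those densities, and classify on the distance matrix with 1-NN. The paper uses richer density models and distances; histogram plus L1 distance is an assumption made only for illustration.

```python
# Density representation of a series + distance-matrix 1-NN classification.

def density(series, lo=0.0, hi=1.0, bins=4):
    """Normalized histogram of values in [lo, hi]."""
    counts = [0] * bins
    for v in series:
        k = min(int((v - lo) / (hi - lo) * bins), bins - 1)
        counts[k] += 1
    return [c / len(series) for c in counts]

def dist(p, q):
    """L1 distance between two densities (illustrative choice)."""
    return sum(abs(a - b) for a, b in zip(p, q))

def one_nn(train, labels, query):
    ds = [dist(density(t), density(query)) for t in train]
    return labels[ds.index(min(ds))]

low  = [0.1, 0.2, 0.15, 0.1]    # values concentrated near 0
high = [0.9, 0.8, 0.85, 0.95]   # values concentrated near 1
label = one_nn([low, high], ["quiet", "active"], [0.88, 0.92, 0.9, 0.8])
```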
Genetic programming and serial processing for time series classification.
Alfaro-Cid, Eva; Sharman, Ken; Esparcia-Alcázar, Anna I
2014-01-01
This work describes an approach devised by the authors for time series classification. In our approach, genetic programming is used in combination with serial processing of the data, where the last output is the result of the classification. The use of genetic programming for classification, although still a field where more research is needed, is not new. However, the application of genetic programming to classification tasks is normally done by considering the input data as a feature vector. That is, to the best of our knowledge, there are no examples in the genetic programming literature of approaches where the time series data are processed serially and the last output is taken as the classification result. The serial processing approach presented here fills a gap in the existing literature. This approach was tested on three different problems. Two of them are real-world problems whose data were gathered for online or conference competitions. As there are published results for these two problems, this gives us the chance to compare the performance of our approach against top performing methods. The serial processing of data in combination with genetic programming obtained competitive results in both competitions, showing its potential for solving time series classification problems. The main advantage of our serial processing approach is that it can easily handle very large datasets.
NASA Astrophysics Data System (ADS)
Cardille, J. A.; Lee, J.
2017-12-01
With the opening of the Landsat archive, there is a dramatically increased potential for creating high-quality time series of land use/land-cover (LULC) classifications derived from remote sensing. Although LULC time series are appealing, their creation is typically challenging in two fundamental ways. First, there is a need to create maximally correct LULC maps for consideration at each time step; and second, there is a need to have the elements of the time series be consistent with each other, without pixels that flip improbably between covers due only to unavoidable, stray classification errors. We have developed the Bayesian Updating of Land Cover - Unsupervised (BULC-U) algorithm to address these challenges simultaneously, and introduce and apply it here for two related but distinct purposes. First, with minimal human intervention, we produced an internally consistent, high-accuracy LULC time series in rapidly changing Mato Grosso, Brazil for a time interval (1986-2000) in which cropland area more than doubled. The spatial and temporal resolution of the 59 LULC snapshots allows users to witness the establishment of towns and farms at the expense of forest. The new time series could be used by policy-makers and analysts to unravel important considerations for conservation and management, including the timing and location of past development, the rate and nature of changes in forest connectivity, the connection with road infrastructure, and more. The second application of BULC-U is to sharpen the well-known GlobCover 2009 classification from 300m to 30m, while improving accuracy measures for every class. The greatly improved resolution and accuracy permits a better representation of the true LULC proportions, the use of this map in models, and quantification of the potential impacts of changes. 
Given that there may easily be thousands and potentially millions of images available to harvest for an LULC time series, it is imperative to build useful algorithms requiring minimal human intervention. Through image segmentation and classification, BULC-U allows us to use both the spectral and spatial characteristics of imagery to sharpen classifications and create time series. It is hoped that this study may allow us and other users of this new method to consider time series across ever larger areas.
Wu, Zi Yi; Xie, Ping; Sang, Yan Fang; Gu, Hai Ting
2018-04-01
The phenomenon of jump is one of the important external forms of hydrological variability under environmental changes, representing the adaptation of hydrological nonlinear systems to the influence of external disturbances. Presently, related studies mainly focus on methods for identifying the jump positions and jump times in hydrological time series. In contrast, few studies have focused on the quantitative description and classification of jump degree in hydrological time series, which makes it difficult to understand environmental changes and evaluate their potential impacts. Here, we propose a theoretically reliable and easy-to-apply method for the classification of jump degree in hydrological time series, using the correlation coefficient as a basic index. Statistical tests verified the accuracy, reasonability, and applicability of this method. The relationship between the correlation coefficient and the jump degree of a series was described by a mathematical equation obtained through derivation. Several thresholds of correlation coefficients under different statistical significance levels were then chosen, based on which the jump degree could be classified into five levels: none, weak, moderate, strong and very strong. Finally, our method was applied to five different observed hydrological time series with diverse geographic and hydrological conditions in China. The classification of jump degrees in these series accorded closely with their physical hydrological mechanisms, indicating the practicability of our method.
Classification of time-series images using deep convolutional neural networks
NASA Astrophysics Data System (ADS)
Hatami, Nima; Gavet, Yann; Debayle, Johan
2018-04-01
Convolutional Neural Networks (CNNs) have achieved great success in image recognition tasks by automatically learning a hierarchical feature representation from raw data. While the majority of the Time-Series Classification (TSC) literature is focused on 1D signals, this paper uses Recurrence Plots (RP) to transform time series into 2D texture images and then takes advantage of a deep CNN classifier. The image representation of time series introduces feature types that are not available for 1D signals, so TSC can be treated as a texture image recognition task. The CNN model also allows learning different levels of representation together with a classifier, jointly and automatically. Therefore, using RP and CNN in a unified framework is expected to boost the recognition rate of TSC. Experimental results on the UCR time-series classification archive demonstrate competitive accuracy of the proposed approach, compared not only to existing deep architectures but also to state-of-the-art TSC algorithms.
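The time-series-to-image transform fed to the CNN can be the unthresholded recurrence image: pixel (i, j) holds the pairwise distance |x_i - x_j|, scaled to a grayscale range. The 0-255 scaling below is an assumption; the paper's exact normalization may differ, and the CNN itself is omitted.

```python
# Unthresholded recurrence plot rendered as a grayscale image matrix.

def rp_image(series):
    """Grayscale recurrence image: brightness proportional to |x_i - x_j|."""
    n = len(series)
    d = [[abs(series[i] - series[j]) for j in range(n)] for i in range(n)]
    m = max(max(row) for row in d) or 1.0       # avoid division by zero
    return [[int(255 * v / m) for v in row] for row in d]

img = rp_image([0.0, 0.5, 1.0, 0.5])
```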
Fault Diagnosis from Raw Sensor Data Using Deep Neural Networks Considering Temporal Coherence.
Zhang, Ran; Peng, Zhen; Wu, Lifeng; Yao, Beibei; Guan, Yong
2017-03-09
Intelligent condition monitoring and fault diagnosis by analyzing sensor data can assure the safety of machinery. Conventional fault diagnosis and classification methods usually implement pretreatments to decrease noise and extract time-domain or frequency-domain features from raw time series sensor data, and then apply classifiers to make a diagnosis. However, these conventional approaches depend on expertise in feature selection and do not consider the temporal coherence of time series data. This paper proposes a fault diagnosis model based on Deep Neural Networks (DNN). The model can directly recognize raw time series sensor data without feature selection or signal processing, and it takes advantage of the temporal coherence of the data. First, raw time series training data collected by sensors are used to train the DNN until its cost function reaches a minimum. Second, test data are used to measure the classification accuracy of the DNN on local time series data. Finally, fault diagnosis considering temporal coherence with former time series data is implemented. Experimental results show that the classification accuracy for bearing faults reaches 100%. The proposed approach is effective in recognizing the type of bearing faults.
Brenčič, Mihael
2016-01-01
Northern hemisphere elementary circulation mechanisms, defined with the Dzerdzeevski classification and published on a daily basis from 1899–2012, are analysed with statistical methods as continuous categorical time series. Classification consists of 41 elementary circulation mechanisms (ECM), which are assigned to calendar days. Empirical marginal probabilities of each ECM were determined. Seasonality and the periodicity effect were investigated with moving dispersion filters and randomisation procedure on the ECM categories as well as with the time analyses of the ECM mode. The time series were determined as being non-stationary with strong time-dependent trends. During the investigated period, periodicity interchanges with periods when no seasonality is present. In the time series structure, the strongest division is visible at the milestone of 1986, showing that the atmospheric circulation pattern reflected in the ECM has significantly changed. This change is result of the change in the frequency of ECM categories; before 1986, the appearance of ECM was more diverse, and afterwards fewer ECMs appear. The statistical approach applied to the categorical climatic time series opens up new potential insight into climate variability and change studies that have to be performed in the future. PMID:27116375
Land Cover Analysis by Using Pixel-Based and Object-Based Image Classification Method in Bogor
NASA Astrophysics Data System (ADS)
Amalisana, Birohmatin; Rokhmatullah; Hernina, Revi
2017-12-01
The advantage of image classification is that it provides information on the Earth's surface, such as land cover and its time-series changes. Nowadays, pixel-based image classification is commonly performed with a variety of algorithms such as minimum distance, parallelepiped, maximum likelihood, and Mahalanobis distance. Land cover classification can also be acquired using object-based image classification, which relies on image segmentation driven by parameters such as scale, form, colour, smoothness and compactness. This research aims to compare land cover classification and change detection results between the parallelepiped pixel-based method and the object-based method. The study area is Bogor, observed over a 20-year period from 1996 to 2016. This region is a well-known urban area that changes continuously due to rapid development, so time-series land cover information for this region is of particular interest.
Jane, Nancy Yesudhas; Nehemiah, Khanna Harichandran; Arputharaj, Kannan
2016-01-01
Clinical time-series data acquired from electronic health records (EHR) are liable to temporal complexities such as irregular observations, missing values and time-constrained attributes that make the knowledge discovery process challenging. This paper presents a temporal rough set induced neuro-fuzzy (TRiNF) mining framework that handles these complexities and builds an effective clinical decision-making system. TRiNF provides two functionalities, namely temporal data acquisition (TDA) and temporal classification. In TDA, a time-series forecasting model is constructed by adopting an improved double exponential smoothing method. The forecasting model is used in missing value imputation and temporal pattern extraction. The relevant attributes are selected using a temporal pattern based rough set approach. In temporal classification, a classification model is built with the selected attributes using a temporal pattern induced neuro-fuzzy classifier. For experimentation, this work uses two clinical time series datasets of hepatitis and thrombosis patients. The experimental results show that with the proposed TRiNF framework there is a significant reduction in the error rate, achieving an average classification accuracy of 92.59% on the hepatitis dataset and 91.69% on the thrombosis dataset. The obtained classification results prove the efficiency of the proposed framework in terms of improved classification accuracy.
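The double exponential smoothing step can be sketched with Holt's linear method: a level and a trend are updated at each step, and a missing observation is replaced by the one-step-ahead forecast. The smoothing constants below are illustrative; the paper's "improved" variant is not reproduced.

```python
# Holt's double exponential smoothing used for missing-value imputation.

def holt_impute(series, alpha=0.5, beta=0.5):
    """Return the series with None values replaced by Holt forecasts."""
    level, trend = series[0], series[1] - series[0]   # naive initialization
    out = [series[0]]
    for x in series[1:]:
        pred = level + trend              # one-step-ahead forecast
        if x is None:
            x = pred                      # impute the missing observation
        new_level = alpha * x + (1 - alpha) * pred
        trend = beta * (new_level - level) + (1 - beta) * trend
        level = new_level
        out.append(x)
    return out

filled = holt_impute([1.0, 2.0, 3.0, None, 5.0])
```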
Koopman Operator Framework for Time Series Modeling and Analysis
NASA Astrophysics Data System (ADS)
Surana, Amit
2018-01-01
We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator, which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations, or model forms, based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be identified directly from data using techniques for computing Koopman spectral properties, without requiring explicit knowledge of the generative model. We also introduce different notions of distance on the space of such model forms, which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with a distance, in conjunction with classical machine learning techniques, to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in a power grid application.
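The model forms are identified from data with techniques for computing Koopman spectral properties; exact Dynamic Mode Decomposition (DMD) is the standard such technique, though the abstract does not name the specific variant used. A minimal sketch, assuming time-series snapshots stacked as columns of X:

```python
import numpy as np

def dmd(X, r=None):
    """Exact DMD: fit a linear map with x_{k+1} ~= A x_k from the
    snapshot matrix X (one state per column) and return the Koopman
    eigenvalues and modes of the reduced operator. `r` optionally
    truncates the SVD rank."""
    X1, X2 = X[:, :-1], X[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    if r is not None:
        U, s, Vh = U[:, :r], s[:r], Vh[:r, :]
    # reduced-order operator: A_tilde = U* X2 V S^{-1}
    A_tilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(A_tilde)
    modes = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W
    return eigvals, modes
```

For data generated by a linear system, the recovered eigenvalues coincide with those of the true dynamics, which is the invariance property the framework exploits for comparing time series.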
Zhou, Fuqun; Zhang, Aining
2016-01-01
Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets, including 8-day composites from NASA and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to use these time-series MODIS datasets efficiently for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will grow when Sentinel 2-3 data become available. Another challenge researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle these two issues in a case study of land cover mapping using CCRS 10-day MODIS composites with the help of two features of Random Forests: variable importance and outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is used to transfer sample data from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about half of the variables, we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transfer could make supervised modelling possible for applications lacking sample data.
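The paper relies on Random Forests' built-in variable importance. As a library-free illustration of the general idea of ranking variables by how much perturbing them hurts accuracy, here is permutation importance computed around a simple 1-nearest-neighbour classifier (the classifier and all names are stand-ins, not the authors' Random Forest machinery):

```python
import random

def knn_predict(train_X, train_y, x):
    """1-nearest-neighbour prediction with squared Euclidean distance."""
    best = min(range(len(train_X)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(train_X[i], x)))
    return train_y[best]

def accuracy(train_X, train_y, test_X, test_y):
    hits = sum(knn_predict(train_X, train_y, x) == y
               for x, y in zip(test_X, test_y))
    return hits / len(test_y)

def permutation_importance(train_X, train_y, test_X, test_y, seed=0):
    """Importance of feature j = accuracy drop when column j is shuffled."""
    rng = random.Random(seed)
    base = accuracy(train_X, train_y, test_X, test_y)
    scores = []
    for j in range(len(test_X[0])):
        col = [x[j] for x in test_X]
        rng.shuffle(col)
        perturbed = [x[:j] + [c] + x[j + 1:] for x, c in zip(test_X, col)]
        scores.append(base - accuracy(train_X, train_y, perturbed, test_y))
    return scores
```

A constant (uninformative) column gets importance zero, since shuffling it changes nothing, which is exactly the property used to drop redundant time-series variables.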
Classification of time series patterns from complex dynamic systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schryver, J.C.; Rao, N.
1998-07-01
An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.
Hayashi, Hideaki; Shibanoki, Taro; Shima, Keisuke; Kurita, Yuichi; Tsuji, Toshio
2015-12-01
This paper proposes a probabilistic neural network (NN) developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower dimensional space using a set of orthogonal transformations and the calculation of posterior probabilities based on a continuous-density hidden Markov model with a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into an NN, which is named a time-series discriminant component network (TSDCN), so that parameters of dimensionality reduction and classification can be obtained simultaneously as network coefficients according to a backpropagation through time-based learning algorithm with the Lagrange multiplier method. The TSDCN is considered to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time taken for network training. The validity of the TSDCN is demonstrated for high-dimensional artificial data and electroencephalogram signals in the experiments conducted during the study.
NASA Astrophysics Data System (ADS)
Luo, Qiu; Xin, Wu; Qiming, Xiong
2017-06-01
In vegetation remote sensing information extraction, phenological features are often not considered and remote sensing analysis algorithms may perform poorly. To address this problem, a method is proposed that extracts vegetation information from EVI time-series and classifies it with a decision tree based on multi-source branch similarity. First, to improve recognition accuracy and time-series stability, the seasonal features of vegetation are extracted based on the fitting span of the time-series. Second, decision-tree similarity is assessed by adaptively selecting the path or the probability parameters of component prediction; this index evaluates the degree of task association, decides whether to migrate the multi-source decision tree, and ensures the speed of migration. Finally, the accuracy of pest and disease classification in commercial Dalbergia hainanensis forest reaches 87%-98%, significantly better than the 80%-96% accuracy of MODIS coverage in this area, verifying the validity of the proposed method.
Aggregation of Sentinel-2 time series classifications as a solution for multitemporal analysis
NASA Astrophysics Data System (ADS)
Lewiński, Stanislaw; Nowakowski, Artur; Malinowski, Radek; Rybicki, Marcin; Kukawska, Ewa; Krupiński, Michał
2017-10-01
The general aim of this work was to elaborate an efficient and reliable aggregation method that could be used for creating a land cover map at a global scale from multitemporal satellite imagery. The study described in this paper presents methods for combining the results of land cover/land use classifications performed on single-date Sentinel-2 images acquired at different time periods. For that purpose, different aggregation methods were proposed and tested on study sites spread across different continents. The initial classifications were performed with a Random Forest classifier on individual Sentinel-2 images from a time series. In the following step, the resulting land cover maps were aggregated pixel by pixel using three different combinations of the number of occurrences of a certain land cover class within the time series and the posterior probability of particular classes resulting from the Random Forest classification. Two of the proposed methods proved superior and in most cases were able to reach or outperform the accuracy of the best individual classifications of single-date images. Moreover, the aggregation results are very stable when used on data with varying cloudiness. They also considerably reduce the number of cloudy pixels in the resulting land cover map, which is a significant advantage for mapping areas with frequent cloud coverage.
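The abstract describes three combinations of occurrence counts and Random Forest posterior probabilities without giving the formulas. One plausible per-pixel rule, an assumption rather than the authors' exact method, scores each class by the number of single-date wins times its mean posterior on those dates:

```python
from collections import defaultdict

def aggregate_pixel(classifications):
    """Aggregate per-date (label, posterior) pairs for one pixel.

    Score of a class = (number of dates it won) * (mean posterior over
    those dates); the class with the highest score is the final label.
    """
    counts, probs = defaultdict(int), defaultdict(list)
    for label, posterior in classifications:
        counts[label] += 1
        probs[label].append(posterior)
    return max(counts,
               key=lambda c: counts[c] * sum(probs[c]) / len(probs[c]))
```

Because cloudy dates simply contribute low-confidence or missing votes, a rule of this shape degrades gracefully with varying cloudiness, which matches the stability the authors report.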
Wong, Raymond
2013-01-01
Voice biometrics exploits a physiological characteristic: every person's voice is different. Due to this uniqueness, voice classification has found useful applications in classifying speakers' gender, mother tongue or ethnicity (accent), and emotional state, as well as in identity verification and verbal command control. In this paper, we adopt a new preprocessing method named Statistical Feature Extraction (SFX) for extracting important features for training a classification model, based on a piecewise transformation that treats an audio waveform as a time-series. Using SFX we can faithfully remodel the statistical characteristics of the time-series; together with spectral analysis, a substantial set of features is extracted in combination. An ensemble is used to select only the influential features for classification model induction. We focus on comparing the effects of various popular data mining algorithms on multiple datasets. Our experiment consists of classification tests over four typical categories of human voice data: Female and Male, Emotional Speech, Speaker Identification, and Language Recognition. The experiments yield encouraging results supporting the fact that heuristically choosing significant features from both the time and frequency domains produces better performance in voice classification than traditional signal processing techniques alone, such as wavelets and LPC-to-CC.
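The abstract does not spell out which statistics SFX extracts. As a hedged sketch of the piecewise idea of treating the waveform as a time-series, one can split it into equal segments and summarize each with simple statistics (the particular statistics and segment count here are illustrative):

```python
import statistics

def piecewise_statistical_features(series, n_segments=4):
    """Split a waveform into equal segments and describe each segment by
    mean, population std, min and max, yielding a fixed-length vector."""
    seg = len(series) // n_segments
    feats = []
    for i in range(n_segments):
        window = series[i * seg:(i + 1) * seg]
        feats += [statistics.mean(window), statistics.pstdev(window),
                  min(window), max(window)]
    return feats
```

The payoff is a fixed-length representation: waveforms of any duration map to n_segments * 4 numbers, which is what makes them usable by standard classifiers.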
NASA Astrophysics Data System (ADS)
Niazmardi, S.; Safari, A.; Homayouni, S.
2017-09-01
Crop mapping through classification of Satellite Image Time-Series (SITS) data can provide very valuable information for several agricultural applications, such as crop monitoring, yield estimation, and crop inventory. However, SITS data classification is not straightforward, because different images of a SITS dataset carry different levels of information regarding the classification problem. Moreover, SITS data are four-dimensional and cannot be classified using conventional classification algorithms. To address these issues, in this paper we present a classification strategy based on Multiple Kernel Learning (MKL) algorithms for SITS data classification. In this strategy, different kernels are first constructed from the different images of the SITS data and then combined into a composite kernel using MKL algorithms. The composite kernel, once constructed, can be used to classify the data with kernel-based classification algorithms. We compared the computational time and classification performance of the proposed strategy using different MKL algorithms for the purpose of crop mapping. The considered MKL algorithms are MKL-Sum, SimpleMKL, LPMKL and Group-Lasso MKL. Experimental tests of the proposed strategy on two SITS datasets, acquired by SPOT satellite sensors, showed that the strategy provides better performance than the standard classification algorithm. The results also showed that the optimization method of the MKL algorithm used affects both the computational time and the classification accuracy of this strategy.
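The core construction, a composite kernel as a weighted sum of per-image kernels, can be sketched as follows. Here the per-date kernels are RBF kernels with fixed weights; the MKL algorithms named above (MKL-Sum, SimpleMKL, LPMKL, Group-Lasso MKL) differ mainly in how they learn those weights, which this sketch does not attempt:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """RBF Gram matrix for one image's samples (rows = pixels/parcels)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def composite_kernel(images, weights):
    """K = sum_m d_m K_m with d_m >= 0 and sum_m d_m = 1.

    `images` is a list of per-date feature matrices over the same
    samples; the weights are given here, not learned.
    """
    d = np.asarray(weights, dtype=float)
    d = d / d.sum()
    return sum(w * rbf_kernel(X) for w, X in zip(d, images))
```

The resulting Gram matrix is symmetric positive semi-definite (a convex combination of valid kernels), so it can be fed directly to any kernel classifier that accepts a precomputed kernel.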
Optical signal processing using photonic reservoir computing
NASA Astrophysics Data System (ADS)
Salehi, Mohammad Reza; Dehyadegari, Louiza
2014-10-01
As a new approach to recognition and classification problems, photonic reservoir computing offers advantages such as parallel information processing, power efficiency, and high speed. In this paper, a photonic structure is proposed for reservoir computing and investigated using a simple yet nontrivial noisy time-series prediction task. This study applies a suitable topology with self-feedback in a network of SOAs, which lends the system a strong memory, and adjusts the relevant parameters to achieve perfect recognition accuracy (100%) on noise-free time series, a 3% improvement over previous results. For the classification of noisy time series, accuracy increased by 4%, reaching 96%. Furthermore, an analytical approach is suggested for solving the rate equations, which leads to a substantial decrease in simulation time, an important factor in the classification of large signals such as in speech recognition, and yields better results than previous works.
Zhou, Zhen; Huang, Jingfeng; Wang, Jing; Zhang, Kangyu; Kuang, Zhaomin; Zhong, Shiquan; Song, Xiaodong
2015-01-01
Most areas planted with sugarcane are located in southern China. However, remote sensing of sugarcane has been limited because useable remote sensing data are limited due to the cloudy climate of this region during the growing season and severe spectral mixing with other crops. In this study, we developed a methodology for automatically mapping sugarcane over large areas using time-series middle-resolution remote sensing data. For this purpose, two major techniques were used, the object-oriented method (OOM) and data mining (DM). In addition, time-series Chinese HJ-1 CCD images were obtained during the sugarcane growing period. Image objects were generated using a multi-resolution segmentation algorithm, and DM was implemented using the AdaBoost algorithm, which generated the prediction model. The prediction model was applied to the HJ-1 CCD time-series image objects, and then a map of the sugarcane planting area was produced. The classification accuracy was evaluated using independent field survey sampling points. The confusion matrix analysis showed that the overall classification accuracy reached 93.6% and that the Kappa coefficient was 0.85. Thus, the results showed that this method is feasible, efficient, and applicable for extrapolating the classification of other crops in large areas where the application of high-resolution remote sensing data is impractical due to financial considerations or because qualified images are limited.
Anguera, A; Barreiro, J M; Lara, J A; Lizcano, D
2016-01-01
One of the major challenges in the medical domain today is how to exploit the huge amount of data that this field generates. To do this, approaches are required that are capable of discovering knowledge that is useful for decision making in the medical field. Time series are data types that are common in the medical domain and require specialized analysis techniques and tools, especially if the information of interest to specialists is concentrated within particular time series regions, known as events. This research followed the steps specified by the so-called knowledge discovery in databases (KDD) process to discover knowledge from medical time series derived from stabilometric (396 series) and electroencephalographic (200 series) patient electronic health records (EHR). The view offered in the paper is based on the experience gathered as part of the VIIP project. Knowledge discovery in medical time series has a number of difficulties and implications that are highlighted by illustrating the application of several techniques that cover the entire KDD process through two case studies. This paper illustrates the application of different knowledge discovery techniques for the purposes of classification within the above domains. The accuracy of this application for the two classes considered in each case is 99.86% and 98.11% for epilepsy diagnosis in the electroencephalography (EEG) domain, and 99.4% and 99.1% for early-age sports talent classification in the stabilometry domain. The KDD techniques achieve better results than other traditional neural network-based classification techniques.
Workshop on Algorithms for Time-Series Analysis
NASA Astrophysics Data System (ADS)
Protopapas, Pavlos
2012-04-01
This Workshop covered the four major subjects listed below in two 90-minute sessions. Each talk or tutorial allowed questions, and concluded with a discussion. Classification: Automatic classification using machine-learning methods is becoming a standard in surveys that generate large datasets. Ashish Mahabal (Caltech) reviewed various methods, and presented examples of several applications. Time-Series Modelling: Suzanne Aigrain (Oxford University) discussed autoregressive models and multivariate approaches such as Gaussian Processes. Meta-classification/mixture of expert models: Karim Pichara (Pontificia Universidad Católica, Chile) described the substantial promise which machine-learning classification methods are now showing in automatic classification, and discussed how the various methods can be combined together. Event Detection: Pavlos Protopapas (Harvard) addressed methods of fast identification of events with low signal-to-noise ratios, enlarging on the characterization and statistical issues of low signal-to-noise ratios and rare events.
Model-based Clustering of Categorical Time Series with Multinomial Logit Classification
NASA Astrophysics Data System (ADS)
Frühwirth-Schnatter, Sylvia; Pamminger, Christoph; Winter-Ebmer, Rudolf; Weber, Andrea
2010-09-01
A common problem in many areas of applied statistics is to identify groups of similar time series in a panel of time series. However, distance-based clustering methods cannot easily be extended to time series data, where an appropriate distance measure is rather difficult to define, particularly for discrete-valued time series. Markov chain clustering, proposed by Pamminger and Frühwirth-Schnatter [6], is an approach for clustering discrete-valued time series obtained by observing a categorical variable with several states. This model-based clustering method is based on finite mixtures of first-order time-homogeneous Markov chain models. In order to further explain group membership we present an extension to the approach of Pamminger and Frühwirth-Schnatter [6] by formulating a probabilistic model for the latent group indicators within the Bayesian classification rule, using a multinomial logit model. The parameters are estimated for a fixed number of clusters within a Bayesian framework using a Markov chain Monte Carlo (MCMC) sampling scheme representing a (full) Gibbs-type sampler which involves only draws from standard distributions. Finally, an application to a panel of Austrian wage mobility data is presented which leads to an interesting segmentation of the Austrian labour market.
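The building block of Markov chain clustering is the first-order, time-homogeneous transition matrix estimated from each categorical time series. A minimal maximum-likelihood estimate, with states coded 0..n-1 (the uniform fallback for unvisited states is an illustrative choice, not part of the cited method):

```python
def transition_matrix(seq, n_states):
    """MLE transition matrix of a first-order, time-homogeneous Markov
    chain from one categorical time series with states 0..n_states-1."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(seq, seq[1:]):   # consecutive state pairs
        counts[a][b] += 1
    matrix = []
    for row in counts:
        total = sum(row)
        matrix.append([c / total for c in row] if total
                      else [1.0 / n_states] * n_states)  # unvisited state
    return matrix
```

In the mixture model, each cluster has its own transition matrix and a series' likelihood under each cluster is a product of these transition probabilities along the observed path.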
Can We Speculate Running Application With Server Power Consumption Trace?
Li, Yuanlong; Hu, Han; Wen, Yonggang; Zhang, Jun
2018-05-01
In this paper, we propose to detect the applications running in a server by classifying the observed power consumption series, for the purpose of data center energy consumption monitoring and analysis. The time series classification problem has been extensively studied and various distance measurements developed; recently, deep learning-based sequence models have also proven promising. In this paper, we propose a novel distance measurement and build a time series classification algorithm hybridizing the nearest neighbor and long short term memory (LSTM) neural network approaches. More specifically, we first propose a new distance measurement termed local time warping (LTW), which utilizes a user-specified index set for local warping and is designed to be noncommutative and to avoid dynamic programming. Second, we hybridize 1-nearest neighbor (1NN)-LTW and LSTM together: in particular, we combine the prediction probability vectors of 1NN-LTW and LSTM to determine the label of the test cases. Finally, using power consumption data from a real data center, we show that the proposed LTW can improve the classification accuracy of dynamic time warping (DTW) from about 84% to 90%. Our experimental results show that the proposed LTW is competitive on our data set compared with existing DTW variants and that its noncommutative feature is indeed beneficial. We also test a linear version of LTW and find that it performs similarly to state-of-the-art DTW-based methods while running as fast as linear-runtime lower-bound methods like LB_Keogh for our problem. With the hybrid algorithm, we achieve an accuracy of up to about 93% on the power series classification task. Our research can inspire more studies on time series distance measurement and on hybrids of deep learning models with other traditional models.
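The abstract does not give enough detail to reproduce LTW itself (its user-specified index set and non-dynamic-programming recursion are not specified), but the DTW baseline it is compared against, and the hybrid step of combining the two classifiers' probability vectors, can be sketched; the equal averaging weight below is an assumption:

```python
def dtw(a, b):
    """Classic dynamic-programming DTW distance between two sequences,
    the baseline the proposed LTW is measured against."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

def hybrid_label(p_nn, p_lstm, classes, w=0.5):
    """Combine the 1NN and LSTM class-probability vectors by a weighted
    average, then pick the argmax (one plausible combination rule)."""
    combined = [w * p + (1 - w) * q for p, q in zip(p_nn, p_lstm)]
    return classes[combined.index(max(combined))]
```

DTW's warping is what lets two power traces of the same application match despite local timing shifts: repeating a sample costs nothing if the values align.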
Time Series Model Identification and Prediction Variance Horizon.
1980-06-01
stationary time series Y(t). In terms of ρ(v), the definition of the three time series memory types is: No Memory: Σ_{v=1}^∞ |ρ(v)| = 0; Short Memory: Σ_{v=1}^∞ |ρ(v)| < ∞; Long Memory: Σ_{v=1}^∞ |ρ(v)| = ∞. Within short memory time series there are three types whose classification in terms of correlation functions is ... (1974) "Some Recent Advances in Time Series Modeling", IEEE Transactions on Automatic Control, Vol. AC-19, No. 6, December, 723-730. Parzen, E. (1976) "An
Bromuri, Stefano; Zufferey, Damien; Hennebert, Jean; Schumacher, Michael
2014-10-01
This research is motivated by the issue of classifying illnesses of chronically ill patients for decision support in clinical settings. Our main objective is to propose multi-label classification of multivariate time series contained in medical records of chronically ill patients, by means of quantization methods, such as bag of words (BoW), and multi-label classification algorithms. Our second objective is to compare supervised dimensionality reduction techniques to state-of-the-art multi-label classification algorithms. The hypothesis is that kernel methods and locality preserving projections make such algorithms good candidates to study multi-label medical time series. We combine BoW and supervised dimensionality reduction algorithms to perform multi-label classification on health records of chronically ill patients. The considered algorithms are compared with state-of-the-art multi-label classifiers in two real world datasets. Portavita dataset contains 525 diabetes type 2 (DT2) patients, with co-morbidities of DT2 such as hypertension, dyslipidemia, and microvascular or macrovascular issues. MIMIC II dataset contains 2635 patients affected by thyroid disease, diabetes mellitus, lipoid metabolism disease, fluid electrolyte disease, hypertensive disease, thrombosis, hypotension, chronic obstructive pulmonary disease (COPD), liver disease and kidney disease. The algorithms are evaluated using multi-label evaluation metrics such as hamming loss, one error, coverage, ranking loss, and average precision. Non-linear dimensionality reduction approaches behave well on medical time series quantized using the BoW algorithm, with results comparable to state-of-the-art multi-label classification algorithms. Chaining the projected features has a positive impact on the performance of the algorithm with respect to pure binary relevance approaches. 
The evaluation highlights the feasibility of representing medical health records using the BoW approach for multi-label classification tasks. The study also highlights that dimensionality reduction algorithms based on kernel methods, locality preserving projections, or both are good candidates for multi-label classification tasks in medical time series with many missing values and high label density.
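The BoW quantization step turns a variable-length series into a fixed-length codeword histogram. In practice the codebook is learned, for example by k-means over training windows; the sketch below skips that step and takes the codebook as given (all names are illustrative):

```python
def bag_of_words(series, codebook, window=4):
    """Quantize a time series into a codeword histogram (BoW).

    Each sliding window is assigned to its nearest codeword (squared
    Euclidean distance); the histogram of codeword counts is the
    fixed-length representation fed to the multi-label classifier.
    """
    hist = [0] * len(codebook)
    for i in range(len(series) - window + 1):
        w = series[i:i + window]
        j = min(range(len(codebook)),
                key=lambda k: sum((a - b) ** 2
                                  for a, b in zip(codebook[k], w)))
        hist[j] += 1
    return hist
```

Because the histogram length depends only on the codebook size, records with different numbers of observations and many missing stretches still map to comparable feature vectors.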
NASA Astrophysics Data System (ADS)
Berti, Matteo; Corsini, Alessandro; Franceschini, Silvia; Iannacone, Jean Pascal
2013-04-01
The application of space-borne synthetic aperture radar interferometry has progressed, over the last two decades, from the pioneering use of single interferograms for analyzing changes on the Earth's surface to the development of advanced multi-interferogram techniques for analyzing almost any natural phenomenon that involves movements of the ground. The success of multi-interferogram techniques in the analysis of natural hazards such as landslides and subsidence is widely documented in the scientific literature and demonstrated by the consensus among end-users. Despite the great potential of this technique, radar interpretation of slope movements is generally based solely on the analysis of average displacement velocities, while the information embraced in multi-interferogram time series is often overlooked if not completely neglected. The underuse of PS time series is probably due to the detrimental effect of residual atmospheric errors, which leave the PS time series with erratic, irregular fluctuations that are often difficult to interpret, and also to the difficulty of performing a visual, supervised analysis of the time series for a large dataset. In this work we present a procedure for automatic classification of PS time series based on a series of statistical characterization tests. The procedure classifies the time series into six distinctive target trends (0 = uncorrelated; 1 = linear; 2 = quadratic; 3 = bilinear; 4 = discontinuous without constant velocity; 5 = discontinuous with change in velocity) and retrieves for each trend a series of descriptive parameters which can be efficiently used to characterize the temporal changes of ground motion. The classification algorithms were developed and tested using an ENVISAT dataset available in the frame of the EPRS-E project (Extraordinary Plan of Environmental Remote Sensing) of the Italian Ministry of Environment (track "Modena", Northern Apennines).
This dataset was generated using standard processing, so the time series are affected by a significant noise-to-signal ratio. The results of the analysis show that even with such a rough-quality dataset, our automated classification procedure can greatly improve the radar interpretation of mass movements. In general, uncorrelated PS (type 0) are concentrated in flat areas such as fluvial terraces and valley bottoms, and along stable watershed divides; linear PS (type 1) are mainly located on slopes (both inside and outside mapped landslides) or near the edges of scarps or steep slopes; non-linear PS (types 2 to 5) typically fall inside landslide deposits or in the surrounding areas. The spatial distribution of classified PS makes it possible to detect deformation phenomena not visible from the average velocity alone, and provides important information on the temporal evolution of the phenomena, such as acceleration, deceleration, seasonal fluctuations, and abrupt or continuous changes of the displacement rate. Based on these encouraging results, we integrated all the classification algorithms into a Graphical User Interface (called PSTime), which is freely available as a standalone application.
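A toy version of the statistical characterization, covering only the polynomial trends (types 0-2) and using BIC-penalized least-squares fits rather than the authors' actual tests, could look like this:

```python
import numpy as np

def classify_trend(t, y):
    """Pick the target trend (uncorrelated / linear / quadratic) whose
    polynomial fit has the lowest BIC. The paper's full scheme also
    covers bilinear and discontinuous trends, which are omitted here."""
    t, y = np.asarray(t, float), np.asarray(y, float)
    n = len(y)
    best, best_bic = None, np.inf
    for name, deg in [("uncorrelated", 0), ("linear", 1), ("quadratic", 2)]:
        resid = y - np.polyval(np.polyfit(t, y, deg), t)
        rss = float(resid @ resid) + 1e-12          # guard against log(0)
        bic = n * np.log(rss / n) + (deg + 1) * np.log(n)
        if bic < best_bic:
            best, best_bic = name, bic
    return best
```

The log(n) penalty per extra coefficient is what keeps an exact linear trend from being labeled quadratic, mirroring the role of the significance thresholds in the full procedure.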
Hao, Pengyu; Wang, Li; Niu, Zheng
2015-01-01
A range of single classifiers have been proposed to classify crop types using time-series vegetation indices, and hybrid classifiers are used to improve discriminatory power. Traditional fusion rules use the product of multiple single classifiers, but that strategy cannot integrate the classification output of machine learning classifiers. In this research, the performance of two hybrid strategies, multiple voting (M-voting) and probabilistic fusion (P-fusion), for crop classification using NDVI time series was tested with different training sample sizes at both pixel and object levels; two representative counties in northern Xinjiang were selected as the study area. The single classifiers employed in this research were Random Forest (RF), Support Vector Machine (SVM), and See5 (C5.0). The results indicated that classification performance improved substantially with the number of training samples (increasing mean overall accuracy by 5%~10% and reducing the standard deviation of overall accuracy by around 1%), and that when the training sample size was small (50 or 100 training samples), hybrid classifiers outperformed single classifiers with higher mean overall accuracy (by 1%~2%). However, when abundant training samples (4,000) were employed, single classifiers could achieve good classification accuracy, and all classifiers obtained similar performances. Additionally, although object-based classification did not improve accuracy, it resulted in greater visual appeal, especially in study areas with a heterogeneous cropping pattern.
76 FR 51239 - North American Industry Classification System; Revision for 2012
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-17
... definitional and economic changes so that they can create continuous time series and accurately analyze data changes over time. The inclusion of revenues from FGP activities in manufacturing will effectively change...) to exclude production that occurs in a foreign country for historical consistency in time series...
Singha, Mrinal; Wu, Bingfang; Zhang, Miao
2016-01-01
Accurate and timely mapping of paddy rice is vital for food security and environmental sustainability. This study evaluates the utility of temporal features extracted from coarse resolution data for object-based paddy rice classification of fine resolution data. The coarse resolution vegetation index data is first fused with the fine resolution data to generate the time series fine resolution data. Temporal features are extracted from the fused data and combined with the multi-spectral data to improve the classification accuracy. Temporal features provided the crop growth information, while multi-spectral data provided the pattern variation of paddy rice. The achieved overall classification accuracy and kappa coefficient were 84.37% and 0.68, respectively. The results indicate that the use of temporal features improved the overall classification accuracy of a single-date multi-spectral image by 18.75%, from 65.62% to 84.37%. The minimum sensitivity (MS) of the paddy rice classification has also been improved. The comparison showed that the mapped paddy area was comparable to the agricultural statistics at the district level. This work also highlighted the importance of feature selection in achieving higher classification accuracies. These results demonstrate the potential of the combined use of temporal and spectral features for accurate paddy rice classification. PMID:28025525
Predicting Flavonoid UGT Regioselectivity
Jackson, Rhydon; Knisley, Debra; McIntosh, Cecilia; Pfeiffer, Phillip
2011-01-01
Machine learning was applied to a challenging and biologically significant protein classification problem: the prediction of flavonoid UGT acceptor regioselectivity from primary sequence. Novel indices characterizing graphical models of residues were proposed and found to be widely distributed among existing amino acid indices and to cluster residues appropriately. UGT subsequences biochemically linked to regioselectivity were modeled as sets of index sequences. Several learning techniques incorporating these UGT models were compared with classifications based on standard sequence alignment scores. These techniques included an application of time series distance functions to protein classification. Time series distances defined on the index sequences were used in nearest neighbor and support vector machine classifiers. Additionally, Bayesian neural network classifiers were applied to the index sequences. The experiments identified improvements over the nearest neighbor and support vector machine classifications relying on standard alignment similarity scores, as well as strong correlations between specific subsequences and regioselectivities. PMID:21747849
76 FR 15999 - Proposed Collection, Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-22
... time series analyses and industry comparisons, and in special studies such as analyses of... ensure that requested data can be provided in the desired format, reporting burden (time and financial... Classification System (NAICS) level, and at the national, State, MSA, and county levels. The QCEW series has...
ERIC Educational Resources Information Center
Kvetan, Vladimir, Ed.
2014-01-01
Reliable and consistent time series are essential to any kind of economic forecasting. Skills forecasting needs to combine data from national accounts and labour force surveys, with the pan-European dimension of Cedefop's skills supply and demand forecasts, relying on different international classification standards. Sectoral classification (NACE)…
Qian, Liwei; Zheng, Haoran; Zhou, Hong; Qin, Ruibin; Li, Jinlong
2013-01-01
The increasing availability of time series expression datasets, although promising, raises a number of new computational challenges. Accordingly, the development of suitable classification methods to make reliable and sound predictions is becoming a pressing issue. We propose, here, a new method to classify time series gene expression via integration of biological networks. We evaluated our approach on 2 different datasets and showed that the use of a hidden Markov model/Gaussian mixture models hybrid explores the time-dependence of the expression data, thereby leading to better prediction results. We demonstrated that the biclustering procedure identifies function-related genes as a whole, giving rise to high accordance in prognosis prediction across independent time series datasets. In addition, we showed that integration of biological networks into our method significantly improves prediction performance. Moreover, we compared our approach with several state-of-the-art algorithms and found that our method outperformed previous approaches with regard to various criteria. Finally, our approach achieved better prediction results on early-stage data, implying the potential of our method for practical prediction. PMID:23516469
USDA-ARS?s Scientific Manuscript database
Synthetically generated Landsat time-series based on the STARFM algorithm are increasingly used for applications in forestry or agriculture. Although successes in classification and derivation of phenological orbiomass parameters are evident, a thorough evaluation of the limits of the method is stil...
NASA Astrophysics Data System (ADS)
Omenzetter, Piotr; de Lautour, Oliver R.
2010-04-01
Developed for studying long, periodic records of various measured quantities, time series analysis methods are inherently suited to, and offer interesting possibilities for, Structural Health Monitoring (SHM) applications. However, their use in SHM can still be regarded as an emerging application and deserves more study. In this research, Autoregressive (AR) models were used to fit experimental acceleration time histories from two experimental structural systems, a 3-storey bookshelf-type laboratory structure and the ASCE Phase II SHM Benchmark Structure, in healthy and several damaged states. The coefficients of the AR models were chosen as damage sensitive features. Preliminary visual inspection of the large, multidimensional sets of AR coefficients to check for the presence of clusters corresponding to different damage severities was achieved using Sammon mapping, an efficient nonlinear data compression technique. Systematic classification of damage into states based on the analysis of the AR coefficients was achieved using two supervised classification techniques, Nearest Neighbor Classification (NNC) and Learning Vector Quantization (LVQ), and one unsupervised technique, Self-organizing Maps (SOM). This paper discusses the performance of AR coefficients as damage sensitive features and compares the efficiency of the three classification techniques using experimental data.
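The core feature-extraction step, fitting an AR model to an acceleration record and keeping its coefficients as the damage-sensitive feature vector, can be sketched with an ordinary least-squares estimator. The paper does not specify its estimator, and `ar_coefficients` is a hypothetical name:

```python
import numpy as np

def ar_coefficients(x, order):
    """Fit an AR(order) model  x[t] = a1*x[t-1] + ... + ap*x[t-p] + e[t]
    by ordinary least squares; the coefficient vector serves as the
    damage-sensitive feature for a classifier (e.g. nearest neighbor)."""
    x = np.asarray(x, dtype=float)
    y = x[order:]
    # Column j holds the series lagged by j+1 samples, aligned with y.
    X = np.column_stack([x[order - j:-j] for j in range(1, order + 1)])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# Toy check: recover known AR(2) coefficients from a noise-free series.
x = [1.0, 1.0]
for _ in range(50):
    x.append(0.5 * x[-1] - 0.3 * x[-2])
coeffs = ar_coefficients(x, order=2)
```

Feature vectors of this kind, one per measurement record, are what the NNC, LVQ, and SOM techniques would then cluster or classify.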
Heart rate time series characteristics for early detection of infections in critically ill patients.
Tambuyzer, T; Guiza, F; Boonen, E; Meersseman, P; Vervenne, H; Hansen, T K; Bjerre, M; Van den Berghe, G; Berckmans, D; Aerts, J M; Meyfroidt, G
2017-04-01
It is difficult to make a distinction between inflammation and infection. Therefore, new strategies are required to allow accurate detection of infection. Here, we hypothesize that we can distinguish infected from non-infected ICU patients based on dynamic features of serum cytokine concentrations and heart rate time series. Serum cytokine profiles and heart rate time series of 39 patients were available for this study. The serum concentrations of ten cytokines were measured using blood sampled every 10 min between 2100 and 0600 hours. Heart rate was recorded every minute. Ten metrics were used to extract features from these time series to obtain an accurate classification of infected patients. The predictive power of the metrics derived from the heart rate time series was investigated using decision tree analysis. Finally, logistic regression methods were used to examine whether classification performance improved with inclusion of features derived from the cytokine time series. The AUC of a decision tree based on two heart rate features was 0.88. The model had good calibration, with a Hosmer-Lemeshow p value of 0.09. There was no significant additional value in adding static cytokine levels or cytokine time series information to the generated decision tree model. The results suggest that heart rate is a better marker for infection than information captured by cytokine time series when the exact stage of infection is not known. The predictive value of (expensive) biomarkers should always be weighed against the routinely monitored data, and such biomarkers have to demonstrate added value.
This study applied a phenology-based land-cover classification approach across the Laurentian Great Lakes Basin (GLB) using time-series data consisting of 23 Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) composite images (250 ...
System Complexity Reduction via Feature Selection
ERIC Educational Resources Information Center
Deng, Houtao
2011-01-01
This dissertation transforms a set of system complexity reduction problems to feature selection problems. Three systems are considered: classification based on association rules, network structure learning, and time series classification. Furthermore, two variable importance measures are proposed to reduce the feature selection bias in tree…
Using spectrotemporal indices to improve the fruit-tree crop classification accuracy
NASA Astrophysics Data System (ADS)
Peña, M. A.; Liao, R.; Brenning, A.
2017-06-01
This study assesses the potential of spectrotemporal indices derived from satellite image time series (SITS) to improve the classification accuracy of fruit-tree crops. Six major fruit-tree crop types in the Aconcagua Valley, Chile, were classified by applying various linear discriminant analysis (LDA) techniques on a Landsat-8 time series of nine images corresponding to the 2014-15 growing season. As features we not only used the complete spectral resolution of the SITS, but also all possible normalized difference indices (NDIs) that can be constructed from any two bands of the time series, a novel approach to derive features from SITS. Due to the high dimensionality of this "enhanced" feature set we used the lasso and ridge penalized variants of LDA (PLDA). Although classification accuracies yielded by the standard LDA applied on the full-band SITS were good (misclassification error rate, MER = 0.13), they were further improved by 23% (MER = 0.10) with ridge PLDA using the enhanced feature set. The most important bands to discriminate the crops of interest were mainly concentrated on the first two image dates of the time series, corresponding to the crops' greenup stage. Despite the high predictor weights provided by the red and near infrared bands, typically used to construct greenness spectral indices, other spectral regions were also found important for the discrimination, such as the shortwave infrared band at 2.11-2.19 μm, sensitive to foliar water changes. These findings support the usefulness of spectrotemporal indices in the context of SITS-based crop type classifications, which until now have been mainly constructed by the arithmetic combination of two bands of the same image date in order to derive greenness temporal profiles like those from the normalized difference vegetation index.
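The "enhanced" feature set, every normalized difference index that can be formed from any two bands of the time series, can be generated mechanically. The sketch below assumes a flattened (band, pixel) layout; `all_ndis` and the small epsilon guarding against division by zero are assumptions not stated in the abstract:

```python
import numpy as np
from itertools import combinations

def all_ndis(bands):
    """Given an array of shape (n_bands, n_pixels) stacking all bands of
    all image dates in a satellite image time series, return every
    normalized difference index (b_i - b_j) / (b_i + b_j) for i < j."""
    feats = []
    for i, j in combinations(range(bands.shape[0]), 2):
        feats.append((bands[i] - bands[j]) / (bands[i] + bands[j] + 1e-12))
    return np.stack(feats)  # shape: (n_bands * (n_bands - 1) / 2, n_pixels)

# Toy example: 3 bands, 2 pixels -> C(3, 2) = 3 NDI features.
bands = np.array([[2.0, 4.0], [1.0, 2.0], [3.0, 6.0]])
feats = all_ndis(bands)
```

With nine Landsat-8 dates the number of such pairwise indices grows quadratically in the band count, which is why the authors turn to penalized (lasso/ridge) variants of LDA for this high-dimensional feature set.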
Gharehbaghi, Arash; Linden, Maria
2017-10-12
This paper presents a novel method for learning the cyclic contents of stochastic time series: the deep time-growing neural network (DTGNN). The DTGNN combines supervised and unsupervised methods at different levels of learning for enhanced performance. It is employed within a multiscale learning structure to classify cyclic time series (CTS), in which the dynamic contents of the time series are preserved in an efficient manner. This paper suggests a systematic procedure for finding the design parameters of the classification method for a one-versus-multiple class application. A novel validation method is also suggested for evaluating the structural risk, both quantitatively and qualitatively. The effect of the DTGNN on the performance of the classifier is statistically validated through repeated random subsampling using different sets of CTS from different medical applications. The validation involves four medical databases, comprising 108 recordings of the electroencephalogram signal, 90 recordings of the electromyogram signal, 130 recordings of the heart sound signal, and 50 recordings of the respiratory sound signal. Results of the statistical validations show that the DTGNN significantly improves the performance of the classification and also exhibits an optimal structural risk.
Ensemble Deep Learning for Biomedical Time Series Classification
2016-01-01
Ensemble learning has been shown, in both theory and practice, to improve generalization ability effectively. In this paper, we first briefly outline the current status of research on ensemble learning. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database, which contains a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages over some well-known ensemble methods, such as Bagging and AdaBoost. PMID:27725828
An Evaluation Method of Words Tendency Depending on Time-Series Variation and Its Improvements.
ERIC Educational Resources Information Center
Atlam, El-Sayed; Okada, Makoto; Shishibori, Masami; Aoe, Jun-ichi
2002-01-01
Discussion of word frequency and keywords in text focuses on a method to estimate automatically the stability classes that indicate a word's popularity with time-series variations based on the frequency change in past electronic text data. Compares the evaluation of decision tree stability class results with manual classification results.…
Continuous Change Detection and Classification (CCDC) of Land Cover Using All Available Landsat Data
NASA Astrophysics Data System (ADS)
Zhu, Z.; Woodcock, C. E.
2012-12-01
A new algorithm for Continuous Change Detection and Classification (CCDC) of land cover using all available Landsat data is developed. This new algorithm is capable of detecting many kinds of land cover change as new images are collected, while at the same time providing land cover maps for any given time. To better identify land cover change, a two-step cloud, cloud shadow, and snow masking algorithm is used to eliminate "noisy" observations. Next, a time series model with components of seasonality, trend, and breaks estimates the surface reflectance and temperature. The time series model is updated continuously with newly acquired observations. Due to the high variability in spectral response for different kinds of land cover change, the CCDC algorithm uses a data-driven threshold derived from all seven Landsat bands. When the difference between observed and predicted values exceeds the thresholds three consecutive times, a pixel is identified as land cover change. Land cover classification is done after change detection. Coefficients from the time series models and the Root Mean Square Error (RMSE) from model fitting are used as classification inputs for the Random Forest Classifier (RFC). We applied this new algorithm to one Landsat scene (Path 12, Row 31) that includes all of Rhode Island as well as much of Eastern Massachusetts and parts of Connecticut. A total of 532 Landsat images acquired between 1982 and 2011 were processed. During this period, 619,924 pixels were detected to change once (91% of total changed pixels) and 60,199 pixels were detected to change twice (8% of total changed pixels). The most frequent land cover change category is from mixed forest to low-density residential, which occupies more than 8% of total land cover change pixels.
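The seasonality-plus-trend model at the heart of this approach can be illustrated with a single-harmonic least-squares fit. This is a simplified sketch (CCDC's actual per-band model, number of harmonics, and robust fitting details are not given in the abstract), with hypothetical names:

```python
import numpy as np

def fit_harmonic_trend(t, y, period=365.25):
    """Fit y(t) ~ c0 + c1*t + c2*cos(2*pi*t/T) + c3*sin(2*pi*t/T):
    an overall trend plus one seasonal harmonic. Returns the
    coefficients and the RMSE of the fit; in a CCDC-style scheme both
    become classification inputs, and large residuals on consecutive
    new observations flag a change."""
    t = np.asarray(t, dtype=float)
    w = 2 * np.pi * t / period
    A = np.column_stack([np.ones_like(t), t, np.cos(w), np.sin(w)])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    rmse = np.sqrt(np.mean((A @ coeffs - y) ** 2))
    return coeffs, rmse

# Toy example: a 16-day revisit series over two years with known structure.
t = np.arange(0, 730, 16, dtype=float)
y = 0.1 + 1e-4 * t + 0.3 * np.cos(2 * np.pi * t / 365.25)
coeffs, rmse = fit_harmonic_trend(t, y)
```

In the full algorithm, such a model would be refit continuously as new scenes arrive, and a pixel would be flagged as change when observations deviate from the prediction by more than the data-driven threshold three times in a row.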
Support vector machine (SVM) was applied for land-cover characterization using MODIS time-series data. Classification performance was examined with respect to training sample size, sample variability, and landscape homogeneity (purity). The results were compared to two convention...
Learning time series for intelligent monitoring
NASA Technical Reports Server (NTRS)
Manganaris, Stefanos; Fisher, Doug
1994-01-01
We address the problem of classifying time series according to their morphological features in the time domain. In a supervised machine-learning framework, we induce a classification procedure from a set of preclassified examples. For each class, we infer a model that captures its morphological features using Bayesian model induction and the minimum message length approach to assign priors. In the performance task, we classify a time series into one of the learned classes when there is enough evidence to support that decision. Time series with sufficiently novel features, belonging to classes not present in the training set, are recognized as such. We report results from experiments in a monitoring domain of interest to NASA.
Gu, Yingxin; Brown, Jesslyn F.; Miura, Tomoaki; van Leeuwen, Willem J.D.; Reed, Bradley C.
2010-01-01
This study introduces a new geographic framework, phenological classification, for the conterminous United States based on Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) time-series data and a digital elevation model. The resulting pheno-class map is comprised of 40 pheno-classes, each having unique phenological and topographic characteristics. Cross-comparison of the pheno-classes with the 2001 National Land Cover Database indicates that the new map contains additional phenological and climate information. The pheno-class framework may be a suitable basis for the development of an Advanced Very High Resolution Radiometer (AVHRR)-MODIS NDVI translation algorithm and for various biogeographic studies.
Classification and machine recognition of severe weather patterns
NASA Technical Reports Server (NTRS)
Wang, P. P.; Burns, R. C.
1976-01-01
Forecasting and warning of severe weather conditions are treated from the vantage point of pattern recognition by machine. Pictorial patterns and waveform patterns are distinguished. Time series data on sferics are dealt with by considering waveform patterns. A severe storm pattern recognition machine is described, along with schemes for detection via cross-correlation of time series (same channel or different channels). Syntactic and decision-theoretic approaches to feature extraction are discussed. Active and decayed tornadoes and thunderstorms, lightning discharges, and funnels and their related time series data are studied.
Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P
2014-10-01
There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events, and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.
Samiee, Kaveh; Kovács, Petér; Gabbouj, Moncef
2015-02-01
A system for epileptic seizure detection in electroencephalography (EEG) is described in this paper. One of the challenges is to distinguish rhythmic discharges from nonstationary patterns occurring during seizures. The proposed approach is based on an adaptive and localized time-frequency representation of EEG signals by means of rational functions. The corresponding rational discrete short-time Fourier transform (DSTFT) is a novel feature extraction technique for epileptic EEG data. A multilayer perceptron classifier is fed by the coefficients of the rational DSTFT in order to separate seizure epochs from seizure-free epochs. The effectiveness of the proposed method is compared with several state-of-the-art feature extraction algorithms used in offline epileptic seizure detection. The results of the comparative evaluations show that the proposed method outperforms competing techniques in terms of classification accuracy. In addition, it provides a compact representation of EEG time series.
Robust evaluation of time series classification algorithms for structural health monitoring
NASA Astrophysics Data System (ADS)
Harvey, Dustin Y.; Worden, Keith; Todd, Michael D.
2014-03-01
Structural health monitoring (SHM) systems provide real-time damage and performance information for civil, aerospace, and mechanical infrastructure through analysis of structural response measurements. The supervised learning methodology for data-driven SHM involves computation of low-dimensional, damage-sensitive features from raw measurement data that are then used in conjunction with machine learning algorithms to detect, classify, and quantify damage states. However, these systems often suffer from performance degradation in real-world applications due to varying operational and environmental conditions. Probabilistic approaches to robust SHM system design suffer from incomplete knowledge of all conditions a system will experience over its lifetime. Info-gap decision theory enables nonprobabilistic evaluation of the robustness of competing models and systems in a variety of decision making applications. Previous work employed info-gap models to handle feature uncertainty when selecting various components of a supervised learning system, namely features from a pre-selected family and classifiers. In this work, the info-gap framework is extended to robust feature design and classifier selection for general time series classification through an efficient, interval arithmetic implementation of an info-gap data model. Experimental results are presented for a damage type classification problem on a ball bearing in a rotating machine. The info-gap framework in conjunction with an evolutionary feature design system allows for fully automated design of a time series classifier to meet performance requirements under maximum allowable uncertainty.
Tormene, Paolo; Giorgino, Toni; Quaglini, Silvana; Stefanelli, Mario
2009-01-01
The purpose of this study was to assess the performance of a real-time ("open-end") version of the dynamic time warping (DTW) algorithm for the recognition of motor exercises. Given a possibly incomplete input stream of data and a reference time series, the open-end DTW algorithm computes both the size of the prefix of the reference which is best matched by the input, and the dissimilarity between the matched portions. The algorithm was used to provide real-time feedback to neurological patients undergoing motor rehabilitation. We acquired a dataset of multivariate time series from a sensorized long-sleeve shirt which contains 29 strain sensors distributed on the upper limb. Seven typical rehabilitation exercises were recorded in several variations, both correctly and incorrectly executed, and at various speeds, totaling a data set of 840 time series. Nearest-neighbour classifiers were built according to the outputs of open-end DTW alignments and their global counterparts on exercise pairs. The classifiers were also tested on well-known public datasets from heterogeneous domains. Nonparametric tests show that (1) on full time series the two algorithms achieve the same classification accuracy (p-value = 0.32); (2) on partial time series, classifiers based on open-end DTW have a far higher accuracy (kappa = 0.898 versus kappa = 0.447; p < 10^-5); and (3) the prediction of the matched fraction follows the ground truth closely (root mean square error < 10%). The results hold for the motor rehabilitation and the other datasets tested as well. The open-end variant of the DTW algorithm is suitable for the classification of truncated quantitative time series, even in the presence of noise. Early recognition and accurate class prediction can be achieved, provided that enough variance is available over the time span of the reference. Therefore, the proposed technique expands the use of DTW to a wider range of applications, such as real-time biofeedback systems.
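The open-end idea, matching a possibly incomplete query against every prefix of the reference instead of the whole series, can be sketched for the univariate case. This assumes absolute-difference local costs; the hypothetical `open_end_dtw` below is not the authors' implementation:

```python
import numpy as np

def open_end_dtw(query, reference):
    """Open-end DTW: fill the standard DTW cost matrix, but instead of
    reading only the bottom-right cell (classical DTW), take the minimum
    over the last row, so the query may match only a prefix of the
    reference. Returns (best prefix length, dissimilarity)."""
    n, m = len(query), len(reference)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(query[i - 1] - reference[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    j_star = int(np.argmin(D[n, 1:])) + 1
    return j_star, D[n, j_star]

# Toy example: the query matches the first three reference samples exactly.
prefix_len, dist = open_end_dtw([1, 2, 3], [1, 2, 3, 10, 20])
```

The matched prefix length is what drives the real-time feedback (how far through the exercise the patient is), while the dissimilarity feeds the nearest-neighbour classification.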
NASA Astrophysics Data System (ADS)
Iannacone, J.; Berti, M.; Allievi, J.; Del Conte, S.; Corsini, A.
2013-12-01
Space borne InSAR has proven to be very valuable for landslides detection. In particular, extremely slow landslides (Cruden and Varnes, 1996) can be now clearly identified, thanks to the millimetric precision reached by recent multi-interferometric algorithms. The typical approach in radar interpretation for landslides mapping is based on average annual velocity of the deformation which is calculated over the entire times series. The Hotspot and Cluster Analysis (Lu et al., 2012) and the PSI-based matrix approach (Cigna et al., 2013) are examples of landslides mapping techniques based on average annual velocities. However, slope movements can be affected by non-linear deformation trends, (i.e. reactivation of dormant landslides, deceleration due to natural or man-made slope stabilization, seasonal activity, etc). Therefore, analyzing deformation time series is crucial in order to fully characterize slope dynamics. While this is relatively simple to be carried out manually when dealing with small dataset, the time series analysis over regional scale dataset requires automated classification procedures. Berti et al. (2013) developed an automatic procedure for the analysis of InSAR time series based on a sequence of statistical tests. The analysis allows to classify the time series into six distinctive target trends (0=uncorrelated; 1=linear; 2=quadratic; 3=bilinear; 4=discontinuous without constant velocity; 5=discontinuous with change in velocity) which are likely to represent different slope processes. The analysis also provides a series of descriptive parameters which can be used to characterize the temporal changes of ground motion. All the classification algorithms were integrated into a Graphical User Interface called PSTime. We investigated an area of about 2000 km2 in the Northern Apennines of Italy by using SqueeSAR™ algorithm (Ferretti et al., 2011). 
Two Radarsat-1 data stacks, comprising 112 scenes in descending orbit and 124 scenes in ascending orbit, were processed. The time coverage spans April 2003 to November 2012, with an average temporal frequency of 1 scene/month. Radar interpretation has been carried out by considering average annual velocities as well as acceleration/deceleration trends evidenced by PSTime. Altogether, from the ascending and descending geometries respectively, this approach allowed the detection of 115 and 112 potential landslides on the basis of average displacement rate, and 77 and 79 landslides on the basis of acceleration trends. In conclusion, time series analysis proved to be very valuable for landslide mapping. In particular, it highlighted areas with marked acceleration in a specific period of time while still being affected by low average annual velocity over the entire analysis period. On the other hand, even in areas with high average annual velocity, time series analysis was of primary importance to characterize the slope dynamics in terms of acceleration events.
Non-Linear Dynamical Classification of Short Time Series of the Rössler System in High Noise Regimes
Lainscsek, Claudia; Weyhenmeyer, Jonathan; Hernandez, Manuel E.; Poizner, Howard; Sejnowski, Terrence J.
2013-01-01
Time series analysis with delay differential equations (DDEs) reveals non-linear properties of the underlying dynamical system and can serve as a non-linear time-domain classification tool. Here global DDE models were used to analyze short segments of simulated time series from a known dynamical system, the Rössler system, in high noise regimes. In a companion paper, we apply the DDE model developed here to classify short segments of electroencephalographic (EEG) data recorded from patients with Parkinson’s disease and healthy subjects. Nine simulated subjects in each of two distinct classes were generated by varying the bifurcation parameter b and keeping the other two parameters (a and c) of the Rössler system fixed. All choices of b were in the chaotic parameter range. We diluted the simulated data using white noise ranging from 10 to −30 dB signal-to-noise ratios (SNR). Structure selection was supervised by selecting the number of terms, delays, and order of non-linearity of the DDE model that best linearly separated the two classes of data. The distance d from the linear dividing hyperplane was then used to assess the classification performance by computing the area A′ under the ROC curve. The selected model was tested on untrained data using repeated random sub-sampling validation. DDEs were able to accurately distinguish the two dynamical conditions, and moreover, to quantify the changes in the dynamics. There was a significant correlation between the dynamical bifurcation parameter b of the simulated data and the classification parameter d from our analysis. This correlation still held for new simulated subjects with new dynamical parameters selected from each of the two dynamical regimes. Furthermore, the correlation was robust to added noise, being significant even when the noise was greater than the signal. We conclude that DDE models may be used as a generalizable and reliable classification tool for even small segments of noisy data. PMID:24379798
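The core of the approach, fitting a small DDE model by least squares and using its coefficients as classification features, can be sketched in a few lines of numpy. The delays and model terms below are illustrative placeholders, not the structure selected in the paper:

```python
import numpy as np

def dde_features(x, tau1=5, tau2=10, dt=1.0):
    """Least-squares fit of a toy DDE model
    x'(t) ~ a1*x(t-tau1) + a2*x(t-tau2) + a3*x(t-tau1)*x(t-tau2).
    The fitted coefficients (a1, a2, a3) serve as classification features."""
    dx = np.gradient(x, dt)              # numerical derivative of the series
    t0 = max(tau1, tau2)
    d1 = x[t0 - tau1:len(x) - tau1]      # delayed copy x(t - tau1)
    d2 = x[t0 - tau2:len(x) - tau2]      # delayed copy x(t - tau2)
    A = np.column_stack([d1, d2, d1 * d2])
    coeffs, *_ = np.linalg.lstsq(A, dx[t0:], rcond=None)
    return coeffs
```

In the paper, a hyperplane trained on such model outputs yields the distance d used in the ROC analysis; this sketch shows only the feature-extraction step.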
Target Detection and Classification Using Seismic and PIR Sensors
2012-06-01
This paper presents a wavelet-based method for target detection and classification, validated on data sets from seismic and passive infrared (PIR) sensors. The work makes use of a wavelet-based feature extraction method, called Symbolic Dynamic Filtering (SDF), which partitions the time series for symbolic analysis.
Change classification in SAR time series: a functional approach
NASA Astrophysics Data System (ADS)
Boldt, Markus; Thiele, Antje; Schulz, Karsten; Hinz, Stefan
2017-10-01
Change detection represents a broad field of research in SAR remote sensing, consisting of many different approaches. Besides the simple recognition of change areas, the analysis of the type, category or class of the change areas is at least as important for creating a comprehensive result. Conventional strategies for change classification are based on supervised or unsupervised land-use/land-cover classifications. The main drawback of such approaches is that the quality of the classification result directly depends on the selection of training and reference data. Additionally, supervised processing methods require an experienced operator who capably selects the training samples. This training step is not necessary when using unsupervised strategies, but meaningful reference data must nevertheless be available for identifying the resulting classes. Consequently, an experienced operator is indispensable. In this study, an innovative concept for the classification of changes in SAR time series data is proposed. Regarding the drawbacks of traditional strategies given above, it works without any training data. Moreover, the method can be applied by an operator who does not yet have detailed knowledge of the available scenery; this knowledge is provided by the algorithm. The final step of the procedure, whose main aspect is the iterative optimization of an initial class scheme with respect to the categorized change objects, is the classification of these objects into the final classes. This assignment step is the subject of this paper.
feets: feATURE eXTRACTOR for tIME sERIES
NASA Astrophysics Data System (ADS)
Cabral, Juan; Sanchez, Bruno; Ramos, Felipe; Gurovich, Sebastián; Granitto, Pablo; VanderPlas, Jake
2018-06-01
feets characterizes and analyzes light-curves from astronomical photometric databases for modelling, classification, data cleaning, outlier detection and data analysis. It uses machine learning algorithms to determine the numerical descriptors that characterize and distinguish the different variability classes of light-curves; these range from basic statistical measures such as the mean or standard deviation to complex time-series characteristics such as the autocorrelation function. The library is not restricted to the astronomical field and could also be applied to any kind of time series. This project is a derivative work of FATS (ascl:1711.017).
NASA Astrophysics Data System (ADS)
de Lautour, Oliver R.; Omenzetter, Piotr
2010-07-01
Developed for studying long sequences of regularly sampled data, time series analysis methods are being increasingly investigated for use in Structural Health Monitoring (SHM). In this research, Autoregressive (AR) models were used to fit the acceleration time histories obtained from two experimental structures, a 3-storey bookshelf structure and the ASCE Phase II Experimental SHM Benchmark Structure, in undamaged and a limited number of damaged states. The coefficients of the AR models were considered to be damage-sensitive features and used as input into an Artificial Neural Network (ANN). The ANN was trained to classify damage cases or estimate remaining structural stiffness. The results showed that the combination of AR models and ANNs is an efficient tool for damage classification and estimation, and performs well using a small number of damage-sensitive features and limited sensors.
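The feature-extraction step, fitting an AR model and treating its coefficients as damage-sensitive features, can be sketched as follows (the model order is illustrative, and the ANN stage is omitted):

```python
import numpy as np

def ar_coefficients(x, p=4):
    """Fit an AR(p) model x[t] = a1*x[t-1] + ... + ap*x[t-p] by least
    squares; the coefficient vector is the damage-sensitive feature."""
    A = np.column_stack([x[p - i:len(x) - i] for i in range(1, p + 1)])
    coeffs, *_ = np.linalg.lstsq(A, x[p:], rcond=None)
    return coeffs
```

A shift in these coefficients between undamaged and damaged acceleration histories is the signal the ANN is trained to recognize.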
Signatures of ecological processes in microbial community time series.
Faust, Karoline; Bauchinger, Franziska; Laroche, Béatrice; de Buyl, Sophie; Lahti, Leo; Washburne, Alex D; Gonze, Didier; Widder, Stefanie
2018-06-28
Growth rates, interactions between community members, stochasticity, and immigration are important drivers of microbial community dynamics. In sequencing data analysis, such as network construction and community model parameterization, we make implicit assumptions about the nature of these drivers and thereby restrict the model outcome. Despite the apparent risk of methodological bias, the validity of the assumptions is rarely tested, as comprehensive procedures are lacking. Here, we propose a classification scheme to determine the processes that gave rise to the observed time series and to enable better model selection. We implemented a three-step classification scheme in R that first determines whether dependence between successive time steps (temporal structure) is present in the time series and then assesses with a recently developed neutrality test whether interactions between species are required for the dynamics. If the first and second tests confirm the presence of temporal structure and interactions, then parameters for interaction models are estimated. To quantify the importance of temporal structure, we compute the noise-type profile of the community, which ranges from black in case of strong dependency to white in the absence of any dependency. We applied this scheme to simulated time series generated with the Dirichlet-multinomial (DM) distribution, Hubbell's neutral model, the generalized Lotka-Volterra model and its discrete variant (the Ricker model), and a self-organized instability model, as well as to human stool microbiota time series. The noise-type profiles for all but DM data clearly indicated distinctive structures. The neutrality test correctly classified all but DM and neutral time series as non-neutral. The procedure reliably identified time series for which interaction inference was suitable.
Both tests were required, as we demonstrated that all structured time series, including those generated with the neutral model, achieved a moderate to high goodness of fit to the Ricker model. We present a fast and robust scheme to classify community structure and to assess the prevalence of interactions directly from microbial time series data. The procedure not only serves to determine ecological drivers of microbial dynamics, but also to guide selection of appropriate community models for prediction and follow-up analysis.
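The first step of such a scheme, deciding whether successive time steps are dependent, can be approximated by a permutation test on the lag-1 autocorrelation. This is a generic stand-in, not the authors' R implementation:

```python
import numpy as np

def has_temporal_structure(x, n_perm=999, seed=0, alpha=0.05):
    """Test lag-1 autocorrelation against a permutation null: shuffling
    destroys any dependence between successive time steps, so the observed
    autocorrelation is compared with that of shuffled copies."""
    rng = np.random.default_rng(seed)
    def lag1(v):
        return np.corrcoef(v[:-1], v[1:])[0, 1]
    obs = lag1(x)
    null = np.array([lag1(rng.permutation(x)) for _ in range(n_perm)])
    p = (1 + np.sum(np.abs(null) >= np.abs(obs))) / (n_perm + 1)
    return p < alpha, p
```

A series that fails this test would be routed away from interaction-model fitting in the full scheme.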
5 CFR 330.205 - Agency RPL applications.
Code of Federal Regulations, 2013 CFR
2013-01-01
... register for positions at the same representative rate and work schedule (full-time, part-time, seasonal... grades or pay levels, appointment type (permanent or time-limited), occupations (e.g., position classification series or career groups), and minimum number of hours of work per week, as applicable. ...
NASA Astrophysics Data System (ADS)
Sah, Shagan
An increasingly important application of remote sensing is to provide decision support during emergency response and disaster management efforts. Land cover maps constitute one such useful application product during disaster events; if generated rapidly after any disaster, such map products can contribute to the efficacy of the response effort. In light of recent nuclear incidents, e.g., after the earthquake/tsunami in Japan (2011), our research focuses on constructing rapid and accurate land cover maps of the impacted area in case of an accidental nuclear release. The methodology involves integration of results from two different approaches, namely coarse spatial resolution multi-temporal and fine spatial resolution imagery, to increase classification accuracy. Although advanced methods have been developed for classification using high spatial or temporal resolution imagery, only a limited amount of work has been done on fusion of these two remote sensing approaches. The presented methodology thus involves integration of classification results from two different remote sensing modalities in order to improve classification accuracy. The data used included RapidEye and MODIS scenes over the Nine Mile Point Nuclear Power Station in Oswego (New York, USA). The first step in the process was the construction of land cover maps from freely available, high temporal resolution, low spatial resolution MODIS imagery using a time-series approach. We used the variability in the temporal signatures among different land cover classes for classification. The time series-specific features were defined by various physical properties of a pixel, such as variation in vegetation cover and water content over time. The pixels were classified into four land cover classes - forest, urban, water, and vegetation - using Euclidean and Mahalanobis distance metrics. 
On the other hand, a high spatial resolution commercial satellite, such as RapidEye, can be tasked to capture images over the affected area in the case of a nuclear event. This imagery served as a second source of data to augment results from the time series approach. The classifications from the two approaches were integrated using an a posteriori probability-based fusion approach. This was done by establishing a relationship between the classes, obtained after classification of the two data sources. Despite the coarse spatial resolution of MODIS pixels, acceptable accuracies were obtained using time series features. The overall accuracies using the fusion-based approach were in the neighborhood of 80%, when compared with GIS data sets from New York State. This fusion thus contributed to classification accuracy refinement, with a few additional advantages, such as correction for cloud cover and providing for an approach that is robust against point-in-time seasonal anomalies, due to the inclusion of multi-temporal data. We concluded that this approach is capable of generating land cover maps of acceptable accuracy and rapid turnaround, which in turn can yield reliable estimates of crop acreage of a region. The final algorithm is part of an automated software tool, which can be used by emergency response personnel to generate a nuclear ingestion pathway information product within a few hours of data collection.
Cross-entropy clustering framework for catchment classification
NASA Astrophysics Data System (ADS)
Tongal, Hakan; Sivakumar, Bellie
2017-09-01
There is an increasing interest in catchment classification and regionalization in hydrology, as they are useful for identification of appropriate model complexity and transfer of information from gauged catchments to ungauged ones, among others. This study introduces a nonlinear cross-entropy clustering (CEC) method for classification of catchments. The method specifically considers embedding dimension (m), sample entropy (SampEn), and coefficient of variation (CV) to represent dimensionality, complexity, and variability of the time series, respectively. The method is applied to daily streamflow time series from 217 gauging stations across Australia. The results suggest that a combination of linear and nonlinear parameters (i.e. m, SampEn, and CV), representing different aspects of the underlying dynamics of streamflows, could be useful for determining distinct patterns of flow generation mechanisms within a nonlinear clustering framework. For the 217 streamflow time series, nine hydrologically homogeneous clusters that have distinct patterns of flow regime characteristics and specific dominant hydrological attributes with different climatic features are obtained. Comparison of the results with those obtained using the widely employed k-means clustering method (which results in five clusters, with the loss of some information about the features of the clusters) suggests the superiority of the cross-entropy clustering method. The outcomes from this study provide a useful guideline for employing the nonlinear dynamic approaches based on hydrologic signatures and for gaining an improved understanding of streamflow variability at a large scale.
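Two of the three clustering inputs, sample entropy and the coefficient of variation, can be computed directly from a flow series. The sketch below uses a standard SampEn formulation with illustrative defaults (m = 2, r = 0.2·σ), not necessarily the settings of the study:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) = -log(A/B), where B counts pairs of length-m template
    windows within Chebyshev tolerance r, and A the same for length m+1."""
    x = np.asarray(x, float)
    if r is None:
        r = 0.2 * np.std(x)
    def count(mm):
        n = len(x) - mm + 1
        templ = np.array([x[i:i + mm] for i in range(n)])
        c = 0
        for i in range(n - 1):  # exclude self-matches
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            c += np.sum(d <= r)
        return c
    B, A = count(m), count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def flow_features(q):
    """(SampEn, CV) pair used alongside the embedding dimension m in the
    clustering; m is treated as fixed here for illustration."""
    return np.array([sample_entropy(q), np.std(q) / np.mean(q)])
```

A regular series scores lower sample entropy than an irregular one, which is what lets the index separate flow-generation regimes.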
Highly comparative time-series analysis: the empirical structure of time series and their methods.
Fulcher, Ben D; Little, Max A; Jones, Nick S
2013-06-06
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
Delay differential analysis of time series.
Lainscsek, Claudia; Sejnowski, Terrence J
2015-03-01
Nonlinear dynamical system analysis based on embedding theory has been used for modeling and prediction, but it also has applications to signal detection and classification of time series. An embedding creates a multidimensional geometrical object from a single time series. Traditionally either delay or derivative embeddings have been used. The delay embedding is composed of delayed versions of the signal, and the derivative embedding is composed of successive derivatives of the signal. The delay embedding has been extended to nonuniform embeddings to take multiple timescales into account. Both embeddings provide information on the underlying dynamical system without having direct access to all the system variables. Delay differential analysis is based on functional embeddings, a combination of the derivative embedding with nonuniform delay embeddings. Small delay differential equation (DDE) models that best represent relevant dynamic features of time series data are selected from a pool of candidate models for detection or classification. We show that the properties of DDEs support spectral analysis in the time domain where nonlinear correlation functions are used to detect frequencies, frequency and phase couplings, and bispectra. These can be efficiently computed with short time windows and are robust to noise. For frequency analysis, this framework is a multivariate extension of discrete Fourier transform (DFT), and for higher-order spectra, it is a linear and multivariate alternative to multidimensional fast Fourier transform of multidimensional correlations. This method can be applied to short or sparse time series and can be extended to cross-trial and cross-channel spectra if multiple short data segments of the same experiment are available. 
Together, this time-domain toolbox provides higher temporal resolution, increased frequency and phase coupling information, and it allows an easy and straightforward implementation of higher-order spectra across time compared with frequency-based methods such as the DFT and cross-spectral analysis.
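In its simplest linear case, the time-domain frequency detection described above reduces to correlating a short window with sine and cosine probes, i.e. a single-bin DFT; the sketch below shows that degenerate case only, not the nonlinear correlation functions or higher-order spectra:

```python
import numpy as np

def power_at(x, freq, fs):
    """Time-domain estimate of signal power at `freq`: correlate the
    window with sine and cosine probes. This equals one bin of a DFT but
    is computable on arbitrarily short (or sparse) windows."""
    t = np.arange(len(x)) / fs
    c = np.mean(x * np.cos(2 * np.pi * freq * t))
    s = np.mean(x * np.sin(2 * np.pi * freq * t))
    return c * c + s * s

fs = 100.0
t = np.arange(256) / fs
x = np.sin(2 * np.pi * 10 * t)   # 2.56 s window containing a 10 Hz tone
```

The probe frequency can be swept over a short window, which is how the framework recovers spectra without leaving the time domain.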
NASA Astrophysics Data System (ADS)
Gattano, C.; Lambert, S.; Bizouard, C.
2017-12-01
In the context of selecting sources defining the celestial reference frame, we compute astrometric time series of all VLBI radio sources from observations in the International VLBI Service database. The time series are then analyzed with the Allan variance in order to estimate the astrometric stability. From the results, we establish a new classification that takes into account the full multi-time-scale information. The algorithm is flexible on the definition of "stable source" through an adjustable threshold.
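The stability estimate rests on the Allan variance of each astrometric series at a range of averaging times; a minimal non-overlapping version can be written as:

```python
import numpy as np

def allan_variance(x, m):
    """Non-overlapping Allan variance at averaging length m: half the
    mean squared difference of successive m-sample averages. Sweeping m
    probes the stability of the series at multiple time scales."""
    n = len(x) // m
    means = x[:n * m].reshape(n, m).mean(axis=1)
    return 0.5 * np.mean(np.diff(means) ** 2)
```

For white noise the Allan variance falls off as 1/m, while drifting or correlated series flatten out or rise, which is the behavior a multi-scale classification can threshold on.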
NASA Astrophysics Data System (ADS)
Nitze, Ingmar; Barrett, Brian; Cawkwell, Fiona
2015-02-01
The analysis and classification of land cover is one of the principal applications in terrestrial remote sensing. Due to the seasonal variability of different vegetation types and land surface characteristics, the ability to discriminate land cover types changes over time. Multi-temporal classification can help to improve the classification accuracies, but different constraints, such as financial restrictions or atmospheric conditions, may impede their application. The optimisation of image acquisition timing and frequencies can help to increase the effectiveness of the classification process. For this purpose, the Feature Importance (FI) measure of the state-of-the-art machine learning method Random Forest was used to determine the optimal image acquisition periods for a general (Grassland, Forest, Water, Settlement, Peatland) and Grassland specific (Improved Grassland, Semi-Improved Grassland) land cover classification in central Ireland based on a 9-year time-series of MODIS Terra 16 day composite data (MOD13Q1). Feature Importances for each acquisition period of the Enhanced Vegetation Index (EVI) and Normalised Difference Vegetation Index (NDVI) were calculated for both classification scenarios. In the general land cover classification, the months December and January showed the highest, and July and August the lowest separability for both VIs over the entire nine-year period. This temporal separability was reflected in the classification accuracies, where the optimal choice of image dates outperformed the worst image date by 13% using NDVI and 5% using EVI on a mono-temporal analysis. With the addition of the next best image periods to the data input the classification accuracies converged quickly to their limit at around 8-10 images. The binary classification schemes, using two classes only, showed a stronger seasonal dependency with a higher intra-annual, but lower inter-annual variation.
Nonetheless, anomalous weather conditions, such as the cold winter of 2009/2010, can alter the temporal separability pattern significantly. Due to the extensive use of the NDVI for land cover discrimination, the findings of this study should be transferable to data from other optical sensors with a higher spatial resolution. However, the high impact of outliers from the general climatic pattern highlights the limitation of spatial transferability to locations with different climatic and land cover conditions. The use of high-temporal, moderate resolution data such as MODIS in conjunction with machine-learning techniques proved to be a good base for the prediction of image acquisition timing for optimal land cover classification results.
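The mechanics of ranking acquisition periods by Random Forest Feature Importance can be illustrated with scikit-learn on synthetic data; the 23 "periods" per year and the single informative one are made up for the demo, not the MOD13Q1 setup:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Toy stand-in: 23 vegetation-index "acquisition periods" per sample;
# only period 3 actually separates the two land-cover classes.
n = 300
X = rng.normal(size=(n, 23))
y = (X[:, 3] > 0).astype(int)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
# feature_importances_ ranks the periods by their contribution to the split
best_period = int(np.argmax(rf.feature_importances_))
```

In the study this ranking is what identified December/January composites as the most separable and July/August as the least.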
NASA Astrophysics Data System (ADS)
Sun, Chao; Liu, Yongxue; Zhao, Saishuai; Zhou, Minxi; Yang, Yuhao; Li, Feixue
2016-03-01
Salt marshes are seen as the most dynamic and valuable ecosystems in coastal zones, and in these areas, it is crucial to obtain accurate remote sensing information on the spatial distributions of species over time. However, discriminating various types of salt marsh is rather difficult because of their strong spectral similarities. Previous salt marsh mapping studies have focused mainly on high spatial and spectral (i.e., hyperspectral) resolution images combined with auxiliary information; however, the results are often limited to small regions. With a high temporal and moderate spatial resolution, the Chinese HuanJing-1 (HJ-1) satellite optical imagery can be used not only to monitor phenological changes of salt marsh vegetation over short-time intervals, but also to obtain coverage of large areas. Here, we apply HJ-1 satellite imagery to the middle coast of Jiangsu in east China to monitor changes in saltmarsh vegetation cover. First, we constructed a monthly NDVI time-series to classify various types of salt marsh and then we tested the possibility of using compressed time-series continuously, to broaden the applicability of this particular approach. Our principal findings are as follows: (1) the overall accuracy of salt marsh mapping based on the monthly NDVI time-series was 90.3%, which was ∼16.0% higher than the single-phase classification strategy; (2) a compressed time-series, including NDVI from six key months (April, June-September, and November), demonstrated very little reduction (2.3%) in overall accuracy but led to obvious improvements in unstable regions; and (3) a simple rule for Spartina alterniflora identification was established using a scene solely from November, which may provide an effective way for regularly monitoring its distribution.
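The two basic ingredients here, a per-scene NDVI and the assignment of a monthly NDVI series to the closest class profile, can be sketched as follows (the centroids are placeholders, not the Jiangsu salt marsh classes):

```python
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def classify_series(series, centroids):
    """Assign a monthly NDVI series to the nearest class centroid
    (Euclidean distance), a simple stand-in for the per-pixel step."""
    dists = {name: np.linalg.norm(series - c) for name, c in centroids.items()}
    return min(dists, key=dists.get)
```

The paper's compressed variant would simply restrict `series` to the six key months (April, June to September, and November).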
On the identification of sleep stages in mouse electroencephalography time-series.
Lampert, Thomas; Plano, Andrea; Austin, Jim; Platt, Bettina
2015-05-15
The automatic identification of sleep stages in electroencephalography (EEG) time-series is a long desired goal for researchers concerned with the study of sleep disorders. This paper presents advances towards achieving this goal, with particular application to EEG time-series recorded from mice. Approaches in the literature apply supervised learning classifiers; however, these do not reach the performance levels required for use within a laboratory. In this paper, detection reliability is increased, most notably in the case of REM stage identification, by naturally decomposing the problem and applying a support vector machine (SVM) based classifier to each of the EEG channels. Their outputs are integrated within a multiple classifier system. Furthermore, there exists no general consensus on the ideal choice of parameter values in such systems. Therefore, an investigation into the effects upon the classification performance is presented by varying parameters such as the epoch length, feature size, number of training samples, and the method for calculating the power spectral density estimate. Finally, the results of these investigations are brought together to demonstrate the performance of the proposed classification algorithm in two cases: intra-animal classification and inter-animal classification. It is shown that, within a dataset of 10 EEG recordings, and using less than 1% of an EEG as training data, mean classification errors of Awake 6.45%, NREM 5.82%, and REM 6.65% (with standard deviations less than 0.6%) are achieved in intra-animal analysis and, when using the equivalent of 7% of one EEG as training data, Awake 10.19%, NREM 7.75%, and REM 17.43% are achieved in inter-animal analysis (with mean standard deviations of 6.42%, 2.89%, and 9.69% respectively). A software package implementing the proposed approach will be made available through Cybula Ltd. Copyright © 2015 Elsevier B.V. All rights reserved.
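The multiple-classifier idea, one SVM per EEG channel with a majority vote over channels, can be sketched with scikit-learn; the features, kernel settings, and data below are illustrative, not the paper's configuration:

```python
import numpy as np
from sklearn.svm import SVC

class ChannelVoteClassifier:
    """One SVM per channel; per-channel predictions are combined by
    majority vote across channels, a simplified multiple-classifier system."""
    def fit(self, X_channels, y):
        # X_channels: list of (n_samples, n_features) arrays, one per channel
        self.models = [SVC(kernel='rbf', gamma='scale').fit(Xc, y)
                       for Xc in X_channels]
        return self

    def predict(self, X_channels):
        votes = np.stack([m.predict(Xc)
                          for m, Xc in zip(self.models, X_channels)])
        # majority vote across channels for each sample
        return np.array([np.bincount(col).argmax() for col in votes.T])
```

Decomposing by channel lets each SVM specialize on the channel where a stage (e.g. REM) is most visible, which is where the paper reports the largest gain.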
Spatio-temporal Event Classification using Time-series Kernel based Structured Sparsity
Jeni, László A.; Lőrincz, András; Szabó, Zoltán; Cohn, Jeffrey F.; Kanade, Takeo
2016-01-01
In many behavioral domains, such as facial expression and gesture, sparse structure is prevalent. This sparsity would be well suited for event detection but for one problem: features typically are confounded by alignment error in space and time. As a consequence, high-dimensional representations such as SIFT and Gabor features have been favored despite their much greater computational cost and potential loss of information. We propose a Kernel Structured Sparsity (KSS) method that can handle both the temporal alignment problem and the structured sparse reconstruction within a common framework, and it can rely on simple features. We characterize spatio-temporal events as time-series of motion patterns and by utilizing time-series kernels we apply standard structured-sparse coding techniques to tackle this important problem. We evaluated the KSS method using both gesture and facial expression datasets that include spontaneous behavior and differ in degree of difficulty and type of ground truth coding. KSS outperformed both sparse and non-sparse methods that utilize complex image features and their temporal extensions. In the case of early facial event classification, KSS had 10% higher accuracy as measured by F1 score over kernel SVM methods. PMID:27830214
Hybrid analysis for indicating patients with breast cancer using temperature time series.
Silva, Lincoln F; Santos, Alair Augusto S M D; Bravo, Renato S; Silva, Aristófanes C; Muchaluat-Saade, Débora C; Conci, Aura
2016-07-01
Breast cancer is the most common cancer among women worldwide. Diagnosis and treatment in early stages increase cure chances. The temperature of cancerous tissue is generally higher than that of healthy surrounding tissues, making thermography an option to be considered in screening strategies of this cancer type. This paper proposes a hybrid methodology for analyzing dynamic infrared thermography in order to indicate patients with risk of breast cancer, using unsupervised and supervised machine learning techniques, which characterizes the methodology as hybrid. The dynamic infrared thermography monitors or quantitatively measures temperature changes on the examined surface, after a thermal stress. In the dynamic infrared thermography execution, a sequence of breast thermograms is generated. In the proposed methodology, this sequence is processed and analyzed by several techniques. First, the region of the breasts is segmented and the thermograms of the sequence are registered. Then, temperature time series are built and the k-means algorithm is applied on these series using various values of k. Clustering formed by k-means algorithm, for each k value, is evaluated using clustering validation indices, generating values treated as features in the classification model construction step. A data mining tool was used to solve the combined algorithm selection and hyperparameter optimization (CASH) problem in classification tasks. Besides the classification algorithm recommended by the data mining tool, classifiers based on Bayesian networks, neural networks, decision rules and decision tree were executed on the data set used for evaluation. Test results support that the proposed analysis methodology is able to indicate patients with breast cancer. Among 39 tested classification algorithms, K-Star and Bayes Net presented 100% classification accuracy. 
Furthermore, among the Bayes Net, multi-layer perceptron, decision table and random forest classification algorithms, an average accuracy of 95.38% was obtained. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
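The unsupervised half of the pipeline, running k-means for several values of k and turning cluster-validity indices into features for the later classifiers, might look like this with scikit-learn; the silhouette index stands in for the paper's full set of validation indices:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def clustering_features(series_matrix, ks=(2, 3, 4, 5)):
    """Cluster the temperature time series (rows) with k-means for each k
    and record a cluster-validity index per k as one feature value."""
    feats = []
    for k in ks:
        labels = KMeans(n_clusters=k, n_init=10,
                        random_state=0).fit_predict(series_matrix)
        feats.append(silhouette_score(series_matrix, labels))
    return np.array(feats)
```

The resulting per-patient feature vector is what the supervised stage (K-Star, Bayes Net, etc.) would then classify.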
Classification of polytype structures of zinc sulfide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laptev, V.I.
1994-12-31
It is suggested that the existing classification of polytype structures of zinc sulfide be supplemented with an additional criterion: the characteristic of regular point systems (Wyckoff positions), including their type, number, and multiplicity. The consideration of the Wyckoff positions allowed the establishment of construction principles of known polytype series of different symmetries and the systematization (for the first time) of the polytypes with the same number of differently packed layers. The classification suggested for polytype structures of zinc sulfide is compact and provides a basis for creating search systems. The classification table obtained can also be used for numerous silicon carbide polytypes. 8 refs., 4 tabs.
A window-based time series feature extraction method.
Katircioglu-Öztürk, Deniz; Güvenir, H Altay; Ravens, Ursula; Baykal, Nazife
2017-10-01
This study proposes a robust similarity score-based time series feature extraction method termed Window-based Time series Feature ExtraCtion (WTC). WTC generates domain-interpretable results and has significantly lower computational complexity, rendering it useful for densely sampled and populated time series datasets. In this study, WTC is applied to a proprietary action potential (AP) time series dataset on human cardiomyocytes and to three precordial leads from a publicly available electrocardiogram (ECG) dataset. WTC is then compared, in terms of predictive accuracy and computational complexity, with the shapelet transform and the fast shapelet transform (an accelerated variant of the shapelet transform). The results indicate that WTC achieves slightly higher classification performance with significantly lower execution time than its shapelet-based alternatives. With its interpretable features, WTC has the potential to help medical experts explore definitive common trends in novel datasets. Copyright © 2017 Elsevier Ltd. All rights reserved.
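The abstract does not spell out WTC's scoring function, so the following is only a generic window-based similarity feature in the same spirit (and in the spirit of the shapelet transform it is compared against): the feature for a series is its best-match distance to a reference window. The function name and the squared-Euclidean choice are assumptions, not the paper's method.

```python
def window_feature(series, reference):
    """Best-match score: the smallest squared Euclidean distance between
    `reference` and any equal-length sliding window of `series`."""
    w = len(reference)
    return min(sum((series[i + j] - reference[j]) ** 2 for j in range(w))
               for i in range(len(series) - w + 1))
```

A series containing the reference pattern scores 0; dissimilar series score higher, giving a discriminative, interpretable feature per reference window.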
Crop classification and mapping based on Sentinel missions data in cloud environment
NASA Astrophysics Data System (ADS)
Lavreniuk, M. S.; Kussul, N.; Shelestov, A.; Vasiliev, V.
2017-12-01
The availability of high resolution satellite imagery (Sentinel-1/2/3, Landsat) over large territories opens new opportunities in agricultural monitoring. In particular, it becomes feasible to solve crop classification and crop mapping tasks at country and regional scale using time series of heterogeneous satellite imagery. But in this case we face the problem of Big Data: dealing with time series of high resolution (10 m) multispectral imagery requires downloading and then processing huge volumes of data. The solution is to move the "processing chain" closer to the data itself, drastically shortening the time needed for data transfer. A further advantage of this approach is the possibility to parallelize the data processing workflow and efficiently implement machine learning algorithms. This can be done with a cloud platform where the Sentinel imagery is stored. In this study, we investigate the usability and efficiency of two different cloud platforms, Amazon and Google, for crop classification and crop mapping problems. Two pilot areas were investigated: Ukraine and England. Google provides the user-friendly Google Earth Engine environment for Earth observation applications, with many data processing and machine learning tools already deployed. With Amazon, on the other hand, one gets much more flexibility in implementing one's own workflow. A detailed analysis of the pros and cons will be given in the presentation.
Uncertain Classification of Variable Stars: Handling Observational Gaps and Noise
NASA Astrophysics Data System (ADS)
Castro, Nicolás; Protopapas, Pavlos; Pichara, Karim
2018-01-01
Automatic classification methods applied to sky surveys have revolutionized the astronomical target selection process. Most surveys generate a vast number of time series, or "lightcurves," that represent the brightness variability of stellar objects over time. Unfortunately, a lightcurve's observations take several years to complete, producing truncated time series that generally go without automatic classification until they are finished. This happens because state-of-the-art methods rely on a variety of statistical descriptors, or features, whose dispersion increases as the number of observations decreases, which reduces their precision. In this paper, we propose a novel method that increases the performance of automatic classifiers of variable stars by incorporating the deviations that scarcity of observations produces. Our method uses Gaussian process regression to form a probabilistic model of each lightcurve's observations. Then, based on this model, bootstrapped samples of the time series features are generated. Finally, a bagging approach is used to improve the overall performance of the classification. We perform tests on the MAssive Compact Halo Object (MACHO) and Optical Gravitational Lensing Experiment (OGLE) catalogs; the results show that our method effectively classifies some variability classes using a small fraction of the original observations. For example, we found that RR Lyrae stars can be classified with ~80% accuracy after observing only the first 5% of the whole lightcurves' observations in the MACHO and OGLE catalogs. We believe these results show that, when studying lightcurves, it is important to consider the features' error and how the measurement process impacts it.
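The bootstrap step described above can be sketched as follows, taking the Gaussian process fit as given: assume the GP posterior supplies a predictive mean and standard deviation at each observation time, draw synthetic lightcurves from it, and summarize a feature's distribution over the draws. The function names and the amplitude feature in the test are illustrative assumptions.

```python
import random
import statistics

def bootstrap_feature(means, sigmas, feature, n_samples=200, seed=1):
    """Draw synthetic lightcurves from a per-point predictive model
    (e.g. a GP posterior mean/std at each observed time) and return
    the mean and spread of a scalar feature over those draws."""
    rng = random.Random(seed)
    draws = [feature([rng.gauss(m, s) for m, s in zip(means, sigmas)])
             for _ in range(n_samples)]
    return statistics.mean(draws), statistics.stdev(draws)
```

The per-feature spread is exactly what a bagging classifier can exploit to down-weight features that are unstable on a truncated lightcurve.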
A multi-temporal analysis approach for land cover mapping in support of nuclear incident response
NASA Astrophysics Data System (ADS)
Sah, Shagan; van Aardt, Jan A. N.; McKeown, Donald M.; Messinger, David W.
2012-06-01
Remote sensing can be used to rapidly generate land use maps that assist emergency response personnel with resource deployment decisions and impact assessments. In this study we focus on constructing accurate land cover maps of the impacted area in the case of a nuclear material release. The proposed methodology integrates results from two different approaches to increase classification accuracy. The data used included RapidEye scenes over the Nine Mile Point Nuclear Power Station (Oswego, NY). The first step was building a coarse-scale land cover map from freely available, high temporal resolution MODIS data using a time-series approach. In the case of a nuclear accident, high spatial resolution commercial satellites such as RapidEye or IKONOS can acquire images of the affected area. Land use maps from the two image sources were integrated using a probability-based approach. Classification results were obtained for four land classes - forest, urban, water and vegetation - using Euclidean and Mahalanobis distances as metrics. Despite the coarse resolution of MODIS pixels, acceptable accuracies were obtained using time series features. The overall accuracies of the fusion-based approach were in the neighborhood of 80% when compared with GIS data sets from New York State. The fused approach improved the classifications and brought a few supplementary advantages, such as correction for cloud cover and independence from time of year. We concluded that this method can generate highly accurate land maps from coarse spatial resolution time series satellite imagery and a single-date, high spatial resolution, multi-spectral image.
Weighted statistical parameters for irregularly sampled time series
NASA Astrophysics Data System (ADS)
Rimoldini, Lorenzo
2014-01-01
Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time and corrupt measurements, for example, or inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. Irregular sampling often causes clumps of measurements and gaps with no data which can severely disrupt the values of estimators. This paper aims at improving the accuracy of common statistical parameters when linear interpolation (in time or phase) can be considered an acceptable approximation of a deterministic signal. A pragmatic solution is formulated in terms of a simple weighting scheme, adapting to the sampling density and noise level, applicable to large data volumes at minimal computational cost. Tests on time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the suggested scheme confirmed the benefits of the improved input attributes. The classification of eclipsing binaries, Mira, RR Lyrae, Delta Cephei and Alpha2 Canum Venaticorum stars employing exclusively weighted descriptive statistics achieved an overall accuracy of 92 per cent, about 6 per cent higher than with unweighted estimators.
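A simplified stand-in for the weighting idea above can be sketched as follows: weight each observation by half the time span to its neighbours, so that measurements inside dense clumps contribute less to an estimator. The paper's scheme also adapts to the noise level, which this sketch omits; function names are illustrative.

```python
def trapezoidal_weights(times):
    """Weight each observation by half the span to its neighbours, so
    measurements inside dense clumps contribute less."""
    n = len(times)
    w = []
    for i in range(n):
        left = times[i] - times[i - 1] if i > 0 else 0.0
        right = times[i + 1] - times[i] if i < n - 1 else 0.0
        w.append((left + right) / 2.0)
    total = sum(w)
    return [x / total for x in w]

def weighted_mean(times, values):
    """Sampling-aware mean: equivalent to integrating a linear
    interpolation of the series over the observation interval."""
    return sum(wi * v for wi, v in zip(trapezoidal_weights(times), values))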
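A simplified stand-in for the weighting idea above can be sketched as follows: weight each observation by half the time span to its neighbours, so that measurements inside dense clumps contribute less to an estimator. The paper's scheme also adapts to the noise level, which this sketch omits; function names are illustrative.

```python
def trapezoidal_weights(times):
    """Weight each observation by half the span to its neighbours, so
    measurements inside dense clumps contribute less."""
    n = len(times)
    w = []
    for i in range(n):
        left = times[i] - times[i - 1] if i > 0 else 0.0
        right = times[i + 1] - times[i] if i < n - 1 else 0.0
        w.append((left + right) / 2.0)
    total = sum(w)
    return [x / total for x in w]

def weighted_mean(times, values):
    """Sampling-aware mean, equivalent to integrating a linear
    interpolation of the series over the observed interval."""
    return sum(wi * v for wi, v in zip(trapezoidal_weights(times), values))
```

On a uniform grid this reduces to the ordinary mean; with a clump of points, the clump no longer dominates the estimate.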
Predicting Flood in Perlis Using Ant Colony Optimization
NASA Astrophysics Data System (ADS)
Nadia Sabri, Syaidatul; Saian, Rizauddin
2017-06-01
Flood forecasting is widely studied in order to reduce the effects of floods, such as loss of property, loss of life and contamination of water supplies. Floods usually occur due to continuous heavy rainfall. This study used a variant of the Ant Colony Optimization (ACO) algorithm named Ant-Miner to develop a classification prediction model for floods. However, since Ant-Miner accepts only discrete data while rainfall data is a time series, a pre-processing step is needed to discretize the rainfall data first. This study used a technique called Symbolic Aggregate Approximation (SAX) to convert the rainfall time series into discrete data. In addition, the Simple K-Means algorithm was used to cluster the data produced by SAX. The findings show that the predictive accuracy of the classification prediction model is more than 80%.
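The SAX discretization step described above follows a standard recipe: z-normalize the series, reduce it with piecewise aggregate approximation (PAA), then map each segment mean to a letter using Gaussian breakpoints. A minimal sketch for alphabet size 4 (the alphabet size and segment count the study used are not stated in the abstract):

```python
import math

BREAKPOINTS = [-0.6745, 0.0, 0.6745]   # N(0,1) quartiles -> alphabet "abcd"

def znorm(series):
    """Zero mean, unit variance; constant series map to all zeros."""
    m = sum(series) / len(series)
    sd = math.sqrt(sum((x - m) ** 2 for x in series) / len(series))
    return [(x - m) / sd for x in series] if sd > 0 else [0.0] * len(series)

def paa(series, segments):
    """Piecewise aggregate approximation: mean of each segment."""
    n = len(series)
    bounds = [i * n // segments for i in range(segments + 1)]
    return [sum(series[a:b]) / (b - a) for a, b in zip(bounds, bounds[1:])]

def sax(series, segments):
    """SAX word: one letter per PAA segment via Gaussian breakpoints."""
    return "".join("abcd"[sum(v > b for b in BREAKPOINTS)]
                   for v in paa(znorm(series), segments))
```

A low-then-high rainfall series maps to a word with a low letter followed by a high one, which Ant-Miner can then consume as discrete attributes.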
Characterizing artifacts in RR stress test time series.
Astudillo-Salinas, Fabian; Palacio-Baus, Kenneth; Solano-Quinde, Lizandro; Medina, Ruben; Wong, Sara
2016-08-01
Electrocardiographic stress test records contain many artifacts. In this paper we explore a simple method to characterize the amount of artifacts present in unprocessed RR stress test time series. Four time series classes were defined: Very good lead, Good lead, Low quality lead and Useless lead. Sixty-five 8-lead ECG stress test records were analyzed. First, the RR time series were annotated by two experts. The automatic methodology is based on dividing the RR time series into non-overlapping windows. Each window is marked as noisy whenever it exceeds an established standard deviation threshold (SDT). Series are classified according to the percentage of windows that exceed a given value, based upon the first manual annotation. Different SDTs were explored. Results show that an SDT close to 20% (as a percentage of the mean) provides the best results. Agreement between the two annotators' classifications is 70.77%, whereas agreement between the second annotator and the best-performing automatic method is larger than 63%. Leads classified as Very good leads and Good leads could be combined to improve automatic heartbeat labeling.
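The windowing-and-threshold procedure above can be sketched as follows. The abstract gives the SDT (about 20% of the mean) but not the per-class percentage boundaries, so the `cuts` values below are illustrative assumptions, as are the window length and function names.

```python
import statistics

LABELS = ["Very good lead", "Good lead", "Low quality lead", "Useless lead"]

def grade_rr_lead(rr, window=10, sd_pct=20.0, cuts=(5.0, 20.0, 50.0)):
    """Mark non-overlapping windows noisy when their SD exceeds sd_pct%
    of the series mean, then grade the lead by the noisy-window share.
    The `cuts` class boundaries are illustrative, not from the paper."""
    limit = sd_pct / 100.0 * statistics.mean(rr)
    windows = [rr[i:i + window] for i in range(0, len(rr) - window + 1, window)]
    noisy_pct = 100.0 * sum(statistics.pstdev(w) > limit
                            for w in windows) / len(windows)
    for cut, label in zip(cuts, LABELS):
        if noisy_pct <= cut:
            return label
    return LABELS[-1]
```

A steady RR series grades as a Very good lead, while a series whose every window is highly variable falls through all cut-offs.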
Multiscale limited penetrable horizontal visibility graph for analyzing nonlinear time series
NASA Astrophysics Data System (ADS)
Gao, Zhong-Ke; Cai, Qing; Yang, Yu-Xuan; Dang, Wei-Dong; Zhang, Shan-Shan
2016-10-01
The visibility graph has established itself as a powerful tool for analyzing time series. In this paper we develop a novel multiscale limited penetrable horizontal visibility graph (MLPHVG). We use nonlinear time series from two typical complex systems, i.e., EEG signals and two-phase flow signals, to demonstrate the effectiveness of our method. Combining MLPHVG with a support vector machine, we detect epileptic seizures from EEG signals recorded from healthy subjects and epilepsy patients, with a classification accuracy of 100%. In addition, we derive MLPHVGs from oil-water two-phase flow signals and find that the average clustering coefficient at different scales allows three typical oil-water flow patterns to be faithfully identified and characterized. These findings render our MLPHVG method particularly useful for analyzing nonlinear time series from the perspective of multiscale network analysis.
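The graph construction above can be sketched at one scale as follows: in a limited penetrable horizontal visibility graph, two points are connected when at most ρ intermediate points block the horizontal line of sight between them, and the multiscale part coarse-grains the series before building each graph. A minimal sketch with illustrative function names:

```python
def lphvg_edges(series, rho=1):
    """Limited penetrable horizontal visibility graph: connect i and j
    when at most `rho` intermediate points block the horizontal line
    of sight, i.e. have values >= min(series[i], series[j])."""
    edges = set()
    n = len(series)
    for i in range(n):
        for j in range(i + 1, n):
            blockers = sum(series[k] >= min(series[i], series[j])
                           for k in range(i + 1, j))
            if blockers <= rho:
                edges.add((i, j))
    return edges

def coarse_grain(series, scale):
    """Multiscale step: average consecutive points in windows of `scale`."""
    return [sum(series[i:i + scale]) / scale
            for i in range(0, len(series) - scale + 1, scale)]
```

With ρ=0 this reduces to the ordinary horizontal visibility graph; ρ=1 lets one blocking point be "penetrated", which is what adds robustness to noise.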
Interpretable Early Classification of Multivariate Time Series
ERIC Educational Resources Information Center
Ghalwash, Mohamed F.
2013-01-01
Recent advances in technology have led to an explosion in data collection over time rather than in a single snapshot. For example, microarray technology allows us to measure gene expression levels in different conditions over time. Such temporal data grants the opportunity for data miners to develop algorithms to address domain-related problems,…
Time Series of Images to Improve Tree Species Classification
NASA Astrophysics Data System (ADS)
Miyoshi, G. T.; Imai, N. N.; de Moraes, M. V. A.; Tommaselli, A. M. G.; Näsi, R.
2017-10-01
Tree species classification provides valuable information for forest monitoring and management. The high floristic variation of tree species makes their classification challenging because vegetation characteristics change with the seasons. To help monitor this complex environment, imaging spectroscopy has been widely applied since the development of miniaturized sensors attached to Unmanned Aerial Vehicles (UAVs). Considering the seasonal changes in forests and the higher spectral and spatial resolution acquired with sensors attached to UAVs, we present the use of a time series of images to classify four tree species. The study area is an Atlantic Forest area located in the western part of São Paulo State. Images were acquired in August 2015 and August 2016, generating three data sets: the image spectra of 2015 only; the image spectra of 2016 only; and the layer stacking of the images from 2015 and 2016. Four tree species were classified using the spectral angle mapper (SAM), spectral information divergence (SID) and Random Forest (RF). The results showed that SAM and SID overfit the data, whereas RF performed better, and the use of the layer stacking improved the classification, achieving a kappa coefficient of 18.26 %.
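Of the three classifiers above, the spectral angle mapper is simple enough to sketch directly: it assigns a pixel to the species whose reference spectrum lies at the smallest angle, which makes it invariant to overall illumination scaling. The dictionary-based interface below is an illustrative assumption.

```python
import math

def spectral_angle(a, b):
    """Angle between two spectra; invariant to illumination scaling."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def sam_classify(pixel, references):
    """Assign the class whose reference spectrum is at the smallest angle."""
    return min(references, key=lambda name: spectral_angle(pixel, references[name]))
```

Because only the angle matters, a pixel twice as bright as a reference spectrum still matches it exactly.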
Large Scale Crop Mapping in Ukraine Using Google Earth Engine
NASA Astrophysics Data System (ADS)
Shelestov, A.; Lavreniuk, M. S.; Kussul, N.
2016-12-01
There are no globally available high resolution satellite-derived crop-specific maps at present. Only coarse-resolution imagery (> 250 m spatial resolution) has been utilized to derive global cropland extent. In 2016 we are going to carry out a country-level demonstration of Sentinel-2 use for crop classification in Ukraine within the ESA Sen2-Agri project. But optical imagery can be contaminated by cloud cover, which makes it difficult to acquire imagery in an optimal time range to discriminate certain crops. Thanks to the Copernicus program, a lot of high spatial resolution Sentinel-1 SAR data has been freely available for Ukraine since 2015, allowing us to use time series of SAR data for crop classification. Our experiment for one administrative region in 2015 showed much higher crop classification accuracy with SAR data than with optical-only time series [1, 2]. Therefore, in 2016, within the Google Earth Engine Research Award, we use SAR data together with optical data for large-area crop mapping (the entire territory of Ukraine) using the cloud computing capabilities of Google Earth Engine (GEE). This study compares different classification methods for crop mapping over the whole territory of Ukraine using data and algorithms from GEE. Classification performance is assessed using overall classification accuracy, Kappa coefficients, and user's and producer's accuracies. Crop areas from the derived classification maps are also compared to the official statistics [3].
[1] S. Skakun et al., "Efficiency assessment of multitemporal C-band Radarsat-2 intensity and Landsat-8 surface reflectance satellite imagery for crop classification in Ukraine," IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2015, DOI: 10.1109/JSTARS.2015.2454297.
[2] N. Kussul, S. Skakun, A. Shelestov, O. Kussul, "The use of satellite SAR imagery to crop classification in Ukraine within JECAM project," IEEE International Geoscience and Remote Sensing Symposium (IGARSS), pp. 1497-1500, 13-18 July 2014, Quebec City, Canada.
[3] F. J. Gallego, N. Kussul, S. Skakun, O. Kravchenko, A. Shelestov, O. Kussul, "Efficiency assessment of using satellite data for crop area estimation in Ukraine," International Journal of Applied Earth Observation and Geoinformation, vol. 29, pp. 22-30, 2014.
2016-01-01
Moderate Resolution Imaging Spectroradiometer (MODIS) data forms the basis for numerous land use and land cover (LULC) mapping and analysis frameworks at regional scale. Compared to other satellite sensors, the spatial, temporal and spectral specifications of MODIS are considered highly suitable for LULC classifications that support many different aspects of social, environmental and developmental research. The LULC mapping of this study was carried out in the context of the development of an evaluation approach for Zimbabwe's land reform program. Within the discourse about the success of this program, a lack of spatially explicit methods to produce objective data, such as on the extent of agricultural area, is apparent. We therefore assessed the suitability of moderate spatial and high temporal resolution imagery and phenological parameters to retrieve regional figures on the extent of cropland area in former freehold tenure over a series of 13 years (2001–2013). Time-series data was processed with TIMESAT and stratified according to the agro-ecological potential zoning of Zimbabwe. Random Forest (RF) classifications were used to produce annual binary crop/non-crop maps, which were evaluated with high spatial resolution data from other satellite sensors. We assessed the cropland products in former freehold tenure in terms of classification accuracy, inter-annual comparability and heterogeneity. Although general LULC patterns were depicted in the classification results and an overall accuracy of over 80% was achieved, user accuracies for rainfed agriculture remained below 65%. We conclude that phenological analysis has to be treated with caution when rainfed agriculture and grassland in semi-humid tropical regions must be separated on the basis of MODIS spectral data and phenological parameters.
Because the classification results significantly underestimate redistributed commercial farmland in Zimbabwe, we argue that the method cannot be used to produce spatial information on land use that could be linked to tenure change. Hence the capabilities of moderate resolution data to assess Zimbabwe's land reform are limited. To make use of the unquestionable potential of MODIS time-series analysis, we propose an analysis of plant productivity that allows annual growth and production of vegetation to be linked to ownership after Zimbabwe's land reform. PMID:27253327
A new complexity measure for time series analysis and classification
NASA Astrophysics Data System (ADS)
Nagaraj, Nithin; Balasubramanian, Karthi; Dey, Sutirth
2013-07-01
Complexity measures are used in a number of applications, including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition and authorship attribution. Different complexity measures proposed in the literature, like Shannon entropy, relative entropy, Lempel-Ziv, Kolmogorov and algorithmic complexity, are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure, ETC, defined as the "Effort To Compress" the input sequence with a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence into a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to correlate better with the Lyapunov exponent than Shannon entropy does, even for relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by Gzip).
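The ETC definition above can be sketched directly: repeatedly replace the most frequent adjacent pair of symbols with a fresh symbol and count the passes until the sequence becomes constant. Tie-breaking and the handling of overlapping pairs are simplified here relative to the full NSRPS algorithm, so treat this as an illustrative sketch.

```python
from collections import Counter

def etc(seq):
    """Effort To Compress: number of NSRPS passes needed to turn `seq`
    (a list of ints) into a constant sequence. Each pass replaces the
    most frequent adjacent pair with a fresh symbol (ties: first seen)."""
    seq = list(seq)
    steps = 0
    while len(seq) > 1 and len(set(seq)) > 1:
        target = Counter(zip(seq, seq[1:])).most_common(1)[0][0]
        new_sym, out, i = max(seq) + 1, [], 0
        while i < len(seq):
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == target:
                out.append(new_sym)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
        steps += 1
    return steps
```

Already-constant sequences cost 0 passes; more structured sequences compress in few passes, while irregular ones need more, which is what makes the count a complexity measure.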
Strath, Scott J; Kate, Rohit J; Keenan, Kevin G; Welch, Whitney A; Swartz, Ann M
2016-01-01
To develop and test time series single-site and multi-site placement models, we used processed accelerometer data from the wrist, hip and ankle to estimate energy cost and type of physical activity in adults. Ninety-nine subjects in three age groups (18–39, 40–64, 65+ years) performed 11 activities while wearing three triaxial accelerometers: one each on the non-dominant wrist, hip, and ankle. During each activity net oxygen cost (METs) was assessed. The time series of accelerometer signals were represented in terms of uniformly discretized values called bins. A Support Vector Machine was used for activity classification, with bins and every pair of bins used as features. Bagged decision tree regression was used for net metabolic cost prediction. To evaluate model performance we employed the jackknife leave-one-out cross-validation method. Single-accelerometer and multi-accelerometer site model estimates across and within age groups revealed similar accuracy, with a bias range of −0.03 to 0.01 METs, a bias percentage of −0.8 to 0.3%, and an rMSE range of 0.81–1.04 METs. Multi-site accelerometer location models improved activity type classification over single-site location models from a low of 69.3% to a maximum of 92.8% accuracy. For each accelerometer site location model, or combined site location model, percent classification accuracy decreased as a function of age group, or when models for young age groups were generalized to older age groups. Age-group-specific models on average performed better than models combining all age groups. The time series approach shows promising results for predicting energy cost and activity type. Differences in prediction across age groups, the lack of generalizability across age groups, and the better performance of age-group-specific models need to be considered as analytic calibration procedures to detect energy cost and type are further developed. PMID:26449155
Generalizing DTW to the multi-dimensional case requires an adaptive approach
Hu, Bing; Jin, Hongxia; Wang, Jun; Keogh, Eamonn
2017-01-01
In recent years Dynamic Time Warping (DTW) has emerged as the distance measure of choice for virtually all time series data mining applications. For example, virtually all applications that process data from wearable devices use DTW as a core sub-routine. This is the result of significant progress in improving DTW’s efficiency, together with multiple empirical studies showing that DTW-based classifiers at least equal (and generally surpass) the accuracy of all their rivals across dozens of datasets. Thus far, most of the research has considered only the one-dimensional case, with practitioners generalizing to the multi-dimensional case in one of two ways, dependent or independent warping. In general, it appears the community believes either that the two ways are equivalent, or that the choice is irrelevant. In this work, we show that this is not the case. The two most commonly used multi-dimensional DTW methods can produce different classifications, and neither one dominates over the other. This seems to suggest that one should learn the best method for a particular application. However, we will show that this is not necessary; a simple, principled rule can be used on a case-by-case basis to predict which of the two methods we should trust at the time of classification. Our method allows us to ensure that classification results are at least as accurate as the better of the two rival methods, and, in many cases, our method is significantly more accurate. We demonstrate our ideas with the most extensive set of multi-dimensional time series classification experiments ever attempted. PMID:29104448
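The two multi-dimensional generalizations discussed above can be sketched concretely: independent warping (DTW_I) warps each dimension on its own and sums the distances, while dependent warping (DTW_D) forces all dimensions to share one warping path. A minimal sketch with squared-Euclidean local distances:

```python
def dtw(a, b, dist):
    """Classic O(nm) dynamic time warping with a pluggable local distance."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i][j] = dist(a[i - 1], b[j - 1]) + min(
                D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def dtw_independent(A, B):
    """DTW_I: warp each dimension separately and sum the distances."""
    return sum(dtw([p[d] for p in A], [p[d] for p in B],
                   lambda x, y: (x - y) ** 2)
               for d in range(len(A[0])))

def dtw_dependent(A, B):
    """DTW_D: one warping path shared by all dimensions."""
    return dtw(A, B, lambda x, y: sum((u - v) ** 2 for u, v in zip(x, y)))
```

Because the independent version optimizes each dimension's path separately, DTW_I can never exceed DTW_D on the same pair of sequences; which of the two yields the better classifier is exactly the case-by-case question the paper addresses.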
NASA Astrophysics Data System (ADS)
Madokoro, H.; Tsukada, M.; Sato, K.
2013-07-01
This paper presents an unsupervised learning-based object category formation and recognition method for mobile robot vision. Our method has the following features: detection of feature points and description of features using a scale-invariant feature transform (SIFT), selection of target feature points using one-class support vector machines (OC-SVMs), generation of visual words using self-organizing maps (SOMs), formation of labels using adaptive resonance theory 2 (ART-2), and creation and classification of categories on a category map of counter propagation networks (CPNs) for visualizing spatial relations between categories. Classification results on dynamic time-series images, obtained with two robots of different sizes and with different movements, demonstrate that our method can visualize spatial relations between categories while maintaining time-series characteristics. Moreover, we emphasize the effectiveness of our method for category formation under appearance changes of objects.
Accelerometry-based classification of human activities using Markov modeling.
Mannini, Andrea; Sabatini, Angelo Maria
2011-01-01
Accelerometers are a popular choice as body-motion sensors, partly because of their capability to extract information useful for automatically inferring the physical activity of the human subject, besides their role in feeding biomechanical parameter estimators. Automatic classification of human physical activities is highly attractive for pervasive computing systems, where contextual awareness may ease human-machine interaction, and in biomedicine, where wearable sensor systems are proposed for long-term monitoring. This paper is concerned with the machine learning algorithms needed to perform the classification task. Hidden Markov Model (HMM) classifiers are studied by contrasting them with Gaussian Mixture Model (GMM) classifiers. HMMs incorporate the statistical information available on movement dynamics into the classification process, without discarding the time history of previous outcomes as GMMs do. An example of the benefits of the obtained statistical leverage is illustrated and discussed by analyzing two datasets of accelerometer time series.
Recurrent Neural Networks for Multivariate Time Series with Missing Values.
Che, Zhengping; Purushotham, Sanjay; Cho, Kyunghyun; Sontag, David; Liu, Yan
2018-04-17
Multivariate time series data in practical applications, such as health care, geoscience, and biology, are characterized by a variety of missing values. In time series prediction and other related tasks, it has been noted that missing values and their missing patterns are often correlated with the target labels, a.k.a. informative missingness. There is very limited work on exploiting the missing patterns for effective imputation and improving prediction performance. In this paper, we develop novel deep learning models, namely GRU-D, as one of the early attempts. GRU-D is based on the Gated Recurrent Unit (GRU), a state-of-the-art recurrent neural network. It takes two representations of missing patterns, i.e., masking and time interval, and effectively incorporates them into a deep model architecture so that it not only captures the long-term temporal dependencies in time series, but also utilizes the missing patterns to achieve better prediction results. Experiments on time series classification tasks on real-world clinical datasets (MIMIC-III, PhysioNet) and synthetic datasets demonstrate that our models achieve state-of-the-art performance and provide useful insights for better understanding and utilization of missing values in time series analysis.
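The two missingness representations above, and the decay mechanism GRU-D builds on them, can be sketched outside the recurrent network: a 0/1 mask marks which values are observed, the time interval tracks how long ago the last observation was, and a missing input is imputed by decaying from the last observation toward the empirical mean. In GRU-D the decay parameters are learned per feature; they are fixed constants here, and the function names are illustrative.

```python
import math

def masking_and_delta(times, observed):
    """The two missingness representations: a 0/1 mask and the elapsed
    time since the last observed value (simplified per-feature delta)."""
    mask, deltas, last_t = [], [], times[0]
    for t, obs in zip(times, observed):
        mask.append(1 if obs else 0)
        deltas.append(t - last_t)
        if obs:
            last_t = t
    return mask, deltas

def decay_impute(values, mask, deltas, x_mean, w=1.0, b=0.0):
    """GRU-D style input handling: when a value is missing, decay from
    the last observation toward the empirical mean as delta grows.
    In GRU-D, w and b are learned; here they are fixed for illustration."""
    out, x_last = [], x_mean
    for x, m, d in zip(values, mask, deltas):
        gamma = math.exp(-max(0.0, w * d + b))       # decay factor in (0, 1]
        if m:
            out.append(x)
            x_last = x
        else:
            out.append(gamma * x_last + (1.0 - gamma) * x_mean)
    return out
```

Right after an observation the imputation trusts the last value; as the gap grows, it relaxes toward the mean, which is the informative-missingness signal the full model exploits.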
Mei, Jiangyuan; Liu, Meizhu; Wang, Yuan-Fang; Gao, Huijun
2016-06-01
Multivariate time series (MTS) datasets broadly exist in numerous fields, including health care, multimedia, finance, and biometrics. How to classify MTS accurately has become a hot research topic, since it is an important element in many computer vision and pattern recognition applications. In this paper, we propose a Mahalanobis distance-based dynamic time warping (DTW) measure for MTS classification. The Mahalanobis distance builds an accurate relationship between each variable and its corresponding category, and is utilized to calculate the local distance between vectors in MTS. We then use DTW to align those MTS that are out of synchronization or of different lengths. After that, how to learn an accurate Mahalanobis distance function becomes another key problem. This paper establishes a LogDet divergence-based metric learning model with triplet constraints, which can learn the Mahalanobis matrix with high precision and robustness. Furthermore, the proposed method is applied to nine MTS datasets selected from the University of California, Irvine machine learning repository and Robert T. Olszewski's homepage, and the results demonstrate the improved performance of the proposed approach.
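The core combination above — a learned Mahalanobis metric used as the local distance inside the DTW recursion — can be sketched as follows, taking the metric-learning step (the LogDet divergence model with triplet constraints) as given and assuming the positive semi-definite matrix M has already been learned:

```python
def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance (x - y)^T M (x - y) for a learned
    positive semi-definite matrix M."""
    d = [a - b for a, b in zip(x, y)]
    return sum(d[i] * M[i][j] * d[j]
               for i in range(len(d)) for j in range(len(d)))

def dtw_mahalanobis(A, B, M):
    """DTW over vector sequences with the learned metric as the local
    distance (M = identity recovers squared-Euclidean DTW)."""
    INF = float("inf")
    n, m = len(A), len(B)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i][j] = mahalanobis_sq(A[i - 1], B[j - 1], M) + min(
                D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]
```

Scaling a diagonal entry of M up makes the corresponding variable count more in the alignment, which is exactly the per-variable relevance the learned metric encodes.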
The Soil Series in Soil Classifications of the United States
NASA Astrophysics Data System (ADS)
Indorante, Samuel; Beaudette, Dylan; Brevik, Eric C.
2014-05-01
Organized national soil survey began in the United States in 1899, with soil types as the units being mapped. The soil series concept was introduced into the U.S. soil survey in 1903 as a way to relate soils being mapped in one area to the soils of other areas. The original concept of a soil series was all soil types formed in the same parent materials that were of the same geologic age. However, within about 15 years soil series became the primary units being mapped in U.S. soil survey. Soil types became subdivisions of soil series, with the subdivisions based on changes in texture. As the soil series became the primary mapping unit the concept of what a soil series was also changed. Instead of being based on parent materials and geologic age, the soil series of the 1920s was based on the morphology and composition of the soil profile. Another major change in the concept of soil series occurred when U.S. Soil Taxonomy was released in 1975. Under Soil Taxonomy, the soil series subdivisions were based on the uses the soils might be put to, particularly their agricultural uses (Simonson, 1997). While the concept of the soil series has changed over the years, the term soil series has been the longest-lived term in U.S. soil classification. It has appeared in every official classification system used by the U.S. soil survey (Brevik and Hartemink, 2013). The first classification system was put together by Milton Whitney in 1909 and had soil series at its second lowest level, with soil type at the lowest level. The second classification system used by the U.S. soil survey was developed by C.F. Marbut, H.H. Bennett, J.E. Lapham, and M.H. Lapham in 1913. It had soil series at the second highest level, with soil classes and soil types at more detailed levels. This was followed by another system in 1938 developed by M. Baldwin, C.E. Kellogg, and J. Thorp. In this system soil series were again at the second lowest level with soil types at the lowest level. 
The soil type concept was dropped and replaced by the soil phase in the 1950s in a modification of the 1938 Baldwin et al. classification (Simonson, 1997). When Soil Taxonomy was released in 1975, soil series became the most detailed (lowest) level of the classification system, and the only term maintained throughout all U.S. classifications to date. While the number of recognized soil series has increased steadily throughout the history of U.S. soil survey, there was a rapid increase in the recognition of new soil series following the introduction of Soil Taxonomy (Brevik and Hartemink, 2013). References: Brevik, E.C., and A.E. Hartemink. 2013. Soil maps of the United States of America. Soil Science Society of America Journal 77:1117-1132. doi:10.2136/sssaj2012.0390. Simonson, R.W. 1997. Evolution of soil series and type concepts in the United States. Advances in Geoecology 29:79-108.
A Web-Based Framework For a Time-Domain Warehouse
NASA Astrophysics Data System (ADS)
Brewer, J. M.; Bloom, J. S.; Kennedy, R.; Starr, D. L.
2009-09-01
The Berkeley Transients Classification Pipeline (TCP) uses a machine-learning classifier to automatically categorize transients from large data torrents and provide automated notification of astronomical events of scientific interest. As part of the training process, we created a large warehouse of light-curve sources with well-labelled classes that serve as priors to the classification engine. This web-based interactive framework, which we are now making public via DotAstro.org (http://dotastro.org/), allows us to ingest time-variable source data in a wide variety of formats and store it in a common internal data model. Data is passed between pipeline modules in a prototype XML representation of time-series format (VOTimeseries), which can also be emitted to collaborators through dotastro.org. After import, the sources can be visualized using Google Sky, light curves can be inspected interactively, and classifications can be manually adjusted.
A subject-independent pattern-based Brain-Computer Interface
Ray, Andreas M.; Sitaram, Ranganatha; Rana, Mohit; Pasqualotto, Emanuele; Buyukturkoglu, Korhan; Guan, Cuntai; Ang, Kai-Keng; Tejos, Cristián; Zamorano, Francisco; Aboitiz, Francisco; Birbaumer, Niels; Ruiz, Sergio
2015-01-01
While earlier Brain-Computer Interface (BCI) studies have mostly focused on modulating specific brain regions or signals, new developments in pattern classification of brain states are enabling real-time decoding and modulation of an entire functional network. The present study proposes a new method for real-time pattern classification and neurofeedback of brain states from electroencephalographic (EEG) signals. It involves the creation of a fused classification model based on the method of Common Spatial Patterns (CSPs) from data of several healthy individuals. The subject-independent model is then used to classify EEG data in real-time and provide feedback to new individuals. In a series of offline experiments involving training and testing of the classifier with individual data from 27 healthy subjects, a mean classification accuracy of 75.30% was achieved, demonstrating that the classification system at hand can reliably decode two types of imagery used in our experiments, i.e., happy emotional imagery and motor imagery. In a subsequent experiment it is shown that the classifier can be used to provide neurofeedback to new subjects, and that these subjects learn to “match” their brain pattern to that of the fused classification model in a few days of neurofeedback training. This finding can have important implications for future studies on neurofeedback and its clinical applications on neuropsychiatric disorders. PMID:26539089
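The fused model above is built from Common Spatial Patterns. As a rough illustration of the CSP technique itself (a sketch, not the authors' implementation; all variable names are hypothetical), spatial filters can be obtained from the two class covariance matrices via a generalized eigenvalue problem, and trials are then summarized by log-variance features:

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_filters=2):
    """Common Spatial Pattern filters from two sets of (channels x samples)
    trials: eigenvectors of the generalized problem Ca w = lambda (Ca+Cb) w,
    taken from both ends of the eigenvalue spectrum."""
    cov = lambda trials: np.mean(
        [t @ t.T / np.trace(t @ t.T) for t in trials], axis=0)
    ca, cb = cov(trials_a), cov(trials_b)
    eigvals, eigvecs = eigh(ca, ca + cb)       # ascending eigenvalues
    w = eigvecs[:, np.argsort(eigvals)[::-1]]  # sort descending
    picks = list(range(n_filters)) + list(range(-n_filters, 0))
    return w[:, picks]

def csp_features(trial, w):
    """Log of the relative variances of the spatially filtered trial."""
    z = w.T @ trial
    var = np.var(z, axis=1)
    return np.log(var / var.sum())
```

A classifier (e.g., LDA or SVM) would then be trained on these features; for a subject-independent model, the covariance averaging can simply pool trials from several individuals.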
Sequential visibility-graph motifs
NASA Astrophysics Data System (ADS)
Iacovacci, Jacopo; Lacasa, Lucas
2016-04-01
Visibility algorithms transform time series into graphs and encode dynamical information in their topology, paving the way for graph-theoretical time series analysis as well as building a bridge between nonlinear dynamics and network science. In this work we introduce and study the concept of sequential visibility-graph motifs, smaller substructures of n consecutive nodes that appear with characteristic frequencies. We develop a theory to compute in an exact way the motif profiles associated with general classes of deterministic and stochastic dynamics. We find that this simple property is indeed a highly informative and computationally efficient feature capable of distinguishing among different dynamics and robust against noise contamination. We finally confirm that it can be used in practice to perform unsupervised learning, by extracting motif profiles from experimental heart-rate series and being able, accordingly, to disentangle meditative from other relaxation states. Applications of this general theory include the automatic classification and description of physical, biological, and financial time series.
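A minimal sketch of the motif idea, assuming the natural visibility criterion: slide a window of n consecutive nodes along the series and record which non-adjacent pairs see each other (consecutive nodes are always mutually visible). The normalization and motif labels here are illustrative choices, not the paper's exact convention:

```python
import numpy as np
from collections import Counter

def visible(y, i, j):
    """Natural visibility between samples i < j: every intermediate
    sample must lie strictly below the chord joining (i, y_i), (j, y_j)."""
    return all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
               for k in range(i + 1, j))

def motif_profile(y, size=4):
    """Frequency profile of sequential visibility-graph motifs: for each
    window of `size` consecutive nodes, the motif is the tuple of
    visibility outcomes over the non-adjacent node pairs."""
    counts = Counter()
    windows = 0
    for start in range(len(y) - size + 1):
        pattern = tuple(
            visible(y, start + a, start + b)
            for a in range(size) for b in range(a + 2, size))
        counts[pattern] += 1
        windows += 1
    return {p: c / windows for p, c in counts.items()}
```

The resulting profile vector can be fed to any standard clustering or classification routine, which is how the unsupervised separation of heart-rate regimes described above would proceed.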
George R. Hoffman; Robert R. Alexander
1987-01-01
A vegetation classification based on concepts and methods developed by Daubenmire was used to identify 12 forest habitat types and one shrub habitat type in the Black Hills. Included were two habitat types in the Quercus macrocarpa series, seven in the Pinus ponderosa series, one in the Populus tremuloides series, two in the Picea glaucci series, and one in the...
Exploring Children's Thinking. Part 1: The Development of Classification (Preschool - Third Grade).
ERIC Educational Resources Information Center
Alward, Keith R.
This unit of the Flexible Learning System (FLS), the first of a 3-volume series on children's thinking, discusses the development of classification in children between 3 and 8 years of age. The series is based on the application of Jean Piaget's work to early childhood education. The development of classification is revealed in the way children…
Wulsin, D. F.; Gupta, J. R.; Mani, R.; Blanco, J. A.; Litt, B.
2011-01-01
Clinical electroencephalography (EEG) records vast amounts of complex human data yet is still reviewed primarily by human readers. Deep Belief Nets (DBNs) are a relatively new type of multi-layer neural network commonly tested on two-dimensional image data, but are rarely applied to time-series data such as EEG. We apply DBNs in a semi-supervised paradigm to model EEG waveforms for classification and anomaly detection. DBN performance was comparable to standard classifiers on our EEG dataset, and classification time was found to be 1.7 to 103.7 times faster than the other high-performing classifiers. We demonstrate how the unsupervised step of DBN learning produces an autoencoder that can naturally be used in anomaly measurement. We compare the use of raw, unprocessed data—a rarity in automated physiological waveform analysis—to hand-chosen features and find that raw data produces comparable classification and better anomaly measurement performance. These results indicate that DBNs and raw data inputs may be more effective for online automated EEG waveform recognition than other common techniques. PMID:21525569
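DBN pretraining is too involved for a short sketch, but the anomaly-measurement idea (score waveforms by the reconstruction error of an unsupervised encoder) can be illustrated with PCA standing in as a linear autoencoder. This substitution is an assumption for brevity, not the authors' network:

```python
import numpy as np
from sklearn.decomposition import PCA

def fit_scorer(windows, n_components=4):
    """Fit a linear 'autoencoder' (PCA) on normal waveform windows and
    return a function that scores new windows by their mean squared
    reconstruction error: larger score = more anomalous."""
    pca = PCA(n_components=n_components).fit(windows)

    def score(w):
        recon = pca.inverse_transform(pca.transform(w.reshape(1, -1)))
        return float(np.mean((w - recon) ** 2))

    return score
```

A DBN autoencoder replaces the linear projection with stacked nonlinear layers, but the anomaly score is computed the same way.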
Marinelli, A; Guerra, E; Rotini, R
2016-12-01
In recent years, biomechanical knowledge about the role of the elbow stabilizers has improved considerably. In particular, the complex interactions among the different stabilizers when injured at the same time have become better understood. Nevertheless, uncertainties about both nomenclature and classification still exist in the definition of the different patterns of instability. The authors examine the literature of the last 130 years on elbow instability classification, analyzing the intuitions and the value of each proposal. Because of the lack of a satisfactory classification, a working group was created in 2015 within SICSeG (Italian Society of Shoulder and Elbow Surgery) with the aim of defining an exhaustive classification that is as simple, complete and reproducible as possible. A new all-inclusive elbow instability classification is proposed. This classification considers two main parameters: timing (acute and chronic forms) and involved stabilizers (simple and complex forms), and four secondary parameters: etiology (traumatic, rheumatic, congenital…), the involved joint (radius and ulna as a single unit articulating with the humerus, or the proximal radio-ulnar joint), the degree of displacement (dislocation or subluxation) and the mechanism of instability or dislocation (PLRI, PMRI, direct axial loading, pure varus or valgus stress). The classification is at once complete enough to include all the instability patterns and practical enough to be used effectively in clinical practice. It can help in defining a shared language, improve our understanding of the disorder, reduce misdiagnosis and improve comparison among different case series.
Statistical fingerprinting for malware detection and classification
Prowell, Stacy J.; Rathgeb, Christopher T.
2015-09-15
A system detects malware in a computing architecture with an unknown pedigree. The system includes a first computing device having a known pedigree and operating free of malware. The first computing device executes a series of instrumented functions that, when executed, provide a statistical baseline representative of the time a known software application takes to run on a device with a known pedigree. A second computing device executes a second series of instrumented functions that, when executed, provide an actual time representative of the time the known software application takes to run on the second computing device. The system detects malware when there is a difference in execution times between the first and the second computing devices.
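A toy sketch of the timing-fingerprint idea (the function names and the three-sigma threshold are hypothetical choices; the patented system is far more elaborate):

```python
import time
import statistics

def timing_baseline(func, runs=50):
    """Execute an instrumented function repeatedly on a clean machine and
    record the mean and standard deviation of its execution time, i.e. the
    statistical fingerprint."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        func()
        samples.append(time.perf_counter() - start)
    return statistics.mean(samples), statistics.stdev(samples)

def deviates(baseline, observed_mean, sigmas=3.0):
    """Flag a possible compromise when the observed mean execution time on
    the suspect machine falls outside `sigmas` standard deviations of the
    clean baseline."""
    mean, stdev = baseline
    return abs(observed_mean - mean) > sigmas * stdev
```

In practice a whole battery of instrumented functions would be timed, and hardware differences between the two devices would have to be calibrated out first.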
NASA Astrophysics Data System (ADS)
Zhu, Zhe
2017-08-01
The provision of free and open access to all archived Landsat images in 2008 completely changed the way Landsat data are used. Many novel change detection algorithms based on Landsat time series have been developed. We present a comprehensive review of four important aspects of change detection studies based on Landsat time series: frequencies, preprocessing, algorithms, and applications. We observed the trend that the more recent the study, the higher the frequency of Landsat time series used. We reviewed a series of image preprocessing steps, including atmospheric correction, cloud and cloud shadow detection, and composite/fusion/metrics techniques. We divided all change detection algorithms into six categories: thresholding, differencing, segmentation, trajectory classification, statistical boundary, and regression. Within each category, six major characteristics of the different algorithms, such as frequency, change index, univariate/multivariate, online/offline, abrupt/gradual change, and sub-pixel/pixel/spatial, were analyzed. Moreover, some of the widely used change detection algorithms were also discussed. Finally, we reviewed different change detection applications by dividing them into two categories: change target and change agent detection.
Comparison of Sub-Pixel Classification Approaches for Crop-Specific Mapping
This paper examined two non-linear models, Multilayer Perceptron (MLP) regression and Regression Tree (RT), for estimating sub-pixel crop proportions using time-series MODIS-NDVI data. The sub-pixel proportions were estimated for three major crop types including corn, soybean, a...
The Pacific Northwest Hydrologic Landscapes (PNW HL) at the assessment unit scale has provided a solid conceptual classification framework to relate and transfer hydrologically meaningful information between watersheds without access to streamflow time series. A collection of tec...
Milroy, B C; Sackelariou, R P; Lendvay, P G; Baldwin, M R; McGlynn, M
1991-01-01
This paper describes a simple method of classification and evaluation of the functional results of replanted and revascularized parts in the hand. The results are presented in graphic form and have been analyzed to correlate various factors: injured part, cause, and zone (level) of injury. The type of injury, ischemic time and age have been studied in more detail to determine their influence on the final functional result. The series contains 187 amputated and devascularized parts of the hand in 119 patients who underwent surgery at the Prince of Wales Hospital from 1984 through 1988. The length of cold or warm ischemic time, up to 16 hours in this series, while not affecting survival of the amputated part, does adversely affect the functional result. The survival rate of replanted parts in children was significantly less favorable than in adults, but the functional results were uniformly superior.
Bernard L. Kovalchik; Rodrick R. Clausnitzer
2004-01-01
This is a classification of aquatic, wetland, and riparian series and plant associations found within the Colville, Okanogan, and Wenatchee National Forests. It is based on the potential vegetation occurring on lake and pond margins, wetland fens and bogs, and fluvial surfaces along streams and rivers within Forest Service lands. Data used in the classification were...
Rai, Shesh N; Trainor, Patrick J; Khosravi, Farhad; Kloecker, Goetz; Panchapakesan, Balaji
2016-01-01
The development of biosensors that produce time series data will facilitate improvements in biomedical diagnostics and in personalized medicine. The time series produced by these devices often contain characteristic features arising from biochemical interactions between the sample and the sensor. To use such characteristic features for determining sample class, similarity-based classifiers can be utilized. However, the construction of such classifiers is complicated by the variability in the time domains of such series, which renders traditional distance metrics such as Euclidean distance ineffective in distinguishing between biological variance and time domain variance. The dynamic time warping (DTW) algorithm is a sequence alignment algorithm that can be used to align two or more series to facilitate quantifying similarity. In this article, we evaluated the performance of DTW distance-based similarity classifiers for classifying time series that mimic electrical signals produced by nanotube biosensors. Simulation studies demonstrated the positive performance of such classifiers in discriminating between time series containing characteristic features that are obscured by noise in the intensity and time domains. We then applied a DTW distance-based k-nearest neighbors classifier to distinguish the presence/absence of a mesenchymal biomarker in cancer cells in buffy coats in a blinded test. Using a train-test approach, we found that the classifier had high sensitivity (90.9%) and specificity (81.8%) in differentiating between EpCAM-positive MCF7 cells spiked into buffy coats and those in plain buffy coats.
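The two ingredients named above, the DTW dynamic program and a k-nearest-neighbors vote, can be sketched compactly; the toy series below are hypothetical stand-ins for sensor signals:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D series via the
    standard O(len(a) * len(b)) dynamic program with |x - y| local cost."""
    n, m = len(a), len(b)
    d = np.full((n + 1, m + 1), np.inf)
    d[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i, j] = cost + min(d[i - 1, j], d[i, j - 1], d[i - 1, j - 1])
    return d[n, m]

def knn_classify(query, train_series, train_labels, k=1):
    """Majority vote among the k training series nearest under DTW."""
    dists = [dtw_distance(query, s) for s in train_series]
    nearest = np.argsort(dists)[:k]
    labels = [train_labels[i] for i in nearest]
    return max(set(labels), key=labels.count)
```

Because DTW warps the time axis, a peak that arrives early or late still matches a training series with the same shape, which is exactly what makes it preferable to Euclidean distance here.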
Characterizing time series via complexity-entropy curves
NASA Astrophysics Data System (ADS)
Ribeiro, Haroldo V.; Jauregui, Max; Zunino, Luciano; Lenzi, Ervin K.
2017-06-01
The search for patterns in time series is a very common task when dealing with complex systems. This is usually accomplished by employing a complexity measure such as entropies and fractal dimensions. However, such measures usually only capture a single aspect of the system dynamics. Here, we propose a family of complexity measures for time series based on a generalization of the complexity-entropy causality plane. By replacing the Shannon entropy by a monoparametric entropy (Tsallis q-entropy) and after considering the proper generalization of the statistical complexity (q-complexity), we build up a parametric curve (the q-complexity-entropy curve) that is used for characterizing and classifying time series. Based on simple exact results and numerical simulations of stochastic processes, we show that these curves can distinguish among different long-range, short-range, and oscillating correlated behaviors. Also, we verify that simulated chaotic and stochastic time series can be distinguished based on whether these curves are open or closed. We further test this technique in experimental scenarios related to chaotic laser intensity, stock price, sunspot, and geomagnetic dynamics, confirming its usefulness. Finally, we prove that these curves enhance the automatic classification of time series with long-range correlations and interbeat intervals of healthy subjects and patients with heart disease.
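The entropy axis of the causality plane is conventionally computed from Bandt-Pompe ordinal patterns. Leaving the Tsallis generalization aside, a minimal normalized permutation entropy looks like this (a sketch under that assumption, not the authors' code):

```python
import math
from itertools import permutations

def permutation_entropy(series, order=3):
    """Normalized Bandt-Pompe permutation entropy: Shannon entropy of the
    ordinal-pattern frequencies, divided by log(order!) so the result lies
    in [0, 1]."""
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] += 1
    total = sum(counts.values())
    probs = [c / total for c in counts.values() if c > 0]
    return (-sum(p * math.log(p) for p in probs)
            / math.log(math.factorial(order)))
```

The q-complexity-entropy curve is then traced by sweeping the Tsallis parameter q in the generalized entropy and complexity instead of the Shannon form used here.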
NASA Astrophysics Data System (ADS)
Ma, Weiwei; Gong, Cailan; Hu, Yong; Li, Long; Meng, Peng
2015-10-01
Remote sensing technology has been broadly recognized for its convenience and efficiency in mapping vegetation, particularly in high-altitude and inaccessible areas where in-situ observations are lacking. In this study, Landsat Thematic Mapper (TM) images and images from the CCD sensor of the Chinese environmental satellite HJ-1 (HJ-1 CCD), both at 30 m spatial resolution, were employed to identify and monitor vegetation types in an area of Western China, the Qinghai Lake Watershed (QHLW). A decision classification tree (DCT) algorithm using multiple characteristics, including seasonal TM/HJ-1 CCD time series data combined with a digital elevation model (DEM) dataset, and a supervised maximum likelihood classification (MLC) algorithm using a single-date TM image were applied for vegetation classification. Accuracy of the two algorithms was assessed using field observation data. Based on the produced vegetation classification maps, the DCT using multi-season data and geomorphologic parameters was superior to the MLC algorithm using a single-date image, improving the overall accuracy by 11.86% at the second class level and significantly reducing the "salt and pepper" noise. The DCT algorithm applied to TM/HJ-1 CCD time series data and geomorphologic parameters appeared to be a valuable and reliable tool for monitoring vegetation at the first class level (5 vegetation classes) and the second class level (8 vegetation subclasses). The DCT algorithm using multiple characteristics might provide a theoretical basis and general approach to automatic extraction of vegetation types from remote sensing imagery over plateau areas.
NASA Astrophysics Data System (ADS)
Löw, Fabian; Schorcht, Gunther; Michel, Ulrich; Dech, Stefan; Conrad, Christopher
2012-10-01
Accurate crop identification and crop area estimation are important for studies on irrigated agricultural systems, yield and water demand modeling, and agrarian policy development. In this study a novel combination of Random Forest (RF) and Support Vector Machine (SVM) classifiers is presented that (i) enhances crop classification accuracy and (ii) provides spatial information on map uncertainty. The methodology was implemented over four distinct irrigated sites in Middle Asia using RapidEye time series data. The RF feature importance statistic was used as a feature-selection strategy for the SVM, to assess possible negative effects on classification accuracy caused by an oversized feature space. The results of the individual RF and SVM classifications were combined with rules based on posterior classification probability and estimates of classification probability entropy. SVM classification performance was increased by feature selection through RF. Further experimental results indicate that the hybrid classifier improves overall classification accuracy as well as user's and producer's accuracy in comparison to the single classifiers.
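The RF-importance-then-SVM pipeline can be sketched with scikit-learn on synthetic data; the feature matrix below and the cut-off of 15 features are hypothetical stand-ins for the RapidEye time-series features:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Hypothetical stand-in for a multi-temporal feature matrix
X, y = make_classification(n_samples=400, n_features=60, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# 1) Rank features with Random Forest importances
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
top = np.argsort(rf.feature_importances_)[::-1][:15]

# 2) Train the SVM on the reduced feature space; probability=True exposes
#    the posterior probabilities that the combination rules operate on
svm = SVC(probability=True, random_state=0).fit(X_tr[:, top], y_tr)
proba = svm.predict_proba(X_te[:, top])
accuracy = svm.score(X_te[:, top], y_te)
```

The paper's combination rules would then compare these SVM posteriors with the RF class probabilities and an entropy estimate per pixel.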
Code of Federal Regulations, 2012 CFR
2012-07-01
... classification unit of a manufacturer's product line and is comprised of all vehicle designs, models or series... surface. (3) Competition motorcycle means any motorcycle designed and marketed solely for use in closed.... (7) Acoustical Assurance Period (AAP) means a specified period of time or miles driven after sale to...
Code of Federal Regulations, 2013 CFR
2013-07-01
... classification unit of a manufacturer's product line and is comprised of all vehicle designs, models or series... surface. (3) Competition motorcycle means any motorcycle designed and marketed solely for use in closed.... (7) Acoustical Assurance Period (AAP) means a specified period of time or miles driven after sale to...
Code of Federal Regulations, 2011 CFR
2011-07-01
... classification unit of a manufacturer's product line and is comprised of all vehicle designs, models or series... surface. (3) Competition motorcycle means any motorcycle designed and marketed solely for use in closed.... (7) Acoustical Assurance Period (AAP) means a specified period of time or miles driven after sale to...
Rana, Mohit; Prasad, Vinod A.; Guan, Cuntai; Birbaumer, Niels; Sitaram, Ranganatha
2016-01-01
Recently, studies have reported the use of Near Infrared Spectroscopy (NIRS) for developing Brain–Computer Interface (BCI) by applying online pattern classification of brain states from subject-specific fNIRS signals. The purpose of the present study was to develop and test a real-time method for subject-specific and subject-independent classification of multi-channel fNIRS signals using support-vector machines (SVM), so as to determine its feasibility as an online neurofeedback system. Towards this goal, we used left versus right hand movement execution and movement imagery as study paradigms in a series of experiments. In the first two experiments, activations in the motor cortex during movement execution and movement imagery were used to develop subject-dependent models that obtained high classification accuracies thereby indicating the robustness of our classification method. In the third experiment, a generalized classifier-model was developed from the first two experimental data, which was then applied for subject-independent neurofeedback training. Application of this method in new participants showed mean classification accuracy of 63% for movement imagery tasks and 80% for movement execution tasks. These results, and their corresponding offline analysis reported in this study demonstrate that SVM based real-time subject-independent classification of fNIRS signals is feasible. This method has important applications in the field of hemodynamic BCIs, and neuro-rehabilitation where patients can be trained to learn spatio-temporal patterns of healthy brain activity. PMID:27467528
Hidden discriminative features extraction for supervised high-order time series modeling.
Nguyen, Ngoc Anh Thi; Yang, Hyung-Jeong; Kim, Sunhee
2016-11-01
In this paper, an orthogonal Tucker-decomposition-based extraction of high-order discriminative subspaces from a tensor-based time series data structure is presented, named Tensor Discriminative Feature Extraction (TDFE). TDFE relies on the employment of category information for the maximization of the between-class scatter and the minimization of the within-class scatter to extract optimal hidden discriminative feature subspaces that are simultaneously spanned by every modality for supervised tensor modeling. In this context, the proposed tensor-decomposition method provides the following benefits: i) it reduces dimensionality while robustly mining the underlying discriminative features, ii) it results in effective, interpretable features that lead to improved classification and visualization, and iii) it reduces the processing time during the training stage and the filtering of the projection by solving the generalized eigenvalue problem at each alternating step. Two real third-order tensor structures of time series datasets (an epilepsy electroencephalogram (EEG) modeled as channel × frequency bin × time frame, and a microarray dataset modeled as gene × sample × time) were used for the evaluation of TDFE. The experimental results corroborate the advantages of the proposed method, with average classification accuracies of 98.26% and 89.63% for the epilepsy dataset and the microarray dataset, respectively. These averages represent an improvement over those of matrix-based algorithms and recent tensor-based discriminant-decomposition approaches; this is especially the case considering the small number of samples that are used in practice. Copyright © 2016 Elsevier Ltd. All rights reserved.
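Orthogonal Tucker decomposition underlies TDFE. A minimal unsupervised variant, the truncated higher-order SVD without the discriminative scatter terms, can be written directly in NumPy (a sketch of the decomposition machinery only, not of TDFE):

```python
import numpy as np

def unfold(t, mode):
    """Mode-n unfolding of a tensor into a matrix."""
    return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

def mode_dot(t, m, mode):
    """Multiply tensor t by matrix m along the given mode."""
    folded = m @ unfold(t, mode)
    other = [s for i, s in enumerate(t.shape) if i != mode]
    return np.moveaxis(folded.reshape([m.shape[0]] + other), 0, mode)

def hosvd(t, ranks):
    """Truncated higher-order SVD: one factor matrix per mode from the
    left singular vectors of each unfolding, and the core tensor obtained
    by projecting t onto all of them."""
    factors = []
    for mode, r in enumerate(ranks):
        u, _, _ = np.linalg.svd(unfold(t, mode), full_matrices=False)
        factors.append(u[:, :r])
    core = t
    for mode, u in enumerate(factors):
        core = mode_dot(core, u.T, mode)
    return core, factors
```

TDFE replaces the plain SVD per mode with a generalized eigenvalue problem on between-class and within-class scatter matrices, which is where the category information enters.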
NASA Astrophysics Data System (ADS)
Girardi, P.; Pastres, R.; Gaetan, C.; Mangin, A.; Taji, M. A.
2015-12-01
In this paper, we present the results of a classification of Adriatic waters based on spatial time series of remotely sensed Chlorophyll type-a. The study was carried out using a clustering procedure combining quantile smoothing and an agglomerative clustering algorithm. The smoothing function includes a seasonal term, thus allowing one to classify areas according to “similar” seasonal evolution as well as according to “similar” trends. This methodology, which is here applied for the first time to Ocean Colour data, is more robust than other classical methods, as it does not require any assumption on the probability distribution of the data. This approach was applied to the classification of an eleven-year-long time series, from January 2002 to December 2012, of monthly values of Chlorophyll type-a concentrations covering the whole Adriatic Sea. The data set was made available by ACRI (http://hermes.acri.fr) in the framework of the Glob-Colour Project (http://www.globcolour.info). Data were obtained by calibrating Ocean Colour data provided by different satellite missions, such as MERIS, SeaWiFS and MODIS. The results clearly show the presence of North-South and West-East gradients in the level of Chlorophyll, which is consistent with literature findings. This analysis could provide a sound basis for the identification of “water bodies” and of Chlorophyll type-a thresholds which define their Good Ecological Status, in terms of trophic level, as required by the implementation of the Marine Strategy Framework Directive. The forthcoming availability of Sentinel-3 OLCI data, in continuity with the previous missions, and with the perspective of a monitoring record exceeding 15 years, offers a real opportunity to expand our study in strong support of the implementation of both the EU Marine Strategy Framework Directive and the UNEP-MAP Ecosystem Approach in the Mediterranean.
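As a simplified sketch of the clustering step, the example below applies plain Ward agglomeration to synthetic seasonal curves; the paper's quantile-smoothing stage is omitted, and the two "bloom regimes" are invented for illustration:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
months = np.arange(120)  # ten years of monthly values

# Hypothetical pixels belonging to two seasonal regimes in antiphase
spring = [np.sin(2 * np.pi * months / 12) + rng.normal(0, 0.1, 120)
          for _ in range(10)]
autumn = [np.sin(2 * np.pi * months / 12 + np.pi) + rng.normal(0, 0.1, 120)
          for _ in range(10)]
series = np.array(spring + autumn)

# Agglomerative (Ward) clustering on the seasonal curves, cut at 2 groups
z = linkage(series, method="ward")
labels = fcluster(z, t=2, criterion="maxclust")
```

In the paper, the distance is computed between fitted quantile-smoothed curves rather than the raw monthly values, which is what gives the method its robustness to the data's distribution.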
Analysis of Multispectral Time Series for supporting Forest Management Plans
NASA Astrophysics Data System (ADS)
Simoniello, T.; Carone, M. T.; Costantini, G.; Frattegiani, M.; Lanfredi, M.; Macchiato, M.
2010-05-01
Adequate forest management requires specific plans based on updated and detailed mapping. Multispectral satellite time series have been largely applied to forest monitoring and studies at different scales thanks to their capability of providing synoptic information on some basic parameters descriptive of vegetation distribution and status. As a low-cost tool for supporting forest management plans in an operational context, we tested the use of Landsat-TM/ETM time series (1987-2006) in the high Agri Valley (Southern Italy) for planning field surveys as well as for the integration of existing cartography. As a preliminary activity to make all scenes radiometrically consistent, the no-change regression normalization was applied to the time series; then all the data concerning available forest maps, municipal boundaries, water basins, rivers, and roads were overlapped in a GIS environment. From the 2006 image we elaborated the NDVI map and analyzed the distribution for each land cover class. To separate the physiological variability and identify the anomalous areas, a threshold on the distributions was applied. To label the non-homogeneous areas, a multitemporal analysis was performed by separating heterogeneity due to cover changes from that linked to basilar unit mapping and classification labelling aggregations. A map of priority areas was then produced to support the field survey plan. To analyze the territorial evolution, the historical land cover maps were elaborated by adopting a hybrid classification approach based on a preliminary segmentation, the identification of training areas, and a subsequent maximum likelihood categorization. Such an analysis was fundamental for the general assessment of the territorial dynamics and in particular for the evaluation of the efficacy of past intervention activities.
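The NDVI-and-threshold step can be sketched as follows; the 5th/95th percentile cut-offs are a hypothetical choice, since the abstract does not state the actual threshold:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index per pixel."""
    return (nir - red) / (nir + red + 1e-10)  # epsilon avoids 0/0

def anomalous(ndvi_values, lower_pct=5, upper_pct=95):
    """Flag pixels lying outside the central NDVI range of their land
    cover class, a simple stand-in for the distribution threshold that
    separates physiological variability from anomalous areas."""
    lo, hi = np.percentile(ndvi_values, [lower_pct, upper_pct])
    return (ndvi_values < lo) | (ndvi_values > hi)
```

Applied per land-cover class, the flagged pixels form the candidate set for the priority map guiding the field surveys.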
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-09
.... Above GS-15 Positions B. Classification 1. Occupational Series 2. Classification Standards and Position... Duty Locations Appendix B: Occupational Series by Occupational Family Appendix C: Intervention Model..., MD; Lakehurst, NJ; and Orlando, FL. These facilities support research, development, test, evaluation...
NASA Astrophysics Data System (ADS)
Salmon, B. P.; Kleynhans, W.; Olivier, J. C.; van den Bergh, F.; Wessels, K. J.
2018-05-01
Humans are transforming land cover at an ever-increasing rate. Accurate geographical maps of land cover, especially of rural and urban settlements, are essential to planning sustainable development. Time series extracted from MODerate resolution Imaging Spectroradiometer (MODIS) land surface reflectance products have been used to differentiate land cover classes by analyzing the seasonal patterns in reflectance values. The proper fitting of a parametric model to these time series usually requires several adjustments to the regression method. To reduce the workload, the parameters of the regression method are commonly set globally for a whole geographical area. In this work we have modified a meta-optimization approach so that the regression method is configured on a per-time-series basis. The standard deviation of the model parameters and the magnitude of the residuals are used as the scoring function. We successfully fitted a triply modulated model to the seasonal patterns of our study area using a non-linear extended Kalman filter (EKF). The approach uses temporal information, which significantly reduces the processing time and storage requirements to process each time series. It also derives reliability metrics for each time series individually. The features extracted using the proposed method are classified with a support vector machine, and the performance of the method is compared to the original approach on our ground truth data.
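A minimal EKF for a singly modulated seasonal model, y_t = mu + alpha*cos(omega*t + phi), illustrates the fitting machinery; the authors' triply modulated model carries additional harmonic terms, and all tuning constants below are assumptions:

```python
import numpy as np

def ekf_fit_seasonal(y, omega, q=1e-4, r=0.05):
    """Extended Kalman filter tracking the parameters of a modulated
    seasonal model y_t = mu + alpha*cos(omega*t + phi). The state is
    [mu, alpha, phi] with random-walk dynamics; only the observation
    equation is nonlinear, so it is linearized via its Jacobian H."""
    x = np.array([np.mean(y), np.std(y), 0.0])  # crude initial state
    P = np.eye(3)
    Q, R = q * np.eye(3), r
    for t, obs in enumerate(y):
        P = P + Q                               # predict (identity dynamics)
        mu, alpha, phi = x
        pred = mu + alpha * np.cos(omega * t + phi)
        H = np.array([1.0,
                      np.cos(omega * t + phi),
                      -alpha * np.sin(omega * t + phi)])  # d(pred)/d(state)
        s = H @ P @ H + R                       # innovation variance
        k = P @ H / s                           # Kalman gain
        x = x + k * (obs - pred)                # update state
        P = P - np.outer(k, H @ P)              # update covariance
    return x
```

The per-series reliability metrics mentioned above can then be read off the innovation sequence and the final covariance P, both of which the filter produces for free.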
This research examined sub-pixel land-cover classification performance for tree canopy, impervious surface, and cropland in the Laurentian Great Lakes Basin (GLB) using both timeseries MODIS (MOderate Resolution Imaging Spectroradiometer) NDVI (Normalized Difference Vegetation In...
Goldstein, Benjamin A; Chang, Tara I; Winkelmayer, Wolfgang C
2015-10-01
Electronic Health Records (EHRs) present the opportunity to observe serial measurements on patients. While potentially informative, analyzing these data can be challenging. In this work we present a means to classify individuals based on a series of measurements collected by an EHR. Using patients undergoing hemodialysis, we categorized people based on their intradialytic blood pressure. Our primary criteria were that the classifications were time dependent and independent of other subjects. We fit a curve of intradialytic blood pressure using regression splines and then calculated first and second derivatives to come up with four mutually exclusive classifications at different time points. We show that these classifications relate to near term risk of cardiac events and are moderately stable over a succeeding two-week period. This work has general application for analyzing dense EHR data. Copyright © 2015 Elsevier Inc. All rights reserved.
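The spline-and-derivatives classification can be sketched with SciPy; the four class names and the smoothing setting are hypothetical, since the abstract does not specify them:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def classify_trajectory(times, bp, at_time, smooth=0):
    """Fit a smoothing spline to serial blood-pressure readings and label
    the state at a time point from the signs of the first and second
    derivatives, giving four mutually exclusive classes."""
    spline = UnivariateSpline(times, bp, k=3, s=smooth)
    d1 = float(spline.derivative(1)(at_time))
    d2 = float(spline.derivative(2)(at_time))
    rising = d1 >= 0
    accelerating = d2 >= 0
    return {(True, True): "rising/accelerating",
            (True, False): "rising/decelerating",
            (False, True): "falling/accelerating",
            (False, False): "falling/decelerating"}[(rising, accelerating)]
```

With real intradialytic measurements, a nonzero `s` (or regression splines with fixed knots, as in the paper) would be needed to keep the derivatives from chasing measurement noise.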
Aggregative Learning Method and Its Application for Communication Quality Evaluation
NASA Astrophysics Data System (ADS)
Akhmetov, Dauren F.; Kotaki, Minoru
2007-12-01
In this paper, the so-called Aggregative Learning Method (ALM) is proposed to improve and simplify the learning and classification abilities of different data processing systems. It provides a universal basis for the design and analysis of a wide class of mathematical models. A procedure was developed for time series model reconstruction and analysis in both linear and nonlinear cases. Data approximation accuracy (during the learning phase) and data classification quality (during the recall phase) are estimated from the introduced statistical parameters. The validity and efficiency of the proposed approach have been demonstrated through its application to monitoring of wireless communication quality, namely, for a Fixed Wireless Access (FWA) system. Low memory and computation resources were shown to be needed for the realization of the procedure, especially for the data classification (recall) stage. Characterized by high computational efficiency and a simple decision-making procedure, the derived approaches can be useful for simple and reliable real-time surveillance and control system design.
NASA Astrophysics Data System (ADS)
Knoefel, Patrick; Loew, Fabian; Conrad, Christopher
2015-04-01
Crop maps based on classification of remotely sensed data are receiving increasing attention in agricultural management, which calls for more detailed knowledge about the reliability of such spatial information. However, classification of agricultural land use is often limited by the high spectral similarity of the studied crop types. Moreover, spatially and temporally varying agro-ecological conditions can introduce confusion in crop mapping. Classification errors in crop maps may in turn influence model outputs, such as agricultural production monitoring. One major goal of the PhenoS project ("Phenological structuring to determine optimal acquisition dates for Sentinel-2 data for field crop classification") is the detection of optimal phenological time windows for land cover classification purposes. Since many crop species are spectrally highly similar, accurate classification requires the right selection of satellite images for a certain classification task. In the course of one growing season, phenological phases exist in which crops are separable with higher accuracy. For this purpose, coupling multi-temporal spectral characteristics with phenological events is promising. The focus of this study is the separation of spectrally similar cereal crops such as winter wheat, barley, and rye at two test sites in Germany, "Harz/Central German Lowland" and "Demmin". The study uses object-based random forest (RF) classification to investigate the impact of image acquisition frequency and timing on crop classification uncertainty by permuting all possible combinations of the available RapidEye time series recorded at the test sites between 2010 and 2014. The permutations were applied to different segmentation parameters. Classification uncertainty was then assessed and analysed on a per-field basis using the probabilistic soft output of the RF algorithm, from which entropy was calculated as a spatial measure of classification uncertainty.
The results indicate that uncertainty estimates provide a valuable addition to traditional accuracy assessments and help the user to locate errors in crop maps.
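The entropy-of-soft-output measure can be sketched from any classifier exposing class probabilities; a minimal scikit-learn version on synthetic data (the RapidEye features themselves are not reproduced here):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

def classification_entropy(model, X):
    """Shannon entropy of the per-sample class-membership probabilities;
    higher values indicate higher classification uncertainty."""
    p = np.clip(model.predict_proba(X), 1e-12, 1.0)  # avoid log(0)
    return -(p * np.log(p)).sum(axis=1)

# Synthetic stand-in for per-field feature vectors and crop labels
X, y = make_classification(n_samples=200, n_classes=3, n_informative=4,
                           random_state=0)
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
H = classification_entropy(rf, X)
```

Fields with entropy near `log(n_classes)` are the ones a map user would flag as unreliable.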
Innovative use of self-organising maps (SOMs) in model validation.
NASA Astrophysics Data System (ADS)
Jolly, Ben; McDonald, Adrian; Coggins, Jack
2016-04-01
We present an innovative combination of techniques for validation of numerical weather prediction (NWP) output against both observations and reanalyses using two classification schemes, demonstrated by a validation of the operational NWP 'AMPS' (the Antarctic Mesoscale Prediction System). Historically, model validation techniques have centred on case studies or statistics at various time scales (yearly/seasonal/monthly). Within the past decade the latter technique has been expanded by the addition of classification schemes in place of time scales, allowing more precise analysis. Classifications are typically generated for either the model or the observations, then used to create composites for both which are compared. Our method creates and trains a single self-organising map (SOM) on both the model output and observations, which is then used to classify both datasets using the same class definitions. In addition to the standard statistics on class composites, we compare the classifications themselves between the model and the observations. To add further context to the area studied, we use the same techniques to compare the SOM classifications with regimes developed for another study to great effect. The AMPS validation study compares model output against surface observations from SNOWWEB and existing University of Wisconsin-Madison Antarctic Automatic Weather Stations (AWS) during two months over the austral summer of 2014-15. Twelve SOM classes were defined in a '4 x 3' pattern, trained on both model output and observations of 2 m wind components, then used to classify both training datasets. Simple statistics (correlation, bias and normalised root-mean-square-difference) computed for SOM class composites showed that AMPS performed well during extreme weather events, but less well during lighter winds and poorly during the more changeable conditions between either extreme. 
Comparison of the classification time series showed that, while correlations were lower during lighter wind periods, AMPS actually forecast the existence of those periods well, suggesting that the correlations may be unfairly low. Further investigation showed poor temporal alignment during more changeable conditions, highlighting problems AMPS has with the exact timing of events. There was also a tendency for AMPS to over-predict certain wind flow patterns at the expense of others. To gain a larger-scale perspective, we compared our mesoscale SOM classification time series with synoptic-scale regimes developed in another study using ERA-Interim reanalysis output and k-means clustering. There was good alignment between the regimes and the observation classifications (observations/regimes), highlighting the effect of synoptic-scale forcing on the area. However, comparing the alignment between observations/regimes and AMPS/regimes showed that AMPS may have problems accurately resolving the strength and location of cyclones in the Ross Sea to the north of the target area.
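The key step of the method, training a single map on the pooled model and observation data so that both are classified with identical class definitions, can be illustrated with a minimal NumPy self-organising map (a sketch, not the study's code; the grid size matches the '4 x 3' pattern but the training schedule is an arbitrary choice):

```python
import numpy as np

def train_som(data, rows=4, cols=3, iters=2000, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal SOM: train a rows x cols grid on pooled data (e.g. stacked
    model output and observations of 2 m wind components)."""
    rng = np.random.default_rng(seed)
    n, d = data.shape
    weights = rng.normal(size=(rows * cols, d))
    grid = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    for t in range(iters):
        x = data[rng.integers(n)]
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
        lr = lr0 * (1 - t / iters)                         # decaying learning rate
        sigma = sigma0 * (1 - t / iters) + 1e-9            # shrinking neighbourhood
        h = np.exp(-((grid - grid[bmu]) ** 2).sum(axis=1) / (2 * sigma ** 2))
        weights += lr * h[:, None] * (x - weights)
    return weights

def som_classify(weights, data):
    """Assign each sample to its best-matching unit (its SOM class)."""
    d2 = ((data[:, None, :] - weights[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)
```

Because the same `weights` classify both datasets, the resulting class time series are directly comparable, which is the point the abstract makes.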
Challenges in the automated classification of variable stars in large databases
NASA Astrophysics Data System (ADS)
Graham, Matthew; Drake, Andrew; Djorgovski, S. G.; Mahabal, Ashish; Donalek, Ciro
2017-09-01
With ever-increasing numbers of astrophysical transient surveys, new facilities and archives of astronomical time series, time domain astronomy is emerging as a mainstream discipline. However, the sheer volume of data alone - hundreds of observations for hundreds of millions of sources - necessitates advanced statistical and machine learning methodologies for scientific discovery: characterization, categorization, and classification. Whilst these techniques are slowly entering the astronomer's toolkit, their application to astronomical problems is not without its issues. In this paper, we will review some of the challenges posed by trying to identify variable stars in large data collections, including appropriate feature representations, dealing with uncertainties, establishing ground truths, and simple discrete classes.
A Classification System to Guide Physical Therapy Management in Huntington Disease: A Case Series.
Fritz, Nora E; Busse, Monica; Jones, Karen; Khalil, Hanan; Quinn, Lori
2017-07-01
Individuals with Huntington disease (HD), a rare neurological disease, experience impairments in mobility and cognition throughout their disease course. The Medical Research Council framework provides a schema that can be applied to the development and evaluation of complex interventions, such as those provided by physical therapists. Treatment-based classifications, based on expert consensus and available literature, are helpful in guiding physical therapy management across the stages of HD. Such classifications also contribute to the development and further evaluation of well-defined complex interventions in this highly variable and complex neurodegenerative disease. The purpose of this case series was to illustrate the use of these classifications in the management of 2 individuals with late-stage HD. Two females, 40 and 55 years of age, with late-stage HD participated in this case series. Both experienced progressive declines in ambulatory function and balance as well as falls or fear of falling. Both individuals received daily care in the home for activities of daily living. Physical therapy Treatment-Based Classifications for HD guided the interventions and outcomes. Eight weeks of in-home balance training, strength training, task-specific practice of functional activities including transfers and walking tasks, and family/carer education were provided. Both individuals demonstrated improvements that met or exceeded the established minimal detectable change values for gait speed and Timed Up and Go performance. Both also demonstrated improvements on Berg Balance Scale and Physical Performance Test performance, with 1 of the 2 individuals exceeding the established minimal detectable changes for both tests. Reductions in fall risk were evident in both cases. These cases provide proof-of-principle to support use of treatment-based classifications for physical therapy management in individuals with HD. 
Traditional classification of early-, mid-, and late-stage disease progression may not reflect patients' true capabilities; those with late-stage HD may be as responsive to interventions as those at an earlier disease stage. Video Abstract available for additional insights from the authors (see Supplemental Digital Content 1, available at: http://links.lww.com/JNPT/A172).
Automatic Classification of Time-variable X-Ray Sources
NASA Astrophysics Data System (ADS)
Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M.
2014-05-01
To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources whose features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy on the training data is ~97% for a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7-500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source, but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.
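The classification margin used to flag ambiguous sources can be computed directly from the forest's soft output; a generic scikit-learn sketch on synthetic data (the 2XMMi-DR2 features and the paper's outlier measure are not reproduced here):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

def classification_margin(rf, X):
    """Margin = probability of the most likely class minus that of the
    runner-up; small margins flag ambiguous, possibly anomalous sources."""
    p = np.sort(rf.predict_proba(X), axis=1)
    return p[:, -1] - p[:, -2]

# Synthetic stand-in for a 7-class variable-source training set
X, y = make_classification(n_samples=300, n_classes=7, n_informative=6,
                           random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
margins = classification_margin(rf, X)
candidates = np.argsort(margins)[:12]   # the 12 least confident sources
```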
Segmentation and analysis of mouse pituitary cells with graphic user interface (GUI)
NASA Astrophysics Data System (ADS)
González, Erika; Medina, Lucía; Hautefeuille, Mathieu; Fiordelisio, Tatiana
2018-02-01
In this work we present a method to perform pituitary cell segmentation in image stacks acquired by fluorescence microscopy from pituitary slice preparations. Although many procedures have been developed for cell segmentation tasks, they are generally based on edge detection and require high-resolution images. In the biological preparations we worked on, however, the cells are not well defined: experts identify them by their intracellular calcium activity, seen as fluorescence intensity changes in different regions over time. These intensity changes were associated with time series over regions and, because they exhibit a characteristic behavior, they were used in a classification procedure to perform cell segmentation. Two logistic regression classifiers were implemented for the time series classification task, using as features the area under the curve and skewness in the first classifier, and skewness and kurtosis in the second. After finding both decision boundaries in two different feature spaces by training on 120 time series, the decision boundaries were tested on 12 image stacks through a Python graphical user interface (GUI), generating binary images in which white pixels correspond to cells and black ones to background. Results show that the area-skewness classifier reduces the time an expert spends locating cells by up to 75% in some stacks, versus 92% for the kurtosis-skewness classifier, evaluated on the number of regions the method found. Given these promising results, we expect that this method will be improved by adding more relevant features to the classifier.
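A minimal sketch of the feature extraction and the two classifiers on synthetic traces (the trace shapes are invented for illustration, and a discrete sum stands in for the area under the curve):

```python
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.linear_model import LogisticRegression

def pixel_features(ts):
    """Area (discrete approximation), skewness and kurtosis of one
    pixel's fluorescence time series."""
    return np.array([ts.sum(), skew(ts), kurtosis(ts)])

rng = np.random.default_rng(0)
t = np.arange(100)
cells = [np.exp(-(t - 50) ** 2 / 50) + 0.05 * rng.normal(size=100)
         for _ in range(60)]                      # transient calcium peaks
background = [0.05 * rng.normal(size=100) for _ in range(60)]
F = np.array([pixel_features(ts) for ts in cells + background])
y = np.array([1] * 60 + [0] * 60)                 # 1 = cell, 0 = background

area_skew = LogisticRegression().fit(F[:, [0, 1]], y)  # first classifier
skew_kurt = LogisticRegression().fit(F[:, [1, 2]], y)  # second classifier
```

Applying the fitted decision boundary pixel-by-pixel over a stack yields the binary cell/background mask the abstract describes.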
NASA Astrophysics Data System (ADS)
Zafari, A.; Zurita-Milla, R.; Izquierdo-Verdiguier, E.
2017-10-01
Crop maps are essential inputs for the agricultural planning done at various governmental and agribusiness agencies. Remote sensing offers timely and cost-efficient technologies to identify and map crop types over large areas. Among the plethora of classification methods, Support Vector Machine (SVM) and Random Forest (RF) are widely used because of their proven performance. In this work, we study the synergic use of both methods by introducing a random forest kernel (RFK) into an SVM classifier. A time series of multispectral WorldView-2 images acquired over Mali (West Africa) in 2014 was used to develop our case study. Ground truth data covering five common crop classes (cotton, maize, millet, peanut, and sorghum) were collected at 45 farms and used to train and test the classifiers. An SVM with the standard Radial Basis Function (RBF) kernel, an RF, and an SVM-RFK were trained and tested on 10 random training and test subsets generated from the ground data. Results show that the newly proposed SVM-RFK classifier can compete with both RF and SVM-RBF: the overall accuracies based on the spectral bands only are 83, 82, and 83%, respectively. Adding vegetation indices to the analysis results in classification accuracies of 82, 81, and 84% for SVM-RFK, RF, and SVM-RBF, respectively. Overall, the newly tested RFK can compete with the SVM-RBF and RF classifiers in terms of classification accuracy.
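One common construction of a random forest kernel, similarity defined as the fraction of trees in which two samples share a leaf, can be plugged into an SVM via a precomputed Gram matrix. A sketch on synthetic data (the paper's exact kernel definition may differ):

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def rf_kernel(rf, A, B):
    """K[i, j] = fraction of trees in which A[i] and B[j] fall in the
    same leaf (a valid positive semi-definite kernel)."""
    la, lb = rf.apply(A), rf.apply(B)          # leaf indices, (n, n_trees)
    return (la[:, None, :] == lb[None, :, :]).mean(axis=2)

# Synthetic stand-in for five crop classes in feature space
X, y = make_blobs(n_samples=200, centers=5, cluster_std=1.0, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
svm_rfk = SVC(kernel="precomputed").fit(rf_kernel(rf, Xtr, Xtr), ytr)
pred = svm_rfk.predict(rf_kernel(rf, Xte, Xtr))
```

Note the asymmetric call at prediction time: the test kernel is computed between the test samples and the training samples.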
Lu, Na; Li, Tengfei; Pan, Jinjin; Ren, Xiaodong; Feng, Zuren; Miao, Hongyu
2015-05-01
Electroencephalogram (EEG) provides a non-invasive approach to measure the electrical activities of brain neurons and has long been employed for the development of brain-computer interfaces (BCI). For this purpose, various patterns/features of EEG data need to be extracted and associated with specific events like cue-paced motor imagery. However, this is a challenging task since EEG data are usually non-stationary time series with a low signal-to-noise ratio. In this study, we propose a novel method, called structure constrained semi-nonnegative matrix factorization (SCS-NMF), to extract the key patterns of EEG data in the time domain by imposing the mean envelopes of event-related potentials (ERPs) as constraints on the semi-NMF procedure. The proposed method is applicable to general EEG time series, and the temporal features extracted by SCS-NMF can also be combined with other features in the frequency domain to improve the performance of motor imagery classification. Real data experiments have been performed using the SCS-NMF approach for motor imagery classification, and the results clearly suggest the superiority of the proposed method. Comparison experiments have also been conducted against ICA, PCA, Semi-NMF, Wavelets, EMD and CSP, which further verified the effectiveness of SCS-NMF. The SCS-NMF method obtained better or competitive performance relative to the state-of-the-art methods, providing a novel solution for brain pattern analysis from the perspective of structure constraints. Copyright © 2015 Elsevier Ltd. All rights reserved.
Fuchs, Erich; Gruber, Christian; Reitmaier, Tobias; Sick, Bernhard
2009-09-01
Neural networks are often used to process temporal information, i.e., any kind of information related to time series. In many cases, time series contain short-term and long-term trends or behavior. This paper presents a new approach to capture temporal information with various reference periods simultaneously. A least squares approximation of the time series with orthogonal polynomials is used to describe short-term trends contained in a signal (average, increase, curvature, etc.). Long-term behavior is modeled with the tapped delay lines of a time-delay neural network (TDNN). This network takes the coefficients of the orthogonal expansion of the approximating polynomial as inputs, thus considering short-term and long-term information efficiently. The advantages of the method are demonstrated by means of artificial data and two real-world application examples: the prediction of the number of users in a computer network, and online tool wear classification in turning.
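The short-term part of the scheme, a least-squares fit of orthogonal polynomials over a sliding window whose coefficients encode average, slope, curvature, and so on, can be sketched with NumPy's Legendre basis (the window length and degree here are illustrative, not the paper's settings):

```python
import numpy as np

def short_term_features(window, deg=3):
    """Least-squares Legendre fit over the most recent window; the
    returned coefficients (level, slope, curvature, ...) would feed the
    tapped delay lines of the TDNN."""
    t = np.linspace(-1.0, 1.0, len(window))    # Legendre orthogonality domain
    return np.polynomial.legendre.legfit(t, window, deg)
```

A constant window yields only a level coefficient, while a linear trend shows up purely in the slope coefficient, which is what makes the expansion interpretable.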
A vegetation classification system applied to southern California
Timothy E. Paysen; Jeanine A. Derby; Hugh Black; Vernon C. Bleich; John W. Mincks
1980-01-01
A classification system for use in describing vegetation has been developed and is being applied to southern California. It is based upon a hierarchical stratification of vegetation, using physiognomic and taxonomic criteria. The system categories are Formation, Subformation, Series, Association, and Phase. Formations, Subformations, and Series have been specified for...
Automatic Detection and Classification of Unsafe Events During Power Wheelchair Use.
Pineau, Joelle; Moghaddam, Athena K; Yuen, Hiu Kim; Archambault, Philippe S; Routhier, François; Michaud, François; Boissy, Patrick
2014-01-01
Using a powered wheelchair (PW) is a complex task requiring advanced perceptual and motor control skills. Unfortunately, PW incidents and accidents are not uncommon and their consequences can be serious. The objective of this paper is to develop technological tools that can be used to characterize a wheelchair user's driving behavior under various settings. In the experiments conducted, PWs were outfitted with a datalogging platform that records, in real time, the 3-D acceleration of the PW. Data collection was conducted over 35 different activities designed to capture a spectrum of PW driving events performed at different speeds (collisions with fixed or moving objects, rolling on an inclined plane, and rolling across multiple types of obstacles). The data were processed using time-series analysis and data mining techniques to automatically detect and identify the different events. We compared the classification accuracy using four different types of time-series features: 1) time-delay embeddings; 2) time-domain characterization; 3) frequency-domain features; and 4) wavelet transforms. In the analysis, we compared the classification accuracy obtained when distinguishing between safe and unsafe events during each of the 35 different activities. For the purposes of this study, unsafe events were defined as activities containing collisions against objects at different speeds, and the remainder were defined as safe events. We were able to accurately detect 98% of unsafe events, with a low (12%) false positive rate, using only five examples of each activity. This proof-of-concept study shows that the proposed approach has the potential of capturing, based on limited input from embedded sensors, contextual information on PW use, and of automatically characterizing a user's PW driving behavior.
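Two of the four feature families are easy to sketch in plain NumPy (the embedding dimension, delay, and band count are hypothetical parameter choices, not those of the study):

```python
import numpy as np

def delay_embed(x, dim=3, tau=2):
    """Time-delay embedding of a 1-D acceleration trace: each row is
    (x[t], x[t + tau], ..., x[t + (dim-1)*tau])."""
    n = len(x) - (dim - 1) * tau
    return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)

def band_energies(x, n_bands=4):
    """Frequency-domain features: signal energy in equal-width
    frequency bands of the power spectrum."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    return np.array([band.sum() for band in np.array_split(spec, n_bands)])
```

Concatenating such features per activity window would give the input vectors for a safe/unsafe classifier.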
NASA Astrophysics Data System (ADS)
Murawski, Aline; Bürger, Gerd; Vorogushyn, Sergiy; Merz, Bruno
2016-04-01
The use of a weather-pattern-based approach for downscaling coarse, gridded atmospheric data, as usually obtained from the output of general circulation models (GCMs), allows for investigating the impact of anthropogenic greenhouse gas emissions on fluxes and state variables of the hydrological cycle, such as runoff in large river catchments. Here we aim at attributing changes in high flows in the Rhine catchment to anthropogenic climate change. To this end, we run an objective classification scheme (simulated annealing and diversified randomisation - SANDRA, available from the cost733 classification software) on ERA20C reanalysis data and apply the established classification to GCMs from the CMIP5 project. After deriving weather pattern time series from GCM runs using forcing from all greenhouse gases (All-Hist) and using natural greenhouse gas forcing only (Nat-Hist), a weather generator will be employed to obtain climate data time series for the hydrological model. The parameters of the weather pattern classification (i.e. spatial extent, number of patterns, classification variables) need to be selected in a way that allows for good stratification of the meteorological variables that are of interest for the hydrological modelling. We evaluate the skill of the classification in stratifying meteorological data using a multi-variable approach, which estimates the stratification skill for all meteorological variables together rather than separately, as is usually done in similar existing work. The advantage of the multi-variable approach is that it properly accounts for situations where, for example, two patterns are associated with similar mean daily temperature but one pattern is dry while the other is related to considerable amounts of precipitation. The separation of these two patterns would not be justified when considering temperature only, but is perfectly reasonable when accounting for precipitation as well. 
Besides that, the weather patterns derived from reanalysis data should be well represented in the All-Hist GCM runs in terms of, e.g., frequency, seasonality, and persistence. In this contribution we show how to select the most appropriate weather pattern classification and how the classes derived from it are reflected in the GCMs.
Implicit Wiener series analysis of epileptic seizure recordings.
Barbero, Alvaro; Franz, Matthias; van Drongelen, Wim; Dorronsoro, José R; Schölkopf, Bernhard; Grosse-Wentrup, Moritz
2009-01-01
Implicit Wiener series are a powerful tool to build Volterra representations of time series with any degree of non-linearity. A natural question is then whether higher order representations yield more useful models. In this work we shall study this question for ECoG data channel relationships in epileptic seizure recordings, considering whether quadratic representations yield more accurate classifiers than linear ones. To do so we first show how to derive statistical information on the Volterra coefficient distribution and how to construct seizure classification patterns over that information. As our results illustrate, a quadratic model seems to provide no advantages over a linear one. Nevertheless, we shall also show that the interpretability of the implicit Wiener series provides insights into the inter-channel relationships of the recordings.
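A quadratic implicit Wiener/Volterra model can be estimated with polynomial-kernel regression; here kernel ridge regression with a degree-2 kernel on a synthetic quadratic system illustrates the idea (a sketch of the principle, not the authors' estimator), including why a linear model falls short:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))                  # input "channel" windows
y = X[:, 0] * X[:, 1] + 0.5 * X[:, 2]          # quadratic system output

# A degree-2 polynomial kernel implicitly spans exactly the constant,
# linear, and quadratic Volterra terms of the expansion.
quad = KernelRidge(kernel="polynomial", degree=2, alpha=1e-3).fit(X, y)
lin = KernelRidge(kernel="linear", alpha=1e-3).fit(X, y)
```

On ECoG data the abstract's finding is the opposite of this toy example: the quadratic terms added little, which is itself an informative result.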
Improving the precision of dynamic forest parameter estimates using Landsat
Evan B. Brooks; John W. Coulston; Randolph H. Wynne; Valerie A. Thomas
2016-01-01
The use of satellite-derived classification maps to improve post-stratified forest parameter estimates is well established. When reducing the variance of post-stratification estimates for forest change parameters such as forest growth, it is logical to use a change-related strata map. At the stand level, a time series of Landsat images is...
Salary Compression: A Time-Series Ratio Analysis of ARL Position Classifications
ERIC Educational Resources Information Center
Seaman, Scott
2007-01-01
Although salary compression has previously been identified in such professional schools as engineering, business, and computer science, there is now evidence of salary compression among Association of Research Libraries members. Using salary data from the "ARL Annual Salary Survey", this study analyzes average annual salaries from 1994-1995…
Review and classification of variability analysis techniques with clinical applications.
Bravi, Andrea; Longtin, André; Seely, Andrew J E
2011-10-10
Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature and to promote a shared vocabulary that would improve the exchange of ideas and the comparison of results between different studies. We conclude with challenges for the evolving science of variability analysis.
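As one concrete example from the informational domain, sample entropy quantifies the irregularity of a series; a compact O(N²) sketch (one common convention among several in the literature):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy (informational domain): -log of the conditional
    probability that sequences matching for m points within tolerance r
    also match for m + 1 points. Lower values = more regular series."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()            # conventional tolerance choice

    def matches(length):
        emb = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        dist = np.max(np.abs(emb[:, None] - emb[None, :]), axis=2)
        return (dist <= r).sum() - len(emb)      # exclude self-matches

    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")
```

A periodic physiological signal scores lower than white noise, which is the behaviour such measures exploit clinically.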
NASA Astrophysics Data System (ADS)
Ma, J.; Dmochowski, J. E.
2016-12-01
Southern California's Santa Monica Mountains coastal range hosts chaparral and coastal sage scrub ecosystems with distinct local variations in fire regime, microclimate, and proximity to urbanization. The high biodiversity combined with ongoing human impact makes monitoring the ecological and land cover changes crucial. Due to their extensive, continuous temporal coverage and high spatial resolution, Landsat data are well suited to this purpose. Landsat-derived time-series NDVI data and classification maps have been compiled to identify the regions most sensitive to change, to determine the effects of fire regime, geography, and urbanization on vegetative changes, and to assess the encroachment of non-native grasses. Spatial analysis of the classification maps identified the factors most conducive to land-cover change as native shrubs were replaced with non-native grasses. Understanding the dynamics that govern semi-arid resilience, overall greening, and fire regime is important for predicting and managing large-scale ecosystem changes as pressures from global climate change and urbanization intensify.
NASA Astrophysics Data System (ADS)
Bellón, Beatriz; Bégué, Agnès; Lo Seen, Danny; Lebourgeois, Valentine; Evangelista, Balbino Antônio; Simões, Margareth; Demonte Ferraz, Rodrigo Peçanha
2018-06-01
Cropping systems' maps at fine scale over large areas provide key information for further agricultural production and environmental impact assessments, and thus represent a valuable tool for effective land-use planning. There is, therefore, a growing interest in mapping cropping systems in an operational manner over large areas, and remote sensing approaches based on vegetation index time series analysis have proven to be an efficient tool. However, supervised pixel-based approaches are commonly adopted, requiring resource-consuming field campaigns to gather training data. In this paper, we present a new object-based unsupervised classification approach tested on an annual MODIS 16-day composite Normalized Difference Vegetation Index time series and a Landsat 8 mosaic of the State of Tocantins, Brazil, for the 2014-2015 growing season. Two variants of the approach are compared: a hyperclustering approach, and a landscape-clustering approach involving a prior stratification of the study area into landscape units on which the clustering is then performed. The main cropping systems of Tocantins, characterized by the crop types and cropping patterns, were efficiently mapped with the landscape-clustering approach. Results show that stratification prior to clustering significantly improves the classification accuracies for underrepresented and sparsely distributed cropping systems. This study illustrates the potential of unsupervised classification for large-area cropping systems' mapping and contributes to the development of generic tools for supporting large-scale agricultural monitoring across regions.
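The landscape-clustering variant can be sketched as per-stratum k-means on the NDVI time series, with label offsets keeping classes from different landscape units distinct (synthetic profiles; k and the clusterer are assumptions, and the study's clustering algorithm may differ):

```python
import numpy as np
from sklearn.cluster import KMeans

def landscape_clustering(ndvi, strata, k=4, seed=0):
    """Cluster annual NDVI time series separately within each landscape
    unit, offsetting labels so classes from different strata stay distinct."""
    labels = np.empty(len(ndvi), dtype=int)
    offset = 0
    for s in np.unique(strata):
        idx = strata == s
        km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(ndvi[idx])
        labels[idx] = km.labels_ + offset
        offset += k
    return labels
```

The hyperclustering variant would simply be one `KMeans` fit over all pixels, which is why underrepresented systems can get absorbed into larger clusters there.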
NASA Astrophysics Data System (ADS)
Florindo, João Batista
2018-04-01
This work proposes the use of Singular Spectrum Analysis (SSA) for the classification of texture images, more specifically, to enhance the performance of the Bouligand-Minkowski fractal descriptors in this task. Fractal descriptors are known to be a powerful approach to model and, in particular, identify complex patterns in natural images. Nevertheless, the multiscale analysis involved in those descriptors makes them highly correlated. Although other attempts to address this point were proposed in the literature, none of them investigated the relation between this fractal correlation and the well-established analyses employed for time series, among which SSA is one of the most powerful techniques. The proposed method was employed for the classification of benchmark texture images and the results were compared with other state-of-the-art classifiers, confirming the potential of this analysis in image classification.
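A basic SSA decomposition (embed into a trajectory matrix, take its SVD, reconstruct one additive component per singular triple by anti-diagonal averaging), which is presumably the operation applied along the descriptor sequence; a pure NumPy sketch:

```python
import numpy as np

def ssa_components(x, L):
    """Basic SSA: embed the series in an L-lag trajectory matrix, take
    its SVD, and Hankelise each rank-1 term back into a series. The
    components sum exactly to the original series."""
    N = len(x)
    K = N - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])   # trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for j in range(len(s)):
        Xj = s[j] * np.outer(U[:, j], Vt[j])
        # Hankelisation: average each anti-diagonal back into one value
        comp = np.array([Xj[::-1, :].diagonal(k).mean()
                         for k in range(-L + 1, K)])
        comps.append(comp)
    return np.array(comps)
```

Keeping only the leading components gives a decorrelated, denoised version of the descriptor sequence, which is the effect the paper exploits.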
Information Foraging Theory in Software Maintenance
2012-09-30
NASA Astrophysics Data System (ADS)
Ai, Jinquan; Gao, Wei; Gao, Zhiqiang; Shi, Runhe; Zhang, Chao
2017-04-01
Spartina alterniflora is an aggressive invasive plant species that replaces native species, changes the structure and function of the ecosystem across coastal wetlands in China, and is thus a major conservation concern. Mapping the spread of its invasion is a necessary first step for the implementation of effective ecological management strategies. The performance of a phenology-based approach for S. alterniflora mapping is explored in the coastal wetland of the Yangtze Estuary using a time series of GaoFen satellite no. 1 wide field of view camera (GF-1 WFV) imagery. First, a time series of the normalized difference vegetation index (NDVI) was constructed to evaluate the phenology of S. alterniflora. Two phenological stages (the senescence stage from November to mid-December and the green-up stage from late April to May) were determined to be important for S. alterniflora detection in the study area, based on NDVI temporal profiles, spectral reflectance curves of S. alterniflora and its coexistent species, and field surveys. Three phenology feature sets representing three major phenology-based detection strategies were then compared to map S. alterniflora: (1) the single-date imagery acquired within the optimal phenological window, (2) the multitemporal imagery, including four images from the two important phenological windows, and (3) the monthly NDVI time series imagery. Support vector machine and maximum likelihood classifiers were applied to each phenology feature set at different training sample sizes. For all phenology feature sets, the overall results were produced consistently with high mapping accuracies under sufficient training sample sizes, although significantly improved classification accuracies (10%) were obtained when the monthly NDVI time series imagery was employed. The optimal single-date imagery had the lowest accuracies of all detection strategies. 
The multitemporal analysis demonstrated little reduction in the overall accuracy compared with the use of monthly NDVI time series imagery. These results show the importance of considering the phenological stage for image selection for mapping S. alterniflora using GF-1 WFV imagery. Furthermore, in light of the better tradeoff between the number of images and classification accuracy when using multitemporal GF-1 WFV imagery, we suggest using multitemporal imagery acquired at appropriate phenological windows for S. alterniflora mapping at regional scales.
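The NDVI time series underlying this phenology-based approach is simple to compute per pixel from red and near-infrared reflectance. A minimal, dependency-free sketch (function names are illustrative, not from the paper):

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized difference vegetation index from NIR and red reflectance."""
    if nir + red == 0.0:
        return 0.0  # index undefined for zero total reflectance; 0 is a common fallback
    return (nir - red) / (nir + red)

def ndvi_profile(nir_series, red_series):
    """A pixel's monthly NDVI profile is just the index per acquisition date."""
    return [ndvi(n, r) for n, r in zip(nir_series, red_series)]
```

Stacking such profiles across acquisition dates yields the monthly NDVI time series imagery that the classifiers above take as input.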
NASA Astrophysics Data System (ADS)
Wilson, Barry T.; Knight, Joseph F.; McRoberts, Ronald E.
2018-03-01
Imagery from the Landsat Program has been used frequently as a source of auxiliary data for modeling land cover, as well as a variety of attributes associated with tree cover. With ready access to all scenes in the archive since 2008 due to the USGS Landsat Data Policy, new approaches to deriving such auxiliary data from dense Landsat time series are required. Several methods have previously been developed for use with finer temporal resolution imagery (e.g. AVHRR and MODIS), including image compositing and harmonic regression using Fourier series. The manuscript presents a study, using Minnesota, USA during the years 2009-2013 as the study area and timeframe. The study examined the relative predictive power of land cover models, in particular those related to tree cover, using predictor variables based solely on composite imagery versus those using estimated harmonic regression coefficients. The study used two common non-parametric modeling approaches (i.e. k-nearest neighbors and random forests) for fitting classification and regression models of multiple attributes measured on USFS Forest Inventory and Analysis plots using all available Landsat imagery for the study area and timeframe. The estimated Fourier coefficients developed by harmonic regression of tasseled cap transformation time series data were shown to be correlated with land cover, including tree cover. Regression models using estimated Fourier coefficients as predictor variables showed a two- to threefold increase in explained variance for a small set of continuous response variables, relative to comparable models using monthly image composites. Similarly, the overall accuracies of classification models using the estimated Fourier coefficients were approximately 10-20 percentage points higher than the models using the image composites, with corresponding individual class accuracies between six and 45 percentage points higher.
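Harmonic regression of the kind described fits a mean plus paired sine/cosine terms to each pixel's time series; the estimated coefficients then serve as predictor variables. A hedged, dependency-free sketch of a first-order harmonic fit (the function name and the 365-day default period are assumptions, not the authors' code):

```python
import math

def harmonic_fit(t, y, period=365.0):
    """Fit y ~ a0 + a1*cos(2*pi*t/T) + b1*sin(2*pi*t/T) by ordinary least squares.

    Returns (a0, a1, b1), solved via the 3x3 normal equations with Gaussian
    elimination to stay dependency-free."""
    w = 2.0 * math.pi / period
    X = [[1.0, math.cos(w * ti), math.sin(w * ti)] for ti in t]
    # Normal equations: (X^T X) beta = X^T y
    A = [[sum(row[i] * row[j] for row in X) for j in range(3)] for i in range(3)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(3)]
    # Gaussian elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, 3))) / A[r][r]
    return tuple(beta)
```

Higher-order fits simply add cos/sin columns at integer multiples of the base frequency; the study's use of tasseled cap time series would apply the same fit band by band.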
Fractal analyses reveal independent complexity and predictability of gait
Dierick, Frédéric; Nivard, Anne-Laure
2017-01-01
Locomotion is a natural task that has been assessed for decades and used as a proxy to highlight impairments of various origins. So far, most studies adopted classical linear analyses of spatio-temporal gait parameters. Here, we use more advanced, yet not less practical, non-linear techniques to analyse gait time series of healthy subjects. We aimed at finding more sensitive indexes related to spatio-temporal gait parameters than those previously used, with the hope to better identify abnormal locomotion. We analysed large-scale stride interval time series and mean step width in 34 participants while altering walking direction (forward vs. backward walking) and with or without galvanic vestibular stimulation. The Hurst exponent α and the Minkowski fractal dimension D were computed and interpreted as indexes expressing predictability and complexity of stride interval time series, respectively. These holistic indexes can easily be interpreted in the framework of optimal movement complexity. We show that α and D accurately capture stride interval changes as a function of the experimental condition. Walking forward exhibited maximal complexity (D) and hence adaptability. In contrast, walking backward and/or stimulation of the vestibular system decreased D. Furthermore, walking backward increased predictability (α) through a more stereotyped pattern of the stride interval, and galvanic vestibular stimulation reduced predictability. The present study demonstrates the complementary power of the Hurst exponent and the fractal dimension to improve walking classification. Our developments may have immediate applications in rehabilitation, diagnosis, and classification procedures. PMID:29182659
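The Hurst exponent α can be estimated in several ways; the classic rescaled-range (R/S) method is a reasonable stand-in for whichever estimator the authors used. A rough, pure-Python sketch (series of a few hundred samples or more recommended):

```python
import math

def hurst_rs(x):
    """Rescaled-range (R/S) estimate of the Hurst exponent of series x:
    the slope of log(R/S) versus log(window size)."""
    def rs(seg):
        m = sum(seg) / len(seg)
        cum, lo, hi, dev = 0.0, 0.0, 0.0, 0.0
        for v in seg:
            cum += v - m
            lo, hi = min(lo, cum), max(hi, cum)
            dev += (v - m) ** 2
        s = math.sqrt(dev / len(seg))
        return (hi - lo) / s if s > 0 else 0.0

    sizes = [n for n in (8, 16, 32, 64, 128) if n <= len(x) // 2]
    pts = []
    for n in sizes:
        vals = [rs(x[i:i + n]) for i in range(0, len(x) - n + 1, n)]
        vals = [v for v in vals if v > 0]
        if vals:
            pts.append((math.log(n), math.log(sum(vals) / len(vals))))
    mx = sum(p[0] for p in pts) / len(pts)
    my = sum(p[1] for p in pts) / len(pts)
    return (sum((p[0] - mx) * (p[1] - my) for p in pts)
            / sum((p[0] - mx) ** 2 for p in pts))
```

White noise gives α near 0.5, persistent or trending series push α toward 1; small-window bias means practical estimators often add corrections not shown here.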
Hamilton, Ian; Lloyd, Charlie; Hewitt, Catherine; Godfrey, Christine
2014-01-01
The UK Misuse of Drugs Act (1971) divided controlled drugs into three classes, A, B and C, with descending criminal sanctions attached to each class. Cannabis was originally assigned by the Act to Class B, but in 2004 it was transferred to the lowest-risk class, Class C. Then in 2009, on the basis of increasing concerns about a link between high-strength cannabis and schizophrenia, it was moved back to Class B. The aim of this study is to test the assumption that changes in classification lead to changes in levels of psychosis. In particular, it explores whether the two changes in 2004 and 2009 were associated with changes in the numbers of people admitted for cannabis psychosis. An interrupted time series was used to investigate the relationship between the two changes in cannabis classification and their impact on hospital admissions for cannabis psychosis. Reflecting the two policy changes, two interruptions to the time series were made. Hospital Episode Statistics admissions data were analysed covering the period 1999 through to 2010. There was a significantly increasing trend in cannabis psychosis admissions from 1999 to 2004. However, following the reclassification of cannabis from B to C in 2004, there was a significant change in the trend such that cannabis psychosis admissions declined to 2009. Following the second reclassification of cannabis back to Class B in 2009, there was a significant change to increasing admissions. This study shows a statistical association between the reclassification of cannabis and hospital admissions for cannabis psychosis in the opposite direction to that predicted by the presumed relationship between the two. However, the reasons for this statistical association are unclear. It is unlikely to be due to changes in cannabis use over this period. Other possible explanations include changes in policing and systemic changes in mental health services unrelated to classification decisions. Copyright © 2013 Elsevier B.V.
All rights reserved.
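The core of an interrupted time series design is comparing trends on either side of each interruption. A simplified sketch that fits separate ordinary least-squares lines per segment (the study likely used a full segmented-regression model with level- and slope-change terms; names here are illustrative):

```python
def linfit(t, y):
    """Ordinary least-squares slope and intercept of y on t."""
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    slope = (sum((a - mt) * (b - my) for a, b in zip(t, y))
             / sum((a - mt) ** 2 for a in t))
    return slope, my - slope * mt

def segment_trends(t, y, breaks):
    """Slope of each segment between consecutive interruption points."""
    cuts = [min(t)] + sorted(breaks) + [max(t) + 1]
    slopes = []
    for lo, hi in zip(cuts, cuts[1:]):
        seg = [(a, b) for a, b in zip(t, y) if lo <= a < hi]
        slopes.append(linfit([a for a, _ in seg], [b for _, b in seg])[0])
    return slopes
```

A sign change in the slope across a break point is the kind of trend reversal the cannabis-psychosis study reports around 2004 and 2009.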
Patients classification on weaning trials using neural networks and wavelet transform.
Arizmendi, Carlos; Viviescas, Juan; González, Hernando; Giraldo, Beatriz
2014-01-01
Determining the optimal time for weaning patients from mechanical ventilation, i.e., distinguishing between patients capable of maintaining spontaneous breathing and patients who fail to maintain spontaneous breathing, is a very important task in the intensive care unit. Wavelet Transform (WT) and Neural Network (NN) techniques were applied in order to develop a classifier for the study of patients in the weaning trial process. The respiratory pattern of each patient was characterized through different time series. Genetic Algorithms (GA) and Forward Selection were used as feature selection techniques. A classification performance of 77.00±0.06% correctly classified patients was obtained using a combination of NN and GA, with only 6 of the 14 initial variables.
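Wavelet features for such classifiers are commonly built from detail-coefficient energies. A minimal sketch using the Haar wavelet (the abstract does not name the wavelet used, so this choice is an assumption):

```python
import math

def haar_level(x):
    """One level of the orthonormal Haar wavelet transform.

    Returns (approximation, detail) coefficient lists; len(x) must be even."""
    s = math.sqrt(2.0)
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return approx, detail

def haar_features(x, levels=2):
    """Energy of the detail coefficients per level -- a simple feature vector
    that could feed a neural network classifier."""
    feats = []
    for _ in range(levels):
        x, d = haar_level(x)
        feats.append(sum(v * v for v in d))
    return feats
```

Because the transform is orthonormal, total energy is preserved across approximation and detail coefficients, which makes the per-level energies directly comparable.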
NASA Astrophysics Data System (ADS)
Pasquarella, Valerie J.
Just as the carbon dioxide observations that form the Keeling curve revolutionized the study of the global carbon cycle, free and open access to all available Landsat imagery is fundamentally changing how the Landsat record is being used to study ecosystems and ecological dynamics. This dissertation advances the use of Landsat time series for visualization, classification, and detection of changes in terrestrial ecological processes. More specifically, it includes new examples of how complex ecological patterns manifest in time series of Landsat observations, as well as novel approaches for detecting and quantifying these patterns. Exploration of the complexity of spectral-temporal patterns in the Landsat record reveals both seasonal variability and longer-term trajectories difficult to characterize using conventional bi-temporal or even annual observations. These examples provide empirical evidence of hypothetical ecosystem response functions proposed by Kennedy et al. (2014). Quantifying observed seasonal and phenological differences in the spectral reflectance of Massachusetts' forest communities by combining existing harmonic curve fitting and phenology detection algorithms produces stable feature sets that consistently out-performed more traditional approaches for detailed forest type classification. This study addresses the current lack of species-level forest data at Landsat resolutions, demonstrating the advantages of spectral-temporal features as classification inputs. Development of a targeted change detection method using transformations of time series data improves spatial and temporal information on the occurrence of flood events in landscapes actively modified by recovering North American beaver (Castor canadensis) populations. These results indicate the utility of the Landsat record for the study of species-habitat relationships, even in complex wetland environments. 
Overall, this dissertation confirms the value of the Landsat archive as a continuous record of terrestrial ecosystem state and dynamics. Given the global coverage of remote sensing datasets, the time series visualization and analysis approaches presented here can be extended to other areas. These approaches will also be improved by more frequent collection of moderate resolution imagery, as planned by the Landsat and Sentinel-2 programs. In the modern era of global environmental change, use of the Landsat spectral-temporal domain presents new and exciting opportunities for the long-term large-scale study of ecosystem extent, composition, condition, and change.
Automatic classification of time-variable X-ray sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lo, Kitty K.; Farrell, Sean; Murphy, Tara
2014-05-01
To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy on the training data is ∼97% on a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.
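The quoted 10-fold cross-validation accuracy can be estimated with a generic k-fold loop. A small sketch with a 1-nearest-neighbour stand-in for the Random Forest (the fold scheme and classifier here are illustrative, not the authors' pipeline):

```python
import random

def k_fold_indices(n, k, seed=0):
    """Shuffle indices deterministically and deal them into k folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def cross_val_accuracy(X, y, classify, k=10):
    """Mean held-out accuracy over k folds.

    `classify(train_X, train_y, x)` must return a predicted label for x."""
    accs = []
    for fold in k_fold_indices(len(X), k):
        held = set(fold)
        train = [i for i in range(len(X)) if i not in held]
        tX, ty = [X[i] for i in train], [y[i] for i in train]
        correct = sum(classify(tX, ty, X[j]) == y[j] for j in fold)
        accs.append(correct / len(fold))
    return sum(accs) / len(accs)

def nn1(train_X, train_y, x):
    """1-nearest-neighbour classifier (squared Euclidean distance)."""
    dists = [(sum((a - b) ** 2 for a, b in zip(v, x)), lab)
             for v, lab in zip(train_X, train_y)]
    return min(dists)[1]
```

Swapping `nn1` for any other `classify` callable (e.g. a tree ensemble) leaves the evaluation loop unchanged.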
Manipulation Capabilities with Simple Hands
2010-01-01
allowing it to interpret online kinesthetic data, addressing two objectives: grasp classification, i.e., distinguishing between successful and unsuccessful grasps, and determining the grasp outcome before the grasping process is complete, by using the entire time series or kinesthetic signature of the grasping process. As the grasp proceeds and additional kinesthetic data accumulates, the confidence also increases.
Dey, Soumyabrata; Rao, A Ravishankar; Shah, Mubarak
2014-01-01
Attention Deficit Hyperactivity Disorder (ADHD) is getting a lot of attention recently for two reasons. First, it is one of the most commonly found childhood disorders, and second, the root cause of the problem is still unknown. Functional Magnetic Resonance Imaging (fMRI) data has become a popular tool for the analysis of ADHD, which is the focus of our current research. In this paper we propose a novel framework for the automatic classification of ADHD subjects using their resting state fMRI (rs-fMRI) data of the brain. We construct brain functional connectivity networks for all the subjects. The nodes of the network are constructed from clusters of highly active voxels, and edges between any pair of nodes represent the correlations between their average fMRI time series. The activity level of the voxels is measured based on the average power of their corresponding fMRI time series. For each node of the networks, a local descriptor comprising a set of attributes of the node is computed. Next, the Multi-Dimensional Scaling (MDS) technique is used to project all the subjects from the unknown graph-space to a low dimensional space based on their inter-graph distance measures. Finally, the Support Vector Machine (SVM) classifier is used on the low dimensional projected space for automatic classification of the ADHD subjects. Exhaustive experimental validation of the proposed method is performed using the data set released for the ADHD-200 competition. Our method shows promise as we achieve impressive classification accuracies on the training (70.49%) and test data sets (73.55%). Our results reveal that the detection rates are higher when classification is performed separately on the male and female groups of subjects.
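Edges in such connectivity networks are plain Pearson correlations between node-averaged time series. A dependency-free sketch (function names are illustrative):

```python
import math

def pearson(x, y):
    """Pearson correlation between two time series of equal length."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def connectivity_matrix(series):
    """Symmetric edge-weight matrix over node-averaged fMRI time series."""
    n = len(series)
    return [[1.0 if i == j else pearson(series[i], series[j])
             for j in range(n)] for i in range(n)]
```

Thresholding or weighting this matrix yields the graphs whose pairwise distances feed the MDS projection described above.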
Coronal Mass Ejection Data Clustering and Visualization of Decision Trees
NASA Astrophysics Data System (ADS)
Ma, Ruizhe; Angryk, Rafal A.; Riley, Pete; Filali Boubrahimi, Soukaina
2018-05-01
Coronal mass ejections (CMEs) can be categorized as either “magnetic clouds” (MCs) or non-MCs. Features such as a large magnetic field, low plasma-beta, and low proton temperature suggest that a CME event is also an MC event; however, so far there is neither a definitive method nor an automatic process to distinguish the two. Human labeling is time-consuming, and results can fluctuate owing to the imprecise definition of such events. In this study, we approach the problem of MC and non-MC distinction from a time series data analysis perspective and show how clustering can shed some light on this problem. Although many algorithms exist for traditional data clustering in the Euclidean space, they are not well suited for time series data. Problems such as inadequate distance measure, inaccurate cluster center description, and lack of intuitive cluster representations need to be addressed for effective time series clustering. Our data analysis in this work is twofold: clustering and visualization. For clustering we compared the results from the popular hierarchical agglomerative clustering technique to a distance density clustering heuristic we developed previously for time series data clustering. In both cases, dynamic time warping is used as the similarity measure. For classification as well as visualization, we use decision trees to aggregate single-dimensional clustering results to form a multidimensional time series decision tree, with averaged time series to present each decision. In this study, we achieved modest accuracy and, more importantly, an intuitive interpretation of how different parameters contribute to an MC event.
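Dynamic time warping, the similarity measure used here, is a short dynamic program. A textbook sketch (without the warping windows or lower bounds practical implementations often add):

```python
def dtw_distance(a, b):
    """Dynamic time warping distance with absolute-difference local cost."""
    inf = float("inf")
    n, m = len(a), len(b)
    # D[i][j]: cost of the best alignment of a[:i] with b[:j]
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]
```

Unlike Euclidean distance, DTW tolerates stretching and compression along the time axis, which is why it suits in-situ CME time series whose features shift in time from event to event.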
Ghalyan, Najah F; Miller, David J; Ray, Asok
2018-06-12
Estimation of a generating partition is critical for symbolization of measurements from discrete-time dynamical systems, where a sequence of symbols from a (finite-cardinality) alphabet may uniquely specify the underlying time series. Such symbolization is useful for computing measures (e.g., Kolmogorov-Sinai entropy) to identify or characterize the (possibly unknown) dynamical system. It is also useful for time series classification and anomaly detection. The seminal work of Hirata, Judd, and Kilminster (2004) derives a novel objective function, akin to a clustering objective, that measures the discrepancy between a set of reconstruction values and the points from the time series. They cast estimation of a generating partition via the minimization of their objective function. Unfortunately, their proposed algorithm is nonconvergent, with no guarantee of finding even locally optimal solutions with respect to their objective. The difficulty is a heuristic nearest-neighbor symbol assignment step. Alternatively, we develop a novel, locally optimal algorithm for their objective. We apply iterative nearest-neighbor symbol assignments with guaranteed discrepancy descent, by which joint, locally optimal symbolization of the entire time series is achieved. While most previous approaches frame generating partition estimation as a state-space partitioning problem, we recognize that minimizing the Hirata et al. (2004) objective function does not induce an explicit partitioning of the state space, but rather the space consisting of the entire time series (effectively, clustering in a (countably) infinite-dimensional space). Our approach also amounts to a novel type of sliding block lossy source coding. Improvement, with respect to several measures, is demonstrated over popular methods for symbolizing chaotic maps. We also apply our approach to time-series anomaly detection, considering both chaotic maps and a failure application in a polycrystalline alloy material.
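The assignment/update alternation described, nearest-symbol assignment followed by reconstruction-value updates with guaranteed discrepancy descent, is Lloyd-like. A toy sketch on a scalar series (a drastic simplification of the paper's method, which operates jointly over the entire time series space):

```python
def symbolize(series, recon):
    """Assign each sample the index of its nearest reconstruction value.

    Returns (symbol sequence, squared-error discrepancy)."""
    syms, disc = [], 0.0
    for v in series:
        j = min(range(len(recon)), key=lambda k: abs(v - recon[k]))
        syms.append(j)
        disc += (v - recon[j]) ** 2
    return syms, disc

def refine(series, recon, iters=20):
    """Alternate nearest-symbol assignment with reconstruction-value updates
    (Lloyd-style); each full step cannot increase the discrepancy."""
    for _ in range(iters):
        syms, _ = symbolize(series, recon)
        new = []
        for j in range(len(recon)):
            members = [v for v, s in zip(series, syms) if s == j]
            new.append(sum(members) / len(members) if members else recon[j])
        recon = new
    return recon, symbolize(series, recon)
```

The monotone descent property comes from the same argument as in k-means: each of the two alternating steps minimizes the discrepancy with the other held fixed.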
Convolutional Neural Network for Multi-Source Deep Learning Crop Classification in Ukraine
NASA Astrophysics Data System (ADS)
Lavreniuk, M. S.
2016-12-01
Land cover and crop type maps are among the most essential inputs when dealing with environmental and agriculture monitoring tasks [1]. For a long time the neural network (NN) approach was one of the most efficient and popular approaches for most applications, including crop classification using remote sensing data, with a high overall accuracy (OA) [2]. In recent years the most popular and efficient method for multi-sensor and multi-temporal land cover classification has been convolutional neural networks (CNNs). Taking into account the presence of clouds in optical data, self-organizing Kohonen maps (SOMs) are used to restore missing pixel values in a time series of optical imagery from the Landsat-8 satellite. After missing data restoration, optical data from Landsat-8 were merged with Sentinel-1A radar data for better crop type discrimination [3]. An ensemble of CNNs is proposed for supervised classification of multi-temporal satellite images. Each CNN in the ensemble is a 1-d CNN with 4 layers implemented using Google's TensorFlow library. The efficiency of the proposed approach was tested on a time series of Landsat-8 and Sentinel-1A images over the JECAM test site (Kyiv region) in Ukraine in 2015. Overall classification accuracy for the ensemble of CNNs was 93.5%, which outperformed an ensemble of multi-layer perceptrons (MLPs) by +0.8% and allowed us to better discriminate summer crops, in particular maize and soybeans. For 2016 we would like to validate this method using Sentinel-1 and Sentinel-2 data for the territory of Ukraine within the ESA country-level demonstration project Sen2Agri. 1. A. Kolotii et al., "Comparison of biophysical and satellite predictors for wheat yield forecasting in Ukraine," The Int. Arch. of Photogram., Rem. Sens. and Spatial Inform. Scie., vol. 40, no. 7, pp. 39-44, 2015. 2. F. Waldner et al., "Towards a set of agrosystem-specific cropland mapping methods to address the global cropland diversity," Int. Journal of Rem. Sens., vol. 37, no. 14, pp. 3196-3231, 2016. 3. S. Skakun et al., "Efficiency assessment of multitemporal C-band Radarsat-2 intensity and Landsat-8 surface reflectance satellite imagery for crop classification in Ukraine," IEEE Journal of Selected Topics in Applied Earth Observ. and Rem. Sens., 2015, DOI: 10.1109/JSTARS.2015.2454297.
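The core operation of a 1-d CNN of the kind described is a valid-mode 1-d convolution (implemented as cross-correlation, as deep-learning frameworks do) followed by a nonlinearity. A minimal sketch, not the authors' TensorFlow implementation:

```python
def conv1d(x, kernel):
    """'Valid' 1-d cross-correlation, the core op of a 1-d CNN layer."""
    k = len(kernel)
    return [sum(x[i + j] * kernel[j] for j in range(k))
            for i in range(len(x) - k + 1)]

def relu(v):
    """Elementwise rectified linear unit."""
    return [max(0.0, u) for u in v]

def conv_layer(x, kernel, bias=0.0):
    """One convolution + bias + ReLU stage applied to a per-pixel
    multi-temporal feature series."""
    return relu([u + bias for u in conv1d(x, kernel)])
```

Stacking four such stages (with learned kernels and a final classification layer) gives the shape of each 1-d CNN in the ensemble; training the kernels is of course the part the framework provides.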
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, S.; Gezari, S.; Heinis, S.
2015-03-20
We present a novel method for the light-curve characterization of Pan-STARRS1 Medium Deep Survey (PS1 MDS) extragalactic sources into stochastic variables (SVs) and burst-like (BL) transients, using multi-band image-differencing time-series data. We select detections in difference images associated with galaxy hosts using a star/galaxy catalog extracted from the deep PS1 MDS stacked images, and adopt a maximum a posteriori formulation to model their difference-flux time-series in four Pan-STARRS1 photometric bands g_P1, r_P1, i_P1, and z_P1. We use three deterministic light-curve models to fit BL transients; a Gaussian, a Gamma distribution, and an analytic supernova (SN) model, and one stochastic light-curve model, the Ornstein-Uhlenbeck process, in order to fit variability that is characteristic of active galactic nuclei (AGNs). We assess the quality of fit of the models band-wise and source-wise, using their estimated leave-one-out cross-validation likelihoods and corrected Akaike information criteria. We then apply a K-means clustering algorithm on these statistics, to determine the source classification in each band. The final source classification is derived as a combination of the individual filter classifications, resulting in two measures of classification quality, from the averages across the photometric filters of (1) the classifications determined from the closest K-means cluster centers, and (2) the square distances from the clustering centers in the K-means clustering spaces. For a verification set of AGNs and SNe, we show that SV and BL occupy distinct regions in the plane constituted by these measures. We use our clustering method to characterize 4361 extragalactic image difference detected sources, in the first 2.5 yr of the PS1 MDS, into 1529 BL, and 2262 SV, with a purity of 95.00% for AGNs, and 90.97% for SN based on our verification sets.
We combine our light-curve classifications with their nuclear or off-nuclear host galaxy offsets, to define a robust photometric sample of 1233 AGNs and 812 SNe. With these two samples, we characterize their variability and host galaxy properties, and identify simple photometric priors that would enable their real-time identification in future wide-field synoptic surveys.
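The corrected Akaike information criterion (AICc) used to compare the light-curve models has a closed form. A small sketch, including a toy per-band model selection step (the model names in the test are illustrative, not the paper's fit statistics):

```python
def aic(log_likelihood, k):
    """Akaike information criterion for a model with k free parameters."""
    return 2 * k - 2 * log_likelihood

def aicc(log_likelihood, k, n):
    """Small-sample corrected AIC for n data points; requires n > k + 1."""
    return aic(log_likelihood, k) + 2 * k * (k + 1) / (n - k - 1)

def best_model(aicc_by_model):
    """Pick the model name with the lowest AICc in one photometric band."""
    return min(aicc_by_model, key=aicc_by_model.get)
```

In the paper's pipeline such per-model statistics are not thresholded directly but fed to K-means; the selector here only illustrates the "lower criterion = better fit" convention.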
A recurrent neural network for classification of unevenly sampled variable stars
NASA Astrophysics Data System (ADS)
Naul, Brett; Bloom, Joshua S.; Pérez, Fernando; van der Walt, Stéfan
2018-02-01
Astronomical surveys of celestial sources produce streams of noisy time series measuring flux versus time (`light curves'). Unlike in many other physical domains, however, large (and source-specific) temporal gaps in data arise naturally due to intranight cadence choices as well as diurnal and seasonal constraints [1-5]. With nightly observations of millions of variable stars and transients from upcoming surveys [4,6], efficient and accurate discovery and classification techniques on noisy, irregularly sampled data must be employed with minimal human-in-the-loop involvement. Machine learning for inference tasks on such data traditionally requires the laborious hand-coding of domain-specific numerical summaries of raw data (`features') [7]. Here, we present a novel unsupervised autoencoding recurrent neural network [8] that makes explicit use of sampling times and known heteroskedastic noise properties. When trained on optical variable star catalogues, this network produces supervised classification models that rival other best-in-class approaches. We find that autoencoded features learned in one time-domain survey perform nearly as well when applied to another survey. These networks can continue to learn from new unlabelled observations and may be used in other unsupervised tasks, such as forecasting and anomaly detection.
Time series modeling of live-cell shape dynamics for image-based phenotypic profiling.
Gordonov, Simon; Hwang, Mun Kyung; Wells, Alan; Gertler, Frank B; Lauffenburger, Douglas A; Bathe, Mark
2016-01-01
Live-cell imaging can be used to capture spatio-temporal aspects of cellular responses that are not accessible to fixed-cell imaging. As the use of live-cell imaging continues to increase, new computational procedures are needed to characterize and classify the temporal dynamics of individual cells. For this purpose, here we present the general experimental-computational framework SAPHIRE (Stochastic Annotation of Phenotypic Individual-cell Responses) to characterize phenotypic cellular responses from time series imaging datasets. Hidden Markov modeling is used to infer and annotate morphological state and state-switching properties from image-derived cell shape measurements. Time series modeling is performed on each cell individually, making the approach broadly useful for analyzing asynchronous cell populations. Two-color fluorescent cells simultaneously expressing actin and nuclear reporters enabled us to profile temporal changes in cell shape following pharmacological inhibition of cytoskeleton-regulatory signaling pathways. Results are compared with existing approaches conventionally applied to fixed-cell imaging datasets, and indicate that time series modeling captures heterogeneous dynamic cellular responses that can improve drug classification and offer additional important insight into mechanisms of drug action. The software is available at http://saphire-hcs.org.
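Hidden Markov modeling of this kind annotates each frame with a morphological state; the most likely state sequence given the shape measurements is found with the Viterbi algorithm. A generic, dependency-free sketch (discrete observations for brevity, whereas SAPHIRE works with continuous shape measurements; all names are illustrative):

```python
import math

def viterbi(obs, states, log_start, log_trans, log_emit):
    """Most likely hidden-state sequence for a discrete-observation HMM.

    All probabilities are passed as logs to avoid underflow on long series."""
    V = [{s: log_start[s] + log_emit[s][obs[0]] for s in states}]
    back = []
    for o in obs[1:]:
        row, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda p: V[-1][p] + log_trans[p][s])
            row[s] = V[-1][prev] + log_trans[prev][s] + log_emit[s][o]
            ptr[s] = prev
        V.append(row)
        back.append(ptr)
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]
```

State-switching rates can then be read off the decoded path per cell, which is the kind of per-cell annotation the framework uses for phenotypic profiling.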
NASA Astrophysics Data System (ADS)
Hashiba, Hideki; Nakayama, Yasunori; Sugimura, Toshiro
The growth of major cities in Asia, as a consequence of economic development, is feared to have adverse influences on the natural environment of the surrounding areas. Comparison of land cover changes in major cities from the viewpoints of both spatial and time series is necessary to fully understand the characteristics of urban development in Asia. To accomplish this, multiple satellite remote sensing data were analyzed across a wide range and over a long term in this study. The transition processes of the major Asian cities of Tokyo, Osaka, Beijing, Shanghai, and Hong Kong were analyzed from the characteristic changes of the vegetation index value and the land cover over about 40 years, from 1972 to 2010. Image data from LANDSAT/MSS, LANDSAT/TM, ALOS/AVNIR-2, and ALOS/PRISM were obtained as a tandem time series. The ratio and state of detailed distribution of land cover were clarified by the classification processing. The time series clearly showed different change characteristics for each city and its surrounding natural environment of vegetation, forest, etc. as a result of development processes.
NASA Astrophysics Data System (ADS)
Sharma, A. K.; Hubert-Moy, L.; Betbederet, J.; Ruiz, L.; Sekhar, M.; Corgne, S.
2016-08-01
Monitoring land use and land cover, and more particularly irrigated cropland dynamics, is of great importance for water resources management and land use planning. The objective of this study was to evaluate the combined use of multi-temporal optical and radar data with a high spatial resolution in order to improve the precision of irrigated crop identification by taking into account information on crop phenological stages. SAR and optical parameters were derived from time series of seven quad-pol RADARSAT-2 and four Landsat-8 images, which were acquired over the Berambadi catchment, South India, during the monsoon crop season at the growth stages of the turmeric crop. To select the best parameters to discriminate turmeric crops, an analysis of covariance (ANCOVA) was applied to all the time-series parameters, and the most discriminant ones were classified using the Support Vector Machine (SVM) technique. Results show that in the absence of optical images, polarimetric parameters derived from SAR time series can be used for turmeric area estimates, and that the combined use of SAR and optical parameters can improve the classification accuracy in identifying turmeric.
Development and Testing of Data Mining Algorithms for Earth Observation
NASA Technical Reports Server (NTRS)
Glymour, Clark
2005-01-01
The new algorithms developed under this project included a principled procedure for classification of objects, events or circumstances according to a target variable when a very large number of potential predictor variables is available but the number of cases that can be used for training a classifier is relatively small. These "high dimensional" problems require finding a minimal set of variables, called the Markov Blanket, sufficient for predicting the value of the target variable. An algorithm, the Markov Blanket Fan Search, was developed, implemented and tested on both simulated and real data in conjunction with a graphical model classifier, which was also implemented. Another algorithm developed and implemented in TETRAD IV for time series elaborated on work by C. Granger and N. Swanson, which in turn exploited some of our earlier work. The algorithms in question learn a linear time series model from data. Given such a time series, the simultaneous residual covariances, after factoring out time dependencies, may provide information about causal processes that occur more rapidly than the time series representation allows, so-called simultaneous or contemporaneous causal processes. Working with A. Monetta, a graduate student from Italy, we produced the correct statistics for estimating the contemporaneous causal structure from time series data using the TETRAD IV suite of algorithms. Two economists, David Bessler and Kevin Hoover, have independently published applications using TETRAD style algorithms to the same purpose. These implementations and algorithmic developments were separately used in two kinds of studies of climate data: short time series of geographically proximate climate variables predicting agricultural effects in California, and longer duration climate measurements of temperature teleconnections.
Modern trends in Class III orthognathic treatment: A time series analysis.
Lee, Chang-Hoon; Park, Hyun-Hee; Seo, Byoung-Moo; Lee, Shin-Jae
2017-03-01
To examine the current trends in surgical-orthodontic treatment for patients with Class III malocclusion using time-series analysis. The records of 2994 consecutive patients who underwent orthognathic surgery from January 1, 2004, through December 31, 2015, at Seoul National University Dental Hospital, Seoul, Korea, were reviewed. Clinical data from each surgical and orthodontic treatment record included patient's sex, age at the time of surgery, malocclusion classification, type of orthognathic surgical procedure, place where the orthodontic treatment was performed, orthodontic treatment modality, and time elapsed for pre- and postoperative orthodontic treatment. Out of the orthognathic surgery patients, 86% had Class III malocclusion. Among them, two-jaw surgeries have become by far the most common orthognathic surgical treatment these days. The age at the time of surgery and the number of new patients had seasonal variations, which demonstrated opposing patterns. There was neither positive nor negative correlation between pre- and postoperative orthodontic treatment time. Elapsed orthodontic treatment time for both before and after Class III orthognathic surgeries has been decreasing over the years. Results of the time series analysis might provide clinicians with some insights into current surgical and orthodontic management.
Study of phase clustering method for analyzing large volumes of meteorological observation data
NASA Astrophysics Data System (ADS)
Volkov, Yu. V.; Krutikov, V. A.; Botygin, I. A.; Sherstnev, V. S.; Sherstneva, A. I.
2017-11-01
The article describes an iterative parallel phase-grouping algorithm for temperature field classification. The algorithm is based on a modified method of structure formation using the analytic signal. The developed method makes it possible to solve climate classification and climatic zoning tasks at any temporal or spatial scale. Applied to surface temperature measurement series, the algorithm finds climatic structures with correlated changes of the temperature field, supports conclusions about climate uniformity in a given area, and reveals climate changes over time through analysis of shifts in the type groups. The information on climate type groups specific to selected geographical areas is supplemented by a genetic scheme of class distribution depending on changes in the mutual correlation level between monthly average ground temperatures.
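The grouping of stations by correlated temperature changes can be sketched, in a much-simplified form, as connected components of a correlation graph: link two series whenever their correlation clears a threshold, then read off the components as climate type groups. This stands in for the paper's analytic-signal phase method; the threshold value and helper names are hypothetical.

```python
import math

def pearson(a, b):
    """Pearson correlation between two equal-length numeric series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)  # assumes non-constant series

def correlation_groups(series, threshold=0.8):
    """Cluster named series into connected components of the graph whose
    edges link pairs with correlation >= threshold (union-find)."""
    names = list(series)
    parent = {s: s for s in names}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if pearson(series[a], series[b]) >= threshold:
                parent[find(a)] = find(b)
    groups = {}
    for s in names:
        groups.setdefault(find(s), []).append(s)
    return list(groups.values())
```

Two in-phase series end up in one group, while an anti-phase series stays apart, mirroring the idea of type groups with correlated temperature changes.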
Proisy, Christophe; Viennois, Gaëlle; Sidik, Frida; Andayani, Ariani; Enright, James Anthony; Guitet, Stéphane; Gusmawati, Niken; Lemonnier, Hugues; Muthusankar, Gowrappan; Olagoke, Adewole; Prosperi, Juliana; Rahmania, Rinny; Ricout, Anaïs; Soulard, Benoit; Suhardjono
2018-06-01
Revegetation of abandoned aquaculture regions should be a priority for any integrated coastal zone management (ICZM). This paper examines the potential of a matchless time series of 20 very high spatial resolution (VHSR) optical satellite images for mapping trends in the evolution of mangrove forests from 2001 to 2015 in an estuary fragmented into aquaculture ponds. The evolution of mangrove extent was quantified through robust multitemporal analysis based on supervised image classification. Results indicated that mangroves are expanding inside and outside ponds and over pond dykes. However, the yearly expansion rate of vegetation cover varied greatly between replanted ponds. Ground truthing showed that only Rhizophora species had been planted, whereas natural mangroves consist of Avicennia and Sonneratia species. In addition, the dense Rhizophora plantations present very low regeneration capabilities compared with natural mangroves. Time series of VHSR images provide a comprehensive and intuitive level of information in support of ICZM. Copyright © 2017 Elsevier Ltd. All rights reserved.
Chrol-Cannon, Joseph; Jin, Yaochu
2014-01-01
Reservoir computing provides a simpler paradigm of training recurrent networks by initialising and adapting the recurrent connections separately from a supervised linear readout. This creates a problem, though. As the recurrent weights and topology are now separated from adapting to the task, there is a burden on the reservoir designer to construct an effective network that happens to produce state vectors that can be mapped linearly into the desired outputs. Guidance in forming a reservoir can come from established metrics which link a number of theoretical properties of the reservoir computing paradigm to quantitative measures that can be used to evaluate the effectiveness of a given design. We provide a comprehensive empirical study of four metrics: class separation, kernel quality, Lyapunov's exponent and spectral radius. These metrics are each compared over a number of repeated runs, for different reservoir computing set-ups that include three types of network topology and three mechanisms of weight adaptation through synaptic plasticity. Each combination of these methods is tested on two time-series classification problems. We find that the two metrics that correlate most strongly with the classification performance are Lyapunov's exponent and kernel quality. It is also evident in the comparisons that these two metrics both measure a similar property of the reservoir dynamics. We also find that class separation and spectral radius are both less reliable and less effective in predicting performance.
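Of the four metrics discussed, the spectral radius is the simplest to compute. A minimal power-iteration estimate, assuming a real dominant eigenvalue (typical for random reservoir matrices), might look like the following sketch; the function name and iteration count are arbitrary choices, not from the paper.

```python
import random

def spectral_radius(W, iters=200, seed=0):
    """Estimate the spectral radius of a square matrix W (list of lists)
    by power iteration with max-abs normalization. For reservoirs, values
    near or below 1.0 are typically sought (echo state property heuristic)."""
    rng = random.Random(seed)
    n = len(W)
    v = [rng.random() + 0.1 for _ in range(n)]  # positive start vector
    lam = 0.0
    for _ in range(iters):
        w = [sum(W[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)
        if lam == 0.0:
            return 0.0
        v = [x / lam for x in w]  # renormalize to avoid overflow
    return lam
```

A random reservoir matrix is then commonly rescaled as `W[i][j] * (target / spectral_radius(W))` to place its spectral radius at a chosen target value.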
NASA Astrophysics Data System (ADS)
Helmer, E.; Ruzycki, T. S.; Wunderle, J. M.; Kwit, C.; Ewert, D. N.; Voggesser, S. M.; Brandeis, T. J.
2011-12-01
We mapped tropical dry forest height (RMSE = 0.9 m, R2 = 0.84, range 0.6-7 m) and foliage height profiles with a time series of gap-filled Landsat and Advanced Land Imager (ALI) imagery for the island of Eleuthera, The Bahamas. We also mapped disturbance type and age with decision tree classification of the image time series. Having mapped these variables in the context of studies of wintering habitat of an endangered Nearctic-Neotropical migrant bird, the Kirtland's Warbler (Dendroica kirtlandii), we then illustrated relationships between forest vertical structure, disturbance type and counts of forage species important to the Kirtland's Warbler. The ALI imagery and the Landsat time series were both critical to the result for forest height, which the strong relationship of forest height with disturbance type and age facilitated. Also unique to this study was that seven of the eight image time steps were cloud-gap-filled images: mosaics of the clear parts of several cloudy scenes, in which cloud gaps in a reference scene for each time step are filled with image data from alternate scenes. We created each cloud-cleared image, including a virtually seamless ALI image mosaic, with regression tree normalization of the image data that filled cloud gaps. We also illustrated how viewing time series imagery as red-green-blue composites of tasseled cap wetness (RGB wetness composites) aids reference data collection for classifying tropical forest disturbance type and age.
OPTIMAL TIME-SERIES SELECTION OF QUASARS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, Nathaniel R.; Bloom, Joshua S.
2011-03-15
We present a novel method for the optimal selection of quasars using time-series observations in a single photometric bandpass. Utilizing the damped random walk model of Kelly et al., we parameterize the ensemble quasar structure function in Sloan Stripe 82 as a function of observed brightness. The ensemble model fit can then be evaluated rigorously for, and calibrated with, individual light curves with no parameter fitting. This yields a classification in two statistics, one describing the fit confidence and the other describing the probability of a false alarm, which can be tuned, a priori, to achieve high quasar detection fractions (99% completeness with default cuts) given an acceptable rate of false alarms. We establish the typical rate of false alarms due to known variable stars as ≲3% (high purity). Applying the classification, we increase the sample of potential quasars relative to those known in Stripe 82 by as much as 29%, and by nearly a factor of two in the redshift range 2.5 < z < 3, where selection by color is extremely inefficient. This represents 1875 new quasars in a 290 deg² field. The observed rates of both quasars and stars agree well with the model predictions, with >99% of quasars exhibiting the expected variability profile. We discuss the utility of the method at high redshift and in the regime of noisy and sparse data. Our time-series selection complements independent selection based on quasar colors well, and has strong potential for identifying high-redshift quasars for Baryon Acoustic Oscillation and other cosmology studies in the LSST era.
Ronald E. McRoberts
2014-01-01
Multiple remote sensing-based approaches to estimating gross afforestation, gross deforestation, and net deforestation are possible. However, many of these approaches have severe data requirements in the form of long time series of remotely sensed data and/or large numbers of observations of land cover change to train classifiers and assess the accuracy of...
Reconstruction Using Locoregional Flaps for Large Skull Base Defects.
Hatano, Takaharu; Motomura, Hisashi; Ayabe, Shinobu
2015-06-01
We present a modified locoregional flap for the reconstruction of large anterior skull base defects that would otherwise be reconstructed with a free flap according to Yano's algorithm. No classification of skull base defects had been proposed for a long time; Yano et al suggested a new classification in 2012. The Ib defect of Yano's classification extends horizontally from the cribriform plate to the orbital roof. According to Yano's algorithm for subsequent skull base reconstructive procedures, an Ib defect should be reconstructed with a free flap such as an anterolateral thigh free flap or a rectus abdominis myocutaneous free flap. However, our modified locoregional flap has also enabled reconstruction of Ib defects. In this case series, we used a locoregional flap for Ib defects. No major postoperative complications occurred. We present our modified locoregional flap that enables reconstruction of Ib defects.
UNMANNED AERIAL VEHICLE (UAV) HYPERSPECTRAL REMOTE SENSING FOR DRYLAND VEGETATION MONITORING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nancy F. Glenn; Jessica J. Mitchell; Matthew O. Anderson
2012-06-01
UAV-based hyperspectral remote sensing capabilities developed by the Idaho National Lab and Idaho State University, Boise Center Aerospace Lab, were recently tested via demonstration flights that explored the influence of altitude on geometric error, image mosaicking, and dryland vegetation classification. The test flights successfully acquired usable flightline data capable of supporting classifiable composite images. Unsupervised classification results support vegetation management objectives that rely on mapping shrub cover and distribution patterns. Overall, supervised classifications performed poorly despite spectral separability in the image-derived endmember pixels. Future mapping efforts that leverage ground reference data, ultra-high spatial resolution photos and time series analysis should be able to effectively distinguish native grasses such as Sandberg bluegrass (Poa secunda) from invasives such as burr buttercup (Ranunculus testiculatus) and cheatgrass (Bromus tectorum).
Application of data cubes for improving detection of water cycle extreme events
NASA Astrophysics Data System (ADS)
Teng, W. L.; Albayrak, A.
2015-12-01
As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series) for the hydrology and other point-time-series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on the fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are archived data rearranged into spatio-temporal matrices, which allow for easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access; the gain from such reorganization grows with the size of the data set. As a use case for our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme (WCE) events, a specific case of anomaly detection requiring time series data. We investigate the use of the sequential probability ratio test (SPRT) for anomaly detection and support vector machines (SVM) for anomaly classification. We show an example of detection of WCE events using the Global Land Data Assimilation System (GLDAS) data set.
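The SPRT mentioned above can be sketched for its simplest case, a Gaussian mean-shift test with known variance, using Wald's boundary approximations. The parameter values below are illustrative, not the project's calibrated settings.

```python
import math

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test for H1: mean=mu1 vs
    H0: mean=mu0 on i.i.d. Gaussian samples with known sigma.
    Returns (decision, index of the sample that triggered it), or
    ('undecided', len(samples)) if no boundary is crossed."""
    a = math.log(beta / (1 - alpha))   # lower boundary: accept H0
    b = math.log((1 - beta) / alpha)   # upper boundary: accept H1
    llr = 0.0
    for i, x in enumerate(samples):
        # Incremental Gaussian log-likelihood ratio
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma ** 2
        if llr >= b:
            return 'H1', i
        if llr <= a:
            return 'H0', i
    return 'undecided', len(samples)
```

In the anomaly-detection framing, a time series drifting to the anomalous mean crosses the upper boundary after only a few samples, flagging the event early.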
Application of Data Cubes for Improving Detection of Water Cycle Extreme Events
NASA Technical Reports Server (NTRS)
Albayrak, Arif; Teng, William
2015-01-01
As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series) for the hydrology and other point-time-series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on the fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are archived data rearranged into spatio-temporal matrices, which allow for easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access; the gain from such reorganization grows with the size of the data set. As a use case for our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme events, a specific case of anomaly detection requiring time series data. We investigate the use of support vector machines (SVM) for anomaly classification. We show an example of detection of water cycle extreme events using data from the Tropical Rainfall Measuring Mission (TRMM).
Optimizing Functional Network Representation of Multivariate Time Series
NASA Astrophysics Data System (ADS)
Zanin, Massimiliano; Sousa, Pedro; Papo, David; Bajo, Ricardo; García-Prieto, Juan; Pozo, Francisco Del; Menasalvas, Ernestina; Boccaletti, Stefano
2012-09-01
By combining complex network theory and data mining techniques, we provide objective criteria for optimization of the functional network representation of generic multivariate time series. In particular, we propose a method for the principled selection of the threshold value for functional network reconstruction from raw data, and for proper identification of the network's indicators that unveil the most discriminative information on the system for classification purposes. We illustrate our method by analysing networks of functional brain activity of healthy subjects, and patients suffering from Mild Cognitive Impairment, an intermediate stage between the expected cognitive decline of normal aging and the more pronounced decline of dementia. We discuss extensions of the scope of the proposed methodology to network engineering purposes, and to other data mining tasks.
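The threshold-selection step can be caricatured as follows: binarize each subject's correlation matrix at a candidate threshold and keep the threshold that best separates a simple network indicator (here, link density) between the two groups. This is a deliberately crude stand-in for the paper's optimization criteria; the indicator choice and all names are hypothetical.

```python
def adjacency(corr, thr):
    """Binarize a correlation matrix into an undirected adjacency matrix
    (absolute correlation at or above thr becomes a link; no self-links)."""
    n = len(corr)
    return [[1 if i != j and abs(corr[i][j]) >= thr else 0
             for j in range(n)] for i in range(n)]

def link_density(adj):
    """Fraction of possible undirected links that are present."""
    n = len(adj)
    links = sum(adj[i][j] for i in range(n) for j in range(i + 1, n))
    return links / (n * (n - 1) / 2)

def best_threshold(group_a, group_b, thresholds):
    """Pick the threshold that maximally separates mean link density
    between two groups of correlation matrices."""
    def mean_density(group, t):
        return sum(link_density(adjacency(c, t)) for c in group) / len(group)
    return max(thresholds,
               key=lambda t: abs(mean_density(group_a, t) - mean_density(group_b, t)))
```

With strongly correlated matrices in one group and weakly correlated ones in the other, an intermediate threshold separates the groups best; extreme thresholds make every network identical and carry no discriminative information.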
Mapping Wetlands of Dongting Lake in China Using Landsat and SENTINEL-1 Time Series at 30M
NASA Astrophysics Data System (ADS)
Xing, L.; Tang, X.; Wang, H.; Fan, W.; Gao, X.
2018-04-01
Mapping and monitoring the wetlands of Dongting Lake with optical sensor data has been limited by cloud cover; open-access Sentinel-1 C-band data provide cloud-free SAR images with both high spatial and high temporal resolution, offering new opportunities for monitoring wetlands. In this study, we combined optical and SAR data to map the wetlands of the Dongting Lake reserves in 2016. First, we generated two-monthly composited Landsat land surface reflectance, NDVI, NDWI, and TC-Wetness time series, and Sentinel-1 (VH and VV backscattering coefficient) time series. Second, we derived the surface water body at two-monthly frequency with a threshold method applied to the Sentinel-1 time series; permanent and seasonal water were then separated by the submergence ratio. Other land cover types were identified with an SVM classifier using the Landsat time series. Results showed that (1) the overall accuracies and kappa coefficients were above 86.6% and 0.8; (2) natural wetlands, including permanent water body (14.8%), seasonal water body (34.6%), and permanent marshes (10.9%), were the main land cover types, accounting for 60.3% of the three wetland reserves, while human-made wetlands, such as rice fields, accounted for 34.3% of the total area. Overall, this study proposes a new workflow for wetland mapping in Dongting Lake that combines multi-source remote sensing data; the two-monthly composited optical time series effectively compensated for data missing due to clouds and increased the possibility of precise wetland classification.
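The submergence-ratio step can be sketched per pixel: flag water whenever SAR backscatter drops below a threshold, then classify by the fraction of two-monthly periods flagged. The dB threshold and the ratio cut-offs below are invented placeholders, not the study's calibrated values.

```python
def water_mask(backscatter_db, threshold=-16.0):
    """Flag each period as water when backscatter (dB) falls below a
    (hypothetical) threshold; open water is a weak, specular reflector."""
    return [v < threshold for v in backscatter_db]

def classify_pixel(series_db, threshold=-16.0, permanent=0.85, seasonal=0.25):
    """Classify one pixel from its two-monthly backscatter series via the
    submergence ratio: the fraction of periods flagged as water."""
    flags = water_mask(series_db, threshold)
    ratio = sum(flags) / len(flags)
    if ratio >= permanent:
        return 'permanent water'
    if ratio >= seasonal:
        return 'seasonal water'
    return 'non-water'
```

A pixel flooded in every period maps to permanent water, one flooded only during the wet season maps to seasonal water, and a dry pixel falls through to the optical (SVM) branch of the workflow.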
Long-range dismount activity classification: LODAC
NASA Astrophysics Data System (ADS)
Garagic, Denis; Peskoe, Jacob; Liu, Fang; Cuevas, Manuel; Freeman, Andrew M.; Rhodes, Bradley J.
2014-06-01
Continuous classification of dismount types (including gender, age, ethnicity) and their activities (such as walking, running) evolving over space and time is challenging. Limited sensor resolution (often exacerbated as a function of platform standoff distance) and clutter from shadows in dense target environments, unfavorable environmental conditions, and the normal properties of real data all contribute to the challenge. The unique and innovative aspect of our approach is a synthesis of multimodal signal processing with incremental non-parametric, hierarchical Bayesian machine learning methods to create a new kind of target classification architecture. This architecture is designed from the ground up to optimally exploit correlations among the multiple sensing modalities (multimodal data fusion) and to rapidly and continuously learn (online self-tuning) patterns of distinct classes of dismounts given little a priori information. This increases classification performance in the presence of challenges posed by anti-access/area denial (A2/AD) sensing. To fuse multimodal features, Long-range Dismount Activity Classification (LODAC) develops a novel statistical, information-theoretic approach for multimodal data fusion that jointly models multimodal data (i.e., a probabilistic model for cross-modal signal generation) and discovers the critical cross-modal correlations by identifying components (features) with maximal mutual information (MI), which is efficiently estimated using non-parametric entropy models. LODAC develops a generic probabilistic pattern learning and classification framework based on a new class of hierarchical Bayesian learning algorithms for efficiently discovering recurring patterns (classes of dismounts) in multiple simultaneous time series (sensor modalities) at multiple levels of feature granularity.
Evaluation of the WHO criteria for the classification of patients with mastocytosis.
Sánchez-Muñoz, Laura; Alvarez-Twose, Ivan; García-Montero, Andrés C; Teodosio, Cristina; Jara-Acevedo, María; Pedreira, Carlos E; Matito, Almudena; Morgado, Jose Mario T; Sánchez, Maria Luz; Mollejo, Manuela; Gonzalez-de-Olano, David; Orfao, Alberto; Escribano, Luis
2011-09-01
Diagnosis and classification of mastocytosis is currently based on the World Health Organization (WHO) criteria. Here, we evaluate the utility of the WHO criteria for the diagnosis and classification of a large series of mastocytosis patients (n=133), and propose a new algorithm that could be routinely applied for refined diagnosis and classification of the disease. Our results confirm the utility of the WHO criteria and provide evidence for the need of additional information for (1) a more precise diagnosis of mastocytosis, (2) specific identification of new forms of the disease, (3) the differential diagnosis between cutaneous mastocytosis vs systemic mastocytosis, and (4) improved distinction between indolent systemic mastocytosis and aggressive systemic mastocytosis. Based on our results, a new algorithm is proposed for a better diagnostic definition and prognostic classification of mastocytosis, as confirmed prospectively in an independent validation series of 117 mastocytosis patients.
A neural network for noise correlation classification
NASA Astrophysics Data System (ADS)
Paitz, Patrick; Gokhberg, Alexey; Fichtner, Andreas
2018-02-01
We present an artificial neural network (ANN) for the classification of ambient seismic noise correlations into two categories, suitable and unsuitable for noise tomography. By using only a small manually classified data subset for network training, the ANN allows us to classify large data volumes with low human effort and to encode the valuable subjective experience of data analysts that cannot be captured by a deterministic algorithm. Based on a new feature extraction procedure that exploits the wavelet-like nature of seismic time-series, we efficiently reduce the dimensionality of noise correlation data while keeping the relevant features needed for automated classification. Using global- and regional-scale data sets, we show that classification errors of 20 per cent or less can be achieved when the network training is performed with as little as 3.5 per cent and 16 per cent of the data sets, respectively. Furthermore, the ANN trained on the regional data can be applied to the global data, and vice versa, without a significant increase of the classification error. An experiment in which four students manually classified the data revealed that the classification error they would assign to each other is substantially larger than the classification error of the ANN (>35 per cent). This indicates that reproducibility would be hampered more by human subjectivity than by imperfections of the ANN.
A Landsat Time-Series Stacks Model for Detection of Cropland Change
NASA Astrophysics Data System (ADS)
Chen, J.; Chen, J.; Zhang, J.
2017-09-01
Global, timely, accurate, and cost-effective cropland monitoring at fine spatial resolution will dramatically improve our understanding of the effects of agriculture on greenhouse gas emissions, food safety, and human health. Time-series remote sensing imagery has shown particular potential to describe land cover dynamics. Traditional change detection techniques are often not capable of detecting land cover changes within time series that are strongly influenced by seasonal differences, and are therefore prone to generating pseudo changes. Here, we introduce and test LTSM (Landsat time-series stacks model), an approach that improves on the previously proposed Continuous Change Detection and Classification (CCDC) method to extract spectral trajectories of land surface change from dense Landsat time-series stacks (LTS). The method is expected to eliminate pseudo changes caused by seasonally driven phenology. The main idea is that, using all available Landsat 8 images within a year, an LTSM consisting of a two-term harmonic function is estimated iteratively for each pixel in each spectral band. The LTSM then delineates change areas by differencing the predicted and observed Landsat images. The LTSM approach was compared with the change vector analysis (CVA) method. The results indicated that LTSM correctly detected the "true changes" without overestimating the "false" ones, whereas CVA identified "true change" pixels together with a large number of "false changes". The detection of change areas achieved an overall accuracy of 92.37%, with a kappa coefficient of 0.676.
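The per-pixel fitting can be sketched as an ordinary least-squares fit of a two-term harmonic (annual plus semi-annual) function, solved here via the normal equations. This is a generic reconstruction of the idea, not the authors' implementation; the 365-day period and all function names are assumptions.

```python
import math

def design_row(t, period=365.0):
    """Basis functions of a two-term harmonic model at time t (days)."""
    w = 2 * math.pi * t / period
    return [1.0, math.cos(w), math.sin(w), math.cos(2 * w), math.sin(2 * w)]

def solve(A, b):
    """Solve A x = b (n x n) by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_harmonic(times, values, period=365.0):
    """Least-squares fit of the two-term harmonic model to one pixel's
    reflectance series via the normal equations (X^T X) beta = X^T y."""
    X = [design_row(t, period) for t in times]
    n = 5
    XtX = [[sum(row[a] * row[b] for row in X) for b in range(n)] for a in range(n)]
    Xty = [sum(X[i][a] * values[i] for i in range(len(X))) for a in range(n)]
    return solve(XtX, Xty)

def predict(beta, t, period=365.0):
    """Model prediction at time t from fitted coefficients."""
    return sum(b * f for b, f in zip(beta, design_row(t, period)))
```

Change detection would then flag a pixel when the difference between observed and predicted reflectance exceeds some multiple of the residual RMSE, so that ordinary seasonal variation (captured by the harmonics) is not reported as change.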
NASA Astrophysics Data System (ADS)
Niculescu, Simona; Lardeux, Cédric; Hanganu, Jenica
2018-05-01
Wetlands are important and valuable ecosystems, yet more than 50% of the world's wetlands have been lost since 1900. An example of altered and partially restored coastal wetlands is the Danube Delta in Romania. Over time, human intervention has affected more than a quarter of the entire Danube Delta surface. This intervention was brutal and has rendered ecosystem restoration very difficult. Studies for rehabilitation and re-vegetation were started immediately after the Danube Delta was declared a Biosphere Reserve in 1990. Remote sensing offers accurate methods for detecting and mapping change in restored wetlands, and vegetation change detection is a powerful indicator of restoration success; restoration projects use vegetative cover as an important indicator of that success. To follow the evolution of the vegetation cover of the restored areas, latest-generation radar and optical satellite images, such as Sentinel-1 and Sentinel-2, have been used. Indeed, sensor sensitivity to the landscape depends on the wavelength, whether for radar or optical data, and on the polarization for radar data. Combining these kinds of data is particularly relevant for the classification of wetland vegetation, which is associated with the density and size of the vegetation. In addition, the high temporal acquisition frequency of Sentinel-1, which is not sensitive to cloud cover, allows the temporal signatures of the different land covers to be used. We thus analyse the polarimetric and temporal signatures of the Sentinel-1 data in order to better understand the signatures of the different study classes. In a second phase, we performed classifications based on the Random Forest supervised classification algorithm, first using the entire Sentinel-1 time series, then a Sentinel-2 collection, and finally combinations of Sentinel-1 and -2 data.
SUVI Thematic Maps: A new tool for space weather forecasting
NASA Astrophysics Data System (ADS)
Hughes, J. M.; Seaton, D. B.; Darnel, J.
2017-12-01
The new Solar Ultraviolet Imager (SUVI) instruments aboard NOAA's GOES-R series satellites collect continuous, high-quality imagery of the Sun in six wavelengths. SUVI imagers produce at least one image every 10 seconds, or 8,640 images per day, considerably more data than observers can digest in real time. Over the projected 20-year lifetime of the four GOES-R series spacecraft, SUVI will provide critical imagery for space weather forecasters and produce an extensive but unwieldy archive. In order to condense the database into a dynamic and searchable form, we have developed solar thematic maps: maps of the Sun with key features identified, such as coronal holes, flares, bright regions, quiet corona, and filaments. Thematic maps will be used in NOAA's Space Weather Prediction Center to improve forecaster response time to solar events and to generate several derivative products. Likewise, scientists use thematic maps to find observations of interest more easily. Using an expert-trained, naive Bayesian classifier to label each pixel, we create thematic maps in real time. We created software to collect expert classifications of solar features based on SUVI images. Using this software, we compiled a database of expert classifications, from which we could characterize the distribution of pixels associated with each theme. Given new images, the classifier assigns each pixel the most appropriate label according to the trained distribution. Here we describe the software used to collect expert training and the successes and limitations of the classifier. The algorithm identifies coronal holes very well but fails to detect filaments and prominences consistently. We compare the Bayesian classifier to an artificial neural network, one of our attempts to overcome these limitations. These results are very promising and encourage future research into an ensemble classification approach.
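A pixel-wise naive Bayes classifier of the kind described, trained on expert-labeled pixels, can be sketched with per-theme Gaussian likelihoods. The feature values, theme names, and single-feature setup below are illustrative only, not SUVI's actual training data or feature set.

```python
import math
from collections import defaultdict

def train(samples):
    """samples: list of (feature_vector, theme). Returns per-theme log-priors
    and per-feature Gaussian means/variances for a naive Bayes classifier."""
    by_theme = defaultdict(list)
    for x, theme in samples:
        by_theme[theme].append(x)
    model, total = {}, len(samples)
    for theme, xs in by_theme.items():
        d = len(xs[0])
        means = [sum(x[j] for x in xs) / len(xs) for j in range(d)]
        # Variance floor guards against degenerate (constant) features.
        vars_ = [max(sum((x[j] - means[j]) ** 2 for x in xs) / len(xs), 1e-9)
                 for j in range(d)]
        model[theme] = (math.log(len(xs) / total), means, vars_)
    return model

def classify(model, x):
    """Assign the pixel the theme with the highest posterior log-probability."""
    def logp(theme):
        prior, means, vars_ = model[theme]
        return prior + sum(-0.5 * (math.log(2 * math.pi * v) + (xi - m) ** 2 / v)
                           for xi, m, v in zip(x, means, vars_))
    return max(model, key=logp)
```

Dark pixels land in the hypothetical "coronal hole" class and bright ones in "bright region"; a real thematic-map classifier would use several wavelength channels per pixel rather than one scalar intensity.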
van der Heijden, Martijn; Dikkers, Frederik G; Halmos, Gyorgy B
2015-12-01
Laryngomalacia is the most common cause of dyspnea and stridor in newborn infants. Laryngomalacia is a dynamic change of the upper airway based on abnormally pliable supraglottic structures, which causes upper airway obstruction. In the past, different classification systems have been introduced, but to date no classification system is widely accepted and applied. Our goal is to provide a simple and complete classification system based on a systematic literature search and our experience. Retrospective cohort study with literature review. All patients with laryngomalacia under the age of 5 at the time of diagnosis were included. Photo and video documentation was used to confirm the diagnosis and the characteristics of the dynamic airway change. Outcomes were compared with available classification systems in the literature. Eighty-five patients were included. In contrast to other classification systems, only three distinct dynamic changes were identified in our series. Two existing classification systems covered 100% of our findings, but there was unnecessary overlap between different types in most of the systems. Based on our findings, we propose a new classification system for laryngomalacia based purely on dynamic airway changes. The Groningen laryngomalacia classification is a new, simplified classification system with three types, based purely on dynamic laryngeal changes and tested in a tertiary referral center: Type 1, inward collapse of the arytenoid cartilages; Type 2, medial displacement of the aryepiglottic folds; and Type 3, posterocaudal displacement of the epiglottis against the posterior pharyngeal wall. © 2015 Wiley Periodicals, Inc.
Martín-Gonzalo, Juan Andrés; Rodríguez-Andonaegui, Irene; López-López, Javier; Pascual-Pascual, Samuel Ignacio
2018-01-01
The Hereditary Spastic Paraplegias (HSP) are a group of heterogeneous disorders with a wide spectrum of underlying neural pathology, and hence HSP patients express a variety of gait abnormalities. Classification of these phenotypes may help in monitoring disease progression and personalizing therapies. This is currently managed by measuring the values of some kinematic and spatio-temporal parameters at certain moments during the gait cycle, either in the doctor's surgery or from very precise measurements produced by instrumental gait analysis (IGA). These methods, however, do not provide information about the whole structure of the gait cycle. Classification of the similarities among time series of IGA-measured sagittal joint positions throughout the whole gait cycle can be achieved by hierarchical clustering analysis based on multivariate dynamic time warping (DTW). Random forests can estimate which isolated parameters are most important for predicting the classification revealed by DTW, since clinicians need to refer to them in their daily practice. We acquired time series of pelvic, hip, knee, ankle, and forefoot sagittal angular positions from 26 HSP and 33 healthy children with an optokinetic IGA system. DTW revealed six gait patterns with different degrees of impairment of walking speed, cadence, and gait cycle distribution, related to patients' age, sex, GMFCS stage, concurrence of polyneuropathy, and abnormal visual evoked potentials or corpus callosum findings. The most important parameters for differentiating patterns were mean pelvic tilt and hip flexion at initial contact. Longer support time, decreased hip extension, and increased knee flexion at initial contact differentiate the mildest, near-normal HSP gait phenotype from the normal healthy one. Increased knee flexion at initial contact and a delayed peak of knee flexion are important factors for distinguishing GMFCS stage I from stages II-III and the concurrence of polyneuropathy.
PMID:29518090
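The multivariate DTW underlying the clustering above can be sketched with the standard dynamic-programming recurrence over vector-valued samples; pairwise DTW distances would then feed a hierarchical clustering. This generic version is a textbook formulation, not the authors' code.

```python
import math

def mv_dtw(a, b):
    """Dynamic time warping distance between two multivariate series
    (sequences of equal-length feature vectors), with Euclidean local cost
    and the classic step pattern (insert, delete, match)."""
    inf = float('inf')
    n, m = len(a), len(b)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(a[i - 1], b[j - 1])  # Euclidean, Python >= 3.8
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]
```

Because the warping path can stretch or compress time, a series and a time-shifted copy of it score a distance of zero, which is exactly the invariance needed when comparing gait cycles of different cadence.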
Classifying with confidence from incomplete information.
Parrish, Nathan; Anderson, Hyrum S.; Gupta, Maya R.; ...
2013-12-01
For this paper, we consider the problem of classifying a test sample given incomplete information. This problem arises naturally when data about a test sample is collected over time, or when costs must be incurred to compute the classification features. For example, in a distributed sensor network only a fraction of the sensors may have reported measurements at a certain time, and additional time, power, and bandwidth is needed to collect the complete data to classify. A practical goal is to assign a class label as soon as enough data is available to make a good decision. We formalize this goal through the notion of reliability: the probability that a label assigned given incomplete data would be the same as the label assigned given the complete data, and we propose a method to classify incomplete data only if some reliability threshold is met. Our approach models the complete data as a random variable whose distribution is dependent on the current incomplete data and the (complete) training data. The method differs from standard imputation strategies in that our focus is on determining the reliability of the classification decision, rather than just the class label. We show that the method provides useful reliability estimates of the correctness of the imputed class labels on a set of experiments on time-series data sets, where the goal is to classify the time-series as early as possible while still guaranteeing that the reliability threshold is met.
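The reliability idea can be sketched with a Monte Carlo imputation loop: draw plausible completions of the missing features from the training marginals, classify each completion, and emit a label only when the dominant label's frequency clears the threshold. The stand-in classifier, the marginal-sampling imputation, and all names below are hypothetical simplifications of the paper's model.

```python
import random
from collections import Counter

def classify_complete(x):
    """Stand-in complete-data classifier: sign of the feature sum."""
    return 1 if sum(x) >= 0 else 0

def reliable_label(observed, missing_idx, train_values, threshold=0.95,
                   draws=500, seed=1):
    """Estimate the probability that the label agrees across plausible
    completions of the missing features; return the label only if that
    reliability clears the threshold, else None (i.e., wait for more data).
    `observed` holds placeholders at the missing indices; `train_values`
    maps each missing index to its observed training values."""
    rng = random.Random(seed)
    votes = Counter()
    for _ in range(draws):
        x = list(observed)
        for i in missing_idx:
            x[i] = rng.choice(train_values[i])  # draw from training marginal
        votes[classify_complete(x)] += 1
    label, count = votes.most_common(1)[0]
    return label if count / draws >= threshold else None
```

When the observed features already dominate the decision, every completion agrees and a label is emitted early; when the missing feature could flip the decision, the votes split and the method abstains until more data arrives.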
Classification of DNA nucleotides with transverse tunneling currents
NASA Astrophysics Data System (ADS)
Nyvold Pedersen, Jonas; Boynton, Paul; Di Ventra, Massimiliano; Jauho, Antti-Pekka; Flyvbjerg, Henrik
2017-01-01
It has been theoretically suggested and experimentally demonstrated that fast and low-cost sequencing of DNA, RNA, and peptide molecules might be achieved by passing such molecules between electrodes embedded in a nanochannel. The experimental realization of this scheme faces major challenges, however. In realistic liquid environments, typical currents in tunneling devices are of the order of picoamps. This corresponds to only six electrons per microsecond, and this number affects the integration time required to do current measurements in real experiments. This limits the speed of sequencing, though current fluctuations due to Brownian motion of the molecule average out during the required integration time. Moreover, data acquisition equipment introduces noise, and electronic filters create correlations in time-series data. We discuss how these effects must be included in the analysis of, e.g., the assignment of specific nucleobases to current signals. As the signals from different molecules overlap, unambiguous classification is impossible with a single measurement. We argue that the assignment of molecules to a signal is a standard pattern classification problem and calculation of the error rates is straightforward. The ideas presented here can be extended to other sequencing approaches of current interest.
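The "standard pattern classification" framing above can be made concrete with a two-Gaussian toy model. The current distributions below are invented for illustration; the point is only that averaging n independent measurements shrinks the overlap between the signal distributions and hence the error rate.

```python
from math import erf, sqrt

# Hypothetical tunneling-current distributions (pA) for two bases, equal priors
mu_a, mu_b, sigma = 3.0, 5.0, 1.0
mid = (mu_a + mu_b) / 2

def classify(current):
    """Maximum-likelihood assignment for equal priors and equal variances."""
    return "A" if current < mid else "B"

def gauss_cdf(x, mu, s):
    return 0.5 * (1 + erf((x - mu) / (s * sqrt(2))))

# Error rate of a single measurement: tail mass of either Gaussian past the midpoint
single_error = 1 - gauss_cdf(mid, mu_a, sigma)

# Averaging n independent samples shrinks the effective sigma by sqrt(n)
n = 16
avg_error = 1 - gauss_cdf(mid, mu_a, sigma / sqrt(n))
```

This is the trade-off the abstract describes: longer integration (larger n) lowers the misclassification rate but limits sequencing speed.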
NASA Astrophysics Data System (ADS)
Song, Biao; Lu, Dan; Peng, Ming; Li, Xia; Zou, Ye; Huang, Meizhen; Lu, Feng
2017-02-01
Raman spectroscopy is developed as a fast and non-destructive method for the discrimination and classification of hydroxypropyl methyl cellulose (HPMC) samples. 44 E series and 41 K series HPMC samples are measured by a self-developed portable Raman spectrometer (Hx-Raman), which is excited by a 785 nm diode laser; the spectral range is 200-2700 cm-1 with a resolution (FWHM) of 6 cm-1. Multivariate analysis is applied to discriminate the E series from the K series. By methods of principal component analysis (PCA) and Fisher discriminant analysis (FDA), a discrimination result with sensitivity of 90.91% and specificity of 95.12% is achieved. The corresponding area under the receiver operating characteristic (ROC) curve is 0.99, indicating the accuracy of the predictive model. This result demonstrates the prospect of portable Raman spectrometers for rapid, non-destructive classification and discrimination of E series and K series samples of HPMC.
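The PCA-plus-Fisher-discriminant pipeline can be sketched in plain NumPy. The "spectra" below are synthetic Gaussian bands standing in for real HPMC Raman spectra; the band positions, noise level, 5-PC cutoff and class sizes are assumptions of the example, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic stand-ins for spectra: 44 "E series" and 41 "K series", 60 bands
nE, nK, p = 44, 41, 60
band = np.linspace(0, 1, p)
E = np.exp(-((band - 0.3) ** 2) / 0.02) + 0.15 * rng.standard_normal((nE, p))
K = np.exp(-((band - 0.5) ** 2) / 0.02) + 0.15 * rng.standard_normal((nK, p))
X = np.vstack([E, K])
y = np.concatenate([np.zeros(nE), np.ones(nK)])

# PCA: project onto the top principal components
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:5].T  # scores on the first 5 PCs

# Fisher discriminant in PC space: w proportional to Sw^{-1} (m1 - m0)
m0, m1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
Sw = np.cov(Z[y == 0].T) * (nE - 1) + np.cov(Z[y == 1].T) * (nK - 1)
w = np.linalg.solve(Sw, m1 - m0)
scores = Z @ w
thr = (scores[y == 0].mean() + scores[y == 1].mean()) / 2
pred = (scores > thr).astype(float)

sensitivity = np.mean(pred[y == 0] == 0)   # E series correctly identified
specificity = np.mean(pred[y == 1] == 1)   # K series correctly identified
```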
Janssen, Simone; Schmidt, Sabine
2009-07-01
The perception of prosodic cues in human speech may be rooted in mechanisms common to mammals. The present study explores to what extent bats use rhythm and frequency, which typically carry prosodic information in human speech, for the classification of communication call series. Using a two-alternative, forced-choice procedure, we trained Megaderma lyra to discriminate between synthetic contact call series differing in frequency, rhythm on the level of calls and rhythm on the level of call series, and measured the classification performance for stimuli differing in only one, or two, of the above parameters. A comparison with predictions from models based on one, combinations of two, or all, parameters revealed that the bats based their decision predominantly on frequency and in addition on rhythm on the level of call series, whereas rhythm on the level of calls was not taken into account in this paradigm. Moreover, frequency and rhythm on the level of call series were evaluated independently. Our results show that parameters corresponding to prosodic cues in human languages are perceived and evaluated by bats. Thus, these necessary prerequisites for communication via prosodic structures in mammals evolved long before human speech.
Application of information-retrieval methods to the classification of physical data
NASA Technical Reports Server (NTRS)
Mamotko, Z. N.; Khorolskaya, S. K.; Shatrovskiy, L. I.
1975-01-01
Scientific data received from satellites are characterized as a multi-dimensional time series, whose terms are vector functions of a vector of measurement conditions. Information retrieval methods are used to construct lower dimensional samples on the basis of the condition vector, in order to obtain these data and to construct partial relations. The methods are applied to the joint Soviet-French Arkad project.
Reliable Early Classification on Multivariate Time Series with Numerical and Categorical Attributes
2015-05-22
design a procedure of feature extraction in REACT named MEG (Mining Equivalence classes with shapelet Generators) based on the concept of ... Equivalence Classes Mining [12, 15]. MEG can efficiently and effectively generate the discriminative features. In addition, several strategies are proposed ... technique of parallel computing [4] to propose a process of parallel MEG for substantially reducing the computational overhead of discovering shapelet
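The shapelet features that MEG mines can be illustrated by their core primitive: the minimum distance between a short subsequence and every window of a series. The spike motif below is a hypothetical stand-in for a mined shapelet; series with the motif score low, series without it score high.

```python
import numpy as np

def shapelet_distance(series, shapelet):
    """Minimum Euclidean distance between a shapelet and any window of the series."""
    L = len(shapelet)
    return min(np.linalg.norm(series[i:i + L] - shapelet)
               for i in range(len(series) - L + 1))

rng = np.random.default_rng(3)
spike = np.array([0.0, 1.0, 2.0, 1.0, 0.0])  # hypothetical candidate shapelet

def sample(has_spike):
    x = 0.1 * rng.standard_normal(30)
    if has_spike:
        pos = rng.integers(0, 25)
        x[pos:pos + 5] += spike  # embed the motif at a random position
    return x

series0 = [sample(False) for _ in range(10)]  # class without the motif
series1 = [sample(True) for _ in range(10)]   # class with the motif
d0 = [shapelet_distance(s, spike) for s in series0]
d1 = [shapelet_distance(s, spike) for s in series1]
```

A single threshold between the two distance ranges separates the classes, which is exactly why shapelet distances make good discriminative features.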
NASA Astrophysics Data System (ADS)
Karahaliou, A.; Vassiou, K.; Skiadopoulos, S.; Kanavou, T.; Yiakoumelos, A.; Costaridou, L.
2009-07-01
The current study investigates whether texture features extracted from lesion kinetics feature maps can be used for breast cancer diagnosis. Fifty-five women with 57 breast lesions (27 benign, 30 malignant) were subjected to dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) on a 1.5 T system. A linear-slope model was fitted pixel-wise to a representative lesion slice time series, and the fitted parameters were used to create three kinetic maps (wash-out, time to peak enhancement and peak enhancement). Twenty-eight grey-level co-occurrence matrix features were extracted from each lesion kinetic map. The ability of the texture features of each map to discriminate malignant from benign lesions was investigated using a Probabilistic Neural Network classifier. Additional classification was performed by combining the classification outputs of the most discriminating feature subsets from the three maps via majority voting. The combined scheme outperformed classification based on individual maps, achieving an area under the receiver operating characteristic curve of 0.960 ± 0.029. Results suggest that the heterogeneity of breast lesion kinetics, as quantified by texture analysis, may contribute to computer-assisted tissue characterization in DCE-MRI.
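A minimal grey-level co-occurrence matrix and one of its classic features (contrast), in the spirit of the texture analysis above. The two "kinetic maps" are fabricated, one uniform and one patchy, purely to show how contrast quantifies heterogeneity.

```python
import numpy as np

def glcm(image, dx=1, dy=0, levels=8):
    """Grey-level co-occurrence matrix for a single pixel offset (dx, dy)."""
    P = np.zeros((levels, levels))
    h, w = image.shape
    for y in range(h - dy):
        for x in range(w - dx):
            P[image[y, x], image[y + dy, x + dx]] += 1
    return P / P.sum()

def contrast(P):
    """GLCM contrast: expected squared grey-level difference of co-occurring pairs."""
    i, j = np.indices(P.shape)
    return float(np.sum(P * (i - j) ** 2))

# Hypothetical kinetic maps quantized to 8 grey levels
homogeneous = np.full((16, 16), 4, dtype=int)        # uniform enhancement
rng = np.random.default_rng(4)
heterogeneous = rng.integers(0, 8, size=(16, 16))    # patchy enhancement

c_hom = contrast(glcm(homogeneous))
c_het = contrast(glcm(heterogeneous))
```

Uniform enhancement yields zero contrast; heterogeneous lesion kinetics yield high contrast, which is the kind of signal the classifier exploits.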
NASA Technical Reports Server (NTRS)
Spruce, Joseph P.; Ryan, Robert E.; Smoot, James; Kuper, Phillip; Prados, Donald; Russell, Jeffrey; Ross, Kenton; Gasser, Gerald; Sader, Steven; McKellip, Rodney
2007-01-01
This report details one of three experiments performed during FY 2007 for the NASA RPC (Rapid Prototyping Capability) at Stennis Space Center. This RPC experiment assesses the potential of VIIRS (Visible/Infrared Imager/Radiometer Suite) and MODIS (Moderate Resolution Imaging Spectroradiometer) data for detecting and monitoring forest defoliation from the non-native Eurasian gypsy moth (Lymantria dispar). The intent of the RPC experiment was to assess the degree to which VIIRS data can provide forest disturbance monitoring information as an input to a forest threat EWS (Early Warning System) as compared to the level of information that can be obtained from MODIS data. The USDA Forest Service (USFS) plans to use MODIS products for generating broad-scaled, regional monitoring products as input to an EWS for forest health threat assessment. NASA SSC is helping the USFS to evaluate and integrate currently available satellite remote sensing technologies and data products for the EWS, including the use of MODIS products for regional monitoring of forest disturbance. Gypsy moth defoliation of the mid-Appalachian highland region was selected as a case study. Gypsy moth is one of eight major forest insect threats listed in the Healthy Forest Restoration Act (HFRA) of 2003; the gypsy moth threatens eastern U.S. hardwood forests, which are also a concern highlighted in the HFRA of 2003. This region was selected for the project because extensive gypsy moth defoliation occurred there over multiple years during the MODIS operational period. This RPC experiment is relevant to several nationally important mapping applications, including agricultural efficiency, coastal management, ecological forecasting, disaster management, and carbon management. In this experiment, MODIS data and VIIRS data simulated from MODIS were assessed for their ability to contribute broad, regional geospatial information on gypsy moth defoliation. 
Landsat and ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) data were used to assess the quality of gypsy moth defoliation mapping products derived from MODIS data and from simulated VIIRS data. The project focused on use of data from MODIS Terra as opposed to MODIS Aqua mainly because only MODIS Terra data was collected during 2000 and 2001, years with comparatively high amounts of gypsy moth defoliation within the study area. The project assessed the quality of VIIRS data simulation products. Hyperion data was employed to assess the quality of MODIS-based VIIRS simulation datasets using image correlation analysis techniques. The ART (Application Research Toolbox) software was used for data simulation. Correlation analysis between MODIS-simulated VIIRS data and Hyperion-simulated VIIRS data for red, NIR (near-infrared), and NDVI (Normalized Difference Vegetation Index) image data products collectively indicates that useful, effective VIIRS simulations can be produced using Hyperion and MODIS data sources. The r^2 values for the red, NIR, and NDVI products were 0.56, 0.63, and 0.62, respectively, indicating a moderately high correlation between the two data sources. Temporal decorrelation from different data acquisition times and image misregistration may have lowered correlation results. The RPC experiment also generated MODIS-based time series data products using the TSPT (Time Series Product Tool) software. Time series of simulated VIIRS NDVI products were produced at approximately 400-meter resolution GSD (Ground Sampling Distance) at nadir for comparison to MODIS NDVI products at either 250- or 500-meter GSD. The project also computed MODIS (MOD02) NDMI (Normalized Difference Moisture Index) products at 500-meter GSD for comparison to NDVI-based products.
For each year during 2000-2006, MODIS and VIIRS (simulated from MOD02) time series were computed during the peak gypsy moth defoliation time frame in the study area (approximately June 10 through July 27). Gypsy moth defoliation mapping products from simulated VIIRS and MOD02 time series were produced using multiple methods, including image classification and change detection via image differencing. The latter enabled an automated defoliation detection product computed using percent change in maximum NDVI for a peak defoliation period during 2001 compared to maximum NDVI across the entire 2000-2006 time frame. Final gypsy moth defoliation mapping products were assessed for accuracy using randomly sampled locations found on available geospatial reference data (Landsat and ASTER data in conjunction with defoliation map data from the USFS). Extensive gypsy moth defoliation patches were evident on screen displays of multitemporal color composites derived from MODIS data and from simulated VIIRS vegetation index data. Such defoliation was particularly evident for 2001, although widespread denuded forests were also seen for 2000 and 2003. These visualizations were validated using the aforementioned reference data. Defoliation patches were visible on displays of MODIS-based NDVI and NDMI data. The viewing of apparent defoliation patches on all of these products necessitated adoption of a specialized temporal data processing method (e.g., maximum NDVI during the peak defoliation time frame). The frequency of cloud cover necessitated this approach. Multitemporal simulated VIIRS and MODIS Terra data both produced effective general classifications of defoliated forest versus other land cover. For 2001, the MOD02-simulated VIIRS 400-meter NDVI classification produced a similar yet slightly lower overall accuracy (87.28 percent with 0.72 Kappa) than the MOD02 250-meter NDVI classification (88.44 percent with 0.75 Kappa).
The MOD13 250-meter NDVI classification had a lower overall accuracy (79.13 percent) and a much lower Kappa (0.46). The report discusses accuracy assessment results in much more detail, comparing overall classification and individual class accuracy statistics for simulated VIIRS 400-meter NDVI, MOD02 250-meter NDVI, MOD02-500 meter NDVI, MOD13 250-meter NDVI, and MOD02 500-meter NDMI classifications. Automated defoliation detection products from simulated VIIRS and MOD02 data for 2001 also yielded similar, relatively high overall classification accuracy (85.55 percent for the VIIRS 400-meter NDVI versus 87.28 percent for the MOD02 250-meter NDVI). In contrast, the USFS aerial sketch map of gypsy moth defoliation showed a lower overall classification accuracy at 73.64 percent. The overall classification Kappa values were also similar for the VIIRS (approximately 0.67 Kappa) versus the MOD02 (approximately 0.72 Kappa) automated defoliation detection product, which were much higher than the values exhibited by the USFS sketch map product (overall Kappa of approximately 0.47). The report provides additional details on the accuracy of automated gypsy moth defoliation detection products compared with USFS sketch maps. The results suggest that VIIRS data can be effectively simulated from MODIS data and that VIIRS data will produce gypsy moth defoliation mapping products that are similar to MODIS-based products. The results of the RPC experiment indicate that VIIRS and MODIS data products have good potential for integration into the forest threat EWS. The accuracy assessment was performed only for 2001 because of time constraints and a relative scarcity of cloud-free Landsat and ASTER data for the peak defoliation period of the other years in the 2000-2006 time series. 
Additional work should be performed to assess the accuracy of gypsy moth defoliation detection products for additional years. The study area (mid-Appalachian highlands) and application (gypsy moth forest defoliation) are not necessarily representative of all forested regions and of all forest threat disturbance agents. Additional work should be performed on other inland and coastal regions as well as for other major forest threats.
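The report's automated defoliation detection (percent change of peak-season NDVI against the multi-year maximum) can be sketched as follows. The reflectance values, patch location and the -20% threshold are invented for the example.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red + 1e-9)

rng = np.random.default_rng(5)
years, h, w = 7, 8, 8  # stand-in for the 2000-2006 record over an 8x8 tile
# Baseline healthy forest reflectance; a defoliated patch loses NIR in "2001" (index 1)
nir = 0.45 + 0.02 * rng.standard_normal((years, h, w))
red = 0.05 + 0.005 * rng.standard_normal((years, h, w))
nir[1, :4, :4] = 0.20   # defoliation drops near-infrared reflectance
red[1, :4, :4] = 0.12

ndvi_series = ndvi(nir, red)
max_all = ndvi_series.max(axis=0)              # maximum NDVI over the full record
pct_change = 100 * (ndvi_series[1] - max_all) / max_all
defoliated = pct_change < -20                  # threshold on percent change
```

Healthy pixels stay near their multi-year maximum NDVI in any one year, so only the patch crosses the percent-change threshold.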
Real-Time Control of a Video Game Using Eye Movements and Two Temporal EEG Sensors.
Belkacem, Abdelkader Nasreddine; Saetia, Supat; Zintus-art, Kalanyu; Shin, Duk; Kambara, Hiroyuki; Yoshimura, Natsue; Berrached, Nasreddine; Koike, Yasuharu
2015-01-01
EEG-controlled gaming applications range widely from strictly medical to completely nonmedical applications. Games can provide not only entertainment but also strong motivation for practicing, thereby achieving better control with a rehabilitation system. In this paper we present real-time control of a video game with eye movements, as an asynchronous and noninvasive communication system using two temporal EEG sensors. We used wavelets to detect the instant of eye movement and time-series characteristics to distinguish between six classes of eye movement. A control interface was developed to test the proposed algorithm in real-time experiments with opened and closed eyes. Using visual feedback, a mean classification accuracy of 77.3% was obtained for control with six commands, and a mean classification accuracy of 80.2% was obtained using auditory feedback for control with five commands. The algorithm was then applied to controlling the direction and speed of character movement in a two-dimensional video game. Results showed that the proposed algorithm had an efficient response speed and timing with a bit rate of 30 bits/min, demonstrating its efficacy and robustness in real-time control.
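Wavelet-based detection of an eye-movement event, as used above, reduces at its simplest to thresholding level-1 detail coefficients. The EOG-like signal below is synthetic, the saccade is modeled as a step, and the amplitudes are invented; this is a caricature of the paper's detector, not its implementation.

```python
import numpy as np

def haar_detail(x):
    """Level-1 Haar wavelet detail coefficients (scaled pairwise differences)."""
    x = x[: len(x) // 2 * 2]
    return (x[0::2] - x[1::2]) / np.sqrt(2)

rng = np.random.default_rng(8)
eog = 5.0 * rng.standard_normal(512)   # baseline noise, arbitrary units
eog[201:] += 120.0                     # step-like deflection: saccade onset

d = np.abs(haar_detail(eog))
onset = 2 * int(np.argmax(d))          # sample index of the largest transient
```

The step produces one dominant detail coefficient at the pair straddling the onset, so its index localizes the event to within a sample or two.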
Algorithms exploiting ultrasonic sensors for subject classification
NASA Astrophysics Data System (ADS)
Desai, Sachi; Quoraishee, Shafik
2009-09-01
Proposed here is a series of techniques exploiting micro-Doppler ultrasonic sensors to characterize detected mammalian targets based on their physiological movements, captured as a series of robust features. A combination of unique and conventional digital signal processing techniques is arranged so as to classify a series of walkers. This feature extraction develops a robust feature space that discriminates the movements generated by bipeds and quadrupeds, further subdivided into large and small. These movements provide specific information about a given signature, dividing it into a series of subset signatures by exploiting wavelets to generate start/stop times. Spectrograms of the signature show distinct differences, and using kurtosis we generate an envelope detector capable of isolating each of the corresponding step cycles generated during a walk. The walk cycle is defined as one complete sequence of walking/running, from the foot pushing off the ground to its return to the ground. This timing information segments the events, readily seen in the spectrogram but obscured in the temporal domain, into individual walk sequences. Each walking sequence is then translated into a three-dimensional waterfall plot defining the expected energy associated with the motion at a particular instance of time and frequency. This value is repeatable for each class and can be employed to discriminate the events. Highly reliable classification is realized with a classifier trained on a candidate sample space derived from the gyrations created by the motion of actors of interest.
The classifier developed herein can classify events as adult humans, children, horses, or dogs at potentially high rates based on the tested sample space. The algorithm will provide utility to an underused sensor modality for human intrusion detection, which currently suffers from a high rate of false alarms. The active ultrasonic sensor can be coupled in a multi-modal sensor suite with binary, less descriptive sensors such as seismic devices, realizing a greater accuracy rate for detecting persons of interest for homeland security purposes.
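The kurtosis-driven envelope detection for isolating step cycles can be sketched as a sliding-window kurtosis over a synthetic micro-Doppler-like signal. The burst timing, amplitudes and the 1.5 threshold are assumptions of this sketch, not values from the paper.

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(6)
fs = 1000
t = np.arange(0, 4, 1 / fs)
signal = 0.05 * rng.standard_normal(t.size)
# Four impulsive "footfall" bursts, one per second
for step in range(4):
    i = int((step + 0.5) * fs)
    signal[i:i + 50] += 2.0 * np.hanning(50) * np.sin(2 * np.pi * 40 * t[:50])

# Sliding-window excess kurtosis: impulsive bursts are strongly leptokurtic,
# while pure Gaussian noise windows sit near zero
win = 100
k = np.array([kurtosis(signal[i:i + win]) for i in range(0, t.size - win, win)])
step_windows = np.flatnonzero(k > 1.5)  # windows flagged as containing a step
```

Thresholding the kurtosis trace yields the start/stop times that segment the signature into individual walk cycles.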
Description of congenital hand anomalies: a personal view.
Tonkin, M A
2006-10-01
A series of four congenital hand cases exhibiting central clefting is presented. The cases are morphologically similar and exhibit characteristics of both symbrachydactyly and central longitudinal deficiency. The cases demonstrate the difficulties of classification under either the IFSSH classification system or the JSSH modification of it. An alternative descriptive approach to classification is suggested.
Linear Classification of Dairy Cattle. Slide Script.
ERIC Educational Resources Information Center
Sipiorski, James; Spike, Peter
This slide script, part of a series of slide scripts designed for use in vocational agriculture classes, deals with principles of the linear classification of dairy cattle. Included in the guide are narrations for use with 63 slides, which illustrate the following areas that are considered in the linear classification system: stature, strength,…
Automated Spatio-Temporal Analysis of Remotely Sensed Imagery for Water Resources Management
NASA Astrophysics Data System (ADS)
Bahr, Thomas
2016-04-01
Since 2012, the state of California has faced an extreme drought, which impacts water supply in many ways. Advanced remote sensing is an important technology to better assess water resources, monitor drought conditions and water supplies, plan for drought response and mitigation, and measure drought impacts. In the present case study, the latest time series analysis capabilities are used to examine surface water in reservoirs located along the western flank of the Sierra Nevada region of California. This case study was performed using the COTS software package ENVI 5.3. Integration of custom processes and automation is supported by IDL (Interactive Data Language); thus, ENVI analytics runs via the object-oriented, IDL-based ENVITask API. A time series of Landsat images (L-5 TM, L-7 ETM+, L-8 OLI) of the AOI was obtained for 1999 to 2015 (October acquisitions). Downloaded from the USGS EarthExplorer web site, they were already georeferenced to a UTM Zone 10N (WGS-84) coordinate system. ENVITasks were used to pre-process the Landsat images as follows: • Triangulation-based gap-filling for the SLC-off Landsat-7 ETM+ images. • Spatial subsetting to the same geographic extent. • Radiometric correction to top-of-atmosphere (TOA) reflectance. • Atmospheric correction using QUAC®, which determines atmospheric correction parameters directly from the observed pixel spectra in a scene, without ancillary information. Spatio-temporal analysis was executed with the following tasks: • Creation of Modified Normalized Difference Water Index images (MNDWI, Xu 2006) to enhance open water features while suppressing noise from built-up land, vegetation, and soil. • Threshold-based classification of the water index images to extract the water features. • Classification aggregation as a post-classification cleanup process. • Export of the respective water classes to vector layers for further evaluation in a GIS. 
• Animation of the classification series and export to a common video format. • Plotting the time series of water surface area in square kilometers. The automated spatio-temporal analysis introduced here can be embedded in virtually any existing geospatial workflow for operational applications. Three integration options were implemented in this case study: • Integration within any ArcGIS environment whether deployed on the desktop, in the cloud, or online. Execution uses a customized ArcGIS script tool. A Python script file retrieves the parameters from the user interface and runs the precompiled IDL code. That IDL code is used to interface between the Python script and the relevant ENVITasks. • Publishing the spatio-temporal analysis tasks as services via the ENVI Services Engine (ESE). ESE is a cloud-based image analysis solution to publish and deploy advanced ENVI image and data analytics to existing enterprise infrastructures. For this purpose the entire IDL code can be capsuled in a single ENVITask. • Integration in an existing geospatial workflow using the Python-to-IDL Bridge. This mechanism allows calling IDL code within Python on a user-defined platform. The results of this case study verify the drastic decrease of the amount of surface water in the AOI, indicative of the major drought that is pervasive throughout California. Accordingly, the time series analysis was correlated successfully with the daily reservoir elevations of the Don Pedro reservoir (station DNP, operated by CDEC).
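The MNDWI thresholding step described above amounts to a few lines of array arithmetic. The toy reflectance tiles, the zero threshold, and the 30 m pixel size are illustrative only.

```python
import numpy as np

def mndwi(green, swir):
    """Modified Normalized Difference Water Index (Xu 2006)."""
    return (green - swir) / (green + swir + 1e-9)

# Hypothetical reflectance tiles: water is bright in green and dark in SWIR
green = np.array([[0.10, 0.10], [0.08, 0.09]])
swir = np.array([[0.02, 0.03], [0.20, 0.25]])  # top row: water; bottom row: land

index = mndwi(green, swir)
water = index > 0.0                  # positive MNDWI indicates open water
area_km2 = water.sum() * 0.03 ** 2   # e.g. 30 m Landsat pixels
```

Summing the water mask per acquisition date gives exactly the surface-area time series plotted in the case study.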
Courtney, Jane; Woods, Elena; Scholz, Dimitri; Hall, William W; Gautier, Virginie W
2015-01-01
We introduce here MATtrack, an open source MATLAB-based computational platform developed to process multi-Tiff files produced by a photo-conversion time lapse protocol for live cell fluorescent microscopy. MATtrack automatically performs a series of steps required for image processing, including extraction and import of numerical values from Multi-Tiff files, red/green image classification using gating parameters, noise filtering, background extraction, contrast stretching and temporal smoothing. MATtrack also integrates a series of algorithms for quantitative image analysis enabling the construction of mean and standard deviation images, clustering and classification of subcellular regions and injection point approximation. In addition, MATtrack features a simple user interface, which enables monitoring of Fluorescent Signal Intensity in multiple Regions of Interest, over time. The latter encapsulates a region growing method to automatically delineate the contours of Regions of Interest selected by the user, and performs background and regional Average Fluorescence Tracking, and automatic plotting. Finally, MATtrack computes convenient visualization and exploration tools including a migration map, which provides an overview of the protein intracellular trajectories and accumulation areas. In conclusion, MATtrack is an open source MATLAB-based software package tailored to facilitate the analysis and visualization of large data files derived from real-time live cell fluorescent microscopy using photoconvertible proteins. It is flexible, user friendly, compatible with Windows, Mac, and Linux, and a wide range of data acquisition software. MATtrack is freely available for download at eleceng.dit.ie/courtney/MATtrack.zip.
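MATtrack's region-growing delineation of a Region of Interest can be sketched as a 4-connected flood fill from a user-selected seed. The image, seed and intensity tolerance below are invented for the example; this is not MATtrack code.

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol=0.2):
    """4-connected region growing: include neighbors whose intensity is
    within tol of the seed pixel's intensity."""
    h, w = img.shape
    ref = img[seed]
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx] \
                    and abs(img[ny, nx] - ref) <= tol:
                mask[ny, nx] = True
                queue.append((ny, nx))
    return mask

img = np.zeros((6, 6))
img[1:4, 1:4] = 1.0                 # a bright "cell" region
mask = region_grow(img, (2, 2))     # seed inside the bright region
```

Averaging the image intensity over the returned mask at each frame gives the kind of per-ROI fluorescence track that MATtrack plots over time.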
ERIC Educational Resources Information Center
Albrechtsen, Hanne, Ed.; Mai, Jens-Erik, Ed.
This volume is a compilation of the papers presented at the 10th ASIS (American Society for Information Science) workshop on classification research. Major themes include the social and cultural informatics of classification and coding systems, subject access and indexing theory, genre analysis and the agency of documents in the ordering of…
An iterative approach to optimize change classification in SAR time series data
NASA Astrophysics Data System (ADS)
Boldt, Markus; Thiele, Antje; Schulz, Karsten; Hinz, Stefan
2016-10-01
The detection of changes using remote sensing imagery has become a broad field of research with many approaches for many different applications. Besides the simple detection of changes between at least two images acquired at different times, analyses which aim at the change type or category are at least equally important. In this study, an approach for a semi-automatic classification of change segments is presented. A sparse dataset is considered to ensure fast and simple applicability for practical issues. The dataset is given by 15 high resolution (HR) TerraSAR-X (TSX) amplitude images acquired over a time period of one year (11/2013 to 11/2014). The scenery contains the airport of Stuttgart (GER) and its surroundings, including urban, rural, and suburban areas. Time series imagery offers the advantage of analyzing the change frequency of selected areas. In this study, the focus is set on the analysis of small-sized, high-frequently changing regions like parking areas, construction sites and collecting points consisting of high activity (HA) change objects. For each HA change object, suitable features are extracted and k-means clustering is applied as the categorization step. Resulting clusters are finally compared to a previously introduced knowledge-based class catalogue, which is modified until an optimal class description results. In other words, the subjective understanding of the scenery semantics is optimized against the reality given by the data. In this way, even a sparse dataset containing only amplitude imagery can be evaluated without requiring comprehensive training datasets. Falsely defined classes might be rejected. Furthermore, classes which were defined too coarsely might be divided into sub-classes, and classes which were initially defined too narrowly might be merged. An optimal classification results when the combination of previously defined key indicators (e.g., number of clusters per class) reaches an optimum.
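The k-means categorization step applied to HA change-object features might look as follows. The two feature axes (object area, change frequency) and the synthetic clusters are assumptions of this sketch, not the study's features.

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Plain Lloyd's algorithm with deterministic far-apart initialization."""
    centers = X[[0, -1]].copy() if k == 2 else X[:k].copy()
    for _ in range(iters):
        # Assign each object to its nearest center, then recompute the centers
        labels = np.argmin(np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

rng = np.random.default_rng(7)
# Hypothetical features per change object: (area in pixels, changes per year)
parking = np.column_stack([rng.normal(40, 5, 20), rng.normal(200, 20, 20)])
construction = np.column_stack([rng.normal(400, 40, 20), rng.normal(12, 3, 20)])
X = np.vstack([parking, construction])
X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize features

labels, centers = kmeans(X, k=2)
```

The resulting clusters would then be compared against the knowledge-based class catalogue, which is refined until cluster structure and class definitions agree.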
Exercise recognition for Kinect-based telerehabilitation.
Antón, D; Goñi, A; Illarramendi, A
2015-01-01
An aging population and people's higher survival of diseases and traumas that leave physical consequences are challenging aspects in the context of efficient health management. This is why telerehabilitation systems are being developed: to allow monitoring and support of physiotherapy sessions at home, which could reduce healthcare costs while also improving the quality of life of the users. Our goal is the development of a Kinect-based algorithm that provides very accurate real-time monitoring of physical rehabilitation exercises and that also provides a friendly interface oriented both to users and physiotherapists. The two main constituents of our algorithm are the posture classification method and the exercise recognition method. The exercises consist of series of movements. Each movement is composed of an initial posture, a final posture and the angular trajectories of the limbs involved in the movement. The algorithm was designed and tested with datasets of real movements performed by volunteers. We also explain in the paper how we obtained the optimal trade-off values for posture and trajectory recognition. Two relevant aspects of the algorithm were evaluated in our tests: classification accuracy and real-time data processing. We achieved 91.9% accuracy in posture classification and 93.75% accuracy in trajectory recognition. We also checked whether the algorithm was able to process the data in real time. We found that our algorithm could process more than 20,000 postures per second and all the required trajectory data series in real time, which in practice guarantees no perceptible delays. Later on, we carried out two clinical trials with real patients who suffered shoulder disorders. We obtained an exercise monitoring accuracy of 95.16%. We present an exercise recognition algorithm that handles the data provided by Kinect efficiently. The algorithm has been validated in a real scenario where we have verified its suitability. 
Moreover, we have received positive feedback from both the users and the physiotherapists who took part in the tests.
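Posture comparison of this kind typically reduces to comparing joint angles computed from 3-D skeleton coordinates. The sketch below is purely illustrative (the function name and the specific joints are assumptions, not the authors' implementation):

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by 3-D points a-b-c,
    e.g. shoulder-elbow-wrist positions from a Kinect skeleton."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    # clamp to [-1, 1] to guard against floating-point drift
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))
```

A posture classifier could then compare such angles against the stored initial/final postures of each movement within some tolerance.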
NASA Technical Reports Server (NTRS)
Spruce, J. P.; Smoot, James; Ellis, Jean; Hilbert, Kent; Swann, Roberta
2012-01-01
This paper discusses the development and implementation of a geospatial data processing method and multi-decadal Landsat time series for computing general coastal U.S. land-use and land-cover (LULC) classifications and change products consisting of seven classes (water, barren, upland herbaceous, non-woody wetland, woody upland, woody wetland, and urban). Use of this approach extends the observational period of the NOAA-generated Coastal Change and Analysis Program (C-CAP) products by almost two decades, assuming the availability of one cloud free Landsat scene from any season for each targeted year. The Mobile Bay region in Alabama was used as a study area to develop, demonstrate, and validate the method that was applied to derive LULC products for nine dates at approximate five year intervals across a 34-year time span, using single dates of data for each classification in which forests were either leaf-on, leaf-off, or mixed senescent conditions. Classifications were computed and refined using decision rules in conjunction with unsupervised classification of Landsat data and C-CAP value-added products. Each classification's overall accuracy was assessed by comparing stratified random locations to available reference data, including higher spatial resolution satellite and aerial imagery, field survey data, and raw Landsat RGBs. Overall classification accuracies ranged from 83 to 91% with overall Kappa statistics ranging from 0.78 to 0.89. The accuracies are comparable to those from similar, generalized LULC products derived from C-CAP data. The Landsat MSS-based LULC product accuracies are similar to those from Landsat TM or ETM+ data. Accurate classifications were computed for all nine dates, yielding effective results regardless of season. This classification method yielded products that were used to compute LULC change products via additive GIS overlay techniques.
Swetapadma, Aleena; Yadav, Anamika
2015-01-01
Many schemes have been reported for shunt fault location estimation, but fault location estimation of series (open conductor) faults has not been dealt with so far. Existing numerical relays only detect the open conductor (series) fault and indicate the faulty phase(s), but they are unable to locate the series fault, so the repair crew needs to patrol the complete line to find its location. In this paper, fuzzy-based fault detection/classification and location schemes in the time domain are proposed for series faults, shunt faults, and simultaneous series and shunt faults. The fault simulation studies and fault location algorithm have been developed using Matlab/Simulink. Synchronized phasors of the voltage and current signals at both ends of the line are used as input to the proposed fuzzy-based fault location scheme. The percentage error in location is within 1% for series faults and within 5% for shunt faults across all tested fault cases. The error percentages were validated using the Chi-square test at both the 1% and 5% levels of significance. PMID:26413088
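A fuzzy scheme of this kind maps crisp electrical measurements to fault classes through membership functions and rules. The toy sketch below is a hypothetical, greatly simplified illustration: the triangular membership shapes and the single `current_ratio` input are assumptions for demonstration, not the paper's actual rule base:

```python
def trimf(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x == b:
        return 1.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify_fault(current_ratio):
    """Toy rule base: a near-zero phase-current ratio suggests a series
    (open conductor) fault; a large ratio suggests a shunt fault."""
    series_deg = trimf(current_ratio, -0.2, 0.0, 0.5)
    shunt_deg = trimf(current_ratio, 0.3, 1.0, 1.8)
    return "series" if series_deg > shunt_deg else "shunt"
```

A production scheme would of course use many inputs (synchronized two-end phasors, as in the paper) and a full rule base with defuzzification for the location estimate.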
High Accuracy Human Activity Recognition Based on Sparse Locality Preserving Projections.
Zhu, Xiangbin; Qiu, Huiling
2016-01-01
Human activity recognition (HAR) from the temporal streams of sensory data has been applied to many fields, such as healthcare services, intelligent environments and cyber security. However, the classification accuracy of most existing methods is not sufficient for some applications, especially healthcare services. To improve accuracy, it is necessary to develop a novel method that takes full account of the intrinsic sequential characteristics of time-series sensory data. Moreover, each human activity may have correlated feature relationships at different levels. Therefore, in this paper, we propose a three-stage continuous hidden Markov model (TSCHMM) approach to recognize human activities. The proposed method comprises coarse, fine and accurate classification. Feature reduction is an important step in classification processing. In this paper, sparse locality preserving projections (SpLPP) is exploited to determine the optimal feature subsets for accurate classification of the stationary-activity data. It can extract more discriminative activity features from the sensor data than locality preserving projections. Furthermore, all of the gyro-based features are used for accurate classification of the moving data. Compared with other methods, our method uses significantly fewer features, and the overall accuracy is clearly improved.
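The HMM machinery underlying such a recognizer evaluates, for each activity model, the likelihood of the observed feature sequence; the forward algorithm computes this efficiently. A minimal discrete-observation sketch for intuition only (the paper's continuous, three-stage model is considerably richer):

```python
def forward(obs, pi, A, B):
    """Forward algorithm: total probability of an observation sequence
    under a discrete HMM with initial probs pi, transitions A, emissions B."""
    n = len(pi)
    # initialize with the first observation
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    # propagate forward one step per remaining observation
    for t in range(1, len(obs)):
        alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][obs[t]]
                 for i in range(n)]
    return sum(alpha)
```

Recognition then amounts to running each activity's trained model over the feature sequence and picking the model with the highest likelihood.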
Burnt area mapping from ERS-SAR time series using the principal components transformation
NASA Astrophysics Data System (ADS)
Gimeno, Meritxell; San-Miguel Ayanz, Jesus; Barbosa, Paulo M.; Schmuck, Guido
2003-03-01
Each year thousands of hectares of forest burn across Southern Europe. To date, remote sensing assessments of this phenomenon have focused on the use of optical satellite imagery. However, the presence of clouds and smoke prevents the acquisition of this type of data in some areas. It is possible to overcome this problem by using synthetic aperture radar (SAR) data. Principal component analysis (PCA) was performed to quantify differences between pre- and post-fire images and to investigate the separability over a European Remote Sensing (ERS) SAR time series. Moreover, the transformation was carried out to determine the best conditions for acquiring optimal SAR imagery according to meteorological parameters and the procedures to enhance burnt area discrimination for fire damage assessment. A comparative neural network classification was performed in order to map and assess the burnt areas using either a complete ERS time series or just one image before and one image after the fire, as indicated by the PCA. The results suggest that ERS is suitable for highlighting areas of localized change associated with forest fire damage in Mediterranean land cover.
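The core of a PCA-based change analysis is to treat the pre- and post-fire images as variables and project each pixel onto the principal axes; stable backscatter concentrates in the first component and change in the lower-order ones. A minimal NumPy sketch under that assumption (not the study's actual ERS processing chain):

```python
import numpy as np

def pca_change(pre, post):
    """Stack pre-/post-fire images as two variables per pixel and return
    principal-component scores, ordered by decreasing variance. The
    second component typically highlights pre/post change."""
    X = np.column_stack([pre.ravel(), post.ravel()]).astype(float)
    X -= X.mean(axis=0)                    # center each band
    cov = np.cov(X, rowvar=False)          # 2x2 covariance matrix
    vals, vecs = np.linalg.eigh(cov)       # eigh returns ascending order
    order = np.argsort(vals)[::-1]         # reorder to descending variance
    return X @ vecs[:, order]
```

Thresholding the second-component scores would then yield a candidate burnt-area mask.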
Carrión, Alicia; Miralles, Ramón; Lara, Guillermo
2014-09-01
In this paper, we present a novel and completely different approach to the problem of scattering material characterization: measuring the degree of predictability of the time series. Measuring predictability can provide information about the signal strength of the deterministic component of the time series relative to the whole time series acquired. This relationship can provide information about coherent reflections in material grains with respect to the rest of the incoherent noise that typically appears in non-destructive testing using ultrasonics. This is a non-parametric technique commonly used in chaos theory that does not require making any kind of assumptions about attenuation profiles. In highly scattering media (low SNR), it has been shown theoretically that the degree of predictability allows material characterization. The experimental results obtained in this work with 32 cement specimens of 4 different porosities demonstrate the ability of this technique to perform classification. It has also been shown that, in this particular application, the measurement of predictability can be used as an indicator of the percentage of porosity of the test samples with great accuracy. Copyright © 2014 Elsevier B.V. All rights reserved.
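A common way to quantify predictability in the chaos-theory sense is nearest-neighbour prediction in a delay-embedded phase space: if the closest past state predicts the next sample well relative to the series variance, a deterministic component dominates. A toy sketch of that idea (the paper's exact estimator may differ; the score normalization here is an assumption):

```python
def predictability(series, m=3):
    """One-step nearest-neighbour prediction in an m-dimensional delay
    embedding. Returns ~1 for a deterministic series, ~0 for white noise."""
    n = len(series)
    vectors = [tuple(series[i:i + m]) for i in range(n - m)]
    errors = []
    for i, v in enumerate(vectors):
        # nearest other delay vector in phase space (brute force)
        j = min((k for k in range(len(vectors)) if k != i),
                key=lambda k: sum((a - b) ** 2 for a, b in zip(vectors[k], v)))
        # predict this vector's successor with the neighbour's successor
        errors.append((series[j + m] - series[i + m]) ** 2)
    mean_err = sum(errors) / len(errors)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series) / n
    return 1.0 - mean_err / var
```

For a highly scattering medium, the score reflects how much coherent (deterministic) structure survives in the backscattered signal.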
NASA Technical Reports Server (NTRS)
Spruce, Joseph P.; Ryan, Robert E.; Smoot, James C.; Prados, Donald; McKellip, Rodney; Sader, Steven A.; Gasser, Jerry; May, George; Hargrove, William
2007-01-01
A NASA RPC (Rapid Prototyping Capability) experiment was conducted to assess the potential of VIIRS (Visible/Infrared Imager/Radiometer Suite) data for monitoring non-native gypsy moth (Lymantria dispar) defoliation of forests. This experiment compares defoliation detection products computed from simulated VIIRS and from MODIS (Moderate Resolution Imaging Spectroradiometer) time series products as potential inputs to a forest threat EWS (Early Warning System) being developed for the USFS (USDA Forest Service). Gypsy moth causes extensive defoliation of broadleaved forests in the United States and is specifically identified in the Healthy Forest Restoration Act (HFRA) of 2003. The HFRA mandates development of a national forest threat EWS. This system is being built by the USFS and NASA is aiding integration of needed satellite data products into this system, including MODIS products. This RPC experiment enabled the MODIS follow-on, VIIRS, to be evaluated as a data source for EWS forest monitoring products. The experiment included 1) assessment of MODIS-simulated VIIRS NDVI products, and 2) evaluation of gypsy moth defoliation mapping products from MODIS-simulated VIIRS and from MODIS NDVI time series data. This experiment employed MODIS data collected over the approximately 15 million acre mid-Appalachian Highlands during the annual peak defoliation time frame (approximately June 10 through July 27) during 2000-2006. NASA Stennis Application Research Toolbox software was used to produce MODIS-simulated VIIRS data and NASA Stennis Time Series Product Tool software was employed to process MODIS and MODIS-simulated VIIRS time series data scaled to planetary reflectance. MODIS-simulated VIIRS data was assessed through comparison to Hyperion-simulated VIIRS data using data collected during gypsy moth defoliation. Hyperion-simulated MODIS data showed a high correlation with actual MODIS data (NDVI R2 of 0.877 and RMSE of 0.023). 
MODIS-simulated VIIRS data for the same date showed moderately high correlation with Hyperion-simulated VIIRS data (NDVI R2 of 0.62 and RMSE of 0.035), even though the datasets were collected about half an hour apart during changing weather conditions. MODIS products (MOD02, MOD09, and MOD13) and MOD02-simulated VIIRS time series data were used to generate defoliation mapping products based on image classification and image differencing change detection techniques. The accuracy of the final defoliation mapping products was assessed by image interpretation of over 170 randomly sampled locations on Landsat and ASTER data, in conjunction with defoliation map data from the USFS. The MOD02-simulated VIIRS 400-meter NDVI classification produced a similar overall accuracy (87.28 percent with 0.72 Kappa) to the MOD02 250-meter NDVI classification (86.71 percent with 0.71 Kappa). In addition, the VIIRS 400-meter NDVI, MOD02 250-meter NDVI, and MOD02 500-meter NDVI showed good user and producer accuracies for the defoliated forest class (70 percent) and acceptable Kappa values (0.66). MOD02 and MOD02-simulated VIIRS data both showed promise as data sources for regional monitoring of forest disturbance due to insect defoliation.
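The NDVI-differencing change detection described here reduces, per pixel, to the normalized difference of near-infrared and red reflectance, differenced across dates. A minimal sketch (the threshold value is illustrative, not the study's calibrated one):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

def defoliation_mask(ndvi_before, ndvi_after, threshold=0.2):
    """Flag pixels whose NDVI dropped by more than `threshold`
    between the pre- and post-defoliation dates."""
    return [b - a > threshold for b, a in zip(ndvi_before, ndvi_after)]
```

In the actual study the differenced NDVI time series was additionally classified and validated against reference imagery.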
Understanding similarity of groundwater systems with empirical copulas
NASA Astrophysics Data System (ADS)
Haaf, Ezra; Kumar, Rohini; Samaniego, Luis; Barthel, Roland
2016-04-01
Within the classification framework for groundwater systems that aims to identify similarity of hydrogeological systems and to transfer information from a well-observed to an ungauged system (Haaf and Barthel, 2015; Haaf and Barthel, 2016), we propose a copula-based method for describing groundwater-system similarity. Copulas are an emerging method in the hydrological sciences that make it possible to model the dependence structure of two groundwater level time series independently of the effects of their marginal distributions. This study builds on Samaniego et al. (2010), which described an approach for calculating dissimilarity measures from bivariate empirical copula densities of streamflow time series; streamflow is subsequently predicted in ungauged basins by transferring properties from similar catchments. The proposed approach is innovative because copula-based similarity has not yet been applied to groundwater systems. Here we estimate the pairwise dependence structure of 600 wells in Southern Germany using 10 years of weekly groundwater level observations. Based on these empirical copulas, dissimilarity measures are estimated, such as the copula's lower- and upper-corner cumulated probabilities and the copula-based Spearman's rank correlation, as proposed by Samaniego et al. (2010). For the characterization of groundwater systems, copula-based metrics are compared with dissimilarities obtained from precipitation signals corresponding to the presumed area of influence of each groundwater well. This promising approach provides a new tool for advancing similarity-based classification of groundwater system dynamics. Haaf, E., Barthel, R., 2015. Methods for assessing hydrogeological similarity and for classification of groundwater systems on the regional scale, EGU General Assembly 2015, Vienna, Austria. Haaf, E., Barthel, R., 2016.
An approach for classification of hydrogeological systems at the regional scale based on groundwater hydrographs EGU General Assembly 2016, Vienna, Austria. Samaniego, L., Bardossy, A., Kumar, R., 2010. Streamflow prediction in ungauged catchments using copula-based dissimilarity measures. Water Resources Research, 46. DOI:10.1029/2008wr007695
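An empirical copula is built purely from the ranks of the paired observations, which is what removes the influence of the marginal distributions. A minimal sketch of the construction (assuming no ties; the corner probabilities and copula-based Spearman correlation of Samaniego et al. are then functionals of this surface):

```python
def ranks(x):
    """1-based ranks of the values in x (assumes no ties)."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    r = [0] * len(x)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def empirical_copula(u, v, x, y):
    """Empirical copula C_n(u, v): fraction of observation pairs whose
    normalized ranks fall at or below (u, v)."""
    n = len(x)
    rx, ry = ranks(x), ranks(y)
    return sum(1 for i in range(n) if rx[i] / n <= u and ry[i] / n <= v) / n
```

Evaluating C_n near (0, 0) and (1, 1) gives the lower- and upper-corner cumulated probabilities used as dissimilarity ingredients.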
NASA Astrophysics Data System (ADS)
Su, Lihong
In remote sensing communities, support vector machine (SVM) learning has recently received increasing attention. SVM learning usually requires large memory and enormous amounts of computation time on large training sets. According to SVM algorithms, the SVM classification decision function is fully determined by support vectors, which compose a subset of the training sets. In this regard, a solution to optimize SVM learning is to efficiently reduce training sets. In this paper, a data reduction method based on agglomerative hierarchical clustering is proposed to obtain smaller training sets for SVM learning. Using a multiple angle remote sensing dataset of a semi-arid region, the effectiveness of the proposed method is evaluated by classification experiments with a series of reduced training sets. The experiments show that there is no loss of SVM accuracy when the original training set is reduced to 34% using the proposed approach. Maximum likelihood classification (MLC) is also applied on the reduced training sets. The results show that MLC can also maintain the classification accuracy. This implies that the most informative data instances can be retained by this approach.
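The idea of clustering-based training-set reduction is to replace groups of mutually similar training samples with representatives. As a stand-in for full agglomerative hierarchical clustering, the sketch below uses a greedy leader-style pass (an illustrative simplification, not the paper's method):

```python
def reduce_training_set(points, threshold):
    """Keep one representative per group of points that lie within
    `threshold` Euclidean distance of an existing representative."""
    reps = []
    for p in points:
        # keep p only if it is far from every representative so far
        if all(sum((a - b) ** 2 for a, b in zip(p, r)) ** 0.5 > threshold
               for r in reps):
            reps.append(p)
    return reps
```

An SVM trained on `reps` (with the corresponding labels) would then see far fewer, but still informative, instances; in practice one would cluster within each class separately.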
An approach for automatic classification of grouper vocalizations with passive acoustic monitoring.
Ibrahim, Ali K; Chérubin, Laurent M; Zhuang, Hanqi; Schärer Umpierre, Michelle T; Dalgleish, Fraser; Erdol, Nurgun; Ouyang, B; Dalgleish, A
2018-02-01
Groupers, a family of marine fishes, produce distinct vocalizations associated with their reproductive behavior during spawning aggregation. These low-frequency sounds (50-350 Hz) consist of a series of pulses repeated at a variable rate. In this paper, an approach is presented for automatic classification of grouper vocalizations from ambient sounds recorded in situ with fixed hydrophones, based on weighted features and a sparse classifier. Grouper sounds were initially labeled by humans for training and testing various feature extraction and classification methods. In the feature extraction phase, four types of features were extracted from the sounds produced by groupers. Once the sound features were extracted, three types of representative classifiers were applied to categorize the species that produced these sounds. Experimental results showed that the best combination, the weighted mel-frequency cepstral coefficient feature extractor with the sparse classifier, achieved 82.7% identification accuracy. The proposed algorithm has been implemented on an autonomous platform (wave glider) for real-time detection and classification of grouper vocalizations.
Variability of rainfall over Lake Kariba catchment area in the Zambezi river basin, Zimbabwe
NASA Astrophysics Data System (ADS)
Muchuru, Shepherd; Botai, Joel O.; Botai, Christina M.; Landman, Willem A.; Adeola, Abiodun M.
2016-04-01
In this study, average monthly and annual rainfall totals recorded for the period 1970 to 2010 from a network of 13 stations across the Lake Kariba catchment area of the Zambezi river basin were analyzed in order to characterize the spatial-temporal variability of rainfall across the catchment area. In the analysis, the data were subjected to intervention and homogeneity analysis using the Cumulative Summation (CUSUM) technique and to step-change analysis using the rank-sum test. Furthermore, rainfall variability was characterized by trend analysis using the non-parametric Mann-Kendall statistic. Additionally, the rainfall series were decomposed and the spectral characteristics derived using Cross Wavelet Transform (CWT) and Wavelet Coherence (WC) analysis. The advantage of using the wavelet-based parameters is that they vary in time and can therefore be used to quantitatively detect time-scale-dependent correlations and phase shifts between rainfall time series at various localized time-frequency scales. The annual and seasonal rainfall series were homogeneous and demonstrated no apparent significant shifts. According to the inhomogeneity classification, the rainfall series recorded across the Lake Kariba catchment area belonged to category A (useful) and B (doubtful), i.e., there were zero to one and two absolute tests rejecting the null hypothesis (at 5 % significance level), respectively. Lastly, the long-term variability of the rainfall series across the Lake Kariba catchment area exhibited non-significant positive and negative trends with coherent oscillatory modes that are constantly locked in phase in the Morlet wavelet space.
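The Mann-Kendall trend test is built on the S statistic: the number of concordant minus discordant pairs taken in time order, whose sign indicates the trend direction and whose normalized form gives significance. A minimal sketch of S (the full test additionally computes the variance of S and a Z score):

```python
def mann_kendall_s(series):
    """Mann-Kendall S statistic: positive values indicate an upward
    tendency, negative values a downward tendency."""
    s = 0
    n = len(series)
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)  # sign of each pairwise difference
    return s
```

For a series of length n, S ranges from -n(n-1)/2 (strictly decreasing) to +n(n-1)/2 (strictly increasing).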
Hidden Semi-Markov Models and Their Application
NASA Astrophysics Data System (ADS)
Beyreuther, M.; Wassermann, J.
2008-12-01
In the framework of detection and classification of seismic signals there are several different approaches. Our choice for a more robust detection and classification algorithm is to adopt Hidden Markov Models (HMM), a technique showing major success in speech recognition. HMM provide a powerful tool to describe highly variable time series based on a doubly stochastic model and therefore allow for a broader class description than, e.g., template-based pattern matching techniques. Being a fully probabilistic model, HMM directly provide a confidence measure for an estimated classification. Furthermore, and in contrast to classic artificial neural networks or support vector machines, HMM incorporate the time dependence explicitly in the models, thus providing an adequate representation of the seismic signal. Like the majority of detection algorithms, HMM are not based on the time- and amplitude-dependent seismogram itself but on features estimated from the seismogram which characterize the different classes. Features, or in other words characteristic functions, are e.g. the sonogram bands, instantaneous frequency, instantaneous bandwidth or centroid time. In this study we apply continuous Hidden Semi-Markov Models (HSMM), an extension of continuous HMM. The duration probability of an HMM is an exponentially decaying function of time, which is not a realistic representation of the duration of an earthquake. In contrast, HSMM use Gaussians as duration probabilities, which results in a more adequate model. The HSMM detection and classification system is running online as an EARTHWORM module at the Bavarian Earthquake Service. Here the signals to be classified simply differ in epicentral distance. This makes it possible to easily decide whether a classification is correct or wrong and thus allows a better evaluation of the advantages and disadvantages of the proposed algorithm.
The evaluation is based on several months of continuous data, and the results are additionally compared to the previously published discrete HMM, continuous HMM and a classic STA/LTA. The intermediate evaluation results are very promising.
Series and parallel arc-fault circuit interrupter tests.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Fresquez, Armando J.; Gudgel, Bob
2013-07-01
While the 2011 National Electrical Code® (NEC) only requires series arc-fault protection, some arc-fault circuit interrupter (AFCI) manufacturers are designing products to detect and mitigate both series and parallel arc-faults. Sandia National Laboratories (SNL) has extensively investigated the electrical differences of series and parallel arc-faults and has offered possible classification and mitigation solutions. As part of this effort, Sandia National Laboratories has collaborated with MidNite Solar to create and test a 24-string combiner box with an AFCI which detects, differentiates, and de-energizes series and parallel arc-faults. In the case of the MidNite AFCI prototype, series arc-faults are mitigated by opening the PV strings, whereas parallel arc-faults are mitigated by shorting the array. A range of different experimental series and parallel arc-fault tests with the MidNite combiner box were performed at the Distributed Energy Technologies Laboratory (DETL) at SNL in Albuquerque, NM. In all the tests, the prototype de-energized the arc-faults in the time period required by the arc-fault circuit interrupt testing standard, UL 1699B. The experimental tests confirm series and parallel arc-faults can be successfully mitigated with a combiner box-integrated solution.
Two-pass imputation algorithm for missing value estimation in gene expression time series.
Tsiporkova, Elena; Boeva, Veselka
2007-10-01
Gene expression microarray experiments frequently generate datasets with multiple values missing. However, most of the analysis, mining, and classification methods for gene expression data require a complete matrix of gene array values. Therefore, the accurate estimation of missing values in such datasets has been recognized as an important issue, and several imputation algorithms have already been proposed to the biological community. Most of these approaches, however, are not particularly suitable for time series expression profiles. In view of this, we propose a novel imputation algorithm, which is specially suited for the estimation of missing values in gene expression time series data. The algorithm utilizes Dynamic Time Warping (DTW) distance in order to measure the similarity between time expression profiles, and subsequently selects for each gene expression profile with missing values a dedicated set of candidate profiles for estimation. Three different DTW-based imputation (DTWimpute) algorithms have been considered: position-wise, neighborhood-wise, and two-pass imputation. These have initially been prototyped in Perl, and their accuracy has been evaluated on yeast expression time series data using several different parameter settings. The experiments have shown that the two-pass algorithm consistently outperforms, in particular for datasets with a higher level of missing entries, the neighborhood-wise and the position-wise algorithms. The performance of the two-pass DTWimpute algorithm has further been benchmarked against the weighted K-Nearest Neighbors algorithm, which is widely used in the biological community; the former algorithm has appeared superior to the latter one. Motivated by these findings, indicating clearly the added value of the DTW techniques for missing value estimation in time series data, we have built an optimized C++ implementation of the two-pass DTWimpute algorithm. 
The software also provides for a choice between three different initial rough imputation methods.
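DTW aligns two expression profiles by warping the time axis so that similar shapes match even when shifted or stretched; the imputation algorithm uses this distance to select candidate donor profiles for each gene with missing values. A standard dynamic-programming sketch of the distance itself (absolute-difference local cost; the paper's Perl/C++ implementations may differ in details):

```python
def dtw(a, b):
    """Dynamic Time Warping distance between two numeric sequences."""
    INF = float("inf")
    n, m = len(a), len(b)
    # D[i][j] = cost of the best alignment of a[:i] with b[:j]
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]
```

Profiles with the smallest DTW distance to the incomplete profile would then serve as the donor set for estimating its missing entries.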
The Analysis of Surface EMG Signals with the Wavelet-Based Correlation Dimension Method
Zhang, Yanyan; Wang, Jue
2014-01-01
Many attempts have been made to effectively improve a prosthetic system controlled by the classification of surface electromyographic (SEMG) signals. Recently, the development of methodologies to extract the effective features still remains a primary challenge. Previous studies have demonstrated that the SEMG signals have nonlinear characteristics. In this study, by combining the nonlinear time series analysis and the time-frequency domain methods, we proposed the wavelet-based correlation dimension method to extract the effective features of SEMG signals. The SEMG signals were firstly analyzed by the wavelet transform and the correlation dimension was calculated to obtain the features of the SEMG signals. Then, these features were used as the input vectors of a Gustafson-Kessel clustering classifier to discriminate four types of forearm movements. Our results showed that there are four separate clusters corresponding to different forearm movements at the third resolution level and the resulting classification accuracy was 100%, when two channels of SEMG signals were used. This indicates that the proposed approach can provide important insight into the nonlinear characteristics and the time-frequency domain features of SEMG signals and is suitable for classifying different types of forearm movements. By comparing with other existing methods, the proposed method exhibited more robustness and higher classification accuracy. PMID:24868240
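The correlation dimension is usually estimated via the Grassberger-Procaccia correlation sum C(r): the fraction of pairs of phase-space points lying closer than r, with the dimension D2 given by the slope of log C(r) against log r over a scaling range. A minimal brute-force sketch of the correlation sum (applied here to generic embedded points, not the wavelet-decomposed SEMG pipeline of the paper):

```python
def correlation_sum(points, r):
    """Grassberger-Procaccia correlation sum C(r): fraction of distinct
    point pairs whose Euclidean distance is below r."""
    n = len(points)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            d = sum((a - b) ** 2 for a, b in zip(points[i], points[j])) ** 0.5
            count += d < r
    return 2 * count / (n * (n - 1))
```

In the paper's method, the correlation dimension would be computed per wavelet resolution level and the resulting values fed to the Gustafson-Kessel clustering classifier.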
Dominguez Veiga, Jose Juan; O'Reilly, Martin; Whelan, Darragh; Caulfield, Brian; Ward, Tomas E
2017-08-04
Inertial sensors are one of the most commonly used sources of data for human activity recognition (HAR) and exercise detection (ED) tasks. The time series produced by these sensors are generally analyzed through numerical methods. Machine learning techniques such as random forests or support vector machines are popular in this field for classification efforts, but they need to be supported through the isolation of a potentially large number of additionally crafted features derived from the raw data. This feature preprocessing step can involve nontrivial digital signal processing (DSP) techniques. However, in many cases, the researchers interested in this type of activity recognition problem do not possess the necessary technical background for this feature-set development. The study aimed to present a novel application of established machine vision methods to provide interested researchers with an easier entry path into the HAR and ED fields. This can be achieved by removing the need for deep DSP skills through the use of transfer learning, using a pretrained convolutional neural network (CNN) developed for machine vision purposes for the exercise classification task. The new method should simply require researchers to generate plots of the signals that they would like to build classifiers with, store them as images, and then place them in folders according to their training label before retraining the network. We applied a CNN, an established machine vision technique, to the task of ED. Tensorflow, a high-level framework for machine learning, was used to facilitate infrastructure needs. Simple time series plots generated directly from accelerometer and gyroscope signals are used to retrain an openly available neural network (Inception), originally developed for machine vision tasks. Data from 82 healthy volunteers, performing 5 different exercises while wearing a lumbar-worn inertial measurement unit (IMU), were collected.
The ability of the proposed method to automatically classify the exercise being completed was assessed using this dataset. For comparative purposes, classification using the same dataset was also performed using the more conventional approach of feature-extraction and classification using random forest classifiers. With the collected dataset and the proposed method, the different exercises could be recognized with a 95.89% (3827/3991) accuracy, which is competitive with current state-of-the-art techniques in ED. The high level of accuracy attained with the proposed approach indicates that the waveform morphologies in the time-series plots for each of the exercises is sufficiently distinct among the participants to allow the use of machine vision approaches. The use of high-level machine learning frameworks, coupled with the novel use of machine vision techniques instead of complex manually crafted features, may facilitate access to research in the HAR field for individuals without extensive digital signal processing or machine learning backgrounds. ©Jose Juan Dominguez Veiga, Martin O'Reilly, Darragh Whelan, Brian Caulfield, Tomas E Ward. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 04.08.2017.
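The key preprocessing step in this approach is turning each inertial signal into an image a vision CNN can consume; the authors save matplotlib-style plots, but the idea can be sketched with a simple dependency-free rasterizer (a hypothetical stand-in for the plotting pipeline, not the paper's code):

```python
def signal_to_image(signal, height=32):
    """Rasterize a 1-D signal into a binary image grid with one column
    per sample, the lowest value at the bottom row."""
    lo, hi = min(signal), max(signal)
    span = (hi - lo) or 1.0                  # avoid division by zero
    img = [[0] * len(signal) for _ in range(height)]
    for x, v in enumerate(signal):
        y = round((v - lo) / span * (height - 1))
        img[height - 1 - y][x] = 1           # row 0 is the top of the image
    return img
```

Each exercise repetition becomes one such image, filed into a per-label folder, after which the pretrained network is retrained on the image folders exactly as in any machine vision transfer-learning workflow.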
O'Reilly, Martin; Whelan, Darragh; Caulfield, Brian; Ward, Tomas E
2017-01-01
Background Inertial sensors are one of the most commonly used sources of data for human activity recognition (HAR) and exercise detection (ED) tasks. The time series produced by these sensors are generally analyzed through numerical methods. Machine learning techniques such as random forests or support vector machines are popular in this field for classification efforts, but they need to be supported through the isolation of a potentially large number of additionally crafted features derived from the raw data. This feature preprocessing step can involve nontrivial digital signal processing (DSP) techniques. However, in many cases, the researchers interested in this type of activity recognition problems do not possess the necessary technical background for this feature-set development. Objective The study aimed to present a novel application of established machine vision methods to provide interested researchers with an easier entry path into the HAR and ED fields. This can be achieved by removing the need for deep DSP skills through the use of transfer learning. This can be done by using a pretrained convolutional neural network (CNN) developed for machine vision purposes for exercise classification effort. The new method should simply require researchers to generate plots of the signals that they would like to build classifiers with, store them as images, and then place them in folders according to their training label before retraining the network. Methods We applied a CNN, an established machine vision technique, to the task of ED. Tensorflow, a high-level framework for machine learning, was used to facilitate infrastructure needs. Simple time series plots generated directly from accelerometer and gyroscope signals are used to retrain an openly available neural network (Inception), originally developed for machine vision tasks. 
Data from 82 healthy volunteers, performing 5 different exercises while wearing a lumbar-worn inertial measurement unit (IMU), was collected. The ability of the proposed method to automatically classify the exercise being completed was assessed using this dataset. For comparative purposes, classification using the same dataset was also performed using the more conventional approach of feature extraction and classification using random forest classifiers. Results With the collected dataset and the proposed method, the different exercises could be recognized with a 95.89% (3827/3991) accuracy, which is competitive with current state-of-the-art techniques in ED. Conclusions The high level of accuracy attained with the proposed approach indicates that the waveform morphologies in the time-series plots for each of the exercises are sufficiently distinct among the participants to allow the use of machine vision approaches. The use of high-level machine learning frameworks, coupled with the novel use of machine vision techniques instead of complex manually crafted features, may facilitate access to research in the HAR field for individuals without extensive digital signal processing or machine learning backgrounds. PMID:28778851
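The key preprocessing idea above, turning raw inertial signals into images that a pretrained vision network can consume, can be sketched without any deep learning machinery. The NumPy-only function below is a minimal, hypothetical stand-in for the plot-generation stage; the study itself rendered ordinary time-series plots and retrained Inception on them, and the sizes and rendering scheme here are illustrative assumptions.

```python
import numpy as np

def signal_to_image(signal, height=64, width=224):
    """Rasterize a 1-D signal into a binary image, a minimal stand-in for
    the time-series plots fed to a retrained CNN (sizes are assumptions)."""
    sig = np.asarray(signal, dtype=float)
    # Resample to the target width by linear interpolation.
    x_old = np.linspace(0.0, 1.0, sig.size)
    x_new = np.linspace(0.0, 1.0, width)
    resampled = np.interp(x_new, x_old, sig)
    # Scale amplitudes into row indices (row 0 = top of the image).
    lo, hi = resampled.min(), resampled.max()
    span = hi - lo if hi > lo else 1.0
    rows = ((1.0 - (resampled - lo) / span) * (height - 1)).astype(int)
    img = np.zeros((height, width), dtype=np.uint8)
    img[rows, np.arange(width)] = 255  # one lit pixel per column
    return img

img = signal_to_image(np.sin(np.linspace(0, 4 * np.pi, 500)))
```

An array like `img` could then be saved as a PNG and dropped into a per-label folder for retraining, as the abstract describes.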
Muscle segmentation in time series images of Drosophila metamorphosis.
Yadav, Kuleesha; Lin, Feng; Wasser, Martin
2015-01-01
In order to study genes associated with muscular disorders, we characterize the phenotypic changes in Drosophila muscle cells during metamorphosis caused by genetic perturbations. We collect in vivo images of muscle fibers during remodeling of larval to adult muscles. In this paper, we focus on the new image processing pipeline designed to quantify the changes in shape and size of muscles. We propose a new two-step approach to muscle segmentation in time series images. First, we implement a watershed algorithm to divide the image into edge-preserving regions, and then we classify these regions into muscle and non-muscle classes on the basis of shape and intensity. The advantage of our method is twofold: first, better results are obtained because the classification of regions is constrained by the shape of the muscle cell from the previous time point; and second, minimal user intervention results in faster processing time. The segmentation results are used to compare the changes in cell size between controls and reduction of the autophagy-related gene Atg9 during Drosophila metamorphosis.
NASA Astrophysics Data System (ADS)
Haaf, Ezra; Barthel, Roland
2016-04-01
When assessing hydrogeological conditions at the regional scale, the analyst is often confronted with uncertainty of structures, inputs and processes while having to base inference on scarce and patchy data. Haaf and Barthel (2015) proposed a concept for handling this predicament by developing a groundwater systems classification framework, where information is transferred from similar, but well-explored and better-understood systems to poorly described ones. The concept is based on the central hypothesis that similar systems react similarly to the same inputs and vice versa. It is conceptually related to PUB (Prediction in Ungauged Basins), where organization of systems and processes by quantitative methods is intended and used to improve understanding and prediction. Furthermore, using the framework it is expected that regional conceptual and numerical models can be checked or enriched by ensemble-generated data from neighborhood-based estimators. In a first step, groundwater hydrographs from a large dataset in Southern Germany are compared in an effort to identify structural similarity in groundwater dynamics. A number of approaches to group hydrographs, mostly based on a similarity measure, can be found in the literature; these have previously only been used in local-scale studies. They are tested alongside different global feature extraction techniques. The resulting classifications are then compared to a visual "expert assessment"-based classification which serves as a reference. A ranking of the classification methods is carried out and differences shown. Selected groups from the classifications are related to geological descriptors. Here we present the most promising results from a comparison of classifications based on series correlation, different series distances and series features, such as the coefficients of the discrete Fourier transform and the intrinsic mode functions of empirical mode decomposition.
Additionally, we show examples of classes corresponding to geological descriptors. Haaf, E., Barthel, R., 2015. Methods for assessing hydrogeological similarity and for classification of groundwater systems on the regional scale, EGU General Assembly 2015, Vienna, Austria.
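As a sketch of the feature-based comparison described above, the snippet below computes low-order discrete Fourier transform magnitudes as compact hydrograph features, together with a correlation-based series distance. The synthetic hydrographs and the choice of eight coefficients are illustrative assumptions, not the study's actual data or feature set.

```python
import numpy as np

def dft_features(series, n_coeffs=8):
    """Low-order Fourier magnitudes of a standardized series as a
    compact shape descriptor (coefficient count is an assumption)."""
    z = np.asarray(series, dtype=float)
    z = (z - z.mean()) / (z.std() or 1.0)
    spec = np.fft.rfft(z)
    return np.abs(spec[1:n_coeffs + 1])  # skip the zero-frequency term

def correlation_distance(a, b):
    """1 - Pearson correlation, one family of series distances compared."""
    return 1.0 - np.corrcoef(a, b)[0, 1]

t = np.linspace(0, 6 * np.pi, 360)
h1 = np.sin(t)            # smooth seasonal hydrograph
h2 = np.sin(t + 0.1)      # similar dynamics, slight phase shift
h3 = np.sin(7 * t)        # much faster dynamics
f1, f2, f3 = (dft_features(h) for h in (h1, h2, h3))
```

Both views agree here: the phase-shifted series stays close to the original, while the fast-oscillating one is far away under either measure.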
NASA Technical Reports Server (NTRS)
Huckle, H. F. (Principal Investigator)
1980-01-01
The most probable current U.S. taxonomic classification of the soils estimated to dominate world soil map (WSM) units in selected crop producing states of Argentina and Brazil is presented. Representative U.S. soil series for the units are given. The map units occurring in each state are listed with areal extent and the major U.S. land resource areas in which similar soils most probably occur. Soil series sampled in LARS Technical Report 111579, and the major land resource areas in which they occur, are given with corresponding similar WSM units at the taxonomic subgroup level.
Streamflow variability and classification using false nearest neighbor method
NASA Astrophysics Data System (ADS)
Vignesh, R.; Jothiprakash, V.; Sivakumar, B.
2015-12-01
Understanding regional streamflow dynamics and patterns continues to be a challenging problem. The present study introduces the false nearest neighbor (FNN) algorithm, a nonlinear dynamics-based method, to examine the spatial variability of streamflow over a region. The FNN method is a dimensionality-based approach, where the dimension of the time series represents its variability. The method uses phase space reconstruction and nearest neighbor concepts, and identifies false neighbors in the reconstructed phase space. The FNN method is applied to monthly streamflow data monitored over a period of 53 years (1950-2002) in an extensive network of 639 stations in the contiguous United States (US). Since the selection of delay time in phase space reconstruction may influence the FNN outcomes, the analysis is carried out for five different delay time values: monthly, seasonal, and annual separation of data as well as delay time values obtained using the autocorrelation function (ACF) and average mutual information (AMI) methods. The FNN dimensions for the 639 streamflow series are generally identified to range from 4 to 12 (with very few exceptional cases), indicating a wide range of variability in the dynamics of streamflow across the contiguous US. However, the FNN dimensions for a majority of the streamflow series are found to be low (less than or equal to 6), suggesting a low level of complexity in streamflow dynamics at most individual stations and over many sub-regions. The FNN dimension estimates also reveal that streamflow dynamics in the western parts of the US (including far west, northwestern, and southwestern parts) generally exhibit much greater variability than those in the eastern parts of the US (including far east, northeastern, and southeastern parts), although there are also differences among 'pockets' within these regions.
These results are useful for identification of appropriate model complexity at individual stations, patterns across regions and sub-regions, interpolation and extrapolation of data, and catchment classification. An attempt is also made to relate the FNN dimensions with catchment characteristics and streamflow statistical properties.
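The FNN idea, embed the series, find each point's nearest neighbor, and check whether that neighbor flies apart when one more delay coordinate is added, can be sketched as follows. This is a minimal Kennel-style distance-ratio variant; the tolerance `r_tol=10` and the test signal are illustrative assumptions, and the study's exact criterion may differ.

```python
import numpy as np

def fnn_fraction(x, dim, tau=1, r_tol=10.0):
    """Fraction of false nearest neighbors at embedding dimension `dim`."""
    x = np.asarray(x, dtype=float)
    n = x.size - dim * tau          # number of usable delay vectors
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    false = 0
    for i in range(n):
        d = np.linalg.norm(emb - emb[i], axis=1)
        d[i] = np.inf
        j = int(np.argmin(d))       # nearest neighbor in dimension `dim`
        # Does the neighbor separate when one more coordinate is added?
        extra = abs(x[i + dim * tau] - x[j + dim * tau])
        if d[j] > 0 and extra / d[j] > r_tol:
            false += 1
    return false / n

# Deterministic signal: the FNN fraction should drop once the embedding
# dimension is adequate for the underlying dynamics.
t = np.linspace(0, 20 * np.pi, 600)
sig = np.sin(t)
```

For a clean sine, a one-dimensional embedding confuses points on rising and falling branches (false neighbors), while a higher-dimensional embedding unfolds the trajectory and the fraction drops toward zero.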
A graph-based approach to detect spatiotemporal dynamics in satellite image time series
NASA Astrophysics Data System (ADS)
Guttler, Fabio; Ienco, Dino; Nin, Jordi; Teisseire, Maguelonne; Poncelet, Pascal
2017-08-01
Enhancing the frequency of satellite acquisitions represents a key issue for the Earth Observation community nowadays. Repeated observations are crucial for monitoring purposes, particularly when intra-annual processes should be taken into account. Time series of images constitute a valuable source of information in these cases. The goal of this paper is to propose a new methodological framework to automatically detect and extract spatiotemporal information from satellite image time series (SITS). Existing methods dealing with such kinds of data are usually classification-oriented and cannot provide information about evolutions and temporal behaviors. In this paper we propose a graph-based strategy that combines object-based image analysis (OBIA) with data mining techniques. Image objects computed at each individual timestamp are connected across the time series and generate a set of evolution graphs. Each evolution graph is associated with a particular area within the study site and stores information about its temporal evolution. Such information can be explored in depth at the evolution graph scale or used to compare the graphs and supply a general picture at the study site scale. We validated our framework on two study sites located in the South of France and involving different types of natural, semi-natural and agricultural areas. The results obtained from a Landsat SITS support the quality of the methodological approach and illustrate how the framework can be employed to extract and characterize spatiotemporal dynamics.
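The central mechanism, linking image objects across consecutive timestamps into evolution graphs, can be illustrated with a small overlap-based sketch. Objects are represented as sets of pixel indices, and the 0.5 overlap threshold and toy data are assumptions for illustration; the paper's actual OBIA segmentation and linking criteria are richer.

```python
def build_evolution_graph(timestamps, min_overlap=0.5):
    """Directed edges (t, i) -> (t+1, j) between objects in consecutive
    timestamps whose spatial overlap is large enough."""
    edges = []
    for t in range(len(timestamps) - 1):
        for i, obj_a in enumerate(timestamps[t]):
            for j, obj_b in enumerate(timestamps[t + 1]):
                inter = len(obj_a & obj_b)  # shared pixels
                if inter / min(len(obj_a), len(obj_b)) >= min_overlap:
                    edges.append(((t, i), (t + 1, j)))
    return edges

# Three timestamps: object 0 persists and drifts; a second object
# appears at t=1 and persists.
t0 = [{1, 2, 3, 4}]
t1 = [{2, 3, 4, 5}, {10, 11}]
t2 = [{3, 4, 5, 6}, {10, 11, 12}]
graph = build_evolution_graph([t0, t1, t2])
```

Chains of edges in `graph` are exactly the per-area evolution histories the abstract describes; branching and merging would show up as nodes with multiple in- or out-edges.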
Modeling sports highlights using a time-series clustering framework and model interpretation
NASA Astrophysics Data System (ADS)
Radhakrishnan, Regunathan; Otsuka, Isao; Xiong, Ziyou; Divakaran, Ajay
2005-01-01
In our past work on sports highlights extraction, we have shown the utility of detecting audience reaction using an audio classification framework. The audio classes in the framework were chosen based on intuition. In this paper, we present a systematic way of identifying the key audio classes for sports highlights extraction using a time series clustering framework. We treat the low-level audio features as a time series and model the highlight segments as "unusual" events in a background of a "usual" process. The set of audio classes to characterize the sports domain is then identified by analyzing the consistent patterns in each of the clusters output from the time series clustering framework. The distribution of features from the training data so obtained for each of the key audio classes is parameterized by a Minimum Description Length Gaussian Mixture Model (MDL-GMM). We also interpret the meaning of each of the mixture components of the MDL-GMM for the key audio class (the "highlight" class) that is correlated with highlight moments. Our results show that the "highlight" class is a mixture of audience cheering and the commentator's excited speech. Furthermore, we show that the precision-recall performance for highlights extraction based on this "highlight" class is better than that of our previous approach, which uses only audience cheering as the key highlight class.
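The "unusual event in a usual background" view of highlights can be sketched with a deliberately simplified Gaussian background model over one low-level feature: windows whose mean is improbable under the background are flagged. This z-score test is a stand-in for the paper's clustering and MDL-GMM machinery, and all numbers below are synthetic.

```python
import numpy as np

def unusual_segments(features, window=20, threshold=3.0):
    """Flag windows whose mean deviates from the background distribution."""
    f = np.asarray(features, dtype=float)
    mu, sigma = f.mean(), f.std()
    flags = []
    for start in range(0, f.size - window + 1, window):
        seg = f[start:start + window]
        # z-score of the window mean under the global background model
        z = abs(seg.mean() - mu) / (sigma / np.sqrt(window))
        flags.append(z > threshold)
    return flags

rng = np.random.default_rng(0)
energy = rng.normal(0.0, 1.0, 400)   # "usual" background process
energy[200:220] += 4.0               # a burst, e.g. audience cheering
flags = unusual_segments(energy, window=20)
```

The burst window stands out by more than ten standard errors, so it is flagged regardless of the exact noise realization.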
Interpretable Categorization of Heterogeneous Time Series Data
NASA Technical Reports Server (NTRS)
Lee, Ritchie; Kochenderfer, Mykel J.; Mengshoel, Ole J.; Silbermann, Joshua
2017-01-01
We analyze data from simulated aircraft encounters to validate and inform the development of a prototype aircraft collision avoidance system. The high-dimensional and heterogeneous time series dataset is analyzed to discover properties of near mid-air collisions (NMACs) and categorize the NMAC encounters. Domain experts use these properties to better organize and understand NMAC occurrences. Existing solutions either are not capable of handling high-dimensional and heterogeneous time series datasets or do not provide explanations that are interpretable by a domain expert. The latter is critical to the acceptance and deployment of safety-critical systems. To address this gap, we propose grammar-based decision trees along with a learning algorithm. Our approach extends decision trees with a grammar framework for classifying heterogeneous time series data. A context-free grammar is used to derive decision expressions that are interpretable, application-specific, and support heterogeneous data types. In addition to classification, we show how grammar-based decision trees can also be used for categorization, which is a combination of clustering and generating interpretable explanations for each cluster. We apply grammar-based decision trees to a simulated aircraft encounter dataset and evaluate the performance of four variants of our learning algorithm. The best algorithm is used to analyze and categorize near mid-air collisions in the aircraft encounter dataset. We describe each discovered category in detail and discuss its relevance to aircraft collision avoidance.
SAX-VSM: Interpretable Time Series Classification Using SAX and Vector Space Model
2013-01-01
points in the region 800-1900 cm−1. The two top-ranked SAX-VSM subsequences in both datasets correspond to spectrogram intervals of the chlorogenic acid (best subsequence) and caffeine (second to best) regions of the Arabica and Robusta spectra. This result aligns with the original work based on
pySPACE—a signal processing and classification environment in Python
Krell, Mario M.; Straube, Sirko; Seeland, Anett; Wöhrle, Hendrik; Teiwes, Johannes; Metzen, Jan H.; Kirchner, Elsa A.; Kirchner, Frank
2013-01-01
In neuroscience large amounts of data are recorded to provide insights into cerebral information processing and function. The successful extraction of the relevant signals becomes more and more challenging due to increasing complexities in acquisition techniques and the questions addressed. Here, automated signal processing and machine learning tools can help to process the data, e.g., to separate signal and noise. With the presented software pySPACE (http://pyspace.github.io/pyspace), signal processing algorithms can be compared and applied automatically on time series data, either with the aim of finding a suitable preprocessing, or of training supervised algorithms to classify the data. pySPACE was originally built to process multi-sensor windowed time series data, like event-related potentials from the electroencephalogram (EEG). The software provides automated data handling, distributed processing, modular build-up of signal processing chains and tools for visualization and performance evaluation. Included in the software are various algorithms like temporal and spatial filters, feature generation and selection, classification algorithms, and evaluation schemes. Further, interfaces to other signal processing tools are provided and, since pySPACE is a modular framework, it can be extended with new algorithms according to individual needs. In the presented work, the structural hierarchies are described, and it is illustrated how users and developers can interface the software and execute offline and online modes. Configuration of pySPACE is realized with the YAML format, so that programming skills are not mandatory for usage. The concept of pySPACE is to have one comprehensive tool that can be used to perform complete signal processing and classification tasks. It further allows users to define their own algorithms, or to integrate and use existing libraries. PMID:24399965
Mahoney, Christine M; Kelly, Ryan T; Alexander, Liz; Newburn, Matt; Bader, Sydney; Ewing, Robert G; Fahey, Albert J; Atkinson, David A; Beagley, Nathaniel
2016-04-05
Time-of-flight-secondary ion mass spectrometry (TOF-SIMS) and laser ablation-inductively coupled plasma mass spectrometry (LA-ICPMS) were used for characterization and identification of unique signatures from a series of 18 Composition C-4 plastic explosives. The samples were obtained from various commercial and military sources around the country. Positive and negative ion TOF-SIMS data were acquired directly from the C-4 residue on Si surfaces, where the positive ion mass spectra obtained were consistent with the major composition of organic additives, and the negative ion mass spectra were more consistent with explosive content in the C-4 samples. Each series of mass spectra was subjected to partial least squares-discriminant analysis (PLS-DA), a multivariate statistical analysis approach which serves to first find the areas of maximum variance within different classes of C-4 and subsequently to classify unknown samples based on correlations between the unknown data set and the original data set (often referred to as a training data set). This method was able to successfully classify test samples of C-4, though with a limited degree of certainty. The classification accuracy of the method was further improved by integrating the positive and negative ion data using a Bayesian approach. The TOF-SIMS data was combined with a second analytical method, LA-ICPMS, which was used to analyze elemental signatures in the C-4. The integrated data were able to classify test samples with a high degree of certainty. Results indicate that this Bayesian integrated approach constitutes a robust classification method that should be employable even in dirty samples collected in the field.
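The Bayesian integration of the positive-ion and negative-ion classifiers can be sketched as naive-Bayes fusion of class posteriors, assuming conditional independence of the two spectra given the class (whether the paper's integration is exactly this is an assumption). The posterior values below are hypothetical numbers, not measured PLS-DA outputs.

```python
import numpy as np

def bayes_fuse(posteriors_a, posteriors_b, priors):
    """Fuse two classifiers' class posteriors under a conditional
    independence assumption: p(c|a,b) is proportional to
    p(c|a) * p(c|b) / p(c)."""
    pa, pb, pr = (np.asarray(v, dtype=float)
                  for v in (posteriors_a, posteriors_b, priors))
    fused = pa * pb / pr
    return fused / fused.sum()

# Hypothetical evidence from two spectra over four C-4 source classes:
# each view favors class 0 only weakly; fusion sharpens the decision.
priors  = [0.25, 0.25, 0.25, 0.25]
pos_ion = [0.40, 0.30, 0.20, 0.10]
neg_ion = [0.45, 0.15, 0.25, 0.15]
fused = bayes_fuse(pos_ion, neg_ion, priors)
```

Two weak, agreeing views yield a fused posterior for class 0 well above either input, which is the qualitative effect the abstract reports for the TOF-SIMS/LA-ICPMS combination.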
Guo, Hao; Liu, Lei; Chen, Junjie; Xu, Yong; Jie, Xiang
2017-01-01
Functional magnetic resonance imaging (fMRI) is one of the most useful methods to generate functional connectivity networks of the brain. However, conventional network generation methods ignore dynamic changes of functional connectivity between brain regions. Previous studies proposed constructing high-order functional connectivity networks that consider the time-varying characteristics of functional connectivity, and a clustering method was performed to decrease computational cost. However, random selection of the initial clustering centers and the number of clusters negatively affected classification accuracy, and the network lost neurological interpretability. Here we propose a novel method that introduces the minimum spanning tree method to high-order functional connectivity networks. As an unbiased method, the minimum spanning tree simplifies high-order network structure while preserving its core framework. The dynamic characteristics of time series are not lost with this approach, and the neurological interpretation of the network is guaranteed. Simultaneously, we propose a multi-parameter optimization framework that involves extracting discriminative features from the minimum spanning tree high-order functional connectivity networks. Compared with the conventional methods, our resting-state fMRI classification method based on minimum spanning tree high-order functional connectivity networks greatly improved the diagnostic accuracy for Alzheimer's disease. PMID:29249926
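The minimum-spanning-tree simplification step can be sketched with Prim's algorithm on a distance matrix derived from functional connectivity (here, 1 − |correlation|, a common but assumed choice). The 4-node correlation matrix is an illustrative toy; real high-order networks are far larger.

```python
import numpy as np

def prim_mst(dist):
    """Minimum spanning tree of a dense distance matrix via Prim's
    algorithm. Returns a list of (i, j) edges; n nodes give n-1 edges."""
    n = dist.shape[0]
    in_tree = [0]
    edges = []
    while len(in_tree) < n:
        best = None
        for i in in_tree:
            for j in range(n):
                if j not in in_tree and (
                        best is None or dist[i, j] < dist[best[0], best[1]]):
                    best = (i, j)
        edges.append(best)
        in_tree.append(best[1])
    return edges

# Distance = 1 - |correlation|, so strongly coupled regions stay linked.
corr = np.array([[1.0, 0.9, 0.2, 0.1],
                 [0.9, 1.0, 0.3, 0.2],
                 [0.2, 0.3, 1.0, 0.8],
                 [0.1, 0.2, 0.8, 1.0]])
dist = 1.0 - np.abs(corr)
mst = prim_mst(dist)
```

The tree keeps the strongest links (0-1 and 2-3) and one bridge between the two clusters, which is the "core framework preserved, clutter removed" property the abstract relies on.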
Stationarity analysis of historical flood series in France and Spain (14th-20th centuries)
NASA Astrophysics Data System (ADS)
Barriendos, M.; Coeur, D.; Lang, M.; Llasat, M. C.; Naulet, R.; Lemaitre, F.; Barrera, A.
Interdisciplinary frameworks for studying natural hazards and their temporal trends have an important potential in data generation for risk assessment, land use planning, and therefore the sustainable management of resources. This paper focuses on the adjustments required because of the wide variety of scientific fields involved in the reconstruction and characterisation of flood events for the past 1000 years. The aim of this paper is to describe various methodological aspects of the study of flood events in their historical dimension, including the critical evaluation of old documentary and instrumental sources, flood-event classification and hydraulic modelling, and homogeneity and quality control tests. Standardized criteria for flood classification have been defined and applied to the Isère and Drac floods in France, from 1600 to 1950, and to the Ter, the Llobregat and the Segre floods, in Spain, from 1300 to 1980. The analysis on the Drac and Isère data series from 1600 to the present day showed that extraordinary and catastrophic floods were not distributed uniformly in time. However, the largest floods (general catastrophic floods) were homogeneously distributed in time within the period 1600-1900. No major flood occurred during the 20th century in these rivers. From 1300 to the present day, no homogeneous behaviour was observed for extraordinary floods in the Spanish rivers. The largest floods were uniformly distributed in time within the period 1300-1900, for the Segre and Ter rivers.
NASA Astrophysics Data System (ADS)
Madokoro, H.; Yamanashi, A.; Sato, K.
2013-08-01
This paper presents an unsupervised scene classification method for achieving semantic recognition of indoor scenes. Background and foreground features are respectively extracted using Gist and color scale-invariant feature transform (SIFT) as feature representations based on context. We used hue, saturation, and value SIFT (HSV-SIFT) because of its simple algorithm with low calculation costs. Our method creates bags of features by voting visual words from both feature descriptors into a two-dimensional histogram. Moreover, our method generates labels as candidates of categories for time-series images while maintaining both stability and plasticity. Automatic labeling of category maps can be realized using labels created by adaptive resonance theory (ART) as teaching signals for counter propagation networks (CPNs). We evaluated our method for semantic scene classification using KTH's image database for robot localization (KTH-IDOL), which is popularly used for robot localization and navigation. The mean classification accuracies of Gist, gray SIFT, one-class support vector machines (OC-SVM), position-invariant robust features (PIRF), and our method are, respectively, 39.7, 58.0, 56.0, 63.6, and 79.4%. The result of our method is 15.8 percentage points higher than that of PIRF. Moreover, we applied our method for fine classification using our original mobile robot, obtaining a mean classification accuracy of 83.2% for six zones.
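The voting of visual words from the two descriptors into a two-dimensional histogram can be sketched as follows; the word indices and codebook sizes are hypothetical stand-ins for quantized Gist and HSV-SIFT outputs.

```python
import numpy as np

def vote_histogram(gist_words, sift_words, n_gist, n_sift):
    """Vote co-occurring visual words from two descriptors into a
    normalized 2-D histogram."""
    hist = np.zeros((n_gist, n_sift))
    for g, s in zip(gist_words, sift_words):
        hist[g, s] += 1
    return hist / hist.sum()

# Hypothetical quantized word indices observed in one scene.
gist_words = [0, 0, 1, 2, 1, 0]
sift_words = [3, 3, 1, 0, 1, 2]
hist = vote_histogram(gist_words, sift_words, n_gist=3, n_sift=4)
```

Flattening such a histogram gives a fixed-length scene descriptor that jointly encodes background (Gist) and foreground (SIFT) context, in the spirit of the method described above.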
Energy crop mapping with enhanced TM/MODIS time series in the BCAP agricultural lands
NASA Astrophysics Data System (ADS)
Wang, Cuizhen; Fan, Qian; Li, Qingting; SooHoo, William M.; Lu, Linlin
2017-02-01
Since the mid-2000s, agricultural lands in the United States have been undergoing rapid change to meet the increasing bioenergy demand. In 2009 the USDA Biomass Crop Assistance Program (BCAP) was established. In its Project Area 1, landowners are financially supported to grow perennial prairie grasses (switchgrass) in their row-crop lands. To support the program, this study tested the feasibility of biomass crop mapping based on unique timings of crop development. With a previously published data fusion algorithm, the Enhanced Spatial and Temporal Adaptive Reflectance Fusion Model (ESTARFM), a 10-day normalized difference vegetation index (NDVI) time series in 2007 was established by fusing MODIS reflectance into a TM image series. Two critical dates, peak growing (PG) and peak drying (PD), were extracted and a unique "PG-0-PD" timing sequence was defined for each crop. With a knowledge-based decision tree approach, the classification of the enhanced TM/MODIS time series reached an overall accuracy of 76% against the USDA Cropland Data Layer (CDL). In particular, our results showed that winter wheat single cropping and wheat-soybean double cropping were much better classified, which may provide additional information for the CDL product. More importantly, this study extracted the first spatial layer of warm-season prairie grasses, which has not been published in any national land cover products and could serve as a base map for bioenergy land-use decision making in BCAP lands.
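The extraction of the two critical dates can be sketched on a synthetic 10-day NDVI composite: peak growing (PG) as the NDVI maximum and peak drying (PD) as the date of the steepest post-peak drop. These simple definitions and the series below are illustrative assumptions about the paper's actual rules.

```python
import numpy as np

def peak_growing_drying(ndvi, dates):
    """PG = date of maximum NDVI; PD = date of the steepest NDVI drop
    after the maximum (assumed definitions for illustration)."""
    ndvi = np.asarray(ndvi, dtype=float)
    pg = int(np.argmax(ndvi))
    drops = np.diff(ndvi[pg:])           # NDVI change after the peak
    pd_ = pg + int(np.argmin(drops)) + 1
    return dates[pg], dates[pd_]

# A 10-day composite over one season (illustrative values).
dates = list(range(100, 300, 10))        # day-of-year stamps
ndvi  = [0.2, 0.3, 0.45, 0.6, 0.75, 0.8, 0.78, 0.7, 0.45, 0.3,
         0.25, 0.22, 0.2, 0.2, 0.2, 0.2, 0.2, 0.2, 0.2, 0.2]
pg_doy, pd_doy = peak_growing_drying(ndvi, dates)
```

A per-pixel "PG-0-PD" sequence built from such dates is what a knowledge-based decision tree could then branch on.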
Energy-efficiency based classification of the manufacturing workstation
NASA Astrophysics Data System (ADS)
Frumuşanu, G.; Afteni, C.; Badea, N.; Epureanu, A.
2017-08-01
EU Directive 92/75/EC established for the first time an energy consumption labelling scheme, subsequently implemented by several other directives. As a consequence, many products (e.g. home appliances, tyres, light bulbs, houses) now carry an EU Energy Label when offered for sale or rent. Several energy consumption models of manufacturing equipment have also been developed. This paper proposes an energy-efficiency-based classification of the manufacturing workstation, aiming to characterize its energetic behaviour. The concept of energy efficiency of the manufacturing workstation is defined. On this basis, a classification methodology has been developed. It covers specific criteria and their evaluation modalities, together with the definition and delimitation of energy efficiency classes. The position of the energy class is defined by the amount of energy needed by the workstation at the middle point of its operating domain, while its extension is determined by the value of the first coefficient of the Taylor series that approximates the dependence between the energy consumption and the chosen parameter of the working regime. The main domain of interest for this classification appears to be the optimization of manufacturing activity planning and programming. A case study regarding the classification of an actual lathe from the energy efficiency point of view, based on two different approaches (analytical and numerical), is also included.
Development of Subscale Fast Cookoff Test (PREPRINT)
2006-09-21
The hazards classification procedures have been harmonized with both the UN Test and Criteria Manual for UN Series 1... This effort is aimed at the development of a sub-scale alternate test protocol to the external fire test currently required for final hazards classification (HC) of an ordnance system. The specific goal of this part of the task was
Weyhenmeyer, Jonathan; Hernandez, Manuel E; Lainscsek, Claudia; Sejnowski, Terrence J; Poizner, Howard
2014-01-01
Parkinson's disease (PD) is known to lead to marked alterations in cortical-basal ganglia activity that may serve as a biomarker for PD diagnosis. Using non-linear delay differential equations (DDE) for classification of PD patients on and off dopaminergic therapy (PD-on, PD-off, respectively) from healthy age-matched controls (CO), we show that 1 second of quasi-resting state clean and raw electroencephalogram (EEG) data can be used to classify CO from PD-on/off based on the area under the receiver operating characteristic curve (AROC). Raw EEG is shown to classify more robustly (AROC=0.59-0.86) than clean EEG data (AROC=0.57-0.72). Decomposition of the raw data into stereotypical and non-stereotypical artifacts provides evidence that the increased classification performance on raw EEG time series originates from muscle artifacts. Thus, non-linear feature extraction and classification of raw EEG data in a low-dimensional feature space is a potential biomarker for Parkinson's disease.
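AROC values like those quoted above can be computed directly from scores and labels via the rank-sum (Mann-Whitney) identity, without tracing the ROC curve. The scores and labels below are hypothetical, not the study's data.

```python
def aroc(scores, labels):
    """Area under the ROC curve via the rank-sum identity: the fraction
    of positive/negative pairs ranked correctly, counting ties as half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical DDE-feature scores: patients (label 1) tend to score higher.
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
labels = [1,   1,   0,   1,   0,   1,   0,   0]
```

An AROC of 0.5 means chance-level separation and 1.0 means perfect ranking, which is how ranges such as 0.59-0.86 should be read.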
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koch, Mark William; Steinbach, Ryan Matthew; Moya, Mary M
2015-10-01
Synthetic aperture radar (SAR) is a remote sensing technology that can operate day or night in all but the most extreme conditions. A SAR can provide surveillance over a long time period by making multiple passes over a wide area. For object-based intelligence it is convenient to segment and classify the SAR images into objects that identify various terrains and man-made structures that we call “static features.” In this paper we introduce a novel SAR image product that captures how different regions decorrelate at different rates. Using superpixels and their first two moments we develop a series of one-class classification algorithms using a goodness-of-fit metric. P-value fusion is used to combine the results from different classes. We also show how to combine multiple one-class classifiers to get a confidence about a classification. This can be used by downstream algorithms such as a conditional random field to enforce spatial constraints.
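The p-value fusion step can be sketched with Fisher's method, a standard way to combine independent goodness-of-fit p-values (that the paper uses exactly Fisher's method is an assumption). The statistic −2·Σ log pᵢ is chi-square with 2k degrees of freedom, and for even degrees of freedom the chi-square tail has a closed form, so no statistics library is needed.

```python
import math

def fisher_fusion(p_values):
    """Combine k independent p-values with Fisher's method.
    sf of chi-square with df=2k at x: exp(-x/2) * sum_{i<k} (x/2)^i / i!"""
    k = len(p_values)
    stat = -2.0 * sum(math.log(p) for p in p_values)
    half = stat / 2.0
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= half / i            # (half**i) / i! built incrementally
        total += term
    return math.exp(-half) * total

consistent = fisher_fusion([0.01, 0.02, 0.03])   # all tests reject
mixed = fisher_fusion([0.01, 0.5, 0.9])          # conflicting evidence
```

Agreeing small p-values fuse into a much smaller combined p-value, while conflicting evidence does not, which is the behavior a per-superpixel fusion across classes needs.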
NASA Astrophysics Data System (ADS)
Pratiher, Sawon; Patra, Sayantani; Pratiher, Souvik
2017-06-01
A novel analytical methodology for segregating healthy and neurological-disorder gait patterns is proposed by employing a set of oscillating components called intrinsic mode functions (IMFs). These IMFs are generated by empirical mode decomposition of the gait time series, and the Hilbert-transformed analytic signal representation forms the complex-plane trace of the elliptically shaped analytic IMFs. The area measure and the relative change in the centroid position of the polygon formed by the convex hull of these analytic IMFs are taken as the discriminative features. A classification accuracy of 79.31% with an ensemble-learning-based AdaBoost classifier validates the adequacy of the proposed methodology for a computer-aided diagnostic (CAD) system for gait pattern identification. Also, the efficacy of several potential biomarkers, such as the bandwidths of the amplitude-modulation and frequency-modulation IMFs and the mean frequency of the Fourier-Bessel expansion of each analytic IMF, is discussed for its potency in gait pattern identification and classification.
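The area and centroid features of the convex-hull polygon traced by each analytic IMF reduce to the standard shoelace formulas; the sketch below assumes the hull vertices are already available in order (computing the hull itself and the EMD stage are out of scope here).

```python
def polygon_area_centroid(points):
    """Shoelace area (absolute) and centroid of a closed polygon given
    as (x, y) vertices in order."""
    n = len(points)
    a = cx = cy = 0.0
    for i in range(n):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % n]
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return abs(a), (cx / (6.0 * a), cy / (6.0 * a))

# Unit square: area 1, centroid (0.5, 0.5).
area, (cx, cy) = polygon_area_centroid([(0, 0), (1, 0), (1, 1), (0, 1)])
```

Applied to the hull of an analytic IMF's complex-plane trace, `area` gives the area feature and shifts of `(cx, cy)` between conditions give the centroid-change feature described above.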
Reducing uncertainty on satellite image classification through spatiotemporal reasoning
NASA Astrophysics Data System (ADS)
Partsinevelos, Panagiotis; Nikolakaki, Natassa; Psillakis, Periklis; Miliaresis, George; Xanthakis, Michail
2014-05-01
The natural habitat constantly endures both inherent natural and human-induced influences. Remote sensing has been providing monitoring-oriented solutions regarding the natural Earth surface, by offering a series of tools and methodologies which contribute to prudent environmental management. Processing and analysis of multi-temporal satellite images for the observation of land changes often include classification and change-detection techniques. These error-prone procedures are influenced mainly by the distinctive characteristics of the study areas, the limitations of remote sensing systems and the image analysis processes. The present study takes advantage of the temporal continuity of multi-temporal classified images in order to reduce classification uncertainty, based on reasoning rules. More specifically, pixel groups that temporally oscillate between classes are liable to misclassification or indicate problematic areas. On the other hand, constant pixel group growth indicates a pressure-prone area. Computational tools are developed in order to disclose the alterations in land use dynamics and offer a spatial reference for the pressures that land use classes endure and impose on each other. Moreover, by revealing areas that are susceptible to misclassification, we propose specific target site selection for training during the process of supervised classification. The underlying objective is to contribute to the understanding and analysis of anthropogenic and environmental factors that influence land use changes. The developed algorithms have been tested upon Landsat satellite image time series depicting the National Park of Ainos in Kefallinia, Greece, where the globally unique Abies cephalonica grows. Along with the minor changes and pressures indicated in the test area due to harvesting and other human interventions, the developed algorithms successfully captured fire incidents that have been historically confirmed.
Overall, the results have shown that the suggested procedures can contribute to reducing classification uncertainty and support the existing knowledge regarding the pressures among land-use classes.
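The oscillation rule above can be sketched in a few lines: count per-pixel class transitions across a stack of classified images and flag pixels that change class too often. The stack layout, threshold, and function names below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def transition_counts(label_stack):
    """label_stack: (T, H, W) integer class labels over T dates."""
    stack = np.asarray(label_stack)
    # A transition occurs wherever consecutive dates disagree.
    changes = stack[1:] != stack[:-1]
    return changes.sum(axis=0)

def flag_unstable(label_stack, max_transitions=2):
    """Pixels changing class more than `max_transitions` times are suspect."""
    return transition_counts(label_stack) > max_transitions
```

Stable pixels (constant class, or monotone growth of one class) pass through untouched; only the temporally oscillating ones are marked for review or excluded from training-site selection.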
Phenological features for winter rapeseed identification in Ukraine using satellite data
NASA Astrophysics Data System (ADS)
Kravchenko, Oleksiy
2014-05-01
Winter rapeseed is one of the major oilseed crops in Ukraine; it is characterized by high profitability and is often grown in violation of crop rotation requirements, leading to soil degradation. Rapeseed identification using satellite data is therefore a promising direction for operational estimation of crop acreage and rotation control. The crop acreage of rapeseed is about 0.5-3% of the total area of Ukraine, which poses a major problem for identification using satellite data [1]. While winter rapeseed could be classified using biomass features observed during autumn vegetation, these features are quite unstable due to field-to-field differences in planting dates as well as spatial and temporal heterogeneity in soil moisture availability. For this reason, autumn biomass level features can be used only locally (at NUTS-3 level) and are not suitable for large-scale, country-wide crop identification. We propose to use crop parameters at the flowering phenological stage for crop identification and present a method for parameter estimation using time series of moderate resolution data. Rapeseed flowering can be observed as a bell-shaped peak in the red reflectance time series. However, the duration of the flowering period observable by satellite is only about two weeks, which is quite short given inevitable cloud coverage issues. We therefore need daily time series to resolve the flowering peak and are consequently limited to moderate resolution data. We used daily atmospherically corrected MODIS data from the Terra and Aqua satellites within the 90-160 DOY period to compute the features. An empirical BRDF correction is used to minimize angular effects. We used Gaussian Processes Regression (GPR) for temporal interpolation to minimize errors due to residual cloud coverage, atmospheric correction, and mixed-pixel problems. We estimate 12 parameters for each time series.
They are the red and near-infrared (NIR) reflectance and the timing at four stages: before and after flowering, at peak flowering, and at the maximum NIR level. We used a Support Vector Machine for data classification. The most relevant feature for classification is the flowering peak timing, followed by the flowering peak magnitude. The dependency of the peak time on latitude, used as a sole feature, can reject 90% of non-rapeseed pixels, which greatly reduces the imbalance of the classification problem. To assess the accuracy of our approach we performed a stratified area frame sampling survey in the Odessa region (NUTS-2 level) in 2013. The omission error is about 12.6%, while the commission error is higher, at the level of 22%. This is explained by the high viewing angle composition criterion used in our approach to mitigate the high cloud coverage problem. However, the errors are quite stable spatially and can easily be corrected by a regression technique. To do this we performed area estimation for the Odessa region using a regression estimator and obtained good area estimation accuracy, with a 4.6% error (1σ). [1] Gallego, F.J., et al., Efficiency assessment of using satellite data for crop area estimation in Ukraine. Int. J. Appl. Earth Observ. Geoinf. (2014), http://dx.doi.org/10.1016/j.jag.2013.12.013
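As a rough illustration of the two most relevant features (flowering peak timing and magnitude), the sketch below smooths a daily red-reflectance series and reads off the peak. The paper interpolates with GPR and estimates a fuller 12-parameter set; the moving-average smoother, window size, and baseline definition here are stand-in assumptions.

```python
import numpy as np

def flowering_peak(doy, red, window=5):
    """Estimate flowering peak timing (DOY) and magnitude above baseline.

    doy, red: day-of-year values and daily red reflectance (same length).
    """
    doy = np.asarray(doy, float)
    red = np.asarray(red, float)
    kernel = np.ones(window) / window
    smooth = np.convolve(red, kernel, mode="same")   # crude temporal smoothing
    i = int(np.argmax(smooth))                       # bell-shaped flowering peak
    baseline = np.median(smooth)                     # typical pre/post-flowering level
    return doy[i], smooth[i] - baseline              # peak timing, peak magnitude
```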
ADHD classification using bag of words approach on network features
NASA Astrophysics Data System (ADS)
Solmaz, Berkan; Dey, Soumyabrata; Rao, A. Ravishankar; Shah, Mubarak
2012-02-01
Attention Deficit Hyperactivity Disorder (ADHD) is receiving much attention because it is one of the most common brain disorders among children and little is known about its cause. In this study, we propose a novel approach for the automatic classification of ADHD subjects and control subjects using functional Magnetic Resonance Imaging (fMRI) data of resting state brains. For this purpose, we compute the correlation between every possible voxel pair within a subject over the time frame of the experimental protocol. A network of voxels is constructed by representing a high correlation value between any two voxels as an edge. A Bag-of-Words (BoW) approach is used to represent each subject as a histogram of network features, such as the number of degrees per voxel. The classification is done using a Support Vector Machine (SVM). We also investigate the use of raw intensity values in the time series for each voxel; here, every subject is represented as a combined histogram of network and raw intensity features. Experimental results verified that classification accuracy improves when the combined histogram is used. We tested our approach on a highly challenging dataset released by NITRC for the ADHD-200 competition and obtained promising results. The dataset not only has a large size but also includes subjects from different demographic and age groups. To the best of our knowledge, this is the first paper to propose a BoW approach for any functional brain disorder classification, and we believe this approach will be useful in the analysis of many brain-related conditions.
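The network-feature step can be sketched compactly: correlate every pair of voxel time series, keep edges above a threshold, and summarize the subject as a histogram of voxel degrees (the descriptor that would then feed the SVM). The threshold and bin count below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def degree_histogram(voxel_ts, corr_thresh=0.8, n_bins=5):
    """voxel_ts: (n_voxels, n_timepoints) array of fMRI intensities."""
    corr = np.corrcoef(voxel_ts)                 # voxel-by-voxel correlation
    np.fill_diagonal(corr, 0.0)                  # ignore self-correlation
    adjacency = np.abs(corr) > corr_thresh       # edge for each strongly coupled pair
    degrees = adjacency.sum(axis=1)              # connections per voxel
    hist, _ = np.histogram(degrees, bins=n_bins, range=(0, voxel_ts.shape[0]))
    return hist / hist.sum()                     # normalized BoW-style histogram
```

The combined representation described in the abstract would concatenate this histogram with a histogram of raw intensity values before classification.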
NASA Astrophysics Data System (ADS)
Fluet-Chouinard, E.; Lehner, B.; Aires, F.; Prigent, C.; McIntyre, P. B.
2017-12-01
Global surface water maps have improved in spatial and temporal resolution through various remote sensing methods: open water extents with compiled Landsat archives and inundation with topographically downscaled multi-sensor retrievals. These time series capture variations of open water and inundation through time without discriminating between hydrographic features (e.g. lakes, reservoirs, river channels and wetland types), as other databases have done in static representations. Available data sources present the opportunity to generate a comprehensive map and typology of aquatic environments (deepwater and wetlands) that improves on earlier digitized inventories and maps. The challenge of classifying surface waters globally is to distinguish wetland types with meaningful characteristics or proxies (hydrology, water chemistry, soils, vegetation) while accommodating the limitations of remote sensing data. We present a new wetland classification scheme designed for global application and produce a map of aquatic ecosystem types globally using state-of-the-art remote sensing products. Our classification scheme combines open water extent and expands it with downscaled multi-sensor inundation data to capture the maximal vegetated wetland extent. The hierarchical structure of the classification is modified from the Cowardin system (1979) developed for the USA. The first classification level is based on a combination of landscape position and water source (e.g. lacustrine, riverine, palustrine, coastal and artificial), while the second level represents the hydrologic regime (e.g. perennial, seasonal, intermittent and waterlogged). Class-specific descriptors can further detail the wetland types with soils and vegetation cover. Our globally consistent nomenclature and top-down mapping allow direct comparison across biogeographic regions and upscaling of biogeochemical fluxes as well as other landscape-level functions.
Van Berkel, Gary J.; Kertesz, Vilmos
2016-11-15
An “Open Access”-like mass spectrometric platform that fully utilizes the simplicity of the manual open port sampling interface for rapid characterization of unprocessed samples by liquid introduction atmospheric pressure ionization mass spectrometry has been lacking. The in-house developed integrated software with a simple, small and relatively low-cost mass spectrometry system introduced here fills this void. Software was developed to operate the mass spectrometer, to collect and process mass spectrometric data files, to build a database, and to classify samples using such a database. These tasks were accomplished via the vendor-provided software libraries. Sample classification based on spectral comparison utilized the spectral contrast angle method. Using the developed software platform, near real-time sample classification is exemplified with a series of commercially available blue ink rollerball pens and vegetable oils. In the case of the inks, full scan positive and negative ion ESI mass spectra were both used for database generation and sample classification. For the vegetable oils, full scan positive ion mode APCI mass spectra were recorded. The overall accuracy of the employed spectral contrast angle statistical model was 95.3% and 98% for the inks and oils, respectively, using leave-one-out cross-validation. In conclusion, this work illustrates that an open port sampling interface/mass spectrometer combination, with appropriate instrument control and data processing software, is a viable direct liquid extraction sampling and analysis system suitable for the non-expert user and for near real-time sample classification via database matching.
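The spectral contrast angle reduces to the angle between two intensity vectors: identical spectral shapes give an angle of zero, unrelated ones approach 90°. A minimal sketch, assuming the two spectra have already been binned onto a common m/z grid:

```python
import math

def spectral_contrast_angle(spec_a, spec_b):
    """Angle (radians) between two spectra given as intensity vectors
    on a shared m/z grid; 0 means identical shape."""
    dot = sum(a * b for a, b in zip(spec_a, spec_b))
    norm_a = math.sqrt(sum(a * a for a in spec_a))
    norm_b = math.sqrt(sum(b * b for b in spec_b))
    # Clamp guards against floating-point values slightly outside [-1, 1].
    return math.acos(max(-1.0, min(1.0, dot / (norm_a * norm_b))))
```

Database matching then amounts to computing this angle against every library spectrum and assigning the class of the smallest angle.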
NASA Astrophysics Data System (ADS)
Zhang, Caiyun; Smith, Molly; Lv, Jie; Fang, Chaoyang
2017-05-01
Mapping plant communities and documenting their changes is critical to the ongoing Florida Everglades restoration project. In this study, a framework was designed to map dominant vegetation communities and inventory their changes in the Florida Everglades Water Conservation Area 2A (WCA-2A) using time series Landsat images spanning 1996-2016. An object-based change analysis technique was incorporated into the framework. A hybrid pixel/object-based change detection approach was developed to effectively collect training samples for historical images with sparse reference data. An object-based quantification approach was also developed to assess the expansion/reduction of a specific class, such as cattail (an invasive species in the Everglades), from the object-based classifications of two dates of imagery. The study confirmed results in the literature that cattail expanded substantially during 1996-2007, and further revealed that cattail expansion was constrained after 2007. Application of time series Landsat data proved valuable for documenting vegetation changes in the WCA-2A impoundment. The digital techniques developed will benefit global wetland mapping and change analysis in general, and the Florida Everglades WCA-2A in particular.
Bayesian methods for outliers detection in GNSS time series
NASA Astrophysics Data System (ADS)
Qianqian, Zhang; Qingming, Gui
2013-07-01
This article is concerned with the problem of detecting outliers in GNSS time series based on Bayesian statistical theory. Firstly, a new model is proposed to simultaneously detect different types of outliers by introducing a classification variable for each outlier type; the problem of outlier detection is converted into the computation of the corresponding posterior probabilities, and an algorithm for computing the posterior probabilities based on a standard Gibbs sampler is designed. Secondly, we analyze in depth the causes of masking and swamping when detecting patches of additive outliers, and propose an unmasking Bayesian method for detecting additive outlier patches based on an adaptive Gibbs sampler. Thirdly, the correctness of the proposed theories and methods is illustrated with simulated data and then by analyzing real GNSS observations, such as cycle slip detection in carrier phase data. The examples illustrate that the proposed Bayesian methods are capable of detecting not only isolated outliers but also additive outlier patches. Furthermore, they can successfully be used to process cycle slips in phase data, which solves the problem of small cycle slips.
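The classification-variable idea can be shown on a drastically simplified model: each residual is either "regular", drawn from N(0, σ²), or an "additive outlier", drawn from a wider N(0, (kσ)²), and Bayes' rule gives the posterior probability of the outlier label per epoch. This is a stand-alone illustration, not the paper's joint Gibbs sampler; σ, k, and the prior below are assumed values.

```python
import math

def outlier_posteriors(residuals, sigma=1.0, k=5.0, prior=0.05):
    """Posterior probability that each residual carries an additive outlier,
    under a two-component Gaussian model with a fixed outlier prior."""
    def normal_pdf(x, s):
        return math.exp(-0.5 * (x / s) ** 2) / (s * math.sqrt(2 * math.pi))
    posts = []
    for r in residuals:
        p_out = prior * normal_pdf(r, k * sigma)        # outlier hypothesis
        p_reg = (1 - prior) * normal_pdf(r, sigma)      # regular hypothesis
        posts.append(p_out / (p_out + p_reg))           # Bayes' rule per epoch
    return posts
```

The full method samples these indicator variables jointly (standard or adaptive Gibbs), which is what lets it unmask whole patches rather than scoring each epoch independently as here.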
Multiscale Symbolic Phase Transfer Entropy in Financial Time Series Classification
NASA Astrophysics Data System (ADS)
Zhang, Ningning; Lin, Aijing; Shang, Pengjian
We address the challenge of classifying financial time series via a newly proposed multiscale symbolic phase transfer entropy (MSPTE). Using the MSPTE method, we quantify the strength and direction of information flow between financial systems and simultaneously classify financial time series: stock indices from Europe, America and China during the period from 2006 to 2016, and stocks from the banking, aviation and pharmaceutical industries during the period from 2007 to 2016. The MSPTE analysis shows that the value of symbolic phase transfer entropy (SPTE) among stocks decreases with increasing scale factor. The MSPTE method can effectively divide stocks into groups by area and industry, and the MSPTE analysis quantifies the similarity among the stock markets. The SPTE between two stocks from the same area is far less than the SPTE between stocks from different areas. The results also indicate that four stocks from America and Europe have a relatively high degree of similarity and that the stocks of the banking and pharmaceutical industries have higher similarity for CA. It is worth mentioning that the pharmaceutical industry has a weaker industry-specific market mechanism than the banking and aviation industries.
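A hedged sketch of the transfer-entropy core: symbolize two series and estimate, from joint symbol counts, how much the source's present symbol reduces uncertainty about the target's next symbol beyond the target's own past. The paper's SPTE symbolizes phases and works across multiple scales; here series are symbolized simply as up/down moves at a single scale to keep the estimator short.

```python
from collections import Counter
from math import log2

def symbolize(series):
    """Encode a series as binary up/down moves (illustrative symbolization)."""
    return [1 if b > a else 0 for a, b in zip(series, series[1:])]

def transfer_entropy(source, target):
    """Plug-in estimate of TE(source -> target) in bits from symbol counts."""
    x, y = symbolize(target), symbolize(source)
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_next, x_now, y_now)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    singles = Counter(x[:-1])
    n = len(x) - 1
    te = 0.0
    for (xn, xc, yc), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(xc, yc)]          # p(x_next | x_now, y_now)
        p_cond_x = pairs_xx[(xn, xc)] / singles[xc] # p(x_next | x_now)
        te += p_joint * log2(p_cond_xy / p_cond_x)
    return te
```

Asymmetry of the estimate between the two directions is what gives the direction of information flow used for grouping.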
Generalized Feature Extraction for Wrist Pulse Analysis: From 1-D Time Series to 2-D Matrix.
Dimin Wang; Zhang, David; Guangming Lu
2017-07-01
Traditional Chinese pulse diagnosis, an empirical science, depends on subjective experience, and inconsistent diagnostic results may be obtained from different practitioners. A scientific way of studying the pulse is to analyze objectified wrist pulse waveforms. In recent years, many pulse acquisition platforms have been developed with advances in sensor and computer technology, and pulse diagnosis using pattern recognition theories is attracting increasing attention. Although much literature on pulse feature extraction has been published, it treats the pulse signals as simple 1-D time series and ignores the information within the class. This paper presents a generalized method of pulse feature extraction, extending the feature dimension from a 1-D time series to a 2-D matrix. The conventional wrist pulse features correspond to a particular case of the generalized models. The proposed method is validated through pattern classification on actual pulse records. Both quantitative and qualitative results relative to the 1-D pulse features are given through diabetes diagnosis. The experimental results show that the generalized 2-D matrix feature is effective in extracting both the periodic and nonperiodic information and is practical for wrist pulse analysis.
Tsallis statistics and neurodegenerative disorders
NASA Astrophysics Data System (ADS)
Iliopoulos, Aggelos C.; Tsolaki, Magdalini; Aifantis, Elias C.
2016-08-01
In this paper, we perform statistical analysis of time series deriving from four neurodegenerative disorders, namely epilepsy, amyotrophic lateral sclerosis (ALS), Parkinson's disease (PD) and Huntington's disease (HD). The time series comprise electroencephalograms (EEGs) of healthy and epileptic states, as well as gait dynamics (in particular stride intervals) of ALS, PD and HD patients. We study data concerning one subject for each neurodegenerative disorder and one healthy control. The analysis is based on Tsallis non-extensive statistical mechanics and in particular on the estimation of the Tsallis q-triplet, namely {qstat, qsen, qrel}. The deviation of the Tsallis q-triplet from unity indicates non-Gaussian statistics and long-range dependencies for all time series considered. In addition, the results reveal the efficiency of Tsallis statistics in capturing differences in brain dynamics between healthy and epileptic states, as well as differences between ALS, PD and HD patients and healthy control subjects. The results indicate that estimates of the Tsallis q-indices could be used as possible biomarkers, along with others, for improving the classification and prediction of epileptic seizures, as well as for studying the complex gait dynamics of various diseases, providing new insights into severity, medications and fall risk, and improving therapeutic interventions.
Aktaruzzaman, M; Migliorini, M; Tenhunen, M; Himanen, S L; Bianchi, A M; Sassi, R
2015-05-01
The work considers automatic sleep stage classification, based on heart rate variability (HRV) analysis, with a focus on distinguishing wakefulness (WAKE) from sleep and rapid eye movement (REM) from non-REM (NREM) sleep. A set of 20 automatically annotated one-night polysomnographic recordings was considered, and artificial neural networks were selected for classification. For each inter-heartbeat (RR) series, besides features previously presented in the literature, we introduced a set of four parameters related to signal regularity. RR series of three different lengths were considered (corresponding to 2, 6, and 10 successive epochs, 30 s each, in the same sleep stage). Two sets of only four features captured 99% of the data variance in each classification problem, and both of them contained one of the new regularity features proposed. The accuracy of classification for REM versus NREM (68.4%, 2 epochs; 83.8%, 10 epochs) was higher than when distinguishing WAKE versus SLEEP (67.6%, 2 epochs; 71.3%, 10 epochs). The reliability parameter (Cohen's kappa) was also higher (0.68 and 0.45, respectively). Sleep staging classification based on HRV was still less precise than other staging methods employing the larger variety of signals collected during polysomnographic studies. However, cheap and unobtrusive HRV-only sleep classification proved sufficiently precise for a wide range of applications.
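The abstract does not spell out which four regularity parameters were used; one widely used regularity measure that such a feature set could include is sample entropy, which is low for regular RR series and high for irregular ones. The sketch below is therefore a stand-in illustration, with m and r as assumed defaults.

```python
import math

def sample_entropy(series, m=2, r=0.2):
    """SampEn(m, r) of a 1-D series; r is an absolute match tolerance.
    Lower values indicate a more regular (more predictable) signal."""
    n = len(series)
    def count_similar(length):
        # Count template pairs of the given length matching within tolerance r.
        count = 0
        for i in range(n - length + 1):
            for j in range(i + 1, n - length + 1):
                if max(abs(series[i + k] - series[j + k]) for k in range(length)) <= r:
                    count += 1
        return count
    b = count_similar(m)
    a = count_similar(m + 1)
    return float("inf") if a == 0 or b == 0 else -math.log(a / b)
```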
VizieR Online Data Catalog: RR Lyrae in SDSS Stripe 82 (Suveges+, 2012)
NASA Astrophysics Data System (ADS)
Suveges, M.; Sesar, B.; Varadi, M.; Mowlavi, N.; Becker, A. C.; Ivezic, Z.; Beck, M.; Nienartowicz, K.; Rimoldini, L.; Dubath, P.; Bartholdi, P.; Eyer, L.
2013-05-01
We propose a robust principal component analysis framework for the exploitation of multiband photometric measurements in large surveys. Period search results are improved using the time-series of the first principal component due to its optimized signal-to-noise ratio. The presence of correlated excess variations in the multivariate time-series enables the detection of weaker variability. Furthermore, the direction of the largest variance differs for certain types of variable stars. This can be used as an efficient attribute for classification. The application of the method to a subsample of Sloan Digital Sky Survey Stripe 82 data yielded 132 high-amplitude delta Scuti variables. We also found 129 new RR Lyrae variables, complementary to the catalogue of Sesar et al., extending the halo area mapped by Stripe 82 RR Lyrae stars towards the Galactic bulge. The sample also comprises 25 multiperiodic or Blazhko RR Lyrae stars. (8 data files).
Temporal abstraction for the analysis of intensive care information
NASA Astrophysics Data System (ADS)
Hadad, Alejandro J.; Evin, Diego A.; Drozdowicz, Bartolomé; Chiotti, Omar
2007-11-01
This paper proposes a scheme for the analysis of time-stamped series data from multiple monitoring devices of intensive care units, using Temporal Abstraction concepts. The scheme aims to obtain a description of the evolution of the patient state in an unsupervised way. The case study is based on a dataset clinically classified with Pulmonary Edema. For this dataset, a trend-based Temporal Abstraction mechanism is proposed, by means of a Behaviours Base of time-stamped series, and then used in a classification step. Combining this approach with the introduction of expert knowledge using Fuzzy Logic, and with multivariate analysis by means of Self-Organizing Maps, a state characterization model is obtained. This model can be extended to different patient groups and states. The proposed scheme makes it possible to obtain descriptions of the intermediate states through which the patient passes, which could be used to anticipate alert situations.
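Trend-based Temporal Abstraction can be sketched as compressing a monitored series into Increasing/Steady/Decreasing episodes, the kind of symbolic behaviour a Behaviours Base would store. The tolerance and symbol names below are illustrative assumptions.

```python
def trend_abstraction(values, tolerance=0.5):
    """Abstract a time-stamped series into trend episodes (symbol, length)."""
    symbols = []
    for prev, curr in zip(values, values[1:]):
        delta = curr - prev
        if delta > tolerance:
            symbols.append("I")        # increasing
        elif delta < -tolerance:
            symbols.append("D")        # decreasing
        else:
            symbols.append("S")        # steady
    # Merge consecutive identical symbols into episodes.
    episodes = []
    for s in symbols:
        if episodes and episodes[-1][0] == s:
            episodes[-1] = (s, episodes[-1][1] + 1)
        else:
            episodes.append((s, 1))
    return episodes
```

Downstream steps (fuzzy expert rules, Self-Organizing Maps) would then operate on these episode sequences rather than on the raw samples.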
Carrara, Marta; Carozzi, Luca; Moss, Travis J; de Pasquale, Marco; Cerutti, Sergio; Lake, Douglas E; Moorman, J Randall; Ferrario, Manuela
2015-01-01
Identification of atrial fibrillation (AF) is a clinical imperative. Heartbeat interval time series are increasingly available from personal monitors, allowing new opportunities for AF diagnosis. Previously, we devised numerical algorithms for the identification of normal sinus rhythm (NSR), AF, and SR with frequent ectopy using dynamical measures of heart rate. Here, we wished to validate them on the canonical MIT-BIH ECG databases. We tested the algorithms on the NSR, AF and arrhythmia databases. When the databases were combined, the positive predictive value of the new algorithms exceeded 95% for NSR and AF, and was 40% for SR with ectopy. Further, dynamical measures did not distinguish atrial from ventricular ectopy. Inspection of individual 24-hour records showed good correlation between observed and predicted rhythms. Heart rate dynamical measures are effective ingredients in numerical algorithms for classifying cardiac rhythm from the heartbeat interval time series alone. Copyright © 2015 Elsevier Inc. All rights reserved.
Parallel photonic information processing at gigabyte per second data rates using transient states
NASA Astrophysics Data System (ADS)
Brunner, Daniel; Soriano, Miguel C.; Mirasso, Claudio R.; Fischer, Ingo
2013-01-01
The increasing demands on information processing require novel computational concepts and true parallelism. Nevertheless, hardware realizations of unconventional computing approaches have never exceeded a marginal existence. While the application of optics in super-computing receives reawakened interest, new concepts, partly neuro-inspired, are being considered and developed. Here we experimentally demonstrate the potential of a simple photonic architecture to process information at unprecedented data rates, implementing a learning-based approach. A semiconductor laser subject to delayed self-feedback and optical data injection is employed to solve computationally hard tasks. We demonstrate simultaneous spoken digit and speaker recognition and chaotic time-series prediction at data rates beyond 1 GByte/s. We identify all digits with very low classification errors and perform chaotic time-series prediction with 10% error. Our approach bridges the areas of photonic information processing, cognitive science and information science.
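The laser-with-delayed-feedback setup implements the reservoir-computing principle: a fixed nonlinear dynamical system is driven by the input, and only a linear readout is trained. A purely software analogue of that principle, for one-step-ahead time-series prediction, can be sketched as follows; every size and constant here is an illustrative assumption, not a parameter of the photonic hardware.

```python
import numpy as np

rng = np.random.default_rng(42)
n_res, leak, ridge = 100, 0.5, 1e-6
w_in = rng.uniform(-0.5, 0.5, n_res)                     # fixed input weights
w_res = rng.uniform(-0.5, 0.5, (n_res, n_res))           # fixed recurrent weights
w_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(w_res)))  # spectral radius < 1

def run_reservoir(u):
    """Drive the fixed reservoir with input u and collect its states."""
    states, x = [], np.zeros(n_res)
    for ut in u:
        x = (1 - leak) * x + leak * np.tanh(w_in * ut + w_res @ x)
        states.append(x.copy())
    return np.array(states)

# Train only the linear readout (ridge regression) to predict the next sample.
u = np.sin(0.2 * np.arange(400))
states = run_reservoir(u[:-1])
target = u[1:]
washout = 50                                             # discard initial transient
S, y = states[washout:], target[washout:]
w_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ y)
prediction = states @ w_out
```

In the experiment, the role of the random recurrent matrix is played by the transient dynamics of the delayed-feedback laser, and the readout is trained on sampled optical responses.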
Dai, Wei; Fu, Caroline; Khant, Htet A; Ludtke, Steven J; Schmid, Michael F; Chiu, Wah
2014-11-01
Advances in electron cryotomography have provided new opportunities to visualize the internal 3D structures of a bacterium. An electron microscope equipped with Zernike phase-contrast optics produces images with markedly increased contrast compared with images obtained by conventional electron microscopy. Here we describe a protocol to apply Zernike phase plate technology for acquiring electron tomographic tilt series of cyanophage-infected cyanobacterial cells embedded in ice, without staining or chemical fixation. We detail the procedures for aligning and assessing phase plates for data collection, and methods for obtaining 3D structures of cyanophage assembly intermediates in the host by subtomogram alignment, classification and averaging. Acquiring three or four tomographic tilt series takes ∼12 h on a JEM2200FS electron microscope. We expect this time requirement to decrease substantially as the technique matures. The time required for annotation and subtomogram averaging varies widely depending on the project goals and data volume.
Bi-scale analysis of multitemporal land cover fractions for wetland vegetation mapping
NASA Astrophysics Data System (ADS)
Michishita, Ryo; Jiang, Zhiben; Gong, Peng; Xu, Bing
2012-08-01
Land cover fractions (LCFs) derived through spectral mixture analysis are useful for understanding sub-pixel information. However, few studies have analyzed time-series LCFs. Although multi-scale comparisons of spectral index, hard classification, and land surface temperature images have received attention, these approaches have rarely been applied to LCFs. This study compared the LCFs derived through Multiple Endmember Spectral Mixture Analysis (MESMA) using time-series Landsat Thematic Mapper (TM) and Terra Moderate Resolution Imaging Spectroradiometer (MODIS) data acquired in the Poyang Lake area, China between 2004 and 2005. Specifically, we aimed to: (1) propose an approach for optimal endmember (EM) selection in time-series MESMA; (2) understand the trends in time-series LCFs derived from the TM and MODIS data; and (3) examine the trends in the correlation between the bi-scale LCFs derived from the time-series TM and MODIS data. Our results indicated that: (1) the EM spectra chosen according to the proposed hierarchical three-step approach (overall, seasonal, and individual) accurately modeled both the TM and MODIS images; (2) the green vegetation (GV) and NPV/soil/impervious surface (N/S/I) classes followed sine-curve trends in the overall area, while the two water classes displayed the water level change pattern in the areas primarily covered with wetland vegetation; and (3) the GV, N/S/I, and bright water classes showed moderately high agreement between the TM and MODIS LCFs in the whole area (adjusted R2 ⩾ 0.6), whereas low correlations were found for all land cover classes in the areas dominated by wetland vegetation.
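The linear-mixture step underlying MESMA can be sketched as solving for sub-pixel fractions of a small endmember set under a sum-to-one constraint; MESMA additionally searches over candidate endmember combinations per pixel, which is omitted here. The constraint weighting and clipping below are simplifying assumptions.

```python
import numpy as np

def unmix(pixel, endmembers, weight=100.0):
    """pixel: (n_bands,) reflectance; endmembers: (n_classes, n_bands).
    Returns per-class land cover fractions summing (approximately) to one."""
    # Append a heavily weighted sum-to-one row to the least-squares system.
    A = np.vstack([endmembers.T, weight * np.ones(endmembers.shape[0])])
    b = np.concatenate([pixel, [weight]])
    fractions, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.clip(fractions, 0.0, 1.0)     # crude nonnegativity fix
```

Running this per pixel and per date yields the fraction time series (GV, N/S/I, water) whose trends the study compares across the TM and MODIS scales.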
Wang, Li-wen; Wei, Ya-xing; Niu, Zheng
2008-06-01
1 km MODIS NDVI time series data, combined with decision tree classification, supervised classification, and unsupervised classification, were used to classify the land cover of Qinghai Province into 14 classes. In our classification system, sparse grassland and sparse shrub were emphasized, and their spatial distributions were labeled. From a digital elevation model (DEM) of Qinghai Province, five elevation belts were derived, and geographic information system (GIS) software was used to analyze vegetation cover variation across the elevation belts. Our results show that vegetation cover in Qinghai Province improved over the five-year study period: the vegetated area increased from 370,047 km2 in 2001 to 374,576 km2 in 2006, and the vegetation cover rate increased by 0.63%. Among the five elevation belts, the vegetation cover ratio of the high mountain belt is the highest (67.92%). The area of middle-density grassland in the high mountain belt is the largest, at 94,003 km2, and the increase in dense grassland in the high mountain belt is the greatest (1,280 km2). Over the five years, the biggest change was the conversion of sparse grassland to middle-density grassland in the high mountain belt, covering 15,931 km2.
Revealing Real-Time Emotional Responses: a Personalized Assessment based on Heartbeat Dynamics
NASA Astrophysics Data System (ADS)
Valenza, Gaetano; Citi, Luca; Lanatá, Antonio; Scilingo, Enzo Pasquale; Barbieri, Riccardo
2014-05-01
Emotion recognition through computational modeling and analysis of physiological signals has been widely investigated in the last decade. Most of the proposed emotion recognition systems require relatively long time series of multivariate records and do not provide accurate real-time characterizations using short time series. To overcome these limitations, we propose a novel personalized probabilistic framework able to characterize the emotional state of a subject through the analysis of heartbeat dynamics exclusively. The study includes thirty subjects presented with a set of standardized images gathered from the International Affective Picture System, alternating levels of arousal and valence. Due to the intrinsic nonlinearity and nonstationarity of the RR interval series, a specific point-process model was devised for instantaneous identification, considering autoregressive nonlinearities up to the third order according to the Wiener-Volterra representation, thus tracking very fast stimulus-response changes. Features from the instantaneous spectrum and bispectrum, as well as the dominant Lyapunov exponent, were extracted and fed as input features to a support vector machine for classification. Results, estimating emotions every 10 seconds, achieve an overall accuracy of 79.29% in recognizing four emotional states based on the circumplex model of affect, with 79.15% on the valence axis and 83.55% on the arousal axis.
van Bömmel, Alena; Song, Song; Majer, Piotr; Mohr, Peter N C; Heekeren, Hauke R; Härdle, Wolfgang K
2014-07-01
Decision making usually involves uncertainty and risk. Understanding which parts of the human brain are activated during decisions under risk and which neural processes underlie (risky) investment decisions are important goals in neuroeconomics. Here, we analyze functional magnetic resonance imaging (fMRI) data on 17 subjects who were exposed to an investment decision task from Mohr, Biele, Krugel, Li, and Heekeren (in NeuroImage 49, 2556-2563, 2010b). We obtain a time series of three-dimensional images of the blood-oxygen-level dependent (BOLD) fMRI signals. We apply a panel version of the dynamic semiparametric factor model (DSFM) presented in Park, Mammen, Härdle, and Borak (in Journal of the American Statistical Association 104(485), 284-298, 2009) and identify task-related activations in space and dynamics in time. With the panel DSFM (PDSFM) we can capture the dynamic behavior of the specific brain regions common for all subjects and represent the high-dimensional time-series data in easily interpretable low-dimensional dynamic factors without large loss of variability. Further, we classify the risk attitudes of all subjects based on the estimated low-dimensional time series. Our classification analysis successfully confirms the estimated risk attitudes derived directly from subjects' decision behavior.
Working memory supports inference learning just like classification learning.
Craig, Stewart; Lewandowsky, Stephan
2013-08-01
Recent research has found a positive relationship between people's working memory capacity (WMC) and their speed of category learning. To date, only classification-learning tasks have been considered, in which people learn to assign category labels to objects. It is unknown whether learning to make inferences about category features might also be related to WMC. We report data from a study in which 119 participants undertook classification learning and inference learning, and completed a series of WMC tasks. Working memory capacity was positively related to people's classification and inference learning performance.
NASA Astrophysics Data System (ADS)
Champion, N.
2012-08-01
Contrary to aerial images, satellite images are often affected by the presence of clouds. Identifying and removing these clouds is one of the primary steps to perform when processing satellite images, as they may alter subsequent procedures such as atmospheric corrections, DSM production or land cover classification. The main goal of this paper is to present the cloud detection approach developed at the French Mapping Agency. Our approach relies on the availability of multi-temporal satellite images (i.e. time series that generally contain between 5 and 10 images) and on a region-growing procedure. Seeds (corresponding to clouds) are first extracted through a pixel-to-pixel comparison between the images contained in the time series (the presence of a cloud is here assumed to be related to a high variation of reflectance between two images). Clouds are then delineated finely using a dedicated region-growing algorithm. The method, originally designed for panchromatic SPOT5-HRS images, is tested in this paper using time series with 9 multi-temporal satellite images. Our preliminary experiments show the good performance of our method. In the near future, the method will be applied to Pléiades images acquired during the in-flight commissioning phase of the satellite (launched at the end of 2011). In that context, a particular goal of this paper is to show to what extent and in which way our method can be adapted to this kind of imagery.
NASA Astrophysics Data System (ADS)
Chen, Bangqian; Xiao, Xiangming; Li, Xiangping; Pan, Lianghao; Doughty, Russell; Ma, Jun; Dong, Jinwei; Qin, Yuanwei; Zhao, Bin; Wu, Zhixiang; Sun, Rui; Lan, Guoyu; Xie, Guishui; Clinton, Nicholas; Giri, Chandra
2017-09-01
Due to rapid losses of mangrove forests caused by anthropogenic disturbances and climate change, accurate and contemporary maps of mangrove forests are needed to understand how mangrove ecosystems are changing and establish plans for sustainable management. In this study, a new classification algorithm was developed using the biophysical characteristics of mangrove forests in China. More specifically, these forests were mapped by identifying: (1) greenness, canopy coverage, and tidal inundation from time series Landsat data, and (2) elevation, slope, and intersection-with-sea criterion. The annual mean Normalized Difference Vegetation Index (NDVI) was found to be a key variable in determining the classification thresholds of greenness, canopy coverage, and tidal inundation of mangrove forests, which are greatly affected by tide dynamics. In addition, the integration of the Sentinel-1A VH band and modified Normalized Difference Water Index (mNDWI) shows great potential in identifying yearlong tidal and fresh water bodies, which are related to mangrove forests. This algorithm was developed using 6 typical Regions of Interest (ROIs) for algorithm training and was run on the Google Earth Engine (GEE) cloud computing platform to process 1941 Landsat images (25 Path/Row) and 586 Sentinel-1A images circa 2015. The resultant mangrove forest map of China at 30 m spatial resolution has overall/user's/producer's accuracies greater than 95% when validated with ground reference data. In 2015, China's mangrove forests had a total area of 20,303 ha, about 92% of which was in the Guangxi Zhuang Autonomous Region, Guangdong, and Hainan Provinces. This study has demonstrated the potential of using the GEE platform, time series Landsat and Sentinel-1A SAR images to identify and map mangrove forests along the coastal zones. The resultant mangrove forest maps are likely to be useful for the sustainable management and ecological assessments of mangrove forests in China.
Correlation-based pattern recognition for implantable defibrillators.
Wilkins, J.
1996-01-01
An estimated 300,000 Americans die each year from cardiac arrhythmias. Historically, drug therapy or surgery were the only treatment options available for patients suffering from arrhythmias. Recently, implantable arrhythmia management devices have been developed. These devices allow abnormal cardiac rhythms to be sensed and corrected in vivo. Proper arrhythmia classification is critical to selecting the appropriate therapeutic intervention. The classification problem is made more challenging by the power/computation constraints imposed by the short battery life of implantable devices. Current devices utilize heart rate-based classification algorithms. Although easy to implement, rate-based approaches have unacceptably high error rates in distinguishing supraventricular tachycardia (SVT) from ventricular tachycardia (VT). Conventional morphology assessment techniques used in ECG analysis often require too much computation to be practical for implantable devices. In this paper, a computationally efficient arrhythmia classification architecture using correlation-based morphology assessment is presented. The architecture classifies individual heart beats by assessing similarity between an incoming cardiac signal vector and a series of prestored class templates. A series of these beat classifications is used to make an overall rhythm assessment. The system makes use of several new results in the field of pattern recognition. The resulting system achieved excellent accuracy in discriminating SVT and VT. PMID:8947674
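A correlation-based morphology assessment of this kind can be sketched in a few lines; the template labels, the zero-lag normalized correlation, and the 0.9 acceptance threshold below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def classify_beat(beat, templates, threshold=0.9):
    """Assign a beat to the class of the most-correlated template.

    beat      : 1-D array holding one aligned cardiac cycle
    templates : dict mapping class label -> 1-D template array
    threshold : minimum normalized correlation to accept a match
    """
    best_label, best_score = None, -1.0
    for label, tpl in templates.items():
        # Normalized cross-correlation at zero lag: cosine similarity
        # of the mean-removed waveforms.
        b = beat - beat.mean()
        t = tpl - tpl.mean()
        score = np.dot(b, t) / (np.linalg.norm(b) * np.linalg.norm(t))
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= threshold else "unclassified"
```

A full device would then combine a series of such beat labels into an overall rhythm assessment, as the paper describes.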
[GRADE system: classification of quality of evidence and strength of recommendation].
Aguayo-Albasini, José Luis; Flores-Pastor, Benito; Soria-Aledo, Víctor
2014-02-01
The acquisition and classification of scientific evidence, and subsequent formulation of recommendations constitute the basis for the development of clinical practice guidelines. There are several systems for the classification of evidence and strength of recommendations; the most commonly used nowadays is the Grading of Recommendations, Assessment, Development and Evaluation system (GRADE). The GRADE system initially classifies the evidence into high or low, coming from experimental or observational studies; subsequently and following a series of considerations, the evidence is classified into high, moderate, low or very low. The strength of recommendations is based not only on the quality of the evidence, but also on a series of factors such as the risk/benefit balance, values and preferences of the patients and professionals, and the use of resources or costs. Copyright © 2013 AEC. Published by Elsevier Espana. All rights reserved.
[Physiologic and hygienic characteristics of college teachers work].
Ryzhov, A Ia; Komin, S V; Kopkareva, O O
2005-01-01
The first series of studies analyzed lectures, registering the number of words and the movements accompanying them. The second series classified the occupational activity of college teachers, according to the contemporary hygienic classification, as highly intensive work requiring physiological and managerial correction.
Concerning a new classification of tricyanides
NASA Technical Reports Server (NTRS)
Krafft, F.; Vonhansen, A.
1979-01-01
A new classification series of tricyanides is presented. Several tricyanides are synthesized by a simple method from aluminum chloride, benzonitrile, and a respective alkyl or phenyl chloride, purified by recrystallization and distillation, and then analyzed. Structural formulae are suggested, and molecular weights, melting points, and boiling points are determined for each.
Soil Genesis and Development, Lesson 5 - Soil Geography and Classification
USDA-ARS?s Scientific Manuscript database
The system of soil classification developed by the United States Department of Agriculture (USDA) is called Soil Taxonomy. Soil Taxonomy consists of a hierarchy of six levels which, from highest to lowest, are: Order, Suborder, Great Group, Subgroup, family, and series. This lesson will focus on bro...
A vegetation classification system for use in California: its conceptual basis
Timothy E. Paysen; Jeanine A. Derby; C. Eugene Conrad
1982-01-01
A taxonomic Vegetation Classification System proposed for use in California is designed to simplify interdisciplinary communication about vegetation. The system structure is an aggregative plant community hierarchy at four levels of precision--the Association, Series, Subformation, and Formation. A flexible Phase category links specific resource management concerns to...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masci, Frank J.; Grillmair, Carl J.; Cutri, Roc M.
2014-07-01
We describe a methodology to classify periodic variable stars identified using photometric time-series measurements constructed from the Wide-field Infrared Survey Explorer (WISE) full-mission single-exposure Source Databases. This will assist in the future construction of a WISE Variable Source Database that assigns variables to specific science classes as constrained by the WISE observing cadence with statistically meaningful classification probabilities. We have analyzed the WISE light curves of 8273 variable stars identified in previous optical variability surveys (MACHO, GCVS, and ASAS) and show that Fourier decomposition techniques can be extended into the mid-IR to assist with their classification. Combined with other periodic light-curve features, this sample is then used to train a machine-learned classifier based on the random forest (RF) method. Consistent with previous classification studies of variable stars in general, the RF machine-learned classifier is superior to other methods in terms of accuracy, robustness against outliers, and relative immunity to features that carry little or redundant class information. For the three most common classes identified by WISE: Algols, RR Lyrae, and W Ursae Majoris type variables, we obtain classification efficiencies of 80.7%, 82.7%, and 84.5% respectively using cross-validation analyses, with 95% confidence intervals of approximately ±2%. These accuracies are achieved at purity (or reliability) levels of 88.5%, 96.2%, and 87.8% respectively, similar to that achieved in previous automated classification studies of periodic variable stars.
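As a rough sketch of this classification stage, the snippet below trains a random forest on synthetic stand-ins for light-curve features (period, amplitude, and a second-to-first Fourier harmonic amplitude ratio); the class centers and spreads are invented for illustration and are not the WISE feature distributions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

# Synthetic stand-ins for light-curve features: period (days),
# amplitude (mag), and the 2nd-to-1st Fourier harmonic amplitude
# ratio; real features would come from Fourier decomposition of
# WISE light curves.
def make_class(period_mu, amp_mu, r21_mu, n=200):
    return np.column_stack([
        rng.normal(period_mu, 0.1 * period_mu, n),
        rng.normal(amp_mu, 0.1 * amp_mu, n),
        rng.normal(r21_mu, 0.05, n),
    ])

X = np.vstack([make_class(2.5, 0.8, 0.10),   # Algol-like
               make_class(0.5, 0.5, 0.45),   # RR Lyrae-like
               make_class(0.35, 0.3, 0.20)]) # W UMa-like
y = np.repeat([0, 1, 2], 200)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

With well-separated synthetic classes the cross-validated accuracy is high; the 80-85% efficiencies reported in the paper reflect the much harder real feature space.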
Symbolic dynamic filtering and language measure for behavior identification of mobile robots.
Mallapragada, Goutham; Ray, Asok; Jin, Xin
2012-06-01
This paper presents a procedure for behavior identification of mobile robots, which requires limited or no domain knowledge of the underlying process. While the features of robot behavior are extracted by symbolic dynamic filtering of the observed time series, the behavior patterns are classified based on language measure theory. The behavior identification procedure has been experimentally validated on a networked robotic test bed by comparison with commonly used tools, namely, principal component analysis for feature extraction and Bayesian risk analysis for pattern classification.
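Symbolic dynamic filtering in this spirit is commonly implemented by partitioning the signal range into cells, estimating a Markov transition matrix over the resulting symbol sequence, and using its stationary distribution as the pattern vector; the quantile partitioning and four-symbol alphabet below are illustrative choices, not the paper's exact construction.

```python
import numpy as np

def symbolize(x, n_symbols=4):
    """Map a real-valued series to a symbol sequence by partitioning
    its range into equal-probability (quantile) cells."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(x, edges)

def transition_matrix(symbols, n_symbols=4):
    """Maximum-likelihood estimate of the symbol transition matrix."""
    P = np.zeros((n_symbols, n_symbols))
    for a, b in zip(symbols[:-1], symbols[1:]):
        P[a, b] += 1
    rows = P.sum(axis=1, keepdims=True)
    rows[rows == 0] = 1  # avoid division by zero for unvisited symbols
    return P / rows

def sdf_features(x, n_symbols=4):
    """Pattern vector: the stationary distribution of the symbol chain
    (left eigenvector of P for eigenvalue 1)."""
    P = transition_matrix(symbolize(x, n_symbols), n_symbols)
    w, v = np.linalg.eig(P.T)
    p = np.real(v[:, np.argmax(np.real(w))])
    return p / p.sum()
```

Behaviors can then be compared through a distance (or a language measure) between their pattern vectors.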
Wideband Detection and Classification of Practice Limpet Mines against Various Backgrounds
2008-07-01
[Extraction residue: a fragment of the French abstract (high-frequency imaging sonars can map the high-frequency reflectivity of the surface) and figure-caption fragments, including Figure 32 (the cross-correlations of the echo time series with a reference plate echo) and Figure 34 (for the [17 57] kHz compensated pulse).]
1995-12-01
[Extraction residue: acknowledgements from a thesis on predicting stock market behavior; Chapter III describes in detail the methodology developed in the research.]
NASA Astrophysics Data System (ADS)
Fujiki, Shogoro; Okada, Kei-ichi; Nishio, Shogo; Kitayama, Kanehiro
2016-09-01
We developed a new method to estimate stand ages of secondary vegetation in the Bornean montane zone, where local people conduct traditional shifting cultivation and protected areas are surrounded by patches of recovering secondary vegetation of various ages. Identifying stand ages at the landscape level is critical to improve conservation policies. We combined a high-resolution satellite image (WorldView-2) with time-series Landsat images. We extracted stand ages (the time elapsed since the most recent slash and burn) from a change-detection analysis with Landsat time-series images and superimposed the derived stand ages on the segments classified by object-based image analysis using WorldView-2. We regarded stand ages as a response variable, and object-based metrics as independent variables, to develop regression models that explain stand ages. Subsequently, we classified the vegetation of the target area into six age units and one rubber plantation unit (1-3 yr, 3-5 yr, 5-7 yr, 7-30 yr, 30-50 yr, >50 yr and 'rubber plantation') using regression models and linear discriminant analyses. Validation demonstrated an accuracy of 84.3%. Our approach is particularly effective in classifying highly dynamic pioneer vegetation younger than 7 years into 2-yr intervals, suggesting that rapid changes in vegetation canopies can be detected with high accuracy. The combination of a spectral time-series analysis and object-based metrics based on high-resolution imagery enabled the classification of dynamic vegetation under intensive shifting cultivation and yielded an informative land cover map based on stand ages.
Component analysis of somatosensory evoked potentials for identifying spinal cord injury location.
Wang, Yazhou; Li, Guangsheng; Luk, Keith D K; Hu, Yong
2017-05-24
This study aims to determine whether the time-frequency components (TFCs) of somatosensory evoked potentials (SEPs) can be used to identify the specific location of a compressive spinal cord injury using a classification technique. Waveforms of SEPs after compressive injuries at various locations (C4, C5 and C6) in rat spinal cords were decomposed into a series of TFCs using a high-resolution time-frequency analysis method. A classification method based on support vector machine (SVM) was applied to the distributions of these TFCs among different pathological locations. The difference among injury locations manifests itself in different categories of SEP TFCs. High-energy TFCs of normal-state SEPs have significantly higher power and frequency than those of injury-state SEPs. The location of C5 is characterized by a unique distribution pattern of middle-energy TFCs. The difference between C4 and C6 is evidenced by the distribution pattern of low-energy TFCs. The proposed classification method based on SEP TFCs offers a discrimination accuracy of 80.2%. In this study, meaningful information contained in various SEP components was investigated and used to propose a new application of SEPs for identification of the location of pathological changes in the cervical spinal cord.
Zhang, Jianhua; Yin, Zhong; Wang, Rubin
2017-01-01
This paper developed a cognitive task-load (CTL) classification algorithm and allocation strategy to sustain the optimal operator CTL levels over time in safety-critical human-machine integrated systems. An adaptive human-machine system is designed based on a non-linear dynamic CTL classifier, which maps a set of electroencephalogram (EEG) and electrocardiogram (ECG) related features to a few CTL classes. The least-squares support vector machine (LSSVM) is used as dynamic pattern classifier. A series of electrophysiological and performance data acquisition experiments were performed on seven volunteer participants under a simulated process control task environment. The participant-specific dynamic LSSVM model is constructed to classify the instantaneous CTL into five classes at each time instant. The initial feature set, comprising 56 EEG and ECG related features, is reduced to a set of 12 salient features (including 11 EEG-related features) by using the locality preserving projection (LPP) technique. An overall correct classification rate of about 80% is achieved for the 5-class CTL classification problem. Then the predicted CTL is used to adaptively allocate the number of process control tasks between operator and computer-based controller. Simulation results showed that the overall performance of the human-machine system can be improved by using the adaptive automation strategy proposed.
Zhao, Yu Xi; Xie, Ping; Sang, Yan Fang; Wu, Zi Yi
2018-04-01
Hydrological process evaluation is temporally dependent. Hydrological time series including dependence components do not meet the data consistency assumption for hydrological computation. Both factors cause great difficulty for water research. Given the existence of hydrological dependence variability, we proposed a correlation-coefficient-based method for significance evaluation of hydrological dependence based on an auto-regression model. By calculating the correlation coefficient between the original series and its dependence component and selecting reasonable thresholds of the correlation coefficient, the method divides the significance degree of dependence into no variability, weak variability, mid variability, strong variability, and drastic variability. By deducing the relationship between the correlation coefficient and the auto-correlation coefficient at each order of the series, we found that the correlation coefficient is mainly determined by the magnitudes of the auto-correlation coefficients from the first order to the p-th order, which clarifies the theoretical basis of the method. With the first-order and second-order auto-regression models as examples, the reasonability of the deduced formula was verified through Monte-Carlo experiments characterizing the relationship between the correlation coefficient and the auto-correlation coefficient. The method was applied to three observed hydrological time series. The results indicated the coexistence of stochastic and dependence characteristics in hydrological processes.
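The core quantity, the correlation between a series and its fitted dependence component, can be sketched for the AR(1) case as follows; the five-way labels mirror the paper, but the threshold values and the AR(1)-only fit are illustrative assumptions.

```python
import numpy as np

def dependence_significance(x, thresholds=(0.2, 0.4, 0.6, 0.8)):
    """Correlation between a series and its AR(1) dependence component.

    The AR(1) coefficient is estimated from the lag-1 autocorrelation;
    the threshold values are illustrative placeholders, not those of
    the paper.
    """
    x = np.asarray(x, dtype=float)
    xc = x - x.mean()
    # Least-squares AR(1) coefficient from the lag-1 products.
    phi = np.dot(xc[1:], xc[:-1]) / np.dot(xc[:-1], xc[:-1])
    dep = phi * x[:-1]                       # dependence component of x[1:]
    r = np.corrcoef(x[1:], dep)[0, 1]
    labels = ["no", "weak", "mid", "strong", "drastic"]
    return r, labels[np.searchsorted(thresholds, abs(r))]
```

Consistent with the paper's deduction, the resulting coefficient essentially reproduces the magnitude of the lag-1 autocorrelation.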
Van Berkel, Gary J; Kertesz, Vilmos
2017-02-15
An "Open Access"-like mass spectrometric platform to fully utilize the simplicity of the manual open port sampling interface for rapid characterization of unprocessed samples by liquid introduction atmospheric pressure ionization mass spectrometry has been lacking. The in-house developed integrated software with a simple, small and relatively low-cost mass spectrometry system introduced here fills this void. Software was developed to operate the mass spectrometer, to collect and process mass spectrometric data files, to build a database and to classify samples using such a database. These tasks were accomplished via the vendor-provided software libraries. Sample classification based on spectral comparison utilized the spectral contrast angle method. Using the developed software platform near real-time sample classification is exemplified using a series of commercially available blue ink rollerball pens and vegetable oils. In the case of the inks, full scan positive and negative ion ESI mass spectra were both used for database generation and sample classification. For the vegetable oils, full scan positive ion mode APCI mass spectra were recorded. The overall accuracy of the employed spectral contrast angle statistical model was 95.3% and 98% in case of the inks and oils, respectively, using leave-one-out cross-validation. This work illustrates that an open port sampling interface/mass spectrometer combination, with appropriate instrument control and data processing software, is a viable direct liquid extraction sampling and analysis system suitable for the non-expert user and near real-time sample classification via database matching. Published in 2016. This article is a U.S. Government work and is in the public domain in the USA.
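The spectral contrast angle used here for database matching is simply the angle between two spectra viewed as intensity vectors on a common m/z grid; a minimal sketch (the four-bin spectra used for testing are made up) is:

```python
import numpy as np

def spectral_contrast_angle(a, b):
    """Spectral contrast angle, in degrees, between two spectra given
    as intensity vectors on a common m/z grid; 0 deg means identical
    shape, 90 deg means no shared signal."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

def classify_spectrum(unknown, database):
    """Return the database label with the smallest contrast angle."""
    return min(database,
               key=lambda k: spectral_contrast_angle(unknown, database[k]))
```

Matching an unknown spectrum then reduces to finding the database entry with the smallest angle, which is how the near real-time classification works.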
Heart rate variability (HRV): an indicator of stress
NASA Astrophysics Data System (ADS)
Kaur, Balvinder; Durek, Joseph J.; O'Kane, Barbara L.; Tran, Nhien; Moses, Sophia; Luthra, Megha; Ikonomidou, Vasiliki N.
2014-05-01
Heart rate variability (HRV) can be an important indicator of several conditions that affect the autonomic nervous system, including traumatic brain injury, post-traumatic stress disorder and peripheral neuropathy [3], [4], [10] & [11]. Recent work has shown that some of the HRV features can potentially be used for distinguishing a subject's normal mental state from a stressed one [4], [13] & [14]. In all of these past works, although processing is done in both frequency and time domains, few classification algorithms have been explored for classifying normal from stressed RR-intervals. In this paper we used 30 s intervals from the Electrocardiogram (ECG) time series collected during normal and stressed conditions, produced by means of a modified version of the Trier social stress test, to compute HRV-driven features and subsequently applied a set of classification algorithms to distinguish stressed from normal conditions. To classify RR-intervals, we explored classification algorithms that are commonly used for medical applications, namely 1) logistic regression (LR) [16] and 2) linear discriminant analysis (LDA) [6]. Classification performance for various levels of stress over the entire test was quantified using precision, accuracy, sensitivity and specificity measures. Results from both classifiers were then compared to find an optimal classifier and HRV features for stress detection. This work, performed under an IRB-approved protocol, not only provides a method for developing models and classifiers based on human data, but also provides a foundation for a stress indicator tool based on HRV. Further, these classification tools will not only benefit many civilian applications for detecting stress, but also security and military applications for screening such as: border patrol, stress detection for deception [3],[17], and wounded-warrior triage [12].
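A minimal sketch of the comparison between the two classifiers, using synthetic stand-ins for 30 s HRV features (SDNN and RMSSD, in ms, with a lowered-HRV "stressed" cluster), is shown below; the feature choice and cluster parameters are illustrative assumptions, not the study's data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)

# Two synthetic HRV features per 30 s interval: SDNN and RMSSD (ms).
# Stress typically lowers HRV, so the stressed cluster is shifted down;
# the numbers are invented for illustration.
normal = np.column_stack([rng.normal(55, 10, 150), rng.normal(45, 10, 150)])
stressed = np.column_stack([rng.normal(35, 10, 150), rng.normal(25, 10, 150)])
X = np.vstack([normal, stressed])
y = np.repeat([0, 1], 150)  # 0 = normal, 1 = stressed

results = {}
for name, clf in [("LR", LogisticRegression()),
                  ("LDA", LinearDiscriminantAnalysis())]:
    results[name] = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name} cross-validated accuracy: {results[name]:.2f}")
```

On Gaussian clusters like these, LR and LDA give very similar decision boundaries; the study compares them on real features with precision, sensitivity and specificity as well.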
LSST Astroinformatics And Astrostatistics: Data-oriented Astronomical Research
NASA Astrophysics Data System (ADS)
Borne, Kirk D.; Stassun, K.; Brunner, R. J.; Djorgovski, S. G.; Graham, M.; Hakkila, J.; Mahabal, A.; Paegert, M.; Pesenson, M.; Ptak, A.; Scargle, J.; Informatics, LSST; Statistics Team
2011-01-01
The LSST Informatics and Statistics Science Collaboration (ISSC) focuses on research and scientific discovery challenges posed by the very large and complex data collection that LSST will generate. Application areas include astroinformatics, machine learning, data mining, astrostatistics, visualization, scientific data semantics, time series analysis, and advanced signal processing. Research problems to be addressed with these methodologies include transient event characterization and classification, rare class discovery, correlation mining, outlier/anomaly/surprise detection, improved estimators (e.g., for photometric redshift or early onset supernova classification), exploration of highly dimensional (multivariate) data catalogs, and more. We present sample science results from these data-oriented approaches to large-data astronomical research, including results from LSST ISSC team members: the EB (Eclipsing Binary) Factory, the environmental variations in the fundamental plane of elliptical galaxies, and outlier detection in multivariate catalogs.
SNSEDextend: SuperNova Spectral Energy Distributions extrapolation toolkit
NASA Astrophysics Data System (ADS)
Pierel, Justin D. R.; Rodney, Steven A.; Avelino, Arturo; Bianco, Federica; Foley, Ryan J.; Friedman, Andrew; Hicken, Malcolm; Hounsell, Rebekah; Jha, Saurabh W.; Kessler, Richard; Kirshner, Robert; Mandel, Kaisey; Narayan, Gautham; Filippenko, Alexei V.; Scolnic, Daniel; Strolger, Louis-Gregory
2018-05-01
SNSEDextend extrapolates core-collapse and Type Ia Spectral Energy Distributions (SEDs) into the UV and IR for use in simulations and photometric classifications. The user provides a library of existing SED templates (such as those in the authors' SN SED Repository) along with new photometric constraints in the UV and/or NIR wavelength ranges. The software then extends the existing template SEDs so their colors match the input data at all phases. SNSEDextend can also extend the SALT2 spectral time-series model for Type Ia SN for a "first-order" extrapolation of the SALT2 model components, suitable for use in survey simulations and photometric classification tools; as the code does not do a rigorous re-training of the SALT2 model, the results should not be relied on for precision applications such as light curve fitting for cosmology.
NASA Astrophysics Data System (ADS)
Bontemps, S.; Defourny, P.; Van Bogaert, E.; Weber, J. L.; Arino, O.
2010-12-01
Regular and global land cover mapping contributes to evaluating the impact of human activities on the environment. Jointly supported by the European Space Agency and the European Environment Agency, the GlobCorine project builds on the GlobCover findings and aims at making full use of the MERIS time series for frequent land cover monitoring. The GlobCover automated classification approach has been tuned to the pan-European continent and adjusted towards a classification compatible with the Corine typology. The GlobCorine 2005 land cover map has been achieved, validated and made available to a broad-level stakeholder community from the ESA website. A first version of the GlobCorine 2009 map has also been produced, demonstrating the possibility of an operational production of frequent and updated global land cover maps.
Replacement Condition Detection of Railway Point Machines Using an Electric Current Sensor.
Sa, Jaewon; Choi, Younchang; Chung, Yongwha; Kim, Hee-Young; Park, Daihee; Yoon, Sukhan
2017-01-29
Detecting replacement conditions of railway point machines is important to simultaneously satisfy the budget-limit and train-safety requirements. In this study, we consider classification of the subtle differences in the aging effect, using electric current shape analysis, for the purpose of replacement condition detection of railway point machines. After analyzing the shapes of after-replacement data and then labeling the shapes of each before-replacement data, we can derive the criteria that can handle the subtle differences between "does-not-need-to-be-replaced" and "needs-to-be-replaced" shapes. On the basis of the experimental results with in-field replacement data, we confirmed that the proposed method could detect the replacement conditions with acceptable accuracy, as well as provide visual interpretability of the criteria used for the time-series classification.
NASA Astrophysics Data System (ADS)
Perrimond, B.; Bigot, S.; Quénol, H.; Spielgelberger, T.; Baudry, J.
2012-04-01
Climate and vegetation are linked all over the world. In this study, we work on a seasonal weather classification based on air temperature and precipitation to deduce a link with different phenological stage (greening up, senescence, ...) over a 12 year period (1998-2009) for two different domains in France (Alps and Brittany). In temperate land, the main climatic variable with a potential effect on vegetation is the mean temperature followed by the rainfall deficit. A better understanding in season and their climatic characteristic is need to establish link between climate and phenology; so a weather classification is proposed based on empirical orthogonal functions and ascending hierarchical classification on atmospheric variables. This classification allows us to exhibit the inter-annual and intra-seasonal climatic spatiotemporal variability for both experimental site. Relationships between climate and phenology consist in a comparison between advance and delay in phenological stage and weather type issue from the classification. Experiment field are two french Long Term Ecological Research (LTER). The first one (LTER 'Alps' ) have mountain characteristics about 1000 to 4780 m ASL, ~65% of forest occupation ; the second one (LTER Armorique) is an Atlantic coastal landscape, 0-360 m ASL, ~70% of agricultural field. Climatic data are SAFRAN-France reanalysis which are developed to run SVAT model and come from the French meteorological service 'Météo-France'. All atmospheric variable needed to run a hydrological model are available (air temperature, rainfall/snowfall, wind speed, relative humidity, incoming/outcoming radiation) at a 8-8 km2 space resolution and with a daily time resolution. The phenological data are extracted from SPOT-VGT product 1-1 km2 space resolution and 10 days time resolution) by time series analysis process. 
Such studies are particularly important for understanding the relationships between environmental and ecological variables, and will allow better prediction of ecological responses under climate change.
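The EOF step of such a weather classification can be sketched with a plain SVD of the anomaly field. This is a generic illustration, not the study's code; the function name, mode count and anomaly convention are assumptions:

```python
import numpy as np

def eof_components(field, n_modes=3):
    """Empirical orthogonal functions of a (time, space) field via SVD:
    spatial patterns and their time series (principal components), the
    first step before a hierarchical classification of weather types."""
    anomalies = field - field.mean(axis=0)          # remove the time mean
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    pcs = u[:, :n_modes] * s[:n_modes]              # time series of each mode
    patterns = vt[:n_modes]                         # spatial EOF patterns
    explained = s[:n_modes] ** 2 / (s ** 2).sum()   # variance fractions
    return pcs, patterns, explained
```

The principal-component time series would then feed the ascending hierarchical classification that groups days into weather types.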
NASA Astrophysics Data System (ADS)
Kappler, Karl N.; Schneider, Daniel D.; MacLean, Laura S.; Bleier, Thomas E.
2017-08-01
A method is described for identifying pulsations in time series of magnetic field data that are simultaneously present in multiple channels at one or more sensor locations. Candidate pulsations of interest are first identified in geomagnetic time series by inspection. Time series of these "training events" are represented in matrix form and transpose-multiplied to generate time-domain covariance matrices. The ranked eigenvectors of this matrix are stored as a feature of the pulsation. In the second stage of the algorithm, a sliding window (approximately the width of the training event) is moved across the vector-valued time series comprising the channels on which the training event was observed. At each window position, the data covariance matrix and associated eigenvectors are calculated. We compare the orientation of the dominant eigenvectors of the training data to those from the windowed data and flag windows where the dominant eigenvector directions are similar. This approach successfully and automatically identifies pulses that share polarization and appear to originate from the same source process. We apply the method to a case study of continuously sampled (50 Hz) data from six observatories, each equipped with three-component induction coil magnetometers. We examine a 90-day interval of data associated with a cluster of four observatories located within 50 km of Napa, California, together with two remote reference stations, one 100 km north of the cluster and the other 350 km south. When the training data contain signals present at the remote reference observatories, we reliably identify and extract global geomagnetic signals such as solar-generated noise. When the training data contain pulsations observed only in the cluster of local observatories, we identify several types of non-plane-wave signals having similar polarization.
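The two-stage procedure above (covariance eigenvectors of a training event, then a sliding-window comparison of dominant eigenvector directions) can be sketched as follows. This is a minimal illustration, not the authors' code; the similarity threshold and window stepping are assumptions:

```python
import numpy as np

def dominant_eigvec(window):
    """Dominant eigenvector of the time-domain covariance of a
    (n_samples, n_channels) data window."""
    c = window.T @ window            # time-domain covariance (channels x channels)
    w, v = np.linalg.eigh(c)         # symmetric matrix; eigenvalues ascending
    return v[:, -1]                  # eigenvector of the largest eigenvalue

def flag_similar_windows(data, template, threshold=0.95):
    """Slide a template-length window over the multichannel series and
    flag start positions whose dominant eigenvector aligns with the
    training event's (i.e. similar polarization)."""
    t_vec = dominant_eigvec(template)
    n = len(template)
    flags = []
    for start in range(len(data) - n + 1):
        w_vec = dominant_eigvec(data[start:start + n])
        # |cos angle| between unit eigenvectors; eigenvector sign is arbitrary
        if abs(float(t_vec @ w_vec)) >= threshold:
            flags.append(start)
    return flags
```

A pulse with the same polarization as the training event is flagged, while an equally strong pulse with orthogonal polarization is not.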
A system of vegetation classification applied to Hawaii
Michael G. Buck; Timothy E. Paysen
1984-01-01
A classification system for use in describing vegetation has been developed for Hawaii. Physiognomic and taxonomic criteria are used for a hierarchical stratification of vegetation in which the system categories are Formation, Subformation, Series, Association, and Phase. The System applies to local resource management activities and serves as a framework for resource...
Forest habitat types of central Idaho
Robert Steele; Robert D. Pfister; Russell A. Ryker; Jay A. Kittams
1981-01-01
A land-classification system based upon potential natural vegetation is presented for the forests of central Idaho. It is based on reconnaissance sampling of about 800 stands. A hierarchical taxonomic classification of forest sites was developed using the habitat type concept. A total of eight climax series, 64 habitat types, and 55 additional phases of habitat types...
Grassland and shrubland habitat types of western Montana
W. F. Mueggler; W. L. Stewart
1978-01-01
A classification system based upon potential natural vegetation is presented for the grasslands and shrublands of the mountainous western third of Montana. The classification was developed by analyzing data from 580 stands. Twenty-nine habitat types in 13 climax series are defined and a diagnostic key provided for field identification. Environment, vegetative...
Gender classification under extended operating conditions
NASA Astrophysics Data System (ADS)
Rude, Howard N.; Rizki, Mateen
2014-06-01
Gender classification is a critical component of a robust image security system. Many techniques exist to perform gender classification using facial features. In contrast, this paper explores gender classification using body features extracted from clothed subjects. Several of the most effective types of features for gender classification identified in literature were implemented and applied to the newly developed Seasonal Weather And Gender (SWAG) dataset. SWAG contains video clips of approximately 2000 samples of human subjects captured over a period of several months. The subjects are wearing casual business attire and outer garments appropriate for the specific weather conditions observed in the Midwest. The results from a series of experiments are presented that compare the classification accuracy of systems that incorporate various types and combinations of features applied to multiple looks at subjects at different image resolutions to determine a baseline performance for gender classification.
3D hand motion trajectory prediction from EEG mu and beta bandpower.
Korik, A; Sosnik, R; Siddique, N; Coyle, D
2016-01-01
A motion trajectory prediction (MTP) - based brain-computer interface (BCI) aims to reconstruct the three-dimensional (3D) trajectory of upper limb movement using electroencephalography (EEG). The most common MTP BCI employs a time series of bandpass-filtered EEG potentials (referred to here as the potential time-series, PTS, model) for reconstructing the trajectory of a 3D limb movement using multiple linear regression. These studies report the best accuracy when a 0.5-2 Hz bandpass filter is applied to the EEG. In the present study, we show that the spatiotemporal power distributions of the theta (4-8 Hz), mu (8-12 Hz), and beta (12-28 Hz) bands are more robust for movement trajectory decoding when the standard PTS approach is replaced with time-varying bandpower values of a specified EEG band, i.e., with a bandpower time-series (BTS) model. A comprehensive analysis of three subjects performing pointing movements with the dominant right arm toward six targets is presented. Our results show that the BTS model produces significantly higher MTP accuracy (R~0.45) than the standard PTS model (R~0.2). In the case of the BTS model, the highest accuracy was achieved across the three subjects typically in the mu (8-12 Hz) and low-beta (12-18 Hz) bands. Additionally, we highlight a limitation of the commonly used PTS model and illustrate how this model may be suboptimal for decoding motion trajectory relevant information. Although our results, showing that the mu and beta bands are prominent for MTP, are not in line with other MTP studies, they are consistent with the extensive literature on classical multiclass sensorimotor rhythm-based BCI studies (classification of limbs as opposed to motion trajectory prediction), which report the best accuracy of imagined limb movement classification using power values of mu and beta frequency bands. The methods proposed here provide a positive step toward noninvasive decoding of imagined 3D hand movements for movement-free BCIs.
© 2016 Elsevier B.V. All rights reserved.
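The BTS model described above (time-varying bandpower features feeding a multiple linear regression onto 3D position) can be sketched as follows. The FFT-based bandpower, window length, and all names are illustrative assumptions, not the study's exact pipeline:

```python
import numpy as np

def bandpower_series(eeg, fs, band, win):
    """Time-varying power of `eeg` (n_samples, n_channels) in a frequency
    band, from the FFT of consecutive windows - a simple stand-in for the
    bandpower time-series (BTS) representation."""
    lo, hi = band
    freqs = np.fft.rfftfreq(win, 1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    out = []
    for start in range(0, eeg.shape[0] - win + 1, win):
        spec = np.abs(np.fft.rfft(eeg[start:start + win], axis=0)) ** 2
        out.append(spec[mask].sum(axis=0))
    return np.asarray(out)            # (n_windows, n_channels)

def fit_trajectory(bandpower, trajectory):
    """Multiple linear regression from bandpower features to limb position,
    solved by least squares; returns the weight matrix (with intercept)."""
    X = np.hstack([np.ones((len(bandpower), 1)), bandpower])
    W, *_ = np.linalg.lstsq(X, trajectory, rcond=None)
    return W
```

For a 10 Hz oscillation, the mu-band (8-12 Hz) power dominates the beta-band power, which is the kind of band-specific information the BTS model exploits.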
Clifford support vector machines for classification, regression, and recurrence.
Bayro-Corrochano, Eduardo Jose; Arana-Daniel, Nancy
2010-11-01
This paper introduces the Clifford support vector machines (CSVM) as a generalization of the real and complex-valued support vector machines using the Clifford geometric algebra. In this framework, we handle the design of kernels involving the Clifford or geometric product. In this approach, one redefines the optimization variables as multivectors. This allows us to have a multivector as output. Therefore, we can represent multiple classes according to the dimension of the geometric algebra in which we work. We show that one can apply CSVM for classification and regression and also to build a recurrent CSVM. The CSVM is an attractive approach for the multiple input multiple output processing of high-dimensional geometric entities. We carried out comparisons between CSVM and the current approaches to solve multiclass classification and regression. We also study the performance of the recurrent CSVM with experiments involving time series. The authors believe that this paper can be of great use for researchers and practitioners interested in multiclass hypercomplex computing, particularly for applications in complex and quaternion signal and image processing, satellite control, neurocomputation, pattern recognition, computer vision, augmented virtual reality, robotics, and humanoids.
Identifying Autism from Resting-State fMRI Using Long Short-Term Memory Networks.
Dvornek, Nicha C; Ventola, Pamela; Pelphrey, Kevin A; Duncan, James S
2017-09-01
Functional magnetic resonance imaging (fMRI) has helped characterize the pathophysiology of autism spectrum disorders (ASD) and carries promise for producing objective biomarkers for ASD. Recent work has focused on deriving ASD biomarkers from resting-state functional connectivity measures. However, current efforts that have identified ASD with high accuracy were limited to homogeneous, small datasets, while classification results for heterogeneous, multi-site data have shown much lower accuracy. In this paper, we propose the use of recurrent neural networks with long short-term memory (LSTMs) for classification of individuals with ASD and typical controls directly from the resting-state fMRI time-series. We used the entire large, multi-site Autism Brain Imaging Data Exchange (ABIDE) I dataset for training and testing the LSTM models. Under a cross-validation framework, we achieved classification accuracy of 68.5%, which is 9% higher than previously reported methods that used fMRI data from the whole ABIDE cohort. Finally, we presented interpretation of the trained LSTM weights, which highlight potential functional networks and regions that are known to be implicated in ASD.
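An LSTM classifier of the kind described, applied directly to an fMRI-like time series, can be sketched as a forward pass in plain NumPy. This is an untrained, random-weight illustration of the model family only; the dimensions and logistic readout are assumptions, not the authors' architecture:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyLSTMClassifier:
    """Minimal LSTM forward pass over a (n_time, n_features) series,
    followed by a logistic readout on the final hidden state."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        # gate weights stacked as [input, forget, cell, output]
        self.W = rng.normal(0, 0.1, (4 * n_hidden, n_in + n_hidden))
        self.b = np.zeros(4 * n_hidden)
        self.w_out = rng.normal(0, 0.1, n_hidden)
        self.n_hidden = n_hidden

    def forward(self, x):
        h = np.zeros(self.n_hidden)
        c = np.zeros(self.n_hidden)
        for x_t in x:                              # one time step at a time
            z = self.W @ np.concatenate([x_t, h]) + self.b
            i, f, g, o = np.split(z, 4)
            i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
            c = f * c + i * np.tanh(g)             # cell-state update
            h = o * np.tanh(c)                     # hidden-state update
        return sigmoid(self.w_out @ h)             # class probability in (0, 1)
```

In practice the weights would be trained (e.g. by backpropagation through time) on labelled resting-state series under cross-validation, as in the study.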
Through thick and thin: quantitative classification of photometric observing conditions on Paranal
NASA Astrophysics Data System (ADS)
Kerber, Florian; Querel, Richard R.; Neureiter, Bianca; Hanuschik, Reinhard
2016-07-01
A Low Humidity and Temperature Profiling (LHATPRO) microwave radiometer is used to monitor sky conditions over ESO's Paranal observatory. It provides measurements of precipitable water vapour (PWV) at 183 GHz, which are being used in Service Mode for scheduling observations that can take advantage of favourable conditions for infrared (IR) observations. The instrument also contains an IR camera measuring sky brightness temperature at 10.5 μm. It is capable of detecting cold and thin, even sub-visual, cirrus clouds. We present a diagnostic diagram that, based on a sophisticated time series analysis of these IR sky brightness data, allows for the automatic and quantitative classification of photometric observing conditions over Paranal. The method is highly sensitive to the presence of even very thin clouds but robust against other causes of sky brightness variations. The diagram has been validated across the complete range of conditions that occur over Paranal and we find that the automated process provides correct classification at the 95% level. We plan to develop our method into an operational tool for routine use in support of ESO Science Operations.
Summer Crop Classification by Multi-Temporal COSMO-SkyMed® Data
NASA Astrophysics Data System (ADS)
Guarini, Rocchina; Bruzzone, Lorenzo; Santoni, Massimo; Vuolo, Francesco; Dini, Luigi
2016-08-01
In this study, we propose a multi-temporal and multi-polarization approach to discriminate different crop types in the Marchfeld region, Austria. The sensitivity of X-band COSMO-SkyMed® (CSK®) data with respect to five crop classes, namely carrot, corn, potato, soybean and sugarbeet, is investigated. In particular, the capabilities of dual-polarization (StripMap PingPong) HH/HV and single-polarization (StripMap Himage) HH and VH data in distinguishing among the five crop types are evaluated. A total of twenty-one Himage and ten PingPong images were acquired over a seven-month period, from April to October 2014. The backscattering coefficient was then extracted for each dataset and the classification was performed using a pixel-based support vector machine (SVM) approach. The accuracy of the resulting crop classifications was assessed by comparison with ground truth. The dual-polarization results (HH and HV) are contrasted with the single-polarization ones (HH and VH). The best accuracy is obtained by using time series of StripMap Himage data, at VH polarization, covering the whole season.
Wan, Shixiang; Duan, Yucong; Zou, Quan
2017-09-01
Predicting the subcellular localization of proteins is an important and challenging problem. Traditional experimental approaches are often expensive and time-consuming. Consequently, a growing number of research efforts employ a series of machine learning approaches to predict the subcellular location of proteins. There are two main challenges among the state-of-the-art prediction methods. First, most of the existing techniques are designed to deal with multi-class rather than multi-label classification, which ignores connections between multiple labels. In reality, multiple locations of particular proteins imply that there are vital and unique biological significances that deserve special focus and cannot be ignored. Second, techniques for handling imbalanced data in multi-label classification problems are necessary, but never employed. For solving these two issues, we have developed an ensemble multi-label classifier called HPSLPred, which can be applied for multi-label classification with an imbalanced protein source. For convenience, a user-friendly webserver has been established at http://server.malab.cn/HPSLPred. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Jahanian, Hesamoddin; Soltanian-Zadeh, Hamid; Hossein-Zadeh, Gholam-Ali
2005-09-01
To present novel feature spaces, based on multiscale decompositions obtained by scalar wavelet and multiwavelet transforms, to remedy problems associated with high dimension of functional magnetic resonance imaging (fMRI) time series (when they are used directly in clustering algorithms) and their poor signal-to-noise ratio (SNR) that limits accurate classification of fMRI time series according to their activation contents. Using randomization, the proposed method finds wavelet/multiwavelet coefficients that represent the activation content of fMRI time series and combines them to define new feature spaces. Using simulated and experimental fMRI data sets, the proposed feature spaces are compared to the cross-correlation (CC) feature space and their performances are evaluated. In these studies, the false positive detection rate is controlled using randomization. To compare different methods, several points of the receiver operating characteristics (ROC) curves, using simulated data, are estimated and compared. The proposed features suppress the effects of confounding signals and improve activation detection sensitivity. Experimental results show improved sensitivity and robustness of the proposed method compared to the conventional CC analysis. More accurate and sensitive activation detection can be achieved using the proposed feature spaces compared to CC feature space. Multiwavelet features show superior detection sensitivity compared to the scalar wavelet features. (c) 2005 Wiley-Liss, Inc.
NASA Technical Reports Server (NTRS)
Gao, Feng; DeColstoun, Eric Brown; Ma, Ronghua; Weng, Qihao; Masek, Jeffrey G.; Chen, Jin; Pan, Yaozhong; Song, Conghe
2012-01-01
Cities have been expanding rapidly worldwide, especially over the past few decades. Mapping the dynamic expansion of impervious surface in both space and time is essential for an improved understanding of the urbanization process, land-cover and land-use change, and their impacts on the environment. Landsat and other medium-resolution satellites provide the necessary spatial detail and temporal frequency for mapping impervious surface expansion over the past four decades. Since the US Geological Survey opened the historical record of the Landsat image archive for free access in 2008, the decades-old bottleneck of data limitation is gone. Remote-sensing scientists are now rich with data, and the challenge is how to make the best use of this precious resource. In this article, we develop an efficient algorithm to map the continuous expansion of impervious surface using a time series of four decades of medium-resolution satellite images. The algorithm is based on a supervised classification of the time-series image stack using a decision tree, in which each impervious class represents urbanization starting in a different image. The algorithm also allows us to remove inconsistent training samples, because impervious expansion is not reversible during the study period. The objective is to extract a time series of complete and consistent impervious surface maps from a corresponding time series of images collected from multiple sensors, with a minimal amount of image preprocessing effort. The approach was tested in the lower Yangtze River Delta region, one of the fastest-growing urban areas in China. Results from nearly four decades of medium-resolution satellite data from the Landsat Multispectral Scanner (MSS), Thematic Mapper (TM), Enhanced Thematic Mapper Plus (ETM+) and China-Brazil Earth Resources Satellite (CBERS) show an urbanization process consistent with economic development plans and policies.
The time-series impervious spatial extent maps derived from this study agree well with an existing urban extent polygon data set that was previously developed independently. The overall mapping accuracy was estimated at about 92.5% with 3% commission error and 12% omission error for the impervious type from all images regardless of image quality and initial spatial resolution.
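The training-sample consistency check described above (impervious expansion treated as irreversible, so a pixel may never revert from impervious to pervious) can be sketched as follows; the label encoding and function name are illustrative assumptions:

```python
def consistent_samples(samples):
    """Filter per-pixel label time series for training: a pixel labelled
    impervious (1) in one image must stay impervious in every later image.
    Series that revert (1 -> 0) are dropped as inconsistent.
    Labels: 0 = pervious, 1 = impervious."""
    kept = []
    for series in samples:
        seen_impervious = False
        ok = True
        for label in series:
            if seen_impervious and label == 0:
                ok = False       # reverted: inconsistent with monotonic growth
                break
            seen_impervious = seen_impervious or label == 1
        if ok:
            kept.append(series)
    return kept
```

Only monotonically non-decreasing label series survive, which is what makes each impervious class correspond to urbanization starting in a specific image.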
Lim, Hyun-Woo; Park, Ji-Hoon; Park, Hyun-Hee
2017-01-01
Objective: This paper describes changes in the characteristics of patients seeking orthodontic treatment over the past decade and the treatment they received, to identify any seasonal variations or trends. Methods: This single-center retrospective cohort study included all patients who presented to Seoul National University Dental Hospital for orthodontic diagnosis and treatment between January 1, 2005 and December 31, 2015. The study analyzed a set of heterogeneous variables grouped into the following categories: demographic (age, gender, and address), clinical (Angle Classification, anomaly, mode of orthodontic treatment, removable appliances for Phase 1 treatment, fixed appliances for Phase 2 treatment, orthognathic surgery, extraction, mini-plate, mini-implant, and patient transfer) and time-related variables (date of first visit and orthodontic treatment time). Time series analysis was applied to each variable. Results: The sample included 14,510 patients with a median age of 19.5 years. The number of patients and their ages demonstrated a clear seasonal variation, which peaked in the summer and winter. Increasing trends were observed for the proportion of male patients, use of non-extraction treatment modality, use of ceramic brackets, patients from provinces outside the Seoul region at large, patients transferred from private practitioners, and patients who underwent orthognathic surgery performed by university surgeons. Decreasing trends included the use of metal brackets and orthodontic treatment time. Conclusions: Time series analysis revealed a seasonal variation in some characteristics, and several variables showed changing trends over the past decade. PMID:28861391
NASA Astrophysics Data System (ADS)
Swain, Sushree Diptimayee; Ray, Pravat Kumar; Mohanty, K. B.
2016-06-01
This research paper presents the design of a shunt passive power filter (PPF) within a Hybrid Series Active Power Filter (HSAPF) using a novel analytic methodology that outperforms FFT analysis. The approach consists of the estimation, detection and classification of signals, and is applied to power quality (PQ) disturbances such as harmonics. The work combines three methods: harmonic detection by the wavelet transform, harmonic estimation by a Kalman filter algorithm, and harmonic classification by a decision tree. Among the candidate mother wavelets, db8 is selected because of its strong transient response and limited oscillation in the frequency domain. In the harmonic compensation process, the detected harmonics are compensated by the HSAPF based on the Instantaneous Reactive Power Theory (IRPT). The efficacy of the proposed method is verified both in the MATLAB/Simulink environment and on an experimental setup, and the results confirm its superiority over FFT analysis. The proposed PPF makes the conventional HSAPF more robust and stable.
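The harmonic-estimation stage (a Kalman filter tracking the amplitude of one harmonic) can be sketched as follows. The random-walk state model, the single-harmonic setup, and the tuning parameters q and r are assumptions for illustration, not the paper's exact design:

```python
import numpy as np

def kalman_harmonic(z, fs, freq, q=1e-6, r=0.01):
    """Estimate the in-phase/quadrature amplitudes of one harmonic of a
    noisy signal with a linear Kalman filter; returns the harmonic
    magnitude. State x = [a, b] for a*cos(wt) + b*sin(wt)."""
    x = np.zeros(2)
    P = np.eye(2)                                  # state covariance
    w = 2 * np.pi * freq
    for k, zk in enumerate(z):
        t = k / fs
        H = np.array([np.cos(w * t), np.sin(w * t)])   # measurement row
        P = P + q * np.eye(2)                          # predict (random walk)
        S = H @ P @ H + r                              # innovation variance
        K = P @ H / S                                  # Kalman gain
        x = x + K * (zk - H @ x)                       # measurement update
        P = P - np.outer(K, H) @ P
    return np.hypot(x[0], x[1])                        # harmonic magnitude
```

The estimated magnitudes per harmonic would then be passed to the decision-tree classification stage.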
HARD PAN I Test Series Test and Instrumentation Plans. Volume I. Test Plan
1975-12-01
Training Classifiers with Shadow Features for Sensor-Based Human Activity Recognition.
Fong, Simon; Song, Wei; Cho, Kyungeun; Wong, Raymond; Wong, Kelvin K L
2017-02-27
In this paper, a novel training/testing process for building/using a classification model based on human activity recognition (HAR) is proposed. Traditionally, HAR has been accomplished by a classifier that learns the activities of a person by training with skeletal data obtained from a motion sensor, such as Microsoft Kinect. These skeletal data are the spatial coordinates (x, y, z) of different parts of the human body. The numeric information forms time series, temporal records of movement sequences that can be used for training a classifier. In addition to the spatial features that describe current positions in the skeletal data, new features called 'shadow features' are used to improve the supervised learning efficacy of the classifier. Shadow features are inferred from the dynamics of body movements, and thereby model the underlying momentum of the performed activities. They provide extra dimensions of information for characterising activities in the classification process, and thereby significantly improve the classification accuracy. Two cases of HAR are tested using a classification model trained with shadow features: one using a wearable sensor and the other a Kinect-based remote sensor. Our experiments demonstrate the advantages of the new method, which will have an impact on human activity detection research.
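The paper describes shadow features only as inferred from body-movement dynamics; an exponentially smoothed frame-to-frame velocity is one plausible realisation (an assumption, not the authors' exact formula), sketched here for a single coordinate:

```python
def shadow_features(series, alpha=0.5):
    """Augment a positional coordinate time series with a 'shadow' value
    per frame: an exponentially smoothed frame-to-frame velocity that
    captures the momentum of the movement. Returns (position, shadow)
    pairs ready to be used as classifier features."""
    shadows = [0.0]                      # no velocity before the first frame
    for prev, cur in zip(series, series[1:]):
        # smoothed momentum between consecutive frames
        shadows.append(alpha * (cur - prev) + (1 - alpha) * shadows[-1])
    return [(x, s) for x, s in zip(series, shadows)]
```

Applied per skeletal coordinate (x, y, z of each joint), this doubles the feature dimensionality by pairing each position with a dynamics term.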
Case-based statistical learning applied to SPECT image classification
NASA Astrophysics Data System (ADS)
Górriz, Juan M.; Ramírez, Javier; Illán, I. A.; Martínez-Murcia, Francisco J.; Segovia, Fermín.; Salas-Gonzalez, Diego; Ortiz, A.
2017-03-01
Statistical learning and decision theory play a key role in many areas of science and engineering. Some examples include time series regression and prediction, optical character recognition, signal detection in communications, and biomedical applications for diagnosis and prognosis. This paper deals with the topic of learning from biomedical image data in the classification problem. In a typical scenario we have a training set that is employed to fit a prediction model or learner, and a testing set on which the learner is applied in order to predict the outcome for new unseen patterns. Both processes are usually kept completely separate to avoid over-fitting, and because in practice the new unseen objects (testing set) have unknown outcomes. However, the outcome takes one of a discrete set of values, e.g. in the binary diagnosis problem. Thus, assumptions about these outcome values can be established to obtain the most likely prediction model at the training stage, which could improve the overall classification accuracy on the testing set, or at least keep its performance at the level of the selected statistical classifier. In this sense, a novel case-based learning (c-learning) procedure is proposed which combines hypothesis testing from a discrete set of expected outcomes with a cross-validated classification stage.
Floating Forests: Validation of a Citizen Science Effort to Answer Global Ecological Questions
NASA Astrophysics Data System (ADS)
Rosenthal, I.; Byrnes, J.; Cavanaugh, K. C.; Haupt, A. J.; Trouille, L.; Bell, T. W.; Rassweiler, A.; Pérez-Matus, A.; Assis, J.
2017-12-01
Researchers undertaking long term, large-scale ecological analyses face significant challenges for data collection and processing. Crowdsourcing via citizen science can provide an efficient method for analyzing large data sets. However, many scientists have raised questions about the quality of data collected by citizen scientists. Here we use Floating-Forests (http://floatingforests.org), a citizen science platform for creating a global time series of giant kelp abundance, to show that ensemble classifications of satellite data can ensure data quality. Citizen scientists view satellite images of coastlines and classify kelp forests by tracing all visible patches of kelp. Each image is classified by fifteen citizen scientists before being retired. To validate citizen science results, all fifteen classifications are converted to a raster and overlaid on a calibration dataset generated from previous studies. Results show that ensemble classifications from citizen scientists are consistently accurate when compared to calibration data. Given that all source images were acquired by Landsat satellites, we expect this consistency to hold across all regions. At present, we have over 6000 web-based citizen scientists' classifications of almost 2.5 million images of kelp forests in California and Tasmania. These results are not only useful for remote sensing of kelp forests, but also for a wide array of applications that combine citizen science with remote sensing.
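The ensemble step described above (fifteen citizen classifications per image combined into a consensus raster) can be sketched as a per-pixel vote; the vote threshold and mask representation are assumptions, not the project's actual consensus rule:

```python
def ensemble_classification(masks, min_votes=8):
    """Combine per-user binary kelp masks (all the same shape, given as
    nested lists of 0/1) into a consensus raster: a pixel is kelp if at
    least `min_votes` of the classifications marked it."""
    rows, cols = len(masks[0]), len(masks[0][0])
    # per-pixel vote count across all citizen classifications
    votes = [[sum(m[r][c] for m in masks) for c in range(cols)]
             for r in range(rows)]
    return [[1 if v >= min_votes else 0 for v in row] for row in votes]
```

With fifteen classifications per image, `min_votes=8` implements a simple majority; the consensus raster can then be compared against the calibration dataset.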
Classification of boreal forest by satellite and inventory data using neural network approach
NASA Astrophysics Data System (ADS)
Romanov, A. A.
2012-12-01
The main objective of this research was to develop a methodology for high-accuracy land-cover classification of boreal (Siberian taiga) forest. The study area covers several parts of Central Siberia along the Yenisei River (60-62° North latitude): the right bank includes mixed forest and dark taiga, the left bank pine forests, so the sites were chosen as highly heterogeneous yet statistically comparable surfaces in terms of spectral characteristics. Two main types of data were used: time series of medium spatial resolution satellite images (Landsat 5 and 7, SPOT 4) and inventory datasets from field campaigns (used to prepare the training samples). Field data collection included a short botanical description of each plot (vegetation type/species, density, crown closure, representative minimum/maximum height and diameter for each type, and surface altitude); the geometric extent of each training sample unit corresponded to the spatial resolution of the satellite images and was geo-referenced (datasets were prepared both for preliminary processing and for verification). The network of test plots was irregular and determined by a landscape-oriented approach. The thematic data processing focused on the use of neural networks (including fuzzy logic); accordingly, the field results were converted into input parameters describing the vegetation type/species of each unit and its degree of variability. The proposed approach processes each image of the time series separately, mainly for verification: acquisition parameters (time, albedo) are taken into account to assess mapping quality.
The input variables for the networks were the sensor bands, surface altitude, solar angles, and (in a few experiments) land surface temperature; attention was also given to forming the class formula on the basis of statistical pre-processing of the field results (prevalent type). In addition, several statistical supervised classification methods were used (minimum distance, maximum likelihood, Mahalanobis distance). The study produced various neural classifiers suitable for mapping, and even for highly heterogeneous areas the neural network approach showed better precision than the statistical methods, despite their assumption of a Gaussian distribution (Table: comparison of classification accuracy). The optimum network structure, chosen experimentally, consists of three layers of ten neurons each, although this configuration requires larger computational resources than the statistical methods and more iterations during network training to minimize RMS error. A key issue for accuracy assessment is the incompleteness of the training sets, especially for summer images of mixed forest. Nevertheless, the proposed methodology also appears suitable for measuring the local dynamics of the boreal land surface by vegetation type.
Development and initial validation of the Classification of Early-Onset Scoliosis (C-EOS).
Williams, Brendan A; Matsumoto, Hiroko; McCalla, Daren J; Akbarnia, Behrooz A; Blakemore, Laurel C; Betz, Randal R; Flynn, John M; Johnston, Charles E; McCarthy, Richard E; Roye, David P; Skaggs, David L; Smith, John T; Snyder, Brian D; Sponseller, Paul D; Sturm, Peter F; Thompson, George H; Yazici, Muharrem; Vitale, Michael G
2014-08-20
Early-onset scoliosis is a heterogeneous condition, with highly variable manifestations and natural history. No standardized classification system exists to describe and group patients, to guide optimal care, or to prognosticate outcomes within this population. A classification system for early-onset scoliosis is thus a necessary prerequisite to the timely evolution of care of these patients. Fifteen experienced surgeons participated in a nominal group technique designed to achieve a consensus-based classification system for early-onset scoliosis. A comprehensive list of factors important in managing early-onset scoliosis was generated using a standardized literature review, semi-structured interviews, and open forum discussion. Three group meetings and two rounds of surveying guided the selection of classification components, subgroupings, and cut-points. Initial validation of the system was conducted using an interobserver reliability assessment based on the classification of a series of thirty cases. Nominal group technique was used to identify three core variables (major curve angle, etiology, and kyphosis) with high group content validity scores. Age and curve progression ranked slightly lower. Participants evaluated the cases of thirty patients with early-onset scoliosis for reliability testing. The mean kappa value for etiology (0.64) was substantial, while the mean kappa values for major curve angle (0.95) and kyphosis (0.93) indicated almost perfect agreement. The final classification consisted of a continuous age prefix, etiology (congenital or structural, neuromuscular, syndromic, and idiopathic), major curve angle (1, 2, 3, or 4), and kyphosis (-, N, or +) variables, and an optional progression modifier (P0, P1, or P2). 
Utilizing formal consensus-building methods in a large group of surgeons experienced in treating early-onset scoliosis, a novel classification system for early-onset scoliosis was developed with all core components demonstrating substantial to excellent interobserver reliability. This classification system will serve as a foundation to guide ongoing research efforts and standardize communication in the clinical setting. Copyright © 2014 by The Journal of Bone and Joint Surgery, Incorporated.
NASA Astrophysics Data System (ADS)
Wang, J.; Sulla-menashe, D. J.; Woodcock, C. E.; Sonnentag, O.; Friedl, M. A.
2017-12-01
Rapid climate change in arctic and boreal ecosystems is driving changes to land cover composition, including woody expansion in the arctic tundra, successional shifts following boreal fires, and thaw-induced wetland expansion and forest collapse along the southern limit of permafrost. The impacts of these land cover transformations on the physical climate and the carbon cycle are increasingly well-documented from field and model studies, but there have been few attempts to empirically estimate rates of land cover change at decadal time scale and continental spatial scale. Previous studies have used too coarse spatial resolution or have been too limited in temporal range to enable broad multi-decadal assessment of land cover change. As part of NASA's Arctic Boreal Vulnerability Experiment (ABoVE), we are using dense time series of Landsat remote sensing data to map disturbances and classify land cover types across the ABoVE extended domain (spanning western Canada and Alaska) over the last three decades (1982-2014) at 30 m resolution. We utilize regionally-complete and repeated acquisition high-resolution (<2 m) DigitalGlobe imagery to generate training data from across the region that follows a nested, hierarchical classification scheme encompassing plant functional type and cover density, understory type, wetland status, and land use. Additionally, we crosswalk plot-level field data into our scheme for additional high quality training sites. We use the Continuous Change Detection and Classification algorithm to estimate land cover change dates and temporal-spectral features in the Landsat data. These features are used to train random forest classification models and map land cover and analyze land cover change processes, focusing primarily on tundra "shrubification", post-fire succession, and boreal wetland expansion. 
We will analyze the high resolution data based on stratified random sampling of our change maps to validate and assess the accuracy of our model predictions. In this paper, we present initial results from this effort, including sub-regional analyses focused on several key areas, such as the Taiga Plains and the Southern Arctic ecozones, to calibrate our random forest models and assess results.
Forest habitat types of eastern Idaho-western Wyoming
Robert Steele; Stephen V. Cooper; David M. Ondov; David W. Roberts; Robert D. Pfister
1983-01-01
A land-classification system based upon potential natural vegetation is presented for the forests of central Idaho. It is based on reconnaissance sampling of about 980 stands. A hierarchical taxonomic classification of forest sites was developed using the habitat type concept. A total of six climax series, 58 habitat types, and 24 additional phases of habitat types are...
Coniferous forest habitat types of central and southern Utah
Andrew P. Youngblood; Ronald L. Mauk
1985-01-01
A land-classification system based upon potential natural vegetation is presented for the coniferous forests of central and southern Utah. It is based on reconnaissance sampling of about 720 stands. A hierarchical taxonomic classification of forest sites was developed using the habitat type concept. Seven climax series, 37 habitat types, and six additional phases of...
Forest habitat types of Montana
Robert D. Pfister; Bernard L. Kovalchik; Stephen F. Arno; Richard C. Presby
1977-01-01
A land-classification system based upon potential natural vegetation is presented for the forests of Montana. It is based on an intensive 4-year study and reconnaissance sampling of about 1,500 stands. A hierarchical classification of forest sites was developed using the habitat type concept. A total of 9 climax series, 64 habitat types, and 37 additional phases of...
Interannual Change Detection of Mediterranean Seagrasses Using RapidEye Image Time Series
Traganos, Dimosthenis; Reinartz, Peter
2018-01-01
Recent research studies have highlighted the decrease in the coverage of Mediterranean seagrasses due to mainly anthropogenic activities. The lack of data on the distribution of these significant aquatic plants complicates the quantification of their decreasing tendency. While Mediterranean seagrasses are declining, satellite remote sensing technology is growing at an unprecedented pace, resulting in a wealth of spaceborne image time series. Here, we exploit recent advances in high spatial resolution sensors and machine learning to study Mediterranean seagrasses. We process a multispectral RapidEye time series between 2011 and 2016 to detect interannual seagrass dynamics in 888 submerged hectares of the Thermaikos Gulf, NW Aegean Sea, Greece (eastern Mediterranean Sea). We assess the extent change of two Mediterranean seagrass species, the dominant Posidonia oceanica and Cymodocea nodosa, following atmospheric and analytical water column correction, as well as machine learning classification, using Random Forests, of the RapidEye time series. Prior corrections are necessary to untangle the initially weak signal of the submerged seagrass habitats from satellite imagery. The central results of this study show that P. oceanica seagrass area has declined by 4.1%, with a trend of −11.2 ha/yr, while C. nodosa seagrass area has increased by 17.7% with a trend of +18 ha/yr throughout the 5-year study period. Trends of change in spatial distribution of seagrasses in the Thermaikos Gulf site are in line with reported trends in the Mediterranean. Our presented methodology could be a time- and cost-effective method toward the quantitative ecological assessment of seagrass dynamics elsewhere in the future. From small meadows to whole coastlines, knowledge of aquatic plant dynamics could resolve decline or growth trends and accurately highlight key units for future restoration, management, and conservation. PMID:29467777
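The per-species trends reported above (e.g. −11.2 ha/yr) are of the kind produced by an ordinary least-squares fit of mapped area against year. A minimal sketch, with hypothetical yearly extents standing in for the RapidEye-derived areas:

```python
# Least-squares slope of seagrass extent vs. year, in ha/yr.
def trend_ha_per_yr(years, areas_ha):
    n = len(years)
    my, ma = sum(years) / n, sum(areas_ha) / n
    num = sum((y - my) * (a - ma) for y, a in zip(years, areas_ha))
    den = sum((y - my) ** 2 for y in years)
    return num / den

years = [2011, 2012, 2013, 2014, 2015, 2016]
areas = [550.0, 540.0, 526.0, 518.0, 505.0, 494.0]  # hypothetical extents, ha
slope = trend_ha_per_yr(years, areas)
pct = 100.0 * (areas[-1] - areas[0]) / areas[0]      # net percent change
print(round(slope, 1), "ha/yr,", round(pct, 1), "%")
```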
NASA Astrophysics Data System (ADS)
Zheng, Jinde; Pan, Haiyang; Cheng, Junsheng
2017-02-01
To detect incipient rolling bearing failures in time and locate faults accurately, a novel rolling bearing fault diagnosis method is proposed based on composite multiscale fuzzy entropy (CMFE) and ensemble support vector machines (ESVMs). Fuzzy entropy (FuzzyEn), an improvement of sample entropy (SampEn), is a nonlinear method for measuring the complexity of time series. Since FuzzyEn (or SampEn) at a single scale cannot reflect complexity effectively, multiscale fuzzy entropy (MFE) is developed by computing the FuzzyEn of coarse-grained time series, which represents the system dynamics at different scales. However, MFE values are affected by the data length, especially when the data are short. By combining information from the multiple coarse-grained time series at the same scale, the CMFE algorithm proposed in this paper enhances MFE as well as FuzzyEn. Compared with MFE, CMFE yields much more stable and consistent values for short time series as the scale factor increases. In this paper CMFE is employed to measure the complexity of rolling bearing vibration signals and to extract the nonlinear features hidden in them; the physical reasons why CMFE is suitable for rolling bearing fault diagnosis are also explored. Building on this, an ensemble-SVM-based multi-classifier is constructed for automatic, intelligent classification of the fault features. Finally, the proposed fault diagnosis method is applied to experimental data, and the results indicate that it effectively distinguishes different fault categories and severities of rolling bearings.
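The CMFE construction described above (coarse-graining each scale from every possible offset, then averaging the FuzzyEn values) can be sketched as follows. This is a minimal illustration: the parameter choices (m = 2, r = 0.15, Gaussian-exponent n = 2) are common defaults, not necessarily the paper's settings.

```python
import math, random

def fuzzy_en(x, m=2, r=0.15, n=2):
    """Fuzzy entropy: log-ratio of length-m to length-(m+1) pattern similarity."""
    def phi(k):
        vecs = []
        for i in range(len(x) - k + 1):
            v = x[i:i + k]
            mu = sum(v) / k
            vecs.append([a - mu for a in v])        # remove local baseline
        sims, cnt = 0.0, 0
        for i in range(len(vecs)):
            for j in range(i + 1, len(vecs)):
                d = max(abs(a - b) for a, b in zip(vecs[i], vecs[j]))
                sims += math.exp(-(d ** n) / r)     # fuzzy membership degree
                cnt += 1
        return sims / cnt
    return math.log(phi(m) / phi(m + 1))

def cmfe(x, scale, m=2, r=0.15):
    """Composite MFE: average FuzzyEn over all coarse-graining offsets."""
    vals = []
    for k in range(scale):
        y = [sum(x[i:i + scale]) / scale
             for i in range(k, len(x) - scale + 1, scale)]
        vals.append(fuzzy_en(y, m, r))
    return sum(vals) / scale

random.seed(0)
sig = [random.gauss(0, 1) for _ in range(200)]      # stand-in vibration signal
print([round(cmfe(sig, s), 3) for s in (1, 2, 3)])
```

At scale 1 the composite reduces to plain FuzzyEn; at larger scales, averaging the entropies of all offset coarse-grainings is what stabilizes the estimate for short records.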
Seri Landscape Classification and Spatial Reference
ERIC Educational Resources Information Center
O'Meara, Carolyn
2010-01-01
This thesis contributes to the growing field of ethnophysiography, a new subfield of cognitive anthropology that aims to determine the universals and variation in the categorization of landscape objects across cultures. More specifically, this work looks at the case of the Seri people of Sonora, Mexico to investigate the way they categorize…
Li, Jing; Zipper, Carl E; Donovan, Patricia F; Wynne, Randolph H; Oliphant, Adam J
2015-09-01
Surface mining disturbances have attracted attention globally due to their extensive influence on topography, land use, ecosystems, and human populations in mineral-rich regions. We analyzed a time series of Landsat satellite imagery to produce a 28-year disturbance history for surface coal mining in a segment of the eastern USA's central Appalachian coalfield, southwestern Virginia. The method was developed and applied as a three-step sequence: vegetation index selection, persistent vegetation identification, and mined-land delineation by year of disturbance. The overall classification accuracy and kappa coefficient were 0.9350 and 0.9252, respectively. Most surface coal mines were identified correctly by location and by time of initial disturbance. More than 8% of southwestern Virginia's >4,000 km² coalfield area was disturbed by surface coal mining over the 28-year period. Approximately 19.5% of the Appalachian coalfield surface within the most intensively mined county (Wise County) has been disturbed by mining. Mining disturbances expanded steadily and progressively over the study period. The information generated can be applied to gain further insight into mining influences on ecosystems and other essential environmental features.
Combining neural networks and genetic algorithms for hydrological flow forecasting
NASA Astrophysics Data System (ADS)
Neruda, Roman; Srejber, Jan; Neruda, Martin; Pascenko, Petr
2010-05-01
We present a neural network approach to rainfall-runoff modeling for small river basins based on several time series of hourly measured data. Different neural networks are considered for short-term runoff predictions (one to six hours lead time) based on runoff and rainfall data observed in previous time steps. Correlation analysis shows that runoff data, short-term rainfall history, and aggregated API values are the most significant data for the prediction. Neural models of multilayer perceptron and radial basis function networks with different numbers of units are used and compared with more traditional linear time series predictors. Out of a possible 48 hours of relevant history of all the input variables, the most important ones are selected by means of input filters created by a genetic algorithm. The genetic algorithm works with a population of binary encoded vectors defining input selection patterns. Standard genetic operators of two-point crossover, random bit-flipping mutation, and tournament selection were used. The evaluation of the objective function of each individual consists of several rounds of building and testing a particular neural network model. The whole procedure is rather computationally demanding (taking hours to days on a desktop PC), so a high-performance mainframe computer was used for our experiments. Results based on two years' worth of data from the Ploucnice river in Northern Bohemia suggest that the main problems with this approach to modeling are overtraining, which can lead to poor generalization, and the relatively small number of extreme events, which makes it difficult for a model to predict the amplitude of an event. Thus, experiments with both absolute and relative runoff predictions were carried out. In general it can be concluded that the neural models show about 5 per cent improvement in terms of efficiency coefficient over linear models.
Multilayer perceptrons with one hidden layer, trained by the back propagation algorithm and predicting relative runoff, show the best behavior so far. Utilizing the genetically evolved input filter improves the performance by yet another 5 per cent. In the future we would like to continue with experiments in on-line prediction using real-time data from the Smeda River with a 6-hour lead-time forecast. Following operational reality, we will focus on classifying the runoffs into flood alert levels and on reformulating the time series prediction task as a classification problem. The main goal of all this work is to improve the flood warning system operated by the Czech Hydrometeorological Institute.
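The genetically evolved input filter described above can be sketched with the stated operators (two-point crossover, random bit-flip mutation, tournament selection) over 48-bit selection masks. Since building and testing a neural network per individual is expensive, the fitness below is a toy stand-in (agreement with a hypothetical "ideal" mask), not the paper's objective.

```python
import random

random.seed(1)
BITS = 48                        # 48 hours of candidate input history
TARGET = [1 if i < 6 else 0 for i in range(BITS)]  # toy: recent hours matter

def fitness(mask):
    """Toy objective replacing the expensive build-and-test-a-network step."""
    return sum(1 for a, b in zip(mask, TARGET) if a == b)

def tournament(pop, k=3):
    return max(random.sample(pop, k), key=fitness)

def two_point_crossover(a, b):
    i, j = sorted(random.sample(range(BITS), 2))
    return a[:i] + b[i:j] + a[j:]

def mutate(mask, p=0.02):
    return [bit ^ 1 if random.random() < p else bit for bit in mask]

pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(40)]
for _ in range(60):              # generations
    pop = [mutate(two_point_crossover(tournament(pop), tournament(pop)))
           for _ in range(len(pop))]
best = max(pop, key=fitness)
print(fitness(best), "/", BITS)
```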
Performance evaluation of the croissant production line with reparable machines
NASA Astrophysics Data System (ADS)
Tsarouhas, Panagiotis H.
2015-03-01
In this study, analytical probability models were developed for an automated, bufferless serial production system consisting of n machines in series with a common transfer mechanism and control system. Both the time to failure and the time to repair a failure are assumed to follow exponential distributions. Applying these models, the effect of system parameters on system performance was studied for an actual croissant production line. The production line consists of six workstations with different numbers of repairable machines in series. Mathematical models of the croissant production line were developed using a Markov process. The strength of this study lies in classifying the whole system into states representing failures of different machines. Failure and repair data from the actual production environment were used to estimate reliability and maintainability for each machine, each workstation, and the entire line based on the analytical models. The analysis provides useful insight into the system's behaviour, helps find inherent design faults, and suggests optimal modifications to upgrade the system and improve its performance.
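Under the exponential failure/repair assumption, each machine's steady-state availability is μ/(λ+μ), and a bufferless series line is up only when every machine is up, so line availability is the product of the machine availabilities. A minimal sketch with hypothetical failure and repair rates (the paper's actual rates are not reproduced here):

```python
# Steady-state availability of a bufferless series line under exponential
# failure (rate lam) and repair (rate mu) assumptions.
def machine_availability(lam, mu):
    """lam = failure rate (1/h), mu = repair rate (1/h)."""
    return mu / (lam + mu)

def line_availability(rates):
    """Series line: up only if all machines are up."""
    a = 1.0
    for lam, mu in rates:
        a *= machine_availability(lam, mu)
    return a

# Hypothetical (lam, mu) pairs for a six-workstation line
rates = [(0.01, 0.5), (0.02, 0.4), (0.01, 0.6),
         (0.03, 0.5), (0.02, 0.5), (0.01, 0.4)]
print(round(line_availability(rates), 4))
```

Note how even six individually high-availability machines (each above 94%) combine into a noticeably lower line availability, which is the motivation for the state-by-state Markov analysis in the study.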
NASA Astrophysics Data System (ADS)
Storch, Cornelia; Wagner, Thomas; Ramminger, Gernot; Pape, Marlon; Ott, Hannes; Hausler, Thomas; Gomez, Sharon
2016-08-01
The paper presents the development of methods for an automated processing chain for classifying forest cover and change from high-resolution multi-temporal time series of Landsat and SPOT-5 (Take5) data, with a focus on the dry forest ecosystems of Africa. The method was developed within the European Space Agency (ESA) funded Global Monitoring for Environment and Security Service Element for Forest Monitoring (GSE FM) project on dry forest areas; the demonstration site selected was in Malawi. The methods are based on the principles of a robust but still flexible monitoring system that can cope with most complex Earth Observation (EO) data scenarios, varying in terms of data quality, source, accuracy, information content, completeness, etc. The method allows automated tracking of change dates and data gap filling, and takes into account phenology, the seasonality of tree species with respect to leaf fall, and heavy cloud cover during the rainy season.
Using "big data" to optimally model hydrology and water quality across expansive regions
Roehl, E.A.; Cook, J.B.; Conrads, P.A.
2009-01-01
This paper describes a new divide and conquer approach that leverages big environmental data, utilizing all available categorical and time-series data without subjectivity, to empirically model hydrologic and water-quality behaviors across expansive regions. The approach decomposes large, intractable problems into smaller ones that are optimally solved; decomposes complex signals into behavioral components that are easier to model with "sub-models"; and employs a sequence of numerically optimizing algorithms that include time-series clustering, nonlinear multivariate sensitivity analysis, predictive modeling using multi-layer perceptron artificial neural networks, and classification for selecting the best sub-models to make predictions at new sites. This approach has many advantages over traditional modeling approaches, including being faster and less expensive, more comprehensive in its use of available data, and more accurate in representing a system's physical processes. This paper describes the application of the approach to model groundwater levels in Florida, stream temperatures across western Oregon and Wisconsin, and water depths in the Florida Everglades. © 2009 ASCE.
Dai, Wei; Fu, Caroline; Khant, Htet A.; Ludtke, Steven J.; Schmid, Michael F.; Chiu, Wah
2015-01-01
Advances in electron cryo-tomography have provided a new opportunity to visualize the internal 3D structures of a bacterium. An electron microscope equipped with Zernike phase contrast optics produces images with dramatically increased contrast compared to images obtained by conventional electron microscopy. Here we describe a protocol to apply Zernike phase plate technology for acquiring electron tomographic tilt series of cyanophage-infected cyanobacterial cells embedded in ice, without staining or chemical fixation. We detail the procedures for aligning and assessing phase plates for data collection, and methods to obtain 3D structures of cyanophage assembly intermediates in the host, by subtomogram alignment, classification and averaging. Acquiring three to four tomographic tilt series takes approximately 12 h on a JEM2200FS electron microscope. We expect this time requirement to decrease substantially as the technique matures. Time required for annotation and subtomogram averaging varies widely depending on the project goals and data volume. PMID:25321408
NASA Astrophysics Data System (ADS)
Borchert, Otto Jerome
This paper describes a software tool to assist groups of people in the classification and identification of real-world objects, called the Classification, Identification, and Retrieval-based Collaborative Learning Environment (CIRCLE). A thorough literature review identified current pedagogical theories that were synthesized into a series of five tasks: gathering, elaboration, classification, identification, and reinforcement through game play. This approach is detailed in an included peer-reviewed paper. Motivation is increased through the use of formative and summative gamification: earning points for completing important portions of the tasks and playing retrieval-learning-based games, respectively, which is also included as a peer-reviewed conference proceedings paper. Collaboration is integrated into the experience through specific tasks and communication mediums. Implementation focused on a REST-based client-server architecture. The client is a series of web-based interfaces to complete each of the tasks, support formal classroom interaction through faculty accounts and student tracking, and a module for peers to help each other. The server, developed using the in-house JavaMOO platform, stores relevant project data and serves it through a series of messages implemented as a JavaScript Object Notation Application Programming Interface (JSON API). Through two beta tests and two experiments, it was discovered that the second task, elaboration, requires considerable support. While students were able to properly suggest experiments and make observations, the subtask of cleaning the data for use in CIRCLE required extra support. When supplied with more structured data, students were enthusiastic about the classification and identification tasks, showing marked improvement in usability scores and in open-ended survey responses. CIRCLE tracks a variety of educationally relevant variables, facilitating support for instructors and researchers.
Future work will revolve around material development, software refinement, and theory building. Curricula, lesson plans, instructional materials need to be created to seamlessly integrate CIRCLE in a variety of courses. Further refinement of the software will focus on improving the elaboration interface and developing further game templates to add to the motivation and retrieval learning aspects of the software. Data gathered from CIRCLE experiments can be used to develop and strengthen theories on teaching and learning.
Empirical study of classification process for two-stage turbo air classifier in series
NASA Astrophysics Data System (ADS)
Yu, Yuan; Liu, Jiaxiang; Li, Gang
2013-05-01
Suitable process parameters for a two-stage turbo air classifier are important for obtaining ultrafine powder with a narrow particle-size distribution; however, little has been published internationally on the classification process for two-stage turbo air classifiers in series. The influence of the process parameters of a two-stage turbo air classifier in series on classification performance is studied empirically, using aluminum oxide powders as the experimental material. The experimental results show the following: 1) When the rotor cage rotary speed of the first-stage classifier is increased from 2,300 r/min to 2,500 r/min at a constant rotor cage rotary speed of the second-stage classifier, classification precision increases from 0.64 to 0.67. However, in this case the final ultrafine powder yield decreases from 79% to 74%, which means the classification precision and the final ultrafine powder yield can be regulated by adjusting the rotor cage rotary speed of the first-stage classifier. 2) When the rotor cage rotary speed of the second-stage classifier is increased from 2,500 r/min to 3,100 r/min at a constant rotor cage rotary speed of the first-stage classifier, the cut size decreases from 13.16 μm to 8.76 μm, which means the cut size of the ultrafine powder can be regulated by adjusting the rotor cage rotary speed of the second-stage classifier. 3) When the feeding speed is increased from 35 kg/h to 50 kg/h, the "fish-hook" effect is strengthened, which decreases the ultrafine powder yield. 4) To weaken the "fish-hook" effect, equal wind speeds in the two stages, or a combination of a high first-stage wind speed with a low second-stage wind speed, should be selected. This empirical study provides criteria for configuring the process parameters of a two-stage or multi-stage classifier in series, offering a theoretical basis for practical production.
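The cut size reported above is conventionally read off a measured partition (Tromp) curve as d50, the particle size with a 50% chance of reporting to the coarse stream. A minimal sketch, interpolating d50 from hypothetical partition data (not the paper's measurements):

```python
# Estimate the cut size d50 by linear interpolation on a partition curve.
def cut_size(sizes_um, partition_pct):
    """sizes ascending; partition = % of feed at that size going to coarse."""
    for (d0, p0), (d1, p1) in zip(zip(sizes_um, partition_pct),
                                  zip(sizes_um[1:], partition_pct[1:])):
        if p0 <= 50.0 <= p1 or p1 <= 50.0 <= p0:
            return d0 + (50.0 - p0) * (d1 - d0) / (p1 - p0)
    raise ValueError("partition curve does not cross 50%")

sizes = [2.0, 4.0, 6.0, 8.0, 10.0, 12.0]      # particle size, um
part  = [3.0, 12.0, 31.0, 55.0, 78.0, 92.0]   # % to coarse product
print(round(cut_size(sizes, part), 2))        # → 7.58
```

The "fish-hook" effect discussed in the abstract would show up in such data as an anomalous rise of the partition values at the fine end of the curve.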
The APA classification of mental disorders: future perspectives.
Regier, Darrel A; Narrow, William E; First, Michael B; Marshall, Tina
2002-01-01
After 8-10 years of experience with the fourth edition of the Diagnostic and Statistical Manual (DSM-IV) and the tenth edition of the International Classification of Diseases (ICD-10), it is an ideal time to begin looking at the clinical and research consequences of these diagnostic systems. The American Psychiatric Association, in conjunction with the National Institutes of Health, has initiated a research development process intended to accelerate an evaluation of existing criteria while developing and testing hypotheses that would improve the validity of our diagnostic concepts. Over the past year, a multidisciplinary, international panel has developed a series of six white papers which define research opportunities in the following broad areas: Nomenclature, Disability and Impairment, Personality Disorders, Relational Disorders, Developmental Psychopathology, Neuroscience, and Cross-Cultural aspects of Psychopathology. Recommendations for future national and international research in each of these areas will be discussed. Copyright 2002 S. Karger AG, Basel
Maxwell, Susan K; Sylvester, Kenneth M
2012-06-01
A time series of 230 intra- and inter-annual Landsat Thematic Mapper images was used to identify land that was ever cropped during the years 1984 through 2010 for a five-county region in southwestern Kansas. Annual maximum Normalized Difference Vegetation Index (NDVI) image composites (NDVI(ann-max)) were used to evaluate the inter-annual dynamics of cropped and non-cropped land. Three feature images were derived from the 27-year NDVI(ann-max) image time series and used in the classification: 1) the maximum NDVI value over the entire 27-year time span (NDVI(max)), 2) the standard deviation of the annual maximum NDVI values for all years (NDVI(sd)), and 3) the standard deviation of the annual maximum NDVI values for years 1984-1986 (NDVI(sd84-86)), included to improve discrimination of Conservation Reserve Program land. Results of the classification were compared to three reference data sets: county-level USDA Census records (1982-2007) and two digital land cover maps (the Kansas 2005 map and USGS Trends Program maps, 1986-2000). The area of ever-cropped land for the five counties was on average 11.8% higher than the area estimated from Census records. Overall agreement with the ever-cropped land map was 91.9% for the 2005 Kansas map and 97.2% for the Trends maps. Converting the intra-annual Landsat data set to a single annual maximum NDVI image composite considerably reduced the data set size and eliminated cloud and cloud-shadow effects, yet maintained the information important for discriminating cropped land. Our results suggest that Landsat annual maximum NDVI image composites will be useful for characterizing land use and land cover change for many applications.
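The compositing scheme described above reduces each pixel's intra-annual observations to annual maxima before deriving the NDVI(max) and NDVI(sd) features. A minimal per-pixel sketch with hypothetical NDVI values:

```python
# Annual maximum NDVI compositing and the two main classification features.
import statistics

def annual_max_composite(yearly_obs):
    """yearly_obs: {year: [NDVI observations within that year]}."""
    return {yr: max(v) for yr, v in yearly_obs.items() if v}

def features(ann_max):
    vals = list(ann_max.values())
    return {"NDVImax": max(vals),                 # peak greenness ever seen
            "NDVIsd": statistics.pstdev(vals)}    # inter-annual variability

# Hypothetical pixel: cropped in most years, fallow (low NDVI) in 1985
obs = {1984: [0.21, 0.65, 0.70], 1985: [0.18, 0.30],
       1986: [0.25, 0.72], 1987: [0.22, 0.68, 0.74]}
am = annual_max_composite(obs)
print(features(am))
```

High NDVI(max) with high NDVI(sd), as in this example, is the signature of land that was cropped in some years but not others; taking the annual maximum is also what removes cloud-contaminated (low-NDVI) observations.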
Case Series: Chikungunya and Dengue at a Forward Operating Location
2015-05-01
Journal article; dates covered: November 2014 – January 2015. The report presents a case series and discusses the significance of chikungunya in the Americas and the diagnostic challenges when other arboviruses such as dengue are present. Subject terms: chikungunya, dengue, mosquitoes.
New Comprehensive System to Construct Speleothem Fabrics Time Series
NASA Astrophysics Data System (ADS)
Frisia, S.; Borsato, A.
2014-12-01
Speleothem fabrics record processes that influence the way geochemical proxy data are encoded in speleothems; yet there has been little advance in the use of fabrics as a complement to palaeo-proxy datasets since the fabric classification we proposed in 2010. The systematic use of fabric documentation in speleothem science has been limited by the absence of a comprehensive numerical system that would allow constructing fabric time series comparable with the widely used geochemical time series. Documentation of speleothem fabrics is fundamental for a robust interpretation of speleothem time series where stable isotopes and trace elements are used as proxies, because fabrics highlight depositional as well as post-depositional processes whose understanding complements reconstructions based on geochemistry. Here we propose a logic system that transforms microscope observations into numbers tied to acronyms specifying each fabric type and subtype. The rationale for ascribing progressive numbers to fabrics is based on the most up-to-date growth models. In this conceptual framework, the progression reflects hydrological conditions, bio-mediation, and diagenesis. The lowest numbers are given to calcite fabrics formed at relatively constant drip rates: the columnar types (compact and open). Higher numbers are ascribed to columnar fabrics characterized by impurities that cause elongation or lattice distortion (elongated, fascicular optic, and radiaxial calcites). The sequence progresses with the dendritic fabrics, followed by micrite (M), which has been observed in association with microbial films. Microsparite (Ms) and mosaic calcite (Mc) have the highest numbers, as they are considered diagenetic. Acronyms and suffixes are intended to become universally acknowledged. Thus, fabrics can be plotted against age to yield time series in which the numbers are replaced by the acronyms.
This results in a visual representation of the climate- or environment-related parameters underpinning speleothem crystal growth. The fabric log thus becomes a useful tool for adding robustness to the geochemical data or testing the overall utility of the speleothem record.
A Hierarchical Convolutional Neural Network for vesicle fusion event classification.
Li, Haohan; Mao, Yunxiang; Yin, Zhaozheng; Xu, Yingke
2017-09-01
Quantitative analysis of vesicle exocytosis and classification of different modes of vesicle fusion from fluorescence microscopy are of primary importance for biomedical research. In this paper, we propose a novel Hierarchical Convolutional Neural Network (HCNN) method to automatically identify vesicle fusion events in time-lapse Total Internal Reflection Fluorescence Microscopy (TIRFM) image sequences. First, a detection and tracking method is developed to extract image patch sequences containing potential fusion events. Then, a Gaussian Mixture Model (GMM) is applied to each image patch of the patch sequence, with outliers rejected for robust Gaussian fitting. By utilizing the high-level time-series intensity change features introduced by the GMM and the visual appearance features embedded in key moments of the fusion process, the proposed HCNN architecture is able to classify each candidate patch sequence into three classes: full fusion event, partial fusion event and non-fusion event. Finally, we validate the performance of our method on 9 challenging datasets annotated by cell biologists, and our method achieves better performance than three previous methods. Copyright © 2017 Elsevier Ltd. All rights reserved.
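The robust Gaussian fitting step described above can be sketched in a few lines. The snippet below is a simplified numpy illustration using iterative sigma clipping on patch intensities — an assumed stand-in, not the authors' exact GMM-with-outlier-rejection procedure:

```python
import numpy as np

def robust_gaussian_fit(values, n_sigma=3.0, max_iter=10):
    """Iteratively reject points beyond n_sigma and refit mean/std.
    A simplified stand-in for robust Gaussian fitting with outlier rejection."""
    v = np.asarray(values, float)
    mask = np.ones(v.size, dtype=bool)
    for _ in range(max_iter):
        mu, sigma = v[mask].mean(), v[mask].std()
        new_mask = np.abs(v - mu) <= n_sigma * sigma
        if np.array_equal(new_mask, mask):
            break  # converged: no further points rejected
        mask = new_mask
    return mu, sigma, mask

rng = np.random.default_rng(0)
data = rng.normal(100.0, 5.0, size=500)  # hypothetical patch intensities
data[:5] = 1000.0                        # a few bright outlier pixels
mu, sigma, inliers = robust_gaussian_fit(data)
```

The recovered mean stays close to the bulk of the distribution even though the raw mean is pulled far off by the outliers.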
NASA Astrophysics Data System (ADS)
Rose, R.; Aizenman, H.; Mei, E.; Choudhury, N.
2013-12-01
High school students interested in the STEM fields benefit most when actively participating, so I created a series of learning modules on how to analyze complex systems using machine learning, with automated feedback for students. The automated feedback gives timely responses that encourage students to continue testing and enhancing their programs. I have designed my modules to take a tactical learning approach in conveying the concepts behind correlation, linear regression, and vector-distance-based classification and clustering. On successful completion of these modules, students will learn how to calculate linear regression and Pearson's correlation, and to apply classification and clustering techniques to a dataset. Working on these modules will allow the students to take what they've learned back to the classroom and then apply it to the Earth Science curriculum. During my research this summer, we applied these lessons to analyzing river deltas; we looked at trends in the different variables over time, looked for similarities in NDVI, precipitation, inundation, runoff and discharge, and attempted to predict floods based on the precipitation, waves mean, area of discharge, NDVI, and inundation.
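The regression and correlation concepts these modules teach can be computed directly. A minimal numpy sketch (the modules' own code and datasets are not shown here):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson's correlation: covariance normalized by both standard deviations."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(np.sum(xc * yc) / np.sqrt(np.sum(xc**2) * np.sum(yc**2)))

def linear_regression(x, y):
    """Least-squares slope and intercept for y = a*x + b."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    a = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean())**2)
    b = y.mean() - a * x.mean()
    return a, b

x = np.arange(10.0)
y = 2.0 * x + 1.0          # a perfectly linear toy series
r = pearson_r(x, y)
a, b = linear_regression(x, y)
```

On a noiseless line the correlation is exactly 1 and the fitted slope and intercept recover the generating values.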
SSAW: A new sequence similarity analysis method based on the stationary discrete wavelet transform.
Lin, Jie; Wei, Jing; Adjeroh, Donald; Jiang, Bing-Hua; Jiang, Yue
2018-05-02
Alignment-free sequence similarity analysis methods often lead to significant savings in computational time over alignment-based counterparts. A new alignment-free sequence similarity analysis method, called SSAW, is proposed. SSAW stands for Sequence Similarity Analysis using the Stationary Discrete Wavelet Transform (SDWT). It extracts k-mers from a sequence, then maps each k-mer to a complex number field. The resulting series of complex numbers is then transformed into feature vectors using the stationary discrete wavelet transform. After these steps, the original sequence is turned into a feature vector with numeric values, which can then be used for clustering and/or classification. Using two different types of applications, namely clustering and classification, we compared SSAW against state-of-the-art alignment-free sequence analysis methods. SSAW demonstrates competitive or superior performance in terms of standard indicators, such as accuracy, F-score, precision, and recall. The running time was significantly better in most cases. These results make SSAW a suitable method for sequence analysis, especially given the rapidly increasing volumes of sequence data required by most modern applications.
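The SSAW pipeline — k-mers mapped to complex numbers, then transformed with a stationary wavelet — can be sketched as follows. The base-to-complex mapping, the choice of a Haar wavelet, and the circular boundary handling are all illustrative assumptions; the paper defines its own encoding and wavelet:

```python
import numpy as np

# Illustrative nucleotide-to-complex mapping (an assumption; SSAW defines
# its own k-mer encoding).
BASE = {'A': 1 + 1j, 'C': -1 + 1j, 'G': -1 - 1j, 'T': 1 - 1j}

def kmer_series(seq, k=3):
    """Encode each k-mer as the sum of its bases' complex codes."""
    return np.array([sum(BASE[b] for b in seq[i:i + k])
                     for i in range(len(seq) - k + 1)])

def haar_swt_level1(x):
    """One level of a stationary (undecimated) Haar wavelet transform:
    no downsampling, so each band has the input's length; circular boundary."""
    rolled = np.roll(x, -1)
    approx = (x + rolled) / np.sqrt(2)
    detail = (x - rolled) / np.sqrt(2)
    return approx, detail

def feature_vector(seq, k=3):
    z = kmer_series(seq, k)
    a, d = haar_swt_level1(z)
    # Summarize each band by magnitude statistics -> fixed-length numeric vector.
    return np.array([np.abs(a).mean(), np.abs(a).std(),
                     np.abs(d).mean(), np.abs(d).std()])

fv = feature_vector("ACGTACGTGGCC")
```

The fixed-length numeric vector is what would then feed a clustering or classification step.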
NASA Astrophysics Data System (ADS)
Zingone, Adriana; Harrison, Paul J.; Kraberg, Alexandra; Lehtinen, Sirpa; McQuatters-Gollop, Abigail; O'Brien, Todd; Sun, Jun; Jakobsen, Hans H.
2015-09-01
Phytoplankton diversity and its variation over an extended time scale can provide answers to a wide range of questions relevant to societal needs. These include human health, the safe and sustained use of marine resources and the ecological status of the marine environment, including long-term changes under the impact of multiple stressors. The analysis of phytoplankton data collected at the same place over time, as well as the comparison among different sampling sites, provide key information for assessing environmental change, and for evaluating new actions that must be taken to reduce human-induced pressures on the environment. To achieve these aims, phytoplankton data may be used several decades later by users who have not participated in their production, including automatic data retrieval and analysis. The methods used in phytoplankton species analysis vary widely among research and monitoring groups, while quality control procedures have not been implemented in most cases. Here we highlight some of the main differences in the sampling and analytical procedures applied to phytoplankton analysis and identify critical steps that are required to improve the quality and inter-comparability of data obtained at different sites and/or times. Harmonization of methods may not be a realistic goal, considering the wide range of purposes of phytoplankton time-series data collection. However, we propose that more consistent and detailed metadata and complementary information be recorded and made available along with phytoplankton time-series datasets, including description of the procedures and elements allowing for a quality control of the data. To keep up with the progress in taxonomic research, there is a need for continued training of taxonomists, and for supporting and complementing existing web resources, in order to allow a constant upgrade of knowledge in phytoplankton classification and identification.
Efforts towards the improvement of metadata recording, data annotation and quality control procedures will ensure the internal consistency of phytoplankton time series and facilitate their comparability and accessibility, thus strongly increasing the value of the precious information they provide. Ultimately, the sharing of quality controlled data will allow one to recoup the high cost of obtaining the data through the multiple use of the time-series data in various projects over many decades.
ERIC Educational Resources Information Center
Gilchrist, Alan, Ed.
This set of papers offers insights into some of the major developments in the field of classification and knowledge organization, and highlights many of the fundamental changes in views and theories which have taken place during the last 40 years. This document begins with a series of reminiscences from former delegates of the first International…
Automated Classification of Power Signals
2008-06-01
determine when a transient occurs. The identification of this signal can then be determined by an expert classifier and a series of these...the manual identification and classification of system events. Once events were located, the characteristics were examined to determine if system... identification code, which varies depending on the system classifier that is specified. Figure 3-7 provides an example of a Linux directory containing
ERIC Educational Resources Information Center
Wang, Wen-Chung; Huang, Sheng-Yun
2011-01-01
The one-parameter logistic model with ability-based guessing (1PL-AG) has been recently developed to account for effect of ability on guessing behavior in multiple-choice items. In this study, the authors developed algorithms for computerized classification testing under the 1PL-AG and conducted a series of simulations to evaluate their…
Parker, J W; Lane, J R; Karaikovic, E E; Gaines, R W
2000-05-01
A retrospective review of all the surgically managed spinal fractures at the University of Missouri Medical Center during the 4½-year period from January 1989 to July 1993 was performed. Of the 51 surgically managed patients, 46 were instrumented by short-segment technique (attachment of one level above the fracture to one level below the fracture). The other 5 patients in this consecutive series had multiple trauma. These patients were included in the review because this was a consecutive series. However, they were grouped separately because they were instrumented by long-segment technique because of their multiple organ system injuries. The choice of the anterior or posterior approach for short-segment instrumentation was based on the Load-Sharing Classification published in a 1994 issue of Spine. The purpose of this review was to demonstrate that grading comminution by use of the Load-Sharing Classification for approach selection and the choice of patients with isolated fractures who are cooperative with spinal bracing for 4 months provide the keys to successful short-segment treatment of isolated spinal fractures. The current literature implies that the use of pedicle screws for short-segment instrumentation of spinal fracture is dangerous and inappropriate because of the high screw fracture rate. Charts, operative notes, preoperative and postoperative radiographs, computed tomography scans, and follow-up records of all patients were reviewed carefully from the time of surgery until final follow-up assessment. The Load-Sharing Classification had been used prospectively for all patients before their surgery to determine the approach for short-segment instrumentation. Denis' Pain Scale and Work Scales were obtained during follow-up evaluation for all patients. All patients were observed over 40 months except for 1 patient who died of unrelated causes after 35 months. The mean follow-up period was 66 months (5½ years).
No patient was lost to follow-up evaluation. Prospective application of the Load-Sharing Classification to the patients' injury and restriction of the short-segment approach to cooperative patients with isolated spinal fractures (excluding multisystem trauma patients) allowed 45 of 46 patients instrumented by the short-segment technique to proceed to successful healing in virtual anatomic alignment. The Load-Sharing Classification is a straightforward way to describe the amount of bony comminution in a spinal fracture. When applied to patients with isolated spine fractures who are cooperative with 3 to 4 months of spinal bracing, it can help the surgeon select short-segment pedicle-screw-based fixation using the posterior approach for less comminuted injuries and the anterior approach for those more comminuted. The choice of which fracture-dislocations should be strut grafted anteriorly and which need only posterior short-segment pedicle-screw-based instrumentation also can be made using the Load-Sharing Classification.
On-line Robot Adaptation to Environmental Change
2005-08-01
by the Department of the Interior under contract no. NBCH1040007, the US Army under contract no. DABT639910013, the US Air Force Research Laboratory... [Front-matter excerpts] Probable Series Predictor algorithm, p. 97; 5.2 Accuracy of PSC in various test classification tasks, p. 105; 6.1 Probable Series Predictor algorithm, p. 123; 6.2 Accuracy of PSC in
Salimi-Khorshidi, Gholamreza; Douaud, Gwenaëlle; Beckmann, Christian F; Glasser, Matthew F; Griffanti, Ludovica; Smith, Stephen M
2014-01-01
Many sources of fluctuation contribute to the fMRI signal, and this makes identifying the effects that are truly related to the underlying neuronal activity difficult. Independent component analysis (ICA) - one of the most widely used techniques for the exploratory analysis of fMRI data - has been shown to be a powerful technique in identifying various sources of neuronally-related and artefactual fluctuation in fMRI data (both with the application of external stimuli and with the subject “at rest”). ICA decomposes fMRI data into patterns of activity (a set of spatial maps and their corresponding time series) that are statistically independent and add linearly to explain voxel-wise time series. Given the set of ICA components, if the components representing “signal” (brain activity) can be distinguished from the “noise” components (effects of motion, non-neuronal physiology, scanner artefacts and other nuisance sources), the latter can then be removed from the data, providing an effective cleanup of structured noise. Manual classification of components is labour intensive and requires expertise; hence, a fully automatic noise detection algorithm that can reliably detect various types of noise sources (in both task and resting fMRI) is desirable. In this paper, we introduce FIX (“FMRIB’s ICA-based X-noiseifier”), which provides an automatic solution for denoising fMRI data via accurate classification of ICA components. For each ICA component, FIX generates a large number of distinct spatial and temporal features, each describing a different aspect of the data (e.g., what proportion of temporal fluctuations are at high frequencies). The set of features is then fed into a multi-level classifier (built around several different classifiers). Once trained through the hand-classification of a sufficient number of training datasets, the classifier can then automatically classify new datasets.
The noise components can then be subtracted from (or regressed out of) the original data, to provide automated cleanup. On conventional resting-state fMRI (rfMRI) single-run datasets, FIX achieved about 95% overall accuracy. On high-quality rfMRI data from the Human Connectome Project, FIX achieves over 99% classification accuracy, and as a result is being used in the default rfMRI processing pipeline for generating HCP connectomes. FIX is publicly available as a plugin for FSL. PMID:24389422
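One of the temporal features this kind of classifier relies on — the proportion of spectral power at high frequencies — can be sketched as follows. The definition and the cutoff below are illustrative assumptions, not FIX's exact feature:

```python
import numpy as np

def high_freq_fraction(ts, dt, cutoff_hz=0.1):
    """Fraction of spectral power above cutoff_hz for one component time
    series -- an illustrative 'noise-like' temporal feature."""
    ts = np.asarray(ts, float) - np.mean(ts)
    power = np.abs(np.fft.rfft(ts)) ** 2
    freqs = np.fft.rfftfreq(ts.size, d=dt)
    total = power.sum()
    return float(power[freqs > cutoff_hz].sum() / total) if total > 0 else 0.0

t = np.arange(0, 300, 0.72)           # hypothetical TR of 0.72 s
slow = np.sin(2 * np.pi * 0.02 * t)   # slow, neuronally plausible fluctuation
fast = np.sin(2 * np.pi * 0.4 * t)    # fast, artefact-like fluctuation
hf_slow = high_freq_fraction(slow, 0.72)
hf_fast = high_freq_fraction(fast, 0.72)
```

A slow component concentrates its power below the cutoff while an artefact-like fast component concentrates it above, so the feature separates the two.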
The sdA problem - II. Photometric and spectroscopic follow-up
NASA Astrophysics Data System (ADS)
Pelisoli, Ingrid; Kepler, S. O.; Koester, D.; Castanheira, B. G.; Romero, A. D.; Fraga, L.
2018-07-01
The spectral classification `subdwarf A' (sdA) is given to stars showing H-rich spectra and sub-main-sequence surface gravities, but effective temperature lower than the zero-age horizontal branch. Their evolutionary origin is an enigma. In this work, we discuss the results of follow-up observations of selected sdAs. We obtained time-resolved spectroscopy for 24 objects and time-series photometry for another 19 objects. For two targets, we report both spectroscopic and photometric observations. We confirm seven objects to be new extremely low-mass white dwarfs (ELMs), one of which is a known eclipsing star. We also find the eighth member of the pulsating ELM class.
Odontogenic tumours: A review of 266 cases.
Lawal, Ahmed O; Adisa, Akinyele O; Olusanya, Adeola A
2013-02-01
The aim of this study was to examine the relative frequency of odontogenic tumours at a tertiary hospital in Ibadan, as well as to study the various histologic types based on the WHO 2005 classification and to compare results from this study with those of previous studies. The records of the Oral Pathology Department of University College Hospital were reviewed. Lesions diagnosed as odontogenic tumours were categorized into four groups based on the WHO 2005 classification and were analyzed for age, sex and site using SPSS for Windows (version 18.0; SPSS Inc., Chicago, IL), and frequency tables were generated. Two hundred and sixty-six (41.7%) cases of odontogenic tumours were seen. The mean age of occurrence was 32.6 (±15.815) years (range 3-82 years) and the peak age was in the third decade of life. Eleven (4.1%) malignant odontogenic tumours were seen. Ameloblastoma, with 65.4% of cases, was the most common odontogenic tumour, followed by fibromyxoma (14.7%); no case of odontoma was seen in this series. The findings were mostly similar to those of African and Asian series and showed variations from reports from the Americas. The reason for the disparity between African and American series needs further investigation. Key words: Odontogenic tumour, classification, Nigeria.
Epilepsy in twins: insights from unique historical data of William Lennox.
Vadlamudi, L; Andermann, E; Lombroso, C T; Schachter, S C; Milne, R L; Hopper, J L; Andermann, F; Berkovic, S F
2004-04-13
To classify the Lennox twin pairs according to modern epilepsy classifications, use the classic twin model to identify which epilepsy syndromes have an inherited component, search for evidence of syndrome-specific genes, and compare concordances from Lennox's series with a contemporary Australian series. Following review of Lennox's original files describing twins with seizures from 1934 through 1958, the International League Against Epilepsy classifications of seizures and epileptic syndromes were applied to 169 pairs. Monozygous (MZ) and dizygous (DZ) pairs were subdivided into epilepsy syndromes and casewise concordances estimated. The authors excluded 26 pairs, with 71 MZ and 72 DZ pairs remaining. Seizure analysis demonstrated strong parallels between contemporary seizure classification and Lennox's terminology. Epilepsy syndrome diagnoses were made in 75%. The MZ and DZ casewise concordance estimates gave strong evidence for a major genetic influence in idiopathic generalized epilepsies (0.80 versus 0.00; n = 23). High MZ casewise concordances also supported a genetic etiology in symptomatic generalized epilepsies and febrile seizures. The pairs who were concordant for seizures usually had the same syndromic diagnoses in both twins (86% in MZ, 60% in DZ), suggesting syndrome-specific genes. Apart from partial epilepsies, the MZ casewise concordances were similar to those derived from Australian twin data. The authors were able to apply contemporary classifications to Lennox's twins. The data confirm genetic bases for common generalized epilepsies as well as febrile seizures and provide further support for syndrome-specific genes. Finally, comparable results to our Australian series were obtained, verifying the value of twin studies.
Zhou, Yunyi; Tao, Chenyang; Lu, Wenlian; Feng, Jianfeng
2018-04-20
Functional connectivity is among the most important tools for studying the brain. The correlation coefficient between the time series of different brain areas is the most popular method to quantify functional connectivity. In practice, the correlation coefficient assumes the data to be temporally independent; however, brain time series data can manifest significant temporal auto-correlation. A widely applicable method is proposed for correcting temporal auto-correlation. We considered two types of time series models: (1) the auto-regressive-moving-average model, and (2) a nonlinear dynamical system model with noisy fluctuations, and derived their respective asymptotic distributions of the correlation coefficient. These two types of models are the most commonly used in neuroscience studies. We show that the respective asymptotic distributions share a unified expression. We have verified the validity of our method and shown that it exhibits sufficient statistical power for detecting true correlation in numerical experiments. Employing our method on a real dataset yields a more robust functional network and higher classification accuracy than conventional methods. Our method robustly controls the type I error while maintaining sufficient statistical power for detecting true correlation in numerical experiments, where existing methods measuring association (linear and nonlinear) fail. In this work, we proposed a widely applicable approach for correcting the effect of temporal auto-correlation on functional connectivity. Empirical results favor the use of our method in functional network analysis. Copyright © 2018. Published by Elsevier B.V.
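A closely related and widely used correction for correlating auto-correlated series is the Bartlett-style effective sample size, sketched below. This is illustrative only; it is not the asymptotic-distribution derivation the paper proposes:

```python
import numpy as np

def autocorr(x, max_lag):
    """Sample auto-correlation at lags 1..max_lag."""
    x = np.asarray(x, float) - np.mean(x)
    denom = np.sum(x * x)
    return np.array([np.sum(x[:-k] * x[k:]) / denom for k in range(1, max_lag + 1)])

def effective_n(x, y, max_lag=None):
    """Bartlett-style effective sample size for the correlation of two
    auto-correlated series: n is deflated by the product of the two
    auto-correlation functions."""
    n = len(x)
    if max_lag is None:
        max_lag = n // 5
    rx, ry = autocorr(x, max_lag), autocorr(y, max_lag)
    denom = 1.0 + 2.0 * np.sum(rx * ry)
    return n / max(denom, 1.0)

rng = np.random.default_rng(1)

def ar1(phi, n):
    """AR(1) series: strong temporal auto-correlation when phi is near 1."""
    e = rng.normal(size=n)
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + e[i]
    return x

x, y = ar1(0.9, 1000), ar1(0.9, 1000)
n_eff = effective_n(x, y)
```

For two strongly auto-correlated AR(1) series, the effective sample size is far below the nominal 1000 points, which is why naive correlation tests become anti-conservative.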
The risk characteristics of solar and geomagnetic activity
NASA Astrophysics Data System (ADS)
Podolska, Katerina
2016-04-01
The main aim of this contribution is a deeper analysis of the influence of solar activity, which is expected to have an impact on human health, and therefore on mortality, in particular from civilization and degenerative diseases. We have constructed characteristics that represent the risk of solar and geomagnetic activity to human health, on the basis of our previous analysis of the association between the daily numbers of deaths from diseases of the nervous system and diseases of the circulatory system and solar and geomagnetic activity in the Czech Republic during the years 1994-2013. We used long-period daily time series of numbers of deaths by cause, long-period time series of solar activity indices (namely R and F10.7), geomagnetic indices (Kp planetary index, Dst) and ionospheric parameters (foF2 and TEC). The ionospheric parameters were related to the geographic location of the Czech Republic and adjusted for middle geographic latitudes. The risk characteristics were composed by cluster analysis of the time series according to the phases of the solar cycle, the seasonal insolation at mid-latitudes, or the daily period, according to the impact of solar and geomagnetic activity on mortality for cause-of-death groups VI (diseases of the nervous system) and IX (diseases of the circulatory system) of the 10th Revision of the WHO International Classification of Diseases (ICD-10).
Rey, Sergio J.; Stephens, Philip A.; Laura, Jason R.
2017-01-01
Large data contexts present a number of challenges to optimal choropleth map classifiers. Application of optimal classifiers to a sample of the attribute space is one proposed solution. The properties of alternative sampling-based classification methods are examined through a series of Monte Carlo simulations. The impacts of spatial autocorrelation, number of desired classes, and form of sampling are shown to have significant impacts on the accuracy of map classifications. Tradeoffs between improved speed of the sampling approaches and loss of accuracy are also considered. The results suggest the possibility of guiding the choice of classification scheme as a function of the properties of large data sets.
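The sampling idea — fit a classifier to a sample of the attribute vector and apply it to the full data — can be sketched with a simple quantile classifier. The paper's optimal classifiers (e.g. Fisher-Jenks) are more expensive; the class count and sample size below are arbitrary illustrative choices:

```python
import numpy as np

def quantile_breaks(values, k):
    """Class break points placing roughly equal numbers of observations per class."""
    return np.quantile(values, np.linspace(0, 1, k + 1)[1:-1])

def classify(values, breaks):
    """Assign each value a class index 0..k-1 from the break points."""
    return np.searchsorted(breaks, values)

rng = np.random.default_rng(2)
attr = rng.lognormal(mean=3.0, sigma=1.0, size=200_000)  # large skewed attribute

# Classifier fitted to the full data vs. one fitted to a small sample
full_breaks = quantile_breaks(attr, k=5)
sample_breaks = quantile_breaks(rng.choice(attr, size=5_000, replace=False), k=5)

full_cls = classify(attr, full_breaks)
sample_cls = classify(attr, sample_breaks)
agreement = float(np.mean(full_cls == sample_cls))
```

The sample-based breaks disagree with the full-data breaks only for observations near class boundaries, so map-class agreement stays high at a fraction of the cost — the speed/accuracy tradeoff the simulations examine.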
Awan, Imtiaz; Aziz, Wajid; Habib, Nazneen; Alowibdi, Jalal S.; Saeed, Sharjil; Nadeem, Malik Sajjad Ahmed; Shah, Syed Ahsin Ali
2018-01-01
Considerable interest has been devoted to developing a deeper understanding of the dynamics of healthy biological systems and how these dynamics are affected by aging and disease. Entropy-based complexity measures have been widely used for quantifying the dynamics of physical and biological systems. These techniques have provided valuable information leading to a fuller understanding of the dynamics of these systems and the underlying stimuli that are responsible for anomalous behavior. Traditional single-scale entropy measures have yielded contradictory results about the dynamics of real-world time series data of healthy and pathological subjects. Recently the multiscale entropy (MSE) algorithm was introduced for precise description of the complexity of biological signals, and it has been used in numerous fields since its inception. The original MSE quantifies the complexity of coarse-grained time series using sample entropy. The original MSE may be unreliable for short signals because the length of the coarse-grained time series decreases with increasing scaling factor τ; for long signals, however, MSE works well. To overcome this drawback, various variants of the method have been proposed for evaluating complexity efficiently. In this study, we propose multiscale normalized corrected Shannon entropy (MNCSE), in which, instead of sample entropy, the symbolic entropy measure NCSE is used as the entropy estimate. The results of the study are compared with traditional MSE. The effectiveness of the proposed approach is demonstrated using noise signals as well as interbeat interval signals from healthy and pathological subjects. The preliminary results of the study indicate that MNCSE values are more stable and reliable than original MSE values. The results show that MNCSE-based features lead to higher classification accuracies in comparison with MSE-based features. PMID:29771977
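The two core operations of MSE — coarse-graining and sample entropy — can be sketched in numpy. The parameter choices (m = 2, tolerance r = 0.15 of the original signal's standard deviation) follow common practice and are assumptions here:

```python
import numpy as np

def coarse_grain(x, tau):
    """Non-overlapping averages of length tau -- the MSE coarse-graining step.
    The coarse-grained series shortens by a factor of tau, which is why the
    original MSE becomes unreliable for short signals at large scales."""
    n = (len(x) // tau) * tau
    return np.asarray(x[:n], float).reshape(-1, tau).mean(axis=1)

def sample_entropy(x, m, r):
    """Sample entropy: -log of the conditional probability that sequences
    matching for m points (within tolerance r) also match for m+1 points."""
    x = np.asarray(x, float)
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        c = 0
        for i in range(len(templ)):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            c += int(np.sum(d <= r))
        return c
    a, b = count(m + 1), count(m)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(3)
white = rng.normal(size=2000)
r = 0.15 * np.std(white)  # tolerance fixed from the original signal, per convention
mse_white = [sample_entropy(coarse_grain(white, tau), 2, r) for tau in (1, 2, 4)]
```

For white noise, entropy falls with increasing scale — the classic MSE signature that distinguishes uncorrelated noise from genuinely complex signals.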
TIMESERIESSTREAMING.VI: LabVIEW program for reliable data streaming of large analog time series
NASA Astrophysics Data System (ADS)
Czerwinski, Fabian; Oddershede, Lene B.
2011-02-01
With modern data acquisition devices that work fast and very precisely, scientists often face the task of dealing with huge amounts of data. These need to be rapidly processed and stored onto a hard disk. We present a LabVIEW program which reliably streams analog time series at MHz sampling rates. Its run time has virtually no limitation. We explicitly show how to use the program to extract time series from two experiments: for a photodiode detection system that tracks the position of an optically trapped particle and for a measurement of ionic current through a glass capillary. The program is easy to use and versatile, as the input can be any type of analog signal. Also, the data streaming software is simple, highly reliable, and can be easily customized to include, e.g., real-time power spectral analysis and Allan variance noise quantification. Program summary: Program title: TimeSeriesStreaming.VI; Catalogue identifier: AEHT_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHT_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html; No. of lines in distributed program, including test data, etc.: 250; No. of bytes in distributed program, including test data, etc.: 63 259; Distribution format: tar.gz; Programming language: LabVIEW (http://www.ni.com/labview/); Computer: Any machine running LabVIEW 8.6 or higher; Operating system: Windows XP and Windows 7; RAM: 60-360 Mbyte; Classification: 3; Nature of problem: For numerous scientific and engineering applications, it is highly desirable to have an efficient, reliable, and flexible program to perform data streaming of time series sampled with high frequencies and possibly for long time intervals. This type of data acquisition often produces very large amounts of data not easily streamed onto a computer hard disk using standard methods.
Solution method: This LabVIEW program is developed to directly stream any kind of time series onto a hard disk. Due to optimized timing and usage of computational resources, such as multicores and protocols for memory usage, this program provides extremely reliable data acquisition. In particular, the program is optimized to deal with large amounts of data, e.g., taken with high sampling frequencies and over long time intervals. The program can be easily customized for time series analyses. Restrictions: Only tested in Windows-operating LabVIEW environments, must use TDMS format, acquisition cards must be LabVIEW compatible, driver DAQmx installed. Running time: As desirable: microseconds to hours
Real-time eruption forecasting using the material Failure Forecast Method with a Bayesian approach
NASA Astrophysics Data System (ADS)
Boué, A.; Lesage, P.; Cortés, G.; Valette, B.; Reyes-Dávila, G.
2015-04-01
Many attempts at deterministic forecasting of eruptions and landslides have been made using the material Failure Forecast Method (FFM). This method consists of fitting an empirical power law to precursory patterns of seismicity or deformation. Until now, most studies have presented hindsight forecasts based on complete time series of precursors and have not evaluated the ability of the method to carry out real-time forecasting with partial precursory sequences. In this study, we present a rigorous approach to the FFM designed for real-time applications to volcano-seismic precursors. We use a Bayesian approach based on the FFM theory and an automatic classification of seismic events. The probability distributions of the data, deduced from the performance of this classification, are used as input. As output, it provides the probability of the forecast time at each observation time before the eruption. The spread of the a posteriori probability density function of the prediction time and its stability with respect to the observation time are used as criteria to evaluate the reliability of the forecast. We test the method on precursory accelerations of long-period seismicity prior to vulcanian explosions at Volcán de Colima (Mexico). For explosions preceded by a single phase of seismic acceleration, we obtain accurate and reliable forecasts using approximately 80% of the whole precursory sequence. It is, however, more difficult to apply the method to multiple acceleration patterns.
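For the textbook FFM exponent alpha = 2, the inverse precursor rate decays linearly toward zero at the failure time, which yields a simple deterministic forecast. The sketch below illustrates that classical baseline on synthetic data, not the paper's Bayesian formulation:

```python
import numpy as np

def ffm_forecast(times, rates):
    """Classic FFM shortcut for exponent alpha = 2: the inverse precursor
    rate decays linearly in time, so the forecast failure time is where a
    least-squares line through 1/rate crosses zero."""
    inv = 1.0 / np.asarray(rates, float)
    slope, intercept = np.polyfit(np.asarray(times, float), inv, 1)
    return -intercept / slope

# Synthetic precursory sequence accelerating hyperbolically toward t_f = 100
t_f = 100.0
t = np.linspace(0.0, 80.0, 60)       # partial sequence: observations stop at t = 80
rate = 1.0 / (0.05 * (t_f - t))      # event rate diverges as t -> t_f (alpha = 2)
t_pred = ffm_forecast(t, rate)
```

On this noiseless sequence the linear extrapolation recovers the failure time exactly; with real, noisy precursors the zero crossing scatters, which is the motivation for wrapping the FFM in a probabilistic framework.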
Formisano, Elia; De Martino, Federico; Valente, Giancarlo
2008-09-01
Machine learning and pattern recognition techniques are being increasingly employed in functional magnetic resonance imaging (fMRI) data analysis. By taking into account the full spatial pattern of brain activity measured simultaneously at many locations, these methods allow detecting subtle, non-strictly localized effects that may remain invisible to the conventional analysis with univariate statistical methods. In typical fMRI applications, pattern recognition algorithms "learn" a functional relationship between brain response patterns and a perceptual, cognitive or behavioral state of a subject expressed in terms of a label, which may assume discrete (classification) or continuous (regression) values. This learned functional relationship is then used to predict the unseen labels from a new data set ("brain reading"). In this article, we describe the mathematical foundations of machine learning applications in fMRI. We focus on two methods, support vector machines and relevance vector machines, which are respectively suited for the classification and regression of fMRI patterns. Furthermore, by means of several examples and applications, we illustrate and discuss the methodological challenges of using machine learning algorithms in the context of fMRI data analysis.
LMD Based Features for the Automatic Seizure Detection of EEG Signals Using SVM.
Zhang, Tao; Chen, Wanzhong
2017-08-01
Achieving the goal of detecting seizure activity automatically using electroencephalogram (EEG) signals is of great importance for the treatment of epileptic seizures. To realize this aim, a newly developed time-frequency analysis algorithm, local mean decomposition (LMD), is employed in the present study. LMD is able to decompose an arbitrary signal into a series of product functions (PFs). First, the raw EEG signal is decomposed into several PFs, and the temporal statistical and non-linear features of the first five PFs are then calculated. The features of each PF are fed into five classifiers, including back-propagation neural network (BPNN), K-nearest neighbor (KNN), linear discriminant analysis (LDA), un-optimized support vector machine (SVM) and SVM optimized by genetic algorithm (GA-SVM), for five classification cases, respectively. The combined features of all PFs and the raw EEG are further passed to the high-performance GA-SVM for the same classification tasks. Experimental results on the public Bonn epilepsy EEG dataset show that the average classification accuracy of the presented approach is equal to or higher than 98.10% in all five cases, indicating the effectiveness of the proposed approach for automated seizure detection.
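Since LMD itself is not available in standard libraries, the downstream stage can be sketched instead: temporal statistics computed per signal segment, fed to a cross-validated SVM. The segments are synthetic stand-ins (not Bonn data) and the feature list is illustrative, not the paper's exact set:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

def segment_features(x):
    """Temporal statistics of the kind computed per product function:
    mean absolute value, standard deviation, a skewness proxy, line length."""
    dx = np.diff(x)
    return np.array([np.mean(np.abs(x)), np.std(x),
                     np.mean(x ** 3), np.sum(np.abs(dx))])

rng = np.random.default_rng(1)
# Stand-in EEG segments: the "seizure" class has higher amplitude
# than the "interictal" class (purely synthetic).
normal = [rng.normal(0, 1.0, 256) for _ in range(40)]
seizure = [rng.normal(0, 3.0, 256) for _ in range(40)]
X = np.array([segment_features(s) for s in normal + seizure])
y = np.array([0] * 40 + [1] * 40)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
acc = cross_val_score(model, X, y, cv=5).mean()
```

In the paper this stage runs once per PF and once on the fused features, with the SVM hyperparameters tuned by a genetic algorithm rather than left at defaults.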
Forecasting Daily Volume and Acuity of Patients in the Emergency Department.
Calegari, Rafael; Fogliatto, Flavio S; Lucini, Filipe R; Neyeloff, Jeruza; Kuchenbecker, Ricardo S; Schaan, Beatriz D
2016-01-01
This study aimed at analyzing the performance of four forecasting models in predicting the demand for medical care in terms of daily visits in an emergency department (ED) that handles high complexity cases, testing the influence of climatic and calendrical factors on demand behavior. We tested different mathematical models to forecast ED daily visits at Hospital de Clínicas de Porto Alegre (HCPA), which is a tertiary care teaching hospital located in Southern Brazil. Model accuracy was evaluated using mean absolute percentage error (MAPE), considering forecasting horizons of 1, 7, 14, 21, and 30 days. The demand time series was stratified according to patient classification using the Manchester Triage System's (MTS) criteria. Models tested were the simple seasonal exponential smoothing (SS), seasonal multiplicative Holt-Winters (SMHW), seasonal autoregressive integrated moving average (SARIMA), and multivariate autoregressive integrated moving average (MSARIMA). Performance of models varied according to patient classification, such that SS was the best choice when all types of patients were jointly considered, and SARIMA was the most accurate for modeling demands of very urgent (VU) and urgent (U) patients. The MSARIMA models taking into account climatic factors did not improve the performance of the SARIMA models, independent of patient classification.
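The evaluation protocol (forecast a daily-visit series over a fixed horizon and score with MAPE) can be sketched with a seasonal-naive baseline on a synthetic weekly-patterned series. The real study fits SS, SMHW, SARIMA and MSARIMA models; the baseline and the numbers below are assumptions for illustration:

```python
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error, the accuracy measure used to
    compare the forecasting models."""
    actual = np.asarray(actual, float)
    forecast = np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

# Synthetic daily ED visit counts with a weekly (period-7) pattern,
# standing in for one stratified demand series.
rng = np.random.default_rng(2)
weekly = np.array([120, 110, 105, 100, 102, 90, 85], float)
series = np.tile(weekly, 20) + rng.normal(0, 3, 140)

# Seasonal-naive baseline: forecast each day with the value observed
# 7 days earlier; evaluate on a 14-day horizon.
train, test = series[:-14], series[-14:]
forecast = series[-28:-14]
err = mape(test, forecast)
```

Any of the four candidate models would be scored the same way, over each of the 1-, 7-, 14-, 21- and 30-day horizons.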
Enhancing the performance of regional land cover mapping
NASA Astrophysics Data System (ADS)
Wu, Weicheng; Zucca, Claudio; Karam, Fadi; Liu, Guangping
2016-10-01
Different pixel-based, object-based and subpixel-based methods, such as time-series analysis, decision trees, and various supervised approaches, have been proposed for land use/cover classification. However, despite their proven advantages in small dataset tests, their performance is variable and less satisfactory when dealing with large datasets, particularly for regional-scale mapping with high-resolution data, owing to the complexity and diversity of landscapes and land cover patterns and the unacceptably long processing time. The objective of this paper is to demonstrate the comparatively high performance of an operational approach based on the integration of multisource information, ensuring high mapping accuracy over large areas with acceptable processing time. The information used includes phenologically contrasted multiseasonal and multispectral bands, vegetation index, land surface temperature, and topographic features. The performance of different conventional and machine learning classifiers, namely Mahalanobis Distance (MD), Maximum Likelihood (ML), Artificial Neural Networks (ANNs), Support Vector Machines (SVMs) and Random Forests (RFs), was compared using the same datasets in the same IDL (Interactive Data Language) environment. An Eastern Mediterranean area with complex landscape and steep climate gradients was selected to test and develop the operational approach. The results showed that the SVM and RF classifiers produced the most accurate mapping at local scale (up to 96.85% overall accuracy) but were very time-consuming in whole-scene classification (more than five days per scene), whereas ML fulfilled the task rapidly (about 10 min per scene) with satisfactory accuracy (94.2-96.4%). Thus, the approach composed of the integration of seasonally contrasted multisource data and sampling at subclass level, followed by ML classification, is a suitable candidate to become an operational and effective regional land cover mapping method.
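The Gaussian Maximum Likelihood classifier common in remote sensing corresponds to quadratic discriminant analysis with per-class covariances, so the ML-versus-RF accuracy comparison can be sketched on synthetic "pixels". The band count and class means below are illustrative assumptions, not the study's data:

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
# Synthetic pixels: 6 bands, 3 cover classes with shifted means,
# standing in for the multiseasonal spectral/topographic stack.
n_per, n_bands = 100, 6
X = np.vstack([rng.normal(m, 1.0, (n_per, n_bands)) for m in (0.0, 1.5, 3.0)])
y = np.repeat([0, 1, 2], n_per)

# Gaussian Maximum Likelihood classification is equivalent to QDA
# with one covariance matrix estimated per class.
ml_acc = cross_val_score(QuadraticDiscriminantAnalysis(), X, y, cv=5).mean()
rf_acc = cross_val_score(
    RandomForestClassifier(n_estimators=100, random_state=0), X, y, cv=5
).mean()
```

The study's point is about throughput rather than accuracy alone: the parametric ML model is cheap to apply per scene, which is what made it practical at regional scale.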
A Deep Learning Scheme for Motor Imagery Classification based on Restricted Boltzmann Machines.
Lu, Na; Li, Tengfei; Ren, Xiaodong; Miao, Hongyu
2017-06-01
Motor imagery classification is an important topic in brain-computer interface (BCI) research that enables the recognition of a subject's intention to, e.g., implement prosthesis control. The brain dynamics of motor imagery are usually measured by electroencephalography (EEG) as nonstationary time series of low signal-to-noise ratio. Although a variety of methods have previously been developed to learn EEG signal features, the deep learning idea has rarely been explored to generate new representations of EEG features and achieve further performance improvement for motor imagery classification. In this study, a novel deep learning scheme based on the restricted Boltzmann machine (RBM) is proposed. Specifically, frequency-domain representations of EEG signals obtained via the fast Fourier transform (FFT) and wavelet packet decomposition (WPD) are used to train three RBMs. These RBMs are then stacked up with an extra output layer to form a four-layer neural network, named the frequential deep belief network (FDBN). The output layer employs softmax regression to accomplish the classification task, and the conjugate gradient method and backpropagation are used to fine-tune the FDBN. Extensive and systematic experiments have been performed on public benchmark datasets, and the results show that the performance improvement of FDBN over other selected state-of-the-art methods is statistically significant. Several findings that may be of significant interest to the BCI community are also presented in this article.
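A reduced sketch of the FDBN idea: FFT-derived frequency features feeding an RBM feature layer with a softmax-style readout. One RBM stands in here for the paper's three stacked RBMs, the two-class sinusoid data is synthetic, and all layer sizes are assumptions:

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)

def trial(f):
    """Stand-in EEG trial: one dominant oscillation plus noise."""
    t = np.arange(256) / 256.0
    return np.sin(2 * np.pi * f * t) * rng.uniform(0.5, 1.5) + rng.normal(0, 0.5, 256)

# Two "imagery" classes differing in dominant frequency band.
X_time = np.array([trial(10) for _ in range(40)] + [trial(22) for _ in range(40)])
y = np.array([0] * 40 + [1] * 40)
X_freq = np.abs(np.fft.rfft(X_time, axis=1))   # frequency-domain representation

# One unsupervised RBM feature layer, then a softmax-style readout.
model = Pipeline([
    ("scale", MinMaxScaler()),                 # RBM expects values in [0, 1]
    ("rbm", BernoulliRBM(n_components=32, learning_rate=0.05,
                         n_iter=20, random_state=0)),
    ("clf", LogisticRegression(max_iter=1000)),
])
acc = cross_val_score(model, X_freq, y, cv=5).mean()
```

The full FDBN additionally fine-tunes the stacked RBM weights with backpropagation and conjugate gradients, which this pipeline does not attempt.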
ERIC Educational Resources Information Center
Roberts, Charles T., Comp.; Lichtenberger, Allan R., Comp.
This handbook has been prepared as a vehicle or mechanism for program cost accounting and as a guide to standard school accounting terminology for use in all types of local and intermediate education agencies. In addition to classification descriptions, program accounting definitions, and proration of cost procedures, some units of measure and…
Project DIPOLE WEST - Multiburst Environment (Non-Simultaneous Detonations)
1976-09-01
Purpose of the series was to obtain... HULL hydrodynamic air blast code show good correlation. ...supervision. Contributions were also made by Dr. John Dewey, University of Victoria; Mr. A. P. R. Lambert, Canadian General Electric; Mr. Charles Needham
NASA Astrophysics Data System (ADS)
Gålfalk, Magnus; Karlson, Martin; Crill, Patrick; Bastviken, David
2017-04-01
The calibration and validation of remote sensing land cover products is highly dependent on accurate ground truth data, which are costly and practically challenging to collect. This study evaluates a novel and efficient alternative to the field surveys and UAV imaging commonly applied for this task. The method consists of (i) a lightweight, waterproof, remote-controlled RGB camera mounted on an extendable monopod, used for acquiring wide-field images of the ground from a height of 4.5 meters, and (ii) a script for semi-automatic image classification. In the post-processing, the wide-field images are corrected for optical distortion and geometrically rectified so that the spatial resolution is the same over the surface area used for classification. The script distinguishes land surface components by color, brightness and spatial variability. The method was evaluated in wetland areas located around Abisko, northern Sweden. Proportional estimates of the six main surface components in the wetlands (wet and dry Sphagnum, shrub, grass, water, rock) were derived for 200 images, equivalent to 10 × 10 m field plots. These photo plots were then used as calibration data for a regional-scale satellite-based classification, which separates the six wetland surface components using a Sentinel-1 time series. The method presented in this study is accurate, rapid, robust and cost-efficient in comparison to field surveys (time-consuming) and drone mapping (which requires low wind speeds and no rain, suffers from battery-limited flight times, has potential GPS/compass errors in the far north, and is prohibited by law in some areas).
Mejia Tobar, Alejandra; Hyoudou, Rikiya; Kita, Kahori; Nakamura, Tatsuhiro; Kambara, Hiroyuki; Ogata, Yousuke; Hanakawa, Takashi; Koike, Yasuharu; Yoshimura, Natsue
2017-01-01
The classification of ankle movements from non-invasive brain recordings can be applied in a brain-computer interface (BCI) to control exoskeletons, prostheses, and functional electrical stimulators for the benefit of patients with walking impairments. In this research, ankle flexion and extension tasks at two force levels in both legs were classified from cortical current sources estimated by a hierarchical variational Bayesian method, using electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) recordings. The hierarchical prior for the current source estimation from EEG was obtained from activated brain areas and their intensities in an fMRI group (second-level) analysis. The fMRI group analysis was performed on regions of interest defined over the primary motor cortex, the supplementary motor area, and the somatosensory area, which are well known to contribute to movement control. A sparse logistic regression method was applied for a nine-class classification (eight active tasks and a resting control task), obtaining a mean accuracy of 65.64% for time series of current sources estimated from the EEG and fMRI signals using a variational Bayesian method, versus a mean accuracy of 22.19% for the classification of pre-processed EEG sensor signals, with a chance level of 11.11%. The higher classification accuracy of current sources, when compared to EEG classification accuracy, was attributed to the high number of sources and the different signal patterns obtained in the same vertex for different motor tasks. Since the inverse filter estimation for current sources can be done offline with the present method, the method is applicable to real-time BCIs.
Finally, due to the highly enhanced spatial distribution of current sources over the brain cortex, this method has the potential to identify activation patterns to design BCIs for the control of an affected limb in patients with stroke, or BCIs from motor imagery in patients with spinal cord injury.
NASA Technical Reports Server (NTRS)
Rudasill-Neigh, Christopher S.; Bolton, Douglas K.; Diabate, Mouhamad; Williams, Jennifer J.; Carvalhais, Nuno
2014-01-01
Forests contain a majority of the aboveground carbon (C) found in ecosystems, and understanding biomass lost from disturbance is essential to improve our C-cycle knowledge. Our study region in the Wisconsin and Minnesota Laurentian Forest had a strong decline in Normalized Difference Vegetation Index (NDVI) from 1982 to 2007, observed with the National Ocean and Atmospheric Administration's (NOAA) series of Advanced Very High Resolution Radiometer (AVHRR). To understand the potential role of disturbances in the terrestrial C-cycle, we developed an algorithm to map forest disturbances from either harvest or insect outbreak for Landsat time-series stacks. We merged two image analysis approaches into one algorithm to monitor forest change that included: (1) multiple disturbance index thresholds to capture clear-cut harvest; and (2) a spectral trajectory-based image analysis with multiple confidence interval thresholds to map insect outbreak. We produced 20 maps and evaluated classification accuracy with air-photos and insect air-survey data to understand the performance of our algorithm. We achieved overall accuracies ranging from 65% to 75%, with an average accuracy of 72%. The producer's and user's accuracy ranged from a maximum of 32% to 70% for insect disturbance, 60% to 76% for insect mortality and 82% to 88% for harvested forest, which was the dominant disturbance agent. Forest disturbances accounted for 22% of total forested area (7349 km2). Our algorithm provides a basic approach to map disturbance history where large impacts to forest stands have occurred and highlights the limited spectral sensitivity of Landsat time-series to outbreaks of defoliating insects. We found that only harvest and insect mortality events can be mapped with adequate accuracy with a non-annual Landsat time-series. This limited our land cover understanding of NDVI decline drivers. 
We demonstrate that to capture more subtle disturbances with spectral trajectories, future observations must be temporally dense to distinguish between type and frequency in heterogeneous landscapes.
ERIC Educational Resources Information Center
McGraw, Eugene T.
Part of a Kansas State University series on community planning and development, this monograph describes and defines the nature of urban centers as physical entities. Basic land use categories and subdivisions, functional classifications of communities in the United States (manufacturing, retail, wholesale, diversified, transportation, mining,…
A 3-tier classification of cerebral arteriovenous malformations. Clinical article.
Spetzler, Robert F; Ponce, Francisco A
2011-03-01
The authors propose a 3-tier classification for cerebral arteriovenous malformations (AVMs). The classification is based on the original 5-tier Spetzler-Martin grading system, and reflects the treatment paradigm for these lesions. The implications of this modification in the literature are explored. Class A combines Grades I and II AVMs, Class B are Grade III AVMs, and Class C combines Grades IV and V AVMs. Recommended management is surgery for Class A AVMs, multimodality treatment for Class B, and observation for Class C, with exceptions to the latter including recurrent hemorrhages and progressive neurological deficits. To evaluate whether combining grades is warranted from the perspective of surgical outcomes, the 3-tier system was applied to 1476 patients from 7 surgical series in which results were stratified according to Spetzler-Martin grades. Pairwise comparisons of individual Spetzler-Martin grades in the series analyzed showed the fewest significant differences (p < 0.05) in outcomes between Grades I and II AVMs and between Grades IV and V AVMs. In the pooled data analysis, significant differences in outcomes were found between all grades except IV and V (p = 0.38), and the lowest relative risks were found between Grades I and II (1.066) and between Grades IV and V (1.095). Using the pooled data, the predictive accuracies for surgical outcomes of the 5-tier and 3-tier systems were equivalent (receiver operating characteristic curve area 0.711 and 0.713, respectively). Combining Grades I and II AVMs and combining Grades IV and V AVMs is justified in part because the differences in surgical results between these respective pairs are small. The proposed 3-tier classification of AVMs offers simplification of the Spetzler-Martin system, provides a guide to treatment, and is predictive of outcome. 
The revised classification not only simplifies treatment recommendations; by placing patients into 3 as opposed to 5 groups, statistical power is markedly increased for series comparisons.
Grossman, Rachel; Ram, Zvi
2014-12-01
Sarcoma rarely metastasizes to the brain, and there are no specific treatment guidelines for these tumors. The recursive partitioning analysis (RPA) classification is a well-established prognostic scale used in many malignancies. In this study we assessed the clinical characteristics of metastatic sarcoma to the brain and the validity of the RPA classification system in a subset of 21 patients who underwent surgical resection of metastatic sarcoma to the brain. We retrospectively analyzed the medical, radiological, surgical, pathological, and follow-up clinical records of 21 patients who were operated on for metastatic sarcoma to the brain between 1996 and 2012. Gliosarcomas, sarcomas of the head and neck with local extension into the brain, and metastatic sarcomas to the spine were excluded from this reported series. The patients' mean age was 49.6 ± 14.2 years (range, 25-75 years) at the time of diagnosis. Sixteen patients had a known history of systemic sarcoma, mostly in the extremities, and had previously received systemic chemotherapy and radiation therapy for their primary tumor. The mean maximal tumor diameter in the brain was 4.9 ± 1.7 cm (range 1.7-7.2 cm). The group's median preoperative Karnofsky Performance Scale was 80, with 14 patients presenting with Karnofsky Performance Scale of 70 or greater. The median overall survival was 7 months (range 0.2-204 months). The median survival times stratified by the Radiation Therapy Oncology Group RPA classes were 31, 7, and 2 months for RPA class I, II, and III, respectively (P = 0.0001). This analysis is the first to support the prognostic utility of the Radiation Therapy Oncology Group RPA classification for sarcoma brain metastases, and it may be used as a treatment guideline tool in this rare disease. Copyright © 2014 Elsevier Inc. All rights reserved.
Objective classification of atmospheric circulation over southern Scandinavia
NASA Astrophysics Data System (ADS)
Linderson, Maj-Lena
2001-02-01
A method for calculating circulation indices and weather types following the Lamb classification is applied to southern Scandinavia. The main objective is to test the ability of the method to describe the atmospheric circulation over the area, and to evaluate the extent to which the pressure patterns determine local precipitation and temperature in Scania, southernmost Sweden. The weather type classification method works well and produces distinct groups. However, the variability within the group is large with regard to the location of the low pressure centres, which may have implications for the precipitation over the area. The anticyclonic weather type dominates, together with the cyclonic and westerly types. This deviates partly from the general picture for Sweden and may be explained by the southerly location of the study area. The cyclonic type is most frequent in spring, although cloudiness and amount of rain are lowest during this season. This could be explained by the occurrence of weaker cyclones or low air humidity during this time of year. Local temperature and precipitation were modelled by stepwise regression for each season, designating weather types as independent variables. Only the winter season-modelled temperature and precipitation show a high and robust correspondence to the observed temperature and precipitation, even though <60% of the precipitation variance is explained. In the other seasons, the connection between atmospheric circulation and the local temperature and precipitation is low. Other meteorological parameters may need to be taken into account. The time and space resolution of the mean sea level pressure (MSLP) grid may affect the results, as many important features might not be covered by the classification. Local physiography may also influence the local climate in a way that cannot be described by the atmospheric circulation pattern alone, stressing the importance of using more than one observation series.
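An objective Lamb-type classification derives flow and vorticity indices from gridded MSLP and maps them to weather types. A heavily simplified sketch (3×3 grid, geostrophic constants and latitude scaling omitted, thresholds purely illustrative, not the Jenkinson-Collison coefficients):

```python
import numpy as np

def circulation_indices(p):
    """Simplified circulation indices from a 3x3 MSLP grid (hPa):
    westerly/southerly flow from pressure differences and a crude
    vorticity from second differences; physical constants omitted."""
    W = p[0, 1] - p[2, 1]          # north-south difference -> westerly flow
    S = p[1, 2] - p[1, 0]          # east-west difference -> southerly flow
    F = np.hypot(W, S)             # total flow strength
    Z = (p[0, 1] + p[2, 1] + p[1, 0] + p[1, 2]) - 4 * p[1, 1]  # vorticity proxy
    return W, S, F, Z

def weather_type(p, flow_threshold=6.0):
    W, S, F, Z = circulation_indices(p)
    if abs(Z) > 2 * F:             # rotation dominates the flow
        return "cyclonic" if Z > 0 else "anticyclonic"
    if F < flow_threshold:
        return "unclassified"
    angle = np.degrees(np.arctan2(W, S)) % 360   # crude compass label
    return ["N", "NE", "E", "SE", "S", "SW", "W", "NW"][int((angle + 22.5) // 45) % 8]

# A low-pressure centre at the middle grid point yields a cyclonic type.
low = np.array([[1012.0, 1010, 1012],
                [1010, 1000, 1010],
                [1012, 1010, 1012]])
```

The full scheme also distinguishes hybrid types (e.g. cyclonic-westerly) when rotation and flow are comparable, which this sketch collapses into the pure classes.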
NASA Astrophysics Data System (ADS)
Gallager, S. M.
2016-02-01
Marine ecosystems are changing at a variety of time scales as a function of a diverse suite of forcing functions both natural and anthropogenic. Establishing a continuous plankton time series consisting of scales from rapid (seconds) to long-term (decades), provides a sentinel for ecosystem change. The key is to measure plankton biodiversity at sufficiently fast time scales that allow disentanglement of physical (transport) and biological (growth) properties of an ecosystem. CPICS is a plankton and particle imaging microscope system that is designed to produce crisp darkfield images of light scattering material in aquatic environments. The open flow design is non-invasive and non-restrictive providing images of very fragile plankton in their natural orientation. Several magnifications are possible from 0.5 to 5x forming a field of view of 10cm to 1mm, respectively. CPICS has been installed on several cabled observing systems called OceanCubes off the coast of Okinawa and Tokyo, Japan providing a continuous stream of plankton images to a machine vision image classifier located at WHOI. Image features include custom algorithms for texture, color pattern, morphology and shape, which are extracted from in-focus target. The features are then used to train a classifier (e.g., Random Forest) resulting in classifications that are tested using cross-validation, confusion matrices and ROC curves. High (>90%) classification accuracies are possible depending on the number of training categories and target complexity. A web-based utility allows access to raw images, training sets, classifiers and classification results. Combined with chemical and physical data from the observatories, an ecologically meaningful plankton index of biodiversity and its variance is developed using a combination of species and taxon groups, which provides an approach for understanding ecosystem change without the need to identify all targets to species. http://oceancubes.whoi.edu/instruments/CPICS
NASA Astrophysics Data System (ADS)
Akhoondzadeh, M.
2013-04-01
In this paper, a number of classical and intelligent methods, including interquartile, autoregressive integrated moving average (ARIMA), artificial neural network (ANN) and support vector machine (SVM), have been proposed to quantify potential thermal anomalies around the time of the 11 August 2012 Varzeghan, Iran, earthquake (Mw = 6.4). The duration of the data set, which comprises Aqua-MODIS land surface temperature (LST) night-time snapshot images, is 62 days. In order to quantify variations of LST data obtained from satellite images, the air temperature (AT) data derived from the meteorological station close to the earthquake epicenter has been taken into account. For the models examined here, results indicate the following: (i) ARIMA models, which are the most widely used in the time series community for short-term forecasting, are quickly and easily implemented, and can efficiently act through linear solutions. (ii) A multilayer perceptron (MLP) feed-forward neural network can be a suitable non-parametric method to detect the anomalous changes of a non-linear time series such as variations of LST. (iii) Since SVMs are often used due to their many advantages for classification and regression tasks, it can be shown that, if the difference between the predicted value using the SVM method and the observed value exceeds the pre-defined threshold value, then the observed value could be regarded as an anomaly. (iv) ANN and SVM methods could be powerful tools in modeling complex phenomena such as earthquake precursor time series where we may not know what the underlying data generating process is. There is good agreement in the results obtained from the different methods for quantifying potential anomalies in a given LST time series. This paper indicates that the detection of the potential thermal anomalies derives credibility from the overall efficiencies and potentialities of the four integrated methods.
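The anomaly rule described in (iii), flagging an observation when it deviates from the model prediction by more than a pre-defined threshold, can be sketched generically. The series is synthetic (not MODIS LST), and the residual-standard-deviation threshold is an assumed concrete choice:

```python
import numpy as np

def flag_anomalies(observed, predicted, k=2.0):
    """Flag observations whose deviation from the model prediction
    exceeds k standard deviations of the residuals -- the thresholding
    rule applied to the SVM/ANN predictions."""
    resid = observed - predicted
    thresh = k * np.std(resid)
    return np.abs(resid) > thresh

# Synthetic LST-like series: smooth seasonal signal plus noise, with one
# injected "thermal anomaly" near the end (illustrative only).
rng = np.random.default_rng(5)
t = np.arange(62)
base = 20 + 3 * np.sin(2 * np.pi * t / 62)
observed = base + rng.normal(0, 0.3, 62)
observed[50] += 4.0                  # anomalous jump
predicted = base                     # stand-in for the fitted model's output

flags = flag_anomalies(observed, predicted)
```

Any of the four predictors (interquartile, ARIMA, MLP, SVM) can supply `predicted`; the flagging rule itself is model-agnostic.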
NASA Technical Reports Server (NTRS)
R.Neigh, Christopher S.; Bolton, Douglas K.; Williams, Jennifer J.; Diabate, Mouhamad
2014-01-01
Forests are the largest aboveground sink for atmospheric carbon (C), and understanding how they change through time is critical to reduce our C-cycle uncertainties. We investigated a strong decline in Normalized Difference Vegetation Index (NDVI) from 1982 to 1991 in Pacific Northwest forests, observed with the National Ocean and Atmospheric Administration's (NOAA) series of Advanced Very High Resolution Radiometers (AVHRRs). To understand the causal factors of this decline, we evaluated an automated classification method developed for Landsat time series stacks (LTSS) to map forest change. This method included: (1) multiple disturbance index thresholds; and (2) a spectral trajectory-based image analysis with multiple confidence thresholds. We produced 48 maps and verified their accuracy with air photos, Monitoring Trends in Burn Severity data and insect aerial detection survey data. Area-based accuracy estimates for change in forest cover resulted in producer's and user's accuracies of 0.21 +/- 0.06 to 0.38 +/- 0.05 for insect disturbance, 0.23 +/- 0.07 to 1 +/- 0 for burned area and 0.74 +/- 0.03 to 0.76 +/- 0.03 for logging. We believe that accuracy was low for insect disturbance because air photo reference data were temporally sparse, hence missing some outbreaks, and the annual anniversary time step is not dense enough to track defoliation and progressive stand mortality. Producer's and user's accuracy for burned area was low due to the temporally abrupt nature of fire and harvest, with a similar response of spectral indices between the disturbance index and normalized burn ratio. We conclude that the spectral trajectory approach also captures multi-year stress that could be caused by climate, acid deposition, pathogens, partial harvest, thinning, etc. Our study focused on understanding the transferability of previously successful methods to new ecosystems and found that this automated method does not perform with the same accuracy in Pacific Northwest forests.
Using a robust accuracy assessment, we demonstrate the difficulty of transferring change attribution methods to other ecosystems, which has implications for the development of automated detection/attribution approaches. Widespread disturbance was found within AVHRR-negative anomalies, but identifying causal factors in LTSS with adequate mapping accuracy for fire and insects proved to be elusive. Our results provide a background framework for future studies to improve methods for the accuracy assessment of automated LTSS classifications.
ERIC Educational Resources Information Center
Duzen, Carl; And Others
1992-01-01
Presents a series of activities that utilizes a leveling device to classify constant and accelerated motion. Applies this classification system to uniform circular motion and motion produced by gravitational force. (MDH)
Classification and authentication of unknown water samples using machine learning algorithms.
Kundu, Palash K; Panchariya, P C; Kundu, Madhusree
2011-07-01
This paper proposes the development of water sample classification and authentication in real-life settings, based on machine learning algorithms. The proposed techniques use experimental measurements from a pulse voltammetry method based on an electronic tongue (E-tongue) instrumentation system with silver and platinum electrodes. E-tongues include arrays of solid-state ion sensors, transducers (even of different types), data collectors and data analysis tools, all oriented to the classification of liquid samples and the authentication of unknown liquid samples. The time series signal and the corresponding raw data represent the measurement from a multi-sensor system. The E-tongue system, implemented in a laboratory environment for six different ISI (Bureau of Indian Standards) certified water samples (Aquafina, Bisleri, Kingfisher, Oasis, Dolphin, and McDowell), was the data source for developing two types of machine learning algorithms: classification and regression. A water data set consisting of six sample classes, each containing 4402 features, was considered. A PCA (principal component analysis) based classification and authentication tool was developed in this study as the machine learning component of the E-tongue system. A partial least squares (PLS) based classifier, dedicated to authenticating a specific category of water sample, was also developed as an integral part of the E-tongue instrumentation system. The developed PCA- and PLS-based E-tongue system achieved encouraging overall authentication accuracy, performing excellently for the aforesaid categories of water samples. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
Empirically Estimable Classification Bounds Based on a Nonparametric Divergence Measure
Berisha, Visar; Wisler, Alan; Hero, Alfred O.; Spanias, Andreas
2015-01-01
Information divergence functions play a critical role in statistics and information theory. In this paper we show that a non-parametric f-divergence measure can be used to provide improved bounds on the minimum binary classification probability of error for the case when the training and test data are drawn from the same distribution and for the case where there exists some mismatch between training and test distributions. We confirm the theoretical results by designing feature selection algorithms using the criteria from these bounds and by evaluating the algorithms on a series of pathological speech classification tasks. PMID:26807014
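The nonparametric divergence in this line of work is typically estimated from the Friedman-Rafsky statistic: build a minimum spanning tree over the pooled sample and count edges that join the two samples. A sketch of that estimator (the 1 − R(m+n)/2mn normalization below should be treated as illustrative of the family, not as this paper's exact definition):

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

def dp_divergence(X, Y):
    """MST-based (Friedman-Rafsky) divergence estimate between two
    samples: 1 - R*(m+n)/(2*m*n), where R counts MST edges joining
    points from different samples. Near 0 for identical distributions,
    near 1 for well-separated ones."""
    m, n = len(X), len(Y)
    Z = np.vstack([X, Y])
    labels = np.array([0] * m + [1] * n)
    mst = minimum_spanning_tree(squareform(pdist(Z))).tocoo()
    R = np.sum(labels[mst.row] != labels[mst.col])
    return max(0.0, 1.0 - R * (m + n) / (2.0 * m * n))

rng = np.random.default_rng(7)
# Identical distributions -> divergence near 0; well-separated -> near 1.
same = dp_divergence(rng.normal(0, 1, (100, 2)), rng.normal(0, 1, (100, 2)))
far = dp_divergence(rng.normal(0, 1, (100, 2)), rng.normal(5, 1, (100, 2)))
```

Because the estimate is computed directly from data, it can score candidate feature subsets (larger estimated divergence implying a lower achievable classification error bound), which is how the paper's feature selection experiments use it.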
Oesophageal diverticula: principles of management and appraisal of classification.
Borrie, J; Wilson, R L
1980-01-01
In this paper we review a consecutive series of 50 oesophageal diverticula, appraise clinical features and methods of management, and suggest an improvement on the World Health Organization classification. The link between oesophageal diverticula and motor disorders as assessed by oesophageal manometry is stressed. It is necessary to correct the functional disorder as well as the diverticulum if it is causing symptoms. A revised classification could be as follows: congenital--single or multiple; acquired--single (cricopharyngeal, mid-oesophageal, epiphrenic, other) or multiple (for example, when cricopharyngeal and mid-oesophageal diverticula present together, or when there is intramural diverticulosis). PMID:6781091
A neural network approach to burst detection.
Mounce, S R; Day, A J; Wood, A S; Khan, A; Widdop, P D; Machell, J
2002-01-01
This paper describes how hydraulic and water quality data from a distribution network may be used to provide a more efficient leakage management capability for the water industry. The research presented concerns the application of artificial neural networks to the detection and location of leakage in treated water distribution systems. An architecture for an Artificial Neural Network (ANN) based system is outlined. The neural network uses time series data produced by sensors to directly construct an empirical model for prediction and classification of leaks. Results are presented using data from an experimental site in Yorkshire Water's Keighley distribution system.
Commission 45: Spectral Classification
NASA Astrophysics Data System (ADS)
Giridhar, Sunetra; Gray, Richard O.; Corbally, Christopher J.; Bailer-Jones, Coryn A. L.; Eyer, Laurent; Irwin, Michael J.; Kirkpatrick, J. Davy; Majewski, Steven; Minniti, Dante; Nordström, Birgitta
This report gives an update of developments (since the last General Assembly at Prague) in the areas that are of relevance to the commission. In addition to numerous papers, a new monograph entitled Stellar Spectral Classification with Richard Gray and Chris Corbally as leading authors will be published by Princeton University Press as part of their Princeton Series in Astrophysics in April 2009. This book is an up-to-date and encyclopedic review of stellar spectral classification across the H-R diagram, including the traditional MK system in the blue-violet, recent extensions into the ultraviolet and infrared, the newly defined L-type and T-type spectral classes, as well as spectral classification of carbon stars, S-type stars, white dwarfs, novae, supernovae and Wolf-Rayet stars.
A nonlinear heartbeat dynamics model approach for personalized emotion recognition.
Valenza, Gaetano; Citi, Luca; Lanatà, Antonio; Scilingo, Enzo Pasquale; Barbieri, Riccardo
2013-01-01
Emotion recognition based on autonomic nervous system signs is one of the ambitious goals of affective computing. It is well-accepted that standard signal processing techniques require relatively long time series of multivariate records to ensure reliability and robustness of recognition and classification algorithms. In this work, we present a novel methodology able to assess cardiovascular dynamics during short-time (i.e. < 10 seconds) affective stimuli, thus overcoming some of the limitations of current emotion recognition approaches. We developed a personalized, fully parametric probabilistic framework based on point-process theory where heartbeat events are modelled using a second-order nonlinear autoregressive integrative structure in order to achieve effective performances in short-time affective assessment. Experimental results show a comprehensive emotional characterization of 4 subjects undergoing a passive affective elicitation using a sequence of standardized images gathered from the International Affective Picture System. Each picture was identified by the IAPS arousal and valence scores as well as by a self-reported emotional label associating a subjective positive or negative emotion. Results show a clear classification of two defined levels of arousal, valence and self-emotional state using features coming from the instantaneous spectrum and bispectrum of the considered RR intervals, reaching up to 90% recognition accuracy.
NASA Astrophysics Data System (ADS)
Fleig, Anne K.; Tallaksen, Lena M.; Hisdal, Hege; Stahl, Kerstin; Hannah, David M.
Classifications of weather and circulation patterns are often applied in research seeking to relate atmospheric state to surface environmental phenomena. However, numerous procedures have been applied to define the patterns, thus limiting comparability between studies. The COST733 Action “Harmonisation and Applications of Weather Type Classifications for European regions” tests 73 different weather type classifications (WTC) and their associated weather types (WTs) and compares the WTCs’ utility for various applications. The objective of this study is to evaluate the potential of these WTCs for analysis of regional hydrological drought development in north-western Europe. Hydrological drought is defined in terms of a Regional Drought Area Index (RDAI), which is based on deficits derived from daily river flow series. RDAI series (1964-2001) were calculated for four homogeneous regions in Great Britain and two in Denmark. For each region, WTs associated with hydrological drought development were identified based on antecedent and concurrent WT-frequencies for major drought events. The utility of the different WTCs for the study of hydrological drought development was evaluated, and the influence of WTC attributes, i.e. input variables, number of defined WTs and general classification concept, on WTC performance was assessed. The objective Grosswetterlagen (OGWL), the objective Second-Generation Lamb Weather Type Classification (LWT2) with 18 WTs and two implementations of the objective Wetterlagenklassifikation (WLK; with 40 and 28 WTs) outperformed all other WTCs. In general, WTCs with more WTs (⩾27) were found to perform better than WTCs with fewer (⩽18) WTs. The influence of input variables was not consistent across the different classification procedures, and the performance of a WTC was determined primarily by the classification procedure itself.
Overall, classification procedures following the relatively simple general classification concept of predefining WTs based on thresholds performed better than those based on more sophisticated classification concepts such as deriving WTs by cluster analysis or artificial neural networks. In particular, PCA-based WTCs with 9 WTs and automated WTCs with a high number of predefined WTs (subjectively and threshold based) performed well. It is suggested that the explicit consideration of the air flow characteristics of meridionality, zonality and cyclonicity in the definition of WTs is a useful feature for a WTC when analysing regional hydrological drought development.
Effects of uncertainty and variability on population declines and IUCN Red List classifications.
Rueda-Cediel, Pamela; Anderson, Kurt E; Regan, Tracey J; Regan, Helen M
2018-01-22
The International Union for Conservation of Nature (IUCN) Red List Categories and Criteria is a quantitative framework for classifying species according to extinction risk. Population models may be used to estimate extinction risk or population declines. Uncertainty and variability arise in threat classifications through measurement and process error in empirical data and uncertainty in the models used to estimate extinction risk and population declines. Furthermore, species traits are known to affect extinction risk. We investigated the effects of measurement and process error, model type, population growth rate, and age at first reproduction on the reliability of IUCN Red List classifications based on projected population declines. We used an age-structured population model to simulate true population trajectories with different growth rates, reproductive ages and levels of variation, and subjected them to measurement error. We evaluated the ability of scalar and matrix models parameterized with these simulated time series to accurately capture the IUCN Red List classification generated with true population declines. Under all levels of measurement error tested and low process error, classifications were reasonably accurate; scalar and matrix models yielded roughly the same rate of misclassifications, but the distribution of errors differed: matrix models overestimated extinction risk more often than they underestimated it; process error tended to contribute to misclassifications to a greater extent than measurement error; and more misclassifications occurred for fast, rather than slow, life histories. These results indicate that classifications of highly threatened taxa (i.e., taxa with low growth rates) under criterion A are more likely to be reliable than those for less threatened taxa when assessed with population models. 
Greater scrutiny needs to be placed on data used to parameterize population models for species with high growth rates, particularly when available evidence indicates a potential transition to higher risk categories. © 2018 Society for Conservation Biology.
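A minimal sketch of the kind of simulation described above: estimated declines are classified against the headline IUCN criterion A percentage thresholds (30/50/80%), and multiplicative measurement error is added to a true decline to see how often the assigned category changes. The error model and its parameters are illustrative, not those of the study:

```python
import random

def red_list_category(decline_pct):
    """Map an estimated percentage decline to a category using the
    headline IUCN criterion A thresholds (the real criteria also involve
    time windows, subcriteria and other lines of evidence)."""
    if decline_pct >= 80:
        return "CR"    # Critically Endangered
    if decline_pct >= 50:
        return "EN"    # Endangered
    if decline_pct >= 30:
        return "VU"    # Vulnerable
    return "LC/NT"

def observed_decline(true_decline, sigma, rng):
    """Perturb the true decline with multiplicative lognormal
    measurement error (an illustrative error model)."""
    return true_decline * rng.lognormvariate(0.0, sigma)

rng = random.Random(42)
true_decline = 55.0    # a decline that should classify as Endangered
cats = [red_list_category(observed_decline(true_decline, 0.2, rng))
        for _ in range(1000)]
misclassification_rate = sum(c != "EN" for c in cats) / len(cats)
```

Because 55% sits close to the 50% threshold, even modest measurement error pushes a sizeable share of trials into the wrong category, which is the qualitative effect the study quantifies.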
A Brief History of Soil Mapping and Classification in the USA
NASA Astrophysics Data System (ADS)
Brevik, Eric C.; Hartemink, Alfred E.
2014-05-01
Soil maps show the distribution of soils across an area but also depict soil science theory and ideas on soil formation and classification at the time the maps were created. The national soil mapping program in the USA was established in 1899. The first nation-wide soil map was published by M. Whitney in 1909 and showed soil provinces that were largely based on geology. In 1912, G.N. Coffey published the first country-wide map based on soil properties. The map showed 5 broad soil units that used parent material, color and drainage as diagnostic criteria. The 1913 national map was produced by C.F. Marbut, H.H. Bennett, J.E. Lapham, and M.H. Lapham and showed broad physiographic units that were further subdivided into soil series, soil classes and soil types. In 1935, Marbut drafted a series of maps based on soil properties, but these maps were replaced as official U.S. soil maps in 1938 with the work of M. Baldwin, C.E. Kellogg, and J. Thorp. A series of soil maps similar to modern USA maps appeared in the 1960s with the 7th Approximation followed by revisions with the 1975 and 1999 editions of Soil Taxonomy. This review has shown that soil maps in the United States produced since the early 1900s moved initially from a geologic-based concept to a pedologic concept of soils. Later changes were from property-based systems to process-based, and then back to property-based. The information in this presentation is based on Brevik and Hartemink (2013). Brevik, E.C., and A.E. Hartemink. 2013. Soil Maps of the United States of America. Soil Science Society of America Journal 77:1117-1132. doi:10.2136/sssaj2012.0390.
A Global Classification of Contemporary Fire Regimes
NASA Astrophysics Data System (ADS)
Norman, S. P.; Kumar, J.; Hargrove, W. W.; Hoffman, F. M.
2014-12-01
Fire regimes provide a sensitive indicator of changes in climate and human use, as the concept includes fire extent, season, frequency, and intensity. Fires that occur outside the distribution of one or more aspects of a fire regime may affect ecosystem resilience. However, global scale data related to these varied aspects of fire regimes are highly inconsistent due to incomplete or inconsistent reporting. In this study, we derive a globally applicable approach to characterizing similar fire regimes using long geophysical time series, namely MODIS hotspots since 2000. K-means non-hierarchical clustering was used to generate empirically based groups that minimized within-cluster variability. Satellite-based fire detections are known to have shortcomings, including under-detection from obscuring smoke, clouds or dense canopy cover and rapid spread rates, as often occurs with flashy fuels or during extreme weather. The resulting regions are nonetheless free from preconceptions, as the empirical, data-mining approach used on this relatively uniform data source allows the region structures to emerge from the data themselves. Comparing such an empirical classification to expectations from climate, phenology, land use or development-based models can help us interpret the similarities and differences among places and how they provide different indicators of changes of concern. Classifications can help identify ahead of time where large infrequent mega-fires are likely to occur, such as in the boreal forest and portions of the Interior US West, and where fire reports are incomplete, such as in less industrialized countries.
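The clustering step can be illustrated with a plain k-means pass over synthetic fire-regime feature vectors. The feature dimensions and data here are hypothetical stand-ins; the study's actual inputs are MODIS hotspot time series:

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain k-means: alternate nearest-centroid assignment and centroid
    update, which minimises within-cluster variance. Centroids start from
    k evenly spaced samples, a simple deterministic choice."""
    centroids = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):   # guard against empty clusters
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Hypothetical fire-regime features (e.g. frequency, season length, intensity)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (40, 3)),
               rng.normal(2.0, 0.3, (40, 3))])
labels, centroids = kmeans(X, k=2)
```

On such cleanly separated toy data the two generating groups are recovered exactly; real hotspot time series would need more clusters and careful feature scaling.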
Classification of Educational Systems in OECD Member Countries: Canada, Greece, Yugoslavia.
ERIC Educational Resources Information Center
Organisation for Economic Cooperation and Development, Paris (France).
The present volume is one of a series intended to provide a comparative view of the educational systems of member countries of the Organisation for Economic Cooperation and Development (OECD). The purpose of the series is to assist OECD member countries in the development of their educational statistics and to provide a basis for the collection of…
ERIC Educational Resources Information Center
Camp, Carole Ann, Ed.
This booklet, one of six in the Living Things Science series, presents activities about diversity and classification of living things which address basic "Benchmarks" suggested by the American Association for the Advancement of Science for the Living Environment for grades 3-5. Contents include background information, vocabulary (in…
Park, Sunjoo; Yi, Hongtao; Feiock, Richard C
2015-12-01
Measuring and tracking the numbers of jobs in solid waste management and recycling industries over time provide basic data to inform decision makers about the important role played by this sector in a state or region's 'green economy'. This study estimates the number of people employed in the solid waste and recycling industry from 1989 through 2011 in the state of Florida (USA), applying a classification scheme based on the Standard Industrial Code (SIC) and utilizing the National Establishment Time Series (NETS) database. The results indicate that solid waste and recycling jobs in the private sector steadily increased from 1989 to 2011, whereas government employment for solid waste management fluctuated over the same period. © The Author(s) 2015.
Carmona-Bayonas, A; Jiménez-Fonseca, P; Virizuela Echaburu, J; Sánchez Cánovas, M; Ayala de la Peña, F
2017-09-01
Since its publication more than 15 years ago, the MASCC score has been internationally validated numerous times and recommended by most clinical practice guidelines for the management of febrile neutropenia (FN) around the world. We have used an empirical data-supported simulated scenario to demonstrate that, despite this, the MASCC score is impractical as a basis for decision-making. A detailed analysis of the reasons supporting the clinical irrelevance of this model is performed. First, seven of its eight variables are "innocent bystanders" that contribute little to selecting low-risk candidates for ambulatory management. Second, the training series was hardly representative of outpatients with solid tumors and low-risk FN. Finally, the simultaneous inclusion of key variables both in the model and in the outcome explains its successful validation in various series of patients. Alternative methods of prognostic classification, such as the Clinical Index of Stable Febrile Neutropenia, have been specifically validated for patients with solid tumors and should replace the MASCC model in situations of clinical uncertainty.
NASA Astrophysics Data System (ADS)
Wang, Yawen; Wild, Martin
2017-02-01
During 1990-1993, a nation-wide replacement of the instruments measuring surface solar radiation (SSR) and a restructuring of SSR stations took place in China. Meanwhile, a sudden upward jump was noted in published composite time series of observed SSR records in this period. This study clarifies that about 1/3 of the magnitude of the SSR jump in China was accidentally caused by the abandonment/establishment of 51 stations (˜39% of the total) during the period of 1990-1993. The remaining 2/3 of the SSR jump was caused by only 22 stations, detected by the methods of the accumulated deviation curve and the Mann-Whitney U test. Of these 22 stations, 6 accounted for about 1/4 of the SSR jump due to natural factors, as similar variations were recorded by sunshine duration. The remaining 16 stations accounted for the other 3/4 as a result of artificial factors such as instrument replacement, changes in the classification or location of stations, or potential operational errors.
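The accumulated deviation curve used above can be sketched in a few lines: cumulate departures from the series mean and take the index where the curve's magnitude peaks as the candidate change point. The series below is a synthetic level shift, not the SSR records, and the Mann-Whitney U significance check the study pairs with this screen is omitted:

```python
def change_point(series):
    """Accumulated deviation curve: cumulative sum of departures from the
    overall mean; the most likely single change point is where the curve's
    absolute value peaks."""
    mean = sum(series) / len(series)
    s, curve = 0.0, []
    for x in series:
        s += x - mean
        curve.append(s)
    peak = max(range(len(curve)), key=lambda i: abs(curve[i]))
    return peak, curve

# Synthetic SSR-like record: an upward level shift after index 59
series = [100.0] * 60 + [115.0] * 40
idx, curve = change_point(series)
```

For a clean step the curve descends until the shift and climbs back afterwards, so its extremum lands exactly at the last pre-shift index.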
NASA Astrophysics Data System (ADS)
Khosravi, Farhad; Trainor, Patrick J.; Lambert, Christopher; Kloecker, Goetz; Wickstrom, Eric; Rai, Shesh N.; Panchapakesan, Balaji
2016-11-01
We demonstrate the rapid and label-free capture of breast cancer cells spiked in blood using nanotube-antibody micro-arrays. 76-element single wall carbon nanotube arrays were manufactured using photo-lithography, metal deposition, and etching techniques. Anti-epithelial cell adhesion molecule (anti-EpCAM), anti-human epithelial growth factor receptor 2 (anti-Her2) and non-specific IgG antibodies were functionalized to the surface of the nanotube devices using 1-pyrene-butanoic acid succinimidyl ester. Following device functionalization, blood spiked with SKBR3, MCF7 and MCF10A cells (100/1000 cells per 5 μl per device, 170 elements totaling 0.85 ml of whole blood) was adsorbed onto the nanotube device arrays. Electrical signatures were recorded from each device to screen the samples for differences in interaction (specific or non-specific) between samples and devices. A zone classification scheme enabled the classification of all 170 elements in a single map. A kernel-based statistical classifier for the ‘liquid biopsy’ was developed to create a predictive model based on dynamic time warping of the signal series, classifying device electrical signals that corresponded to plain blood (control) or SKBR3-spiked blood (case) on anti-Her2 functionalized devices with ˜90% sensitivity and 90% specificity in the capture of 1000 SKBR3 breast cancer cells in blood. Screened devices that gave positive electrical signatures were confirmed using optical/confocal microscopy to hold spiked cancer cells. Confocal microscopic analysis of devices that were classified to hold spiked blood based on their electrical signatures confirmed the presence of cancer cells through staining for DAPI (nuclei), cytokeratin (cancer cells) and CD45 (hematologic cells) with single cell sensitivity. 
We report 55%-100% cancer cell capture yield depending on the active device area for blood adsorption with mean of 62% (˜12 500 captured off 20 000 spiked cells in 0.1 ml blood) in this first nanotube-CTC chip study.
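Dynamic time warping, the series-comparison step underlying the classifier above, can be sketched as a standard dynamic program (a textbook implementation on toy sequences, not the study's code):

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance between two
    numeric sequences: the minimum summed pointwise cost over all
    monotonic alignments."""
    inf = float("inf")
    n, m = len(a), len(b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],       # stretch a
                                 cost[i][j - 1],       # stretch b
                                 cost[i - 1][j - 1])   # match step
    return cost[n][m]

signal = [0, 1, 2, 1, 0, 0]
shifted = [0, 0, 1, 2, 1, 0]    # same shape, delayed by one sample
other = [5, 5, 5, 5, 5, 5]      # a genuinely different signal

same = dtw_distance(signal, signal)
near = dtw_distance(signal, shifted)
far = dtw_distance(signal, other)
```

DTW's value for this kind of sensor data is visible even here: the time-shifted copy warps onto the original at zero cost, where a pointwise (Euclidean) comparison would penalise the lag.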
A Maple package for improved global mapping forecast
NASA Astrophysics Data System (ADS)
Carli, H.; Duarte, L. G. S.; da Mota, L. A. C. P.
2014-03-01
We present a Maple implementation of the well-known global approach to time series analysis and some further developments designed to improve the computational efficiency of the forecasting capabilities of the approach. This global approach can be summarized as a reconstruction of the phase space based on a time-ordered series of data obtained from the system. After that, using the reconstructed vectors, a portion of this space is used to produce a mapping, a polynomial fitting obtained through a minimization procedure, that represents the system and can be employed to forecast further entries for the series. In the present implementation, we introduce a set of commands, or tools, to perform all these tasks. For example, the command VecTS deals mainly with the reconstruction of the vector in the phase space. The command GfiTS deals with producing the minimization and the fitting. ForecasTS uses all these and produces the prediction of the next entries. For the non-standard algorithms, we here present two commands, IforecasTS and NiforecasTS, which deal with one-step and N-step forecasting, respectively. Finally, we introduce two further tools to aid the forecasting. The commands GfiTS and AnalysTS perform an analysis of the behavior of each portion of a series regarding the settings used on the commands just mentioned above. Catalogue identifier: AERW_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERW_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 3001 No. of bytes in distributed program, including test data, etc.: 95018 Distribution format: tar.gz Programming language: Maple 14. Computer: Any capable of running Maple. Operating system: Any capable of running Maple. Tested on Windows ME, Windows XP, Windows 7. 
RAM: 128 MB Classification: 4.3, 4.9, 5 Nature of problem: Time series analysis and improving forecast capability. Solution method: The method of solution is partially based on a result published in [1]. Restrictions: If the time series that is being analyzed presents a great amount of noise or if the dynamical system behind the time series is of high dimensionality (Dim≫3), then the method may not work well. Unusual features: Our implementation can, in the cases where the dynamics behind the time series is given by a system of low dimensionality, greatly improve the forecast. Running time: This depends strongly on the command that is being used. References: [1] Barbosa, L.M.C.R., Duarte, L.G.S., Linhares, C.A. and da Mota, L.A.C.P., Improving the global fitting method on nonlinear time series analysis, Phys. Rev. E 74, 026702 (2006).
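The reconstruction-and-fit idea can be sketched outside Maple as well: delay-embed the series into phase-space vectors, fit a global map from each embedded vector to the next sample by least squares (shown here at polynomial degree 1; higher degrees would add monomial columns), and apply the map to forecast. This is an illustrative NumPy analogue, not the package itself:

```python
import numpy as np

def embed(series, dim):
    """Delay-embed a scalar series into phase-space vectors of length dim."""
    return np.array([series[i:i + dim] for i in range(len(series) - dim)])

def fit_global_map(series, dim):
    """Least-squares fit of a global affine map sending each delay vector
    to the next sample (the degree-1 case of a polynomial fitting)."""
    X = embed(series, dim)
    y = series[dim:]
    A = np.hstack([X, np.ones((len(X), 1))])   # constant (affine) term
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def forecast_next(series, coef, dim):
    """Apply the fitted map to the last delay vector to predict one step."""
    return np.append(series[-dim:], 1.0) @ coef

t = np.linspace(0.0, 20.0, 400)
series = np.sin(t)    # a clean, low-dimensional test signal
coef = fit_global_map(series, dim=3)
pred = forecast_next(series, coef, dim=3)
true_next = np.sin(t[-1] + (t[1] - t[0]))
```

A uniformly sampled sinusoid satisfies an exact linear recurrence, so the degree-1 global map already forecasts it to machine precision; noisy or higher-dimensional dynamics would need higher-degree fits, which is where the restrictions noted above apply.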
Windsor, Richard
2010-01-01
Objectives. We evaluated the impact of a safety training regulation, implemented by the US Department of Labor's Mine Safety and Health Administration (MSHA) in 1999, on injury rates at stone, sand, and gravel mining operations. Methods. We applied a time-series design and analyses with quarterly counts of nonfatal injuries and employment hours from 7998 surface aggregate mines from 1995 through 2006. Covariates included standard industrial classification codes, ownership, and injury severity. Results. Overall crude rates of injuries declined over the 12-year period. Reductions in incident rates for medical treatment only, restricted duty, and lost-time injuries were consistent with temporal trends and provided no evidence of an intervention effect attributable to the MSHA regulation. Rates of permanently disabling injuries (PDIs) declined markedly. Regression analyses documented a statistically significant reduction in the risk rate in the postintervention time period (risk rate = 0.591; 95% confidence interval = 0.529, 0.661). Conclusions. Although a causal relationship between the regulatory intervention and the decline in the rate of PDIs is plausible, inconsistency in the results with the other injury-severity categories precludes attributing the observed outcome to the MSHA regulation. Further analyses of these data are needed. PMID:20466960
Monforton, Celeste; Windsor, Richard
2010-07-01
We evaluated the impact of a safety training regulation, implemented by the US Department of Labor's Mine Safety and Health Administration (MSHA) in 1999, on injury rates at stone, sand, and gravel mining operations. We applied a time-series design and analyses with quarterly counts of nonfatal injuries and employment hours from 7998 surface aggregate mines from 1995 through 2006. Covariates included standard industrial classification codes, ownership, and injury severity. Overall crude rates of injuries declined over the 12-year period. Reductions in incident rates for medical treatment only, restricted duty, and lost-time injuries were consistent with temporal trends and provided no evidence of an intervention effect attributable to the MSHA regulation. Rates of permanently disabling injuries (PDIs) declined markedly. Regression analyses documented a statistically significant reduction in the risk rate in the postintervention time period (risk rate = 0.591; 95% confidence interval = 0.529, 0.661). Although a causal relationship between the regulatory intervention and the decline in the rate of PDIs is plausible, inconsistency in the results with the other injury-severity categories precludes attributing the observed outcome to the MSHA regulation. Further analyses of these data are needed.
Liarokapis, Minas V; Artemiadis, Panagiotis K; Kyriakopoulos, Kostas J; Manolakos, Elias S
2013-09-01
A learning scheme based on random forests is used to discriminate between different reach-to-grasp movements in 3-D space, based on the myoelectric activity of human muscles of the upper arm and the forearm. Task specificity for motion decoding is introduced at two different levels: the subspace to move toward and the object to be grasped. The discrimination between the different reach-to-grasp strategies is accomplished with machine learning techniques for classification. The classification decision is then used to trigger an EMG-based task-specific motion decoding model. Task-specific models outperform "general" models, providing better estimation accuracy. Thus, the proposed scheme takes advantage of a framework incorporating both a classifier and a regressor that cooperate advantageously in order to split the task space. The proposed learning scheme can easily be applied to a series of EMG-based interfaces that must operate in real time, providing data-driven capabilities for the multiclass problems that occur in complex everyday-life environments.
Schulz, Marcus; Neumann, Daniel; Fleet, David M; Matthies, Michael
2013-12-01
During the last decades, marine pollution with anthropogenic litter has become a major environmental concern worldwide. Standardized monitoring of litter since 2001 on 78 beaches, selected within the framework of the Convention for the Protection of the Marine Environment of the North-East Atlantic (OSPAR), has been used to identify temporal trends of marine litter. Based on statistical analyses of this dataset, a two-part multi-criteria evaluation system for beach litter pollution of the North-East Atlantic and the North Sea is proposed. Canonical correlation analyses, linear regression analyses, and non-parametric analyses of variance were used to identify different temporal trends. A classification of beaches was derived from cluster analyses and served to define different states of beach quality according to abundances of 17 input variables. The evaluation system is easily applicable and relies on the above-mentioned classification and on significant temporal trends implied by significant rank correlations. Copyright © 2013 Elsevier Ltd. All rights reserved.
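The trend-detection ingredient, rank correlation of litter abundance against time, can be sketched with a stdlib-only Spearman coefficient. The counts below are toy numbers, and the significance testing and cluster analysis the study relies on are omitted:

```python
def ranks(values):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks.
    Against time, values near +1/-1 indicate a monotonic trend."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

years = list(range(2001, 2013))
counts = [40, 38, 41, 45, 47, 50, 49, 55, 58, 60, 63, 66]  # rising litter
rho = spearman(years, counts)
```

The rank transform makes the statistic robust to the heavy skew typical of litter counts, which is why rank correlations rather than raw Pearson correlations underpin the trend criterion.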
Seismic waveform classification using deep learning
NASA Astrophysics Data System (ADS)
Kong, Q.; Allen, R. M.
2017-12-01
MyShake is a global smartphone seismic network that harnesses the power of crowdsourcing. It has an Artificial Neural Network (ANN) algorithm running on the phone to distinguish earthquake motion from human activities recorded by the accelerometer on board. Once the ANN detects earthquake-like motion, it sends a 5-min chunk of acceleration data back to the server for further analysis. The time-series data collected contain both earthquake data and human activity data that the ANN confused. In this presentation, we will show the Convolutional Neural Network (CNN) we built under the umbrella of supervised learning to identify the earthquake waveforms. The recorded waveforms can easily be treated as images, and by taking advantage of the power of CNNs in image processing, we achieved a very high success rate in selecting the earthquake waveforms. Since there are many more non-earthquake waveforms than earthquake waveforms, we also built an anomaly detection algorithm using the CNN. Both methods can be easily extended to other waveform classification problems.
Accelerometer-based on-body sensor localization for health and medical monitoring applications
Vahdatpour, Alireza; Amini, Navid; Xu, Wenyao; Sarrafzadeh, Majid
2011-01-01
In this paper, we present a technique to recognize the position of sensors on the human body. Automatic on-body device localization ensures correctness and accuracy of measurements in health and medical monitoring systems. In addition, it provides opportunities to improve the performance and usability of ubiquitous devices. Our technique uses accelerometers to capture motion data and estimate the location of the device on the user’s body, using mixed supervised and unsupervised time series analysis methods. We have evaluated our technique with extensive experiments on 25 subjects. On average, our technique achieves 89% accuracy in estimating the location of devices on the body. To study the feasibility of distinguishing left limbs from right limbs (e.g., left arm vs. right arm), we performed an analysis, based on which no meaningful classification was observed. Personalized ultraviolet monitoring and wireless transmission power control comprise two immediate applications of our on-body device localization approach. Such applications, along with their corresponding feasibility studies, are discussed. PMID:22347840
Multi-source remotely sensed data fusion for improving land cover classification
NASA Astrophysics Data System (ADS)
Chen, Bin; Huang, Bo; Xu, Bing
2017-02-01
Although many advances have been made in past decades, land cover classification of fine-resolution remotely sensed (RS) data integrating multiple temporal, angular, and spectral features remains limited, and the contribution of different RS features to land cover classification accuracy remains uncertain. We proposed to improve land cover classification accuracy by integrating multi-source RS features through data fusion. We further investigated the effect of different RS features on classification performance. The results of fusing Landsat-8 Operational Land Imager (OLI) data with Moderate Resolution Imaging Spectroradiometer (MODIS), China Environment 1A series (HJ-1A), and Advanced Spaceborne Thermal Emission and Reflection (ASTER) digital elevation model (DEM) data showed that the fused data integrating temporal, spectral, angular, and topographic features achieved better land cover classification accuracy than the original RS data. Compared with the topographic feature, the temporal and angular features extracted from the fused data played more important roles in classification performance, especially those temporal features containing abundant vegetation growth information, which markedly increased the overall classification accuracy. In addition, the multispectral and hyperspectral fusion successfully discriminated detailed forest types. Our study provides a straightforward strategy for hierarchical land cover classification by making full use of available RS data. All of these methods and findings could be useful for land cover classification at both regional and global scales.
Nenutil, Rudolf
2015-01-01
In 2012, the new classification of breast tumors in the fourth series of the WHO blue books was released. The current version represents a gradual evolution compared to the third edition. Some limited changes regarding terminology, definitions and the inclusion of some diagnostic units were adopted. The information about the molecular biology and genetic background of breast carcinoma has been enriched substantially.
ERIC Educational Resources Information Center
Smith, Susan G.
Recent legal decisions, coupled with Federal and State statutes, require the New York City Board of Education to follow definite guidelines in providing equal educational opportunities for its handicapped children. Among the problems raised are those of classification and labeling, criticized by some as a procedure that focuses more on pathology…
1992-03-23
...the Navy/Marine Corps structure and therefore could be rapidly improved. Though the medical issues addressed here are the result of information obtained...the profession has become more complex. Tactical, logistical, and administrative doctrine has changed as rapidly as clinical medicine advances. If physicians and...
Waldner, François; Hansen, Matthew C; Potapov, Peter V; Löw, Fabian; Newby, Terence; Ferreira, Stefanus; Defourny, Pierre
2017-01-01
The lack of sufficient ground truth data has always constrained supervised learning, thereby hindering the generation of up-to-date satellite-derived thematic maps. This is all the more true for those applications requiring frequent updates over large areas such as cropland mapping. Therefore, we present a method enabling the automated production of spatially consistent cropland maps at the national scale, based on spectral-temporal features and outdated land cover information. Following an unsupervised approach, this method extracts reliable calibration pixels based on their labels in the outdated map and their spectral signatures. To ensure spatial consistency and coherence in the map, we first propose to generate seamless input images by normalizing the time series and deriving spectral-temporal features that target salient cropland characteristics. Second, we reduce the spatial variability of the class signatures by stratifying the country and by classifying each stratum independently. Finally, we remove speckle with a weighted majority filter accounting for per-pixel classification confidence. Capitalizing on a wall-to-wall validation data set, the method was tested in South Africa using a 16-year old land cover map and multi-sensor Landsat time series. The overall accuracy of the resulting cropland map reached 92%. A spatially explicit validation revealed large variations across the country and suggests that intensive grain-growing areas were better characterized than smallholder farming systems. Informative features in the classification process vary from one stratum to another but features targeting the minimum of vegetation as well as short-wave infrared features were consistently important throughout the country. Overall, the approach showed potential for routinely delivering consistent cropland maps over large areas as required for operational crop monitoring.
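The speckle-removal step above uses a weighted majority filter in which each pixel's neighbours vote for their class with a weight equal to their per-pixel classification confidence. A minimal sketch of the idea (the 3x3 window, array names, and toy data are assumptions, not the authors' implementation):

```python
import numpy as np

def weighted_majority_filter(labels, confidence, n_classes):
    """Each pixel takes the class with the highest confidence-weighted
    vote in its 3x3 neighbourhood (borders handled by edge replication)."""
    h, w = labels.shape
    votes = np.zeros((n_classes, h, w))
    pl = np.pad(labels, 1, mode="edge")       # pad so border pixels get a full window
    pc = np.pad(confidence, 1, mode="edge")
    for dy in range(3):
        for dx in range(3):
            win_lab = pl[dy:dy + h, dx:dx + w]
            win_conf = pc[dy:dy + h, dx:dx + w]
            for c in range(n_classes):
                votes[c] += np.where(win_lab == c, win_conf, 0.0)
    return votes.argmax(axis=0)

# an isolated "cropland" speckle (class 1) inside "non-cropland" (class 0)
labels = np.array([[0, 0, 0],
                   [0, 1, 0],
                   [0, 0, 0]])
conf = np.full((3, 3), 0.9)
smoothed = weighted_majority_filter(labels, conf, 2)   # speckle is voted away
```

With equal confidences this reduces to a plain majority filter; unequal confidences let a few very certain neighbours outweigh many uncertain ones.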
NASA Astrophysics Data System (ADS)
Montereale-Gavazzi, Giacomo; Roche, Marc; Lurton, Xavier; Degrendele, Koen; Terseleer, Nathan; Van Lancker, Vera
2018-06-01
To characterize seafloor substrate type, seabed mapping, and particularly multibeam echosounding, is increasingly used. Yet the use of repetitive MBES-borne backscatter surveys to monitor the environmental status of the seafloor remains limited. Methodological frameworks are often missing; such frameworks should comprise a suite of change detection procedures, similar to those developed in the terrestrial sciences. In this study, pre-, ensemble and post-classification approaches were tested on an eight km2 study site within a Habitat Directive Area in the Belgian part of the North Sea. In this area, gravel beds with epifaunal assemblages were observed. Flourishing of the fauna is constrained by overtopping with sand or increased turbidity levels, which could result from anthropogenic activities. Monitoring of the gravel to sand ratio was hence put forward as an indicator of good environmental status. Seven acoustic surveys were undertaken from 2004 to 2015. The methods allowed quantifying temporal trends and patterns of change of the main substrate classes identified in the study area, namely fine to medium homogenous sand, medium sand with bioclastic detritus, and medium to coarse sand with gravel. Results indicated that, considering the entire study area and the entire time series, the gravel to sand ratio fluctuated but was overall stable. Nonetheless, when only the biodiversity hotspots were considered, net losses and a gradual trend, indicative of potential smothering, were captured by ensemble and post-classification approaches respectively. Additionally, a two-dimensional morphological analysis, based on the bathymetric data, suggested a loss of profile complexity from 2004 to 2015. Causal relationships with natural and anthropogenic stressors are yet to be established. The methodologies presented and discussed are repeatable and can be applied to broad-scale geographical extents, provided that broad-scale time series datasets become available.
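Post-classification change detection of the kind described above reduces to cross-tabulating the class of each pixel at two survey dates; the off-diagonal entries of the resulting transition matrix quantify, for example, gravel lost to sand. A toy sketch (class codes and arrays are assumptions, not the study's data):

```python
import numpy as np

def transition_matrix(map_t0, map_t1, n_classes):
    """Cross-tabulate two classified maps of the same area: entry (i, j)
    counts pixels that moved from class i at t0 to class j at t1."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    np.add.at(m, (map_t0.ravel(), map_t1.ravel()), 1)
    return m

# 0 = homogeneous sand, 1 = sand with detritus, 2 = sand with gravel (codes assumed)
t0 = np.array([[2, 2], [0, 1]])
t1 = np.array([[2, 0], [0, 1]])
m = transition_matrix(t0, t1, 3)
gravel_to_sand = m[2, 0]   # pixels where gravel was overtopped by sand
```

Tracking such matrices across a series of surveys gives the temporal trend of the gravel to sand ratio directly from the classified maps.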
NASA Astrophysics Data System (ADS)
Justice, C. J.
2016-12-01
Eighty percent of Tanzania's population is involved in the agriculture sector. Despite this national dependence, agricultural reporting is minimal and monitoring efforts are in their infancy. The cropland mask developed through this study provides an underpinning for agricultural monitoring by informing analysis of crop conditions, dispersion, and intensity at a national scale. Tanzania is dominated by smallholder agricultural systems with an average field size of less than one hectare. At this field scale, previous classifications of agricultural land in Tanzania using MODIS coarse resolution data are insufficient to inform a working monitoring system. The nation-wide cropland mask in this study was developed using composited Landsat tiles from a 2010-2013 time-series. Decision tree classifier methods were used in the study, with representative training areas collected for agriculture and no agriculture using appropriate indices to separate these classes. Validation was undertaken using a random sample and high resolution satellite images to compare agriculture and no agriculture samples from the study area. The cropland mask had high producer and user accuracy in the no agriculture class at 95.0% and 97.35% respectively. There was high producer accuracy in the agriculture class at 80.2% and moderate user accuracy at 67.9%. The principal metrics used for the classification support the finding that agricultural land in Tanzania and Sub-Saharan Africa is less vegetated than surrounding areas and most similar to bare ground, emphasizing the need for improved access to inputs and irrigation to enhance productivity and smallholder livelihoods. The techniques used in this study were successful for developing a cropland mask and have the potential to be adapted for other countries, allowing targeted monitoring efforts to improve food security, market price, and inform agricultural policy.
NASA Astrophysics Data System (ADS)
Cockx, K.; Van de Voorde, T.; Canters, F.; Poelmans, L.; Uljee, I.; Engelen, G.; de Jong, K.; Karssenberg, D.; van der Kwast, J.
2013-05-01
Building urban growth models typically involves a process of historic calibration based on historic time series of land-use maps, usually obtained from satellite imagery. Both the remote sensing data analysis to infer land use and the subsequent modelling of land-use change are subject to uncertainties, which may have an impact on the accuracy of future land-use predictions. Our research aims to quantify and reduce these uncertainties by means of a particle filter data assimilation approach that incorporates uncertainty in land-use mapping and land-use model parameter assessment into the calibration process. This paper focuses on part of this work, in particular the modelling of uncertainties associated with the impervious surface cover estimation and urban land-use classification adopted in the land-use mapping approach. Both stages are subjected to a Monte Carlo simulation to assess their relative contribution to, and their combined impact on, the uncertainty in the derived land-use maps. The approach was applied on the central part of the Flanders region (Belgium), using a time-series of Landsat/SPOT-HRV data covering the years 1987, 1996, 2005 and 2012. Although the most likely land-use map obtained from the simulation is very similar to the original classification, it is shown that the errors related to the impervious surface sub-pixel fraction estimation have a strong impact on the land-use map's uncertainty. Hence, incorporating uncertainty in the land-use change model calibration through particle filter data assimilation is proposed to address the uncertainty observed in the derived land-use maps and to reduce uncertainty in future land-use predictions.
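The Monte Carlo treatment of impervious-fraction uncertainty can be illustrated with a toy experiment: perturb a pixel's estimated impervious fraction with Gaussian error many times and count how often the pixel crosses the classification threshold into the urban class. All parameter values below are assumptions for illustration, not the study's calibrated ones:

```python
import random

def monte_carlo_urban_prob(fraction, sigma, threshold=0.5, n_runs=1000, seed=7):
    """Propagate Gaussian error on an impervious surface fraction estimate
    through a threshold classifier; return the fraction of runs in which
    the pixel is labelled 'urban'."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_runs)
               if rng.gauss(fraction, sigma) >= threshold)
    return hits / n_runs

# a pixel estimated at 55% impervious cover with a 10% standard error
p = monte_carlo_urban_prob(fraction=0.55, sigma=0.10)
```

Pixels near the threshold end up with label probabilities far from 0 or 1, which is exactly the per-pixel uncertainty that propagates into the land-use map.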
Subbaraju, Vigneshwaran; Suresh, Mahanand Belathur; Sundaram, Suresh; Narasimhan, Sundararajan
2017-01-01
This paper presents a new approach for detecting major differences in brain activities between Autism Spectrum Disorder (ASD) patients and neurotypical subjects using the resting state fMRI. Further, the method also extracts discriminative features for an accurate diagnosis of ASD. The proposed approach determines a spatial filter that projects the covariance matrices of the Blood Oxygen Level Dependent (BOLD) time-series signals from both the ASD patients and neurotypical subjects in orthogonal directions such that they are highly separable. The inverse of this filter also provides a spatial pattern map within the brain that highlights those regions responsible for the distinguishable activities between the ASD patients and neurotypical subjects. For a better classification, highly discriminative log-variance features providing the maximum separation between the two classes are extracted from the projected BOLD time-series data. A detailed study has been carried out using the publicly available data from the Autism Brain Imaging Data Exchange (ABIDE) consortium for different gender and age groups. The study results indicate that for all the above categories, the regional differences in resting state activities are more commonly found in the right hemisphere compared to the left hemisphere of the brain. Among males, a clear shift in activities to the prefrontal cortex is observed for ASD patients while other parts of the brain show diminished activities compared to neurotypical subjects. Among females, such a clear shift is not evident; however, several regions, especially in the posterior and medial portions of the brain, show diminished activities due to ASD. Finally, the classification performance obtained using the log-variance features is found to be better when compared to earlier studies in the literature. Copyright © 2016 Elsevier B.V. All rights reserved.
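The log-variance features described above are obtained by projecting the multichannel BOLD time series through the learned spatial filters and taking the log of each projected signal's variance. A generic sketch of that feature extraction step (the filters and data below are synthetic placeholders, not the paper's learned filter):

```python
import numpy as np

def log_variance_features(X, W):
    """Project a multichannel time series X (channels x time) through
    spatial filters W (filters x channels) and return the log-variance
    of each projected signal as a feature vector."""
    projected = W @ X
    return np.log(projected.var(axis=1))

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 200))   # 4 regions/channels, 200 time points
W = np.eye(4)[:2]                   # two trivial "filters" for illustration
feats = log_variance_features(X, W)
```

The log compresses the heavy-tailed distribution of variances, which makes the features better behaved for a downstream linear classifier; this is the same trick used in common spatial pattern pipelines.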
NASA Astrophysics Data System (ADS)
Zhang, Xiaoping; Pan, Delu; Chen, Jianyu; Zhan, Yuanzeng; Mao, Zhihua
2013-01-01
Islands are an important part of the marine ecosystem. Increasing impervious surfaces in the Zhoushan Islands due to new development and increased population have an ecological impact on the runoff and water quality. Based on time-series classification and the complement of vegetation fraction in urban regions, Landsat thematic mapper and other high-resolution satellite images were applied to monitor the dynamics of impervious surface area (ISA) in the Zhoushan Islands from 1986 to 2011. Landsat-derived ISA results were validated by the high-resolution Worldview-2 and aerial photographs. The validation shows that mean relative errors of these ISA maps are <15 %. The results reveal that the ISA in the Zhoushan Islands increased from 19.2 km2 in 1986 to 86.5 km2 in 2011, and the period from 2006 to 2011 had the fastest expansion rate of 5.59 km2 per year. The major land conversions to high densities of ISA were from the tidal zone and arable lands. The expansions of ISA were unevenly distributed and most of them were located along the periphery of these islands. Time-series maps revealed that ISA expansions happened continuously over the last 25 years. Our analysis indicated that the policy and the topography were the dominant factors controlling the spatial patterns of ISA and its expansions in the Zhoushan Islands. With continuous urbanization processes, the rapid ISA expansions may not be stopped in the near future.
NASA Astrophysics Data System (ADS)
Amit, S. N. K.; Saito, S.; Sasaki, S.; Kiyoki, Y.; Aoki, Y.
2015-04-01
Google Earth typically takes months to process new high-resolution imagery before updates appear online, a slow process that is especially problematic for post-disaster applications. The objective of this research is to develop a fast and effective method of updating maps by detecting local differences that have occurred over a time series, so that only regions with differences are updated. In our system, aerial images from the Massachusetts road and building open datasets and from Saitama district datasets are used as input. Semantic segmentation, a pixel-wise classification of images, is then applied to the input images using a deep neural network. A deep neural network is used because it is not only efficient at learning highly discriminative image features such as roads and buildings, but also partially robust to incomplete and poorly registered target maps. The aerial images, together with their semantic information, are then stored as ground truth in a 5D World Map database. This system visualises multimedia data in five dimensions: three spatial dimensions, one temporal dimension, and one degenerated dimension combining semantic and colour information. Next, ground truth images chosen from the 5D World Map database are compared, via a difference extraction method, with a new aerial image covering the same area at a different time. The map is updated only where local changes have occurred. Hence, map updating becomes cheaper, faster, and more effective, especially for post-disaster applications, by leaving unchanged regions untouched and updating only changed regions.
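The difference extraction step above amounts to comparing two semantic segmentation maps of the same area and flagging pixels whose class changed; only flagged regions are re-processed. A minimal sketch (the class codes and tiny maps are illustrative assumptions):

```python
import numpy as np

def changed_region_mask(seg_old, seg_new):
    """Difference extraction between two semantic segmentation maps of
    the same area at different dates: True where the class changed."""
    return seg_old != seg_new

old = np.array([[1, 1], [2, 2]])   # e.g. 1 = building, 2 = road (codes assumed)
new = np.array([[1, 3], [2, 2]])   # one pixel changed class after the event
mask = changed_region_mask(old, new)
update_fraction = mask.mean()      # share of the map that needs updating
```

In a tiled map service the same test would be applied per tile, and only tiles with a non-empty mask would be regenerated, which is what makes the update cheap.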
NASA Astrophysics Data System (ADS)
Jiang, Wenqian; Zeng, Bo; Yang, Zhou; Li, Gang
2018-01-01
In non-intrusive load monitoring, load decomposition can reflect the running state of each load, which helps users reduce unnecessary energy costs. With time-of-use (TOU) pricing as a demand-side management measure, this paper proposes a method for analysing the influence of TOU pricing on residential loads based on non-intrusive load monitoring data. By classifying residential loads from their current signals, the types of user equipment and their self-elasticities and cross-elasticities across different time periods can be obtained. Tests on actual household load data show that, under TOU pricing, the operation of some equipment shifts in time: electricity use during peak-price periods decreases while use during low-price periods increases, with a certain regularity.
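The self- and cross-elasticities mentioned above follow the standard demand-elasticity formula: the relative change in consumption divided by the relative change in price, for the same TOU period (self) or a different period (cross). A textbook sketch with made-up numbers, not the paper's exact estimator:

```python
def elasticity(q_before, q_after, p_before, p_after):
    """Elasticity of demand: relative change in consumption divided by
    relative change in price. Self-elasticity when q and p refer to the
    same TOU period; cross-elasticity when they refer to different ones."""
    dq = (q_after - q_before) / q_before
    dp = (p_after - p_before) / p_before
    return dq / dp

# peak price rises 50% and peak consumption drops 10% -> self-elasticity -0.2
self_e = elasticity(q_before=10.0, q_after=9.0, p_before=1.0, p_after=1.5)
# off-peak consumption rises 10% as peak price rises 50% -> cross-elasticity +0.2
cross_e = elasticity(q_before=10.0, q_after=11.0, p_before=1.0, p_after=1.5)
```

A negative self-elasticity with a positive cross-elasticity is the signature of load shifting: consumption leaves the peak period and reappears in cheaper ones.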
Stromatolites of the Belt Series in Glacier National Park and Vicinity, Montana
Rezak, Richard
1957-01-01
Eight zones of Precambrian stromatolites that are useful for local correlation are recognized in the Belt series of the Glacier National Park region, Montana. The zones vary in composition, thickness, and areal extent. Some are widespread and extend into neighboring regions, and others occur only in small areas. Their names are taken from the dominant species that occurs in each zone. The zones are, from youngest to oldest: Conophyton zone 2, Collenia symmetrica zone 2, and Collenia undosa zone 2 (Missoula group); Collenia multiflabella zone, Conophyton zone 1, and Collenia symmetrica zone 1 (Piegan group); Collenia undosa zone 1 and Collenia frequens zone (Ravalli group). Only the Conophyton zones have been mapped in the park area. The present study uses a classification based upon the three criteria of (1) mode of growth, (2) gross form of the colony, and (3) nature and orientation of the laminae. The scheme of classification also seems applicable to Paleozoic and later stromatolites. Possibly a consistent pattern of form-genera and form-species may be developed. Four form-genera and seven form-species are recognized in the Belt series of the park region. These are Cryptozoon occidentale Dawson, Collenia undosa Walcott, C. frequens Walcott, C. symmetrica Fenton and Fenton, Newlandia sp., and Conophyton inclinatum n. sp. It is realized that these structures should not be classified according to biological nomenclature. However, biological names are here applied to the structures until a suitable system of classification can be devised. Comparisons of the stromatolites of the Belt series with modern stromatolites on Andros Island, Bahama Islands, and Pleistocene stromatolites from Lake Lahontan, Nev., reveal similarities in structure that appear to be significant as to physical mode of origin.
O'Neill, William; Penn, Richard; Werner, Michael; Thomas, Justin
2015-06-01
Estimation of stochastic process models from data is a common application of time series analysis methods. Such system identification processes are often cast as hypothesis testing exercises whose intent is to estimate model parameters and test them for statistical significance. Ordinary least squares (OLS) regression and the Levenberg-Marquardt algorithm (LMA) have proven invaluable computational tools for models being described by non-homogeneous, linear, stationary, ordinary differential equations. In this paper we extend stochastic model identification to linear, stationary, partial differential equations in two independent variables (2D) and show that OLS and LMA apply equally well to these systems. The method employs an original nonparametric statistic as a test for the significance of estimated parameters. We show gray scale and color images are special cases of 2D systems satisfying a particular autoregressive partial difference equation which estimates an analogous partial differential equation. Several applications to medical image modeling and classification illustrate the method by correctly classifying demented and normal OLS models of axial magnetic resonance brain scans according to subject Mini Mental State Exam (MMSE) scores. Comparison with 13 image classifiers from the literature indicates our classifier is at least 14 times faster than any of them and has a classification accuracy better than all but one. Our modeling method applies to any linear, stationary, partial differential equation and the method is readily extended to 3D whole-organ systems. Further, in addition to being a robust image classifier, estimated image models offer insights into which parameters carry the most diagnostic image information and thereby suggest finer divisions could be made within a class. Image models can be estimated in milliseconds which translate to whole-organ models in seconds; such runtimes could make real-time medicine and surgery modeling possible.
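The OLS identification of a 2D autoregressive (partial difference) image model can be sketched with a first-order stencil: regress each pixel on its upper and left neighbours plus an intercept. This is a minimal illustration of the idea, not the paper's richer model or its nonparametric significance test:

```python
import numpy as np

def fit_ar2d(img):
    """Fit I[y, x] = a*I[y-1, x] + b*I[y, x-1] + c by ordinary least
    squares over all interior pixels; returns (a, b, c)."""
    y = img[1:, 1:].ravel()                      # response pixels
    X = np.column_stack([img[:-1, 1:].ravel(),   # pixel above
                         img[1:, :-1].ravel(),   # pixel to the left
                         np.ones(y.size)])       # intercept
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# synthetic image generated by a known AR model, to check parameter recovery
rng = np.random.default_rng(1)
img = np.zeros((40, 40))
img[0, :] = rng.standard_normal(40)
img[:, 0] = rng.standard_normal(40)
for i in range(1, 40):
    for j in range(1, 40):
        img[i, j] = 0.5 * img[i - 1, j] + 0.3 * img[i, j - 1] \
                    + 0.01 * rng.standard_normal()
a, b, c = fit_ar2d(img)
```

The fitted coefficient vector plays the role of the image "model"; classifying images then reduces to comparing these low-dimensional parameter vectors, which is why the approach is so fast.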
exprso: an R-package for the rapid implementation of machine learning algorithms.
Quinn, Thomas; Tylee, Daniel; Glatt, Stephen
2016-01-01
Machine learning plays a major role in many scientific investigations. However, non-expert programmers may struggle to implement the elaborate pipelines necessary to build highly accurate and generalizable models. We introduce exprso, a new R package that is an intuitive machine learning suite designed specifically for non-expert programmers. Built initially for the classification of high-dimensional data, exprso uses an object-oriented framework to encapsulate a number of common analytical methods into a series of interchangeable modules. This includes modules for feature selection, classification, high-throughput parameter grid-searching, elaborate cross-validation schemes (e.g., Monte Carlo and nested cross-validation), ensemble classification, and prediction. In addition, exprso also supports multi-class classification (through the 1-vs-all generalization of binary classifiers) and the prediction of continuous outcomes.
Clustering-based Feature Learning on Variable Stars
NASA Astrophysics Data System (ADS); DOE Office of Scientific and Technical Information (OSTI.GOV)
Mackenzie, Cristóbal; Pichara, Karim; Protopapas, Pavlos
2016-04-01
The success of automatic classification of variable stars depends strongly on the lightcurve representation. Usually, lightcurves are represented as a vector of many descriptors designed by astronomers called features. These descriptors are expensive in terms of computing, require substantial research effort to develop, and do not guarantee a good classification. Today, lightcurve representation is not entirely automatic; algorithms must be designed and manually tuned up for every survey. The amounts of data that will be generated in the future mean astronomers must develop scalable and automated analysis pipelines. In this work we present a feature learning algorithm designed for variable objects. Our method works by extracting a large number of lightcurve subsequences from a given set, which are then clustered to find common local patterns in the time series. Representatives of these common patterns are then used to transform lightcurves of a labeled set into a new representation that can be used to train a classifier. The proposed algorithm learns the features from both labeled and unlabeled lightcurves, overcoming the bias using only labeled data. We test our method on data sets from the Massive Compact Halo Object survey and the Optical Gravitational Lensing Experiment; the results show that our classification performance is as good as and in some cases better than the performance achieved using traditional statistical features, while the computational cost is significantly lower. With these promising results, we believe that our method constitutes a significant step toward the automation of the lightcurve classification pipeline.
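The subsequence-based representation described above can be sketched as follows: slide a window over each lightcurve, assign every subsequence to its nearest learned pattern, and use the normalised pattern histogram as the new feature vector. In the toy below the two "patterns" are hand-picked; the actual method learns them by clustering subsequences from many unlabeled lightcurves:

```python
import numpy as np

def subsequence_features(series_list, centers, w):
    """Transform each time series into a histogram over learned local
    patterns: slide a width-w window, assign each subsequence to its
    nearest centre (Euclidean), and normalise the counts."""
    feats = []
    for s in series_list:
        subs = np.array([s[i:i + w] for i in range(len(s) - w + 1)])
        d = ((subs[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        counts = np.bincount(d.argmin(1), minlength=len(centers))
        feats.append(counts / counts.sum())
    return np.array(feats)

centers = np.array([[0.0, 0.0, 0.0],    # "flat" local pattern
                    [0.0, 1.0, 0.0]])   # "bump" local pattern
flat = np.zeros(10)
bumpy = np.array([0, 1, 0, 0, 1, 0, 0, 1, 0, 0], dtype=float)
F = subsequence_features([flat, bumpy], centers, w=3)
```

The resulting fixed-length vectors can feed any standard classifier, regardless of the original lightcurves' lengths or sampling, which is what makes the representation survey-agnostic.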
ERIC Educational Resources Information Center
Curtin, Bernadette M.; Hecklinger, Fred J.
As part of a series on career and life planning for adults, this booklet provides a guide to the job market and strategies for choosing a career. Part I suggests the reader list prospective careers and preferred job conditions. Part II helps the reader to categorize careers on the basis of requisite skills, occupational classifications,…
M. Joan Foote
1983-01-01
One hundred thirty forest stands ranging in age from 1 month postfire to 200 years were sampled and described by successional series (white spruce and black spruce) and by developmental stage (newly burned, moss-herb, tall shrub-sapling, dense tree, hardwood, and spruce). Patterns of change in the two successional series are described. In addition, 12 mature forest...
ERIC Educational Resources Information Center
Zaharevitz, Walter
This booklet, one in a series on aviation careers, outlines the variety of careers in aviation available in federal, state, and local governmental agencies. The first part of the booklet provides general information about civil aviation careers with the federal government, including pay scales, job classifications, and working conditions.…
Bissacco, Alessandro; Chiuso, Alessandro; Soatto, Stefano
2007-11-01
We address the problem of performing decision tasks, and in particular classification and recognition, in the space of dynamical models in order to compare time series of data. Motivated by the application of recognition of human motion in image sequences, we consider a class of models that include linear dynamics, both stable and marginally stable (periodic), both minimum and non-minimum phase, driven by non-Gaussian processes. This requires extending existing learning and system identification algorithms to handle periodic modes and nonminimum phase behavior, while taking into account higher-order statistics of the data. Once a model is identified, we define a kernel-based cord distance between models that includes their dynamics, their initial conditions as well as input distribution. This is made possible by a novel kernel defined between two arbitrary (non-Gaussian) distributions, which is computed by efficiently solving an optimal transport problem. We validate our choice of models, inference algorithm, and distance on the tasks of human motion synthesis (sample paths of the learned models), and recognition (nearest-neighbor classification in the computed distance). However, our work can be applied more broadly where one needs to compare historical data while taking into account periodic trends, non-minimum phase behavior, and non-Gaussian input distributions.
Applicability of Hydrologic Landscapes for Model Calibration ...
The Pacific Northwest Hydrologic Landscapes (PNW HL) at the assessment unit scale has provided a solid conceptual classification framework to relate and transfer hydrologically meaningful information between watersheds without access to streamflow time series. A collection of techniques were applied to the HL assessment unit composition in watersheds across the Pacific Northwest to aggregate the hydrologic behavior of the Hydrologic Landscapes from the assessment unit scale to the watershed scale. This non-trivial solution both emphasizes HL classifications within the watershed that provide that majority of moisture surplus/deficit and considers the relative position (upstream vs. downstream) of these HL classifications. A clustering algorithm was applied to the HL-based characterization of assessment units within 185 watersheds to help organize watersheds into nine classes hypothesized to have similar hydrologic behavior. The HL-based classes were used to organize and describe hydrologic behavior information about watershed classes and both predictions and validations were independently performed with regard to the general magnitude of six hydroclimatic signature values. A second cluster analysis was then performed using the independently calculated signature values as similarity metrics, and it was found that the six signature clusters showed substantial overlap in watershed class membership to those in the HL-based classes. One hypothesis set forward from thi
NASA Astrophysics Data System (ADS)
Niculescu, S.; Ienco, D.; Hanganu, J.
2018-04-01
Land cover is a fundamental variable for regional planning, as well as for the study and understanding of the environment. This work proposes a multi-temporal approach relying on the fusion of multi-sensor radar data collected by the latest sensor (Sentinel-1), with a view to obtaining better results than traditional image processing techniques. The Danube Delta is the study site for this work. The spatial approach relies on new spatial analysis technologies and methodologies: deep learning on multi-temporal Sentinel-1 data. We propose a deep learning network for image classification which exploits the multi-temporal characteristic of Sentinel-1 data. The model we employ is a Gated Recurrent Unit (GRU) network, a recurrent neural network that explicitly takes the time dimension into account via a gated mechanism to perform the final prediction. The main quality of the GRU network is its ability to consider only the important part of the information coming from the temporal data, discarding irrelevant information via a forgetting mechanism. We use this network structure to classify a series of 20 Sentinel-1 images acquired between 9.10.2014 and 01.04.2016. The results are compared with those of a Random Forest classification.
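The gated mechanism the abstract refers to can be seen in a single GRU step: an update gate decides how much of the previous hidden state to keep (the "forgetting mechanism"), and a reset gate controls how much past state enters the candidate. A NumPy sketch with toy, untrained weights (dimensions and the gate convention are illustrative assumptions):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params):
    """One Gated Recurrent Unit step: blend the previous state h with a
    candidate state, with the mix controlled by the update gate z."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(Wz @ x + Uz @ h)               # update gate
    r = sigmoid(Wr @ x + Ur @ h)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))   # candidate state
    return (1 - z) * h + z * h_tilde           # keep old vs. take new

rng = np.random.default_rng(0)
dim_x, dim_h = 3, 4                            # e.g. 3 SAR features per date
params = [rng.standard_normal((dim_h, dim_x)) if i % 2 == 0
          else rng.standard_normal((dim_h, dim_h)) for i in range(6)]
h = np.zeros(dim_h)
for t in range(20):                            # one image per acquisition date
    x = rng.standard_normal(dim_x)
    h = gru_step(x, h, params)                 # final h summarises the series
```

After the last date, the hidden state h is fed to a softmax layer to predict the land-cover class; when the update gate saturates near 0 the network simply carries its state forward, ignoring uninformative dates.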
Delay Differential Equation Models of Normal and Diseased Electrocardiograms
NASA Astrophysics Data System (ADS)
Lainscsek, Claudia; Sejnowski, Terrence J.
Time series analysis with nonlinear delay differential equations (DDEs) is a powerful tool since it reveals spectral as well as nonlinear properties of the underlying dynamical system. Here global DDE models are used to analyze electrocardiography recordings (ECGs) in order to capture distinguishing features for different heart conditions such as normal heart beat, congestive heart failure, and atrial fibrillation. To capture distinguishing features of the different data types, the number of terms and delays in the model as well as the order of nonlinearity of the DDE model have to be selected. The DDE structure selection is done in a supervised way by selecting the DDE that best separates the different data types. We analyzed 24 h of data from 15 young healthy subjects in normal sinus rhythm (NSR), from 15 congestive heart failure (CHF) patients, and from 15 subjects suffering from atrial fibrillation (AF), selected from the Physionet database. For the analysis presented here we used 5 min non-overlapping data windows on the raw data without any artifact removal. For classification performance we used the Cohen Kappa coefficient computed directly from the confusion matrix. The overall classification performance for the three groups was around 72-99% on the 5 min windows for the different approaches. For 2 h data windows the classification performance for all three groups was above 95%.
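Cohen's kappa, the performance measure used above, is computed from the confusion matrix as the observed accuracy corrected for the agreement expected by chance from the row and column marginals. A short sketch with a made-up three-class confusion matrix (the counts below are illustrative, not the study's results):

```python
def cohen_kappa(cm):
    """Cohen's kappa from a confusion matrix (list of rows):
    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed accuracy and
    p_e the chance agreement implied by the row/column marginals."""
    n = sum(sum(row) for row in cm)
    p_o = sum(cm[i][i] for i in range(len(cm))) / n
    row = [sum(r) for r in cm]
    col = [sum(c) for c in zip(*cm)]
    p_e = sum(row[i] * col[i] for i in range(len(cm))) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# toy 3-class matrix (rows = true NSR/CHF/AF, columns = predicted)
cm = [[45, 3, 2],
      [4, 44, 2],
      [1, 3, 46]]
kappa = cohen_kappa(cm)
```

Unlike raw accuracy, kappa is near zero for a classifier that merely reproduces the class proportions, which makes it a fairer summary when group sizes are balanced by design, as here.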
Weakly Supervised Dictionary Learning
NASA Astrophysics Data System (ADS)
You, Zeyu; Raich, Raviv; Fern, Xiaoli Z.; Kim, Jinsub
2018-05-01
We present a probabilistic modeling and inference framework for discriminative analysis dictionary learning under a weak supervision setting. Dictionary learning approaches have been widely used for tasks such as low-level signal denoising and restoration as well as high-level classification tasks, which can be applied to audio and image analysis. Synthesis dictionary learning aims at jointly learning a dictionary and corresponding sparse coefficients to provide accurate data representation. This approach is useful for denoising and signal restoration, but may lead to sub-optimal classification performance. By contrast, analysis dictionary learning provides a transform that maps data to a sparse discriminative representation suitable for classification. We consider the problem of analysis dictionary learning for time-series data under a weak supervision setting in which signals are assigned a global label instead of an instantaneous label signal. We propose a discriminative probabilistic model that incorporates both label information and sparsity constraints on the underlying latent instantaneous label signal using cardinality control. We present the expectation maximization (EM) procedure for maximum likelihood estimation (MLE) of the proposed model. To facilitate a computationally efficient E-step, we propose both a chain and a novel tree graph reformulation of the graphical model. The performance of the proposed model is demonstrated on both synthetic and real-world data.
Chambon, Stanislas; Galtier, Mathieu N; Arnal, Pierrick J; Wainrib, Gilles; Gramfort, Alexandre
2018-04-01
Sleep stage classification constitutes an important preliminary exam in the diagnosis of sleep disorders. It is traditionally performed by a sleep expert who assigns a sleep stage to each 30 s of signal, based on the visual inspection of signals such as electroencephalograms (EEGs), electrooculograms (EOGs), electrocardiograms, and electromyograms (EMGs). We introduce here the first deep learning approach for sleep stage classification that learns end-to-end without computing spectrograms or extracting handcrafted features, that exploits all multivariate and multimodal polysomnography (PSG) signals (EEG, EMG, and EOG), and that can exploit the temporal context of each 30-s window of data. For each modality, the first layer learns linear spatial filters that exploit the array of sensors to increase the signal-to-noise ratio, and the last layer feeds the learnt representation to a softmax classifier. Our model is compared to alternative automatic approaches based on convolutional networks or decision trees. Results obtained on 61 publicly available PSG records with up to 20 EEG channels demonstrate that our network architecture yields state-of-the-art performance. Our study reveals a number of insights on the spatiotemporal distribution of the signal of interest: a good tradeoff for optimal classification performance measured with balanced accuracy is to use 6 EEG channels with 2 EOG (left and right) and 3 EMG chin channels. Also, exploiting 1 min of data before and after each data segment offers the strongest improvement when a limited number of channels are available. Like sleep experts, our system exploits the multivariate and multimodal nature of PSG signals in order to deliver state-of-the-art classification performance with a small computational cost.
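Balanced accuracy, the metric cited above, is simply the mean of per-class recalls, which prevents the majority sleep stage from dominating the score. A minimal sketch with made-up stage labels:

```python
import numpy as np

def balanced_accuracy(y_true, y_pred, n_classes):
    """Mean per-class recall; robust to class imbalance across sleep stages."""
    recalls = []
    for c in range(n_classes):
        mask = (y_true == c)
        if mask.any():                      # skip classes absent from y_true
            recalls.append(np.mean(y_pred[mask] == c))
    return float(np.mean(recalls))

y_true = np.array([0, 0, 0, 0, 1, 2])       # imbalanced stage labels (toy data)
y_pred = np.array([0, 0, 0, 0, 1, 1])
score = balanced_accuracy(y_true, y_pred, 3)
print(score)  # 2/3: per-class recalls are 1.0, 1.0, 0.0
```

Plain accuracy on the same example would be 5/6; balanced accuracy penalizes the completely missed minority class.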
NASA Astrophysics Data System (ADS)
Wood, N. J.; Jones, J.; Spielman, S.
2013-12-01
Near-field tsunami hazards are credible threats to many coastal communities throughout the world. Along the U.S. Pacific Northwest coast, low-lying areas could be inundated by a series of catastrophic tsunami waves that begin to arrive in a matter of minutes following a Cascadia subduction zone (CSZ) earthquake. This presentation summarizes analytical efforts to classify communities with similar characteristics of community vulnerability to tsunami hazards. This work builds on past State-focused inventories of community exposure to CSZ-related tsunami hazards in northern California, Oregon, and Washington. Attributes used in the classification, or cluster analysis, include demography of residents, spatial extent of the developed footprint based on mid-resolution land cover data, distribution of the local workforce, and the number and type of public venues, dependent-care facilities, and community-support businesses. Population distributions also are characterized by a function of travel time to safety, based on anisotropic, path-distance, geospatial modeling. We used an unsupervised-model-based clustering algorithm and a v-fold, cross-validation procedure (v=50) to identify the appropriate number of community types. We selected class solutions that provided the appropriate balance between parsimony and model fit. The goal of the vulnerability classification is to provide emergency managers with a general sense of the types of communities in tsunami hazard zones based on similar characteristics instead of only providing an exhaustive list of attributes for individual communities. This classification scheme can then be used to target and prioritize risk-reduction efforts that address common issues across multiple communities. The presentation will include a discussion of the utility of proposed place classifications to support regional preparedness and outreach efforts.
Using self-organizing maps to develop ambient air quality classifications: a time series example
2014-01-01
Background: Development of exposure metrics that capture features of the multipollutant environment is needed to investigate health effects of pollutant mixtures. This is a complex problem that requires development of new methodologies. Objective: Present a self-organizing map (SOM) framework for creating ambient air quality classifications that group days with similar multipollutant profiles. Methods: Eight years of day-level data from Atlanta, GA, for ten ambient air pollutants collected at a central monitor location were classified using SOM into a set of day types based on their day-level multipollutant profiles. We present strategies for using SOM to develop a multipollutant metric of air quality and compare results with more traditional techniques. Results: Our analysis found that 16 types of days reasonably describe the day-level multipollutant combinations that appear most frequently in our data. Multipollutant day types ranged from conditions when all pollutants measured low to days exhibiting relatively high concentrations of either primary or secondary pollutants or both. The temporal nature of class assignments indicated substantial heterogeneity in day type frequency distributions (~1%-14%), relatively short-term durations (<2 day persistence), and long-term and seasonal trends. Meteorological summaries revealed strong day type weather dependencies, and pollutant concentration summaries provided interesting scenarios for further investigation. Comparison with traditional methods found SOM produced similar classifications with added insight regarding between-class relationships. Conclusion: We find SOM to be an attractive framework for developing ambient air quality classifications because the approach eases interpretation of results by allowing users to visualize classifications on an organized map. 
The presented approach provides an appealing tool for developing multipollutant metrics of air quality that can be used to support multipollutant health studies. PMID:24990361
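A minimal SOM training loop conveys the idea: each map node holds a prototype multipollutant profile, and days are assigned to their best-matching node ("day type"). The grid size, learning schedule, and synthetic standardized pollutant data below are illustrative assumptions, not the study's configuration.

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=30, lr0=0.5, sigma0=1.5, seed=0):
    """Minimal SOM: each grid node holds a prototype multipollutant profile."""
    rng = np.random.default_rng(seed)
    n_nodes = grid[0] * grid[1]
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], float)
    W = rng.normal(0, 0.1, (n_nodes, data.shape[1]))
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)            # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 1e-3
        for x in data:
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))   # best matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-d2 / (2 * sigma ** 2))            # neighborhood kernel
            W += lr * h[:, None] * (x - W)                # pull neighbors toward x
    return W

def classify_days(data, W):
    """Assign each day to its nearest prototype (its day type)."""
    return np.array([np.argmin(((W - x) ** 2).sum(axis=1)) for x in data])

rng = np.random.default_rng(1)
days = rng.normal(0, 1, (200, 10))    # 200 days x 10 standardized pollutants (toy)
W = train_som(days, grid=(4, 4))
labels = classify_days(days, W)
print(labels.shape)  # (200,)
```

The neighborhood kernel is what organizes the map: nearby nodes end up with similar profiles, which is why between-class relationships can be read off the grid.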
NASA Astrophysics Data System (ADS)
Rapp, J. R.; Deines, J. M.; Kendall, A. D.; Hyndman, D. W.
2017-12-01
The High Plains Aquifer (HPA) is the most extensively irrigated aquifer in the continental United States and is the largest major aquifer in North America with an area of 500,000 km2. Increased demand for agricultural products has led to expanded irrigation extent, but brought with it declining groundwater levels that have made irrigation unsustainable in some locations. Understanding these irrigation dynamics and mapping irrigated areas through time are essential for future sustainable agricultural practices and hydrological modeling. Map products using remote sensing have only recently been able to track annual dynamics at relatively high spatial resolution (30 m) for a large portion of the northern HPA. However, follow-on efforts to expand these maps to the entire HPA have met with difficulty due to the challenge of distinguishing irrigation in crop types that are commonly deficit- or partially-irrigated. Expanding these maps to the full HPA requires addressing unique features of partially irrigated fields and irrigated cotton, a major water user in the southern HPA. Working in Google Earth Engine, we used all available Landsat imagery to generate annual time series of vegetation indices. We combined this information with climate covariables, planting dates, and crop specific training data to algorithmically separate fully irrigated, partially irrigated, and non-irrigated field locations. The classification scheme was then applied to produce annual maps of irrigation across the entire HPA. The extensive use of ancillary data and the "greenness" time series for the algorithmic classification generally increased accuracy relative to previous efforts. High-accuracy, representative map products of irrigation extent capable of detecting crop type and irrigation intensity within aquifers will be an essential tool to monitor the sustainability of global aquifers and to provide a scientific basis for political and economic decisions affecting those aquifers.
Electrocardiogram ST-Segment Morphology Delineation Method Using Orthogonal Transformations
2016-01-01
Differentiation between ischaemic and non-ischaemic transient ST segment events of long term ambulatory electrocardiograms is a persisting weakness in present ischaemia detection systems. Traditional ST segment level measuring is not a sufficiently precise technique due to the single point of measurement and severe noise which is often present. We developed a robust noise resistant orthogonal-transformation based delineation method, which allows tracing the shape of transient ST segment morphology changes from the entire ST segment in terms of diagnostic and morphologic feature-vector time series, and also allows further analysis. For these purposes, we developed a new Legendre Polynomials based Transformation (LPT) of ST segment. Its basis functions have similar shapes to typical transient changes of ST segment morphology categories during myocardial ischaemia (level, slope and scooping), thus providing direct insight into the types of time domain morphology changes through the LPT feature-vector space. We also generated new Karhunen-Loève Transformation (KLT) ST segment basis functions using a robust covariance matrix constructed from the ST segment pattern vectors derived from the Long Term ST Database (LTST DB). As for the delineation of significant transient ischaemic and non-ischaemic ST segment episodes, we present a study on the representation of transient ST segment morphology categories, and an evaluation study on the classification power of the KLT- and LPT-based feature vectors to classify between ischaemic and non-ischaemic ST segment episodes of the LTST DB. Classification accuracy using the KLT and LPT feature vectors was 90% and 82%, respectively, when using the k-Nearest Neighbors (k = 3) classifier and 10-fold cross-validation. New sets of feature-vector time series for both transformations were derived for the records of the LTST DB which is freely available on the PhysioNet website and were contributed to the LTST DB. 
The KLT and LPT present new possibilities for human-expert diagnostics, and for automated ischaemia detection. PMID:26863140
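The LPT feature vector amounts to projecting each ST-segment window onto low-order Legendre polynomials, whose shapes correspond to the morphology categories named above: level (P0), slope (P1), and scooping (P2). A sketch using NumPy's Legendre module; the window length, order, and synthetic segment are illustrative assumptions.

```python
import numpy as np
from numpy.polynomial import legendre

def lpt_features(st_segment, order=4):
    """Least-squares projection of an ST-segment window onto P_0..P_order
    over [-1, 1]; the coefficients form the feature vector."""
    x = np.linspace(-1.0, 1.0, len(st_segment))
    return legendre.legfit(x, st_segment, order)

# Synthetic segment: baseline level + downward slope + upward scooping.
x = np.linspace(-1, 1, 120)
st = 0.1 - 0.05 * x + 0.2 * x ** 2
c = lpt_features(st, order=4)
print(c.shape)  # (5,): coefficients of P_0..P_4
```

For this synthetic segment the fit recovers the construction exactly: c[1] is the slope term -0.05, while the x^2 term splits between c[0] and c[2] because P2 = (3x^2 - 1)/2.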
Classifying Urban Space Types of Seoul using Time-series Heat Island map
NASA Astrophysics Data System (ADS)
Jung, S.; KIM, H.; JE, M.
2017-12-01
In August 2016, Korea experienced its hottest weather since national weather observations began. Due to climate change, such heat events are expected to become more severe in the future. This study therefore analyzed the 2016 heatwave in Seoul from various angles to identify the characteristics of urban regions where the heat island phenomenon occurred. To do this, first, temperature data for August 6 and 12, 2016, the two days when the hottest heatwave occurred, were collected from 287 automatic weather stations (AWS) installed in Seoul and adjacent suburbs. The temperature distribution of Seoul was mapped hourly by interpolating the collected temperature data. Second, regions in Seoul were classified using statistical methods based on spatial characteristics such as land coverage, density, use type, and traffic volume. Third, the daily pattern of temperature change in the classified regions was depicted with a graph, and regions were re-classified based on this daily pattern. Finally, the characteristics of the classified regions were reviewed, and heat island occurrence, persistence, and reduction measures by region type were discussed. The analysis showed that the pattern of heatwave occurrence differed by region type. The results also showed that not only physical characteristics such as land coverage but also socioeconomic indices such as population density and the floating population that induces traffic volume influenced the pattern of heatwave occurrence, even among regions with the same land use. This study not only classified urban climate regions by existing mean temperature and specific time-point temperature but also proposed a methodology for analyzing the heat island phenomenon inside cities using time-series temperature data over a day. 
Furthermore, this study enabled regional classification based on heat island characteristics, contributing to the establishment of countermeasures tailored to each regional class.
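Hourly temperature surfaces like those described can be produced with a simple spatial interpolator. The study does not specify its interpolation method, so the sketch below uses inverse distance weighting purely as an illustration, with synthetic station locations and readings.

```python
import numpy as np

def idw(stations, temps, grid_points, power=2.0):
    """Inverse-distance-weighted temperature surface from point AWS readings."""
    out = np.empty(len(grid_points))
    for i, p in enumerate(grid_points):
        d = np.linalg.norm(stations - p, axis=1)
        if d.min() < 1e-9:                 # grid point coincides with a station
            out[i] = temps[d.argmin()]
            continue
        w = 1.0 / d ** power               # closer stations get more weight
        out[i] = np.sum(w * temps) / np.sum(w)
    return out

rng = np.random.default_rng(0)
stations = rng.uniform(0, 10, (287, 2))    # synthetic AWS locations (km)
temps = 30 + rng.normal(0, 2, 287)         # one hour of readings (deg C)
grid = np.array([[x, y] for x in range(11) for y in range(11)], float)
surface = idw(stations, temps, grid)
print(surface.shape)  # (121,)
```

Because each interpolated value is a convex combination of the readings, the surface stays within the observed temperature range, a useful sanity check for hourly heat-island maps.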
Nutrient trends through time in Sweden's Baltic Drainage Area
NASA Astrophysics Data System (ADS)
Fischer, I.; Destouni, G.; Prieto, C.
2015-12-01
Changes in climate and land use have modified and will continue to modify regional hydrology, in turn impacting environmental health, agricultural productivity, and water resource quality and availability. The Baltic region is an area of interest as the coast spans nine countries, serving over 100 million people. The Baltic Sea contains one of the largest human-caused hypoxic dead zones, due to eutrophication driven by anthropogenic excess loading of nutrients. Policies to reduce these loads also include international directives and agreements, such as the EU Water Framework Directive, adopted in 2000 to protect and improve water quality throughout the European Union, and the Baltic Sea Action Plan under the Helsinki Commission, aimed specifically at reducing the nutrient loading to and mitigating the eutrophication of the Baltic Sea. In light of these policies and amidst the number of studies on the Baltic Sea, we ask: using the accessible nutrient and discharge data, what does nutrient loading look like today? Are the most excessive loads going down? Observed nutrient and flow time series across Sweden allow for answering these questions, by spatial and temporal trend analysis of loads from various parts of Sweden to the Baltic Sea. Analyzing these observed time series in conjunction with the ecological health status classifications of the EU Water Framework Directive allows, in particular, for answering the question of whether the loads into the water bodies with the poorest water quality, and from those to the Baltic Sea, are improving, being maintained, or deteriorating. Such insight is required to contribute to relevant and efficient water and nutrient load management. Furthermore, empirically calculating nutrient loads, rather than only modeling them, reveals that the water body health classification may not reflect which water bodies actually contribute the heaviest loads to the Baltic Sea. 
This work also underscores the importance of comprehensive analysis of all available data from long term monitoring programs over large spatial scales, including large water quality gradients, in order to assess and address water management problems of today and the future.
Asteroid proper elements and secular resonances
NASA Technical Reports Server (NTRS)
Knezevic, Zoran; Milani, Andrea
1992-01-01
In a series of papers (e.g., Knezevic, 1991; Milani and Knezevic, 1990; 1991) we reported on the progress we were making in computing asteroid proper elements, both as regards their accuracy and long-term stability. Additionally, we reported on the efficiency and 'intelligence' of our software. At the same time, we studied the associated problems of resonance effects, and we introduced the new class of 'nonlinear' secular resonances; we determined the locations of these secular resonances in proper-element phase space and analyzed their impact on the asteroid family classification. Here we would like to summarize the current status of our work and possible further developments.
Machine learning methods for classifying human physical activity from on-body accelerometers.
Mannini, Andrea; Sabatini, Angelo Maria
2010-01-01
The use of on-body wearable sensors is widespread in several academic and industrial domains. Of great interest are their applications in ambulatory monitoring and pervasive computing systems; here, some quantitative analysis of human motion and its automatic classification are the main computational tasks to be pursued. In this paper, we discuss how human physical activity can be classified using on-body accelerometers, with a major emphasis devoted to the computational algorithms employed for this purpose. In particular, we motivate our current interest for classifiers based on Hidden Markov Models (HMMs). An example is illustrated and discussed by analysing a dataset of accelerometer time series.
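The core HMM computation behind such activity classifiers is Viterbi decoding of the most likely activity sequence from a discretized accelerometer stream. A self-contained sketch; the two states and two observation symbols are hypothetical, not the paper's model.

```python
import numpy as np

def viterbi(obs, log_pi, log_A, log_B):
    """Most likely hidden state sequence for a discrete-observation HMM.

    obs: observation symbol indices; log_pi: initial log-probabilities;
    log_A[i, j]: log P(state j | state i); log_B[i, k]: log P(symbol k | state i)."""
    n_states, T = log_pi.shape[0], len(obs)
    delta = np.zeros((T, n_states))            # best log-score ending in each state
    psi = np.zeros((T, n_states), dtype=int)   # backpointers
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A # (from_state, to_state)
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]
    path = np.zeros(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):             # backtrace
        path[t] = psi[t + 1][path[t + 1]]
    return path

# Hypothetical model: states 0=rest, 1=walk; symbols 0=low, 1=high acceleration.
log_pi = np.log([0.5, 0.5])
log_A = np.log([[0.9, 0.1], [0.1, 0.9]])       # sticky transitions
log_B = np.log([[0.9, 0.1], [0.2, 0.8]])
obs = np.array([0, 0, 0, 1, 1, 1, 0, 0])
print(viterbi(obs, log_pi, log_A, log_B))  # [0 0 0 1 1 1 0 0]
```

The sticky transition matrix smooths over isolated noisy symbols, which is exactly why HMMs suit activity streams better than per-sample classification.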
NASA Astrophysics Data System (ADS)
Wardlow, Brian Douglas
The objectives of this research were to: (1) investigate time-series MODIS (Moderate Resolution Imaging Spectroradiometer) 250-meter EVI (Enhanced Vegetation Index) and NDVI (Normalized Difference Vegetation Index) data for regional-scale crop-related land use/land cover characterization in the U.S. Central Great Plains and (2) develop and test a MODIS-based crop mapping protocol. A pixel-level analysis of the time-series MODIS 250-m VIs for 2,000+ field sites across Kansas found that unique spectral-temporal signatures were detected for the region's major crop types, consistent with the crops' phenology. Intra-class variations were detected in VI data associated with different land use practices, climatic conditions, and planting dates for the crops. The VIs depicted similar seasonal variations and were highly correlated. A pilot study in southwest Kansas found that accurate and detailed cropping patterns could be mapped using the MODIS 250-m VI data. Overall and class-specific accuracies were generally greater than 90% for mapping crop/non-crop, general crops (alfalfa, summer crops, winter wheat, and fallow), summer crops (corn, sorghum, and soybeans), and irrigated/non-irrigated crops using either VI dataset. The classified crop areas also had a high level of agreement (<5% difference) with the USDA reported crop areas. Both VIs produced comparable crop maps with only a 1-2% difference between their classification accuracies and a high level of pixel-level agreement (>90%) between their classified crop patterns. This hierarchical crop mapping protocol was tested for Kansas and produced similar classification results over a larger and more diverse area. Overall and class-specific accuracies were typically between 85% and 95% for the crop maps. At the state level, the maps had a high level of areal agreement (<5% difference) with the USDA crop area figures and their classified patterns were consistent with the state's cropping practices. 
In general, the protocol's performance was relatively consistent across the state's range of environmental conditions, landscape patterns, and cropping practices. The largest areal differences occurred in eastern Kansas due to the omission of many small cropland areas that were not resolvable at MODIS' 250-m resolution. Notable regional deviations in classified areas also occurred for selected classes due to localized precipitation patterns and specific cropping practices.
Silicon/Carbon Anodes with One-Dimensional Pore Structure for Lithium-Ion Batteries
2012-02-28
Annual progress report for Grant # W911NF1110231. ABSTRACT: A series of composite electrode materials have been synthesized and …
Automated time series forecasting for biosurveillance.
Burkom, Howard S; Murphy, Sean Patrick; Shmueli, Galit
2007-09-30
For robust detection performance, traditional control chart monitoring for biosurveillance is based on input data free of trends, day-of-week effects, and other systematic behaviour. Time series forecasting methods may be used to remove this behaviour by subtracting forecasts from observations to form residuals for algorithmic input. We describe three forecast methods and compare their predictive accuracy on each of 16 authentic syndromic data streams. The methods are (1) a non-adaptive regression model using a long historical baseline, (2) an adaptive regression model with a shorter, sliding baseline, and (3) the Holt-Winters method for generalized exponential smoothing. Criteria for comparing the forecasts were the root-mean-square error, the median absolute per cent error (MedAPE), and the median absolute deviation. The median-based criteria showed best overall performance for the Holt-Winters method. The MedAPE measures over the 16 test series averaged 16.5, 11.6, and 9.7 for the non-adaptive regression, adaptive regression, and Holt-Winters methods, respectively. The non-adaptive regression forecasts were degraded by changes in the data behaviour in the fixed baseline period used to compute model coefficients. The mean-based criterion was less conclusive because of the effects of poor forecasts on a small number of calendar holidays. The Holt-Winters method was also most effective at removing serial autocorrelation, with most 1-day-lag autocorrelation coefficients below 0.15. The forecast methods were compared without tuning them to the behaviour of individual series. We achieved improved predictions with such tuning of the Holt-Winters method, but practical use of such improvements for routine surveillance will require reliable data classification methods.
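The Holt-Winters method referenced above maintains level, trend, and seasonal components, each updated by exponential smoothing; residuals for algorithmic input are then observations minus one-step-ahead forecasts. A sketch with additive day-of-week seasonality, together with the MedAPE criterion; the smoothing constants and synthetic series are illustrative, not the paper's settings.

```python
import numpy as np

def holt_winters(y, period=7, alpha=0.4, beta=0.1, gamma=0.2):
    """Additive Holt-Winters; returns one-step-ahead forecasts for y[period:]."""
    level = np.mean(y[:period])
    trend = 0.0
    season = np.array(y[:period], float) - level   # crude seasonal initialization
    forecasts = []
    for t in range(period, len(y)):
        f = level + trend + season[t % period]     # forecast before seeing y[t]
        forecasts.append(f)
        new_level = alpha * (y[t] - season[t % period]) + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        season[t % period] = gamma * (y[t] - new_level) + (1 - gamma) * season[t % period]
        level = new_level
    return np.array(forecasts)

def medape(y, f):
    """Median absolute percent error, one of the comparison criteria above."""
    return float(np.median(np.abs((y - f) / y)) * 100)

t = np.arange(120)
y = 50 + 0.1 * t + 8 * np.sin(2 * np.pi * t / 7)   # trend + day-of-week effect
f = holt_winters(y, period=7)
print(f.shape)  # (113,)
```

Subtracting `f` from `y[7:]` yields residuals largely free of the trend and weekly pattern, which is the input form the control-chart detectors require.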
Classification of full-thickness rotator cuff lesions: a review
Lädermann, Alexandre; Burkhart, Stephen S.; Hoffmeyer, Pierre; Neyton, Lionel; Collin, Philippe; Yates, Evan; Denard, Patrick J.
2016-01-01
Rotator cuff lesions (RCL) have considerable variability in location, tear pattern, functional impairment, and repairability. Historical classifications for differentiating these lesions have been based upon factors such as the size and shape of the tear, and the degree of atrophy and fatty infiltration. Additional recent descriptions include bipolar rotator cuff insufficiency, ‘Fosbury flop tears’, and musculotendinous lesions. Recommended treatment is based on the location of the lesion, patient factors and associated pathology, and often includes personal experience and data from case series. Development of a more comprehensive classification which integrates historical and newer descriptions of RCLs may help to guide treatment further. Cite this article: Lädermann A, Burkhart SS, Hoffmeyer P, et al. Classification of full thickness rotator cuff lesions: a review. EFORT Open Rev 2016;1:420-430. DOI: 10.1302/2058-5241.1.160005. PMID:28461921
Mapping Brazilian savanna vegetation gradients with Landsat time series
NASA Astrophysics Data System (ADS)
Schwieder, Marcel; Leitão, Pedro J.; da Cunha Bustamante, Mercedes Maria; Ferreira, Laerte Guimarães; Rabe, Andreas; Hostert, Patrick
2016-10-01
Global change has tremendous impacts on savanna systems around the world. Processes related to climate change or agricultural expansion threaten the ecosystem's state, function and the services it provides. A prominent example is the Brazilian Cerrado that has an extent of around 2 million km2 and features high biodiversity with many endemic species. It is characterized by landscape patterns from open grasslands to dense forests, defining a heterogeneous gradient in vegetation structure throughout the biome. While it is undisputed that the Cerrado provides a multitude of valuable ecosystem services, it is exposed to changes, e.g. through large scale land conversions or climatic changes. Monitoring of the Cerrado is thus urgently needed to assess the state of the system as well as to analyze and further understand ecosystem responses and adaptations to ongoing changes. Therefore, we explored the potential of dense Landsat time series to derive phenological information for mapping vegetation gradients in the Cerrado. Frequent data gaps, e.g. due to cloud contamination, impose a serious challenge for such time series analyses. We synthetically filled data gaps based on Radial Basis Function convolution filters to derive continuous pixel-wise temporal profiles capable of representing Land Surface Phenology (LSP). Derived phenological parameters revealed differences in the seasonal cycle between the main Cerrado physiognomies and could thus be used to calibrate a Support Vector Classification model to map their spatial distribution. Our results show that it is possible to map the main spatial patterns of the observed physiognomies based on their phenological differences, although inaccuracies occurred, especially between similar classes and in data-scarce areas. The outcome emphasizes the need for remote sensing based time series analyses at fine scales. 
Mapping heterogeneous ecosystems such as savannas requires spatial detail, as well as the ability to derive important phenological parameters for monitoring habitats or ecosystem responses to climate change. The open Landsat and Sentinel-2 archives provide the satellite data needed for improved analyses of savanna ecosystems globally.
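Gap filling with Radial Basis Function convolution filters, as used above, can be illustrated as a Gaussian-weighted average of the valid observations around each target date. The bandwidth and the synthetic NDVI-like series below are assumptions for demonstration, not the study's parameters.

```python
import numpy as np

def rbf_fill(t_obs, values, t_out, sigma=15.0):
    """Fill gaps in a sparse vegetation-index series with a Gaussian (RBF) kernel:
    each output date is a distance-weighted mean of the available observations."""
    t_obs = np.asarray(t_obs, float)
    filled = np.empty(len(t_out))
    for i, ti in enumerate(t_out):
        w = np.exp(-((t_obs - ti) ** 2) / (2 * sigma ** 2))
        filled[i] = np.sum(w * values) / np.sum(w)
    return filled

doy = np.array([10, 40, 90, 130, 200, 260, 330])   # cloud-free acquisition days
ndvi = np.array([0.2, 0.3, 0.55, 0.8, 0.75, 0.4, 0.25])
daily = rbf_fill(doy, ndvi, np.arange(1, 366))     # continuous daily profile
print(daily.shape)  # (365,)
```

The resulting continuous profile is what phenological parameters (start of season, peak, amplitude) are extracted from; a practical implementation would combine several kernel widths to balance smoothing against gap size.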
Structural health monitoring feature design by genetic programming
NASA Astrophysics Data System (ADS)
Harvey, Dustin Y.; Todd, Michael D.
2014-09-01
Structural health monitoring (SHM) systems provide real-time damage and performance information for civil, aerospace, and other high-capital or life-safety critical structures. Conventional data processing involves pre-processing and extraction of low-dimensional features from in situ time series measurements. The features are then input to a statistical pattern recognition algorithm to perform the relevant classification or regression task necessary to facilitate decisions by the SHM system. Traditional design of signal processing and feature extraction algorithms can be an expensive and time-consuming process requiring extensive system knowledge and domain expertise. Genetic programming, a heuristic program search method from evolutionary computation, was recently adapted by the authors to perform automated, data-driven design of signal processing and feature extraction algorithms for statistical pattern recognition applications. The proposed method, called Autofead, is particularly suitable to handle the challenges inherent in algorithm design for SHM problems where the manifestation of damage in structural response measurements is often unclear or unknown. Autofead mines a training database of response measurements to discover information-rich features specific to the problem at hand. This study provides experimental validation on three SHM applications including ultrasonic damage detection, bearing damage classification for rotating machinery, and vibration-based structural health monitoring. Performance comparisons with common feature choices for each problem area are provided demonstrating the versatility of Autofead to produce significant algorithm improvements on a wide range of problems.
McDermott, P A; Hale, R L
1982-07-01
Tested diagnostic classifications of child psychopathology produced by a computerized technique known as multidimensional actuarial classification (MAC) against the criterion of expert psychological opinion. The MAC program applies a series of statistical decision rules to assess the importance of and relationships among several dimensions of classification, i.e., intellectual functioning, academic achievement, adaptive behavior, and social and behavioral adjustment, to perform differential diagnosis of children's mental retardation, specific learning disabilities, behavioral and emotional disturbance, possible communication or perceptual-motor impairment, and academic under- and overachievement in reading and mathematics. Classifications rendered by MAC are compared to those offered by two expert child psychologists for cases of 73 children referred for psychological services. Experts' agreement with MAC was significant for all classification areas, as was MAC's agreement with the experts held as a conjoint reference standard. Whereas the experts' agreement with MAC averaged 86.0% above chance, their agreement with one another averaged 76.5% above chance. Implications of the findings are explored and potential advantages of the systems-actuarial approach are discussed.
Automatic Classification of Medical Text: The Influence of Publication Form
Cole, William G.; Michael, Patricia A.; Stewart, James G.; Blois, Marsden S.
1988-01-01
Previous research has shown that within the domain of medical journal abstracts the statistical distribution of words is neither random nor uniform, but is highly characteristic. Many words are used mainly or solely by one medical specialty or when writing about one particular level of description. Due to this regularity of usage, automatic classification within journal abstracts has proved quite successful. The present research asks two further questions. It investigates whether this statistical regularity and automatic classification success can also be achieved in medical textbook chapters. It then goes on to see whether the statistical distribution found in textbooks is sufficiently similar to that found in abstracts to permit accurate classification of abstracts based solely on previous knowledge of textbooks. 14 textbook chapters and 45 MEDLINE abstracts were submitted to an automatic classification program that had been trained only on chapters drawn from a standard textbook series. Statistical analysis of the properties of abstracts vs. chapters revealed important differences in word use. Automatic classification performance was good for chapters, but poor for abstracts.
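Classification from characteristic word distributions of this kind is commonly implemented as a naive Bayes classifier over word counts. The sketch below, with a toy two-specialty corpus, illustrates the general technique; it is not the program used in the study.

```python
import math
from collections import Counter

def train(docs_by_class):
    """Per-class word probabilities with add-one (Laplace) smoothing."""
    vocab = {w for docs in docs_by_class.values() for d in docs for w in d.split()}
    model = {}
    for label, docs in docs_by_class.items():
        counts = Counter(w for d in docs for w in d.split())
        total = sum(counts.values())
        model[label] = {w: (counts[w] + 1) / (total + len(vocab)) for w in vocab}
    return model

def classify(model, text):
    """Pick the class whose word distribution best explains the text."""
    scores = {label: sum(math.log(probs.get(w, 1e-9)) for w in text.split())
              for label, probs in model.items()}
    return max(scores, key=scores.get)

# Toy corpus standing in for training chapters of two specialties.
corpus = {
    "cardiology": ["myocardial infarction st segment", "atrial fibrillation ecg rhythm"],
    "oncology": ["tumor biopsy malignant lesion", "chemotherapy tumor response"],
}
model = train(corpus)
print(classify(model, "st segment ecg"))  # cardiology
```

The finding above, that chapter-trained models classify chapters well but abstracts poorly, corresponds here to a train/test mismatch between the word distributions of the two publication forms.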
Evidence for Chaotic Edge Turbulence in the Alcator C-Mod Tokamak
NASA Astrophysics Data System (ADS)
Zhu, Ziyan; White, Anne; Carter, Troy; Terry, Jim; Baek, Seung Gyou
2017-10-01
Turbulence greatly reduces the confinement time of magnetically confined plasmas; understanding the nature of this turbulence and the associated transport is therefore of great importance. This research seeks to establish whether turbulent fluctuations in Alcator C-Mod are chaotic or stochastic. Stochastic fluctuations may lead to a random walk diffusive transport, whereas a diffusive description is unlikely to be valid for chaotic fluctuations, since chaotic dynamics live in restricted regions of phase space (e.g., on attractors). Analysis of the time series obtained with the O-mode reflectometer and the gas puff imaging (GPI) systems reveals that the turbulent density fluctuations in C-Mod are chaotic. Supporting evidence for this conclusion includes the observation of an exponential power spectrum (which is associated with Lorentzian-shaped pulses in the time series), the population of the corresponding Bandt-Pompe (BP) probability distribution, and the location of the signal on the Complexity-Entropy plane (C-H plane). These analysis techniques will be briefly introduced along with a discussion of the analysis results. The classification of edge turbulence as chaotic opens the door for further work to understand the underlying process and the impact on turbulent transport. Supported by USDoE awards DE-FC02-99ER54512 and DE-FC02-07ER54918:011.
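The Bandt-Pompe construction mentioned above maps each short window of the series to the permutation that sorts it and builds a probability distribution over these ordinal patterns; the normalized entropy of that distribution is one coordinate of the C-H plane. A minimal sketch (the embedding order and test signals are illustrative):

```python
import math
from itertools import permutations
import numpy as np

def permutation_entropy(x, order=4, delay=1):
    """Normalized Bandt-Pompe permutation entropy of a time series.
    Values near 1 indicate noise-like fluctuations; low values indicate
    strongly ordered dynamics."""
    patterns = {p: 0 for p in permutations(range(order))}
    n = len(x) - (order - 1) * delay
    for i in range(n):
        window = x[i : i + order * delay : delay]
        patterns[tuple(np.argsort(window))] += 1   # ordinal pattern of the window
    probs = np.array([c for c in patterns.values() if c > 0]) / n
    H = -np.sum(probs * np.log(probs))
    return float(H / math.log(math.factorial(order)))

rng = np.random.default_rng(0)
noise = rng.normal(0, 1, 5000)          # stochastic reference signal
ramp = np.arange(5000, dtype=float)     # fully ordered signal: one pattern only
print(permutation_entropy(ramp))  # 0.0
```

White noise populates all order! patterns nearly uniformly (entropy near 1), while chaotic signals occupy a restricted set of patterns; pairing this entropy with a statistical complexity measure gives the signal's position on the C-H plane.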
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-11-24
This proposed action provides the Department of Energy (DOE) authorization to the US Army to conduct a testing program using Depleted Uranium (DU) in Area 25 at the Nevada Test Site (NTS). The US Army Ballistic Research Laboratory (BRL) would be the managing agency for the program. The proposed action site would utilize existing facilities, and human activity would be confined to areas identified as having no tortoise activity. Two classifications of tests would be conducted under the testing program: (1) open-air tests, and (2) X-Tunnel tests. A series of investigative tests would be conducted to obtain information on DU use under the conditions of each classification. The open-air tests would include DU ammunition hazard classification and combat systems activity tests. Upon completion of each test or series of tests, the area would be decontaminated to meet requirements of DOE Order 5400.5, Radiation Protection of the Public and Environment. All contaminated materials would be decontaminated or disposed of as radioactive waste in an approved low-level Radioactive Waste Management Site (RWMS) by personnel trained specifically for this purpose.
Visualization of system dynamics using phasegrams
Herbst, Christian T.; Herzel, Hanspeter; Švec, Jan G.; Wyman, Megan T.; Fitch, W. Tecumseh
2013-01-01
A new tool for visualization and analysis of system dynamics is introduced: the phasegram. Its application is illustrated with both classical nonlinear systems (logistic map and Lorenz system) and with biological voice signals. Phasegrams combine the advantages of sliding-window analysis (such as the spectrogram) with well-established visualization techniques from the domain of nonlinear dynamics. In a phasegram, time is mapped onto the x-axis, and various vibratory regimes, such as periodic oscillation, subharmonics or chaos, are identified within the generated graph by the number and stability of horizontal lines. A phasegram can be interpreted as a bifurcation diagram in time. In contrast to other analysis techniques, it can be automatically constructed from time-series data alone: no additional system parameter needs to be known. Phasegrams show great potential for signal classification and can act as the quantitative basis for further analysis of oscillating systems in many scientific fields, such as physics (particularly acoustics), biology or medicine. PMID:23697715
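A minimal sketch of one phasegram column may clarify the construction: the signal is delay-embedded, the trajectory is intersected with a Poincaré section, and the intersection values become one vertical slice of the plot. This is a simplified reading of the method, not the authors' implementation; the window, hop, and delay values are illustrative.

```python
import numpy as np

def phasegram_column(window, tau=10):
    """One phasegram column: values of x(t) where the delay-embedded
    trajectory (x(t), x(t+tau)) crosses the section x(t) = x(t+tau)
    in the upward direction (a simplified Poincare section)."""
    a, b = window[:-tau], window[tau:]
    d = a - b
    cross = np.where((d[:-1] < 0) & (d[1:] >= 0))[0]
    return a[cross]

def phasegram(x, win=400, hop=100, tau=10):
    """List of (start_index, crossing_values) pairs. Periodic windows
    yield few distinct crossing values (few horizontal lines); chaotic
    windows yield many scattered values."""
    return [(i, phasegram_column(x[i:i + win], tau))
            for i in range(0, len(x) - win + 1, hop)]
```

For a pure sine wave every column collapses onto essentially one value, which is the single stable horizontal line the paper describes for periodic oscillation.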
Constructions and classifications of projective Poisson varieties.
Pym, Brent
2018-01-01
This paper is intended both as an introduction to the algebraic geometry of holomorphic Poisson brackets, and as a survey of results on the classification of projective Poisson manifolds that have been obtained in the past 20 years. It is based on the lecture series delivered by the author at the Poisson 2016 Summer School in Geneva. The paper begins with a detailed treatment of Poisson surfaces, including adjunction, ruled surfaces and blowups, and leading to a statement of the full birational classification. We then describe several constructions of Poisson threefolds, outlining the classification in the regular case, and the case of rank-one Fano threefolds (such as projective space). Following a brief introduction to the notion of Poisson subspaces, we discuss Bondal's conjecture on the dimensions of degeneracy loci on Poisson Fano manifolds. We close with a discussion of log symplectic manifolds with simple normal crossings degeneracy divisor, including a new proof of the classification in the case of rank-one Fano manifolds.
Sea ice type dynamics in the Arctic based on Sentinel-1 Data
NASA Astrophysics Data System (ADS)
Babiker, Mohamed; Korosov, Anton; Park, Jeong-Won
2017-04-01
Sea ice observation from satellites has been carried out for more than four decades and is one of the most important applications of EO data in operational monitoring as well as in climate change studies. Several sensors and retrieval methods have been developed and successfully utilized to measure sea ice area, concentration, drift, type, thickness, etc. [e.g. Breivik et al., 2009]. Today operational sea ice monitoring and analysis is fully dependent on the use of satellite data. However, new and improved satellite systems, such as multi-polarisation Synthetic Aperture Radar (SAR), require further studies to develop more advanced and automated sea ice monitoring methods. In addition, the unprecedented volume of data available from recently launched Sentinel missions provides both challenges and opportunities for studying sea ice dynamics. In this study we investigate sea ice type dynamics in the Fram Strait based on Sentinel-1 A/B SAR data. A series of images for the winter season is classified into four ice types (young ice, first-year ice, multiyear ice and leads) using our new sea ice classification algorithm, which is based on segmentation, GLCM calculation, Haralick texture feature extraction, and unsupervised and supervised classification with a Support Vector Machine (SVM) [Zakhvatkina et al., 2016; Korosov et al., 2016]. This algorithm is further improved by applying thermal and scalloping noise removal [Park et al., 2016]. Sea ice drift is retrieved from the same series of Sentinel-1 images using a newly developed algorithm based on a combination of feature tracking and pattern matching [Mukenhuber et al., 2016]. Time series of these two products (sea ice type and sea ice drift) are combined in order to study sea ice deformation processes at small scales. Zones of sea ice convergence and divergence identified from sea ice drift are compared with ridges and leads identified from texture features.
This allows more specific interpretation of SAR imagery and more accurate automatic classification. In addition, the map of four ice types calculated using the texture features from one SAR image is propagated forward using the sea ice drift vectors. The propagated ice type is compared with the ice type derived from the next image. The comparison identifies changes in ice type that occurred during drift and helps reduce uncertainties in the sea ice type calculation.
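The texture step in such pipelines rests on the gray-level co-occurrence matrix (GLCM) and Haralick statistics. A toy sketch for a single horizontal offset follows; it is an illustration of the general technique, not the authors' pipeline, and the quantization to 8 levels is an assumption.

```python
import numpy as np

def glcm(img, levels=8):
    """Gray-level co-occurrence matrix for the horizontal (dx=1) offset,
    after quantizing img (values in [0, 1]) to `levels` gray levels."""
    q = np.minimum((np.asarray(img) * levels).astype(int), levels - 1)
    P = np.zeros((levels, levels))
    np.add.at(P, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)  # count pixel pairs
    return P / P.sum()

def haralick_contrast(P):
    """Haralick contrast: expected squared gray-level difference between
    co-occurring pixels; high for rough texture (e.g. ridges), low for
    smooth level ice."""
    i, j = np.indices(P.shape)
    return float(np.sum(P * (i - j) ** 2))
```

In a full pipeline many offsets and several Haralick features would be stacked per image segment and fed to the SVM classifier.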
Support vector machines for TEC seismo-ionospheric anomalies detection
NASA Astrophysics Data System (ADS)
Akhoondzadeh, M.
2013-02-01
Using time series prediction methods, it is possible to track the behavior of earthquake precursors into the future and to issue early warnings when the difference between the predicted value and the observed value exceeds a predefined threshold. Support Vector Machines (SVMs) are widely used due to their many advantages for classification and regression tasks. This study investigates Total Electron Content (TEC) time series using an SVM to detect seismo-ionospheric anomalous variations induced by three powerful earthquakes: Tohoku (11 March 2011), Haiti (12 January 2010) and Samoa (29 September 2009). The durations of the TEC time series datasets are 49, 46 and 71 days for the Tohoku, Haiti and Samoa earthquakes, respectively, each with a time resolution of 2 h. In the case of the Tohoku earthquake, the results show that the difference between the value predicted by the SVM method and the observed value reaches its maximum (129.31 TECU) at earthquake time, during a period of high geomagnetic activity. The SVM method detected a considerable number of anomalous occurrences 1 and 2 days prior to the Haiti earthquake and also 1 and 5 days before the Samoa earthquake, during periods of low geomagnetic activity. To show that the method behaves sensibly on both event and nonevent TEC data, i.e., to perform null-hypothesis tests that also calibrate the method, the same period of data from the year preceding the Samoa earthquake was analyzed. In addition, the TEC anomalies detected using the SVM method were compared to previous results (Akhoondzadeh and Saradjian, 2011; Akhoondzadeh, 2012) obtained from the mean, median, wavelet and Kalman filter methods. The SVM-detected anomalies are similar to those detected using the previous methods.
It can be concluded that the SVM is a suitable learning method for detecting novel changes in a nonlinear time series, such as variations of earthquake precursors.
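The prediction-error thresholding logic described above can be sketched generically. For brevity a plain least-squares autoregressive predictor stands in for the SVM regressor; the anomaly rule (flag points whose one-step prediction error exceeds k standard deviations) is the same. The lag order and k are illustrative.

```python
import numpy as np

def detect_anomalies(x, order=6, k=3.0):
    """Flag indices of x whose one-step prediction error exceeds
    k standard deviations of the residuals. A least-squares AR
    predictor is used here as a stand-in for an SVM regressor."""
    # Lagged design matrix: predict x[t] from x[t-order : t].
    X = np.column_stack([x[i:len(x) - order + i] for i in range(order)])
    y = x[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    err = y - X @ coef
    thr = k * err.std()
    return np.where(np.abs(err) > thr)[0] + order  # indices into x
```

On a smooth background signal an injected spike produces a prediction error far above the threshold and is flagged.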
NASA Astrophysics Data System (ADS)
Pouclet, A.
1980-09-01
The lavas of the Central Africa rift (Western rift) are distributed in three groups of increasing alkalinity. The petrographical and chemical data yield a classification of seven series: one alkaline-basalt series in the first, weakly alkaline group; two basanitic series, sodic or potassic, in the second, fairly alkaline group; and four nephelinitic, melilitic, perpotassic or carbonatitic series in the third, strongly alkaline group. The definitions of all these lavas are reviewed. We propose a simplified terminology with, in particular, a nomenclature for the K-lavas parallel to that of the Na-lavas and a division using the DI of Thornton and Tuttle (1960).
NASA Astrophysics Data System (ADS)
Ford, R. E.
2006-12-01
In 2006 the Loma Linda University ESSE21 Mesoamerican Project (Earth System Science Education for the 21st Century), along with partners such as the University of Redlands and California State University, Pomona, produced an online learning module designed to help students learn critical remote sensing skills, specifically ecosystem characterization, i.e. performing a supervised or unsupervised classification of satellite imagery in a tropical coastal environment. The module also teaches how to measure land use / land cover change (LULC) over time and encourages students to use that data to assess the Human Dimensions of Global Change (HDGC). Specific objectives include: 1. Learn where to find remote sensing data and practice downloading, pre-processing, and "cleaning" the data for image analysis. 2. Use Leica-Geosystems ERDAS Imagine or IDRISI Kilimanjaro to analyze and display the data. 3. Do an unsupervised classification of a LANDSAT image of a protected area in Honduras, i.e. Cuero y Salado, Pico Bonito, or Isla del Tigre. 4. Virtually participate in a ground-validation exercise that would allow one to re-classify the image into a supervised classification using the FAO Global Land Cover Network (GLCN) classification system. 5. Learn more about each protected area's landscape, history, livelihood patterns and "sustainability" issues via virtual online tours that provide ground and space photos of different sites. This will help students identify potential "training sites" for doing a supervised classification. 6. Study other global, US, Canadian, and European land use/land cover classification systems and compare their advantages and disadvantages relative to the FAO/GLCN system. 7. Learn to appreciate the advantages and disadvantages of existing LULC classification schemes and adapt them to local-level user needs. 8. Carry out a change detection exercise that shows how land use and/or land cover has changed over time for the protected area of your choice.
The presenter will demonstrate the module, assess the collaborative process that created it, and describe how it has been used so far by users in the US as well as in Honduras and elsewhere via a series of joint workshops held in Mesoamerica. Suggestions for improvement will be requested. See the module and related content resources at: http://resweb.llu.edu/rford/ESSE21/LUCCModule/
NASA Astrophysics Data System (ADS)
Behling, Robert; Milewski, Robert; Chabrillat, Sabine
2018-06-01
This paper proposes the remote sensing time series approach WLMO (Water-Land MOnitor) to monitor spatiotemporal shoreline changes. The approach uses a hierarchical classification system based on temporal MNDWI-trajectories, with the goal of accommodating typical uncertainties in remote sensing shoreline extraction techniques, such as the presence of clouds and geometric mismatches between images. Applied to a dense Landsat time series between 1984 and 2014 for the two Namibian coastal lagoons at Walvis Bay and Sandwich Harbour, the WLMO was able to identify detailed accretion and erosion progressions at the sand spits forming these lagoons. For both lagoons a northward expansion of the sand spits of up to 1000 m was identified, which corresponds well with the prevailing northward-directed ocean current and wind processes that are responsible for material transport along the shore. At Walvis Bay we could also show that over the 30 years of analysis the sand spit's width decreased by more than half, from 750 m in 1984 to 360 m in 2014. This ongoing cross-shore erosion process is a severe risk for future sand spit breaching, which would expose parts of the lagoon and the city to the open ocean. One of the major advantages of WLMO is the opportunity to analyze detailed spatiotemporal shoreline changes. Thus, it could be shown that the observed long-term accretion and erosion processes underwent great variations over time and cannot a priori be assumed to be linear processes. Such detailed spatiotemporal process patterns are a prerequisite for improving the understanding of the processes forming the Namibian shorelines. Moreover, the approach also has the potential to be used in other coastal areas, because the focus on MNDWI-trajectories allows the transfer to many multispectral satellite sensors (e.g. Sentinel-2, ASTER) available worldwide.
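The MNDWI at the heart of such trajectories is a simple per-pixel band ratio. A minimal sketch follows; the band arguments and the zero threshold are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def mndwi(green, swir):
    """Modified Normalized Difference Water Index: typically positive
    over water and negative over land or sand."""
    green = np.asarray(green, dtype=float)
    swir = np.asarray(swir, dtype=float)
    return (green - swir) / (green + swir + 1e-12)  # guard zero division

def water_mask(green, swir, threshold=0.0):
    """Binary water/land label per pixel; stacking such masks over a
    Landsat time series gives the water-land trajectories that a
    hierarchical shoreline classification can work on."""
    return mndwi(green, swir) > threshold
```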
Lalys, Florent; Riffaud, Laurent; Bouget, David; Jannin, Pierre
2012-01-01
The need for better integration of the new generation of Computer-Assisted-Surgical (CAS) systems has recently been emphasized. One necessity for achieving this objective is to retrieve data from the Operating Room (OR) with different sensors, then to derive models from these data. Recently, the use of videos from cameras in the OR has demonstrated its efficiency. In this paper, we propose a framework to assist in the development of systems for the automatic recognition of high-level surgical tasks using microscope video analysis. We validated its use on cataract procedures. The idea is to combine state-of-the-art computer vision techniques with time series analysis. The first step of the framework consisted in the definition of several visual cues for extracting semantic information, thereby characterizing each frame of the video. Five image-based classifiers were implemented accordingly. A pupil segmentation step was also applied for dedicated visual cue detection. Time series classification algorithms were then applied to model the time-varying data. Dynamic Time Warping (DTW) and Hidden Markov Models (HMM) were tested. This combination drew on the advantages of each method for a better understanding of the problem. The framework was finally validated through various studies. Six binary visual cues were chosen along with 12 phases to detect, obtaining accuracies of 94%. PMID:22203700
ERIC Educational Resources Information Center
Roberts, Douglas A.; And Others
This manual is one of a series designed to assist junior high school teachers in developing general level or non-academic science programs which focus on the relationship between science and society. Although designed primarily for grades 7 and 8, the content is also suitable for students in grade 6. The major portion of the manual consists of six…
Fell, D B; Sprague, A E; Grimshaw, J M; Yasseen, A S; Coyle, D; Dunn, S I; Perkins, S L; Peterson, W E; Johnson, M; Bunting, P S; Walker, M C
2014-03-01
To determine the impact of a health system-wide fetal fibronectin (fFN) testing programme on the rates of hospital admission for preterm labour (PTL). Multiple baseline time-series design. Canadian province of Ontario. A retrospective population-based cohort of antepartum and delivered obstetrical admissions in all Ontario hospitals between 1 April 2002 and 31 March 2010. International Classification of Diseases codes in a health system-wide hospital administrative database were used to identify the study population and define the outcome measure. An aggregate time series of monthly rates of hospital admissions for PTL was analysed using segmented regression models after aligning the fFN test implementation date for each institution. Rate of obstetrical hospital admission for PTL. Estimated rates of hospital admission for PTL following fFN implementation were lower than predicted had pre-implementation trends prevailed. The reduction in the rate was modest, but statistically significant, when estimated at 12 months following fFN implementation (-0.96 hospital admissions for PTL per 100 preterm births; 95% confidence interval [CI], -1.02 to -0.90, P = 0.04). The statistically significant reduction was sustained at 24 and 36 months following implementation. Using a robust quasi-experimental study design to overcome confounding as a result of underlying secular trends or concurrent interventions, we found evidence of a small but statistically significant reduction in the health system-level rate of hospital admissions for PTL following implementation of fFN testing in a large Canadian province. © 2013 Royal College of Obstetricians and Gynaecologists.
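The segmented (interrupted) time-series regression described above has a standard parameterization: intercept, baseline trend, level change at the intervention, and trend change after it. A minimal least-squares sketch, not the study's actual model (which also handled aligned implementation dates and confidence intervals), is:

```python
import numpy as np

def segmented_regression(y, t0):
    """Interrupted time-series fit. Returns [intercept, baseline trend,
    level change at t0, trend change after t0] estimated by ordinary
    least squares on a monthly rate series y."""
    t = np.arange(len(y), dtype=float)
    post = (t >= t0).astype(float)                     # indicator: after intervention
    X = np.column_stack([np.ones_like(t), t, post, post * (t - t0)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta
```

A negative estimated level change (beta[2]) corresponds to the drop in admission rates the study reports after fFN implementation.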
NASA Astrophysics Data System (ADS)
Wijesingha, J. S. J.; Deshapriya, N. L.; Samarakoon, L.
2015-04-01
Billions of people in the world depend on rice as a staple food and as an income-generating crop. Asia is the leader in rice cultivation, and it is necessary to maintain an up-to-date rice-related database to ensure food security as well as economic development. This study investigates the general applicability of the high temporal resolution Moderate Resolution Imaging Spectroradiometer (MODIS) 250 m gridded vegetation product for monitoring rice crop growth, mapping rice crop acreage and analyzing crop yield at the province level. The MODIS 250 m Normalized Difference Vegetation Index (NDVI) and Enhanced Vegetation Index (EVI) time series data, field data and crop calendar information were utilized in this research in Sa Kaeo Province, Thailand. The following methodology was used: (1) data pre-processing and rice plant growth analysis using Vegetation Indices (VI); (2) extraction of rice acreage and start-of-season dates from VI time series data; (3) accuracy assessment; and (4) yield analysis with MODIS VI. The results show a direct relationship between rice plant height and MODIS VI. The crop calendar information and the NDVI time series smoothed with the Whittaker Smoother gave accurate rice acreage estimates (86% area accuracy and 75% classification accuracy). Point-level yield analysis showed that the MODIS EVI is highly correlated with rice yield, and yield prediction using the maximum EVI in the rice cycle predicted yield with an average prediction error of 4.2%. This study shows the immense potential of the MODIS gridded vegetation product for keeping an up-to-date Geographic Information System of rice cultivation.
Understanding multi-scale structural evolution in granular systems through gMEMS
NASA Astrophysics Data System (ADS)
Walker, David M.; Tordesillas, Antoinette
2013-06-01
We show how the rheological response of a material to applied loads can be systematically coded, analyzed and succinctly summarized according to an individual grain's properties (e.g. kinematics). Individual grains are considered as their own smart sensors, akin to microelectromechanical systems (e.g. gyroscopes, accelerometers), each capable of recognizing their evolving role within self-organizing building-block structures (e.g. contact cycles and force chains). A symbolic time series is used to represent their participation in such self-assembled building blocks, and a complex network summarizing their interrelationship with other grains is constructed. In particular, relationships between grain time series are determined according to the information-theoretic Hamming distance or the metric Euclidean distance. We then use topological distance to find network communities, enabling groups of grains at remote physical distances in the material to share a classification. In essence, grains with similar structural and functional roles at different scales are identified together. This taxonomy distills the dissipative structural rearrangements of grains down to their essential features and thus provides pointers for objective physics-based internal variable formalisms used in the construction of robust predictive continuum models.
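The Hamming-distance network construction can be sketched generically: compare symbolic grain histories position by position, then link grains whose histories are sufficiently similar. Community detection on the resulting graph is omitted here; the distance threshold is an illustrative assumption.

```python
import numpy as np

def hamming(a, b):
    """Fraction of time steps at which two equal-length symbolic
    series disagree."""
    a, b = np.asarray(list(a)), np.asarray(list(b))
    return float(np.mean(a != b))

def similarity_network(series, max_dist=0.25):
    """Adjacency matrix linking grains whose symbolic histories lie
    within max_dist of each other. Community detection on this graph
    then groups grains with similar structural roles, regardless of
    their physical separation."""
    n = len(series)
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            if hamming(series[i], series[j]) <= max_dist:
                A[i, j] = A[j, i] = 1
    return A
```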
NASA Astrophysics Data System (ADS)
Nasertdinova, A. D.; Bochkarev, V. V.
2017-11-01
Deep neural networks with a large number of parameters are a powerful tool for solving problems of pattern recognition, prediction and classification. Nevertheless, overfitting remains a serious problem in the use of such networks. A method for solving the problem of overfitting is proposed in this article. This method is based on reducing the number of independent parameters of a neural network model using principal component analysis, and can be implemented using existing libraries of neural computing. The algorithm was tested on the problem of recognition of handwritten symbols from the MNIST database, as well as on the task of predicting time series (series of the average monthly sunspot number and of the Lorenz system were used). It is shown that applying principal component analysis reduces the number of parameters of the neural network model while preserving good results. The average error rate for the recognition of handwritten digits from the MNIST database was 1.12% (comparable to results obtained using "deep training" methods), while the number of parameters of the neural network can be reduced by a factor of up to 130.
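One simple way PCA cuts a network's parameter count, sketched here as an illustration rather than the authors' exact scheme, is to project inputs onto the top-k principal components before the first layer: the first weight matrix then has k*h entries instead of d*h.

```python
import numpy as np

def pca_compress(X, k):
    """Project rows of X (n_samples x d) onto the top-k principal
    components. Returns the compressed data (n_samples x k) and the
    d x k projection matrix to apply to new inputs."""
    Xc = X - X.mean(axis=0)
    # SVD of centered data: rows of Vt are principal directions,
    # ordered by decreasing singular value.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:k].T
    return Xc @ W, W
```

For MNIST-like inputs (d = 784) a projection to k = 50 components shrinks a first layer of h units from 784*h to 50*h weights, which is the kind of reduction the abstract reports.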
Fractal dimension and nonlinear dynamical processes
NASA Astrophysics Data System (ADS)
McCarty, Robert C.; Lindley, John P.
1993-11-01
Mandelbrot, Falconer and others have demonstrated the existence of dimensionally invariant geometrical properties of non-linear dynamical processes known as fractals. Barnsley defines fractal geometry as an extension of classical geometry. Such an extension, however, is not mathematically trivial. Of specific interest to those engaged in signal processing is the potential use of fractal geometry to facilitate the analysis of non-linear signal processes, often referred to as non-linear time series. Fractal geometry has been used in the modeling of non-linear time series represented by radar signals in the presence of ground clutter or interference generated by spatially distributed reflections around the target or a radar system. It was recognized by Mandelbrot that the fractal geometries represented by man-made objects had different dimensions than the geometries of the familiar objects that abound in nature, such as leaves, clouds, ferns, trees, etc. The invariant dimensional property of non-linear processes suggests that in the case of acoustic signals (active or passive) generated within a dispersive medium such as the ocean environment, there exists much rich structure that will aid in the detection and classification of various objects, man-made or natural, within the medium.
Wind data mining by Kohonen Neural Networks.
Fayos, José; Fayos, Carolina
2007-02-14
Time series of Circulation Weather Types (CWT), including daily averaged wind direction and vorticity, are self-classified by similarity using Kohonen Neural Networks (KNN). It is shown that the KNN is able to map by similarity all 7300 five-day CWT sequences during the period 1975-94 in London, United Kingdom. As a first result, it gives the most probable wind sequences preceding each one of the 27 CWT Lamb classes in that period. Inversely, as a second result, the observed diffuse correlation between five-day CWT sequences and the CWT of the 6th day in the long 20-year period can be generalized to predict the latter from the previous CWT sequence in a different test period, such as 1995, as both time series are similar. Although the average prediction error is comparable to that obtained by standard forecasting methods, the KNN approach gives complementary results, as they depend only on an objective classification of observed CWT data, without any model assumption. The 27 CWT of the Lamb Catalogue were coded with binary three-dimensional vectors pointing to faces, edges and vertices of a "wind-cube," so that similar CWT vectors were close.
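A minimal Kohonen self-organizing map has only two ingredients: find the best-matching unit (BMU), then pull it and its grid neighbors toward the sample. The sketch below is a generic toy, not the paper's configuration; grid size, learning-rate and neighborhood schedules are illustrative.

```python
import numpy as np

def train_som(data, grid=(5, 5), epochs=20, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal 2-D Kohonen map: for each sample, find the best-matching
    unit and update all units with a Gaussian neighborhood factor that
    shrinks (together with the learning rate) over epochs."""
    rng = np.random.default_rng(seed)
    h, w = grid
    W = rng.normal(size=(h * w, data.shape[1]))        # codebook vectors
    yy, xx = np.divmod(np.arange(h * w), w)
    coords = np.column_stack([yy, xx]).astype(float)   # unit grid positions
    for e in range(epochs):
        lr = lr0 * (1 - e / epochs)
        sigma = sigma0 * (1 - e / epochs) + 0.5
        for x in rng.permutation(data):
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))
            g = np.exp(-((coords - coords[bmu]) ** 2).sum(axis=1)
                       / (2 * sigma ** 2))
            W += lr * g[:, None] * (x - W)
    return W
```

After training, similar inputs map to nearby units, which is what lets CWT sequences be grouped "by similarity" without any model assumption.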
Large Scale Crop Classification in Ukraine using Multi-temporal Landsat-8 Images with Missing Data
NASA Astrophysics Data System (ADS)
Kussul, N.; Skakun, S.; Shelestov, A.; Lavreniuk, M. S.
2014-12-01
At present, there are no globally available Earth observation (EO) derived products on crop maps. This issue is being addressed within the Sentinel-2 for Agriculture initiative, where a number of test sites (including from JECAM) participate to provide coherent protocols and best practices for various global agriculture systems, and subsequently crop maps from Sentinel-2. One of the problems in dealing with optical images for large territories (more than 10,000 sq. km) is the presence of clouds and shadows, which results in missing values in the data sets. In this abstract, a new approach to the classification of multi-temporal optical satellite imagery with missing data due to clouds and shadows is proposed. First, self-organizing Kohonen maps (SOMs) are used to restore missing pixel values in a time series of satellite imagery. SOMs are trained for each spectral band separately using non-missing values. Missing values are restored through a special procedure that substitutes an input sample's missing components with the neuron's weight coefficients. After missing data restoration, a supervised classification is performed for the multi-temporal satellite images. For this, an ensemble of neural networks, in particular multilayer perceptrons (MLPs), is proposed. Ensembling of neural networks is done by the technique of average committee, i.e. the class probabilities are averaged over classifiers and the class with the highest average posterior probability is selected for the given input sample. The proposed approach is applied for large scale crop classification using multi-temporal Landsat-8 images for the JECAM test site in Ukraine [1-2]. It is shown that the ensemble of MLPs provides better performance than a single neural network in terms of overall classification accuracy and kappa coefficient. The obtained classification map is also validated through estimated crop and forest areas and comparison to official statistics.
1. A.Yu. Shelestov et al., "Geospatial information system for agricultural monitoring," Cybernetics Syst. Anal., vol. 49, no. 1, pp. 124-132, 2013. 2. J. Gallego et al., "Efficiency Assessment of Different Approaches to Crop Classification Based on Satellite and Ground Observations," J. Autom. Inform. Sci., vol. 44, no. 5, pp. 67-80, 2012.
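The average-committee rule described above is a one-liner once each classifier emits per-class probabilities; a generic sketch (not the authors' MLP code) is:

```python
import numpy as np

def average_committee(prob_list):
    """Average class probabilities over an ensemble of classifiers and
    select, per sample, the class with the highest mean posterior.
    Each array in prob_list has shape (n_samples, n_classes)."""
    mean_probs = np.mean(prob_list, axis=0)
    return mean_probs.argmax(axis=1)
```

A confident minority vote can thus be outweighed by the averaged posterior, which is the intended smoothing effect of the committee.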
A novel and efficient technique for identification and classification of GPCRs.
Gupta, Ravi; Mittal, Ankush; Singh, Kuldip
2008-07-01
G-protein coupled receptors (GPCRs) play a vital role in different biological processes, such as regulation of growth, death, and metabolism of cells. GPCRs are the focus of a significant amount of current pharmaceutical research, since they interact with more than 50% of prescription drugs. The dipeptide-based support vector machine (SVM) approach is the most accurate technique for identifying and classifying GPCRs. However, this approach has two major disadvantages. First, the dimension of the dipeptide-based feature vector is 400; this large dimension makes the classification task inefficient in both computation and memory. Second, it does not consider the biological properties of the protein sequence for identification and classification of GPCRs. In this paper, we present a novel feature-based SVM classification technique. The novel features are derived by applying a wavelet-based time series analysis approach to protein sequences. The proposed feature space summarizes the variance information of seven important biological properties of amino acids in a protein sequence. In addition, the dimension of the feature vector for the proposed technique is 35. Experiments were performed on GPCR protein sequences available at the GPCR Database. Our approach achieves an accuracy of 99.9%, 98.06%, 97.78%, and 94.08% for GPCR superfamily, families, subfamilies, and sub-subfamilies (amine group), respectively, when evaluated using fivefold cross-validation. Further, an accuracy of 99.8%, 97.26%, and 97.84% was obtained when evaluated on unseen or recall datasets of GPCR superfamily, families, and subfamilies, respectively. Comparison with the dipeptide-based SVM technique shows the effectiveness of our approach.
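The idea of summarizing a property series by wavelet variances at several scales can be sketched with the Haar transform. This is a rough stand-in for the paper's 35-dimensional feature space (seven properties times several scales), not its actual feature definition.

```python
import numpy as np

def haar_variance_features(x, levels=3):
    """Variance of Haar detail coefficients at successive scales of a
    numeric property series (e.g. a hydrophobicity profile). Captures
    how the signal's fluctuations distribute across scales."""
    x = np.asarray(x, dtype=float)
    feats = []
    for _ in range(levels):
        if len(x) < 2:
            break
        x = x[:len(x) // 2 * 2]                    # even length
        detail = (x[0::2] - x[1::2]) / np.sqrt(2)  # high-pass (detail)
        x = (x[0::2] + x[1::2]) / np.sqrt(2)       # low-pass (approximation)
        feats.append(detail.var())
    return np.array(feats)
```

Applied to each of seven amino-acid property encodings, a few such per-scale statistics per property yield a compact fixed-length vector an SVM can consume.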
7. Cable Creek Bridge after completion. Zion National Park negative ...
7. Cable Creek Bridge after completion. Zion National Park negative number 1485, classification series 002, 12. - Floor of the Valley Road, Cable Creek Bridge, Spanning Cable Creek on Floor of Valley, Springdale, Washington County, UT
Pellegrini, Marco O. O.
2017-01-01
Abstract Throughout the years, three infrageneric classifications were proposed for Tradescantia, along with several informal groups and species complexes. The current infrageneric classification accepts 12 sections (with T. sect. Tradescantia further divided into four series) and assimilates many concepts adopted by previous authors. Recent molecular-based phylogenetic studies indicate that the currently accepted sections might not represent monophyletic groups within Tradescantia. Based on newly gathered morphological data on the group, complemented with available micromorphological, cytological and phytochemical data, I present the first morphology-based evolutionary hypothesis for Tradescantia. Furthermore, I reduce subtribe Thyrsantheminae to a synonym of subtribe Tradescantiinae, and propose a new infrageneric classification for Tradescantia based on the total evidence of the present morphological phylogeny, in accordance with the previously published molecular data. PMID:29118649
Machine learning in soil classification.
Bhattacharya, B; Solomatine, D P
2006-03-01
In a number of engineering problems, e.g. in geotechnics and petroleum engineering, intervals of measured series data (signals) are to be assigned a class while maintaining the constraint of contiguity, and standard classification methods can be inadequate. Classification in this case needs the involvement of an expert who observes the magnitude and trends of the signals in addition to any a priori information that might be available. In this paper, an approach for automating this classification procedure is presented. Firstly, a segmentation algorithm is developed and applied to segment the measured signals. Secondly, the salient features of these segments are extracted using the boundary energy method. Classifiers are then built, based on the measured data and extracted features, to assign classes to the segments; they employ Decision Trees, ANN and Support Vector Machines. The methodology was tested in classifying sub-surface soil using measured data from Cone Penetration Testing, and satisfactory results were obtained.
Sheffield, Kathryn; Morse-McNabb, Elizabeth; Clark, Rob; Robson, Susan; Lewis, Hayden
2015-01-01
There is a demand from multiple stakeholders for regularly updated, broad-scale, accurate land cover information in Victoria. This paper documents the methods used to generate an annual dominant land cover (DLC) map for Victoria, Australia from 2009 to 2013. Vegetation phenology parameters derived from an annual time series of the Moderate Resolution Imaging Spectroradiometer Vegetation Indices 16-day 250 m (MOD13Q1) product were used to generate annual DLC maps, using a three-tiered hierarchical classification scheme. Classification accuracy at the broadest (primary) class level was over 91% for all years, while it ranged from 72 to 81% at the secondary class level. The most detailed (tertiary) class level had accuracy levels ranging from 61 to 68%. The approach used was able to accommodate variable climatic conditions, which had substantial impacts on vegetation growth patterns and agricultural production across the state, both between regions and between years. The production of an annual dataset with complete spatial coverage for Victoria provides a reliable base data set with an accuracy that is fit for purpose for many applications. PMID:26602009
Information theory applications for biological sequence analysis.
Vinga, Susana
2014-05-01
Information theory (IT) addresses the analysis of communication systems and has been widely applied in molecular biology. In particular, alignment-free sequence analysis and comparison greatly benefited from concepts derived from IT, such as entropy and mutual information. This review covers several aspects of IT applications, ranging from genome global analysis and comparison, including block-entropy estimation and resolution-free metrics based on iterative maps, to local analysis, comprising the classification of motifs, prediction of transcription factor binding sites and sequence characterization based on linguistic complexity and entropic profiles. IT has also been applied to high-level correlations that combine DNA, RNA or protein features with sequence-independent properties, such as gene mapping and phenotype analysis, and has also provided models based on communication systems theory to describe information transmission channels at the cell level and also during evolutionary processes. While not exhaustive, this review attempts to categorize existing methods and to indicate their relation with broader transversal topics such as genomic signatures, data compression and complexity, time series analysis and phylogenetic classification, providing a resource for future developments in this promising area.
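The block-entropy estimation mentioned above can be illustrated with a minimal sketch; the example DNA string and k-mer sizes are invented for illustration.

```python
import math
from collections import Counter

def block_entropy(seq, k=1):
    """Shannon entropy (bits) of overlapping k-mers in a sequence."""
    blocks = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

dna = "ACGTACGTAACCGGTT"                      # hypothetical sequence
print(round(block_entropy(dna, k=1), 3))      # exactly 2.0 bits: uniform bases
print(round(block_entropy(dna, k=2), 3))      # dinucleotide (block) entropy
```

Comparing block entropies across k gives an alignment-free signature of a sequence, one of the simpler IT tools the review surveys.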
Sentiments Analysis of Reviews Based on ARCNN Model
NASA Astrophysics Data System (ADS)
Xu, Xiaoyu; Xu, Ming; Xu, Jian; Zheng, Ning; Yang, Tao
2017-10-01
The sentiment analysis of product reviews is designed to help customers understand the status of a product. Traditional methods of sentiment analysis rely on the input of a fixed-length feature vector, which is a performance bottleneck of the basic codec architecture. In this paper, we propose an attention mechanism with a BRNN-CNN model, referred to as the ARCNN model. To analyze the semantic relations between words and avoid the problem of dimension disaster, we use the GloVe algorithm to train the vector representations of words. Then, the ARCNN model is proposed to deal with the problem of deep feature training. Specifically, the BRNN handles non-fixed-length vectors and preserves time series information, while the CNN learns deeper semantic connections. Moreover, the attention mechanism can automatically learn from the data and optimize the allocation of weights. Finally, a softmax classifier is designed to complete the sentiment classification of reviews. Experiments show that the proposed method improves the accuracy of sentiment classification compared with benchmark methods.
NASA Astrophysics Data System (ADS)
Giraldo, Diana L.; García-Arteaga, Juan D.; Romero, Eduardo
2016-03-01
Initial diagnosis of Alzheimer's disease (AD) is based on the patient's clinical history and a battery of neuropsychological tests. This work presents an automatic strategy that uses structural Magnetic Resonance Imaging (MRI) to learn brain models for different stages of the disease using information from clinical assessments. A comparison of the discriminant power of the models in different anatomical areas is then made by using the brain regions of the models as a reference frame for the classification problem; a Receiver Operating Characteristic (ROC) curve is constructed from the projection onto the AD model. Validation was performed using a leave-one-out scheme with 86 subjects (20 AD and 60 NC) from the Open Access Series of Imaging Studies (OASIS) database. The region with the best classification performance was the left amygdala, where it is possible to achieve a sensitivity and specificity of 85% at the same time. The regions with the best performance, in terms of the AUC, are in strong agreement with those described as important for the diagnosis of AD in clinical practice.
Multidimensional assessment of acute confusion after traumatic brain injury.
Sherer, Mark; Nakase-Thompson, Risa; Yablon, Stuart A; Gontkovsky, Samuel T
2005-05-01
To describe the phenomenology of posttraumatic confusional state (PTCS) and to provide preliminary validation of a new procedure, the Confusion Assessment Protocol (CAP), for assessing PTCS. Criterion standard investigation. Inpatient traumatic brain injury (TBI) rehabilitation program. Two consecutive series of patients (n=62, n=93) with TBI admitted for inpatient rehabilitation. Not applicable. Clinical diagnosis of delirium based on Diagnostic and Statistical Manual of Mental Disorders, 4th Edition (DSM-IV) criteria, classification of posttraumatic amnesia (PTA) based on the Galveston Orientation and Amnesia Test (GOAT), and Disability Rating Scale score at time of rehabilitation hospital discharge. Agreement between the diagnosis of PTCS with the CAP and DSM-IV classification of delirium was 87%, and agreement between PTCS and PTA using GOAT criteria was 90%. Patients classified as in PTCS sustained more severe injuries and required longer rehabilitation stays. Confusion status was associated with poorer functional status at rehabilitation discharge. The CAP is a brief, structured, repeatable measure of multiple neurobehavioral aspects of PTCS. Confusion status as determined by CAP assessment contributed to prediction of outcome at rehabilitation discharge after adjustment for other potential predictors.
Highway Functional Classification Concepts, Criteria and Procedures.
DOT National Transportation Integrated Search
2013-01-01
This is the tenth in a series of combined documents prepared by the U.S. Department of Transportation (DOT) to satisfy requirements for reports to Congress on the condition, performance, and future capital investment needs of the Nation's highw...
46 CFR 298.11 - Vessel requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... with accepted commercial experience and practice. (g) Metric Usage. Our preferred system of measurement and weights for Vessels and Shipyard Projects is the metric system. ...), classification societies to be ISO 9000 series registered or Quality Systems Certificate Scheme qualified IACS...
46 CFR 298.11 - Vessel requirements.
Code of Federal Regulations, 2013 CFR
2013-10-01
... with accepted commercial experience and practice. (g) Metric Usage. Our preferred system of measurement and weights for Vessels and Shipyard Projects is the metric system. ...), classification societies to be ISO 9000 series registered or Quality Systems Certificate Scheme qualified IACS...
46 CFR 298.11 - Vessel requirements.
Code of Federal Regulations, 2010 CFR
2010-10-01
... with accepted commercial experience and practice. (g) Metric Usage. Our preferred system of measurement and weights for Vessels and Shipyard Projects is the metric system. ...), classification societies to be ISO 9000 series registered or Quality Systems Certificate Scheme qualified IACS...
46 CFR 298.11 - Vessel requirements.
Code of Federal Regulations, 2012 CFR
2012-10-01
... with accepted commercial experience and practice. (g) Metric Usage. Our preferred system of measurement and weights for Vessels and Shipyard Projects is the metric system. ...), classification societies to be ISO 9000 series registered or Quality Systems Certificate Scheme qualified IACS...
46 CFR 298.11 - Vessel requirements.
Code of Federal Regulations, 2014 CFR
2014-10-01
... with accepted commercial experience and practice. (g) Metric Usage. Our preferred system of measurement and weights for Vessels and Shipyard Projects is the metric system. ...), classification societies to be ISO 9000 series registered or Quality Systems Certificate Scheme qualified IACS...
Real-Time Classification of Exercise Exertion Levels Using Discriminant Analysis of HRV Data.
Jeong, In Cheol; Finkelstein, Joseph
2015-01-01
Heart rate variability (HRV) has been shown to reflect activation of the sympathetic nervous system; however, it is not clear which set of HRV parameters is optimal for real-time classification of exercise exertion levels. There are no studies that compare the potential of the two types of HRV parameters (time-domain and frequency-domain) in predicting exercise exertion level using discriminant analysis. The main goal of this study was to compare the potential of HRV time-domain parameters versus HRV frequency-domain parameters in classifying exercise exertion level. Rest, exercise, and recovery categories were used in the classification models. Overall 79.5% classification agreement by the time-domain parameters, as compared to overall 52.8% classification agreement by the frequency-domain parameters, demonstrated that the time-domain parameters had higher potential in classifying exercise exertion levels.
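A minimal sketch of exertion-level classification by linear discriminant analysis, assuming synthetic SDNN/RMSSD-like time-domain features; the feature values and class means below are invented, not the study's HRV data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical time-domain HRV features (SDNN, RMSSD in ms) per state.
def make_state(mean, n=50):
    return rng.normal(mean, [8.0, 6.0], size=(n, 2))

X = np.vstack([make_state([60, 45]),    # rest
               make_state([25, 15]),    # exercise
               make_state([40, 30])])   # recovery
y = np.repeat(["rest", "exercise", "recovery"], 50)

lda = LinearDiscriminantAnalysis().fit(X, y)
agreement = lda.score(X, y)             # resubstitution classification agreement
print(f"{agreement:.2f}")
```

The study's "classification agreement" figures correspond to this kind of score, computed there on real HRV recordings rather than synthetic draws.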
Cropland Area Extraction in China with Multi-Temporal MODIS Data
NASA Astrophysics Data System (ADS)
Bagan, H.; Baruah, P. J.; Wang, Q.; Yasuoka, Y.
2007-12-01
Extracting the area of cropland in China is very important for agricultural management, land degradation and ecosystem assessment. In this study we investigate the potential and the methodology for cropland area extraction using multi-temporal MODIS EVI data and some ancillary data. We used 16-day composite EVI time-series data for 2003 (6 March 2003 - 2 December 2003) with a spatial resolution of 500 m; the ancillary data included land-use GIS data, Landsat TM/ETM, ASTER data, and county-level cultivated land statistical data for the year 2000. The Self-Organizing Map (SOM) neural network classification algorithm was applied to the EVI data set. To focus on agriculture and desertification, we designed 9 land-cover types: 1) water, 2) woodland, 3) grassland, 4) dry cropland, 5) sandy, 6) paddy, 7) wetland, 8) urban/bare, and 9) snow/ice. The overall classification accuracy was 85% with a kappa coefficient of 0.84. The EVI data sets were sensitive and performed well in distinguishing the majority of land cover types. We also used the county-level cultivated land statistical data from the year 2000 to evaluate the accuracy of the agricultural area from the classification results, and found that the correlation coefficient was high in most counties. The results of this study show that the methodology used is, in general, feasible for cropland extraction in China. Keywords: MODIS, EVI, SOM, Cropland, land cover.
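A minimal numpy sketch of the SOM classification step, assuming two invented EVI-like annual profiles (crop and water); the grid size, learning schedule and profiles are illustrative, not the study's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, grid=(5, 5), iters=2000, lr0=0.5, sigma0=2.0):
    """Minimal SOM: each grid node holds a prototype EVI time-series."""
    h, w = grid
    nodes = rng.random((h * w, data.shape[1]))
    coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((nodes - x) ** 2).sum(axis=1))   # best matching unit
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)    # grid distance to BMU
        lr = lr0 * (1 - t / iters)                        # decaying learning rate
        sigma = sigma0 * (1 - t / iters) + 0.5            # shrinking neighborhood
        nbh = np.exp(-d2 / (2 * sigma ** 2))
        nodes += lr * nbh[:, None] * (x - nodes)          # pull toward sample
    return nodes

# Hypothetical 16-day EVI profiles (23 samples/year) for two cover types.
crop = 0.3 + 0.4 * np.exp(-((np.arange(23) - 12) ** 2) / 18)  # summer peak
water = np.full(23, 0.05)                                      # flat, low EVI
data = np.vstack([crop + 0.02 * rng.normal(size=(30, 23)),
                  water + 0.02 * rng.normal(size=(30, 23))])

nodes = train_som(data)
labels = np.argmin(((data[:, None, :] - nodes[None]) ** 2).sum(-1), axis=1)
print(sorted(set(labels[:30])), sorted(set(labels[30:])))  # BMUs: crop vs water
```

In the study, cluster nodes are then labeled into the 9 land-cover types using the ancillary data; here the two profile groups simply map to different prototypes.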
NASA Astrophysics Data System (ADS)
Scholkmann, Felix; Cifra, Michal; Alexandre Moraes, Thiago; de Mello Gallep, Cristiano
2011-12-01
The aim of the present study was to test whether the multifractal properties of ultra-weak photon emission (UPE) from germinating wheat seedlings (Triticum aestivum) change when the seedlings are treated with different concentrations of the toxin potassium dichromate (PD). To this end, UPE was measured (50 seedlings in one Petri dish, duration: approx. 16.6-28 h) from samples of three groups: (i) control (group C, N = 9), (ii) treated with 25 ppm of PD (group G25, N = 32), and (iii) treated with 150 ppm of PD (group G150, N = 23). For the multifractal analysis, the following steps were performed: (i) each UPE time series was trimmed to a final length of 1000 min; (ii) each UPE time series was filtered, linearly detrended and normalized; (iii) the multifractal spectrum (f(α)) was calculated for every UPE time series using the backward multifractal detrended moving average (MFDMA) method; (iv) each multifractal spectrum was characterized by calculating the mode (αmode) of the spectrum and the degree of multifractality (Δα); (v) for every UPE time series its mean, skewness and kurtosis were also calculated; finally, (vi) all obtained parameters were analyzed to determine their ability to differentiate between the three groups. This was based on Fisher's discriminant ratio (FDR), which was calculated for each parameter combination. Additionally, a non-parametric test was used to test whether the parameter values are significantly different or not. The analysis showed that when comparing all three groups, FDR had the highest values for the multifractal parameters (αmode, Δα). Furthermore, the differences in these parameters between the groups were statistically significant (p < 0.05). The classical parameters (mean, skewness and kurtosis) had lower FDR values than the multifractal parameters in all cases and showed no significant difference between the groups (except for the skewness between groups C and G150).
In conclusion, multifractal analysis enables changes in UPE time series to be detected even when they are hidden from normal linear signal analysis methods. The analysis of changes in the multifractal properties might form the basis of a classification system enabling the intoxication of cell cultures to be quantified based on UPE measurements.
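Fisher's discriminant ratio used above for the group comparison can be sketched for a single scalar parameter; the group values below are hypothetical, not the study's αmode or Δα measurements.

```python
import numpy as np

def fisher_discriminant_ratio(a, b):
    """FDR for one scalar parameter between two groups:
    squared mean difference over the sum of the group variances."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return (a.mean() - b.mean()) ** 2 / (a.var() + b.var())

# Hypothetical parameter values (e.g. a multifractal parameter) per group.
g_control = [0.61, 0.58, 0.64, 0.60, 0.62]
g_treated = [0.75, 0.73, 0.78, 0.74, 0.76]
fdr = fisher_discriminant_ratio(g_control, g_treated)
print(round(fdr, 2))   # large FDR -> parameter separates the groups well
```

A higher FDR for the multifractal parameters than for mean, skewness and kurtosis is exactly the comparison the study reports.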
NASA Astrophysics Data System (ADS)
Huesca, Margarita; Merino-de-Miguel, Silvia; Eklundh, Lars; Litago, Javier; Cicuéndez, Victor; Rodríguez-Rastrero, Manuel; Ustin, Susan L.; Palacios-Orueta, Alicia
2015-12-01
Remote sensing (RS) time series are an excellent operative source of information about the land surface across several scales and different levels of landscape heterogeneity. Ustin and Gamon (2010) proposed the new concept of "optical types" (OT), meaning "optically distinguishable functional types", as a way to better understand remote sensing signals related to the actual functional behavior of species that share common physiognomic forms but differ in functionality. Whereas the OT approach seems promising and consistent with ecological theory as a way to monitor vegetation from RS, it has received little implementation. This work presents a method for implementing the OT concept for efficient monitoring of ecosystems based on RS time series. We propose relying on an ecosystem's repetitive pattern in the temporal domain (self-similarity) to assess its dynamics. Based on this approach, our main hypothesis is that distinct dynamics are intrinsic to a specific OT. The level of self-similarity in the temporal domain within a broadleaf forest class was quantitatively assessed using the auto-correlation function (ACF) from statistical time series analysis. A vector comparison classification method, the spectral angle mapper, and principal component analysis were used to identify general patterns related to forest dynamics. Phenological metrics derived from MODIS NDVI time series using the TIMESAT software, together with information from the National Forest Map, were used to explain the different dynamics found. Results showed significant and highly stable self-similarity patterns in OTs that corresponded to forests under non-moisture-limited environments with an adaptation strategy based on a strong phenological synchrony with climate seasonality. These forests are characterized by dense closed-canopy deciduous forests associated with high productivity and low biodiversity in terms of dominant species.
Forests in transitional areas were associated with patterns of less temporal stability, probably due to mixtures of different adaptation strategies (i.e., deciduous, marcescent and evergreen species) and higher functional diversity related to climate variability over long and short terms. A less distinct seasonality, and even a double season, appears in the OT of the broadleaf Mediterranean forest, characterized by an open canopy dominated by evergreen-sclerophyllous formations. Within this forest, understory and overstory dynamics maximize functional diversity, resulting in contrasting traits adapted to summer drought, winter frosts, and high precipitation variability.
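The ACF-based self-similarity assessment can be sketched as follows, assuming a synthetic NDVI-like series with a 23-sample annual period (matching 16-day composites); the series is invented for illustration.

```python
import numpy as np

def acf(x, max_lag):
    """Sample autocorrelation function of a (detrended) time series."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom
                     for k in range(max_lag + 1)])

# Hypothetical NDVI series: six years at 23 samples/year plus small noise.
t = np.arange(23 * 6)
ndvi = (0.5 + 0.3 * np.sin(2 * np.pi * t / 23)
        + 0.02 * np.random.default_rng(1).normal(size=t.size))

r = acf(ndvi, max_lag=46)
print(round(r[23], 2))   # strong correlation at the annual lag -> self-similar
```

A stable, high ACF peak at the annual lag is the quantitative signature of the self-similar (phenologically synchronous) optical types described above.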
NASA Astrophysics Data System (ADS)
Verma, A. K.; Garg, P. K.; Prasad, K. S. H.; Dadhwal, V. K.
2016-12-01
Agriculture is a backbone of the Indian economy, providing livelihood to about 70% of the population. The primary objective of this research is to investigate the general applicability of time-series MODIS 250 m Normalized Difference Vegetation Index (NDVI) and Enhanced Vegetation Index (EVI) data for various land use/land cover (LULC) classifications. The other objective is the retrieval of crop biophysical parameters using MODIS 250 m resolution data. The Uttar Pradesh state of India was selected for this research work. A field study of 38 farms was conducted during the entire crop season of the year 2015 to evaluate the applicability of MODIS 8-day, 250 m resolution composite images for assessment of crop condition. A spectroradiometer was used for ground reflectance, and the AccuPAR LP-80 Ceptometer was used to measure the agricultural crops' Leaf Area Index (LAI). The AccuPAR measures Photosynthetically Active Radiation (PAR) and can invert these readings to give LAI for a plant canopy. Ground-based canopy reflectance and LAI were used to calibrate a radiative transfer model to create a look-up table (LUT) that was used to simulate LAI. The seasonal trend of MODIS-derived LAI was used to find crop parameters by adjusting the LAI simulated from a climate-based crop yield model. Cloud-free MODIS images of 250 m resolution (16-day composite period) were downloaded from the LP-DAAC website over a period of 12 months (Jan to Dec 2015). Both MODIS VI products were found to have sufficient spectral, spatial and temporal resolution to detect unique signatures for each class (water, fallow land, urban, dense vegetation, orchard, sugarcane and other crops). Ground truth data were collected using a JUNO GPS. Multi-temporal VI signatures for vegetation classes were consistent with their general phenological characteristics and were spectrally separable at some point during the growing season.
The MODIS NDVI and EVI multi-temporal images tracked similar seasonal responses for all croplands and were highly correlated across the growing season. The confusion matrix method is used for accuracy assessment, with reference data collected during the field visits. A total of 520 pixels were selected across the various classes to determine the accuracy. The classification accuracy and kappa coefficient are found to be 79.76% and 0.78, respectively.
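The overall accuracy and kappa coefficient reported above can be computed from a confusion matrix as sketched below; the matrix counts are hypothetical, not the study's 520-pixel assessment.

```python
import numpy as np

# Illustrative confusion matrix (rows: reference classes, cols: predicted).
cm = np.array([[50,  3,  2],
               [ 4, 45,  6],
               [ 1,  5, 44]])

n = cm.sum()
po = np.trace(cm) / n                                  # observed (overall) accuracy
pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2    # chance agreement
kappa = (po - pe) / (1 - pe)                           # Cohen's kappa
print(f"overall accuracy = {po:.3f}, kappa = {kappa:.3f}")
```

The kappa coefficient discounts the agreement expected by chance, which is why it is reported alongside overall accuracy in classification studies like this one.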
WFIRST: Microlensing Analysis Data Challenge
NASA Astrophysics Data System (ADS)
Street, Rachel; WFIRST Microlensing Science Investigation Team
2018-01-01
WFIRST will produce thousands of high cadence, high photometric precision lightcurves of microlensing events, from which a wealth of planetary and stellar systems will be discovered. However, the analysis of such lightcurves has historically been very time consuming and expensive in both labor and computing facilities. This poses a potential bottleneck to deriving the full science potential of the WFIRST mission. To address this problem, the WFIRST Microlensing Science Investigation Team is designing a series of data challenges to stimulate research addressing outstanding problems of microlensing analysis. These range from the classification and modeling of triple lens events to methods to efficiently yet thoroughly search a high-dimensional parameter space for the best fitting models.
Development of a new British Geological Survey (BGS) Map Series: Seabed Geomorphology
NASA Astrophysics Data System (ADS)
Dove, Dayton
2015-04-01
BGS scientists are developing a new offshore map series, Seabed Geomorphology (1:50k), to join the existing 1:250k 'Sea Bed Sediments', 'Quaternary Geology', and 'Solid Geology' map series. The increasing availability of extensive high-resolution swath bathymetry data (e.g. MCA's Civil Hydrography Programme) provides an unprecedented opportunity to characterize the processes which formed, and actively govern the physical seabed environment. Mapping seabed geomorphology is an effective means to describe individual, or groups of features whose form and other physical attributes (e.g. symmetry) may be used to distinguish feature origin. Swath bathymetry also provides added and renewed value to other data types (e.g. grab samples, legacy seismic data). In such cases the geomorphic evidence may be expanded to make inferences on the evolution of seabed features as well as their association with the underlying geology and other environmental variables/events over multiple timescales. Classifying seabed geomorphology is not particularly innovative or groundbreaking. Terrestrial geomorphology is of course a well established field of science, and within the marine environment for example, mapping submarine glacial landforms has probably become the most reliable method to reconstruct the extent and dynamics of past ice-sheets. What is novel here, and we believe useful/necessary for a survey organization, is to standardise the geomorphological classification scheme such that it is applicable to multiple and diverse environments. The classification scheme should be sufficiently detailed and interpretive to be informative, but not so detailed that we over-interpret or become mired in disputed feature designations or definitions. We plan to present the maps at 1:50k scale with the intention that these maps will be 'enabling' resources for research, educational, commercial, and policy purposes, much like the existing 1:250k map series. 
We welcome feedback on the structure and content of the proposed classification scheme, as well as the anticipated value to respective user communities.
Gallbladder perforation: case series and systematic review.
Date, Ravindra S; Thrumurthy, Sri G; Whiteside, Sigrid; Umer, Mohammed A; Pursnani, Kishore G; Ward, Jeremy B; Mughal, M Muntzer
2012-01-01
Gallbladder perforation is a serious complication of acute cholecystitis. Its management has evolved considerably since its classification by Niemeier in 1934. This review summarises the evidence surrounding the natural progression of this condition and potential problems with Niemeier's classification, and proposes a management algorithm for the more complex type II perforation. Data from a retrospective case series and a systematic review were combined. The case series included all patients with gallbladder perforations from 2004 to 2008 at a British teaching hospital. The systematic review searched for gallbladder perforation using the MEDLINE, Embase, Web of Science and Cochrane Library (2011 Issue 4) databases, as well as recent conference abstracts. The outcome data were analysed using SPSS version 15. No adjustments were made for multiple testing. 198 patients (including 19 patients from the present series) with a mean age of 62.1 ± 9.7 years and a male gender proportion of 55.4% (range 33.3-76.7%) were included. The most common gallbladder perforations were type II (median 46.2%, range 7.4-83.3%), followed by type I (median 40.6%, range 16.7-70.0%) and type III (median 10.1%, range 0-48.1%). Perforation was associated with cholelithiasis in 86.6% (range 78.9-90.6%) of patients, and the overall median mortality rate was 10.8% (range 0-12.5%). Male gender was weakly associated with mortality (p = 0.089) but age (p = 0.877) and cholelithiasis (p = 0.425) were not. Mortality did not vary significantly with perforation type. Gallbladder perforation should be reported according to the original Niemeier classification to avoid heterogeneity in data (e.g. varying rates of perforation types). The algorithm proposed in this study aims to guide the management of complex type II gallbladder perforations to minimise subsequent morbidity and mortality. Copyright © 2012 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
Mars, John L.; Garrity, Christopher P.; Houseknecht, David W.; Amoroso, Lee; Meares, Donald C.
2007-01-01
The northeastern part of the National Petroleum Reserve in Alaska (NPRA) has become an area of active petroleum exploration during the past five years. Recent leasing and exploration drilling in the NPRA requires the U.S. Bureau of Land Management (BLM) to manage and monitor a variety of surface activities that include seismic surveying, exploration drilling, oil-field development drilling, construction of oil-production facilities, and construction of pipelines and access roads. BLM evaluates a variety of permit applications, environmental impact studies, and other documents that require rapid compilation and analysis of data pertaining to surface and subsurface geology, hydrology, and biology. In addition, BLM must monitor these activities and assess their impacts on the natural environment. Timely and accurate completion of these land-management tasks requires elevation, hydrologic, geologic, petroleum-activity, and cadastral data, all integrated in digital formats at a higher resolution than is currently available in nondigital (paper) formats. To support these land-management tasks, a series of maps was generated from remotely sensed data in an area of high petroleum-industry activity (fig. 1). The maps cover an area from approximately latitude 70°00' N. to 70°30' N. and from longitude 151°00' W. to 153°10' W. The area includes the Alpine oil field in the east, the Husky Inigok exploration well (site of a landing strip) in the west, many of the exploration wells drilled in NPRA since 2000, and the route of a proposed pipeline to carry oil from discovery wells in NPRA to the Alpine oil field. This map area is referred to as the 'Fish Creek area' after a creek that flows through the region.
The map series includes (1) a color shaded-relief map based on 5-m-resolution data (sheet 1), (2) a surface-classification map based on 30-m-resolution data (sheet 2), and (3) a 5-m-resolution shaded relief-surface classification map that combines the shaded-relief and surface-classification data (sheet 3). Remote sensing datasets that were used to compile the maps include Landsat 7 Enhanced Thematic Mapper+ (ETM+), and interferometric synthetic aperture radar (IFSAR) data. In addition, a 1:250,000-scale geologic map of the Harrison Bay quadrangle, Alaska (Carter and Galloway, 1985, 2005) was used in conjunction with ETM+ and IFSAR data.
Dasgupta, Nilanjan; Carin, Lawrence
2005-04-01
Time-reversal imaging (TRI) is analogous to matched-field processing, although TRI is typically very wideband and is appropriate for subsequent target classification (in addition to localization). Time-reversal techniques, as applied to acoustic target classification, are highly sensitive to channel mismatch. Hence, it is crucial to estimate the channel parameters before time-reversal imaging is performed. The channel-parameter statistics are estimated here by applying a geoacoustic inversion technique based on Gibbs sampling. The maximum a posteriori (MAP) estimate of the channel parameters is then used to perform time-reversal imaging. Time-reversal implementation requires a fast forward model, implemented here in a normal-mode framework. In addition to imaging, extraction of features from the time-reversed images is explored, with these applied to subsequent target classification. The classification of time-reversed signatures is performed by the relevance vector machine (RVM). The efficacy of the technique is analyzed on simulated in-channel data generated by a free-field finite element method (FEM) code, in conjunction with a channel propagation model, wherein the final classification performance is demonstrated to be relatively insensitive to the associated channel parameters. The underlying theory of Gibbs sampling and TRI is presented along with the feature extraction and target classification via the RVM.
Precipitation Indices Low Countries
NASA Astrophysics Data System (ADS)
van Engelen, A. F. V.; Ynsen, F.; Buisman, J.; van der Schrier, G.
2009-09-01
Since 1995, KNMI has published a series of books(1) presenting an annual reconstruction of weather and climate in the Low Countries, covering the period AD 763-present, or roughly, the last millennium. The reconstructions are predominantly based on the interpretation of documentary sources and comparison with other proxies and instrumental observations. The series also comprises a number of classifications, amongst them annual classifications for winter and summer temperature and for winter and summer dryness-wetness. The classifications of temperature have been reworked into peer-reviewed(2) series (AD 1000-present) of seasonal temperatures and temperature indices, the so-called LCT (Low Countries Temperature) series, now incorporated in the Millennium databases. Recently we started a study to convert the dryness-wetness classifications into a series of precipitation indices, the so-called LCP (Low Countries Precipitation) series. A brief outline is given here of the applied methodology and preliminary results. The WMO definition for meteorological drought has been followed, namely that a period is called wet or dry when the amount of precipitation is considerably more or less than usual (normal). To gain a more quantitative insight, for four locations geographically spread over the Low Countries area (De Bilt, Vlissingen, Maastricht and Uccle) we analysed the statistics of daily precipitation series covering the period 1900-present. This brought us to the following definition, valid for the Low Countries: a period is considered as dry (resp. wet) if, over a continuous period of at least 60 days (~two months), at least two out of the four locations measured 50% less (resp. 50% more) than the normal amount for the location (based on the 1961-1990 normal period); for very dry (resp. very wet), the period must be at least 90 days (~three months).
This results in the following classification into five drought classes that could be applied to non-instrumental observations. Very wet period (+2): wide-scale river flooding, marshy acres and meadows; farmers cope with poor harvests of hay, grains, fruit etc., resulting in famines; late grape harvests, poor yield quantity and quality of wine. Wet period (+1): high water levels and discharges of major rivers, tributaries and brooks, local river floodings, marshy acres and meadows in the low-lying areas; wearisome and hampered agriculture. Normal (0). Dry period (-1): low water levels and discharges of major rivers, tributaries and brooks; some brooks may dry up; in the summer half year, local shortfall of yield of grass, hay and other forage, and moor, peat and forest fires. Very dry period (-2): very low water levels and discharges of major rivers and tributaries; brooks and wells dry up; serious shortage of drinking water, especially in summer; major agricultural damage, shortage of water, mortality of cattle stock; shortage of grain, as flour cannot be produced when water mills run out of water, leading to shortage of bread, bread riots and famines; large-scale forest and peat fires, resulting in serious air pollution; town fires. By verifying the historical evidence against these criteria, a series of 5-step indices ranging from very dry to very wet for the summer and winter half years of the Low Countries was obtained. Subsequently these index series were compared with the instrumentally observed seasonal precipitation sums for De Bilt (1735-2008), which is considered to be representative of the central Netherlands. For the winter (Oct-March) and summer half years (Apr-Sept) the accumulated precipitation amounts are calculated; these amounts are approximately normally distributed. Based on this distribution, the cumulative frequency distribution is calculated.
By tabulating the number of summers in the pre-instrumental period 1201-1750 in each of the drought classes, a distribution is calculated which is then related to the modern accumulated precipitation distribution. Assuming that the accumulated precipitation amount has never been below (above) the mean precipitation minus (plus) three standard deviations for the corresponding season, an accumulated precipitation amount corresponding to each of the five drought classes in the classification can be estimated.
(1) Buisman, J., Van Engelen, A.F.V. (editor), Duizend jaar weer, wind en water in de Lage Landen, Van Wijnen, Franeker (Netherlands), Vol. I, 763-1300, 1995; Vol. II, 1300-1450, 1996; Vol. III, 1450-1575, 1998; Vol. IV, 1575-1675, 2000; Vol. V, 1675-1750, 2006.
(2) Shabalova, M.V., Van Engelen, A.F.V., Evaluation of a reconstruction of winter and summer temperatures in the Low Countries, AD 764-1998, Climatic Change 58: 219-242, 2003.
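The calibration step described above (mapping historical class frequencies onto the fitted normal distribution of modern seasonal sums, truncated at the mean plus or minus three standard deviations) can be sketched as follows; the class counts and names here are illustrative assumptions, not values from the study.

```python
import numpy as np
from statistics import NormalDist

def class_boundaries(class_counts, mean, std):
    """Estimate precipitation boundaries between the five drought classes.

    class_counts: counts of historical seasons in classes -2..+2
                  (driest to wettest).
    mean, std:    parameters of the normal distribution fitted to the
                  modern accumulated seasonal precipitation sums.
    Returns the four boundary amounts separating the five classes,
    clipped to mean +/- 3 std as assumed in the text.
    """
    nd = NormalDist(mean, std)
    freqs = np.asarray(class_counts, dtype=float)
    cum = np.cumsum(freqs) / freqs.sum()
    # quantiles separating adjacent classes (drop the trailing 1.0)
    bounds = [nd.inv_cdf(p) for p in cum[:-1]]
    return [min(max(b, mean - 3 * std), mean + 3 * std) for b in bounds]
```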
[Classifications in forensic medicine and their logical basis].
Kovalev, A V; Shmarov, L A; Ten'kov, A A
2014-01-01
The objective of the present study was to characterize the main requirements for the correct construction of classifications used in forensic medicine, with special reference to the errors that occur in the relevant textbooks, guidelines, and manuals and the ways to avoid them. This publication continues the authors' series of thematic articles devoted to logical errors in expert conclusions. Further publications are in preparation to report the results of an in-depth analysis of the logical errors encountered in expert conclusions, textbooks, guidelines, and manuals.
NASA Technical Reports Server (NTRS)
Saatchi, Sassan; DeGrandi, Franco; Simard, Marc; Podest, Erika
1999-01-01
In this paper, a multiscale approach is introduced to classify the Japanese Earth Resources Satellite-1 (JERS-1) mosaic image over the Central African rainforest. A series of texture maps is generated from the 100 m mosaic image at various scales. Using a quadtree model and relating classes at each scale by a Markovian relationship, the multiscale images are classified from coarse to fine scale. The results are verified at various scales, and the evolution of the classification is monitored by calculating the error at each stage.
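A coarse-to-fine quadtree labelling in this spirit can be sketched as follows; the nearest-mean data term and the fixed parent-disagreement penalty are deliberate simplifications and not the authors' Markovian model, and all names are illustrative.

```python
import numpy as np

def coarse_to_fine_labels(feature_pyramid, class_means, parent_weight=0.3):
    """Label a quadtree pyramid from coarse to fine scale (sketch).

    feature_pyramid: list of 2-D feature arrays, coarsest first; each
                     level doubles the resolution of the previous one.
    class_means:     1-D array of per-class mean feature values.
    """
    labels = None
    for feats in feature_pyramid:
        # data term: squared distance of each pixel to each class mean
        cost = (feats[..., None] - class_means[None, None, :]) ** 2
        if labels is not None:
            # Markov term: penalise disagreement with the parent label,
            # replicated 2x2 down the quadtree
            parent = np.kron(labels, np.ones((2, 2), dtype=int))
            disagree = np.ones_like(cost)
            idx = np.indices(parent.shape)
            disagree[idx[0], idx[1], parent] = 0.0
            cost = cost + parent_weight * disagree
        labels = cost.argmin(axis=-1)
    return labels
```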
NASA Technical Reports Server (NTRS)
Coggeshall, M. E.; Hoffer, R. M.
1973-01-01
Remote sensing equipment and automatic data processing techniques were employed as aids in instituting improved forest resource management methods. On the basis of automatically calculated statistics derived from manually selected training samples, the LARSYS feature selection processor considered various groups of the four available spectral regions and selected a series of channel combinations. The automatic classification performance of these combinations (for six cover types, including both deciduous and coniferous forest) was tested, analyzed, and further compared with automatic classification results obtained from digitized color infrared photography.
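The channel-combination search can be illustrated with a small sketch. LARSYS itself ranked channel subsets using statistical separability measures between class signatures; this stand-in instead scores each subset by nearest-class-mean accuracy on the training samples, and all names are illustrative.

```python
import numpy as np
from itertools import combinations

def best_channel_combo(X, y, n_channels):
    """Exhaustively score channel subsets of size n_channels (sketch).

    X: (n_samples, n_total_channels) training spectra.
    y: (n_samples,) integer class labels.
    Returns the best (channel_tuple, accuracy) pair.
    """
    classes = np.unique(y)
    best = (None, -1.0)
    for combo in combinations(range(X.shape[1]), n_channels):
        Xs = X[:, combo]
        # per-class mean signatures over the selected channels
        means = np.stack([Xs[y == c].mean(axis=0) for c in classes])
        # squared distance of every sample to every class mean
        d = ((Xs[:, None, :] - means[None, :, :]) ** 2).sum(-1)
        acc = float((classes[d.argmin(axis=1)] == y).mean())
        if acc > best[1]:
            best = (combo, acc)
    return best
```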
Code of Federal Regulations, 2010 CFR
2010-07-01
... FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.2-Cataloging... followed by all cataloging activities participating in the Federal Catalog System. (1) Federal Catalog... of a uniform catalog system. (3) Federal Supply Classification (Cataloging Publication H2 Series...