NASA Astrophysics Data System (ADS)
Creusen, I. M.; Hazelhoff, L.; De With, P. H. N.
2013-10-01
In large-scale automatic traffic sign surveying systems, the primary computational effort is concentrated at the traffic sign detection stage. This paper focuses on reducing the computational load of the sliding-window object detection algorithm employed for traffic sign detection. Sliding-window object detectors often use a linear SVM to classify the features in a window. In this case, the classification can be seen as a convolution of the feature maps with the SVM kernel. It is well known that convolution can be implemented efficiently in the frequency domain for kernels larger than a certain size. We show that by carefully reordering the sliding-window operations, most of the frequency-domain transformations can be eliminated, leading to a substantial increase in efficiency. Additionally, we suggest using the overlap-add method to keep memory use within reasonable bounds. This allows us to keep all the transformed kernels in memory, thereby eliminating even more domain transformations, and allows all scales in a multiscale pyramid to be processed using the same set of transformed kernels. For a typical sliding-window implementation, we have found that the detector execution performance improves by a factor of 5.3. As a bonus, many of the detector improvements from the literature, e.g., chi-squared kernel approximations and sub-class splitting algorithms, can be applied more easily at a lower performance penalty because of the improved scalability.
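The core idea can be illustrated with a short sketch: per-channel frequency-domain (overlap-add) convolution of the feature maps with the linear-SVM weights yields all window scores in one pass. This is only a minimal illustration, not the authors' implementation; the channel count, kernel size, and bias are hypothetical, and SciPy's oaconvolve stands in for a hand-rolled overlap-add scheme with cached kernel transforms.

```python
# Sketch: sliding-window linear-SVM scoring as per-channel convolution in the
# frequency domain (overlap-add). Illustrative only; not the paper's implementation.
import numpy as np
from scipy.signal import oaconvolve  # overlap-add FFT convolution

def svm_score_map(feature_maps, svm_weights, bias=0.0):
    """feature_maps: (C, H, W) pyramid level; svm_weights: (C, h, w) linear SVM."""
    score = np.full((feature_maps.shape[1] - svm_weights.shape[1] + 1,
                     feature_maps.shape[2] - svm_weights.shape[2] + 1), bias)
    for chan, w in zip(feature_maps, svm_weights):
        # Cross-correlation = convolution with a flipped kernel.
        score += oaconvolve(chan, w[::-1, ::-1], mode='valid')
    return score  # high values ~ likely detections at those window positions

# Hypothetical usage with random data:
rng = np.random.default_rng(0)
feats = rng.standard_normal((31, 120, 160))   # e.g. HOG-like channels
weights = rng.standard_normal((31, 8, 8))     # trained linear SVM reshaped per channel
scores = svm_score_map(feats, weights, bias=-1.0)
detections = np.argwhere(scores > 0)          # candidate window positions
```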
An efficient pseudomedian filter for tiling microarrays.
Royce, Thomas E; Carriero, Nicholas J; Gerstein, Mark B
2007-06-07
Tiling microarrays are becoming an essential technology in the functional genomics toolbox. They have been applied to the tasks of novel transcript identification, elucidation of transcription factor binding sites, detection of methylated DNA and several other applications in several model organisms. These experiments are being conducted at increasingly finer resolutions as the microarray technology enjoys increasingly greater feature densities. The increased densities naturally lead to increased data analysis requirements. Specifically, the most widely employed algorithm for tiling array analysis involves smoothing observed signals by computing pseudomedians within sliding windows, an O(n² log n) calculation in each window. This poor time complexity is an issue for tiling array analysis and could prove to be a real bottleneck as tiling microarray experiments become grander in scope and finer in resolution. We therefore implemented Monahan's HLQEST algorithm that reduces the runtime complexity for computing the pseudomedian of n numbers from O(n² log n) to O(n log n). For a representative tiling microarray dataset, this modification reduced the smoothing procedure's runtime by nearly 90%. We then leveraged the fact that elements within sliding windows remain largely unchanged in overlapping windows (as one slides across genomic space) to further reduce computation by an additional 43%. This was achieved by the application of skip lists to maintaining a sorted list of values from window to window. This sorted list could be maintained with simple O(log n) inserts and deletes. We illustrate the favorable scaling properties of our algorithms with both time complexity analysis and benchmarking on synthetic datasets. Tiling microarray analyses that rely upon a sliding window pseudomedian calculation can require many hours of computation. We have eased this requirement significantly by implementing efficient algorithms that scale well with genomic feature density. This result not only speeds the current standard analyses, but also makes possible ones where many iterations of the filter may be required, such as might be required in a bootstrap or parameter estimation setting. Source code and executables are available at http://tiling.gersteinlab.org/pseudomedian/.
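For readers unfamiliar with the statistic, the pseudomedian (Hodges-Lehmann estimator) is the median of all pairwise Walsh averages. The sketch below computes it naively inside a sliding window; this naive inner step is exactly the O(n² log n) cost that HLQEST and the paper's skip-list window maintenance avoid, so the code is illustrative rather than the authors' implementation.

```python
# Toy sketch of sliding-window pseudomedian smoothing. The pseudomedian here is
# computed naively; Monahan's HLQEST and the paper's skip-list window
# maintenance are optimizations of this same quantity.
import numpy as np
from itertools import combinations

def pseudomedian(values):
    """Hodges-Lehmann estimator: median of all pairwise (Walsh) averages,
    including each value paired with itself."""
    walsh = [(a + b) / 2.0 for a, b in combinations(values, 2)] + list(values)
    return float(np.median(walsh))

def sliding_pseudomedian(signal, half_window):
    """Smooth a probe-intensity track with a centered sliding window."""
    smoothed = np.empty(len(signal), dtype=float)
    for i in range(len(signal)):
        lo, hi = max(0, i - half_window), min(len(signal), i + half_window + 1)
        smoothed[i] = pseudomedian(signal[lo:hi])
    return smoothed

# Hypothetical usage on a synthetic tiling-array signal:
rng = np.random.default_rng(1)
signal = rng.normal(size=500) + (np.arange(500) > 250) * 2.0  # step at probe 250
smooth = sliding_pseudomedian(signal, half_window=25)
```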
NASA Astrophysics Data System (ADS)
Işık, Şahin; Özkan, Kemal; Günal, Serkan; Gerek, Ömer Nezih
2018-03-01
Change detection with background subtraction remains an unresolved issue and attracts research interest due to the challenges encountered in static and dynamic scenes. The key challenge is how to update dynamically changing backgrounds from frames with an adaptive and self-regulated feedback mechanism. To achieve this, we present an effective change detection algorithm for pixelwise changes. A sliding window approach combined with dynamic control of the update parameters is introduced for updating background frames, which we call sliding window-based change detection. Comprehensive experiments on related test videos show that the integrated algorithm yields good objective and subjective performance by overcoming illumination variations, camera jitter, and intermittent object motion. We argue that the resulting method is a fair alternative in most types of foreground extraction scenarios, unlike case-specific methods, which normally fail in scenarios they were not designed for.
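A generic sketch of the sliding-window idea is given below: the background is estimated from the last W frames and foreground pixels are found by thresholding. The adaptive, self-regulated parameter update of the paper is not reproduced; the window size and threshold are hypothetical.

```python
# Generic sliding-window background subtraction sketch: the background model is
# the per-pixel median of the last W frames. The paper's feedback-controlled
# update parameters are not reproduced here.
import numpy as np
from collections import deque

class SlidingWindowBackground:
    def __init__(self, window_size=30, threshold=25):
        self.frames = deque(maxlen=window_size)  # sliding window of recent frames
        self.threshold = threshold

    def apply(self, frame):
        self.frames.append(frame.astype(np.float32))
        background = np.median(np.stack(list(self.frames)), axis=0)
        foreground = np.abs(frame.astype(np.float32) - background) > self.threshold
        return foreground.astype(np.uint8) * 255, background

# Hypothetical usage: feed grayscale frames one by one.
detector = SlidingWindowBackground(window_size=30, threshold=25)
# mask, bg = detector.apply(gray_frame)
```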
A parameter estimation algorithm for spatial sine testing - Theory and evaluation
NASA Technical Reports Server (NTRS)
Rost, R. W.; Deblauwe, F.
1992-01-01
This paper presents the theory and an evaluation of a spatial sine testing parameter estimation algorithm that directly uses the measured forced mode of vibration and the measured force vector. The parameter estimation algorithm uses an ARMA model, and a recursive QR algorithm is applied for data reduction. In this first evaluation, the algorithm has been applied to a frequency response matrix (which is a particular set of forced modes of vibration) using a sliding frequency window. The objective of the sliding frequency window is to execute the analysis simultaneously with the data acquisition. Since the pole values and the modal density are obtained from this analysis during the acquisition, the analysis information can be used to help determine the forcing vectors during the experimental data acquisition.
Adaptive early detection ML/PDA estimator for LO targets with EO sensors
NASA Astrophysics Data System (ADS)
Chummun, Muhammad R.; Kirubarajan, Thiagalingam; Bar-Shalom, Yaakov
2000-07-01
The batch Maximum Likelihood estimator combined with Probabilistic Data Association (ML-PDA) has been shown to be effective in acquiring low observable (LO), i.e., low SNR, non-maneuvering targets in the presence of heavy clutter. This paper applies the ML-PDA estimator with signal strength or amplitude information (AI) in a sliding-window fashion to detect high-speed targets in heavy clutter using electro-optical (EO) sensors. The initial time and the length of the sliding window are adjusted adaptively according to the information content of the received measurements. A track validation scheme via hypothesis testing is developed to confirm the estimated track, that is, the presence of a target, in each window. The sliding-window ML-PDA approach, together with track validation, enables early detection by rejecting noninformative scans, target reacquisition in case of temporary target disappearance, and the handling of targets with speeds that evolve over time. The proposed algorithm is shown to detect a target hidden in as many as 600 false alarms per scan 10 frames earlier than the Multiple Hypothesis Tracking (MHT) algorithm.
[A fast iterative algorithm for adaptive histogram equalization].
Cao, X; Liu, X; Deng, Z; Jiang, D; Zheng, C
1997-01-01
In this paper, we propose an iterative algorithm called FAHE, which is based on the relationship between the current local histogram and the histogram before the sliding window moves. Compared with basic AHE, the computing time of FAHE is decreased from 5 hours to 4 minutes on a 486dx/33 compatible computer when using a 65 x 65 sliding window for a 512 x 512 image with an 8-bit gray-level range.
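The saving that FAHE exploits can be sketched as follows: when the window slides one pixel, only the leaving and entering columns change the local histogram, so each step costs O(window height) instead of O(window area). The sketch assumes an 8-bit integer image and is not the original FAHE code.

```python
# Sketch of the incremental-histogram idea behind FAHE: when a (2r+1)x(2r+1)
# window slides one pixel to the right, one column leaves and one enters,
# so the local histogram is updated rather than recomputed from scratch.
import numpy as np

def local_histograms_row(image, row, r, n_levels=256):
    """Yield (col, histogram) for every window center along one image row.
    Assumes an integer-valued image with values < n_levels."""
    padded = np.pad(image, r, mode='reflect')
    y = row + r
    hist = np.zeros(n_levels, dtype=np.int32)
    # Full histogram only for the first window position...
    for v in padded[y - r:y + r + 1, 0:2 * r + 1].ravel():
        hist[v] += 1
    yield 0, hist.copy()
    # ...then O(window height) updates per step instead of O(window area).
    for col in range(1, image.shape[1]):
        x = col + r
        for v in padded[y - r:y + r + 1, x - r - 1]:   # column leaving the window
            hist[v] -= 1
        for v in padded[y - r:y + r + 1, x + r]:       # column entering the window
            hist[v] += 1
        yield col, hist.copy()
```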
Finding Frequent Closed Itemsets in Sliding Window in Linear Time
NASA Astrophysics Data System (ADS)
Chen, Junbo; Zhou, Bo; Chen, Lu; Wang, Xinyu; Ding, Yiqun
One of the most well-studied problems in data mining is computing the collection of frequent itemsets in large transactional databases. Since the introduction of the famous Apriori algorithm [14], many others have been proposed to find the frequent itemsets. Among such algorithms, the approach of mining closed itemsets has raised much interest in the data mining community. The algorithms taking this approach include TITANIC [8], CLOSET+ [6], DCI-Closed [4], FCI-Stream [3], GC-Tree [15], TGC-Tree [16], etc. Among these algorithms, FCI-Stream, GC-Tree and TGC-Tree are online algorithms that work in sliding-window environments. By the performance evaluation in [16], GC-Tree [15] is the fastest one. In this paper, an improved algorithm based on GC-Tree is proposed, whose computational complexity is proved to be a linear combination of the average transaction size and the average closed itemset size. The algorithm is based on the essential theorem presented in Sect. 4.2. Empirically, the new algorithm is several orders of magnitude faster than the state-of-the-art algorithm, GC-Tree.
Penalized maximum likelihood reconstruction for x-ray differential phase-contrast tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brendel, Bernhard, E-mail: bernhard.brendel@philips.com; Teuffenbach, Maximilian von; Noël, Peter B.
2016-01-15
Purpose: The purpose of this work is to propose a cost function with regularization to iteratively reconstruct attenuation, phase, and scatter images simultaneously from differential phase contrast (DPC) acquisitions, without the need of phase retrieval, and examine its properties. Furthermore this reconstruction method is applied to an acquisition pattern that is suitable for a DPC tomographic system with continuously rotating gantry (sliding window acquisition), overcoming the severe smearing in noniterative reconstruction. Methods: We derive a penalized maximum likelihood reconstruction algorithm to directly reconstruct attenuation, phase, and scatter image from the measured detector values of a DPC acquisition. The proposed penalty comprises, for each of the three images, an independent smoothing prior. Image quality of the proposed reconstruction is compared to images generated with FBP and iterative reconstruction after phase retrieval. Furthermore, the influence between the priors is analyzed. Finally, the proposed reconstruction algorithm is applied to experimental sliding window data acquired at a synchrotron and results are compared to reconstructions based on phase retrieval. Results: The results show that the proposed algorithm significantly increases image quality in comparison to reconstructions based on phase retrieval. No significant mutual influence between the proposed independent priors could be observed. Further it could be illustrated that the iterative reconstruction of a sliding window acquisition results in images with substantially reduced smearing artifacts. Conclusions: Although the proposed cost function is inherently nonconvex, it can be used to reconstruct images with less aliasing artifacts and less streak artifacts than reconstruction methods based on phase retrieval. Furthermore, the proposed method can be used to reconstruct images of sliding window acquisitions with negligible smearing artifacts.
NASA Astrophysics Data System (ADS)
Wang, Xiaohua; Rong, Mingzhe; Qiu, Juan; Liu, Dingxin; Su, Biao; Wu, Yi
A new type of algorithm for predicting the mechanical faults of a vacuum circuit breaker (VCB) based on an artificial neural network (ANN) is proposed in this paper. There are two types of mechanical faults in a VCB: operation mechanism faults and tripping circuit faults. An angle displacement sensor is used to measure the main axle angle displacement, which reflects the displacement of the moving contact, in order to obtain the state of the operation mechanism in the VCB, while a Hall current sensor is used to measure the trip coil current, which reflects the operation state of the tripping circuit. Then an ANN prediction algorithm based on a sliding time window is proposed and successfully used to predict mechanical faults in a VCB. The research results in this paper provide a theoretical basis for the realization of online monitoring and fault diagnosis of a VCB.
PPP Sliding Window Algorithm and Its Application in Deformation Monitoring.
Song, Weiwei; Zhang, Rui; Yao, Yibin; Liu, Yanyan; Hu, Yuming
2016-05-31
Compared with the double-difference relative positioning method, the precise point positioning (PPP) algorithm can avoid the selection of a static reference station, directly measure the three-dimensional position changes at the observation site, and exhibit superiority in a variety of deformation monitoring applications. However, because of the influence of various observation errors, the accuracy of PPP is generally at the cm-dm level, which cannot meet the requirements of high-precision deformation monitoring. For most monitoring applications, the observation stations remain stationary, which can be provided as a priori constraint information. In this paper, a new PPP algorithm based on a sliding window is proposed to improve the positioning accuracy. Firstly, data from an IGS tracking station were processed using both the traditional and the new PPP algorithm; the results showed that the new algorithm can effectively improve positioning accuracy, especially in the elevation direction. Then, an earthquake simulation platform was used to simulate an earthquake event; the results illustrated that the new algorithm can effectively detect the vibration changes of a reference station during an earthquake. Finally, experimental results from the observed Wenchuan earthquake showed that the new algorithm is feasible for monitoring real earthquakes and providing early-warning alerts.
Fast object detection algorithm based on HOG and CNN
NASA Astrophysics Data System (ADS)
Lu, Tongwei; Wang, Dandan; Zhang, Yanduo
2018-04-01
In the field of computer vision, object classification and object detection are widely used in many fields. Traditional object detection has two main problems: the sliding-window region selection strategy has high time complexity and produces redundant windows, and the features are not sufficiently robust. In order to solve these problems, a Region Proposal Network (RPN) is used to select candidate regions instead of the selective search algorithm. Compared with traditional algorithms and selective search algorithms, RPN has higher efficiency and accuracy. We combine HOG features and a convolutional neural network (CNN) to extract features, and we use an SVM to classify. For TorontoNet, our algorithm's mAP is 1.6 percentage points higher. For OxfordNet, our algorithm's mAP is 1.3 percentage points higher.
Analysis of Rhythms in Experimental Signals
NASA Astrophysics Data System (ADS)
Desherevskii, A. V.; Zhuravlev, V. I.; Nikolsky, A. N.; Sidorin, A. Ya.
2017-12-01
We compare algorithms designed to extract quasiperiodic components of a signal and estimate the amplitude, phase, stability, and other characteristics of a rhythm in a sliding window in the presence of data gaps. Each algorithm relies on its own rhythm model; therefore, it is necessary to use different algorithms depending on the research objectives. The described set of algorithms and methods is implemented in the WinABD software package, which includes a time-series database management system, a powerful research complex, and an interactive data-visualization environment.
Prediction of CpG-island function: CpG clustering vs. sliding-window methods
2010-01-01
Background Unmethylated stretches of CpG dinucleotides (CpG islands) are an outstanding property of mammalian genomes. Conventionally, these regions are detected by sliding window approaches using %G + C, CpG observed/expected ratio and length thresholds as main parameters. Recently, clustering methods directly detect clusters of CpG dinucleotides as a statistical property of the genome sequence. Results We compare sliding-window to clustering (i.e. CpGcluster) predictions by applying new ways to detect putative functionality of CpG islands. Analyzing the co-localization with several genomic regions as a function of window size vs. statistical significance (p-value), CpGcluster shows a higher overlap with promoter regions and highly conserved elements, at the same time showing less overlap with Alu retrotransposons. The major difference in the prediction was found for short islands (CpG islets), often exclusively predicted by CpGcluster. Many of these islets seem to be functional, as they are unmethylated, highly conserved and/or located within the promoter region. Finally, we show that window-based islands can spuriously overlap several, differentially regulated promoters as well as different methylation domains, which might indicate a wrong merge of several CpG islands into a single, very long island. The shorter CpGcluster islands seem to be much more specific regarding the overlap with alternative transcription start sites or the detection of homogeneous methylation domains. Conclusions The main difference between sliding-window approaches and clustering methods is the length of the predicted islands. Short islands, often differentially methylated, are almost exclusively predicted by CpGcluster. This suggests that CpGcluster may be the algorithm of choice to explore the function of these short, but putatively functional CpG islands. PMID:20500903
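For context, the sliding-window side of this comparison can be sketched with the conventional criteria (window GC content and CpG observed/expected ratio above fixed thresholds); the 200 bp window and thresholds below are the customary values, not parameters taken from this paper.

```python
# Minimal sliding-window CpG-island scanner in the spirit of the classical
# criteria (window GC content >= 0.5 and CpG observed/expected >= 0.6);
# thresholds and window size are conventional values, not the paper's.
def window_stats(seq):
    n = len(seq)
    g, c = seq.count('G'), seq.count('C')
    cpg = seq.count('CG')
    gc_content = (g + c) / n
    expected = (g * c) / n if g and c else 0.0
    obs_exp = cpg / expected if expected else 0.0
    return gc_content, obs_exp

def cpg_island_windows(sequence, window=200, step=1,
                       min_gc=0.5, min_obs_exp=0.6):
    """Return start positions of windows satisfying both thresholds;
    overlapping hits would normally be merged into islands afterwards."""
    sequence = sequence.upper()
    hits = []
    for start in range(0, len(sequence) - window + 1, step):
        gc, oe = window_stats(sequence[start:start + window])
        if gc >= min_gc and oe >= min_obs_exp:
            hits.append(start)
    return hits
```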
Compositional searching of CpG islands in the human genome
NASA Astrophysics Data System (ADS)
Luque-Escamilla, Pedro Luis; Martínez-Aroza, José; Oliver, José L.; Gómez-Lopera, Juan Francisco; Román-Roldán, Ramón
2005-06-01
We report on an entropic edge detector based on the local calculation of the Jensen-Shannon divergence with application to the search for CpG islands. CpG islands are pieces of the genome related to gene expression and cell differentiation, and thus to cancer formation. Searching for these CpG islands is a major task in genetics and bioinformatics. Some algorithms have been proposed in the literature, based on moving statistics in a sliding window, but the window size may greatly influence the results. The local use of Jensen-Shannon divergence is a completely different strategy: the nucleotide composition inside the islands is different from that in their environment, so a statistical distance—the Jensen-Shannon divergence—between the composition of two adjacent windows may be used as a measure of their dissimilarity. Sliding this double window over the entire sequence allows us to segment it compositionally. The fusion of those segments into larger ones that satisfy certain identification criteria must be achieved in order to obtain the definitive results. We find that the local use of Jensen-Shannon divergence is very suitable in processing DNA sequences for searching for compositionally different structures such as CpG islands, as compared to other algorithms in the literature.
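A minimal sketch of the double-window strategy: nucleotide compositions are computed in two adjacent windows and their Jensen-Shannon divergence is used as a dissimilarity score at each position. The window size is hypothetical and the segment-fusion step is omitted.

```python
# Sketch of the double-window idea: slide two adjacent windows along the
# sequence and use the Jensen-Shannon divergence between their nucleotide
# compositions as a boundary/dissimilarity score.
import numpy as np

def composition(seq, alphabet='ACGT'):
    counts = np.array([seq.count(b) for b in alphabet], dtype=float)
    return counts / counts.sum()

def shannon_entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def js_divergence(p, q):
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

def boundary_profile(sequence, half_window=500):
    """JS divergence between the left and right windows at each position."""
    sequence = sequence.upper()
    scores = []
    for pos in range(half_window, len(sequence) - half_window):
        left = composition(sequence[pos - half_window:pos])
        right = composition(sequence[pos:pos + half_window])
        scores.append(js_divergence(left, right))
    return np.array(scores)  # peaks suggest compositional boundaries
```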
78 FR 40057 - Airworthiness Directives; Airbus Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-03
... A321 series airplanes. This proposed AD was prompted by reports of certain sliding windows that were... numbers of sliding windows and sliding window seals, and modification if necessary. This proposed AD also... could lead to the functional loss of the sliding window as an exit, possibly preventing the flightcrew...
Absolute phase estimation: adaptive local denoising and global unwrapping.
Bioucas-Dias, Jose; Katkovnik, Vladimir; Astola, Jaakko; Egiazarian, Karen
2008-10-10
The paper attacks absolute phase estimation with a two-step approach: the first step applies an adaptive local denoising scheme to the modulo-2 pi noisy phase; the second step applies a robust phase unwrapping algorithm to the denoised modulo-2 pi phase obtained in the first step. The adaptive local modulo-2 pi phase denoising is a new algorithm based on local polynomial approximations. The zero-order and the first-order approximations of the phase are calculated in sliding windows of varying size. The zero-order approximation is used for pointwise adaptive window size selection, whereas the first-order approximation is used to filter the phase in the obtained windows. For phase unwrapping, we apply the recently introduced robust (in the sense of discontinuity preserving) PUMA unwrapping algorithm [IEEE Trans. Image Process.16, 698 (2007)] to the denoised wrapped phase. Simulations give evidence that the proposed algorithm yields state-of-the-art performance, enabling strong noise attenuation while preserving image details. (c) 2008 Optical Society of America
Chow, James C.L.; Grigorov, Grigor N.; Yazdani, Nuri
2006-01-01
A custom‐made computer program, SWIMRT, to construct “multileaf collimator (MLC) machine” file for intensity‐modulated radiotherapy (IMRT) fluence maps was developed using MATLAB® and the sliding window algorithm. The user can either import a fluence map with a graphical file format created by an external treatment‐planning system such as Pinnacle3 or create his or her own fluence map using the matrix editor in the program. Through comprehensive calibrations of the dose and the dimension of the imported fluence field, the user can use associated image‐processing tools such as field resizing and edge trimming to modify the imported map. When the processed fluence map is suitable, a “MLC machine” file is generated for our Varian 21 EX linear accelerator with a 120‐leaf Millennium MLC. This machine file is transferred to the MLC console of the LINAC to control the continuous motions of the leaves during beam irradiation. An IMRT field is then irradiated with the 2D intensity profiles, and the irradiated profiles are compared to the imported or modified fluence map. This program was verified and tested using film dosimetry to address the following uncertainties: (1) the mechanical limitation due to the leaf width and maximum traveling speed, and (2) the dosimetric limitation due to the leaf leakage/transmission and penumbra effect. Because the fluence map can be edited, resized, and processed according to the requirement of a study, SWIMRT is essential in studying and investigating the IMRT technique using the sliding window algorithm. Using this program, future work on the algorithm may include redistributing the time space between segmental fields to enhance the fluence resolution, and readjusting the timing of each leaf during delivery to avoid small fields. Possible clinical utilities and examples for SWIMRT are given in this paper. PACS numbers: 87.53.Kn, 87.53.St, 87.53.Uv PMID:17533330
Fuzzy CMAC With incremental Bayesian Ying-Yang learning and dynamic rule construction.
Nguyen, M N
2010-04-01
Inspired by the philosophy of ancient Chinese Taoism, Xu's Bayesian ying-yang (BYY) learning technique performs clustering by harmonizing the training data (yang) with the solution (ying). In our previous work, the BYY learning technique was applied to a fuzzy cerebellar model articulation controller (FCMAC) to find the optimal fuzzy sets; however, this is not suitable for time series data analysis. To address this problem, we propose an incremental BYY learning technique in this paper, with the idea of sliding window and rule structure dynamic algorithms. Three contributions are made as a result of this research. First, an online expectation-maximization algorithm incorporated with the sliding window is proposed for the fuzzification phase. Second, the memory requirement is greatly reduced since the entire data set no longer needs to be obtained during the prediction process. Third, the rule structure dynamic algorithm with dynamically initializing, recruiting, and pruning rules relieves the "curse of dimensionality" problem that is inherent in the FCMAC. Because of these features, the experimental results of the benchmark data sets of currency exchange rates and Mackey-Glass show that the proposed model is more suitable for real-time streaming data analysis.
Process Flow Features as a Host-Based Event Knowledge Representation
2012-06-14
an executing process during a window of time called a process flow. Process flows are calculated from key process data structures extracted from... Accompanying front-matter entries list Davies-Bouldin and Dunn index evaluations for sliding windows of 5, 10, and 20 on Windows 7.
Solving the chemical master equation using sliding windows
2010-01-01
Background The chemical master equation (CME) is a system of ordinary differential equations that describes the evolution of a network of chemical reactions as a stochastic process. Its solution yields the probability density vector of the system at each point in time. Solving the CME numerically is in many cases computationally expensive or even infeasible as the number of reachable states can be very large or infinite. We introduce the sliding window method, which computes an approximate solution of the CME by performing a sequence of local analysis steps. In each step, only a manageable subset of states is considered, representing a "window" into the state space. In subsequent steps, the window follows the direction in which the probability mass moves, until the time period of interest has elapsed. We construct the window based on a deterministic approximation of the future behavior of the system by estimating upper and lower bounds on the populations of the chemical species. Results In order to show the effectiveness of our approach, we apply it to several examples previously described in the literature. The experimental results show that the proposed method speeds up the analysis considerably, compared to a global analysis, while still providing high accuracy. Conclusions The sliding window method is a novel approach to address the performance problems of numerical algorithms for the solution of the chemical master equation. The method efficiently approximates the probability distributions at the time points of interest for a variety of chemically reacting systems, including systems for which no upper bound on the population sizes of the chemical species is known a priori. PMID:20377904
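A toy sketch of the windowing idea, applied to a simple birth-death process, is shown below: the truncated CME is integrated on a moving window of states that is re-centered after each time step. The re-centering heuristic used here is a simple stand-in for the paper's deterministic population bounds, and the rate constants are hypothetical.

```python
# Toy illustration of the sliding-window idea on a birth-death process: instead
# of the full (unbounded) state space, the truncated CME is integrated on a
# moving window of states re-centered after every time step. The re-centering
# rule (mean +/- 6 std, minimum half-width 25) is a stand-in for the paper's
# deterministic bounds; rates are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

K_BIRTH, K_DEATH = 10.0, 0.1   # steady state ~ Poisson(100)

def cme_rhs(t, p, lo):
    """Truncated CME right-hand side on states lo .. lo+len(p)-1."""
    n = lo + np.arange(len(p))
    dp = -(K_BIRTH + K_DEATH * n) * p          # outflow from each state
    dp[1:] += K_BIRTH * p[:-1]                 # birth: n-1 -> n
    dp[:-1] += K_DEATH * n[1:] * p[1:]         # death: n+1 -> n
    return dp

def sliding_window_cme(init_state=0, init_width=200, t_end=50.0, n_steps=50):
    lo = max(0, init_state - init_width // 2)
    p = np.zeros(init_width)
    p[init_state - lo] = 1.0
    dt = t_end / n_steps
    for _ in range(n_steps):
        sol = solve_ivp(cme_rhs, (0.0, dt), p, args=(lo,),
                        method='LSODA', rtol=1e-8, atol=1e-12)
        p = np.clip(sol.y[:, -1], 0.0, None)
        states = lo + np.arange(len(p))
        mean = np.sum(states * p) / p.sum()
        std = np.sqrt(np.sum((states - mean) ** 2 * p) / p.sum())
        half = max(6.0 * std, 25.0)            # window half-width heuristic
        new_lo, new_hi = max(0, int(mean - half) - 1), int(mean + half) + 2
        new_p = np.zeros(new_hi - new_lo)
        a, b = max(lo, new_lo), min(lo + len(p), new_hi)
        new_p[a - new_lo:b - new_lo] = p[a - lo:b - lo]
        if new_p.sum() > 0:
            new_p /= new_p.sum()               # re-normalize truncation loss
        lo, p = new_lo, new_p
    return lo, p                               # window offset and distribution at t_end
```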
NASA Astrophysics Data System (ADS)
Moliner, L.; Correcher, C.; Gimenez-Alventosa, V.; Ilisie, V.; Alvarez, J.; Sanchez, S.; Rodríguez-Alvarez, M. J.
2017-11-01
Nowadays, with the increase of the computational power of modern computers together with the state-of-the-art reconstruction algorithms, it is possible to obtain Positron Emission Tomography (PET) images in practically real time. These facts open the door to new applications such as radio-pharmaceuticals tracking inside the body or the use of PET for image-guided procedures, such as biopsy interventions, among others. This work is a proof of concept that aims to improve the user experience with real time PET images. Fixed, incremental, overlapping, sliding and hybrid windows are the different statistical combinations of data blocks used to generate intermediate images in order to follow the path of the activity in the Field Of View (FOV). To evaluate these different combinations, a point source is placed in a dedicated breast PET device and moved along the FOV. These acquisitions are reconstructed according to the different statistical windows, resulting in a smoother transition of positions for the image reconstructions that use the sliding and hybrid window.
Baczkowski, Blazej M; Johnstone, Tom; Walter, Henrik; Erk, Susanne; Veer, Ilya M
2017-06-01
We evaluated whether sliding-window analysis can reveal functionally relevant brain network dynamics during a well-established fear conditioning paradigm. To this end, we tested if fMRI fluctuations in amygdala functional connectivity (FC) can be related to task-induced changes in physiological arousal and vigilance, as reflected in the skin conductance level (SCL). Thirty-two healthy individuals participated in the study. For the sliding-window analysis we used windows that were shifted by one volume at a time. Amygdala FC was calculated for each of these windows. Simultaneously acquired SCL time series were averaged over time frames that corresponded to the sliding-window FC analysis, which were subsequently regressed against the whole-brain seed-based amygdala sliding-window FC using the GLM. Surrogate time series were generated to test whether connectivity dynamics could have occurred by chance. In addition, results were contrasted against static amygdala FC and sliding-window FC of the primary visual cortex, which was chosen as a control seed, while a physio-physiological interaction (PPI) was performed as cross-validation. During periods of increased SCL, the left amygdala became more strongly coupled with the bilateral insula and anterior cingulate cortex, core areas of the salience network. The sliding-window analysis yielded a connectivity pattern that was unlikely to have occurred by chance, was spatially distinct from static amygdala FC and from sliding-window FC of the primary visual cortex, but was highly comparable to that of the PPI analysis. We conclude that sliding-window analysis can reveal functionally relevant fluctuations in connectivity in the context of an externally cued task. Copyright © 2017 Elsevier Inc. All rights reserved.
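The basic computation can be sketched as a windowed Pearson correlation shifted by a fixed offset (one volume in the study); the GLM step against the windowed SCL is indicated only in a comment, and the window length is hypothetical.

```python
# Generic sliding-window correlation (SWC) sketch: windowed Pearson correlation
# between a seed time series (e.g. amygdala) and a target time series, shifted
# by a fixed offset (one volume at a time in the study). Window length is in TRs.
import numpy as np

def sliding_window_correlation(seed, target, window, offset=1):
    """Return window start indices and windowed Pearson correlations."""
    starts = np.arange(0, len(seed) - window + 1, offset)
    swc = np.array([np.corrcoef(seed[s:s + window], target[s:s + window])[0, 1]
                    for s in starts])
    return starts, swc

# The windowed SCL regressor would then be built by averaging skin-conductance
# samples over the same time frames and regressing the SWC time course on it,
# e.g. with np.polyfit(scl_windowed, swc, 1) as a single-target stand-in for the GLM.
```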
24 CFR 3280.113 - Glass and glazed openings.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Glass and glazed openings. (a) Windows and sliding glass doors. All windows and sliding glass doors shall meet the requirements of § 3280.403 the “Standard for Windows and Sliding Glass Doors Used in...
MATSurv: multisensor air traffic surveillance system
NASA Astrophysics Data System (ADS)
Yeddanapudi, Murali; Bar-Shalom, Yaakov; Pattipati, Krishna R.; Gassner, Richard R.
1995-09-01
This paper deals with the design and implementation of MATSurv 1--an experimental Multisensor Air Traffic Surveillance system. The proposed system consists of a Kalman filter based state estimator used in conjunction with a 2D sliding window assignment algorithm. Real data from two FAA radars is used to evaluate the performance of this algorithm. The results indicate that the proposed algorithm provides a superior classification of the measurements into tracks (i.e., the most likely aircraft trajectories) when compared to the aircraft trajectories obtained using the measurement IDs (squawk or IFF code).
Griffiths, Jason I.; Fronhofer, Emanuel A.; Garnier, Aurélie; Seymour, Mathew; Altermatt, Florian; Petchey, Owen L.
2017-01-01
The development of video-based monitoring methods allows for rapid, dynamic and accurate monitoring of individuals or communities, compared to slower traditional methods, with far reaching ecological and evolutionary applications. Large amounts of data are generated using video-based methods, which can be effectively processed using machine learning (ML) algorithms into meaningful ecological information. ML uses user defined classes (e.g. species), derived from a subset (i.e. training data) of video-observed quantitative features (e.g. phenotypic variation), to infer classes in subsequent observations. However, phenotypic variation often changes due to environmental conditions, which may lead to poor classification, if environmentally induced variation in phenotypes is not accounted for. Here we describe a framework for classifying species under changing environmental conditions based on random forest classification. A sliding window approach was developed that restricts the temporal and environmental conditions considered in order to improve the classification. We tested our approach by applying the classification framework to experimental data. The experiment used a set of six ciliate species to monitor changes in community structure and behavior over hundreds of generations, in dozens of species combinations and across a temperature gradient. Differences in biotic and abiotic conditions caused simplistic classification approaches to be unsuccessful. In contrast, the sliding window approach allowed classification to be highly successful, as phenotypic differences driven by environmental change could be captured by the classifier. Importantly, classification using the random forest algorithm showed comparable success when validated against traditional, slower, manual identification. Our framework allows for reliable classification in dynamic environments, and may help to improve strategies for long-term monitoring of species in changing environments. Our classification pipeline can be applied in fields assessing species community dynamics, such as eco-toxicology, ecology and evolutionary ecology. PMID:28472193
Shakil, Sadia; Lee, Chin-Hui; Keilholz, Shella Dawn
2016-01-01
A promising recent development in the study of brain function is the dynamic analysis of resting-state functional MRI scans, which can enhance understanding of normal cognition and alterations that result from brain disorders. One widely used method of capturing the dynamics of functional connectivity is sliding window correlation (SWC). However, in the absence of a “gold standard” for comparison, evaluating the performance of the SWC in typical resting-state data is challenging. This study uses simulated networks (SNs) with known transitions to examine the effects of parameters such as window length, window offset, window type, noise, filtering, and sampling rate on the SWC performance. The SWC time course was calculated for all node pairs of each SN and then clustered using the k-means algorithm to determine how resulting brain states match known configurations and transitions in the SNs. The outcomes show that the detection of state transitions and durations in the SWC is most strongly influenced by the window length and offset, followed by noise and filtering parameters. The effect of the image sampling rate was relatively insignificant. Tapered windows provide less sensitivity to state transitions than rectangular windows, which could be the result of the sharp transitions in the SNs. Overall, the SWC gave poor estimates of correlation for each brain state. Clustering based on the SWC time course did not reliably reflect the underlying state transitions unless the window length was comparable to the state duration, highlighting the need for new adaptive window analysis techniques. PMID:26952197
24 CFR 3280.113 - Glass and glazed openings.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 24 Housing and Urban Development 5 2011-04-01 2011-04-01 false Glass and glazed openings. 3280.113... Glass and glazed openings. (a) Windows and sliding glass doors. All windows and sliding glass doors shall meet the requirements of § 3280.403 the “Standard for Windows and Sliding Glass Doors Used in...
Ma, Hsiang-Yang; Lin, Ying-Hsiu; Wang, Chiao-Yin; Chen, Chiung-Nien; Ho, Ming-Chih; Tsui, Po-Hsiang
2016-08-01
Ultrasound Nakagami imaging is an attractive method for visualizing changes in envelope statistics. Window-modulated compounding (WMC) Nakagami imaging was reported to improve image smoothness. The sliding window technique is typically used for constructing ultrasound parametric and Nakagami images. Using a large window overlap ratio may improve the WMC Nakagami image resolution but reduces computational efficiency. Therefore, the objectives of this study include: (i) exploring the effects of the window overlap ratio on the resolution and smoothness of WMC Nakagami images; (ii) proposing a fast algorithm that is based on the convolution operator (FACO) to accelerate WMC Nakagami imaging. Computer simulations and preliminary clinical tests on liver fibrosis samples (n=48) were performed to validate the FACO-based WMC Nakagami imaging. The results demonstrated that the width of the autocorrelation function and the parameter distribution of the WMC Nakagami image reduce with the increase in the window overlap ratio. One-pixel shifting (i.e., sliding the window on the image data in steps of one pixel for parametric imaging) as the maximum overlap ratio significantly improves the WMC Nakagami image quality. Concurrently, the proposed FACO method combined with a computational platform that optimizes the matrix computation can accelerate WMC Nakagami imaging, allowing the detection of liver fibrosis-induced changes in envelope statistics. FACO-accelerated WMC Nakagami imaging is a new-generation Nakagami imaging technique with an improved image quality and fast computation. Copyright © 2016 Elsevier B.V. All rights reserved.
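The general idea of replacing explicit window loops with convolution-style filtering can be sketched with a moment-based Nakagami estimator computed from box-filtered local moments; this is only an illustration of convolution-accelerated sliding-window parameter maps, not the paper's FACO/WMC implementation, and the window size is hypothetical.

```python
# Illustration of computing sliding-window Nakagami parameter maps with
# convolution-style box filters instead of explicit window loops (the general
# idea behind convolution-based acceleration; not the paper's FACO/WMC code).
# Input is an ultrasound envelope image.
import numpy as np
from scipy.ndimage import uniform_filter

def nakagami_maps(envelope, window=15):
    """Moment-based Nakagami m and Omega maps over a sliding square window."""
    r2 = envelope.astype(np.float64) ** 2
    mean_r2 = uniform_filter(r2, size=window)            # local E[R^2]
    mean_r4 = uniform_filter(r2 ** 2, size=window)        # local E[R^4]
    var_r2 = np.maximum(mean_r4 - mean_r2 ** 2, 1e-12)    # local Var[R^2]
    m_map = mean_r2 ** 2 / var_r2                          # Nakagami shape m
    omega_map = mean_r2                                    # Nakagami scale
    return m_map, omega_map
```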
SlideSort: all pairs similarity search for short reads
Shimizu, Kana; Tsuda, Koji
2011-01-01
Motivation: Recent progress in DNA sequencing technologies calls for fast and accurate algorithms that can evaluate sequence similarity for a huge amount of short reads. Searching similar pairs from a string pool is a fundamental process of de novo genome assembly, genome-wide alignment and other important analyses. Results: In this study, we designed and implemented an exact algorithm SlideSort that finds all similar pairs from a string pool in terms of edit distance. Using an efficient pattern growth algorithm, SlideSort discovers chains of common k-mers to narrow down the search. Compared to existing methods based on single k-mers, our method is more effective in reducing the number of edit distance calculations. In comparison to backtracking methods such as BWA, our method is much faster in finding remote matches, scaling easily to tens of millions of sequences. Our software has an additional function of single link clustering, which is useful in summarizing short reads for further processing. Availability: Executable binary files and C++ libraries are available at http://www.cbrc.jp/~shimizu/slidesort/ for Linux and Windows. Contact: slidesort@m.aist.go.jp; shimizu-kana@aist.go.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21148542
NASA Astrophysics Data System (ADS)
Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui
2014-07-01
The linear regression parameters between two time series can be different under different lengths of observation period. If we study the whole period by the sliding window of a short period, the change of the linear regression parameters is a process of dynamic transmission over time. We tackle fundamental research that presents a simple and efficient computational scheme: a linear regression patterns transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of the significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequency of the transmissions. The major patterns, the distance, and the medium in the process of the transmission can be captured. The statistical results of weighted out-degree and betweenness centrality are mapped on timelines, which shows the features of the distribution of the results. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.
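A simplified sketch of the construction: sliding-window regressions are discretized into pattern labels, and consecutive patterns become weighted, directed edges. The two-way pattern labels (slope sign crossed with significance) are a stand-in for the paper's parameter intervals.

```python
# Sketch of turning sliding-window linear-regression results into a directed,
# weighted pattern-transmission network. The pattern labels here (slope sign x
# significance) are a simplified stand-in for the paper's parameter intervals.
import numpy as np
from collections import Counter
from scipy.stats import linregress

def regression_pattern(x, y, alpha=0.05):
    fit = linregress(x, y)
    sig = 'sig' if fit.pvalue < alpha else 'nonsig'
    slope = 'pos' if fit.slope >= 0 else 'neg'
    return f'{slope}/{sig}'

def pattern_transmission_network(x, y, window=30, step=1):
    """Edges (pattern_i -> pattern_j) weighted by transition frequency."""
    patterns = [regression_pattern(x[s:s + window], y[s:s + window])
                for s in range(0, len(x) - window + 1, step)]
    edges = Counter(zip(patterns[:-1], patterns[1:]))
    return patterns, edges   # edges: {(from_pattern, to_pattern): weight}
```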
A Variational Approach to Simultaneous Image Segmentation and Bias Correction.
Zhang, Kaihua; Liu, Qingshan; Song, Huihui; Li, Xuelong
2015-08-01
This paper presents a novel variational approach for simultaneous estimation of bias field and segmentation of images with intensity inhomogeneity. We model intensity of inhomogeneous objects to be Gaussian distributed with different means and variances, and then introduce a sliding window to map the original image intensity onto another domain, where the intensity distribution of each object is still Gaussian but can be better separated. The means of the Gaussian distributions in the transformed domain can be adaptively estimated by multiplying the bias field with a piecewise constant signal within the sliding window. A maximum likelihood energy functional is then defined on each local region, which combines the bias field, the membership function of the object region, and the constant approximating the true signal from its corresponding object. The energy functional is then extended to the whole image domain by the Bayesian learning approach. An efficient iterative algorithm is proposed for energy minimization, via which the image segmentation and bias field correction are simultaneously achieved. Furthermore, the smoothness of the obtained optimal bias field is ensured by the normalized convolutions without extra cost. Experiments on real images demonstrated the superiority of the proposed algorithm to other state-of-the-art representative methods.
Guerra, Jorge; Uddin, Jasim; Nilsen, Dawn; McInerney, James; Fadoo, Ammarah; Omofuma, Isirame B.; Hughes, Shatif; Agrawal, Sunil; Allen, Peter; Schambra, Heidi M.
2017-01-01
There currently exist no practical tools to identify functional movements in the upper extremities (UEs). This absence has limited the precise therapeutic dosing of patients recovering from stroke. In this proof-of-principle study, we aimed to develop an accurate approach for classifying UE functional movement primitives, which comprise functional movements. Data were generated from inertial measurement units (IMUs) placed on upper body segments of older healthy individuals and chronic stroke patients. Subjects performed activities commonly trained during rehabilitation after stroke. Data processing involved the use of a sliding window to obtain statistical descriptors, and resulting features were processed by a Hidden Markov Model (HMM). The likelihoods of the states, resulting from the HMM, were segmented by a second sliding window and their averages were calculated. The final predictions were mapped to human functional movement primitives using a Logistic Regression algorithm. Algorithm performance was assessed with a leave-one-out analysis, which determined its sensitivity, specificity, and positive and negative predictive values for all classified primitives. In healthy control and stroke participants, our approach identified functional movement primitives embedded in training activities with, on average, 80% precision. This approach may support functional movement dosing in stroke rehabilitation. PMID:28813877
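The sliding-window feature step can be sketched as below: fixed-length windows of multi-channel IMU samples are reduced to simple statistical descriptors before the HMM stage. Window length, stride, and the descriptor set are illustrative, not the study's configuration.

```python
# Sketch of the sliding-window feature step: statistical descriptors computed
# over windows of multi-channel IMU samples. Window length/stride and the
# descriptor set are illustrative values.
import numpy as np

def window_features(imu, window=100, stride=50):
    """imu: (n_samples, n_channels) array of accelerometer/gyroscope data.
    Returns an (n_windows, n_channels * 4) feature matrix."""
    feats = []
    for start in range(0, imu.shape[0] - window + 1, stride):
        seg = imu[start:start + window]
        feats.append(np.concatenate([seg.mean(axis=0), seg.std(axis=0),
                                     seg.min(axis=0), seg.max(axis=0)]))
    return np.vstack(feats)

# These features would then be fed to the HMM, and the smoothed state
# likelihoods (averaged by a second sliding window) to a logistic regression.
```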
Sliding Window Generalized Kernel Affine Projection Algorithm Using Projection Mappings
NASA Astrophysics Data System (ADS)
Slavakis, Konstantinos; Theodoridis, Sergios
2008-12-01
Very recently, a solution to the kernel-based online classification problem has been given by the adaptive projected subgradient method (APSM). The developed algorithm can be considered as a generalization of a kernel affine projection algorithm (APA) and the kernel normalized least mean squares (NLMS). Furthermore, sparsification of the resulting kernel series expansion was achieved by imposing a closed ball (convex set) constraint on the norm of the classifiers. This paper presents another sparsification method for the APSM approach to the online classification task by generating a sequence of linear subspaces in a reproducing kernel Hilbert space (RKHS). To cope with the inherent memory limitations of online systems and to embed tracking capabilities to the design, an upper bound on the dimension of the linear subspaces is imposed. The underlying principle of the design is the notion of projection mappings. Classification is performed by metric projection mappings, sparsification is achieved by orthogonal projections, while the online system's memory requirements and tracking are attained by oblique projections. The resulting sparsification scheme shows strong similarities with the classical sliding window adaptive schemes. The proposed design is validated by the adaptive equalization problem of a nonlinear communication channel, and is compared with classical and recent stochastic gradient descent techniques, as well as with the APSM's solution where sparsification is performed by a closed ball constraint on the norm of the classifiers.
Window Operator Types | Efficient Windows Collaborative
Casement windows are hinged at the sides. Traditional operable window types include the projected or hinged types, such as casement, awning, and hopper, and the sliding types, such as double-hung, single-hung, and horizontal sliding windows.
Soft-output decoding algorithms in iterative decoding of turbo codes
NASA Technical Reports Server (NTRS)
Benedetto, S.; Montorsi, G.; Divsalar, D.; Pollara, F.
1996-01-01
In this article, we present two versions of a simplified maximum a posteriori decoding algorithm. The algorithms work in a sliding window form, like the Viterbi algorithm, and can thus be used to decode continuously transmitted sequences obtained by parallel concatenated codes, without requiring code trellis termination. A heuristic explanation is also given of how to embed the maximum a posteriori algorithms into the iterative decoding of parallel concatenated codes (turbo codes). The performances of the two algorithms are compared on the basis of a powerful rate 1/3 parallel concatenated code. Basic circuits to implement the simplified a posteriori decoding algorithm using lookup tables, and two further approximations (linear and threshold), with a very small penalty, to eliminate the need for lookup tables are proposed.
Pattern Discovery and Change Detection of Online Music Query Streams
NASA Astrophysics Data System (ADS)
Li, Hua-Fu
In this paper, an efficient stream mining algorithm, called FTP-stream (Frequent Temporal Pattern mining of streams), is proposed to find the frequent temporal patterns over melody sequence streams. In the framework of the proposed algorithm, an effective bit-sequence representation is used to reduce the time and memory needed to slide the windows. The FTP-stream algorithm can calculate the support threshold in only a single pass based on the concept of the bit-sequence representation, taking advantage of the bitwise "left-shift" and "and" operations of the representation. Experiments show that the proposed algorithm scans the music query stream only once, and runs significantly faster and consumes less memory than existing algorithms, such as SWFI-stream and Moment.
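The bit-sequence idea can be sketched with ordinary Python integers as bitmaps: each item keeps one bit per transaction in the current window, sliding is a left shift plus mask, and itemset support is a popcount of the bitwise AND. This is a toy illustration of the representation, not the FTP-stream algorithm itself.

```python
# Sketch of the bit-sequence idea: each item keeps a W-bit integer whose bits
# mark which of the last W transactions (queries) contained it. Sliding the
# window is a left shift plus mask; itemset support is a popcount of the
# bitwise AND of the item bit-sequences.
from collections import defaultdict

class BitSequenceWindow:
    def __init__(self, window_size=64):
        self.mask = (1 << window_size) - 1
        self.bits = defaultdict(int)        # item -> bit-sequence over the window

    def add_transaction(self, items):
        for item in list(self.bits):        # shift every tracked item ("left" op)
            self.bits[item] = (self.bits[item] << 1) & self.mask
        for item in items:                  # set bit for items in this transaction
            self.bits[item] |= 1

    def support(self, itemset):
        acc = self.mask
        for item in itemset:                # "and" op across the itemset
            acc &= self.bits.get(item, 0)
        return bin(acc).count('1')          # popcount = windowed support
```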
SU-E-T-478: Sliding Window Multi-Criteria IMRT Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, D; Papp, D; Unkelbach, J
2014-06-01
Purpose: To demonstrate a method for what-you-see-is-what-you-get multi-criteria Pareto surface navigation for step and shoot IMRT treatment planning. Methods: We show mathematically how multiple sliding window treatment plans can be averaged to yield a single plan whose dose distribution is the dosimetric average of the averaged plans. This is incorporated into the Pareto surface navigation based approach to treatment planning in such a way that as the user navigates the surface, the plans he/she is viewing are ready to be delivered (i.e. there is no extra 'segment the plans' step that often leads to unacceptable plan degradation in step and shoot Pareto surface navigation). We also describe how the technique can be applied to VMAT. Briefly, sliding window VMAT plans are created such that MLC leaves paint out fluence maps every 15 degrees or so. These fluence map leaf trajectories are averaged in the same way the static beam IMRT ones are. Results: We show mathematically that fluence maps are exactly averaged using our leaf sweep averaging algorithm. Leaf transmission and output factor corrections effects, which are ignored in this work, can lead to small errors in terms of the dose distributions not being exactly averaged even though the fluence maps are. However, our demonstrations show that the dose distributions are almost exactly averaged as well. We demonstrate the technique both for IMRT and VMAT. Conclusions: By turning to sliding window delivery, we show that the problem of losing plan fidelity during the conversion of an idealized fluence map plan into a deliverable plan is remedied. This will allow for multicriteria optimization that avoids the pitfall that the planning has to be redone after the conversion into MLC segments due to plan quality decline. David Craft partially funded by RaySearch Laboratories.
Finding minimum spanning trees more efficiently for tile-based phase unwrapping
NASA Astrophysics Data System (ADS)
Sawaf, Firas; Tatam, Ralph P.
2006-06-01
The tile-based phase unwrapping method employs an algorithm for finding the minimum spanning tree (MST) in each tile. We first examine the properties of a tile's representation from a graph theory viewpoint, observing that it is possible to make use of a more efficient class of MST algorithms. We then describe a novel linear time algorithm which reduces the size of the MST problem by half at the least, and solves it completely at best. We also show how this algorithm can be applied to a tile using a sliding window technique. Finally, we show how the reduction algorithm can be combined with any other standard MST algorithm to achieve a more efficient hybrid, using Prim's algorithm for empirical comparison and noting that the reduction algorithm takes only 0.1% of the time taken by the overall hybrid.
Robust and unobtrusive algorithm based on position independence for step detection
NASA Astrophysics Data System (ADS)
Qiu, KeCheng; Li, MengYang; Luo, YiHan
2018-04-01
Running is becoming one of the most popular exercises among people, and monitoring steps can help users better understand their running process and improve exercise efficiency. In this paper, we design and implement a robust and unobtrusive algorithm based on position independence for step detection in real environments. It applies a Butterworth filter to suppress high-frequency interference and then employs a mathematical projection to transform the coordinate system, solving the problem of the unknown position of the smartphone. Finally, a sliding window is used to suppress false peaks. The algorithm was tested with eight participants on the Android 7.0 platform. In our experiments, the results show that the proposed algorithm achieves the desired effect regardless of device pose.
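A sketch of the described pipeline is given below: low-pass Butterworth filtering of the acceleration magnitude (used here as a simple orientation-independent stand-in for the paper's projection step), peak picking, and a minimum-distance rule as the sliding-window false-peak suppression. Sampling rate, cut-off, and thresholds are illustrative values.

```python
# Sketch of a position-independent step detector: low-pass Butterworth filter
# on the acceleration magnitude, then peak picking with a minimum-distance
# rule standing in for sliding-window false-peak rejection. Parameter values
# are illustrative, not the paper's.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def count_steps(acc_xyz, fs=50.0, cutoff_hz=3.0, min_step_interval_s=0.3):
    magnitude = np.linalg.norm(acc_xyz, axis=1)           # orientation-independent
    b, a = butter(4, cutoff_hz / (fs / 2), btype='low')   # suppress high-freq noise
    smooth = filtfilt(b, a, magnitude)
    peaks, _ = find_peaks(smooth,
                          height=smooth.mean() + 0.5 * smooth.std(),
                          distance=int(min_step_interval_s * fs))
    return len(peaks)
```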
Lesion Detection in CT Images Using Deep Learning Semantic Segmentation Technique
NASA Astrophysics Data System (ADS)
Kalinovsky, A.; Liauchuk, V.; Tarasau, A.
2017-05-01
In this paper, the problem of automatic detection of tuberculosis lesions in 3D lung CT images is considered as a benchmark for testing algorithms based on the modern concept of Deep Learning. For training and testing of the algorithms, a domestic dataset of 338 3D CT scans of tuberculosis patients with manually labelled lesions was used. The algorithms, which are based on Deep Convolutional Networks, were implemented and applied in three different ways: slice-wise lesion detection in 2D images using semantic segmentation, slice-wise lesion detection in 2D images using a sliding window technique, and straightforward detection of lesions via semantic segmentation in whole 3D CT scans. The algorithms demonstrate superior performance compared to algorithms based on conventional image analysis methods.
9. INTERIOR OF LIVING ROOM SHOWING ALUMINUM SLIDING GLASS WINDOW ...
9. INTERIOR OF LIVING ROOM SHOWING ALUMINUM SLIDING GLASS WINDOW FRONT DOOR, AND ORIGINAL 6-LIGHT OVER 1-LIGHT, DOUBLE-HUNG WINDOWS IN SINGLE AND DOUBLE ARRANGEMENTS. VIEW TO NORTHWEST. - Bishop Creek Hydroelectric System, Plant 4, Worker Cottage, Bishop Creek, Bishop, Inyo County, CA
Sliding window prior data assisted compressed sensing for MRI tracking of lung tumors.
Yip, Eugene; Yun, Jihyun; Wachowicz, Keith; Gabos, Zsolt; Rathee, Satyapal; Fallone, B G
2017-01-01
Hybrid magnetic resonance imaging and radiation therapy devices are capable of imaging in real-time to track intrafractional lung tumor motion during radiotherapy. Highly accelerated magnetic resonance (MR) imaging methods can potentially reduce system delay time and/or improves imaging spatial resolution, and provide flexibility in imaging parameters. Prior Data Assisted Compressed Sensing (PDACS) has previously been proposed as an acceleration method that combines the advantages of 2D compressed sensing and the KEYHOLE view-sharing technique. However, as PDACS relies on prior data acquired at the beginning of a dynamic imaging sequence, decline in image quality occurs for longer duration scans due to drifts in MR signal. Novel sliding window-based techniques for refreshing prior data are proposed as a solution to this problem. MR acceleration is performed by retrospective removal of data from the fully sampled sets. Six patients with lung tumors are scanned with a clinical 3 T MRI using a balanced steady-state free precession (bSSFP) sequence for 3 min at approximately 4 frames per second, for a total of 650 dynamics. A series of distinct pseudo-random patterns of partial k-space acquisition is generated such that, when combined with other dynamics within a sliding window of 100 dynamics, covers the entire k-space. The prior data in the sliding window are continuously refreshed to reduce the impact of MR signal drifts. We intended to demonstrate two different ways to utilize the sliding window data: a simple averaging method and a navigator-based method. These two sliding window methods are quantitatively compared against the original PDACS method using three metrics: artifact power, centroid displacement error, and Dice's coefficient. The study is repeated with pseudo 0.5 T images by adding complex, normally distributed noise with a standard deviation that reduces image SNR, relative to original 3 T images, by a factor of 6. Without sliding window implemented, PDACS-reconstructed dynamic datasets showed progressive increases in image artifact power as the 3 min scan progresses. With sliding windows implemented, this increase in artifact power is eliminated. Near the end of a 3 min scan at 3 T SNR and 5× acceleration, implementation of an averaging (navigator) sliding window method improves our metrics by the following ways: artifact power decreases from 0.065 without sliding window to 0.030 (0.031), centroid error decreases from 2.64 to 1.41 mm (1.28 mm), and Dice coefficient agreement increases from 0.860 to 0.912 (0.915). At pseudo 0.5 T SNR, the improvements in metrics are as follows: artifact power decreases from 0.110 without sliding window to 0.0897 (0.0985), centroid error decreases from 2.92 mm to 1.36 mm (1.32 mm), and Dice coefficient agreements increases from 0.851 to 0.894 (0.896). In this work we demonstrated the negative impact of slow changes in MR signal for longer duration PDACS dynamic scans, namely increases in image artifact power and reductions of tumor tracking accuracy. We have also demonstrated sliding window implementations (i.e., refreshing of prior data) of PDACS are effective solutions to this problem at both 3 T and simulated 0.5 T bSSFP images. © 2016 American Association of Physicists in Medicine.
Automatic segmentation of psoriasis lesions
NASA Astrophysics Data System (ADS)
Ning, Yang; Shi, Chenbo; Wang, Li; Shu, Chang
2014-10-01
The automatic segmentation of psoriatic lesions has been widely researched in recent years. It is an important step in computer-aided methods of calculating PASI for the estimation of lesions. Current algorithms can only handle single erythema or only deal with scaling segmentation, whereas in practice scaling and erythema are often mixed together. In order to segment the lesion area, this paper proposes an algorithm based on random forests with color and texture features. The algorithm has three steps. In the first step, polarized light is applied, exploiting the skin's Tyndall effect, to eliminate reflections during imaging, and the Lab color space is used to fit human perception. In the second step, a sliding window and its sub-windows are used to obtain textural and color features. In this step, an image roughness feature is defined so that scaling can be easily separated from normal skin. Finally, random forests are used to ensure the generalization ability of the algorithm. The algorithm gives reliable segmentation results even when images have different lighting conditions and skin types. On the data set offered by Union Hospital, more than 90% of images can be segmented accurately.
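The sliding-window feature extraction plus random forest pipeline described above can be sketched generically. The code below is not the authors' implementation: the window size, the colour statistics, the roughness proxy (local intensity standard deviation, standing in for the paper's roughness feature) and the use of scikit-learn's RandomForestClassifier with placeholder labels are all assumptions made to keep the sketch self-contained.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(img, y, x, half=8):
    """Mean colour plus a simple roughness proxy (std of intensity) in a window."""
    patch = img[y - half:y + half, x - half:x + half]
    colour = patch.reshape(-1, patch.shape[-1]).mean(axis=0)
    roughness = patch.mean(axis=-1).std()
    return np.concatenate([colour, [roughness]])

def extract(img, step=8, half=8):
    """Slide a window over the image and collect one feature vector per position."""
    feats, coords = [], []
    h, w = img.shape[:2]
    for y in range(half, h - half, step):
        for x in range(half, w - half, step):
            feats.append(window_features(img, y, x, half))
            coords.append((y, x))
    return np.array(feats), coords

# Toy training: label windows of a synthetic image as lesion (1) / normal (0).
rng = np.random.default_rng(1)
img = rng.random((128, 128, 3))
X, coords = extract(img)
y = rng.integers(0, 2, len(X))            # placeholder labels for illustration only
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
pred = clf.predict(X)                     # per-window lesion/normal decisions
```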
Wavelet-based clustering of resting state MRI data in the rat.
Medda, Alessio; Hoffmann, Lukas; Magnuson, Matthew; Thompson, Garth; Pan, Wen-Ju; Keilholz, Shella
2016-01-01
While functional connectivity has typically been calculated over the entire length of the scan (5-10min), interest has been growing in dynamic analysis methods that can detect changes in connectivity on the order of cognitive processes (seconds). Previous work with sliding window correlation has shown that changes in functional connectivity can be observed on these time scales in the awake human and in anesthetized animals. This exciting advance creates a need for improved approaches to characterize dynamic functional networks in the brain. Previous studies were performed using sliding window analysis on regions of interest defined based on anatomy or obtained from traditional steady-state analysis methods. The parcellation of the brain may therefore be suboptimal, and the characteristics of the time-varying connectivity between regions are dependent upon the length of the sliding window chosen. This manuscript describes an algorithm based on wavelet decomposition that allows data-driven clustering of voxels into functional regions based on temporal and spectral properties. Previous work has shown that different networks have characteristic frequency fingerprints, and the use of wavelets ensures that both the frequency and the timing of the BOLD fluctuations are considered during the clustering process. The method was applied to resting state data acquired from anesthetized rats, and the resulting clusters agreed well with known anatomical areas. Clusters were highly reproducible across subjects. Wavelet cross-correlation values between clusters from a single scan were significantly higher than the values from randomly matched clusters that shared no temporal information, indicating that wavelet-based analysis is sensitive to the relationship between areas. Copyright © 2015 Elsevier Inc. All rights reserved.
Active impulsive noise control using maximum correntropy with adaptive kernel size
NASA Astrophysics Data System (ADS)
Lu, Lu; Zhao, Haiquan
2017-03-01
The active noise control (ANC) based on the principle of superposition is an attractive method to attenuate noise signals. However, impulsive noise in ANC systems degrades the performance of the controller. In this paper, a filtered-x recursive maximum correntropy (FxRMC) algorithm is proposed based on the maximum correntropy criterion (MCC) to reduce the effect of outliers. The proposed FxRMC algorithm does not require any a priori information about the noise characteristics and outperforms the filtered-x least mean square (FxLMS) algorithm for impulsive noise. Meanwhile, in order to adjust the kernel size of the FxRMC algorithm online, a recursive approach is proposed that takes into account the past estimates of error signals over a sliding window. Simulation and experimental results in the context of active impulsive noise control demonstrate that the proposed algorithms achieve much better performance than existing algorithms in various noise environments.
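A common way to adapt the kernel size online is to re-estimate it from a robust scale statistic of the errors held in a sliding window; the sketch below illustrates that general idea, not the FxRMC recursion itself. The window length, the Silverman-style bandwidth rule and the toy impulsive error stream are assumptions.

```python
import numpy as np
from collections import deque

class AdaptiveKernel:
    """Re-estimate a Gaussian kernel size from a sliding window of recent errors
    (Silverman-style rule of thumb); outliers then receive small correntropy weights."""

    def __init__(self, window_len=200, sigma0=1.0):
        self.window = deque(maxlen=window_len)
        self.sigma = sigma0

    def update(self, e):
        self.window.append(e)
        if len(self.window) > 10:
            self.sigma = max(1.06 * np.std(self.window) * len(self.window) ** -0.2, 1e-3)
        return self.sigma

    def weight(self, e):
        """Correntropy-style Gaussian weight of an error under the current kernel size."""
        return np.exp(-e ** 2 / (2.0 * self.sigma ** 2))

# Toy impulsive error stream: mostly Gaussian with occasional large outliers.
rng = np.random.default_rng(2)
ak = AdaptiveKernel()
weights = []
for _ in range(1000):
    e = rng.standard_normal()
    if rng.random() < 0.02:
        e += 50.0 * rng.standard_normal()     # impulsive outlier
    ak.update(e)
    weights.append(ak.weight(e))              # outliers get weights near zero
```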
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, L; Huang, B; Rowedder, B
Purpose: The Smart leaf motion calculator (SLMC) in the Eclipse treatment planning system is an advanced fluence delivery modeling algorithm, as it takes into account fine MLC features including inter-leaf leakage, rounded leaf tips, non-uniform leaf thickness, and the spindle cavity. In this study, the SLMC and traditional Varian LMC (VLMC) algorithms were investigated, for the first time, in terms of dosimetric characteristics and delivery accuracy of sliding window (SW) IMRT. Methods: SW IMRT plans of 51 cancer cases were included to evaluate dosimetric characteristics and dose delivery accuracy from leaf motion calculated by SLMC and VLMC, respectively. All plans were delivered using a Varian TrueBeam Linac. The DVHs and MUs of the plans were analyzed. Three patient-specific QA tools - the independent dose calculation software IMSure, the Delta4 phantom, and EPID portal dosimetry - were also used to measure the delivered dose distribution. Results: Significant differences in the MUs were observed between the two LMCs (p≤0.001). Gamma analysis shows an excellent agreement between the planned dose distribution calculated by both LMC algorithms and the delivered dose distribution measured by the three QA tools in all plans at 3%/3 mm, with a mean pass rate exceeding 97%. The mean fraction of pixels with gamma < 1 of SLMC is slightly lower than that of VLMC in the IMSure and Delta4 results, but higher in portal dosimetry (the highest spatial resolution), especially in complex cases such as nasopharynx. Conclusion: The study suggests that the two LMCs generate similar target coverage and sparing patterns of critical structures. However, SLMC is modestly more accurate than VLMC in modeling advanced MLC features, which may lead to a more accurate dose delivery in SW IMRT. Current clinical QA tools might not be specific enough to differentiate the dosimetric discrepancies at the millimeter level calculated by these two LMC algorithms. NIH/NIGMS grant U54 GM104944, Lincy Endowed Assistant Professorship.
Sliding, Insulating Window Panel Reduces Heat Loss.
ERIC Educational Resources Information Center
School Business Affairs, 1984
1984-01-01
A new sliding insulated panel reduces window heat loss up to 86 percent, and infiltration 60-90 percent, paying for itself in 3-9 years. This article discusses the panel's use and testing in the upper Midwest, reporting both technical characteristics and users' reactions. (MCG)
Combining point context and dynamic time warping for online gesture recognition
NASA Astrophysics Data System (ADS)
Mao, Xia; Li, Chen
2017-05-01
Previous gesture recognition methods usually focused on recognizing gestures after the entire gesture sequences were obtained. However, in many practical applications, a system has to identify gestures before they end in order to give instant feedback. We present an online gesture recognition approach that can realize early recognition of unfinished gestures with low latency. First, a curvature buffer-based point context (CBPC) descriptor is proposed to extract the shape feature of a gesture trajectory. The CBPC descriptor is a complete descriptor with a simple computation and is therefore well suited to online scenarios. Then, we introduce an online windowed dynamic time warping algorithm to realize online matching between the ongoing gesture and the template gestures. In the algorithm, computational complexity is effectively decreased by adding a sliding window to the accumulative distance matrix. Lastly, experiments are conducted on the Australian sign language data set and the Kinect hand gesture (KHG) data set. Results show that the proposed method outperforms other state-of-the-art methods, especially when gesture information is incomplete.
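Restricting the accumulative distance matrix to a band around the diagonal is one standard way to add a sliding window to DTW and cut its cost; the sketch below shows that generic banded DTW rather than the paper's exact online variant. The band width and the sine-wave trajectories are assumptions.

```python
import numpy as np

def windowed_dtw(a, b, band=10):
    """DTW restricted to a band around the diagonal of the accumulated
    distance matrix, which bounds the per-step computation."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        lo, hi = max(1, i - band), min(m, i + band)
        for j in range(lo, hi + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Match an ongoing (partial) gesture trajectory against part of a template.
template = np.sin(np.linspace(0, 2 * np.pi, 100))
partial = np.sin(np.linspace(0, np.pi, 50)) + 0.05 * np.random.default_rng(3).standard_normal(50)
print(windowed_dtw(partial, template[:60], band=10))
```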
An algorithm for automating the registration of USDA segment ground data to LANDSAT MSS data
NASA Technical Reports Server (NTRS)
Graham, M. H. (Principal Investigator)
1981-01-01
The algorithm is referred to as the Automatic Segment Matching Algorithm (ASMA). The ASMA uses control points or the annotation record of a P-format LANDSAT computer compatible tape as the initial registration to relate latitude and longitude to LANDSAT rows and columns. It searches a given area of LANDSAT data with a 2x2 sliding window and computes gradient values for bands 5 and 7 to match the segment boundaries. The gradient values are held in memory during the shifting (or matching) process. The reconstructed segment array, containing ones (1's) for boundaries and zeros elsewhere, is then compared by computer to the LANDSAT array and the best match computed. Initial testing of the ASMA indicates that it has good potential for replacing the manual technique.
NASA Astrophysics Data System (ADS)
Astawa, INGA; Gusti Ngurah Bagus Caturbawa, I.; Made Sajayasa, I.; Dwi Suta Atmaja, I. Made Ari
2018-01-01
License plate recognition is usually used as part of a larger system, such as a parking system. License plate detection is considered the most important step in the license plate recognition system. We propose methods that can be used to detect the vehicle plate on a mobile phone. In this paper, we used the sliding window, Histogram of Oriented Gradients (HOG), and Support Vector Machine (SVM) methods for license plate detection, so that the detection rate remains high even when the image is of poor quality. The image is processed by the sliding window method in order to find the plate position. Feature extraction and classification at every window position are performed by the HOG and SVM methods. Good results were obtained in this research, with an accuracy of 96%.
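A generic HOG-plus-linear-SVM sliding-window detector looks like the following sketch. It is not the authors' code: the window geometry, stride, decision threshold and the random placeholder training patches (used only to make the script runnable) are assumptions; skimage.feature.hog and sklearn.svm.LinearSVC stand in for whatever feature and classifier implementations were actually used.

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

WIN_H, WIN_W, STEP = 32, 96, 16          # assumed plate-like window geometry

def hog_descriptor(patch):
    return hog(patch, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

def detect(image, clf, thresh=0.0):
    """Slide a window over the image and keep positions the SVM scores above thresh."""
    hits = []
    h, w = image.shape
    for y in range(0, h - WIN_H + 1, STEP):
        for x in range(0, w - WIN_W + 1, STEP):
            feat = hog_descriptor(image[y:y + WIN_H, x:x + WIN_W])
            score = clf.decision_function([feat])[0]
            if score > thresh:
                hits.append((y, x, score))
    return hits

# Placeholder training patches so the sketch is self-contained; real training
# would use labelled plate / background crops.
rng = np.random.default_rng(4)
patches = [rng.random((WIN_H, WIN_W)) for _ in range(40)]
labels = np.array([1] * 20 + [0] * 20)
X = np.array([hog_descriptor(p) for p in patches])
clf = LinearSVC().fit(X, labels)
detections = detect(rng.random((128, 256)), clf)
```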
Plan averaging for multicriteria navigation of sliding window IMRT and VMAT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, David, E-mail: dcraft@partners.org; Papp, Dávid; Unkelbach, Jan
2014-02-15
Purpose: To describe a method for combining sliding window plans [intensity modulated radiation therapy (IMRT) or volumetric modulated arc therapy (VMAT)] for use in treatment plan averaging, which is needed for Pareto surface navigation based multicriteria treatment planning. Methods: The authors show that by taking an appropriately defined average of leaf trajectories of sliding window plans, the authors obtain a sliding window plan whose fluence map is the exact average of the fluence maps corresponding to the initial plans. In the case of static-beam IMRT, this also implies that the dose distribution of the averaged plan is the exact dosimetric average of the initial plans. In VMAT delivery, the dose distribution of the averaged plan is a close approximation of the dosimetric average of the initial plans. Results: The authors demonstrate the method on three Pareto optimal VMAT plans created for a demanding paraspinal case, where the tumor surrounds the spinal cord. The results show that the leaf averaged plans yield dose distributions that approximate the dosimetric averages of the precomputed Pareto optimal plans well. Conclusions: The proposed method enables the navigation of deliverable Pareto optimal plans directly, i.e., interactive multicriteria exploration of deliverable sliding window IMRT and VMAT plans, eliminating the need for a sequencing step after navigation and hence the dose degradation that is caused by such a sequencing step.
Ahmad, Muneer; Jung, Low Tan; Bhuiyan, Al-Amin
2017-10-01
Digital signal processing techniques commonly employ fixed-length window filters to process the signal contents. DNA signals differ in characteristics from common digital signals since they carry nucleotides as contents. The nucleotides own genetic code context and fuzzy behaviors due to their special structure and order in the DNA strand. Employing conventional fixed-length window filters for DNA signal processing produces spectral leakage and hence results in signal noise. A biological context-aware adaptive window filter is required to process DNA signals. This paper introduces a biologically inspired fuzzy adaptive window median filter (FAWMF) which computes the fuzzy membership strength of nucleotides in each slide of the window and filters nucleotides based on median filtering with a combination of s-shaped and z-shaped filters. Since coding regions cause 3-base periodicity through an unbalanced nucleotide distribution producing a relatively high bias in nucleotide usage, this fundamental characteristic of nucleotides has been exploited in FAWMF to suppress the signal noise. Along with the adaptive response of FAWMF, a strong correlation between median nucleotides and the Π-shaped filter was observed, which produced enhanced discrimination between coding and non-coding regions compared to fixed-length conventional window filters. The proposed FAWMF attains a significant enhancement in coding region identification, i.e. 40% to 125% compared to other conventional window filters, tested over more than 250 benchmarked and randomly taken DNA datasets of different organisms. This study shows that conventional fixed-length window filters applied to DNA signals do not achieve significant results, since the nucleotides carry genetic code context. The proposed FAWMF algorithm is adaptive and significantly outperforms them in processing DNA signal contents. The algorithm applied to a variety of DNA datasets produced noteworthy discrimination between coding and non-coding regions in contrast to fixed-length conventional window filters. Copyright © 2017 Elsevier B.V. All rights reserved.
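FAWMF itself is considerably more involved, but the basic shape of a fuzzy-weighted, windowed median filter over a nucleotide indicator sequence can be sketched as follows. The nucleotide encoding, the window length, and the single s-shaped membership applied to the window's local composition are all simplifying assumptions; this is an illustration of the idea, not the published filter.

```python
import numpy as np

def s_membership(x, a, b):
    """Standard s-shaped membership function rising from 0 at a to 1 at b."""
    mid = (a + b) / 2.0
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    if x <= mid:
        return 2.0 * ((x - a) / (b - a)) ** 2
    return 1.0 - 2.0 * ((x - b) / (b - a)) ** 2

def fuzzy_window_median(indicator, win=99, a=0.2, b=0.8):
    """Windowed median of a 0/1 nucleotide indicator, weighted by the fuzzy
    membership of the window's local composition (illustrative only)."""
    half = win // 2
    padded = np.pad(indicator.astype(float), half, mode="edge")
    out = np.empty(len(indicator))
    for i in range(len(indicator)):
        w = padded[i:i + win]
        weight = s_membership(w.mean(), a, b)      # assumed membership rule
        out[i] = weight * np.median(w)
    return out

# Binary indicator for one nucleotide (here: 1 where the base is 'G').
seq = "ATGGGCGGTACGTTAGGG" * 50
indicator = np.array([1 if c == "G" else 0 for c in seq])
smoothed = fuzzy_window_median(indicator)
```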
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, J; Lu, B; Yan, G
Purpose: To identify the weaknesses of the dose calculation algorithm in a treatment planning system for volumetric modulated arc therapy (VMAT) and sliding window (SW) techniques using a two-dimensional diode array. Methods: The VMAT quality assurance (QA) was implemented with a diode array using multiple partial arcs divided from a VMAT plan; each partial arc has the same segments and the original monitor units. Arc angles were less than ±30°. Multiple arcs were delivered through consecutive, repetitive gantry rotations clockwise and counterclockwise. A source-to-axis distance setup with effective depths of 10 and 20 cm was used for the diode array. To identify dose errors caused in the delivery of the VMAT fields, numerous fields having the same segments as the VMAT field were irradiated using the different delivery techniques of static and step-and-shoot. The dose distributions of the SW technique were evaluated by creating split fields having fine moving steps of the multi-leaf collimator leaves. Doses calculated using the adaptive convolution algorithm were analyzed against measured ones with distance-to-agreement and dose difference criteria of 3 mm and 3%. Results: While beam delivery through the static and step-and-shoot techniques showed a passing rate of 97 ± 2%, partial arc delivery of the VMAT fields brought the passing rate down to 85%. However, when leaf motion was restricted to less than 4.6 mm/°, the passing rate was improved up to 95 ± 2%. Similar passing rates were obtained for both the 10 and 20 cm effective depth setups. The doses calculated using the SW technique showed a dose difference over 7% at the final arrival point of the moving leaves. Conclusion: Error components in dynamic delivery of modulated beams were distinguished using the suggested QA method. This partial arc method can be used for routine VMAT QA. An improved SW calculation algorithm is required to provide accurate estimated doses.
NASA Astrophysics Data System (ADS)
Rai, A.; Minsker, B. S.
2016-12-01
In this work we introduce a novel dataset, GRID: GReen Infrastructure Detection Dataset, and a framework for identifying urban green storm water infrastructure (GI) designs (wetlands/ponds, urban trees, and rain gardens/bioswales) from social media and satellite aerial images using computer vision and machine learning methods. Along with the hydrologic benefits of GI, such as reducing runoff volumes and urban heat islands, GI also provides important socio-economic benefits such as stress recovery and community cohesion. However, GI is installed by many different parties, and cities typically do not know where GI is located, making the study of its impacts or the siting of new GI difficult. We use object recognition learning methods (template matching, a sliding window approach, and the random Hough forest method) and supervised machine learning algorithms (e.g., support vector machines) as initial screening approaches to detect potential GI sites, which can then be investigated in more detail using on-site surveys. Training data were collected from GPS locations of Flickr and Instagram image postings and Amazon Mechanical Turk identification of each GI type. The sliding window method outperformed the other methods and achieved an average F-measure, a combined precision and recall performance metric, of 0.78.
Short segment search method for phylogenetic analysis using nested sliding windows
NASA Astrophysics Data System (ADS)
Iskandar, A. A.; Bustamam, A.; Trimarsanto, H.
2017-10-01
To analyze phylogenetics in bioinformatics, the coding DNA sequence (CDS) segment is needed for maximal accuracy. However, analysis of the CDS costs a lot of time and money, so a short segment representative of the CDS, such as the envelope protein segment or the non-structural 3 (NS3) segment, is necessary. After the sliding window is implemented, a short segment better than the envelope protein segment and NS3 is found. This paper discusses a mathematical method to analyze sequences using nested sliding windows to find a short segment which is representative of the whole genome. The result shows that our method can find a short segment which is about 6.57% more representative, in topological terms, of the CDS segment than an envelope segment or NS3 segment.
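A nested sliding-window search can be sketched generically: a coarse outer window scans the genome and an inner window refines the best short segment inside it. The scoring below (cosine similarity of k-mer frequency profiles against the whole sequence) is an assumption standing in for whatever phylogenetic criterion the authors used, as are the window lengths and steps.

```python
import numpy as np
from itertools import product

def kmer_profile(seq, k=3):
    """Normalised k-mer frequency vector of a sequence."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    index = {km: i for i, km in enumerate(kmers)}
    v = np.zeros(len(kmers))
    for i in range(len(seq) - k + 1):
        v[index.get(seq[i:i + k], 0)] += 1
    return v / (v.sum() + 1e-12)

def nested_search(genome, outer=600, outer_step=300, inner=200, inner_step=50):
    """Outer window scans the genome coarsely; an inner window refines the best
    short segment inside each outer window.  The score is the cosine similarity
    between the segment's k-mer profile and the whole genome's profile."""
    ref = kmer_profile(genome)
    best = (-1.0, 0)
    for o in range(0, len(genome) - outer + 1, outer_step):
        for i in range(o, o + outer - inner + 1, inner_step):
            p = kmer_profile(genome[i:i + inner])
            score = float(p @ ref / (np.linalg.norm(p) * np.linalg.norm(ref) + 1e-12))
            if score > best[0]:
                best = (score, i)
    return best

rng = np.random.default_rng(13)
genome = "".join(rng.choice(list("ACGT"), size=5000))
score, start = nested_search(genome)
```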
A post-processing algorithm for time domain pitch trackers
NASA Astrophysics Data System (ADS)
Specker, P.
1983-01-01
This paper describes a powerful post-processing algorithm for time-domain pitch trackers. On two successive passes, the post-processing algorithm eliminates errors produced during a first pass by a time-domain pitch tracker. During the second pass, incorrect pitch values are detected as outliers by computing the distribution of values over a sliding 80 msec window. During the third pass (based on artificial intelligence techniques), remaining pitch pulses are used as anchor points to reconstruct the pitch train from the original waveform. The algorithm produced a decrease in the error rate from 21% obtained with the original time domain pitch tracker to 2% for isolated words and sentences produced in an office environment by 3 male and 3 female talkers. In a noisy computer room errors decreased from 52% to 2.9% for the same stimuli produced by 2 male talkers. The algorithm is efficient, accurate, and resistant to noise. The fundamental frequency micro-structure is tracked sufficiently well to be used in extracting phonetic features in a feature-based recognition system.
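The second-pass idea of flagging pitch values that fall outside the local distribution can be sketched with a sliding median/MAD rule; the paper does not spell out its exact statistic, so the robust threshold below (and the assumed 10 ms frame spacing) are illustrative choices, not the original algorithm.

```python
import numpy as np

def reject_pitch_outliers(f0, frame_ms=10.0, win_ms=80.0, k=3.0):
    """Flag pitch values that fall outside k robust deviations of the values in
    a centred sliding window (80 ms at a 10 ms frame rate spans 8 frames)."""
    half = max(1, int(round(win_ms / frame_ms / 2)))
    f0 = np.asarray(f0, dtype=float)
    keep = np.ones(len(f0), dtype=bool)
    for i in range(len(f0)):
        w = f0[max(0, i - half):i + half + 1]
        med = np.median(w)
        mad = np.median(np.abs(w - med)) + 1e-9
        if abs(f0[i] - med) > k * 1.4826 * mad:
            keep[i] = False                   # treat as a pitch-tracking error
    return keep

# Toy pitch track with occasional octave errors (doubled values).
rng = np.random.default_rng(5)
track = 120 + 5 * np.sin(np.linspace(0, 6, 300)) + rng.standard_normal(300)
track[::37] *= 2.0                            # inject octave errors
print((~reject_pitch_outliers(track)).sum(), "frames flagged")
```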
Kernel Recursive Least-Squares Temporal Difference Algorithms with Sparsification and Regularization
Zhang, Chunyuan; Zhu, Qingxin; Niu, Xinzheng
2016-01-01
By combining with sparse kernel methods, least-squares temporal difference (LSTD) algorithms can construct the feature dictionary automatically and obtain a better generalization ability. However, the previous kernel-based LSTD algorithms do not consider regularization and their sparsification processes are batch or offline, which hinder their widespread applications in online learning problems. In this paper, we combine the following five techniques and propose two novel kernel recursive LSTD algorithms: (i) online sparsification, which can cope with unknown state regions and be used for online learning, (ii) L2 and L1 regularization, which can avoid overfitting and eliminate the influence of noise, (iii) recursive least squares, which can eliminate matrix-inversion operations and reduce computational complexity, (iv) a sliding-window approach, which can avoid caching all history samples and reduce the computational cost, and (v) the fixed-point subiteration and online pruning, which can make L1 regularization easy to implement. Finally, simulation results on two 50-state chain problems demonstrate the effectiveness of our algorithms. PMID:27436996
13. INTERIOR OF FRONT BEDROOM SHOWING BUILTIN COMBINATION CABINET/SLIDING DOOR ...
13. INTERIOR OF FRONT BEDROOM SHOWING BUILT-IN COMBINATION CABINET/SLIDING DOOR CLOSET AND SLIDING GLASS WINDOW. VIEW TO SOUTHEAST. - Bishop Creek Hydroelectric System, Plant 4, Worker Cottage, Bishop Creek, Bishop, Inyo County, CA
Stanford, Tyman E; Bagley, Christopher J; Solomon, Patty J
2016-01-01
Proteomic matrix-assisted laser desorption/ionisation (MALDI) linear time-of-flight (TOF) mass spectrometry (MS) may be used to produce protein profiles from biological samples with the aim of discovering biomarkers for disease. However, the raw protein profiles suffer from several sources of bias or systematic variation which need to be removed via pre-processing before meaningful downstream analysis of the data can be undertaken. Baseline subtraction, an early pre-processing step that removes the non-peptide signal from the spectra, is complicated by the following: (i) each spectrum has, on average, wider peaks for peptides with higher mass-to-charge ratios (m/z), and (ii) the time-consuming and error-prone trial-and-error process for optimising the baseline subtraction input arguments. With reference to the aforementioned complications, we present an automated pipeline that includes (i) a novel 'continuous' line segment algorithm that efficiently operates over data with a transformed m/z-axis to remove the relationship between peptide mass and peak width, and (ii) an input-free algorithm to estimate peak widths on the transformed m/z scale. The automated baseline subtraction method was deployed on six publicly available proteomic MS datasets using six different m/z-axis transformations. Optimality of the automated baseline subtraction pipeline was assessed quantitatively using the mean absolute scaled error (MASE) when compared to a gold-standard baseline subtracted signal. Several of the transformations investigated were able to reduce, if not entirely remove, the peak width and peak location relationship resulting in near-optimal baseline subtraction using the automated pipeline. The proposed novel 'continuous' line segment algorithm is shown to far outperform naive sliding window algorithms with regard to the computational time required. The improvement in computational time was at least four-fold on real MALDI TOF-MS data and at least an order of magnitude on many simulated datasets. The advantages of the proposed pipeline include informed and data specific input arguments for baseline subtraction methods, the avoidance of time-intensive and subjective piecewise baseline subtraction, and the ability to automate baseline subtraction completely. Moreover, individual steps can be adopted as stand-alone routines.
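For contrast with the 'continuous' line segment algorithm, the naive sliding-window baseline that the paper benchmarks against can be sketched in a few lines: a windowed minimum at every m/z position, which costs O(n x w). The window length and the toy spectrum are assumptions.

```python
import numpy as np

def sliding_window_baseline(intensity, win=501):
    """Naive baseline estimate: the minimum intensity inside a centred sliding
    window at every position (the O(n*w) approach that the 'continuous' line
    segment algorithm is designed to outperform)."""
    half = win // 2
    padded = np.pad(intensity, half, mode="edge")
    return np.array([padded[i:i + win].min() for i in range(len(intensity))])

# Toy MALDI-like spectrum: a smooth decaying baseline plus sparse peptide peaks.
rng = np.random.default_rng(14)
n = 20000
mz = np.linspace(1000, 10000, n)
baseline = 200 * np.exp(-mz / 4000)
spectrum = baseline + rng.standard_normal(n)
spectrum[rng.integers(0, n, 80)] += rng.uniform(20, 100, 80)   # peptide peaks
corrected = spectrum - sliding_window_baseline(spectrum)
```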
Artificial Intelligence Methods Applied to Parameter Detection of Atrial Fibrillation
NASA Astrophysics Data System (ADS)
Arotaritei, D.; Rotariu, C.
2015-09-01
In this paper we present a novel method to develop an atrial fibrillation (AF) detector based on statistical descriptors and a hybrid neuro-fuzzy and crisp system. The inference system produces rules of the if-then-else type that are extracted to construct a binary decision system: normal or atrial fibrillation. We use TPR (Turning Point Ratio), SE (Shannon Entropy) and RMSSD (Root Mean Square of Successive Differences) along with a new descriptor, the Teager-Kaiser energy, in order to improve the accuracy of detection. The descriptors are calculated over a sliding window that produces a very large number of vectors (a massive dataset) used by the classifier. The window length is a crisp descriptor, while the remaining descriptors are interval-valued. The parameters of the hybrid system are adapted using a genetic algorithm (GA) with a single-objective fitness target: the highest values of sensitivity and specificity. The rules are extracted and form part of the decision system. The proposed method was tested using the Physionet MIT-BIH Atrial Fibrillation Database and the experimental results revealed a good accuracy of AF detection in terms of sensitivity and specificity (above 90%).
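Two of the descriptors named above, RMSSD and Shannon entropy, are easy to reproduce over a sliding window of RR intervals; the sketch below does exactly that and nothing more (no TPR, no Teager-Kaiser energy, no fuzzy classifier). The window length, histogram binning and toy RR series are assumptions.

```python
import numpy as np

def rmssd(rr):
    """Root mean square of successive differences of RR intervals (ms)."""
    d = np.diff(rr)
    return np.sqrt(np.mean(d ** 2))

def shannon_entropy(rr, bins=16):
    """Shannon entropy of the RR-interval histogram within the window."""
    hist, _ = np.histogram(rr, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def sliding_descriptors(rr, win=64, step=1):
    """Compute (RMSSD, entropy) for every sliding window position."""
    feats = []
    for i in range(0, len(rr) - win + 1, step):
        w = rr[i:i + win]
        feats.append((rmssd(w), shannon_entropy(w)))
    return np.array(feats)

# Toy RR series: a regular rhythm followed by an irregular (AF-like) segment.
rng = np.random.default_rng(6)
regular = 800 + 10 * rng.standard_normal(300)
irregular = 800 + 150 * rng.standard_normal(300)
features = sliding_descriptors(np.concatenate([regular, irregular]))
```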
Adaptive DFT-based Interferometer Fringe Tracking
NASA Technical Reports Server (NTRS)
Wilson, Edward; Pedretti, Ettore; Bregman, Jesse; Mah, Robert W.; Traub, Wesley A.
2004-01-01
An automatic interferometer fringe tracking system has been developed, implemented, and tested at the Infrared Optical Telescope Array (IOTA) observatory at Mt. Hopkins, Arizona. The system can minimize the optical path differences (OPDs) for all three baselines of the Michelson stellar interferometer at IOTA. Based on sliding window discrete Fourier transform (DFT) calculations that were optimized for computational efficiency and robustness to atmospheric disturbances, the algorithm has also been tested extensively on off-line data. Implemented in ANSI C on the 266 MHz PowerPC processor running the VxWorks real-time operating system, the algorithm runs in approximately 2.0 milliseconds per scan (including all three interferograms), using the science camera and piezo scanners to measure and correct the OPDs. The adaptive DFT-based tracking algorithm should be applicable to other systems where there is a need to detect or track a signal with an approximately constant-frequency carrier pulse.
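The core of a sliding-window DFT tracker is the recursive single-bin update, which replaces a full recomputation of the spectrum with an O(1) update per new sample. The sketch below shows that textbook recursion on a synthetic carrier; the window length, tracked bin and signal are assumptions, and none of the IOTA-specific control logic is included.

```python
import numpy as np

class SlidingDFT:
    """Recursive single-bin sliding DFT: O(1) update per new sample,
    versus O(N) for recomputing the bin from scratch."""

    def __init__(self, n, k):
        self.n, self.k = n, k
        self.twiddle = np.exp(2j * np.pi * k / n)
        self.buf = np.zeros(n)
        self.s = 0.0 + 0.0j
        self.idx = 0

    def update(self, x):
        oldest = self.buf[self.idx]
        self.buf[self.idx] = x
        self.idx = (self.idx + 1) % self.n
        # S[n] = (S[n-1] + x[n] - x[n-N]) * exp(j*2*pi*k/N)
        self.s = (self.s + x - oldest) * self.twiddle
        return self.s                      # complex amplitude of bin k

# Track the phase of a fringe-like carrier sampled in a sliding window.
n, k = 64, 8
sdft = SlidingDFT(n, k)
t = np.arange(1024)
signal = np.cos(2 * np.pi * k / n * t + 0.3)   # carrier sitting exactly on bin k
phases = [np.angle(sdft.update(x)) for x in signal]
```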
Adaptive DFT-Based Fringe Tracking and Prediction at IOTA
NASA Technical Reports Server (NTRS)
Wilson, Edward; Pedretti, Ettore; Bregman, Jesse; Mah, Robert W.; Traub, Wesley A.
2004-01-01
An automatic fringe tracking system has been developed and implemented at the Infrared Optical Telescope Array (IOTA). In testing during May 2002, the system successfully minimized the optical path differences (OPDs) for all three baselines at IOTA. Based on sliding window discrete Fourier transform (DFT) calculations that were optimized for computational efficiency and robustness to atmospheric disturbances, the algorithm has also been tested extensively on off-line data. Implemented in ANSI C on the 266 MHz PowerPC processor running the VxWorks real-time operating system, the algorithm runs in approximately 2.0 milliseconds per scan (including all three interferograms), using the science camera and piezo scanners to measure and correct the OPDs. Preliminary analysis on an extension of this algorithm indicates a potential for predictive tracking, although at present, real-time implementation of this extension would require significantly more computational capacity.
Measuring Glial Metabolism in Repetitive Brain Trauma and Alzheimer’s Disease
2016-09-01
A comparison of a range of different denoising methods for dynamic MRS was performed. Six denoising methods were considered: singular value decomposition (SVD), wavelet, sliding window, sliding window with Gaussian weighting, spline, and spectral improvements. The project also improved the software required for the data analysis by developing these six denoising methods and assisting with their testing.
Scott, Jonathan M.; Robinson, Stephen E.; Holroyd, Tom; Coppola, Richard; Sato, Susumu; Inati, Sara K.
2016-01-01
OBJECTIVE To describe and optimize an automated beamforming technique followed by identification of locations with excess kurtosis (g2) for efficient detection and localization of interictal spikes in medically refractory epilepsy patients. METHODS Synthetic Aperture Magnetometry with g2 averaged over a sliding time window (SAMepi) was performed in 7 focal epilepsy patients and 5 healthy volunteers. The effect of varied window lengths on detection of spiking activity was evaluated. RESULTS Sliding window lengths of 0.5–10 seconds performed similarly, with 0.5 and 1 second windows detecting spiking activity in one of the 3 virtual sensor locations with highest kurtosis. These locations were concordant with the region of eventual surgical resection in these 7 patients who remained seizure free at one year. Average g2 values increased with increasing sliding window length in all subjects. In healthy volunteers kurtosis values stabilized in datasets longer than two minutes. CONCLUSIONS SAMepi using g2 averaged over 1 second sliding time windows in datasets of at least 2 minutes duration reliably identified interictal spiking and the presumed seizure focus in these 7 patients. Screening the 5 locations with highest kurtosis values for spiking activity is an efficient and accurate technique for localizing interictal activity using MEG. SIGNIFICANCE SAMepi should be applied using the parameter values and procedure described for optimal detection and localization of interictal spikes. Use of this screening procedure could significantly improve the efficiency of MEG analysis if clinically validated. PMID:27760068
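Computing excess kurtosis (g2) over overlapping sliding windows of a virtual-sensor time course is straightforward; the sketch below uses scipy.stats.kurtosis with the Fisher definition, which already subtracts 3. The sampling rate, window and step lengths, and the toy spiky signal are assumptions; the beamforming step that produces the virtual sensors is not reproduced.

```python
import numpy as np
from scipy.stats import kurtosis

def sliding_excess_kurtosis(x, fs, win_s=1.0, step_s=0.5):
    """Excess kurtosis (g2) of a time course computed in overlapping sliding
    windows and averaged, as a simple 'spikiness' measure."""
    win, step = int(win_s * fs), int(step_s * fs)
    vals = [kurtosis(x[i:i + win], fisher=True)          # Fisher => excess kurtosis
            for i in range(0, len(x) - win + 1, step)]
    return np.mean(vals), np.array(vals)

# Toy virtual-sensor signal: background noise plus sparse interictal-like spikes.
rng = np.random.default_rng(7)
fs = 600
x = rng.standard_normal(fs * 120)                        # 2 minutes of data
x[::fs * 10] += 25.0                                     # one large spike every 10 s
mean_g2, g2_series = sliding_excess_kurtosis(x, fs, win_s=1.0)
```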
NASA Astrophysics Data System (ADS)
Morillot, Olivier; Likforman-Sulem, Laurence; Grosicki, Emmanuèle
2013-04-01
Many preprocessing techniques have been proposed for isolated word recognition. However, recently, recognition systems have dealt with text blocks and their compound text lines. In this paper, we propose a new preprocessing approach to efficiently correct baseline skew and fluctuations. Our approach is based on a sliding window within which the vertical position of the baseline is estimated. Segmentation of text lines into subparts is, thus, avoided. Experiments conducted on a large publicly available database (Rimes), with a BLSTM (bidirectional long short-term memory) recurrent neural network recognition system, show that our baseline correction approach highly improves performance.
Real-time motion-based H.263+ frame rate control
NASA Astrophysics Data System (ADS)
Song, Hwangjun; Kim, JongWon; Kuo, C.-C. Jay
1998-12-01
Most existing H.263+ rate control algorithms, e.g. the one adopted in the test model near-term (TMN8), focus on macroblock layer rate control and low latency under the assumptions of a constant frame rate and a constant bit rate (CBR) channel. These algorithms do not accommodate transmission bandwidth fluctuations efficiently, and the resulting video quality can be degraded. In this work, we propose a new H.263+ rate control scheme which supports a variable bit rate (VBR) channel through the adjustment of the encoding frame rate and the quantization parameter. A fast algorithm for encoding frame rate control, based on the inherent motion information within a sliding window of the underlying video, is developed to efficiently pursue a good tradeoff between spatial and temporal quality. The proposed rate control algorithm also takes the time-varying bandwidth characteristic of the Internet into account and is able to accommodate the change accordingly. Experimental results are provided to demonstrate the superior performance of the proposed scheme.
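The encoding-frame-rate decision can be illustrated as a simple mapping from average motion activity inside a sliding window of recent frames to a target frame rate; this is a loose stand-in for the paper's algorithm, and the motion measure, window length, thresholds and candidate rates are all assumptions.

```python
from collections import deque

def choose_frame_rate(motion_window, low=1.0, high=8.0,
                      rates=(7.5, 10.0, 15.0, 30.0)):
    """Map average motion activity inside the sliding window to an encoding
    frame rate: low motion favours spatial quality, high motion temporal quality."""
    activity = sum(motion_window) / max(1, len(motion_window))
    frac = min(1.0, max(0.0, (activity - low) / (high - low)))
    return rates[int(round(frac * (len(rates) - 1)))]

window = deque(maxlen=30)                        # sliding window of motion scores
for frame_idx in range(300):
    motion = 2.0 if frame_idx < 150 else 7.0     # e.g. mean absolute motion-vector length
    window.append(motion)
    fps = choose_frame_rate(window)              # low rate early on, 30 fps once motion rises
```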
Non-intrusive parameter identification procedure user's guide
NASA Technical Reports Server (NTRS)
Hanson, G. D.; Jewell, W. F.
1983-01-01
Written in standard FORTRAN, NAS is capable of identifying linear as well as nonlinear relations between input and output parameters; the only restriction is that the input/output relation be linear with respect to the unknown coefficients of the estimation equations. The output of the identification algorithm can be specified to be in either the time domain (i.e., the estimation equation coefficients) or in the frequency domain (i.e., a frequency response of the estimation equation). The frame length ("window") over which the identification procedure is to take place can be specified to be any portion of the input time history, thereby allowing the freedom to start and stop the identification procedure within a time history. There also is an option which allows a sliding window, which gives a moving average over the time history. The NAS software also includes the ability to identify several assumed solutions simultaneously for the same or different input data.
21. INTERIOR OF SOUTHEAST REAR BEDROOM SHOWING ALUMINUMFRAME SLIDING GLASS ...
21. INTERIOR OF SOUTHEAST REAR BEDROOM SHOWING ALUMINUM-FRAME SLIDING GLASS WINDOWS. VIEW TO SOUTHEAST. - Bishop Creek Hydroelectric System, Plant 4, Worker Cottage, Bishop Creek, Bishop, Inyo County, CA
19. INTERIOR OF NORTHEAST REAR BEDROOM SHOWING ALUMINUMFRAME SLIDING GLASS ...
19. INTERIOR OF NORTHEAST REAR BEDROOM SHOWING ALUMINUM-FRAME SLIDING GLASS WINDOWS. VIEW TO NORTHEAST. - Bishop Creek Hydroelectric System, Plant 4, Worker Cottage, Bishop Creek, Bishop, Inyo County, CA
NASA Astrophysics Data System (ADS)
Xiao, Fan; Chen, Zhijun; Chen, Jianguo; Zhou, Yongzhang
2016-05-01
In this study, a novel batch sliding window (BSW) based singularity mapping approach was proposed. Compared to the traditional sliding window (SW) technique, which has the disadvantages of an empirically predetermined fixed maximum window size and the outlier sensitivity of the least-squares (LS) linear regression method, the BSW based singularity mapping approach can automatically determine the optimal size of the largest window for each estimated position, and utilizes robust linear regression (RLR), which is insensitive to outlier values. In the case study, tin geochemical data from Gejiu, Yunnan, have been processed by the BSW based singularity mapping approach. The results show that the BSW approach can improve the accuracy of the calculated singularity exponent values owing to the determination of the optimal maximum window size. The use of the RLR method in the BSW approach smooths the distribution of singularity index values, with few or no highly fluctuating values resembling noise points that would otherwise make a singularity map rough and discontinuous. Furthermore, the Student's t-statistic diagram indicates a strong spatial correlation between high geochemical anomalies and known tin polymetallic deposits. The target areas within high tin geochemical anomalies probably have much higher potential for the exploration of new tin polymetallic deposits than other areas, particularly areas that show strong tin geochemical anomalies but in which no tin polymetallic deposits have yet been found.
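In 2D local singularity analysis the windowed mean concentration scales approximately as C(r) ∝ r^(α-2), so α follows from the slope of log C(r) against log r; using a robust fit is in the spirit of the RLR step above. The sketch below uses scikit-learn's HuberRegressor as the robust fit, fixed half-window sizes rather than the BSW optimal-window selection, and a toy grid, all of which are assumptions.

```python
import numpy as np
from sklearn.linear_model import HuberRegressor

def singularity_index(grid, y, x, half_sizes=(1, 2, 3, 4, 5)):
    """Local singularity exponent alpha at (y, x): robust slope of
    log C(r) versus log r over square windows, with alpha = slope + 2 in 2D."""
    log_r, log_c = [], []
    for h in half_sizes:
        w = grid[max(0, y - h):y + h + 1, max(0, x - h):x + h + 1]
        log_r.append(np.log(2 * h + 1))
        log_c.append(np.log(w.mean() + 1e-12))
    fit = HuberRegressor().fit(np.array(log_r).reshape(-1, 1), log_c)
    return fit.coef_[0] + 2.0

# Toy geochemical grid with a strong multiplicative anomaly near the centre.
rng = np.random.default_rng(8)
grid = rng.lognormal(mean=0.0, sigma=0.5, size=(64, 64))
grid[30:34, 30:34] *= 10.0
print(singularity_index(grid, 32, 32))   # alpha < 2: positive (enrichment) anomaly
print(singularity_index(grid, 5, 5))     # alpha close to 2: background
```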
15. INTERIOR OF BATHROOM SHOWING COMBINATION TUB/SHOWER, SINK, AND SLIDING ...
15. INTERIOR OF BATHROOM SHOWING COMBINATION TUB/SHOWER, SINK, AND SLIDING GLASS WINDOW. VIEW TO NORTH. - Bishop Creek Hydroelectric System, Plant 4, Worker Cottage, Bishop Creek, Bishop, Inyo County, CA
Robust sliding-window reconstruction for Accelerating the acquisition of MR fingerprinting.
Cao, Xiaozhi; Liao, Congyu; Wang, Zhixing; Chen, Ying; Ye, Huihui; He, Hongjian; Zhong, Jianhui
2017-10-01
To develop a method for accelerated and robust MR fingerprinting (MRF) with improved image reconstruction and parameter matching processes. A sliding-window (SW) strategy was applied to MRF, in which signal and dictionary matching was conducted between fingerprints consisting of mixed-contrast image series reconstructed from consecutive data frames segmented by a sliding window, and a precalculated mixed-contrast dictionary. The effectiveness and performance of this new method, dubbed SW-MRF, were evaluated in both phantom and in vivo experiments. Error quantifications were conducted on results obtained with various settings of the SW reconstruction parameters. Compared with the original MRF strategy, the results of both phantom and in vivo experiments demonstrate that the proposed SW-MRF strategy either provides similar accuracy with reduced acquisition time, or improved accuracy with equal acquisition time. Parametric maps of T1, T2, and proton density of comparable quality could be achieved with a two-fold or greater reduction in acquisition time. The effect of the sliding-window width on dictionary sensitivity was also estimated. The novel SW-MRF recovers high quality image frames from highly undersampled MRF data, which enables more robust dictionary matching with reduced numbers of data frames. This time efficiency may facilitate MRF applications in time-critical clinical settings. Magn Reson Med 78:1579-1588, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
Through the Sliding Glass Door: #EmpowerTheReader
ERIC Educational Resources Information Center
Johnson, Nancy J.; Koss, Melanie D.; Martinez, Miriam
2018-01-01
This article seeks to complicate the understanding of Bishop's (1990) metaphor of mirrors, windows, and sliding glass doors, with particular emphasis on sliding glass doors and the emotional connections needed for readers to move through them. The authors begin by examining the importance of the reader and the characters he or she meets. Next, the…
Carolyn Stern Grant - Home Page
"New Windows on the Universe", a slide set of 175 slides of different astronomical objects in different wavelengths, which I did with Christine Jones and Bill Forman. I have also done a lot of work with
An efficient reversible privacy-preserving data mining technology over data streams.
Lin, Chen-Yi; Kao, Yuan-Hung; Lee, Wei-Bin; Chen, Rong-Chang
2016-01-01
With the popularity of smart handheld devices and the emergence of cloud computing, users and companies can save various data, which may contain private data, to the cloud. Topics relating to data security have therefore received much attention. This study focuses on data stream environments and uses the concept of a sliding window to design a reversible privacy-preserving technology to process continuous data in real time, known as a continuous reversible privacy-preserving (CRP) algorithm. Data with CRP algorithm protection can be accurately recovered through a data recovery process. In addition, by using an embedded watermark, the integrity of the data can be verified. The results from the experiments show that, compared to existing algorithms, CRP is better at preserving knowledge and is more effective in terms of reducing information loss and privacy disclosure risk. In addition, it takes far less time for CRP to process continuous data than existing algorithms. As a result, CRP is confirmed as suitable for data stream environments and fulfills the requirements of being lightweight and energy-efficient for smart handheld devices.
Determining Window Placement and Configuration for the Small Pressurized Rover (SPR)
NASA Technical Reports Server (NTRS)
Thompson, Shelby; Litaker, Harry; Howard, Robert
2009-01-01
This slide presentation reviews the process of the evaluation of window placement and configuration for the cockpit of the Lunar Electric Rover (LER). The purpose of the evaluation was to obtain human-in-the-loop data on window placement and configuration for the cockpit of the LER.
Adhikary, Nabanita; Mahanta, Chitralekha
2013-11-01
In this paper an integral backstepping sliding mode controller is proposed for controlling underactuated systems. A feedback control law is designed based on backstepping algorithm and a sliding surface is introduced in the final stage of the algorithm. The backstepping algorithm makes the controller immune to matched and mismatched uncertainties and the sliding mode control provides robustness. The proposed controller ensures asymptotic stability. The effectiveness of the proposed controller is compared against a coupled sliding mode controller for swing-up and stabilization of the Cart-Pendulum System. Simulation results show that the proposed integral backstepping sliding mode controller is able to reject both matched and mismatched uncertainties with a chattering free control law, while utilizing less control effort than the sliding mode controller. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Yang, Honggang; Lin, Huibin; Ding, Kang
2018-05-01
The performance of sparse feature extraction by the commonly used K-Singular Value Decomposition (K-SVD) method depends largely on the signal segment selected in rolling bearing diagnosis; furthermore, the computation is relatively slow and the dictionary becomes highly redundant when the fault signal is relatively long. A new sliding window denoising K-SVD (SWD-KSVD) method is proposed, which uses only one small segment of the time domain signal containing impacts to perform sliding window dictionary learning and selects an optimal pattern carrying the oscillating information of the rolling bearing fault according to a maximum variance principle. An inner product operation between the optimal pattern and the whole fault signal is performed to enhance the signature of the impacts' occurrence moments. Lastly, the signal is reconstructed at the peak points of the inner product to realize the extraction of the rolling bearing fault features. Both simulation and experiments verify that the method can extract the fault features effectively.
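The inner-product (matched-filter) and peak-picking stage described above can be sketched independently of the dictionary-learning step: correlate a learned impact pattern with the whole signal and keep prominent peaks. The normalisation, peak threshold, minimum peak distance and the toy bearing signal below are assumptions, and the sliding-window K-SVD learning of the pattern itself is not reproduced.

```python
import numpy as np
from scipy.signal import find_peaks

def enhance_impacts(signal, pattern):
    """Inner product of the learned impact pattern with the signal at every lag
    (a cross-correlation), followed by peak picking at the match locations."""
    pattern = (pattern - pattern.mean()) / (np.linalg.norm(pattern) + 1e-12)
    response = np.correlate(signal, pattern, mode="same")
    peaks, _ = find_peaks(response, height=0.5 * response.max(), distance=len(pattern))
    return response, peaks

# Toy bearing-fault signal: decaying 3 kHz bursts repeating every 500 samples.
fs = 12000
burst = np.exp(-300 * np.arange(200) / fs) * np.sin(2 * np.pi * 3000 * np.arange(200) / fs)
rng = np.random.default_rng(9)
signal = 0.3 * rng.standard_normal(6000)
for start in range(100, 6000 - 200, 500):
    signal[start:start + 200] += burst
response, peaks = enhance_impacts(signal, burst)
```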
Development of an aerial counting system in oil palm plantations
NASA Astrophysics Data System (ADS)
Zulyma Miserque Castillo, Jhany; Laverde Diaz, Rubbermaid; Rueda Guzmán, Claudia Leonor
2016-07-01
This paper proposes the development of an aerial counting system capable of capturing, processing and analyzing images of an oil palm plantation to register the number of cultivated palms. It begins with a study of the available UAV technologies to define the most appropriate model according to the project needs. As a result, a DJI Phantom 2 Vision+ is used to capture pictures that are processed by photogrammetry software to create orthomosaics of the areas of interest, which are handled by the developed software to calculate the number of palms contained in them. The implemented algorithm uses a sliding window technique on image pyramids to generate candidate windows, an LBP descriptor to model the texture of the picture, a logistic regression model to classify the windows, and a non-maximum suppression algorithm to refine the decision. The system was tested on images different from the ones used for training and for establishing the set point. The system showed a 95.34% detection rate with 97.83% precision for mature palms and a 79.26% detection rate with 97.53% precision for young palms, giving an F1 score of 0.97 for mature palms and 0.87 for the small ones. The results are satisfactory, providing the census and high-quality images from which it is possible to obtain more information about the area of interest. All this is achieved through a low-cost system capable of working even in cloudy conditions.
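Candidate-window generation over an image pyramid followed by non-maximum suppression can be sketched as below; the LBP descriptor and logistic regression scorer are replaced by a trivial placeholder score, and the stride, window size, pyramid scale and IoU threshold are assumptions.

```python
import numpy as np

def pyramid(image, scale=1.5, min_size=48):
    """Yield (stride, subsampled image) levels; integer-stride nearest-neighbour
    subsampling stands in for proper rescaling to keep the sketch dependency-free."""
    stride = 1
    while min(image.shape[:2]) // stride >= min_size:
        yield stride, image[::stride, ::stride]
        stride = int(np.ceil(stride * scale))

def sliding_windows(image, win=48, step=16):
    h, w = image.shape[:2]
    for y in range(0, h - win + 1, step):
        for x in range(0, w - win + 1, step):
            yield y, x, image[y:y + win, x:x + win]

def iou(a, b):
    """Intersection-over-union of two (y, x, size) square boxes."""
    ya, xa, sa = a
    yb, xb, sb = b
    y1, x1 = max(ya, yb), max(xa, xb)
    y2, x2 = min(ya + sa, yb + sb), min(xa + sa, xb + sb)
    inter = max(0, y2 - y1) * max(0, x2 - x1)
    return inter / (sa * sa + sb * sb - inter)

def nms(boxes, scores, iou_thresh=0.3):
    """Greedy non-maximum suppression: keep the best-scoring box, drop overlaps."""
    order = np.argsort(scores)[::-1]
    keep = []
    while len(order):
        i = order[0]
        keep.append(i)
        rest = order[1:]
        overlaps = np.array([iou(boxes[i], boxes[j]) for j in rest])
        order = rest[overlaps < iou_thresh]
    return keep

# Candidate generation with a placeholder scorer (window mean intensity).
rng = np.random.default_rng(10)
img = rng.random((240, 320))
boxes, scores = [], []
for stride, level in pyramid(img):
    for y, x, patch in sliding_windows(level):
        boxes.append((y * stride, x * stride, 48 * stride))
        scores.append(patch.mean())            # stand-in for an LBP + logistic score
kept = nms(boxes, np.array(scores))
```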
Joint channel estimation and multi-user detection for multipath fading channels in DS-CDMA systems
NASA Astrophysics Data System (ADS)
Wu, Sau-Hsuan; Kuo, C.-C. Jay
2002-11-01
The technique of joint blind channel estimation and multiple access interference (MAI) suppression for an asynchronous code-division multiple-access (CDMA) system is investigated in this research. To identify and track dispersive time-varying fading channels and to avoid the phase ambiguity that comes with second-order statistics approaches, a sliding-window scheme using the expectation maximization (EM) algorithm is proposed. The complexity of joint channel equalization and symbol detection for all users increases exponentially with system loading and the channel memory. The situation is exacerbated if strong inter-symbol interference (ISI) exists. To reduce the complexity and the number of samples required for channel estimation, a blind multiuser detector is developed. Together with multi-stage interference cancellation using soft outputs provided by this detector, our algorithm can track fading channels with no phase ambiguity even when channel gains attenuate close to zero.
Research on Synthetic Aperture Radar Processing for the Spaceborne Sliding Spotlight Mode.
Shen, Shijian; Nie, Xin; Zhang, Xinggan
2018-02-03
Gaofen-3 (GF-3) is China's first C-band multi-polarization synthetic aperture radar (SAR) satellite, and it also provides the sliding spotlight mode for the first time. Sliding spotlight mode is a novel mode that realizes imaging with not only high resolution but also a wide swath. Several key technologies for the sliding spotlight mode in spaceborne SAR with high resolution are investigated in this paper, mainly including the imaging parameters, the methods of velocity estimation and ambiguity elimination, and the imaging algorithms. Based on the chosen Convolution BackProjection (CBP) and Polar Format Algorithm (PFA) imaging algorithms, a fast implementation method of CBP and a modified PFA method suitable for sliding spotlight mode are proposed, and the processing flows are derived in detail. Finally, the algorithms are validated by simulations and measured data.
Ehteshami Bejnordi, Babak; Veta, Mitko; Johannes van Diest, Paul; van Ginneken, Bram; Karssemeijer, Nico; Litjens, Geert; van der Laak, Jeroen A W M; Hermsen, Meyke; Manson, Quirine F; Balkenhol, Maschenka; Geessink, Oscar; Stathonikos, Nikolaos; van Dijk, Marcory Crf; Bult, Peter; Beca, Francisco; Beck, Andrew H; Wang, Dayong; Khosla, Aditya; Gargeya, Rishab; Irshad, Humayun; Zhong, Aoxiao; Dou, Qi; Li, Quanzheng; Chen, Hao; Lin, Huang-Jing; Heng, Pheng-Ann; Haß, Christian; Bruni, Elia; Wong, Quincy; Halici, Ugur; Öner, Mustafa Ümit; Cetin-Atalay, Rengul; Berseth, Matt; Khvatkov, Vitali; Vylegzhanin, Alexei; Kraus, Oren; Shaban, Muhammad; Rajpoot, Nasir; Awan, Ruqayya; Sirinukunwattana, Korsuk; Qaiser, Talha; Tsang, Yee-Wah; Tellez, David; Annuscheit, Jonas; Hufnagl, Peter; Valkonen, Mira; Kartasalo, Kimmo; Latonen, Leena; Ruusuvuori, Pekka; Liimatainen, Kaisa; Albarqouni, Shadi; Mungal, Bharti; George, Ami; Demirci, Stefanie; Navab, Nassir; Watanabe, Seiryo; Seno, Shigeto; Takenaka, Yoichi; Matsuda, Hideo; Ahmady Phoulady, Hady; Kovalev, Vassili; Kalinovsky, Alexander; Liauchuk, Vitali; Bueno, Gloria; Fernandez-Carrobles, M Milagro; Serrano, Ismael; Deniz, Oscar; Racoceanu, Daniel; Venâncio, Rui
2017-12-12
Application of deep learning algorithms to whole-slide pathology images can potentially improve diagnostic accuracy and efficiency. Assess the performance of automated deep learning algorithms at detecting metastases in hematoxylin and eosin-stained tissue sections of lymph nodes of women with breast cancer and compare it with pathologists' diagnoses in a diagnostic setting. Researcher challenge competition (CAMELYON16) to develop automated solutions for detecting lymph node metastases (November 2015-November 2016). A training data set of whole-slide images from 2 centers in the Netherlands with (n = 110) and without (n = 160) nodal metastases verified by immunohistochemical staining were provided to challenge participants to build algorithms. Algorithm performance was evaluated in an independent test set of 129 whole-slide images (49 with and 80 without metastases). The same test set of corresponding glass slides was also evaluated by a panel of 11 pathologists with time constraint (WTC) from the Netherlands to ascertain likelihood of nodal metastases for each slide in a flexible 2-hour session, simulating routine pathology workflow, and by 1 pathologist without time constraint (WOTC). Deep learning algorithms submitted as part of a challenge competition or pathologist interpretation. The presence of specific metastatic foci and the absence vs presence of lymph node metastasis in a slide or image using receiver operating characteristic curve analysis. The 11 pathologists participating in the simulation exercise rated their diagnostic confidence as definitely normal, probably normal, equivocal, probably tumor, or definitely tumor. The area under the receiver operating characteristic curve (AUC) for the algorithms ranged from 0.556 to 0.994. The top-performing algorithm achieved a lesion-level, true-positive fraction comparable with that of the pathologist WOTC (72.4% [95% CI, 64.3%-80.4%]) at a mean of 0.0125 false-positives per normal whole-slide image. For the whole-slide image classification task, the best algorithm (AUC, 0.994 [95% CI, 0.983-0.999]) performed significantly better than the pathologists WTC in a diagnostic simulation (mean AUC, 0.810 [range, 0.738-0.884]; P < .001). The top 5 algorithms had a mean AUC that was comparable with the pathologist interpreting the slides in the absence of time constraints (mean AUC, 0.960 [range, 0.923-0.994] for the top 5 algorithms vs 0.966 [95% CI, 0.927-0.998] for the pathologist WOTC). In the setting of a challenge competition, some deep learning algorithms achieved better diagnostic performance than a panel of 11 pathologists participating in a simulation exercise designed to mimic routine pathology workflow; algorithm performance was comparable with an expert pathologist interpreting whole-slide images without time constraints. Whether this approach has clinical utility will require evaluation in a clinical setting.
24. INTERIOR OF BEDROOM NO. 2 SHOWING ALUMINUMFRAMED SLIDINGGLASS WINDOWS ...
24. INTERIOR OF BEDROOM NO. 2 SHOWING ALUMINUM-FRAMED SLIDING-GLASS WINDOWS ON NORTH AND EAST WALLS. VIEW TO NORTHEAST. - Bishop Creek Hydroelectric System, Plant 6, Cashbaugh-Kilpatrick House, Bishop Creek, Bishop, Inyo County, CA
Exploiting visual search theory to infer social interactions
NASA Astrophysics Data System (ADS)
Rota, Paolo; Dang-Nguyen, Duc-Tien; Conci, Nicola; Sebe, Nicu
2013-03-01
In this paper we propose a new method to infer human social interactions using typical techniques adopted in literature for visual search and information retrieval. The main piece of information we use to discriminate among different types of interactions is provided by proxemics cues acquired by a tracker, and used to distinguish between intentional and casual interactions. The proxemics information has been acquired through the analysis of two different metrics: on the one hand we observe the current distance between subjects, and on the other hand we measure the O-space synergy between subjects. The obtained values are taken at every time step over a temporal sliding window, and processed in the Discrete Fourier Transform (DFT) domain. The features are eventually merged into an unique array, and clustered using the K-means algorithm. The clusters are reorganized using a second larger temporal window into a Bag Of Words framework, so as to build the feature vector that will feed the SVM classifier.
Quantitative architectural analysis: a new approach to cortical mapping.
Schleicher, A; Palomero-Gallagher, N; Morosan, P; Eickhoff, S B; Kowalski, T; de Vos, K; Amunts, K; Zilles, K
2005-12-01
Recent progress in anatomical and functional MRI has revived the demand for a reliable, topographic map of the human cerebral cortex. To date, interpretations of specific activations found in functional imaging studies and their topographical analysis in a spatial reference system are often still based on classical architectonic maps. The most commonly used reference atlas is that of Brodmann and his successors, despite its severe inherent drawbacks. One obvious weakness of traditional architectural mapping is the subjective nature of localising borders between cortical areas by means of a purely visual, microscopical examination of histological specimens. To overcome this limitation, more objective, quantitative mapping procedures have been established in the past years. The quantification of the neocortical laminar pattern by defining intensity line profiles across the cortical layers has a long tradition. During the last years, this method has been extended to enable a reliable, reproducible mapping of the cortex based on image analysis and multivariate statistics. Methodological approaches to such algorithm-based cortical mapping were published for various architectural modalities. In our contribution, principles of algorithm-based mapping are described for cyto- and receptorarchitecture. In a cytoarchitectural parcellation of the human auditory cortex using a sliding window procedure, the classical areal pattern of the human superior temporal gyrus was modified by replacing Brodmann's areas 41, 42, 22 and parts of area 21 with a novel, more detailed map. An extension and optimisation of the sliding window procedure to the specific requirements of receptorarchitectonic mapping is also described, using the macaque central sulcus and adjacent superior parietal lobule as a second, biologically independent example. Algorithm-based mapping procedures, however, are not limited to these two architectural modalities, but can be applied to all images in which a laminar cortical pattern can be detected and quantified, e.g. myeloarchitectonic and in vivo high resolution MR imaging. Defining cortical borders based on changes in cortical lamination in high resolution, in vivo structural MR images will result in a rapid increase of our knowledge on the structural parcellation of the human cerebral cortex.
Analysis of Texture Using the Fractal Model
NASA Technical Reports Server (NTRS)
Navas, William; Espinosa, Ramon Vasquez
1997-01-01
Properties such as the fractal dimension (FD) can be used for feature extraction and classification of regions within an image. The FD measures the degree of roughness of a surface, so this number is used to characterize a particular region in order to differentiate it from another. There are two basic approaches discussed in the literature to measure FD: the blanket method and the box counting method. Both attempt to measure FD by estimating the change in surface area with respect to the change in resolution. We tested both methods, but box counting proved computationally faster and gave better results. Differential Box Counting (DBC) was used to segment a collage containing three textures. The FD is independent of directionality and brightness, so five features derived from the original image were used to account for directionality and gray level biases. FD cannot be measured at a point, so we use a window that slides across the image, assigning values of FD to the pixel at the center of the window. Windowing blurs the boundaries of adjacent classes, so an edge-preserving, feature-smoothing algorithm is used to improve classification within segments and to make the boundaries sharper. Segmentation using DBC was 90.89% accurate.
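Differential box counting estimates the FD of a window as the slope of log N(r) against log(1/r), where N(r) counts the boxes needed to cover the intensity surface at grid size r; sliding that estimate across the image gives an FD map. The grid sizes, window size, step and toy texture below are assumptions, and the five auxiliary directional/brightness features and the edge-preserving smoothing are omitted.

```python
import numpy as np

def dbc_fractal_dimension(patch, sizes=(2, 3, 4, 6, 8)):
    """Differential box counting: FD is the slope of log N(r) versus log(1/r),
    where N(r) counts boxes of grid size r covering the intensity surface."""
    M = patch.shape[0]
    G = patch.max() + 1e-9
    log_inv_r, log_N = [], []
    for s in sizes:
        h = G * s / M                          # box height for this grid size
        N = 0
        for y in range(0, M - s + 1, s):
            for x in range(0, M - s + 1, s):
                cell = patch[y:y + s, x:x + s]
                N += int(np.ceil(cell.max() / h) - np.ceil(cell.min() / h) + 1)
        log_inv_r.append(np.log(M / s))
        log_N.append(np.log(N))
    slope, _ = np.polyfit(log_inv_r, log_N, 1)
    return slope

def fd_map(image, win=33, step=8):
    """Slide a window over the image and assign the FD of each window to its
    centre pixel (coarse grid for speed)."""
    out = {}
    half = win // 2
    for y in range(half, image.shape[0] - half, step):
        for x in range(half, image.shape[1] - half, step):
            out[(y, x)] = dbc_fractal_dimension(image[y - half:y + half + 1,
                                                      x - half:x + half + 1])
    return out

rng = np.random.default_rng(11)
texture = rng.random((128, 128)) * 255
centres = fd_map(texture)                      # FD per window centre, roughly 2..3
```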
NASA Technical Reports Server (NTRS)
Scholtz, P.; Smyth, P.
1992-01-01
This article describes an investigation of a statistical hypothesis testing method for detecting changes in the characteristics of an observed time series. The work is motivated by the need for practical automated methods for on-line monitoring of Deep Space Network (DSN) equipment to detect failures and changes in behavior. In particular, on-line monitoring of the motor current in a DSN 34-m beam waveguide (BWG) antenna is used as an example. The algorithm is based on a measure of the information theoretic distance between two autoregressive models: one estimated with data from a dynamic reference window and one estimated with data from a sliding reference window. The Hinkley cumulative sum stopping rule is utilized to detect a change in the mean of this distance measure, corresponding to the detection of a change in the underlying process. The basic theory behind this two-model test is presented, and the problem of practical implementation is addressed, examining windowing methods, model estimation, and detection parameter assignment. Results from the five fault-transition simulations are presented to show the possible limitations of the detection method, and suggestions for future implementation are given.
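The two-model scheme can be illustrated with a reference-window AR fit, a sliding-window AR fit, a simple distance between the two models, and a Hinkley-style cumulative-sum stopping rule. The Euclidean distance between AR coefficient vectors stands in for the report's information-theoretic distance, the Page-Hinkley form of the CUSUM test is an assumption, and so are the model order, window lengths and the toy signal.

```python
import numpy as np

def ar_fit(x, p=4):
    """Least-squares AR(p) coefficients of a window of samples."""
    X = np.column_stack([x[p - k:len(x) - k] for k in range(1, p + 1)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def page_hinkley(values, delta=0.01, lam=2.0):
    """Page-Hinkley cumulative-sum stopping rule for an increase in the mean."""
    mean, m, M = 0.0, 0.0, 0.0
    for t, v in enumerate(values, 1):
        mean += (v - mean) / t
        m += v - mean - delta
        M = min(M, m)
        if m - M > lam:
            return t                     # index at which a change is declared
    return None

# Toy monitored signal: AR dynamics that change halfway through.
rng = np.random.default_rng(12)
n = 4000
x = np.zeros(n)
for t in range(2, n):
    a = (1.6, -0.8) if t < n // 2 else (0.4, -0.3)      # change in dynamics
    x[t] = a[0] * x[t - 1] + a[1] * x[t - 2] + rng.standard_normal()

ref = ar_fit(x[:500])                                    # reference-window model
win = 500
distances = [np.linalg.norm(ar_fit(x[i - win:i]) - ref)  # sliding-window model
             for i in range(win, n, 50)]
print("change flagged at distance sample:", page_hinkley(distances))
```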
Robust video copy detection approach based on local tangent space alignment
NASA Astrophysics Data System (ADS)
Nie, Xiushan; Qiao, Qianping
2012-04-01
We propose a robust content-based video copy detection approach based on local tangent space alignment (LTSA), an efficient dimensionality reduction algorithm. The approach is motivated by the fact that video content is becoming richer and its dimensionality higher, which leaves few natural tools for video analysis and understanding. The proposed approach reduces the dimensionality of video content using LTSA and then generates video fingerprints in a low-dimensional space for video copy detection. Furthermore, a dynamic sliding window is applied to fingerprint matching. Experimental results show that the video copy detection approach has good robustness and discrimination.
Yi, Tianzhu; He, Zhihua; He, Feng; Dong, Zhen; Wu, Manqing
2017-01-01
This paper presents an efficient and precise imaging algorithm for the large bandwidth sliding spotlight synthetic aperture radar (SAR). The existing sub-aperture processing method based on the baseband azimuth scaling (BAS) algorithm cannot cope with the high-order phase coupling along the range and azimuth dimensions. This coupling problem causes defocusing along the range and azimuth dimensions. This paper proposes a generalized chirp scaling (GCS)-BAS processing algorithm, which is based on the GCS algorithm. It successfully mitigates the defocusing along the range dimension of a sub-aperture of the large bandwidth sliding spotlight SAR, as well as the high-order phase coupling along the range and azimuth dimensions. Additionally, azimuth focusing can be achieved by this azimuth scaling method. Simulation results demonstrate the ability of the GCS-BAS algorithm to process large bandwidth sliding spotlight SAR data. Great improvements in the focus depth and imaging accuracy are obtained with the GCS-BAS algorithm. PMID:28555057
Veta, Mitko; Johannes van Diest, Paul; van Ginneken, Bram; Karssemeijer, Nico; Litjens, Geert; van der Laak, Jeroen A. W. M.; Hermsen, Meyke; Manson, Quirine F; Balkenhol, Maschenka; Geessink, Oscar; Stathonikos, Nikolaos; van Dijk, Marcory CRF; Bult, Peter; Beca, Francisco; Beck, Andrew H; Wang, Dayong; Khosla, Aditya; Gargeya, Rishab; Irshad, Humayun; Zhong, Aoxiao; Dou, Qi; Li, Quanzheng; Chen, Hao; Lin, Huang-Jing; Heng, Pheng-Ann; Haß, Christian; Bruni, Elia; Wong, Quincy; Halici, Ugur; Öner, Mustafa Ümit; Cetin-Atalay, Rengul; Berseth, Matt; Khvatkov, Vitali; Vylegzhanin, Alexei; Kraus, Oren; Shaban, Muhammad; Rajpoot, Nasir; Awan, Ruqayya; Sirinukunwattana, Korsuk; Qaiser, Talha; Tsang, Yee-Wah; Tellez, David; Annuscheit, Jonas; Hufnagl, Peter; Valkonen, Mira; Kartasalo, Kimmo; Latonen, Leena; Ruusuvuori, Pekka; Liimatainen, Kaisa; Albarqouni, Shadi; Mungal, Bharti; George, Ami; Demirci, Stefanie; Navab, Nassir; Watanabe, Seiryo; Seno, Shigeto; Takenaka, Yoichi; Matsuda, Hideo; Ahmady Phoulady, Hady; Kovalev, Vassili; Kalinovsky, Alexander; Liauchuk, Vitali; Bueno, Gloria; Fernandez-Carrobles, M. Milagro; Serrano, Ismael; Deniz, Oscar; Racoceanu, Daniel; Venâncio, Rui
2017-01-01
Importance Application of deep learning algorithms to whole-slide pathology images can potentially improve diagnostic accuracy and efficiency. Objective Assess the performance of automated deep learning algorithms at detecting metastases in hematoxylin and eosin–stained tissue sections of lymph nodes of women with breast cancer and compare it with pathologists’ diagnoses in a diagnostic setting. Design, Setting, and Participants Researcher challenge competition (CAMELYON16) to develop automated solutions for detecting lymph node metastases (November 2015-November 2016). A training data set of whole-slide images from 2 centers in the Netherlands with (n = 110) and without (n = 160) nodal metastases verified by immunohistochemical staining were provided to challenge participants to build algorithms. Algorithm performance was evaluated in an independent test set of 129 whole-slide images (49 with and 80 without metastases). The same test set of corresponding glass slides was also evaluated by a panel of 11 pathologists with time constraint (WTC) from the Netherlands to ascertain likelihood of nodal metastases for each slide in a flexible 2-hour session, simulating routine pathology workflow, and by 1 pathologist without time constraint (WOTC). Exposures Deep learning algorithms submitted as part of a challenge competition or pathologist interpretation. Main Outcomes and Measures The presence of specific metastatic foci and the absence vs presence of lymph node metastasis in a slide or image using receiver operating characteristic curve analysis. The 11 pathologists participating in the simulation exercise rated their diagnostic confidence as definitely normal, probably normal, equivocal, probably tumor, or definitely tumor. Results The area under the receiver operating characteristic curve (AUC) for the algorithms ranged from 0.556 to 0.994. The top-performing algorithm achieved a lesion-level, true-positive fraction comparable with that of the pathologist WOTC (72.4% [95% CI, 64.3%-80.4%]) at a mean of 0.0125 false-positives per normal whole-slide image. For the whole-slide image classification task, the best algorithm (AUC, 0.994 [95% CI, 0.983-0.999]) performed significantly better than the pathologists WTC in a diagnostic simulation (mean AUC, 0.810 [range, 0.738-0.884]; P < .001). The top 5 algorithms had a mean AUC that was comparable with the pathologist interpreting the slides in the absence of time constraints (mean AUC, 0.960 [range, 0.923-0.994] for the top 5 algorithms vs 0.966 [95% CI, 0.927-0.998] for the pathologist WOTC). Conclusions and Relevance In the setting of a challenge competition, some deep learning algorithms achieved better diagnostic performance than a panel of 11 pathologists participating in a simulation exercise designed to mimic routine pathology workflow; algorithm performance was comparable with an expert pathologist interpreting whole-slide images without time constraints. Whether this approach has clinical utility will require evaluation in a clinical setting. PMID:29234806
A Hybrid Approach for CpG Island Detection in the Human Genome.
Yang, Cheng-Hong; Lin, Yu-Da; Chiang, Yi-Cheng; Chuang, Li-Yeh
2016-01-01
CpG islands have been demonstrated to influence local chromatin structures and simplify the regulation of gene activity. However, the accurate and rapid determination of CpG islands for whole DNA sequences remains experimentally and computationally challenging. A novel procedure is proposed to detect CpG islands by combining clustering technology with a PSO-based sliding-window method. Clustering technology is used to detect the locations of all possible CpG islands and process the data, thus effectively obviating the need for extensive and unnecessary processing of DNA fragments and improving the efficiency of the sliding-window-based particle swarm optimization (PSO) search. The proposed approach, named ClusterPSO, provides versatile and highly sensitive detection of CpG islands in the human genome. In addition, the detection efficiency of ClusterPSO is compared with that of eight CpG island detection methods in the human genome. In a comparison of detection efficiency for CpG islands in the human genome, including sensitivity, specificity, accuracy, performance coefficient (PC), and correlation coefficient (CC), ClusterPSO showed superior detection ability among all of the tested methods. Moreover, the combination of clustering technology and the PSO method can successfully overcome their respective drawbacks while maintaining their advantages. Thus, clustering technology can be hybridized with an optimization algorithm to optimize CpG island detection. The prediction accuracy of ClusterPSO was quite high, indicating that the combination of CpGcluster and PSO has several advantages over CpGcluster or PSO alone. In addition, ClusterPSO significantly reduced implementation time.
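For readers unfamiliar with the underlying sliding-window scan that such optimization accelerates, the following Python sketch applies the classical Gardiner-Garden window criteria (GC fraction and observed/expected CpG ratio); it is illustrative only and is not the ClusterPSO method.

```python
"""Hypothetical sketch (not ClusterPSO): a plain sliding-window CpG island scan."""

def cpg_window_stats(seq):
    """Return (GC fraction, observed/expected CpG ratio) for one window sequence."""
    n = len(seq)
    c, g = seq.count("C"), seq.count("G")
    cg = seq.count("CG")
    gc_frac = (c + g) / n
    obs_exp = (cg * n / (c * g)) if c and g else 0.0
    return gc_frac, obs_exp

def scan_cpg_islands(seq, window=200, step=1, gc_min=0.5, oe_min=0.6):
    """Yield window start positions that satisfy the CpG-island criteria."""
    seq = seq.upper()
    for start in range(0, len(seq) - window + 1, step):
        gc, oe = cpg_window_stats(seq[start:start + window])
        if gc >= gc_min and oe >= oe_min:
            yield start

if __name__ == "__main__":
    test = "CG" * 150 + "AT" * 500            # a CpG-rich stretch followed by AT-rich sequence
    hits = list(scan_cpg_islands(test))
    print(hits[0], hits[-1], len(hits))
```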
Evaluating the visibility of presentation slides
NASA Astrophysics Data System (ADS)
Sugawara, Genki; Umezu, Nobuyuki
2017-03-01
Presentations using slide software such as PowerPoint are widely given in offices and schools. Improving the presentation skills of ordinary people is increasingly important because opportunities to give presentations have become so common. One of the key factors in a successful presentation is the visibility of the slides, as well as the contents themselves. We propose an algorithm to numerically evaluate the visibility of presentation slides. Our method receives a presentation as a set of images and eliminates the background from the slides to extract characters and figures. The algorithm then evaluates visibility according to the number and size of characters, their colors, and the figure layout. The slide evaluation criteria are based on a series of experiments with 20 participants to parameterize typical values for visual elements in slides. The algorithm is implemented on an iMac and takes 0.5 sec. to evaluate a slide image. The evaluation score is given as a value between 0 and 100, and users can improve the slide pages that receive lower scores. Our future work includes a series of experiments with various presentations and extending our method into a web-based rating service for learning presentation skills.
Adaptive DFT-Based Interferometer Fringe Tracking
NASA Astrophysics Data System (ADS)
Wilson, Edward; Pedretti, Ettore; Bregman, Jesse; Mah, Robert W.; Traub, Wesley A.
2005-12-01
An automatic interferometer fringe tracking system has been developed, implemented, and tested at the Infrared Optical Telescope Array (IOTA) Observatory at Mount Hopkins, Arizona. The system can minimize the optical path differences (OPDs) for all three baselines of the Michelson stellar interferometer at IOTA. Based on sliding window discrete Fourier-transform (DFT) calculations that were optimized for computational efficiency and robustness to atmospheric disturbances, the algorithm has also been tested extensively on offline data. Implemented in ANSI C on the 266 MHz PowerPC processor running the VxWorks real-time operating system, the algorithm runs in approximately 2.0 milliseconds per scan (including all three interferograms), using the science camera and piezo scanners to measure and correct the OPDs. The adaptive DFT-based tracking algorithm should be applicable to other systems where there is a need to detect or track a signal with an approximately constant-frequency carrier pulse. One example of such an application might be to the field of thin-film measurement by ellipsometry, using a broadband light source and a Fourier-transform spectrometer to detect the resulting fringe patterns.
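The core of such a tracker is a sliding-window DFT that can be updated sample by sample. The following Python sketch shows the standard single-bin recursive update; the window length and bin index are assumed values, and this is not the IOTA flight code.

```python
"""Hypothetical sketch: a recursive sliding-window DFT tracking one frequency bin."""
import numpy as np

class SlidingDFT:
    def __init__(self, window_len, bin_index):
        self.n = window_len
        self.k = bin_index
        self.twiddle = np.exp(2j * np.pi * bin_index / window_len)
        self.buffer = np.zeros(window_len)
        self.pos = 0
        self.xk = 0.0 + 0.0j

    def update(self, sample):
        """Push one sample and return the updated DFT coefficient of the tracked bin."""
        oldest = self.buffer[self.pos]
        self.buffer[self.pos] = sample
        self.pos = (self.pos + 1) % self.n
        # standard sliding-DFT recurrence: X_k <- (X_k + x_new - x_old) * e^{j 2 pi k / N}
        self.xk = (self.xk + sample - oldest) * self.twiddle
        return self.xk

if __name__ == "__main__":
    n, k = 64, 5
    sdft = SlidingDFT(n, k)
    t = np.arange(512)
    signal = np.cos(2 * np.pi * k * t / n)      # a tone sitting exactly on bin k
    mags = [abs(sdft.update(s)) for s in signal]
    print(round(mags[-1], 1))                   # approaches n/2 = 32 once the window is full
```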
Image segmentation and 3D visualization for MRI mammography
NASA Astrophysics Data System (ADS)
Li, Lihua; Chu, Yong; Salem, Angela F.; Clark, Robert A.
2002-05-01
MRI mammography has a number of advantages, including the tomographic, and therefore three-dimensional (3-D), nature of the images. It allows the application of MRI mammography to breasts with dense tissue, post-operative scarring, and silicone implants. However, due to the vast quantity of images and the subtlety of differences between MR sequences, there is a need for reliable computer diagnosis to reduce the radiologist's workload. The purpose of this work was to develop automatic breast/tissue segmentation and visualization algorithms to aid physicians in detecting and observing abnormalities in the breast. Two segmentation algorithms were developed: one for breast segmentation, the other for glandular tissue segmentation. In breast segmentation, the MRI image is first segmented using an adaptive growing clustering method. Two tracing algorithms were then developed to refine the breast-air and chest-wall boundaries of the breast. The glandular tissue segmentation was performed using an adaptive thresholding method, in which the threshold value was spatially adaptive using a sliding window. The 3D visualization of the segmented 2D slices of MRI mammography was implemented in the IDL environment. The breast and glandular tissue rendering, slicing and animation were displayed.
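The spatially adaptive thresholding step can be sketched as follows in Python, using the local window mean plus an offset as the per-pixel threshold; the window size and offset are assumed values, not those used in the study.

```python
"""Hypothetical sketch: sliding-window adaptive thresholding for tissue segmentation."""
import numpy as np
from scipy.ndimage import uniform_filter

def adaptive_threshold(image, win=31, offset=5.0):
    """Return a binary mask where each pixel exceeds its local-window mean by `offset`."""
    local_mean = uniform_filter(image.astype(float), size=win, mode='reflect')
    return image.astype(float) > (local_mean + offset)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    slice_ = rng.normal(100, 10, (128, 128))
    slice_[40:60, 40:60] += 40        # a bright "glandular" patch on a smooth background
    mask = adaptive_threshold(slice_, win=31, offset=15.0)
    print(mask[40:60, 40:60].mean(), mask.mean())
```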
Small-window parametric imaging based on information entropy for ultrasound tissue characterization
Tsui, Po-Hsiang; Chen, Chin-Kuo; Kuo, Wen-Hung; Chang, King-Jen; Fang, Jui; Ma, Hsiang-Yang; Chou, Dean
2017-01-01
Constructing ultrasound statistical parametric images by using a sliding window is a widely adopted strategy for characterizing tissues. Deficiency in spatial resolution, the appearance of boundary artifacts, and the prerequisite data distribution limit the practicability of statistical parametric imaging. In this study, small-window entropy parametric imaging was proposed to overcome the above problems. Simulations and measurements of phantoms were executed to acquire backscattered radiofrequency (RF) signals, which were processed to explore the feasibility of small-window entropy imaging in detecting scatterer properties. To validate the ability of entropy imaging in tissue characterization, measurements of benign and malignant breast tumors were conducted (n = 63) to compare performances of conventional statistical parametric (based on Nakagami distribution) and entropy imaging by the receiver operating characteristic (ROC) curve analysis. The simulation and phantom results revealed that entropy images constructed using a small sliding window (side length = 1 pulse length) adequately describe changes in scatterer properties. The area under the ROC for using small-window entropy imaging to classify tumors was 0.89, which was higher than 0.79 obtained using statistical parametric imaging. In particular, boundary artifacts were largely suppressed in the proposed imaging technique. Entropy enables using a small window for implementing ultrasound parametric imaging. PMID:28106118
Small-window parametric imaging based on information entropy for ultrasound tissue characterization
NASA Astrophysics Data System (ADS)
Tsui, Po-Hsiang; Chen, Chin-Kuo; Kuo, Wen-Hung; Chang, King-Jen; Fang, Jui; Ma, Hsiang-Yang; Chou, Dean
2017-01-01
Constructing ultrasound statistical parametric images by using a sliding window is a widely adopted strategy for characterizing tissues. Deficiency in spatial resolution, the appearance of boundary artifacts, and the prerequisite data distribution limit the practicability of statistical parametric imaging. In this study, small-window entropy parametric imaging was proposed to overcome the above problems. Simulations and measurements of phantoms were executed to acquire backscattered radiofrequency (RF) signals, which were processed to explore the feasibility of small-window entropy imaging in detecting scatterer properties. To validate the ability of entropy imaging in tissue characterization, measurements of benign and malignant breast tumors were conducted (n = 63) to compare performances of conventional statistical parametric (based on Nakagami distribution) and entropy imaging by the receiver operating characteristic (ROC) curve analysis. The simulation and phantom results revealed that entropy images constructed using a small sliding window (side length = 1 pulse length) adequately describe changes in scatterer properties. The area under the ROC for using small-window entropy imaging to classify tumors was 0.89, which was higher than 0.79 obtained using statistical parametric imaging. In particular, boundary artifacts were largely suppressed in the proposed imaging technique. Entropy enables using a small window for implementing ultrasound parametric imaging.
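A minimal Python sketch of small-window entropy imaging is given below; the histogram bin count and window side length are assumptions for illustration, not the settings used by the authors.

```python
"""Hypothetical sketch: small-window Shannon-entropy imaging of an ultrasound envelope."""
import numpy as np

def window_entropy(values, bins=32):
    """Shannon entropy (in bits) of the amplitude histogram inside one window."""
    hist, _ = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def entropy_image(envelope, win=9):
    """Slide a small window over the envelope image and map entropy to the centre pixel."""
    half = win // 2
    padded = np.pad(envelope, half, mode='reflect')
    out = np.empty_like(envelope, dtype=float)
    rows, cols = envelope.shape
    for r in range(rows):
        for c in range(cols):
            out[r, c] = window_entropy(padded[r:r + win, c:c + win].ravel())
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    env = rng.rayleigh(1.0, (64, 64))          # Rayleigh-like speckle envelope
    env[20:40, 20:40] = rng.rayleigh(3.0, (20, 20))
    print(entropy_image(env, win=9).round(2).max())
```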
Continuous Glucose Monitoring Enables the Detection of Losses in Infusion Set Actuation (LISAs)
Howsmon, Daniel P.; Cameron, Faye; Baysal, Nihat; Ly, Trang T.; Forlenza, Gregory P.; Maahs, David M.; Buckingham, Bruce A.; Hahn, Juergen; Bequette, B. Wayne
2017-01-01
Reliable continuous glucose monitoring (CGM) enables a variety of advanced technology for the treatment of type 1 diabetes. In addition to artificial pancreas algorithms that use CGM to automate continuous subcutaneous insulin infusion (CSII), CGM can also inform fault detection algorithms that alert patients to problems in CGM or CSII. Losses in infusion set actuation (LISAs) can adversely affect clinical outcomes, resulting in hyperglycemia due to impaired insulin delivery. Prolonged hyperglycemia may lead to diabetic ketoacidosis—a serious metabolic complication in type 1 diabetes. Therefore, an algorithm for the detection of LISAs based on CGM and CSII signals was developed to improve patient safety. The LISA detection algorithm is trained retrospectively on data from 62 infusion set insertions from 20 patients. The algorithm collects glucose and insulin data, and computes relevant fault metrics over two different sliding windows; an alarm sounds when these fault metrics are exceeded. With the chosen algorithm parameters, the LISA detection strategy achieved a sensitivity of 71.8% and issued 0.28 false positives per day on the training data. Validation on two independent data sets confirmed that similar performance is seen on data that was not used for training. The developed algorithm is able to effectively alert patients to possible infusion set failures in open-loop scenarios, with limited evidence of its extension to closed-loop scenarios. PMID:28098839
Continuous Glucose Monitoring Enables the Detection of Losses in Infusion Set Actuation (LISAs).
Howsmon, Daniel P; Cameron, Faye; Baysal, Nihat; Ly, Trang T; Forlenza, Gregory P; Maahs, David M; Buckingham, Bruce A; Hahn, Juergen; Bequette, B Wayne
2017-01-15
Reliable continuous glucose monitoring (CGM) enables a variety of advanced technology for the treatment of type 1 diabetes. In addition to artificial pancreas algorithms that use CGM to automate continuous subcutaneous insulin infusion (CSII), CGM can also inform fault detection algorithms that alert patients to problems in CGM or CSII. Losses in infusion set actuation (LISAs) can adversely affect clinical outcomes, resulting in hyperglycemia due to impaired insulin delivery. Prolonged hyperglycemia may lead to diabetic ketoacidosis-a serious metabolic complication in type 1 diabetes. Therefore, an algorithm for the detection of LISAs based on CGM and CSII signals was developed to improve patient safety. The LISA detection algorithm is trained retrospectively on data from 62 infusion set insertions from 20 patients. The algorithm collects glucose and insulin data, and computes relevant fault metrics over two different sliding windows; an alarm sounds when these fault metrics are exceeded. With the chosen algorithm parameters, the LISA detection strategy achieved a sensitivity of 71.8% and issued 0.28 false positives per day on the training data. Validation on two independent data sets confirmed that similar performance is seen on data that was not used for training. The developed algorithm is able to effectively alert patients to possible infusion set failures in open-loop scenarios, with limited evidence of its extension to closed-loop scenarios.
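The two-window idea can be sketched in Python as follows; the specific fault metrics and thresholds below are illustrative assumptions, since the published algorithm's metrics are not reproduced here.

```python
"""Hypothetical sketch of the two-window idea; not the published LISA detection algorithm."""
import numpy as np

def lisa_alarm(glucose, insulin, short_win=6, long_win=24,
               rise_thresh=30.0, insulin_thresh=1.0):
    """Return sample indices where glucose rises over the short window even though a
    substantial amount of insulin was delivered over the long window."""
    alarms = []
    for k in range(long_win, len(glucose)):
        glucose_rise = glucose[k] - glucose[k - short_win]          # mg/dL over the short window
        insulin_delivered = np.sum(insulin[k - long_win:k])         # units over the long window
        if glucose_rise > rise_thresh and insulin_delivered > insulin_thresh:
            alarms.append(k)
    return alarms

if __name__ == "__main__":
    t = np.arange(100)
    glucose = 120 + np.where(t > 60, 6.0 * (t - 60), 0.0)   # steady, then climbing after a failure
    insulin = np.full(100, 0.1)                              # basal delivery throughout
    print(lisa_alarm(glucose, insulin)[:3])
```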
Determination of the optimal tolerance for MLC positioning in sliding window and VMAT techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hernandez, V., E-mail: vhernandezmasgrau@gmail.com; Abella, R.; Calvo, J. F.
2015-04-15
Purpose: Several authors have recommended a 2 mm tolerance for multileaf collimator (MLC) positioning in sliding window treatments. In volumetric modulated arc therapy (VMAT) treatments, however, the optimal tolerance for MLC positioning remains unknown. In this paper, the authors present the results of a multicenter study to determine the optimal tolerance for both techniques. Methods: The procedure used is based on dynalog file analysis. The study was carried out using seven Varian linear accelerators from five different centers. Dynalogs were collected from over 100 000 clinical treatments and in-house software was used to compute the number of tolerance faults as a function of the user-defined tolerance. Thus, the optimal value for this tolerance, defined as the lowest achievable value, was investigated. Results: Dynalog files accurately predict the number of tolerance faults as a function of the tolerance value, especially for low fault incidences. All MLCs behaved similarly and the Millennium120 and the HD120 models yielded comparable results. In sliding window techniques, the number of beams with an incidence of hold-offs >1% rapidly decreases for a tolerance of 1.5 mm. In VMAT techniques, the number of tolerance faults sharply drops for tolerances around 2 mm. For a tolerance of 2.5 mm, less than 0.1% of the VMAT arcs presented tolerance faults. Conclusions: Dynalog analysis provides a feasible method for investigating the optimal tolerance for MLC positioning in dynamic fields. In sliding window treatments, the tolerance of 2 mm was found to be adequate, although it can be reduced to 1.5 mm. In VMAT treatments, the typically used 5 mm tolerance is excessively high. Instead, a tolerance of 2.5 mm is recommended.
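A simplified Python sketch of the fault-counting analysis is shown below; because vendor dynalog formats are not parsed here, the leaf position errors are synthetic and the error model is an assumption.

```python
"""Hypothetical sketch: tolerance-fault fraction as a function of the user-defined tolerance."""
import numpy as np

def fault_fraction(position_errors_mm, tolerances_mm):
    """For each candidate tolerance, return the fraction of samples whose absolute
    planned-versus-actual leaf position error exceeds that tolerance."""
    errors = np.abs(position_errors_mm).ravel()
    return {tol: float(np.mean(errors > tol)) for tol in tolerances_mm}

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    # assumed error model: mostly sub-millimetre lag with occasional larger excursions
    errors = np.concatenate([rng.normal(0.0, 0.4, 50000), rng.normal(0.0, 1.2, 500)])
    for tol, frac in fault_fraction(errors, [1.0, 1.5, 2.0, 2.5]).items():
        print(f"tolerance {tol} mm -> {100 * frac:.3f}% of samples out of tolerance")
```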
Real-time person detection in low-resolution thermal infrared imagery with MSER and CNNs
NASA Astrophysics Data System (ADS)
Herrmann, Christian; Müller, Thomas; Willersinn, Dieter; Beyerer, Jürgen
2016-10-01
In many camera-based systems, person detection and localization is an important step for safety and security applications such as search and rescue, reconnaissance, surveillance, or driver assistance. Long-wave infrared (LWIR) imagery promises to simplify this task because it is less affected by background clutter or illumination changes. In contrast to a lot of related work, we make no assumptions about any movement of persons or the camera, i.e. persons may stand still and the camera may move or any combination thereof. Furthermore, persons may appear arbitrarily in near or far distances to the camera, leading to low-resolution persons at far distances. To address this task, we propose a two-stage system, including a proposal generation method and a classifier to verify whether the detected proposals really are persons. In contrast to using all possible proposals, as in sliding-window approaches, we apply Maximally Stable Extremal Regions (MSER) and classify the detected proposals afterwards with a Convolutional Neural Network (CNN). The MSER algorithm acts as a hot spot detector when applied to LWIR imagery. Because the body temperature of persons is usually higher than the background, they appear as hot spots in the image. However, the MSER algorithm is unable to distinguish between different kinds of hot spots. Thus, all further LWIR sources such as windows, animals or vehicles will be detected, too. Still, by applying MSER, the number of proposals is reduced significantly in comparison to a sliding-window approach, which allows employing the highly discriminative capabilities of deep neural network classifiers that were recently shown in several applications such as face recognition or image content classification. We suggest using a CNN as the classifier for the detected hot spots and train it to discriminate between person hot spots and all further hot spots. We specifically design a CNN that is suitable for the low-resolution person hot spots that are common in LWIR imagery applications and is capable of fast classification. Evaluation on several different LWIR person detection datasets shows an error rate reduction of up to 80 percent compared to previous approaches consisting of MSER, local image descriptors and a standard classifier such as an SVM or boosted decision trees. Further time measurements show that the proposed processing chain is capable of real-time person detection in LWIR camera streams.
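A minimal Python/OpenCV sketch of the proposal stage is given below; the area limits are assumed values and the classifier is a stand-in threshold rather than the trained CNN.

```python
"""Hypothetical sketch: MSER hot-spot proposals on an LWIR frame, with a stand-in classifier."""
import cv2
import numpy as np

def person_proposals(lwir_frame_8bit, min_area=30, max_area=5000):
    """Return bounding boxes of stable extremal regions as person candidates."""
    mser = cv2.MSER_create()
    mser.setMinArea(min_area)
    mser.setMaxArea(max_area)
    _, bboxes = mser.detectRegions(lwir_frame_8bit)
    return bboxes                                  # each box is (x, y, w, h)

def classify_stub(crop):
    """Stand-in for the CNN: accept hot spots whose mean intensity is high enough."""
    return crop.mean() > 180

if __name__ == "__main__":
    frame = np.full((120, 160), 60, np.uint8)
    cv2.rectangle(frame, (70, 40), (78, 70), 220, -1)   # a warm, person-sized blob
    for (x, y, w, h) in person_proposals(frame):
        if classify_stub(frame[y:y + h, x:x + w]):
            print("person candidate at", (x, y, w, h))
```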
Pace, Danielle F.; Aylward, Stephen R.; Niethammer, Marc
2014-01-01
We propose a deformable image registration algorithm that uses anisotropic smoothing for regularization to find correspondences between images of sliding organs. In particular, we apply the method for respiratory motion estimation in longitudinal thoracic and abdominal computed tomography scans. The algorithm uses locally adaptive diffusion tensors to determine the direction and magnitude with which to smooth the components of the displacement field that are normal and tangential to an expected sliding boundary. Validation was performed using synthetic, phantom, and 14 clinical datasets, including the publicly available DIR-Lab dataset. We show that motion discontinuities caused by sliding can be effectively recovered, unlike conventional regularizations that enforce globally smooth motion. In the clinical datasets, target registration error showed improved accuracy for lung landmarks compared to the diffusive regularization. We also present a generalization of our algorithm to other sliding geometries, including sliding tubes (e.g., needles sliding through tissue, or contrast agent flowing through a vessel). Potential clinical applications of this method include longitudinal change detection and radiotherapy for lung or abdominal tumours, especially those near the chest or abdominal wall. PMID:23899632
Pace, Danielle F; Aylward, Stephen R; Niethammer, Marc
2013-11-01
We propose a deformable image registration algorithm that uses anisotropic smoothing for regularization to find correspondences between images of sliding organs. In particular, we apply the method for respiratory motion estimation in longitudinal thoracic and abdominal computed tomography scans. The algorithm uses locally adaptive diffusion tensors to determine the direction and magnitude with which to smooth the components of the displacement field that are normal and tangential to an expected sliding boundary. Validation was performed using synthetic, phantom, and 14 clinical datasets, including the publicly available DIR-Lab dataset. We show that motion discontinuities caused by sliding can be effectively recovered, unlike conventional regularizations that enforce globally smooth motion. In the clinical datasets, target registration error showed improved accuracy for lung landmarks compared to the diffusive regularization. We also present a generalization of our algorithm to other sliding geometries, including sliding tubes (e.g., needles sliding through tissue, or contrast agent flowing through a vessel). Potential clinical applications of this method include longitudinal change detection and radiotherapy for lung or abdominal tumours, especially those near the chest or abdominal wall.
Sojoudi, Alireza; Goodyear, Bradley G
2016-12-01
Spontaneous fluctuations of blood-oxygenation level-dependent functional magnetic resonance imaging (BOLD fMRI) signals are highly synchronous between brain regions that serve similar functions. This provides a means to investigate functional networks; however, most analysis techniques assume functional connections are constant over time. This may be problematic in the case of neurological disease, where functional connections may be highly variable. Recently, several methods have been proposed to determine moment-to-moment changes in the strength of functional connections over an imaging session (so-called dynamic connectivity). Here, a novel analysis framework based on a hierarchical observation modeling approach was proposed to permit statistical inference of the presence of dynamic connectivity. A two-level linear model composed of overlapping sliding windows of fMRI signals, incorporating the fact that overlapping windows are not independent, was described. To test this approach, datasets were synthesized whereby functional connectivity was either constant (significant or insignificant) or modulated by an external input. The method successfully determines the statistical significance of a functional connection in phase with the modulation, and it exhibits greater sensitivity and specificity in detecting regions with variable connectivity when compared with sliding-window correlation analysis. For real data, this technique possesses greater reproducibility and provides a more discriminative estimate of dynamic connectivity than sliding-window correlation analysis. Hum Brain Mapp 37:4566-4580, 2016. © 2016 Wiley Periodicals, Inc.
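For comparison, the baseline sliding-window correlation analysis referred to above can be sketched in a few lines of Python; the window length and synthetic signals are assumptions.

```python
"""Hypothetical sketch: plain sliding-window correlation between two BOLD time courses."""
import numpy as np

def sliding_window_correlation(x, y, window=30, step=1):
    """Return window start indices and Pearson correlations over overlapping windows."""
    starts = np.arange(0, len(x) - window + 1, step)
    corrs = [np.corrcoef(x[s:s + window], y[s:s + window])[0, 1] for s in starts]
    return starts, np.array(corrs)

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    t = np.arange(300)
    shared = np.sin(2 * np.pi * t / 50)
    x = shared + 0.5 * rng.normal(size=300)
    y = np.where(t < 150, shared, 0.0) + 0.5 * rng.normal(size=300)  # coupling switches off mid-run
    starts, r = sliding_window_correlation(x, y)
    print(r[:5].round(2), r[-5:].round(2))
```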
Automated robust registration of grossly misregistered whole-slide images with varying stains
NASA Astrophysics Data System (ADS)
Litjens, G.; Safferling, K.; Grabe, N.
2016-03-01
Cancer diagnosis and pharmaceutical research increasingly depend on the accurate quantification of cancer biomarkers. Identification of biomarkers is usually performed through immunohistochemical staining of cancer sections on glass slides. However, combination of multiple biomarkers from a wide variety of immunohistochemically stained slides is a tedious process in traditional histopathology due to the switching of glass slides and re-identification of regions of interest by pathologists. Digital pathology now allows us to apply image registration algorithms to digitized whole-slide images to align the differing immunohistochemical stains automatically. However, registration algorithms need to be robust to changes in color due to differing stains and severe changes in tissue content between slides. In this work we developed a robust registration methodology to allow for fast coarse alignment of multiple immunohistochemical stains to the base hematoxylin and eosin stained image. We applied HSD color model conversion to obtain a less stain color dependent representation of the whole-slide images. Subsequently, optical density thresholding and connected component analysis were used to identify the relevant regions for registration. Template matching using normalized mutual information was applied to provide initial translation and rotation parameters, after which a cost function-driven affine registration was performed. The algorithm was validated using 40 slides from 10 prostate cancer patients, with landmark registration error as a metric. Median landmark registration error was around 180 microns, which indicates performance is adequate for practical application. None of the registrations failed, indicating the robustness of the algorithm.
Lian, Yanyun; Song, Zhijian
2014-01-01
Brain tumor segmentation from magnetic resonance imaging (MRI) is an important step toward surgical planning, treatment planning, and monitoring of therapy. However, the manual tumor segmentation commonly used in the clinic is time-consuming and challenging, and none of the existing automated methods is highly robust, reliable and efficient in clinical application. An accurate and automated tumor segmentation method has been developed that provides reproducible and objective results close to manual segmentation. Based on the symmetry of the human brain, we employed a sliding-window technique and the correlation coefficient to locate the tumor position. First, the image to be segmented was normalized, rotated, denoised, and bisected. Subsequently, through vertical and horizontal sliding-window passes in turn (two windows in the left and right halves of the brain image moving simultaneously, pixel by pixel, while the correlation coefficient between the two windows is calculated), the pair of windows with the minimal correlation coefficient was obtained; the window with the larger average gray value indicates the location of the tumor, and the pixel with the largest gray value is taken as the tumor locating point. Finally, the segmentation threshold was determined by the average gray value of the pixels in a square centered at the locating point with a side length of 10 pixels, and threshold segmentation and morphological operations were used to acquire the final tumor region. The method was evaluated on 3D FSPGR brain MR images of 10 patients. As a result, the average ratio of correct location was 93.4% for 575 slices containing tumor, the average Dice similarity coefficient was 0.77 for one scan, and the average time spent on one scan was 40 seconds. A fully automated, simple and efficient segmentation method for brain tumors is proposed and is promising for future clinical use. The correlation coefficient is a new and effective feature for tumor location.
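The symmetry-based localization can be sketched in Python as follows; the window size, step and the synthetic slice are assumptions, and the sketch omits the normalization, rotation and denoising steps.

```python
"""Hypothetical sketch: mirrored sliding windows locate the least-correlated left/right pair."""
import numpy as np

def locate_asymmetry(slice_img, win=16, step=4):
    """Return (correlation, (row, col)) of the least-correlated window pair, with the corner
    reported in the coordinates of the half containing the brighter window."""
    rows, cols = slice_img.shape
    half = cols // 2
    left = slice_img[:, :half]
    right = slice_img[:, half:2 * half][:, ::-1]          # mirror the right half
    best = (1.0, None)
    for r in range(0, rows - win + 1, step):
        for c in range(0, half - win + 1, step):
            a = left[r:r + win, c:c + win].ravel()
            b = right[r:r + win, c:c + win].ravel()
            cc = np.corrcoef(a, b)[0, 1]
            if cc < best[0]:
                brighter_in_left = a.mean() >= b.mean()
                col = c if brighter_in_left else (2 * half - win - c)
                best = (cc, (r, col))
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    brain = rng.normal(100, 5, (128, 128))
    brain[:, :64] = brain[:, 64:][:, ::-1]                 # start from a symmetric slice
    brain[30:50, 20:40] = rng.normal(180, 5, (20, 20))     # bright lesion in the left half
    print(locate_asymmetry(brain))
```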
SlideJ: An ImageJ plugin for automated processing of whole slide images.
Della Mea, Vincenzo; Baroni, Giulia L; Pilutti, David; Di Loreto, Carla
2017-01-01
The digital slide, or Whole Slide Image, is a digital image, acquired with specific scanners, that represents a complete tissue sample or cytological specimen at the microscopic level. While Whole Slide image analysis is recognized among the most interesting opportunities, the typical size of such images (up to gigapixels) can be very demanding in terms of memory requirements. Thus, while algorithms and tools for processing and analysis of single microscopic field images are available, the size of Whole Slide images makes the direct use of such tools prohibitive or impossible. In this work a plugin for ImageJ, named SlideJ, is proposed with the objective of seamlessly extending the application of image analysis algorithms implemented in ImageJ for single microscopic field images to whole digital slide analysis. The plugin has been complemented by examples of macros in the ImageJ scripting language to demonstrate its use in concrete situations.
SlideJ: An ImageJ plugin for automated processing of whole slide images
Baroni, Giulia L.; Pilutti, David; Di Loreto, Carla
2017-01-01
The digital slide, or Whole Slide Image, is a digital image, acquired with specific scanners, that represents a complete tissue sample or cytological specimen at the microscopic level. While Whole Slide image analysis is recognized among the most interesting opportunities, the typical size of such images (up to gigapixels) can be very demanding in terms of memory requirements. Thus, while algorithms and tools for processing and analysis of single microscopic field images are available, the size of Whole Slide images makes the direct use of such tools prohibitive or impossible. In this work a plugin for ImageJ, named SlideJ, is proposed with the objective of seamlessly extending the application of image analysis algorithms implemented in ImageJ for single microscopic field images to whole digital slide analysis. The plugin has been complemented by examples of macros in the ImageJ scripting language to demonstrate its use in concrete situations. PMID:28683129
NASA Astrophysics Data System (ADS)
Bunai, Tasya; Rokhmatuloh; Wibowo, Adi
2018-05-01
In this paper, two methods to retrieve the Land Surface Temperature (LST) from the thermal infrared data supplied by bands 10 and 11 of the Thermal Infrared Sensor (TIRS) onboard Landsat 8 are compared. The first is the mono-window algorithm developed by Qin et al. and the second is the split-window algorithm of Rozenstein et al. The purpose of this study is to map the spatial distribution of land surface temperature, as well as to determine the more accurate algorithm for retrieving land surface temperature based on the calculated root mean square error (RMSE). Finally, we compare the spatial distribution of land surface temperature obtained by both algorithms; the more accurate algorithm is the split-window algorithm, with an RMSE of 7.69 °C.
Martin, Corinna; Jablonka, Sibylle
2018-01-01
Local and spontaneous calcium signals play important roles in neurons and neuronal networks. Spontaneous or cell-autonomous calcium signals may be difficult to assess because they appear in an unpredictable spatiotemporal pattern and in very small neuronal loci of axons or dendrites. We developed an open source bioinformatics tool for an unbiased assessment of calcium signals in x,y-t imaging series. The tool bases its algorithm on a continuous wavelet transform-guided peak detection to identify calcium signal candidates. The highly sensitive calcium event definition is based on identification of peaks in 1D data through analysis of a 2D wavelet transform surface. For spatial analysis, the tool uses a grid to separate the x,y-image field in independently analyzed grid windows. A document containing a graphical summary of the data is automatically created and displays the loci of activity for a wide range of signal intensities. Furthermore, the number of activity events is summed up to create an estimated total activity value, which can be used to compare different experimental situations, such as calcium activity before or after an experimental treatment. All traces and data of active loci become documented. The tool can also compute the signal variance in a sliding window to visualize activity-dependent signal fluctuations. We applied the calcium signal detector to monitor activity states of cultured mouse neurons. Our data show that both the total activity value and the variance area created by a sliding window can distinguish experimental manipulations of neuronal activity states. Notably, the tool is powerful enough to compute local calcium events and ‘signal-close-to-noise’ activity in small loci of distal neurites of neurons, which remain during pharmacological blockade of neuronal activity with inhibitors such as tetrodotoxin, to block action potential firing, or inhibitors of ionotropic glutamate receptors. The tool can also offer information about local homeostatic calcium activity events in neurites. PMID:29601577
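The sliding-window variance computation mentioned above can be sketched in Python as follows; the window length and the synthetic trace are assumptions.

```python
"""Hypothetical sketch: signal variance in a sliding window over a calcium trace."""
import numpy as np

def sliding_variance(trace, window=50):
    """Return the variance of `trace` computed in every full-length sliding window."""
    trace = np.asarray(trace, dtype=float)
    n = len(trace) - window + 1
    return np.array([trace[i:i + window].var() for i in range(n)])

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    quiet = rng.normal(0.0, 0.02, 500)
    active = rng.normal(0.0, 0.02, 500) + 0.5 * (rng.random(500) < 0.05)  # sparse calcium events
    var = sliding_variance(np.concatenate([quiet, active]), window=50)
    print(var[:400].mean(), var[-400:].mean())   # the variance rises in the active half
```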
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahmed, Raef S.; Ove, Roger; Duan, Jun
2006-10-01
The treatment of maxillary sinus carcinoma with forward planning can be technically difficult when the neck also requires radiotherapy. This difficulty arises because of the need to spare the contralateral face while treating the bilateral neck. There is considerable potential for error in clinical setup and treatment delivery. We evaluated intensity-modulated radiotherapy (IMRT) as an improvement on forward planning, and compared several inverse planning IMRT platforms. A composite dose-volume histogram (DVH) was generated from a complex forward planned case. We compared the results with those generated by sliding window fixed field dynamic multileaf collimator (MLC) IMRT, using sets of coplanar beams. All setups included an anterior posterior (AP) beam, and 3-, 5-, 7-, and 9-field configurations were evaluated. The dose prescription and objective function priorities were invariant. We also evaluated 2 commercial tomotherapy IMRT delivery platforms. DVH results from all of the IMRT approaches compared favorably with the forward plan. Results for the various inverse planning approaches varied considerably across platforms, despite an attempt to prescribe the therapy similarly. The improvement seen with the addition of beams in the fixed beam sliding window case was modest. IMRT is an effective means of delivering radiotherapy reliably in the complex setting of maxillary sinus carcinoma with neck irradiation. Differences in objective function definition and optimization algorithms can lead to unexpected differences in the final dose distribution, and our evaluation suggests that these factors are more significant than the beam arrangement or number of beams.
SU-E-T-430: Modeling MLC Leaf End in 2D for Sliding Window IMRT and Arc Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liang, X; Zhu, T
2014-06-01
Purpose: To develop a 2D geometric model for the MLC that accounts for leaf-end dose leakage in dynamic IMRT and RapidArc therapy. Methods: Leaf-end dose leakage is one of the problems for MLC dose calculation and modeling. A dosimetric leaf gap is used to model the MLC and to account for leakage in dose calculation, but may not be accurate for smaller leaf gaps. We propose another geometric modeling method to compensate for the dose leakage of the MLC's round-shaped leaf ends, and to improve the accuracy of dose calculation and dose verification. A triangular function is used to geometrically model the MLC leaf-end leakage in the leaf motion direction, and a step function is used in the perpendicular direction. Dose measurements with different leaf gaps, window widths, and window heights were conducted, and the results were used to fit the analytical model and obtain the model parameters. Results: Analytical models have been obtained for step-and-shoot and dynamic modes of MLC motion. Parameters a=0.4, lw'=5.0 mm for 6X and a=0.54, lw'=4.1 mm for 15X were obtained from the fitting process. The proposed MLC leaf-end model improves the dose profile at the two ends of the sliding window opening. This improvement is especially significant for smaller sliding window openings, which are commonly used for highly modulated IMRT plans and arc therapy plans. Conclusion: This work models the MLC round leaf-end shape and movement pattern for IMRT dose calculation. The theory, as well as the results in this work, provides a useful tool for photon beam IMRT dose calculation and verification.
5. EXTERIOR OF SOUTH END OF HOUSE SHOWING OPEN DOOR ...
5. EXTERIOR OF SOUTH END OF HOUSE SHOWING OPEN DOOR TO BASEMENT BELOW KITCHEN, ORIGINAL PAIRED WOODFRAMED SLIDING-GLASS WINDOWS ON KITCHEN WALL AND 1-LIGHT OVER 1-LIGHT DOUBLE-HUNG WINDOW ON STORM PORCH ADDITION. VIEW TO WEST. - Rush Creek Hydroelectric System, Clubhouse Cottage, Rush Creek, June Lake, Mono County, CA
Online frequency estimation with applications to engine and generator sets
NASA Astrophysics Data System (ADS)
Manngård, Mikael; Böling, Jari M.
2017-07-01
Frequency and spectral analysis based on the discrete Fourier transform is a fundamental task in signal processing and machine diagnostics. This paper presents computationally efficient methods for the real-time estimation of stationary and time-varying frequency components in signals. A brief survey of the sliding time window discrete Fourier transform and the Goertzel filter is presented, and two filter banks consisting of (i) sliding time window Goertzel filters and (ii) infinite impulse response narrow bandpass filters are proposed for estimating instantaneous frequencies. The proposed methods show excellent results in both simulation studies and a case study using angular speed measurements of the crankshaft of a marine diesel engine-generator set.
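A single Goertzel filter, the building block of the proposed filter banks, can be sketched in Python as follows; the sample rate, target frequency and test signal are assumed values.

```python
"""Hypothetical sketch: a Goertzel filter evaluating the power of one frequency bin per block."""
import numpy as np

def goertzel_power(block, target_freq, sample_rate):
    """Return the squared magnitude of the DFT bin nearest `target_freq` for one block."""
    n = len(block)
    k = int(round(n * target_freq / sample_rate))   # nearest DFT bin
    w = 2.0 * np.pi * k / n
    coeff = 2.0 * np.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for x in block:
        s = x + coeff * s_prev - s_prev2            # second-order Goertzel recursion
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

if __name__ == "__main__":
    fs, f0 = 1000.0, 50.0                           # assumed sampling rate and target frequency
    t = np.arange(0, 1.0, 1.0 / fs)
    signal = np.sin(2 * np.pi * f0 * t) + 0.1 * np.random.default_rng(7).normal(size=t.size)
    print(goertzel_power(signal, f0, fs))           # large compared with an off-frequency bin
    print(goertzel_power(signal, 123.0, fs))
```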
NASA Astrophysics Data System (ADS)
Golubovic, Leonardo; Knudsen, Steven
2017-01-01
We consider the general problem of modeling the dynamics of objects sliding on moving strings. We introduce a powerful computational algorithm that can be used to investigate the dynamics of objects sliding along non-relativistic strings. We use the algorithm to numerically explore the fundamental physics of sliding climbers on a unique class of dynamical systems, Rotating Space Elevators (RSE). Objects sliding along RSE strings do not require internal engines or propulsion to be transported from the Earth's surface into outer space. Through extensive numerical simulations, we find that sliding climbers may display interesting non-linear dynamics exhibiting both quasi-periodic and chaotic states of motion. While our main interest in this study is in climber dynamics on RSEs, our results for the dynamics of sliding objects are of more general interest. In particular, we designed tools capable of dealing with strongly nonlinear phenomena involving moving strings of any kind, such as the chaotic dynamics of sliding climbers observed in our simulations.
VIEW OF DINING ROOM WITH SLIDING DOORS IN CLOSED POSITION. ...
VIEW OF DINING ROOM WITH SLIDING DOORS IN CLOSED POSITION. WINDOWS ON THE LEFT HAND SIDE HAVE VIEWS INTO THE CARPORT. VIEW FACING NORTH - Camp H.M. Smith and Navy Public Works Center Manana Title VII (Capehart) Housing, Three-Bedroom Single-Family Type 9, Birch Circle, Elm Drive, Elm Circle, and Date Drive, Pearl City, Honolulu County, HI
Anomaly Detection in Test Equipment via Sliding Mode Observers
NASA Technical Reports Server (NTRS)
Solano, Wanda M.; Drakunov, Sergey V.
2012-01-01
Nonlinear observers were originally developed based on the ideas of variable structure control, and for the purpose of detecting disturbances in complex systems. In this anomaly detection application, these observers were designed for estimating the distributed state of fluid flow in a pipe described by a class of advection equations. The observer algorithm uses collected data in a piping system to estimate the distributed system state (pressure and velocity along a pipe containing liquid gas propellant flow) using only boundary measurements. These estimates are then used to further estimate and localize possible anomalies such as leaks or foreign objects, and instrumentation metering problems such as incorrect flow meter orifice plate size. The observer algorithm has the following parts: a mathematical model of the fluid flow, observer control algorithm, and an anomaly identification algorithm. The main functional operation of the algorithm is in creating the sliding mode in the observer system implemented as software. Once the sliding mode starts in the system, the equivalent value of the discontinuous function in sliding mode can be obtained by filtering out the high-frequency chattering component. In control theory, "observers" are dynamic algorithms for the online estimation of the current state of a dynamic system by measurements of an output of the system. Classical linear observers can provide optimal estimates of a system state in case of uncertainty modeled by white noise. For nonlinear cases, the theory of nonlinear observers has been developed and its success is mainly due to the sliding mode approach. Using the mathematical theory of variable structure systems with sliding modes, the observer algorithm is designed in such a way that it steers the output of the model to the output of the system obtained via a variety of sensors, in spite of possible mismatches between the assumed model and actual system. The unique properties of sliding mode control allow not only control of the model internal states to the states of the real-life system, but also identification of the disturbance or anomaly that may occur.
Text extraction from images in the wild using the Viola-Jones algorithm
NASA Astrophysics Data System (ADS)
Saabna, Raid M.; Zingboim, Eran
2018-04-01
Text localization and extraction is an important issue in modern applications of computer vision. Applications such as reading and translating text in the wild or from videos are among the many that can benefit from results in this field. In this work, we adopt the well-known Viola-Jones algorithm to enable text extraction and localization from images in the wild. Viola-Jones is an efficient and fast image-processing algorithm originally used for face detection. Based on some resemblance between text detection and face detection tasks in the wild, we have modified the Viola-Jones algorithm to detect regions of interest where text may be localized. In the proposed approach, some modifications to the Haar-like features and a semi-automatic process for generating and manipulating the data set are presented to train the algorithm. Sliding windows of different sizes are used to scan the image for the presence of individual letters and letter clusters. A post-processing step is used to combine the detected letters into words and to remove false positives. The novelty of the presented approach is using the strengths of a modified Viola-Jones algorithm to identify many different objects representing different letters and clusters of similar letters, and later combine them into words of varying lengths. Impressive results were obtained on the ICDAR contest data sets.
Robust Timing Synchronization in Aeronautical Mobile Communication Systems
NASA Technical Reports Server (NTRS)
Xiong, Fu-Qin; Pinchak, Stanley
2004-01-01
This work details a study of robust synchronization schemes suitable for satellite to mobile aeronautical applications. A new scheme, the Modified Sliding Window Synchronizer (MSWS), is devised and compared with existing schemes, including the traditional Early-Late Gate Synchronizer (ELGS), the Gardner Zero-Crossing Detector (GZCD), and the Sliding Window Synchronizer (SWS). Performance of the synchronization schemes is evaluated by a set of metrics that indicate performance in digital communications systems. The metrics are convergence time, mean square phase error (or root mean-square phase error), lowest SNR for locking, initial frequency offset performance, midstream frequency offset performance, and system complexity. The performance of the synchronizers is evaluated by means of Matlab simulation models. A simulation platform is devised to model the satellite to mobile aeronautical channel, consisting of a Quadrature Phase Shift Keying modulator, an additive white Gaussian noise channel, and a demodulator front end. Simulation results show that the MSWS provides the most robust performance at the cost of system complexity. The GZCD provides a good tradeoff between robustness and system complexity for communication systems that require high symbol rates or low overall system costs. The ELGS has a high system complexity despite its average performance. Overall, the SWS, originally designed for multi-carrier systems, performs very poorly in single-carrier communications systems. Table 5.1 in Section 5 provides a ranking of each of the synchronization schemes in terms of the metrics set forth in Section 4.1. Details of comparison are given in Section 5. Based on the results presented in Table 5, it is safe to say that the most robust synchronization scheme examined in this work is the high-sample-rate Modified Sliding Window Synchronizer. A close second is its low-sample-rate cousin. The tradeoff between complexity and lowest mean-square phase error determines the rankings of the Gardner Zero-Crossing Detector and both versions of the Early-Late Gate Synchronizer. The least robust models are the high and low-sample-rate Sliding Window Synchronizers. Consequently, the recommended replacement synchronizer for NASA's Advanced Air Transportation Technologies mobile aeronautical communications system is the high-sample-rate Modified Sliding Window Synchronizer. By incorporating this synchronizer into their system, NASA can be assured that their system will be operational in extremely adverse conditions. The quick convergence time of the MSWS should allow the use of high-level protocols. However, if NASA feels that reduced system complexity is the most important aspect of their replacement synchronizer, the Gardner Zero-Crossing Detector would be the best choice.
MASTER BEDROOM SHOWING THE WINDOWS IN THE UPPER PORTION OF ...
MASTER BEDROOM SHOWING THE WINDOWS IN THE UPPER PORTION OF THE EXTERIOR WALL AND THE SLIDING CLOSET DOORS. VIEW FACING WEST - Camp H.M. Smith and Navy Public Works Center Manana Title VII (Capehart) Housing, U-Shaped Two-Bedroom Single-Family Type 6, Birch Circle, Elm Drive, Elm Circle, and Date Drive, Pearl City, Honolulu County, HI
10. INTERIOR OF LIVING ROOM SHOWING FRONT DOOR FLANKED BY ...
10. INTERIOR OF LIVING ROOM SHOWING FRONT DOOR FLANKED BY SLIDING GLASS WINDOWS AND ELECTRICAL WALL HEATER. ORIGINAL 1-LIGHT OVER 1-LIGHT, DOUBLE-HUNG WINDOW AT PHOTO RIGHT. CEILING VENT TO CHIMNEY AT RIGHT UPPER PHOTO CENTER. VIEW TO SOUTHEAST. - Bishop Creek Hydroelectric System, Plant 4, Worker Cottage, Bishop Creek, Bishop, Inyo County, CA
Real-Time Detection of Dust Devils from Pressure Readings
NASA Technical Reports Server (NTRS)
Wagstaff, Kiri
2009-01-01
A method for real-time detection of dust devils at a given location is based on identifying the abrupt, temporary decreases in atmospheric pressure that are characteristic of dust devils as they travel through that location. The method was conceived for use in a study of dust devils on the Martian surface, where bandwidth limitations encourage the transmission of only those blocks of data that are most likely to contain information about features of interest, such as dust devils. The method, which is a form of intelligent data compression, could readily be adapted to use for the same purpose in scientific investigation of dust devils on Earth. In this method, the readings of an atmospheric- pressure sensor are repeatedly digitized, recorded, and processed by an algorithm that looks for extreme deviations from a continually updated model of the current pressure environment. The question in formulating the algorithm is how to model current normal observations and what minimum magnitude deviation can be considered sufficiently anomalous as to indicate the presence of a dust devil. There is no single, simple answer to this question: any answer necessarily entails a compromise between false detections and misses. For the original Mars application, the answer was sought through analysis of sliding time windows of digitized pressure readings. Windows of 5-, 10-, and 15-minute durations were considered. The windows were advanced in increments of 30 seconds. Increments of other sizes can also be used, but computational cost increases as the increment decreases and analysis is performed more frequently. Pressure models were defined using a polynomial fit to the data within the windows. For example, the figure depicts pressure readings from a 10-minute window wherein the model was defined by a third-degree polynomial fit to the readings and dust devils were identified as negative deviations larger than both 3 standard deviations (from the mean) and 0.05 mbar in magnitude. An algorithm embodying the detection scheme of this example was found to yield a miss rate of just 8 percent and a false-detection rate of 57 percent when evaluated on historical pressure-sensor data collected by the Mars Pathfinder lander. Since dust devils occur infrequently over the course of a mission, prioritizing observations that contain successful detections could greatly conserve bandwidth allocated to a given mission. This technique can be used on future Mars landers and rovers, such as Mars Phoenix and the Mars Science Laboratory.
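Following the example described above, a Python sketch of the window-based detector is given below; the sampling rate and synthetic pressure trace are assumptions, while the cubic fit, 3-sigma and 0.05 mbar thresholds follow the text.

```python
"""Hypothetical sketch: dust-devil candidates as large negative deviations from a window fit."""
import numpy as np

def dust_devil_candidates(pressure, samples_per_minute=30, window_minutes=10,
                          sigma_mult=3.0, min_drop_mbar=0.05):
    """Return indices (within the most recent window) flagged as possible dust devils."""
    win = samples_per_minute * window_minutes
    segment = np.asarray(pressure[-win:], dtype=float)
    t = np.arange(win)
    fit = np.polyval(np.polyfit(t, segment, 3), t)     # third-degree polynomial pressure model
    residual = segment - fit
    threshold = max(sigma_mult * residual.std(), min_drop_mbar)
    return np.flatnonzero(residual < -threshold)

if __name__ == "__main__":
    rng = np.random.default_rng(8)
    win = 300
    p = 6.8 + 0.0002 * np.arange(win) + rng.normal(0, 0.005, win)   # slowly drifting background
    p[150:156] -= 0.12                                              # a brief dust-devil-like dip
    print(dust_devil_candidates(p))
```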
Resource sharing on CSMA/CD networks in the presence of noise. M.S. Thesis
NASA Technical Reports Server (NTRS)
Dinschel, Duane Edward
1987-01-01
Resource sharing on carrier sense multiple access with collision detection (CSMA/CD) networks can be accomplished by using window-control algorithms for bus contention. The window-control algorithms are designed to grant permission to transmit to the station with the minimum contention parameter. Proper operation of the window-control algorithm requires that all stations sense the same state of the network in each contention slot. Noise causes the state of the network to appear as a collision. False collisions can cause the window-control algorithm to terminate without isolating any stations. A two-phase window-control protocol and an approximate recurrence equation with noise as a parameter are developed to improve the performance of the window-control algorithms in the presence of noise. The results are compared through simulation, with the approximate recurrence equation yielding the best overall performance. Noise is an even bigger problem when it is not detected by all stations. In such cases it is possible for the window boundaries of the contending stations to become out of phase. Consequently, it is possible to isolate a station other than the one with the minimum contention parameter. To guarantee proper isolation of the minimum, a broadcast phase must be added after the termination of the algorithm. The protocol required to correct the window-control algorithm when noise is not detected by all stations is discussed.
A sliding windows approach to analyse the evolution of bank shares in the European Union
NASA Astrophysics Data System (ADS)
Ferreira, Paulo; Dionísio, Andreia; Guedes, Everaldo Freitas; Zebende, Gilney Figueira
2018-01-01
Both the sub-prime and the Eurozone debt crises caused severe financial turmoil, which affected European markets in general, and the banking sector in particular. The continuous devaluation of bank shares in the financial sector caused a great decrease in market capitalization, and in citizen and investor confidence. Panic among investors led them to sell shares, while other agents took the opportunity to buy them. Therefore, the study of bank shares is important, particularly of their efficiency. In this paper, adopting a sliding windows detrended fluctuation approach, we analyse the efficiency concept dynamically for 63 European banks (both in and outside the Eurozone). The main results show that the crisis had an effect on changing the efficiency pattern.
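A minimal sketch of a sliding-window detrended fluctuation analysis in the spirit of the study, assuming a series of returns as input; the box sizes, window length, step, and first-order detrending are illustrative choices rather than the authors' exact settings.

```python
import numpy as np

def dfa_exponent(x, scales=(4, 8, 16, 32, 64)):
    """Estimate the DFA scaling exponent (alpha) of a series over the given box sizes."""
    y = np.cumsum(x - np.mean(x))               # integrated profile
    sizes, flucts = [], []
    for s in scales:
        n_boxes = len(y) // s
        if n_boxes < 2:
            continue
        segments = y[:n_boxes * s].reshape(n_boxes, s)
        t = np.arange(s)
        rms = [np.sqrt(np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2))
               for seg in segments]              # linear detrending in each box
        sizes.append(s)
        flucts.append(np.mean(rms))
    # slope of log F(s) versus log s is the DFA exponent
    return np.polyfit(np.log(sizes), np.log(flucts), 1)[0]

def sliding_dfa(returns, window=500, step=20):
    """DFA exponent in sliding windows; alpha near 0.5 is consistent with efficiency."""
    return [(start, dfa_exponent(returns[start:start + window]))
            for start in range(0, len(returns) - window + 1, step)]
```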
NASA Astrophysics Data System (ADS)
Kazanskiy, Nikolay; Protsenko, Vladimir; Serafimovich, Pavel
2016-03-01
This research article describes an experiment with the implementation of an image filtering task in the Apache Storm and IBM InfoSphere Streams stream data processing systems. The aim of the presented research is to show that new technologies can be used effectively for sliding window filtering of image sequences. The analysis of execution was focused on two parameters: throughput and memory consumption. Profiling was performed on the CentOS operating system running on two virtual machines for each system. The experiment results showed that IBM InfoSphere Streams has about a 1.5 to 13.5 times lower memory footprint than Apache Storm, but can be about 2.0 to 2.5 times slower on real hardware.
NASA Astrophysics Data System (ADS)
Lee, Nam-Jin; Kang, Chul-Goo
2016-10-01
In railway vehicles, excessive sliding or wheel locking can occur while braking because of a temporarily degraded adhesion between the wheel and the rail caused by the contaminated or wet surface of the rail. It can damage the wheel tread and affect the performance of the brake system and the safety of the railway vehicle. To safeguard the wheelset from these phenomena, almost all railway vehicles are equipped with wheel slide protection (WSP) systems. In this study, a new WSP algorithm is proposed. The features of the proposed algorithm are the use of the target sliding speed, the determination of a command for WSP valves using command maps, and compensation for the time delay in pneumatic brake systems using the Smith predictor. The proposed WSP algorithm was verified using experiments with a hardware-in-the-loop simulation system including the hardware of the pneumatic brake system.
LIVING ROOM. NOTE THE WINDOWS IN THE UPPER PORTION OF ...
LIVING ROOM. NOTE THE WINDOWS IN THE UPPER PORTION OF THE EXTERIOR WALL (LEFT) AND SLIDING DOORS TO THE DINING ROOM. VIEW FACING SOUTHWEST - Camp H.M. Smith and Navy Public Works Center Manana Title VII (Capehart) Housing, Four-Bedroom, Single-Family Type 10, Birch Circle, Elm Drive, Elm Circle, and Date Drive, Pearl City, Honolulu County, HI
INTERIOR VIEW OF SOUTHWEST WALL OF SECOND FLOOR SHOWING WINDOWS, ...
INTERIOR VIEW OF SOUTHWEST WALL OF SECOND FLOOR SHOWING WINDOWS, SLIDING DOORS AND METAL ROOF FRAMING. VIEW FACING SOUTHWEST - U.S. Naval Base, Pearl Harbor, Ford Island Polaris Missile Lab & U.S. Fleet Ballistic Missile Submarine Training Center, Between Lexington Boulevard and the sea plane ramps on the southwest side of Ford Island, Pearl City, Honolulu County, HI
DOE Office of Scientific and Technical Information (OSTI.GOV)
García-Sánchez, Tania; Gómez-Lázaro, Emilio; Muljadi, E.
An alternative approach to characterise real voltage dips is proposed and evaluated in this study. The proposed methodology is based on voltage-space vector solutions, identifying parameters for ellipse trajectories by using the least-squares algorithm applied on a sliding window along the disturbance. The most likely patterns are then estimated through a clustering process based on the k-means algorithm. The objective is to offer an efficient and easily implemented alternative to characterise faults and visualise the most likely instantaneous phase-voltage evolution during events through their corresponding voltage-space vector trajectories. This novel solution minimises the data to be stored but maintains extensive information about the dips, including starting and ending transients. The proposed methodology has been applied satisfactorily to real voltage dips obtained from intensive field-measurement campaigns carried out in a Spanish wind power plant over a time period of several years. A comparison to traditional minimum root mean square-voltage and time-duration classifications is also included in this study.
Plural-wavelength flame detector that discriminates between direct and reflected radiation
NASA Technical Reports Server (NTRS)
Hall, Gregory H. (Inventor); Barnes, Heidi L. (Inventor); Medelius, Pedro J. (Inventor); Simpson, Howard J. (Inventor); Smith, Harvey S. (Inventor)
1997-01-01
A flame detector employs a plurality of wavelength selective radiation detectors and a digital signal processor programmed to analyze each of the detector signals, and determine whether radiation is received directly from a small flame source that warrants generation of an alarm. The processor's algorithm employs a normalized cross-correlation analysis of the detector signals to discriminate between radiation received directly from a flame and radiation received from a reflection of a flame to insure that reflections will not trigger an alarm. In addition, the algorithm employs a Fast Fourier Transform (FFT) frequency spectrum analysis of one of the detector signals to discriminate between flames of different sizes. In a specific application, the detector incorporates two infrared (IR) detectors and one ultraviolet (UV) detector for discriminating between a directly sensed small hydrogen flame, and reflections from a large hydrogen flame. The signals generated by each of the detectors are sampled and digitized for analysis by the digital signal processor, preferably 250 times a second. A sliding time window of approximately 30 seconds of detector data is created using FIFO memories.
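The sketch below illustrates the kind of windowed, normalized cross-correlation check described in the abstract, with samples pushed into a roughly 30-second FIFO at 250 samples per second; the 0.8 decision threshold, the channel pairing, and the function names are illustrative assumptions and not the patented detection logic.

```python
import numpy as np
from collections import deque

FS = 250                                  # samples per second, as in the abstract
WINDOW = deque(maxlen=30 * FS)            # ~30 s FIFO of (ir1, ir2, uv) samples

def normalized_cross_correlation(a, b):
    """Zero-lag normalized cross-correlation of two equal-length signals."""
    a = a - np.mean(a)
    b = b - np.mean(b)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom else 0.0

def process_sample(ir1, ir2, uv, direct_threshold=0.8):
    """Push one digitized sample and decide whether radiation looks direct."""
    WINDOW.append((ir1, ir2, uv))
    if len(WINDOW) < WINDOW.maxlen:
        return False                      # not enough history yet
    data = np.asarray(WINDOW)
    rho = normalized_cross_correlation(data[:, 0], data[:, 2])  # IR versus UV channel
    return rho > direct_threshold         # high correlation -> treat as direct (illustrative rule)
```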
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Xiaoran, E-mail: sxr0806@gmail.com; School of Mathematics and Statistics, The University of Western Australia, Crawley WA 6009; Small, Michael, E-mail: michael.small@uwa.edu.au
In this work, we propose a novel method to transform a time series into a weighted and directed network. For a given time series, we first generate a set of segments via a sliding window, and then use a doubly symbolic scheme to characterize every windowed segment by combining absolute amplitude information with an ordinal pattern characterization. Based on this construction, a network can be directly constructed from the given time series: segments corresponding to different symbol-pairs are mapped to network nodes and the temporal succession between nodes is represented by directed links. With this conversion, the dynamics underlying the time series has been encoded into the network structure. We illustrate the potential of our networks with a well-studied dynamical model as a benchmark example. Results show that network measures for characterizing global properties can detect the dynamical transitions in the underlying system. Moreover, we employ a random walk algorithm to sample loops in our networks, and find that time series with different dynamics exhibit distinct cycle structure. That is, the relative prevalence of loops with different lengths can be used to identify the underlying dynamics.
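A minimal sketch of the sliding-window, doubly symbolic construction described above: each window gets an ordinal-pattern symbol and an amplitude symbol, and the temporal succession between symbol pairs becomes a weighted directed link. The quantile-based amplitude binning, window length, and function names are illustrative assumptions rather than the authors' exact scheme.

```python
import numpy as np
from collections import Counter

def doubly_symbolic_network(x, window=4, step=1, n_amp_bins=3):
    """Map a series to a weighted, directed network over (amplitude, ordinal) symbols."""
    x = np.asarray(x, dtype=float)
    # amplitude symbol: which global quantile bin the window mean falls into
    bin_edges = np.quantile(x, np.linspace(0, 1, n_amp_bins + 1)[1:-1])
    links = Counter()
    prev = None
    for start in range(0, len(x) - window + 1, step):
        seg = x[start:start + window]
        amp_sym = int(np.digitize(seg.mean(), bin_edges))   # absolute-amplitude part
        ord_sym = tuple(np.argsort(seg))                     # ordinal-pattern part
        node = (amp_sym, ord_sym)
        if prev is not None:
            links[(prev, node)] += 1                         # directed, weighted link
        prev = node
    return links                                             # {(node_i, node_j): weight}

# toy usage: periodic and random series yield visibly different link structures
net = doubly_symbolic_network(np.sin(np.linspace(0, 40 * np.pi, 2000)))
```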
A sentence sliding window approach to extract protein annotations from biomedical articles
Krallinger, Martin; Padron, Maria; Valencia, Alfonso
2005-01-01
Background Within the emerging field of text mining and statistical natural language processing (NLP) applied to biomedical articles, a broad variety of techniques have been developed during the past years. Nevertheless, there is still a great need for comparative assessment of the performance of the proposed methods and for the development of common evaluation criteria. This issue was addressed by the Critical Assessment of Text Mining Methods in Molecular Biology (BioCreative) contest. The aim of this contest was to assess the performance of text mining systems applied to biomedical texts, including tools which recognize named entities such as genes and proteins, and tools which automatically extract protein annotations. Results The "sentence sliding window" approach proposed here was found to efficiently extract text fragments from full text articles containing annotations on proteins, providing the highest number of correctly predicted annotations. Moreover, the number of correct extractions of individual entities (i.e. proteins and GO terms) involved in the relationships used for the annotations was significantly higher than the correct extractions of the complete annotations (protein-function relations). Conclusion We explored the use of averaging sentence sliding windows for information extraction, especially in a context where conventional training data is unavailable. The combination of our approach with more refined statistical estimators and machine learning techniques might be a way to improve annotation extraction for future biomedical text mining applications. PMID:15960831
NASA Astrophysics Data System (ADS)
Zhang, Shangbin; Lu, Siliang; He, Qingbo; Kong, Fanrang
2016-09-01
For rotating machines, bearing defects are generally represented as periodic transient impulses in acquired signals. The extraction of transient features from signals has been a key issue for fault diagnosis. However, background noise reduces the identification performance of periodic faults in practice. This paper proposes a time-varying singular value decomposition (TSVD) method to enhance the identification of periodic faults. The proposed method is inspired by the sliding window method. By applying singular value decomposition (SVD) to the signal under a sliding window, we can obtain a time-varying singular value matrix (TSVM). Each column in the TSVM is occupied by the singular values of the corresponding sliding window, and each row represents the intrinsic structure of the raw signal, namely a time-singular-value-sequence (TSVS). Theoretical and experimental analyses show that the frequency of the TSVS is exactly twice that of the corresponding intrinsic structure. Moreover, the signal-to-noise ratio (SNR) of the TSVS is improved significantly in comparison with the raw signal. The proposed method takes advantage of the TSVS in noise suppression and feature extraction to enhance the fault frequency for diagnosis. The effectiveness of the TSVD is verified by means of simulation studies and applications to the diagnosis of bearing faults. Results indicate that the proposed method is superior to traditional methods for bearing fault diagnosis.
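A rough sketch of the sliding-window SVD construction: the abstract does not specify the matrix that each window is decomposed into, so the Hankel (trajectory-matrix) embedding, the window length, and the row count below are assumptions made for illustration.

```python
import numpy as np

def time_varying_singular_values(x, window=64, rows=8, step=1):
    """Time-varying singular value matrix (TSVM) of a 1-D signal.

    Each sliding window is embedded in a small Hankel (trajectory) matrix and its
    singular values form one column of the TSVM; each row of the TSVM is then a
    time-singular-value-sequence (TSVS).
    """
    x = np.asarray(x, dtype=float)
    cols = window - rows + 1
    columns = []
    for start in range(0, len(x) - window + 1, step):
        seg = x[start:start + window]
        H = np.array([seg[i:i + cols] for i in range(rows)])   # Hankel embedding
        columns.append(np.linalg.svd(H, compute_uv=False))
    return np.array(columns).T      # shape: (rows, number of windows)
```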
AGOR 28: SIO Shipyard Representative Bi-Weekly Progress Report
2016-02-15
Handles for Bridge port and stbd side sliding windows reinstalled with better adhesive. Should be good now. Have to remember to lift up on...handle before attempting to slide. Woody has expressed concern with potential interference of ships main crane and CAST 6 winches. SIO plans to...swap forward CTD handling arm with the after overboarding arm. This may exacerbate potential interferences with stowed crane . A possible solution
Technology Evaluation and Integration for Heavy Tactical Vehicles
2010-08-17
Key Findings - Modular Hydraulic Powered Generator • Hydraulic powered alternator proved... PPMS Key Findings • Hybrid starting system proved functional • Works with wide... • To compute inter-vehicle closing distance & stopping time. • Provide audible/visual alert to driver inside their reaction time window. • Use COTS
USDA-ARS?s Scientific Manuscript database
A computer algorithm was created to inspect scanned images from DNA microarray slides developed to rapidly detect and genotype E. coli O157 virulent strains. The algorithm computes centroid locations for signal and background pixels in RGB space and defines a plane perpendicular to the line connect...
Horesh, Yair; Wexler, Ydo; Lebenthal, Ilana; Ziv-Ukelson, Michal; Unger, Ron
2009-03-04
Scanning large genomes with a sliding window in search of locally stable RNA structures is a well motivated problem in bioinformatics. Given a predefined window size L and an RNA sequence S of size N (L < N), the consecutive windows folding problem is to compute the minimal free energy (MFE) for the folding of each of the L-sized substrings of S. The consecutive windows folding problem can be naively solved in O(NL^3) time by applying any of the classical cubic-time RNA folding algorithms to each of the N-L windows of size L. Recently an O(NL^2) solution for this problem has been described. Here, we describe and implement an O(NLψ(L)) engine for the consecutive windows folding problem, where ψ(L) is shown to converge to O(1) under the assumption of a standard probabilistic polymer folding model, yielding an O(L) speedup which is experimentally confirmed. Using this tool, we note an intriguing directionality (5'-3' vs. 3'-5') folding bias, i.e. that the minimal free energy (MFE) of folding is higher in the native direction of the DNA than in the reverse direction of various genomic regions in several organisms, including regions of the genomes that do not encode proteins or ncRNA. This bias largely emerges from the genomic dinucleotide bias which affects the MFE, however we see some variations in the folding bias in the different genomic regions when normalized to the dinucleotide bias. We also present results from calculating the MFE landscape of mouse chromosome 1, characterizing the MFE of the long ncRNA molecules that reside in this chromosome. The efficient consecutive windows folding engine described in this paper allows for genome-wide scans for ncRNA molecules as well as large-scale statistics. This is implemented here as a software tool, called RNAslider, and applied to the scanning of long chromosomes, leading to the observation of features that are visible only on a large scale.
NASA Astrophysics Data System (ADS)
Kasatkina, T. I.; Dushkin, A. V.; Pavlov, V. A.; Shatovkin, R. R.
2018-03-01
In the development of information systems and software for predicting series of dynamics, neural network methods have recently been applied. They are more flexible than existing analogues and are capable of taking into account the nonlinearities of the series. In this paper, we propose a modified algorithm for predicting series of dynamics, which includes a method for training neural networks and an approach to describing and presenting input data, and is based on prediction with the multilayer perceptron method. To construct the neural network, the values of the dynamics series at its extremum points, together with the corresponding time values, formed using the sliding window method, are used as input data. The proposed algorithm can act as an independent approach to predicting series of dynamics, or serve as one component of a forecasting system. The efficiency of the classical multilayer perceptron method and the modified algorithm is compared for short-term one-step and long-term multi-step forecasts on synthetic and real data. The modification minimizes the iterative error that accumulates when previously predicted values are fed back as inputs to the neural network, and increases the accuracy of the iterative prediction.
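As a simplified illustration of sliding-window input construction for a perceptron-based forecaster: the paper feeds extremum values and their corresponding times rather than raw windows, so the sketch below only shows the generic windowing step and the iterative multi-step feedback that the modified algorithm is designed to stabilize; make_windows, the network size, and the training split are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_windows(series, width):
    """Turn a 1-D series into (X, y): each row of X is one sliding window,
    the target y is the value that immediately follows it."""
    X = np.array([series[i:i + width] for i in range(len(series) - width)])
    y = series[width:]
    return X, y

series = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.05 * np.random.randn(1000)
X, y = make_windows(series, width=16)
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000).fit(X[:800], y[:800])

# iterative multi-step forecast: each prediction is fed back into the input window
window = list(X[800])
forecast = []
for _ in range(50):
    nxt = model.predict([window])[0]
    forecast.append(nxt)
    window = window[1:] + [nxt]
```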
NASA Astrophysics Data System (ADS)
Ma, Zhi-Sai; Liu, Li; Zhou, Si-Da; Yu, Lei; Naets, Frank; Heylen, Ward; Desmet, Wim
2018-01-01
The problem of parametric output-only identification of time-varying structures in a recursive manner is considered. A kernelized time-dependent autoregressive moving average (TARMA) model is proposed by expanding the time-varying model parameters onto the basis set of kernel functions in a reproducing kernel Hilbert space. An exponentially weighted kernel recursive extended least squares TARMA identification scheme is proposed, and a sliding-window technique is subsequently applied to fix the computational complexity for each consecutive update, allowing the method to operate online in time-varying environments. The proposed sliding-window exponentially weighted kernel recursive extended least squares TARMA method is employed for the identification of a laboratory time-varying structure consisting of a simply supported beam and a moving mass sliding on it. The proposed method is comparatively assessed against an existing recursive pseudo-linear regression TARMA method via Monte Carlo experiments and shown to be capable of accurately tracking the time-varying dynamics. Furthermore, the comparisons demonstrate the superior achievable accuracy, lower computational complexity and enhanced online identification capability of the proposed kernel recursive extended least squares TARMA approach.
Vibration suppression in flexible structures via the sliding-mode control approach
NASA Technical Reports Server (NTRS)
Drakunov, S.; Oezguener, Uemit
1994-01-01
Sliding mode control has recently become very popular because it makes the closed-loop system highly insensitive to external disturbances and parameter variations. Sliding algorithms for flexible structures have been used previously, but these were based on finite-dimensional models. An extension of this approach for differential-difference systems is obtained. That makes it possible to apply sliding-mode control algorithms to the variety of nondispersive flexible structures which can be described as differential-difference systems. The main idea of using this technique for dispersive structures is to reduce the order of the controlled part of the system by applying an integral transformation. We can say that the transformation 'absorbs' the dispersive properties of the flexible structure, so that the controlled part becomes nondispersive.
Multimedia proceedings of the 10th Office Information Technology Conference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hudson, B.
1993-09-10
The CD contains the handouts for all the speakers, demo software from Apple, Adobe, Microsoft, and Zylabs, and video movies of the keynote speakers. Adobe Acrobat is used to provide full-fidelity retrieval of the speakers' slides and Apple's Quicktime for Macintosh and Windows is used for video playback. ZyIndex is included for Windows users to provide a full-text search engine for selected documents. There are separately labelled installation and operating instructions for Macintosh and Windows users and some general materials common to both sets of users.
Finite time control for MIMO nonlinear system based on higher-order sliding mode.
Liu, Xiangjie; Han, Yaozhen
2014-11-01
Considering a class of MIMO uncertain nonlinear system, a novel finite time stable control algorithm is proposed based on higher-order sliding mode concept. The higher-order sliding mode control problem of MIMO nonlinear system is firstly transformed into finite time stability problem of multivariable system. Then continuous control law, which can guarantee finite time stabilization of nominal integral chain system, is employed. The second-order sliding mode is used to overcome the system uncertainties. High frequency chattering phenomenon of sliding mode is greatly weakened, and the arbitrarily fast convergence is reached. The finite time stability is proved based on the quadratic form Lyapunov function. Examples concerning the triple integral chain system with uncertainty and the hovercraft trajectory tracking are simulated respectively to verify the effectiveness and the robustness of the proposed algorithm. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
Automated Interval velocity picking for Atlantic Multi-Channel Seismic Data
NASA Astrophysics Data System (ADS)
Singh, Vishwajit
2016-04-01
This paper describes the challenges in developing and testing a fully automated routine for measuring interval velocities from multi-channel seismic data. Various approaches are employed to generate an interactive algorithm that picks interval velocities for 1000-5000 continuous normal moveout (NMO) corrected gathers, replacing the interpreter's effort of manually picking the coherent reflections. The detailed steps and pitfalls of picking interval velocities from seismic reflection time measurements are described for these approaches. The key ingredients these approaches utilize at the velocity analysis stage are the semblance grid and the starting model of interval velocity. Basin-Hopping optimization is employed to drive the misfit function toward local minima. The SLiding-Overlapping Window (SLOW) algorithm is designed to mitigate the non-linearity and ill-posedness of the root-mean-square velocity. Synthetic data case studies address the performance of the velocity picker, generating models that perfectly fit the semblance peaks. A similar linear relationship between average depth and reflection time for the synthetic model and the estimated models supports using the picked interval velocities as the starting model for full waveform inversion to project a more accurate velocity structure of the subsurface. The challenges can be categorized as (1) building an accurate starting model for projecting a more accurate velocity structure of the subsurface, and (2) improving the computational cost of the algorithm by pre-calculating the semblance grid to make auto-picking more feasible.
Dynamic Task Optimization in Remote Diabetes Monitoring Systems.
Suh, Myung-Kyung; Woodbridge, Jonathan; Moin, Tannaz; Lan, Mars; Alshurafa, Nabil; Samy, Lauren; Mortazavi, Bobak; Ghasemzadeh, Hassan; Bui, Alex; Ahmadi, Sheila; Sarrafzadeh, Majid
2012-09-01
Diabetes is the seventh leading cause of death in the United States, but careful symptom monitoring can prevent adverse events. A real-time patient monitoring and feedback system is one of the solutions to help patients with diabetes and their healthcare professionals monitor health-related measurements and provide dynamic feedback. However, data-driven methods to dynamically prioritize and generate tasks are not well investigated in the domain of remote health monitoring. This paper presents a wireless health project (WANDA) that leverages sensor technology and wireless communication to monitor the health status of patients with diabetes. The WANDA dynamic task management function applies data analytics in real time to discretize continuous features, applying data clustering and association rule mining techniques to manage a sliding window size dynamically and to prioritize required user tasks. The developed algorithm minimizes the number of daily action items required by patients with diabetes using association rules that satisfy minimum support, confidence and conditional probability thresholds. Each of these tasks maximizes information gain, thereby improving the overall level of patient adherence and satisfaction. Experimental results from applying EM-based clustering and Apriori algorithms show that the developed algorithm can predict further events with higher confidence levels and reduce the number of user tasks by up to 76.19%.
SU-G-JeP1-15: Sliding Window Prior Data Assisted Compressed Sensing for MRI Lung Tumor Tracking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yip, E; Wachowicz, K; Rathee, S
Purpose: Prior Data Assisted Compressed Sensing (PDACS) is a partial k-space acquisition and reconstruction method for mobile tumour (i.e. lung) tracking using on-line MRI in radiotherapy. PDACS partially relies on prior data acquired at the beginning of dynamic scans, and is therefore susceptible to artifacts in longer duration scans due to slow drifts in MR signal. A novel sliding window strategy is presented to mitigate this effect. Methods: MRI acceleration is simulated by retrospective removal of data from the fully sampled sets. Six lung cancer patients were scanned (clinical 3T MRI) using a balanced steady state free precession (bSSFP) sequence for 3 minutes at approximately 4 frames per second, for a total of 650 dynamics. PDACS acceleration is achieved by undersampling of k-space in a single pseudo-random pattern. Reconstruction iteratively minimizes the total variations while constraining the images to satisfy both the currently acquired data and the prior data in missing k-space. Our novel sliding window technique (SW-PDACS) uses a series of distinct pseudo-random under-sampling patterns of partial k-space, with the prior data drawn from a sliding window of the most recent data available. Under-sampled data, simulating 2-5x acceleration, are reconstructed using PDACS and SW-PDACS. Three quantitative metrics: artifact power, centroid error and Dice's coefficient are computed for comparison. Results: Quantitative metric values from all 6 patients are averaged in 3 bins, each containing approximately one minute of dynamic data. For the first-minute bin, PDACS and SW-PDACS give comparable results. Progressive decline in image quality metrics in bins 2 and 3 is observed for PDACS. No decline in image quality is observed for SW-PDACS. Conclusion: The novel approach presented (SW-PDACS) is more robust for accelerating longer duration (>1 minute) dynamic MRI scans for tracking lung tumour motion using on-line MRI in radiotherapy. B.G. Fallone is a co-founder and CEO of MagnetTx Oncology Solutions (under discussions to license Alberta bi-planar linac MR for commercialization).
Schüpbach, Jörg; Gebhardt, Martin D.; Scherrer, Alexandra U.; Bisset, Leslie R.; Niederhauser, Christoph; Regenass, Stephan; Yerly, Sabine; Aubert, Vincent; Suter, Franziska; Pfister, Stefan; Martinetti, Gladys; Andreutti, Corinne; Klimkait, Thomas; Brandenberger, Marcel; Günthard, Huldrych F.
2013-01-01
Background Tests for recent infections (TRIs) are important for HIV surveillance. We have shown that a patient's antibody pattern in a confirmatory line immunoassay (Inno-Lia) also yields information on time since infection. We have published algorithms which, with a certain sensitivity and specificity, distinguish between incident (< = 12 months) and older infection. In order to use these algorithms like other TRIs, i.e., based on their windows, we now determined their window periods. Methods We classified Inno-Lia results of 527 treatment-naïve patients with HIV-1 infection < = 12 months according to incidence by 25 algorithms. The time after which all infections were ruled older, i.e. the algorithm's window, was determined by linear regression of the proportion ruled incident as a function of time since infection. Window-based incident infection rates (IIR) were determined utilizing the relationship ‘Prevalence = Incidence x Duration’ in four annual cohorts of HIV-1 notifications. Results were compared to performance-based IIR also derived from Inno-Lia results, but utilizing the relationship ‘incident = true incident + false incident’, and also to the IIR derived from the BED incidence assay. Results Window periods varied between 45.8 and 130.1 days and correlated well with the algorithms' diagnostic sensitivity (R² = 0.962; P<0.0001). Among the 25 algorithms, the mean window-based IIR among the 748 notifications of 2005/06 was 0.457, compared to 0.453 obtained for the performance-based IIR with a model not correcting for selection bias. Evaluation of BED results using a window of 153 days yielded an IIR of 0.669. Window-based IIR and performance-based IIR increased by 22.4% and 30.6%, respectively, in 2008, while 2009 and 2010 showed a return to baseline for both methods. Conclusions IIR estimations by window- and performance-based evaluations of Inno-Lia algorithm results were similar and can be used together to assess IIR changes between annual HIV notification cohorts. PMID:23990968
Windowed time-reversal music technique for super-resolution ultrasound imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Lianjie; Labyed, Yassin
Systems and methods for super-resolution ultrasound imaging using a windowed and generalized TR-MUSIC algorithm that divides the imaging region into overlapping sub-regions and applies the TR-MUSIC algorithm to the windowed backscattered ultrasound signals corresponding to each sub-region. The algorithm is also structured to account for the ultrasound attenuation in the medium and the finite-size effects of ultrasound transducer elements.
Optimal Control Allocation with Load Sensor Feedback for Active Load Suppression
NASA Technical Reports Server (NTRS)
Miller, Christopher
2017-01-01
These slide sets describe the OCLA formulation and associated algorithms as a set of new technologies in the first practical application of load limiting flight control utilizing load feedback as a primary control measurement. Slide set one describes Experiment Development and slide set two describes Flight-Test Performance.
Space Object Maneuver Detection Algorithms Using TLE Data
NASA Astrophysics Data System (ADS)
Pittelkau, M.
2016-09-01
An important aspect of Space Situational Awareness (SSA) is detection of deliberate and accidental orbit changes of space objects. Although space surveillance systems detect orbit maneuvers within their tracking algorithms, maneuver data are not readily disseminated for general use. However, two-line element (TLE) data are available and can be used to detect maneuvers of space objects. This work is an attempt to improve upon existing TLE-based maneuver detection algorithms. Three adaptive maneuver detection algorithms are developed and evaluated: The first is a fading-memory Kalman filter, which is equivalent to the sliding-window least-squares polynomial fit, but computationally more efficient and adaptive to the noise in the TLE data. The second algorithm is based on a sample cumulative distribution function (CDF) computed from a histogram of the magnitude-squared |ΔV|² of change-in-velocity vectors (ΔV), which are computed from the TLE data. A maneuver detection threshold is computed from the median estimated from the CDF, or from the CDF and a specified probability of false alarm. The third algorithm is a median filter. The median filter is the simplest of a class of nonlinear filters called order statistics filters, which is within the theory of robust statistics. The output of the median filter is practically insensitive to outliers, or large maneuvers. The median of the |ΔV|² data is proportional to the variance of the ΔV, so the variance is estimated from the output of the median filter. A maneuver is detected when the input data exceeds a constant times the estimated variance.
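A compact sketch of the third (median filter) detector described above: the running median of |ΔV|² tracks the noise floor because it is insensitive to the large jumps caused by maneuvers, and an epoch is flagged when its value exceeds a multiple of that floor. The window length, the multiplier k, and the function interface are placeholders, not the paper's tuned values.

```python
import numpy as np
from scipy.ndimage import median_filter

def detect_maneuvers(dv, window=15, k=10.0):
    """Flag epochs whose |dV|^2 exceeds k times a robust noise-floor estimate.

    dv: array of change-in-velocity vectors, one per TLE-to-TLE comparison.
    """
    mag2 = np.sum(np.asarray(dv, dtype=float) ** 2, axis=1)   # |dV|^2 series
    noise_floor = median_filter(mag2, size=window, mode="nearest")
    return np.where(mag2 > k * noise_floor)[0]                # indices of candidate maneuvers
```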
Moon, Andres; Smith, Geoffrey H; Kong, Jun; Rogers, Thomas E; Ellis, Carla L; Farris, Alton B Brad
2018-02-01
Renal allograft rejection diagnosis depends on assessment of parameters such as interstitial inflammation; however, studies have shown interobserver variability regarding interstitial inflammation assessment. Since automated image analysis quantitation can be reproducible, we devised customized analysis methods for CD3+ T-cell staining density as a measure of rejection severity and compared them with established commercial methods along with visual assessment. Renal biopsy CD3 immunohistochemistry slides (n = 45), including renal allografts with various degrees of acute cellular rejection (ACR) were scanned for whole slide images (WSIs). Inflammation was quantitated in the WSIs using pathologist visual assessment, commercial algorithms (Aperio nuclear algorithm for CD3+ cells/mm² and Aperio positive pixel count algorithm), and customized open source algorithms developed in ImageJ with thresholding/positive pixel counting (custom CD3+%) and identification of pixels fulfilling "maxima" criteria for CD3 expression (custom CD3+ cells/mm²). Based on visual inspections of "markup" images, CD3 quantitation algorithms produced adequate accuracy. Additionally, CD3 quantitation algorithms correlated between each other and also with visual assessment in a statistically significant manner (r = 0.44 to 0.94, p = 0.003 to < 0.0001). Methods for assessing inflammation suggested a progression through the tubulointerstitial ACR grades, with statistically different results in borderline versus other ACR types, in all but the custom methods. Assessment of CD3-stained slides using various open source image analysis algorithms presents salient correlations with established methods of CD3 quantitation. These analysis techniques are promising and highly customizable, providing a form of on-slide "flow cytometry" that can facilitate additional diagnostic accuracy in tissue-based assessments.
Infrared emission spectra from operating elastohydrodynamic sliding contacts
NASA Technical Reports Server (NTRS)
Lauer, J. L.
1976-01-01
Infrared emission spectra from an operating EHD sliding contact were obtained through a diamond window for an aromatic polymer solute present in equal concentration in four different fluids. Three different temperature ranges, three different loads, and three different speeds for every load were examined. Very sensitive Fourier spectrophotometric (interferometric) techniques were employed. Band intensities and band intensity ratios were found to depend both on the operating parameters and on the fluid. Fluid film and metal surface temperatures were calculated from the spectra and their dependence on the mechanical parameters was plotted. The difference between these temperatures could be plotted against shear rate on one curve for all fluids. However, at the same shear rate the difference between the bulk fluid temperature and the diamond window temperature was much higher for one of the fluids, a traction fluid, than for the others.
Dynamic Resting-State Functional Connectivity in Major Depression.
Kaiser, Roselinde H; Whitfield-Gabrieli, Susan; Dillon, Daniel G; Goer, Franziska; Beltzer, Miranda; Minkel, Jared; Smoski, Moria; Dichter, Gabriel; Pizzagalli, Diego A
2016-06-01
Major depressive disorder (MDD) is characterized by abnormal resting-state functional connectivity (RSFC), especially in medial prefrontal cortical (MPFC) regions of the default network. However, prior research in MDD has not examined dynamic changes in functional connectivity as networks form, interact, and dissolve over time. We compared unmedicated individuals with MDD (n=100) to control participants (n=109) on dynamic RSFC (operationalized as SD in RSFC over a series of sliding windows) of an MPFC seed region during a resting-state functional magnetic resonance imaging scan. Among participants with MDD, we also investigated the relationship between symptom severity and RSFC. Secondary analyses probed the association between dynamic RSFC and rumination. Results showed that individuals with MDD were characterized by decreased dynamic (less variable) RSFC between MPFC and regions of parahippocampal gyrus within the default network, a pattern related to sustained positive connectivity between these regions across sliding windows. In contrast, the MDD group exhibited increased dynamic (more variable) RSFC between MPFC and regions of insula, and higher severity of depression was related to increased dynamic RSFC between MPFC and dorsolateral prefrontal cortex. These patterns of highly variable RSFC were related to greater frequency of strong positive and negative correlations in activity across sliding windows. Secondary analyses indicated that increased dynamic RSFC between MPFC and insula was related to higher levels of recent rumination. These findings provide initial evidence that depression, and ruminative thinking in depression, are related to abnormal patterns of fluctuating communication among brain systems involved in regulating attention and self-referential thinking.
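A minimal sketch of the dynamic RSFC measure as operationalized above, i.e. the standard deviation of sliding-window correlations between a seed and a target time series; the window length and step below are illustrative, not the study's scan-specific settings.

```python
import numpy as np

def dynamic_rsfc(seed, target, window=60, step=2):
    """SD of sliding-window Pearson correlations between two regional time series."""
    corrs = [
        np.corrcoef(seed[s:s + window], target[s:s + window])[0, 1]
        for s in range(0, len(seed) - window + 1, step)
    ]
    return np.std(corrs), np.array(corrs)   # variability summary and the window-wise values
```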
The method for detecting small lesions in medical image based on sliding window
NASA Astrophysics Data System (ADS)
Han, Guilai; Jiao, Yuan
2016-10-01
At present, research on computer-aided diagnosis includes sample image segmentation, extraction of visual features, generation of a classification model by learning, and classification of the inspected images according to the generated model. However, this approach is computationally expensive and slow. Moreover, because medical images usually have low contrast, traditional image segmentation methods often fail completely when applied to them. To find the region of interest as early as possible and improve detection speed, this work introduces the currently popular visual attention model into small lesion detection. However, the Itti model is designed mainly for natural images, and its performance is not ideal on medical images, which are usually grayscale. In particular, in the early stages of some cancers, the lesion is not the most salient region of the whole image and is sometimes very difficult to find, although such lesions are prominent within their local areas. This paper proposes a visual attention mechanism based on a sliding window, using the sliding window to compute the saliency of each local area. Combined with the characteristics of lesions, the gray-level, entropy, corner and edge features are selected to generate a saliency map. The salient region is then segmented and classified. This method reduces the difficulty of image segmentation, improves the detection accuracy of small lesions, and is of great significance for the early discovery, early diagnosis and treatment of cancers.
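A simplified sketch of the window-based saliency idea, scoring each sliding window by local mean intensity, entropy, and edge energy on a grayscale image scaled to [0, 1]; the equal weighting of the features, the omission of the corner feature, and the window and step sizes are assumptions of this sketch.

```python
import numpy as np

def window_entropy(patch, bins=32):
    """Shannon entropy of the gray-level histogram of one window."""
    hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    if hist.sum() == 0:
        return 0.0
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def saliency_map(img, win=32, step=16):
    """Score each sliding window and accumulate the scores into a saliency map."""
    h, w = img.shape
    sal = np.zeros((h, w), dtype=float)
    count = np.zeros((h, w), dtype=float)
    gy, gx = np.gradient(img.astype(float))
    edge = np.hypot(gx, gy)                      # simple edge-energy feature
    for y in range(0, h - win + 1, step):
        for x in range(0, w - win + 1, step):
            patch = img[y:y + win, x:x + win]
            score = (patch.mean()
                     + window_entropy(patch)
                     + edge[y:y + win, x:x + win].mean())
            sal[y:y + win, x:x + win] += score
            count[y:y + win, x:x + win] += 1
    return sal / np.maximum(count, 1)            # average score per pixel
```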
Ye, Jay J
2015-07-01
Pathologists' daily tasks consist of both the professional interpretation of slides and the secretarial tasks of translating these interpretations into final pathology reports, the latter of which is a time-consuming endeavor for most pathologists. To describe an artificial intelligence that performs secretarial tasks, designated as Secretary-Mimicking Artificial Intelligence (SMILE). The underlying implementation of SMILE is a collection of computer programs that work in concert to "listen to" the voice commands and to "watch for" the changes of windows caused by slide bar code scanning; SMILE responds to these inputs by acting upon PowerPath Client windows (Sunquest Information Systems, Tucson, Arizona) and its Microsoft Word (Microsoft, Redmond, Washington) Add-In window, eventuating in the reports being typed and finalized. Secretary-Mimicking Artificial Intelligence also communicates relevant information to the pathologist via the computer speakers and a message box on the screen. Secretary-Mimicking Artificial Intelligence performs many secretarial tasks intelligently and semiautonomously, with rapidity and consistency, thus enabling pathologists to focus on slide interpretation, which results in a marked increase in productivity, a decrease in errors, and a reduction of stress in daily practice. Secretary-Mimicking Artificial Intelligence undergoes encounter-based learning continually, resulting in a continuous improvement in its knowledge-based intelligence. Artificial intelligence for pathologists is both feasible and powerful. The future widespread use of artificial intelligence in our profession is certainly going to transform how we practice pathology.
Infrastructure-Free Mapping and Localization for Tunnel-Based Rail Applications Using 2D Lidar
NASA Astrophysics Data System (ADS)
Daoust, Tyler
This thesis presents an infrastructure-free mapping and localization framework for rail vehicles using only a lidar sensor. The method was designed to handle modern underground tunnels: narrow, parallel, and relatively smooth concrete walls. A sliding-window algorithm was developed to estimate the train's motion, using a Renyi's Quadratic Entropy (RQE)-based point-cloud alignment system. The method was tested with datasets gathered on a subway train travelling at high speeds, with 75 km of data across 14 runs, simulating 500 km of localization. The system was capable of mapping with an average error of less than 0.6 % by distance. It was capable of continuously localizing, relative to the map, to within 10 cm in stations and at crossovers, and 2.3 m in pathological sections of tunnel. This work has the potential to improve train localization in a tunnel, which can be used to increase capacity and for automation purposes.
Reliable multicast protocol specifications flow control and NACK policy
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.; Whetten, Brian
1995-01-01
This appendix presents the flow and congestion control schemes recommended for RMP and a NACK policy based on the whiteboard tool. Because RMP uses a primarily NACK-based error detection scheme, there is no direct feedback path through which receivers can signal losses due to low buffer space or congestion. Reliable multicast protocols also suffer from the fact that throughput for a multicast group must be divided among the members of the group. This division is usually very dynamic in nature and therefore does not lend itself well to a priori determination. These facts have led the flow and congestion control schemes of RMP to be made completely orthogonal to the protocol specification. This allows several differing schemes to be used in different environments to produce the best results. As a default, a modified sliding window scheme based on previous algorithms is suggested and described below.
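As background for the default scheme mentioned above, a minimal sender-side sliding window sketch: at most `window` packets may be outstanding, cumulative ACKs advance the window, and a NACK triggers retransmission. RMP's actual scheme modifies this basic idea and adds congestion control, so the class below is only an illustration.

```python
class SlidingWindowSender:
    """Minimal sender-side sliding window: at most `window` unacknowledged packets."""

    def __init__(self, window=8):
        self.window = window
        self.base = 0          # oldest unacknowledged sequence number
        self.next_seq = 0      # next sequence number to send

    def can_send(self):
        return self.next_seq < self.base + self.window

    def send(self):
        assert self.can_send()
        seq = self.next_seq
        self.next_seq += 1
        return seq             # caller transmits packet `seq`

    def on_ack(self, ack_seq):
        # cumulative ACK: everything up to and including ack_seq is delivered
        self.base = max(self.base, ack_seq + 1)

    def on_nack(self, nack_seq):
        return nack_seq        # caller retransmits the reported missing packet
```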
Wavelet based analysis of multi-electrode EEG-signals in epilepsy
NASA Astrophysics Data System (ADS)
Hein, Daniel A.; Tetzlaff, Ronald
2005-06-01
For many epilepsy patients, seizures cannot be sufficiently controlled by antiepileptic pharmacotherapy. Furthermore, surgical treatment is possible only in a small number of cases. The aim of this work is to contribute to the realization of an implantable seizure warning device. Using recordings of electroencephalographic (EEG) signals obtained from the Department of Epileptology of the University of Bonn, we studied a recently proposed algorithm for the detection of parameter changes in nonlinear systems. First, the cross-correlation function between the signals of two electrodes near the epileptic focus is calculated; a wavelet analysis then follows, using a sliding window with the so-called Mexican-hat wavelet. The Shannon entropy of the wavelet-transformed data is then determined, providing the information content on a time scale that depends on the dilation of the wavelet transform. It shows distinct changes at the seizure onset for all dilations and for all patients.
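A small sketch of the analysis chain described above, applied to a precomputed cross-correlation trace: each sliding window is convolved with a Mexican-hat (Ricker) wavelet at several dilations and the Shannon entropy of the resulting coefficients is recorded per dilation. The dilation values, window length, step, and histogram binning are illustrative assumptions.

```python
import numpy as np

def mexican_hat(width, a):
    """Mexican-hat (Ricker) wavelet sampled on `width` points for dilation `a`."""
    t = np.arange(width) - (width - 1) / 2.0
    u = t / a
    return (1 - u ** 2) * np.exp(-u ** 2 / 2)

def shannon_entropy(coeffs, bins=64):
    hist, _ = np.histogram(np.abs(coeffs), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def entropy_per_dilation(xcorr, dilations=(2, 4, 8, 16, 32), window=512, step=64):
    """Wavelet-transform the cross-correlation trace in sliding windows and
    return the Shannon entropy of the coefficients for every dilation."""
    out = {a: [] for a in dilations}
    for start in range(0, len(xcorr) - window + 1, step):
        seg = xcorr[start:start + window]
        for a in dilations:
            w = mexican_hat(min(window, 10 * a), a)
            coeffs = np.convolve(seg, w, mode="same")
            out[a].append(shannon_entropy(coeffs))
    return {a: np.array(v) for a, v in out.items()}
```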
NASA Astrophysics Data System (ADS)
Kappler, Karl N.; Schneider, Daniel D.; MacLean, Laura S.; Bleier, Thomas E.
2017-08-01
A method for identification of pulsations in time series of magnetic field data which are simultaneously present in multiple channels of data at one or more sensor locations is described. Candidate pulsations of interest are first identified in geomagnetic time series by inspection. Time series of these "training events" are represented in matrix form and transpose-multiplied to generate time-domain covariance matrices. The ranked eigenvectors of this matrix are stored as a feature of the pulsation. In the second stage of the algorithm, a sliding window (approximately the width of the training event) is moved across the vector-valued time series comprising the channels on which the training event was observed. At each window position, the data covariance matrix and associated eigenvectors are calculated. We compare the orientation of the dominant eigenvectors of the training data to those from the windowed data and flag windows where the dominant eigenvector directions are similar. This was successful in automatically identifying pulses which share polarization and appear to be from the same source process. We apply the method to a case study of continuously sampled (50 Hz) data from six observatories, each equipped with three-component induction coil magnetometers. We examine a 90-day interval of data associated with a cluster of four observatories located within 50 km of Napa, California, together with two remote reference stations, one 100 km to the north of the cluster and the other 350 km south. When the training data contain signals present in the remote reference observatories, we are reliably able to identify and extract global geomagnetic signals such as solar-generated noise. When the training data contain pulsations only observed in the cluster of local observatories, we identify several types of non-plane wave signals having similar polarization.
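A condensed sketch of the second stage of the algorithm: the dominant eigenvector of the time-domain covariance is computed for the training event and for each windowed segment, and windows whose dominant eigenvector is nearly parallel to the training eigenvector are flagged. The cosine threshold and step size are placeholders rather than the study's settings.

```python
import numpy as np

def dominant_eigvec(segment):
    """Dominant eigenvector of the time-domain covariance of a (samples x channels) segment."""
    seg = segment - segment.mean(axis=0)
    cov = seg.T @ seg
    vals, vecs = np.linalg.eigh(cov)
    return vecs[:, -1]                            # eigenvector of the largest eigenvalue

def flag_similar_windows(data, train_segment, step=10, min_cos=0.95):
    """Slide a window the width of the training event across the multichannel data
    and flag positions whose dominant eigenvector is nearly parallel to that of
    the training event."""
    v_train = dominant_eigvec(train_segment)
    width = train_segment.shape[0]
    hits = []
    for start in range(0, data.shape[0] - width + 1, step):
        v = dominant_eigvec(data[start:start + width])
        if abs(np.dot(v, v_train)) >= min_cos:    # |cos| handles the sign ambiguity
            hits.append(start)
    return hits
```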
Sliding-mode control of single input multiple output DC-DC converter
NASA Astrophysics Data System (ADS)
Zhang, Libo; Sun, Yihan; Luo, Tiejian; Wan, Qiyang
2016-10-01
Various voltage levels are required in the vehicle-mounted power system. A conventional solution is to utilize an independent multiple output DC-DC converter whose cost is high and whose control scheme is complicated. In this paper, we design a novel SIMO DC-DC converter with a sliding mode controller. The proposed converter can boost the voltage of a low-voltage input power source to a controllable high-voltage DC bus and middle-voltage output terminals, which endow the converter with the characteristics of simple structure, low cost, and convenient control. In addition, the sliding mode control (SMC) technique applied in our converter can enhance the performance of a certain SIMO DC-DC converter topology. The high-voltage DC bus can be regarded as the main power source for the high-voltage facilities of the vehicle-mounted power system, and the middle-voltage output terminals can supply power to the low-voltage equipment on an automobile. With respect to the control algorithm, this is the first time the SMC-PID (Proportion Integration Differentiation) control algorithm has been proposed, in which the SMC algorithm is utilized and PID control is appended to the conventional SMC algorithm. The PID control increases the dynamic ability of the SMC algorithm by establishing the corresponding SMC surface and introducing the attached integral of the voltage error, which endows the sliding-control system with excellent dynamic performance. Finally, we established the MATLAB/SIMULINK simulation model, tested the performance of the system, and built a hardware prototype based on a Digital Signal Processor (DSP). Results show that the sliding mode control is able to track a required trajectory and has robustness against uncertainties and disturbances.
NASA Technical Reports Server (NTRS)
Vo, San C.; Biegel, Bryan (Technical Monitor)
2001-01-01
Scalar multiplication is an essential operation in elliptic curve cryptosystems because its implementation determines the speed and the memory storage requirements. This paper discusses some improvements on two popular signed window algorithms for implementing scalar multiplications of an elliptic curve point - Morain-Olivos's algorithm and Koyama-Tsuruoka's algorithm.
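For orientation, the sketch below implements the plain (unsigned) left-to-right sliding-window scalar multiplication with precomputed odd multiples; the signed-window methods discussed in the paper (Morain-Olivos and Koyama-Tsuruoka) differ in their digit sets and window handling, so this is background rather than either of those algorithms. The group law is passed in abstractly, and the toy check uses integer addition in place of elliptic-curve point addition.

```python
def sliding_window_multiply(k, P, add, double, identity, w=4):
    """Left-to-right sliding-window scalar multiplication: computes k*P.

    `add`, `double`, and `identity` abstract the group law (they must handle the
    identity element), so the same routine works for elliptic-curve points or
    any additive group used for testing.
    """
    # precompute the odd multiples P, 3P, 5P, ..., (2^w - 1)P
    table = {1: P}
    twoP = double(P)
    for i in range(3, 1 << w, 2):
        table[i] = add(table[i - 2], twoP)

    bits = bin(k)[2:]
    Q, i = identity, 0
    while i < len(bits):
        if bits[i] == "0":
            Q = double(Q)
            i += 1
        else:
            # take the longest window of at most w bits that ends in a 1
            j = min(i + w, len(bits))
            while bits[j - 1] == "0":
                j -= 1
            for _ in range(j - i):
                Q = double(Q)
            Q = add(Q, table[int(bits[i:j], 2)])
            i = j
    return Q

# toy check: integer addition stands in for the elliptic-curve group law
assert sliding_window_multiply(123456789, 7, add=lambda a, b: a + b,
                               double=lambda a: 2 * a, identity=0) == 123456789 * 7
```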
Analysis and design of second-order sliding-mode algorithms for quadrotor roll and pitch estimation.
Chang, Jing; Cieslak, Jérôme; Dávila, Jorge; Zolghadri, Ali; Zhou, Jun
2017-11-01
The problem addressed in this paper is that of quadrotor roll and pitch estimation without any assumption about the knowledge of perturbation bounds when Inertial Measurement Unit (IMU) data or position measurements are available. A Smooth Sliding Mode (SSM) algorithm is first designed to provide reliable estimation under a smooth disturbance assumption. This assumption is next relaxed with the second proposed Adaptive Sliding Mode (ASM) algorithm that deals with disturbances of unknown bounds. In addition, the analysis of the observers is extended to the case where measurements are corrupted by bias and noise. The gains of the proposed algorithms were deduced from the Lyapunov function. Furthermore, some useful guidelines are provided for the selection of the observer tuning parameters. The performance of these two approaches is evaluated using a nonlinear simulation model and considering either accelerometer or position measurements. The simulation results demonstrate the benefits of the proposed solutions. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
Correlation between automatic detection of malaria on thin film and experts' parasitaemia scores
NASA Astrophysics Data System (ADS)
Sunarko, Budi; Williams, Simon; Prescott, William R.; Byker, Scott M.; Bottema, Murk J.
2017-03-01
An algorithm was developed to diagnose the presence of malaria and to estimate the depth of infection by automatically counting individual normal and infected erythrocytes in images of thin blood smears. During the training stage, the parameters of the algorithm were optimized to maximize correlation with estimates of parasitaemia from expert human observers. The correlation was tested on a set of 1590 images from seven thin film blood smears. The correlation between the results from the algorithm and expert human readers was r = 0.836. Results indicate that reliable estimates of parasitaemia may be achieved by computational image analysis methods applied to images of thin film smears. Meanwhile, compared to the biological experiments, the algorithm fitted the three high-parasitaemia slides and a mid-level parasitaemia slide well, but overestimated the three low-parasitaemia slides. To improve the parasitaemia estimation, the sources of the overestimation were identified. Emphasis is placed on the importance of further research to identify parasites independently of their erythrocyte hosts.
20. INTERIOR OF KITCHEN SHOWING UPDATED CABINETS AND ORIGINAL WOODFRAMED ...
20. INTERIOR OF KITCHEN SHOWING UPDATED CABINETS AND ORIGINAL WOOD-FRAMED SLIDING GLASS WINDOWS OVER SINK. VIEW TO SOUTHEAST. - Rush Creek Hydroelectric System, Worker Cottage, Rush Creek, June Lake, Mono County, CA
16. INTERIOR OF KITCHEN SHOWING UPDATED CABINETS AND ORIGINAL WOODFRAMED ...
16. INTERIOR OF KITCHEN SHOWING UPDATED CABINETS AND ORIGINAL WOOD-FRAMED SLIDING-GLASS WINDOWS OVER SINK. VIEW TO EAST. - Rush Creek Hydroelectric System, Worker Cottage, Rush Creek, June Lake, Mono County, CA
Samsi, Siddharth; Krishnamurthy, Ashok K.; Gurcan, Metin N.
2012-01-01
Follicular Lymphoma (FL) is one of the most common non-Hodgkin lymphomas in the United States. Diagnosis and grading of FL are based on the review of histopathological tissue sections under a microscope and are influenced by human factors such as fatigue and reader bias. Computer-aided image analysis tools can help improve the accuracy of diagnosis and grading and act as another tool at the pathologist’s disposal. Our group has been developing algorithms for identifying follicles in immunohistochemical images. These algorithms have been tested and validated on small images extracted from whole slide images. However, the use of these algorithms for analyzing the entire whole slide image requires significant changes to the processing methodology since the images are relatively large (on the order of 100k × 100k pixels). In this paper we discuss the challenges involved in analyzing whole slide images and propose potential computational methodologies for addressing these challenges. We discuss the use of parallel computing tools on commodity clusters and compare the performance of the serial and parallel implementations of our approach. PMID:22962572
Robust and real-time rotor control with magnetic bearings
NASA Technical Reports Server (NTRS)
Sinha, A.; Wang, K. W.; Mease, K. L.
1991-01-01
This paper deals with the sliding mode control of a rigid rotor via radial magnetic bearings. The digital control algorithm and the results from numerical simulations are presented for an experimental rig. The experimental system which has been set up to digitally implement and validate the sliding mode control algorithm is described. Two methods for the development of the control software are presented. Experimental results for the individual rotor axes are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lei, Hongzhuan; Lu, Zhiming; Vesselinov, Velimir Valentinov
These are slides from a presentation on identifying heterogeneities in the subsurface environment using the level set method. The slides start with the motivation, then explain the Level Set Method (LSM) and the associated algorithms, give some examples, and finally outline future work.
Learning in the model space for cognitive fault diagnosis.
Chen, Huanhuan; Tino, Peter; Rodan, Ali; Yao, Xin
2014-01-01
The emergence of large sensor networks has facilitated the collection of large amounts of real-time data to monitor and control complex engineering systems. However, in many cases the collected data may be incomplete or inconsistent, while the underlying environment may be time-varying or unformulated. In this paper, we develop an innovative cognitive fault diagnosis framework that tackles the above challenges. This framework investigates fault diagnosis in the model space instead of the signal space. Learning in the model space is implemented by fitting a series of models using a series of signal segments selected with a sliding window. By investigating the learning techniques in the fitted model space, faulty models can be discriminated from healthy models using a one-class learning algorithm. The framework enables us to construct a fault library when unknown faults occur, which can be regarded as cognitive fault isolation. This paper also theoretically investigates how to measure the pairwise distance between two models in the model space and incorporates the model distance into the learning algorithm in the model space. The results on three benchmark applications and one simulated model for the Barcelona water distribution network confirm the effectiveness of the proposed framework.
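A heavily simplified sketch of the learning-in-the-model-space idea, substituting ordinary least-squares AR models for the reservoir models used in the paper and a one-class SVM for its one-class learner in model (coefficient) space; the window length, step, AR order, and SVM settings are assumptions.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def ar_coefficients(segment, order=4):
    """Least-squares fit of an AR(order) model to one signal segment."""
    X = np.column_stack([segment[i:len(segment) - order + i] for i in range(order)])
    y = segment[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def window_models(signal, window=200, step=50, order=4):
    """One fitted model (coefficient vector) per sliding window."""
    return np.array([
        ar_coefficients(signal[s:s + window], order)
        for s in range(0, len(signal) - window + 1, step)
    ])

def fit_model_space_detector(healthy_signal):
    """Learn the region of model space occupied by healthy operation."""
    return OneClassSVM(nu=0.05, gamma="scale").fit(window_models(healthy_signal))

def diagnose(detector, signal):
    """+1 marks windows whose fitted model looks healthy, -1 marks candidate faults."""
    return detector.predict(window_models(signal))
```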
Temporally-aware algorithms for the classification of anuran sounds.
Luque, Amalia; Romero-Lemos, Javier; Carrasco, Alejandro; Gonzalez-Abril, Luis
2018-01-01
Several authors have shown that the sounds of anurans can be used as an indicator of climate change. Hence, the recording, storage and further processing of a huge number of anuran sounds, distributed over time and space, are required in order to obtain this indicator. Furthermore, it is desirable to have algorithms and tools for the automatic classification of the different classes of sounds. In this paper, six classification methods are proposed, all based on the data-mining domain, which strive to take advantage of the temporal character of the sounds. The definition and comparison of these classification methods is undertaken using several approaches. The main conclusions of this paper are that: (i) the sliding window method attained the best results in the experiments presented, and even outperformed the hidden Markov models usually employed in similar applications; (ii) noteworthy overall classification performance has been obtained, which is an especially striking result considering that the sounds analysed were affected by a highly noisy background; (iii) the instance selection for the determination of the sounds in the training dataset offers better results than cross-validation techniques; and (iv) the temporally-aware classifiers have revealed that they can obtain better performance than their non-temporally-aware counterparts.
27. INTERIOR OF KITCHEN SHOWING ORIGINAL CABINETS, LATCHES AND PULLS, ...
27. INTERIOR OF KITCHEN SHOWING ORIGINAL CABINETS, LATCHES AND PULLS, AND WOOD-FRAME SLIDING-GLASS WINDOWS ABOVE SINK. VIEW TO EAST. - Rush Creek Hydroelectric System, Clubhouse Cottage, Rush Creek, June Lake, Mono County, CA
5. EXTERIOR OF FRONT AND SOUTHWEST WALL OF HOUSE SHOWING ...
5. EXTERIOR OF FRONT AND SOUTHWEST WALL OF HOUSE SHOWING GABLE-ROOFED 1965 ADDITION WITH SLIDING-GLASS WINDOWS. VIEW TO NORTH. - Bishop Creek Hydroelectric System, Plant 4, Worker Cottage, Bishop Creek, Bishop, Inyo County, CA
29. SECOND FLOOR EAST SIDE APARTMENT EAST BEDROOM INTERIOR. ALUMINUM-FRAME ...
29. SECOND FLOOR EAST SIDE APARTMENT EAST BEDROOM INTERIOR. ALUMINUM-FRAME SLIDING-GLASS WINDOWS ARE REPLACEMENTS. VIEW TO NORTHEAST. - Lee Vining Creek Hydroelectric System, Triplex Cottage, Lee Vining Creek, Lee Vining, Mono County, CA
A chest-shape target automatic detection method based on Deformable Part Models
NASA Astrophysics Data System (ADS)
Zhang, Mo; Jin, Weiqi; Li, Li
2016-10-01
Automatic weapon platforms are an important research direction both domestically and overseas; they need to rapidly search for the object to be shot against a complex background, so fast detection of a given target is the foundation of further tasks. Considering that the chest-shape target is a common target in shooting practice, this paper takes the chest-shape target as the object of interest and studies an automatic target detection method based on Deformable Part Models. The algorithm computes Histogram of Oriented Gradients (HOG) features of the target and trains a model using a latent-variable Support Vector Machine (SVM); in this model, the target image is divided into several parts, yielding a root filter and part filters; finally, the algorithm detects the target on the HOG feature pyramid using a sliding-window method. The running time of extracting the HOG pyramid can be shortened by 36% using a lookup table. The results indicate that this algorithm can detect the chest-shape target in natural environments, indoors or outdoors. The true positive rate of detection reaches 76% with many hard samples, and the false positive rate approaches 0. Running on a PC (Intel(R) Core(TM) i5-4200H CPU) with C++, the detection time for images with a resolution of 640 × 480 is 2.093 s. Based on TI's runtime library for image pyramid and convolution operations on the DM642 and other hardware, the detection algorithm is expected to be implementable on a hardware platform and has application prospects in actual systems.
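As an illustration of the sliding-window classification step described above, the sketch below scores every window position of a precomputed feature map with a linear SVM (root) filter; HOG extraction, the part filters and the image pyramid are omitted, and all array shapes, names and thresholds are illustrative.

    import numpy as np

    def score_windows(feature_map, svm_weights, svm_bias):
        # feature_map: (H, W, C) array of per-cell features (e.g. HOG cells);
        # svm_weights: (h, w, C) linear SVM (root filter) weights.
        # Returns the SVM score at every (h, w) window position.
        H, W, C = feature_map.shape
        h, w, _ = svm_weights.shape
        scores = np.empty((H - h + 1, W - w + 1))
        for y in range(scores.shape[0]):
            for x in range(scores.shape[1]):
                window = feature_map[y:y + h, x:x + w, :]
                scores[y, x] = np.sum(window * svm_weights) + svm_bias
        return scores

    # Detections are the window positions whose score exceeds a threshold.
    feats = np.random.rand(60, 80, 31)   # stand-in for one level of a HOG pyramid
    filt = np.random.randn(8, 8, 31)     # stand-in for a trained root filter
    detections = np.argwhere(score_windows(feats, filt, -1.0) > 0.5)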
Determination of layer ordering using sliding-window Fourier transform of x-ray reflectivity data
NASA Astrophysics Data System (ADS)
Smigiel, E.; Knoll, A.; Broll, N.; Cornet, A.
1998-01-01
X-ray reflectometry allows the determination of the thickness, density and roughness of thin layers on a substrate, from a few ångströms to a few hundred nanometres. The thickness is determined by simulation with trial-and-error methods after extracting initial values of the layer thicknesses from the result of a classical Fast Fourier Transform (FFT) of the reflectivity data. However, the order information of the layers is lost during the classical FFT, so the order of the layers then has to be known a priori. In this paper, it will be shown that the order of the layers can be obtained by a sliding-window Fourier transform, the so-called Gabor representation. This joint time-frequency analysis allows the direct determination of the order of the layers and, therefore, the use of a more appropriate starting model for refining simulations. A simulated and a measured example demonstrate the value of this method.
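A small sketch of the sliding-window (Gabor) Fourier transform idea, assuming a one-dimensional reflectivity-like signal on a uniform grid; the window length, step and Gaussian width are illustrative choices.

    import numpy as np

    def sliding_window_ft(signal, window_len=128, step=8, sigma=None):
        # Gabor-style analysis: multiply each windowed segment by a Gaussian
        # and take its Fourier transform, so the local fringe frequencies
        # (related to layer thicknesses) can be localized along the scan axis.
        if sigma is None:
            sigma = window_len / 6.0
        t = np.arange(window_len)
        gauss = np.exp(-0.5 * ((t - window_len / 2) / sigma) ** 2)
        frames = []
        for start in range(0, len(signal) - window_len + 1, step):
            segment = signal[start:start + window_len] * gauss
            frames.append(np.abs(np.fft.rfft(segment)))
        return np.array(frames)   # shape: (n_positions, n_frequencies)

    # Two fringe frequencies present over different ranges of the scan show up
    # in different rows of the spectrogram-like output.
    q = np.linspace(0, 1, 2048)
    refl = np.where(q < 0.5, np.cos(2 * np.pi * 60 * q), np.cos(2 * np.pi * 25 * q))
    spectrogram = sliding_window_ft(refl)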
Effects of window size and shape on accuracy of subpixel centroid estimation of target images
NASA Technical Reports Server (NTRS)
Welch, Sharon S.
1993-01-01
A new algorithm is presented for increasing the accuracy of subpixel centroid estimation of (nearly) point target images in cases where the signal-to-noise ratio is low and the signal amplitude and shape vary from frame to frame. In the algorithm, the centroid is calculated over a data window that is matched in width to the image distribution. Fourier analysis is used to explain the dependency of the centroid estimate on the size of the data window, and simulation and experimental results are presented which demonstrate the effects of window size for two different noise models. The effects of window shape were also investigated for uniform and Gaussian-shaped windows. The new algorithm was developed to improve the dynamic range of a close-range photogrammetric tracking system that provides feedback for control of a large gap magnetic suspension system (LGMSS).
Prediction Study on Anti-Slide Control of Railway Vehicle Based on RBF Neural Networks
NASA Astrophysics Data System (ADS)
Yang, Lijun; Zhang, Jimin
During railway vehicle braking, the anti-slide control system detects the operating status of each wheel-set, e.g. speed difference and deceleration. Once the detected value on some wheel-set exceeds a pre-defined threshold, the brake effort on that wheel-set is adjusted automatically to avoid locking. This method helps guarantee safe vehicle operation and avoid wheel-set flats, but it cannot adapt itself to variations in rail adhesion. While wheel-sets slide, the operating status is a chaotic time series with a certain underlying law, and can be predicted over a certain horizon from that law and experimental data. The predicted values can be used as input reference signals of the vehicle anti-slide control system to judge and control the slide status of the wheel-sets. In this article, an RBF neural network is used for multi-step prediction of wheel-set slide status, with its weight vector adjusted by an online self-adaptive algorithm, and the centers and normalizing parameters of the activation functions of the hidden units computed with the K-means clustering algorithm. With multi-step prediction simulation, the predicted signal, with appropriate precision, can be used by the anti-slide system to actively track and adjust the wheel-set slide tendency, so as to adapt to wheel-rail adhesion variation and reduce the risk of wheel-set locking.
FAST CHOPPER BUILDING, TRA-665. CAMERA FACING NORTH. NOTE BRICKED-IN WINDOW ...
FAST CHOPPER BUILDING, TRA-665. CAMERA FACING NORTH. NOTE BRICKED-IN WINDOW ON RIGHT SIDE (BELOW PAINTED NUMERALS "665"). SLIDING METAL DOOR ON COVERED RAIL AT UPPER LEVEL. SHELTERED ENTRANCE TO STEEL SHIELDING DOOR. DOOR INTO MTR SERVICE BUILDING, TRA-635, STANDS OPEN. MTR BEHIND CHOPPER BUILDING. INL NEGATIVE NO. HD42-1. Mike Crane, Photographer, 3/2004 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID
Robust Synchronization Schemes for Dynamic Channel Environments
NASA Technical Reports Server (NTRS)
Xiong, Fugin
2003-01-01
Professor Xiong will investigate robust synchronization schemes for dynamic channel environments. A sliding window will be investigated for the symbol timing synchronizer, and an open-loop carrier estimator for carrier synchronization. Matlab/Simulink will be used for modeling and simulations.
18. INTERIOR OF BATHROOM SHOWING DOOR TO SOUTH BEDROOM AND ...
18. INTERIOR OF BATHROOM SHOWING DOOR TO SOUTH BEDROOM AND ALUMINUM-FRAMED SLIDING GLASS WINDOW ABOVE BATHTUB AT PHOTO LEFT. VIEW TO SOUTHEAST. - Bishop Creek Hydroelectric System, Plant 4, Worker Cottage, Bishop Creek, Bishop, Inyo County, CA
Dynamic programming-based hot spot identification approach for pedestrian crashes.
Medury, Aditya; Grembek, Offer
2016-08-01
Network screening techniques are widely used by state agencies to identify locations with high collision concentration, also referred to as hot spots. However, most of the research in this regard has focused on identifying highway segments that are of concern for automobile collisions. In comparison, pedestrian hot spot detection has typically focused on analyzing pedestrian crashes in specific locations, such as at/near intersections, mid-blocks, and/or other crossings, as opposed to long stretches of roadway. In this context, the efficiency of some of the widely used network screening methods has not been tested. Hence, in order to address this issue, a dynamic programming-based hot spot identification approach is proposed which provides efficient hot spot definitions for pedestrian crashes. The proposed approach is compared with the sliding window method and an intersection buffer-based approach. The results reveal that the dynamic programming method generates more hot spots with a higher number of crashes, while providing small hot spot segment lengths. In comparison, the sliding window method is shown to suffer from shortcomings due to a first-come-first-served approach vis-à-vis hot spot identification and a fixed hot spot window length assumption. Copyright © 2016 Elsevier Ltd. All rights reserved.
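For contrast with the dynamic programming approach, here is a minimal sketch of the baseline sliding window method the paper compares against: a fixed-length window is slid along the route and windows whose crash count reaches a threshold are flagged; the positions, window length, step and threshold are illustrative.

    import numpy as np

    def sliding_window_hotspots(crash_positions, route_length, window=0.2, step=0.05, threshold=3):
        # Count crashes falling in each fixed-length window slid along the route
        # (positions and lengths in, e.g., miles) and flag windows whose count
        # reaches the threshold as candidate hot spots.
        crashes = np.sort(np.asarray(crash_positions))
        hotspots = []
        start = 0.0
        while start + window <= route_length:
            count = int(np.sum((crashes >= start) & (crashes < start + window)))
            if count >= threshold:
                hotspots.append((start, start + window, count))
            start += step
        return hotspots

    print(sliding_window_hotspots([0.11, 0.14, 0.18, 0.90, 1.62, 1.65, 1.66, 1.71], 2.0))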
NASA Astrophysics Data System (ADS)
Hsiao, Y. R.; Tsai, C.
2017-12-01
As the WHO Air Quality Guideline indicates, ambient air pollution puts world populations under threat of fatal conditions (e.g. heart disease, lung cancer, asthma), raising concerns about air pollution sources and related factors. This study presents a novel approach to investigating the multiscale variations of PM2.5 in southern Taiwan over the past decade, together with four meteorological influencing factors (temperature, relative humidity, precipitation and wind speed), based on the Noise-assisted Multivariate Empirical Mode Decomposition (NAMEMD) algorithm, Hilbert Spectral Analysis (HSA) and the Time-dependent Intrinsic Correlation (TDIC) method. The NAMEMD algorithm is a fully data-driven approach designed for nonlinear and nonstationary multivariate signals, and is used to decompose multivariate signals into a collection of channels of Intrinsic Mode Functions (IMFs). The TDIC method is an EMD-based method that uses a set of sliding window sizes to quantify localized correlation coefficients for multiscale signals. With the alignment property and quasi-dyadic filter bank of the NAMEMD algorithm, one is able to produce the same number of IMFs for all variables and estimate the cross-correlation more accurately. The spectral representation of the NAMEMD-HSA method is compared with Complementary Ensemble Empirical Mode Decomposition/Hilbert Spectral Analysis (CEEMD-HSA) and wavelet analysis. The NAMEMD-based TDIC analysis is then compared with CEEMD-based TDIC analysis and traditional correlation analysis.
High-resolution melting (HRM) for genotyping bovine ephemeral fever virus (BEFV).
Erster, Oran; Stram, Rotem; Menasherow, Shopia; Rubistein-Giuni, Marisol; Sharir, Binyamin; Kchinich, Evgeni; Stram, Yehuda
2017-02-02
In recent years there have been several major outbreaks of bovine ephemeral disease in the Middle East, including Israel. Such occurrences raise the need for quick identification of the viruses responsible for the outbreaks, in order to rapidly identify the entry of viruses that do not belong to the Middle-East BEFV lineage. This challenge was met by the development of a high-resolution melt (HRM) assay. The assay is based on the viral G gene sequence and generation of an algorithm that calculates and evaluates the GC content of various fragments. The algorithm was designed to scan 50- to 200-base-long segments in a sliding-window manner, and to compare and rank them using the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), according to the differences in GC content of homologous fragments. Two fragments were selected, based on a match to the analysis criteria, in terms of size and GC content. These fragments were successfully used in the analysis to differentiate between different virus lineages, thus facilitating assignment of the viruses' geographical origins. Moreover, the assay could be used for differentiating infected from vaccinated animals (DIVA). The new algorithm may therefore be useful for development of improved genotyping studies for other viruses and possibly other microorganisms. Copyright © 2016. Published by Elsevier B.V.
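A simplified sketch of the fragment-screening idea: 50- to 200-base windows are scanned over two aligned homologous sequences and ranked by the GC-content difference between lineages. A plain sort stands in for the TOPSIS ranking, and the sequences, window lengths and step are illustrative.

    def gc_content(seq):
        return (seq.count('G') + seq.count('C')) / len(seq)

    def rank_windows_by_gc_difference(seq_a, seq_b, min_len=50, max_len=200, step=10):
        # Scan homologous fragments of both sequences with windows of 50-200 bases
        # and rank candidates by GC-content difference: the larger the difference,
        # the easier the fragments are to separate by melting behaviour.
        candidates = []
        n = min(len(seq_a), len(seq_b))
        for length in range(min_len, max_len + 1, 50):
            for start in range(0, n - length + 1, step):
                diff = abs(gc_content(seq_a[start:start + length]) -
                           gc_content(seq_b[start:start + length]))
                candidates.append((diff, start, length))
        return sorted(candidates, reverse=True)

    ranked = rank_windows_by_gc_difference('ATGC' * 200, 'AGGC' * 200)
    print(ranked[0])   # (GC difference, fragment start, fragment length)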
Kück, Patrick; Meusemann, Karen; Dambach, Johannes; Thormann, Birthe; von Reumont, Björn M; Wägele, Johann W; Misof, Bernhard
2010-03-31
Methods of alignment masking, which refers to the technique of excluding alignment blocks prior to tree reconstructions, have been successful in improving the signal-to-noise ratio in sequence alignments. However, the lack of formally well defined methods to identify randomness in sequence alignments has prevented a routine application of alignment masking. In this study, we compared the effects on tree reconstructions of the most commonly used profiling method (GBLOCKS), which uses a predefined set of rules in combination with alignment masking, with a new profiling approach (ALISCORE) based on Monte Carlo resampling within a sliding window, using different data sets and alignment methods. While the GBLOCKS approach excludes variable sections above a certain threshold, the choice of which is left arbitrary, the ALISCORE algorithm is free of a priori rating of parameter space and therefore more objective. ALISCORE was successfully extended to amino acids using a proportional model and empirical substitution matrices to score randomness in multiple sequence alignments. A complex bootstrap resampling leads to an even distribution of scores of randomly similar sequences to assess randomness of the observed sequence similarity. Testing performance on real data, both masking methods, GBLOCKS and ALISCORE, helped to improve tree resolution. The sliding window approach was less sensitive to different alignments of identical data sets and performed equally well on all data sets. Concurrently, ALISCORE is capable of dealing with different substitution patterns and heterogeneous base composition. ALISCORE and the most relaxed GBLOCKS gap parameter setting performed best on all data sets. Correspondingly, Neighbor-Net analyses showed the greatest decrease in conflict. Alignment masking improves the signal-to-noise ratio in multiple sequence alignments prior to phylogenetic reconstruction. Given the robust performance of alignment profiling, alignment masking should routinely be used to improve tree reconstructions. Parametric methods of alignment profiling can be easily extended to more complex likelihood-based models of sequence evolution, which opens the possibility of further improvements.
Measurement of glucose concentration by image processing of thin film slides
NASA Astrophysics Data System (ADS)
Piramanayagam, Sankaranaryanan; Saber, Eli; Heavner, David
2012-02-01
Measurement of glucose concentration is important for diagnosis and treatment of diabetes mellitus and other medical conditions. This paper describes a novel image-processing based approach for measuring glucose concentration. A fluid drop (patient sample) is placed on a thin film slide. Glucose, present in the sample, reacts with reagents on the slide to produce a color dye. The color intensity of the dye formed varies with glucose at different concentration levels. Current methods use spectrophotometry to determine the glucose level of the sample. Our proposed algorithm uses an image of the slide, captured at a specific wavelength, to automatically determine glucose concentration. The algorithm consists of two phases: training and testing. Training datasets consist of images at different concentration levels. The dye-occupied image region is first segmented using a Hough based technique and then an intensity based feature is calculated from the segmented region. Subsequently, a mathematical model that describes a relationship between the generated feature values and the given concentrations is obtained. During testing, the dye region of a test slide image is segmented followed by feature extraction. These two initial steps are similar to those done in training. However, in the final step, the algorithm uses the model (feature vs. concentration) obtained from the training and feature generated from test image to predict the unknown concentration. The performance of the image-based analysis was compared with that of a standard glucose analyzer.
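A minimal sketch of the training and testing phases described above, assuming the dye region has already been segmented and reduced to a single mean-intensity feature; the concentrations, intensities and the quadratic calibration model are illustrative, not the authors' data or chosen model.

    import numpy as np

    # Training: mean dye intensity measured from segmented slide regions at known
    # glucose concentrations (illustrative values only).
    known_conc = np.array([50.0, 100.0, 200.0, 300.0, 400.0])   # mg/dL
    mean_intensity = np.array([0.82, 0.64, 0.41, 0.28, 0.20])   # feature from the dye region

    # Fit a simple calibration model (quadratic in intensity) ...
    coeffs = np.polyfit(mean_intensity, known_conc, deg=2)

    # ... then, at test time, predict the unknown concentration from the feature
    # extracted from a new slide image.
    test_feature = 0.35
    print(np.polyval(coeffs, test_feature))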
Bacterial contamination monitor
NASA Technical Reports Server (NTRS)
Rich, E.; Macleod, N. H.
1973-01-01
Economical, simple, and fast method uses apparatus which detects bacteria by photography. Apparatus contains camera, film assembly, calibrated light bulb, opaque plastic plate with built-in reflecting surface and transparent window section, opaque slide, plate with chemical packages, and cover containing roller attached to handle.
5. EXTERIOR OF NORTH SIDE SHOWING ENCLOSED FRONT PORCH AREA, ...
5. EXTERIOR OF NORTH SIDE SHOWING ENCLOSED FRONT PORCH AREA, ALUMINUM SLIDING GLASS WINDOW GLAZING REPLACEMENTS, AND RAILING FOR STAIRS TO BASEMENT. VIEW TO SOUTHWEST. - Bishop Creek Hydroelectric System, Plant 4, Worker Cottage, Bishop Creek, Bishop, Inyo County, CA
17. INTERIOR OF BEDROOM NO. 3 SHOWING MODERN ALUMINUM-FRAMED SLIDING-GLASS ...
17. INTERIOR OF BEDROOM NO. 3 SHOWING MODERN ALUMINUM-FRAMED SLIDING-GLASS WINDOWS WITH WOOD SURROUNDS ON SOUTHWEST AND NORTHWEST WALLS. VIEW TO WEST. - Bishop Creek Hydroelectric System, Plant 4, Worker Cottage, Bishop Creek, Bishop, Inyo County, CA
17. INTERIOR OF KITCHEN SHOWING UPDATED CABINETS, SINK, AND FAUCET, ...
17. INTERIOR OF KITCHEN SHOWING UPDATED CABINETS, SINK, AND FAUCET, AND ORIGINAL WOOD-FRAMED SLIDING GLASS WINDOWS ON SOUTH WALL OVER SINK. VIEW TO SOUTHEAST - Rush Creek Hydroelectric System, Worker Cottage, Rush Creek, June Lake, Mono County, CA
16. INTERIOR OF KITCHEN SHOWING UPDATED CABINETS AND COUNTER TOP, ...
16. INTERIOR OF KITCHEN SHOWING UPDATED CABINETS AND COUNTER TOP, AND ORIGINAL WOOD-FRAMED SLIDING GLASS WINDOW IN NORTH WALL OVERLOOKING FRONT ENTRY. VIEW TO NORTHEAST. - Rush Creek Hydroelectric System, Worker Cottage, Rush Creek, June Lake, Mono County, CA
Tools for the IDL widget set within the X-windows environment
NASA Technical Reports Server (NTRS)
Turgeon, B.; Aston, A.
1992-01-01
New tools using the IDL widget set are presented. In particular, a utility allowing the easy creation and update of slide presentations, XSlideManager, is explained in detail and examples of its application are shown. In addition to XSlideManager, other mini-utilities are discussed. These various pieces of software follow the philosophy of the X-Windows distribution system and are made available to anyone within the Internet network. Acquisition procedures through anonymous ftp are clearly explained.
Denimal, Emmanuel; Marin, Ambroise; Guyot, Stéphane; Journaux, Ludovic; Molin, Paul
2015-08-01
In biology, hemocytometers such as Malassez slides are widely used and are effective tools for counting cells manually. In a previous work, a robust algorithm was developed for grid extraction in Malassez slide images. This algorithm was evaluated on a set of 135 images and grids were accurately detected in most cases, but there remained failures for the most difficult images. In this work, we present an optimization of this algorithm that allows for 100% grid detection and a 25% improvement in grid positioning accuracy. These improvements make the algorithm fully reliable for grid detection. This optimization also allows complete erasing of the grid without altering the cells, which eases their segmentation.
Sliding mode fault tolerant control dealing with modeling uncertainties and actuator faults.
Wang, Tao; Xie, Wenfang; Zhang, Youmin
2012-05-01
In this paper, two sliding mode control algorithms are developed for nonlinear systems with both modeling uncertainties and actuator faults. The first algorithm is developed under the assumption that the uncertainty bounds are known. Different design parameters are utilized to deal with modeling uncertainties and actuator faults, respectively. The second algorithm is an adaptive version of the first one, which is developed to accommodate uncertainties and faults without utilizing exact bounds information. The stability of the overall control systems is proved by using a Lyapunov function. The effectiveness of the developed algorithms has been verified on a nonlinear longitudinal model of the Boeing 747-100/200. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
Zhang, Yao; Tang, Shengjing; Guo, Jie
2017-11-01
In this paper, a novel adaptive-gain fast super-twisting (AGFST) sliding mode attitude control synthesis is carried out for a reusable launch vehicle subject to actuator faults and unknown disturbances. Based on a fast nonsingular terminal sliding mode surface (FNTSMS) and an adaptive-gain fast super-twisting algorithm, an adaptive fault tolerant control law for attitude stabilization is derived to protect against actuator faults and unknown uncertainties. Firstly, a second-order nonlinear control-oriented model for the RLV is established by the feedback linearization method. On this basis, a fast nonsingular terminal sliding mode (FNTSM) manifold is designed, which provides fast finite-time global convergence and avoids the singularity problem as well as the chattering phenomenon. Based on the merits of the standard super-twisting (ST) algorithm and a fast reaching law with adaptation, a novel adaptive-gain fast super-twisting (AGFST) algorithm is proposed for the finite-time fault tolerant attitude control problem of the RLV without any knowledge of the bounds of uncertainties and actuator faults. The important features of the AGFST algorithm include non-overestimation of the control gains and faster convergence than the standard ST algorithm. A formal proof of the finite-time stability of the closed-loop system is derived using the Lyapunov function technique. An estimation of the convergence time and an accurate expression of the convergence region are also provided. Finally, simulations are presented to illustrate the effectiveness and superiority of the proposed control scheme. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Geessink, Oscar G. F.; Baidoshvili, Alexi; Freling, Gerard; Klaase, Joost M.; Slump, Cornelis H.; van der Heijden, Ferdinand
2015-03-01
Visual estimation of tumor and stroma proportions in microscopy images yields a strong, Tumor-(lymph)Node-Metastasis (TNM) classification-independent predictor for patient survival in colorectal cancer. Therefore, it is also a potent (contra)indicator for adjuvant chemotherapy. However, quantification of tumor and stroma through visual estimation is highly subject to intra- and inter-observer variability. The aim of this study is to develop and clinically validate a method for objective quantification of tumor and stroma in standard hematoxylin and eosin (H and E) stained microscopy slides of rectal carcinomas. A tissue segmentation algorithm, based on supervised machine learning and pixel classification, was developed, trained and validated using histological slides that were prepared from surgically excised rectal carcinomas in patients who had not received neoadjuvant chemotherapy and/or radiotherapy. Whole-slide scanning was performed at 20× magnification. A total of 40 images (4 million pixels each) were extracted from 20 whole-slide images at sites showing various relative proportions of tumor and stroma. Experienced pathologists provided detailed annotations for every extracted image. The performance of the algorithm was evaluated using cross-validation by testing on 1 image at a time while using the other 39 images for training. The total classification error of the algorithm was 9.4% (SD = 3.2%). Compared to visual estimation by pathologists, the algorithm was 7.3 times (P = 0.033) more accurate in quantifying tissues, also showing 60% less variability. Automatic tissue quantification was shown to be both reliable and practicable. We ultimately intend to facilitate refined prognostic stratification of (colo)rectal cancer patients and enable better personalized treatment.
NASA Astrophysics Data System (ADS)
Qiang, Jiang; Meng-wei, Liao; Ming-jie, Luo
2018-03-01
The control performance of a Permanent Magnet Synchronous Motor (PMSM) is affected by fluctuations or changes in mechanical parameters when the PMSM is applied as the driving motor in an actual electric vehicle, and external disturbances influence control robustness. To improve the dynamic quality and robustness of the PMSM speed control system, a new second-order integral sliding mode control algorithm is introduced into PMSM vector control. The simulation results show that, compared with traditional PID control, the modified control scheme has better control precision and dynamic response and performs with stronger robustness against external disturbance; it can also effectively alleviate the chattering problems of traditional sliding mode variable structure control.
Sliding GAIT Algorithm for the All-Terrain Hex-Limbed Extra-Terrestrial Explorer (ATHLETE)
NASA Technical Reports Server (NTRS)
Townsend, Julie; Biesiadecki, Jeffrey
2012-01-01
The design of a surface robotic system typically involves a trade between the traverse speed of a wheeled rover and the terrain-negotiating capabilities of a multi-legged walker. The ATHLETE mobility system, with both articulated limbs and wheels, is uniquely capable of both driving and walking, and has the flexibility to employ additional hybrid mobility modes. This paper introduces the Sliding Gait, an intermediate mobility algorithm faster than walking with better terrain-handling capabilities than wheeled mobility.
15. INTERIOR OF KITCHEN SHOWING UPDATED CABINETS, COUNTER TOP, SINK, ...
15. INTERIOR OF KITCHEN SHOWING UPDATED CABINETS, COUNTER TOP, SINK, AND FAUCET, AND ORIGINAL WOOD-FRAMED SLIDING-GLASS WINDOW IN NORTH WALL OVERLOOKING FRONT PORCH. VIEW TO NORTH. - Rush Creek Hydroelectric System, Worker Cottage, Rush Creek, June Lake, Mono County, CA
Yurtkuran, Alkın; Emel, Erdal
2014-01-01
The traveling salesman problem with time windows (TSPTW) is a variant of the traveling salesman problem in which each customer should be visited within a given time window. In this paper, we propose an electromagnetism-like algorithm (EMA) that uses a new constraint handling technique to minimize the travel cost in TSPTW problems. The EMA utilizes the attraction-repulsion mechanism between charged particles in a multidimensional space for global optimization. This paper investigates the problem-specific constraint handling capability of the EMA framework using a new variable bounding strategy, in which real-coded particles' boundary constraints, associated with the corresponding time windows of customers, are introduced and combined with a penalty approach to eliminate infeasibilities regarding time window violations. The performance of the proposed algorithm and the effectiveness of the constraint handling technique have been studied extensively, comparing them to those of state-of-the-art metaheuristics using several sets of benchmark problems reported in the literature. The results of the numerical experiments show that the EMA generates feasible and near-optimal results within shorter computational times compared to the test algorithms.
A fast algorithm for vertex-frequency representations of signals on graphs
Jestrović, Iva; Coyle, James L.; Sejdić, Ervin
2016-01-01
The windowed Fourier transform (short time Fourier transform) and the S-transform are widely used signal processing tools for extracting frequency information from non-stationary signals. Previously, the windowed Fourier transform had been adopted for signals on graphs and has been shown to be very useful for extracting vertex-frequency information from graphs. However, high computational complexity makes these algorithms impractical. We sought to develop a fast windowed graph Fourier transform and a fast graph S-transform requiring significantly shorter computation time. The proposed schemes have been tested with synthetic test graph signals and real graph signals derived from electroencephalography recordings made during swallowing. The results showed that the proposed schemes provide significantly lower computation time in comparison with the standard windowed graph Fourier transform and the fast graph S-transform. Also, the results showed that noise has no effect on the results of the algorithm for the fast windowed graph Fourier transform or on the graph S-transform. Finally, we showed that graphs can be reconstructed from the vertex-frequency representations obtained with the proposed algorithms. PMID:28479645
A multimodal logistics service network design with time windows and environmental concerns.
Zhang, Dezhi; He, Runzhong; Li, Shuangyan; Wang, Zhongwei
2017-01-01
The design of a multimodal logistics service network with customer service time windows and environmental costs is an important and challenging issue. Accordingly, this work established a model to minimize the total cost of multimodal logistics service network design with time windows and environmental concerns. The proposed model incorporates CO2 emission costs to determine the optimal transportation mode combinations and investment selections for transfer nodes, which consider transport cost, transport time, carbon emission, and logistics service time window constraints. Furthermore, genetic and heuristic algorithms are proposed to set up the abovementioned optimal model. A numerical example is provided to validate the model and the abovementioned two algorithms. Then, comparisons of the performance of the two algorithms are provided. Finally, this work investigates the effects of the logistics service time windows and CO2 emission taxes on the optimal solution. Several important management insights are obtained.
NASA Astrophysics Data System (ADS)
Sumantri, Bambang; Uchiyama, Naoki; Sano, Shigenori
2016-01-01
In this paper, a new control structure for a quad-rotor helicopter that employs the least squares method is introduced. The proposed algorithm solves the overdetermined problem of the control input for the translational motion of a quad-rotor helicopter. The algorithm allows all six degrees of freedom to be considered to calculate the control input. The sliding mode controller is applied to achieve robust tracking and stabilization. A saturation function is designed around a boundary layer to reduce the chattering phenomenon that is a common problem in sliding mode control. In order to improve the tracking performance, an integral sliding surface is designed. An energy-saving effect due to chattering reduction is also evaluated. First, the dynamics of the quad-rotor helicopter is derived by the Newton-Euler formulation for a rigid body. Second, a constant plus proportional reaching law is introduced to increase the reaching rate of the sliding mode controller. Global stability of the proposed control strategy is guaranteed based on Lyapunov stability theory. Finally, the robustness and effectiveness of the proposed control system are demonstrated experimentally under wind gusts, and are compared with a regular sliding mode controller, a proportional-differential controller, and a proportional-integral-differential controller.
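A minimal sketch of boundary-layer sliding mode control of the kind described above, applied to a deliberately simple double-integrator plant rather than the quad-rotor model; the gains, boundary-layer width and disturbance are illustrative.

    import numpy as np

    def sat(x):
        # Saturation used in place of sign() inside the boundary layer,
        # the standard way of reducing chattering.
        return np.clip(x, -1.0, 1.0)

    # Track x_ref(t) = sin(t) with a double integrator x'' = u + d(t),
    # where d(t) is an unknown bounded disturbance.
    dt, lam, k, phi = 0.001, 5.0, 20.0, 0.05
    x, v = 0.0, 0.0
    for i in range(20000):
        t = i * dt
        x_ref, v_ref, a_ref = np.sin(t), np.cos(t), -np.sin(t)
        e, e_dot = x - x_ref, v - v_ref
        s = e_dot + lam * e                          # sliding surface
        u = a_ref - lam * e_dot - k * sat(s / phi)   # equivalent control + switching term
        d = 0.5 * np.sin(7 * t)                      # unknown disturbance
        v += (u + d) * dt
        x += v * dt
    print(abs(x - np.sin(20000 * dt)))               # small tracking error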
2015-01-01
Retinal fundus images are widely used in diagnosing and providing treatment for several eye diseases. Prior works using retinal fundus images detected the presence of exudation with the aid of publicly available datasets using an extensive segmentation process. Though proved to be computationally efficient, they failed to create a diabetic retinopathy feature selection system for transparently diagnosing the disease state. The diagnosis also did not employ machine learning methods to categorize candidate fundus images by true positive and true negative ratios, and did not include a more detailed feature selection technique for diabetic retinopathy. To apply machine learning methods and classify the candidate fundus images on the basis of sliding windows, a method called Diabetic Fundus Image Recuperation (DFIR) is designed in this paper. The initial phase of the DFIR method selects the features of the optic cup in digital retinal fundus images based on a sliding window approach, and with this the disease state for diabetic retinopathy is assessed. The feature selection in the DFIR method uses a collection of sliding windows to obtain features based on histogram values; the histogram-based feature selection, with the aid of a Group Sparsity Non-overlapping function, provides more detailed feature information. Using a Support Vector Model in the second phase, the DFIR method based on a Spiral Basis Function effectively ranks the diabetic retinopathy disease levels. The ranking of the disease level for each candidate set provides a promising basis for developing a practically automated diabetic retinopathy diagnosis system. Experimental work on digital fundus images using the DFIR method evaluates factors such as sensitivity, specificity rate, ranking efficiency and feature selection time. PMID:25974230
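A small sketch of the sliding-window, histogram-based feature extraction step, assuming a single-channel image; the group-sparsity selection, the SVM-based ranking and the window and bin sizes are omitted or illustrative.

    import numpy as np

    def sliding_window_histograms(image, window=32, step=16, bins=16):
        # Collect one intensity histogram per window; the concatenated histograms
        # form the feature vector describing the fundus image.
        feats = []
        H, W = image.shape
        for y in range(0, H - window + 1, step):
            for x in range(0, W - window + 1, step):
                patch = image[y:y + window, x:x + window]
                hist, _ = np.histogram(patch, bins=bins, range=(0, 256), density=True)
                feats.append(hist)
        return np.concatenate(feats)

    fundus = np.random.randint(0, 256, (128, 128))   # stand-in for a fundus image channel
    feature_vector = sliding_window_histograms(fundus)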
SigHunt: horizontal gene transfer finder optimized for eukaryotic genomes.
Jaron, Kamil S; Moravec, Jiří C; Martínková, Natália
2014-04-15
Genomic islands (GIs) are DNA fragments incorporated into a genome through horizontal gene transfer (also called lateral gene transfer), often with functions novel for a given organism. While methods for their detection are well researched in prokaryotes, the complexity of eukaryotic genomes makes direct utilization of these methods unreliable, and so labour-intensive phylogenetic searches are used instead. We present a surrogate method that investigates nucleotide base composition of the DNA sequence in a eukaryotic genome and identifies putative GIs. We calculate a genomic signature as a vector of tetranucleotide (4-mer) frequencies using a sliding window approach. Extending the neighbourhood of the sliding window, we establish a local kernel density estimate of the 4-mer frequency. We score the number of 4-mer frequencies in the sliding window that deviate from the credibility interval of their local genomic density using a newly developed discrete interval accumulative score (DIAS). To further improve the effectiveness of DIAS, we select informative 4-mers in a range of organisms using the tetranucleotide quality score developed herein. We show that the SigHunt method is computationally efficient and able to detect GIs in eukaryotic genomes that represent non-ameliorated integration. Thus, it is suited to scanning for change in organisms with different DNA composition. Source code and scripts freely available for download at http://www.iba.muni.cz/index-en.php?pg=research-data-analysis-tools-sighunt are implemented in C and R and are platform-independent. 376090@mail.muni.cz or martinkova@ivb.cz. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
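A simplified sketch of the genomic-signature scan: tetranucleotide frequencies are computed in sliding windows and windows whose frequencies deviate strongly are flagged as putative islands. A global z-score criterion stands in here for the local kernel density estimate and the DIAS of the paper; window size, step and threshold are illustrative.

    from itertools import product
    import numpy as np

    KMERS = [''.join(p) for p in product('ACGT', repeat=4)]

    def kmer_frequencies(seq):
        counts = dict.fromkeys(KMERS, 0)
        for i in range(len(seq) - 3):
            kmer = seq[i:i + 4]
            if kmer in counts:
                counts[kmer] += 1
        total = max(sum(counts.values()), 1)
        return np.array([counts[k] / total for k in KMERS])

    def deviation_scores(genome, window=5000, step=1000, z=3.0):
        # One genomic signature (4-mer frequency vector) per sliding window;
        # the score counts 4-mers lying outside mean +/- z*sd over all windows.
        sigs = np.array([kmer_frequencies(genome[s:s + window])
                         for s in range(0, len(genome) - window + 1, step)])
        mean, sd = sigs.mean(axis=0), sigs.std(axis=0) + 1e-12
        return (np.abs(sigs - mean) > z * sd).sum(axis=1)

    # Windows with high scores are candidate genomic islands.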
Single-machine common/slack due window assignment problems with linear decreasing processing times
NASA Astrophysics Data System (ADS)
Zhang, Xingong; Lin, Win-Chin; Wu, Wen-Hsiang; Wu, Chin-Chia
2017-08-01
This paper studies linear non-increasing processing times and the common/slack due window assignment problems on a single machine, where the actual processing time of a job is a linear non-increasing function of its starting time. The aim is to minimize the sum of the earliness cost, tardiness cost, due window location and due window size. Some optimality results are discussed for the common/slack due window assignment problems and two O(n log n) time algorithms are presented to solve the two problems. Finally, two examples are provided to illustrate the correctness of the corresponding algorithms.
Adaptive windowing and windowless approaches to estimate dynamic functional brain connectivity
NASA Astrophysics Data System (ADS)
Yaesoubi, Maziar; Calhoun, Vince D.
2017-08-01
In this work, we discuss estimation of dynamic dependence of a multi-variate signal. Commonly used approaches are often based on a locality assumption (e.g. sliding-window) which can miss spontaneous changes due to blurring with local but unrelated changes. We discuss recent approaches to overcome this limitation including 1) a wavelet-space approach, essentially adapting the window to the underlying frequency content and 2) a sparse signal-representation which removes any locality assumption. The latter is especially useful when there is no prior knowledge of the validity of such assumption as in brain-analysis. Results on several large resting-fMRI data sets highlight the potential of these approaches.
Mori, Ichiro; Nunobiki, Osamu; Ozaki, Takashi; Taniguchi, Emiko; Kakudo, Kennichi
2008-01-01
To clarify the issues associated with the application of virtual microscopy to daily cytology slide screening, we conducted a survey at a slide conference of cytology. The survey was directed specifically to Japanese cytology technologists who use microscopes on a routine basis. Virtual slides (VS) were prepared from cytology slides using a NanoZoomer (Hamamatsu Photonics, Japan), which is capable of adjusting focus on any part of the slide. A total of ten layers were scanned from the same slides, at 2-micrometer intervals. To simulate cytology slide screening, no marker points were created. The total data volume of six slides was approximately 25 gigabytes. The slides were stored on a Windows 2003 Server and were made accessible on the web to the cytology technologists. Most cytotechnologists answered "Satisfied" or "Acceptable" regarding the VS resolution and drawing speed, and "Dissatisfied" regarding the operation speed. For the ten-layered focus, the answer "insufficient" was slightly more frequent than the answer "sufficient", while no one answered "fewer is acceptable" or "no need for depth". As for use in cytology slide screening, the answers "usable, but requires effort" and "not usable" were about equal in number. In a Japanese cytology meeting, a unique VS system has been used in slide conferences for years, with markings at the discussion points. Therefore, Japanese cytotechnologists are relatively well accustomed to the use of VS, and the survey results showed that they regarded VS more positively than we expected. Currently, VS has acceptable resolution and drawing speed even on the web. Most cytotechnologists regard the focusing capability as crucial for cytology slide screening, but the consequent enlargement of data size, longer scanning time, and slower drawing speed are issues that are yet to be resolved. PMID:18673503
Fuzzy fractional order sliding mode controller for nonlinear systems
NASA Astrophysics Data System (ADS)
Delavari, H.; Ghaderi, R.; Ranjbar, A.; Momani, S.
2010-04-01
In this paper, an intelligent robust fractional-surface sliding mode control for a nonlinear system is studied. At first a sliding PD surface is designed and then a fractional form of this surface, PDα, is proposed. A fast reaching velocity into the switching hyperplane in the hitting phase and little chattering in the sliding phase are desired. To reduce the chattering phenomenon in sliding mode control (SMC), a fuzzy logic controller is used to replace the discontinuity of the signum function at the reaching phase in the sliding mode control. For the problem of determining and optimizing the parameters of the fuzzy sliding mode controller (FSMC), a genetic algorithm (GA) is used. Finally, the performance and significance of the controlled system are investigated in two case studies (a robot manipulator and coupled tanks), under variation in system parameters and also in the presence of an external disturbance. The simulation results demonstrate the performance of the genetic-based fuzzy fractional sliding mode controller.
Serag, Ahmed; Wilkinson, Alastair G.; Telford, Emma J.; Pataky, Rozalia; Sparrow, Sarah A.; Anblagan, Devasuda; Macnaught, Gillian; Semple, Scott I.; Boardman, James P.
2017-01-01
Quantitative volumes from brain magnetic resonance imaging (MRI) acquired across the life course may be useful for investigating long term effects of risk and resilience factors for brain development and healthy aging, and for understanding early life determinants of adult brain structure. Therefore, there is an increasing need for automated segmentation tools that can be applied to images acquired at different life stages. We developed an automatic segmentation method for human brain MRI, where a sliding window approach and a multi-class random forest classifier were applied to high-dimensional feature vectors for accurate segmentation. The method performed well on brain MRI data acquired from 179 individuals, analyzed in three age groups: newborns (38–42 weeks gestational age), children and adolescents (4–17 years) and adults (35–71 years). As the method can learn from partially labeled datasets, it can be used to segment large-scale datasets efficiently. It could also be applied to different populations and imaging modalities across the life course. PMID:28163680
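A toy sketch of the sliding-window segmentation idea: one feature vector per voxel built from the surrounding patch, fed to a multi-class random forest. The actual method uses much richer high-dimensional features and partially labeled data; the volume, labels and patch size below are synthetic and illustrative.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def patch_features(volume, labels, patch=5):
        # Build one feature vector per voxel from the surrounding 3-D patch
        # (a sliding window), paired with the voxel's tissue label.
        r = patch // 2
        X, y = [], []
        for i in range(r, volume.shape[0] - r):
            for j in range(r, volume.shape[1] - r):
                for k in range(r, volume.shape[2] - r):
                    X.append(volume[i - r:i + r + 1, j - r:j + r + 1, k - r:k + r + 1].ravel())
                    y.append(labels[i, j, k])
        return np.array(X), np.array(y)

    # Train a multi-class random forest on labeled voxels, then reuse it to
    # predict tissue classes for patches taken from unseen scans.
    vol = np.random.rand(20, 20, 20)
    lab = (vol > 0.5).astype(int)
    X, y = patch_features(vol, lab)
    clf = RandomForestClassifier(n_estimators=50).fit(X, y)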
Ordeig, Laura; Garcia-Cehic, Damir; Gregori, Josep; Soria, Maria Eugenia; Nieto-Aponte, Leonardo; Perales, Celia; Llorens, Meritxell; Chen, Qian; Riveiro-Barciela, Mar; Buti, Maria; Esteban, Rafael; Esteban, Juan Ignacio; Rodriguez-Frias, Francisco; Quer, Josep
2018-01-01
Hepatitis C virus (HCV) is a highly divergent virus currently classified into seven major genotypes and 86 subtypes (ICTV, June 2017), which can have differing responses to therapy. Accurate genotyping/subtyping using high-resolution HCV subtyping enables confident subtype identification, identifies mixed infections and allows detection of new subtypes. During routine genotyping/subtyping, one sample from an Equatorial Guinea patient could not be classified into any of the subtypes. The complete genomic sequence was compared to reference sequences by phylogenetic and sliding window analysis. Resistance-associated substitutions (RASs) were assessed by deep sequencing. The unclassified HCV genome did not belong to any of the existing genotype 1 (G1) subtypes. Sliding window analysis along the complete genome ruled out recombination phenomena suggesting that it belongs to a new HCV G1 subtype. Two NS5A RASs (L31V+Y93H) were found to be naturally combined in the genome which could limit treatment possibilities in patients infected with this subtype.
Friction is Fracture: a new paradigm for the onset of frictional motion
NASA Astrophysics Data System (ADS)
Fineberg, Jay
Friction is generally described by a single degree of freedom, a `friction coefficient'. We experimentally study the space-time dynamics of the onset of dry and lubricated frictional motion when two contacting bodies start to slide. We first show that the transition from static to dynamic sliding is governed by rupture fronts (closely analogous to earthquakes) that break the contacts along the interface separating the two bodies. Moreover, the structure of these ''laboratory earthquakes'' is quantitatively described by singular solutions originally derived to describe the motion of rapid cracks under applied shear. We demonstrate that this framework quantitatively describes both earthquake motion and arrest. This framework also provides a new window into the hidden properties of the micron-thick interface that governs a body's frictional properties. Using this window we show that lubricated interfaces, although ``slippery'', actually become tougher; lubricants significantly increase the energy dissipated during rupture. The results establish a new (and fruitful) paradigm for describing friction. Israel Science Foundation, ERC.
Egocentric Temporal Action Proposals.
Shao Huang; Weiqiang Wang; Shengfeng He; Lau, Rynson W H
2018-02-01
We present an approach to localize generic actions in egocentric videos, called temporal action proposals (TAPs), for accelerating the action recognition step. An egocentric TAP refers to a sequence of frames that may contain a generic action performed by the wearer of a head-mounted camera, e.g., taking a knife, spreading jam, pouring milk, or cutting carrots. Inspired by object proposals, this paper aims at generating a small number of TAPs, thereby replacing the popular sliding window strategy, for localizing all action events in the input video. To this end, we first propose to temporally segment the input video into action atoms, which are the smallest units that may contain an action. We then apply a hierarchical clustering algorithm with several egocentric cues to generate TAPs. Finally, we propose two actionness networks to score the likelihood of each TAP containing an action. The top ranked candidates are returned as output TAPs. Experimental results show that the proposed TAP detection framework performs significantly better than relevant approaches for egocentric action detection.
Measuring frequency of one-dimensional vibration with video camera using electronic rolling shutter
NASA Astrophysics Data System (ADS)
Zhao, Yipeng; Liu, Jinyue; Guo, Shijie; Li, Tiejun
2018-04-01
Cameras offer a unique capability of collecting high-density spatial data from a distant scene of interest. They can be employed as remote monitoring or inspection sensors to measure vibrating objects because of their commonplace availability, simplicity, and potentially low cost. A drawback of vibration measurement with a camera is the need to process the massive data the camera generates. In order to reduce the data collected from the camera, a camera using an electronic rolling shutter (ERS) is applied to measure the frequency of one-dimensional vibration whose frequency is much higher than the frame rate of the camera. Every row in the image captured by the ERS camera records the vibration displacement at a different time. The displacements that form the vibration can be extracted by local analysis with sliding windows. This methodology is demonstrated on vibrating structures, a cantilever beam and an air compressor, to verify the validity of the proposed algorithm. Suggestions for applications of this methodology and challenges in real-world implementation are given at the end.
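A simple sketch of the sliding-window analysis, assuming per-row displacements have already been extracted from the images: each row is one sample taken at the line readout period, and the dominant frequency is estimated from windowed spectra. The line period, window length and test signal are illustrative.

    import numpy as np

    def dominant_frequency(row_displacements, row_time, window=256, step=64):
        # row_time is the rolling-shutter line readout period, so consecutive
        # image rows sample the displacement far faster than the frame rate.
        estimates = []
        for start in range(0, len(row_displacements) - window + 1, step):
            seg = row_displacements[start:start + window]
            seg = seg - seg.mean()
            spec = np.abs(np.fft.rfft(seg * np.hanning(window)))
            freqs = np.fft.rfftfreq(window, d=row_time)
            estimates.append(freqs[np.argmax(spec[1:]) + 1])   # skip the DC bin
        return np.median(estimates)

    # A 1 kHz vibration sampled through a 20-microsecond line period.
    row_time = 20e-6
    t = np.arange(4096) * row_time
    displ = 3.0 * np.sin(2 * np.pi * 1000 * t) + 0.2 * np.random.randn(t.size)
    print(dominant_frequency(displ, row_time))   # roughly 1000 Hz, within the window's resolution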
Centrifugal compressor fault diagnosis based on qualitative simulation and thermal parameters
NASA Astrophysics Data System (ADS)
Lu, Yunsong; Wang, Fuli; Jia, Mingxing; Qi, Yuanchen
2016-12-01
This paper concerns fault diagnosis of centrifugal compressors based on thermal parameters. An improved qualitative simulation (QSIM) based fault diagnosis method is proposed to diagnose the faults of a centrifugal compressor in a gas-steam combined-cycle power plant (CCPP). The qualitative models under normal and two faulty conditions have been built through analysis of the principle of the centrifugal compressor. To solve the problem of qualitative description of the observations of system variables, a qualitative trend extraction algorithm is applied to extract the trends of the observations. For qualitative state matching, a sliding-window-based matching strategy, which consists of variable operating-range constraints and qualitative constraints, is proposed. The matching results are used to determine which QSIM model is more consistent with the running state of the system. The correct diagnosis of two typical faults, seal leakage and valve sticking, in the centrifugal compressor has validated the targeted performance of the proposed method, showing the advantage of the fault roots being contained in the thermal parameters.
Kauppi, Jukka-Pekka; Martikainen, Kalle; Ruotsalainen, Ulla
2010-12-01
The central purpose of passive signal intercept receivers is to perform automatic categorization of unknown radar signals. Currently, there is an urgent need to develop intelligent classification algorithms for these devices due to emerging complexity of radar waveforms. Especially multifunction radars (MFRs) capable of performing several simultaneous tasks by utilizing complex, dynamically varying scheduled waveforms are a major challenge for automatic pattern classification systems. To assist recognition of complex radar emissions in modern intercept receivers, we have developed a novel method to recognize dynamically varying pulse repetition interval (PRI) modulation patterns emitted by MFRs. We use robust feature extraction and classifier design techniques to assist recognition in unpredictable real-world signal environments. We classify received pulse trains hierarchically which allows unambiguous detection of the subpatterns using a sliding window. Accuracy, robustness and reliability of the technique are demonstrated with extensive simulations using both static and dynamically varying PRI modulation patterns. Copyright © 2010 Elsevier Ltd. All rights reserved.
The research on the mean shift algorithm for target tracking
NASA Astrophysics Data System (ADS)
CAO, Honghong
2017-06-01
The traditional mean shift algorithm for target tracking is effective and runs in real time, but it still has some shortcomings. It easily falls into local optima during tracking, its effectiveness is weak when the object moves fast, and the size of the tracking window never changes, so the method fails when the size of the moving object changes; as a result, we propose a new method. We use a particle swarm optimization algorithm to optimize the mean shift algorithm for target tracking, while SIFT (scale-invariant feature transform) and an affine transformation make the size of the tracking window adaptive. Finally, we evaluate the method through comparative experiments. The experimental results indicate that the proposed method can effectively track the object and that the size of the tracking window adapts accordingly.
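A basic sketch of the underlying mean shift iteration on a per-pixel target-likelihood map with a fixed window size (the point of the paper is precisely to make this window adaptive with PSO, SIFT and an affine transformation); the map and window values below are illustrative.

    import numpy as np

    def mean_shift_track(weight_map, window_center, window_size, n_iter=20, eps=0.5):
        # weight_map: per-pixel likelihood that a pixel belongs to the target
        # (e.g. from back-projecting the target's colour histogram).
        # Repeatedly move the window to the centroid of the weights inside it.
        cy, cx = window_center
        h, w = window_size
        for _ in range(n_iter):
            y0, x0 = int(max(cy - h // 2, 0)), int(max(cx - w // 2, 0))
            roi = weight_map[y0:y0 + h, x0:x0 + w]
            total = roi.sum()
            if total == 0:
                break
            ys, xs = np.mgrid[y0:y0 + roi.shape[0], x0:x0 + roi.shape[1]]
            new_cy, new_cx = (ys * roi).sum() / total, (xs * roi).sum() / total
            if np.hypot(new_cy - cy, new_cx - cx) < eps:
                cy, cx = new_cy, new_cx
                break
            cy, cx = new_cy, new_cx
        return cy, cx

    weights = np.zeros((100, 100))
    weights[40:60, 55:75] = 1.0                            # bright blob = target likelihood
    print(mean_shift_track(weights, (45, 50), (30, 30)))   # converges near (49.5, 64.5)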
Research on Taxiway Path Optimization Based on Conflict Detection
Zhou, Hang; Jiang, Xinxin
2015-01-01
Taxiway path planning is one of the effective measures to make full use of airport resources, and the optimized paths can ensure the safety of the aircraft during the sliding process. In this paper, taxiway path planning based on conflict detection is considered. The specific steps are as follows: firstly, the A* algorithm is improved by adding a conflict detection strategy to search for the shortest safe path in the static taxiway network. Then, according to the sliding speed of the aircraft, a timetable for each node is determined, and the safety interval is treated as a constraint to judge whether there is a conflict or not. The intelligent initial path planning model is established based on these results. Finally, an example is given in an airport simulation environment, where conflicts are detected and resolved to ensure safety. The results indicate that the model established in this paper is effective and feasible. Comparison of the improved A* algorithm with other intelligent algorithms shows that it has clear advantages: it can not only optimize the taxiway path, but also ensure the safety of the sliding process and improve operational efficiency. PMID:26226485
Online tracking of instantaneous frequency and amplitude of dynamical system response
NASA Astrophysics Data System (ADS)
Frank Pai, P.
2010-05-01
This paper presents a sliding-window tracking (SWT) method for accurate tracking of the instantaneous frequency and amplitude of an arbitrary dynamic response by processing only the three (or more) most recent data points. The Teager-Kaiser algorithm (TKA) is a well-known four-point method for online tracking of frequency and amplitude. Because finite differences are used in TKA, its accuracy is easily destroyed by measurement and/or signal-processing noise. Moreover, because TKA assumes the processed signal to be a pure harmonic, any moving average in the signal can destroy the accuracy of TKA. On the other hand, because SWT uses a constant and a pair of windowed regular harmonics to fit the data and estimate the instantaneous frequency and amplitude, the influence of any moving average is eliminated. Moreover, noise filtering is an implicit capability of SWT when more than three data points are used, and this capability increases with the number of processed data points. To compare the accuracy of SWT and TKA, the Hilbert-Huang transform is used to extract accurate time-varying frequencies and amplitudes by processing the whole data set without assuming the signal to be harmonic. Frequency and amplitude tracking of different amplitude- and frequency-modulated signals, vibrato in music, and nonlinear stationary and non-stationary dynamic signals is studied. Results show that SWT is more accurate, robust, and versatile than TKA for online tracking of frequency and amplitude.
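The abstract describes the SWT fit but not its implementation. The following is a minimal sketch under stated assumptions: since the frequency enters the model nonlinearly, the sketch grid-searches candidate frequencies and solves a linear least-squares fit for the constant and harmonic pair in each short window; the grid, window length and test signal are illustrative and not taken from the paper.

```python
import numpy as np

def swt_track(window, dt, freqs):
    """Fit y(t) ~ a0 + a*cos(2*pi*f*t) + b*sin(2*pi*f*t) over a candidate
    frequency grid; return the best-fitting frequency and amplitude."""
    t = np.arange(len(window)) * dt
    best = (np.inf, None, None)
    for f in freqs:
        A = np.column_stack([np.ones_like(t),
                             np.cos(2 * np.pi * f * t),
                             np.sin(2 * np.pi * f * t)])
        coef, *_ = np.linalg.lstsq(A, window, rcond=None)
        resid = np.sum((window - A @ coef) ** 2)
        if resid < best[0]:
            best = (resid, f, np.hypot(coef[1], coef[2]))
    return best[1], best[2]      # instantaneous frequency and amplitude

# toy signal: a 5 Hz tone with a constant offset, tracked from a 25-point window
dt = 0.01
t = np.arange(0, 2, dt)
y = 0.3 + 1.2 * np.sin(2 * np.pi * 5 * t)
f_est, a_est = swt_track(y[100:125], dt, freqs=np.linspace(1, 10, 181))
print(f_est, a_est)              # close to 5 Hz and amplitude 1.2
```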
Windowed multipole for cross section Doppler broadening
NASA Astrophysics Data System (ADS)
Josey, C.; Ducru, P.; Forget, B.; Smith, K.
2016-02-01
This paper presents an in-depth analysis on the accuracy and performance of the windowed multipole Doppler broadening method. The basic theory behind cross section data is described, along with the basic multipole formalism followed by the approximations leading to windowed multipole method and the algorithm used to efficiently evaluate Doppler broadened cross sections. The method is tested by simulating the BEAVRS benchmark with a windowed multipole library composed of 70 nuclides. Accuracy of the method is demonstrated on a single assembly case where total neutron production rates and 238U capture rates compare within 0.1% to ACE format files at the same temperature. With regards to performance, clock cycle counts and cache misses were measured for single temperature ACE table lookup and for windowed multipole. The windowed multipole method was found to require 39.6% more clock cycles to evaluate, translating to a 7.9% performance loss overall. However, the algorithm has significantly better last-level cache performance, with 3 fewer misses per evaluation, or a 65% reduction in last-level misses. This is due to the small memory footprint of the windowed multipole method and better memory access pattern of the algorithm.
Apparatus for insulating windows and the like
Mitchell, R.A.
1984-06-19
Apparatus for insulating window openings through walls and the like includes a thermal shutter, a rail for mounting the shutter adjacent to the window opening and a coupling for connecting the shutter to the rail. The thermal shutter includes an insulated panel adhered to frame members which surround the periphery of the panel. The frame members include a hard portion for providing the frame and a soft portion for providing a seal with that portion of the wall adjacent to the periphery of the opening. The coupling means is preferably integral with the attachment rail. According to a preferred embodiment, the coupling means includes a continuous hinge of reduced thickness. The thermal shutter can be permanently attached, hinged, bi-folded, or sliding with respect to the window and wall. A distribution method is to market the apparatus in "kit" form. 11 figs.
First arrival time picking for microseismic data based on DWSW algorithm
NASA Astrophysics Data System (ADS)
Li, Yue; Wang, Yue; Lin, Hongbo; Zhong, Tie
2018-03-01
First arrival time picking is a crucial step in microseismic data processing. When the signal-to-noise ratio (SNR) is low, however, it is difficult to obtain the first arrival time accurately with traditional methods. In this paper, we propose the double-sliding-window Shapiro-Wilk (DWSW) method, based on the Shapiro-Wilk (SW) test. The DWSW method detects the first arrival time by making full use of the differences in statistical properties between background noise and effective signals. Specifically, the moment at which the statistic of our method reaches its maximum is taken as the first arrival time of the microseismic data. Hence, in our method, there is no need to select a threshold, which makes the algorithm more practical when the SNR of the microseismic data is low. To verify the reliability of the proposed method, a series of experiments is performed on both synthetic and field microseismic data. Our method is compared with the traditional short-term average/long-term average (STA/LTA) method, the Akaike information criterion, and the kurtosis method. Analysis results indicate that the accuracy rate of the proposed method is superior to that of the other three methods when the SNR is as low as -10 dB.
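The abstract does not give the exact DWSW statistic, so the following is only a hedged sketch: two adjacent windows slide along the trace, the Shapiro-Wilk statistic is computed in each, and the sample where the contrast between them peaks is taken as the pick. The contrast measure (W_before - W_after), window length and synthetic trace are assumptions for illustration.

```python
import numpy as np
from scipy.stats import shapiro

def dwsw_pick(trace, win=100):
    """Slide two adjacent windows along the trace and pick the sample where
    the normality contrast between them is largest."""
    stats = np.full(len(trace), -np.inf)
    for i in range(win, len(trace) - win):
        w_before, _ = shapiro(trace[i - win:i])   # background noise: near-Gaussian
        w_after, _ = shapiro(trace[i:i + win])    # signal onset breaks normality
        stats[i] = w_before - w_after             # assumed contrast statistic
    return int(np.argmax(stats))

# synthetic trace: Gaussian noise with a Ricker-like arrival inserted at sample 600
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1.0, 1200)
tau = np.arange(-50, 50) / 25.0
trace[600:700] += 8 * (1 - 2 * (np.pi * tau) ** 2) * np.exp(-(np.pi * tau) ** 2)
print(dwsw_pick(trace))                           # picks close to 600
```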
3D DEM analyses of the 1963 Vajont rock slide
NASA Astrophysics Data System (ADS)
Boon, Chia Weng; Houlsby, Guy; Utili, Stefano
2013-04-01
The 1963 Vajont rock slide has been modelled using the distinct element method (DEM). The open-source DEM code, YADE (Kozicki & Donzé, 2008), was used together with the contact detection algorithm proposed by Boon et al. (2012). The critical sliding friction angle at the slide surface was sought using a strength reduction approach. A shear-softening contact model was used to model the shear resistance of the clayey layer at the slide surface. The results suggest that the critical sliding friction angle can be conservative if stability analyses are calculated based on the peak friction angles. The water table was assumed to be horizontal and the pore pressure at the clay layer was assumed to be hydrostatic. The influence of reservoir filling was marginal, increasing the sliding friction angle by only 1.6˚. The results of the DEM calculations were found to be sensitive to the orientations of the bedding planes and cross-joints. Finally, the failure mechanism was investigated and arching was found to be present at the bend of the chair-shaped slope. References Boon C.W., Houlsby G.T., Utili S. (2012). A new algorithm for contact detection between convex polygonal and polyhedral particles in the discrete element method. Computers and Geotechnics, vol 44, 73-82, doi.org/10.1016/j.compgeo.2012.03.012. Kozicki, J., & Donzé, F. V. (2008). A new open-source software developed for numerical simulations using discrete modeling methods. Computer Methods in Applied Mechanics and Engineering, 197(49-50), 4429-4443.
Design and analysis of adaptive Super-Twisting sliding mode control for a microgyroscope.
Feng, Zhilin; Fei, Juntao
2018-01-01
This paper proposes a novel adaptive Super-Twisting sliding mode control for a microgyroscope under unknown model uncertainties and external disturbances. In order to improve the convergence rate of reaching the sliding surface and the accuracy of regulation and trajectory tracking, a high-order Super-Twisting sliding mode control strategy is employed, which not only combines the advantages of traditional sliding mode control with those of Super-Twisting sliding mode control, but also guarantees that the designed control system can reach the sliding surface and equilibrium point in a shorter finite time from any initial state while avoiding chattering problems. To handle the unknown parameters of the microgyroscope system, an adaptive algorithm based on Lyapunov stability theory is designed to estimate the unknown parameters and angular velocity of the microgyroscope. Finally, the effectiveness of the proposed scheme is demonstrated by simulation results. A comparative study between adaptive Super-Twisting sliding mode control and conventional sliding mode control demonstrates the superiority of the proposed method.
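The controller equations are not given in the abstract; the sketch below implements the standard super-twisting algorithm, u = -k1*sqrt(|s|)*sign(s) + v with v' = -k2*sign(s), on a toy scalar plant with a matched disturbance. The gains, plant and sliding variable are illustrative and unrelated to the microgyroscope model above.

```python
import numpy as np

def super_twisting(x0, k1=1.5, k2=1.1, dt=1e-3, T=5.0):
    """Super-twisting control of the toy plant xdot = u + d(t), driving the
    sliding variable s = x to zero despite a bounded matched disturbance."""
    n = int(T / dt)
    x, v = x0, 0.0
    xs = np.empty(n)
    for i in range(n):
        t = i * dt
        s = x                                    # sliding variable (toy choice)
        u = -k1 * np.sqrt(abs(s)) * np.sign(s) + v
        v += -k2 * np.sign(s) * dt               # integral term of the algorithm
        d = 0.4 * np.sin(2 * t)                  # bounded disturbance, |d'| < k2
        x += (u + d) * dt                        # explicit Euler integration
        xs[i] = x
    return xs

traj = super_twisting(x0=1.0)
print(traj[-1])   # settles in a small neighbourhood of zero in finite time
```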
19. INTERIOR OF KITCHEN SHOWING UPDATED CABINETS, COUNTER TOP, SINK, ...
19. INTERIOR OF KITCHEN SHOWING UPDATED CABINETS, COUNTER TOP, SINK, AND FAUCET, AND ORIGINAL WOODFRAMED SLIDING GLASS WINDOW IN NORTH WALL AT PHOTO LEFT CENTER OVERLOOKING FRONT PORCH. VIEW TO NORTHEAST. - Rush Creek Hydroelectric System, Worker Cottage, Rush Creek, June Lake, Mono County, CA
NASA Astrophysics Data System (ADS)
Sun, Wei; Ding, Wei; Yan, Huifang; Duan, Shunli
2018-06-01
Shoe-mounted pedestrian navigation systems based on micro inertial sensors rely on zero velocity updates to correct their positioning errors in time, so determining the zero velocity interval plays a key role during normal walking. However, as walking gaits are complicated and vary from person to person, it is difficult to detect walking gaits with a fixed-threshold method. This paper proposes a pedestrian gait classification method based on a hidden Markov model. Pedestrian gait data are collected with a micro inertial measurement unit installed at the instep. Based on an analysis of the characteristics of pedestrian walking, the output of a single-axis angular rate gyro is used to classify gait features. The angular rate data are modeled as a univariate Gaussian mixture model with three components, and a four-state left–right continuous hidden Markov model (CHMM) is designed to classify the normal walking gait. The model parameters are trained and optimized using the Baum–Welch algorithm, and the sliding window Viterbi algorithm is then used to decode the gait. Walking data are collected from eight subjects walking along the same route at three different speeds; leave-one-subject-out cross validation is conducted to test the model. Experimental results show that the proposed algorithm can accurately detect the zero velocity interval across different walking gaits. The localization experiment shows that the precision of CHMM-based pedestrian navigation improved by 40% compared to the angular rate threshold method.
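A hedged sketch of the classification pipeline described above, using hmmlearn's GaussianHMM in place of the paper's Gaussian-mixture CHMM (a single Gaussian per state for brevity) and a sliding-window Viterbi decode. The synthetic "gyro" sequence, window length and transition structure are stand-ins for the real instep recordings.

```python
import numpy as np
from hmmlearn import hmm

# Toy stand-in for an instep gyro recording: four gait phases cycling repeatedly.
rng = np.random.default_rng(1)
phase_means = [0.0, 2.0, -1.0, 3.0]
gyro_rate = np.concatenate(
    [rng.normal(m, 0.3, 25) for m in phase_means * 20])[:, None]   # shape (2000, 1)

# Four-state left-right HMM with wrap-around; transmat_ is supplied as the
# initial structure (init_params omits 't'), and EM preserves its zeros.
model = hmm.GaussianHMM(n_components=4, covariance_type="diag",
                        n_iter=50, init_params="smc")
model.transmat_ = np.array([[0.9, 0.1, 0.0, 0.0],
                            [0.0, 0.9, 0.1, 0.0],
                            [0.0, 0.0, 0.9, 0.1],
                            [0.1, 0.0, 0.0, 0.9]])
model.fit(gyro_rate)

def sliding_viterbi(model, stream, win=50):
    """Decode overlapping windows and label each new sample with the final
    state of its window's Viterbi path."""
    labels = []
    for i in range(win, len(stream) + 1):
        _, states = model.decode(stream[i - win:i])
        labels.append(states[-1])
    return np.array(labels)

print(sliding_viterbi(model, gyro_rate[:300])[:10])
```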
Analysis of Asymmetry by a Slide-Vector.
ERIC Educational Resources Information Center
Zielman, Berrie; Heiser, Willem J.
1993-01-01
An algorithm based on the majorization theory of J. de Leeuw and W. J. Heiser is presented for fitting the slide-vector model. It views the model as a constrained version of the unfolding model. A three-way variant is proposed, and two examples from market structure analysis are presented. (SLD)
Ge, Lan; Kino, Aya; Lee, Daniel; Dharmakumar, Rohan; Carr, James C; Li, Debiao
2010-01-01
First-pass perfusion magnetic resonance imaging (MRI) is a promising technique for detecting ischemic heart disease. However, the diagnostic value of the method is limited by the low spatial coverage, resolution, signal-to-noise ratio (SNR), and cardiac motion-related image artifacts. A combination of sliding window and conjugate-gradient HighlY constrained back-PRojection reconstruction (SW-CG-HYPR) method has been proposed in healthy volunteer studies to reduce the acquisition window for each slice while maintaining the temporal resolution of 1 frame per heartbeat in myocardial perfusion MRI. This method allows for improved spatial coverage, resolution, and SNR. In this study, we use a controlled animal model to test whether the myocardial territory supplied by a stenotic coronary artery can be detected accurately by the SW-CG-HYPR perfusion method under pharmacological stress. Results from studies of 6 mongrel dogs (15-25 kg) demonstrate the feasibility of SW-CG-HYPR to detect regional perfusion defects. Using this method, the acquisition time per cardiac cycle was reduced by a factor of 4, and the spatial coverage was increased from 2-3 slices to 6 slices as compared with the conventional techniques including both turbo-Fast Low Angle SHot (FLASH) and echoplanar imaging (EPI). The SNR of the healthy myocardium at peak enhancement with SW-CG-HYPR (12.68 ± 2.46) is significantly higher (P < 0.01) than with turbo-FLASH (8.65 ± 1.93) and EPI (5.48 ± 1.24). The spatial resolution of SW-CG-HYPR images is 1.2 × 1.2 × 8.0 mm, which is better than that of turbo-FLASH (1.8 × 1.8 × 8.0 mm) and EPI (2.0 × 1.8 × 8.0 mm). Sliding-window CG-HYPR is a promising technique for myocardial perfusion MRI. This technique provides higher image quality with respect to significantly improved SNR and spatial resolution of the myocardial perfusion images, which might improve myocardial perfusion imaging in a clinical setting.
Detection with Enhanced Energy Windowing Phase I Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bass, David A.; Enders, Alexander L.
2016-12-01
This document reviews the progress of Phase I of the Detection with Enhanced Energy Windowing (DEEW) project. The DEEW project is the implementation of software incorporating an algorithm which reviews data generated by radiation portal monitors and utilizes advanced and novel techniques for detecting radiological and fissile material while not alarming on Naturally Occurring Radioactive Material. Independent testing indicated that the Enhanced Energy Windowing algorithm showed promise at reducing the probability of alarm in the stream of commerce compared to existing algorithms and other developmental algorithms, while still maintaining adequate sensitivity to threats. This document contains a brief description of the project, instructions for setting up and running the applications, and guidance to help make reviewing the output files and source code easier.
NASA Astrophysics Data System (ADS)
Ji, Yanju; Li, Dongsheng; Yu, Mingmei; Wang, Yuan; Wu, Qiong; Lin, Jun
2016-05-01
The ground electrical source airborne transient electromagnetic system (GREATEM) on an unmanned aircraft offers considerable prospecting depth, lateral resolution and detection efficiency. In recent years it has become an important technical means of rapid resource exploration. However, GREATEM data are extremely vulnerable to stationary white noise and non-stationary electromagnetic noise (sferics noise, aircraft engine noise and other man-made electromagnetic noise). These noises degrade the imaging quality for data interpretation. Based on the characteristics of the GREATEM data and the major noises, we propose a de-noising algorithm utilizing the wavelet threshold method and exponential adaptive window width-fitting. Firstly, the white noise in the measured data is filtered using the wavelet threshold method. Then, the data are segmented using data windows whose step lengths follow even logarithmic intervals. The data polluted by electromagnetic noise are identified within each window based on the discriminating principle of energy detection, and the attenuation characteristics of the data slope are extracted. Finally, an exponential fitting algorithm is adopted to fit the attenuation curve of each window, and the data polluted by non-stationary electromagnetic noise are replaced with their fitting results. Thus the non-stationary electromagnetic noise can be effectively removed. The proposed algorithm is verified on synthetic and real GREATEM signals. The results show that, in GREATEM signals, stationary white noise and non-stationary electromagnetic noise can be effectively filtered using the wavelet threshold-exponential adaptive window width-fitting algorithm, which enhances the imaging quality.
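A minimal sketch of the two ingredients named above, under assumptions: PyWavelets soft thresholding for the white noise, then an exponential (log-linear) fit inside logarithmically spaced windows, with windows whose fit residual is unusually large replaced by their fitted decay. The residual rule stands in for the paper's energy-detection criterion, and the synthetic transient is illustrative.

```python
import numpy as np
import pywt

def wavelet_denoise(x, wavelet="db4", level=4):
    """Soft-threshold the detail coefficients (universal threshold)."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(x)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

def fit_decay_windows(t, x, edges, resid_factor=4.0):
    """Fit log(x) ~ log(A) - k*t in each window; windows whose residual is
    unusually large (stand-in for the energy-detection rule) are replaced
    by their fitted exponential decay."""
    x = x.copy()
    spans, fits, resids = [], [], []
    for a, b in zip(edges[:-1], edges[1:]):
        if b - a < 5:                              # skip degenerate windows
            continue
        seg_t = t[a:b]
        k, logA = np.polyfit(seg_t, np.log(np.clip(x[a:b], 1e-6, None)), 1)
        model = np.exp(logA + k * seg_t)
        spans.append((a, b)); fits.append(model)
        resids.append(np.mean((x[a:b] - model) ** 2))
    thr = resid_factor * np.median(resids)
    for (a, b), model, r in zip(spans, fits, resids):
        if r > thr:
            x[a:b] = model
    return x

# synthetic transient decay with white noise and a burst of non-stationary noise
t = np.linspace(1e-4, 1e-1, 4000)
clean = 5.0 * np.exp(-40 * t)
data = clean + 0.02 * np.random.default_rng(2).normal(size=t.size)
data[1500:1600] += 0.8
edges = np.r_[0, np.unique(np.geomspace(20, t.size, 25).astype(int))]
cleaned = fit_decay_windows(t, wavelet_denoise(data), edges)
print(np.mean((data - clean) ** 2), np.mean((cleaned - clean) ** 2))  # error drops
```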
Visualization of Sliding and Deformation of Orbital Fat During Eye Rotation
Hötte, Gijsbert J.; Schaafsma, Peter J.; Botha, Charl P.; Wielopolski, Piotr A.; Simonsz, Huibert J.
2016-01-01
Purpose Little is known about the way orbital fat slides and/or deforms during eye movements. We compared two deformation algorithms from a sequence of MRI volumes to visualize this complex behavior. Methods Time-dependent deformation data were derived from motion-MRI volumes using Lucas and Kanade Optical Flow (LK3D) and nonrigid registration (B-splines) deformation algorithms. We compared how these two algorithms performed regarding sliding and deformation in three critical areas: the sclera-fat interface, how the optic nerve moves through the fat, and how the fat is squeezed out under the tendon of a relaxing rectus muscle. The efficacy was validated using identified tissue markers such as the lens and blood vessels in the fat. Results Fat immediately behind the eye followed eye rotation by approximately one-half. This was best visualized using the B-splines technique as it showed less ripping of tissue and less distortion. Orbital fat flowed around the optic nerve during eye rotation. In this case, LK3D provided better visualization as it allowed orbital fat tissue to split. The resolution was insufficient to visualize fat being squeezed out between tendon and sclera. Conclusion B-splines performs better in tracking structures such as the lens, while LK3D allows fat tissue to split as should happen as the optic nerve slides through the fat. Orbital fat follows eye rotation by one-half and flows around the optic nerve during eye rotation. Translational Relevance Visualizing orbital fat deformation and sliding offers the opportunity to accurately locate a region of cicatrization and permit an individualized surgical plan. PMID:27540495
DOT National Transportation Integrated Search
2012-03-31
This report evaluates the performance of Continuous Risk Profile (CRP) compared with the : Sliding Window Method (SWM) and Peak Searching (PS) methods. These three network : screening methods all require the same inputs: traffic collision data and Sa...
DOT National Transportation Integrated Search
2012-03-01
This report evaluates the performance of Continuous Risk Profile (CRP) compared with the : Sliding Window Method (SWM) and Peak Searching (PS) methods. These three network : screening methods all require the same inputs: traffic collision data and Sa...
Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...
28. INTERIOR OF BATHROOM SHOWING OPEN DOORWAY TO BEDROOM NO.3 ...
28. INTERIOR OF BATHROOM SHOWING OPEN DOORWAY TO BEDROOM NO.3 AT PHOTO RIGHT, ALUMINUM-FRAMED SLIDING-GLASS WINDOW ABOVE BATHTUB AT PHOTO CENTER, AND BUILT-IN CABINETS AT PHOTO LEFT. VIEW TO NORTHWEST. - Bishop Creek Hydroelectric System, Plant 4, Worker Cottage, Bishop Creek, Bishop, Inyo County, CA
Sampling solution traces for the problem of sorting permutations by signed reversals
2012-01-01
Background Traditional algorithms to solve the problem of sorting by signed reversals output just one optimal solution while the space of all optimal solutions can be huge. A so-called trace represents a group of solutions which share the same set of reversals that must be applied to sort the original permutation following a partial ordering. By using traces, we therefore can represent the set of optimal solutions in a more compact way. Algorithms for enumerating the complete set of traces of solutions were developed. However, due to their exponential complexity, their practical use is limited to small permutations. A partial enumeration of traces is a sampling of the complete set of traces and can be an alternative for the study of distinct evolutionary scenarios of big permutations. Ideally, the sampling should be done uniformly from the space of all optimal solutions. This is however conjectured to be ♯P-complete. Results We propose and evaluate three algorithms for producing a sampling of the complete set of traces that instead can be shown in practice to preserve some of the characteristics of the space of all solutions. The first algorithm (RA) performs the construction of traces through a random selection of reversals on the list of optimal 1-sequences. The second algorithm (DFALT) consists in a slight modification of an algorithm that performs the complete enumeration of traces. Finally, the third algorithm (SWA) is based on a sliding window strategy to improve the enumeration of traces. All proposed algorithms were able to enumerate traces for permutations with up to 200 elements. Conclusions We analysed the distribution of the enumerated traces with respect to their height and average reversal length. Various works indicate that the reversal length can be an important aspect in genome rearrangements. The algorithms RA and SWA show a tendency to lose traces with high average reversal length. Such traces are however rare, and qualitatively our results show that, for testable-sized permutations, the algorithms DFALT and SWA produce distributions which approximate the reversal length distributions observed with a complete enumeration of the set of traces. PMID:22704580
Free-breathing 3D Cardiac MRI Using Iterative Image-Based Respiratory Motion Correction
Moghari, Mehdi H.; Roujol, Sébastien; Chan, Raymond H.; Hong, Susie N.; Bello, Natalie; Henningsson, Markus; Ngo, Long H.; Goddu, Beth; Goepfert, Lois; Kissinger, Kraig V.; Manning, Warren J.; Nezafat, Reza
2012-01-01
Respiratory motion compensation using diaphragmatic navigator (NAV) gating with a 5 mm gating window is conventionally used for free-breathing cardiac MRI. Due to the narrow gating window, scan efficiency is low, resulting in long scan times, especially for patients with irregular breathing patterns. In this work, a new retrospective motion compensation algorithm is presented to reduce the scan time for free-breathing cardiac MRI by increasing the gating window to 15 mm without compromising image quality. The proposed algorithm iteratively corrects for respiratory-induced cardiac motion by optimizing the sharpness of the heart. To evaluate this technique, two coronary MRI datasets with 1.3 mm3 resolution were acquired from 11 healthy subjects (7 females, 25±9 years); one using a NAV with a 5 mm gating window acquired in 12.0±2.0 minutes and one with a 15 mm gating window acquired in 7.1±1.0 minutes. The images acquired with a 15 mm gating window were corrected using the proposed algorithm and compared to the uncorrected images acquired with the 5 mm and 15 mm gating windows. The image quality score, sharpness, and length of the three major coronary arteries were equivalent between the corrected images and the images acquired with a 5 mm gating window (p-value>0.05), while the scan time was reduced by a factor of 1.7. PMID:23132549
Genetic algorithm for nuclear data evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arthur, Jennifer Ann
These are slides on a genetic algorithm for nuclear data evaluation. The following topics are covered: initial population, fitness (outer loop), calculating fitness, selection (first part of inner loop), reproduction (second part of inner loop), solution, and examples.
Cerquera, Alexander; Vollebregt, Madelon A; Arns, Martijn
2018-03-01
Nonlinear analysis of EEG recordings allows detection of characteristics that would probably be neglected by linear methods. This study aimed to determine a suitable epoch length for nonlinear analysis of EEG data based on its recurrence rate in EEG alpha activity (electrodes Fz, Oz, and Pz) from 28 healthy and 64 major depressive disorder subjects. Two nonlinear metrics, Lempel-Ziv complexity and scaling index, were applied in sliding windows of 20 seconds shifted every 1 second and in nonoverlapping windows of 1 minute. In addition, linear spectral analysis was carried out for comparison with the nonlinear results. The analysis with sliding windows showed that the cortical dynamics underlying alpha activity had a recurrence period of around 40 seconds in both groups. In the analysis with nonoverlapping windows, long-term nonstationarities entailed changes over time in the nonlinear dynamics that became significantly different between epochs across time, which was not detected with the linear spectral analysis. Findings suggest that epoch lengths shorter than 40 seconds neglect information in EEG nonlinear studies. In turn, linear analysis did not detect characteristics from long-term nonstationarities in EEG alpha waves of control subjects and patients with major depressive disorder. We recommend that application of nonlinear metrics in EEG time series, particularly of alpha activity, should be carried out with epochs around 60 seconds. In addition, this study aimed to demonstrate that long-term nonlinearities are inherent to the cortical brain dynamics regardless of the presence or absence of a mental disorder.
NASA Astrophysics Data System (ADS)
Ozbulut, O. E.; Silwal, B.
2014-04-01
This study investigates the optimum design parameters of a superelastic friction base isolator (S-FBI) system through a multi-objective genetic algorithm and performance-based evaluation approach. The S-FBI system consists of a flat steel-PTFE sliding bearing and a superelastic NiTi shape memory alloy (SMA) device. The sliding bearing limits the transfer of shear across the isolation interface and provides damping from sliding friction. The SMA device provides restoring force capability to the isolation system together with additional damping characteristics. A three-story building is modeled with the S-FBI isolation system. Multiple-objective numerical optimization that simultaneously minimizes isolation-level displacements and superstructure response is carried out with a genetic algorithm (GA) in order to optimize the S-FBI system. Nonlinear time history analyses of the building with the S-FBI system are performed. A set of 20 near-field ground motion records is used in the numerical simulations. Results show that the S-FBI system successfully controls the response of the building against near-fault earthquakes without sacrificing isolation efficacy or producing large isolation-level deformations.
Image microarrays (IMA): Digital pathology's missing tool
Hipp, Jason; Cheng, Jerome; Pantanowitz, Liron; Hewitt, Stephen; Yagi, Yukako; Monaco, James; Madabhushi, Anant; Rodriguez-canales, Jaime; Hanson, Jeffrey; Roy-Chowdhuri, Sinchita; Filie, Armando C.; Feldman, Michael D.; Tomaszewski, John E.; Shih, Natalie NC.; Brodsky, Victor; Giaccone, Giuseppe; Emmert-Buck, Michael R.; Balis, Ulysses J.
2011-01-01
Introduction: The increasing availability of whole slide imaging (WSI) data sets (digital slides) from glass slides offers new opportunities for the development of computer-aided diagnostic (CAD) algorithms. With the all-digital pathology workflow that these data sets will enable in the near future, literally millions of digital slides will be generated and stored. Consequently, the field in general and pathologists, specifically, will need tools to help extract actionable information from this new and vast collective repository. Methods: To address this limitation, we designed and implemented a tool (dCORE) to enable the systematic capture of image tiles with constrained size and resolution that contain desired histopathologic features. Results: In this communication, we describe a user-friendly tool that will enable pathologists to mine digital slides archives to create image microarrays (IMAs). IMAs are to digital slides as tissue microarrays (TMAs) are to cell blocks. Thus, a single digital slide could be transformed into an array of hundreds to thousands of high quality digital images, with each containing key diagnostic morphologies and appropriate controls. Current manual digital image cut-and-paste methods that allow for the creation of a grid of images (such as an IMA) of matching resolutions are tedious. Conclusion: The ability to create IMAs representing hundreds to thousands of vetted morphologic features has numerous applications in education, proficiency testing, consensus case review, and research. Lastly, in a manner analogous to the way conventional TMA technology has significantly accelerated in situ studies of tissue specimens use of IMAs has similar potential to significantly accelerate CAD algorithm development. PMID:22200030
Content-Based Indexing and Teaching Focus Mining for Lecture Videos
ERIC Educational Resources Information Center
Lin, Yu-Tzu; Yen, Bai-Jang; Chang, Chia-Hu; Lee, Greg C.; Lin, Yu-Chih
2010-01-01
Purpose: The purpose of this paper is to propose an indexing and teaching focus mining system for lecture videos recorded in an unconstrained environment. Design/methodology/approach: By applying the proposed algorithms in this paper, the slide structure can be reconstructed by extracting slide images from the video. Instead of applying…
Gundogdu, Erhan; Ozkan, Huseyin; Alatan, A Aydin
2017-11-01
Correlation filters have been successfully used in visual tracking due to their modeling power and computational efficiency. However, the state-of-the-art correlation filter-based (CFB) tracking algorithms tend to quickly discard the previous poses of the target, since they consider only a single filter in their models. On the contrary, our approach is to register multiple CFB trackers for previous poses and exploit the registered knowledge when an appearance change occurs. To this end, we propose a novel tracking algorithm [of complexity O(D)] based on a large ensemble of CFB trackers. The ensemble [of size O(2^D)] is organized over a binary tree (depth D), and learns the target appearance subspaces such that each constituent tracker becomes an expert of a certain appearance. During tracking, the proposed algorithm combines only the appearance-aware relevant experts to produce boosted tracking decisions. Additionally, we propose a versatile spatial windowing technique to enhance the individual expert trackers. For this purpose, spatial windows are learned for target objects as well as the correlation filters and then the windowed regions are processed for more robust correlations. In our extensive experiments on benchmark datasets, we achieve a substantial performance increase by using the proposed tracking algorithm together with the spatial windowing.
Global Precipitation Measurement: GPM Microwave Imager (GMI) Algorithm Development Approach
NASA Technical Reports Server (NTRS)
Stocker, Erich Franz
2009-01-01
This slide presentation reviews the approach to the development of the Global Precipitation Measurement algorithm. This presentation includes information about the responsibilities for the development of the algorithm, and the calibration. Also included is information about the orbit, and the sun angle. The test of the algorithm code will be done with synthetic data generated from the Precipitation Processing System (PPS).
Myocardial perfusion MRI with sliding-window conjugate-gradient HYPR.
Ge, Lan; Kino, Aya; Griswold, Mark; Mistretta, Charles; Carr, James C; Li, Debiao
2009-10-01
First-pass perfusion MRI is a promising technique for detecting ischemic heart disease. However, the diagnostic value of the method is limited by the low spatial coverage, resolution, signal-to-noise ratio (SNR), and cardiac motion-related image artifacts. In this study we investigated the feasibility of using a method that combines sliding window and CG-HYPR methods (SW-CG-HYPR) to reduce the acquisition window for each slice while maintaining the temporal resolution of one frame per heartbeat in myocardial perfusion MRI. This method allows an increased number of slices, reduced motion artifacts, and preserves the relatively high SNR and spatial resolution of the "composite images." Results from eight volunteers demonstrate the feasibility of SW-CG-HYPR for accelerated myocardial perfusion imaging with accurate signal intensity changes of left ventricle blood pool and myocardium. Using this method the acquisition time per cardiac cycle was reduced by a factor of 4 and the number of slices was increased from 3 to 8 as compared to the conventional technique. The SNR of the myocardium at peak enhancement with SW-CG-HYPR (13.83 +/- 2.60) was significantly higher (P < 0.05) than the conventional turbo-FLASH protocol (8.40 +/- 1.62). Also, the spatial resolution of the myocardial perfusion images was significantly improved. SW-CG-HYPR is a promising technique for myocardial perfusion MRI. (c) 2009 Wiley-Liss, Inc.
A lightweight QRS detector for single lead ECG signals using a max-min difference algorithm.
Pandit, Diptangshu; Zhang, Li; Liu, Chengyu; Chattopadhyay, Samiran; Aslam, Nauman; Lim, Chee Peng
2017-06-01
Detection of the R-peak pertaining to the QRS complex of an ECG signal plays an important role for the diagnosis of a patient's heart condition. To accurately identify the QRS locations from the acquired raw ECG signals, we need to handle a number of challenges, which include noise, baseline wander, varying peak amplitudes, and signal abnormality. This research aims to address these challenges by developing an efficient lightweight algorithm for QRS (i.e., R-peak) detection from raw ECG signals. A lightweight real-time sliding window-based Max-Min Difference (MMD) algorithm for QRS detection from Lead II ECG signals is proposed. Targeting to achieve the best trade-off between computational efficiency and detection accuracy, the proposed algorithm consists of five key steps for QRS detection, namely, baseline correction, MMD curve generation, dynamic threshold computation, R-peak detection, and error correction. Five annotated databases from Physionet are used for evaluating the proposed algorithm in R-peak detection. Integrated with a feature extraction technique and a neural network classifier, the proposed QRS detection algorithm has also been extended to undertake normal and abnormal heartbeat detection from ECG signals. The proposed algorithm exhibits a high degree of robustness in QRS detection and achieves an average sensitivity of 99.62% and an average positive predictivity of 99.67%. Its performance compares favorably with those from the existing state-of-the-art models reported in the literature. In regards to normal and abnormal heartbeat detection, the proposed QRS detection algorithm in combination with the feature extraction technique and neural network classifier achieves an overall accuracy rate of 93.44% based on an empirical evaluation using the MIT-BIH Arrhythmia data set with 10-fold cross validation. In comparison with other related studies, the proposed algorithm offers a lightweight adaptive alternative for R-peak detection with good computational efficiency. The empirical results indicate that it not only yields a high accuracy rate in QRS detection, but also exhibits efficient computational complexity at the order of O(n), where n is the length of an ECG signal. Copyright © 2017 Elsevier B.V. All rights reserved.
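The five steps are listed above but their exact rules are not; the sketch below is a hedged reading of the core idea: a sliding-window max-min difference curve over a baseline-corrected signal, an adaptive threshold, and a refractory period. The median-filter baseline, threshold weights and toy ECG are assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.signal import medfilt

def mmd_qrs_detect(ecg, fs, win_ms=60, refractory_ms=250):
    """Sliding-window Max-Min Difference R-peak detector (sketch)."""
    base = medfilt(ecg, kernel_size=2 * int(0.3 * fs) + 1)    # baseline correction
    x = ecg - base
    w = int(win_ms * fs / 1000)
    mmd = np.array([x[i:i + w].max() - x[i:i + w].min()       # MMD curve
                    for i in range(len(x) - w)])
    running = np.convolve(mmd, np.ones(2 * fs) / (2 * fs), mode="same")
    thr = 0.5 * running + 0.2 * mmd.max()                     # dynamic threshold
    peaks, last = [], -np.inf
    for i in np.flatnonzero(mmd > thr):                       # peak detection
        if i - last > refractory_ms * fs / 1000:
            peaks.append(i + int(np.argmax(x[i:i + w])))      # locate the R-peak
            last = i
    return np.array(peaks)

# toy ECG: unit spikes once per second on a slow sinusoidal baseline, 250 Hz
fs = 250
t = np.arange(0, 10, 1 / fs)
ecg = 0.2 * np.sin(2 * np.pi * 0.3 * t)
ecg[(t % 1.0) < 1 / fs] += 1.0
print(mmd_qrs_detect(ecg, fs))     # roughly one detection per simulated beat
```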
Pedersen, Mangor; Omidvarnia, Amir; Zalesky, Andrew; Jackson, Graeme D
2018-06-08
Correlation-based sliding window analysis (CSWA) is the most commonly used method to estimate time-resolved functional MRI (fMRI) connectivity. However, instantaneous phase synchrony analysis (IPSA) is gaining popularity mainly because it offers single time-point resolution of time-resolved fMRI connectivity. We aim to provide a systematic comparison between these two approaches, on both temporal and topological levels. For this purpose, we used resting-state fMRI data from two separate cohorts with different temporal resolutions (45 healthy subjects from Human Connectome Project fMRI data with repetition time of 0.72 s and 25 healthy subjects from a separate validation fMRI dataset with a repetition time of 3 s). For time-resolved functional connectivity analysis, we calculated tapered CSWA over a wide range of different window lengths that were temporally and topologically compared to IPSA. We found a strong association in connectivity dynamics between IPSA and CSWA when considering the absolute values of CSWA. The association between CSWA and IPSA was stronger for a window length of ∼20 s (shorter than filtered fMRI wavelength) than ∼100 s (longer than filtered fMRI wavelength), irrespective of the sampling rate of the underlying fMRI data. Narrow-band filtering of fMRI data (0.03-0.07 Hz) yielded a stronger relationship between IPSA and CSWA than wider-band (0.01-0.1 Hz). On a topological level, time-averaged IPSA and CSWA nodes were non-linearly correlated for both short (∼20 s) and long (∼100 s) windows, mainly because nodes with strong negative correlations (CSWA) displayed high phase synchrony (IPSA). IPSA and CSWA were anatomically similar in the default mode network, sensory cortex, insula and cerebellum. Our results suggest that IPSA and CSWA provide comparable characterizations of time-resolved fMRI connectivity for appropriately chosen window lengths. Although IPSA requires narrow-band fMRI filtering, we recommend the use of IPSA given that it does not mandate a (semi-)arbitrary choice of window length and window overlap. A code for calculating IPSA is provided. Copyright © 2018. Published by Elsevier Inc.
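A small sketch contrasting the two estimators discussed above on synthetic signals: CSWA as a sliding-window Pearson correlation and IPSA as a Hilbert-phase synchrony index after narrow-band filtering (0.03-0.07 Hz, as recommended). The specific synchrony formula, window length and test signals are illustrative choices, not the paper's exact definitions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def band_filter(x, fs, lo=0.03, hi=0.07, order=4):
    """Narrow-band filter, as recommended for IPSA above."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def ipsa(x, y):
    """Instantaneous phase synchrony per time point (Hilbert phases)."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return 1 - np.abs(np.angle(np.exp(1j * dphi))) / np.pi

def cswa(x, y, win):
    """Sliding-window Pearson correlation (one value per window centre)."""
    half = win // 2
    r = np.full(len(x), np.nan)
    for i in range(half, len(x) - half):
        r[i] = np.corrcoef(x[i - half:i + half], y[i - half:i + half])[0, 1]
    return r

# two noisy fMRI-like signals sharing a slow component, TR = 0.72 s
fs = 1 / 0.72
rng = np.random.default_rng(3)
t = np.arange(0, 600, 0.72)
common = np.sin(2 * np.pi * 0.05 * t)
x = band_filter(common + 0.5 * rng.normal(size=t.size), fs)
y = band_filter(common + 0.5 * rng.normal(size=t.size), fs)
print(np.nanmean(np.abs(cswa(x, y, win=28))), ipsa(x, y).mean())  # both high
```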
Mandell, Jacob C; Khurana, Bharti; Folio, Les R; Hyun, Hyewon; Smith, Stacy E; Dunne, Ruth M; Andriole, Katherine P
2017-06-01
A methodology is described using Adobe Photoshop and Adobe Extendscript to process DICOM images with a Relative Attenuation-Dependent Image Overlay (RADIO) algorithm to visualize the full dynamic range of CT in one view, without requiring a change in window and level settings. The potential clinical uses for such an algorithm are described in a pictorial overview, including applications in emergency radiology, oncologic imaging, and nuclear medicine and molecular imaging.
Optimized graph-based mosaicking for virtual microscopy
NASA Astrophysics Data System (ADS)
Steckhan, Dirk G.; Wittenberg, Thomas
2009-02-01
Virtual microscopy has the potential to partially replace traditional microscopy. For virtualization, the slide is scanned once by a fully automatized robotic microscope and saved digitally. Typically, such a scan results in several hundreds to thousands of fields of view. Since robotic stages have positioning errors, these fields of view have to be registered locally and globally in an additional step. In this work we propose a new global mosaicking method for the creation of virtual slides based on sub-pixel exact phase correlation for local alignment in combination with Prim's minimum spanning tree algorithm for global alignment. Our algorithm allows for a robust reproduction of the original slide even in the presence of views with little to no information content. This makes it especially suitable for the mosaicking of cervical smears. These smears often exhibit large empty areas, which do not contain enough information for common stitching approaches.
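A minimal sketch of the local-alignment step described above: phase correlation via the normalized cross-power spectrum recovers the integer-pixel translation between two overlapping fields of view. Sub-pixel refinement and the Prim's-MST global alignment are omitted; the synthetic tiles and shift are illustrative.

```python
import numpy as np

def phase_correlation(a, b):
    """Estimate the integer-pixel translation between two overlapping
    fields of view via the normalized cross-power spectrum."""
    fa, fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = fa * np.conj(fb)
    cross /= np.maximum(np.abs(cross), 1e-12)       # keep phase only
    corr = np.real(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # map peaks beyond the half-size back to negative shifts
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return dy, dx

# toy fields of view: b is a shifted by a (7, -12) pixel stage error
rng = np.random.default_rng(4)
a = rng.random((256, 256))
b = np.roll(a, (7, -12), axis=(0, 1))
print(phase_correlation(b, a))        # (7, -12)
```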
Women's Energy Tool Kit: Home Heating, Cooling and Weatherization.
ERIC Educational Resources Information Center
Byalin, Joan
This book is the first in a series of Energy Tool Kits designed for women by Consumer Action Now, a non-profit organization devoted to promoting energy efficiency and renewable energy resources. Information is provided in 16 sections: introduction, home energy survey; caulking; weatherstripping (double-hung and sliding windows, and casement,…
24 CFR 3280.403 - Standard for windows and sliding glass doors used in manufactured homes.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Manufactured Housing, except the exterior and interior pressure tests must be conducted at the design wind... the products, an independent quality assurance agency shall conduct pre-production specimen tests in... meet ANSI Z97.1-1984, “Safety Performance Specifications and Methods of Test for Safety Glazing...
24 CFR 3280.403 - Standard for windows and sliding glass doors used in manufactured homes.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Manufactured Housing, except the exterior and interior pressure tests must be conducted at the design wind... the products, an independent quality assurance agency shall conduct pre-production specimen tests in... meet ANSI Z97.1-1984, “Safety Performance Specifications and Methods of Test for Safety Glazing...
24 CFR 3280.403 - Standard for windows and sliding glass doors used in manufactured homes.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Manufactured Housing, except the exterior and interior pressure tests must be conducted at the design wind... the products, an independent quality assurance agency shall conduct pre-production specimen tests in... meet ANSI Z97.1-1984, “Safety Performance Specifications and Methods of Test for Safety Glazing...
24 CFR 3280.403 - Standard for windows and sliding glass doors used in manufactured homes.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Manufactured Housing, except the exterior and interior pressure tests must be conducted at the design wind... the products, an independent quality assurance agency shall conduct pre-production specimen tests in... meet ANSI Z97.1-1984, “Safety Performance Specifications and Methods of Test for Safety Glazing...
Data Stream Mining Based Dynamic Link Anomaly Analysis Using Paired Sliding Time Window Data
2014-11-01
Modeling Valuations from Experience: A Comment on Ashby and Rakow (2014)
ERIC Educational Resources Information Center
Wulff, Dirk U.; Pachur, Thorsten
2016-01-01
What are the cognitive mechanisms underlying subjective valuations formed on the basis of sequential experiences of an option's possible outcomes? Ashby and Rakow (2014) have proposed a sliding window model (SWIM), according to which people's valuations represent the average of a limited sample of recent experiences (the size of which is estimated…
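A hedged sketch of the sliding window idea attributed to the SWIM above: the running valuation is the mean of the most recent experienced outcomes. The fixed window size and toy outcome sequence are illustrative; in the model itself the window size is a fitted parameter.

```python
from collections import deque

def swim_valuation(outcomes, window_size):
    """Sliding-window-style valuation: the running valuation is the mean of
    the most recent `window_size` experienced outcomes."""
    recent = deque(maxlen=window_size)
    valuations = []
    for o in outcomes:
        recent.append(o)
        valuations.append(sum(recent) / len(recent))
    return valuations

# an option that mostly pays 0 but occasionally pays 10
print(swim_valuation([0, 0, 10, 0, 0, 0, 10, 0], window_size=4))
```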
Sliding Window-Based Region of Interest Extraction for Finger Vein Images
Yang, Lu; Yang, Gongping; Yin, Yilong; Xiao, Rongyang
2013-01-01
Region of Interest (ROI) extraction is a crucial step in an automatic finger vein recognition system. The aim of ROI extraction is to decide which part of the image is suitable for finger vein feature extraction. This paper proposes a finger vein ROI extraction method which is robust to finger displacement and rotation. First, we determine the middle line of the finger, which is used to correct the image skew. Then, a sliding window is used to detect the phalangeal joints and thereby ascertain the height of the ROI. Last, for the corrected image with the determined height, we obtain the ROI by using the internal tangents of the finger edges as the left and right boundaries. The experimental results show that the proposed method can extract the ROI more accurately and effectively than other methods, and thus improve the performance of the finger vein identification system. In addition, to acquire high-quality finger vein images during the capture process, we propose eight criteria for finger vein capture from different aspects, which should be helpful for finger vein acquisition. PMID:23507824
Characterizing Detrended Fluctuation Analysis of multifractional Brownian motion
NASA Astrophysics Data System (ADS)
Setty, V. A.; Sharma, A. S.
2015-02-01
The Hurst exponent (H) is widely used to quantify long-range dependence in time series data and is estimated using several well-known techniques. Recognizing its ability to remove trends, Detrended Fluctuation Analysis (DFA) is used extensively to estimate the Hurst exponent in non-stationary data. Multifractional Brownian motion (mBm) broadly encompasses a set of models of non-stationary data exhibiting a time-varying Hurst exponent, H(t), as opposed to a constant H. Recently, there has been growing interest in the time dependence of H(t), and sliding window techniques have been used to estimate a local time average of the exponent. This brought to the fore the ability of DFA to estimate scaling exponents in systems with time-varying H(t), such as mBm. This paper characterizes the performance of DFA on mBm data with linearly varying H(t) and further tests the robustness of the estimated time average with respect to data- and technique-related parameters. Our results serve as a benchmark for using DFA as a sliding window estimator to obtain H(t) from time series data.
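A minimal DFA sketch under standard assumptions (first-order detrending, mean of per-segment RMS fluctuations), plus a sliding-window wrapper of the kind discussed above for tracking a local exponent; scales, window and step sizes are illustrative.

```python
import numpy as np

def dfa(x, scales):
    """Detrended Fluctuation Analysis: slope of log F(s) versus log s,
    with first-order (linear) detrending in each segment."""
    y = np.cumsum(x - np.mean(x))                 # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        u = np.arange(s)
        rms = [np.sqrt(np.mean((seg - np.polyval(np.polyfit(u, seg, 1), u)) ** 2))
               for seg in segs]
        F.append(np.mean(rms))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]   # alpha (= H for fGn, H+1 for fBm)

def sliding_dfa(x, win=2000, step=500, scales=(16, 32, 64, 128, 256)):
    """Local scaling exponent in sliding windows, for tracking H(t)."""
    return [dfa(x[i:i + win], np.array(scales))
            for i in range(0, len(x) - win + 1, step)]

rng = np.random.default_rng(5)
noise = rng.normal(size=6000)
print(dfa(noise, np.array([16, 32, 64, 128, 256])))            # ~0.5 (white noise)
print(dfa(np.cumsum(noise), np.array([16, 32, 64, 128, 256]))) # ~1.5 (random walk)
print(sliding_dfa(noise)[:3])
```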
Automated image segmentation-assisted flattening of atomic force microscopy images.
Wang, Yuliang; Lu, Tongda; Li, Xiaolai; Wang, Huimin
2018-01-01
Atomic force microscopy (AFM) images normally exhibit various artifacts. As a result, image flattening is required prior to image analysis. To obtain optimized flattening results, foreground features are generally manually excluded using rectangular masks in image flattening, which is time consuming and inaccurate. In this study, a two-step scheme was proposed to achieve optimized image flattening in an automated manner. In the first step, the convex and concave features in the foreground were automatically segmented with accurate boundary detection. The extracted foreground features were taken as exclusion masks. In the second step, data points in the background were fitted with polynomial curves/surfaces, which were then subtracted from the raw images to obtain the flattened images. Moreover, sliding-window-based polynomial fitting was proposed to process images with complex background trends. The working principle of the two-step image flattening scheme is presented, followed by an investigation of the influence of the sliding-window size and polynomial fitting direction on the flattened images. Additionally, the role of image flattening in the morphological characterization and segmentation of AFM images was verified with the proposed method.
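A hedged sketch of the masked background fit at the core of the second step: each scan line is fitted with a low-order polynomial using only background pixels (the segmentation mask is assumed given) and the fit is subtracted. The paper's segmentation step and its sliding-window variant for complex backgrounds are not reproduced.

```python
import numpy as np

def flatten_lines(image, mask, order=2):
    """Line-by-line polynomial flattening: fit each scan line's background
    (mask == False) with a polynomial and subtract it, so foreground
    features do not bias the fit."""
    flattened = np.empty_like(image, dtype=float)
    x = np.arange(image.shape[1])
    for i, row in enumerate(image):
        bg = ~mask[i]
        coef = np.polyfit(x[bg], row[bg], order)
        flattened[i] = row - np.polyval(coef, x)
    return flattened

# synthetic AFM image: parabolic bow plus tilt, with a tall particle as foreground
h, w = 128, 128
yy, xx = np.mgrid[0:h, 0:w]
background = 1e-3 * (xx - w / 2) ** 2 + 0.05 * xx
image = background.copy()
image[40:60, 50:70] += 20.0                  # convex feature (e.g. a particle)
mask = image - background > 1.0              # segmentation result (assumed given)
print(np.abs(flatten_lines(image, mask)[~mask]).max())   # background residual ~ 0
```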
Remembering the Important Things: Semantic Importance in Stream Reasoning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Rui; Greaves, Mark T.; Smith, William P.
Reasoning and querying over data streams rely on the ability to deliver a sequence of stream snapshots to the processing algorithms. These snapshots are typically provided using windows as views into streams and associated window management strategies. Generally, the goal of any window management strategy is to preserve the most important data in the current window and preferentially evict the rest, so that the retained data can continue to be exploited. A simple timestamp-based strategy is first-in-first-out (FIFO), in which items are replaced in strict order of arrival. All timestamp-based strategies implicitly assume that a temporal ordering reliably reflects importance to the processing task at hand, and thus that window management using timestamps will maximize the ability of the processing algorithms to deliver accurate interpretations of the stream. In this work, we explore a general notion of semantic importance that can be used for window management for streams of RDF data using semantically-aware processing algorithms like deduction or semantic query. Semantic importance exploits the information carried in RDF and surrounding ontologies for ranking window data in terms of its likely contribution to the processing algorithms. We explore the general semantic categories of query contribution, provenance, and trustworthiness, as well as the contribution of domain-specific ontologies. We describe how these categories behave using several concrete examples. Finally, we consider how a stream window management strategy based on semantic importance could improve overall processing performance, especially as available window sizes decrease.
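A minimal sketch of window management by importance rather than arrival time: a fixed-capacity window that evicts the lowest-scoring triple first, falling back to FIFO order among ties. The numeric scores here stand in for the semantic-importance ranking (query contribution, provenance, trust) described above.

```python
import heapq
import itertools

class ImportanceWindow:
    """Fixed-capacity stream window that evicts the least important triple
    first; among equally important items, the oldest is evicted (FIFO)."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.heap = []                       # (importance, arrival_order, triple)
        self.counter = itertools.count()

    def push(self, triple, importance):
        heapq.heappush(self.heap, (importance, next(self.counter), triple))
        if len(self.heap) > self.capacity:
            heapq.heappop(self.heap)         # evict lowest-importance item

    def contents(self):
        return [t for _, _, t in sorted(self.heap, key=lambda e: e[1])]

w = ImportanceWindow(capacity=3)
for triple, score in [(":a :type :Sensor", 0.9), (":a :temp 20", 0.2),
                      (":b :type :Sensor", 0.8), (":b :temp 21", 0.3)]:
    w.push(triple, score)
print(w.contents())   # the low-importance reading is evicted before the schema facts
```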
Zhang, Huaguang; Qu, Qiuxia; Xiao, Geyang; Cui, Yang
2018-06-01
Based on integral sliding mode and approximate dynamic programming (ADP) theory, a novel optimal guaranteed cost sliding mode control is designed for constrained-input nonlinear systems with matched and unmatched disturbances. When the system moves on the sliding surface, the optimal guaranteed cost control problem of sliding mode dynamics is transformed into the optimal control problem of a reformulated auxiliary system with a modified cost function. The ADP algorithm based on single critic neural network (NN) is applied to obtain the approximate optimal control law for the auxiliary system. Lyapunov techniques are used to demonstrate the convergence of the NN weight errors. In addition, the derived approximate optimal control is verified to guarantee the sliding mode dynamics system to be stable in the sense of uniform ultimate boundedness. Some simulation results are presented to verify the feasibility of the proposed control scheme.
Optoelectronic hit/miss transform for screening cervical smear slides
NASA Astrophysics Data System (ADS)
Narayanswamy, R.; Turner, R. M.; McKnight, D. J.; Johnson, K. M.; Sharpe, J. P.
1995-06-01
An optoelectronic morphological processor for detecting regions of interest (abnormal cells) on a cervical smear slide using the hit/miss transform is presented. Computer simulation of the algorithm tested on 184 Pap-smear images provided 95% detection and 5% false alarm. An optoelectronic implementation of the hit/miss transform is presented, along with preliminary experimental results.
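A small sketch of the underlying hit/miss transform using scipy.ndimage.binary_hit_or_miss: a pixel is marked where a "hit" element fits the foreground and a "miss" element fits the background. The structuring elements and toy binary image are illustrative, not the paper's cell-detection templates.

```python
import numpy as np
from scipy.ndimage import binary_hit_or_miss

# Toy thresholded image: one small isolated blob and one large object.
image = np.zeros((12, 12), dtype=bool)
image[3:6, 3:6] = True                     # small isolated object (cell-like)
image[7:12, 6:12] = True                   # large object, should not match

hit = np.ones((3, 3), dtype=bool)          # shape we want to find
miss = np.zeros((5, 5), dtype=bool)        # require an empty ring around it
miss[[0, -1], :] = True
miss[:, [0, -1]] = True

detected = binary_hit_or_miss(image, structure1=hit, structure2=miss)
print(np.argwhere(detected))               # fires only at the centre of the small blob
```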
NASA Astrophysics Data System (ADS)
Zhou, Peng; Zhang, Xi; Sun, Weifeng; Dai, Yongshou; Wan, Yong
2018-01-01
An algorithm based on time-frequency analysis is proposed to select an imaging time window for the inverse synthetic aperture radar imaging of ships. An appropriate range bin is selected to perform the time-frequency analysis after radial motion compensation. The selected range bin is that with the maximum mean amplitude among the range bins whose echoes are confirmed to be contributed by a dominant scatterer. The criterion for judging whether the echoes of a range bin are contributed by a dominant scatterer is key to the proposed algorithm and is therefore described in detail. When the first range bin that satisfies the judgment criterion is found, a sequence composed of the frequencies that have the largest amplitudes in every moment's time-frequency spectrum corresponding to this range bin is employed to calculate the length and the center moment of the optimal imaging time window. Experiments performed with simulation data and real data show the effectiveness of the proposed algorithm, and comparisons between the proposed algorithm and the image contrast-based algorithm (ICBA) are provided. Similar image contrast and lower entropy are achieved using the proposed algorithm compared with the ICBA.
Comparison of Frequency-Domain Array Methods for Studying Earthquake Rupture Process
NASA Astrophysics Data System (ADS)
Sheng, Y.; Yin, J.; Yao, H.
2014-12-01
Seismic array methods, in both the time and frequency domains, have been widely used to study the rupture process and energy radiation of earthquakes. With better spatial resolution, high-resolution frequency-domain methods, such as Multiple Signal Classification (MUSIC) (Schmidt, 1986; Meng et al., 2011) and the recently developed Compressive Sensing (CS) technique (Yao et al., 2011, 2013), are revealing new features of earthquake rupture processes. We have performed various tests on the methods of MUSIC, CS, minimum-variance distortionless response (MVDR) Beamforming and conventional Beamforming in order to better understand the advantages and features of these methods for studying earthquake rupture processes. We use the Ricker wavelet to synthesize seismograms and use these frequency-domain techniques to relocate the synthetic sources we set, for instance, two sources separated in space but with their waveforms completely overlapping in the time domain. We also test the effects of the sliding window scheme on the recovery of a series of input sources, in particular, some artifacts that are caused by the sliding window scheme. Based on our tests, we find that CS, which is developed from the theory of sparsity inversion, has higher spatial resolution than the other frequency-domain methods and performs better at lower frequencies. In high-frequency bands, MUSIC, as well as MVDR Beamforming, is more stable, especially in the multi-source situation. Meanwhile, CS tends to produce more artifacts when the data have a poor signal-to-noise ratio. Although these techniques can distinctly improve the spatial resolution, they still produce some artifacts as the time window slides. Furthermore, we propose a new method, which combines both time-domain and frequency-domain techniques, to suppress these artifacts and obtain more reliable earthquake rupture images. Finally, we apply this new technique to study the 2013 Okhotsk deep mega-earthquake in order to better capture the rupture characteristics (e.g., rupture area and velocity) of this earthquake.
Time reversal and phase coherent music techniques for super-resolution ultrasound imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Lianjie; Labyed, Yassin
Systems and methods for super-resolution ultrasound imaging using a windowed and generalized TR-MUSIC algorithm that divides the imaging region into overlapping sub-regions and applies the TR-MUSIC algorithm to the windowed backscattered ultrasound signals corresponding to each sub-region. The algorithm is also structured to account for the ultrasound attenuation in the medium and the finite-size effects of ultrasound transducer elements. A modified TR-MUSIC imaging algorithm is used to account for ultrasound scattering from both density and compressibility contrasts. The phase response of ultrasound transducer elements is accounted for in a PC-MUSIC system.
Improved artificial bee colony algorithm for vehicle routing problem with time windows
Yan, Qianqian; Zhang, Mengjie; Yang, Yunong
2017-01-01
This paper investigates a well-known complex combinatorial problem known as the vehicle routing problem with time windows (VRPTW). Unlike the standard vehicle routing problem, each customer in the VRPTW is served within a given time constraint. This paper solves the VRPTW using an improved artificial bee colony (IABC) algorithm. The performance of this algorithm is improved by a local optimization based on a crossover operation and a scanning strategy. Finally, the effectiveness of the IABC is evaluated on some well-known benchmarks. The results demonstrate the power of IABC algorithm in solving the VRPTW. PMID:28961252
Validation of vision-based obstacle detection algorithms for low-altitude helicopter flight
NASA Technical Reports Server (NTRS)
Suorsa, Raymond; Sridhar, Banavar
1991-01-01
A validation facility being used at the NASA Ames Research Center is described which is aimed at testing vision-based obstacle detection and range estimation algorithms suitable for low-level helicopter flight. The facility is capable of processing hundreds of frames of calibrated multicamera 6 degree-of-freedom motion image sequences, generating calibrated multicamera laboratory images using convenient window-based software, and viewing range estimation results from different algorithms along with truth data using powerful window-based visualization software.
Research on the Diesel Engine with Sliding Mode Variable Structure Theory
NASA Astrophysics Data System (ADS)
Ma, Zhexuan; Mao, Xiaobing; Cai, Le
2018-05-01
This study constructed a nonlinear mathematical model of the diesel engine high-pressure common rail (HPCR) system through polynomial fitting and treated it as a kind of affine nonlinear system. Based on sliding-mode variable structure control (SMVSC) theory, a sliding-mode controller for affine nonlinear systems was designed to achieve control of the common rail pressure and the diesel engine's rotational speed. Finally, the designed nonlinear HPCR system was simulated on the MATLAB simulation platform. The simulation results demonstrated that the sliding-mode variable structure control algorithm shows favourable control performance, overcoming the shortcomings of traditional PID control in overshoot, parameter tuning, system precision, settling time and rise time.
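The abstract does not give the controller equations; the sketch below shows a generic first-order sliding-mode controller for a scalar affine nonlinear system tracking a piecewise-constant setpoint, which conveys the basic SMVSC idea. The plant functions, gains and reference are assumptions for illustration, not the paper's HPCR model.

```python
# Minimal sketch (illustrative only): classical sliding-mode control of a
# scalar affine nonlinear system x_dot = f(x) + g(x) u toward a setpoint r.
# f, g, the gain k and the reference profile are all assumed values.
import numpy as np

def f(x):            # assumed drift dynamics
    return -0.5 * x - 0.1 * x ** 2

def g(x):            # assumed (nonzero) input gain
    return 1.0 + 0.05 * x

def reference(t):    # assumed piecewise-constant setpoint (so r_dot = 0 a.e.)
    return 1.0 if t > 0.5 else 0.0

dt, k = 1e-3, 2.0
x, log = 0.0, []
for n in range(3000):
    t = n * dt
    r = reference(t)
    s = x - r                              # sliding variable (tracking error)
    u = (-f(x) - k * np.sign(s)) / g(x)    # reaching law gives s_dot = -k*sign(s)
    x += dt * (f(x) + g(x) * u)            # Euler step of the plant
    log.append((t, x, u))

print("final tracking error:", abs(log[-1][1] - reference(log[-1][0])))
```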
Control algorithms for dynamic windows for residential buildings
Firlag, Szymon; Yazdanian, Mehrangiz; Curcija, Charlie; ...
2015-09-30
This study analyzes the influence of control algorithms for dynamic windows on energy consumption, number of hours of retracted shades during daylight and shade operations. Five different control algorithms - heating/cooling, simple rules, perfect citizen, heat flow and predictive weather - were developed and compared. The performance of a typical residential building was modeled with EnergyPlus. The program WINDOW was used to generate a Bidirectional Scattering Distribution Function (BSDF) for two window configurations. The BSDF was exported to EnergyPlus using the IDF file format. The EMS feature in EnergyPlus was used to develop custom control algorithms. The calculations were made for four locations with diverse climates. The results showed that: (a) use of automated shading with the proposed control algorithms can reduce site energy in the range of 11.6-13.0% and source (primary) energy in the range of 20.1-21.6%, (b) the differences between algorithms in regard to energy savings are not high, (c) the differences between algorithms in regard to number of hours of retracted shades are visible, (d) the control algorithms have a strong influence on shade operation and oscillation of shades can occur, and (e) additional energy consumption caused by the motor, sensors and a small microprocessor in the analyzed case is very small.
Fault-tolerant nonlinear adaptive flight control using sliding mode online learning.
Krüger, Thomas; Schnetter, Philipp; Placzek, Robin; Vörsmann, Peter
2012-08-01
An expanded nonlinear model inversion flight control strategy using sliding mode online learning for neural networks is presented. The proposed control strategy is implemented for a small unmanned aircraft system (UAS). This class of aircraft is very susceptible to nonlinearities like atmospheric turbulence, model uncertainties and, of course, system failures. Therefore, these systems make a sensible testbed to evaluate fault-tolerant, adaptive flight control strategies. Within this work the concept of feedback linearization is combined with feedforward neural networks to compensate for inversion errors and other nonlinear effects. Backpropagation-based adaptation laws of the network weights are used for online training. Within these adaptation laws the standard gradient descent backpropagation algorithm is augmented with the concept of sliding mode control (SMC). Implemented as a learning algorithm, this nonlinear control strategy treats the neural network as a controlled system and allows a stable, dynamic calculation of the learning rates. While considering the system's stability, this robust online learning method therefore offers a higher speed of convergence, especially in the presence of external disturbances. The SMC-based flight controller is tested and compared with the standard gradient descent backpropagation algorithm in the presence of system failures. Copyright © 2012 Elsevier Ltd. All rights reserved.
Commowick, Olivier; Akhondi-Asl, Alireza; Warfield, Simon K.
2012-01-01
We present a new algorithm, called local MAP STAPLE, to estimate from a set of multi-label segmentations both a reference standard segmentation and spatially varying performance parameters. It is based on a sliding window technique to estimate the segmentation and the segmentation performance parameters for each input segmentation. In order to allow for optimal fusion from the small amount of data in each local region, and to account for the possibility of labels not being observed in a local region of some (or all) input segmentations, we introduce prior probabilities for the local performance parameters through a new Maximum A Posteriori formulation of STAPLE. Further, we propose an expression to compute confidence intervals in the estimated local performance parameters. We carried out several experiments with local MAP STAPLE to characterize its performance and value for local segmentation evaluation. First, with simulated segmentations with known reference standard segmentation and spatially varying performance, we show that local MAP STAPLE performs better than both STAPLE and majority voting. Then we present evaluations with data sets from clinical applications. These experiments demonstrate that spatial adaptivity in segmentation performance is an important property to capture. We compared the local MAP STAPLE segmentations to STAPLE, and to previously published fusion techniques and demonstrate the superiority of local MAP STAPLE over other state-of-the-art algorithms. PMID:22562727
Dynamic vehicle routing with time windows in theory and practice.
Yang, Zhiwei; van Osta, Jan-Paul; van Veen, Barry; van Krevelen, Rick; van Klaveren, Richard; Stam, Andries; Kok, Joost; Bäck, Thomas; Emmerich, Michael
2017-01-01
The vehicle routing problem is a classical combinatorial optimization problem. This work is about a variant of the vehicle routing problem with dynamically changing orders and time windows. In real-world applications often the demands change during operation time. New orders occur and others are canceled. In this case new schedules need to be generated on-the-fly. Online optimization algorithms for dynamical vehicle routing address this problem but so far they do not consider time windows. Moreover, to match the scenarios found in real-world problems adaptations of benchmarks are required. In this paper, a practical problem is modeled based on the procedure of daily routing of a delivery company. New orders by customers are introduced dynamically during the working day and need to be integrated into the schedule. A multiple ant colony algorithm combined with powerful local search procedures is proposed to solve the dynamic vehicle routing problem with time windows. The performance is tested on a new benchmark based on simulations of a working day. The problems are taken from Solomon's benchmarks but a certain percentage of the orders are only revealed to the algorithm during operation time. Different versions of the MACS algorithm are tested and a high performing variant is identified. Finally, the algorithm is tested in situ: In a field study, the algorithm schedules a fleet of cars for a surveillance company. We compare the performance of the algorithm to that of the procedure used by the company and we summarize insights gained from the implementation of the real-world study. The results show that the multiple ant colony algorithm can get a much better solution on the academic benchmark problem and also can be integrated in a real-world environment.
Fully automatic time-window selection using machine learning for global adjoint tomography
NASA Astrophysics Data System (ADS)
Chen, Y.; Hill, J.; Lei, W.; Lefebvre, M. P.; Bozdag, E.; Komatitsch, D.; Tromp, J.
2017-12-01
Selecting time windows from seismograms such that the synthetic measurements (from simulations) and measured observations are sufficiently close is indispensable in a global adjoint tomography framework. The increasing amount of seismic data collected everyday around the world demands "intelligent" algorithms for seismic window selection. While the traditional FLEXWIN algorithm can be "automatic" to some extent, it still requires both human input and human knowledge or experience, and thus is not deemed to be fully automatic. The goal of intelligent window selection is to automatically select windows based on a learnt engine that is built upon a huge number of existing windows generated through the adjoint tomography project. We have formulated the automatic window selection problem as a classification problem. All possible misfit calculation windows are classified as either usable or unusable. Given a large number of windows with a known selection mode (select or not select), we train a neural network to predict the selection mode of an arbitrary input window. Currently, the five features we extract from the windows are the cross-correlation value, cross-correlation time lag, amplitude ratio between observed and synthetic data, window length, and minimum STA/LTA value. More features can be included in the future. We use these features to characterize each window for training a multilayer perceptron neural network (MPNN). Training the MPNN is equivalent to solving a non-linear optimization problem. We use backpropagation to derive the gradient of the loss function with respect to the weighting matrices and bias vectors and use the mini-batch stochastic gradient method to iteratively optimize the MPNN. Numerical tests show that with a careful selection of the training data and a sufficient amount of training data, we are able to train a robust neural network that is capable of detecting the waveforms in arbitrary earthquake data with negligible detection error compared to existing selection methods (e.g. FLEXWIN). We will introduce in detail the mathematical formulation of the window-selection-oriented MPNN and show very encouraging results when applying the new algorithm to real earthquake data.
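A minimal sketch of the classification idea, assuming a tiny one-hidden-layer perceptron trained by mini-batch SGD on the five features listed above; the synthetic feature distributions, network size and hyper-parameters are invented for illustration and are not the project's actual training set or architecture.

```python
# Minimal sketch (not the project's code): one-hidden-layer perceptron for
# usable/unusable window classification from five assumed features.
import numpy as np

rng = np.random.default_rng(1)

def synth_windows(n):
    """Fake feature vectors; 'good' windows have high CC, small lag, etc."""
    y = rng.integers(0, 2, n)
    cc     = np.where(y == 1, rng.normal(0.9, 0.05, n), rng.normal(0.5, 0.2, n))
    lag    = np.where(y == 1, rng.normal(0.0, 1.0, n),  rng.normal(0.0, 5.0, n))
    ratio  = np.where(y == 1, rng.normal(1.0, 0.2, n),  rng.normal(1.0, 1.0, n))
    length = rng.normal(60.0, 20.0, n)
    stalta = np.where(y == 1, rng.normal(3.0, 0.5, n),  rng.normal(1.5, 0.5, n))
    X = np.column_stack([cc, lag, ratio, length, stalta])
    return (X - X.mean(0)) / X.std(0), y

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
X, y = synth_windows(4000)
W1 = rng.normal(0, 0.5, (5, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, 16);      b2 = 0.0

lr, batch = 0.1, 64
for epoch in range(30):
    idx = rng.permutation(len(y))
    for start in range(0, len(y), batch):
        sl = idx[start:start + batch]
        xb, yb = X[sl], y[sl]
        h = np.tanh(xb @ W1 + b1)                 # forward pass
        p = sigmoid(h @ W2 + b2)
        dlogit = (p - yb) / len(yb)               # gradient of cross-entropy
        dW2 = h.T @ dlogit; db2 = dlogit.sum()
        dh = np.outer(dlogit, W2) * (1 - h ** 2)  # backpropagate through tanh
        dW1 = xb.T @ dh; db1 = dh.sum(0)
        W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2

pred = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5
print("training accuracy:", (pred == y).mean())
```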
OSLG: A new granting scheme in WDM Ethernet passive optical networks
NASA Astrophysics Data System (ADS)
Razmkhah, Ali; Rahbar, Akbar Ghaffarpour
2011-12-01
Several granting schemes have been proposed to grant transmission windows and perform dynamic bandwidth allocation (DBA) in passive optical networks (PON). Generally, granting schemes suffer from bandwidth wastage of the granted windows. Here, we propose a new granting scheme for WDM Ethernet PONs, called optical network unit (ONU) Side Limited Granting (OSLG), that conserves upstream bandwidth, thus decreasing queuing delay and packet drop ratio. In OSLG, each ONU, instead of the optical line terminal (OLT), determines its transmission window. Two OSLG algorithms are proposed in this paper: the OSLG_GA algorithm, which determines the size of the transmission window in such a way that the bandwidth wastage problem is relieved, and the OSLG_SC algorithm, which saves unused bandwidth for more bandwidth utilization later on. OSLG can be used as the granting scheme of any DBA to provide better performance in terms of packet drop ratio and queuing delay. Our performance evaluations show the effectiveness of OSLG in reducing packet drop ratio and queuing delay under different DBA techniques.
getimages: Background derivation and image flattening method
NASA Astrophysics Data System (ADS)
Men'shchikov, Alexander
2017-05-01
getimages performs background derivation and image flattening for high-resolution images obtained with space observatories. It is based on median filtering with sliding windows corresponding to a range of spatial scales from the observational beam size up to a maximum structure width X. The latter is the single free parameter of getimages and can be evaluated manually from the observed image. The median filtering algorithm provides a background image for structures of all widths below X. The same median filtering procedure applied to an image of standard deviations derived from a background-subtracted image results in a flattening image. Finally, a flattened image is computed by dividing the background-subtracted image by the flattening image. Standard deviations in the flattened image are now uniform outside sources and filaments. Detecting structures in such radically simplified images results in much cleaner extractions that are more complete and reliable. getimages also reduces various observational and map-making artifacts and equalizes noise levels between independent tiles of mosaicked images. The code (a Bash script) uses FORTRAN utilities from getsources (ascl:1507.014), which must be installed.
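A minimal sketch of the median-filtering/flattening idea, assuming NumPy and SciPy are available; the synthetic image, window size X and the use of a median of absolute residuals as a stand-in for the local standard deviation are illustrative assumptions, not the getimages implementation (which is a Bash script calling Fortran utilities).

```python
# Minimal sketch (not getimages itself): background derivation by sliding-window
# median filtering, followed by flattening with a locally estimated noise image.
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(2)
ny, nx = 256, 256
yy, xx = np.mgrid[0:ny, 0:nx]

# Synthetic image: smooth background + compact "sources" + spatially varying noise.
background_true = 5.0 + 0.02 * xx
noise_rms = 0.5 + 1.5 * (xx / nx)
image = background_true + noise_rms * rng.standard_normal((ny, nx))
for _ in range(30):
    cy, cx = rng.integers(0, ny), rng.integers(0, nx)
    image += 20.0 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 2.0 ** 2))

X = 33                                                  # max structure width (pixels), assumed
background = median_filter(image, size=X)               # removes structures narrower than X
residual = image - background
flattening = median_filter(np.abs(residual), size=X)    # proxy for local standard deviation
flattened = residual / np.maximum(flattening, 1e-6)

print("residual std, left vs right half:",
      flattened[:, :nx // 2].std().round(2), flattened[:, nx // 2:].std().round(2))
```

After flattening, the two halves of the test image should show roughly the same residual scatter even though the input noise level doubled from left to right.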
Ares I-X In-Flight Modal Identification
NASA Technical Reports Server (NTRS)
Bartkowicz, Theodore J.; James, George H., III
2011-01-01
Operational modal analysis is a procedure that allows the extraction of modal parameters of a structure in its operating environment. It is based on the idealized premise that the input to the structure is white noise. In some cases, when free decay responses are corrupted by unmeasured random disturbances, the response data can be processed into cross-correlation functions that approximate free decay responses. Modal parameters can be computed from these functions by time domain identification methods such as the Eigensystem Realization Algorithm (ERA). The extracted modal parameters have the same characteristics as impulse response functions of the original system. Operational modal analysis is performed on Ares I-X in-flight data. Since the dynamic system is not stationary due to propellant mass loss, modal identification is only possible by analyzing the system as a series of linearized models over short periods of time via a sliding time window. A time-domain zooming technique was also employed to enhance the modal parameter extraction. Results of this study demonstrate that free-decay time domain modal identification methods can be successfully employed for in-flight launch vehicle modal extraction.
4. EXTERIOR OF SOUTH END OF BUILDING 104 SHOWING 1-LIGHT ...
4. EXTERIOR OF SOUTH END OF BUILDING 104 SHOWING 1-LIGHT SIDE EXIT DOOR AND ORIGINAL WOOD-FRAMED SLIDING GLASS KITCHEN WINDOWS AT PHOTO CENTER, AND TALL RUSTIC STYLE CHIMNEY WITH GABLE FRAME ON BACK WALL OF HOUSE. VIEW TO NORTHEAST. - Rush Creek Hydroelectric System, Worker Cottage, Rush Creek, June Lake, Mono County, CA
India's Vernacular Architecture as a Reflection of Culture.
ERIC Educational Resources Information Center
Masalski, Kathleen Woods
This paper contains the narrative for a slide presentation on the architecture of India. Through the narration, the geography and climate of the country and the social conditions of the Indian people are discussed. Roofs and windows are adapted for the hot, rainy climate, while the availability of building materials ranges from palm leaves to mud…
Field-programmable analogue arrays for the sensorless control of DC motors
NASA Astrophysics Data System (ADS)
Rivera, J.; Dueñas, I.; Ortega, S.; Del Valle, J. L.
2018-02-01
This work presents the analogue implementation of a sensorless controller for direct current motors based on the super-twisting (ST) sliding mode technique, by means of field programmable analogue arrays (FPAA). The novelty of this work is twofold: first, the use of the ST algorithm in a sensorless scheme for DC motors, and second, the implementation method for this type of sliding mode controller in FPAAs. The ST algorithm reduces the chattering problem produced by the deliberate use of the sign function in classical sliding mode approaches. On the other hand, the advantages of the implementation method over a digital one are that the controller is not digitally approximated, the controller gains are not fine-tuned, and the implementation does not require the use of analogue-to-digital and digital-to-analogue converter circuits. In addition, the FPAA is a reconfigurable technology with lower cost and power consumption. Simulation and experimental results were recorded, in which a more accurate transient response and lower power consumption were obtained with the proposed implementation method when compared to a digital implementation. Also, more accurate performance by the DC motor is obtained with the proposed sensorless ST technique when compared with a classical sliding mode approach.
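For readers unfamiliar with the super-twisting (ST) algorithm, the sketch below simulates it on a perturbed first-order sliding variable and compares the residual chattering with a classical sign-function controller; gains, disturbance and step size are assumed values, and the code is a software illustration rather than the paper's FPAA implementation.

```python
# Minimal sketch (illustrative only): super-twisting vs classical sliding mode
# on s_dot = u + d(t), with an assumed bounded disturbance d.
import numpy as np

dt, T = 1e-4, 2.0
k1, k2 = 8.0, 6.0                       # assumed ST gains (large enough for d below)
d = lambda t: 0.5 * np.sin(5 * t)       # assumed matched disturbance

def simulate(method):
    s, v, traj = 1.0, 0.0, []
    for n in range(int(T / dt)):
        t = n * dt
        if method == "super-twisting":
            u = -k1 * np.sqrt(abs(s)) * np.sign(s) + v
            v += dt * (-k2 * np.sign(s))          # continuous (integral) term
        else:                                      # classical first-order SMC
            u = -2.0 * np.sign(s)
        s += dt * (u + d(t))                       # Euler step of s_dot = u + d
        traj.append(s)
    return np.array(traj)

for name in ("super-twisting", "classical"):
    tail = simulate(name)[-2000:]                  # last 0.2 s, after convergence
    print(f"{name:14s} residual |s| max: {np.abs(tail).max():.2e}")
```

Because the discontinuous term sits under an integrator, the super-twisting control signal is continuous and the steady-state oscillation of s is far smaller than with the plain sign-function law at the same sampling period.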
Noise normalization and windowing functions for VALIDAR in wind parameter estimation
NASA Astrophysics Data System (ADS)
Beyon, Jeffrey Y.; Koch, Grady J.; Li, Zhiwen
2006-05-01
The wind parameter estimates from a state-of-the-art 2-μm coherent lidar system located at NASA Langley, Virginia, named VALIDAR (validation lidar), were compared after normalizing the noise by its estimated power spectra via the periodogram and the linear predictive coding (LPC) scheme. The power spectra and the Doppler shift estimates were the main parameter estimates for comparison. Different types of windowing functions were implemented in the VALIDAR data processing algorithm and their impact on the wind parameter estimates was observed. Time- and frequency-independent windowing functions such as Rectangular, Hanning, and Kaiser-Bessel and a time- and frequency-dependent apodized windowing function were compared. A brief overview of ongoing nonlinear algorithm development for Doppler shift correction follows.
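A minimal sketch of how the choice of windowing function enters such processing: the periodogram of a simulated single-tone return in noise is computed with several windows and the spectral-peak (Doppler) estimate compared. The sample rate, tone frequency and Kaiser parameter are assumptions, and the code is not the VALIDAR processing chain.

```python
# Minimal sketch (illustrative only): effect of the window function on the
# periodogram peak estimate of a single Doppler-like tone in white noise.
import numpy as np

fs, n = 100e6, 1024                      # assumed sample rate and record length
f_dopp = 12.3e6                          # assumed Doppler-shifted beat frequency
t = np.arange(n) / fs
rng = np.random.default_rng(3)
x = np.cos(2 * np.pi * f_dopp * t) + 1.0 * rng.standard_normal(n)

windows = {
    "rectangular": np.ones(n),
    "hanning": np.hanning(n),
    "hamming": np.hamming(n),
    "kaiser (beta=8)": np.kaiser(n, 8.0),
}
freqs = np.fft.rfftfreq(n, 1 / fs)
for name, w in windows.items():
    psd = np.abs(np.fft.rfft(x * w)) ** 2 / np.sum(w ** 2)   # window-power normalised
    est = freqs[np.argmax(psd)]
    print(f"{name:16s} peak estimate: {est/1e6:6.3f} MHz "
          f"(error {abs(est - f_dopp)/1e3:5.1f} kHz)")
```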
Adaptive Window Zero-Crossing-Based Instantaneous Frequency Estimation
NASA Astrophysics Data System (ADS)
Sekhar, S. Chandra; Sreenivas, TV
2004-12-01
We address the problem of estimating the instantaneous frequency (IF) of a real-valued constant-amplitude time-varying sinusoid. Estimation of polynomial IF is formulated using the zero-crossings of the signal. We propose an algorithm to estimate nonpolynomial IF by local approximation using a low-order polynomial, over a short segment of the signal. This involves the choice of window length to minimize the mean square error (MSE). The optimal window length found by directly minimizing the MSE is a function of the higher-order derivatives of the IF, which are not available a priori. However, an optimum solution is formulated using an adaptive window technique based on the concept of intersection of confidence intervals. The adaptive algorithm enables minimum MSE-IF (MMSE-IF) estimation without requiring a priori information about the IF. Simulation results show that the adaptive window zero-crossing-based IF estimation method is superior to fixed window methods and is also better than adaptive spectrogram and adaptive Wigner-Ville distribution (WVD)-based IF estimators for different signal-to-noise ratios (SNRs).
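A minimal sketch of zero-crossing-based IF estimation with a fixed window (the paper's adaptive window selection is not reproduced): zero-crossing instants are interpolated from a noiseless chirp, a low-order polynomial is fitted to the crossing phase in short segments, and its derivative gives the IF. All signal parameters are assumed for illustration.

```python
# Minimal sketch (not the authors' adaptive method): fixed-window zero-crossing
# IF estimation for a noiseless linear chirp.
import numpy as np

fs = 8000.0
t = np.arange(0, 1.0, 1 / fs)
f_inst = 500.0 + 200.0 * t                        # true IF in Hz (linear chirp)
phase = 2 * np.pi * np.cumsum(f_inst) / fs
x = np.cos(phase)

# Zero-crossing instants by linear interpolation between samples of opposite sign.
idx = np.where(np.signbit(x[:-1]) != np.signbit(x[1:]))[0]
tz = t[idx] - x[idx] * (t[idx + 1] - t[idx]) / (x[idx + 1] - x[idx])

# Phase advances by pi between consecutive crossings: fit phase(t) locally, differentiate.
k = np.arange(tz.size) * np.pi
win = 9                                            # crossings per local fit (assumed)
est_t, est_f = [], []
for i in range(0, tz.size - win, win):
    c = np.polyfit(tz[i:i + win], k[i:i + win], 2)   # low-order polynomial phase model
    tm = tz[i + win // 2]
    est_t.append(tm)
    est_f.append(np.polyval(np.polyder(c), tm) / (2 * np.pi))

err = np.array(est_f) - (500.0 + 200.0 * np.array(est_t))
print("max IF error (Hz):", np.abs(err).max().round(3))
```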
NASA Astrophysics Data System (ADS)
Prawin, J.; Rama Mohan Rao, A.
2018-01-01
The knowledge of dynamic loads acting on a structure is required for many practical engineering problems, such as structural strength analysis, health monitoring and fault diagnosis, and vibration isolation. In this paper, we present an online input force time history reconstruction algorithm using Dynamic Principal Component Analysis (DPCA) from acceleration time history response measurements using moving windows. We also present an optimal sensor placement algorithm to place limited sensors at dynamically sensitive spatial locations. The major advantage of the proposed input force identification algorithm is that it does not require finite element idealization of the structure, unlike earlier formulations, and is therefore free from physical modelling errors. We have considered three numerical examples to validate the accuracy of the proposed DPCA-based method. Effects of measurement noise, multiple force identification, different kinds of loading, incomplete measurements, and high noise levels are investigated in detail. Parametric studies have been carried out to arrive at the optimal window size and the percentage of window overlap. The studies presented in this paper clearly establish the merits of the proposed algorithm for online load identification.
Lee, Yu-Hao; Hsieh, Ya-Ju; Shiah, Yung-Jong; Lin, Yu-Huei; Chen, Chiao-Yun; Tyan, Yu-Chang; GengQiu, JiaCheng; Hsu, Chung-Yao; Chen, Sharon Chia-Ju
2017-01-01
To quantitate the meditation experience is a subjective and complex issue because it is confounded by many factors such as emotional state, method of meditation, and personal physical condition. In this study, we propose a strategy with a cross-sectional analysis to evaluate the meditation experience with 2 artificial intelligence techniques: artificial neural network and support vector machine. Within this analysis system, 3 features of the electroencephalography alpha spectrum and variant normalizing scaling are manipulated as the evaluating variables for the detection accuracy. Thereafter, by modulating the sliding window (the period of the analyzed data) and the shifting interval of the window (the time interval by which the analyzed data are shifted), the effect of immediate analysis for the 2 methods is compared. This analysis system is performed on 3 meditation groups, categorizing their meditation experiences in 10-year intervals from novice to junior and to senior. After exhaustive calculation and cross-validation across all variables, a high accuracy rate (>98%) is achievable under the criterion of a 0.5-minute sliding window and a 2-second shifting interval for both methods. In other words, the minimum analyzable data length is 0.5 minute and the minimum recognizable temporal resolution is 2 seconds in the decision of meditative classification. Our proposed classifier of the meditation experience provides a rapid evaluation system to distinguish meditation experience and a beneficial utilization of artificial intelligence techniques for big-data analysis. PMID:28422856
An algorithm for simulating fracture of cohesive-frictional materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nukala, Phani K; Sampath, Rahul S; Barai, Pallab
Fracture of disordered frictional granular materials is dominated by interfacial failure response that is characterized by de-cohesion followed by frictional sliding response. To capture such an interfacial failure response, we introduce a cohesive-friction random fuse model (CFRFM), wherein the cohesive response of the interface is represented by a linear stress-strain response until a failure threshold, which is then followed by a constant response at a threshold lower than the initial failure threshold to represent the interfacial frictional sliding mechanism. This paper presents an efficient algorithm for simulating fracture of such disordered frictional granular materials using the CFRFM. We note that, when applied to perfectly plastic disordered materials, our algorithm is both theoretically and numerically equivalent to the traditional tangent algorithm (Roux and Hansen 1992 J. Physique II 2 1007) used for such simulations. However, the algorithm is general and is capable of modeling discontinuous interfacial response. Our numerical simulations using the algorithm indicate that the local and global roughness exponents (ζ_loc and ζ, respectively) of the fracture surface are equal to each other, and the two-dimensional crack roughness exponent is estimated to be ζ_loc = ζ = 0.69 ± 0.03.
Advanced Interval Type-2 Fuzzy Sliding Mode Control for Robot Manipulator.
Hwang, Ji-Hwan; Kang, Young-Chang; Park, Jong-Wook; Kim, Dong W
2017-01-01
In this paper, advanced interval type-2 fuzzy sliding mode control (AIT2FSMC) for a robot manipulator is proposed. The proposed AIT2FSMC is a combination of an interval type-2 fuzzy system and sliding mode control. An interval type-2 fuzzy system is designed to resemble a feedback linearization (FL) control law, and a sliding mode controller is designed to compensate for the approximation error between the FL control law and the interval type-2 fuzzy system. The tuning algorithms are derived in the sense of the Lyapunov stability theorem. A two-link rigid robot manipulator with nonlinearity is used as the test system, and simulation results are presented to show the effectiveness of the proposed method, which can control the unknown system well.
Guo, Zongyi; Chang, Jing; Guo, Jianguo; Zhou, Jun
2018-06-01
This paper focuses on the adaptive twisting sliding mode control for the Hypersonic Reentry Vehicles (HRVs) attitude tracking issue. The HRV attitude tracking model is transformed into the error dynamics in matched structure, whereas an unmeasurable state is redefined by lumping the existing unmatched disturbance with the angular rate. Hence, an adaptive finite-time observer is used to estimate the unknown state. Then, an adaptive twisting algorithm is proposed for systems subject to disturbances with unknown bounds. The stability of the proposed observer-based adaptive twisting approach is guaranteed, and the case of noisy measurement is analyzed. Also, the developed control law avoids the aggressive chattering phenomenon of the existing adaptive twisting approaches because the adaptive gains decrease close to the disturbance once the trajectories reach the sliding surface. Finally, numerical simulations on the attitude control of the HRV are conducted to verify the effectiveness and benefit of the proposed approach. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
Detecting brain tumor in pathological slides using hyperspectral imaging
Ortega, Samuel; Fabelo, Himar; Camacho, Rafael; de la Luz Plaza, María; Callicó, Gustavo M.; Sarmiento, Roberto
2018-01-01
Hyperspectral imaging (HSI) is an emerging technology for medical diagnosis. This research work presents a proof-of-concept on the use of HSI data to automatically detect human brain tumor tissue in pathological slides. The samples, consisting of hyperspectral cubes collected from 400 nm to 1000 nm, were acquired from ten different patients diagnosed with high-grade glioma. Based on the diagnosis provided by pathologists, a spectral library of normal and tumor tissues was created and processed using three different supervised classification algorithms. Results prove that HSI is a suitable technique to automatically detect high-grade tumors from pathological slides. PMID:29552415
Krecsák, László; Micsik, Tamás; Kiszler, Gábor; Krenács, Tibor; Szabó, Dániel; Jónás, Viktor; Császár, Gergely; Czuni, László; Gurzó, Péter; Ficsor, Levente; Molnár, Béla
2011-01-18
The immunohistochemical detection of estrogen (ER) and progesterone (PR) receptors in breast cancer is routinely used for prognostic and predictive testing. Whole slide digitalization supported by dedicated software tools allows quantitation of image objects (e.g. cell membranes, nuclei) and an unbiased analysis of immunostaining results. Validation studies of image analysis applications for the detection of ER and PR in breast cancer specimens provided strong concordance between the pathologist's manual assessment of slides and scoring performed using different software applications. The effectiveness of two connected semi-automated image analysis software applications (the NuclearQuant v. 1.13 application for Pannoramic™ Viewer v. 1.14) for determination of ER and PR status in formalin-fixed paraffin-embedded breast cancer specimens immunostained with the automated Leica Bond Max system was studied. First, the detection algorithm was calibrated to the scores provided by an independent assessor (pathologist), using selected areas from 38 small digital slides (created from 16 cases) containing a mean number of 195 cells. Each cell was manually marked and scored according to the Allred system, which combines frequency and intensity scores. The performance of the calibrated algorithm was tested on 16 cases (14 invasive ductal carcinoma, 2 invasive lobular carcinoma) against the pathologist's manual scoring of digital slides. The detection was calibrated to 87 percent object detection agreement and almost perfect Total Score agreement (Cohen's kappa 0.859, quadratic weighted kappa 0.986), from slight or moderate agreement at the start of the study using the uncalibrated algorithm. The performance of the application was tested against the pathologist's manual scoring of digital slides on 53 regions of interest of 16 ER and PR slides covering all positivity ranges, and the quadratic weighted kappa provided almost perfect agreement (κ = 0.981) between the two scoring schemes. The NuclearQuant v. 1.13 application for Pannoramic™ Viewer v. 1.14 proved to be a reliable image analysis tool for pathologists testing ER and PR status in breast cancer.
Towards automatic patient selection for chemotherapy in colorectal cancer trials
NASA Astrophysics Data System (ADS)
Wright, Alexander; Magee, Derek; Quirke, Philip; Treanor, Darren E.
2014-03-01
A key factor in the prognosis of colorectal cancer, and its response to chemoradiotherapy, is the ratio of cancer cells to surrounding tissue (the so-called tumour:stroma ratio). Currently the tumour:stroma ratio is calculated manually, by examining H&E-stained slides and counting the proportion of the area of each. Virtual slides facilitate this analysis by allowing pathologists to annotate areas of tumour on a given digital slide image, and in-house developed stereometry tools mark random, systematic points on the slide, known as spots. These spots are examined and classified by the pathologist. Typical analyses require a pathologist to score at least 300 spots per tumour. This is a time-consuming (10-60 minutes per case) and laborious task for the pathologist, and automating this process is highly desirable. Using an existing dataset of expert-classified spots from one colorectal cancer clinical trial, an automated tumour:stroma detection algorithm has been trained and validated. Each spot is extracted as an image patch and then processed for feature extraction, identifying colour, texture, stain intensity and object characteristics. These features are used as training data for a random forest classification algorithm, and validated against unseen image patches. This process was repeated for multiple patch sizes. Over 82,000 such patches have been used, and results show an accuracy of 79%, depending on image patch size. A second study examining contextual requirements for pathologist scoring was conducted and indicates that further analysis of structures within each image patch is required in order to improve algorithm accuracy.
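A minimal sketch of the patch-classification step, assuming scikit-learn and NumPy are available: simple colour and texture features are computed for synthetic tumour/stroma patches, a random forest is trained, and a tumour:stroma ratio is formed from the predicted labels. The synthetic patches and feature set are invented for illustration and do not reproduce the study's features.

```python
# Minimal sketch (not the study's pipeline): random-forest classification of
# image patches into tumour vs stroma from simple colour/texture features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)

def synth_patch(label, size=64):
    """Fake H&E-like patch: tumour patches darker/purple, stroma lighter/pink."""
    base = np.array([150, 80, 170]) if label == 1 else np.array([220, 160, 190])
    return np.clip(base + rng.normal(0, 25, (size, size, 3)), 0, 255)

def features(patch):
    """Per-channel means/stds plus a crude texture measure (gradient energy)."""
    grad = np.abs(np.diff(patch.mean(axis=2), axis=0)).mean()
    return np.concatenate([patch.mean((0, 1)), patch.std((0, 1)), [grad]])

labels = rng.integers(0, 2, 600)                       # 1 = tumour, 0 = stroma
X = np.array([features(synth_patch(l)) for l in labels])
Xtr, Xte, ytr, yte = train_test_split(X, labels, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
pred = clf.predict(Xte)
print("patch accuracy:", (pred == yte).mean().round(3))
print("estimated tumour:stroma counts:", (pred == 1).sum(), ":", (pred == 0).sum())
```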
NASA Astrophysics Data System (ADS)
Litjens, G.; Ehteshami Bejnordi, B.; Timofeeva, N.; Swadi, G.; Kovacs, I.; Hulsbergen-van de Kaa, C.; van der Laak, J.
2015-03-01
Automated detection of prostate cancer in digitized H&E whole-slide images is an important first step for computer-driven grading. Most automated grading algorithms work on preselected image patches as they are too computationally expensive to run on the multi-gigapixel whole-slide images. An automated multi-resolution cancer detection system could reduce the computational workload for subsequent grading and quantification in two ways: by excluding areas of definitely normal tissue within a single specimen or by excluding entire specimens which do not contain any cancer. In this work we present a multi-resolution cancer detection algorithm geared towards the latter. The algorithm methodology is as follows: at a coarse resolution the system uses superpixels, color histograms and local binary patterns in combination with a random forest classifier to assess the likelihood of cancer. The five most suspicious superpixels are identified, and at a higher resolution more computationally expensive graph and gland features are added to refine classification for these superpixels. Our methods were evaluated on a data set of 204 digitized whole-slide H&E-stained images of MR-guided biopsy specimens from 163 patients. A pathologist exhaustively annotated the specimens for areas containing cancer. The performance of our system was evaluated using ten-fold cross-validation, stratified according to patient. Image-based receiver operating characteristic (ROC) analysis was subsequently performed, where a specimen containing cancer was considered positive and specimens without cancer negative. We obtained an area under the ROC curve of 0.96 and a specificity of 0.4 at a sensitivity of 1.0.
LIVING ROOM WITH HALL TO BEDROOMS AT FAR WALL. NOTE ...
LIVING ROOM WITH HALL TO BEDROOMS AT FAR WALL. NOTE FLOOR TO CEILING WINDOWS ON RIGHT AND SLIDING DOORS TO DINING ROOM ON LEFT. VIEW FACING SOUTHWEST - Camp H.M. Smith and Navy Public Works Center Manana Title VII (Capehart) Housing, Three-Bedroom Single-Family Type 7, Birch Circle, Elm Drive, Elm Circle, and Date Drive, Pearl City, Honolulu County, HI
24 CFR 3280.403 - Standard for windows and sliding glass doors used in manufactured homes.
Code of Federal Regulations, 2014 CFR
2014-04-01
... pressure tests must be conducted at the design wind loads required for components and cladding specified in... certification must be based on tests conducted at the design wind loads specified in § 3280.305(c)(1). (1) All... agency shall conduct pre-production specimen tests in accordance with AAMA 1701.2-95. Further, such...
7. PHOTOGRAPHIC COPY OF ORIGINAL CONSTRUCTION DRAWING, DATED 1918, HORIZONTAL ...
7. PHOTOGRAPHIC COPY OF ORIGINAL CONSTRUCTION DRAWING, DATED 1918, HORIZONTAL SLIDING WINDOW DETAIL, WAR DEPARTMENT, MANUAL OF THE CONSTRUCTION DIVISION OF THE ARMY, WAR EMERGENCY CONSTRUCTION, SECTION C, ENGINEERING DIVISION, PLATE 5, CONSOLIDATED SUPPLY COMPANY PRINTERS, WASHINGTON - Fort Bliss, 7th Cavalry Buildings, U.S. Army Air Defence Artillery Center & Fort Bliss, El Paso, El Paso County, TX
Diagnosing and ranking retinopathy disease level using diabetic fundus image recuperation approach.
Somasundaram, K; Rajendran, P Alli
2015-01-01
Retinal fundus images are widely used in diagnosing different types of eye diseases. Existing methods such as Feature Based Macular Edema Detection (FMED) and the Optimally Adjusted Morphological Operator (OAMO) effectively detected the presence of exudation in fundus images and identified the true positive ratio of exudate detection, respectively. These mechanically detected exudates, however, did not incorporate a more detailed feature selection technique into the system for the detection of diabetic retinopathy. To categorize the exudates, the Diabetic Fundus Image Recuperation (DFIR) method based on a sliding window approach is developed in this work to select the features of the optic cup in digital retinal fundus images. The DFIR feature selection uses a collection of sliding windows with varying range to obtain the features based on the histogram value using a Group Sparsity Non-overlapping Function. Using a support vector model in the second phase, the DFIR method based on a Spiral Basis Function effectively ranks the diabetic retinopathy disease level. The ranking of the disease level on each candidate set provides a promising result for developing a practically automated and assisted diabetic retinopathy diagnosis system. Experimental work on digital fundus images using the DFIR method evaluates factors such as sensitivity, ranking efficiency, and feature selection time.
Presentation Extensions of the SOAP
NASA Technical Reports Server (NTRS)
Carnright, Robert; Stodden, David; Coggi, John
2009-01-01
A set of extensions of the Satellite Orbit Analysis Program (SOAP) enables simultaneous and/or sequential presentation of information from multiple sources. SOAP is used in the aerospace community as a means of collaborative visualization and analysis of data on planned spacecraft missions. The following definitions of terms also describe the display modalities of SOAP as now extended: In SOAP terminology, View signifies an animated three-dimensional (3D) scene, two-dimensional still image, plot of numerical data, or any other visible display derived from a computational simulation or other data source; a) "Viewport" signifies a rectangular portion of a computer-display window containing a view; b) "Palette" signifies a collection of one or more viewports configured for simultaneous (split-screen) display in the same window; c) "Slide" signifies a palette with a beginning and ending time and an animation time step; and d) "Presentation" signifies a prescribed sequence of slides. For example, multiple 3D views from different locations can be crafted for simultaneous display and combined with numerical plots and other representations of data for both qualitative and quantitative analysis. The resulting sets of views can be temporally sequenced to convey visual impressions of a sequence of events for a planned mission.
Histopathological Image Analysis: A Review
Gurcan, Metin N.; Boucheron, Laura; Can, Ali; Madabhushi, Anant; Rajpoot, Nasir; Yener, Bulent
2010-01-01
Over the past decade, dramatic increases in computational power and improvements in image analysis algorithms have allowed the development of powerful computer-assisted analytical approaches to radiological data. With the recent advent of whole slide digital scanners, tissue histopathology slides can now be digitized and stored in digital image form. Consequently, digitized tissue histopathology has now become amenable to the application of computerized image analysis and machine learning techniques. Analogous to the role of computer-assisted diagnosis (CAD) algorithms in medical imaging to complement the opinion of a radiologist, CAD algorithms have begun to be developed for disease detection, diagnosis, and prognosis prediction to complement the opinion of the pathologist. In this paper, we review the recent state-of-the-art CAD technology for digitized histopathology. This paper also briefly describes the development and application of novel image analysis technology for a few specific histopathology-related problems being pursued in the United States and Europe. PMID:20671804
Robust control of electrostatic torsional micromirrors using adaptive sliding-mode control
NASA Astrophysics Data System (ADS)
Sane, Harshad S.; Yazdi, Navid; Mastrangelo, Carlos H.
2005-01-01
This paper presents high-resolution control of torsional electrostatic micromirrors beyond their inherent pull-in instability using robust sliding-mode control (SMC). The objectives of this paper are twofold: first, to demonstrate the applicability of SMC to MEMS devices; second, to present a modified SMC algorithm that yields improved control accuracy. SMC enables compact realization of a robust controller tolerant of device characteristic variations and nonlinearities. Robustness of the control loop is demonstrated through extensive simulations and measurements on MEMS devices with a wide range of characteristics. Control of two-axis gimbaled micromirrors beyond their pull-in instability with overall 10-bit pointing accuracy is confirmed experimentally. In addition, this paper presents an analysis of the sources of errors in the discrete-time implementation of the control algorithm. To minimize these errors, we present an adaptive version of the SMC algorithm that yields substantial performance improvement without considerably increasing implementation complexity.
Prevedello, Luciano M; Erdal, Barbaros S; Ryu, John L; Little, Kevin J; Demirer, Mutlu; Qian, Songyue; White, Richard D
2017-12-01
Purpose To evaluate the performance of an artificial intelligence (AI) tool using a deep learning algorithm for detecting hemorrhage, mass effect, or hydrocephalus (HMH) at non-contrast material-enhanced head computed tomographic (CT) examinations and to determine algorithm performance for detection of suspected acute infarct (SAI). Materials and Methods This HIPAA-compliant retrospective study was completed after institutional review board approval. A training and validation dataset of noncontrast-enhanced head CT examinations that comprised 100 examinations of HMH, 22 of SAI, and 124 of noncritical findings was obtained resulting in 2583 representative images. Examinations were processed by using a convolutional neural network (deep learning) using two different window and level configurations (brain window and stroke window). AI algorithm performance was tested on a separate dataset containing 50 examinations with HMH findings, 15 with SAI findings, and 35 with noncritical findings. Results Final algorithm performance for HMH showed 90% (45 of 50) sensitivity (95% confidence interval [CI]: 78%, 97%) and 85% (68 of 80) specificity (95% CI: 76%, 92%), with area under the receiver operating characteristic curve (AUC) of 0.91 with the brain window. For SAI, the best performance was achieved with the stroke window showing 62% (13 of 21) sensitivity (95% CI: 38%, 82%) and 96% (27 of 28) specificity (95% CI: 82%, 100%), with AUC of 0.81. Conclusion AI using deep learning demonstrates promise for detecting critical findings at noncontrast-enhanced head CT. A dedicated algorithm was required to detect SAI. Detection of SAI showed lower sensitivity in comparison to detection of HMH, but showed reasonable performance. Findings support further investigation of the algorithm in a controlled and prospective clinical setting to determine whether it can independently screen noncontrast-enhanced head CT examinations and notify the interpreting radiologist of critical findings. © RSNA, 2017 Online supplemental material is available for this article.
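A minimal sketch of the window/level preprocessing mentioned above: Hounsfield-unit data are clipped to a given window and rescaled to [0, 1] before being passed to a network. The specific centre/width values and the fake CT slice are assumptions for illustration.

```python
# Minimal sketch (not the study's model): applying "brain" and "stroke"
# window/level settings to Hounsfield-unit CT data before CNN input.
import numpy as np

def apply_window(hu, center, width):
    """Clip HU values to [center - width/2, center + width/2] and scale to [0, 1]."""
    lo, hi = center - width / 2.0, center + width / 2.0
    return (np.clip(hu, lo, hi) - lo) / (hi - lo)

rng = np.random.default_rng(5)
ct_slice = rng.normal(30, 60, (512, 512))        # fake HU slice for demonstration

brain = apply_window(ct_slice, center=40, width=80)    # conventional brain window
stroke = apply_window(ct_slice, center=35, width=8)    # narrow "stroke" window (assumed values)

# Each windowed image would become one input configuration for the network.
print(brain.min(), brain.max(), stroke.min(), stroke.max())
```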
Analysis of oil-pipeline distribution of multiple products subject to delivery time-windows
NASA Astrophysics Data System (ADS)
Jittamai, Phongchai
This dissertation defines the operational problems of, and develops solution methodologies for, the distribution of multiple products in an oil pipeline subject to delivery time-window constraints. A multiple-product oil pipeline is a pipeline system composed of pipes, pumps, valves and storage facilities used to transport different types of liquids. Typically, products delivered by pipelines are petroleum of different grades moving either from production facilities to refineries or from refineries to distributors. Time-windows, which are generally used in logistics and scheduling, are incorporated in this study. The distribution of multiple products in an oil pipeline subject to delivery time-windows is modeled as a multicommodity network flow structure and mathematically formulated. The main focus of this dissertation is the investigation of operating issues and problem complexity of single-source pipeline problems, and the provision of a solution methodology to compute an input schedule that yields the minimum total time violation of the due delivery time-windows. The problem is proved to be NP-complete. The heuristic approach, a reversed-flow algorithm, is developed based on pipeline flow reversibility to compute an input schedule for the pipeline problem. This algorithm runs in no more than O(T·E) time. This dissertation also extends the study to examine some operating attributes and the problem complexity of multiple-source pipelines. The multiple-source pipeline problem is also NP-complete. A heuristic algorithm modified from the one used in single-source pipeline problems is introduced. This algorithm can also be implemented in no more than O(T·E) time. Computational results are presented for both methodologies on randomly generated problem sets. The computational experience indicates that the reversed-flow algorithms provide good solutions in comparison with the optimal solutions. Only 25% of the problems tested were more than 30% greater than the optimal values, and approximately 40% of the tested problems were solved optimally by the algorithms.
Vehicle routing problem with time windows using natural inspired algorithms
NASA Astrophysics Data System (ADS)
Pratiwi, A. B.; Pratama, A.; Sa’diyah, I.; Suprajitno, H.
2018-03-01
The distribution of goods requires a strategy that minimizes the total cost of operational activities. However, several constraints have to be satisfied, namely the capacity of the vehicles and the service times of the customers. This Vehicle Routing Problem with Time Windows (VRPTW) is therefore a complex constrained problem. This paper proposes nature-inspired algorithms for dealing with the constraints of the VRPTW, involving the Bat Algorithm and Cat Swarm Optimization. The Bat Algorithm is hybridized with Simulated Annealing: the worst solution of the Bat Algorithm is replaced by the solution from Simulated Annealing. Cat Swarm Optimization, an algorithm based on the behavior of cats, is improved using the Crow Search Algorithm to achieve simpler and faster convergence. The computational results show that these algorithms perform well in finding the minimized total distance, and that a larger population yields better computational performance. The improved Cat Swarm Optimization with Crow Search gives better performance than the hybridization of the Bat Algorithm and Simulated Annealing in dealing with big data.
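A minimal sketch of the core feasibility test that any VRPTW metaheuristic, including those above, must perform: checking a candidate route against vehicle capacity and customer time windows. The instance data and Euclidean travel-time model are assumptions for illustration.

```python
# Minimal sketch (not the paper's algorithms): VRPTW route feasibility check
# under vehicle-capacity and customer time-window constraints.
from dataclasses import dataclass

@dataclass
class Customer:
    x: float
    y: float
    demand: float
    ready: float      # earliest service start
    due: float        # latest service start
    service: float    # service duration

def travel(a, b):
    return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5   # Euclidean travel time

def route_feasible(depot, customers, route, capacity):
    """Route is a list of customer indices served in order, starting from the depot."""
    load, time, prev = 0.0, 0.0, depot
    for i in route:
        c = customers[i]
        load += c.demand
        if load > capacity:
            return False                  # capacity exceeded
        time += travel(prev, c)
        time = max(time, c.ready)         # waiting is allowed if we arrive early
        if time > c.due:
            return False                  # time window violated
        time += c.service
        prev = c
    return True                           # (return trip to depot assumed unconstrained)

depot = Customer(0, 0, 0, 0, 1e9, 0)
customers = [Customer(2, 1, 5, 0, 20, 1), Customer(4, 3, 7, 5, 15, 1),
             Customer(1, 5, 4, 0, 8, 1)]
print(route_feasible(depot, customers, [0, 1, 2], capacity=20))  # violates a window
print(route_feasible(depot, customers, [2, 1, 0], capacity=20))  # feasible ordering
```

The same demands and time windows can be feasible or infeasible depending only on the visiting order, which is why the metaheuristics above spend most of their effort searching over orderings.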
NOTE: A BPF-type algorithm for CT with a curved PI detector
NASA Astrophysics Data System (ADS)
Tang, Jie; Zhang, Li; Chen, Zhiqiang; Xing, Yuxiang; Cheng, Jianping
2006-08-01
Helical cone-beam CT is used widely nowadays because of its rapid scan speed and efficient utilization of x-ray dose. Recently, an exact reconstruction algorithm for helical cone-beam CT was proposed (Zou and Pan 2004a Phys. Med. Biol. 49 941-59). The algorithm is referred to as a backprojection-filtering (BPF) algorithm. This BPF algorithm for a helical cone-beam CT with a flat-panel detector (FPD-HCBCT) requires minimum data within the Tam-Danielsson window and can naturally address the problem of ROI reconstruction from data truncated in both longitudinal and transversal directions. In practical CT systems, detectors are expensive and always take a very important position in the total cost. Hence, we work on an exact reconstruction algorithm for a CT system with a detector of the smallest size, i.e., a curved PI detector fitting the Tam-Danielsson window. The reconstruction algorithm is derived following the framework of the BPF algorithm. Numerical simulations are done to validate our algorithm in this study.
Han, Lianghao; Dong, Hua; McClelland, Jamie R; Han, Liangxiu; Hawkes, David J; Barratt, Dean C
2017-07-01
This paper presents a new hybrid biomechanical model-based non-rigid image registration method for lung motion estimation. In the proposed method, a patient-specific biomechanical modelling process captures major physically realistic deformations with explicit physical modelling of sliding motion, whilst a subsequent non-rigid image registration process compensates for small residuals. The proposed algorithm was evaluated with 10 4D CT datasets of lung cancer patients. The target registration error (TRE), defined as the Euclidean distance of landmark pairs, was significantly lower with the proposed method (TRE = 1.37 mm) than with biomechanical modelling (TRE = 3.81 mm) and intensity-based image registration without specific considerations for sliding motion (TRE = 4.57 mm). The proposed method achieved comparable accuracy to several recently developed intensity-based registration algorithms with sliding handling on the same datasets. A detailed comparison of the distributions of TREs with three non-rigid intensity-based algorithms showed that the proposed method performed especially well on estimating the displacement field of lung surface regions (mean TRE = 1.33 mm, maximum TRE = 5.3 mm). The effects of biomechanical model parameters (such as Poisson's ratio, friction and tissue heterogeneity) on displacement estimation were investigated. The potential of the algorithm in optimising biomechanical models of lungs through analysing the pattern of displacement compensation from the image registration process has also been demonstrated. Copyright © 2017 Elsevier B.V. All rights reserved.
Røge, Rasmus; Riber-Hansen, Rikke; Nielsen, Søren; Vyberg, Mogens
2016-07-01
Manual estimation of the Ki67 Proliferation Index (PI) in breast carcinoma classification is labor intensive and prone to intra- and interobserver variation. Standard Digital Image Analysis (DIA) has limitations due to issues with tumor cell identification. Recently, a computer algorithm, DIA based on Virtual Double Staining (VDS), which segments Ki67-positive and -negative tumor cells using digitally fused parallel cytokeratin (CK)- and Ki67-stained slides, has been introduced. In this study, we compare VDS with manual stereological counting of Ki67-positive and -negative cells and examine the impact of the physical distance between the parallel slides on the alignment of slides. TMAs, containing 140 cores of consecutively obtained breast carcinomas, were stained for CK and Ki67 using optimized staining protocols. By means of stereological principles, Ki67-positive and -negative cell profiles were counted in sampled areas and used for the estimation of PIs of the whole tissue core. The VDS principle was applied to both the same sampled areas and the whole tissue core. Additionally, five neighboring slides were stained for CK in order to examine the alignment algorithm. The correlation between manual counting and VDS in both sampled areas and the whole core was almost perfect (correlation coefficients above 0.97). Bland-Altman plots did not reveal any skewness in any data ranges. There was good agreement in alignment (>85%) between neighboring slides, whereas agreement decreased in non-neighboring slides. VDS gave similar results compared with manual counting using stereological principles. Introduction of this method into clinical and research practice may improve the accuracy and reproducibility of the Ki67 PI.
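A minimal sketch of the quantities compared in such studies: the Ki67 Proliferation Index computed from counts of positive and negative tumour-cell profiles, and a simple Bland-Altman summary of the difference between two counting methods. The paired counts are invented for illustration only.

```python
# Minimal sketch (not the VDS software): Ki67 PI from cell counts and a
# Bland-Altman style agreement summary between two counting methods.
import numpy as np

def proliferation_index(pos, neg):
    """Ki67 PI = positive tumour cells / all tumour cells, in percent."""
    return 100.0 * pos / (pos + neg)

# Assumed paired counts (manual stereology vs automated) for a few TMA cores.
manual = np.array([proliferation_index(p, n) for p, n in [(40, 160), (85, 115), (12, 188)]])
auto   = np.array([proliferation_index(p, n) for p, n in [(43, 157), (82, 118), (14, 186)]])

diff = auto - manual
print("mean difference (bias): %.2f percentage points" % diff.mean())
print("95%% limits of agreement: %.2f to %.2f"
      % (diff.mean() - 1.96 * diff.std(ddof=1), diff.mean() + 1.96 * diff.std(ddof=1)))
```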
Effect of window length on performance of the elbow-joint angle prediction based on electromyography
NASA Astrophysics Data System (ADS)
Triwiyanto; Wahyunggoro, Oyas; Adi Nugroho, Hanung; Herianto
2017-05-01
High performance of elbow-joint angle prediction is essential in the development of devices based on electromyography (EMG) control. The performance of the prediction depends on feature extraction parameters such as the window length. In this paper, we evaluated the effect of the window length on the performance of elbow-joint angle prediction. The prediction algorithm consists of zero-crossing feature extraction and a second-order Butterworth low-pass filter. The feature was extracted from the EMG signal using varying window lengths. The EMG signal was collected from the biceps muscle while the elbow was moved in flexion and extension. The subject performed the elbow motion while holding a 1-kg load and moved the elbow over different periods (12 seconds, 8 seconds and 6 seconds). The results indicated that the window length affected the performance of the prediction. A window length of 250 yielded the best performance of the prediction algorithm, with (mean±SD) root mean square error = 5.68%±1.53% and Pearson's correlation = 0.99±0.0059.
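A minimal sketch of the two processing stages named above, assuming NumPy and SciPy are available: a zero-crossing count is extracted from consecutive windows of a simulated EMG-like signal and the resulting feature sequence is smoothed with a second-order Butterworth low-pass filter. The synthetic signal, dead-zone threshold, window length and cut-off are assumptions.

```python
# Minimal sketch (not the authors' pipeline): zero-crossing feature over
# sliding EMG windows, smoothed with a 2nd-order Butterworth low-pass filter.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000                                    # assumed EMG sampling rate (Hz)
t = np.arange(0, 12, 1 / fs)
rng = np.random.default_rng(6)
envelope = 0.5 + 0.5 * np.sin(2 * np.pi * t / 12)       # slow flexion/extension cycle
emg = envelope * rng.standard_normal(t.size)             # amplitude-modulated noise model

def zero_crossings(seg, thresh=0.2):
    sign_change = np.signbit(seg[:-1]) != np.signbit(seg[1:])
    big_enough = np.abs(seg[:-1] - seg[1:]) > thresh      # dead-zone: ignore tiny crossings
    return np.count_nonzero(sign_change & big_enough)

window = 250                                              # samples per window (assumed)
feature = np.array([zero_crossings(emg[i:i + window])
                    for i in range(0, emg.size - window, window)])

b, a = butter(2, 0.1)                                     # 2nd-order low-pass (normalised cut-off)
smooth = filtfilt(b, a, feature.astype(float))
print("windows:", feature.size, "raw ZC range:", feature.min(), feature.max(),
      "smoothed range:", smooth.min().round(1), smooth.max().round(1))
```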
Rapid prototyping of update algorithm of discrete Fourier transform for real-time signal processing
NASA Astrophysics Data System (ADS)
Kakad, Yogendra P.; Sherlock, Barry G.; Chatapuram, Krishnan V.; Bishop, Stephen
2001-10-01
An algorithm is developed in the companion paper to update the existing DFT to represent the new data series that results when a new signal point is received. Updating the DFT in this way uses less computation than directly evaluating the DFT with the FFT algorithm, reducing the computational order by a factor of log2 N. The algorithm works in the presence of a data window function and supports the rectangular, split-triangular, Hanning, Hamming, and Blackman windows. In this paper, a hardware implementation of this algorithm using FPGA technology is outlined. Unlike traditional fully customized VLSI circuits, FPGAs represent a technological breakthrough in the industry. An FPGA implements thousands of gates of logic in a single IC chip and can be programmed by users at their site in a few seconds or less, depending on the type of device used. The risk is low and the development time is short. These advantages have made FPGAs very popular for rapid prototyping of algorithms in digital communication, digital signal processing, and image processing. Our paper addresses the related issues of implementing the design in a hardware description language and subsequently downloading it onto the programmable hardware chip.
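For the rectangular-window case, the idea of updating an existing DFT when a single new sample arrives can be illustrated with the standard sliding-DFT recurrence sketched below; this is a generic software sketch, not the paper's FPGA design, and the handling of the other window functions mentioned above is omitted.

```python
import numpy as np

def sliding_dft_update(X, x_old, x_new, N):
    """Update the length-N DFT X of a window after it slides by one sample.

    X_k(new) = (X_k(old) - x_old + x_new) * exp(+j*2*pi*k/N),
    which costs O(N) per new sample instead of O(N log N) for a full FFT.
    """
    k = np.arange(N)
    return (X - x_old + x_new) * np.exp(2j * np.pi * k / N)

# Quick check against a direct FFT of the shifted window
N = 64
x = np.random.randn(N + 1)
X_old = np.fft.fft(x[:N])
X_new = sliding_dft_update(X_old, x_old=x[0], x_new=x[N], N=N)
assert np.allclose(X_new, np.fft.fft(x[1:N + 1]))
```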
A Supervised Approach to Windowing Detection on Dynamic Networks
2017-07-01
Benjamin Fish, University of Illinois at Chicago. Using this framework, we introduce windowing algorithms that take a supervised approach: they leverage ground truth on training data to find a good windowing of the test data. We compare the supervised approach to previous approaches and several baselines on real data.
Kocher, L.F.
1958-10-01
A personnel dosimeter film badge made of plastic, with provision for a picture of the wearer and an internal slide containing photographic film that is sensitive to various radiations, is described. Four windows made of differing materials selectively attenuate alpha, beta, gamma rays, and neutrons so as to distinguish the particular type of radiation the wearer was subjected to. In addition, a lead shield has the identification number of the wearer perforated thereon so as to identify the film after processing. An internal magnetically actuated latch securely locks the slide within the body, and may be withdrawn only upon the external application of two strong magnetic forces in order to ensure that the wearer or other curious persons will not accidentally expose the film to visible light.
Alcantara, Luiz Carlos Junior; Cassol, Sharon; Libin, Pieter; Deforche, Koen; Pybus, Oliver G; Van Ranst, Marc; Galvão-Castro, Bernardo; Vandamme, Anne-Mieke; de Oliveira, Tulio
2009-07-01
Human immunodeficiency virus type-1 (HIV-1), hepatitis B and C and other rapidly evolving viruses are characterized by extremely high levels of genetic diversity. To facilitate diagnosis and the development of prevention and treatment strategies that efficiently target the diversity of these viruses, and other pathogens such as human T-lymphotropic virus type-1 (HTLV-1), human herpes virus type-8 (HHV8) and human papillomavirus (HPV), we developed a rapid high-throughput-genotyping system. The method involves the alignment of a query sequence with a carefully selected set of pre-defined reference strains, followed by phylogenetic analysis of multiple overlapping segments of the alignment using a sliding window. Each segment of the query sequence is assigned the genotype and sub-genotype of the reference strain with the highest bootstrap (>70%) and bootscanning (>90%) scores. Results from all windows are combined and displayed graphically using color-coded genotypes. The new Virus-Genotyping Tools provide accurate classification of recombinant and non-recombinant viruses and are currently being assessed for their diagnostic utility. They have been incorporated into several HIV drug resistance algorithms including the Stanford (http://hivdb.stanford.edu) and two European databases (http://www.umcutrecht.nl/subsite/spread-programme/ and http://www.hivrdb.org.uk/) and have been successfully used to genotype a large number of sequences in these and other databases. The tools are a PHP/JAVA web application and are freely accessible on a number of servers including: http://bioafrica.mrc.ac.za/rega-genotype/html/, http://lasp.cpqgm.fiocruz.br/virus-genotype/html/, http://jose.med.kuleuven.be/genotypetool/html/.
Forecasting Strategies for Predicting Peak Electric Load Days
NASA Astrophysics Data System (ADS)
Saxena, Harshit
Academic institutions spend thousands of dollars every month on their electric power consumption. Some of these institutions follow a demand-charge pricing structure, under which the amount a customer pays to the utility is decided based on the total energy consumed during the month, with an additional charge based on the highest average power load required by the customer over a moving window of time as decided by the utility. Therefore, it is crucial for these institutions to minimize the time periods where a high amount of electric load is demanded over a short duration of time. In order to reduce the peak loads and have more uniform energy consumption, it is imperative to predict when these peaks occur, so that appropriate mitigation strategies can be developed. The research work presented in this thesis was conducted for the Rochester Institute of Technology (RIT), where the demand charges are decided based on a 15-minute sliding window panned over the entire month. This case study makes use of different statistical and machine learning algorithms to develop a forecasting strategy for predicting the peak electric load days of the month. The proposed strategy was tested over a full year, from May 2015 to April 2016, during which a total of 57 peak days were observed. The model predicted a total of 74 peak days during this period; 40 of these were true positives, correctly identifying about 70 percent of the observed peak days. The results obtained with the proposed forecasting strategy are promising and demonstrate an annual savings potential of about $80,000 for a single submeter of RIT.
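The demand-charge quantity described, the highest average load over a 15-minute window slid across the month, can be computed from interval-metered data roughly as follows; the 1-minute sampling interval and the synthetic power series are assumptions for illustration.

```python
import numpy as np
import pandas as pd

# Assumed 1-minute power readings (kW) for one month
idx = pd.date_range("2015-05-01", "2015-05-31 23:59", freq="1min")
power_kw = pd.Series(np.random.uniform(200, 800, len(idx)), index=idx)

# Highest 15-minute average demand: a rolling mean, then the monthly maximum
rolling_15min = power_kw.rolling("15min").mean()
peak_demand_kw = rolling_15min.max()
peak_time = rolling_15min.idxmax()
print(f"Peak 15-min average demand: {peak_demand_kw:.1f} kW at {peak_time}")
```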
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Rui; Praggastis, Brenda L.; Smith, William P.
While streaming data have become increasingly more popular in business and research communities, semantic models and processing software for streaming data have not kept pace. Traditional semantic solutions have not addressed transient data streams. Semantic web languages (e.g., RDF, OWL) have typically addressed static data settings and linked data approaches have predominantly addressed static or growing data repositories. Streaming data settings have some fundamental differences; in particular, data are consumed on the fly and data may expire. Stream reasoning, a combination of stream processing and semantic reasoning, has emerged with the vision of providing "smart" processing of streaming data. C-SPARQL is a prominent stream reasoning system that handles semantic (RDF) data streams. Many stream reasoning systems including C-SPARQL use a sliding window and use data arrival time to evict data. For data streams that include expiration times, a simple arrival time scheme is inadequate if the window size does not match the expiration period. In this paper, we propose a cache-enabled, order-aware, ontology-based stream reasoning framework. This framework consumes RDF streams with expiration timestamps assigned by the streaming source. Our framework utilizes both arrival and expiration timestamps in its cache eviction policies. In addition, we introduce the notion of "semantic importance" which aims to address the relevance of data to the expected reasoning, thus enabling the eviction algorithms to be more context- and reasoning-aware when choosing what data to maintain for question answering. We evaluate this framework by implementing three different prototypes and utilizing five metrics. The trade-offs of deploying the proposed framework are also discussed.
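A toy sketch of a window cache whose eviction uses both arrival and expiration timestamps, in the spirit described above, is shown below; the class name, API and the min-of-two-deadlines policy are illustrative assumptions and do not reproduce C-SPARQL or the authors' prototypes.

```python
import heapq
import itertools
import time

class StreamWindowCache:
    """Toy cache for timestamped stream items.

    An item leaves the cache when it falls out of a fixed-length
    arrival-time window or when its source-assigned expiration time
    passes, whichever comes first.  This only illustrates the idea of
    combining arrival and expiration timestamps; it is not C-SPARQL code.
    """

    def __init__(self, window_seconds):
        self.window_seconds = window_seconds
        self._heap = []                       # (eviction_time, tie-breaker, item)
        self._counter = itertools.count()

    def add(self, item, expiration_time, now=None):
        now = time.time() if now is None else now
        eviction_time = min(now + self.window_seconds, expiration_time)
        heapq.heappush(self._heap, (eviction_time, next(self._counter), item))

    def active_items(self, now=None):
        now = time.time() if now is None else now
        while self._heap and self._heap[0][0] <= now:
            heapq.heappop(self._heap)         # evict expired or out-of-window items
        return [item for _, _, item in self._heap]

# Illustrative usage with explicit timestamps
cache = StreamWindowCache(window_seconds=10)
cache.add("triple-1", expiration_time=105.0, now=100.0)   # expires before window ends
cache.add("triple-2", expiration_time=200.0, now=100.0)   # window ends first
print(cache.active_items(now=106.0))                       # -> ['triple-2']
```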
Activity Recognition on Streaming Sensor Data.
Krishnan, Narayanan C; Cook, Diane J
2014-02-01
Many real-world applications that focus on addressing the needs of a human require information about the activities being performed by the human in real time. While advances in pervasive computing have led to the development of wireless and non-intrusive sensors that can capture the necessary activity information, current activity recognition approaches have so far experimented on either a scripted or pre-segmented sequence of sensor events related to activities. In this paper we propose and evaluate a sliding window based approach to perform activity recognition in an online or streaming fashion, recognizing activities as and when new sensor events are recorded. To account for the fact that different activities can be best characterized by different window lengths of sensor events, we incorporate time-decay and mutual-information-based weighting of sensor events within a window. Additional contextual information, in the form of the previous activity and the activity of the previous window, is also appended to the feature describing a sensor window. The experiments conducted to evaluate these techniques on real-world smart home datasets suggest that combining mutual-information-based weighting of sensor events with past contextual information in the feature leads to the best performance for streaming activity recognition.
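One way to picture a time-decay weighting of sensor events inside a window is the exponential scheme sketched below; the decay constant and the simple per-sensor count feature are illustrative assumptions, and the mutual-information weighting used in the paper is not reproduced.

```python
import numpy as np

def decayed_sensor_counts(event_sensors, event_times, n_sensors, decay=0.1):
    """Weighted per-sensor counts for one window of sensor events.

    Events closer in time to the most recent event (the one being classified)
    contribute more: weight = exp(-decay * (t_last - t_i)).
    """
    event_times = np.asarray(event_times, dtype=float)
    weights = np.exp(-decay * (event_times[-1] - event_times))
    feature = np.zeros(n_sensors)
    for sensor_id, w in zip(event_sensors, weights):
        feature[sensor_id] += w
    return feature

# Window of 6 events from 4 sensors, timestamps in seconds
feature = decayed_sensor_counts([0, 2, 2, 1, 3, 0], [0, 5, 9, 14, 20, 26], n_sensors=4)
print(feature)
```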
lcps: Light curve pre-selection
NASA Astrophysics Data System (ADS)
Schlecker, Martin
2018-05-01
lcps searches for transit-like features (i.e., dips) in photometric data. Its main purpose is to restrict large sets of light curves to a number of files that show interesting behavior, such as drops in flux. While lcps is adaptable to any format of time series, its I/O module is designed specifically for photometry of the Kepler spacecraft. It extracts the pre-conditioned PDCSAP data from light curve files created by the standard Kepler pipeline. It can also handle CSV-formatted ASCII files. lcps uses a sliding window technique to compare a section of flux time series with its surroundings. A dip is detected if the flux within the window is lower than a threshold fraction of the surrounding fluxes.
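The dip criterion described, comparing the flux inside a sliding window with the flux in its surroundings, can be sketched as follows; the window size, surround size and threshold are placeholders rather than lcps defaults.

```python
import numpy as np

def find_dips(flux, win=20, surround=100, threshold=0.99):
    """Return start indices of windows whose median flux drops below
    `threshold` times the median flux of the surrounding points."""
    dips = []
    for start in range(surround, len(flux) - win - surround):
        window = flux[start:start + win]
        around = np.concatenate((flux[start - surround:start],
                                 flux[start + win:start + win + surround]))
        if np.median(window) < threshold * np.median(around):
            dips.append(start)
    return dips

# Synthetic light curve with a transit-like 2% dip
flux = np.ones(2000) + np.random.normal(scale=1e-3, size=2000)
flux[900:940] *= 0.98
print(find_dips(flux)[:5])
```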
Hu, B; Dixon, P C; Jacobs, J V; Dennerlein, J T; Schiffman, J M
2018-04-11
The aim of this study was to investigate if a machine learning algorithm utilizing triaxial accelerometer, gyroscope, and magnetometer data from an inertial measurement unit (IMU) could detect surface- and age-related differences in walking. Seventeen older (71.5 ± 4.2 years) and eighteen young (27.0 ± 4.7 years) healthy adults walked over flat and uneven brick surfaces wearing an IMU over the L5 vertebra. IMU data were binned into smaller data segments using 4-s sliding windows with 1-s step lengths. Ninety percent of the data were used as training inputs and the remaining ten percent were saved for testing. A deep learning network with long short-term memory units was used for training (fully supervised), prediction, and implementation. Four models were trained using the following inputs: all nine channels from every sensor in the IMU (fully trained model), accelerometer signals alone, gyroscope signals alone, and magnetometer signals alone. The fully trained models for surface and age outperformed all other models (area under the receiver operator curve, AUC = 0.97 and 0.96, respectively; p ≤ .045). The fully trained models for surface and age had high accuracy (96.3, 94.7%), precision (96.4, 95.2%), recall (96.3, 94.7%), and f1-score (96.3, 94.6%). These results demonstrate that processing the signals of a single IMU device with machine-learning algorithms enables the detection of surface conditions and age-group status from an individual's walking behavior which, with further learning, may be utilized to facilitate identifying and intervening on fall risk. Copyright © 2018 Elsevier Ltd. All rights reserved.
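Binning a continuous IMU recording into 4-s windows with 1-s steps, as described, reduces to simple index arithmetic once the sampling rate is fixed; the 100 Hz rate and nine-channel layout below are assumptions for illustration.

```python
import numpy as np

def sliding_windows(signal, fs, window_s=4.0, step_s=1.0):
    """Cut a (samples, channels) array into overlapping windows.

    Returns an array of shape (n_windows, window_samples, channels).
    """
    win = int(window_s * fs)
    step = int(step_s * fs)
    starts = range(0, signal.shape[0] - win + 1, step)
    return np.stack([signal[s:s + win] for s in starts])

fs = 100                                  # assumed IMU sampling rate (Hz)
imu = np.random.randn(60 * fs, 9)         # 60 s of 9-channel IMU data
segments = sliding_windows(imu, fs)       # shape: (57, 400, 9)
print(segments.shape)
```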
Extended volume coverage in helical cone-beam CT by using PI-line based BPF algorithm
NASA Astrophysics Data System (ADS)
Cho, Seungryong; Pan, Xiaochuan
2007-03-01
We compared the data requirements of filtered-backprojection (FBP) and backprojection-filtration (BPF) algorithms based on PI-lines in helical cone-beam CT. Since the filtration step of the FBP algorithm needs all the projection data along the PI-lines for each view, the required detector size must be larger than the size that covers the Tam-Danielsson (T-D) window in order to avoid data truncation. The BPF algorithm, however, requires projection data only within the T-D window, which means a smaller detector can be used to reconstruct the same image than with FBP. In other words, for a fixed detector size, a longer helical pitch can be obtained with the BPF algorithm without any truncation artifacts. The purpose of this work is to demonstrate numerically that extended volume coverage in helical cone-beam CT can be achieved by using the PI-line-based BPF algorithm.
Robust image modeling techniques with an image restoration application
NASA Astrophysics Data System (ADS)
Kashyap, Rangasami L.; Eom, Kie-Bum
1988-08-01
A robust parameter-estimation algorithm for a nonsymmetric half-plane (NSHP) autoregressive model, where the driving noise is a mixture of a Gaussian and an outlier process, is presented. The convergence of the estimation algorithm is proved. An algorithm to estimate parameters and original image intensity simultaneously from the impulse-noise-corrupted image, where the model governing the image is not available, is also presented. The robustness of the parameter estimates is demonstrated by simulation. Finally, an algorithm to restore realistic images is presented. The entire image generally does not obey a simple image model, but a small portion (e.g., 8 x 8) of the image is assumed to obey an NSHP model. The original image is divided into windows and the robust estimation algorithm is applied for each window. The restoration algorithm is tested by comparing it to traditional methods on several different images.
Information Processing Research.
1986-09-01
Fragmentary report excerpt. Cited work: The 3D MOSAIC Scene Understanding System, in Alan Bundy (Ed.), Proceedings of the Eighth International Joint Conference on Artificial Intelligence; Artificial Intelligence 17(1-3):409-460, August 1981. Recoverable passages describe interpreting a single picture as a projection of a three-dimensional scene onto two dimensions, and detecting outlier values by computing the distribution of values over a sliding 80 msec window.
High Frequency Adaptive Instability Suppression Controls in a Liquid-Fueled Combustor
NASA Technical Reports Server (NTRS)
Kopasakis, George
2003-01-01
This effort extends into high frequency (>500 Hz) an earlier developed adaptive control algorithm for the suppression of thermo-acoustic instabilities in a liquid-fueled combustor. The earlier work covered the development of a control algorithm for the suppression of a low frequency (280 Hz) combustion instability based on simulations, with no hardware testing involved. The work described here includes changes to the simulation and controller design necessary to control the high frequency instability, augmentations to the control algorithm to improve its performance, and finally hardware testing and results with an experimental combustor rig developed for the high frequency case. The Adaptive Sliding Phasor Averaged Control (ASPAC) algorithm modulates the fuel flow in the combustor with a control phase that continuously slides back and forth within the phase region that reduces the amplitude of the instability. The results demonstrate the power of the method: it can identify and suppress the instability even when the instability amplitude is buried in the noise of the combustor pressure. The successful testing of the ASPAC approach helped complete an important NASA milestone to demonstrate advanced technologies for low-emission combustors.
Towards developing robust algorithms for solving partial differential equations on MIMD machines
NASA Technical Reports Server (NTRS)
Saltz, Joel H.; Naik, Vijay K.
1988-01-01
Methods for efficient computation of numerical algorithms on a wide variety of MIMD machines are proposed. These techniques reorganize the data dependency patterns to improve the processor utilization. The model problem finds the time-accurate solution to a parabolic partial differential equation discretized in space and implicitly marched forward in time. The algorithms are extensions of Jacobi and SOR. The extensions consist of iterating over a window of several timesteps, allowing efficient overlap of computation with communication. The methods increase the degree to which work can be performed while data are communicated between processors. The effect of the window size and of domain partitioning on the system performance is examined both by implementing the algorithm on a simulated multiprocessor system.
Towards developing robust algorithms for solving partial differential equations on MIMD machines
NASA Technical Reports Server (NTRS)
Saltz, J. H.; Naik, V. K.
1985-01-01
Methods for efficient computation of numerical algorithms on a wide variety of MIMD machines are proposed. These techniques reorganize the data dependency patterns to improve the processor utilization. The model problem finds the time-accurate solution to a parabolic partial differential equation discretized in space and implicitly marched forward in time. The algorithms are extensions of Jacobi and SOR. The extensions consist of iterating over a window of several timesteps, allowing efficient overlap of computation with communication. The methods increase the degree to which work can be performed while data are communicated between processors. The effect of the window size and of domain partitioning on the system performance is examined both by implementing the algorithm on a simulated multiprocessor system.
Evaluation of beam tracking strategies for the THOR-CSW solar wind instrument
NASA Astrophysics Data System (ADS)
De Keyser, Johan; Lavraud, Benoit; Prech, Lubomir; Neefs, Eddy; Berkenbosch, Sophie; Beeckman, Bram; Maggiolo, Romain; Fedorov, Andrei; Baruah, Rituparna; Wong, King-Wah; Amoros, Carine; Mathon, Romain; Génot, Vincent
2017-04-01
We compare different beam tracking strategies for the Cold Solar Wind (CSW) plasma spectrometer on the ESA M4 THOR mission candidate. The goal is to intelligently select the energy and angular windows the instrument samples and to adapt these windows as the solar wind properties evolve, with the aim of maximizing the velocity distribution acquisition rate while maintaining excellent energy and angular resolution. Using synthetic data constructed from high-cadence measurements by the Faraday cup instrument on the Spektr-R mission (30 ms resolution), we test the performance of energy beam tracking with or without angular beam tracking. The algorithm can be fed either by data acquired by the plasma spectrometer during the previous measurement cycle or by data from another instrument, in this case the Faraday Cup (FAR) instrument foreseen on THOR. We verify how these beam tracking algorithms behave for different sizes of the energy and angular windows, and for different data integration times, in order to assess the limitations of the algorithm and to avoid situations in which the algorithm loses track of the beam.
WordCluster: detecting clusters of DNA words and genomic elements
2011-01-01
Background Many k-mers (or DNA words) and genomic elements are known to be spatially clustered in the genome. Well-established examples are genes, TFBSs, CpG dinucleotides, microRNA genes and ultra-conserved non-coding regions. Currently, no algorithm exists to find these clusters in a statistically comprehensible way. The detection of clustering often relies on densities and sliding-window approaches or arbitrarily chosen distance thresholds. Results We introduce here an algorithm to detect clusters of DNA words (k-mers), or any other genomic element, based on the distance between consecutive copies and an assigned statistical significance. We implemented the method into a web server connected to a MySQL backend, which also determines the co-localization with gene annotations. We demonstrate the usefulness of this approach by detecting the clusters of CAG/CTG (cytosine contexts that can be methylated in undifferentiated cells), showing that the degree of methylation varies drastically between the inside and outside of the clusters. As another example, we used WordCluster to search for statistically significant clusters of olfactory receptor (OR) genes in the human genome. Conclusions WordCluster seems to predict biologically meaningful clusters of DNA words (k-mers) and genomic entities. The implementation of the method into a web server is available at http://bioinfo2.ugr.es/wordCluster/wordCluster.php including additional features like the detection of co-localization with gene regions or the annotation enrichment tool for functional analysis of overlapped genes. PMID:21261981
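The core idea, grouping consecutive copies of a word whose inter-copy distances are small, can be sketched as below; the fixed gap threshold stands in for the statistical significance assignment that WordCluster actually uses.

```python
def kmer_positions(sequence, kmer):
    """Start positions of every (possibly overlapping) copy of `kmer`."""
    positions, start = [], sequence.find(kmer)
    while start != -1:
        positions.append(start)
        start = sequence.find(kmer, start + 1)
    return positions

def cluster_positions(positions, max_gap):
    """Group positions whose distance to the previous copy is <= max_gap."""
    if not positions:
        return []
    clusters, current = [], [positions[0]]
    for pos in positions[1:]:
        if pos - current[-1] <= max_gap:
            current.append(pos)
        else:
            if len(current) > 1:
                clusters.append((current[0], current[-1]))
            current = [pos]
    if len(current) > 1:
        clusters.append((current[0], current[-1]))
    return clusters

seq = "ATCAGCAGCAGCAGTT" + "GATTACA" * 40 + "CAGCAGCAG"
print(cluster_positions(kmer_positions(seq, "CAG"), max_gap=6))
```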
Fast words boundaries localization in text fields for low quality document images
NASA Astrophysics Data System (ADS)
Ilin, Dmitry; Novikov, Dmitriy; Polevoy, Dmitry; Nikolaev, Dmitry
2018-04-01
The paper examines the problem of precise localization of word boundaries in document text zones. Document processing on a mobile device consists of document localization, perspective correction, localization of individual fields, finding words in separate zones, segmentation and recognition. When an image is captured with a mobile digital camera under uncontrolled conditions, digital noise, perspective distortions or glare may occur. Further document processing is complicated by document specifics: layout elements, complex backgrounds, static text, document security elements, and a variety of text fonts. However, the problem of word boundary localization has to be solved at runtime on a mobile CPU with limited computing capabilities under the specified restrictions. At the moment, there are several groups of methods optimized for different conditions. Methods for scanned printed text are quick but limited to high-quality images. Methods for text in the wild have an excessively high computational complexity and are thus hardly suitable for running on mobile devices as part of a mobile document recognition system. The method presented in this paper solves a more specialized problem than the task of finding text in natural images. It uses local features, a sliding window and a lightweight neural network in order to achieve an optimal algorithm speed-precision ratio. The algorithm takes 12 ms per field running on the ARM processor of a mobile device. The error rate for boundary localization on a test sample of 8000 fields is 0.3
NASA Astrophysics Data System (ADS)
Wixson, Steve E.
1990-07-01
Transparent Volume Imaging began with the stereo x-ray in 1895 and ended for most investigators when radiation safety concerns eliminated the second view. Today, similar images can be generated by the computer without safety hazards, providing improved perception and new means of image quantification. A volumetric workstation is under development based on an operational prototype. The workstation consists of multiple symbolic and numeric processors, a binocular stereo color display generator with large image memory and liquid crystal shutter, voice input and output, a 3D pointer that uses projection lenses so that structures in 3D space can be touched directly, 3D hard copy using vectograph and lenticular printing, and presentation facilities using stereo 35 mm slide and stereo video tape projection. Volumetric software includes a volume window manager, Mayo Clinic's Analyze program and our Digital Stereo Microscope (DSM) algorithms. The DSM uses stereo x-ray-like projections, rapidly oscillating motion and focal depth cues such that detail can be studied in the spatial context of the entire set of data. Focal depth cues are generated with a lens and aperture algorithm that generates a plane of sharp focus, and multiple stereo pairs, each with a different plane of sharp focus, are generated and stored in the large memory for interactive selection using a physical or symbolic depth selector. More recent work is studying non-linear focusing. Psychophysical studies are underway to understand how people perceive images on a volumetric display and how accurately three-dimensional structures can be quantitated from these displays.
C4I Community of Interest C2 Roadmap
2015-03-24
Briefing slide content (fragmentary): QoS-based services; digital policy-based prioritization; dynamic bandwidth allocation; automated network management; co-site mitigation; LPD/LPI communications; increased range; increased loss tolerance and recovery; mobile ad hoc networking. Technology areas: algorithms and software; systems and processes; networks and communications (radios and apertures, networks, information).
NASA Astrophysics Data System (ADS)
Kwon, Hyuk Ju; Yeon, Sang Hun; Lee, Keum Ho; Lee, Kwang Ho
2018-02-01
As studies on building energy saving continue, studies utilizing renewable energy sources instead of fossil fuels are needed. In particular, studies on solar energy are being carried out in the field of building science; to utilize solar energy effectively, the solar radiation entering the indoor space should be admitted or blocked appropriately. Blinds are a typical solar radiation control device capable of controlling the indoor thermal and light environments. However, slat-type blinds are usually controlled manually, which has a negative effect on building energy saving. In this regard, studies on the automatic control of slat-type blinds have been carried out over the last couple of decades. This study therefore aims to provide preliminary data for optimal control research by controlling the slat angle of slat-type blinds while comprehensively considering various input variables. The window-to-wall ratio and window orientation were selected as input variables. It was found that the optimal control algorithm differed with window-to-wall ratio and window orientation. In addition, by applying the developed algorithms in simulations and comparing the building energy saving performance for each condition, energy savings of up to 20.7 % were shown in the cooling period and up to 12.3 % in the heating period. Furthermore, the building energy saving effect was greater as the window-to-wall ratio increased for the same orientation, and the effect of window-to-wall ratio was higher in the cooling period than in the heating period.
Teo, Troy P; Ahmed, Syed Bilal; Kawalec, Philip; Alayoubi, Nadia; Bruce, Neil; Lyn, Ethan; Pistorius, Stephen
2018-02-01
The accurate prediction of intrafraction lung tumor motion is required to compensate for system latency in image-guided adaptive radiotherapy systems. The goal of this study was to identify an optimal prediction model that has a short learning period so that prediction and adaptation can commence soon after treatment begins, and requires minimal reoptimization for individual patients. Specifically, the feasibility of predicting tumor position using a combination of a generalized (i.e., averaged) neural network, optimized using historical patient data (i.e., tumor trajectories) obtained offline, coupled with the use of real-time online tumor positions (obtained during treatment delivery) was examined. A 3-layer perceptron neural network was implemented to predict tumor motion for a prediction horizon of 650 ms. A backpropagation algorithm and batch gradient descent approach were used to train the model. Twenty-seven 1-min lung tumor motion samples (selected from a CyberKnife patient dataset) were sampled at a rate of 7.5 Hz (0.133 s) to emulate the frame rate of an electronic portal imaging device (EPID). A sliding temporal window was used to sample the data for learning. The sliding window length was set to be equivalent to the first breathing cycle detected from each trajectory. Performing a parametric sweep, an averaged error surface of mean square errors (MSE) was obtained from the prediction responses of seven trajectories used for the training of the model (Group 1). An optimal input data size and number of hidden neurons were selected to represent the generalized model. To evaluate the prediction performance of the generalized model on unseen data, twenty tumor traces (Group 2) that were not involved in the training of the model were used for the leave-one-out cross-validation purposes. An input data size of 35 samples (4.6 s) and 20 hidden neurons were selected for the generalized neural network. An average sliding window length of 28 data samples was used. The average initial learning period prior to the availability of the first predicted tumor position was 8.53 ± 1.03 s. Average mean absolute error (MAE) of 0.59 ± 0.13 mm and 0.56 ± 0.18 mm were obtained from Groups 1 and 2, respectively, giving an overall MAE of 0.57 ± 0.17 mm. Average root-mean-square-error (RMSE) of 0.67 ± 0.36 for all the traces (0.76 ± 0.34 mm, Group 1 and 0.63 ± 0.36 mm, Group 2), is comparable to previously published results. Prediction errors are mainly due to the irregular periodicities between cycles. Since the errors from Groups 1 and 2 are within the same range, it demonstrates that this model can generalize and predict on unseen data. This is a first attempt to use an averaged MSE error surface (obtained from the prediction of different patients' tumor trajectories) to determine the parameters of a generalized neural network. This network could be deployed as a plug-and-play predictor for tumor trajectory during treatment delivery, eliminating the need for optimizing individual networks with pretreatment patient data. © 2017 American Association of Physicists in Medicine.
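A simplified version of the prediction set-up, a window of past positions fed to a small perceptron that outputs the position a fixed horizon ahead, could look like the sketch below; the sine-like surrogate trace, the use of scikit-learn and the train/test split are assumptions, while the 35-sample window, 20 hidden neurons and roughly 5-sample (650 ms at 7.5 Hz) horizon follow the abstract.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

FS = 7.5        # imaging rate (Hz)
HORIZON = 5     # ~650 ms ahead at 7.5 Hz
WINDOW = 35     # input samples (~4.6 s)

def make_dataset(trace, window=WINDOW, horizon=HORIZON):
    """Pairs of (last `window` positions, position `horizon` samples later)."""
    X, y = [], []
    for i in range(window, len(trace) - horizon):
        X.append(trace[i - window:i])
        y.append(trace[i + horizon])
    return np.asarray(X), np.asarray(y)

# Surrogate breathing-like tumour trace (mm); real data would come from imaging
t = np.arange(0, 60, 1.0 / FS)
trace = 10 * np.sin(2 * np.pi * t / 4.0) + np.random.normal(scale=0.3, size=t.size)

X, y = make_dataset(trace)
model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
model.fit(X[:300], y[:300])                 # "learning period"
pred = model.predict(X[300:])
print("MAE (mm):", np.abs(pred - y[300:]).mean())
```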
Control of equipment isolation system using wavelet-based hybrid sliding mode control
NASA Astrophysics Data System (ADS)
Huang, Shieh-Kung; Loh, Chin-Hsiung
2017-04-01
Critical non-structural equipment, including life-saving equipment in hospitals, circuit breakers, computers, high-technology instrumentation, etc., is vulnerable to strong earthquakes, and the failure of such vibration-sensitive equipment causes severe economic loss. In order to protect vibration-sensitive equipment or machinery against strong earthquakes, various innovative control algorithms have been developed to determine the control forces to be applied. These new or improved control strategies, such as algorithms based on optimal control theory and sliding mode control (SMC), have also been developed for structural engineering as a key element in smart structure technology. Optimal control theory, one of the most common methodologies in feedback control, finds control forces that achieve a certain optimality criterion by minimizing a cost function. For example, the linear-quadratic regulator (LQR) has been the most popular control algorithm over the past three decades, and a number of modifications have been proposed to increase the efficiency of the classical LQR algorithm. However, despite its simplicity and ease of implementation, LQR is susceptible to parameter uncertainty and modeling error due to the complex nature of civil structures. In contrast to LQR control, SMC, a robust and easily implemented control algorithm, has also been studied. SMC is a nonlinear control methodology that forces the structural system to slide along surfaces or boundaries; hence this control algorithm is naturally robust with respect to parametric uncertainties of a structure. Early attempts at protecting vibration-sensitive equipment were based on the use of the existing control algorithms described above. In recent years, however, researchers have tried to renew existing control algorithms or develop new ones adapted to the complex nature of civil structures, including the control of both structures and non-structural components. The aim of this paper is to develop a hybrid control algorithm for the simultaneous control of structures and equipment that overcomes the limitations of classical feedback control by combining the advantages of classical LQR and SMC. To suppress vibrations whose frequency content during strong earthquakes differs from the natural frequencies of civil structures, the hybrid control algorithm is integrated with a wavelet-based vibration control algorithm. The performance of the classical, hybrid, and wavelet-based hybrid control algorithms, as well as the responses of the structure and non-structural components, are evaluated and discussed through numerical simulation in this study.
Computed Tomography Window Blending: Feasibility in Thoracic Trauma.
Mandell, Jacob C; Wortman, Jeremy R; Rocha, Tatiana C; Folio, Les R; Andriole, Katherine P; Khurana, Bharti
2018-02-07
This study aims to demonstrate the feasibility of processing computed tomography (CT) images with a custom window blending algorithm that combines soft-tissue, bone, and lung window settings into a single image; to compare the time for interpretation of chest CT for thoracic trauma with window blending and conventional window settings; and to assess diagnostic performance of both techniques. Adobe Photoshop was scripted to process axial DICOM images from retrospective contrast-enhanced chest CTs performed for trauma with a window-blending algorithm. Two emergency radiologists independently interpreted the axial images from 103 chest CTs with both blended and conventional windows. Interpretation time and diagnostic performance were compared with Wilcoxon signed-rank test and McNemar test, respectively. Agreement with Nexus CT Chest injury severity was assessed with the weighted kappa statistic. A total of 13,295 images were processed without error. Interpretation was faster with window blending, resulting in a 20.3% time saving (P < .001), with no difference in diagnostic performance, within the power of the study to detect a difference in sensitivity of 5% as determined by post hoc power analysis. The sensitivity of the window-blended cases was 82.7%, compared to 81.6% for conventional windows. The specificity of the window-blended cases was 93.1%, compared to 90.5% for conventional windows. All injuries of major clinical significance (per Nexus CT Chest criteria) were correctly identified in all reading sessions, and all negative cases were correctly classified. All readers demonstrated near-perfect agreement with injury severity classification with both window settings. In this pilot study utilizing retrospective data, window blending allows faster preliminary interpretation of axial chest CT performed for trauma, with no significant difference in diagnostic performance compared to conventional window settings. Future studies would be required to assess the utility of window blending in clinical practice. Copyright © 2018 The Association of University Radiologists. All rights reserved.
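A hedged sketch of the general idea of window blending, mapping the same Hounsfield-unit data through soft-tissue, lung and bone window settings and averaging the results, is given below; the window centers/widths and the equal-weight blend are common textbook values and assumptions, not the authors' Photoshop script.

```python
import numpy as np

def apply_window(hu, center, width):
    """Map Hounsfield units to [0, 1] with a linear window transfer function."""
    lo, hi = center - width / 2.0, center + width / 2.0
    return np.clip((hu - lo) / (hi - lo), 0.0, 1.0)

def blend_windows(hu):
    """Average soft-tissue, lung, and bone windowed renderings of a CT slice."""
    soft = apply_window(hu, center=50, width=400)
    lung = apply_window(hu, center=-600, width=1500)
    bone = apply_window(hu, center=300, width=1500)
    return (soft + lung + bone) / 3.0

# Synthetic slice in HU; a real slice would come from the DICOM pixel data
slice_hu = np.random.randint(-1000, 1500, size=(512, 512)).astype(float)
blended = blend_windows(slice_hu)        # values in [0, 1], ready for display
```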
Novitsky, Vlad; Moyo, Sikhulile; Lei, Quanhong; DeGruttola, Victor; Essex, M
2015-05-01
To improve the methodology of HIV cluster analysis, we addressed how analysis of HIV clustering is associated with parameters that can affect the outcome of viral clustering. The extent of HIV clustering and tree certainty was compared between 401 HIV-1C near full-length genome sequences and subgenomic regions retrieved from the LANL HIV Database. Sliding window analysis was based on 99 windows of 1,000 bp and 45 windows of 2,000 bp. Potential associations between the extent of HIV clustering and sequence length and the number of variable and informative sites were evaluated. The near full-length genome HIV sequences showed the highest extent of HIV clustering and the highest tree certainty. At the bootstrap threshold of 0.80 in maximum likelihood (ML) analysis, 58.9% of near full-length HIV-1C sequences but only 15.5% of partial pol sequences (ViroSeq) were found in clusters. Among HIV-1 structural genes, pol showed the highest extent of clustering (38.9% at a bootstrap threshold of 0.80), although it was significantly lower than in the near full-length genome sequences. The extent of HIV clustering was significantly higher for sliding windows of 2,000 bp than 1,000 bp. We found a strong association between the sequence length and proportion of HIV sequences in clusters, and a moderate association between the number of variable and informative sites and the proportion of HIV sequences in clusters. In HIV cluster analysis, the extent of detectable HIV clustering is directly associated with the length of viral sequences used, as well as the number of variable and informative sites. Near full-length genome sequences could provide the most informative HIV cluster analysis. Selected subgenomic regions with a high extent of HIV clustering and high tree certainty could also be considered as a second choice.
Novitsky, Vlad; Moyo, Sikhulile; Lei, Quanhong; DeGruttola, Victor
2015-01-01
Abstract To improve the methodology of HIV cluster analysis, we addressed how analysis of HIV clustering is associated with parameters that can affect the outcome of viral clustering. The extent of HIV clustering and tree certainty was compared between 401 HIV-1C near full-length genome sequences and subgenomic regions retrieved from the LANL HIV Database. Sliding window analysis was based on 99 windows of 1,000 bp and 45 windows of 2,000 bp. Potential associations between the extent of HIV clustering and sequence length and the number of variable and informative sites were evaluated. The near full-length genome HIV sequences showed the highest extent of HIV clustering and the highest tree certainty. At the bootstrap threshold of 0.80 in maximum likelihood (ML) analysis, 58.9% of near full-length HIV-1C sequences but only 15.5% of partial pol sequences (ViroSeq) were found in clusters. Among HIV-1 structural genes, pol showed the highest extent of clustering (38.9% at a bootstrap threshold of 0.80), although it was significantly lower than in the near full-length genome sequences. The extent of HIV clustering was significantly higher for sliding windows of 2,000 bp than 1,000 bp. We found a strong association between the sequence length and proportion of HIV sequences in clusters, and a moderate association between the number of variable and informative sites and the proportion of HIV sequences in clusters. In HIV cluster analysis, the extent of detectable HIV clustering is directly associated with the length of viral sequences used, as well as the number of variable and informative sites. Near full-length genome sequences could provide the most informative HIV cluster analysis. Selected subgenomic regions with a high extent of HIV clustering and high tree certainty could also be considered as a second choice. PMID:25560745
Optimal tracking and second order sliding power control of the DFIG wind turbine
NASA Astrophysics Data System (ADS)
Abdeddaim, S.; Betka, A.; Charrouf, O.
2017-02-01
In the present paper, the optimal operation of a grid-connected variable speed wind turbine equipped with a Doubly Fed Induction Generator (DFIG) is presented. The proposed cascaded nonlinear controller is designed to perform two main objectives. In the outer loop, a maximum power point tracking (MPPT) algorithm based on fuzzy logic theory is designed to continuously extract the optimal aerodynamic energy, whereas in the inner loop, a second-order sliding mode control (2-SM) is applied to achieve smooth regulation of both the stator active and reactive power quantities. The obtained simulation results show permanent tracking of the MPP regardless of the turbine power-speed slope; moreover, the proposed sliding mode control strategy presents attractive features, such as chattering-free operation, compared to the conventional first-order sliding technique (1-SM).
Smith predictor based-sliding mode controller for integrating processes with elevated deadtime.
Camacho, Oscar; De la Cruz, Francisco
2004-04-01
An approach to control integrating processes with elevated deadtime using a Smith predictor sliding mode controller is presented. A PID sliding surface and an integrating first-order plus deadtime model have been used to synthesize the controller. Since the performance of existing controllers with a Smith predictor decreases in the presence of modeling errors, this paper presents a simple approach to combining the Smith predictor with the sliding mode concept, which is a proven, simple, and robust procedure. The proposed scheme has a set of tuning equations as a function of the characteristic parameters of the model. For implementation of our proposed approach, computer-based industrial controllers that execute PID algorithms can be used. The performance and robustness of the proposed controller are compared with the Matausek-Micić scheme for linear systems using simulations.
Automated detection of tuberculosis on sputum smeared slides using stepwise classification
NASA Astrophysics Data System (ADS)
Divekar, Ajay; Pangilinan, Corina; Coetzee, Gerrit; Sondh, Tarlochan; Lure, Fleming Y. M.; Kennedy, Sean
2012-03-01
Routine visual screening for the identification of tuberculosis (TB) bacilli in stained sputum slides under a microscope is a tedious, labor-intensive task and can miss up to 50% of TB cases. Based on the Shannon cofactor expansion of Boolean functions for classification, a stepwise classification (SWC) algorithm is developed to remove different types of false positives, one type at a time, and to increase the detection of TB bacilli at different concentrations. Both bacilli and non-bacilli objects are first analyzed and classified into several different categories, including scanty positive, high-concentration positive, and several non-bacilli categories: small bright objects, beaded objects, dim elongated objects, etc. The morphological and contrast features are extracted based on a priori clinical knowledge. The SWC is composed of several individual classifiers. The classifier that increases the bacilli counts utilizes an adaptive algorithm based on a microbiologist's statistical heuristic decision process. The classifier that reduces false positives is developed by minimizing a binary decision tree that classifies different types of true and false positives based on feature vectors. Finally, the detection algorithm was tested on 102 independent confirmed negative and 74 positive cases. A multi-class task analysis shows accordance rates for negative, scanty, and high-concentration cases of 88.24%, 56.00%, and 97.96%, respectively. A binary-class task analysis using a receiver operating characteristic method with the area under the curve (Az) is also utilized to analyze the performance of this detection algorithm, showing superior detection performance on the high-concentration cases (Az=0.913) and on cases mixing high-concentration and scanty bacilli (Az=0.878).
NASA Astrophysics Data System (ADS)
Kong, Xiangxi; Zhang, Xueliang; Chen, Xiaozhe; Wen, Bangchun; Wang, Bo
2016-05-01
In this paper, phase and speed synchronization control of four eccentric rotors (ERs) driven by induction motors in a linear vibratory feeder with unknown time-varying load torques is studied. Firstly, the electromechanical coupling model of the linear vibratory feeder is established by associating the induction motor model with the dynamic model of the system, which is a typical underactuated model. According to the characteristics of the linear vibratory feeder, the complex control problem of the underactuated electromechanical coupling model is converted into phase and speed synchronization control of the four ERs. In order to keep the four ERs operating synchronously with zero phase differences, phase and speed synchronization controllers are designed by employing an adaptive sliding mode control (ASMC) algorithm via a modified master-slave structure. The stability of the controllers is proved by the Lyapunov stability theorem. The proposed controllers are verified by simulation in Matlab/Simulink and compared with the conventional sliding mode control (SMC) algorithm. The results show that the proposed controllers can reject the time-varying load torques effectively and the four ERs can operate synchronously with zero phase differences. Moreover, the control performance is better than the conventional SMC algorithm and the chattering phenomenon is attenuated. Furthermore, the effects of reference speed and parametric perturbations are discussed to show the strong robustness of the proposed controllers. Finally, experiments on a simple vibratory test bench are carried out with the proposed controllers and without control, respectively, to further validate the effectiveness of the proposed controllers.
NASA Technical Reports Server (NTRS)
Moore, J. E.
1975-01-01
An enumeration algorithm is presented for solving a scheduling problem similar to the single machine job shop problem with sequence dependent setup times. The scheduling problem differs from the job shop problem in two ways. First, its objective is to select an optimum subset of the available tasks to be performed during a fixed period of time. Secondly, each task scheduled is constrained to occur within its particular scheduling window. The algorithm is currently being used to develop typical observational timelines for a telescope that will be operated in earth orbit. Computational times associated with timeline development are presented.
Visual Recognition Software for Binary Classification and its Application to Pollen Identification
NASA Astrophysics Data System (ADS)
Punyasena, S. W.; Tcheng, D. K.; Nayak, A.
2014-12-01
An underappreciated source of uncertainty in paleoecology is the uncertainty of palynological identifications. The confidence of any given identification is not regularly reported in published results, so it cannot be incorporated into subsequent meta-analyses. Automated identification systems potentially provide a means of objectively measuring the confidence of a given count or single identification, as well as a mechanism for increasing sample sizes and throughput. We developed the software ARLO (Automated Recognition with Layered Optimization) to tackle difficult visual classification problems such as pollen identification. ARLO applies pattern recognition and machine learning to the analysis of pollen images. The features that the system discovers are not the traditional features of pollen morphology. Instead, general purpose image features, such as pixel lines and grids of different dimensions, size, spacing, and resolution, are used. ARLO adapts to a given problem by searching for the most effective combination of feature representation and learning strategy. We present a two-phase approach which uses our machine learning process to first segment pollen grains from the background and then classify pollen pixels and report species ratios. We conducted two separate experiments that utilized two distinct sets of algorithms and optimization procedures. The first analysis focused on reconstructing black and white spruce pollen ratios, training and testing our classification model at the slide level. This allowed us to directly compare our automated counts and expert counts to slides of known spruce ratios. Our second analysis focused on maximizing classification accuracy at the individual pollen grain level. Instead of predicting ratios of given slides, we predicted the species represented in a given image window. The resulting analysis was more scalable, as we were able to adapt the most efficient parts of the methodology from our first analysis. ARLO was able to distinguish between the pollen of black and white spruce with an accuracy of ~83.61%. This compared favorably to human expert performance. At the writing of this abstract, we are also experimenting with the analysis of higher-diversity samples, including modern tropical pollen material collected from ground pollen traps.
McMahon, Ryan; Berbeco, Ross; Nishioka, Seiko; Ishikawa, Masayori; Papiez, Lech
2008-09-01
An MLC control algorithm for delivering intensity modulated radiation therapy (IMRT) to targets that are undergoing two-dimensional (2D) rigid motion in the beam's eye view (BEV) is presented. The goal of this method is to deliver 3D-derived fluence maps over a moving patient anatomy. Target motion measured prior to delivery is first used to design a set of planned dynamic-MLC (DMLC) sliding-window leaf trajectories. During actual delivery, the algorithm relies on real-time feedback to compensate for target motion that does not agree with the motion measured during planning. The methodology is based on an existing one-dimensional (1D) algorithm that uses on-the-fly intensity calculations to appropriately adjust the DMLC leaf trajectories in real-time during exposure delivery [McMahon et al., Med. Phys. 34, 3211-3223 (2007)]. To extend the 1D algorithm's application to 2D target motion, a real-time leaf-pair shifting mechanism has been developed. Target motion that is orthogonal to leaf travel is tracked by appropriately shifting the positions of all MLC leaves. The performance of the tracking algorithm was tested for a single beam of a fractionated IMRT treatment, using a clinically derived intensity profile and a 2D target trajectory based on measured patient data. Comparisons were made between 2D tracking, 1D tracking, and no tracking. The impact of the tracking lag time and the frequency of real-time imaging were investigated. A study of the dependence of the algorithm's performance on the level of agreement between the motion measured during planning and delivery was also included. Results demonstrated that tracking both components of the 2D motion (i.e., parallel and orthogonal to leaf travel) results in delivered fluence profiles that are superior to those obtained by tracking only the component of motion parallel to leaf travel. Tracking lag time effects may lead to relatively large intensity delivery errors compared to the other sources of error investigated. However, the algorithm presented is robust in the sense that it does not rely on a high level of agreement between the target motion measured during treatment planning and delivery.
Wide-Range Motion Estimation Architecture with Dual Search Windows for High Resolution Video Coding
NASA Astrophysics Data System (ADS)
Dung, Lan-Rong; Lin, Meng-Chun
This paper presents a memory-efficient motion estimation (ME) technique for high-resolution video compression. The main objective is to reduce external memory accesses, especially when local memory resources are limited. Reducing memory accesses in turn reduces the associated power consumption. The key to reducing memory accesses is a center-biased algorithm that performs the motion vector (MV) search with the minimum amount of search data. To exploit data reusability, the proposed dual-search-windowing (DSW) approach uses a secondary search window only when the search requires it. By doing so, the loading of search windows is alleviated, which reduces the required external memory bandwidth. The proposed techniques can save up to 81% of the external memory bandwidth and require only 135 MBytes/sec, while the quality degradation is less than 0.2 dB for 720p HDTV clips coded at 8 Mbits/sec.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kan, Monica W.K., E-mail: kanwkm@ha.org.hk; Department of Physics and Materials Science, City University of Hong Kong, Hong Kong; Leung, Lucullus H.T.
2013-01-01
Purpose: To assess the dosimetric implications for the intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy with RapidArc (RA) of nasopharyngeal carcinomas (NPC) due to the use of the Acuros XB (AXB) algorithm versus the anisotropic analytical algorithm (AAA). Methods and Materials: Nine-field sliding window IMRT and triple-arc RA plans produced for 12 patients with NPC using AAA were recalculated using AXB. The dose distributions to multiple planning target volumes (PTVs) with different prescribed doses and critical organs were compared. The PTVs were separated into components in bone, air, and tissue. The change of doses by AXB due to air and bone, and the variation of the amount of dose changes with number of fields was also studied using simple geometric phantoms. Results: Using AXB instead of AAA, the averaged mean dose to PTV70 (70 Gy was prescribed to PTV70) was found to be 0.9% and 1.2% lower for IMRT and RA, respectively. It was approximately 1% lower in tissue, 2% lower in bone, and 1% higher in air. The averaged minimum dose to PTV70 in bone was approximately 4% lower for both IMRT and RA, whereas it was approximately 1.5% lower for PTV70 in tissue. The decrease in target doses estimated by AXB was mostly contributed from the presence of bone, less from tissue, and none from air. A similar trend was observed for PTV60 (60 Gy was prescribed to PTV60). The doses to most serial organs were found to be 1% to 3% lower and to other organs 4% to 10% lower for both techniques. Conclusions: The use of the AXB algorithm is highly recommended for IMRT and RapidArc planning for NPC cases.
Pre-Launch Performance Assessment of the VIIRS Land Surface Temperature Environmental Data Record
NASA Astrophysics Data System (ADS)
Hauss, B.; Ip, J.; Agravante, H.
2009-12-01
The Visible/Infrared Imager Radiometer Suite (VIIRS) Land Surface Temperature (LST) Environmental Data Record (EDR) provides the surface temperature of land surface including coastal and inland-water pixels at VIIRS moderate resolution (750m) during both day and night. To predict the LST under optimal conditions, the retrieval algorithm utilizes a dual split-window approach with both Short-wave Infrared (SWIR) channels at 3.70 µm (M12) and 4.05 µm (M13), and Long-wave Infrared (LWIR) channels at 10.76 µm (M15) and 12.01 µm (M16) to correct for atmospheric water vapor. Under less optimal conditions, the algorithm uses a fallback split-window approach with M15 and M16 channels. By comparison, the MODIS generalized split-window algorithm only uses the LWIR bands in the retrieval of surface temperature because of the concern for both solar contamination and large emissivity variations in the SWIR bands. In this paper, we assess whether these concerns are real and whether there is an impact on the precision and accuracy of the LST retrieval. The algorithm relies on the VIIRS Cloud Mask IP for identifying cloudy and ocean pixels, the VIIRS Surface Type EDR for identifying the IGBP land cover type for the pixels, and the VIIRS Aerosol Optical Thickness (AOT) IP for excluding pixels with AOT greater than 1.0. In this paper, we will report the pre-launch performance assessment of the LST EDR based on global synthetic data and proxy data from Terra MODIS. Results of both the split-window and dual split-window algorithms will be assessed by comparison either to synthetic "truth" or results of the MODIS retrieval. We will also show that the results of the assessment with proxy data are consistent with those obtained using the global synthetic data.
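A generic split-window retrieval of the fallback kind mentioned above combines the two LWIR brightness temperatures and the surface emissivities in a regression of roughly the following form; the coefficients are placeholders that would normally be fitted against radiative-transfer simulations and are not the VIIRS values.

```python
import numpy as np

def split_window_lst(t11, t12, emis11, emis12, coeffs):
    """Generic split-window land surface temperature (K).

    t11, t12  : brightness temperatures of the ~11 and ~12 um channels (K)
    emis11/12 : surface emissivities in the two channels
    coeffs    : regression coefficients b0..b6 (placeholders here)
    """
    b0, b1, b2, b3, b4, b5, b6 = coeffs
    eps = 0.5 * (emis11 + emis12)          # mean emissivity
    deps = emis11 - emis12                 # emissivity difference
    t_mean = 0.5 * (t11 + t12)
    t_diff = 0.5 * (t11 - t12)
    return (b0
            + (b1 + b2 * (1 - eps) / eps + b3 * deps / eps**2) * t_mean
            + (b4 + b5 * (1 - eps) / eps + b6 * deps / eps**2) * t_diff)

# Placeholder coefficients; real ones depend on sensor, view angle and water vapour
coeffs = (1.0, 1.0, 0.1, -0.3, 4.0, 10.0, -30.0)
lst = split_window_lst(t11=295.0, t12=293.5, emis11=0.97, emis12=0.975, coeffs=coeffs)
print(f"LST estimate: {lst:.2f} K")
```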
Analysing the Effects of Different Land Cover Types on Land Surface Temperature Using Satellite Data
NASA Astrophysics Data System (ADS)
Şekertekin, A.; Kutoglu, Ş. H.; Kaya, S.; Marangoz, A. M.
2015-12-01
Monitoring Land Surface Temperature (LST) via remote sensing images is one of the most important contributions to climatology. LST is an important parameter governing the energy balance of the Earth, and it also helps us understand the behavior of urban heat islands. Many algorithms exist to obtain LST with remote sensing techniques; the most commonly used are the split-window algorithm, the temperature/emissivity separation method, the mono-window algorithm and the single-channel method. In this research, the mono-window algorithm was applied to a Landsat 5 TM image acquired on 28.08.2011. In addition, meteorological data such as humidity and temperature were used in the algorithm. Moreover, high-resolution GeoEye-1 and WorldView-2 images acquired on 29.08.2011 and 12.07.2013, respectively, were used to investigate the relationships between LST and land cover type. As a result of the analyses, areas with vegetation cover have temperatures approximately 5 ºC lower than the city center and arid land. LST values vary by about 10 ºC within the city center because of different surface properties such as reinforced concrete construction, green zones and sandbank. The temperature around the thermal power plant region (ÇATES and ZETES) in Çatalağzı is about 5 ºC higher than in the city center. The sandbank and agricultural areas have the highest temperatures due to the land cover structure.
Ren, Peng; Qian, Jiansheng
2016-01-01
This study proposes a novel power-efficient and anti-fading clustering based on a cross-layer that is specific to the time-varying fading characteristics of channels in the monitoring of coal mine faces with wireless sensor networks. The number of active sensor nodes and a sliding window are set up such that the optimal number of cluster heads (CHs) is selected in each round. Based on a stable expected number of CHs, we explore the channel efficiency between nodes and the base station by using a probe frame and the joint surplus energy in assessing the CH selection. Moreover, the sending power of a node in different periods is regulated by the signal fade margin method. The simulation results demonstrate that compared with several common algorithms, the power-efficient and fading-aware clustering with a cross-layer (PEAFC-CL) protocol features a stable network topology and adaptability under signal time-varying fading, which effectively prolongs the lifetime of the network and reduces network packet loss, thus making it more applicable to the complex and variable environment characteristic of a coal mine face. PMID:27338380
NASA Astrophysics Data System (ADS)
He, G.; Xia, Z.; Chen, H.; Li, K.; Zhao, Z.; Guo, Y.; Feng, P.
2018-04-01
Real-time ship detection using synthetic aperture radar (SAR) plays a vital role in disaster emergency response and maritime security. High resolution and wide swath (HRWS) SAR images in particular, which provide high resolution and wide coverage simultaneously, significantly improve wide-area ocean surveillance performance. In this study, a novel method is developed for ship target detection using HRWS SAR images. Firstly, an adaptive sliding window is developed to propose suspected ship target areas, based upon an analysis of the SAR backscattering intensity images. Then, backscattering intensity and texture features extracted from training samples of manually selected ship and non-ship slice images are used to train a support vector machine (SVM) to classify the proposed ship slice images. The approach is verified using Sentinel-1A data acquired in interferometric wide swath mode. The results demonstrate the improved performance of the proposed method over the constant false alarm rate (CFAR) method: the classification accuracy improved from 88.5 % to 96.4 % and the false alarm rate was reduced from 11.5 % to 3.6 %.
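A minimal sketch of the two-stage flow described above is given below: a coarse sliding window flags bright candidate regions in the backscatter image, and an SVM trained on simple intensity/texture features classifies each candidate slice. The window size, threshold factor, and feature set are illustrative assumptions rather than the paper's configuration.

```python
# Two-stage sketch: window-based candidate proposal, then SVM classification of
# each candidate slice. Window size, threshold factor, and features are assumed.
import numpy as np
from sklearn.svm import SVC

def propose_candidates(intensity, win=32, k=3.0):
    """Return top-left corners of windows whose mean exceeds k * global clutter mean."""
    clutter = intensity.mean()
    corners = []
    for r in range(0, intensity.shape[0] - win, win):
        for c in range(0, intensity.shape[1] - win, win):
            patch = intensity[r:r + win, c:c + win]
            if patch.mean() > k * clutter:
                corners.append((r, c))
    return corners

def slice_features(patch):
    """Simple intensity/texture descriptors for one candidate slice."""
    return [patch.mean(), patch.std(), patch.max(),
            np.abs(np.diff(patch, axis=0)).mean()]  # crude texture proxy

# Train on placeholder "manually labelled" ship / non-ship slices.
rng = np.random.default_rng(0)
ships = rng.gamma(9.0, 1.0, (20, 32, 32))      # bright, textured
sea = rng.gamma(2.0, 1.0, (20, 32, 32))        # dim sea clutter
X = [slice_features(p) for p in np.concatenate([ships, sea])]
y = [1] * 20 + [0] * 20
clf = SVC(kernel="rbf").fit(X, y)

scene = rng.gamma(2.0, 1.0, (256, 256))
scene[64:96, 64:96] += 12.0                    # synthetic ship-like target
for r, c in propose_candidates(scene):
    print((r, c), clf.predict([slice_features(scene[r:r + 32, c:c + 32])])[0])
```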
Probability based remaining capacity estimation using data-driven and neural network model
NASA Astrophysics Data System (ADS)
Wang, Yujie; Yang, Duo; Zhang, Xu; Chen, Zonghai
2016-05-01
Since lithium-ion batteries are assembled in large numbers into packs and are complex electrochemical devices, their monitoring and safety are key issues for the application of battery technology. An accurate estimation of the battery's remaining capacity is crucial for optimizing vehicle control, preventing the battery from over-charging and over-discharging, and ensuring safety during its service life. The remaining capacity estimation of a battery includes the estimation of state-of-charge (SOC) and state-of-energy (SOE). In this work, a probability-based adaptive estimator is presented to obtain accurate and reliable estimation results for both SOC and SOE. For the SOC estimation, an nth-order RC equivalent circuit model is employed in combination with an electrochemical model to obtain more accurate voltage predictions. For the SOE estimation, a sliding window neural network model is proposed to investigate the relationship between the terminal voltage and the model inputs. To verify the accuracy and robustness of the proposed model and estimation algorithm, experiments under different dynamic operating current profiles are performed on commercial 1665130-type lithium-ion batteries. The results illustrate that accurate and robust estimation can be obtained by the proposed method.
4. EXTERIOR OF SOUTH END OF BUILDING 103 SHOWING 1-LIGHT ...
4. EXTERIOR OF SOUTH END OF BUILDING 103 SHOWING 1-LIGHT SIDE EXIT DOOR AND ORIGINAL WOOD-FRAMED SLIDING GLASS KITCHEN WINDOWS AT PHOTO LEFT, CRISS-CROSS WOOD BALUSTRADE AROUND FRONT PORCH WITH OPEN DOORWAY TO BASEMENT BENEATH, AND STONE FACING ALONG ORIGINAL PORTION OF HOUSE FRONT AT PHOTO RIGHT. VIEW TO WEST. - Rush Creek Hydroelectric System, Worker Cottage, Rush Creek, June Lake, Mono County, CA
High Temperature Tribometer. Phase 1
1989-06-01
Figure 2.3.2: Setpoint and Gain Windows in FW.EXE (p. 13). Figure 2.4.1: Data-Flow Diagram for Data-Acquisition Module (p. 23). ... mounted in a friction force measuring device. Optimally, material testing results should not be test-machine sensitive, but due to equipment variables ... fixed. The friction force due to sliding should be continuously measured. This is optimally done in conjunction with the normal force measurement via ...
Preprocessing the Nintendo Wii Board Signal to Derive More Accurate Descriptors of Statokinesigrams.
Audiffren, Julien; Contal, Emile
2016-08-01
During the past few years, the Nintendo Wii Balance Board (WBB) has been used in postural control research as an affordable but less reliable replacement for laboratory-grade force platforms. However, the WBB suffers from some limitations, such as a lower accuracy and an inconsistent sampling rate. In this study, we focus on the latter, namely the non-uniform acquisition frequency. We show that this problem, combined with the poor signal-to-noise ratio of the WBB, can drastically decrease the quality of the obtained information if not handled properly. We propose a new resampling method, Sliding Window Average with Relevance Interval Interpolation (SWARII), specifically designed with the WBB in mind, for which we provide an open source implementation. We compare it with several existing methods commonly used in postural control, both on synthetic and experimental data. The results show that some methods, such as linear and piecewise constant interpolations, should definitely be avoided, particularly when the resulting signal is differentiated, which is necessary to estimate speed, an important feature in postural control. Other methods, such as averaging on sliding windows or SWARII, perform significantly better on the synthetic dataset, and produce results more similar to the laboratory-grade AMTI force plate (AFP) during experiments. Those methods should be preferred when resampling data collected from a WBB.
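The sketch below shows a plain sliding-window-average resampler for a non-uniformly sampled signal, which conveys the core idea behind SWARII without the relevance-interval interpolation step; the window length and output rate are assumed values.

```python
# A minimal sliding-window-average resampler for a non-uniformly sampled signal:
# each output sample on a uniform grid averages all raw samples whose timestamps
# fall inside a centred window. Much simpler than the full SWARII method.
import numpy as np

def sliding_window_resample(t_raw, x_raw, fs_out=25.0, window=0.08):
    """Resample (t_raw, x_raw) onto a uniform grid at fs_out Hz."""
    t_raw, x_raw = np.asarray(t_raw), np.asarray(x_raw)
    t_out = np.arange(t_raw[0], t_raw[-1], 1.0 / fs_out)
    x_out = np.empty_like(t_out)
    for i, t in enumerate(t_out):
        mask = np.abs(t_raw - t) <= window / 2
        # Fall back to the nearest raw sample if the window is empty.
        x_out[i] = x_raw[mask].mean() if mask.any() else x_raw[np.argmin(np.abs(t_raw - t))]
    return t_out, x_out

# Irregularly sampled noisy sway-like signal, nominally 30-60 Hz like a WBB stream.
rng = np.random.default_rng(1)
t = np.cumsum(rng.uniform(0.015, 0.035, 600))
x = np.sin(2 * np.pi * 0.3 * t) + 0.05 * rng.standard_normal(t.size)
t_u, x_u = sliding_window_resample(t, x)
print(t_u.shape, x_u.shape)
```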
A Genome-Wide Scan for Breast Cancer Risk Haplotypes among African American Women
Song, Chi; Chen, Gary K.; Millikan, Robert C.; Ambrosone, Christine B.; John, Esther M.; Bernstein, Leslie; Zheng, Wei; Hu, Jennifer J.; Ziegler, Regina G.; Nyante, Sarah; Bandera, Elisa V.; Ingles, Sue A.; Press, Michael F.; Deming, Sandra L.; Rodriguez-Gil, Jorge L.; Chanock, Stephen J.; Wan, Peggy; Sheng, Xin; Pooler, Loreall C.; Van Den Berg, David J.; Le Marchand, Loic; Kolonel, Laurence N.; Henderson, Brian E.; Haiman, Chris A.; Stram, Daniel O.
2013-01-01
Genome-wide association studies (GWAS) simultaneously investigating hundreds of thousands of single nucleotide polymorphisms (SNPs) have become a powerful tool in the investigation of new disease susceptibility loci. Haplotypes are sometimes thought to be superior to SNPs and are promising in genetic association analyses. The application of genome-wide haplotype analysis, however, is hindered by the complexity of haplotypes themselves and the sophistication of the required computation. We systematically analyzed haplotype effects for breast cancer risk among 5,761 African American women (3,016 cases and 2,745 controls) using a sliding window approach on the genome-wide scale. Three regions on chromosomes 1, 4 and 18 exhibited moderate haplotype effects. Furthermore, among 21 breast cancer susceptibility loci previously established in European populations, 10p15 and 14q24 are likely to harbor novel haplotype effects. We also proposed a heuristic for determining the significance level and the effective number of independent tests by permutation analysis on chromosome 22 data. It suggests that the effective number was approximately half of the total (7,794 out of 15,645); thus the half number could serve as a quick reference for evaluating genome-wide significance if a similar sliding window approach to haplotype analysis is adopted in similar populations with similar genotype density. PMID:23468962
Development of daily "swath" mascon solutions from GRACE
NASA Astrophysics Data System (ADS)
Save, Himanshu; Bettadpur, Srinivas
2016-04-01
The Gravity Recovery and Climate Experiment (GRACE) mission has provided invaluable data, the only data of its kind, over the past 14 years, measuring the total water column in the Earth system. The GRACE project provides monthly average solutions, and there are experimental quick-look solutions and regularized sliding window solutions available from the Center for Space Research (CSR) that implement a sliding window approach and variable daily weights. The need for special handling of these solutions in data assimilation and the possibility of capturing the total water storage (TWS) signal at sub-monthly time scales motivated this study. This study discusses the progress of the development of a true daily high-resolution "swath" mascon total water storage estimate from GRACE using Tikhonov regularization. These solutions include estimates of daily total water storage (TWS) for the mascon elements that were "observed" by the GRACE satellites on a given day. This paper discusses the computation techniques and the signal, error and uncertainty characterization of these daily solutions. We discuss comparisons with the official GRACE RL05 solutions and with the CSR mascon solution to characterize the impact on science results, especially at sub-monthly time scales. The evaluation is done with emphasis on the temporal signal characteristics and is validated against in-situ data sets and multiple models.
Preprocessing the Nintendo Wii Board Signal to Derive More Accurate Descriptors of Statokinesigrams
Audiffren, Julien; Contal, Emile
2016-01-01
During the past few years, the Nintendo Wii Balance Board (WBB) has been used in postural control research as an affordable but less reliable replacement for laboratory-grade force platforms. However, the WBB suffers from some limitations, such as a lower accuracy and an inconsistent sampling rate. In this study, we focus on the latter, namely the non-uniform acquisition frequency. We show that this problem, combined with the poor signal-to-noise ratio of the WBB, can drastically decrease the quality of the obtained information if not handled properly. We propose a new resampling method, Sliding Window Average with Relevance Interval Interpolation (SWARII), specifically designed with the WBB in mind, for which we provide an open source implementation. We compare it with several existing methods commonly used in postural control, both on synthetic and experimental data. The results show that some methods, such as linear and piecewise constant interpolations, should definitely be avoided, particularly when the resulting signal is differentiated, which is necessary to estimate speed, an important feature in postural control. Other methods, such as averaging on sliding windows or SWARII, perform significantly better on the synthetic dataset, and produce results more similar to the laboratory-grade AMTI force plate (AFP) during experiments. Those methods should be preferred when resampling data collected from a WBB. PMID:27490545
NASA Astrophysics Data System (ADS)
Tuan, Le Anh; Lee, Soon-Geul
2018-03-01
In this study, a new mathematical model of crawler cranes is developed for heavy working conditions, with payload-lifting and boom-hoisting motions activated simultaneously. The system model is built with full consideration of wind disturbances, geometrical nonlinearities, and the cable elasticities of cargo lifting and boom luffing. On the basis of this dynamic model, three versions of sliding mode control are analyzed and designed to control five system outputs with only two inputs. The effectiveness of the controllers in complicated operations is analyzed using analytical investigation and numerical simulation. The results indicate the effectiveness of the control algorithms and the proposed dynamic model. The control algorithms asymptotically stabilize the system with finite-time convergence, remaining robust amid disturbances and parametric uncertainties.
Multiple objects tracking with HOGs matching in circular windows
NASA Astrophysics Data System (ADS)
Miramontes-Jaramillo, Daniel; Kober, Vitaly; Díaz-Ramírez, Víctor H.
2014-09-01
In recent years, with the development of new technologies such as smart TVs, Kinect, Google Glass and Oculus Rift, tracking applications have become very important. When tracking uses a matching algorithm, a good prediction algorithm is required to reduce the search area for each object to be tracked, as well as the processing time. In this work, we analyze the performance of different tracking algorithms based on prediction and matching for real-time tracking of multiple objects. The matching algorithm used utilizes histograms of oriented gradients. It carries out matching in circular windows, and possesses rotation invariance and tolerance to viewpoint and scale changes. The proposed algorithm is implemented on a personal computer with a GPU, and its performance is analyzed in terms of processing time in real scenarios. Such an implementation takes advantage of current technologies and helps to process video sequences in real time for tracking several objects at the same time.
Zhang, Yong; Li, Yuan; Rong, Zhi-Guo
2010-06-01
A remote sensor's channel spectral response function (SRF) is one of the key factors influencing the inversion algorithms, accuracy and geophysical characteristics of quantitative products. To examine the adjustments of FY-2E's split window channels' SRF, detailed comparisons of the SRF differences between the FY-2E and FY-2C corresponding channels were carried out based on three data collections: the NOAA AVHRR corresponding channels' calibration look-up tables, field-measured water surface radiance and atmospheric profiles at Lake Qinghai, and radiance calculated from the Planck function over the full dynamic range of FY-2E/C. The results showed that the adjustments of FY-2E's split window channels' SRF would shift the spectral range and influence the inversion algorithms of some ground quantitative products. On the other hand, these adjustments of FY-2E's SRFs would increase the brightness temperature differences between FY-2E's two split window channels over the full dynamic range relative to FY-2C's. This would improve the inversion ability of FY-2E's split window channels.
An Algorithm Framework for Isolating Anomalous Signals in Electromagnetic Data
NASA Astrophysics Data System (ADS)
Kappler, K. N.; Schneider, D.; Bleier, T.; MacLean, L. S.
2016-12-01
QuakeFinder and its international collaborators have installed and currently maintain an array of 165 three-axis induction magnetometer instrument sites in California, Peru, Taiwan, Greece, Chile and Sumatra. Based on research by Bleier et al. (2009), Fraser-Smith et al. (1990), and Freund (2007), the electromagnetic data from these instruments are being analyzed for pre-earthquake signatures. This analysis consists of both private research by QuakeFinder, and institutional collaborators (PUCP in Peru, NCU in Taiwan, NOA in Greece, LASP at University of Colorado, Stanford, UCLA, NASA-ESI, NASA-AMES and USC-CSEP). QuakeFinder has developed an algorithm framework aimed at isolating anomalous signals (pulses) in the time series. Results are presented from an application of this framework to induction-coil magnetometer data. Our data driven approach starts with sliding windows applied to uniformly resampled array data with a variety of lengths and overlap. Data variance (a proxy for energy) is calculated on each window and a short-term average/ long-term average (STA/LTA) filter is applied to the variance time series. Pulse identification is done by flagging time intervals in the STA/LTA filtered time series which exceed a threshold. Flagged time intervals are subsequently fed into a feature extraction program which computes statistical properties of the resampled data. These features are then filtered using a Principal Component Analysis (PCA) based method to cluster similar pulses. We explore the extent to which this approach categorizes pulses with known sources (e.g. cars, lightning, etc.) and the remaining pulses of unknown origin can be analyzed with respect to their relationship with seismicity. We seek a correlation between these daily pulse-counts (with known sources removed) and subsequent (days to weeks) seismic events greater than M5 within 15km radius. Thus we explore functions which map daily pulse-counts to a time series representing the likelihood of a seismic event occurring at some future time. These "pseudo-probabilities" can in turn be represented as Molchan diagrams. The Molchan curve provides an effective cost function for optimization and allows for a rigorous statistical assessment of the validity of pre-earthquake signals in the electromagnetic data.
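The following sketch illustrates the pulse-flagging stage described above, assuming windowed variance as the energy proxy, an STA/LTA ratio computed on that variance series, and a fixed threshold; the window lengths and threshold are illustrative, not QuakeFinder's operational settings.

```python
# Sketch of the pulse-flagging stage: windowed variance as an energy proxy, an
# STA/LTA ratio on that variance series, and thresholding to flag candidate
# intervals. Window lengths and the threshold are assumptions.
import numpy as np

def windowed_variance(x, win):
    return np.array([x[i:i + win].var() for i in range(0, len(x) - win, win)])

def sta_lta(v, sta=5, lta=50):
    """Short-term over long-term average of the variance series."""
    ratio = np.ones_like(v)
    for i in range(lta, len(v)):
        ratio[i] = v[i - sta:i].mean() / (v[i - lta:i].mean() + 1e-12)
    return ratio

rng = np.random.default_rng(2)
x = rng.standard_normal(20000)
x[12000:12200] += 8.0 * rng.standard_normal(200)   # synthetic pulse
v = windowed_variance(x, win=100)
flags = np.where(sta_lta(v) > 4.0)[0]
print("flagged variance windows:", flags)
```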
Sliding mode controller for a photovoltaic pumping system
NASA Astrophysics Data System (ADS)
ElOugli, A.; Miqoi, S.; Boutouba, M.; Tidhaf, B.
2017-03-01
In this paper, a sliding mode control (SMC) scheme for a maximum power point tracking controller for a photovoltaic pumping system is proposed. The main goal is to maximize the flow rate of a water pump by forcing the photovoltaic system to operate at its MPP, so as to obtain the maximum power that a PV system can deliver. This is achieved through a sliding mode controller that tracks and controls the MPP while overcoming the power oscillation around the operating point that appears in most implemented MPPT techniques. The sliding mode control approach is recognized as one of the efficient and powerful tools for nonlinear systems under uncertainty. The proposed controller with the photovoltaic pumping system is designed and simulated using the MATLAB/SIMULINK environment. In addition, to evaluate its performance, a classical MPPT algorithm using perturb and observe (P&O) has been applied to the same system for comparison with our controller. Simulation results are shown.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Domino, Stefan P.
2017-12-01
This milestone was focused on deploying and verifying a “sliding-mesh interface,” and establishing baseline timings for blade-resolved simulations of a sub-MW-scale turbine. In the ExaWind project, we are developing both sliding-mesh and overset-mesh approaches for handling the rotating blades in an operating wind turbine. In the sliding-mesh approach, the turbine rotor and its immediate surrounding fluid are captured in a “disk” that is embedded in the larger fluid domain. The embedded fluid is simulated in a coordinate system that rotates with the rotor. It is important that the coupling algorithm (and its implementation) between the rotating and inertial discrete models maintains the accuracy of the numerical methods on either side of the interface, i.e., the interface is “design order.”
Sliding mode control based on Kalman filter dynamic estimation of battery SOC
NASA Astrophysics Data System (ADS)
He, Dongmeia; Hou, Enguang; Qiao, Xin; Liu, Guangmin
2018-06-01
Accurate and rapid estimation of the state of charge of a lithium-ion battery is a key technology in battery management systems. In this paper, an exponential-reaching-law sliding-mode variable structure control algorithm based on a Kalman filter is proposed to estimate the state of charge of a Li-ion battery in this dynamic nonlinear system. The RC equivalent circuit model is established, and the model equation with a specific structure is given. The proposed Kalman filter sliding mode structure is used to estimate the state of charge of the battery in the battery model; the jitter (chattering) effect is avoided and the estimation performance is improved. The simulation results show that the proposed Kalman filter sliding mode control has good accuracy in estimating the state of charge of the battery compared with the ordinary Kalman filter, and the error remains within 3%.
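For intuition, the sketch below implements a generic sliding-mode-style SOC observer on a first-order RC cell model: coulomb counting plus a signum correction driven by the terminal-voltage error. The OCV curve, cell parameters, and switching gain are invented for illustration; this is not the paper's exponential-reaching-law or Kalman-filter formulation.

```python
# Generic sliding-mode-style SOC observer on a first-order RC battery model.
# OCV curve, parameters, and gain are illustrative assumptions only.
import numpy as np

Q, R0, R1, C1 = 2.0 * 3600, 0.05, 0.015, 2400.0   # capacity (As), ohmic, RC pair
ocv = lambda soc: 3.4 + 0.8 * soc                  # crude linear OCV(SOC)

def simulate_cell(i_load, dt=1.0, soc0=0.8):
    """Generate 'measured' terminal voltage from the reference model."""
    soc, v1, v_t = soc0, 0.0, []
    for i in i_load:
        soc -= i * dt / Q
        v1 += dt * (i / C1 - v1 / (R1 * C1))
        v_t.append(ocv(soc) - v1 - R0 * i)
    return np.array(v_t)

def smo_soc(i_load, v_meas, dt=1.0, soc_hat=0.5, gain=2e-4):
    """Sliding-mode observer: a wrong initial SOC is driven to the true value."""
    v1_hat, est = 0.0, []
    for i, v in zip(i_load, v_meas):
        v_hat = ocv(soc_hat) - v1_hat - R0 * i
        err = v - v_hat
        soc_hat += -i * dt / Q + gain * np.sign(err)   # switching correction
        v1_hat += dt * (i / C1 - v1_hat / (R1 * C1))
        est.append(soc_hat)
    return np.array(est)

i_load = np.full(3600, 1.0)                        # 1 A discharge for one hour
v_meas = simulate_cell(i_load)
print("final SOC estimate:", smo_soc(i_load, v_meas)[-1])
```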
Modelling the regulatory system for diabetes mellitus with a threshold window
NASA Astrophysics Data System (ADS)
Yang, Jin; Tang, Sanyi; Cheke, Robert A.
2015-05-01
Piecewise (or non-smooth) glucose-insulin models with threshold windows for type 1 and type 2 diabetes mellitus are proposed and analyzed with a view to improving understanding of the glucose-insulin regulatory system. For glucose-insulin models with a single threshold, the existence and stability of regular, virtual, pseudo-equilibria and tangent points are addressed. Then the relations between regular equilibria and a pseudo-equilibrium are studied. Furthermore, the sufficient and necessary conditions for the global stability of regular equilibria and the pseudo-equilibrium are provided by using qualitative analysis techniques of non-smooth Filippov dynamic systems. Sliding bifurcations related to boundary node bifurcations were investigated with theoretical and numerical techniques, and insulin clinical therapies are discussed. For glucose-insulin models with a threshold window, the effects of glucose thresholds or the widths of threshold windows on the durations of insulin therapy and glucose infusion were addressed. The duration of the effects of an insulin injection is sensitive to the variation of thresholds. Our results indicate that blood glucose level can be maintained within a normal range using piecewise glucose-insulin models with a single threshold or a threshold window. Moreover, our findings suggest that it is critical to individualise insulin therapy for each patient separately, based on initial blood glucose levels.
Tang, Bohui; Bi, Yuyun; Li, Zhao-Liang; Xia, Jun
2008-01-01
On the basis of radiative transfer theory, this paper addresses the estimation of Land Surface Temperature (LST) from data of the Chinese first operational geostationary meteorological satellite FengYun-2C (FY-2C) in two thermal infrared channels (IR1, 10.3-11.3 μm and IR2, 11.5-12.5 μm), using the Generalized Split-Window (GSW) algorithm proposed by Wan and Dozier (1996). The coefficients in the GSW algorithm corresponding to a series of overlapping ranges of the mean emissivity, the atmospheric Water Vapor Content (WVC), and the LST were derived using a statistical regression method from numerical values simulated with the accurate atmospheric radiative transfer model MODTRAN 4 over a wide range of atmospheric and surface conditions. The simulation analysis showed that the LST could be estimated by the GSW algorithm with a Root Mean Square Error (RMSE) of less than 1 K for the sub-ranges with Viewing Zenith Angle (VZA) less than 30°, or for the sub-ranges with VZA less than 60° and atmospheric WVC less than 3.5 g/cm², provided that the Land Surface Emissivities (LSEs) are known. In order to determine the range for the optimum coefficients of the GSW algorithm, the LSEs could be derived from the data in MODIS channels 31 and 32 provided by the MODIS/Terra LST product MOD11B1, or be estimated either according to the land surface classification or using the method proposed by Jiang et al. (2006); and the WVC could be obtained from the MODIS total precipitable water product MOD05, or be retrieved using Li et al.'s method (2003). Sensitivity and error analyses in terms of the uncertainty of the LSE and WVC as well as the instrumental noise were performed. In addition, in order to compare different formulations of split-window algorithms, several recently proposed split-window algorithms were used to estimate the LST with the same simulated FY-2C data. The result of the intercomparison showed that most of the algorithms give comparable results. PMID:27879744
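The coefficient-fitting step can be illustrated with an ordinary least-squares regression of LST against the two brightness temperatures in a simplified split-window form; the "simulated" triples below are synthetic stand-ins for MODTRAN runs, and the functional form omits the emissivity and view-angle terms of the full GSW family.

```python
# Sketch of the coefficient-fitting step: regress LST against the two brightness
# temperatures in a simplified split-window form. Synthetic data only.
import numpy as np

rng = np.random.default_rng(3)
lst = rng.uniform(270.0, 320.0, 500)                       # surface temperatures (K)
wv_attenuation = rng.uniform(0.0, 3.0, 500)                # water-vapour effect proxy
t11 = lst - 1.0 * wv_attenuation + 0.2 * rng.standard_normal(500)
t12 = lst - 2.0 * wv_attenuation + 0.2 * rng.standard_normal(500)

# Design matrix for LST ~ a0 + a1*(T11+T12)/2 + a2*(T11-T12)
A = np.column_stack([np.ones_like(t11), (t11 + t12) / 2, t11 - t12])
coef, *_ = np.linalg.lstsq(A, lst, rcond=None)
rmse = np.sqrt(np.mean((A @ coef - lst) ** 2))
print("fitted coefficients:", np.round(coef, 3), "RMSE (K):", round(rmse, 3))
```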
NASA Technical Reports Server (NTRS)
Susskind, Joel; Blaisdell, John; Iredell, Lena
2011-01-01
The Goddard DISC has generated products derived from AIRS/AMSU-A observations, starting from September 2002 when the AIRS instrument became stable, using the AIRS Science Team Version-5 retrieval algorithm. The AIRS Science Team Version-6 retrieval algorithm will be finalized in September 2011. This paper describes some of the significant improvements contained in the Version-6 retrieval algorithm, compared to that used in Version-5, with an emphasis on the improvement of atmospheric temperature profiles, ocean and land surface skin temperatures, and ocean and land surface spectral emissivities. AIRS contains 2378 spectral channels covering portions of the spectral region 650 cm⁻¹ (15.38 µm) to 2665 cm⁻¹ (3.752 µm). These spectral regions contain significant absorption features from two CO2 absorption bands, the 15 µm (longwave) CO2 band and the 4.3 µm (shortwave) CO2 absorption band. There are also two atmospheric window regions, the 12 µm - 8 µm (longwave) window and the 4.17 µm - 3.75 µm (shortwave) window. Historically, determination of surface and atmospheric temperatures from satellite observations was performed primarily using observations in the longwave window and CO2 absorption regions. According to cloud clearing theory, more accurate soundings of both surface skin and atmospheric temperatures can be obtained under partial cloud cover conditions if one uses observations in longwave channels to determine coefficients which generate cloud-cleared radiances R̂_i for all channels, and uses R̂_i only from shortwave channels in the determination of surface and atmospheric temperatures. This procedure is now being used in the AIRS Version-6 retrieval algorithm. Results are presented for both daytime and nighttime conditions showing improved Version-6 surface and atmospheric soundings under partial cloud cover.
Han, Buhm; Kang, Hyun Min; Eskin, Eleazar
2009-01-01
With the development of high-throughput sequencing and genotyping technologies, the number of markers collected in genetic association studies is growing rapidly, increasing the importance of methods for correcting for multiple hypothesis testing. The permutation test is widely considered the gold standard for accurate multiple testing correction, but it is often computationally impractical for these large datasets. Recently, several studies proposed efficient alternative approaches to the permutation test based on the multivariate normal distribution (MVN). However, they cannot accurately correct for multiple testing in genome-wide association studies for two reasons. First, these methods require partitioning of the genome into many disjoint blocks and ignore all correlations between markers from different blocks. Second, the true null distribution of the test statistic often fails to follow the asymptotic distribution at the tails of the distribution. We propose an accurate and efficient method for multiple testing correction in genome-wide association studies—SLIDE. Our method accounts for all correlation within a sliding window and corrects for the departure of the true null distribution of the statistic from the asymptotic distribution. In simulations using the Wellcome Trust Case Control Consortium data, the error rate of SLIDE's corrected p-values is more than 20 times smaller than the error rate of the previous MVN-based methods' corrected p-values, while SLIDE is orders of magnitude faster than the permutation test and other competing methods. We also extend the MVN framework to the problem of estimating the statistical power of an association study with correlated markers and propose an efficient and accurate power estimation method SLIP. SLIP and SLIDE are available at http://slide.cs.ucla.edu. PMID:19381255
Testing a new Free Core Nutation empirical model
NASA Astrophysics Data System (ADS)
Belda, Santiago; Ferrándiz, José M.; Heinkelmann, Robert; Nilsson, Tobias; Schuh, Harald
2016-03-01
The Free Core Nutation (FCN) is a free mode of the Earth's rotation caused by the different material characteristics of the Earth's core and mantle. This causes the rotational axes of those layers to slightly diverge from each other, resulting in a wobble of the Earth's rotation axis comparable to nutations. In this paper we focus on estimating empirical FCN models using the observed nutations derived from the VLBI sessions between 1993 and 2013. Assuming a fixed value for the oscillation period, the time-variable amplitudes and phases are estimated by means of multiple sliding window analyses. The effects of using different a priori Earth Rotation Parameters (ERP) in the derivation of the models are also addressed. The optimal choice of the fundamental parameters of the model, namely the window width and the step size of its shift, is sought through a thorough experimental analysis using real data. These analyses lead to the derivation of a model with a temporal resolution higher than that of the models currently available, with a sliding window reduced to 400 days and a day-by-day shift. It is shown that this new model increases the accuracy of the modeling of the observed Earth's rotation. Besides, according to our computations, empirical models determined with USNO Finals as a priori ERP present a slightly lower Weighted Root Mean Square (WRMS) of residuals than IERS 08 C04 over the whole period of VLBI observations. The model is also validated through comparisons with other recognized models, and the level of agreement among them is satisfactory. We remark that our estimates give rise to the lowest residuals and seem to reproduce the FCN signal in more detail.
Hudson, Nicholas J; Naval-Sánchez, Marina; Porto-Neto, Laercio; Pérez-Enciso, Miguel; Reverter, Antonio
2018-06-05
Asian and European wild boars were independently domesticated ca. 10,000 years ago. Since the 17th century, Chinese breeds have been imported into Europe to improve the genetics of European animals by introgression of favourable alleles, resulting in a complex mosaic of haplotypes. To interrogate the structure of these haplotypes further, we have run a new haplotype segregation analysis based on information theory, namely compression efficiency (CE). We applied the approach to sequence data from individuals from each phylogeographic region (n = 23 from Asia and Europe), including a number of major pig breeds. Our genome-wide CE is able to discriminate the breeds in a manner reflecting phylogeography. Furthermore, 24,956 non-overlapping sliding windows (each comprising 1,000 consecutive SNPs) were quantified for the extent of haplotype sharing within and between Asia and Europe. The genome-wide distribution of the extent of haplotype sharing was quite different between the groups. Unlike that of European pigs, Asian pigs' haplotype sharing approximates a normal distribution. In line with this, we found that the European breeds possessed a number of genomic windows with dramatically higher haplotype sharing than the Asian breeds. Our CE analysis of sliding windows captures some of the genomic regions reported to contain signatures of selection in domestic pigs. Prominent among these regions, we highlight the role of a gene encoding the mitochondrial enzyme LACTB, which has been associated with obesity, and the gene encoding MYOG, a fundamental transcriptional regulator of myogenesis. The origin of these regions likely reflects either a population bottleneck in European animals, or selection targeting commercial phenotypes that reduced allelic diversity in particular genes and/or regulatory regions.
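The intuition behind the compression-efficiency metric can be shown with a toy example: a window in which many individuals share long haplotypes is highly redundant and therefore compresses better than a window of unrelated haplotypes. The simulated SNP matrices and the zlib-based ratio below are only illustrative and do not reproduce the authors' exact CE computation.

```python
# Toy illustration of the compression-efficiency (CE) idea on simulated 0/1 SNP
# matrices: more haplotype sharing -> more redundancy -> higher compressibility.
import zlib
import numpy as np

def compression_efficiency(genotype_window):
    """1 - (compressed size / raw size) for a 0/1 SNP matrix (animals x SNPs)."""
    raw = genotype_window.astype(np.uint8).tobytes()
    return 1.0 - len(zlib.compress(raw, 9)) / len(raw)

rng = np.random.default_rng(4)
n_animals, n_snps = 23, 1000

# High-sharing window: everyone carries small mutations of one founder haplotype.
founder = rng.integers(0, 2, n_snps)
shared = np.array([np.where(rng.random(n_snps) < 0.02, 1 - founder, founder)
                   for _ in range(n_animals)])

# Low-sharing window: independent haplotypes.
independent = rng.integers(0, 2, (n_animals, n_snps))

print("CE, shared haplotypes:     ", round(compression_efficiency(shared), 3))
print("CE, independent haplotypes:", round(compression_efficiency(independent), 3))
```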
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yun, Geun Young; Steemers, Koen
2010-07-15
This paper investigates occupant window-use behaviour in night-time naturally ventilated offices on the basis of a pilot field study, conducted during the summers of 2006 and 2007 in Cambridge, UK, and then demonstrates the effects of employing night-time ventilation on indoor thermal conditions using predictive models of occupant window-use. A longitudinal field study shows that occupants make good use of night-time natural ventilation strategies when provided with openings that allow secure ventilation, and that there is a noticeable time-of-day effect in window-use patterns (i.e. increased probability of action on arrival and departure). We develop logistic models of window-use for night-time naturally ventilated offices, which are subsequently applied in a behaviour algorithm, including Markov chains and Monte Carlo methods. The simulations using the behaviour algorithm demonstrate good agreement with the observational data of window-use, and reveal how building design and occupant behaviour collectively affect the thermal performance of offices. They illustrate that the provision of secure ventilation leads to more frequent use of the window, and thus contributes significantly to the achievement of a comfortable indoor environment during the daytime occupied period. For example, the maximum temperature for a night-time ventilated office is found to be 3 °C below the value predicted for a daytime-only ventilated office.
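A toy version of such a behaviour algorithm is sketched below: at each hour a Markov-chain window state switches with a logistic probability driven by indoor temperature, with a boost at assumed arrival and departure hours, and Monte Carlo repetition yields hourly open-window probabilities. All coefficients are hypothetical and are not the paper's fitted logistic models.

```python
# Toy Markov-chain / Monte Carlo window-use simulation with logistic switching
# probabilities. All coefficients and the arrival/departure hours are assumptions.
import numpy as np

def p_logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate_day(t_indoor, rng, open_state=False):
    """One simulated day at hourly resolution; returns the window-state sequence."""
    states = []
    for hour, temp in enumerate(t_indoor):
        arrival_or_departure = hour in (8, 18)        # assumed occupancy transitions
        boost = 1.5 if arrival_or_departure else 0.0
        if open_state:
            p_close = p_logistic(-0.4 * (temp - 22.0) + boost - 2.0)
            open_state = not (rng.random() < p_close)
        else:
            p_open = p_logistic(0.6 * (temp - 24.0) + boost - 1.0)
            open_state = rng.random() < p_open
        states.append(open_state)
    return states

rng = np.random.default_rng(5)
t_indoor = 22.0 + 6.0 * np.sin(np.pi * (np.arange(24) - 6) / 18)  # warm afternoon
days = [simulate_day(t_indoor, rng) for _ in range(1000)]
print("hourly probability of open window:", np.round(np.mean(days, axis=0), 2))
```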
Accurate identification of microseismic P- and S-phase arrivals using the multi-step AIC algorithm
NASA Astrophysics Data System (ADS)
Zhu, Mengbo; Wang, Liguan; Liu, Xiaoming; Zhao, Jiaxuan; Peng, Ping'an
2018-03-01
Identification of P- and S-phase arrivals is the primary task in microseismic monitoring. In this study, a new multi-step AIC algorithm is proposed. This algorithm consists of P- and S-phase arrival pickers (P-picker and S-picker). The P-picker contains three steps: in step 1, a preliminary P-phase arrival window is determined by the waveform peak. Then a preliminary P-pick is identified using the AIC algorithm. Finally, the P-phase arrival window is narrowed based on the above P-pick, so that the P-phase arrival can be identified accurately by applying the AIC algorithm again. The S-picker contains five steps: in step 1, a narrow S-phase arrival window is determined based on the P-pick and the AIC curve of the amplitude biquadratic time series. In step 2, the S-picker automatically judges whether the S-phase arrival is clear enough to identify. In steps 3 and 4, the AIC extreme points are extracted, and the relationship between the local minima and the S-phase arrival is investigated. In step 5, the S-phase arrival is picked based on the maximum probability criterion. To evaluate the proposed algorithm, a P- and S-pick classification criterion is also established based on a source-location numerical simulation. The field data tests show a considerable improvement of the multi-step AIC algorithm in comparison with manual picks and the original AIC algorithm. Furthermore, the technique performs consistently across different SNR levels. Even in the poor-quality signal group, in which the SNRs are below 5, the effective picking rates (for which the corresponding location error is <15 m) of P- and S-phase arrivals are still up to 80.9% and 76.4%, respectively.
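For reference, the classical single-window AIC picker that the multi-step scheme builds on can be written as AIC(k) = k ln var(x[1..k]) + (N - k - 1) ln var(x[k+1..N]), with the arrival taken at the AIC minimum. The sketch below implements only this plain picker on a synthetic trace, not the full multi-step P/S workflow.

```python
# Classical in-window AIC picker: the arrival is the index minimising
# AIC(k) = k*ln(var(x[:k])) + (N-k-1)*ln(var(x[k:])).
import numpy as np

def aic_pick(x):
    """Return the sample index of the AIC minimum inside the window x."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    aic = np.full(n, np.inf)
    for k in range(2, n - 2):
        v1, v2 = x[:k].var(), x[k:].var()
        if v1 > 0 and v2 > 0:
            aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
    return int(np.argmin(aic))

# Synthetic microseismic trace: noise, then a stronger arrival at sample 600.
rng = np.random.default_rng(6)
trace = 0.1 * rng.standard_normal(1200)
trace[600:] += np.sin(0.2 * np.arange(600)) * np.exp(-np.arange(600) / 300.0)
print("picked onset:", aic_pick(trace))       # close to 600
```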
Software for Allocating Resources in the Deep Space Network
NASA Technical Reports Server (NTRS)
Wang, Yeou-Fang; Borden, Chester; Zendejas, Silvino; Baldwin, John
2003-01-01
TIGRAS 2.0 is a computer program designed to satisfy a need for improved means for analyzing the tracking demands of interplanetary space-flight missions upon the set of ground antenna resources of the Deep Space Network (DSN) and for allocating those resources. Written in Microsoft Visual C++, TIGRAS 2.0 provides a single rich graphical analysis environment for use by diverse DSN personnel, by connecting to various data sources (relational databases or files) based on the stages of the analyses being performed. Notable among the algorithms implemented by TIGRAS 2.0 are a DSN antenna-load-forecasting algorithm and a conflict-aware DSN schedule-generating algorithm. Computers running TIGRAS 2.0 can also be connected using SOAP/XML to a Web services server that provides analysis services via the World Wide Web. TIGRAS 2.0 supports multiple windows and multiple panes in each window for users to view and use information, all in the same environment, to eliminate repeated switching among various application programs and Web pages. TIGRAS 2.0 enables the use of multiple windows for various requirements, trajectory-based time intervals during which spacecraft are viewable, ground resources, forecasts, and schedules. Each window includes a time navigation pane, a selection pane, a graphical display pane, a list pane, and a statistics pane.
[Online endpoint detection algorithm for blending process of Chinese materia medica].
Lin, Zhao-Zhou; Yang, Chan; Xu, Bing; Shi, Xin-Yuan; Zhang, Zhi-Qiang; Fu, Jing; Qiao, Yan-Jiang
2017-03-01
Blending, an essential part of pharmaceutical preparation, has a direct influence on the homogeneity and stability of solid dosage forms. With the official release of the Guidance for Industry PAT, online process analysis techniques have been reported more and more in applications to blending processes, but research on endpoint detection algorithms is still at an early stage. By progressively increasing the window size of the moving block standard deviation (MBSD), a novel endpoint detection algorithm was proposed to extend the plain MBSD from the off-line scenario to the online scenario, and was used to determine the endpoint of the blending process of Chinese medicine dispensing granules. Through online tuning of the window size, the status changes of the materials in the blending process were reflected in the standard deviation calculation in real time. The proposed method was tested separately on the blending processes of dextrin and three other extracts of traditional Chinese medicine. All of the results showed that, compared with the traditional MBSD method, the progressively increasing window size of the proposed MBSD method could more clearly reflect the status changes of the materials in the blending process, so it is suitable for online application. Copyright© by the Chinese Pharmaceutical Association.
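A minimal sketch of the progressively widening MBSD idea follows: as new observations arrive, the standard deviation is computed over a block whose size grows online, and the endpoint is declared once the MBSD stays below a threshold for several consecutive samples. The simulated signal, window schedule, and threshold are assumptions, not the paper's process-analytical setup.

```python
# Progressively widening moving-block standard deviation (MBSD) with a simple
# stability criterion for endpoint detection. Signal and thresholds are simulated.
import numpy as np

def online_mbsd_endpoint(signal, start_win=5, max_win=30, threshold=0.02, hold=10):
    """Return the index at which the blend is declared homogeneous, or None."""
    below = 0
    for i in range(start_win, len(signal)):
        win = min(start_win + (i - start_win) // 5, max_win)   # widen the block online
        mbsd = np.std(signal[i - win:i])
        below = below + 1 if mbsd < threshold else 0
        if below >= hold:
            return i
    return None

# Simulated blending curve: composition variability decays toward homogeneity.
rng = np.random.default_rng(7)
n = 200
signal = (1.0
          + 0.3 * np.exp(-np.arange(n) / 40.0) * rng.standard_normal(n)
          + 0.01 * rng.standard_normal(n))
print("endpoint declared at revolution:", online_mbsd_endpoint(signal))
```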
Marks, Daniel L; Oldenburg, Amy L; Reynolds, J Joshua; Boppart, Stephen A
2003-01-10
The resolution of optical coherence tomography (OCT) often suffers from blurring caused by material dispersion. We present a numerical algorithm for computationally correcting the effect of material dispersion on OCT reflectance data for homogeneous and stratified media. This is experimentally demonstrated by correcting the image of a polydimethyl siloxane microfluidic structure and of glass slides. The algorithm can be implemented using the fast Fourier transform. With broad spectral bandwidths and highly dispersive media or thick objects, dispersion correction becomes increasingly important.
NASA Astrophysics Data System (ADS)
Marks, Daniel L.; Oldenburg, Amy L.; Reynolds, J. Joshua; Boppart, Stephen A.
2003-01-01
The resolution of optical coherence tomography (OCT) often suffers from blurring caused by material dispersion. We present a numerical algorithm for computationally correcting the effect of material dispersion on OCT reflectance data for homogeneous and stratified media. This is experimentally demonstrated by correcting the image of a polydimethyl siloxane microfluidic structure and of glass slides. The algorithm can be implemented using the fast Fourier transform. With broad spectral bandwidths and highly dispersive media or thick objects, dispersion correction becomes increasingly important.
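A generic illustration of FFT-based numerical dispersion compensation is sketched below: a quadratic spectral phase broadens the depth-domain point-spread function, and multiplying the spectral data by the conjugate phase restores it. The spectrum and the phase coefficient a2 are synthetic placeholders rather than values from the paper.

```python
# Generic FFT-based dispersion compensation: remove a quadratic spectral phase by
# multiplying with its conjugate before transforming to the depth domain.
import numpy as np

n = 2048
k = np.linspace(-1.0, 1.0, n)                      # normalised wavenumber axis
spectrum = np.exp(-(k / 0.4) ** 2)                 # source spectral envelope
depth_phase = np.exp(1j * 2 * np.pi * 120 * k)     # a reflector at one depth

a2 = 60.0                                          # quadratic dispersion (assumed)
dispersed = spectrum * depth_phase * np.exp(1j * a2 * k ** 2)
corrected = dispersed * np.exp(-1j * a2 * k ** 2)  # numerical compensation

def psf_width(spec_domain):
    """Width (in FFT bins above half maximum) of the depth-domain intensity peak."""
    a_line = np.abs(np.fft.fft(spec_domain)) ** 2
    return int(np.sum(a_line > 0.5 * a_line.max()))

print("blurred PSF width  :", psf_width(dispersed))
print("corrected PSF width:", psf_width(corrected))
```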
Gram staining with an automatic machine.
Felek, S; Arslan, A
1999-01-01
This study was undertaken to develop a new Gram-staining machine controlled by a micro-controller and to investigate the quality of slides that were stained in the machine. The machine was designed and produced by the authors. It uses standard 220 V AC. Staining, washing, and drying periods are controlled by a timer built into the micro-controller. Software was written that contains the control algorithm and the time intervals for the staining mode. One hundred and forty smears were prepared from Escherichia coli, Staphylococcus aureus, Neisseria sp., blood culture, trypticase soy broth, and direct pus and sputum smears for comparison studies. Half of the slides in each group were stained with the machine and the other half by hand, and they were then examined by four different microbiologists. Machine-stained slides had higher clarity and less debris than the hand-stained slides (p < 0.05). In hand-stained slides, some Gram-positive organisms showed poor Gram-positive staining features (p < 0.05). In conclusion, we suggest that Gram staining with the automatic machine increases staining quality and helps to decrease the workload in a busy diagnostic laboratory.
An Evaluation of TCP with Larger Initial Windows
NASA Technical Reports Server (NTRS)
Allman, Mark; Hayes, Christopher; Ostermann, Shawn
1998-01-01
The Transmission Control Protocol's (TCP's) slow start algorithm gradually increases the amount of data a sender injects into the network, which prevents the sender from overwhelming the network with an inappropriately large burst of traffic. However, the slow start algorithm can make poor use of the available bandwidth for transfers which are small compared to the bandwidth-delay product of the link, such as file transfers of up to a few thousand characters over satellite links, or even transfers of several hundred bytes over local area networks. This paper evaluates a proposed performance enhancement that raises the initial window used by TCP from 1 MSS-sized segment to roughly 4 KB. The paper evaluates the impact of using larger initial windows on TCP transfers over both the shared Internet and dialup modem links.
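The effect being evaluated can be seen with a toy loss-free slow-start counter: the congestion window doubles every round trip, so a larger initial window saves round trips on short transfers. The model below ignores delayed ACKs, loss, and receiver-window limits.

```python
# Toy slow-start model: count RTTs to deliver a small file with different initial
# congestion windows, assuming no loss and doubling every round trip.
MSS = 1460  # bytes per segment (typical Ethernet-sized MSS)

def slow_start_rtts(file_bytes, initial_window_segments):
    """Round trips needed to deliver file_bytes, doubling cwnd each RTT."""
    cwnd = initial_window_segments
    sent, rtts = 0, 0
    while sent < file_bytes:
        sent += cwnd * MSS
        cwnd *= 2
        rtts += 1
    return rtts

for size in (4_000, 16_000, 64_000):
    print(size, "bytes:",
          "IW=1 ->", slow_start_rtts(size, 1), "RTTs,",
          "IW=3 ->", slow_start_rtts(size, 3), "RTTs")
```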
Control of discrete time systems based on recurrent Super-Twisting-like algorithm.
Salgado, I; Kamal, S; Bandyopadhyay, B; Chairez, I; Fridman, L
2016-09-01
Most of the research in sliding mode theory has been carried out in continuous time to solve estimation and control problems. In discrete time, however, results on high-order sliding modes are less developed. In this paper, a discrete time super-twisting-like algorithm (DSTA) is proposed to solve the problems of control and state estimation. The stability proof is developed in terms of the discrete time Lyapunov approach and linear matrix inequalities theory. The system trajectories are ultimately bounded inside a small region dependent on the sampling period. Simulation results validated the DSTA: it was applied as a controller for a Furuta pendulum and for a DC motor supplied by a DSTA signal differentiator. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
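As a flavour of the approach, the sketch below is a minimal Euler-discretised super-twisting differentiator applied to a noisy signal, in the spirit of the DSTA signal differentiator mentioned above. The gains and sampling period are illustrative, and the code is a generic super-twisting sketch, not the paper's recurrent algorithm or its Lyapunov/LMI-based tuning.

```python
# Euler-discretised super-twisting differentiator (generic sketch). Gains lam1,
# lam2 and the sampling period dt are illustrative assumptions.
import numpy as np

def super_twisting_diff(f, dt=1e-3, lam1=10.0, lam2=50.0):
    """Estimate df/dt from samples f using a super-twisting observer."""
    z0, z1 = f[0], 0.0
    deriv = np.empty_like(f)
    for k, fk in enumerate(f):
        e = z0 - fk
        z0 += dt * (-lam1 * np.sqrt(abs(e)) * np.sign(e) + z1)
        z1 += dt * (-lam2 * np.sign(e))
        deriv[k] = z1
    return deriv

t = np.arange(0.0, 2.0, 1e-3)
rng = np.random.default_rng(8)
signal = np.sin(2 * np.pi * t) + 0.002 * rng.standard_normal(t.size)
d_est = super_twisting_diff(signal)
d_true = 2 * np.pi * np.cos(2 * np.pi * t)
print("RMS error after transient:",
      np.sqrt(np.mean((d_est[500:] - d_true[500:]) ** 2)))
```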
NASA Astrophysics Data System (ADS)
Alavi Fazel, S. Ali
2017-09-01
A new optimized model that can predict heat transfer in nucleate boiling in the isolated bubble regime is proposed for pool boiling on a horizontal rod heater. This model is developed based on the results of direct observations of the physical boiling phenomena. The boiling heat flux, wall temperature, bubble departure diameter, bubble generation frequency and bubble nucleation site density have been experimentally measured. Water and ethanol have been used as two different boiling fluids. The heating surfaces were made of several metals with various degrees of roughness. The model considers various mechanisms such as latent heat transfer due to micro-layer evaporation, transient conduction due to thermal boundary layer reformation, natural convection, heat transfer due to sliding bubbles and bubble super-heating. The fractional contributions of the individual heat transfer mechanisms have been calculated by a genetic algorithm. The results show that at wall temperature differences of more than about 3 K, bubble-sliding transient conduction, non-sliding transient conduction, micro-layer evaporation, natural convection, radial forced convection and bubble super-heating have, respectively, higher to lower fractional contributions. The performance of the new optimized model has been verified by comparison with the existing experimental data.
Lun, Aaron T.L.; Smyth, Gordon K.
2016-01-01
Chromatin immunoprecipitation with massively parallel sequencing (ChIP-seq) is widely used to identify binding sites for a target protein in the genome. An important scientific application is to identify changes in protein binding between different treatment conditions, i.e. to detect differential binding. This can reveal potential mechanisms through which changes in binding may contribute to the treatment effect. The csaw package provides a framework for the de novo detection of differentially bound genomic regions. It uses a window-based strategy to summarize read counts across the genome. It exploits existing statistical software to test for significant differences in each window. Finally, it clusters windows into regions for output and controls the false discovery rate properly over all detected regions. The csaw package can handle arbitrarily complex experimental designs involving biological replicates. It can be applied to both transcription factor and histone mark datasets, and, more generally, to any type of sequencing data measuring genomic coverage. csaw performs favorably against existing methods for de novo DB analyses on both simulated and real data. csaw is implemented as a R software package and is freely available from the open-source Bioconductor project. PMID:26578583
Applying a visual language for image processing as a graphical teaching tool in medical imaging
NASA Astrophysics Data System (ADS)
Birchman, James J.; Tanimoto, Steven L.; Rowberg, Alan H.; Choi, Hyung-Sik; Kim, Yongmin
1992-05-01
Typical user interaction in image processing is with command line entries, pull-down menus, or text menu selections from a list, and as such is not generally graphical in nature. Although applying these interactive methods to construct more sophisticated algorithms from a series of simple image processing steps may be clear to engineers and programmers, it may not be clear to clinicians. A solution to this problem is to implement a visual programming language that uses visual representations to express image processing algorithms. Visual representations promote a more natural and rapid understanding of image processing algorithms by providing more visual insight into what the algorithms do than the interactive methods mentioned above can provide. Individuals accustomed to dealing with images will be more likely to understand an algorithm that is represented visually. This is especially true of referring physicians, such as surgeons in an intensive care unit. With the increasing acceptance of picture archiving and communications system (PACS) workstations and the trend toward increasing clinical use of image processing, referring physicians will need to learn more sophisticated concepts than simply image access and display. If the procedures that they perform commonly, such as window width and window level adjustment and image enhancement using unsharp masking, are depicted visually in an interactive environment, it will be easier for them to learn and apply these concepts. The software described in this paper is a visual programming language for image processing which has been implemented on the NeXT computer using NeXTstep user interface development tools and other tools in an object-oriented environment. The concept is based upon the description of a visual language titled `Visualization of Vision Algorithms' (VIVA). Iconic representations of simple image processing steps are placed into a workbench screen and connected together into a dataflow path by the user. As the user creates and edits a dataflow path, more complex algorithms can be built on the screen. Once the algorithm is built, it can be executed, its results can be reviewed, and operator parameters can be interactively adjusted until an optimized output is produced. The optimized algorithm can then be saved and added to the system as a new operator. This system has been evaluated as a graphical teaching tool for window width and window level adjustment, image enhancement using unsharp masking, and other techniques.
Color standardization and optimization in whole slide imaging.
Yagi, Yukako
2011-03-30
Standardization and validation of the color displayed by digital slides is an important aspect of digital pathology implementation. While the most common reason for color variation is variance in the protocols and practices of the histology lab, the color displayed can also be affected by variation in capture parameters (for example, illumination and filters), image processing, and display factors in the digital systems themselves. We have been developing techniques for color validation and optimization along two paths. The first is based on two standard slides that are scanned and displayed by the imaging system in question. In this approach, one slide is embedded with nine filters with colors selected especially for H&E stained slides (resembling a tiny Macbeth color chart); the specific colors of the nine filters were determined in our previous study and modified for whole slide imaging (WSI). The other slide is an H&E stained mouse embryo. Both of these slides were scanned and the displayed images were compared to a standard. The second approach is based on our previous multispectral imaging research. As a first step, the two-slide method (above) was used to identify inaccurate display of color and its causes, and to understand the importance of accurate color in digital pathology. We have also improved the multispectral-based algorithm for more consistent results in stain standardization. In the near future, the results of the two-slide and multispectral techniques can be combined and will be widely available. We have been conducting a series of research and development projects to improve image quality and establish Image Quality Standardization. This paper discusses one of the most important aspects of image quality: color.
A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue
Nyengaard, Jens Randel; Lind, Martin; Spector, Myron
2015-01-01
Objective: To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design: Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes, providing 7 to 10 hematoxylin–eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results: We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm³ (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion: We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715
A Unified Estimation Framework for State-Related Changes in Effective Brain Connectivity.
Samdin, S Balqis; Ting, Chee-Ming; Ombao, Hernando; Salleh, Sh-Hussain
2017-04-01
This paper addresses the critical problem of estimating time-evolving effective brain connectivity. Current approaches based on sliding window analysis or time-varying coefficient models do not simultaneously capture both slow and abrupt changes in causal interactions between different brain regions. To overcome these limitations, we develop a unified framework based on a switching vector autoregressive (SVAR) model. Here, the dynamic connectivity regimes are uniquely characterized by distinct vector autoregressive (VAR) processes and allowed to switch between quasi-stationary brain states. The state evolution and the associated directed dependencies are defined by a Markov process and the SVAR parameters. We develop a three-stage estimation algorithm for the SVAR model: 1) feature extraction using time-varying VAR (TV-VAR) coefficients, 2) preliminary regime identification via clustering of the TV-VAR coefficients, 3) refined regime segmentation by Kalman smoothing and parameter estimation via expectation-maximization algorithm under a state-space formulation, using initial estimates from the previous two stages. The proposed framework is adaptive to state-related changes and gives reliable estimates of effective connectivity. Simulation results show that our method provides accurate regime change-point detection and connectivity estimates. In real applications to brain signals, the approach was able to capture directed connectivity state changes in functional magnetic resonance imaging data linked with changes in stimulus conditions, and in epileptic electroencephalograms, differentiating ictal from nonictal periods. The proposed framework accurately identifies state-dependent changes in brain network and provides estimates of connectivity strength and directionality. The proposed approach is useful in neuroscience studies that investigate the dynamics of underlying brain states.
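A toy version of the first two stages can be sketched as follows: VAR(1) coefficients are estimated by least squares in sliding windows, and the vectorised coefficients are clustered with k-means to give a preliminary regime segmentation; the Kalman-smoothing/EM refinement of stage 3 is omitted, and the simulated two-regime signal is an assumption.

```python
# Toy sketch of stages 1-2: sliding-window VAR(1) coefficient estimation followed
# by k-means clustering of the vectorised coefficients. Synthetic two-regime data.
import numpy as np
from sklearn.cluster import KMeans

def var1_coeffs(x):
    """Least-squares VAR(1) coefficient matrix for window x (time x channels)."""
    past, future = x[:-1], x[1:]
    A, *_ = np.linalg.lstsq(past, future, rcond=None)
    return A.T.ravel()

def sliding_var_features(x, win=100, step=20):
    starts = range(0, len(x) - win, step)
    return np.array([var1_coeffs(x[s:s + win]) for s in starts]), list(starts)

# Simulate two alternating connectivity regimes for a 2-channel signal.
rng = np.random.default_rng(9)
A1 = np.array([[0.5, 0.0], [0.4, 0.5]])   # regime 1: channel 1 drives channel 2
A2 = np.array([[0.5, 0.4], [0.0, 0.5]])   # regime 2: reversed direction
x = np.zeros((2000, 2))
for t in range(1, 2000):
    A = A1 if (t // 500) % 2 == 0 else A2
    x[t] = A @ x[t - 1] + 0.5 * rng.standard_normal(2)

features, starts = sliding_var_features(x)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print("regime label per window:", labels)
```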
A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue.
Foldager, Casper Bindzus; Nyengaard, Jens Randel; Lind, Martin; Spector, Myron
2015-04-01
To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes, providing 7 to 10 hematoxylin-eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm³ (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage.
Quick Vegas: Improving Performance of TCP Vegas for High Bandwidth-Delay Product Networks
NASA Astrophysics Data System (ADS)
Chan, Yi-Cheng; Lin, Chia-Liang; Ho, Cheng-Yuan
An important issue in designing a TCP congestion control algorithm is that it should allow the protocol to quickly adjust the end-to-end communication rate to the bandwidth on the bottleneck link. However, the TCP congestion control may function poorly in high bandwidth-delay product networks because of its slow response with large congestion windows. In this paper, we propose an enhanced version of TCP Vegas called Quick Vegas, in which we present an efficient congestion window control algorithm for a TCP source. Our algorithm improves the slow-start and congestion avoidance techniques of original Vegas. Simulation results show that Quick Vegas significantly improves the performance of connections as well as remaining fair when the bandwidth-delay product increases.
Managing coherence via put/get windows
Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton on Hudson, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Hoenicke, Dirk [Ossining, NY; Ohmacht, Martin [Yorktown Heights, NY
2011-01-11
A method and apparatus for managing coherence between two processors of a two processor node of a multi-processor computer system. Generally the present invention relates to a software algorithm that simplifies and significantly speeds the management of cache coherence in a message passing parallel computer, and to hardware apparatus that assists this cache coherence algorithm. The software algorithm uses the opening and closing of put/get windows to coordinate the activity required to achieve cache coherence. The hardware apparatus may be an extension to the hardware address decode that creates, in the physical memory address space of the node, an area of virtual memory that (a) does not actually exist, and (b) is therefore able to respond instantly to read and write requests from the processing elements.
Managing coherence via put/get windows
Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton on Hudson, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Hoenicke, Dirk [Ossining, NY; Ohmacht, Martin [Yorktown Heights, NY
2012-02-21
A method and apparatus for managing coherence between two processors of a two processor node of a multi-processor computer system. Generally the present invention relates to a software algorithm that simplifies and significantly speeds the management of cache coherence in a message passing parallel computer, and to hardware apparatus that assists this cache coherence algorithm. The software algorithm uses the opening and closing of put/get windows to coordinate the activity required to achieve cache coherence. The hardware apparatus may be an extension to the hardware address decode that creates, in the physical memory address space of the node, an area of virtual memory that (a) does not actually exist, and (b) is therefore able to respond instantly to read and write requests from the processing elements.
Multilocus Association Mapping Using Variable-Length Markov Chains
Browning, Sharon R.
2006-01-01
I propose a new method for association-based gene mapping that makes powerful use of multilocus data, is computationally efficient, and is straightforward to apply over large genomic regions. The approach is based on the fitting of variable-length Markov chain models, which automatically adapt to the degree of linkage disequilibrium (LD) between markers to create a parsimonious model for the LD structure. Edges of the fitted graph are tested for association with trait status. This approach can be thought of as haplotype testing with sophisticated windowing that accounts for extent of LD to reduce degrees of freedom and number of tests while maximizing information. I present analyses of two published data sets that show that this approach can have better power than single-marker tests or sliding-window haplotypic tests. PMID:16685642
Multilocus association mapping using variable-length Markov chains.
Browning, Sharon R
2006-06-01
I propose a new method for association-based gene mapping that makes powerful use of multilocus data, is computationally efficient, and is straightforward to apply over large genomic regions. The approach is based on the fitting of variable-length Markov chain models, which automatically adapt to the degree of linkage disequilibrium (LD) between markers to create a parsimonious model for the LD structure. Edges of the fitted graph are tested for association with trait status. This approach can be thought of as haplotype testing with sophisticated windowing that accounts for extent of LD to reduce degrees of freedom and number of tests while maximizing information. I present analyses of two published data sets that show that this approach can have better power than single-marker tests or sliding-window haplotypic tests.
Finite-time stabilization of chaotic gyros based on a homogeneous supertwisting-like algorithm
NASA Astrophysics Data System (ADS)
Khamsuwan, Pitcha; Sangpet, Teerawat; Kuntanapreeda, Suwat
2018-01-01
This paper presents a finite-time stabilization scheme for nonlinear chaotic gyros. The scheme utilizes a supertwisting-like continuous control algorithm for systems of dimension greater than one with a Lipschitz disturbance. The algorithm yields finite-time convergence similar to that produced by discontinuous sliding mode control algorithms. To design the controller, the nonlinearities in the gyro are treated as a disturbance in the system. Thanks to the dissipativeness of chaotic systems, the nonlinearities also possess the Lipschitz property. Numerical results are provided to illustrate the effectiveness of the scheme.
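The super-twisting structure referred to above combines a continuous square-root term with an integral switching term; a minimal sketch on a scalar sliding variable with a bounded Lipschitz disturbance follows (illustrative gains, not the paper's gyro controller).

```python
import numpy as np

def supertwisting_step(s, v, k1, k2, dt):
    """One Euler step of the super-twisting algorithm on sliding variable s.

    u  = -k1 * sqrt(|s|) * sign(s) + v
    dv = -k2 * sign(s)
    The control u is continuous, yet drives s and its derivative to zero in
    finite time when the gains dominate the Lipschitz disturbance bound.
    """
    u = -k1 * np.sqrt(abs(s)) * np.sign(s) + v
    v = v - k2 * np.sign(s) * dt
    return u, v

# Toy closed loop: s_dot = u + d(t) with a bounded, Lipschitz disturbance d.
dt, s, v = 1e-3, 1.0, 0.0
for k in range(3000):
    u, v = supertwisting_step(s, v, k1=1.5, k2=1.1, dt=dt)
    d = 0.5 * np.sin(2 * np.pi * k * dt)     # illustrative disturbance
    s += (u + d) * dt
print(f"|s| after 3 s: {abs(s):.4f}")
```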
Mousavi, Hojjat Seyed; Monga, Vishal; Rao, Ganesh; Rao, Arvind U K
2015-01-01
Histopathological images have rich structural information, are multi-channel in nature and contain meaningful pathological information at various scales. Sophisticated image analysis tools that can automatically extract discriminative information from the histopathology image slides for diagnosis remain an area of significant research activity. In this work, we focus on automated brain cancer grading, specifically glioma grading. Grading of a glioma is a highly important problem in pathology and is largely done manually by medical experts based on an examination of pathology slides (images). To complement the efforts of clinicians engaged in brain cancer diagnosis, we develop novel image processing algorithms and systems to automatically grade glioma tumor into two categories: low-grade glioma (LGG) and high-grade glioma (HGG), the latter representing a more advanced stage of the disease. We propose novel image processing algorithms based on spatial domain analysis for glioma tumor grading that will complement the clinical interpretation of the tissue. The image processing techniques are developed in close collaboration with medical experts to mimic the visual cues that a clinician looks for in judging the grade of the disease. Specifically, two algorithmic techniques are developed: (1) a cell segmentation and cell-count profile creation for identification of Pseudopalisading Necrosis, and (2) a customized operation of spatial and morphological filters to accurately identify microvascular proliferation (MVP). In both techniques, a hierarchical decision is made via a decision tree mechanism. If either Pseudopalisading Necrosis or MVP is found to be present in any part of the histopathology slide, the whole slide is identified as HGG, which is consistent with World Health Organization guidelines. Experimental results on the Cancer Genome Atlas database are presented in the form of: (1) successful detection rates of pseudopalisading necrosis and MVP regions, (2) overall classification accuracy into LGG and HGG categories, and (3) receiver operating characteristic curves which can facilitate a desirable trade-off between HGG detection and false-alarm rates. The proposed method demonstrates fairly high accuracy and compares favorably against the best-known alternatives, such as the state-of-the-art WND-CHARM feature set provided by NIH combined with a powerful support vector machine classifier. Our results reveal that the proposed method can be beneficial to a clinician in effectively separating histopathology slides into LGG and HGG categories, particularly where the analysis of a large number of slides is needed. Our work also reveals that MVP regions are much harder to detect than Pseudopalisading Necrosis, and increasing the accuracy of automated image processing for MVP detection emerges as a significant future research direction.
Galias, Zbigniew
2017-05-01
An efficient method to find positions of periodic windows for the quadratic map f(x)=ax(1-x) and a heuristic algorithm to locate the majority of wide periodic windows are proposed. Accurate rigorous bounds of positions of all periodic windows with periods below 37 and the majority of wide periodic windows with longer periods are found. Based on these results, we prove that the measure of the set of regular parameters in the interval [3,4] is above 0.613960137. The properties of periodic windows are studied numerically. The results of the analysis are used to estimate that the true value of the measure of the set of regular parameters is close to 0.6139603.
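A crude, non-rigorous way to probe for periodic windows of the quadratic map is to iterate from the critical point and test the attractor for periodicity at each parameter value; the sketch below does exactly that and stands in for, but is much weaker than, the rigorous interval bounds of the paper.

```python
def attractor_period(a, max_period=36, n_transient=10_000, tol=1e-9):
    """Crude numerical test for a periodic attractor of f(x) = a*x*(1-x).

    Returns the smallest detected period <= max_period, or None if the orbit
    looks chaotic at this tolerance. (The paper uses rigorous interval-based
    bounds; this is only an illustrative probe.)
    """
    x = 0.5                      # critical point: converges to the attracting orbit
    for _ in range(n_transient):
        x = a * x * (1.0 - x)
    orbit = [x]
    for _ in range(max_period):
        x = a * x * (1.0 - x)
        orbit.append(x)
    for p in range(1, max_period + 1):
        if abs(orbit[p] - orbit[0]) < tol:
            return p
    return None

# Scan a few parameters around the wide period-3 window near a = 3.83.
for a in (3.828, 3.832, 3.845, 3.855):
    print(a, attractor_period(a))
```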
Wong, William W L; Feng, Zeny Z; Thein, Hla-Hla
2016-11-01
Agent-based models (ABMs) are computer simulation models that define interactions among agents and simulate emergent behaviors that arise from the ensemble of local decisions. ABMs have been increasingly used to examine trends in infectious disease epidemiology. However, the main limitation of ABMs is the high computational cost for a large-scale simulation. To improve the computational efficiency for large-scale ABM simulations, we built a parallelizable sliding region algorithm (SRA) for ABM and compared it to a nonparallelizable ABM. We developed a complex agent network and performed two simulations to model hepatitis C epidemics based on the real demographic data from Saskatchewan, Canada. The first simulation used the SRA, which processed each postal code subregion sequentially. The second simulation processed the entire population simultaneously. It was concluded that the parallelizable SRA showed computational time savings with comparable results in a province-wide simulation. Using the same method, the SRA can be generalized to perform a country-wide simulation. Thus, this parallel algorithm makes it possible to use ABMs for large-scale simulations with limited computational resources.
NASA Astrophysics Data System (ADS)
Knudsen, Steven; Golubovic, Leonardo
Prospects to build Space Elevator (SE) systems have become realistic with ultra-strong materials such as carbon nano-tubes and diamond nano-threads. At cosmic length-scales, space elevators can be modeled as polymer-like floppy strings of tethered mass beads. A new avenue in SE science has emerged with the introduction of the Rotating Space Elevator (RSE) concept supported by novel algorithms discussed in this presentation. An RSE is a loopy string reaching into outer space. Unlike the classical geostationary SE concepts of Tsiolkovsky, Artsutanov, and Pearson, our RSE exhibits an internal rotation. Thanks to this, objects sliding along the RSE loop spontaneously oscillate between two turning points, one of which is close to the Earth whereas the other one is in outer space. The RSE concept thus solves a major problem in SE technology, namely how to supply energy to the climbers moving along space elevator strings. The investigation of the classical and statistical mechanics of a floppy string interacting with objects sliding along it required development of subtle computational algorithms described in this presentation.
Resource-constrained scheduling with hard due windows and rejection penalties
NASA Astrophysics Data System (ADS)
Garcia, Christopher
2016-09-01
This work studies a scheduling problem where each job must be either accepted and scheduled to complete within its specified due window, or rejected altogether. Each job has a certain processing time and contributes a certain profit if accepted or penalty cost if rejected. There is a set of renewable resources, and no resource limit can be exceeded at any time. Each job requires a certain amount of each resource when processed, and the objective is to maximize total profit. A mixed-integer programming formulation and three approximation algorithms are presented: a priority rule heuristic, an algorithm based on the metaheuristic for randomized priority search and an evolutionary algorithm. Computational experiments comparing these four solution methods were performed on a set of generated benchmark problems covering a wide range of problem characteristics. The evolutionary algorithm outperformed the other methods in most cases, often significantly, and never significantly underperformed any method.
The dynamic financial distress prediction method of EBW-VSTW-SVM
NASA Astrophysics Data System (ADS)
Sun, Jie; Li, Hui; Chang, Pei-Chann; He, Kai-Yu
2016-07-01
Financial distress prediction (FDP) plays an important role in corporate financial risk management. Most former research in this field tried to construct effective static FDP (SFDP) models that are difficult to embed into enterprise information systems, because they are based on horizontal data-sets collected outside the modelling enterprise and define financial distress as absolute conditions such as bankruptcy or insolvency. This paper attempts to propose an approach for dynamic evaluation and prediction of financial distress based on the entropy-based weighting (EBW), the support vector machine (SVM) and an enterprise's vertical sliding time window (VSTW). The dynamic FDP (DFDP) method is named EBW-VSTW-SVM, which keeps updating the FDP model dynamically as time goes on, only needs the historical financial data of the modelling enterprise itself, and thus is easier to embed into enterprise information systems. The DFDP method of EBW-VSTW-SVM consists of four steps, namely evaluation of vertical relative financial distress (VRFD) based on EBW, construction of training data-set for DFDP modelling according to VSTW, training of DFDP model based on SVM and DFDP for the future time point. We carry out case studies for two listed pharmaceutical companies and experimental analysis for some other companies to simulate the sliding of enterprise vertical time window. The results indicated that the proposed approach was feasible and efficient in helping managers improve corporate financial management.
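The entropy-based weighting step can be illustrated with the textbook entropy-weight computation over a matrix of financial indicators; the normalization details below are assumptions, as the paper's exact variant is not restated here.

```python
import numpy as np

def entropy_weights(X, eps=1e-12):
    """Textbook entropy-weighting of indicators (columns of X).

    X has shape (n_periods, n_indicators). Indicators whose values are more
    dispersed across periods carry lower entropy and therefore receive higher
    weights. The paper's exact variant may differ from this standard form.
    """
    X = np.asarray(X, dtype=float)
    # Min-max scale each indicator so magnitudes are comparable.
    X = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0) + eps)
    P = (X + eps) / (X + eps).sum(axis=0)             # column-wise proportions
    n = X.shape[0]
    entropy = -(P * np.log(P)).sum(axis=0) / np.log(n)
    divergence = 1.0 - entropy                        # information utility
    return divergence / divergence.sum()

# Toy example: three financial ratios observed over six periods.
ratios = np.array([[0.12, 1.8, 0.35],
                   [0.10, 1.9, 0.34],
                   [0.08, 1.7, 0.36],
                   [0.05, 1.8, 0.35],
                   [0.03, 1.6, 0.34],
                   [0.01, 1.9, 0.35]])
print(entropy_weights(ratios))
```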
Sliding window adaptive histogram equalization of intraoral radiographs: effect on image quality.
Sund, T; Møystad, A
2006-05-01
To investigate whether contrast enhancement by non-interactive, sliding window adaptive histogram equalization (SWAHE) can enhance the image quality of intraoral radiographs in the dental clinic. Three dentists read 22 periapical and 12 bitewing storage phosphor (SP) radiographs. For the periapical readings they graded the quality of the examination with regard to visually locating the root apex. For the bitewing readings they registered all occurrences of approximal caries on a confidence scale. Each reading was first done on an unprocessed radiograph ("single-view"), and then re-done with the image processed with SWAHE displayed beside the unprocessed version ("twin-view"). The processing parameters for SWAHE were the same for all the images. For the periapical examinations, twin-view was judged to raise the image quality for 52% of those cases where the single-view quality was below the maximum. For the bitewing radiographs, there was a change of caries classification (both positive and negative) with twin-view in 19% of the cases, but with only a 3% net increase in the total number of caries registrations. For both examinations interobserver variance was unaffected. Non-interactive SWAHE applied to dental SP radiographs produces a supplemental contrast enhanced image which in twin-view reading improves the image quality of periapical examinations. SWAHE also affects caries diagnosis of bitewing images, and further study using a gold standard is warranted.
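SWAHE remaps each pixel through the cumulative histogram of its own local window; a direct (deliberately slow but explicit) sketch follows. Practical implementations update the histogram incrementally as the window slides, and the processing parameters used in the study are not reproduced here.

```python
import numpy as np

def swahe(image, radius=32, n_levels=256):
    """Sliding window adaptive histogram equalization (direct form).

    Each output pixel is the local CDF value of its own grey level, computed
    over a (2*radius+1)^2 window centred on the pixel. This direct version is
    only meant to make the mapping explicit; it is not the study's code.
    """
    img = np.asarray(image)
    padded = np.pad(img, radius, mode="reflect")
    out = np.empty(img.shape, dtype=float)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            win = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Fraction of window pixels at or below the centre pixel's level.
            out[i, j] = np.count_nonzero(win <= img[i, j]) / win.size
    return np.round(out * (n_levels - 1)).astype(np.uint8)

# Usage on a synthetic low-contrast image.
rng = np.random.default_rng(1)
test = (40 + 20 * rng.random((64, 64))).astype(np.uint8)
enhanced = swahe(test, radius=8)
print(test.min(), test.max(), "->", enhanced.min(), enhanced.max())
```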
Orellana, Luis H.; Rodriguez-R, Luis M.; Konstantinidis, Konstantinos T.
2016-10-07
Functional annotation of metagenomic and metatranscriptomic data sets relies on similarity searches based on e-value thresholds resulting in an unknown number of false positive and negative matches. To overcome these limitations, we introduce ROCker, aimed at identifying position-specific, most-discriminant thresholds in sliding windows along the sequence of a target protein, accounting for non-discriminative domains shared by unrelated proteins. ROCker employs the receiver operating characteristic (ROC) curve to minimize false discovery rate (FDR) and calculate the best thresholds based on how simulated shotgun metagenomic reads of known composition map onto well-curated reference protein sequences and thus differs from HMM profiles and related methods. We showcase ROCker using ammonia monooxygenase (amoA) and nitrous oxide reductase (nosZ) genes, mediating oxidation of ammonia and the reduction of the potent greenhouse gas, N2O, to inert N2, respectively. ROCker typically showed 60-fold lower FDR when compared to the common practice of using fixed e-values. Previously uncounted ‘atypical’ nosZ genes were found to be two times more abundant, on average, than their typical counterparts in most soil metagenomes and the abundance of bacterial amoA was quantified against the highly-related particulate methane monooxygenase (pmoA). Therefore, ROCker can reliably detect and quantify target genes in short-read metagenomes.
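The essence of the position-specific thresholding can be sketched as follows: within each sliding window along the reference protein, pick the score cutoff that best separates simulated target reads from off-target reads. The sketch uses a Youden-style ROC optimum and an assumed input layout; ROCker itself minimizes FDR and works directly from read mappings.

```python
import numpy as np

def best_threshold(scores, is_target):
    """Cutoff maximizing sensitivity minus false positive rate for one window
    (a Youden-style ROC optimum, standing in for ROCker's FDR minimization)."""
    scores = np.asarray(scores, dtype=float)
    is_target = np.asarray(is_target, dtype=bool)
    order = np.argsort(scores)[::-1]
    scores, is_target = scores[order], is_target[order]
    tp = np.cumsum(is_target)
    fp = np.cumsum(~is_target)
    tpr = tp / max(tp[-1], 1)
    fpr = fp / max(fp[-1], 1)
    return scores[np.argmax(tpr - fpr)]

def windowed_thresholds(positions, scores, is_target, window=50, step=25):
    """Position-specific cutoffs along the reference protein.

    positions: alignment midpoint of each simulated read on the reference;
    scores: its bit score; is_target: whether the read truly derives from the
    target gene. This input layout is an assumption made for the sketch.
    """
    positions = np.asarray(positions, dtype=float)
    scores = np.asarray(scores, dtype=float)
    is_target = np.asarray(is_target, dtype=bool)
    cutoffs = {}
    for start in range(0, int(positions.max()) - window + 1, step):
        mask = (positions >= start) & (positions < start + window)
        if mask.sum() >= 20:              # require enough reads per window
            cutoffs[start] = best_threshold(scores[mask], is_target[mask])
    return cutoffs
```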
An Event-Triggered Machine Learning Approach for Accelerometer-Based Fall Detection.
Putra, I Putu Edy Suardiyana; Brusey, James; Gaura, Elena; Vesilo, Rein
2017-12-22
The fixed-size non-overlapping sliding window (FNSW) and fixed-size overlapping sliding window (FOSW) approaches are the most commonly used data-segmentation techniques in machine learning-based fall detection using accelerometer sensors. However, these techniques do not segment by fall stages (pre-impact, impact, and post-impact) and thus useful information is lost, which may reduce the detection rate of the classifier. Aligning the segment with the fall stage is difficult, as the segment size varies. We propose an event-triggered machine learning (EvenT-ML) approach that aligns each fall stage so that the characteristic features of the fall stages are more easily recognized. To evaluate our approach, two publicly accessible datasets were used. Classification and regression tree (CART), k-nearest neighbor (k-NN), logistic regression (LR), and the support vector machine (SVM) were used to train the classifiers. EvenT-ML gives classifier F-scores of 98% for a chest-worn sensor and 92% for a waist-worn sensor, and significantly reduces the computational cost compared with the FNSW- and FOSW-based approaches, with reductions of up to 8-fold and 78-fold, respectively. EvenT-ML achieves a significantly better F-score than existing fall detection approaches. These results indicate that aligning feature segments with fall stages significantly increases the detection rate and reduces the computational cost.
NASA Astrophysics Data System (ADS)
Zhou, Ping; Barkhaus, Paul E.; Zhang, Xu; Zev Rymer, William
2011-10-01
This paper presents a novel application of the approximate entropy (ApEn) measurement for characterizing spontaneous motor unit activity of amyotrophic lateral sclerosis (ALS) patients. High-density surface electromyography (EMG) was used to record spontaneous motor unit activity bilaterally from the thenar muscles of nine ALS subjects. Three distinct patterns of spontaneous motor unit activity (sporadic spikes, tonic spikes and high-frequency repetitive spikes) were observed. For each pattern, complexity was characterized by calculating the ApEn values of the representative signal segments. A sliding window over each segment was also introduced to quantify the dynamic changes in complexity for the different spontaneous motor unit patterns. We found that the ApEn values for the sporadic spikes were the highest, while those of the high-frequency repetitive spikes were the lowest. There is a significant difference in mean ApEn values between two arbitrary groups of the three spontaneous motor unit patterns (P < 0.001). The dynamic ApEn curve from the sliding window analysis is capable of tracking variations in EMG activity, thus providing a vivid, distinctive description for different patterns of spontaneous motor unit action potentials in terms of their complexity. These findings expand the existing knowledge of spontaneous motor unit activity in ALS beyond what was previously obtained using conventional linear methods such as firing rate or inter-spike interval statistics.
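The complexity measure used here is Pincus' approximate entropy; a standard formulation, plus a sliding-window wrapper, is sketched below with conventional settings (m = 2, r = 0.2 x SD), which may differ from those used in the study.

```python
import numpy as np

def apen(x, m=2, r_factor=0.2):
    """Approximate entropy ApEn(m, r) of a 1-D signal (Pincus' formulation)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def phi(m):
        # All length-m templates, compared with the Chebyshev distance.
        templates = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        counts = (dist <= r).mean(axis=1)     # includes the self-match, as in ApEn
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)

def sliding_apen(x, window=250, step=125, **kwargs):
    """ApEn computed over a sliding window, tracking complexity over time."""
    return [apen(x[s:s + window], **kwargs)
            for s in range(0, len(x) - window + 1, step)]

# Regular (tonic-like) spikes vs. irregular (sporadic-like) activity.
rng = np.random.default_rng(2)
t = np.arange(1000)
tonic = np.sin(2 * np.pi * t / 20) + 0.05 * rng.standard_normal(t.size)
sporadic = rng.standard_normal(t.size)
print(apen(tonic), apen(sporadic))            # the irregular signal scores higher
print(len(sliding_apen(sporadic)))            # number of sliding-window estimates
```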
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orellana, Luis H.; Rodriguez-R, Luis M.; Konstantinidis, Konstantinos T.
Functional annotation of metagenomic and metatranscriptomic data sets relies on similarity searches based on e-value thresholds resulting in an unknown number of false positive and negative matches. To overcome these limitations, we introduce ROCker, aimed at identifying position-specific, most-discriminant thresholds in sliding windows along the sequence of a target protein, accounting for non-discriminative domains shared by unrelated proteins. ROCker employs the receiver operating characteristic (ROC) curve to minimize false discovery rate (FDR) and calculate the best thresholds based on how simulated shotgun metagenomic reads of known composition map onto well-curated reference protein sequences and thus differs from HMM profiles and related methods. We showcase ROCker using ammonia monooxygenase (amoA) and nitrous oxide reductase (nosZ) genes, mediating oxidation of ammonia and the reduction of the potent greenhouse gas, N2O, to inert N2, respectively. ROCker typically showed 60-fold lower FDR when compared to the common practice of using fixed e-values. Previously uncounted ‘atypical’ nosZ genes were found to be two times more abundant, on average, than their typical counterparts in most soil metagenomes and the abundance of bacterial amoA was quantified against the highly-related particulate methane monooxygenase (pmoA). Therefore, ROCker can reliably detect and quantify target genes in short-read metagenomes.
2017-01-01
Functional annotation of metagenomic and metatranscriptomic data sets relies on similarity searches based on e-value thresholds resulting in an unknown number of false positive and negative matches. To overcome these limitations, we introduce ROCker, aimed at identifying position-specific, most-discriminant thresholds in sliding windows along the sequence of a target protein, accounting for non-discriminative domains shared by unrelated proteins. ROCker employs the receiver operating characteristic (ROC) curve to minimize false discovery rate (FDR) and calculate the best thresholds based on how simulated shotgun metagenomic reads of known composition map onto well-curated reference protein sequences and thus differs from HMM profiles and related methods. We showcase ROCker using ammonia monooxygenase (amoA) and nitrous oxide reductase (nosZ) genes, mediating oxidation of ammonia and the reduction of the potent greenhouse gas, N2O, to inert N2, respectively. ROCker typically showed 60-fold lower FDR when compared to the common practice of using fixed e-values. Previously uncounted ‘atypical’ nosZ genes were found to be two times more abundant, on average, than their typical counterparts in most soil metagenomes and the abundance of bacterial amoA was quantified against the highly-related particulate methane monooxygenase (pmoA). Therefore, ROCker can reliably detect and quantify target genes in short-read metagenomes. PMID:28180325
Zeng, Dong; Gao, Yuanyuan; Huang, Jing; Bian, Zhaoying; Zhang, Hua; Lu, Lijun; Ma, Jianhua
2016-10-01
Multienergy computed tomography (MECT) allows identifying and differentiating different materials through simultaneous capture of multiple sets of energy-selective data belonging to specific energy windows. However, because sufficient photon counts are not available in each energy window compared with that in the whole energy window, the MECT images reconstructed by the analytical approach often suffer from a poor signal-to-noise ratio and strong streak artifacts. To address the particular challenge, this work presents a penalized weighted least-squares (PWLS) scheme by incorporating the new concept of structure tensor total variation (STV) regularization, which is henceforth referred to as 'PWLS-STV' for simplicity. Specifically, the STV regularization is derived by penalizing higher-order derivatives of the desired MECT images. Thus it could provide more robust measures of image variation, which can eliminate the patchy artifacts often observed in total variation (TV) regularization. Subsequently, an alternating optimization algorithm was adopted to minimize the objective function. Extensive experiments with a digital XCAT phantom and a meat specimen clearly demonstrate that the present PWLS-STV algorithm can achieve more gains than the existing TV-based algorithms and the conventional filtered backprojection (FBP) algorithm in terms of both quantitative and visual quality evaluations. Copyright © 2016 Elsevier Ltd. All rights reserved.
Pre-launch Performance Assessment of the VIIRS Ice Surface Temperature Algorithm
NASA Astrophysics Data System (ADS)
Ip, J.; Hauss, B.
2008-12-01
The VIIRS Ice Surface Temperature (IST) environmental data product provides the surface temperature of sea-ice at VIIRS moderate resolution (750m) during both day and night. To predict the IST, the retrieval algorithm utilizes a split-window approach with Long-wave Infrared (LWIR) channels at 10.76 μm (M15) and 12.01 μm (M16) to correct for atmospheric water vapor. The split-window approach using these LWIR channels is AVHRR and MODIS heritage, where the MODIS formulation has a slightly modified functional form. The algorithm relies on the VIIRS Cloud Mask IP for identifying cloudy and ocean pixels, the VIIRS Ice Concentration IP for identifying ice pixels, and the VIIRS Aerosol Optical Thickness (AOT) IP for excluding pixels with AOT greater than 1.0. In this paper, we will report the pre-launch performance assessment of the IST retrieval. We have taken two separate approaches to perform this assessment, one based on global synthetic data and the other based on proxy data from Terra MODIS. Results of the split-window algorithm have been assessed by comparison either to synthetic "truth" or to results of the MODIS retrieval. We will also show that the results of the assessment with proxy data are consistent with those obtained using the global synthetic data.
NASA Astrophysics Data System (ADS)
Zeng, Dong; Bian, Zhaoying; Gong, Changfei; Huang, Jing; He, Ji; Zhang, Hua; Lu, Lijun; Feng, Qianjin; Liang, Zhengrong; Ma, Jianhua
2016-03-01
Multienergy computed tomography (MECT) has the potential to simultaneously offer multiple sets of energy-selective data belonging to specific energy windows. However, because sufficient photon counts are not available in the specific energy windows compared with that in the whole energy window, the MECT images reconstructed by the analytical approach often suffer from a poor signal-to-noise ratio (SNR) and strong streak artifacts. To eliminate this drawback, in this work we present a penalized weighted least-squares (PWLS) scheme by incorporating the new concept of structure tensor total variation (STV) regularization to improve the MECT image quality from low-milliampere-seconds (low-mAs) data acquisitions. Henceforth the present scheme is referred to as 'PWLS-STV' for simplicity. Specifically, the STV regularization is derived by penalizing the eigenvalues of the structure tensor of every point in the MECT images. Thus it can provide more robust measures of image variation, which can eliminate the patchy artifacts often observed in total variation regularization. Subsequently, an alternating optimization algorithm was adopted to minimize the objective function. Experiments with a digital XCAT phantom clearly demonstrate that the present PWLS-STV algorithm can achieve more gains than the existing TV-based algorithms and the conventional filtered backprojection (FBP) algorithm in terms of noise-induced artifact suppression, resolution preservation, and material decomposition assessment.
Chen, Jie; Li, Jiahong; Yang, Shuanghua; Deng, Fang
2017-11-01
The identification of the nonlinearity and coupling is crucial in the nonlinear target tracking problem in collaborative sensor networks. According to the adaptive Kalman filtering (KF) method, the nonlinearity and coupling can be regarded as the model noise covariance, and estimated by minimizing the innovation or residual errors of the states. However, the method requires a large time window of data to achieve reliable covariance measurement, making it impractical for rapidly changing nonlinear systems. To deal with the problem, a weighted optimization-based distributed KF algorithm (WODKF) is proposed in this paper. The algorithm enlarges the data size of each sensor by using the received measurements and state estimates from its connected sensors instead of the time window. A new cost function is defined as the weighted sum of the bias and oscillation of the state and is used to obtain the "best" estimate of the model noise covariance. The bias and oscillation of the state of each sensor are estimated by polynomial fitting over a time window of state estimates and measurements of the sensor and its neighbors, weighted by the measurement noise covariance. The best estimate of the model noise covariance is computed by minimizing the weighted cost function using the exhaustive method. A sensor selection method is added to the algorithm to decrease the computational load of the filter and increase the scalability of the sensor network. The existence, suboptimality and stability analysis of the algorithm are given. The local probability data association method is used in the proposed algorithm for the multitarget tracking case. The algorithm is demonstrated in simulations on tracking examples for a random signal, one nonlinear target, and four nonlinear targets. Results show the feasibility and superiority of WODKF against other filtering algorithms for a large class of systems.
Land surface temperature measurements from EOS MODIS data
NASA Technical Reports Server (NTRS)
Wan, Zhengming
1994-01-01
A generalized split-window method for retrieving land-surface temperature (LST) from AVHRR and MODIS data has been developed. Accurate radiative transfer simulations show that the coefficients in the split-window algorithm for LST must depend on the viewing angle, if we are to achieve a LST accuracy of about 1 K for the whole scan swath range (+/-55.4 deg and +/-55 deg from nadir for AVHRR and MODIS, respectively) and for the ranges of surface temperature and atmospheric conditions over land, which are much wider than those over oceans. We obtain these coefficients from regression analysis of radiative transfer simulations, and we analyze sensitivity and error by using results from systematic radiative transfer simulations over wide ranges of surface temperatures and emissivities, and atmospheric water vapor abundance and temperatures. Simulations indicated that as atmospheric column water vapor increases and viewing angle is larger than 45 deg it is necessary to optimize the split-window method by separating the ranges of the atmospheric column water vapor and lower boundary temperature, and the surface temperature into tractable sub-ranges. The atmospheric lower boundary temperature and (vertical) column water vapor values retrieved from HIRS/2 or MODIS atmospheric sounding channels can be used to determine the range where the optimum coefficients of the split-window method are given. This new LST algorithm not only retrieves LST more accurately but also is less sensitive than viewing-angle independent LST algorithms to the uncertainty in the band emissivities of the land-surface in the split-window and to the instrument noise.
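The generalized split-window retrieval is a regression on the mean and difference of the two brightness temperatures with emissivity-dependent brackets; the sketch below shows that functional form with placeholder coefficients (the operational values are regressed per viewing angle, water-vapor and temperature sub-range and are not reproduced here).

```python
def split_window_lst(t11, t12, eps11, eps12,
                     coeffs=(-0.4, 1.0, 0.2, 0.1, 1.0, 0.3, 0.5)):
    """Generalized split-window land-surface temperature (LST) estimate.

    t11, t12     : brightness temperatures (K) in the ~11 and ~12 micron bands
    eps11, eps12 : surface emissivities in the two bands
    coeffs       : (c, a1, a2, a3, b1, b2, b3) -- placeholder values only; the
                   operational coefficients depend on viewing angle, column
                   water vapor and temperature sub-range.

    LST = c + (a1 + a2*(1-e)/e + a3*de/e**2) * (t11 + t12)/2
            + (b1 + b2*(1-e)/e + b3*de/e**2) * (t11 - t12)/2
    """
    c, a1, a2, a3, b1, b2, b3 = coeffs
    e, de = 0.5 * (eps11 + eps12), eps11 - eps12
    mean_bt, diff_bt = 0.5 * (t11 + t12), 0.5 * (t11 - t12)
    return (c
            + (a1 + a2 * (1 - e) / e + a3 * de / e**2) * mean_bt
            + (b1 + b2 * (1 - e) / e + b3 * de / e**2) * diff_bt)

# Example with plausible inputs; the output is only illustrative because the
# coefficients above are placeholders, not regressed values.
print(split_window_lst(t11=295.2, t12=293.8, eps11=0.98, eps12=0.975))
```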
Ibrahim, Ahmad M.; Wilson, Paul P.H.; Sawan, Mohamed E.; ...
2015-06-30
The CADIS and FW-CADIS hybrid Monte Carlo/deterministic techniques dramatically increase the efficiency of neutronics modeling, but their use in the accurate design analysis of very large and geometrically complex nuclear systems has been limited by the large number of processors and memory requirements for their preliminary deterministic calculations and final Monte Carlo calculation. Three mesh adaptivity algorithms were developed to reduce the memory requirements of CADIS and FW-CADIS without sacrificing their efficiency improvement. First, a macromaterial approach enhances the fidelity of the deterministic models without changing the mesh. Second, a deterministic mesh refinement algorithm generates meshes that capture as much geometric detail as possible without exceeding a specified maximum number of mesh elements. Finally, a weight window coarsening algorithm decouples the weight window mesh and energy bins from the mesh and energy group structure of the deterministic calculations in order to remove the memory constraint of the weight window map from the deterministic mesh resolution. The three algorithms were used to enhance an FW-CADIS calculation of the prompt dose rate throughout the ITER experimental facility. Using these algorithms resulted in a 23.3% increase in the number of mesh tally elements in which the dose rates were calculated in a 10-day Monte Carlo calculation and, additionally, increased the efficiency of the Monte Carlo simulation by a factor of at least 3.4. The three algorithms enabled this difficult calculation to be accurately solved using an FW-CADIS simulation on a regular computer cluster, eliminating the need for a world-class super computer.
Yoon, Jai-Woong; Sawant, Amit; Suh, Yelin; Cho, Byung-Chul; Suh, Tae-Suk; Keall, Paul
2011-07-01
In dynamic multileaf collimator (MLC) motion tracking with complex intensity-modulated radiation therapy (IMRT) fields, target motion perpendicular to the MLC leaf travel direction can cause beam holds, which increase beam delivery time by up to a factor of 4. As a means to balance delivery efficiency and accuracy, a moving average algorithm was incorporated into a dynamic MLC motion tracking system (i.e., moving average tracking) to account for target motion perpendicular to the MLC leaf travel direction. The experimental investigation of the moving average algorithm compared with real-time tracking and no compensation beam delivery is described. The properties of the moving average algorithm were measured and compared with those of real-time tracking (dynamic MLC motion tracking accounting for both target motion parallel and perpendicular to the leaf travel direction) and no compensation beam delivery. The algorithm was investigated using a synthetic motion trace with a baseline drift and four patient-measured 3D tumor motion traces representing regular and irregular motions with varying baseline drifts. Each motion trace was reproduced by a moving platform. The delivery efficiency, geometric accuracy, and dosimetric accuracy were evaluated for conformal, step-and-shoot IMRT, and dynamic sliding window IMRT treatment plans using the synthetic and patient motion traces. The dosimetric accuracy was quantified via a gamma-test with a 3%/3 mm criterion. The delivery efficiency ranged from 89 to 100% for moving average tracking, 26%-100% for real-time tracking, and 100% (by definition) for no compensation. The root-mean-square geometric error ranged from 3.2 to 4.0 mm for moving average tracking, 0.7-1.1 mm for real-time tracking, and 3.7-7.2 mm for no compensation. The percentage of dosimetric points failing the gamma-test ranged from 4 to 30% for moving average tracking, 0%-23% for real-time tracking, and 10%-47% for no compensation. The delivery efficiency of moving average tracking was up to four times higher than that of real-time tracking and approached the efficiency of no compensation for all cases. The geometric accuracy and dosimetric accuracy of the moving average algorithm were between real-time tracking and no compensation, approximately half the percentage of dosimetric points failing the gamma-test compared with no compensation.
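The moving-average idea amounts to feeding the MLC a running mean of the target position in the direction perpendicular to leaf travel, so the leaves follow the baseline drift rather than every breathing cycle; a minimal sketch of that smoothing follows (illustrative sampling rate and averaging window, not the clinical tracking system).

```python
import numpy as np

def moving_average_trajectory(positions, window=100):
    """Running mean of the target position over the last `window` samples.

    Feeding this smoothed trajectory (instead of the raw one) to the MLC in
    the direction perpendicular to leaf travel avoids frequent beam holds at
    the cost of larger geometric error, as described in the abstract.
    """
    positions = np.asarray(positions, dtype=float)
    out = np.empty_like(positions)
    csum = np.cumsum(positions)
    for k in range(len(positions)):
        lo = max(0, k - window + 1)
        out[k] = (csum[k] - (csum[lo - 1] if lo > 0 else 0.0)) / (k - lo + 1)
    return out

# Breathing-like motion (4 s period) with a slow baseline drift, sampled at 25 Hz.
t = np.arange(0, 60, 0.04)
raw = 5.0 * np.sin(2 * np.pi * t / 4.0) + 0.05 * t
smoothed = moving_average_trajectory(raw, window=100)   # ~4 s averaging window
print(float(np.abs(raw - smoothed).max()))              # residual tracking error (mm)
```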
NASA Astrophysics Data System (ADS)
Cai, Le; Mao, Xiaobing; Ma, Zhexuan
2018-02-01
This study first constructed the nonlinear mathematical model of the high-pressure common rail (HPCR) system in the diesel engine. Then, a nonlinear state transformation was performed using the flow calculation, and the standard state-space equation was obtained. Based on sliding-mode variable structure control (SMVSC) theory, a sliding-mode controller for nonlinear systems was designed to control the common rail pressure and the diesel engine’s rotational speed. Finally, the designed nonlinear HPCR system was simulated in MATLAB. The simulation results demonstrate that the sliding-mode variable structure control algorithm shows favorable control performance and overcomes the shortcomings of traditional PID control in overshoot, parameter adjustment, system precision, settling time and rise time.
Functional Based Adaptive and Fuzzy Sliding Controller for Non-Autonomous Active Suspension System
NASA Astrophysics Data System (ADS)
Huang, Shiuh-Jer; Chen, Hung-Yi
In this paper, an adaptive sliding controller is developed for controlling a vehicle active suspension system. The functional approximation technique is employed to approximate the unknown non-autonomous functions of the suspension system and relax the model-based requirement of the sliding mode control algorithm. In order to improve the control performance and ease implementation, a fuzzy strategy with online learning ability is added to compensate for the functional approximation error. The update laws of the functional approximation coefficients and the fuzzy tuning parameters are derived from the Lyapunov theorem to guarantee the system stability. The proposed controller is implemented on a quarter-car hydraulic actuating active suspension system test-rig. The experimental results show that the proposed controller suppresses the oscillation amplitude of the suspension system effectively.
Chen, Gang; Song, Yongduan; Guan, Yanfeng
2018-03-01
This brief investigates the finite-time consensus tracking control problem for networked uncertain mechanical systems on digraphs. A new terminal sliding-mode-based cooperative control scheme is developed to guarantee that the tracking errors converge to an arbitrarily small bound around zero in finite time. All the networked systems can have different dynamics and all the dynamics are unknown. A neural network is used at each node to approximate the local unknown dynamics. The control schemes are implemented in a fully distributed manner. The proposed control method eliminates some limitations in the existing terminal sliding-mode-based consensus control methods and extends the existing analysis methods to the case of directed graphs. Simulation results on networked robot manipulators are provided to show the effectiveness of the proposed control algorithms.
NASA Astrophysics Data System (ADS)
Jiang, Peng; Peng, Lihui; Xiao, Deyun
2007-06-01
This paper presents a regularization method for electrical capacitance tomography (ECT) image reconstruction that uses different window functions as the regularizer. Image reconstruction for ECT is a typical ill-posed inverse problem. Because of the small singular values of the sensitivity matrix, the solution is sensitive to the measurement noise. The proposed method uses the spectral filtering properties of different window functions to make the solution stable by suppressing the noise in the measurements. The window functions, such as the Hanning window, the cosine window and so on, are modified for ECT image reconstruction. Simulations with respect to five typical permittivity distributions are carried out. The reconstructions are better and some of the contours are clearer than the results from the Tikhonov regularization. Numerical results show the feasibility of the image reconstruction algorithm using different window functions as regularization.
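Using a window function as regularization can be read as filtering the SVD spectrum of the sensitivity matrix, attenuating the small singular values smoothly rather than inverting them directly; the sketch below illustrates this on a toy ill-conditioned matrix rather than an actual ECT sensitivity matrix.

```python
import numpy as np

def window_filtered_solution(S, b, window="hann"):
    """Regularized solution of S @ g = b by filtering the SVD spectrum.

    Each singular component is scaled by a window-function value, so small
    singular values (which amplify measurement noise) are attenuated smoothly
    instead of being inverted directly. Toy illustration of the idea only.
    """
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    n = len(s)
    if window == "hann":
        f = 0.5 * (1 + np.cos(np.pi * np.arange(n) / (n - 1)))   # tapers 1 -> 0
    elif window == "cosine":
        f = np.cos(0.5 * np.pi * np.arange(n) / (n - 1))
    else:
        f = np.ones(n)                                            # no filtering
    return Vt.T @ (f * (U.T @ b) / s)

# Ill-conditioned toy problem with noisy measurements.
rng = np.random.default_rng(3)
S = rng.standard_normal((60, 40)) @ np.diag(np.logspace(0, -6, 40)) @ rng.standard_normal((40, 40))
g_true = rng.standard_normal(40)
b = S @ g_true + 1e-4 * rng.standard_normal(60)
for w in ("none", "hann"):
    err = np.linalg.norm(window_filtered_solution(S, b, w) - g_true)
    print(w, f"{err:.2f}")
```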
Big Data in Reciprocal Space: Sliding Fast Fourier Transforms for Determining Periodicity
Vasudevan, Rama K.; Belianinov, Alex; Gianfrancesco, Anthony G.; ...
2015-03-03
Significant advances in atomically resolved imaging of crystals and surfaces have occurred in the last decade allowing unprecedented insight into local crystal structures and periodicity. Yet, the analysis of the long-range periodicity from the local imaging data, critical to correlation of functional properties and chemistry to the local crystallography, remains a challenge. Here, we introduce a Sliding Fast Fourier Transform (FFT) filter to analyze atomically resolved images of in-situ grown La5/8Ca3/8MnO3 films. We demonstrate the ability of the sliding FFT algorithm to differentiate two sub-lattices, resulting from a mixed-terminated surface. Principal Component Analysis (PCA) and Independent Component Analysis (ICA) of the Sliding FFT dataset reveal the distinct changes in crystallography, step edges and boundaries between the multiple sub-lattices. The method is universal for images with any periodicity, and is especially amenable to atomically resolved probe and electron-microscopy data for rapid identification of the sub-lattices present.
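The sliding FFT filter computes the magnitude spectrum of a small window at every grid position and then unmixes the resulting stack with PCA/ICA; a minimal sketch on a synthetic two-lattice image follows (window size, step and the use of scikit-learn's PCA are assumptions, not the authors' code).

```python
import numpy as np
from sklearn.decomposition import PCA

def sliding_fft_stack(image, window=32, step=8):
    """FFT magnitude of a sliding window at each grid position.

    Returns (positions, spectra), where each row of `spectra` is the flattened
    magnitude spectrum of one window; PCA/ICA on this stack separates regions
    with different local periodicity (e.g. different surface sub-lattices).
    """
    h, w = image.shape
    taper = np.hanning(window)[:, None] * np.hanning(window)[None, :]
    positions, spectra = [], []
    for i in range(0, h - window + 1, step):
        for j in range(0, w - window + 1, step):
            patch = image[i:i + window, j:j + window] * taper
            spec = np.abs(np.fft.fftshift(np.fft.fft2(patch)))
            positions.append((i, j))
            spectra.append(spec.ravel())
    return np.array(positions), np.array(spectra)

# Synthetic image: two halves with different lattice periodicities.
y, x = np.mgrid[0:128, 0:128]
image = np.where(x < 64, np.cos(2 * np.pi * x / 4.0), np.cos(2 * np.pi * x / 6.0))
pos, spectra = sliding_fft_stack(image)
scores = PCA(n_components=2).fit_transform(spectra)
# The first component separates windows from the two sub-lattices.
print(scores[pos[:, 1] <= 32, 0].mean(), scores[pos[:, 1] >= 64, 0].mean())
```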
Big Data in Reciprocal Space: Sliding Fast Fourier Transforms for Determining Periodicity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vasudevan, Rama K.; Belianinov, Alex; Gianfrancesco, Anthony G.
Significant advances in atomically resolved imaging of crystals and surfaces have occurred in the last decade allowing unprecedented insight into local crystal structures and periodicity. Yet, the analysis of the long-range periodicity from the local imaging data, critical to correlation of functional properties and chemistry to the local crystallography, remains a challenge. Here, we introduce a Sliding Fast Fourier Transform (FFT) filter to analyze atomically resolved images of in-situ grown La5/8Ca3/8MnO3 films. We demonstrate the ability of the sliding FFT algorithm to differentiate two sub-lattices, resulting from a mixed-terminated surface. Principal Component Analysis (PCA) and Independent Component Analysis (ICA) of the Sliding FFT dataset reveal the distinct changes in crystallography, step edges and boundaries between the multiple sub-lattices. The method is universal for images with any periodicity, and is especially amenable to atomically resolved probe and electron-microscopy data for rapid identification of the sub-lattices present.
NASA Astrophysics Data System (ADS)
Boudjema, Zinelaabidine; Taleb, Rachid; Bounadja, Elhadj
2017-02-01
The traditional field-oriented control strategy with a proportional-integral (PI) regulator for the speed drive of the doubly fed induction motor (DFIM) has some drawbacks, such as parameter tuning complications, mediocre dynamic performance and reduced robustness. Therefore, based on the analysis of the mathematical model of a DFIM supplied by two five-level SVPWM inverters, this paper proposes a new robust control scheme based on super twisting sliding mode and fuzzy logic. The conventional sliding mode control (SMC) has a pronounced chattering effect on the electromagnetic torque developed by the DFIM. In order to resolve this problem, a second order sliding mode technique based on the super-twisting algorithm and fuzzy logic functions is employed. The validity of the proposed approach was tested using Matlab/Simulink software. The simulation results demonstrate the advantages of the proposed control scheme, including simple design of the control system and reduced chattering.
NASA Astrophysics Data System (ADS)
Kang, Shuo; Yan, Hao; Dong, Lijing; Li, Changchun
2018-03-01
This paper addresses the force tracking problem of electro-hydraulic load simulator under the influence of nonlinear friction and uncertain disturbance. A nonlinear system model combined with the improved generalized Maxwell-slip (GMS) friction model is firstly derived to describe the characteristics of load simulator system more accurately. Then, by using particle swarm optimization (PSO) algorithm combined with the system hysteresis characteristic analysis, the GMS friction parameters are identified. To compensate for nonlinear friction and uncertain disturbance, a finite-time adaptive sliding mode control method is proposed based on the accurate system model. This controller has the ability to ensure that the system state moves along the nonlinear sliding surface to steady state in a short time as well as good dynamic properties under the influence of parametric uncertainties and disturbance, which further improves the force loading accuracy and rapidity. At the end of this work, simulation and experimental results are employed to demonstrate the effectiveness of the proposed sliding mode control strategy.
Sliding Mode Control Applied to Reconfigurable Flight Control Design
NASA Technical Reports Server (NTRS)
Hess, R. A.; Wells, S. R.; Bacon, Barton (Technical Monitor)
2002-01-01
Sliding mode control is applied to the design of a flight control system capable of operating with limited bandwidth actuators and in the presence of significant damage to the airframe and/or control effector actuators. Although inherently robust, sliding mode control algorithms have been hampered by their sensitivity to the effects of parasitic unmodeled dynamics, such as those associated with actuators and structural modes. It is known that asymptotic observers can alleviate this sensitivity while still allowing the system to exhibit significant robustness. This approach is demonstrated. The selection of the sliding manifold as well as the interpretation of the linear design that results after introduction of a boundary layer is accomplished in the frequency domain. The design technique is exercised on a pitch-axis controller for a simple short-period model of the High Angle of Attack F-18 vehicle via computer simulation. Stability and performance are compared to that of a system incorporating a controller designed by classical loop-shaping techniques.
Han, Yaozhen; Liu, Xiangjie
2016-05-01
This paper presents a continuous higher-order sliding mode (HOSM) control scheme with time-varying gain for a class of uncertain nonlinear systems. The proposed controller is derived from the concept of geometric homogeneity and super-twisting algorithm, and includes two parts, the first part of which achieves smooth finite time stabilization of pure integrator chains. The second part conquers the twice differentiable uncertainty and realizes system robustness by employing super-twisting algorithm. Particularly, time-varying switching control gain is constructed to reduce the switching control action magnitude to the minimum possible value while keeping the property of finite time convergence. Examples concerning the perturbed triple integrator chains and excitation control for single-machine infinite bus power system are simulated respectively to demonstrate the effectiveness and applicability of the proposed approach. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
Managing coherence via put/get windows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blumrich, Matthias A; Chen, Dong; Coteus, Paul W
A method and apparatus for managing coherence between two processors of a two processor node of a multi-processor computer system. Generally the present invention relates to a software algorithm that simplifies and significantly speeds the management of cache coherence in a message passing parallel computer, and to hardware apparatus that assists this cache coherence algorithm. The software algorithm uses the opening and closing of put/get windows to coordinate the activity required to achieve cache coherence. The hardware apparatus may be an extension to the hardware address decode that creates, in the physical memory address space of the node, an area of virtual memory that (a) does not actually exist, and (b) is therefore able to respond instantly to read and write requests from the processing elements.
Robust and real-time control of magnetic bearings for space engines
NASA Technical Reports Server (NTRS)
Sinha, Alok; Wang, Kon-Well; Mease, K.; Lewis, S.
1991-01-01
Currently, NASA Lewis Research Center is developing magnetic bearings for Space Shuttle Main Engine (SSME) turbopumps. The control algorithms which have been used are based on either the proportional-integral-derivative control (PID) approach or the linear quadratic (LQ) state space approach. These approaches lead to an acceptable performance only when the system model is accurately known, which is seldom true in practice. For example, the rotor eccentricity, which is a major source of vibration at high speeds, cannot be predicted accurately. Furthermore, the dynamics of a rotor shaft, which must be treated as a flexible system to model the elastic rotor shaft, is infinite dimensional in theory and the controller can only be developed on the basis of a finite number of modes. Therefore, the development of the control system is further complicated by the possibility of closed loop system instability because of residual or uncontrolled modes, the so-called spillover problem. Consequently, novel control algorithms for magnetic bearings are being developed to be robust to inevitable parametric uncertainties, external disturbances, spillover phenomenon and noise. Also, as pointed out earlier, magnetic bearings must exhibit good performance at a speed over 30,000 rpm. This implies that the sampling period available for the design of a digital control system has to be of the order of 0.5 milliseconds. Therefore, feedback coefficients and other required controller parameters have to be computed off-line so that the on-line computational burden is extremely small. The development of the robust and real-time control algorithms is based on the sliding mode control theory. In this method, a dynamic system is made to move along a manifold of sliding hyperplanes to the origin of the state space. The number of sliding hyperplanes equals that of actuators. The sliding mode controller has two parts: linear state feedback and nonlinear terms. The nonlinear terms guarantee that the system would reach the intersection of all sliding hyperplanes and remain on it when bounds on the errors in the system parameters and external disturbances are known. The linear part of the control drives the system to the origin of state space. Another important feature is that the controller parameter can be computed off-line. Consequently, the on-line computational burden is small.
Survey: interpolation methods for whole slide image processing.
Roszkowiak, L; Korzynska, A; Zak, J; Pijanowska, D; Swiderska-Chadaj, Z; Markiewicz, T
2017-02-01
Evaluating whole slide images of histological and cytological samples is used in pathology for diagnostics, grading and prognosis. It is often necessary to rescale whole slide images of a very large size. Image resizing is one of the most common applications of interpolation. We collect the advantages and drawbacks of nine interpolation methods, and as a result of our analysis, we try to select one interpolation method as the preferred solution. To compare the performance of interpolation methods, test images were scaled and then rescaled to the original size using the same algorithm. The modified image was compared to the original image in various aspects. The time needed for calculations and results of quantification performance on modified images were also compared. For evaluation purposes, we used four general test images and 12 specialized biological immunohistochemically stained tissue sample images. The purpose of this survey is to determine which method of interpolation is the best to resize whole slide images, so they can be further processed using quantification methods. As a result, the interpolation method has to be selected depending on the task involving whole slide images. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
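The survey's comparison protocol, scaling an image down and back up with the same kernel and measuring the deviation from the original, can be reproduced in a few lines; the sketch below uses scipy's spline orders as stand-ins for the nine interpolation methods compared.

```python
import numpy as np
from scipy.ndimage import zoom

def round_trip_error(image, factor=0.25, order=1):
    """Scale an image down and back up with the same spline order, then report
    the RMSE against the original (the survey's comparison protocol, here with
    scipy spline interpolation standing in for the nine surveyed methods)."""
    small = zoom(image, factor, order=order)
    restored = zoom(small, np.array(image.shape) / np.array(small.shape), order=order)
    restored = restored[:image.shape[0], :image.shape[1]]   # guard against off-by-one sizes
    return float(np.sqrt(np.mean((restored - image) ** 2)))

# Compare nearest-neighbour, bilinear and bicubic on a synthetic smooth texture.
rng = np.random.default_rng(4)
img = zoom(rng.random((32, 32)), 8, order=3)    # smooth 256x256 test image
for name, order in (("nearest", 0), ("bilinear", 1), ("bicubic", 3)):
    print(name, round_trip_error(img, order=order))
```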
X33 Reusable Launch Vehicle Control on Sliding Modes: Concepts for a Control System Development
NASA Technical Reports Server (NTRS)
Shtessel, Yuri B.
1998-01-01
Control of the X33 reusable launch vehicle is considered. The launch control problem consists of automatic tracking of the launch trajectory which is assumed to be optimally precalculated. It requires development of a reliable, robust control algorithm that can automatically adjust to some changes in mission specifications (mass of payload, target orbit) and the operating environment (atmospheric perturbations, interconnection perturbations from the other subsystems of the vehicle, thrust deficiencies, failure scenarios). One of the effective control strategies successfully applied in nonlinear systems is the Sliding Mode Control. The main advantage of the Sliding Mode Control is that the system's state response in the sliding surface remains insensitive to certain parameter variations, nonlinearities and disturbances. Employing the time scaling concept, a new two (three)-loop structure of the control system for the X33 launch vehicle was developed. Smoothed sliding mode controllers were designed to robustly enforce the given closed-loop dynamics. Simulations of the 3-DOF model of the X33 launch vehicle with the table-look-up models for Euler angle reference profiles and disturbance torque profiles showed a very accurate, robust tracking performance.
Reusable Launch Vehicle Control In Multiple Time Scale Sliding Modes
NASA Technical Reports Server (NTRS)
Shtessel, Yuri; Hall, Charles; Jackson, Mark
2000-01-01
A reusable launch vehicle control problem during ascent is addressed via multiple-time scaled continuous sliding mode control. The proposed sliding mode controller utilizes a two-loop structure and provides robust, de-coupled tracking of both orientation angle command profiles and angular rate command profiles in the presence of bounded external disturbances and plant uncertainties. Sliding mode control causes the angular rate and orientation angle tracking error dynamics to be constrained to linear, de-coupled, homogeneous, and vector valued differential equations with desired eigenvalues placement. Overall stability of a two-loop control system is addressed. An optimal control allocation algorithm is designed that allocates torque commands into end-effector deflection commands, which are executed by the actuators. The dual-time scale sliding mode controller was designed for the X-33 technology demonstration sub-orbital launch vehicle in the launch mode. Simulation results show that the designed controller provides robust, accurate, de-coupled tracking of the orientation angle command profiles in presence of external disturbances and vehicle inertia uncertainties. This is a significant advancement in performance over that achieved with linear, gain scheduled control systems currently being used for launch vehicles.
Unannounced Meals in the Artificial Pancreas: Detection Using Continuous Glucose Monitoring
Herrero, Pau; Bondia, Jorge
2018-01-01
The artificial pancreas (AP) system is designed to regulate blood glucose in subjects with type 1 diabetes using a continuous glucose monitor informed controller that adjusts insulin infusion via an insulin pump. However, current AP developments are mainly hybrid closed-loop systems that include feed-forward actions triggered by the announcement of meals or exercise. The first step to fully closing the loop in the AP requires removing meal announcement, which is currently the most effective way to alleviate postprandial hyperglycemia due to the delay in insulin action. Here, a novel approach to meal detection in the AP is presented using a sliding window and computing the normalized cross-covariance between measured glucose and the forward difference of a disturbance term, estimated from an augmented minimal model using an Unscented Kalman Filter. Three different tunings were applied to the same meal detection algorithm: (1) a high sensitivity tuning, (2) a trade-off tuning that has a high number of meals detected and a low number of false positives (FP), and (3) a low FP tuning. For the three tunings, sensitivities of 99 ± 2%, 93 ± 5%, and 47 ± 12% were achieved, respectively. A sensitivity analysis was also performed and found that higher carbohydrate quantities and faster rates of glucose appearance result in favorable meal detection outcomes. PMID:29547553
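The detection statistic is the normalized cross-covariance, over a sliding window, between measured glucose and the forward difference of the estimated disturbance; the sketch below computes that statistic on toy signals, with a synthetic disturbance standing in for the Unscented-Kalman-Filter estimate described in the abstract.

```python
import numpy as np

def normalized_cross_covariance(a, b):
    """Normalized cross-covariance (Pearson correlation) of two equal-length windows."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def meal_detector(glucose, disturbance, window=12, threshold=0.8):
    """Flag samples where windowed glucose and the forward difference of the
    estimated disturbance co-vary strongly (toy stand-in for the UKF-based
    disturbance estimate; window length and threshold are illustrative)."""
    d_diff = np.diff(disturbance, prepend=disturbance[0])
    flags = np.zeros(len(glucose), dtype=bool)
    for k in range(window, len(glucose)):
        r = normalized_cross_covariance(glucose[k - window:k], d_diff[k - window:k])
        flags[k] = r > threshold
    return flags

# Toy CGM trace: a meal at sample 60 raises both glucose and the disturbance.
t = np.arange(200)
meal = np.clip(t - 60, 0, 40)
glucose = 110 + 0.8 * meal + np.random.default_rng(5).normal(0, 1.0, t.size)
disturbance = 0.02 * meal ** 1.2
print(np.nonzero(meal_detector(glucose, disturbance))[0][:5])   # first detections
```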
MO-G-BRD-01: Point/Counterpoint Debate: Arc Based Techniques Will Make Conventional IMRT Obsolete
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shepard, D; Popple, R; Balter, P
2014-06-15
A variety of intensity modulated radiation therapy (IMRT) delivery techniques have been developed that have provided clinicians with the ability to deliver highly conformal dose distributions. The delivery techniques include compensators, step-and-shoot IMRT, sliding window IMRT, volumetric modulated arc therapy (VMAT), and tomotherapy. A key development in the field of IMRT was the introduction of new planning algorithms and delivery control systems in 2007 that made it possible to coordinate the gantry rotation speed, dose rate, and multileaf collimator leaf positions during the delivery of arc therapy. With these developments, VMAT became a routine clinical tool. The use of VMAT has continued to grow in recent years and some would argue that this will soon make conventional IMRT obsolete, and this is the premise of this debate. To introduce the debate, David Shepard, Ph.D. will provide an overview of IMRT delivery techniques including historical context and how they are being used today. The debate will follow with Richard Popple, Ph.D. arguing FOR the Proposition and Peter Balter, Ph.D. arguing AGAINST it. Learning Objectives: Understand the different delivery techniques for IMRT. Understand the potential benefits of conventional IMRT. Understand the potential benefits of arc-based IMRT delivery.
Jawarneh, Sana; Abdullah, Salwani
2015-01-01
This paper presents a bee colony optimisation (BCO) algorithm to tackle the vehicle routing problem with time window (VRPTW). The VRPTW involves recovering an ideal set of routes for a fleet of vehicles serving a defined number of customers. The BCO algorithm is a population-based algorithm that mimics the social communication patterns of honeybees in solving problems. The performance of the BCO algorithm is dependent on its parameters, so the online (self-adaptive) parameter tuning strategy is used to improve its effectiveness and robustness. Compared with the basic BCO, the adaptive BCO performs better. Diversification is crucial to the performance of the population-based algorithm, but the initial population in the BCO algorithm is generated using a greedy heuristic, which has insufficient diversification. Therefore the ways in which the sequential insertion heuristic (SIH) for the initial population drives the population toward improved solutions are examined. Experimental comparisons indicate that the proposed adaptive BCO-SIH algorithm works well across all instances and is able to obtain 11 best results in comparison with the best-known results in the literature when tested on Solomon’s 56 VRPTW 100 customer instances. Also, a statistical test shows that there is a significant difference between the results. PMID:26132158
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, Dean J.; Harding, Lee T.
Isotope identification algorithms that are contained in the Gamma Detector Response and Analysis Software (GADRAS) can be used for real-time stationary measurement and search applications on platforms operating under Linux or Android operating systems. Since the background radiation can vary considerably due to variations in naturally-occurring radioactive materials (NORM), spectral algorithms can be substantially more sensitive to threat materials than search algorithms based strictly on count rate. Specific isotopes of interest can be designated for the search algorithm, which permits suppression of alarms for non-threatening sources, such as medical radionuclides. The same isotope identification algorithms that are used for search applications can also be used to process static measurements. The isotope identification algorithms follow the same protocols as those used by the Windows version of GADRAS, so files that are created under the Windows interface can be copied directly to processors on fielded sensors. The analysis algorithms contain provisions for gain adjustment and energy linearization, which enables direct processing of spectra as they are recorded by multichannel analyzers. Gain compensation is performed by utilizing photopeaks in background spectra. Incorporation of this energy calibration task into the analysis algorithm also eliminates one of the more difficult challenges associated with development of radiation detection equipment.
Airborne target tracking algorithm against oppressive decoys in infrared imagery
NASA Astrophysics Data System (ADS)
Sun, Xiechang; Zhang, Tianxu
2009-10-01
This paper presents an approach for tracking airborne targets against oppressive infrared decoys. An oppressive decoy lures an infrared-guided missile with its high infrared radiation. Traditional tracking algorithms suffer degraded stability, or even tracking failure, when an airborne target continuously releases decoys. The proposed approach first determines an adaptive tracking window. The center of the tracking window is set at a predicted target position computed from a uniform motion model. Different strategies are applied to determine the tracking window size according to the target state. The image within the tracking window is segmented and multiple features of candidate targets are extracted. The most similar candidate target is associated to the tracked target by using a decision function, which calculates a weighted sum of normalized feature differences between two comparable targets. The integrated intensity ratio of the associated target and the tracked target, and the target centroid, are examined to estimate the target state in the presence of decoys. The tracking ability and robustness of the proposed approach have been validated by processing available real-world and simulated infrared image sequences containing airborne targets and oppressive decoys.
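The window prediction and association steps lend themselves to a compact sketch; the Python below assumes hypothetical centroid tracks and feature vectors and illustrates only the uniform-motion window placement and the weighted decision function, not the full segmentation and state-estimation logic of the paper.

    def predict_window(track, base_size=(32, 32), uncertain=False, grow=1.5):
        """Centre the next tracking window at a position predicted by a
        uniform-motion (constant-velocity) model, and enlarge it when the
        target state is uncertain (e.g. while decoys confuse association)."""
        (x1, y1), (x2, y2) = track[-2], track[-1]
        vx, vy = x2 - x1, y2 - y1              # velocity from last two centroids
        centre = (x2 + vx, y2 + vy)            # predicted target position
        w, h = base_size
        if uncertain:
            w, h = int(w * grow), int(h * grow)
        return centre, (w, h)

    def decision_score(candidate, reference, weights):
        """Weighted sum of normalized feature differences; the candidate with
        the lowest score is associated with the tracked target."""
        return sum(w * abs(c - r) / (abs(r) + 1e-9)
                   for w, c, r in zip(weights, candidate, reference))

    centre, size = predict_window([(100, 50), (104, 52)], uncertain=True)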
SU-F-J-10: Sliding Mode Control of a SMA Actuated Active Flexible Needle for Medical Procedures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Podder, T
Purpose: In medical interventional procedures such as brachytherapy, ablative therapies and biopsy, precise steering and accurate placement of needles are very important for anatomical obstacle avoidance and accurate targeting. This study presents the efficacy of a sliding mode controller for a Shape Memory Alloy (SMA)-actuated flexible needle for medical procedures. Methods: Second-order system dynamics of the SMA-actuated active flexible needle were used to derive the sliding mode control equations. Both proportional-integral-derivative (PID) and adaptive PID sliding mode control (APIDSMC) algorithms were developed and implemented. The flexible needle was attached at the end of a 6-DOF robotic system. Through the LabView programming environment, the control commands were generated using the PID and APIDSMC algorithms. Experiments with an artificial tissue-mimicking phantom were performed to evaluate the performance of the controller. The actual needle tip position was obtained using an electromagnetic (EM) tracking sensor (Aurora, NDI, Waterloo, Canada) at a sampling period of 1 ms. During the experiments, external disturbances were created by applying force and thermal shock to investigate the robustness of the controllers. Results: The root mean square error (RMSE) values for the APIDSMC and PID controllers were 0.75 mm and 0.92 mm, respectively, for a sinusoidal reference input. In the presence of external disturbances, the APIDSMC controller showed a much smoother and less overshooting response compared to that of the PID controller. Conclusion: Performance of the APIDSMC was superior to the PID controller. The APIDSMC proved to be the more effective controller in compensating for the SMA uncertainties and external disturbances within clinically acceptable thresholds.
NASA Astrophysics Data System (ADS)
Rulaningtyas, Riries; Suksmono, Andriyan B.; Mengko, Tati L. R.; Saptawati, Putri
2015-04-01
Sputum smear observation has an important role in tuberculosis (TB) diagnosis, and accurate identification is needed to avoid diagnostic errors. In developing countries, sputum smear slides are commonly observed with a conventional light microscope on Ziehl-Neelsen stained tissue, since the microscope does not require a high maintenance cost. Clinicians perform the manual screening of sputum smear slides, which is time consuming and requires extensive training to detect the presence of TB bacilli (Mycobacterium tuberculosis) accurately, especially for negative slides and slides with a small number of TB bacilli. To help clinicians, we propose an automatic scanning microscope with automatic identification of TB bacilli. The designed system modified the field movement of a light microscope with a stepper motor controlled by a microcontroller. Every sputum smear field was captured by a camera. Several image processing techniques were then applied to the sputum smear images. A color threshold on the hue channel in HSV color space was used for background subtraction. The Sobel edge detection algorithm was used for TB bacilli image segmentation. We used shape-based feature extraction to analyze the bacilli, and a neural network then classified objects as TB bacilli or not. The results indicated that the identification worked well and detected TB bacilli accurately in sputum smear slides with normal staining, but not in over-stained or under-stained tissue slides. Overall, the designed system can make sputum smear observation easier for clinicians.
A method for photon beam Monte Carlo multileaf collimator particle transport
NASA Astrophysics Data System (ADS)
Siebers, Jeffrey V.; Keall, Paul J.; Kim, Jong Oh; Mohan, Radhe
2002-09-01
Monte Carlo (MC) algorithms are recognized as the most accurate methodology for patient dose assessment. For intensity-modulated radiation therapy (IMRT) delivered with dynamic multileaf collimators (DMLCs), accurate dose calculation, even with MC, is challenging. Accurate IMRT MC dose calculations require inclusion of the moving MLC in the MC simulation. Due to its complex geometry, full transport through the MLC can be time consuming. The aim of this work was to develop an MLC model for photon beam MC IMRT dose computations. The basis of the MC MLC model is that the complex MLC geometry can be separated into simple geometric regions, each of which readily lends itself to simplified radiation transport. For photons, only attenuation and first Compton scatter interactions are considered. The amount of attenuation material an individual particle encounters while traversing the entire MLC is determined by adding the individual amounts from each of the simplified geometric regions. Compton scatter is sampled based upon the total thickness traversed. Pair production and electron interactions (scattering and bremsstrahlung) within the MLC are ignored. The MLC model was tested for 6 MV and 18 MV photon beams by comparing it with measurements and MC simulations that incorporate the full physics and geometry for fields blocked by the MLC and with measurements for fields with the maximum possible tongue-and-groove and tongue-or-groove effects, for static test cases and for sliding windows of various widths. The MLC model predicts the field size dependence of the MLC leakage radiation within 0.1% of the open-field dose. The entrance dose and beam hardening behind a closed MLC are predicted within +/-1% or 1 mm. Dose undulations due to differences in inter- and intra-leaf leakage are also correctly predicted. The MC MLC model predicts leaf-edge tongue-and-groove dose effect within +/-1% or 1 mm for 95% of the points compared at 6 MV and 88% of the points compared at 18 MV. The dose through a static leaf tip is also predicted generally within +/-1% or 1 mm. Tests with sliding windows of various widths confirm the accuracy of the MLC model for dynamic delivery and indicate that accounting for a slight leaf position error (0.008 cm for our MLC) will improve the accuracy of the model. The MLC model developed is applicable to both dynamic MLC and segmental MLC IMRT beam delivery and will be useful for patient IMRT dose calculations, pre-treatment verification of IMRT delivery and IMRT portal dose transmission dosimetry.
Acquisition and use of Orlando, Florida and Continental Airbus radar flight test data
NASA Technical Reports Server (NTRS)
Eide, Michael C.; Mathews, Bruce
1992-01-01
Westinghouse is developing a lookdown pulse Doppler radar for production as the sensor and processor of a forward looking hazardous windshear detection and avoidance system. A data collection prototype of that product was ready for flight testing in Orlando to encounter low level windshear in corroboration with the FAA Terminal Doppler Weather Radar (TDWR). Airborne real-time processing and display of the hazard factor were demonstrated with TDWR-facilitated intercepts and penetrations of over 80 microbursts in a three-day period, including microbursts with hazard factors in excess of 0.16 (with 500 ft. PIREP altitude loss) and the hazard factor display at 6 n.mi. of a visually transparent ('dry') microburst with TDWR-corroborated outflow reflectivities of +5 dBz. Range-gated Doppler spectrum data was recorded for subsequent development and refinement of hazard factor detection and urban clutter rejection algorithms. Following Orlando, the data collection radar was supplemental type certified for use in revenue service on a Continental Airlines Airbus, operating automatically and on a non-interfering basis with its ARINC 708 radar, to allow Westinghouse to confirm its understanding of commercial aircraft installation, interface realities, and urban airport clutter. A number of software upgrades, all of which were verified at the Receiver-Transmitter-Processor (RTP) hardware bench with Orlando microburst data to produce the desired advanced-warning hazard factor detection, included some preliminary loads with automatic (sliding window average hazard factor) detection and annunciation recording. The current (14-APR-92) configured software is free from false and/or nuisance alerts (CAUTIONS, WARNINGS, etc.) for all take-off and landing approaches, under 2500 ft. altitude to weight-on-wheels, into all encountered airports, including Newark (NJ), LAX, Denver, Houston, Cleveland, etc. Using the Orlando data collected on hazardous microbursts, Westinghouse has developed a lookdown pulse Doppler radar product with signal and data processing algorithms which detect realistic microburst hazards, and has demonstrated that those algorithms produce no false alerts (or nuisance alerts) in urban airport ground moving vehicle (GMTI) and/or clutter environments.
Wijetunge, Chalini D; Saeed, Isaam; Boughton, Berin A; Spraggins, Jeffrey M; Caprioli, Richard M; Bacic, Antony; Roessner, Ute; Halgamuge, Saman K
2015-10-01
Matrix Assisted Laser Desorption Ionization-Imaging Mass Spectrometry (MALDI-IMS) in 'omics' data acquisition generates detailed information about the spatial distribution of molecules in a given biological sample. Various data processing methods have been developed for exploring the resultant high volume data. However, most of these methods process data in the spectral domain and do not make the most of the important spatial information available through this technology. Therefore, we propose a novel streamlined data analysis pipeline specifically developed for MALDI-IMS data utilizing significant spatial information for identifying hidden significant molecular distribution patterns in these complex datasets. The proposed unsupervised algorithm uses Sliding Window Normalization (SWN) and a new spatial distribution based peak picking method developed based on Gray level Co-Occurrence (GCO) matrices followed by clustering of biomolecules. We also use gist descriptors and an improved version of GCO matrices to extract features from molecular images and minimum medoid distance to automatically estimate the number of possible groups. We evaluated our algorithm using a new MALDI-IMS metabolomics dataset of a plant (Eucalypt) leaf. The algorithm revealed hidden significant molecular distribution patterns in the dataset, which the current Component Analysis and Segmentation Map based approaches failed to extract. We further demonstrate the performance of our peak picking method over other traditional approaches by using a publicly available MALDI-IMS proteomics dataset of a rat brain. Although SWN did not show any significant improvement as compared with using no normalization, the visual assessment showed an improvement as compared to using the median normalization. The source code and sample data are freely available at http://exims.sourceforge.net/. awgcdw@student.unimelb.edu.au or chalini_w@live.com Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Chest compression rate measurement from smartphone video.
Engan, Kjersti; Hinna, Thomas; Ryen, Tom; Birkenes, Tonje S; Myklebust, Helge
2016-08-11
Out-of-hospital cardiac arrest is a life-threatening situation where the first person performing cardiopulmonary resuscitation (CPR) is most often a bystander without medical training. Some existing smartphone apps can call the emergency number and provide, for example, the global positioning system (GPS) location, like the Hjelp 113-GPS App by the Norwegian air ambulance. We propose to extend the functionality of such apps by using the built-in camera in a smartphone to capture video of the CPR performed, primarily to estimate the duration and rate of the chest compressions executed, if any. All calculations are done in real time, and both the caller and the dispatcher will receive the compression rate feedback when detected. The proposed algorithm is based on finding a dynamic region of interest in the video frames, and thereafter evaluating the power spectral density by computing the fast Fourier transform over sliding windows. The power of the dominating frequencies is compared to the power of the frequency area of interest. The system is tested on different persons, male and female, in different scenarios addressing target compression rates, background disturbances, compression with mouth-to-mouth ventilation, various background illuminations and phone placements. All tests were done on a recording Laerdal manikin, providing true compression rates for comparison. Overall, the algorithm is seen to be promising, and it manages a number of disturbances and light situations. For target rates at 110 cpm, as recommended during CPR, the mean error in compression rate (standard deviation over tests in parentheses) is 3.6 (0.8) for short-haired bystanders, and 8.7 (6.0) including medium- and long-haired bystanders. The presented method shows that it is feasible to detect the compression rate of chest compressions performed by a bystander by placing the smartphone close to the patient, and using the built-in camera combined with a video processing algorithm performed in real time on the device.
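A simplified Python sketch of the rate estimation step follows, assuming a one-dimensional motion signal standing in for the mean intensity of the detected region of interest; the window length, frequency band, and confidence measure are illustrative choices, not the authors' parameters.

    import numpy as np

    def compression_rate_cpm(signal, fs, win_s=3.0, band=(1.0, 3.0)):
        """Estimate chest-compression rate from a 1-D motion signal: take an
        FFT over the most recent `win_s` seconds and report the dominant
        frequency inside `band` (Hz, roughly 60-180 compressions/min) in
        compressions per minute, plus a crude in-band power ratio."""
        n = int(win_s * fs)
        x = signal[-n:] - np.mean(signal[-n:])
        spec = np.abs(np.fft.rfft(x * np.hanning(n))) ** 2
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        in_band = (freqs >= band[0]) & (freqs <= band[1])
        f_dom = freqs[in_band][np.argmax(spec[in_band])]
        confidence = spec[in_band].sum() / spec.sum()
        return 60.0 * f_dom, confidence

    # Synthetic 110 cpm signal sampled at 30 frames per second.
    fs = 30.0
    t = np.arange(0, 10, 1.0 / fs)
    rate, conf = compression_rate_cpm(np.sin(2 * np.pi * (110 / 60) * t), fs)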
Toward blind removal of unwanted sound from orchestrated music
NASA Astrophysics Data System (ADS)
Chang, Soo-Young; Chun, Joohwan
2000-11-01
The problem addressed in this paper is the removal of unwanted sounds from music. The sound to be removed could be a disturbance such as a cough. We present some preliminary results on this problem using statistical properties of the signals. Our approach consists of three steps. We first estimate the fundamental frequencies and partials from the noise-corrupted music sound. This gives us an autoregressive (AR) model of the music sound. Then we filter the noise-corrupted sound using the AR parameters. The filtered signal is then subtracted from the original noise-corrupted signal to obtain the disturbance. Finally, the obtained disturbance is used as a reference signal to eliminate the disturbance from the noise-corrupted music signal. The above three steps are carried out in a recursive manner using a sliding window or an infinitely growing window with an appropriate forgetting factor.
Fixed-rate layered multicast congestion control
NASA Astrophysics Data System (ADS)
Bing, Zhang; Bing, Yuan; Zengji, Liu
2006-10-01
A new fixed-rate layered multicast congestion control algorithm called FLMCC is proposed. The sender of a multicast session transmits data packets at a fixed rate on each layer, while receivers obtain different throughputs by cumulatively subscribing to different numbers of layers based on their expected rates. In order to provide TCP-friendliness and estimate the expected rate accurately, a window-based mechanism implemented at the receivers is presented. To achieve this, each receiver maintains a congestion window, adjusts it based on the GAIMD algorithm, and calculates an expected rate from the congestion window. To measure the RTT, a new method is presented which combines an accurate measurement with a rough estimation. A feedback suppression scheme based on a random timer mechanism is given to avoid feedback implosion in the accurate measurement. The protocol is simple to implement. Simulations indicate that FLMCC shows good TCP-friendliness, responsiveness and intra-protocol fairness, and provides high link utilization.
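A minimal Python sketch of the receiver-side logic is given below; the GAIMD parameters, packet size, and layer rates are illustrative assumptions, not values taken from FLMCC.

    def gaimd_update(cwnd, loss_detected, alpha=0.31, beta=0.875):
        """One GAIMD congestion-window update: additive increase of `alpha`
        packets per RTT without loss, multiplicative decrease by `beta` on
        loss. (alpha, beta) here is a commonly cited TCP-friendly pair, not
        necessarily the values used by FLMCC."""
        return cwnd * beta if loss_detected else cwnd + alpha

    def expected_rate_bps(cwnd, rtt_s, pkt_bytes=1500):
        """Receiver's expected rate derived from its congestion window."""
        return cwnd * pkt_bytes * 8.0 / rtt_s

    def layers_to_subscribe(expected_bps, layer_bps):
        """Cumulatively subscribe to as many fixed-rate layers as the
        expected rate allows."""
        total, n = 0.0, 0
        for rate in layer_bps:
            if total + rate > expected_bps:
                break
            total, n = total + rate, n + 1
        return n

    cwnd = gaimd_update(10.0, loss_detected=False)
    n_layers = layers_to_subscribe(expected_rate_bps(cwnd, rtt_s=0.08),
                                   layer_bps=[256e3, 256e3, 512e3, 1024e3])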
Shot boundary detection and label propagation for spatio-temporal video segmentation
NASA Astrophysics Data System (ADS)
Piramanayagam, Sankaranaryanan; Saber, Eli; Cahill, Nathan D.; Messinger, David
2015-02-01
This paper proposes a two-stage algorithm for streaming video segmentation. In the first stage, shot boundaries are detected within a window of frames by comparing the dissimilarity between 2-D segmentations of each frame. In the second stage, the 2-D segments are propagated across the window of frames in both the spatial and temporal directions. The window is moved across the video to find all shot transitions and obtain spatio-temporal segments simultaneously. As opposed to techniques that operate on the entire video, the proposed approach consumes significantly less memory and enables segmentation of lengthy videos. We tested our segmentation-based shot detection method on the TRECVID 2007 video dataset and compared it with a block-based technique. Cut detection results on the TRECVID 2007 dataset indicate that our algorithm has results comparable to the best of the block-based methods. The streaming video segmentation routine also achieves promising results on a challenging video segmentation benchmark database.
Window classification of brain CT images in biomedical articles.
Xue, Zhiyun; Antani, Sameer; Long, L Rodney; Demner-Fushman, Dina; Thoma, George R
2012-01-01
Effective capability to search biomedical articles based on visual properties of article images may significantly augment information retrieval in the future. In this paper, we present a new method to classify the window setting types of brain CT images. Windowing is a technique frequently used in the evaluation of CT scans, and is used to enhance contrast for the particular tissue or abnormality type being evaluated. In particular, it provides radiologists with an enhanced view of certain types of cranial abnormalities, such as skull lesions and bone dysplasia, which are usually examined using the "bone window" setting and illustrated in biomedical articles using "bone window images". Due to the inherently large variation of images among articles, it is important that the proposed method be robust. Our algorithm attained 90% accuracy in classifying images as bone window or non-bone window on a 210-image data set.
Evaluation of the effect of filter apodization for volume PET imaging using the 3-D RP algorithm
NASA Astrophysics Data System (ADS)
Baghaei, H.; Wong, Wai-Hoi; Li, Hongdi; Uribe, J.; Wang, Yu; Aykac, M.; Liu, Yaqiang; Xing, Tao
2003-02-01
We investigated the influence of filter apodization and cutoff frequency on the image quality of volume positron emission tomography (PET) imaging using the three-dimensional reprojection (3-D RP) algorithm. An important parameter in 3-D RP and other filtered backprojection algorithms is the choice of the filter window function. In this study, the Hann, Hamming, and Butterworth low-pass window functions were investigated. For each window, a range of cutoff frequencies was considered. Projection data were acquired by scanning a uniform cylindrical phantom, a cylindrical phantom containing four small lesion phantoms having diameters of 3, 4, 5, and 6 mm and the 3-D Hoffman brain phantom. All measurements were performed using the high-resolution PET camera developed at the M.D. Anderson Cancer Center (MDAPET), University of Texas, Houston, TX. This prototype camera, which is a multiring scanner with no septa, has an intrinsic transaxial resolution of 2.8 mm. The evaluation was performed by computing the noise level in the reconstructed images of the uniform phantom and the contrast recovery of the 6-mm hot lesion in a warm background and also by visually inspecting images, especially those of the Hoffman brain phantom. For this work, we mainly studied the central slices which are less affected by the incompleteness of the 3-D data. Overall, the Butterworth window offered a better contrast-noise performance over the Hann and Hamming windows. For our high statistics data, for the Hann and Hamming apodization functions a cutoff frequency of 0.6-0.8 of the Nyquist frequency resulted in a reasonable compromise between the contrast recovery and noise level and for the Butterworth window a cutoff frequency of 0.4-0.6 of the Nyquist frequency was a reasonable choice. For the low statistics data, use of lower cutoff frequencies was more appropriate.
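For reference, the three apodization windows compared above can be written as simple functions of normalized frequency; the Python sketch below uses standard textbook forms, with the Butterworth order chosen arbitrarily since the paper's value is not restated here.

    import numpy as np

    def apodization_window(name, freqs, cutoff, order=6):
        """Low-pass apodization windows of the kind compared in the study,
        evaluated at frequencies `freqs` in cycles/sample (Nyquist = 0.5),
        with `cutoff` given as a fraction of the Nyquist frequency. The
        Butterworth order is an arbitrary choice for illustration."""
        f = np.asarray(freqs, dtype=float) / 0.5     # fraction of Nyquist
        if name == "butterworth":
            return 1.0 / np.sqrt(1.0 + (f / cutoff) ** (2 * order))
        if name == "hann":
            w = 0.5 * (1.0 + np.cos(np.pi * f / cutoff))
        elif name == "hamming":
            w = 0.54 + 0.46 * np.cos(np.pi * f / cutoff)
        else:
            raise ValueError(name)
        return np.where(f <= cutoff, w, 0.0)

    freqs = np.fft.rfftfreq(256)                     # cycles per sample
    hann_06 = apodization_window("hann", freqs, cutoff=0.6)
    butter_05 = apodization_window("butterworth", freqs, cutoff=0.5)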
NASA Astrophysics Data System (ADS)
Kang, Jinbum; Jang, Won Seuk; Yoo, Yangmo
2018-02-01
Ultrafast compound Doppler imaging based on plane-wave excitation (UCDI) can be used to evaluate cardiovascular diseases using high frame rates. In particular, it provides a fully quantifiable flow analysis over a large region of interest with high spatio-temporal resolution. However, the pulse-repetition frequency (PRF) in the UCDI method is limited for high-velocity flow imaging since it has a tradeoff between the number of plane-wave angles (N) and acquisition time. In this paper, we present high PRF ultrafast sliding compound Doppler imaging method (HUSDI) to improve quantitative flow analysis. With the HUSDI method, full scanline images (i.e. each tilted plane wave data) in a Doppler frame buffer are consecutively summed using a sliding window to create high-quality ensemble data so that there is no reduction in frame rate and flow sensitivity. In addition, by updating a new compounding set with a certain time difference (i.e. sliding window step size or L), the HUSDI method allows various Doppler PRFs with the same acquisition data to enable a fully qualitative, retrospective flow assessment. To evaluate the performance of the proposed HUSDI method, simulation, in vitro and in vivo studies were conducted under diverse flow circumstances. In the simulation and in vitro studies, the HUSDI method showed improved hemodynamic representations without reducing either temporal resolution or sensitivity compared to the UCDI method. For the quantitative analysis, the root mean squared velocity error (RMSVE) was measured using 9 angles (-12° to 12°) with L of 1-9, and the results were found to be comparable to those of the UCDI method (L = N = 9), i.e. ⩽0.24 cm s-1, for all L values. For the in vivo study, the flow data acquired from a full cardiac cycle of the femoral vessels of a healthy volunteer were analyzed using a PW spectrogram, and arterial and venous flows were successfully assessed with high Doppler PRF (e.g. 5 kHz at L = 4). These results indicate that the proposed HUSDI method can improve flow visualization and quantification with a higher frame rate, PRF and flow sensitivity in cardiovascular imaging.
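A schematic Python version of the sliding compounding step is shown below, assuming a stack of beamformed frames that cycles through the transmit angles; it illustrates only how ensembles are formed from consecutive frames with a step size L, not the beamforming or Doppler processing itself.

    import numpy as np

    def sliding_compound(frames, n_angles, step):
        """Compound tilted plane-wave frames with a sliding window: each
        ensemble sums `n_angles` consecutive frames, and advancing the window
        by `step` (the L of the paper) sets the effective Doppler PRF without
        discarding any acquired data."""
        ensembles = [frames[s:s + n_angles].sum(axis=0)
                     for s in range(0, frames.shape[0] - n_angles + 1, step)]
        return np.stack(ensembles)

    # e.g. 90 beamformed frames cycling through 9 angles, window advanced by L = 4
    ensembles = sliding_compound(np.random.randn(90, 64, 64), n_angles=9, step=4)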
NASA Astrophysics Data System (ADS)
Prasetyo, H.; Alfatsani, M. A.; Fauza, G.
2018-05-01
The main issue in the vehicle routing problem (VRP) is finding the shortest route for product distribution from the depot to outlets so as to minimize the total cost of distribution. The Capacitated Closed Vehicle Routing Problem with Time Windows (CCVRPTW) is a variant of the VRP that accommodates vehicle capacity and the distribution period. Since the CCVRPTW is NP-hard, it requires an efficient and effective algorithm to solve. This study aimed to develop a Biased Random Key Genetic Algorithm (BRKGA) combined with local search to solve the CCVRPTW. The algorithm was then coded in MATLAB. Using numerical tests, optimal algorithm parameters were set and the algorithm was compared with a heuristic method and the standard BRKGA on a case study of soft drink distribution. Results showed that BRKGA combined with local search yields a lower total distribution cost than the heuristic method. Moreover, the developed algorithm successfully improved on the performance of the standard BRKGA.
LES of a ducted propeller with rotor and stator in crashback
NASA Astrophysics Data System (ADS)
Jang, Hyunchul; Mahesh, Krishnan
2012-11-01
A sliding interface method is developed for large eddy simulation (LES) of flow past ducted propellers with both rotor and stator. The method is developed for arbitrarily shaped unstructured elements on massively parallel computing platforms. Novel algorithms for searching sliding elements, interpolation at the sliding interface, and data structures for message passing are developed. We perform LES of flow past a ducted propeller with stator blades in the crashback mode of operation, where a marine vessel is quickly decelerated by rotating the propeller in reverse. The unsteady loads predicted by LES are in good agreement with experiments. A highly unsteady vortex ring is observed outside the duct. High pressure fluctuations are observed near the blade tips, which significantly contribute to the side-force. This work is supported by the United States Office of Naval Research.
A fast non-local means algorithm based on integral image and reconstructed similar kernel
NASA Astrophysics Data System (ADS)
Lin, Zheng; Song, Enmin
2018-03-01
Image denoising is one of the essential methods in digital image processing. The non-local means (NLM) denoising approach is a remarkable denoising technique; however, its computational time complexity is high. In this paper, we design a fast NLM algorithm based on an integral image and a reconstructed similar kernel. First, the integral image is introduced into the traditional NLM algorithm. Doing so removes a great deal of repetitive operations in the parallel processing, which greatly improves the running speed of the algorithm. Secondly, in order to correct the error of the integral image, we construct a similar window resembling the Gaussian kernel in a pyramidal stacking pattern. Finally, in order to eliminate the influence of replacing the Gaussian-weighted Euclidean distance with the Euclidean distance, we propose a scheme to construct a similar kernel with a size of 3 x 3 in a neighborhood window, which reduces the effect of noise on a single pixel. Experimental results demonstrate that the proposed algorithm is about seventeen times faster than the traditional NLM algorithm, yet produces comparable results in terms of Peak Signal-to-Noise Ratio (the PSNR increased by 2.9% on average) and perceptual image quality.
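The integral-image idea behind such fast NLM variants can be sketched as follows in Python: for a fixed patch offset, the per-pixel sum of squared differences over a patch is obtained from an integral image in constant time. The patch radius and circular shift handling are illustrative simplifications, not the authors' exact scheme.

    import numpy as np

    def patch_ssd_via_integral(img, dx, dy, half=3):
        """Sum of squared differences between every patch and the patch at
        offset (dx, dy), computed with an integral image so that the cost per
        offset is independent of the patch size -- the core speed-up behind
        integral-image NLM variants. Edges are handled crudely via circular
        shifts, which a real implementation would treat more carefully."""
        shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
        sq = (img - shifted) ** 2
        ii = np.pad(sq.cumsum(axis=0).cumsum(axis=1), ((1, 0), (1, 0)))
        h, w = img.shape
        r = half
        ssd = np.zeros((h, w))
        for y in range(r, h - r):
            for x in range(r, w - r):
                ssd[y, x] = (ii[y + r + 1, x + r + 1] - ii[y - r, x + r + 1]
                             - ii[y + r + 1, x - r] + ii[y - r, x - r])
        return ssd

    # Patch distances for one offset of a noisy test image.
    ssd = patch_ssd_via_integral(np.random.rand(64, 64), dx=2, dy=1)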
NASA Astrophysics Data System (ADS)
Zhang, Xue; Wang, Yong; Fan, Junjie; Zhong, Yong; Zhang, Rui
2014-09-01
To improve the transmitting power in an S-band klystron, a long pill-box window that has a disk with grooves with a semicircular cross section is theoretically investigated and simulated. A Monte-Carlo algorithm is used to track the secondary electron trajectories and analyze the multipactor scenario in the long pill-box window and on the grooved surface. Extending the height of the long-box window can decrease the normal electric field on the surface of the window disk, but the single surface multipactor still exists. It is confirmed that the window disk with periodic semicircular grooves can explicitly suppress the multipactor and predominantly depresses the local field enhancement and the bottom continuous multipactor. The difference between semicircular and sharp boundary grooves is clarified numerically and analytically.
An algorithm for spatial hierarchy clustering
NASA Technical Reports Server (NTRS)
Dejesusparada, N. (Principal Investigator); Velasco, F. R. D.
1981-01-01
A method for utilizing both spectral and spatial redundancy in compacting and preclassifying images is presented. In multispectral satellite images, a high correlation exists between neighboring image points which tend to occupy dense and restricted regions of the feature space. The image is divided into windows of the same size where the clustering is made. The classes obtained in several neighboring windows are clustered, and then again successively clustered until only one region corresponding to the whole image is obtained. By employing this algorithm only a few points are considered in each clustering, thus reducing computational effort. The method is illustrated as applied to LANDSAT images.
NASA Astrophysics Data System (ADS)
Levine, Zachary H.; Pintar, Adam L.
2015-11-01
A simple algorithm for averaging a stochastic sequence of 1D arrays in a moving, expanding window is provided. The samples are grouped in bins which increase exponentially in size so that a constant fraction of the samples is retained at any point in the sequence. The algorithm is shown to have particular relevance for a class of Monte Carlo sampling problems which includes one characteristic of iterative reconstruction in computed tomography. The code is available in the CPC program library in both Fortran 95 and C and is also available in R through CRAN.
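A heavily simplified Python sketch of the general idea, not the published reference implementation, is shown below: incoming arrays are merged into power-of-two bins, and the reported mean covers the newest bins holding roughly a fixed fraction of the samples.

    import numpy as np

    class ExpandingWindowMean:
        """Moving, expanding-window average of a stream of 1-D arrays.

        Incoming arrays are merged into bins whose sizes are powers of two,
        and the reported mean covers the newest bins holding roughly the
        latest `fraction` of the samples, so a constant fraction is retained
        as the sequence grows."""

        def __init__(self, length):
            self.length = length
            self.bins = []                      # list of (count, sum_array)

        def add(self, arr):
            self.bins.append((1, np.asarray(arr, dtype=float).copy()))
            # Merge equal-sized neighbours so bin sizes stay powers of two.
            while len(self.bins) >= 2 and self.bins[-1][0] == self.bins[-2][0]:
                (n1, s1), (n2, s2) = self.bins.pop(), self.bins.pop()
                self.bins.append((n1 + n2, s1 + s2))

        def mean(self, fraction=0.5):
            total = sum(n for n, _ in self.bins)
            kept = int(np.ceil(fraction * total))
            acc, count = np.zeros(self.length), 0
            for n, s in reversed(self.bins):    # newest bins first
                acc, count = acc + s, count + n
                if count >= kept:
                    break
            return acc / count

    ewm = ExpandingWindowMean(length=8)
    for _ in range(1000):
        ewm.add(np.random.randn(8))
    estimate = ewm.mean()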
Wang, Shenghao; Zhang, Yuyan; Cao, Fuyi; Pei, Zhenying; Gao, Xuewei; Zhang, Xu; Zhao, Yong
2018-02-13
This paper presents a novel spectrum analysis tool named synergy adaptive moving window modeling based on the immune clone algorithm (SA-MWM-ICA), motivated by the tedious and inconvenient labor involved in selecting pre-processing methods and spectral variables from prior experience. In this work, the immune clone algorithm is introduced into the spectrum analysis field for the first time as a new optimization strategy, covering the shortcomings of the relatively traditional methods. Following the working principle of the human immune system, the performance of the quantitative model is regarded as the antigen, and a special vector corresponding to this antigen is regarded as the antibody. The antibody contains a pre-processing method optimization region encoded by 11 decimal digits, and a spectrum variable optimization region formed by moving windows of changeable width and position. A set of original antibodies is created by modeling with this algorithm. After calculating the affinity of these antibodies, those with high affinity are selected for cloning; the higher the affinity, the more copies are made. In the next step, another important operation named hyper-mutation is applied to the antibodies after cloning; here, the lower the affinity, the higher the mutation probability. Several antibodies with high affinity are created through these steps. Groups of a simulated dataset, a gasoline near-infrared spectra dataset, and a soil near-infrared spectra dataset are employed to verify and illustrate the performance of SA-MWM-ICA. Analysis results show that the performance of the quantitative models adopted by SA-MWM-ICA is better, especially for relatively complex spectra, than traditional models such as partial least squares (PLS), moving window PLS (MWPLS), genetic algorithm PLS (GAPLS), and pretreatment method classification and adjustable parameter changeable size moving window PLS (CA-CSMWPLS). The selected pre-processing methods and spectrum variables are easily explained. The proposed method converges in a few generations and can be used not only for near-infrared spectroscopy analysis but also for other similar spectral analyses, such as infrared spectroscopy. Copyright © 2017 Elsevier B.V. All rights reserved.
Autopiquer - a Robust and Reliable Peak Detection Algorithm for Mass Spectrometry
NASA Astrophysics Data System (ADS)
Kilgour, David P. A.; Hughes, Sam; Kilgour, Samantha L.; Mackay, C. Logan; Palmblad, Magnus; Tran, Bao Quoc; Goo, Young Ah; Ernst, Robert K.; Clarke, David J.; Goodlett, David R.
2017-02-01
We present a simple algorithm for robust and unsupervised peak detection by determining a noise threshold in isotopically resolved mass spectrometry data. Solving this problem will greatly reduce the subjective and time-consuming manual picking of mass spectral peaks and so will prove beneficial in many research applications. The Autopiquer approach uses autocorrelation to test for the presence of (isotopic) structure in overlapping windows across the spectrum. Within each window, a noise threshold is optimized to remove the most unstructured data, whilst keeping as much of the (isotopic) structure as possible. This algorithm has been successfully demonstrated for both peak detection and spectral compression on data from many different classes of mass spectrometer and for different sample types, and this approach should also be extendible to other types of data that contain regularly spaced discrete peaks.
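The structure test at the heart of this approach can be illustrated with a small Python sketch; the lag (expected peak spacing in samples), the structure threshold, and the quantile sweep are assumptions standing in for the paper's optimization, not its actual implementation.

    import numpy as np

    def structure_score(window, lag):
        """Autocorrelation of a spectral window at the expected peak spacing
        (in sample points); structured (isotopic) signal scores high, noise
        stays near zero."""
        x = np.asarray(window, dtype=float)
        x = x - x.mean()
        denom = np.dot(x, x)
        return np.dot(x[:-lag], x[lag:]) / denom if denom > 0 else 0.0

    def choose_threshold(window, lag, quantiles=np.linspace(0.5, 0.99, 50)):
        """Sweep candidate noise thresholds and stop as soon as the data that
        would be discarded (everything below the threshold) starts to show
        autocorrelated structure, i.e. isotopic signal is being removed."""
        for q in quantiles:
            thr = np.quantile(window, q)
            discarded = np.where(window < thr, window, 0.0)
            if structure_score(discarded, lag) > 0.1:
                return thr
        return np.quantile(window, quantiles[-1])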
Rapid update of discrete Fourier transform for real-time signal processing
NASA Astrophysics Data System (ADS)
Sherlock, Barry G.; Kakad, Yogendra P.
2001-10-01
In many identification and target recognition applications, the incoming signal will have properties that render it amenable to analysis or processing in the Fourier domain. In such applications, however, it is usually essential that the identification or target recognition be performed in real time. An important constraint upon real-time processing in the Fourier domain is the time taken to perform the Discrete Fourier Transform (DFT). Ideally, a new Fourier transform should be obtained after the arrival of every new data point. However, the Fast Fourier Transform (FFT) algorithm requires on the order of N log2 N operations, where N is the length of the transform, and this usually makes calculation of the transform for every new data point computationally prohibitive. In this paper, we develop an algorithm to update the existing DFT to represent the new data series that results when a new signal point is received. Updating the DFT in this way reduces the computational order by a factor of log2 N. The algorithm can be modified to work in the presence of data window functions. This is a considerable advantage, because windowing is often necessary to reduce edge effects that occur because the implicit periodicity of the Fourier transform is not exhibited by the real-world signal. Versions are developed in this paper for use with the boxcar window and the split triangular, Hanning, Hamming, and Blackman windows. Generalization of these results to 2D is also presented.
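For the unwindowed (boxcar) case, the update described above corresponds to the classical sliding-DFT recurrence, sketched below in Python; the windowed variants discussed in the paper require additional bookkeeping that is not shown.

    import numpy as np

    def sliding_dft_update(X, x_old, x_new):
        """Update an N-point DFT when the analysis window slides by one
        sample: drop `x_old`, append `x_new`. Each bin costs O(1), so the
        whole update is O(N) instead of the O(N log N) of a fresh FFT. This
        is the classical sliding-DFT recurrence for a rectangular (boxcar)
        window."""
        N = len(X)
        k = np.arange(N)
        return (X - x_old + x_new) * np.exp(2j * np.pi * k / N)

    # Quick check against a direct FFT.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(260)
    N = 256
    X = np.fft.fft(x[:N])
    for n in range(N, len(x)):
        X = sliding_dft_update(X, x[n - N], x[n])
    assert np.allclose(X, np.fft.fft(x[len(x) - N:]))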
Daytime Land Surface Temperature Extraction from MODIS Thermal Infrared Data under Cirrus Clouds
Fan, Xiwei; Tang, Bo-Hui; Wu, Hua; Yan, Guangjian; Li, Zhao-Liang
2015-01-01
Simulated data showed that cirrus clouds could lead to a maximum land surface temperature (LST) retrieval error of 11.0 K when using the generalized split-window (GSW) algorithm with a cirrus optical depth (COD) at 0.55 μm of 0.4 and in nadir view. A correction term in the COD linear function was added to the GSW algorithm to extend the GSW algorithm to cirrus cloudy conditions. The COD was acquired by a look-up table of the isolated cirrus bidirectional reflectance at 0.55 μm. Additionally, the slope k of the linear function was expressed as a multiple linear model of the top-of-atmosphere brightness temperatures of MODIS channels 31–34 and of the difference between the split-window channel emissivities. The simulated data showed that the LST error could be reduced from 11.0 to 2.2 K. The sensitivity analysis indicated that the total errors from all the uncertainties of the input parameters, the extension algorithm accuracy, and the GSW algorithm accuracy were less than 2.5 K in nadir view. Finally, the Great Lakes surface water temperatures measured by buoys showed that the retrieval accuracy of the GSW algorithm was improved by at least 1.5 K using the proposed extension algorithm for cirrus skies. PMID:25928059
GPU accelerated dynamic functional connectivity analysis for functional MRI data.
Akgün, Devrim; Sakoğlu, Ünal; Esquivel, Johnny; Adinoff, Bryon; Mete, Mutlu
2015-07-01
Recent advances in multi-core processors and graphics card based computational technologies have paved the way for an improved and dynamic utilization of parallel computing techniques. Numerous applications have been implemented for the acceleration of computationally-intensive problems in various computational science fields including bioinformatics, in which big data problems are prevalent. In neuroimaging, dynamic functional connectivity (DFC) analysis is a computationally demanding method used to investigate dynamic functional interactions among different brain regions or networks identified with functional magnetic resonance imaging (fMRI) data. In this study, we implemented and analyzed a parallel DFC algorithm based on thread-based and block-based approaches. The thread-based approach was designed to parallelize DFC computations and was implemented in both the Open Multi-Processing (OpenMP) and Compute Unified Device Architecture (CUDA) programming platforms. Another approach developed in this study to better utilize the CUDA architecture is the block-based approach, where parallelization involves smaller parts of fMRI time-courses obtained by sliding windows. Experimental results showed that the proposed parallel design solutions enabled by the GPUs significantly reduce the computation time for DFC analysis. The multicore implementation using OpenMP on an 8-core processor provides up to a 7.7× speed-up. The GPU implementation using CUDA yielded substantial accelerations ranging from 18.5× to 157× speed-up once the thread-based and block-based approaches were combined in the analysis. The proposed parallel programming solutions showed that multi-core processor and CUDA-supported GPU implementations accelerate the DFC analyses significantly. The developed algorithms make DFC analyses more practical for multi-subject studies with more dynamic analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
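The underlying sliding-window computation that both parallel designs accelerate can be written serially in a few lines of Python/NumPy; the window length and step below are illustrative, and this sketch omits the GPU-specific thread and block organization.

    import numpy as np

    def sliding_window_dfc(timecourses, win=30, step=1):
        """Dynamic functional connectivity as correlation matrices computed
        over sliding windows of fMRI region/network time-courses.

        `timecourses` has shape (T, R) for T time points and R regions; the
        result has shape (n_windows, R, R)."""
        T, R = timecourses.shape
        mats = [np.corrcoef(timecourses[s:s + win].T)
                for s in range(0, T - win + 1, step)]
        return np.stack(mats)

    # e.g. 300 volumes, 50 networks, a 30-TR window slid one TR at a time
    dfc = sliding_window_dfc(np.random.randn(300, 50), win=30, step=1)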
Gardiner, Laura-Jayne; Bansept-Basler, Pauline; Olohan, Lisa; Joynson, Ryan; Brenchley, Rachel; Hall, Neil; O'Sullivan, Donal M; Hall, Anthony
2016-08-01
Previously we extended the utility of mapping-by-sequencing by combining it with sequence capture and mapping sequence data to pseudo-chromosomes that were organized using wheat-Brachypodium synteny. This, with a bespoke haplotyping algorithm, enabled us to map the flowering time locus in the diploid wheat Triticum monococcum L., identifying a set of deleted genes (Gardiner et al., 2014). Here, we develop this combination of gene enrichment and sliding window mapping-by-synteny analysis to map the Yr6 locus for yellow stripe rust resistance in hexaploid wheat. A 110 MB NimbleGen capture probe set was used to enrich and sequence a doubled haploid mapping population of hexaploid wheat derived from an Avalon and Cadenza cross. The Yr6 locus was identified by mapping to the POPSEQ chromosomal pseudomolecules using a bespoke pipeline and algorithm (Chapman et al., 2015). Furthermore, the same locus was identified using newly developed pseudo-chromosome sequences as a mapping reference that are based on the genic sequence used for sequence enrichment. The pseudo-chromosomes allow us to demonstrate the application of mapping-by-sequencing even to poorly defined polyploid genomes where chromosomes are incomplete and sub-genome assemblies are collapsed. This analysis uniquely enabled us to: compare wheat genome annotations; identify the Yr6 locus, defining a smaller genic region than was previously possible; associate the interval with one wheat sub-genome; and increase the density of associated SNP markers. Finally, we built the pipeline in iPlant, making it a user-friendly community resource for phenotype mapping. © 2016 The Authors. The Plant Journal published by Society for Experimental Biology and John Wiley & Sons Ltd.
Learnable despeckling framework for optical coherence tomography images
NASA Astrophysics Data System (ADS)
Adabi, Saba; Rashedi, Elaheh; Clayton, Anne; Mohebbi-Kalkhoran, Hamed; Chen, Xue-wen; Conforto, Silvia; Nasiriavanaki, Mohammadreza
2018-01-01
Optical coherence tomography (OCT) is a prevalent, interferometric, high-resolution imaging method with broad biomedical applications. Nonetheless, OCT images suffer from an artifact called speckle, which degrades the image quality. Digital filters offer an opportunity for image improvement in clinical OCT devices, where hardware modification to enhance images is expensive. To reduce speckle, a wide variety of digital filters have been proposed; selecting the most appropriate filter for an OCT image/image set is a challenging decision, especially in dermatology applications of OCT where a different variety of tissues are imaged. To tackle this challenge, we propose an expandable learnable despeckling framework, we call LDF. LDF decides which speckle reduction algorithm is most effective on a given image by learning a figure of merit (FOM) as a single quantitative image assessment measure. LDF is learnable, which means when implemented on an OCT machine, each given image/image set is retrained and its performance is improved. Also, LDF is expandable, meaning that any despeckling algorithm can easily be added to it. The architecture of LDF includes two main parts: (i) an autoencoder neural network and (ii) filter classifier. The autoencoder learns the FOM based on several quality assessment measures obtained from the OCT image including signal-to-noise ratio, contrast-to-noise ratio, equivalent number of looks, edge preservation index, and mean structural similarity index. Subsequently, the filter classifier identifies the most efficient filter from the following categories: (a) sliding window filters including median, mean, and symmetric nearest neighborhood, (b) adaptive statistical-based filters including Wiener, homomorphic Lee, and Kuwahara, and (c) edge preserved patch or pixel correlation-based filters including nonlocal mean, total variation, and block matching three-dimensional filtering.
Miller, Nathan D; Haase, Nicholas J; Lee, Jonghyun; Kaeppler, Shawn M; de Leon, Natalia; Spalding, Edgar P
2017-01-01
Grain yield of the maize plant depends on the sizes, shapes, and numbers of ears and the kernels they bear. An automated pipeline that can measure these components of yield from easily-obtained digital images is needed to advance our understanding of this globally important crop. Here we present three custom algorithms designed to compute such yield components automatically from digital images acquired by a low-cost platform. One algorithm determines the average space each kernel occupies along the cob axis using a sliding-window Fourier transform analysis of image intensity features. A second counts individual kernels removed from ears, including those in clusters. A third measures each kernel's major and minor axis after a Bayesian analysis of contour points identifies the kernel tip. Dimensionless ear and kernel shape traits that may interrelate yield components are measured by principal components analysis of contour point sets. Increased objectivity and speed compared to typical manual methods are achieved without loss of accuracy as evidenced by high correlations with ground truth measurements and simulated data. Millimeter-scale differences among ear, cob, and kernel traits that ranged more than 2.5-fold across a diverse group of inbred maize lines were resolved. This system for measuring maize ear, cob, and kernel attributes is being used by multiple research groups as an automated Web service running on community high-throughput computing and distributed data storage infrastructure. Users may create their own workflow using the source code that is staged for download on a public repository. © 2016 The Authors. The Plant Journal published by Society for Experimental Biology and John Wiley & Sons Ltd.
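As an illustration of the first algorithm's core idea, the Python sketch below estimates the average kernel spacing from a 1-D intensity profile along the cob axis with a sliding-window FFT; the window size, step, and frequency band are guesses rather than the published pipeline's parameters.

    import numpy as np

    def kernel_spacing_px(profile, win=256, step=64, fmin=1/40, fmax=1/4):
        """Average kernel spacing along the cob axis from a 1-D image
        intensity profile: a sliding-window FFT finds the dominant spatial
        frequency in a plausible kernels-per-pixel band, and its reciprocal
        is the spacing in pixels."""
        spacings = []
        freqs = np.fft.rfftfreq(win)                 # cycles per pixel
        band = (freqs >= fmin) & (freqs <= fmax)
        for start in range(0, len(profile) - win + 1, step):
            seg = profile[start:start + win]
            seg = (seg - seg.mean()) * np.hanning(win)
            spec = np.abs(np.fft.rfft(seg))
            spacings.append(1.0 / freqs[band][np.argmax(spec[band])])
        return float(np.mean(spacings))

    # Synthetic profile with a period of 20 pixels standing in for an ear image row.
    x = np.arange(2000)
    spacing = kernel_spacing_px(np.sin(2 * np.pi * x / 20.0) + 0.1 * np.random.randn(x.size))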
Kowalik, Grzegorz T; Knight, Daniel S; Steeden, Jennifer A; Tann, Oliver; Odille, Freddy; Atkinson, David; Taylor, Andrew; Muthurangu, Vivek
2015-02-01
To develop a real-time phase contrast MR sequence with high enough temporal resolution to assess cardiac time intervals. The sequence utilized spiral trajectories with an acquisition strategy that allowed a combination of temporal encoding (Unaliasing by fourier-encoding the overlaps using the temporal dimension; UNFOLD) and parallel imaging (Sensitivity encoding; SENSE) to be used (UNFOLDed-SENSE). An in silico experiment was performed to determine the optimum UNFOLD filter. In vitro experiments were carried out to validate the accuracy of time intervals calculation and peak mean velocity quantification. In addition, 15 healthy volunteers were imaged with the new sequence, and cardiac time intervals were compared to reference standard Doppler echocardiography measures. For comparison, in silico, in vitro, and in vivo experiments were also carried out using sliding window reconstructions. The in vitro experiments demonstrated good agreement between real-time spiral UNFOLDed-SENSE phase contrast MR and the reference standard measurements of velocity and time intervals. The protocol was successfully performed in all volunteers. Subsequent measurement of time intervals produced values in keeping with literature values and good agreement with the gold standard echocardiography. Importantly, the proposed UNFOLDed-SENSE sequence outperformed the sliding window reconstructions. Cardiac time intervals can be successfully assessed with UNFOLDed-SENSE real-time spiral phase contrast. Real-time MR assessment of cardiac time intervals may be beneficial in assessment of patients with cardiac conditions such as diastolic dysfunction. © 2014 Wiley Periodicals, Inc.
Recognizing surgeon's actions during suture operations from video sequences
NASA Astrophysics Data System (ADS)
Li, Ye; Ohya, Jun; Chiba, Toshio; Xu, Rong; Yamashita, Hiromasa
2014-03-01
Because of the shortage of nurses in the world, the realization of a robotic nurse that can support surgeries autonomously is very important. More specifically, the robotic nurse should be able to autonomously recognize different situations during surgery so that it can pass the necessary surgical tools to the medical doctors in a timely manner. This paper proposes and explores methods that can classify suture and tying actions during suture operations from a video sequence that observes the surgery scene including the surgeon's hands. First, the proposed method uses skin pixel detection and foreground extraction to detect the hand area. Then, interest points are randomly chosen from the hand area and their 3D SIFT descriptors are computed. A word vocabulary is built by applying hierarchical K-means to these descriptors, and the word frequency histogram, which corresponds to the feature space, is computed. Finally, to classify the actions, either an SVM (Support Vector Machine), the Nearest Neighbor rule (NN) applied to the feature space, or a method that combines a "sliding window" with NN is used. We collected 53 suture videos and 53 tying videos to build the training set and to test the proposed method experimentally. It turns out that the NN gives accuracies higher than 90%, which is better recognition than the SVM. Negative actions, which differ from both the suture and tying actions, are recognized with quite good accuracies, while the "sliding window" did not show significant improvements for suture and tying and cannot recognize negative actions.
Effects of the 7-8-year cycle in daily mean air temperature as a cross-scale information transfer
NASA Astrophysics Data System (ADS)
Jajcay, Nikola; Hlinka, Jaroslav; Paluš, Milan
2015-04-01
Using a novel nonlinear time-series analysis method, an information transfer from larger to smaller scales of air temperature variability has been observed in daily mean surface air temperature (SAT) data from European stations, as the influence of the phase of slow oscillatory phenomena with periods around 6-11 years on the amplitudes of variability characterized by smaller temporal scales from a few months to 4-5 years [1]. The strongest effect is exerted by an oscillatory mode with a period close to 8 years, and its influence can be seen in 1-2 °C differences of the conditional SAT means taken conditionally on the phase of the 8-year cycle. The size of this effect, however, changes in space and time. The changes in time are studied using a sliding-window technique, showing that the effect evolves in time and that during the last decades it has been stronger and significant. The sliding-window technique was also applied together with a seasonal division of the data, and it was found that the cycle is most pronounced in the winter season. Different types of surrogate data are applied in order to establish statistical significance and distinguish the effect of the 7-8-yr cycle from climate variability on shorter time scales. [1] M. Palus, Phys. Rev. Lett. 112 078702 (2014) This study is supported by the Ministry of Education, Youth and Sports of the Czech Republic within the Program KONTAKT II, Project No. LH14001.
A class of least-squares filtering and identification algorithms with systolic array architectures
NASA Technical Reports Server (NTRS)
Kalson, Seth Z.; Yao, Kung
1991-01-01
A unified approach is presented for deriving a large class of new and previously known time- and order-recursive least-squares algorithms with systolic array architectures, suitable for high-throughput-rate and VLSI implementations of space-time filtering and system identification problems. The geometrical derivation given is unique in that no assumption is made concerning the rank of the sample data correlation matrix. This method utilizes and extends the concept of oblique projections, as used previously in the derivations of the least-squares lattice algorithms. Exponentially weighted least-squares criteria are considered for both sliding and growing memory.
Huang, Yi-Fei; Golding, G Brian
2015-02-15
A number of statistical phylogenetic methods have been developed to infer conserved functional sites or regions in proteins. Many methods, e.g. Rate4Site, apply the standard phylogenetic models to infer site-specific substitution rates and totally ignore the spatial correlation of substitution rates in protein tertiary structures, which may reduce their power to identify conserved functional patches when the sequences used in the analysis are highly similar. The 3D sliding window method has been proposed to infer conserved functional patches in protein tertiary structures, but the window size, which reflects the strength of the spatial correlation, must be predefined and is not inferred from data. We recently developed GP4Rate to solve these problems under the Bayesian framework. Unfortunately, GP4Rate is computationally slow. Here, we present an intuitive web server, FuncPatch, to perform a fast approximate Bayesian inference of conserved functional patches in protein tertiary structures. Both simulations and four case studies based on empirical data suggest that FuncPatch is a good approximation to GP4Rate. However, FuncPatch is orders of magnitude faster than GP4Rate. In addition, simulations suggest that FuncPatch is potentially a useful tool complementary to Rate4Site, whereas the 3D sliding window method is less powerful than FuncPatch and Rate4Site. The functional patches predicted by FuncPatch in the four case studies are supported by experimental evidence, which corroborates the usefulness of FuncPatch. The software FuncPatch is freely available at http://info.mcmaster.ca/yifei/FuncPatch. Contact: golding@mcmaster.ca. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Analytical and numerical analysis of frictional damage in quasi brittle materials
NASA Astrophysics Data System (ADS)
Zhu, Q. Z.; Zhao, L. Y.; Shao, J. F.
2016-07-01
Frictional sliding and crack growth are two main dissipation processes in quasi brittle materials. The frictional sliding along closed cracks is the origin of macroscopic plastic deformation, while crack growth induces material damage. The main difficulty of modeling is to account for the inherent coupling between these two processes. Various models and associated numerical algorithms have been proposed, but so far there are no analytical solutions, even for simple loading paths, for the validation of such algorithms. In this paper, we first present a micro-mechanical model taking into account the damage-friction coupling for a large class of quasi brittle materials. The model is formulated by combining a linear homogenization procedure with the Mori-Tanaka scheme and the irreversible thermodynamics framework. As an original contribution, a series of analytical solutions of stress-strain relations are developed for various loading paths. Based on the micro-mechanical model, two numerical integration algorithms are exploited. The first one involves a coupled friction/damage correction scheme, which is consistent with the coupled nature of the constitutive model. The second one contains a friction/damage decoupling scheme with two consecutive steps: the friction correction followed by the damage correction. With the analytical solutions as reference results, the two algorithms are assessed through a series of numerical tests. It is found that the decoupling correction scheme is efficient and guarantees systematic numerical convergence.
Automatic identification and location technology of glass insulator self-shattering
NASA Astrophysics Data System (ADS)
Huang, Xinbo; Zhang, Huiying; Zhang, Ye
2017-11-01
Insulators are among the most important components of transmission lines and are vital to ensuring their safe operation under complex and harsh conditions. Glass insulators often self-shatter, but the available identification methods are inefficient and unreliable. Therefore, an automatic identification and localization technology for self-shattered glass insulators is proposed, which consists of cameras installed on tower video monitoring devices or unmanned aerial vehicles, a 4G/OPGW network, and a monitoring center, where the identification and localization algorithm is embedded in the expert software. First, images of insulators are captured by the cameras and processed to identify the region of the insulator string using the presented insulator-string identification algorithm. Second, according to the characteristics of the insulator string image, a mathematical model of the insulator string is established to estimate the direction and length of the sliding blocks. Third, local binary pattern histograms of the template and of each sliding block are extracted, by which a self-shattered insulator can be recognized and located. Finally, a series of experiments is performed to verify the effectiveness of the algorithm. For single-insulator images, Ac, Pr, and Rc of the algorithm are 94.5%, 92.38%, and 96.78%, respectively. For double-insulator images, Ac, Pr, and Rc are 90.00%, 86.36%, and 93.23%, respectively.
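A rough sketch of the block-matching step under stated assumptions: local binary pattern histograms of an intact template block and of the sliding blocks along the string are compared, and blocks whose histogram distance exceeds a threshold are flagged as candidate self-shattered caps. The LBP parameters, distance measure, and threshold are illustrative, not the paper's.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_hist(gray_block, P=8, R=1):
    """Uniform LBP histogram of a grayscale block, L1-normalized."""
    lbp = local_binary_pattern(gray_block, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2))
    return hist.astype(float) / max(hist.sum(), 1.0)

def find_shattered_blocks(blocks, template_block, dist_threshold=0.4):
    """Flag sliding blocks whose LBP histogram differs strongly from an intact template block."""
    ref = lbp_hist(template_block)
    flagged = []
    for idx, block in enumerate(blocks):
        d = 0.5 * np.abs(lbp_hist(block) - ref).sum()   # total-variation distance between histograms
        if d > dist_threshold:
            flagged.append((idx, float(d)))
    return flagged
```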
Kozlowski, Cleopatra; Jeet, Surinder; Beyer, Joseph; Guerrero, Steve; Lesch, Justin; Wang, Xiaoting; DeVoss, Jason; Diehl, Lauri
2013-01-01
The DSS (dextran sulfate sodium) model of colitis is a mouse model of inflammatory bowel disease. Microscopic symptoms include loss of crypt cells from the gut lining and infiltration of inflammatory cells into the colon. An experienced pathologist requires several hours per study to score histological changes in selected regions of the mouse gut. In order to increase the efficiency of scoring, Definiens Developer software was used to devise an entirely automated method to quantify histological changes in the whole H&E slide. When the algorithm was applied to slides from historical drug-discovery studies, automated scores classified 88% of drug candidates in the same way as pathologists’ scores. In addition, another automated image analysis method was developed to quantify colon-infiltrating macrophages, neutrophils, B cells and T cells in immunohistochemical stains of serial sections of the H&E slides. The timing of neutrophil and macrophage infiltration had the highest correlation to pathological changes, whereas T and B cell infiltration occurred later. Thus, automated image analysis enables quantitative comparisons between tissue morphology changes and cell-infiltration dynamics. PMID:23580198
Big data in reciprocal space: Sliding fast Fourier transforms for determining periodicity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vasudevan, Rama K., E-mail: rvv@ornl.gov; Belianinov, Alex; Baddorf, Arthur P.
Significant advances in atomically resolved imaging of crystals and surfaces have occurred in the last decade allowing unprecedented insight into local crystal structures and periodicity. Yet, the analysis of the long-range periodicity from the local imaging data, critical to correlation of functional properties and chemistry to the local crystallography, remains a challenge. Here, we introduce a Sliding Fast Fourier Transform (FFT) filter to analyze atomically resolved images of in-situ grown La5/8Ca3/8MnO3 (LCMO) films. We demonstrate the ability of the sliding FFT algorithm to differentiate two sub-lattices, resulting from a mixed-terminated surface. Principal Component Analysis and Independent Component Analysis of the Sliding FFT dataset reveal the distinct changes in crystallography, step edges, and boundaries between the multiple sub-lattices. The implications for the LCMO system are discussed. The method is universal for images with any periodicity, and is especially amenable to atomically resolved probe and electron-microscopy data for rapid identification of the sub-lattices present.
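A minimal sketch of a sliding FFT filter of this kind: a window is slid across the atomically resolved image, the magnitude spectrum of each patch is computed, and the flattened spectra are stacked so that PCA/ICA-style unmixing can separate sub-lattice signatures. The window size, step, and Hann taper are illustrative choices, not the paper's settings.

```python
import numpy as np

def sliding_fft(image, win=64, step=8):
    """Slide a win x win window over a 2D image and stack |FFT| of each patch.

    Returns (positions, spectra), where spectra holds one flattened, fftshifted
    magnitude spectrum per window position, ready for PCA/ICA-style unmixing.
    """
    h, w = image.shape
    hann = np.outer(np.hanning(win), np.hanning(win))  # taper to reduce edge artifacts
    positions, spectra = [], []
    for r in range(0, h - win + 1, step):
        for c in range(0, w - win + 1, step):
            patch = image[r:r + win, c:c + win] * hann
            mag = np.abs(np.fft.fftshift(np.fft.fft2(patch)))
            positions.append((r, c))
            spectra.append(mag.ravel())
    return np.array(positions), np.array(spectra)

# Example (illustrative): unmix sub-lattice signatures with PCA
# from sklearn.decomposition import PCA
# pos, spec = sliding_fft(stm_image, win=64, step=8)
# components = PCA(n_components=4).fit_transform(spec)
```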
Efficient Scalable Median Filtering Using Histogram-Based Operations.
Green, Oded
2018-05-01
Median filtering is a smoothing technique for noise removal in images. While there are various implementations of median filtering for a single-core CPU, there are few implementations for accelerators and multi-core systems. Many parallel implementations of median filtering use a sorting algorithm for rearranging the values within a filtering window and taking the median of the sorted values. While using sorting algorithms allows for simple parallel implementations, the cost of the sorting becomes prohibitive as the filtering windows grow, making such algorithms, sequential and parallel alike, inefficient. In this work, we introduce the first software parallel median filtering that is not sorting-based. The new algorithm uses efficient histogram-based operations, which reduce the computational requirements while also accessing the image fewer times. We show an implementation of our algorithm for both the CPU and NVIDIA's CUDA-supported graphics processing unit (GPU). The new algorithm is compared with several other leading CPU and GPU implementations. The CPU implementation shows near-perfect linear scaling on a quad-core system. The GPU implementation is several orders of magnitude faster than the other GPU implementations for mid-size median filters. For small kernels, comparison-based approaches are preferable, as fewer operations are required. Lastly, the new algorithm is open-source and can be found in the OpenCV library.
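A minimal 1D illustration of the histogram idea, assuming 8-bit data: the window's histogram is updated incrementally as it slides, and the median is read from cumulative bin counts instead of sorting each window. This sketches the general principle only, not the paper's parallel CPU/GPU implementation.

```python
import numpy as np

def running_median_u8(signal, radius):
    """Sliding-window median for 8-bit data using an incremental 256-bin histogram.

    Each step updates the histogram with one insert and one delete and scans at
    most 256 bins to locate the median, avoiding a per-window sort.
    """
    n = len(signal)
    win = 2 * radius + 1
    hist = np.zeros(256, dtype=np.int64)
    out = np.empty(n - win + 1, dtype=np.uint8)
    for v in signal[:win]:          # build histogram of the first window
        hist[v] += 1
    target = win // 2 + 1           # rank of the median in an odd-sized window
    for i in range(len(out)):
        cum = 0
        for bin_val in range(256):  # read the median from cumulative counts
            cum += hist[bin_val]
            if cum >= target:
                out[i] = bin_val
                break
        if i + 1 < len(out):        # slide: drop leftmost sample, add next one
            hist[signal[i]] -= 1
            hist[signal[i + win]] += 1
    return out
```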
Segmentation of pomegranate MR images using spatial fuzzy c-means (SFCM) algorithm
NASA Astrophysics Data System (ADS)
Moradi, Ghobad; Shamsi, Mousa; Sedaaghi, M. H.; Alsharif, M. R.
2011-10-01
Segmentation is one of the fundamental issues of image processing and machine vision. It plays a prominent role in a variety of image processing applications. In this paper, one important application of image processing, the segmentation of pomegranate MR images, is explored. Pomegranate is a fruit with pharmacological properties such as being anti-viral and anti-cancer. Having a high-quality product in hand would be a critical factor in its marketing, and the internal quality of the product is of central importance in the sorting process. The determination of qualitative features cannot be made manually. Therefore, the segmentation of the internal structures of the fruit needs to be performed as accurately as possible in the presence of noise. The fuzzy c-means (FCM) algorithm is noise-sensitive, and noisy pixels are misclassified. As a solution, in this paper the spatial FCM (SFCM) algorithm is proposed for segmenting pomegranate MR images. The algorithm incorporates spatial neighborhood information into FCM and modifies the fuzzy membership function for each class. Segmentation results on the original pomegranate MR images and on images corrupted by Gaussian, salt-and-pepper, and speckle noise show that the SFCM algorithm performs significantly better than the FCM algorithm. Also, after several steps of qualitative and quantitative analysis, we conclude that the SFCM algorithm with a 5×5 window is better than the other window sizes.
NASA Astrophysics Data System (ADS)
Cristescu, Constantin P.; Stan, Cristina; Scarlat, Eugen I.; Minea, Teofil; Cristescu, Cristina M.
2012-04-01
We present a novel method for the parameter-oriented analysis of mutual correlation between independent time series or between equivalent structures such as ordered data sets. The proposed method is based on the sliding window technique, defines a new type of correlation measure, and can be applied to time series from all domains of science and technology, experimental or simulated. A specific parameter that characterizes the time series is computed for each window, and a cross-correlation analysis is carried out on the set of values obtained for the time series under investigation. We apply this method to the study of some currency daily exchange rates from the point of view of the Hurst exponent and the intermittency parameter. Interesting correlation relationships are revealed and a tentative crisis prediction is presented.
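A bare-bones sketch of the windowed-parameter correlation idea: a scalar parameter is evaluated in each sliding window of both series and the resulting parameter sequences are correlated. The standard deviation is used here as a stand-in parameter; the paper's Hurst-exponent and intermittency estimators are not reproduced, and window sizes are illustrative.

```python
import numpy as np

def windowed_parameter(series, win, step, param=np.std):
    """Evaluate a scalar parameter in each sliding window of a 1D series."""
    return np.array([param(series[i:i + win])
                     for i in range(0, len(series) - win + 1, step)])

def parameter_cross_correlation(series_a, series_b, win=250, step=10, param=np.std):
    """Pearson correlation of the window-wise parameter values of two equally long series."""
    pa = windowed_parameter(series_a, win, step, param)
    pb = windowed_parameter(series_b, win, step, param)
    pa = (pa - pa.mean()) / pa.std()
    pb = (pb - pb.mean()) / pb.std()
    return float(np.mean(pa * pb))
```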
NASA Astrophysics Data System (ADS)
Phu, Do Xuan; Huy, Ta Duc; Mien, Van; Choi, Seung-Bok
2018-07-01
This work proposes a novel composite adaptive controller based on the prescribed performance of the sliding surface and applies it to vibration control of a semi-active vehicle seat suspension system subjected to severe external disturbances. As a first step, the online fast interval type-2 fuzzy neural network system is adopted to establish a model, and two sliding surfaces are used: a conventional surface and a prescribed surface. Then, an equivalent control is determined by assuming the derivative of the prescribed surface to be zero, followed by the design of a controller that guarantees both stability and robustness. The two controllers are then combined and integrated with adaptation laws using the projection algorithm. The effectiveness of the proposed composite controller is validated through both simulation and experiment by undertaking vibration control of a semi-active seat suspension system equipped with a magneto-rheological (MR) damper. It is shown from both simulation and experimental realization that excellent vibration control performance is achieved with a small tracking error between the proposed and prescribed objectives. In addition, the superiority of the proposed controller to a conventional sliding mode controller featuring one sliding surface and to proportional-integral-derivative (PID) controllers is demonstrated through a comparative study.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xue, E-mail: zhangxue.iecas@yahoo.com; Wang, Yong; Fan, Junjie
2014-09-15
To improve the transmitting power in an S-band klystron, a long pill-box window that has a disk with grooves with a semicircular cross section is theoretically investigated and simulated. A Monte-Carlo algorithm is used to track the secondary electron trajectories and analyze the multipactor scenario in the long pill-box window and on the grooved surface. Extending the height of the long-box window can decrease the normal electric field on the surface of the window disk, but the single surface multipactor still exists. It is confirmed that the window disk with periodic semicircular grooves can explicitly suppress the multipactor and predominantly depresses the local field enhancement and the bottom continuous multipactor. The difference between semicircular and sharp boundary grooves is clarified numerically and analytically.
An Adaptive Channel Access Method for Dynamic Super Dense Wireless Sensor Networks.
Lei, Chunyang; Bie, Hongxia; Fang, Gengfa; Zhang, Xuekun
2015-12-03
Super dense and distributed wireless sensor networks have become very popular with the development of small cell technology, the Internet of Things (IoT), Machine-to-Machine (M2M) communications, Vehicle-to-Vehicle (V2V) communications and public safety networks. While densely deployed wireless networks provide one of the most important and sustainable solutions to improve the accuracy of sensing and spectral efficiency, a new channel access scheme needs to be designed to solve the channel congestion problem introduced by the high dynamics of competing nodes accessing the channel simultaneously. In this paper, we first analyze the channel contention problem using a novel normalized channel contention analysis model which provides information on how to tune the contention window according to the state of channel contention. We then propose an adaptive channel contention window tuning algorithm in which the contention window tuning rate is set dynamically based on the estimated channel contention level. Simulation results show that our proposed adaptive channel access algorithm based on fast contention window tuning can achieve more than 95% of the theoretical optimal throughput and a fairness index of 0.97, especially in dynamic and dense networks.
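A toy sketch of the general idea of contention-window tuning driven by an estimated contention level; the target, gain, bounds, and update rule below are illustrative and are not those of the proposed algorithm.

```python
def tune_contention_window(cw, contention_level, target=0.5,
                           cw_min=16, cw_max=1024, gain=2.0):
    """Adjust the contention window toward a target contention level.

    contention_level is an estimate in [0, 1] (e.g. the fraction of busy or
    colliding slots observed); the adjustment rate grows with the distance
    from the target, so a heavily congested channel is reacted to quickly.
    """
    error = contention_level - target
    factor = 1.0 + gain * abs(error)
    cw = cw * factor if error > 0 else cw / factor  # widen when congested, shrink when idle
    return int(min(max(cw, cw_min), cw_max))

# Example: a node observing 80% busy slots widens its window aggressively
# cw = tune_contention_window(64, contention_level=0.8)
```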
Sliding Mode Control of Real-Time PNU Vehicle Driving Simulator and Its Performance Evaluation
NASA Astrophysics Data System (ADS)
Lee, Min Cheol; Park, Min Kyu; Yoo, Wan Suk; Son, Kwon; Han, Myung Chul
This paper introduces an economical and effective full-scale driving simulator for the study of human sensibility and the development of new vehicle parts and their control. Robust real-time control that accurately reproduces various vehicle motions is a difficult task because the motion platform is a complex nonlinear system. This study proposes a sliding mode controller with a perturbation compensator using an observer-based fuzzy adaptive network (FAN). The control algorithm is designed to solve the chattering problem of sliding mode control and to select adequate fuzzy parameters for the perturbation compensator. For evaluating the trajectory control performance of the proposed approach, tracking control of the developed simulator, named PNUVDS, is experimentally carried out. The driving performance of the simulator is then evaluated using the perception and sensibility of several drivers under various driving conditions.
Smooth integral sliding mode controller for the position control of Stewart platform.
Kumar P, Ramesh; Chalanga, Asif; Bandyopadhyay, B
2015-09-01
This paper proposes the application of a new algorithm for the position control of a Stewart platform. The conventional integral sliding mode controller is a combination of a nominal control and a discontinuous feedback control, hence the overall control is discontinuous in nature. The discontinuity in the feedback control is undesirable for practical applications because of chattering, which causes wear and tear of the mechanical actuators. In this paper, the existing integral sliding mode control law for systems with matched disturbances is modified by replacing the discontinuous part with a continuous modified twisting control. The proposed controller is continuous in nature due to the combination of two continuous controls. The desired position of the platform is achieved with the proposed controller even in the presence of matched disturbances. The effectiveness of the proposed controller is demonstrated through simulation results. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
Ashtiani Haghighi, Donya; Mobayen, Saleh
2018-04-01
This paper proposes an adaptive super-twisting decoupled terminal sliding mode control technique for a class of fourth-order systems. The adaptive tuning law eliminates the requirement of knowledge about the upper bounds of external perturbations. Using the proposed control procedure, the state variables of the cart-pole system converge to the decoupled terminal sliding surfaces and their equilibrium points in finite time. Moreover, via the super-twisting algorithm, the chattering phenomenon is avoided without affecting the control performance. The numerical results demonstrate the higher stabilization accuracy and lower performance index values of the suggested method compared with the other methods. The simulation results on the cart-pole system as well as experimental validations demonstrate that the proposed control technique exhibits reasonable performance in comparison with the other methods. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
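For reference, a generic sketch of the standard super-twisting law on a sliding variable s (the adaptive gain tuning and the decoupled terminal sliding surfaces of the paper are not included); the gains and step size are illustrative.

```python
import numpy as np

def super_twisting_step(s, v, k1=1.5, k2=1.1, dt=1e-3):
    """One integration step of the standard super-twisting algorithm.

    u = -k1 * |s|^(1/2) * sign(s) + v,   dv/dt = -k2 * sign(s)

    The resulting control u is continuous, which is what removes the chattering
    of a plain sign(s) sliding-mode law while retaining finite-time convergence.
    """
    u = -k1 * np.sqrt(abs(s)) * np.sign(s) + v
    v = v + dt * (-k2 * np.sign(s))
    return u, v
```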
Swiderska, Zaneta; Markiewicz, Tomasz; Grala, Bartlomiej; Slodkowska, Janina
2015-01-01
The paper presents a combined method for automatic hot-spot area selection in whole-slide images, based on a penalty factor, to support the pathomorphological diagnostic procedure. The studied slides represent meningioma and oligodendroglioma tumors stained with the Ki-67/MIB-1 immunohistochemical reaction, which allows the tumor proliferation index to be determined and provides an indication for medical treatment and prognosis. The combined method, based on mathematical morphology, thresholding, texture analysis and classification, is proposed and verified. The presented algorithm includes building a specimen map, eliminating hemorrhages from it, and two methods for detecting hot-spot fields with respect to an introduced penalty factor. Furthermore, we propose a localization concordance measure to evaluate the localization of hot-spot selections by the algorithms with respect to the expert's results. The influence of the penalty factor is presented and discussed; the best results were obtained for a penalty factor of 0.2. The results confirm the effectiveness of the applied approach.
On dealing with multiple correlation peaks in PIV
NASA Astrophysics Data System (ADS)
Masullo, A.; Theunissen, R.
2018-05-01
A novel algorithm to analyse PIV images in the presence of strong in-plane displacement gradients and to reduce sub-grid filtering is proposed in this paper. Interrogation windows subjected to strong in-plane displacement gradients often produce correlation maps with multiple peaks. Standard multi-grid procedures discard such ambiguous correlation windows using a signal-to-noise ratio (SNR) filter. The proposed algorithm improves the standard multi-grid algorithm by allowing the detection of splintered peaks in a correlation map through an automatic threshold, producing multiple displacement vectors for each correlation area. Vector locations are chosen by translating images according to the peak displacements and by selecting the areas with the strongest match. The method is assessed on synthetic images of a boundary layer of varying intensity and a sinusoidal displacement field of changing wavelength. An experimental case of a flow exhibiting strong velocity gradients is also provided to show the improvements brought by this technique.
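A simplified sketch of multi-peak detection in a correlation map with a relative threshold; the paper's image-shifting and match-strength vector selection are not reproduced here, and the neighborhood size and threshold are illustrative.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def correlation_peaks(corr, rel_threshold=0.5, neighborhood=3):
    """Return (row, col, value) for every local maximum above a fraction of the global peak.

    A pixel is a local maximum if it equals the maximum of its neighborhood;
    only maxima above rel_threshold * max(corr) are kept as candidate peaks.
    """
    local_max = (corr == maximum_filter(corr, size=neighborhood))
    significant = local_max & (corr >= rel_threshold * corr.max())
    rows, cols = np.nonzero(significant)
    return [(int(r), int(c), float(corr[r, c])) for r, c in zip(rows, cols)]
```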
Linear segmentation algorithm for detecting layer boundary with lidar.
Mao, Feiyue; Gong, Wei; Logan, Timothy
2013-11-04
The automatic detection of aerosol- and cloud-layer boundaries (base and top) is important in atmospheric lidar data processing, because the boundary information is not only useful for environment and climate studies, but can also be used as input for further data processing. Previous methods have shown limitations in defining the base and top and in window-size setting, and have neglected in-layer attenuation. To overcome these limitations, we present a new layer detection scheme for up-looking lidars based on linear segmentation with a reasonable threshold setting, boundary selecting, and false-positive removal strategies. Preliminary results from both real and simulated data show that this algorithm not only detects the layer base as accurately as the simple multi-scale method, but also detects the layer top more accurately. Our algorithm can be directly applied to uncalibrated data without requiring any additional measurements or window size selections.
NASA Astrophysics Data System (ADS)
Kim, Byung Soo; Lee, Woon-Seek; Koh, Shiegheun
2012-07-01
This article considers an inbound ordering and outbound dispatching problem for a single product in a third-party warehouse, where the demands are dynamic over a discrete and finite time horizon, and moreover, each demand has a time window in which it must be satisfied. Replenishing orders are shipped in containers and the freight cost is proportional to the number of containers used. The problem is classified into two cases, i.e. non-split demand case and split demand case, and a mathematical model for each case is presented. An in-depth analysis of the models shows that they are very complicated and difficult to find optimal solutions as the problem size becomes large. Therefore, genetic algorithm (GA) based heuristic approaches are designed to solve the problems in a reasonable time. To validate and evaluate the algorithms, finally, some computational experiments are conducted.
Locality-constrained anomaly detection for hyperspectral imagery
NASA Astrophysics Data System (ADS)
Liu, Jiabin; Li, Wei; Du, Qian; Liu, Kui
2015-12-01
Detecting a target with low-occurrence-probability from unknown background in a hyperspectral image, namely anomaly detection, is of practical significance. Reed-Xiaoli (RX) algorithm is considered as a classic anomaly detector, which calculates the Mahalanobis distance between local background and the pixel under test. Local RX, as an adaptive RX detector, employs a dual-window strategy to consider pixels within the frame between inner and outer windows as local background. However, the detector is sensitive if such a local region contains anomalous pixels (i.e., outliers). In this paper, a locality-constrained anomaly detector is proposed to remove outliers in the local background region before employing the RX algorithm. Specifically, a local linear representation is designed to exploit the internal relationship between linearly correlated pixels in the local background region and the pixel under test and its neighbors. Experimental results demonstrate that the proposed detector improves the original local RX algorithm.
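A straightforward sketch of the dual-window local RX detector that the proposed method builds on (the locality-constrained outlier removal itself is not shown); the hyperspectral cube is assumed to be shaped (rows, cols, bands), and the window radii are illustrative.

```python
import numpy as np

def local_rx(cube, r_in=3, r_out=7, eps=1e-6):
    """Dual-window local RX: Mahalanobis distance of each pixel to its local background.

    The background is the frame of pixels inside the outer window but outside
    the inner (guard) window; eps regularizes the background covariance.
    """
    rows, cols, bands = cube.shape
    scores = np.zeros((rows, cols))
    for i in range(r_out, rows - r_out):
        for j in range(r_out, cols - r_out):
            win = cube[i - r_out:i + r_out + 1, j - r_out:j + r_out + 1]
            keep = np.ones(win.shape[:2], dtype=bool)
            keep[r_out - r_in:r_out + r_in + 1, r_out - r_in:r_out + r_in + 1] = False
            bg = win[keep]                                   # local background pixels
            mu = bg.mean(axis=0)
            cov = np.cov(bg, rowvar=False) + eps * np.eye(bands)
            d = cube[i, j] - mu
            scores[i, j] = float(d @ np.linalg.solve(cov, d))  # Mahalanobis distance
    return scores
```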
Intelligent bandwidth compression
NASA Astrophysics Data System (ADS)
Tseng, D. Y.; Bullock, B. L.; Olin, K. E.; Kandt, R. K.; Olsen, J. D.
1980-02-01
The feasibility of a 1000:1 bandwidth compression ratio for image transmission has been demonstrated using image-analysis algorithms and a rule-based controller. Such a high compression ratio was achieved by first analyzing scene content using auto-cueing and feature-extraction algorithms, and then transmitting only the pertinent information consistent with mission requirements. A rule-based controller directs the flow of analysis and performs priority allocations on the extracted scene content. The reconstructed bandwidth-compressed image consists of an edge map of the scene background, with primary and secondary target windows embedded in the edge map. The bandwidth-compressed images are updated at a basic rate of 1 frame per second, with the high-priority target window updated at 7.5 frames per second. The scene-analysis algorithms used in this system together with the adaptive priority controller are described. Results of simulated 1000:1 bandwidth-compressed images are presented.
Sadala, S P; Patre, B M
2018-03-01
The 2-degree-of-freedom (DOF) helicopter system is a typical higher-order, multi-variable, nonlinear and strongly coupled control system. The helicopter dynamics also includes parametric uncertainties and is subject to unknown external disturbances. Such a complicated system requires a sophisticated control algorithm that can handle these difficulties. This paper presents a new robust control algorithm which is a combination of two continuous control techniques, the composite nonlinear feedback (CNF) and super-twisting control (STC) methods. In the existing integral sliding mode (ISM) based CNF control law, the discontinuous term exhibits chattering, which is not desirable for many practical applications. As the continuity of the well-known STC reduces chattering in the system, the proposed strategy is beneficial over the current ISM-based CNF control law, which has a discontinuous term. Two controllers with integral sliding surfaces are designed to control the position of the pitch and yaw angles of the 2-DOF helicopter. The adequacy of this combination is demonstrated through general analysis, simulation and experimental results on the 2-DOF helicopter setup. The results demonstrate the good performance of the proposed controller in terms of stabilization, reference tracking without overshoot under actuator saturation, and robustness to bounded matched disturbances. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
AI (artificial intelligence) in histopathology--from image analysis to automated diagnosis.
Kayser, Klaus; Görtler, Jürgen; Bogovac, Milica; Bogovac, Aleksandar; Goldmann, Torsten; Vollmer, Ekkehard; Kayser, Gian
2009-01-01
The technological progress in digitalization of complete histological glass slides has opened a new door in tissue-based diagnosis. The presentation of microscopic images as a whole in a digital matrix is called a virtual slide. A virtual slide allows calculation and related presentation of image information that otherwise can only be seen by individual human performance. The digital world permits the attachment of several (if not all) fields of view and their contemporary visualization on a screen. The presentation of all microscopic magnifications is possible if the basic pixel resolution is less than 0.25 microns. Introducing digital tissue-based diagnosis into the daily routine work of a surgical pathologist requires a new setup of workflow arrangement and procedures. The quality of digitized images is sufficient for diagnostic purposes; however, the time needed for viewing virtual slides exceeds that of viewing original glass slides by far. The reason lies in a slower and more difficult sampling procedure, which is the selection of information-containing fields of view. By application of artificial intelligence, tissue-based diagnosis in routine work can be managed automatically in the following steps: 1. The individual image quality has to be measured and, if necessary, corrected. 2. A diagnostic algorithm has to be applied. An algorithm has been developed that includes both object-based (object features, structures) and pixel-based (texture) measures. 3. These measures serve for diagnosis classification and feedback to order additional information, for example in virtual immunohistochemical slides. 4. The measures can serve for automated image classification and detection of relevant image information by themselves, without any labeling. 5. The pathologist's duty will not be replaced by such a system; on the contrary, the pathologist will manage and supervise the system, i.e., work at a "higher level". Virtual slides are already in use for teaching and continuing education in anatomy and pathology. First attempts to introduce them into routine work have been reported. Application of AI has been established by automated immunohistochemical measurement systems (EAMUS, www.diagnomX.eu). The performance of automated diagnosis has been reported for a broad variety of organs at sensitivity and specificity levels >85%. The implementation of a completely connected AI-supported system is in its infancy. Application of AI in digital tissue-based diagnosis will allow pathologists to work as supervisors and no longer as primary "water carriers". Its accurate use will give them the time needed to concentrate on difficult cases for the benefit of their patients.
Multi-window detection for P-wave in electrocardiograms based on bilateral accumulative area.
Chen, Riqing; Huang, Yingsong; Wu, Jian
2016-11-01
P-wave detection is one of the most challenging aspects of electrocardiogram (ECG) analysis due to the P-wave's low amplitude, low frequency, and variable waveform. This work introduces a novel multi-window detection method for P-wave delineation based on the bilateral accumulative area. The bilateral accumulative area is calculated by summing the areas covered by the P-wave curve within left and right sliding windows. The onset and offset of a positive P-wave correspond to local maxima of the area detector. The position drift and the difference in area variation of local extreme points across different windows are used to systematically combine multi-window and 12-lead synchronous detection, which screens the optimal boundary points from all extreme points of different window widths and adaptively matches the P-wave location. The proposed method was validated with ECG signals from various databases, including the Standard CSE Database, T-Wave Alternans Challenge Database, PTB Diagnostic ECG Database, and the St. Petersburg Institute of Cardiological Technics 12-Lead Arrhythmia Database. The average sensitivity Se was 99.44% with a positive predictivity P+ of 99.37% for P-wave detection. Standard deviations of 3.7 and 4.3 ms were achieved for the onset and offset of P-waves, respectively, which is within the accepted tolerances required by the CSE committee. Compared with well-known delineation methods, this method achieves high sensitivity and positive predictivity using a simple calculation process. The experimental results suggest that the bilateral accumulative area could be an effective detection tool for ECG signal analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.
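A loose, single-lead sketch of the bilateral-area idea under stated assumptions: the area of the baseline-corrected signal is accumulated over a left and a right window at each sample, and local maxima of the resulting detector are taken as candidate onset/offset points. The multi-window combination and 12-lead synchronous logic of the paper are not reproduced, and the window length and baseline correction are illustrative.

```python
import numpy as np

def bilateral_area(signal, win):
    """Sum of |signal| areas over a left and a right window around each sample."""
    mag = np.abs(signal - np.median(signal))          # crude baseline correction
    csum = np.concatenate(([0.0], np.cumsum(mag)))    # prefix sums for O(1) window areas
    n = len(signal)
    detector = np.zeros(n)
    for k in range(win, n - win):
        left = csum[k] - csum[k - win]
        right = csum[k + win] - csum[k]
        detector[k] = left + right
    return detector

def candidate_boundaries(detector, min_separation):
    """Local maxima of the area detector, kept at least min_separation samples apart."""
    idx = [k for k in range(1, len(detector) - 1)
           if detector[k] >= detector[k - 1] and detector[k] > detector[k + 1]]
    idx.sort(key=lambda k: detector[k], reverse=True)  # strongest maxima first
    kept = []
    for k in idx:
        if all(abs(k - m) >= min_separation for m in kept):
            kept.append(k)
    return sorted(kept)
```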
NASA Astrophysics Data System (ADS)
Liu, Xiaosong; Shan, Zebiao; Li, Yuanchun
2017-04-01
Pinpoint landing is a critical step in some asteroid exploration missions. This paper is concerned with descent trajectory control for soft touchdown on a small, irregularly shaped asteroid. A dynamic boundary layer based neural network quasi-sliding mode control law is proposed to track a desired descending path. The asteroid's gravitational acceleration acting on the spacecraft is described by the polyhedron method. Considering the presence of input constraints and unmodeled acceleration, the dynamic equation of relative motion is presented first. The desired descending path is planned using a cubic polynomial method, and a collision detection algorithm is designed. To perform trajectory tracking, a neural network sliding mode control law is given first, where the sliding mode control ensures the convergence of the system states. Two radial basis function neural networks (RBFNNs) are used, respectively, as an approximator for the unmodeled term and as a compensator for the difference between the magnitude-constrained actual control input and the nominal control. To reduce the chattering induced by traditional sliding mode control and guarantee the reachability of the system, a specific saturation function with a dynamic boundary layer is proposed to replace the sign function in the preceding control law. Through the Lyapunov approach, the reachability condition of the control system is given. The improved control law guarantees that the system state moves within a gradually shrinking quasi-sliding mode band. Numerical simulation results demonstrate the effectiveness of the proposed control strategy.
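A minimal sketch of the chattering-reduction idea of replacing sign(s) with a saturation function whose boundary layer shrinks over time; the specific dynamic boundary-layer law of the paper is not reproduced, and the exponential decay rule below is illustrative.

```python
import numpy as np

def sat_dynamic(s, t, phi0=0.5, phi_min=0.02, decay=0.1):
    """Saturation of the sliding variable s with a boundary layer that shrinks over time t.

    Inside the layer the output varies linearly with s (no chattering); outside,
    it behaves like sign(s). The layer width phi(t) decays from phi0 toward phi_min.
    """
    phi = phi_min + (phi0 - phi_min) * np.exp(-decay * t)
    return np.clip(s / phi, -1.0, 1.0)

# Usage inside a sliding-mode law (illustrative):
# u = u_equivalent - k * sat_dynamic(s, t)
```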
Intelligent Use of CFAR Algorithms
1993-05-01
... the reference windows can raise the threshold too high in many CFAR algorithms and result in masking of targets. GCMLD is a modification of CMLD that ... (Interim report, Kaman Sciences Corporation, contract F30602-91-C-0017.)
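Although only a fragment of the abstract is recoverable, the mechanism it refers to is that of classic cell-averaging CFAR; a minimal sketch follows to illustrate how strong returns inside the sliding reference window inflate the threshold and mask nearby targets (the censored CMLD/GCMLD variants discussed in the report are not implemented here, and all parameters are illustrative).

```python
import numpy as np

def ca_cfar(power, n_ref=16, n_guard=2, scale=4.0):
    """Cell-averaging CFAR over a 1D power profile.

    For each cell under test, the threshold is a scaled mean of the reference
    cells on both sides (excluding guard cells). Strong returns falling inside
    the reference window raise this mean and can mask nearby targets, which is
    what motivates censored variants such as CMLD/GCMLD.
    """
    n = len(power)
    half = n_ref // 2
    detections = np.zeros(n, dtype=bool)
    for k in range(half + n_guard, n - half - n_guard):
        lead = power[k - n_guard - half:k - n_guard]          # reference cells before the CUT
        lag = power[k + n_guard + 1:k + n_guard + 1 + half]   # reference cells after the CUT
        noise = np.concatenate((lead, lag)).mean()
        detections[k] = power[k] > scale * noise
    return detections
```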
Cognitive Assessment in Long-Duration Space Flight
NASA Technical Reports Server (NTRS)
Kane, Robert; Seaton, Kimberly; Sipes, Walter
2011-01-01
This slide presentation reviews the development and use of a tool for assessing spaceflight cognitive ability in astronauts. This tool, the Spaceflight Cognitive Assessment Tool for Windows (WinSCAT), has been used to provide ISS flight surgeons with an objective clinical tool to monitor astronauts' cognitive status during long-duration space flight and allow immediate feedback to the astronaut. Its use is medically required for all long-duration missions, and it contains a battery of five cognitive assessment subtests that are scheduled monthly and compared against the individual's preflight baseline.
Time-series analysis of foreign exchange rates using time-dependent pattern entropy
NASA Astrophysics Data System (ADS)
Ishizaki, Ryuji; Inoue, Masayoshi
2013-08-01
Time-dependent pattern entropy is a method that reduces variations to binary symbolic dynamics and considers the pattern of symbols in a sliding temporal window. We use this method to analyze the instability of daily variations in foreign exchange rates, in particular, the dollar-yen rate. The time-dependent pattern entropy of the dollar-yen rate was found to be high in the following periods: before and after the turning points of the yen from strong to weak or from weak to strong, and the period after the Lehman shock.
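A minimal sketch of time-dependent pattern entropy as described: daily changes are reduced to binary symbols (up/down), and within each sliding window the Shannon entropy of short symbol patterns is computed; the window length and pattern length are illustrative choices.

```python
import numpy as np
from collections import Counter

def pattern_entropy(rates, win=100, m=4):
    """Time-dependent pattern entropy of a 1D series (e.g. daily exchange rates).

    Daily changes are reduced to binary symbols (1 = up, 0 = down); within each
    sliding window the Shannon entropy of length-m symbol patterns is computed,
    yielding one entropy value per window position.
    """
    symbols = (np.diff(rates) > 0).astype(int)
    entropies = []
    for start in range(0, len(symbols) - win + 1):
        window = symbols[start:start + win]
        patterns = [tuple(window[i:i + m]) for i in range(win - m + 1)]
        counts = np.array(list(Counter(patterns).values()), dtype=float)
        p = counts / counts.sum()
        entropies.append(float(-(p * np.log2(p)).sum()))
    return np.array(entropies)
```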
Introduction to Numerical Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schoonover, Joseph A.
2016-06-14
These are slides for a lecture for the Parallel Computing Summer Research Internship at the National Security Education Center. This gives an introduction to numerical methods. Repetitive algorithms are used to obtain approximate solutions to mathematical problems, covering sorting, searching, root finding, optimization, interpolation, extrapolation, least-squares regression, eigenvalue problems, ordinary differential equations, and partial differential equations. Many equations are shown. Discretizations allow us to approximate solutions to mathematical models of physical systems using a repetitive algorithm, and they introduce errors that can lead to numerical instabilities if we are not careful.
Kros, Johan M; Huizer, Karin; Hernández-Laín, Aurelio; Marucci, Gianluca; Michotte, Alex; Pollo, Bianca; Rushing, Elisabeth J; Ribalta, Teresa; French, Pim; Jaminé, David; Bekka, Nawal; Lacombe, Denis; van den Bent, Martin J; Gorlia, Thierry
2015-06-10
With the rapid discovery of prognostic and predictive molecular parameters for glioma, the status of histopathology in the diagnostic process should be scrutinized. Our project aimed to construct a diagnostic algorithm for gliomas based on molecular and histologic parameters with independent prognostic value. The pathology slides of 636 patients with gliomas who had been included in EORTC 26951 and 26882 trials were reviewed using virtual microscopy by a panel of six neuropathologists who independently scored 18 histologic features and provided an overall diagnosis. Molecular data for IDH1, 1p/19q loss, EGFR amplification, loss of chromosome 10 and chromosome arm 10q, gain of chromosome 7, and hypermethylation of the MGMT promoter were available for some of the cases. The slides were divided into a discovery set (n = 426) and a validation set (n = 210). The diagnostic algorithm resulting from analysis of the discovery set was validated in the latter. In 66% of cases, consensus on the overall diagnosis was present. A diagnostic algorithm consisting of two molecular markers and one consensus histologic feature was created by conditional inference tree analysis. The order of prognostic significance was 1p/19q loss, EGFR amplification, and astrocytic morphology, which resulted in the identification of four diagnostic nodes. Validation of the nodes in the validation set confirmed their prognostic value (P < .001). We succeeded in creating a timely diagnostic algorithm for anaplastic glioma based on multivariable analysis of consensus histopathology and molecular parameters. © 2015 by American Society of Clinical Oncology.
Novel application of windowed beamforming function imaging for FLGPR
NASA Astrophysics Data System (ADS)
Xique, Ismael J.; Burns, Joseph W.; Thelen, Brian J.; LaRose, Ryan M.
2018-04-01
Backprojection of cross-correlated array data, using algorithms such as coherent interferometric imaging (Borcea et al., 2006), has been advanced as a method to improve the statistical stability of images of targets in an inhomogeneous medium. Recently, the Windowed Beamforming Energy (WBE) function algorithm has been introduced as a functionally equivalent approach that is significantly less computationally burdensome (Borcea et al., 2011). WBE produces similar results through the use of a quadratic function summing signals after beamforming in transmission and reception, and windowing in the time domain. We investigate the application of WBE to improve the detection of buried targets with forward-looking ground-penetrating MIMO radar (FLGPR) data. The formulation of WBE as well as its software implementation for the FLGPR data collection are discussed. WBE imaging results are compared to standard backprojection and Coherence Factor imaging. Additionally, the effectiveness of WBE on field-collected data is demonstrated qualitatively through images and quantitatively through the use of a CFAR statistic on buried targets of a variety of contrast levels.
Fast Human Detection for Intelligent Monitoring Using Surveillance Visible Sensors
Ko, Byoung Chul; Jeong, Mira; Nam, JaeYeal
2014-01-01
Human detection using visible surveillance sensors is an important and challenging task for intruder detection and safety management. The biggest barrier to real-time human detection is the computational time required for dense image scaling and for scanning windows extracted from an entire image. This paper proposes fast human detection by selecting optimal levels of image scale using each level's adaptive region-of-interest (ROI). To estimate the image-scaling level, we generate a Hough windows map (HWM) and select a few optimal image scales based on the strength of the HWM and the divide-and-conquer algorithm. Furthermore, adaptive ROIs are arranged per image scale to provide a different search area. We employ a cascade random forests classifier to separate candidate windows into human and nonhuman classes. The proposed algorithm has been successfully applied to real-world surveillance video sequences, and its detection accuracy and computational speed show better performance than those of other related methods. PMID:25393782
Practical quantification of necrosis in histological whole-slide images.
Homeyer, André; Schenk, Andrea; Arlt, Janine; Dahmen, Uta; Dirsch, Olaf; Hahn, Horst K
2013-06-01
Since the histological quantification of necrosis is a common task in medical research and practice, we evaluate different image analysis methods for quantifying necrosis in whole-slide images. In a practical usage scenario, we assess the impact of different classification algorithms and feature sets on both accuracy and computation time. We show how a well-chosen combination of multiresolution features and an efficient postprocessing step enables the accurate quantification of necrosis in gigapixel images in less than a minute. The results are general enough to be applied to other areas of histological image analysis as well. Copyright © 2013 Elsevier Ltd. All rights reserved.