NASA Astrophysics Data System (ADS)
Yang, Shuang-Long; Liang, Li-Ping; Liu, Hou-De; Xu, Ke-Jun
2018-03-01
Aiming at reducing the estimation error of the sensor frequency response function (FRF) estimated by the commonly used window-based spectral estimation method, the error models of interpolation and transient errors are derived in the form of non-parametric models. Accordingly, window effects on the errors are analyzed, revealing that the commonly used Hanning window leads to a smaller interpolation error, which can also be largely eliminated by cubic spline interpolation when estimating the FRF from step response data, and that a window with a smaller front-end value suppresses more of the transient error. Thus, a new dual-cosine window with its non-zero discrete Fourier transform bins at -3, -1, 0, 1, and 3 is constructed for FRF estimation. Compared with the Hanning window, the new dual-cosine window has equivalent interpolation-error suppression capability and better transient-error suppression capability when estimating the FRF from the step response; specifically, it improves the asymptotic decay of the transient error from the O(N^-2) of the Hanning window method to O(N^-4) while increasing the uncertainty only slightly (about 0.4 dB). Then, one direction of a wind tunnel strain gauge balance, a high-order, small-damping, non-minimum-phase system, is employed as an example to verify the new dual-cosine window-based spectral estimation method. The model simulation shows that the new dual-cosine window method is better than the Hanning window method for FRF estimation and, compared with the Gans and LPM methods, has the advantages of simple computation, low time consumption, and short data records; the calculation on actual balance data is consistent with the simulation. Thus, the new dual-cosine window is effective and practical for FRF estimation.
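As background for the window-based spectral approach this abstract builds on, a generic H1 FRF estimator with a Hann window can be sketched as below. This is not the paper's dual-cosine window or its step-response formulation (the dual-cosine coefficients are not given here); the first-order test system is purely illustrative.

```python
import numpy as np
from scipy import signal

# Known first-order system H(s) = 1/(tau*s + 1), discretized with the
# bilinear transform and excited by broadband noise.
fs, tau = 1000.0, 0.05
rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)
b, a = signal.bilinear([1.0], [tau, 1.0], fs)
y = signal.lfilter(b, a, x)

# H1 estimator: cross-spectrum over input auto-spectrum, both computed
# with a Hann window; a different taper (e.g. a dual-cosine window)
# would be passed through the `window` argument instead.
f, Pxy = signal.csd(x, y, fs=fs, window="hann", nperseg=4096)
_, Pxx = signal.welch(x, fs=fs, window="hann", nperseg=4096)
H = Pxy / Pxx
```

At the corner frequency 1/(2*pi*tau), about 3.2 Hz, the estimated magnitude should sit near 1/sqrt(2), a quick sanity check on window and scaling choices.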
Automated variance reduction for MCNP using deterministic methods.
Sweezy, J; Brown, F; Booth, T; Chiaramonte, J; Preeg, B
2005-01-01
In order to reduce the user's time and the computer time needed to solve deep penetration problems, an automated variance reduction capability has been developed for the MCNP Monte Carlo transport code. This new variance reduction capability developed for MCNP5 employs the PARTISN multigroup discrete ordinates code to generate mesh-based weight windows. The technique of using deterministic methods to generate importance maps has been widely used to increase the efficiency of deep penetration Monte Carlo calculations. The application of this method in MCNP uses the existing mesh-based weight window feature to translate the MCNP geometry into geometry suitable for PARTISN. The adjoint flux, which is calculated with PARTISN, is used to generate mesh-based weight windows for MCNP. Additionally, the MCNP source energy spectrum can be biased based on the adjoint energy spectrum at the source location. This method can also use angle-dependent weight windows.
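The central step, converting an adjoint flux map into mesh-based weight-window bounds, can be sketched in a few lines. This is a simplified reading of the technique, not MCNP5/PARTISN code: the 1-D mesh, the normalization at the source cell, and the upper/lower bound ratio of 5 are illustrative assumptions.

```python
import numpy as np

def weight_window_bounds(adjoint_flux, source_cell, ratio=5.0):
    """Mesh-based weight-window lower/upper bounds from an adjoint flux map.

    The lower bound is proportional to 1/phi_adjoint, normalized so that a
    source particle of weight 1 is born at the window center of its cell;
    `ratio` is the upper-to-lower bound ratio.
    """
    flux = np.asarray(adjoint_flux, dtype=float)
    lower = 1.0 / flux                       # importance ~ adjoint flux
    center_factor = (1.0 + ratio) / 2.0      # window center = this * lower
    lower *= 1.0 / (center_factor * lower[source_cell])
    upper = ratio * lower
    return lower, upper
```

Cells with high adjoint flux (important to the detector) get low bounds, so particles there survive and split; cells with low adjoint flux get high bounds, so particles there are rouletted.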
Adaptive Window Zero-Crossing-Based Instantaneous Frequency Estimation
NASA Astrophysics Data System (ADS)
Sekhar, S. Chandra; Sreenivas, TV
2004-12-01
We address the problem of estimating the instantaneous frequency (IF) of a real-valued constant-amplitude time-varying sinusoid. Estimation of polynomial IF is formulated using the zero-crossings of the signal. We propose an algorithm to estimate nonpolynomial IF by local approximation with a low-order polynomial over a short segment of the signal. This involves the choice of window length to minimize the mean square error (MSE). The optimal window length found by directly minimizing the MSE is a function of the higher-order derivatives of the IF, which are not available a priori. However, an optimum solution is formulated using an adaptive window technique based on the concept of intersection of confidence intervals. The adaptive algorithm enables minimum-MSE IF (MMSE-IF) estimation without requiring a priori information about the IF. Simulation results show that the adaptive window zero-crossing-based IF estimation method is superior to fixed window methods and is also better than adaptive spectrogram and adaptive Wigner-Ville distribution (WVD)-based IF estimators at different signal-to-noise ratios (SNRs).
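The zero-crossing formulation can be illustrated with a minimal sketch: crossing instants are located by linear interpolation, assigned phases k*pi, and a low-order polynomial phase fit is differentiated to give the IF. The adaptive window-length selection via intersection of confidence intervals is omitted here.

```python
import numpy as np

def if_from_zero_crossings(x, fs, order=3):
    """Estimate instantaneous frequency from zero-crossings.

    At the k-th zero-crossing t_k the phase satisfies
    phi(t_k) = phi(t_0) + k*pi, so a low-order polynomial fitted to
    (t_k, k*pi) and differentiated yields the IF over the segment.
    """
    s = np.sign(x)
    idx = np.where(np.diff(s) != 0)[0]
    # sub-sample crossing instants by linear interpolation
    t = (idx - x[idx] / (x[idx + 1] - x[idx])) / fs
    phase = np.pi * np.arange(len(t))
    coeffs = np.polyfit(t, phase, order)
    f_inst = np.polyval(np.polyder(coeffs), t) / (2 * np.pi)  # Hz
    return t, f_inst
```

On a noise-free linear chirp the cubic phase fit recovers the linear IF law almost exactly, since the true phase is quadratic in time.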
Foo, Lee Kien; McGree, James; Duffull, Stephen
2012-01-01
Optimal design methods have been proposed to determine the best sampling times when sparse blood sampling is required in clinical pharmacokinetic studies. However, the optimal blood sampling time points may not be feasible in clinical practice. Sampling windows, a time interval for blood sample collection, have been proposed to provide flexibility in blood sampling times while preserving efficient parameter estimation. Because of the complexity of the population pharmacokinetic models, which are generally nonlinear mixed effects models, there is no analytical solution available to determine sampling windows. We propose a method for determination of sampling windows based on MCMC sampling techniques. The proposed method attains a stationary distribution rapidly and provides time-sensitive windows around the optimal design points. The proposed method is applicable to determine sampling windows for any nonlinear mixed effects model although our work focuses on an application to population pharmacokinetic models. Copyright © 2012 John Wiley & Sons, Ltd.
Fabrication of Microcapsules for Dye-Doped Polymer-Dispersed Liquid Crystal-Based Smart Windows.
Kim, Mingyun; Park, Kyun Joo; Seok, Seunghwan; Ok, Jong Min; Jung, Hee-Tae; Choe, Jaehoon; Kim, Do Hyun
2015-08-19
A dye-doped polymer-dispersed liquid crystal (PDLC) is an attractive material for application in smart windows. Smart windows using a PDLC can be operated simply and have a high contrast ratio compared with other devices that employ photochromic or thermochromic materials. However, in conventional dye-doped PDLC methods, dye contamination can cause problems and has limited the commercialization of electric smart windows. Here, we report an approach that resolves dye-related problems by encapsulating the dye in monodispersed capsules. With encapsulation, the fabricated dye-doped PDLC had a contrast ratio of >120 at 600 nm. This fabrication method of encapsulating the dye in a core-shell structured microcapsule in a dye-doped PDLC device provides a practical platform for dye-doped PDLC-based smart windows.
Schüpbach, Jörg; Gebhardt, Martin D.; Scherrer, Alexandra U.; Bisset, Leslie R.; Niederhauser, Christoph; Regenass, Stephan; Yerly, Sabine; Aubert, Vincent; Suter, Franziska; Pfister, Stefan; Martinetti, Gladys; Andreutti, Corinne; Klimkait, Thomas; Brandenberger, Marcel; Günthard, Huldrych F.
2013-01-01
Background Tests for recent infections (TRIs) are important for HIV surveillance. We have shown that a patient's antibody pattern in a confirmatory line immunoassay (Inno-Lia) also yields information on time since infection. We have published algorithms which, with a certain sensitivity and specificity, distinguish between incident (≤12 months) and older infection. In order to use these algorithms like other TRIs, i.e., based on their windows, we now determined their window periods. Methods We classified Inno-Lia results of 527 treatment-naïve patients with HIV-1 infection of ≤12 months according to incidence by 25 algorithms. The time after which all infections were ruled older, i.e. the algorithm's window, was determined by linear regression of the proportion ruled incident as a function of time since infection. Window-based incident infection rates (IIR) were determined utilizing the relationship ‘Prevalence = Incidence x Duration’ in four annual cohorts of HIV-1 notifications. Results were compared to performance-based IIR also derived from Inno-Lia results, but utilizing the relationship ‘incident = true incident + false incident’, and also to the IIR derived from the BED incidence assay. Results Window periods varied between 45.8 and 130.1 days and correlated well with the algorithms' diagnostic sensitivity (R2 = 0.962; P<0.0001). Among the 25 algorithms, the mean window-based IIR for the 748 notifications of 2005/06 was 0.457, compared to 0.453 obtained for the performance-based IIR with a model not correcting for selection bias. Evaluation of BED results using a window of 153 days yielded an IIR of 0.669. Window-based and performance-based IIR increased by 22.4% and 30.6%, respectively, in 2008, while 2009 and 2010 showed a return to baseline for both methods.
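The two relationships used here, the regression-based window period and ‘Prevalence = Incidence x Duration’, are simple enough to sketch with illustrative numbers (not the study's data):

```python
import numpy as np

def window_period(t_days, p_incident):
    """Window period of a TRI algorithm: regress the proportion of
    specimens classified 'incident' against time since infection and
    return the time at which the fitted line reaches zero."""
    slope, intercept = np.polyfit(t_days, p_incident, 1)
    return -intercept / slope

def incidence_from_window(n_recent, n_total, window_days):
    """Window-based incidence via Prevalence = Incidence x Duration:
    the fraction testing 'recent' divided by the window in years."""
    return (n_recent / n_total) / (window_days / 365.25)
```

With a perfectly linear calibration set the intercept with zero recovers the window exactly; real data would carry regression uncertainty.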
Conclusions IIR estimations by window- and performance-based evaluations of Inno-Lia algorithm results were similar and can be used together to assess IIR changes between annual HIV notification cohorts. PMID:23990968
NASA Astrophysics Data System (ADS)
Xiao, Fan; Chen, Zhijun; Chen, Jianguo; Zhou, Yongzhang
2016-05-01
In this study, a novel batch sliding window (BSW) based singularity mapping approach is proposed. Compared with the traditional sliding window (SW) technique, which suffers from the empirical predetermination of a fixed maximum window size and the outlier sensitivity of least-squares (LS) linear regression, the BSW based singularity mapping approach automatically determines the optimal size of the largest window for each estimated position and utilizes robust linear regression (RLR), which is insensitive to outliers. In the case study, tin geochemical data from Gejiu, Yunnan, were processed by the BSW based singularity mapping approach. The results show that the BSW approach improves the accuracy of the calculated singularity exponent values owing to the determination of the optimal maximum window size. The use of RLR in the BSW approach smooths the distribution of singularity index values, largely removing the highly fluctuating, noise-like values that usually make a singularity map rough and discontinuous. Furthermore, the Student's t-statistic diagram indicates a strong spatial correlation between high geochemical anomalies and known tin polymetallic deposits. Target areas within high tin geochemical anomalies probably have much higher potential for the exploration of new tin polymetallic deposits than other areas, particularly areas that show strong tin geochemical anomalies but in which no tin polymetallic deposits have yet been found.
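A minimal sketch of a window-based singularity estimate: the mean concentration over square windows of growing size is regressed on window size in log-log space, with a Theil-Sen style median-of-slopes fit standing in for the paper's RLR. The window sizes and the 2-D multifractal scaling model C(r) ~ r^(alpha-2) are standard assumptions, not the paper's exact procedure.

```python
import numpy as np

def singularity_index(grid, i, j, half_sizes=(1, 2, 3, 4)):
    """Local singularity exponent alpha at cell (i, j).

    Mean concentration C(r) over a (2r+1)x(2r+1) window is assumed to
    scale as C(r) ~ r^(alpha - 2); alpha - 2 is the slope of log C
    versus log window size, fitted robustly as a median of pairwise
    slopes (Theil-Sen style), which is insensitive to outlier windows.
    """
    log_r, log_c = [], []
    for r in half_sizes:
        win = grid[i - r:i + r + 1, j - r:j + r + 1]
        log_r.append(np.log(2 * r + 1))
        log_c.append(np.log(win.mean()))
    slopes = []
    for a in range(len(half_sizes)):
        for b in range(a + 1, len(half_sizes)):
            slopes.append((log_c[b] - log_c[a]) / (log_r[b] - log_r[a]))
    return 2.0 + float(np.median(slopes))
```

A non-singular (flat) field gives alpha = 2, while a local concentration peak gives alpha < 2, flagging enrichment.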
Window classification of brain CT images in biomedical articles.
Xue, Zhiyun; Antani, Sameer; Long, L Rodney; Demner-Fushman, Dina; Thoma, George R
2012-01-01
Effective capability to search biomedical articles based on visual properties of article images may significantly augment information retrieval in the future. In this paper, we present a new method to classify the window setting types of brain CT images. Windowing is a technique frequently used in the evaluation of CT scans to enhance contrast for the particular tissue or abnormality type being evaluated. In particular, it provides radiologists with an enhanced view of certain types of cranial abnormalities, such as skull lesions and bone dysplasia, which are usually examined using the "bone window" setting and illustrated in biomedical articles using "bone window images". Because of the inherently large variation of images among articles, it is important that the proposed method be robust. Our algorithm attained 90% accuracy in classifying images as bone window or non-bone window in a 210-image data set.
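The windowing operation itself is a simple intensity remapping of Hounsfield units, sketched below; the bone-window center/width values used in the example are typical presets, not figures taken from the paper.

```python
import numpy as np

def apply_ct_window(hu, center, width):
    """Map Hounsfield units to display gray levels for a given window.

    Values below center - width/2 clip to 0 and values above
    center + width/2 clip to 255; a common 'bone window' preset is
    roughly center 400 HU, width 1800 HU (presets vary by site).
    """
    lo = center - width / 2.0
    img = (np.asarray(hu, dtype=float) - lo) / width
    return np.clip(img, 0.0, 1.0) * 255.0
```

A classifier like the one in the abstract works in the opposite direction, inferring from the rendered gray levels which window setting produced the published image.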
Drug exposure in register-based research—An expert-opinion based evaluation of methods
Taipale, Heidi; Koponen, Marjaana; Tolppanen, Anna-Maija; Hartikainen, Sirpa; Ahonen, Riitta; Tiihonen, Jari
2017-01-01
Background In register-based pharmacoepidemiological studies, construction of drug exposure periods from drug purchases is a major methodological challenge. Various methods have been applied, but their validity is rarely evaluated. Our objective was to conduct an expert-opinion based evaluation of the correctness of drug use periods produced by different methods. Methods Drug use periods were calculated with three fixed methods: time windows, assumption of one Defined Daily Dose (DDD) per day, and one tablet per day, and with PRE2DUP, which is based on modelling of individual drug purchasing behavior. The expert-opinion based evaluation was conducted with 200 randomly selected purchase histories of warfarin, bisoprolol, simvastatin, risperidone and mirtazapine in the MEDALZ-2005 cohort (28,093 persons with Alzheimer’s disease). Two experts reviewed the purchase histories and judged which methods had joined the correct purchases and gave the correct duration for each of 1000 drug exposure periods. Results The evaluated correctness of drug use periods was 70–94% for PRE2DUP and, depending on grace periods and time window lengths, 0–73% for tablet methods, 0–41% for DDD methods and 0–11% for time window methods. The highest rates of correct solutions within each method class were observed for 1 tablet per day with a 180-day grace period (TAB_1_180, 43–73%) and 1 DDD per day with a 180-day grace period (1–41%). Time window methods produced at most only 11% correct solutions. The best performing fixed method, TAB_1_180, reached its highest correctness for simvastatin, 73% (95% CI 65–81%), whereas 89% (95% CI 84–94%) of PRE2DUP periods were judged correct. Conclusions This study shows the inaccuracy of fixed methods and the urgent need for new data-driven methods. In the expert-opinion based evaluation, the lowest error rates were observed with the data-driven method PRE2DUP. PMID:28886089
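The fixed tablet-method rule evaluated above (one tablet per day, purchases joined when the gap stays within a grace period, as in TAB_1_180) can be sketched as follows; the join rule is a simplified reading of the abstract, not the PRE2DUP model.

```python
from datetime import date, timedelta

def exposure_periods(purchases, grace_days=180):
    """Join purchases into drug-use periods with a fixed grace period.

    `purchases` is a list of (purchase_date, tablets) tuples; one
    tablet per day is assumed.  A new period starts when a purchase
    falls more than `grace_days` after the projected end of supply.
    """
    periods = []
    start = end = None
    for day, tablets in sorted(purchases):
        supply_end = day + timedelta(days=int(tablets))
        if start is None:
            start, end = day, supply_end
        elif (day - end).days <= grace_days:
            end = max(end, supply_end)      # gap within grace: extend period
        else:
            periods.append((start, end))    # gap too long: close period
            start, end = day, supply_end
    if start is not None:
        periods.append((start, end))
    return periods
```

The abstract's point is that any such fixed rule misjudges many real purchase histories, which is what the individual-level modelling of PRE2DUP addresses.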
Defining window-boundaries for genomic analyses using smoothing spline techniques
Beissinger, Timothy M.; Rosa, Guilherme J.M.; Kaeppler, Shawn M.; ...
2015-04-17
High-density genomic data is often analyzed by combining information over windows of adjacent markers. Interpretation of data grouped in windows versus at individual locations may increase statistical power, simplify computation, reduce sampling noise, and reduce the total number of tests performed. However, use of adjacent marker information can result in over- or under-smoothing, undesirable window boundary specifications, or highly correlated test statistics. We introduce a method for defining windows based on statistically guided breakpoints in the data, as a foundation for the analysis of multiple adjacent data points. This method involves first fitting a cubic smoothing spline to the data and then identifying the inflection points of the fitted spline, which serve as the boundaries of adjacent windows. This technique does not require prior knowledge of linkage disequilibrium, and therefore can be applied to data collected from individual or pooled sequencing experiments. Moreover, in contrast to existing methods, an arbitrary choice of window size is not necessary, since these are determined empirically and allowed to vary along the genome.
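A sketch of this breakpoint recipe using SciPy's smoothing spline; the evaluation grid density and the smoothing factor are illustrative choices, not the paper's settings.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def window_boundaries(positions, values, smooth=None):
    """Window breakpoints at the inflection points of a cubic
    smoothing spline fitted to (position, value) data.

    Inflection points are located as sign changes of the spline's
    second derivative evaluated on a dense grid.
    """
    spl = UnivariateSpline(positions, values, k=3, s=smooth)
    d2 = spl.derivative(n=2)
    grid = np.linspace(positions[0], positions[-1], 10 * len(positions))
    sign = np.sign(d2(grid))
    flips = np.where(np.diff(sign) != 0)[0]
    return grid[flips]          # approximate breakpoint positions
```

On a sine-shaped signal the only interior inflection of one period sits at pi, which the sketch recovers; on genomic data the smoothing factor controls how many windows emerge.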
Stereo matching using census cost over cross window and segmentation-based disparity refinement
NASA Astrophysics Data System (ADS)
Li, Qingwu; Ni, Jinyan; Ma, Yunpeng; Xu, Jinxin
2018-03-01
Stereo matching is a vital requirement for many applications, such as three-dimensional (3-D) reconstruction, robot navigation, object detection, and industrial measurement. To improve the practicability of stereo matching, a method using census cost over a cross window and segmentation-based disparity refinement is proposed. First, a cross window is obtained using distance difference and intensity similarity in binocular images. Census cost over the cross window and a color cost are combined as the matching cost, which is aggregated by the guided filter. Then, a winner-takes-all strategy is used to calculate the initial disparities. Second, a graph-based segmentation method is combined with color and edge information to achieve moderate under-segmentation. The segmented regions are classified into reliable and unreliable regions by consistency checking. Finally, the two kinds of regions are optimized by plane fitting and propagation, respectively, to match the ambiguous pixels. Experimental results on the Middlebury Stereo Datasets show that the proposed method performs well in occluded and discontinuous regions and obtains smoother disparity maps with a lower average matching error rate than other algorithms.
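A fixed square-window census cost can be sketched as below; the paper's cross window adapts its support per pixel using distance and intensity, and the color-cost term, guided-filter aggregation, and disparity refinement are omitted. Image borders wrap around here for brevity.

```python
import numpy as np

def census_transform(img, half=2):
    """Census transform with a (2*half+1)^2 square window: each pixel
    becomes a bit string recording whether each neighbor is darker
    than the center (borders wrap for simplicity)."""
    bits = np.zeros(img.shape, dtype=np.uint64)
    for dy in range(-half, half + 1):
        for dx in range(-half, half + 1):
            if dy == 0 and dx == 0:
                continue
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            bits = (bits << np.uint64(1)) | (shifted < img).astype(np.uint64)
    return bits

def census_cost(left_bits, right_bits, d):
    """Matching cost at disparity d: Hamming distance between census
    strings of left pixels and right pixels shifted by d."""
    x = left_bits ^ np.roll(right_bits, d, axis=1)
    cnt = np.zeros_like(x, dtype=np.uint64)
    while np.any(x):                       # per-bit popcount
        cnt += x & np.uint64(1)
        x >>= np.uint64(1)
    return cnt
```

At the true disparity the census strings coincide and the Hamming cost drops to zero, which is the signal the winner-takes-all step exploits.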
NASA Astrophysics Data System (ADS)
Kang, Jae-sik; Oh, Eun-Joo; Bae, Min-Jung; Song, Doo-Sam
2017-12-01
Given that the Korean government is implementing what has been termed the energy standards and labelling program for windows, window companies are required to assign window ratings based on experimental results for their products. Because this adds to the cost and time required for laboratory tests, a simulation system for the thermal performance of windows has been prepared to compensate for these burdens. In Korea, the thermal performance of a window is usually calculated with the WINDOW/THERM simulator, complying with ISO 15099. For a single window, the simulation results are similar to experimental results. A double window is calculated using the same method, but the calculation results for this type of window are unreliable: ISO 15099 does not provide a recommended method for calculating the thermal properties of the air cavity between the sashes of a double window, which causes a difference between simulation and experimental results for the thermal performance of a double window. In this paper, the thermal properties of air cavities between window sashes in a double window are analyzed through computational fluid dynamics (CFD) simulations, with the results compared to calculation results certified by ISO 15099. The surface temperature of the air cavity analyzed by CFD is compared to the experimental temperatures. These results show that an appropriate calculation method for the air cavity between window sashes in a double window should be established to obtain reliable thermal performance results for a double window.
Joint histogram-based cost aggregation for stereo matching.
Min, Dongbo; Lu, Jiangbo; Do, Minh N
2013-10-01
This paper presents a novel method for performing efficient cost aggregation in stereo matching. The cost aggregation problem is reformulated from the perspective of a histogram, giving us the potential to reduce the complexity of cost aggregation in stereo matching significantly. Unlike previous methods, which have tried to reduce the complexity in terms of the size of the image and the matching window, our approach focuses on reducing the computational redundancy that exists across the search range, caused by repeated filtering for all the hypotheses. Moreover, we also reduce the complexity of the window-based filtering through an efficient sampling scheme inside the matching window. The tradeoff between accuracy and complexity is extensively investigated by varying the parameters used in the proposed method. Experimental results show that the proposed method provides high-quality disparity maps with low complexity and outperforms existing local methods. This paper also provides new insights into complexity-constrained stereo-matching algorithm design.
Self spectrum window method in Wigner-Ville distribution.
Liu, Zhongguo; Liu, Changchun; Liu, Boqiang; Lv, Yangsheng; Lei, Yinsheng; Yu, Mengsun
2005-01-01
The Wigner-Ville distribution (WVD) is an important type of time-frequency analysis in biomedical signal processing. The cross-term interference in the WVD has a disadvantageous influence on its application. In this research, the Self Spectrum Window (SSW) method was put forward to suppress the cross-term interference, based on the fact that the cross-terms and auto-WVD terms in the integral kernel function are orthogonal. In the SSW algorithm, a real auto-WVD function was used as a template to cross-correlate with the integral kernel function, and the Short Time Fourier Transform (STFT) spectrum of the signal was used as the window function to process the WVD in the time-frequency plane. The SSW method was confirmed by computer simulation with good analysis results, and a satisfactory time-frequency distribution was obtained.
Templated fabrication of hollow nanospheres with 'windows' of accurate size and tunable number.
Xie, Duan; Hou, Yidong; Su, Yarong; Gao, Fuhua; Du, Jinglei
2015-01-01
The 'windows' or 'doors' on the surface of a closed hollow structure enable the exchange of material and information between the interior and exterior of a hollow sphere or between two hollow spheres, and this exchange can be controlled by altering the windows' size. Thus, it is both interesting and important to achieve the fabrication and adjustment of such 'windows' or 'doors' on the surface of a closed hollow structure. In this paper, we propose a new method based on template-assisted deposition to fabricate hollow spheres with windows of accurate size and number. By precisely controlling the deposition parameters (i.e., deposition angle and number), hollow spheres with windows of total size from 0% to 50% and number from 1 to 6 have been successfully achieved. A geometrical model has been developed for the morphology simulation and size calculation of the windows, and the simulation results agree well with the experiments. This model will greatly improve the convenience and efficiency of the template-assisted deposition method. In addition, these hollow spheres with desired windows can be dispersed into liquid or arranged regularly on any desired substrate. These advantages will maximize their applications in many fields, such as drug transport and nano-research containers.
Acoustic window planning for ultrasound acquisition.
Göbl, Rüdiger; Virga, Salvatore; Rackerseder, Julia; Frisch, Benjamin; Navab, Nassir; Hennersperger, Christoph
2017-06-01
Autonomous robotic ultrasound has recently gained considerable interest, especially for collaborative applications. Existing methods for acquisition trajectory planning are solely based on geometrical considerations, such as the pose of the transducer with respect to the patient surface. This work aims at establishing acoustic window planning to enable autonomous ultrasound acquisitions of anatomies with restricted acoustic windows, such as the liver or the heart. We propose a fully automatic approach for the planning of acquisition trajectories, which only requires information about the target region as well as existing tomographic imaging data, such as X-ray computed tomography. The framework integrates both geometrical and physics-based constraints to estimate the best ultrasound acquisition trajectories with respect to the available acoustic windows. We evaluate the developed method using virtual planning scenarios based on real patient data as well as for real robotic ultrasound acquisitions on a tissue-mimicking phantom. The proposed method yields superior image quality in comparison with a naive planning approach, while maintaining the necessary coverage of the target. We demonstrate that, by taking image formation properties into account, acquisition planning methods can outperform naive planning. Furthermore, we show the need for such planning techniques, since naive approaches are not sufficient as they do not take the expected image quality into account.
Zhang, Jinshui; Yuan, Zhoumiqi; Shuai, Guanyuan; Pan, Yaozhong; Zhu, Xiufang
2017-04-26
This paper developed an approach, the window-based validation set for support vector data description (WVS-SVDD), to determine optimal parameters for the support vector data description (SVDD) model to map specific land cover by integrating training and window-based validation sets. Compared with the conventional approach, where the validation set includes target and outlier pixels selected visually and randomly, the validation set derived from WVS-SVDD constructs a tightened hypersphere because of the compact constraint imposed by outlier pixels neighboring the target class in the spectral feature space. The overall accuracies achieved for wheat and bare land were as high as 89.25% and 83.65%, respectively. However, the target class was underestimated because the validation set covers only a small fraction of the heterogeneous spectra of the target class. Different window sizes were then tested to acquire more wheat pixels for the validation set. The results showed that classification accuracy increased with increasing window size, and the overall accuracies were higher than 88% at all window sizes. Moreover, WVS-SVDD showed much less sensitivity to untrained classes than the multi-class support vector machine (SVM) method. Therefore, the developed method showed its merits in using the optimal parameters, tradeoff coefficient (C) and kernel width (s), in mapping homogeneous specific land cover.
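Since SVDD with an RBF kernel is equivalent to the one-class nu-SVM, scikit-learn's OneClassSVM can stand in for a sketch of the one-class classification step; the synthetic 'target' and 'outlier' spectra and the nu/gamma values are illustrative, with nu playing a role analogous to the tradeoff coefficient C and gamma to the kernel width s.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)
# Illustrative 4-band spectra: a compact target class ("wheat") and a
# spectrally neighboring outlier class used only for evaluation.
target = rng.normal(loc=0.0, scale=0.3, size=(200, 4))
outlier = rng.normal(loc=2.0, scale=0.3, size=(50, 4))

# One-class model of the target class; nu bounds the training-outlier
# fraction and gamma sets the RBF kernel width.
model = OneClassSVM(kernel="rbf", nu=0.05, gamma=0.5).fit(target)
pred_t = model.predict(target)    # +1 = inside the hypersphere
pred_o = model.predict(outlier)   # -1 = outside
```

The abstract's contribution is in how the validation pixels are chosen (via spatial windows around the target) to tune such parameters, not in the classifier itself.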
Continuation of research into software for space operations support, volume 1
NASA Technical Reports Server (NTRS)
Collier, Mark D.; Killough, Ronnie; Martin, Nancy L.
1990-01-01
A prototype workstation executive called the Hardware Independent Software Development Environment (HISDE) was developed. Software technologies relevant to workstation executives were researched and evaluated, and HISDE was used as a test bed for prototyping efforts. New X Windows software concepts and technology were introduced into workstation executives and related applications. The four research efforts performed included: (1) Research into the usability and efficiency of Motif (an X Windows-based graphical user interface), which consisted of converting the existing Athena widget-based HISDE user interface to Motif, demonstrating the usability of Motif and providing insight into the level of effort required to translate an application from one widget set to another; (2) prototyping a real-time data display widget, which consisted of researching methods for, and prototyping the selected method of, displaying textual values efficiently; (3) X Windows performance evaluation, which consisted of a series of performance measurements that demonstrated the ability of low-level X Windows to display textual information; (4) converting the Display Manager to X Windows/Motif, the application used by NASA for data display during operational mode.
Multi-window detection for P-wave in electrocardiograms based on bilateral accumulative area.
Chen, Riqing; Huang, Yingsong; Wu, Jian
2016-11-01
P-wave detection is one of the most challenging aspects in electrocardiograms (ECGs) due to its low amplitude, low frequency, and variable waveforms. This work introduces a novel multi-window detection method for P-wave delineation based on the bilateral accumulative area. The bilateral accumulative area is calculated by summing the areas covered by the P-wave curve with left and right sliding windows. The onset and offset of a positive P-wave correspond to the local maxima of the area detector. The position drift and difference in area variation of local extreme points with different windows are used to systematically combine multi-window and 12-lead synchronous detection methods, which are used to screen the optimization boundary points from all extreme points of different window widths and adaptively match the P-wave location. The proposed method was validated with ECG signals from various databases, including the Standard CSE Database, T-Wave Alternans Challenge Database, PTB Diagnostic ECG Database, and the St. Petersburg Institute of Cardiological Technics 12-Lead Arrhythmia Database. The average sensitivity Se was 99.44% with a positive predictivity P+ of 99.37% for P-wave detection. Standard deviations of 3.7 and 4.3 ms were achieved for the onset and offset of P-waves, respectively, which is in agreement with the accepted tolerances required by the CSE committee. Compared with well-known delineation methods, this method can achieve high sensitivity and positive predictability using a simple calculation process. The experiment results suggest that the bilateral accumulative area could be an effective detection tool for ECG signal analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.
Window-Based Channel Impulse Response Prediction for Time-Varying Ultra-Wideband Channels.
Al-Samman, A M; Azmi, M H; Rahman, T A; Khan, I; Hindia, M N; Fattouh, A
2016-01-01
This work proposes channel impulse response (CIR) prediction for time-varying ultra-wideband (UWB) channels by exploiting the fast movement of channel taps within delay bins. Considering the sparsity of UWB channels, we introduce a window-based CIR (WB-CIR) to approximate the high temporal resolutions of UWB channels. A recursive least square (RLS) algorithm is adopted to predict the time evolution of the WB-CIR. For predicting the future WB-CIR tap of window wk, three RLS filter coefficients are computed from the observed WB-CIRs of the left wk-1, the current wk and the right wk+1 windows. The filter coefficient with the lowest RLS error is used to predict the future WB-CIR tap. To evaluate our proposed prediction method, UWB CIRs are collected through measurement campaigns in outdoor environments considering line-of-sight (LOS) and non-line-of-sight (NLOS) scenarios. Under similar computational complexity, our proposed method provides an improvement in prediction errors of approximately 80% for LOS and 63% for NLOS scenarios compared with a conventional method.
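The building block of this predictor, an exponentially weighted RLS one-step filter applied per window tap, can be sketched as below. The filter order, forgetting factor, and the noise-free AR(1) demo sequence are illustrative; the paper's three-window selection reduces to running three such filters (on windows w_{k-1}, w_k, w_{k+1}) and keeping the one with the lowest error.

```python
import numpy as np

class RLSPredictor:
    """One-step RLS predictor of a scalar sequence from its last
    `order` samples (standard exponentially weighted recursion)."""

    def __init__(self, order=3, lam=0.98, delta=100.0):
        self.w = np.zeros(order)          # filter coefficients
        self.P = np.eye(order) * delta    # inverse correlation estimate
        self.lam = lam                    # forgetting factor
        self.order = order

    def update(self, history, target):
        u = np.asarray(history[-self.order:])[::-1]   # newest first
        err = target - self.w @ u                     # a priori error
        k = self.P @ u / (self.lam + u @ self.P @ u)  # gain vector
        self.w = self.w + k * err
        self.P = (self.P - np.outer(k, u @ self.P)) / self.lam
        return err

    def predict(self, history):
        return self.w @ np.asarray(history[-self.order:])[::-1]
```

On a deterministic geometric sequence the filter locks on after a few updates and the one-step prediction error becomes negligible.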
Time-marching multi-grid seismic tomography
NASA Astrophysics Data System (ADS)
Tong, P.; Yang, D.; Liu, Q.
2016-12-01
From the classic ray-based traveltime tomography to the state-of-the-art full waveform inversion, because of the nonlinearity of seismic inverse problems, a good starting model is essential for preventing the convergence of the objective function toward local minima. With a focus on building high-accuracy starting models, we propose the so-called time-marching multi-grid seismic tomography method in this study. The new seismic tomography scheme consists of a temporal time-marching approach and a spatial multi-grid strategy. We first divide the recording period of seismic data into a series of time windows. Sequentially, the subsurface properties in each time window are iteratively updated starting from the final model of the previous time window. There are at least two advantages of the time-marching approach: (1) the information included in the seismic data of previous time windows has been explored to build the starting models of later time windows; (2) seismic data of later time windows could provide extra information to refine the subsurface images. Within each time window, we use a multi-grid method to decompose the scale of the inverse problem. Specifically, the unknowns of the inverse problem are sampled on a coarse mesh to capture the macro-scale structure of the subsurface at the beginning. Because of the low dimensionality, it is much easier to reach the global minimum on a coarse mesh. After that, finer meshes are introduced to recover the micro-scale properties. That is to say, the subsurface model is iteratively updated on multi-grid in every time window. We expect that high-accuracy starting models should be generated for the second and later time windows. We will test this time-marching multi-grid method by using our newly developed eikonal-based traveltime tomography software package tomoQuake. Real application results in the 2016 Kumamoto earthquake (Mw 7.0) region in Japan will be demonstrated.
NASA Astrophysics Data System (ADS)
Jiang, Wei; Zhou, Jianzhong; Zheng, Yang; Liu, Han
2017-11-01
Accurate degradation tendency measurement is vital for the secure operation of mechanical equipment. However, existing techniques and methodologies for degradation measurement still face challenges, such as the lack of an appropriate degradation indicator, insufficient accuracy, and poor capability to track data fluctuations. To solve these problems, a hybrid degradation tendency measurement method for mechanical equipment based on a moving window and the Grey-Markov model is proposed in this paper. In the proposed method, a 1D normalized degradation index based on multi-feature fusion is designed to assess the extent of degradation. Subsequently, the moving window algorithm is integrated with the Grey-Markov model for dynamic updating of the model. Two key parameters, namely the step size and the number of states, contribute to the adaptive modeling and multi-step prediction. Finally, three types of combination prediction models are established to measure the degradation trend of equipment. The effectiveness of the proposed method is validated in a case study on the health monitoring of turbine engines. Experimental results show that the proposed method outperforms other conventional methods in terms of both measurement accuracy and tracking of data fluctuations.
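A minimal sketch of the moving-window grey-model idea: a GM(1,1) model is refit on the latest window before every prediction, so the model tracks the most recent data. The window length and the test series are illustrative assumptions, and the Markov state-correction step is omitted:

```python
import numpy as np

def gm11_forecast(x, steps=1):
    """Fit one GM(1,1) grey model on window x and forecast `steps` ahead."""
    x = np.asarray(x, dtype=float)
    x1 = np.cumsum(x)                        # accumulated generating operation
    z = 0.5 * (x1[1:] + x1[:-1])             # background values
    B = np.column_stack([-z, np.ones(len(z))])
    a, b = np.linalg.lstsq(B, x[1:], rcond=None)[0]
    k = np.arange(len(x), len(x) + steps)
    x1_hat = (x[0] - b / a) * np.exp(-a * k) + b / a
    x1_prev = (x[0] - b / a) * np.exp(-a * (k - 1)) + b / a
    return x1_hat - x1_prev                  # inverse AGO -> forecast values

def moving_window_forecast(series, window=8):
    """Refit GM(1,1) on the latest `window` points before each prediction."""
    preds = []
    for t in range(window, len(series)):
        preds.append(gm11_forecast(series[t - window:t])[0])
    return np.array(preds)

# exponential-like degradation index: GM(1,1) should track it closely
series = 0.1 * np.exp(0.05 * np.arange(40))
preds = moving_window_forecast(series)
```

Because GM(1,1) assumes near-exponential growth, the moving window keeps the local exponential approximation valid even when the global trend deviates from it.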
Active noise attenuation in ventilation windows.
Huang, Huahua; Qiu, Xiaojun; Kang, Jian
2011-07-01
The feasibility of applying active noise control techniques to attenuate low frequency noise transmission through a natural ventilation window into a room is investigated analytically and experimentally. The window system is constructed by staggering the opening sashes of a spaced double-glazed window to allow ventilation and natural light. An analytical model based on the modal expansion method is developed to calculate the low frequency sound field inside the window and the room, and is used in the active noise control simulations. The analytical model is validated against the finite element method. The performance of the active control system is compared for different source and receiver configurations; the numerical and experimental results are in good agreement, and the best result is achieved when the secondary sources are placed at the center of the bottom of the staggered window. The extra attenuation at the observation points in the optimized window system is almost equivalent to the noise reduction at the error sensor, and the frequency range of effective control extends up to 390 Hz for a single-channel active noise control system. © 2011 Acoustical Society of America
NASA Astrophysics Data System (ADS)
Işık, Şahin; Özkan, Kemal; Günal, Serkan; Gerek, Ömer Nezih
2018-03-01
Change detection by background subtraction remains an unresolved issue and attracts research interest due to the challenges encountered in static and dynamic scenes. The key challenge is how to update dynamically changing backgrounds from frames with an adaptive and self-regulated feedback mechanism. To achieve this, we present an effective change detection algorithm for pixelwise changes. A sliding window approach combined with dynamic control of the update parameters is introduced for updating background frames, which we call sliding-window-based change detection. Comprehensive experiments on related test videos show that the integrated algorithm yields good objective and subjective performance by overcoming illumination variations, camera jitter, and intermittent object motion. We argue that the obtained method is a fair alternative in most types of foreground extraction scenarios, unlike case-specific methods, which normally fail in scenarios they were not designed for.
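The self-regulated feedback idea — detected foreground slows its own absorption into the background — can be sketched with a running-average model. The learning rates and threshold are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def update_background(bg, frame, mask, alpha_bg=0.05, alpha_fg=0.005):
    """Per-pixel adaptive update: background pixels blend in quickly,
    detected-foreground pixels blend in slowly (self-regulated feedback)."""
    alpha = np.where(mask, alpha_fg, alpha_bg)
    return (1 - alpha) * bg + alpha * frame

def detect_changes(frames, thresh=25.0):
    """Pixelwise change detection with a running background model."""
    frames = np.asarray(frames, dtype=float)
    bg = frames[0].copy()
    masks = []
    for frame in frames[1:]:
        mask = np.abs(frame - bg) > thresh   # changed pixels
        bg = update_background(bg, frame, mask)
        masks.append(mask)
    return masks

# static 8x8 scene; an object appears in the last frames
frames = [np.full((8, 8), 100.0) for _ in range(10)]
for f in frames[6:]:
    f[2:4, 2:4] = 200.0
masks = detect_changes(frames)
```

The slow foreground rate keeps a stopped object from being absorbed into the background immediately, which is the intermittent-object-motion case the abstract mentions.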
Asgari, Afrouz; Ashoor, Mansour; Sohrabpour, Mostafa; Shokrani, Parvaneh; Rezaei, Ali
2015-05-01
Improving the signal-to-noise ratio (SNR) and image quality by various methods is very important for detecting abnormalities in body organs. Scatter and attenuation of photons by the organs lead to errors in radiopharmaceutical estimation as well as degradation of images. The choice of a suitable energy window and radionuclide plays a key role in nuclear medicine: the goal is the lowest scatter fraction together with a nearly constant linear attenuation coefficient as a function of phantom thickness. The energy windows of the symmetrical window (SW), asymmetric window (ASW), high window (WH), and low window (WL), using the Tc-99m and Sm-153 radionuclides with a solid water slab phantom (RW3) and Teflon bone phantoms, were compared; Matlab software and the Monte Carlo N-Particle (MCNP4C) code were used to simulate these methods and to obtain the FWHM and full width at tenth maximum (FWTM) values from line spread functions (LSFs). The experimental data were obtained from an Orbiter Scintron gamma camera. Based on the results of both the simulation and the experimental work, the WH and ASW windows showed the lowest scatter fraction as well as a nearly constant linear attenuation coefficient as a function of phantom thickness. WH and ASW were the optimal windows in nuclear medicine imaging for Tc-99m in the RW3 phantom and Sm-153 in the Teflon bone phantom. Attenuation correction was performed for the WH and ASW optimal windows and for these radionuclides using a filtered back projection algorithm. The simulation results agreed very well with the experimental data and with theoretical values; the discrepancy was nominally less than 7.07% for Tc-99m and less than 8.00% for Sm-153. Corrected counts were not affected by the thickness of the scattering material.
The simulated line spread function (LSF) results for Sm-153 and Tc-99m in the phantoms, based on the four windows and the triple energy window (TEW) method, indicated that the FWHM and FWTM values were approximately the same for the TEW method as for WH and ASW, but the sensitivity at the optimal windows was higher. Suitable determination of the energy window width on the energy spectrum can be useful in optimal design to improve efficiency and contrast. It was found that WH is preferable to ASW, and ASW is preferable to SW.
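For context, the TEW method compared above estimates the scatter counts under the photopeak by trapezoidal interpolation from two narrow flanking sub-windows. A minimal sketch (the window widths and counts are illustrative, not the study's acquisition settings):

```python
def tew_scatter_correction(c_main, c_left, c_right, w_main, w_sub):
    """Triple energy window (TEW) estimate: scatter under the photopeak is
    approximated by a trapezoid spanned by two narrow flanking sub-windows."""
    scatter = (c_left / w_sub + c_right / w_sub) * w_main / 2.0
    return c_main - scatter

# illustrative Tc-99m photopeak window (28 keV wide) with 2 keV sub-windows
primary = tew_scatter_correction(c_main=10000, c_left=300, c_right=100,
                                 w_main=28.0, w_sub=2.0)
```

Here the estimated scatter is (150 + 50) x 14 = 2800 counts, leaving 7200 primary counts in the photopeak window.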
Galias, Zbigniew
2017-05-01
An efficient method to find positions of periodic windows for the quadratic map f(x)=ax(1-x) and a heuristic algorithm to locate the majority of wide periodic windows are proposed. Accurate rigorous bounds of positions of all periodic windows with periods below 37 and the majority of wide periodic windows with longer periods are found. Based on these results, we prove that the measure of the set of regular parameters in the interval [3,4] is above 0.613960137. The properties of periodic windows are studied numerically. The results of the analysis are used to estimate that the true value of the measure of the set of regular parameters is close to 0.6139603.
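A non-rigorous numerical sketch of how a periodic window reveals itself: iterate the map past its transient and look for the smallest period within a tolerance. The parameter values below are classical test points (the fixed-point regime, the stable 2-cycle, and the period-3 window near a ≈ 3.83), not the paper's rigorous interval-arithmetic bounds:

```python
def detect_period(a, x0=0.5, transient=10000, max_period=32, tol=1e-6):
    """Iterate f(x) = a*x*(1-x) past the transient, then return the smallest
    p with |f^p(x) - x| < tol (0 if no period <= max_period is found)."""
    x = x0
    for _ in range(transient):
        x = a * x * (1 - x)
    orbit = [x]
    for _ in range(max_period):
        x = a * x * (1 - x)
        orbit.append(x)
    for p in range(1, max_period + 1):
        if abs(orbit[p] - orbit[0]) < tol:
            return p
    return 0

period = detect_period(3.8319)   # inside the well-known period-3 window
```

Unlike this floating-point check, the paper's results are rigorous: interval bounds certify the existence and extent of each window.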
Sunlight Responsive Thermochromic Window System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Millett, F. A.; Byker, H. J.
2006-10-27
Pleotint has embarked on a novel approach with our Sunlight Responsive Thermochromic, SRT™, windows. We are integrating dynamic sunlight control, high insulation values and low solar heat gain together in a high performance window. The Pleotint SRT window is dynamic because it reversibly changes light transmission based on thermochromics activated directly by the heating effect of sunlight. We can achieve a window package with a low solar heat gain coefficient (SHGC), a low U value and high insulation. At the same time our windows provide good daylighting. Our innovative window design offers architects and building designers the opportunity to choose their desired combination of energy performance, excellent sound reduction, a self-cleaning external pane, or resistance to wind load, blasts, bullets or hurricanes. SRT windows would provide energy savings estimated at up to 30% over traditional window systems. Glass fabricators will be able to use existing equipment to make the SRT window while adding value and flexibility to the basic design. Glazing installers will be able to fit the windows with traditional methods without wires, power supplies and controllers. SRT windows can be retrofit into existing buildings.
Shot boundary detection and label propagation for spatio-temporal video segmentation
NASA Astrophysics Data System (ADS)
Piramanayagam, Sankaranaryanan; Saber, Eli; Cahill, Nathan D.; Messinger, David
2015-02-01
This paper proposes a two-stage algorithm for streaming video segmentation. In the first stage, shot boundaries are detected within a window of frames by comparing the dissimilarity between 2-D segmentations of each frame. In the second stage, the 2-D segments are propagated across the window of frames in both the spatial and temporal directions. The window is moved across the video to find all shot transitions and obtain spatio-temporal segments simultaneously. As opposed to techniques that operate on the entire video, the proposed approach consumes significantly less memory and enables segmentation of lengthy videos. We tested our segmentation-based shot detection method on the TRECVID 2007 video dataset and compared it with a block-based technique. Cut detection results on the TRECVID 2007 dataset indicate that our algorithm is comparable to the best of the block-based methods. The streaming video segmentation routine also achieves promising results on a challenging video segmentation benchmark database.
NASA Astrophysics Data System (ADS)
Jian, Wang; Xiaohong, Meng; Hong, Liu; Wanqiu, Zheng; Yaning, Liu; Sheng, Gui; Zhiyang, Wang
2017-03-01
Full waveform inversion and reverse time migration are active research areas in seismic exploration. Forward modeling in the time domain determines the precision of the results, and finite difference numerical solutions have been widely adopted as an important mathematical tool for forward modeling. In this article, an optimal combination of window functions was designed for the finite difference operator, which is obtained by truncating the spatial convolution series in pseudo-spectral space, in order to unify the outcomes of existing window functions across different orders. The proposed combined window functions not only inherit the characteristics of the individual window functions and provide better truncation results, but also allow the truncation error of the finite difference operator to be controlled manually and visually by adjusting the combinations and analyzing the characteristics of the main and side lobes of the amplitude response. Error levels and elastic forward modeling under the proposed combined windows were compared with outcomes from conventional window functions and modified binomial windows. Numerical dispersion is significantly suppressed compared with both the modified binomial window and conventional finite differences. Numerical simulation verifies the reliability of the proposed method.
Peng, Sijia; Wang, Wenjuan; Chen, Chunlai
2018-05-10
Fluorescence correlation spectroscopy is a powerful single-molecule tool that is able to capture kinetic processes occurring at the nanosecond time scale. However, the upper limit of its time window is restricted by the dwell time of the molecule of interest in the confocal detection volume, which is usually around submilliseconds for a freely diffusing biomolecule. Here, we present a simple and easy-to-implement method, named surface transient binding-based fluorescence correlation spectroscopy (STB-FCS), which extends the upper limit of the time window to seconds. We further demonstrated that STB-FCS enables capture of both intramolecular and intermolecular kinetic processes whose time scales cross several orders of magnitude.
Yang, Chan; Xu, Bing; Zhang, Zhi-Qiang; Wang, Xin; Shi, Xin-Yuan; Fu, Jing; Qiao, Yan-Jiang
2016-10-01
Blending uniformity is essential to ensure the homogeneity of Chinese medicine formula particles within each batch. This study examined the blending process of ebony spray-dried powder and dextrin (the proportion of dextrin was 10%), in which near infrared (NIR) diffuse reflectance spectra were collected from six different sampling points and analyzed with a moving window F test method to assess the blending uniformity. The method was validated against the changes of citric acid content determined by HPLC. The results of the moving window F test showed that the ebony spray-dried powder and dextrin were homogeneous during 200-300 r and segregated during 300-400 r. An advantage of this method is that the threshold value is defined statistically, not empirically, and thus it does not suffer from the threshold ambiguities common to the moving block standard deviation (MBSD) approach. This method could also be employed to monitor other blending processes of Chinese medicine powders on line. Copyright© by the Chinese Pharmaceutical Association.
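The statistically defined threshold mentioned above can be sketched with a generic moving-window F test on a 1-D signal: each window's variance is compared with the preceding window's, and the critical value comes from the F distribution rather than an empirical cut-off. This is not the authors' spectral pipeline; the window length, significance level, and synthetic data are assumptions:

```python
import numpy as np
from scipy.stats import f as f_dist

def moving_window_f_test(signal, window=10, alpha=0.05):
    """Slide a window along the signal; flag windows whose variance is
    significantly larger than the preceding window's (one-sided F test)."""
    flags = []
    for start in range(window, len(signal) - window + 1, window):
        v_ref = np.var(signal[start - window:start], ddof=1)
        v_cur = np.var(signal[start:start + window], ddof=1)
        f_stat = v_cur / v_ref
        f_crit = f_dist.ppf(1 - alpha, window - 1, window - 1)
        flags.append(f_stat > f_crit)
    return flags

rng = np.random.default_rng(1)
homog = rng.normal(1.0, 0.01, 60)          # well blended: small variance
segreg = rng.normal(1.0, 0.2, 20)          # segregating: variance jumps
flags = moving_window_f_test(np.concatenate([homog, segreg]), window=10)
```

The variance jump at the onset of segregation produces an F statistic far above the critical value, flagging the transition without any hand-tuned threshold.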
Subsurface event detection and classification using Wireless Signal Networks.
Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T
2012-11-05
Subsurface environment sensing and monitoring applications, such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how they affect radio propagation in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as the main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier has two steps: event detection and event classification. After event detection, the classifier assigns geo-events within the event-occurring regions, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment measured in the laboratory. In these experiments, the proposed detection and classification method based on wireless signal networks can detect and classify subsurface events.
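The classification step can be sketched as a plain minimum-distance classifier over a detected window of received signal strength (RSS) values. The class signatures below are hypothetical illustrations, not measurements from the paper:

```python
import numpy as np

def min_distance_classify(window_rss, class_means):
    """Assign the detected event window to the geo-event class whose mean
    RSS signature is nearest in Euclidean distance."""
    dists = {label: np.linalg.norm(np.asarray(window_rss) - np.asarray(mu))
             for label, mu in class_means.items()}
    return min(dists, key=dists.get)

# hypothetical mean RSS signatures (dBm) per geo-event over 4 sensor links
class_means = {
    "water_intrusion": [-62.0, -60.5, -63.0, -61.0],
    "density_change":  [-55.0, -54.0, -56.0, -55.5],
    "no_event":        [-50.0, -50.5, -49.5, -50.0],
}
label = min_distance_classify([-61.0, -60.0, -62.5, -61.5], class_means)
```

Under equal priors and equal covariances, this minimum-distance rule coincides with the Bayes-optimal decision, which is the connection to Bayesian decision theory the abstract draws.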
Ma, Hsiang-Yang; Lin, Ying-Hsiu; Wang, Chiao-Yin; Chen, Chiung-Nien; Ho, Ming-Chih; Tsui, Po-Hsiang
2016-08-01
Ultrasound Nakagami imaging is an attractive method for visualizing changes in envelope statistics. Window-modulated compounding (WMC) Nakagami imaging was reported to improve image smoothness. The sliding window technique is typically used for constructing ultrasound parametric and Nakagami images. Using a large window overlap ratio may improve the WMC Nakagami image resolution but reduces computational efficiency. Therefore, the objectives of this study were: (i) to explore the effects of the window overlap ratio on the resolution and smoothness of WMC Nakagami images; (ii) to propose a fast algorithm based on the convolution operator (FACO) to accelerate WMC Nakagami imaging. Computer simulations and preliminary clinical tests on liver fibrosis samples (n=48) were performed to validate FACO-based WMC Nakagami imaging. The results demonstrated that the width of the autocorrelation function and the parameter distribution of the WMC Nakagami image narrow as the window overlap ratio increases. One-pixel shifting (i.e., sliding the window over the image data in steps of one pixel for parametric imaging), the maximum overlap ratio, significantly improves the WMC Nakagami image quality. Concurrently, the proposed FACO method combined with a computational platform that optimizes matrix computation can accelerate WMC Nakagami imaging, allowing the detection of liver fibrosis-induced changes in envelope statistics. FACO-accelerated WMC Nakagami imaging is a new-generation Nakagami imaging technique with improved image quality and fast computation. Copyright © 2016 Elsevier B.V. All rights reserved.
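The convolution trick behind FACO can be sketched in 1-D: the sliding-window moments needed for the Nakagami moment-matching estimator, m = E[r²]² / Var(r²), are computed for every window position at once by convolving with a uniform kernel instead of looping over windows. The window length and the Rayleigh test data are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def nakagami_m_map(envelope, window=500):
    """Sliding-window Nakagami m estimate via moment matching, with the
    local moments of r^2 computed by convolution (FACO-style)."""
    r2 = np.asarray(envelope, dtype=float) ** 2
    kernel = np.ones(window) / window
    m1 = np.convolve(r2, kernel, mode="valid")       # local E[r^2]
    m2 = np.convolve(r2 ** 2, kernel, mode="valid")  # local E[r^4]
    var = m2 - m1 ** 2
    return m1 ** 2 / var                             # one m per window shift

rng = np.random.default_rng(2)
# a Rayleigh envelope (fully developed speckle) corresponds to m = 1
env = rng.rayleigh(scale=1.0, size=20000)
m_map = nakagami_m_map(env)
```

One-pixel shifting comes for free here: the convolution already evaluates every overlap position, which is why the maximum overlap ratio need not cost more than coarse stepping.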
Lian, Yanyun; Song, Zhijian
2014-01-01
Brain tumor segmentation from magnetic resonance imaging (MRI) is an important step toward surgical planning, treatment planning, and monitoring of therapy. However, the manual tumor segmentation commonly used in the clinic is time-consuming and challenging, and none of the existing automated methods are sufficiently robust, reliable and efficient for clinical application. An accurate and automated tumor segmentation method has been developed that provides reproducible and objective results close to manual segmentation. Based on the symmetry of the human brain, we employed a sliding-window technique and the correlation coefficient to locate the tumor position. First, the image to be segmented was normalized, rotated, denoised, and bisected. Then two windows, one in the left and one in the right half of the brain image, were slid simultaneously, pixel by pixel, first vertically and then horizontally, while the correlation coefficient between the two windows was calculated; the pair of windows with the minimal correlation coefficient was selected, the window with the larger average gray value was taken as the tumor location, and the pixel with the largest gray value as the tumor locating point. Finally, the segmentation threshold was determined from the average gray value of the pixels in a square of side length 10 pixels centered at the locating point, and threshold segmentation and morphological operations were used to obtain the final tumor region. The method was evaluated on 3D FSPGR brain MR images of 10 patients. The average ratio of correct location was 93.4% for 575 slices containing tumor, the average Dice similarity coefficient was 0.77 per scan, and the average time per scan was 40 seconds. A fully automated, simple and efficient segmentation method for brain tumors is proposed and promising for future clinical use. The correlation coefficient proves a new and effective feature for tumor location.
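The symmetry-based localization step can be sketched as follows: mirror the right half onto the left, slide paired windows, and take the position where the correlation between the mirrored halves collapses. The window size and the synthetic "brain" image are illustrative assumptions, and the preprocessing, gray-value tie-break, and thresholding stages are omitted:

```python
import numpy as np

def locate_asymmetry(img, win=5):
    """Slide mirrored windows over the left half and the flipped right half;
    the position with minimal correlation marks the asymmetric region."""
    h, w = img.shape
    left, right = img[:, :w // 2], np.fliplr(img[:, w // 2:])
    best, pos = 2.0, None
    for y in range(0, h - win):
        for x in range(0, w // 2 - win):
            a = left[y:y + win, x:x + win].ravel()
            b = right[y:y + win, x:x + win].ravel()
            if a.std() == 0 or b.std() == 0:
                continue
            r = np.corrcoef(a, b)[0, 1]
            if r < best:
                best, pos = r, (y, x)
    return pos

# perfectly symmetric noise image; a bright blob on the left breaks symmetry
rng = np.random.default_rng(3)
base = rng.normal(100, 5, (32, 16))
img = np.hstack([base, np.fliplr(base)])   # symmetric "brain" proxy
img[10:14, 3:7] += 80                      # "tumor" on the left side only
pos = locate_asymmetry(img)
```

Away from the blob the mirrored halves are identical (correlation 1), so the minimum can only occur at windows overlapping the asymmetric region.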
Sound transmission loss of windows on high speed trains
NASA Astrophysics Data System (ADS)
Zhang, Yumei; Xiao, Xinbiao; Thompson, David; Squicciarini, Giacomo; Wen, Zefeng; Li, Zhihui; Wu, Yue
2016-09-01
The window is one of the main components of the high speed train car body structure through which noise can be transmitted. To study the windows’ acoustic properties, the vibration of one window of a high speed train has been measured for a running speed of 250 km/h. The corresponding interior noise and the noise in the wheel-rail area have been measured simultaneously. The experimental results show that the window vibration velocity has a similar spectral shape to the interior noise. Interior noise source identification further indicates that the window makes a contribution to the interior noise. Improvement of the window's Sound Transmission Loss (STL) can reduce the interior noise from this transmission path. An STL model of the window is built based on wave propagation and modal superposition methods. From the theoretical results, the window's STL property is studied and several factors affecting it are investigated, which provide indications for future low noise design of high speed train windows.
Multi-focus image fusion based on window empirical mode decomposition
NASA Astrophysics Data System (ADS)
Qin, Xinqiang; Zheng, Jiaoyue; Hu, Gang; Wang, Jiao
2017-09-01
In order to improve multi-focus image fusion quality, a novel fusion algorithm based on window empirical mode decomposition (WEMD) is proposed. WEMD is an improved form of bidimensional empirical mode decomposition (BEMD): its decomposition process uses an added-window principle, which effectively resolves the signal concealment problem. We used WEMD for multi-focus image fusion and formulated different fusion rules for the bidimensional intrinsic mode function (BIMF) components and the residue component. For fusion of the BIMF components, the concept of the sum-modified-Laplacian was used and a scheme based on visual feature contrast was adopted; for the residue coefficients, a pixel value based on local visibility was selected. We carried out four groups of multi-focus image fusion experiments and compared objective evaluation criteria with three other fusion methods. The experimental results show that the proposed fusion approach is effective and performs better at fusing multi-focus images than some traditional methods.
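The sum-modified-Laplacian (SML) focus measure used in the BIMF fusion rule can be sketched directly; a fusion rule then keeps, per region, the coefficients from whichever source image scores higher. This is a generic SML sketch (the step size and test images are assumptions), not the paper's full rule:

```python
import numpy as np

def sum_modified_laplacian(img, step=1):
    """Modified-Laplacian focus response per pixel:
    |2I(x,y) - I(x-s,y) - I(x+s,y)| + |2I(x,y) - I(x,y-s) - I(x,y+s)|."""
    img = np.asarray(img, dtype=float)
    ml = np.zeros_like(img)
    ml[step:-step, :] += np.abs(2 * img[step:-step, :]
                                - img[:-2 * step, :] - img[2 * step:, :])
    ml[:, step:-step] += np.abs(2 * img[:, step:-step]
                                - img[:, :-2 * step] - img[:, 2 * step:])
    return ml

# a sharp edge yields a strong SML response; a smooth ramp yields none
sharp = np.zeros((9, 9)); sharp[:, 4:] = 100.0
blurred = np.repeat(np.linspace(0, 100, 9)[None, :], 9, axis=0)
```

Because SML responds to local second derivatives, in-focus (sharp) regions dominate the measure, which is what lets the fusion rule pick the focused source at each location.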
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polewko-Klim, A., E-mail: anetapol@uwb.edu.pl; Uba, S.; Uba, L.
2014-07-15
A solution to the problem of the disturbing effect of the background Faraday rotation in the cryostat windows on the longitudinal magneto-optical Kerr effect (LMOKE) measured under vacuum conditions and/or at low temperatures is proposed. The method for eliminating the influence of Faraday rotation in cryostat windows is based on a special arrangement of additional mirrors placed on the sample holder. In this arrangement, the orientation of the cryostat window is perpendicular to the light beam direction and parallel to the external magnetic field generated by the H-frame electromagnet. The operation of the LMOKE magnetometer with the special sample holder, based on the polarization modulation technique with a photo-elastic modulator, is theoretically analyzed with the use of Jones matrices, and formulas for evaluating the actual Kerr rotation and ellipticity of the sample are derived. The feasibility of the method and the good performance of the magnetometer are experimentally demonstrated for the LMOKE effect measured in Fe/Au multilayer structures. The influence of imperfect alignment of the magnetometer setup on the Kerr angles, derived theoretically through the analytic model and verified experimentally, is examined and discussed.
Salient object detection method based on multiple semantic features
NASA Astrophysics Data System (ADS)
Wang, Chunyang; Yu, Chunyan; Song, Meiping; Wang, Yulei
2018-04-01
Existing salient object detection models can only detect the approximate location of a salient object, or they mistakenly highlight the background. To resolve this problem, a salient object detection method based on image semantic features is proposed. First, three novel salient features are presented in this paper: an object edge density feature (EF), an object semantic feature based on the convex hull (CF), and an object lightness contrast feature (LF). Second, the multiple salient features were trained with random detection windows. Third, a naive Bayesian model was used to combine these features for salient object detection. Results on public datasets showed that our method performs well: the location of the salient object can be determined, and the salient object can be accurately detected and marked by a specific window.
NASA Astrophysics Data System (ADS)
Ji, Yanju; Li, Dongsheng; Yu, Mingmei; Wang, Yuan; Wu, Qiong; Lin, Jun
2016-05-01
The ground electrical source airborne transient electromagnetic system (GREATEM) on an unmanned aircraft offers considerable prospecting depth, lateral resolution and detection efficiency, and in recent years it has become an important technique for rapid resource exploration. However, GREATEM data are extremely vulnerable to stationary white noise and non-stationary electromagnetic noise (sferics noise, aircraft engine noise and other man-made electromagnetic noise). These noises degrade the imaging quality for data interpretation. Based on the characteristics of GREATEM data and the major noise types, we propose a de-noising algorithm utilizing the wavelet threshold method and exponential adaptive window-width fitting. First, the white noise in the measured data is filtered using the wavelet threshold method. Then, the data are segmented using windows whose step lengths follow even logarithmic intervals. Within each window, data polluted by electromagnetic noise are identified based on the discriminating principle of energy detection, and the attenuation characteristics of the data slope are extracted. Finally, an exponential fitting algorithm fits the attenuation curve of each window, and the data polluted by non-stationary electromagnetic noise are replaced with their fitted values, effectively removing the non-stationary electromagnetic noise. The proposed algorithm is verified on synthetic and real GREATEM signals. The results show that the wavelet threshold-exponential adaptive window-width fitting algorithm effectively filters both stationary white noise and non-stationary electromagnetic noise in GREATEM signals, which enhances the imaging quality.
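The per-window exponential fitting step might look like the following sketch: fit A·exp(-b·t) by linear least squares on log-amplitudes and use the fitted curve to replace noise-polluted samples. The decay rate, time axis, and disturbance model are illustrative assumptions, and the wavelet and energy-detection stages are omitted:

```python
import numpy as np

def fit_exponential_window(t, y):
    """Fit y ~ A * exp(-b t) by linear least squares on log(y); return the
    fitted curve (to replace polluted samples) and the decay rate b."""
    coeffs = np.polyfit(t, np.log(y), 1)      # log y = -b t + log A
    b, log_a = -coeffs[0], coeffs[1]
    return np.exp(log_a) * np.exp(-b * t), b

# clean transient decay polluted by a localized sferics-like burst
t = np.linspace(0.01, 1.0, 50)
clean = 5.0 * np.exp(-3.0 * t)
noisy = clean.copy()
noisy[20:25] *= 1.5                            # localized EM disturbance
fitted, b = fit_exponential_window(t, noisy)
```

Because the fit pools the whole window, a short burst barely perturbs the recovered decay rate, so replacing the burst samples with the fitted values removes the disturbance while preserving the transient's shape.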
Method of high speed flow field influence and restrain on laser communication
NASA Astrophysics Data System (ADS)
Meng, Li-xin; Wang, Chun-hui; Qian, Cun-zhu; Wang, Shuo; Zhang, Li-zhong
2013-08-01
For laser communication terminals carried by an airplane or airship, high-speed platform movement means the airflow affects both the platform and the laser communication terminal window. The first influence is that aerodynamic effects cause deformation of the optical window; the second is that a shock wave and a boundary layer are generated. For subsonic aircraft, the boundary layer is the main influence. The presence of a boundary layer changes the air density and the temperature of the optical window, which causes light deflection and received beam spot flicker. Ultimately, the energy fluctuation of the beam spot reaching the receiving side increases, so the error rate increases. In this paper, aerodynamic theory is used to analyze the influence of optical window deformation due to high-speed airflow, and aero-optics theory is used to analyze the influence of the boundary layer on the laser communication link. On this basis, we focused on methods to suppress aerodynamic and aero-optical effects from the perspective of optical window design. Based on the planned experimental aircraft type and equipment installation location, we optimized the design parameters of the shape and thickness of the optical window and the shape and size of the air-management kit. Finally, deformation of the optical window and the air flow distribution were simulated with fluid simulation software for different Mach numbers and flight altitudes. The simulation results showed that the optimized optical window can suppress the aerodynamic influence; in addition, the boundary layer is smoothed and the turbulence influence is reduced, which meets the requirements of airborne laser communication.
Novel Hyperspectral Anomaly Detection Methods Based on Unsupervised Nearest Regularized Subspace
NASA Astrophysics Data System (ADS)
Hou, Z.; Chen, Y.; Tan, K.; Du, P.
2018-04-01
Anomaly detection has been of great interest in hyperspectral imagery analysis. Most conventional anomaly detectors merely take advantage of spectral and spatial information within neighboring pixels. In this paper, two methods, the Unsupervised Nearest Regularized Subspace-based Anomaly Detector with Outlier Removal (UNRSORAD) and its Local Summation variant (LSUNRSORAD), are proposed. They are based on the concept that each pixel in the background can be approximately represented by its spatial neighborhood, while anomalies cannot. Using a dual window, each testing pixel is approximated by a linear combination of the surrounding data. The existence of outliers in the dual window affects detection accuracy, so the proposed detectors remove outlier pixels that are significantly different from the majority of pixels. To make full use of the various local spatial distributions of the pixels neighboring the pixel under test, we adopt a local summation dual-window sliding strategy. The residual image is constituted by subtracting the predicted background from the original hyperspectral imagery, and anomalies can be detected in the residual image. Experimental results show that the proposed methods greatly improve detection accuracy compared with other traditional detection methods.
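The dual-window idea with outlier removal can be sketched with a deliberately simplified background model: the mean of the outer-ring neighbors (after dropping the most dissimilar ones) stands in for the paper's regularized linear-combination representation, and the anomaly score is the residual to that background. Window sizes, trim count, and test data are assumptions:

```python
import numpy as np

def dual_window_scores(cube, inner=1, outer=3, trim=2):
    """Score each pixel by its distance to the mean of outer-ring neighbors,
    after removing the `trim` most dissimilar neighbors (outlier removal).
    The inner guard window excludes pixels adjacent to the pixel under test."""
    h, w, bands = cube.shape
    scores = np.zeros((h, w))
    for i in range(outer, h - outer):
        for j in range(outer, w - outer):
            block = cube[i - outer:i + outer + 1, j - outer:j + outer + 1]
            mask = np.ones(block.shape[:2], bool)
            mask[outer - inner:outer + inner + 1,
                 outer - inner:outer + inner + 1] = False  # guard window
            neigh = block[mask]
            d = np.linalg.norm(neigh - cube[i, j], axis=1)
            keep = neigh[np.argsort(d)[:len(neigh) - trim]]  # drop outliers
            scores[i, j] = np.linalg.norm(cube[i, j] - keep.mean(axis=0))
    return scores

rng = np.random.default_rng(4)
cube = rng.normal(0.3, 0.02, (16, 16, 5))     # flat spectral background
cube[8, 8] += 0.5                              # one spectral anomaly
scores = dual_window_scores(cube)
```

The trim step matters for pixels near the anomaly: when the anomalous pixel falls in a neighbor's outer ring, it is the most dissimilar entry and gets dropped, so it does not contaminate that neighbor's background estimate.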
Dynamic programming-based hot spot identification approach for pedestrian crashes.
Medury, Aditya; Grembek, Offer
2016-08-01
Network screening techniques are widely used by state agencies to identify locations with high collision concentration, also referred to as hot spots. However, most of the research in this regard has focused on identifying highway segments that are of concern for automobile collisions. In comparison, pedestrian hot spot detection has typically focused on analyzing pedestrian crashes at specific locations, such as at/near intersections, mid-blocks, and/or other crossings, as opposed to long stretches of roadway. In this context, the efficiency of some of the widely used network screening methods has not been tested. Hence, to address this issue, a dynamic programming-based hot spot identification approach is proposed which provides efficient hot spot definitions for pedestrian crashes. The proposed approach is compared with the sliding window method and an intersection buffer-based approach. The results reveal that the dynamic programming method generates more hot spots with a higher number of crashes, while providing small hot spot segment lengths. In comparison, the sliding window method is shown to suffer from shortcomings due to a first-come-first-served approach to hot spot identification and a fixed hot spot window length assumption. Copyright © 2016 Elsevier Ltd. All rights reserved.
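The contrast with the first-come-first-served sliding window can be sketched with a simplified fixed-length-window dynamic program: choose non-overlapping windows along a discretized corridor so that the total number of covered crashes is globally maximal, rather than greedily accepting the first qualifying window. The window length, threshold, and crash counts are illustrative assumptions, not the paper's exact formulation:

```python
def dp_hot_spots(counts, win=3, min_crashes=4):
    """Choose non-overlapping fixed-length windows maximizing total covered
    crashes via dynamic programming; report those meeting a threshold."""
    n = len(counts)
    best = [0] * (n + 1)        # best[i]: max crashes covered in counts[:i]
    take = [False] * (n + 1)    # take[i]: a window ends at cell i in the optimum
    for i in range(1, n + 1):
        best[i] = best[i - 1]
        if i >= win:
            cand = best[i - win] + sum(counts[i - win:i])
            if cand > best[i]:
                best[i], take[i] = cand, True
    spots, i = [], n
    while i > 0:                # backtrack the chosen windows
        if take[i]:
            if sum(counts[i - win:i]) >= min_crashes:
                spots.append((i - win, i))
            i -= win
        else:
            i -= 1
    return sorted(spots)

# pedestrian crash counts per unit roadway cell along a corridor
counts = [0, 5, 1, 0, 0, 0, 2, 3, 4, 0, 1, 0]
spots = dp_hot_spots(counts)
```

A greedy left-to-right scan would lock in the first qualifying window and can miss a better placement downstream; the DP recurrence best[i] = max(best[i-1], best[i-win] + window_sum) guarantees the globally optimal cover.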
Tsai, Kuo-Ming; Wang, He-Yi
2014-08-20
This study focuses on determining the injection molding process window that yields optimal imaging optical properties (astigmatism, coma, and spherical aberration) for plastic lenses. The Taguchi experimental method was first used to identify the optimized combination of parameters and the significant factors affecting the imaging optical properties of the lens. Full factorial experiments were then implemented based on the significant factors to build the response surface models. The injection molding process windows for lenses with optimized optical properties were determined based on the surface models, and confirmation experiments were performed to verify their validity. The results indicated that the significant factors affecting the optical properties of lenses are mold temperature, melt temperature, and cooling time. According to the experimental data for the significant factors, the oblique ovals for different optical properties on the injection molding process windows based on melt temperature and cooling time can be obtained using a curve fitting approach. The confirmation experiments revealed that the average errors for astigmatism, coma, and spherical aberration are 3.44%, 5.62%, and 5.69%, respectively, indicating that the proposed process windows are highly reliable.
Adaptive synchrosqueezing based on a quilted short-time Fourier transform
NASA Astrophysics Data System (ADS)
Berrian, Alexander; Saito, Naoki
2017-08-01
In recent years, the synchrosqueezing transform (SST) has gained popularity as a method for the analysis of signals that can be broken down into multiple components determined by instantaneous amplitudes and phases. One such version of SST, based on the short-time Fourier transform (STFT), enables the sharpening of instantaneous frequency (IF) information derived from the STFT, as well as the separation of amplitude-phase components corresponding to distinct IF curves. However, this SST is limited by the time-frequency resolution of the underlying window function, and may not resolve signals exhibiting diverse time-frequency behaviors with sufficient accuracy. In this work, we develop a framework for an SST based on a "quilted" short-time Fourier transform (SST-QSTFT), which allows adaptation to signal behavior in separate time-frequency regions through the use of multiple windows. This motivates us to introduce a discrete reassignment frequency formula based on a finite difference of the phase spectrum, ensuring computational accuracy for a wider variety of windows. We develop a theoretical framework for the SST-QSTFT in both the continuous and the discrete settings, and describe an algorithm for the automatic selection of optimal windows depending on the region of interest. Using synthetic data, we demonstrate the superior numerical performance of SST-QSTFT relative to other SST methods in a noisy context. Finally, we apply SST-QSTFT to audio recordings of animal calls to demonstrate the potential of our method for the analysis of real bioacoustic signals.
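The discrete reassignment-frequency idea, estimating instantaneous frequency from a finite difference of the STFT phase between consecutive frames, can be sketched as follows. This is a single-window illustration with an assumed Hann window, not the quilted transform itself; FFT size and hop are arbitrary choices.

```python
import numpy as np

def phase_diff_if(x, fs, n_fft=256, hop=64):
    """Instantaneous frequency of the dominant component, estimated from a
    finite difference of the STFT phase between consecutive frames."""
    win = np.hanning(n_fft)
    starts = range(0, len(x) - n_fft, hop)
    frames = np.array([np.fft.rfft(win * x[s:s + n_fft]) for s in starts])
    k = int(np.argmax(np.abs(frames).mean(axis=0)[1:]) + 1)  # dominant bin
    dphi = np.angle(frames[1:, k]) - np.angle(frames[:-1, k])
    expected = 2 * np.pi * k * hop / n_fft          # phase advance of bin centre
    dev = np.angle(np.exp(1j * (dphi - expected)))  # wrap deviation to (-pi, pi]
    return float((expected + dev.mean()) * fs / (2 * np.pi * hop))
```

Because the deviation is wrapped relative to the bin-centre phase advance, the estimate resolves frequencies between FFT bins: a 440 Hz tone is recovered far more precisely than the 31.25 Hz bin spacing would suggest.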
AN ASSESSMENT OF MCNP WEIGHT WINDOWS
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. S. HENDRICKS; C. N. CULBERTSON
2000-01-01
The weight window variance reduction method in the general-purpose Monte Carlo N-Particle radiation transport code MCNP™ has recently been rewritten. In particular, it is now possible to generate weight window importance functions on a superimposed mesh, eliminating the need to subdivide geometries for variance reduction purposes. Our assessment addresses the following questions: (1) Does the new MCNP4C treatment utilize weight windows as well as the former MCNP4B treatment? (2) Does the new MCNP4C weight window generator generate importance functions as well as MCNP4B? (3) How do superimposed mesh weight windows compare to cell-based weight windows? (4) What are the shortcomings of the new MCNP4C weight window generator? Our assessment was carried out with five neutron and photon shielding problems chosen for their demanding variance reduction requirements. The problems were an oil well logging problem, the Oak Ridge fusion shielding benchmark problem, a photon skyshine problem, an air-over-ground problem, and a sample problem for variance reduction.
Restoration of severely weathered wood
R. Sam Williams; Mark Knaebe
2000-01-01
Severely weathered window units were used to test various restoration methods and pretreatments. Sanded and unsanded units were pretreated with a consolidant or water repellent preservative, finished with an oil- or latex-based paint system, and exposed outdoors near Madison, WI, for five years. Pretreatments were applied to both window sashes (stiles and rails) and...
NASA Astrophysics Data System (ADS)
Ham, Boo-Hyun; Kim, Il-Hwan; Park, Sung-Sik; Yeo, Sun-Young; Kim, Sang-Jin; Park, Dong-Woon; Park, Joon-Soo; Ryu, Chang-Hoon; Son, Bo-Kyeong; Hwang, Kyung-Bae; Shin, Jae-Min; Shin, Jangho; Park, Ki-Yeop; Park, Sean; Liu, Lei; Tien, Ming-Chun; Nachtwein, Angelique; Jochemsen, Marinus; Yan, Philip; Hu, Vincent; Jones, Christopher
2017-03-01
As critical dimensions for advanced two-dimensional (2D) DUV patterning continue to shrink, the exact process window becomes increasingly difficult to determine. The defect size criteria shrink with the patterning critical dimensions and are well below the resolution of current optical inspection tools. As a result, it is more challenging for traditional bright field inspection tools to accurately discover the hotspots that define the process window. In this study, we use a novel computational inspection method to identify the depth-of-focus-limiting features of a 10 nm node mask with 2D metal structures (single exposure) and compare the results to those obtained with a traditional process window qualification (PWQ) method that utilizes a focus-modulated wafer and bright field inspection (BFI) to detect hotspot defects. The method is extended to litho-etch litho-etch (LELE) on a different test vehicle to show that overlay-related bridging hotspots can also be identified.
Seismic facies analysis based on self-organizing map and empirical mode decomposition
NASA Astrophysics Data System (ADS)
Du, Hao-kun; Cao, Jun-xing; Xue, Ya-juan; Wang, Xing-jian
2015-01-01
Seismic facies analysis plays an important role in seismic interpretation and reservoir model building by offering an effective way to identify changes in geofacies between wells. The selection of input seismic attributes and their time window has an obvious effect on the validity of classification and requires iterative experimentation and prior knowledge. In general, clustering is sensitive to noise when the waveform serves as the input data, especially with a narrow window. To overcome this limitation, the Empirical Mode Decomposition (EMD) method is introduced into waveform classification based on the self-organizing map (SOM). We first de-noise the seismic data using EMD and then cluster the data using a 1D grid SOM. The main advantages of this method are resolution enhancement and noise reduction. 3D seismic data from the western Sichuan basin, China, are used for validation. The application results show that seismic facies analysis can be improved and better support interpretation. Its strong tolerance for noise makes the proposed method a better seismic facies analysis tool than the classical 1D grid SOM method, especially for waveform clustering with a narrow window.
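A minimal 1D-grid SOM of the kind used for waveform classification can be sketched as follows. The node count, learning-rate schedule, and neighborhood schedule are hypothetical choices, and the EMD de-noising step is not included.

```python
import numpy as np

def train_som_1d(data, n_nodes=4, iters=400, seed=0):
    """Minimal 1D-grid self-organizing map: nodes compete for each sample,
    and the winner and its grid neighbors move toward that sample."""
    rng = np.random.default_rng(seed)
    nodes = data[rng.integers(0, len(data), n_nodes)].astype(float)
    for t in range(iters):
        x = data[rng.integers(0, len(data))]
        bmu = int(np.argmin(((nodes - x) ** 2).sum(axis=1)))  # best-matching unit
        lr = 0.5 * (1 - t / iters)                            # decaying learning rate
        sigma = max(1.0 * (1 - t / iters), 0.2)               # shrinking neighborhood
        for i in range(n_nodes):
            h = np.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
            nodes[i] += lr * h * (x - nodes[i])
    return nodes

def bmu_index(nodes, x):
    """Grid position of the node closest to waveform x (its facies label)."""
    return int(np.argmin(((nodes - x) ** 2).sum(axis=1)))
```

After training, the best-matching-unit index of each trace serves as its facies label; traces with distinct waveform shapes map to distinct grid positions.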
Window-based method for approximating the Hausdorff in three-dimensional range imagery
Koch, Mark W [Albuquerque, NM
2009-06-02
One approach to pattern recognition is to use a template from a database of objects and match it to a probe image containing the unknown. Accordingly, the Hausdorff distance can be used to measure the similarity of two sets of points. In particular, the Hausdorff can measure the goodness of a match in the presence of occlusion, clutter, and noise. However, existing 3D algorithms for calculating the Hausdorff are computationally intensive, making them impractical for pattern recognition that requires scanning of large databases. The present invention is directed to a new method that can efficiently, in time and memory, compute the Hausdorff for 3D range imagery. The method uses a window-based approach.
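For reference, the brute-force symmetric Hausdorff distance that the windowed method accelerates can be written directly:

```python
import numpy as np

def hausdorff(A, B):
    """Symmetric Hausdorff distance between point sets A (n,3) and B (m,3).
    Brute force, O(n*m) in time and memory; the windowed approach exists
    precisely because this does not scale to large range-image databases."""
    D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)  # pairwise distances
    return max(D.min(axis=1).max(),   # farthest A point from its nearest B point
               D.min(axis=0).max())   # and vice versa
```

For robustness to occlusion and clutter, a ranked (partial) variant replaces the outer `max` with a quantile of the nearest-neighbor distances, so a fraction of unmatched points does not dominate the score.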
Sliding window prior data assisted compressed sensing for MRI tracking of lung tumors.
Yip, Eugene; Yun, Jihyun; Wachowicz, Keith; Gabos, Zsolt; Rathee, Satyapal; Fallone, B G
2017-01-01
Hybrid magnetic resonance imaging and radiation therapy devices are capable of imaging in real time to track intrafractional lung tumor motion during radiotherapy. Highly accelerated magnetic resonance (MR) imaging methods can potentially reduce system delay time and/or improve imaging spatial resolution, and provide flexibility in imaging parameters. Prior Data Assisted Compressed Sensing (PDACS) has previously been proposed as an acceleration method that combines the advantages of 2D compressed sensing and the KEYHOLE view-sharing technique. However, because PDACS relies on prior data acquired at the beginning of a dynamic imaging sequence, image quality declines for longer scans due to drifts in MR signal. Novel sliding window-based techniques for refreshing the prior data are proposed as a solution to this problem. MR acceleration is performed by retrospective removal of data from the fully sampled sets. Six patients with lung tumors are scanned with a clinical 3 T MRI using a balanced steady-state free precession (bSSFP) sequence for 3 min at approximately 4 frames per second, for a total of 650 dynamics. A series of distinct pseudo-random patterns of partial k-space acquisition is generated such that, when combined with other dynamics within a sliding window of 100 dynamics, it covers the entire k-space. The prior data in the sliding window are continuously refreshed to reduce the impact of MR signal drifts. We demonstrate two different ways to utilize the sliding window data: a simple averaging method and a navigator-based method. These two sliding window methods are quantitatively compared against the original PDACS method using three metrics: artifact power, centroid displacement error, and Dice's coefficient. The study is repeated with pseudo 0.5 T images created by adding complex, normally distributed noise with a standard deviation that reduces image SNR, relative to the original 3 T images, by a factor of 6.
Without a sliding window implemented, PDACS-reconstructed dynamic datasets showed progressive increases in image artifact power as the 3 min scan progressed. With sliding windows implemented, this increase in artifact power is eliminated. Near the end of a 3 min scan at 3 T SNR and 5× acceleration, implementation of an averaging (navigator) sliding window method improves our metrics as follows: artifact power decreases from 0.065 without a sliding window to 0.030 (0.031), centroid error decreases from 2.64 to 1.41 mm (1.28 mm), and Dice coefficient agreement increases from 0.860 to 0.912 (0.915). At pseudo 0.5 T SNR, the improvements are as follows: artifact power decreases from 0.110 without a sliding window to 0.0897 (0.0985), centroid error decreases from 2.92 mm to 1.36 mm (1.32 mm), and Dice coefficient agreement increases from 0.851 to 0.894 (0.896). In this work we demonstrated the negative impact of slow changes in MR signal on longer-duration PDACS dynamic scans, namely increased image artifact power and reduced tumor tracking accuracy. We have also demonstrated that sliding window implementations (i.e., refreshing of prior data) of PDACS are effective solutions to this problem for both 3 T and simulated 0.5 T bSSFP images. © 2016 American Association of Physicists in Medicine.
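The view-sharing idea behind refreshing prior data, filling unsampled k-space lines from the most recent dynamics inside a sliding window, can be sketched schematically. Phase-encode lines are treated as scalars, and a permuted cyclic sampling pattern stands in for the authors' pseudo-random patterns; the acceleration factor and window length are illustrative only.

```python
import numpy as np

def cycling_masks(n_lines, accel, n_dyn, seed=1):
    """Partial-sampling patterns: each dynamic acquires n_lines // accel
    phase-encode lines from a fixed permutation, cycling so that any `accel`
    consecutive dynamics together cover all of k-space."""
    order = np.random.default_rng(seed).permutation(n_lines)
    per = n_lines // accel
    masks = []
    for d in range(n_dyn):
        m = np.zeros(n_lines, dtype=bool)
        m[order[(d % accel) * per:(d % accel + 1) * per]] = True
        masks.append(m)
    return masks

def sliding_window_fill(kspaces, masks, window):
    """Fill unsampled lines of the latest dynamic from the most recent
    dynamic inside the sliding window that sampled them (view sharing)."""
    filled, have = kspaces[-1].copy(), masks[-1].copy()
    for ks, m in zip(kspaces[-window:-1][::-1], masks[-window:-1][::-1]):
        take = m & ~have            # lines still missing, sampled by this dynamic
        filled[take] = ks[take]
        have |= take
    return filled, have
```

Because the window slides with the acquisition, every "prior" line is at most `window` dynamics old, which is what suppresses the drift-induced artifact growth reported above.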
Rusterholz, Thomas; Achermann, Peter; Dürr, Roland; Koenig, Thomas; Tarokh, Leila
2017-06-01
Investigating functional connectivity between brain networks has become an area of interest in neuroscience. Several methods for investigating connectivity have recently been developed; however, these techniques need to be applied with care. We demonstrate that global field synchronization (GFS), a global measure of phase alignment in the EEG as a function of frequency, must be applied with signal processing principles in mind in order to yield valid results. Multichannel EEG (27 derivations) was analyzed for GFS based on the complex spectrum derived by the fast Fourier transform (FFT). We examined the effect of window functions on GFS, in particular of non-rectangular windows. Applying a rectangular window when calculating the FFT revealed high GFS values for high frequencies (>15 Hz) that were highly correlated (r=0.9) with spectral power in the lower frequency range (0.75-4.5 Hz) and tracked the depth of sleep. This turned out to be spurious synchronization. With a non-rectangular window (Tukey or Hanning window) this high-frequency synchronization vanished. Both GFS and power density spectra differed significantly between rectangular and non-rectangular windows. Previous papers using GFS typically did not specify the applied window and may have used a rectangular window function. However, the demonstrated impact of the window function raises questions about the validity of some previous findings at higher frequencies. We demonstrated that it is crucial to apply an appropriate window function when determining synchronization measures based on a spectral approach, to avoid spurious synchronization in the beta/gamma range. Copyright © 2017 Elsevier B.V. All rights reserved.
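The window effect reported above is ordinary spectral leakage, and it is easy to reproduce: for a sine whose frequency falls between FFT bins, a rectangular window leaks far more power into high-frequency bins than a Hanning window. A minimal demonstration, with signal length, frequency, and cutoff chosen arbitrarily:

```python
import numpy as np

def highband_leakage(window, n=512, cycles=3.3, cutoff=50):
    """Fraction of spectral power above bin `cutoff` for a low-frequency sine
    whose frequency falls between FFT bins (a worst case for leakage)."""
    t = np.arange(n)
    x = np.sin(2 * np.pi * cycles * t / n)
    p = np.abs(np.fft.rfft(x * window)) ** 2
    return p[cutoff:].sum() / p.sum()

rect = highband_leakage(np.ones(512))       # rectangular window
hann = highband_leakage(np.hanning(512))    # Hanning window
```

With the rectangular window, a measurable fraction of the low-frequency power appears in bins far above the signal; with the Hanning window it collapses by orders of magnitude, mirroring how the spurious >15 Hz synchronization vanished with a non-rectangular window.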
Dong, Bing; Li, Yan; Han, Xin-Li; Hu, Bin
2016-09-02
For high-speed aircraft, a conformal window is used to optimize the aerodynamic performance. However, the local shape of the conformal window introduces large dynamic aberrations that vary with look angle. In this paper, a deformable mirror (DM) and model-based wavefront sensorless adaptive optics (WSLAO) are used for dynamic aberration correction of an infrared remote sensor equipped with a conformal window and scanning mirror. In model-based WSLAO, aberration is captured using Lukosz modes, and we use the low spatial frequency content of the image spectral density as the metric function. Simulations show that aberrations induced by the conformal window are dominated by a few low-order Lukosz modes. To optimize the dynamic correction, we can correct only the dominant Lukosz modes, and the image size can be minimized to reduce the time required to compute the metric function. In our experiment, a 37-channel DM is used to mimic the dynamic aberration of a conformal window with a scanning rate of 10 degrees per second, and a 52-channel DM is used for correction. For a 128 × 128 image, the mean value of image sharpness during dynamic correction is 1.436 × 10−5 with optimized correction and 1.427 × 10−5 with un-optimized correction. We also demonstrate that model-based WSLAO can achieve convergence two times faster than the traditional stochastic parallel gradient descent (SPGD) method.
Alkaline battery operational methodology
Sholklapper, Tal; Gallaway, Joshua; Steingart, Daniel; Ingale, Nilesh; Nyce, Michael
2016-08-16
Methods of using specific operational charge and discharge parameters to extend the life of alkaline batteries are disclosed. The methods can be used with any commercial primary or secondary alkaline battery, as well as with newer alkaline battery designs, including batteries with flowing electrolyte. The methods include cycling batteries within a narrow operating voltage window, with minimum and maximum cut-off voltages that are set based on battery characteristics and environmental conditions. The narrow voltage window decreases available capacity but allows the batteries to be cycled for hundreds or thousands of times.
Eisner, Brian H; Kambadakone, Avinash; Monga, Manoj; Anderson, James K; Thoreson, Andrew A; Lee, Hang; Dretler, Stephen P; Sahani, Dushyant V
2009-04-01
We determined the most accurate method of measuring urinary stones on computerized tomography. For the in vitro portion of the study, 24 calculi previously collected at our clinic, including 12 calcium oxalate monohydrate and 12 uric acid stones, were measured manually with hand calipers as the gold standard measurement. The calculi were then embedded in human kidney-sized potatoes and scanned using 64-slice multidetector computerized tomography. Computerized tomography measurements were performed at 4 window settings: standard soft tissue windows (window width 320 and window level 50), standard bone windows (window width 1120 and window level 300), 5.13x magnified soft tissue windows and 5.13x magnified bone windows. Maximum stone dimensions were recorded. For the in vivo portion of the study, 41 patients with distal ureteral stones who underwent noncontrast computerized tomography and subsequently spontaneously passed the stones were analyzed. All analyzed stones were 100% calcium oxalate monohydrate or mixed, calcium based stones. Stones were prospectively collected at the clinic and the largest diameter was measured with digital calipers as the gold standard. This was compared to computerized tomography measurements using 4.0x magnified soft tissue windows and 4.0x magnified bone windows. Statistical comparisons were performed using Pearson's correlation and the paired t test. In the in vitro portion of the study the most accurate measurements were obtained using 5.13x magnified bone windows, with a mean 0.13 mm difference from caliper measurement (p = 0.6). Measurements performed in the soft tissue window with and without magnification, and in the bone window without magnification, differed significantly from hand caliper measurements (mean difference 1.2, 1.9 and 1.4 mm; p = 0.003, <0.001 and 0.0002, respectively).
When comparing measurement errors between stones of different composition in vitro, the error for calcium oxalate calculi was significantly different from the gold standard for all methods except bone window settings with magnification. For uric acid calculi, measurement error was observed only in standard soft tissue window settings. In vivo, 4.0x magnified bone windows were superior to 4.0x magnified soft tissue windows in measurement accuracy. Magnified bone window measurements were not statistically different from digital caliper measurements (mean underestimation vs digital caliper 0.3 mm, p = 0.4), while magnified soft tissue windows were statistically distinct (mean underestimation 1.4 mm, p = 0.001). In this study magnified bone windows were the most accurate method of stone measurement in vitro and in vivo. Therefore, we recommend the routine use of magnified bone windows for computerized tomography measurement of stones. In vitro, the measurement error for calcium oxalate stones was greater than that for uric acid stones, suggesting that stone composition may be responsible for measurement inaccuracies.
Domingo-Almenara, Xavier; Perera, Alexandre; Brezmes, Jesus
2016-11-25
Gas chromatography-mass spectrometry (GC-MS) produces large and complex datasets characterized by co-eluted compounds at trace levels and by a distinct compound ion redundancy resulting from the strong fragmentation caused by electron impact ionization. Compounds in GC-MS can be resolved by taking advantage of the multivariate nature of GC-MS data through multivariate resolution methods. However, multivariate methods have to be applied to small regions of the chromatogram, and therefore chromatograms are segmented prior to the application of the algorithms. The automation of this segmentation process is a challenging task, as it implies separating informative data from noise in the chromatogram. This study demonstrates the capabilities of independent component analysis-orthogonal signal deconvolution (ICA-OSD) and multivariate curve resolution-alternating least squares (MCR-ALS) with an overlapping moving window implementation that avoids the typical hard chromatographic segmentation. Also, after being resolved, compounds are aligned across samples by an automated alignment algorithm. We evaluated the proposed methods through a quantitative analysis of GC-qTOF MS data from 25 serum samples. The quantitative performance of both the moving window ICA-OSD and MCR-ALS-based implementations was compared with the quantification of 33 compounds by the XCMS package. Results showed that most of the R² coefficients of determination were high (R² > 0.90) for both the ICA-OSD and MCR-ALS moving window-based approaches. Copyright © 2016 Elsevier B.V. All rights reserved.
Least Squares Moving-Window Spectral Analysis.
Lee, Young Jong
2017-08-01
Least squares regression is proposed as a moving-window method for the analysis of a series of spectra acquired as a function of external perturbation. The least squares moving-window (LSMW) method can be considered an extended form of Savitzky-Golay differentiation for nonuniform perturbation spacing. LSMW is characterized in terms of moving-window size, perturbation spacing type, and intensity noise. Simulation results from LSMW are compared with results from other numerical differentiation methods, such as single-interval differentiation, autocorrelation moving-window, and perturbation correlation moving-window methods. It is demonstrated that this simple LSMW method can be useful for quantitative analysis of nonuniformly spaced spectral data with high-frequency noise.
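The LSMW idea, fitting a least squares line within each moving window and taking its slope as the derivative, can be sketched for nonuniform perturbation spacing as follows; the window half-width is an arbitrary choice, and higher polynomial orders are omitted.

```python
import numpy as np

def lsmw_derivative(x, y, half_width=3):
    """First derivative of y with respect to a (possibly nonuniformly spaced)
    perturbation x: fit a least squares line in each moving window and keep
    its slope for the window's central point."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    d = np.empty(len(x))
    for i in range(len(x)):
        lo, hi = max(0, i - half_width), min(len(x), i + half_width + 1)
        xi, yi = x[lo:hi], y[lo:hi]
        xm, ym = xi.mean(), yi.mean()
        # closed-form slope of the least squares line through the window
        d[i] = ((xi - xm) * (yi - ym)).sum() / ((xi - xm) ** 2).sum()
    return d
```

Unlike classical Savitzky-Golay coefficients, which assume equally spaced abscissae, the fit above uses the actual perturbation values, which is exactly the nonuniform-spacing extension the abstract describes.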
Determination of skeleton and sign map for phase obtaining from a single ESPI image
NASA Astrophysics Data System (ADS)
Yang, Xia; Yu, Qifeng; Fu, Sihua
2009-06-01
A robust method of determining the sign map and skeletons for ESPI images is introduced in this paper. ESPI images exhibit high speckle noise, which makes it difficult to obtain fringe information, especially from a single image. To overcome the effects of high speckle noise, local directional computing windows are designed according to the fringe directions. By calculating gradients from the filtered image in these directional windows, the sign map and good skeletons can be determined robustly. Based on the sign map, single-image phase-extracting methods such as the quadrature transform can be improved, and based on the skeletons, fringe phases can be obtained directly by normalization methods. Experiments show that this new method is robust and effective for extracting phase from a single ESPI fringe image.
NASA Astrophysics Data System (ADS)
Zboril, Ondrej; Nedoma, Jan; Cubik, Jakub; Novak, Martin; Bednarek, Lukas; Fajkus, Marcel; Vasinek, Vladimir
2016-04-01
Interferometric sensors are very accurate and sensitive sensors whose extreme sensitivity allows sensing of vibration and acoustic signals. This paper describes a new implementation of a Mach-Zehnder interferometer for sensing vibrations caused by touching window panes. The window panes are part of plastic windows in which the reference arm of the interferometer is mounted and isolated inside the frame, while the measuring arm of the interferometer is fixed to the window pane and mounted under the cover of the window frame. This hides the optical fiber from view, and the arrangement forms the basis of a security system. The vibration sensor is constructed from standard communication network elements: optical fiber according to ITU-T G.652D and 1x2 splitters with a 1:1 dividing ratio. The interferometer operated at a wavelength of 1550 nm. The paper analyzes the sensitivity of the window over a 12x12 matrix of measuring points and specifies the sensitivity distribution of the window pane.
Linear segmentation algorithm for detecting layer boundary with lidar.
Mao, Feiyue; Gong, Wei; Logan, Timothy
2013-11-04
The automatic detection of aerosol- and cloud-layer boundaries (base and top) is important in atmospheric lidar data processing, because the boundary information is not only useful for environment and climate studies, but can also serve as input for further data processing. Previous methods have shown limitations in defining the base and top and in setting the window size, and have neglected in-layer attenuation. To overcome these limitations, we present a new layer detection scheme for up-looking lidars based on linear segmentation with a reasonable threshold setting, boundary selection, and false-positive removal strategies. Preliminary results from both real and simulated data show that this algorithm can not only detect the layer base as accurately as the simple multi-scale method, but can also detect the layer top more accurately. Our algorithm can be applied directly to uncalibrated data without requiring any additional measurements or window size selections.
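A linear-segmentation scheme of this general kind can be sketched as follows: recursively split the profile into quasi-linear pieces, then read the layer base and top off the segment slopes. This is a schematic stand-in for the authors' algorithm; the tolerance and slope threshold are hypothetical, and attenuation correction and false-positive removal are omitted.

```python
import numpy as np

def segment(y, tol=0.05):
    """Recursive linear segmentation: split at the point of maximum deviation
    from the chord until every piece is within tol of a straight line."""
    def rec(lo, hi, bps):
        xs = np.arange(lo, hi + 1)
        chord = y[lo] + (y[hi] - y[lo]) * (xs - lo) / max(hi - lo, 1)
        dev = np.abs(y[lo:hi + 1] - chord)
        k = int(np.argmax(dev))
        if dev[k] > tol and 0 < k < hi - lo:
            rec(lo, lo + k, bps)
            bps.append(lo + k)
            rec(lo + k, hi, bps)
    bps = []
    rec(0, len(y) - 1, bps)
    return [0] + sorted(bps) + [len(y) - 1]

def layer_base_top(y, tol=0.05, slope_thresh=0.01):
    """Layer base: start of the first segment with strongly positive slope;
    layer top: end of the last subsequent segment with negative slope."""
    bps = segment(y, tol)
    base = top = None
    for a, b in zip(bps[:-1], bps[1:]):
        s = (y[b] - y[a]) / (b - a)
        if base is None and s > slope_thresh:
            base = a
        if base is not None and s < -slope_thresh:
            top = b
    return base, top
```

Because the split points adapt to the profile itself, no fixed window size has to be chosen, which is one of the limitations of earlier methods noted above.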
Plan averaging for multicriteria navigation of sliding window IMRT and VMAT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, David, E-mail: dcraft@partners.org; Papp, Dávid; Unkelbach, Jan
2014-02-15
Purpose: To describe a method for combining sliding window plans [intensity modulated radiation therapy (IMRT) or volumetric modulated arc therapy (VMAT)] for use in treatment plan averaging, which is needed for Pareto surface navigation based multicriteria treatment planning. Methods: The authors show that by taking an appropriately defined average of leaf trajectories of sliding window plans, the authors obtain a sliding window plan whose fluence map is the exact average of the fluence maps corresponding to the initial plans. In the case of static-beam IMRT, this also implies that the dose distribution of the averaged plan is the exact dosimetric average of the initial plans. In VMAT delivery, the dose distribution of the averaged plan is a close approximation of the dosimetric average of the initial plans. Results: The authors demonstrate the method on three Pareto optimal VMAT plans created for a demanding paraspinal case, where the tumor surrounds the spinal cord. The results show that the leaf averaged plans yield dose distributions that approximate the dosimetric averages of the precomputed Pareto optimal plans well. Conclusions: The proposed method enables the navigation of deliverable Pareto optimal plans directly, i.e., interactive multicriteria exploration of deliverable sliding window IMRT and VMAT plans, eliminating the need for a sequencing step after navigation and hence the dose degradation that is caused by such a sequencing step.
NASA Astrophysics Data System (ADS)
Xin, Meiting; Li, Bing; Yan, Xiao; Chen, Lei; Wei, Xiang
2018-02-01
A robust coarse-to-fine registration method based on the backpropagation (BP) neural network and shift window technology is proposed in this study. Specifically, there are three steps: coarse alignment between the model data and measured data, data simplification based on the BP neural network and point reservation in the contour region of point clouds, and fine registration with the reweighted iterative closest point algorithm. In the process of rough alignment, the initial rotation matrix and the translation vector between the two datasets are obtained. After performing subsequent simplification operations, the number of points can be reduced greatly. Therefore, the time and space complexity of the accurate registration can be significantly reduced. The experimental results show that the proposed method improves the computational efficiency without loss of accuracy.
Rigorous Numerical Study of Low-Period Windows for the Quadratic Map
NASA Astrophysics Data System (ADS)
Galias, Zbigniew
An efficient method to find all low-period windows for the quadratic map is proposed. The method is used to obtain very accurate rigorous bounds of positions of all periodic windows with periods p ≤ 32. The contribution of period-doubling windows on the total width of periodic windows is discussed. Properties of periodic windows are studied numerically.
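A non-rigorous numerical stand-in for window detection uses the sign of the Lyapunov exponent, which is negative inside a stable periodic window and positive in the chaotic regime. The sketch below uses the logistic form of the quadratic map (the two are smoothly conjugate); it produces floating-point estimates, not the rigorous interval bounds of the paper.

```python
import numpy as np

def lyapunov_logistic(r, n_transient=500, n_iter=2000, x0=0.5):
    """Lyapunov exponent of the logistic map x -> r*x*(1-x): the average of
    log|f'(x)| = log|r*(1-2x)| along a typical orbit."""
    x = x0
    for _ in range(n_transient):                 # discard the transient
        x = r * x * (1 - x)
    s = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        s += np.log(abs(r * (1 - 2 * x)) + 1e-300)  # guard against log(0)
    return s / n_iter

def in_periodic_window(r):
    """Heuristic window test: a negative exponent signals a stable cycle."""
    return lyapunov_logistic(r) < 0
```

Scanning `r` over a fine grid and recording sign changes gives approximate window boundaries; for example the well-known period-3 window (roughly 3.828 to 3.841 in the logistic parameterization) shows up as a negative-exponent interval.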
Seismic signal time-frequency analysis based on multi-directional window using greedy strategy
NASA Astrophysics Data System (ADS)
Chen, Yingpin; Peng, Zhenming; Cheng, Zhuyuan; Tian, Lin
2017-08-01
The Wigner-Ville distribution (WVD) is an important time-frequency analysis technique with a highly concentrated energy distribution for seismic signal processing. However, it suffers from many cross terms. To suppress the cross terms of the WVD while keeping its high energy concentration, an adaptive multi-directional filtering window in the ambiguity domain is proposed. Starting from the relationship between the Cohen class distribution and the Gabor transform, and combining a greedy strategy with the rotational invariance property of the fractional Fourier transform, the proposed window extends the one-dimensional, single-direction optimal window function of the optimal fractional Gabor transform (OFrGT) to a two-dimensional, multi-directional window in the ambiguity domain. In this way, the multi-directional window matches the main auto terms of the WVD more precisely. Using the greedy strategy, the proposed window takes into account the optimal and other suboptimal directions, which also resolves the local concentration phenomenon that the OFrGT encounters on multi-component signals. Experiments on several types of signal models and real seismic signals show that the proposed window overcomes the drawbacks of both the WVD and the OFrGT mentioned above. Finally, the proposed method is applied to spectral decomposition of a seismic signal. The results show that the proposed method can explore the spatial distribution of a reservoir more precisely.
Yin, Xiaoming; Li, Xiang; Zhao, Liping; Fang, Zhongping
2009-11-10
A Shack-Hartmann wavefront sensor (SHWS) splits the incident wavefront into many subsections and transforms distorted wavefront detection into a centroid measurement. The accuracy of the centroid measurement determines the accuracy of the SHWS. Many methods have been presented to improve the accuracy of the wavefront centroid measurement. However, most of these methods are discussed from the point of view of optics, based on the assumption that the spot intensity of the SHWS has a Gaussian distribution, which is not applicable to the digital SHWS. In this paper, we present a centroid measurement algorithm based on adaptive thresholding and dynamic windowing, utilizing image processing techniques, for practical application of the digital SHWS in surface profile measurement. The method can detect the centroid of each focal spot precisely and robustly by eliminating the influence of various noise sources, such as diffraction of the digital SHWS, unevenness and instability of the light source, and deviation between the centroid of the focal spot and the center of the detection area. The experimental results demonstrate that the algorithm has better precision, repeatability, and stability than other commonly used centroid methods, such as the statistical averaging, thresholding, and windowing algorithms.
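The adaptive-thresholding and dynamic-windowing steps can be sketched as follows; the threshold multiplier `k` and window half-size are hypothetical parameters, not the authors' calibrated values.

```python
import numpy as np

def spot_centroid(img, k=3.0, half_win=6):
    """Centroid of a focal spot: adaptive threshold at mean + k*std suppresses
    the background, a dynamic window is placed around the brightest remaining
    pixel, and an intensity-weighted centroid is computed inside the window."""
    t = img.mean() + k * img.std()
    clean = np.clip(img - t, 0, None)            # adaptive thresholding
    pi, pj = np.unravel_index(np.argmax(clean), clean.shape)
    lo_i, hi_i = max(0, pi - half_win), min(img.shape[0], pi + half_win + 1)
    lo_j, hi_j = max(0, pj - half_win), min(img.shape[1], pj + half_win + 1)
    w = clean[lo_i:hi_i, lo_j:hi_j]              # dynamic window around the spot
    ii, jj = np.mgrid[lo_i:hi_i, lo_j:hi_j]
    s = w.sum()
    return (ii * w).sum() / s, (jj * w).sum() / s
```

On a synthetic Gaussian spot with additive noise, the windowed, thresholded centroid recovers the true sub-pixel position far more tightly than a whole-frame centroid, which is dragged toward the image center by the background.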
Solving the chemical master equation using sliding windows
2010-01-01
Background The chemical master equation (CME) is a system of ordinary differential equations that describes the evolution of a network of chemical reactions as a stochastic process. Its solution yields the probability density vector of the system at each point in time. Solving the CME numerically is in many cases computationally expensive or even infeasible as the number of reachable states can be very large or infinite. We introduce the sliding window method, which computes an approximate solution of the CME by performing a sequence of local analysis steps. In each step, only a manageable subset of states is considered, representing a "window" into the state space. In subsequent steps, the window follows the direction in which the probability mass moves, until the time period of interest has elapsed. We construct the window based on a deterministic approximation of the future behavior of the system by estimating upper and lower bounds on the populations of the chemical species. Results In order to show the effectiveness of our approach, we apply it to several examples previously described in the literature. The experimental results show that the proposed method speeds up the analysis considerably, compared to a global analysis, while still providing high accuracy. Conclusions The sliding window method is a novel approach to address the performance problems of numerical algorithms for the solution of the chemical master equation. The method efficiently approximates the probability distributions at the time points of interest for a variety of chemically reacting systems, including systems for which no upper bound on the population sizes of the chemical species is known a priori. PMID:20377904
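The local-analysis step can be illustrated on a one-species birth-death network. This is a minimal sketch under stated assumptions (explicit Euler time stepping and a fixed window; the actual implementation uses more sophisticated integrators and the deterministic bounds described above to place the window). Probability that would leave the window is dropped; that loss is exactly the truncation error the sliding window is chosen to keep small.

```python
def evolve_cme_window(p, lo, birth, death, dt, steps):
    """Euler-integrate a one-species birth-death CME on a truncated
    window of states lo, lo+1, ..., lo+len(p)-1.

    p[i] is the probability of population lo+i; birth(x) and death(x)
    are the propensity functions. Transitions out of the window are
    discarded, so sum(p) < 1 signals that the window should be moved
    or enlarged before the next local analysis step."""
    n = len(p)
    for _ in range(steps):
        dp = [0.0] * n
        for i in range(n):
            x = lo + i
            dp[i] -= (birth(x) + death(x)) * p[i]
            if i + 1 < n:
                dp[i + 1] += birth(x) * p[i]
            if i > 0:
                dp[i - 1] += death(x) * p[i]
        p = [pi + dt * di for pi, di in zip(p, dp)]
    return p
```

In the full method a sequence of such windows is chained together, each re-centered on where the deterministic bounds predict the probability mass will be.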
Thin and open vessel windows for intra-vital fluorescence imaging of murine cochlear blood flow
Shi, Xiaorui; Zhang, Fei; Urdang, Zachary; Dai, Min; Neng, Lingling; Zhang, Jinhui; Chen, Songlin; Ramamoorthy, Sripriya; Nuttall, Alfred L.
2014-01-01
Normal microvessel structure and function in the cochlea are essential for maintaining the ionic and metabolic homeostasis required for hearing. Abnormal cochlear microcirculation has long been considered an etiologic factor in hearing disorders. A better understanding of cochlear blood flow (CoBF) will enable more effective amelioration of hearing disorders that result from aberrant blood flow. However, establishing the direct relationship between CoBF, other cellular events in the lateral wall, and the response to physio-pathological stress remains a challenge due to the lack of feasible interrogation methods and the difficulty of accessing the inner ear. Here we report new methods for studying CoBF in a mouse model using a thin or open vessel-window in combination with fluorescence intra-vital microscopy (IVM). An open vessel-window enables investigation of vascular cell biology and blood flow permeability, including pericyte (PC) contractility, bone marrow cell migration, and endothelial barrier leakage, in wild-type and fluorescent protein-labeled transgenic mouse models with high spatial and temporal resolution. Alternatively, the thin vessel-window method minimizes disruption of the homeostatic balance in the lateral wall and enables study of CoBF under relatively intact physiological conditions. A thin vessel-window can also be used for time-based studies of physiological and pathological processes. Although the small size of the mouse cochlea makes surgery difficult, the methods are sufficiently developed for studying the structural and functional changes in CoBF under normal and pathological conditions. PMID:24780131
Windowed multitaper correlation analysis of multimodal brain monitoring parameters.
Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander
2015-01-01
Although multimodal monitoring sets the standard in daily neurocritical care practice, problem-oriented analysis tools to interpret the huge amount of data are lacking. Recently a mathematical model was presented that simulates cerebral perfusion and oxygen supply in case of severe head trauma, predicting the appearance of distinct correlations between arterial blood pressure and intracranial pressure. In this study we present a set of mathematical tools that reliably detect the predicted correlations in data recorded at a neurocritical care unit. The time-resolved correlations are identified by a windowing technique combined with Fourier-based coherence calculations. The phasing of the data is detected by means of the Hilbert phase difference within the above-mentioned windows. A statistical testing method is introduced that allows tuning the parameters of the windowing method in such a way that a predefined accuracy is reached. With this method the data of fifteen patients were examined, and the predicted correlation was found in each patient. Additionally it could be shown that the occurrence of a distinct correlation parameter, called scp, is a high-quality predictor of patient outcome.
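A windowed, Fourier-based coherence calculation of the kind described can be sketched as follows. This uses plain Welch-style segment averaging inside one analysis window; the multitaper refinement (averaging over orthogonal taper functions instead of plain segments) is not reproduced here.

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform, adequate for short segments."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N)) for k in range(N)]

def windowed_coherence(a, b, seg):
    """Magnitude-squared coherence of signals a and b inside one analysis
    window: the window is cut into non-overlapping segments of length seg,
    and the cross- and auto-spectra are averaged over segments before
    forming |Sab|^2 / (Saa * Sbb) per frequency bin. Averaging over more
    than one segment is what makes the estimate informative (a single
    segment always yields coherence 1)."""
    nseg = len(a) // seg
    Sab = [0j] * seg
    Saa = [0.0] * seg
    Sbb = [0.0] * seg
    for s in range(nseg):
        A = dft(a[s * seg:(s + 1) * seg])
        B = dft(b[s * seg:(s + 1) * seg])
        for k in range(seg):
            Sab[k] += A[k] * B[k].conjugate()
            Saa[k] += abs(A[k]) ** 2
            Sbb[k] += abs(B[k]) ** 2
    return [abs(Sab[k]) ** 2 / (Saa[k] * Sbb[k] + 1e-30) for k in range(seg)]
```

Sliding this computation along the recording, window by window, yields the time-resolved correlation measure described above.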
Windowed Green function method for the Helmholtz equation in the presence of multiply layered media
NASA Astrophysics Data System (ADS)
Bruno, O. P.; Pérez-Arancibia, C.
2017-06-01
This paper presents a new methodology for the solution of problems of two- and three-dimensional acoustic scattering (and, in particular, two-dimensional electromagnetic scattering) by obstacles and defects in the presence of an arbitrary number of penetrable layers. Relying on the use of certain slow-rise windowing functions, the proposed windowed Green function approach efficiently evaluates oscillatory integrals over unbounded domains, with high accuracy, without recourse to the highly expensive Sommerfeld integrals that have typically been used to account for the effect of underlying planar multilayer structures. The proposed methodology, whose theoretical basis was presented in the recent contribution (Bruno et al. 2016 SIAM J. Appl. Math. 76, 1871-1898. (doi:10.1137/15M1033782)), is fast, accurate, flexible and easy to implement. Our numerical experiments demonstrate that the numerical errors resulting from the proposed approach decrease faster than any negative power of the window size. In a number of examples considered in this paper, the proposed method is up to thousands of times faster, for a given accuracy, than corresponding methods based on the use of Sommerfeld integrals.
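The role of the slow-rise window can be illustrated with one standard C-infinity windowing function: identically 1 on an inner region, identically 0 outside, with all derivatives vanishing at the matching points. This is a generic choice for illustration; the exact function used by the authors may differ.

```python
import math

def slow_rise_window(x, A, c=0.5):
    """A C-infinity slow-rise window: equal to 1 for |x| <= c*A, equal to
    0 for |x| >= A, and smoothly decaying in between with all derivatives
    vanishing at both ends. Multiplying the integrand of an unbounded
    oscillatory integral by such a window leaves it unchanged on the
    inner region while cutting it off so smoothly that the windowing
    error decays superalgebraically in the window size A."""
    t = abs(x) / A
    if t <= c:
        return 1.0
    if t >= 1.0:
        return 0.0
    u = (t - c) / (1.0 - c)          # u runs over (0, 1) on the rise
    return math.exp(2.0 * math.exp(-1.0 / u) / (u - 1.0))
```

The superalgebraic convergence reported in the paper ("faster than any negative power of the window size") is a consequence of this smoothness: no derivative of the window introduces a jump that would slow the decay.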
Using seismic coda waves to resolve intrinsic and scattering attenuation
NASA Astrophysics Data System (ADS)
Wang, W.; Shearer, P. M.
2016-12-01
Seismic attenuation is caused by two factors, scattering and intrinsic absorption. Characterizing scattering and absorbing properties and the power spectrum of crustal heterogeneity is a fundamental problem for informing strong ground motion estimates at high frequencies, where scattering and attenuation effects are critical. Determining the relative amount of attenuation caused by scattering and intrinsic absorption has been a long-standing problem in seismology. The wavetrain following the direct body wave phases is called the coda, which is caused by scattered energy. Many studies have analyzed the coda of local events to constrain crustal and upper-mantle scattering strength and intrinsic attenuation. Here we examine two popular attenuation inversion methods, the Multiple Lapse Time Window Method (MLTWM) and the Coda Qc Method. First, based on our previous work on California attenuation structure, we apply an efficient and accurate method, the Monte Carlo Approach, to synthesize seismic envelope functions. We use this code to generate a series of synthetic data based on several complex and realistic forward models. Although the MLTWM assumes a uniform whole space, we use the MLTWM to invert for both scattering and intrinsic attenuation from the synthetic data to test how accurately it can recover the attenuation models. Results for the coda Qc method depend on choices for the length and starting time of the coda-wave time window. Here we explore the relation between the inversion results for Qc, the windowing parameters, and the intrinsic and scattering Q structure of our synthetic model. These results should help assess the practicality and accuracy of the Multiple Lapse Time Window Method and Coda Qc Method when applied to realistic crustal velocity and attenuation models.
A fast non-contact imaging photoplethysmography method using a tissue-like model
NASA Astrophysics Data System (ADS)
McDuff, Daniel J.; Blackford, Ethan B.; Estepp, Justin R.; Nishidate, Izumi
2018-02-01
Imaging photoplethysmography (iPPG) allows non-contact, concomitant measurement and visualization of peripheral blood flow using just an RGB camera. Most iPPG methods require a window of temporal data and complex computation, which makes real-time measurement and spatial visualization impossible. We present a fast, "window-less", non-contact imaging photoplethysmography method, based on a tissue-like model of the skin, that allows accurate measurement of heart rate and heart rate variability parameters. The error in heart rate estimates is equivalent to state-of-the-art techniques and computation is much faster.
Fixed-rate layered multicast congestion control
NASA Astrophysics Data System (ADS)
Bing, Zhang; Bing, Yuan; Zengji, Liu
2006-10-01
A new fixed-rate layered multicast congestion control algorithm called FLMCC is proposed. The sender of a multicast session transmits data packets at a fixed rate on each layer, while receivers each obtain different throughput by cumulatively subscribing to different numbers of layers based on their expected rates. In order to provide TCP-friendliness and estimate the expected rate accurately, a window-based mechanism implemented at receivers is presented. To achieve this, each receiver maintains a congestion window, adjusts it based on the GAIMD algorithm, and calculates an expected rate from the congestion window. To measure RTT, a new method is presented which combines an accurate measurement with a rough estimation. A feedback suppression scheme based on a random timer mechanism is given to avoid feedback implosion during the accurate measurement. The protocol is simple to implement. Simulations indicate that FLMCC shows good TCP-friendliness, responsiveness, and intra-protocol fairness, and provides high link utilization.
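The receiver-side GAIMD window update is simple to state. The sketch below uses the generic GAIMD parameters alpha (additive increase per loss-free round trip) and beta (multiplicative decrease on loss); the specific values FLMCC adopts are not given in the abstract, so the defaults here (TCP's alpha = 1, beta = 0.5) are illustrative only.

```python
def gaimd_update(cwnd, loss, alpha=1.0, beta=0.5):
    """One GAIMD step for a receiver-maintained congestion window:
    additive increase by alpha per round trip without loss,
    multiplicative decrease by factor beta when loss is detected.
    TCP-style AIMD is the special case alpha = 1, beta = 0.5."""
    return cwnd * beta if loss else cwnd + alpha

def expected_rate(cwnd, packet_size, rtt):
    """Expected throughput implied by the congestion window, used by the
    receiver to decide how many layers to subscribe to."""
    return cwnd * packet_size / rtt
```

The receiver never sends at this rate itself; it only compares `expected_rate` against the cumulative rates of the fixed-rate layers to pick a subscription level, which is what makes the scheme TCP-friendly without per-receiver rate adaptation at the sender.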
Zhang, Yong; Shi, Chaojun; Brennecke, Joan F; Maginn, Edward J
2014-06-12
A combined classical molecular dynamics (MD) and ab initio MD (AIMD) method was developed for the calculation of electrochemical windows (ECWs) of ionic liquids. In the method, the liquid phase of ionic liquid is explicitly sampled using classical MD. The electrochemical window, estimated by the energy difference between the highest occupied molecular orbital (HOMO) and lowest unoccupied molecular orbital (LUMO), is calculated at the density functional theory (DFT) level based on snapshots obtained from classical MD trajectories. The snapshots were relaxed using AIMD and quenched to their local energy minima, which assures that the HOMO/LUMO calculations are based on stable configurations on the same potential energy surface. The new procedure was applied to a group of ionic liquids for which the ECWs were also experimentally measured in a self-consistent manner. It was found that the predicted ECWs not only agree with the experimental trend very well but also the values are quantitatively accurate. The proposed method provides an efficient way to compare ECWs of ionic liquids in the same context, which has been difficult in experiments or simulation due to the fact that ECW values sensitively depend on experimental setup and conditions.
Yang, Peihua; Sun, Peng; Chai, Zhisheng; Huang, Langhuan; Cai, Xiang; Tan, Shaozao; Song, Jinhui; Mai, Wenjie
2014-10-27
Multifunctional glass windows that combine energy storage and electrochromism have been obtained by facile thermal evaporation and electrodeposition methods. For example, WO3 films that had been deposited on fluorine-doped tin oxide (FTO) glass exhibited a high specific capacitance of 639.8 F g(-1). Their color changed from transparent to deep blue with an abrupt decrease in optical transmittance from 91.3% to 15.1% at a wavelength of 633 nm when a voltage of -0.6 V (vs. Ag/AgCl) was applied, demonstrating its excellent energy-storage and electrochromism properties. As a second example, a polyaniline-based pseudocapacitive glass was also developed, and its color can change from green to blue. A large-scale pseudocapacitive WO3-based glass window (15×15 cm(2)) was fabricated as a prototype. Such smart pseudocapacitive glass windows show great potential in functioning as electrochromic windows and concurrently powering electronic devices, such as mobile phones or laptops. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Penalized maximum likelihood reconstruction for x-ray differential phase-contrast tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brendel, Bernhard, E-mail: bernhard.brendel@philips.com; Teuffenbach, Maximilian von; Noël, Peter B.
2016-01-15
Purpose: The purpose of this work is to propose a cost function with regularization to iteratively reconstruct attenuation, phase, and scatter images simultaneously from differential phase contrast (DPC) acquisitions, without the need of phase retrieval, and examine its properties. Furthermore this reconstruction method is applied to an acquisition pattern that is suitable for a DPC tomographic system with continuously rotating gantry (sliding window acquisition), overcoming the severe smearing in noniterative reconstruction. Methods: We derive a penalized maximum likelihood reconstruction algorithm to directly reconstruct attenuation, phase, and scatter images from the measured detector values of a DPC acquisition. The proposed penalty comprises, for each of the three images, an independent smoothing prior. Image quality of the proposed reconstruction is compared to images generated with FBP and iterative reconstruction after phase retrieval. Furthermore, the influence between the priors is analyzed. Finally, the proposed reconstruction algorithm is applied to experimental sliding window data acquired at a synchrotron and results are compared to reconstructions based on phase retrieval. Results: The results show that the proposed algorithm significantly increases image quality in comparison to reconstructions based on phase retrieval. No significant mutual influence between the proposed independent priors could be observed. Further it could be illustrated that the iterative reconstruction of a sliding window acquisition results in images with substantially reduced smearing artifacts. Conclusions: Although the proposed cost function is inherently nonconvex, it can be used to reconstruct images with less aliasing artifacts and less streak artifacts than reconstruction methods based on phase retrieval. Furthermore, the proposed method can be used to reconstruct images of sliding window acquisitions with negligible smearing artifacts.
Fully automatic time-window selection using machine learning for global adjoint tomography
NASA Astrophysics Data System (ADS)
Chen, Y.; Hill, J.; Lei, W.; Lefebvre, M. P.; Bozdag, E.; Komatitsch, D.; Tromp, J.
2017-12-01
Selecting time windows from seismograms such that the synthetic measurements (from simulations) and measured observations are sufficiently close is indispensable in a global adjoint tomography framework. The increasing amount of seismic data collected everyday around the world demands "intelligent" algorithms for seismic window selection. While the traditional FLEXWIN algorithm can be "automatic" to some extent, it still requires both human input and human knowledge or experience, and thus is not deemed to be fully automatic. The goal of intelligent window selection is to automatically select windows based on a learnt engine that is built upon a huge number of existing windows generated through the adjoint tomography project. We have formulated the automatic window selection problem as a classification problem. All possible misfit calculation windows are classified as either usable or unusable. Given a large number of windows with a known selection mode (select or not select), we train a neural network to predict the selection mode of an arbitrary input window. Currently, the five features we extract from the windows are its cross-correlation value, cross-correlation time lag, amplitude ratio between observed and synthetic data, window length, and minimum STA/LTA value. More features can be included in the future. We use these features to characterize each window for training a multilayer perceptron neural network (MPNN). Training the MPNN is equivalent to solve a non-linear optimization problem. We use backward propagation to derive the gradient of the loss function with respect to the weighting matrices and bias vectors and use the mini-batch stochastic gradient method to iteratively optimize the MPNN. 
Numerical tests show that with a careful selection of the training data and a sufficient amount of training data, we are able to train a robust neural network that is capable of detecting the waveforms in arbitrary earthquake data with negligible detection error compared to existing selection methods (e.g., FLEXWIN). We will introduce in detail the mathematical formulation of the window-selection-oriented MPNN and show very encouraging results when applying the new algorithm to real earthquake data.
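The five window features named above are all cheap to compute. The following sketch uses simple textbook definitions (normalized cross-correlation over integer lags, peak-amplitude ratio, mean-absolute-value STA/LTA); the exact conventions used in the adjoint-tomography workflow may differ, so treat this as an illustration of the feature vector fed to the classifier, not the project's code.

```python
import math

def window_features(obs, syn, sta=5, lta=20):
    """Five features for one candidate window: normalized
    cross-correlation, its time lag, observed/synthetic amplitude ratio,
    window length, and minimum STA/LTA of the observed trace.
    Assumes len(obs) == len(syn) >= lta."""
    n = len(obs)
    norm = math.sqrt(sum(o * o for o in obs) * sum(s * s for s in syn)) or 1.0
    best_cc, best_lag = -2.0, 0
    for lag in range(-n + 1, n):
        cc = sum(obs[i] * syn[i - lag]
                 for i in range(n) if 0 <= i - lag < n) / norm
        if cc > best_cc:
            best_cc, best_lag = cc, lag
    amp_ratio = max(abs(o) for o in obs) / (max(abs(s) for s in syn) or 1.0)
    env = [abs(o) for o in obs]          # crude envelope for STA/LTA
    ratios = []
    for i in range(lta, n + 1):
        sta_avg = sum(env[i - sta:i]) / sta
        lta_avg = sum(env[i - lta:i]) / lta or 1e-30
        ratios.append(sta_avg / lta_avg)
    return best_cc, best_lag, amp_ratio, n, min(ratios)
```

Each candidate window becomes one such five-component vector, and the MPNN is trained to map these vectors to a select/reject label.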
Prediction on sunspot activity based on fuzzy information granulation and support vector machine
NASA Astrophysics Data System (ADS)
Peng, Lingling; Yan, Haisheng; Yang, Zhigang
2018-04-01
In order to analyze the range of sunspots, a combined prediction method for forecasting the fluctuation range of sunspots based on fuzzy information granulation (FIG) and support vector machine (SVM) was put forward. Firstly, FIG is employed to granulate the sample data and extract valid information from each window, namely the minimum value, the general average value, and the maximum value of each window. Secondly, a forecasting model is built for each of these with SVM, and cross-validation is used to optimize the model parameters. Finally, the fluctuation range of sunspots is forecasted with the optimized SVM model. A case study demonstrates that the model has high accuracy and can effectively predict the fluctuation of sunspots.
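The granulation step reduces each window of the series to a (minimum, average, maximum) triple, and it is these three derived series that the SVM models are trained on. A minimal sketch, assuming non-overlapping windows:

```python
def granulate(series, width):
    """Fuzzy-information-granulation-style windowing: each non-overlapping
    window of `width` points is summarized by its minimum, average, and
    maximum value. Returns one (min, mean, max) triple per full window;
    a trailing partial window is dropped."""
    grains = []
    for i in range(0, len(series) - width + 1, width):
        w = series[i:i + width]
        grains.append((min(w), sum(w) / len(w), max(w)))
    return grains
```

Forecasting the min and max series then yields the predicted fluctuation range, rather than a single point forecast.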
1990-12-01
[Menu outline: keys; Executing PBPKSIM; Main Menu; File Selection; Data; simulation; All; sTatistics; Change directory; dos Shell; eXit; Data...] The windows of the PBPKSIM program are based upon a common design with five regions: Title, Menu Bar, Information Line, Main Display Area, and Status Area. Title shows the location in the program by supplying the name of the window being executed. Menu Bar displays the other windows or other
Phase locking route behind complex periodic windows in a forced oscillator
NASA Astrophysics Data System (ADS)
Jan, Hengtai; Tsai, Kuo-Ting; Kuo, Li-wei
2013-09-01
Chaotic systems have complex reactions to an external driving force; even for low-dimensional oscillators, the routes to synchronization are diverse. We proposed a stroboscope-based method for analyzing driven chaotic systems in their phase space. From two statistical quantities generated from the time series, we could determine the system state and the driving behavior simultaneously. We demonstrated our method on a driven bi-stable system, which showed complex periodic windows under a proper driving force. With increasing periodic driving force, a route from interior periodic oscillation to phase synchronization through the chaotic state could be found. Periodic windows could also be identified and the circumstances under which they occurred distinguished. The statistical results were supported by conditional Lyapunov exponent analysis, showing the method's power in analyzing unknown time series.
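The stroboscopic idea is to sample the trajectory once per driving period, so that motion phase-locked to the drive collapses onto a small set of points while unlocked or chaotic motion scatters. The two statistical quantities used in the paper are not specified in the abstract; the sketch below only shows the sampling step itself, under the assumption of a uniformly sampled trajectory.

```python
def strobe(xs, dt, period):
    """Stroboscopic section of a uniformly sampled trajectory: keep the
    first sample at or after each integer multiple of the driving period.
    Tight clustering of the returned points indicates phase locking to
    the drive; a spread-out cloud indicates unlocked or chaotic motion."""
    out = []
    t_target = period
    for i, x in enumerate(xs):
        if i * dt >= t_target:
            out.append(x)
            t_target += period
    return out
```

Statistics computed over these strobed points (for instance their dispersion) then distinguish locked from unlocked regimes.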
Platform for Postprocessing Waveform-Based NDE
NASA Technical Reports Server (NTRS)
Roth, Don
2008-01-01
Taking advantage of the similarities that exist among all waveform-based non-destructive evaluation (NDE) methods, a common software platform has been developed containing multiple-signal and image-processing techniques for waveforms and images. The NASA NDE Signal and Image Processing software has been developed using the latest versions of LabVIEW and its associated Advanced Signal Processing and Vision Toolkits. The software is usable on a PC with Windows XP and Windows Vista. The software has been designed with a commercial-grade interface in which two main windows, the Waveform Window and the Image Window, are displayed if the user chooses a waveform file to display. Within these two main windows, most actions are chosen through logically conceived run-time menus. The Waveform Window has plots for both the raw time-domain waves and their frequency-domain transformations (fast Fourier transform and power spectral density). The Image Window shows the C-scan image formed from information in the time-domain waveform (such as peak amplitude) or its frequency-domain transformation at each scan location. The user also has the ability to open an image, a series of images, or a simple X-Y paired data set in text format. Each of the Waveform and Image Windows contains menus from which to perform many user actions. An option exists to use raw waves obtained directly from a scan, or waves after deconvolution if the system wave response is provided. Two types of deconvolution, time-based subtraction or inverse-filter, can be performed to arrive at a deconvolved wave set. Additionally, the menu on the Waveform Window allows preprocessing of waveforms prior to image formation, scaling and display of waveforms, formation of different types of images (including non-standard types such as velocity), gating of portions of waves prior to image formation, and several other miscellaneous and specialized operations.
The menu available on the Image Window allows many further image processing and analysis operations, some of which are found in commercially available image-processing software (such as Adobe Photoshop) and some of which are not (removing outliers, B-scan information, region-of-interest analysis, line profiles, and precision feature measurements).
2015-01-01
Retinal fundus images are widely used in diagnosing and providing treatment for several eye diseases. Prior works using retinal fundus images detected the presence of exudation with the aid of a publicly available dataset using an extensive segmentation process. Though computationally efficient, these works failed to create a diabetic retinopathy feature selection system for transparently diagnosing the disease state. The diagnosis also did not employ machine learning methods to categorize candidate fundus images by true positive and true negative ratio, and did not include a more detailed feature selection technique for diabetic retinopathy. To apply machine learning methods and classify candidate fundus images on the basis of a sliding window, a method called Diabetic Fundus Image Recuperation (DFIR) is designed in this paper. The initial phase of the DFIR method selects the features of the optic cup in digital retinal fundus images based on a sliding window approach, and with this the disease state for diabetic retinopathy is assessed. Feature selection in the DFIR method uses a collection of sliding windows to obtain features based on histogram values. The histogram-based feature selection, with the aid of a Group Sparsity Non-overlapping function, provides more detailed feature information. Using a Support Vector Model in the second phase, the DFIR method, based on a Spiral Basis Function, effectively ranks the diabetic retinopathy disease levels. The ranking of the disease level for each candidate set provides a promising basis for developing a practically automated diabetic retinopathy diagnosis system. Experimental work on digital fundus images using the DFIR method evaluates factors such as sensitivity, specificity rate, ranking efficiency, and feature selection time. PMID:25974230
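The sliding-window histogram features can be sketched generically. This is an illustrative stand-in for the DFIR feature-selection step, with assumed bin edges and window size; the Group Sparsity Non-overlapping function and the later SVM ranking stage are not reproduced.

```python
def histogram_features(values, window, bins, lo, hi):
    """Sliding-window histogram features over a 1-D sequence of pixel
    values: for each window position the values are binned into a
    normalized histogram, giving one feature vector per window.
    Values are assumed to lie in [lo, hi); the top edge is clamped."""
    feats = []
    for i in range(0, len(values) - window + 1):
        w = values[i:i + window]
        h = [0] * bins
        for v in w:
            b = min(bins - 1, int((v - lo) / (hi - lo) * bins))
            h[b] += 1
        feats.append([c / window for c in h])
    return feats
```

In the full method, two-dimensional windows slide over the fundus image and the resulting histogram vectors are what the feature-selection and classification stages consume.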
Time Series Analysis Based on Running Mann Whitney Z Statistics
USDA-ARS?s Scientific Manuscript database
A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
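A minimal version of the running statistic: compute the Mann-Whitney U of a leading window against a trailing window, then convert U to Z. The closed-form large-sample normal approximation is used here for the normalization step (the manuscript's Monte Carlo normalization is not reproduced).

```python
import math

def mann_whitney_z(before, after):
    """Mann-Whitney U of `after` relative to `before`, normalized to a Z
    statistic with the large-sample normal approximation:
    Z = (U - mu) / sigma, mu = n1*n2/2, sigma = sqrt(n1*n2*(n1+n2+1)/12).
    Ties contribute 1/2 to U. Large positive Z means `after` ranks above
    `before`."""
    n1, n2 = len(before), len(after)
    u = sum(1.0 if a > b else 0.5 if a == b else 0.0
            for a in after for b in before)
    mu = n1 * n2 / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    return (u - mu) / sigma

def running_z(series, half):
    """Slide two adjacent windows of `half` points along the series and
    return the Z statistic at each position; step changes in the series
    show up as excursions in the Z trace."""
    return [mann_whitney_z(series[i - half:i], series[i:i + half])
            for i in range(half, len(series) - half + 1)]
```

Because the statistic depends only on ranks, the Z trace responds to shifts in level without assuming any particular distribution for the data.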
Wu, Tiee-Jian; Huang, Ying-Hsueh; Li, Lung-An
2005-11-15
Several measures of DNA sequence dissimilarity have been developed. The purpose of this paper is 3-fold. Firstly, we compare the performance of several word-based and alignment-based methods. Secondly, we give a general guideline for choosing the window size and determining the optimal word sizes for several word-based measures at different window sizes. Thirdly, we use a large-scale simulation method to simulate data from the distribution of SK-LD (symmetric Kullback-Leibler discrepancy). These simulated data can be used to estimate the degree of dissimilarity beta between any pair of DNA sequences. Our study shows (1) for whole-sequence similarity/dissimilarity identification the window size taken should be as large as possible, but probably not >3000, as restricted by CPU time in practice, (2) for each measure the optimal word size increases with window size, (3) when the optimal word size is used, SK-LD performance is superior in both simulation and real data analysis, (4) the estimate of beta based on SK-LD can be used to quickly filter out a large number of dissimilar sequences and speed up alignment-based database searches for similar sequences, and (5) the estimate is also applicable in local similarity comparison situations. For example, it can help in selecting oligo probes with high specificity and, therefore, has potential in probe design for microarrays. The algorithm SK-LD, the estimator of beta, and the simulation software are implemented in MATLAB code, and are available at http://www.stat.ncku.edu.tw/tjwu
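A word-based symmetric Kullback-Leibler discrepancy can be sketched from k-mer frequency profiles. Add-one smoothing is an assumption made here so that every observed word has nonzero probability under both profiles; the authors' exact SK-LD estimator and their estimate of beta are not reproduced.

```python
import math

def skld(seq1, seq2, k=2):
    """Symmetric Kullback-Leibler discrepancy between the k-word (k-mer)
    frequency profiles of two sequences, computed as the symmetrized sum
    (p - q) * log(p / q) over all words seen in either sequence, with
    add-one smoothing. Zero iff the two profiles are identical."""
    def profile(s):
        counts = {}
        for i in range(len(s) - k + 1):
            w = s[i:i + k]
            counts[w] = counts.get(w, 0) + 1
        return counts
    p, q = profile(seq1), profile(seq2)
    words = set(p) | set(q)
    total_p = sum(p.values()) + len(words)
    total_q = sum(q.values()) + len(words)
    d = 0.0
    for w in words:
        pw = (p.get(w, 0) + 1) / total_p
        qw = (q.get(w, 0) + 1) / total_q
        d += (pw - qw) * math.log(pw / qw)
    return d
```

Because every term (p − q)·log(p/q) is nonnegative, the measure behaves as a true discrepancy: larger values mean more dissimilar word usage.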
Shen, Chong; Li, Jie; Zhang, Xiaoming; Shi, Yunbo; Tang, Jun; Cao, Huiliang; Liu, Jun
2016-01-01
The different noise components in a dual-mass micro-electromechanical system (MEMS) gyroscope structure are analyzed in this paper, including mechanical-thermal noise (MTN), electronic-thermal noise (ETN), flicker noise (FN) and Coriolis signal in-phase noise (IPN). The structure-equivalent electronic model is established, and an improved white Gaussian noise reduction method for dual-mass MEMS gyroscopes is proposed, based on sample entropy empirical mode decomposition (SEEMD) and time-frequency peak filtering (TFPF). There is a contradiction in TFPF: selecting a short window length may lead to good preservation of signal amplitude but poor random noise reduction, whereas selecting a long window length may lead to serious attenuation of the signal amplitude but effective random noise reduction. In order to achieve a good tradeoff between valid signal amplitude preservation and random noise reduction, SEEMD is adopted to improve TFPF. Firstly, the original signal is decomposed into intrinsic mode functions (IMFs) by EMD, and the sample entropy of each IMF is calculated in order to classify the numerous IMFs into three different components; then short-window TFPF is employed for the low-frequency component of the IMFs, long-window TFPF is employed for the high-frequency component, and the noise component is discarded directly; finally, the de-noised signal is obtained after reconstruction. Rotation and temperature experiments were carried out to verify the proposed SEEMD-TFPF algorithm; the verification and comparison results show that the de-noising performance of SEEMD-TFPF is better than that achievable with traditional wavelet, Kalman filter, and fixed-window-length TFPF methods. PMID:27258276
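The sample entropy used to classify the IMFs can be computed directly from its definition: the negative log of the ratio between the number of length-(m+1) template matches and length-m template matches within tolerance r. A sketch with assumed defaults m = 2 and r = 0.2 in the units of the input signal (in practice r is usually scaled by the signal's standard deviation):

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy of a series: -log(A/B), where B counts pairs of
    length-m templates and A counts pairs of length-(m+1) templates that
    match within Chebyshev distance r. Low values indicate a regular,
    self-similar series; high values indicate an irregular one. O(n^2),
    fine for short IMF segments."""
    def count(mm):
        n = len(x) - mm + 1
        c = 0
        for i in range(n):
            for j in range(i + 1, n):
                if max(abs(x[i + t] - x[j + t]) for t in range(mm)) <= r:
                    c += 1
        return c
    b, a = count(m), count(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")
```

Ranking the IMFs by this value is what lets the method route the regular (signal-dominated) components to short-window TFPF and the irregular (noise-dominated) ones to long-window TFPF or outright removal.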
A revised load estimation procedure for the Susquehanna, Potomac, Patuxent, and Choptank rivers
Yochum, Steven E.
2000-01-01
The U.S. Geological Survey's Chesapeake Bay River Input Program has updated the nutrient and suspended-sediment load data base for the Susquehanna, Potomac, Patuxent, and Choptank Rivers using a multiple-window, center-estimate regression methodology. The revised method optimizes the seven-parameter regression approach that has been used historically by the program. The revised method estimates load using the fifth or center year of a sliding 9-year window. Each year a new model is run for each site and constituent, the most recent year is added, and the previous 4 years of estimates are updated. The fifth year in the 9-year window is considered the best estimate and is kept in the data base. The last year of estimation shows the most change from the previous year's estimate and this change approaches a minimum at the fifth year. Differences between loads computed using this revised methodology and the loads populating the historical data base have been noted but the load estimates do not typically change drastically. The data base resulting from the application of this revised methodology is populated by annual and monthly load estimates that are known with greater certainty than in the previous load data base.
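The sliding-window, center-estimate bookkeeping can be sketched as follows. This is only a schematic: a window mean stands in for the program's seven-parameter regression, and the function name `center_estimates` is invented for illustration.

```python
def center_estimates(annual_values, window=9):
    """For each complete sliding window, fit a (stand-in) model and keep
    the estimate for the center (fifth) year; later windows re-estimate
    and overwrite the more recent, less certain years."""
    half = window // 2
    estimates = {}
    for start in range(len(annual_values) - window + 1):
        block = annual_values[start:start + window]
        fit = sum(block) / len(block)   # placeholder for the 7-parameter regression
        estimates[start + half] = fit   # the center year is kept as the best estimate
    return estimates

# Eleven years of hypothetical annual loads -> center estimates for years 4-6.
loads = [10, 12, 11, 13, 14, 12, 15, 16, 14, 13, 15]
est = center_estimates(loads)
```

Each new year of data slides the window forward one step, so the estimate for a given year settles once that year becomes the window center.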
Wang, Shenghao; Zhang, Yuyan; Cao, Fuyi; Pei, Zhenying; Gao, Xuewei; Zhang, Xu; Zhao, Yong
2018-02-13
This paper presents a novel spectrum analysis tool named synergy adaptive moving window modeling based on immune clone algorithm (SA-MWM-ICA), motivated by the tedious and inconvenient labor involved in selecting pre-processing methods and spectral variables by prior experience. In this work, the immune clone algorithm is introduced into the spectrum analysis field as a new optimization strategy, addressing the shortcomings of traditional methods. Following the working principle of the human immune system, the performance of the quantitative model is regarded as the antigen, and a special vector corresponding to this antigen is regarded as the antibody. The antibody contains a pre-processing method optimization region encoded by 11 decimal digits, and a spectrum variable optimization region formed by moving windows with changeable width and position. A set of original antibodies is created by modeling with this algorithm. After the affinity of these antibodies is calculated, those with high affinity are selected for cloning; the higher the affinity, the more copies are made. In the next step, another important operation, hyper-mutation, is applied to the cloned antibodies; here, the lower the affinity, the higher the mutation probability. Several antibodies with high affinity are created through these steps. Groups of simulated data, gasoline near-infrared spectra, and soil near-infrared spectra are employed to verify and illustrate the performance of SA-MWM-ICA.
Analysis results show that the quantitative models built by SA-MWM-ICA perform better than traditional models such as partial least squares (PLS), moving window PLS (MWPLS), genetic algorithm PLS (GAPLS), and pretreatment method classification and adjustable parameter changeable size moving window PLS (CA-CSMWPLS), especially for relatively complex spectra. The selected pre-processing methods and spectrum variables are easily interpreted. The proposed method converges in a few generations and can be used not only for near-infrared spectroscopy but also for similar spectral analyses, such as infrared spectroscopy. Copyright © 2017 Elsevier B.V. All rights reserved.
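The clonal-selection loop at the heart of an immune clone algorithm (clone count proportional to affinity, mutation strength inversely proportional to it) can be sketched on a toy one-dimensional problem. This is a minimal illustration of the principle only, not the SA-MWM-ICA antibody encoding; all names and constants below are assumptions.

```python
import random

def clonal_selection(affinity, init_pop, generations=60, seed=1):
    """Minimal clonal-selection loop: antibodies with higher affinity receive
    more clones, and clones of lower-ranked antibodies are mutated more."""
    rng = random.Random(seed)
    pop = list(init_pop)
    for _ in range(generations):
        pop.sort(key=affinity, reverse=True)
        pop = pop[:len(init_pop)]             # select the best antibodies
        clones = []
        for rank, ab in enumerate(pop):
            n_clones = len(pop) - rank        # higher affinity -> more copies
            sigma = 0.1 * (rank + 1)          # lower affinity -> stronger mutation
            clones += [ab + rng.gauss(0.0, sigma) for _ in range(n_clones)]
        pop += clones                         # parents survive: the best never worsens
    return max(pop, key=affinity)

# Toy antigen: model "performance" peaks at x = 2.
best = clonal_selection(lambda x: -(x - 2.0) ** 2, [0.0, 5.0, -3.0, 10.0])
```

In SA-MWM-ICA the scalar `ab` would instead be the composite antibody vector (11 pre-processing digits plus moving-window positions and widths), with affinity given by the quantitative model's performance.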
Chest CT window settings with multiscale adaptive histogram equalization: pilot study.
Fayad, Laura M; Jin, Yinpeng; Laine, Andrew F; Berkmen, Yahya M; Pearson, Gregory D; Freedman, Benjamin; Van Heertum, Ronald
2002-06-01
Multiscale adaptive histogram equalization (MAHE), a wavelet-based algorithm, was investigated as a method of automatic simultaneous display of the full dynamic contrast range of a computed tomographic image. Interpretation times were significantly lower for MAHE-enhanced images compared with those for conventionally displayed images. Diagnostic accuracy, however, was insufficient in this pilot study to allow recommendation of MAHE as a replacement for conventional window display.
Multiple-instance ensemble learning for hyperspectral images
NASA Astrophysics Data System (ADS)
Ergul, Ugur; Bilgin, Gokhan
2017-10-01
An ensemble framework for multiple-instance (MI) learning (MIL) is introduced for use in hyperspectral images (HSIs), inspired by the bagging (bootstrap aggregation) method in ensemble learning. Ensemble-based bagging is performed with a small percentage of training samples, and MI bags are formed by a local windowing process with variable window sizes on the selected instances. In addition to bootstrap aggregation, random subspace selection is used to further diversify the base classifiers. The proposed method is implemented using four MIL classification algorithms. The classifier model learning phase is carried out with MI bags, and the estimation phase is performed on single test instances. In the experimental part of the study, two different HSIs with ground-truth information are used, and comparative results are presented against state-of-the-art classification methods. In general, the MI ensemble approach produces more compact results in terms of both diversity and error compared to equivalent non-MIL algorithms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, L; Shen, C; Wang, J
Purpose: To reduce cone beam CT (CBCT) imaging dose, we previously proposed a progressive dose control (PDC) scheme to employ temporal correlation between CBCT images at different fractions for image quality enhancement. A temporal non-local means (TNLM) method was developed to enhance the quality of a new low-dose CBCT using an existing high-quality CBCT. To enhance a voxel value, the TNLM method searches for similar voxels in a window. Due to patient deformation between the two CBCTs, a large search window was required, reducing image quality and computational efficiency. This abstract proposes a deformation-assisted TNLM (DA-TNLM) method to solve this problem. Methods: For a low-dose CBCT to be enhanced using a high-quality CBCT, we first performed deformable image registration between the low-dose CBCT and the high-quality CBCT to approximately establish voxel correspondence between the two. A search window for each voxel was then set based on the deformation vector field; specifically, the search window for each voxel was shifted by the deformation vector. A TNLM step was then applied using only voxels within this window to correct image intensity in the low-dose CBCT. Results: We tested the proposed scheme on simulated CIRS phantom data and real patient data. The CIRS phantom was scanned on a Varian on-board imaging CBCT system with couch shifts and dose reduction at each scan. The real patient data were acquired in four fractions with dose reduced from the standard CBCT dose to 12.5% of the standard dose. It was found that the DA-TNLM method can reduce total dose by over 75% on average in the first four fractions. Conclusion: We have developed a PDC scheme which can enhance the quality of images scanned at low dose using a DA-TNLM method. Tests in phantom and patient studies demonstrated promising results.
Okamoto, Takuma; Sakaguchi, Atsushi
2017-03-01
Generating acoustically bright and dark zones using loudspeakers is gaining attention as one of the most important acoustic communication techniques for such uses as personal sound systems and multilingual guide services. Although most conventional methods are based on numerical solutions, an analytical approach based on the spatial Fourier transform with a linear loudspeaker array has been proposed, and its effectiveness over conventional acoustic energy difference maximization has been demonstrated in computer simulations. To establish the effectiveness of the proposed approach in actual environments, this paper presents an experimental validation of the approach with rectangular and Hann windows and compares it with three conventional methods: simple delay-and-sum beamforming, contrast maximization, and least-squares-based pressure matching, using an actually implemented linear array of 64 loudspeakers in an anechoic chamber. The results of both the computer simulations and the actual experiments show that the proposed approach with a Hann window controls the bright and dark zones more accurately than the conventional methods.
Windowed Multitaper Correlation Analysis of Multimodal Brain Monitoring Parameters
Proescholdt, Martin A.; Bele, Sylvia; Brawanski, Alexander
2015-01-01
Although multimodal monitoring sets the standard in daily practice of neurocritical care, problem-oriented analysis tools to interpret the huge amount of data are lacking. Recently a mathematical model was presented that simulates the cerebral perfusion and oxygen supply in case of a severe head trauma, predicting the appearance of distinct correlations between arterial blood pressure and intracranial pressure. In this study we present a set of mathematical tools that reliably detect the predicted correlations in data recorded at a neurocritical care unit. The time-resolved correlations are identified by a windowing technique combined with Fourier-based coherence calculations. The phasing of the data is detected by means of the Hilbert phase difference within the above-mentioned windows. A statistical testing method is introduced that allows tuning the parameters of the windowing method in such a way that a predefined accuracy is reached. With this method the data of fifteen patients were examined, and the predicted correlation was found in each patient. Additionally it could be shown that the occurrence of a distinct correlation parameter, called scp, is a high-quality predictor of patient outcome. PMID:25821507
A novel configurable VLSI architecture design of window-based image processing method
NASA Astrophysics Data System (ADS)
Zhao, Hui; Sang, Hongshi; Shen, Xubang
2018-03-01
Most window-based image processing architectures can only implement a specific kind of algorithm, such as 2D convolution, and therefore lack flexibility and breadth of application. In addition, improper handling of the image boundary can cause loss of accuracy or consume more logic resources. To address these problems, this paper proposes a new VLSI architecture for window-based image processing operations, which is configurable and designed with the image boundary in mind. An efficient technique is explored to manage the image borders by overlapping and flushing phases at the end of each row and the end of each frame, which introduces no additional delay and reduces overhead in real-time applications. Reuse of on-chip memory data is maximized in order to reduce hardware complexity and external bandwidth requirements. Different scalar and reduction function operations can be performed in a pipeline, supporting a variety of window-based image processing applications. The performance of the new structure is comparable to some reported structures and superior to others; in particular, compared with the systolic array processor CWP, this structure achieves approximately a 12.9% speed increase at the same frequency. The proposed parallel VLSI architecture was implemented in 0.18-μm CMOS technology; the maximum clock frequency, power consumption, and area are 125 MHz, 57 mW, and 104.8K gates, respectively. Furthermore, the processing time is independent of the particular window-based algorithm mapped to the structure.
Fu, Hai-Yan; Guo, Jun-Wei; Yu, Yong-Jie; Li, He-Dong; Cui, Hua-Peng; Liu, Ping-Ping; Wang, Bing; Wang, Sheng; Lu, Peng
2016-06-24
Peak detection is a critical step in chromatographic data analysis. In the present work, we developed a multi-scale Gaussian smoothing-based strategy for accurate peak extraction. The strategy consisted of three stages: background drift correction, peak detection, and peak filtration. Background drift correction was implemented using a moving window strategy. The new peak detection method is a variant of the system used by the well-known MassSpecWavelet, i.e., chromatographic peaks are found at local maximum values under various smoothing window scales. Therefore, peaks can be detected through the ridge lines of maximum values under these window scales, and signals that are monotonously increased/decreased around the peak position could be treated as part of the peak. Instrumental noise was estimated after peak elimination, and a peak filtration strategy was performed to remove peaks with signal-to-noise ratios smaller than 3. The performance of our method was evaluated using two complex datasets. These datasets include essential oil samples for quality control obtained from gas chromatography and tobacco plant samples for metabolic profiling analysis obtained from gas chromatography coupled with mass spectrometry. Results confirmed the reasonability of the developed method. Copyright © 2016 Elsevier B.V. All rights reserved.
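The multi-scale idea, that a true chromatographic peak remains a local maximum under several smoothing window scales while noise does not, can be sketched as follows. This is a simplified stand-in for the ridge-line tracking described above; the scale set, the persistence tolerance, and all function names are illustrative assumptions.

```python
import math

def gaussian_kernel(sigma):
    half = int(3 * sigma)
    k = [math.exp(-(i * i) / (2.0 * sigma * sigma)) for i in range(-half, half + 1)]
    s = sum(k)
    return [v / s for v in k]

def smooth(x, sigma):
    k = gaussian_kernel(sigma)
    half = len(k) // 2
    n = len(x)
    out = []
    for i in range(n):
        acc = 0.0
        for j, w in enumerate(k):
            idx = min(max(i + j - half, 0), n - 1)  # clamp at the borders
            acc += w * x[idx]
        out.append(acc)
    return out

def detect_peaks(x, sigmas=(1.0, 2.0, 4.0), tol=2):
    """Keep local maxima that persist (within +/- tol points) at every scale."""
    maxima_per_scale = []
    for s in sigmas:
        y = smooth(x, s)
        maxima_per_scale.append(
            {i for i in range(1, len(y) - 1) if y[i - 1] < y[i] >= y[i + 1]})
    return sorted(i for i in maxima_per_scale[0]
                  if all(any(abs(i - j) <= tol for j in m)
                         for m in maxima_per_scale[1:]))

# Synthetic chromatogram: one Gaussian peak at index 50 on a flat baseline.
signal = [0.1 + math.exp(-((i - 50) ** 2) / 18.0) for i in range(100)]
```

A full pipeline would precede this with the moving-window background drift correction and follow it with the S/N >= 3 peak filtration step described in the abstract.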
NASA Astrophysics Data System (ADS)
Jin, Tao; Chen, Yiyang; Flesch, Rodolfo C. C.
2017-11-01
Harmonics pose a great threat to the safe and economical operation of power grids. Therefore, it is critical to detect harmonic parameters accurately in order to design harmonic compensation equipment. The fast Fourier transform (FFT) is widely used for power system harmonic analysis. However, the barrier (picket-fence) effect produced by the algorithm itself and the spectrum leakage caused by asynchronous sampling often affect the harmonic analysis accuracy. This paper examines a new approach to harmonic analysis based on deriving correction formulas for frequency, phase angle, and amplitude, utilizing the Nuttall-Kaiser window double-spectrum-line interpolation method, which overcomes the shortcomings of traditional FFT harmonic calculations. The proposed approach is verified numerically and experimentally to be accurate and reliable.
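The double-spectrum-line idea can be illustrated with the classical Hann-window two-line interpolation, whose correction formula delta = (2|X(k+1)| - |X(k)|)/(|X(k)| + |X(k+1)|) is well known; the Nuttall-Kaiser window follows the same scheme with different correction coefficients, which are not reproduced here. The sketch uses a plain O(N^2) DFT for self-containment.

```python
import cmath
import math

def hann_two_line_freq(x, fs):
    """Estimate a tone frequency from a Hann-windowed DFT with the classic
    two-largest-bin interpolation delta = (2*b - a) / (a + b)."""
    n = len(x)
    w = [0.5 - 0.5 * math.cos(2 * math.pi * i / n) for i in range(n)]
    xw = [x[i] * w[i] for i in range(n)]
    spec = [abs(sum(xw[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n)))
            for k in range(n // 2)]          # plain DFT magnitude, O(N^2) for clarity
    k = max(range(1, n // 2 - 1), key=lambda i: spec[i])
    a, b = spec[k], spec[k + 1]
    if spec[k - 1] > b:                      # the true peak lies below bin k
        a, b, k = spec[k - 1], spec[k], k - 1
    delta = (2 * b - a) / (a + b)            # Hann two-line correction, 0 <= delta < 1
    return (k + delta) * fs / n

# Asynchronously sampled tone: 50.3 Hz does not fall on the 4 Hz bin grid.
fs, n, f0 = 1024.0, 256, 50.3
x = [math.sin(2 * math.pi * f0 * t / fs) for t in range(n)]
```

Interpolating between the two largest windowed bins recovers the off-grid frequency far more accurately than taking the peak bin alone, which is the point of the double-line method.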
PARAMS is a Windows-based computer program that implements 30 methods for estimating the parameters in indoor emissions source models, which are an essential component of indoor air quality (IAQ) and exposure models. These methods fall into eight categories: (1) the properties o...
Using Parameters of Dynamic Pulse Function for 3d Modeling in LOD3 Based on Random Textures
NASA Astrophysics Data System (ADS)
Alizadehashrafi, B.
2015-12-01
The pulse function (PF) is a technique based on a procedural preprocessing system that generates a computerized virtual photo of the façade within a fixed-size square (Alizadehashrafi et al., 2009, Musliman et al., 2010). The Dynamic Pulse Function (DPF) is an enhanced version of PF which can create the final photo proportional to the real geometry. This avoids distortion when projecting the computerized photo onto the generated 3D model (Alizadehashrafi and Rahman, 2013). Producing the 3D model in LOD3 rather than LOD2 is the challenge addressed in this paper. In the DPF-based technique the geometries of the windows and doors are saved in an XML file schema which has no connection with the 3D model in LOD2 and CityGML format. In this research the parameters of the Dynamic Pulse Function are utilized via the Ruby programming language in Trimble SketchUp to generate the windows and doors (exact position and depth) automatically in LOD3, based on the same concept as DPF. The advantage of this technique is the automatic generation of a huge number of similar geometries, e.g. windows, by utilizing the parameters of DPF along with defined entities and window layers. If the SKP file is converted to CityGML via FME software or CityGML plugins, the 3D model contains the semantic database about the entities and window layers, which can connect the CityGML to MySQL (Alizadehashrafi and Baig, 2014). The concept behind DPF is to use logical operations to project the texture onto the background image in a way that is dynamically proportional to the real geometry. The projection is based on two dynamic pulses, vertical and horizontal, starting from the upper-left corner of the background wall in the down and right directions respectively, in the image coordinate system. A logical one/zero at the intersection of the vertical and horizontal dynamic pulses projects/does not project the texture onto the background image.
It is possible to define a priority for each layer. For instance, the priority of the door layer can be higher than that of the window layer, which means that the window texture cannot be projected onto the door layer. Orthogonal, rectified, perpendicular symmetric photos of the 3D objects, proportional to the real façade geometry, must be utilized to generate the output frame for DPF. The DPF produces very high quality output image files of quite small data size and dimension compared with the photorealistic texturing method. The disadvantage of DPF is that it is a preprocessing method that generates an output image file, rather than an online process that generates the texture within a 3D environment such as CityGML. Furthermore, the result of DPF can only be utilized for a 3D model in LOD2 rather than LOD3. In the current work the random textures of the window layers are created based on the parameters of DPF within the Ruby console of Trimble SketchUp, to generate the deeper geometries of the windows and their exact positions on the façade automatically, along with random textures to increase the Level of Realism (LoR) (Scarpino, 2010). As the output frame in DPF is proportional to the real geometry (height and width of the façade), it is possible to query the XML database and convert the values to units such as meters automatically. In this technique, the perpendicular terrestrial photo of the façade is rectified by employing a projective transformation based on a frame that is in fixed proportion to the real geometry. The rectified photos, which are not suitable for texturing but are necessary for measuring, can be resized in fixed proportion to the real geometry before the measuring process. The heights and widths of windows and doors, and the horizontal and vertical distances between windows measured from the upper-left corner of the photo, are the parameters that should be measured to run the program as a plugin in Trimble SketchUp.
The system can use these parameters, together with texture file names and file paths, to create the façade semi-automatically. To avoid leaning geometry, the textures of windows, doors, etc. should be cropped and rectified from perpendicular photos so that they can be used in the program to create the whole façade along with its geometries. Texture enhancement, such as removing disturbing objects, exposure adjustment, and left-right or up-down transformations, should be done in advance. In fact, the quality, small data size, scale, and semantic database for each façade are the prominent advantages of this method.
NASA Astrophysics Data System (ADS)
Li, C.; Zhou, X.; Tang, D.; Zhu, Z.
2018-04-01
Resolution and sidelobe level are mutually constrained in SAR imaging; sidelobe suppression is usually obtained at the cost of resolution. This paper provides a method for resolution enhancement that exploits the opposed sidelobe characteristics of the Hanning window and the SAR image, keeping high resolution while still suppressing sidelobes. Compared with the traditional method, this method can enhance resolution by 50% at a sidelobe level of -30 dB.
Windowed multipole for cross section Doppler broadening
NASA Astrophysics Data System (ADS)
Josey, C.; Ducru, P.; Forget, B.; Smith, K.
2016-02-01
This paper presents an in-depth analysis on the accuracy and performance of the windowed multipole Doppler broadening method. The basic theory behind cross section data is described, along with the basic multipole formalism followed by the approximations leading to windowed multipole method and the algorithm used to efficiently evaluate Doppler broadened cross sections. The method is tested by simulating the BEAVRS benchmark with a windowed multipole library composed of 70 nuclides. Accuracy of the method is demonstrated on a single assembly case where total neutron production rates and 238U capture rates compare within 0.1% to ACE format files at the same temperature. With regards to performance, clock cycle counts and cache misses were measured for single temperature ACE table lookup and for windowed multipole. The windowed multipole method was found to require 39.6% more clock cycles to evaluate, translating to a 7.9% performance loss overall. However, the algorithm has significantly better last-level cache performance, with 3 fewer misses per evaluation, or a 65% reduction in last-level misses. This is due to the small memory footprint of the windowed multipole method and better memory access pattern of the algorithm.
Characterizing artifacts in RR stress test time series.
Astudillo-Salinas, Fabian; Palacio-Baus, Kenneth; Solano-Quinde, Lizandro; Medina, Ruben; Wong, Sara
2016-08-01
Electrocardiographic stress test records contain many artifacts. In this paper we explore a simple method to characterize the amount of artifacts present in unprocessed RR stress test time series. Four time series classes were defined: Very good lead, Good lead, Low quality lead and Useless lead. Sixty-five 8-lead ECG stress test records were analyzed. Firstly, the RR time series were annotated by two experts. The automatic methodology is based on dividing the RR time series into non-overlapping windows. Each window is marked as noisy whenever it exceeds an established standard deviation threshold (SDT). Series are classified according to the percentage of windows that exceed a given value, based upon the first manual annotation. Different SDTs were explored. Results show that an SDT close to 20% (as a percentage of the mean) provides the best results. The agreement between the two annotators' classifications is 70.77%, whereas the agreement between the second annotator and the best-matching automatic method is larger than 63%. Leads classified as Very good leads and Good leads could be combined to improve automatic heartbeat labeling.
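The windowing-and-threshold scheme can be sketched directly. The SDT of 20% of the window mean follows the abstract; the per-class percentage cutoffs, window length, and function name below are invented for illustration, since the paper's exact values are not given here.

```python
def classify_rr_series(rr, window=30, sdt=0.20, limits=(0.05, 0.20, 0.50)):
    """Mark each non-overlapping window as noisy when its standard deviation
    exceeds sdt * mean, then classify the lead by the noisy-window fraction."""
    noisy = total = 0
    for start in range(0, len(rr) - window + 1, window):
        w = rr[start:start + window]
        mean = sum(w) / len(w)
        std = (sum((v - mean) ** 2 for v in w) / len(w)) ** 0.5
        total += 1
        if std > sdt * mean:
            noisy += 1
    frac = noisy / total
    labels = ("Very good lead", "Good lead", "Low quality lead", "Useless lead")
    for limit, label in zip(limits, labels):
        if frac <= limit:
            return label
    return labels[-1]

clean_rr = [0.80 + 0.01 * ((i % 5) - 2) for i in range(300)]  # ~0.8 s RR, small jitter
noisy_rr = [1.2 if i % 2 == 0 else 0.4 for i in range(300)]   # artifact-ridden series
```

A physiological RR series varies by only a few percent within a short window, so the 20%-of-mean threshold separates beat-to-beat variability from annotation-level artifacts.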
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shlivinski, A., E-mail: amirshli@ee.bgu.ac.il; Lomakin, V., E-mail: vlomakin@eng.ucsd.edu
2016-03-01
Scattering or coupling of an electromagnetic beam-field at a surface discontinuity separating two homogeneous or inhomogeneous media with different propagation characteristics is formulated using surface integral equations, which are solved by the Method of Moments with the aid of the Gabor-based Gaussian window frame set of basis and testing functions. The application of the Gaussian window frame provides (i) a mathematically exact and robust tool for spatial-spectral phase-space formulation and analysis of the problem; (ii) a system of linear equations in a transmission-line-like form relating mode-like wave objects of one medium with mode-like wave objects of the second medium; (iii) with an appropriate setting of the frame parameters, mode-like wave objects that blend plane-wave properties (as if solving in the spectral domain) with Green's function properties (as if solving in the spatial domain); and (iv) a representation of the scattered field with Gaussian-beam propagators that may be used in many large (in terms of wavelengths) systems.
NASA Astrophysics Data System (ADS)
Yao, Hua-Dong; Davidson, Lars
2018-03-01
We investigate the interior noise caused by turbulent flows past a generic side-view mirror. A rectangular glass window is placed downstream of the mirror. The window vibration is excited by the surface pressure fluctuations and emits the interior noise in a cuboid cavity. The turbulent flows are simulated using a compressible large eddy simulation method. The window vibration and interior noise are predicted with a finite element method. The wavenumber-frequency spectra of the surface pressure fluctuations are analyzed. The spectra are identified with some new features that cannot be explained by the Chase model for turbulent boundary layers. The spectra contain a minor hydrodynamic domain in addition to the hydrodynamic domain caused by the main convection of the turbulent boundary layer. The minor domain results from the local convection of the recirculating flow. These domains are formed in bent elliptic shapes. The spanwise expansion of the wake is found causing the bending. Based on the wavenumber-frequency relationships in the spectra, the surface pressure fluctuations are decomposed into hydrodynamic and acoustic components. The acoustic component is more efficient in the generation of the interior noise than the hydrodynamic component. However, the hydrodynamic component is still dominant at low frequencies below approximately 250 Hz since it has low transmission losses near the hydrodynamic critical frequency of the window. The structural modes of the window determine the low-frequency interior tonal noise. The combination of the mode shapes of the window and cavity greatly affects the magnitude distribution of the interior noise.
Formulation of ionic liquid electrolyte to expand the voltage window of supercapacitors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Aken, Katherine L.; Beidaghi, Majid; Gogotsi, Yury
We report an effective method to expand the operating potential window (OPW) of electrochemical capacitors based on formulating the ionic liquid (IL) electrolytes. Moreover, using model electrochemical cells based on two identical onion like carbon (OLC) electrodes and two different IL electrolytes and their mixtures, it was shown that the asymmetric behavior of the electrolyte’s cation and anion toward the two electrodes limits the OPW of the cell and therefore its energy density. Additionally, a general solution to this problem is proposed by formulating the IL electrolyte mixtures to balance the capacitance of electrodes in a symmetric supercapacitor.
Formulation of ionic liquid electrolyte to expand the voltage window of supercapacitors
Van Aken, Katherine L.; Beidaghi, Majid; Gogotsi, Yury
2015-03-18
We report an effective method to expand the operating potential window (OPW) of electrochemical capacitors based on formulating the ionic liquid (IL) electrolytes. Moreover, using model electrochemical cells based on two identical onion like carbon (OLC) electrodes and two different IL electrolytes and their mixtures, it was shown that the asymmetric behavior of the electrolyte’s cation and anion toward the two electrodes limits the OPW of the cell and therefore its energy density. Additionally, a general solution to this problem is proposed by formulating the IL electrolyte mixtures to balance the capacitance of electrodes in a symmetric supercapacitor.
Writers Identification Based on Multiple Windows Features Mining
NASA Astrophysics Data System (ADS)
Fadhil, Murad Saadi; Alkawaz, Mohammed Hazim; Rehman, Amjad; Saba, Tanzila
2016-03-01
Nowadays, writer identification, i.e., identifying the original writer of a script with high accuracy, is in high demand. One of the main challenges in writer identification is how to extract the discriminative features of different authors' scripts for precise classification. In this paper, an adaptive division method for offline Latin script has been implemented using several variant window sizes. From fragments of binarized text, a set of features is extracted and classified into clusters in the form of groups or classes. Finally, the proposed approach has been tested with various parameters in terms of text division and window sizes. It is observed that selection of the right window size yields a well-positioned window division. The proposed approach is tested on the IAM standard dataset (IAM, Institut für Informatik und angewandte Mathematik, University of Bern, Bern, Switzerland), which is a constraint-free script database. Finally, the achieved results are compared with several techniques reported in the literature.
Prediction of CpG-island function: CpG clustering vs. sliding-window methods
2010-01-01
Background Unmethylated stretches of CpG dinucleotides (CpG islands) are an outstanding property of mammalian genomes. Conventionally, these regions are detected by sliding window approaches using %G + C, CpG observed/expected ratio and length thresholds as main parameters. Recently, clustering methods directly detect clusters of CpG dinucleotides as a statistical property of the genome sequence. Results We compare sliding-window to clustering (i.e. CpGcluster) predictions by applying new ways to detect putative functionality of CpG islands. Analyzing the co-localization with several genomic regions as a function of window size vs. statistical significance (p-value), CpGcluster shows a higher overlap with promoter regions and highly conserved elements, at the same time showing less overlap with Alu retrotransposons. The major difference in the prediction was found for short islands (CpG islets), often exclusively predicted by CpGcluster. Many of these islets seem to be functional, as they are unmethylated, highly conserved and/or located within the promoter region. Finally, we show that window-based islands can spuriously overlap several, differentially regulated promoters as well as different methylation domains, which might indicate an erroneous merge of several CpG islands into a single, very long island. The shorter CpGcluster islands seem to be much more specific concerning the overlap with alternative transcription start sites or the detection of homogeneous methylation domains. Conclusions The main difference between sliding-window approaches and clustering methods is the length of the predicted islands. Short islands, often differentially methylated, are almost exclusively predicted by CpGcluster. This suggests that CpGcluster may be the algorithm of choice to explore the function of these short, but putatively functional CpG islands. PMID:20500903
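A minimal version of the conventional sliding-window scan (the baseline the clustering approach is compared against) looks like this; the window length and the Gardiner-Garden-style thresholds (%G+C >= 0.5, observed/expected CpG >= 0.6) are commonly used defaults, assumed here rather than taken from the paper.

```python
def cpg_islands(seq, window=200, gc_min=0.5, oe_min=0.6):
    """Classic sliding-window scan: report window starts whose %G+C and
    CpG observed/expected ratio both exceed the thresholds."""
    hits = []
    for start in range(len(seq) - window + 1):
        w = seq[start:start + window]
        c, g = w.count("C"), w.count("G")
        cpg = w.count("CG")
        gc = (c + g) / window
        expected = (c * g) / window          # expected CpG count under independence
        oe = cpg / expected if expected else 0.0
        if gc >= gc_min and oe >= oe_min:
            hits.append(start)
    return hits

# Synthetic sequence: AT-rich background with a 200 bp CpG-rich insert at 600.
seq = "AT" * 300 + "CG" * 100 + "AT" * 300
hits = cpg_islands(seq)
```

Because every qualifying window is reported, a real pipeline must still merge overlapping windows into islands, which is exactly where the spurious merging of neighboring islands discussed above can occur.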
Towards component-based validation of GATE: aspects of the coincidence processor.
Moraes, Eder R; Poon, Jonathan K; Balakrishnan, Karthikayan; Wang, Wenli; Badawi, Ramsey D
2015-02-01
GATE is public domain software widely used for Monte Carlo simulation in emission tomography. Validations of GATE have primarily been performed on a whole-system basis, leaving the possibility that errors in one sub-system may be offset by errors in others. We assess the accuracy of the GATE PET coincidence generation sub-system in isolation, focusing on the options most closely modeling the majority of commercially available scanners. Independent coincidence generators were coded by teams at Toshiba Medical Research Unit (TMRU) and UC Davis. A model similar to the Siemens mCT scanner was created in GATE. Annihilation photons interacting with the detectors were recorded. Coincidences were generated using GATE, TMRU and UC Davis code and results compared to "ground truth" obtained from the history of the photon interactions. GATE was tested twice, once with every qualified single event opening a time window and initiating a coincidence check (the "multiple window method"), and once where a time window is opened and a coincidence check initiated only by the first single event to occur after the end of the prior time window (the "single window method"). True, scattered and random coincidences were compared. Noise equivalent count rates were also computed and compared. The TMRU and UC Davis coincidence generators agree well with ground truth. With GATE, reasonable accuracy can be obtained if the single window method option is chosen and random coincidences are estimated without use of the delayed coincidence option. However in this GATE version, other parameter combinations can result in significant errors. Copyright © 2014 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
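The difference between the two window policies can be sketched on a toy event list. This is a simplified model, ignoring GATE's multiple-coincidence policies, dead time, and energy qualification; pairs with more than one partner in the window are simply discarded.

```python
def multiple_window(times, tau):
    """Every single event opens a window and initiates a coincidence check."""
    pairs = []
    for i, t in enumerate(times):
        partners = [u for u in times[i + 1:] if u - t <= tau]
        if len(partners) == 1:              # keep prompt pairs; discard multiples
            pairs.append((t, partners[0]))
    return pairs

def single_window(times, tau):
    """Only the first single after the prior window ends opens a new window."""
    pairs = []
    i = 0
    while i < len(times):
        inside = [u for u in times[i + 1:] if u - times[i] <= tau]
        if len(inside) == 1:
            pairs.append((times[i], inside[0]))
        i += 1 + len(inside)                # jump to the first event after the window
    return pairs

# Three singles; with tau = 4 the two policies disagree about the middle event.
events = [0.0, 3.0, 5.0]
```

Here the multiple-window policy reports two coincidences, because the middle single both closes one window and opens another, while the single-window policy reports only the first pair: the same behavioral difference the validation above probes.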
NASA Astrophysics Data System (ADS)
Cristescu, Constantin P.; Stan, Cristina; Scarlat, Eugen I.; Minea, Teofil; Cristescu, Cristina M.
2012-04-01
We present a novel method for the parameter oriented analysis of mutual correlation between independent time series or between equivalent structures such as ordered data sets. The proposed method is based on the sliding window technique, defines a new type of correlation measure and can be applied to time series from all domains of science and technology, experimental or simulated. A specific parameter that can characterize the time series is computed for each window and a cross correlation analysis is carried out on the set of values obtained for the time series under investigation. We apply this method to the study of some currency daily exchange rates from the point of view of the Hurst exponent and the intermittency parameter. Interesting correlation relationships are revealed and a tentative crisis prediction is presented.
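As a sketch of the windowed-parameter idea described in this abstract (not the authors' implementation), the block below computes a simple per-window statistic and then cross-correlates the two parameter sequences. Windowed standard deviation stands in for the Hurst exponent or intermittency parameter, which require more machinery; the window length and step are illustrative.

```python
import math

def _std(w):
    """Population standard deviation of one window (stand-in parameter)."""
    m = sum(w) / len(w)
    return math.sqrt(sum((v - m) ** 2 for v in w) / len(w))

def window_param(x, win, step, stat=_std):
    """Value of the chosen parameter in each sliding window of x."""
    return [stat(x[i:i + win]) for i in range(0, len(x) - win + 1, step)]

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sa = math.sqrt(sum((u - ma) ** 2 for u in a))
    sb = math.sqrt(sum((v - mb) ** 2 for v in b))
    return cov / (sa * sb)

def windowed_correlation(x, y, win=50, step=10, stat=_std):
    """Cross-correlate the per-window parameter sequences of two series."""
    return pearson(window_param(x, win, step, stat),
                   window_param(y, win, step, stat))
```

Replacing `_std` with a Hurst-exponent or intermittency estimator reproduces the parameter-oriented analysis in spirit.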
A high-fidelity weather time series generator using the Markov Chain process on a piecewise level
NASA Astrophysics Data System (ADS)
Hersvik, K.; Endrerud, O.-E. V.
2017-12-01
A method is developed for generating a set of unique weather time series based on an existing weather series. The method allows statistically valid weather variations to take place within repeated simulations of offshore operations. The numerous generated time series need to share the same statistical qualities as the original time series; these refer mainly to the distribution of weather windows available for work, including the durations and frequencies of such windows, and to seasonal characteristics. The method is based on the Markov chain process. The core new development lies in how the Markov process is used: small pieces of random-length time series are joined together, rather than individual weather states from single time steps, which is the common solution found in the literature. This new Markov model shows favorable characteristics with respect to the requirements set forth and all aspects of the validation performed.
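A minimal sketch of the piecewise joining idea, assuming discretized weather states and a simple match-the-join-state rule; the segment-length bounds and the resumption rule are illustrative choices, not the paper's calibrated ones.

```python
import random

def generate_series(history, length, min_len=6, max_len=24, seed=0):
    """Join random-length pieces of `history`, matching states at the joins.

    Whole segments are copied rather than stepping state-by-state, so the
    short-term autocorrelation inside each piece is preserved exactly.
    """
    rng = random.Random(seed)
    out = []
    i = rng.randrange(len(history))
    while len(out) < length:
        seg_len = rng.randint(min_len, max_len)
        out.extend(history[i:i + seg_len])
        # resume from a position whose predecessor matches the join state,
        # emulating a Markov transition between consecutive pieces
        last = out[-1]
        candidates = [j + 1 for j, v in enumerate(history[:-1]) if v == last]
        i = rng.choice(candidates) if candidates else rng.randrange(len(history))
    return out[:length]
```

Validating such a generator against weather-window durations and seasonal statistics, as the paper does, is the essential step before using it in simulations.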
Yu, Xiao; Ding, Enjie; Chen, Chunxu; Liu, Xiaoming; Li, Li
2015-01-01
Because roller element bearing (REB) failures cause unexpected machinery breakdowns, their fault diagnosis has attracted considerable research attention. Established fault feature extraction methods focus on statistical characteristics of the vibration signal, an approach that loses sight of continuous waveform features. To address this weakness, this article proposes a novel frequency-band feature extraction method, named Window Marginal Spectrum Clustering (WMSC), which selects salient features from the marginal spectrum of vibration signals obtained by the Hilbert–Huang Transform (HHT). In WMSC, a sliding window divides the entire HHT marginal spectrum (HMS) into window spectra, after which the Rand Index (RI) criterion of the clustering method is used to evaluate each window. Windows returning higher RI values are selected to construct characteristic frequency bands (CFBs). Next, a hybrid REB fault diagnosis model is constructed, termed by its elements HHT-WMSC-SVM (support vector machine). The effectiveness of HHT-WMSC-SVM is validated by running a series of experiments on REB defect datasets from the Bearing Data Center of Case Western Reserve University (CWRU). The test results evidence three major advantages of the novel method. First, the fault classification accuracy of the HHT-WMSC-SVM model is higher than that of HHT-SVM and ST-SVM, a method that combines statistical characteristics with an SVM. Second, with Gaussian white noise added to the original REB defect dataset, the HHT-WMSC-SVM model maintains high classification accuracy, while the accuracy of the ST-SVM and HHT-SVM models is significantly reduced. Third, the fault classification accuracy of HHT-WMSC-SVM can exceed 95% for a Pmin range of 500–800 and an m range of 50–300 on the REB defect dataset with Gaussian white noise added at a signal-to-noise ratio (SNR) of 5.
Experimental results indicate that the proposed WMSC method yields high REB fault classification accuracy and good performance under Gaussian white noise. PMID:26540059
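WMSC's window-scoring step relies on the Rand Index between a clustering of the window spectra and the known fault classes. A minimal RI implementation is shown below; the example labelings in the test are hypothetical, not taken from the CWRU data.

```python
def rand_index(labels_a, labels_b):
    """Rand Index between two labelings of the same items.

    Counts the item pairs on which the two labelings agree (both place
    the pair in the same cluster, or both in different clusters) and
    divides by the total number of pairs; 1.0 means identical partitions.
    """
    n = len(labels_a)
    agree = 0
    pairs = 0
    for i in range(n):
        for j in range(i + 1, n):
            pairs += 1
            same_a = labels_a[i] == labels_a[j]
            same_b = labels_b[i] == labels_b[j]
            if same_a == same_b:
                agree += 1
    return agree / pairs
```

In WMSC, windows whose cluster assignments score a high RI against the fault labels are the ones kept to build the characteristic frequency bands.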
Wavelet-based clustering of resting state MRI data in the rat.
Medda, Alessio; Hoffmann, Lukas; Magnuson, Matthew; Thompson, Garth; Pan, Wen-Ju; Keilholz, Shella
2016-01-01
While functional connectivity has typically been calculated over the entire length of the scan (5–10 min), interest has been growing in dynamic analysis methods that can detect changes in connectivity on the order of cognitive processes (seconds). Previous work with sliding window correlation has shown that changes in functional connectivity can be observed on these time scales in the awake human and in anesthetized animals. This exciting advance creates a need for improved approaches to characterize dynamic functional networks in the brain. Previous studies were performed using sliding window analysis on regions of interest defined based on anatomy or obtained from traditional steady-state analysis methods. The parcellation of the brain may therefore be suboptimal, and the characteristics of the time-varying connectivity between regions are dependent upon the length of the sliding window chosen. This manuscript describes an algorithm based on wavelet decomposition that allows data-driven clustering of voxels into functional regions based on temporal and spectral properties. Previous work has shown that different networks have characteristic frequency fingerprints, and the use of wavelets ensures that both the frequency and the timing of the BOLD fluctuations are considered during the clustering process. The method was applied to resting state data acquired from anesthetized rats, and the resulting clusters agreed well with known anatomical areas. Clusters were highly reproducible across subjects. Wavelet cross-correlation values between clusters from a single scan were significantly higher than the values from randomly matched clusters that shared no temporal information, indicating that wavelet-based analysis is sensitive to the relationship between areas. Copyright © 2015 Elsevier Inc. All rights reserved.
Centroid estimation for a Shack-Hartmann wavefront sensor based on stream processing.
Kong, Fanpeng; Polo, Manuel Cegarra; Lambert, Andrew
2017-08-10
When the center of gravity is used to estimate the centroid of the spot in a Shack-Hartmann wavefront sensor, the measurement is corrupted by photon and detector noise. Parameters such as the window size often require careful optimization to balance noise error, dynamic range, and the linearity of the response coefficient under different photon fluxes. The method also needs to be replaced by the correlation method for extended sources. We propose a centroid estimator based on stream processing, in which the center-of-gravity calculation window floats with the incoming pixels from the detector. In comparison with conventional methods, we show that the proposed estimator simplifies the choice of optimized parameters, provides a unit linear response coefficient, and reduces the influence of background and noise. The stream-based centroid estimator is also shown to work well for extended sources of limited size. A hardware implementation of the proposed estimator is discussed.
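The conventional windowed center-of-gravity baseline that this estimator improves on can be sketched directly; the fixed window and simple background subtraction below are the standard approach, not the streaming variant proposed in the paper.

```python
def center_of_gravity(img, x0, y0, half, background=0.0):
    """Center-of-gravity centroid in a (2*half+1)^2 window around (x0, y0).

    `img` is a 2-D list indexed as img[y][x]; pixel values below the
    background estimate are clipped to zero before weighting.
    """
    s = sx = sy = 0.0
    for y in range(y0 - half, y0 + half + 1):
        for x in range(x0 - half, x0 + half + 1):
            v = max(img[y][x] - background, 0.0)
            s += v
            sx += v * x
            sy += v * y
    if s == 0:
        return float(x0), float(y0)   # no signal: fall back to window center
    return sx / s, sy / s
```

The window size `half` is exactly the parameter whose optimization the stream-processing estimator is designed to simplify.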
Mitigating reentry radio blackout by using a traveling magnetic field
NASA Astrophysics Data System (ADS)
Zhou, Hui; Li, Xiaoping; Xie, Kai; Liu, Yanming; Yu, Yuanyuan
2017-10-01
A hypersonic flight or a reentry vehicle is surrounded by a plasma layer that prevents electromagnetic wave transmission, which results in radio blackout. The magnetic-window method is considered a promising means to mitigate reentry communication blackout. However, the real application of this method is limited because of the need for strong magnetic fields. To reduce the required magnetic field strength, a novel method that applies a traveling magnetic field (TMF) is proposed in this study. A mathematical model based on magneto-hydrodynamic theory is adopted to analyze the effect of TMF on plasma. The mitigating effects of the TMF on the blackout of typical frequency bands, including L-, S-, and C-bands, are demonstrated. Results indicate that a significant reduction of plasma density occurs in the magnetic-window region by applying a TMF, and the reduction ratio is positively correlated with the velocity of the TMF. The required traveling velocities for eliminating the blackout of the Global Positioning System (GPS) and the typical telemetry system are also discussed. Compared with the constant magnetic-window method, the TMF method needs lower magnetic field strength and is easier to realize in the engineering field.
Obtaining high-resolution velocity spectra using weighted semblance
NASA Astrophysics Data System (ADS)
Ebrahimi, Saleh; Kahoo, Amin Roshandel; Porsani, Milton J.; Kalateh, Ali Nejati
2017-02-01
Velocity analysis employs a coherency measurement along a hyperbolic or non-hyperbolic trajectory time window to build velocity spectra. Accuracy and resolution are strictly related to the method of coherency measurement. Semblance, the most common coherence measure, has poor velocity resolution, which affects one's ability to distinguish and pick distinct peaks. Increasing the resolution of the semblance velocity spectra improves the accuracy of the velocities estimated for normal moveout correction and stacking. The low resolution of semblance spectra stems from its low sensitivity to velocity changes. In this paper, we present a new weighted semblance method that ensures high-resolution velocity spectra. To increase the resolution of the semblance spectra, we introduce into the semblance equation two weighting functions, based on the ratio of the first to second singular values of the time window and on the position of the seismic wavelet in the time window. We test the method on both synthetic and real field data to compare the resolution of the weighted and conventional semblance methods. Numerical examples with synthetic and real seismic data indicate that the proposed weighted semblance method provides higher resolution than conventional semblance and can separate reflectors that are mixed in the semblance spectrum.
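The conventional semblance that serves as the baseline here has a compact definition; a minimal sketch (the singular-value and wavelet-position weighting functions of the proposed method are not reproduced):

```python
def semblance(window):
    """Conventional semblance of a time window.

    `window` is a list of time samples, each a list of trace amplitudes
    already moveout-corrected for the trial velocity.  Semblance is 1.0
    for perfectly coherent traces and near 1/M for M incoherent traces.
    """
    m = len(window[0])                                   # number of traces
    num = sum(sum(row) ** 2 for row in window)           # stacked energy
    den = m * sum(v * v for row in window for v in row)  # total energy
    return num / den if den else 0.0
```

Scanning trial velocities and evaluating this ratio in each window produces the velocity spectrum whose peaks are picked for NMO correction and stacking.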
Mansouri, Majdi; Nounou, Mohamed N; Nounou, Hazem N
2017-09-01
In our previous work, we have demonstrated the effectiveness of the linear multiscale principal component analysis (PCA)-based moving window (MW)-generalized likelihood ratio test (GLRT) technique over the classical PCA and multiscale principal component analysis (MSPCA)-based GLRT methods. The developed fault detection algorithm provided optimal properties by maximizing the detection probability for a particular false alarm rate (FAR) with different values of windows, and however, most real systems are nonlinear, which make the linear PCA method not able to tackle the issue of non-linearity to a great extent. Thus, in this paper, first, we apply a nonlinear PCA to obtain an accurate principal component of a set of data and handle a wide range of nonlinearities using the kernel principal component analysis (KPCA) model. The KPCA is among the most popular nonlinear statistical methods. Second, we extend the MW-GLRT technique to one that utilizes exponential weights to residuals in the moving window (instead of equal weightage) as it might be able to further improve fault detection performance by reducing the FAR using exponentially weighed moving average (EWMA). The developed detection method, which is called EWMA-GLRT, provides improved properties, such as smaller missed detection and FARs and smaller average run length. The idea behind the developed EWMA-GLRT is to compute a new GLRT statistic that integrates current and previous data information in a decreasing exponential fashion giving more weight to the more recent data. This provides a more accurate estimation of the GLRT statistic and provides a stronger memory that will enable better decision making with respect to fault detection. Therefore, in this paper, a KPCA-based EWMA-GLRT method is developed and utilized in practice to improve fault detection in biological phenomena modeled by S-systems and to enhance monitoring process mean. 
The idea behind a KPCA-based EWMA-GLRT fault detection algorithm is to combine the advantages brought forward by the proposed EWMA-GLRT fault detection chart with the KPCA model. Thus, it is used to enhance fault detection of the Cad System in E. coli model through monitoring some of the key variables involved in this model such as enzymes, transport proteins, regulatory proteins, lysine, and cadaverine. The results demonstrate the effectiveness of the proposed KPCA-based EWMA-GLRT method over Q , GLRT, EWMA, Shewhart, and moving window-GLRT methods. The detection performance is assessed and evaluated in terms of FAR, missed detection rates, and average run length (ARL 1 ) values.
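The exponential weighting at the heart of the EWMA chart is a one-line recursion; a minimal sketch on a generic residual sequence (the KPCA residual generation and the GLRT statistic itself are not reproduced):

```python
def ewma(residuals, lam=0.2, z0=0.0):
    """Exponentially weighted moving average of a residual sequence.

    z_t = lam * r_t + (1 - lam) * z_{t-1}: recent samples receive
    exponentially more weight, giving the statistic a fading memory.
    """
    z = z0
    out = []
    for r in residuals:
        z = lam * r + (1 - lam) * z
        out.append(z)
    return out
```

A smaller `lam` lengthens the memory, which suppresses false alarms at the cost of a slower response to genuine faults, which is the FAR/detection-delay trade-off the abstract describes.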
Investigations on the carbon contaminations on the alkali cells of DPAL with hydrocarbon buffer gas
NASA Astrophysics Data System (ADS)
Li, Zhiyong; Tan, Rongqing; Wang, Yujie; Ye, Qing; Bian, Jintian; Huang, Wei; Li, Hui; Han, Gaoce
2017-10-01
Diode-pumped alkali lasers (DPALs) with hydrocarbon buffer gases feature a low threshold and high efficiency. The chemical reaction between the alkali and the hydrocarbon gases affects the lifetime of a DPAL. In this paper, a method based on Fourier transform infrared spectroscopy and the Lambert-Beer law is adopted to find a safe temperature at which a DPAL can run long-term. A theoretical model is established to identify ways to reduce the peak temperature at the cell window. The results indicate that 170 °C is a safe temperature. Although the absorbance of the cell window to the pump light and alkali laser is low, there is still a temperature increase. A small light-transmitting area and air blowing on the windows can reduce the peak temperature effectively. Cooling the cell window is essential and critical for a long-term running DPAL.
Diagnosing and ranking retinopathy disease level using diabetic fundus image recuperation approach.
Somasundaram, K; Rajendran, P Alli
2015-01-01
Retinal fundus images are widely used in diagnosing different types of eye diseases. Existing methods such as Feature Based Macular Edema Detection (FMED) and Optimally Adjusted Morphological Operator (OAMO) effectively detected the presence of exudation in fundus images and identified the true positive ratio of exudate detection, respectively. These mechanically detected exudates did not incorporate a more detailed feature selection technique into the system for the detection of diabetic retinopathy. To categorize the exudates, a Diabetic Fundus Image Recuperation (DFIR) method based on a sliding-window approach is developed in this work to select the features of the optic cup in digital retinal fundus images. The DFIR feature selection uses a collection of sliding windows of varying range to obtain features based on the histogram value using a Group Sparsity Nonoverlapping Function. Using a support vector model in the second phase, the DFIR method based on a Spiral Basis Function effectively ranks the diabetic retinopathy disease level. The ranking of the disease level on each candidate set provides a promising basis for developing a practically automated and assisted diabetic retinopathy diagnosis system. Experimental work on digital fundus images using the DFIR method examines factors such as sensitivity, ranking efficiency, and feature selection time. PMID:25945362
Power-Efficient Beacon Recognition Method Based on Periodic Wake-Up for Industrial Wireless Devices.
Song, Soonyong; Lee, Donghun; Jang, Ingook; Choi, Jinchul; Son, Youngsung
2018-04-17
Energy harvester-integrated wireless devices are attractive for generating semi-permanent power from wasted energy in industrial environments. Energy-harvesting wireless devices may have difficulty communicating with access points due to an insufficient power supply for beacon recognition during network initialization. In this manuscript, we propose a novel beacon recognition method based on wake-up control to reduce instantaneous power consumption in the initialization procedure. The proposed method applies a moving window to the periodic wake-up of the wireless devices. For unsynchronized wireless devices, beacons are always located in the same positions within each beacon interval even though the starting offsets are unknown. Using this characteristic, the moving window checks for the existence of the beacon associated with specified resources in a beacon interval, checks again for neighboring resources in the next beacon interval, and so on. This method reduces instantaneous power and generates a surplus of charging time. Thus, the proposed method alleviates the problem of power insufficiency during network initialization. The feasibility of the proposed method is evaluated using computer simulations of power shortage under various energy-harvesting conditions.
Gaussian windows: A tool for exploring multivariate data
NASA Technical Reports Server (NTRS)
Jaeckel, Louis A.
1990-01-01
Presented here is a method for interactively exploring a large set of quantitative multivariate data, in order to estimate the shape of the underlying density function. It is assumed that the density function is more or less smooth, but no other specific assumptions are made concerning its structure. The local structure of the data in a given region may be examined by viewing the data through a Gaussian window, whose location and shape are chosen by the user. A Gaussian window is defined by giving each data point a weight based on a multivariate Gaussian function. The weighted sample mean and sample covariance matrix are then computed, using the weights attached to the data points. These quantities are used to compute an estimate of the shape of the density function in the window region. The local structure of the data is described by a method similar to the method of principal components. By taking many such local views of the data, we can form an idea of the structure of the data set. The method is applicable in any number of dimensions. The method can be used to find and describe simple structural features such as peaks, valleys, and saddle points in the density function, and also extended structures in higher dimensions. With some practice, we can apply our geometrical intuition to these structural features in any number of dimensions, so that we can think about and describe the structure of the data. Since the computations involved are relatively simple, the method can easily be implemented on a small computer.
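The core computation described here, Gaussian weighting followed by a weighted sample mean and covariance, can be sketched directly. The axis-aligned window widths below are an illustrative simplification; the method as described allows the user to choose an arbitrary Gaussian shape.

```python
import math

def gaussian_window(data, center, widths):
    """Weighted mean and covariance of `data` seen through a Gaussian window.

    Each point x gets weight exp(-0.5 * sum(((x_j - c_j) / w_j)^2)),
    i.e. an axis-aligned multivariate Gaussian centered at `center`.
    """
    def weight(x):
        return math.exp(-0.5 * sum(((xj - cj) / wj) ** 2
                                   for xj, cj, wj in zip(x, center, widths)))
    ws = [weight(x) for x in data]
    W = sum(ws)
    d = len(center)
    mean = [sum(w * x[j] for w, x in zip(ws, data)) / W for j in range(d)]
    cov = [[sum(w * (x[i] - mean[i]) * (x[j] - mean[j])
                for w, x in zip(ws, data)) / W
            for j in range(d)] for i in range(d)]
    return mean, cov
```

An eigendecomposition of the returned covariance then gives the principal-component-style description of local structure that the method uses.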
Shakil, Sadia; Lee, Chin-Hui; Keilholz, Shella Dawn
2016-01-01
A promising recent development in the study of brain function is the dynamic analysis of resting-state functional MRI scans, which can enhance understanding of normal cognition and alterations that result from brain disorders. One widely used method of capturing the dynamics of functional connectivity is sliding window correlation (SWC). However, in the absence of a “gold standard” for comparison, evaluating the performance of the SWC in typical resting-state data is challenging. This study uses simulated networks (SNs) with known transitions to examine the effects of parameters such as window length, window offset, window type, noise, filtering, and sampling rate on the SWC performance. The SWC time course was calculated for all node pairs of each SN and then clustered using the k-means algorithm to determine how resulting brain states match known configurations and transitions in the SNs. The outcomes show that the detection of state transitions and durations in the SWC is most strongly influenced by the window length and offset, followed by noise and filtering parameters. The effect of the image sampling rate was relatively insignificant. Tapered windows provide less sensitivity to state transitions than rectangular windows, which could be the result of the sharp transitions in the SNs. Overall, the SWC gave poor estimates of correlation for each brain state. Clustering based on the SWC time course did not reliably reflect the underlying state transitions unless the window length was comparable to the state duration, highlighting the need for new adaptive window analysis techniques. PMID:26952197
Simplified Computation for Nonparametric Windows Method of Probability Density Function Estimation.
Joshi, Niranjan; Kadir, Timor; Brady, Michael
2011-08-01
Recently, Kadir and Brady proposed a method for estimating probability density functions (PDFs) for digital signals which they call the Nonparametric (NP) Windows method. The method involves constructing a continuous space representation of the discrete space and sampled signal by using a suitable interpolation method. NP Windows requires only a small number of observed signal samples to estimate the PDF and is completely data driven. In this short paper, we first develop analytical formulae to obtain the NP Windows PDF estimates for 1D, 2D, and 3D signals, for different interpolation methods. We then show that the original procedure to calculate the PDF estimate can be significantly simplified and made computationally more efficient by a judicious choice of the frame of reference. We have also outlined specific algorithmic details of the procedures enabling quick implementation. Our reformulation of the original concept has directly demonstrated a close link between the NP Windows method and the Kernel Density Estimator.
Wang, Bing; Baby, Varghese; Tong, Wilson; Xu, Lei; Friedman, Michelle; Runser, Robert; Glesk, Ivan; Prucnal, Paul
2002-01-14
A novel optical switch based on cascading two terahertz optical asymmetric demultiplexers (TOADs) is presented. By utilizing the sharp edge of the asymmetric TOAD switching-window profile, two TOAD switching windows are overlapped to produce a narrower aggregate switching window that is not limited by the pulse propagation time in the SOA of the TOAD. Simulations of the cascaded TOAD switching window show a relatively constant window amplitude for different window sizes. Experimental results on cascading two TOADs, each with a switching window of 8 ps but with the SOA on opposite sides of the fiber loop, show a minimum switching window of 2.7 ps.
ERIC Educational Resources Information Center
Mathematics Teacher, 2004
2004-01-01
Some inexpensive or free ways to capture and use images in one's work are described. The first tip demonstrates methods that use some of the built-in capabilities of the Macintosh and Windows-based PC operating systems, and the second tip describes methods to capture and create images using SnagIt.
A window-based time series feature extraction method.
Katircioglu-Öztürk, Deniz; Güvenir, H Altay; Ravens, Ursula; Baykal, Nazife
2017-10-01
This study proposes a robust similarity score-based time series feature extraction method termed Window-based Time series Feature ExtraCtion (WTC). Specifically, WTC generates domain-interpretable results and involves notably low computational complexity, thereby rendering itself useful for densely sampled and populated time series datasets. In this study, WTC is applied to a proprietary action potential (AP) time series dataset on human cardiomyocytes and to three precordial leads from a publicly available electrocardiogram (ECG) dataset. This is followed by comparing WTC in terms of predictive accuracy and computational complexity with the shapelet transform and the fast shapelet transform (an accelerated variant of the shapelet transform). The results indicate that WTC achieves slightly higher classification performance with significantly lower execution time when compared to its shapelet-based alternatives. With respect to its interpretable features, WTC has the potential to enable medical experts to explore definitive common trends in novel datasets. Copyright © 2017 Elsevier Ltd. All rights reserved.
Adjoint-Based Climate Model Tuning: Application to the Planet Simulator
NASA Astrophysics Data System (ADS)
Lyu, Guokun; Köhl, Armin; Matei, Ion; Stammer, Detlef
2018-01-01
The adjoint method is used to calibrate the medium complexity climate model "Planet Simulator" through parameter estimation. Identical twin experiments demonstrate that this method can retrieve default values of the control parameters when using a long assimilation window of the order of 2 months. Chaos synchronization through nudging, required to overcome limits in the temporal assimilation window in the adjoint method, is employed successfully to reach this assimilation window length. When assimilating ERA-Interim reanalysis data, the observations of air temperature and the radiative fluxes are the most important data for adjusting the control parameters. The global mean net longwave fluxes at the surface and at the top of the atmosphere are significantly improved by tuning two model parameters controlling the absorption of clouds and water vapor. The global mean net shortwave radiation at the surface is improved by optimizing three model parameters controlling cloud optical properties. The optimized parameters improve the free model (without nudging terms) simulation in a way similar to that in the assimilation experiments. Results suggest a promising way for tuning uncertain parameters in nonlinear coupled climate models.
Hydrofluoric acid-resistant composite window and method for its fabrication
Ostenak, C.A.; Mackay, H.A.
1985-07-18
A hydrofluoric acid-resistant composite window and method for its fabrication are disclosed. The composite window comprises a window having first and second sides. The first side is oriented towards an environment containing hydrofluoric acid. An adhesive is applied to the first side. A layer of transparent hydrofluoric acid-resistant material, such as Mylar, is applied to the adhesive and completely covers the first side. The adhesive is then cured.
Hydrofluoric acid-resistant composite window and method for its fabrication
Ostenak, Carl A.; Mackay, Harold A.
1987-01-01
A hydrofluoric acid-resistant composite window and method for its fabrication are disclosed. The composite window comprises a window having first and second sides. The first side is oriented towards an environment containing hydrofluoric acid. An adhesive is applied to the first side. A layer of transparent hydrofluoric acid-resistant material, such as Mylar, is applied to the adhesive and completely covers the first side. The adhesive is then cured.
NASA Technical Reports Server (NTRS)
Scholtz, P.; Smyth, P.
1992-01-01
This article describes an investigation of a statistical hypothesis testing method for detecting changes in the characteristics of an observed time series. The work is motivated by the need for practical automated methods for on-line monitoring of Deep Space Network (DSN) equipment to detect failures and changes in behavior. In particular, on-line monitoring of the motor current in a DSN 34-m beam waveguide (BWG) antenna is used as an example. The algorithm is based on a measure of the information-theoretic distance between two autoregressive models: one estimated with data from a dynamic reference window and one estimated with data from a sliding reference window. The Hinkley cumulative sum stopping rule is utilized to detect a change in the mean of this distance measure, corresponding to the detection of a change in the underlying process. The basic theory behind this two-model test is presented, and the problem of practical implementation is addressed, examining windowing methods, model estimation, and detection parameter assignment. Results from five fault-transition simulations are presented to show the possible limitations of the detection method, and suggestions for future implementation are given.
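The Hinkley cumulative-sum stopping rule used here can be sketched for a one-sided increase in the mean of the monitored distance measure; the drift and threshold arguments are the detection parameters whose assignment the article discusses, with values in the test below chosen purely for illustration.

```python
def hinkley_detect(samples, mu0, drift, threshold):
    """Hinkley cumulative-sum test for an increase in the mean.

    Accumulates g_t = max(0, g_{t-1} + x_t - mu0 - drift) and flags a
    change when g_t exceeds `threshold`.  Returns the alarm index, or
    None if no change is detected; g resets to zero whenever the
    evidence for a shift dies out.
    """
    g = 0.0
    for t, x in enumerate(samples):
        g = max(0.0, g + x - mu0 - drift)
        if g > threshold:
            return t
    return None
```

In the two-model test, `samples` would be the information-theoretic distance between the reference-window and sliding-window AR models at each step.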
A modified TEW approach to scatter correction for In-111 and Tc-99m dual-isotope small-animal SPECT.
Prior, Paul; Timmins, Rachel; Petryk, Julia; Strydhorst, Jared; Duan, Yin; Wei, Lihui; Glenn Wells, R
2016-10-01
In dual-isotope (Tc-99m/In-111) small-animal single-photon emission computed tomography (SPECT), quantitative accuracy of Tc-99m activity measurements is degraded due to the detection of Compton-scattered photons in the Tc-99m photopeak window, which originate from the In-111 emissions (cross talk) and from the Tc-99m emission (self-scatter). The standard triple-energy window (TEW) estimates the total scatter (self-scatter and cross talk) using one scatter window on either side of the Tc-99m photopeak window, but the estimate is biased due to the presence of unscattered photons in the scatter windows. The authors present a modified TEW method to correct for total scatter that compensates for this bias and evaluate the method in phantoms and in vivo. The number of unscattered Tc-99m and In-111 photons present in each scatter-window projection is estimated based on the number of photons detected in the photopeak of each isotope, using the isotope-dependent energy resolution of the detector. The camera-head-specific energy resolutions for the 140 keV Tc-99m and 171 keV In-111 emissions were determined experimentally by separately sampling the energy spectra of each isotope. Each sampled spectrum was fit with a Linear + Gaussian function. The fitted Gaussian functions were integrated across each energy window to determine the proportion of unscattered photons from each emission detected in the scatter windows. The method was first tested and compared to the standard TEW in phantoms containing Tc-99m:In-111 activity ratios between 0.15 and 6.90. True activities were determined using a dose calibrator, and SPECT activities were estimated from CT-attenuation-corrected images with and without scatter-correction. The method was then tested in vivo in six rats using In-111-liposome and Tc-99m-tetrofosmin to generate cross talk in the area of the myocardium. 
The myocardium was manually segmented using the SPECT and CT images, and partial-volume correction was performed using a template-based approach. The rat heart was counted in a well-counter to determine the true activity. In the phantoms without correction for Compton-scatter, Tc-99m activity quantification errors as high as 85% were observed. The standard TEW method quantified Tc-99m activity with an average accuracy of -9.0% ± 0.7%, while the modified TEW was accurate within 5% of truth in phantoms with Tc-99m:In-111 activity ratios ≥0.52. Without scatter-correction, In-111 activity was quantified with an average accuracy of 4.1%, and there was no dependence of accuracy on the activity ratio. In rat myocardia, uncorrected images were overestimated by an average of 23% ± 5%, and the standard TEW had an accuracy of -13.8% ± 1.6%, while the modified TEW yielded an accuracy of -4.0% ± 1.6%. Cross talk and self-scatter were shown to produce quantification errors in phantoms as well as in vivo. The standard TEW provided inaccurate results due to the inclusion of unscattered photons in the scatter windows. The modified TEW improved the scatter estimate and reduced the quantification errors in phantoms and in vivo.
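The two-step estimate described above, the trapezoidal TEW scatter estimate plus subtraction of the unscattered photopeak counts that spill into the scatter windows, can be sketched as follows. This is a minimal illustration assuming a Gaussian photopeak model; the window bounds and function names below are hypothetical, not the authors' implementation.

```python
from math import erf, sqrt

def gauss_frac(lo, hi, mu, fwhm):
    """Fraction of a Gaussian photopeak (mean mu, given FWHM) within [lo, hi] keV."""
    sigma = fwhm / 2.3548
    cdf = lambda e: 0.5 * (1.0 + erf((e - mu) / (sigma * sqrt(2.0))))
    return cdf(hi) - cdf(lo)

def tew_scatter(c_ls, c_us, w_ls, w_us, w_pk):
    """Standard TEW: trapezoidal scatter estimate under the photopeak."""
    return (c_ls / w_ls + c_us / w_us) * w_pk / 2.0

def modified_tew_scatter(c_ls, c_us, c_pk, windows, mu, fwhm):
    """Subtract the unscattered (photopeak) counts expected in each scatter
    window, estimated from the photopeak counts, before forming the
    trapezoidal estimate."""
    (ls_lo, ls_hi), (us_lo, us_hi), (pk_lo, pk_hi) = windows
    f_pk = gauss_frac(pk_lo, pk_hi, mu, fwhm)
    u_ls = c_pk * gauss_frac(ls_lo, ls_hi, mu, fwhm) / f_pk
    u_us = c_pk * gauss_frac(us_lo, us_hi, mu, fwhm) / f_pk
    return tew_scatter(max(c_ls - u_ls, 0.0), max(c_us - u_us, 0.0),
                       ls_hi - ls_lo, us_hi - us_lo, pk_hi - pk_lo)
```

Because the unscattered spill-in is subtracted, the modified estimate is always at or below the standard TEW estimate for the same counts.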
climwin: An R Toolbox for Climate Window Analysis.
Bailey, Liam D; van de Pol, Martijn
2016-01-01
When studying the impacts of climate change, there is a tendency to select climate data from a small set of arbitrary time periods or climate windows (e.g., spring temperature). However, these arbitrary windows may not encompass the strongest periods of climatic sensitivity and may lead to erroneous biological interpretations. Therefore, there is a need to consider a wider range of climate windows to better predict the impacts of future climate change. We introduce the R package climwin that provides a number of methods to test the effect of different climate windows on a chosen response variable and compare these windows to identify potential climate signals. climwin extracts the relevant data for each possible climate window and uses this data to fit a statistical model, the structure of which is chosen by the user. Models are then compared using an information criteria approach. This allows users to determine how well each window explains variation in the response variable and compare model support between windows. climwin also contains methods to detect type I and II errors, which are often a problem with this type of exploratory analysis. This article presents the statistical framework and technical details behind the climwin package and demonstrates the applicability of the method with a number of worked examples.
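The core loop, extracting each candidate window's climate, fitting a model, and comparing windows by an information criterion, can be sketched in Python. This is an illustrative analogue of the approach, not the R package's API; the linear model and AIC formula are assumptions.

```python
import numpy as np

def window_aic(response, daily_climate, open_day, close_day):
    """AIC of a linear model: response ~ mean climate over the window spanning
    close_day..open_day days before each record's reference date.
    daily_climate: one row per record, one column per day-before-reference."""
    x = daily_climate[:, close_day:open_day + 1].mean(axis=1)
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, response, rcond=None)
    rss = ((response - X @ beta) ** 2).sum()      # must be > 0 for the log
    n, k = len(response), X.shape[1] + 1          # +1 for the error variance
    return n * np.log(rss / n) + 2 * k

def best_window(response, daily_climate, max_open):
    """Exhaustive search over all open/close day combinations."""
    cands = [(o, c) for o in range(max_open + 1) for c in range(o + 1)]
    return min(cands, key=lambda w: window_aic(response, daily_climate, *w))
```

On synthetic data whose response is driven by days 5-10 before the reference date, the exhaustive search recovers that window.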
Mof-Tree: A Spatial Access Method To Manipulate Multiple Overlapping Features.
ERIC Educational Resources Information Center
Manolopoulos, Yannis; Nardelli, Enrico; Papadopoulos, Apostolos; Proietti, Guido
1997-01-01
Investigates the manipulation of large sets of two-dimensional data representing multiple overlapping features, and presents a new access method, the MOF-tree. Analyzes storage requirements and time with respect to window query operations involving multiple features. Examines both the pointer-based and pointerless MOF-tree representations.…
Plontke, Stefan K.; Mynatt, Robert; Gill, Ruth M.; Borgmann, Stefan; Salt, Alec N.
2008-01-01
Objectives The distribution of gentamicin along the fluid spaces of the cochlea following local applications has never previously been demonstrated. Computer simulations have predicted that significant basal-apical concentration gradients might be expected and histological studies indicate that hair cell damage is greater at the base than at the apex following local gentamicin application. In the present study, gradients of gentamicin along the cochlea were measured. Methods A recently-developed method of sampling perilymph from the cochlear apex of guinea pigs was used, in which the samples represent fluid originating from different regions along scala tympani. Gentamicin concentration was determined in sequential apical samples which were taken following up to three hours of local application to the round window niche. Results Substantial gradients of gentamicin along the length of scala tympani were demonstrated and quantified, averaging more than 4000 times greater concentration at the base compared to the apex at the time of sampling. Peak concentrations and gradients for gentamicin varied considerably between animals, likely resulting from variations in round window membrane permeability and rates of perilymph flow. Conclusions The large gradients for gentamicin demonstrated here in guinea pigs account for how it is possible to suppress vestibular function in some patients with a local application of gentamicin without damaging auditory function. Variations in round window membrane permeability and in perilymph flow could account for why hearing losses are observed in some patients. PMID:17603318
Bounded Linear Stability Analysis - A Time Delay Margin Estimation Approach for Adaptive Control
NASA Technical Reports Server (NTRS)
Nguyen, Nhan T.; Ishihara, Abraham K.; Krishnakumar, Kalmanje Srinivas; Bakhtiari-Nejad, Maryam
2009-01-01
This paper presents a method for estimating time delay margin for model-reference adaptive control of systems with almost linear structured uncertainty. The bounded linear stability analysis method seeks to represent the conventional model-reference adaptive law by a locally bounded linear approximation within a small time window using the comparison lemma. The locally bounded linear approximation of the combined adaptive system is cast in a form of an input-time-delay differential equation over a small time window. The time delay margin of this system represents a local stability measure and is computed analytically by a matrix measure method, which provides a simple analytical technique for estimating an upper bound of time delay margin. Based on simulation results for a scalar model-reference adaptive control system, both the bounded linear stability method and the matrix measure method are seen to provide a reasonably accurate and yet not too conservative time delay margin estimation.
NASA Astrophysics Data System (ADS)
Jiang, Peng; Peng, Lihui; Xiao, Deyun
2007-06-01
This paper presents a regularization method that uses different window functions as regularizers for electrical capacitance tomography (ECT) image reconstruction. Image reconstruction for ECT is a typical ill-posed inverse problem. Because of the small singular values of the sensitivity matrix, the solution is sensitive to measurement noise. The proposed method uses the spectral filtering properties of different window functions to stabilize the solution by suppressing the noise in the measurements. The window functions, such as the Hanning window and the cosine window, are modified for ECT image reconstruction. Simulations with respect to five typical permittivity distributions are carried out. The reconstructions are better, and some of the contours are clearer, than the results from Tikhonov regularization. Numerical results demonstrate the feasibility of image reconstruction using different window functions as regularization.
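Using a window function as a spectral filter can be illustrated directly in terms of the SVD. The sketch below is not the authors' algorithm; the filter factors are simply samples of a Hanning-type taper over the singular-value index, damping the noise-sensitive components associated with small singular values.

```python
import numpy as np

def window_regularized_solve(A, b, window):
    """Solve A x ~ b via the SVD, damping small singular values with
    filter factors taken from a window function over the index."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    n = len(s)
    f = window(np.arange(n), n)        # filter factors in [0, 1]
    coef = f * (U.T @ b) / s
    return Vt.T @ coef

def hanning_factors(i, n):
    # Hanning-type taper: 1 at the largest singular value, -> 0 at the smallest
    return 0.5 * (1.0 + np.cos(np.pi * i / max(n - 1, 1)))
```

With all filter factors set to one, the method reduces to the unregularized pseudo-inverse solution, which is a convenient sanity check.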
Online frequency estimation with applications to engine and generator sets
NASA Astrophysics Data System (ADS)
Manngård, Mikael; Böling, Jari M.
2017-07-01
Frequency and spectral analysis based on the discrete Fourier transform is a fundamental task in signal processing and machine diagnostics. This paper presents computationally efficient methods for real-time estimation of stationary and time-varying frequency components in signals. A brief survey of the sliding time window discrete Fourier transform and the Goertzel filter is presented, and two filter banks, consisting of (i) sliding time window Goertzel filters and (ii) infinite impulse response narrow bandpass filters, are proposed for estimating instantaneous frequencies. The proposed methods show excellent results both in simulation studies and in a case study using angular speed measurements of the crankshaft of a marine diesel engine-generator set.
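The Goertzel filter evaluates a single DFT bin with one recurrence per sample, which is what makes per-bin tracking cheap compared with a full FFT. A minimal block-form sketch (the paper's sliding-window and IIR bandpass variants are not reproduced here):

```python
import math

def goertzel_power(samples, k, n):
    """Power |X[k]|^2 of DFT bin k over an n-sample block, computed with
    the Goertzel second-order recurrence."""
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples[:n]:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # standard Goertzel power formula for the final two states
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2
```

For a pure cosine exactly on bin k (k not 0 or n/2), the DFT magnitude is n/2, so the returned power is (n/2)^2.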
An Adaptive Channel Access Method for Dynamic Super Dense Wireless Sensor Networks.
Lei, Chunyang; Bie, Hongxia; Fang, Gengfa; Zhang, Xuekun
2015-12-03
Super dense and distributed wireless sensor networks have become very popular with the development of small cell technology, the Internet of Things (IoT), Machine-to-Machine (M2M) communications, Vehicle-to-Vehicle (V2V) communications and public safety networks. While densely deployed wireless networks provide one of the most important and sustainable solutions to improve the accuracy of sensing and spectral efficiency, a new channel access scheme needs to be designed to solve the channel congestion problem introduced by the high dynamics of competing nodes accessing the channel simultaneously. In this paper, we first analyze the channel contention problem using a novel normalized channel contention analysis model which provides information on how to tune the contention window according to the state of channel contention. We then propose an adaptive channel contention window tuning algorithm in which the contention window tuning rate is set dynamically based on the estimated channel contention level. Simulation results show that our proposed adaptive channel access algorithm based on fast contention window tuning can achieve more than 95% of the theoretical optimal throughput and a fairness index of 0.97, especially in dynamic and dense networks.
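The abstract does not give the tuning rule itself, so the following is only a generic illustration of contention-level-driven window tuning: the step scales with the distance between the measured contention level and a target, so the window adapts faster under highly dynamic load. All parameter names and values are assumptions.

```python
def tune_cw(cw, observed_collision_rate, target=0.1,
            cw_min=16, cw_max=1024, rate_gain=2.0):
    """One tuning step: grow the contention window when the measured
    contention (collision) level exceeds the target, shrink it otherwise.
    The gain makes the step proportional to the distance from the target."""
    error = observed_collision_rate - target
    factor = 1.0 + rate_gain * error
    return int(min(cw_max, max(cw_min, cw * factor)))
```

Repeated application drives the window up under heavy contention and back down as the channel clears, clamped to the [cw_min, cw_max] range.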
Application of MEMS-based x-ray optics as tuneable nanosecond choppers
NASA Astrophysics Data System (ADS)
Chen, Pice; Walko, Donald A.; Jung, Il Woong; Li, Zhilong; Gao, Ya; Shenoy, Gopal K.; Lopez, Daniel; Wang, Jin
2017-08-01
Time-resolved synchrotron x-ray measurements often rely on a mechanical chopper to isolate a set of x-ray pulses. We have begun developing micro-electromechanical systems (MEMS)-based x-ray optics as an alternative method of manipulating x-ray beams. For x-ray pulse isolation, we recently achieved a pulse-picking time window of half a nanosecond, more than 100 times shorter than mechanical choppers can achieve. The MEMS device consists of a comb-drive silicon micromirror designed to efficiently diffract an x-ray beam during oscillation. The MEMS devices were operated in Bragg geometry, and their oscillation was synchronized to the x-ray pulses with a frequency matching subharmonics of the x-ray pulse repetition frequency. The microscale structure of the silicon mirror, in terms of its curvature and crystalline quality, ensures a narrow angular spread of the Bragg reflection. After discussing the factors that determine the diffractive time window, this report describes our approaches to narrowing the time window to half a nanosecond. The short diffractive time window will allow us to select a single x-ray pulse out of a train of pulses at synchrotron radiation facilities.
Zhang, Mingjing; Wen, Ming; Zhang, Zhi-Min; Lu, Hongmei; Liang, Yizeng; Zhan, Dejian
2015-03-01
Retention time shift is one of the most challenging problems in the preprocessing of massive chromatographic datasets. Here, an improved version of the moving window fast Fourier transform cross-correlation algorithm is presented to perform nonlinear and robust alignment of chromatograms by analyzing the shifts matrix generated by the moving window procedure. The shifts matrix in retention time can be estimated by fast Fourier transform cross-correlation with a moving window procedure. The refined shift of each scan point can be obtained by calculating the mode of the corresponding column of the shifts matrix. This version is simple, but more effective and robust than the previously published moving window fast Fourier transform cross-correlation method. It can handle nonlinear retention time shift robustly if a proper window size is selected; the window size is the only parameter that needs to be adjusted and optimized. The properties of the proposed method are investigated by comparison with the previous moving window fast Fourier transform cross-correlation method and recursive alignment by fast Fourier transform using chromatographic datasets. The pattern recognition results of a gas chromatography mass spectrometry dataset of metabolic syndrome can be improved significantly after preprocessing by this method. Furthermore, the proposed method is available as an open source package at https://github.com/zmzhang/MWFFT2. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
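The shifts-matrix construction can be sketched as a simplified Python analogue of the method (integer shifts only; the cited MWFFT2 repository holds the full implementation), where each scan point's refined shift is the mode over the windows that cover it.

```python
import numpy as np

def fft_xcorr_shift(ref, seg):
    """Integer shift s (via FFT circular cross-correlation) such that
    seg[n] best matches ref[n + s]."""
    n = len(ref)
    c = np.fft.ifft(np.fft.fft(ref) * np.conj(np.fft.fft(seg))).real
    k = int(np.argmax(c))
    return k if k <= n // 2 else k - n   # wrap to a signed shift

def moving_window_shifts(ref, sig, win, step=1):
    """Estimate a shift per window position; each point's shift is the
    mode over all windows covering it (robust to outlier windows)."""
    n = len(ref)
    votes = [[] for _ in range(n)]
    for start in range(0, n - win + 1, step):
        d = fft_xcorr_shift(ref[start:start + win], sig[start:start + win])
        for i in range(start, start + win):
            votes[i].append(d)
    mode = lambda v: max(set(v), key=v.count) if v else 0
    return np.array([mode(v) for v in votes])
```

The sign convention means a signal delayed by d samples yields a shift of -d.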
Improved modified energy ratio method using a multi-window approach for accurate arrival picking
NASA Astrophysics Data System (ADS)
Lee, Minho; Byun, Joongmoo; Kim, Dowan; Choi, Jihun; Kim, Myungsun
2017-04-01
To identify accurately the location of microseismic events generated during hydraulic fracture stimulation, it is necessary to detect the first break of the P- and S-wave arrival times recorded at multiple receivers. These microseismic data often contain high-amplitude noise, which makes it difficult to identify the P- and S-wave arrival times. The short-term-average to long-term-average (STA/LTA) and modified energy ratio (MER) methods are based on the differences in the energy densities of the noise and signal, and are widely used to identify the P-wave arrival times. The MER method yields more consistent results than the STA/LTA method for data with a low signal-to-noise (S/N) ratio. However, although the MER method shows good results regardless of the delay of the signal wavelet for signals with a high S/N ratio, it may yield poor results if the signal is contaminated by high-amplitude noise and does not have the minimum delay. Here we describe an improved MER (IMER) method, whereby we apply a multiple-windowing approach to overcome the limitations of the MER method. The IMER method contains calculations of an additional MER value using a third window (in addition to the original MER window), as well as the application of a moving average filter to each MER data point to eliminate high-frequency fluctuations in the original MER distributions. The resulting distribution makes it easier to apply thresholding. The proposed IMER method was applied to synthetic and real datasets with various S/N ratios and mixed-delay wavelets. The results show that the IMER method yields a high accuracy rate of around 80% within five sample errors for the synthetic datasets. Likewise, in the case of real datasets, 94.56% of the P-wave picking results obtained by the IMER method had a deviation of less than 0.5 ms (corresponding to 2 samples) from the manual picks.
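The underlying MER statistic and the moving-average smoothing step that the IMER method adds can be sketched as below; the additional third-window MER of the full IMER method is omitted, so this is only a simplified skeleton with assumed window lengths.

```python
import numpy as np

def mer(trace, win):
    """Modified energy ratio: (post/pre energy ratio x |amplitude|)^3."""
    x = np.asarray(trace, float)
    e = x * x
    n = len(x)
    out = np.zeros(n)
    for i in range(win, n - win):
        pre = e[i - win:i].sum() + 1e-12   # guard against division by zero
        post = e[i:i + win].sum()
        out[i] = (post / pre * abs(x[i])) ** 3
    return out

def pick_first_break(trace, win, smooth=5):
    """Simplified IMER-style refinement: moving-average the MER curve to
    suppress high-frequency fluctuation before taking its maximum."""
    m = mer(trace, win)
    kern = np.ones(smooth) / smooth
    return int(np.argmax(np.convolve(m, kern, mode="same")))
```

On a noisy synthetic trace with an arrival at sample 300, the pick lands within a few samples of the onset.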
Katzman, G L
2001-03-01
The goal of the project was to create a method by which an in-house digital teaching file could be constructed that was simple, inexpensive, independent of hypertext markup language (HTML) restrictions, and appeared identical on multiple platforms. To accomplish this, Microsoft PowerPoint and Adobe Acrobat were used in succession to assemble digital teaching files in the Acrobat portable document file format. They were then verified to appear identical on computers running Windows, the Macintosh Operating System (OS), and the Silicon Graphics Unix-based OS, either as a free-standing file using Acrobat Reader software or from within a browser window using the Acrobat browser plug-in. The latter display method yields a file viewed through a browser window that remains independent of underlying HTML restrictions, which may confer an advantage over simple HTML teaching file construction. Thus, a hybrid of HTML-distributed, Adobe Acrobat-generated WWW documents may be a viable alternative for digital teaching file construction and distribution.
The method for detecting small lesions in medical image based on sliding window
NASA Astrophysics Data System (ADS)
Han, Guilai; Jiao, Yuan
2016-10-01
At present, research on computer-aided diagnosis involves segmenting sample images, extracting visual features, learning a classification model, and using the generated model to classify the inspected images. However, this approach is computationally expensive and slow. Moreover, because medical images usually have low contrast, traditional image segmentation methods often fail completely when applied to them. To find regions of interest as quickly as possible and improve detection speed, this work introduces the currently popular visual attention model into small lesion detection. The Itti model, however, is designed mainly for natural images, and its results are not ideal for medical images, which are usually grayscale. In particular, in the early stages of some cancers, the lesion is not the most salient region of the whole image and can be very difficult to find, even though it is prominent within its local area. This paper proposes a visual attention mechanism based on a sliding window, using the window to calculate the saliency of each local area. Combining the characteristics of lesions, the features of gray level, entropy, corners, and edges are selected to generate a saliency map. The salient region is then segmented and classified. This method reduces the difficulty of image segmentation, improves the detection accuracy of small lesions, and is of great significance for the early discovery, diagnosis, and treatment of cancers.
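As a toy illustration of window-wise saliency (not the paper's exact feature set or weighting), each window can be scored by, for example, intensity entropy plus gradient energy, taking the per-pixel maximum over windows; the corner feature is omitted here.

```python
import numpy as np

def window_saliency(img, win=16, step=8):
    """Local saliency per sliding window: intensity entropy plus gradient
    energy, as stand-ins for the gray/entropy/edge features in the text."""
    h, w = img.shape
    sal = np.zeros((h, w))
    gy, gx = np.gradient(img.astype(float))
    grad = gx * gx + gy * gy
    for r in range(0, h - win + 1, step):
        for c in range(0, w - win + 1, step):
            patch = img[r:r + win, c:c + win]
            hist, _ = np.histogram(patch, bins=32, range=(0, 256))
            p = hist / max(hist.sum(), 1)
            ent = -np.sum(p[p > 0] * np.log2(p[p > 0]))
            score = ent + grad[r:r + win, c:c + win].mean()
            # a pixel's saliency is its best window's score
            sal[r:r + win, c:c + win] = np.maximum(sal[r:r + win, c:c + win], score)
    return sal
```

A textured patch in an otherwise flat image scores higher than the flat background, which is the behaviour the lesion-detection step relies on.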
Apparatus and method for in-situ cleaning of resist outgassing windows
Klebanoff, Leonard E.; Haney, Steven J.
2001-01-01
An apparatus and method for in-situ cleaning of resist outgassing windows. The apparatus includes a chamber located in a structure, with the chamber having an outgassing window to be cleaned positioned in alignment with a slot in the chamber, whereby radiation energy passes through the window, the chamber, and the slot onto a resist-coated wafer mounted in the structure. The chamber is connected to a gas supply and the structure is connected to a vacuum pump. Within the chamber are two cylindrical sector electrodes, and a filament is electrically connected to one sector electrode and a power supply. In a first cleaning method, the sector electrodes are maintained at the same voltage, the filament is unheated, the chamber is filled with argon (Ar) gas under pressure, and the window is maintained at zero voltage, whereby Ar ions are accelerated onto the window surface, sputtering away carbon deposits that build up as a result of resist outgassing. A second cleaning method is similar except that oxygen gas (O₂) is admitted to the chamber instead of Ar. These two methods can be carried out during lithographic operation. A third method, carried out during a maintenance period, involves admitting CO₂ into the chamber and heating the filament to the point of thermionic emission while the sector electrodes are held at different voltages; excited CO₂ gas molecules are created which impact the carbon contamination on the window and gasify it, producing gaseous CO products that are pumped away.
Near real-time vaccine safety surveillance with partially accrued data.
Greene, Sharon K; Kulldorff, Martin; Yin, Ruihua; Yih, W Katherine; Lieu, Tracy A; Weintraub, Eric S; Lee, Grace M
2011-06-01
The Vaccine Safety Datalink (VSD) Project conducts near real-time vaccine safety surveillance using sequential analytic methods. Timely surveillance is critical in identifying potential safety problems and preventing additional exposure before most vaccines are administered. For vaccines that are administered during a short period, such as influenza vaccines, timeliness can be improved by undertaking analyses while risk windows following vaccination are ongoing and by accommodating predictable and unpredictable data accrual delays. We describe practical solutions to these challenges, which were adopted by the VSD Project during pandemic and seasonal influenza vaccine safety surveillance in 2009/2010. Adjustments were made to two sequential analytic approaches. The Poisson-based approach compared the number of pre-defined adverse events observed following vaccination with the number expected using historical data. The expected number was adjusted for the proportion of the risk window elapsed and the proportion of inpatient data estimated to have accrued. The binomial-based approach used a self-controlled design, comparing the observed numbers of events in risk versus comparison windows. Events were included in analysis only if they occurred during a week that had already passed for both windows. Analyzing data before risk windows fully elapsed improved the timeliness of safety surveillance. Adjustments for data accrual lags were tailored to each data source and avoided biasing analyses away from detecting a potential safety problem, particularly early during surveillance. The timeliness of vaccine and drug safety surveillance can be improved by properly accounting for partially elapsed windows and data accrual delays. Copyright © 2011 John Wiley & Sons, Ltd.
Degradation Prediction Model Based on a Neural Network with Dynamic Windows
Zhang, Xinghui; Xiao, Lei; Kang, Jianshe
2015-01-01
Tracking the degradation of mechanical components is critical for effective maintenance decision making. Remaining useful life (RUL) estimation is a widely used form of degradation prediction. RUL prediction methods for cases where enough run-to-failure condition monitoring data are available have been fully researched, but for some high-reliability components it is very difficult to collect run-to-failure condition monitoring data, i.e., from normal operation to failure. Only a certain number of condition indicators over a certain period can be used to estimate RUL. In addition, some existing prediction methods have poor extrapolability, which blocks RUL estimation: the predicted value converges to a constant or fluctuates within a certain range. Moreover, fluctuating condition features also degrade prediction. To address these dilemmas, this paper proposes a RUL prediction model based on a neural network with dynamic windows. The model mainly consists of three steps: window size determination by increasing rate, change point detection, and rolling prediction. The proposed method has two dominant strengths. One is that it does not need to assume the degradation trajectory follows a certain distribution. The other is that it can adapt to variation in degradation indicators, which greatly benefits RUL prediction. Finally, the performance of the proposed RUL prediction model is validated with real field data and simulation data. PMID:25806873
NASA Astrophysics Data System (ADS)
Zaripov, D. I.; Renfu, Li
2018-05-01
The implementation of high-efficiency digital image correlation methods based on a zero-normalized cross-correlation (ZNCC) procedure for high-speed, time-resolved measurements using a high-resolution digital camera involves big data processing and is often time consuming. To speed up ZNCC computation, a high-speed technique based on a parallel projection correlation procedure is proposed. The proposed technique uses interrogation window projections instead of the window's two-dimensional field of luminous intensity. This simplification accelerates ZNCC computation by up to 28.8 times compared with directly calculated ZNCC, depending on the size of the interrogation window and the region of interest. The results of three synthetic test cases, a one-dimensional uniform flow, a linear shear flow and a turbulent boundary-layer flow, are discussed in terms of accuracy. In the latter case, the proposed technique is implemented together with an iterative window-deformation technique. On the basis of the results of the present work, the proposed technique is recommended for initial velocity field calculation, with further correction using more accurate techniques.
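The core simplification, correlating one-dimensional projections of the interrogation window instead of its full two-dimensional intensity field, can be sketched as follows. Averaging the two axis scores is our assumption; the paper's exact combination rule may differ.

```python
import numpy as np

def zncc_1d(a, b):
    """Zero-normalized cross-correlation of two 1-D signals at zero lag."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom else 0.0

def projection_zncc(win_a, win_b):
    """Approximate 2-D ZNCC by averaging the ZNCCs of the two windows'
    row and column projections (sums of intensity along each axis)."""
    return 0.5 * (zncc_1d(win_a.sum(axis=0), win_b.sum(axis=0)) +
                  zncc_1d(win_a.sum(axis=1), win_b.sum(axis=1)))
```

Each projection costs O(N) to correlate instead of O(N^2) for the full window, which is the source of the reported speed-up.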
Rules based process window OPC
NASA Astrophysics Data System (ADS)
O'Brien, Sean; Soper, Robert; Best, Shane; Mason, Mark
2008-03-01
As a preliminary step towards model-based process window OPC, we have analyzed the impact of correcting post-OPC layouts using rules-based methods. Image processing on the Brion Tachyon was used to identify sites where the OPC model/recipe failed to generate an acceptable solution. A set of rules for 65nm active and poly layers was generated by classifying these failure sites. The rules were based upon segment runlengths, figure spaces, and adjacent figure widths. Comparing the layouts before and after the rules-based operations, 2.1 million sites were corrected on active in a small chip, and 59 million were found on poly. Tachyon analysis of the final reticle layout found weak-margin sites distinct from those repaired by the rules-based corrections. For the active layer, more than 75% of the sites corrected by rules would have printed without a defect, indicating that most rules-based cleanups degrade the lithographic pattern. Some sites were missed by the rules-based cleanups due to either bugs in the DRC software or gaps in the rules table. In the end, dramatic changes to the reticle prevented catastrophic lithography errors, but this method is far too blunt. A more subtle model-based procedure is needed, changing only those sites which have unsatisfactory lithographic margin.
NASA Technical Reports Server (NTRS)
Grosveld, F.; Navaneethan, R.; Roskam, J.
1981-01-01
This paper presents results of a systematic experimental investigation of parameters which affect sound transmission through general aviation structures. Parameters studied include angle of sound incidence, panel curvature, panel stresses, and edge conditions for bare panels; pane thickness, spacing, inclination of window panes, and depressurization for dual pane windows; densities of hard foam and sound absorption materials, air gaps, and trim panel thickness for multilayered panels. Based on the study, some promising methods for reducing interior noise in general aviation airplanes are discussed.
NASA Astrophysics Data System (ADS)
Astawa, INGA; Gusti Ngurah Bagus Caturbawa, I.; Made Sajayasa, I.; Dwi Suta Atmaja, I. Made Ari
2018-01-01
License plate recognition is usually used as part of a larger system, such as a parking system. License plate detection is considered the most important step in a license plate recognition system. We propose methods that can be used to detect the vehicle plate on a mobile phone. In this paper, we used the sliding window, Histogram of Oriented Gradients (HOG), and Support Vector Machine (SVM) methods for license plate detection, to increase the detection rate even when the image is not of good quality. The image is processed by the sliding window method to find the plate position, and feature extraction for each window position is performed with the HOG and SVM methods. Good results were obtained in this research, with an accuracy of 96%.
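The sliding-window detection loop itself is generic and easy to sketch; the scoring callback below stands in for a trained classifier (e.g., an SVM over HOG features), whose training and parameters are not detailed in the abstract.

```python
def sliding_windows(img_w, img_h, win_w, win_h, step):
    """Yield the top-left corner of every window position over the image."""
    for y in range(0, img_h - win_h + 1, step):
        for x in range(0, img_w - win_w + 1, step):
            yield x, y

def detect_plate(img, score_fn, win=(96, 32), step=8, thresh=0.5):
    """Score each window with the classifier callback and keep positions
    scoring above the threshold. img is a 2-D list of intensities."""
    h, w = len(img), len(img[0])
    hits = []
    for x, y in sliding_windows(w, h, win[0], win[1], step):
        patch = [row[x:x + win[0]] for row in img[y:y + win[1]]]
        if score_fn(patch) >= thresh:
            hits.append((x, y))
    return hits
```

In practice the callback would compute HOG features on the patch and return the SVM decision score; here any function of the patch works.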
Zhou, Wei; Wen, Junhao; Qu, Qiang; Zeng, Jun; Cheng, Tian
2018-01-01
Recommender systems are vulnerable to shilling attacks. Forged user-generated content data, such as user ratings and reviews, are used by attackers to manipulate recommendation rankings. Shilling attack detection in recommender systems is of great significance to maintain the fairness and sustainability of recommender systems. The current studies have problems in terms of the poor universality of algorithms, difficulty in selection of user profile attributes, and lack of an optimization mechanism. In this paper, a shilling behaviour detection structure based on abnormal group user findings and rating time series analysis is proposed. This paper adds to the current understanding in the field by studying the credibility evaluation model in-depth based on the rating prediction model to derive proximity-based predictions. A method for detecting suspicious ratings based on suspicious time windows and target item analysis is proposed. Suspicious rating time segments are determined by constructing a time series, and data streams of the rating items are examined and suspicious rating segments are checked. To analyse features of shilling attacks by a group user's credibility, an abnormal group user discovery method based on time series and time window is proposed. Standard testing datasets are used to verify the effect of the proposed method.
Fiber optic sensor based on Mach-Zehnder interferometer for securing entrance areas of buildings
NASA Astrophysics Data System (ADS)
Nedoma, Jan; Fajkus, Marcel; Martinek, Radek; Mec, Pavel; Novak, Martin; Bednarek, Lukas; Vasinek, Vladimir
2017-10-01
The authors of this article focus on the use of interferometric fiber-optic sensors for securing entrance areas of buildings, such as windows and doors. We describe the integration of a Mach-Zehnder fiber-optic interferometer into a window frame or door, the sensor's sensitivity, an analysis of the background noise, and methods of signal evaluation. The advantages of the presented solution are the use of standard G.652.D telecommunication fiber, high sensitivity, immunity to electromagnetic interference (EMI), and the fact that the sensor itself requires no power supply. The authors implemented a graphical user interface (GUI) that enables remote monitoring of the presented sensing solution.
NASA Astrophysics Data System (ADS)
Zhang, Yu-Feng; Dai, Jing-Min; Zhang, Lei; Pan, Wei-Dong
2013-08-01
The spectral emissivity and transmissivity of zinc sulphide (ZnS) infrared windows in the spectral region from 2 to 12 μm and the temperature range from 20 to 700°C are measured by a facility built at the Harbin Institute of Technology (HIT). The facility is based on integrating-sphere reflectometry. Measurements have been performed on two samples made of ZnS. The results measured at 20°C are in good agreement with those obtained by the radiant-energy-comparison method using a Fourier transform infrared spectrometer. Emissivity measurements performed with this facility have an uncertainty of 5.5% (coverage factor = 2).
Infrared small target detection based on multiscale center-surround contrast measure
NASA Astrophysics Data System (ADS)
Fu, Hao; Long, Yunli; Zhu, Ran; An, Wei
2018-04-01
Infrared (IR) small target detection plays a critical role in the Infrared Search And Track (IRST) system. Although it has been studied for years, some difficulties remain in cluttered environments. Following the principle by which humans discriminate small targets from a natural scene, namely that there is a signature of discontinuity between the object and its neighboring regions, we develop an efficient method for infrared small target detection called the multiscale center-surround contrast measure (MCSCM). First, to determine the maximum neighboring window size, an entropy-based window selection technique is used. Then, we construct a novel multiscale center-surround contrast measure to calculate the saliency map. Compared with the original image, the MCSCM map has less background clutter and residual noise. Subsequently, a simple threshold is used to segment the target. Experimental results show that our method achieves better performance.
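A center-surround contrast measure maximized over window scales can be sketched as below; the entropy-based selection of the maximum window size is omitted, and the fixed scales are assumptions.

```python
import numpy as np

def center_surround_contrast(img, scales=(3, 5, 9)):
    """Per-pixel center-surround contrast, maximized over window scales:
    center intensity minus the mean of its surrounding window."""
    img = img.astype(float)
    h, w = img.shape
    out = np.zeros((h, w))
    for s in scales:
        r = s // 2
        for y in range(r, h - r):
            for x in range(r, w - r):
                win = img[y - r:y + r + 1, x - r:x + r + 1]
                center = img[y, x]
                surround = (win.sum() - center) / (win.size - 1)
                out[y, x] = max(out[y, x], center - surround)
    return out
```

A small bright target over a flat background produces a strong positive response at the target and zero elsewhere, so a simple threshold separates it.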
Power-Efficient Beacon Recognition Method Based on Periodic Wake-Up for Industrial Wireless Devices
Lee, Donghun; Jang, Ingook; Choi, Jinchul; Son, Youngsung
2018-01-01
Energy harvester-integrated wireless devices are attractive for generating semi-permanent power from wasted energy in industrial environments. Energy-harvesting wireless devices may have difficulty communicating with access points because the power supply is insufficient for beacon recognition during network initialization. In this manuscript, we propose a novel beacon recognition method based on wake-up control to reduce instantaneous power consumption in the initialization procedure. The proposed method applies a moving window for the periodic wake-up of the wireless devices. For unsynchronized wireless devices, beacons are always located at the same positions within each beacon interval even though the starting offsets are unknown. Using this characteristic, the moving window checks for the beacon in specified resources of one beacon interval, checks neighboring resources in the next beacon interval, and so on. This method reduces instantaneous power and generates a surplus of charging time. Thus, the proposed method alleviates the problem of power insufficiency during network initialization. The feasibility of the proposed method is evaluated using computer simulations of power shortage under various energy-harvesting conditions. PMID:29673206
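The moving-window scan can be illustrated with a toy model. The slot layout, interval length, and return convention below are hypothetical; the sketch only shows why an unsynchronized device eventually hears a beacon that recurs at a fixed offset in every interval.

```python
def find_beacon_slot(beacon_offset, interval, window_len):
    """Scan one window position per beacon interval until the beacon is heard.

    The device wakes only for `window_len` time units per interval, sliding
    the window by one position each interval; this keeps instantaneous power
    low at the cost of initialization latency. Returns (slot, intervals_spent).
    """
    n_slots = interval // window_len
    for i in range(n_slots):
        lo, hi = i * window_len, (i + 1) * window_len
        # The beacon repeats at the same offset in every interval, so
        # checking window i during interval i covers all offsets in turn.
        if lo <= beacon_offset < hi:
            return i, i + 1
    return None
```

In the worst case the device spends `interval // window_len` intervals scanning, but its duty cycle per interval never exceeds `window_len / interval`.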
Lang, Augustus W; Li, Yuanyuan; De Keersmaecker, Michel; Shen, D Eric; Österholm, Anna M; Berglund, Lars; Reynolds, John R
2018-03-09
Transparent wood composites, with their high strength and toughness, thermal insulation, and excellent transmissivity, offer a route to replace glass for diffusely transmitting windows. Here, conjugated-polymer-based electrochromic devices (ECDs) that switch on demand are demonstrated using transparent wood coated with poly(3,4-ethylenedioxythiophene):poly(styrene sulfonate) (PEDOT:PSS) as a transparent conducting electrode. These ECDs exhibit a vibrant magenta-to-clear color change that results from a remarkably colorless bleached state. Furthermore, they require low energy and power inputs of 3 mWh m⁻² at 2 W m⁻² to switch due to a high coloration efficiency (590 cm² C⁻¹) and low driving voltage (0.8 V). Each device component is processed with high-throughput methods, which highlights the opportunity to apply this approach to fabricate mechanically robust, energy-efficient smart windows on a large scale. © 2018 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
Gap-filling methods to impute eddy covariance flux data by preserving variance.
NASA Astrophysics Data System (ADS)
Kunwor, S.; Staudhammer, C. L.; Starr, G.; Loescher, H. W.
2015-12-01
To represent carbon dynamics, in terms of the exchange of CO2 between terrestrial ecosystems and the atmosphere, eddy covariance (EC) data have been collected using eddy flux towers at various sites across the globe for more than two decades. However, EC measurements are missing for various reasons: precipitation, routine maintenance, or lack of vertical turbulence. In order to obtain estimates of net ecosystem exchange of carbon dioxide (NEE) with high precision and accuracy, robust gap-filling methods to impute missing data are required. While the methods used so far have provided robust estimates of the mean value of NEE, little attention has been paid to preserving the variance structure embodied in the flux data. Preserving the variance of these data will provide unbiased and precise estimates of NEE over time that mimic natural fluctuations. We used a non-linear regression approach with moving windows of different lengths (15, 30, and 60 days) to estimate non-linear regression parameters for one year of flux data from a longleaf pine site at the Joseph W. Jones Ecological Research Center, taking the Michaelis-Menten and Van't Hoff functions as our base models. We assessed the potential physiological drivers of these parameters with linear models using micrometeorological predictors. We then used a parameter-prediction approach to refine the non-linear gap-filling equations based on micrometeorological conditions. This provides an opportunity to incorporate additional variables, such as vapor pressure deficit (VPD) and volumetric water content (VWC), into the equations. Our preliminary results indicate that gains in gap-filling can be achieved with a 30-day moving window and additional micrometeorological predictors (as indicated by a lower root mean square error (RMSE) of the predicted NEE values).
Our next steps are to use these parameter predictions from moving windows to gap-fill the data with and without the incorporation of potential driver variables beyond the parameters traditionally used. Predictions from these methods will then be compared with 'traditional' gap-filling methods (using 12 fixed monthly windows) to assess the extent to which variance is preserved. Further, the method will be applied to artificially created gaps to analyze whether variance is preserved.
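As a rough sketch of the window-based gap-filling step, the fragment below fits the Michaelis-Menten light-response form on one window by grid search and imputes missing NEE values from the fitted curve. The parameter grids and the sign convention for NEE are assumptions; the authors' actual fitting procedure (and the Van't Hoff night-time model) is not reproduced.

```python
def michaelis_menten(par, alpha, p_max, r_d):
    """Daytime NEE light response: uptake saturates with PAR, plus respiration."""
    return -(alpha * par * p_max) / (alpha * par + p_max) + r_d

def fit_window(par_obs, nee_obs, alphas, p_maxes, r_ds):
    """Grid-search least-squares fit of the three parameters on one window."""
    best, best_sse = None, float("inf")
    for a in alphas:
        for p in p_maxes:
            for r in r_ds:
                sse = sum((michaelis_menten(x, a, p, r) - y) ** 2
                          for x, y in zip(par_obs, nee_obs))
                if sse < best_sse:
                    best, best_sse = (a, p, r), sse
    return best

def gap_fill(par_all, nee_all, params):
    """Impute missing NEE (None) with the window's fitted curve."""
    a, p, r = params
    return [nee if nee is not None else michaelis_menten(x, a, p, r)
            for x, nee in zip(par_all, nee_all)]
```

In a moving-window scheme this fit would be repeated for each 15-, 30-, or 60-day window, with the fitted parameters then modeled against micrometeorological drivers such as VPD and VWC.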
Bruno, Oscar P.; Turc, Catalin; Venakides, Stephanos
2016-01-01
This work, part I in a two-part series, presents: (i) a simple and highly efficient algorithm for evaluation of quasi-periodic Green functions, as well as (ii) an associated boundary-integral equation method for the numerical solution of problems of scattering of waves by doubly periodic arrays of scatterers in three-dimensional space. Except for certain ‘Wood frequencies’ at which the quasi-periodic Green function ceases to exist, the proposed approach, which is based on smooth windowing functions, gives rise to tapered lattice sums which converge superalgebraically fast to the Green function—that is, faster than any power of the number of terms used. This is in sharp contrast to the extremely slow convergence exhibited by the lattice sums in the absence of smooth windowing. (The Wood-frequency problem is treated in part II.) This paper establishes rigorously the superalgebraic convergence of the windowed lattice sums. A variety of numerical results demonstrate the practical efficiency of the proposed approach. PMID:27493573
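A one-dimensional toy analogue illustrates why smooth windowing accelerates slowly convergent oscillatory sums. The series Σ cos(nθ)/n, with closed form -ln(2 sin(θ/2)), stands in for a lattice sum; the C∞ cutoff below is one common choice, not the paper's window.

```python
import math

def smooth_window(x):
    """C-infinity cutoff: 1 for x <= 1/2, 0 for x >= 1, smooth in between."""
    if x <= 0.5:
        return 1.0
    if x >= 1.0:
        return 0.0
    u = lambda t: math.exp(-1.0 / t) if t > 0 else 0.0
    return u(1.0 - x) / (u(1.0 - x) + u(x - 0.5))

def windowed_sum(theta, n_terms):
    """Tapered partial sum of sum_{n>=1} cos(n*theta)/n."""
    return sum(math.cos(n * theta) / n * smooth_window(n / n_terms)
               for n in range(1, n_terms + 1))

def raw_sum(theta, n_terms):
    """Abruptly truncated partial sum, for comparison."""
    return sum(math.cos(n * theta) / n for n in range(1, n_terms + 1))
```

With the same number of terms, the tapered sum is far closer to the limit than the truncated one, mirroring the superalgebraic convergence of the paper's windowed lattice sums away from Wood frequencies.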
Design and fabrication of a large area freestanding compressive stress SiO2 optical window
NASA Astrophysics Data System (ADS)
Van Toan, Nguyen; Sangu, Suguru; Ono, Takahito
2016-07-01
This paper reports the design and fabrication of a 7.2 mm × 9.6 mm freestanding compressive-stress SiO2 optical window without buckling. An application of the SiO2 optical window, with and without liquid penetration, is demonstrated for an optical modulator, and its optical characteristics are evaluated using an image sensor. Two fabrication methods for the SiO2 optical window are presented. The first is a combination of silicon etching and thermal oxidation: silicon capillaries fabricated by deep reactive ion etching (deep RIE) are completely oxidized to form SiO2 capillaries. The large compressive stress of the oxide causes buckling of the optical window, which is reduced by optimizing the design of the device structure. The second method is a magnetron-type RIE investigated for deep SiO2 etching; it achieves deep SiO2 etching with smooth surfaces, vertical sidewalls, and a high aspect ratio. Additionally, to avoid wrinkling of the optical window, a Peano-curve structure is proposed to achieve a freestanding compressive-stress SiO2 optical window. A 7.2 mm × 9.6 mm optical window without buckling, integrated with an image sensor for an optical modulator, has been successfully fabricated. Qualitative and quantitative evaluations have been performed with and without liquid penetration.
Oval Window Size and Shape: a Micro-CT Anatomical Study With Considerations for Stapes Surgery.
Zdilla, Matthew J; Skrzat, Janusz; Kozerska, Magdalena; Leszczyński, Bartosz; Tarasiuk, Jacek; Wroński, Sebastian
2018-06-01
The oval window is an important structure with regard to stapes surgeries, including stapedotomy for the treatment of otosclerosis. Recent study of perioperative imaging of the oval window has revealed that oval window niche height can indicate both operative difficulty and subjective discomfort during otosclerosis surgery. With regard to shape, structures incorporated into the oval window niche, such as cartilage grafts, must be compatible with the shape of the oval window. Despite the clinical importance of the oval window, there is little information regarding its size and shape. This study assessed oval window size and shape via micro-computed tomography paired with modern morphometric methodology in the fetal, infant, child, and adult populations. Additionally, the study compared oval window size and shape between sexes and between left- and right-sided ears. No significant differences were found among traditional morphometric parameters among age groups, sides, or sexes. However, geometric morphometric methods revealed shape differences between age groups. Further, geometric morphometric methods provided the average oval window shape and most-likely shape variance. Beyond demonstrating oval window size and shape variation, the results of this report will aid in identifying patients among whom anatomical variation may contribute to surgical difficulty and surgeon discomfort, or otherwise warrant preoperative adaptations for the incorporation of materials into and around the oval window.
NASA Technical Reports Server (NTRS)
Forssen, B.; Wang, Y. S.; Crocker, M. J.
1981-01-01
Several aspects were studied. SEA theory was used to develop a theoretical model to predict the transmission loss through an aircraft window. This work mainly consisted of writing two computer programs. One program predicts the sound transmission through a plexiglass window (the case of a single partition). The other applies to a plexiglass window with a window shade added (the case of a double partition with an air gap). The sound transmission through a structure was measured in experimental studies using several different methods so that the accuracy and complexity of the methods could be compared. The measurements were conducted on a simple model of a fuselage (a cylindrical shell), on a real aircraft fuselage, and on stiffened panels.
NASA Astrophysics Data System (ADS)
Forssen, B.; Wang, Y. S.; Crocker, M. J.
1981-12-01
Several aspects were studied. SEA theory was used to develop a theoretical model to predict the transmission loss through an aircraft window. This work mainly consisted of writing two computer programs. One program predicts the sound transmission through a plexiglass window (the case of a single partition). The other applies to a plexiglass window with a window shade added (the case of a double partition with an air gap). The sound transmission through a structure was measured in experimental studies using several different methods so that the accuracy and complexity of the methods could be compared. The measurements were conducted on a simple model of a fuselage (a cylindrical shell), on a real aircraft fuselage, and on stiffened panels.
Measurement of skeletal related events in SEER-Medicare: a comparison of claims-based methods.
Aly, Abdalla; Onukwugha, Eberechukwu; Woods, Corinne; Mullins, C Daniel; Kwok, Young; Qian, Yi; Arellano, Jorge; Balakumaran, Arun; Hussain, Arif
2015-08-19
Skeletal related events (SREs) are common in men with metastatic prostate cancer (mPC). Various methods have been used to identify SREs from claims data. The objective of this study was to provide a framework for measuring SREs from claims and to compare SRE prevalence and cumulative incidence estimates based on alternative approaches in men with mPC. Several claims-based approaches for identifying SREs were developed and applied to data for men aged ≥66 years newly diagnosed with mPC between 2000 and 2009 in the SEER-Medicare datasets and followed through 2010 or until censoring. Post-diagnosis SREs were identified using claims that indicated spinal cord compression (SCC), pathologic fracture (PF), surgery to bone (BS), or radiation suggestive of bone palliative radiation (RAD). To measure SRE prevalence, two SRE definitions were created: a 'base case' (most commonly used in the literature) and an 'alternative' in which different claims were used to identify each type of SRE. To measure cumulative incidence, we used the 'base case' definition and applied three periods in which claims were clustered into episodes: 14-, 21-, and 28-day windows. Among 8997 mPC patients, 46 % experienced an SRE according to the 'base case' definition and 43 % according to the 'alternative' definition; varying the code definition between the two thus resulted in an 8 % difference in overall SRE prevalence. Using the 21-day window, a total of 12,930 SRE episodes were observed during follow-up. Varying the window length from 21 to 28 days resulted in an 8 % decrease in SRE cumulative incidence (RAD: 10 %, PF: 8 %, SCC: 6 %, BS: 0.2 %). SRE prevalence was affected by the codes used, with PF being most impacted. The overall SRE cumulative incidence was affected by the window length used, with RAD being most affected.
These results underscore the importance of the baseline definitions used when studying claims data to understand clinically relevant events such as SREs in the real-world setting.
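One simple way to implement the claim-clustering step is sketched below; the rule that a claim within the window of an episode's first claim joins that episode is an assumption, since the abstract does not spell out its clustering algorithm.

```python
def count_episodes(claim_days, window):
    """Cluster same-type SRE claim dates (days since diagnosis) into episodes.

    A claim within `window` days of the current episode's first claim is
    folded into that episode; otherwise it opens a new one. The study
    compares 14-, 21-, and 28-day windows.
    """
    episodes = 0
    anchor = None  # day of the current episode's first claim
    for day in sorted(claim_days):
        if anchor is None or day - anchor > window:
            episodes += 1
            anchor = day
    return episodes
```

Note how the same claim history can yield fewer episodes under a longer window, which is exactly the sensitivity the study quantifies (an 8 % drop in cumulative incidence from 21 to 28 days).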
Helicopter TEM parameters analysis and system optimization based on time constant
NASA Astrophysics Data System (ADS)
Xiao, Pan; Wu, Xin; Shi, Zongyang; Li, Jutao; Liu, Lihua; Fang, Guangyou
2018-03-01
The helicopter transient electromagnetic (TEM) method is a common geophysical prospecting method, widely used in mineral detection, groundwater exploration, and environmental investigation. In order to develop an efficient helicopter TEM system, it is necessary to analyze and optimize the system parameters. In this paper, a simple and quantitative method is proposed to analyze system parameters such as waveform, power, base frequency, measured field, and sampling time. A wire-loop model is used to define a comprehensive 'time constant domain' that covers a range of time constants, analogous to a range of conductances, after which the characteristics of the system parameters in this domain are obtained. It is found that the distortion caused by the transmitting base frequency is less than 5% when the ratio of the transmitting period to the target time constant is greater than 6. When the sampling time window is less than the target time constant, the distortion caused by the sampling time window is less than 5%. Based on this method, a helicopter TEM system called CASHTEM was designed, and a flight test was carried out in a known mining area. The test results show that the system has good detection performance, verifying the effectiveness of the method.
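One way to rationalize the quoted threshold, under the assumption (ours, not stated in the abstract) that the dominant base-frequency distortion is the un-decayed residual of the previous half-cycle's transient: for a target whose secondary field decays as exp(-t/τ), the residual one half-period T/2 after turn-off is exp(-(T/2)/τ), which falls below 5% precisely when T/τ > 6, since exp(-3) ≈ 0.0498.

```python
import math

def residual_distortion(period_over_tau):
    """Residual fraction of the previous half-cycle's transient at the next
    measurement, exp(-(T/2)/tau), for an exponentially decaying target."""
    return math.exp(-period_over_tau / 2.0)
```

This back-of-the-envelope check reproduces the 5% / ratio-of-6 rule of thumb; the paper's full analysis over the time-constant domain is of course more general.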
Sabushimike, Donatien; Na, Seung You; Kim, Jin Young; Bui, Ngoc Nam; Seo, Kyung Sik; Kim, Gil Gyeom
2016-01-01
The detection of a moving target using an IR-UWB radar involves the core task of separating the waves reflected by the static background from those reflected by the moving target. This paper investigates the capacity of the low-rank and sparse matrix decomposition approach to separate the background and the foreground in UWB radar-based moving target detection. Robust PCA models are criticized for being batch-oriented, which makes them inconvenient in realistic environments where frames need to be processed as they are recorded in real time. In this paper, a novel method based on overlapping-windows processing is proposed to cope with online processing. The method processes a small batch of frames that is continually updated, without changing its size, as new frames are captured. We show that RPCA (via its Inexact Augmented Lagrange Multiplier (IALM) model) can successfully separate the two subspaces, which enhances the accuracy of target detection. The overlapping-windows processing method converges to the same optimal solution as its batch counterpart (i.e., processing batched data with RPCA), and both methods demonstrate the robustness and efficiency of RPCA over classic PCA and the commonly used exponential averaging method. PMID:27598159
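The overlapping-window bookkeeping can be sketched independently of the decomposition itself. Below, a per-sample median over the buffer stands in for the low-rank background component (the paper uses RPCA via inexact ALM); only the fixed-size, continually updated window is the point of the sketch.

```python
from collections import deque
from statistics import median

class OverlappingWindowSeparator:
    """Online background/foreground separation over a sliding frame buffer.

    The buffer keeps a fixed number of the most recent radar frames; each
    new frame evicts the oldest, so consecutive windows overlap heavily.
    The background model here is a per-sample median over the buffer, a
    cheap stand-in for the low-rank component an RPCA solver would return.
    """

    def __init__(self, window_size):
        self.buffer = deque(maxlen=window_size)

    def process(self, frame):
        self.buffer.append(frame)
        n = len(frame)
        background = [median(f[i] for f in self.buffer) for i in range(n)]
        foreground = [frame[i] - background[i] for i in range(n)]
        return background, foreground
```

Swapping the median line for an IALM-RPCA call on the buffered matrix recovers the paper's scheme; the buffer-update logic is unchanged.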
HPOTP low-speed flexible rotor balancing, phase 1
NASA Technical Reports Server (NTRS)
Giordano, J.; Zorzi, E.
1985-01-01
A method was developed that shows promise in overcoming many balancing limitations. This method establishes one or more windows for low speed, out-of-housing balancing of flexible rotors. These windows are regions of speed and support flexibility where two conditions are simultaneously fulfilled. First, the rotor system behaves flexibly; therefore, there is separation among balance planes. Second, the response due to balance weights is large enough to reliably measure. The analytic formulation of the low-speed flexible rotor balancing method is described. The results of proof-of-principle tests conducted under the program are presented. Based on this effort, it is concluded that low speed flexible rotor balancing is a viable technology. In particular, the method can be used to balance a rotor bearing system at low speed which results in smooth operation above more than one bending critical speed. Furthermore, this balancing methodology is applicable to SSME turbopump rotors.
Wang, WeiBo; Sun, Wei; Wang, Wei; Szatkiewicz, Jin
2018-03-01
The application of high-throughput sequencing in a broad range of quantitative genomic assays (e.g., DNA-seq, ChIP-seq) has created a high demand for the analysis of large-scale read-count data. Typically, the genome is divided into tiling windows, and windowed read-count data are generated for the entire genome, from which genomic signals are detected (e.g., copy number changes in DNA-seq, enrichment peaks in ChIP-seq). For accurate analysis of read-count data, many state-of-the-art statistical methods use generalized linear models (GLM) coupled with the negative-binomial (NB) distribution, leveraging their ability to perform simultaneous bias correction and signal detection. However, although statistically powerful, the GLM+NB method has quadratic computational complexity and therefore suffers from slow running times when applied to large-scale windowed read-count data. In this study, we aimed to substantially speed up the GLM+NB method using a randomized algorithm, and we demonstrate the utility of our approach in the detection of copy number variants (CNVs) using a real example. We propose an efficient estimator, the randomized GLM+NB coefficients estimator (RGE), for speeding up the GLM+NB method. RGE samples the read-count data and solves the estimation problem on a smaller scale. We first theoretically validated the consistency and variance properties of RGE. We then applied RGE to GENSENG, a GLM+NB based method for detecting CNVs, naming the resulting method "R-GENSENG". Based on extensive evaluation using both simulated and empirical data, we conclude that R-GENSENG is ten times faster than the original GENSENG while maintaining GENSENG's accuracy in CNV detection. Our results suggest that the RGE strategy developed here could be applied to other GLM+NB based read-count analyses (e.g., ChIP-seq data analysis) to substantially improve their computational efficiency while preserving analytic power.
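The subsample-then-fit idea behind RGE can be illustrated with a one-covariate Poisson log-linear model in place of the NB GLM (same IRLS skeleton, simpler variance function); the covariate, sample fraction, and seed below are arbitrary illustrative choices, not the paper's setup.

```python
import math
import random

def poisson_irls(x, y, iters=25):
    """Newton/IRLS fit of a Poisson log-linear model log(mu) = b0 + b1*x.

    Stands in for the NB GLM: the update solves the 2x2 weighted
    least-squares system X'WX delta = X'(y - mu) with W = diag(mu).
    """
    b0, b1 = math.log(sum(y) / len(y)), 0.0  # start at the mean rate
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * xi) for xi in x]
        s0 = sum(mu)
        s1 = sum(m * xi for m, xi in zip(mu, x))
        s2 = sum(m * xi * xi for m, xi in zip(mu, x))
        g0 = sum(yi - m for yi, m in zip(y, mu))
        g1 = sum((yi - m) * xi for yi, m, xi in zip(y, mu, x))
        det = s0 * s2 - s1 * s1
        b0 += (s2 * g0 - s1 * g1) / det
        b1 += (s0 * g1 - s1 * g0) / det
    return b0, b1

def randomized_fit(x, y, frac, seed=0):
    """The RGE idea: solve the same estimation problem on a random
    subsample of the windows, trading a little variance for speed."""
    rng = random.Random(seed)
    idx = rng.sample(range(len(x)), int(frac * len(x)))
    return poisson_irls([x[i] for i in idx], [y[i] for i in idx])
```

Because each IRLS pass is linear in the number of windows retained, fitting on a fraction of the windows cuts the per-iteration cost by the same fraction while the coefficient estimates stay consistent.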
Thermal/structural/optical integrated design for optical sensor mounted on unmanned aerial vehicle
NASA Astrophysics Data System (ADS)
Zhang, Gaopeng; Yang, Hongtao; Mei, Chao; Wu, Dengshan; Shi, Kui
2016-01-01
With the rapid development of science and technology and the impetus of many local wars around the world, high-altitude optical sensors mounted on unmanned aerial vehicles are widely applied in airborne remote sensing, measurement, and detection. In order to obtain high-quality images from an aerial optical remote sensor, it is important to analyze its thermal-optical performance under high-speed, high-altitude conditions. Especially for key imaging assemblies such as the optical window, temperature variation and temperature gradients can result in defocus and aberrations in the optical system, leading to poor image quality. In order to improve the optical performance of the optical window of a high-speed aerial camera, a thermal/structural/optical integrated design method is developed. First, the flight environment of the optical window is analyzed. Based on the theory of aerodynamics and heat transfer, the convective heat transfer coefficient is calculated. The temperature distribution of the optical window is simulated with finite element analysis software, and the maximum temperature difference between the inside and outside of the optical window is obtained. Then the deformation of the optical window under the boundary condition of this maximum temperature difference is calculated. The optical window surface deformation is fitted with Zernike polynomials at the interface, and the fitted Zernike coefficients are imported into and analyzed with CODE V optical software. Finally, the transfer function diagrams of the optical system over the temperature field are comparatively analyzed. The results show that the optical path difference caused by thermal deformation of the optical window is 138.2 nm, which satisfies PV ≤ λ/4. This study can serve as an important reference for other optical window designs.
Process Flow Features as a Host-Based Event Knowledge Representation
2012-06-14
an executing process during a window of time called a process flow. Process flows are calculated from key process data structures extracted from…
(Remainder of excerpt is list-of-figures residue; recoverable entries: a figure for Cluster 98, and Davies-Bouldin/Dunn index plots for sliding windows of sizes 5, 10, and 20 on Windows 7.)
Super-Resolution for Color Imagery
2017-09-01
separately; however, it requires performing the super-resolution computation 3 times. We transform images from the default red, green, blue (RGB) color space… chrominance components based on ARL's alias-free image upsampling using Fourier-based windowing methods. A reverse transformation is performed on…
(Remainder of excerpt is figure-list residue; recoverable captions: transformation from sRGB to CIELAB; YCbCr mathematical coordinate transformation.)
Mock Target Window OTR and IR Design and Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wass, Alexander Joseph
In order to fully verify temperature measurements made on the target window using infrared (IR) optical non-contact methods, comparative measurements are made with a real beam distribution as the heat source using Argonne National Laboratory's (ANL) 35 MeV electron accelerator. Using Monte Carlo N-Particle (MCNP) simulations and thermal finite element analysis (FEA), a cooled mock target window with thermocouple implants is designed for use in such a test, achieving window temperatures up to 700°C. Uncoated and black-coated mock windows are designed to enhance the IR temperature measurements and to verify optical transition radiation (OTR) imagery. This allows us to fully verify and characterize the temperature accuracy of our current IR camera method, and of any future method we may wish to explore, under actual production conditions. The test also provides valuable conclusions and concerns regarding the calibration method we developed using our IR test stand at TA-53 in MPF-14.
Overview of Fabrication Techniques and Lessons Learned with Accelerator Vacuum Windows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ader, C. R.; McGee, M. W.; Nobrega, L. E.
Vacuum thin windows have been used in Fermilab's accelerators for decades and have typically been overlooked in terms of their criticality and fragility. Vacuum windows allow beam to pass through while creating a boundary between vacuum and air, or between high-vacuum and low-vacuum regions. The design of vacuum windows, including titanium and beryllium windows, will be discussed, as well as fabrication, testing, and operational concerns. Window failures will be reviewed, along with safety approaches to mitigating failures and extending the lifetimes of vacuum windows. Various methods of calculating the strength of vacuum windows will be explored, including FEA.
NASA Astrophysics Data System (ADS)
Rougier, Simon; Puissant, Anne; Stumpf, André; Lachiche, Nicolas
2016-09-01
Vegetation monitoring is becoming a major issue in the urban environment due to the services vegetation provides, and it necessitates accurate and up-to-date mapping. Very High Resolution satellite images enable detailed mapping of urban tree and herbaceous vegetation. Several supervised classifications with statistical learning techniques have provided good results for the detection of urban vegetation, but they require a large amount of training data. In this context, this study investigates the performance of different sampling strategies in order to reduce the number of examples needed. Two window-based active learning algorithms from the state of the art are compared to a classical stratified random sampling, and a third strategy combining active learning and stratification is proposed. The efficiency of these strategies is evaluated on two medium-sized French cities, Strasbourg and Rennes, associated with different datasets. Results demonstrate that classical stratified random sampling can in some cases be just as effective as active learning methods and that it should be used more frequently to evaluate new active learning methods. Moreover, the active learning strategies proposed in this work reduce the computational runtime by selecting multiple windows at each iteration without increasing the number of windows needed.
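The two sampling strategies being compared can be sketched as follows; the class labels, proportional-allocation rule, and 0.5-uncertainty criterion are illustrative assumptions, not the paper's exact algorithms.

```python
import random
from collections import defaultdict

def stratified_sample(window_ids, labels, n_total, seed=0):
    """Stratified random sampling of training windows: the per-class
    allocation is proportional to class frequency (at least one per class)."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for wid, lab in zip(window_ids, labels):
        strata[lab].append(wid)
    chosen = []
    for lab in sorted(strata):
        members = strata[lab]
        k = max(1, round(n_total * len(members) / len(window_ids)))
        chosen.extend(rng.sample(members, min(k, len(members))))
    return chosen

def uncertainty_batch(window_ids, scores, batch_size):
    """Active-learning step: select the batch of windows whose predicted
    class probability lies closest to the 0.5 decision boundary."""
    return sorted(window_ids, key=lambda w: abs(scores[w] - 0.5))[:batch_size]
```

Selecting `batch_size > 1` windows per iteration, as in the second function, is what lets the active learning variants cut runtime without needing more labeled windows overall.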
Guide to Mathematics Released Items: Understanding Scoring. 2015
ERIC Educational Resources Information Center
Partnership for Assessment of Readiness for College and Careers, 2015
2015-01-01
The 2014-2015 administrations of the PARCC assessment included two separate test administration windows: the Performance-Based Assessment (PBA) and the End-of-Year (EOY), both of which were administered in paper-based and computer-based formats. The first window was for administration of the PBA, and the second window was for the administration of…
Method of fabricating a microelectronic device package with an integral window
Peterson, Kenneth A.; Watson, Robert D.
2003-01-01
A method of fabricating a microelectronic device package with an integral window for providing optical access through an aperture in the package. The package is made of a multilayered insulating material, e.g., a low-temperature cofired ceramic (LTCC) or high-temperature cofired ceramic (HTCC). The window is inserted in-between personalized layers of ceramic green tape during stackup and registration. Then, during baking and firing, the integral window is simultaneously bonded to the sintered ceramic layers of the densified package. Next, the microelectronic device is flip-chip bonded to cofired thick-film metallized traces on the package, where the light-sensitive side is optically accessible through the window. Finally, a cover lid is attached to the opposite side of the package. The result is a compact, low-profile package, flip-chip bonded, hermetically-sealed package having an integral window.
A large, switchable optical clearing skull window for cerebrovascular imaging
Zhang, Chao; Feng, Wei; Zhao, Yanjie; Yu, Tingting; Li, Pengcheng; Xu, Tonghui; Luo, Qingming; Zhu, Dan
2018-01-01
Rationale: Intravital optical imaging is a significant method for investigating cerebrovascular structure and function. However, its imaging contrast and depth are limited by the turbid skull. Tissue optical clearing has a great potential for solving this problem. Our goal was to develop a transparent skull window, without performing a craniotomy, for use in assessing cerebrovascular structure and function. Methods: Skull optical clearing agents were topically applied to the skulls of mice to create a transparent window within 15 min. The clearing efficacy, repeatability, and safety of the skull window were then investigated. Results: Imaging through the optical clearing skull window enhanced both the contrast and the depth of intravital imaging. The skull window could be used on 2-8-month-old mice and could be expanded from regional to bi-hemispheric. In addition, the window could be repeatedly established without inducing observable inflammation and metabolic toxicity. Conclusion: We successfully developed an easy-to-handle, large, switchable, and safe optical clearing skull window. Combined with various optical imaging techniques, cerebrovascular structure and function can be observed through this optical clearing skull window. Thus, it has the potential for use in basic research on the physiopathologic processes of cortical vessels. PMID:29774069
New machining method of high precision infrared window part
NASA Astrophysics Data System (ADS)
Yang, Haicheng; Su, Ying; Xu, Zengqi; Guo, Rui; Li, Wenting; Zhang, Feng; Liu, Xuanmin
2016-10-01
The spherical shell of a multifunctional photoelectric instrument is usually designed with multiple optical channels to accommodate the different bands of its sensors, mainly TV, laser, and infrared channels. Without affecting the optical aperture, wind resistance, and aerodynamic performance of the optical system, the overall layout of the spherical shell is optimized to save space and reduce weight. Most of the optical windows are specially shaped; each optical window directly participates in the high-resolution imaging of the corresponding sensor system, and the optical-axis parallelism of each sensor must meet an accuracy requirement of 0.05 mrad. Therefore, the quality of precision machining of the optical window parts directly affects the pointing accuracy and interchangeability of the photoelectric system. Processing and testing of the TV and laser windows are very mature, whereas infrared window parts, because of the special nature of the material (transparency and high refractive index), present problems of imaging quality and of controlling the minimum focal length and second-level parallelism during processing. Based on years of practical experience, this paper focuses on how to control the surface-form and parallelism accuracy of infrared window parts during processing. The single-pass rate was increased from 40% to more than 95%, and processing efficiency was significantly enhanced, effectively solving a bottleneck problem in research and production.
Superconductive radiofrequency window assembly
Phillips, Harry Lawrence; Elliott, Thomas S.
1998-01-01
The present invention is a superconducting radiofrequency window assembly for use in an electron beam accelerator. The srf window assembly (20) has a superconducting metal-ceramic design. The srf window assembly (20) comprises a superconducting frame (30), a ceramic plate (40) having a superconducting metallized area, and a superconducting eyelet (50) for sealing plate (40) into frame (30). The plate (40) is brazed to eyelet (50) which is then electron beam welded to frame (30). A method for providing a ceramic object mounted in a metal member to withstand cryogenic temperatures is also provided. The method involves a new metallization process for coating a selected area of a ceramic object with a thin film of a superconducting material. Finally, a method for assembling an electron beam accelerator cavity utilizing the srf window assembly is provided. The procedure is carried out within an ultra clean room to minimize exposure to particulates which adversely affect the performance of the cavity within the electron beam accelerator.
Superconductive radiofrequency window assembly
Phillips, H.L.; Elliott, T.S.
1998-05-19
The present invention is a superconducting radiofrequency window assembly for use in an electron beam accelerator. The SRF window assembly has a superconducting metal-ceramic design. The SRF window assembly comprises a superconducting frame, a ceramic plate having a superconducting metallized area, and a superconducting eyelet for sealing plate into frame. The plate is brazed to eyelet which is then electron beam welded to frame. A method for providing a ceramic object mounted in a metal member to withstand cryogenic temperatures is also provided. The method involves a new metallization process for coating a selected area of a ceramic object with a thin film of a superconducting material. Finally, a method for assembling an electron beam accelerator cavity utilizing the SRF window assembly is provided. The procedure is carried out within an ultra clean room to minimize exposure to particulates which adversely affect the performance of the cavity within the electron beam accelerator. 11 figs.
Superconducting radiofrequency window assembly
Phillips, Harry L.; Elliott, Thomas S.
1997-01-01
The present invention is a superconducting radiofrequency window assembly for use in an electron beam accelerator. The srf window assembly (20) has a superconducting metal-ceramic design. The srf window assembly (20) comprises a superconducting frame (30), a ceramic plate (40) having a superconducting metallized area, and a superconducting eyelet (50) for sealing plate (40) into frame (30). The plate (40) is brazed to eyelet (50) which is then electron beam welded to frame (30). A method for providing a ceramic object mounted in a metal member to withstand cryogenic temperatures is also provided. The method involves a new metallization process for coating a selected area of a ceramic object with a thin film of a superconducting material. Finally, a method for assembling an electron beam accelerator cavity utilizing the srf window assembly is provided. The procedure is carried out within an ultra clean room to minimize exposure to particulates which adversely affect the performance of the cavity within the electron beam accelerator.
Superconducting radiofrequency window assembly
Phillips, H.L.; Elliott, T.S.
1997-03-11
The present invention is a superconducting radiofrequency window assembly for use in an electron beam accelerator. The srf window assembly has a superconducting metal-ceramic design. The srf window assembly comprises a superconducting frame, a ceramic plate having a superconducting metallized area, and a superconducting eyelet for sealing plate into frame. The plate is brazed to eyelet which is then electron beam welded to frame. A method for providing a ceramic object mounted in a metal member to withstand cryogenic temperatures is also provided. The method involves a new metallization process for coating a selected area of a ceramic object with a thin film of a superconducting material. Finally, a method for assembling an electron beam accelerator cavity utilizing the srf window assembly is provided. The procedure is carried out within an ultra clean room to minimize exposure to particulates which adversely affect the performance of the cavity within the electron beam accelerator. 11 figs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanford, J.W.; Huang, Y.J.
The energy performance of skylights is similar to that of windows in admitting solar heat gain, while at the same time providing a pathway for convective and conductive heat transfer through the building envelope. Since skylights are typically installed at angles ranging from 0° to 45°, and differ from windows in both their construction and operation, their conductive and convective heat gains or losses, as well as solar heat gain, will differ for the same rough opening and thermal characteristics. The objective of this work is to quantify the impact of solar gain through skylights on building heating and cooling loads in 45 climates, and to develop a method for including these data in the SP53 residential loads data base previously developed by LBL in support of DOE's Automated Residential Energy Standard (ARES) program. The authors used the DOE-2.1C program to simulate the heating and cooling loads of a prototypical residential building while varying the size and solar characteristics of skylights and windows. The results are presented as Skylight Solar Loads, which are the contribution of solar gains through skylights to the overall building heating and cooling loads, and as Skylight Solar Load Ratios, which are the ratios of skylight solar loads to those for windows with the same orientation. The study shows that skylight solar loads are larger than those for windows in both heating and cooling. Skylight solar cooling loads are from three to four times greater than those for windows regardless of the skylight tilt, except for those facing north. These cooling loads are largest for south-facing skylights at a tilt angle of approximately 20°, and drop off at higher tilts and other orientations.
Rich, David Q; Rhoads, George G; Yiin, Lih-Ming; Zhang, Junfeng; Bai, Zhipeng; Adgate, John L; Ashley, Peter J; Lioy, Paul J
2002-01-01
High efficiency particulate air filter (HEPA) vacuums, which collect particles >0.3 µm, and trisodium phosphate (TSP), a detergent claimed to selectively remove lead, have been included in the HUD Guidelines for the Evaluation and Control of Lead-Based Paint Hazards in Housing without systematic validation of their effectiveness. At the time the study was initiated, both HEPA vacuums and TSP were relatively expensive, they were not readily found in urban retail centers, and there were environmental concerns about the use and disposal of high-phosphate detergents. A randomized, controlled trial was conducted in urban high-risk homes in northern New Jersey to determine whether a more readily available and less expensive low-phosphate, non-TSP detergent and non-HEPA vacuum could perform as well as TSP and a HEPA vacuum in a cleaning protocol. Homes were randomized to one of three cleaning methods: TSP/HEPA vacuum, TSP/non-HEPA vacuum, or non-TSP/non-HEPA vacuum. Change in log-transformed lead loading was used in mixed models to compare the efficacy of the three cleaning techniques separately for uncarpeted floors, window sills, and window troughs. After we adjusted for baseline lead loading, the non-HEPA vacuum produced larger reductions on hard floors [19%; 95% confidence interval (CI), 3-38%], but the HEPA vacuum produced larger reductions on window sills (22%; 95% CI, 11-32%) and larger reductions on window troughs (16%; 95% CI, -4 to 33%). The non-TSP detergent produced larger reductions on window troughs (21%; 95% CI, -2 to 50%), but TSP produced larger reductions on hard floors (5%; 95% CI, -12 to 19%) and window sills (8%; 95% CI, -5 to 20%). TSP/HEPA produced larger reductions on window sills (28%; 95% CI, 18-37%) and larger reductions on window troughs (2%; 95% CI, -24 to 23%), whereas the non-TSP/non-HEPA method produced larger reductions on hard floors (13%; 95% CI, -5 to 34%).
Because neither vacuum nor detergent produced consistent results across surface types, the use of low-phosphate detergents and non-HEPA vacuums in a temporary control measure is supported. PMID:12204823
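The percent reductions above come from models fit on log-transformed lead loadings; a change Δ on the natural-log scale maps back to a percent reduction via 1 − e^Δ. A minimal sketch of that back-transformation (illustrative only, not the study's mixed-model code; the function names are hypothetical):

```python
import math

def percent_reduction(log_change):
    """Map a change in natural-log lead loading (pre -> post) to a percent reduction."""
    return (1.0 - math.exp(log_change)) * 100.0

def relative_reduction(log_change_a, log_change_b):
    """Extra reduction of method A over method B, on the ratio scale."""
    return (1.0 - math.exp(log_change_a - log_change_b)) * 100.0

# a log-loading change of ln(0.5) halves the loading, i.e. a 50% reduction
halved = percent_reduction(math.log(0.5))   # -> 50.0
extra = relative_reduction(-0.5, -0.3)      # ~18% extra reduction for method A
```

The ratio-scale comparison is why the reported percentages are not simple differences of raw loadings.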
Rich, David Q; Rhoads, George G; Yiin, Lih-Ming; Zhang, Junfeng; Bai, Zhipeng; Adgate, John L; Ashley, Peter J; Lioy, Paul J
2002-09-01
High efficiency particulate air filter (HEPA) vacuums, which collect particles >0.3 µm, and trisodium phosphate (TSP), a detergent claimed to selectively remove lead, have been included in the HUD Guidelines for the Evaluation and Control of Lead-Based Paint Hazards in Housing without systematic validation of their effectiveness. At the time the study was initiated, both HEPA vacuums and TSP were relatively expensive, they were not readily found in urban retail centers, and there were environmental concerns about the use and disposal of high-phosphate detergents. A randomized, controlled trial was conducted in urban high-risk homes in northern New Jersey to determine whether a more readily available and less expensive low-phosphate, non-TSP detergent and non-HEPA vacuum could perform as well as TSP and a HEPA vacuum in a cleaning protocol. Homes were randomized to one of three cleaning methods: TSP/HEPA vacuum, TSP/non-HEPA vacuum, or non-TSP/non-HEPA vacuum. Change in log-transformed lead loading was used in mixed models to compare the efficacy of the three cleaning techniques separately for uncarpeted floors, window sills, and window troughs. After we adjusted for baseline lead loading, the non-HEPA vacuum produced larger reductions on hard floors [19%; 95% confidence interval (CI), 3-38%], but the HEPA vacuum produced larger reductions on window sills (22%; 95% CI, 11-32%) and larger reductions on window troughs (16%; 95% CI, -4 to 33%). The non-TSP detergent produced larger reductions on window troughs (21%; 95% CI, -2 to 50%), but TSP produced larger reductions on hard floors (5%; 95% CI, -12 to 19%) and window sills (8%; 95% CI, -5 to 20%). TSP/HEPA produced larger reductions on window sills (28%; 95% CI, 18-37%) and larger reductions on window troughs (2%; 95% CI, -24 to 23%), whereas the non-TSP/non-HEPA method produced larger reductions on hard floors (13%; 95% CI, -5 to 34%).
Because neither vacuum nor detergent produced consistent results across surface types, the use of low-phosphate detergents and non-HEPA vacuums in a temporary control measure is supported.
Thin film solar cell configuration and fabrication method
Menezes, Shalini
2009-07-14
A new photovoltaic device configuration based on an n-copper indium selenide absorber and a p-type window is disclosed. A fabrication method to produce this device on flexible or rigid substrates is described that reduces the number of cell components, avoids hazardous materials, simplifies the process steps and hence the costs for high volume solar cell manufacturing.
Gostian, Antoniu-Oreste; Schwarz, David; Mandt, Philipp; Anagiotos, Andreas; Ortmann, Magdalene; Pazen, David; Beutner, Dirk; Hüttenbrink, Karl-Bernd
2016-11-01
Round window vibroplasty is a feasible option for the treatment of conductive, sensorineural and mixed hearing loss. Although clinical data suggest a satisfying clinical outcome with various coupling methods, the most efficient technique for coupling the floating mass transducer to the round window is still a matter of debate. For this reason, a soft silicone coupler has recently been developed that aims to ease and optimize the stimulation of the round window membrane by this middle ear implant. We performed a temporal bone study evaluating the performance of the soft coupler compared to coupling with individually shaped cartilage, perichondrium and the titanium round window coupler, with loads up to 20 mN at the unaltered and fully exposed round window niche. The stimulation of the cochlea was measured by the volume velocities of the stapes footplate detected by a laser Doppler vibrometer. The coupling method was identified as a significant factor, with cartilage and perichondrium allowing the highest volume velocities, followed by the soft and titanium couplers. Exposure of the round window niche allowed higher volume velocities, while the applied load did not significantly affect the results. The soft coupler provides good contact with the round window membrane and effective backward stimulation of the cochlea. Clinical data are mandatory to evaluate the performance of this novel coupling method in vivo.
Novel hermetic packaging methods for MOEMS
NASA Astrophysics Data System (ADS)
Stark, David
2003-01-01
Hermetic packaging of micro-optoelectromechanical systems (MOEMS) is an immature technology, lacking industry-consensus methods and standards. Off-the-shelf, catalog window assemblies are not yet available. Window assemblies are in general custom designed and manufactured for each new product, resulting in longer than acceptable cycle times, high procurement costs and questionable reliability. There are currently two dominant window-manufacturing methods wherein a metal frame is attached to glass, as well as a third, less-used method. The first method creates a glass-to-metal seal by heating the glass above its Tg to fuse it to the frame. The second method involves first metallizing the glass where it is to be attached to the frame, and then soldering the glass to the frame. The third method employs solder-glass to bond the glass to the frame. A novel alternative with superior features compared to the three previously described window-manufacturing methods is proposed. The new approach lends itself to a plurality of glass-to-metal attachment techniques. Benefits include lower temperature processing than two of the current methods and potentially more cost-effective manufacturing than all three of today's attachment methods.
Bispectral analysis: comparison of two windowing functions
NASA Astrophysics Data System (ADS)
Silvagni, D.; Djerroud, C.; Réveillé, T.; Gravier, E.
2018-02-01
Amongst all the normalized forms of the bispectrum, the bicoherence is shown to be a very useful diagnostic tool in experimental studies of nonlinear wave interactions in plasma, as it measures the fraction of wave power due to quadratic wave coupling in a self-excited fluctuation spectrum [1, 2]. In order to avoid spectral leakage, a windowing function must be applied during the bicoherence computation. Spectral leakage from statistically dependent components is of crucial importance in the discrimination between coupled and uncoupled modes, as it introduces into the bicoherence spectrum phase-coupled modes which in reality do not exist. Therefore, the windowing function plays a key role in the bicoherence estimation. In this paper, two windowing methods are compared: the multiplication of the initial signal by the Hanning function and the subtraction of the straight line which links the two extremities of the signal. The influence of these two windowing methods on both the power spectrum and the bicoherence spectrum is shown. Although both methods give precise results, the Hanning function appears to be the more suitable window.
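The two windowing methods compared above can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the taper or detrend would be applied to each record before the FFT/bispectrum step:

```python
import math

def hanning(n_samples):
    # Hann(ing) taper: w[n] = 0.5 * (1 - cos(2*pi*n/(N-1)))
    return [0.5 * (1.0 - math.cos(2.0 * math.pi * n / (n_samples - 1)))
            for n in range(n_samples)]

def window_hanning(signal):
    """Method 1: multiply the record by the Hanning function."""
    w = hanning(len(signal))
    return [s * wi for s, wi in zip(signal, w)]

def window_detrend(signal):
    """Method 2: subtract the straight line joining the two end points."""
    n = len(signal)
    slope = (signal[-1] - signal[0]) / (n - 1)
    return [s - (signal[0] + slope * i) for i, s in enumerate(signal)]

sig = [math.sin(0.3 * i) + 0.01 * i for i in range(256)]
tapered = window_hanning(sig)
detrended = window_detrend(sig)
# Both methods drive the record ends toward zero, which is what suppresses
# leakage from the discontinuity of the implicit periodic extension.
```

The Hanning taper forces both ends smoothly to zero, whereas the line subtraction only zeroes the end values themselves, which is one way to see why the Hanning function is the more suitable window here.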
Chen, Jie; Li, Jiahong; Yang, Shuanghua; Deng, Fang
2017-11-01
The identification of the nonlinearity and coupling is crucial in the nonlinear target tracking problem in collaborative sensor networks. According to the adaptive Kalman filtering (KF) method, the nonlinearity and coupling can be regarded as the model noise covariance, and estimated by minimizing the innovation or residual errors of the states. However, the method requires a large time window of data to achieve reliable covariance measurement, making it impractical for rapidly changing nonlinear systems. To deal with this problem, a weighted optimization-based distributed KF algorithm (WODKF) is proposed in this paper. The algorithm enlarges the data size of each sensor with the received measurements and state estimates from its connected sensors instead of the time window. A new cost function is set as the weighted sum of the bias and oscillation of the state to estimate the "best" estimate of the model noise covariance. The bias and oscillation of the state of each sensor are estimated by polynomial fitting a time window of state estimates and measurements of the sensor and its neighbors, weighted by the measurement noise covariance. The best estimate of the model noise covariance is computed by minimizing the weighted cost function using the exhaustive method. A sensor selection method is added to the algorithm to decrease the computational load of the filter and increase the scalability of the sensor network. The existence, suboptimality and stability analysis of the algorithm are given. The local probability data association method is used in the proposed algorithm for the multitarget tracking case. The algorithm is demonstrated in simulations on tracking examples for a random signal, one nonlinear target, and four nonlinear targets. Results show the feasibility and superiority of WODKF over other filtering algorithms for a large class of systems.
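One ingredient of the approach, estimating a window's bias and oscillation by polynomial fitting of state estimates, might be sketched as below with a first-order (linear) fit; the weighting by measurement noise covariance and the exhaustive covariance search of WODKF are omitted, and the function names and the definitions of "bias" and "oscillation" are illustrative assumptions:

```python
def linear_trend(values):
    """Closed-form least-squares line fit over indices 0..n-1."""
    n = len(values)
    mean_x = (n - 1) / 2.0
    mean_y = sum(values) / n
    sxx = sum((x - mean_x) ** 2 for x in range(n))
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(values))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return [intercept + slope * x for x in range(n)]

def bias_and_oscillation(window, reference):
    """Bias: mean offset of the fitted trend from a reference value.
    Oscillation: mean squared residual of the window about the trend."""
    trend = linear_trend(window)
    bias = sum(t - reference for t in trend) / len(trend)
    oscillation = sum((v - t) ** 2 for v, t in zip(window, trend)) / len(window)
    return bias, oscillation

b, o = bias_and_oscillation([1.0, 3.0, 2.0, 4.0, 3.0], reference=2.0)
# the window trends above the reference (bias > 0) and wiggles about the trend
```

A weighted sum of such bias and oscillation terms is the shape of the cost function the abstract describes.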
Rhythm-based heartbeat duration normalization for atrial fibrillation detection.
Islam, Md Saiful; Ammour, Nassim; Alajlan, Naif; Aboalsamh, Hatim
2016-05-01
Screening of atrial fibrillation (AF) in high-risk patients, including all patients aged 65 years and older, is important for prevention of the risk of stroke. Different technologies, such as modified blood pressure monitors, single-lead ECG-based finger probes, and smartphones using plethysmogram signals, have been emerging for this purpose. All these technologies use the irregularity of heartbeat duration as a feature for AF detection. We have investigated a normalization method of heartbeat duration for improved AF detection. AF is an arrhythmia in which heartbeat duration generally becomes irregularly irregular. From a window of heartbeat durations, we estimate the likely rhythm of the majority of heartbeats and normalize the duration of all heartbeats in the window based on that rhythm, so that we can measure the irregularity of heartbeats for both AF and non-AF rhythms on the same scale. Irregularity is measured by the entropy of the distribution of the normalized durations. Then we classify a window of heartbeats as AF or non-AF by thresholding the measured irregularity. The effect of this normalization is evaluated by comparing AF detection performance using durations with the normalization, without normalization, and with other existing normalizations. Sensitivity and specificity of AF detection using normalized heartbeat duration were tested on two landmark databases available online and compared with the results of other methods (with/without normalization) by receiver operating characteristic (ROC) curves. ROC analysis showed that the normalization was able to improve the performance of AF detection, and this was consistent over a wide range of sensitivity and specificity for different thresholds. Detection accuracy was also computed for equal rates of sensitivity and specificity for different methods. Using normalized heartbeat duration, we obtained 96.38% accuracy, a more than 4% improvement compared to AF detection without normalization.
The proposed normalization method was found useful for improving performance and robustness of AF detection. Incorporation of this method in a screening device could be crucial to reduce the risk of AF-related stroke. In general, the incorporation of the rhythm-based normalization in an AF detection method seems important for developing a robust AF screening device. Copyright © 2016 Elsevier Ltd. All rights reserved.
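A minimal sketch of rhythm-based normalization followed by an entropy-based irregularity measure. Illustrative assumptions: the median interval stands in for the estimated majority rhythm, and a fixed-bin histogram feeds the entropy; the paper's actual rhythm estimator and binning may differ:

```python
import math

def irregularity(rr_intervals, bins=8):
    """Entropy of the distribution of rhythm-normalized heartbeat durations."""
    rhythm = sorted(rr_intervals)[len(rr_intervals) // 2]  # median as the majority rhythm
    norm = [r / rhythm for r in rr_intervals]
    lo, hi = min(norm), max(norm) + 1e-9
    counts = [0] * bins
    for v in norm:
        counts[int((v - lo) / (hi - lo) * bins)] += 1
    total = len(norm)
    # Shannon entropy of the histogram; higher means more irregular
    return -sum(c / total * math.log(c / total) for c in counts if c)

regular = [0.80, 0.81, 0.80, 0.79, 0.80, 0.81, 0.80, 0.80]
irregular = [0.55, 0.92, 0.63, 1.10, 0.71, 0.48, 0.99, 0.60]
# an irregularly irregular (AF-like) window scores higher than a steady one,
# so a single entropy threshold can separate the two cases
```

Thresholding this entropy per window is the classification step the abstract describes.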
Dynamic simulation of road vehicle door window regulator mechanism of cross arm type
NASA Astrophysics Data System (ADS)
Miklos, I. Zs; Miklos, C.; Alic, C.
2017-01-01
The paper presents issues related to the dynamic simulation of a motor-driven operating mechanism of cross-arm type, for the manipulation of road vehicle door windows, using Autodesk Inventor Professional software. The dynamic simulation of the mechanism involves 3D modelling, kinematic coupling, drive motion parameters and external loads, as well as the graphical display of the kinematic and kinetostatic results for the various elements and kinematic couplings of the mechanism under real operating conditions. Also, based on the results, the analysis of the mechanism components has been carried out using the finite element method.
On Time Delay Margin Estimation for Adaptive Control and Optimal Control Modification
NASA Technical Reports Server (NTRS)
Nguyen, Nhan T.
2011-01-01
This paper presents methods for estimating the time delay margin for adaptive control of input delay systems with almost linear structured uncertainty. The bounded linear stability analysis method seeks to represent an adaptive law by a locally bounded linear approximation within a small time window. The time delay margin of this input delay system represents a local stability measure and is computed analytically by three methods: Pade approximation, the Lyapunov-Krasovskii method, and the matrix measure method. These methods are applied to the standard model-reference adaptive control, the s-modification adaptive law, and the optimal control modification adaptive law. The windowing analysis results in non-unique estimates of the time delay margin, since it depends on the length of the time window and on parameters which vary from one time window to the next. The optimal control modification adaptive law overcomes this limitation in that, as the adaptive gain tends to infinity and if the matched uncertainty is linear, the closed-loop input delay system tends to an LTI system. A lower bound on the time delay margin of this system can then be estimated uniquely without the need for the windowing analysis. Simulation results demonstrate the feasibility of the bounded linear stability method for time delay margin estimation.
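For the limiting LTI case mentioned above, a classical estimate of the time delay margin is the phase margin divided by the gain-crossover frequency. The sketch below illustrates only that standard LTI computation, under the assumption of a single crossover with monotonically decreasing loop gain; it is not the paper's Pade, Lyapunov-Krasovskii or matrix measure machinery:

```python
import math

def delay_margin(loop_gain, wc_search=(1e-3, 1e3), tol=1e-9):
    """Delay margin of a stable LTI loop: tau = phase_margin / w_gc,
    where w_gc solves |L(j*w_gc)| = 1. loop_gain maps w -> complex L(jw).
    Assumes |L(jw)| decreases monotonically over the search interval."""
    lo, hi = wc_search
    while hi - lo > tol:                      # bisection for the gain crossover
        mid = 0.5 * (lo + hi)
        if abs(loop_gain(mid)) > 1.0:
            lo = mid
        else:
            hi = mid
    w_gc = 0.5 * (lo + hi)
    L = loop_gain(w_gc)
    phase_margin = math.pi + math.atan2(L.imag, L.real)
    return phase_margin / w_gc

# example loop: an integrator with gain k, L(s) = k / s
k = 2.0
tau = delay_margin(lambda w: k / complex(0.0, w))
# analytic answer: w_gc = k, phase margin = pi/2, so tau = pi / (2 * k)
```

Any loop delay smaller than tau leaves this LTI loop stable, which is exactly the lower-bound role the abstract assigns to the limiting system.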
Smart windows based on cholesteric liquid crystals (Conference Presentation)
NASA Astrophysics Data System (ADS)
Khandelwal, Hitesh; Debije, Michael G.; Schenning, Albert P. H. J.
2017-02-01
With the increase in global warming, the use of active cooling and heating devices is continuously increasing to maintain the interior temperature of the built environment, greenhouses and cars. To reduce the tremendous amount of energy consumed by cooling and heating devices, we need improved control of transparent features (i.e., windows). In this respect, a smart window capable of reflecting solar infrared energy without interfering with visible light would be very attractive. Most of the technologies developed so far control visible light. These technologies block visual contact with the outside world, which causes negative effects on human health. An appealing method to selectively control infrared transmission is to utilize the reflection properties of cholesteric liquid crystals. In our research, we have fabricated a smart window which is capable of reflecting different amounts of solar infrared energy depending on the specific climate conditions. The reflection bandwidth can be tuned from 120 nm to 1100 nm in the infrared region without interfering with visible solar radiation. Calculations reveal that between 8% and 45% of incident solar infrared light can be reflected with a single cell. Simulation studies predicted that more than 12% of the energy spent on heating, cooling and lighting in the built environment can be saved by using the fabricated smart window compared to a standard double-glazed window.
Image-based corrosion recognition for ship steel structures
NASA Astrophysics Data System (ADS)
Ma, Yucong; Yang, Yang; Yao, Yuan; Li, Shengyuan; Zhao, Xuefeng
2018-03-01
Ship structures are inevitably subjected to corrosion in service. Existing image-based methods are influenced by noise in images because they recognize corrosion by extracting features. In this paper, a novel method of image-based corrosion recognition for ship steel structures is proposed. The method utilizes convolutional neural networks (CNN) and is not affected by noise in images. A CNN used to recognize corrosion was designed by fine-tuning an existing CNN architecture and trained on datasets built from a large number of images. Combining the trained CNN classifier with a sliding-window technique, the corrosion zone in an image can be recognized.
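The sliding-window scan described above might look like the following sketch, with a stand-in patch classifier in place of the trained CNN; the window size, stride and the toy classifier are illustrative assumptions, not values from the paper:

```python
def sliding_windows(height, width, win, stride):
    """Top-left corners of all win x win patches visited by the scan."""
    return [(r, c)
            for r in range(0, height - win + 1, stride)
            for c in range(0, width - win + 1, stride)]

def corrosion_map(image, classify_patch, win=32, stride=16):
    """Mark every window the (hypothetical) patch classifier flags as corroded."""
    h, w = len(image), len(image[0])
    hits = []
    for r, c in sliding_windows(h, w, win, stride):
        patch = [row[c:c + win] for row in image[r:r + win]]
        if classify_patch(patch):
            hits.append((r, c))
    return hits

# toy 64x64 "image" with a corroded-looking bright block in the lower right
img = [[1 if (r >= 32 and c >= 32) else 0 for c in range(64)] for r in range(64)]
is_bright = lambda p: sum(map(sum, p)) > 0.5 * len(p) * len(p[0])
zones = corrosion_map(img, is_bright)  # -> [(32, 32)]
```

In the paper's setting, `classify_patch` would be the fine-tuned CNN applied to each cropped window, and the flagged windows together outline the corrosion zone.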
Multi-alternative decision-making with non-stationary inputs.
Nunes, Luana F; Gurney, Kevin
2016-08-01
One of the most widely implemented models for multi-alternative decision-making is the multihypothesis sequential probability ratio test (MSPRT). It is asymptotically optimal, straightforward to implement, and has found application in modelling biological decision-making. However, the MSPRT is limited in application to discrete ('trial-based'), non-time-varying scenarios. By contrast, real world situations will be continuous and entail stimulus non-stationarity. In these circumstances, decision-making mechanisms (like the MSPRT) which work by accumulating evidence, must be able to discard outdated evidence which becomes progressively irrelevant. To address this issue, we introduce a new decision mechanism by augmenting the MSPRT with a rectangular integration window and a transparent decision boundary. This allows selection and de-selection of options as their evidence changes dynamically. Performance was enhanced by adapting the window size to problem difficulty. Further, we present an alternative windowing method which exponentially decays evidence and does not significantly degrade performance, while greatly reducing the memory resources necessary. The methods presented have proven successful at allowing for the MSPRT algorithm to function in a non-stationary environment.
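The exponential-decay windowing of evidence can be sketched as below; the per-option log-likelihood stream, decay factor and boundary rule are illustrative assumptions rather than the authors' exact MSPRT formulation:

```python
def decide(log_likelihoods, decay=0.9, threshold=2.0):
    """Accumulate per-option evidence with exponential forgetting; report the
    first option whose accumulator clears the boundary, and the step index.
    Old evidence fades as decay**age, so outdated inputs stop mattering."""
    n_options = len(log_likelihoods[0])
    acc = [0.0] * n_options
    for t, frame in enumerate(log_likelihoods):
        acc = [decay * a + ll for a, ll in zip(acc, frame)]
        best = max(range(n_options), key=lambda i: acc[i])
        # transparent boundary: lead over the runner-up, so options can be
        # de-selected again if their evidence later overtakes
        runner_up = max(acc[i] for i in range(n_options) if i != best)
        if acc[best] - runner_up >= threshold:
            return best, t
    return None, len(log_likelihoods)

# option 1 is favoured by a steady 0.5-nat edge over option 2 at every step
frames = [[0.0, 0.5, 0.1] for _ in range(40)]
choice, step = decide(frames)
```

Because the accumulators are geometric sums, the memory cost is constant per option, which is the resource advantage the abstract attributes to the exponential-decay variant over the rectangular window.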
NASA Astrophysics Data System (ADS)
Wang, Pan-Pan; Yu, Qiang; Hu, Yong-Jun; Miao, Chang-Xin
2017-11-01
Current research in broken rotor bar (BRB) fault detection in induction motors is primarily focused on a high-frequency-resolution analysis of the stator current. Compared with a discrete Fourier transform, the parametric spectrum estimation technique has higher frequency accuracy and resolution. However, the existing detection methods based on parametric spectrum estimation cannot realize online detection, owing to the large computational cost. To improve the efficiency of BRB fault detection, a new detection method based on the min-norm algorithm and least-squares estimation is proposed in this paper. First, the stator current is filtered using a band-pass filter and divided into short overlapped data windows. The min-norm algorithm is then applied to determine the frequencies of the fundamental and fault characteristic components within each overlapped data window. Next, based on the frequency values obtained, a model of the fault current signal is constructed. Subsequently, a linear least-squares problem solved through singular value decomposition is designed to estimate the amplitudes and phases of the related components. Finally, the proposed method is applied to a simulated current and an actual motor; the results indicate that it retains the frequency accuracy and resolution of the parametric spectrum estimation technique while substantially reducing the computational cost.
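The least-squares amplitude/phase estimation step, once a component's frequency is known, can be illustrated for a single sinusoid using closed-form normal equations; the paper solves the full multi-component problem via SVD, so this single-frequency simplification is illustrative only:

```python
import math

def fit_sinusoid(samples, freq, fs):
    """Least-squares amplitude and phase of a sinusoid at a known frequency.
    Model: a*cos(w*n) + b*sin(w*n); 2x2 normal equations solved in closed form."""
    w = 2.0 * math.pi * freq / fs
    c = [math.cos(w * n) for n in range(len(samples))]
    s = [math.sin(w * n) for n in range(len(samples))]
    scc = sum(ci * ci for ci in c)
    sss = sum(si * si for si in s)
    scs = sum(ci * si for ci, si in zip(c, s))
    syc = sum(y * ci for y, ci in zip(samples, c))
    sys_ = sum(y * si for y, si in zip(samples, s))
    det = scc * sss - scs * scs
    a = (syc * sss - sys_ * scs) / det
    b = (sys_ * scc - syc * scs) / det
    amplitude = math.hypot(a, b)
    phase = math.atan2(-b, a)     # for the amplitude*cos(w*n + phase) convention
    return amplitude, phase

fs, n = 1000.0, 500
y = [3.0 * math.cos(2 * math.pi * 50.0 * t / fs + 0.4) for t in range(n)]
amp, ph = fit_sinusoid(y, 50.0, fs)  # recovers amplitude 3.0 and phase 0.4
```

With several components at frequencies supplied by the min-norm step, the same idea becomes an overdetermined linear system, which is where the SVD solution in the abstract comes in.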
Network Penetration Testing and Research
NASA Technical Reports Server (NTRS)
Murphy, Brandon F.
2013-01-01
This paper will focus on the research and testing done on penetrating a network for security purposes. This research will provide the IT security office with new methods of attacks across and against a company's network, as well as introduce them to new platforms and software that can be used to better assist with protecting against such attacks. Throughout this paper, testing and research have been done on two different Linux-based operating systems for attacking and compromising a Windows-based host computer. Backtrack 5 and BlackBuntu (Linux-based penetration testing operating systems) are two different "attacker" computers that will attempt to plant viruses and/or exploits on a host Windows 7 operating system, as well as try to retrieve information from the host. On each Linux OS (Backtrack 5 and BlackBuntu) there is penetration testing software which provides the necessary tools to create exploits that can compromise a Windows system as well as other operating systems. This paper will focus on two main methods of deploying exploits onto a host computer in order to retrieve information from a compromised system. One method of deployment that was tested is known as a "social engineering" exploit. This type of method requires interaction from an unsuspecting user. With this user interaction, a deployed exploit may allow a malicious user to gain access to the unsuspecting user's computer as well as the network that the computer is connected to. Due to more advanced security settings and antivirus protection and detection, this method is easily identified and defended against. The second method of exploit deployment is the method mainly focused upon within this paper. This method required extensive research on the best way to compromise a security-enabled protected network. Once a network has been compromised, any and all devices connected to that network have the potential to be compromised as well.
With a compromised network, computers and devices can be penetrated through deployed exploits. This paper will illustrate the research done to test the ability to penetrate a network without user interaction, in order to retrieve personal information from a targeted host.
Method of identifying features in indexed data
Jarman, Kristin H [Richland, WA; Daly, Don Simone [Richland, WA; Anderson, Kevin K [Richland, WA; Wahl, Karen L [Richland, WA
2001-06-26
The present invention is a method of identifying features in indexed data, especially useful for distinguishing signal from noise in data provided as a plurality of ordered pairs. Each of the plurality of ordered pairs has an index and a response. The method has the steps of: (a) providing an index window having a first window end located on a first index and extending across a plurality of indices to a second window end; (b) selecting responses corresponding to the plurality of indices within the index window and computing a measure of dispersion of the responses; and (c) comparing the measure of dispersion to a dispersion critical value. Advantages of the present invention include minimizing signal to noise ratio, signal drift, varying baseline signal and combinations thereof.
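The claimed procedure, slide an index window across the ordered pairs, compute a measure of dispersion of the responses, and compare it to a dispersion critical value, can be sketched as follows; the choice of sample standard deviation as the dispersion measure and the particular critical value are illustrative, since the patent leaves both open:

```python
import statistics

def flag_features(responses, window=5, critical=0.5):
    """Slide an index window and flag centers whose local dispersion
    (here: sample standard deviation) exceeds a critical value."""
    half = window // 2
    flags = []
    for i in range(half, len(responses) - half):
        segment = responses[i - half:i + half + 1]
        flags.append(statistics.stdev(segment) > critical)
    return flags

baseline = [0.1, 0.1, 0.1, 0.1, 0.1]
peak = [0.1, 1.5, 3.0, 1.5, 0.1]
flags = flag_features(baseline + peak + baseline, window=5, critical=0.5)
# flat baseline windows stay below the critical value; windows over the
# peak exceed it, so the peak is flagged as a feature rather than noise
```

Windows over flat baseline fall below the critical value while windows spanning a peak exceed it, which is how the method separates signal from noise and drifting baseline.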
An impact of environmental changes on flows in the reach scale under a range of climatic conditions
NASA Astrophysics Data System (ADS)
Karamuz, Emilia; Romanowicz, Renata J.
2016-04-01
The present paper combines detection and adequate identification of causes of changes in the flow regime at cross-sections along the Middle River Vistula reach using different methods. Two main experimental set-ups (designs) have been applied to study the changes: a moving three-year window and a low- and high-flow event-based approach. In the first experiment, a Stochastic Transfer Function (STF) model and a quantile-based statistical analysis of flow patterns were compared. These two methods are based on the analysis of changes of the STF model parameters and of standardised differences of flow quantile values. In the second experiment, in addition to the STF model, a 1-D distributed model, MIKE11, was applied. The first step of the procedure used in the study is to define the river reaches that have recorded information on land use and water management changes. The second task is to perform the moving-window analysis of standardised differences of flow quantiles and a moving-window optimisation of the STF model for flow routing. The third step consists of an optimisation of the STF and MIKE11 models for high- and low-flow events. The final step is to analyse the results and relate the standardised quantile changes and model parameter changes to historical land use changes and water management practices. Results indicate that both models give a consistent assessment of changes in the channel for medium and high flows. ACKNOWLEDGEMENTS This research was supported by the Institute of Geophysics Polish Academy of Sciences through the Young Scientist Grant no. 3b/IGF PAN/2015.
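The moving-window analysis of standardised flow-quantile differences might be sketched roughly as below; the window partitioning, number of quantiles and the standardisation by the previous window's quantile spread are hypothetical simplifications of the study's actual procedure:

```python
import statistics

def quantile_shifts(flows, years_per_window=3, obs_per_year=1, q_n=10):
    """Standardised differences of flow quantiles between consecutive
    moving windows (a simplified, illustrative variant of the analysis)."""
    size = years_per_window * obs_per_year
    windows = [flows[i:i + size] for i in range(0, len(flows) - size + 1, size)]
    qsets = [statistics.quantiles(w, n=q_n) for w in windows]
    shifts = []
    for prev, cur in zip(qsets, qsets[1:]):
        scale = statistics.pstdev(prev) or 1.0   # standardise by previous spread
        shifts.append([(c - p) / scale for p, c in zip(prev, cur)])
    return shifts

# a steadily rising flow record: every standardised quantile shift is positive
shifts = quantile_shifts([float(v) for v in range(12)])
```

Systematic positive or negative shifts across successive windows are the kind of signal that is then related to land-use and water-management history.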
Hybrid window layer for photovoltaic cells
Deng, Xunming
2010-02-23
A novel photovoltaic solar cell and method of making the same are disclosed. The solar cell includes: at least one absorber layer, which can be either a lightly doped layer or an undoped layer, and at least one doped window-layer, which comprises at least two sub-window-layers. The first sub-window-layer, which is next to the absorber-layer, is deposited to form a desirable junction with the absorber-layer. The second sub-window-layer, which is next to the first sub-window-layer but not in direct contact with the absorber-layer, is deposited in order to have a transmission higher than the first sub-window-layer.
Hybrid window layer for photovoltaic cells
Deng, Xunming [Sylvania, OH]; Liao, Xianbo [Toledo, OH]; Du, Wenhui [Toledo, OH]
2011-10-04
A novel photovoltaic solar cell and a method of making the same are disclosed. The solar cell includes at least one absorber layer, which can be either lightly doped or undoped, and at least one doped window layer comprising at least two sub-window-layers. The first sub-window-layer, which is adjacent to the absorber layer, is deposited to form a desirable junction with the absorber layer. The second sub-window-layer, which is next to the first sub-window-layer but not in direct contact with the absorber layer, is deposited so as to have higher transmission than the first sub-window-layer.
Hybrid window layer for photovoltaic cells
Deng, Xunming [Sylvania, OH]; Liao, Xianbo [Toledo, OH]; Du, Wenhui [Toledo, OH]
2011-02-01
A novel photovoltaic solar cell and a method of making the same are disclosed. The solar cell includes at least one absorber layer, which can be either lightly doped or undoped, and at least one doped window layer comprising at least two sub-window-layers. The first sub-window-layer, which is adjacent to the absorber layer, is deposited to form a desirable junction with the absorber layer. The second sub-window-layer, which is next to the first sub-window-layer but not in direct contact with the absorber layer, is deposited so as to have higher transmission than the first sub-window-layer.
Determination of the Optimum Harvest Window for Apples Using the Non-Destructive Biospeckle Method.
Skic, Anna; Szymańska-Chargot, Monika; Kruk, Beata; Chylińska, Monika; Pieczywek, Piotr Mariusz; Kurenda, Andrzej; Zdunek, Artur; Rutkowski, Krzysztof P
2016-05-10
Determination of the optimum harvest window plays a key role in the agro-food chain, as fruit quality depends on the right harvesting time and appropriate storage conditions during the postharvest period. Usually, indices based on destructive measurements are used for this purpose, like the De Jager Index (PFW-1), the FARS index and the most popular Streif Index. In this study, we proposed a biospeckle method for the evaluation of the optimum harvest window (OHW) of the "Ligol" and "Szampion" apple cultivars. The experiment involved eight different maturity stages, of which four were followed by long cold storage and shelf life to assist the determination of the optimum harvest window. Biospeckle activity was studied in relation to standard quality attributes (firmness, acidity, starch, soluble solids content, Streif Index) and physiological parameters (respiration and ethylene emission) of both apple cultivars. Changes of biospeckle activity (BA) over time showed moderate relationships with biochemical changes during apple maturation and ripening. The harvest date suggested by the Streif Index and postharvest quality indicators matched a characteristic decrease in BA. The ability of the biospeckle method to characterize the biological state of apples was confirmed by significant correlations of BA with firmness, starch index, total soluble solids and the Streif Index, as well as a good match with changes in carbon dioxide and ethylene emission. However, it should be noted that correlations between variables changing over time are not as meaningful as independent observations. Also, it is a well-known property of Pearson's correlation that its value is highly susceptible to outliers. Due to its non-selective nature, BA reflected only the current biological state of the fruit and could be affected by many other factors.
The investigations showed that the optimum harvest window for apples was indicated by a characteristic drop in BA during pre-harvest development. Despite this, at the current state of development the BA method cannot be used as an indicator on its own. Due to rather poor results for OHW prediction, BA measurements should be supported by other, destructive methods to compensate for the method's low selectivity.
NASA Astrophysics Data System (ADS)
Zhu, Keyong; Pilon, Laurent
2017-11-01
This study aims to systematically investigate light transfer through semitransparent windows with absorbing cap-shaped droplets condensed on their backside, as encountered in greenhouses, solar desalination plants, photobioreactors and covered raceway ponds. The Monte Carlo ray-tracing method was used to predict the normal-hemispherical transmittance, reflectance, and normal absorptance, accounting for reflection and refraction at the air/droplet, droplet/window, and window/air interfaces and for absorption in both the droplets and the window. The droplets were monodisperse or polydisperse and arranged either in an ordered hexagonal pattern or randomly distributed on the backside, with droplet contact angle θc ranging between 0 and 180°. The normal-hemispherical transmittance was found to be independent of the spatial distribution of droplets. However, it decreased with increasing droplet diameter and polydispersity. The normal-hemispherical transmittance featured four distinct optical regimes for semitransparent windows supporting nonabsorbing droplets. These optical regimes were defined based on the contact angle and the critical angle for internal reflection at the droplet/air interface.
However, for strongly absorbing droplets, the normal-hemispherical transmittance (i) decreased monotonically with increasing contact angle for θc < 90° and (ii) remained constant and independent of the droplet absorption index kd, droplet mean diameter dm, and contact angle θc for θc ≥ 90°. Analytical expressions for the normal-hemispherical transmittance were provided in the asymptotic cases when (1) the window was absorbing but the droplets were nonabsorbing, with any contact angle θc, and (2) the droplets were strongly absorbing, with contact angle θc > 90°. Finally, the spectral normal-hemispherical transmittance of a 3 mm-thick glass window supporting condensed water droplets was predicted for wavelengths between 0.4 and 5 μm and discussed in light of the earlier parametric study and asymptotic behavior.
Detection of Early Ischemic Changes in Noncontrast CT Head Improved with "Stroke Windows".
Mainali, Shraddha; Wahba, Mervat; Elijovich, Lucas
2014-01-01
Introduction. Noncontrast head CT (NCCT) is the standard radiologic test for patients presenting with acute stroke. Early ischemic changes (EIC) are often overlooked on initial NCCT. We determine the sensitivity and specificity of improved EIC detection by a standardized method of image evaluation (Stroke Windows). Methods. We performed a retrospective chart review to identify patients with acute ischemic stroke who had NCCT at presentation. EIC was defined by the presence of hyperdense MCA/basilar artery sign; sulcal effacement; basal ganglia/subcortical hypodensity; and loss of cortical gray-white differentiation. NCCT was reviewed with standard window settings and with specialized Stroke Windows. Results. Fifty patients (42% females, 58% males) with a mean NIHSS of 13.4 were identified. EIC was detected in 9 patients with standard windows, while EIC was detected using Stroke Windows in 35 patients (18% versus 70%; P < 0.0001). Hyperdense MCA sign was the most commonly reported EIC; it was better detected with Stroke Windows (14% and 36%; P < 0.0198). Detection of the remaining EIC also improved with Stroke Windows (6% and 46%; P < 0.0001). Conclusions. Detection of EIC has important implications in diagnosis and treatment of acute ischemic stroke. Utilization of Stroke Windows significantly improved detection of EIC.
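The "Stroke Windows" idea rests on standard CT display windowing: narrowing the window width around the gray-white attenuation range stretches the contrast of subtle hypodensities. A small sketch follows; the center/width values are illustrative only and are not the paper's exact settings.

```python
import numpy as np

def apply_ct_window(hu, center, width):
    """Map CT Hounsfield units to 0-255 display values for a given
    window center/width (standard radiology windowing)."""
    lo = center - width / 2.0
    img = (hu - lo) / width
    return np.clip(img, 0.0, 1.0) * 255.0

hu = np.array([28.0, 32.0, 36.0])    # subtle gray/white-matter densities
std = apply_ct_window(hu, center=40, width=80)    # typical brain window
stroke = apply_ct_window(hu, center=32, width=8)  # narrow "stroke window"
# the narrow window stretches the displayed contrast of these densities
print(np.ptp(std) < np.ptp(stroke))
```

The same three tissue densities that span only a few display levels in a standard brain window span the full display range in the narrow window, which is why early ischemic changes become easier to see.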
HPC in a HEP lab: lessons learned from setting up cost-effective HPC clusters
NASA Astrophysics Data System (ADS)
Husejko, Michal; Agtzidis, Ioannis; Baehler, Pierre; Dul, Tadeusz; Evans, John; Himyr, Nils; Meinhard, Helge
2015-12-01
In this paper we present our findings gathered during the evaluation and testing of Windows Server High-Performance Computing (Windows HPC) in view of potentially using it as a production HPC system for engineering applications. The Windows HPC package, an extension of Microsoft's Windows Server product, provides all essential interfaces, utilities and management functionality for creating, operating and monitoring a Windows-based HPC cluster infrastructure. The evaluation and test phase focused on verifying the functionality of Windows HPC, its performance, its support of commercial tools and its integration with the users' work environment. We describe constraints imposed by the way the CERN Data Centre is operated, licensing for engineering tools, and the scalability and behaviour of the HPC engineering applications used at CERN. We present an initial set of requirements, created based on the above constraints and on requests from the CERN engineering user community. We explain how we have configured Windows HPC clusters to provide the job scheduling functionality required to support the CERN engineering user community, quality of service, user- and project-based priorities, and fair access to limited resources. Finally, we present several performance tests carried out to verify Windows HPC performance and scalability.
Resource-constrained scheduling with hard due windows and rejection penalties
NASA Astrophysics Data System (ADS)
Garcia, Christopher
2016-09-01
This work studies a scheduling problem where each job must be either accepted and scheduled to complete within its specified due window, or rejected altogether. Each job has a certain processing time and contributes a certain profit if accepted or penalty cost if rejected. There is a set of renewable resources, and no resource limit can be exceeded at any time. Each job requires a certain amount of each resource when processed, and the objective is to maximize total profit. A mixed-integer programming formulation and three approximation algorithms are presented: a priority rule heuristic, an algorithm based on the metaheuristic for randomized priority search and an evolutionary algorithm. Computational experiments comparing these four solution methods were performed on a set of generated benchmark problems covering a wide range of problem characteristics. The evolutionary algorithm outperformed the other methods in most cases, often significantly, and never significantly underperformed any method.
Extinction Map of Baade's Window
NASA Astrophysics Data System (ADS)
Stanek, K. Z.
1996-03-01
Recently Wozniak & Stanek proposed a new method to investigate interstellar extinction, based on two-band photometry, which uses red clump stars as a means to construct the reddening curve. I apply this method to the color-magnitude diagrams obtained by the Optical Gravitational Lensing Experiment to construct an extinction map of a 40' x 40' region of Baade's window, with resolution of ~30". Such a map should be useful for studies of this frequently observed region of the Galactic bulge. The map and software useful for its applications are available via anonymous ftp. The total extinction AV varies from 1.26 to 2.79 mag within the 40' x 40' field of view centered on ( alpha 2000, delta 2000) = (18:03:20.9, -30:02:06), i.e., (l, b) = (1.001, -3.885). The ratio AV/E(V - I) = 2.49 +/- 0.02 is determined with this new method.
Griffiths, Jason I.; Fronhofer, Emanuel A.; Garnier, Aurélie; Seymour, Mathew; Altermatt, Florian; Petchey, Owen L.
2017-01-01
The development of video-based monitoring methods allows for rapid, dynamic and accurate monitoring of individuals or communities compared to slower traditional methods, with far-reaching ecological and evolutionary applications. Large amounts of data are generated using video-based methods, which can be effectively processed using machine learning (ML) algorithms into meaningful ecological information. ML uses user-defined classes (e.g. species), derived from a subset (i.e. training data) of video-observed quantitative features (e.g. phenotypic variation), to infer classes in subsequent observations. However, phenotypic variation often changes with environmental conditions, which may lead to poor classification if environmentally induced variation in phenotypes is not accounted for. Here we describe a framework for classifying species under changing environmental conditions based on random forest classification. A sliding-window approach was developed that restricts the training data to similar temporal and environmental conditions to improve the classification. We tested our approach by applying the classification framework to experimental data. The experiment used a set of six ciliate species to monitor changes in community structure and behavior over hundreds of generations, in dozens of species combinations and across a temperature gradient. Differences in biotic and abiotic conditions caused simplistic classification approaches to be unsuccessful. In contrast, the sliding-window approach allowed classification to be highly successful, as phenotypic differences driven by environmental change could be captured by the classifier. Importantly, classification using the random forest algorithm showed comparable success when validated against traditional, slower, manual identification. Our framework allows for reliable classification in dynamic environments, and may help to improve strategies for long-term monitoring of species in changing environments.
Our classification pipeline can be applied in fields assessing species community dynamics, such as eco-toxicology, ecology and evolutionary ecology. PMID:28472193
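The sliding-window classification idea can be sketched compactly. In the example below a nearest-centroid classifier stands in for the paper's random forest to keep the sketch dependency-free; the "species", traits and temperature-driven drift are all synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_phenotypes(n, drift):
    """Two 'species' whose measured trait drifts with the environment."""
    y = rng.integers(0, 2, n)
    x = y * 2.0 + drift + rng.normal(0.0, 0.5, n)
    return x, y

def nearest_centroid_predict(x_train, y_train, x_test):
    """Stand-in classifier (the paper used a random forest)."""
    c0 = x_train[y_train == 0].mean()
    c1 = x_train[y_train == 1].mean()
    return (np.abs(x_test - c1) < np.abs(x_test - c0)).astype(int)

drifts = np.linspace(0.0, 3.0, 4)    # e.g. a temperature gradient

# naive approach: one classifier trained on the first window only
x0, y0 = make_phenotypes(200, drifts[0])

window_accs, naive_accs = [], []
for d in drifts:
    x_tr, y_tr = make_phenotypes(200, d)   # matched-window training data
    x_te, y_te = make_phenotypes(200, d)
    window_accs.append((nearest_centroid_predict(x_tr, y_tr, x_te) == y_te).mean())
    naive_accs.append((nearest_centroid_predict(x0, y0, x_te) == y_te).mean())

print(min(window_accs) > 0.9 and min(naive_accs) < 0.8)
```

Training within a matched environmental window tracks the drifting phenotypes, whereas the classifier trained under one condition fails once the trait distribution has shifted, which is the failure mode the paper's framework addresses.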
SSVEP recognition using common feature analysis in brain-computer interface.
Zhang, Yu; Zhou, Guoxu; Jin, Jing; Wang, Xingyu; Cichocki, Andrzej
2015-04-15
Canonical correlation analysis (CCA) has been successfully applied to steady-state visual evoked potential (SSVEP) recognition for brain-computer interface (BCI) applications. Although the CCA method outperforms traditional power spectral density analysis through multi-channel detection, it additionally requires pre-constructed reference signals of sine-cosine waves. It is likely to encounter overfitting when using a short time window, since the reference signals include no features from the training data. We consider that a group of electroencephalogram (EEG) data trials recorded at a certain stimulus frequency on the same subject should share some common features that bear the real SSVEP characteristics. This study therefore proposes a common feature analysis (CFA)-based method that exploits these latent common features as natural reference signals when using correlation analysis for SSVEP recognition. Good performance of the CFA method for SSVEP recognition is validated with EEG data recorded from ten healthy subjects, in contrast to CCA and a multiway extension of CCA (MCCA). Experimental results indicate that the CFA method significantly outperformed the CCA and MCCA methods for SSVEP recognition when using a short time window (i.e., less than 1 s). The superiority of the proposed CFA method suggests it is promising for the development of a real-time SSVEP-based BCI.
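The baseline CCA detection that this record improves upon can be sketched with numpy alone, using the QR/SVD formulation of canonical correlation. The simulated EEG, channel count and candidate frequencies below are invented for illustration.

```python
import numpy as np

def max_canonical_corr(X, Y):
    """Largest canonical correlation between the column spaces of X and Y
    (QR/SVD formulation of CCA on column-centered data)."""
    Qx, _ = np.linalg.qr(X - X.mean(axis=0))
    Qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def ssvep_detect(eeg, fs, candidate_freqs, n_harmonics=2):
    """Classic CCA-based SSVEP detection: score each candidate stimulus
    frequency against sine-cosine reference signals and pick the best."""
    t = np.arange(eeg.shape[0]) / fs
    scores = [max_canonical_corr(
                  eeg,
                  np.column_stack([fn(2 * np.pi * h * f * t)
                                   for h in range(1, n_harmonics + 1)
                                   for fn in (np.sin, np.cos)]))
              for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(scores))]

rng = np.random.default_rng(2)
fs, f_true = 250, 10.0
t = np.arange(fs) / fs                        # a short 1 s time window
eeg = (np.sin(2 * np.pi * f_true * t)[:, None]
       + rng.normal(0.0, 1.0, (t.size, 8)))   # 8 noisy EEG channels
print(ssvep_detect(eeg, fs, [8.0, 10.0, 12.0, 15.0]))  # should recover 10 Hz
```

The CFA method proposed in the record replaces the fixed sine-cosine columns with reference signals learned from training trials, which is what mitigates the short-window overfitting noted above.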
NASA Astrophysics Data System (ADS)
Zhao, Jianhua; Zeng, Haishan; Kalia, Sunil; Lui, Harvey
2017-02-01
Background: Raman spectroscopy is a non-invasive optical technique which can measure molecular vibrational modes within tissue. A large-scale clinical study (n = 518) has demonstrated that real-time Raman spectroscopy could distinguish malignant from benign skin lesions with good diagnostic accuracy; this was validated by a follow-up independent study (n = 127). Objective: Most of the previous diagnostic algorithms have typically been based on analyzing the full band of the Raman spectra, either in the fingerprint or high wavenumber regions. Our objective in this presentation is to explore wavenumber selection based analysis in Raman spectroscopy for skin cancer diagnosis. Methods: A wavenumber selection algorithm was implemented using variably-sized wavenumber windows, which were determined by the correlation coefficient between wavenumbers. Wavenumber windows were chosen based on accumulated frequency from leave-one-out cross-validated stepwise regression or the least absolute shrinkage and selection operator (LASSO). The diagnostic algorithms were then generated from the selected wavenumber windows using multivariate statistical analyses, including principal component and general discriminant analysis (PC-GDA) and partial least squares (PLS). A total cohort of 645 confirmed lesions from 573 patients encompassing skin cancers, precancers and benign skin lesions were included. Lesion measurements were divided into training cohort (n = 518) and testing cohort (n = 127) according to the measurement time. Result: The area under the receiver operating characteristic curve (ROC) improved from 0.861-0.891 to 0.891-0.911 and the diagnostic specificity for sensitivity levels of 0.99-0.90 increased respectively from 0.17-0.65 to 0.20-0.75 by selecting specific wavenumber windows for analysis. Conclusion: Wavenumber selection based analysis in Raman spectroscopy improves skin cancer diagnostic specificity at high sensitivity levels.
Wilson, Ander; Chiu, Yueh-Hsiu Mathilda; Hsu, Hsiao-Hsien Leon; Wright, Robert O; Wright, Rosalind J; Coull, Brent A
2017-07-01
Epidemiological research supports an association between maternal exposure to air pollution during pregnancy and adverse children's health outcomes. Advances in exposure assessment and statistics allow for estimation of both critical windows of vulnerability and exposure effect heterogeneity. Simultaneous estimation of windows of vulnerability and effect heterogeneity can be accomplished by fitting a distributed lag model (DLM) stratified by subgroup. However, this can provide an incomplete picture of how effects vary across subgroups because it does not allow for subgroups to have the same window but different within-window effects or to have different windows but the same within-window effect. Because the timing of some developmental processes is common across subpopulations of infants while for others the timing differs across subgroups, both scenarios are important to consider when evaluating health risks of prenatal exposures. We propose a new approach that partitions the DLM into a constrained functional predictor that estimates windows of vulnerability and a scalar effect representing the within-window effect directly. The proposed method allows for heterogeneity in only the window, only the within-window effect, or both. In a simulation study we show that a model assuming a shared component across groups results in lower bias and mean squared error for the estimated windows and effects when that component is in fact constant across groups. We apply the proposed method to estimate windows of vulnerability in the association between prenatal exposures to fine particulate matter and each of birth weight and asthma incidence, and estimate how these associations vary by sex and maternal obesity status in a Boston-area prospective pre-birth cohort study.
Video-based noncooperative iris image segmentation.
Du, Yingzi; Arslanturk, Emrah; Zhou, Zhi; Belcher, Craig
2011-02-01
In this paper, we propose a video-based noncooperative iris image segmentation scheme that incorporates a quality filter to quickly eliminate images without an eye, employs a coarse-to-fine segmentation scheme to improve the overall efficiency, uses a direct least squares fitting of ellipses method to model the deformed pupil and limbic boundaries, and develops a window gradient-based method to remove noise in the iris region. A remote iris acquisition system is set up to collect noncooperative iris video images. An objective method is used to quantitatively evaluate the accuracy of the segmentation results. The experimental results demonstrate the effectiveness of this method. The proposed method would make noncooperative iris recognition or iris surveillance possible.
VISUAL PLUMES CONCEPTS TO POTENTIALLY ADAPT OR ADOPT IN MODELING PLATFORMS SUCH AS VISJET
Windows-based programs share many familiar features and components. For example, file dialogue windows are familiar to most Windows-based personal computer users. Such program elements are desirable because the user is already familiar with how they function, obviating the need f...
Structural Information Detection Based Filter for GF-3 SAR Images
NASA Astrophysics Data System (ADS)
Sun, Z.; Song, Y.
2018-04-01
The GF-3 satellite, with its high resolution, large swath, multiple imaging modes, long service life and other characteristics, can achieve all-weather, all-day monitoring of global land and ocean. It is the highest-resolution C-band multi-polarized synthetic aperture radar (SAR) satellite system in the world. However, owing to the coherent imaging mechanism, speckle appears in GF-3 SAR images and seriously hinders the understanding and interpretation of the images, so their processing poses significant challenges. The high-resolution SAR images produced by the GF-3 satellite are rich in information and contain distinct feature structures such as points, edges and lines. Traditional filters such as the Lee filter and the Gamma MAP filter are not appropriate for GF-3 SAR images since they ignore this structural information. In this paper, a structural-information-detection-based filter is constructed, successively comprising point-target detection in the smallest window, an adaptive windowing method based on regional characteristics, and selection of the most homogeneous sub-window. Despeckling experiments on GF-3 SAR images demonstrate that, compared with the traditional filters, the proposed filter preserves points, edges and lines well while smoothing the speckle more sufficiently.
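As a point of reference, the classic fixed-window Lee filter mentioned above can be sketched in a few lines (additive-noise form). The synthetic edge scene and noise-variance setting below are illustrative only, not the GF-3 processing chain.

```python
import numpy as np

def lee_filter(img, win=5, noise_var=0.1):
    """Classic fixed-window Lee speckle filter (the baseline that
    structure-aware filters improve upon). Additive-noise form:
    out = mean + k * (x - mean), k = max(var - noise_var, 0) / var."""
    pad = win // 2
    padded = np.pad(img, pad, mode='reflect')
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            patch = padded[i:i + win, j:j + win]
            m, v = patch.mean(), patch.var()
            k = max(v - noise_var, 0.0) / (v + 1e-12)
            out[i, j] = m + k * (img[i, j] - m)
    return out

rng = np.random.default_rng(3)
clean = np.ones((32, 32))
clean[:, 16:] = 2.0                                  # an "edge" scene
# multiplicative speckle with unit mean (gamma-distributed, as in SAR)
speckled = clean * rng.gamma(10.0, 1.0 / 10.0, clean.shape)
# noise_var=0.1 is the known unit-mean speckle variance of this simulation
filtered = lee_filter(speckled, win=5, noise_var=0.1)
print(np.mean((filtered - clean) ** 2) < np.mean((speckled - clean) ** 2))
```

Because the window is fixed, pixels near the edge mix both regions, which blurs structure; the adaptive sub-window selection described in the abstract is aimed at exactly this weakness.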
The non-parametric Parzen's window in stereo vision matching.
Pajares, G; de la Cruz, J
2002-01-01
This paper presents an approach to the local stereo-vision matching problem using edge segments as features with four attributes. From these attributes we compute a matching probability between pairs of features of the stereo images. A correspondence is said to be true when this probability is maximal. We introduce a nonparametric strategy based on Parzen's window (1962) to estimate the probability density function (PDF) used to obtain the matching probability. This is the main finding of the paper. A comparative analysis of other recent matching methods is included to show that this finding can be justified theoretically. A generalization of the proposed method is made in order to give guidelines about its use with the similarity constraint and also in different environments where other features and attributes are more suitable.
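Parzen-window density estimation itself is compact enough to illustrate. A minimal Gaussian-kernel sketch follows; the samples and bandwidth are invented and this is not the paper's attribute-matching PDF.

```python
import numpy as np

def parzen_pdf(x, samples, h):
    """Parzen-window (kernel) density estimate with a Gaussian kernel:
    f(x) = (1 / n*h) * sum_i K((x - x_i) / h)."""
    u = (x[:, None] - samples[None, :]) / h
    k = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)
    return k.mean(axis=1) / h

rng = np.random.default_rng(4)
samples = rng.normal(0.0, 1.0, 2000)       # draws from a standard normal
x = np.array([-2.0, 0.0, 2.0])
est = parzen_pdf(x, samples, h=0.3)
true = np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)
# with 2000 samples the estimate tracks the true density closely
print(np.max(np.abs(est - true)) < 0.05)
```

In the paper's setting, `samples` would be vectors of attribute differences from known true and false correspondences, and the estimated PDF would supply the matching probability.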
Radar cross section models for limited aspect angle windows
NASA Astrophysics Data System (ADS)
Robinson, Mark C.
1992-12-01
This thesis presents a method for building Radar Cross Section (RCS) models of aircraft based on static data taken from limited aspect angle windows. These models statistically characterize static RCS. This is done to show that a limited number of samples can be used to effectively characterize static aircraft RCS. The optimum models are determined by performing both a Kolmogorov and a Chi-Square goodness-of-fit test comparing the static RCS data with a variety of probability density functions (pdf) that are known to be effective at approximating the static RCS of aircraft. The optimum parameter estimator is also determined by the goodness-of-fit tests if there is a difference in pdf parameters obtained by the Maximum Likelihood Estimator (MLE) and the Method of Moments (MoM) estimators.
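The goodness-of-fit step can be sketched with a one-sample Kolmogorov-Smirnov statistic. The example below fits an exponential pdf (a common RCS fluctuation model) by the method of moments and compares it against a deliberately wrong model; the data are synthetic, not static-range measurements.

```python
import numpy as np

def ks_statistic(samples, cdf):
    """One-sample Kolmogorov-Smirnov statistic: the largest gap between
    the empirical CDF and the candidate model CDF."""
    x = np.sort(samples)
    n = x.size
    ecdf_hi = np.arange(1, n + 1) / n
    ecdf_lo = np.arange(0, n) / n
    f = cdf(x)
    return max(np.max(ecdf_hi - f), np.max(f - ecdf_lo))

rng = np.random.default_rng(5)
# Swerling-I-like RCS: exponentially distributed power within a window
rcs = rng.exponential(scale=2.0, size=500)

# method-of-moments fit of the exponential scale parameter
scale_mom = rcs.mean()
d_exp = ks_statistic(rcs, lambda x: 1.0 - np.exp(-x / scale_mom))
# a deliberately wrong candidate: uniform on [0, max]
d_uni = ks_statistic(rcs, lambda x: x / rcs.max())
print(d_exp < d_uni)   # the correct family yields the smaller KS distance
```

Repeating this for each candidate pdf and each estimator (MLE vs. MoM) within every aspect-angle window, and keeping the smallest test statistic, is the model-selection loop the abstract describes.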
Sojoudi, Alireza; Goodyear, Bradley G
2016-12-01
Spontaneous fluctuations of blood-oxygenation level-dependent functional magnetic resonance imaging (BOLD fMRI) signals are highly synchronous between brain regions that serve similar functions. This provides a means to investigate functional networks; however, most analysis techniques assume functional connections are constant over time. This may be problematic in the case of neurological disease, where functional connections may be highly variable. Recently, several methods have been proposed to determine moment-to-moment changes in the strength of functional connections over an imaging session (so-called dynamic connectivity). Here a novel analysis framework based on a hierarchical observation modeling approach is proposed to permit statistical inference of the presence of dynamic connectivity. A two-level linear model is described, composed of overlapping sliding windows of fMRI signals and incorporating the fact that overlapping windows are not independent. To test this approach, datasets were synthesized whereby functional connectivity was either constant (significant or insignificant) or modulated by an external input. The method successfully determines the statistical significance of a functional connection in phase with the modulation, and it exhibits greater sensitivity and specificity in detecting regions with variable connectivity when compared with sliding-window correlation analysis. For real data, this technique possesses greater reproducibility and provides a more discriminative estimate of dynamic connectivity than sliding-window correlation analysis. Hum Brain Mapp 37:4566-4580, 2016.
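The sliding-window correlation baseline against which the hierarchical model is compared is easy to sketch. Below, two synthetic "BOLD-like" series have their coupling switched on halfway through; window and series lengths are illustrative.

```python
import numpy as np

def sliding_window_corr(a, b, win):
    """Sliding-window Pearson correlation between two time series."""
    return np.array([np.corrcoef(a[i:i + win], b[i:i + win])[0, 1]
                     for i in range(len(a) - win + 1)])

rng = np.random.default_rng(6)
n, win = 600, 60
shared = rng.normal(size=n)
# coupling modulated by an external input: off in the first half, on after
mod = np.zeros(n)
mod[n // 2:] = 1.0
a = shared * mod + rng.normal(0.0, 1.0, n)
b = shared * mod + rng.normal(0.0, 1.0, n)

r = sliding_window_corr(a, b, win)
early, late = r[:100].mean(), r[-100:].mean()
print(late > early + 0.2)   # correlation rises once coupling switches on
```

The limitation the paper addresses is visible here: consecutive windows share most of their samples, so the correlation series is strongly autocorrelated, and naive inference on it overstates the evidence for dynamic connectivity.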
Mirmohseni, A; Abdollahi, H; Rostamizadeh, K
2007-02-28
A net analyte signal (NAS)-based method called HLA/GO was applied for the selective determination of binary mixtures of ethanol and water using a quartz crystal nanobalance (QCN) sensor. A full factorial design was applied for the formation of the calibration and prediction sets in the concentration ranges 5.5-22.2 microg mL(-1) for ethanol and 7.01-28.07 microg mL(-1) for water. An optimal time range was selected by a procedure based on the calculation of the NAS regression plot in each considered time window for each test sample. A moving-window strategy was used to search for the region with maximum linearity of the NAS regression plot (minimum error indicator) and minimum PRESS value. On the basis of the obtained results, the differences in the adsorption profiles in the time range between 1 and 600 s were used to determine mixtures of both compounds by the HLA/GO method. The calculation of the net analyte signal using the HLA/GO method allows the determination of several figures of merit, such as selectivity, sensitivity, analytical sensitivity and limit of detection, for each component. To check the ability of the proposed method to select linear regions of the adsorption profile, a test for detecting non-linear regions of the adsorption profile data in the presence of methanol is also described. The results showed that the method was successfully applied for the determination of ethanol and water.
Barlow, Paul M.; Cunningham, William L.; Zhai, Tong; Gray, Mark
2015-01-01
This report is a user guide for the streamflow-hydrograph analysis methods provided with version 1.0 of the U.S. Geological Survey (USGS) Groundwater Toolbox computer program. These include six hydrograph-separation methods to determine the groundwater-discharge (base-flow) and surface-runoff components of streamflow—the Base-Flow Index (BFI; Standard and Modified), HYSEP (Fixed Interval, Sliding Interval, and Local Minimum), and PART methods—and the RORA recession-curve displacement method and associated RECESS program to estimate groundwater recharge from streamflow data. The Groundwater Toolbox is a customized interface built on the nonproprietary, open source MapWindow geographic information system software. The program provides graphing, mapping, and analysis capabilities in a Microsoft Windows computing environment. In addition to these hydrograph-analysis methods, the Groundwater Toolbox allows for the retrieval of hydrologic time-series data (streamflow, groundwater levels, and precipitation) from the USGS National Water Information System, downloading of a suite of preprocessed geographic information system coverages and meteorological data from the National Oceanic and Atmospheric Administration National Climatic Data Center, and analysis of data with several preprocessing and postprocessing utilities. With its data retrieval and analysis tools, the Groundwater Toolbox provides methods to estimate many of the components of the water budget for a hydrologic basin, including precipitation; streamflow; base flow; runoff; groundwater recharge; and total, groundwater, and near-surface evapotranspiration.
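The fixed-interval idea behind HYSEP-style separation can be sketched simply. The interval length and synthetic hydrograph below are illustrative; the Groundwater Toolbox's actual implementation follows the published HYSEP definitions.

```python
import numpy as np

def fixed_interval_baseflow(flow, interval):
    """HYSEP-style fixed-interval separation sketch: base flow within
    each interval equals the minimum streamflow observed in it."""
    base = np.empty_like(flow, dtype=float)
    for start in range(0, len(flow), interval):
        seg = flow[start:start + interval]
        base[start:start + interval] = seg.min()
    return np.minimum(base, flow)   # base flow cannot exceed streamflow

# synthetic daily streamflow: slow base flow plus storm-runoff spikes
days = np.arange(120)
baseflow_true = 10.0 + 2.0 * np.sin(2 * np.pi * days / 120)
flow = baseflow_true.copy()
flow[[20, 21, 60, 61, 100]] += [40, 25, 60, 30, 50]   # storm events

base = fixed_interval_baseflow(flow, interval=5)
bfi = base.sum() / flow.sum()   # base-flow index of the record
print(0 < bfi < 1 and np.all(base <= flow))
```

In the real method the interval width is derived from drainage area (2N*, where N is an empirical function of basin size), and the Sliding Interval and Local Minimum variants differ only in how the minima are assigned across the record.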
Peterson, Leif E
2002-01-01
CLUSFAVOR (CLUSter and Factor Analysis with Varimax Orthogonal Rotation) 5.0 is a Windows-based computer program for hierarchical cluster and principal-component analysis of microarray-based transcriptional profiles. CLUSFAVOR 5.0 standardizes input data; sorts data according to gene-specific coefficient of variation, standard deviation, average and total expression, and Shannon entropy; performs hierarchical cluster analysis using nearest-neighbor, unweighted pair-group method using arithmetic averages (UPGMA), or furthest-neighbor joining methods, and Euclidean, correlation, or jack-knife distances; and performs principal-component analysis. PMID:12184816
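UPGMA clustering of standardized expression profiles, as CLUSFAVOR performs, can be sketched with scipy; the toy "gene" profiles and two-group structure below are invented for the example.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

rng = np.random.default_rng(7)
# toy "expression profiles": two groups of genes with opposite patterns
group_a = rng.normal(0.0, 0.2, (10, 6)) + np.array([1, 1, 1, 0, 0, 0])
group_b = rng.normal(0.0, 0.2, (10, 6)) + np.array([0, 0, 0, 1, 1, 1])
profiles = np.vstack([group_a, group_b])

# standardize each gene's profile, as CLUSFAVOR does before clustering
z = (profiles - profiles.mean(1, keepdims=True)) / profiles.std(1, keepdims=True)

# UPGMA = average linkage on pairwise Euclidean distances
tree = linkage(pdist(z, metric='euclidean'), method='average')
labels = fcluster(tree, t=2, criterion='maxclust')
print(len(set(labels[:10])) == 1 and len(set(labels[10:])) == 1)
```

Swapping `method='average'` for `'single'` or `'complete'` gives the nearest-neighbor and furthest-neighbor joining options the program also offers.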
Design strategies to minimize the radiative efficiency of global warming molecules
Bera, Partha P.; Francisco, Joseph S.; Lee, Timothy J.
2010-01-01
A strategy is devised to screen molecules based on their radiative efficiency. The methodology should be useful as one additional constraint when determining the best molecule to use for an industrial application. The strategy is based on the results of a recent study where we examined molecular properties of global warming molecules using ab initio electronic structure methods to determine which fundamental molecular properties are important in assessing the radiative efficiency of a molecule. Six classes of perfluorinated compounds are investigated. For similar numbers of fluorine atoms, their absorption of radiation in the IR window decreases in the order perfluoroethers > perfluorothioethers ≈ sulfur/carbon compounds > perfluorocarbons > perfluoroolefins > carbon/nitrogen compounds. Perfluoroethers and hydrofluoroethers are shown to possess a large absorption in the IR window because (i) the C─O bonds are very polar, (ii) the C─O stretches fall within the IR window and have large IR intensity owing to that polarity, and (iii) the IR intensity of C─F stretches in which the fluorine atom is bonded to the carbon adjacent to the oxygen atom is enhanced by the larger C─F bond polarity. Lengthening the carbon chain leads to a larger overall absorption in the IR window, though the IR intensity per bond is smaller. Finally, for a class of partially fluorinated compounds with a set number of electronegative atoms, the overall absorption in the IR window can vary significantly, by as much as a factor of 2, depending on how the fluorine atoms are distributed within the molecule. PMID:20439762
Liu, Xue-Li; Gai, Shuang-Shuang; Zhang, Shi-Le; Wang, Pu
2015-01-01
Background An important attribute of the traditional impact factor was the controversial 2-year citation window. Several scholars have proposed using different citation time windows for evaluating journals, but there is no confirmation that a longer citation time window is better, or of how the journal evaluation effects of 3IF, 4IF, and 6IF compare with those of 2IF and 5IF. To address these questions, we made a comparative study of impact factors with different citation time windows against the peer-reviewed scores of ophthalmologic journals indexed by the Science Citation Index Expanded (SCIE) database. Methods The peer-reviewed scores of 28 ophthalmologic journals were obtained through a self-designed survey questionnaire. Impact factors with different citation time windows (2IF, 3IF, 4IF, 5IF, and 6IF) of the 28 journals were computed and compared in accordance with each impact factor's definition and formula, using the citation analysis function of the Web of Science (WoS) database. The correlation between impact factors with different citation time windows and peer-reviewed scores was then analyzed. Results Although the impact factor values with different citation time windows differed, they were highly correlated with one another when it came to evaluating journals. For ophthalmologic journals' impact factors with different time windows in 2013, 3IF and 4IF seemed the ideal choices when assessed against peer-reviewed scores. In addition, the 3-year and 4-year windows were quite consistent with the cited peak age of documents published by ophthalmologic journals.
Research Limitations Our study is based on ophthalmology journals, and we only analyzed impact factors with different citation time windows in 2013, so it has yet to be ascertained whether other disciplines (especially those with a later cited peak) or other years would follow the same or similar patterns. Originality/Value We designed the survey questionnaire ourselves, specifically to assess the real influence of journals, and used peer-reviewed scores to judge the journal evaluation effect of impact factors with different citation time windows. The main purpose of this study was to help researchers better understand the role of impact factors with different citation time windows in journal evaluation. PMID:26295157
Efficient full-chip SRAF placement using machine learning for best accuracy and improved consistency
NASA Astrophysics Data System (ADS)
Wang, Shibing; Baron, Stanislas; Kachwala, Nishrin; Kallingal, Chidam; Sun, Dezheng; Shu, Vincent; Fong, Weichun; Li, Zero; Elsaid, Ahmad; Gao, Jin-Wei; Su, Jing; Ser, Jung-Hoon; Zhang, Quan; Chen, Been-Der; Howell, Rafael; Hsu, Stephen; Luo, Larry; Zou, Yi; Zhang, Gary; Lu, Yen-Wen; Cao, Yu
2018-03-01
Various computational approaches from rule-based to model-based methods exist to place Sub-Resolution Assist Features (SRAF) in order to increase the process window for lithography. Each method has its advantages and drawbacks, and typically requires the user to make a trade-off between time of development, accuracy, consistency and cycle time. Rule-based methods, used since the 90 nm node, require long development time and struggle to achieve good process window performance for complex patterns. Heuristically driven, their development is often iterative and involves significant engineering time from multiple disciplines (Litho, OPC and DTCO). Model-based approaches have been widely adopted since the 20 nm node. While the development of model-driven placement methods is relatively straightforward, they often become computationally expensive when high accuracy is required. Furthermore, these methods tend to yield less consistent SRAFs due to the nature of the approach: they rely on a model that is sensitive to the pattern placement on the native simulation grid, and can be affected by related grid-dependency effects. These undesirable effects tend to become stronger when more iterations or more complexity are needed in the algorithm to achieve the required accuracy. ASML Brion has developed a new SRAF placement technique on the Tachyon platform that is assisted by machine learning and significantly improves the accuracy of full-chip SRAF placement while keeping consistency and runtime under control. A Deep Convolutional Neural Network (DCNN) is trained using the target wafer layout and corresponding Continuous Transmission Mask (CTM) images. These CTM images have been fully optimized using the Tachyon inverse mask optimization engine. The neural-network-generated SRAF guidance map is then used to place SRAFs on the full chip.
This is different from our existing full-chip MB-SRAF approach, which utilizes an SRAF guidance map (SGM) of mask sensitivity to improve the contrast of the optical image at the target pattern edges. In this paper, we demonstrate that machine learning assisted SRAF placement can achieve a superior process window compared to the SGM model-based SRAF method, while keeping the full-chip runtime affordable and maintaining the consistency of SRAF placement. We describe the current status of this machine learning assisted SRAF technique, demonstrate its application to full-chip mask synthesis, and discuss how it can extend the computational lithography roadmap.
The dynamic financial distress prediction method of EBW-VSTW-SVM
NASA Astrophysics Data System (ADS)
Sun, Jie; Li, Hui; Chang, Pei-Chann; He, Kai-Yu
2016-07-01
Financial distress prediction (FDP) plays an important role in corporate financial risk management. Most former research in this field tried to construct effective static FDP (SFDP) models, which are difficult to embed into enterprise information systems because they are based on horizontal data-sets collected outside the modelling enterprise and define financial distress in terms of absolute conditions such as bankruptcy or insolvency. This paper proposes an approach for dynamic evaluation and prediction of financial distress based on entropy-based weighting (EBW), the support vector machine (SVM) and an enterprise's vertical sliding time window (VSTW). The dynamic FDP (DFDP) method, named EBW-VSTW-SVM, keeps updating the FDP model as time goes on and needs only the historic financial data of the modelling enterprise itself, which makes it easier to embed into enterprise information systems. The EBW-VSTW-SVM method consists of four steps, namely evaluation of vertical relative financial distress (VRFD) based on EBW, construction of the training data-set for DFDP modelling according to VSTW, training of the DFDP model based on SVM and DFDP for the future time point. We carry out case studies for two listed pharmaceutical companies and experimental analysis for some other companies to simulate the sliding of the enterprise vertical time window. The results indicate that the proposed approach is feasible and efficient in helping managers improve corporate financial management.
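The EBW step above follows the standard entropy weight method; a minimal sketch is given below. This is a generic illustration (the indicator matrix and the benefit-type min-max normalization are assumptions, not the paper's exact procedure):

```python
import numpy as np

def entropy_weights(X):
    """Entropy-based weights for an indicator matrix X
    (rows: time periods, cols: financial ratios).
    Assumes benefit-type indicators that vary across periods."""
    X = np.asarray(X, dtype=float)
    n, m = X.shape
    # min-max normalize each indicator column to [0, 1]
    P = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    P = P / P.sum(axis=0)                        # column-stochastic proportions
    logs = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)  # 0*log0 := 0
    e = -(P * logs).sum(axis=0) / np.log(n)      # entropy of each indicator in [0, 1]
    d = 1.0 - e                                  # degree of diversification
    return d / d.sum()                           # weights summing to 1

# an indicator with more spread gets a higher weight than a nearly constant one
w = entropy_weights([[0.2, 5.0], [0.4, 5.1], [0.9, 5.0], [0.1, 4.9]])
```

The resulting weights would then score each period's relative financial distress before the SVM training step.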
Updated System-Availability and Resource-Allocation Program
NASA Technical Reports Server (NTRS)
Viterna, Larry
2004-01-01
A second version of the Availability, Cost and Resource Allocation (ACARA) computer program has become available. The first version was reported in an earlier tech brief. To recapitulate: ACARA analyzes the availability, component mean time between failures, life-cycle costs, and resource scheduling of a complex system of equipment. ACARA uses a statistical Monte Carlo method to simulate the failure and repair of components while complying with user-specified constraints on spare parts and resources. ACARA evaluates the performance of the system on the basis of a mathematical model developed from a block-diagram representation. The previous version ran under the MS-DOS operating system and could not be run under the most recent versions of the Windows operating system. The current version incorporates the algorithms of the previous version but is compatible with Windows and utilizes menus and a file-management approach typical of Windows-based software.
[Discussion on relationship between "living alone with closed windows and doors" and depression].
Liu, Fengfeng; Li, Rui
2018-03-12
"Living alone with closed windows and doors" was mentioned among the pathological manifestations of the stomach meridian of foot-yangming in Neijing (The Inner Canon of Huangdi), and is similar to the symptoms of depression. Currently the treatment of depression is mostly based on the "spirit being stored in the five organs" theory, and little attention is paid to the stomach meridian of foot-yangming. Starting from the pathological manifestation of "living alone with closed windows and doors" in the stomach meridian of foot-yangming, the relationship between the stomach meridian and depression is discussed in terms of yin-yang and qi-blood. In addition, the close relationship between the stomach meridian and qi-blood, the qi movement of the five organs, and the heart and brain is discussed to explore the mechanism of treating depression. In conclusion, the literature and modern research on treating depression from the stomach meridian are summarized, hoping to provide more clinical methods for the treatment of depression.
Tunable far-infrared plasmonically induced transparency in graphene based nano-structures
NASA Astrophysics Data System (ADS)
Dolatabady, Alireza; Granpayeh, Nosrat
2018-07-01
In this paper, a structure is proposed to realize tunable far-infrared plasmonically induced transparency. The structure comprises a nano-ribbon waveguide side-coupled to nano-stub resonators. The effect arises from the coupling between consecutive nano-stub resonators spaced at properly designed distances, which provides constructive interference in the Fabry–Perot-like cavity formed between them; accordingly, periodic values of the nano-stub separation produce transparency windows. Increasing the number of nano-stubs increases the number of transparency windows at different frequencies. The structure is theoretically investigated and numerically simulated using the finite difference time domain method. Owing to the chemical potential dependency of graphene conductivity, the transparency windows can be actively tuned. The proposed component can be extensively utilized in nano-scale switching and slow-light systems.
Polyp measurement with CT colonography: multiple-reader, multiple-workstation comparison.
Young, Brett M; Fletcher, J G; Paulsen, Scott R; Booya, Fargol; Johnson, C Daniel; Johnson, Kristina T; Melton, Zackary; Rodysill, Drew; Mandrekar, Jay
2007-01-01
The risk of invasive colorectal cancer in colorectal polyps correlates with lesion size. Our purpose was to define the most accurate methods for measuring polyp size at CT colonography (CTC) using three models of workstations and multiple observers. Six reviewers measured 24 unique polyps of known size (5, 7, 10, and 12 mm), shape (sessile, flat, and pedunculated), and location (straight or curved bowel segment) using CTC data sets obtained at two doses (5 mAs and 65 mAs) and a previously described colonic phantom model. Reviewers measured the largest diameter of polyps on three proprietary workstations. Each polyp was measured with lung and soft-tissue windows on axial, 2D multiplanar reconstruction (MPR), and 3D images. There were significant differences among measurements obtained at various settings within each workstation (p < 0.0001). Measurements on 2D images were more accurate with lung window than with soft-tissue window settings (p < 0.0001). For the 65-mAs data set, the most accurate measurements were obtained in analysis of axial images with lung window, 2D MPR images with lung window, and 3D tissue cube images for Wizard, Advantage, and Vitrea workstations, respectively, without significant differences in accuracy among techniques (0.11 < p < 0.59). The mean absolute error values for these optimal settings were 0.48 mm, 0.61 mm, and 0.76 mm, respectively, for the three workstations. Within the ultralow-dose 5-mAs data set the best methods for Wizard, Advantage, and Vitrea were axial with lung window, 2D MPR with lung window, and 2D MPR with lung window, respectively. Use of nearly all measurement methods, except for the Vitrea 3D tissue cube and the Wizard 2D MPR with lung window, resulted in undermeasurement of the true size of the polyps. Use of CTC computer workstations facilitates accurate polyp measurement. 
For routine CTC examinations, polyps should be measured with lung window settings on 2D axial or MPR images (Wizard and Advantage) or 3D images (Vitrea). When these optimal methods are used, these three commercial workstations do not differ significantly in acquisition of accurate polyp measurements at routine dose settings.
NASA Astrophysics Data System (ADS)
Ojeda, GermáN. Y.; Whitman, Dean
2002-11-01
The effective elastic thickness (Te) of the lithosphere is a parameter that describes the flexural strength of a plate. A method routinely used to quantify this parameter is to calculate the coherence between the two-dimensional gravity and topography spectra. Prior to spectra calculation, data grids must be "windowed" in order to avoid edge effects. We investigated the sensitivity of Te estimates obtained via the coherence method to mirroring, Hanning and multitaper windowing techniques on synthetic data as well as on data from northern South America. These analyses suggest that the choice of windowing technique plays an important role in Te estimates and may result in discrepancies of several kilometers depending on the selected windowing method. Te results from mirrored grids tend to be greater than those from Hanning smoothed or multitapered grids. Results obtained from mirrored grids are likely to be over-estimates. This effect may be due to artificial long wavelengths introduced into the data at the time of mirroring. Coherence estimates obtained from three subareas in northern South America indicate that the average effective elastic thickness is in the range of 29-30 km, according to Hanning and multitaper windowed data. Lateral variations across the study area could not be unequivocally determined from this study. We suggest that the resolution of the coherence method does not permit evaluation of small (i.e., ˜5 km), local Te variations. However, the efficiency and robustness of the coherence method in rendering continent-scale estimates of elastic thickness has been confirmed.
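The grid "windowing" step discussed above can be illustrated with a separable 2-D Hanning taper (the outer product of two 1-D Hann windows) applied to a gridded field before computing its 2-D spectrum. This is a generic sketch of the tapering idea, not the authors' processing chain; the grid is random stand-in data:

```python
import numpy as np

ny, nx = 64, 64
grid = np.random.default_rng(1).normal(size=(ny, nx))  # stand-in topography grid
w2d = np.outer(np.hanning(ny), np.hanning(nx))         # 2-D Hann taper
spec = np.fft.fft2(grid * w2d)                         # spectrum of the tapered grid
```

The taper goes to zero at the grid edges, which suppresses the edge-effect leakage that motivates windowing; mirroring, by contrast, extends the grid symmetrically and can introduce artificial long wavelengths, as the abstract notes.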
Detecting, anticipating, and predicting critical transitions in spatially extended systems.
Kwasniok, Frank
2018-03-01
A data-driven linear framework for detecting, anticipating, and predicting incipient bifurcations in spatially extended systems based on principal oscillation pattern (POP) analysis is discussed. The dynamics are assumed to be governed by a system of linear stochastic differential equations which is estimated from the data. The principal modes of the system together with corresponding decay or growth rates and oscillation frequencies are extracted as the eigenvectors and eigenvalues of the system matrix. The method can be applied to stationary datasets to identify the least stable modes and assess the proximity to instability; it can also be applied to nonstationary datasets using a sliding window approach to track the changing eigenvalues and eigenvectors of the system. As a further step, a genuinely nonstationary POP analysis is introduced. Here, the system matrix of the linear stochastic model is time-dependent, allowing for extrapolation and prediction of instabilities beyond the learning data window. The methods are demonstrated and explored using the one-dimensional Swift-Hohenberg equation as an example, focusing on the dynamics of stochastic fluctuations around the homogeneous stable state prior to the first bifurcation. The POP-based techniques are able to extract and track the least stable eigenvalues and eigenvectors of the system; the nonstationary POP analysis successfully predicts the timing of the first instability and the unstable mode well beyond the learning data window.
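The core POP estimation step can be sketched as follows: fit a one-step linear propagator to the data by least squares and read decay rates and oscillation frequencies off its eigenvalues. This is a minimal discrete-time illustration on a synthetic 2-D damped oscillator (the formulation and all parameter values are assumptions for illustration, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(2)
dt = 0.1
# ground truth: continuous-time eigenvalue -0.2 + 2i (decay rate 0.2, freq 2 rad/s)
r, th = np.exp(-0.2 * dt), 2.0 * dt
A_true = r * np.array([[np.cos(th), -np.sin(th)],
                       [np.sin(th),  np.cos(th)]])   # exact one-step propagator
x = np.zeros((5000, 2))
for t in range(4999):
    x[t + 1] = A_true @ x[t] + 0.1 * rng.normal(size=2)

# fit x_{t+1} = A x_t + noise by least squares
X0, X1 = x[:-1], x[1:]
A_hat = np.linalg.lstsq(X0, X1, rcond=None)[0].T
growth = np.log(np.linalg.eigvals(A_hat)) / dt       # continuous-time eigenvalues
decay_rate, freq = growth[0].real, abs(growth[0].imag)
```

In the nonstationary variant described above, the same fit would be repeated in a sliding window (or with a time-dependent system matrix) to track how `decay_rate` approaches zero ahead of a bifurcation.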
Multilocus Association Mapping Using Variable-Length Markov Chains
Browning, Sharon R.
2006-01-01
I propose a new method for association-based gene mapping that makes powerful use of multilocus data, is computationally efficient, and is straightforward to apply over large genomic regions. The approach is based on the fitting of variable-length Markov chain models, which automatically adapt to the degree of linkage disequilibrium (LD) between markers to create a parsimonious model for the LD structure. Edges of the fitted graph are tested for association with trait status. This approach can be thought of as haplotype testing with sophisticated windowing that accounts for extent of LD to reduce degrees of freedom and number of tests while maximizing information. I present analyses of two published data sets that show that this approach can have better power than single-marker tests or sliding-window haplotypic tests. PMID:16685642
NASA Astrophysics Data System (ADS)
Magdy, Nancy; Ayad, Miriam F.
2015-02-01
Two simple, accurate, precise, sensitive and economic spectrophotometric methods were developed for the simultaneous determination of Simvastatin and Ezetimibe in fixed dose combination products without prior separation. The first is a chemometrics-assisted ratio spectra derivative method using moving-window polynomial least-squares fitting (Savitzky-Golay filters). The second is based on a simple modification of the ratio subtraction method. The suggested methods were validated according to USP guidelines and can be applied for routine quality control testing.
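The ratio-spectra derivative step can be sketched as below: divide the mixture spectrum by the spectrum of one component, then take a smoothed first derivative with a moving-window polynomial least-squares (Savitzky-Golay) filter. The synthetic spectra, wavelength grid, window length and polynomial order here are illustrative assumptions, not the validated method parameters:

```python
import numpy as np
from scipy.signal import savgol_filter

wl = np.linspace(200, 400, 201)                 # wavelength grid, nm
comp_a = np.exp(-((wl - 240) / 15.0) ** 2)      # synthetic component spectra
comp_b = np.exp(-((wl - 300) / 20.0) ** 2)
mixture = 0.7 * comp_a + 0.3 * comp_b           # two-component mixture
ratio = mixture / (comp_b + 1e-12)              # ratio spectrum w.r.t. component B
# moving-window polynomial least-squares first derivative
deriv = savgol_filter(ratio, window_length=11, polyorder=3, deriv=1)
```

In the real method, the derivative amplitude at a selected wavelength would be read against a calibration curve for quantitation.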
Errors in the estimation method for the rejection of vibrations in adaptive optics systems
NASA Astrophysics Data System (ADS)
Kania, Dariusz
2017-06-01
In recent years the problem of the impact of mechanical vibrations in adaptive optics (AO) systems has been renewed. These vibrations are damped sinusoidal signals and have a deleterious effect on the system. One software solution for rejecting them is an adaptive method called AVC (Adaptive Vibration Cancellation), whose procedure has three steps: estimation of the perturbation parameters, estimation of the frequency response of the plant, and updating of the reference signal to reject/minimize the vibration. In the first step the choice of estimation method is very important. A very accurate and fast (below 10 ms) method for estimating these three parameters has been presented in several publications in recent years. The method is based on spectrum interpolation and MSD time windows and can be used to estimate multifrequency signals. In this paper the estimation method is used in the AVC method to increase the system performance. Several parameters affect the accuracy of the obtained results, e.g. CiR - the number of signal periods in a measurement window, N - the number of samples in the FFT procedure, H - the time window order, SNR, b - the number of ADC bits, and γ - the damping ratio of the tested signal. Systematic errors increase when N, CiR and H decrease and when γ increases. The systematic error is approximately 10^-10 Hz/Hz for N = 2048 and CiR = 0.1. This paper presents equations that can be used to estimate the maximum systematic errors for given values of H, CiR and N before the start of the estimation process.
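The parameter-estimation idea can be illustrated with a generic windowed-DFT frequency estimate for a damped sinusoid. A Hann window and parabolic interpolation of the log-magnitude peak stand in for the paper's MSD windows and interpolation formulas, and the signal parameters are arbitrary test values:

```python
import numpy as np

fs, N = 1000.0, 2048
f0, gamma = 123.4, 5.0                              # true frequency (Hz), damping (1/s)
t = np.arange(N) / fs
x = np.exp(-gamma * t) * np.sin(2 * np.pi * f0 * t)

X = np.abs(np.fft.rfft(x * np.hanning(N)))          # windowed spectrum
k = int(np.argmax(X))                               # coarse peak bin
# parabolic interpolation of the log-magnitude peak for a sub-bin estimate
a, b, c = np.log(X[k - 1]), np.log(X[k]), np.log(X[k + 1])
delta = 0.5 * (a - c) / (a - 2 * b + c)
f_est = (k + delta) * fs / N
```

As the abstract notes, the accuracy of such estimates depends on the window, N, the number of periods in the window, and the damping ratio of the signal.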
Ding, Jun; Arigong, Bayaner; Ren, Han; Zhou, Mi; Shao, Jin; Lu, Meng; Chai, Yang; Lin, Yuankun; Zhang, Hualiang
2014-01-01
Novel graphene-based tunable plasmonic metamaterials featuring single and multiple transparency windows are numerically studied in this paper. The designed structures consist of a graphene layer perforated with quadrupole slot structures and dolmen-like slot structures printed on a substrate. Specifically, the graphene-based quadrupole slot structure can realize a single transparency window, which is achieved without breaking the structure symmetry. Further investigations have shown that the single transparency window in the proposed quadrupole slot structure is more likely originated from the quantum effect of Autler-Townes splitting. Then, by introducing a dipole slot to the quadrupole slot structure to form the dolmen-like slot structure, an additional transmission dip could occur in the transmission spectrum, thus, a multiple-transparency-window system can be achieved (for the first time for graphene-based devices). More importantly, the transparency windows for both the quadrupole slot and the dolmen-like slot structures can be dynamically controlled over a broad frequency range by varying the Fermi energy levels of the graphene layer (through electrostatic gating). The proposed slot metamaterial structures with tunable single and multiple transparency windows could find potential applications in many areas such as multiple-wavelength slow-light devices, active plasmonic switching, and optical sensing. PMID:25146672
Radiation-transparent windows, method for imaging fluid transfers
Shu, Deming [Darien, IL; Wang, Jin [Burr Ridge, IL
2011-07-26
A thin, x-ray-transparent window system for environmental chambers involving pneumatic pressures above 40 bar is presented. The window allows for x-ray access to such phenomena as fuel sprays injected into a pressurized chamber that mimics realistic internal combustion engine cylinder operating conditions.
Li, Rongxia; Stewart, Brock; Weintraub, Eric
2016-01-01
The self-controlled case series (SCCS) and self-controlled risk interval (SCRI) designs have recently become widely used in the field of post-licensure vaccine safety monitoring to detect potential elevated risks of adverse events following vaccinations. The SCRI design can be viewed as a subset of the SCCS method in that a reduced comparison time window is used for the analysis. Compared to the SCCS method, the SCRI design has less statistical power due to fewer events occurring in the shorter control interval. In this study, we derived the asymptotic relative efficiency (ARE) between these two methods to quantify this loss in power in the SCRI design. The equation is formulated as [Formula: see text] (a: control window-length ratio between SCRI and SCCS designs; b: ratio of risk window length and control window length in the SCCS design; and [Formula: see text]: relative risk of exposed window to control window). According to this equation, the relative efficiency declines as the ratio of control-period length between SCRI and SCCS methods decreases, or with an increase in the relative risk [Formula: see text]. We provide an example utilizing data from the Vaccine Safety Datalink (VSD) to study the potential elevated risk of febrile seizure following seasonal influenza vaccine in the 2010-2011 season.
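The qualitative efficiency loss described above can be reproduced with a crude Monte Carlo sketch. This is not the paper's closed-form ARE derivation (which conditions on total events); it simply compares the variance of a naive log relative-risk estimate when the control window is shortened from a full SCCS control interval to a reduced SCRI one. Window lengths, the relative risk and the event-rate scale `mu` are arbitrary illustration values:

```python
import numpy as np

rng = np.random.default_rng(3)
mu, rho = 50.0, 2.0                  # baseline events per unit time, true relative risk
risk_len, ctrl_sccs, ctrl_scri = 1.0, 10.0, 3.0
n = 20000                            # number of simulated replicates

def log_rr_estimates(ctrl_len):
    """Naive log relative-risk estimates from Poisson counts in each window."""
    x_risk = rng.poisson(mu * rho * risk_len, size=n)
    x_ctrl = rng.poisson(mu * ctrl_len, size=n)
    return np.log(x_risk / risk_len) - np.log(x_ctrl / ctrl_len)

var_sccs = np.var(log_rr_estimates(ctrl_sccs))
var_scri = np.var(log_rr_estimates(ctrl_scri))
efficiency = var_sccs / var_scri     # below 1: SCRI is less efficient than SCCS
```

Shrinking the control window further, or raising the relative risk, drives `efficiency` down, consistent with the trend described in the abstract.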
Code of Federal Regulations, 2010 CFR
2010-07-01
..., but not limited to, certain window, floor, and stair surfaces. Impact surface means an interior or.... Interior window sill means the portion of the horizontal window ledge that protrudes into the interior of... based on the equation [60+(3*100)+(4*110)]/(1+3+4). Window trough means, for a typical double-hung...
Potential for Bias When Estimating Critical Windows for Air Pollution in Children's Health.
Wilson, Ander; Chiu, Yueh-Hsiu Mathilda; Hsu, Hsiao-Hsien Leon; Wright, Robert O; Wright, Rosalind J; Coull, Brent A
2017-12-01
Evidence supports an association between maternal exposure to air pollution during pregnancy and children's health outcomes. Recent interest has focused on identifying critical windows of vulnerability. An analysis based on a distributed lag model (DLM) can yield estimates of a critical window that are different from those from an analysis that regresses the outcome on each of the 3 trimester-average exposures (TAEs). Using a simulation study, we assessed bias in estimates of critical windows obtained using 3 regression approaches: 1) 3 separate models to estimate the association with each of the 3 TAEs; 2) a single model to jointly estimate the association between the outcome and all 3 TAEs; and 3) a DLM. We used weekly fine-particulate-matter exposure data for 238 births in a birth cohort in and around Boston, Massachusetts, and a simulated outcome and time-varying exposure effect. Estimates using separate models for each TAE were biased and identified incorrect windows. This bias arose from seasonal trends in particulate matter that induced correlation between TAEs. Including all TAEs in a single model reduced bias. DLM produced unbiased estimates and added flexibility to identify windows. Analysis of body mass index z score and fat mass in the same cohort highlighted inconsistent estimates from the 3 methods.
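The seasonality-induced bias can be reproduced in a toy simulation. All sizes, the sinusoidal exposure model and the placement of the true effect are invented for illustration (and a plain joint regression stands in for the paper's DLM):

```python
import numpy as np

rng = np.random.default_rng(4)
n, weeks = 2000, 39
start = rng.integers(0, 52, size=n)                   # conception week of year
week_idx = start[:, None] + np.arange(weeks)
# seasonal weekly exposure plus noise
expo = 10 + 3 * np.sin(2 * np.pi * week_idx / 52) + rng.normal(0, 1, (n, weeks))
tae = expo.reshape(n, 3, 13).mean(axis=2)             # trimester-average exposures
y = 0.5 * tae[:, 0] + rng.normal(0, 1, n)             # true effect only in trimester 1

def ols(X, y):
    """Least-squares slope estimates (intercept discarded)."""
    Xd = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(Xd, y, rcond=None)[0][1:]

sep = [ols(tae[:, j], y)[0] for j in range(3)]        # three separate TAE models
joint = ols(tae, y)                                   # all TAEs in one model
# the trimester-3 average is seasonally anti-correlated with trimester 1, so the
# separate trimester-3 model finds a spurious association; the joint model
# largely removes it.
```

Here the seasonal cycle makes the first- and third-trimester averages strongly anti-correlated, which is exactly the mechanism the abstract identifies for the biased separate-model estimates.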
Interoperability through standardization: Electronic mail, and X Window systems
NASA Technical Reports Server (NTRS)
Amin, Ashok T.
1993-01-01
Since the introduction of computing machines, there have been continual advances in computer and communication technologies. The user interface has evolved from a row of switches, through character-based interfaces using teletype and then video terminals, to the present-day graphical user interface. The next significant advances are expected to come in the availability of services, such as electronic mail and directory services, as application standards are developed, and in 'easy to use' graphical user interfaces, such as Windows and the X Window System, which are being standardized. Various proprietary electronic mail (email) systems are in use within organizations at each NASA center. Each system provides email services to users within an organization; however, support for email services across organizations and across centers exists only to a varying degree and is often not easy to use. A recent NASA email initiative is intended 'to provide a simple way to send email across organizational boundaries without disruption of installed base.' The initiative calls for integration of existing organizational email systems through gateways connected by a message switch, supporting X.400 and SMTP protocols, to create a NASA-wide email system, and for implementation of NASA-wide email directory services based on the OSI standard X.500. A brief overview of MSFC efforts as a part of this initiative is given. Window-based graphical user interfaces make computers easy to use. The X Window protocol was developed at the Massachusetts Institute of Technology in 1984/1985 to provide a uniform window-based interface in a distributed computing environment with heterogeneous computers. It has since become a standard supported by a number of major manufacturers, and X Window systems, terminals, workstations, and applications are becoming available.
However, the impact of its use on network traffic in the Local Area Network environment is not well understood. The use of X Window systems is expected to increase at MSFC, especially for Unix-based systems. An overview of the X Window protocol is presented and its impact on network traffic is examined. It is proposed that an analytical model of X Window systems in the network environment be developed and validated through the use of measurements to generate application and user profiles.
2D and 3D Method of Characteristic Tools for Complex Nozzle Development
NASA Technical Reports Server (NTRS)
Rice, Tharen
2003-01-01
This report details the development of a 2D and 3D Method of Characteristic (MOC) tool for the design of complex nozzle geometries. These tools are GUI driven and can be run on most Windows-based platforms. The report provides a user's manual for these tools as well as explains the mathematical algorithms used in the MOC solutions.
Sood, Mehak; Besson, Pierre; Muthalib, Makii; Jindal, Utkarsh; Perrey, Stephane; Dutta, Anirban; Hayashibe, Mitsuhiro
2016-12-01
Transcranial direct current stimulation (tDCS) has been shown to perturb both cortical neural activity and hemodynamics during (online) and after the stimulation; however, the mechanisms of these tDCS-induced online and after-effects are not known. Here, online resting-state spontaneous brain activation may be relevant for monitoring tDCS neuromodulatory effects, and can be measured using electroencephalography (EEG) in conjunction with near-infrared spectroscopy (NIRS). We present a Kalman Filter based online parameter estimation of an autoregressive (ARX) model to track the transient coupling relation between changes in the EEG power spectrum and NIRS signals during anodal tDCS (2 mA, 10 min) using a 4×1 ring high-definition montage. Our online ARX parameter estimation technique, using the cross-correlation between log (base-10) transformed EEG band-power (0.5-11.25 Hz) and the NIRS oxy-hemoglobin signal in the low frequency (≤0.1 Hz) range, was shown in five healthy subjects to be sensitive enough to detect transient EEG-NIRS coupling changes in resting-state spontaneous brain activation during anodal tDCS. Conventional sliding window cross-correlation calculations suffer from a fundamental problem in computing the phase relationship: the signal in the window is considered time-invariant, and the choice of window length and step size is subjective. Here, the Kalman Filter based method allowed online ARX parameter estimation using time-varying signals that could capture transients in the coupling relationship between EEG and NIRS signals. Our new online ARX model based tracking method allows continuous assessment of the transient coupling between the electrophysiological (EEG) and hemodynamic (NIRS) signals representing resting-state spontaneous brain activation during anodal tDCS. Published by Elsevier B.V.
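The core idea of tracking time-varying ARX parameters with a Kalman filter can be sketched as follows. This is a minimal generic illustration, not the authors' EEG-NIRS pipeline: the ARX(1,1) model order, the noise covariances q and r, and the simulated system are all made-up assumptions.

```python
import numpy as np

def kalman_arx(y, u, q=1e-5, r=0.1):
    """Track time-varying ARX(1,1) parameters theta = [a, b] for the
    model y[t] = a*y[t-1] + b*u[t] + e[t], treating theta as a
    random-walk state estimated by a Kalman filter."""
    n = len(y)
    theta = np.zeros(2)            # parameter estimate [a, b]
    P = np.eye(2)                  # parameter error covariance
    Q = np.eye(2) * q              # random-walk (process) noise
    history = np.zeros((n, 2))
    for t in range(1, n):
        phi = np.array([y[t - 1], u[t]])   # regressor vector
        P = P + Q                          # prediction step
        S = phi @ P @ phi + r              # innovation variance
        K = P @ phi / S                    # Kalman gain
        theta = theta + K * (y[t] - phi @ theta)
        P = P - np.outer(K, phi) @ P       # covariance update
        history[t] = theta
    return history
```

With a synthetic system (a = 0.6, b = 1.5) the trailing estimates converge to the true parameters; unlike a fixed sliding window, the gain adapts continuously, so no window length or step size has to be chosen.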
A Comparison of Three Methods for Measuring Distortion in Optical Windows
NASA Technical Reports Server (NTRS)
Youngquist, Robert C.; Nurge, Mark A.; Skow, Miles
2015-01-01
It's important that imagery seen through large-area windows, such as those used on space vehicles, not be substantially distorted. Many approaches are described in the literature for measuring the distortion of an optical window, but most suffer from either poor resolution or processing difficulties. In this paper a new definition of distortion is presented, allowing accurate measurement using an optical interferometer. This new definition is shown to be equivalent to the definitions provided by the military and the standards organizations. In order to determine the advantages and disadvantages of this new approach, the distortion of an acrylic window is measured using three different methods: image comparison, moiré interferometry, and phase-shifting interferometry.
NASA Technical Reports Server (NTRS)
Thompson, Shelby; Litaker, Harry; Howard, Robert
2009-01-01
A natural component to driving any type of vehicle, be it Earth-based or space-based, is visibility. In its simplest form, visibility is a measure of the distance at which an object can be seen. For the National Aeronautics and Space Administration's (NASA) Space Shuttle and the International Space Station (ISS), there are human factors design guidelines for windows. However, for planetary exploration vehicles, especially land-based vehicles, relatively little has been written on the importance of windows. The goal of the current study was to devise a proper methodology and to obtain preliminary human-in-the-loop data on window placement and location for the small pressurized rover (SPR). Nine participants evaluated multiple areas along the vehicle's front "nose" while actively maneuvering through several lunar driving simulations. Subjective data were collected on seven different aspects measuring areas of necessity, frequency of views, and placement/configuration of windows using questionnaires and composite drawings. Results indicated a desire for a large horizontal field-of-view window spanning the front of the vehicle for most driving situations, with slightly reduced window areas for the lower front, lower corners, and side views.
NASA Technical Reports Server (NTRS)
Newell, J. D.; Keller, R. A.; Baily, N. A.
1974-01-01
A simple method for outlining or contouring any area defined by a change in film density or fluoroscopic screen intensity is described. The entire process, except for the positioning of an electronic window, is accomplished using a small computer having appropriate software. The electronic window is operator-positioned over the area to be processed. The only requirement is that the window be large enough to encompass the total area to be considered.
Launch window analysis of satellites in high eccentricity or large circular orbits
NASA Technical Reports Server (NTRS)
Renard, M. L.; Bhate, S. K.; Sridharan, R.
1973-01-01
Numerical methods and computer programs for studying the stability and evolution of orbits of large eccentricity are presented. Methods for determining launch windows and target dates are developed. Mathematical models are prepared to analyze the characteristics of specific missions.
A Simple Laser Microphone for Classroom Demonstration
ERIC Educational Resources Information Center
Moses, James M.; Trout, K. P.
2006-01-01
Communication through the modulation of electromagnetic radiation has become a foundational technique in modern technology. In this paper we discuss a modern day method of eavesdropping based upon the modulation of laser light reflected from a window pane. A simple and affordable classroom demonstration of a "laser microphone" is…
Cloud tolerance of remote sensing technologies to measure land surface temperature
USDA-ARS?s Scientific Manuscript database
Conventional means to estimate land surface temperature (LST) from space rely on the thermal infrared (TIR) spectral window and are limited to cloud-free scenes. To also provide LST estimates during periods with clouds, a new method was developed to estimate LST based on passive microwave (MW) obse...
Can we estimate total magnetization directions from aeromagnetic data using Helbig's integrals?
Phillips, J.D.
2005-01-01
An algorithm that implements Helbig's (1963) integrals for estimating the vector components (mx, my, mz) of the magnetic dipole moment from the first-order moments of the vector magnetic field components (ΔX, ΔY, ΔZ) is tested on real and synthetic data. After a grid of total-field aeromagnetic data is converted to vector component grids using Fourier filtering, Helbig's infinite integrals are evaluated as finite integrals in small moving windows using a quadrature algorithm based on the 2-D trapezoidal rule. Prior to integration, best-fit planar surfaces must be removed from the component data within the data windows in order to make the results independent of the coordinate system origin. Two different approaches are described for interpreting the results of the integration. In the "direct" method, results from pairs of different window sizes are compared to identify grid nodes where the angular difference between solutions is small. These solutions provide valid estimates of total magnetization directions for compact sources such as spheres or dipoles, but not for horizontally elongated or 2-D sources. In the "indirect" method, which is more forgiving of source geometry, the results of the quadrature analysis are scanned for solutions that are parallel to a specified total magnetization direction.
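The moving-window quadrature step (planar detrend followed by a finite 2-D trapezoidal integral) can be sketched as follows. This is a generic illustration, not the paper's code; the grid, window half-width, and unit spacing are arbitrary assumptions.

```python
import numpy as np

def detrend_plane(w):
    """Subtract the best-fit plane a*x + b*y + c from a 2-D window,
    making the window integral independent of the coordinate origin."""
    ny, nx = w.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    A = np.column_stack([xx.ravel(), yy.ravel(), np.ones(w.size)])
    coef, *_ = np.linalg.lstsq(A, w.ravel(), rcond=None)
    return w - (A @ coef).reshape(w.shape)

def trap2d(w, dx=1.0, dy=1.0):
    """2-D trapezoidal rule: half-weight edges, quarter-weight corners."""
    wx = np.ones(w.shape[1]); wx[[0, -1]] = 0.5
    wy = np.ones(w.shape[0]); wy[[0, -1]] = 0.5
    return dx * dy * (wy @ w @ wx)

def window_integral(grid, i, j, half, dx=1.0, dy=1.0):
    """Finite trapezoidal integral of a detrended square window
    centred on grid node (i, j)."""
    w = grid[i - half:i + half + 1, j - half:j + half + 1]
    return trap2d(detrend_plane(w), dx, dy)
```

Sanity check of the design: on a purely planar component grid the detrend removes everything, so the window integral is zero, which is exactly the origin-independence property the detrending step is meant to provide.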
NASA Astrophysics Data System (ADS)
Liu, Shenggang; Li, Jiabo; Li, Jun; Xue, Tao; Tao, Tianjiong; Ma, Heli; Wang, Xiang; Weng, Jidong; Li, Zeren
2018-04-01
A novel method based on signal superimposing has been presented to simultaneously measure the dynamic emissivity and the radiance of a shocked sample/window interface at near-infrared wavelengths. In this method, we use three rectangular laser pulses to illuminate the sample/window interface via an integrating sphere, expecting that the laser pulses reflected from the sample/window interface can be superimposed on its thermal radiation at the shocked steady state through precise time synchronization. In two validation trials, the second laser pulse reflected from the Al/LiF interface was successfully superimposed on its thermal radiation despite large flyer velocity uncertainty. The dynamic emissivity and the radiance at 1064 nm were obtained simultaneously from the superimposed signals. The obtained interface temperatures are 1842 ± 82 K and 1666 ± 154 K, respectively; the corresponding release pressures are 65.7 GPa and 62.6 GPa, and the deduced Hugoniot temperatures are consistent with the theoretical calculations. In comparison, the fitted temperatures from the gray body model are 300-500 K higher than our experimental measurements and the theoretical calculations.
Tabelow, Karsten; König, Reinhard; Polzehl, Jörg
2016-01-01
Estimation of learning curves is ubiquitously based on proportions of correct responses within moving trial windows. Thereby, it is tacitly assumed that learning performance is constant within the moving windows, which, however, is often not the case. In the present study we demonstrate that violations of this assumption lead to systematic errors in the analysis of learning curves, and we explore the dependency of these errors on window size, different statistical models, and learning phase. To reduce these errors in the analysis of single-subject data as well as on the population level, we propose adequate statistical methods for the estimation of learning curves and the construction of confidence intervals, trial by trial. Applied to data from an avoidance learning experiment with rodents, these methods revealed performance changes occurring at multiple time scales within and across training sessions which were otherwise obscured in the conventional analysis. Our work shows that proper assessment of the behavioral dynamics of learning at high temporal resolution can shed new light on specific learning processes and thus allows us to refine existing learning concepts. It further disambiguates the interpretation of neurophysiological signal changes recorded during training in relation to learning. PMID:27303809
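The conventional moving-window estimate that the abstract criticises can be sketched as follows, with a per-window Wilson score confidence interval added. This is a generic illustration of the baseline method, not the trial-by-trial method the paper proposes; the window length and z-value are arbitrary.

```python
import numpy as np

def moving_proportion(correct, window=20, z=1.96):
    """Proportion of correct responses in a centred moving trial window,
    with a Wilson score confidence interval computed per window."""
    correct = np.asarray(correct, dtype=float)
    n = len(correct)
    half = window // 2
    p_hat, lo, hi = [], [], []
    for t in range(n):
        w = correct[max(0, t - half):min(n, t + half + 1)]
        k, m = w.sum(), len(w)
        p = k / m
        denom = 1 + z ** 2 / m
        centre = (p + z ** 2 / (2 * m)) / denom
        margin = z * np.sqrt(p * (1 - p) / m + z ** 2 / (4 * m ** 2)) / denom
        p_hat.append(p)
        lo.append(centre - margin)
        hi.append(centre + margin)
    return np.array(p_hat), np.array(lo), np.array(hi)
```

Note the tacit assumption the paper highlights: each window's proportion treats performance as constant across its trials, so rapid within-session changes are smeared over the window length.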
Colour image compression by grey to colour conversion
NASA Astrophysics Data System (ADS)
Drew, Mark S.; Finlayson, Graham D.; Jindal, Abhilash
2011-03-01
Instead of de-correlating image luminance from chrominance, some use has been made of the correlation between the luminance component of an image and its chromatic components, or of the correlation between colour components, for colour image compression. In one approach, the Green colour channel was taken as a base, and the other colour channels or their DCT subbands were approximated as polynomial functions of the base inside image windows. This paper points out that we can do better if we introduce an addressing scheme into the image description such that similar colours are grouped together spatially. With a Luminance component base, we test several colour spaces and rearrangement schemes, including segmentation, and settle on a log-geometric-mean colour space. Along with PSNR versus bits-per-pixel, we found that spatially-keyed s-CIELAB colour error better identifies problem regions. Instead of segmentation, we found that rearranging on sorted chromatic components has almost equal performance and better compression. Here, we sort on each of the chromatic components and separately encode windows of each. The result consists of the original greyscale plane plus the polynomial coefficients of windows of rearranged chromatic values, which are then quantized. The simplicity of the method produces a fast and simple scheme for colour image and video compression, with excellent results.
Configuring Eclipse for GMAT Builds: Instructions for Windows Users, Rev. 0.3
NASA Technical Reports Server (NTRS)
Conway, Darrel J.
2007-01-01
This document provides instructions about how to configure the Eclipse IDE to build GMAT on Windows based PCs. The current instructions are preliminary; the Windows builds using Eclipse are currently a bit crude. These instructions are intended to give you enough information to get Eclipse setup to build wxWidgets based executables in general, and GMAT in particular.
Spectroscopy by joint spectral and time domain optical coherence tomography
NASA Astrophysics Data System (ADS)
Szkulmowski, Maciej; Tamborski, Szymon; Wojtkowski, Maciej
2015-03-01
We present a methodology for the spectroscopic examination of absorbing media that combines Spectral Optical Coherence Tomography and Fourier Transform Spectroscopy. The method is based on the joint Spectral and Time OCT computational scheme and simplifies the data analysis procedure compared to the commonly used windowing-based Spectroscopic OCT methods. The proposed experimental setup is self-calibrating in terms of wavelength-pixel assignment. The performance of the method in measuring absorption spectra was checked with the use of a reflecting phantom filled with an absorbing agent (indocyanine green). The results show quantitative agreement with the controlled exact results provided by the reference method.
NASA Astrophysics Data System (ADS)
Hansell, Richard Allen, Jr.
The radiative effects of dust aerosol on our climate system have yet to be fully understood and remain a topic of contemporary research. To investigate these effects, detection/retrieval methods for dust events over major dust outbreak and transport areas have been developed using satellite and ground-based approaches. To this end, both the shortwave and longwave surface radiative forcing of dust aerosol were investigated. The ground-based remote sensing approach uses the Atmospheric Emitted Radiance Interferometer brightness temperature spectra to detect mineral dust events and to retrieve their properties. Taking advantage of the high spectral resolution of the AERI instrument, absorptive differences in prescribed thermal IR window sub-band channels were exploited to differentiate dust from cirrus clouds. AERI data collected during the UAE2 at Al-Ain, UAE was employed for dust retrieval. Assuming a specified dust composition model a priori and using the light scattering programs of T-matrix and the finite difference time domain methods for oblate spheroids and hexagonal plates, respectively, dust optical depths have been retrieved and compared to those inferred from a collocated and coincident AERONET sun-photometer dataset. The retrieved optical depths were then used to determine the dust longwave surface forcing during the UAE2. Likewise, dust shortwave surface forcing is investigated employing a differential technique from previous field studies. The satellite-based approach uses MODIS thermal infrared brightness temperature window data for the simultaneous detection/separation of mineral dust and cirrus clouds. Based on the spectral variability of dust emissivity at the 3.75, 8.6, 11, and 12 μm wavelengths, the D*-parameter, BTD-slope and BTD3-11 tests are combined to identify dust and cirrus. MODIS data for the three dust-laden scenes have been analyzed to demonstrate the effectiveness of this detection/separation method.
Detected daytime dust and cloud coverage for the Persian Gulf case compare reasonably well to those from the "Deep Blue" algorithm developed at NASA-GSFC. The nighttime dust/cloud detection for the cases surrounding Cape Verde and Niger, West Africa has been validated by comparing to coincident and collocated ground-based micro-pulse lidar measurements.
Old practices, new windows: reflections on a communications skills innovation.
Cantillon, Peter
2017-03-01
Most of the great innovations in communication skills education, from Balint's concept of the 'doctor as drug' to the Calgary Cambridge conceptualisation of the consultation, were founded in general practice. It can be argued however, that there has been a hiatus in the development of new approaches to analysing the consultation since the mid-1990s. It is most welcome therefore that in this issue of the journal two papers are presented that describe and evaluate a novel approach to consultation analysis entitled 'the windows method'. Building on the more structured approaches that preceded it, the windows method offers some genuine innovations in terms of its emphasis on emotional knowledge and the manner in which it addresses many of the potential deficiencies in feedback practice associated with older methods. The new approach is very much in step with current thinking about emotional development and the establishment of appropriate environments for feedback. The windows method has the potential to breathe fresh life into old and well-established communication skills education practices.
Measuring Glial Metabolism in Repetitive Brain Trauma and Alzheimer’s Disease
2016-09-01
Six methods: singular value decomposition (SVD), wavelet, sliding window, sliding window with Gaussian weighting, spline and spectral improvements...comparison of a range of different denoising methods for dynamic MRS. Six denoising methods were considered: singular value decomposition (SVD), wavelet...project by improving the software required for the data analysis by developing six different denoising methods. He also assisted with the testing
Advances in Global Adjoint Tomography - Data Assimilation and Inversion Strategy
NASA Astrophysics Data System (ADS)
Ruan, Y.; Lei, W.; Lefebvre, M. P.; Modrak, R. T.; Smith, J. A.; Bozdag, E.; Tromp, J.
2016-12-01
Seismic tomography provides the most direct way to understand Earth's interior by imaging elastic heterogeneity, anisotropy and anelasticity. Resolving the fine structure of these properties requires accurate simulations of seismic wave propagation in complex 3-D Earth models. On the supercomputer "Titan" at Oak Ridge National Laboratory, we are employing a spectral-element method (Komatitsch & Tromp 1999, 2002) in combination with an adjoint method (Tromp et al., 2005) to accurately calculate theoretical seismograms and Frechet derivatives. Using 253 carefully selected events, Bozdag et al. (2016) iteratively determined a transversely isotropic earth model (GLAD_M15) using 15 preconditioned conjugate-gradient iterations. To obtain higher resolution images of the mantle, we have expanded our database to more than 4,220 Mw 5.0-7.0 events that occurred between 1995 and 2014. Instead of using the entire database all at once, we choose to draw subsets of about 1,000 events from our database for each iteration to achieve a faster convergence rate with limited computing resources. To provide good coverage of deep structures, we selected approximately 700 deep and intermediate-depth earthquakes and 300 shallow events to start a new iteration. We reinverted the CMT solutions of these events in the latest model and recalculated synthetic seismograms. Using the synthetics as reference seismograms, we selected time windows that show good agreement with data and made measurements within the windows. From the measurements we further assess the overall quality of each event and station, and exclude bad measurements based upon certain criteria. So far, with very conservative criteria, we have assimilated more than 8.0 million windows from 1,000 earthquakes in three period bands for the new iteration. For subsequent iterations, we will change the period bands and window selection criteria to include more windows. In the inversion, dense array data (e.g., USArray) usually dominate model updates.
In order to better handle this issue, we introduced weighting of stations and events based upon their relative distance and showed that the contribution from dense array is better balanced in the Frechet derivatives. We will present a summary of this form of data assimilation and preliminary results of the first few iterations.
Spuler, Martin
2015-08-01
A Brain-Computer Interface (BCI) allows a user to control a computer by brain activity alone, without the need for muscle control. In this paper, we present an EEG-based BCI system based on code-modulated visual evoked potentials (c-VEPs) that enables the user to work with arbitrary Windows applications. Other BCI systems, like the P300 speller or BCI-based browsers, allow control of one dedicated application designed for use with a BCI. In contrast, the system presented in this paper does not consist of one dedicated application, but enables the user to control mouse cursor and keyboard input at the level of the operating system, thereby making it possible to use arbitrary applications. As the c-VEP BCI method was shown to enable very fast communication speeds (writing more than 20 error-free characters per minute), the presented system is the next step in replacing the traditional mouse and keyboard and enabling complete brain-based control of a computer.
Pedersen, Mangor; Omidvarnia, Amir; Zalesky, Andrew; Jackson, Graeme D
2018-06-08
Correlation-based sliding window analysis (CSWA) is the most commonly used method to estimate time-resolved functional MRI (fMRI) connectivity. However, instantaneous phase synchrony analysis (IPSA) is gaining popularity mainly because it offers single time-point resolution of time-resolved fMRI connectivity. We aim to provide a systematic comparison between these two approaches, on both temporal and topological levels. For this purpose, we used resting-state fMRI data from two separate cohorts with different temporal resolutions (45 healthy subjects from Human Connectome Project fMRI data with repetition time of 0.72 s and 25 healthy subjects from a separate validation fMRI dataset with a repetition time of 3 s). For time-resolved functional connectivity analysis, we calculated tapered CSWA over a wide range of different window lengths that were temporally and topologically compared to IPSA. We found a strong association in connectivity dynamics between IPSA and CSWA when considering the absolute values of CSWA. The association between CSWA and IPSA was stronger for a window length of ∼20 s (shorter than filtered fMRI wavelength) than ∼100 s (longer than filtered fMRI wavelength), irrespective of the sampling rate of the underlying fMRI data. Narrow-band filtering of fMRI data (0.03-0.07 Hz) yielded a stronger relationship between IPSA and CSWA than wider-band (0.01-0.1 Hz). On a topological level, time-averaged IPSA and CSWA nodes were non-linearly correlated for both short (∼20 s) and long (∼100 s) windows, mainly because nodes with strong negative correlations (CSWA) displayed high phase synchrony (IPSA). IPSA and CSWA were anatomically similar in the default mode network, sensory cortex, insula and cerebellum. Our results suggest that IPSA and CSWA provide comparable characterizations of time-resolved fMRI connectivity for appropriately chosen window lengths. 
Although IPSA requires narrow-band fMRI filtering, we recommend the use of IPSA given that it does not mandate a (semi-)arbitrary choice of window length and window overlap. A code for calculating IPSA is provided. Copyright © 2018. Published by Elsevier Inc.
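The two estimators being compared can be sketched as follows. This is a minimal generic illustration, not the authors' pipeline: the untapered window, the window length, and the IPSA formula shown (1 − sin(|Δφ|/2), one common formulation) are assumptions, and the narrow-band filtering step is omitted.

```python
import numpy as np
from scipy.signal import hilbert

def sliding_corr(x, y, win):
    """Sliding-window Pearson correlation (untapered CSWA variant):
    one correlation value per window position."""
    out = np.full(len(x), np.nan)
    for t in range(win, len(x) + 1):
        out[t - 1] = np.corrcoef(x[t - win:t], y[t - win:t])[0, 1]
    return out

def phase_synchrony(x, y):
    """Instantaneous phase synchrony (IPSA): instantaneous phases from
    the Hilbert-transform analytic signal, mapped to [0, 1] by
    1 - sin(|phase difference| / 2); one value per time point."""
    p1 = np.angle(hilbert(x))
    p2 = np.angle(hilbert(y))
    return 1 - np.sin(np.abs(p1 - p2) / 2)
```

The contrast in temporal resolution is visible in the outputs: `phase_synchrony` returns a value at every sample, while `sliding_corr` is undefined for the first `win - 1` samples and smears each estimate over the window.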
A new Hessian-based approach for segmentation of CT porous media images
NASA Astrophysics Data System (ADS)
Timofey, Sizonenko; Marina, Karsanina; Dina, Gilyazetdinova; Kirill, Gerke
2017-04-01
Hessian matrix based methods are widely used in image analysis for feature detection, e.g., detection of blobs, corners and edges. The Hessian matrix of the image is the matrix of second-order derivatives around a selected voxel. The most significant features give the highest values of the Hessian transform, and the lowest values are located at smoother parts of the image. The majority of conventional segmentation techniques can segment out cracks, fractures and other inhomogeneities in soils and rocks only if the rest of the image is significantly "oversegmented". To avoid this disadvantage, we propose to enhance the greyscale values of voxels belonging to such specific inhomogeneities on X-ray microtomography scans. We have developed and implemented in code a two-step approach to this problem. During the first step we apply a filter that enhances the image and makes outstanding features more sharply defined. During the second step we apply Hessian filter based segmentation. The values of voxels on the image to be segmented are calculated in conjunction with the values of other voxels within a prescribed region. The contribution from each voxel within such a region is computed by weighting according to the local Hessian matrix value. We call this approach Hessian windowed segmentation. It has been tested on different porous media X-ray microtomography images, including soil, sandstones, carbonates and shales. We also compared this new method against other widely used methods such as kriging, Markov random field, converging active contours and region growing. We show that our approach is more accurate in regions containing special features such as small cracks, fractures, elongated inhomogeneities and other features with low contrast relative to the background solid phase. Moreover, Hessian windowed segmentation outperforms some of these methods in computational efficiency.
We further test our segmentation technique by computing permeability of segmented images and comparing them against laboratory based measurements. This work was partially supported by RFBR grant 15-34-20989 (X-ray tomography and image fusion) and RSF grant 14-17-00658 (image segmentation and pore-scale modelling).
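The per-voxel Hessian computation at the core of such filters can be sketched in 2-D as follows. This is a generic finite-difference illustration, not the authors' 3-D implementation: no smoothing scale, weighting window, or enhancement step is included.

```python
import numpy as np

def hessian_eigenvalues(img):
    """Per-pixel 2-D Hessian via repeated finite differences, returning
    both eigenvalues in closed form. Large-magnitude eigenvalues flag
    ridge/crack-like features; near-zero values flag smooth regions."""
    gy, gx = np.gradient(img.astype(float))   # first derivatives
    gyy, gyx = np.gradient(gy)                # second derivatives
    gxy, gxx = np.gradient(gx)
    # eigenvalues of [[gxx, gxy], [gxy, gyy]] via trace/determinant
    tr = gxx + gyy
    det = gxx * gyy - gxy * gxy
    disc = np.sqrt(np.maximum(tr ** 2 / 4 - det, 0.0))
    return tr / 2 + disc, tr / 2 - disc
```

For a bright one-pixel ridge the smaller eigenvalue is strongly negative along the ridge (negative curvature across it), which is the signature a crack/fracture enhancer thresholds on.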
Radiation damage of gallium arsenide production cells
NASA Technical Reports Server (NTRS)
Mardesich, N.; Garlick, G. F. J.
1987-01-01
High-efficiency gallium arsenide cells, made by the liquid-phase epitaxy (LPE) method, have been irradiated with 1-MeV electrons up to fluences of 10 to the 16th e/sq cm. Measurements have been made of cell spectral response and of dark and light-excited current-voltage characteristics, and analyzed using computer-based models to determine underlying parameters such as damage coefficients. It is possible to use the spectral response to sort out damage effects in the different cell component layers. Damage coefficients are similar to others reported in the literature for the emitter and buffer (base). However, there is also a damage effect in the window layer, and possibly at the window-emitter interface, similar to that found for proton-irradiated liquid-phase epitaxy-grown cells. Depletion layer recombination is found to be less than theoretically expected at high fluence.
The Study of Residential Areas Extraction Based on GF-3 Texture Image Segmentation
NASA Astrophysics Data System (ADS)
Shao, G.; Luo, H.; Tao, X.; Ling, Z.; Huang, Y.
2018-04-01
The study chooses standard stripe, dual-polarization SAR images from GF-3 as the basic data. Residential area extraction processes and methods based upon GF-3 image texture segmentation are compared and analyzed. GF-3 image processing includes radiometric calibration, complex data conversion, multi-look processing, and image filtering; a suitability analysis of different image filtering methods shows that the Kuan filter is efficient for extracting residential areas. We then calculated and analyzed texture feature vectors using the GLCM (Gray Level Co-occurrence Matrix); the texture parameters include the moving window size, step size, and angle, and the results show that a window size of 11*11, a step of 1, and an angle of 0° are effective and optimal for residential area extraction. With the FNEA (Fractal Net Evolution Approach), we segmented the GLCM texture images and extracted the residential areas by threshold setting. The residential area extraction result was verified and assessed with a confusion matrix: overall accuracy is 0.897 and kappa is 0.881. We also extracted the residential areas by SVM classification based on GF-3 images; its overall accuracy is 0.09 lower than that of the extraction method based on GF-3 texture image segmentation. We conclude that residential area extraction based on GF-3 SAR texture image multi-scale segmentation is simple and highly accurate. Although it is difficult to obtain multi-spectral remote sensing images in southern China, which is cloudy and rainy throughout the year, this paper has certain reference significance.
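The GLCM computation underlying such texture features can be sketched for a single offset as follows. This is a generic illustration, not the authors' processing chain; the offset, level count, and the contrast statistic are example choices.

```python
import numpy as np

def glcm(img, dr, dc, levels):
    """Symmetric, normalised gray-level co-occurrence matrix for a
    single pixel offset (dr, dc). img must hold integer gray levels
    in [0, levels)."""
    m = np.zeros((levels, levels))
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[img[r, c], img[r2, c2]] += 1
    m = m + m.T                    # count both pair orderings
    return m / m.sum()

def contrast(p):
    """GLCM contrast feature: sum of p(i, j) * (i - j)^2."""
    i, j = np.indices(p.shape)
    return float(np.sum(p * (i - j) ** 2))
```

A homogeneous patch yields zero contrast, while a checkerboard (every horizontal neighbour pair differing by one level) yields a contrast of 1; a moving-window version would evaluate this statistic at each window position, as in the 11*11 windows described above.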
A staggered-grid convolutional differentiator for elastic wave modelling
NASA Astrophysics Data System (ADS)
Sun, Weijia; Zhou, Binzhong; Fu, Li-Yun
2015-11-01
The computation of derivatives in governing partial differential equations is one of the most investigated subjects in the numerical simulation of physical wave propagation. An analytical staggered-grid convolutional differentiator (CD) for first-order velocity-stress elastic wave equations is derived in this paper by inverse Fourier transformation of the band-limited spectrum of a first derivative operator. A taper window function is used to truncate the infinite staggered-grid CD stencil. The truncated CD operator is almost as accurate as the analytical solution, and as efficient as the finite-difference (FD) method. The selection of window functions influences the accuracy of the CD operator in wave simulation. We search for the optimal Gaussian windows for different order CDs by minimizing the spectral error of the derivative, and compare the windows with the normal Hanning window function for tapering the CD operators. It is found that the optimal Gaussian window appears to be similar to the Hanning window function for tapering the same CD operator. We investigate the accuracy of the windowed CD operator and the staggered-grid FD method with different orders. Compared to the conventional staggered-grid FD method, a short staggered-grid CD operator achieves an accuracy equivalent to that of a long FD operator, with lower computational costs. For example, an 8th order staggered-grid CD operator can achieve the same accuracy as a 16th order staggered-grid FD algorithm but with half of the computational resources and time required. Numerical examples from a homogeneous model and a crustal waveguide model are used to illustrate the superiority of the CD operators over the conventional staggered-grid FD operators for the simulation of wave propagation.
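The construction of a tapered convolutional differentiator can be sketched in its simplest, centred (non-staggered) form. This is an illustration of the general idea only, not the paper's staggered-grid operator: the coefficients (-1)^(m+1)/m come from inverse-transforming the band-limited spectrum i*k on an integer grid, and the Gaussian taper width and stencil half-length are arbitrary.

```python
import numpy as np

def cd_derivative(f, h, half_len=8, beta=4.0):
    """First derivative of regularly sampled f (spacing h) using a
    Gaussian-tapered band-limited convolutional differentiator.
    c_m = (-1)^(m+1)/m are the exact infinite-stencil coefficients of
    the spectrum i*k; the Gaussian window truncates them to half_len
    points per side. Boundary points are left as zero."""
    m = np.arange(1, half_len + 1)
    c = (-1.0) ** (m + 1) / m                    # ideal differentiator
    c *= np.exp(-beta * (m / half_len) ** 2)     # Gaussian taper
    df = np.zeros_like(f, dtype=float)
    for j in range(half_len, len(f) - half_len):
        # antisymmetric stencil: sum_m c_m (f[j+m] - f[j-m]) / h
        df[j] = np.dot(c, f[j + m] - f[j - m]) / h
    return df
```

Without the taper the truncated ideal coefficients ring badly (Gibbs effect); with the Gaussian window an 8-point half-stencil recovers the derivative of a smooth signal to roughly percent-level accuracy, illustrating the short-operator advantage the abstract describes.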
Attenuating fearful memories: effect of cued extinction on intrusions.
Marks, Elizabeth H; Zoellner, Lori A
2014-12-01
Exposure-based therapies for posttraumatic stress disorder are thought to reduce intrusive memories through extinction processes. Methods that enhance extinction may translate to improved treatment. Rat research suggests retrieving a memory via a conditioned stimulus (CS) cue, and then modifying the retrieved memory within a specific reconsolidation window may enhance extinction. In humans, studies (e.g., Kindt & Soeter, 2013; Schiller et al., 2010) using basic learning paradigms show discrepant findings. Using a distressing film paradigm, participants (N = 148) completed fear acquisition and extinction. At extinction, they were randomized to 1 of 3 groups: CS cue within reconsolidation window, CS cue outside window, or non-CS cue within window. Intrusions were assessed 24 hr after extinction. Participants receiving the CS cue and completing extinction within the reconsolidation window had more intrusions (M = 2.40, SD = 2.54) than those cued outside (M = 1.65, SD = 1.70) or those receiving a non-CS cue (M = 1.24, SD = 1.26), F(2, 145) = 4.52, p = .01, d = 0.55. Consistent with the reconsolidation hypothesis, presenting a CS cue does appear to activate a specific period of time during which a memory can be updated. However, the CS cue caused increased, rather than decreased, frequency of intrusions. Understanding parameters of preextinction cueing may help us better understand reconsolidation as a potential memory updating mechanism.
Lun, Aaron T.L.; Smyth, Gordon K.
2016-01-01
Chromatin immunoprecipitation with massively parallel sequencing (ChIP-seq) is widely used to identify binding sites for a target protein in the genome. An important scientific application is to identify changes in protein binding between different treatment conditions, i.e. to detect differential binding. This can reveal potential mechanisms through which changes in binding may contribute to the treatment effect. The csaw package provides a framework for the de novo detection of differentially bound genomic regions. It uses a window-based strategy to summarize read counts across the genome. It exploits existing statistical software to test for significant differences in each window. Finally, it clusters windows into regions for output and controls the false discovery rate properly over all detected regions. The csaw package can handle arbitrarily complex experimental designs involving biological replicates. It can be applied to both transcription factor and histone mark datasets, and, more generally, to any type of sequencing data measuring genomic coverage. csaw performs favorably against existing methods for de novo differential binding analyses on both simulated and real data. csaw is implemented as an R software package and is freely available from the open-source Bioconductor project. PMID:26578583
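The window-based strategy (count reads per sliding window, then cluster retained windows into regions) can be caricatured in a few lines. This is a conceptual sketch only: csaw itself is an R/Bioconductor package with edgeR-based statistical testing and proper region-level FDR control, none of which is reproduced here; the `keep` mask below stands in for a per-window significance call:

```python
import numpy as np

def window_counts(read_pos, genome_len, width=100, spacing=50):
    # Summarize reads into sliding windows: count how many read
    # positions fall inside each window [start, start + width].
    starts = np.arange(0, genome_len - width + 1, spacing)
    pos = np.sort(np.asarray(read_pos))
    lo = np.searchsorted(pos, starts, side="left")
    hi = np.searchsorted(pos, starts + width, side="right")
    return starts, hi - lo

def merge_windows(starts, keep, width=100):
    # Cluster retained windows into regions: overlapping kept windows
    # are fused, so one result is reported per genomic region.
    regions, cur = [], None
    for s, k in zip(starts, keep):
        if not k:
            continue
        if cur is not None and s <= cur[1]:
            cur[1] = s + width
        else:
            if cur is not None:
                regions.append(tuple(cur))
            cur = [s, s + width]
    if cur is not None:
        regions.append(tuple(cur))
    return regions
```

For example, reads piled up near positions 1000 and 5000 yield two merged regions rather than many overlapping window hits.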
Code of Federal Regulations, 2010 CFR
2010-10-01
... certification pursuant to § 0.459 of this chapter. (b) Initial certification window. Following the effective... window for digital output protection technologies or recording methods. Within thirty (30) days after the... certification window, the Commission shall issue a public notice identifying the certifications received and...
Estimating Characteristics of a Maneuvering Reentry Vehicle Observed by Multiple Sensors
2010-03-01
instead of as one large data set. This method allowed the filter to respond to changing dynamics. Jackson and Farbman’s approach could be of...portion of the entire acceleration was due to drag. Lee and Liu adopted a more hybrid approach , combining a least squares and Kalman filters [9...grows again as the window approaches the end of the available data. Three values for minimum window size, window size, and maximum window size are
Integration in PACS of DICOM with TCP/IP, SQL, and X Windows
NASA Astrophysics Data System (ADS)
Reijns, Gerard L.
1994-05-01
The DICOM standard (Digital Imaging and Communications in Medicine) has been developed in order to obtain compatibility at the higher OSI levels. This paper describes the implementation of DICOM into our developed low cost PACS, which uses as much as possible standard software and standard protocols such as SQL, X Windows and TCP/IP. We adopted the requirement that all messages on the communication network have to be DICOM compatible. The translation between DICOM messages and SQL commands, which drive the relational data base, has been accommodated in our PACS supervisor. The translation takes only between 10 and 20 milliseconds. Images, that will be used the current day are stored in a distributed, parallel operating image base for reasons of performance. Extensive use has been made of X Windows to visualize images. A maximum of 12 images can simultaneously be displayed, of which one selected image can be manipulated (e.g., magnified, rotated, etc.), without affecting the other displayed images. The emphasis of the future work will be on performance measurements and modeling of our PACS and bringing the results of both methods in agreement with each other.
Remembering the Important Things: Semantic Importance in Stream Reasoning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Rui; Greaves, Mark T.; Smith, William P.
Reasoning and querying over data streams rely on the ability to deliver a sequence of stream snapshots to the processing algorithms. These snapshots are typically provided using windows as views into streams and associated window management strategies. Generally, the goal of any window management strategy is to preserve the most important data in the current window and preferentially evict the rest, so that the retained data can continue to be exploited. A simple timestamp-based strategy is first-in-first-out (FIFO), in which items are replaced in strict order of arrival. All timestamp-based strategies implicitly assume that a temporal ordering reliably reflects importance to the processing task at hand, and thus that window management using timestamps will maximize the ability of the processing algorithms to deliver accurate interpretations of the stream. In this work, we explore a general notion of semantic importance that can be used for window management for streams of RDF data using semantically-aware processing algorithms like deduction or semantic query. Semantic importance exploits the information carried in RDF and surrounding ontologies for ranking window data in terms of its likely contribution to the processing algorithms. We explore the general semantic categories of query contribution, provenance, and trustworthiness, as well as the contribution of domain-specific ontologies. We describe how these categories behave using several concrete examples. Finally, we consider how a stream window management strategy based on semantic importance could improve overall processing performance, especially as available window sizes decrease.
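The contrast with FIFO can be sketched as a fixed-capacity window that evicts by importance score instead of by arrival order. This is an illustration of the management strategy only; how the score is computed from RDF, provenance, and trust is the subject of the paper and is simply assumed here to be supplied by the caller:

```python
class SemanticWindow:
    """Fixed-capacity stream window that evicts the least *important*
    item instead of the oldest (FIFO). The importance score is supplied
    by the caller; in an RDF setting it could combine query contribution,
    provenance and trustworthiness, as the paper discusses."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []          # list of (score, item) pairs

    def push(self, item, score):
        self.items.append((score, item))
        if len(self.items) > self.capacity:
            # evict the entry with the minimum importance score
            self.items.remove(min(self.items, key=lambda t: t[0]))

    def contents(self):
        return [item for _, item in self.items]
```

Note that a newly arrived item can itself be the one evicted if it scores lowest, which a FIFO window can never do.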
A multi-center study benchmarks software tools for label-free proteome quantification
Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan
2016-01-01
The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation window setups. For consistent evaluation, we developed LFQbench, an R package to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
Power measurement system of ECRH on HL-2A
NASA Astrophysics Data System (ADS)
Wang, He; Lu, Zhihong; Kubo, Shin; Chen, Gangyu; Wang, Chao; Zhou, Jun; Huang, Mei; Rao, Jun
2015-03-01
Electron Cyclotron Resonance Heating (ECRH) is one of the main auxiliary heating systems for the HL-2A tokamak. The ECRH system equipped on HL-2A has a total output power of 5 MW and comprises six sets of 0.5 MW/1.0 s gyrotrons at a frequency of 68 GHz and two sets of 1 MW/3 s gyrotrons at a frequency of 140 GHz. Power is one of the important parameters of the ECRH system. This paper introduces the methods for measuring the power of the ECRH system on HL-2A, which include calorimetric techniques and directional couplers. The calorimetric technique is an established method that has been used successfully in ECRH commissioning and experiments; with it, the transmission efficiency of the ECRH system is obtained by measuring the microwave power absorbed in the Matching Optics Unit (MOU), the gyrotron output window, and the torus window of the EC system. Now, based on the theory of electromagnetic coupling through apertures, directional couplers are being designed as a new measurement approach.
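The calorimetric estimate itself is a one-line energy balance. The sketch below is a generic illustration (specific heat of water assumed; the actual HL-2A instrumentation and calibration are not described in the abstract):

```python
def absorbed_power(flow_rate_kg_s, delta_t_kelvin, c_p=4186.0):
    """Microwave power absorbed by cooling water, from the calorimetric
    balance P = m_dot * c_p * dT (c_p: specific heat of water, J/(kg*K))."""
    return flow_rate_kg_s * c_p * delta_t_kelvin

def transmission_efficiency(p_source_w, p_absorbed_in_line_w):
    # Fraction of the gyrotron output that survives the transmission line,
    # given the total power absorbed in the MOU, output window, torus
    # window, and other line components.
    return 1.0 - p_absorbed_in_line_w / p_source_w
```

For example, 0.5 kg/s of water warming by 10 K corresponds to roughly 21 kW absorbed.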
Combining point context and dynamic time warping for online gesture recognition
NASA Astrophysics Data System (ADS)
Mao, Xia; Li, Chen
2017-05-01
Previous gesture recognition methods usually focused on recognizing gestures after the entire gesture sequences were obtained. However, in many practical applications, a system has to identify gestures before they end in order to give instant feedback. We present an online gesture recognition approach that can realize early recognition of unfinished gestures with low latency. First, a curvature buffer-based point context (CBPC) descriptor is proposed to extract the shape feature of a gesture trajectory. The CBPC descriptor is a complete descriptor with a simple computation, and is therefore well suited to online scenarios. Then, we introduce an online windowed dynamic time warping algorithm to realize online matching between the ongoing gesture and the template gestures. In the algorithm, computational complexity is effectively decreased by adding a sliding window to the accumulative distance matrix. Lastly, experiments are conducted on the Australian sign language data set and the Kinect hand gesture (KHG) data set. Results show that the proposed method outperforms other state-of-the-art methods, especially when gesture information is incomplete.
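The idea of restricting the accumulative distance matrix can be illustrated with a Sakoe-Chiba-style band, which shares the motivation of the paper's sliding window: only cells near the diagonal are filled, bounding the work per sample. This sketch is not the paper's OWDTW algorithm; it assumes 1-D features and sequences of similar length:

```python
import numpy as np

def windowed_dtw(a, b, band=3):
    """DTW distance between two 1-D sequences, filling the accumulative
    distance matrix only inside a band of half-width `band` around the
    diagonal. Cells outside the band stay at infinity and are never
    visited by the warping path."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(max(1, i - band), min(m, i + band) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # standard DTW recursion over the three admissible moves
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

Identical sequences score 0, and a single mismatched sample contributes exactly its pointwise distance.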
Qualitative mechanism models and the rationalization of procedures
NASA Technical Reports Server (NTRS)
Farley, Arthur M.
1989-01-01
A qualitative, cluster-based approach to the representation of hydraulic systems is described and its potential for generating and explaining procedures is demonstrated. Many ideas are formalized and implemented as part of an interactive, computer-based system. The system allows for designing, displaying, and reasoning about hydraulic systems. The interactive system has an interface consisting of three windows: a design/control window, a cluster window, and a diagnosis/plan window. A qualitative mechanism model for the ORS (Orbital Refueling System) is presented to coordinate with ongoing research on this system being conducted at NASA Ames Research Center.
Towards component-based validation of GATE: aspects of the coincidence processor
Moraes, Eder R.; Poon, Jonathan K.; Balakrishnan, Karthikayan; Wang, Wenli; Badawi, Ramsey D.
2014-01-01
GATE is public domain software widely used for Monte Carlo simulation in emission tomography. Validations of GATE have primarily been performed on a whole-system basis, leaving the possibility that errors in one sub-system may be offset by errors in others. We assess the accuracy of the GATE PET coincidence generation sub-system in isolation, focusing on the options most closely modeling the majority of commercially available scanners. Independent coincidence generators were coded by teams at Toshiba Medical Research Unit (TMRU) and UC Davis. A model similar to the Siemens mCT scanner was created in GATE. Annihilation photons interacting with the detectors were recorded. Coincidences were generated using GATE, TMRU and UC Davis code and results compared to “ground truth” obtained from the history of the photon interactions. GATE was tested twice, once with every qualified single event opening a time window and initiating a coincidence check (the “multiple window method”), and once where a time window is opened and a coincidence check initiated only by the first single event to occur after the end of the prior time window (the “single window method”). True, scattered and random coincidences were compared. Noise equivalent count rates were also computed and compared. The TMRU and UC Davis coincidence generators agree well with ground truth. With GATE, reasonable accuracy can be obtained if the single window method option is chosen and random coincidences are estimated without use of the delayed coincidence option. However in this GATE version, other parameter combinations can result in significant errors. PMID:25240897
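The difference between the two coincidence policies is easy to state in code. The sketch below implements only the single window method for a list of single-event timestamps, with a simplified acceptance rule (exactly two singles per window); real scanner models add energy windows, multiple-coincidence policies, and delayed windows for randoms estimation:

```python
def single_window_coincidences(times, tau):
    """'Single window method': a coincidence window is opened only by the
    first single to arrive after the previous window closes; every single
    within tau of the opener falls into that window. Windows containing
    exactly two singles are accepted as coincidence pairs; windows with
    one single (no partner) or three or more (ambiguous multiple) are
    discarded. The acceptance rule is a simplification of real scanners."""
    times = sorted(times)
    pairs, i = [], 0
    while i < len(times):
        j = i + 1
        while j < len(times) and times[j] - times[i] <= tau:
            j += 1
        if j - i == 2:
            pairs.append((times[i], times[i + 1]))
        i = j          # the next window opens after this one closes
    return pairs
```

Under the multiple window method, by contrast, every single (including the second and third members of a burst) would open its own window, which is where the two policies diverge.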
Windows of sensitivity to toxic chemicals in the motor effects development.
Ingber, Susan Z; Pohl, Hana R
2016-02-01
Many chemicals currently used are known to elicit nervous system effects. In addition, approximately 2000 new chemicals introduced annually have not yet undergone neurotoxicity testing. This review concentrated on motor development effects associated with exposure to environmental neurotoxicants to help identify critical windows of exposure and begin to assess data needs based on a subset of chemicals thoroughly reviewed by the Agency for Toxic Substances and Disease Registry (ATSDR) in Toxicological Profiles and Addenda. Multiple windows of sensitivity were identified that differed based on the maturity level of the neurological system at the time of exposure, as well as dose and exposure duration. Similar but distinct windows were found for both motor activity (GD 8-17 [rats], GD 12-14 and PND 3-10 [mice]) and motor function performance (insufficient data for rats, GD 12-17 [mice]). Identifying specific windows of sensitivity in animal studies was hampered by study designs oriented towards detection of neurotoxicity that occurred at any time throughout the developmental process. In conclusion, while this investigation identified some critical exposure windows for motor development effects, it demonstrates a need for more acute duration exposure studies based on neurodevelopmental windows, particularly during the exposure periods identified in this review. Published by Elsevier Inc.
An image mosaic method based on corner
NASA Astrophysics Data System (ADS)
Jiang, Zetao; Nie, Heting
2015-08-01
In view of the shortcomings of traditional image mosaic methods, this paper describes a new image mosaic algorithm based on the Harris corner. First, the Harris operator, combined with a low-pass smoothing filter constructed from spline functions and a circular window search, is applied to detect image corners; this gives better localization performance and effectively avoids corner clustering. Second, correlation-based feature registration is used to find matching pairs, and false matches are removed with random sample consensus (RANSAC). Finally, a weighted trigonometric function combined with an interpolation function is used for image fusion. Experiments show that this method can effectively remove splicing ghosting and improve the accuracy of the image mosaic.
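The fusion step is where ghosting at the seam is suppressed. A minimal sketch of trigonometric (raised-cosine) weighted blending for two horizontally adjacent strips; the paper's exact weighting and interpolation functions are not specified, so this is an illustrative stand-in:

```python
import numpy as np

def feather_blend(left, right, overlap):
    """Blend two horizontally adjacent strips whose last/first `overlap`
    columns cover the same scene, using a raised-cosine (trigonometric)
    weight so the left image fades out smoothly across the overlap and
    the visible seam is removed."""
    t = np.linspace(0.0, 1.0, overlap)
    w = 0.5 * (1.0 + np.cos(np.pi * t))   # weight of the left image: 1 -> 0
    blended = left[:, -overlap:] * w + right[:, :overlap] * (1.0 - w)
    return np.hstack([left[:, :-overlap], blended, right[:, overlap:]])
```

In a real mosaic the strips would first be registered by the corner-matching stage; here they are assumed already aligned.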
Levels of polychlorinated biphenyls (PCBs) in caulk and window glazing material samples from older buildings were determined, using a method developed for this purpose. This method was evaluated by analyzing a combination of 47 samples of caulk, glazing materials, including quali...
NASA Technical Reports Server (NTRS)
Tilton, James C. (Inventor)
2010-01-01
A method, computer-readable storage, and apparatus for implementing recursive segmentation of data with spatial characteristics into regions, including splitting and remerging of pixels with contiguous region designations and a user-controlled parameter that provides a preference for merging adjacent regions to eliminate window artifacts.
Integrated self-cleaning window assembly for optical transmission in combustion environments
Kass, Michael D [Oak Ridge, TN
2007-07-24
An integrated window design for optical transmission in combustion environments is described. The invention consists of an integrated optical window design that prevents and removes the accumulation of carbon-based particulate matter and gaseous hydrocarbons through a combination of heat and catalysis. These windows will enable established optical technologies to be applied to combustion environments and their exhaust systems.
A simple gold-coated microstructure fiber polarization filter in two communication windows
NASA Astrophysics Data System (ADS)
Feng, Xinxing; Li, Shuguang; Du, Huijing; Zhang, Yinan; Liu, Qiang
2018-03-01
A polarization filter is designed at two communication windows of 1310 and 1550 nm based on a microstructured optical fiber. The model has four large-diameter air holes and two gold-coated air holes. The influence of the geometrical parameters of the photonic crystal fiber on the performance of the polarization filter is analyzed by the finite element method. The numerical simulation shows that when the fiber length is 300 μm, the corresponding extinction ratios are 209.7 dB and 179.8 dB, and the bandwidths over which the extinction ratio (ER) is better than 20 dB are 150 nm and 350 nm, at the communication wavelengths of 1310 nm and 1550 nm, respectively.
BE-PLUS: a new base editing tool with broadened editing window and enhanced fidelity.
Jiang, Wen; Feng, Songjie; Huang, Shisheng; Yu, Wenxia; Li, Guanglei; Yang, Guang; Liu, Yajing; Zhang, Yu; Zhang, Lei; Hou, Yu; Chen, Jia; Chen, Jieping; Huang, Xingxu
2018-06-06
Base editor (BE), containing a cytidine deaminase and a catalytically defective Cas9, has been widely used to perform base editing. However, the narrow editing window of BE limits its utility. Here, we developed a new editing technology named base editor for programming larger C to U (T) scope (BE-PLUS) by fusing 10 copies of the GCN4 peptide to nCas9(D10A) to recruit scFv-APOBEC-UGI-GB1 to the target sites. The new system achieves base editing with a broadened window, resulting in an increased genome-targeting scope. Interestingly, the new system yielded far fewer unwanted indels and non-C-to-T conversions. We also demonstrated its potential use in gene disruption across the whole genome through induction of stop codons (iSTOP). Taken together, the BE-PLUS system offers a new editing tool with a broadened editing window and enhanced fidelity.
Plontke, Stefan K; Mynatt, Robert; Gill, Ruth M; Borgmann, Stefan; Salt, Alec N
2007-07-01
The distribution of gentamicin along the fluid spaces of the cochlea after local applications has never previously been demonstrated. Computer simulations have predicted that significant basal-apical concentration gradients might be expected, and histologic studies indicate that hair cell damage is greater at the base than at the apex after local gentamicin application. In the present study, gradients of gentamicin along the cochlea were measured. A recently developed method of sampling perilymph from the cochlear apex of guinea pigs was used in which the samples represent fluid originating from different regions along the scala tympani. Gentamicin concentration was determined in sequential apical samples that were taken after up to 3 hours of local application to the round window niche. Substantial gradients of gentamicin along the length of the scala tympani were demonstrated and quantified, averaging more than 4,000 times greater concentration at the base compared with the apex at the time of sampling. Peak concentrations and gradients for gentamicin varied considerably between animals, likely resulting from variations in round window membrane permeability and rates of perilymph flow. The large gradients for gentamicin demonstrated here in guinea pigs account for how it is possible to suppress vestibular function in some patients with a local application of gentamicin without damaging auditory function. Variations in round window membrane permeability and in perilymph flow could account for why hearing losses are observed in some patients.
Image dehazing based on non-local saturation
NASA Astrophysics Data System (ADS)
Wang, Linlin; Zhang, Qian; Yang, Deyun; Hou, Yingkun; He, Xiaoting
2018-04-01
In this paper, a method based on a non-local saturation algorithm is proposed to avoid the block and halo effects of single-image dehazing with the dark channel prior. First, the original image is converted from RGB to HSV color space following the non-local idea, and the image saturation is weighted equally within a fixed window whose size is chosen according to the image resolution. Second, the saturation is used to estimate the atmospheric light value and the transmission rate. Then, through the function of saturation and transmission, the haze-free image is obtained from the atmospheric scattering model. Compared with existing methods, the proposed method restores image color and enhances contrast; it is validated with both quantitative and qualitative evaluations. Experiments show better visual effects with high efficiency.
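A compact sketch of the dark-channel-prior machinery the paper builds on: the per-window minimum ("dark channel") and the transmission estimate t = 1 − ω·dark(I/A). The saturation-based estimation of A and t described in the abstract is not reproduced here; the window size and ω are the usual illustrative defaults:

```python
import numpy as np

def dark_channel(img, win=3):
    """Dark channel prior: per-pixel minimum over the RGB channels,
    followed by a minimum filter over a fixed win x win window."""
    mins = img.min(axis=2)
    pad = win // 2
    padded = np.pad(mins, pad, mode="edge")
    out = np.empty_like(mins)
    h, w = mins.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + win, j:j + win].min()
    return out

def transmission(img, atmos, omega=0.95, win=3):
    # t(x) = 1 - omega * dark_channel(I / A): the thicker the haze,
    # the larger the dark channel of the normalized image.
    return 1.0 - omega * dark_channel(img / atmos, win)
```

The fixed-window minimum filter here is exactly the step that produces the block and halo artifacts the paper's non-local saturation approach is designed to avoid.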
Attenuating Stereo Pixel-Locking via Affine Window Adaptation
NASA Technical Reports Server (NTRS)
Stein, Andrew N.; Huertas, Andres; Matthies, Larry H.
2006-01-01
For real-time stereo vision systems, the standard method for estimating sub-pixel stereo disparity given an initial integer disparity map involves fitting parabolas to a matching cost function aggregated over rectangular windows. This results in a phenomenon known as 'pixel-locking,' which produces artificially-peaked histograms of sub-pixel disparity. These peaks correspond to the introduction of erroneous ripples or waves in the 3D reconstruction of truly flat surfaces. Since stereo vision is a common input modality for autonomous vehicles, these inaccuracies can pose a problem for safe, reliable navigation. This paper proposes a new method for sub-pixel stereo disparity estimation, based on ideas from Lucas-Kanade tracking and optical flow, which substantially reduces the pixel-locking effect. In addition, it has the ability to correct much larger initial disparity errors than previous approaches and is more general, as it applies not only to the ground plane.
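The parabola fit that produces pixel-locking is tiny: given matching costs at integer disparities d−1, d, d+1, the sub-pixel offset of the parabola's minimum is (c₋ − c₊) / (2(c₋ − 2c₀ + c₊)). A sketch of that baseline (this is the standard estimator the paper improves on, not the proposed affine-window method):

```python
def subpixel_parabola(c_minus, c0, c_plus):
    """Sub-pixel disparity offset from a parabola fitted to the matching
    cost at integer disparities d-1, d, d+1. The systematic bias of this
    estimator toward integer disparities is the 'pixel-locking' effect."""
    denom = c_minus - 2.0 * c0 + c_plus
    if denom == 0.0:
        return 0.0          # degenerate (flat) cost: no refinement
    return 0.5 * (c_minus - c_plus) / denom
```

When the cost really is a parabola the estimate is exact; pixel-locking arises because real aggregated costs are not parabolic.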
Kim, Joowhan; Min, Sung-Wook; Lee, Byoungho
2007-10-01
Integral floating display is a recently proposed three-dimensional (3D) display method which provides a dynamic 3D image in the vicinity of an observer. It has a viewing window, and only through this window can correct 3D images be observed. However, the positional difference between the viewing window and the floating image limits the viewing zone of the integral floating system. In this paper, we provide the principle and experimental results of adjusting the location of the viewing window of the integral floating display system by modifying the elemental image region for integral imaging. We explain the characteristics of the viewing window and propose how to move it to maximize the viewing zone.
Design of laser monitoring and sound localization system
NASA Astrophysics Data System (ADS)
Liu, Yu-long; Xu, Xi-ping; Dai, Yu-ming; Qiao, Yang
2013-08-01
In this paper, a novel design of a laser monitoring and sound localization system is proposed. It utilizes laser light to monitor and locate the position of indoor conversation. At present, most laser monitors in China use a photodiode or phototransistor as the detector. At the laser receivers of such facilities, the light beams are adjusted so that only part of the detector window receives the beam. The reflection deviates from its original path because of the vibration of the monitored window, which shifts the imaging spot on the photodiode or phototransistor. However, such a method is limited, not only because it brings much stray light into the receiver, but also because only a single photocurrent output can be obtained. Therefore, a new method based on a quadrant detector is proposed. It utilizes the relation of the optical integrals among the quadrants to locate the position of the imaging spot. This method can eliminate background disturbance and acquire two-dimensional spot vibration data. The principle of the whole system is as follows: collimated laser beams are reflected from a window vibrating under the influence of the sound source, so the reflected beams are modulated by the vibration source. These optical signals are collected by quadrant detectors and then processed by photoelectric converters and corresponding circuits, and the speech signal is eventually reconstructed. In addition, sound source localization is implemented by detecting three different reflected beams simultaneously. An indoor mathematical model based on the principle of Time Difference Of Arrival (TDOA) is established to calculate the two-dimensional coordinates of the sound source. Experiments showed that this system is able to monitor an indoor sound source beyond 15 meters with high-quality speech reconstruction and to locate the sound source position accurately.
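The TDOA localization step can be sketched as a brute-force search: pick the grid point whose predicted arrival-time differences (relative to sensor 0) best match the measurements. The sensor positions, the search grid, and the use of four sensors for an over-determined fit are all illustrative assumptions; the paper's own solver and geometry are not given in the abstract:

```python
import numpy as np

def locate_tdoa(sensors, tdoas, c=343.0, extent=12.0, step=0.1):
    """Grid search for the 2-D source position whose predicted
    arrival-time differences (relative to sensor 0) best match the
    measured TDOAs. Coarse but dependable; closed-form TDOA solvers
    exist but are beyond this sketch."""
    s = np.asarray(sensors, dtype=float)
    meas = np.asarray(tdoas, dtype=float)
    xs = np.arange(0.0, extent, step)
    best, best_err = None, np.inf
    for x in xs:
        for y in xs:
            d = np.linalg.norm(s - np.array([x, y]), axis=1)
            # predicted time differences vs sensor 0, at sound speed c
            err = np.sum(((d[1:] - d[0]) / c - meas) ** 2)
            if err < best_err:
                best, best_err = (x, y), err
    return best
```

With noiseless synthetic TDOAs the search recovers the source to within the grid step.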
The convolutional differentiator method for numerical modelling of acoustic and elastic wavefields
NASA Astrophysics Data System (ADS)
Zhang, Zhong-Jie; Teng, Ji-Wen; Yang, Ding-Hui
1996-02-01
Based on forward and inverse Fourier transform techniques, the authors discuss the design of an ordinary differentiator and apply it to the simulation of acoustic and elastic wavefields in isotropic media. To effectively suppress the Gibbs effect caused by truncation, a Hanning window is introduced. Model computations show that the convolutional differentiator method has the advantages of speed, low memory requirements, and high precision, making it a promising method for numerical simulation.
Measure Guideline. Wood Window Repair, Rehabilitation, and Replacement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, P.; Eng, P.
2012-12-01
This measure guideline provides information and guidance on rehabilitating, retrofitting, and replacing existing window assemblies in residential construction. The intent is to provide information regarding means and methods to improve the energy and comfort performance of existing wood window assemblies in a way that takes into consideration component durability, in-service operation, and long term performance of the strategies.
Measure Guideline: Window Repair, Rehabilitation, and Replacement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, P.
2012-12-01
This measure guideline provides information and guidance on rehabilitating, retrofitting, and replacing existing window assemblies in residential construction. The intent is to provide information regarding means and methods to improve the energy and comfort performance of existing wood window assemblies in a way that takes into consideration component durability, in-service operation, and long term performance of the strategies.
4. NORTHWEST FRONT, WITH FOUR BULLET GLASS WINDOWS. Edwards ...
4. NORTHWEST FRONT, WITH FOUR BULLET GLASS WINDOWS. - Edwards Air Force Base, South Base Sled Track, Observation Block House, Station "O" area, east end of Sled Track, Lancaster, Los Angeles County, CA
A frequency-based window width optimized two-dimensional S-Transform profilometry
NASA Astrophysics Data System (ADS)
Zhong, Min; Chen, Feng; Xiao, Chao
2017-11-01
A new scheme is proposed as a frequency-based window-width-optimized two-dimensional S-transform profilometry, in which parameters pu and pv are introduced to control the width of a two-dimensional Gaussian window. Unlike the standard two-dimensional S-transform, whose Gaussian window width is proportional to the reciprocal of the local frequency of the tested signal, the window width of the optimized two-dimensional S-transform varies with the pu-th (pv-th) power of the reciprocal local frequency fx (fy) in the x (y) direction. The paper gives a detailed theoretical analysis of the optimized two-dimensional S-transform in fringe analysis as well as the characteristics of the modified Gaussian window. Simulations are used to evaluate the proposed scheme; the results show that the new scheme has better noise reduction ability and can extract the phase distribution more precisely than the standard two-dimensional S-transform, even when the surface of the measured object varies sharply. Finally, the proposed scheme is demonstrated on three-dimensional surface reconstruction of a complex plastic cat mask to show its effectiveness.
Fabrication of glass gas cells for the HALOE and MAPS satellite experiments
NASA Technical Reports Server (NTRS)
Sullivan, E. M.; Walthall, H. G.
1984-01-01
The Halogen Occultation Experiment (HALOE) and the Measurement of Air Pollution from Satellites (MAPS) experiment are satellite-borne experiments which measure trace constituents in the Earth's atmosphere. The instruments which obtain the data for these experiments are based on the gas filter correlation radiometer measurement technique. In this technique, small samples of the gases of interest are encapsulated in glass cylinders, called gas cells, which act as very selective optical filters. This report describes the techniques employed in the fabrication of the gas cells for the HALOE and MAPS instruments. Details of the method used to fuse the sapphire windows (required for IR transmission) to the glass cell bodies are presented along with detailed descriptions of the jigs and fixtures used during the assembly process. The techniques and equipment used for window inspection and for pairing the HALOE windows are discussed. Cell body materials and the steps involved in preparing the cell bodies for the glass-to-sapphire fusion process are given.
Vibration measurement by temporal Fourier analyses of a digital hologram sequence.
Fu, Yu; Pedrini, Giancarlo; Osten, Wolfgang
2007-08-10
A method for whole-field noncontact measurement of displacement, velocity, and acceleration of a vibrating object based on image-plane digital holography is presented. A series of digital holograms of a vibrating object are captured by use of a high-speed CCD camera. The result of the reconstruction is a three-dimensional complex-valued matrix with noise. We apply Fourier analysis and windowed Fourier analysis in both the spatial and the temporal domains to extract the displacement, the velocity, and the acceleration. The instantaneous displacement is obtained by temporal unwrapping of the filtered phase map, whereas the velocity and acceleration are evaluated by Fourier analysis and by windowed Fourier analysis along the time axis. The combination of digital holography and temporal Fourier analyses allows evaluation of the vibration without phase ambiguity problems, and smooth spatial distributions of the instantaneous displacement, velocity, and acceleration are obtained at each instant. A comparison of Fourier analysis and windowed Fourier analysis in velocity and acceleration measurements is also presented.
The FEM Simulation on End Mill of Plastic Doors and Windows Corner Cleaning Based on Deform-3D
NASA Astrophysics Data System (ADS)
Li, Guoping; Huang, Zhenyong; Wang, Xiaohui
2017-12-01
In the plastic doors and windows corner cleaning process, the rotating speed, the feed rate and the milling cutter diameter are the main factors that affect the efficiency and quality of corner cleaning. In this paper, SolidWorks is used to establish a 3D model of the end mill, and Deform-3D is used to study the end-mill milling process. An orthogonal experimental design is applied to analyze the effect of rotating speed, feed rate and milling cutter diameter on the variation of the axial force, yielding the overall trend of the axial force and guiding the selection of the process parameters. Finally, a simulated milling experiment is used to obtain actual axial force data and verify the reliability of the FEM simulation model. The conclusions obtained in this paper have theoretical value for improving the efficiency and quality of plastic doors and windows corner cleaning.
Bluff-body drag reduction using a deflector
NASA Astrophysics Data System (ADS)
Fourrié, Grégoire; Keirsbulck, Laurent; Labraga, Larbi; Gilliéron, Patrick
2011-02-01
A passive flow control on a generic car model was experimentally studied. This control consists of a deflector placed on the upper edge of the model rear window. The study was carried out in a wind tunnel at Reynolds numbers based on the model height of 3.1 × 10^5 and 7.7 × 10^5. The flow was investigated via standard and stereoscopic particle image velocimetry, Kiel pressure probes and surface flow visualization. The aerodynamic drag was measured using an external balance and calculated using a wake survey method. Drag reductions up to 9% were obtained depending on the deflector angle. The deflector increases the separated region on the rear window. The results show that when this separated region is wide enough, it disrupts the development of the counter-rotating longitudinal vortices appearing on the lateral edges of the rear window. The current study suggests that flow control on such geometries should consider all the flow structures that contribute to the model wake flow.
Fast Human Detection for Intelligent Monitoring Using Surveillance Visible Sensors
Ko, Byoung Chul; Jeong, Mira; Nam, JaeYeal
2014-01-01
Human detection using visible surveillance sensors is an important and challenging task for intruder detection and safety management. The biggest barrier to real-time human detection is the computational time required for dense image scaling and for scanning the windows extracted from an entire image. This paper proposes fast human detection by selecting optimal levels of image scale using each level's adaptive region-of-interest (ROI). To estimate the image-scaling level, we generate a Hough windows map (HWM) and select a few optimal image scales based on the strength of the HWM and a divide-and-conquer algorithm. Furthermore, adaptive ROIs are arranged per image scale to provide a different search area. We employ a cascade random forests classifier to separate candidate windows into human and nonhuman classes. The proposed algorithm has been successfully applied to real-world surveillance video sequences, and its detection accuracy and computational speed outperform those of other related methods. PMID:25393782
Wojtas-Niziurski, Wojciech; Meng, Yilin; Roux, Benoit; Bernèche, Simon
2013-01-01
The potential of mean force describing conformational changes of biomolecules is a central quantity that determines the function of biomolecular systems. Calculating an energy landscape of a process that depends on three or more reaction coordinates may require so much computational power that some multidimensional calculations become practically impossible. Here, we present an efficient automated umbrella sampling strategy for calculating multidimensional potentials of mean force. The method progressively learns by itself, through a feedback mechanism, which regions of a multidimensional space are worth exploring, and automatically generates a set of umbrella sampling windows adapted to the system. The self-learning adaptive umbrella sampling method is first explained with illustrative examples based on simplified reduced model systems, and then applied to two non-trivial situations: the conformational equilibrium of the pentapeptide Met-enkephalin in solution and ion permeation in the KcsA potassium channel. With this method, it is demonstrated that a significantly smaller number of umbrella windows needs to be employed to characterize the free energy landscape over the most relevant regions without any loss in accuracy. PMID:23814508
The research on the mean shift algorithm for target tracking
NASA Astrophysics Data System (ADS)
CAO, Honghong
2017-06-01
The traditional mean shift algorithm for target tracking is effective and offers good real-time performance, but it still has shortcomings: it easily falls into a local optimum during tracking, it performs poorly when the object moves fast, and the size of the tracking window never changes, so the method fails when the size of the moving object changes. We therefore propose a new method: particle swarm optimization is used to optimize the mean shift algorithm for target tracking, while SIFT (scale-invariant feature transform) and an affine transformation make the size of the tracking window adaptive. Finally, we evaluate the method through comparative experiments. The experimental results indicate that the proposed method can effectively track the object and adapt the size of the tracking window.
Evaluation of Bias Correction Method for Satellite-Based Rainfall Data
Bhatti, Haris Akram; Rientjes, Tom; Haile, Alemseged Tamiru; Habib, Emad; Verhoef, Wouter
2016-01-01
With the advances in remote sensing technology, satellite-based rainfall estimates are gaining attraction in the field of hydrology, particularly in rainfall-runoff modeling. Since the estimates are affected by errors, correction is required. In this study, we tested the high-resolution National Oceanic and Atmospheric Administration's (NOAA) Climate Prediction Centre (CPC) morphing technique (CMORPH) satellite rainfall product in the Gilgel Abbey catchment, Ethiopia. CMORPH data at 8 km-30 min resolution are aggregated to daily to match in-situ observations for the period 2003–2010. The study objectives are to assess the bias of the satellite estimates, to identify the optimum window size for application of bias correction and to test the effectiveness of the bias correction. Bias correction factors are calculated for moving window (MW) sizes and for sequential windows (SW's) of 3, 5, 7, 9, …, 31 days with the aim of assessing the error distribution between the in-situ observations and CMORPH estimates. We tested forward, central and backward window (FW, CW and BW) schemes to assess the effect of time integration on accumulated rainfall. The accuracy of cumulative rainfall depth is assessed by the Root Mean Squared Error (RMSE). To systematically correct all CMORPH estimates, station-based bias factors are spatially interpolated to yield a bias factor map. The reliability of the interpolation is assessed by cross validation. The uncorrected CMORPH rainfall images are multiplied by the interpolated bias map to obtain bias-corrected CMORPH estimates. Findings are evaluated by RMSE, correlation coefficient (r) and standard deviation (SD). The results showed the existence of bias in the CMORPH rainfall, and the 7-day SW approach was found to perform best for bias correction. The outcome of this study shows the efficiency of our bias correction approach. PMID:27314363
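A sequential-window multiplicative bias factor of the kind described can be sketched as below: each window's factor is the ratio of gauge to satellite rainfall totals, applied back to the daily satellite values. This is an illustrative simplification (one station, no spatial interpolation); the 30% underestimate and all numbers are invented.

```python
import numpy as np

def sw_bias_factors(gauge, sat, width=7):
    """Multiplicative bias factors over non-overlapping sequential windows:
    factor = sum(gauge) / sum(satellite) per window of `width` days."""
    n = len(gauge) // width * width
    g = gauge[:n].reshape(-1, width).sum(axis=1)
    s = sat[:n].reshape(-1, width).sum(axis=1)
    factors = np.where(s > 0, g / np.where(s > 0, s, 1.0), 1.0)
    # Apply each window's factor back to its daily satellite values.
    corrected = sat[:n] * np.repeat(factors, width)
    return factors, corrected

rng = np.random.default_rng(0)
gauge = rng.gamma(2.0, 3.0, size=28)   # 4 weeks of daily gauge rain, mm
sat = gauge * 0.7                      # satellite underestimating by 30%
factors, corrected = sw_bias_factors(gauge, sat, width=7)
```

With a uniform 30% underestimate, every 7-day factor comes out as 1/0.7 and the corrected series matches the gauge exactly; with real data the factors vary window to window, which is what the SW/MW comparison in the study explores.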
DETAIL OF ORIGINAL WINDOWS ON SECOND FLOOR AT THE EAST ...
DETAIL OF ORIGINAL WINDOWS ON SECOND FLOOR AT THE EAST END, SHOWING CLEARANCE BETWEEN WINDOW SASH AND PILASTER. VIEW FACING NORTH-NORTHWEST. - U.S. Naval Base, Pearl Harbor, Aviation Storehouse, Vincennes Avenue at Simms Street, Pearl City, Honolulu County, HI
NASA Astrophysics Data System (ADS)
Wang, Sheng; Li, Zhiwei
2018-06-01
S-wave velocity and attenuation structures of shallow sediments play important roles in accurate prediction of strong ground motion. However, it is more difficult to investigate attenuation than velocity structures. In this study, we developed a new approach for estimating frequency-dependent S-wave attenuation (Q_S^-1) structures of shallow sediments based on multiple-time-window analysis of borehole seismograms from local earthquakes. Multiple time windows for separating direct and surface-reflected S-waves in local earthquake waveforms at borehole stations are selected with a global optimization scheme. For the different time windows, the transfer functions between direct and surface-reflected S-waves are obtained with a weighted-averaging scheme, from which frequency-dependent Q_S^-1 values are derived. Synthetic tests suggest that the proposed method can recover robust and reliable Q_S^-1 values, especially when the dataset of local earthquakes is not abundant. We applied this method to local earthquake waveforms at 14 borehole seismic stations in the North China basin, and obtained Q_S^-1 values in the 2–10 Hz frequency band, as well as average V_P, V_S and V_P/V_S ratios for shallow sediments down to a few hundred meters. Results suggest that Q_S^-1 values range from 0.01 to 0.06 and generally decrease with frequency. The average attenuation structure of shallow sediments within a depth of a few hundred meters beneath the 14 borehole stations in the North China basin can be modeled as Q_S^-1 = 0.056 f^-0.61. This is generally consistent with the attenuation structure of sedimentary basins in other areas, such as the Mississippi Embayment sediments in the United States and the Sendai basin in Japan.
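The reported power-law model can be evaluated directly to check that it stays within the quoted range and decreases with frequency over the study band; a quick sketch:

```python
import numpy as np

def qs_inv(f, a=0.056, b=-0.61):
    """Fitted frequency-dependent attenuation model Q_S^-1 = a * f**b
    reported for North China basin shallow sediments (2-10 Hz band)."""
    return a * np.power(f, b)

f = np.linspace(2.0, 10.0, 9)   # the study's 2-10 Hz band
vals = qs_inv(f)
```

At 2 Hz the model gives roughly 0.037 and at 10 Hz roughly 0.014, consistent with the 0.01–0.06 range quoted in the abstract.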
An adaptive segment method for smoothing lidar signal based on noise estimation
NASA Astrophysics Data System (ADS)
Wang, Yuzhao; Luo, Pingping
2014-10-01
An adaptive segmentation smoothing method (ASSM) is introduced in this paper to smooth lidar signals and suppress noise. In the ASSM, the noise level is defined as 3σ of the background signal. An integer N is defined for finding the changing positions in the signal curve: if the difference between two adjacent points is greater than 3Nσ, the position is recorded as an end point of a smoothing segment. All end points detected in this way are recorded, and the curves between them are smoothed separately. In the traditional method, the end points of the smoothing windows are fixed; the ASSM instead creates changing end points for different signals, so the smoothing windows can be set adaptively. The windows are set to half of the segment length, and average smoothing is then applied within each segment. An iterative process is required to reduce the end-point aberration effect of average smoothing, and two or three iterations are sufficient. In the ASSM, the signals are smoothed in the spatial domain rather than the frequency domain, so frequency-domain disturbances are avoided. A lidar echo was simulated in the experimental work. The echo was assumed to be produced by a space-borne lidar (e.g. CALIOP), and white Gaussian noise was added to represent the random noise from the environment and the detector. The ASSM was applied to the noisy echo to filter the noise. In the test, N was set to 3 and the number of iterations to two. The results show that the signal can be smoothed adaptively by the ASSM, but N and the number of iterations may need to be optimized when the ASSM is applied to a different lidar.
Method for preparing dosimeter for measuring skin dose
Jones, Donald E.; Parker, DeRay; Boren, Paul R.
1982-01-01
A personnel dosimeter includes a plurality of compartments containing thermoluminescent dosimeter phosphors for registering radiation dose absorbed in the wearer's sensitive skin layer and for registering more deeply penetrating radiation. Two of the phosphor compartments communicate with thin windows of different thicknesses to obtain a ratio of shallowly penetrating radiation, e.g. beta. A third phosphor is disposed within a compartment communicating with a window of substantially greater thickness than the windows of the first two compartments for estimating the more deeply penetrating radiation dose. By selecting certain phosphors that are insensitive to neutrons and by loading the holder material with neutron-absorbing elements, energetic neutron dose can be estimated separately from other radiation dose. This invention also involves a method of injection molding of dosimeter holders with thin windows of consistent thickness at the corresponding compartments of different holders. This is achieved through use of a die insert having the thin window of precision thickness in place prior to the injection molding step.
3. NORTH FRONT, BULLET GLASS OBSERVATION WINDOWS FACE SLED TRACK. ...
3. NORTH FRONT, BULLET GLASS OBSERVATION WINDOWS FACE SLED TRACK. - Edwards Air Force Base, South Base Sled Track, Instrumentation & Control Building, South of Sled Track, Station "50" area, Lancaster, Los Angeles County, CA
NASA Astrophysics Data System (ADS)
Feng, Guixiang; Ming, Dongping; Wang, Min; Yang, Jianyu
2017-06-01
Scale problems are a major source of concern in the field of remote sensing. Because remote sensing is a complex technological system, the connotations of scale and scale effect in remote sensing are not yet fully understood. This paper therefore first introduces the connotations of pixel-based scale and summarizes the general understanding of the pixel-based scale effect. Pixel-based scale-effect analysis is essential for choosing appropriate remote sensing data and proper processing parameters. Fractal dimension is a useful measure for analyzing pixel-based scale; however, traditional fractal dimension calculation does not consider the impact of spatial resolution, so the change of the scale effect with spatial resolution cannot be clearly reflected. This paper therefore proposes using spatial resolution as the modified scale parameter of two fractal methods to further analyze the pixel-based scale effect. To verify the results of the two modified methods (MFBM, the Modified Windowed Fractal Brownian Motion Based on the Surface Area, and MDBM, the Modified Windowed Double Blanket Method), an existing scale-effect analysis method (the information entropy method) is used for evaluation. Six sub-regions of building areas and farmland areas were cut out from QuickBird images as experimental data. The results of the experiment show that both the fractal dimension and the information entropy present the same trend with decreasing spatial resolution, and some inflection points appear at the same feature scales. Further analysis shows that these feature scales (corresponding to the inflection points) are related to the actual sizes of the geo-objects, which results in fewer mixed pixels in the image, and these inflection points are significantly indicative of the observed features. Therefore, the experimental results indicate that the modified fractal methods effectively reflect the pixel-based scale effect in remote sensing data and help analyze the observation scale from different aspects. This research will ultimately benefit remote sensing data selection and application.
Displacement and frequency analyses of vibratory systems
NASA Astrophysics Data System (ADS)
Low, K. H.
1995-02-01
This paper deals with frequency and response studies of vibratory systems represented by a set of n coupled second-order differential equations. The following numerical methods are used in the response analysis: central difference, fourth-order Runge-Kutta and modal methods. Data generated in the response analysis are processed to obtain the system frequencies by using the fast Fourier transform (FFT) or harmonic response methods. Two types of windows are used in the FFT analysis: rectangular and Hanning windows. Examples of two, four and seven degree-of-freedom systems are considered to illustrate the proposed algorithms. Comparisons with existing results confirm the validity of the proposed methods. The Hanning window attenuates the results, giving a narrower bandwidth around the peak compared with the rectangular window. It is also found that in free vibrations of a multi-mass system, the masses vibrate in a manner that is a superposition of the natural frequencies of the system, while in forced vibrations the system vibrates at the driving frequency.
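The FFT post-processing step described above can be sketched by comparing the two windows on a simulated two-mode response. This is an illustrative example, not the paper's data; the 20 Hz and 45.3 Hz "modes" and all amplitudes are invented.

```python
import numpy as np

# Two-tone "response": a strong on-bin 20 Hz mode and a weaker
# off-bin 45.3 Hz mode, analyzed with rectangular and Hanning windows.
fs, n = 256.0, 1024
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 20.0 * t) + 0.3 * np.sin(2 * np.pi * 45.3 * t)

freqs = np.fft.rfftfreq(n, 1 / fs)
spec_rect = np.abs(np.fft.rfft(x))                  # rectangular window
spec_hann = np.abs(np.fft.rfft(x * np.hanning(n)))  # Hanning window

peak_rect = freqs[np.argmax(spec_rect)]             # dominant frequency
peak_hann = freqs[np.argmax(spec_hann)]
# Leakage far from both tones (55-65 Hz band) is lower with Hanning.
band = (freqs > 55) & (freqs < 65)
leak_rect = spec_rect[band].sum()
leak_hann = spec_hann[band].sum()
```

Both windows locate the dominant 20 Hz mode, but the Hanning window suppresses the leakage skirt of the off-bin mode far more strongly, which is why it separates closely spaced natural frequencies better in the paper's analysis.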
Fast focus estimation using frequency analysis in digital holography.
Oh, Seungtaik; Hwang, Chi-Young; Jeong, Il Kwon; Lee, Sung-Keun; Park, Jae-Hyeung
2014-11-17
A novel fast frequency-based method to estimate the focus distance of digital hologram for a single object is proposed. The focus distance is computed by analyzing the distribution of intersections of smoothed-rays. The smoothed-rays are determined by the directions of energy flow which are computed from local spatial frequency spectrum based on the windowed Fourier transform. So our method uses only the intrinsic frequency information of the optical field on the hologram and therefore does not require any sequential numerical reconstructions and focus detection techniques of conventional photography, both of which are the essential parts in previous methods. To show the effectiveness of our method, numerical results and analysis are presented as well.
Rugged sensor window materials for harsh environments
NASA Astrophysics Data System (ADS)
Bayya, Shyam; Villalobos, Guillermo; Kim, Woohong; Sanghera, Jasbinger; Hunt, Michael; Aggarwal, Ishwar D.
2014-09-01
There are several military and commercial systems operating in very harsh environments that require rugged windows; on some of these systems, the window becomes the single point of failure. These applications include sensor and imaging systems, high-energy laser weapon systems, submarine photonic masts, IR countermeasures and missiles. Depending on the sea-, land- or air-based platform, the window or dome on these systems must withstand wave slap or underwater or ground-based explosions, or survive flight through heavy rain and sand storms, while maintaining good optical transmission in the desired wavelength range. Some of these applications still use softer ZnS or fused silica windows because rugged materials are not available in the required shapes or sizes. Sapphire, ALON and spinel are very rugged materials with significantly higher strengths than ZnS and fused silica. There have been recent developments in spinel, ALON and sapphire materials to fabricate them in large sizes and conformal shapes. We have been developing spinel ceramics for several of these applications. We are also developing β-SiC as a transparent window material, as it has higher hardness, strength and toughness than sapphire, ALON and spinel. This paper gives a summary of our recent findings.
A Group Increment Scheme for Infrared Absorption Intensities of Greenhouse Gases
NASA Technical Reports Server (NTRS)
Kokkila, Sara I.; Bera, Partha P.; Francisco, Joseph S.; Lee, Timothy J.
2012-01-01
A molecule's absorption in the atmospheric infrared (IR) window (IRW) is an indicator of its efficiency as a greenhouse gas. A model for estimating the absorption of a fluorinated molecule within the IRW was developed to assess its radiative impact. This model will be useful in comparing different hydrofluorocarbons and hydrofluoroethers contribution to global warming. The absorption of radiation by greenhouse gases, in particular hydrofluoroethers and hydrofluorocarbons, was investigated using ab initio quantum mechanical methods. Least squares regression techniques were used to create a model based on this data. The placement and number of fluorines in the molecule were found to affect the absorption in the IR window and were incorporated into the model. Several group increment models are discussed. An additive model based on one-carbon groups is found to work satisfactorily in predicting the ab initio calculated vibrational intensities.
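The additive one-carbon group model described above amounts to a linear least-squares fit of total intensity against group counts; a minimal sketch follows. The group labels, counts and intensity values are synthetic placeholders, not the paper's regression data.

```python
import numpy as np

# Additive group-increment model: each molecule's IR-window intensity is
# a linear combination of its one-carbon group counts.
groups = ["CF3", "CHF2", "CH2F", "O-CF2"]        # hypothetical groups
counts = np.array([[1, 0, 1, 0],                 # rows: molecules
                   [0, 2, 0, 0],                 # cols: group occurrences
                   [1, 1, 0, 1],
                   [2, 0, 0, 0],
                   [0, 1, 1, 1]], dtype=float)
true_inc = np.array([420.0, 310.0, 180.0, 510.0])  # per-group increments
intensity = counts @ true_inc                      # stand-in ab initio totals

# Least-squares regression recovers the group increments.
inc, *_ = np.linalg.lstsq(counts, intensity, rcond=None)
```

With real ab initio intensities the system is overdetermined and noisy, so the fitted increments are a compromise over all molecules rather than an exact recovery as in this noiseless toy.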
Marko, Michael; Meng, Xing; Hsieh, Chyongere; Roussie, James; Striemer, Christopher
2013-01-01
Imaging with Zernike phase plates is increasingly being used in cryo-TEM tomography and cryo-EM single-particle applications. However, rapid ageing of the phase plates, together with the cost and effort in producing them, present serious obstacles to widespread adoption. We are experimenting with phase plates based on silicon chips that have thin windows; such phase plates could be mass-produced and made available at moderate cost. The windows are coated with conductive layers to reduce charging, and this considerably extends the useful life of the phase plates compared to traditional pure-carbon phase plates. However, a compromise must be reached between robustness and transmission through the phase-plate film. Details are given on testing phase-plate performance by means of imaging an amorphous thin film and evaluating the power spectra of the images. PMID:23994351
On the use of transition matrix methods with extended ensembles.
Escobedo, Fernando A; Abreu, Charlles R A
2006-03-14
Different extended ensemble schemes for non-Boltzmann sampling (NBS) of a selected reaction coordinate lambda were formulated so that they employ (i) "variable" sampling window schemes (including the "successive umbrella sampling" method) to comprehensively explore the lambda domain and (ii) transition matrix methods to iteratively obtain the underlying free-energy eta landscape (or "importance" weights) associated with lambda. The connection between "acceptance ratio" and transition matrix methods was first established to form the basis of the approach for estimating eta(lambda). The validity and performance of the different NBS schemes were then assessed using as lambda coordinate the configurational energy of the Lennard-Jones fluid. For the cases studied, it was found that the convergence rate in the estimation of eta is little affected by the use of data from high-order transitions, while it is noticeably improved by the use of a broader window of sampling in the variable window methods. Finally, it is shown how an "elastic" window of sampling can be used to effectively enact (nonuniform) preferential sampling over the lambda domain, and how to stitch the weights from separate one-dimensional NBS runs to produce an eta surface over a two-dimensional domain.
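The core transition-matrix idea can be sketched in a heavily simplified form: bin-to-bin transition counts along lambda are normalized into a stochastic matrix, its stationary distribution pi is extracted, and the weights follow as eta = -kT ln(pi). This is a conceptual stand-in, not the paper's acceptance-ratio estimator; the count matrix is invented.

```python
import numpy as np

def eta_from_transitions(counts, kT=1.0):
    """Free-energy weights eta(lambda) from bin transition counts via the
    stationary distribution of the normalized transition matrix."""
    T = counts / counts.sum(axis=1, keepdims=True)   # row-stochastic matrix
    w, v = np.linalg.eig(T.T)                        # left eigenvectors of T
    pi = np.abs(np.real(v[:, np.argmax(np.real(w))]))
    pi /= pi.sum()                                   # stationary distribution
    return -kT * np.log(pi)

# Symmetric counts satisfy detailed balance, so the stationary populations
# are proportional to the row sums (100, 40, 20) -> (0.625, 0.25, 0.125).
counts = np.array([[70.0, 20.0, 10.0],
                   [20.0, 15.0, 5.0],
                   [10.0, 5.0, 5.0]])
eta = eta_from_transitions(counts)
```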
Shang, Jianyu; Deng, Zhihong; Fu, Mengyin; Wang, Shunting
2016-06-16
Traditional artillery guidance can significantly improve the attack accuracy and overall combat efficiency of projectiles, which makes it more adaptable to the information warfare of the future. Obviously, the accurate measurement of artillery spin rate, which has long been regarded as a daunting task, is the basis of precise guidance and control. Magnetoresistive (MR) sensors can be applied to spin rate measurement, especially in the high-spin and high-g projectile launch environment. In this paper, based on the theory of a MR sensor measuring spin rate, the mathematical relationship model between the frequency of MR sensor output and projectile spin rate was established through a fundamental derivation. By analyzing the characteristics of MR sensor output whose frequency varies with time, this paper proposed the Chirp z-Transform (CZT) time-frequency (TF) domain analysis method based on the rolling window of a Blackman window function (BCZT) which can accurately extract the projectile spin rate. To put it into practice, BCZT was applied to measure the spin rate of 155 mm artillery projectile. After extracting the spin rate, the impact that launch rotational angular velocity and aspect angle have on the extraction accuracy of the spin rate was analyzed. Simulation results show that the BCZT TF domain analysis method can effectively and accurately measure the projectile spin rate, especially in a high-spin and high-g projectile launch environment.
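The rolling-window time-frequency idea can be sketched as follows: a Blackman window slides along the sensor signal and the spectral peak of each segment tracks the instantaneous spin frequency. A zero-padded FFT stands in here for the Chirp z-Transform of the paper, and the decaying 250-to-200 Hz spin profile is invented.

```python
import numpy as np

def spin_rate_track(sig, fs, win=256, hop=64, pad=4096):
    """Track instantaneous frequency with a rolling Blackman window and
    spectral-peak picking (FFT stand-in for the paper's CZT)."""
    w = np.blackman(win)
    freqs = np.fft.rfftfreq(pad, 1 / fs)
    est = []
    for start in range(0, len(sig) - win + 1, hop):
        seg = sig[start:start + win] * w
        spec = np.abs(np.fft.rfft(seg, n=pad))   # zero padding refines grid
        est.append(freqs[np.argmax(spec)])
    return np.array(est)

# MR-sensor-like tone whose frequency falls linearly 250 -> 200 Hz in 2 s.
fs = 4096.0
t = np.arange(0, 2.0, 1 / fs)
f_inst = 250.0 - 25.0 * t
sig = np.sin(2 * np.pi * np.cumsum(f_inst) / fs)
rates = spin_rate_track(sig, fs)
```

The CZT's advantage over the plain FFT used here is that it can zoom the frequency grid onto a narrow band of interest without padding the whole transform, which matters for the fine rate resolution the paper needs.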
Power strain imaging based on vibro-elastography techniques
NASA Astrophysics Data System (ADS)
Wen, Xu; Salcudean, S. E.
2007-03-01
This paper describes a new ultrasound elastography technique, power strain imaging, based on vibro-elastography (VE) techniques. With this method, tissue is compressed by a vibrating actuator driven by low-pass or band-pass filtered white noise, typically in the 0-20 Hz range. Tissue displacements at different spatial locations are estimated by correlation-based approaches on the raw ultrasound radio frequency signals and recorded as time sequences. The power spectra of these time sequences are computed by Fourier spectral analysis techniques. As the average of the power spectrum is proportional to the squared amplitude of the tissue motion, the square root of the average power over the range of excitation frequencies is used as a measure of the tissue displacement. Tissue strain is then determined by least-squares estimation of the gradient of the displacement field. The computation of the power spectra of the time sequences can be implemented efficiently by using Welch's periodogram method with moving windows or with accumulative windows with a forgetting factor. Compared to the transfer function estimation originally used in VE, the computation of cross spectral densities is not needed, which saves both memory and computation time. Phantom experiments demonstrate that the proposed method produces stable and operator-independent strain images with a high signal-to-noise ratio in real time. This approach has also been tested on a few patient datasets of the prostate region, and the results are encouraging.
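The power measure described above can be sketched for one spatial location: Hanning-windowed periodograms of a displacement time sequence are averaged (Welch's method with moving windows), and the square root of the band power recovers the motion amplitude. This is an illustrative sketch only; the 7 Hz excitation and all parameters are invented.

```python
import numpy as np

def band_amplitude(seq, fs, band, win=256, hop=128):
    """Welch-style average of windowed periodograms; returns the sinusoid
    amplitude implied by the power in the excitation band."""
    w = np.hanning(win)
    scale = (w ** 2).sum() * fs                 # periodogram normalization
    freqs = np.fft.rfftfreq(win, 1 / fs)
    psds = []
    for s in range(0, len(seq) - win + 1, hop):
        seg = (seq[s:s + win] - seq[s:s + win].mean()) * w
        psds.append(np.abs(np.fft.rfft(seg)) ** 2 / scale)
    psd = np.mean(psds, axis=0)                 # averaged moving windows
    sel = (freqs >= band[0]) & (freqs <= band[1])
    power = psd[sel].sum() * (fs / win) * 2     # one-sided band power
    return np.sqrt(2 * power)                   # A, since power = A^2/2

fs = 100.0
t = np.arange(0, 20, 1 / fs)
disp = 3.0 * np.sin(2 * np.pi * 7.0 * t)        # 7 Hz motion, amplitude 3
amp = band_amplitude(disp, fs, band=(2, 15))
```

Because only auto-power is needed, no cross-spectral densities are computed, which is the saving over transfer-function VE that the abstract points out.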
A Stationary Wavelet Entropy-Based Clustering Approach Accurately Predicts Gene Expression
Nguyen, Nha; Vo, An; Choi, Inchan
2015-01-01
Studying epigenetic landscapes is important to understand the condition for gene regulation. Clustering is a useful approach to study epigenetic landscapes by grouping genes based on their epigenetic conditions. However, classical clustering approaches that often use a representative value of the signals in a fixed-sized window do not fully use the information written in the epigenetic landscapes. Clustering approaches to maximize the information of the epigenetic signals are necessary for better understanding gene regulatory environments. For effective clustering of multidimensional epigenetic signals, we developed a method called Dewer, which uses the entropy of stationary wavelet of epigenetic signals inside enriched regions for gene clustering. Interestingly, the gene expression levels were highly correlated with the entropy levels of epigenetic signals. Dewer separates genes better than a window-based approach in the assessment using gene expression and achieved a correlation coefficient above 0.9 without using any training procedure. Our results show that the changes of the epigenetic signals are useful to study gene regulation. PMID:25383910
Field repair of AH-16 helicopter window cutting assemblies
NASA Technical Reports Server (NTRS)
Bement, L. J.
1984-01-01
The U.S. Army uses explosively actuated window cutting assemblies to provide emergency crew ground egress. Gaps between the system's explosive cords and acrylic windows caused a concern about functional reliability for a fleet of several hundred aircraft. A field repair method, using room temperature vulcanizing silicone compound (RTV), was developed and demonstrated to fill gaps as large as 0.250 inch.
6. SOUTH SIDE, DETAIL OF BULLET GLASS WINDOWS AT GROUND ...
6. SOUTH SIDE, DETAIL OF BULLET GLASS WINDOWS AT GROUND LEVEL. - Edwards Air Force Base, South Base Sled Track, Firing Control Blockhouse, South of Sled Track at east end, Lancaster, Los Angeles County, CA
Window acoustic study for advanced turboprop aircraft
NASA Technical Reports Server (NTRS)
Prydz, R. A.; Balena, F. J.
1984-01-01
An acoustic analysis was performed to establish window designs for advanced turboprop powered aircraft. The window transmission loss requirements were based on A-weighted interior noise goals of 80 and 75 dBA. The analytical results showed that a triple pane window consisting of two glass outer panes and an inner pane of acrylic would provide the required transmission loss and meet the sidewall space limits. Two window test articles were fabricated for laboratory evaluation and verification of the predicted transmission loss. Procedures for performing laboratory tests are presented.
NASA Astrophysics Data System (ADS)
Wihardi, Y.; Setiawan, W.; Nugraha, E.
2018-01-01
In this research we build a CBIR system based on a learned distance/similarity function using Linear Discriminant Analysis (LDA) and Histogram of Oriented Gradients (HoG) features. Our method is invariant to the depiction of an image, covering image-to-image, sketch-to-image, and painting-to-image similarity. LDA decreases execution time compared to the state-of-the-art method, but it still needs improvement in terms of accuracy. The inaccuracy in our experiment arises because we did not perform a sliding-window search and because of the low number of negative samples of natural-world images.
Flokou, Angeliki; Aletras, Vassilis; Niakas, Dimitris
2017-01-01
The main objective of this study was to apply the non-parametric method of Data Envelopment Analysis (DEA) to measure the efficiency of Greek NHS hospitals between 2009–2013. Hospitals were divided into four separate groups with common characteristics, which allowed comparisons to be carried out in a context of increased homogeneity. The window-DEA method was chosen since it increases discrimination in the results, especially when applied to small samples, and it enables year-by-year comparisons. Three inputs (hospital beds, physicians and other health professionals) and three outputs (hospitalized cases, surgeries and outpatient visits) were chosen as production variables in an input-oriented 2-year-window DEA model for the assessment of technical and scale efficiency as well as for the identification of returns to scale. The Malmquist productivity index together with its components (pure technical efficiency change, scale efficiency change and technological change) was also calculated in order to analyze the sources of productivity change between the first and last year of the study period. In the context of window analysis, the study identified the individual efficiency trends together with “all-windows” best and worst performers and revealed that a high level of technical and scale efficiency was maintained over the entire 5-year period. Similarly, the findings of the Malmquist productivity index analysis showed that both scale and pure technical efficiency improved in 2013, whilst technological change was found to be in favor of the two groups with the largest hospitals. PMID:28542362
NASA Astrophysics Data System (ADS)
Mousavi Anzehaee, Mohammad; Adib, Ahmad; Heydarzadeh, Kobra
2015-10-01
The manner of microtremor data collection and filtering, as well as the processing method used, has a considerable effect on the accuracy of estimated dynamic soil parameters. In this paper, the running variance method was used to improve the automatic detection of data sections affected by local perturbations. In this method, the running variance of the microtremor data is computed using a sliding window. The obtained signal is then used to remove the ranges of data affected by perturbations from the original data. Additionally, to determine the fundamental frequency of a site, this study proposes a method based on statistical characteristics. Specifically, statistical characteristics such as the probability density graph and the average and standard deviation of all the frequencies corresponding to the maximum peaks in the H/V spectra of all data windows are used to differentiate the real peaks from the false peaks resulting from perturbations. The methods have been applied to the data recorded for the City of Meybod in central Iran. Experimental results show that the applied methods are able to successfully reduce the effects of extensive local perturbations on microtremor data and, ultimately, to estimate the fundamental frequency more accurately than other common methods.
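The running-variance screening step can be sketched in plain Python; the window length and variance threshold below are illustrative assumptions, not values from the paper:

```python
def running_variance(signal, window):
    """Variance of `signal` within each sliding-window position."""
    out = []
    for i in range(len(signal) - window + 1):
        w = signal[i:i + window]
        mean = sum(w) / window
        out.append(sum((x - mean) ** 2 for x in w) / window)
    return out

def remove_perturbed(signal, window, threshold):
    """Drop samples covered by any window whose variance exceeds `threshold`."""
    keep = [True] * len(signal)
    for i, v in enumerate(running_variance(signal, window)):
        if v > threshold:
            for j in range(i, i + window):
                keep[j] = False
    return [x for x, k in zip(signal, keep) if k]
```

Quiet sections pass through unchanged, while any sample touched by a high-variance window (a local perturbation) is excised before the H/V spectra are computed.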
CNN for breaking text-based CAPTCHA with noise
NASA Astrophysics Data System (ADS)
Liu, Kaixuan; Zhang, Rong; Qing, Ke
2017-07-01
A CAPTCHA ("Completely Automated Public Turing test to tell Computers and Humans Apart") system is a program that most humans can pass but current computer programs can hardly pass. As the most common type of CAPTCHA, text-based CAPTCHAs have been widely used on different websites to defend against network bots. In order to break text-based CAPTCHAs, in this paper two trained CNN models are connected for the segmentation and classification of CAPTCHA images. Based on these two models, we then apply sliding-window segmentation and voting classification methods to realize an end-to-end CAPTCHA-breaking system with a high success rate. The experimental results show that our method is robust and effective in breaking text-based CAPTCHAs with noise.
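A minimal sketch of the sliding-window plus majority-voting idea (not the authors' CNN code; the window width, step, and classifier are placeholders):

```python
from collections import Counter

def slide(columns, width, step):
    """Candidate character windows slid across the image columns."""
    return [columns[i:i + width] for i in range(0, len(columns) - width + 1, step)]

def vote(labels):
    """Majority vote over labels predicted for overlapping windows."""
    return Counter(labels).most_common(1)[0][0]
```

In the paper's pipeline, a segmentation CNN scores each window and a classification CNN labels the surviving ones; a vote over overlapping windows then yields the final character.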
Research on the honeycomb restrain layer application to the high power microwave dielectric window
NASA Astrophysics Data System (ADS)
Zhang, Qingyuan; Shao, Hao; Huang, Wenhua; Guo, Letian
2018-01-01
Dielectric window breakdown is an important problem in high power microwave radiation. A honeycomb layer can suppress the multipactor in two directions to restrain dielectric window breakdown. This paper studies the effect of the honeycomb restrain layer on improving the dielectric window power capability. It also studies the multipactor suppression mechanism using electromagnetic particle-in-cell software, gives the design method, and reports the verification experiment. The experimental results indicate that the honeycomb restrain layer can effectively double the power capability.
Smart glass as the method of improving the energy efficiency of high-rise buildings
NASA Astrophysics Data System (ADS)
Gamayunova, Olga; Gumerova, Eliza; Miloradova, Nadezda
2018-03-01
A question that has to be answered in high-rise building design is the choice of glazing and its service-life conditions. The contemporary market offers several types of window units, for instance wooden, aluminum, PVC, and combined models. Wooden and PVC windows have become the most widespread and compete directly with each other. More recently, design engineers have been choosing smart glass. In this article, the advantages and drawbacks of all types of windows are reviewed, and recommendations are given on the choice of window type in order to improve the energy efficiency of buildings.
Design of a digital phantom population for myocardial perfusion SPECT imaging research.
Ghaly, Michael; Du, Yong; Fung, George S K; Tsui, Benjamin M W; Links, Jonathan M; Frey, Eric
2014-06-21
Digital phantoms and Monte Carlo (MC) simulations have become important tools for optimizing and evaluating instrumentation, acquisition and processing methods for myocardial perfusion SPECT (MPS). In this work, we designed a new adult digital phantom population and generated corresponding Tc-99m and Tl-201 projections for use in MPS research. The population is based on the three-dimensional XCAT phantom with organ parameters sampled from the Emory PET Torso Model Database. Phantoms included three variations each in body size, heart size, and subcutaneous adipose tissue level, for a total of 27 phantoms of each gender. The SimSET MC code and angular response functions were used to model interactions in the body and the collimator-detector system, respectively. We divided each phantom into seven organs, each simulated separately, allowing use of post-simulation summing to efficiently model uptake variations. Also, we adapted and used a criterion based on the relative Poisson effective count level to determine the required number of simulated photons for each simulated organ. This technique provided a quantitative estimate of the true noise in the simulated projection data, including residual MC simulation noise. Projections were generated in 1 keV wide energy windows from 48-184 keV assuming perfect energy resolution to permit study of the effects of window width, energy resolution, and crosstalk in the context of dual isotope MPS. We have developed a comprehensive method for efficiently simulating realistic projections for a realistic population of phantoms in the context of MPS imaging. 
The new phantom population and realistic database of simulated projections will be useful in performing mathematical and human observer studies to evaluate various acquisition and processing methods such as optimizing the energy window width, investigating the effect of energy resolution on image quality and evaluating compensation methods for degrading factors such as crosstalk in the context of single and dual isotope MPS.
ERIC Educational Resources Information Center
Park, Moonyoung
2018-01-01
Aviation English proficiency is a core competency in the global air traffic controller profession. There is, however, growing concern about the current ineffective paper-based assessment methods and the severe lack of interactive online testing for such a critical profession, one that should be ideally assessed in an authentic task and situation…
Investigating the Limitations of Advanced Design Methods through Real World Application
2016-03-31
36 War Room Laptop Display (MySQL, JMP 9 Pro, 64-bit Windows) Georgia Tech Secure Collaborative Visualization Environment (MySQL, JMP 9 Pro ...) investigate expanding the EA for VC3ATS • Would like to consider both an expansion of the use of the current Java-based BPM approach and other potential EA
NASA Astrophysics Data System (ADS)
Rai, A.; Minsker, B. S.
2016-12-01
In this work we introduce a novel dataset, GRID: GReen Infrastructure Detection Dataset, and a framework for identifying urban green storm water infrastructure (GI) designs (wetlands/ponds, urban trees, and rain gardens/bioswales) from social media and satellite aerial images using computer vision and machine learning methods. Along with the hydrologic benefits of GI, such as reducing runoff volumes and urban heat islands, GI also provides important socio-economic benefits such as stress recovery and community cohesion. However, GI is installed by many different parties and cities typically do not know where GI is located, making study of its impacts or siting of new GI difficult. We use object recognition learning methods (template matching, the sliding-window approach, and the random Hough forest method) and supervised machine learning algorithms (e.g., support vector machines) as initial screening approaches to detect potential GI sites, which can then be investigated in more detail using on-site surveys. Training data were collected from GPS locations of Flickr and Instagram image postings and Amazon Mechanical Turk identification of each GI type. The sliding-window method outperformed the other methods and achieved an average F-measure, a combined metric of precision and recall, of 0.78.
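The F-measure quoted above is the harmonic mean of precision and recall; the detection counts in this check are made up purely to show how a score of 0.78 can arise:

```python
def f_measure(tp, fp, fn):
    """F1 score: harmonic mean of precision and recall."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts: 78 true GI detections, 22 false alarms, 22 misses.
score = f_measure(tp=78, fp=22, fn=22)  # -> approx. 0.78
```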
Dong, Shaopeng; Yuan, Mei; Wang, Qiusheng; Liang, Zhiling
2018-05-21
The acoustic emission (AE) method is useful for structural health monitoring (SHM) of composite structures due to its high sensitivity and real-time capability. The main challenge, however, is how to classify the AE data into different failure mechanisms, because the detected signals are affected by various factors. Empirical wavelet transform (EWT) is a solution for analyzing multi-component signals and has been used to process AE data. In order to solve the spectrum separation problem of AE signals, this paper proposes a novel modified separation method based on the local window maxima (LWM) algorithm. It searches for the local maxima of the Fourier spectrum in a proper window and automatically determines the boundaries of the spectrum segmentation, which helps to eliminate the impact of noise interference or frequency dispersion in the detected signal and to obtain meaningful empirical modes that are more closely related to the damage characteristics. Both a simulation signal and AE signals from composite structures are used to verify the effectiveness of the proposed method. The experimental results indicate that the proposed method performs better than the original EWT method in identifying different damage mechanisms of composite structures.
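A toy illustration of the local-window-maxima idea on a magnitude spectrum: keep samples that dominate the window centred on them, then cut segment boundaries midway between adjacent maxima. The window length and the midpoint boundary rule are assumptions for this sketch, not the paper's exact algorithm:

```python
def local_window_maxima(spectrum, window):
    """Indices whose value is the maximum of the window centred on them."""
    half = window // 2
    peaks = []
    for i in range(half, len(spectrum) - half):
        seg = spectrum[i - half:i + half + 1]
        if spectrum[i] == max(seg) and spectrum[i] > min(seg):
            peaks.append(i)
    return peaks

def segment_boundaries(peaks, n):
    """Boundaries halfway between adjacent maxima, spanning the whole spectrum."""
    return [0] + [(a + b) // 2 for a, b in zip(peaks, peaks[1:])] + [n]
```

Each resulting frequency segment then defines one empirical wavelet filter, so spurious noise ripples that are not window-dominant never spawn a segment of their own.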
SOUTHWEST SIDE AND SOUTHEAST FRONT, BUILDING 1932. OBSERVATION WINDOWS ARE ...
SOUTHWEST SIDE AND SOUTHEAST FRONT, BUILDING 1932. OBSERVATION WINDOWS ARE BEHIND THE METAL GRATING. ENTRY HATCH IS ON NORTHWEST FACADE - Edwards Air Force Base, X-15 Engine Test Complex, Observation Bunker Types, Rogers Dry Lake, east of runway between North Base & South Base, Boron, Kern County, CA
Robotic Attention Processing And Its Application To Visual Guidance
NASA Astrophysics Data System (ADS)
Barth, Matthew; Inoue, Hirochika
1988-03-01
This paper describes a method of real-time visual attention processing for robots performing visual guidance. This robot attention processing is based on a novel vision processor, the multi-window vision system, which was developed at the University of Tokyo. The multi-window vision system is unique in that it only processes visual information inside local area windows. These local area windows are quite flexible in their ability to move anywhere on the visual screen, change their size and shape, and alter their pixel sampling rate. By using these windows for specific attention tasks, it is possible to perform high speed attention processing. The primary attention skills of detecting motion, tracking an object, and interpreting an image are all performed at high speed on the multi-window vision system. A basic robotic attention scheme using these attention skills was developed, involving detection and tracking of salient visual features. The tracking and motion information thus obtained was utilized in producing the response to the visual stimulus. The response of the attention scheme was quick enough to be applicable to the real-time vision processing tasks of playing a video 'pong' game and, later, using an automobile driving simulator. By detecting the motion of a 'ball' on a video screen and then tracking the movement, the attention scheme was able to control a 'paddle' in order to keep the ball in play. The response was faster than a human's, allowing the attention scheme to play the video game at higher speeds. Further, in the application to the driving simulator, the attention scheme was able to control both direction and velocity of a simulated vehicle following a lead car. These two applications show the potential of local visual processing for robotic attention processing.
2013-01-01
Background Screening of houses might reduce the density of indoor host-seeking Anopheles mosquitoes. A randomized trial of screening windows and doors with metal mesh and closing openings on eaves and walls with mud was conducted to assess whether this reduces indoor densities of biting mosquitoes. Methods Mosquitoes were collected in forty houses using Centers for Disease Control and Prevention (CDC) light traps biweekly in March and April 2011. Houses were randomized into control and intervention groups based on the baseline data. Windows and doors of 20 houses were screened with metal mesh and openings on the walls and eaves were closed with mud; the remaining 20 houses were used as the control group. Mosquitoes were collected biweekly in October and November 2011 from both control and intervention houses. Generalized Estimating Equations (GEE) with a negative binomial error distribution were used to account for overdispersion of Anopheles arabiensis and culicine counts and for repeated catches made in the same house. Results Screening doors and windows and closing openings on eaves and walls with mud reduced the overall indoor density of An. arabiensis by 40%. The effect of screening was pronounced on unfed An. arabiensis, resulting in a 42% reduction in houses with interventions. The total cost of screening windows and doors and closing openings on the eaves and walls with mud was 7.34 USD per house. Conclusion Screening houses reduced the indoor density of An. arabiensis; it was cheap and can easily be incorporated into malaria vector control strategies by local communities, but improving the fit of doors and windows for screening should be considered during house construction to increase the efficacy of screening. PMID:24028542
Non-valvular main pulmonary artery vegetation associated with aortopulmonary window.
Unal, M; Tuncer, C; Serçe, K; Bostan, M; Gökçe, M; Erem, C
1995-01-01
We present a 32-year-old female with an aortopulmonary window and vegetation of the non-valvular main pulmonary artery. The aortopulmonary window is a rare congenital disease in which the aorta and pulmonary artery communicate through a defect of variable diameter. The pulmonic valve is the least commonly involved valve in bacterial endocarditis, and no vegetation of the non-valvular main pulmonary artery has previously been reported in the literature. Colour duplex sonography showed an aortopulmonary window with aortic regurgitation. Magnetic resonance (MR) imaging, which demonstrated the vegetation on the wall of the main pulmonary artery, is a useful and complementary method and can be used for the demonstration of congenital and acquired cardiovascular pathologies, including aortopulmonary window and subpulmonic or suprapulmonic vegetations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruemmer, David J; Walton, Miles C
Methods and systems for controlling a plurality of robots through a single user interface include at least one robot display window for each of the plurality of robots, with the at least one robot display window illustrating one or more conditions of a respective one of the plurality of robots. The user interface further includes at least one robot control window for each of the plurality of robots, with the at least one robot control window configured to receive one or more commands for sending to the respective one of the plurality of robots. The user interface further includes a multi-robot common window comprising information received from each of the plurality of robots.
Short segment search method for phylogenetic analysis using nested sliding windows
NASA Astrophysics Data System (ADS)
Iskandar, A. A.; Bustamam, A.; Trimarsanto, H.
2017-10-01
To analyze phylogenetics in bioinformatics, the coding DNA sequence (CDS) segment is needed for maximal accuracy. However, analysis of the full CDS costs a lot of time and money, so a short segment representative of the CDS, such as the envelope protein segment or the non-structural 3 (NS3) segment, is necessary. After sliding windows were implemented, a short segment better than the envelope protein and NS3 segments was found. This paper discusses a mathematical method for analyzing sequences using nested sliding windows to find a short segment that is representative of the whole genome. The result shows that our method can find a short segment that is about 6.57% more representative of the CDS segment, in terms of tree topology, than the envelope protein or NS3 segment.
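The nested-sliding-window search can be pictured as a coarse outer scan followed by a fine inner scan within the best outer window. The scoring function, window sizes, and steps here are placeholders; the paper scores candidate segments by the topological similarity of their phylogenetic trees to the CDS tree:

```python
def sliding_windows(seq, size, step):
    """Yield (start, subsequence) for every window position."""
    for start in range(0, len(seq) - size + 1, step):
        yield start, seq[start:start + size]

def nested_search(seq, score, outer, inner):
    """Coarse outer scan, then a single-step inner scan inside the best window."""
    best_start, best_win = max(sliding_windows(seq, outer, outer // 2),
                               key=lambda item: score(item[1]))
    offset, segment = max(sliding_windows(best_win, inner, 1),
                          key=lambda item: score(item[1]))
    return best_start + offset, segment

# Toy run: score = count of 'G' bases (a stand-in for tree similarity).
start, seg = nested_search("AAAAGGGGGGAAAA", lambda s: s.count("G"), 6, 4)
```

Nesting keeps the cost low: the expensive scoring is run densely only inside the single outer window that survives the coarse pass.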
Tencer, Michal; Berini, Pierre
2008-11-04
We describe a method for the selective desorption of thiol self-assembled monolayers from gold surfaces having micrometer-scale separations on a substrate. In an electrolyte solution, the electrical resistance between the adjacent areas can be much lower than the resistance between a surface and the counter electrode. Also, both reductive and oxidative thiol desorption may occur. Therefore, the potentials of the surfaces must be independently controlled with a multichannel potentiostat and operating windows for a given thiol/electrolyte system must be established. In this study operating windows were established for 1-dodecanethiol-based SAMs in phosphate buffer, phosphate-buffered saline, and sodium hydroxide solution, and selective SAM removal was successfully performed in a four-electrode configuration.
Genkawa, Takuma; Shinzawa, Hideyuki; Kato, Hideaki; Ishikawa, Daitaro; Murayama, Kodai; Komiyama, Makoto; Ozaki, Yukihiro
2015-12-01
An alternative baseline correction method for diffuse reflection near-infrared (NIR) spectra, searching region standard normal variate (SRSNV), was proposed. Standard normal variate (SNV) is an effective pretreatment method for baseline correction of diffuse reflection NIR spectra of powder and granular samples; however, its baseline correction performance depends on the NIR region used for SNV calculation. To search for an optimal NIR region for baseline correction using SNV, SRSNV employs moving window partial least squares regression (MWPLSR), and an optimal NIR region is identified based on the root mean square error (RMSE) of cross-validation of the partial least squares regression (PLSR) models with the first latent variable (LV). The performance of SRSNV was evaluated using diffuse reflection NIR spectra of mixture samples consisting of wheat flour and granular glucose (0-100% glucose at 5% intervals). From the obtained NIR spectra of the mixture in the 10 000-4000 cm(-1) region at 4 cm(-1) intervals (1501 spectral channels), a series of spectral windows consisting of 80 spectral channels was constructed, and then SNV spectra were calculated for each spectral window. Using these SNV spectra, a series of PLSR models with the first LV for glucose concentration was built. A plot of RMSE versus the spectral window position obtained using the PLSR models revealed that the 8680-8364 cm(-1) region was optimal for baseline correction using SNV. In the SNV spectra calculated using the 8680-8364 cm(-1) region (SRSNV spectra), a remarkable relative intensity change between a band due to wheat flour at 8500 cm(-1) and that due to glucose at 8364 cm(-1) was observed owing to successful baseline correction using SNV.
A PLSR model with the first LV based on the SRSNV spectra yielded a determination coefficient (R2) of 0.999 and an RMSE of 0.70%, while a PLSR model with three LVs based on SNV spectra calculated in the full spectral region gave an R2 of 0.995 and an RMSE of 2.29%. Additional evaluation of SRSNV was carried out using diffuse reflection NIR spectra of marzipan and corn samples, and PLSR models based on SRSNV spectra showed good prediction results. These evaluation results indicate that SRSNV is effective in baseline correction of diffuse reflection NIR spectra and provides regression models with good prediction accuracy.
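Per spectrum, SNV reduces to centering and scaling by the statistics of the chosen channel range; a plain-Python sketch of the windowed variant (the channel indices are arbitrary here, whereas SRSNV selects them via MWPLSR):

```python
import math

def snv(spectrum):
    """Standard normal variate: scale a spectrum by its own mean and std."""
    mean = sum(spectrum) / len(spectrum)
    std = math.sqrt(sum((x - mean) ** 2 for x in spectrum) / len(spectrum))
    return [(x - mean) / std for x in spectrum]

def windowed_snv(spectrum, start, stop):
    """SNV restricted to the channels in [start, stop) -- the SRSNV idea."""
    return snv(spectrum[start:stop])
```

Because each spectrum is normalized by its own window statistics, additive and multiplicative baseline shifts common to the window cancel out, which is why the choice of window matters so much.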
Shehzad, Khurram; Xu, Yang; Gao, Chao; Li, Hanying; Dang, Zhi-Min; Hasan, Tawfique; Luo, Jack; Duan, Xiangfeng
2017-03-01
Polymer dielectrics offer key advantages over their ceramic counterparts such as flexibility, scalability, low cost, and high breakdown voltages. However, a major drawback that limits more widespread application of polymer dielectrics is their temperature-dependent dielectric properties. Achieving dielectric constants with low/zero-temperature coefficient (L/0TC) over a broad temperature range is essential for applications in diverse technologies. Here, we report a hybrid filler strategy to produce polymer composites with an ultrawide L/0TC window of dielectric constant, as well as a significantly enhanced dielectric value, maximum energy storage density, thermal conductivity, and stability. By creating a series of percolative polymer composites, we demonstrated that hybrid carbon filler based composites can exhibit a zero-temperature coefficient window of 200 °C (from -50 to 150 °C), the widest 0TC window for all polymer composite dielectrics reported to date. We further show that the electric and dielectric temperature coefficients of the composites are highly stable against stretching and bending, even under an AC electric field with frequency up to 1 MHz. We envision that our method will push the functional limits of polymer dielectrics for flexible electronics in extreme conditions such as in hybrid vehicles, aerospace, power electronics, and oil/gas exploration.
Windowed time-reversal music technique for super-resolution ultrasound imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Lianjie; Labyed, Yassin
Systems and methods for super-resolution ultrasound imaging using a windowed and generalized TR-MUSIC algorithm that divides the imaging region into overlapping sub-regions and applies the TR-MUSIC algorithm to the windowed backscattered ultrasound signals corresponding to each sub-region. The algorithm is also structured to account for the ultrasound attenuation in the medium and the finite-size effects of ultrasound transducer elements.
Komorowski, Dariusz; Pietraszek, Stanislaw
2016-01-01
This paper presents the analysis of multi-channel electrogastrographic (EGG) signals using the continuous wavelet transform based on the fast Fourier transform (CWTFT). The EGG analysis was based on the determination of several signal parameters such as dominant frequency (DF), dominant power (DP) and index of normogastria (NI). The use of the continuous wavelet transform (CWT) allows better localization of the frequency components in the analyzed signals than the commonly used short-time Fourier transform (STFT). Such an analysis is possible by means of a variable-width window, which corresponds to the time scale of observation (analysis). Wavelet analysis allows using long time windows when more precise low-frequency information is needed, and shorter windows when high-frequency information is needed. Since the classic CWT requires considerable computing power and time, especially when applied to the analysis of long signals, the authors used a CWT analysis based on the fast Fourier transform (FFT). The CWT was obtained using properties of the circular convolution to improve the speed of calculation. This method allows results to be obtained for relatively long EGG records in a fairly short time, much faster than with classical methods based on running spectrum analysis (RSA). In this study the authors indicate the possibility of a parametric analysis of EGG signals using the continuous wavelet transform, which is a completely new solution. The results obtained with the described method are shown in the example of an analysis of four-channel EGG recordings performed for a non-caloric meal.
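The FFT-based speed-up rests on the convolution theorem: circular convolution in time equals pointwise multiplication of DFT spectra. A naive-DFT illustration of that property (a practical implementation would use an FFT routine; this O(n^2) version is only for clarity):

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def idft(X):
    """Naive inverse DFT."""
    n = len(X)
    return [sum(X[j] * cmath.exp(2j * cmath.pi * j * k / n) for j in range(n)) / n
            for k in range(n)]

def circular_convolution(x, h):
    """Convolution theorem: multiply spectra, transform back."""
    products = [a * b for a, b in zip(dft(x), dft(h))]
    return [v.real for v in idft(products)]
```

A unit impulse delayed by one sample circularly shifts the signal: `circular_convolution([1, 2, 3, 4], [0, 1, 0, 0])` gives `[4, 1, 2, 3]` up to rounding, which is the property the CWTFT exploits to evaluate wavelet filtering at every scale via fast transforms.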
Wisp, the Windows Interface for Simulating Plumes, is designed to be an easy-to-use windows platform program for aquatic modeling. Wisp inherits many of its capabilities from its predecessor, the DOS-based PLUMES (Baumgartner, Frick, Roberts, 1994). These capabilities have been ...
Rule-Based Motion Coordination for the Adaptive Suspension Vehicle on Ternary-Type Terrain
1990-12-01
(defvar *robot-window-array* nil)
(defvar *robot-window-width* nil)
(defvar *robot-window-height* nil)
(defvar *terrain-buffer* nil)
(defvar *terrain...
(cond ((member leg lift-able-legs :test #'equal) leg)
      (t nil))
(defmethod (test-overlap-robot place-able-leg) (leg)
  (cond ((and (member leg place-able...
Patterning of organic photovoltaic on R2R processed thin film barriers using IR laser sources
NASA Astrophysics Data System (ADS)
Fledderus, H.; Akkerman, H. B.; Salem, A.; Friedrich Schilling, N.; Klotzbach, U.
2017-02-01
We present the development of laser processes for flexible OPV on roll-to-roll (R2R) produced thin film barrier with indium tin oxide (ITO) as the transparent conductive (TC) bottom electrode. Direct laser structuring of ITO on such barrier films (the so-called P1 process) is very challenging since the layers are all transparent, complete electrical isolation is required, and the laser process should not influence the barrier performance underneath the scribes. Based on the optical properties of the SiN and ITO, ultra-short pulse lasers in the picosecond and femtosecond regime with standard infrared (IR) wavelength, as well as lasers with a new wavelength (2 μm regime), were tested for this purpose. To determine a process window for a specific laser, a fixed methodology was adopted. Single-pulse ablation tests were followed by scribing experiments in which the pulse overlap was tuned by varying laser pulse fluence, writing speed and frequency. To verify that the laser scribing does not result in barrier damage underneath, a new test method was developed based on the optical Ca-test. This method shows a clear improvement in the analysis of damage underneath laser scribes over normal optical inspection methods (e.g. microscope, optical profiler, SEM). In this way, clear process windows can be obtained for IR TC patterning.
Gabler, Christopher A.; Siemann, Evan
2013-01-01
The rate of new exotic recruitment following removal of adult invaders (reinvasion pressure) influences restoration outcomes and costs but is highly variable and poorly understood. We hypothesize that broad variation in average reinvasion pressure of Triadica sebifera (Chinese tallow tree, a major invader) arises from differences among habitats in spatiotemporal availability of realized recruitment windows. These windows are periods of variable duration long enough to permit establishment given local environmental conditions. We tested this hypothesis via a greenhouse mesocosm experiment that quantified how the duration of favorable moisture conditions prior to flood or drought stress (window duration), competition and nutrient availability influenced Triadica success in high stress environments. Window duration influenced pre-stress seedling abundance and size, growth during stress and final abundance; it interacted with other factors to affect final biomass and germination during stress. Stress type and competition impacted final size and biomass, plus germination, mortality and changes in size during stress. Final abundance also depended on competition and the interaction of window duration, stress type and competition. Fertilization interacted with competition and stress to influence biomass and changes in height, respectively, but did not affect Triadica abundance. Overall, longer window durations promoted Triadica establishment, competition and drought (relative to flood) suppressed establishment, and fertilization had weak effects. Interactions among factors frequently produced different effects in specific contexts. Results support our ‘outgrow the stress’ hypothesis and show that temporal availability of abiotic windows and factors that influence growth rates govern Triadica recruitment in stressful environments. These findings suggest that native seed addition can effectively suppress superior competitors in stressful environments. 
We also describe environmental scenarios where specific management methods may be more or less effective. Our results enable better niche-based estimates of local reinvasion pressure, which can improve restoration efficacy and efficiency by informing site selection and optimal management. PMID:23967212
Takeuchi, Wataru; Suzuki, Atsuro; Shiga, Tohru; Kubo, Naoki; Morimoto, Yuichi; Ueno, Yuichiro; Kobashi, Keiji; Umegaki, Kikuo; Tamaki, Nagara
2016-12-01
A brain single-photon emission computed tomography (SPECT) system using cadmium telluride (CdTe) solid-state detectors was previously developed. This CdTe-SPECT system is suitable for simultaneous dual-radionuclide imaging due to its fine energy resolution (6.6 %). However, the problems of down-scatter and of the low-energy tail arising from the spectral characteristics of a pixelated solid-state detector must be addressed. The objective of this work was to develop a system for simultaneous Tc-99m and I-123 brain studies and to evaluate its accuracy. A scatter correction method using five energy windows (FiveEWs) was developed. The windows are Tc-lower, Tc-main, a shared sub-window serving as both Tc-upper and I-lower, I-main, and I-upper. The FiveEW method uses pre-measured responses for primary gamma rays from each radionuclide to compensate for the overestimation of scatter by the underlying triple-energy window method. Two phantom experiments and a healthy volunteer experiment were conducted using the CdTe-SPECT system. A cylindrical phantom and a six-compartment phantom (five compartments with different mixtures of Tc-99m and I-123 and one cold compartment) were scanned. The quantitative accuracy was evaluated using 18 regions of interest for each phantom. In the volunteer study, five healthy volunteers were injected with Tc-99m human serum albumin diethylene triamine pentaacetic acid (HSA-D) and scanned (single acquisition). They were then injected with I-123 N-isopropyl-4-iodoamphetamine hydrochloride (IMP) and scanned again (dual acquisition). The counts of the Tc-99m images for the single and dual acquisitions were compared. In the cylindrical phantom experiments, the percentage difference (PD) between the single and dual acquisitions was 5.7 ± 4.0 % (mean ± standard deviation). In the six-compartment phantom experiment, the PDs between measured and injected activity for Tc-99m and I-123 were 14.4 ± 11.0 and 2.3 ± 1.8 %, respectively.
In the volunteer study, the PD between the single and dual acquisitions was 4.5 ± 3.4 %. This CdTe-SPECT system using the FiveEW method can provide accurate simultaneous dual-radionuclide imaging. A solid-state detector SPECT system using the FiveEW method will permit quantitative simultaneous Tc-99m and I-123 study to become clinically applicable.
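The FiveEW correction described above builds on the triple-energy-window (TEW) idea. A minimal pure-Python sketch of the generic TEW step (not the authors' FiveEW implementation; the window widths and counts are illustrative):

```python
def tew_primary(c_main, c_lower, c_upper, w_main, w_lower, w_upper):
    """Triple-energy-window estimate of primary counts.

    Scatter inside the main window is approximated by the area of a
    trapezoid whose side heights are the count densities (counts per keV)
    in the lower and upper flanking sub-windows.
    """
    scatter = (c_lower / w_lower + c_upper / w_upper) * w_main / 2.0
    return c_main - max(scatter, 0.0)

# Illustrative numbers: a 20 keV main window flanked by 4 keV sub-windows.
primary = tew_primary(c_main=1000.0, c_lower=100.0, c_upper=50.0,
                      w_main=20.0, w_lower=4.0, w_upper=4.0)
```

The FiveEW refinement replaces part of this trapezoidal estimate with pre-measured primary responses to avoid overestimating the scatter.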
Hsu, Hsiao-Hsien Leon; Chiu, Yueh-Hsiu Mathilda; Coull, Brent A; Kloog, Itai; Schwartz, Joel; Lee, Alison; Wright, Robert O; Wright, Rosalind J
2015-11-01
The influence of particulate air pollution on respiratory health starts in utero. Fetal lung growth and structural development occurs in stages; thus, effects on postnatal respiratory disorders may differ based on timing of exposure. We implemented an innovative method to identify sensitive windows for effects of prenatal exposure to particulate matter with a diameter less than or equal to 2.5 μm (PM2.5) on children's asthma development in an urban pregnancy cohort. Analyses included 736 full-term (≥37 wk) children. Each mother's daily PM2.5 exposure was estimated over gestation using a validated, satellite-based, spatiotemporally resolved model. Using distributed lag models, we examined associations between weekly averaged PM2.5 levels over pregnancy and physician-diagnosed asthma in children by age 6 years. Effect modification by sex was also examined. Most mothers were ethnic minorities (54% Hispanic, 30% black), had 12 or fewer years of education (66%), and did not smoke in pregnancy (80%). In the sample as a whole, distributed lag models adjusting for child age, sex, and maternal factors (education, race and ethnicity, smoking, stress, atopy, prepregnancy obesity) showed that increased PM2.5 exposure levels at 16-25 weeks gestation were significantly associated with early childhood asthma development. An interaction between PM2.5 and sex was significant (P = 0.01), with sex-stratified analyses showing that the association exists only for boys. Higher prenatal PM2.5 exposure at midgestation was associated with asthma development by age 6 years in boys. Methods to better characterize vulnerable windows may provide insight into underlying mechanisms.
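Weekly averaged exposures like those fed to the distributed lag models can be built with simple bookkeeping. A minimal sketch; the function name and 40-week default are assumptions for illustration:

```python
def weekly_means(daily_pm25, weeks=40):
    """Collapse a daily exposure series into gestational-week averages,
    the predictor rows a distributed lag model would operate on."""
    out = []
    for w in range(weeks):
        chunk = daily_pm25[7 * w:7 * (w + 1)]
        if not chunk:          # series shorter than the requested weeks
            break
        out.append(sum(chunk) / len(chunk))
    return out
```

Each gestational week then contributes one lag coefficient, and the sensitive window is read off where the lag coefficients are significantly positive.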
Tsuo, S.; Langford, A.A.
1989-03-28
Unwanted build-up of the film deposited on the transparent light-transmitting window of a photochemical vacuum deposition (photo-CVD) chamber is eliminated by flowing an etchant into the part of the photolysis region in the chamber immediately adjacent the window and remote from the substrate and from the process gas inlet. The respective flows of the etchant and the process gas are balanced to confine the etchant reaction to the part of the photolysis region proximate to the window and remote from the substrate. The etchant is preferably one that etches film deposit on the window, does not etch or affect the window itself, and does not produce reaction by-products that are deleterious to either the desired film deposited on the substrate or to the photolysis reaction adjacent the substrate. 3 figs.
3. NORTHEAST SIDE, WITH A SINGLE BULLET GLASS WINDOW AND SOUTHEAST REAR WITH ENTRY DOOR. - Edwards Air Force Base, South Base Sled Track, Observation Block House, Station "O" area, east end of Sled Track, Lancaster, Los Angeles County, CA
Error-Based Design Space Windowing
NASA Technical Reports Server (NTRS)
Papila, Melih; Papila, Nilay U.; Shyy, Wei; Haftka, Raphael T.; Fitz-Coy, Norman
2002-01-01
Windowing of the design space is considered in order to reduce the bias errors due to low-order polynomial response surfaces (RS). Standard design space windowing (DSW) selects a region of interest by setting a requirement on the response level and checks it using global RS predictions over the design space. This approach, however, is vulnerable because RS modeling errors may lead to zooming in on the wrong region. The approach is modified here by introducing an eigenvalue error measure based on a point-to-point mean squared error criterion. Two examples are presented to demonstrate the benefit of the error-based DSW.
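The idea of screening a candidate window by local RS error can be caricatured in one dimension: fit a deliberately too-low-order surface, then keep only points that meet the response requirement and also have below-median squared error. This is an illustrative sketch, not the paper's eigenvalue-based measure:

```python
def fit_linear(xs, ys):
    """Least-squares line y = a + b*x (a deliberately too-low-order RS)."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

def error_based_window(xs, ys, level):
    """Keep points where the RS predicts response <= level AND the local
    squared RS error is at most the median error (bias screening)."""
    a, b = fit_linear(xs, ys)
    errs = [(y - (a + b * x)) ** 2 for x, y in zip(xs, ys)]
    med = sorted(errs)[len(errs) // 2]
    return [x for x, e in zip(xs, errs)
            if (a + b * x) <= level and e <= med]
```

On a quadratic response sampled at a few points, the screen keeps the low-response region only where the linear RS is also locally trustworthy.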
Electrochemical Stability of Li10GeP2S12 and Li7La3Zr2O12 Solid Electrolytes
Han, Fudong; Zhu, Yizhou; He, Xingfeng; ...
2016-01-21
The electrochemical stability window of a solid electrolyte is overestimated by the conventional experimental method using a Li/electrolyte/inert metal semiblocking electrode because of the limited contact area between the solid electrolyte and the inert metal. Since the battery is cycled in the overestimated stability window, decomposition of the solid electrolyte occurs at the interfaces but has been ignored as a cause of high interfacial resistances in previous studies, limiting the performance improvement of bulk-type solid-state batteries despite decades of research effort. Thus, there is an urgent need to identify the intrinsic stability window of the solid electrolyte. The thermodynamic electrochemical stability window of solid electrolytes is calculated using first principles computation methods, and an experimental method is developed to measure the intrinsic electrochemical stability window of solid electrolytes using a Li/electrolyte/electrolyte-carbon cell. The most promising solid electrolytes, Li10GeP2S12 and cubic Li-garnet Li7La3Zr2O12, are chosen as the model materials for sulfide and oxide solid electrolytes, respectively. The results provide valuable insights to address the most challenging problems of interfacial stability and resistance in high-performance solid-state batteries.
An Efficient Adaptive Window Size Selection Method for Improving Spectrogram Visualization.
Nisar, Shibli; Khan, Omar Usman; Tariq, Muhammad
2016-01-01
Short Time Fourier Transform (STFT) is an important technique for the time-frequency analysis of a time-varying signal. The basic approach involves applying a Fast Fourier Transform (FFT) to the signal multiplied by an appropriate window function with fixed resolution. Selecting an appropriate window size is difficult when no background information about the input signal is known. In this paper, a novel empirical model is proposed that adaptively adjusts the window size for a narrow-band signal using a spectrum sensing technique. For wide-band signals, where a fixed time-frequency resolution is undesirable, the approach adopts the constant Q transform (CQT). Unlike the STFT, the CQT provides a varying time-frequency resolution, which results in high spectral resolution at low frequencies and high temporal resolution at high frequencies. In this paper, a simple but effective framework for switching between the STFT and the CQT is provided. The proposed method also allows the dynamic construction of a filter bank according to user-defined parameters, which helps reduce redundant entries in the filter bank. Results show that the proposed method not only improves the spectrogram visualization but also reduces the computation cost, selecting the appropriate window length 87.71% of the time.
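The trade-off described here can be sketched as two window-length rules: a fixed STFT length chosen from the finest tone spacing a narrow-band signal requires, and CQT-style lengths that shrink with frequency. A hedged sketch only; the `margin` and `q` values are assumptions, not the paper's empirical model:

```python
import math

def stft_window_length(fs, min_tone_spacing_hz, margin=2.0):
    """Pick an FFT length whose bin width fs/N is `margin` times finer
    than the closest expected tone spacing, rounded up to a power of two."""
    n_min = margin * fs / min_tone_spacing_hz
    return 2 ** math.ceil(math.log2(n_min))

def cqt_window_lengths(fs, f_min, octaves, bins_per_octave=12, q=17.0):
    """CQT-style lengths N_k = q * fs / f_k: long windows (fine spectral
    resolution) at low frequencies, short ones at high frequencies."""
    lengths = []
    for k in range(octaves * bins_per_octave):
        f_k = f_min * 2 ** (k / bins_per_octave)
        lengths.append(int(round(q * fs / f_k)))
    return lengths
```

A switching framework would apply the first rule when spectrum sensing reports a narrow-band input and fall back to the per-bin CQT lengths otherwise.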
Liu, Yu; Xia, Jun; Shi, Chun-Xiang; Hong, Yang
2009-01-01
The crowning objective of this research was to identify a better cloud classification method to upgrade the current window-based clustering algorithm used operationally for China’s first operational geostationary meteorological satellite FengYun-2C (FY-2C) data. First, the capabilities of six widely-used Artificial Neural Network (ANN) methods are analyzed, together with the comparison of two other methods: Principal Component Analysis (PCA) and a Support Vector Machine (SVM), using 2864 cloud samples manually collected by meteorologists in June, July, and August in 2007 from three FY-2C channel (IR1, 10.3–11.3 μm; IR2, 11.5–12.5 μm and WV 6.3–7.6 μm) imagery. The result shows that: (1) ANN approaches, in general, outperformed the PCA and the SVM given sufficient training samples and (2) among the six ANN networks, higher cloud classification accuracy was obtained with the Self-Organizing Map (SOM) and Probabilistic Neural Network (PNN). Second, to compare the ANN methods to the present FY-2C operational algorithm, this study implemented SOM, one of the best ANN networks identified in this study, as an automated cloud classification system for the FY-2C multi-channel data. It shows that the SOM method improved the results greatly, not only in pixel-level accuracy but also in cloud patch-level classification, by more accurately identifying cloud types such as cumulonimbus, cirrus and clouds at high latitudes. Findings of this study suggest that the ANN-based classifiers, in particular the SOM, can be potentially used as an improved Automated Cloud Classification Algorithm to upgrade the current window-based clustering method for the FY-2C operational products. PMID:22346714
Temporally rendered automatic cloud extraction (TRACE) system
NASA Astrophysics Data System (ADS)
Bodrero, Dennis M.; Yale, James G.; Davis, Roger E.; Rollins, John M.
1999-10-01
Smoke/obscurant testing requires that 2D cloud extent be extracted from visible and thermal imagery. These data are used alone or in combination with 2D data from other aspects to make 3D calculations of cloud properties, including dimensions, volume, centroid, travel, and uniformity. Determining cloud extent from imagery has historically been a time-consuming manual process. To reduce time and cost associated with smoke/obscurant data processing, automated methods to extract cloud extent from imagery were investigated. The TRACE system described in this paper was developed and implemented at U.S. Army Dugway Proving Ground, UT by the Science and Technology Corporation--Acuity Imaging Incorporated team with Small Business Innovation Research funding. TRACE uses dynamic background subtraction and a 3D fast Fourier transform as primary methods to discriminate the smoke/obscurant cloud from the background. TRACE has been designed to run on a PC-based platform using Windows. The PC-Windows environment was chosen for portability, to give TRACE the maximum flexibility in terms of its interaction with peripheral hardware devices such as video capture boards, removable media drives, network cards, and digital video interfaces. Video for Windows provides all of the necessary tools for the development of the video capture utility in TRACE and allows for interchangeability of video capture boards without any software changes. TRACE is designed to take advantage of future upgrades in all aspects of its component hardware. A comparison of cloud extent determined by TRACE with the manual method is included in this paper.
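Dynamic background subtraction of the kind TRACE uses as a first discrimination step can be sketched with an exponentially weighted running background. Frames are shortened to a few pixels here, and `alpha` and `thresh` are illustrative values, not TRACE's actual parameters:

```python
def extract_cloud(frames, alpha=0.05, thresh=10.0):
    """Dynamic background subtraction: maintain an exponentially weighted
    running background and flag pixels that differ from it by > thresh."""
    bg = list(frames[0])          # initialize background from the first frame
    masks = []
    for frame in frames[1:]:
        masks.append([abs(p - b) > thresh for p, b in zip(frame, bg)])
        bg = [(1 - alpha) * b + alpha * p for b, p in zip(bg, frame)]
    return masks

# Two pixels: the second jumps by ~50 counts in later frames (a "cloud").
frames = [[100, 100], [100, 150], [101, 152]]
masks = extract_cloud(frames)
```

The slow `alpha` update lets the background track gradual illumination drift while the transient cloud stays above threshold.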
Data in support of energy performance of double-glazed windows.
Shakouri, Mahmoud; Banihashemi, Saeed
2016-06-01
This paper provides the data used in a research project to propose a new simplified windows rating system based on saved annual energy ("Developing an empirical predictive energy-rating model for windows by using Artificial Neural Network" (Shakouri Hassanabadi and Banihashemi Namini, 2012) [1], "Climatic, parametric and non-parametric analysis of energy performance of double-glazed windows in different climates" (Banihashemi et al., 2015) [2]). A full factorial simulation study was conducted to evaluate the performance of 26 different types of windows in a four-story residential building. In order to generalize the results, the selected windows were tested in four climates of cold, tropical, temperate, and hot and arid; and four different main orientations of North, West, South and East. The accompanied datasets include the annual saved cooling and heating energy in different climates and orientations by using the selected windows. Moreover, a complete dataset is provided that includes the specifications of 26 windows, climate data, month, and orientation of the window. This dataset can be used to make predictive models for energy efficiency assessment of double glazed windows.
NASA Technical Reports Server (NTRS)
Collier, Mark D.; Killough, Ronnie; Martin, Nancy L.
1990-01-01
NASA is currently using a set of applications called the Display Builder and the Display Manager. They run on Concurrent systems and depend heavily on the Graphical Kernel System (GKS). At this time, however, these two applications would more appropriately be developed in X Windows, in which low-level X calls are used for all actual text and graphics display and a standard widget set (such as Motif) is used for the user interface. Use of X Windows will increase performance, improve the user interface, enhance portability, and improve reliability. A prototype of an X Window/Motif-based Display Manager provides the following advantages over a GKS-based application: improved performance, because using low-level X calls makes the display of graphics and text more efficient; an improved user interface through Motif; improved portability by operating on both Concurrent and Sun workstations; and improved reliability.
Casing window milling with abrasive fluid jet
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vestavik, O.M.; Fidtje, T.H.; Faure, A.M.
1995-12-31
Methods for through-tubing re-entry drilling of multilateral wells have a large potential for increasing hydrocarbon production and total recovery. One of the bottlenecks of this technology is initiation of the side-track by milling a window in the casing downhole. A new approach to this problem has been investigated in a joint industry project. An experimental set-up was built for milling a 4 inch window in a 7 inch steel casing at the surface in the laboratory. A specially designed bit developed at RIF using abrasive jet cutting technology was used for the window milling. The bit has an abrasive jet beam which is always directed in the desired side-track direction, even when the bit is rotating uniformly. The bit performs the milling with a combined mechanical and hydraulic jet action. The method has been successfully demonstrated. The experiments have shown that the window milling can be performed with very low WOB and torque, and that only small side forces are required to perform the operation. Casing milling has been performed without a whipstock; a cement plug was the only support for the tool. The tests indicate that milling operations can be performed more efficiently, with less time and cost, than with conventional techniques. However, the method still needs some development of the downhole motor for coiled tubing applications. The method can be used both for milling and drilling, giving the advantages of improved rate of penetration, improved bit life, and increased horizontal reach. The method is planned to be demonstrated downhole in the near future.
A simple method to incorporate water vapor absorption in the 15 microns remote temperature sounding
NASA Technical Reports Server (NTRS)
Dallu, G.; Prabhakara, C.; Conhath, B. J.
1975-01-01
The water vapor absorption in the 15 micron CO2 band, which can affect the remotely sensed temperatures near the surface, is estimated with the help of an empirical method. This method is based on the differential absorption properties of the water vapor in the 11-13 micron window region and does not require a detailed knowledge of the water vapor profile. With this approach, Nimbus 4 IRIS radiance measurements are inverted to obtain temperature profiles. These calculated profiles agree with radiosonde data to within about 2 C.
NASA Astrophysics Data System (ADS)
Whitmore, Alexander Jason
Concentrating solar power systems are currently the predominant solar power technology for generating electricity at the utility scale. The central receiver system, which is a concentrating solar power system, uses a field of mirrors to concentrate solar radiation onto a receiver where a working fluid is heated to drive a turbine. Current central receiver systems operate on a Rankine cycle, which has a large demand for cooling water. This demand for water presents a challenge for the current central receiver systems as the ideal locations for solar power plants have arid climates. An alternative to the current receiver technology is the small particle receiver. The small particle receiver has the potential to produce working fluid temperatures suitable for use in a Brayton cycle which can be more efficient when pressurized to 0.5 MPa. Using a fused quartz window allows solar energy into the receiver while maintaining a pressurized small particle receiver. In this thesis, a detailed numerical investigation for a spectral, three dimensional, cylindrical glass window for a small particle receiver was performed. The window is 1.7 meters in diameter and 0.0254 meters thick. There are three Monte Carlo Ray Trace codes used within this research. The first MCRT code, MIRVAL, was developed by Sandia National Laboratory and modified by a fellow San Diego State University colleague Murat Mecit. This code produces the solar rays on the exterior surface of the window. The second MCRT code was developed by Steve Ruther and Pablo Del Campo. This code models the small particle receiver, which creates the infrared spectral direction flux on the interior surface of the window used in this work. The third MCRT, developed for this work, is used to model radiation heat transfer within the window itself and is coupled to an energy equation solver to produce a temperature distribution. The MCRT program provides a source term to the energy equation. 
This, in turn, produces a new temperature field for the MCRT program; together the equations are solved iteratively. These iterations repeat until convergence is reached for a steady-state temperature field. The energy equation was solved using a finite volume method. The window's thermal conductivity is modeled as a function of temperature. This thermal model is used to investigate the effects of different materials, receiver geometries, interior convection coefficients and exterior convection coefficients. To prevent devitrification and the ultimate failure of the window, the window needs to stay below the devitrification temperature of the material. In addition, the temperature gradients within the window need to be kept to a minimum to prevent thermal stresses. A San Diego State University colleague, E-Fann Saung, uses these temperature maps to ensure that the mounting of the window does not produce thermal stresses which can cause cracking in the brittle fused quartz. The simulations in this thesis show that window temperatures stay below the devitrification temperature when there are cooling jets on both surfaces of the window. Natural convection on the exterior window surface was explored, but it does not provide adequate cooling; therefore forced convection is required. Due to the low thermal conductivity of the window, the edge-mounting thermal boundary condition has little effect on the maximum temperature of the window. The simulations also showed that the window absorbed less than 1% of the incoming solar flux but closer to 20% of the infrared radiation emitted by the receiver. The main source of absorbed power is located directly at the interior surface of the window, where the infrared radiation is absorbed.
The geometry of the receiver has a large impact on the amount of emitted power that reaches the interior surface of the window, and using a conical receiver dramatically reduced the receiver's infrared flux on the window. The importance of internal emission is explored within this research. Internal emission produces a more even emission field throughout the receiver than applying surface emission only. Because most of the receiver's infrared re-radiation is absorbed right at the interior surface, the surface-emission-only approximation produces lower maximum temperatures.
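The iterate-until-steady-state coupling described above can be caricatured in one dimension: freeze the temperature-dependent conductivity, re-solve the conduction equation with a volumetric source standing in for the MCRT term, and repeat. A sketch only; the geometry, material properties and source value are illustrative, not the thesis model:

```python
def solve_window_slab(n=21, thickness=0.0254, t_left=300.0, t_right=900.0,
                      q=2.0e6, k0=1.4, c=5.0e-4, tol=1e-8):
    """1-D steady conduction through a slab with k(T) = k0*(1 + c*T) and a
    uniform volumetric source q (W/m^3), solved by fixed-point iteration:
    freeze k at the current temperature field, sweep, repeat to convergence."""
    dx = thickness / (n - 1)
    # start from a linear profile between the fixed boundary temperatures
    T = [t_left + (t_right - t_left) * i / (n - 1) for i in range(n)]
    for _ in range(5000):
        k = [k0 * (1 + c * t) for t in T]
        new = T[:]
        for i in range(1, n - 1):
            kw = 0.5 * (k[i - 1] + k[i])   # face conductivities
            ke = 0.5 * (k[i] + k[i + 1])
            new[i] = (kw * new[i - 1] + ke * T[i + 1] + q * dx * dx) / (kw + ke)
        if max(abs(a - b) for a, b in zip(new, T)) < tol:
            return new
        T = new
    return T
```

In the thesis the source term comes from the MCRT absorption field rather than a constant `q`, but the outer fixed-point structure is the same.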
Calculating the Effect of External Shading on the Solar Heat Gain Coefficient of Windows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kohler, Christian; Shukla, Yash; Rawal, Rajan
Current prescriptive building codes have limited ways to account for the effect of solar shading, such as overhangs and awnings, on window solar heat gains. We propose two new indicators, the adjusted Solar Heat Gain Coefficient (aSHGC) which accounts for external shading while calculating the SHGC of a window, and a weighted SHGC (SHGCw) which provides a seasonal SHGC weighted by solar intensity. We demonstrate a method to calculate these indices using existing tools combined with additional calculations. The method is demonstrated by calculating the effect of an awning on a clear double glazing in New Delhi.
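The weighted SHGC is a straightforward intensity-weighted average, and a crude stand-in for the shading adjustment can be written alongside it. The `adjusted_shgc` scaling is an assumption for illustration only; the paper's aSHGC calculation is more involved:

```python
def weighted_shgc(shgc_by_period, irradiance_by_period):
    """Seasonal SHGC_w: SHGC values weighted by incident solar intensity."""
    total = sum(irradiance_by_period)
    return sum(s, ) if False else sum(
        s * i for s, i in zip(shgc_by_period, irradiance_by_period)) / total

def adjusted_shgc(shgc, sunlit_fraction):
    """Crude aSHGC sketch: scale the unshaded SHGC by the fraction of the
    glazing the external shade leaves sunlit (illustrative assumption;
    real tools also track the diffuse component and geometry)."""
    return shgc * sunlit_fraction
```

For example, a glazing with SHGC 0.5 in a low-sun period and 0.7 in a high-sun period, weighted 100:300 by irradiance, yields SHGC_w = 0.65.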
Dobramysl, U; Holcman, D
2018-02-15
Is it possible to recover the position of a source from the steady-state fluxes of Brownian particles to small absorbing windows located on the boundary of a domain? To address this question, we develop a numerical procedure to avoid tracking Brownian trajectories in the entire infinite space. Instead, we generate particles near the absorbing windows, computed from the analytical expression of the exit probability. When the Brownian particles are generated by a steady-state gradient at a single point, we compute asymptotically the fluxes to small absorbing holes distributed on the boundary of half-space and on a disk in two dimensions, which agree with stochastic simulations. We also derive an expression for the splitting probability between small windows using the matched asymptotic method. Finally, when there are more than two small absorbing windows, we show how to reconstruct the position of the source from the diffusion fluxes. The present approach provides a computational first principle for the mechanism of sensing a gradient of diffusing particles, a ubiquitous problem in cell biology.
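The forward problem here, steady fluxes of Brownian particles into small absorbing windows on the boundary, can be sketched with a naive walk-based Monte Carlo (none of the paper's clever particle generation near the windows; all parameters are illustrative):

```python
import math
import random

def window_fluxes(source, windows, walkers=800, step=0.05, max_steps=3000):
    """Count absorptions of 2-D random walks, launched from `source` in the
    upper half-plane, into each absorbing interval in `windows` on y = 0."""
    rng = random.Random(0)                 # fixed seed for repeatability
    counts = [0] * len(windows)
    for _ in range(walkers):
        x, y = source
        for _ in range(max_steps):
            ang = rng.uniform(0.0, 2.0 * math.pi)
            x += step * math.cos(ang)
            y += step * math.sin(ang)
            if y <= 0.0:                   # hit the boundary
                for i, (lo, hi) in enumerate(windows):
                    if lo <= x <= hi:
                        counts[i] += 1
                break                      # absorbed or lost between windows
    return counts

# The window sitting under the source collects far more flux, so comparing
# the fluxes points back toward the source position.
counts = window_fluxes((1.0, 0.5), [(-1.5, -0.5), (0.5, 1.5)])
```

Inverting the flux ratio for the source position is the paper's contribution; this sketch only shows why the fluxes carry that information.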
Photorefractive-based adaptive optical windows
NASA Astrophysics Data System (ADS)
Liu, Yuexin; Yang, Yi; Wang, Bo; Fu, John Y.; Yin, Shizhuo; Guo, Ruyan; Yu, Francis T.
2004-10-01
Optical windows have been widely used in optical spectrographic processing systems. In this paper, various window profiles, such as rectangular, triangular, Hamming, Hanning, and Blackman, are investigated in detail with regard to their effect on the generated spectrograms, such as the joint time-frequency resolution ΔtΔω and the sidelobe amplitude attenuation. All of these windows can be synthesized in a photorefractive crystal by an angular-multiplexing holographic technique, which renders the system more adaptive. Experimental results are provided.
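Most of the listed profiles are members of the cosine-sum family, so a single synthesis routine covers them. A minimal sketch using the standard textbook coefficients:

```python
import math

def cosine_sum_window(coeffs, n):
    """Synthesize a symmetric cosine-sum window:
    w[k] = sum_m a_m * cos(2*pi*m*k / (n - 1))."""
    return [sum(a * math.cos(2 * math.pi * m * k / (n - 1))
                for m, a in enumerate(coeffs))
            for k in range(n)]

HANNING  = [0.5, -0.5]
HAMMING  = [0.54, -0.46]
BLACKMAN = [0.42, -0.5, 0.08]

w = cosine_sum_window(HANNING, 65)   # zero at the edges, unity at the center
```

More cosine terms (Blackman vs. Hanning) buy lower sidelobes at the cost of a wider main lobe, which is exactly the spectrogram trade-off the abstract examines.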
Wimberley, Catriona J; Fischer, Kristina; Reilhac, Anthonin; Pichler, Bernd J; Gregoire, Marie Claude
2014-10-01
The partial saturation approach (PSA) is a simple, single-injection experimental protocol that estimates both the available receptor density B_avail and the apparent dissociation constant appK_D without blood sampling. This makes it ideal for longitudinal studies of neurodegenerative diseases in the rodent. The aim of this study was to increase the range and applicability of the PSA by developing a data-driven strategy for determining reliable regional estimates of receptor density (B_avail) and in vivo affinity (1/appK_D), and to validate the strategy using a simulation model. The data-driven method uses a time window guided by the dynamic equilibrium state of the system, as opposed to a static time window. To test the method, simulations of partial saturation experiments were generated and validated against experimental data. The experimental conditions simulated included a range of receptor occupancy levels and three different B_avail and appK_D values to mimic disease states. The effect of using a reference region, and of typical PET noise, on the stability and accuracy of the estimates was also investigated. The investigations showed that the parameter estimates in a simulated healthy mouse, using the data-driven method, were within 10-30% of the simulated input for the range of occupancy levels simulated. Throughout all experimental conditions simulated, the accuracy and robustness of the estimates using the data-driven method were much improved over the typical method of using a static time window, especially at low receptor occupancy levels. Introducing a reference region caused a bias of approximately 10% over the range of occupancy levels. Based on extensive simulated experimental conditions, it was shown that the data-driven method provides accurate and precise estimates of B_avail and appK_D for a broader range of conditions compared to the original method.
Opto-mechanical design of optical window for aero-optics effect simulation instruments
NASA Astrophysics Data System (ADS)
Wang, Guo-ming; Dong, Dengfeng; Zhou, Weihu; Ming, Xing; Zhang, Yan
2016-10-01
A complete theory for the opto-mechanical design of such optical windows is established in this paper, making the design process more rigorous. The design comprises three steps. First, a universal model of the aerodynamic environment is established based on Computational Fluid Dynamics, from which the pneumatic pressure distribution and temperature data on the optical window surface are obtained for aircraft flight at 5-30 km altitude, 0.5-3 Ma speed, and 0-30° angle of attack. The temperature and pressure distributions corresponding to the maximum constraint are selected as the initial external conditions on the optical window surface. Second, the optical window and its mechanical structure are designed; a mechanical structure is devised that meets the requirements of security and tightness. Finally, a rigorous analysis and evaluation of the designed opto-mechanical structure is given, in two parts. First, a Fluid-Solid-Heat coupled model is built based on finite element analysis; the deformation of the glass and structure obtained from this model is used to assess the feasibility of the designed optical window and its ancillary structure. Second, the new optical surface is fitted with Zernike polynomials according to the deformation of the optical window surface, which allows the impact of the window deformation on the imaging quality of the spectral camera to be evaluated.
Removing damped sinusoidal vibrations in adaptive optics systems using a DFT-based estimation method
NASA Astrophysics Data System (ADS)
Kania, Dariusz
2017-06-01
The problem of vibration rejection in adaptive optics systems is still present in the literature. These undesirable signals emerge from shaking of the system structure, the tracking process, etc., and they are usually damped sinusoidal signals. There are some mechanical solutions to reduce these signals, but they are not very effective. Among software solutions, adaptive methods are very popular. An AVC (Adaptive Vibration Cancellation) method has been presented and developed in recent years. The method is based on the estimation of three vibration parameters: the values of frequency, amplitude, and phase are essential to produce and adjust a proper signal to reduce or eliminate the vibration signals. This paper presents a fast (below 10 ms) and accurate estimation method for the frequency, amplitude, and phase of a multifrequency signal that can be used in the AVC method to increase AO system performance. The method's accuracy depends on several parameters: CiR - the number of signal periods in a measurement window, N - the number of samples in the FFT procedure, H - the time window order, SNR, THD, b - the number of A/D converter bits in a real-time system, γ - the damping ratio of the tested signal, and φ - the phase of the tested signal. Systematic errors increase when N, CiR, or H decrease and when γ increases. The value of the systematic error for γ = 0.1%, CiR = 1.1, and N = 32 is approximately 10^-4 Hz/Hz. This paper focuses on the systematic errors and on the effect of the signal phase and of the value of γ on the results.
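As a rough illustration of the window-based spectral estimation underlying such methods, the sketch below estimates the frequency and amplitude of a sinusoid from a Hann-windowed FFT peak, refined by parabolic interpolation of the log-magnitude spectrum. This is a generic textbook estimator, not the paper's H-order-window method, and the signal parameters are made up.

```python
import numpy as np

def estimate_freq_amp(x, fs):
    """Estimate frequency and amplitude of the dominant sinusoid from a
    Hann-windowed FFT peak, refining the peak bin by parabolic
    interpolation of the log-magnitude spectrum."""
    N = len(x)
    X = np.fft.rfft(x * np.hanning(N))
    k = int(np.argmax(np.abs(X[1:]))) + 1        # peak bin, skipping DC
    a, b, c = np.log(np.abs(X[k - 1:k + 2]))     # three bins around peak
    delta = 0.5 * (a - c) / (a - 2 * b + c)      # fractional-bin offset
    freq = (k + delta) * fs / N
    # Hann coherent gain is ~0.5, so rescale the (scalloped) peak magnitude
    amp = 2.0 * np.abs(X[k]) / (0.5 * N)
    return freq, amp

fs, f0 = 1000.0, 97.3                            # hypothetical test signal
t = np.arange(2048) / fs
f_est, a_est = estimate_freq_amp(1.5 * np.cos(2 * np.pi * f0 * t + 0.4), fs)
```

The amplitude estimate is only approximate because of Hann scalloping loss; the paper's point is precisely that more refined windowed estimators reduce such systematic errors.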
Ionospheric gravity wave measurements with the USU dynasonde
NASA Technical Reports Server (NTRS)
Berkey, Frank T.; Deng, Jun Yuan
1992-01-01
A method for the measurement of ionospheric Gravity Waves (GW) using the USU Dynasonde is outlined. This method consists of a series of individual procedures, which include functions for data acquisition, adaptive scaling, polarization discrimination, interpolation and extrapolation, digital filtering, windowing, spectrum analysis, GW detection, and graphics display. Concepts of system theory are applied to treat the ionosphere as a system. An adaptive ionogram scaling method was developed for automatically extracting ionogram echo traces from noisy raw sounding data. The method uses the well known Least Mean Square (LMS) algorithm to form a stochastic optimal estimate of the echo trace, which is then used to control a moving window. The window tracks the echo trace, simultaneously eliminating the noise and interference. Experimental results show that the proposed method functions as designed. Case studies which extract GW from ionosonde measurements were carried out using the techniques described. Geophysically significant events were detected and the resultant processed results are illustrated graphically. This method was also developed with real-time implementation in mind.
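A minimal sketch of the LMS idea used here — predicting the next trace value from recent samples so the prediction can steer a moving search window — is given below. This is a generic normalized-LMS one-step predictor on a synthetic signal, not the USU Dynasonde implementation.

```python
import numpy as np

def lms_track(noisy, mu=0.05, order=4):
    """Normalized-LMS one-step predictor: each trace sample is predicted
    from the previous `order` samples.  The prediction smooths noise and
    could be used to steer a moving search window along an echo trace."""
    w = np.zeros(order)
    est = np.zeros(len(noisy))
    for n in range(order, len(noisy)):
        x = noisy[n - order:n][::-1]           # most recent sample first
        est[n] = w @ x                          # predicted trace value
        e = noisy[n] - est[n]                   # prediction error
        w += 2 * mu * e * x / (x @ x + 1e-12)   # normalized LMS update
    return est

rng = np.random.default_rng(0)
trace = np.sin(np.linspace(0.0, 3.0, 500))      # slowly varying "echo trace"
noisy = trace + 0.1 * rng.standard_normal(500)
est = lms_track(noisy)
```

After the initial convergence transient, the predicted trace follows the underlying signal more closely than the raw noisy samples, which is what lets the moving window lock onto the echo trace.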
A Hybrid Approach for CpG Island Detection in the Human Genome.
Yang, Cheng-Hong; Lin, Yu-Da; Chiang, Yi-Cheng; Chuang, Li-Yeh
2016-01-01
CpG islands have been demonstrated to influence local chromatin structures and simplify the regulation of gene activity. However, the accurate and rapid determination of CpG islands for whole DNA sequences remains experimentally and computationally challenging. A novel procedure is proposed to detect CpG islands by combining clustering technology with a sliding-window method (PSO-based). Clustering technology is used to detect the locations of all possible CpG islands and process the data, effectively obviating the need for extensive and unnecessary processing of DNA fragments and improving the efficiency of the sliding-window based particle swarm optimization (PSO) search. This proposed approach, named ClusterPSO, provides versatile and highly sensitive detection of CpG islands in the human genome. In addition, the detection efficiency of ClusterPSO is compared with eight CpG island detection methods in the human genome. In a comparison of detection efficiency for CpG islands in the human genome, including sensitivity, specificity, accuracy, performance coefficient (PC), and correlation coefficient (CC), ClusterPSO revealed superior detection ability among all of the tested methods. Moreover, the combination of clustering technology and the PSO method can successfully overcome their respective drawbacks while maintaining their advantages. Thus, clustering technology can be hybridized with an optimization algorithm to optimize CpG island detection. The prediction accuracy of ClusterPSO was quite high, indicating that the combination of CpGcluster and PSO has several advantages over CpGcluster and PSO alone. In addition, ClusterPSO significantly reduced implementation time.
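The per-window score that any sliding-window CpG search evaluates can be sketched with the classic Gardiner-Garden & Frommer criteria (GC content ≥ 0.5 and observed/expected CpG ratio ≥ 0.6). This is a generic illustration of the window test, not the ClusterPSO algorithm itself, and the 200-bp window size is the conventional default.

```python
def cpg_window_stats(seq, start, size=200):
    """GC content and observed/expected CpG ratio for one window of a
    DNA sequence (the classic per-window CpG-island criteria)."""
    win = seq[start:start + size].upper()
    c, g = win.count('C'), win.count('G')
    cpg = win.count('CG')                      # non-overlapping CpG count
    gc_content = (c + g) / len(win)
    # observed/expected CpG; guard against windows lacking C or G
    obs_exp = cpg * len(win) / (c * g) if c and g else 0.0
    return gc_content, obs_exp

def is_cpg_island_window(seq, start, size=200):
    """Gardiner-Garden & Frommer test for a single window."""
    gc, oe = cpg_window_stats(seq, start, size)
    return gc >= 0.5 and oe >= 0.6
```

A sliding-window search slides `start` along the sequence and merges consecutive passing windows; ClusterPSO's contribution is to restrict that search to cluster-nominated regions and tune it with PSO.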
Three-dimensional laser window formation for industrial application
NASA Technical Reports Server (NTRS)
Verhoff, Vincent G.; Kowalski, David
1993-01-01
The NASA Lewis Research Center has developed and implemented a unique process for forming flawless three-dimensional, compound-curvature laser windows to extreme accuracies. These windows represent an integral component of specialized nonintrusive laser data acquisition systems that are used in a variety of compressor and turbine research testing facilities. These windows are molded to the flow surface profile of turbine and compressor casings and are required to withstand extremely high pressures and temperatures. This method of glass formation could also be used to form compound-curvature mirrors that would require little polishing and for a variety of industrial applications, including research view ports for testing devices and view ports for factory machines with compound-curvature casings. Currently, sodium-alumino-silicate glass is recommended for three-dimensional laser windows because of its high strength due to chemical strengthening and its optical clarity. This paper discusses the main aspects of three-dimensional laser window formation. It focuses on the unique methodology and the peculiarities that are associated with the formation of these windows.
Effect of window length on performance of the elbow-joint angle prediction based on electromyography
NASA Astrophysics Data System (ADS)
Triwiyanto; Wahyunggoro, Oyas; Adi Nugroho, Hanung; Herianto
2017-05-01
High performance of elbow-joint angle prediction is essential for the development of devices based on electromyography (EMG) control. The performance of the prediction depends on feature extraction parameters such as the window length. In this paper, we evaluated the effect of the window length on the performance of elbow-joint angle prediction. The prediction algorithm consists of a zero-crossing feature extraction and a second-order Butterworth low-pass filter. The feature was extracted from the EMG signal for various window lengths. The EMG signal was collected from the biceps muscle while the elbow was moved in flexion and extension. The subject performed the elbow motion while holding a 1-kg load and moved the elbow over different periods (12 seconds, 8 seconds, and 6 seconds). The results indicated that the window length affected the performance of the prediction. The 250-sample window length yielded the best performance of the prediction algorithm, with (mean±SD) root mean square error = 5.68%±1.53% and Pearson's correlation = 0.99±0.0059.
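The zero-crossing feature used here can be sketched as follows: count, per window, the sign changes whose amplitude step exceeds a noise threshold. The 250-sample window is the length the study found best; the threshold value is an assumed noise guard, not a parameter from the paper.

```python
import numpy as np

def zero_crossings(emg, window=250, threshold=0.01):
    """Zero-crossing count of an EMG signal over non-overlapping
    windows: a crossing is a sign change between consecutive samples
    whose amplitude step exceeds the noise threshold."""
    counts = []
    for i in range(0, len(emg) - window + 1, window):
        seg = emg[i:i + window]
        sb = np.signbit(seg)
        sign_change = sb[1:] != sb[:-1]               # sign flipped?
        big_enough = np.abs(np.diff(seg)) > threshold  # above noise floor?
        counts.append(int(np.sum(sign_change & big_enough)))
    return np.array(counts)
```

The resulting per-window counts would then be low-pass filtered (the paper uses a second-order Butterworth filter) and mapped to the elbow angle.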
Sliding Window-Based Region of Interest Extraction for Finger Vein Images
Yang, Lu; Yang, Gongping; Yin, Yilong; Xiao, Rongyang
2013-01-01
Region of Interest (ROI) extraction is a crucial step in an automatic finger vein recognition system. The aim of ROI extraction is to decide which part of the image is suitable for finger vein feature extraction. This paper proposes a finger vein ROI extraction method which is robust to finger displacement and rotation. First, we determine the middle line of the finger, which is used to correct the image skew. Then, a sliding window is used to detect the phalangeal joints and thereby ascertain the height of the ROI. Last, for the corrected image of the determined height, we obtain the ROI by using the internal tangents of the finger edges as the left and right boundaries. The experimental results show that the proposed method can extract the ROI more accurately and effectively than other methods, and thus improve the performance of the finger vein identification system. In addition, to acquire high quality finger vein images during the capture process, we propose eight criteria for finger vein capture from different aspects; these criteria should be helpful to some extent for finger vein capture. PMID:23507824
Artificial Intelligence Methods Applied to Parameter Detection of Atrial Fibrillation
NASA Astrophysics Data System (ADS)
Arotaritei, D.; Rotariu, C.
2015-09-01
In this paper we present a novel method to detect atrial fibrillation (AF) based on statistical descriptors and a hybrid neuro-fuzzy and crisp system. The inference of the system produces rules of if-then-else type that are extracted to construct a binary decision system: normal or atrial fibrillation. We use TPR (Turning Point Ratio), SE (Shannon Entropy), and RMSSD (Root Mean Square of Successive Differences), along with a new descriptor, Teager-Kaiser energy, in order to improve the accuracy of detection. The descriptors are calculated over a sliding window that produces a very large number of vectors (a massive dataset) used by the classifier. The length of the window is a crisp descriptor, while the rest of the descriptors are of interval-valued type. The parameters of the hybrid system are adapted using a Genetic Algorithm (GA) with a single-objective fitness target: the highest values of sensitivity and specificity. The rules are extracted and become part of the decision system. The proposed method was tested using the Physionet MIT-BIH Atrial Fibrillation Database, and the experimental results revealed a good accuracy of AF detection in terms of sensitivity and specificity (above 90%).
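Two of the descriptors, RMSSD and the Turning Point Ratio, are straightforward to compute from a window of RR intervals. The sketch below gives the plain scalar definitions, not the interval-valued variants the paper feeds to the neuro-fuzzy classifier.

```python
import numpy as np

def rmssd(rr):
    """Root mean square of successive RR-interval differences; elevated
    for the irregular rhythm of atrial fibrillation."""
    d = np.diff(rr)
    return float(np.sqrt(np.mean(d ** 2)))

def turning_point_ratio(rr):
    """Fraction of interior points that are local extrema (turning
    points); close to 2/3 for a random sequence, lower for a regular
    rhythm."""
    a, b, c = rr[:-2], rr[1:-1], rr[2:]
    turning = ((b > a) & (b > c)) | ((b < a) & (b < c))
    return float(np.mean(turning))
```

In the paper these values are computed over a sliding window of beats, so each window yields one descriptor vector for the classifier.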
Diffraction leveraged modulation of X-ray pulses using MEMS-based X-ray optics
Lopez, Daniel; Shenoy, Gopal; Wang, Jin; Walko, Donald A.; Jung, Il-Woong; Mukhopadhyay, Deepkishore
2016-08-09
A method and apparatus are provided for implementing Bragg-diffraction leveraged modulation of X-ray pulses using MicroElectroMechanical systems (MEMS) based diffractive optics. An oscillating crystalline MEMS device generates a controllable time-window for diffraction of the incident X-ray radiation. The Bragg-diffraction leveraged modulation of X-ray pulses includes isolating a particular pulse, spatially separating individual pulses, and spreading a single pulse from an X-ray pulse-train.
Off-Line Quality Control In Integrated Circuit Fabrication Using Experimental Design
NASA Astrophysics Data System (ADS)
Phadke, M. S.; Kackar, R. N.; Speeney, D. V.; Grieco, M. J.
1987-04-01
Off-line quality control is a systematic method of optimizing production processes and product designs. It is widely used in Japan to produce high quality products at low cost. The method was introduced to us by Professor Genichi Taguchi, who is a Deming-award winner and a former Director of the Japanese Academy of Quality. In this paper we will i) describe the off-line quality control method, and ii) document our efforts to optimize the process for forming contact windows in 3.5 μm CMOS circuits fabricated in the Murray Hill Integrated Circuit Design Capability Laboratory. In the fabrication of integrated circuits it is critically important to produce contact windows of size very near the target dimension. Windows which are too small or too large lead to loss of yield. The off-line quality control method has improved both the process quality and productivity. The variance of the window size has been reduced by a factor of four. Also, processing time for window photolithography has been substantially reduced. The key steps of off-line quality control are: i) Identify important manipulatable process factors and their potential working levels. ii) Perform fractional factorial experiments on the process using orthogonal array designs. iii) Analyze the resulting data to determine the optimum operating levels of the factors. Both the process mean and the process variance are considered in this analysis. iv) Conduct an additional experiment to verify that the new factor levels indeed give an improvement.
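The analysis in step iii can be illustrated with Taguchi's signal-to-noise ratio for a "nominal-the-best" response such as contact-window size, which rewards factor settings whose measurements vary little relative to their mean. The window-size numbers below are hypothetical, not the Murray Hill data.

```python
import numpy as np

def sn_nominal_the_best(y):
    """Taguchi signal-to-noise ratio (dB) for a nominal-the-best
    response: 10*log10(mean^2 / sample variance).  Higher S/N means
    smaller variation relative to the mean."""
    y = np.asarray(y, dtype=float)
    return 10.0 * np.log10(y.mean() ** 2 / y.var(ddof=1))

# hypothetical window sizes (target 3.5 um) at two factor settings;
# the optimization would keep the setting with the higher S/N
before = [3.3, 3.8, 3.1, 3.9, 3.4]
after = [3.5, 3.6, 3.4, 3.5, 3.5]
```

In a full Taguchi study this ratio is computed for every run of the orthogonal-array experiment, and factor levels are chosen to maximize it while a separate adjustment factor centers the mean on target.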
Measurement of the noise power spectrum in digital x-ray detectors
NASA Astrophysics Data System (ADS)
Aufrichtig, Richard; Su, Yu; Cheng, Yu; Granfors, Paul R.
2001-06-01
The noise power spectrum, NPS, is a key imaging property of a detector and one of the principal quantities needed to compute the detective quantum efficiency. The NPS is measured by computing the Fourier transform of flat field images. Different measurement methods are investigated and evaluated with images obtained from an amorphous silicon flat-panel x-ray imaging detector. First, the influence of fixed pattern structures is minimized by appropriate background corrections. For a given data set, the effect of using different types of windowing functions is studied. Different window sizes and amounts of overlap between windows are also evaluated and compared to theoretical predictions. Results indicate that measurement error is minimized when applying overlapping Hanning windows to the raw data. Finally, it is shown that radial averaging is a useful method for reducing the two-dimensional noise power spectrum to one dimension.
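The recommended estimator — averaging periodograms of overlapping Hanning-windowed ROIs taken from a background-corrected flat field — can be sketched as a Welch-style computation. This is a generic implementation assuming a pixel pitch of 1, not the authors' exact code; with white noise of variance σ², each NPS bin should come out near σ².

```python
import numpy as np

def nps_2d(flat, roi=64, step=32):
    """Two-dimensional noise power spectrum of a flat-field image,
    averaged over overlapping 2D-Hanning-windowed ROIs (pixel pitch
    taken as 1, so NPS is in units of variance per frequency bin)."""
    w1 = np.hanning(roi)
    w2 = np.outer(w1, w1)                 # separable 2D Hanning window
    norm = np.sum(w2 ** 2)                # window energy normalization
    acc, n = np.zeros((roi, roi)), 0
    for i in range(0, flat.shape[0] - roi + 1, step):
        for j in range(0, flat.shape[1] - roi + 1, step):
            sub = flat[i:i + roi, j:j + roi]
            sub = sub - sub.mean()        # simple background correction
            acc += np.abs(np.fft.fft2(sub * w2)) ** 2
            n += 1
    return acc / (n * norm)
```

Radial averaging of the returned 2D array over annuli of constant spatial frequency would then give the one-dimensional NPS discussed in the abstract.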
Inter-aquifer Dynamics in and Near a Confining Unit Window in Shelby County, Tennessee, USA
NASA Astrophysics Data System (ADS)
Gentry, R. W.; McKay, L. D.; Larsen, D.; Carmichael, J. K.; Solomon, D. K.; Thonnard, N.; Anderson, J. L.
2003-12-01
An interdisciplinary research team is investigating the interaction between the surficial alluvial aquifer and the deeper confined Memphis aquifer in the Memphis area, Shelby County, Tennessee. Previous research has identified a window in the clay-rich, upper Claiborne confining unit that separates the two aquifers near a closed municipal landfill in east-central Shelby County, an area undergoing rapid urbanization. For this investigation, a combination of environmental tracers (tritium/helium-3), major and trace ion geochemistry, hydraulic response testing, measurement of hydraulic gradients, and groundwater flow modeling is being used to quantify recharge of young water from the alluvial aquifer through the window to the Memphis aquifer. The research will provide results to better understand how windows were formed and how they influence recharge and water quality in otherwise confined parts of the Memphis aquifer downdip of its outcrop/subcrop area. Examination of continuous core samples and geophysical logs from wells installed for the study using Rotasonic drilling methods confirmed the existence of a sand-dominated window that may be as much as 1 km in diameter in the upper Claiborne confining unit. The upper Claiborne confining unit is 15 to 20 m thick in most of the study area and is overlain by a 10 to 12 m thick alluvial aquifer. The window is interpreted to have formed as a result of depositional and incisional processes in an Eocene-age deltaic system. Hydraulic gradients of several feet exist vertically between the alluvial and Memphis aquifers within the window, indicating downward flow. Groundwater age-dates from tritium/helium-3 analyses indicate that groundwater in the window at the depth of the base of the surrounding confining unit (approximately 30 m) has an apparent age of 19.8 years, which confirms the occurrence of downward flow.
Young groundwater age dates (less than 32 years) also were obtained from wells in the Memphis aquifer at confined sites downgradient of the window, suggesting that a plume of young water is spreading outwards from the window and mixing with the older Memphis aquifer water. Preliminary inverse modeling of the site using a genetic algorithm coupled with a central finite difference flow model indicates a probable steady-state downward flux of about 12,000 m3/d through the window. Collection and analysis of additional groundwater samples are planned to examine geochemical conditions in the confining unit and in the Memphis aquifer upgradient of the window. These analyses will aid in developing a final conceptual model and in subsequent numerical modeling of mixing of the young recharge water with the older Memphis aquifer water.
Sun, S P; Lu, W; Lei, Y B; Men, X M; Zuo, B; Ding, S G
2017-08-07
Objective: To discuss the prediction of round window (RW) visibility in cochlear implantation (CI) with temporal bone high resolution computed tomography (HRCT). Methods: From January 2013 to January 2017, 130 cases that underwent both HRCT and CI in our hospital were analyzed. The distance from the facial nerve to the posterior canal wall (FWD), the angle between the facial nerve and the inner margin of the round window (FRA), and the angle between the facial nerve and the tympanic anulus to the inner margin of the round window (FRAA) were measured at the level of the round window on axial temporal bone HRCT. A line parallel to the posterior wall of the ear canal was drawn from the anterior wall of the facial nerve at the level of the round window on axial temporal bone HRCT, and its relationship with the round window was determined (facial-round window line, FRL): type 0, posterior to the round window; type 1, intersecting the round window; type 2, anterior to the round window. The relationships of FWD, FRA, FRAA, and FRL with intra-operative round window visibility were analyzed with SPSS 17.0 software. Results: FWD (F=18.76, P=0.00), FRA (F=34.57, P=0.00), and FRAA (F=14.24, P=0.00) significantly affected intra-operative RW visibility. The RW could be exposed completely during CI when preoperative HRCT showed type 0 FRL. The RW might be partly exposed or not exposed when preoperative HRCT showed type 1 and type 2 FRL, respectively. Conclusion: FWD, FRA, FRAA, and FRL on temporal bone HRCT can effectively predict intra-operative round window visibility in CI surgery.
State-of-the-art software for window energy-efficiency rating and labeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arasteh, D.; Finlayson, E.; Huang, J.
1998-07-01
Measuring the thermal performance of windows in typical residential buildings is an expensive proposition. Not only is laboratory testing expensive, but each window manufacturer typically offers hundreds of individual products, each of which has different thermal performance properties. With over a thousand window manufacturers nationally, a testing-based rating system would be prohibitively expensive to the industry and to consumers. Beginning in the early 1990s, simulation software began to be used as part of a national program for rating window U-values. The rating program has since been expanded to include Solar Heat Gain Coefficients and is now being extended to annual energy performance. This paper describes four software packages available to the public from Lawrence Berkeley National Laboratory (LBNL) that are used to evaluate window thermal performance: RESFEN (for evaluating annual energy costs), WINDOW (for calculating a product's thermal performance properties), THERM (a preprocessor for WINDOW that determines two-dimensional heat-transfer effects), and Optics (a preprocessor for WINDOW's glass database). Software not only offers a less expensive means than testing to evaluate window performance, it can also be used during the design process to help manufacturers produce windows that will meet target specifications. In addition, software can show small improvements in window performance that might not be detected in actual testing because of large uncertainties in test procedures.
Rong, Xing; Du, Yong; Frey, Eric C
2012-06-21
Quantitative Yttrium-90 ((90)Y) bremsstrahlung single photon emission computed tomography (SPECT) imaging has shown great potential to provide reliable estimates of the (90)Y activity distribution for targeted radionuclide therapy dosimetry applications. One factor that potentially affects the reliability of the activity estimates is the choice of the acquisition energy window. In contrast to imaging conventional gamma photon emitters, where the acquisition energy windows are usually placed around photopeaks, there has been great variation in the choice of the acquisition energy window for (90)Y imaging due to the continuous and broad energy distribution of the bremsstrahlung photons. In quantitative imaging of conventional gamma photon emitters, previous methods for optimizing the acquisition energy window assumed unbiased estimators and used the variance of the estimates as a figure of merit (FOM). However, for situations such as (90)Y imaging, where there are errors in the modeling of the image formation process used in the reconstruction, there will be bias in the activity estimates. In (90)Y bremsstrahlung imaging this is especially important due to the high levels of scatter, multiple scatter, and collimator septal penetration and scatter. Variance alone is therefore not a complete measure of the reliability of the estimates, and hence not a complete FOM. To address this, we first aimed to develop a new method to optimize the energy window that accounts for both the bias due to model-mismatch and the variance of the activity estimates. We applied this method to optimize the acquisition energy window for quantitative (90)Y bremsstrahlung SPECT imaging in microsphere brachytherapy. Since absorbed dose is defined as the absorbed energy from the radiation per unit mass of tissue, in this new method we proposed a mass-weighted root mean squared error of the volume of interest (VOI) activity estimates as the FOM.
To calculate this FOM, two analytical expressions were derived for calculating the bias due to model-mismatch and the variance of the VOI activity estimates, respectively. To obtain the optimal acquisition energy window for general situations of interest in clinical (90)Y microsphere imaging, we generated phantoms with multiple tumors of various sizes and various tumor-to-normal activity concentration ratios using a digital phantom that realistically simulates human anatomy, simulated (90)Y microsphere imaging with a clinical SPECT system and typical imaging parameters using a previously validated Monte Carlo simulation code, and used a previously proposed method for modeling the image degrading effects in quantitative SPECT reconstruction. The obtained optimal acquisition energy window was 100-160 keV. The values of the proposed FOM were much larger than the FOM taking into account only the variance of the activity estimates, thus demonstrating in our experiment that the bias of the activity estimates due to model-mismatch was a more important factor than the variance in terms of limiting the reliability of activity estimates.
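The proposed FOM can be sketched as a mass-weighted RMSE over per-VOI activity estimates. The function below is a generic reading of that definition (per-VOI arrays, weights proportional to tissue mass), not the authors' analytical bias/variance expressions.

```python
import numpy as np

def mass_weighted_rmse(est, true, mass):
    """Mass-weighted root mean squared error over VOI activity
    estimates.  Because the squared error includes both the bias due
    to model-mismatch and the variance of each estimate, the FOM
    penalizes both, with each VOI weighted by its tissue mass."""
    est, true, mass = map(np.asarray, (est, true, mass))
    err2 = (est - true) ** 2
    return float(np.sqrt(np.sum(mass * err2) / np.sum(mass)))
```

In practice `est` would be averaged quantities from repeated noise realizations (for the variance term) combined with the model-mismatch bias, evaluated for each candidate energy window, and the window minimizing the FOM would be selected.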
Temperature compensated and self-calibrated current sensor using reference current
Yakymyshyn, Christopher Paul [Seminole, FL; Brubaker, Michael Allen [Loveland, CO; Yakymyshyn, Pamela Jane [Seminole, FL
2008-01-22
A method is described to provide temperature compensation and self-calibration of a current sensor based on a plurality of magnetic field sensors positioned around a current carrying conductor. A reference electrical current carried by a conductor positioned within the sensing window of the current sensor is used to correct variations in the output signal due to temperature variations and aging.
ERIC Educational Resources Information Center
Atwell, Margaret, Ed.; Klein, Adria, Ed.
The papers presented at this conference, the theme of which was "Celebrating Literacy," focused on theories and applications of literature-based education and the use of holistic methods across the curriculum. Following an introduction by the editors, the book contains the following 12 papers: "Windows and Mirrors: Children's Books…
Sequential Geoacoustic Filtering and Geoacoustic Inversion
2015-09-30
…and online algorithms. We show here that CS obtains higher resolution than MVDR, even in scenarios which favor classical high-resolution methods… windows actually performs better than conventional beamforming and MVDR/MUSIC (see Figs. 1-2). Compressive geoacoustic inversion… histograms based on 100 Monte Carlo simulations, and c) CS, exhaustive-search, CBF, MVDR, and MUSIC performance versus SNR. The true source positions…
Debugging classification and anti-debugging strategies
NASA Astrophysics Data System (ADS)
Gao, Shang; Lin, Qian; Xia, Mingyuan; Yu, Miao; Qi, Zhengwei; Guan, Haibing
2011-12-01
Debugging, albeit useful for software development, is also a double-edged sword, since it can be exploited by malicious attackers. This paper analyzes the prevailing debuggers and classifies them into four categories based on their debugging mechanisms. Conversely, we also list 13 typical anti-debugging strategies adopted on Windows. These methods intercept specific execution points that expose the diagnostic behavior of debuggers.
Method and apparatus for monitoring the flow of mercury in a system
Grossman, Mark W.
1987-01-01
An apparatus and method for monitoring the flow of mercury in a system. The equipment enables the entrainment of the mercury in a carrier gas e.g., an inert gas, which passes as mercury vapor between a pair of optically transparent windows. The attenuation of the emission is indicative of the quantity of mercury (and its isotopes) in the system. A 253.7 nm light is shone through one of the windows and the unabsorbed light is detected through the other window. The absorption of the 253.7 nm light is thereby measured whereby the quantity of mercury passing between the windows can be determined. The apparatus includes an in-line sensor for measuring the quantity of mercury. It includes a conduit together with a pair of apertures disposed in a face to face relationship and arranged on opposite sides of the conduit. A pair of optically transparent windows are disposed upon a pair of viewing tubes. A portion of each of the tubes is disposed inside of the conduit and within each of the apertures. The two windows are disposed in a face to face relationship on the ends of the viewing tubes and the entire assembly is hermetically sealed from the atmosphere whereby when 253.7 nm ultraviolet light is shone through one of the windows and detected through the other, the quantity of mercury which is passing by can be continuously monitored due to absorption which is indicated by attenuation of the amplitude of the observed emission.
Automatic 3D Moment tensor inversions for southern California earthquakes
NASA Astrophysics Data System (ADS)
Liu, Q.; Tape, C.; Friberg, P.; Tromp, J.
2008-12-01
We present a new source-mechanism (moment tensor and depth) catalog for about 150 recent southern California earthquakes with Mw ≥ 3.5. We carefully select the initial solutions from several available earthquake catalogs as well as our own preliminary 3D moment tensor inversion results. We pick useful data windows by assessing the quality of the fits between data and synthetics using the automatic windowing package FLEXWIN (Maggi et al 2008). We compute the source Fréchet derivatives of the moment-tensor elements and depth for a recent 3D southern California velocity model, inverted using finite-frequency event kernels calculated by adjoint methods and a nonlinear conjugate gradient technique with subspace preconditioning (Tape et al 2008). We then invert for the source mechanisms and event depths based upon the techniques introduced by Liu et al 2005. We assess the quality of this new catalog, as well as of the other existing ones, by computing 3D synthetics for the updated 3D southern California model. We also plan to implement the moment-tensor inversion methods to automatically determine the source mechanisms for earthquakes with Mw ≥ 3.5 in southern California.
Ground-based remote sensing of thin clouds in the Arctic
NASA Astrophysics Data System (ADS)
Garrett, T. J.; Zhao, C.
2012-11-01
This paper describes a method for using interferometer measurements of downwelling thermal radiation to retrieve the properties of single-layer clouds. Cloud phase is determined from ratios of thermal emission in three "micro-windows" where absorption by water vapor is particularly small. Cloud microphysical and optical properties are retrieved from thermal emission in two micro-windows, constrained by the transmission through clouds of stratospheric ozone emission. Assuming a cloud does not approximate a blackbody, the estimated 95% confidence retrieval errors in effective radius, visible optical depth, number concentration, and water path are, respectively, 10%, 20%, 38% (55% for ice crystals), and 16%. Applied to data from the Atmospheric Radiation Measurement program (ARM) North Slope of Alaska - Adjacent Arctic Ocean (NSA-AAO) site near Barrow, Alaska, retrievals show general agreement with ground-based microwave radiometer measurements of liquid water path. Compared to other retrieval methods, advantages of this technique include its ability to characterize thin clouds year round, that water vapor is not a primary source of retrieval error, and that the retrievals of microphysical properties are only weakly sensitive to retrieved cloud phase. The primary limitation is the inapplicability to thicker clouds that radiate as blackbodies.
Bordel, Sergio
2018-04-13
In order to choose optimal personalized anticancer treatments, transcriptomic data should be analyzed within the frame of biological networks. The best-known human biological network (in terms of the interactions between its different components) is metabolism. Cancer cells have long been known to have specific metabolic features, and there is currently growing interest in characterizing new cancer-specific metabolic hallmarks. This article presents a method to find personalized therapeutic windows using RNA-seq data and Genome Scale Metabolic Models. The method is implemented in the Python library pyTARG. Our predictions showed that the most selectively anticancer (affecting 27 out of 34 considered cancer cell lines and only 1 out of 6 healthy mesenchymal stem cell lines) single metabolic reactions are those involved in cholesterol biosynthesis. Excluding cholesterol biosynthesis, all the considered cell lines can be selectively affected by targeting different combinations (from 1 to 5 reactions) of only 18 metabolic reactions, which suggests that a small subset of drugs or siRNAs combined in patient-specific manners could be at the core of metabolism-based personalized treatments.
7. BULLET GLASS OBSERVATION WINDOW AT GROUND LEVEL ON WEST ...
7. BULLET GLASS OBSERVATION WINDOW AT GROUND LEVEL ON WEST REAR. - Edwards Air Force Base, South Base Sled Track, Firing & Control Blockhouse for 10,000-foot Track, South of Sled Track at midpoint of 20,000-foot track, Lancaster, Los Angeles County, CA
Leyde, Brian P; Klein, Sanford A; Nellis, Gregory F; Skye, Harrison
2017-03-01
This paper presents a new method called the Crossed Contour Method for determining the effective properties (borehole radius and ground thermal conductivity) of a vertical ground-coupled heat exchanger. The borehole radius is used as a proxy for the overall borehole thermal resistance. The method has been applied to both simulated and experimental borehole Thermal Response Test (TRT) data using the Duct Storage vertical ground heat exchanger model implemented in the TRansient SYstems Simulation software (TRNSYS). The Crossed Contour Method generates a parametric grid of simulated TRT data for different combinations of borehole radius and ground thermal conductivity in a series of time windows. The error between the average of the simulated and experimental bore field inlet and outlet temperatures is calculated for each set of borehole properties within each time window. Using these data, contours of the minimum error are constructed in the parameter space of borehole radius and ground thermal conductivity. When all of the minimum error contours for each time window are superimposed, the point where the contours cross (intersect) identifies the effective borehole properties for the model that most closely represents the experimental data in every time window and thus over the entire length of the experimental data set. The computed borehole properties are compared with results from existing model inversion methods including the Ground Property Measurement (GPM) software developed by Oak Ridge National Laboratory, and the Line Source Model.
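A minimal sketch of the crossed-contour idea under stated assumptions: the quadratic error surface below is a made-up stand-in for the TRNSYS/Duct Storage simulation, chosen so each time window has its own valley of minimum error in (radius, conductivity) space; the point where the valleys cross is recovered as the minimizer of the summed per-window error.

```python
# Sketch of the Crossed Contour Method's core idea: for each time window,
# locate the (borehole radius, conductivity) pairs that minimize the model
# error, then find the pair that is near-optimal in every window. The
# error function here is hypothetical, not a real TRT simulation.

def window_error(radius, conductivity, window):
    # Stand-in for |simulated - measured| averaged over one time window.
    # Each window's error valley has a different orientation, so the
    # valleys cross only at the true parameters (r=0.06 m, k=2.0 W/(m.K)).
    slope = 0.5 + 0.1 * window
    return abs((conductivity - 2.0) - slope * (radius - 0.06) * 100)

def crossed_contour(radii, conductivities, n_windows):
    # The crossing point of the per-window minimum-error contours is the
    # global minimizer of the summed window errors over the grid.
    best, best_score = None, float("inf")
    for r in radii:
        for k in conductivities:
            score = sum(window_error(r, k, w) for w in range(n_windows))
            if score < best_score:
                best, best_score = (r, k), score
    return best

radii = [0.04 + 0.005 * i for i in range(9)]          # 0.040 .. 0.080 m
conductivities = [1.5 + 0.1 * i for i in range(11)]   # 1.5 .. 2.5 W/(m.K)
print(crossed_contour(radii, conductivities, 4))
```

In practice each grid point costs one forward simulation per time window, which is why the method builds the parametric grid once and reuses it across windows.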
Mahmood, Hafiz Sultan; Hoogmoed, Willem B.; van Henten, Eldert J.
2013-01-01
Fine-scale spatial information on soil properties is needed to successfully implement precision agriculture. Proximal gamma-ray spectroscopy has recently emerged as a promising tool to collect fine-scale soil information. The objective of this study was to evaluate a proximal gamma-ray spectrometer to predict several soil properties using energy-windows and full-spectrum analysis methods in two differently managed sandy loam fields: conventional and organic. In the conventional field, both methods predicted clay, pH and total nitrogen with good accuracy (R2 ≥ 0.56) in the top 0–15 cm soil depth, whereas in the organic field, only clay content was predicted with such accuracy. The highest prediction accuracy was found for total nitrogen (R2 = 0.75) in the conventional field with the energy-windows method. Predictions were better in the top 0–15 cm soil depth than in the 15–30 cm soil depth for individual and combined fields. This implies that gamma-ray spectroscopy can generally benefit soil characterisation for annual crops where the condition of the seedbed is important. Small differences in soil structure (conventional vs. organic) could not be determined. As for the methodology, we conclude that the energy-windows method can establish relations between radionuclide data and soil properties as accurately as the full-spectrum analysis method. PMID:24287541
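The windows-method predictions above are scored with R2 from a regression of a soil property on an energy-window count rate; a minimal sketch with made-up numbers (the counts and clay contents are illustrative, not from the study):

```python
# Simple linear regression of a soil property on a gamma-ray energy-window
# count rate, scored with R^2 as in the study. All data are invented.

def linear_fit(x, y):
    # Ordinary least squares for y = a + b*x.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

def r_squared(x, y):
    a, b = linear_fit(x, y)
    my = sum(y) / len(y)
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

counts = [120, 150, 180, 210, 240, 260]       # window count rate (cps)
clay   = [8.0, 10.5, 12.8, 15.1, 17.2, 18.9]  # clay content (%)
print(round(r_squared(counts, clay), 3))
```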
NASA Technical Reports Server (NTRS)
Barry, Matthew R.
2006-01-01
The X-Windows Socket Widget Class ("Class" is used here in the object-oriented-programming sense of the word) was devised to simplify the task of implementing network connections for graphical-user-interface (GUI) computer programs. UNIX Transmission Control Protocol/Internet Protocol (TCP/IP) socket programming libraries require many method calls to configure, operate, and destroy sockets. Most X Windows GUI programs use widget sets or toolkits to facilitate management of complex objects. The widget standards facilitate construction of toolkits and application programs. The X-Windows Socket Widget Class encapsulates UNIX TCP/IP socket-management tasks within the framework of an X Windows widget. Using the widget framework, X Windows GUI programs can treat one or more network socket instances in the same manner as that of other graphical widgets, making it easier to program sockets. Wrapping TCP/IP socket programming libraries inside a widget framework enables a programmer to treat a network interface as though it were a GUI.
Remote sensing image ship target detection method based on visual attention model
NASA Astrophysics Data System (ADS)
Sun, Yuejiao; Lei, Wuhu; Ren, Xiaodong
2017-11-01
The traditional methods of detecting ship targets in remote sensing images mostly use a sliding window to search the whole image exhaustively. However, the target usually occupies only a small fraction of the image, so this approach has high computational complexity for large-format visible image data. The bottom-up selective attention mechanism can selectively allocate computing resources according to visual stimuli, thus improving computational efficiency and reducing the difficulty of analysis. In view of this, a method of ship target detection in remote sensing images based on a visual attention model is proposed in this paper. The experimental results show that the proposed method reduces computational complexity while improving detection accuracy, thereby improving the detection efficiency of ship targets in remote sensing images.
A Deep Ensemble Learning Method for Monaural Speech Separation.
Zhang, Xiao-Lei; Wang, DeLiang
2016-03-01
Monaural speech separation is a fundamental problem in robust speech processing. Recently, deep neural network (DNN)-based speech separation methods, which predict either clean speech or an ideal time-frequency mask, have demonstrated remarkable performance improvement. However, a single DNN with a given window length does not leverage contextual information sufficiently, and the differences between the two optimization objectives are not well understood. In this paper, we propose a deep ensemble method, named multicontext networks, to address monaural speech separation. The first multicontext network averages the outputs of multiple DNNs whose inputs employ different window lengths. The second multicontext network is a stack of multiple DNNs. Each DNN in a module of the stack takes the concatenation of original acoustic features and expansion of the soft output of the lower module as its input, and predicts the ratio mask of the target speaker; the DNNs in the same module employ different contexts. We have conducted extensive experiments with three speech corpora. The results demonstrate the effectiveness of the proposed method. We have also compared the two optimization objectives systematically and found that predicting the ideal time-frequency mask is more efficient in utilizing clean training speech, while predicting clean speech is less sensitive to SNR variations.
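The first multicontext network's averaging step can be sketched abstractly; the "predictors" below are stand-ins (simple context averages), not trained DNNs, and serve only to show how per-frame ratio-mask outputs from models with different window lengths are combined.

```python
# Sketch of multicontext ensemble averaging: several predictors, each
# seeing a different context window length, emit per-frame ratio masks
# in [0, 1]; the ensemble output is their frame-wise mean. The toy
# predictor here just averages a scalar feature over its context.

def context_predictor(frames, half_width):
    # Hypothetical predictor: average the feature over +/- half_width
    # frames and clamp to [0, 1]. Wider windows use more context.
    masks = []
    for t in range(len(frames)):
        lo, hi = max(0, t - half_width), min(len(frames), t + half_width + 1)
        avg = sum(frames[lo:hi]) / (hi - lo)
        masks.append(min(1.0, max(0.0, avg)))
    return masks

def ensemble_mask(frames, half_widths):
    # Average the per-frame masks across predictors with different contexts.
    per_model = [context_predictor(frames, w) for w in half_widths]
    n = len(per_model)
    return [sum(m[t] for m in per_model) / n for t in range(len(frames))]

frames = [0.9, 0.8, 0.2, 0.1, 0.7, 0.95]   # toy per-frame feature
mask = ensemble_mask(frames, half_widths=[0, 1, 2])
print([round(m, 2) for m in mask])
```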
Strength analysis of welded corners of PVC window profiles
NASA Astrophysics Data System (ADS)
Postawa, P.; Stachowiak, T.; Gnatowski, A.
2017-08-01
The article presents the results of research whose main purpose was to define the influence of welding parameters on the strength of welded corners of PVC window profiles. PVC profiles of the brand name GENEO® produced by the Rehau Company were used for this research. The profiles were made using a co-extrusion method. The surface of the profile was made of a PVC mixture with no additives; its main task was to provide a smooth surface resistant to smudges. The use of an unfilled polyester provides an aesthetic look and improves the resistance of the extrudate to water penetrating the inner layers. The profile's inner layers were filled with glass fibre in order to improve stiffness and mechanical properties. The window frames with cut corners used for this research were produced on the technological line of the EUROCOLOR Company based in Pyskowice (Poland). The main goal of the analysis was to evaluate the influence of the main welding parameter (upsetting temperature) on the hardness of the resulting welds. A universal testing machine was used for the research.
NASA Astrophysics Data System (ADS)
Prawin, J.; Rama Mohan Rao, A.
2018-01-01
The knowledge of dynamic loads acting on a structure is always required for many practical engineering problems, such as structural strength analysis, health monitoring and fault diagnosis, and vibration isolation. In this paper, we present an online input force time history reconstruction algorithm using Dynamic Principal Component Analysis (DPCA) from the acceleration time history response measurements using moving windows. We also present an optimal sensor placement algorithm to place limited sensors at dynamically sensitive spatial locations. The major advantage of the proposed input force identification algorithm is that it does not require finite element idealization of structure unlike the earlier formulations and therefore free from physical modelling errors. We have considered three numerical examples to validate the accuracy of the proposed DPCA based method. Effects of measurement noise, multiple force identification, different kinds of loading, incomplete measurements, and high noise levels are investigated in detail. Parametric studies have been carried out to arrive at optimal window size and also the percentage of window overlap. Studies presented in this paper clearly establish the merits of the proposed algorithm for online load identification.
Small-window parametric imaging based on information entropy for ultrasound tissue characterization
Tsui, Po-Hsiang; Chen, Chin-Kuo; Kuo, Wen-Hung; Chang, King-Jen; Fang, Jui; Ma, Hsiang-Yang; Chou, Dean
2017-01-01
Constructing ultrasound statistical parametric images by using a sliding window is a widely adopted strategy for characterizing tissues. Deficiency in spatial resolution, the appearance of boundary artifacts, and the prerequisite data distribution limit the practicability of statistical parametric imaging. In this study, small-window entropy parametric imaging was proposed to overcome the above problems. Simulations and measurements of phantoms were executed to acquire backscattered radiofrequency (RF) signals, which were processed to explore the feasibility of small-window entropy imaging in detecting scatterer properties. To validate the ability of entropy imaging in tissue characterization, measurements of benign and malignant breast tumors were conducted (n = 63) to compare performances of conventional statistical parametric (based on Nakagami distribution) and entropy imaging by the receiver operating characteristic (ROC) curve analysis. The simulation and phantom results revealed that entropy images constructed using a small sliding window (side length = 1 pulse length) adequately describe changes in scatterer properties. The area under the ROC for using small-window entropy imaging to classify tumors was 0.89, which was higher than 0.79 obtained using statistical parametric imaging. In particular, boundary artifacts were largely suppressed in the proposed imaging technique. Entropy enables using a small window for implementing ultrasound parametric imaging. PMID:28106118
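The core of small-window entropy imaging, sliding a short window along the backscattered envelope and mapping Shannon entropy at each position, can be sketched as follows; the window length, bin count, and 1-D data are illustrative (the study uses a side length of one pulse length on 2-D RF data).

```python
import math
import random

# Sketch of sliding-window entropy imaging on a 1-D envelope: histogram
# the samples inside each window position and map Shannon entropy there.
# Bins span a fixed amplitude range [0, 1]; all parameters are illustrative.

def shannon_entropy(samples, bins=8, lo=0.0, hi=1.0):
    counts = [0] * bins
    for s in samples:
        idx = min(int((s - lo) / (hi - lo) * bins), bins - 1)
        counts[idx] += 1
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def entropy_image(envelope, window=16):
    # One entropy value per window position (a 1-D "parametric image").
    return [shannon_entropy(envelope[i:i + window])
            for i in range(len(envelope) - window + 1)]

random.seed(0)
speckle = [random.random() for _ in range(64)]             # varied scatterers
flat = [0.5 + 0.001 * random.random() for _ in range(64)]  # homogeneous region

img_speckle = entropy_image(speckle)
img_flat = entropy_image(flat)
# The varied region yields higher entropy than the nearly constant one.
print(round(max(img_speckle), 2), round(max(img_flat), 2))
```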
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gustavsen Ph.D., Arild; Goudey, Howdy; Kohler, Christian
2010-06-17
While window frames typically represent 20-30 percent of the overall window area, their impact on the total window heat transfer rates may be much larger. This effect is even greater in low-conductance (highly insulating) windows which incorporate very low conductance glazings. Developing low-conductance window frames requires accurate simulation tools for product research and development. The Passivhaus Institute in Germany states that windows (glazing and frames, combined) should have U-values not exceeding 0.80 W/(m²·K). This has created a niche market for highly insulating frames, with frame U-values typically around 0.7-1.0 W/(m²·K). The U-values reported are often based on numerical simulations according to international simulation standards. It is prudent to check the accuracy of these calculation standards, especially for high performance products before more manufacturers begin to use them to improve other product offerings. In this paper the thermal transmittance of five highly insulating window frames (three wooden frames, one aluminum frame and one PVC frame), found from numerical simulations and experiments, are compared. Hot box calorimeter results are compared with numerical simulations according to ISO 10077-2 and ISO 15099. In addition CFD simulations have been carried out, in order to use the most accurate tool available to investigate the convection and radiation effects inside the frame cavities. Our results show that available tools commonly used to evaluate window performance, based on ISO standards, give good overall agreement, but specific areas need improvement.
ERIC Educational Resources Information Center
Lowe, Phyllis; And Others
This module, one of ten competency based modules developed for vocational home economics teachers, is based on a job cluster in window treatment services. It can be used for various types of learners such as the handicapped, slow learners, high school students, and adults including senior citizens. Focusing on the specific job title of window…
A data variance technique for automated despiking of magnetotelluric data with a remote reference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kappler, K.
2011-02-15
The magnetotelluric method employs co-located surface measurements of electric and magnetic fields to infer the local electrical structure of the earth. The frequency-dependent 'apparent resistivity' curves can be inaccurate at long periods if input data are contaminated - even when robust remote reference techniques are employed. Data despiking prior to processing can result in significantly more reliable estimates of long period apparent resistivities. This paper outlines a two-step method of automatic identification and replacement for spike-like contamination of magnetotelluric data, based on the simultaneity of natural electric and magnetic field variations at distant sites. This simultaneity is exploited both to identify windows in time when the array data are compromised, and to generate synthetic data that replace observed transient noise spikes. In the first step, windows in data time series containing spikes are identified via intersite comparison of channel 'activity' - such as the variance of differenced data within each window. In the second step, plausible data for replacement of flagged windows is calculated by Wiener filtering coincident data in clean channels. The Wiener filters - which express the time-domain relationship between various array channels - are computed using an uncontaminated segment of array training data. Examples are shown where the algorithm is applied to artificially contaminated data, and to real field data. In both cases all spikes are successfully identified. In the case of implanted artificial noise, the synthetic replacement time series are very similar to the original recording. In all cases, apparent resistivity and phase curves obtained by processing the despiked data are much improved over curves obtained from raw data.
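A minimal sketch of the two-step scheme under simplifying assumptions: windows are flagged by comparing first-difference variance against a remote reference channel, and flagged samples are replaced by a single least-squares gain applied to the remote channel (a one-tap stand-in for the paper's Wiener filters, fitted here on all data rather than a clean training segment). All data are synthetic.

```python
# Step 1: flag windows whose first-difference variance at the local site
# is anomalously large relative to a remote reference channel.
# Step 2: replace flagged samples with a prediction from the clean channel.

def diff_variance(x):
    d = [b - a for a, b in zip(x, x[1:])]
    m = sum(d) / len(d)
    return sum((v - m) ** 2 for v in d) / len(d)

def despike(local, remote, window=8, ratio=5.0):
    # One least-squares gain local ~ gain * remote stands in for a Wiener
    # filter; a real implementation would fit on uncontaminated data.
    gain = (sum(l * r for l, r in zip(local, remote))
            / sum(r * r for r in remote))
    out = list(local)
    for start in range(0, len(local) - window + 1, window):
        seg_l = local[start:start + window]
        seg_r = remote[start:start + window]
        if diff_variance(seg_l) > ratio * diff_variance(seg_r):
            for i in range(start, start + window):
                out[i] = gain * remote[i]
    return out

remote = [0.1 * ((i % 7) - 3) for i in range(32)]  # clean reference channel
local = [2.0 * r for r in remote]                  # simultaneous local channel
local[10] += 5.0                                   # implant a spike
clean = despike(local, remote)
```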
Determination of the optimal tolerance for MLC positioning in sliding window and VMAT techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hernandez, V., E-mail: vhernandezmasgrau@gmail.com; Abella, R.; Calvo, J. F.
2015-04-15
Purpose: Several authors have recommended a 2 mm tolerance for multileaf collimator (MLC) positioning in sliding window treatments. In volumetric modulated arc therapy (VMAT) treatments, however, the optimal tolerance for MLC positioning remains unknown. In this paper, the authors present the results of a multicenter study to determine the optimal tolerance for both techniques. Methods: The procedure used is based on dynalog file analysis. The study was carried out using seven Varian linear accelerators from five different centers. Dynalogs were collected from over 100 000 clinical treatments and in-house software was used to compute the number of tolerance faults as a function of the user-defined tolerance. Thus, the optimal value for this tolerance, defined as the lowest achievable value, was investigated. Results: Dynalog files accurately predict the number of tolerance faults as a function of the tolerance value, especially for low fault incidences. All MLCs behaved similarly and the Millennium120 and the HD120 models yielded comparable results. In sliding window techniques, the number of beams with an incidence of hold-offs >1% rapidly decreases for a tolerance of 1.5 mm. In VMAT techniques, the number of tolerance faults sharply drops for tolerances around 2 mm. For a tolerance of 2.5 mm, less than 0.1% of the VMAT arcs presented tolerance faults. Conclusions: Dynalog analysis provides a feasible method for investigating the optimal tolerance for MLC positioning in dynamic fields. In sliding window treatments, the tolerance of 2 mm was found to be adequate, although it can be reduced to 1.5 mm. In VMAT treatments, the typically used 5 mm tolerance is excessively high. Instead, a tolerance of 2.5 mm is recommended.
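The dynalog-style computation, counting tolerance faults as a function of the user-defined tolerance, reduces to a simple threshold count over planned-vs-actual leaf positions; the deviations below are illustrative, not from the study.

```python
# Count "tolerance faults": samples where the planned-vs-actual MLC leaf
# position deviation exceeds the user-defined tolerance. Units: millimetres.

def fault_fraction(planned, actual, tolerance_mm):
    # Fraction of samples whose |planned - actual| exceeds the tolerance.
    faults = sum(1 for p, a in zip(planned, actual)
                 if abs(p - a) > tolerance_mm)
    return faults / len(planned)

deviations = [0.3, 0.8, 1.2, 1.6, 0.4, 2.1, 0.9, 1.4, 0.2, 2.6]  # invented
planned = [10.0] * len(deviations)
actual = [10.0 + d for d in deviations]

# The fault fraction drops monotonically as the tolerance is widened.
for tol in (1.0, 1.5, 2.0, 2.5):
    print(tol, fault_fraction(planned, actual, tol))
```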
The signal extraction of fetal heart rate based on wavelet transform and BP neural network
NASA Astrophysics Data System (ADS)
Yang, Xiao Hong; Zhang, Bang-Cheng; Fu, Hu Dai
2005-04-01
This paper briefly introduces the collection and recognition of bio-medical signals and describes a method for collecting FM signals. A detailed discussion of the system hardware, structure, and functions is also given. Under LabWindows/CVI, the hardware and the driver are made compatible so that the equipment works properly. The system adopts multithreading for real-time analysis, making effective use of CPU idle time, speeding up program response, and improving execution efficiency: one thread collects data while the other analyzes it, broadening the scope for analyzing the signal in real time. A wavelet transform removes the main interference from the FM signals, and a time window is added for recognition with a BP neural network. Finally, the collected signals and the BP network results are discussed. FM signals from 8 pregnant women were collected successfully using the sensor. The recognition accuracy of the BP network is about 83.3% using the above approach.
A multicenter study benchmarks software tools for label-free proteome quantification.
Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan
2016-11-01
Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
Air transparent soundproof window
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Sang-Hoon, E-mail: shkim@mmu.ac.kr; Lee, Seong-Hyun
2014-11-15
A soundproof window or wall which is transparent to airflow is presented. The design is based on two wave theories: the theory of diffraction and the theory of acoustic metamaterials. It consists of a three-dimensional array of strong diffraction-type resonators with many holes centered on each individual resonator. The negative effective bulk modulus of the resonators produces evanescent waves, while air holes of subwavelength diameter on the surfaces of the window allow macroscopic air ventilation. The acoustic performance of two soundproof windows with air holes of 20 mm and 50 mm diameter was measured. The sound level was reduced by about 30-35 dB in the frequency range of 400-5,000 Hz with the 20 mm window, and by about 20-35 dB in the frequency range of 700-2,200 Hz with the 50 mm window. A multi stop-band was created by the multiple layers of the window. The attenuation length, or the thickness of the window, was limited by background noise. The effectiveness of the soundproof window with airflow was demonstrated by a real installation.
Conformal ALON® and spinel windows
NASA Astrophysics Data System (ADS)
Goldman, Lee M.; Smith, Mark; Ramisetty, Mohan; Jha, Santosh; Sastri, Suri
2017-05-01
The requirements for modern aircraft based reconnaissance systems are driving the need for conformal windows for future sensor systems. However, limitations on optical systems and on the ability to produce windows in complex geometries currently restrict existing windows and window assemblies to faceted assemblies of flat windows. ALON consists primarily of aluminum and oxygen, similar to alumina, with a small amount of nitrogen added to help stabilize the cubic gamma-AlON phase. ALON's chemical similarity to alumina translates into a robust manufacturing process. This ease of processing has allowed Surmet to produce ALON windows and domes in a wide variety of geometries and sizes. Spinel (MgAl2O4) contains equal molar amounts of MgO and Al2O3 and is a cubic material that transmits further into the infrared than ALON. Spinel is produced via powder processing techniques similar to those used to produce ALON. Surmet is now applying the lessons learned with ALON to produce conformal spinel windows and domes as well.
Shang, Jianyu; Deng, Zhihong; Fu, Mengyin; Wang, Shunting
2016-01-01
Traditional artillery guidance can significantly improve the attack accuracy and overall combat efficiency of projectiles, which makes it more adaptable to the information warfare of the future. Obviously, the accurate measurement of artillery spin rate, which has long been regarded as a daunting task, is the basis of precise guidance and control. Magnetoresistive (MR) sensors can be applied to spin rate measurement, especially in the high-spin and high-g projectile launch environment. In this paper, based on the theory of an MR sensor measuring spin rate, the mathematical relationship model between the frequency of the MR sensor output and the projectile spin rate was established through a fundamental derivation. By analyzing the characteristics of the MR sensor output, whose frequency varies with time, this paper proposed the Chirp z-Transform (CZT) time-frequency (TF) domain analysis method based on the rolling window of a Blackman window function (BCZT), which can accurately extract the projectile spin rate. To put it into practice, BCZT was applied to measure the spin rate of a 155 mm artillery projectile. After extracting the spin rate, the impact that launch rotational angular velocity and aspect angle have on the extraction accuracy of the spin rate was analyzed. Simulation results show that the BCZT TF domain analysis method can effectively and accurately measure the projectile spin rate, especially in a high-spin and high-g projectile launch environment. PMID:27322266
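The BCZT pipeline described above, applying a Blackman window to a rolling segment and then zooming the spectrum with a chirp z-transform, can be sketched as follows. The sampling rate, segment length, and spin frequency are illustrative, and the CZT is evaluated directly in O(N·M) for clarity rather than with the usual FFT-based fast algorithm.

```python
import cmath
import math

# Blackman-window a signal segment, then evaluate the chirp z-transform on
# a narrow frequency band around the expected spin frequency.

def blackman(n, N):
    return (0.42 - 0.5 * math.cos(2 * math.pi * n / (N - 1))
            + 0.08 * math.cos(4 * math.pi * n / (N - 1)))

def czt(x, f_start, f_step, m, fs):
    # Evaluate the z-transform at m points on the unit circle between
    # f_start and f_start + m*f_step (Hz). Direct O(N*M) evaluation.
    out = []
    for k in range(m):
        w = cmath.exp(-2j * math.pi * (f_start + k * f_step) / fs)
        out.append(sum(xn * w ** n for n, xn in enumerate(x)))
    return out

fs, N = 1000.0, 256
spin_hz = 123.4                       # illustrative spin frequency
x = [math.sin(2 * math.pi * spin_hz * n / fs) * blackman(n, N)
     for n in range(N)]

# Zoom in on 100-150 Hz with 0.5 Hz resolution and pick the peak.
spec = czt(x, f_start=100.0, f_step=0.5, m=100, fs=fs)
peak = max(range(100), key=lambda k: abs(spec[k]))
print(100.0 + 0.5 * peak)             # ~123.5 Hz, near the true spin rate
```

In the paper's setting the window rolls along the record so that the time-varying spin frequency is tracked segment by segment.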
NASA Astrophysics Data System (ADS)
Janneck, Robby; Vercesi, Federico; Heremans, Paul; Genoe, Jan; Rolin, Cedric
2016-09-01
Organic thin film transistors (OTFTs) based on single crystalline thin films of organic semiconductors have seen considerable development in recent years. The most successful methods for the fabrication of single crystalline films are solution-based meniscus guided coating techniques such as dip-coating, solution shearing or zone casting. These upscalable methods enable rapid and efficient film formation without additional processing steps. The single-crystalline film quality is strongly dependent on solvent choice, substrate temperature and coating speed. So far, however, process optimization has been conducted by trial and error, involving, for example, the variation of coating speeds over several orders of magnitude. Through a systematic study of solvent phase change dynamics in the meniscus region, we develop a theoretical framework that links the optimal coating speed to the solvent choice and the substrate temperature. In this way, we can accurately predict an optimal processing window, enabling fast process optimization. Our approach is verified through systematic OTFT fabrication based on films grown with different semiconductors, solvents and substrate temperatures. The use of the best predicted coating speeds delivers state of the art devices. In the case of C8BTBT, OTFTs show well-behaved characteristics with mobilities up to 7 cm2/Vs and onset voltages close to 0 V. Our approach also accounts well for the optimal recipes published in the literature. This route considerably accelerates parameter screening for all meniscus guided coating techniques and unveils the physics of single crystalline film formation.
Location identification of closed crack based on Duffing oscillator transient transition
NASA Astrophysics Data System (ADS)
Liu, Xiaofeng; Bo, Lin; Liu, Yaolu; Zhao, Youxuan; Zhang, Jun; Deng, Mingxi; Hu, Ning
2018-02-01
The existence of a closed micro-crack in plates can be detected by using the nonlinear harmonic characteristics of the Lamb wave; however, identifying its location is difficult. Considering the transient nonlinear Lamb wave under noise interference, we propose a location identification method for the closed crack based on the quantitative measurement of Duffing oscillator transient transitions in the phase space. A sliding short-time window was used to create a window truncation of the signal to be detected. Then, periodic extension processing of the transient nonlinear Lamb wave was performed to ensure that the Duffing oscillator has adequate response time to reach a steady state. The transient autocorrelation method was used to reduce missed harmonic detections due to the random variable phase of the nonlinear Lamb wave. Moreover, to overcome the deficiency of phase trajectory diagrams in quantitatively analyzing the Duffing system state, and to eliminate misjudgment caused by harmonic frequency components contained in broadband noise, a logic operation method for the oscillator state transition function based on circular zone partition was adopted to establish the mapping relation between the oscillator transition state and the nonlinear harmonic time-domain information. The final state transition discriminant function of the Duffing oscillator was used as the basis for identifying the reflected and transmitted harmonics from the crack. Chirplet time-frequency analysis was conducted to identify the mode of the generated harmonics and determine the propagation speed. Through these steps, accurate position identification of the closed crack was achieved.
Ground-based remote sensing of thin clouds in the Arctic
NASA Astrophysics Data System (ADS)
Garrett, T. J.; Zhao, C.
2013-05-01
This paper describes a method for using interferometer measurements of downwelling thermal radiation to retrieve the properties of single-layer clouds. Cloud phase is determined from ratios of thermal emission in three "micro-windows" at 862.5 cm-1, 935.8 cm-1, and 988.4 cm-1 where absorption by water vapour is particularly small. Cloud microphysical and optical properties are retrieved from thermal emission in the first two of these micro-windows, constrained by the transmission through clouds of primarily stratospheric ozone emission at 1040 cm-1. Assuming a cloud does not approximate a blackbody, the estimated 95% confidence retrieval errors in effective radius re, visible optical depth τ, number concentration N, and water path WP are, respectively, 10%, 20%, 38% (55% for ice crystals), and 16%. Applied to data from the Atmospheric Radiation Measurement programme (ARM) North Slope of Alaska - Adjacent Arctic Ocean (NSA-AAO) site near Barrow, Alaska, retrievals show general agreement with both ground-based microwave radiometer measurements of liquid water path and a method that uses combined shortwave and microwave measurements to retrieve re, τ and N. Compared to other retrieval methods, advantages of this technique include its ability to characterise thin clouds year round, that water vapour is not a primary source of retrieval error, and that the retrievals of microphysical properties are only weakly sensitive to retrieved cloud phase. The primary limitation is the inapplicability to thicker clouds that radiate as blackbodies and that it relies on a fairly comprehensive suite of ground based measurements.
Comparison of Frequency-Domain Array Methods for Studying Earthquake Rupture Process
NASA Astrophysics Data System (ADS)
Sheng, Y.; Yin, J.; Yao, H.
2014-12-01
Seismic array methods, in both time- and frequency- domains, have been widely used to study the rupture process and energy radiation of earthquakes. With better spatial resolution, the high-resolution frequency-domain methods, such as Multiple Signal Classification (MUSIC) (Schmidt, 1986; Meng et al., 2011) and the recently developed Compressive Sensing (CS) technique (Yao et al., 2011, 2013), are revealing new features of earthquake rupture processes. We have performed various tests on the methods of MUSIC, CS, minimum-variance distortionless response (MVDR) Beamforming and conventional Beamforming in order to better understand the advantages and features of these methods for studying earthquake rupture processes. We use the Ricker wavelet to synthesize seismograms and use these frequency-domain techniques to relocate the synthetic sources we set, for instance, two sources separated in space whose waveforms completely overlap in the time domain. We also test the effects of the sliding window scheme on the recovery of a series of input sources, in particular, some artifacts that are caused by the sliding window scheme. Based on our tests, we find that CS, which is developed from the theory of sparsity inversion, has relatively higher spatial resolution than the other frequency-domain methods and has better performance at lower frequencies. In high-frequency bands, MUSIC, as well as MVDR Beamforming, is more stable, especially in the multi-source situation. Meanwhile, CS tends to produce more artifacts when data have poor signal-to-noise ratio. Although these techniques can distinctly improve the spatial resolution, they still produce some artifacts along with the sliding of the time window. Furthermore, we propose a new method, which combines both the time-domain and frequency-domain techniques, to suppress these artifacts and obtain more reliable earthquake rupture images.
Finally, we apply this new technique to study the 2013 Okhotsk deep mega earthquake in order to better capture the rupture characteristics (e.g., rupture area and velocity) of this earthquake.
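For context, the conventional beamforming baseline in this comparison can be sketched as a frequency-domain delay-and-sum scan over trial slownesses. Everything below (array geometry, frequency, function names, synthetic data) is an illustrative assumption, not the authors' code:

```python
import numpy as np

def beam_power(spectra, positions, freq, slowness_grid):
    """Delay-and-sum beam power at one frequency for a linear array:
    phase-align each station's Fourier coefficient for a trial slowness
    and measure the coherent stack energy."""
    omega = 2.0 * np.pi * freq
    power = []
    for s in slowness_grid:
        steering = np.exp(1j * omega * s * positions)  # trial phase delays
        power.append(abs(np.vdot(steering, spectra)) ** 2)
    return np.array(power)

# Synthetic plane wave with slowness 0.2 s/km crossing 10 stations
positions = np.linspace(0.0, 9.0, 10)        # station coordinates, km
freq, true_slowness = 1.0, 0.2
spectra = np.exp(1j * 2.0 * np.pi * freq * true_slowness * positions)
grid = np.linspace(0.0, 0.5, 101)
p = beam_power(spectra, positions, freq, grid)
print(grid[int(np.argmax(p))])  # peaks at the true slowness, 0.2
```

The high-resolution methods discussed in the abstract (MUSIC, MVDR, CS) replace this coherent stack with signal-subspace or sparsity-based estimators, which is what narrows the beam response.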
Wildcat5 for Windows, a rainfall-runoff hydrograph model: user manual and documentation
R. H. Hawkins; A. Barreto-Munoz
2016-01-01
Wildcat5 for Windows (Wildcat5) is an interactive Windows Excel-based software package designed to assist watershed specialists in analyzing rainfall runoff events to predict peak flow and runoff volumes generated by single-event rainstorms for a variety of watershed soil and vegetation conditions. Model inputs are: (1) rainstorm characteristics, (2) parameters related...
van Holle, Lionel; Bauchau, Vincent
2014-01-01
Purpose Disproportionality methods measure how unexpected the observed number of adverse events is. Time-to-onset (TTO) methods measure how unexpected the TTO distribution of a vaccine-event pair is compared with what is expected from other vaccines and events. Our purpose is to compare the performance associated with each method. Methods For the disproportionality algorithms, we defined 336 combinations of stratification factors (sex, age, region and year) and threshold values of the multi-item gamma Poisson shrinker (MGPS). For the TTO algorithms, we defined 18 combinations of significance level and time window. We used spontaneous reports of adverse events recorded for eight vaccines. The vaccine product labels were used as proxies for true safety signals. Algorithms were ranked according to their positive predictive value (PPV) for each vaccine separately; a median rank was attributed to each algorithm across vaccines. Results The algorithm with the highest median rank was based on TTO with a significance level of 0.01 and a time window of 60 days after immunisation. It had an overall PPV 2.5 times higher than that of the highest-ranked MGPS algorithm (16th rank overall), which was fully stratified and had a threshold value of 0.8. A TTO algorithm with roughly the same sensitivity as the highest-ranked MGPS had better specificity but a longer time to detection. Conclusions Within the scope of this study, the majority of the TTO algorithms presented a higher PPV than any MGPS algorithm. Considering the complementarity of TTO and disproportionality methods, a signal detection strategy combining them merits further investigation. PMID:24038719
ToF-SIMS characterization of robust window material for use in diode pumped alkali lasers
NASA Astrophysics Data System (ADS)
Fletcher, Aaron; Turner, David; Fairchild, Steven; Rice, Christopher; Pitz, Gregory
2018-03-01
Developments in diode pumped alkali laser (DPAL) systems have been impeded because of the catastrophic failure of laser windows. The window's failure is caused by localized laser-induced heating of window material. This heating is believed to occur due to increases in absorption on or near the surface of the window. This increase is believed to be caused by either adsorption of carbon-based soot from the collisional gas or by the diffusion of rubidium into the bulk material. The work presented here will focus on the diffusion of Rb into the bulk window materials and will strive to identify a superior material to use as windows. The results of this research indicate that aluminum oxynitride (ALON), sapphire, MgAl2O4 (spinel), and ZrO2 are resistant to alkali-induced changes in optical properties.
VIRTIS on Venus Express: retrieval of real surface emissivity on global scales
NASA Astrophysics Data System (ADS)
Arnold, Gabriele E.; Kappel, David; Haus, Rainer; Telléz Pedroza, Laura; Piccioni, Giuseppe; Drossart, Pierre
2015-09-01
The extraction of surface emissivity data provides the basis for surface composition analyses and enables evaluation of Venus' geology. The Visible and InfraRed Thermal Imaging Spectrometer (VIRTIS) aboard ESA's Venus Express mission measured, inter alia, the nightside thermal emission of Venus in the near-infrared atmospheric windows between 1.0 and 1.2 μm. These data can be used to determine information about surface properties on global scales. This requires a sophisticated approach to understand and account for the effects and interferences of the different atmospheric and surface parameters influencing the retrieved values. In the present work, results of a new technique for retrieval of the 1.0-1.2 μm surface emissivity are summarized. It includes a Multi-Window Retrieval Technique (MWT), a Multi-Spectrum Retrieval technique (MSR), and a detailed reliability analysis. MWT is based on a detailed radiative transfer model making simultaneous use of information from different atmospheric windows of an individual spectrum. MSR regularizes the retrieval by incorporating available a priori mean values, standard deviations, and spatial-temporal correlations of the parameters to be retrieved. The capability of this method is shown for a selected surface target area. Implications for geologic investigations are discussed. Based on these results, the work draws conclusions for future Venus surface composition analyses on global scales using spectral remote sensing techniques. In that context, requirements for observational scenarios and instrumental performance are investigated, and recommendations are derived to optimize spectral measurements for Venus surface studies.
NASA Astrophysics Data System (ADS)
Li, M.; Yu, T.; Chunliang, X.; Zuo, X.; Liu, Z.
2017-12-01
A new method for estimating equatorial plasma bubble (EPB) motions from airglow emission all-sky images is presented in this paper. This method, called 'cloud-derived wind technology' and widely used in satellite observations of wind, can reasonably derive zonal and meridional velocity vectors of EPB drifts by tracking a series of successive airglow 630.0 nm emission images. The airglow image data come from an all-sky airglow camera in Hainan Fuke (19.5°N, 109.2°E) supported by the Chinese Meridian Project, which receives the 630.0 nm emission from the low-latitude ionospheric F region to observe plasma bubbles. A series of preprocessing steps, e.g., image enhancement, orientation correction, and image projection, is applied to the raw observations. The plasma bubble regions extracted from the images are then divided into several small tracing windows, and each tracing window finds a target window in the search area of the following image, which is taken as the position the tracing window has moved to. On this basis, the velocity in each window is calculated using the cloud-derived wind technique. When applying the technique, the maximum correlation coefficient (MCC) and histogram of oriented gradients (HOG) methods for finding the target window, which seek the maximum correlation and the minimum Euclidean distance between two gradient histograms, respectively, are investigated and compared in detail. The MCC method is finally adopted in this study to analyze the velocity of plasma bubbles because of its better performance than HOG. All-sky images from Hainan Fuke, between August 2014 and October 2014, are analyzed to investigate the plasma bubble drift velocities using the MCC method.
Data at different local times on 9 nights are studied; we find that the zonal drift velocity at different latitudes and local times ranges from 50 m/s to 180 m/s, with a peak at about 20°N. For comparison and validation, EPB motions obtained from three traditional methods are also investigated and compared with the MCC method. The advantages and disadvantages of using cloud-derived wind technology to calculate EPB drift velocities are discussed.
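The maximum-correlation search between a tracing window and candidate target windows can be sketched as a brute-force normalized cross-correlation, as below. This is a minimal illustration only; the actual window sizes, preprocessing, and search-area limits used in the study differ:

```python
import numpy as np

def find_target_window(template, search_area):
    """Exhaustive search for the patch maximizing the zero-mean normalized
    correlation coefficient with the tracing-window template."""
    th, tw = template.shape
    sh, sw = search_area.shape
    t = template - template.mean()
    best_corr, best_rc = -np.inf, (0, 0)
    for r in range(sh - th + 1):
        for c in range(sw - tw + 1):
            patch = search_area[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((t * t).sum() * (p * p).sum())
            if denom == 0.0:
                continue  # flat patch: correlation undefined
            corr = (t * p).sum() / denom
            if corr > best_corr:
                best_corr, best_rc = corr, (r, c)
    return best_rc, best_corr

# Plant the template inside a random "next image" and recover its position
rng = np.random.default_rng(0)
area = rng.random((40, 40))
template = area[12:20, 25:33].copy()
(r, c), score = find_target_window(template, area)
print(r, c)  # 12 25
```

Dividing the pixel displacement (r, c) by the frame interval, after map projection, gives the per-window drift velocity described in the abstract.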
Measurement method for the refractive index of thick solid and liquid layers.
Santić, Branko; Gracin, Davor; Juraić, Krunoslav
2009-08-01
A simple method is proposed for the refractive index measurement of thick solid and liquid layers. In contrast to interferometric methods, no mirrors are used, and the experimental setup is undemanding and simple. The method is based on the variation of transmission caused by optical interference within the layer as a function of incidence angle. A new equation is derived for the positions of the interference extrema versus incidence angle. Scattering at the surfaces and within the sample, as well as weak absorption, do not play important roles. The method is illustrated by the refractive index measurements of sapphire, window glass, and water.
Evaluation of Energy Efficiency Performance of Heated Windows
NASA Astrophysics Data System (ADS)
Jammulamadaka, Hari Swarup
This study evaluating the performance of heated windows was funded by the WVU Research Office as a technical assistance award at the 2014 TransTech Energy Business Development Conference to the Green Heated Glass company/project owned by Frank Dlubak. The award supports a WVU researcher conducting a project important for commercialization; it was awarded to the WVU Industrial Assessment Center in 2015. The current study evaluated the performance of heated windows by developing an experimental setup to test a window at various temperatures while varying the current input to the window. The heated double-pane window was installed in an insulated box. A temperature gradient was developed across the window by cooling one side with gel-based ice packs. The other face of the window was heated by passing current through it at different wattages. The temperatures of the inside and outside panes, the current and voltage input, and the room and box temperatures were recorded and used to calculate the apparent R-value of the window when not heated versus when heated. The study concluded that a heated double-pane window is more effective at reducing heat losses, by as much as 50%, than a non-heated double-pane window, provided the window temperature is maintained close to the room temperature. If the window temperature is much higher than the room temperature, the losses through the window appear to increase beyond those of a non-heated counterpart. The issues encountered during the current round of experiments are noted, and recommendations are provided for future studies.
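The apparent R-value computation described above reduces to dividing the temperature difference across the window by the heat flow per unit area. A minimal sketch, with illustrative numbers rather than the study's measurements:

```python
def apparent_r_value(area_m2, t_in_c, t_out_c, heat_flow_w):
    """Apparent thermal resistance R = A * dT / Q, in m2*K/W (SI units)."""
    return area_m2 * (t_in_c - t_out_c) / heat_flow_w

# Illustrative: 1.5 m2 window, 20 C inside, 0 C outside, 30 W flowing through
r_si = apparent_r_value(1.5, 20.0, 0.0, 30.0)
print(r_si)  # 1.0 m2*K/W
```

In the heated case, the electrical power delivered to the pane offsets part of the measured heat flow, which is why the apparent R-value rises while the window stays near room temperature.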
Multiple infrared bands absorber based on multilayer gratings
NASA Astrophysics Data System (ADS)
Liu, Xiaoyi; Gao, Jinsong; Yang, Haigui; Wang, Xiaoyi; Guo, Chengli
2018-03-01
The present study offers an Ag/Si multilayer-grating microstructure based on an Si substrate. The microstructure exhibits designable narrowband absorption in multiple infrared wavebands, especially in mid- and long-wave infrared atmospheric windows. We investigate its resonance mode mechanism, and calculate the resonance wavelengths by the Fabry-Perot and metal-insulator-metal theories for comparison with the simulation results. Furthermore, we summarize the controlling rules of the absorption peak wavelength of the microstructure to provide a new method for generating a Si-based device with multiple working bands in infrared.
Aslan, Alper; Destek, Mehmet Akif; Okumus, Ilyas
2018-01-01
This study aims to examine the validity of the inverted U-shaped Environmental Kuznets Curve by investigating the relationship between economic growth and environmental pollution in the USA over the period 1966 to 2013. Previous studies rest on the assumption of parameter stability, i.e., that the estimated parameters do not change over the full sample. This study uses the bootstrap rolling-window estimation method to detect possible changes in causal relations and to obtain parameters for sub-sample periods. The results show that the parameter on economic growth has an increasing trend in the 1982-1996 sub-sample periods and a decreasing trend in the 1996-2013 sub-sample periods. Therefore, the existence of an inverted U-shaped Environmental Kuznets Curve is confirmed for the USA.
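The rolling-window idea, estimating the growth-pollution parameter on successive sub-samples rather than once on the full sample, can be sketched with a plain rolling OLS slope. The paper's bootstrap causality machinery is considerably more involved; this shows only the windowing skeleton, on made-up data:

```python
import numpy as np

def rolling_slopes(x, y, window):
    """OLS slope of y on x within each rolling sub-sample window."""
    slopes = []
    for start in range(len(x) - window + 1):
        xs, ys = x[start:start + window], y[start:start + window]
        xc = xs - xs.mean()
        slopes.append(float(xc @ (ys - ys.mean())) / float(xc @ xc))
    return np.array(slopes)

# On noise-free data y = 2x + 1, every window recovers the slope 2.0
x = np.arange(30.0)
s = rolling_slopes(x, 2.0 * x + 1.0, window=10)
print(s.round(6).tolist()[:3])  # [2.0, 2.0, 2.0]
```

Plotting such a slope series against the window's end date is what reveals the regime changes (1982-1996 rising, 1996-2013 falling) that a full-sample estimate averages away.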
Design and comparison of laser windows for high-power lasers
NASA Astrophysics Data System (ADS)
Niu, Yanxiong; Liu, Wenwen; Liu, Haixia; Wang, Caili; Niu, Haisha; Man, Da
2014-11-01
High-power laser systems are increasingly widely used in industrial and military applications. It is necessary to develop high-power laser systems that can operate over long periods of time without appreciable degradation in performance. When a high-energy laser beam transmits through a laser window, permanent damage may be caused to the window by energy absorption in the window material. Therefore, when designing a high-power laser system, a suitable laser window material must be selected and the laser damage threshold of the window must be known. In this paper, a thermal analysis model of a high-power laser window is established, and the relationship between the laser intensity and the thermal-stress field distribution is studied by deriving formulas using the integral-transform method. The influence of window radius, window thickness, and laser intensity on the temperature and stress field distributions is analyzed. The performance of K9 glass and fused silica glass is then compared, and the laser-induced damage mechanism is analyzed. Finally, the damage thresholds of the laser windows are calculated. The results show that, compared with K9 glass, fused silica glass has a higher damage threshold due to its good thermodynamic properties. The presented theoretical analysis and simulation results are helpful for the design and selection of high-power laser windows.
Bergmann, Helmar; Minear, Gregory; Raith, Maria; Schaffarich, Peter M
2008-12-09
The accuracy of multiple window spatial registration characterises the performance of a gamma camera for dual isotope imaging. In the present study we investigate an alternative to the standard NEMA procedure for measuring this performance parameter. A long-lived 133Ba point source, with gamma energies close to those of 67Ga, and a single-bore lead collimator were used to measure the multiple window spatial registration error. The positions of the point source in the images were calculated with the NEMA algorithm. The results were validated against the values obtained by the standard NEMA procedure, which uses a collimated liquid 67Ga source. Of the source-collimator configurations under investigation, an optimum collimator geometry, consisting of a 5 mm thick lead disk with a diameter of 46 mm and a 5 mm central bore, was selected. The multiple window spatial registration errors obtained by the 133Ba method showed excellent reproducibility (standard deviation < 0.07 mm). The values were compared with the results from the NEMA procedure obtained at the same locations and showed small differences, with a correlation coefficient of 0.51 (p < 0.05). In addition, the 133Ba point source method proved to be much easier to use. A Bland-Altman analysis showed that the 133Ba and 67Ga methods can be used interchangeably. The 133Ba point source method measures the multiple window spatial registration error with essentially the same accuracy as the NEMA-recommended procedure, but is easier and safer to use and has the potential to replace the current standard procedure.
Multi-Window Controllers for Autonomous Space Systems
NASA Technical Reports Server (NTRS)
Lurie, B. J.; Hadaegh, F. Y.
1997-01-01
Multi-window controllers select between elementary linear controllers using nonlinear windows based on the amplitude and frequency content of the feedback error. The controllers are relatively simple to implement and perform much better than linear controllers. The commanders for such controllers only order the destination point and are freed from generating the command time-profiles. Robotic missions rely heavily on the tasks of acquisition and tracking. For autonomous and optimal control of the spacecraft, the control bandwidth must be larger while the feedback can (and, therefore, must) be reduced. Combining linear compensators via a multi-window nonlinear summer guarantees the minimum-phase character of the combined transfer function. It is shown that the solution may require using several parallel branches and windows. Several examples of multi-window nonlinear controller applications are presented.
NASA Astrophysics Data System (ADS)
Guo, X.; Li, Y.; Suo, T.; Liu, H.; Zhang, C.
2017-11-01
This paper proposes a method for de-blurring images captured during the dynamic deformation of materials. De-blurring is achieved with a dynamics-based approach, which is used to estimate the point spread function (PSF) during the camera exposure window. The deconvolution process, involving iterative matrix calculations over pixels, is then performed on the GPU to decrease the time cost. Compared with the Gauss method and the Lucy-Richardson method, it gives the best image restoration results. The proposed method has been evaluated using a Hopkinson bar loading system. In comparison to the blurry image, the proposed method successfully restores the image. Image-processing applications also demonstrate that the de-blurring method can improve the accuracy and stability of digital image correlation measurements.
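The Lucy-Richardson baseline mentioned above can be sketched in one dimension; this is illustrative only (the paper's GPU implementation works on 2-D images with an estimated motion PSF):

```python
import numpy as np

def richardson_lucy_1d(blurred, psf, iterations=50):
    """1-D Richardson-Lucy deconvolution: iteratively rescale the estimate
    by the back-projected ratio of observed to re-blurred data."""
    psf_mirror = psf[::-1]
    estimate = np.full_like(blurred, blurred.mean() + 1e-12)
    for _ in range(iterations):
        reblurred = np.convolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(reblurred, 1e-12)
        estimate = estimate * np.convolve(ratio, psf_mirror, mode="same")
    return estimate

psf = np.ones(5) / 5.0                   # box blur (e.g. uniform motion)
truth = np.zeros(50); truth[25] = 1.0    # point source
blurred = np.convolve(truth, psf, mode="same")
restored = richardson_lucy_1d(blurred, psf)
print(int(np.argmax(restored)))  # peak recovered near index 25
```

The multiplicative update keeps the estimate non-negative, which is why Lucy-Richardson is a common reference method for deconvolving photon-limited images.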
SigHunt: horizontal gene transfer finder optimized for eukaryotic genomes.
Jaron, Kamil S; Moravec, Jiří C; Martínková, Natália
2014-04-15
Genomic islands (GIs) are DNA fragments incorporated into a genome through horizontal gene transfer (also called lateral gene transfer), often with functions novel for a given organism. While methods for their detection are well researched in prokaryotes, the complexity of eukaryotic genomes makes direct utilization of these methods unreliable, so labour-intensive phylogenetic searches are used instead. We present a surrogate method that investigates the nucleotide base composition of the DNA sequence in a eukaryotic genome and identifies putative GIs. We calculate a genomic signature as a vector of tetranucleotide (4-mer) frequencies using a sliding-window approach. Extending the neighbourhood of the sliding window, we establish a local kernel density estimate of the 4-mer frequency. We score the number of 4-mer frequencies in the sliding window that deviate from the credibility interval of their local genomic density using a newly developed discrete interval accumulative score (DIAS). To further improve the effectiveness of DIAS, we select informative 4-mers in a range of organisms using the tetranucleotide quality score developed herein. We show that the SigHunt method is computationally efficient and able to detect GIs in eukaryotic genomes that represent non-ameliorated integration. Thus, it is suited to scanning for change in organisms with different DNA composition. Source code and scripts, implemented in C and R and platform-independent, are freely available for download at http://www.iba.muni.cz/index-en.php?pg=research-data-analysis-tools-sighunt. Contact: 376090@mail.muni.cz or martinkova@ivb.cz.
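The first stage of such a pipeline, computing a tetranucleotide signature in a sliding window, can be sketched as below. Window and step sizes are illustrative, and the kernel density and DIAS scoring stages are omitted:

```python
from collections import Counter
from itertools import product

KMERS = ["".join(p) for p in product("ACGT", repeat=4)]  # all 256 4-mers

def tetranucleotide_signature(seq, window=5000, step=1000):
    """Return one 256-long 4-mer frequency vector per sliding window."""
    signatures = []
    for start in range(0, len(seq) - window + 1, step):
        win = seq[start:start + window]
        counts = Counter(win[i:i + 4] for i in range(len(win) - 3))
        total = len(win) - 3  # number of overlapping 4-mers in the window
        signatures.append([counts[k] / total for k in KMERS])
    return signatures

# Toy sequence: a pure ACGT repeat yields only four distinct 4-mers
sigs = tetranucleotide_signature("ACGT" * 100, window=40, step=20)
print(len(sigs), sum(1 for f in sigs[0] if f > 0))  # 19 windows, 4 non-zero 4-mers
```

A genomic-island candidate is then a window whose frequency vector deviates from the locally estimated density of such vectors, which is what DIAS accumulates.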
The ACE multi-user web-based Robotic Observatory Control System
NASA Astrophysics Data System (ADS)
Mack, P.
2003-05-01
We have developed an observatory control system that can be operated in interactive, remote or robotic modes. In interactive and remote mode the observer typically acquires the first object then creates a script through a window interface to complete observations for the rest of the night. The system closes early in the event of bad weather. In robotic mode observations are submitted ahead of time through a web-based interface. We present observations made with a 1.0-m telescope using these methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoessel, Chris
2013-11-13
This project developed a new high-performance R-10/high-SHGC window design, reviewed market positioning and evaluated manufacturing solutions required for broad market adoption. The project objectives were accomplished by: identifying viable technical solutions based on modeling of modern and potential coating stacks and IGU designs; development of new coating material sets for HM thin film stacks, as well as improved HM IGU designs to accept multiple layers of HM films; matching promising new coating designs with new HM IGU designs to demonstrate performance gains; and, in cooperation with a window manufacturer, assessing the potential for high-volume manufacturing and cost efficiency of a HM-based R-10 window with improved solar heat gain characteristics. A broad view of available materials and design options was applied to achieve the desired improvements. Gated engineering methodologies were employed to guide the development process from concept generation to a window demonstration. The project determined that a slightly de-rated window performance allows formulation of a path to achieve the desired cost reductions to support end-consumer adoption.
Fisher, Jeffrey D; Amico, K Rivet; Fisher, William A; Cornman, Deborah H; Shuper, Paul A; Trayling, Cynthia; Redding, Caroline; Barta, William; Lemieux, Anthony F; Altice, Frederick L; Dieckhaus, Kevin; Friedland, Gerald
2011-11-01
We evaluated the efficacy of LifeWindows, a theory-based, computer-administered antiretroviral (ARV) therapy adherence support intervention, delivered to HIV + patients at routine clinical care visits. 594 HIV + adults receiving HIV care at five clinics were randomized to intervention or control arms. Intervention vs. control impact in the intent-to-treat sample (including participants whose ARVs had been entirely discontinued, who infrequently attended care, or infrequently used LifeWindows) did not reach significance. Intervention impact in the On Protocol sample (328 intervention and control arm participants whose ARVs were not discontinued, who attended care and were exposed to LifeWindows regularly) was significant. On Protocol intervention vs. control participants achieved significantly higher levels of perfect 3-day ACTG-assessed adherence over time, with sensitivity analyses maintaining this effect down to 70% adherence. This study supports the utility of LifeWindows and illustrates that patients on ARVs who persist in care at clinical care sites can benefit from adherence promotion software.
Investigation of high temperature antennas for space shuttle
NASA Technical Reports Server (NTRS)
Kuhlman, E. A.
1973-01-01
The design and development of high temperature antennas for the space shuttle orbiter are discussed. The antenna designs were based on three antenna types, an annular slot (L-Band), a linear slot (C-Band), and a horn (C-Band). The design approach was based on combining an RF window, which provides thermal protection, with an off-the-shelf antenna. Available antenna window materials were reviewed and compared, and the materials most compatible with the design requirements were selected. Two antenna window design approaches were considered: one employed a high temperature dielectric material and a low density insulation material, and the other an insulation material usable for the orbiter thermal protection system. Preliminary designs were formulated and integrated into the orbiter structure. Simple electrical models, with a series of window configurations, were constructed and tested. The results of tests and analyses for the final antenna system designs are given and show that high temperature antenna systems consisting of off-the-shelf antennas thermally protected by RF windows can be designed for the Space Shuttle Orbiter.
NASA Astrophysics Data System (ADS)
Rawles, Christopher; Thurber, Clifford
2015-08-01
We present a simple, fast, and robust method for automatic detection of P- and S-wave arrivals using a nearest neighbours-based approach. The nearest neighbour algorithm is one of the most popular time-series classification methods in the data mining community and has been applied to time-series problems in many different domains. Specifically, our method is based on the non-parametric time-series classification method developed by Nikolov. Instead of building a model by estimating parameters from the data, the method uses the data itself to define the model. Potential phase arrivals are identified based on their similarity to a set of reference data consisting of positive and negative sets, where the positive set contains examples of analyst identified P- or S-wave onsets and the negative set contains examples that do not contain P waves or S waves. Similarity is defined as the square of the Euclidean distance between vectors representing the scaled absolute values of the amplitudes of the observed signal and a given reference example in time windows of the same length. For both P waves and S waves, a single pass is done through the bandpassed data, producing a score function defined as the ratio of the sum of similarity to positive examples over the sum of similarity to negative examples for each window. A phase arrival is chosen as the centre position of the window that maximizes the score function. The method is tested on two local earthquake data sets, consisting of 98 known events from the Parkfield region in central California and 32 known events from the Alpine Fault region on the South Island of New Zealand. For P-wave picks, using a reference set containing two picks from the Parkfield data set, 98 per cent of Parkfield and 94 per cent of Alpine Fault picks are determined within 0.1 s of the analyst pick. 
For S-wave picks, 94 per cent and 91 per cent of picks are determined within 0.2 s of the analyst picks for the Parkfield and Alpine Fault data set, respectively. For the Parkfield data set, our method picks 3520 P-wave picks and 3577 S-wave picks out of 4232 station-event pairs. For the Alpine Fault data set, the method picks 282 P-wave picks and 311 S-wave picks out of a total of 344 station-event pairs. For our testing, we note that the vast majority of station-event pairs have analyst picks, although some analyst picks are excluded based on an accuracy assessment. Finally, our tests suggest that the method is portable, allowing the use of a reference set from one region on data from a different region using relatively few reference picks.
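The core of the picker, a score function comparing each candidate window against positive and negative reference sets, can be sketched as below. Note a caveat: the abstract defines "similarity" via squared Euclidean distance, where smaller means more alike; the sketch uses inverse squared distance so that the score is maximized at the pick, which is one plausible reading of the method rather than the paper's exact convention:

```python
import numpy as np

def score_windows(trace, window_len, positives, negatives, eps=1e-6):
    """Score each window by (sum of similarity to positive references) /
    (sum of similarity to negative references); the pick is the window
    maximizing this score. Similarity here = inverse squared Euclidean
    distance between scaled absolute amplitudes (an assumption)."""
    scores = []
    for start in range(len(trace) - window_len + 1):
        w = np.abs(trace[start:start + window_len])
        w = w / (w.max() + eps)                      # scale to [0, 1]
        def sim(refs):
            return sum(1.0 / (eps + ((w - r) ** 2).sum()) for r in refs)
        scores.append(sim(positives) / sim(negatives))
    return np.array(scores)

# Toy trace: a step onset at sample 50; one positive (onset-shaped) reference
trace = np.zeros(100)
trace[50:] = 1.0
positives = [np.array([0.0] * 5 + [1.0] * 5)]
negatives = [np.zeros(10), np.ones(10)]          # "no onset" examples
scores = score_windows(trace, 10, positives, negatives)
print(int(np.argmax(scores)))  # 45: the window straddling the onset at 50
```

As in the paper, the arrival time would then be taken at the centre of the maximizing window, and the reference sets would hold analyst-picked onsets rather than these synthetic templates.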
Activity Recognition on Streaming Sensor Data.
Krishnan, Narayanan C; Cook, Diane J
2014-02-01
Many real-world applications that focus on addressing the needs of a human require information about the activities being performed by the human in real time. While advances in pervasive computing have led to the development of wireless and non-intrusive sensors that can capture the necessary activity information, current activity recognition approaches have so far experimented on either scripted or pre-segmented sequences of sensor events related to activities. In this paper we propose and evaluate a sliding-window-based approach to perform activity recognition in an online or streaming fashion, recognizing activities as and when new sensor events are recorded. To account for the fact that different activities are best characterized by different window lengths of sensor events, we incorporate time-decay and mutual-information-based weighting of sensor events within a window. Additional contextual information, in the form of the previous activity and the activity of the previous window, is also appended to the feature describing a sensor window. Experiments conducted to evaluate these techniques on real-world smart home datasets suggest that combining mutual-information-based weighting of sensor events with past contextual information in the feature leads to the best performance for streaming activity recognition.
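The time-decay weighting of sensor events within a window can be sketched as an exponential decay measured from the window's most recent event. The decay rate is a tunable assumption here, and the mutual-information weighting component is omitted:

```python
import math

def time_decay_weights(event_times, decay_rate=0.1):
    """Weight each sensor event by exp(-rate * age), with age measured
    from the latest event in the window, so recent events dominate."""
    latest = max(event_times)
    return [math.exp(-decay_rate * (latest - t)) for t in event_times]

w = time_decay_weights([0.0, 5.0, 10.0], decay_rate=0.1)
print([round(v, 3) for v in w])  # [0.368, 0.607, 1.0]
```

Feature values built from a window (e.g. per-sensor event counts) are then accumulated with these weights instead of raw counts, which softens the dependence on a single fixed window length.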
Kim, Sangmin; Raphael, Patrick D; Oghalai, John S; Applegate, Brian E
2016-04-01
Swept-laser sources offer a number of advantages for Phase-sensitive Optical Coherence Tomography (PhOCT). However, inter- and intra-sweep variability leads to calibration errors that adversely affect phase sensitivity. While there are several approaches to overcoming this problem, our preferred method is to simply calibrate every sweep of the laser. This approach offers high accuracy and phase stability at the expense of a substantial processing burden. In this approach, the Hilbert phase of the interferogram from a reference interferometer provides the instantaneous wavenumber of the laser, but is computationally expensive. Fortunately, the Hilbert transform may be approximated by a Finite Impulse-Response (FIR) filter. Here we explore the use of several FIR filter based Hilbert transforms for calibration, explicitly considering the impact of filter choice on phase sensitivity and OCT image quality. Our results indicate that the complex FIR filter approach is the most robust and accurate among those considered. It provides similar image quality and slightly better phase sensitivity than the traditional FFT-IFFT based Hilbert transform while consuming fewer resources in an FPGA implementation. We also explored utilizing the Hilbert magnitude of the reference interferogram to calculate an ideal window function for spectral amplitude calibration. The ideal window function is designed to carefully control sidelobes on the axial point spread function. We found that after a simple chromatic correction, calculating the window function using the complex FIR filter and the reference interferometer gave similar results to window functions calculated using a mirror sample and the FFT-IFFT Hilbert transform. Hence, the complex FIR filter can enable accurate and high-speed calibration of the magnitude and phase of spectral interferograms.
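A real type-III FIR Hilbert transformer is one simple stand-in for the filters discussed; the paper prefers a complex FIR design, so the sketch below only illustrates the FIR-approximation idea, recovering the instantaneous frequency of a test tone (all parameters are illustrative):

```python
import numpy as np

def fir_hilbert_taps(num_taps):
    """Type-III (odd-length, antisymmetric) FIR Hilbert transformer,
    Hamming-windowed: h[n] = 2/(pi*n) for odd n, 0 otherwise."""
    assert num_taps % 2 == 1
    m = num_taps // 2
    n = np.arange(-m, m + 1)
    h = np.zeros(num_taps)
    odd = n % 2 != 0
    h[odd] = 2.0 / (np.pi * n[odd])
    return h * np.hamming(num_taps)

# Recover the instantaneous frequency of a 50 Hz tone sampled at 1 kHz
fs, f0 = 1000.0, 50.0
t = np.arange(1024) / fs
x = np.cos(2 * np.pi * f0 * t)
q = np.convolve(x, fir_hilbert_taps(63), mode="same")  # quadrature estimate
phase = np.unwrap(np.arctan2(q, x))
inst_freq = np.diff(phase) * fs / (2 * np.pi)
print(round(float(np.median(inst_freq[100:-100])), 1))  # ~50.0 Hz
```

Applied to a reference-interferometer fringe signal instead of a test tone, the same phase estimate yields the instantaneous wavenumber used for per-sweep calibration, at a fraction of the cost of a full FFT-IFFT Hilbert transform.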
Centrality and transverse momentum dependence of dihadron correlations in a hydrodynamic model
NASA Astrophysics Data System (ADS)
Castilho, Wagner M.; Qian, Wei-Liang
2018-06-01
In this work, we study the centrality and transverse momentum dependence of the dihadron correlation for Au+Au collisions at 200A GeV. The numerical simulations are carried out using the hydrodynamical code NeXSPheRIO, where the initial conditions are obtained from a Regge-Gribov-based microscopic model, NeXuS. In our calculations, the centrality windows are defined in terms of multiplicity. The final correlations are obtained by background subtraction via the ZYAM method, where higher harmonics are considered explicitly. The correlations are evaluated for the 0-20%, 20%-40% and 60%-92% centrality windows. The transverse momentum dependence of the dihadron correlations is also investigated. The obtained results are compared with experimental data. The centrality dependence of the "ridge" and "double shoulder" structures is found to be consistent with the data. For the specific set of parameters employed in the present study, it is found that different ZYAM subtraction schemes might lead to different features in the resultant correlations.
Chiu, Yueh-Hsiu Mathilda; Hsu, Hsiao-Hsien Leon; Wilson, Ander; Coull, Brent A; Pendo, Mathew P; Baccarelli, Andrea; Kloog, Itai; Schwartz, Joel; Wright, Robert O; Taveras, Elsie M; Wright, Rosalind J
2017-10-01
Evolving animal studies and limited epidemiological data show that prenatal air pollution exposure is associated with childhood obesity. Timing of exposure and child sex may play an important role in these associations. We applied an innovative method to examine sex-specific sensitive prenatal windows of exposure to PM2.5 on anthropometric measures in preschool-aged children. Analyses included 239 children born ≥37 weeks gestation in an ethnically-mixed lower-income urban birth cohort. Prenatal daily PM2.5 exposure was estimated using a validated satellite-based spatio-temporal model. Body mass index z-score (BMI-z), fat mass, % body fat, subscapular and triceps skinfold thickness, waist and hip circumferences and waist-to-hip ratio (WHR) were assessed at age 4.0 ± 0.7 years. Using Bayesian distributed lag interaction models (BDLIMs), we examined sex differences in sensitive windows of weekly averaged PM2.5 levels on these measures, adjusting for child age, maternal age, education, race/ethnicity, and pre-pregnancy BMI. Mothers were primarily Hispanic (55%) or Black (26%), had ≤12 years of education (66%) and never smoked (80%). Increased PM2.5 exposure 8-17 and 15-22 weeks gestation was significantly associated with increased BMI z-scores and fat mass in boys, but not in girls. Higher PM2.5 exposure 10-29 weeks gestation was significantly associated with increased WHR in girls, but not in boys. Prenatal PM2.5 was not significantly associated with other measures of body composition. Estimated cumulative effects across pregnancy, accounting for sensitive windows and within-window effects, were 0.21 (95%CI = 0.01-0.37) for BMI-z and 0.36 (95%CI = 0.12-0.68) for fat mass (kg) in boys, and 0.02 (95%CI = 0.01-0.03) for WHR in girls, all per µg/m³ increase in PM2.5. Increased prenatal PM2.5 exposure was more strongly associated with indices of increased whole body size in boys and with an indicator of body shape in girls.
Methods to better characterize vulnerable windows may provide insight into underlying mechanisms contributing to sex-specific associations. Copyright © 2017 Elsevier Inc. All rights reserved.
Prospective PET image quality gain calculation method by optimizing detector parameters.
Theodorakis, Lampros; Loudos, George; Prassopoulos, Vasilios; Kappas, Constantine; Tsougos, Ioannis; Georgoulias, Panagiotis
2015-12-01
Lutetium-based scintillators with high-performance electronics introduced time-of-flight (TOF) reconstruction in the clinical setting. Let G' be the total signal-to-noise ratio gain in a reconstructed image using the TOF kernel compared with conventional reconstruction modes. G' is then the product of the G1 gain arising from the reconstruction process itself and (n-1) other gain factors (G2, G3, … Gn) arising from the inherent properties of the detector. We calculated the G2 and G3 gains resulting from the optimization of the coincidence and energy window widths for prompts and singles, respectively. Both quantitative and image-based validated Monte Carlo models of Lu2SiO5 (LSO) TOF-permitting and Bi4Ge3O12 (BGO) TOF-nonpermitting detectors were used for the calculations. The G2 and G3 values were 1.05 and 1.08 for the BGO detector, and G3 was 1.07 for the LSO. A value of almost unity for G2 of the LSO detector indicated that altering the energy window setting yields no significant optimization. G' was found to be ∼1.4 times higher for the TOF-permitting detector after reconstruction and optimization of the coincidence and energy windows. The method described could potentially predict image noise variations induced by altering detector acquisition parameters. It could also contribute to the long-lasting debate on the cost-efficiency of TOF scanners versus non-TOF ones, as some vendors are nowadays returning to non-TOF product-line designs in an effort to reduce crystal costs. Therefore, exploring the limits of image quality gain by altering the parameters of these detectors remains a topical issue.
NASA Astrophysics Data System (ADS)
Karamat, Muhammad I.; Farncombe, Troy H.
2015-10-01
Simultaneous multi-isotope Single Photon Emission Computed Tomography (SPECT) imaging has a number of applications in cardiac, brain, and cancer imaging. The major concern, however, is the significant crosstalk contamination due to photon scatter between the different isotopes. The current study focuses on a method of crosstalk compensation between two isotopes in simultaneous dual-isotope SPECT acquisition applied to cancer imaging using 99mTc and 111In. We have developed an iterative image reconstruction technique that simulates the photon down-scatter from one isotope into the acquisition window of a second isotope. Our approach uses an accelerated Monte Carlo (MC) technique for the forward projection step in an iterative reconstruction algorithm. The MC-estimated scatter contamination of a radionuclide contained in a given projection view is then used to compensate for the photon contamination in the acquisition window of the other nuclide. We use a modified ordered subset-expectation maximization (OS-EM) algorithm, named simultaneous ordered subset-expectation maximization (Sim-OSEM), to perform this step. We have undertaken a number of simulation tests and phantom studies to verify this approach. The proposed reconstruction technique was also evaluated by reconstruction of experimentally acquired phantom data. Reconstruction using Sim-OSEM showed very promising results in terms of contrast recovery and uniformity of the object background compared with alternative reconstruction methods implementing alternative scatter correction schemes (i.e., triple energy window correction or separately acquired projection data). In this study the evaluation was based on the quality of reconstructed images and activity estimates obtained using Sim-OSEM. In order to quantitate the possible improvement in spatial resolution and signal-to-noise ratio (SNR) observed in this study, further simulation and experimental studies are required.
Frequency domain analysis of errors in cross-correlations of ambient seismic noise
NASA Astrophysics Data System (ADS)
Liu, Xin; Ben-Zion, Yehuda; Zigone, Dimitri
2016-12-01
We analyse random errors (variances) in cross-correlations of ambient seismic noise in the frequency domain, which differs from previous time domain methods. Extending previous theoretical results on the ensemble-averaged cross-spectrum, we estimate the confidence interval of the stacked cross-spectrum of a finite amount of data at each frequency, using non-overlapping windows of fixed length. The extended theory also connects the amplitude and phase variances with the variance of each complex spectrum value. Analysis of synthetic stationary ambient noise is used to estimate the confidence interval of the stacked cross-spectrum obtained with different lengths of noise data corresponding to different numbers of evenly spaced windows of the same duration. This method allows estimating the signal-to-noise ratio (SNR) of a noise cross-correlation in the frequency domain, without specifying the filter bandwidth or signal/noise windows that are needed for time domain SNR estimations. Based on synthetic ambient noise data, we also compare the probability distributions, causal part amplitude and SNR of the stacked cross-spectrum function using one-bit normalization or pre-whitening with those obtained without these pre-processing steps. Natural continuous noise records contain both ambient noise and small earthquakes that are inseparable from the noise with the existing pre-processing steps. Using probability distributions of random cross-spectrum values based on the theoretical results provides an effective way to exclude such small earthquakes, and additional data segments (outliers) contaminated by signals of different statistics (e.g. rain, cultural noise), from continuous noise waveforms. This technique is applied to constrain the values and uncertainties of the amplitude and phase velocity of the stacked noise cross-spectrum at different frequencies, using data from southern California at both regional scale (˜35 km) and a dense linear array (˜20 m) across the plate-boundary faults.
A block bootstrap resampling method is used to account for temporal correlation of noise cross-spectrum at low frequencies (0.05-0.2 Hz) near the ocean microseismic peaks.
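The windowed stacking described above can be sketched in a few lines: split two synchronized noise records into non-overlapping windows of fixed length, average the per-window cross-spectra, and read a frequency-domain SNR off the scatter of the complex values. This is an illustrative reconstruction under our own simplifying assumptions (plain FFT cross-spectra, no tapering or pre-whitening), not the authors' code.

```python
import numpy as np

def stacked_cross_spectrum(x, y, win_len):
    """Average cross-spectra over non-overlapping windows of fixed length
    and estimate a per-frequency SNR from the scatter of the complex
    values (hypothetical helper, illustrative only)."""
    n_win = len(x) // win_len
    specs = []
    for k in range(n_win):
        seg = slice(k * win_len, (k + 1) * win_len)
        specs.append(np.conj(np.fft.rfft(x[seg])) * np.fft.rfft(y[seg]))
    specs = np.asarray(specs)
    mean_cs = specs.mean(axis=0)           # stacked (ensemble-averaged) cross-spectrum
    var_cs = specs.var(axis=0) / n_win     # variance of the stack shrinks ~1/n_win
    snr = np.abs(mean_cs) / np.sqrt(var_cs + 1e-30)
    return np.fft.rfftfreq(win_len), mean_cs, snr
```

Unlike a time-domain SNR, no signal/noise time windows or filter bands are needed: the confidence of the stack is read directly from the spread of the per-window complex spectra at each frequency.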
Burriel-Valencia, Jordi; Puche-Panadero, Ruben; Martinez-Roman, Javier; Sapena-Bano, Angel; Pineda-Sanchez, Manuel
2018-01-01
The aim of this paper is to introduce a new methodology for the fault diagnosis of induction machines working in the transient regime, when time-frequency analysis tools are used. The proposed method relies on the use of the optimized Slepian window for performing the short time Fourier transform (STFT) of the stator current signal. It is shown that for a given sequence length of finite duration, the Slepian window has the maximum concentration of energy, greater than can be reached with a gated Gaussian window, which is usually used as the analysis window. In this paper, the use and optimization of the Slepian window for fault diagnosis of induction machines is theoretically introduced and experimentally validated through the test of a 3.15-MW induction motor with broken bars during the start-up transient. The theoretical analysis and the experimental results show that the use of the Slepian window can highlight the fault components in the current’s spectrogram with a significant reduction of the required computational resources. PMID:29316650
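The Slepian (DPSS) window invoked above is available off the shelf; a minimal sketch of using it as the STFT analysis window, with illustrative signal parameters rather than the paper's 3.15-MW test data, might look like:

```python
import numpy as np
from scipy.signal import stft
from scipy.signal.windows import dpss, gaussian

# Build a Slepian (DPSS) window and a gated Gaussian of the same length,
# then use the Slepian window as the STFT analysis window on a synthetic
# "stator current" containing a small drifting sideband (toy values).
M = 512                         # analysis window length in samples
slepian = dpss(M, NW=2.5)       # first DPSS sequence: maximal energy concentration
gauss = gaussian(M, std=M / 6)  # truncated Gaussian, for comparison

fs = 1000.0
t = np.arange(0, 4, 1 / fs)
current = np.sin(2 * np.pi * 50 * t) + 0.05 * np.sin(2 * np.pi * (50 + 4 * t) * t)
f, tt, Z = stft(current, fs=fs, window=slepian, nperseg=M)  # spectrogram values
```

The better energy concentration of the DPSS window for a given length is what lets low-amplitude fault sidebands stand out near the strong supply component in the spectrogram.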
Vendemia, Nicholas; Chao, Jerry; Ivanidze, Jana; Sanelli, Pina; Spinelli, Henry M
2011-01-01
Medpor (Porex Surgical, Inc, Newnan, GA) is composed of porous polyethylene and is commonly used in craniofacial reconstruction. When complications such as seroma or abscess formation arise, diagnostic modalities are limited because Medpor is radiolucent on conventional radiologic studies. This poses a problem in situations where imaging is necessary to distinguish the implant from surrounding tissues. The aim of this study was to present a clinically useful method for imaging Medpor with conventional computed tomographic (CT) scanning. Eleven patients (12 total implants) who had undergone reconstructive surgery with Medpor were included in the study. A retrospective review of CT scans done between 1 and 16 months postoperatively was performed using 3 distinct CT window settings. Measurements of implant dimensions and Hounsfield units were recorded and qualitatively assessed. Of the 3 distinct window settings studied, namely, "bone" (W1100/L450), "soft tissue" (W500/L50), and "implant" (W800/L200), the implant window proved the most ideal, allowing the investigators to visualize and evaluate Medpor in all cases. Qualitative analysis revealed that Medpor implants could be distinguished from surrounding tissue in both the implant and soft tissue windows, with a density falling between that of fat and fluid. In 1 case, Medpor could not be visualized in the soft tissue window, although it could be visualized in the implant window. Quantitative analysis demonstrated a mean (SD) density of -38.7 (7.4) Hounsfield units. Medpor may be optimally visualized on conventional CT scans using the implant window settings W800/L200, which can aid in imaging Medpor and diagnosing implant-related complications.
Method and apparatus for monitoring the flow of mercury in a system
Grossman, M.W.
1987-12-15
An apparatus and method for monitoring the flow of mercury in a system are disclosed. The equipment enables the entrainment of the mercury in a carrier gas e.g., an inert gas, which passes as mercury vapor between a pair of optically transparent windows. The attenuation of the emission is indicative of the quantity of mercury (and its isotopes) in the system. A 253.7 nm light is shone through one of the windows and the unabsorbed light is detected through the other window. The absorption of the 253.7 nm light is thereby measured whereby the quantity of mercury passing between the windows can be determined. The apparatus includes an in-line sensor for measuring the quantity of mercury. It includes a conduit together with a pair of apertures disposed in a face to face relationship and arranged on opposite sides of the conduit. A pair of optically transparent windows are disposed upon a pair of viewing tubes. A portion of each of the tubes is disposed inside of the conduit and within each of the apertures. The two windows are disposed in a face to face relationship on the ends of the viewing tubes and the entire assembly is hermetically sealed from the atmosphere whereby when 253.7 nm ultraviolet light is shone through one of the windows and detected through the other, the quantity of mercury which is passing by can be continuously monitored due to absorption which is indicated by attenuation of the amplitude of the observed emission. 4 figs.
Data assimilation experiments using diffusive back-and-forth nudging for the NEMO ocean model
NASA Astrophysics Data System (ADS)
Ruggiero, G. A.; Ourmières, Y.; Cosme, E.; Blum, J.; Auroux, D.; Verron, J.
2015-04-01
The diffusive back-and-forth nudging (DBFN) is an easy-to-implement iterative data assimilation method based on the well-known nudging method. It consists of a sequence of forward and backward model integrations, within a given time window, both of them using a feedback term to the observations. In the DBFN, the nudging asymptotic behaviour is therefore translated into an infinite number of iterations within a bounded time domain. In this method, the backward integration is carried out thanks to what is called the backward model, which is basically the forward model with a reversed time step sign. To maintain numerical stability, the diffusion terms also have their sign reversed, giving a diffusive character to the algorithm. In this article the performance of the DBFN in controlling a primitive equation ocean model is investigated. In this kind of model, non-resolved scales are modelled by diffusion operators which dissipate energy that cascades from large to small scales. Thus, the DBFN approximations and their consequences for the data assimilation system set-up are analysed. Our main result is that the DBFN may provide results comparable to those produced by 4Dvar, with a much simpler implementation and a shorter CPU time for convergence. The conducted sensitivity tests show that the 4Dvar benefits from long assimilation windows to propagate surface information downwards, whereas for the DBFN it is worth using short assimilation windows to reduce the impact of diffusion-induced errors. Moreover, the DBFN is less sensitive to the first guess than the 4Dvar.
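On a toy scalar model, the back-and-forth iteration described above can be sketched as follows; the model, the nudging gain K and the diffusion coefficient nu are illustrative choices of ours, not the NEMO configuration:

```python
import numpy as np

def dbfn_estimate(obs, dt, nu, K, n_iter=20):
    """Toy diffusive back-and-forth nudging for the scalar model
    x' = -nu * x. Each iteration runs a forward sweep nudged toward the
    observations, then a backward sweep in which the diffusion term's
    sign is reversed so it stays dissipative when integrating backward."""
    n = len(obs)
    x0 = 0.0                                   # first guess of the initial state
    xb = np.zeros(n)
    for _ in range(n_iter):
        x = np.empty(n)
        x[0] = x0
        for i in range(1, n):                  # forward, nudged integration
            x[i] = x[i - 1] + dt * (-nu * x[i - 1] + K * (obs[i - 1] - x[i - 1]))
        xb = np.empty(n)
        xb[-1] = x[-1]
        for i in range(n - 2, -1, -1):         # backward, sign-reversed diffusion
            xb[i] = xb[i + 1] - dt * nu * xb[i + 1] + dt * K * (obs[i + 1] - xb[i + 1])
        x0 = xb[0]                             # updated initial condition
    return xb
```

Each pass pulls the trajectory toward the observations in both directions of time, so repeating the sweeps within the fixed window plays the role of the nudging method's asymptotic limit.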
Thalagala, N
2015-11-01
The normative age ranges during which cohorts of children achieve milestones are called windows of achievement. The patterns of these windows of achievement are known to be both genetically and environmentally dependent. This study aimed to determine the windows of achievement for motor, social emotional, language and cognitive development milestones for infants and toddlers in Sri Lanka. A set of 293 milestones identified through a literature review were subjected to content validation using parent and expert reviews, which resulted in the selection of a revised set of 277 milestones. Thereafter, a sample of 1036 children from 2 months to 30 months was examined to see whether or not they had attained the selected milestones. Percentile ages of attaining each milestone were determined using a rearranged closed-form equation related to logistic regression. The parameters required for the calculations were derived through logistic regression of milestone achievement statuses against the ages of the children. These percentile ages were used to define the respective windows of achievement. A set of 178 robust indicators that represent motor, socio emotional, language and cognitive development skills, and their windows of achievement relevant to 2 to 24 months of age, were determined. Windows of achievement for six gross motor milestones determined in the study were shown to closely overlap a similar set of windows of achievement published by the World Health Organization, indicating the validity of some findings. A methodology combining content validation based on qualitative techniques and age validation based on regression modelling was found to be effective for determining age percentiles for attaining milestones and determining the respective windows of achievement. © 2015 John Wiley & Sons Ltd.
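The rearranged closed-form equation mentioned above admits a compact illustration: once P(attained | age) is fitted as a logistic curve, the age at any percentile follows by inverting it. The coefficients below are hypothetical, not the study's estimates:

```python
import math

def percentile_age(b0, b1, p):
    """Age at which a proportion p of children have attained a milestone,
    from a logistic fit P(attained | age) = 1 / (1 + exp(-(b0 + b1*age))).
    Rearranging gives the closed form age_p = (logit(p) - b0) / b1."""
    return (math.log(p / (1.0 - p)) - b0) / b1

# A window of achievement can then span, e.g., the 1st-99th percentile ages.
b0, b1 = -6.0, 1.2   # hypothetical coefficients: P = 50% at age 5.0 months
window = (percentile_age(b0, b1, 0.01), percentile_age(b0, b1, 0.99))
```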
Real-Time Detection of Dust Devils from Pressure Readings
NASA Technical Reports Server (NTRS)
Wagstaff, Kiri
2009-01-01
A method for real-time detection of dust devils at a given location is based on identifying the abrupt, temporary decreases in atmospheric pressure that are characteristic of dust devils as they travel through that location. The method was conceived for use in a study of dust devils on the Martian surface, where bandwidth limitations encourage the transmission of only those blocks of data that are most likely to contain information about features of interest, such as dust devils. The method, which is a form of intelligent data compression, could readily be adapted to use for the same purpose in scientific investigation of dust devils on Earth. In this method, the readings of an atmospheric- pressure sensor are repeatedly digitized, recorded, and processed by an algorithm that looks for extreme deviations from a continually updated model of the current pressure environment. The question in formulating the algorithm is how to model current normal observations and what minimum magnitude deviation can be considered sufficiently anomalous as to indicate the presence of a dust devil. There is no single, simple answer to this question: any answer necessarily entails a compromise between false detections and misses. For the original Mars application, the answer was sought through analysis of sliding time windows of digitized pressure readings. Windows of 5-, 10-, and 15-minute durations were considered. The windows were advanced in increments of 30 seconds. Increments of other sizes can also be used, but computational cost increases as the increment decreases and analysis is performed more frequently. Pressure models were defined using a polynomial fit to the data within the windows. For example, the figure depicts pressure readings from a 10-minute window wherein the model was defined by a third-degree polynomial fit to the readings and dust devils were identified as negative deviations larger than both 3 standard deviations (from the mean) and 0.05 mbar in magnitude. 
An algorithm embodying the detection scheme of this example was found to yield a miss rate of just 8 percent and a false-detection rate of 57 percent when evaluated on historical pressure-sensor data collected by the Mars Pathfinder lander. Since dust devils occur infrequently over the course of a mission, prioritizing observations that contain successful detections could greatly conserve bandwidth allocated to a given mission. This technique can be used on future Mars landers and rovers, such as Mars Phoenix and the Mars Science Laboratory.
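The detection scheme of the example (a third-degree polynomial model over a sliding window, with negative deviations beyond both 3 standard deviations and 0.05 mbar flagged) can be sketched as follows; this is our illustrative reconstruction, not the flight algorithm:

```python
import numpy as np

def detect_dust_devils(pressure, samples_per_window, step, sigma_k=3.0, min_drop=0.05):
    """Fit a third-degree polynomial to each sliding window of pressure
    readings and flag negative residuals exceeding both sigma_k standard
    deviations and min_drop (mbar). Parameter names are ours."""
    hits = set()
    t = np.arange(samples_per_window, dtype=float)
    for start in range(0, len(pressure) - samples_per_window + 1, step):
        win = pressure[start:start + samples_per_window]
        model = np.polyval(np.polyfit(t, win, 3), t)   # current pressure model
        resid = win - model
        thresh = max(sigma_k * resid.std(), min_drop)
        for i in np.nonzero(resid < -thresh)[0]:
            hits.add(start + int(i))                   # sample indices of candidates
    return sorted(hits)
```

The absolute floor (min_drop) keeps the detector from firing on quiet windows where three standard deviations is a tiny pressure change, while the sigma term adapts the threshold to the local pressure variability.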
Derivation of cloud-free-region atmospheric motion vectors from FY-2E thermal infrared imagery
NASA Astrophysics Data System (ADS)
Wang, Zhenhui; Sui, Xinxiu; Zhang, Qing; Yang, Lu; Zhao, Hang; Tang, Min; Zhan, Yizhe; Zhang, Zhiguo
2017-02-01
The operational cloud-motion tracking technique fails to retrieve atmospheric motion vectors (AMVs) in areas lacking cloud; and while the water vapor shown in water vapor imagery can be used, the heights assigned to the retrieved AMVs are mostly in the upper troposphere. As the noise-equivalent temperature difference (NEdT) performance of the FY-2E split window (10.3-11.5 μm, 11.6-12.8 μm) channels has been improved, the weak signals representing the spatial texture of water vapor and aerosols in cloud-free areas can be strengthened with algorithms based on the difference principle, and applied in calculating AMVs in the lower troposphere. This paper is a preliminary summary for this purpose, in which the principles and algorithm schemes for the temporal difference, split window difference and second-order difference (SD) methods are introduced. Results from simulation and case experiments are reported in order to verify and evaluate the methods, based on comparisons between the retrievals and the "truth". The results show that all three algorithms, though not perfect in some cases, generally work well. Moreover, the SD method appears to be the best in suppressing the surface temperature influence and clarifying the spatial texture of water vapor and aerosols. The accuracy with respect to NCEP 800 hPa reanalysis data was found to be acceptable, as compared with the accuracy of the cloud motion vectors.
Sakao, Yukinori; Kuroda, Hiroaki; Mun, Mingyon; Uehara, Hirofumi; Motoi, Noriko; Ishikawa, Yuichi; Nakagawa, Ken; Okumura, Sakae
2014-01-01
Background: We aimed to clarify whether the size of a lung adenocarcinoma evaluated using the mediastinal window on computed tomography is an important and useful modality for predicting invasiveness, lymph node metastasis and prognosis in small adenocarcinoma. Methods: We evaluated 176 patients with small lung adenocarcinomas (diameter, 1–3 cm) who underwent standard surgical resection. Tumours were examined using computed tomography with thin-section conditions (1.25 mm thick on high-resolution computed tomography), with tumour dimensions evaluated under two settings: lung window and mediastinal window. We also determined the patient age, gender, preoperative nodal status, tumour size, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and pathological status (lymphatic vessel, vascular vessel or pleural invasion). Recurrence-free survival was used for prognosis. Results: Lung window, mediastinal window, tumour disappearance ratio and preoperative nodal status were significant predictive factors for recurrence-free survival in univariate analyses. Areas under the receiver operator curves for recurrence were 0.76, 0.73 and 0.65 for mediastinal window, tumour disappearance ratio and lung window, respectively. Lung window, mediastinal window, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and preoperative nodal status were significant predictive factors for lymph node metastasis in univariate analyses; areas under the receiver operator curves were 0.61, 0.76, 0.72 and 0.66 for lung window, mediastinal window, tumour disappearance ratio and preoperative serum carcinoembryonic antigen levels, respectively.
Lung window, mediastinal window, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and preoperative nodal status were also significant factors for lymphatic vessel, vascular vessel or pleural invasion in univariate analyses; areas under the receiver operator curves were 0.60, 0.81, 0.81 and 0.65 for lung window, mediastinal window, tumour disappearance ratio and preoperative serum carcinoembryonic antigen levels, respectively. Conclusions: Based on univariate analyses, with logistic regression and receiver operator curves computed for variables with p-values of <0.05, our results suggest that measuring tumour size using the mediastinal window on high-resolution computed tomography is a simple and useful preoperative prognostic modality in small adenocarcinoma. PMID:25365326
NASA Astrophysics Data System (ADS)
Horton, Pascal; Jaboyedoff, Michel; Obled, Charles
2018-01-01
Analogue methods provide a statistical precipitation prediction based on synoptic predictors supplied by general circulation models or numerical weather prediction models. The method samples a selection of days in the archives that are similar to the target day to be predicted, and considers their set of corresponding observed precipitation (the predictand) as the conditional distribution for the target day. The relationship between the predictors and predictands relies on some parameters that characterize how and where the similarity between two atmospheric situations is defined. This relationship is usually established by a semi-automatic sequential procedure that has strong limitations: (i) it cannot automatically choose the pressure levels and temporal windows (hour of the day) for a given meteorological variable, (ii) it cannot handle dependencies between parameters, and (iii) it cannot easily handle new degrees of freedom. In this work, a global optimization approach relying on genetic algorithms is introduced to optimize all parameters jointly and automatically. The global optimization was applied to some variants of the analogue method for the Rhône catchment in the Swiss Alps. The performance scores increased compared to the reference methods, especially for days with high precipitation totals. The resulting parameters were found to be relevant and coherent between the different subregions of the catchment. Moreover, they were obtained automatically and objectively, which reduces the effort that needs to be invested in exploration attempts when adapting the method to a new region or a new predictand. For example, it obviates the need to assess a large number of combinations of pressure levels and temporal windows of predictor variables that were manually selected beforehand. The optimization could also take into account parameter inter-dependencies.
In addition, the approach allowed for new degrees of freedom, such as a possible weighting between pressure levels, and non-overlapping spatial windows.
The new analysis method of PWQ in the DRAM pattern
NASA Astrophysics Data System (ADS)
Han, Daehan; Chang, Jinman; Kim, Taeheon; Lee, Kyusun; Kim, Yonghyeon; Kang, Jinyoung; Hong, Aeran; Choi, Bumjin; Lee, Joosung; Kim, Hyoung Jun; Lee, Kweonjae; Hong, Hyoungsun; Jin, Gyoyoung
2016-03-01
In sub-2X nm node processes, feedback on pattern weak points is increasingly significant. It is therefore very important to extract systematic defects in Double Patterning Technology (DPT); however, it is impossible to predict exact systematic defects with current photo simulation tools.[1] For this reason, the Process Window Qualification (PWQ) method has become essential. Conventional PWQ methods rely on die-to-die image comparison using an e-beam or bright-field machine, and in some cases the results are evaluated by the person who reviews the images. The conventional die-to-die comparison method, however, has a critical problem: if the reference die and the comparison die share the same pattern problem, the issue patterns are not detected by the current defect-detecting approach. Aside from inspection accuracy, reviewing the wafer requires much effort and time to identify the genuine issue patterns. Therefore, our company has adopted a die-to-database matching PWQ method using an NGR machine. The main features of the NGR are as follows: first, die-to-database matching; second, high speed; and finally, the use of massive data volumes for pattern inspection.[2] Even though our die-to-database matching PWQ method measures massive amounts of data, our margin decision process is based on image shape, which causes several significant problems. First, because of the long analysis time, the development period of a new device is increased. Moreover, because of resource limitations, the full chip area may not be examined, so the resulting PWQ weak points cannot represent all possible defects. Finally, since the PWQ margin is not decided by a mathematical value, a solid definition of a killing defect is impossible. To overcome these problems, we introduce a statistical-value-based process window qualification method that increases the accuracy of the process margin and reduces the review time.
This makes it possible to see the genuine margin of critical pattern issues that cannot be seen with our conventional PWQ inspection, and hence to enhance the accuracy of the PWQ margin.
Leyde, Brian P.; Klein, Sanford A; Nellis, Gregory F.; Skye, Harrison
2017-01-01
This paper presents a new method called the Crossed Contour Method for determining the effective properties (borehole radius and ground thermal conductivity) of a vertical ground-coupled heat exchanger. The borehole radius is used as a proxy for the overall borehole thermal resistance. The method has been applied to both simulated and experimental borehole Thermal Response Test (TRT) data using the Duct Storage vertical ground heat exchanger model implemented in the TRansient SYstems Simulation software (TRNSYS). The Crossed Contour Method generates a parametric grid of simulated TRT data for different combinations of borehole radius and ground thermal conductivity in a series of time windows. The error between the average of the simulated and experimental bore field inlet and outlet temperatures is calculated for each set of borehole properties within each time window. Using these data, contours of the minimum error are constructed in the parameter space of borehole radius and ground thermal conductivity. When all of the minimum error contours for each time window are superimposed, the point where the contours cross (intersect) identifies the effective borehole properties for the model that most closely represents the experimental data in every time window and thus over the entire length of the experimental data set. The computed borehole properties are compared with results from existing model inversion methods including the Ground Property Measurement (GPM) software developed by Oak Ridge National Laboratory, and the Line Source Model. PMID:28785125
Smith, Lauren H; Hargrove, Levi J; Lock, Blair A; Kuiken, Todd A
2011-04-01
Pattern recognition-based control of myoelectric prostheses has shown great promise in research environments, but has not been optimized for use in a clinical setting. To explore the relationship between classification error, controller delay, and real-time controllability, 13 able-bodied subjects were trained to operate a virtual upper-limb prosthesis using pattern recognition of electromyogram (EMG) signals. Classification error and controller delay were varied by training different classifiers with a variety of analysis window lengths ranging from 50 to 550 ms and either two or four EMG input channels. Offline analysis showed that classification error decreased with longer window lengths (p < 0.01). Real-time controllability was evaluated with the target achievement control (TAC) test, which prompted users to maneuver the virtual prosthesis into various target postures. The results indicated that user performance improved with lower classification error (p < 0.01) and was reduced with longer controller delay (p < 0.01), as determined by the window length. Therefore, both of these effects should be considered when choosing a window length; it may be beneficial to increase the window length if this results in a reduced classification error, despite the corresponding increase in controller delay. For the system employed in this study, the optimal window length was found to be between 150 and 250 ms, which is within acceptable controller delays for conventional multistate amplitude controllers.
The Use of Variable Q1 Isolation Windows Improves Selectivity in LC-SWATH-MS Acquisition.
Zhang, Ying; Bilbao, Aivett; Bruderer, Tobias; Luban, Jeremy; Strambio-De-Castillia, Caterina; Lisacek, Frédérique; Hopfgartner, Gérard; Varesio, Emmanuel
2015-10-02
As tryptic peptides and metabolites are not equally distributed along the mass range, the probability of cross fragment ion interference is higher in certain windows when fixed Q1 SWATH windows are applied. We evaluated the benefits of utilizing variable Q1 SWATH windows with regard to selectivity improvement. Variable windows based on equalizing the distribution of either the precursor ion population (PIP) or the total ion current (TIC) within each window were generated by an in-house software tool, swathTUNER. These two variable Q1 SWATH window strategies outperformed, with respect to quantification and identification, the basic approach using a fixed window width (FIX) for proteomic profiling of human monocyte-derived dendritic cells (MDDCs). Thus, 13.8% and 8.4% additional peptide precursors, which resulted in 13.1% and 10.0% more proteins, were confidently identified by SWATH using the PIP and TIC strategies, respectively, in the MDDC proteomic sample. On the basis of the spectral library purity score, some improvement warranted by variable Q1 windows was also observed, albeit to a lesser extent, in the metabolomic profiling of human urine. We show that the novel concept of "scheduled SWATH" proposed here, which incorporates (i) variable isolation windows and (ii) precursor retention-time segmentation, further improves both peptide and metabolite identifications.
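The PIP strategy of equalizing the precursor-ion population per window amounts to placing window boundaries at equal-count quantiles of the precursor m/z distribution. A minimal Python sketch follows; the m/z values are synthetic, and swathTUNER's actual logic (including the TIC-weighted variant and window overlaps) is not reproduced here:

```python
import random

def variable_windows(mz_values, n_windows):
    """Window edges at equal-count quantiles of the precursor m/z list
    (the PIP strategy; TIC would weight each precursor by intensity)."""
    mz = sorted(mz_values)
    n = len(mz)
    idx = [round(i * n / n_windows) for i in range(n_windows + 1)]
    edges = [mz[min(i, n - 1)] for i in idx]
    return list(zip(edges[:-1], edges[1:]))

# Synthetic, deliberately skewed precursor m/z distribution (400-1200 Th).
random.seed(0)
mz = [400 + 800 * random.random() ** 2 for _ in range(1000)]
windows = variable_windows(mz, 8)
```

Each window then holds roughly the same number of precursors: narrow windows where the distribution is dense, wide windows where it is sparse.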
Yeo, Boon Y.; McLaughlin, Robert A.; Kirk, Rodney W.; Sampson, David D.
2012-01-01
We present a high-resolution three-dimensional position tracking method that allows an optical coherence tomography (OCT) needle probe to be scanned laterally by hand, providing the high degree of flexibility and freedom required in clinical usage. The method is based on a magnetic tracking system, which is augmented by cross-correlation-based resampling and a two-stage moving window average algorithm to improve upon the tracker's limited intrinsic spatial resolution, achieving 18 µm RMS position accuracy. A proof-of-principle system was developed, with successful image reconstruction demonstrated on phantoms and on ex vivo human breast tissue validated against histology. This freehand scanning method could contribute toward clinical implementation of OCT needle imaging. PMID:22808429
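The two-stage moving-window average used to smooth the tracker positions can be sketched as a cascade of two running means. The window sizes and the one-dimensional trace below are illustrative assumptions, and the cross-correlation-based resampling stage is omitted:

```python
def moving_average(xs, w):
    """Centred moving-window average (the window shrinks near the ends)."""
    half = w // 2
    return [sum(xs[max(0, i - half): i + half + 1])
            / (min(len(xs), i + half + 1) - max(0, i - half))
            for i in range(len(xs))]

def two_stage_smooth(xs, w1=5, w2=5):
    """Two cascaded moving-window averages; the window sizes here are
    illustrative, not the tracker's actual settings."""
    return moving_average(moving_average(xs, w1), w2)

# Noisy "lateral position" trace: constant truth plus alternating jitter.
raw = [5.0 + (-1) ** i for i in range(100)]
smooth = two_stage_smooth(raw)
```

Cascading two short windows attenuates jitter far more than one pass while keeping the effective window, and hence the loss of spatial resolution, small.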
Optical Evaluation of DMDs with UV-Grade FS, Sapphire, MgF2 Windows and Reflectance of Bare Devices
NASA Technical Reports Server (NTRS)
Quijada, Manuel A.; Heap, Sara; Travinsky, Anton; Vorobiev, Dmitry; Ninkov, Zoran; Raisanen, Alan; Roberto, Massimo
2016-01-01
Digital Micro-mirror Devices (DMDs) have been identified as an alternative to microshutter arrays for space-based multi-object spectrometers (MOS); specifically, a DMD serves as a reprogrammable slit mask in the MOS at the heart of a proposed Galactic Evolution Spectroscopic Explorer (GESE). Unfortunately, the protective borosilicate windows limit the use of DMDs in the UV and IR regimes, where the glass has insufficient throughput. In this work, we present our efforts to replace standard DMD windows with custom windows made from UV-grade fused silica, Low Absorption Optical Sapphire (LAOS) and magnesium fluoride. We present reflectance measurements of the antireflection-coated windows and a reflectance study of the DMD's active area (window removed). Furthermore, we investigated the long-term stability of the DMD reflectance and the recoating of a device with fresh Al coatings.
A Laplacian-based image filtering method using a switching noise detector.
Ranjbaran, Ali; Hassan, Anwar Hasni Abu; Jafarpour, Mahboobe; Ranjbaran, Bahar
2015-01-01
This paper presents a Laplacian-based image filtering method. Using a local noise-estimator function in an energy-functional minimization scheme, we show that the Laplacian, well known as an edge-detection operator, can also be used for noise removal. The algorithm can be implemented on a 3x3 window and is easily tuned by the number of iterations. Image denoising reduces to updating each pixel by its Laplacian value weighted by the local noise estimator; the only parameter controlling smoothness is the number of iterations. The noise-reduction quality of the method is evaluated and compared with classic algorithms such as Wiener and Total Variation based filters for Gaussian noise, and with the state-of-the-art BM3D method on several images. The algorithm is simple, fast, and comparable with many classic denoising algorithms for Gaussian noise.
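The iterative Laplacian update can be sketched as below. The fixed step `alpha` is an assumption standing in for the paper's local-noise-estimator weight, and the image is a toy synthetic example:

```python
def laplacian_denoise(img, iterations=3, alpha=0.1):
    """Iteratively move each interior pixel toward its 4-neighbour mean by
    a fraction of its 3x3 Laplacian. The fixed step `alpha` stands in for
    the paper's local-noise-estimator weight (an assumption)."""
    h, w = len(img), len(img[0])
    for _ in range(iterations):
        out = [row[:] for row in img]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                lap = (img[y - 1][x] + img[y + 1][x]
                       + img[y][x - 1] + img[y][x + 1] - 4 * img[y][x])
                out[y][x] = img[y][x] + alpha * lap
        img = out
    return img

# Flat grey image (value 100) corrupted by checkerboard noise of amplitude 10.
noisy = [[100 + 10 * (-1) ** (x + y) for x in range(8)] for y in range(8)]
clean = laplacian_denoise(noisy)
```

In the full method the weight would be large in noisy, flat regions and small near edges, which is what preserves small patterns while removing noise.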
Du, Feng; Jiao, Jun
2016-04-01
The present study used a spatial blink task and a cuing task to examine the boundary between feature-based capture and relation-based capture. Feature-based capture occurs when distractors match the target feature such as target color. The occurrence of relation-based capture is contingent upon the feature relation between target and distractor (e.g., color relation). The results show that color distractors that match the target-nontarget color relation do not consistently capture attention when they appear outside of the attentional window, but distractors appearing outside the attentional window that match the target color consistently capture attention. In contrast, color distractors that best match the target-nontarget color relation but not the target color, are more likely to capture attention when they appear within the attentional window. Consistently, color cues that match the target-nontarget color relation produce a cuing effect when they appear within the attentional window, while target-color matched cues do not. Such a double dissociation between color-based capture and color-relation-based capture indicates functionally distinct mechanisms for these 2 types of attentional selection. This also indicates that the spatial blink task and the uninformative cuing task are measuring distinctive aspects of involuntary attention. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Energy Conservation in the Home. Performance Based Lesson Plans.
ERIC Educational Resources Information Center
Alabama State Dept. of Education, Montgomery. Home Economics Service.
These ten performance-based lesson plans concentrate on tasks related to energy conservation in the home. They are (1) caulk cracks, holes, and joints; (2) apply weatherstripping to doors and windows; (3) add plastic/solar screen window covering; (4) arrange furniture for saving energy; (5) set heating/cooling thermostat; (6) replace faucet…
Protective broadband window coatings
NASA Astrophysics Data System (ADS)
Askinazi, Joel; Narayanan, Authi A.
1997-06-01
Optical windows employed in current and future airborne and ground-based optical sensor systems are required to provide long service life under extreme environmental conditions, including blowing sand and high-speed rain. State-of-the-art sensor systems employ common-aperture windows which must provide optical bandpasses from the TV to the LWIR. Operation Desert Storm experience indicates that current optical coatings provide limited environmental protection, which adversely affects window life-cycle cost. Most of these production coatings also have limited optical bandpasses (LWIR, MWIR, or TV-NIR). A family of optical coatings has been developed which provides a significant increase in rain and sand impact protection for current optical window materials. These coatings can also be tailored to provide either narrow optical bandwidth (e.g., LWIR) or broadband transmittance (TV-LWIR). They have been applied to a number of standard optical window materials and have successfully completed airborne rain and sand abrasion tests with minimal impact on optical window performance. Test results are presented. Low life-cycle cost is anticipated, as well as the ability to operate windows in even more taxing environments than is currently feasible.
NASA Astrophysics Data System (ADS)
Bunai, Tasya; Rokhmatuloh; Wibowo, Adi
2018-05-01
In this paper, two methods of retrieving Land Surface Temperature (LST) from the thermal infrared data supplied by bands 10 and 11 of the Thermal Infrared Sensor (TIRS) onboard Landsat 8 are compared. The first is the mono-window algorithm developed by Qin et al., and the second is the split-window algorithm of Rozenstein et al. The purpose of this study is to map the spatial distribution of land surface temperature and to determine the more accurate algorithm by computing the root mean square error (RMSE). Finally, we compare the spatial distributions of land surface temperature produced by both algorithms; the split-window algorithm is the more accurate, with an RMSE of 7.69 °C.
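The ranking criterion is plain RMSE between retrieved and reference temperatures. For concreteness, a minimal implementation (the variable names are illustrative; the retrieval algorithms themselves are not reproduced here):

```python
import math

def rmse(estimates, reference):
    """Root-mean-square error, used here to rank the two LST retrieval
    algorithms against reference temperatures."""
    return math.sqrt(sum((e - r) ** 2 for e, r in zip(estimates, reference))
                     / len(estimates))
```

Usage would be, e.g., `rmse(lst_split_window, lst_reference)` versus `rmse(lst_mono_window, lst_reference)`, with the smaller value indicating the more accurate algorithm.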
Method for the rapid synthesis of large quantities of metal oxide nanowires at low temperatures
Sunkara, Mahendra Kumar [Louisville, KY; Vaddiraju, Sreeram [Mountain View, CA; Mozetic, Miran [Ljubljan, SI; Cvelbar, Uros [Idrija, SI
2009-09-22
A process for the rapid synthesis of metal oxide nanoparticles at low temperatures and methods which facilitate the fabrication of long metal oxide nanowires. The method is based on treatment of metals with oxygen plasma. Using oxygen plasma at low temperatures allows for rapid growth unlike other synthesis methods where nanomaterials take a long time to grow. Density of neutral oxygen atoms in plasma is a controlling factor for the yield of nanowires. The oxygen atom density window differs for different materials. By selecting the optimal oxygen atom density for various materials the yield can be maximized for nanowire synthesis of the metal.
Denoising Medical Images using Calculus of Variations
Kohan, Mahdi Nakhaie; Behnam, Hamid
2011-01-01
We propose a method for medical image denoising using the calculus of variations and local variance estimation by shaped windows. This method reduces any additive noise and preserves small patterns and edges of images. A pyramid structure-texture decomposition of images is used to separate noise and texture components based on local variance measures. The experimental results show that the proposed method gives visual improvement as well as better SNR, RMSE and PSNR than common medical image denoising methods. Experimental results in denoising a sample magnetic resonance image show that SNR, PSNR and RMSE were improved by 19, 9 and 21 percent, respectively. PMID:22606674
Development of a collapsible reinforced cylindrical space observation window
NASA Technical Reports Server (NTRS)
Khan, A. Q.
1971-01-01
Existing material technology was applied to the development of a collapsible transparent window suitable for manned spacecraft structures. The effort reported encompasses the evaluation of flame retardants intended for use in the window matrix polymer, evaluation of reinforcement angle which would allow for a twisting pantographing motion as the cylindrical window is mechanically collapsed upon itself, and evaluation of several reinforcement embedment methods. A fabrication technique was developed to produce a reinforced cylindrical space window of 45.7 cm diameter and 61.0 cm length. The basic technique involved the application of a clear film on a male-section mold; winding axial and girth reinforcements and vacuum casting the outer layer. The high-strength transparent window composite consisted of a polyether urethane matrix reinforced with an orthogonal pattern of black-coated carbon steel wire cable. A thin film of RTV silicone rubber was applied to both surfaces of the urethane. The flexibility, retraction system, and installation system are described.
Designing intuitive dialog boxes in Windows environments
NASA Astrophysics Data System (ADS)
Souetova, Natalia
2000-01-01
Several approaches to user-interface design were analyzed. Most existing interfaces are difficult for newcomers to understand and learn. Some guidelines for interface design were defined, based on the psychology of perceiving computer imagery and on experience gained while working with artists and designers without special technical education. Several applications with standard Windows interfaces were developed based on these results; the Windows environment was chosen because of its current popularity. This improved the quality and speed of users' work and reduced the number of problems and mistakes, so that highly qualified employees no longer spend their working time on explanation and help.
DETAIL, WINDOW ON THE NORTH FACADE, LOOKING SOUTH Eglin ...
DETAIL, WINDOW ON THE NORTH FACADE, LOOKING SOUTH - Eglin Air Force Base, Storehouse & Company Administration, Southeast of Flager Road, Nassau Lane, & southern edge of Weekly Bayou, Valparaiso, Okaloosa County, FL
3D Building Façade Reconstruction Using Handheld Laser Scanning Data
NASA Astrophysics Data System (ADS)
Sadeghi, F.; Arefi, H.; Fallah, A.; Hahn, M.
2015-12-01
Three-dimensional building modelling has been an interesting research topic for decades, and photogrammetric methods seem to provide the only economical means to acquire truly 3D city data. Given the enormous development of 3D building reconstruction for applications such as navigation systems, location-based services and urban planning, the need to consider semantic features (such as windows and doors) has become more essential than ever, and a 3D block model of buildings is no longer sufficient. To reconstruct the façade elements completely, we employed the high-density point cloud data obtained from a handheld laser scanner. The advantage of the handheld laser scanner, with its capability of directly acquiring very dense 3D point clouds, is that there is no need to derive three-dimensional data from multiple images using structure-from-motion techniques. This paper presents a grammar-based algorithm for façade reconstruction using handheld laser scanner data. The proposed method is a combination of bottom-up (data-driven) and top-down (model-driven) methods: first the basic façade elements are extracted in a bottom-up way, and then they serve as prior knowledge for further processing to complete the models, especially in occluded and incomplete areas. The first step of the data-driven modelling is to use a conditional RANSAC (RANdom SAmple Consensus) algorithm to detect the façade plane in the point cloud and remove noisy objects such as trees, pedestrians, traffic signs and poles. The façade planes are then divided into three depth layers to detect protrusion, indentation and wall points using a density histogram. Owing to the poor reflection of laser beams from glass, windows appear as holes in the point cloud and can therefore be distinguished and extracted from it more easily than the other façade elements.
The next step is rasterizing the indentation layer, which holds the window and door information. After rasterization, morphological operators are applied to remove small irrelevant objects. Horizontal splitting lines are then employed to determine floors, and vertical splitting lines to detect walls, windows, and doors. The wall, window and door elements, named terminals, are clustered during a classification process; each terminal carries its width as a property. Among the terminals, windows and doors are named geometry tiles in the definition of the vocabulary of the grammar rules. Higher-order structures inferred by grouping the tiles yield the production rules. The rules, together with the three-dimensionally modelled façade elements, constitute a formal grammar named the façade grammar. This grammar holds all the information necessary to reconstruct façades in the style of the given building; thus, it can be used to improve and complete façade reconstruction in areas with no or limited sensor data. Finally, a 3D reconstructed façade model is generated, whose geometric size and position accuracy depend on the density of the raw point cloud.
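The plane-detection step can be illustrated with a plain RANSAC plane fit. This is a generic sketch on synthetic data; the paper's additional conditions for rejecting trees, pedestrians and other clutter are omitted:

```python
import random

def plane_from_points(p, q, r):
    """Unit normal and offset of the plane through three 3D points."""
    u = [q[i] - p[i] for i in range(3)]
    v = [r[i] - p[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    norm = sum(c * c for c in n) ** 0.5
    if norm < 1e-12:
        return None                        # degenerate (collinear) sample
    n = [c / norm for c in n]
    return n, -sum(n[i] * p[i] for i in range(3))

def ransac_plane(points, n_iter=200, tol=0.05):
    """Plain RANSAC plane detection: repeatedly fit a plane to 3 random
    points and keep the plane with the most inliers within `tol`."""
    rng = random.Random(1)
    best = []
    for _ in range(n_iter):
        model = plane_from_points(*rng.sample(points, 3))
        if model is None:
            continue
        n, d = model
        inliers = [p for p in points
                   if abs(sum(n[i] * p[i] for i in range(3)) + d) < tol]
        if len(inliers) > len(best):
            best = inliers
    return best

# Synthetic scan: 100 façade points near the z = 0 plane plus 20 off-plane
# points standing in for clutter such as poles and pedestrians.
rng = random.Random(2)
scan = ([(rng.uniform(0, 10), rng.uniform(0, 5), rng.gauss(0, 0.01))
         for _ in range(100)]
        + [(rng.uniform(0, 10), rng.uniform(0, 5), rng.uniform(1, 5))
           for _ in range(20)])
facade = ransac_plane(scan)
```

The returned inlier set is the dominant façade plane; the off-plane clutter is rejected because it never falls within the inlier tolerance.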
Ahmad, Muneer; Jung, Low Tan; Bhuiyan, Al-Amin
2017-10-01
Digital signal processing techniques commonly employ fixed-length window filters to process signal contents. DNA signals differ in character from common digital signals, since their contents are nucleotides, which carry genetic-code context and exhibit fuzzy behavior owing to their special structure and order in the DNA strand. Employing conventional fixed-length window filters for DNA signal processing produces spectral leakage and hence signal noise; a biologically context-aware, adaptive window filter is required to process DNA signals. This paper introduces a biologically inspired fuzzy adaptive window median filter (FAWMF) that computes the fuzzy membership strength of the nucleotides in each window slide and filters the nucleotides by median filtering with a combination of s-shaped and z-shaped filters. Since coding regions exhibit 3-base periodicity caused by an unbalanced nucleotide distribution, i.e. a relatively high bias in nucleotide usage, this fundamental characteristic is exploited in the FAWMF to suppress signal noise. Along with the adaptive response of the FAWMF, a strong correlation between the median nucleotides and the Π-shaped filter was observed, which produced enhanced discrimination between coding and non-coding regions compared with fixed-length conventional window filters. The proposed FAWMF attains a significant enhancement in coding-region identification, i.e. 40% to 125%, compared with other conventional window filters tested on more than 250 benchmark and randomly selected DNA datasets of different organisms. This study shows that conventional fixed-length window filters applied to DNA signals do not achieve significant results because the nucleotides carry genetic-code context; the proposed FAWMF algorithm is adaptive and processes DNA signal contents significantly better.
Applied to a variety of DNA datasets, the algorithm produced noteworthy discrimination between coding and non-coding regions, in contrast to fixed-length conventional window filters. Copyright © 2017 Elsevier B.V. All rights reserved.
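For orientation only, a fixed-length sliding-window median over a numerically mapped sequence shows the basic filtering operation that FAWMF builds on. The nucleotide-to-number mapping is an assumption for illustration, and the fuzzy membership computation and adaptive window of the actual FAWMF are not reproduced:

```python
from statistics import median

# Hypothetical numeric mapping, chosen only for illustration.
CODE = {"A": 0, "C": 1, "G": 2, "T": 3}

def window_median_filter(seq, w=9):
    """Sliding-window median over the mapped DNA sequence; a fixed-length
    stand-in for FAWMF's fuzzy, adaptive window."""
    xs = [CODE[n] for n in seq]
    half = w // 2
    return [median(xs[max(0, i - half): i + half + 1])
            for i in range(len(xs))]

filtered = window_median_filter("ATGCGTACGT" * 3)
```

The paper's point is precisely that such a fixed window ignores the genetic-code context; FAWMF replaces it with a window whose extent and weights adapt to the fuzzy membership of the nucleotides.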
Windowing technique in FM radar realized by FPGA for better target resolution
NASA Astrophysics Data System (ADS)
Ponomaryov, Volodymyr I.; Escamilla-Hernandez, Enrique; Kravchenko, Victor F.
2006-09-01
Remote sensing systems such as SAR usually apply FM signals to resolve closely spaced targets and improve SNR. A main drawback of pulse compression of FM radar signals is that it can add range side-lobes to reflectivity measurements. Using weighting-window processing in the time domain, it is possible to decrease the side-lobe level (SLL) of the output radar signal significantly, which permits resolving small or low-power targets that are masked by powerful ones. Classical windows such as Hamming, Hanning, Blackman-Harris, Kaiser-Bessel, Dolph-Chebyshev and Gauss are usually used in window processing; in addition to these, we also use a novel class of windows based on atomic function (AF) theory. To compare simulation and experimental results, we applied standard parameters such as the amplification coefficient, maximum side-lobe level, and main-lobe width. In this paper we also propose implementing the compression-windowing model in hardware on a Field Programmable Gate Array (FPGA), which offers benefits such as fast implementation, dynamic reconfiguration, and field programmability. The pulse compression design was investigated on the FPGA, applying classical and novel window techniques to reduce the SLL in the absence and presence of noise. The paper presents simulated and experimental examples of detecting small or closely spaced targets in imaging radar, as well as experimental hardware results of windowing in FM radar, demonstrating resolution of several targets for the classical rectangular, Hamming, and Kaiser-Bessel windows and some novel ones: Up(x), fup 4(x)•D 3(x), fup 6(x)•G 3(x), etc. We conclude that windows created on the basis of the AFs decrease the SLL better, in both the presence and absence of noise and when moving away from the main lobe, than classical windows.
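The side-lobe suppression effect of windowing before pulse compression can be reproduced in a small simulation. The chirp length and parameters are illustrative, and the atomic-function windows are not implemented here; only the rectangular-versus-Hamming comparison is shown:

```python
import cmath
import math

N = 256
# Unit-amplitude linear-FM (chirp) pulse.
chirp = [cmath.exp(1j * math.pi * n * n / N) for n in range(N)]
rect = [1.0] * N
hamming = [0.54 - 0.46 * math.cos(2 * math.pi * n / (N - 1)) for n in range(N)]

def compress(echo, window):
    """Matched filtering (pulse compression) against a windowed replica:
    the magnitude of the cross-correlation at every lag."""
    ref = [window[n] * chirp[n] for n in range(N)]
    out = []
    for lag in range(-(N - 1), N):
        acc = 0j
        for n in range(max(0, -lag), min(N, N - lag)):
            acc += echo[n + lag] * ref[n].conjugate()
        out.append(abs(acc))
    return out

def peak_sidelobe_db(profile, guard=4):
    """Highest side-lobe relative to the main-lobe peak, in dB,
    excluding `guard` lags around the peak."""
    peak = max(profile)
    centre = profile.index(peak)
    side = max(v for i, v in enumerate(profile) if abs(i - centre) > guard)
    return 20 * math.log10(side / peak)

sll_rect = peak_sidelobe_db(compress(chirp, rect))
sll_hamm = peak_sidelobe_db(compress(chirp, hamming))
```

Running this shows the Hamming-weighted replica trading a slightly wider, lower main lobe for a markedly lower SLL than the rectangular (unwindowed) case, which is exactly the trade the paper exploits to unmask weak targets.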
A Spiral-Based Downscaling Method for Generating 30 m Time Series Image Data
NASA Astrophysics Data System (ADS)
Liu, B.; Chen, J.; Xing, H.; Wu, H.; Zhang, J.
2017-09-01
The spatial detail and updating frequency of land cover data are important factors in land-surface dynamic monitoring applications at high spatial resolution. However, the fragmented patches and seasonal variability of some land cover types (e.g. small crop fields, wetlands) make the generation of land cover data labor-intensive and difficult. Utilizing high-spatial-resolution multi-temporal image data is a possible solution; unfortunately, the spatial and temporal resolutions of available remote sensing data such as the Landsat or MODIS datasets can hardly satisfy the minimum mapping unit and the frequency of current land cover mapping/updating at the same time. Generating a high-resolution time series may be a compromise to cover this shortage in the land cover updating process. One popular way is to downscale multi-temporal MODIS data with other high-spatial-resolution auxiliary data such as Landsat. However, the usual manner of downscaling a pixel based on a window may lead to an underdetermined problem in heterogeneous areas, resulting in uncertainty for some high-spatial-resolution pixels; the downscaled multi-temporal data can therefore hardly reach the spatial resolution of Landsat data. A spiral-based method is introduced here to downscale image data of low spatial and high temporal resolution to high spatial and high temporal resolution. By searching for similar pixels in the adjacent region along a spiral, a pixel set is built up pixel by pixel; adopting this pixel set largely prevents the underdetermined problem when solving the linear system. Using ordinary least squares, the method inverts the endmember values of the linear system, and the high-spatial-resolution image is reconstructed band by band on the basis of the high-spatial-resolution class map and the endmember values.
The high-spatial-resolution time series is then formed from these images, image by image. A simulated experiment and a remote-sensing image downscaling experiment were conducted. In the simulated experiment, the 30 m class-map dataset GlobeLand30 was adopted to investigate how the method avoids the underdetermined problem in the downscaling procedure, and a comparison between the spiral and the window was conducted. Further, MODIS NDVI and Landsat image data were used to generate a 30 m NDVI time series in the downscaling experiment. The simulated results showed that the proposed method performs robustly when downscaling pixels in heterogeneous regions and indicated that it is superior to traditional window-based methods. The high-resolution time series generated may benefit the mapping and updating of land cover data.
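The spiral search order itself is easy to sketch. The snippet below generates the outward square-spiral pixel offsets along which similar neighbours would be collected; the similarity test and the least-squares inversion of the linear system are omitted:

```python
def spiral_offsets(k):
    """First k pixel offsets of an outward square spiral around (0, 0):
    the order in which neighbouring pixels are visited, nearest first."""
    x = y = 0
    dx, dy = 1, 0
    out = [(0, 0)]
    steps = 1
    while len(out) < k:
        for _ in range(2):              # two legs per ring before growing
            for _ in range(steps):
                x, y = x + dx, y + dy
                out.append((x, y))
                if len(out) == k:
                    return out
            dx, dy = -dy, dx            # 90-degree turn
        steps += 1
    return out
```

With `k = 9` the offsets cover exactly the 3 × 3 neighbourhood; larger `k` keeps expanding outward, which is how the pixel set grows one similar pixel at a time until the linear system is no longer underdetermined.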
Zhang, Zhengyi; Zhang, Gaoyan; Zhang, Yuanyuan; Liu, Hong; Xu, Junhai; Liu, Baolin
2017-12-01
This study aimed to investigate the functional connectivity in the brain during the cross-modal integration of polyphonic characters in Chinese audio-visual sentences. The visual sentences were all semantically reasonable, and the audible pronunciations of the polyphonic characters in the corresponding sentence contexts varied across four conditions. To measure functional connectivity, correlation, coherence and the phase synchronization index (PSI) were used, and multivariate pattern analysis was then performed to detect the consensus functional connectivity patterns. These analyses were confined to the time windows of three event-related potential components, P200, N400 and the late positive shift (LPS), to investigate the dynamic changes of the connectivity patterns at different cognitive stages. We found that when differentiating polyphonic characters with abnormal pronunciations from those with appropriate ones in audio-visual sentences, significant classification results were obtained based on the coherence in the time window of the P200 component, the correlation in the time window of the N400 component, and the coherence and PSI in the time window of the LPS component. Moreover, the spatial distributions in these time windows also differed, with the recruitment of frontal sites in the time window of the P200 component, frontal-central-parietal regions in the time window of the N400 component, and central-parietal sites in the time window of the LPS component. These findings demonstrate that the functional interaction mechanisms differ at different stages of the audio-visual integration of polyphonic characters.
NASA Astrophysics Data System (ADS)
Manconi, A.; Giordan, D.
2015-02-01
We investigate the use of landslide failure forecast models by exploiting near-real-time monitoring data. Starting from the inverse velocity theory, we analyze landslide surface displacements on different temporal windows, and apply straightforward statistical methods to obtain confidence intervals on the estimated time of failure. Here we describe the main concepts of our method, and show an example of application to a real emergency scenario, the La Saxe rockslide, Aosta Valley region, northern Italy. Based on the herein presented case study, we identify operational thresholds based on the reliability of the forecast models, in order to support the management of early warning systems in the most critical phases of the landslide emergency.
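The inverse-velocity theory forecasts failure where a straight line fitted to 1/v versus time reaches zero. A minimal sketch with synthetic monitoring data follows; the paper's statistical confidence intervals over multiple temporal windows are not reproduced:

```python
def inverse_velocity_forecast(times, velocities):
    """Least-squares line through 1/v versus t; its zero crossing is the
    forecast time of failure (the inverse-velocity method)."""
    inv = [1.0 / v for v in velocities]
    n = len(times)
    mt = sum(times) / n
    mi = sum(inv) / n
    slope = (sum((t - mt) * (y - mi) for t, y in zip(times, inv))
             / sum((t - mt) ** 2 for t in times))
    intercept = mi - slope * mt
    return -intercept / slope            # time where 1/v reaches zero

# Synthetic accelerating slope: v = 1/(t_f - t) with failure at t_f = 100.
times = list(range(0, 90, 10))
velocities = [1.0 / (100.0 - t) for t in times]
forecast = inverse_velocity_forecast(times, velocities)
```

Repeating this fit over different temporal windows of the displacement record, as the paper does, yields a spread of forecast times from which confidence intervals and operational thresholds can be derived.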
Intelligent windows using new thermotropic layers with long-term stability
NASA Astrophysics Data System (ADS)
Watanabe, Haruo
1995-08-01
This paper concerns autonomously responsive light-adjusting windows (intelligent windows), a class of smart windows that adjust light on receiving environmental energy. More specifically, it describes a thermotropic window panel that laminates and seals a new type of highly viscous polymer aqueous-solution gel. Conventional thermotropic window panels have never been put to practical use, because a uniform, reversible change between a colorless transparent state (water-clear) and a translucent scattering state (paper-white) was not possible; the change involved phase separation and generated non-uniformity. After fundamental studies of hydrophobic bonding, the author solved this problem by developing a polymer aqueous-solution gel with an amphipathic molecule as a third component, in addition to water and a water-soluble polymer with hydrophobic radicals, based on the molecular-spacer concept. In addition, the author established the peripheral technologies and succeeded in experimentally fabricating a panel-type 'Affinity's Intelligent Window (AIW)' that has reached the level of practical use.
Batch production of microchannel plate photo-multipliers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frisch, Henry J.; Wetstein, Matthew; Elagin, Andrey
In-situ methods for the batch fabrication of flat-panel micro-channel plate (MCP) photomultiplier tube (PMT) detectors (MCP-PMTs), without transporting either the window or the detector assembly inside a vacuum vessel are provided. The method allows for the synthesis of a reflection-mode photocathode on the entrance to the pores of a first MCP or the synthesis of a transmission-mode photocathode on the vacuum side of a photodetector entrance window.
INTERIOR OF BUILDING 2, TYPICAL OFFICE (#212) WINDOW AND HEAT ...
INTERIOR OF BUILDING 2, TYPICAL OFFICE (#212) WINDOW AND HEAT REGISTER, SECOND FLOOR. FACING SOUTH - Roosevelt Base, Dispensary, Corner of Colorado Street & Richardson Avenue, Long Beach, Los Angeles County, CA
Northwest side view showing 3 windows and security light ...
Northwest side view showing 3 windows and security light - U.S. Naval Base, Pearl Harbor, Naval Hospital, Animal House, Near intersection of Hospital Way & Third Street, Pearl City, Honolulu County, HI
13. Interior view of open; showing exterior window, open doorways ...
13. Interior view of open; showing exterior window, open doorways into offices; northeast corner of building; view to southeast. - Ellsworth Air Force Base, Warehouse, 789 Twining Street, Blackhawk, Meade County, SD
An Evidence-Based Forensic Taxonomy of Windows Phone Communication Apps.
Cahyani, Niken Dwi Wahyu; Martini, Ben; Choo, Kim-Kwang Raymond; Ab Rahman, Nurul Hidayah; Ashman, Helen
2018-05-01
Communication apps can be an important source of evidence in a forensic investigation (e.g., in the investigation of a drug trafficking or terrorism case where the communications apps were used by the accused persons during the transactions or planning activities). This study presents the first evidence-based forensic taxonomy of Windows Phone communication apps, using an existing two-dimensional Android forensic taxonomy as a baseline. Specifically, 30 Windows Phone communication apps, including Instant Messaging (IM) and Voice over IP (VoIP) apps, are examined. Artifacts extracted using physical acquisition are analyzed, and seven digital evidence objects of forensic interest are identified, namely: Call Log, Chats, Contacts, Locations, Installed Applications, SMSs and User Accounts. Findings from this study would help to facilitate timely and effective forensic investigations involving Windows Phone communication apps. © 2017 American Academy of Forensic Sciences.
Siegel, Nisan; Rosen, Joseph; Brooker, Gary
2013-10-01
Recent advances in Fresnel incoherent correlation holography (FINCH) increase the signal-to-noise ratio in hologram recording by interference of images from two diffractive lenses with focal lengths close to the image plane. Holograms requiring short reconstruction distances are created that reconstruct poorly with existing Fresnel propagation methods. Here we show a dramatic improvement in reconstructed fluorescent images when a 2D Hamming window function is substituted for the disk window typically used to bound the impulse response in the Fresnel propagation. Greatly improved image contrast and quality are shown for simulated and experimentally determined FINCH holograms using a 2D Hamming window, without significant loss in lateral or axial resolution.
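One common way to build a 2D Hamming window is as the outer product of two 1D Hamming windows. The abstract does not state whether a separable or a radially symmetric construction was used, so the separable form below is an assumption for illustration:

```python
import math

def hamming_2d(n):
    """Separable n x n 2D Hamming window (outer product of two 1D Hamming
    windows); it tapers smoothly to near zero at the edges, unlike the
    hard-edged disk it would replace when bounding the impulse response."""
    w = [0.54 - 0.46 * math.cos(2 * math.pi * i / (n - 1)) for i in range(n)]
    return [[w[i] * w[j] for j in range(n)] for i in range(n)]
```

The smooth taper is what suppresses the ringing that a hard-edged disk aperture introduces into short-distance Fresnel reconstructions.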
Optical Distortion Evaluation in Large Area Windows using Interferometry
NASA Technical Reports Server (NTRS)
Youngquist, Robert C.; Skow, Miles; Nurge, Mark A.
2015-01-01
It is important that imagery seen through large area windows, such as those used on space vehicles, not be substantially distorted. Many approaches are described in the literature for measuring the distortion of an optical window, but most suffer from either poor resolution or processing difficulties. In this paper a new definition of distortion is presented, allowing accurate measurement using an optical interferometer. This new definition is shown to be equivalent to the definitions provided by the military and the standards organizations. In order to determine the advantages and disadvantages of this new approach, the distortion of an acrylic window is measured using three different methods: image comparison, Moiré interferometry, and phase-shifting interferometry.
n-Gram-Based Text Compression.
Nguyen, Vu H; Nguyen, Hien T; Duong, Hieu N; Snasel, Vaclav
2016-01-01
We propose an efficient method for compressing Vietnamese text using n-gram dictionaries. It achieves a significant compression ratio in comparison with those of state-of-the-art methods on the same dataset. Given a text, the proposed method first splits it into n-grams and then encodes them based on n-gram dictionaries. In the encoding phase, we use a sliding window with a size that ranges from bigram to five-gram to obtain the best encoding stream. Each n-gram is encoded by two to four bytes based on its corresponding n-gram dictionary. We collected a 2.5 GB text corpus from several Vietnamese news agencies to build n-gram dictionaries from unigram to five-gram, obtaining dictionaries with a total size of 12 GB. To evaluate our method, we collected a testing set of 10 text files of different sizes. The experimental results indicate that our method achieves a compression ratio of around 90% and outperforms state-of-the-art methods.
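The sliding-window encoding step described in the abstract can be sketched as a greedy longest-match over the n-gram dictionaries. The toy dictionary, the `(n, code)` output format, and the single-word literal fallback below are assumptions for illustration; the paper's actual dictionaries are built from a 2.5 GB corpus and map matches to 2-4-byte codes:

```python
def encode(words, dictionaries):
    """Greedily match the longest n-gram (five-gram down to bigram) found
    in its dictionary; unmatched words are emitted as literals.

    dictionaries maps n -> {ngram tuple: integer code}.
    Returns a list of (n, code) pairs, with n == 1 marking a literal word.
    """
    out, i = [], 0
    while i < len(words):
        for n in range(5, 1, -1):          # try five-grams first, then shorter
            gram = tuple(words[i:i + n])
            if len(gram) == n and gram in dictionaries.get(n, {}):
                out.append((n, dictionaries[n][gram]))
                i += n
                break
        else:                              # no dictionary match: literal word
            out.append((1, words[i]))
            i += 1
    return out

# Toy example with a hypothetical bigram dictionary:
dicts = {2: {("xin", "chao"): 0, ("cam", "on"): 1}}
print(encode("xin chao cac ban cam on".split(), dicts))
```

A production encoder would additionally choose among overlapping matches to minimize total output bytes ("the best encoding stream"), rather than committing greedily.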
Mittmann, Philipp; Ernst, A; Mittmann, M; Todt, I
2016-11-01
To preserve residual hearing in cochlear implant candidates, the atraumatic insertion of the cochlear electrode has become a focus of cochlear implant research. In a previous study, intracochlear pressure changes during the opening of the round window membrane were investigated. In the current study, intracochlear pressure changes during opening of the round window membrane under dry and transfluid conditions were investigated. Round window openings were performed in an artificial cochlear model. Intracochlear pressure changes were measured using a micro-optical pressure sensor placed in the apex. Openings of the round window membrane were performed under dry and wet conditions using a cannula and a diode laser. Statistically significant differences in intracochlear pressure changes were seen between the different methods used for opening the round window membrane. Lower pressure changes were seen when opening the round window membrane with the diode laser than with the cannula. A significant difference was also seen between the dry and wet conditions. The atraumatic approach to the cochlea is assumed to be essential for the preservation of residual hearing. Opening the round window under wet conditions produces a significant advantage in intracochlear pressure changes compared with dry conditions by limiting negative outward pressure.
Delorme, Arnaud; Miyakoshi, Makoto; Jung, Tzyy-Ping; Makeig, Scott
2014-01-01
With the advent of modern computing methods, modeling trial-to-trial variability in biophysical recordings, including electroencephalography (EEG), has become of increasing interest. Yet no widely used method exists for comparing variability in ordered collections of single-trial data epochs across conditions and subjects. We have developed a method based on an ERP-image visualization tool in which the potential, spectral power, or some other measure at each time point in a set of event-related single-trial data epochs is represented as a color-coded horizontal line; these lines are then stacked to form a 2-D colored image. Moving-window smoothing across trial epochs can make otherwise hidden event-related features in the data more perceptible. Stacking trials in different orders, for example ordered by subject reaction time, by context-related information such as inter-stimulus interval, or by some other characteristic of the data (e.g., latency-window mean power or phase of some EEG source), can reveal aspects of the multifold complexities of trial-to-trial EEG data variability. This study demonstrates new methods for computing and visualizing grand ERP-image plots across subjects and for performing robust statistical testing on the resulting images. These methods have been implemented and made freely available in the EEGLAB signal-processing environment that we maintain and distribute. PMID:25447029
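The core ERP-image construction the abstract describes, stacking single-trial epochs into a 2-D image, optionally reordering trials by a sort key such as reaction time, then smoothing down the trial axis, can be sketched as follows. The function name, argument names, and moving-average smoother are assumptions for illustration, not the EEGLAB implementation:

```python
import numpy as np

def erp_image(epochs, order=None, smooth=1):
    """Stack single-trial epochs into a 2-D image and smooth across trials.

    epochs : (n_trials, n_times) array of single-trial measures
             (potential, spectral power, ...)
    order  : optional per-trial sort key, e.g. reaction times
    smooth : width of the moving-average window applied down the trial axis
    """
    img = np.asarray(epochs, dtype=float)
    if order is not None:
        img = img[np.argsort(order)]        # e.g. sort trials by reaction time
    if smooth > 1:
        kernel = np.ones(smooth) / smooth   # vertical moving-average window
        img = np.apply_along_axis(
            lambda col: np.convolve(col, kernel, mode="same"), 0, img)
    return img                              # rows: trials, columns: time points
```

Plotting the returned array as a color-mapped image (rows = trials, columns = time) yields the ERP-image view; EEGLAB's own tool offers many more sorting and smoothing options than this sketch.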