Least Squares Moving-Window Spectral Analysis.
Lee, Young Jong
2017-08-01
Least squares regression is proposed as a moving-window method for analysis of a series of spectra acquired as a function of external perturbation. The least squares moving-window (LSMW) method can be considered an extended form of Savitzky-Golay differentiation for nonuniform perturbation spacing. LSMW is characterized in terms of moving-window size, perturbation spacing type, and intensity noise. Simulation results from LSMW are compared with results from other numerical differentiation methods, such as single-interval differentiation, autocorrelation moving-window, and perturbation correlation moving-window methods. It is demonstrated that this simple LSMW method can be useful for quantitative analysis of nonuniformly spaced spectral data with high-frequency noise.
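The abstract gives no implementation, but the core idea — fitting a low-order polynomial by least squares inside a window that slides over possibly nonuniformly spaced perturbation values and taking its first-derivative coefficient — can be sketched as below. The quadratic order, window half-width, and variable names are illustrative assumptions, not taken from the paper.

import numpy as np

def lsmw_derivative(perturbation, spectra, half_width=3, order=2):
    """Least squares moving-window (LSMW) first-derivative sketch.

    perturbation : (n,) possibly nonuniformly spaced perturbation values
    spectra      : (n, m) intensities, one spectrum (m wavenumbers) per row
    Returns d(spectra)/d(perturbation) estimated at every perturbation point.
    """
    n = len(perturbation)
    deriv = np.full_like(spectra, np.nan, dtype=float)
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        x = perturbation[lo:hi] - perturbation[i]   # center the window on point i
        # Vandermonde matrix of the local polynomial; column 1 is the slope term
        A = np.vander(x, order + 1, increasing=True)
        coeffs, *_ = np.linalg.lstsq(A, spectra[lo:hi], rcond=None)
        deriv[i] = coeffs[1]                        # first-derivative estimate at x = 0
    return deriv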
Zhang, Mingjing; Wen, Ming; Zhang, Zhi-Min; Lu, Hongmei; Liang, Yizeng; Zhan, Dejian
2015-03-01
Retention time shift is one of the most challenging problems in the preprocessing of massive chromatographic datasets. Here, an improved version of the moving window fast Fourier transform cross-correlation algorithm is presented to perform nonlinear and robust alignment of chromatograms by analyzing the shifts matrix generated by the moving window procedure. The shifts matrix in retention time is estimated by fast Fourier transform cross-correlation within a moving window, and the refined shift of each scan point is obtained by calculating the mode of the corresponding column of the shifts matrix. This version is simple, but more effective and robust than the previously published moving window fast Fourier transform cross-correlation method. It can handle nonlinear retention time shift robustly provided that a proper window size is selected; the window size is the only parameter that needs to be adjusted and optimized. The properties of the proposed method are investigated by comparison with the previous moving window fast Fourier transform cross-correlation method and with recursive alignment by fast Fourier transform using chromatographic datasets. The pattern recognition results of a gas chromatography-mass spectrometry dataset of metabolic syndrome can be improved significantly after preprocessing by this method. Furthermore, the proposed method is available as an open source package at https://github.com/zmzhang/MWFFT2. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
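A minimal sketch of the central idea — estimating a shift for every moving window by FFT cross-correlation and then taking, for each scan point, the mode of all window shifts that cover it — is given below. Function names, window size, and step are illustrative assumptions; the released MWFFT2 package should be consulted for the actual implementation.

import numpy as np

def fft_xcorr_shift(ref_seg, tgt_seg):
    # Lag that maximizes the FFT-based (zero-padded circular) cross-correlation.
    n = len(ref_seg)
    spec = np.fft.fft(ref_seg, 2 * n) * np.conj(np.fft.fft(tgt_seg, 2 * n))
    xcorr = np.fft.ifft(spec).real
    lag = int(np.argmax(xcorr))
    return lag - 2 * n if lag >= n else lag

def refined_shifts(reference, target, win=200, step=20):
    n = len(reference)
    per_point = [[] for _ in range(n)]          # one column of the shifts matrix per scan point
    for start in range(0, n - win + 1, step):
        s = fft_xcorr_shift(reference[start:start + win], target[start:start + win])
        for k in range(start, start + win):
            per_point[k].append(s)
    def most_common(vals):                      # mode of all window shifts covering a scan point
        u, c = np.unique(vals, return_counts=True)
        return u[np.argmax(c)]
    return np.array([most_common(v) if v else 0 for v in per_point])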
Yang, Chan; Xu, Bing; Zhang, Zhi-Qiang; Wang, Xin; Shi, Xin-Yuan; Fu, Jing; Qiao, Yan-Jiang
2016-10-01
Blending uniformity is essential to ensure the homogeneity of Chinese medicine formula particles within each batch. This study was based on the blending process of ebony spray-dried powder and dextrin (the proportion of dextrin was 10%), in which near infrared (NIR) diffuse reflectance spectra were collected from six different sampling points and analyzed with the moving window F test method in order to assess the uniformity of the blending process. The method was validated by the changes of citric acid content determined by HPLC. The results of the moving window F test method showed that the ebony spray-dried powder and dextrin were homogeneous during 200-300 r and segregated during 300-400 r. An advantage of this method is that the threshold value is defined statistically, not empirically, and thus does not suffer from the threshold ambiguities common to the moving block standard deviation (MBSD) method. This method could also be employed to monitor other blending processes of Chinese medicine powders online. Copyright© by the Chinese Pharmaceutical Association.
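The abstract does not reproduce the test statistic, but a hedged sketch of a moving window F test for blending uniformity could look like the following: within each window of consecutive spectra, the variance of a mixing index is compared against a reference (instrument/noise) variance, and the critical value comes from the F distribution rather than from an empirical threshold. Window size, significance level, and the choice of mixing index are assumptions for illustration.

import numpy as np
from scipy.stats import f as f_dist

def moving_window_f_test(index_series, noise_var, noise_dof, win=10, alpha=0.05):
    """Flag, for every window position, whether the window variance of a
    blending index is statistically indistinguishable from the noise variance."""
    uniform = []
    for start in range(len(index_series) - win + 1):
        window = index_series[start:start + win]
        f_stat = np.var(window, ddof=1) / noise_var
        f_crit = f_dist.ppf(1.0 - alpha, dfn=win - 1, dfd=noise_dof)
        uniform.append(f_stat <= f_crit)       # True -> blend judged homogeneous here
    return np.array(uniform)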
The research on the mean shift algorithm for target tracking
NASA Astrophysics Data System (ADS)
CAO, Honghong
2017-06-01
The traditional mean shift algorithm for target tracking is effective and fast, but it still has some shortcomings. It easily falls into a local optimum during tracking, its effectiveness is weak when the object moves fast, and, because the size of the tracking window never changes, it fails when the size of the moving object changes. As a result, we propose a new method: a particle swarm optimization algorithm is used to optimize the mean shift algorithm for target tracking, while SIFT (scale-invariant feature transform) and an affine transformation make the size of the tracking window adaptive. Finally, we evaluate the method through comparative experiments. Experimental results indicate that the proposed method can effectively track the object even as the size of the tracking window changes.
NASA Astrophysics Data System (ADS)
Jiang, Wei; Zhou, Jianzhong; Zheng, Yang; Liu, Han
2017-11-01
Accurate degradation tendency measurement is vital for the secure operation of mechanical equipment. However, the existing techniques and methodologies for degradation measurement still face challenges, such as lack of appropriate degradation indicator, insufficient accuracy, and poor capability to track the data fluctuation. To solve these problems, a hybrid degradation tendency measurement method for mechanical equipment based on a moving window and Grey-Markov model is proposed in this paper. In the proposed method, a 1D normalized degradation index based on multi-feature fusion is designed to assess the extent of degradation. Subsequently, the moving window algorithm is integrated with the Grey-Markov model for the dynamic update of the model. Two key parameters, namely the step size and the number of states, contribute to the adaptive modeling and multi-step prediction. Finally, three types of combination prediction models are established to measure the degradation trend of equipment. The effectiveness of the proposed method is validated with a case study on the health monitoring of turbine engines. Experimental results show that the proposed method has better performance, in terms of both measuring accuracy and data fluctuation tracing, in comparison with other conventional methods.
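The abstract combines a moving window with a Grey-Markov model; the grey part of that idea can be sketched as a GM(1,1) model refit on the most recent window of the degradation index at every step. This is a minimal sketch under assumed names and window size; the Markov state correction and the multi-feature fusion index described in the paper are omitted.

import numpy as np

def gm11_forecast(x0, steps=1):
    """GM(1,1) grey model: fit on the window x0 (assumed positive, smooth
    degradation index values) and forecast `steps` values ahead."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)
    z1 = 0.5 * (x1[1:] + x1[:-1])                      # background values
    B = np.column_stack((-z1, np.ones(len(z1))))
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # development and grey input coefficients
    n = len(x0)
    k = np.arange(n, n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x1_prev = (x0[0] - b / a) * np.exp(-a * (k - 1)) + b / a
    return x1_hat - x1_prev                            # forecasts back in the original scale

def moving_window_forecast(series, win=20):
    """Refit GM(1,1) on the latest `win` samples at every step (moving window update)."""
    return np.array([gm11_forecast(series[i - win:i])[0]
                     for i in range(win, len(series))])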
Domingo-Almenara, Xavier; Perera, Alexandre; Brezmes, Jesus
2016-11-25
Gas chromatography-mass spectrometry (GC-MS) produces large and complex datasets characterized by co-eluting compounds at trace levels and by a distinct compound ion redundancy resulting from the high fragmentation caused by electron impact ionization. Compounds in GC-MS can be resolved by taking advantage of the multivariate nature of GC-MS data through multivariate resolution methods. However, multivariate methods have to be applied to small regions of the chromatogram, and therefore chromatograms are segmented prior to the application of the algorithms. The automation of this segmentation process is a challenging task, as it implies separating informative data from noise in the chromatogram. This study demonstrates the capabilities of independent component analysis-orthogonal signal deconvolution (ICA-OSD) and multivariate curve resolution-alternating least squares (MCR-ALS) with an overlapping moving window implementation that avoids the typical hard chromatographic segmentation. Also, after being resolved, compounds are aligned across samples by an automated alignment algorithm. We evaluated the proposed methods through a quantitative analysis of GC-qTOF MS data from 25 serum samples. The quantitative performance of both the moving window ICA-OSD- and MCR-ALS-based implementations was compared with the quantification of 33 compounds by the XCMS package. Results showed that most of the R² coefficients of determination exhibited a high correlation (R² > 0.90) in both the ICA-OSD and MCR-ALS moving window-based approaches. Copyright © 2016 Elsevier B.V. All rights reserved.
Wang, Shenghao; Zhang, Yuyan; Cao, Fuyi; Pei, Zhenying; Gao, Xuewei; Zhang, Xu; Zhao, Yong
2018-02-13
This paper presents a novel spectrum analysis tool named synergy adaptive moving window modeling based on an immune clone algorithm (SA-MWM-ICA), developed in view of the tedious and inconvenient labor involved in selecting pre-processing methods and spectral variables by prior experience. In this work, the immune clone algorithm is introduced into the spectrum analysis field for the first time as a new optimization strategy, covering the shortcomings of the relevant traditional methods. Based on the working principle of the human immune system, the performance of the quantitative model is regarded as the antigen, and a special vector corresponding to this antigen is regarded as the antibody. The antibody contains a pre-processing method optimization region encoded by 11 decimal digits and a spectral variable optimization region formed by a set of moving windows with changeable width and position. A set of original antibodies is created by modeling with this algorithm. After calculating the affinity of these antibodies, those with high affinity are selected for cloning; the rule for cloning is that the higher the affinity, the more copies are made. In the next step, another important operation, hyper-mutation, is applied to the cloned antibodies; the rule for hyper-mutation is that the lower the affinity, the higher the mutation probability. Several antibodies with high affinity are created on the basis of these steps. Groups of simulated data, a gasoline near-infrared spectra dataset, and a soil near-infrared spectra dataset are employed to verify and illustrate the performance of SA-MWM-ICA. Analysis results show that the quantitative models obtained by SA-MWM-ICA perform better than traditional models such as partial least squares (PLS), moving window PLS (MWPLS), genetic algorithm PLS (GAPLS), and pretreatment method classification and adjustable parameter changeable size moving window PLS (CA-CSMWPLS), especially for samples with relatively complex spectra. The selected pre-processing methods and spectral variables are easily interpreted. The proposed method converges in a few generations and can be used not only for near-infrared spectroscopy analysis but also for other similar spectral analyses, such as infrared spectroscopy. Copyright © 2017 Elsevier B.V. All rights reserved.
Image pre-processing method for near-wall PIV measurements over moving curved interfaces
NASA Astrophysics Data System (ADS)
Jia, L. C.; Zhu, Y. D.; Jia, Y. X.; Yuan, H. J.; Lee, C. B.
2017-03-01
PIV measurements near a moving interface are always difficult. This paper presents a PIV image pre-processing method that returns high spatial resolution velocity profiles near the interface. Instead of re-shaping or re-orientating the interrogation windows, interface tracking and an image transformation are used to stretch the particle image strips near a curved interface into rectangles. Then the adaptive structured interrogation windows can be arranged at specified distances from the interface. Synthetic particles are also added into the solid region to minimize interfacial effects and to restrict particles on both sides of the interface. Since a high spatial resolution is only required in high velocity gradient region, adaptive meshing and stretching of the image strips in the normal direction is used to improve the cross-correlation signal-to-noise ratio (SN) by reducing the velocity difference and the particle image distortion within the interrogation window. A two dimensional Gaussian fit is used to compensate for the effects of stretching particle images. The working hypothesis is that fluid motion near the interface is ‘quasi-tangential flow’, which is reasonable in most fluid-structure interaction scenarios. The method was validated against the window deformation iterative multi-grid scheme (WIDIM) using synthetic image pairs with different velocity profiles. The method was tested for boundary layer measurements of a supersonic turbulent boundary layer on a flat plate, near a rotating blade and near a flexible flapping flag. This image pre-processing method provides higher spatial resolution than conventional WIDIM and good robustness for measuring velocity profiles near moving interfaces.
Sabushimike, Donatien; Na, Seung You; Kim, Jin Young; Bui, Ngoc Nam; Seo, Kyung Sik; Kim, Gil Gyeom
2016-01-01
The detection of a moving target using an IR-UWB radar involves the core task of separating the waves reflected by the static background from those reflected by the moving target. This paper investigates the capacity of the low-rank and sparse matrix decomposition approach to separate the background and the foreground in UWB radar-based moving target detection. Robust PCA models are criticized for being batched-data-oriented, which makes them inconvenient in realistic environments where frames need to be processed as they are recorded in real time. In this paper, a novel method based on overlapping-windows processing is proposed to cope with online processing. The method consists of processing a small batch of frames which is continually updated, without changing its size, as new frames are captured. We prove that RPCA (via its Inexact Augmented Lagrange Multiplier (IALM) model) can successfully separate the two subspaces, which enhances the accuracy of target detection. The overlapping-windows processing method converges to the same optimal solution as its batch counterpart (i.e., processing batched data with RPCA), and both methods prove the robustness and efficiency of RPCA over the classic PCA and the commonly used exponential averaging method. PMID:27598159
ERIC Educational Resources Information Center
Birmingham, Elina; Meixner, Tamara; Iarocci, Grace; Kanan, Christopher; Smilek, Daniel; Tanaka, James W.
2013-01-01
The strategies children employ to selectively attend to different parts of the face may reflect important developmental changes in facial emotion recognition. Using the Moving Window Technique (MWT), children aged 5-12 years and adults ("N" = 129) explored faces with a mouse-controlled window in an emotion recognition task. An…
Kim, Joowhan; Min, Sung-Wook; Lee, Byoungho
2007-10-01
Integral floating display is a recently proposed three-dimensional (3D) display method which provides a dynamic 3D image in the vicinity of an observer. It has a viewing window, and correct 3D images can be observed only through this window. However, the positional difference between the viewing window and the floating image limits the viewing zone of the integral floating system. In this paper, we provide the principle and experimental results of adjusting the location of the viewing window of the integral floating display system by modifying the elemental image region for integral imaging. We explain the characteristics of the viewing window and propose how to move the viewing window to maximize the viewing zone.
Detrending moving average algorithm for multifractals
NASA Astrophysics Data System (ADS)
Gu, Gao-Feng; Zhou, Wei-Xing
2010-07-01
The detrending moving average (DMA) algorithm is a widely used technique to quantify the long-term correlations of nonstationary time series and the long-range correlations of fractal surfaces, and it contains a parameter θ determining the position of the detrending window. We develop multifractal detrending moving average (MFDMA) algorithms for the analysis of one-dimensional multifractal measures and higher-dimensional multifractals, as a generalization of the DMA method. The performance of the one-dimensional and two-dimensional MFDMA methods is investigated using synthetic multifractal measures with analytical solutions for backward (θ=0), centered (θ=0.5), and forward (θ=1) detrending windows. We find that the estimated multifractal scaling exponent τ(q) and the singularity spectrum f(α) are in good agreement with the theoretical values. In addition, the backward MFDMA method has the best performance, providing the most accurate estimates of the scaling exponents with the lowest error bars, while the centered MFDMA method has the worst performance. It is found that the backward MFDMA algorithm also outperforms multifractal detrended fluctuation analysis. The one-dimensional backward MFDMA method is applied to the time series of the Shanghai Stock Exchange Composite Index, and its multifractal nature is confirmed.
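A compact sketch of the one-dimensional backward (θ = 0) MFDMA procedure described above: the profile is detrended with a backward moving average, the q-th order fluctuation function is computed over scales, and the generalized Hurst exponent h(q) follows from the log-log slope, with τ(q) = q·h(q) − 1. This is a minimal illustration under assumed names, not the authors' code.

import numpy as np

def mfdma_backward(x, scales, q_values):
    """One-dimensional backward (theta = 0) MFDMA fluctuation analysis."""
    y = np.cumsum(x - np.mean(x))                   # profile of the series
    log_f = {q: [] for q in q_values}
    for n in scales:
        # backward moving average: mean of the current and the n-1 previous points
        kernel = np.ones(n) / n
        trend = np.convolve(y, kernel, mode='full')[:len(y)]
        resid = (y - trend)[n - 1:]                 # drop points without a full window
        n_seg = len(resid) // n
        seg = resid[:n_seg * n].reshape(n_seg, n)
        f2 = np.mean(seg ** 2, axis=1)              # segment-wise squared fluctuation F_v^2
        for q in q_values:
            if q == 0:
                log_f[q].append(0.5 * np.mean(np.log(f2)))
            else:
                log_f[q].append(np.log(np.mean(f2 ** (q / 2.0)) ** (1.0 / q)))
    h, tau = {}, {}
    for q in q_values:
        h[q] = np.polyfit(np.log(scales), log_f[q], 1)[0]   # generalized Hurst exponent
        tau[q] = q * h[q] - 1.0                             # multifractal scaling exponent
    return h, tau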
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, X; Cao, D; Housley, D
2014-06-01
Purpose: In this work, we have tested the performance of new respiratory gating solutions for Elekta linacs. These solutions include the Response gating kit and the C-RAD Catalyst surface mapping system. Verification measurements have been performed for a series of clinical cases. We also examined the beam-on latency of the system and its impact on delivery efficiency. Methods: To verify the benefits of tighter gating windows, a Quasar Respiratory Motion Platform was used. Its vertical-motion plate acted as a respiration surrogate and was tracked by the Catalyst system to generate gating signals. A MatriXX ion-chamber array was mounted on its longitudinal-moving platform. Clinical plans were delivered to a stationary and a moving MatriXX array at 100%, 50% and 30% gating windows, and gamma scores were calculated comparing the moving delivery results to the stationary result. It is important to note that as one moves to tighter gating windows, the delivery efficiency will be impacted by the linac's beam-on latency. Using a specialized software package, we generated beam-on signals of lengths 1000 ms, 600 ms, 450 ms, 400 ms, 350 ms and 300 ms. As the gating windows get tighter, one can expect to reach a point where the dose rate falls to nearly zero, indicating that the gating window is close to the beam-on latency. A clinically useful gating window needs to be significantly longer than the latency of the linac. Results: As expected, the use of tighter gating windows improved delivery accuracy. However, a lower limit of the gating window, largely defined by linac beam-on latency, exists at around 300 ms. Conclusion: The Response gating kit, combined with the C-RAD Catalyst, provides an effective solution for respiratory-gated treatment delivery. Careful patient selection, gating window design, and even visual/audio coaching may be necessary to ensure both delivery quality and efficiency. This research project is funded by Elekta.
Model Identification of Integrated ARMA Processes
ERIC Educational Resources Information Center
Stadnytska, Tetiana; Braun, Simone; Werner, Joachim
2008-01-01
This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…
The short time Fourier transform and local signals
NASA Astrophysics Data System (ADS)
Okumura, Shuhei
In this thesis, I examine the theoretical properties of the short time discrete Fourier transform (STFT). The STFT is obtained by applying the Fourier transform with a fixed-size moving window to the input series. We move the window by one time point at a time, so we have overlapping windows. I present several theoretical properties of the STFT, applied to various types of complex-valued, univariate time series inputs, and their outputs in closed forms. In particular, just like the discrete Fourier transform, the STFT's modulus time series takes large positive values when the input is a periodic signal. One main point is that a white noise time series input results in the STFT output being a complex-valued stationary time series, and we can derive the time and time-frequency dependency structure, such as the cross-covariance functions. Our primary focus is the detection of local periodic signals. I present a method to detect local signals by computing the probability that the squared-modulus STFT time series has consecutive large values exceeding some threshold, given one exceeding observation that follows an observation below the threshold. We discuss a method to reduce the computation of such probabilities by the Box-Cox transformation and the delta method, and show that it works well in comparison to the Monte Carlo simulation method.
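The STFT described here, applied with a window that advances one point at a time so that successive windows overlap, can be sketched as below; the window length and the use of the squared modulus at one frequency bin for detection follow the abstract, while the names and the 3-sigma threshold are illustrative assumptions.

import numpy as np

def stft_overlapping(x, win_len=64):
    """Short time Fourier transform with the window moved one sample at a time."""
    x = np.asarray(x, dtype=complex)
    n_windows = len(x) - win_len + 1
    out = np.empty((n_windows, win_len), dtype=complex)
    for t in range(n_windows):
        out[t] = np.fft.fft(x[t:t + win_len])      # one FFT per (overlapping) window
    return out

def exceedance_runs(x, win_len=64, freq_bin=5, threshold=None):
    """Squared-modulus time series at a candidate frequency bin; a local periodic
    signal shows up as a run of consecutive values above the threshold."""
    power = np.abs(stft_overlapping(x, win_len)[:, freq_bin]) ** 2
    if threshold is None:
        threshold = np.mean(power) + 3 * np.std(power)
    return power > threshold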
Power-Efficient Beacon Recognition Method Based on Periodic Wake-Up for Industrial Wireless Devices.
Song, Soonyong; Lee, Donghun; Jang, Ingook; Choi, Jinchul; Son, Youngsung
2018-04-17
Energy harvester-integrated wireless devices are attractive for generating semi-permanent power from wasted energy in industrial environments. The energy-harvesting wireless devices may have difficulty in their communication with access points due to insufficient power supply for beacon recognition during network initialization. In this manuscript, we propose a novel method of beacon recognition based on wake-up control to reduce instantaneous power consumption in the initialization procedure. The proposed method applies a moving window for the periodic wake-up of the wireless devices. For unsynchronized wireless devices, beacons are always located in the same positions within each beacon interval even though the starting offsets are unknown. Using these characteristics, the moving window checks the existence of the beacon associated with specified resources in a beacon interval, checks again for neighboring resources at the next beacon interval, and so on. This method can reduce instantaneous power and generates a surplus of charging time. Thus, the proposed method alleviates the problems of power insufficiency in the network initialization. The feasibility of the proposed method is evaluated using computer simulations of power shortage in various energy-harvesting conditions.
Gap-filling methods to impute eddy covariance flux data by preserving variance.
NASA Astrophysics Data System (ADS)
Kunwor, S.; Staudhammer, C. L.; Starr, G.; Loescher, H. W.
2015-12-01
To represent carbon dynamics, in terms of the exchange of CO2 between the terrestrial ecosystem and the atmosphere, eddy covariance (EC) data have been collected using eddy flux towers at various sites across the globe for more than two decades. However, EC measurements are missing for various reasons: precipitation, routine maintenance, or lack of vertical turbulence. In order to obtain estimates of net ecosystem exchange of carbon dioxide (NEE) with high precision and accuracy, robust gap-filling methods to impute missing data are required. While the methods used so far have provided robust estimates of the mean value of NEE, little attention has been paid to preserving the variance structures embodied in the flux data. Preserving the variance of these data will provide unbiased and precise estimates of NEE over time, which mimic natural fluctuations. We used a non-linear regression approach with moving windows of different lengths (15, 30, and 60 days) to estimate non-linear regression parameters for one year of flux data from a long-leaf pine site at the Joseph Jones Ecological Research Center. We used as our basis the Michaelis-Menten and Van't Hoff functions. We assessed the potential physiological drivers of these parameters with linear models using micrometeorological predictors. We then used a parameter prediction approach to refine the non-linear gap-filling equations based on micrometeorological conditions. This provides an opportunity to incorporate additional variables, such as vapor pressure deficit (VPD) and volumetric water content (VWC), into the equations. Our preliminary results indicate that improvements in gap-filling can be gained with a 30-day moving window with additional micrometeorological predictors (as indicated by a lower root mean square error (RMSE) of the predicted values of NEE). Our next steps are to use these parameter predictions from moving windows to gap-fill the data with and without the incorporation of potential driver variables of the parameters traditionally used. Then, predicted values from these methods and from 'traditional' gap-filling methods (using 12 fixed monthly windows) will be compared to assess the extent to which variance is preserved. Further, this method will be applied to impute artificially created gaps to analyze whether variance is preserved.
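As a rough illustration of the gap-filling strategy described here — fitting a Michaelis-Menten-type light-response function inside a moving window and then predicting NEE where observations are missing — consider the sketch below. The functional form, window length, initial parameter guesses, and variable names are assumptions based on the abstract, not the authors' exact formulation.

import numpy as np
from scipy.optimize import curve_fit

def light_response(par, a, b, r):
    # Michaelis-Menten-style light response with a respiration offset (assumed form)
    return -(a * par) / (b + par) + r

def gap_fill_moving_window(doy, par, nee, window_days=30):
    """Fit the light-response curve in a moving window and predict NEE where it is missing."""
    filled = nee.copy()
    gaps = np.isnan(nee)
    for i in np.where(gaps)[0]:
        in_win = (np.abs(doy - doy[i]) <= window_days / 2) & ~gaps
        if in_win.sum() < 10:
            continue                                   # not enough data to fit this window
        try:
            popt, _ = curve_fit(light_response, par[in_win], nee[in_win],
                                p0=(10.0, 200.0, 2.0), maxfev=5000)
            filled[i] = light_response(par[i], *popt)
        except RuntimeError:
            pass                                       # leave the gap if the fit fails
    return filled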
Time Series Analysis Based on Running Mann Whitney Z Statistics
USDA-ARS?s Scientific Manuscript database
A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
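The abstract (truncated here) describes converting rankings in moving time windows to Mann-Whitney U statistics and then to Z statistics. A hedged sketch is given below; it uses the large-sample normal approximation in place of the Monte Carlo normalization mentioned in the abstract, and the window size and centering are illustrative assumptions.

import numpy as np
from scipy.stats import rankdata

def running_mann_whitney_z(series, win=10):
    """Z statistic of a Mann-Whitney U test comparing each moving window
    with the rest of the series (normal approximation)."""
    n = len(series)
    ranks = rankdata(series)                 # rank the whole record once
    m = n - win                              # size of the complementary sample
    z = np.full(n, np.nan)
    for start in range(n - win + 1):
        r_win = ranks[start:start + win].sum()
        u = r_win - win * (win + 1) / 2.0    # U statistic of the window sample
        mu = win * m / 2.0
        sigma = np.sqrt(win * m * (win + m + 1) / 12.0)
        z[start + win // 2] = (u - mu) / sigma   # place the Z on the window midpoint
    return z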
Mansouri, Majdi; Nounou, Mohamed N; Nounou, Hazem N
2017-09-01
In our previous work, we demonstrated the effectiveness of the linear multiscale principal component analysis (PCA)-based moving window (MW)-generalized likelihood ratio test (GLRT) technique over the classical PCA and multiscale principal component analysis (MSPCA)-based GLRT methods. The developed fault detection algorithm provided optimal properties by maximizing the detection probability for a particular false alarm rate (FAR) for different window sizes. However, most real systems are nonlinear, and the linear PCA method is not able to tackle this nonlinearity to a great extent. Thus, in this paper, we first apply a nonlinear PCA to obtain an accurate principal component of a set of data and handle a wide range of nonlinearities using the kernel principal component analysis (KPCA) model. The KPCA is among the most popular nonlinear statistical methods. Second, we extend the MW-GLRT technique to one that applies exponential weights to the residuals in the moving window (instead of equal weighting), as this might further improve fault detection performance by reducing the FAR using an exponentially weighted moving average (EWMA). The developed detection method, which is called EWMA-GLRT, provides improved properties, such as smaller missed detection rates and FARs and a smaller average run length. The idea behind the developed EWMA-GLRT is to compute a new GLRT statistic that integrates current and previous data information in a decreasing exponential fashion, giving more weight to the more recent data. This provides a more accurate estimation of the GLRT statistic and a stronger memory that enables better decision making with respect to fault detection. Therefore, in this paper, a KPCA-based EWMA-GLRT method is developed and applied in practice to improve fault detection in biological phenomena modeled by S-systems and to enhance monitoring of the process mean. The idea behind the KPCA-based EWMA-GLRT fault detection algorithm is to combine the advantages of the proposed EWMA-GLRT fault detection chart with the KPCA model. Thus, it is used to enhance fault detection of the Cad System in E. coli model by monitoring some of the key variables involved in this model, such as enzymes, transport proteins, regulatory proteins, lysine, and cadaverine. The results demonstrate the effectiveness of the proposed KPCA-based EWMA-GLRT method over the Q, GLRT, EWMA, Shewhart, and moving window-GLRT methods. The detection performance is assessed and evaluated in terms of FAR, missed detection rates, and average run length (ARL1) values.
Tabelow, Karsten; König, Reinhard; Polzehl, Jörg
2016-01-01
Estimation of learning curves is ubiquitously based on proportions of correct responses within moving trial windows. Thereby, it is tacitly assumed that learning performance is constant within the moving windows, which, however, is often not the case. In the present study we demonstrate that violations of this assumption lead to systematic errors in the analysis of learning curves, and we explore the dependency of these errors on window size, different statistical models, and learning phase. To reduce these errors in the analysis of single-subject data as well as on the population level, we propose adequate statistical methods for the estimation of learning curves and the construction of confidence intervals, trial by trial. Applied to data from an avoidance learning experiment with rodents, these methods revealed performance changes occurring at multiple time scales within and across training sessions which were otherwise obscured in the conventional analysis. Our work shows that the proper assessment of the behavioral dynamics of learning at high temporal resolution can shed new light on specific learning processes and, thus, allows existing learning concepts to be refined. It further disambiguates the interpretation of neurophysiological signal changes recorded during training in relation to learning. PMID:27303809
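To make the conventional approach the authors criticize concrete (and easier to compare against their trial-by-trial alternative), here is a minimal sketch of a moving-window learning curve with Wilson confidence intervals; the window size and the choice of Wilson intervals are illustrative and do not reproduce the statistical models proposed in the paper.

import numpy as np
from scipy.stats import norm

def moving_window_learning_curve(correct, win=20, alpha=0.05):
    """Proportion correct in a moving trial window, with Wilson confidence intervals."""
    correct = np.asarray(correct, dtype=float)     # 1 = correct response, 0 = error
    z = norm.ppf(1 - alpha / 2)
    centers, p_hat, lo, hi = [], [], [], []
    for start in range(len(correct) - win + 1):
        p = correct[start:start + win].mean()
        denom = 1 + z**2 / win
        center = (p + z**2 / (2 * win)) / denom
        half = z * np.sqrt(p * (1 - p) / win + z**2 / (4 * win**2)) / denom
        centers.append(start + win // 2)
        p_hat.append(p)
        lo.append(center - half)
        hi.append(center + half)
    return np.array(centers), np.array(p_hat), np.array(lo), np.array(hi)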
Etchells, Peter J; Benton, Christopher P; Ludwig, Casimir J H; Gilchrist, Iain D
2011-01-01
A growing number of studies in vision research employ analyses of how perturbations in visual stimuli influence behavior on single trials. Recently, we have developed a method along such lines to assess the time course over which object velocity information is extracted on a trial-by-trial basis in order to produce an accurate intercepting saccade to a moving target. Here, we present a simplified version of this methodology, and use it to investigate how changes in stimulus contrast affect the temporal velocity integration window used when generating saccades to moving targets. Observers generated saccades to one of two moving targets which were presented at high (80%) or low (7.5%) contrast. In 50% of trials, target velocity stepped up or down after a variable interval following the saccadic go signal. The extent to which the saccade endpoint can be accounted for as a weighted combination of the pre- or post-step velocities allows for identification of the temporal velocity integration window. Our results show that the temporal integration window takes longer to peak in the low contrast condition than in the high contrast condition. By enabling the assessment of how information such as changes in velocity can be used in the programming of a saccadic eye movement on single trials, this study describes and tests a novel methodology with which to look at the internal processing mechanisms that transform sensory visual inputs into oculomotor outputs.
Ionospheric gravity wave measurements with the USU dynasonde
NASA Technical Reports Server (NTRS)
Berkey, Frank T.; Deng, Jun Yuan
1992-01-01
A method for the measurement of ionospheric gravity waves (GW) using the USU Dynasonde is outlined. This method consists of a series of individual procedures, which include functions for data acquisition, adaptive scaling, polarization discrimination, interpolation and extrapolation, digital filtering, windowing, spectrum analysis, GW detection, and graphics display. Concepts of system theory are applied to treat the ionosphere as a system. An adaptive ionogram scaling method was developed for automatically extracting ionogram echo traces from noisy raw sounding data. The method uses the well-known least mean square (LMS) algorithm to form a stochastic optimal estimate of the echo trace, which is then used to control a moving window. The window tracks the echo trace, simultaneously eliminating the noise and interference. Experimental results show that the proposed method functions as designed. Case studies which extract GW from ionosonde measurements were carried out using the techniques described. Geophysically significant events were detected and the resultant processed results are illustrated graphically. This method was also developed with real-time implementation in mind.
NASA Astrophysics Data System (ADS)
Ma, Zhi-Sai; Liu, Li; Zhou, Si-Da; Yu, Lei; Naets, Frank; Heylen, Ward; Desmet, Wim
2018-01-01
The problem of parametric output-only identification of time-varying structures in a recursive manner is considered. A kernelized time-dependent autoregressive moving average (TARMA) model is proposed by expanding the time-varying model parameters onto the basis set of kernel functions in a reproducing kernel Hilbert space. An exponentially weighted kernel recursive extended least squares TARMA identification scheme is proposed, and a sliding-window technique is subsequently applied to fix the computational complexity for each consecutive update, allowing the method to operate online in time-varying environments. The proposed sliding-window exponentially weighted kernel recursive extended least squares TARMA method is employed for the identification of a laboratory time-varying structure consisting of a simply supported beam and a moving mass sliding on it. The proposed method is comparatively assessed against an existing recursive pseudo-linear regression TARMA method via Monte Carlo experiments and shown to be capable of accurately tracking the time-varying dynamics. Furthermore, the comparisons demonstrate the superior achievable accuracy, lower computational complexity and enhanced online identification capability of the proposed kernel recursive extended least squares TARMA approach.
Illusory displacement of equiluminous kinetic edges.
Ramachandran, V S; Anstis, S M
1990-01-01
A stationary window was cut out of a stationary random-dot pattern. When a field of dots was moved continuously behind the window (a) the window appeared to move in the same direction even though it was stationary, (b) the position of the 'kinetic edges' defining the window was also displaced along the direction of dot motion, and (c) the edges of the window tended to fade on steady fixation even though the dots were still clearly visible. The illusory displacement was enhanced considerably if the kinetic edge was equiluminous and if the 'window' region was seen as 'figure' rather than 'ground'. Since the extraction of kinetic edges probably involves the use of direction-selective cells, the illusion may provide insights into how the visual system uses the output of these cells to localize the kinetic edges.
James, S. R.; Knox, H. A.; Abbott, R. E.; ...
2017-04-13
Cross correlations of seismic noise can potentially record large changes in subsurface velocity due to permafrost dynamics and be valuable for long-term Arctic monitoring. We applied seismic interferometry, using moving window cross-spectral analysis (MWCS), to 2 years of ambient noise data recorded in central Alaska to investigate whether seismic noise could be used to quantify relative velocity changes due to seasonal active-layer dynamics. The large velocity changes (>75%) between frozen and thawed soil caused prevalent cycle-skipping which made the method unusable in this setting. We developed an improved MWCS procedure which uses a moving reference to measure daily velocity variations that are then accumulated to recover the full seasonal change. This approach reduced cycle-skipping and recovered a seasonal trend that corresponded well with the timing of active-layer freeze and thaw. Lastly, this improvement opens the possibility of measuring large velocity changes by using MWCS and permafrost monitoring by using ambient noise.
Point-point and point-line moving-window correlation spectroscopy and its applications
NASA Astrophysics Data System (ADS)
Zhou, Qun; Sun, Suqin; Zhan, Daqi; Yu, Zhiwu
2008-07-01
In this paper, we present a new extension of generalized two-dimensional (2D) correlation spectroscopy. Two new algorithms, namely point-point (P-P) correlation and point-line (P-L) correlation, have been introduced to perform the moving-window 2D correlation (MW2D) analysis. The new method has been applied to a spectral model consisting of two different processes. The results indicate that P-P correlation spectroscopy can unveil the details and reconstitute the entire process, whilst the P-L correlation can provide the general features of the processes concerned. The phase transition behavior of dimyristoylphosphotidylethanolamine (DMPE) has been studied using MW2D correlation spectroscopy. The newly proposed method verifies that the phase transition temperature is 56 °C, the same as the result obtained from a differential scanning calorimeter. To illustrate the new method further, a lysine and lactose mixture has been studied under thermal perturbation. Using the P-P MW2D, the Maillard reaction of the mixture was clearly monitored, which has been very difficult using conventional display of FTIR spectra.
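Under the usual generalized 2D correlation convention, the point-point moving-window correlation can be sketched as the synchronous correlation intensity between two chosen spectral variables computed inside each window of perturbation-ordered spectra; the names and normalization below follow the standard formulation and are assumptions, not the authors' code.

import numpy as np

def pp_mw2d(spectra, idx1, idx2, win=11):
    """Point-point moving-window 2D synchronous correlation between two
    spectral variables (columns idx1 and idx2 of a perturbation-ordered
    spectra matrix), computed for every window position."""
    spectra = np.asarray(spectra, dtype=float)      # shape (n_perturbations, n_variables)
    n = spectra.shape[0]
    phi = np.full(n, np.nan)
    for start in range(n - win + 1):
        block = spectra[start:start + win]
        dyn = block - block.mean(axis=0)            # dynamic spectra within the window
        phi[start + win // 2] = np.dot(dyn[:, idx1], dyn[:, idx2]) / (win - 1)
    return phi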
An impact of environmental changes on flows in the reach scale under a range of climatic conditions
NASA Astrophysics Data System (ADS)
Karamuz, Emilia; Romanowicz, Renata J.
2016-04-01
The present paper combines detection and adequate identification of causes of changes in flow regime at cross-sections along the Middle River Vistula reach using different methods. Two main experimental set-ups (designs) have been applied to study the changes: a moving three-year window and a low- and high-flow event based approach. In the first experiment, a Stochastic Transfer Function (STF) model and a quantile-based statistical analysis of flow patterns were compared. These two methods are based on the analysis of changes of the STF model parameters and of standardised differences of flow quantile values. In the second experiment, in addition to the STF-based model, a 1-D distributed model, MIKE11, was also applied. The first step of the procedure used in the study is to define the river reaches that have recorded information on land use and water management changes. The second task is to perform the moving window analysis of standardised differences of flow quantiles and the moving window optimisation of the STF model for flow routing. The third step consists of an optimisation of the STF and MIKE11 models for high- and low-flow events. The final step is to analyse the results and relate the standardised quantile changes and model parameter changes to historical land use changes and water management practices. Results indicate that both models give a consistent assessment of changes in the channel for medium and high flows. ACKNOWLEDGEMENTS This research was supported by the Institute of Geophysics Polish Academy of Sciences through the Young Scientist Grant no. 3b/IGF PAN/2015.
Li, Xinjian; Cao, Vania Y; Zhang, Wenyu; Mastwal, Surjeet S; Liu, Qing; Otte, Stephani; Wang, Kuan Hong
2017-11-01
In vivo optical imaging of neural activity provides important insights into brain functions at the single-cell level. Cranial windows and virally delivered calcium indicators are commonly used for imaging cortical activity through two-photon microscopes in head-fixed animals. Recently, head-mounted one-photon microscopes have been developed for freely behaving animals. However, minimizing tissue damage from the virus injection procedure and maintaining window clarity for imaging can be technically challenging. We used a wide-diameter glass pipette at the cortical surface for infusing the viral calcium reporter AAV-GCaMP6 into the cortex. After infusion, the scalp skin over the implanted optical window was sutured to facilitate postoperative recovery. The sutured scalp was removed approximately two weeks later and a miniature microscope was attached above the window to image neuronal activity in freely moving mice. We found that cortical surface virus infusion efficiently labeled neurons in superficial layers, and scalp skin suturing helped to maintain the long-term clarity of optical windows. As a result, several hundred neurons could be recorded in freely moving animals. Compared to intracortical virus injection and open-scalp postoperative recovery, our methods minimized tissue damage and dura overgrowth underneath the optical window, and significantly increased the experimental success rate and the yield of identified neurons. Our improved cranial surgery technique allows for high-yield calcium imaging of cortical neurons with head-mounted microscopes in freely behaving animals. This technique may be beneficial for other optical applications such as two-photon microscopy, multi-site imaging, and optogenetic modulation. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Pan, Chu-Dong; Yu, Ling; Liu, Huan-Lin
2017-08-01
Traffic-induced moving force identification (MFI) is a typical inverse problem in the field of bridge structural health monitoring. Many regularization-based methods have been proposed for MFI. However, the MFI accuracy obtained from the existing methods is low when the moving forces enter and exit the bridge deck, due to the low sensitivity of structural responses to the forces in these zones. To overcome this shortcoming, a novel moving average Tikhonov regularization method is proposed for MFI by combining it with moving average concepts. Firstly, the bridge-vehicle interaction moving force is assumed to be a discrete finite signal with stable average value (DFS-SAV). Secondly, the reasonable signal feature of the DFS-SAV is quantified and introduced to improve the penalty function (||x||₂²) defined in the classical Tikhonov regularization. Then, a feasible two-step strategy is proposed for selecting the regularization parameter and the balance coefficient defined in the improved penalty function. Finally, both numerical simulations on a simply-supported beam and laboratory experiments on a hollow tube beam are performed to assess the accuracy and feasibility of the proposed method. The illustrated results show that the moving forces can be accurately identified with strong robustness. Some related issues, such as the selection of the moving window length, the effect of different penalty functions, and the effect of different car speeds, are discussed as well.
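One way to read "moving average Tikhonov regularization" is that the classical penalty ||x||₂² is replaced by a penalty on the deviation of the identified force history from its own moving average, encoding the stable-average (DFS-SAV) assumption. The sketch below implements that reading in closed form; it is an interpretation for illustration only, and the paper's actual penalty and two-step parameter-selection strategy may differ.

import numpy as np

def moving_average_matrix(n, win=5):
    """Row-stochastic matrix S such that S @ x is a centered moving average of x."""
    S = np.zeros((n, n))
    half = win // 2
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        S[i, lo:hi] = 1.0 / (hi - lo)
    return S

def ma_tikhonov_identify(A, b, lam=1e-2, win=5):
    """Identify the force history x from responses b = A x by penalizing the
    deviation of x from its moving average instead of ||x||_2^2."""
    n = A.shape[1]
    L = np.eye(n) - moving_average_matrix(n, win)     # high-pass-style penalty operator
    lhs = A.T @ A + lam * (L.T @ L)
    return np.linalg.solve(lhs, A.T @ b)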
Simple and effective method to lock buoy position to ocean currents
NASA Technical Reports Server (NTRS)
Vachon, W. A.; Dahlen, J. M.
1975-01-01
Window-shade drogue, used with drifting buoys to keep them moving with current at speed as close to that of current as possible, has drag coefficient of 1.93 compared to maximum of 1.52 for previous drogues. It is remarkably simple to construct, use, and store.
Shot boundary detection and label propagation for spatio-temporal video segmentation
NASA Astrophysics Data System (ADS)
Piramanayagam, Sankaranaryanan; Saber, Eli; Cahill, Nathan D.; Messinger, David
2015-02-01
This paper proposes a two-stage algorithm for streaming video segmentation. In the first stage, shot boundaries are detected within a window of frames by comparing the dissimilarity between 2-D segmentations of each frame. In the second stage, the 2-D segments are propagated across the window of frames in both the spatial and temporal directions. The window is moved across the video to find all shot transitions and obtain spatio-temporal segments simultaneously. As opposed to techniques that operate on the entire video, the proposed approach consumes significantly less memory and enables segmentation of lengthy videos. We tested our segmentation-based shot detection method on the TRECVID 2007 video dataset and compared it with a block-based technique. Cut detection results on the TRECVID 2007 dataset indicate that our algorithm has results comparable to the best of the block-based methods. The streaming video segmentation routine also achieves promising results on a challenging video segmentation benchmark database.
A refined technique to calculate finite helical axes from rigid body trackers.
McLachlin, Stewart D; Ferreira, Louis M; Dunning, Cynthia E
2014-12-01
Finite helical axes (FHAs) are a potentially effective tool for joint kinematic analysis. Unfortunately, no straightforward guidelines exist for calculating accurate FHAs using prepackaged six degree-of-freedom (6 DOF) rigid body trackers. Thus, this study aimed to: (1) describe a protocol for calculating FHA parameters from 6 DOF rigid body trackers using the screw matrix and (2) maximize the number of accurate FHAs generated from a given data set using a moving window analysis. Four Optotrak® Smart Markers were used as the rigid body trackers, two moving and two fixed, at different distances from the hinge joint of a custom-machined jig. 6 DOF pose information was generated from 51 static positions of the jig rotated and fixed in 0.5 deg increments up to 25 deg. Output metrics included the FHA direction cosines, the rotation about the FHA, the translation along the axis, and the intercept of the FHA with the plane normal to the jig's hinge joint. FHA metrics were calculated using the relative tracker rotation from the starting position, and using a moving window analysis to define a minimum acceptable rotational displacement between the moving tracker data points. Data analysis found all FHA rotations calculated from the starting position were within 0.15 deg of the prescribed jig rotation. FHA intercepts were most stable when determined using trackers closest to the hinge axis. Increasing the moving window size improved the FHA direction cosines and center of rotation accuracy. Window sizes larger than 2 deg had an intercept deviation of less than 1 mm. Furthermore, compared to the 0 deg window size, the 2 deg window had a 90% improvement in FHA intercept precision while generating almost an equivalent number of FHAs. This work identified a solution to improve FHA calculations for biomechanical researchers looking to describe changes in 3D joint motion.
Solving the chemical master equation using sliding windows
2010-01-01
Background: The chemical master equation (CME) is a system of ordinary differential equations that describes the evolution of a network of chemical reactions as a stochastic process. Its solution yields the probability density vector of the system at each point in time. Solving the CME numerically is in many cases computationally expensive or even infeasible as the number of reachable states can be very large or infinite. We introduce the sliding window method, which computes an approximate solution of the CME by performing a sequence of local analysis steps. In each step, only a manageable subset of states is considered, representing a "window" into the state space. In subsequent steps, the window follows the direction in which the probability mass moves, until the time period of interest has elapsed. We construct the window based on a deterministic approximation of the future behavior of the system by estimating upper and lower bounds on the populations of the chemical species. Results: In order to show the effectiveness of our approach, we apply it to several examples previously described in the literature. The experimental results show that the proposed method speeds up the analysis considerably, compared to a global analysis, while still providing high accuracy. Conclusions: The sliding window method is a novel approach to address the performance problems of numerical algorithms for the solution of the chemical master equation. The method efficiently approximates the probability distributions at the time points of interest for a variety of chemically reacting systems, including systems for which no upper bound on the population sizes of the chemical species is known a priori. PMID:20377904
Akita, Yasuyuki; Chen, Jiu-Chiuan; Serre, Marc L
2012-09-01
Geostatistical methods are widely used in estimating long-term exposures for epidemiological studies on air pollution, despite their limited capabilities to handle spatial non-stationarity over large geographic domains and the uncertainty associated with missing monitoring data. We developed a moving-window (MW) Bayesian maximum entropy (BME) method and applied this framework to estimate fine particulate matter (PM(2.5)) yearly average concentrations over the contiguous US. The MW approach accounts for the spatial non-stationarity, while the BME method rigorously processes the uncertainty associated with data missingness in the air-monitoring system. In the cross-validation analyses conducted on a set of randomly selected complete PM(2.5) data in 2003 and on simulated data with different degrees of missing data, we demonstrate that the MW approach alone leads to at least a 17.8% reduction in mean square error (MSE) in estimating the yearly PM(2.5). Moreover, the MWBME method further reduces the MSE by 8.4-43.7%, as the proportion of incomplete data increases from 18.3% to 82.0%. The MWBME approach leads to significant reductions in estimation error and thus is recommended for epidemiological studies investigating the effect of long-term exposure to PM(2.5) across large geographical domains with expected spatial non-stationarity.
NASA Astrophysics Data System (ADS)
Magdy, Nancy; Ayad, Miriam F.
2015-02-01
Two simple, accurate, precise, sensitive and economic spectrophotometric methods were developed for the simultaneous determination of Simvastatin and Ezetimibe in fixed dose combination products without prior separation. The first method depends on a new chemometrics-assisted ratio spectra derivative method using moving-window polynomial least-squares fitting (Savitzky-Golay filtering). The second method is based on a simple modification of the ratio subtraction method. The suggested methods were validated according to USP guidelines and can be applied for routine quality control testing.
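A minimal numerical sketch of the first method's core steps — dividing the mixture spectrum by the spectrum of one component (ratio spectrum) and then applying a Savitzky-Golay (moving-window polynomial least-squares) first derivative — is shown below; the window length, polynomial order, and names are illustrative assumptions, not the validated parameters of the paper.

import numpy as np
from scipy.signal import savgol_filter

def ratio_spectra_derivative(mixture, divisor, window_length=11, polyorder=3):
    """Ratio spectrum of a binary mixture followed by its Savitzky-Golay
    first derivative (moving-window polynomial least-squares smoothing)."""
    eps = 1e-12                                   # avoid division by near-zero absorbances
    ratio = mixture / (divisor + eps)
    return savgol_filter(ratio, window_length=window_length,
                         polyorder=polyorder, deriv=1)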
NASA Technical Reports Server (NTRS)
Beutter, B. R.; Mulligan, J. B.; Stone, L. S.; Hargens, Alan R. (Technical Monitor)
1995-01-01
We have shown that moving a plaid in an asymmetric window biases the perceived direction of motion (Beutter, Mulligan & Stone, ARVO 1994). We now explore whether these biased motion signals might also drive the smooth eye-movement response by comparing the perceived and tracked directions. The human smooth oculomotor response to moving plaids appears to be driven by the perceived rather than the veridical direction of motion. This suggests that human motion perception and smooth eye movements share underlying neural motion-processing substrates as has already been shown to be true for monkeys.
Reading Time Allocation Strategies and Working Memory Using Rapid Serial Visual Presentation
ERIC Educational Resources Information Center
Busler, Jessica N.; Lazarte, Alejandro A.
2017-01-01
Rapid serial visual presentation (RSVP) is a useful method for controlling the timing of text presentations and studying how readers' characteristics, such as working memory (WM) and reading strategies for time allocation, influence text recall. In the current study, a modified version of RSVP (Moving Window RSVP [MW-RSVP]) was used to induce…
[Online endpoint detection algorithm for blending process of Chinese materia medica].
Lin, Zhao-Zhou; Yang, Chan; Xu, Bing; Shi, Xin-Yuan; Zhang, Zhi-Qiang; Fu, Jing; Qiao, Yan-Jiang
2017-03-01
Blending, which is an essential part of pharmaceutical preparation, has a direct influence on the homogeneity and stability of solid dosage forms. With the official release of the Guidance for Industry PAT, online process analysis techniques have been increasingly reported in applications to blending processes, but research on endpoint detection algorithms is still at an initial stage. By progressively increasing the window size of the moving block standard deviation (MBSD), a novel endpoint detection algorithm was proposed to extend the plain MBSD from the off-line scenario to the online scenario, and it was used to determine the endpoint in the blending process of Chinese medicine dispensing granules. Through online learning of the window size tuning, the status changes of the materials in the blending process were reflected in the calculation of the standard deviation in a real-time manner. The proposed method was tested separately in the blending processes of dextrin and three other extracts of traditional Chinese medicine. All of the results showed that, compared with the traditional MBSD method, the window size changes in the proposed MBSD method (progressively increasing the window size) could more clearly reflect the status changes of the materials in the blending process, so it is suitable for online application. Copyright© by the Chinese Pharmaceutical Association.
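The algorithm is only described qualitatively above; a hedged sketch of a moving block standard deviation whose window grows as blending proceeds, with the endpoint declared once the dissimilarity stays below a threshold, might look like this. The window growth rule, threshold, hold count, and the use of the mean spectral standard deviation as the dissimilarity measure are assumptions for illustration.

import numpy as np

def mbsd_endpoint(spectra, start_win=5, grow_every=10, threshold=1e-3, hold=5):
    """Online MBSD with a progressively increasing window size.

    spectra : (n_times, n_wavelengths) array of successive NIR spectra
    Returns the index at which the blend is judged homogeneous, or None.
    """
    below = 0
    win = start_win
    for t in range(start_win, spectra.shape[0]):
        if t % grow_every == 0:
            win += 1                               # progressively enlarge the window
        block = spectra[max(0, t - win + 1):t + 1]
        mbsd = np.mean(np.std(block, axis=0, ddof=1))   # mean SD across wavelengths
        below = below + 1 if mbsd < threshold else 0
        if below >= hold:                          # sustained low variability -> endpoint
            return t
    return None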
Effect of window length on performance of the elbow-joint angle prediction based on electromyography
NASA Astrophysics Data System (ADS)
Triwiyanto; Wahyunggoro, Oyas; Adi Nugroho, Hanung; Herianto
2017-05-01
The high performance of elbow joint angle prediction is essential for the development of devices based on electromyography (EMG) control. The performance of the prediction depends on the feature extraction parameters, such as the window length. In this paper, we evaluated the effect of the window length on the performance of elbow joint angle prediction. The prediction algorithm consists of a zero-crossing feature extraction and a second-order Butterworth low-pass filter. The feature was used to extract the EMG signal with varying window lengths. The EMG signal was collected from the biceps muscle while the elbow was moved in flexion and extension. The subject performed the elbow motion while holding a 1-kg load and moved the elbow over different periods (12 seconds, 8 seconds and 6 seconds). The results indicated that the window length affected the performance of the prediction. A window length of 250 yielded the best performance of the prediction algorithm, with (mean±SD) root mean square error = 5.68%±1.53% and Pearson's correlation = 0.99±0.0059.
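The pipeline described above (zero-crossing counts per window followed by a second-order Butterworth low-pass filter) can be sketched as follows; the cut-off frequency, deadband threshold, and feature sampling rate are illustrative assumptions, and the mapping from the filtered feature to joint angle would still need to be calibrated against the measured angle.

import numpy as np
from scipy.signal import butter, filtfilt

def zero_crossing_feature(emg, window_length=250, deadband=0.01):
    """Zero-crossing count of the EMG signal in consecutive windows."""
    feats = []
    for start in range(0, len(emg) - window_length + 1, window_length):
        seg = emg[start:start + window_length]
        sign_change = np.diff(np.sign(seg)) != 0
        big_enough = np.abs(np.diff(seg)) > deadband      # ignore noise-level crossings
        feats.append(np.sum(sign_change & big_enough))
    return np.asarray(feats, dtype=float)

def smooth_feature(feature, fs_feature=8.0, cutoff_hz=1.0):
    """Second-order Butterworth low-pass filter applied to the feature series."""
    b, a = butter(N=2, Wn=cutoff_hz / (fs_feature / 2.0), btype='low')
    return filtfilt(b, a, feature)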
NASA Technical Reports Server (NTRS)
Casasent, D.
1978-01-01
The article discusses several optical configurations used for signal processing. Electronic-to-optical transducers are outlined, noting fixed window transducers and moving window acousto-optic transducers. Folded spectrum techniques are considered, with reference to wideband RF signal analysis, fetal electroencephalogram analysis, engine vibration analysis, signal buried in noise, and spatial filtering. Various methods for radar signal processing are described, such as phased-array antennas, the optical processing of phased-array data, pulsed Doppler and FM radar systems, a multichannel one-dimensional optical correlator, correlations with long coded waveforms, and Doppler signal processing. Means for noncoherent optical signal processing are noted, including an optical correlator for speech recognition and a noncoherent optical correlator.
Eye movements and the span of the effective stimulus in visual search.
Bertera, J H; Rayner, K
2000-04-01
The span of the effective stimulus during visual search through an unstructured alphanumeric array was investigated by using eye-contingent-display changes while the subjects searched for a target letter. In one condition, a window exposing the search array moved in synchrony with the subjects' eye movements, and the size of the window was varied. Performance reached asymptotic levels when the window was 5 degrees. In another condition, a foveal mask moved in synchrony with each eye movement, and the size of the mask was varied. The foveal mask conditions were much more detrimental to search behavior than the window conditions, indicating the importance of foveal vision during search. The size of the array also influenced performance, but performance reached asymptote for all array sizes tested at the same window size, and the effect of the foveal mask was the same for all array sizes. The results indicate that both acuity and difficulty of the search task influenced the span of the effective stimulus during visual search.
NASA Astrophysics Data System (ADS)
Pierini, J. O.; Restrepo, J. C.; Aguirre, J.; Bustamante, A. M.; Velásquez, G. J.
2017-04-01
A measure of the variability in seasonal extreme streamflow was estimated for the Colombian Caribbean coast, using monthly time series of freshwater discharge from ten watersheds. The aim was to detect modifications in the monthly streamflow distribution, seasonal trends, variance, and extreme monthly values. A 20-year moving time window, with successive 1-year shifts, was applied to the monthly series to analyze the seasonal variability of streamflow. The seasonally windowed data were statistically fitted with the Gamma distribution function. Scale and shape parameters were computed using Maximum Likelihood Estimation (MLE) and the bootstrap method with 1000 resamples. A trend analysis was performed for each windowed series, allowing detection of the window with the maximum absolute trend values. Significant temporal shifts in the seasonal streamflow distribution and quantiles (QT) were obtained for different frequencies. Wet and dry extreme periods increased significantly over the last decades. This increase did not occur simultaneously throughout the region. Some locations exhibited continuous increases only at minimum QT.
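A rough sketch of the windowing and fitting step, assuming the discharge record is a flat array of monthly values whose length is a multiple of 12; the 20-year window, 1-year shift, MLE Gamma fit, and 1000 bootstrap resamples follow the abstract, while fixing the location parameter at zero is a simplifying assumption made here.

    import numpy as np
    from scipy import stats

    def windowed_gamma_params(monthly_flow, years_per_window=20, n_boot=1000, seed=0):
        """Fit a Gamma distribution to successive 20-year windows (1-year shifts)
        and bootstrap the shape/scale parameters (reduce n_boot for a quick run)."""
        rng = np.random.default_rng(seed)
        flow = np.asarray(monthly_flow, dtype=float).reshape(-1, 12)   # one row per year
        results = []
        for start in range(flow.shape[0] - years_per_window + 1):
            window = flow[start:start + years_per_window].ravel()
            shape, loc, scale = stats.gamma.fit(window, floc=0)        # MLE, loc fixed at 0
            boots = [stats.gamma.fit(rng.choice(window, size=window.size, replace=True),
                                     floc=0) for _ in range(n_boot)]
            results.append((shape, scale,
                            np.std([b[0] for b in boots]),             # bootstrap SD of shape
                            np.std([b[2] for b in boots])))            # bootstrap SD of scale
        return results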
A Study of Feature Extraction Using Divergence Analysis of Texture Features
NASA Technical Reports Server (NTRS)
Hallada, W. A.; Bly, B. G.; Boyd, R. K.; Cox, S.
1982-01-01
An empirical study of texture analysis for feature extraction and classification of high spatial resolution (10 m) remotely sensed imagery is presented in terms of specific land cover types. The principal method examined is the use of spatial gray tone dependence (SGTD). The SGTD method reduces the gray levels within a moving window into a two-dimensional spatial gray tone dependence matrix, which can be interpreted as a probability matrix of gray tone pairs. Haralick et al. (1973) used a number of information theory measures to extract texture features from these matrices, including angular second moment (inertia), correlation, entropy, homogeneity, and energy. The derivation of the SGTD matrix is a function of: (1) the number of gray tones in an image; (2) the angle along which the frequency of SGTD is calculated; (3) the size of the moving window; and (4) the distance between gray tone pairs. The first three parameters were varied and tested on a 10 m resolution panchromatic image of Maryville, Tennessee, using the five SGTD measures. A transformed divergence measure was used to determine the statistical separability between four land cover categories (forest, new residential, old residential, and industrial) for each variation in texture parameters.
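A minimal sketch of the SGTD (gray-tone co-occurrence) computation for one moving window and a few Haralick-style measures; the quantization to 16 levels and the single (dx, dy) offset are simplifying assumptions, not the study's settings.

    import numpy as np

    def glcm(window, levels=16, dx=1, dy=0):
        """Spatial gray tone dependence (co-occurrence) matrix for one window,
        for a single offset (dx, dy); returned as a probability matrix."""
        q = (window.astype(float) / max(window.max(), 1) * (levels - 1)).astype(int)
        m = np.zeros((levels, levels))
        h, w = q.shape
        for i in range(h - dy):
            for j in range(w - dx):
                m[q[i, j], q[i + dy, j + dx]] += 1
        m += m.T                       # make the matrix symmetric
        return m / m.sum()

    def texture_features(p):
        """A few of the Haralick-style measures computed from the probability matrix p."""
        i, j = np.indices(p.shape)
        asm = np.sum(p ** 2)                               # angular second moment (energy)
        contrast = np.sum(p * (i - j) ** 2)                # often called inertia
        entropy = -np.sum(p[p > 0] * np.log(p[p > 0]))
        homogeneity = np.sum(p / (1.0 + np.abs(i - j)))
        return asm, contrast, entropy, homogeneity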
Lian, Yanyun; Song, Zhijian
2014-01-01
Brain tumor segmentation from magnetic resonance imaging (MRI) is an important step in surgical planning, treatment planning, and monitoring of therapy. However, manual tumor segmentation, commonly used in the clinic, is time-consuming and challenging, and none of the existing automated methods is highly robust, reliable, and efficient in clinical application. An accurate, automated tumor segmentation method has been developed that provides reproducible, objective results close to manual segmentation. Based on the symmetry of the human brain, we employed a sliding-window technique and the correlation coefficient to locate the tumor position. First, the image to be segmented was normalized, rotated, denoised, and bisected. Subsequently, vertical and horizontal sliding-window techniques were applied in turn: two windows, one in the left and one in the right half of the brain image, were moved simultaneously pixel by pixel while the correlation coefficient between them was calculated. The window pair with the minimal correlation coefficient was retained; the window with the larger average gray value gives the tumor location, and the pixel with the largest gray value is taken as the tumor locating point. Finally, the segmentation threshold was set to the average gray value of the pixels in a square centered at the locating point with a side length of 10 pixels, and threshold segmentation and morphological operations were used to obtain the final tumor region. The method was evaluated on 3D FSPGR brain MR images of 10 patients. The average rate of correct location was 93.4% for 575 slices containing tumor, the average Dice similarity coefficient was 0.77 per scan, and the average time spent on one scan was 40 seconds. A fully automated, simple, and efficient brain tumor segmentation method is proposed that is promising for future clinical use. The correlation coefficient is a new and effective feature for tumor location.
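A sketch of the symmetry-based localization step, assuming a single 2D slice that has already been normalized and bisected at the midline; the window size, step, and mirroring of the right half are assumptions made here for illustration.

    import numpy as np

    def locate_by_symmetry(image, win=32, step=4):
        """Slide mirrored windows over the left and right halves of a bisected brain
        slice and return the window pair with the lowest correlation coefficient."""
        h, w = image.shape
        half = w // 2
        left, right = image[:, :half], np.fliplr(image[:, half:half * 2])
        best = (1.0, None)
        for r in range(0, h - win + 1, step):
            for c in range(0, half - win + 1, step):
                a = left[r:r + win, c:c + win].ravel()
                b = right[r:r + win, c:c + win].ravel()
                if a.std() == 0 or b.std() == 0:
                    continue
                cc = np.corrcoef(a, b)[0, 1]
                if cc < best[0]:
                    best = (cc, (r, c))
        return best   # (minimal correlation, window position within the half-image)

The half with the larger average gray value inside the returned window would then be taken as the tumor side, as described in the abstract.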
Akita, Yasuyuki; Chen, Jiu-Chiuan; Serre, Marc L.
2013-01-01
Geostatistical methods are widely used in estimating long-term exposures for air pollution epidemiological studies, despite their limited capability to handle spatial non-stationarity over large geographic domains and uncertainty associated with missing monitoring data. We developed a moving-window (MW) Bayesian Maximum Entropy (BME) method and applied this framework to estimate fine particulate matter (PM2.5) yearly average concentrations over the contiguous U.S. The MW approach accounts for the spatial non-stationarity, while the BME method rigorously processes the uncertainty associated with data missingness in the air monitoring system. In cross-validation analyses conducted on a set of randomly selected complete PM2.5 data in 2003 and on simulated data with different degrees of missing data, we demonstrate that the MW approach alone leads to at least a 17.8% reduction in mean square error (MSE) in estimating yearly PM2.5. Moreover, the MWBME method further reduces the MSE by 8.4% to 43.7% as the proportion of incomplete data increases from 18.3% to 82.0%. The MWBME approach leads to significant reductions in estimation error and is thus recommended for epidemiological studies investigating the effect of long-term exposure to PM2.5 across large geographical domains with expected spatial non-stationarity. PMID:22739679
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravindran, P; Wui Ann, W; Lim, Y
Purpose: In general, the linear accelerator is gated using a respiratory signal obtained from external sensors to account for breathing motion during radiotherapy. One of the commonly used gating devices is the Varian RPM device. The Calypso system, which uses electromagnetic tracking of implanted or surface transponders, can also be used for gating. The aim of this study is to compare the gating efficiency of the RPM device and the Calypso system through phantom studies. Methods: An ArcCheck insert was used as the phantom, with a Gafchromic film placed in its holder. The ArcCheck insert was placed on a Motion Sim platform and moved in the longitudinal direction, simulating respiratory motion with a period of 5 seconds and an amplitude of ±6 mm. The Gafchromic film was exposed to a 2 × 2 cm² field: i) with the phantom static, ii) with the phantom moving but ungated, and iii) gated with gating windows of 2 mm and 3 mm. This was repeated with the Calypso system using surface transponders and the same gating windows. The Gafchromic films were read with an EPSON 11000 flatbed scanner and analysed with 'Medphysto' software. Results: The full width at half maximum (FWHM) as measured with film at the level of the film holder was 1.65 cm when the phantom was static. The FWHM measured with the phantom moving and without gating was 1.16 cm, and the penumbra was 7 mm (80–20%) on both sides. When the beam was gated with a 2 mm gating window, the FWHM was 1.8 cm with the RPM device and 1.9 cm with Calypso. Similarly, when the beam was gated with a 3 mm window, the FWHM was 1.9 cm with the RPM device and 2 cm with Calypso. Conclusion: This work suggests that the gating efficiency of the RPM device is better than that of the Calypso system with surface transponders, with reference to the latency in gating.
Dynamic time-correlated single-photon counting laser ranging
NASA Astrophysics Data System (ADS)
Peng, Huan; Wang, Yu-rong; Meng, Wen-dong; Yan, Pei-qin; Li, Zhao-hui; Li, Chen; Pan, Hai-feng; Wu, Guang
2018-03-01
We demonstrate a photon counting laser ranging experiment with a four-channel single-photon detector (SPD). The multi-channel SPD improves the counting rate to more than 4×10⁷ cps, which makes it possible to perform distance measurements even in daylight. However, the time-correlated single-photon counting (TCSPC) technique cannot easily extract the signal when fast-moving targets are submerged in a strong background. We propose a dynamic TCSPC method for measuring fast-moving targets by varying the coincidence window in real time. In the experiment, we show that targets with a velocity of 5 km/s can be detected by this method at an echo rate of 20% with background counts of more than 1.2×10⁷ cps.
Balabin, Roman M; Smirnov, Sergey V
2011-04-29
During the past several years, near-infrared (near-IR/NIR) spectroscopy has increasingly been adopted as an analytical tool in various fields from petroleum to biomedical sectors. The NIR spectrum (above 4000 cm(-1)) of a sample is typically measured by modern instruments at a few hundred wavelengths. Recently, considerable effort has been directed towards developing procedures to identify variables (wavelengths) that contribute useful information. Variable selection (VS) or feature selection, also called frequency selection or wavelength selection, is a critical step in data analysis for vibrational spectroscopy (infrared, Raman, or NIRS). In this paper, we compare the performance of 16 different feature selection methods for the prediction of properties of biodiesel fuel, including density, viscosity, methanol content, and water concentration. The feature selection algorithms tested include stepwise multiple linear regression (MLR-step), interval partial least squares regression (iPLS), backward iPLS (BiPLS), forward iPLS (FiPLS), moving window partial least squares regression (MWPLS), (modified) changeable size moving window partial least squares (CSMWPLS/MCSMWPLSR), searching combination moving window partial least squares (SCMWPLS), successive projections algorithm (SPA), uninformative variable elimination (UVE, including UVE-SPA), simulated annealing (SA), back-propagation artificial neural networks (BP-ANN), Kohonen artificial neural network (K-ANN), and genetic algorithms (GAs, including GA-iPLS). Two linear techniques for calibration model building, namely multiple linear regression (MLR) and partial least squares regression/projection to latent structures (PLS/PLSR), are used for the evaluation of biofuel properties. A comparison with a non-linear calibration model, artificial neural networks (ANN-MLP), is also provided. Discussion of gasoline, ethanol-gasoline (bioethanol), and diesel fuel data is presented. The results of applying other spectroscopic techniques, such as Raman, ultraviolet-visible (UV-vis), or nuclear magnetic resonance (NMR) spectroscopy, can be greatly improved by an appropriate choice of feature selection. Copyright © 2011 Elsevier B.V. All rights reserved.
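A rough sketch of the moving window PLS (MWPLS) idea named in the comparison above: a PLS model is cross-validated on each wavelength window, and windows with low error point to informative spectral regions. The use of scikit-learn, five latent variables, and 5-fold cross-validation are assumptions of this sketch, not details of the paper.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    def moving_window_pls(X, y, window=20, step=5, n_components=5):
        """Cross-validated RMSE of a PLS model restricted to each wavelength window;
        low-error windows indicate informative spectral regions."""
        errors = []
        for start in range(0, X.shape[1] - window + 1, step):
            Xw = X[:, start:start + window]
            pls = PLSRegression(n_components=min(n_components, window))
            scores = cross_val_score(pls, Xw, y, cv=5,
                                     scoring="neg_root_mean_squared_error")
            errors.append((start, -scores.mean()))
        return errors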
Efficient Windows Collaborative
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nils Petermann
2010-02-28
The project goals covered both the residential and commercial windows markets and involved a range of audiences such as window manufacturers, builders, homeowners, design professionals, utilities, and public agencies. Essential goals included: (1) Creation of 'Master Toolkits' of information that integrate diverse tools, rating systems, and incentive programs, customized for key audiences such as window manufacturers, design professionals, and utility programs. (2) Delivery of education and outreach programs to multiple audiences through conference presentations, publication of articles for builders and other industry professionals, and targeted dissemination of efficient window curricula to professionals and students. (3) Design and implementation of mechanisms to encourage and track sales of more efficient products through the existing Window Products Database as an incentive for manufacturers to improve products and participate in programs such as NFRC and ENERGY STAR. (4) Development of utility incentive programs to promote more efficient residential and commercial windows. Partnership with regional and local entities on the development of programs and customized information to move the market toward the highest performing products. An overarching project goal was to ensure that different audiences adopt and use the developed information, design and promotion tools and thus increase the market penetration of energy efficient fenestration products. In particular, a crucial success criterion was to move gas and electric utilities to increase the promotion of energy efficient windows through demand side management programs as an important step toward increasing the market share of energy efficient windows.
Fu, Hai-Yan; Guo, Jun-Wei; Yu, Yong-Jie; Li, He-Dong; Cui, Hua-Peng; Liu, Ping-Ping; Wang, Bing; Wang, Sheng; Lu, Peng
2016-06-24
Peak detection is a critical step in chromatographic data analysis. In the present work, we developed a multi-scale Gaussian smoothing-based strategy for accurate peak extraction. The strategy consists of three stages: background drift correction, peak detection, and peak filtration. Background drift correction was implemented using a moving window strategy. The new peak detection method is a variant of the system used by the well-known MassSpecWavelet, i.e., chromatographic peaks are found at local maximum values under various smoothing window scales. Therefore, peaks can be detected through the ridge lines of maximum values under these window scales, and signals that increase or decrease monotonically around the peak position can be treated as part of the peak. Instrumental noise was estimated after peak elimination, and a peak filtration strategy was applied to remove peaks with signal-to-noise ratios smaller than 3. The performance of our method was evaluated using two complex datasets. These datasets include essential oil samples for quality control obtained from gas chromatography and tobacco plant samples for metabolic profiling analysis obtained from gas chromatography coupled with mass spectrometry. Results confirmed the validity of the developed method. Copyright © 2016 Elsevier B.V. All rights reserved.
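A loose sketch of the three-stage idea (moving-window baseline correction, multi-scale smoothing, and signal-to-noise filtering); it is not the authors' implementation, and the local-minimum baseline, the coarse ridge matching, and the fixed scales are assumptions made here.

    import numpy as np
    from scipy.ndimage import gaussian_filter1d
    from scipy.signal import find_peaks

    def correct_baseline(signal, window=101):
        """Moving-window baseline estimate: local minimum followed by light smoothing."""
        pad = window // 2
        padded = np.pad(signal, pad, mode="edge")
        baseline = np.array([padded[i:i + window].min() for i in range(len(signal))])
        return signal - gaussian_filter1d(baseline, window / 4.0)

    def detect_peaks(signal, scales=(2, 4, 8), snr_min=3.0):
        """Keep maxima that persist across several Gaussian smoothing scales and
        exceed three times the noise level estimated after peak removal."""
        candidate_sets = []
        for s in scales:
            smoothed = gaussian_filter1d(signal, s)
            peaks, _ = find_peaks(smoothed)
            candidate_sets.append(set(np.round(peaks / 5).astype(int)))  # coarse ridge match
        ridge = set.intersection(*candidate_sets)
        peaks, _ = find_peaks(signal)
        noise = np.std(np.delete(signal, peaks)) or 1.0
        return [p for p in peaks if round(p / 5) in ridge and signal[p] / noise >= snr_min]

A typical call would be detect_peaks(correct_baseline(raw_chromatogram)), so the SNR test is applied to the drift-corrected signal.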
Teng, Wei-Zhuo; Song, Jia; Meng, Fan-Xin; Meng, Qing-Fan; Lu, Jia-Hui; Hu, Shuang; Teng, Li-Rong; Wang, Di; Xie, Jing
2014-10-01
Partial least squares (PLS) and radial basis function neural network (RBFNN) models combined with near infrared spectroscopy (NIR) were developed for cordycepic acid, polysaccharide, and adenosine analysis in Paecilomyces hepialid fermentation mycelium. The developed models possess good generalization and predictive ability and can be applied to the determination of crude drugs and related products. During the experiment, 214 Paecilomyces hepialid mycelium samples were obtained via chemical mutagenesis combined with submerged fermentation. The contents of cordycepic acid, polysaccharide, and adenosine were determined via traditional methods, and the near infrared spectroscopy data were collected. Outliers were removed and the number of samples in the calibration set was confirmed via the Monte Carlo partial least squares (MCPLS) method. Based on the values of the degree of approach (Da), both moving window partial least squares (MWPLS) and moving window radial basis function neural network (MWRBFNN) methods were applied to optimize the characteristic wavelength variables, the optimum preprocessing methods, and other important variables in the models. After comparison, RBFNN, RBFNN, and PLS models were developed successfully for cordycepic acid, polysaccharide, and adenosine detection, and the correlation between reference and predicted values in the calibration set (R2c) and validation set (R2p) of the optimum models was 0.9417 and 0.9663, 0.9803 and 0.9850, and 0.9761 and 0.9728, respectively. All the data suggest that these models possess good fitness and predictive ability.
Capillary Electrophoresis Sensitivity Enhancement Based on Adaptive Moving Average Method.
Drevinskas, Tomas; Telksnys, Laimutis; Maruška, Audrius; Gorbatsova, Jelena; Kaljurand, Mihkel
2018-06-05
In the present work, we demonstrate a novel approach to improve the sensitivity of "out of lab" portable capillary electrophoretic measurements. Nowadays, many signal enhancement methods are (i) underused (nonoptimal), (ii) overused (distorting the data), or (iii) inapplicable in field-portable instrumentation because of a lack of computational power. The described innovative migration velocity-adaptive moving average method uses an optimal averaging window size and can be easily implemented with a microcontroller. Contactless conductivity detection was used as a model for the development of the signal processing method and the demonstration of its impact on sensitivity. The frequency characteristics of the recorded electropherograms and peaks were clarified. Higher electrophoretic mobility analytes exhibit higher-frequency peaks, whereas lower electrophoretic mobility analytes exhibit lower-frequency peaks. On the basis of the obtained data, a migration velocity-adaptive moving average algorithm was created, adapted, and programmed into capillary electrophoresis data-processing software. With the developed algorithm, each data point is processed according to the migration time of the analyte. Because of the implemented migration velocity-adaptive moving average method, the signal-to-noise ratio improved up to 11 times for a sampling frequency of 4.6 Hz and up to 22 times for a sampling frequency of 25 Hz. This paper could potentially be used as a methodological guideline for the development of new smoothing algorithms that require adaptive conditions in capillary electrophoresis and other separation methods.
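A minimal sketch of a migration-time-adaptive moving average in the spirit of the method above: later-migrating (slower) analytes produce broader, lower-frequency peaks, so the averaging window is widened with migration time. The linear window-versus-time rule and its constant k are assumptions of this sketch, not the published algorithm.

    import numpy as np

    def adaptive_moving_average(signal, migration_time, k=0.05, min_win=3):
        """Smooth an electropherogram with a window that widens with migration time."""
        signal = np.asarray(signal, dtype=float)
        out = np.empty_like(signal)
        for i, t in enumerate(migration_time):
            half = max(min_win, int(k * t)) // 2 or 1      # half-width grows with time t
            lo, hi = max(0, i - half), min(len(signal), i + half + 1)
            out[i] = signal[lo:hi].mean()
        return out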
Advances and applications of ABCI
NASA Astrophysics Data System (ADS)
Chin, Y. H.
1993-05-01
ABCI (Azimuthal Beam Cavity Interaction) is a computer program which solves the Maxwell equations directly in the time domain when a Gaussian beam goes through an axi-symmetrical structure on or off axis. Many new features have been implemented in the new version of ABCI (presently version 6.6), including the 'moving mesh' and Napoly's method of calculation of wake potentials. The mesh is now generated only for the part of the structure inside a window and moves together with the window frame. This moving mesh option reduces the number of mesh points considerably, and very fine meshes can be used. Napoly's integration method makes it possible to compute wake potentials in a structure such as a collimator, where parts of the cavity material are at smaller radii than that of the beam pipes, in such a way that the contribution from the beam pipes vanishes. For the monopole wake potential, ABCI can be applied even to structures with unequal beam pipe radii. Furthermore, the radial mesh size can be varied over the structure, permitting use of a fine mesh only where actually needed. With these improvements, the program allows computation of wake fields for structures far too complicated for older codes. Plots of a cavity shape and wake potentials can be obtained in the form of a Top Drawer file. The program can also calculate and plot the impedance of a structure and/or the distribution of the deposited energy as a function of the frequency from Fourier transforms of wake potentials. Its usefulness is illustrated by showing some numerical examples.
A Novel Feature Extraction Method for Monitoring (Vehicular) Fuel Storage System Leaks
2014-10-02
gives a continuous output of the DPDF with predefined partitions. Resolution of a DPDF is dependent on the pre-determined signal range and the number of... partitions within that range. Conceptually, the proposed implementation is identical to the creation of a histogram with a moving data window given some...window. The crisp partitions within the specified signal range act as "competing and possible" scenarios or alternatives where we impose a "winner takes all
Improved modified energy ratio method using a multi-window approach for accurate arrival picking
NASA Astrophysics Data System (ADS)
Lee, Minho; Byun, Joongmoo; Kim, Dowan; Choi, Jihun; Kim, Myungsun
2017-04-01
To identify accurately the location of microseismic events generated during hydraulic fracture stimulation, it is necessary to detect the first break of the P- and S-wave arrival times recorded at multiple receivers. These microseismic data often contain high-amplitude noise, which makes it difficult to identify the P- and S-wave arrival times. The short-term-average to long-term-average (STA/LTA) and modified energy ratio (MER) methods are based on the differences in the energy densities of the noise and signal, and are widely used to identify P-wave arrival times. The MER method yields more consistent results than the STA/LTA method for data with a low signal-to-noise (S/N) ratio. However, although the MER method gives good results regardless of the delay of the signal wavelet for signals with a high S/N ratio, it may yield poor results if the signal is contaminated by high-amplitude noise and does not have the minimum delay. Here we describe an improved MER (IMER) method, whereby we apply a multiple-windowing approach to overcome the limitations of the MER method. The IMER method adds the calculation of an additional MER value using a third window (in addition to the original MER window), as well as the application of a moving average filter to each MER data point to eliminate high-frequency fluctuations in the original MER distribution. The resulting distribution makes it easier to apply thresholding. The proposed IMER method was applied to synthetic and real datasets with various S/N ratios and mixed-delay wavelets. The results show that the IMER method yields a high accuracy rate of around 80% within an error of five samples for the synthetic datasets. Likewise, for the real datasets, 94.56% of the P-wave picking results obtained by the IMER method had a deviation of less than 0.5 ms (corresponding to 2 samples) from the manual picks.
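A compact sketch of a basic MER picker plus the moving-average step the IMER variant adds; the window lengths, the trailing/leading energy-ratio form, and the cubing of the amplitude-weighted ratio follow common MER formulations and are assumptions here rather than the authors' exact parameters.

    import numpy as np

    def mer_pick(trace, window=50):
        """Modified energy ratio pick: ratio of trailing to leading window energies,
        scaled by the absolute amplitude and cubed; the pick is the index of the maximum."""
        x = np.asarray(trace, dtype=float)
        e = x ** 2
        mer = np.zeros_like(x)
        for i in range(window, len(x) - window):
            er = e[i:i + window].sum() / (e[i - window:i].sum() + 1e-12)
            mer[i] = (er * abs(x[i])) ** 3
        return int(np.argmax(mer)), mer

    def smooth_mer(mer, length=11):
        """Moving-average filter over the MER curve, the extra smoothing step used by IMER."""
        kernel = np.ones(length) / length
        return np.convolve(mer, kernel, mode="same")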
INTERIOR VIEW OF DEBITEUSE ROOM. MONORAIL USED TO MOVE DEBIS ...
INTERIOR VIEW OF DEBITEUSE ROOM. MONORAIL USED TO MOVE DEBIS IS FROM ORIGINAL CLAY HOUSE. VIEW SHOWS WORKER USING AIR HAMMER TO BEGIN FINISH ON DEBI. - Chambers-McKee Window Glass Company, Debiteuse, Clay Avenue Extension, Jeannette, Westmoreland County, PA
NASA Astrophysics Data System (ADS)
Li, M.; Yu, T.; Chunliang, X.; Zuo, X.; Liu, Z.
2017-12-01
A new method for estimating the motion of equatorial plasma bubbles (EPBs) from airglow emission all-sky images is presented in this paper. This method, called 'cloud-derived wind technology' and widely used in satellite wind observation, can reasonably derive the zonal and meridional velocity vectors of EPB drifts by tracking a series of successive airglow 630.0 nm emission images. Airglow image data are available from an all-sky airglow camera at Hainan Fuke (19.5°N, 109.2°E), supported by the China Meridian Project, which receives the 630.0 nm emission from the low-latitude ionospheric F region to observe plasma bubbles. A series of pretreatment techniques, e.g., image enhancement, orientation correction, and image projection, is used to preprocess the raw observations. The plasma bubble regions extracted from the images are then divided into several small tracing windows, and each tracing window finds a target window in the search area of the following image, which is regarded as the position to which the tracing window has moved. Velocities in each window are then calculated using the cloud-derived wind technique. In applying the cloud-derived wind technology, two methods for finding the target window, the maximum correlation coefficient (MCC) and the histogram of gradient (HOG), i.e., finding the maximum correlation or the minimum Euclidean distance between two gradient histograms, respectively, are investigated and compared in detail. The maximum correlation method is finally adopted in this study to analyze the velocity of the plasma bubbles because it performs better than HOG. All-sky images from Hainan Fuke between August 2014 and October 2014 are analyzed to investigate the plasma bubble drift velocities using the MCC method. Data at different local times on 9 nights are studied, and we find that the zonal drift velocity at different latitudes and local times ranges from 50 m/s to 180 m/s, with a peak value at about 20°N. For comparison and validation, EPB motions obtained from three traditional methods are also investigated and compared with the MCC method. The advantages and disadvantages of using cloud-derived wind technology to calculate EPB drift velocity are discussed.
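A sketch of the maximum correlation coefficient (MCC) matching step: a tracing window from one 630.0 nm image is compared with candidate windows in a search area of the next image, and the shift with the highest correlation is taken as the bubble displacement. The window size, search radius, and the pixel-to-velocity conversion noted in the comment are placeholders.

    import numpy as np

    def mcc_shift(prev_img, next_img, top_left, win=32, search=10):
        """Find where a tracing window from the previous airglow image reappears in the
        next image by maximising the correlation coefficient inside a search area."""
        r0, c0 = top_left
        tpl = prev_img[r0:r0 + win, c0:c0 + win].ravel()
        best, best_cc = (0, 0), -1.0
        for dr in range(-search, search + 1):
            for dc in range(-search, search + 1):
                r, c = r0 + dr, c0 + dc
                if r < 0 or c < 0 or r + win > next_img.shape[0] or c + win > next_img.shape[1]:
                    continue
                cand = next_img[r:r + win, c:c + win].ravel()
                if cand.std() == 0 or tpl.std() == 0:
                    continue
                cc = np.corrcoef(tpl, cand)[0, 1]
                if cc > best_cc:
                    best_cc, best = cc, (dr, dc)
        return best, best_cc   # pixel shift; scale by km/pixel and divide by the frame interval for velocity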
ECG artifact cancellation in surface EMG signals by fractional order calculus application.
Miljković, Nadica; Popović, Nenad; Djordjević, Olivera; Konstantinović, Ljubica; Šekara, Tomislav B
2017-03-01
New aspects of automatic electrocardiography artifact removal from surface electromyography signals by application of fractional order calculus in combination with linear and nonlinear moving window filters are explored. Surface electromyography recordings of skeletal trunk muscles are commonly contaminated with spike-shaped artifacts. This artifact originates from electrical heart activity, recorded by electrocardiography, which is commonly present in surface electromyography signals recorded in the proximity of the heart. For appropriate assessment of neuromuscular changes by means of surface electromyography, a proper technique for filtering the electrocardiography artifact is crucial. A novel method for automatic artifact cancellation in surface electromyography signals applying fractional order calculus and a nonlinear median filter is introduced. The proposed method is compared with the linear moving average filter, with and without prior application of fractional order calculus. 3D graphs for assessment of the filter window lengths, crest factors, root mean square differences, and fractional calculus orders (called WFC and WRC graphs) are introduced. For an appropriate quantitative filtering evaluation, a synthetic electrocardiography signal and an analogous semi-synthetic dataset were generated. Examples of noise removal in 10 able-bodied subjects and in one patient with muscular dystrophy are presented for qualitative analysis. The crest factors, correlation coefficients, and root mean square differences of the recorded and semi-synthetic electromyography datasets showed that the most successful method was the median filter in combination with fractional order calculus of order 0.9. Statistically significant (p < 0.001) ECG peak reduction was obtained by the median filter compared to the moving average filter in cases where the muscle contraction amplitude was low relative to the ECG spikes. The presented results suggest that the novel method combining a median filter and fractional order calculus can be used for automatic filtering of electrocardiography artifacts in surface electromyography signal envelopes recorded from trunk muscles. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
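A minimal sketch contrasting the nonlinear moving-window (median) filter with the linear moving average used as the baseline comparison; the window length is a placeholder, and the fractional-order calculus pre-processing step reported in the paper is not reproduced here.

    import numpy as np
    from scipy.signal import medfilt

    def remove_ecg_spikes(emg_envelope, window=51):
        """Nonlinear moving-window (median) filter: narrow ECG spikes are suppressed while
        the slower EMG envelope is largely preserved; the window length must be odd."""
        return medfilt(np.asarray(emg_envelope, dtype=float), kernel_size=window)

    def moving_average(x, window=51):
        """Linear counterpart used as the baseline comparison in the study."""
        kernel = np.ones(window) / window
        return np.convolve(np.asarray(x, dtype=float), kernel, mode="same")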
Can we estimate total magnetization directions from aeromagnetic data using Helbig's integrals?
Phillips, J.D.
2005-01-01
An algorithm that implements Helbig's (1963) integrals for estimating the vector components (mx, my, mz) of the magnetic dipole moment from the first-order moments of the vector magnetic field components (ΔX, ΔY, ΔZ) is tested on real and synthetic data. After a grid of total-field aeromagnetic data is converted to vector component grids using Fourier filtering, Helbig's infinite integrals are evaluated as finite integrals in small moving windows using a quadrature algorithm based on the 2-D trapezoidal rule. Prior to integration, best-fit planar surfaces must be removed from the component data within the data windows in order to make the results independent of the coordinate system origin. Two different approaches are described for interpreting the results of the integration. In the "direct" method, results from pairs of different window sizes are compared to identify grid nodes where the angular difference between solutions is small. These solutions provide valid estimates of total magnetization directions for compact sources such as spheres or dipoles, but not for horizontally elongated or 2-D sources. In the "indirect" method, which is more forgiving of source geometry, the results of the quadrature analysis are scanned for solutions that are parallel to a specified total magnetization direction.
Apparatus and method for solar coal gasification
Gregg, David W.
1980-01-01
Apparatus for using focused solar radiation to gasify coal and other carbonaceous materials. Incident solar radiation is focused from an array of heliostats onto a tower-mounted secondary mirror which redirects the focused solar radiation down through a window onto the surface of a vertically-moving bed of coal, or a fluidized bed of coal, contained within a gasification reactor. The reactor is designed to minimize contact between the window and solids in the reactor. Steam introduced into the gasification reactor reacts with the heated coal to produce gas consisting mainly of carbon monoxide and hydrogen, commonly called "synthesis gas", which can be converted to methane, methanol, gasoline, and other useful products. One of the novel features of the invention is the generation of process steam at the rear surface of the secondary mirror.
Effects of Spatio-Temporal Aliasing on Out-the-Window Visual Systems
NASA Technical Reports Server (NTRS)
Sweet, Barbara T.; Stone, Leland S.; Liston, Dorion B.; Hebert, Tim M.
2014-01-01
Designers of out-the-window visual systems face a challenge when attempting to simulate the outside world as viewed from a cockpit. Many methodologies have been developed and adopted to aid in the depiction of particular scene features or levels of static image detail. However, because aircraft move, it is necessary to also consider the quality of motion in the simulated visual scene. When motion is introduced in the simulated visual scene, perceptual artifacts can become apparent. A particular artifact related to image motion, spatio-temporal aliasing, is addressed. The causes of spatio-temporal aliasing are discussed, and current knowledge regarding the impact of these artifacts on both motion perception and simulator task performance is reviewed. Methods of reducing the impact of this artifact are also addressed.
A study on scattering correction for γ-photon 3D imaging test method
NASA Astrophysics Data System (ADS)
Xiao, Hui; Zhao, Min; Liu, Jiantang; Chen, Hao
2018-03-01
A pair of 511 keV γ-photons is generated during positron annihilation; their directions differ by 180°. The moving path and energy information can be used to form a 3D imaging test method in the industrial domain. However, scattered γ-photons are the major factor influencing the imaging precision of the test method. This study proposes a γ-photon single-scattering correction method from the perspective of spatial geometry. The method first determines the possible scattering points when a scattered γ-photon pair hits the detector pair. The range of scattering angles can then be calculated according to the energy window. Finally, the number of scattered γ-photons is estimated from the attenuation of the total scattered γ-photons along their moving paths. The corrected γ-photons are obtained by deducting the scattered γ-photons from the original ones. Two experiments were conducted to verify the effectiveness of the proposed scattering correction method. The results show that the proposed method can efficiently correct for scattered γ-photons and improve the test accuracy.
NASA Technical Reports Server (NTRS)
2003-01-01
January 28, 2003: The Mars Exploration Rover-2 is moved to a workstand in the Payload Hazardous Servicing Facility. Set to launch in 2003, the Mars Exploration Rover Mission will consist of two identical rovers designed to cover roughly 110 yards (100 meters) each Martian day. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The rovers will be identical to each other, but will land at different regions of Mars. The first rover has a launch window opening May 30, 2003, and the second rover a window opening June 25, 2003.
NASA Astrophysics Data System (ADS)
Levine, Zachary H.; Pintar, Adam L.
2015-11-01
A simple algorithm for averaging a stochastic sequence of 1D arrays in a moving, expanding window is provided. The samples are grouped in bins which increase exponentially in size so that a constant fraction of the samples is retained at any point in the sequence. The algorithm is shown to have particular relevance for a class of Monte Carlo sampling problems which includes one characteristic of iterative reconstruction in computed tomography. The code is available in the CPC program library in both Fortran 95 and C and is also available in R through CRAN.
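A rough reading of the binning idea, not a port of the published Fortran 95/C code: incoming 1D arrays fill bins whose capacities double, and the oldest bin is dropped once keeping it would retain more than a chosen fraction of all samples seen, so the window both moves and expands. The keep fraction and the drop rule are assumptions of this sketch.

    import numpy as np

    class MovingExpandingAverage:
        """Running average of a stream of 1D arrays over a moving, expanding window."""

        def __init__(self, keep_fraction=0.5):
            self.keep_fraction = keep_fraction
            self.bins = []          # oldest first: [capacity, count, running_sum]
            self.seen = 0

        def add(self, x):
            x = np.asarray(x, dtype=float)
            self.seen += 1
            if not self.bins or self.bins[-1][1] == self.bins[-1][0]:
                cap = 1 if not self.bins else self.bins[-1][0] * 2   # capacities 1, 2, 4, ...
                self.bins.append([cap, 0, np.zeros_like(x)])
            self.bins[-1][1] += 1
            self.bins[-1][2] += x
            retained = sum(b[1] for b in self.bins)
            # moving window: drop the oldest bin while the kept fraction stays large enough
            while len(self.bins) > 1 and retained - self.bins[0][1] >= self.keep_fraction * self.seen:
                retained -= self.bins[0][1]
                self.bins.pop(0)

        def average(self):
            n = sum(b[1] for b in self.bins)
            return sum(b[2] for b in self.bins) / n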
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-09-14
This package contains statistical routines for extracting features from multivariate time-series data, which can then be used in subsequent multivariate statistical analysis to identify patterns and anomalous behavior. It calculates local linear or quadratic regression model fits to moving windows for each series and then summarizes the model coefficients across user-defined time intervals for each series. These methods are domain agnostic, but they have been successfully applied to a variety of domains, including commercial aviation and electric power grid data.
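A minimal sketch of the moving-window regression feature extraction described above, assuming a single evenly sampled series; the window length, summary interval, and linear (rather than quadratic) fit are placeholder choices.

    import numpy as np

    def window_regression_features(series, fs, window_s=30.0, summary_s=300.0, order=1):
        """Fit a local polynomial (linear by default) in each moving window and summarise
        the coefficients over longer intervals, one feature vector per interval."""
        win = int(window_s * fs)
        t = np.arange(win) / fs
        coefs = []
        for start in range(0, len(series) - win + 1, win):
            seg = np.asarray(series[start:start + win], dtype=float)
            coefs.append(np.polyfit(t, seg, order))          # slope(s) and intercept
        coefs = np.array(coefs)
        per_summary = max(1, int(summary_s // window_s))
        features = [coefs[i:i + per_summary].mean(axis=0)
                    for i in range(0, len(coefs), per_summary)]
        return np.array(features)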
Online tracking of instantaneous frequency and amplitude of dynamical system response
NASA Astrophysics Data System (ADS)
Frank Pai, P.
2010-05-01
This paper presents a sliding-window tracking (SWT) method for accurate tracking of the instantaneous frequency and amplitude of an arbitrary dynamic response by processing only the three (or more) most recent data points. The Teager-Kaiser algorithm (TKA) is a well-known four-point method for online tracking of frequency and amplitude. Because finite differences are used in TKA, its accuracy is easily destroyed by measurement and/or signal-processing noise. Moreover, because TKA assumes the processed signal to be a pure harmonic, any moving average in the signal can destroy the accuracy of TKA. On the other hand, because SWT uses a constant and a pair of windowed regular harmonics to fit the data and estimate the instantaneous frequency and amplitude, the influence of any moving average is eliminated. Moreover, noise filtering is an implicit capability of SWT when more than three data points are used, and this capability increases with the number of processed data points. To compare the accuracy of SWT and TKA, the Hilbert-Huang transform is used to extract accurate time-varying frequencies and amplitudes by processing the whole data set without assuming the signal to be harmonic. Frequency and amplitude tracking of different amplitude- and frequency-modulated signals, vibrato in music, and nonlinear stationary and non-stationary dynamic signals is studied. Results show that SWT is more accurate, robust, and versatile than TKA for online tracking of frequency and amplitude.
Jalali-Heravi, Mehdi; Moazeni-Pourasil, Roudabeh Sadat; Sereshti, Hassan
2015-03-01
In the analysis of complex natural matrices by gas chromatography-mass spectrometry (GC-MS), many disturbing factors such as baseline drift, spectral background, homoscedastic and heteroscedastic noise, peak shape deformation (non-Gaussian peaks), low S/N ratio, and co-elution (overlapped and/or embedded peaks) must be handled to save time, money, and experimental effort. This study aimed to improve the GC-MS analysis of complex natural matrices utilizing multivariate curve resolution (MCR) methods. In addition, to assess the peak purity of the two-dimensional data, a method called variable size moving window-evolving factor analysis (VSMW-EFA) is introduced and examined. The proposed methodology was applied to the GC-MS analysis of Iranian Lavender essential oil, which extended the number of identified constituents from 56 to 143 components. It was found that the most abundant constituents of the Iranian Lavender essential oil are α-pinene (16.51%), camphor (10.20%), 1,8-cineole (9.50%), bornyl acetate (8.11%) and camphene (6.50%). This indicates that Iranian-type Lavender contains a relatively high percentage of α-pinene. Comparison of different types of Lavender essential oils showed the compositional similarity between Iranian and Italian (Sardinia Island) Lavenders. Published by Elsevier B.V.
Hussein, Sami; Kruger, Jörg
2011-01-01
Robot-assisted training has proven beneficial as an extension of conventional therapy to improve rehabilitation outcome. Further facilitation of this positive impact is expected from the application of cooperative control algorithms that increase the patient's contribution to the training effort according to his level of ability. This paper presents an approach to cooperative training for end-effector based gait rehabilitation devices, thereby providing the basis for establishing sophisticated cooperative control methods in this class of devices. It uses a haptic control framework to synthesize and render complex, task-specific training environments, which are composed of polygonal primitives. Training assistance is integrated as part of the environment into the haptic control framework. A compliant window is moved along a nominal training trajectory, compliantly guiding and supporting the foot motion. The level of assistance is adjusted via the stiffness of the moving window. Furthermore, an iterative learning algorithm is used to automatically adjust this assistance level. Stable haptic rendering of the dynamic training environments and adaptive movement assistance were evaluated in two example training scenarios: treadmill walking and stair climbing. Data from preliminary trials with one healthy subject are provided in this paper. © 2011 IEEE
Moving Rivers, Shifting Streams: Perspectives on the Existence of a Policy Window
ERIC Educational Resources Information Center
Galligan, Ann M.; Burgess, Chris N.
2005-01-01
This article represents differing perspectives on the creation and establishment of the Rhode Island Arts Learning Network (ALN). At the heart of this discussion is whether or not the Rhode Island task force in charge of this process took advantage of what noted public policy analyst John Kingdon refers to as a "policy window" where…
[A peak recognition algorithm designed for chromatographic peaks of transformer oil].
Ou, Linjun; Cao, Jian
2014-09-01
In the field of chromatographic peak identification for transformer oil, the traditional first-order derivative method requires a slope threshold to achieve peak identification. To address its shortcomings of low automation and susceptibility to distortion, the first-order derivative method was improved by applying a moving average iterative method and normalized analysis techniques to identify the peaks. Accurate identification of the chromatographic peaks was achieved by using multiple iterations of the moving average of the signal and square-wave curves to determine the optimal values of the normalized peak identification parameters, combined with the absolute peak retention times and the peak window. The experimental results show that this algorithm can accurately identify the peaks and is not sensitive to noise, chromatographic peak width, or peak shape changes. It has strong adaptability and meets the on-site requirements of online monitoring devices for dissolved gases in transformer oil.
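A small sketch of two ingredients the abstract names, an iterated moving average and a normalized peak test; the window length, iteration count, and threshold are placeholders, and the square-wave comparison curve used in the paper is not reproduced here.

    import numpy as np

    def iterated_moving_average(x, window=5, iterations=3):
        """Repeatedly apply a short moving average; repeated smoothing suppresses noise
        while retaining genuine chromatographic peaks."""
        x = np.asarray(x, dtype=float)
        kernel = np.ones(window) / window
        for _ in range(iterations):
            x = np.convolve(x, kernel, mode="same")
        return x

    def find_peaks_normalized(signal, window=5, iterations=3, threshold=0.1):
        """Normalise the smoothed curve to [0, 1] and keep local maxima above a
        normalised threshold, so the cut-off does not depend on absolute intensity."""
        smooth = iterated_moving_average(signal, window, iterations)
        norm = (smooth - smooth.min()) / (np.ptp(smooth) + 1e-12)
        return [i for i in range(1, len(norm) - 1)
                if norm[i] > norm[i - 1] and norm[i] >= norm[i + 1] and norm[i] > threshold]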
Quantifying rapid changes in cardiovascular state with a moving ensemble average.
Cieslak, Matthew; Ryan, William S; Babenko, Viktoriya; Erro, Hannah; Rathbun, Zoe M; Meiring, Wendy; Kelsey, Robert M; Blascovich, Jim; Grafton, Scott T
2018-04-01
MEAP, the moving ensemble analysis pipeline, is a new open-source tool designed to perform multisubject preprocessing and analysis of cardiovascular data, including electrocardiogram (ECG), impedance cardiogram (ICG), and continuous blood pressure (BP). In addition to traditional ensemble averaging, MEAP implements a moving ensemble averaging method that allows for the continuous estimation of indices related to cardiovascular state, including cardiac output, preejection period, heart rate variability, and total peripheral resistance, among others. Here, we define the moving ensemble technique mathematically, highlighting its differences from fixed-window ensemble averaging. We describe MEAP's interface and features for signal processing, artifact correction, and cardiovascular-based fMRI analysis. We demonstrate the accuracy of MEAP's novel B point detection algorithm on a large collection of hand-labeled ICG waveforms. As a proof of concept, two subjects completed a series of four physical and cognitive tasks (cold pressor, Valsalva maneuver, video game, random dot kinetogram) on 3 separate days while ECG, ICG, and BP were recorded. Critically, the moving ensemble method reliably captures the rapid cyclical cardiovascular changes related to the baroreflex during the Valsalva maneuver and the classic cold pressor response. Cardiovascular measures were seen to vary considerably within repetitions of the same cognitive task for each individual, suggesting that a carefully designed paradigm could be used to capture fast-acting event-related changes in cardiovascular state. © 2017 Society for Psychophysiological Research.
Modeling laser-plasma acceleration in the laboratory frame
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2011-01-01
A simulation of laser-plasma acceleration in the laboratory frame. Both the laser and the wakefield buckets must be resolved over the entire domain of the plasma, requiring many cells and many time steps. While researchers often use a simulation window that moves with the pulse, this reduces only the multitude of cells, not the multitude of time steps. For an artistic impression of how to solve the simulation by using the boosted-frame method, watch the video "Modeling laser-plasma acceleration in the wakefield frame".
Generating Daily Synthetic Landsat Imagery by Combining Landsat and MODIS Data
Wu, Mingquan; Huang, Wenjiang; Niu, Zheng; Wang, Changyao
2015-01-01
Owing to low temporal resolution and cloud interference, there is a shortage of high spatial resolution remote sensing data. To address this problem, this study introduces a modified spatial and temporal data fusion approach (MSTDFA) to generate daily synthetic Landsat imagery. This algorithm was designed to avoid the limitations of the conditional spatial temporal data fusion approach (STDFA), including the constant window for disaggregation and the sensor difference. An adaptive window size selection method is proposed in this study to select the best window size and moving steps for the disaggregation of coarse pixels. The linear regression method is used to remove the influence of differences in sensor systems using the disaggregated mean coarse reflectance, with testing and validation in two study areas located in Xinjiang Province, China. The results show that the MSTDFA algorithm can generate daily synthetic Landsat imagery with a high correlation coefficient (R) ranging from 0.646 to 0.986 between synthetic images and the actual observations. We further show that MSTDFA can be applied to 250 m 16-day MODIS MOD13Q1 products and Landsat Normalized Difference Vegetation Index (NDVI) data by generating a synthetic NDVI image highly similar to the actual Landsat NDVI observation, with a high R of 0.97. PMID:26393607
ERIC Educational Resources Information Center
Selcuk, Gamze Sezgin; Yurumezoglu, Kemal
2013-01-01
Someone in a car moving at constant speed along a smooth, straight road cannot perceive movement unless he looks out a window. When the person looks out and sees another car traveling alongside, in the same direction and at an equal speed, he will think that the other car is not moving either. When we see a tree in the distance as we are driving…
NASA Astrophysics Data System (ADS)
Prawin, J.; Rama Mohan Rao, A.
2018-01-01
Knowledge of the dynamic loads acting on a structure is required for many practical engineering problems, such as structural strength analysis, health monitoring and fault diagnosis, and vibration isolation. In this paper, we present an online input force time history reconstruction algorithm using Dynamic Principal Component Analysis (DPCA) from acceleration time history response measurements using moving windows. We also present an optimal sensor placement algorithm to place limited sensors at dynamically sensitive spatial locations. The major advantage of the proposed input force identification algorithm is that it does not require a finite element idealization of the structure, unlike earlier formulations, and is therefore free from physical modelling errors. We have considered three numerical examples to validate the accuracy of the proposed DPCA-based method. The effects of measurement noise, multiple force identification, different kinds of loading, incomplete measurements, and high noise levels are investigated in detail. Parametric studies have been carried out to arrive at the optimal window size and the percentage of window overlap. The studies presented in this paper clearly establish the merits of the proposed algorithm for online load identification.
NASA Astrophysics Data System (ADS)
Taira, T.; Kato, A.
2013-12-01
A high-resolution Vp/Vs ratio estimate is one of the key parameters for understanding spatial variations of composition and physical state within the Earth. Lin and Shearer (2007, BSSA) recently developed a methodology to obtain local Vp/Vs ratios in individual similar-earthquake clusters, based on P- and S-wave differential times. A waveform cross-correlation approach is typically employed to measure those differential times for pairs of seismograms from similar-earthquake clusters, in narrow time windows around the direct P and S waves. This approach effectively collects P- and S-wave differential times, but it requires robust P- and S-wave time windows extracted from either manually or automatically picked P- and S-phases. We present another technique to estimate P- and S-wave differential times by exploiting the delay time as a function of elapsed time along the seismograms with a moving-window cross-correlation analysis (e.g., Snieder, 2002, Phys. Rev. E; Niu et al., 2003, Nature). Our approach is based on the principle that the delay time for the direct S wave differs from that for the direct P wave. For two seismograms from a pair of similar earthquakes aligned on the direct P wave, the delay times are near zero around the direct P wave. In contrast, delay times obtained from time windows including the direct S wave have non-zero values. Our approach, in principle, is capable of measuring both P- and S-wave differential times from single-component seismograms. In an ideal case, the temporal evolution of the delay time becomes a step function with its discontinuity at the onset of the direct S wave. The offset of the resulting step function is the S-wave differential time relative to the P-wave differential time, as the two waveforms are aligned on the direct P wave. We apply our moving-window cross-correlation technique to two data sets collected at: 1) the Wakayama district, Japan, and 2) the Geysers geothermal field, California. Both target areas are characterized by earthquake swarms that provide a number of similar-event clusters. We use the following automated procedure to systematically analyze the two data sets: 1) identification of the direct P arrivals using an Akaike Information Criterion based phase-picking algorithm introduced by Zhang and Thurber (2003, BSSA), 2) waveform alignment on the P wave with waveform cross-correlation to obtain the P-wave differential time, and 3) the moving-window analysis to estimate the S-wave differential time. Kato et al. (2010, GRL) estimated the Vp/Vs ratios for a few similar-earthquake clusters from the Wakayama data set by a conventional approach to obtaining differential times. We find that the resulting Vp/Vs ratios from our approach for the same earthquake clusters are comparable with those obtained by Kato et al. (2010, GRL). We show that the moving-window cross-correlation technique effectively measures both P- and S-wave differential times for seismograms in which clear P and S phases are not observed. We will show spatial distributions of Vp/Vs ratios in our two target areas.
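A sketch of the moving-window cross-correlation idea: the delay of one seismogram relative to the other is measured in short sliding windows, and the jump from near-zero delay to a constant offset marks the S-wave differential time. The window, step, and maximum lag values are placeholders, and the step detection itself is left to the user.

    import numpy as np

    def moving_window_delays(ref, tgt, fs, win_s=0.2, step_s=0.05, max_lag_s=0.05):
        """Delay of `tgt` relative to `ref` in short sliding windows along the seismograms."""
        win, step, max_lag = int(win_s * fs), int(step_s * fs), int(max_lag_s * fs)
        delays, times = [], []
        for start in range(0, len(ref) - win + 1, step):
            a = ref[start:start + win] - np.mean(ref[start:start + win])
            b = tgt[start:start + win] - np.mean(tgt[start:start + win])
            cc = np.correlate(b, a, mode="full")          # lag of b relative to a
            lags = np.arange(-win + 1, win)
            keep = np.abs(lags) <= max_lag
            delays.append(lags[keep][np.argmax(cc[keep])] / fs)
            times.append(start / fs)
        return np.array(times), np.array(delays)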
A case study of exposure to ultrafine particles from secondhand tobacco smoke in an automobile.
Liu, S; Zhu, Y
2010-10-01
Secondhand tobacco smoke (SHS) in enclosed spaces is a major source of potentially harmful airborne particles. To quantify exposure to ultrafine particles (UFP) due to SHS and to investigate the interaction between pollutants from SHS and vehicular emissions, the number concentration and size distribution of UFP and other air pollutants (CO, CO(2), and PM(2.5)) were measured inside a moving vehicle under five different ventilation conditions. A major interstate freeway with a speed limit of 60 mph and an urban roadway with a speed limit of 30 mph were selected to represent typical urban routes. In a typical 30-min commute on urban roadways, the SHS of one cigarette exposed passengers to approximately 10 times the UFP and 120 times the PM(2.5) of ambient air. The most effective solution to protect passengers from SHS exposure is to abstain from smoking in the vehicle. Opening a window is an effective method for decreasing pollutant exposures on most urban roadways. However, under road conditions with high UFP concentrations, such as tunnels or busy freeways with a high proportion of heavy-duty diesel trucks (such as the 710 Freeway in Los Angeles, CA, USA), opening a window is not a viable method to reduce UFPs. Time budget studies show that Americans spend, on average, more than 60 min each day in enclosed vehicles. Smoking inside vehicles can expose the driver and other passengers to high levels of pollutants. Thus, an understanding of the variations and interactions of secondhand tobacco smoke (SHS) and vehicular emissions under realistic driving conditions is necessary. Results of this study indicated that high ventilation rates can effectively dilute ultrafine particles (UFP) inside moving vehicles on urban routes. However, driving with open windows and an increased air exchange rate (AER) is not recommended in tunnels and on heavily travelled freeways.
On the temporal window of auditory-brain system in connection with subjective responses
NASA Astrophysics Data System (ADS)
Mouri, Kiminori
2003-08-01
The human auditory-brain system processes information extracted from the autocorrelation function (ACF) of the source signal and the interaural cross-correlation function (IACF) of the binaural sound signals, which are associated with the left and right cerebral hemispheres, respectively. The purpose of this dissertation is to determine the desirable temporal window (2T: integration interval) for the ACF and IACF mechanisms. For the ACF mechanism, the change of Φ(0), i.e., the power of the ACF, was associated with the change of loudness, and it is shown that the recommended temporal window is about 30(τe)min [s]. The value of (τe)min is the minimum value of the effective duration of the running ACF of the source signal. It is worth noting from the EEG experiment that the most preferred delay time of the first reflection is determined by the piece of the source signal indicating (τe)min. For the IACF mechanism, the temporal window is determined as follows: the measured range of τIACC corresponding to the subjective angle of the moving sound image depends on the temporal window. Here, the moving sound image was simulated using two loudspeakers located at ±20° in the horizontal plane, reproducing amplitude-modulated band-limited noise alternately. It is found that the temporal window has a wide range of values, from 0.03 to 1 [s], for modulation frequencies below 0.2 Hz. Thesis advisor: Yoichi Ando. Copies of this thesis, written in English, can be obtained from Kiminori Mouri, 5-3-3-1110 Harayama-dai, Sakai city, Osaka 590-0132, Japan. E-mail address: km529756@aol.com
Robotic Attention Processing And Its Application To Visual Guidance
NASA Astrophysics Data System (ADS)
Barth, Matthew; Inoue, Hirochika
1988-03-01
This paper describes a method of real-time visual attention processing for robots performing visual guidance. This robot attention processing is based on a novel vision processor, the multi-window vision system that was developed at the University of Tokyo. The multi-window vision system is unique in that it only processes visual information inside local area windows. These local area windows are quite flexible in their ability to move anywhere on the visual screen, change their size and shape, and alter their pixel sampling rate. By using these windows for specific attention tasks, it is possible to perform high-speed attention processing. The primary attention skills of detecting motion, tracking an object, and interpreting an image are all performed at high speed on the multi-window vision system. A basic robotic attention scheme using these attention skills was developed. The attention skills involved detection and tracking of salient visual features. The tracking and motion information thus obtained was utilized in producing the response to the visual stimulus. The response of the attention scheme was quick enough to be applicable to the real-time vision processing tasks of playing a video 'pong' game, and later using an automobile driving simulator. By detecting the motion of a 'ball' on a video screen and then tracking the movement, the attention scheme was able to control a 'paddle' in order to keep the ball in play. The response was faster than that of a human, allowing the attention scheme to play the video game at higher speeds. Further, in the application to the driving simulator, the attention scheme was able to control both the direction and velocity of a simulated vehicle following a lead car. These two applications show the potential of local visual processing for use in robotic attention processing.
Hyun, Eugin; Jin, Young-Seok; Lee, Jong-Hun
2016-01-01
For an automotive pedestrian detection radar system, fast-ramp based 2D range-Doppler Frequency Modulated Continuous Wave (FMCW) radar is effective for distinguishing between moving targets and unwanted clutter. However, when a weak moving target such as a pedestrian exists together with strong clutter, the pedestrian may be masked by the side-lobe of the clutter even though they are notably separated in the Doppler dimension. To prevent this problem, one popular solution is the use of a windowing scheme with a weighting function. However, this method leads to a spread spectrum, so a pedestrian with weak signal power and slow Doppler may also be masked by the main-lobe of the clutter. With a fast-ramp based FMCW radar, if the target is moving, the complex spectrum of the range-Fast Fourier Transform (FFT) changes with a constant phase difference over ramps. In contrast, the clutter exhibits constant phase irrespective of the ramps. Based on this fact, in this paper we propose a pedestrian detection method for highly cluttered environments using a coherent phase difference method. By detecting the coherent phase difference from the complex spectrum of the range-FFT, we first extract the range profile of the moving pedestrians. Then, through the Doppler FFT, we obtain the 2D range-Doppler map for only the pedestrian. To test the proposed detection scheme, we have developed a real-time data logging system with a 24 GHz FMCW transceiver. In laboratory tests, we verified that the signal processing results from the proposed method were much better than those obtained with the conventional 2D FFT-based detection method. PMID:26805835
Pettit runs a drill while looking through a camera mounted on the Nadir window in the U.S. Lab
2003-04-05
ISS006-E-44305 (5 April 2003) --- Astronaut Donald R. Pettit, Expedition Six NASA ISS science officer, runs a drill while looking through a camera mounted on the nadir window in the Destiny laboratory on the International Space Station (ISS). The device is called a barn door tracker. The drill turns the screw, which moves the camera and its spotting scope.
[A fast iterative algorithm for adaptive histogram equalization].
Cao, X; Liu, X; Deng, Z; Jiang, D; Zheng, C
1997-01-01
In this paper, we propose an iterative algorithm called FAHE, which is based on the relationship between the current local histogram and the one computed before the sliding window moves. Compared with basic AHE, the computing time of FAHE is decreased from 5 hours to 4 minutes on a 486dx/33 compatible computer, when using a 65 x 65 sliding window on a 512 x 512 image with an 8-bit gray-level range.
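The reported speedup presumably comes from updating the local histogram incrementally as the window slides, instead of recomputing it at every pixel. The sketch below illustrates that idea under my own assumptions (window size, padding, and 8-bit mapping are illustrative, not taken from the paper).

```python
import numpy as np

def sliding_window_equalize(img, radius=32):
    """Minimal sketch of adaptive histogram equalization with an incrementally
    updated local histogram (the idea behind FAHE, as I read the abstract).
    img : 2-D uint8 array.  radius : half-width of the (2*radius+1)^2 window.
    """
    h, w = img.shape
    out = np.zeros_like(img)
    padded = np.pad(img, radius, mode='reflect')
    for i in range(h):
        # Build the full histogram once per row ...
        hist = np.zeros(256, dtype=np.int64)
        win = padded[i:i + 2 * radius + 1, 0:2 * radius + 1]
        np.add.at(hist, win.ravel(), 1)
        for j in range(w):
            if j > 0:
                # ... then update it: remove the leaving column, add the entering one.
                left = padded[i:i + 2 * radius + 1, j - 1]
                right = padded[i:i + 2 * radius + 1, j + 2 * radius]
                np.subtract.at(hist, left, 1)
                np.add.at(hist, right, 1)
            cdf = hist.cumsum()
            out[i, j] = (cdf[img[i, j]] * 255) // cdf[-1]
    return out
```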
Time-localized wavelet multiple regression and correlation
NASA Astrophysics Data System (ADS)
Fernández-Macho, Javier
2018-02-01
This paper extends wavelet methodology to handle comovement dynamics of multivariate time series via moving weighted regression on wavelet coefficients. The concept of wavelet local multiple correlation is used to produce one single set of multiscale correlations along time, in contrast with the large number of wavelet correlation maps that need to be compared when using standard pairwise wavelet correlations with rolling windows. Also, the spectral properties of weight functions are investigated and it is argued that some common time windows, such as the usual rectangular rolling window, are not satisfactory on these grounds. The method is illustrated with a multiscale analysis of the comovements of Eurozone stock markets during this century. It is shown how the evolution of the correlation structure in these markets has been far from homogeneous both along time and across timescales featuring an acute divide across timescales at about the quarterly scale. At longer scales, evidence from the long-term correlation structure can be interpreted as stable perfect integration among Euro stock markets. On the other hand, at intramonth and intraweek scales, the short-term correlation structure has been clearly evolving along time, experiencing a sharp increase during financial crises which may be interpreted as evidence of financial 'contagion'.
Air change rates of motor vehicles and in-vehicle pollutant concentrations from secondhand smoke.
Ott, Wayne; Klepeis, Neil; Switzer, Paul
2008-05-01
The air change rates of motor vehicles are relevant to the sheltering effect from air pollutants entering from outside a vehicle and also to the interior concentrations from any sources inside its passenger compartment. We made more than 100 air change rate measurements on four motor vehicles under moving and stationary conditions; we also measured the carbon monoxide (CO) and fine particle (PM(2.5)) decay rates from 14 cigarettes smoked inside the vehicle. With the vehicle stationary and the fan off, the ventilation rate in air changes per hour (ACH) was less than 1 h(-1) with the windows closed and increased to 6.5 h(-1) with one window fully opened. The vehicle speed, window position, ventilation system, and air conditioner setting were found to affect the ACH. For closed windows and passive ventilation (fan off and no recirculation), the ACH was linearly related to the vehicle speed over the range from 15 to 72 mph (25 to 116 km h(-1)). With a vehicle moving, windows closed, and the ventilation system off (or the air conditioner set to AC Max), the ACH was less than 6.6 h(-1) for speeds ranging from 20 to 72 mph (32 to 116 km h(-1)). Opening a single window by 3'' (7.6 cm) increased the ACH by 8-16 times. For the 14 cigarettes smoked in vehicles, the deposition rate k and the air change rate a were correlated, following the equation k=1.3a (R(2)=82%; n=14). With recirculation on (or AC Max) and closed windows, the interior PM(2.5) concentration exceeded 2000 microg m(-3) momentarily for all cigarettes tested, regardless of speed. The concentration time series measured inside the vehicle followed the mathematical solutions of the indoor mass balance model, and the 24-h average personal exposure to PM(2.5) could exceed 35 microg m(-3) for just two cigarettes smoked inside the vehicle.
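For reference, the indoor (in-vehicle) mass-balance model mentioned above has a standard well-mixed, single-compartment form; the notation below is mine and may differ from the paper's.

```latex
% Standard well-mixed, single-compartment mass balance (notation is mine):
%   C(t): interior concentration, V: cabin volume, S(t): source emission rate,
%   a: air change rate, k: deposition rate.
\frac{dC}{dt} = \frac{S(t)}{V} - (a + k)\,C(t),
\qquad
C(t) = C_0\,e^{-(a+k)t} \ \text{once the source stops } (S = 0).
```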
Yeo, Boon Y.; McLaughlin, Robert A.; Kirk, Rodney W.; Sampson, David D.
2012-01-01
We present a high-resolution three-dimensional position tracking method that allows an optical coherence tomography (OCT) needle probe to be scanned laterally by hand, providing the high degree of flexibility and freedom required in clinical usage. The method is based on a magnetic tracking system, which is augmented by cross-correlation-based resampling and a two-stage moving window average algorithm to improve upon the tracker's limited intrinsic spatial resolution, achieving 18 µm RMS position accuracy. A proof-of-principle system was developed, with successful image reconstruction demonstrated on phantoms and on ex vivo human breast tissue validated against histology. This freehand scanning method could contribute toward clinical implementation of OCT needle imaging. PMID:22808429
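A generic two-pass moving-window average, applied to resampled tracker coordinates, conveys the flavor of the smoothing stage described above; the window lengths and cascading order here are assumptions, not the paper's values.

```python
import numpy as np

def moving_average(x, window):
    """Centred moving average via convolution (edges handled by 'same' mode)."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode='same')

def two_stage_smooth(positions, w1=15, w2=5):
    """Cascade of two moving-window averages, one per stage, as a generic
    stand-in for the two-stage smoothing described in the abstract.
    positions : 1-D array of (resampled) tracker coordinates along one axis.
    """
    return moving_average(moving_average(positions, w1), w2)
```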
Dual-filter estimation for rotating-panel sample designs
Francis Roesch
2017-01-01
Dual-filter estimators are described and tested for use in the annual estimation for national forest inventories. The dual-filter approach involves the use of a moving window estimator in the first pass, which is used as input to Theil's mixed estimator in the second pass. The moving window and dual-filter estimators are tested along with two other estimators in a...
Fundamental study of compression for movie files of coronary angiography
NASA Astrophysics Data System (ADS)
Ando, Takekazu; Tsuchiya, Yuichiro; Kodera, Yoshie
2005-04-01
When network distribution of movie files is considered, lossy-compressed movie files with small file sizes could be useful. We chose three kinds of coronary stricture movies with different motion speeds as examination objects: movies with slow, normal, and fast heart rates. MPEG-1, DivX5.11, WMV9 (Windows Media Video 9), and WMV9-VCM (Windows Media Video 9-Video Compression Manager) movies were made from the three kinds of AVI format movies with different motion speeds. Five kinds of movies, the four kinds of compressed movies and the non-compressed AVI used instead of the DICOM format, were evaluated by Thurstone's method. The evaluation factors were "sharpness, granularity, contrast, and comprehensive evaluation." In the virtual bradycardia movie, AVI received the best evaluation for all factors except granularity. In the virtual normal movie, a different compression technique was best for each evaluation factor. In the virtual tachycardia movie, MPEG-1 received the best evaluation for all factors except contrast. The best compression format depends on the speed of the movie because of differences in the compression algorithms, which is thought to be an effect of inter-frame compression. Movie compression algorithms use both inter-frame and intra-frame compression. Because each compression method influences the image differently, it is necessary to examine the relation between the compression algorithm and our results.
Online Wavelet Complementary velocity Estimator.
Righettini, Paolo; Strada, Roberto; KhademOlama, Ehsan; Valilou, Shirin
2018-02-01
In this paper, we have proposed a new online Wavelet Complementary velocity Estimator (WCE) operating on position and acceleration data gathered from an electro-hydraulic servo shaking table. This is a batch-type estimator based on wavelet filter banks, which extract the high- and low-resolution content of the data. The proposed complementary estimator combines the two velocity resolutions, obtained from numerical differentiation of the position sensor and numerical integration of the acceleration sensor, using a fixed moving-horizon window as input to the wavelet filter. Because wavelet filters are used, the estimator can be implemented in a parallel procedure. With this method, the numerical velocity is estimated without the high noise of differentiators or the drifting bias of integration, and with less delay, which is suitable for active vibration control in high-precision mechatronic systems by Direct Velocity Feedback (DVF) methods. This method allows velocity sensors to be built with fewer mechanically moving parts, which makes it suitable for fast miniature structures. We compared this method with Kalman and Butterworth filters with respect to stability and delay, and benchmarked them by long-time integration of the estimated velocity to recover the initial position data. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
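A drastically simplified, non-wavelet sketch of the complementary idea (low-frequency velocity from differentiated position, high-frequency velocity from integrated acceleration) is given below; the first-order blend and the time constant are stand-ins for the paper's wavelet filter banks.

```python
import numpy as np

def complementary_velocity(position, acceleration, dt, tau=0.05):
    """Simplified complementary velocity estimator (first-order blend; the
    paper uses wavelet filter banks instead, so this is only the basic idea).
    Low frequencies come from differentiated position, high frequencies from
    integrated acceleration.  tau is the crossover time constant [s].
    """
    alpha = tau / (tau + dt)             # blend factor of the complementary pair
    v_diff = np.gradient(position, dt)   # noisy but drift-free velocity
    v_est = np.zeros_like(position)
    for k in range(1, len(position)):
        # High-pass path: propagate the previous estimate with the accelerometer;
        # low-pass path: pull toward the differentiated-position velocity.
        v_est[k] = alpha * (v_est[k - 1] + acceleration[k] * dt) \
                   + (1.0 - alpha) * v_diff[k]
    return v_est
```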
Ellis, Katherine; Godbole, Suneeta; Marshall, Simon; Lanckriet, Gert; Staudenmayer, John; Kerr, Jacqueline
2014-01-01
Active travel is an important area in physical activity research, but objective measurement of active travel is still difficult. Automated methods to measure travel behaviors will improve research in this area. In this paper, we present a supervised machine learning method for transportation mode prediction from global positioning system (GPS) and accelerometer data. We collected a dataset of about 150 h of GPS and accelerometer data from two research assistants following a protocol of prescribed trips consisting of five activities: bicycling, riding in a vehicle, walking, sitting, and standing. We extracted 49 features from 1-min windows of this data. We compared the performance of several machine learning algorithms and chose a random forest algorithm to classify the transportation mode. We used a moving average output filter to smooth the output predictions over time. The random forest algorithm achieved 89.8% cross-validated accuracy on this dataset. Adding the moving average filter to smooth output predictions increased the cross-validated accuracy to 91.9%. Machine learning methods are a viable approach for automating measurement of active travel, particularly for measuring travel activities that traditional accelerometer data processing methods misclassify, such as bicycling and vehicle travel.
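A minimal sketch of the classification-plus-smoothing pipeline, assuming the per-window features have already been extracted (feature extraction is not shown, and the smoothing window length is an assumption):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_and_smooth(X_train, y_train, X_test, smooth_win=5):
    """Sketch of the pipeline described above: a random forest predicts a
    transportation mode per 1-min window, and a moving-average filter smooths
    the class probabilities over time before the final argmax."""
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)

    proba = clf.predict_proba(X_test)            # shape (n_windows, n_classes)
    kernel = np.ones(smooth_win) / smooth_win
    smoothed = np.apply_along_axis(
        lambda col: np.convolve(col, kernel, mode='same'), 0, proba)
    return clf.classes_[np.argmax(smoothed, axis=1)]
```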
Delorme, Arnaud; Miyakoshi, Makoto; Jung, Tzyy-Ping; Makeig, Scott
2014-01-01
With the advent of modern computing methods, modeling trial-to-trial variability in biophysical recordings including electroencephalography (EEG) has become of increasing interest. Yet no widely used method exists for comparing variability in ordered collections of single-trial data epochs across conditions and subjects. We have developed a method based on an ERP-image visualization tool in which potential, spectral power, or some other measure at each time point in a set of event-related single-trial data epochs are represented as color-coded horizontal lines that are then stacked to form a 2-D colored image. Moving-window smoothing across trial epochs can make otherwise hidden event-related features in the data more perceptible. Stacking trials in different orders, for example ordered by subject reaction time, by context-related information such as inter-stimulus interval, or some other characteristic of the data (e.g., latency-window mean power or phase of some EEG source) can reveal aspects of the multifold complexities of trial-to-trial EEG data variability. This study demonstrates new methods for computing and visualizing grand ERP-image plots across subjects and for performing robust statistical testing on the resulting images. These methods have been implemented and made freely available in the EEGLAB signal-processing environment that we maintain and distribute. PMID:25447029
NASA Astrophysics Data System (ADS)
Kappler, Karl N.; Schneider, Daniel D.; MacLean, Laura S.; Bleier, Thomas E.
2017-08-01
A method for identification of pulsations in time series of magnetic field data which are simultaneously present in multiple channels of data at one or more sensor locations is described. Candidate pulsations of interest are first identified in geomagnetic time series by inspection. Time series of these "training events" are represented in matrix form and transpose-multiplied to generate time-domain covariance matrices. The ranked eigenvectors of this matrix are stored as a feature of the pulsation. In the second stage of the algorithm, a sliding window (approximately the width of the training event) is moved across the vector-valued time-series comprising the channels on which the training event was observed. At each window position, the data covariance matrix and associated eigenvectors are calculated. We compare the orientation of the dominant eigenvectors of the training data to those from the windowed data and flag windows where the dominant eigenvectors directions are similar. This was successful in automatically identifying pulses which share polarization and appear to be from the same source process. We apply the method to a case study of continuously sampled (50 Hz) data from six observatories, each equipped with three-component induction coil magnetometers. We examine a 90-day interval of data associated with a cluster of four observatories located within 50 km of Napa, California, together with two remote reference stations-one 100 km to the north of the cluster and the other 350 km south. When the training data contains signals present in the remote reference observatories, we are reliably able to identify and extract global geomagnetic signals such as solar-generated noise. When training data contains pulsations only observed in the cluster of local observatories, we identify several types of non-plane wave signals having similar polarization.
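The sliding-window eigenvector comparison in the second stage can be sketched as follows; the window step, similarity threshold, and function names are assumptions rather than the authors' settings.

```python
import numpy as np

def dominant_eigenvector(X):
    """Dominant eigenvector of the time-domain covariance of X (n_samples, n_channels)."""
    cov = X.T @ X                      # transpose-multiplied data matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    return eigvecs[:, -1]              # eigenvector of the largest eigenvalue

def flag_similar_windows(data, template, win_len, step=50, thresh=0.95):
    """Slide a window over multichannel data and flag positions whose dominant
    covariance eigenvector is nearly parallel to the training event's.
    data, template : arrays of shape (n_samples, n_channels)."""
    v_ref = dominant_eigenvector(template)
    flags = []
    for start in range(0, data.shape[0] - win_len + 1, step):
        v = dominant_eigenvector(data[start:start + win_len])
        # |cosine| so that sign flips of the eigenvector do not matter
        if abs(np.dot(v, v_ref)) >= thresh:
            flags.append(start)
    return flags
```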
NASA Astrophysics Data System (ADS)
Diodato, A.; Cafarelli, A.; Schiappacasse, A.; Tognarelli, S.; Ciuti, G.; Menciassi, A.
2018-02-01
High intensity focused ultrasound (HIFU) is an emerging therapeutic solution that enables non-invasive treatment of several pathologies, mainly in oncology. On the other hand, accurate targeting of moving abdominal organs (e.g. liver, kidney, pancreas) is still an open challenge. This paper proposes a novel method to compensate the physiological respiratory motion of organs during HIFU procedures, by exploiting a robotic platform for ultrasound-guided HIFU surgery provided with a therapeutic annular phased array transducer. The proposed method enables us to keep the same contact point between the transducer and the patient’s skin during the whole procedure, thus minimizing the modification of the acoustic window during the breathing phases. The motion of the target point is compensated through the rotation of the transducer around a virtual pivot point, while the focal depth is continuously adjusted thanks to the axial electronically steering capabilities of the HIFU transducer. The feasibility of the angular motion compensation strategy has been demonstrated in a simulated respiratory-induced organ motion environment. Based on the experimental results, the proposed method appears to be significantly accurate (i.e. the maximum compensation error is always under 1 mm), thus paving the way for the potential use of this technique for in vivo treatment of moving organs, and therefore enabling a wide use of HIFU in clinics.
Window of visibility - A psychophysical theory of fidelity in time-sampled visual motion displays
NASA Technical Reports Server (NTRS)
Watson, A. B.; Ahumada, A. J., Jr.; Farrell, J. E.
1986-01-01
A film of an object in motion presents on the screen a sequence of static views, while the human observer sees the object moving smoothly across the screen. Questions related to the perceptual identity of continuous and stroboscopic displays are examined. Time-sampled moving images are considered along with the contrast distribution of continuous motion, the contrast distribution of stroboscopic motion, the frequency spectrum of continuous motion, the frequency spectrum of stroboscopic motion, the approximation of the limits of human visual sensitivity to spatial and temporal frequencies by a window of visibility, the critical sampling frequency, the contrast distribution of staircase motion and the frequency spectrum of this motion, and the spatial dependence of the critical sampling frequency. Attention is given to apparent motion, models of motion, image recording, and computer-generated imagery.
Lidar point density analysis: implications for identifying water bodies
Worstell, Bruce B.; Poppenga, Sandra K.; Evans, Gayla A.; Prince, Sandra
2014-01-01
Most airborne topographic light detection and ranging (lidar) systems operate within the near-infrared spectrum. Laser pulses from these systems frequently are absorbed by water and therefore do not generate reflected returns on water bodies in the resulting void regions within the lidar point cloud. Thus, an analysis of lidar voids has implications for identifying water bodies. Data analysis techniques to detect reduced lidar return densities were evaluated for test sites in Blackhawk County, Iowa, and Beltrami County, Minnesota, to delineate contiguous areas that have few or no lidar returns. Results from this study indicated a 5-meter radius moving window with fewer than 23 returns (28 percent of the moving window) was sufficient for delineating void regions. Techniques to provide elevation values for void regions to flatten water features and to force channel flow in the downstream direction also are presented.
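The moving-window density test can be sketched on a gridded return-count raster; the 5-meter radius and 23-return threshold come from the abstract, while the gridding itself and the cell size are assumed.

```python
import numpy as np
from scipy import ndimage

def flag_void_cells(count_raster, cell_size=1.0, radius=5.0, min_returns=23):
    """Flag candidate water (void) cells: fewer than `min_returns` lidar returns
    inside a circular moving window of `radius` metres.
    count_raster : 2-D array with the number of returns per grid cell."""
    r = int(round(radius / cell_size))
    yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
    disk = (xx ** 2 + yy ** 2) <= r ** 2          # circular window footprint
    window_counts = ndimage.convolve(count_raster.astype(float),
                                     disk.astype(float),
                                     mode='constant', cval=0.0)
    return window_counts < min_returns
```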
Design and control of a 3-DOF rehabilitation robot for forearm and wrist.
Lincong Luo; Liang Peng; Zengguang Hou; Weiqun Wang
2017-07-01
This paper presents a 3-DOF compact rehabilitation robot, involving mechanical structure design, control system design and gravity compensation analysis. The robot can simultaneously provide assistance for pronation/supination (P/S), flexion/extension (F/E) and adduction/abduction (A/A) joint rehabilitation training. The P/S and F/E joints are designed to be driven by cable transmission to gain high backdrivability, and an adjustment plate is adopted to decrease the distance between the rotation axis of the F/E joint of the human wrist and that of the robot. In addition, gravity compensation is considered to offset the impact of self-gravity on the performance of the controller. A "moving window" control strategy based on impedance control is proposed and implemented on the robot. A comparison between the "moving window" control and classical impedance control indicates that the former has more potential to stimulate the voluntary efforts of the participant and is less restricted to moving along a fixed reference trajectory. Meanwhile, the results also validate the feasibility and safety of the wrist robot system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
James, S. R.; Knox, H. A.; Abbott, R. E.
Cross correlations of seismic noise can potentially record large changes in subsurface velocity due to permafrost dynamics and be valuable for long-term Arctic monitoring. We applied seismic interferometry, using moving window cross-spectral analysis (MWCS), to 2 years of ambient noise data recorded in central Alaska to investigate whether seismic noise could be used to quantify relative velocity changes due to seasonal active-layer dynamics. The large velocity changes (>75%) between frozen and thawed soil caused prevalent cycle-skipping which made the method unusable in this setting. We developed an improved MWCS procedure which uses a moving reference to measure daily velocity variations that are then accumulated to recover the full seasonal change. This approach reduced cycle-skipping and recovered a seasonal trend that corresponded well with the timing of active-layer freeze and thaw. Lastly, this improvement opens the possibility of measuring large velocity changes by using MWCS and permafrost monitoring by using ambient noise.
TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach.
Elgendi, Mohamed
2016-11-02
Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method referred to for the first time as two event-related moving averages ("TERMA") involves event-related moving averages and detects events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high accuracy detection of biomedical events. Results recommend that the window sizes for the two moving averages (W1 and W2) have to follow the inequality (8 × W1) ≥ W2 ≥ (2 × W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions.
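A rough sketch of a two-moving-average event detector in the spirit of TERMA is shown below; the exact enhancement, thresholding, and block-handling details differ in the paper, so treat the constants as placeholders.

```python
import numpy as np

def moving_average(x, w):
    return np.convolve(x, np.ones(w) / w, mode='same')

def terma_detect(signal, w1=30, w2=120, beta=0.08):
    """Two-moving-average event detector (sketch): a short, event-level window
    W1 and a longer, cycle-level window W2, with 2*W1 <= W2 <= 8*W1."""
    squared = signal.astype(float) ** 2             # emphasise peaks
    ma_event = moving_average(squared, w1)
    ma_cycle = moving_average(squared, w2)
    threshold = ma_cycle + beta * squared.mean()
    blocks = ma_event > threshold                   # candidate blocks of interest

    peaks, start = [], None
    for i, flag in enumerate(blocks):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            peaks.append(start + int(np.argmax(signal[start:i])))
            start = None
    if start is not None:                           # close a block at the end
        peaks.append(start + int(np.argmax(signal[start:])))
    return peaks
```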
Traversing Time and Space from the Blessing Window
NASA Astrophysics Data System (ADS)
Huang, Ya-Ling
2013-02-01
The visual graphics for the holographic artwork "Blessing Window" were created from observations of Tainan city, with a focus on the beauty of Chinese characters and their typography. The concept of movement in the artwork is from a traditional Chinese philosophy, "When the mountain does not move, the road extends; when the road does not extend to the destination, the heart will extend". One multiplex-hologram and an interactive installation were used to combine the visual concepts of typography and the philosophy.
Cassidy looks through window into the PMA-2 during STS-127 Mission
2009-07-17
S127-E-006705 (17 July 2009) --- Astronaut Christopher Cassidy, STS-127 mission specialist, peers through a window in the hatch that separates seven Endeavour crew members from six International Space Station inhabitants. But the separation wasn't for long, as soon afterward the hatch was opened and the visitors from Earth moved onto the station to set the population record at 13. More importantly, over a week's worth of joint activities lies ahead for the two crews.
Canonical Probability Distributions for Model Building, Learning, and Inference
2006-07-14
hand, are for Ranked nodes set at Unobservable and Auxiliary nodes. The value of alpha is set in the diagnostic window by moving the slider in the upper... right hand side of the window. The upper bound of alpha can be modified by typing the new value in the small edit box to the right of the slider.
An Open-Source Standard T-Wave Alternans Detector for Benchmarking.
Khaustov, A; Nemati, S; Clifford, Gd
2008-09-14
We describe an open source algorithm suite for T-Wave Alternans (TWA) detection and quantification. The software consists of Matlab implementations of the widely used Spectral Method and Modified Moving Average with libraries to read both WFDB and ASCII data under Windows and Linux. The software suite can run in both batch mode and with a provided graphical user interface to aid waveform exploration. Our software suite was calibrated using an open source TWA model, described in a partner paper [1] by Clifford and Sameni. For the PhysioNet/CinC Challenge 2008 we obtained a score of 0.881 for the Spectral Method and 0.400 for the MMA method. However, our objective was not to provide the best TWA detector, but rather a basis for detailed discussion of algorithms.
Xu, Yinlin; Ma, Qianli D Y; Schmitt, Daniel T; Bernaola-Galván, Pedro; Ivanov, Plamen Ch
2011-11-01
We investigate how various coarse-graining (signal quantization) methods affect the scaling properties of long-range power-law correlated and anti-correlated signals, quantified by the detrended fluctuation analysis. Specifically, for coarse-graining in the magnitude of a signal, we consider (i) the Floor, (ii) the Symmetry and (iii) the Centro-Symmetry coarse-graining methods. We find that for anti-correlated signals coarse-graining in the magnitude leads to a crossover to random behavior at large scales, and that with increasing the width of the coarse-graining partition interval Δ, this crossover moves to intermediate and small scales. In contrast, the scaling of positively correlated signals is less affected by the coarse-graining, with no observable changes when Δ < 1, while for Δ > 1 a crossover appears at small scales and moves to intermediate and large scales with increasing Δ. For very rough coarse-graining (Δ > 3) based on the Floor and Symmetry methods, the position of the crossover stabilizes, in contrast to the Centro-Symmetry method where the crossover continuously moves across scales and leads to a random behavior at all scales; thus indicating a much stronger effect of the Centro-Symmetry compared to the Floor and the Symmetry method. For coarse-graining in time, where data points are averaged in non-overlapping time windows, we find that the scaling for both anti-correlated and positively correlated signals is practically preserved. The results of our simulations are useful for the correct interpretation of the correlation and scaling properties of symbolic sequences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ojeda-Gonzalez, A.; Prestes, A.; Klausner, V.
Spatio-temporal entropy (STE) analysis is used as an alternative mathematical tool to identify possible magnetic cloud (MC) candidates. We analyze Interplanetary Magnetic Field (IMF) data using a time interval of only 10 days. We select a convenient data interval of 2500 records moving forward by 200 record steps until the end of the time series. For every data segment, the STE is calculated at each step. During an MC event, the STE reaches values close to zero. This extremely low value of STE is due to MC structure features. However, not all of the magnetic components in MCs have STE values close to zero at the same time. For this reason, we create a standardization index (the so-called Interplanetary Entropy, IE, index). This index is a worthwhile effort to develop new tools to help diagnose ICME structures. The IE was calculated using a time window of one year (1999), and it has a success rate of 70% over other identifiers of MCs. The unsuccessful cases (30%) are caused by small and weak MCs. The results show that the IE methodology identified 9 of 13 MCs, and emitted nine false alarm cases. In 1999, a total of 788 windows of 2500 values existed, meaning that the percentage of false alarms was 1.14%, which can be considered a good result. In addition, four time windows, each of 10 days, are studied, where the IE method was effective in finding MC candidates. As a novel result, two new MCs are identified in these time windows.
Novel windowing technique realized in FPGA for radar system
NASA Astrophysics Data System (ADS)
Escamilla-Hernandez, E.; Kravchenko, V. F.; Ponomaryov, V. I.; Ikuo, Arai
2006-02-01
To improve weak-target detection in radar applications, pulse compression is usually used; with linear FM modulation it can improve the SNR. One drawback is that it can add range side-lobes to reflectivity measurements. Using weighting-window processing in the time domain, it is possible to decrease the side-lobe level (SLL) significantly and resolve small or low-power targets that are masked by powerful ones. Classical windows such as Hamming, Hanning, etc. are usually used in window processing. In addition to classical windows, in this paper we also use a novel class of windows based on atomic function (AF) theory. For comparison of simulation and experimental results we applied standard parameters such as the coefficient of amplification, maximum side-lobe level, and width of the main lobe. To implement the compression-windowing model at the hardware level, an FPGA was employed. This work aims at demonstrating a reasonably flexible FPGA implementation of linear-FM signal generation, pulse compression and windowing. Classical and novel AF window techniques have been investigated to reduce the SLL, taking into account the influence of noise, and to increase the detection ability for small or weak targets in imaging radar. The paper presents experimental hardware results of windowing in pulse-compression radar resolving several targets for rectangular, Hamming, Kaiser-Bessel, and (see manuscript for formula) function windows. The windows created using atomic functions offer substantially better SLL reduction in the presence of noise and away from the main lobe in comparison with classical windows.
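The effect of window weighting on pulse compression can be sketched with a matched filter in Python; atomic-function windows are not available in numpy/scipy, so a classical Hamming window stands in, and the signal parameters are illustrative only.

```python
import numpy as np

def pulse_compress(rx, fs, bandwidth, pulse_len, window=None):
    """Matched-filter pulse compression of a linear-FM pulse, with optional
    amplitude weighting of the reference to reduce range side-lobes."""
    n = int(pulse_len * fs)
    t = np.arange(n) / fs
    k = bandwidth / pulse_len                       # chirp rate [Hz/s]
    ref = np.exp(1j * np.pi * k * t ** 2)           # reference LFM pulse
    if window is not None:
        ref = ref * window(n)                       # e.g. np.hamming
    # Correlate the received signal with the (weighted) reference.
    return np.abs(np.correlate(rx, ref, mode='same'))

# Usage sketch: compare unweighted vs Hamming-weighted compression profiles.
# profile_rect = pulse_compress(rx, fs=10e6, bandwidth=2e6, pulse_len=20e-6)
# profile_hamm = pulse_compress(rx, fs=10e6, bandwidth=2e6, pulse_len=20e-6,
#                               window=np.hamming)
```

The weighted reference trades a wider main lobe for lower side-lobes, which is exactly the compromise the abstract describes.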
Reduction of display artifacts by random sampling
NASA Technical Reports Server (NTRS)
Ahumada, A. J., Jr.; Nagel, D. C.; Watson, A. B.; Yellott, J. I., Jr.
1983-01-01
The application of random-sampling techniques to remove visible artifacts (such as flicker, moire patterns, and paradoxical motion) introduced in TV-type displays by discrete sequential scanning is discussed and demonstrated. Sequential-scanning artifacts are described; the window of visibility defined in spatiotemporal frequency space by Watson and Ahumada (1982 and 1983) and Watson et al. (1983) is explained; the basic principles of random sampling are reviewed and illustrated by the case of the human retina; and it is proposed that the sampling artifacts can be replaced by random noise, which can then be shifted to frequency-space regions outside the window of visibility. Vertical sequential, single-random-sequence, and continuously renewed random-sequence plotting displays generating 128 points at update rates up to 130 Hz are applied to images of stationary and moving lines, and best results are obtained with the single random sequence for the stationary lines and with the renewed random sequence for the moving lines.
Method and apparatus for scientific analysis under low temperature vacuum conditions
Winefordner, James D.; Jones, Bradley T.
1990-01-01
A method and apparatus for scientific analysis of a sample under low temperature vacuum conditions uses a vacuum chamber with a conveyor belt disposed therein. One end of the conveyor belt is a cool end in thermal contact with the cold stage of a refrigerator, whereas the other end of the conveyor belt is a warm end spaced from the refrigerator. A septum allows injection of a sample into the vacuum chamber on top of the conveyor belt for spectroscopic or other analysis. The sample freezes on the conveyor belt at the cold end. One or more windows in the vacuum chamber housing allow spectroscopic analysis of the sample. Following the spectroscopic analysis, the conveyor belt may be moved such that the sample moves toward the warm end of the conveyor belt, whereupon it evaporates, thereby cleaning the conveyor belt. Instead of injecting the sample by way of a septum and use of a syringe and needle, the present device may be used in series with capillary-column gas chromatography or micro-bore high performance liquid chromatography.
Nanometer resolution optical coherence tomography using broad bandwidth XUV and soft x-ray radiation
Fuchs, Silvio; Rödel, Christian; Blinne, Alexander; ...
2016-02-10
Optical coherence tomography (OCT) is a non-invasive technique for cross-sectional imaging. It is particularly advantageous for applications where conventional microscopy is not able to image deeper layers of samples in a reasonable time, e.g. in fast moving, deeper lying structures. However, at infrared and optical wavelengths, which are commonly used, the axial resolution of OCT is limited to about 1 μm, even if the bandwidth of the light covers a wide spectral range. Here, we present extreme ultraviolet coherence tomography (XCT) and thus introduce a new technique for non-invasive cross-sectional imaging of nanometer structures. XCT exploits the nanometer-scale coherence lengths corresponding to the spectral transmission windows of, e.g., silicon samples. The axial resolution of coherence tomography is thus improved from micrometers to a few nanometers. Tomographic imaging with an axial resolution better than 18 nm is demonstrated for layer-type nanostructures buried in a silicon substrate. Using wavelengths in the water transmission window, nanometer-scale layers of platinum are retrieved with a resolution better than 8 nm. As a result, XCT as a nondestructive method for sub-surface tomographic imaging holds promise for several applications in semiconductor metrology and imaging in the water window.
Simultaneous Detection and Tracking of Pedestrian from Panoramic Laser Scanning Data
NASA Astrophysics Data System (ADS)
Xiao, Wen; Vallet, Bruno; Schindler, Konrad; Paparoditis, Nicolas
2016-06-01
Pedestrian traffic flow estimation is essential for public place design and construction planning. Traditional data collection by human investigation is tedious, inefficient and expensive. Panoramic laser scanners, e.g. Velodyne HDL-64E, which scan surroundings repetitively at a high frequency, have been increasingly used for 3D object tracking. In this paper, a simultaneous detection and tracking (SDAT) method is proposed for precise and automatic pedestrian trajectory recovery. First, the dynamic environment is detected using two different methods, Nearest-point and Max-distance. Then, all the points on moving objects are transferred into a space-time (x, y, t) coordinate system. The pedestrian detection and tracking amounts to assigning the points belonging to pedestrians to continuous trajectories in space-time. We formulate the point assignment task as an energy function which incorporates the point evidence, trajectory number, pedestrian shape and motion. A low-energy trajectory explains the point observations well and has a plausible trend and length. The method inherently filters out points from other moving objects and false detections. The energy function is solved by a two-step optimization process: tracklet detection in a short temporal window, and global tracklet association through the whole time span. Results demonstrate that the proposed method can automatically recover the pedestrians' trajectories with accurate positions and low false detections and mismatches.
[Recognition of walking stance phase and swing phase based on moving window].
Geng, Xiaobo; Yang, Peng; Wang, Xinran; Geng, Yanli; Han, Yu
2014-04-01
Wearing a transfemoral prosthesis is the only way for amputees to complete daily physical activity. Motion pattern recognition is important for the control of the prosthesis, especially for recognizing the swing phase and stance phase. In this paper, the use of the surface electromyography (sEMG) signal for swing and stance phase recognition is reported. The sEMG signals of related muscles were sampled with the Infiniti system from a Canadian company. The sEMG signal was then filtered by a weighted filtering window and analyzed by a height-permitted window. The starting times of the stance phase and swing phase are determined by analyzing specific muscles. The sEMG signal of the rectus femoris was used for stance phase recognition and the sEMG signal of the tibialis anterior for swing phase recognition. Within a certain tolerance range, the double-window approach, combining the weighted filtering window and the height-permitted window, can reach a high accuracy rate. Through experiments, the subjects' real walking intent was reflected in the sEMG signals of the related muscles. Using related muscles to recognize the swing and stance phases is feasible. The approach used in this paper is useful for analyzing sEMG signals and for actual prosthesis control.
Non-destructive scanning for applied stress by the continuous magnetic Barkhausen noise method
NASA Astrophysics Data System (ADS)
Franco Grijalba, Freddy A.; Padovese, L. R.
2018-01-01
This paper reports the use of a non-destructive continuous magnetic Barkhausen noise technique to detect applied stress on steel surfaces. The stress profile generated in a sample of 1070 steel subjected to a three-point bending test is analyzed. The influence of different parameters such as pickup coil type, scanner speed, applied magnetic field and frequency band analyzed on the effectiveness of the technique is investigated. A moving smoothing window based on a second-order statistical moment is used to analyze the time signal. The findings show that the technique can be used to detect applied stress profiles.
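A moving window based on the second-order statistical moment can be sketched as a running variance over the Barkhausen time signal; the window length here is an assumption.

```python
import numpy as np

def moving_second_moment(x, window=512):
    """Moving-window estimate of the second-order statistical moment (variance)
    of a Barkhausen noise time signal, as a simple stand-in for the smoothing
    window described in the abstract."""
    kernel = np.ones(window) / window
    mean = np.convolve(x, kernel, mode='same')
    mean_sq = np.convolve(x * x, kernel, mode='same')
    return mean_sq - mean ** 2          # Var[x] = E[x^2] - (E[x])^2
```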
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, J; Lu, B; Yan, G
Purpose: To identify weaknesses of the dose calculation algorithm in a treatment planning system for volumetric modulated arc therapy (VMAT) and sliding window (SW) techniques using a two-dimensional diode array. Methods: The VMAT quality assurance (QA) was implemented with a diode array using multiple partial arcs divided from a VMAT plan; each partial arc has the same segments and the original monitor units. Arc angles were less than ±30°. Multiple arcs were delivered through consecutive and repetitive gantry operation clockwise and counterclockwise. A source-to-axis distance setup with effective depths of 10 and 20 cm was used for the diode array. To identify dose errors caused in delivery of the VMAT fields, numerous fields having the same segments as the VMAT field were irradiated using the static and step-and-shoot delivery techniques. The dose distributions of the SW technique were evaluated by creating split fields having fine moving steps of the multi-leaf collimator leaves. Doses calculated using the adaptive convolution algorithm were analyzed against measurements with a distance-to-agreement of 3 mm and a dose difference of 3%. Results: While beam delivery through the static and step-and-shoot techniques showed a passing rate of 97 ± 2%, partial arc delivery of the VMAT fields brought the passing rate down to 85%. However, when leaf motion was restricted to less than 4.6 mm/°, the passing rate improved to 95 ± 2%. Similar passing rates were obtained for both the 10 and 20 cm effective depth setups. The doses calculated for the SW technique showed a dose difference over 7% at the final arrival point of the moving leaves. Conclusion: Error components in dynamic delivery of modulated beams were distinguished by using the suggested QA method. This partial arc method can be used for routine VMAT QA. An improved SW calculation algorithm is required to provide accurate estimated doses.
Ensemble Data Assimilation Without Ensembles: Methodology and Application to Ocean Data Assimilation
NASA Technical Reports Server (NTRS)
Keppenne, Christian L.; Rienecker, Michele M.; Kovach, Robin M.; Vernieres, Guillaume
2013-01-01
Two methods to estimate background error covariances for data assimilation are introduced. While both share properties with the ensemble Kalman filter (EnKF), they differ from it in that they do not require the integration of multiple model trajectories. Instead, all the necessary covariance information is obtained from a single model integration. The first method is referred-to as SAFE (Space Adaptive Forecast error Estimation) because it estimates error covariances from the spatial distribution of model variables within a single state vector. It can thus be thought of as sampling an ensemble in space. The second method, named FAST (Flow Adaptive error Statistics from a Time series), constructs an ensemble sampled from a moving window along a model trajectory. The underlying assumption in these methods is that forecast errors in data assimilation are primarily phase errors in space and/or time.
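The FAST idea, as described above, can be sketched by treating the states inside a moving window along one trajectory as ensemble members; the window length and state layout are assumptions.

```python
import numpy as np

def fast_background_covariance(trajectory, window=20):
    """Sketch of the FAST idea: use the model states inside a moving window
    along a single trajectory as ensemble members and form a sample covariance.
    trajectory : array (n_times, n_state); returns the covariance at the last time."""
    members = trajectory[-window:]                   # states in the moving window
    perturbations = members - members.mean(axis=0)   # deviations from the window mean
    # Sample covariance of the windowed states (n_state, n_state).
    return perturbations.T @ perturbations / (window - 1)
```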
Cockpit Window Edge Proximity Effects on Judgements of Horizon Vertical Displacement
NASA Technical Reports Server (NTRS)
Haines, R. F.
1984-01-01
To quantify the influence of a spatially fixed edge on vertical displacement threshold, twenty-four males (12 pilots, 12 non-pilots) were presented a series of forced choice, paired comparison trials in which a 32 deg arc wide, thin, luminous horizontal stimulus line moved smoothly downward through five angles from a common starting position within a three-second-long period. The five angles were 1.4, 1.7, 2, 2.3, and 2.6 deg. Each angle was presented paired with itself and the other four angles in all combinations in random order. For each pair of trials the observer had to choose which trial possessed the largest displacement. A confidence response also was made. The independent variable was the angular separation between the lower edge of a stable 'window' aperture through which the stimulus was seen to move and the lowest position attained by the stimulus. It was found that vertical displacement accuracy is inversely related to the angle separating the stimulus and the fixed window edge (p = .05). In addition, there is a strong tendency for pilot confidence to be lower than that of non-pilots for each of the three angular separations. These results are discussed in terms of selected cockpit features and as they relate to how pilots judge changes in aircraft pitch attitude.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, P; Tsai, Y; Nien, H
2015-06-15
Purpose: Four dimensional computed tomography (4DCT) scans reliably record the whole respiratory phase and generate internal target volumes (ITV) for radiotherapy planning. However, image guiding with cone-beam computed tomography (CBCT) cannot acquire all or specific respiratory phases. This study was designed to investigate the correlation between average CT and Maximum Intensity Projection (MIP) images from 4DCT and CBCT. Methods: Retrospective respiratory gating was performed on a GE Discovery CT590 RT. 4DCT and CBCT data from a CRIS Dynamic Thorax Phantom with a simulated breathing mode were analyzed. The lung-tissue-equivalent material encompassed a 3 cm sphere of tissue-equivalent material. The simulated breathing cycle period was set to 4 seconds, 5 seconds and 6 seconds to represent variation of patient breathing cycle time, and the sphere moved in the inferior-superior direction with 1 cm amplitude to simulate lung tumor motion during respiration. Results: Under the lung window, the volume ratio of CBCT scans to ITVs derived from 10-phase average scans was 1.00 ± 0.02, and the ratio of CBCT scans to MIP scans was 1.03 ± 0.03. Under the abdomen window, the ratio of CBCT scans to ITVs derived from 10-phase average scans was 0.39 ± 0.06, and the ratio of CBCT scans to MIP scans was 0.06 ± 0.00. There was a significant difference between the lung window and abdomen window results. For reducing image guiding uncertainty, the CBCT window was set with width 500 and level -250. The ratio of CBCT scans to ITVs derived from 4-phase average scans with the abdomen window was 1.19 ± 0.02, and the ratio of CBCT to MIP scans was 1.06 ± 0.01. Conclusion: CBCT images with a suitable window width and level can efficiently reduce image guiding uncertainty for patients with mobile tumors. With our setting, we can match the moving tumor to the gated tumor location on the planning CT more accurately, neglecting other motion artifacts during CBCT scans.
Yao, Jingyu; Jia, Lin; Khan, Naheed; Zheng, Qiong-Duan; Moncrief, Ashley; Hauswirth, William W.; Thompson, Debra A.; Zacks, David N.
2012-01-01
Purpose: AAV-mediated gene therapy in the rd10 mouse, with retinal degeneration caused by mutation in the rod cyclic guanosine monophosphate phosphodiesterase β-subunit (PDEβ) gene, produces significant, but transient, rescue of photoreceptor structure and function. This study evaluates the ability of AAV-mediated delivery of X-linked inhibitor of apoptosis (XIAP) to enhance and prolong the efficacy of PDEβ gene-replacement therapy. Methods: Rd10 mice were bred and housed in darkness. Two groups of animals were generated: Group 1 received sub-retinal AAV5-XIAP or AAV5-GFP at postnatal age (P) 4 or 21 days; Group 2 received sub-retinal AAV5-XIAP plus AAV5-PDEβ, AAV5-GFP plus AAV5-PDEβ, or AAV-PDEβ alone at age P4 or P21. Animals were maintained for an additional 4 weeks in darkness before being moved to a cyclic-light environment. A subset of animals from Group 1 received a second sub-retinal injection of AAV8-733-PDEβ two weeks after being moved to the light. Histology, immunohistochemistry, Western blots, and electroretinograms were performed at different times after moving to the light. Results: Injection of AAV5-XIAP alone at P4 and 21 resulted in significant slowing of light-induced retinal degeneration, as measured by outer nuclear thickness and cell counts, but did not result in improved outer segment structure and rhodopsin localization. In contrast, co-injection of AAV5-XIAP and AAV5-PDEβ resulted in increased levels of rescue and decreased rates of retinal degeneration compared to treatment with AAV5-PDEβ alone. Mice treated with AAV5-XIAP at P4, but not P21, remained responsive to subsequent rescue by AAV8-733-PDEβ when injected two weeks after moving to a light-cycling environment. Conclusions: Adjunctive treatment with the anti-apoptotic gene XIAP confers additive protective effect to gene-replacement therapy with AAV5-PDEβ in the rd10 mouse. In addition, AAV5-XIAP, when given early, can increase the age at which gene-replacement therapy remains effective, thus effectively prolonging the window of opportunity for therapeutic intervention. PMID:22615940
Mirmohseni, A; Abdollahi, H; Rostamizadeh, K
2007-02-28
A net analyte signal (NAS)-based method called HLA/GO was applied for the selective determination of a binary mixture of ethanol and water with a quartz crystal nanobalance (QCN) sensor. A full factorial design was applied for the formation of calibration and prediction sets in the concentration ranges 5.5-22.2 microg mL(-1) for ethanol and 7.01-28.07 microg mL(-1) for water. An optimal time range was selected by a procedure based on the calculation of the net analyte signal regression plot in each considered time window for each test sample. A moving window strategy was used to search for the region with maximum linearity of the NAS regression plot (minimum error indicator) and minimum PRESS value. On the basis of the obtained results, the differences in the adsorption profiles in the time range between 1 and 600 s were used to determine mixtures of both compounds by the HLA/GO method. The calculation of the net analyte signal using the HLA/GO method allows determination of several figures of merit, such as selectivity, sensitivity, analytical sensitivity and limit of detection, for each component. To check the ability of the proposed method to select linear regions of the adsorption profile, a test for detecting non-linear regions of adsorption profile data in the presence of methanol was also described. The results showed that the method was successfully applied to the determination of ethanol and water.
A complete passive blind image copy-move forensics scheme based on compound statistics features.
Peng, Fei; Nie, Yun-ying; Long, Min
2011-10-10
Since most sensor-pattern-noise-based image copy-move forensics methods require a known reference sensor pattern noise, they generally result in non-blind passive forensics, which significantly confines the application circumstances. In view of this, a novel passive-blind image copy-move forensics scheme is proposed in this paper. Firstly, a color image is transformed into a grayscale one, and a wavelet-transform-based de-noising filter is used to extract the sensor pattern noise. The variance of the pattern noise, the signal-to-noise ratio between the de-noised image and the pattern noise, the information entropy, and the average energy gradient of the original grayscale image are chosen as features, and non-overlapping sliding window operations are applied to the images to divide them into sub-blocks. Finally, the tampered areas are detected by analyzing the correlation of the features between the sub-blocks and the whole image. Experimental results and analysis show that the proposed scheme is completely passive-blind, has a good detection rate, and is robust against JPEG compression, noise, rotation, scaling and blurring. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Whitford, Veronica; O'Driscoll, Gillian A; Pack, Christopher C; Joober, Ridha; Malla, Ashok; Titone, Debra
2013-02-01
Language and oculomotor disturbances are 2 of the best replicated findings in schizophrenia. However, few studies have examined skilled reading in schizophrenia (e.g., Arnott, Sali, Copland, 2011; Hayes & O'Grady, 2003; Revheim et al., 2006; E. O. Roberts et al., 2012), and none have examined the contribution of cognitive and motor processes that underlie reading performance. Thus, to evaluate the relationship of linguistic processes and oculomotor control to skilled reading in schizophrenia, 20 individuals with schizophrenia and 16 demographically matched controls were tested using a moving window paradigm (McConkie & Rayner, 1975). Linguistic skills supporting reading (phonological awareness) were assessed with the Comprehensive Test of Phonological Processing (R. K. Wagner, Torgesen, & Rashotte, 1999). Eye movements were assessed during reading tasks and during nonlinguistic tasks tapping basic oculomotor control (prosaccades, smooth pursuit) and executive functions (predictive saccades, antisaccades). Compared with controls, schizophrenia patients exhibited robust oculomotor markers of reading difficulty (e.g., reduced forward saccade amplitude) and were less affected by reductions in window size, indicative of reduced perceptual span. Reduced perceptual span in schizophrenia was associated with deficits in phonological processing and reduced saccade amplitudes. Executive functioning (antisaccade errors) was not related to perceptual span but was related to reading comprehension. These findings suggest that deficits in language, oculomotor control, and cognitive control contribute to skilled reading deficits in schizophrenia. Given that both language and oculomotor dysfunction precede illness onset, reading may provide a sensitive window onto cognitive dysfunction in schizophrenia vulnerability and be an important target for cognitive remediation. 2013 APA, all rights reserved
Eye movement evidence for defocused attention in dysphoria--a perceptual span analysis.
Brzezicka, Aneta; Krejtz, Izabela; von Hecker, Ulrich; Laubrock, Jochen
2012-07-01
The defocused attention hypothesis (von Hecker and Meiser, 2005) assumes that negative mood broadens attention, whereas the analytical rumination hypothesis (Andrews and Thompson, 2009) suggests a narrowing of the attentional focus with depression. We tested these conflicting hypotheses by directly measuring the perceptual span in groups of dysphoric and control subjects, using eye tracking. In the moving window paradigm, information outside of a variable-width gaze-contingent window was masked during reading of sentences. In measures of sentence reading time and mean fixation duration, dysphoric subjects were more pronouncedly affected than controls by a reduced window size. This difference supports the defocused attention hypothesis and seems hard to reconcile with a narrowing of attentional focus. Copyright © 2011 Elsevier B.V. All rights reserved.
Moving to Inequality: Neighborhood Effects and Experiments Meet Structure
Sampson, Robert J.
2014-01-01
The Moving to Opportunity (MTO) housing experiment has proven to be an important intervention not just in the lives of the poor, but in social science theories of neighborhood effects. Competing causal claims have been the subject of considerable disagreement, culminating in the debate between Clampet-Lundquist and Massey (2008) and Ludwig et al. (2008). This paper assesses the debate by clarifying analytically distinct questions posed by neighborhood-level theories, reconceptualizing selection bias as a fundamental social process worthy of study in its own right rather than as a statistical nuisance, and reconsidering the scientific method of experimentation, and hence causality, in the social world of the city. I also analyze MTO and independent survey data from Chicago to examine trajectories of residential attainment. Although MTO provides crucial leverage for estimating neighborhood effects on individuals, as proponents rightly claim, I demonstrate the implications imposed by a stratified urban structure and how MTO simultaneously provides a new window on the social reproduction of concentrated inequality. PMID:25360053
Rigorous Numerical Study of Low-Period Windows for the Quadratic Map
NASA Astrophysics Data System (ADS)
Galias, Zbigniew
An efficient method to find all low-period windows for the quadratic map is proposed. The method is used to obtain very accurate rigorous bounds of positions of all periodic windows with periods p ≤ 32. The contribution of period-doubling windows on the total width of periodic windows is discussed. Properties of periodic windows are studied numerically.
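The abstract's rigorous interval-arithmetic machinery is beyond a short example, but a heuristic scan of the quadratic map x → x² + c illustrates what a low-period window is: a contiguous range of parameters whose attracting orbit shares the same small period. The tolerance, transient length and scan grid below are arbitrary choices, and the result is numerical, not rigorous.

```python
import numpy as np

def detect_period(c, n_transient=2000, max_period=32, tol=1e-9):
    """Heuristic period detection for x -> x^2 + c, starting from the
    critical point. Returns the smallest period p <= max_period of the
    attracting orbit, or None if the orbit escapes or looks chaotic."""
    x = 0.0
    for _ in range(n_transient):
        x = x * x + c
        if abs(x) > 4.0:             # orbit escapes to infinity
            return None
    orbit = [x]
    for _ in range(max_period):
        x = x * x + c
        orbit.append(x)
    for p in range(1, max_period + 1):
        if abs(orbit[p] - orbit[0]) < tol:
            return p
    return None

# Contiguous runs of parameters sharing the same detected period give a
# rough (non-rigorous) picture of the periodic windows.
cs = np.linspace(-2.0, 0.25, 4001)
periods = [detect_period(c) for c in cs]
```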
Power strain imaging based on vibro-elastography techniques
NASA Astrophysics Data System (ADS)
Wen, Xu; Salcudean, S. E.
2007-03-01
This paper describes a new ultrasound elastography technique, power strain imaging, based on vibro-elastography (VE) techniques. With this method, tissue is compressed by a vibrating actuator driven by low-pass or band-pass filtered white noise, typically in the 0-20 Hz range. Tissue displacements at different spatial locations are estimated by correlation-based approaches on the raw ultrasound radio frequency signals and recorded in time sequences. The power spectra of these time sequences are computed by Fourier spectral analysis techniques. As the average of the power spectrum is proportional to the squared amplitude of the tissue motion, the square root of the average power over the range of excitation frequencies is used as a measure of the tissue displacement. Then tissue strain is determined by the least squares estimation of the gradient of the displacement field. The computation of the power spectra of the time sequences can be implemented efficiently by using Welch's periodogram method with moving windows or with accumulative windows with a forgetting factor. Compared to the transfer function estimation originally used in VE, the computation of cross spectral densities is not needed, which saves both memory and computational time. Phantom experiments demonstrate that the proposed method produces stable and operator-independent strain images with high signal-to-noise ratio in real time. This approach has also been tested on patient data of the prostate region, and the results are encouraging.
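A minimal sketch of the pipeline described above, assuming displacement time sequences have already been estimated by RF correlation tracking: Welch periodograms per depth sample, the square root of the average in-band power as the displacement measure, and a moving-window least-squares slope as the strain estimate. The excitation band, window lengths and depth spacing are placeholders.

```python
import numpy as np
from scipy.signal import welch

def ls_gradient(y, half_width=3):
    """Least-squares slope of y within a sliding window of
    2*half_width+1 samples (a moving-window linear fit)."""
    n = y.size
    slope = np.zeros(n)
    for i in range(n):
        lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
        t = np.arange(lo, hi)
        slope[i] = np.polyfit(t, y[lo:hi], 1)[0]
    return slope

def power_strain(displacements, fs, band=(2.0, 20.0), nperseg=256):
    """displacements: array (n_depths, n_time) of displacement time
    sequences. Returns (amplitude, strain) along the depth axis."""
    freqs, psd = welch(displacements, fs=fs, nperseg=nperseg, axis=-1)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    # average power over the excitation band; its square root tracks
    # the tissue motion amplitude
    amplitude = np.sqrt(psd[:, in_band].mean(axis=-1))
    # strain ~ least-squares gradient of the displacement field in depth
    strain = ls_gradient(amplitude)
    return amplitude, strain
```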
Human-Robot Interface Controller Usability for Mission Planning on the Move
2012-11-01
(Report figure list, excerpt) Figure 3: Microsoft Xbox 360 controller for Windows. Figure 5: Microsoft Trackball Explorer. Xbox 360 Controller is a registered trademark of Microsoft Corporation. The HMMWV test vehicle was equipped with a diesel engine.
Transparent Conveyor of Dielectric Liquids or Particles
NASA Technical Reports Server (NTRS)
Calle, Carlos I.; Mantovani, James G.
2009-01-01
The concept of a transparent conveyor of small loose dielectric particles or small amounts of dielectric liquids has emerged as an outgrowth of an effort to develop efficient, reliable means of automated removal of dust from solar cells and from windows of optical instruments. This concept is based on the previously reported concept of an electrodynamic screen, according to which a grid-like electric field is established on and near a surface and is moved along the surface perpendicularly to the grid lines. The resulting electrodynamic forces on loose dielectric particles or dielectric liquid drops in the vicinity would move the particles or drops along the surface. In the original dust-removal application, dust particles would thus be swept out of the affected window area. Other potential applications may occur in nanotechnology -- for example, involving mixing of two or more fluids and/or nanoscale particles under optical illumination and/or optical observation.
An accelerating precursor to predict "time-to-failure" in creep and volcanic eruptions
NASA Astrophysics Data System (ADS)
Hao, Shengwang; Yang, Hang; Elsworth, Derek
2017-09-01
Real-time prediction by monitoring of the evolution of response variables is a central goal in predicting rock failure. A linear relation Ω̇/Ω̈ = C(t_f − t) has been developed to describe the time to failure, where Ω represents a response quantity, C is a constant and t_f represents the failure time. Observations from laboratory creep failure experiments and precursors to volcanic eruptions are used to test the validity of the approach. Both cumulative and simple moving window techniques are developed to perform predictions and to illustrate the effects of data selection on the results. Laboratory creep failure experiments on granites show that the linear relation works well during the final approach to failure. For blind prediction, the simple moving window technique is preferred because it always uses the most recent data and excludes effects of early data deviating significantly from the predicted trend. When the predicted results show only small fluctuations, failure is imminent.
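A simple moving-window predictor based on the stated relation can be sketched as follows: numerically differentiate the monitored quantity Ω, fit a line to Ω̇/Ω̈ over the most recent samples, and read the failure time off the fitted coefficients. The window length and the use of np.gradient for the derivatives are assumptions.

```python
import numpy as np

def predict_failure_time(t, omega, window=50):
    """Moving-window estimate of the failure time t_f from the relation
    Ω̇/Ω̈ = C (t_f − t), using only the most recent `window` samples of a
    monitored response quantity Ω(t)."""
    d1 = np.gradient(omega, t)            # Ω̇
    d2 = np.gradient(d1, t)               # Ω̈
    y = d1[-window:] / d2[-window:]       # should decay linearly to zero
    tw = t[-window:]
    slope, intercept = np.polyfit(tw, y, 1)   # y ≈ intercept + slope * t
    # y = C*t_f − C*t  =>  slope = −C, intercept = C*t_f  =>  t_f = −intercept/slope
    return -intercept / slope
```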
Gunn, W J; Shigehisa, T; Shepherd, W T
1979-10-01
The conditions were examined under which more valid and reliable estimates could be made of the effects of aircraft noise on people. In Experiment I, 12 Ss in 2 different houses directly under the flight path of a major airport (JFK) indicated 1 of 12 possible flight paths (4 directly overhead and 8 to one side) for each of 3 jet aircraft flyovers: 3% of cases in House A and 56% in House B (which had open windows) were correctly identified. Despite judgment inaccuracy, Ss were more than moderately certain of the correctness of their judgments. In Experiment II, Ss either inside or outside of 2 houses in Wallops Station, Virginia, indicated on diagrams the direction of flyovers. Each of 4 aircraft (Boeing 737, C-54, UE-1 helicopter, Queenaire) made 8 flyovers directly over the houses and 8 to one side. Windows were either open or closed. All flyovers and conditions were counterbalanced. All sound sources under all conditions were usually judged to be overhead and moving, but for Ss indoors with windows closed the to-the-side flyovers were judged to be off to the side in 24% of cases. Outdoor Ss reported correct direction in 75% of cases while indoor Ss were correct in only 25% (windows open) or 18% (windows closed). Judgments "to the side" were significantly better (p < .02) with windows open vs closed, while with windows closed judgments were significantly better (p < .05) for flyovers overhead vs to the side. In Experiment III, Ss localized in azimuth and in the vertical plane recorded noises (10 one-octave noise bands with center frequencies of 28.12 c/s to 14.4 kc/s, spoken voice, and jet aircraft takeoffs and landings), presented through 1, 2, or 4 floor-level loudspeakers at each corner of a simulated living room (4.2 x 5.4 m) built inside an IAC soundproof room. Aircraft noises presented by 4 loudspeakers were localized as "directly" overhead 80% of the time and "generally overhead" about 90% of the time; other sounds were so localized about 50% and 75% of the time respectively. Through only 2 loudspeakers, aircraft noises were localized 25-36 degrees above horizontal. Implications are that acoustic realism can be achieved in simulated aircraft overflights and that future laboratory noise-effects research should incorporate the required conditions.
NASA Astrophysics Data System (ADS)
Gligor, M.; Ausloos, M.
2007-05-01
The statistical distances between countries, calculated for various moving average time windows, are mapped into the ultrametric subdominant space as in classical Minimal Spanning Tree methods. The Moving Average Minimal Length Path (MAMLP) algorithm allows a decoupling of fluctuations with respect to the mass center of the system from the movement of the mass center itself. A Hamiltonian representation given by a factor graph is used and plays the role of cost function. The present analysis pertains to 11 macroeconomic (ME) indicators, namely the GDP (x1), Final Consumption Expenditure (x2), Gross Capital Formation (x3), Net Exports (x4), Consumer Price Index (y1), Rates of Interest of the Central Banks (y2), Labour Force (z1), Unemployment (z2), GDP/hour worked (z3), GDP/capita (w1) and Gini coefficient (w2). The target group of countries is composed of 15 EU countries, data taken between 1995 and 2004. By two different methods (the Bipartite Factor Graph Analysis and the Correlation Matrix Eigensystem Analysis) it is found that the strongly correlated countries with respect to the macroeconomic indicators fluctuations can be partitioned into stable clusters.
NASA Astrophysics Data System (ADS)
Yang, Shuang-Long; Liang, Li-Ping; Liu, Hou-De; Xu, Ke-Jun
2018-03-01
Aiming at reducing the estimation error of the sensor frequency response function (FRF) estimated by the commonly used window-based spectral estimation method, the error models of interpolation and transient errors are derived in the form of non-parametric models. Accordingly, window effects on the errors are analyzed and reveal that the commonly used Hanning window leads to a smaller interpolation error, which can also be significantly eliminated by the cubic spline interpolation method when estimating the FRF from the step response data, and that a window with a smaller front-end value can suppress more of the transient error. Thus, a new dual-cosine window with its non-zero discrete Fourier transform bins at -3, -1, 0, 1, and 3 is constructed for FRF estimation. Compared with the Hanning window, the new dual-cosine window has equivalent interpolation error suppression capability and better transient error suppression capability when estimating the FRF from the step response; specifically, it reduces the asymptotic order of the transient error from O(N^-2) for the Hanning window method to O(N^-4), while only increasing the uncertainty slightly (about 0.4 dB). Then, one direction of a wind tunnel strain gauge balance, which is a high-order, lightly damped, non-minimum-phase system, is employed as the example for verifying the new dual-cosine window-based spectral estimation method. The model simulation result shows that the new dual-cosine window method is better than the Hanning window method for FRF estimation, and compared with the Gans method and LPM method, it has the advantages of simple computation, less time consumption, and a short data requirement; the actual calculation of the balance FRF from measured data is consistent with the simulation result. Thus, the new dual-cosine window is effective and practical for FRF estimation.
Non-invasive detection of language-related prefrontal high gamma band activity with beamforming MEG.
Hashimoto, Hiroaki; Hasegawa, Yuka; Araki, Toshihiko; Sugata, Hisato; Yanagisawa, Takufumi; Yorifuji, Shiro; Hirata, Masayuki
2017-10-27
High gamma band (>50 Hz) activity is a key oscillatory phenomenon of brain activation. However, there has not been a non-invasive method established to detect language-related high gamma band activity. We used a 160-channel whole-head magnetoencephalography (MEG) system equipped with superconducting quantum interference device (SQUID) gradiometers to non-invasively investigate neuromagnetic activities during silent reading and verb generation tasks in 15 healthy participants. Individual data were divided into alpha (8-13 Hz), beta (13-25 Hz), low gamma (25-50 Hz), and high gamma (50-100 Hz) bands and analysed with the beamformer method. The time window was consecutively moved. Group analysis was performed to delineate common areas of brain activation. In the verb generation task, transient power increases in the high gamma band appeared in the left middle frontal gyrus (MFG) at the 550-750 ms post-stimulus window. We set a virtual sensor on the left MFG for time-frequency analysis, and high gamma event-related synchronization (ERS) induced by a verb generation task was demonstrated at 650 ms. In contrast, ERS in the high gamma band was not detected in the silent reading task. Thus, our study successfully non-invasively measured language-related prefrontal high gamma band activity.
Microwave window breakdown experiments and simulations on the UM/L-3 relativistic magnetron
NASA Astrophysics Data System (ADS)
Hoff, B. W.; Mardahl, P. J.; Gilgenbach, R. M.; Haworth, M. D.; French, D. M.; Lau, Y. Y.; Franzi, M.
2009-09-01
Experiments have been performed on the UM/L-3 (6-vane, L-band) relativistic magnetron to test a new microwave window configuration designed to limit vacuum side breakdown. In the baseline case, acrylic microwave windows were mounted between three of the waveguide coupling cavities in the anode block vacuum housing and the output waveguides. Each of the six 3 cm deep coupling cavities is separated from its corresponding anode cavity by a 1.75 cm wide aperture. In the baseline case, vacuum side window breakdown was observed to initiate at single waveguide output powers close to 20 MW. In the new window configuration, three Air Force Research Laboratory-designed, vacuum-rated directional coupler waveguide segments were mounted between the coupling cavities and the microwave windows. The inclusion of the vacuum side power couplers moved the microwave windows an additional 30 cm away from the anode apertures. Additionally, the Lucite microwave windows were replaced with polycarbonate windows and the microwave window mounts were redesigned to better maintain waveguide continuity in the region around the microwave windows. No vacuum side window breakdown was observed in the new window configuration at single waveguide output powers of 120+MW (a factor of 3 increase in measured microwave pulse duration and factor of 3 increase in measured peak power over the baseline case). Simulations were performed to investigate likely causes for the window breakdown in the original configuration. Results from these simulations have shown that in the original configuration, at typical operating voltage and magnetic field ranges, electrons emitted from the anode block microwave apertures strike the windows with a mean kinetic energy of 33 keV with a standard deviation of 14 keV. Calculations performed using electron impact angle and energy data predict a first generation secondary electron yield of 65% of the primary electron population. The effects of the primary aperture electron impacts, combined with multiplication of the secondary populations, were determined to be the likely causes of the poor microwave window performance in the original configuration.
NASA Astrophysics Data System (ADS)
Jeter, G. W.; Carter, G. A.
2013-12-01
Guy (Will) Wilburn Jeter Jr. and Gregory A. Carter, University of Southern Mississippi, Geography and Geology, Gulf Coast Geospatial Center. The over-arching goal of this research is to assess habitat change over a seventy year period to better understand the combined effects of global sea level rise and storm impacts on the stability of Horn Island, MS habitats. Historical aerial photography is often overlooked as a resource for use in determining habitat change. However, the spatial information provided even by black and white imagery can give insight into past habitat composition via textural analysis. This research will evaluate characteristic dimensions, most notably patch size of habitat types, using simple geo-statistics and textures of brightness values of historical aerial imagery. It is assumed that each cover type has an identifiable patch size that can be used as a unique classifier of each habitat type. Analytical methods applied to the 1940 imagery were developed using 2010 field data and USDA aerial imagery. Textural moving window methods and basic geo-statistics were used to estimate characteristic dimensions of each cover type in 1940 aerial photography. The moving window texture analysis was configured with multiple window sizes to capture the characteristic dimensions of six habitat types: water, bare sand, dune herb land, estuarine shrub land, marsh land and slash pine woodland. Coefficient of variation (CV), contrast, and entropy texture filters were used to analyze the spatial variability of the 1940 and 2010 imagery. CV was used to depict the horizontal variability of each habitat characteristic dimension, contrast was used to represent the variability of bright versus dark pixel values, and entropy was used to show the variation in the slash pine woodland habitat type. Results indicate a substantial increase in marshland habitat relative to other habitat types since 1940. Results also reveal that each habitat type, such as dune herb-land, marsh-land, estuarine shrub-land, bare sand, slash pine woodland, and water, exhibits a characteristic dimension that may be estimated from horizontal variability in image brightness values. These characteristic dimensions are estimated at less than 1 meter for marsh-land, bare sand and water, 3 meters for estuarine shrub-land and dune herb-land, and 5 to 7 meters for slash pine woodland.
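The coefficient-of-variation texture filter used here is straightforward to reproduce; a sketch with a square moving window follows. The window sizes and brightness scaling are assumptions; contrast and entropy filters would be computed analogously on the same windows.

```python
import numpy as np
from scipy import ndimage

def texture_cv(image, window=5):
    """Coefficient-of-variation texture in a square moving window:
    local standard deviation divided by local mean of brightness."""
    img = image.astype(float)
    mean = ndimage.uniform_filter(img, size=window)
    mean_sq = ndimage.uniform_filter(img ** 2, size=window)
    std = np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))
    return std / np.maximum(mean, 1e-12)

# Running the filter at several window sizes (e.g. 3, 5, 7 pixels) and
# noting where the texture response stabilizes gives a rough estimate of
# the characteristic patch dimension of each cover type.
```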
Validation of the Spatial Accuracy of the ExacTrac Adaptive Gating System
NASA Astrophysics Data System (ADS)
Twork, Gregory
Stereotactic body radiation therapy (SBRT) is a method of treatment that is used in extracranial locations, including the abdominal and thoracic cavities, as well as spinal and paraspinal locations. At the McGill University Health Centre, liver SBRT treatments include gating, which places the treatment beam on a duty cycle controlled by tracking of fiducial markers moving with the patient's breathing cycle. Respiratory gated treatments aim to spare normal tissue, while delivering a dose properly to a moving target. The ExacTrac system (BrainLAB AG, Germany) is an image-guided radiotherapy system consisting of a combination of infra-red (IR) cameras and dual kilovoltage (kV) X-ray tubes. The IR system is used to track patient positioning and respiratory motion, while the kV X-rays are used to determine a positional shift based on internal anatomy or fiducial markers. In order to validate the system's ability to treat under gating conditions, each step of the SBRT process was evaluated quantitatively. Initially the system was tested under ideal static conditions, followed by a study including gated parameters. The uncertainties of the isocenters, positioning algorithm, planning computed tomography (CT) and four-dimensional CT (4DCT) scans, gating window size and tumor motion were evaluated for their contributions to the total uncertainty in treatment. The mechanical isocenter and 4DCT were found to be the largest sources of uncertainty. However, for tumors with large internal amplitudes (>2.25 cm) that are treated with large gating windows (>30%) the gating parameters can contribute more than 1.1 +/- 1.8 mm.
Whalen, D. H.; Zunshine, Lisa; Holquist, Michael
2015-01-01
Reading fiction is a major component of intellectual life, yet it has proven difficult to study experimentally. One aspect of literature that has recently come to light is perspective embedding (“she thought I left” embedding her perspective on “I left”), which seems to be a defining feature of fiction. Previous work (Whalen et al., 2012) has shown that increasing levels of embedment affects the time that it takes readers to read and understand short vignettes in a moving window paradigm. With increasing levels of embedment from 1 to 5, reading times in a moving window paradigm rose almost linearly. However, level 0 was as slow as 3–4. Accuracy on probe questions was relatively constant until dropping at the fifth level. Here, we assessed this effect in a more ecologically valid (“typical”) reading paradigm, in which the entire vignette was visible at once, either for as long as desired (Experiment 1) or a fixed time (Experiment 2). In Experiment 1, reading times followed a pattern similar to that of the previous experiment, with some differences in absolute speed. Accuracy matched previous results: fairly consistent accuracy until a decline at level 5, indicating that both presentation methods allowed understanding. In Experiment 2, accuracy was somewhat reduced, perhaps because participants were less successful at allocating their attention than they were during the earlier experiment; however, the pattern was the same. It seems that literature does not, on average, use easiest reading level but rather uses a middle ground that challenges the reader, but not too much. PMID:26635684
NASA Technical Reports Server (NTRS)
Beutter, B. R.; Mulligan, J. B.; Stone, L. S.; Statler, Irving C. (Technical Monitor)
1994-01-01
Mulligan showed that the perceived direction of a moving grating can be biased by the shape of the Gaussian window in which it is viewed. We sought to determine if a 2-D pattern with an unambiguous velocity would also show such biases. Observers viewed a drifting plaid (sum of two orthogonal 2.5 c/d sinusoidal gratings of 12% contrast, each with a TF of 4 Hz) whose contrast was modulated spatially by a stationary, asymmetric 2-D Gaussian window (i.e. unequal standard deviations in the principal directions). The direction of plaid motion with respect to the orientation of the window's major axis (Δθ) was varied while all other motion parameters were held fixed. Observers reported the perceived plaid direction of motion by adjusting the orientation of a pointer. All five observers showed systematic biases in perceived plaid direction that depended on Δθ and the aspect ratio of the Gaussian window (λ). For circular Gaussian windows (λ = 1), plaid direction was veridically perceived. However, biases of up to 10 deg were found for λ = 2 and Δθ = 30 deg. These data present a challenge to models of motion perception which do not explicitly consider the integration of information across the visual field.
Climate Exposure of US National Parks in a New Era of Change
Monahan, William B.; Fisichelli, Nicholas A.
2014-01-01
US national parks are challenged by climate and other forms of broad-scale environmental change that operate beyond administrative boundaries and in some instances are occurring at especially rapid rates. Here, we evaluate the climate change exposure of 289 natural resource parks administered by the US National Park Service (NPS), and ask which are presently (past 10 to 30 years) experiencing extreme (<5th percentile or >95th percentile) climates relative to their 1901–2012 historical range of variability (HRV). We consider parks in a landscape context (including surrounding 30 km) and evaluate both mean and inter-annual variation in 25 biologically relevant climate variables related to temperature, precipitation, frost and wet day frequencies, vapor pressure, cloud cover, and seasonality. We also consider sensitivity of findings to the moving time window of analysis (10, 20, and 30 year windows). Results show that parks are overwhelmingly at the extreme warm end of historical temperature distributions and this is true for several variables (e.g., annual mean temperature, minimum temperature of the coldest month, mean temperature of the warmest quarter). Precipitation and other moisture patterns are geographically more heterogeneous across parks and show greater variation among variables. Across climate variables, recent inter-annual variation is generally well within the range of variability observed since 1901. Moving window size has a measureable effect on these estimates, but parks with extreme climates also tend to exhibit low sensitivity to the time window of analysis. We highlight particular parks that illustrate different extremes and may facilitate understanding responses of park resources to ongoing climate change. We conclude with discussion of how results relate to anticipated future changes in climate, as well as how they can inform NPS and neighboring land management and planning in a new era of change. PMID:24988483
NASA Astrophysics Data System (ADS)
Johnson, R. M.; Herrold, A.; Holzer, M. A.; Passow, M. J.
2010-12-01
The geoscience research and education community is interested in developing scalable and effective user-friendly strategies for reaching the public, students and educators with information about the Earth and space sciences. Based on experience developed over the past decade with education and outreach programs seeking to reach these populations, there is a growing consensus that this will be best achieved through collaboration, leveraging the resources and networks already in existence. While it is clear that gifted researchers and developers can create wonderful online educational resources, many programs have been stymied by the difficulty of attracting an audience to these resources. The National Earth Science Teachers Association (NESTA) has undertaken an exciting new project, with support from the William and Flora Hewlett Foundation, that provides a new platform for the geoscience education and research community to share their research, resources, programs, products and services with a wider audience. In April 2010, the Windows to the Universe project (http://windows2universe.org) moved from the University Corporation for Atmospheric Research to NESTA. Windows to the Universe, which started in 1995 at the University of Michigan, is one of the most popular Earth and space science education websites globally, with over 16 million visits annually. The objective of this move is to develop a suite of new opportunities and capabilities on the website that will allow it become a sustainable education and outreach platform for the geoscience research and education community hosting open educational resources. This presentation will provide an update on our progress, highlighting our new strategies, synergies with community needs, and opportunities for collaboration.
Impact of smoking on in-vehicle fine particle exposure during driving
NASA Astrophysics Data System (ADS)
Sohn, Hongji; Lee, Kiyoung
2010-09-01
Indoor smoking ban in public places can reduce secondhand smoke (SHS) exposure. However, smoking in cars and homes has continued. The purpose of this study was to assess particulate matter less than 2.5 μm (PM2.5) concentration in moving cars with different window opening conditions. The PM2.5 level was measured by an aerosol spectrometer inside and outside moving cars simultaneously, along with ultrafine particle (UFP) number concentration, speed, temperature and humidity inside cars. Two sport utility vehicles were used. Three different ventilation conditions were evaluated by up to 20 repeated experiments. In the pre-smoking phase, average in-vehicle PM2.5 concentrations were 16-17 μg m⁻³. Regardless of different window opening conditions, the PM2.5 levels promptly increased when smoking occurred and decreased after the cigarette was extinguished. Although only a single cigarette was smoked, the average PM2.5 levels were 506-1307 μg m⁻³ with different window opening conditions. When smoking was ceased, the average PM2.5 levels for 15 min were several times higher than the US National Ambient Air Quality Standard of 35 μg m⁻³. It took longer than 10 min to reach the level of the pre-smoking phase. Although UFP levels had a similar temporal profile to PM2.5, the increased levels during the smoking phase were relatively small. This study demonstrated that the SHS exposure in cars with just a single cigarette being smoked could exceed the US EPA NAAQS under realistic window opening conditions. Therefore, the findings support the need for public education against smoking in cars and advocacy for a smoke-free car policy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cervino, L; Soultan, D; Pettersson, N
2016-06-15
Purpose: to evaluate the dosimetric and radiobiological consequences from having different gating windows, dose rates, and breathing patterns in gated VMAT lung radiotherapy. Methods: A novel 3D-printed moving phantom with central high and peripheral low tracer uptake regions was 4D FDG-PET/CT-scanned using ideal, patient-specific regular, and irregular breathing patterns. A scan of the stationary phantom was obtained as a reference. Target volumes corresponding to different uptake regions were delineated. Simultaneous integrated boost (SIB) 6 MV VMAT plans were produced for conventional and hypofractionated radiotherapy, using 30-70 and 100% cycle gating scenarios. Prescribed doses were 200 cGy with SIB to 240 cGy to high uptake volume for conventional, and 800 with SIB to 900 cGy for hypofractionated plans. Dose rates of 600 MU/min (conventional and hypofractionated) and flattening filter free 1400 MU/min (hypofractionated) were used. Ion chamber measurements were performed to verify delivered doses. Vials with A549 cells placed in locations matching ion chamber measurements were irradiated using the same plans to measure clonogenic survival. Differences in survival for the different doses, dose rates, gating windows, and breathing patterns were analyzed. Results: Ion chamber measurements agreed within 3% of the planned dose, for all locations, breathing patterns and gating windows. Cell survival depended on dose alone, and not on gating window, breathing pattern, MU rate, or delivery time. The surviving fraction varied from approximately 40% at 2 Gy to 1% for 9 Gy and was within statistical uncertainty relative to that observed for the stationary phantom. Conclusions: Use of gated VMAT in PET-driven SIB radiotherapy was validated using ion chamber measurements and cell survival assays for conventional and hypofractionated radiotherapy.
Moving object detection using dynamic motion modelling from UAV aerial images.
Saif, A F M Saifuddin; Prabuwono, Anton Satria; Mahayuddin, Zainal Rasyid
2014-01-01
Motion analysis based moving object detection from UAV aerial images remains an unsolved issue because proper motion estimation is not taken into account. Existing moving object detection approaches for UAV aerial images do not use motion-based pixel intensity measurement to detect moving objects robustly. Moreover, current research on moving object detection from UAV aerial images mostly depends on either frame difference or segmentation approaches alone. This research has two main purposes: first, to develop a new motion model called DMM (dynamic motion model), and second, to apply the proposed segmentation approach SUED (segmentation using edge based dilation) with frame difference embedded together with the DMM model. The proposed DMM model provides effective search windows based on the highest pixel intensity, so that SUED segments only a specific area for the moving object rather than searching the whole frame. At each stage of the proposed scheme, experimental fusion of DMM and SUED extracts moving objects faithfully. Experimental results reveal that the proposed DMM and SUED successfully demonstrate the validity of the proposed methodology.
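The following sketch captures the general flavour of the approach, not the authors' DMM or SUED algorithms: a frame difference locates the region of strongest motion, a search window is centred there, and a simple edge-plus-dilation step segments the object inside that window only. The window size, thresholds and the Sobel/dilation choices are assumptions.

```python
import numpy as np
from scipy import ndimage

def detect_moving_object(prev_frame, curr_frame, win=80, thresh=25):
    """Frame-difference search window plus edge-based dilation, as an
    illustrative stand-in for the DMM + SUED combination."""
    diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    # centre of the search window = location of maximum motion energy
    smoothed = ndimage.uniform_filter(diff, 15)
    cy, cx = np.unravel_index(np.argmax(smoothed), diff.shape)
    top, left = max(0, cy - win // 2), max(0, cx - win // 2)
    roi = diff[top:top + win, left:left + win]
    # edge-based segmentation with dilation (simplified SUED-like step)
    edges = ndimage.sobel(roi, axis=0) ** 2 + ndimage.sobel(roi, axis=1) ** 2
    mask = ndimage.binary_dilation(edges > thresh ** 2, iterations=2)
    mask = ndimage.binary_fill_holes(mask)
    return (top, left), mask
```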
Keep Your Windows Open and Mirrors Polished: On Quality Education in a Changing America
ERIC Educational Resources Information Center
Katz, Lucinda Lee
2011-01-01
Lucinda Lee Katz, head of Marin Country Day School (California), received the 2009 NAIS Diversity Leadership Award. This article presents an edited excerpt of her acceptance speech. In this speech, she outlines what is necessary to move school communities ahead in one's diversity work.
Evaluation of Bias Correction Method for Satellite-Based Rainfall Data
Bhatti, Haris Akram; Rientjes, Tom; Haile, Alemseged Tamiru; Habib, Emad; Verhoef, Wouter
2016-01-01
With the advances in remote sensing technology, satellite-based rainfall estimates are gaining attraction in the field of hydrology, particularly in rainfall-runoff modeling. Since estimates are affected by errors, correction is required. In this study, we tested the high resolution National Oceanic and Atmospheric Administration's (NOAA) Climate Prediction Centre (CPC) morphing technique (CMORPH) satellite rainfall product in the Gilgel Abbey catchment, Ethiopia. CMORPH data at 8 km-30 min resolution is aggregated to daily to match in-situ observations for the period 2003–2010. Study objectives are to assess bias of the satellite estimates, to identify optimum window size for application of bias correction and to test effectiveness of bias correction. Bias correction factors are calculated for moving window (MW) sizes and for sequential windows (SWs) of 3, 5, 7, 9, …, 31 days with the aim to assess error distribution between the in-situ observations and CMORPH estimates. We tested forward, central and backward window (FW, CW and BW) schemes to assess the effect of time integration on accumulated rainfall. Accuracy of cumulative rainfall depth is assessed by Root Mean Squared Error (RMSE). To systematically correct all CMORPH estimates, station based bias factors are spatially interpolated to yield a bias factor map. Reliability of interpolation is assessed by cross validation. The uncorrected CMORPH rainfall images are multiplied by the interpolated bias map to result in bias corrected CMORPH estimates. Findings are evaluated by RMSE, correlation coefficient (r) and standard deviation (SD). Results showed existence of bias in the CMORPH rainfall. It is found that the 7 days SW approach performs best for bias correction of CMORPH rainfall. The outcome of this study showed the efficiency of our bias correction approach. PMID:27314363
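A minimal sketch of a sequential-window multiplicative bias factor for one station is given below; the exact factor definition (ratio of accumulated gauge to accumulated CMORPH rainfall over the window) and the handling of dry windows are assumptions, and the spatial interpolation to a bias-factor map is only indicated in a comment.

```python
import numpy as np

def sequential_bias_factors(gauge, cmorph, window=7):
    """Multiplicative bias factor per sequential (non-overlapping)
    window: sum of gauge rainfall over the window divided by the sum
    of CMORPH rainfall, applied to every day in that window."""
    gauge = np.asarray(gauge, float)
    cmorph = np.asarray(cmorph, float)
    factors = np.ones_like(cmorph)
    for start in range(0, cmorph.size, window):
        sl = slice(start, start + window)
        sat_sum = cmorph[sl].sum()
        if sat_sum > 0:                       # skip dry satellite windows
            factors[sl] = gauge[sl].sum() / sat_sum
    return factors

# corrected = cmorph * factors; station-based factors would then be
# spatially interpolated to a bias-factor map before multiplying the
# uncorrected CMORPH rainfall images.
```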
NASA Astrophysics Data System (ADS)
Zhao, Jinping; Cao, Yong; Wang, Xin
2018-06-01
In order to study the temporal variations of correlations between two time series, a running correlation coefficient (RCC) can be used. An RCC is calculated for a given time window, and the window is then moved sequentially through time. The current calculation method for RCCs is based on the general definition of the Pearson product-moment correlation coefficient, calculated with the data within the time window, which we call the local running correlation coefficient (LRCC). The LRCC is calculated from the two anomalies relative to the two local means, while the local means themselves also vary. We show that the LRCC reflects only the correlation between the two anomalies within the time window and fails to exhibit the contributions of the two varying means. To address this problem, two unchanged means obtained from all available data are adopted to calculate an RCC, which is called the synthetic running correlation coefficient (SRCC). When the anomaly variations are dominant, the two RCCs are similar. However, when the variations of the means are dominant, the difference between the two RCCs becomes obvious. The SRCC reflects the correlations of both the anomaly variations and the variations of the means. Therefore, the SRCCs from different time points are intercomparable. A criterion for the superiority of the RCC algorithm is that the average value of the RCC should be close to the global correlation coefficient calculated using all data. The SRCC always meets this criterion, while the LRCC sometimes fails. Therefore, the SRCC is better than the LRCC for running correlations. We suggest using the SRCC to calculate RCCs.
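The two coefficients are easy to compare numerically; the sketch below computes both for every window position, using the window means for the LRCC and the fixed all-data means for the SRCC, which is how the abstract distinguishes them. Input handling and window stepping are illustrative choices.

```python
import numpy as np

def running_correlations(x, y, window):
    """Local (LRCC) and synthetic (SRCC) running correlation coefficients.
    LRCC uses the means inside each window; SRCC uses the two fixed means
    computed from all available data, so the contributions of the varying
    means are retained."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    gx, gy = x.mean(), y.mean()            # global (unchanged) means
    n = x.size - window + 1
    lrcc, srcc = np.empty(n), np.empty(n)
    for i in range(n):
        xw, yw = x[i:i + window], y[i:i + window]
        lrcc[i] = np.corrcoef(xw, yw)[0, 1]
        ax, ay = xw - gx, yw - gy
        srcc[i] = (ax * ay).sum() / np.sqrt((ax ** 2).sum() * (ay ** 2).sum())
    return lrcc, srcc
```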
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mestrovic, Ante; Chitsazzadeh, Shadi; Wells, Derek
2016-08-15
Purpose: To develop a highly sensitive patient specific QA procedure for gated VMAT stereotactic ablative radiotherapy (SABR) treatments. Methods: A platform was constructed to attach the translational stage of a Quasar respiratory motion phantom to a pinpoint ion chamber insert and move the ion chamber inside the ArcCheck. The Quasar phantom controller uses a patient-specific breathing pattern to translate the ion chamber in a superior-inferior direction inside the ArcCheck. With this system the ion chamber is used to QA the correct phase of the gated delivery and the ArcCheck diodes are used to QA the overall dose distribution. This novel approach requires a single plan delivery for a complete QA of a gated plan. The sensitivity of the gating QA procedure was investigated with respect to the following parameters: PTV size, exhale duration, baseline drift, gating window size. Results: The difference between the measured dose to a point in the penumbra and the Eclipse calculated dose was under 2% for small residual motions. The QA procedure was independent of PTV size and duration of exhale. Baseline drift and gating window size, however, significantly affected the penumbral dose measurement, with differences of up to 30% compared to Eclipse. Conclusion: This study described a highly sensitive QA procedure for gated VMAT SABR treatments. The QA outcome was dependent on the gating window size and baseline drift. Analysis of additional patient breathing patterns is currently underway to determine a clinically relevant gating window size and an appropriate tolerance level for this procedure.
The role of compositionality in standardized problem list generation.
Elkin, P L; Tuttle, M; Keck, K; Campbell, K; Atkin, G; Chute, C G
1998-01-01
Compositionality is the ability of a Vocabulary System to record non-atomic strings. In this manuscript we define the types of composition that can occur. We then propose methods for both server-based and client-based composition, and differentiate the terms Pre-Coordination, Post-Coordination, and User-Directed Coordination. A simple grammar for the recording of terms with concept level identification is presented, with examples from the Unified Medical Language System's (UMLS) Metathesaurus. We present an implementation of a Windows NT based client application and a remote Internet-based Vocabulary Server, which makes use of this method of compositionality. Finally, we suggest a research agenda which we believe is necessary to move toward a more complete understanding of compositionality. This work has the promise of paving the way toward a robust and complete Problem List Entry Tool.
2003-04-25
KENNEDY SPACE CENTER, FLA. - Workers in the Payload Hazardous Servicing Facility help guide the Mars Exploration Rover 1 (MER-1) as it is moved to the lander base petal for installation. The MER Mission consists of two identical rovers, landing at different regions of Mars, designed to cover roughly 110 yards each Martian day over various terrain. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The first rover has a launch window opening June 5, and the second rover a window opening June 25. The rovers will be launched from Cape Canaveral Air Force Station.
2003-04-04
KENNEDY SPACE CENTER, FLA. - Workers in the Payload Hazardous Servicing Facility check the Mars Exploration Rover 2 (MER-2) before it is lifted and moved to the lander where it will be mated to the base petal. Set to launch in Spring 2003, the MER Mission consists of two identical rovers, landing at different regions of Mars, designed to cover roughly 110 yards each Martian day over various terrain. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The first rover has a launch window opening May 30, and the second rover a window opening June 25.
NASA Astrophysics Data System (ADS)
Pellerin, Morgane; Castaing, Victor; Gourier, Didier; Chanéac, Corinne; Viana, Bruno
2018-02-01
Persistent luminescence materials have many applications, including security lighting and bio-imaging. Much progress has been made in the elaboration of persistent luminescent nanoparticles suitable for the first NIR partial transparency window (650-950 nm). Moving to the second and third near-infrared partial transparency windows (1000-1800 nm) allows further reduction of scattering, absorption and tissue autofluorescence effects. In this work, we present the synthesis of Co2+ and Ni2+ doped zinc-gallate nanoparticles with broad emission covering the NIR-II range. Site occupancy, energy levels, optical features and persistent phenomena are presented.
2003-01-28
KENNEDY SPACE CENTER, FLA. -- The Mars Exploration Rover-2 is moved to a workstand in the Payload Hazardous Servicing Facility. Set to launch in 2003, the Mars Exploration Rover Mission will consist of two identical rovers designed to cover roughly 110 yards (100 meters) each Martian day. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The rovers will be identical to each other, but will land at different regions of Mars. The first rover has a launch window opening May 30, 2003, and the second rover a window opening June 25, 2003.
2003-01-28
KENNEDY SPACE CENTER, FLA. - Workers in the Payload Hazardous Servicing Facility move the Mars Exploration Rover-2 to a workstand in the high bay. Set to launch in 2003, the Mars Exploration Rover Mission will consist of two identical rovers designed to cover roughly 110 yards (100 meters) each Martian day. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The rovers will be identical to each other, but will land at different regions of Mars. The first rover has a launch window opening May 30, 2003, and the second rover a window opening June 25, 2003.
Mechanisms of Cochlear Stimulation Through the Round Window
NASA Astrophysics Data System (ADS)
Lukashkin, Andrei N.; Weddell, Thomas; Russell, Ian J.
2011-11-01
The round window membrane (RW) functions as a pressure relief valve in conventional hearing, allowing structures of the middle ear to move. Investigations in recent years have shown that middle ear implants can be used to stimulate the cochlea via the RW. Isolated clinical uses of this technique have been applied, but more thorough theoretical and empirical studies are required. Using guinea pigs as test subjects, we have investigated the physiological effects of RW stimulation using a simulated active middle ear prosthesis: a cylindrical neodymium iron boron disk magnet placed upon the RW and driven by an electromagnetic coil positioned in close proximity to the magnet.
Wang, Ruiping; Jiang, Yonggen; Michael, Engelgau; Zhao, Genming
2017-06-12
The China Centre for Disease Control and Prevention (CDC) developed the China Infectious Disease Automated Alert and Response System (CIDARS) in 2005. The CIDARS was used to strengthen infectious disease surveillance and aid in the early warning of outbreaks. The CIDARS has been integrated into the routine outbreak monitoring efforts of the CDC at all levels in China. The early warning threshold is crucial for outbreak detection in the CIDARS, but CDCs at all levels are currently using thresholds recommended by the China CDC, and these recommended thresholds have recognized limitations. Our study therefore seeks to explore an operational method to select the proper early warning threshold according to the epidemic features of local infectious diseases. The data used in this study were extracted from the web-based Nationwide Notifiable Infectious Diseases Reporting Information System (NIDRIS), and data for infectious disease cases were organized by calendar week (1-52) and year (2009-2015) in Excel format; Px was calculated using a percentile-based moving window (a moving window of 5 weeks × 5 years, x), where x represents one of 12 centiles (0.40, 0.45, 0.50, …, 0.95). Outbreak signals for the 12 Px were calculated using the moving percentile method (MPM) based on data from the CIDARS. When the outbreak signals generated by the 'mean + 2SD' gold standard were in line with a Px-generated outbreak signal for each week during the year of 2014, this Px was then defined as the proper threshold for the infectious disease. Finally, the performance of the newly selected thresholds for each infectious disease was evaluated by simulated outbreak signals based on 2015 data. Six infectious diseases were selected in this study (chickenpox, mumps, hand, foot and mouth disease (HFMD), scarlet fever, influenza and rubella). Proper thresholds for chickenpox (P75), mumps (P80), influenza (P75), rubella (P45), HFMD (P75), and scarlet fever (P80) were identified. The selected proper thresholds for these 6 infectious diseases could detect almost all simulated outbreaks within a shorter time period compared to thresholds recommended by the China CDC. It is beneficial to select the proper early warning threshold to detect infectious disease aberrations based on characteristics and epidemic features of local diseases in the CIDARS.
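A sketch of the percentile threshold described above: for a given calendar week, the historical counts in a 5-week window (the week plus two weeks on either side) across the previous five years are pooled, and the x-th quantile is taken as the threshold. The wrap-around at year boundaries and the array layout are assumptions.

```python
import numpy as np

def moving_percentile_threshold(history, week, x=0.75, half_width=2):
    """history: array (n_years, 52) of weekly case counts from previous
    years. The threshold for `week` (1-52) is the x-th quantile of the
    counts in a 5-week window (week ± half_width) across all years."""
    cols = [(week - 1 + k) % 52 for k in range(-half_width, half_width + 1)]
    window_counts = history[:, cols].ravel()    # 5 weeks × n_years values
    return np.quantile(window_counts, x)

def outbreak_signal(history, week, current_count, x=0.75):
    """True if the current weekly count exceeds the Px threshold."""
    return current_count > moving_percentile_threshold(history, week, x)
```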
NASA Astrophysics Data System (ADS)
Fu, Y.; Yang, W.; Xu, O.; Zhou, L.; Wang, J.
2017-04-01
To investigate time-variant and nonlinear characteristics in industrial processes, a soft sensor modelling method based on time difference, moving-window recursive partial least squares (PLS) and adaptive model updating is proposed. In this method, time difference values of input and output variables are used as training samples to construct the model, which can reduce the effects of the nonlinear characteristics on modelling accuracy and retain the advantages of the recursive PLS algorithm. To reduce the high updating frequency of the model, a confidence value is introduced, which can be updated adaptively according to the results of the model performance assessment. Once the confidence value is updated, the model can be updated. The proposed method has been used to predict the 4-carboxy-benzaldehyde (CBA) content in the purified terephthalic acid (PTA) oxidation reaction process. The results show that the proposed soft sensor modelling method can reduce computation effectively, improve prediction accuracy by making use of process information and reflect the process characteristics accurately.
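A bare-bones version of the time-difference moving-window PLS idea is sketched below with scikit-learn; the adaptive confidence-based model updating described in the abstract is omitted, and the window length, number of latent variables and one-step-ahead prediction scheme are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def moving_window_td_pls(X, y, window=200, n_components=3):
    """Fit a PLS model on first differences of the most recent `window`
    samples of the process inputs X (n_samples, n_vars) and output y.
    Assumes X has at least n_components input variables."""
    dX, dy = np.diff(X, axis=0), np.diff(y)
    pls = PLSRegression(n_components=n_components)
    pls.fit(dX[-window:], dy[-window:])
    return pls

def predict_next(pls, x_next, x_last, y_last):
    """One-step-ahead prediction: last measured output plus the
    predicted output difference for the new input difference."""
    dy_hat = pls.predict((x_next - x_last).reshape(1, -1)).ravel()[0]
    return y_last + dy_hat
```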
NASA Astrophysics Data System (ADS)
Moliner, L.; Correcher, C.; Gimenez-Alventosa, V.; Ilisie, V.; Alvarez, J.; Sanchez, S.; Rodríguez-Alvarez, M. J.
2017-11-01
Nowadays, with the increase of the computational power of modern computers together with the state-of-the-art reconstruction algorithms, it is possible to obtain Positron Emission Tomography (PET) images in practically real time. These facts open the door to new applications such as radio-pharmaceuticals tracking inside the body or the use of PET for image-guided procedures, such as biopsy interventions, among others. This work is a proof of concept that aims to improve the user experience with real time PET images. Fixed, incremental, overlapping, sliding and hybrid windows are the different statistical combinations of data blocks used to generate intermediate images in order to follow the path of the activity in the Field Of View (FOV). To evaluate these different combinations, a point source is placed in a dedicated breast PET device and moved along the FOV. These acquisitions are reconstructed according to the different statistical windows, resulting in a smoother transition of positions for the image reconstructions that use the sliding and hybrid window.
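The abstract does not define the five window schemes precisely, so the sketch below only illustrates three plausible ways of grouping acquisition data blocks into the subsets reconstructed for each intermediate image; the names and definitions here are assumptions.

```python
def window_indices(n_blocks, scheme="sliding", width=5, step=1):
    """Indices of the acquisition data blocks combined for each
    intermediate PET image under different statistical windows.
    'overlapping' and 'hybrid' schemes would be built analogously."""
    if scheme == "fixed":            # disjoint groups of `width` blocks
        return [list(range(i, min(i + width, n_blocks)))
                for i in range(0, n_blocks, width)]
    if scheme == "incremental":      # all blocks acquired so far
        return [list(range(0, i + 1)) for i in range(n_blocks)]
    if scheme == "sliding":          # the most recent `width` blocks
        return [list(range(max(0, i - width + 1), i + 1))
                for i in range(0, n_blocks, step)]
    raise ValueError(f"unknown scheme: {scheme}")
```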
2.5-month-old infants' reasoning about when objects should and should not be occluded.
Aguiar, A; Baillargeon, R
1999-09-01
The present research examined 2.5-month-old infants' reasoning about occlusion events. Three experiments investigated infants' ability to predict whether an object should remain continuously hidden or become temporarily visible when passing behind an occluder with an opening in its midsection. In Experiment 1, the infants were habituated to a short toy mouse that moved back and forth behind a screen. Next, the infants saw two test events that were identical to the habituation event except that a portion of the screen's midsection was removed to create a large window. In one event (high-window event), the window extended from the screen's upper edge; the mouse was shorter than the bottom of the window and thus did not become visible when passing behind the screen. In the other event (low-window event), the window extended from the screen's lower edge; although the mouse was shorter than the top of the window and hence should have become fully visible when passing behind the screen, it never appeared in the window. The infants tended to look equally at the high- and low-window events, suggesting that they were not surprised when the mouse failed to appear in the low window. However, positive results were obtained in Experiment 2 when the low-window event was modified: a portion of the screen above the window was removed so that the left and right sections of the screen were no longer connected (two-screens event). The infants looked reliably longer at the two-screens than at the high-window event. Together, the results of Experiments 1 and 2 suggested that, at 2.5 months of age, infants possess only very limited expectations about when objects should and should not be occluded. Specifically, infants expect objects (1) to become visible when passing between occluders and (2) to remain hidden when passing behind occluders, irrespective of whether these have openings extending from their upper or lower edges. Experiment 3 provided support for this interpretation. The implications of these findings for models of the origins and development of infants' knowledge about occlusion events are discussed. Copyright 1999 Academic Press.
NASA Astrophysics Data System (ADS)
Kwon, O.; Kim, W.; Kim, J.
2017-12-01
Recently, construction of subsea tunnels has increased globally. For safe construction of a subsea tunnel, identifying the geological structure, including faults, at the design and construction stages is critically important. Unlike tunnels on land, however, it is very difficult to obtain data on the geological structure because of the limits of geological surveys at sea. This study is intended to address such difficulties by developing technology to identify the geological structure of the seabed automatically using echo sounding data. When investigating a potential site for a deep subsea tunnel, boreholes and geophysical investigation face technical and economic limits. By contrast, echo sounding data are easily obtainable, and their reliability is higher compared with the above approaches. This study is aimed at developing an algorithm that identifies large-scale geological structures of the seabed using a geostatistical approach. The study is based on the structural-geology principle that topographic features indicate geological structure. The basic concept of the algorithm is outlined as follows; (1) convert the seabed topography to grid data using echo sounding data, (2) apply a moving window of optimal size to the grid data, (3) estimate the spatial statistics of the grid data in the window area, (4) set the percentile standard of the spatial statistics, (5) display the values satisfying the standard on the map, (6) visualize the geological structure on the map. The important elements in this study are the optimal size of the moving window, the choice of spatial statistics and the determination of the optimal percentile standard. To determine these optimal elements, numerous simulations were implemented. Eventually, a user program based on R was developed using the optimal analysis algorithm. The program allows the type of spatial statistic and the percentile standard to be designated easily, so that the geological structure can be analyzed readily as the spatial statistics vary. This research was supported by the Korea Agency for Infrastructure Technology Advancement under the Ministry of Land, Infrastructure and Transport of the Korean government. (Project Number: 13 Construction Research T01)
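Although the authors implemented their tool in R, a compact Python sketch conveys the moving-window idea: compute a spatial statistic (here the local standard deviation of depth) in a moving window over the gridded bathymetry and keep the cells exceeding a chosen percentile. The statistic, window size and percentile are placeholders for the parameters the study optimizes.

```python
import numpy as np
from scipy import ndimage

def lineament_candidates(depth_grid, window=9, percentile=95):
    """Flag grid cells whose local relief (moving-window standard
    deviation of depth) exceeds the chosen percentile; connected runs
    of flagged cells outline candidate structures such as faults."""
    z = depth_grid.astype(float)
    mean = ndimage.uniform_filter(z, size=window)
    mean_sq = ndimage.uniform_filter(z ** 2, size=window)
    local_std = np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))
    threshold = np.percentile(local_std, percentile)
    return local_std >= threshold
```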
Thomas, Gregory Owen; Poortinga, Wouter; Sautkina, Elena
2016-01-01
Repeated behaviours in stable contexts can become automatic habits. Habits are resistant to information-based techniques to change behaviour, but are contextually cued, so a change in behaviour context (e.g., location) weakens habit strength and can facilitate greater consideration of the behaviour. This idea was demonstrated in previous work, whereby people with strong environmental attitudes have lower car use, but only after recently moving home. We examine the habit discontinuity hypothesis by analysing the Understanding Society dataset with 18,053 individuals representative of the UK population, measuring time since moving home, travel mode to work, and strength of environmental attitudes. Results support previous findings that car use is significantly lower among those with stronger environmental views (but only after recently moving home), and in addition demonstrate a trend whereby this effect decays as the time since moving home increases. We discuss results in light of moving into a new home being a potential 'window of opportunity' to promote pro-environmental behaviours.
Through the Sliding Glass Door: #EmpowerTheReader
ERIC Educational Resources Information Center
Johnson, Nancy J.; Koss, Melanie D.; Martinez, Miriam
2018-01-01
This article seeks to complicate the understanding of Bishop's (1990) metaphor of mirrors, windows, and sliding glass doors, with particular emphasis on sliding glass doors and the emotional connections needed for readers to move through them. The authors begin by examining the importance of the reader and the characters he or she meets. Next, the…
Microwave Radiometers for Fire Detection in Trains: Theory and Feasibility Study.
Alimenti, Federico; Roselli, Luca; Bonafoni, Stefania
2016-06-17
This paper introduces the theory of fire detection in moving vehicles by microwave radiometers. The system analysis is discussed and a feasibility study is illustrated on the basis of two implementation hypotheses. The basic idea is to have a fixed radiometer and to look inside the glass windows of the wagon when it passes in front of the instrument antenna. The proposed sensor uses a three-pixel multi-beam configuration that allows an image to be formed by the movement of the train itself. Each pixel is constituted by a direct amplification microwave receiver operating at 31.4 GHz. At this frequency, the antenna can be a 34 cm offset parabolic dish, whereas a 1 K brightness temperature resolution is achievable with an overall system noise figure of 6 dB, an observation bandwidth of 2 GHz and an integration time of 1 ms. The effect of the detector noise is also investigated and several implementation hypotheses are discussed. The presented study is important since it could be applied to the automatic fire alarm in trains and moving vehicles with dielectric wall/windows.
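The quoted 1 K resolution can be checked with the standard total-power radiometer equation, delta_T ~ T_sys / sqrt(B * tau). The snippet below is a back-of-the-envelope sketch using the figures in the abstract plus an assumed antenna (scene) temperature of about 300 K, which is not stated in the text.

```python
import math

# Figures from the abstract
noise_figure_db = 6.0      # overall system noise figure
bandwidth_hz = 2e9         # observation bandwidth
tau_s = 1e-3               # integration time
t_antenna = 300.0          # assumed scene/antenna temperature (not in the abstract)
T0 = 290.0                 # reference temperature for the noise figure

# Receiver noise temperature from the noise figure
F = 10 ** (noise_figure_db / 10.0)
t_receiver = T0 * (F - 1.0)

# Total-power radiometer equation: delta_T = T_sys / sqrt(B * tau)
t_sys = t_antenna + t_receiver
delta_t = t_sys / math.sqrt(bandwidth_hz * tau_s)
print(f"T_sys ~ {t_sys:.0f} K, radiometric resolution ~ {delta_t:.2f} K")
# roughly 0.8 K, consistent with the ~1 K resolution quoted in the abstract
```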
BOREAS AFM-2 King Air 1994 Aircraft Flux and Moving Window Data
NASA Technical Reports Server (NTRS)
Kelly, Robert D.; Hall, Forrest G. (Editor); Newcomer, Jeffrey A. (Editor); Smith, David E. (Technical Monitor)
2000-01-01
The BOREAS AFM-2 team collected pass-by-pass fluxes (and many other statistics) for a large number of level (constant-altitude), straight-line passes used in a variety of flight patterns. The data were collected by the University of Wyoming King Air during the 1994 BOREAS IFCs 1-3. Most of these data were collected at 60-70 m above ground level, but a significant number of passes were also flown at various levels in the planetary boundary layer, up to about the inversion height. This documentation concerns only the data from the straight and level passes, which are presented as original values (over the NSA and SSA) and moving-window values (over the Transect). Another archive of King Air data is also available, containing data from all the soundings flown by the King Air during the 1994 IFCs 1-3. The data are stored in tabular ASCII files. The data files are available on a CD-ROM (see document number 20010000884) or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Puerari, Ivânio; Elmegreen, Bruce G.; Block, David L., E-mail: puerari@inaoep.mx
2014-12-01
We examine 8 μm IRAC images of the grand design two-arm spiral galaxies M81 and M51 using a new method whereby pitch angles are locally determined as a function of scale and position, in contrast to traditional Fourier transform spectral analyses which fit to average pitch angles for whole galaxies. The new analysis is based on a correlation between pieces of a galaxy in circular windows of (lnR,θ) space and logarithmic spirals with various pitch angles. The diameter of the windows is varied to study different scales. The result is a best-fit pitch angle to the spiral structure as a function of position and scale, or a distribution function of pitch angles as a function of scale for a given galactic region or area. We apply the method to determine the distribution of pitch angles in the arm and interarm regions of these two galaxies. In the arms, the method reproduces the known pitch angles for the main spirals on a large scale, but also shows higher pitch angles on smaller scales resulting from dust feathers. For the interarms, there is a broad distribution of pitch angles representing the continuation and evolution of the spiral arm feathers as the flow moves into the interarm regions. Our method shows a multiplicity of spiral structures on different scales, as expected from gas flow processes in a gravitating, turbulent and shearing interstellar medium. We also present results for M81 using classical 1D and 2D Fourier transforms, together with a new correlation method, which shows good agreement with conventional 2D Fourier transforms.
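As an illustration of the windowed correlation idea (not the authors' code), the sketch below correlates an image patch, already expressed on a (lnR, θ) grid, with logarithmic-spiral templates of varying pitch angle and returns the best-fitting pitch. The window extent, pitch grid, two-armed template form, and the synthetic patch are assumptions for the example; varying the window diameter to probe different scales is omitted.

```python
import numpy as np

def best_pitch(patch_lnr_theta, lnr, theta, pitches_deg, m=2):
    """Correlate a (lnR, theta) image patch with m-armed logarithmic-spiral
    templates of different pitch angles; return the best-fitting pitch."""
    U, TH = np.meshgrid(lnr, theta, indexing='ij')
    p_img = (patch_lnr_theta - patch_lnr_theta.mean()).ravel()
    scores = []
    for p in np.radians(pitches_deg):
        # An m-armed logarithmic spiral satisfies theta = lnR / tan(pitch) + const,
        # so the template is periodic in the combination (theta - lnR / tan(pitch)).
        template = np.cos(m * (TH - U / np.tan(p)))
        t = (template - template.mean()).ravel()
        scores.append(np.dot(p_img, t) / (np.linalg.norm(p_img) * np.linalg.norm(t)))
    scores = np.array(scores)
    return pitches_deg[np.argmax(scores)], scores

# Synthetic test: a patch containing a 2-armed spiral with a 20-degree pitch.
lnr = np.linspace(0.0, 1.0, 80)
theta = np.linspace(0.0, 2 * np.pi, 160)
U, TH = np.meshgrid(lnr, theta, indexing='ij')
patch = np.cos(2 * (TH - U / np.tan(np.radians(20.0))))
pitches = np.arange(5.0, 46.0, 1.0)
best, _ = best_pitch(patch, lnr, theta, pitches, m=2)
print("recovered pitch angle:", best, "deg")
```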
Application of a GPU-Assisted Maxwell Code to Electromagnetic Wave Propagation in ITER
NASA Astrophysics Data System (ADS)
Kubota, S.; Peebles, W. A.; Woodbury, D.; Johnson, I.; Zolfaghari, A.
2014-10-01
The Low Field Side Reflectometer (LSFR) on ITER is envisioned to provide capabilities for electron density profile and fluctuations measurements in both the plasma core and edge. The current design for the Equatorial Port Plug 11 (EPP11) employs seven monostatic antennas for use with both fixed-frequency and swept-frequency systems. The present work examines the characteristics of this layout using the 3-D version of the GPU-Assisted Maxwell Code (GAMC-3D). Previous studies in this area were performed with either 2-D full wave codes or 3-D ray- and beam-tracing. GAMC-3D is based on the FDTD method and can be run with either a fixed-frequency or modulated (e.g. FMCW) source, and with either a stationary or moving target (e.g. Doppler backscattering). The code is designed to run on a single NVIDIA Tesla GPU accelerator, and utilizes a technique based on the moving window method to overcome the size limitation of the onboard memory. Effects such as beam drift, linear mode conversion, and diffraction/scattering will be examined. Comparisons will be made with beam-tracing calculations using the complex eikonal method. Supported by U.S. DoE Grants DE-FG02-99ER54527 and DE-AC02-09CH11466, and the DoE SULI Program at PPPL.
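The moving-window trick used to fit a long propagation path into limited GPU memory can be sketched in one dimension: the computational window slides along with the pulse, discarding field values behind it and adding fresh cells ahead. This is a schematic CPU illustration of the idea, not GAMC-3D itself; the grid size, source, and shift cadence are arbitrary choices.

```python
import numpy as np

# 1D vacuum FDTD on a short grid with a window that slides at the pulse speed.
nz, steps = 400, 4000
courant = 0.5                     # c*dt/dz
ez = np.zeros(nz)
hy = np.zeros(nz)
origin = 0                        # left edge of the window in the (virtual) full grid

for n in range(steps):
    # Standard normalized Yee updates (vacuum).
    hy[:-1] += courant * (ez[1:] - ez[:-1])
    ez[1:] += courant * (hy[1:] - hy[:-1])
    # Soft Gaussian source near the left edge, active only at early times.
    if n < 200:
        ez[20] += np.exp(-((n - 80) / 25.0) ** 2)
    # Moving window: the pulse advances `courant` cells per step, so once it is
    # launched we slide the window one cell every 1/courant steps, dropping the
    # trailing cell and appending a fresh zero cell ahead of the pulse.
    if n > 300 and n % int(1 / courant) == 0:
        ez = np.append(ez[1:], 0.0)
        hy = np.append(hy[1:], 0.0)
        origin += 1

print(f"window now spans cells {origin}..{origin + nz - 1}; "
      f"peak |Ez| = {np.abs(ez).max():.3f}")
```

The stored arrays never grow beyond 400 cells even though the pulse travels roughly 2000 cells, which is the memory saving the moving-window method provides.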
Aston, Philip J; Christie, Mark I; Huang, Ying H; Nandi, Manasi
2018-03-01
Advances in monitoring technology allow blood pressure waveforms to be collected at sampling frequencies of 250-1000 Hz for long time periods. However, much of the raw data are under-analysed. Heart rate variability (HRV) methods, in which beat-to-beat interval lengths are extracted and analysed, have been extensively studied. However, this approach discards the majority of the raw data. Our aim is to detect changes in the shape of the waveform in long streams of blood pressure data. Our approach involves extracting key features from large complex data sets by generating a reconstructed attractor in a three-dimensional phase space using delay coordinates from a window of the entire raw waveform data. The naturally occurring baseline variation is removed by projecting the attractor onto a plane from which new quantitative measures are obtained. The time window is moved through the data to give a collection of signals which relate to various aspects of the waveform shape. This approach enables visualisation and quantification of changes in the waveform shape and has been applied to blood pressure data collected from conscious unrestrained mice and to human blood pressure data. The interpretation of the attractor measures is aided by the analysis of simple artificial waveforms. We have developed and analysed a new method for analysing blood pressure data that uses all of the waveform data and hence can detect changes in the waveform shape that HRV methods cannot, which is confirmed with an example, and hence our method goes 'beyond HRV'.
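The attractor reconstruction step can be sketched as follows: a window of the signal is embedded in three dimensions with delay coordinates, then projected onto the plane orthogonal to the (1,1,1) direction, which removes a baseline shift common to all three coordinates. This is a minimal illustration with a synthetic waveform; the sampling rate, delay, window length, and the simple "mean radius" measure are assumptions, not the authors' implementation.

```python
import numpy as np

def delay_embed(x, tau):
    """Three-dimensional delay embedding (x(t), x(t-tau), x(t-2*tau))."""
    n = len(x) - 2 * tau
    return np.column_stack([x[2 * tau: 2 * tau + n], x[tau: tau + n], x[:n]])

def project_out_baseline(points):
    """Project onto the plane orthogonal to (1,1,1)/sqrt(3).
    Adding the same constant to all three coordinates (a baseline shift)
    moves points along (1,1,1), so it is removed by this projection."""
    v = np.ones(3) / np.sqrt(3)
    return points - np.outer(points @ v, v)

# Synthetic pulse-like waveform with a slow baseline drift.
fs, seconds = 500, 10                      # assumed sampling rate and duration
t = np.arange(fs * seconds) / fs
signal = np.maximum(np.sin(2 * np.pi * 6.0 * t), 0.0) ** 3   # pulse shape
signal += 0.3 * np.sin(2 * np.pi * 0.1 * t)                  # baseline variation

tau = fs // 18                             # arbitrary delay for the example
window = signal[:fs * 2]                   # one 2-second analysis window
attractor = project_out_baseline(delay_embed(window, tau))

# A simple attractor measure: mean distance of projected points from the origin.
print("mean attractor radius in this window:",
      np.linalg.norm(attractor, axis=1).mean())
```

Sliding the analysis window through the full recording and repeating this calculation yields the time series of shape-related measures the abstract describes.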
The Study of Residential Areas Extraction Based on GF-3 Texture Image Segmentation
NASA Astrophysics Data System (ADS)
Shao, G.; Luo, H.; Tao, X.; Ling, Z.; Huang, Y.
2018-04-01
The study uses standard-stripe, dual-polarization SAR images from GF-3 as the basic data. Processes and methods for residential areas extraction based on texture segmentation of GF-3 images are compared and analyzed. GF-3 image processing includes radiometric calibration, complex data conversion, multi-look processing, and image filtering; a suitability analysis of different filtering methods showed that the Kuan filter is effective for extracting residential areas. Texture feature vectors were then calculated and analyzed using the GLCM (Gray Level Co-occurrence Matrix); the parameters include the moving window size, step size, and angle, and the results show that a window size of 11*11, a step of 1, and an angle of 0° are effective and optimal for extracting residential areas. Using the FNEA (Fractal Net Evolution Approach), the GLCM texture images were segmented and the residential areas extracted by threshold setting. The extraction result was verified and assessed with a confusion matrix: overall accuracy is 0.897 and kappa is 0.881. Residential areas were also extracted by SVM classification of the GF-3 images; its overall accuracy is 0.09 lower than that of the extraction method based on GF-3 texture image segmentation. We conclude that residential areas extraction based on multi-scale segmentation of GF-3 SAR texture images is simple and highly accurate. Because multi-spectral remote sensing images are difficult to obtain in southern China, which is cloudy and rainy throughout the year, this approach has practical reference value.
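A windowed GLCM texture computation of the kind described (11*11 window, step 1, angle 0°) can be sketched with scikit-image, assuming a recent version in which the functions are named graycomatrix/graycoprops. The snippet computes one GLCM statistic (contrast) in a moving window over a small synthetic image; it illustrates the procedure only, not the authors' SAR processing chain, and the gray-level quantization is an assumption.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_contrast_map(img, win=11, levels=32, distance=1, angle=0.0):
    """Moving-window GLCM contrast: for each pixel, build a GLCM from the
    surrounding win x win patch (step 1 pixel, given distance and angle)."""
    # Quantize to a small number of gray levels to keep the GLCM manageable.
    q = np.floor(img.astype(float) / img.max() * (levels - 1)).astype(np.uint8)
    half = win // 2
    out = np.zeros(img.shape, dtype=float)
    padded = np.pad(q, half, mode='reflect')
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            patch = padded[i:i + win, j:j + win]
            glcm = graycomatrix(patch, [distance], [angle],
                                levels=levels, symmetric=True, normed=True)
            out[i, j] = graycoprops(glcm, 'contrast')[0, 0]
    return out

# Tiny synthetic example: smooth background with a "built-up" textured block.
rng = np.random.default_rng(0)
img = rng.normal(100, 2, (64, 64))
img[20:44, 20:44] += rng.normal(0, 25, (24, 24))   # high-texture region
contrast = glcm_contrast_map(img, win=11)
print("mean contrast inside/outside the textured block:",
      contrast[20:44, 20:44].mean(), contrast[:15, :15].mean())
```

Thresholding or segmenting such a texture map is the step the FNEA segmentation performs in the paper.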
Quantifying Wikipedia Usage Patterns Before Stock Market Moves
NASA Astrophysics Data System (ADS)
Moat, Helen Susannah; Curme, Chester; Avakian, Adam; Kenett, Dror Y.; Stanley, H. Eugene; Preis, Tobias
2013-05-01
Financial crises result from a catastrophic combination of actions. Vast stock market datasets offer us a window into some of the actions that have led to these crises. Here, we investigate whether data generated through Internet usage contain traces of attempts to gather information before trading decisions were taken. We present evidence in line with the intriguing suggestion that data on changes in how often financially related Wikipedia pages were viewed may have contained early signs of stock market moves. Our results suggest that online data may allow us to gain new insight into early information gathering stages of decision making.
Windowed multipole for cross section Doppler broadening
NASA Astrophysics Data System (ADS)
Josey, C.; Ducru, P.; Forget, B.; Smith, K.
2016-02-01
This paper presents an in-depth analysis on the accuracy and performance of the windowed multipole Doppler broadening method. The basic theory behind cross section data is described, along with the basic multipole formalism followed by the approximations leading to windowed multipole method and the algorithm used to efficiently evaluate Doppler broadened cross sections. The method is tested by simulating the BEAVRS benchmark with a windowed multipole library composed of 70 nuclides. Accuracy of the method is demonstrated on a single assembly case where total neutron production rates and 238U capture rates compare within 0.1% to ACE format files at the same temperature. With regards to performance, clock cycle counts and cache misses were measured for single temperature ACE table lookup and for windowed multipole. The windowed multipole method was found to require 39.6% more clock cycles to evaluate, translating to a 7.9% performance loss overall. However, the algorithm has significantly better last-level cache performance, with 3 fewer misses per evaluation, or a 65% reduction in last-level misses. This is due to the small memory footprint of the windowed multipole method and better memory access pattern of the algorithm.
Improved Scanners for Microscopic Hyperspectral Imaging
NASA Technical Reports Server (NTRS)
Mao, Chengye
2009-01-01
Improved scanners to be incorporated into hyperspectral microscope-based imaging systems have been invented. Heretofore, in microscopic imaging, including spectral imaging, it has been customary to either move the specimen relative to the optical assembly that includes the microscope or else move the entire assembly relative to the specimen. It becomes extremely difficult to control such scanning when submicron translation increments are required, because the high magnification of the microscope enlarges all movements in the specimen image on the focal plane. To overcome this difficulty, in a system based on this invention, no attempt would be made to move either the specimen or the optical assembly. Instead, an objective lens would be moved within the assembly so as to cause translation of the image at the focal plane: the effect would be equivalent to scanning in the focal plane. The upper part of the figure depicts a generic proposed microscope-based hyperspectral imaging system incorporating the invention. The optical assembly of this system would include an objective lens (normally, a microscope objective lens) and a charge-coupled-device (CCD) camera. The objective lens would be mounted on a servomotor-driven translation stage, which would be capable of moving the lens in precisely controlled increments, relative to the camera, parallel to the focal-plane scan axis. The output of the CCD camera would be digitized and fed to a frame grabber in a computer. The computer would store the frame-grabber output for subsequent viewing and/or processing of images. The computer would contain a position-control interface board, through which it would control the servomotor. There are several versions of the invention. An essential feature common to all versions is that the stationary optical subassembly containing the camera would also contain a spatial window, at the focal plane of the objective lens, that would pass only a selected portion of the image. In one version, the window would be a slit, the CCD would contain a one-dimensional array of pixels, and the objective lens would be moved along an axis perpendicular to the slit to spatially scan the image of the specimen in pushbroom fashion. The image built up by scanning in this case would be an ordinary (non-spectral) image. In another version, the optics of which are depicted in the lower part of the figure, the spatial window would be a slit, the CCD would contain a two-dimensional array of pixels, the slit image would be refocused onto the CCD by a relay-lens pair consisting of a collimating and a focusing lens, and a prism-gratingprism optical spectrometer would be placed between the collimating and focusing lenses. Consequently, the image on the CCD would be spatially resolved along the slit axis and spectrally resolved along the axis perpendicular to the slit. As in the first-mentioned version, the objective lens would be moved along an axis perpendicular to the slit to spatially scan the image of the specimen in pushbroom fashion.
Space-variant restoration of images degraded by camera motion blur.
Sorel, Michal; Flusser, Jan
2008-02-01
We examine the problem of restoration from multiple images degraded by camera motion blur. We consider scenes with significant depth variations resulting in space-variant blur. The proposed algorithm can be applied if the camera moves along an arbitrary curve parallel to the image plane, without any rotations. The knowledge of camera trajectory and camera parameters is not necessary. At the input, the user selects a region where depth variations are negligible. The algorithm belongs to the group of variational methods that estimate simultaneously a sharp image and a depth map, based on the minimization of a cost functional. To initialize the minimization, it uses an auxiliary window-based depth estimation algorithm. Feasibility of the algorithm is demonstrated by three experiments with real images.
Windowing technique in FM radar realized by FPGA for better target resolution
NASA Astrophysics Data System (ADS)
Ponomaryov, Volodymyr I.; Escamilla-Hernandez, Enrique; Kravchenko, Victor F.
2006-09-01
Remote sensing systems such as SAR usually apply FM signals to resolve closely spaced targets (objects) and improve SNR. The main drawback of pulse compression of the FM radar signal is that it can add range side-lobes to reflectivity measurements. Using weighting-window processing in the time domain, the side-lobe level (SLL) of the output radar signal can be decreased significantly, which makes it possible to resolve small or low-power targets that are masked by powerful ones. Classical windows such as Hamming, Hanning, Blackman-Harris, Kaiser-Bessel, Dolph-Chebyshev, Gauss, etc. are usually used in window processing. In addition to the classical ones, here we also use a novel class of windows based on atomic function (AF) theory. For comparison of simulation and experimental results we applied the standard parameters, such as coefficient of amplification, maximum side-lobe level, width of the main lobe, etc. In this paper we also propose to implement the compression-windowing model at the hardware level, employing a Field Programmable Gate Array (FPGA), which offers benefits such as instantaneous implementation, dynamic reconfiguration, design flexibility, and field programmability. The pulse compression design on FPGA has been investigated applying the classical and novel window techniques to reduce the SLL in the absence and presence of noise. The paper presents simulated and experimental examples of detection of small or closely spaced targets in the imaging radar. It also presents experimental hardware results of windowing in FM radar, demonstrating resolution of several targets for the classical rectangular, Hamming, and Kaiser-Bessel windows, and for some novel ones: Up(x), fup 4(x)•D 3(x), fup 6(x)•G 3(x), etc. It can be concluded that windows based on the AFs offer better reduction of the SLL, both in the presence and absence of noise and when moving away from the main lobe, in comparison with classical windows.
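The effect of time-domain weighting on range side-lobes can be reproduced with a short simulation: compress a linear-FM chirp with a matched filter, once with a rectangular window on the reference and once with a Hamming window, and compare the peak side-lobe levels. This is a generic sketch of the technique with arbitrary chirp parameters; it covers classical windows only, not the atomic-function windows used in the paper.

```python
import numpy as np

# Linear-FM chirp and matched-filter pulse compression with/without weighting.
fs, T, B = 20e6, 20e-6, 5e6            # sample rate, pulse length, sweep bandwidth
t = np.arange(int(fs * T)) / fs
chirp = np.exp(1j * np.pi * (B / T) * t ** 2)

def compress(reference_window):
    """Correlate the raw chirp with a (possibly weighted) reference copy."""
    ref = chirp * reference_window
    out = np.abs(np.convolve(chirp, np.conj(ref[::-1])))
    return out / out.max()

def peak_sidelobe_db(response):
    """Peak side-lobe level relative to the main-lobe peak, in dB."""
    k = int(np.argmax(response))
    # Walk right from the peak to the first null, then take the largest
    # remaining value on either side of the main lobe.
    i = k
    while i + 1 < len(response) and response[i + 1] < response[i]:
        i += 1
    left = k - (i - k)
    sidelobes = np.concatenate([response[:max(left, 0)], response[i + 1:]])
    return 20 * np.log10(sidelobes.max())

for name, win in [("rectangular", np.ones_like(t)), ("Hamming", np.hamming(len(t)))]:
    print(f"{name:12s} peak side-lobe level: {peak_sidelobe_db(compress(win)):6.1f} dB")
# The Hamming-weighted reference trades a wider main lobe for markedly lower
# side-lobes, which is what makes weak targets near strong ones resolvable.
```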
An energy function for dynamics simulations of polypeptides in torsion angle space
NASA Astrophysics Data System (ADS)
Sartori, F.; Melchers, B.; Böttcher, H.; Knapp, E. W.
1998-05-01
Conventional simulation techniques to model the dynamics of proteins in atomic detail are restricted to short time scales. A simplified molecular description, in which high frequency motions with small amplitudes are ignored, can overcome this problem. In this protein model only the backbone dihedrals φ and ψ and the χi of the side chains serve as degrees of freedom. Bond angles and lengths are fixed at ideal geometry values provided by the standard molecular dynamics (MD) energy function CHARMM. In this work a Monte Carlo (MC) algorithm is used, whose elementary moves employ cooperative rotations in a small window of consecutive amide planes, leaving the polypeptide conformation outside of this window invariant. A single of these window MC moves generates local conformational changes only. But, the application of many such moves at different parts of the polypeptide backbone leads to global conformational changes. To account for the lack of flexibility in the protein model employed, the energy function used to evaluate conformational energies is split into sequentially neighbored and sequentially distant contributions. The sequentially neighbored part is represented by an effective (φ,ψ)-torsion potential. It is derived from MD simulations of a flexible model dipeptide using a conventional MD energy function. To avoid exaggeration of hydrogen bonding strengths, the electrostatic interactions involving hydrogen atoms are scaled down at short distances. With these adjustments of the energy function, the rigid polypeptide model exhibits the same equilibrium distributions as obtained by conventional MD simulation with a fully flexible molecular model. Also, the same temperature dependence of the stability and build-up of α helices of 18-alanine as found in MD simulations is observed using the adapted energy function for MC simulations. Analyses of transition frequencies demonstrate that also dynamical aspects of MD trajectories are faithfully reproduced. Finally, it is demonstrated that even for high temperature unfolded polypeptides the MC simulation is more efficient by a factor of 10 than conventional MD simulations.
NASA Astrophysics Data System (ADS)
Lanka, Karthikeyan; Pan, Ming; Konings, Alexandra; Piles, María; D, Nagesh Kumar; Wood, Eric
2017-04-01
Traditionally, passive microwave retrieval algorithms such as Land Parameter Retrieval Model (LPRM) estimate simultaneously soil moisture and Vegetation Optical Depth (VOD) using brightness temperature (Tb) data. The algorithm requires a surface roughness parameter which - despite implications - is generally assumed to be constant at global scale. Due to inherent noise in the satellite data and retrieval algorithm, the VOD retrievals are usually observed to be highly fluctuating at daily scale which may not occur in reality. Such noisy VOD retrievals along with spatially invariable roughness parameter may affect the quality of soil moisture retrievals. The current work aims to smoothen the VOD retrievals (with an assumption that VOD remains constant over a period of time) and simultaneously generate, for the first time, global surface roughness map using multiple descending X-band Tb observations of AMSR-E. The methodology utilizes Tb values under a moving-time-window-setup to estimate concurrently the soil moisture of each day and a constant VOD in the window. Prior to this step, surface roughness parameter is estimated using the complete time series of Tb record. Upon carrying out the necessary sensitivity analysis, the smoothened VOD along with soil moisture retrievals is generated for the 10-year duration of AMSR-E (2002-2011) with a 7-day moving window using the LPRM framework. The spatial patterns of resulted global VOD maps are in coherence with vegetation biomass and climate conditions. The VOD results also exhibit a smoothening effect in terms of lower values of standard deviation. This is also evident from time series comparison of VOD and LPRM VOD retrievals without optimization over moving windows at several grid locations across the globe. The global surface roughness map also exhibited spatial patterns that are strongly influenced by topography and land use conditions. Some of the noticeable features include high roughness over mountainous regions and heavily vegetated tropical rainforests, low roughness in desert areas and moderate roughness value over higher latitudes. The new datasets of VOD and surface roughness can help improving the quality of soil moisture retrievals. Also, the methodology proposed is generic by nature and can be implemented over currently operating AMSR2, SMOS, and SMAP soil moisture missions.
NASA Astrophysics Data System (ADS)
Dai, Junhu; Xu, Yunjia; Wang, Huanjiong; Alatalo, Juha; Tao, Zexing; Ge, Quansheng
2017-12-01
Continuous long-term temperature sensitivity (ST) of leaf unfolding date (LUD) and the main impacting factors in spring over the period 1978-2014 for 40 plant species in Mudanjiang, Heilongjiang Province, Northeast China, were analyzed using observation data from the China Phenological Observation Network (CPON), together with the corresponding meteorological data from the China Meteorological Data Service Center. Temperature sensitivities, defined as the slopes of the regression between LUD and mean temperature during the optimum preseason (OP), were analyzed using a 15-year moving window to determine their temporal trends. Major factors impacting ST were then chosen and evaluated by applying a random sampling method. The results showed that LUD was sensitive to the mean temperature in a defined period before phenophase onset for all plant species analyzed. Over the period 1978-2014, the mean ST of LUD for all plant species was −3.2 ± 0.49 days °C⁻¹. The moving window analysis revealed that 75% of species displayed increasing ST of LUD, with 55% showing significant increases (P < 0.05). ST for the other 25% exhibited a decreasing trend, with 17% showing significant decreases (P < 0.05). On average, ST increased by 16%, from −2.8 ± 0.83 days °C⁻¹ during 1980-1994 to −3.30 ± 0.65 days °C⁻¹ during 2000-2014. For species with later LUD and longer OP, ST tended to increase more, while species with earlier LUD and shorter OP tended to display a decreasing ST. The standard deviation of the preseason temperature affected the temporal variation in ST. Chilling conditions influenced ST for some species, but photoperiod limitation did not have significant or coherent effects on changes in ST.
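The moving-window estimate of ST, the regression slope of leaf unfolding date on preseason mean temperature within each 15-year window, can be expressed compactly. The sketch below uses synthetic data and a fixed window length; it illustrates the statistic only, not the CPON data handling or the optimum-preseason selection.

```python
import numpy as np

def moving_window_sensitivity(years, lud_doy, preseason_temp, window=15):
    """Slope (days per deg C) of LUD regressed on preseason mean temperature
    within each sliding window of `window` consecutive years."""
    slopes = []
    for start in range(len(years) - window + 1):
        t = preseason_temp[start:start + window]
        d = lud_doy[start:start + window]
        slope, _intercept = np.polyfit(t, d, 1)
        slopes.append((years[start + window - 1], slope))
    return slopes

# Synthetic example: a species whose LUD advances ~3 days per deg C, plus noise.
rng = np.random.default_rng(1)
years = np.arange(1978, 2015)
temp = 4.0 + 0.03 * (years - 1978) + rng.normal(0, 1.0, len(years))
lud = 120 - 3.0 * (temp - temp.mean()) + rng.normal(0, 2.0, len(years))

for end_year, st in moving_window_sensitivity(years, lud, temp)[:3]:
    print(f"window ending {end_year}: ST = {st:+.2f} days per deg C")
```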
Comparison of muscles activity of abled bodied and amputee subjects for around shoulder movement.
Kaur, Amanpreet; Agarwal, Ravinder; Kumar, Amod
2016-05-12
Worldwide, about 56% of amputees are upper-limb amputees. This research presents a method using a two-channel surface electromyogram (SEMG) signal recorded from around the shoulder to estimate changes in muscle activity in non-amputees and in the residual limb of transhumeral amputees for different arm movements. The goal is to identify the activity of different muscles near the shoulder in amputee and non-amputee persons. SEMG signals were acquired during three distinct exercises from three selected muscle locations around the shoulder. Participants were asked to move their dominant arm from an assigned position so that changes in muscle activity with position could be recorded. The results show that, for the same shoulder motion, activity in the scalene muscle is greater than in other muscles such as the pectoralis and infraspinatus. In addition, an STFT (Short-Time Fourier Transform) spectrogram with a window length of 256 samples, a maximum of 512 frequency bins, and a Hamming window was used to identify the signal with the maximum muscle activity and the best resolution in the spectrum plot. The results suggest that this analysis can be used to design a suitable device for around-shoulder prosthesis users based on the muscle activation of amputee persons.
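A spectrogram with the stated parameters (256-sample Hamming window, 512 frequency bins) can be computed with scipy. The sampling rate and the toy "SEMG-like" burst below are assumptions for illustration; the abstract does not give them.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 1000.0                                   # assumed SEMG sampling rate (Hz)
t = np.arange(0, 5, 1 / fs)
# Toy SEMG-like signal: a burst of activity between 1 s and 3 s on a noise floor.
rng = np.random.default_rng(0)
semg = 0.05 * rng.standard_normal(t.size)
burst = (t > 1) & (t < 3)
semg[burst] += np.sin(2 * np.pi * 80 * t[burst]) * (0.5 + 0.5 * rng.random(burst.sum()))

# 256-sample Hamming window, 512-point FFT (the parameters quoted in the abstract).
f, tt, Sxx = spectrogram(semg, fs=fs, window='hamming',
                         nperseg=256, noverlap=128, nfft=512)
# Locate the time window with the strongest total activity.
k = np.argmax(Sxx.sum(axis=0))
print(f"strongest activity at t = {tt[k]:.2f} s, "
      f"dominant frequency = {f[np.argmax(Sxx[:, k])]:.1f} Hz")
```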
Zhong, Ran; Xie, Haiyang; Kong, Fanzhi; Zhang, Qiang; Jahan, Sharmin; Xiao, Hua; Fan, Liuyin; Cao, Chengxi
2016-09-21
In this work, we developed the concept of enzyme catalysis-electrophoresis titration (EC-ET) under ideal conditions, the theory of EC-ET for multiplex enzymatic assay (MEA), and a related method based on a moving reaction boundary (MRB) chip with a collateral channel and cell phone imaging. As a proof of principle, the model enzymes horseradish peroxidase (HRP), laccase and myeloperoxidase (MPO) were chosen for the tests of the EC-ET model. The experiments revealed that the EC-ET model could be achieved via coupling EC with ET within a MRB chip; particularly the MEA analyses of catalysis rate, maximum rate, activity, Km and Kcat could be conducted via a single run of the EC-ET chip, systemically demonstrating the validity of the EC-ET theory. Moreover, the developed method had these merits: (i) two orders of magnitude higher sensitivity than a fluorescence microplate reader, (ii) simplicity and low cost, and (iii) fairly rapid (30 min incubation, 20 s imaging) analysis, fair stability (<5.0% RSD) and accuracy, thus validating the EC-ET method. Finally, the developed EC-ET method was used for the clinical assay of MPO activity in blood samples; the values of MPO activity detected via the EC-ET chip were in agreement with those obtained by a traditional fluorescence microplate reader, indicating the applicability of the EC-ET method. The work opens a window for the development of enzymatic research, enzyme assay, immunoassay, and point-of-care testing as well as titration, one of the oldest methods of analysis, based on a simple chip.
Multifractal Fluctuations of Jiuzhaigou Tourists Before and after Wenchuan Earthquake
NASA Astrophysics Data System (ADS)
Shi, Kai; Li, Wen-Yong; Liu, Chun-Qiong; Huang, Zheng-Wen
2013-03-01
In this work, multifractal methods are used to characterize the temporal fluctuations of daily domestic and foreign tourist numbers in Jiuzhai Valley before and after the Wenchuan earthquake in China, using the multifractal detrending moving average method (MF-DMA). The analysis shows that the Jiuzhai Valley tourism markets are characterized by long-term memory and a multifractal nature, and the major sources of multifractality are studied. Based on the concept of a sliding window, the time evolution of the multifractal behaviour of domestic and foreign tourist numbers is analyzed, and the influence of the Wenchuan earthquake on the dynamics of the Jiuzhai Valley tourism system is evaluated quantitatively. The study indicates that, from a long-term perspective, the inherent dynamical mechanism of the Jiuzhai Valley tourism system was not fundamentally changed, even though the system was seriously affected by the Wenchuan earthquake, and that the system was able to recover to its previous state in the short term.
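The fluctuation-function core of MF-DMA can be sketched as follows. This is a simplified backward-moving-average version applied to white noise rather than the tourist series; the scale grid, q values, and series length are assumptions, and it is not the authors' implementation.

```python
import numpy as np

def mfdma(series, scales, qs):
    """Simplified multifractal detrending moving average (backward MA).
    Returns h(q): the slope of log F_q(n) versus log n for each q."""
    y = np.cumsum(series - np.mean(series))           # profile
    logF = np.zeros((len(qs), len(scales)))
    for j, n in enumerate(scales):
        # Backward moving average of the profile over n points.
        kernel = np.ones(n) / n
        ma = np.convolve(y, kernel, mode='valid')     # ma[i] averages y[i:i+n]
        resid = y[n - 1:] - ma                        # detrended residual
        nseg = len(resid) // n
        seg = resid[:nseg * n].reshape(nseg, n)
        F2 = np.mean(seg ** 2, axis=1)                # per-segment variance
        for i, q in enumerate(qs):
            if q == 0:
                logF[i, j] = 0.5 * np.mean(np.log(F2))
            else:
                logF[i, j] = np.log(np.mean(F2 ** (q / 2.0))) / q
    # h(q) is the slope of log F_q(n) against log n.
    return np.array([np.polyfit(np.log(scales), logF[i], 1)[0]
                     for i in range(len(qs))])

rng = np.random.default_rng(2)
x = rng.standard_normal(10000)
scales = np.unique(np.logspace(1, 3, 15).astype(int))
qs = [-4, -2, 2, 4]
print("h(q) for white noise:", np.round(mfdma(x, scales, qs), 2))
# A roughly constant h(q) near 0.5 indicates monofractal behaviour; a spread of
# h(q) across q, as reported for the tourist series, indicates multifractality.
```

Applying this estimator inside a sliding window over the daily tourist counts gives the time evolution of the multifractal behaviour described in the abstract.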
An improved artifact removal in exposure fusion with local linear constraints
NASA Astrophysics Data System (ADS)
Zhang, Hai; Yu, Mali
2018-04-01
In exposure fusion, it is challenging to remove artifacts caused by camera motion and moving objects in the scene. An improved artifact removal method is proposed in this paper, which performs local linear adjustment during the artifact removal process. After determining a reference image, we first perform high-dynamic-range (HDR) deghosting to generate an intermediate image stack from the input image stack. Then, a linear Intensity Mapping Function (IMF) is extracted in each window based on the intensities of the intermediate and reference images and on the intensity mean and variance of the reference image. Finally, with the extracted local linear constraints, we reconstruct a target image stack, which can be directly used for fusing a single HDR-like image. Experiments have been carried out, and the results demonstrate that the proposed method is robust and effective in removing artifacts, especially in the saturated regions of the reference image.
Utility of correlation techniques in gravity and magnetic interpretation
NASA Technical Reports Server (NTRS)
Chandler, V. W.; Koski, J. S.; Braice, L. W.; Hinze, W. J.
1977-01-01
Internal correspondence uses Poisson's Theorem in a moving-window linear regression analysis between the anomalous first vertical derivative of gravity and total magnetic field reduced to the pole. The regression parameters provide critical information on source characteristics. The correlation coefficient indicates the strength of the relation between magnetics and gravity. Slope value gives delta j/delta sigma estimates of the anomalous source. The intercept furnishes information on anomaly interference. Cluster analysis consists of the classification of subsets of data into groups of similarity based on correlation of selected characteristics of the anomalies. Model studies are used to illustrate implementation and interpretation procedures of these methods, particularly internal correspondence. Analysis of the results of applying these methods to data from the midcontinent and a transcontinental profile shows they can be useful in identifying crustal provinces, providing information on horizontal and vertical variations of physical properties over province size zones, validating long wavelength anomalies, and isolating geomagnetic field removal problems.
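The internal-correspondence step amounts to a windowed linear regression of the reduced-to-pole magnetic field on the first vertical derivative of gravity, keeping the correlation coefficient, slope, and intercept in each window. The profile version below is a minimal sketch with synthetic anomalies; the window length and the synthetic sources are assumptions for the example.

```python
import numpy as np
from scipy import stats

def windowed_poisson_regression(grav_fvd, mag_rtp, window=25):
    """Moving-window regression of reduced-to-pole magnetics on the first
    vertical derivative of gravity along a profile. Returns, for each window
    centre, the correlation coefficient, slope (a delta j/delta sigma proxy)
    and intercept."""
    half = window // 2
    results = []
    for c in range(half, len(grav_fvd) - half):
        g = grav_fvd[c - half:c + half + 1]
        m = mag_rtp[c - half:c + half + 1]
        fit = stats.linregress(g, m)
        results.append((c, fit.rvalue, fit.slope, fit.intercept))
    return results

# Synthetic profile: one common source (correlated anomaly) plus one
# magnetics-only source, so the correlation drops away from the shared body.
x = np.linspace(0, 100, 400)
grav_fvd = np.exp(-((x - 30) / 5) ** 2)
mag_rtp = 4.0 * np.exp(-((x - 30) / 5) ** 2) + 2.0 * np.exp(-((x - 70) / 5) ** 2)
for c, r, slope, b in windowed_poisson_regression(grav_fvd, mag_rtp)[::60]:
    print(f"x={x[c]:5.1f}  r={r:+.2f}  slope={slope:5.2f}  intercept={b:+.2f}")
```

Near the shared body the correlation is high and the slope recovers the imposed ratio, while windows dominated by the magnetics-only source show weak correlation, which is the diagnostic behaviour the method exploits.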
Random domain name and address mutation (RDAM) for thwarting reconnaissance attacks
Chen, Xi; Zhu, Yuefei
2017-01-01
Network address shuffling is a novel moving target defense (MTD) that invalidates the address information collected by the attacker by dynamically changing or remapping the host’s network addresses. However, most network address shuffling methods are limited by the limited address space and rely on the host’s static domain name to map to its dynamic address; therefore these methods cannot effectively defend against random scanning attacks, and cannot defend against an attacker who knows the target’s domain name. In this paper, we propose a network defense method based on random domain name and address mutation (RDAM), which increases the scanning space of the attacker through a dynamic domain name method and reduces the probability that a host will be hit by an attacker scanning IP addresses using the domain name system (DNS) query list and the time window methods. Theoretical analysis and experimental results show that RDAM can defend against scanning attacks and worm propagation more effectively than general network address shuffling methods, while introducing an acceptable operational overhead. PMID:28489910
Foo, Lee Kien; McGree, James; Duffull, Stephen
2012-01-01
Optimal design methods have been proposed to determine the best sampling times when sparse blood sampling is required in clinical pharmacokinetic studies. However, the optimal blood sampling time points may not be feasible in clinical practice. Sampling windows, a time interval for blood sample collection, have been proposed to provide flexibility in blood sampling times while preserving efficient parameter estimation. Because of the complexity of the population pharmacokinetic models, which are generally nonlinear mixed effects models, there is no analytical solution available to determine sampling windows. We propose a method for determination of sampling windows based on MCMC sampling techniques. The proposed method attains a stationary distribution rapidly and provides time-sensitive windows around the optimal design points. The proposed method is applicable to determine sampling windows for any nonlinear mixed effects model although our work focuses on an application to population pharmacokinetic models. Copyright © 2012 John Wiley & Sons, Ltd.
Transportable Applications Environment Plus, Version 5.1
NASA Technical Reports Server (NTRS)
1994-01-01
Transportable Applications Environment Plus (TAE+) computer program providing integrated, portable programming environment for developing and running application programs based on interactive windows, text, and graphical objects. Enables both programmers and nonprogrammers to construct own custom application interfaces easily and to move interfaces and application programs to different computers. Used to define corporate user interface, with noticeable improvements in application developer's and end user's learning curves. Main components are: WorkBench, What You See Is What You Get (WYSIWYG) software tool for design and layout of user interface; and WPT (Window Programming Tools) Package, set of callable subroutines controlling user interface of application program. WorkBench and WPTs written in C++, and remaining code written in C.
2003-03-28
KENNEDY SPACE CENTER, FLA. - In the Payload Hazardous Servicing Facility, workers move the Mars Exploration Rover-2 (MER-2) into position over the base petal of its lander assembly. Set to launch in Spring 2003, the MER Mission will consist of two identical rovers designed to cover roughly 110 yards each Martian day over various terrain. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The rovers will be identical to each other, but will land at different regions of Mars. The first rover has a launch window opening May 30, and the second rover, a window opening June 25.
2003-01-28
KENNEDY SPACE CENTER, FLA. - After being cleaned up, the Mars Exploration Rover -2 is ready to be moved to a workstand in the Payload Hazardous Servicing Facility. Set to launch in 2003, the Mars Exploration Rover Mission will consist of two identical rovers designed to cover roughly 110 yards (100 meters) each Martian day. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The rovers will be identical to each other, but will land at different regions of Mars. The first rover has a launch window opening May 30, 2003, and the second rover a window opening June 25, 2003.
2003-03-28
KENNEDY SPACE CENTER, FLA. - In the Payload Hazardous Servicing Facility, workers move the Mars Exploration Rover-2 (MER-2) towards the base petal of its lander assembly. Set to launch in Spring 2003, the MER Mission will consist of two identical rovers designed to cover roughly 110 yards each Martian day over various terrain. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The rovers will be identical to each other, but will land at different regions of Mars. The first rover has a launch window opening May 30, and the second rover, a window opening June 25.
2003-01-31
KENNEDY SPACE CENTER, FLA. - Suspended by an overhead crane in the Payload Hazardous Servicing Facility, the Mars Exploration Rover (MER) aeroshell is guided by workers as it moves to a rotation stand. Set to launch in 2003, the MER Mission will consist of two identical rovers designed to cover roughly 110 yards (100 meters) each Martian day. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The rovers will be identical to each other, but will land at different regions of Mars. The first rover has a launch window opening May 30, and the second rover a window opening June 25, 2003.
2014-01-01
Background: In order to characterize the intracranial pressure-volume reserve capacity, the correlation coefficient (R) between the ICP wave amplitude (A) and the mean ICP level (P), the RAP index, has been used to improve the diagnostic value of ICP monitoring. Baseline pressure errors (BPEs), caused by spontaneous shifts or drifts in baseline pressure, cause erroneous readings of mean ICP. Consequently, BPEs could also affect ICP indices such as RAP, in which the mean ICP is incorporated. Methods: A prospective, observational study was carried out on patients with aneurysmal subarachnoid hemorrhage (aSAH) undergoing ICP monitoring as part of their surveillance. Via the same burr hole in the skull, two separate ICP sensors were placed close to each other. For each consecutive 6-sec time window, the dynamic mean ICP wave amplitude (MWA; a measure of the amplitude of the single pressure waves) and the static mean ICP were computed. The RAP index was computed as the Pearson correlation coefficient between the MWA and the mean ICP for 40 6-sec time windows, i.e. every subsequent 4-min period (method 1). We compared this approach with a method of calculating RAP using a 4-min moving window updated every 6 seconds (method 2). Results: The study included 16 aSAH patients. We compared 43,653 4-min RAP observations of signals 1 and 2 (method 1), and 1,727,000 6-sec RAP observations (method 2). The two methods of calculating RAP produced similar results. Differences in RAP ≥0.4 in at least 7% of observations were seen in 5/16 (31%) patients. Moreover, the combination of a RAP of ≥0.6 in one signal and <0.6 in the other was seen in ≥13% of RAP observations in 4/16 (25%) patients, and in ≥8% in another 4/16 (25%) patients. The frequency of differences in RAP >0.2 was significantly associated with the frequency of BPEs (5 mmHg ≤ BPE <10 mmHg). Conclusions: Simultaneous monitoring from two separate, close-by ICP sensors reveals significant differences in RAP that correspond to the occurrence of BPEs. As differences in RAP are of magnitudes that may alter patient management, we do not advocate the use of RAP in the management of neurosurgical patients. PMID:25052470
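The two RAP calculations described (method 1: fixed, consecutive blocks of 40 six-second windows; method 2: a 4-minute window advanced every 6 seconds) can be sketched as follows. The per-6-second MWA and mean ICP inputs below are synthetic, and the coupling between them is an assumption made purely for illustration.

```python
import numpy as np

def rap_fixed_blocks(mwa, mean_icp, block=40):
    """Method 1: Pearson correlation over consecutive, non-overlapping blocks
    of 40 six-second windows (one RAP value per 4 minutes)."""
    n = len(mwa) // block
    return np.array([np.corrcoef(mwa[i * block:(i + 1) * block],
                                 mean_icp[i * block:(i + 1) * block])[0, 1]
                     for i in range(n)])

def rap_moving(mwa, mean_icp, block=40):
    """Method 2: the same correlation in a 4-minute window advanced by one
    6-second step at a time (one RAP value per 6 seconds)."""
    return np.array([np.corrcoef(mwa[i:i + block], mean_icp[i:i + block])[0, 1]
                     for i in range(len(mwa) - block + 1)])

# Synthetic hour of 6-second values: wave amplitude loosely tracks mean ICP.
rng = np.random.default_rng(3)
mean_icp = 12 + 3 * np.sin(np.linspace(0, 6 * np.pi, 600)) + rng.normal(0, 1, 600)
mwa = 2 + 0.2 * (mean_icp - 12) + rng.normal(0, 0.3, 600)
print("block RAP values:", np.round(rap_fixed_blocks(mwa, mean_icp)[:5], 2))
moving = rap_moving(mwa, mean_icp)
print("moving RAP range:", np.round(moving.min(), 2), "to", np.round(moving.max(), 2))
```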
Text-Based Recall and Extra-Textual Generations Resulting from Simplified and Authentic Texts
ERIC Educational Resources Information Center
Crossley, Scott A.; McNamara, Danielle S.
2016-01-01
This study uses a moving windows self-paced reading task to assess text comprehension of beginning and intermediate-level simplified texts and authentic texts by L2 learners engaged in a text-retelling task. Linear mixed effects (LME) models revealed statistically significant main effects for reading proficiency and text level on the number of…
Best Practices Case Study: Schneider Homes, Inc. - Village at Miller Creek, Burien, W
DOE Office of Scientific and Technical Information (OSTI.GOV)
none,
2010-09-01
Case study of Schneider Homes, who achieved 50% savings over the 2004 IECC with analysis and recommendations from DOE’s Building America including moving ducts and furnace into conditioned space, R-23 blown fiberglass in the walls and R-38 in the attics, and high-performance HVAC, lighting, appliances, and windows.
ERIC Educational Resources Information Center
Tam, Cynthia; Wells, David
2009-01-01
Visual-cognitive loads influence the effectiveness of word prediction technology. Adjusting parameters of word prediction programs can lessen visual-cognitive loads. This study evaluated the benefits of WordQ word prediction software for users' performance when the prediction window was moved to a personal digital assistant (PDA) device placed at…
Initial Scene Representations Facilitate Eye Movement Guidance in Visual Search
ERIC Educational Resources Information Center
Castelhano, Monica S.; Henderson, John M.
2007-01-01
What role does the initial glimpse of a scene play in subsequent eye movement guidance? In 4 experiments, a brief scene preview was followed by object search through the scene via a small moving window that was tied to fixation position. Experiment 1 demonstrated that the scene preview resulted in more efficient eye movements compared with a…
Visualising Cultures: The "European Picture Book Collection" Moves "Down Under"
ERIC Educational Resources Information Center
Cotton, Penni; Daly, Nicola
2015-01-01
The potential for picture books in national collections to act as mirrors reflecting the reader's cultural identity, is widely accepted. This paper shows that the books in a New Zealand Picture Book Collection can also become windows into unfamiliar worlds for non-New Zealand readers, giving them the opportunity to learn more about a context in…
Pereira, Telma; Lemos, Luís; Cardoso, Sandra; Silva, Dina; Rodrigues, Ana; Santana, Isabel; de Mendonça, Alexandre; Guerreiro, Manuela; Madeira, Sara C
2017-07-19
Predicting progression from a stage of Mild Cognitive Impairment (MCI) to dementia is a major pursuit in current research. It is broadly accepted that cognition declines along a continuum between MCI and dementia. As such, cohorts of MCI patients are usually heterogeneous, containing patients at different stages of the neurodegenerative process, which hampers the prognostic task. Nevertheless, when learning prognostic models, most studies use the entire cohort of MCI patients regardless of their disease stages. In this paper, we propose a Time Windows approach to predict conversion to dementia, learning with patients stratified using time windows and thus fine-tuning the prognosis with respect to the time to conversion. In the proposed Time Windows approach, we grouped patients based on the clinical information of whether they converted (converter MCI) or remained MCI (stable MCI) within a specific time window. We tested time windows of 2, 3, 4 and 5 years. We developed a prognostic model for each time window using clinical and neuropsychological data and compared this approach with the one commonly used in the literature, in which all patients are used to learn the models, here referred to as the First Last approach. This makes it possible to move from the traditional question "Will an MCI patient convert to dementia somewhere in the future?" to the question "Will an MCI patient convert to dementia in a specific time window?". The proposed Time Windows approach outperformed the First Last approach. The results showed that we can predict conversion to dementia as early as 5 years before the event, with an AUC of 0.88 in the cross-validation set and 0.76 in an independent validation set. Prognostic models using time windows achieve higher performance when predicting progression from MCI to dementia than the prognostic approach commonly used in the literature. Furthermore, the proposed Time Windows approach is more relevant from a clinical point of view, predicting conversion within a temporal interval rather than at some time in the future and allowing clinicians to adjust treatments and clinical appointments in a timely manner.
Simplified Computation for Nonparametric Windows Method of Probability Density Function Estimation.
Joshi, Niranjan; Kadir, Timor; Brady, Michael
2011-08-01
Recently, Kadir and Brady proposed a method for estimating probability density functions (PDFs) for digital signals which they call the Nonparametric (NP) Windows method. The method involves constructing a continuous space representation of the discrete space and sampled signal by using a suitable interpolation method. NP Windows requires only a small number of observed signal samples to estimate the PDF and is completely data driven. In this short paper, we first develop analytical formulae to obtain the NP Windows PDF estimates for 1D, 2D, and 3D signals, for different interpolation methods. We then show that the original procedure to calculate the PDF estimate can be significantly simplified and made computationally more efficient by a judicious choice of the frame of reference. We have also outlined specific algorithmic details of the procedures enabling quick implementation. Our reformulation of the original concept has directly demonstrated a close link between the NP Windows method and the Kernel Density Estimator.
NASA Astrophysics Data System (ADS)
Kang, Jae-sik; Oh, Eun-Joo; Bae, Min-Jung; Song, Doo-Sam
2017-12-01
Given that the Korean government is implementing the energy standards and labelling program for windows, window companies are required to assign window ratings based on experimental results for their products. Because this adds to the cost and time required for laboratory tests by window companies, a simulation system for the thermal performance of windows has been prepared to compensate for these time and cost burdens. In Korea, the thermal performance of a window is usually calculated with the WINDOW/THERM simulator, complying with ISO 15099. For a single window, the simulation results are similar to experimental results. A double window is also calculated using the same method, but the calculation results for this type of window are unreliable: ISO 15099 does not provide an adequate method for calculating the thermal properties of the air cavity between the window sashes of a double window, which causes a difference between simulation and experimental results for the thermal performance of a double window. In this paper, the thermal properties of air cavities between window sashes in a double window are analyzed through computational fluid dynamics (CFD) simulations, and the results are compared to calculation results certified by ISO 15099. The surface temperature of the air cavity analyzed by CFD is compared to the experimental temperatures. These results show that an appropriate calculation method for the air cavity between window sashes in a double window should be established to obtain reliable thermal performance results for a double window.
Size and Location of Defects at the Coupling Interface Affect Lithotripter Performance
Li, Guangyan; Williams, James C.; Pishchalnikov, Yuri A.; Liu, Ziyue; McAteer, James A.
2012-01-01
OBJECTIVE To determine how the size and location of coupling defects caught between the therapy head of a lithotripter and the skin of a surrogate patient (acoustic window of a test chamber) affect the features of shock waves responsible for stone breakage. METHODS Model defects were placed in the coupling gel between the therapy head of a Dornier Compact-S electromagnetic lithotripter and the Mylar window of a water-filled coupling test system. A fiber-optic hydrophone was used to measure acoustic pressures and map the lateral dimensions of the focal zone of the lithotripter. The effect of coupling conditions on stone breakage was assessed using Gypsum model stones. RESULTS Stone breakage decreased in proportion to the area of the coupling defect; a centrally located defect blocking only 18% of the transmission area reduced stone breakage by an average of almost 30%. The effect on stone breakage was greater for defects located on-axis and decreased as the defect was moved laterally; an 18% defect located near the periphery of the coupling window (2.0 cm off-axis) reduced stone breakage by only ~15% compared to when coupling was completely unobstructed. Defects centered within the coupling window acted to narrow the focal width of the lithotripter; an 8.2% defect reduced the focal width ~30% compared to no obstruction (4.4 mm versus 6.5 mm). Coupling defects located slightly off center disrupted the symmetry of the acoustic field; an 18% defect positioned 1.0 cm off-axis shifted the focus of maximum positive pressure ~1.0 mm laterally. Defects on and off-axis imposed a significant reduction in the energy density of shock waves across the focal zone. CONCLUSIONS In addition to blocking the transmission of shock wave energy, coupling defects also disrupt the properties of shock waves that play a role in stone breakage, including the focal width of the lithotripter and the symmetry of the acoustic field; the effect is dependent on the size and location of defects, with defects near the center of the coupling window having the greatest effect. These data emphasize the importance of eliminating air pockets from the coupling interface, particularly defects located near the center of the coupling window. PMID:22938566
Electron wind in strong wave guide fields
NASA Astrophysics Data System (ADS)
Krienen, F.
1985-03-01
The X-ray activity observed near highly powered waveguide structures is usually caused by local electric discharges originating from discontinuities such as couplers, tuners or bends. In traveling waves, electrons move in the direction of the power flow. Seed electrons can multipactor in a traveling wave; the moving charge pattern is different from that of the multipactor in a resonant structure and is self-extinguishing. The charge density in the waveguide will modify the impedance and propagation constant of the waveguide. The radiation level inside the output waveguide of the SLAC 50 MW S-band klystron is estimated. Possible contributions of radiation to window failure are discussed.
2001-05-29
KODIAK ISLAND, Alaska -- A boat moves a ramp into place that will allow Castor 120, the first stage of the Athena 1 launch vehicle, to safely move onto the dock at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5:00 to 7:00 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, J; Shi, W; Andrews, D
2015-06-15
Purpose To compare online image registrations of TrueBeam cone-beam CT (CBCT) and BrainLab ExacTrac imaging systems. Methods Tests were performed on a Varian TrueBeam STx linear accelerator (Version 2.0), which is integrated with a BrainLab ExacTrac imaging system (Version 6.0.5). The study was focused on comparing the online image registrations for translational shifts. A Rando head phantom was placed on the treatment couch and immobilized with a BrainLab mask. The phantom was shifted by moving the couch translationally over 8 mm with a step size of 1 mm, in the vertical, longitudinal, and lateral directions, respectively. At each location, the phantom was imaged with CBCT and ExacTrac x-ray. CBCT images were registered with TrueBeam and ExacTrac online registration algorithms, respectively, and ExacTrac x-ray image registrations were performed. Shifts calculated from the different registrations were compared with nominal couch shifts. Results The averages and ranges of absolute differences between couch shifts and calculated phantom shifts obtained from ExacTrac x-ray registration, ExacTrac CBCT registration with default window, ExacTrac CBCT registration with adjusted window (bone), TrueBeam CBCT registration with bone window, and TrueBeam CBCT registration with soft tissue window, were: 0.07 (0.02–0.14), 0.14 (0.01–0.35), 0.12 (0.02–0.28), 0.09 (0–0.20), and 0.06 (0–0.10) mm in the vertical direction; 0.06 (0.01–0.12), 0.27 (0.07–0.57), 0.23 (0.02–0.48), 0.04 (0–0.10), and 0.08 (0–0.20) mm in the longitudinal direction; 0.05 (0.01–0.21), 0.35 (0.14–0.80), 0.25 (0.01–0.56), 0.19 (0–0.40), and 0.20 (0–0.40) mm in the lateral direction. Conclusion The shifts calculated from ExacTrac x-ray and TrueBeam CBCT registrations were close to each other (the differences between them were less than 0.40 mm in any direction) and agreed better with the couch shifts than those from ExacTrac CBCT registrations. There were no significant differences between TrueBeam CBCT registrations using different windows. In ExacTrac CBCT registrations, using the bone window led to better agreement than using the default window.
Measuring floodplain spatial patterns using continuous surface metrics at multiple scales
Scown, Murray W.; Thoms, Martin C.; DeJager, Nathan R.
2015-01-01
Interactions between fluvial processes and floodplain ecosystems occur upon a floodplain surface that is often physically complex. Spatial patterns in floodplain topography have only recently been quantified over multiple scales, and discrepancies exist in how floodplain surfaces are perceived to be spatially organised. We measured spatial patterns in floodplain topography for pool 9 of the Upper Mississippi River, USA, using moving window analyses of eight surface metrics applied to a 1 × 1 m DEM over multiple scales. The metrics used were Range, SD, Skewness, Kurtosis, CV, SDCURV, Rugosity, and Vol:Area, and window sizes ranged from 10 to 1000 m in radius. Surface metric values were highly variable across the floodplain and revealed a high degree of spatial organisation in floodplain topography. Moran's I correlograms fit to the landscape of each metric at each window size revealed that patchiness existed at nearly all window sizes, but the strength and scale of patchiness changed with window size, suggesting that multiple scales of patchiness and patch structure exist in the topography of this floodplain. Scale thresholds in the spatial patterns were observed, particularly between the 50 and 100 m window sizes for all surface metrics and between the 500 and 750 m window sizes for most metrics. These threshold scales are ~ 15–20% and 150% of the main channel width (1–2% and 10–15% of the floodplain width), respectively. These thresholds may be related to structuring processes operating across distinct scale ranges. By coupling surface metrics, multi-scale analyses, and correlograms, quantifying floodplain topographic complexity is possible in ways that should assist in clarifying how floodplain ecosystems are structured.
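A minimal sketch of the moving-window idea behind such surface metrics, using SciPy's generic_filter on a synthetic grid; the DEM, window radii, and the two metrics shown (local range and SD) are illustrative choices, not the study's data or parameters.

```python
# Minimal sketch: per-cell moving-window surface metrics (local range and SD) on a DEM.
# The synthetic grid and window radii are illustrative, not the study's data or scales.
import numpy as np
from scipy.ndimage import generic_filter

rng = np.random.default_rng(0)
dem = rng.normal(0.0, 0.5, size=(120, 120))             # stand-in for a 1 x 1 m elevation grid (m)

def window_metrics(dem, radius):
    """Return per-cell elevation range and standard deviation in a square moving window."""
    size = 2 * radius + 1
    local_range = generic_filter(dem, lambda v: v.max() - v.min(), size=size)
    local_sd = generic_filter(dem, np.std, size=size)
    return local_range, local_sd

for radius in (3, 10):                                   # cells; analogous to the 10-1000 m radii used
    range_map, sd_map = window_metrics(dem, radius)
    print(f"radius={radius}: mean range={range_map.mean():.2f} m, mean SD={sd_map.mean():.2f} m")
```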
Muilenberg, M L; Skellenger, W S; Burge, H A; Solomon, W R
1991-02-01
Penetration of particulate aeroallergens into the interiors of two new, similar Chrysler Corporation passenger vehicles (having no evidence of intrinsic microbial contamination) was studied on a large circular test track during periods of high pollen and spore prevalence. Impactor collections were obtained at front and rear seat points and at the track center during periods with (1) windows and vents closed and air conditioning on (AC), (2) windows closed, vents open, and no air conditioning (VO), and (3) air conditioner off, front windows open, and vents closed (WO). These conditions were examined sequentially during travel at 40, 50, 60, and 80 kph. Particle recoveries within the vehicles did not vary with the speed of travel, either overall or with regard to each of the three ventilatory modalities. In addition, collections at front and rear seat sampling points were comparable. The highest interior aeroallergen levels were recorded with windows open (WO), yet these levels averaged only half the concurrent outside concentrations at track center. Recoveries within the cars were well below recoveries obtained outside when windows were closed (both VO and AC modes). These findings suggest window ventilation as an overriding factor determining particle ingress into moving vehicles. Efforts to delineate additional determinants of exposure by direct sampling are feasible and would appear essential in formulating realistic strategies of avoidance.
NASA Astrophysics Data System (ADS)
Tavakkol, Sasan; Lynett, Patrick
2017-08-01
In this paper, we introduce an interactive coastal wave simulation and visualization software package, called Celeris. Celeris is open-source software that requires minimal preparation to run on a Windows machine. The software solves the extended Boussinesq equations using a hybrid finite volume-finite difference method and supports moving shoreline boundaries. The simulation and visualization are performed on the GPU using Direct3D libraries, which enables the software to run faster than real time. Celeris provides a first-of-its-kind interactive modeling platform for coastal wave applications, and it supports simultaneous visualization with both photorealistic and colormapped rendering capabilities. We validate our software through comparison with three standard benchmarks for non-breaking and breaking waves.
Hydrofluoric acid-resistant composite window and method for its fabrication
Ostenak, C.A.; Mackay, H.A.
1985-07-18
A hydrofluoric acid-resistant composite window and method for its fabrication are disclosed. The composite window comprises a window having first and second sides. The first side is oriented towards an environment containing hydrofluoric acid. An adhesive is applied to the first side. A layer of transparent hydrofluoric acid-resistant material, such as Mylar, is applied to the adhesive and completely covers the first side. The adhesive is then cured.
Hydrofluoric acid-resistant composite window and method for its fabrication
Ostenak, Carl A.; Mackay, Harold A.
1987-01-01
A hydrofluoric acid-resistant composite window and method for its fabrication are disclosed. The composite window comprises a window having first and second sides. The first side is oriented towards an environment containing hydrofluoric acid. An adhesive is applied to the first side. A layer of transparent hydrofluoric acid-resistant material, such as Mylar, is applied to the adhesive and completely covers the first side. The adhesive is then cured.
In Search of Conversational Grain Size: Modelling Semantic Structure Using Moving Stanza Windows
ERIC Educational Resources Information Center
Siebert-Evenstone, Amanda L.; Irgens, Golnaz Arastoopour; Collier, Wesley; Swiecki, Zachari; Ruis, Andrew R.; Shaffer, David Williamson
2017-01-01
Analyses of learning based on student discourse need to account not only for the content of the utterances but also for the ways in which students make connections across turns of talk. This requires segmentation of discourse data to define when connections are likely to be meaningful. In this paper, we present an approach to segmenting data for…
View of the ISS stack as seen during the fly-around by the STS-96 crew
2017-04-20
S96-E-5218 (3 June 1999) --- Partially silhouetted over clouds and a wide expanse of ocean waters, the unmanned International Space Station (ISS) moves away from the Space Shuttle Discovery. An electronic still camera (ESC) was aimed through aft flight deck windows to capture the image at 23:01:00 GMT, June 3, 1999.
ERIC Educational Resources Information Center
Crossley, Scott A.; Yang, Hae Sung; McNamara, Danielle S.
2014-01-01
This study uses a moving windows self-paced reading task to assess both text comprehension and processing time of authentic texts and these same texts simplified to beginning and intermediate levels. Forty-eight second language learners each read 9 texts (3 different authentic, beginning, and intermediate level texts). Repeated measures ANOVAs…
The Advantage of Word-Based Processing in Chinese Reading: Evidence from Eye Movements
ERIC Educational Resources Information Center
Li, Xingshan; Gu, Junjuan; Liu, Pingping; Rayner, Keith
2013-01-01
In 2 experiments, we tested the prediction that reading is more efficient when characters belonging to a word are presented simultaneously than when they are not in Chinese reading using a novel variation of the moving window paradigm (McConkie & Rayner, 1975). In Experiment 1, we found that reading was slowed down when Chinese readers could…
Moving-window dynamic optimization: design of stimulation profiles for walking.
Dosen, Strahinja; Popović, Dejan B
2009-05-01
The overall goal of the research is to improve control for electrical stimulation-based assistance of walking in hemiplegic individuals. We present a simulation for generating an offline input (sensors)-output (intensity of muscle stimulation) representation of walking that serves in synthesizing a rule base for control of electrical stimulation for restoration of walking. The simulation uses a new algorithm termed moving-window dynamic optimization (MWDO). The optimization criterion was to minimize the sum of the squares of tracking errors from desired trajectories, with a penalty function on the total muscle efforts. The MWDO was developed in the MATLAB environment and tested using target trajectories characteristic of slow-to-normal walking recorded in a healthy individual and a model with parameters characterizing the potential hemiplegic user. The outputs of the simulation are piecewise-constant intensities of electrical stimulation and the trajectories generated when the calculated stimulation is applied to the model. We demonstrated the importance of this simulation by showing the outputs for healthy and hemiplegic individuals, using the same target trajectories. Results of the simulation show that the MWDO is an efficient tool for analyzing achievable trajectories and for determining the stimulation profiles that need to be delivered for good tracking.
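A toy sketch of the moving-window optimization idea, not the authors' MATLAB implementation: the first-order plant, horizon length, effort weight, and stimulation bounds below are all assumptions made for illustration.

```python
# Toy sketch of moving-window dynamic optimization: over a short window, choose piecewise-
# constant stimulation that minimizes squared tracking error plus an effort penalty, apply
# the first value, then slide the window forward one step.
import numpy as np
from scipy.optimize import minimize

dt, horizon, effort_weight = 0.02, 5, 0.01               # illustrative values
target = np.sin(np.linspace(0.0, np.pi, 200))            # stand-in for a recorded joint trajectory

def simulate(x0, u):
    """Toy first-order 'muscle + limb' model: the angle relaxes toward the stimulation level."""
    x, out = x0, []
    for ui in u:
        x = x + dt * (ui - x)
        out.append(x)
    return np.array(out)

x, states = 0.0, []
for k in range(len(target) - horizon):
    ref = target[k + 1:k + 1 + horizon]
    cost = lambda u: np.sum((simulate(x, u) - ref) ** 2) + effort_weight * np.sum(u ** 2)
    u_opt = minimize(cost, np.full(horizon, x), bounds=[(0.0, 1.0)] * horizon).x
    x = simulate(x, u_opt[:1])[-1]                        # apply only the first stimulation level
    states.append(x)

rmse = np.sqrt(np.mean((np.array(states) - target[1:len(states) + 1]) ** 2))
print(f"tracking RMSE over the cycle: {rmse:.3f}")
```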
Yang, Ping; Dumont, Guy A; Ansermino, J Mark
2009-04-01
Intraoperative heart rate is routinely measured independently from the ECG monitor, pulse oximeter, and the invasive blood pressure monitor if available. The presence of artifacts in one or more of these signals, especially sustained artifacts, represents a critical challenge for physiological monitoring. When temporal filters are used to suppress sustained artifacts, unwanted delays or signal distortion are often introduced. The aim of this study was to remove artifacts and derive accurate estimates for the heart rate signal by using measurement redundancy. Heart rate measurements from multiple sensors and previous estimates that fall in a short moving window were treated as samples of the same heart rate. A hybrid median filter was used to align these samples into one ordinal series and to select the median as the fused estimate. This method can successfully remove artifacts that are sustained for shorter than half the length of the filter window, or artifacts that are sustained for a longer duration but present in no more than half of the sensors. The method was tested on both simulated and clinical cases. The performance of the hybrid median filter in the simulated study was compared with that of a two-step estimation process, comprising a threshold-controlled artifact-removal module and a Kalman filter. The estimation accuracy of the hybrid median filter is better than that of the Kalman filter in the presence of artifacts. The hybrid median filter combines the structural and temporal information from two or more sensors and generates a robust estimate of heart rate without requiring strict assumptions about the signal's characteristics. This method is intuitive, computationally simple, and the performance can be easily adjusted. These considerable benefits make this method highly suitable for clinical use.
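A minimal sketch of windowed median fusion of redundant heart-rate channels; pooling raw sensor samples with previous fused estimates is a simplification of the hybrid median filter described above, and the five-beat window and synthetic signals are assumptions.

```python
# Fuse per-beat heart-rate readings from three monitors by taking a windowed median; a
# sustained artifact on one channel is rejected as long as the other channels agree.
import numpy as np

def hybrid_median_hr(ecg_hr, spo2_hr, abp_hr, window=5):
    fused = []
    for i in range(len(ecg_hr)):
        lo = max(0, i - window + 1)
        pool = np.concatenate([ecg_hr[lo:i + 1], spo2_hr[lo:i + 1],
                               abp_hr[lo:i + 1], fused[lo:i + 1]])
        fused.append(float(np.median(pool)))
    return np.array(fused)

rng = np.random.default_rng(0)
true_hr = np.full(60, 75.0)
ecg = true_hr + rng.normal(0, 1, 60)
spo2 = true_hr + rng.normal(0, 1, 60)
abp = true_hr.copy()
abp[20:28] = 150.0                                        # sustained artifact on one sensor only
print(hybrid_median_hr(ecg, spo2, abp)[20:28].round(1))   # stays near 75 despite the artifact
```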
climwin: An R Toolbox for Climate Window Analysis.
Bailey, Liam D; van de Pol, Martijn
2016-01-01
When studying the impacts of climate change, there is a tendency to select climate data from a small set of arbitrary time periods or climate windows (e.g., spring temperature). However, these arbitrary windows may not encompass the strongest periods of climatic sensitivity and may lead to erroneous biological interpretations. Therefore, there is a need to consider a wider range of climate windows to better predict the impacts of future climate change. We introduce the R package climwin that provides a number of methods to test the effect of different climate windows on a chosen response variable and compare these windows to identify potential climate signals. climwin extracts the relevant data for each possible climate window and uses this data to fit a statistical model, the structure of which is chosen by the user. Models are then compared using an information criteria approach. This allows users to determine how well each window explains variation in the response variable and compare model support between windows. climwin also contains methods to detect type I and II errors, which are often a problem with this type of exploratory analysis. This article presents the statistical framework and technical details behind the climwin package and demonstrates the applicability of the method with a number of worked examples.
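A rough sketch of the window-comparison logic, written in Python rather than R (so it is not the climwin API itself): every candidate window of daily temperature is summarized, the same linear model is refit, and windows are ranked by AIC. The synthetic data, the day grid, and the "true" window are assumptions.

```python
# Rank candidate climate windows by the AIC of a simple linear model fit to each window mean.
import numpy as np

rng = np.random.default_rng(1)
n_years, n_days = 40, 365
temp = rng.normal(10.0, 3.0, size=(n_years, n_days))      # daily mean temperature per year
signal = temp[:, 100:130].mean(axis=1)                    # the "true" sensitive window: days 100-129
response = 2.0 + 0.8 * signal + rng.normal(0.0, 1.0, n_years)

def aic_linear(x, y):
    """AIC of the Gaussian linear regression y ~ x (k = intercept, slope, error variance)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return len(y) * np.log(rss / len(y)) + 2 * 3

candidates = [(aic_linear(temp[:, o:c].mean(axis=1), response), o, c)
              for o in range(0, 300, 10)
              for c in range(o + 10, min(o + 200, 365), 10)]
aic, opened, closed = min(candidates)
print(f"best-supported window: days {opened}-{closed - 1}, AIC = {aic:.1f}")
```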
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chitsazzadeh, S; Wells, D; Mestrovic, A
2016-06-15
Purpose: To develop a QA procedure for gated VMAT stereotactic ablative radiotherapy (SABR) treatments. Methods: An interface was constructed to attach the translational stage of a Quasar respiratory motion phantom to a pinpoint ion chamber insert and move the ion chamber inside an ArcCheck diode array. The Quasar phantom controller used a patient specific breathing pattern to translate the ion chamber in a superior-inferior direction inside the ArcCheck. An amplitude-based RPM tracking system was specified to turn the beam on during the exhale phase of the breathing pattern. SABR plans were developed using Eclipse for liver PTVs ranging in sizemore » from 3-12 cm in diameter using a 2-arc VMAT technique. Dose was measured in the middle of the penumbra region, where the high dose gradient allowed for sensitive detection of any inaccuracies in gated dose delivery. The overall fidelity of the dose distribution was confirmed using ArcCheck. The sensitivity of the gating QA procedure was investigated with respect to the following four parameters: PTV size, duration of exhale, baseline drift, and gating window size. Results: The difference between the measured dose to a point in the penumbra and the Eclipse calculated dose was under 2% for small residual motions. The QA procedure was independent of PTV size and duration of exhale. Baseline drift and gating window size, however, significantly affected the penumbral dose measurement, with differences of up to 30% compared to Eclipse. Conclusion: This study described a highly sensitive QA procedure for gated VMAT SABR treatments. The QA outcome was dependent on the gating window size and baseline drift. Analysis of additional patient breathing patterns will be required to determine a clinically relevant gating window size and an appropriate tolerance level for this procedure.« less
Does preprocessing change nonlinear measures of heart rate variability?
Gomes, Murilo E D; Guimarães, Homero N; Ribeiro, Antônio L P; Aguirre, Luis A
2002-11-01
This work investigated if methods used to produce a uniformly sampled heart rate variability (HRV) time series significantly change the deterministic signature underlying the dynamics of such signals and some nonlinear measures of HRV. Two methods of preprocessing were used: the convolution of inverse interval function values with a rectangular window and the cubic polynomial interpolation. The HRV time series were obtained from 33 Wistar rats submitted to autonomic blockade protocols and from 17 healthy adults. The analysis of determinism was carried out by the method of surrogate data sets and nonlinear autoregressive moving average modelling and prediction. The scaling exponents alpha, alpha(1) and alpha(2) derived from the detrended fluctuation analysis were calculated from raw HRV time series and respective preprocessed signals. It was shown that the technique of cubic interpolation of HRV time series did not significantly change any nonlinear characteristic studied in this work, while the method of convolution only affected the alpha(1) index. The results suggested that preprocessed time series may be used to study HRV in the field of nonlinear dynamics.
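One of the two preprocessing routes discussed, resampling the irregularly spaced RR-interval series onto a uniform grid by cubic interpolation, can be sketched as follows; the synthetic tachogram and the 4 Hz target rate are assumptions, not the rat or human recordings analysed in the study.

```python
# Resample an irregular RR-interval (tachogram) series onto a uniform time grid.
import numpy as np
from scipy.interpolate import interp1d

rng = np.random.default_rng(2)
rr = 0.8 + 0.05 * rng.standard_normal(300)                # RR intervals in seconds (synthetic)
beat_times = np.cumsum(rr)                                # beats occur at irregular instants

fs = 4.0                                                  # target uniform sampling rate (Hz)
t_uniform = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
hrv_uniform = interp1d(beat_times, rr, kind='cubic')(t_uniform)
print(f"{len(rr)} irregular beats resampled to {len(t_uniform)} uniform samples")
```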
NASA Astrophysics Data System (ADS)
Jiang, Peng; Peng, Lihui; Xiao, Deyun
2007-06-01
This paper presents a regularization method that uses different window functions as regularization for electrical capacitance tomography (ECT) image reconstruction. Image reconstruction for ECT is a typical ill-posed inverse problem. Because of the small singular values of the sensitivity matrix, the solution is sensitive to measurement noise. The proposed method uses the spectral filtering properties of different window functions to make the solution stable by suppressing the noise in the measurements. The window functions, such as the Hanning window, the cosine window and so on, are modified for ECT image reconstruction. Simulations with respect to five typical permittivity distributions are carried out. The reconstructions are better and some of the contours are clearer than the results from the Tikhonov regularization. Numerical results show the feasibility of the image reconstruction algorithm using different window functions as regularization.
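A minimal sketch of window-function regularization for an ill-posed linear problem: the measurement equation is solved by SVD and the small singular components are damped by a half-cosine (Hanning-type) taper. The random matrix is only a stand-in for an ECT sensitivity matrix, and the taper is one possible filter choice.

```python
# Damp noise-amplifying singular components with a window-function taper.
import numpy as np

rng = np.random.default_rng(3)
m, n = 80, 60
S = rng.normal(size=(m, n)) @ np.diag(np.logspace(0, -6, n))   # ill-conditioned "sensitivity" matrix
x_true = np.zeros(n)
x_true[20:30] = 1.0                                             # simple "permittivity" profile
b = S @ x_true + 1e-4 * rng.normal(size=m)                      # noisy "capacitance" measurements

U, s, Vt = np.linalg.svd(S, full_matrices=False)
k = len(s)
taper = 0.5 * (1.0 + np.cos(np.pi * np.arange(k) / k))          # ~1 for large s, ~0 for small s
x_windowed = Vt.T @ (taper * (U.T @ b) / s)                     # filtered (regularized) solution
x_naive = Vt.T @ ((U.T @ b) / s)                                # unfiltered pseudo-inverse solution

print("windowed-solution error:", round(float(np.linalg.norm(x_windowed - x_true)), 2))
print("naive-solution error:   ", round(float(np.linalg.norm(x_naive - x_true)), 2))
```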
The dynamics underlying the regeneration and stalling of Hurricane Harvey
NASA Astrophysics Data System (ADS)
Liang, X. S.
2017-12-01
The explosive regeneration and stalling made Hurricane Harvey go from a little-noticed storm to an extremely destructive behemoth in late August 2017, incurring an estimated economic loss of 70-200 billion USD. In this study, we use a recently developed analysis tool, namely the multiscale window transform (MWT), and the MWT-based theory of canonical transfer, to investigate the dynamics underlying this regeneration and stalling. The atmospheric fields are reconstructed onto three scale ranges or windows, namely large-scale, tropical cyclone-scale, and cumulus convection-scale windows. The intertwined cyclone-scale nonlinear energy process is uniquely separated into a transport of energy within the cyclone window and an interscale transfer through reconstructing the "atomic" energy fluxes on the multiple scale windows. The resulting transfer bears a Lie bracket form, reminiscent of the Poisson bracket in Hamiltonian mechanics, and is hence referred to as canonical. It is found that within the Gulf of Mexico, Harvey gains much energy from the cumulus convection window through an inverse energy cascade, leading to its explosive growth. In the meantime, there is a barotropic instability (positive canonical transfer) center of the mean circulation in the lower and mid troposphere which lies quasi-steadily over Houston from August 22 through early September. The northwestward-propagating Harvey meets that center and then stalls for two days near the coastline, dropping torrential and unprecedented amounts of rainfall and causing catastrophic flooding. It moves out of the instability center by the end of August, and then dissipates quickly in the following days.
Ultrafast dark-field surface inspection with hybrid-dispersion laser scanning
NASA Astrophysics Data System (ADS)
Yazaki, Akio; Kim, Chanju; Chan, Jacky; Mahjoubfar, Ata; Goda, Keisuke; Watanabe, Masahiro; Jalali, Bahram
2014-06-01
High-speed surface inspection plays an important role in industrial manufacturing, safety monitoring, and quality control. It is desirable to go beyond the speed limitation of current technologies for reducing manufacturing costs and opening a new window onto a class of applications that require high-throughput sensing. Here, we report a high-speed dark-field surface inspector for detection of micrometer-sized surface defects on objects traveling at record speeds as high as a few kilometers per second. This method is based on a modified time-stretch microscope that illuminates temporally and spatially dispersed laser pulses on the surface of a fast-moving object and detects scattered light from defects on the surface with a sensitive photodetector in a dark-field configuration. The inspector's ability to perform ultrafast dark-field surface inspection enables real-time identification of difficult-to-detect features on weakly reflecting surfaces and hence renders the method much more practical than the previously demonstrated bright-field configuration. Consequently, our inspector provides nearly 1000 times higher scanning speed than conventional inspectors. To show our method's broad utility, we demonstrate real-time inspection of the surfaces of various objects (a non-reflective black film, a transparent flexible film, and a reflective hard disk) for detection of 10 μm or smaller defects on a target moving at 20 m/s within a scan width of 25 mm at a scan rate of 90.9 MHz. Our method holds promise for improving the cost and performance of organic light-emitting diode displays for next-generation smart phones, lithium-ion batteries for green electronics, and high-efficiency solar cells.
Defining window-boundaries for genomic analyses using smoothing spline techniques
Beissinger, Timothy M.; Rosa, Guilherme J.M.; Kaeppler, Shawn M.; ...
2015-04-17
High-density genomic data is often analyzed by combining information over windows of adjacent markers. Interpretation of data grouped in windows versus at individual locations may increase statistical power, simplify computation, reduce sampling noise, and reduce the total number of tests performed. However, use of adjacent marker information can result in over- or under-smoothing, undesirable window boundary specifications, or highly correlated test statistics. We introduce a method for defining windows based on statistically guided breakpoints in the data, as a foundation for the analysis of multiple adjacent data points. This method involves first fitting a cubic smoothing spline to the data and then identifying the inflection points of the fitted spline, which serve as the boundaries of adjacent windows. This technique does not require prior knowledge of linkage disequilibrium, and therefore can be applied to data collected from individual or pooled sequencing experiments. Moreover, in contrast to existing methods, an arbitrary choice of window size is not necessary, since these are determined empirically and allowed to vary along the genome.
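A minimal sketch of the boundary-definition step under stated assumptions: a synthetic per-marker test statistic, a cubic smoothing spline fit with SciPy, and window boundaries taken as the inflection points (sign changes of the second derivative) of the fitted spline.

```python
# Define window boundaries as the inflection points of a cubic smoothing spline.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(4)
pos = np.arange(2000, dtype=float)                        # marker index along a chromosome
stat = np.sin(pos / 120.0) + 0.3 * rng.standard_normal(pos.size)   # noisy per-marker statistic

spline = UnivariateSpline(pos, stat, k=3, s=0.2 * len(pos))        # cubic smoothing spline
second_deriv = spline.derivative(n=2)(pos)
boundaries = pos[1:][np.diff(np.sign(second_deriv)) != 0]          # inflection points

print(f"{len(boundaries)} window boundaries; first few at positions {boundaries[:5]}")
```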
Apparatus and method for in-situ cleaning of resist outgassing windows
Klebanoff, Leonard E.; Haney, Steven J.
2001-01-01
An apparatus and method for in-situ cleaning of resist outgassing windows. The apparatus includes a chamber located in a structure, with the chamber having an outgassing window to be cleaned positioned in alignment with a slot in the chamber, whereby radiation energy passes through the window, the chamber, and the slot onto a resist-coated wafer mounted in the structure. The chamber is connected to a gas supply and the structure is connected to a vacuum pump. Within the chamber are two cylindrical sector electrodes and a filament is electrically connected to one sector electrode and a power supply. In a first cleaning method the sector electrodes are maintained at the same voltage, the filament is unheated, the chamber is filled with argon (Ar) gas under pressure, and the window is maintained at a zero voltage, whereby Ar ions are accelerated onto the window surface, sputtering away carbon deposits that build up as a result of resist outgassing. A second cleaning method is similar except oxygen gas (O.sub.2) is admitted to the chamber instead of Ar. These two methods can be carried out during lithographic operation. A third method, carried out during a maintenance period, involves admitting CO.sub.2 into the chamber, heating the filament to a point of thermionic emission, the sector electrodes are at different voltages, excited CO.sub.2 gas molecules are created which impact the carbon contamination on the window, and gasify it, producing CO gaseous products that are pumped away.
Dynamic Granger-Geweke causality modeling with application to interictal spike propagation
Lin, Fa-Hsuan; Hara, Keiko; Solo, Victor; Vangel, Mark; Belliveau, John W.; Stufflebeam, Steven M.; Hamalainen, Matti S.
2010-01-01
A persistent problem in developing plausible neurophysiological models of perception, cognition, and action is the difficulty of characterizing the interactions between different neural systems. Previous studies have approached this problem by estimating causal influences across brain areas activated during cognitive processing using Structural Equation Modeling and, more recently, with Granger-Geweke causality. While SEM is complicated by the need for a priori directional connectivity information, the temporal resolution of dynamic Granger-Geweke estimates is limited because the underlying autoregressive (AR) models assume stationarity over the period of analysis. We have developed a novel optimal method for obtaining data-driven directional causality estimates with high temporal resolution in both time and frequency domains. This is achieved by simultaneously optimizing the length of the analysis window and the chosen AR model order using the SURE criterion. Dynamic Granger-Geweke causality in time and frequency domains is subsequently calculated within a moving analysis window. We tested our algorithm by calculating the Granger-Geweke causality of epileptic spike propagation from the right frontal lobe to the left frontal lobe. The results quantitatively suggested the epileptic activity at the left frontal lobe was propagated from the right frontal lobe, in agreement with the clinical diagnosis. Our novel computational tool can be used to help elucidate complex directional interactions in the human brain. PMID:19378280
NASA Astrophysics Data System (ADS)
Astawa, INGA; Gusti Ngurah Bagus Caturbawa, I.; Made Sajayasa, I.; Dwi Suta Atmaja, I. Made Ari
2018-01-01
License plate recognition is usually used as part of a larger system, such as a parking system. License plate detection is considered the most important step in a license plate recognition system. We propose methods that can be used to detect the vehicle plate on a mobile phone. In this paper, we used the sliding window, histogram of oriented gradients (HOG), and support vector machine (SVM) methods for license plate detection, to increase the detection rate even when the image is not of good quality. The image is processed by the sliding window method in order to find the plate position. Feature extraction and classification at every window position are done with the HOG and SVM methods. Good results were obtained in this research, with an accuracy of 96%.
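A simplified sketch of the pipeline described (sliding window, HOG features, linear SVM); the striped "plate-like" training patches, window size, and step below are placeholders for real labelled plate and non-plate examples and tuned parameters.

```python
# Slide a fixed-size window over the image, describe it with HOG features, score with a linear SVM.
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

rng = np.random.default_rng(5)
win_h, win_w, step = 32, 96, 16
hog_params = dict(pixels_per_cell=(8, 8), cells_per_block=(2, 2))

def make_patch(plate):
    patch = rng.uniform(0.3, 0.5, (win_h, win_w))
    if plate:                                             # crude vertical strokes mimic characters
        for col in range(8, win_w - 8, 12):
            patch[6:-6, col:col + 3] = 1.0
    return patch

labels = [1, 0] * 50
features = [hog(make_patch(lab), **hog_params) for lab in labels]
clf = LinearSVC(C=1.0).fit(features, labels)

image = rng.uniform(0.3, 0.5, (240, 320))
image[100:100 + win_h, 120:120 + win_w] = make_patch(True)      # embed a synthetic "plate"
scores = [(clf.decision_function([hog(image[r:r + win_h, c:c + win_w], **hog_params)])[0], r, c)
          for r in range(0, 240 - win_h + 1, step)
          for c in range(0, 320 - win_w + 1, step)]
best_score, best_r, best_c = max(scores)
print(f"highest-scoring window at row {best_r}, col {best_c} (score {best_score:.2f})")
```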
Window classification of brain CT images in biomedical articles.
Xue, Zhiyun; Antani, Sameer; Long, L Rodney; Demner-Fushman, Dina; Thoma, George R
2012-01-01
Effective capability to search biomedical articles based on visual properties of article images may significantly augment information retrieval in the future. In this paper, we present a new method to classify the window setting types of brain CT images. Windowing is a technique frequently used in the evaluation of CT scans, and is used to enhance contrast for the particular tissue or abnormality type being evaluated. In particular, it provides radiologists with an enhanced view of certain types of cranial abnormalities, such as skull lesions and bone dysplasia, which are usually examined using the "bone window" setting and illustrated in biomedical articles using "bone window images". Due to the inherent large variations of images among articles, it is important that the proposed method is robust. Our algorithm attained 90% accuracy in classifying images as bone window or non-bone window in a 210 image data set.
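For context, the windowing transform itself can be sketched as below; the level/width pairs are commonly quoted bone and soft-tissue settings, used here only for illustration rather than taken from the paper.

```python
# Map Hounsfield units to display grey levels for a given window level and width.
import numpy as np

def apply_window(hu, level, width):
    """Clip HU values to [level - width/2, level + width/2] and rescale to 0-255."""
    lo, hi = level - width / 2.0, level + width / 2.0
    return np.clip((hu - lo) / (hi - lo), 0.0, 1.0) * 255.0

hu_slice = np.array([[-1000.0, 40.0, 300.0, 1500.0]])     # air, brain, blood-ish, dense bone
print("bone window:       ", apply_window(hu_slice, level=300, width=1500).round(0))
print("soft-tissue window:", apply_window(hu_slice, level=40, width=80).round(0))
```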
Michael L. Hoppus; Rachel I. Riemann; Andrew J. Lister; Mark V. Finco
2002-01-01
The panchromatic bands of Landsat 7, SPOT, and IRS satellite imagery provide an opportunity to evaluate the effectiveness of texture analysis of satellite imagery for mapping of land use/cover, especially forest cover. A variety of texture algorithms, including standard deviation, Ryherd-Woodcock minimum variance adaptive window, low pass etc., were applied to moving...
ERIC Educational Resources Information Center
Stone, Tammy; Coussons-Read, Mary
2011-01-01
Moving from a faculty position to an administrative office frequently entails gaining considerable responsibility, but ambiguous power. The hope of these two authors is that this volume will serve as a reference and a source of support for current associate and assistant deans and as a window into these jobs for faculty who may be considering such…
End effector of the Discovery's RMS with tools moves toward Syncom-IV
1985-04-17
51D-44-046 (17 April 1985) --- The Space Shuttle Discovery's Remote Manipulator System (RMS) arm and two specially designed extensions move toward the troubled Syncom-IV (LEASAT) communications satellite during a station keeping mode of the two spacecraft in Earth orbit. Inside the Shuttle's cabin, astronaut Rhea Seddon, 51D mission specialist, controlled the Canadian-built arm in an attempt to move an external lever on the satellite. Crewmembers learned of the satellite's problems shortly after it was deployed from the cargo bay on April 13, 1985. The arm achieved physical contact with the lever as planned. However, the satellite did not respond to the contact as hoped. A 70mm handheld Hasselblad camera, aimed through Discovery's windows, recorded this frame -- one of the first to be released to news media following return of the seven-member crew on April 17, 1985.
Templated fabrication of hollow nanospheres with 'windows' of accurate size and tunable number.
Xie, Duan; Hou, Yidong; Su, Yarong; Gao, Fuhua; Du, Jinglei
2015-01-01
The 'windows' or 'doors' on the surface of a closed hollow structure can enable the exchange of material and information between the interior and exterior of one hollow sphere or between two hollow spheres, and this information or material exchange can also be controlled by altering the windows' size. Thus, it is very interesting and important to achieve the fabrication and adjustment of the 'windows' or 'doors' on the surface of a closed hollow structure. In this paper, we propose a new method based on template-assisted deposition to achieve the fabrication of hollow spheres with windows of accurate size and number. Through precise control of deposition parameters (i.e., deposition angle and number), hollow spheres with windows of total size from 0% to 50% and number from 1 to 6 have been successfully achieved. A geometrical model has been developed for the morphology simulation and size calculation of the windows, and the simulation results agree well with the experiment. This model will greatly improve the convenience and efficiency of this template-assisted deposition method. In addition, these hollow spheres with desired windows can also be dispersed into liquid or arranged regularly on any desired substrate. These advantages will maximize their applications in many fields, such as drug transport and nano-research containers.
Design and fabrication of a large area freestanding compressive stress SiO2 optical window
NASA Astrophysics Data System (ADS)
Van Toan, Nguyen; Sangu, Suguru; Ono, Takahito
2016-07-01
This paper reports the design and fabrication of a 7.2 mm × 9.6 mm freestanding compressive stress SiO2 optical window without buckling. An application of the SiO2 optical window with and without liquid penetration has been demonstrated for an optical modulator, and its optical characteristics are evaluated by using an image sensor. Two methods for SiO2 optical window fabrication are presented. The first method is a combination of silicon etching and a thermal oxidation process. Silicon capillaries fabricated by deep reactive ion etching (deep RIE) are completely oxidized to form the SiO2 capillaries. The large compressive stress of the oxide causes buckling of the optical window, which is reduced by optimizing the design of the device structure. The second method is a magnetron-type RIE, which is investigated for deep SiO2 etching. This method achieves deep SiO2 etching together with smooth surfaces, vertical shapes and a high aspect ratio. Additionally, in order to avoid a wrinkled optical window, the idea of a Peano curve structure has been proposed to achieve a freestanding compressive stress SiO2 optical window. A 7.2 mm × 9.6 mm optical window area without buckling, integrated with an image sensor for an optical modulator, has been successfully fabricated. Qualitative and quantitative evaluations have been performed in cases with and without liquid penetration.
Oval Window Size and Shape: a Micro-CT Anatomical Study With Considerations for Stapes Surgery.
Zdilla, Matthew J; Skrzat, Janusz; Kozerska, Magdalena; Leszczyński, Bartosz; Tarasiuk, Jacek; Wroński, Sebastian
2018-06-01
The oval window is an important structure with regard to stapes surgeries, including stapedotomy for the treatment of otosclerosis. Recent study of perioperative imaging of the oval window has revealed that oval window niche height can indicate both operative difficulty and subjective discomfort during otosclerosis surgery. With regard to shape, structures incorporated into the oval window niche, such as cartilage grafts, must be compatible with the shape of the oval window. Despite the clinical importance of the oval window, there is little information regarding its size and shape. This study assessed oval window size and shape via micro-computed tomography paired with modern morphometric methodology in the fetal, infant, child, and adult populations. Additionally, the study compared oval window size and shape between sexes and between left- and right-sided ears. No significant differences were found among traditional morphometric parameters among age groups, sides, or sexes. However, geometric morphometric methods revealed shape differences between age groups. Further, geometric morphometric methods provided the average oval window shape and most-likely shape variance. Beyond demonstrating oval window size and shape variation, the results of this report will aid in identifying patients among whom anatomical variation may contribute to surgical difficulty and surgeon discomfort, or otherwise warrant preoperative adaptations for the incorporation of materials into and around the oval window.
NASA Technical Reports Server (NTRS)
Forssen, B.; Wang, Y. S.; Crocker, M. J.
1981-01-01
Several aspects were studied. The SEA theory was used to develop a theoretical model to predict the transmission loss through an aircraft window. This work mainly consisted of the writing of two computer programs. One program predicts the sound transmission through a plexiglass window (the case of a single partition). The other program applies to the case of a plexiglass window with a window shade added (the case of a double partition with an air gap). The sound transmission through a structure was measured in experimental studies using several different methods in order that the accuracy and complexity of all the methods could be compared. Also, the measurements were conducted on the simple model of a fuselage (a cylindrical shell), on a real aircraft fuselage, and on stiffened panels.
NASA Astrophysics Data System (ADS)
Forssen, B.; Wang, Y. S.; Crocker, M. J.
1981-12-01
Several aspects were studied. The SEA theory was used to develop a theoretical model to predict the transmission loss through an aircraft window. This work mainly consisted of the writing of two computer programs. One program predicts the sound transmission through a plexiglass window (the case of a single partition). The other program applies to the case of a plexiglass window with a window shade added (the case of a double partition with an air gap). The sound transmission through a structure was measured in experimental studies using several different methods in order that the accuracy and complexity of all the methods could be compared. Also, the measurements were conducted on the simple model of a fuselage (a cylindrical shell), on a real aircraft fuselage, and on stiffened panels.
NASA Astrophysics Data System (ADS)
Ramezanzadeh, B.; Arman, S. Y.; Mehdipour, M.; Markhali, B. P.
2014-01-01
In this study, the corrosion inhibition properties of two similar heterocyclic compounds, namely benzotriazole (BTA) and benzothiazole (BNS), on copper in 1.0 M H2SO4 solution were studied by electrochemical techniques as well as surface analysis. The results showed that corrosion inhibition of copper largely depends on the molecular structure and concentration of the inhibitors. The effect of the DC trend on the interpretation of electrochemical noise (ECN) results in the time domain was evaluated by the moving average removal (MAR) method. Accordingly, the impact of square and Hanning window functions as drift removal methods in the frequency domain was studied. After DC trend removal, good agreement was observed between the electrochemical noise (ECN) data and the results obtained from EIS and potentiodynamic polarization. Furthermore, the shot noise theory in the frequency domain was applied to estimate the charge of each electrochemical event (q) from the potential and current noise signals.
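A minimal sketch of the two drift-removal ingredients mentioned (moving-average removal in the time domain and a window taper before spectral analysis); the synthetic drifting noise record and the 64-point MAR window are assumptions.

```python
# Remove slow DC drift from an electrochemical noise record, then taper before the FFT.
import numpy as np

rng = np.random.default_rng(8)
t = np.arange(2048)
noise = rng.normal(0, 1e-8, t.size)                       # electrochemical current noise (A)
signal = 5e-8 * t / t.size + noise                        # slow DC drift plus noise

window = 64                                               # MAR window length (illustrative)
trend = np.convolve(signal, np.ones(window) / window, mode='same')
detrended = signal - trend                                # moving average removal (MAR)

tapered = detrended * np.hanning(detrended.size)          # Hanning taper before spectral analysis
psd = np.abs(np.fft.rfft(tapered)) ** 2
print("mean before MAR: %.2e A, after: %.2e A; PSD bins: %d"
      % (signal.mean(), detrended.mean(), psd.size))
```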
Detecting labor using graph theory on connectivity matrices of uterine EMG.
Al-Omar, S; Diab, A; Nader, N; Khalil, M; Karlsson, B; Marque, C
2015-08-01
Premature labor is one of the most serious health problems in the developed world. One of the main reasons for this is that no good way exists to distinguish true labor from normal pregnancy contractions. The aim of this paper is to investigate whether the application of graph theory techniques to multi-electrode uterine EMG signals can improve the discrimination between pregnancy contractions and labor. To test our methods, we first applied them to synthetic graphs, where we detected differences in the parameter results and changes in the graph model from pregnancy-like graphs to labor-like graphs. Then, we applied the same methods to real signals and obtained the best differentiation between pregnancy and labor with the same parameters. Major improvements in differentiating between pregnancy and labor were obtained using a low-pass windowing preprocessing step. Results show that real graphs generally became more organized when moving from pregnancy, where the graph showed random characteristics, to labor, where the graph became a more small-world-like graph.
Li, Min; Yu, Bing-bing; Wu, Jian-hua; Xu, Lin; Sun, Gang
2013-01-01
Purpose As Doppler ultrasound has been proven to be an effective tool to predict and compress the optimal pulsing windows, we evaluated the effective dose and diagnostic accuracy of coronary CT angiography (CTA) incorporating Doppler-guided prospective electrocardiograph (ECG) gating, which presets pulsing windows according to Doppler analysis, in patients with a heart rate >65 bpm. Materials and Methods 119 patients with a heart rate >65 bpm who were scheduled for invasive coronary angiography were prospectively studied, and patients were randomly divided into traditional prospective (n = 61) and Doppler-guided prospective (n = 58) ECG gating groups. The exposure window of traditional prospective ECG gating was set at 30%–80% of the cardiac cycle. For the Doppler group, the length of diastasis was analyzed by Doppler. For lengths greater than 90 ms, the pulsing window was preset during diastole (during 60%–80%); otherwise, the optimal pulsing intervals were moved from diastole to systole (during 30%–50%). Results The mean heart rates of the traditional ECG and the Doppler-guided group during CT scanning were 75.0±7.7 bpm (range, 66–96 bpm) and 76.5±5.4 bpm (range: 66–105 bpm), respectively. The results indicated that whereas the image quality showed no significant difference between the traditional and Doppler groups (P = 0.42), the radiation dose of the Doppler group was significantly lower than that of the traditional group (5.2±3.4mSv vs. 9.3±4.5mSv, P<0.001). The sensitivities of CTA applying traditional and Doppler-guided prospective ECG gating to diagnose stenosis on a segment level were 95.5% and 94.3%, respectively; specificities 98.0% and 97.1%, respectively; positive predictive values 90.7% and 88.2%, respectively; negative predictive values 99.0% and 98.7%, respectively. There was no statistical difference in concordance between the traditional and Doppler groups (P = 0.22). Conclusion Doppler-guided prospective ECG gating represents an improved method in patients with a high heart rate to reduce effective radiation doses, while maintaining high diagnostic accuracy. PMID:23696793
Genkawa, Takuma; Shinzawa, Hideyuki; Kato, Hideaki; Ishikawa, Daitaro; Murayama, Kodai; Komiyama, Makoto; Ozaki, Yukihiro
2015-12-01
An alternative baseline correction method for diffuse reflection near-infrared (NIR) spectra, searching region standard normal variate (SRSNV), was proposed. Standard normal variate (SNV) is an effective pretreatment method for baseline correction of diffuse reflection NIR spectra of powder and granular samples; however, its baseline correction performance depends on the NIR region used for SNV calculation. To search for an optimal NIR region for baseline correction using SNV, SRSNV employs moving window partial least squares regression (MWPLSR), and an optimal NIR region is identified based on the root mean square error (RMSE) of cross-validation of the partial least squares regression (PLSR) models with the first latent variable (LV). The performance of SRSNV was evaluated using diffuse reflection NIR spectra of mixture samples consisting of wheat flour and granular glucose (0-100% glucose at 5% intervals). From the obtained NIR spectra of the mixture in the 10 000-4000 cm(-1) region at 4 cm(-1) intervals (1501 spectral channels), a series of spectral windows consisting of 80 spectral channels was constructed, and then SNV spectra were calculated for each spectral window. Using these SNV spectra, a series of PLSR models with the first LV for glucose concentration was built. A plot of RMSE versus the spectral window position obtained using the PLSR models revealed that the 8680–8364 cm(-1) region was optimal for baseline correction using SNV. In the SNV spectra calculated using the 8680–8364 cm(-1) region (SRSNV spectra), a remarkable relative intensity change between a band due to wheat flour at 8500 cm(-1) and that due to glucose at 8364 cm(-1) was observed owing to successful baseline correction using SNV. A PLSR model with the first LV based on the SRSNV spectra yielded a determination coefficient (R2) of 0.999 and an RMSE of 0.70%, while a PLSR model with three LVs based on SNV spectra calculated in the full spectral region gave an R2 of 0.995 and an RMSE of 2.29%. Additional evaluation of SRSNV was carried out using diffuse reflection NIR spectra of marzipan and corn samples, and PLSR models based on SRSNV spectra showed good prediction results. These evaluation results indicate that SRSNV is effective in baseline correction of diffuse reflection NIR spectra and provides regression models with good prediction accuracy.
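A rough sketch of the searching-region idea under stated assumptions (synthetic spectra, an 80-channel window, 7-fold cross-validation): SNV is computed within each candidate spectral window, a one-latent-variable PLS model is cross-validated, and the window with the lowest RMSE is kept.

```python
# Moving-window SNV + 1-LV PLS: keep the spectral window with the lowest cross-validated RMSE.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(6)
n_samples, n_channels, win = 21, 400, 80
conc = np.linspace(0, 100, n_samples)                     # glucose %, 0-100 at 5% steps
baseline = rng.normal(0, 1, (n_samples, 1)) * np.ones((1, n_channels))
peak = np.exp(-0.5 * ((np.arange(n_channels) - 250) / 10.0) ** 2)
spectra = baseline + np.outer(conc / 100.0, peak) + 0.01 * rng.normal(size=(n_samples, n_channels))

def snv(x):
    return (x - x.mean(axis=1, keepdims=True)) / x.std(axis=1, keepdims=True)

best = None
for start in range(0, n_channels - win + 1, 10):
    x_win = snv(spectra[:, start:start + win])
    pred = cross_val_predict(PLSRegression(n_components=1), x_win, conc, cv=7)
    rmse = np.sqrt(np.mean((pred.ravel() - conc) ** 2))
    if best is None or rmse < best[0]:
        best = (rmse, start)
print(f"best window: channels {best[1]}-{best[1] + win - 1}, RMSECV = {best[0]:.2f} %")
```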
Sink detection on tilted terrain for automated identification of glacial cirques
NASA Astrophysics Data System (ADS)
Prasicek, Günther; Robl, Jörg; Lang, Andreas
2016-04-01
Glacial cirques are morphologically distinct but complex landforms and represent a vital part of high mountain topography. Their distribution, elevation and relief are expected to hold information on (1) the extent of glacial occupation, (2) the mechanism of glacial cirque erosion, and (3) how glacial processes, in concert with periglacial processes, can limit peak altitude and mountain range height. While easily detectable to the expert's eye both in nature and on various representations of topography, their complicated nature makes them a nemesis for computer algorithms. Consequently, manual mapping of glacial cirques is commonplace in many mountain landscapes worldwide, but consistent datasets of cirque distribution and objectively mapped cirques and their morphometrical attributes are lacking. Among the biggest problems for algorithm development are the complexity in shape and the great variability of cirque size. For example, glacial cirques can be rather circular or longitudinal in extent, exist as individual or composite landforms, show prominent topographic depressions or be entirely filled with water or sediment. For these reasons, attributes like circularity, size, drainage area and topology of landform elements (e.g. a flat floor surrounded by steep walls) have only a limited potential for automated cirque detection. Here we present a novel, geomorphometric method for automated identification of glacial cirques on digital elevation models that exploits their genetic bowl-like shape. First, we differentiate between glacial and fluvial terrain employing an algorithm based on a moving window approach and multi-scale curvature, which is also capable of fitting the analysis window to valley width. We then fit a plane to the valley stretch clipped by the analysis window and rotate the terrain around the center cell until the plane is level. Doing so, we produce sinks of considerable size if the clipped terrain represents a cirque, while no or only very small sinks develop on other valley stretches. We normalize sink area by window size for sink classification, apply this method to the Sawtooth Mountains, Idaho, and to Fiordland, New Zealand, and compare the results to manually mapped reference cirques. Results indicate that false negatives are produced only in very rugged terrain, and false positives occur in rare cases when valleys are strongly curved in the longitudinal direction.
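A simplified sketch of the plane-fit-and-level step on synthetic windows; the normalized sink measure below (the fraction of cells falling under the lowest boundary cell after detrending) is an illustrative stand-in for the paper's sink-area criterion, not the published algorithm.

```python
# Level a terrain window by removing its best-fit plane, then measure the closed-sink fraction.
import numpy as np

def normalized_sink_area(window):
    """Fraction of cells lying below the lowest boundary cell after removing the best-fit plane."""
    rows, cols = np.indices(window.shape)
    A = np.column_stack([rows.ravel(), cols.ravel(), np.ones(window.size)])
    coeff, *_ = np.linalg.lstsq(A, window.ravel(), rcond=None)
    residual = window - (A @ coeff).reshape(window.shape)
    boundary = np.concatenate([residual[0, :], residual[-1, :], residual[:, 0], residual[:, -1]])
    return np.mean(residual < boundary.min() - 1e-9)      # tolerance avoids counting rounding noise

y, x = np.mgrid[-1:1:61j, -1:1:61j]
cirque = 0.5 * (x ** 2 + y ** 2) + 0.3 * x                # tilted bowl (cirque-like window)
slope = 0.3 * x + 0.1 * y                                 # plain hillslope (no sink expected)
print("cirque:", round(normalized_sink_area(cirque), 2),
      "slope:", round(normalized_sink_area(slope), 2))
```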
Apparatus for solar coal gasification
Gregg, D.W.
Apparatus for using focused solar radiation to gasify coal and other carbonaceous materials is described. Incident solar radiation is focused from an array of heliostats onto a tower-mounted secondary mirror which redirects the focused solar radiation down through a window onto the surface of a vertically-moving bed of coal, or a fluidized bed of coal, contained within a gasification reactor. The reactor is designed to minimize contact between the window and solids in the reactor. Steam introduced into the gasification reactor reacts with the heated coal to produce gas consisting mainly of carbon monoxide and hydrogen, commonly called synthesis gas, which can be converted to methane, methanol, gasoline, and other useful products. One of the novel features of the invention is the generation of process steam at the rear surface of the secondary mirror.
Mock Target Window OTR and IR Design and Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wass, Alexander Joseph
In order to fully verify temperature measurements made on the target window using infrared (IR) optical non-contact methods, actual comparative measurements are made with a real beam distribution as the heat source using Argonne National Laboratory's (ANL) 35 MeV electron accelerator. Using Monte Carlo N-Particle (MCNP) simulations and thermal finite element analysis (FEA), a cooled mock target window with thermocouple implants is designed to be used in such a test to achieve window temperatures up to 700°C. An uncoated and a black-coated mock window are designed to enhance the IR temperature measurements and verify optical transition radiation (OTR) imagery. This allows us to fully verify and characterize our temperature accuracy with our current IR camera method and any future method we may wish to explore, using actual production conditions. This test also provides us with valuable conclusions and concerns regarding the calibration method we developed using our IR test stand at TA-53 in MPF-14.
Overview of Fabrication Techniques and Lessons Learned with Accelerator Vacuum Windows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ader, C. R.; McGee, M. W.; Nobrega, L. E.
Vacuum thin windows have been used in Fermilab's accelerators for decades and typically have been overlooked in terms of their criticality and fragility. Vacuum windows allow beam to pass through while creating a boundary between vacuum and air or between high vacuum and low vacuum areas. The design of vacuum windows, including titanium and beryllium windows, will be discussed, as well as fabrication, testing, and operational concerns. Failure of windows will be reviewed, as well as safety approaches to mitigating failures and extending the lifetimes of vacuum windows. Various methods of calculating the strengths of vacuum windows will be explored, including FEA.
Cross, Troy J.; Keller-Ross, Manda; Issa, Amine; Wentz, Robert; Taylor, Bryan; Johnson, Bruce
2015-01-01
Study Objectives: To determine the impact of averaging window-length on the “desaturation” indexes (DIs) obtained via overnight pulse oximetry (SpO2) at high altitude. Design: Overnight SpO2 data were collected during a 10-day sojourn at high altitude. SpO2 was obtained using a commercial wrist-worn finger oximeter whose firmware was modified to store unaveraged beat-to-beat data. Simple moving averages of window lengths spanning 2 to 20 cardiac beats were retrospectively applied to beat-to-beat SpO2 datasets. After SpO2 artifacts were removed, the following DIs were then calculated for each of the averaged datasets: oxygen desaturation index (ODI); total sleep time with SpO2 < 80% (TST < 80), and the lowest SpO2 observed during sleep (SpO2 low). Setting: South Base Camp, Mt. Everest (5,364 m elevation). Participants: Five healthy, adult males (35 ± 5 y; 180 ± 1 cm; 85 ± 4 kg). Interventions: N/A. Measurements and Results: 49 datasets were obtained from the 5 participants, totalling 239 hours of data. For all window lengths ≥ 2 beats, ODI and TST < 80 were lower, and SpO2 low was higher than those values obtained from the beat-to-beat SpO2 time series data (P < 0.05). Conclusions: Our findings indicate that increasing oximeter averaging window length progressively underestimates the frequency and magnitude of sleep disordered breathing events at high altitude, as indirectly assessed via the desaturation indexes. Citation: Cross TJ, Keller-Ross M, Issa A, Wentz R, Taylor B, Johnson B. The impact of averaging window length on the “desaturation” indexes obtained via overnight pulse oximetry at high altitude. SLEEP 2015;38(8):1331–1334. PMID:25581919
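A minimal sketch of the retrospective analysis described, assuming a synthetic beat-to-beat SpO2 record, a simple >=4% drop rule for desaturation events, and a fixed 84% baseline; it reproduces the qualitative effect of longer averaging windows on the desaturation indexes.

```python
# Apply moving averages of increasing window length and recompute desaturation indexes.
import numpy as np

rng = np.random.default_rng(7)
spo2 = 84.0 + rng.normal(0, 1, 30000)                     # beat-to-beat SpO2 at altitude (synthetic)
for start in range(0, 30000, 600):                        # periodic brief desaturation events
    spo2[start:start + 10] -= 8.0

def desat_indexes(series, baseline=84.0):
    events = np.sum((series[1:] < baseline - 4.0) & (series[:-1] >= baseline - 4.0))
    tst_below_80 = np.mean(series < 80.0) * 100.0
    return events, tst_below_80, series.min()

for window in (1, 5, 10, 20):                             # 1 = unaveraged beat-to-beat data
    smoothed = np.convolve(spo2, np.ones(window) / window, mode='valid')
    events, tst80, low = desat_indexes(smoothed)
    print(f"window={window:2d} beats: events={events:4d}, %time<80%={tst80:4.1f}, lowest={low:.1f}")
```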
Method of fabricating a microelectronic device package with an integral window
Peterson, Kenneth A.; Watson, Robert D.
2003-01-01
A method of fabricating a microelectronic device package with an integral window for providing optical access through an aperture in the package. The package is made of a multilayered insulating material, e.g., a low-temperature cofired ceramic (LTCC) or high-temperature cofired ceramic (HTCC). The window is inserted in-between personalized layers of ceramic green tape during stackup and registration. Then, during baking and firing, the integral window is simultaneously bonded to the sintered ceramic layers of the densified package. Next, the microelectronic device is flip-chip bonded to cofired thick-film metallized traces on the package, where the light-sensitive side is optically accessible through the window. Finally, a cover lid is attached to the opposite side of the package. The result is a compact, low-profile package, flip-chip bonded, hermetically-sealed package having an integral window.
Accuracy and Consistency of Respiratory Gating in Abdominal Cancer Patients
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ge, Jiajia; Santanam, Lakshmi; Yang, Deshan
2013-03-01
Purpose: To evaluate respiratory gating accuracy and intrafractional consistency for abdominal cancer patients treated with respiratory gated treatment on a regular linear accelerator system. Methods and Materials: Twelve abdominal patients implanted with fiducials were treated with amplitude-based respiratory-gated radiation therapy. On the basis of daily orthogonal fluoroscopy, the operator readjusted the couch position and gating window such that the fiducial was within a setup margin (fiducial-planning target volume [f-PTV]) when RPM indicated “beam-ON.” Fifty-five pre- and post-treatment fluoroscopic movie pairs with synchronized respiratory gating signal were recorded. Fiducial motion traces were extracted from the fluoroscopic movies using a template matching algorithm and correlated with f-PTV by registering the digitally reconstructed radiographs with the fluoroscopic movies. Treatment was determined to be “accurate” if 50% of the fiducial area stayed within f-PTV while beam-ON. For movie pairs that lost gating accuracy, a MATLAB program was used to assess whether the gating window was optimized, the external-internal correlation (EIC) changed, or the patient moved between movies. A series of safety margins from 0.5 mm to 3 mm was added to f-PTV for reassessing gating accuracy. Results: A decrease in gating accuracy was observed in 44% of movie pairs from daily fluoroscopic movies of 12 abdominal patients. Three main causes for inaccurate gating were identified as change of global EIC over time (∼43%), suboptimal gating setup (∼37%), and imperfect EIC within movie (∼13%). Conclusions: Inconsistent respiratory gating accuracy may occur within 1 treatment session even with a daily adjusted gating window. To improve or maintain gating accuracy during treatment, we suggest using at least a 2.5-mm safety margin to account for gating and setup uncertainties.
UNIPIC code for simulations of high power microwave devices
NASA Astrophysics Data System (ADS)
Wang, Jianguo; Zhang, Dianhui; Liu, Chunliang; Li, Yongdong; Wang, Yue; Wang, Hongguang; Qiao, Hailiang; Li, Xiaoze
2009-03-01
In this paper, UNIPIC code, a new member in the family of fully electromagnetic particle-in-cell (PIC) codes for simulations of high power microwave (HPM) generation, is introduced. In the UNIPIC code, the electromagnetic fields are updated using the second-order, finite-difference time-domain (FDTD) method, and the particles are moved using the relativistic Newton-Lorentz force equation. The convolutional perfectly matched layer method is used to truncate the open boundaries of HPM devices. To model curved surfaces and avoid the time-step reduction in the conformal-path FDTD method, the CP weakly conditionally stable FDTD (CP WCS FDTD) method, which combines the WCS FDTD and CP-FDTD methods, is implemented. UNIPIC is two-and-a-half dimensional, is written in the object-oriented C++ language, and can be run on a variety of platforms including WINDOWS, LINUX, and UNIX. Users can use the graphical user interface to create the geometric structures of the simulated HPM devices, or import structures created previously. Numerical experiments on some typical HPM devices using the UNIPIC code are given. The results are compared to those obtained from some well-known PIC codes, and agree well with each other.
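A minimal sketch of the two core PIC updates named above, a 1-D Yee/FDTD field step and a relativistic Newton-Lorentz (Boris-style) momentum push; grid, time step, fields, and particle values are illustrative, and this is in no way UNIPIC itself.

```python
# 1-D FDTD leapfrog field update plus a relativistic Boris-style momentum push for one particle.
import numpy as np

c, dx = 3.0e8, 1.0e-3
dt = 0.5 * dx / c                                         # satisfies the 1-D Courant condition
ey, bz = np.zeros(200), np.zeros(200)

def fdtd_step(ey, bz):
    """Second-order leapfrog update of a 1-D transverse wave in vacuum."""
    bz[:-1] -= dt / dx * (ey[1:] - ey[:-1])
    ey[1:] -= c ** 2 * dt / dx * (bz[1:] - bz[:-1])
    return ey, bz

def boris_push(p, E, B, q=-1.602e-19, m=9.109e-31):
    """Relativistic Newton-Lorentz push of momentum p (kg*m/s) through one time step."""
    p_minus = p + q * E * dt / 2.0
    gamma = np.sqrt(1.0 + np.dot(p_minus, p_minus) / (m * c) ** 2)
    t = q * B * dt / (2.0 * m * gamma)
    p_prime = p_minus + np.cross(p_minus, t)
    p_plus = p_minus + np.cross(p_prime, 2.0 * t / (1.0 + np.dot(t, t)))
    return p_plus + q * E * dt / 2.0

ey[100] = 1.0                                             # seed a field disturbance
for _ in range(50):
    ey, bz = fdtd_step(ey, bz)
p = boris_push(np.zeros(3), E=np.array([0.0, 1e6, 0.0]), B=np.array([0.0, 0.0, 0.1]))
print("field peak index:", int(np.argmax(np.abs(ey))), "| new momentum:", p)
```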
Goicoechea, H C; Olivieri, A C
2001-07-01
A newly developed multivariate method involving net analyte preprocessing (NAP) was tested using central composite calibration designs of progressively decreasing size for the simultaneous multivariate spectrophotometric determination of three active components (phenylephrine, diphenhydramine and naphazoline) and one excipient (methylparaben) in nasal solutions. Its performance was evaluated and compared with that of partial least-squares (PLS-1). Minimisation of the calibration predicted error sum of squares (PRESS) as a function of a moving spectral window helped to select appropriate working spectral ranges for both methods. The comparison of NAP and PLS results was carried out using two tests: (1) the elliptical joint confidence region for the slope and intercept of a predicted versus actual concentrations plot for a large validation set of samples and (2) the D-optimality criterion concerning the information content of the calibration data matrix. Extensive simulations and experimental validation showed that, unlike PLS, the NAP method is able to furnish highly satisfactory results when the calibration set is reduced from a full four-component central composite design to a fractional central composite design, as expected from the modelling requirements of net analyte based methods.
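As a rough illustration of selecting a working spectral range by minimizing a cross-validated PRESS over a moving spectral window, the hedged sketch below uses scikit-learn's PLSRegression; the window width, component count, and the calibration arrays X and y are placeholders, not the authors' settings or data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

def best_window_by_press(X, y, width=50, n_components=3):
    """Slide a spectral window across X (samples x wavelengths) and return
    the start index whose leave-one-out PRESS is smallest."""
    best_start, best_press = None, np.inf
    for start in range(X.shape[1] - width + 1):
        Xw = X[:, start:start + width]
        model = PLSRegression(n_components=n_components)
        y_hat = cross_val_predict(model, Xw, y, cv=LeaveOneOut())
        press = float(np.sum((np.asarray(y).ravel() - np.asarray(y_hat).ravel()) ** 2))
        if press < best_press:
            best_start, best_press = start, press
    return best_start, best_press
```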
A large, switchable optical clearing skull window for cerebrovascular imaging
Zhang, Chao; Feng, Wei; Zhao, Yanjie; Yu, Tingting; Li, Pengcheng; Xu, Tonghui; Luo, Qingming; Zhu, Dan
2018-01-01
Rationale: Intravital optical imaging is a significant method for investigating cerebrovascular structure and function. However, its imaging contrast and depth are limited by the turbid skull. Tissue optical clearing has a great potential for solving this problem. Our goal was to develop a transparent skull window, without performing a craniotomy, for use in assessing cerebrovascular structure and function. Methods: Skull optical clearing agents were topically applied to the skulls of mice to create a transparent window within 15 min. The clearing efficacy, repeatability, and safety of the skull window were then investigated. Results: Imaging through the optical clearing skull window enhanced both the contrast and the depth of intravital imaging. The skull window could be used on 2-8-month-old mice and could be expanded from regional to bi-hemispheric. In addition, the window could be repeatedly established without inducing observable inflammation and metabolic toxicity. Conclusion: We successfully developed an easy-to-handle, large, switchable, and safe optical clearing skull window. Combined with various optical imaging techniques, cerebrovascular structure and function can be observed through this optical clearing skull window. Thus, it has the potential for use in basic research on the physiopathologic processes of cortical vessels. PMID:29774069
Eisner, Brian H; Kambadakone, Avinash; Monga, Manoj; Anderson, James K; Thoreson, Andrew A; Lee, Hang; Dretler, Stephen P; Sahani, Dushyant V
2009-04-01
We determined the most accurate method of measuring urinary stones on computerized tomography. For the in vitro portion of the study 24 calculi, including 12 calcium oxalate monohydrate and 12 uric acid stones, that had been previously collected at our clinic were measured manually with hand calipers as the gold standard measurement. The calculi were then embedded into human kidney-sized potatoes and scanned using 64-slice multidetector computerized tomography. Computerized tomography measurements were performed at 4 window settings, including standard soft tissue windows (window width 320 and window level 50), standard bone windows (window width 1120 and window level 300), 5.13x magnified soft tissue windows and 5.13x magnified bone windows. Maximum stone dimensions were recorded. For the in vivo portion of the study 41 patients with distal ureteral stones who underwent noncontrast computerized tomography and subsequently spontaneously passed the stones were analyzed. All analyzed stones were 100% calcium oxalate monohydrate or mixed, calcium based stones. Stones were prospectively collected at the clinic and the largest diameter was measured with digital calipers as the gold standard. This was compared to computerized tomography measurements using 4.0x magnified soft tissue windows and 4.0x magnified bone windows. Statistical comparisons were performed using Pearson's correlation and the paired t test. In the in vitro portion of the study the most accurate measurements were obtained using 5.13x magnified bone windows, with a mean 0.13 mm difference from caliper measurement (p = 0.6). Measurements performed in the soft tissue window with and without magnification, and in the bone window without magnification, were significantly different from hand caliper measurements (mean difference 1.2, 1.9 and 1.4 mm; p = 0.003, <0.001 and 0.0002, respectively). When comparing measurement errors between stones of different composition in vitro, measurements of calcium oxalate calculi differed significantly from the gold standard for all methods except bone window settings with magnification. For uric acid calculi a significant measurement error was observed only with standard soft tissue window settings. In vivo, 4.0x magnified bone windows were superior to 4.0x magnified soft tissue windows in measurement accuracy. Magnified bone window measurements were not statistically different from digital caliper measurements (mean underestimation vs digital caliper 0.3 mm, p = 0.4), while magnified soft tissue windows were statistically different (mean underestimation 1.4 mm, p = 0.001). In this study magnified bone windows were the most accurate method of stone measurement in vitro and in vivo. Therefore, we recommend the routine use of magnified bone windows for computerized tomography measurement of stones. In vitro the measurement error for calcium oxalate stones was greater than that for uric acid stones, suggesting that stone composition may be responsible for measurement inaccuracies.
Traffic-Related Air Pollution, Blood Pressure, and Adaptive Response of Mitochondrial Abundance.
Zhong, Jia; Cayir, Akin; Trevisi, Letizia; Sanchez-Guerra, Marco; Lin, Xinyi; Peng, Cheng; Bind, Marie-Abèle; Prada, Diddier; Laue, Hannah; Brennan, Kasey J M; Dereix, Alexandra; Sparrow, David; Vokonas, Pantel; Schwartz, Joel; Baccarelli, Andrea A
2016-01-26
Exposure to black carbon (BC), a tracer of vehicular-traffic pollution, is associated with increased blood pressure (BP). Identifying biological factors that attenuate BC effects on BP can inform prevention. We evaluated the role of mitochondrial abundance, an adaptive mechanism compensating for cellular-redox imbalance, in the BC-BP relationship. At ≥ 1 visits among 675 older men from the Normative Aging Study (observations=1252), we assessed daily BP and ambient BC levels from a stationary monitor. To determine blood mitochondrial abundance, we used whole blood to analyze mitochondrial-to-nuclear DNA ratio (mtDNA/nDNA) using quantitative polymerase chain reaction. Every standard deviation increase in the 28-day BC moving average was associated with 1.97 mm Hg (95% confidence interval [CI], 1.23-2.72; P<0.0001) and 3.46 mm Hg (95% CI, 2.06-4.87; P<0.0001) higher diastolic and systolic BP, respectively. Positive BC-BP associations existed throughout all time windows. BC moving averages (5-day to 28-day) were associated with increased mtDNA/nDNA; every standard deviation increase in 28-day BC moving average was associated with 0.12 standard deviation (95% CI, 0.03-0.20; P=0.007) higher mtDNA/nDNA. High mtDNA/nDNA significantly attenuated the BC-systolic BP association throughout all time windows. The estimated effect of 28-day BC moving average on systolic BP was 1.95-fold larger for individuals at the lowest mtDNA/nDNA quartile midpoint (4.68 mm Hg; 95% CI, 3.03-6.33; P<0.0001), in comparison with the top quartile midpoint (2.40 mm Hg; 95% CI, 0.81-3.99; P=0.003). In older adults, short-term to moderate-term ambient BC levels were associated with increased BP and blood mitochondrial abundance. Our findings indicate that increased blood mitochondrial abundance is a compensatory response and attenuates the cardiac effects of BC. © 2015 American Heart Association, Inc.
Superconductive radiofrequency window assembly
Phillips, Harry Lawrence; Elliott, Thomas S.
1998-01-01
The present invention is a superconducting radiofrequency window assembly for use in an electron beam accelerator. The srf window assembly (20) has a superconducting metal-ceramic design. The srf window assembly (20) comprises a superconducting frame (30), a ceramic plate (40) having a superconducting metallized area, and a superconducting eyelet (50) for sealing plate (40) into frame (30). The plate (40) is brazed to eyelet (50) which is then electron beam welded to frame (30). A method for providing a ceramic object mounted in a metal member to withstand cryogenic temperatures is also provided. The method involves a new metallization process for coating a selected area of a ceramic object with a thin film of a superconducting material. Finally, a method for assembling an electron beam accelerator cavity utilizing the srf window assembly is provided. The procedure is carried out within an ultra clean room to minimize exposure to particulates which adversely affect the performance of the cavity within the electron beam accelerator.
Superconductive radiofrequency window assembly
Phillips, H.L.; Elliott, T.S.
1998-05-19
The present invention is a superconducting radiofrequency window assembly for use in an electron beam accelerator. The SRF window assembly has a superconducting metal-ceramic design. The SRF window assembly comprises a superconducting frame, a ceramic plate having a superconducting metallized area, and a superconducting eyelet for sealing plate into frame. The plate is brazed to eyelet which is then electron beam welded to frame. A method for providing a ceramic object mounted in a metal member to withstand cryogenic temperatures is also provided. The method involves a new metallization process for coating a selected area of a ceramic object with a thin film of a superconducting material. Finally, a method for assembling an electron beam accelerator cavity utilizing the SRF window assembly is provided. The procedure is carried out within an ultra clean room to minimize exposure to particulates which adversely affect the performance of the cavity within the electron beam accelerator. 11 figs.
Superconducting radiofrequency window assembly
Phillips, Harry L.; Elliott, Thomas S.
1997-01-01
The present invention is a superconducting radiofrequency window assembly for use in an electron beam accelerator. The srf window assembly (20) has a superconducting metal-ceramic design. The srf window assembly (20) comprises a superconducting frame (30), a ceramic plate (40) having a superconducting metallized area, and a superconducting eyelet (50) for sealing plate (40) into frame (30). The plate (40) is brazed to eyelet (50) which is then electron beam welded to frame (30). A method for providing a ceramic object mounted in a metal member to withstand cryogenic temperatures is also provided. The method involves a new metallization process for coating a selected area of a ceramic object with a thin film of a superconducting material. Finally, a method for assembling an electron beam accelerator cavity utilizing the srf window assembly is provided. The procedure is carried out within an ultra clean room to minimize exposure to particulates which adversely affect the performance of the cavity within the electron beam accelerator.
Superconducting radiofrequency window assembly
Phillips, H.L.; Elliott, T.S.
1997-03-11
The present invention is a superconducting radiofrequency window assembly for use in an electron beam accelerator. The srf window assembly has a superconducting metal-ceramic design. The srf window assembly comprises a superconducting frame, a ceramic plate having a superconducting metallized area, and a superconducting eyelet for sealing plate into frame. The plate is brazed to eyelet which is then electron beam welded to frame. A method for providing a ceramic object mounted in a metal member to withstand cryogenic temperatures is also provided. The method involves a new metallization process for coating a selected area of a ceramic object with a thin film of a superconducting material. Finally, a method for assembling an electron beam accelerator cavity utilizing the srf window assembly is provided. The procedure is carried out within an ultra clean room to minimize exposure to particulates which adversely affect the performance of the cavity within the electron beam accelerator. 11 figs.
NASA Astrophysics Data System (ADS)
Kaur, Paramjit; Wasan, Ajay
2017-03-01
We present a theoretical model, using a density matrix approach, to study the effect of external longitudinal and transverse magnetic fields on the optical properties of an inhomogeneously broadened multilevel Λ-system using the D2 line in 85Rb and 87Rb atoms. The presence of closely spaced multiple excited states causes asymmetry in the absorption and dispersion profiles. We observe a wide EIT window with a positive slope at the line center for a stationary atom, while for a moving atom the linewidth of the EIT window narrows and the positive dispersion becomes steeper. When a magnetic field is applied, our calculations show multiple EIT subwindows that are significantly narrower and shallower than the single EIT window. The number of EIT subwindows depends on the orientation of the magnetic field. We also obtain multiple positive dispersive regions for subluminal propagation in the medium. Anomalous dispersion exists between two subwindows, indicating superluminal light propagation. Our theoretical analysis explains the experiments performed by Wei et al. [Phys. Rev. A 72, 023806 (2005)] and Iftiquar et al. [Phys. Rev. A 79, 013808 (2009)].
Exponential smoothing weighted correlations
NASA Astrophysics Data System (ADS)
Pozzi, F.; Di Matteo, T.; Aste, T.
2012-06-01
In many practical applications, correlation matrices might be affected by the "curse of dimensionality" and by an excessive sensitivity to outliers and remote observations. These shortcomings can cause problems of statistical robustness, which are especially accentuated when a system of dynamic correlations over a running window is concerned. These drawbacks can be partially mitigated by assigning a structure of weights to observational events. In this paper, we discuss Pearson's ρ and Kendall's τ correlation matrices, weighted with an exponential smoothing, computed on moving windows using a data-set of daily returns for 300 NYSE highly capitalized companies in the period between 2001 and 2003. Criteria for jointly determining optimal weights together with the optimal length of the running window are proposed. We find that the exponential smoothing can provide more robust and reliable dynamic measures, and we discuss how a careful choice of the parameters can reduce the autocorrelation of dynamic correlations whilst keeping the significance and robustness of the measure. Weighted correlations are found to be smoother and to recover faster from market turbulence than their unweighted counterparts, helping also to discriminate more effectively genuine from spurious correlations.
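A minimal sketch of an exponentially smoothed Pearson correlation over a moving window follows; the window length and the smoothing length theta are illustrative choices, not the values tuned in the paper.

```python
import numpy as np

def exp_weighted_corr(x, y, theta):
    """Pearson correlation of two equal-length return series, with weights
    decaying exponentially so the most recent observation counts most."""
    n = len(x)
    w = np.exp((np.arange(n) - (n - 1)) / theta)   # oldest ... newest
    w /= w.sum()
    mx, my = np.sum(w * x), np.sum(w * y)
    cov = np.sum(w * (x - mx) * (y - my))
    sx = np.sqrt(np.sum(w * (x - mx) ** 2))
    sy = np.sqrt(np.sum(w * (y - my) ** 2))
    return cov / (sx * sy)

def rolling_weighted_corr(x, y, window=250, theta=50.0):
    """Apply the weighted correlation over a moving window."""
    out = np.full(len(x), np.nan)
    for end in range(window, len(x) + 1):
        out[end - 1] = exp_weighted_corr(x[end - window:end],
                                         y[end - window:end], theta)
    return out
```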
The Flash-Preview Moving Window Paradigm: Unpacking Visual Expertise One Glimpse at a Time
ERIC Educational Resources Information Center
Litchfield, Damien; Donovan, Tim
2017-01-01
How we make sense of what we see and where best to look is shaped by our experience, our current task goals and how we first perceive our environment. An established way of demonstrating these factors work together is to study how eye movement patterns change as a function of expertise and to observe how experts can solve complex tasks after only…
Impact of Advanced Avionics Technology on Ground Attack Weapon Systems.
1982-02-01
... as the relevant feature. 3.0 Problem: The task is to perform the automatic cueing of moving objects in a natural environment. Additional problems ... views on this subject to the American Defense Preparedness Association (ADPA) on 11 February 1981 in Orlando, Florida. ENVIRONMENTAL CONDITIONS ... the operating window or the environmental conditions of combat that our forces may encounter worldwide. The three areas selected were Europe, the ...
ERIC Educational Resources Information Center
Branzburg, Jeffrey
2007-01-01
Interactive whiteboards have made quite a splash in classrooms in recent years. When a computer image is projected on the whiteboard using an LCD projector, users can directly control the computer from the whiteboard. In some systems such as Smart and Mimio, the finger is used in place of a mouse to open and run programs or move windows around. In…
ERIC Educational Resources Information Center
Whitford, Veronica; O'Driscoll, Gillian A.; Pack, Christopher C.; Joober, Ridha; Malla, Ashok; Titone, Debra
2013-01-01
Language and oculomotor disturbances are 2 of the best replicated findings in schizophrenia. However, few studies have examined skilled reading in schizophrenia (e.g., Arnott, Sali, Copland, 2011; Hayes & O'Grady, 2003; Revheim et al., 2006; E. O. Roberts et al., 2012), and none have examined the contribution of cognitive and motor processes that…
Optimal routing of coordinated aircraft to Identify moving surface contacts
2017-06-01
TAO: Tactical Action Officer; TSP: Traveling Salesman Problem; TSPTW: TSP with Time Windows; UAV: unmanned aerial vehicle; VRP: Vehicle Routing ... Orienteering Problem (OP), while the ORCA TI formulation follows the structure of a time dependent Traveling Salesman Problem (TSP), or a time dependent ... Fox, Kenneth R., Bezalel Gavish, and Stephen C. Graves. 1980. "An n-Constraint Formulation of the (Time Dependent) Traveling Salesman Problem."
ERIC Educational Resources Information Center
Whitford, Veronica; Titone, Debra
2015-01-01
Eye movement measures demonstrate differences in first-language (L1) and second-language (L2) paragraph-level reading as a function of individual differences in current L2 exposure among bilinguals (Whitford & Titone, 2012). Specifically, as current L2 exposure increases, the ease of L2 word processing increases, but the ease of L1 word…
Design and DSP implementation of star image acquisition and star point fast acquiring and tracking
NASA Astrophysics Data System (ADS)
Zhou, Guohui; Wang, Xiaodong; Hao, Zhihang
2006-02-01
A star sensor is a special high accuracy photoelectric sensor, and attitude acquisition time is an important functional index of a star sensor. In this paper, the design target is a dynamic performance of 10 samples per second. On the basis of analyzing the CCD signal timing and star image processing, a new design and a special parallel architecture for improving star image processing are presented. In the design, the operation of moving the data in expanded windows containing the stars to the on-chip memory of the DSP is arranged in the invalid period of the CCD frame signal. While the CCD saves the star image to memory, the DSP processes the data already in its on-chip memory. This parallelism greatly improves the efficiency of processing, and the scheme results in substantial savings of the memory normally required. In the scheme, the DSP HOLD mode and CPLD technology are used to create a memory shared between the CCD and the DSP. The efficiency of processing is examined in numerical tests: the five brightest stars are acquired in only 3.5 ms in the star acquisition stage; in 43 us, the data in five expanded windows containing stars are moved into the internal memory of the DSP; and in 1.6 ms, five star coordinates are obtained in the star tracking stage.
NASA Astrophysics Data System (ADS)
Ham, Seung-Hee; Kato, Seiji; Barker, Howard W.; Rose, Fred G.; Sun-Mack, Sunny
2014-01-01
Three-dimensional (3-D) effects on broadband shortwave top of atmosphere (TOA) nadir radiance, atmospheric absorption, and surface irradiance are examined using 3-D cloud fields obtained from one hour's worth of A-train satellite observations and one-dimensional (1-D) independent column approximation (ICA) and full 3-D radiative transfer simulations. The 3-D minus ICA differences in TOA nadir radiance multiplied by π, atmospheric absorption, and surface downwelling irradiance, denoted as πΔI, ΔA, and ΔT, respectively, are analyzed by cloud type. At the 1 km pixel scale, πΔI, ΔA, and ΔT exhibit poor spatial correlation. Once averaged with a moving window, however, better linear relationships among πΔI, ΔA, and ΔT emerge, especially for moving windows larger than 5 km and large θ0. While cloud properties and solar geometry are shown to influence the relationships amongst πΔI, ΔA, and ΔT, once they are separated by cloud type, their linear relationships become much stronger. This suggests that ICA biases in surface irradiance and atmospheric absorption can be approximated based on ICA biases in nadir radiance as a function of cloud type.
Gostian, Antoniu-Oreste; Schwarz, David; Mandt, Philipp; Anagiotos, Andreas; Ortmann, Magdalene; Pazen, David; Beutner, Dirk; Hüttenbrink, Karl-Bernd
2016-11-01
The round window vibroplasty is a feasible option for the treatment of conductive, sensorineural and mixed hearing loss. Although clinical data suggest a satisfying clinical outcome with various coupling methods, the most efficient technique for coupling the floating mass transducer to the round window is still a matter of debate. To this end, a soft silicone coupler has recently been developed that aims to ease and optimize the stimulation of the round window membrane by this middle ear implant. We performed a temporal bone study evaluating the performance of the soft coupler compared to coupling with individually shaped cartilage, perichondrium and the titanium round window coupler, with loads up to 20 mN, at the unaltered and fully exposed round window niche. The stimulation of the cochlea was measured by the volume velocities of the stapes footplate detected with a laser Doppler vibrometer. The coupling method was found to be a significant factor, with cartilage and perichondrium allowing the highest volume velocities, followed by the soft and titanium couplers. Exposure of the round window niche allowed higher volume velocities, while the applied load did not significantly affect the results. The soft coupler allows good contact with the round window membrane and an effective backward stimulation of the cochlea. Clinical data are required to evaluate the performance of this novel coupling method in vivo.
NASA Astrophysics Data System (ADS)
El-Madany, T.; Griessbaum, F.; Maneke, F.; Chu, H.-S.; Wu, C.-C.; Chang, S. C.; Hsia, Y.-J.; Juang, J.-Y.; Klemm, O.
2010-07-01
To estimate carbon dioxide or water vapor fluxes with the eddy covariance method, high quality data sets are necessary. Under foggy conditions this is challenging, because open path measurements are influenced by the water droplets that cross the measurement path as well as deposit on the windows of the optical path. For the LI-7500, the deposition of droplets on the window results in an intensity reduction of the infrared beam. To keep the strength of the infrared beam under these conditions, the energy is increased. A measure for the increased energy is given by the AGC value (Automatic Gain Control). Up to an AGC threshold value of 70%, the data from the LI-7500 are assumed to be of good quality (personal communication with LICOR). Due to fog deposition on the windows, the AGC value rises above 70% and stays there until the fog disappears and the water on the windows evaporates. To gain better data quality during foggy conditions, a blower system was developed that blows the deposited water droplets off the window. The system is triggered if the AGC value rises above 70%. A pneumatic jack then lifts the blower system towards the LI-7500 and the water droplets are blown off with compressed air. After the AGC value drops below 70%, the pneumatic jack moves back to the idle position. Using this technique showed that not only do the fog droplets deposited on the window cause significant problems for the measurement, but so do the fog droplets inside the measurement path. Under conditions of very dense fog the measured values of carbon dioxide can become unrealistically high, and for water vapor, negative values can be observed even if the AGC value is below 70%. The negative values can be explained by the scattering of the infrared beam on the fog droplets. It is assumed that different types of fog droplet spectra are causing the various error patterns observed. For high quality flux measurements, not only the AGC threshold value of 70% is important, but also the fluctuation of the AGC value within a flux averaging interval. Such AGC value fluctuations can cause severe jumps in the concentration measurements that can hardly be corrected for. Results of fog effects on the LI-7500 performance and its consequences for flux measurements and budget calculations will be presented.
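The trigger logic described above is a simple threshold rule; the sketch below shows one possible control loop, where read_agc(), extend_jack(), blow(), and retract_jack() are hypothetical interfaces to the data logger and the pneumatic hardware, not part of the authors' system.

```python
import time

AGC_THRESHOLD = 70.0   # percent, the LI-7500 quality threshold cited in the text

def clean_window_loop(read_agc, extend_jack, blow, retract_jack, poll_s=5.0):
    """Blow deposited droplets off the analyzer windows whenever the AGC
    value exceeds the threshold, then retract once it recovers."""
    while True:
        if read_agc() > AGC_THRESHOLD:
            extend_jack()          # move the blower towards the LI-7500
            while read_agc() > AGC_THRESHOLD:
                blow()             # pulse of compressed air
                time.sleep(poll_s)
            retract_jack()         # back to the idle position
        time.sleep(poll_s)
```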
Identification of a novel dynamic red blindness in human by event-related brain potentials.
Zhang, Jiahua; Kong, Weijia; Yang, Zhongle
2010-12-01
Dynamic color is an important carrier that takes information in some special occupations. However, up to the present, there are no available and objective tests to evaluate dynamic color processing. To investigate the characteristics of dynamic color processing, we adopted two patterns of visual stimulus called "onset-offset" which reflected static color stimuli and "sustained moving" without abrupt mode which reflected dynamic color stimuli to evoke event-related brain potentials (ERPs) in primary color amblyopia patients (abnormal group) and subjects with normal color recognition ability (normal group). ERPs were recorded by Neuroscan system. The results showed that in the normal group, ERPs in response to the dynamic red stimulus showed frontal positive amplitudes with a latency of about 180 ms, a negative peak at about 240 ms and a peak latency of the late positive potential (LPP) in a time window between 290 and 580 ms. In the abnormal group, ERPs in response to the dynamic red stimulus were fully lost and characterized by vanished amplitudes between 0 and 800 ms. No significant difference was noted in ERPs in response to the dynamic green and blue stimulus between the two groups (P>0.05). ERPs of the two groups in response to the static red, green and blue stimulus were not much different, showing a transient negative peak at about 170 ms and a peak latency of LPP in a time window between 350 and 650 ms. Our results first revealed that some subjects who were not identified as color blindness under static color recognition could not completely apperceive a sort of dynamic red stimulus by ERPs, which was called "dynamic red blindness". Furthermore, these results also indicated that low-frequency ERPs induced by "sustained moving" may be a good and new method to test dynamic color perception competence.
Moving horizon estimation for assimilating H-SAF remote sensing data into the HBV hydrological model
NASA Astrophysics Data System (ADS)
Montero, Rodolfo Alvarado; Schwanenberg, Dirk; Krahe, Peter; Lisniak, Dmytro; Sensoy, Aynur; Sorman, A. Arda; Akkol, Bulut
2016-06-01
Remote sensing information has developed considerably over the past few years, including spatially distributed data for hydrological applications at high resolution. The implementation of these products in operational flow forecasting systems is still an active field of research, in which data assimilation plays a vital role in improving the initial conditions of streamflow forecasts. We present a novel implementation of a variational method based on Moving Horizon Estimation (MHE), applied to the conceptual rainfall-runoff model HBV, to simultaneously assimilate remotely sensed snow covered area (SCA), snow water equivalent (SWE), soil moisture (SM) and in situ measurements of streamflow using large assimilation windows of up to one year. This innovative application of the MHE approach allows precipitation, temperature, soil moisture, and the upper and lower zone water storages of the conceptual model to be updated simultaneously within the assimilation window, without an explicit formulation of error covariance matrices, and it enables a highly flexible formulation of distance metrics for the agreement of simulated and observed variables. The framework is tested at two data-dense sites in Germany and one data-sparse environment in Turkey. Results show a potential improvement of the lead time performance of streamflow forecasts when using perfect time series of state variables generated by the simulation of the conceptual rainfall-runoff model itself. The framework is also tested using new operational data products from the Satellite Application Facility on Support to Operational Hydrology and Water Management (H-SAF) of EUMETSAT. This study is the first application of H-SAF products to hydrological forecasting systems and it verifies their added value. Results from assimilating H-SAF observations lead to a slight reduction of the streamflow forecast skill in all three cases compared to the assimilation of streamflow data only. On the other hand, the forecast skill of soil moisture shows a significant improvement.
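The moving-horizon idea can be illustrated with a toy example, not the HBV implementation: over an assimilation window, a single correction factor on precipitation is chosen to minimize the mismatch between observed and simulated streamflow from a one-parameter linear reservoir. The model structure, variable names, and bounds below are all assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def linear_reservoir(precip, k=0.1, s0=10.0):
    """Toy rainfall-runoff model: storage S, outflow q = k*S."""
    s, q = s0, []
    for p in precip:
        s = s + p - k * s
        q.append(k * s)
    return np.array(q)

def mhe_precip_correction(precip_window, q_obs_window):
    """Estimate a multiplicative precipitation correction over the window
    by minimizing the sum of squared streamflow residuals."""
    def cost(alpha):
        q_sim = linear_reservoir(alpha[0] * precip_window)
        return np.sum((q_sim - q_obs_window) ** 2)
    res = minimize(cost, x0=[1.0], bounds=[(0.5, 2.0)])
    return res.x[0]
```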
Novel hermetic packaging methods for MOEMS
NASA Astrophysics Data System (ADS)
Stark, David
2003-01-01
Hermetic packaging of micro-optoelectromechanical systems (MOEMS) is an immature technology, lacking industry-consensus methods and standards. Off-the-shelf, catalog window assemblies are not yet available. Window assemblies are in general custom designed and manufactured for each new product, resulting in longer than acceptable cycle times, high procurement costs and questionable reliability. There are currently two dominant window-manufacturing methods wherein a metal frame is attached to glass, as well as a third, less-used method. The first method creates a glass-to-metal seal by heating the glass above its Tg to fuse it to the frame. The second method involves first metallizing the glass where it is to be attached to the frame, and then soldering the glass to the frame. The third method employs solder-glass to bond the glass to the frame. A novel alternative with superior features compared to the three previously described window-manufacturing methods is proposed. The new approach lends itself to a plurality of glass-to-metal attachment techniques. Benefits include lower temperature processing than two of the current methods and potentially more cost-effective manufacturing than all three of today's attachment methods.
Bispectral analysis: comparison of two windowing functions
NASA Astrophysics Data System (ADS)
Silvagni, D.; Djerroud, C.; Réveillé, T.; Gravier, E.
2018-02-01
Amongst all the normalized forms of the bispectrum, the bicoherence is shown to be a very useful diagnostic tool in experimental studies of nonlinear wave interactions in plasma, as it measures the fraction of wave power due to quadratic wave coupling in a self-excited fluctuation spectrum [1, 2]. In order to avoid spectral leakage, the application of a windowing function is needed during the bicoherence computation. Spectral leakage from statistically dependent components is of crucial importance in the discrimination between coupled and uncoupled modes, as it introduces into the bicoherence spectrum phase-coupled modes which in reality do not exist. Therefore, the windowing function plays a key role in the bicoherence estimation. In this paper, two windowing methods are compared: the multiplication of the initial signal by the Hanning function and the subtraction of the straight line which links the two extremities of the signal. The influence of these two windowing methods on both the power spectrum and the bicoherence spectrum is shown. Although both methods give precise results, the Hanning function appears to be the more suitable window.
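A minimal segment-averaged squared-bicoherence estimator with a Hanning window is sketched below for reference; the segment length and overlap are arbitrary choices for illustration, not the settings used in the paper.

```python
import numpy as np

def bicoherence(x, nseg=256, noverlap=128):
    """Estimate squared bicoherence b^2(f1, f2) by averaging over
    Hanning-windowed segments of the signal x."""
    x = np.asarray(x, dtype=float)
    step = nseg - noverlap
    win = np.hanning(nseg)
    nf = nseg // 2
    num = np.zeros((nf, nf), dtype=complex)
    d1 = np.zeros((nf, nf))
    d2 = np.zeros((nf, nf))
    k1 = np.arange(nf)[:, None]
    k2 = np.arange(nf)[None, :]
    k12 = k1 + k2                      # bin index of f1 + f2
    valid = k12 < nf                   # keep sums below Nyquist
    for start in range(0, len(x) - nseg + 1, step):
        X = np.fft.fft(win * x[start:start + nseg])
        prod = X[k1] * X[k2]
        X12 = X[k12]
        num += prod * np.conj(X12)
        d1 += np.abs(prod) ** 2
        d2 += np.abs(X12) ** 2
    b2 = np.abs(num) ** 2 / (d1 * d2 + 1e-30)
    return np.where(valid, b2, np.nan)
```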
NASA Astrophysics Data System (ADS)
Brewick, P. T.; Smyth, A. W.
2014-12-01
The accurate and reliable estimation of modal damping from output-only vibration measurements of structural systems is a continuing challenge in the fields of operational modal analysis (OMA) and system identification. In this paper a modified version of the blind source separation (BSS)-based Second-Order Blind Identification (SOBI) method was used to perform modal damping identification on a model bridge structure under varying loading conditions. The bridge model was created with finite elements and consisted of a series of stringer beams supported by a larger girder. The excitation was separated into two categories: ambient noise and traffic loads, with noise modeled by random forcing vectors and traffic simulated with moving loads for cars and partially distributed moving masses for trains. The acceleration responses were treated as the mixed output signals for the BSS algorithm. The modified SOBI method used a windowing technique to maximize the amount of information used for blind identification from the responses. The modified SOBI method successfully found the mode shapes for both types of excitation with high accuracy, but power spectral densities (PSDs) of the recovered modal responses showed signs of distortion for the traffic simulations. The distortion had an adverse effect on the damping ratio estimates for some of the modes, but no correlation could be found between the accuracy of the damping estimates and the accuracy of the recovered mode shapes. The responses and their PSDs were compared to real-world collected data, and patterns similar to the distortion were observed, implying that this issue likely affects real-world estimates.
On Time Delay Margin Estimation for Adaptive Control and Optimal Control Modification
NASA Technical Reports Server (NTRS)
Nguyen, Nhan T.
2011-01-01
This paper presents methods for estimating the time delay margin for adaptive control of input delay systems with almost linear structured uncertainty. The bounded linear stability analysis method seeks to represent an adaptive law by a locally bounded linear approximation within a small time window. The time delay margin of this input delay system represents a local stability measure and is computed analytically by three methods: Pade approximation, the Lyapunov-Krasovskii method, and the matrix measure method. These methods are applied to the standard model-reference adaptive control, the s-modification adaptive law, and the optimal control modification adaptive law. The windowing analysis results in non-unique estimates of the time delay margin since it is dependent on the length of a time window and parameters which vary from one time window to the next. The optimal control modification adaptive law overcomes this limitation in that, as the adaptive gain tends to infinity and if the matched uncertainty is linear, then the closed-loop input delay system tends to a LTI system. A lower bound of the time delay margin of this system can then be estimated uniquely without the need for the windowing analysis. Simulation results demonstrate the feasibility of the bounded linear stability method for time delay margin estimation.
Method of identifying features in indexed data
Jarman, Kristin H [Richland, WA; Daly, Don Simone [Richland, WA; Anderson, Kevin K [Richland, WA; Wahl, Karen L [Richland, WA
2001-06-26
The present invention is a method of identifying features in indexed data, especially useful for distinguishing signal from noise in data provided as a plurality of ordered pairs. Each of the plurality of ordered pairs has an index and a response. The method has the steps of: (a) providing an index window having a first window end located on a first index and extending across a plurality of indices to a second window end; (b) selecting responses corresponding to the plurality of indices within the index window and computing a measure of dispersion of the responses; and (c) comparing the measure of dispersion to a dispersion critical value. Advantages of the present invention include minimizing signal to noise ratio, signal drift, varying baseline signal and combinations thereof.
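A hedged sketch of the windowed-dispersion idea in the patent follows: slide an index window over the (index, response) pairs, compute a dispersion measure of the responses, and flag windows whose dispersion exceeds a critical value. Here the dispersion is simply a sample standard deviation and the critical value is a placeholder, not the statistic or threshold defined in the patent.

```python
import numpy as np

def flag_features(index, response, window=15, critical_value=3.0):
    """Return window-start positions whose response dispersion exceeds the
    critical value, suggesting a feature rather than baseline noise."""
    order = np.argsort(index)                 # ensure ordered pairs
    r = np.asarray(response, dtype=float)[order]
    flagged = []
    for start in range(len(r) - window + 1):
        dispersion = np.std(r[start:start + window], ddof=1)
        if dispersion > critical_value:
            flagged.append(start)
    return flagged
```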
Using OpenOffice as a Portable Interface to JAVA-Based Applications
NASA Astrophysics Data System (ADS)
Comeau, T.; Garrett, B.; Richon, J.; Romelfanger, F.
2004-07-01
STScI previously used Microsoft Word and Microsoft Access, a Sybase ODBC driver, and the Adobe Acrobat PDF writer, along with a substantial amount of Visual Basic, to generate a variety of documents for the internal Space Telescope Grants Administration System (STGMS). While investigating an upgrade to Microsoft Office XP, we began considering alternatives, ultimately selecting an open source product, OpenOffice.org. This reduces the total number of products required to operate the internal STGMS system, simplifies the build system, and opens the possibility of moving to a non-Windows platform. We describe the experience of moving from Microsoft Office to OpenOffice.org, and our other internal uses of OpenOffice.org in our development environment.
Hybrid window layer for photovoltaic cells
Deng, Xunming
2010-02-23
A novel photovoltaic solar cell and method of making the same are disclosed. The solar cell includes: at least one absorber layer, which can be either a lightly doped layer or an undoped layer, and at least one doped window layer which comprises at least two sub-window-layers. The first sub-window-layer, which is next to the absorber layer, is deposited to form a desirable junction with the absorber layer. The second sub-window-layer, which is next to the first sub-window-layer but not in direct contact with the absorber layer, is deposited so as to have transmission higher than that of the first sub-window-layer.
Hybrid window layer for photovoltaic cells
Deng, Xunming [Syvania, OH; Liao, Xianbo [Toledo, OH; Du, Wenhui [Toledo, OH
2011-10-04
A novel photovoltaic solar cell and method of making the same are disclosed. The solar cell includes: at least one absorber layer, which can be either a lightly doped layer or an undoped layer, and at least one doped window layer which comprises at least two sub-window-layers. The first sub-window-layer, which is next to the absorber layer, is deposited to form a desirable junction with the absorber layer. The second sub-window-layer, which is next to the first sub-window-layer but not in direct contact with the absorber layer, is deposited so as to have transmission higher than that of the first sub-window-layer.
Hybrid window layer for photovoltaic cells
Deng, Xunming [Sylvania, OH; Liao, Xianbo [Toledo, OH; Du, Wenhui [Toledo, OH
2011-02-01
A novel photovoltaic solar cell and method of making the same are disclosed. The solar cell includes: at least one absorber layer, which can be either a lightly doped layer or an undoped layer, and at least one doped window layer which comprises at least two sub-window-layers. The first sub-window-layer, which is next to the absorber layer, is deposited to form a desirable junction with the absorber layer. The second sub-window-layer, which is next to the first sub-window-layer but not in direct contact with the absorber layer, is deposited so as to have transmission higher than that of the first sub-window-layer.
Detection of Early Ischemic Changes in Noncontrast CT Head Improved with "Stroke Windows".
Mainali, Shraddha; Wahba, Mervat; Elijovich, Lucas
2014-01-01
Introduction. Noncontrast head CT (NCCT) is the standard radiologic test for patients presenting with acute stroke. Early ischemic changes (EIC) are often overlooked on initial NCCT. We determine the sensitivity and specificity of improved EIC detection by a standardized method of image evaluation (Stroke Windows). Methods. We performed a retrospective chart review to identify patients with acute ischemic stroke who had NCCT at presentation. EIC was defined by the presence of hyperdense MCA/basilar artery sign; sulcal effacement; basal ganglia/subcortical hypodensity; and loss of cortical gray-white differentiation. NCCT was reviewed with standard window settings and with specialized Stroke Windows. Results. Fifty patients (42% females, 58% males) with a mean NIHSS of 13.4 were identified. EIC was detected in 9 patients with standard windows, while EIC was detected using Stroke Windows in 35 patients (18% versus 70%; P < 0.0001). Hyperdense MCA sign was the most commonly reported EIC; it was better detected with Stroke Windows (14% and 36%; P < 0.0198). Detection of the remaining EIC also improved with Stroke Windows (6% and 46%; P < 0.0001). Conclusions. Detection of EIC has important implications in diagnosis and treatment of acute ischemic stroke. Utilization of Stroke Windows significantly improved detection of EIC.
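Display windowing of CT data is a simple linear mapping of Hounsfield units onto gray levels; the sketch below applies a narrow window of the kind used to accentuate subtle gray-white differences. The width and level values are illustrative examples only, not the settings defined in the study.

```python
import numpy as np

def apply_ct_window(hu, level=35.0, width=35.0):
    """Map Hounsfield units to [0, 1] display values for a given window
    level and width (narrow windows exaggerate small density differences)."""
    lo = level - width / 2.0
    hi = level + width / 2.0
    return np.clip((hu - lo) / (hi - lo), 0.0, 1.0)

# Example usage (ct_slice is a 2-D array of Hounsfield units):
# brain = apply_ct_window(ct_slice, level=40, width=80)   # conventional brain window
# narrow = apply_ct_window(ct_slice, level=35, width=35)  # narrow "stroke"-style window
```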
Wilson, Ander; Chiu, Yueh-Hsiu Mathilda; Hsu, Hsiao-Hsien Leon; Wright, Robert O; Wright, Rosalind J; Coull, Brent A
2017-07-01
Epidemiological research supports an association between maternal exposure to air pollution during pregnancy and adverse children's health outcomes. Advances in exposure assessment and statistics allow for estimation of both critical windows of vulnerability and exposure effect heterogeneity. Simultaneous estimation of windows of vulnerability and effect heterogeneity can be accomplished by fitting a distributed lag model (DLM) stratified by subgroup. However, this can provide an incomplete picture of how effects vary across subgroups because it does not allow for subgroups to have the same window but different within-window effects or to have different windows but the same within-window effect. Because the timing of some developmental processes are common across subpopulations of infants while for others the timing differs across subgroups, both scenarios are important to consider when evaluating health risks of prenatal exposures. We propose a new approach that partitions the DLM into a constrained functional predictor that estimates windows of vulnerability and a scalar effect representing the within-window effect directly. The proposed method allows for heterogeneity in only the window, only the within-window effect, or both. In a simulation study we show that a model assuming a shared component across groups results in lower bias and mean squared error for the estimated windows and effects when that component is in fact constant across groups. We apply the proposed method to estimate windows of vulnerability in the association between prenatal exposures to fine particulate matter and each of birth weight and asthma incidence, and estimate how these associations vary by sex and maternal obesity status in a Boston-area prospective pre-birth cohort study. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
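For orientation, a minimal unconstrained-by-subgroup distributed lag model sketch is given below, not the authors' partitioned functional estimator: weekly exposures are multiplied by a smooth lag-coefficient curve parameterized with a low-order polynomial basis and fitted by ordinary least squares. The lag count, basis degree, and array names are arbitrary assumptions.

```python
import numpy as np

def fit_dlm(lagged_exposure, y, degree=3):
    """lagged_exposure: (n_subjects, n_lags) matrix of weekly exposures;
    y: outcome vector. Returns the estimated lag-coefficient curve beta(lag)."""
    n, n_lags = lagged_exposure.shape
    lags = np.arange(n_lags)
    # smoothness constraint: beta(lag) = B @ theta with a polynomial basis
    B = np.vander(lags / n_lags, degree + 1, increasing=True)
    design = np.column_stack([np.ones(n), lagged_exposure @ B])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    theta = coef[1:]
    beta_lag = B @ theta      # within-window effect at each lag
    return beta_lag
```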
Occupant-responsive optimal control of smart facade systems
NASA Astrophysics Data System (ADS)
Park, Cheol-Soo
Windows provide occupants with daylight, direct sunlight, visual contact with the outside and a feeling of openness. Windows enable the use of daylighting and offer occupants an outside view. Glazing may also cause problems such as undesired heat gain/loss in winter. An over-lit window can cause glare, which is another major complaint by occupants. Furthermore, cold or hot window surfaces induce asymmetric thermal radiation which can result in thermal discomfort. To reduce the potential problems of window systems, double skin facades and airflow window systems were introduced in the 1970s. They typically contain interstitial louvers and ventilation openings. The current problem with double skin facades and airflow windows is that their operation requires adequate dynamic control to reach their expected performance. Many studies have recognized that only optimal control enables these systems to truly act as active energy savers and indoor environment controllers. However, an adequate solution for this dynamic optimization problem has thus far not been developed. The primary objective of this study is to develop occupant-responsive optimal control of smart facade systems. The control could be implemented as a smart controller that operates the motorized Venetian blind system and the opening ratio of ventilation openings. The objective of the control is to combine the benefits of large windows with low energy demands for heating and cooling, while keeping visual well-being and thermal comfort at an optimal level. The control uses a simulation model with an embedded optimization routine that allows occupant interaction via the Web. An occupant can access the smart controller from a standard browser and choose a pre-defined mode (energy saving mode, visual comfort mode, thermal comfort mode, default mode, nighttime mode) or set a preferred mode (user-override mode) by moving preference sliders on the screen. The most prominent feature of these systems is the capability of dynamically reacting to environmental input data through real-time optimization. The proposed occupant-responsive optimal control of smart facade systems could provide a breakthrough in this under-developed area and lead to renewed interest in smart facade systems.
Active noise attenuation in ventilation windows.
Huang, Huahua; Qiu, Xiaojun; Kang, Jian
2011-07-01
The feasibility of applying active noise control techniques to attenuate low frequency noise transmission through a natural ventilation window into a room is investigated analytically and experimentally. The window system is constructed by staggering the opening sashes of a spaced double glazing window to allow ventilation and natural light. An analytical model based on the modal expansion method is developed to calculate the low frequency sound field inside the window and the room and to be used in the active noise control simulations. The effectiveness of the proposed analytical model is validated using the finite element method. The performance of the active control system for a window with different source and receiver configurations is compared, and it is found that the numerical and experimental results are in good agreement and that the best result is achieved when the secondary sources are placed in the center at the bottom of the staggered window. The extra attenuation at the observation points in the optimized window system is almost equivalent to the noise reduction at the error sensor, and the frequency range of effective control extends up to 390 Hz in the case of a single channel active noise control system. © 2011 Acoustical Society of America
Fixed-pattern noise correction method based on improved moment matching for a TDI CMOS image sensor.
Xu, Jiangtao; Nie, Huafeng; Nie, Kaiming; Jin, Weimin
2017-09-01
In this paper, an improved moment matching method based on a spatial correlation filter (SCF) and a bilateral filter (BF) is proposed to correct the fixed-pattern noise (FPN) of a time-delay-integration CMOS image sensor (TDI-CIS). First, the values of row FPN (RFPN) and column FPN (CFPN) are estimated and added to the original image through the SCF and BF, respectively. Then the filtered image is processed by an improved moment matching method with a moving window. Experimental results based on a 128-stage TDI-CIS show that, after correcting the FPN in an image captured under uniform illumination, the standard deviation of the row mean vector (SDRMV) decreases from 5.6761 LSB to 0.1948 LSB, while the standard deviation of the column mean vector (SDCMV) decreases from 15.2005 LSB to 13.1949 LSB. In addition, for different images captured by different TDI-CISs, the average decreases of SDRMV and SDCMV are 5.4922 LSB and 2.0357 LSB, respectively. Comparative experimental results indicate that the proposed method can effectively correct the FPN of different TDI-CISs while maintaining image details, without any auxiliary equipment.
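The moving-window moment matching step can be illustrated roughly as follows: each column's mean and standard deviation are adjusted toward the statistics of a local window of neighboring columns. The window size is a placeholder and the SCF/BF prefiltering of the paper is omitted; this is a sketch of the general idea, not the published algorithm.

```python
import numpy as np

def moment_match_columns(img, half_window=8):
    """Suppress column FPN by matching each column's first two moments to
    the statistics of a moving window of surrounding columns."""
    img = img.astype(np.float64)
    out = np.empty_like(img)
    n_cols = img.shape[1]
    for c in range(n_cols):
        lo = max(0, c - half_window)
        hi = min(n_cols, c + half_window + 1)
        ref = img[:, lo:hi]
        ref_mean, ref_std = ref.mean(), ref.std()
        col = img[:, c]
        col_std = col.std() if col.std() > 0 else 1.0
        out[:, c] = (col - col.mean()) * (ref_std / col_std) + ref_mean
    return out
```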
Thurlow, W R
1980-01-01
Messages were presented which moved from right to left along an electronic alphabetic display whose "window" size was varied from 4 through 32 letter spaces. Deaf subjects signed the messages they perceived. Relatively few errors were made even at the highest rate of presentation, which corresponded to a typing rate of 60 words/min. It is concluded that many deaf persons can make effective use of a small visual display, and that a reduced display size therefore makes lower-cost visual communication instruments possible for these users. Deaf subjects who can profit from a small display can be identified by a sentence test administered by tape recorder which drives the display of the communication device by means of the standard code of the deaf teletype network.
Automated variance reduction for MCNP using deterministic methods.
Sweezy, J; Brown, F; Booth, T; Chiaramonte, J; Preeg, B
2005-01-01
In order to reduce the user's time and the computer time needed to solve deep penetration problems, an automated variance reduction capability has been developed for the MCNP Monte Carlo transport code. This new variance reduction capability developed for MCNP5 employs the PARTISN multigroup discrete ordinates code to generate mesh-based weight windows. The technique of using deterministic methods to generate importance maps has been widely used to increase the efficiency of deep penetration Monte Carlo calculations. The application of this method in MCNP uses the existing mesh-based weight window feature to translate the MCNP geometry into geometry suitable for PARTISN. The adjoint flux, which is calculated with PARTISN, is used to generate mesh-based weight windows for MCNP. Additionally, the MCNP source energy spectrum can be biased based on the adjoint energy spectrum at the source location. This method can also use angle-dependent weight windows.
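The adjoint-based weight-window generation can be sketched as follows. This is a simplified CADIS-style normalization written only for illustration; the mesh arrays, the source-cell mask, and the window width ratio are assumptions, not MCNP5 or PARTISN internals.

```python
import numpy as np

def weight_windows_from_adjoint(adjoint_flux, source_cells, ratio=5.0):
    """Lower weight-window bounds inversely proportional to the adjoint flux,
    normalized so particles born in the source region start near the center
    of their window with weight ~1."""
    phi = np.asarray(adjoint_flux, dtype=float)
    phi = np.where(phi > 0, phi, phi[phi > 0].min())   # guard empty cells
    w_lower = 1.0 / phi
    # normalize: average source-region lower bound times (1 + ratio)/2 = 1
    w_lower *= 2.0 / ((1.0 + ratio) * np.mean(w_lower[source_cells]))
    w_upper = ratio * w_lower
    return w_lower, w_upper
```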
Testing Land-Vegetation retrieval algorithms for the ICESat-2 mission
NASA Astrophysics Data System (ADS)
Zhou, T.; Popescu, S. C.
2017-12-01
The upcoming spaceborne satellite, the Ice, Cloud and land Elevation Satellite 2 (ICESat-2), will provide topography and canopy profiles at the global scale using photon counting LiDAR. To prepare for the mission launch, the aim of this research is to develop a framework for retrieving ground and canopy height in different forest types and noise levels using data from two ICESat-2 testbed sensors: MABEL (Multiple Altimeter Beam Experimental Lidar) data and simulated ICESat-2 data. The first step of the framework is to reduce as many noise photons as possible through grid statistical methods and cluster analysis. Subsequently, we employed overlapping moving windows and estimated quantile heights in each window to characterize the likely ground and canopy top using the filtered photons. Both MABEL and simulated ICESat-2 data generated satisfactory results with reasonable accuracy, while the results from simulated ICESat-2 data were better than those from MABEL data, with smaller root mean square errors (RMSEs). For example, the RMSEs of canopy top identification in various vegetation types using simulated ICESat-2 data were less than 3.78 m, compared with 6.48 m for the MABEL data. It is anticipated that the methodology will advance data processing for the ICESat-2 mission and expand potential applications of ICESat-2 data once available, such as mapping vegetation canopy heights.
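A rough sketch of the moving-window quantile step follows: after noise filtering, photon heights in each overlapping along-track window are reduced to low and high quantiles as candidate ground and canopy-top surfaces. The window length, overlap, quantile levels, and minimum photon count are illustrative assumptions, not the study's settings.

```python
import numpy as np

def window_surfaces(along_track, height, win_len=30.0, step=15.0,
                    ground_q=0.05, canopy_q=0.95, min_photons=10):
    """Return window centers with ground and canopy-top height estimates
    taken as quantiles of the filtered photon heights in each window."""
    centers, ground, canopy = [], [], []
    start = along_track.min()
    while start + win_len <= along_track.max():
        sel = (along_track >= start) & (along_track < start + win_len)
        if np.count_nonzero(sel) >= min_photons:   # need enough photons
            centers.append(start + win_len / 2.0)
            ground.append(np.quantile(height[sel], ground_q))
            canopy.append(np.quantile(height[sel], canopy_q))
        start += step
    return np.array(centers), np.array(ground), np.array(canopy)
```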
Self spectrum window method in wigner-ville distribution.
Liu, Zhongguo; Liu, Changchun; Liu, Boqiang; Lv, Yangsheng; Lei, Yinsheng; Yu, Mengsun
2005-01-01
Wigner-Ville distribution (WVD) is an important type of time-frequency analysis in biomedical signal processing. The cross-term interference in WVD has a disadvantageous influence on its application. In this research, the Self Spectrum Window (SSW) method was put forward to suppress the cross-term interference, based on the fact that the cross-terms and auto-WVD terms in the integral kernel function are orthogonal. With the Self Spectrum Window (SSW) algorithm, a real auto-WVD function was used as a template to cross-correlate with the integral kernel function, and the Short Time Fourier Transform (STFT) spectrum of the signal was used as a window function to process the WVD in the time-frequency plane. The SSW method was confirmed by computer simulation with good analysis results. A satisfactory time-frequency distribution was obtained.
Nonlinear filtering properties of detrended fluctuation analysis
NASA Astrophysics Data System (ADS)
Kiyono, Ken; Tsujimoto, Yutaka
2016-11-01
Detrended fluctuation analysis (DFA) has been widely used for quantifying long-range correlation and fractal scaling behavior. In DFA, to avoid spurious detection of scaling behavior caused by a nonstationary trend embedded in the analyzed time series, a detrending procedure using piecewise least-squares fitting has been applied. However, it has been pointed out that the nonlinear filtering properties involved with detrending may induce instabilities in the scaling exponent estimation. To understand this issue, we investigate the adverse effects of the DFA detrending procedure on the statistical estimation. We show that the detrending procedure using piecewise least-squares fitting results in the nonuniformly weighted estimation of the root-mean-square deviation and that this property could induce an increase in the estimation error. In addition, for comparison purposes, we investigate the performance of a centered detrending moving average analysis with a linear detrending filter and sliding window DFA and show that these methods have better performance than the standard DFA.
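For reference, a bare-bones DFA implementation using the piecewise least-squares detrending discussed above is sketched below (first-order detrending, non-overlapping windows); it is a minimal illustration, not the centered moving-average or sliding-window variants compared in the paper.

```python
import numpy as np

def dfa(x, scales, order=1):
    """Return the fluctuation function F(n) for each window size n in scales."""
    y = np.cumsum(np.asarray(x, dtype=float) - np.mean(x))  # integrated profile
    F = []
    for n in scales:
        n_win = len(y) // n
        ms = []
        for i in range(n_win):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, order)     # piecewise least-squares trend
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.array(F)

# The scaling exponent is the slope of log F(n) versus log n.
```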
NASA Technical Reports Server (NTRS)
Tescher, Andrew G. (Editor)
1989-01-01
Various papers on image compression and automatic target recognition are presented. Individual topics addressed include: target cluster detection in cluttered SAR imagery, model-based target recognition using laser radar imagery, Smart Sensor front-end processor for feature extraction of images, object attitude estimation and tracking from a single video sensor, symmetry detection in human vision, analysis of high resolution aerial images for object detection, obscured object recognition for an ATR application, neural networks for adaptive shape tracking, statistical mechanics and pattern recognition, detection of cylinders in aerial range images, moving object tracking using local windows, new transform method for image data compression, quad-tree product vector quantization of images, predictive trellis encoding of imagery, reduced generalized chain code for contour description, compact architecture for a real-time vision system, use of human visibility functions in segmentation coding, color texture analysis and synthesis using Gibbs random fields.
Bias Reduction in Short Records of Satellite Soil Moisture
NASA Technical Reports Server (NTRS)
Reichle, Rolf H.; Koster, Randal D.
2004-01-01
Although surface soil moisture data from different sources (satellite retrievals, ground measurements, and land model integrations of observed meteorological forcing data) have been shown to contain consistent and useful information in their seasonal cycle and anomaly signals, they typically exhibit very different mean values and variability. These biases pose a severe obstacle to exploiting the useful information contained in satellite retrievals through data assimilation. A simple method of bias removal is to match the cumulative distribution functions (cdf) of the satellite and model data. However, accurate cdf estimation typically requires a long record of satellite data. We demonstrate here that by using spatial sampling with a 2 degree moving window we can obtain local statistics based on a one-year satellite record that are a good approximation to those that would be derived from a much longer time series. This result should increase the usefulness of relatively short satellite data records.
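A minimal sketch of cdf matching is given below: each satellite value is replaced by the model value at the same empirical quantile. In practice the samples would be pooled over a moving spatial window as described above; the function here treats the pooled arrays as given and is only an illustration of the mapping itself.

```python
import numpy as np

def cdf_match(satellite, model):
    """Rescale satellite soil moisture so its empirical cdf matches the
    model's cdf (both arrays are samples from the same location/window)."""
    satellite = np.asarray(satellite, dtype=float)
    sat_sorted = np.sort(satellite)
    mod_sorted = np.sort(np.asarray(model, dtype=float))
    # empirical quantile of each satellite value, then the model value there
    q = np.searchsorted(sat_sorted, satellite, side="right") / len(sat_sorted)
    return np.quantile(mod_sorted, np.clip(q, 0.0, 1.0))
```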
Polyp measurement with CT colonography: multiple-reader, multiple-workstation comparison.
Young, Brett M; Fletcher, J G; Paulsen, Scott R; Booya, Fargol; Johnson, C Daniel; Johnson, Kristina T; Melton, Zackary; Rodysill, Drew; Mandrekar, Jay
2007-01-01
The risk of invasive colorectal cancer in colorectal polyps correlates with lesion size. Our purpose was to define the most accurate methods for measuring polyp size at CT colonography (CTC) using three models of workstations and multiple observers. Six reviewers measured 24 unique polyps of known size (5, 7, 10, and 12 mm), shape (sessile, flat, and pedunculated), and location (straight or curved bowel segment) using CTC data sets obtained at two doses (5 mAs and 65 mAs) and a previously described colonic phantom model. Reviewers measured the largest diameter of polyps on three proprietary workstations. Each polyp was measured with lung and soft-tissue windows on axial, 2D multiplanar reconstruction (MPR), and 3D images. There were significant differences among measurements obtained at various settings within each workstation (p < 0.0001). Measurements on 2D images were more accurate with lung window than with soft-tissue window settings (p < 0.0001). For the 65-mAs data set, the most accurate measurements were obtained in analysis of axial images with lung window, 2D MPR images with lung window, and 3D tissue cube images for Wizard, Advantage, and Vitrea workstations, respectively, without significant differences in accuracy among techniques (0.11 < p < 0.59). The mean absolute error values for these optimal settings were 0.48 mm, 0.61 mm, and 0.76 mm, respectively, for the three workstations. Within the ultralow-dose 5-mAs data set the best methods for Wizard, Advantage, and Vitrea were axial with lung window, 2D MPR with lung window, and 2D MPR with lung window, respectively. Use of nearly all measurement methods, except for the Vitrea 3D tissue cube and the Wizard 2D MPR with lung window, resulted in undermeasurement of the true size of the polyps. Use of CTC computer workstations facilitates accurate polyp measurement. For routine CTC examinations, polyps should be measured with lung window settings on 2D axial or MPR images (Wizard and Advantage) or 3D images (Vitrea). When these optimal methods are used, these three commercial workstations do not differ significantly in acquisition of accurate polyp measurements at routine dose settings.
NASA Astrophysics Data System (ADS)
Ojeda, GermáN. Y.; Whitman, Dean
2002-11-01
The effective elastic thickness (Te) of the lithosphere is a parameter that describes the flexural strength of a plate. A method routinely used to quantify this parameter is to calculate the coherence between the two-dimensional gravity and topography spectra. Prior to spectra calculation, data grids must be "windowed" in order to avoid edge effects. We investigated the sensitivity of Te estimates obtained via the coherence method to mirroring, Hanning and multitaper windowing techniques on synthetic data as well as on data from northern South America. These analyses suggest that the choice of windowing technique plays an important role in Te estimates and may result in discrepancies of several kilometers depending on the selected windowing method. Te results from mirrored grids tend to be greater than those from Hanning smoothed or multitapered grids. Results obtained from mirrored grids are likely to be over-estimates. This effect may be due to artificial long wavelengths introduced into the data at the time of mirroring. Coherence estimates obtained from three subareas in northern South America indicate that the average effective elastic thickness is in the range of 29-30 km, according to Hanning and multitaper windowed data. Lateral variations across the study area could not be unequivocally determined from this study. We suggest that the resolution of the coherence method does not permit evaluation of small (i.e., ˜5 km), local Te variations. However, the efficiency and robustness of the coherence method in rendering continent-scale estimates of elastic thickness has been confirmed.
Radiation-transparent windows, method for imaging fluid transfers
Shu, Deming [Darien, IL]; Wang, Jin [Burr Ridge, IL]
2011-07-26
A thin, x-ray-transparent window system for environmental chambers involving pneumatic pressures above 40 bar is presented. The window allows for x-ray access to such phenomena as fuel sprays injected into a pressurized chamber that mimics realistic internal combustion engine cylinder operating conditions.
Adaptive Window Zero-Crossing-Based Instantaneous Frequency Estimation
NASA Astrophysics Data System (ADS)
Sekhar, S. Chandra; Sreenivas, TV
2004-12-01
We address the problem of estimating instantaneous frequency (IF) of a real-valued constant amplitude time-varying sinusoid. Estimation of polynomial IF is formulated using the zero-crossings of the signal. We propose an algorithm to estimate nonpolynomial IF by local approximation using a low-order polynomial, over a short segment of the signal. This involves the choice of window length to minimize the mean square error (MSE). The optimal window length found by directly minimizing the MSE is a function of the higher-order derivatives of the IF which are not available a priori. However, an optimum solution is formulated using an adaptive window technique based on the concept of intersection of confidence intervals. The adaptive algorithm enables minimum MSE-IF (MMSE-IF) estimation without requiring a priori information about the IF. Simulation results show that the adaptive window zero-crossing-based IF estimation method is superior to fixed window methods and is also better than adaptive spectrogram and adaptive Wigner-Ville distribution (WVD)-based IF estimators at different signal-to-noise ratios (SNRs).
Li, Rongxia; Stewart, Brock; Weintraub, Eric
2016-01-01
The self-controlled case series (SCCS) and self-controlled risk interval (SCRI) designs have recently become widely used in the field of post-licensure vaccine safety monitoring to detect potential elevated risks of adverse events following vaccinations. The SCRI design can be viewed as a subset of the SCCS method in that a reduced comparison time window is used for the analysis. Compared to the SCCS method, the SCRI design has less statistical power due to fewer events occurring in the shorter control interval. In this study, we derived the asymptotic relative efficiency (ARE) between these two methods to quantify this loss in power in the SCRI design. The equation is formulated as [Formula: see text] (a: control window-length ratio between SCRI and SCCS designs; b: ratio of risk window length and control window length in the SCCS design; and [Formula: see text]: relative risk of exposed window to control window). According to this equation, the relative efficiency declines as the ratio of control-period length between SCRI and SCCS methods decreases, or with an increase in the relative risk [Formula: see text]. We provide an example utilizing data from the Vaccine Safety Datalink (VSD) to study the potential elevated risk of febrile seizure following seasonal influenza vaccine in the 2010-2011 season.
Fabrication of Microcapsules for Dye-Doped Polymer-Dispersed Liquid Crystal-Based Smart Windows.
Kim, Mingyun; Park, Kyun Joo; Seok, Seunghwan; Ok, Jong Min; Jung, Hee-Tae; Choe, Jaehoon; Kim, Do Hyun
2015-08-19
A dye-doped polymer-dispersed liquid crystal (PDLC) is an attractive material for application in smart windows. Smart windows using a PDLC can be operated simply and have a high contrast ratio compared to other devices that employ photochromic or thermochromic materials. However, in conventional dye-doped PDLC methods, dye contamination can cause problems and has limited the commercialization of electric smart windows. Here, we report on an approach to resolve dye-related problems by encapsulating the dye in monodispersed capsules. By encapsulation, a fabricated dye-doped PDLC had a contrast ratio of >120 at 600 nm. This fabrication method of encapsulating the dye in a core-shell structured microcapsule in a dye-doped PDLC device provides a practical platform for dye-doped PDLC-based smart windows.
Assessment of gliosis around moveable implants in the brain
Stice, Paula
2010-01-01
Repositioning microelectrodes post-implantation is emerging as a promising approach to achieve long-term reliability in single neuronal recordings. The main goal of this study was to (a) assess glial reaction in response to movement of microelectrodes in the brain post-implantation and (b) determine an optimal window of time post-implantation when movement of microelectrodes within the brain would result in minimal glial reaction. Eleven Sprague-Dawley rats were implanted with two microelectrodes each that could be moved in vivo post-implantation. Three cohorts were investigated: (1) microelectrode moved at day 2 (n = 4 animals), (2) microelectrode moved at day 14 (n = 5 animals) and (3) microelectrode moved at day 28 (n = 2 animals). Histological evaluation was performed in cohorts 1–3 at four-week post-movement (30 days, 42 days and 56 days post-implantation, respectively). In addition, five control animals were implanted with microelectrodes that were not moved. Control animals were implanted for (1) 30 days (n = 1), (2) 42 days (n = 2) and (3) 56 days (n = 2) prior to histological evaluation. Quantitative assessment of glial fibrillary acidic protein (GFAP) around the tip of the microelectrodes demonstrated that GFAP levels were similar around microelectrodes moved at day 2 when compared to the 30-day controls. However, GFAP expression levels around microelectrode tips that moved at day 14 and day 28 were significantly less than those around control microelectrodes implanted for 42 and 56 days, respectively. Therefore, we conclude that moving microelectrodes after implantation is a viable strategy that does not result in any additional damage to brain tissue. Further, moving the microelectrode downwards after 14 days of implantation may actually reduce the levels of GFAP expression around the tips of the microelectrodes in the long term. PMID:19556680
Sunlight Responsive Thermochromic Window System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Millett, F. A.; Byker, H. J.
2006-10-27
Pleotint has embarked on a novel approach with our Sunlight Responsive Thermochromic, SRT™, windows. We are integrating dynamic sunlight control, high insulation values and low solar heat gain together in a high performance window. The Pleotint SRT window is dynamic because it reversibly changes light transmission based on thermochromics activated directly by the heating effect of sunlight. We can achieve a window package with a low solar heat gain coefficient (SHGC), a low U value and high insulation. At the same time our windows provide good daylighting. Our innovative window design offers architects and building designers the opportunity to choose their desired energy performance, excellent sound reduction, a self-cleaning external pane, or resistance to wind load, blasts, bullets or hurricanes. SRT windows would provide energy savings estimated at up to 30% over traditional window systems. Glass fabricators will be able to use existing equipment to make the SRT window while adding value and flexibility to the basic design. Glazing installers will have the ability to fit the windows with traditional methods without wires, power supplies and controllers. SRT windows can be retrofit into existing buildings.
NASA Astrophysics Data System (ADS)
Lyubushin, Alexey
2016-04-01
The problem of estimating current seismic danger from monitoring of seismic noise properties recorded by the broadband seismic network F-net in Japan (84 stations) is considered. Variations of the following seismic noise parameters are analyzed: multifractal singularity spectrum support width, generalized Hurst exponent, minimum Hölder-Lipschitz exponent and minimum normalized entropy of squared orthogonal wavelet coefficients. These parameters are estimated within adjacent time windows of length 1 day for the seismic noise waveforms from each station. Calculating daily median values of these parameters over all stations provides a 4-dimensional time series which describes integral properties of the seismic noise in the region covered by the network. Cluster analysis is applied to the sequence of clouds of 4-dimensional vectors within a moving time window of length 365 days with a mutual shift of 3 days, starting from the beginning of 1997 up to the current time. The purpose of the cluster analysis is to find the best number of clusters (BNC) from probe numbers varying from 1 up to a maximum value of 40. The BNC is found from the maximum of the pseudo-F-statistic (PFS). A 2D map can be created which presents the dependence of the PFS on the tested probe number of clusters and on the right-hand end of the moving time window, rather similar to the usual spectral time-frequency diagrams. In the paper [1] it was shown that the BNC before the Tohoku mega-earthquake on March 11, 2011, had a strongly chaotic regime, with jumps from minimum up to maximum values, in the time interval one year before the event, and that this time interval was characterized by high PFS values. The PFS map is proposed as a method for extracting time intervals of high current seismic danger. The next dangerous time interval after the Tohoku mega-earthquake began at the end of 2012 and finished in the middle of 2013. Starting from the middle of 2015, high PFS values and a chaotic regime of BNC variations returned. This could be interpreted as an increase in the danger of the next mega-earthquake in Japan in the region of the Nankai Trough [1] in the first half of 2016. References 1. Lyubushin, A., 2013. How soon would the next mega-earthquake occur in Japan? Natural Science, 5 (8A1), 1-7. http://dx.doi.org/10.4236/ns.2013.58A1001
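A minimal sketch of the moving-window clustering step, assuming Python with scikit-learn; the Calinski-Harabasz score serves as the pseudo-F statistic, and the KMeans settings are illustrative rather than the paper's exact configuration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import calinski_harabasz_score   # the pseudo-F statistic

def best_number_of_clusters(window_vectors, max_clusters=40):
    """Return the probe number of clusters (>= 2) that maximizes the
    pseudo-F statistic for one 365-day cloud of daily 4-D noise parameters."""
    best_k, best_pfs = 2, -np.inf
    for k in range(2, max_clusters + 1):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(window_vectors)
        pfs = calinski_harabasz_score(window_vectors, labels)
        if pfs > best_pfs:
            best_k, best_pfs = k, pfs
    return best_k, best_pfs

# Moving window of 365 days shifted by 3 days over the daily 4-D series `x`:
# bnc_series = [best_number_of_clusters(x[i:i + 365]) for i in range(0, len(x) - 365 + 1, 3)]
```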
Asgari, Afrouz; Ashoor, Mansour; Sohrabpour, Mostafa; Shokrani, Parvaneh; Rezaei, Ali
2015-05-01
Improving the signal-to-noise ratio (SNR) and image quality by various methods is very important for detecting abnormalities in body organs. Scatter and attenuation of photons by the organs lead to errors in radiopharmaceutical estimation as well as degradation of images. The choice of a suitable energy window and radionuclide plays a key role in nuclear medicine; the preferred combination shows the lowest scatter fraction as well as a nearly constant linear attenuation coefficient as a function of phantom thickness. The energy windows of a symmetrical window (SW), asymmetric window (ASW), high window (WH) and low window (WL) using Tc-99m and Sm-153 radionuclides with a solid water slab phantom (RW3) and a Teflon bone phantom were compared, and Matlab software and the Monte Carlo N-Particle (MCNP4C) code were modified to simulate these methods and to obtain the FWHM and full width at tenth maximum (FWTM) from line spread functions (LSFs). The experimental data were obtained from an Orbiter Scintron gamma camera. Based on the results of the simulations as well as the experimental work, the WH and ASW windows displayed the lowest scatter fraction as well as a constant linear attenuation coefficient as a function of phantom thickness. WH and ASW were the optimal windows in nuclear medicine imaging for Tc-99m in the RW3 phantom and Sm-153 in the Teflon bone phantom. Attenuation correction was performed for the WH and ASW optimal windows and these radionuclides using a filtered back projection algorithm. The simulated and experimental results show very good agreement between experiment and simulation as well as between theoretical values and simulation data, with differences nominally less than 7.07% for Tc-99m and less than 8.00% for Sm-153. Corrected counts were not affected by the thickness of the scattering material. The simulated line spread function (LSF) results for Sm-153 and Tc-99m in phantom, based on the four windows and the TEW method, indicated that the FWHM and FWTM values were approximately the same for the TEW method, WH and ASW, but the sensitivity at the optimal window was higher than that of the other method. Suitable determination of the energy window width on the energy spectra can be useful in optimal design to improve efficiency and contrast. It is found that WH is preferred to ASW, and ASW is preferred to SW.
Local concurrent error detection and correction in data structures using virtual backpointers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, C.C.J.; Chen, P.P.; Fuchs, W.K.
1989-11-01
A new technique, based on virtual backpointers, is presented in this paper for local concurrent error detection and correction in linked data structures. Two new data structures utilizing virtual backpointers, the Virtual Double-Linked List and the B-Tree with Virtual Backpointers, are described. For these structures, double errors within a fixed-size checking window can be detected in constant time and single errors detected during forward moves can be corrected in constant time.
Two-zone elastic-plastic single shock waves in solids.
Zhakhovsky, Vasily V; Budzevich, Mikalai M; Inogamov, Nail A; Oleynik, Ivan I; White, Carter T
2011-09-23
By decoupling time and length scales in moving window molecular dynamics shock-wave simulations, a new regime of shock-wave propagation is uncovered characterized by a two-zone elastic-plastic shock-wave structure consisting of a leading elastic front followed by a plastic front, both moving with the same average speed and having a fixed net thickness that can extend to microns. The material in the elastic zone is in a metastable state that supports a pressure that can substantially exceed the critical pressure characteristic of the onset of the well-known split-elastic-plastic, two-wave propagation. The two-zone elastic-plastic wave is a general phenomenon observed in simulations of a broad class of crystalline materials and is within the reach of current experimental techniques.
Can Changes in Eye Movement Scanning Alter the Age-Related Deficit in Recognition Memory?
Chan, Jessica P. K.; Kamino, Daphne; Binns, Malcolm A.; Ryan, Jennifer D.
2011-01-01
Older adults typically exhibit poorer face recognition compared to younger adults. These recognition differences may be due to underlying age-related changes in eye movement scanning. We examined whether older adults’ recognition could be improved by yoking their eye movements to those of younger adults. Participants studied younger and older faces, under free viewing conditions (bases), through a gaze-contingent moving window (own), or a moving window which replayed the eye movements of a base participant (yoked). During the recognition test, participants freely viewed the faces with no viewing restrictions. Own-age recognition biases were observed for older adults in all viewing conditions, suggesting that this effect occurs independently of scanning. Participants in the bases condition had the highest recognition accuracy, and participants in the yoked condition were more accurate than participants in the own condition. Among yoked participants, recognition did not depend on age of the base participant. These results suggest that successful encoding for all participants requires the bottom-up contribution of peripheral information, regardless of the locus of control of the viewer. Although altering the pattern of eye movements did not increase recognition, the amount of sampling of the face during encoding predicted subsequent recognition accuracy for all participants. Increased sampling may confer some advantages for subsequent recognition, particularly for people who have declining memory abilities. PMID:21687460
Influence of gravity and light on the developmental polarity of Ceratopteris richardii fern spores
NASA Technical Reports Server (NTRS)
Edwards, E. S.; Roux, S. J.
1998-01-01
The polarity of germinating single-celled spores of the fern Ceratopteris richardii Brogn. is influenced by gravity during a time period prior to the first cellular division designated a "polarity-determination window". After this window closes, control of polarity is seen in the downward (with respect to gravity) migration of the nucleus along the proximal face of the spore and the subsequent downward growth of the primary rhizoid. When spores are germinated on a clinostat the direction of nuclear migration and subsequent primary rhizoid growth is random. However, in each case the direction of nuclear migration predicts the direction of rhizoid elongation. Although it is the most obvious movement, the downward migration is not the first movement of the nucleus. During the polarity-determination window, the nucleus moves randomly within a region centered behind the trilete marking. While the polarity of many fern spores has been reported to be controlled by light, spores of C. richardii are the first documented to have their polarity influenced by gravity. Directional white light also affects the polarity of these spores, but this influence is slight and is secondary to that of gravity.
Compositional searching of CpG islands in the human genome
NASA Astrophysics Data System (ADS)
Luque-Escamilla, Pedro Luis; Martínez-Aroza, José; Oliver, José L.; Gómez-Lopera, Juan Francisco; Román-Roldán, Ramón
2005-06-01
We report on an entropic edge detector based on the local calculation of the Jensen-Shannon divergence with application to the search for CpG islands. CpG islands are pieces of the genome related to gene expression and cell differentiation, and thus to cancer formation. Searching for these CpG islands is a major task in genetics and bioinformatics. Some algorithms have been proposed in the literature, based on moving statistics in a sliding window, but the window size may greatly influence the results. The local use of Jensen-Shannon divergence is a completely different strategy: the nucleotide composition inside the islands is different from that in their environment, so a statistical distance—the Jensen-Shannon divergence—between the composition of two adjacent windows may be used as a measure of their dissimilarity. Sliding this double window over the entire sequence allows us to segment it compositionally. The fusion of those segments into greater ones that satisfy certain identification criteria must be achieved in order to obtain the definitive results. We find that the local use of Jensen-Shannon divergence is very suitable in processing DNA sequences for searching for compositionally different structures such as CpG islands, as compared to other algorithms in the literature.
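The double-window Jensen-Shannon scan can be sketched as follows (Python/numpy); the half-window length is an illustrative choice and the segment-fusion step described above is omitted.

```python
import numpy as np

BASES = "ACGT"

def composition(seq):
    """Nucleotide frequency vector of one window."""
    counts = np.array([seq.count(b) for b in BASES], dtype=float)
    return counts / counts.sum()

def jensen_shannon(p, q):
    """Jensen-Shannon divergence (base-2) between two composition vectors."""
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return float(np.sum(a[mask] * np.log2(a[mask] / b[mask])))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def divergence_profile(seq, half_window=200):
    """Slide the double window along the sequence; peaks in this profile mark
    candidate compositional boundaries such as CpG-island edges."""
    return np.array([
        jensen_shannon(composition(seq[i - half_window:i]),
                       composition(seq[i:i + half_window]))
        for i in range(half_window, len(seq) - half_window)
    ])
```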
A Comparison of Three Methods for Measuring Distortion in Optical Windows
NASA Technical Reports Server (NTRS)
Youngquist, Robert C.; Nurge, Mark A.; Skow, Miles
2015-01-01
It's important that imagery seen through large-area windows, such as those used on space vehicles, not be substantially distorted. Many approaches are described in the literature for measuring the distortion of an optical window, but most suffer from either poor resolution or processing difficulties. In this paper a new definition of distortion is presented, allowing accurate measurement using an optical interferometer. This new definition is shown to be equivalent to the definitions provided by the military and the standards organizations. In order to determine the advantages and disadvantages of this new approach, the distortion of an acrylic window is measured using three different methods: image comparison, moiré interferometry, and phase-shifting interferometry.
NASA Technical Reports Server (NTRS)
Newell, J. D.; Keller, R. A.; Baily, N. A.
1974-01-01
A simple method for outlining or contouring any area defined by a change in film density or fluoroscopic screen intensity is described. The entire process, except for the positioning of an electronic window, is accomplished using a small computer having appropriate software. The electronic window is operator positioned over the area to be processed. The only requirement is that the window be large enough to encompass the total area to be considered.
Launch window analysis of satellites in high eccentricity or large circular orbits
NASA Technical Reports Server (NTRS)
Renard, M. L.; Bhate, S. K.; Sridharan, R.
1973-01-01
Numerical methods and computer programs for studying the stability and evolution of orbits of large eccentricity are presented. Methods for determining launch windows and target dates are developed. Mathematical models are prepared to analyze the characteristics of specific missions.
Non-intrusive parameter identification procedure user's guide
NASA Technical Reports Server (NTRS)
Hanson, G. D.; Jewell, W. F.
1983-01-01
Written in standard FORTRAN, NAS is capable of identifying linear as well as nonlinear relations between input and output parameters; the only restriction is that the input/output relation be linear with respect to the unknown coefficients of the estimation equations. The output of the identification algorithm can be specified to be in either the time domain (i.e., the estimation equation coefficients) or in the frequency domain (i.e., a frequency response of the estimation equation). The frame length ("window") over which the identification procedure is to take place can be specified to be any portion of the input time history, thereby allowing the freedom to start and stop the identification procedure within a time history. There also is an option which allows a sliding window, which gives a moving average over the time history. The NAS software also includes the ability to identify several assumed solutions simultaneously for the same or different input data.
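The sliding-window option can be illustrated with a generic moving-window least squares fit (a Python/numpy sketch; NAS itself is a FORTRAN package and its input conventions are not reproduced here).

```python
import numpy as np

def sliding_window_least_squares(X, y, window, step=1):
    """Re-identify the estimation-equation coefficients over a sliding frame.
    X holds the regressors built from the input/output time histories and y
    the modelled quantity; the relation must be linear in the unknown
    coefficients, as in the procedure described above.  Returns one
    coefficient vector per window position (a moving average of the model)."""
    estimates = []
    for start in range(0, len(y) - window + 1, step):
        coeffs, *_ = np.linalg.lstsq(X[start:start + window],
                                     y[start:start + window], rcond=None)
        estimates.append(coeffs)
    return np.array(estimates)
```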
Information transfer across intra/inter-structure of CDS and stock markets
NASA Astrophysics Data System (ADS)
Lim, Kyuseong; Kim, Sehyun; Kim, Soo Yong
2017-11-01
We investigate the information flow between industrial sectors in credit default swap and stock markets in the United States based on transfer entropy. Both markets have been studied with respect to dynamics and relations. Our approach considers the intra-structure of each financial market as well as the inter-structure between the two markets through a moving window in order to scan a period from 2005 to 2012. We examine the information transfer for different k, in particular k = 3, k = 5 and k = 7. The analysis indicates that the cases k = 3 and k = 7 show opposite trends but similar characteristics. Changes in transfer entropy for the intra-structure of the CDS market precede those of the stock market over the entire set of time windows. Abrupt rises and falls in the inter-structural information transfer between the two markets are detected in periods related to the financial crises, which can be considered as early warnings.
Karst database development in Minnesota: Design and data assembly
Gao, Y.; Alexander, E.C.; Tipping, R.G.
2005-01-01
The Karst Feature Database (KFD) of Minnesota is a relational GIS-based Database Management System (DBMS). Previous karst feature datasets used inconsistent attributes to describe karst features in different areas of Minnesota. Existing metadata were modified and standardized to represent a comprehensive metadata for all the karst features in Minnesota. Microsoft Access 2000 and ArcView 3.2 were used to develop this working database. Existing county and sub-county karst feature datasets have been assembled into the KFD, which is capable of visualizing and analyzing the entire data set. By November 17, 2002, 11,682 karst features were stored in the KFD of Minnesota. Data tables are stored in a Microsoft Access 2000 DBMS and linked to corresponding ArcView applications. The current KFD of Minnesota has been moved from a Windows NT server to a Windows 2000 Citrix server accessible to researchers and planners through networked interfaces. © Springer-Verlag 2005.
Systemic risk and hierarchical transitions of financial networks
NASA Astrophysics Data System (ADS)
Nobi, Ashadun; Lee, Jae Woo
2017-06-01
In this paper, the change in topological hierarchy, which is measured by the minimum spanning tree constructed from the cross-correlations between the stock indices from the S & P 500 for 1998-2012 in a one year moving time window, was used to analyze a financial crisis. The hierarchy increased in all minor crises in the observation time window except for the sharp crisis of 2007-2008 when the global financial crisis occurred. The sudden increase in hierarchy just before the global financial crisis can be used for the early detection of an upcoming crisis. Clearly, the higher the hierarchy, the higher the threats to financial stability. The scaling relations were developed to observe the changes in hierarchy with the network topology. These scaling relations can also identify and quantify the financial crisis periods, and appear to contain the predictive power of an upcoming crisis.
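A rough sketch of the window-by-window MST construction (Python with numpy/scipy); the hierarchy proxy returned here, the mean hop distance from the most-connected node of the tree, is an illustrative stand-in rather than the paper's exact hierarchy measure.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, shortest_path

def mst_hierarchy(returns):
    """Build the MST from cross-correlations of index returns within one
    moving window and return a simple hierarchy proxy: the mean hop distance
    from the most-connected node of the tree."""
    corr = np.corrcoef(returns.T)                        # indices in columns
    dist = np.sqrt(np.clip(2.0 * (1.0 - corr), 0.0, None))
    mst = minimum_spanning_tree(dist)
    adj = ((mst + mst.T) > 0).astype(int)                # undirected tree
    hub = int(np.argmax(np.asarray(adj.sum(axis=1)).ravel()))
    hops = shortest_path(adj, unweighted=True, indices=hub)
    return float(hops.mean())

# One-year moving window (length w days, step s) over the daily return matrix R:
# hierarchy = [mst_hierarchy(R[i:i + w]) for i in range(0, len(R) - w + 1, s)]
```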
A view of metals through the terahertz window
NASA Astrophysics Data System (ADS)
Dodge, Steve
2006-05-01
As electrons move through a metal, interaction with their environment tends to slow them down, causing the Drude peak in the optical conductivity to become narrower. The resulting peak width is typically in the terahertz frequency range that sits between microwaves and the far infrared, too fast for conventional electronics and too slow for conventional infrared spectroscopy. With femtosecond laser techniques, however, coherent, broadband terahertz radiation can now be generated and detected with exquisite sensitivity, providing a new window onto electronic interactions in metals. I will discuss the application of this technique to a variety of metallic systems, including elemental lead, the nearly magnetic oxide metal CaRuO3, and CrV alloys that span the quantum phase transition from spin-density wave to paramagnetic metal. M. A. Gilmore, S. Kamal, D. M. Broun, and J. S. Dodge, Appl. Phys. Lett. 88, 141910 (2006).
NASA Astrophysics Data System (ADS)
Wang, Xiaohua; Rong, Mingzhe; Qiu, Juan; Liu, Dingxin; Su, Biao; Wu, Yi
A new type of algorithm for predicting the mechanical faults of a vacuum circuit breaker (VCB) based on an artificial neural network (ANN) is proposed in this paper. There are two types of mechanical faults in a VCB: operation mechanism faults and tripping circuit faults. An angle displacement sensor is used to measure the main axle angle displacement which reflects the displacement of the moving contact, to obtain the state of the operation mechanism in the VCB, while a Hall current sensor is used to measure the trip coil current, which reflects the operation state of the tripping circuit. Then an ANN prediction algorithm based on a sliding time window is proposed in this paper and successfully used to predict mechanical faults in a VCB. The research results in this paper provide a theoretical basis for the realization of online monitoring and fault diagnosis of a VCB.
Veldre, Aaron; Andrews, Sally
2014-01-01
Two experiments used the gaze-contingent moving-window paradigm to investigate whether reading comprehension and spelling ability modulate the perceptual span of skilled adult readers during sentence reading. Highly proficient reading and spelling were both associated with increased use of information to the right of fixation, but did not systematically modulate the extraction of information to the left of fixation. Individuals who were high in both reading and spelling ability showed the greatest benefit from window sizes larger than 11 characters, primarily because of increases in forward saccade length. They were also significantly more disrupted by being denied close parafoveal information than those poor in reading and/or spelling. These results suggest that, in addition to supporting rapid lexical retrieval of fixated words, the high quality lexical representations indexed by the combination of high reading and spelling ability support efficient processing of parafoveal information and effective saccadic targeting.
The X-windows interactive navigation data editor
NASA Technical Reports Server (NTRS)
Rinker, G. C.
1992-01-01
A new computer program called the X-Windows Interactive Data Editor (XIDE) was developed and demonstrated as a prototype application for editing radio metric data in the orbit-determination process. The program runs on a variety of workstations and employs pull-down menus and graphical displays, which allow users to easily inspect and edit radio metric data in the orbit data files received from the Deep Space Network (DSN). The XIDE program is based on the Open Software Foundation OSF/Motif Graphical User Interface (GUI) and has proven to be an efficient tool for editing radio metric data in the navigation operations environment. It was adopted by the Magellan Navigation Team as their primary data-editing tool. Because the software was designed from the beginning to be portable, the prototype was successfully moved to new workstation environments. It was also integrated into the design of the next-generation software tool for DSN multimission navigation interactive launch support.
Reading direction and the central perceptual span in Urdu and English.
Paterson, Kevin B; McGowan, Victoria A; White, Sarah J; Malik, Sameen; Abedipour, Lily; Jordan, Timothy R
2014-01-01
Normal reading relies on the reader making a series of saccadic eye movements along lines of text, separated by brief fixational pauses during which visual information is acquired from a region of text. In English and other alphabetic languages read from left to right, the region from which useful information is acquired during each fixational pause is generally reported to extend further to the right of each fixation than to the left. However, the asymmetry of the perceptual span for alphabetic languages read in the opposite direction (i.e., from right to left) has received much less attention. Accordingly, in order to more fully investigate the asymmetry in the perceptual span for these languages, the present research assessed the influence of reading direction on the perceptual span for bilingual readers of Urdu and English. Text in Urdu and English was presented either entirely as normal or in a gaze-contingent moving-window paradigm in which a region of text was displayed as normal at the reader's point of fixation and text outside this region was obscured. The windows of normal text extended symmetrically 0.5° of visual angle to the left and right of fixation, or asymmetrically by increasing the size of each window to 1.5° or 2.5° to either the left or right of fixation. When participants read English, performance for the window conditions was superior when windows extended to the right. However, when reading Urdu, performance was superior when windows extended to the left, and was essentially the reverse of that observed for English. These findings provide a novel indication that the perceptual span is modified by the language being read to produce an asymmetry in the direction of reading and show for the first time that such an asymmetry occurs for reading Urdu.
Old practices, new windows: reflections on a communications skills innovation.
Cantillon, Peter
2017-03-01
Most of the great innovations in communication skills education, from Balint's concept of the 'doctor as drug' to the Calgary Cambridge conceptualisation of the consultation, were founded in general practice. It can be argued however, that there has been a hiatus in the development of new approaches to analysing the consultation since the mid-1990s. It is most welcome therefore that in this issue of the journal two papers are presented that describe and evaluate a novel approach to consultation analysis entitled 'the windows method'. Building on the more structured approaches that preceded it, the windows method offers some genuine innovations in terms of its emphasis on emotional knowledge and the manner in which it addresses many of the potential deficiencies in feedback practice associated with older methods. The new approach is very much in step with current thinking about emotional development and the establishment of appropriate environments for feedback. The windows method has the potential to breathe fresh life into old and well-established communication skills education practices.
Fisher zeros and conformality in lattice models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meurice, Yannick; Bazavov, Alexei; Berg, Bernd A.
2012-10-01
Fisher zeros are the zeros of the partition function in the complex beta=2N_c/g^2 plane. When they pinch the real axis, finite size scaling allows one to distinguish between first and second order transition and to estimate exponents. On the other hand, a gap signals confinement and the method can be used to explore the boundary of the conformal window. We present recent numerical results for 2D O(N) sigma models, 4D U(1) and SU(2) pure gauge and SU(3) gauge theory with N_f=4 and 12 flavors. We discuss attempts to understand some of these results using analytical methods. We discuss the 2-lattice matching and qualitative aspects of the renormalization group (RG) flows in the Migdal-Kadanoff approximation, in particular how RG flows starting at large beta seem to move around regions where bulk transitions occur. We consider the effects of the boundary conditions on the nonperturbative part of the average energy and on the Fisher zeros for the 1D O(2) model.
Non-parametric and least squares Langley plot methods
NASA Astrophysics Data System (ADS)
Kiedron, P. W.; Michalsky, J. J.
2016-01-01
Langley plots are used to calibrate sun radiometers primarily for the measurement of the aerosol component of the atmosphere that attenuates (scatters and absorbs) incoming direct solar radiation. In principle, the calibration of a sun radiometer is a straightforward application of the Bouguer-Lambert-Beer law V = V0 exp(-τ·m), where a plot of voltage ln(V) vs. air mass m yields a straight line with intercept ln(V0). This ln(V0) subsequently can be used to solve for τ for any measurement of V and calculation of m. This calibration works well on some high mountain sites, but the application of the Langley plot calibration technique is more complicated at other, more interesting, locales. This paper is concerned with ferreting out calibrations at difficult sites and examining and comparing a number of conventional and non-conventional methods for obtaining successful Langley plots. The 11 techniques discussed indicate that both least squares and various non-parametric techniques produce satisfactory calibrations with no significant differences among them when the time series of ln(V0)'s are smoothed and interpolated with median and mean moving window filters.
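The baseline least squares variant can be sketched in a few lines (Python/numpy); the robust and non-parametric alternatives discussed above would replace the plain line fit.

```python
import numpy as np

def langley_fit(voltage, airmass):
    """Ordinary least squares Langley fit: regress ln(V) on air mass m and
    return the extrapolated top-of-atmosphere signal V0 and the optical
    depth tau, using ln(V) = ln(V0) - tau*m."""
    slope, intercept = np.polyfit(airmass, np.log(voltage), 1)
    return np.exp(intercept), -slope        # V0, tau

# Once V0 is calibrated, any later measurement V at air mass m yields
# tau = (np.log(V0) - np.log(V)) / m
```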
Measuring Glial Metabolism in Repetitive Brain Trauma and Alzheimer’s Disease
2016-09-01
Six denoising methods for dynamic MRS were considered and compared: singular value decomposition (SVD), wavelet, sliding window, sliding window with Gaussian weighting, spline, and spectral improvement.
Simple automatic strategy for background drift correction in chromatographic data analysis.
Fu, Hai-Yan; Li, He-Dong; Yu, Yong-Jie; Wang, Bing; Lu, Peng; Cui, Hua-Peng; Liu, Ping-Ping; She, Yuan-Bin
2016-06-03
Chromatographic background drift correction, which influences peak detection and time shift alignment results, is a critical stage in chromatographic data analysis. In this study, an automatic background drift correction methodology was developed. Local minimum values in a chromatogram were initially detected and organized as a new baseline vector. Iterative optimization was then employed to recognize outliers, which belong to the chromatographic peaks, in this vector, and update the outliers in the baseline until convergence. The optimized baseline vector was finally expanded into the original chromatogram, and linear interpolation was employed to estimate background drift in the chromatogram. The principle underlying the proposed method was confirmed using a complex gas chromatographic dataset. Finally, the proposed approach was applied to eliminate background drift in liquid chromatography quadrupole time-of-flight samples used in the metabolic study of Escherichia coli samples. The proposed method was comparable with three classical techniques: morphological weighted penalized least squares, moving window minimum value strategy and background drift correction by orthogonal subspace projection. The proposed method allows almost automatic implementation of background drift correction, which is convenient for practical use. Copyright © 2016 Elsevier B.V. All rights reserved.
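A minimal sketch of the local-minimum baseline idea described above (Python with numpy/scipy); the smoothing span, outlier threshold and iteration count are illustrative, not the values used in the paper.

```python
import numpy as np
from scipy.signal import argrelextrema

def background_drift(chromatogram, n_iter=50):
    """Estimate background drift: collect local minima as a baseline vector,
    iteratively replace points that sit on peaks (outliers above a running
    estimate), then linearly interpolate back onto the full time axis."""
    y = np.asarray(chromatogram, dtype=float)
    idx = argrelextrema(y, np.less_equal, order=3)[0]    # local minima
    base = y[idx].copy()
    for _ in range(n_iter):
        smooth = np.convolve(base, np.ones(5) / 5.0, mode="same")
        resid = base - smooth
        outliers = resid > 2.0 * resid.std()             # points belonging to peaks
        if not outliers.any():
            break                                        # converged
        base[outliers] = smooth[outliers]
    return np.interp(np.arange(len(y)), idx, base)

# corrected = chromatogram - background_drift(chromatogram)
```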
Investigation of Crustal Thickness in Eastern Anatolia Using Gravity, Magnetic and Topographic Data
NASA Astrophysics Data System (ADS)
Pamukçu, Oya Ankaya; Akçığ, Zafer; Demirbaş, Şevket; Zor, Ekrem
2007-12-01
The tectonic regime of Eastern Anatolia is determined by the Arabia-Eurasia continent-continent collision. Several dynamic models have been proposed to characterize the collision zone and its geodynamic structure. In this study, change in crustal thickness has been investigated using gravity, magnetic and topographic data of the region. In the first stage, two-dimensional low-pass filter and upward analytical continuation techniques were applied to the Bouguer gravity data of the region to investigate the behavior of the regional gravity anomalies. Next the moving window power spectrum method was used, and changes in the probable structural depths from 38 to 52 km were determined. The changes in crustal thickness where free-air gravity and magnetic data are inversely correlated, and the types of the anomaly sources, were investigated by applying the Euler deconvolution method to the Bouguer gravity data. The obtained depth values are consistent with the results obtained using the power spectrum method. It was determined that the types of anomaly sources are different to the west and east of the 40° E longitude. Finally, using the findings from this study and seismic velocity models proposed for this region by previous studies, a probable two-dimensional crust model was constituted.
NASA Astrophysics Data System (ADS)
Salinas Solé, Celia; Peña Angulo, Dhais; Gonzalez Hidalgo, Jose Carlos; Brunetti, Michele
2017-04-01
In this poster we apply the moving window approach (see Poster I of this collection) to analyze trends in spring and in its corresponding months (March, April, May) of mean maximum (Tmax) and minimum (Tmin) temperatures over mainland Spain, in order to detect the effects of period length and starting year. The monthly series belong to the Monthly Temperature dataset of mainland Spain (MOTEDAS). In its grid format the database contains 5236 pixels of monthly series (10x10 km). The threshold used in the spatial analyses considers 20% of the land area under a significant trend (p<0.05). The most striking results are as follows: • Seasonal Tmax shows a positive and significant global trend until the mid 80's, with more than 75% of the area affected for windows from 1954-2010 to 1979-2010, after which the signal is reduced to the northern region; from 1985-2010 onwards no significant trend is detected. The monthly analyses differ: the March trend has not been significant (<20% of the area) since 1974-2010, while significant trends in April and May persist between 1961-2010/1979-2010 and 1965-2010/1980-2010 respectively, clearly located in the northern midland and the Mediterranean coastland. • The spring Tmin trend is significant (>20%) in all temporal windows, although the NW does not show a significant global trend, and in the most recent temporal windows only the SE is significantly affected. The monthly analyses also differ: no significant trend is detected in March from 1979-2010, or in May from 1985-2010, April being the only month with more than 20% of the land affected by a significant trend in every temporal window. • Spatial differences are detected between windows (South-North in March, East-West in April-May). We conclude that the spring Tmax trend varies dramatically with the temporal window and no significance has been detected in recent decades; the northern areas and the Mediterranean coastland seem to be the most affected. The monthly Tmax trend spatial analyses confirm the heterogeneity of diurnal temperatures; different spatial gradients between months have been detected across windows. Seasonal Tmin shows a more global temporal pattern, although spatial gradients of significance between months have been detected, in some sense contrary to those observed for Tmax.
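The window-by-window significance screening can be sketched generically as below (Python with scipy); the function and the threshold handling are illustrative and do not reproduce MOTEDAS-specific processing.

```python
import numpy as np
from scipy import stats

def significant_trend_fraction(temps, years, start_years, end_year=2010, alpha=0.05):
    """For each temporal window (start_year..end_year) return the fraction of
    grid cells whose linear temperature trend is significant at level alpha;
    values above 0.20 correspond to the 20%-of-land criterion used above.
    temps has shape (n_cells, n_years)."""
    fractions = {}
    for start in start_years:
        sel = (years >= start) & (years <= end_year)
        n_sig = sum(stats.linregress(years[sel], cell[sel]).pvalue < alpha
                    for cell in temps)
        fractions[f"{start}-{end_year}"] = n_sig / len(temps)
    return fractions

# example: significant_trend_fraction(tmax_grid, np.arange(1951, 2011), range(1951, 1991))
```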
Sliding window prior data assisted compressed sensing for MRI tracking of lung tumors.
Yip, Eugene; Yun, Jihyun; Wachowicz, Keith; Gabos, Zsolt; Rathee, Satyapal; Fallone, B G
2017-01-01
Hybrid magnetic resonance imaging and radiation therapy devices are capable of imaging in real-time to track intrafractional lung tumor motion during radiotherapy. Highly accelerated magnetic resonance (MR) imaging methods can potentially reduce system delay time and/or improve imaging spatial resolution, and provide flexibility in imaging parameters. Prior Data Assisted Compressed Sensing (PDACS) has previously been proposed as an acceleration method that combines the advantages of 2D compressed sensing and the KEYHOLE view-sharing technique. However, as PDACS relies on prior data acquired at the beginning of a dynamic imaging sequence, decline in image quality occurs for longer duration scans due to drifts in MR signal. Novel sliding window-based techniques for refreshing prior data are proposed as a solution to this problem. MR acceleration is performed by retrospective removal of data from the fully sampled sets. Six patients with lung tumors are scanned with a clinical 3 T MRI using a balanced steady-state free precession (bSSFP) sequence for 3 min at approximately 4 frames per second, for a total of 650 dynamics. A series of distinct pseudo-random patterns of partial k-space acquisition is generated such that, when combined with other dynamics within a sliding window of 100 dynamics, the entire k-space is covered. The prior data in the sliding window are continuously refreshed to reduce the impact of MR signal drifts. We demonstrate two different ways to utilize the sliding window data: a simple averaging method and a navigator-based method. These two sliding window methods are quantitatively compared against the original PDACS method using three metrics: artifact power, centroid displacement error, and Dice's coefficient. The study is repeated with pseudo 0.5 T images by adding complex, normally distributed noise with a standard deviation that reduces image SNR, relative to original 3 T images, by a factor of 6. Without a sliding window implemented, PDACS-reconstructed dynamic datasets showed progressive increases in image artifact power as the 3 min scan progresses. With sliding windows implemented, this increase in artifact power is eliminated. Near the end of a 3 min scan at 3 T SNR and 5× acceleration, implementation of an averaging (navigator) sliding window method improves our metrics in the following ways: artifact power decreases from 0.065 without sliding window to 0.030 (0.031), centroid error decreases from 2.64 to 1.41 mm (1.28 mm), and Dice coefficient agreement increases from 0.860 to 0.912 (0.915). At pseudo 0.5 T SNR, the improvements in metrics are as follows: artifact power decreases from 0.110 without sliding window to 0.0897 (0.0985), centroid error decreases from 2.92 mm to 1.36 mm (1.32 mm), and Dice coefficient agreement increases from 0.851 to 0.894 (0.896). In this work we demonstrated the negative impact of slow changes in MR signal for longer duration PDACS dynamic scans, namely increases in image artifact power and reductions of tumor tracking accuracy. We have also demonstrated that sliding window implementations (i.e., refreshing of prior data) of PDACS are effective solutions to this problem for both 3 T and simulated 0.5 T bSSFP images. © 2016 American Association of Physicists in Medicine.
NASA Astrophysics Data System (ADS)
Yang, A.; Yongtao, F.
2016-12-01
The effective elastic thickness (Te) is an important parameter that characterizes the long-term strength of the lithosphere, and it has great significance for understanding the mechanical properties and evolution of the lithosphere. In contrast with the many controversies regarding the elastic thickness of continental lithosphere, the Te of oceanic lithosphere has been thought to depend simply on the age of the plate. However, recent studies show that there is no simple relationship between Te and age at the time of loading for either seamounts or subduction zones. Subsurface loading is very important and has a large influence on the estimate of Te for continental lithosphere, and many oceanic features such as subduction zones also carry considerable subsurface loading. We introduce a method to estimate the effective elastic thickness of oceanic lithosphere using a model that includes surface and subsurface loads, based on free-air gravity anomaly and bathymetric data together with a moving window admittance technique (MWAT). We use the multitaper spectral estimation method to calculate the power spectral density. Tests with synthetic subduction-zone-like bathymetry and gravity data show that Te can be recovered with an accuracy similar to that in the continents, and that there is also a trade-off between spatial resolution and variance for different window sizes. We estimate Te of many subduction zones (Peru-Chile trench, Middle America trench, Caribbean trench, Kuril-Japan trench, Mariana trench, Tonga trench, Java trench, Ryukyu-Philippine trench) with an age range of 0-160 Myr to reassess the relationship between elastic thickness and the age of the lithosphere at the time of loading. The results do not show a simple relationship between Te and age.
A staggered-grid convolutional differentiator for elastic wave modelling
NASA Astrophysics Data System (ADS)
Sun, Weijia; Zhou, Binzhong; Fu, Li-Yun
2015-11-01
The computation of derivatives in governing partial differential equations is one of the most investigated subjects in the numerical simulation of physical wave propagation. An analytical staggered-grid convolutional differentiator (CD) for first-order velocity-stress elastic wave equations is derived in this paper by inverse Fourier transformation of the band-limited spectrum of a first derivative operator. A taper window function is used to truncate the infinite staggered-grid CD stencil. The truncated CD operator is almost as accurate as the analytical solution, and as efficient as the finite-difference (FD) method. The selection of window functions will influence the accuracy of the CD operator in wave simulation. We search for the optimal Gaussian windows for different order CDs by minimizing the spectral error of the derivative and comparing the windows with the normal Hanning window function for tapering the CD operators. It is found that the optimal Gaussian window appears to be similar to the Hanning window function for tapering the same CD operator. We investigate the accuracy of the windowed CD operator and the staggered-grid FD method with different orders. Compared to the conventional staggered-grid FD method, a short staggered-grid CD operator achieves an accuracy equivalent to that of a long FD operator, with lower computational costs. For example, an 8th order staggered-grid CD operator can achieve the same accuracy of a 16th order staggered-grid FD algorithm but with half of the computational resources and time required. Numerical examples from a homogeneous model and a crustal waveguide model are used to illustrate the superiority of the CD operators over the conventional staggered-grid FD operators for the simulation of wave propagations.
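A sketch of the construction (Python/numpy): the band-limited spectrum of the first-derivative operator with a half-cell phase shift is inverse Fourier transformed and then truncated with a Gaussian taper; the stencil half-length and taper width below are illustrative choices, not the optimized values discussed above.

```python
import numpy as np

def staggered_cd(half_length=8, dx=1.0, sigma=3.0, n=512):
    """Truncated staggered-grid convolutional differentiator: inverse Fourier
    transform the band-limited spectrum i*k*exp(i*k*dx/2) of the first
    derivative (the half-cell phase shift centres the stencil between grid
    points), then taper the taps with a Gaussian window."""
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)
    ideal = np.real(np.fft.ifft(1j * k * np.exp(1j * k * dx / 2.0)))
    lags = np.arange(-half_length, half_length)          # taps straddling the half-point
    taper = np.exp(-0.5 * ((lags + 0.5) / sigma) ** 2)   # Gaussian taper window
    return lags, ideal[lags % n] * taper

def apply_cd(field, lags, coeff):
    """Circular convolution: output[i] approximates d(field)/dx at x_i + dx/2."""
    return sum(c * np.roll(field, m) for m, c in zip(lags, coeff))

# check on a periodic sine: x = 2*np.pi*np.arange(256)/256; f = np.sin(x)
# lags, c = staggered_cd(dx=x[1]-x[0]); apply_cd(f, lags, c) is close to np.cos(x + (x[1]-x[0])/2)
```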
Code of Federal Regulations, 2010 CFR
2010-10-01
... certification pursuant to § 0.459 of this chapter. (b) Initial certification window. Following the effective... window for digital output protection technologies or recording methods. Within thirty (30) days after the... certification window, the Commission shall issue a public notice identifying the certifications received and...
Settling into an increasingly hostile world: the rapidly closing "recruitment window" for corals.
Arnold, Suzanne N; Steneck, Robert S
2011-01-01
Free space is necessary for larval recruitment in all marine benthic communities. Settling corals, with limited energy to invest in competitive interactions, are particularly vulnerable during settlement into well-developed coral reef communities. This situation may be exacerbated for corals settling into coral-depauperate reefs where succession in nursery microhabitats moves rapidly toward heterotrophic organisms inhospitable to settling corals. To study effects of benthic organisms (at millimeter to centimeter scales) on newly settled corals and their survivorship we deployed terra-cotta coral settlement plates at 10 m depth on the Mesoamerican Barrier Reef in Belize and monitored them for 38 mo. During the second and third years, annual recruitment rates declined by over 50% from the previous year. Invertebrate crusts (primarily sponges) were absent at the start of the experiment but increased in abundance annually from 39, 60, to 73% of the plate undersides by year three. Subsequently, substrates hospitable to coral recruitment, including crustose coralline algae, biofilmed terra-cotta and polychaete tubes, declined. With succession, substrates upon which spat settled shifted toward organisms inimical to survivorship. Over 50% of spat mortality was due to overgrowth by sponges alone. This result suggests that when a disturbance creates primary substrate a "recruitment window" for settling corals exists from approximately 9 to 14 mo following the disturbance. During the window, early-succession, facilitating species are most abundant. The window closes as organisms hostile to coral settlement and survivorship overgrow nursery microhabitats.
Clipping polygon faces through a polyhedron of vision
NASA Technical Reports Server (NTRS)
Florence, Judit K. (Inventor); Rohner, Michel A. (Inventor)
1980-01-01
A flight simulator combines flight data and polygon face terrain data to provide a CRT display at each window of the simulated aircraft. The data base specifies the relative position of each vertex of each polygon face therein. Only those terrain faces currently appearing within the pyramid of vision defined by the pilot's eye and the edges of the pilot's window need be displayed at any given time. As the orientation of the pyramid of vision changes in response to flight data, the displayed faces are correspondingly displaced, eventually moving out of the pyramid of vision. Faces which are currently not visible (outside the pyramid of vision) are clipped from the data flow. In addition, faces which are only partially outside of the pyramid of vision are reconstructed to eliminate the outside portion. Window coordinates are generated defining the distance between each vertex and each of the boundary planes forming the pyramid of vision. The sign bit of each window coordinate indicates whether the vertex is on the pyramid of vision side of the associated boundary plane (positive), or on the other side thereof (negative). The set of sign bits accompanying each vertex constitutes the outcode of that vertex. The outcodes (O.C.) are systematically processed and examined to determine which faces are completely inside the pyramid of vision (Case A--all signs positive), which faces are completely outside (Case C--all signs negative) and which faces must be reconstructed (Case B--both positive and negative signs).
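The outcode computation can be sketched as follows (Python/numpy); the plane representation and sign convention are assumptions for illustration, not the patent's exact data layout.

```python
import numpy as np

def outcodes(vertices, planes):
    """Sign bits of the window coordinates: one bit per boundary plane of the
    pyramid of vision and per vertex (1 = vertex on the visible side).
    vertices is (n_vertices, 3); planes is (n_planes, 4) rows [a, b, c, d]
    with a*x + b*y + c*z + d >= 0 taken as the visible side."""
    homo = np.hstack([vertices, np.ones((len(vertices), 1))])   # homogeneous coordinates
    window_coords = homo @ planes.T                             # signed distances
    return (window_coords >= 0.0).astype(int)

def classify_face(codes):
    """Case A: every sign positive (fully inside); Case C: every sign negative
    (fully outside, as described above); Case B: mixed signs, reconstruct."""
    if codes.all():
        return "A"
    if not codes.any():
        return "C"
    return "B"
```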
Estimating Characteristics of a Maneuvering Reentry Vehicle Observed by Multiple Sensors
2010-03-01
instead of as one large data set. This method allowed the filter to respond to changing dynamics. Jackson and Farbman’s approach could be of...portion of the entire acceleration was due to drag. Lee and Liu adopted a more hybrid approach , combining a least squares and Kalman filters [9...grows again as the window approaches the end of the available data. Three values for minimum window size, window size, and maximum window size are
Sound transmission loss of windows on high speed trains
NASA Astrophysics Data System (ADS)
Zhang, Yumei; Xiao, Xinbiao; Thompson, David; Squicciarini, Giacomo; Wen, Zefeng; Li, Zhihui; Wu, Yue
2016-09-01
The window is one of the main components of the high speed train car body structure through which noise can be transmitted. To study the windows’ acoustic properties, the vibration of one window of a high speed train has been measured for a running speed of 250 km/h. The corresponding interior noise and the noise in the wheel-rail area have been measured simultaneously. The experimental results show that the window vibration velocity has a similar spectral shape to the interior noise. Interior noise source identification further indicates that the window makes a contribution to the interior noise. Improvement of the window's Sound Transmission Loss (STL) can reduce the interior noise from this transmission path. An STL model of the window is built based on wave propagation and modal superposition methods. From the theoretical results, the window's STL property is studied and several factors affecting it are investigated, which provide indications for future low noise design of high speed train windows.
Fully automatic time-window selection using machine learning for global adjoint tomography
NASA Astrophysics Data System (ADS)
Chen, Y.; Hill, J.; Lei, W.; Lefebvre, M. P.; Bozdag, E.; Komatitsch, D.; Tromp, J.
2017-12-01
Selecting time windows from seismograms such that the synthetic measurements (from simulations) and measured observations are sufficiently close is indispensable in a global adjoint tomography framework. The increasing amount of seismic data collected every day around the world demands "intelligent" algorithms for seismic window selection. While the traditional FLEXWIN algorithm can be "automatic" to some extent, it still requires both human input and human knowledge or experience, and thus is not deemed to be fully automatic. The goal of intelligent window selection is to automatically select windows based on a learnt engine that is built upon a huge number of existing windows generated through the adjoint tomography project. We have formulated the automatic window selection problem as a classification problem. All possible misfit calculation windows are classified as either usable or unusable. Given a large number of windows with a known selection mode (select or not select), we train a neural network to predict the selection mode of an arbitrary input window. Currently, the five features we extract from the windows are the cross-correlation value, cross-correlation time lag, amplitude ratio between observed and synthetic data, window length, and minimum STA/LTA value. More features can be included in the future. We use these features to characterize each window for training a multilayer perceptron neural network (MPNN). Training the MPNN is equivalent to solving a non-linear optimization problem. We use backward propagation to derive the gradient of the loss function with respect to the weighting matrices and bias vectors and use the mini-batch stochastic gradient method to iteratively optimize the MPNN. Numerical tests show that with a careful selection of the training data and a sufficient amount of training data, we are able to train a robust neural network that is capable of detecting the waveforms in arbitrary earthquake data with negligible detection error compared to existing selection methods (e.g. FLEXWIN). We will introduce in detail the mathematical formulation of the window-selection-oriented MPNN and show very encouraging results when applying the new algorithm to real earthquake data.
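A minimal sketch of the classification setup (Python with scikit-learn); the network architecture and training settings are illustrative, not the study's exact configuration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Feature order assumed below: cross-correlation value, cross-correlation time
# lag, observed/synthetic amplitude ratio, window length, minimum STA/LTA.
def train_window_selector(features, labels):
    """Train a multilayer perceptron on previously reviewed windows
    (labels: 1 = select, 0 = reject)."""
    clf = MLPClassifier(hidden_layer_sizes=(32, 16), activation="relu",
                        solver="adam", batch_size=64, max_iter=500,
                        random_state=0)
    return clf.fit(features, labels)

# selector = train_window_selector(X_train, y_train)     # X_train: (n_windows, 5)
# keep = selector.predict(X_candidate) == 1              # windows passed to the misfit stage
```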
Background subtraction for fluorescence EXAFS data of a very dilute dopant Z in Z + 1 host.
Medling, Scott; Bridges, Frank
2011-07-01
When conducting EXAFS at the Cu K-edge for ZnS:Cu with very low Cu concentration (<0.04% Cu), a large background was present that increased with energy. This background arises from a Zn X-ray Raman peak, which moves through the Cu fluorescence window, plus the tail of the Zn fluorescence peak. This large background distorts the EXAFS and must be removed separately before reducing the data. A simple means to remove this background is described.
Time-marching multi-grid seismic tomography
NASA Astrophysics Data System (ADS)
Tong, P.; Yang, D.; Liu, Q.
2016-12-01
From classic ray-based traveltime tomography to state-of-the-art full waveform inversion, the nonlinearity of seismic inverse problems makes a good starting model essential for preventing the objective function from converging toward local minima. With a focus on building high-accuracy starting models, we propose the so-called time-marching multi-grid seismic tomography method in this study. The new seismic tomography scheme consists of a temporal time-marching approach and a spatial multi-grid strategy. We first divide the recording period of seismic data into a series of time windows. Sequentially, the subsurface properties in each time window are iteratively updated starting from the final model of the previous time window. There are at least two advantages of the time-marching approach: (1) the information included in the seismic data of previous time windows is exploited to build the starting models of later time windows; (2) seismic data of later time windows can provide extra information to refine the subsurface images. Within each time window, we use a multi-grid method to decompose the scale of the inverse problem. Specifically, the unknowns of the inverse problem are sampled on a coarse mesh to capture the macro-scale structure of the subsurface at the beginning. Because of the low dimensionality, it is much easier to reach the global minimum on a coarse mesh. After that, finer meshes are introduced to recover the micro-scale properties. That is to say, the subsurface model is iteratively updated on multiple grids in every time window. We expect that high-accuracy starting models should be generated for the second and later time windows. We will test this time-marching multi-grid method by using our newly developed eikonal-based traveltime tomography software package tomoQuake. Real application results in the 2016 Kumamoto earthquake (Mw 7.0) region in Japan will be demonstrated.
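The nested loop implied by the abstract, time windows on the outside and coarse-to-fine grids inside, with each window warm-started from the previous one, can be sketched schematically as below; invert() and the data arrays are trivial placeholders, not tomoQuake or a real traveltime solver.

```python
# Schematic of the time-marching multi-grid loop; invert() and the data are
# placeholders standing in for an actual tomographic solver.
import numpy as np

def invert(model, window_data, n_cells, n_iter=5):
    """Placeholder for one tomographic inversion on a mesh with n_cells cells."""
    # A real solver would update 'model' from traveltime residuals here.
    return model

data_by_window = [np.zeros(10) for _ in range(4)]   # seismic data per time window
model = np.zeros(100)                               # starting model, first window
for window_data in data_by_window:                  # time-marching loop
    n_cells = 16                                    # begin on a coarse mesh
    for level in range(3):                          # multi-grid loop: coarse -> fine
        model = invert(model, window_data, n_cells)
        n_cells *= 4                                # refine the mesh for the next pass
    # 'model' is carried forward as the starting model of the next time window
print("done; final model carried through", len(data_by_window), "time windows")
```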
Levels of polychlorinated biphenyls (PCBs) in caulk and window glazing material samples from older buildings were determined, using a method developed for this purpose. This method was evaluated by analyzing a combination of 47 samples of caulk, glazing materials, including quali...
NASA Technical Reports Server (NTRS)
Tilton, James C. (Inventor)
2010-01-01
A method, computer readable storage, and apparatus for implementing recursive segmentation of data with spatial characteristics into regions, including splitting-remerging of pixels with contiguous region designations and a user-controlled parameter for providing a preference for merging adjacent regions to eliminate window artifacts.
Multi-window detection for P-wave in electrocardiograms based on bilateral accumulative area.
Chen, Riqing; Huang, Yingsong; Wu, Jian
2016-11-01
P-wave detection is one of the most challenging aspects in electrocardiograms (ECGs) due to its low amplitude, low frequency, and variable waveforms. This work introduces a novel multi-window detection method for P-wave delineation based on the bilateral accumulative area. The bilateral accumulative area is calculated by summing the areas covered by the P-wave curve with left and right sliding windows. The onset and offset of a positive P-wave correspond to the local maxima of the area detector. The position drift and the difference in area variation of local extreme points across different windows are used to systematically combine multi-window and 12-lead synchronous detection, which screens the optimal boundary points from all extreme points of different window widths and adaptively matches the P-wave location. The proposed method was validated with ECG signals from various databases, including the Standard CSE Database, T-Wave Alternans Challenge Database, PTB Diagnostic ECG Database, and the St. Petersburg Institute of Cardiological Technics 12-Lead Arrhythmia Database. The average sensitivity Se was 99.44% with a positive predictivity P+ of 99.37% for P-wave detection. Standard deviations of 3.7 and 4.3 ms were achieved for the onset and offset of P-waves, respectively, which is in agreement with the accepted tolerances required by the CSE committee. Compared with well-known delineation methods, this method can achieve high sensitivity and positive predictivity using a simple calculation process. The experimental results suggest that the bilateral accumulative area could be an effective detection tool for ECG signal analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.
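A minimal version of the window-area computation can be sketched as follows: sum the absolute signal over a left and a right sliding window at every sample. The window width, the toy ECG, and the simple left-plus-right sum are illustrative assumptions; the paper's multi-window, 12-lead onset/offset rule is not reproduced here.

```python
# Sketch of left/right accumulative window areas for one ECG lead; the window
# width and toy signal are illustrative, and the paper's delineation rule
# for onset/offset is not reproduced.
import numpy as np

def bilateral_areas(sig, width):
    """Area of |signal| in the left and right windows at every sample."""
    padded = np.pad(np.abs(sig), width, mode="edge")
    csum = np.concatenate(([0.0], np.cumsum(padded)))
    idx = np.arange(len(sig)) + width
    left = csum[idx + 1] - csum[idx + 1 - width]    # window ending at the sample
    right = csum[idx + 1 + width] - csum[idx + 1]   # window starting right after it
    return left, right

fs = 500                                            # sampling rate (Hz), assumed
t = np.arange(0, 1.0, 1.0 / fs)
ecg = 0.1 * np.exp(-((t - 0.3) ** 2) / (2 * 0.01 ** 2))      # toy "P-wave" bump
ecg += 0.005 * np.random.default_rng(0).normal(size=t.size)  # baseline noise

left, right = bilateral_areas(ecg, width=int(0.04 * fs))     # 40 ms windows
detector = left + right                             # simple combined area detector
print(f"largest combined area near t = {t[int(np.argmax(detector))]:.3f} s")
```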
Galias, Zbigniew
2017-05-01
An efficient method to find positions of periodic windows for the quadratic map f(x)=ax(1-x) and a heuristic algorithm to locate the majority of wide periodic windows are proposed. Accurate rigorous bounds of positions of all periodic windows with periods below 37 and the majority of wide periodic windows with longer periods are found. Based on these results, we prove that the measure of the set of regular parameters in the interval [3,4] is above 0.613960137. The properties of periodic windows are studied numerically. The results of the analysis are used to estimate that the true value of the measure of the set of regular parameters is close to 0.6139603.
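A non-rigorous way to probe whether a given parameter of f(x) = ax(1-x) lies in a periodic window is to iterate past a transient and test whether the orbit has settled onto a short cycle, as in the sketch below; the transient length, tolerance, and parameter values are illustrative, and this does not reproduce the rigorous interval bounds used in the paper.

```python
# Non-rigorous numerical probe for periodic windows of f(x) = a*x*(1-x);
# transient length, tolerance, and max period are illustrative choices.
def detected_period(a, x0=0.5, transient=10000, max_period=36, tol=1e-9):
    x = x0
    for _ in range(transient):          # discard the transient
        x = a * x * (1.0 - x)
    orbit = [x]
    for _ in range(max_period):
        x = a * x * (1.0 - x)
        orbit.append(x)
    for p in range(1, max_period + 1):  # smallest p with x_{n+p} close to x_n
        if abs(orbit[p] - orbit[0]) < tol:
            return p
    return None                          # no short cycle found (likely chaotic)

for a in (3.83, 3.63, 3.74, 3.9):
    print(f"a = {a}: detected period {detected_period(a)}")
```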
Measure Guideline. Wood Window Repair, Rehabilitation, and Replacement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, P.; Eng, P.
2012-12-01
This measure guideline provides information and guidance on rehabilitating, retrofitting, and replacing existing window assemblies in residential construction. The intent is to provide information regarding means and methods to improve the energy and comfort performance of existing wood window assemblies in a way that takes into consideration component durability, in-service operation, and long term performance of the strategies.
Measure Guideline: Window Repair, Rehabilitation, and Replacement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, P.
2012-12-01
This measure guideline provides information and guidance on rehabilitating, retrofitting, and replacing existing window assemblies in residential construction. The intent is to provide information regarding means and methods to improve the energy and comfort performance of existing wood window assemblies in a way that takes into consideration component durability, in-service operation, and long term performance of the strategies.
NASA Astrophysics Data System (ADS)
Xiao, Fan; Chen, Zhijun; Chen, Jianguo; Zhou, Yongzhang
2016-05-01
In this study, a novel batch sliding window (BSW) based singularity mapping approach is proposed. Compared to the traditional sliding window (SW) technique, which suffers from the empirical predetermination of a fixed maximum window size and the outlier sensitivity of the least-squares (LS) linear regression method, the BSW based singularity mapping approach can automatically determine the optimal size of the largest window for each estimated position and uses robust linear regression (RLR), which is insensitive to outlier values. In the case study, tin geochemical data from Gejiu, Yunnan, were processed by the BSW based singularity mapping approach. The results show that the BSW approach can improve the accuracy of the calculated singularity exponent values owing to the determination of the optimal maximum window size. The use of the RLR method in the BSW approach smooths the distribution of singularity index values, with few of the highly fluctuating, noise-like values that usually make a singularity map rough and discontinuous. Furthermore, the Student's t-statistic diagram indicates a strong spatial correlation between high geochemical anomalies and known tin polymetallic deposits. The target areas within the high tin geochemical anomaly probably have much higher potential for the exploration of new tin polymetallic deposits than other areas, particularly those areas that show strong tin geochemical anomalies but in which no tin polymetallic deposits have yet been found.
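Window-based singularity analysis usually estimates the exponent from the log-log slope of mean concentration versus window size; the sketch below fits that slope with a robust regressor (scikit-learn's HuberRegressor standing in for the RLR step) on synthetic data with one outlier. The C(r) proportional to r^(alpha-2) parameterization is the conventional form for map data, not something taken from this paper.

```python
# Sketch of estimating a local singularity exponent from window averages
# using a robust linear fit instead of ordinary least squares.
import numpy as np
from sklearn.linear_model import HuberRegressor, LinearRegression

rng = np.random.default_rng(1)
true_alpha = 1.6                     # synthetic singularity exponent
sizes = np.array([1, 3, 5, 7, 9, 11], dtype=float)   # window half-sizes r
# Synthetic mean concentrations following C(r) ~ r**(alpha - 2), with one outlier.
conc = sizes ** (true_alpha - 2.0) * np.exp(rng.normal(0, 0.02, sizes.size))
conc[3] *= 3.0                       # a spike that would bias an LS fit

X = np.log(sizes).reshape(-1, 1)
y = np.log(conc)
ls_slope = LinearRegression().fit(X, y).coef_[0]
rlr_slope = HuberRegressor().fit(X, y).coef_[0]
print("alpha (LS fit):    ", 2.0 + ls_slope)
print("alpha (robust fit):", 2.0 + rlr_slope)
```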
The window of visibility: A psychological theory of fidelity in time-sampled visual motion displays
NASA Technical Reports Server (NTRS)
Watson, A. B.; Ahumada, A. J., Jr.; Farrell, J. E.
1983-01-01
Many visual displays, such as movies and television, rely upon sampling in the time domain. The spatiotemporal frequency spectra for some simple moving images are derived and illustrations of how these spectra are altered by sampling in the time domain are provided. A simple model of the human perceiver which predicts the critical sample rate required to render sampled and continuous moving images indistinguishable is constructed. The rate is shown to depend upon the spatial and temporal acuity of the observer, and upon the velocity and spatial frequency content of the image. Several predictions of this model are tested and confirmed. The model is offered as an explanation of many of the phenomena known as apparent motion. Finally, the implications of the model for computer-generated imagery are discussed.
Very-short-term wind power prediction by a hybrid model with single- and multi-step approaches
NASA Astrophysics Data System (ADS)
Mohammed, E.; Wang, S.; Yu, J.
2017-05-01
Very-short-term wind power prediction (VSTWPP) plays an essential role in the operation of electric power systems. This paper aims at improving and applying a hybrid method of VSTWPP based on historical data. The hybrid method combines multiple linear regression and least squares (MLR&LS) and is intended to reduce prediction errors. The predicted values are obtained through two sub-processes: (1) transform the time-series data of actual wind power into the power ratio, and then predict the power ratio; (2) use the predicted power ratio to predict the wind power. In addition, the proposed method supports two prediction approaches: single-step prediction (SSP) and multi-step prediction (MSP). The predictions are compared against an auto-regressive moving average (ARMA) model in terms of predicted values and errors. The validity of the proposed hybrid method is confirmed through error analysis using the probability density function (PDF), mean absolute percent error (MAPE), and mean square error (MSE). Meanwhile, comparison of the correlation coefficients between the actual and predicted values for different prediction times and windows confirms that the MSP approach using the hybrid model is the most accurate, compared to the SSP approach and ARMA. The MLR&LS method is accurate and promising for solving problems in wind power prediction.
Measuring multiple spike train synchrony.
Kreuz, Thomas; Chicharro, Daniel; Andrzejak, Ralph G; Haas, Julie S; Abarbanel, Henry D I
2009-10-15
Measures of multiple spike train synchrony are essential in order to study issues such as spike timing reliability, network synchronization, and neuronal coding. These measures can broadly be divided into multivariate measures and averages over bivariate measures. One of the most recent bivariate approaches, the ISI-distance, employs the ratio of instantaneous interspike intervals (ISIs). In this study we propose two extensions of the ISI-distance, the straightforward averaged bivariate ISI-distance and the multivariate ISI-diversity based on the coefficient of variation. Like the original measure, these extensions combine many properties desirable in applications to real data. In particular, they are parameter-free, time scale independent, and easy to visualize in a time-resolved manner, as we illustrate with in vitro recordings from a cortical neuron. Using a simulated network of Hindmarsh-Rose neurons as a controlled configuration, we compare the performance of our methods in distinguishing different levels of multi-neuron spike train synchrony to the performance of six other previously published measures. We show and explain why the averaged bivariate measures perform better than the multivariate ones and why the multivariate ISI-diversity is the best performer among the multivariate methods. Finally, in a comparison against standard methods that rely on moving window estimates, we use single-unit monkey data to demonstrate the advantages of the instantaneous nature of our methods.
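The bivariate ISI-distance can be sketched on a dense time grid as below, following the commonly cited ratio-based definition (the exact computation is piecewise constant in time); the spike trains and grid spacing are synthetic placeholders.

```python
# Sketch of the bivariate ISI-distance evaluated on a dense time grid
# (a simplification of the exact piecewise-constant evaluation).
import numpy as np

def current_isi(spikes, t_grid):
    """Length of the interspike interval containing each grid time."""
    idx = np.searchsorted(spikes, t_grid, side="right")
    idx = np.clip(idx, 1, len(spikes) - 1)
    return spikes[idx] - spikes[idx - 1]

def isi_distance(spikes_x, spikes_y, t_grid):
    isi_x = current_isi(spikes_x, t_grid)
    isi_y = current_isi(spikes_y, t_grid)
    ratio = np.where(isi_x <= isi_y, isi_x / isi_y - 1.0, -(isi_y / isi_x - 1.0))
    return np.mean(np.abs(ratio))     # 0 for identical trains, larger when dissimilar

rng = np.random.default_rng(2)
train_a = np.sort(rng.uniform(0, 10, 40))
train_b = np.sort(train_a + rng.normal(0, 0.05, train_a.size))  # jittered copy
t_grid = np.linspace(0.5, 9.5, 2000)
print("ISI-distance (similar trains):", isi_distance(train_a, train_b, t_grid))
print("ISI-distance (independent):   ",
      isi_distance(train_a, np.sort(rng.uniform(0, 10, 40)), t_grid))
```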
NASA Astrophysics Data System (ADS)
Mannon, Timothy Patrick, Jr.
Improving well design has been and always will be the primary goal in drilling operations in the oil and gas industry. Oil and gas plays are continuing to move into increasingly hostile drilling environments, including near- and/or sub-salt settings. The ability to reduce the risk and uncertainty involved in drilling operations in unconventional geologic settings starts with improving the techniques for mudweight window modeling. To address this issue, an analysis of wellbore stability and well design improvement has been conducted. This study will show a systematic approach to well design by focusing on best practices for mudweight window projection for a field in Mississippi Canyon, Gulf of Mexico. The field includes depleted reservoirs and is in close proximity to salt intrusions. Analysis of offset wells has been conducted in the interest of developing an accurate picture of the subsurface environment by making connections between depth, non-productive time (NPT) events, and mudweights used. Commonly practiced petrophysical methods of pore pressure, fracture pressure, and shear failure gradient prediction have been applied to key offset wells in order to enhance the well design for two proposed wells. For the first time in the literature, the accuracy of the commonly accepted seismic interval velocity based methodology and the relatively new seismic frequency based methodology for pore pressure prediction are qualitatively and quantitatively compared. Accuracy standards will be based on the agreement of the seismic outputs with pressure data obtained while drilling and with petrophysically based pore pressure outputs for each well. The results will show significantly higher accuracy for the seismic frequency based approach in wells in near/sub-salt environments and higher overall accuracy for all of the wells in the study as a whole.
Inda, Márcia A; van Batenburg, Marinus F; Roos, Marco; Belloum, Adam S Z; Vasunin, Dmitry; Wibisono, Adianto; van Kampen, Antoine H C; Breit, Timo M
2008-08-08
Chromosome location is often used as a scaffold to organize genomic information in both the living cell and molecular biological research. Thus, ever-increasing amounts of data about genomic features are stored in public databases and can be readily visualized by genome browsers. To perform in silico experimentation conveniently with this genomics data, biologists need tools to process and compare datasets routinely and explore the obtained results interactively. The complexity of such experimentation requires these tools to be based on an e-Science approach, hence generic, modular, and reusable. A virtual laboratory environment with workflows, workflow management systems, and Grid computation are therefore essential. Here we apply an e-Science approach to develop SigWin-detector, a workflow-based tool that can detect significantly enriched windows of (genomic) features in a (DNA) sequence in a fast and reproducible way. For proof-of-principle, we utilize a biological use case to detect regions of increased and decreased gene expression (RIDGEs and anti-RIDGEs) in human transcriptome maps. We improved the original method for RIDGE detection by replacing the costly step of estimation by random sampling with a faster analytical formula for computing the distribution of the null hypothesis being tested and by developing a new algorithm for computing moving medians. SigWin-detector was developed using the WS-VLAM workflow management system and consists of several reusable modules that are linked together in a basic workflow. The configuration of this basic workflow can be adapted to satisfy the requirements of the specific in silico experiment. As we show with the results from analyses in the biological use case on RIDGEs, SigWin-detector is an efficient and reusable Grid-based tool for discovering windows enriched for features of a particular type in any sequence of values. Thus, SigWin-detector provides the proof-of-principle for the modular e-Science based concept of integrative bioinformatics experimentation.
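The window statistic at the heart of the tool, a moving median scanned along the sequence and compared to a threshold, can be reproduced naively as below; the window size and threshold are arbitrary illustrations, and neither the paper's fast moving-median algorithm nor its analytical null distribution is reproduced here.

```python
# Naive moving-median enrichment scan; the threshold is an arbitrary
# illustration, not the analytical significance level derived in the paper.
import numpy as np

def moving_median(values, window):
    half = window // 2
    return np.array([np.median(values[max(0, i - half): i + half + 1])
                     for i in range(len(values))])

rng = np.random.default_rng(3)
expression = rng.normal(0.0, 1.0, 500)
expression[200:260] += 1.5            # a synthetic "RIDGE"-like enriched region

med = moving_median(expression, window=39)
threshold = 0.8                        # illustrative cutoff only
enriched = np.flatnonzero(med > threshold)
if enriched.size:
    print(f"enriched window centers: {enriched[0]}..{enriched[-1]}")
```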
Handling and analysis of ices in cryostats and glove boxes in view of cometary samples
NASA Technical Reports Server (NTRS)
Roessler, K.; Eich, G.; Heyl, M.; Kochan, H.; Oehler, A.; Patnaik, A.; Schlosser, W.; Schulz, R.
1989-01-01
Comet nucleus sample return missions and other return missions from planets and satellites need equipment for handling and analysis of icy samples at low temperatures under vacuum or protective gas. Two methods are reported which were developed for analysis of small icy samples and which are modified for larger samples in cometary matter simulation experiments (KOSI). A conventional optical cryostat system was modified to allow for transport of samples at 5 K, ion beam irradiation, and measurement in an off-line optical spectrophotometer. The new system consists of a removable window plug containing nozzles for condensation of water and volatiles onto a cold finger. This plug can be removed in a vacuum system and exchanged for another plug (e.g., with other windows (IR, VIS, VUV) or other nozzles). While the plug is open, the samples can be treated under vacuum, with cooling, by manipulators (cutting, removal, sample taking, irradiation with light, photons, or ions). After the plug is brought back, the samples can be moved to another site of analysis. For handling the 30 cm diameter mineral-ice samples from the KOSI experiments, an 80x80x80 cm glove box made of plexiglass was used. The samples were kept in a liquid nitrogen bath, which was filled from the outside. A stream of dry N2 and evaporating gas from the bath purged the glove box of impurity gases and, in particular, H2O, which would otherwise condense onto the samples.
The software and algorithms for hyperspectral data processing
NASA Astrophysics Data System (ADS)
Shyrayeva, Anhelina; Martinov, Anton; Ivanov, Victor; Katkovsky, Leonid
2017-04-01
Hyperspectral remote sensing technique is widely used for collecting and processing information about the Earth's surface objects. Hyperspectral data are combined to form a three-dimensional (x, y, λ) data cube. The Department of Aerospace Research of the Institute of Applied Physical Problems of the Belarusian State University presents a general model of the software for hyperspectral image data analysis and processing. The software runs in the Windows XP/7/8/8.1/10 environment on any personal computer. The package has been written in C++ using the Qt framework and OpenGL for graphical data visualization. The software has a flexible structure that consists of a set of independent plugins. Each plugin is compiled as a Qt Plugin and represents a Windows dynamic library (dll). Plugins can be categorized in terms of data reading types, data visualization (3D, 2D, 1D), and data processing. The software has various built-in functions for statistical and mathematical analysis and signal processing, such as a moving-average smoothing function, the Savitzky-Golay smoothing technique, RGB correction, histogram transformation, and atmospheric correction. The software provides two of the authors' engineering techniques for the solution of the atmospheric correction problem: an iterative method for refinement of spectral albedo parameters using Libradtran, and an analytical least squares method. The main advantages of these methods are a high processing rate (several minutes for 1 GB of data) and a low relative error in albedo retrieval (less than 15%). The software also supports work with spectral libraries, region of interest (ROI) selection, and spectral analysis such as cluster-type image classification and automatic comparison of hypercube spectra, by a similarity criterion, with spectra from spectral libraries, and vice versa. The software deals with different kinds of spectral information in order to identify and distinguish spectrally unique materials. The following advantages should also be noted: fast and low-memory hypercube manipulation features, a user-friendly interface, modularity, and expandability.
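Two of the preprocessing functions mentioned, moving-average and Savitzky-Golay smoothing, have standard equivalents in scientific Python; the sketch below applies both to a synthetic spectrum and is independent of the C++/Qt software described in the abstract.

```python
# Moving-average and Savitzky-Golay smoothing of a synthetic spectrum;
# window lengths and polynomial order are illustrative choices.
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(4)
wavelength = np.linspace(400, 900, 501)                  # nm
clean = np.exp(-((wavelength - 650) ** 2) / 5000.0)
spectrum = clean + 0.02 * rng.normal(size=501)

window = 11
moving_avg = np.convolve(spectrum, np.ones(window) / window, mode="same")
sg_smooth = savgol_filter(spectrum, window_length=11, polyorder=3)

print("noise std before:        ", np.std(spectrum - clean))
print("noise std after moving avg:", np.std(moving_avg - clean))
print("noise std after SG filter: ", np.std(sg_smooth - clean))
```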
NASA Astrophysics Data System (ADS)
Imanishi, K.; Uchide, T.; Takeda, N.
2014-12-01
We propose a method to determine focal mechanisms of non-volcanic tremors (NVTs) based on S-wave polarization angles. The successful retrieval of polarization angles from low S/N tremor signals owes much to the observation that NVTs propagate slowly and therefore do not change their location immediately. This feature of NVTs enables us to use a longer window to compute a polarization angle (e.g., one minute or longer), resulting in a stack of particle motions. Following Zhang and Schwartz (1994), we first correct for the splitting effect to recover the source polarization angle (anisotropy-corrected angle). This is a key step, because shear-wave splitting distorts the particle motion excited by a seismic source. We then determine the best double-couple solution using anisotropy-corrected angles from multiple stations. The present method was applied to a tremor sequence at Kii Peninsula, southwest Japan, which occurred at the beginning of April 2013. A standard splitting and polarization analysis was applied in a one-minute-long moving window to determine the splitting parameters as well as the anisotropy-corrected angles. A grid search was performed at each hour to determine the best double-couple solution satisfying the one-hour average polarization angles. Most solutions show NW-dipping low-angle planes consistent with the plate boundary or SE-dipping high-angle planes. Because of the 180-degree ambiguity in polarization angles, the present method alone cannot distinguish the compressional quadrant from the dilatational one. Together with the observation of very low-frequency earthquakes near the present study area (Ito et al., 2007), it is reasonable to consider that they represent shear slip on low-angle thrust faults. It is also noted that some of the solutions contain a strike-slip component. Acknowledgements: Seismograph stations used in this study include permanent stations operated by NIED (Hi-net), JMA, and the Earthquake Research Institute, together with the Geological Survey of Japan, AIST. This work was supported by JSPS KAKENHI Grant Number 24540463.
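A common way to extract a polarization azimuth from a long two-component window is to take the principal eigenvector of the covariance matrix of the horizontal components, as in the sketch below on synthetic data; the splitting correction and the grid search over double-couple mechanisms are omitted, and the 180-degree ambiguity noted in the abstract appears as the modulo-180 step.

```python
# Polarization azimuth from the covariance of two horizontal components
# over a long window (splitting correction and mechanism search omitted).
import numpy as np

def polarization_angle(north, east):
    """Azimuth (degrees from north) of the dominant particle-motion direction."""
    data = np.vstack([north - north.mean(), east - east.mean()])
    cov = data @ data.T / data.shape[1]
    eigvals, eigvecs = np.linalg.eigh(cov)
    v = eigvecs[:, np.argmax(eigvals)]          # principal direction (N, E)
    return np.degrees(np.arctan2(v[1], v[0])) % 180.0   # 180-degree ambiguity

rng = np.random.default_rng(5)
t = np.linspace(0, 60, 6000)                     # one-minute window
signal = np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.02 * t)
azimuth_true = 40.0                              # degrees, synthetic polarization
north = signal * np.cos(np.radians(azimuth_true)) + 0.3 * rng.normal(size=t.size)
east = signal * np.sin(np.radians(azimuth_true)) + 0.3 * rng.normal(size=t.size)
print(f"recovered polarization azimuth: {polarization_angle(north, east):.1f} deg")
```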
Method for preparing dosimeter for measuring skin dose
Jones, Donald E.; Parker, DeRay; Boren, Paul R.
1982-01-01
A personnel dosimeter includes a plurality of compartments containing thermoluminescent dosimeter phosphors for registering radiation dose absorbed in the wearer's sensitive skin layer and for registering more deeply penetrating radiation. Two of the phosphor compartments communicate with thin windows of different thicknesses to obtain a ratio of shallowly penetrating radiation, e.g. beta. A third phosphor is disposed within a compartment communicating with a window of substantially greater thickness than the windows of the first two compartments for estimating the more deeply penetrating radiation dose. By selecting certain phosphors that are insensitive to neutrons and by loading the holder material with neutron-absorbing elements, energetic neutron dose can be estimated separately from other radiation dose. This invention also involves a method of injection molding of dosimeter holders with thin windows of consistent thickness at the corresponding compartments of different holders. This is achieved through use of a die insert having the thin window of precision thickness in place prior to the injection molding step.
Zhao, Li-Ting; Xiang, Yu-Hong; Dai, Yin-Mei; Zhang, Zhuo-Yong
2010-04-01
Near infrared spectroscopy was applied to measure tissue slices of endometrial tissues and collect their spectra. A total of 154 spectra were obtained from 154 samples. The numbers of normal, hyperplasia, and malignant samples were 36, 60, and 58, respectively. Original near infrared spectra are composed of many variables and contain interference such as instrument errors and physical effects, for example particle size and light scatter. To reduce these influences, the original spectra should be treated with different spectral preprocessing methods to compress variables and extract useful information. Thus the methods of spectral preprocessing and wavelength selection play an important role in the near infrared spectroscopy technique. In the present paper the raw spectra were processed using various preprocessing methods, including first derivative, multiplicative scatter correction, the Savitzky-Golay first derivative algorithm, standard normal variate, smoothing, and moving-window median. The standard deviation was used to select the optimal spectral region of 4 000-6 000 cm(-1). Then principal component analysis was used for classification. The principal component analysis results showed that the three types of samples could be discriminated completely, with an accuracy of almost 100%. This study demonstrated that near infrared spectroscopy technology combined with chemometrics could be a fast, efficient, and novel means to diagnose cancer. The proposed methods could be a promising and significant diagnostic technique for early stage cancer.
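A typical chain matching the preprocessing described here, standard normal variate scaling followed by a Savitzky-Golay first derivative and then principal component analysis, can be sketched on synthetic spectra as below; the class structure, window length, and component count are placeholders rather than the paper's data or settings.

```python
# SNV + Savitzky-Golay first derivative + PCA on synthetic "spectra";
# all data and parameter choices are illustrative placeholders.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
n_per_class, n_points = 30, 300
base = np.exp(-np.linspace(-3, 3, n_points) ** 2)
classes = []
for shift in (0.0, 0.3, 0.6):                 # three synthetic tissue classes
    classes.append(base[None, :] * (1.0 + shift)
                   + 0.02 * rng.normal(size=(n_per_class, n_points)))
spectra = np.vstack(classes)

# Standard normal variate: center and scale each spectrum individually.
snv = (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)
# Savitzky-Golay first derivative along the wavelength axis.
deriv = savgol_filter(snv, window_length=11, polyorder=2, deriv=1, axis=1)

scores = PCA(n_components=2).fit_transform(deriv)
print("PC score matrix shape:", scores.shape)   # (90, 2): one point per sample
```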
Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Grassmann, Mariel; Ceulemans, Eva
2017-06-01
Change point detection in multivariate time series is a complex task since next to the mean, the correlation structure of the monitored variables may also alter when change occurs. DeCon was recently developed to detect such changes in mean and/or correlation by combining a moving windows approach and robust PCA. However, in the literature, several other methods have been proposed that employ other non-parametric tools: E-divisive, Multirank, and KCP. Since these methods use different statistical approaches, two issues need to be tackled. First, applied researchers may find it hard to appraise the differences between the methods. Second, a direct comparison of the relative performance of all these methods for capturing change points signaling correlation changes is still lacking. Therefore, we present the basic principles behind DeCon, E-divisive, Multirank, and KCP and the corresponding algorithms, to make them more accessible to readers. We further compared their performance through extensive simulations using the settings of Bulteel et al. (Biological Psychology, 98 (1), 29-42, 2014) implying changes in mean and in correlation structure and those of Matteson and James (Journal of the American Statistical Association, 109 (505), 334-345, 2014) implying different numbers of (noise) variables. KCP emerged as the best method in almost all settings. However, in case of more than two noise variables, only DeCon performed adequately in detecting correlation changes.
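The moving-window intuition these detectors build on can be shown very simply: compute the correlation of two monitored variables in a sliding window and look for abrupt jumps. The sketch below is not DeCon, E-divisive, Multirank, or KCP; it only illustrates the kind of signal those methods formalize, with an arbitrary window size and change indicator.

```python
# Sliding-window correlation profile for a bivariate series with a
# correlation change point; window size and change indicator are illustrative.
import numpy as np

rng = np.random.default_rng(7)
n = 600
x = rng.normal(size=n)
noise = rng.normal(size=n)
y = np.concatenate([0.1 * x[:300] + noise[:300],          # weakly correlated regime
                    0.9 * x[300:] + 0.3 * noise[300:]])   # strongly correlated regime

window = 50
corr = np.array([np.corrcoef(x[i:i + window], y[i:i + window])[0, 1]
                 for i in range(n - window)])
jump = int(np.argmax(np.abs(np.diff(corr))))               # crude change indicator
print(f"largest one-step jump in windowed correlation near index {jump + window // 2}")
```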
Maps of averaged spectral deviations from soil lines and their comparison with traditional soil maps
NASA Astrophysics Data System (ADS)
Rukhovich, D. I.; Rukhovich, A. D.; Rukhovich, D. D.; Simakova, M. S.; Kulyanitsa, A. L.; Bryzzhev, A. V.; Koroleva, P. V.
2016-07-01
The analysis of 34 cloudless fragments of Landsat 5, 7, and 8 images (1985-2014) covering the Plavsk, Arsen'evsk, and Chern districts of Tula oblast has been performed. It is shown that the bare soil surface on the RED-NIR plots derived from the images cannot be described as a sector of the spectral plane, as can be done for NDVI values. The notion of the spectral neighborhood of the soil line (SNSL) is suggested. It is defined as the set of points of the RED-NIR spectral space that are characterized by the spectral characteristics of the bare soil used for constructing soil lines. A way to delimit the SNSL along the line of lowest point density in the RED-NIR spectral space is suggested; this line separates bare soil surface from vegetating plants. The SNSL has been applied to construct the soil line (SL) for each of the 34 images and to delineate the bare soil surface on them. Distances from the points with averaged RED-NIR coordinates to the SL have been calculated using the moving window method. These distances can be referred to as averaged spectral deviations (ASDs). The calculations have been performed strictly within the SNSL areas. As a result, 34 maps of ASDs have been created. These maps contain ASD values for 6036 points of the grid used in the study. Then, the integral map of normalized ASD values has been built, taking into account the number of points participating in the calculation (i.e., lying in the SNSL) within the moving window. The integral map of ASD values has been compared with four traditional soil maps of the studied territory. It is shown that this integral map can be interpreted in terms of soil taxa: the areas of seven soil subtypes (soddy moderately podzolic, soddy slightly podzolic, light gray forest, gray forest, dark gray forest, podzolized chernozems, and leached chernozems) belonging to three soil types (soddy-podzolic, gray forest, and chernozemic soils) can be delineated on it.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fattori, G; Klimpki, G; Safai, S
Purpose: We aim to compare the performance of discrete spot- or continuous line scanning combined with rescanning in mitigating residual organ motion during gated proton therapy treatments. Methods: The Quasar respiratory phantom was used to move a 2D scintillation detector on a linear trajectory with sinusoidal motion pattern (sin^4), 20 mm peak-to-peak amplitude and 5 sec period. Its motion was monitored using a customized solution based on optical tracking technology. We compared spot and line scanning plans for a monoenergetic 150 MeV circular field, 50.4 mm radius at isocenter. Transverse dose distributions at 13 cm depth in PMMA (15.47 mm water equivalent) were measured to compare three options for motion mitigation: rescanning (10× factor), gating and their combination. The gating window was centered in the trajectory plateau to simulate end-exhale gated treatment in presence of 2 mm and 4 mm residual motion, parallel or perpendicular to the primary scanning direction. Results: When the target moves perpendicular to the primary scanning direction, large dose deviations are measured (γ3%/3mm=47%) without mitigation techniques. Beam gating combined with rescanning restores target coverage (γ3%/3mm=91%). For parallel target motion, observed dose distortions in the non-compensated irradiation are smaller (γ3%/3mm=77%). Beam gating alone recovers the 100% gamma pass-rate at 3%/3mm. Continuous line scanning reduces delivery time by up to 60% with respect to discrete spot scanning in presence of motion mitigation, and improves homogeneity when rescanning is applied (up to 20%, perpendicular motion, 4 mm residual motion). Conclusion: The direction of motion has a large impact on the target dose coverage. Nevertheless, even in the worst case scenario, gating combined with rescanning could mitigate the impact of motion on dose deposition. Moreover, continuous line rescanning improves the robustness against residual motion in the gating window. This study has received funding from the European Community’s Seventh Framework Programme (FP7/2007–2013) under grant agreement n.290605 (PSI-FELLOW/COFUND) and ‘Giuliana and Giorgio Stefanini Foundation’.
Zhang, Liu-Xia; Cao, Yi-Ren; Xiao, Hua; Liu, Xiao-Ping; Liu, Shao-Rong; Meng, Qing-Hua; Fan, Liu-Yin; Cao, Cheng-Xi
2016-03-15
In the present work we describe a simple, rapid, and quantitative analytical method for the detection of different proteins present in biological samples. For this, we propose the model of titration of double protein (TDP) and the related leverage theory, which relies on the retardation signal of chip moving reaction boundary electrophoresis (MRBE). The leverage principle states that the product of the first protein content and its absolute retardation signal is equal to that of the second protein content and its absolute one. To support the model, we first demonstrated the leverage principle theoretically. Relevant experiments were then conducted on the TDP-MRBE chip. The results revealed that (i) there was a leverage principle of retardation signal within the TDP of two pure proteins, and (ii) a lever also existed within two complex protein samples, clearly demonstrating the validity of the TDP model and leverage theory in the MRBE chip. It was also shown that the proposed technique could provide rapid and simple quantitative analysis of two protein samples in a mixture. Finally, we successfully applied the developed technique to the quantification of soymilk in adulterated infant formula. The TDP-MRBE opens a new window for detecting the adulteration ratio of a low-quality food (milk) blended into a high-quality one. Copyright © 2015 Elsevier B.V. All rights reserved.
1st- and 2nd-order motion and texture resolution in central and peripheral vision
NASA Technical Reports Server (NTRS)
Solomon, J. A.; Sperling, G.
1995-01-01
STIMULI. The 1st-order stimuli are moving sine gratings. The 2nd-order stimuli are fields of static visual texture, whose contrasts are modulated by moving sine gratings. Neither the spatial slant (orientation) nor the direction of motion of these 2nd-order (microbalanced) stimuli can be detected by a Fourier analysis; they are invisible to Reichardt and motion-energy detectors. METHOD. For these dynamic stimuli, when presented both centrally and in an annular window extending from 8 to 10 deg in eccentricity, we measured the highest spatial frequency for which discrimination between +/- 45 deg texture slants and discrimination between opposite directions of motion were each possible. RESULTS. For sufficiently low spatial frequencies, slant and direction can be discriminated in both central and peripheral vision, for both 1st- and for 2nd-order stimuli. For both 1st- and 2nd-order stimuli, at both retinal locations, slant discrimination is possible at higher spatial frequencies than direction discrimination. For both 1st- and 2nd-order stimuli, motion resolution decreases 2-3 times more rapidly with eccentricity than does texture resolution. CONCLUSIONS. (1) 1st- and 2nd-order motion scale similarly with eccentricity. (2) 1st- and 2nd-order texture scale similarly with eccentricity. (3) The central/peripheral resolution fall-off is 2-3 times greater for motion than for texture.
Displacement and frequency analyses of vibratory systems
NASA Astrophysics Data System (ADS)
Low, K. H.
1995-02-01
This paper deals with the frequency and response studies of vibratory systems, which are represented by a set of n coupled second-order differential equations. The following numerical methods are used in the response analysis: central difference, fourth-order Runge-Kutta, and modal methods. Data generated in the response analysis are processed to obtain the system frequencies by using the fast Fourier transform (FFT) or harmonic response methods. Two types of window are used in the FFT analysis: rectangular and Hanning windows. Examples of two, four, and seven degree-of-freedom systems are considered to illustrate the proposed algorithms. Comparisons with existing results confirm the validity of the proposed methods. The Hanning window attenuates the signal near the window edges, which gives a narrower bandwidth around the peak compared with the rectangular window. It is also found that in free vibrations of a multi-mass system, the masses vibrate in a manner that is a superposition of the natural frequencies of the system, while in forced vibrations the system vibrates at the driving frequency.
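The rectangular-versus-Hanning comparison is easy to reproduce on a synthetic two-mode response: window the record, take the FFT, and read off the dominant peak, as in the sketch below; the sampling rate and modal frequencies are stand-ins for the example systems in the paper.

```python
# FFT of a synthetic two-mode free response with rectangular vs. Hanning windows.
import numpy as np

fs = 200.0                               # sampling rate (Hz), illustrative
t = np.arange(0, 10, 1 / fs)
response = np.sin(2 * np.pi * 3.0 * t) + 0.5 * np.sin(2 * np.pi * 7.5 * t)

freqs = np.fft.rfftfreq(t.size, d=1 / fs)
spec_rect = np.abs(np.fft.rfft(response))                       # rectangular window
spec_hann = np.abs(np.fft.rfft(response * np.hanning(t.size)))  # Hanning window

for name, spec in (("rectangular", spec_rect), ("Hanning", spec_hann)):
    peak = freqs[np.argmax(spec)]
    print(f"{name:12s} window: dominant peak at {peak:.2f} Hz")
```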
Seismic signal time-frequency analysis based on multi-directional window using greedy strategy
NASA Astrophysics Data System (ADS)
Chen, Yingpin; Peng, Zhenming; Cheng, Zhuyuan; Tian, Lin
2017-08-01
The Wigner-Ville distribution (WVD) is an important time-frequency analysis technique with a highly concentrated energy distribution, used in seismic signal processing. However, it is affected by many cross terms. To suppress the cross terms of the WVD while keeping its concentrated energy distribution, an adaptive multi-directional filtering window in the ambiguity domain is proposed. Starting from the relationship between the Cohen class distribution and the Gabor transform, and combining a greedy strategy with the rotational invariance property of the fractional Fourier transform, the proposed multi-directional window extends the one-dimensional, single-direction optimal window function of the optimal fractional Gabor transform (OFrGT) to a two-dimensional, multi-directional window in the ambiguity domain. In this way, the multi-directional window matches the main auto terms of the WVD more precisely. Using the greedy strategy, the proposed window takes into account the optimal and other suboptimal directions, which also solves the problem of the OFrGT, called the local concentration phenomenon, encountered with multi-component signals. Experiments on both signal models and real seismic signals reveal that the proposed window can overcome the drawbacks of the WVD and the OFrGT mentioned above. Finally, the proposed method is applied to the spectral decomposition of a seismic signal. The results show that the proposed method can delineate the spatial distribution of a reservoir more precisely.
AN ASSESSMENT OF MCNP WEIGHT WINDOWS
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. S. HENDRICKS; C. N. CULBERTSON
2000-01-01
The weight window variance reduction method in the general-purpose Monte Carlo N-Particle radiation transport code MCNP™ has recently been rewritten. In particular, it is now possible to generate weight window importance functions on a superimposed mesh, eliminating the need to subdivide geometries for variance reduction purposes. Our assessment addresses the following questions: (1) Does the new MCNP4C treatment utilize weight windows as well as the former MCNP4B treatment? (2) Does the new MCNP4C weight window generator generate importance functions as well as MCNP4B? (3) How do superimposed mesh weight windows compare to cell-based weight windows? (4) What are the shortcomings of the new MCNP4C weight window generator? Our assessment was carried out with five neutron and photon shielding problems chosen for their demanding variance reduction requirements. The problems were an oil well logging problem, the Oak Ridge fusion shielding benchmark problem, a photon skyshine problem, an air-over-ground problem, and a sample problem for variance reduction.
On the use of transition matrix methods with extended ensembles.
Escobedo, Fernando A; Abreu, Charlles R A
2006-03-14
Different extended ensemble schemes for non-Boltzmann sampling (NBS) of a selected reaction coordinate lambda were formulated so that they employ (i) "variable" sampling window schemes (which include the "successive umbrella sampling" method) to comprehensively explore the lambda domain and (ii) transition matrix methods to iteratively obtain the underlying free-energy eta landscape (or "importance" weights) associated with lambda. The connection between "acceptance ratio" and transition matrix methods was first established to form the basis of the approach for estimating eta(lambda). The validity and performance of the different NBS schemes were then assessed using the configurational energy of the Lennard-Jones fluid as the lambda coordinate. For the cases studied, it was found that the convergence rate in the estimation of eta is little affected by the use of data from high-order transitions, while it is noticeably improved by the use of a broader window of sampling in the variable window methods. Finally, it is shown how an "elastic" window of sampling can be used to effectively enact (nonuniform) preferential sampling over the lambda domain, and how to stitch the weights from separate one-dimensional NBS runs to produce an eta surface over a two-dimensional domain.
Thin and open vessel windows for intra-vital fluorescence imaging of murine cochlear blood flow
Shi, Xiaorui; Zhang, Fei; Urdang, Zachary; Dai, Min; Neng, Lingling; Zhang, Jinhui; Chen, Songlin; Ramamoorthy, Sripriya; Nuttall, Alfred L.
2014-01-01
Normal microvessel structure and function in the cochlea is essential for maintaining the ionic and metabolic homeostasis required for hearing function. Abnormal cochlear microcirculation has long been considered an etiologic factor in hearing disorders. A better understanding of cochlear blood flow (CoBF) will enable more effective amelioration of hearing disorders that result from aberrant blood flow. However, establishing the direct relationship between CoBF and other cellular events in the lateral wall and the response to physio-pathological stress remains a challenge due to the lack of feasible interrogation methods and difficulty in accessing the inner ear. Here we report on new methods for studying the CoBF in a mouse model using a thin or open vessel-window in combination with fluorescence intra-vital microscopy (IVM). An open vessel-window enables investigation of vascular cell biology and blood flow permeability, including pericyte (PC) contractility, bone marrow cell migration, and endothelial barrier leakage, in wild type and fluorescent protein-labeled transgenic mouse models with high spatial and temporal resolution. Alternatively, the thin vessel-window method minimizes disruption of the homeostatic balance in the lateral wall and enables study of CoBF under relatively intact physiological conditions. A thin vessel-window method can also be used for time-based studies of physiological and pathological processes. Although the small size of the mouse cochlea makes surgery difficult, the methods are sufficiently developed for studying the structural and functional changes in CoBF under normal and pathological conditions. PMID:24780131
Field repair of AH-16 helicopter window cutting assemblies
NASA Technical Reports Server (NTRS)
Bement, L. J.
1984-01-01
The U.S. Army uses explosively actuated window cutting assemblies to provide emergency crew ground egress. Gaps between the system's explosive cords and acrylic windows caused a concern about functional reliability for a fleet of several hundred aircraft. A field repair method, using room temperature vulcanizing silicone compound (RTV), was developed and demonstrated to fill gaps as large as 0.250 inch.
Phast4Windows: A 3D graphical user interface for the reactive-transport simulator PHAST
Charlton, Scott R.; Parkhurst, David L.
2013-01-01
Phast4Windows is a Windows® program for developing and running groundwater-flow and reactive-transport models with the PHAST simulator. This graphical user interface allows definition of grid-independent spatial distributions of model properties—the porous media properties, the initial head and chemistry conditions, boundary conditions, and locations of wells, rivers, drains, and accounting zones—and other parameters necessary for a simulation. Spatial data can be defined without reference to a grid by drawing, by point-by-point definitions, or by importing files, including ArcInfo® shape and raster files. All definitions can be inspected, edited, deleted, moved, copied, and switched from hidden to visible through the data tree of the interface. Model features are visualized in the main panel of the interface, so that it is possible to zoom, pan, and rotate features in three dimensions (3D). PHAST simulates single phase, constant density, saturated groundwater flow under confined or unconfined conditions. Reactions among multiple solutes include mineral equilibria, cation exchange, surface complexation, solid solutions, and general kinetic reactions. The interface can be used to develop and run simple or complex models, and is ideal for use in the classroom, for analysis of laboratory column experiments, and for development of field-scale simulations of geochemical processes and contaminant transport.
NASA Astrophysics Data System (ADS)
Ham, Boo-Hyun; Kim, Il-Hwan; Park, Sung-Sik; Yeo, Sun-Young; Kim, Sang-Jin; Park, Dong-Woon; Park, Joon-Soo; Ryu, Chang-Hoon; Son, Bo-Kyeong; Hwang, Kyung-Bae; Shin, Jae-Min; Shin, Jangho; Park, Ki-Yeop; Park, Sean; Liu, Lei; Tien, Ming-Chun; Nachtwein, Angelique; Jochemsen, Marinus; Yan, Philip; Hu, Vincent; Jones, Christopher
2017-03-01
As critical dimensions for advanced two dimensional (2D) DUV patterning continue to shrink, the exact process window becomes increasingly difficult to determine. The defect size criteria shrink with the patterning critical dimensions and are well below the resolution of current optical inspection tools. As a result, it is more challenging for traditional bright field inspection tools to accurately discover the hotspots that define the process window. In this study, we use a novel computational inspection method to identify the depth-of-focus limiting features of a 10 nm node mask with 2D metal structures (single exposure) and compare the results to those obtained with a traditional process window qualification (PWQ) method based on a focus-modulated wafer and bright field inspection (BFI) to detect hotspot defects. The method is extended to litho-etch litho-etch (LELE) on a different test vehicle to show that overlay related bridging hotspots also can be identified.
Queues with Choice via Delay Differential Equations
NASA Astrophysics Data System (ADS)
Pender, Jamol; Rand, Richard H.; Wesson, Elizabeth
Delay or queue length information has the potential to influence the decision of a customer to join a queue. Thus, it is imperative for managers of queueing systems to understand how the information that they provide will affect the performance of the system. To this end, we construct and analyze two two-dimensional deterministic fluid models that incorporate customer choice behavior based on delayed queue length information. In the first fluid model, customers join each queue according to a Multinomial Logit Model, however, the queue length information the customer receives is delayed by a constant Δ. We show that the delay can cause oscillations or asynchronous behavior in the model based on the value of Δ. In the second model, customers receive information about the queue length through a moving average of the queue length. Although it has been shown empirically that giving patients moving average information causes oscillations and asynchronous behavior to occur in U.S. hospitals, we analytically and mathematically show for the first time that the moving average fluid model can exhibit oscillations and determine their dependence on the moving average window. Thus, our analysis provides new insight on how operators of service systems should report queue length information to customers and how delayed information can produce unwanted system dynamics.
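A worked form of the first fluid model that is consistent with this description, Multinomial Logit routing on queue lengths delayed by a constant Δ with queue-proportional departures, is sketched below; the notation (including the sensitivity parameter θ) is illustrative and may differ from the paper's.

```latex
% Sketch of a two-queue MNL fluid model with delayed information (constant delay \Delta);
% \lambda is the arrival rate, \mu the service rate, \theta a choice-sensitivity parameter.
\begin{aligned}
\dot q_1(t) &= \lambda\,
  \frac{e^{-\theta q_1(t-\Delta)}}{e^{-\theta q_1(t-\Delta)} + e^{-\theta q_2(t-\Delta)}}
  \;-\; \mu\, q_1(t),\\
\dot q_2(t) &= \lambda\,
  \frac{e^{-\theta q_2(t-\Delta)}}{e^{-\theta q_1(t-\Delta)} + e^{-\theta q_2(t-\Delta)}}
  \;-\; \mu\, q_2(t).
\end{aligned}
```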
NASA Technical Reports Server (NTRS)
2005-01-01
KENNEDY SPACE CENTER, FLA. From inside the viewing room of the Launch Control Center, KSC employees watch Space Shuttle Discovery as it creeps along the crawlerway toward the horizon, and Launch Pad 39B at NASA's Kennedy Space Center. First motion of the Shuttle out of the Vehicle Assembly Building (VAB) was at 2:04 p.m. EDT. The Mobile Launcher Platform is moved by the Crawler-Transporter underneath. The Crawler is 20 feet high, 131 feet long and 114 feet wide. It moves on eight tracks, each containing 57 shoes, or cleats, weighing one ton each. Loaded with the Space Shuttle, the Crawler can move at a maximum speed of approximately 1 mile an hour. A leveling system in the Crawler keeps the Shuttle vertical while negotiating the 5 percent grade leading to the top of the launch pad. Launch of Discovery on its Return to Flight mission, STS-114, is targeted for May 15 with a launch window that extends to June 3. During its 12-day mission, Discovery's seven-person crew will test new hardware and techniques to improve Shuttle safety, as well as deliver supplies to the International Space Station. Discovery was moved on March 29 from the Orbiter Processing Facility to the VAB and attached to its propulsion elements, a redesigned ET and twin SRBs.
NASA Technical Reports Server (NTRS)
2005-01-01
KENNEDY SPACE CENTER, FLA. As Space Shuttle Discovery creeps along the crawlerway toward the horizon, and Launch Pad 39B at NASA's Kennedy Space Center, media and workers in the foreground appear as ants. First motion of the Shuttle out of the Vehicle Assembly Building (VAB) was at 2:04 p.m. EDT. The Mobile Launcher Platform is moved by the Crawler-Transporter underneath. The Crawler is 20 feet high, 131 feet long and 114 feet wide. It moves on eight tracks, each containing 57 shoes, or cleats, weighing one ton each. Loaded with the Space Shuttle, the Crawler can move at a maximum speed of approximately 1 mile an hour. A leveling system in the Crawler keeps the Shuttle vertical while negotiating the 5 percent grade leading to the top of the launch pad. Launch of Discovery on its Return to Flight mission, STS-114, is targeted for May 15 with a launch window that extends to June 3. During its 12-day mission, Discovery's seven-person crew will test new hardware and techniques to improve Shuttle safety, as well as deliver supplies to the International Space Station. Discovery was moved on March 29 from the Orbiter Processing Facility to the VAB and attached to its propulsion elements, a redesigned ET and twin SRBs.
Plan averaging for multicriteria navigation of sliding window IMRT and VMAT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, David, E-mail: dcraft@partners.org; Papp, Dávid; Unkelbach, Jan
2014-02-15
Purpose: To describe a method for combining sliding window plans [intensity modulated radiation therapy (IMRT) or volumetric modulated arc therapy (VMAT)] for use in treatment plan averaging, which is needed for Pareto surface navigation based multicriteria treatment planning. Methods: The authors show that by taking an appropriately defined average of leaf trajectories of sliding window plans, the authors obtain a sliding window plan whose fluence map is the exact average of the fluence maps corresponding to the initial plans. In the case of static-beam IMRT, this also implies that the dose distribution of the averaged plan is the exact dosimetric average of the initial plans. In VMAT delivery, the dose distribution of the averaged plan is a close approximation of the dosimetric average of the initial plans. Results: The authors demonstrate the method on three Pareto optimal VMAT plans created for a demanding paraspinal case, where the tumor surrounds the spinal cord. The results show that the leaf averaged plans yield dose distributions that approximate the dosimetric averages of the precomputed Pareto optimal plans well. Conclusions: The proposed method enables the navigation of deliverable Pareto optimal plans directly, i.e., interactive multicriteria exploration of deliverable sliding window IMRT and VMAT plans, eliminating the need for a sequencing step after navigation and hence the dose degradation that is caused by such a sequencing step.
A New Moving Object Detection Method Based on Frame-difference and Background Subtraction
NASA Astrophysics Data System (ADS)
Guo, Jiajia; Wang, Junping; Bai, Ruixue; Zhang, Yao; Li, Yong
2017-09-01
Although many methods of moving object detection have been proposed, moving object extraction is still at the core of video surveillance. However, in the complex scenes of the real world, false detections, missed detections, and deficiencies resulting from cavities inside the body still exist. In order to solve the problem of incomplete detection of moving objects, a new moving object detection method combining an improved frame-difference and Gaussian mixture background subtraction is proposed in this paper. To make the moving object detection more complete and accurate, image repair and morphological processing techniques, which are spatial compensations, are applied in the proposed method. Experimental results show that our method can effectively eliminate ghosts and noise and fill the cavities of the moving object. Compared to four other moving object detection methods, namely GMM, VIBE, frame-difference, and a method from the literature, the proposed method improves the efficiency and accuracy of detection.
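A bare-bones fusion of frame differencing and Gaussian-mixture background subtraction, much simpler than the proposed method and without its image repair step, can be sketched with OpenCV as below; the input file name, threshold, kernel size, and AND-style mask fusion are assumptions.

```python
# Minimal frame-difference + MOG2 background-subtraction fusion with OpenCV;
# 'video.avi', the threshold, kernel size, and AND fusion rule are illustrative.
import cv2

cap = cv2.VideoCapture("video.avi")          # hypothetical input file
mog2 = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY) if ok else None
while ok:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Frame-difference mask.
    diff = cv2.absdiff(gray, prev_gray)
    _, diff_mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    # Gaussian-mixture background-subtraction mask.
    bg_mask = mog2.apply(frame)
    # Combine the two masks and close small holes (rough spatial compensation).
    combined = cv2.bitwise_and(diff_mask, bg_mask)
    combined = cv2.morphologyEx(combined, cv2.MORPH_CLOSE, kernel)
    prev_gray = gray
cap.release()
```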
Research on the honeycomb restrain layer application to the high power microwave dielectric window
NASA Astrophysics Data System (ADS)
Zhang, Qingyuan; Shao, Hao; Huang, Wenhua; Guo, Letian
2018-01-01
Dielectric window breakdown is an important problem in high power microwave radiation. A honeycomb layer can suppress the multipactor in two directions and thereby restrain dielectric window breakdown. This paper studies the effect of the honeycomb restrain layer on improving the dielectric window power capability. It also studies the multipactor suppression mechanism using electromagnetic particle-in-cell software, gives the design method, and presents the verification experiment. The experimental results indicate that the honeycomb restrain layer can effectively improve the power capability by a factor of two.
Smart glass as the method of improving the energy efficiency of high-rise buildings
NASA Astrophysics Data System (ADS)
Gamayunova, Olga; Gumerova, Eliza; Miloradova, Nadezda
2018-03-01
One question that has to be answered in high-rise buildings is the choice of glazing and its in-service conditions. The contemporary market offers several types of window units, for instance wooden, aluminum, PVC, and combined models. Wooden and PVC windows have become the most widespread and compete closely with each other. More recently, design engineers have begun to choose smart glass. In this article, the advantages and drawbacks of all types of windows are reviewed, and recommendations are given for choosing a window type in order to improve the energy efficiency of buildings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakano, M; Haga, A; Hanaoka, S
2016-06-15
Purpose: The purpose of this study is to propose a new concept for four-dimensional (4D) cone-beam CT (CBCT) reconstruction of non-periodic organ motion using the Time-ordered Chain Graph Model (TCGM), and to compare the reconstructed results with those of previously proposed methods, total variation-based compressed sensing (TVCS) and prior-image constrained compressed sensing (PICCS). Methods: The CBCT reconstruction method introduced in this study consists of maximum a posteriori (MAP) iterative reconstruction combined with a regularization term derived from the TCGM concept, which includes a constraint coming from the images of neighbouring time-phases. The time-ordered image series were concurrently reconstructed in the MAP iterative reconstruction framework. The angular range of projections for each time-phase was 90 degrees for TCGM and PICCS, and 200 degrees for TVCS. Two kinds of projection data, elliptic-cylindrical digital phantom data and two clinical patients' data, were used for reconstruction. The digital phantom contained an air sphere moving 3 cm along the longitudinal axis, and the temporal resolution of each method was evaluated by measuring the penumbral width of the reconstructed moving air sphere. The clinical feasibility of non-periodic time-ordered 4D CBCT reconstruction was also examined using projection data of prostate cancer patients. Results: The results for the reconstructed digital phantom show that the penumbral width of TCGM was the narrowest; the penumbral widths of PICCS and TCGM were 10.6% and 17.4% narrower than that of TVCS, respectively. This suggests that TCGM has better temporal resolution than the others. The patients' CBCT projection data were also reconstructed, and all three reconstructed results showed motion of rectal gas and stool. The TCGM result provided visually clearer and less blurred images. Conclusion: The present study demonstrates that the new concept for 4D CBCT reconstruction, TCGM, combined with the MAP iterative reconstruction framework, enables time-ordered image reconstruction with a narrower time window.
Joint histogram-based cost aggregation for stereo matching.
Min, Dongbo; Lu, Jiangbo; Do, Minh N
2013-10-01
This paper presents a novel method for performing efficient cost aggregation in stereo matching. The cost aggregation problem is reformulated from the perspective of a histogram, giving us the potential to reduce the complexity of the cost aggregation in stereo matching significantly. Unlike previous methods, which have tried to reduce the complexity in terms of the size of an image and a matching window, our approach focuses on reducing the computational redundancy across the search range caused by repeated filtering for all the hypotheses. Moreover, we also reduce the complexity of the window-based filtering through an efficient sampling scheme inside the matching window. The tradeoff between accuracy and complexity is extensively investigated by varying the parameters used in the proposed method. Experimental results show that the proposed method provides high-quality disparity maps with low complexity and outperforms existing local methods. This paper also provides new insights into complexity-constrained stereo-matching algorithm design.
Non-valvular main pulmonary artery vegetation associated with aortopulmonary window.
Unal, M; Tuncer, C; Serçe, K; Bostan, M; Gökçe, M; Erem, C
1995-01-01
We present a 32-year-old female with an aortopulmonary window and vegetation of the non-valvular main pulmonary artery. The aortopulmonary window is a rare congenital disease in which the aorta and pulmonary artery communicate through a defect of variable diameter. The pulmonic valve is the least commonly involved valve in bacterial endocarditis, and no vegetation of the non-valvular main pulmonary artery has been reported in the literature. Colour duplex sonography showed an aortopulmonary window with aortic regurgitation. Magnetic resonance (MR) imaging, which demonstrated the vegetation on the wall of the main pulmonary artery, is a useful and complementary method and can be used for demonstration of congenital and acquired cardiovascular pathologies, including aortopulmonary window and subpulmonic or suprapulmonic vegetations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruemmer, David J; Walton, Miles C
Methods and systems for controlling a plurality of robots through a single user interface include at least one robot display window for each of the plurality of robots with the at least one robot display window illustrating one or more conditions of a respective one of the plurality of robots. The user interface further includes at least one robot control window for each of the plurality of robots with the at least one robot control window configured to receive one or more commands for sending to the respective one of the plurality of robots. The user interface further includes a multi-robot common window comprised of information received from each of the plurality of robots.
Short segment search method for phylogenetic analysis using nested sliding windows
NASA Astrophysics Data System (ADS)
Iskandar, A. A.; Bustamam, A.; Trimarsanto, H.
2017-10-01
For phylogenetic analysis in bioinformatics, the coding DNA sequence (CDS) segment is needed for maximal accuracy. However, analysis of the full CDS costs considerable time and money, so a short segment representative of the CDS, such as the envelope protein segment or the non-structural 3 (NS3) segment, is needed instead. After a sliding window is applied, a short segment better than the envelope protein and NS3 segments is found. This paper discusses a mathematical method for analyzing sequences using nested sliding windows to find a short segment that is representative of the whole genome. The results show that our method can find a short segment whose topology is about 6.57% more representative of the CDS segment than the envelope or NS3 segments.
Experiencing simultanagnosia through windowed viewing of complex social scenes.
Dalrymple, Kirsten A; Birmingham, Elina; Bischof, Walter F; Barton, Jason J S; Kingstone, Alan
2011-01-07
Simultanagnosia is a disorder of visual attention, defined as an inability to see more than one object at once. It has been conceived as being due to a constriction of the visual "window" of attention, a metaphor that we examine in the present article. A simultanagnosic patient (SL) and two non-simultanagnosic control patients (KC and ES) described social scenes while their eye movements were monitored. These data were compared to a group of healthy subjects who described the same scenes under the same conditions as the patients, or through an aperture that restricted their vision to a small portion of the scene. Experiment 1 demonstrated that SL showed unusually low proportions of fixations to the eyes in social scenes, which contrasted with all other participants who demonstrated the standard preferential bias toward eyes. Experiments 2 and 3 revealed that when healthy participants viewed scenes through a window that was contingent on where they looked (Experiment 2) or where they moved a computer mouse (Experiment 3), their behavior closely mirrored that of patient SL. These findings suggest that a constricted window of visual processing has important consequences for how simultanagnosic patients explore their world. Our paradigm's capacity to mimic simultanagnosic behaviors while viewing complex scenes implies that it may be a valid way of modeling simultanagnosia in healthy individuals, providing a useful tool for future research. More broadly, our results support the thesis that people fixate the eyes in social scenes because they are informative to the meaning of the scene. Copyright © 2010 Elsevier B.V. All rights reserved.
A comparison of moving object detection methods for real-time moving object detection
NASA Astrophysics Data System (ADS)
Roshan, Aditya; Zhang, Yun
2014-06-01
Moving object detection has a wide variety of applications, from traffic monitoring, site monitoring, automatic theft identification, and face detection to military surveillance. Many methods have been developed for moving object detection, but it is very difficult to find one that works in all situations and with different types of videos. The purpose of this paper is to evaluate existing moving object detection methods that can be implemented in software on a desktop or laptop for real-time object detection. Several moving object detection methods are noted in the literature, but few of them are suitable for real-time detection, and most of those are further limited by the number of objects and the scene complexity. This paper evaluates the four most commonly used moving object detection methods: the background subtraction technique, the Gaussian mixture model, and wavelet-based and optical-flow-based methods. The four methods were evaluated using two different sets of cameras and two different scenes. The methods were implemented in MATLAB, and the results are compared based on completeness of detected objects, noise, sensitivity to lighting changes, processing time, etc. After comparison, it is observed that the optical-flow-based method took the least processing time and successfully detected the boundaries of moving objects, which implies that it can be implemented for real-time moving object detection.
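The abstract does not include code, but the simplest of the four families it compares, background subtraction, can be illustrated with a short sketch. The following Python fragment (assuming OpenCV is installed; the video path "traffic.avi", the learning rate, and the threshold are placeholder choices, not the paper's settings) maintains a running-average background model and flags pixels that deviate from it:

```python
# Minimal running-average background-subtraction sketch (illustrative only).
import cv2
import numpy as np

cap = cv2.VideoCapture("traffic.avi")   # hypothetical input video
background = None
alpha = 0.02                            # running-average learning rate

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    if background is None:
        background = gray.copy()        # initialise the model with the first frame
        continue
    # Update the running-average background model
    background = (1.0 - alpha) * background + alpha * gray
    # Pixels far from the background model are flagged as moving
    diff = np.abs(gray - background)
    mask = (diff > 25).astype(np.uint8) * 255
    cv2.imshow("moving objects", mask)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```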
Windowed time-reversal music technique for super-resolution ultrasound imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Lianjie; Labyed, Yassin
Systems and methods for super-resolution ultrasound imaging using a windowed and generalized TR-MUSIC algorithm that divides the imaging region into overlapping sub-regions and applies the TR-MUSIC algorithm to the windowed backscattered ultrasound signals corresponding to each sub-region. The algorithm is also structured to account for the ultrasound attenuation in the medium and the finite-size effects of ultrasound transducer elements.
Image interpolation by adaptive 2-D autoregressive modeling and soft-decision estimation.
Zhang, Xiangjun; Wu, Xiaolin
2008-06-01
The challenge of image interpolation is to preserve spatial details. We propose a soft-decision interpolation technique that estimates missing pixels in groups rather than one at a time. The new technique learns and adapts to varying scene structures using a 2-D piecewise autoregressive model. The model parameters are estimated in a moving window in the input low-resolution image. The pixel structure dictated by the learnt model is enforced by the soft-decision estimation process onto a block of pixels, including both observed and estimated. The result is equivalent to that of a high-order adaptive nonseparable 2-D interpolation filter. This new image interpolation approach preserves spatial coherence of interpolated images better than the existing methods, and it produces the best results so far over a wide range of scenes in both PSNR measure and subjective visual quality. Edges and textures are well preserved, and common interpolation artifacts (blurring, ringing, jaggies, zippering, etc.) are greatly reduced.
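As a rough illustration of the moving-window parameter estimation described above (not the authors' implementation), the sketch below fits a small 2-D autoregressive model by least squares inside a local window of a low-resolution image, using the four diagonal neighbours as predictors; the window size and neighbour set are assumptions for illustration:

```python
# Local least-squares fit of 2-D AR coefficients in a moving window (sketch).
import numpy as np

def local_ar_params(img, r, c, half=3):
    """Estimate 4 diagonal-neighbour AR coefficients in a (2*half+1)^2 window."""
    rows, targets = [], []
    for i in range(r - half, r + half + 1):
        for j in range(c - half, c + half + 1):
            # diagonal neighbours of pixel (i, j) act as predictors
            neigh = [img[i - 1, j - 1], img[i - 1, j + 1],
                     img[i + 1, j - 1], img[i + 1, j + 1]]
            rows.append(neigh)
            targets.append(img[i, j])
    A = np.asarray(rows, dtype=float)
    y = np.asarray(targets, dtype=float)
    params, *_ = np.linalg.lstsq(A, y, rcond=None)
    return params   # AR coefficients a1..a4 for this window

rng = np.random.default_rng(0)
lowres = rng.random((64, 64))           # placeholder low-resolution image
print(local_ar_params(lowres, 32, 32))
```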
Salvatierra, Rodrigo Villegas; Zakhidov, Dante; Sha, Junwei; Kim, Nam Dong; Lee, Seoung-Ki; Raji, Abdul-Rahman O; Zhao, Naiqin; Tour, James M
2017-03-28
Here we show that a versatile binary catalyst solution of Fe3O4/AlOx nanoparticles enables homogeneous growth of single to few-walled carbon nanotube (CNT) carpets from three-dimensional carbon-based substrates, moving past existing two-dimensional limited growth methods. The binary catalyst is composed of amorphous AlOx nanoclusters over Fe3O4 crystalline nanoparticles, facilitating the creation of seamless junctions between the CNTs and the underlying carbon platform. The resulting graphene-CNT (GCNT) structure is a high-density CNT carpet ohmically connected to the carbon substrate, an important feature for advanced carbon electronics. As a demonstration of the utility of this approach, we use GCNTs as anodes and cathodes in binder-free lithium-ion capacitors, producing stable devices with high-energy densities (∼120 Wh kg⁻¹), high-power density capabilities (∼20,500 W kg⁻¹ at 29 Wh kg⁻¹), and a large operating voltage window (4.3 to 0.01 V).
Network Analyses for Space-Time High Frequency Wind Data
NASA Astrophysics Data System (ADS)
Laib, Mohamed; Kanevski, Mikhail
2017-04-01
Recently, network science has made an important contribution to the analysis, modelling and visualization of complex time series. Numerous methods have been proposed for constructing networks. This work studies spatio-temporal wind data using networks based on the Granger causality test. Furthermore, a visual comparison is carried out for several data frequencies and different sizes of moving window. The main attention is paid to the temporal evolution of connectivity intensity. The Hurst exponent is applied to the resulting time series in order to explore whether there is a long connectivity memory. The results explore the space-time structure of wind data and can be applied to other environmental data. The dataset used presents a challenging case study: it consists of high-frequency (10-minute) wind data from 120 measuring stations in Switzerland for the period 2012-2013. The distribution of stations covers different geomorphological zones and elevation levels. The results are also compared with the Pearson correlation network.
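A hedged sketch of the core step, building a directed connectivity matrix from pairwise Granger-causality tests inside one moving window, is given below. It uses statsmodels' grangercausalitytests; the synthetic station data, window length, lag order, and p-value threshold are illustrative assumptions rather than the study's settings:

```python
# Moving-window Granger-causality network sketch (illustrative parameters).
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

def window_network(data, start, win=1008, maxlag=3, alpha=0.05):
    """data: (T, n_stations) array; returns a boolean adjacency matrix
    where adj[i, j] = True means series i Granger-causes series j."""
    block = data[start:start + win]
    n = block.shape[1]
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # grangercausalitytests expects columns ordered [effect, cause]
            res = grangercausalitytests(block[:, [j, i]], maxlag=maxlag,
                                        verbose=False)
            pval = res[maxlag][0]["ssr_ftest"][1]
            adj[i, j] = pval < alpha
    return adj

wind = np.random.default_rng(1).normal(size=(5000, 6))  # stand-in for station data
print(window_network(wind, start=0).sum(), "directed links in this window")
```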
Reading time allocation strategies and working memory using rapid serial visual presentation.
Busler, Jessica N; Lazarte, Alejandro A
2017-09-01
Rapid serial visual presentation (RSVP) is a useful method for controlling the timing of text presentations and studying how readers' characteristics, such as working memory (WM) and reading strategies for time allocation, influence text recall. In the current study, a modified version of RSVP (Moving Window RSVP [MW-RSVP]) was used to induce longer pauses at the ends of clauses and ends of sentences when reading texts with multiple embedded clauses. We studied if WM relates to allocation of time at end of clauses or sentences in a self-paced reading task and in 2 MW-RSVP reading conditions (Constant MW-RSVP and Paused MW-RSVP) in which the reading rate was kept constant or pauses were induced. Higher WM span readers were more affected by the restriction of time allocation in the MW-RSVP conditions. In addition, the recall of both higher and lower WM-span readers benefited from the paused MW-RSVP presentation. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
U.S. stock market interaction network as learned by the Boltzmann machine
Borysov, Stanislav S.; Roudi, Yasser; Balatsky, Alexander V.
2015-12-07
Here, we study historical dynamics of joint equilibrium distribution of stock returns in the U.S. stock market using the Boltzmann distribution model being parametrized by external fields and pairwise couplings. Within Boltzmann learning framework for statistical inference, we analyze historical behavior of the parameters inferred using exact and approximate learning algorithms. Since the model and inference methods require use of binary variables, effect of this mapping of continuous returns to the discrete domain is studied. The presented results show that binarization preserves the correlation structure of the market. Properties of distributions of external fields and couplings as well as the market interaction network and industry sector clustering structure are studied for different historical dates and moving window sizes. We demonstrate that the observed positive heavy tail in distribution of couplings is related to the sparse clustering structure of the market. We also show that discrepancies between the model’s parameters might be used as a precursor of financial instabilities.
NASA Astrophysics Data System (ADS)
Pavlis, Terry; Hurtado, Jose; Langford, Richard; Serpa, Laura
2014-05-01
Although many geologists refuse to admit it, it is time to put paper-based geologic mapping into the historical archives and move to the full potential of digital mapping techniques. For our group, flat map digital geologic mapping is now a routine operation in both research and instruction. Several software options are available, and basic proficiency with the software can be learned in a few hours of instruction and practice. The first practical field GIS software, ArcPad, remains a viable, stable option on Windows-based systems. However, the vendor seems to be moving away from ArcPad in favor of mobile software solutions that are difficult to implement without GIS specialists. Thus, we have pursued a second software option based on the open source program QGIS. Our QGIS system uses the same shapefile-centric data structure as our ArcPad system, including similar pop-up data entry forms and generic graphics for easy data management in the field. The advantage of QGIS is that the same software runs on virtually all common platforms except iOS, although the Android version remains unstable as of this writing. A third software option we are experimenting with for flat map-based field work is Fieldmove, a derivative of the 3D-capable program Move developed by Midland Valley. Our initial experiments with Fieldmove are positive, particularly with the new, inexpensive (<300Euros) Windows tablets. However, the lack of flexibility in data structure makes for cumbersome workflows when trying to interface our existing shapefile-centric data structures to Move. Nonetheless, in spring 2014 we will experiment with full-3D immersion in the field using the full Move software package in combination with ground based LiDAR and photogrammetry. One new workflow suggested by our initial experiments is that field geologists should consider using photogrammetry software to capture 3D visualizations of key outcrops. This process is now straightforward in several software packages, and it affords a previously unheard of potential for communicating the complexity of key exposures. For example, in studies of metamorphic structures we often search for days to find "Rosetta Stone" outcrops that display key geometric relationships. While conventional photographs rarely can capture the essence of the field exposure, capturing a true 3D representation of the exposure with multiple photos from many orientations can solve this communication problem. As spatial databases evolve these 3D models should be readily importable into the database.
Tsuo, S.; Langford, A.A.
1989-03-28
Unwanted build-up of the film deposited on the transparent light-transmitting window of a photochemical vacuum deposition (photo-CVD) chamber is eliminated by flowing an etchant into the part of the photolysis region in the chamber immediately adjacent the window and remote from the substrate and from the process gas inlet. The respective flows of the etchant and the process gas are balanced to confine the etchant reaction to the part of the photolysis region proximate to the window and remote from the substrate. The etchant is preferably one that etches film deposit on the window, does not etch or affect the window itself, and does not produce reaction by-products that are deleterious to either the desired film deposited on the substrate or to the photolysis reaction adjacent the substrate. 3 figs.
Electrochemical Stability of Li10GeP2S12 and Li7La3Zr2O12 Solid Electrolytes
Han, Fudong; Zhu, Yizhou; He, Xingfeng; ...
2016-01-21
The electrochemical stability window of solid electrolyte is overestimated by the conventional experimental method using a Li/electrolyte/inert metal semiblocking electrode because of the limited contact area between solid electrolyte and inert metal. Since the battery is cycled in the overestimated stability window, the decomposition of the solid electrolyte at the interfaces occurs but has been ignored as a cause for high interfacial resistances in previous studies, limiting the performance improvement of the bulk-type solid-state battery despite the decades of research efforts. Thus, there is an urgent need to identify the intrinsic stability window of the solid electrolyte. The thermodynamic electrochemical stability window of solid electrolytes is calculated using first principles computation methods, and an experimental method is developed to measure the intrinsic electrochemical stability window of solid electrolytes using a Li/electrolyte/electrolyte-carbon cell. The most promising solid electrolytes, Li10GeP2S12 and cubic Li-garnet Li7La3Zr2O12, are chosen as the model materials for sulfide and oxide solid electrolytes, respectively. The results provide valuable insights to address the most challenging problems of the interfacial stability and resistance in high-performance solid-state batteries.
An Efficient Adaptive Window Size Selection Method for Improving Spectrogram Visualization.
Nisar, Shibli; Khan, Omar Usman; Tariq, Muhammad
2016-01-01
Short Time Fourier Transform (STFT) is an important technique for the time-frequency analysis of a time varying signal. The basic approach behind it involves the application of a Fast Fourier Transform (FFT) to a signal multiplied with an appropriate window function with fixed resolution. The selection of an appropriate window size is difficult when no background information about the input signal is known. In this paper, a novel empirical model is proposed that adaptively adjusts the window size for a narrow-band signal using a spectrum sensing technique. For wide-band signals, where a fixed time-frequency resolution is undesirable, the approach adapts the constant Q transform (CQT). Unlike the STFT, the CQT provides a varying time-frequency resolution. This results in a high spectral resolution at low frequencies and high temporal resolution at high frequencies. In this paper, a simple but effective switching framework is provided between both STFT and CQT. The proposed method also allows for the dynamic construction of a filter bank according to user-defined parameters. This helps in reducing redundant entries in the filter bank. Results obtained from the proposed method not only improve the spectrogram visualization but also reduce the computational cost, and the method selects the appropriate window length 87.71% of the time.
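The window-size trade-off that motivates the adaptive scheme can be seen in a few lines of Python with SciPy's STFT; the chirp test signal, sampling rate, and the two window lengths below are illustrative choices, not values from the paper:

```python
# STFT window-size trade-off: short window = fine time / coarse frequency,
# long window = the opposite. Illustrative signal and parameters.
import numpy as np
from scipy.signal import stft, chirp

fs = 8000
t = np.arange(0, 2.0, 1.0 / fs)
x = chirp(t, f0=100.0, f1=3000.0, t1=2.0)           # linear chirp test signal

for nperseg in (64, 1024):                           # short vs long window
    f, tt, Z = stft(x, fs=fs, window="hann", nperseg=nperseg)
    df = f[1] - f[0]                                 # frequency-bin spacing
    dt = tt[1] - tt[0]                               # time-frame spacing
    print(f"window={nperseg:5d}: frequency step {df:7.1f} Hz, "
          f"time step {dt * 1e3:6.2f} ms")
```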
2006-08-28
KENNEDY SPACE CENTER, FLA. - Crawler-transporter No. 2 makes its way toward Launch Pad 39B (in the background). The crawler is being moved nearby in the event the mission management team decides to roll back Space Shuttle Atlantis due to Hurricane Ernesto. The hurricane has been forecast on a heading north and east from Cuba, taking it along the eastern coast of Florida. NASA's lighted launch window extends to Sept. 13, but mission managers are hoping to launch on mission STS-115 by Sept. 7 to avoid a conflict with a Russian Soyuz rocket also bound for the International Space Station. The crawler is 131 feet long, 113 feet wide and 20 feet high. It weighs 5.5 million pounds unloaded. The combined weight of crawler, mobile launcher platform and a space shuttle is 12 million pounds. Unloaded, the crawler moves at 2 mph. Loaded, the snail's pace slows to 1 mph. Photo credit: NASA/Kim Shiflett
Unity connecting module moving to new site in SSPF
NASA Technical Reports Server (NTRS)
1998-01-01
In the Space Station Processing Facility (SSPF) Unity is suspended in air as it is moved to a new location in the SSPF. At right, visitors watch through a viewing window, part of the visitor tour at the Center. As the primary payload on mission STS-88, scheduled to launch Dec. 3, 1998, Unity will be mated to the Russian-built Zarya control module, which should already be in orbit at that time. In the SSPF, Unity is undergoing testing such as the Pad Demonstration Test to verify the compatibility of the module with the Space Shuttle, as well as the ability of the astronauts to send and receive commands to Unity from the flight deck of the orbiter, and the common berthing mechanism to which other space station elements will dock. Unity is expected to be ready for installation into the payload canister on Oct. 25, and transported to Launch Pad 39-A on Oct. 27.
Broadband laser ranging precision and accuracy experiments with PDV benchmarking
NASA Astrophysics Data System (ADS)
Catenacci, Jared; Daykin, Ed; Howard, Marylesa; Lalone, Brandon; Miller, Kirk
2017-06-01
Broadband laser ranging (BLR) is a developmental diagnostic designed to measure the precise position of surfaces and particle clouds moving at velocities of several kilometers per second. Recent single stage gas gun experiments were conducted to quantify the precision and accuracy possible with a typical BLR system. For these experiments, the position of a mirrored projectile is measured relative to the location of a stationary optical flat (uncoated window) mounted within the gun catch tank. Projectile velocity is constrained to one-dimensional motion within the gun barrel. A collimating probe is aligned to be orthogonal to both the target window and the mirrored impactor surface. The probe is used to simultaneously measure the position and velocity with a BLR and conventional Photonic Doppler Velocimetry (PDV) system. Since there is a negligible lateral component to the target velocity, coupled with strong signal returns from a mirrored surface, integrating the PDV measurement provides a high fidelity distance measurement reference to which the BLR measurement may be compared.
Apparatus for solar coal gasification
Gregg, D.W.
1980-08-04
Apparatus for using focused solar radiation to gasify coal and other carbonaceous materials is described. Incident solar radiation is focused from an array of heliostats through a window onto the surface of a moving bed of coal, contained within a gasification reactor. The reactor is designed to minimize contact between the window and solids in the reactor. Steam introduced into the gasification reactor reacts with the heated coal to produce gas consisting mainly of carbon monoxide and hydrogen, commonly called synthesis gas, which can be converted to methane, methanol, gasoline, and other useful products. One of the novel features of the invention is the generation of process steam in one embodiment at the rear surface of a secondary mirror used to redirect the focused sunlight. Another novel feature of the invention is the location and arrangement of the array of mirrors on an inclined surface (e.g., a hillside) to provide for direct optical communication of said mirrors and the carbonaceous feed without a secondary redirecting mirror.
Complex Patterns in Financial Time Series Through HIGUCHI’S Fractal Dimension
NASA Astrophysics Data System (ADS)
Grace Elizabeth Rani, T. G.; Jayalalitha, G.
2016-11-01
This paper analyzes the complexity of stock exchanges through fractal theory. Closing price indices of four stock exchanges with different industry sectors are selected. Degree of complexity is assessed through Higuchi’s fractal dimension. Various window sizes are considered in evaluating the fractal dimension. It is inferred that the data considered as a whole represents random walk for all the four indices. Analysis of financial data through windowing procedure exhibits multi-fractality. Attempts to apply moving averages to reduce noise in the data revealed lower estimates of fractal dimension, which was verified using fractional Brownian motion. A change in the normalization factor in Higuchi’s algorithm did improve the results. It is quintessential to focus on rural development to realize a standard and steady growth of economy. Tools must be devised to settle the issues in this regard. Micro level institutions are necessary for the economic growth of a country like India, which would induce a sporadic development in the present global economical scenario.
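For readers unfamiliar with the estimator, a minimal NumPy sketch of Higuchi's fractal dimension applied to moving windows of a price series follows; kmax, the window size, and the synthetic random-walk series are illustrative choices rather than the paper's settings:

```python
# Higuchi fractal dimension over moving windows of a synthetic price series.
import numpy as np

def higuchi_fd(x, kmax=10):
    n = len(x)
    lk = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            # normalised curve length of the sub-series starting at m with step k
            length = np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / ((len(idx) - 1) * k * k)
            lengths.append(length)
        lk.append(np.mean(lengths))
    # slope of log L(k) versus log(1/k) estimates the fractal dimension
    coeffs = np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(lk), 1)
    return coeffs[0]

prices = np.cumsum(np.random.default_rng(42).normal(size=2048))  # random walk
window = 512
fds = [higuchi_fd(prices[i:i + window]) for i in range(0, len(prices) - window, 128)]
print(np.round(fds, 3))   # values near 1.5 are expected for a random walk
```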
Development of an external beam nuclear microprobe on the Aglae facility of the Louvre museum
NASA Astrophysics Data System (ADS)
Calligaro, T.; Dran, J.-C.; Ioannidou, E.; Moignard, B.; Pichon, L.; Salomon, J.
2000-03-01
The external beam line of our facility has been recently equipped with the focusing system previously mounted on a classical nuclear microprobe. When using a 0.1 μm thick Si3N4 foil for the exit window and flowing helium on the sample under analysis, a beam spot as small as 10 μm is attainable at a distance of 3 mm from the window. Elemental micromapping is performed by mechanical scanning. An electronic device has been designed which allows XY scanning by moving the sample under the beam by steps down to 0.1 μm. Beam monitoring is carried out by means of the weak X-ray signal emitted by the exit foil and detected by a specially designed Si(Li) detector cooled by Peltier effect. The characteristics of external beams of protons and alpha particles are evaluated by means of resonance scanning and elemental mapping of a grid. An example of application is presented, dealing with elemental micro-mapping of inclusions in gemstones.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siegfried, Matthew J.; Radford, Daniel R.; Huffman, Russell K.
An electrostatic particle collector may generally include a housing having sidewalls extending lengthwise between a first end and a second end. The housing may define a plate slot that extends heightwise within the housing between a top end and a bottom end. The housing may further include a plate access window that provides access to the bottom end of the plate slot. The collector may also include a collector plate configured to be installed within the plate slot that extends heightwise between a top edge and a bottom edge. Additionally, when the collector plate is installed within the plate slot, the bottom edge of the collector plate may be accessible from an exterior of the housing via the plate access window so as to allow the bottom edge of the collector plate to be moved relative to the housing to facilitate removal of the collector plate from the housing.
2013-06-15
ISS036-E-008165 (15 June 2013) --- Expedition 36 Flight Engineer Fyodor Yurchikhin with Russia's Federal Space Agency (Roscosmos) takes pictures of a highly anticipated event from a window in the Pirs module on the International Space Station. His electronic still camera is equipped with a 400mm lens to capture distant images of the European Space Agency's Automated Transfer Vehicle-4 (ATV-4) “Albert Einstein.” The spacecraft eventually moved in much closer and successfully docked to the orbital outpost at 2:07 GMT, June 15, 2013, following a ten-day period of free-flight.
21. ORE DOCK, LOOKING SOUTHWEST. THIS VIEW SHOWS THE WEST ...
21. ORE DOCK, LOOKING SOUTHWEST. THIS VIEW SHOWS THE WEST END OF THE DOCK. EMPTY CARS ARE MOVED IN FROM THE WEST BY 'SHUNT CARS,' PUT INTO PLACE AS NEEDED BENEATH THE HULETTS, FILLED, THEN SHUNTED TO THE EAST END OF THE YARD WHERE THEY ARE MADE UP INTO TRAINS. THE POWER HOUSE (WITH TALL ARCHED WINDOWS) AND THE TWO-STORY DOCK OFFICE CAN BE SEEN HERE. - Pennsylvania Railway Ore Dock, Lake Erie at Whiskey Island, approximately 1.5 miles west of Public Square, Cleveland, Cuyahoga County, OH
2008-04-01
V. X. D. Yang, M. L. Gordon, A. Mok, Y. Zhao, Z. Chen, R. Cobbold, B. Wilson, and I. Vitkin, "Improved phase-resolved optical Doppler tomography..." [remainder of this record is garbled figure-caption text describing boundaries marked according to visual measurement, the result of moving circular window filtering, the ODT image for the same OCT scan, and a magnified view of the marked region].
The predictive power of singular value decomposition entropy for stock market dynamics
NASA Astrophysics Data System (ADS)
Caraiani, Petre
2014-01-01
We use a correlation-based approach to analyze financial data from the US stock market, both daily and monthly observations from the Dow Jones. We compute the entropy based on the singular value decomposition of the correlation matrix for the components of the Dow Jones Industrial Index. Based on a moving window, we derive time varying measures of entropy for both daily and monthly data. We find that the entropy has a predictive ability with respect to stock market dynamics as indicated by the Granger causality tests.
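A compact sketch of the moving-window singular value decomposition entropy described above is shown below; the synthetic returns, window length, and step are placeholders for the Dow Jones data used in the paper:

```python
# Moving-window SVD entropy of a return-correlation matrix (illustrative data).
import numpy as np

def svd_entropy(returns_window):
    """returns_window: (T, n_assets) array of returns inside one window."""
    corr = np.corrcoef(returns_window, rowvar=False)
    s = np.linalg.svd(corr, compute_uv=False)
    p = s / s.sum()                     # normalised singular-value spectrum
    return -np.sum(p * np.log(p))       # Shannon entropy of that spectrum

rng = np.random.default_rng(7)
returns = rng.normal(size=(2500, 30))   # stand-in for 30 index components
window = 250                            # roughly one year of daily data
entropy = [svd_entropy(returns[i:i + window])
           for i in range(0, len(returns) - window, 20)]
print(np.round(entropy, 3))
```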
2012-07-27
ISS032-E-010609 (27 July 2012) --- As seen through windows in the Cupola, the station's Canadarm2 robotic arm moves toward the unpiloted Japan Aerospace Exploration Agency (JAXA) H-II Transfer Vehicle (HTV-3) as it approaches the International Space Station. NASA astronaut Joe Acaba and Japan Aerospace Exploration Agency astronaut Aki Hoshide, both Expedition 32 flight engineers, used the station's robotic arm to capture and berth the HTV-3 to the Earth-facing port of the station's Harmony node. The attachment was completed at 10:34 a.m. (EDT) on July 27, 2012.
Casing window milling with abrasive fluid jet
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vestavik, O.M.; Fidtje, T.H.; Faure, A.M.
1995-12-31
Methods for through-tubing re-entry drilling of multilateral wells have a large potential for increasing hydrocarbon production and total recovery. One of the bottlenecks of this technology is initiation of the side-track by milling a window in the casing downhole. A new approach to this problem has been investigated in a joint industry project. An experimental set-up has been built for milling a 4 inch window in a 7 inch steel casing at surface in the laboratory. A specially designed bit developed at RIF using abrasive jet cutting technology has been used for the window milling. The bit has an abrasive jet beam which is always directed in the desired side-track direction, even if the bit is rotating uniformly. The bit performs the milling with a combined mechanical and hydraulic jet action. The method has been successfully demonstrated. The experiments have shown that the window milling can be performed with very low WOB and torque, and that only small side forces are required to perform the operation. Casing milling has been performed without a whipstock; a cement plug has been the only support for the tool. The tests indicate that milling operations can be performed more efficiently, with less time and cost, than with conventional techniques. However, the method still needs some development of the downhole motor for coiled tubing applications. The method can be used both for milling and drilling, giving the advantage of improved rate of penetration, improved bit life and increased horizontal reach. The method is planned to be demonstrated downhole in the near future.
Thin and open vessel windows for intra-vital fluorescence imaging of murine cochlear blood flow.
Shi, Xiaorui; Zhang, Fei; Urdang, Zachary; Dai, Min; Neng, Lingling; Zhang, Jinhui; Chen, Songlin; Ramamoorthy, Sripriya; Nuttall, Alfred L
2014-07-01
Normal microvessel structure and function in the cochlea is essential for maintaining the ionic and metabolic homeostasis required for hearing function. Abnormal cochlear microcirculation has long been considered an etiologic factor in hearing disorders. A better understanding of cochlear blood flow (CoBF) will enable more effective amelioration of hearing disorders that result from aberrant blood flow. However, establishing the direct relationship between CoBF and other cellular events in the lateral wall and response to physio-pathological stress remains a challenge due to the lack of feasible interrogation methods and difficulty in accessing the inner ear. Here we report on new methods for studying the CoBF in a mouse model using a thin or open vessel-window in combination with fluorescence intra-vital microscopy (IVM). An open vessel-window enables investigation of vascular cell biology and blood flow permeability, including pericyte (PC) contractility, bone marrow cell migration, and endothelial barrier leakage, in wild type and fluorescent protein-labeled transgenic mouse models with high spatial and temporal resolution. Alternatively, the thin vessel-window method minimizes disruption of the homeostatic balance in the lateral wall and enables study of CoBF under relatively intact physiological conditions. A thin vessel-window method can also be used for time-based studies of physiological and pathological processes. Although the small size of the mouse cochlea makes surgery difficult, the methods are sufficiently developed for studying the structural and functional changes in CoBF under normal and pathological conditions. Copyright © 2014 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, L; Shen, C; Wang, J
Purpose: To reduce cone beam CT (CBCT) imaging dose, we previously proposed a progressive dose control (PDC) scheme to employ temporal correlation between CBCT images at different fractions for image quality enhancement. A temporal non-local means (TNLM) method was developed to enhance quality of a new low-dose CBCT using an existing high-quality CBCT. To enhance a voxel value, the TNLM method searches for similar voxels in a window. Due to patient deformation between the two CBCTs, a large searching window was required, reducing image quality and computational efficiency. This abstract proposes a deformation-assisted TNLM (DA-TNLM) method to solve this problem. Methods: For a low-dose CBCT to be enhanced using a high-quality CBCT, we first performed deformable image registration between the low-dose CBCT and the high-quality CBCT to approximately establish voxel correspondence between the two. A searching window for a voxel was then set based on the deformation vector field. Specifically, the search window for each voxel was shifted by the deformation vector. A TNLM step was then applied using only voxels within this determined window to correct image intensity in the low-dose CBCT. Results: We have tested the proposed scheme on simulated CIRS phantom data and real patient data. The CIRS phantom was scanned on a Varian on-board imaging CBCT system with couch shifting and dose reduction each time. The real patient data was acquired in four fractions with dose reduced from standard CBCT dose to 12.5% of standard dose. It was found that the DA-TNLM method can reduce total dose by over 75% on average in the first four fractions. Conclusion: We have developed a PDC scheme which can enhance the quality of images scanned at low dose using a DA-TNLM method. Tests in phantom and patient studies demonstrated promising results.
NASA Astrophysics Data System (ADS)
Zboril, Ondrej; Nedoma, Jan; Cubik, Jakub; Novak, Martin; Bednarek, Lukas; Fajkus, Marcel; Vasinek, Vladimir
2016-04-01
Interferometric sensors are highly accurate and sensitive sensors whose extreme sensitivity allows them to detect vibration and acoustic signals. This paper describes a new implementation of a Mach-Zehnder interferometer for sensing vibrations caused by touching window panes. The window panes are part of plastic windows in which the reference arm of the interferometer is mounted and isolated inside the frame, while the measuring arm of the interferometer is fixed to the window pane and mounted under the cover of the window frame. This keeps the optical fiber out of sight, and the arrangement forms the basis of the safety system. The vibration sensor is constructed from standard communication-network elements: optical fiber according to G.652D and 1x2 splitters with a 1:1 dividing ratio. The interferometer operated at a wavelength of 1550 nm. The paper analyses the sensitivity of the window over a 12x12 matrix of measuring points and specifies the sensitivity distribution of the window pane.
NASA Astrophysics Data System (ADS)
Whitmore, Alexander Jason
Concentrating solar power systems are currently the predominant solar power technology for generating electricity at the utility scale. The central receiver system, which is a concentrating solar power system, uses a field of mirrors to concentrate solar radiation onto a receiver where a working fluid is heated to drive a turbine. Current central receiver systems operate on a Rankine cycle, which has a large demand for cooling water. This demand for water presents a challenge for the current central receiver systems as the ideal locations for solar power plants have arid climates. An alternative to the current receiver technology is the small particle receiver. The small particle receiver has the potential to produce working fluid temperatures suitable for use in a Brayton cycle which can be more efficient when pressurized to 0.5 MPa. Using a fused quartz window allows solar energy into the receiver while maintaining a pressurized small particle receiver. In this thesis, a detailed numerical investigation for a spectral, three dimensional, cylindrical glass window for a small particle receiver was performed. The window is 1.7 meters in diameter and 0.0254 meters thick. There are three Monte Carlo Ray Trace codes used within this research. The first MCRT code, MIRVAL, was developed by Sandia National Laboratory and modified by a fellow San Diego State University colleague Murat Mecit. This code produces the solar rays on the exterior surface of the window. The second MCRT code was developed by Steve Ruther and Pablo Del Campo. This code models the small particle receiver, which creates the infrared spectral direction flux on the interior surface of the window used in this work. The third MCRT, developed for this work, is used to model radiation heat transfer within the window itself and is coupled to an energy equation solver to produce a temperature distribution. The MCRT program provides a source term to the energy equation. This in turn, produces a new temperature field for the MCRT program; together the equations are solved iteratively. These iterations repeat until convergence is reached for a steady state temperature field. The energy equation was solved using a finite volume method. The window's thermal conductivity is modeled as a function of temperature. This thermal model is used to investigate the effects of different materials, receiver geometries, interior convection coefficients and exterior convection coefficients. To prevent devitrification and the ultimate failure of the window, the window needs to stay below the devitrification temperature of the material. In addition, the temperature gradients within the window need to be kept to a minimum to prevent thermal stresses. A San Diego State University colleague E-Fann Saung uses these temperature maps to insure that the mounting of the window does not produce thermal stresses which can cause cracking in the brittle fused quartz. The simulations in this thesis show that window temperatures are below the devitrification temperature of the window when there are cooling jets on both surfaces of the window. Natural convection on the exterior window surface was explored and it does not provide adequate cooling; therefore forced convection is required. Due to the low thermal conductivity of the window, the edge mounting thermal boundary condition has little effect on the maximum temperature of the window. 
The simulations also showed that the solar input flux absorbed less than 1% of the incoming radiation while the window absorbed closer to 20% of the infrared radiation emitted by the receiver. The main source of absorbed power in the window is located directly on the interior surface of the window where the infrared radiation is absorbed. The geometry of the receiver has a large impact on the amount of emitted power which reached the interior surface of the window, and using a conical shaped receiver dramatically reduced the receiver's infrared flux on the window. The importance of internal emission is explored within this research. Internal emission produces a more even emission field throughout the receiver than applying radiation surface emission only. Due to a majority of the infrared receiver re-radiation being absorbed right at the interior surface, the surface emission only approximation method produces lower maximum temperatures.
Calculating the Effect of External Shading on the Solar Heat Gain Coefficient of Windows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kohler, Christian; Shukla, Yash; Rawal, Rajan
Current prescriptive building codes have limited ways to account for the effect of solar shading, such as overhangs and awnings, on window solar heat gains. We propose two new indicators, the adjusted Solar Heat Gain Coefficient (aSHGC) which accounts for external shading while calculating the SHGC of a window, and a weighted SHGC (SHGCw) which provides a seasonal SHGC weighted by solar intensity. We demonstrate a method to calculate these indices using existing tools combined with additional calculations. The method is demonstrated by calculating the effect of an awning on a clear double glazing in New Delhi.
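As a loose illustration of the weighting idea (not the proposed aSHGC/SHGCw procedure itself), the snippet below computes a solar-intensity-weighted SHGC from an hourly SHGC profile; both the irradiance curve and the hourly SHGC values are made-up placeholders for what a shading calculation would supply:

```python
# Solar-intensity-weighted SHGC sketch with placeholder hourly data.
import numpy as np

hours = np.arange(24)
irradiance = np.clip(800 * np.sin(np.pi * (hours - 6) / 12), 0, None)   # W/m2, daytime only
shgc_hourly = np.where((hours >= 11) & (hours <= 15), 0.25, 0.60)       # awning shades midday (assumed)

# Weighted SHGC: hourly SHGC weighted by incident solar intensity
shgc_w = np.sum(shgc_hourly * irradiance) / np.sum(irradiance)
print(f"solar-intensity-weighted SHGC = {shgc_w:.3f}")
```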
Continuation of research into software for space operations support, volume 1
NASA Technical Reports Server (NTRS)
Collier, Mark D.; Killough, Ronnie; Martin, Nancy L.
1990-01-01
A prototype workstation executive called the Hardware Independent Software Development Environment (HISDE) was developed. Software technologies relevant to workstation executives were researched and evaluated, and HISDE was used as a test bed for prototyping efforts. New X Windows software concepts and technology were introduced into workstation executives and related applications. The four research efforts performed were: (1) research into the usability and efficiency of Motif (an X Windows based graphical user interface), which consisted of converting the existing Athena widget based HISDE user interface to Motif, demonstrating the usability of Motif and providing insight into the level of effort required to translate an application from one widget set to another; (2) prototyping a real-time data display widget, which consisted of researching methods for, and prototyping the selected method of, displaying textual values in an efficient manner; (3) an X Windows performance evaluation, which consisted of a series of performance measurements demonstrating the ability of low-level X Windows to display textual information; (4) converting the Display Manager, the application used by NASA for data display during operational mode, to X Window/Motif.
Dobramysl, U; Holcman, D
2018-02-15
Is it possible to recover the position of a source from the steady-state fluxes of Brownian particles to small absorbing windows located on the boundary of a domain? To address this question, we develop a numerical procedure to avoid tracking Brownian trajectories in the entire infinite space. Instead, we generate particles near the absorbing windows, computed from the analytical expression of the exit probability. When the Brownian particles are generated by a steady-state gradient at a single point, we compute asymptotically the fluxes to small absorbing holes distributed on the boundary of half-space and on a disk in two dimensions, which agree with stochastic simulations. We also derive an expression for the splitting probability between small windows using the matched asymptotic method. Finally, when there are more than two small absorbing windows, we show how to reconstruct the position of the source from the diffusion fluxes. The present approach provides a computational first principle for the mechanism of sensing a gradient of diffusing particles, a ubiquitous problem in cell biology.
Three-dimensional laser window formation for industrial application
NASA Technical Reports Server (NTRS)
Verhoff, Vincent G.; Kowalski, David
1993-01-01
The NASA Lewis Research Center has developed and implemented a unique process for forming flawless three-dimensional, compound-curvature laser windows to extreme accuracies. These windows represent an integral component of specialized nonintrusive laser data acquisition systems that are used in a variety of compressor and turbine research testing facilities. These windows are molded to the flow surface profile of turbine and compressor casings and are required to withstand extremely high pressures and temperatures. This method of glass formation could also be used to form compound-curvature mirrors that would require little polishing and for a variety of industrial applications, including research view ports for testing devices and view ports for factory machines with compound-curvature casings. Currently, sodium-alumino-silicate glass is recommended for three-dimensional laser windows because of its high strength due to chemical strengthening and its optical clarity. This paper discusses the main aspects of three-dimensional laser window formation. It focuses on the unique methodology and the peculiarities that are associated with the formation of these windows.
Off-Line Quality Control In Integrated Circuit Fabrication Using Experimental Design
NASA Astrophysics Data System (ADS)
Phadke, M. S.; Kackar, R. N.; Speeney, D. V.; Grieco, M. J.
1987-04-01
Off-line quality control is a systematic method of optimizing production processes and product designs. It is widely used in Japan to produce high quality products at low cost. The method was introduced to us by Professor Genichi Taguchi who is a Deming-award winner and a former Director of the Japanese Academy of Quality. In this paper we will i) describe the off-line quality control method, and ii) document our efforts to optimize the process for forming contact windows in 3.5 μm CMOS circuits fabricated in the Murray Hill Integrated Circuit Design Capability Laboratory. In the fabrication of integrated circuits it is critically important to produce contact windows of size very near the target dimension. Windows which are too small or too large lead to loss of yield. The off-line quality control method has improved both the process quality and productivity. The variance of the window size has been reduced by a factor of four. Also, processing time for window photolithography has been substantially reduced. The key steps of off-line quality control are: i) Identify important manipulatable process factors and their potential working levels. ii) Perform fractional factorial experiments on the process using orthogonal array designs. iii) Analyze the resulting data to determine the optimum operating levels of the factors. Both the process mean and the process variance are considered in this analysis. iv) Conduct an additional experiment to verify that the new factor levels indeed give an improvement.
Kwon, Sangil; Park, Yonghee; Park, Junhong; Kim, Jeongsoo; Choi, Kwang-Ho; Cha, Jun-Seok
2017-01-15
This paper presents the on-road nitrogen oxides (NOx) emissions measurements from Euro 6 light-duty diesel vehicles using a portable emissions measurement system on the predesigned test routes in the metropolitan area of Seoul, Korea. Six diesel vehicles were tested and the NOx emissions results were analyzed according to the driving routes, driving conditions, data analysis methods, and ambient temperatures. Total NOx emissions for route 1, which has higher driving severity than route 2, differed by -4-60% from those for route 2. The NOx emissions when the air conditioner (AC) was used were higher by 68% and 85%, on average, for routes 1 and 2, respectively, compared to when the AC was not used. The analytical results for NOx emissions by the moving averaging window method were higher by 2-31% compared to the power binning method. NOx emissions at lower ambient temperatures (0-5°C) were higher by 82-192% compared to those at higher ambient temperatures (15-20°C). This result shows that performance improvements of exhaust gas recirculation and the NOx after-treatment system will be needed at lower ambient temperatures. Copyright © 2016 Elsevier B.V. All rights reserved.
Two-dimensional correlation spectroscopy — Biannual survey 2007-2009
NASA Astrophysics Data System (ADS)
Noda, Isao
2010-06-01
The publication activities in the field of 2D correlation spectroscopy are surveyed with the emphasis on papers published during the last two years. Pertinent review articles and conference proceedings are discussed first, followed by the examination of noteworthy developments in the theory and applications of 2D correlation spectroscopy. Specific topics of interest include Pareto scaling, analysis of randomly sampled spectra, 2D analysis of data obtained under multiple perturbations, evolution of 2D spectra along additional variables, comparison and quantitative analysis of multiple 2D spectra, orthogonal sample design to eliminate interfering cross peaks, quadrature orthogonal signal correction and other data transformation techniques, data pretreatment methods, moving window analysis, extension of kernel and global phase angle analysis, covariance and correlation coefficient mapping, variant forms of sample-sample correlation, and different display methods. Various static and dynamic perturbation methods used in 2D correlation spectroscopy, e.g., temperature, composition, chemical reactions, H/D exchange, physical phenomena like sorption, diffusion and phase transitions, optical and biological processes, are reviewed. Analytical probes used in 2D correlation spectroscopy include IR, Raman, NIR, NMR, X-ray, mass spectrometry, chromatography, and others. Application areas of 2D correlation spectroscopy are diverse, encompassing synthetic and natural polymers, liquid crystals, proteins and peptides, biomaterials, pharmaceuticals, food and agricultural products, solutions, colloids, surfaces, and the like.
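For orientation, a hedged sketch of the generalized 2-D correlation formalism that underlies most of the surveyed work (synchronous and asynchronous spectra via the Hilbert-Noda matrix) is given below; the simulated spectra are placeholders, and the many refinements listed above (scaling, moving-window variants, etc.) are omitted:

```python
# Generalized 2-D correlation spectra from a perturbation-dependent spectral series.
import numpy as np

def two_d_correlation(Y):
    """Y: (m_perturbations, n_wavenumbers) spectra measured under a perturbation."""
    m = Y.shape[0]
    dyn = Y - Y.mean(axis=0)                      # dynamic (mean-centred) spectra
    sync = dyn.T @ dyn / (m - 1)                  # synchronous spectrum
    # Hilbert-Noda transformation matrix for the asynchronous spectrum
    j, k = np.meshgrid(np.arange(m), np.arange(m), indexing="ij")
    with np.errstate(divide="ignore"):
        N = np.where(j == k, 0.0, 1.0 / (np.pi * (k - j)))
    asyn = dyn.T @ N @ dyn / (m - 1)
    return sync, asyn

wavenumbers = np.linspace(1000, 1800, 200)
spectra = np.array([np.exp(-((wavenumbers - 1450 - 5 * t) ** 2) / 800) for t in range(15)])
sync, asyn = two_d_correlation(spectra)
print(sync.shape, asyn.shape)
```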
Measurement of the noise power spectrum in digital x-ray detectors
NASA Astrophysics Data System (ADS)
Aufrichtig, Richard; Su, Yu; Cheng, Yu; Granfors, Paul R.
2001-06-01
The noise power spectrum, NPS, is a key imaging property of a detector and one of the principal quantities needed to compute the detective quantum efficiency. NPS is measured by computing the Fourier transform of flat field images. Different measurement methods are investigated and evaluated with images obtained from an amorphous silicon flat panel x-ray imaging detector. First, the influence of fixed pattern structures is minimized by appropriate background corrections. For a given data set the effect of using different types of windowing functions is studied. Also different window sizes and amounts of overlap between windows are evaluated and compared to theoretical predictions. Results indicate that measurement error is minimized when applying overlapping Hanning windows on the raw data. Finally it is shown that radial averaging is a useful method of reducing the two-dimensional noise power spectrum to one dimension.
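A simplified sketch of this measurement procedure, overlapping ROIs from background-corrected flat-field images, a 2-D Hanning window, and averaging of squared Fourier magnitudes, is shown below; the ROI size, overlap, pixel pitch, normalization details, and synthetic noise images are illustrative assumptions:

```python
# Simplified NPS estimate from flat-field images using overlapping Hanning-windowed ROIs.
import numpy as np

def nps_2d(flat_images, roi=128, step=64, pixel_pitch=0.2):
    """flat_images: list of 2-D flat-field exposures (same size); pitch in mm."""
    hann = np.outer(np.hanning(roi), np.hanning(roi))
    win_norm = np.mean(hann ** 2)                 # compensates the window's power loss
    spectra = []
    for img in flat_images:
        detrended = img - img.mean()              # crude fixed-pattern/background removal
        for r in range(0, img.shape[0] - roi + 1, step):
            for c in range(0, img.shape[1] - roi + 1, step):
                block = detrended[r:r + roi, c:c + roi] * hann
                spectra.append(np.abs(np.fft.fft2(block)) ** 2)
    nps = np.mean(spectra, axis=0) * pixel_pitch ** 2 / (roi * roi * win_norm)
    return np.fft.fftshift(nps)

rng = np.random.default_rng(3)
flats = [1000 + rng.normal(scale=10, size=(512, 512)) for _ in range(4)]
print(nps_2d(flats).shape)
```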
Analysis of oil-pipeline distribution of multiple products subject to delivery time-windows
NASA Astrophysics Data System (ADS)
Jittamai, Phongchai
This dissertation defines the operational problems of, and develops solution methodologies for, a distribution of multiple products into oil pipeline subject to delivery time-windows constraints. A multiple-product oil pipeline is a pipeline system composing of pipes, pumps, valves and storage facilities used to transport different types of liquids. Typically, products delivered by pipelines are petroleum of different grades moving either from production facilities to refineries or from refineries to distributors. Time-windows, which are generally used in logistics and scheduling areas, are incorporated in this study. The distribution of multiple products into oil pipeline subject to delivery time-windows is modeled as multicommodity network flow structure and mathematically formulated. The main focus of this dissertation is the investigation of operating issues and problem complexity of single-source pipeline problems and also providing solution methodology to compute input schedule that yields minimum total time violation from due delivery time-windows. The problem is proved to be NP-complete. The heuristic approach, a reversed-flow algorithm, is developed based on pipeline flow reversibility to compute input schedule for the pipeline problem. This algorithm is implemented in no longer than O(T·E) time. This dissertation also extends the study to examine some operating attributes and problem complexity of multiple-source pipelines. The multiple-source pipeline problem is also NP-complete. A heuristic algorithm modified from the one used in single-source pipeline problems is introduced. This algorithm can also be implemented in no longer than O(T·E) time. Computational results are presented for both methodologies on randomly generated problem sets. The computational experience indicates that reversed-flow algorithms provide good solutions in comparison with the optimal solutions. Only 25% of the problems tested were more than 30% greater than optimal values and approximately 40% of the tested problems were solved optimally by the algorithms.
Addressing scale dependence in roughness and morphometric statistics derived from point cloud data.
NASA Astrophysics Data System (ADS)
Buscombe, D.; Wheaton, J. M.; Hensleigh, J.; Grams, P. E.; Welcker, C. W.; Anderson, K.; Kaplinski, M. A.
2015-12-01
The heights of natural surfaces can be measured with such spatial density that almost the entire spectrum of physical roughness scales can be characterized, down to the morphological form and grain scales. With an ability to measure 'microtopography' comes a demand for analytical/computational tools for spatially explicit statistical characterization of surface roughness. Detrended standard deviation of surface heights is a popular means to create continuous maps of roughness from point cloud data, using moving windows and reporting window-centered statistics of variations from a trend surface. If 'roughness' is the statistical variation in the distribution of relief of a surface, then 'texture' is the frequency of change and spatial arrangement of roughness. The variance in surface height as a function of frequency obeys a power law. In consequence, roughness is dependent on the window size through which it is examined, which has a number of potential disadvantages: 1) the choice of window size becomes crucial, and obstructs comparisons between data; 2) if windows are large relative to multiple roughness scales, it is harder to discriminate between those scales; 3) if roughness is not scaled by the texture length scale, information on the spacing and clustering of roughness `elements' can be lost; and 4) such practice is not amenable to models describing the scattering of light and sound from rough natural surfaces. We discuss the relationship between roughness and texture. Some useful parameters which scale vertical roughness to characteristic horizontal length scales are suggested, with examples of bathymetric point clouds obtained using multibeam from two contrasting riverbeds, namely those of the Colorado River in Grand Canyon, and the Snake River in Hells Canyon. Such work, aside from automated texture characterization and texture segmentation, roughness and grain size calculation, might also be useful for feature detection and classification from point clouds.
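A minimal sketch of window-centred detrended roughness on a gridded surface follows: within each moving window a plane is fitted by least squares and the standard deviation of the residuals is reported at the window centre. The grid, window half-width, and synthetic surface are illustrative:

```python
# Window-centred detrended standard deviation of surface heights (sketch).
import numpy as np

def detrended_roughness(z, half=5):
    rows, cols = z.shape
    rough = np.full_like(z, np.nan, dtype=float)
    jj, ii = np.meshgrid(np.arange(-half, half + 1), np.arange(-half, half + 1))
    A = np.column_stack([ii.ravel(), jj.ravel(), np.ones(ii.size)])  # plane design matrix
    for r in range(half, rows - half):
        for c in range(half, cols - half):
            win = z[r - half:r + half + 1, c - half:c + half + 1].ravel()
            coef, *_ = np.linalg.lstsq(A, win, rcond=None)
            rough[r, c] = np.std(win - A @ coef)    # detrended standard deviation
    return rough

rng = np.random.default_rng(11)
surface = np.cumsum(np.cumsum(rng.normal(size=(80, 80)), axis=0), axis=1) / 50
print(np.nanmean(detrended_roughness(surface)))
```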
Sun, S P; Lu, W; Lei, Y B; Men, X M; Zuo, B; Ding, S G
2017-08-07
Objective: To discuss the prediction of round window (RW) visibility in cochlear implantation (CI) with temporal bone high-resolution computed tomography (HRCT). Methods: From January 2013 to January 2017, 130 cases that underwent both HRCT and CI in our hospital were analyzed. The distance from the facial nerve to the posterior canal wall (FWD), the angle between the facial nerve and the inner margin of the round window (FRA), and the angle between the facial nerve and the tympanic annulus to the inner margin of the round window (FRAA) were measured at the level of the round window on axial temporal bone HRCT. A line parallel to the posterior wall of the ear canal was drawn from the anterior wall of the facial nerve at the level of the round window on axial temporal bone HRCT, and its relationship with the round window was recorded (facial-round window line, FRL): type 0, posterior to the round window; type 1, through the round window; type 2, anterior to the round window. Their (FWD, FRA, FRAA, FRL) relationships with intra-operative round window visibility were analyzed with SPSS 17.0 software. Results: FWD (F = 18.76, P = 0.00), FRA (F = 34.57, P = 0.00) and FRAA (F = 14.24, P = 0.00) significantly affected intra-operative RW visibility. RW could be exposed completely during CI when preoperative HRCT showed a type 0 FRL. RW was likely to be partly exposed or not exposed when preoperative HRCT showed a type 1 or type 2 FRL, respectively. Conclusion: FWD, FRA, FRAA and FRL on temporal bone HRCT can effectively predict intra-operative round window visibility in CI surgery.
Variational formulation of macroparticle models for electromagnetic plasma simulations
Stamm, Alexander B.; Shadwick, Bradley A.; Evstatiev, Evstati G.
2014-06-01
A variational method is used to derive a self-consistent macroparticle model for relativistic electromagnetic kinetic plasma simulations. Extending earlier work, discretization of the electromagnetic Low Lagrangian is performed via a reduction of the phase-space distribution function onto a collection of finite-sized macroparticles of arbitrary shape and discretization of field quantities onto a spatial grid. This approach may be used with lab frame coordinates or moving window coordinates; the latter can greatly improve computational efficiency for studying some types of laser-plasma interactions. The primary advantage of the variational approach is the preservation of Lagrangian symmetries, which in our case leads to energy conservation and thus avoids difficulties with grid heating. In addition, this approach decouples particle size from grid spacing and relaxes restrictions on particle shape, leading to low numerical noise. The variational approach also guarantees consistent approximations in the equations of motion and is amenable to higher order methods in both space and time. We restrict our attention to the 1.5-D case (one coordinate and two momenta). Lastly, simulations are performed with the new models and demonstrate energy conservation and low noise.
Rong, Xing; Du, Yong; Frey, Eric C
2012-06-21
Quantitative Yttrium-90 ((90)Y) bremsstrahlung single photon emission computed tomography (SPECT) imaging has shown great potential to provide reliable estimates of (90)Y activity distribution for targeted radionuclide therapy dosimetry applications. One factor that potentially affects the reliability of the activity estimates is the choice of the acquisition energy window. In contrast to imaging conventional gamma photon emitters, where the acquisition energy windows are usually placed around photopeaks, there has been great variation in the choice of the acquisition energy window for (90)Y imaging due to the continuous and broad energy distribution of the bremsstrahlung photons. In quantitative imaging of conventional gamma photon emitters, previous methods for optimizing the acquisition energy window assumed unbiased estimators and used the variance of the estimates as a figure of merit (FOM). However, in situations such as (90)Y imaging, where there are errors in the modeling of the image formation process used in the reconstruction, there will be bias in the activity estimates. In (90)Y bremsstrahlung imaging this is especially important due to the high levels of scatter, multiple scatter, and collimator septal penetration and scatter. Variance alone is therefore not a complete measure of the reliability of the estimates and hence not a complete FOM. To address this, we first developed a new method to optimize the energy window that accounts for both the bias due to model-mismatch and the variance of the activity estimates. We then applied this method to optimize the acquisition energy window for quantitative (90)Y bremsstrahlung SPECT imaging in microsphere brachytherapy. Since absorbed dose is defined as the energy absorbed from the radiation per unit mass of tissue, in this new method we proposed a mass-weighted root mean squared error of the volume-of-interest (VOI) activity estimates as the FOM. To calculate this FOM, two analytical expressions were derived for calculating the bias due to model-mismatch and the variance of the VOI activity estimates, respectively. To obtain the optimal acquisition energy window for general situations of interest in clinical (90)Y microsphere imaging, we generated phantoms with multiple tumors of various sizes and various tumor-to-normal activity concentration ratios using a digital phantom that realistically simulates human anatomy, simulated (90)Y microsphere imaging with a clinical SPECT system and typical imaging parameters using a previously validated Monte Carlo simulation code, and used a previously proposed method for modeling the image-degrading effects in quantitative SPECT reconstruction. The obtained optimal acquisition energy window was 100-160 keV. The values of the proposed FOM were much larger than those of a FOM accounting only for the variance of the activity estimates, demonstrating in our experiment that the bias of the activity estimates due to model-mismatch was a more important factor than the variance in limiting the reliability of the activity estimates.
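As a rough illustration of such a figure of merit, the following sketch combines per-VOI bias and variance into a mass-weighted root mean squared error; the numbers and the simple squared-bias-plus-variance form are assumptions for illustration, not the authors' derived analytical expressions:

```python
# Illustrative only: a mass-weighted RMSE figure of merit combining bias
# (e.g., from model-mismatch) and variance of VOI activity estimates.
import numpy as np

def mass_weighted_rmse(bias, variance, mass):
    """bias, variance, mass: 1-D arrays, one entry per volume of interest."""
    mse = np.asarray(bias) ** 2 + np.asarray(variance)   # per-VOI mean squared error
    w = np.asarray(mass) / np.sum(mass)                   # mass weights
    return np.sqrt(np.sum(w * mse))

# hypothetical example: three VOIs (two tumours, normal liver), masses in kg
print(mass_weighted_rmse(bias=[0.10, 0.05, 0.02],
                         variance=[0.004, 0.002, 0.001],
                         mass=[0.02, 0.05, 1.50]))
```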
Ma, Hsiang-Yang; Lin, Ying-Hsiu; Wang, Chiao-Yin; Chen, Chiung-Nien; Ho, Ming-Chih; Tsui, Po-Hsiang
2016-08-01
Ultrasound Nakagami imaging is an attractive method for visualizing changes in envelope statistics. Window-modulated compounding (WMC) Nakagami imaging was reported to improve image smoothness. The sliding window technique is typically used for constructing ultrasound parametric and Nakagami images. Using a large window overlap ratio may improve the WMC Nakagami image resolution but reduces computational efficiency. Therefore, the objectives of this study include: (i) exploring the effects of the window overlap ratio on the resolution and smoothness of WMC Nakagami images; (ii) proposing a fast algorithm that is based on the convolution operator (FACO) to accelerate WMC Nakagami imaging. Computer simulations and preliminary clinical tests on liver fibrosis samples (n=48) were performed to validate the FACO-based WMC Nakagami imaging. The results demonstrated that the width of the autocorrelation function and the parameter distribution of the WMC Nakagami image reduce with the increase in the window overlap ratio. One-pixel shifting (i.e., sliding the window on the image data in steps of one pixel for parametric imaging) as the maximum overlap ratio significantly improves the WMC Nakagami image quality. Concurrently, the proposed FACO method combined with a computational platform that optimizes the matrix computation can accelerate WMC Nakagami imaging, allowing the detection of liver fibrosis-induced changes in envelope statistics. FACO-accelerated WMC Nakagami imaging is a new-generation Nakagami imaging technique with an improved image quality and fast computation. Copyright © 2016 Elsevier B.V. All rights reserved.
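The following sketch illustrates the general idea of convolution-style acceleration of windowed Nakagami imaging, using a moment-based m estimator and SciPy's uniform local averaging; it is not the authors' FACO implementation, and the window sizes are arbitrary:

```python
# Rough sketch (assumed details): moment-based Nakagami-m maps computed with
# convolution-style local averaging, then compounded over several window sizes
# (the window-modulated compounding idea).
import numpy as np
from scipy.ndimage import uniform_filter

def nakagami_map(envelope, win):
    e2 = uniform_filter(envelope ** 2, size=win)   # local E[R^2]
    e4 = uniform_filter(envelope ** 4, size=win)   # local E[R^4]
    var2 = np.maximum(e4 - e2 ** 2, 1e-12)         # local Var[R^2]
    return e2 ** 2 / var2                          # moment estimator of m

def wmc_nakagami(envelope, windows=(7, 9, 11)):
    # compound (average) the maps obtained with different window sizes
    return np.mean([nakagami_map(envelope, w) for w in windows], axis=0)

rng = np.random.default_rng(1)
env = rng.rayleigh(size=(128, 128))                # Rayleigh speckle, so m should be near 1
print(wmc_nakagami(env).mean())
```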
Method and apparatus for monitoring the flow of mercury in a system
Grossman, Mark W.
1987-01-01
An apparatus and method for monitoring the flow of mercury in a system. The equipment enables the entrainment of the mercury in a carrier gas e.g., an inert gas, which passes as mercury vapor between a pair of optically transparent windows. The attenuation of the emission is indicative of the quantity of mercury (and its isotopes) in the system. A 253.7 nm light is shone through one of the windows and the unabsorbed light is detected through the other window. The absorption of the 253.7 nm light is thereby measured whereby the quantity of mercury passing between the windows can be determined. The apparatus includes an in-line sensor for measuring the quantity of mercury. It includes a conduit together with a pair of apertures disposed in a face to face relationship and arranged on opposite sides of the conduit. A pair of optically transparent windows are disposed upon a pair of viewing tubes. A portion of each of the tubes is disposed inside of the conduit and within each of the apertures. The two windows are disposed in a face to face relationship on the ends of the viewing tubes and the entire assembly is hermetically sealed from the atmosphere whereby when 253.7 nm ultraviolet light is shone through one of the windows and detected through the other, the quantity of mercury which is passing by can be continuously monitored due to absorption which is indicated by attenuation of the amplitude of the observed emission.
Leyde, Brian P; Klein, Sanford A; Nellis, Gregory F; Skye, Harrison
2017-03-01
This paper presents a new method called the Crossed Contour Method for determining the effective properties (borehole radius and ground thermal conductivity) of a vertical ground-coupled heat exchanger. The borehole radius is used as a proxy for the overall borehole thermal resistance. The method has been applied to both simulated and experimental borehole Thermal Response Test (TRT) data using the Duct Storage vertical ground heat exchanger model implemented in the TRansient SYstems Simulation software (TRNSYS). The Crossed Contour Method generates a parametric grid of simulated TRT data for different combinations of borehole radius and ground thermal conductivity in a series of time windows. The error between the average of the simulated and experimental bore field inlet and outlet temperatures is calculated for each set of borehole properties within each time window. Using these data, contours of the minimum error are constructed in the parameter space of borehole radius and ground thermal conductivity. When all of the minimum error contours for each time window are superimposed, the point where the contours cross (intersect) identifies the effective borehole properties for the model that most closely represents the experimental data in every time window and thus over the entire length of the experimental data set. The computed borehole properties are compared with results from existing model inversion methods including the Ground Property Measurement (GPM) software developed by Oak Ridge National Laboratory, and the Line Source Model.
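A simplified stand-in for the procedure, with a toy forward model in place of the TRNSYS Duct Storage model and a joint minimum over normalized per-window error surfaces in place of the graphical contour crossing (all names and numbers are illustrative):

```python
# Toy illustration of the crossed-contour idea, not the published implementation.
import numpy as np

def toy_outlet_temp(radius, conductivity, t):
    # hypothetical borehole response; a real study would call TRNSYS here
    return 20.0 + 2.0 * np.log1p(t) / conductivity + 5.0 * radius

radii = np.linspace(0.05, 0.10, 41)
conds = np.linspace(1.5, 3.5, 41)
t = np.arange(1.0, 49.0)                      # 48 h of hourly TRT data
true_r, true_k = 0.075, 2.4
observed = toy_outlet_temp(true_r, true_k, t)
observed += 0.05 * np.random.default_rng(2).standard_normal(t.size)

windows = np.array_split(np.arange(t.size), 4)  # four time windows
R, K = np.meshgrid(radii, conds, indexing="ij")
err = np.zeros((len(windows),) + R.shape)
for w, idx in enumerate(windows):
    sim = toy_outlet_temp(R[..., None], K[..., None], t[idx])
    err[w] = np.sqrt(np.mean((sim - observed[idx]) ** 2, axis=-1))

# parameters that are jointly near-optimal in every window ~ the contour crossing
combined = (err / err.min(axis=(1, 2), keepdims=True)).max(axis=0)
i, j = np.unravel_index(np.argmin(combined), combined.shape)
print("estimated radius, conductivity:", round(radii[i], 3), round(conds[j], 2))
```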
Mahmood, Hafiz Sultan; Hoogmoed, Willem B.; van Henten, Eldert J.
2013-01-01
Fine-scale spatial information on soil properties is needed to successfully implement precision agriculture. Proximal gamma-ray spectroscopy has recently emerged as a promising tool to collect fine-scale soil information. The objective of this study was to evaluate a proximal gamma-ray spectrometer to predict several soil properties using energy-windows and full-spectrum analysis methods in two differently managed sandy loam fields: conventional and organic. In the conventional field, both methods predicted clay, pH and total nitrogen with a good accuracy (R2 ≥ 0.56) in the top 0–15 cm soil depth, whereas in the organic field, only clay content was predicted with such accuracy. The highest prediction accuracy was found for total nitrogen (R2 = 0.75) in the conventional field in the energy-windows method. Predictions were better in the top 0–15 cm soil depths than in the 15–30 cm soil depths for individual and combined fields. This implies that gamma-ray spectroscopy can generally benefit soil characterisation for annual crops where the condition of the seedbed is important. Small differences in soil structure (conventional vs. organic) cannot be determined. As for the methodology, we conclude that the energy-windows method can establish relations between radionuclide data and soil properties as accurate as the full-spectrum analysis method. PMID:24287541
NASA Technical Reports Server (NTRS)
Barry, Matthew R.
2006-01-01
The X-Windows Socket Widget Class ("Class" is used here in the object-oriented-programming sense of the word) was devised to simplify the task of implementing network connections for graphical-user-interface (GUI) computer programs. UNIX Transmission Control Protocol/Internet Protocol (TCP/IP) socket programming libraries require many method calls to configure, operate, and destroy sockets. Most X Windows GUI programs use widget sets or toolkits to facilitate management of complex objects. The widget standards facilitate construction of toolkits and application programs. The X-Windows Socket Widget Class encapsulates UNIX TCP/IP socket-management tasks within the framework of an X Windows widget. Using the widget framework, X Windows GUI programs can treat one or more network socket instances in the same manner as other graphical widgets, making it easier to program sockets. Wrapping the TCP/IP socket programming libraries inside a widget framework enables a programmer to treat a network interface as though it were a GUI element.
Stereo matching using census cost over cross window and segmentation-based disparity refinement
NASA Astrophysics Data System (ADS)
Li, Qingwu; Ni, Jinyan; Ma, Yunpeng; Xu, Jinxin
2018-03-01
Stereo matching is a vital requirement for many applications, such as three-dimensional (3-D) reconstruction, robot navigation, object detection, and industrial measurement. To improve the practicability of stereo matching, a method using census cost over a cross window and segmentation-based disparity refinement is proposed. First, a cross window is obtained using distance difference and intensity similarity in the binocular images. The census cost over the cross window and a color cost are combined as the matching cost, which is aggregated by the guided filter. Then, a winner-takes-all strategy is used to calculate the initial disparities. Second, a graph-based segmentation method is combined with color and edge information to achieve moderate under-segmentation. The segmented regions are classified into reliable regions and unreliable regions by consistency checking. Finally, the two regions are optimized by plane fitting and propagation, respectively, to match the ambiguous pixels. Experimental results on the Middlebury Stereo Datasets show that the proposed method performs well in occluded and discontinuous regions and obtains smoother disparity maps with a lower average matching error rate than other algorithms.
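For readers unfamiliar with census matching, the sketch below shows a plain square-window census transform and Hamming-distance cost; the paper's adaptive cross window and color term are omitted, and the 5x5 neighbourhood is an assumption:

```python
# Minimal census-cost illustration on synthetic images (wrap-around borders
# via np.roll are acceptable for a toy example).
import numpy as np

def census(img, half=2):
    """One bit per neighbour: 1 where the neighbour is darker than the centre."""
    bits = []
    for dy in range(-half, half + 1):
        for dx in range(-half, half + 1):
            if dy == 0 and dx == 0:
                continue
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            bits.append((shifted < img).astype(np.uint8))
    return np.stack(bits, axis=-1)            # shape (h, w, 24) for a 5x5 window

def census_cost(left, right, disparity):
    cl = census(left)
    cr = census(np.roll(right, disparity, axis=1))
    return np.sum(cl != cr, axis=-1)          # per-pixel Hamming distance

rng = np.random.default_rng(3)
right = rng.random((32, 32))
left = np.roll(right, 4, axis=1)               # true disparity of 4 pixels
# cost is near zero at the true disparity and larger at a wrong one
print(census_cost(left, right, 4).mean(), census_cost(left, right, 0).mean())
```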
Windowed multitaper correlation analysis of multimodal brain monitoring parameters.
Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander
2015-01-01
Although multimodal monitoring sets the standard in the daily practice of neurocritical care, problem-oriented analysis tools to interpret the huge amount of data are lacking. Recently a mathematical model was presented that simulates cerebral perfusion and oxygen supply in case of severe head trauma, predicting the appearance of distinct correlations between arterial blood pressure and intracranial pressure. In this study we present a set of mathematical tools that reliably detect the predicted correlations in data recorded at a neurocritical care unit. The time-resolved correlations are identified by a windowing technique combined with Fourier-based coherence calculations. The phasing of the data is detected by means of the Hilbert phase difference within the above-mentioned windows. A statistical testing method is introduced that allows tuning the parameters of the windowing method in such a way that a predefined accuracy is reached. With this method the data of fifteen patients were examined, and the predicted correlation was found in each patient. Additionally, it could be shown that the occurrence of a distinct correlation parameter, called scp, represents a high-quality predictive value for the patient's outcome.
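A compact sketch of the windowing idea on synthetic signals, using SciPy's coherence and Hilbert transform; the sampling rate, window length and frequency band are illustrative, not the values tuned by the paper's statistical test:

```python
# Sliding-window coherence between synthetic ABP- and ICP-like signals,
# plus the Hilbert phase difference inside each window.
import numpy as np
from scipy.signal import coherence, hilbert

fs = 1.0                                    # 1 Hz monitoring data (assumed)
t = np.arange(0, 4 * 3600, 1 / fs)          # 4 h of samples
rng_a, rng_b = np.random.default_rng(4), np.random.default_rng(5)
abp = np.sin(2 * np.pi * 0.01 * t) + 0.3 * rng_a.standard_normal(t.size)
icp = 0.8 * np.sin(2 * np.pi * 0.01 * t - 0.6) + 0.3 * rng_b.standard_normal(t.size)

win = 1800                                   # 30-min windows with 50% overlap
for start in range(0, t.size - win + 1, win // 2):
    a, b = abp[start:start + win], icp[start:start + win]
    f, coh = coherence(a, b, fs=fs, nperseg=256)
    band = (f > 0.005) & (f < 0.02)          # slow-wave band of interest
    phase = np.angle(hilbert(a)) - np.angle(hilbert(b))
    print(start, coh[band].mean().round(2), np.median(np.unwrap(phase)).round(2))
```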
Window-Based Channel Impulse Response Prediction for Time-Varying Ultra-Wideband Channels.
Al-Samman, A M; Azmi, M H; Rahman, T A; Khan, I; Hindia, M N; Fattouh, A
2016-01-01
This work proposes channel impulse response (CIR) prediction for time-varying ultra-wideband (UWB) channels by exploiting the fast movement of channel taps within delay bins. Considering the sparsity of UWB channels, we introduce a window-based CIR (WB-CIR) to approximate the high temporal resolutions of UWB channels. A recursive least square (RLS) algorithm is adopted to predict the time evolution of the WB-CIR. For predicting the future WB-CIR tap of window wk, three RLS filter coefficients are computed from the observed WB-CIRs of the left wk-1, the current wk and the right wk+1 windows. The filter coefficient with the lowest RLS error is used to predict the future WB-CIR tap. To evaluate our proposed prediction method, UWB CIRs are collected through measurement campaigns in outdoor environments considering line-of-sight (LOS) and non-line-of-sight (NLOS) scenarios. Under similar computational complexity, our proposed method provides an improvement in prediction errors of approximately 80% for LOS and 63% for NLOS scenarios compared with a conventional method.
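A toy sketch of the selection step, with a generic recursive-least-squares one-step predictor fitted to synthetic tap histories for the left, current and right windows; the filter order, forgetting factor and signals are assumptions, not the measured WB-CIRs:

```python
# Fit an RLS predictor to each candidate window's tap history and keep the
# window with the lowest final a-priori error, as in the selection rule above.
import numpy as np

def rls_one_step(x, order=3, lam=0.98, delta=100.0):
    """Return (prediction of the next sample, last a-priori error) for series x."""
    w = np.zeros(order)
    P = delta * np.eye(order)
    err = 0.0
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]            # most recent sample first
        err = x[n] - w @ u
        k = P @ u / (lam + u @ P @ u)
        w = w + k * err
        P = (P - np.outer(k, u @ P)) / lam
    return w @ x[-order:][::-1], abs(err)

rng = np.random.default_rng(6)
taps = {name: np.cumsum(0.1 * rng.standard_normal(50))
        for name in ("w_k-1", "w_k", "w_k+1")}          # synthetic tap evolutions
preds = {name: rls_one_step(h) for name, h in taps.items()}
best = min(preds, key=lambda name: preds[name][1])
print("window used for prediction:", best, "predicted tap:", round(preds[best][0], 3))
```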
Windowed Green function method for the Helmholtz equation in the presence of multiply layered media
NASA Astrophysics Data System (ADS)
Bruno, O. P.; Pérez-Arancibia, C.
2017-06-01
This paper presents a new methodology for the solution of problems of two- and three-dimensional acoustic scattering (and, in particular, two-dimensional electromagnetic scattering) by obstacles and defects in the presence of an arbitrary number of penetrable layers. Relying on the use of certain slow-rise windowing functions, the proposed windowed Green function approach efficiently evaluates oscillatory integrals over unbounded domains, with high accuracy, without recourse to the highly expensive Sommerfeld integrals that have typically been used to account for the effect of underlying planar multilayer structures. The proposed methodology, whose theoretical basis was presented in the recent contribution (Bruno et al. 2016 SIAM J. Appl. Math. 76, 1871-1898. (doi:10.1137/15M1033782)), is fast, accurate, flexible and easy to implement. Our numerical experiments demonstrate that the numerical errors resulting from the proposed approach decrease faster than any negative power of the window size. In a number of examples considered in this paper, the proposed method is up to thousands of times faster, for a given accuracy, than corresponding methods based on the use of Sommerfeld integrals.
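One common smooth cut-off of this 'slow-rise' kind is sketched below; whether it matches the authors' exact window is not verified here, but it shows the defining behaviour: identically 1 on an inner region and decaying smoothly to 0 at the window edge, with all derivatives vanishing at both ends of the transition:

```python
# A C-infinity "slow-rise" window: 1 for |x| <= c*A, smooth decay to 0 at |x| = A.
import numpy as np

def slow_rise_window(x, A, c=0.5):
    x = np.abs(np.asarray(x, dtype=float))
    w = np.zeros_like(x)
    w[x <= c * A] = 1.0
    rise = (x > c * A) & (x < A)
    u = (x[rise] - c * A) / (A - c * A)        # u in (0, 1) over the transition
    w[rise] = np.exp(2.0 * np.exp(-1.0 / u) / (u - 1.0))
    return w

x = np.linspace(-12, 12, 9)                     # -12, -9, ..., 9, 12
print(np.round(slow_rise_window(x, A=10.0), 4))  # 1 inside |x|<=5, 0 beyond |x|>=10
```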
Method of high speed flow field influence and restrain on laser communication
NASA Astrophysics Data System (ADS)
Meng, Li-xin; Wang, Chun-hui; Qian, Cun-zhu; Wang, Shuo; Zhang, Li-zhong
2013-08-01
For laser communication terminals carried by an airplane or airship, high-speed platform movement causes the surrounding air to affect both the platform and the terminal's optical window. The first effect is that aerodynamic loading deforms the optical window; the second is that a shock wave and a boundary layer are generated. For subsonic aircraft, the boundary layer is the main influence. The presence of a boundary layer changes the air density and the temperature of the optical window, which deflects the light and makes the received beam spot flicker. Ultimately, the energy fluctuation of the beam spot reaching the receiving side increases, so the error rate increases. In this paper, aerodynamic theory is used to analyze the influence of optical window deformation due to high-speed airflow, and aero-optics theory is used to analyze the influence of the boundary layer on the laser communication link. On this basis, we explored methods for suppressing aerodynamic and aero-optical effects from the perspective of optical window design. Based on the planned experimental aircraft types and equipment installation location, we optimized the design parameters of the shape and thickness of the optical window and the shape and size of the air-management kit. Finally, the deformation of the optical window and the airflow distribution were simulated with fluid simulation software for different Mach numbers and flight altitudes. The simulation results show that the optimized optical window can mitigate the aerodynamic influence; in addition, the boundary layer is smoothed and the turbulence influence is reduced, which meets the requirements of airborne laser communication.
Multifractal analysis of 2001 Mw 7.7 Bhuj earthquake sequence in Gujarat, Western India
NASA Astrophysics Data System (ADS)
Aggarwal, Sandeep Kumar; Pastén, Denisse; Khan, Prosanta Kumar
2017-12-01
The 2001 Mw 7.7 Bhuj mainshock seismic sequence in the Kachchh area, occurring from 2001 to 2012, has been analyzed using mono-fractal and multi-fractal dimension spectrum analysis. This region was characterized by frequent moderate shocks of Mw ≥ 5.0 for more than a decade after the 2001 Bhuj earthquake. The present study is therefore important for precursory analysis using this sequence. The selected long sequence has been investigated for the first time, with a completeness magnitude Mc of 3.0 estimated using the maximum curvature method. Multi-fractal Dq spectrum (Dq ∼ q) analysis was carried out using an effective window length of 200 earthquakes, with the window moved in steps of 20 events so that successive windows overlap by 180 events. The robustness of the analysis has been tested by applying a magnitude completeness correction term of 0.2 to Mc 3.0 (i.e., Mc 3.2), and the error in the calculation of Dq was evaluated for each magnitude threshold. The stability of the analysis has also been investigated down to the minimum magnitude of Mw ≥ 2.6 in the sequence. The analysis shows that the multi-fractal dimension spectrum Dq decreases with increasing clustering of events with time before a moderate-magnitude earthquake in the sequence, which accounts for non-randomness in the spatial distribution of epicenters and its self-organized criticality. Similar behavior is ubiquitous elsewhere around the globe and warns of the proximity of a damaging seismic event in an area.
NASA Technical Reports Server (NTRS)
Keppenne, Christian L.; Rienecker, Michele M.; Kovach, Robin M.; Vernieres, Guillaume; Koster, Randal D. (Editor)
2014-01-01
An attractive property of ensemble data assimilation methods is that they provide flow dependent background error covariance estimates which can be used to update fields of observed variables as well as fields of unobserved model variables. Two methods to estimate background error covariances are introduced which share the above property with ensemble data assimilation methods but do not involve the integration of multiple model trajectories. Instead, all the necessary covariance information is obtained from a single model integration. The Space Adaptive Forecast error Estimation (SAFE) algorithm estimates error covariances from the spatial distribution of model variables within a single state vector. The Flow Adaptive error Statistics from a Time series (FAST) method constructs an ensemble sampled from a moving window along a model trajectory. SAFE and FAST are applied to the assimilation of Argo temperature profiles into version 4.1 of the Modular Ocean Model (MOM4.1) coupled to the GEOS-5 atmospheric model and to the CICE sea ice model. The results are validated against unassimilated Argo salinity data. They show that SAFE and FAST are competitive with the ensemble optimal interpolation (EnOI) used by the Global Modeling and Assimilation Office (GMAO) to produce its ocean analysis. Because of their reduced cost, SAFE and FAST hold promise for high-resolution data assimilation applications.
Design and comparison of laser windows for high-power lasers
NASA Astrophysics Data System (ADS)
Niu, Yanxiong; Liu, Wenwen; Liu, Haixia; Wang, Caili; Niu, Haisha; Man, Da
2014-11-01
High-power laser systems are increasingly used in industrial and military applications. It is necessary to develop high-power laser systems that can operate over long periods without appreciable degradation in performance. When a high-energy laser beam passes through a laser window, permanent damage may be caused to the window because of energy absorption by the window material. Therefore, when designing a high-power laser system, a suitable window material must be selected and the laser damage threshold of the window must be known. In this paper, a thermal analysis model of a high-power laser window is established, and the relationship between laser intensity and the thermal-stress field distribution is studied by deriving the governing formulas with the integral-transform method. The influence of window radius, thickness and laser intensity on the temperature and stress field distributions is analyzed. The performance of K9 glass and fused silica glass is then compared, and the laser-induced damage mechanism is analyzed. Finally, the damage thresholds of the laser windows are calculated. The results show that, compared with K9 glass, fused silica glass has a higher damage threshold due to its better thermodynamic properties. The presented theoretical analysis and simulation results are helpful for the design and selection of high-power laser windows.
Bergmann, Helmar; Minear, Gregory; Raith, Maria; Schaffarich, Peter M
2008-12-09
The accuracy of multiple window spatial resolution characterises the performance of a gamma camera for dual isotope imaging. In the present study we investigate an alternative method to the standard NEMA procedure for measuring this performance parameter. A long-lived 133Ba point source with gamma energies close to 67Ga and a single bore lead collimator were used to measure the multiple window spatial registration error. Calculation of the positions of the point source in the images used the NEMA algorithm. The results were validated against the values obtained by the standard NEMA procedure which uses a liquid 67Ga source with collimation. Of the source-collimator configurations under investigation an optimum collimator geometry, consisting of a 5 mm thick lead disk with a diameter of 46 mm and a 5 mm central bore, was selected. The multiple window spatial registration errors obtained by the 133Ba method showed excellent reproducibility (standard deviation < 0.07 mm). The values were compared with the results from the NEMA procedure obtained at the same locations and showed small differences with a correlation coefficient of 0.51 (p < 0.05). In addition, the 133Ba point source method proved to be much easier to use. A Bland-Altman analysis showed that the 133Ba and the 67Ga Method can be used interchangeably. The 133Ba point source method measures the multiple window spatial registration error with essentially the same accuracy as the NEMA-recommended procedure, but is easier and safer to use and has the potential to replace the current standard procedure.
Implementation of Kriging Methods in Mobile GIS to Estimate Damage to Buildings in Crisis Scenarios
NASA Astrophysics Data System (ADS)
Laun, S.; Rösch, N.; Breunig, M.; Doori, M. Al
2016-06-01
In this paper an example of the application of kriging methods to estimate damage to buildings in crisis scenarios is introduced. Furthermore, Java implementations of Ordinary and Universal Kriging for mobile GIS are presented. As variogram models, an exponential, a Gaussian and a spherical variogram are tested in detail. Different test constellations with various information densities are introduced. As a test data set, public data from the analysis of the 2010 Haiti earthquake using satellite images are pre-processed and visualized in a Geographic Information System. Because buildings, topography and other external influences cannot be regarded as constant over the whole area under investigation, semivariograms are calculated from neighbouring classified buildings using the so-called moving window method. The evaluation of the methods shows that the underlying variogram model is the determining factor for the quality of the interpolation, rather than the choice of the kriging method or an increase in the information density of a random sample. The implementation is realized entirely in the Java programming language. Thereafter, the implemented software component is integrated into GeoTech Mobile, a mobile GIS Android application based on the processing of standardized spatial data representations defined by the Open Geospatial Consortium (OGC). As a result, the implemented methods can be used on mobile devices and may be transferred to other application fields. That is why we finally point out further research with new applications in the Dubai region.
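To make the moving-window variogram step concrete, the sketch below computes an empirical semivariogram from synthetic 'damage grade' values inside one window and fits an exponential model with SciPy; the data, window size and lag bins are invented for illustration:

```python
# Empirical semivariogram within one moving window, plus an exponential model
# fit, as a local input for kriging (toy data, not the Haiti dataset).
import numpy as np
from scipy.optimize import curve_fit

def empirical_semivariogram(xy, values, lags):
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    gamma = [sq[(d > lo) & (d <= hi)].mean() for lo, hi in zip(lags[:-1], lags[1:])]
    return 0.5 * (lags[:-1] + lags[1:]), np.array(gamma)

def exponential_model(h, nugget, sill, effective_range):
    return nugget + sill * (1.0 - np.exp(-h / effective_range))

rng = np.random.default_rng(7)
xy = rng.uniform(0, 500, size=(200, 2))               # buildings in a 500 m window
damage = np.sin(xy[:, 0] / 120) + 0.3 * rng.standard_normal(200)  # toy damage grade
h, gamma = empirical_semivariogram(xy, damage, lags=np.linspace(0, 250, 11))
params, _ = curve_fit(exponential_model, h, gamma, p0=[0.1, 0.5, 100], maxfev=5000)
print("nugget, sill, range:", np.round(params, 3))
```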
On interrelations of recurrences and connectivity trends between stock indices
NASA Astrophysics Data System (ADS)
Goswami, B.; Ambika, G.; Marwan, N.; Kurths, J.
2012-09-01
Financial data has been extensively studied for correlations using Pearson's cross-correlation coefficient ρ as the point of departure. We employ an estimator based on recurrence plots - the correlation of probability of recurrence (CPR) - to analyze connections between nine stock indices spread worldwide. We suggest a slight modification of the CPR approach in order to get more robust results. We examine trends in CPR for an approximately 19-month window moved along the time series and compare them to trends in ρ. Binning CPR into three levels of connectedness (strong, moderate, and weak), we extract the trends in number of connections in each bin over time. We also look at the behavior of CPR during the dot-com bubble by shifting the time series to align their peaks. CPR mainly uncovers that the markets move in and out of periods of strong connectivity erratically, instead of moving monotonically towards increasing global connectivity. This is in contrast to ρ, which gives a picture of ever-increasing correlation. CPR also exhibits that time-shifted markets have high connectivity around the dot-com bubble of 2000. We use significance tests using twin surrogates to interpret all the measures estimated in the study.
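For the simpler of the two measures, the moving-window Pearson correlation that CPR is compared against, a sketch on synthetic index series looks as follows; CPR itself relies on recurrence-plot machinery that is not reproduced here, and the window of roughly 400 trading days is only a stand-in for the 19-month window:

```python
# Moving-window Pearson correlation between two synthetic index series that
# share a common factor (illustrative data, not the nine stock indices).
import numpy as np

rng = np.random.default_rng(8)
n = 3000                                     # roughly 12 years of daily data
common = rng.standard_normal(n)
idx_a = common + rng.standard_normal(n)
idx_b = common + rng.standard_normal(n)

win = 400                                    # ~19 months of trading days (assumed)
rho = [np.corrcoef(idx_a[s:s + win], idx_b[s:s + win])[0, 1]
       for s in range(0, n - win + 1, 20)]   # slide the window in 20-day steps
print(f"windows: {len(rho)}, mean rho: {np.mean(rho):.2f}, spread: {np.std(rho):.2f}")
```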
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Junwei; Chen, Ruizhu; Hartlep, Thomas
2015-08-10
Helioseismic and magnetohydrodynamic waves are abundant in and above sunspots. Through cross-correlating oscillation signals in the photosphere observed by the Solar Dynamics Observatory/Helioseismic and Magnetic Imager, we reconstruct how waves propagate away from virtual wave sources located inside a sunspot. In addition to the usual helioseismic wave, a fast-moving wave is detected traveling along the sunspot’s radial direction from the umbra to about 15 Mm beyond the sunspot boundary. The wave has a frequency range of 2.5–4.0 mHz with a phase velocity of 45.3 km s⁻¹, substantially faster than the typical speeds of Alfvén and magnetoacoustic waves in the photosphere. The observed phenomenon is consistent with a scenario in which a magnetoacoustic wave is excited at approximately 5 Mm beneath the sunspot. Its wavefront travels to and sweeps across the photosphere with a speed higher than the local magnetoacoustic speed. The fast-moving wave, if truly excited beneath the sunspot’s surface, will help open a new window for studying the internal structure and dynamics of sunspots.
Uncertainty visualisation in the Model Web
NASA Astrophysics Data System (ADS)
Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.
2012-04-01
Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and, furthermore, uncertainty information are included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but only a few tools exist for visualising such data in a standardised way. Furthermore, they are usually realised as thick clients and lack the functionality to handle data coming from web services, as envisaged in the Model Web. We present an interactive web tool for the visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan and zoom in/out, to give the user interactive control. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations & Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. The uncertainty information considered for the tool comprises probabilistic and quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions, or statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Previously conducted usability studies indicated that a differentiation between expert (in statistics or mapping) and non-expert users is useful. Therefore, two different modes are realised together in the tool: (i) adjacent maps showing data and uncertainty separately, and (ii) multidimensional mapping providing different visualisation methods in combination to explore the spatial, temporal and uncertainty distribution of the data. Adjacent maps provide a simpler visualisation, separating value and uncertainty maps, for non-experts and for a first overview. The multidimensional approach allows a more complex exploration of the data for experts by browsing through the different dimensions. It offers the visualisation of maps, statistics plots and time series in different windows, with sliders to move interactively through time, space and uncertainty (thresholds).
Marchant, Carol A; Briggs, Katharine A; Long, Anthony
2008-01-01
Lhasa Limited is a not-for-profit organization that exists to promote the sharing of data and knowledge in chemistry and the life sciences. It has developed the software tools Derek for Windows, Meteor, and Vitic to facilitate such sharing. Derek for Windows and Meteor are knowledge-based expert systems that predict the toxicity and metabolism of a chemical, respectively. Vitic is a chemically intelligent toxicity database. An overview of each software system is provided along with examples of the sharing of data and knowledge in the context of their development. These examples include illustrations of (1) the use of data entry and editing tools for the sharing of data and knowledge within organizations; (2) the use of proprietary data to develop nonconfidential knowledge that can be shared between organizations; (3) the use of shared expert knowledge to refine predictions; (4) the sharing of proprietary data between organizations through the formation of data-sharing groups; and (5) the use of proprietary data to validate predictions. Sharing of chemical toxicity and metabolism data and knowledge in this way offers a number of benefits including the possibilities of faster scientific progress and reductions in the use of animals in testing. Maximizing the accessibility of data also becomes increasingly crucial as in silico systems move toward the prediction of more complex phenomena for which limited data are available.
Phast4Windows: a 3D graphical user interface for the reactive-transport simulator PHAST.
Charlton, Scott R; Parkhurst, David L
2013-01-01
Phast4Windows is a Windows® program for developing and running groundwater-flow and reactive-transport models with the PHAST simulator. This graphical user interface allows definition of grid-independent spatial distributions of model properties-the porous media properties, the initial head and chemistry conditions, boundary conditions, and locations of wells, rivers, drains, and accounting zones-and other parameters necessary for a simulation. Spatial data can be defined without reference to a grid by drawing, by point-by-point definitions, or by importing files, including ArcInfo® shape and raster files. All definitions can be inspected, edited, deleted, moved, copied, and switched from hidden to visible through the data tree of the interface. Model features are visualized in the main panel of the interface, so that it is possible to zoom, pan, and rotate features in three dimensions (3D). PHAST simulates single phase, constant density, saturated groundwater flow under confined or unconfined conditions. Reactions among multiple solutes include mineral equilibria, cation exchange, surface complexation, solid solutions, and general kinetic reactions. The interface can be used to develop and run simple or complex models, and is ideal for use in the classroom, for analysis of laboratory column experiments, and for development of field-scale simulations of geochemical processes and contaminant transport. Published 2012. This article is a U.S. Government work and is in the public domain in the USA.
Ehara, Shoichi; Okuyama, Takuhiro; Shirai, Nobuyuki; Sugioka, Kenichi; Oe, Hiroki; Itoh, Toshihide; Matsuoka, Toshiyuki; Ikura, Yoshihiro; Ueda, Makiko; Naruko, Takahiko; Hozumi, Takeshi; Yoshiyama, Minoru
2009-08-01
Previous studies have shown a correlation between coronary artery cross-sectional diameter and left ventricular (LV) mass. However, no studies have examined the correlation between actual coronary artery volume (CAV) and LV mass. In the present study, measurements of CAV by 64-multislice computed tomography (MSCT) were validated and the relationship between CAV and LV mass was investigated. First, coronary artery phantoms consisting of syringes filled with solutions of contrast medium moving at simulated heart rates were scanned by 64-MSCT. Display window settings permitting accurate calculation of small volumes were optimized by evaluating volume-rendered images of the segmented contrast medium at different window settings. Next, 61 patients without significant coronary artery stenosis were scanned by 64-MSCT with the same protocol as for the phantoms. Coronary arteries were segmented on a workstation and the same window settings were applied to the volume-rendered images to calculate total CAV. Significant correlations between total CAV and LV mass (r=0.660, P<0.0001) were found, whereas an inverse relation was present between total CAV per 100 g of LV mass and LV mass. The novel concept of "CAV" for the characterization of coronary arteries may prove useful for future research, particularly on the causes of LV hypertrophy.
NASA Astrophysics Data System (ADS)
Jian, Wang; Xiaohong, Meng; Hong, Liu; Wanqiu, Zheng; Yaning, Liu; Sheng, Gui; Zhiyang, Wang
2017-03-01
Full waveform inversion and reverse time migration are active research areas in seismic exploration. Forward modeling in the time domain determines the precision of the results, and finite-difference numerical solutions have been widely adopted as an important mathematical tool for forward modeling. In this article, an optimum combination of window functions was designed for the finite difference operator, based on a truncated approximation of the spatial convolution series in pseudo-spectrum space, to normalize the outcomes of existing window functions for different orders. The proposed combined window functions not only inherit the characteristics of the individual window functions, providing better truncation results, but also allow the truncation error of the finite difference operator to be controlled manually and visually by adjusting the combination and analyzing the characteristics of the main and side lobes of the amplitude response. The error level and elastic forward modeling under the proposed combined scheme were compared with outcomes from conventional window functions and modified binomial windows. Numerical dispersion is significantly suppressed compared with both the modified binomial window and conventional finite differences. Numerical simulation verifies the reliability of the proposed method.
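A toy version of the windowing idea is sketched below: the pseudo-spectral first-derivative coefficients are truncated with a blend of two ordinary windows (rectangular and Hann), and the resulting numerical-wavenumber error is inspected. The blend is not the paper's optimized combined window; it only shows how the choice of window shapes the truncation error:

```python
# Windowed truncation of the ideal (band-limited) first-derivative stencil and
# the resulting dispersion error over a band of wavenumbers.
import numpy as np

N, h = 8, 1.0                                  # half stencil length, grid spacing
n = np.arange(1, N + 1)
ideal = (-1.0) ** (n + 1) / (n * h)            # pseudo-spectral differentiator taps

def max_dispersion_error(w, kh):
    a = w * ideal                              # windowed finite-difference taps
    keff = 2.0 * np.sum(a[None, :] * np.sin(np.outer(kh, n) * h), axis=1)
    return np.max(np.abs(keff - kh / h) / (kh / h))

kh = np.linspace(0.05, 2.5, 200)               # wavenumbers up to ~0.8*pi
rect = np.ones(N)
hann = np.hanning(2 * N + 1)[N + 1:]           # right half of a Hann window
for theta in (0.0, 0.5, 1.0):                  # blend between the two windows
    w = theta * hann + (1.0 - theta) * rect
    print(f"theta={theta:.1f}  max relative dispersion error: {max_dispersion_error(w, kh):.3e}")
```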
Yu, Xiao; Ding, Enjie; Chen, Chunxu; Liu, Xiaoming; Li, Li
2015-01-01
Because failures of roller element bearings (REBs) cause unexpected machinery breakdowns, their fault diagnosis has attracted considerable research attention. Established fault feature extraction methods focus on statistical characteristics of the vibration signal, which is an approach that loses sight of the continuous waveform features. Considering this weakness, this article proposes a novel feature extraction method for frequency bands, named Window Marginal Spectrum Clustering (WMSC), to select salient features from the marginal spectrum of vibration signals obtained by the Hilbert–Huang Transform (HHT). In WMSC, a sliding window is used to divide an entire HHT marginal spectrum (HMS) into window spectrums, following which the Rand Index (RI) criterion of the clustering method is used to evaluate each window. The windows returning higher RI values are selected to construct characteristic frequency bands (CFBs). Next, a hybrid REBs fault diagnosis method is constructed, termed by its elements HHT-WMSC-SVM (support vector machines). The effectiveness of HHT-WMSC-SVM is validated by running a series of experiments on REBs defect datasets from the Bearing Data Center of Case Western Reserve University (CWRU). The test results evidence three major advantages of the novel method. First, the fault classification accuracy of the HHT-WMSC-SVM model is higher than that of HHT-SVM and ST-SVM, a method that combines statistical characteristics with SVM. Second, with Gauss white noise added to the original REBs defect dataset, the HHT-WMSC-SVM model maintains high classification accuracy, while the classification accuracy of the ST-SVM and HHT-SVM models is significantly reduced. Third, the fault classification accuracy of HHT-WMSC-SVM can exceed 95% for a Pmin range of 500–800 and an m range of 50–300 on the REBs defect dataset with Gauss white noise added at a Signal Noise Ratio (SNR) of 5. Experimental results indicate that the proposed WMSC method yields high REBs fault classification accuracy and good performance under Gauss white noise. PMID:26540059
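A much-simplified sketch of the window-scoring step on synthetic spectra is shown below; scikit-learn's k-means and adjusted Rand index stand in for the paper's clustering and plain Rand Index, and the window and step sizes are arbitrary:

```python
# Score each spectral window by how well clustering on it recovers the known
# class labels, then keep the highest-scoring windows (the WMSC idea, simplified).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(9)
n_samples, n_bins = 60, 200
labels = np.repeat([0, 1], n_samples // 2)              # healthy vs. faulty bearings
spectra = rng.random((n_samples, n_bins))
spectra[labels == 1, 120:140] += 1.5                     # fault energy in one band

win, step = 20, 20
scores = []
for start in range(0, n_bins - win + 1, step):
    window_feats = spectra[:, start:start + win]
    pred = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(window_feats)
    scores.append((start, adjusted_rand_score(labels, pred)))

best = sorted(scores, key=lambda s: s[1], reverse=True)[:3]
print("highest-scoring windows (start bin, score):", [(s, round(v, 2)) for s, v in best])
```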
Standards for efficient employment of wide-area motion imagery (WAMI) sensors
NASA Astrophysics Data System (ADS)
Randall, L. Scott; Maenner, Paul F.
2013-05-01
Airborne Wide Area Motion Imagery (WAMI) sensors provide the opportunity for continuous high-resolution surveillance of geographic areas covering tens of square kilometers. This is both a blessing and a curse. Data volumes from "gigapixel-class" WAMI sensors are orders of magnitude greater than for traditional "megapixel-class" video sensors. The amount of data greatly exceeds the capacities of downlinks to ground stations, and even if this were not true, the geographic coverage is too large for effective human monitoring. Although collected motion imagery is recorded on the platform, typically only small "windows" of the full field of view are transmitted to the ground; the full set of collected data can be retrieved from the recording device only after the mission has concluded. Thus, the WAMI environment presents several difficulties: (1) data is too massive for downlink; (2) human operator selection and control of the video windows may not be effective; (3) post-mission storage and dissemination may be limited by inefficient file formats; and (4) unique system implementation characteristics may thwart exploitation by available analysis tools. To address these issues, the National Geospatial-Intelligence Agency's Motion Imagery Standards Board (MISB) is developing relevant standard data exchange formats: (1) moving target indicator (MTI) and tracking metadata to support tipping and cueing of WAMI windows using "watch boxes" and "trip wires"; (2) control channel commands for positioning the windows within the full WAMI field of view; and (3) a full-field-of-view spatiotemporal tiled file format for efficient storage, retrieval, and dissemination. The authors previously provided an overview of this suite of standards. This paper describes the latest progress, with specific concentration on a detailed description of the spatiotemporal tiled file format.
Zhang, Jinshui; Yuan, Zhoumiqi; Shuai, Guanyuan; Pan, Yaozhong; Zhu, Xiufang
2017-04-26
This paper developed an approach, the window-based validation set for support vector data description (WVS-SVDD), to determine optimal parameters for support vector data description (SVDD) model to map specific land cover by integrating training and window-based validation sets. Compared to the conventional approach where the validation set included target and outlier pixels selected visually and randomly, the validation set derived from WVS-SVDD constructed a tightened hypersphere because of the compact constraint by the outlier pixels which were located neighboring to the target class in the spectral feature space. The overall accuracies for wheat and bare land achieved were as high as 89.25% and 83.65%, respectively. However, target class was underestimated because the validation set covers only a small fraction of the heterogeneous spectra of the target class. The different window sizes were then tested to acquire more wheat pixels for validation set. The results showed that classification accuracy increased with the increasing window size and the overall accuracies were higher than 88% at all window size scales. Moreover, WVS-SVDD showed much less sensitivity to the untrained classes than the multi-class support vector machine (SVM) method. Therefore, the developed method showed its merits using the optimal parameters, tradeoff coefficient ( C ) and kernel width ( s ), in mapping homogeneous specific land cover.
Burriel-Valencia, Jordi; Puche-Panadero, Ruben; Martinez-Roman, Javier; Sapena-Bano, Angel; Pineda-Sanchez, Manuel
2018-01-06
The aim of this paper is to introduce a new methodology for the fault diagnosis of induction machines working in the transient regime, when time-frequency analysis tools are used. The proposed method relies on the use of the optimized Slepian window for performing the short time Fourier transform (STFT) of the stator current signal. It is shown that for a given sequence length of finite duration, the Slepian window has the maximum concentration of energy, greater than can be reached with a gated Gaussian window, which is usually used as the analysis window. In this paper, the use and optimization of the Slepian window for fault diagnosis of induction machines is theoretically introduced and experimentally validated through the test of a 3.15-MW induction motor with broken bars during the start-up transient. The theoretical analysis and the experimental results show that the use of the Slepian window can highlight the fault components in the current's spectrogram with a significant reduction of the required computational resources.
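As a small illustration of the analysis window itself, the sketch below builds a Slepian (DPSS) window with SciPy and uses it as the STFT window for a synthetic start-up-like current; the signal, window length and time half-bandwidth product are assumptions, not the 3.15-MW motor data:

```python
# Slepian (DPSS) window as the STFT analysis window for a synthetic chirp-like
# stator current with a weak fault-like sideband.
import numpy as np
from scipy.signal import spectrogram
from scipy.signal.windows import dpss

fs = 5000.0
t = np.arange(0, 4.0, 1 / fs)
f_supply = 50.0 * (0.2 + 0.8 * np.minimum(t / 2.0, 1.0))  # frequency ramps up at start-up
current = np.sin(2 * np.pi * np.cumsum(f_supply) / fs)
current += 0.05 * np.sin(2 * np.pi * np.cumsum(0.9 * f_supply) / fs)  # fault-like component

win = dpss(1024, NW=3)                       # Slepian window: maximal energy concentration
f, _, Sxx = spectrogram(current, fs=fs, window=win, noverlap=768)
print(Sxx.shape, f[np.argmax(Sxx[:, -1])])   # dominant frequency near 50 Hz at the end
```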
Acoustic window planning for ultrasound acquisition.
Göbl, Rüdiger; Virga, Salvatore; Rackerseder, Julia; Frisch, Benjamin; Navab, Nassir; Hennersperger, Christoph
2017-06-01
Autonomous robotic ultrasound has recently gained considerable interest, especially for collaborative applications. Existing methods for acquisition trajectory planning are solely based on geometrical considerations, such as the pose of the transducer with respect to the patient surface. This work aims at establishing acoustic window planning to enable autonomous ultrasound acquisitions of anatomies with restricted acoustic windows, such as the liver or the heart. We propose a fully automatic approach for the planning of acquisition trajectories, which only requires information about the target region as well as existing tomographic imaging data, such as X-ray computed tomography. The framework integrates both geometrical and physics-based constraints to estimate the best ultrasound acquisition trajectories with respect to the available acoustic windows. We evaluate the developed method using virtual planning scenarios based on real patient data as well as for real robotic ultrasound acquisitions on a tissue-mimicking phantom. The proposed method yields superior image quality in comparison with a naive planning approach, while maintaining the necessary coverage of the target. We demonstrate that by taking image formation properties into account acquisition planning methods can outperform naive plannings. Furthermore, we show the need for such planning techniques, since naive approaches are not sufficient as they do not take the expected image quality into account.
Schüpbach, Jörg; Gebhardt, Martin D.; Scherrer, Alexandra U.; Bisset, Leslie R.; Niederhauser, Christoph; Regenass, Stephan; Yerly, Sabine; Aubert, Vincent; Suter, Franziska; Pfister, Stefan; Martinetti, Gladys; Andreutti, Corinne; Klimkait, Thomas; Brandenberger, Marcel; Günthard, Huldrych F.
2013-01-01
Background Tests for recent infections (TRIs) are important for HIV surveillance. We have shown that a patient's antibody pattern in a confirmatory line immunoassay (Inno-Lia) also yields information on time since infection. We have published algorithms which, with a certain sensitivity and specificity, distinguish between incident (< = 12 months) and older infection. In order to use these algorithms like other TRIs, i.e., based on their windows, we now determined their window periods. Methods We classified Inno-Lia results of 527 treatment-naïve patients with HIV-1 infection < = 12 months according to incidence by 25 algorithms. The time after which all infections were ruled older, i.e. the algorithm's window, was determined by linear regression of the proportion ruled incident as a function of time since infection. Window-based incident infection rates (IIR) were determined utilizing the relationship ‘Prevalence = Incidence x Duration’ in four annual cohorts of HIV-1 notifications. Results were compared to performance-based IIR also derived from Inno-Lia results, but utilizing the relationship ‘incident = true incident + false incident’, and also to the IIR derived from the BED incidence assay. Results Window periods varied between 45.8 and 130.1 days and correlated well with the algorithms' diagnostic sensitivity (R2 = 0.962; P<0.0001). Among the 25 algorithms, the mean window-based IIR among the 748 notifications of 2005/06 was 0.457 compared to 0.453 obtained for performance-based IIR with a model not correcting for selection bias. Evaluation of BED results using a window of 153 days yielded an IIR of 0.669. Window-based IIR and performance-based IIR increased by 22.4% and 30.6%, respectively, in 2008, while 2009 and 2010 showed a return to baseline for both methods. Conclusions IIR estimations by window- and performance-based evaluations of Inno-Lia algorithm results were similar and can be used together to assess IIR changes between annual HIV notification cohorts. PMID:23990968
Vendemia, Nicholas; Chao, Jerry; Ivanidze, Jana; Sanelli, Pina; Spinelli, Henry M
2011-01-01
Medpor (Porex Surgical, Inc, Newnan, GA) is composed of porous polyethylene and is commonly used in craniofacial reconstruction. When complications such as seroma or abscess formation arise, diagnostic modalities are limited because Medpor is radiolucent on conventional radiologic studies. This poses a problem in situations where imaging is necessary to distinguish the implant from surrounding tissues. The aim was to present a clinically useful method for imaging Medpor with conventional computed tomographic (CT) scanning. Eleven patients (12 total implants) who had undergone reconstructive surgery with Medpor were included in the study. A retrospective review of CT scans done between 1 and 16 months postoperatively was performed using 3 distinct CT window settings. Measurements of implant dimensions and Hounsfield units were recorded and qualitatively assessed. Of the 3 distinct window settings studied, namely, "bone" (W1100/L450), "soft tissue" (W500/L50), and "implant" (W800/L200), the implant window proved the most suitable, allowing the investigators to visualize and evaluate Medpor in all cases. Qualitative analysis revealed that Medpor implants were able to be distinguished from surrounding tissue in both the implant and soft tissue windows, with a density falling between that of fat and fluid. In 1 case, Medpor could not be visualized in the soft tissue window, although it could be visualized in the implant window. Quantitative analysis demonstrated a mean (SD) density of -38.7 (7.4) Hounsfield units. Medpor may be optimally visualized on conventional CT scans using the implant window settings W800/L200, which can aid in imaging Medpor and diagnosing implant-related complications.
NASA Astrophysics Data System (ADS)
Yao, Hua-Dong; Davidson, Lars
2018-03-01
We investigate the interior noise caused by turbulent flows past a generic side-view mirror. A rectangular glass window is placed downstream of the mirror. The window vibration is excited by the surface pressure fluctuations and emits the interior noise in a cuboid cavity. The turbulent flows are simulated using a compressible large eddy simulation method. The window vibration and interior noise are predicted with a finite element method. The wavenumber-frequency spectra of the surface pressure fluctuations are analyzed. The spectra exhibit new features that cannot be explained by the Chase model for turbulent boundary layers. The spectra contain a minor hydrodynamic domain in addition to the hydrodynamic domain caused by the main convection of the turbulent boundary layer. The minor domain results from the local convection of the recirculating flow. These domains take bent elliptic shapes; the spanwise expansion of the wake is found to cause the bending. Based on the wavenumber-frequency relationships in the spectra, the surface pressure fluctuations are decomposed into hydrodynamic and acoustic components. The acoustic component is more efficient in the generation of the interior noise than the hydrodynamic component. However, the hydrodynamic component is still dominant at low frequencies below approximately 250 Hz since it has low transmission losses near the hydrodynamic critical frequency of the window. The structural modes of the window determine the low-frequency interior tonal noise. The combination of the mode shapes of the window and cavity greatly affects the magnitude distribution of the interior noise.
Method and apparatus for monitoring the flow of mercury in a system
Grossman, M.W.
1987-12-15
An apparatus and method for monitoring the flow of mercury in a system are disclosed. The equipment enables the entrainment of the mercury in a carrier gas e.g., an inert gas, which passes as mercury vapor between a pair of optically transparent windows. The attenuation of the emission is indicative of the quantity of mercury (and its isotopes) in the system. A 253.7 nm light is shone through one of the windows and the unabsorbed light is detected through the other window. The absorption of the 253.7 nm light is thereby measured whereby the quantity of mercury passing between the windows can be determined. The apparatus includes an in-line sensor for measuring the quantity of mercury. It includes a conduit together with a pair of apertures disposed in a face to face relationship and arranged on opposite sides of the conduit. A pair of optically transparent windows are disposed upon a pair of viewing tubes. A portion of each of the tubes is disposed inside of the conduit and within each of the apertures. The two windows are disposed in a face to face relationship on the ends of the viewing tubes and the entire assembly is hermetically sealed from the atmosphere whereby when 253.7 nm ultraviolet light is shone through one of the windows and detected through the other, the quantity of mercury which is passing by can be continuously monitored due to absorption which is indicated by attenuation of the amplitude of the observed emission. 4 figs.
Sakao, Yukinori; Kuroda, Hiroaki; Mun, Mingyon; Uehara, Hirofumi; Motoi, Noriko; Ishikawa, Yuichi; Nakagawa, Ken; Okumura, Sakae
2014-01-01
Background We aimed to clarify whether the size of lung adenocarcinoma evaluated using the mediastinal window on computed tomography is an important and useful measure for predicting invasiveness, lymph node metastasis and prognosis in small adenocarcinoma. Methods We evaluated 176 patients with small lung adenocarcinomas (diameter, 1–3 cm) who underwent standard surgical resection. Tumours were examined using computed tomography with thin-section conditions (1.25 mm slices on high-resolution computed tomography), with tumour dimensions evaluated under two settings: lung window and mediastinal window. We also determined the patient age, gender, preoperative nodal status, tumour size, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and pathological status (lymphatic vessel, vascular vessel or pleural invasion). Recurrence-free survival was used for prognosis. Results Lung window, mediastinal window, tumour disappearance ratio and preoperative nodal status were significant predictive factors for recurrence-free survival in univariate analyses. Areas under the receiver operator curves for recurrence were 0.76, 0.73 and 0.65 for mediastinal window, tumour disappearance ratio and lung window, respectively. Lung window, mediastinal window, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and preoperative nodal status were significant predictive factors for lymph node metastasis in univariate analyses; areas under the receiver operator curves were 0.61, 0.76, 0.72 and 0.66 for lung window, mediastinal window, tumour disappearance ratio and preoperative serum carcinoembryonic antigen levels, respectively. Lung window, mediastinal window, tumour disappearance ratio, preoperative serum carcinoembryonic antigen levels and preoperative nodal status were significant factors for lymphatic vessel, vascular vessel or pleural invasion in univariate analyses; areas under the receiver operator curves were 0.60, 0.81, 0.81 and 0.65 for lung window, mediastinal window, tumour disappearance ratio and preoperative serum carcinoembryonic antigen levels, respectively. Conclusions Based on the univariate analyses, and on the logistic regression and ROC analyses performed for variables with p-values of <0.05 in those analyses, our results suggest that measuring tumour size using the mediastinal window on high-resolution computed tomography is a simple and useful preoperative prognostic measure in small adenocarcinoma. PMID:25365326
Prediction of CpG-island function: CpG clustering vs. sliding-window methods
2010-01-01
Background Unmethylated stretches of CpG dinucleotides (CpG islands) are an outstanding property of mammal genomes. Conventionally, these regions are detected by sliding window approaches using %G + C, CpG observed/expected ratio and length thresholds as main parameters. Recently, clustering methods directly detect clusters of CpG dinucleotides as a statistical property of the genome sequence. Results We compare sliding-window to clustering (i.e. CpGcluster) predictions by applying new ways to detect putative functionality of CpG islands. Analyzing the co-localization with several genomic regions as a function of window size vs. statistical significance (p-value), CpGcluster shows a higher overlap with promoter regions and highly conserved elements, at the same time showing less overlap with Alu retrotransposons. The major difference in the prediction was found for short islands (CpG islets), often exclusively predicted by CpGcluster. Many of these islets seem to be functional, as they are unmethylated, highly conserved and/or located within the promoter region. Finally, we show that window-based islands can spuriously overlap several, differentially regulated promoters as well as different methylation domains, which might indicate a wrong merge of several CpG islands into a single, very long island. The shorter CpGcluster islands seem to be much more specific when concerning the overlap with alternative transcription start sites or the detection of homogenous methylation domains. Conclusions The main difference between sliding-window approaches and clustering methods is the length of the predicted islands. Short islands, often differentially methylated, are almost exclusively predicted by CpGcluster. This suggests that CpGcluster may be the algorithm of choice to explore the function of these short, but putatively functional CpG islands. PMID:20500903
Towards component-based validation of GATE: aspects of the coincidence processor.
Moraes, Eder R; Poon, Jonathan K; Balakrishnan, Karthikayan; Wang, Wenli; Badawi, Ramsey D
2015-02-01
GATE is public domain software widely used for Monte Carlo simulation in emission tomography. Validations of GATE have primarily been performed on a whole-system basis, leaving the possibility that errors in one sub-system may be offset by errors in others. We assess the accuracy of the GATE PET coincidence generation sub-system in isolation, focusing on the options most closely modeling the majority of commercially available scanners. Independent coincidence generators were coded by teams at Toshiba Medical Research Unit (TMRU) and UC Davis. A model similar to the Siemens mCT scanner was created in GATE. Annihilation photons interacting with the detectors were recorded. Coincidences were generated using GATE, TMRU and UC Davis code and results compared to "ground truth" obtained from the history of the photon interactions. GATE was tested twice, once with every qualified single event opening a time window and initiating a coincidence check (the "multiple window method"), and once where a time window is opened and a coincidence check initiated only by the first single event to occur after the end of the prior time window (the "single window method"). True, scattered and random coincidences were compared. Noise equivalent count rates were also computed and compared. The TMRU and UC Davis coincidence generators agree well with ground truth. With GATE, reasonable accuracy can be obtained if the single window method option is chosen and random coincidences are estimated without use of the delayed coincidence option. However in this GATE version, other parameter combinations can result in significant errors. Copyright © 2014 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
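For illustration, the single window sorting policy described above can be sketched in a few lines of Python. This is not GATE code; the singles stream, the window length tau, and the acceptance rule (keep only windows containing exactly two singles) are illustrative assumptions.

```python
import numpy as np

def single_window_coincidences(timestamps, tau):
    """Pair singles with the 'single window' policy: a window opens only at the
    first single that arrives after the previous window has closed."""
    coincidences = []
    i, n = 0, len(timestamps)
    while i < n:
        t0 = timestamps[i]
        j = i + 1
        while j < n and timestamps[j] - t0 <= tau:   # singles inside the open window
            j += 1
        group = timestamps[i:j]
        if len(group) == 2:                          # exactly two singles -> one coincidence
            coincidences.append((group[0], group[1]))
        i = j                                        # the next window opens after this one closes
    return coincidences

singles = np.sort(np.random.uniform(0.0, 1e6, 5000))   # toy singles stream (ns), assumed
print(len(single_window_coincidences(singles, tau=4.0)))
```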
Johnson, Christine M; Sullivan, Jess; Buck, Cara L; Trexel, Julie; Scarpuzzi, Mike
2015-01-01
Anticipating the location of a temporarily obscured target-what Piaget (the construction of reality in the child. Basic Books, New York, 1954) called "object permanence"-is a critical skill, especially in hunters of mobile prey. Previous research with bottlenose dolphins found they could predict the location of a target that had been visibly displaced into an opaque container, but not one that was first placed in an opaque container and then invisibly displaced to another container. We tested whether, by altering the task to involve occlusion rather than containment, these animals could show more advanced object permanence skills. We projected dynamic visual displays at an underwater-viewing window and videotaped the animals' head moves while observing these displays. In Experiment 1, the animals observed a small black disk moving behind occluders that shifted in size, ultimately forming one large occluder. Nine out of ten subjects "tracked" the presumed movement of the disk behind this occluder on their first trial-and in a statistically significant number of subsequent trials-confirming their visible displacement abilities. In Experiment 2, we tested their invisible displacement abilities. The disk first disappeared behind a pair of moving occluders, which then moved behind a stationary occluder. The moving occluders then reappeared and separated, revealing that the disk was no longer behind them. The subjects subsequently looked to the correct stationary occluder on eight of their ten first trials, and in a statistically significant number of subsequent trials. Thus, by altering the stimuli to be more ecologically valid, we were able to show that the dolphins could indeed succeed at an invisible displacement task.
Leyde, Brian P.; Klein, Sanford A; Nellis, Gregory F.; Skye, Harrison
2017-01-01
This paper presents a new method called the Crossed Contour Method for determining the effective properties (borehole radius and ground thermal conductivity) of a vertical ground-coupled heat exchanger. The borehole radius is used as a proxy for the overall borehole thermal resistance. The method has been applied to both simulated and experimental borehole Thermal Response Test (TRT) data using the Duct Storage vertical ground heat exchanger model implemented in the TRansient SYstems Simulation software (TRNSYS). The Crossed Contour Method generates a parametric grid of simulated TRT data for different combinations of borehole radius and ground thermal conductivity in a series of time windows. The error between the average of the simulated and experimental bore field inlet and outlet temperatures is calculated for each set of borehole properties within each time window. Using these data, contours of the minimum error are constructed in the parameter space of borehole radius and ground thermal conductivity. When all of the minimum error contours for each time window are superimposed, the point where the contours cross (intersect) identifies the effective borehole properties for the model that most closely represents the experimental data in every time window and thus over the entire length of the experimental data set. The computed borehole properties are compared with results from existing model inversion methods including the Ground Property Measurement (GPM) software developed by Oak Ridge National Laboratory, and the Line Source Model. PMID:28785125
NASA Astrophysics Data System (ADS)
Jorge, Marco G.; Brennand, Tracy A.
2017-07-01
Relict drumlin and mega-scale glacial lineation (positive relief, longitudinal subglacial bedforms - LSBs) morphometry has been used as a proxy for paleo ice-sheet dynamics. LSB morphometric inventories have relied on manual mapping, which is slow and subjective and thus potentially difficult to reproduce. Automated methods are faster and reproducible, but previous methods for LSB semi-automated mapping have not been highly successful. Here, two new object-based methods for the semi-automated extraction of LSBs (footprints) from digital terrain models are compared in a test area in the Puget Lowland, Washington, USA. As segmentation procedures to create LSB-candidate objects, the normalized closed contour method relies on the contouring of a normalized local relief model addressing LSBs on slopes, and the landform elements mask method relies on the classification of landform elements derived from the digital terrain model. For identifying which LSB-candidate objects correspond to LSBs, both methods use the same LSB operational definition: a ruleset encapsulating expert knowledge, published morphometric data, and the morphometric range of LSBs in the study area. The normalized closed contour method was separately applied to four different local relief models, two computed in moving windows and two hydrology-based. Overall, the normalized closed contour method outperformed the landform elements mask method. The normalized closed contour method applied to a hydrology-based relief model derived from a multiple direction flow routing algorithm performed best. For an assessment of its transferability, the normalized closed contour method was evaluated on a second area, the Chautauqua drumlin field, Pennsylvania and New York, USA, where it performed better than in the Puget Lowland. A broad comparison to previous methods suggests that the normalized closed contour method may be the most capable method to date, but more development is required.
Rusterholz, Thomas; Achermann, Peter; Dürr, Roland; Koenig, Thomas; Tarokh, Leila
2017-06-01
Investigating functional connectivity between brain networks has become an area of interest in neuroscience. Several methods for investigating connectivity have recently been developed; however, these techniques need to be applied with care. We demonstrate that global field synchronization (GFS), a global measure of phase alignment in the EEG as a function of frequency, must be applied considering signal processing principles in order to yield valid results. Multichannel EEG (27 derivations) was analyzed for GFS based on the complex spectrum derived by the fast Fourier transform (FFT). We examined the effect of window functions on GFS, in particular of non-rectangular windows. Applying a rectangular window when calculating the FFT revealed high GFS values for high frequencies (>15 Hz) that were highly correlated (r=0.9) with spectral power in the lower frequency range (0.75-4.5 Hz) and tracked the depth of sleep. This turned out to be spurious synchronization. With a non-rectangular window (Tukey or Hanning window), this high-frequency synchronization vanished. Both GFS and power density spectra differed significantly between rectangular and non-rectangular windows. Previous papers using GFS typically did not specify the applied window and may have used a rectangular window function. However, the demonstrated impact of the window function raises the question of the validity of some previous findings at higher frequencies. We demonstrated that it is crucial to apply an appropriate window function for determining synchronization measures based on a spectral approach to avoid spurious synchronization in the beta/gamma range. Copyright © 2017 Elsevier B.V. All rights reserved.
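The windowing pitfall described above can be reproduced with a minimal sketch: a toy epoch containing a strong slow wave and a weak 30 Hz component is transformed with a rectangular and with a Hanning window, and only the rectangular case leaks low-frequency power into the >15 Hz band. This is illustrative only and is not the authors' GFS computation; the sampling rate, epoch length, and signal are assumptions.

```python
import numpy as np

fs = 128.0                                  # sampling rate in Hz (assumed)
t = np.arange(0, 4, 1 / fs)                 # one 4-s epoch
x = 50 * np.sin(2 * np.pi * 1.1 * t) + 1 * np.sin(2 * np.pi * 30.0 * t)

freqs = np.fft.rfftfreq(len(t), 1 / fs)
spec_rect = np.abs(np.fft.rfft(x)) ** 2                       # rectangular window
spec_hann = np.abs(np.fft.rfft(x * np.hanning(len(x)))) ** 2  # Hanning window

band = freqs > 15.0                         # the >15 Hz band discussed above
print(spec_rect[band].sum(), spec_hann[band].sum())   # large leakage vs. almost none
```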
NASA Astrophysics Data System (ADS)
Bunai, Tasya; Rokhmatuloh; Wibowo, Adi
2018-05-01
In this paper, two methods to retrieve the Land Surface Temperature (LST) from thermal infrared data supplied by bands 10 and 11 of the Thermal Infrared Sensor (TIRS) onboard Landsat 8 are compared. The first is the mono-window algorithm developed by Qin et al. and the second is the split-window algorithm by Rozenstein et al. The purpose of this study is to map the spatial distribution of land surface temperature and to determine the more accurate retrieval algorithm based on the calculated root mean square error (RMSE). Finally, we compare the spatial distribution of land surface temperature obtained by both algorithms; the split-window algorithm is more accurate, with an RMSE of 7.69 °C.
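A hedged sketch of the generic split-window form is given below: the land surface temperature is estimated from the band 10 brightness temperature plus a correction proportional to the band 10/band 11 difference. The coefficients c0 and c1 are arbitrary placeholders, not the values of the Rozenstein et al. algorithm, which derives its coefficients from surface emissivity and atmospheric water vapour.

```python
import numpy as np

def split_window_lst(t10_k, t11_k, c0=1.0, c1=2.0):
    """Toy split-window combination of TIRS band 10/11 brightness temperatures (K).
    c0 and c1 are placeholders, not validated coefficients."""
    return t10_k + c1 * (t10_k - t11_k) + c0

t10 = np.array([295.2, 301.7])   # illustrative brightness temperatures (K)
t11 = np.array([293.8, 299.9])
print(split_window_lst(t10, t11) - 273.15)   # LST in degrees Celsius
```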
Development of a collapsible reinforced cylindrical space observation window
NASA Technical Reports Server (NTRS)
Khan, A. Q.
1971-01-01
Existing material technology was applied to the development of a collapsible transparent window suitable for manned spacecraft structures. The effort reported encompasses the evaluation of flame retardants intended for use in the window matrix polymer, evaluation of reinforcement angle which would allow for a twisting pantographing motion as the cylindrical window is mechanically collapsed upon itself, and evaluation of several reinforcement embedment methods. A fabrication technique was developed to produce a reinforced cylindrical space window of 45.7 cm diameter and 61.0 cm length. The basic technique involved the application of a clear film on a male-section mold; winding axial and girth reinforcements and vacuum casting the outer layer. The high-strength transparent window composite consisted of a polyether urethane matrix reinforced with an orthogonal pattern of black-coated carbon steel wire cable. A thin film of RTV silicone rubber was applied to both surfaces of the urethane. The flexibility, retraction system, and installation system are described.
NASA Astrophysics Data System (ADS)
Watanabe, Koji; Matsuno, Kenichi
This paper presents a new method for simulating flows driven by a body traveling with no restriction on its motion and no limit on the region size. In the present method, named the 'Moving Computational Domain Method', the whole computational domain, including the bodies inside it, moves in physical space without a limit on region size. Since the entire grid of the computational domain moves according to the movement of the body, the flow solver has to be constructed on a moving grid system and must satisfy the physical and geometric conservation laws simultaneously on the moving grid. For this purpose, the Moving-Grid Finite-Volume Method is employed as the flow solver. The present Moving Computational Domain Method makes it possible to simulate flow driven by any kind of body motion in a region of any size while satisfying the physical and geometric conservation laws simultaneously. In this paper, the method is applied to the flow around a high-speed car passing through a hairpin curve. The distinctive flow field driven by the car at the hairpin curve is demonstrated in detail. The results show the promising features of the method.
An embedded laser marking controller based on ARM and FPGA processors.
Dongyun, Wang; Xinpiao, Ye
2014-01-01
Laser marking is an important branch of laser information processing technology. Existing laser marking machines based on a PC and the Windows operating system are large and inconvenient to move, and they cannot work outdoors or in other harsh environments. To compensate for these disadvantages, this paper proposes an embedded laser marking controller based on ARM and FPGA processors. Based on the principle of laser galvanometer scanning marking, the hardware and software were designed for the application. Experiments showed that this new embedded laser marking controller controls the galvanometers synchronously and can achieve precise marking.
NASA TileWorld manual (system version 2.2)
NASA Technical Reports Server (NTRS)
Philips, Andrew B.; Bresina, John L.
1991-01-01
The commands of the NASA TileWorld simulator are documented, along with information about how to run and extend it. The simulator, implemented in Common Lisp with Common Windows, encodes a particular range in a spectrum of domains for controllable research experiments. TileWorld consists of a two-dimensional grid of cells, a set of polygonal tiles, and a single agent which can grasp and move tiles. In addition to agent-executable actions, there is an external event over which the agent has no control; this event corresponds to a 'gust of wind'.
2001-08-08
KODIAK ISLAND, Alaska -- The Sapphire payload is moved into position next to the Starshine 3 payload at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5 p.m. to 7 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
Windowed Multitaper Correlation Analysis of Multimodal Brain Monitoring Parameters
Proescholdt, Martin A.; Bele, Sylvia; Brawanski, Alexander
2015-01-01
Although multimodal monitoring sets the standard in daily practice of neurocritical care, problem-oriented analysis tools to interpret the huge amount of data are lacking. Recently, a mathematical model was presented that simulates cerebral perfusion and oxygen supply in the case of severe head trauma, predicting the appearance of distinct correlations between arterial blood pressure and intracranial pressure. In this study we present a set of mathematical tools that reliably detect the predicted correlations in data recorded at a neurocritical care unit. The time-resolved correlations are identified by a windowing technique combined with Fourier-based coherence calculations. The phasing of the data is detected by means of the Hilbert phase difference within the above-mentioned windows. A statistical testing method is introduced that allows tuning the parameters of the windowing method in such a way that a predefined accuracy is reached. With this method, the data of fifteen patients were examined, and the predicted correlation was found in each patient. Additionally, it could be shown that the occurrence of a distinct correlation parameter, called scp, is a high-quality predictor of the patient's outcome. PMID:25821507
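A minimal sketch of the windowing idea is shown below: the recording is cut into overlapping windows and a Fourier-based coherence between arterial blood pressure (ABP) and intracranial pressure (ICP) is computed per window. The window length, hop size, and toy signals are assumptions; the study's multitaper estimation and Hilbert phase analysis are not reproduced.

```python
import numpy as np
from scipy.signal import coherence

fs = 1.0                                          # one sample per second (assumed)
t = np.arange(0, 3600)                            # one hour of monitoring
abp = np.sin(2 * np.pi * t / 60) + 0.3 * np.random.randn(len(t))
icp = 0.5 * np.sin(2 * np.pi * t / 60 + 0.4) + 0.3 * np.random.randn(len(t))

window, step = 600, 300                           # 10-min windows, 5-min hop (assumed)
for start in range(0, len(t) - window + 1, step):
    f, coh = coherence(abp[start:start + window], icp[start:start + window],
                       fs=fs, nperseg=128)
    print(start, round(coh[np.argmin(np.abs(f - 1 / 60))], 3))   # coherence near 1/60 Hz
```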
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polewko-Klim, A., E-mail: anetapol@uwb.edu.pl; Uba, S.; Uba, L.
2014-07-15
A solution to the problem of the disturbing effect of the background Faraday rotation in the cryostat windows on the longitudinal magneto-optical Kerr effect (LMOKE) measured under vacuum conditions and/or at low temperatures is proposed. The method for eliminating the influence of Faraday rotation in cryostat windows is based on a special arrangement of additional mirrors placed on the sample holder. In this arrangement, the orientation of the cryostat window is perpendicular to the light beam direction and parallel to an external magnetic field generated by the H-frame electromagnet. The operation of the LMOKE magnetometer with the special sample holder, based on a polarization modulation technique with a photo-elastic modulator, is theoretically analyzed with the use of Jones matrices, and formulas for evaluating the actual Kerr rotation and ellipticity of the sample are derived. The feasibility of the method and good performance of the magnetometer is experimentally demonstrated for the LMOKE effect measured in Fe/Au multilayer structures. The influence of imperfect alignment of the magnetometer setup on the Kerr angles, as derived theoretically through the analytic model and verified experimentally, is examined and discussed.
Yoon, Jai-Woong; Sawant, Amit; Suh, Yelin; Cho, Byung-Chul; Suh, Tae-Suk; Keall, Paul
2011-07-01
In dynamic multileaf collimator (MLC) motion tracking with complex intensity-modulated radiation therapy (IMRT) fields, target motion perpendicular to the MLC leaf travel direction can cause beam holds, which increase beam delivery time by up to a factor of 4. As a means to balance delivery efficiency and accuracy, a moving average algorithm was incorporated into a dynamic MLC motion tracking system (i.e., moving average tracking) to account for target motion perpendicular to the MLC leaf travel direction. The experimental investigation of the moving average algorithm compared with real-time tracking and no compensation beam delivery is described. The properties of the moving average algorithm were measured and compared with those of real-time tracking (dynamic MLC motion tracking accounting for both target motion parallel and perpendicular to the leaf travel direction) and no compensation beam delivery. The algorithm was investigated using a synthetic motion trace with a baseline drift and four patient-measured 3D tumor motion traces representing regular and irregular motions with varying baseline drifts. Each motion trace was reproduced by a moving platform. The delivery efficiency, geometric accuracy, and dosimetric accuracy were evaluated for conformal, step-and-shoot IMRT, and dynamic sliding window IMRT treatment plans using the synthetic and patient motion traces. The dosimetric accuracy was quantified via a gamma-test with a 3%/3 mm criterion. The delivery efficiency ranged from 89 to 100% for moving average tracking, 26%-100% for real-time tracking, and 100% (by definition) for no compensation. The root-mean-square geometric error ranged from 3.2 to 4.0 mm for moving average tracking, 0.7-1.1 mm for real-time tracking, and 3.7-7.2 mm for no compensation. The percentage of dosimetric points failing the gamma-test ranged from 4 to 30% for moving average tracking, 0%-23% for real-time tracking, and 10%-47% for no compensation. The delivery efficiency of moving average tracking was up to four times higher than that of real-time tracking and approached the efficiency of no compensation for all cases. The geometric accuracy and dosimetric accuracy of the moving average algorithm were between those of real-time tracking and no compensation, with approximately half the percentage of dosimetric points failing the gamma-test compared with no compensation.
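The moving average idea can be sketched as follows: instead of following the raw target position in the direction perpendicular to leaf travel, the aperture follows a causal running mean, trading a small geometric lag for fewer beam holds. The window length, update rate, and motion trace below are illustrative assumptions, not the parameters used in the study.

```python
import numpy as np

def moving_average(trace, window=30):
    """Causal running mean of the target trace (one value per position update)."""
    out = np.empty(len(trace))
    for i in range(len(trace)):
        out[i] = trace[max(0, i - window + 1):i + 1].mean()
    return out

t = np.arange(0, 60, 0.1)                             # 60-s trace at 10 Hz (assumed)
target = 5 * np.sin(2 * np.pi * t / 4) + 0.05 * t     # breathing-like motion + drift (mm)
smoothed = moving_average(target)                     # the aperture follows this instead
print(np.abs(target - smoothed).mean())               # mean geometric lag (mm)
```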
Batch production of microchannel plate photo-multipliers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frisch, Henry J.; Wetstein, Matthew; Elagin, Andrey
In-situ methods for the batch fabrication of flat-panel micro-channel plate (MCP) photomultiplier tube (PMT) detectors (MCP-PMTs), without transporting either the window or the detector assembly inside a vacuum vessel are provided. The method allows for the synthesis of a reflection-mode photocathode on the entrance to the pores of a first MCP or the synthesis of a transmission-mode photocathode on the vacuum side of a photodetector entrance window.
NASA Astrophysics Data System (ADS)
Işık, Şahin; Özkan, Kemal; Günal, Serkan; Gerek, Ömer Nezih
2018-03-01
Change detection with background subtraction remains an unresolved issue and attracts research interest due to the challenges encountered in static and dynamic scenes. The key challenge is how to update dynamically changing backgrounds from frames with an adaptive and self-regulated feedback mechanism. To achieve this, we present an effective change detection algorithm for pixelwise changes. A sliding window approach combined with dynamic control of update parameters is introduced for updating background frames, which we call sliding window-based change detection. Comprehensive experiments on related test videos show that the integrated algorithm yields good objective and subjective performance by overcoming illumination variations, camera jitter, and intermittent object motion. It is argued that the proposed method is a fair alternative in most types of foreground extraction scenarios, unlike case-specific methods, which normally fail in scenarios they were not designed for.
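A simplified per-pixel stand-in for sliding window background maintenance is sketched below: the background estimate is the mean of the last N frames and a pixel is flagged as changed when it deviates from that estimate by more than a threshold. The adaptive, self-regulated update of the published algorithm is not reproduced; window length and threshold are assumptions.

```python
from collections import deque
import numpy as np

def detect_changes(frames, window=25, threshold=30.0):
    """Flag pixels deviating from the mean of the previous 'window' frames."""
    history = deque(maxlen=window)
    masks = []
    for frame in frames:
        if history:
            background = np.mean(history, axis=0)
            masks.append(np.abs(frame - background) > threshold)
        else:
            masks.append(np.zeros(frame.shape, dtype=bool))
        history.append(frame.astype(float))
    return masks

frames = [np.random.randint(0, 256, (48, 64)).astype(float) for _ in range(50)]
print(detect_changes(frames)[-1].mean())   # fraction of pixels flagged in the last frame
```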
Drug exposure in register-based research—An expert-opinion based evaluation of methods
Taipale, Heidi; Koponen, Marjaana; Tolppanen, Anna-Maija; Hartikainen, Sirpa; Ahonen, Riitta; Tiihonen, Jari
2017-01-01
Background In register-based pharmacoepidemiological studies, construction of drug exposure periods from drug purchases is a major methodological challenge. Various methods have been applied but their validity is rarely evaluated. Our objective was to conduct an expert-opinion based evaluation of the correctness of drug use periods produced by different methods. Methods Drug use periods were calculated with three fixed methods: time windows, assumption of one Defined Daily Dose (DDD) per day and one tablet per day, and with PRE2DUP that is based on modelling of individual drug purchasing behavior. Expert-opinion based evaluation was conducted with 200 randomly selected purchase histories of warfarin, bisoprolol, simvastatin, risperidone and mirtazapine in the MEDALZ-2005 cohort (28,093 persons with Alzheimer’s disease). Two experts reviewed purchase histories and judged which methods had joined correct purchases and gave correct duration for each of 1000 drug exposure periods. Results The evaluated correctness of drug use periods was 70–94% for PRE2DUP, and depending on grace periods and time window lengths 0–73% for tablet methods, 0–41% for DDD methods and 0–11% for time window methods. The highest rate of evaluated correct solutions for each method class were observed for 1 tablet per day with 180 days grace period (TAB_1_180, 43–73%), and 1 DDD per day with 180 days grace period (1–41%). Time window methods produced at maximum only 11% correct solutions. The best performing fixed method TAB_1_180 reached highest correctness for simvastatin 73% (95% CI 65–81%) whereas 89% (95% CI 84–94%) of PRE2DUP periods were judged as correct. Conclusions This study shows inaccuracy of fixed methods and the urgent need for new data-driven methods. In the expert-opinion based evaluation, the lowest error rates were observed with data-driven method PRE2DUP. PMID:28886089
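As an illustration of the fixed methods evaluated above, the following hedged sketch implements a "one tablet per day" rule with a grace period: each purchase covers as many days as tablets dispensed, and consecutive purchases are joined into one drug use period when the gap does not exceed the grace period. The data layout is an assumption, and the sketch is not the PRE2DUP model.

```python
from datetime import date, timedelta

def tablet_per_day_periods(purchases, grace_days=180):
    """purchases: list of (purchase_date, n_tablets) sorted by date."""
    periods = []
    for purchase_date, n_tablets in purchases:
        end = purchase_date + timedelta(days=int(n_tablets))   # one tablet per day
        if periods and (purchase_date - periods[-1][1]).days <= grace_days:
            periods[-1] = (periods[-1][0], max(periods[-1][1], end))   # join into the open period
        else:
            periods.append((purchase_date, end))                       # start a new period
    return periods

history = [(date(2016, 1, 4), 100), (date(2016, 4, 20), 100), (date(2017, 2, 1), 100)]
print(tablet_per_day_periods(history))   # two drug use periods for this toy history
```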
Windows Into the Real World From a Virtual Globe
NASA Astrophysics Data System (ADS)
Rich, J.; Urban-Rich, J.
2007-12-01
Virtual globes such as Google Earth can be great tools for learning about the geographical variation of the earth. The key to virtual globes is the use of satellite imagery to provide a highly accurate view of the earth's surface. However, because the images are not updated regularly, variations in climate and vegetation over time cannot be easily seen. In order to enhance the view of the earth and observe these changes by region and over time, we are working to add near real time "windows" into the real world from a virtual globe. For the past 4 years we have been installing web cameras in areas of the world that will provide long term monitoring of global changes. By archiving hourly images from arctic, temperate and tropical regions we are creating a visual data set that is already beginning to tell the story of climate variability. The cameras are currently installed in 10 elementary schools in 3 countries and show the students' view out each window. The Windows Around the World program (http://www.WindowsAroundTheWorld.org) uses the images from these cameras to help students gain a better understanding of earth processes and variability in climate and vegetation between different regions and over time. Previously we have used standard web based technologies such as DHTML and AJAX to provide near real-time access to these images and also provide enhanced functionality such as dynamic time lapse movies that allow users to see changes over months, days or hours up to the current hour (http://www.windowsaroundtheworld.org/north_america.aspx). We have integrated the camera images from Windows Around the World into Google Earth. Through network links and models we are creating a way for students to "fly" to another school in the program and see what the current view is out the window. By using a model as a screen, the image can be viewed from the same direction as the students who are sitting in a classroom at the participating school. Once at the school, visiting students can move around the area in three dimensions and gain a better understanding of what they are seeing out the window. Currently, time-lapse images can be viewed at a lower resolution for all schools on the globe, or, when flying into an individual school, higher-resolution time-lapse images can be seen. The observation of shadows, precipitation, movement of the sun and changes in vegetation allows the viewer to gain a better understanding of how the earth works and how the environment changes between regions and over time.
Siegel, Nisan; Rosen, Joseph; Brooker, Gary
2013-10-01
Recent advances in Fresnel incoherent correlation holography (FINCH) increase the signal-to-noise ratio in hologram recording by interference of images from two diffractive lenses with focal lengths close to the image plane. Holograms requiring short reconstruction distances are created that reconstruct poorly with existing Fresnel propagation methods. Here we show a dramatic improvement in reconstructed fluorescent images when a 2D Hamming window function substituted for the disk window typically used to bound the impulse response in the Fresnel propagation. Greatly improved image contrast and quality are shown for simulated and experimentally determined FINCH holograms using a 2D Hamming window without significant loss in lateral or axial resolution.
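The two bounding windows compared above can be constructed directly: a separable 2D Hamming window (outer product of two 1D Hamming windows) versus a hard-edged disk of the same support. Only the window construction is sketched; the Fresnel propagation and FINCH reconstruction themselves are not shown.

```python
import numpy as np

n = 512
hamming_2d = np.outer(np.hamming(n), np.hamming(n))      # separable 2D Hamming window

y, x = np.mgrid[0:n, 0:n]
r = np.hypot(x - (n - 1) / 2, y - (n - 1) / 2)
disk_2d = (r <= n / 2).astype(float)                     # hard-edged disk of the same support

# The Hamming window tapers smoothly toward zero at its edges, suppressing the
# ringing that the abrupt disk edge introduces at short reconstruction distances.
print(hamming_2d[n // 2, 0], disk_2d[n // 2, 0])
```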
Optical Distortion Evaluation in Large Area Windows using Interferometry
NASA Technical Reports Server (NTRS)
Youngquist, Robert C.; Skow, Miles; Nurge, Mark A.
2015-01-01
It is important that imagery seen through large area windows, such as those used on space vehicles, not be substantially distorted. Many approaches are described in the literature for measuring the distortion of an optical window, but most suffer from either poor resolution or processing difficulties. In this paper, a new definition of distortion is presented, allowing accurate measurement using an optical interferometer. This new definition is shown to be equivalent to the definitions provided by the military and the standards organizations. To determine the advantages and disadvantages of this new approach, the distortion of an acrylic window is measured using three different methods: image comparison, Moiré interferometry, and phase-shifting interferometry.
Alkaline battery operational methodology
Sholklapper, Tal; Gallaway, Joshua; Steingart, Daniel; Ingale, Nilesh; Nyce, Michael
2016-08-16
Methods of using specific operational charge and discharge parameters to extend the life of alkaline batteries are disclosed. The methods can be used with any commercial primary or secondary alkaline battery, as well as with newer alkaline battery designs, including batteries with flowing electrolyte. The methods include cycling batteries within a narrow operating voltage window, with minimum and maximum cut-off voltages that are set based on battery characteristics and environmental conditions. The narrow voltage window decreases available capacity but allows the batteries to be cycled for hundreds or thousands of times.
Mittmann, Philipp; Ernst, A; Mittmann, M; Todt, I
2016-11-01
To preserve residual hearing in cochlear implant candidates, the atraumatic insertion of the cochlear electrode has become a focus of cochlear implant research. In a previous study, intracochlear pressure changes during the opening of the round window membrane were investigated. In the current study, intracochlear pressure changes during opening of the round window membrane under dry and transfluid conditions were investigated. Round window openings were performed in an artificial cochlear model. Intracochlear pressure changes were measured using a micro-optical pressure sensor, which was placed in the apex. Openings of the round window membrane were performed under dry and wet conditions using a cannula and a diode laser. Statistically significant differences in the intracochlear pressure changes were seen between the different methods used for opening of the round window membrane. Lower pressure changes were seen when opening the round window membrane with the diode laser than with the cannula. A significant difference was seen between the dry and wet conditions. The atraumatic approach to the cochlea is assumed to be essential for the preservation of residual hearing. Opening the round window under wet conditions provides a significant advantage in terms of intracochlear pressure changes compared with dry conditions by limiting negative outward pressure.
NASA Astrophysics Data System (ADS)
Dwi Prastyo, Dedy; Handayani, Dwi; Fam, Soo-Fen; Puteri Rahayu, Santi; Suhartono; Luh Putu Satyaning Pradnya Paramita, Ni
2018-03-01
Risk assessment and evaluation is essential for financial institutions to measure the potential risk of their counterparties. From mid-2016 until the first quarter of 2017, the Indonesian government ran a national program known as the Tax Amnesty. One subsector with the potential to benefit from the Tax Amnesty program is property and real estate. This work evaluates the risk of the top five companies, in terms of capital share, listed on the Indonesia Stock Exchange (IDX). To do this, a Value-at-Risk (VaR) approach with an ARMAX-GARCHX model is employed. The ARMAX-GARCHX model simultaneously captures the adaptive mean and variance of each company's stock return, considering exogenous variables, i.e., the IDR/USD exchange rate and the Jakarta Composite Index (JCI). The risk is evaluated in a moving time window scheme. Risk evaluation using the 5% quantile with a window size of 500 transaction days performs better than the other scenarios. In addition, a duration test is used to test the dependency between shortfalls; it indicates that the series of shortfalls is independent.
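The moving-window evaluation scheme can be sketched with plain historical simulation, i.e. the empirical 5% quantile of the last 500 returns, as a simplified stand-in for the ARMAX-GARCHX quantile forecast; the exogenous regressors are omitted and the return series is synthetic.

```python
import numpy as np

def rolling_var(returns, window=500, alpha=0.05):
    """One-step-ahead VaR estimate for each day after the initial window."""
    return np.array([np.quantile(returns[i - window:i], alpha)
                     for i in range(window, len(returns))])

returns = np.random.standard_t(df=5, size=1500) * 0.01   # toy daily returns
var_series = rolling_var(returns)
violations = returns[500:] < var_series                  # shortfalls used in backtesting
print(var_series[-1], violations.mean())
```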
Retinoic acid temporally orchestrates colonization of the gut by vagal neural crest cells.
Uribe, Rosa A; Hong, Stephanie S; Bronner, Marianne E
2018-01-01
The enteric nervous system arises from neural crest cells that migrate as chains into and along the primitive gut, subsequently differentiating into enteric neurons and glia. Little is known about the mechanisms governing neural crest migration en route to and along the gut in vivo. Here, we report that Retinoic Acid (RA) temporally controls zebrafish enteric neural crest cell chain migration. In vivo imaging reveals that RA loss severely compromises the integrity and migration of the chain of neural crest cells during the time window when they are moving along the foregut. After loss of RA, enteric progenitors accumulate in the foregut and differentiate into enteric neurons, but subsequently undergo apoptosis, resulting in a striking neuronal deficit. Moreover, ectopic expression of the transcription factor meis3 and/or the receptor ret partially rescues enteric neuron colonization after RA attenuation. Collectively, our findings suggest that retinoic acid plays a critical temporal role in promoting enteric neural crest chain migration and neuronal survival upstream of Meis3 and RET in vivo. Copyright © 2017 Elsevier Inc. All rights reserved.
Stuetz, R M
2004-01-01
An online monitoring system based on an array of non-specific sensors was used for the detection of chemical pollutants in wastewater and water. By superimposing sensor profiles for defined sampling window, the identification of data points outside these normal sensor response patterns was used to represent potential pollution episodes or other abnormalities within the process stream. Principle component analysis supported the detection of outliers or rapid changes in the sensor responses as an indicator of chemical pollutants. A model based on the comparison of sensor relative responses to a moving average for a defined sample window was tested for detecting and identifying sudden changes in the online data over a 6-month period. These results show the technical advantages of using a non-specific based monitoring system that can respond to a range of chemical species, due to broad selectivity of the sensor compositions. The findings demonstrate how this non-invasive technique could be further developed to provide early warning systems for application at the inlet of wastewater treatment plants.
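A hedged sketch of the moving-average comparison is given below: each sensor's response is compared with its running mean over a defined sample window, and a large deviation on any sensor flags a potential pollution episode. The window length, threshold, and data are assumptions; the principal component analysis step is not included.

```python
import numpy as np

def flag_episodes(responses, window=60, k=4.0):
    """responses: (time, sensors) array; returns a boolean flag per time step."""
    flags = np.zeros(len(responses), dtype=bool)
    for i in range(window, len(responses)):
        baseline = responses[i - window:i].mean(axis=0)
        spread = responses[i - window:i].std(axis=0) + 1e-9
        flags[i] = np.any(np.abs(responses[i] - baseline) > k * spread)
    return flags

data = np.random.normal(size=(500, 8))       # toy responses of an 8-sensor array
data[300:310, 2] += 8.0                      # injected pollutant-like excursion on one sensor
print(np.where(flag_episodes(data))[0][:5])  # first flagged time steps
```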
Software for Viewing Landsat Mosaic Images
NASA Technical Reports Server (NTRS)
Watts, Zack; Farve, Catharine L.; Harvey, Craig
2003-01-01
A Windows-based computer program has been written to enable novice users (especially educators and students) to view images of large areas of the Earth (e.g., the continental United States) generated from image data acquired in the Landsat observations performed circa the year 1990. The large-area images are constructed as mosaics from the original Landsat images, which were acquired in several wavelength bands and each of which spans an area (in effect, one tile of a mosaic) of 0.5° in latitude by 0.6° in longitude. Whereas the original Landsat data are registered on a universal transverse Mercator (UTM) grid, the program converts the UTM coordinates of a mouse pointer in the image to latitude and longitude, which are continuously updated and displayed as the pointer is moved. The mosaic image currently on display can be exported as a Windows bitmap file. Other images (e.g., of state boundaries or interstate highways) can be overlaid on Landsat mosaics. The program interacts with the user via standard toolbar, keyboard, and mouse user interfaces. The program is supplied on a compact disk along with tutorial and educational information.
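The UTM-to-geographic conversion performed by the viewer can be sketched with the pyproj library (an assumption; the original program is a standalone Windows application, not Python). The UTM zone below, 14N (EPSG:32614), is also an assumption; in practice the zone of the displayed mosaic tile would be used.

```python
from pyproj import Transformer

# UTM zone 14N -> WGS84 geographic coordinates (zone choice is an assumption)
to_geographic = Transformer.from_crs("EPSG:32614", "EPSG:4326", always_xy=True)

easting, northing = 500000.0, 4649776.0      # toy pointer position in metres
lon, lat = to_geographic.transform(easting, northing)
print(f"lon={lon:.5f}, lat={lat:.5f}")
```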
Verch, Andreas; Pfaff, Marina; de Jonge, Niels
2015-06-30
Gold nanoparticles were observed to move at a liquid/solid interface 3 orders of magnitude slower than expected for the movement in a bulk liquid by Brownian motion. The nanoscale movement was studied with scanning transmission electron microscopy (STEM) using a liquid enclosure consisting of microchips with silicon nitride windows. The experiments involved a variation of the electron dose, the coating of the nanoparticles, the surface charge of the enclosing membrane, the viscosity, and the liquid thickness. The observed slow movement was not a result of hydrodynamic hindrance near a wall but instead explained by the presence of a layer of ordered liquid exhibiting a viscosity 5 orders of magnitude larger than a bulk liquid. The increased viscosity presumably led to a dramatic slowdown of the movement. The layer was formed as a result of the surface charge of the silicon nitride windows. The exceptionally slow motion is a crucial aspect of electron microscopy of specimens in liquid, enabling a direct observation of the movement and agglomeration of nanoscale objects in liquid.
Car glass microphones using piezoelectric transducers for external alarm detection and localization
NASA Astrophysics Data System (ADS)
Bolzmacher, Christian; Le Guelvouit, Valentin
2015-05-01
This work describes the potential use of car windows as a long range acoustic sensing device for external alarm signals. The goal is to detect and localize siren signals (e.g. ambulances and police cars) and to alert presbycusic drivers of its presence by visual and acoustic feedback in order to improve individual mobility and increase the sense of security. The glass panes of a Renault Zoé operating as an acoustic antenna have been equipped with large 50 mm outer diameter piezoceramic rings, hidden in the lower part of the door structure and the lower part of the windshield and the rear window. The response of the glass to quasi-static signals and sweep excitation has been recorded. In general, the glass pane is acting as a high pass filter due to its inherent stiffness and provides only little damping. This effect is compensated by using a charge amplifier electronic circuit. The detection capability up to 120 m as well as a dynamic test where the car is moving towards the sound source is reported.
Hubbard, Timothy L; Motes, Michael A
2005-08-01
Memory for the initial and final positions of moving targets was examined. When targets appeared adjacent to the boundary of a larger enclosing window, memory for initial position exhibited a Fröhlich effect (i.e., a displacement forward), and when distance of initial position from the boundary increased, memory for initial position exhibited a smaller Fröhlich effect or an onset repulsion effect (i.e., a displacement backward). When targets vanished adjacent to the boundary of a larger enclosing window, memory for final position was displaced backward, and when distance of final position from the boundary increased, memory for final position did not exhibit significant displacement. These patterns differed from previously reported displacements of initial and final positions of targets presented on a blank background. Possible influences of attention and extrapolation of trajectory on whether memory for initial position exhibits a Fröhlich effect or an onset repulsion effect and on backward displacement in memory for final position are discussed.
2006-08-28
KENNEDY SPACE CENTER, FLA. - Crawler-transporter No. 2 nears Launch Pad 39B (in the background, right). The tip of the orange external tank can be seen above the rotating service structure surrounding the shuttle. The crawler is being moved nearby in the event the mission management team decides to roll back Space Shuttle Atlantis due to Hurricane Ernesto. The hurricane has been forecast on a heading north and east from Cuba, taking it along the eastern coast of Florida. NASA's lighted launch window extends to Sept. 13, but mission managers are hoping to launch on mission STS-115 by Sept. 7 to avoid a conflict with a Russian Soyuz rocket also bound for the International Space Station. The crawler is 131 feet long, 113 feet wide and 20 feet high. It weighs 5.5 million pounds unloaded. The combined weight of crawler, mobile launcher platform and a space shuttle is 12 million pounds. Unloaded, the crawler moves at 2 mph. Loaded, the snail's pace slows to 1 mph. Photo credit: NASA/Kim Shiflett
2006-08-28
KENNEDY SPACE CENTER, FLA. - Crawler-transporter No. 2 makes its way toward Launch Pad 39B (in the background). The tip of the orange external tank can be seen above the rotating service structure surrounding the shuttle. The crawler is being moved nearby in the event the mission management team decides to roll back Space Shuttle Atlantis due to Hurricane Ernesto. The hurricane has been forecast on a heading north and east from Cuba, taking it along the eastern coast of Florida. NASA's lighted launch window extends to Sept. 13, but mission managers are hoping to launch on mission STS-115 by Sept. 7 to avoid a conflict with a Russian Soyuz rocket also bound for the International Space Station. The crawler is 131 feet long, 113 feet wide and 20 feet high. It weighs 5.5 million pounds unloaded. The combined weight of crawler, mobile launcher platform and a space shuttle is 12 million pounds. Unloaded, the crawler moves at 2 mph. Loaded, the snail's pace slows to 1 mph. Photo credit: NASA/Kim Shiflett
Aperture Synthesis Shows Perceptual Integration of Geometrical Form Across Saccades.
Schreiber, Kai; Morgan, Michael
2018-03-01
We investigated the perceptual bias in perceived relative lengths in the Brentano version of the Müller-Lyer arrowheads figure. The magnitude of the bias was measured both under normal whole-figure viewing condition and under an aperture viewing condition, where participants moved their gaze around the figure but could see only one arrowhead at a time through a Gaussian-weighted contrast window. The extent of the perceptual bias was similar under the two conditions. The stimuli were presented on a CRT in a light-proof room with room-lights off, but visual context was provided by a rectangular frame surrounding the figure. The frame was either stationary with respect to the figure or moved in such a manner that the bias would be counteracted if the observer were locating features with respect to the frame. Biases were reduced in the latter condition. We conclude that integration occurs over saccades, but largely in an external visual framework, rather than in a body-centered frame using an extraretinal signal.
Bzorgi, Fariborz M.
2015-05-19
In various embodiments an apparatus is presented for securing a structure such as a door, window, hatch, or gate that moves between an open and a closed position relative to a fixed structure to provide or deny access to a compartment, a room, an outdoor area, or a facility. Various embodiments provide a delay in opening the closure of sufficient duration to frustrate a rapid activation that might be desired by a person who is attempting to pass through the closure for some illicit purpose. Typically, hydraulics are used to activate the apparatus and no electrical energy or electronic signals are employed. In one embodiment, a plurality of actuations of a hand lever operates a hydraulic pump that moves a locking bolt from a first position in which a locking bolt is engaged with a recess in the fixed structure (preventing opening of a gate) to a second position in which the locking bolt is disengaged from the recess to permit opening of the gate.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xing, Yulong; Shu, Chi-wang; Noelle, Sebastian
This note aims at demonstrating the advantage of moving-water well-balanced schemes over still-water well-balanced schemes for the shallow water equations. We concentrate on numerical examples with solutions near a moving-water equilibrium. For such examples, still-water well-balanced methods are not capable of capturing the small perturbations of the moving-water equilibrium and may generate significant spurious oscillations, unless an extremely refined mesh is used. On the other hand, moving-water well-balanced methods perform well in these tests. The numerical examples in this note clearly demonstrate the importance of utilizing moving-water well-balanced methods for solutions near a moving-water equilibrium.
Gaussian windows: A tool for exploring multivariate data
NASA Technical Reports Server (NTRS)
Jaeckel, Louis A.
1990-01-01
Presented here is a method for interactively exploring a large set of quantitative multivariate data, in order to estimate the shape of the underlying density function. It is assumed that the density function is more or less smooth, but no other specific assumptions are made concerning its structure. The local structure of the data in a given region may be examined by viewing the data through a Gaussian window, whose location and shape are chosen by the user. A Gaussian window is defined by giving each data point a weight based on a multivariate Gaussian function. The weighted sample mean and sample covariance matrix are then computed, using the weights attached to the data points. These quantities are used to compute an estimate of the shape of the density function in the window region. The local structure of the data is described by a method similar to the method of principal components. By taking many such local views of the data, we can form an idea of the structure of the data set. The method is applicable in any number of dimensions. The method can be used to find and describe simple structural features such as peaks, valleys, and saddle points in the density function, and also extended structures in higher dimensions. With some practice, we can apply our geometrical intuition to these structural features in any number of dimensions, so that we can think about and describe the structure of the data. Since the computations involved are relatively simple, the method can easily be implemented on a small computer.
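A minimal sketch of a Gaussian window as described above: each data point is weighted by a multivariate Gaussian centred at a user-chosen location, and the weighted sample mean and covariance summarise the local structure; the eigenvectors of the covariance play the role of local principal components. The window centre and shape below are arbitrary choices for illustration.

```python
import numpy as np

def gaussian_window(data, center, window_cov):
    """Weighted mean and covariance of 'data' under a Gaussian window."""
    diff = data - center
    w = np.exp(-0.5 * np.einsum("ij,jk,ik->i", diff, np.linalg.inv(window_cov), diff))
    w /= w.sum()                                  # normalised weights
    mean = w @ data                               # weighted sample mean
    centred = data - mean
    cov = (centred * w[:, None]).T @ centred      # weighted sample covariance
    return mean, cov

rng = np.random.default_rng(0)
data = rng.normal(size=(2000, 3))                 # toy 3-D data cloud
mean, cov = gaussian_window(data, np.zeros(3), np.eye(3))
vals, vecs = np.linalg.eigh(cov)                  # local principal directions
print(mean, vals)
```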