Least Squares Moving-Window Spectral Analysis.
Lee, Young Jong
2017-08-01
Least squares regression is proposed as a moving-window method for analysis of a series of spectra acquired as a function of external perturbation. The least squares moving-window (LSMW) method can be considered an extended form of Savitzky-Golay differentiation for nonuniform perturbation spacing. LSMW is characterized in terms of moving-window size, perturbation spacing type, and intensity noise. Simulation results from LSMW are compared with results from other numerical differentiation methods, such as single-interval differentiation, autocorrelation moving-window, and perturbation correlation moving-window methods. It is demonstrated that this simple LSMW method can be useful for quantitative analysis of nonuniformly spaced spectral data with high-frequency noise.
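As a rough illustration of the idea, the sketch below fits a straight line to every wavenumber channel inside a moving window along the perturbation axis and takes its slope as the local derivative, which is the least-squares analogue of first-order Savitzky-Golay differentiation for unevenly spaced data. The array names, window half-width, and layout (`Y` with shape perturbation × wavenumber, nonuniform perturbation vector `x`) are illustrative assumptions, not the paper's code.

```python
import numpy as np

def lsmw_derivative(x, Y, half_width=5):
    """Least squares moving-window first derivative of spectra (sketch).

    x : 1D array of (possibly nonuniform) perturbation values, length n.
    Y : 2D array of spectra, shape (n, n_wavenumbers).
    Returns the local slope dY/dx at each interior perturbation point."""
    n = len(x)
    slopes = np.full_like(Y, np.nan, dtype=float)
    for i in range(half_width, n - half_width):
        sl = slice(i - half_width, i + half_width + 1)
        xw = x[sl] - x[sl].mean()                       # centered window abscissa
        # closed-form least squares slope for all wavenumber channels at once
        slopes[i] = (xw @ (Y[sl] - Y[sl].mean(axis=0))) / (xw @ xw)
    return slopes
```

Because the slope is computed from the actual perturbation values in each window, the same code handles uniform and nonuniform spacing.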
A refined technique to calculate finite helical axes from rigid body trackers.
McLachlin, Stewart D; Ferreira, Louis M; Dunning, Cynthia E
2014-12-01
Finite helical axes (FHAs) are a potentially effective tool for joint kinematic analysis. Unfortunately, no straightforward guidelines exist for calculating accurate FHAs using prepackaged six degree-of-freedom (6 DOF) rigid body trackers. Thus, this study aimed to: (1) describe a protocol for calculating FHA parameters from 6 DOF rigid body trackers using the screw matrix and (2) maximize the number of accurate FHAs generated from a given data set using a moving window analysis. Four Optotrak® Smart Markers were used as the rigid body trackers, two moving and two fixed, at different distances from the hinge joint of a custom-machined jig. 6 DOF pose information was generated from 51 static positions of the jig rotated and fixed in 0.5 deg increments up to 25 deg. Output metrics included the FHA direction cosines, the rotation about the FHA, the translation along the axis, and the intercept of the FHA with the plane normal to the jig's hinge joint. FHA metrics were calculated using the relative tracker rotation from the starting position, and using a moving window analysis to define a minimum acceptable rotational displacement between the moving tracker data points. Data analysis found all FHA rotations calculated from the starting position were within 0.15 deg of the prescribed jig rotation. FHA intercepts were most stable when determined using trackers closest to the hinge axis. Increasing the moving window size improved the FHA direction cosines and center of rotation accuracy. Window sizes larger than 2 deg had an intercept deviation of less than 1 mm. Furthermore, compared to the 0 deg window size, the 2 deg window had a 90% improvement in FHA intercept precision while generating almost an equivalent number of FHAs. This work identified a solution to improve FHA calculations for biomechanical researchers looking to describe changes in 3D joint motion.
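A minimal sketch of one way to extract FHA parameters from a relative 4×4 homogeneous transform between two tracker poses; the function name and input convention are assumptions, not the authors' protocol. It recovers the axis direction, rotation angle, translation along the axis, and a point on the axis from the screw decomposition, and it assumes the rotation is neither zero nor 180 deg.

```python
import numpy as np

def finite_helical_axis(T):
    """Screw (helical) axis parameters from a 4x4 homogeneous transform (sketch)."""
    R, d = T[:3, :3], T[:3, 3]
    # rotation angle from the trace; axis direction from the skew part of R
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    n = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    n = n / (2.0 * np.sin(theta))
    t = float(n @ d)                                  # translation along the axis
    # A point c on the axis satisfies (I - R) c = d - t*n; the system is singular
    # along n, so take the minimum-norm least-squares solution.
    c, *_ = np.linalg.lstsq(np.eye(3) - R, d - t * n, rcond=None)
    return n, np.degrees(theta), t, c
```

The moving window analysis described above would then amount to pairing only those poses whose relative rotation exceeds the chosen window size (for example 2 deg) before computing the axis.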
Yang, Chan; Xu, Bing; Zhang, Zhi-Qiang; Wang, Xin; Shi, Xin-Yuan; Fu, Jing; Qiao, Yan-Jiang
2016-10-01
Blending uniformity is essential to ensure the homogeneity of Chinese medicine formula particles within each batch. This study was based on the blending process of ebony spray-dried powder and dextrin (the proportion of dextrin was 10%), in which near infrared (NIR) diffuse reflectance spectra were collected from six different sampling points and analyzed with the moving window F test method in order to assess the uniformity of the blending process. The method was validated by the changes in citric acid content determined by HPLC. The results of the moving window F test method showed that the ebony spray-dried powder and dextrin were homogeneous during 200-300 r and segregated during 300-400 r. An advantage of this method is that the threshold value is defined statistically, not empirically, and thus does not suffer from the threshold ambiguities common to the moving block standard deviation (MBSD). This method could also be employed to monitor other blending processes of Chinese medicine powders online. Copyright © by the Chinese Pharmaceutical Association.
Wang, Shenghao; Zhang, Yuyan; Cao, Fuyi; Pei, Zhenying; Gao, Xuewei; Zhang, Xu; Zhao, Yong
2018-02-13
This paper presents a novel spectrum analysis tool named synergy adaptive moving window modeling based on immune clone algorithm (SA-MWM-ICA), developed to reduce the tedious and inconvenient labor involved in selecting pre-processing methods and spectral variables by prior experience. In this work, the immune clone algorithm is first introduced into the spectrum analysis field as a new optimization strategy, addressing shortcomings of the traditional methods. Based on the working principle of the human immune system, the performance of the quantitative model is regarded as the antigen, and a special vector corresponding to this antigen is regarded as the antibody. The antibody contains a pre-processing method optimization region encoded by 11 decimal digits, and a spectral variable optimization region formed by moving windows with changeable width and position. A set of original antibodies is created by modeling with this algorithm. After calculating the affinity of these antibodies, those with high affinity are selected for cloning. The rule for cloning is that the higher the affinity, the more copies are made. In the next step, another important operation named hyper-mutation is applied to the antibodies after cloning; the rule for hyper-mutation is that the lower the affinity, the higher the mutation probability. Several antibodies with high affinity are created through these steps. Groups of simulated data, a gasoline near-infrared spectra dataset, and a soil near-infrared spectra dataset are employed to verify and illustrate the performance of SA-MWM-ICA. Analysis results show that the quantitative models obtained by SA-MWM-ICA perform better than traditional models such as partial least squares (PLS), moving window PLS (MWPLS), genetic algorithm PLS (GAPLS), and pretreatment method classification and adjustable parameter changeable size moving window PLS (CA-CSMWPLS), especially for samples with relatively complex spectra. The selected pre-processing methods and spectral variables are easily explained. The proposed method converges within a few generations and can be used not only for near-infrared spectroscopy analysis but also for other similar spectral analyses, such as infrared spectroscopy. Copyright © 2017 Elsevier B.V. All rights reserved.
Zhang, Mingjing; Wen, Ming; Zhang, Zhi-Min; Lu, Hongmei; Liang, Yizeng; Zhan, Dejian
2015-03-01
Retention time shift is one of the most challenging problems in the preprocessing of massive chromatographic datasets. Here, an improved version of the moving window fast Fourier transform cross-correlation algorithm is presented to perform nonlinear and robust alignment of chromatograms by analyzing the shifts matrix generated by the moving window procedure. The shifts matrix in retention time can be estimated by fast Fourier transform cross-correlation with a moving window procedure. The refined shift of each scan point can be obtained by calculating the mode of the corresponding column of the shifts matrix. This version is simple, but more effective and robust than the previously published moving window fast Fourier transform cross-correlation method. It can handle nonlinear retention time shifts robustly if a proper window size is selected, and the window size is the only parameter that needs to be adjusted and optimized. The properties of the proposed method are investigated by comparison with the previous moving window fast Fourier transform cross-correlation and recursive alignment by fast Fourier transform using chromatographic datasets. The pattern recognition results of a gas chromatography mass spectrometry dataset of metabolic syndrome can be improved significantly after preprocessing by this method. Furthermore, the proposed method is available as an open source package at https://github.com/zmzhang/MWFFT2. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
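A compact sketch of the moving-window idea described above (window width, step, and the voting layout are illustrative assumptions, not the published package): each window's shift relative to the reference chromatogram is estimated by FFT cross-correlation, every scan point collects the shifts of the windows covering it, and the per-point mode is taken as the refined shift.

```python
import numpy as np
from collections import Counter

def pointwise_shifts(ref, sample, width=200, step=20):
    """Per-scan-point retention time shift from moving-window FFT cross-correlation (sketch)."""
    votes = [[] for _ in range(len(ref))]
    for start in range(0, len(ref) - width + 1, step):
        r = ref[start:start + width] - ref[start:start + width].mean()
        s = sample[start:start + width] - sample[start:start + width].mean()
        xc = np.fft.ifft(np.fft.fft(r) * np.conj(np.fft.fft(s))).real
        lag = int(np.argmax(xc))
        if lag > width // 2:
            lag -= width            # unwrap the circular lag to a signed shift
        for i in range(start, start + width):
            votes[i].append(lag)    # one column of the shifts matrix per scan point
    # refined shift of each scan point = mode of its column of the shifts matrix
    return np.array([Counter(v).most_common(1)[0][0] if v else 0 for v in votes])
```

The sign of the returned lag depends on the correlation convention chosen, so it should be checked against a known shift before being used for alignment.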
Domingo-Almenara, Xavier; Perera, Alexandre; Brezmes, Jesus
2016-11-25
Gas chromatography-mass spectrometry (GC-MS) produces large and complex datasets characterized by co-eluting compounds, often at trace levels, and by a distinct ion redundancy resulting from the high fragmentation of electron impact ionization. Compounds in GC-MS can be resolved by taking advantage of the multivariate nature of the data through multivariate resolution methods. However, multivariate methods have to be applied in small regions of the chromatogram, and therefore chromatograms are segmented prior to the application of the algorithms. The automation of this segmentation process is a challenging task as it implies separating informative data from noise in the chromatogram. This study demonstrates the capabilities of independent component analysis-orthogonal signal deconvolution (ICA-OSD) and multivariate curve resolution-alternating least squares (MCR-ALS) with an overlapping moving window implementation that avoids the typical hard chromatographic segmentation. Also, after being resolved, compounds are aligned across samples by an automated alignment algorithm. We evaluated the proposed methods through a quantitative analysis of GC-qTOF MS data from 25 serum samples. The quantitative performance of both moving window ICA-OSD and MCR-ALS-based implementations was compared with the quantification of 33 compounds by the XCMS package. Results showed that most of the R² coefficients of determination were high (R² > 0.90) for both the ICA-OSD and MCR-ALS moving window-based approaches. Copyright © 2016 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Birmingham, Elina; Meixner, Tamara; Iarocci, Grace; Kanan, Christopher; Smilek, Daniel; Tanaka, James W.
2013-01-01
The strategies children employ to selectively attend to different parts of the face may reflect important developmental changes in facial emotion recognition. Using the Moving Window Technique (MWT), children aged 5-12 years and adults ("N" = 129) explored faces with a mouse-controlled window in an emotion recognition task. An…
Lidar point density analysis: implications for identifying water bodies
Worstell, Bruce B.; Poppenga, Sandra K.; Evans, Gayla A.; Prince, Sandra
2014-01-01
Most airborne topographic light detection and ranging (lidar) systems operate within the near-infrared spectrum. Laser pulses from these systems frequently are absorbed by water and therefore do not generate reflected returns on water bodies in the resulting void regions within the lidar point cloud. Thus, an analysis of lidar voids has implications for identifying water bodies. Data analysis techniques to detect reduced lidar return densities were evaluated for test sites in Blackhawk County, Iowa, and Beltrami County, Minnesota, to delineate contiguous areas that have few or no lidar returns. Results from this study indicated a 5-meter radius moving window with fewer than 23 returns (28 percent of the moving window) was sufficient for delineating void regions. Techniques to provide elevation values for void regions to flatten water features and to force channel flow in the downstream direction also are presented.
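The density criterion described above maps to a straightforward neighborhood count. The sketch below flags cells whose 5 m radius neighborhood contains fewer than 23 returns; the grid construction and array names are assumptions, not the study's workflow.

```python
import numpy as np
from scipy.spatial import cKDTree

def flag_void_cells(returns_xy, grid_xy, radius=5.0, min_returns=23):
    """True where a grid cell's radius neighborhood holds fewer than min_returns lidar returns."""
    tree = cKDTree(returns_xy)                        # (N, 2) planimetric coordinates of returns
    neighbors = tree.query_ball_point(grid_xy, r=radius)
    counts = np.array([len(nb) for nb in neighbors])
    return counts < min_returns                       # candidate void / water-body cells
```

Contiguous flagged cells can then be grouped into void polygons for the hydro-flattening steps mentioned in the abstract.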
Tabelow, Karsten; König, Reinhard; Polzehl, Jörg
2016-01-01
Estimation of learning curves is ubiquitously based on proportions of correct responses within moving trial windows. Thereby, it is tacitly assumed that learning performance is constant within the moving windows, which, however, is often not the case. In the present study we demonstrate that violations of this assumption lead to systematic errors in the analysis of learning curves, and we explored the dependency of these errors on window size, different statistical models, and learning phase. To reduce these errors in the analysis of single-subject data as well as on the population level, we propose adequate statistical methods for the estimation of learning curves and the construction of confidence intervals, trial by trial. Applied to data from an avoidance learning experiment with rodents, these methods revealed performance changes occurring at multiple time scales within and across training sessions which were otherwise obscured in the conventional analysis. Our work shows that the proper assessment of the behavioral dynamics of learning at high temporal resolution can shed new light on specific learning processes and thus allows existing learning concepts to be refined. It further disambiguates the interpretation of neurophysiological signal changes recorded during training in relation to learning.
James, S. R.; Knox, H. A.; Abbott, R. E.; ...
2017-04-13
Cross correlations of seismic noise can potentially record large changes in subsurface velocity due to permafrost dynamics and be valuable for long-term Arctic monitoring. We applied seismic interferometry, using moving window cross-spectral analysis (MWCS), to 2 years of ambient noise data recorded in central Alaska to investigate whether seismic noise could be used to quantify relative velocity changes due to seasonal active-layer dynamics. The large velocity changes (>75%) between frozen and thawed soil caused prevalent cycle-skipping which made the method unusable in this setting. We developed an improved MWCS procedure which uses a moving reference to measure daily velocity variations that are then accumulated to recover the full seasonal change. This approach reduced cycle-skipping and recovered a seasonal trend that corresponded well with the timing of active-layer freeze and thaw. Lastly, this improvement opens the possibility of measuring large velocity changes by using MWCS and permafrost monitoring by using ambient noise.
Time Series Analysis Based on Running Mann Whitney Z Statistics
USDA-ARS?s Scientific Manuscript database
A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
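A hedged sketch of the running statistic (the database entry is truncated, so the exact normalization is assumed here to be the large-sample normal approximation rather than the Monte Carlo procedure the summary mentions): each moving window is compared against the remainder of the series with a Mann-Whitney U test and converted to a Z score.

```python
import numpy as np
from scipy.stats import mannwhitneyu

def running_mw_z(series, window=24):
    """Z statistic per moving window, window vs. the rest of the series (sketch)."""
    z = []
    for start in range(len(series) - window + 1):
        inside = series[start:start + window]
        outside = np.concatenate([series[:start], series[start + window:]])
        u, _ = mannwhitneyu(inside, outside, alternative="two-sided")
        n1, n2 = len(inside), len(outside)
        mu = n1 * n2 / 2.0
        sigma = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)   # normal approximation, no tie correction
        z.append((u - mu) / sigma)
    return np.array(z)
```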
An impact of environmental changes on flows in the reach scale under a range of climatic conditions
NASA Astrophysics Data System (ADS)
Karamuz, Emilia; Romanowicz, Renata J.
2016-04-01
The present paper combines detection and adequate identification of causes of changes in flow regime at cross-sections along the Middle River Vistula reach using different methods. Two main experimental set-ups (designs) have been applied to study the changes: a moving three-year window and a low- and high-flow event-based approach. In the first experiment, a Stochastic Transfer Function (STF) model and a quantile-based statistical analysis of flow patterns were compared. These two methods are based on the analysis of changes of the STF model parameters and standardised differences of flow quantile values. In the second experiment, in addition to the STF model, a 1-D distributed model, MIKE11, was applied. The first step of the procedure used in the study is to define the river reaches that have recorded information on land use and water management changes. The second task is to perform the moving window analysis of standardised differences of flow quantiles and moving window optimisation of the STF model for flow routing. The third step consists of an optimisation of the STF and MIKE11 models for high- and low-flow events. The final step is to analyse the results and relate the standardised quantile changes and model parameter changes to historical land use changes and water management practices. Results indicate that both models give a consistent assessment of changes in the channel for medium and high flows. ACKNOWLEDGEMENTS: This research was supported by the Institute of Geophysics Polish Academy of Sciences through the Young Scientist Grant no. 3b/IGF PAN/2015.
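One plausible reading of the "standardised differences of flow quantiles" step is sketched below under assumed choices of quantile levels, window length, shift, and standardization (relative to the first window and the between-window standard deviation); the abstract does not specify these details.

```python
import numpy as np

def moving_quantile_changes(flow, window=3 * 365, step=365, probs=(0.1, 0.5, 0.9)):
    """Standardized flow-quantile changes across moving three-year windows (sketch)."""
    q = []
    for start in range(0, len(flow) - window + 1, step):
        q.append(np.quantile(flow[start:start + window], probs))
    q = np.array(q)                          # shape (n_windows, n_quantiles)
    ref = q[0]                               # assumed reference: the first window
    return (q - ref) / q.std(axis=0)         # one standardized series per quantile level
```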
NASA Technical Reports Server (NTRS)
Casasent, D.
1978-01-01
The article discusses several optical configurations used for signal processing. Electronic-to-optical transducers are outlined, noting fixed window transducers and moving window acousto-optic transducers. Folded spectrum techniques are considered, with reference to wideband RF signal analysis, fetal electroencephalogram analysis, engine vibration analysis, signal buried in noise, and spatial filtering. Various methods for radar signal processing are described, such as phased-array antennas, the optical processing of phased-array data, pulsed Doppler and FM radar systems, a multichannel one-dimensional optical correlator, correlations with long coded waveforms, and Doppler signal processing. Means for noncoherent optical signal processing are noted, including an optical correlator for speech recognition and a noncoherent optical correlator.
Illusory displacement of equiluminous kinetic edges.
Ramachandran, V S; Anstis, S M
1990-01-01
A stationary window was cut out of a stationary random-dot pattern. When a field of dots was moved continuously behind the window (a) the window appeared to move in the same direction even though it was stationary, (b) the position of the 'kinetic edges' defining the window was also displaced along the direction of dot motion, and (c) the edges of the window tended to fade on steady fixation even though the dots were still clearly visible. The illusory displacement was enhanced considerably if the kinetic edge was equiluminous and if the 'window' region was seen as 'figure' rather than 'ground'. Since the extraction of kinetic edges probably involves the use of direction-selective cells, the illusion may provide insights into how the visual system uses the output of these cells to localize the kinetic edges.
Detrending moving average algorithm for multifractals
NASA Astrophysics Data System (ADS)
Gu, Gao-Feng; Zhou, Wei-Xing
2010-07-01
The detrending moving average (DMA) algorithm is a widely used technique to quantify the long-term correlations of nonstationary time series and the long-range correlations of fractal surfaces, which contains a parameter θ determining the position of the detrending window. We develop multifractal detrending moving average (MFDMA) algorithms for the analysis of one-dimensional multifractal measures and higher-dimensional multifractals, which is a generalization of the DMA method. The performance of the one-dimensional and two-dimensional MFDMA methods is investigated using synthetic multifractal measures with analytical solutions for backward (θ=0), centered (θ=0.5), and forward (θ=1) detrending windows. We find that the estimated multifractal scaling exponent τ(q) and the singularity spectrum f(α) are in good agreement with the theoretical values. In addition, the backward MFDMA method has the best performance, providing the most accurate estimates of the scaling exponents with the lowest error bars, while the centered MFDMA method has the worst performance. It is found that the backward MFDMA algorithm also outperforms multifractal detrended fluctuation analysis. The one-dimensional backward MFDMA method is applied to analyzing the time series of the Shanghai Stock Exchange Composite Index, and its multifractal nature is confirmed.
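A compact sketch of the one-dimensional detrending moving average fluctuation function with the window-position parameter θ; the edge trimming and segment handling are simplified assumptions, and the remaining multifractal steps (fitting h(q) from the scaling of F_q(n), then τ(q) and f(α)) are omitted.

```python
import numpy as np

def dma_fluctuation(x, window_sizes, theta=0.0, q=2.0):
    """Detrending moving average fluctuation F_q(n); theta=0 is the backward variant (sketch)."""
    y = np.cumsum(x - np.mean(x))                     # profile of the series
    F = []
    for n in window_sizes:
        # moving average positioned by theta: backward uses only past samples
        ma = np.array([
            y[max(0, i - int((n - 1) * (1 - theta))): i + int((n - 1) * theta) + 1].mean()
            for i in range(len(y))
        ])
        eps = (y - ma)[n - 1:]                        # drop edge-affected points
        segs = eps[:len(eps) // n * n].reshape(-1, n) # non-overlapping segments of size n
        F2 = (segs ** 2).mean(axis=1)
        F.append((F2 ** (q / 2)).mean() ** (1.0 / q))
    return np.array(F)   # slope of log F_q(n) vs. log n estimates h(q); tau(q) = q*h(q) - 1
```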
NASA Astrophysics Data System (ADS)
Taira, T.; Kato, A.
2013-12-01
A high-resolution Vp/Vs ratio estimate is one of the key parameters for understanding spatial variations of composition and physical state within the Earth. Lin and Shearer (2007, BSSA) recently developed a methodology to obtain local Vp/Vs ratios in individual similar-earthquake clusters, based on P- and S-wave differential times. A waveform cross-correlation approach is typically employed to measure those differential times for pairs of seismograms from similar-earthquake clusters, at narrow time windows around the direct P and S waves. This approach effectively collects P- and S-wave differential times, but requires robust P- and S-wave time windows that are extracted based on either manually or automatically picked P- and S-phases. We present another technique to estimate P- and S-wave differential times by exploiting the temporal properties of delayed time as a function of elapsed time on the seismograms with a moving-window cross-correlation analysis (e.g., Snieder, 2002, Phys. Rev. E; Niu et al. 2003, Nature). Our approach is based on the principle that the delayed time for the direct S wave differs from that for the direct P wave. Two seismograms aligned by the direct P waves from a pair of similar earthquakes yield delayed times near zero around the direct P wave. In contrast, delayed times obtained from time windows including the direct S wave have non-zero values. Our approach, in principle, is capable of measuring both P- and S-wave differential times from single-component seismograms. In an ideal case, the temporal evolution of delayed time becomes a step function with its discontinuity at the onset of the direct S wave. The offset of the resulting step function would be the S-wave differential time, relative to the P-wave differential time, as the two waveforms are aligned by the direct P wave. We apply our moving-window cross-correlation technique to two different data sets collected at: 1) the Wakayama district, Japan and 2) the Geysers geothermal field, California. Both target areas are characterized by earthquake swarms that provide a number of similar-event clusters. We use the following automated procedure to systematically analyze the two data sets: 1) identification of the direct P arrivals using an Akaike Information Criterion based phase-picking algorithm introduced by Zhang and Thurber (2003, BSSA), 2) waveform alignment by the P wave with waveform cross-correlation to obtain the P-wave differential time, 3) moving-window analysis to estimate the S-wave differential time. Kato et al. (2010, GRL) estimated the Vp/Vs ratios for a few similar-earthquake clusters from the Wakayama data set by a conventional approach to obtaining differential times. We find that the resulting Vp/Vs ratios from our approach for the same earthquake clusters are comparable with those obtained by Kato et al. (2010, GRL). We show that the moving-window cross-correlation technique effectively measures both P- and S-wave differential times for seismograms in which clear P and S phases are not observed. We will show spatial distributions of Vp/Vs ratios in our two target areas.
Design and control of a 3-DOF rehabilitation robot for forearm and wrist.
Lincong Luo; Liang Peng; Zengguang Hou; Weiqun Wang
2017-07-01
This paper presents a 3-DOF compact rehabilitation robot, covering mechanical structure design, control system design, and gravity compensation analysis. The robot can simultaneously provide assistance for pronation/supination (P/S), flexion/extension (F/E) and adduction/abduction (A/A) joint rehabilitation training. The P/S and F/E joints are driven by cable transmission to gain high backdrivability, and an adjustment plate is adopted to decrease the distance between the F/E rotation axis of the human wrist and that of the robot. In addition, gravity compensation is considered to offset the impact of self-gravity on the performance of the controller. A "moving window" control strategy based on impedance control is proposed and implemented on the robot. A comparison between the "moving window" control and classical impedance control indicates that the former has more potential to stimulate the voluntary efforts of the participant and is less constrained by a fixed reference trajectory. Meanwhile, the results also validate the feasibility and safety of the wrist robot system.
Michael L. Hoppus; Rachel I. Riemann; Andrew J. Lister; Mark V. Finco
2002-01-01
The panchromatic bands of Landsat 7, SPOT, and IRS satellite imagery provide an opportunity to evaluate the effectiveness of texture analysis of satellite imagery for mapping of land use/cover, especially forest cover. A variety of texture algorithms, including standard deviation, Ryherd-Woodcock minimum variance adaptive window, low pass etc., were applied to moving...
Mansouri, Majdi; Nounou, Mohamed N; Nounou, Hazem N
2017-09-01
In our previous work, we demonstrated the effectiveness of the linear multiscale principal component analysis (PCA)-based moving window (MW)-generalized likelihood ratio test (GLRT) technique over the classical PCA and multiscale principal component analysis (MSPCA)-based GLRT methods. The developed fault detection algorithm provided optimal properties by maximizing the detection probability for a particular false alarm rate (FAR) with different window sizes. However, most real systems are nonlinear, and the linear PCA method cannot adequately address this nonlinearity. Thus, in this paper, first, we apply a nonlinear PCA to obtain an accurate principal component of a set of data and handle a wide range of nonlinearities using the kernel principal component analysis (KPCA) model. The KPCA is among the most popular nonlinear statistical methods. Second, we extend the MW-GLRT technique to one that applies exponential weights to the residuals in the moving window (instead of equal weights), as this can further improve fault detection performance by reducing the FAR using an exponentially weighted moving average (EWMA). The developed detection method, called EWMA-GLRT, provides improved properties, such as smaller missed detection rates and FARs and a smaller average run length. The idea behind the developed EWMA-GLRT is to compute a new GLRT statistic that integrates current and previous data information in a decreasing exponential fashion, giving more weight to the more recent data. This provides a more accurate estimation of the GLRT statistic and a stronger memory that enables better decision making with respect to fault detection. Therefore, in this paper, a KPCA-based EWMA-GLRT method is developed and utilized in practice to improve fault detection in biological phenomena modeled by S-systems and to enhance monitoring of the process mean. The idea behind the KPCA-based EWMA-GLRT fault detection algorithm is to combine the advantages of the proposed EWMA-GLRT fault detection chart with the KPCA model. Thus, it is used to enhance fault detection of the Cad System in E. coli model through monitoring some of the key variables involved in this model, such as enzymes, transport proteins, regulatory proteins, lysine, and cadaverine. The results demonstrate the effectiveness of the proposed KPCA-based EWMA-GLRT method over the Q, GLRT, EWMA, Shewhart, and moving window-GLRT methods. The detection performance is assessed and evaluated in terms of FAR, missed detection rates, and average run length (ARL1) values.
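Only the exponential-weighting ingredient is sketched here, not the KPCA model or the GLRT statistic itself; `residuals` would be the detection residuals from the monitoring model, and the smoothing parameter and threshold handling are assumptions.

```python
import numpy as np

def ewma_statistic(residuals, lam=0.2):
    """EWMA of a residual sequence: z_t = lam*r_t + (1-lam)*z_{t-1} (sketch).

    A fault would be flagged when z_t exceeds a control limit estimated from
    fault-free training data (limit computation not shown)."""
    z = np.zeros(len(residuals))
    for t in range(1, len(residuals)):
        z[t] = lam * residuals[t] + (1.0 - lam) * z[t - 1]
    return z
```

Smaller values of `lam` give longer memory and smoother statistics, which is the mechanism the abstract credits for the reduced false alarm rate.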
Eye movement evidence for defocused attention in dysphoria--a perceptual span analysis.
Brzezicka, Aneta; Krejtz, Izabela; von Hecker, Ulrich; Laubrock, Jochen
2012-07-01
The defocused attention hypothesis (von Hecker and Meiser, 2005) assumes that negative mood broadens attention, whereas the analytical rumination hypothesis (Andrews and Thompson, 2009) suggests a narrowing of the attentional focus with depression. We tested these conflicting hypotheses by directly measuring the perceptual span in groups of dysphoric and control subjects, using eye tracking. In the moving window paradigm, information outside of a variable-width gaze-contingent window was masked during reading of sentences. In measures of sentence reading time and mean fixation duration, dysphoric subjects were more pronouncedly affected than controls by a reduced window size. This difference supports the defocused attention hypothesis and seems hard to reconcile with a narrowing of attentional focus. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Beutter, B. R.; Mulligan, J. B.; Stone, L. S.; Hargens, Alan R. (Technical Monitor)
1995-01-01
We have shown that moving a plaid in an asymmetric window biases the perceived direction of motion (Beutter, Mulligan & Stone, ARVO 1994). We now explore whether these biased motion signals might also drive the smooth eye-movement response by comparing the perceived and tracked directions. The human smooth oculomotor response to moving plaids appears to be driven by the perceived rather than the veridical direction of motion. This suggests that human motion perception and smooth eye movements share underlying neural motion-processing substrates as has already been shown to be true for monkeys.
Climate Exposure of US National Parks in a New Era of Change
Monahan, William B.; Fisichelli, Nicholas A.
2014-01-01
US national parks are challenged by climate and other forms of broad-scale environmental change that operate beyond administrative boundaries and in some instances are occurring at especially rapid rates. Here, we evaluate the climate change exposure of 289 natural resource parks administered by the US National Park Service (NPS), and ask which are presently (past 10 to 30 years) experiencing extreme (<5th percentile or >95th percentile) climates relative to their 1901–2012 historical range of variability (HRV). We consider parks in a landscape context (including surrounding 30 km) and evaluate both mean and inter-annual variation in 25 biologically relevant climate variables related to temperature, precipitation, frost and wet day frequencies, vapor pressure, cloud cover, and seasonality. We also consider sensitivity of findings to the moving time window of analysis (10, 20, and 30 year windows). Results show that parks are overwhelmingly at the extreme warm end of historical temperature distributions and this is true for several variables (e.g., annual mean temperature, minimum temperature of the coldest month, mean temperature of the warmest quarter). Precipitation and other moisture patterns are geographically more heterogeneous across parks and show greater variation among variables. Across climate variables, recent inter-annual variation is generally well within the range of variability observed since 1901. Moving window size has a measurable effect on these estimates, but parks with extreme climates also tend to exhibit low sensitivity to the time window of analysis. We highlight particular parks that illustrate different extremes and may facilitate understanding responses of park resources to ongoing climate change. We conclude with discussion of how results relate to anticipated future changes in climate, as well as how they can inform NPS and neighboring land management and planning in a new era of change.
Jalali-Heravi, Mehdi; Moazeni-Pourasil, Roudabeh Sadat; Sereshti, Hassan
2015-03-01
In the analysis of complex natural matrices by gas chromatography-mass spectrometry (GC-MS), many disturbing factors such as baseline drift, spectral background, homoscedastic and heteroscedastic noise, peak shape deformation (non-Gaussian peaks), low S/N ratio and co-elution (overlapped and/or embedded peaks) must be handled to save time, money and experimental effort. This study aimed to improve the GC-MS analysis of complex natural matrices utilizing multivariate curve resolution (MCR) methods. In addition, to assess the peak purity of the two-dimensional data, a method called variable size moving window-evolving factor analysis (VSMW-EFA) is introduced and examined. The proposed methodology was applied to the GC-MS analysis of Iranian Lavender essential oil, which extended the number of identified constituents from 56 to 143 components. It was found that the most abundant constituents of the Iranian Lavender essential oil are α-pinene (16.51%), camphor (10.20%), 1,8-cineole (9.50%), bornyl acetate (8.11%) and camphene (6.50%). This indicates that the Iranian-type Lavender contains a relatively high percentage of α-pinene. Comparison of different types of Lavender essential oils showed the compositional similarity between Iranian and Italian (Sardinia Island) Lavenders. Published by Elsevier B.V.
Best Practices Case Study: Schneider Homes, Inc. - Village at Miller Creek, Burien, WA
DOE Office of Scientific and Technical Information (OSTI.GOV)
none,
2010-09-01
Case study of Schneider Homes, who achieved 50% savings over the 2004 IECC with analysis and recommendations from DOE’s Building America including moving ducts and furnace into conditioned space, R-23 blown fiberglass in the walls and R-38 in the attics, and high-performance HVAC, lighting, appliances, and windows.
Effect of window length on performance of the elbow-joint angle prediction based on electromyography
NASA Astrophysics Data System (ADS)
Triwiyanto; Wahyunggoro, Oyas; Adi Nugroho, Hanung; Herianto
2017-05-01
High performance of elbow joint angle prediction is essential for the development of devices based on electromyography (EMG) control. The performance of the prediction depends on feature extraction parameters such as window length. In this paper, we evaluated the effect of window length on the performance of elbow-joint angle prediction. The prediction algorithm consists of zero-crossing feature extraction and a second-order Butterworth low-pass filter. The feature was used to extract the EMG signal with varying window lengths. The EMG signal was collected from the biceps muscle while the elbow was moved in flexion and extension. The subject performed the elbow motion while holding a 1-kg load and moved the elbow over different periods (12 seconds, 8 seconds and 6 seconds). The results indicated that the window length affected the performance of the prediction. The 250-sample window length yielded the best performance of the prediction algorithm, with (mean±SD) root mean square error = 5.68%±1.53% and Pearson's correlation = 0.99±0.0059.
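A minimal sketch of the two processing stages named above, with assumed threshold, window, and cutoff values: a zero-crossing count per window of the EMG signal, followed by a second-order Butterworth low-pass filter applied to the resulting feature sequence.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def zero_crossing_feature(emg, window=250, threshold=0.01):
    """Zero-crossing count per non-overlapping window of the EMG signal (sketch)."""
    zc = []
    for start in range(0, len(emg) - window + 1, window):
        seg = emg[start:start + window]
        crossings = np.sum((seg[:-1] * seg[1:] < 0) &
                           (np.abs(seg[:-1] - seg[1:]) > threshold))  # ignore tiny noise crossings
        zc.append(crossings)
    return np.array(zc, dtype=float)

def smooth_feature(feature, wn=0.2):
    """Second-order Butterworth low-pass filter on the feature sequence.

    wn is the cutoff as a fraction of the Nyquist rate of the feature sequence (0 < wn < 1)."""
    b, a = butter(2, wn)
    return filtfilt(b, a, np.asarray(feature, dtype=float))
```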
NASA Astrophysics Data System (ADS)
Prawin, J.; Rama Mohan Rao, A.
2018-01-01
The knowledge of dynamic loads acting on a structure is always required for many practical engineering problems, such as structural strength analysis, health monitoring and fault diagnosis, and vibration isolation. In this paper, we present an online input force time history reconstruction algorithm using Dynamic Principal Component Analysis (DPCA) from acceleration time history response measurements using moving windows. We also present an optimal sensor placement algorithm to place limited sensors at dynamically sensitive spatial locations. The major advantage of the proposed input force identification algorithm is that it does not require finite element idealization of the structure, unlike earlier formulations, and is therefore free from physical modelling errors. We have considered three numerical examples to validate the accuracy of the proposed DPCA-based method. Effects of measurement noise, multiple force identification, different kinds of loading, incomplete measurements, and high noise levels are investigated in detail. Parametric studies have been carried out to arrive at the optimal window size and the percentage of window overlap. Studies presented in this paper clearly establish the merits of the proposed algorithm for online load identification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, X; Cao, D; Housley, D
2014-06-01
Purpose: In this work, we have tested the performance of new respiratory gating solutions for Elekta linacs. These solutions include the Response gating kit and the C-RAD Catalyst surface mapping system. Verification measurements have been performed for a series of clinical cases. We also examined the beam-on latency of the system and its impact on delivery efficiency. Methods: To verify the benefits of tighter gating windows, a Quasar Respiratory Motion Platform was used. Its vertical-motion plate acted as a respiration surrogate and was tracked by the Catalyst system to generate gating signals. A MatriXX ion-chamber array was mounted on its longitudinal-moving platform. Clinical plans were delivered to the stationary and moving MatriXX array at 100%, 50% and 30% gating windows, and gamma scores were calculated comparing the moving delivery results to the stationary result. It is important to note that as one moves to tighter gating windows, the delivery efficiency will be impacted by the linac's beam-on latency. Using a specialized software package, we generated beam-on signals of lengths of 1000 ms, 600 ms, 450 ms, 400 ms, 350 ms and 300 ms. As the gating windows get tighter, one can expect to reach a point where the dose rate will fall to nearly zero, indicating that the gating window is close to the beam-on latency. A clinically useful gating window needs to be significantly longer than the latency of the linac. Results: As expected, the use of tighter gating windows improved delivery accuracy. However, a lower limit of the gating window, largely defined by linac beam-on latency, exists at around 300 ms. Conclusion: The Response gating kit, combined with the C-RAD Catalyst, provides an effective solution for respiratory-gated treatment delivery. Careful patient selection, gating window design, and even visual/audio coaching may be necessary to ensure both delivery quality and efficiency. This research project is funded by Elekta.
Eye movements and the span of the effective stimulus in visual search.
Bertera, J H; Rayner, K
2000-04-01
The span of the effective stimulus during visual search through an unstructured alphanumeric array was investigated by using eye-contingent-display changes while the subjects searched for a target letter. In one condition, a window exposing the search array moved in synchrony with the subjects' eye movements, and the size of the window was varied. Performance reached asymptotic levels when the window was 5 degrees. In another condition, a foveal mask moved in synchrony with each eye movement, and the size of the mask was varied. The foveal mask conditions were much more detrimental to search behavior than the window conditions, indicating the importance of foveal vision during search. The size of the array also influenced performance, but performance reached asymptote for all array sizes tested at the same window size, and the effect of the foveal mask was the same for all array sizes. The results indicate that both acuity and difficulty of the search task influenced the span of the effective stimulus during visual search.
Solving the chemical master equation using sliding windows
2010-01-01
Background: The chemical master equation (CME) is a system of ordinary differential equations that describes the evolution of a network of chemical reactions as a stochastic process. Its solution yields the probability density vector of the system at each point in time. Solving the CME numerically is in many cases computationally expensive or even infeasible as the number of reachable states can be very large or infinite. We introduce the sliding window method, which computes an approximate solution of the CME by performing a sequence of local analysis steps. In each step, only a manageable subset of states is considered, representing a "window" into the state space. In subsequent steps, the window follows the direction in which the probability mass moves, until the time period of interest has elapsed. We construct the window based on a deterministic approximation of the future behavior of the system by estimating upper and lower bounds on the populations of the chemical species. Results: In order to show the effectiveness of our approach, we apply it to several examples previously described in the literature. The experimental results show that the proposed method speeds up the analysis considerably, compared to a global analysis, while still providing high accuracy. Conclusions: The sliding window method is a novel approach to address the performance problems of numerical algorithms for the solution of the chemical master equation. The method efficiently approximates the probability distributions at the time points of interest for a variety of chemically reacting systems, including systems for which no upper bound on the population sizes of the chemical species is known a priori.
NASA Astrophysics Data System (ADS)
Pierini, J. O.; Restrepo, J. C.; Aguirre, J.; Bustamante, A. M.; Velásquez, G. J.
2017-04-01
A measure of the variability in seasonal extreme streamflow was estimated for the Colombian Caribbean coast, using monthly time series of freshwater discharge from ten watersheds. The aim was to detect modifications in the monthly streamflow distribution, seasonal trends, variance and extreme monthly values. A 20-year moving time window, with successive 1-year shifts, was applied to the monthly series to analyze the seasonal variability of streamflow. The seasonally windowed data were statistically fitted with the Gamma distribution function. Scale and shape parameters were computed using Maximum Likelihood Estimation (MLE) and the bootstrap method with 1000 resamples. A trend analysis was performed for each windowed series, allowing detection of the window with the maximum absolute trend values. Significant temporal shifts in the seasonal streamflow distribution and quantiles (QT) were obtained for different frequencies. Wet and dry extreme periods increased significantly in the last decades. Such increases did not occur simultaneously throughout the region. Some locations exhibited continuous increases only at minimum QT.
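A sketch of the windowing and fitting steps under assumed details (monthly data, windows shifted by one year, the location parameter fixed at zero, and a reduced bootstrap size for brevity; the study used 1000 resamples):

```python
import numpy as np
from scipy import stats

def windowed_gamma_params(monthly_flow, window_years=20, n_boot=200, seed=0):
    """Gamma shape/scale per 20-year moving window with bootstrap CIs for the shape (sketch)."""
    rng = np.random.default_rng(seed)
    n = window_years * 12
    results = []
    for start in range(0, len(monthly_flow) - n + 1, 12):     # shift the window by one year
        win = monthly_flow[start:start + n]
        shape, _, scale = stats.gamma.fit(win, floc=0)         # MLE with location fixed at 0
        boot_shapes = np.array([
            stats.gamma.fit(rng.choice(win, size=n, replace=True), floc=0)[0]
            for _ in range(n_boot)
        ])
        results.append((shape, scale, np.percentile(boot_shapes, [2.5, 97.5])))
    return results
```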
A Study of Feature Extraction Using Divergence Analysis of Texture Features
NASA Technical Reports Server (NTRS)
Hallada, W. A.; Bly, B. G.; Boyd, R. K.; Cox, S.
1982-01-01
An empirical study of texture analysis for feature extraction and classification of high spatial resolution remotely sensed imagery (10 meters) is presented in terms of specific land cover types. The principal method examined is the use of spatial gray tone dependence (SGTD). The SGTD method reduces the gray levels within a moving window into a two-dimensional spatial gray tone dependence matrix which can be interpreted as a probability matrix of gray tone pairs. Haralick et al. (1973) used a number of information theory measures to extract texture features from these matrices, including angular second moment (inertia), correlation, entropy, homogeneity, and energy. The derivation of the SGTD matrix is a function of: (1) the number of gray tones in an image; (2) the angle along which the frequency of SGTD is calculated; (3) the size of the moving window; and (4) the distance between gray tone pairs. The first three parameters were varied and tested on a 10-meter resolution panchromatic image of Maryville, Tennessee using the five SGTD measures. A transformed divergence measure was used to determine the statistical separability between four land cover categories: forest, new residential, old residential, and industrial, for each variation in texture parameters.
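A rough sketch of deriving SGTD-based texture features in a moving window using scikit-image's gray-level co-occurrence matrix. The quantization level, window size, and the chosen subset of properties are assumptions; not all of the original study's measures (entropy, for example) are available as built-in properties, so only ASM, correlation, and homogeneity are shown.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def window_texture(image, window=15, levels=32, distance=1, angle=0.0):
    """Per-pixel SGTD texture features from a moving window (sketch, interior pixels only)."""
    img = (image / image.max() * (levels - 1)).astype(np.uint8)   # quantize gray tones
    half = window // 2
    out = np.zeros(img.shape + (3,))
    for i in range(half, img.shape[0] - half):
        for j in range(half, img.shape[1] - half):
            patch = img[i - half:i + half + 1, j - half:j + half + 1]
            glcm = graycomatrix(patch, [distance], [angle],
                                levels=levels, symmetric=True, normed=True)
            out[i, j] = [graycoprops(glcm, p)[0, 0]
                         for p in ("ASM", "correlation", "homogeneity")]
    return out
```

The loop is written for clarity rather than speed; a practical run would tile or vectorize the window sweep.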
Efficient Windows Collaborative
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nils Petermann
2010-02-28
The project goals covered both the residential and commercial windows markets and involved a range of audiences such as window manufacturers, builders, homeowners, design professionals, utilities, and public agencies. Essential goals included: (1) Creation of 'Master Toolkits' of information that integrate diverse tools, rating systems, and incentive programs, customized for key audiences such as window manufacturers, design professionals, and utility programs. (2) Delivery of education and outreach programs to multiple audiences through conference presentations, publication of articles for builders and other industry professionals, and targeted dissemination of efficient window curricula to professionals and students. (3) Design and implementation of mechanisms to encourage and track sales of more efficient products through the existing Window Products Database as an incentive for manufacturers to improve products and participate in programs such as NFRC and ENERGY STAR. (4) Development of utility incentive programs to promote more efficient residential and commercial windows. Partnership with regional and local entities on the development of programs and customized information to move the market toward the highest performing products. An overarching project goal was to ensure that different audiences adopt and use the developed information, design and promotion tools and thus increase the market penetration of energy efficient fenestration products. In particular, a crucial success criterion was to move gas and electric utilities to increase the promotion of energy efficient windows through demand side management programs as an important step toward increasing the market share of energy efficient windows.
Moving object detection using dynamic motion modelling from UAV aerial images.
Saif, A F M Saifuddin; Prabuwono, Anton Satria; Mahayuddin, Zainal Rasyid
2014-01-01
Motion analysis based moving object detection from UAV aerial images is still an unsolved issue due to the lack of proper motion estimation. Existing moving object detection approaches for UAV aerial images do not use motion-based pixel intensity measurement to detect moving objects robustly. Moreover, current research on moving object detection from UAV aerial images mostly depends on either frame differencing or a segmentation approach alone. There are two main purposes for this research: firstly, to develop a new motion model called DMM (dynamic motion model) and secondly, to apply the proposed segmentation approach SUED (segmentation using edge based dilation) using frame differencing embedded together with the DMM model. The proposed DMM model provides effective search windows based on the highest pixel intensity to segment only the specific area for the moving object rather than searching the whole area of the frame using SUED. At each stage of the proposed scheme, experimental fusion of DMM and SUED extracts moving objects faithfully. Experimental results reveal that the proposed DMM and SUED successfully demonstrate the validity of the proposed methodology.
Microwave Radiometers for Fire Detection in Trains: Theory and Feasibility Study.
Alimenti, Federico; Roselli, Luca; Bonafoni, Stefania
2016-06-17
This paper introduces the theory of fire detection in moving vehicles by microwave radiometers. The system analysis is discussed and a feasibility study is illustrated on the basis of two implementation hypotheses. The basic idea is to have a fixed radiometer and to look inside the glass windows of the wagon when it passes in front of the instrument antenna. The proposed sensor uses a three-pixel multi-beam configuration that allows an image to be formed by the movement of the train itself. Each pixel is constituted by a direct amplification microwave receiver operating at 31.4 GHz. At this frequency, the antenna can be a 34 cm offset parabolic dish, whereas a 1 K brightness temperature resolution is achievable with an overall system noise figure of 6 dB, an observation bandwidth of 2 GHz and an integration time of 1 ms. The effect of the detector noise is also investigated and several implementation hypotheses are discussed. The presented study is important since it could be applied to the automatic fire alarm in trains and moving vehicles with dielectric wall/windows.
Point-point and point-line moving-window correlation spectroscopy and its applications
NASA Astrophysics Data System (ADS)
Zhou, Qun; Sun, Suqin; Zhan, Daqi; Yu, Zhiwu
2008-07-01
In this paper, we present a new extension of generalized two-dimensional (2D) correlation spectroscopy. Two new algorithms, namely point-point (P-P) correlation and point-line (P-L) correlation, have been introduced to perform moving-window 2D correlation (MW2D) analysis. The new method has been applied to a spectral model consisting of two different processes. The results indicate that P-P correlation spectroscopy can unveil the details and reconstitute the entire process, whilst P-L correlation provides the general features of the processes concerned. The phase transition behavior of dimyristoylphosphatidylethanolamine (DMPE) has been studied using MW2D correlation spectroscopy. The newly proposed method verifies that the phase transition temperature is 56 °C, the same as the result obtained from differential scanning calorimetry. To illustrate the new method further, a lysine and lactose mixture has been studied under thermal perturbation. Using P-P MW2D, the Maillard reaction of the mixture was clearly monitored, which is very difficult using conventional display of FTIR spectra.
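A minimal sketch of the moving-window synchronous correlation computation (window length and array layout are assumptions): within each window along the perturbation axis, dynamic spectra are formed by subtracting the window mean, and their covariance gives the synchronous 2D map; a point-point trace is then one (ν1, ν2) element followed across windows, and a point-line slice is one row of each map.

```python
import numpy as np

def mw2d_synchronous(spectra, window=11):
    """Moving-window synchronous 2D correlation maps (sketch).

    spectra: array of shape (n_perturbation, n_wavenumber).
    Returns an array of shape (n_windows, n_wavenumber, n_wavenumber)."""
    n_pert, n_wn = spectra.shape
    maps = []
    for start in range(n_pert - window + 1):
        block = spectra[start:start + window]
        dyn = block - block.mean(axis=0)            # dynamic spectra within the window
        maps.append(dyn.T @ dyn / (window - 1))     # synchronous correlation intensity
    return np.array(maps)

# point-point (P-P) trace: maps[:, i, j]; point-line (P-L) slice: maps[:, i, :]
```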
The research on the mean shift algorithm for target tracking
NASA Astrophysics Data System (ADS)
CAO, Honghong
2017-06-01
The traditional mean shift algorithm for target tracking is effective and highly real-time capable, but it still has some shortcomings. It easily falls into local optima during tracking, and its effectiveness is weak when the object moves fast. Moreover, because the size of the tracking window never changes, the method fails when the size of the moving object changes. As a result, we propose a new method: a particle swarm optimization algorithm is used to optimize the mean shift algorithm for target tracking, while SIFT (scale-invariant feature transform) and an affine transformation make the size of the tracking window adaptive. Finally, we evaluate the method with comparative experiments. Experimental results indicate that the proposed method can effectively track the object while adapting the size of the tracking window.
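For reference, a bare-bones mean shift iteration on a per-pixel weight map (for example a histogram back-projection image) is sketched below; the map construction, window size, and stopping tolerance are assumptions, and the PSO and SIFT extensions discussed above are not included.

```python
import numpy as np

def mean_shift_window(weights, center, size, n_iter=20, eps=0.5):
    """Shift a fixed-size window toward the local centroid of the weight map (sketch)."""
    cy, cx = center
    h, w = size
    H, W = weights.shape
    for _ in range(n_iter):
        y0 = int(np.clip(np.round(cy - h / 2), 0, H - h))
        x0 = int(np.clip(np.round(cx - w / 2), 0, W - w))
        patch = weights[y0:y0 + h, x0:x0 + w]
        if patch.sum() == 0:
            break
        ys, xs = np.mgrid[0:h, 0:w]
        ny = y0 + (ys * patch).sum() / patch.sum()   # weighted centroid inside the window
        nx = x0 + (xs * patch).sum() / patch.sum()
        if np.hypot(ny - cy, nx - cx) < eps:         # converged: shift below tolerance
            break
        cy, cx = ny, nx
    return cy, cx
```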
Quantifying rapid changes in cardiovascular state with a moving ensemble average.
Cieslak, Matthew; Ryan, William S; Babenko, Viktoriya; Erro, Hannah; Rathbun, Zoe M; Meiring, Wendy; Kelsey, Robert M; Blascovich, Jim; Grafton, Scott T
2018-04-01
MEAP, the moving ensemble analysis pipeline, is a new open-source tool designed to perform multisubject preprocessing and analysis of cardiovascular data, including electrocardiogram (ECG), impedance cardiogram (ICG), and continuous blood pressure (BP). In addition to traditional ensemble averaging, MEAP implements a moving ensemble averaging method that allows for the continuous estimation of indices related to cardiovascular state, including cardiac output, preejection period, heart rate variability, and total peripheral resistance, among others. Here, we define the moving ensemble technique mathematically, highlighting its differences from fixed-window ensemble averaging. We describe MEAP's interface and features for signal processing, artifact correction, and cardiovascular-based fMRI analysis. We demonstrate the accuracy of MEAP's novel B point detection algorithm on a large collection of hand-labeled ICG waveforms. As a proof of concept, two subjects completed a series of four physical and cognitive tasks (cold pressor, Valsalva maneuver, video game, random dot kinetogram) on 3 separate days while ECG, ICG, and BP were recorded. Critically, the moving ensemble method reliably captures the rapid cyclical cardiovascular changes related to the baroreflex during the Valsalva maneuver and the classic cold pressor response. Cardiovascular measures were seen to vary considerably within repetitions of the same cognitive task for each individual, suggesting that a carefully designed paradigm could be used to capture fast-acting event-related changes in cardiovascular state. © 2017 Society for Psychophysiological Research.
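A toy sketch of the moving ensemble averaging idea (not MEAP's implementation): given beats already segmented around their R peaks (a hypothetical preprocessing step), each beat is replaced by the average of the beats in a window around it, so beat-level indices can be estimated continuously instead of once per fixed block.

```python
import numpy as np

def moving_ensemble_average(beats, half_window=15):
    """Moving ensemble average over heartbeats (sketch).

    beats: array of shape (n_beats, n_samples), each row one beat aligned on its R peak.
    Returns an array of the same shape with each beat replaced by its windowed ensemble."""
    n = len(beats)
    out = np.empty_like(beats, dtype=float)
    for i in range(n):
        lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
        out[i] = beats[lo:hi].mean(axis=0)           # ensemble around beat i
    return out
```

Indices such as pre-ejection period or stroke volume would then be computed from each windowed ensemble, giving a beat-by-beat time course rather than a single fixed-window value.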
Kim, Joowhan; Min, Sung-Wook; Lee, Byoungho
2007-10-01
Integral floating display is a recently proposed three-dimensional (3D) display method which provides a dynamic 3D image in the vicinity of an observer. It has a viewing window, and only through this window can correct 3D images be observed. However, the positional difference between the viewing window and the floating image limits the viewing zone of an integral floating system. In this paper, we present the principle and experimental results of adjusting the location of the viewing window of an integral floating display system by modifying the elemental image region for integral imaging. We explain the characteristics of the viewing window and propose how to move the viewing window to maximize the viewing zone.
INTERIOR VIEW OF DEBITEUSE ROOM. MONORAIL USED TO MOVE DEBIS ...
INTERIOR VIEW OF DEBITEUSE ROOM. MONORAIL USED TO MOVE DEBIS IS FROM ORIGINAL CLAY HOUSE. VIEW SHOWS WORKER USING AIR HAMMER TO BEGIN FINISH ON DEBI. - Chambers-McKee Window Glass Company, Debiteuse, Clay Avenue Extension, Jeannette, Westmoreland County, PA
Ionospheric gravity wave measurements with the USU dynasonde
NASA Technical Reports Server (NTRS)
Berkey, Frank T.; Deng, Jun Yuan
1992-01-01
A method for the measurement of ionospheric gravity waves (GWs) using the USU Dynasonde is outlined. This method consists of a series of individual procedures, which include functions for data acquisition, adaptive scaling, polarization discrimination, interpolation and extrapolation, digital filtering, windowing, spectrum analysis, GW detection, and graphical display. Concepts of system theory are applied to treat the ionosphere as a system. An adaptive ionogram scaling method was developed for automatically extracting ionogram echo traces from noisy raw sounding data. The method uses the well-known Least Mean Square (LMS) algorithm to form a stochastic optimal estimate of the echo trace, which is then used to control a moving window. The window tracks the echo trace, simultaneously eliminating noise and interference. Experimental results show that the proposed method functions as designed. Case studies extracting GWs from ionosonde measurements were carried out using the techniques described. Geophysically significant events were detected and the processed results are illustrated graphically. The method was also developed with real-time implementation in mind.
Fu, Hai-Yan; Guo, Jun-Wei; Yu, Yong-Jie; Li, He-Dong; Cui, Hua-Peng; Liu, Ping-Ping; Wang, Bing; Wang, Sheng; Lu, Peng
2016-06-24
Peak detection is a critical step in chromatographic data analysis. In the present work, we developed a multi-scale Gaussian smoothing-based strategy for accurate peak extraction. The strategy consisted of three stages: background drift correction, peak detection, and peak filtration. Background drift correction was implemented using a moving window strategy. The new peak detection method is a variant of the approach used by the well-known MassSpecWavelet, i.e., chromatographic peaks are found as local maxima under various smoothing window scales. Therefore, peaks can be detected through the ridge lines of maxima across these window scales, and signals that increase or decrease monotonically around the peak position can be treated as part of the peak. Instrumental noise was estimated after peak elimination, and a peak filtration strategy was performed to remove peaks with signal-to-noise ratios smaller than 3. The performance of our method was evaluated using two complex datasets: essential oil samples for quality control obtained from gas chromatography, and tobacco plant samples for metabolic profiling analysis obtained from gas chromatography coupled with mass spectrometry. The results confirmed the effectiveness of the developed method. Copyright © 2016 Elsevier B.V. All rights reserved.
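A minimal Python sketch of the general idea (smoothing at several window scales, keeping maxima that persist across scales, then filtering by signal-to-noise ratio) is given below; the persistence rule and noise estimate are assumptions and do not reproduce the published ridge-line algorithm.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def multiscale_peaks(signal, scales=(1, 2, 4, 8), min_persistence=3, snr_min=3.0):
    """Keep points that are local maxima of the smoothed signal at several scales
    (a crude stand-in for ridge lines) and that exceed snr_min * noise."""
    signal = np.asarray(signal, dtype=float)
    hits = np.zeros(signal.size, dtype=int)
    for s in scales:
        sm = gaussian_filter1d(signal, sigma=s)
        local_max = (sm[1:-1] > sm[:-2]) & (sm[1:-1] > sm[2:])
        hits[1:-1] += local_max.astype(int)
    candidates = np.where(hits >= min_persistence)[0]
    # Noise estimated from a robust spread after removing candidate peaks.
    mask = np.ones(signal.size, dtype=bool)
    mask[candidates] = False
    noise = 1.4826 * np.median(np.abs(signal[mask] - np.median(signal[mask])))
    return [int(i) for i in candidates if signal[i] >= snr_min * noise]
```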
NASA Technical Reports Server (NTRS)
2003-01-01
January 28, 2003: The Mars Exploration Rover-2 is moved to a workstand in the Payload Hazardous Servicing Facility. Set to launch in 2003, the Mars Exploration Rover Mission will consist of two identical rovers designed to cover roughly 110 yards (100 meters) each Martian day. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The rovers will be identical to each other, but will land at different regions of Mars. The first rover has a launch window opening May 30, 2003, and the second rover a window opening June 25, 2003.
NASA Astrophysics Data System (ADS)
Levine, Zachary H.; Pintar, Adam L.
2015-11-01
A simple algorithm for averaging a stochastic sequence of 1D arrays in a moving, expanding window is provided. The samples are grouped in bins which increase exponentially in size so that a constant fraction of the samples is retained at any point in the sequence. The algorithm is shown to have particular relevance for a class of Monte Carlo sampling problems which includes one characteristic of iterative reconstruction in computed tomography. The code is available in the CPC program library in both Fortran 95 and C and is also available in R through CRAN.
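The published code is distributed in Fortran 95, C, and R; the Python sketch below illustrates only one plausible reading of the binning scheme (power-of-two bins, with roughly a constant fraction of recent samples retained) and is not the CPC library implementation.

```python
import numpy as np

class ExpandingWindowAverager:
    """Sketch: average a stream of 1D arrays over a moving, expanding window.

    Samples are grouped into bins whose sizes stay powers of two, so the number
    of bins grows only logarithmically while a roughly constant fraction of the
    samples seen so far can be kept in the average (assumption, not the paper's
    exact grouping rule)."""
    def __init__(self, length):
        self.bins = []            # list of (count, sum_array), oldest first
        self.length = length

    def add(self, sample):
        self.bins.append((1, np.asarray(sample, dtype=float).copy()))
        # Merge equal-sized bins so bin sizes remain powers of two.
        while len(self.bins) >= 2 and self.bins[-1][0] == self.bins[-2][0]:
            (n1, s1), (n2, s2) = self.bins.pop(), self.bins.pop()
            self.bins.append((n1 + n2, s1 + s2))

    def average(self, keep_fraction=0.5):
        total = sum(n for n, _ in self.bins)
        kept_n, kept_sum = 0, np.zeros(self.length)
        for n, s in reversed(self.bins):     # most recent bins first
            kept_n += n
            kept_sum += s
            if kept_n >= keep_fraction * total:
                break
        return kept_sum / kept_n
```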
NASA Astrophysics Data System (ADS)
Jiang, Wei; Zhou, Jianzhong; Zheng, Yang; Liu, Han
2017-11-01
Accurate degradation tendency measurement is vital for the secure operation of mechanical equipment. However, the existing techniques and methodologies for degradation measurement still face challenges, such as the lack of an appropriate degradation indicator, insufficient accuracy, and poor ability to track data fluctuations. To solve these problems, a hybrid degradation tendency measurement method for mechanical equipment based on a moving window and a Grey-Markov model is proposed in this paper. In the proposed method, a 1D normalized degradation index based on multi-feature fusion is designed to assess the extent of degradation. Subsequently, the moving window algorithm is integrated with the Grey-Markov model for the dynamic update of the model. Two key parameters, namely the step size and the number of states, contribute to the adaptive modeling and multi-step prediction. Finally, three types of combination prediction models are established to measure the degradation trend of equipment. The effectiveness of the proposed method is validated with a case study on the health monitoring of turbine engines. Experimental results show that the proposed method has better performance, in terms of both measuring accuracy and data fluctuation tracking, in comparison with other conventional methods.
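As a sketch of the dynamic-update idea, the following Python code refits a plain GM(1,1) grey model on a sliding window at each step; the Markov state correction, the multi-feature fusion index, and the combination models from the paper are omitted, and the window size is an arbitrary placeholder.

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """GM(1,1) grey model fit and forecast for a short, positive-valued series x0."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                     # background values
    B = np.column_stack((-z1, np.ones(len(z1))))
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # development/grey coefficients
    n = len(x0)
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.diff(x1_hat, prepend=x1_hat[0])       # inverse accumulation
    return x0_hat[n:]                                  # forecast horizon only

def moving_window_gm11(series, window=10, steps=1):
    """Refit GM(1,1) on a sliding window at every step (dynamic model update)."""
    preds = []
    for end in range(window, len(series)):
        preds.append(gm11_forecast(series[end - window:end], steps=steps)[-1])
    return np.array(preds)
```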
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-09-14
This package contains statistical routines for extracting features from multivariate time-series data, which can then be used for subsequent multivariate statistical analysis to identify patterns and anomalous behavior. It calculates local linear or quadratic regression model fits to moving windows for each series and then summarizes the model coefficients across user-defined time intervals. These methods are domain agnostic, but they have been successfully applied to a variety of domains, including commercial aviation and electric power grid data.
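A minimal sketch of the windowed-regression feature extraction, assuming evenly spaced samples and returning one coefficient row per window; the summarization across user-defined intervals would then be applied to the returned rows. Names and window sizes are placeholders, not the package's API.

```python
import numpy as np

def window_regression_features(t, y, window=50, step=25, order=1):
    """Fit a local polynomial (linear or quadratic) in each moving window and
    return the model coefficients as features, one row per window."""
    t, y = np.asarray(t, dtype=float), np.asarray(y, dtype=float)
    feats = []
    for start in range(0, len(y) - window + 1, step):
        sl = slice(start, start + window)
        # Center time within the window so coefficients are comparable across windows.
        coeffs = np.polyfit(t[sl] - t[sl].mean(), y[sl], deg=order)
        feats.append(coeffs)
    return np.array(feats)   # columns: highest-order term first, intercept last
```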
Moving Rivers, Shifting Streams: Perspectives on the Existence of a Policy Window
ERIC Educational Resources Information Center
Galligan, Ann M.; Burgess, Chris N.
2005-01-01
This article represents differing perspectives on the creation and establishment of the Rhode Island Arts Learning Network (ALN). At the heart of this discussion is whether or not the Rhode Island task force in charge of this process took advantage of what noted public policy analyst John Kingdon refers to as a "policy window" where…
TERMA Framework for Biomedical Signal Analysis: An Economic-Inspired Approach.
Elgendi, Mohamed
2016-11-02
Biomedical signals contain features that represent physiological events, and each of these events has peaks. The analysis of biomedical signals for monitoring or diagnosing diseases requires the detection of these peaks, making event detection a crucial step in biomedical signal processing. Many researchers have difficulty detecting these peaks to investigate, interpret and analyze their corresponding events. To date, there is no generic framework that captures these events in a robust, efficient and consistent manner. A new method referred to for the first time as two event-related moving averages ("TERMA") involves event-related moving averages and detects events in biomedical signals. The TERMA framework is flexible and universal and consists of six independent LEGO building bricks to achieve high accuracy detection of biomedical events. Results recommend that the window sizes for the two moving averages (W1 and W2) have to follow the inequality (8 × W1) ≥ W2 ≥ (2 × W1). Moreover, TERMA is a simple yet efficient event detector that is suitable for wearable devices, point-of-care devices, fitness trackers and smart watches, compared to more complex machine learning solutions.
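A hedged sketch of a TERMA-style detector is given below; the offset rule (a fraction of the signal mean added to the longer moving average) and the use of the raw rather than squared signal are assumptions, since the abstract only fixes the window-size inequality.

```python
import numpy as np

def terma_blocks(signal, w1, w2, beta=0.08):
    """Blocks of interest are samples where the short (event) moving average
    exceeds the long (cycle) moving average plus an offset.
    Window sizes should satisfy 2*w1 <= w2 <= 8*w1."""
    assert 2 * w1 <= w2 <= 8 * w1
    x = np.asarray(signal, dtype=float)

    def movavg(v, w):
        kernel = np.ones(w) / w
        return np.convolve(v, kernel, mode="same")

    ma_event = movavg(x, w1)
    ma_cycle = movavg(x, w2)
    threshold = ma_cycle + beta * x.mean()   # assumed offset rule
    return ma_event > threshold              # boolean mask of blocks of interest
```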
Balabin, Roman M; Smirnov, Sergey V
2011-04-29
During the past several years, near-infrared (near-IR/NIR) spectroscopy has increasingly been adopted as an analytical tool in various fields, from petroleum to biomedical sectors. The NIR spectrum (above 4000 cm(-1)) of a sample is typically measured by modern instruments at a few hundred wavelengths. Recently, considerable effort has been directed towards developing procedures to identify variables (wavelengths) that contribute useful information. Variable selection (VS) or feature selection, also called frequency selection or wavelength selection, is a critical step in data analysis for vibrational spectroscopy (infrared, Raman, or NIRS). In this paper, we compare the performance of 16 different feature selection methods for the prediction of properties of biodiesel fuel, including density, viscosity, methanol content, and water concentration. The feature selection algorithms tested include stepwise multiple linear regression (MLR-step), interval partial least squares regression (iPLS), backward iPLS (BiPLS), forward iPLS (FiPLS), moving window partial least squares regression (MWPLS), (modified) changeable size moving window partial least squares (CSMWPLS/MCSMWPLSR), searching combination moving window partial least squares (SCMWPLS), successive projections algorithm (SPA), uninformative variable elimination (UVE, including UVE-SPA), simulated annealing (SA), back-propagation artificial neural networks (BP-ANN), Kohonen artificial neural network (K-ANN), and genetic algorithms (GAs, including GA-iPLS). Two linear techniques for calibration model building, namely multiple linear regression (MLR) and partial least squares regression/projection to latent structures (PLS/PLSR), are used for the evaluation of biofuel properties. A comparison with a non-linear calibration model, artificial neural networks (ANN-MLP), is also provided. Discussion of gasoline, ethanol-gasoline (bioethanol), and diesel fuel data is presented. The results of applying other spectroscopic techniques, such as Raman, ultraviolet-visible (UV-vis), or nuclear magnetic resonance (NMR) spectroscopy, can also be greatly improved by an appropriate choice of feature selection method. Copyright © 2011 Elsevier B.V. All rights reserved.
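Of the methods listed, moving window PLS (MWPLS) is the most directly window-based; the sketch below shows one common way to implement it with scikit-learn, scanning a fixed-width wavelength window and recording cross-validated error so informative spectral regions appear as minima. Window width, step, and component count are placeholder values, not those of the paper.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

def mwpls_rmsecv(X, y, window=20, step=5, n_components=5, cv=5):
    """Moving window PLS sketch: build a PLS model on each spectral window and
    record the cross-validated RMSE for that window position."""
    n_vars = X.shape[1]
    results = []
    for start in range(0, n_vars - window + 1, step):
        Xw = X[:, start:start + window]
        pls = PLSRegression(n_components=min(n_components, window))
        y_cv = cross_val_predict(pls, Xw, y, cv=cv)
        rmse = float(np.sqrt(np.mean((np.ravel(y_cv) - np.ravel(y)) ** 2)))
        results.append((start, rmse))
    return results   # pick the window(s) with the lowest RMSECV
```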
Method and apparatus for scientific analysis under low temperature vacuum conditions
Winefordner, James D.; Jones, Bradley T.
1990-01-01
A method and apparatus for scientific analysis of a sample under low temperature vacuum conditions uses a vacuum chamber with a conveyor belt disposed therein. One end of the conveyor belt is a cool end in thermal contact with the cold stage of a refrigerator, whereas the other end of the conveyor belt is a warm end spaced from the refrigerator. A septum allows injection of a sample into the vacuum chamber on top of the conveyor belt for spectroscopic or other analysis. The sample freezes on the conveyor belt at the cold end. One or more windows in the vacuum chamber housing allow spectroscopic analysis of the sample. Following the spectroscopic analysis, the conveyor belt may be moved such that the sample moves toward the warm end of the conveyor belt, whereupon it evaporates, thereby cleaning the conveyor belt. Instead of injecting the sample by way of a septum and the use of a syringe and needle, the present device may be used in series with capillary-column gas chromatography or micro-bore high performance liquid chromatography.
ERIC Educational Resources Information Center
Selcuk, Gamze Sezgin; Yurumezoglu, Kemal
2013-01-01
Someone in a car moving at constant speed along a smooth, straight road cannot perceive movement unless he looks out a window. When the person looks out and sees another car traveling alongside, in the same direction and at an equal speed, he will think that the other car is not moving either. When we see a tree in the distance as we are driving…
Image pre-processing method for near-wall PIV measurements over moving curved interfaces
NASA Astrophysics Data System (ADS)
Jia, L. C.; Zhu, Y. D.; Jia, Y. X.; Yuan, H. J.; Lee, C. B.
2017-03-01
PIV measurements near a moving interface are always difficult. This paper presents a PIV image pre-processing method that returns high spatial resolution velocity profiles near the interface. Instead of re-shaping or re-orientating the interrogation windows, interface tracking and an image transformation are used to stretch the particle image strips near a curved interface into rectangles. Then the adaptive structured interrogation windows can be arranged at specified distances from the interface. Synthetic particles are also added into the solid region to minimize interfacial effects and to restrict particles on both sides of the interface. Since a high spatial resolution is only required in the high velocity gradient region, adaptive meshing and stretching of the image strips in the normal direction are used to improve the cross-correlation signal-to-noise ratio (SN) by reducing the velocity difference and the particle image distortion within the interrogation window. A two-dimensional Gaussian fit is used to compensate for the effects of stretching the particle images. The working hypothesis is that fluid motion near the interface is ‘quasi-tangential flow’, which is reasonable in most fluid-structure interaction scenarios. The method was validated against the window deformation iterative multi-grid scheme (WIDIM) using synthetic image pairs with different velocity profiles. The method was tested for measurements of a supersonic turbulent boundary layer on a flat plate, near a rotating blade, and near a flexible flapping flag. This image pre-processing method provides higher spatial resolution than conventional WIDIM and good robustness for measuring velocity profiles near moving interfaces.
NASA Astrophysics Data System (ADS)
Kaur, Paramjit; Wasan, Ajay
2017-03-01
We present a theoretical model, using the density matrix approach, to study the effect of external longitudinal and transverse magnetic fields on the optical properties of an inhomogeneously broadened multilevel Λ-system using the D2 line in 85Rb and 87Rb atoms. The presence of closely spaced multiple excited states causes asymmetry in the absorption and dispersion profiles. We observe a wide EIT window with a positive slope at the line center for a stationary atom. For a moving atom, in contrast, the linewidth of the EIT window narrows and the positive dispersion becomes steeper. When a magnetic field is applied, our calculations show multiple EIT subwindows that are significantly narrower and shallower than the single EIT window. The number of EIT subwindows depends on the orientation of the magnetic field. We also obtain multiple positive dispersive regions for subluminal propagation in the medium. The anomalous dispersion that exists between two subwindows indicates superluminal light propagation. Our theoretical analysis explains the experiments performed by Wei et al. [Phys. Rev. A 72, 023806 (2005)] and Iftiquar et al. [Phys. Rev. A 79, 013808 (2009)].
NASA Astrophysics Data System (ADS)
Lanka, Karthikeyan; Pan, Ming; Konings, Alexandra; Piles, María; D, Nagesh Kumar; Wood, Eric
2017-04-01
Traditionally, passive microwave retrieval algorithms such as the Land Parameter Retrieval Model (LPRM) simultaneously estimate soil moisture and Vegetation Optical Depth (VOD) from brightness temperature (Tb) data. The algorithm requires a surface roughness parameter which, despite its implications, is generally assumed to be constant at the global scale. Due to inherent noise in the satellite data and the retrieval algorithm, the VOD retrievals are usually observed to fluctuate strongly at the daily scale, which is unlikely to occur in reality. Such noisy VOD retrievals, along with a spatially invariant roughness parameter, may affect the quality of the soil moisture retrievals. The current work aims to smooth the VOD retrievals (under the assumption that VOD remains constant over a period of time) and simultaneously generate, for the first time, a global surface roughness map using multiple descending X-band Tb observations from AMSR-E. The methodology uses Tb values within a moving-time-window setup to estimate concurrently the soil moisture of each day and a constant VOD within the window. Prior to this step, the surface roughness parameter is estimated using the complete time series of the Tb record. After carrying out the necessary sensitivity analysis, the smoothed VOD along with soil moisture retrievals is generated for the 10-year duration of AMSR-E (2002-2011) with a 7-day moving window using the LPRM framework. The spatial patterns of the resulting global VOD maps are coherent with vegetation biomass and climate conditions. The VOD results also exhibit a smoothing effect in terms of lower standard deviation values. This is also evident from time series comparisons, at several grid locations across the globe, of the VOD with LPRM VOD retrievals obtained without optimization over moving windows. The global surface roughness map also exhibits spatial patterns that are strongly influenced by topography and land use conditions. Noticeable features include high roughness over mountainous regions and heavily vegetated tropical rainforests, low roughness in desert areas, and moderate roughness values over higher latitudes. The new datasets of VOD and surface roughness can help improve the quality of soil moisture retrievals. Also, the proposed methodology is generic in nature and can be implemented for the currently operating AMSR2, SMOS, and SMAP soil moisture missions.
[Online endpoint detection algorithm for blending process of Chinese materia medica].
Lin, Zhao-Zhou; Yang, Chan; Xu, Bing; Shi, Xin-Yuan; Zhang, Zhi-Qiang; Fu, Jing; Qiao, Yan-Jiang
2017-03-01
The blending process, which is an essential part of pharmaceutical preparation, has a direct influence on the homogeneity and stability of solid dosage forms. With the official release of the Guidance for Industry PAT, online process analysis techniques have been increasingly reported for applications in blending processes, but research on endpoint detection algorithms is still at an initial stage. By progressively increasing the window size of the moving block standard deviation (MBSD), a novel endpoint detection algorithm was proposed to extend the plain MBSD from the off-line scenario to the online scenario, and it was used to determine the endpoint in the blending process of Chinese medicine dispensing granules. Through online tuning of the window size, the status changes of the materials in the blending process are reflected in the calculation of the standard deviation in real time. The proposed method was tested separately in the blending processes of dextrin and three other traditional Chinese medicine extracts. All of the results show that, compared with the traditional MBSD method, the window size changes produced by the proposed MBSD method (progressively increasing the window size) more clearly reflect the status changes of the materials in the blending process, so it is suitable for online application. Copyright© by the Chinese Pharmaceutical Association.
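A minimal sketch of a growing-window MBSD check is shown below; the choice of block start, the pooling of per-wavelength standard deviations, and the fixed threshold are assumptions, since the paper determines its criteria from the process data rather than from a user-supplied constant.

```python
import numpy as np

def online_mbsd_endpoint(spectra, sd_threshold, min_window=3):
    """Growing-window MBSD sketch (not the published algorithm).

    spectra      : array-like of NIR spectra in acquisition order,
                   shape (n_times, n_wavelengths)
    sd_threshold : blend is declared homogeneous once the pooled SD drops below this
    The block grows from min_window as new spectra arrive, so each check uses all
    spectra acquired since the assumed block start rather than a fixed-size block.
    """
    spectra = np.asarray(spectra, dtype=float)
    for end in range(min_window, len(spectra) + 1):
        block = spectra[:end]                       # progressively larger window
        sd = block.std(axis=0, ddof=1).mean()       # SD per wavelength, then pooled
        if sd < sd_threshold:
            return end                              # endpoint: number of spectra used
    return None                                     # endpoint not reached yet
```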
Li, Xinjian; Cao, Vania Y; Zhang, Wenyu; Mastwal, Surjeet S; Liu, Qing; Otte, Stephani; Wang, Kuan Hong
2017-11-01
In vivo optical imaging of neural activity provides important insights into brain functions at the single-cell level. Cranial windows and virally delivered calcium indicators are commonly used for imaging cortical activity through two-photon microscopes in head-fixed animals. Recently, head-mounted one-photon microscopes have been developed for freely behaving animals. However, minimizing tissue damage from the virus injection procedure and maintaining window clarity for imaging can be technically challenging. We used a wide-diameter glass pipette at the cortical surface for infusing the viral calcium reporter AAV-GCaMP6 into the cortex. After infusion, the scalp skin over the implanted optical window was sutured to facilitate postoperative recovery. The sutured scalp was removed approximately two weeks later and a miniature microscope was attached above the window to image neuronal activity in freely moving mice. We found that cortical surface virus infusion efficiently labeled neurons in superficial layers, and scalp skin suturing helped to maintain the long-term clarity of optical windows. As a result, several hundred neurons could be recorded in freely moving animals. Compared to intracortical virus injection and open-scalp postoperative recovery, our methods minimized tissue damage and dura overgrowth underneath the optical window, and significantly increased the experimental success rate and the yield of identified neurons. Our improved cranial surgery technique allows for high-yield calcium imaging of cortical neurons with head-mounted microscopes in freely behaving animals. This technique may be beneficial for other optical applications such as two-photon microscopy, multi-site imaging, and optogenetic modulation. Published by Elsevier B.V.
The dynamics underlying the regeneration and stalling of Hurricane Harvey
NASA Astrophysics Data System (ADS)
Liang, X. S.
2017-12-01
The explosive regeneration and subsequent stalling turned Hurricane Harvey from a little-noticed storm into an extremely destructive behemoth in late August 2017, incurring an estimated economic loss of 70-200 billion USD. In this study, we use a recently developed analysis tool, namely the multiscale window transform (MWT), and the MWT-based theory of canonical transfer, to investigate the dynamics underlying this regeneration and stalling. The atmospheric fields are reconstructed onto three scale ranges or windows, namely, large-scale, tropical cyclone-scale, and cumulus convection-scale windows. The intertwined cyclone-scale nonlinear energy process is uniquely separated into a transport of energy within the cyclone window and an interscale transfer, through reconstructing the "atomic" energy fluxes on the multiple scale windows. The resulting transfer bears a Lie bracket form, reminiscent of the Poisson bracket in Hamiltonian mechanics, and is hence referred to as canonical. It is found that within the Gulf of Mexico, Harvey gains much energy from the cumulus convection window through an inverse energy cascade, leading to its explosive growth. Meanwhile, there is a barotropic instability (positive canonical transfer) center of the mean circulation in the lower and mid troposphere which lies quasi-steadily over Houston from August 22 through early September. The northwestward-propagating Harvey meets that center and then stalls for two days near the coastline, dropping torrential and unprecedented amounts of rainfall and causing catastrophic flooding. It moves out of the instability center by the end of August, and then dissipates quickly in the following days.
Information transfer across intra/inter-structure of CDS and stock markets
NASA Astrophysics Data System (ADS)
Lim, Kyuseong; Kim, Sehyun; Kim, Soo Yong
2017-11-01
We investigate the information flow between industrial sectors in the credit default swap (CDS) and stock markets in the United States based on transfer entropy. Both markets have been studied with respect to their dynamics and relations. Our approach considers the intra-structure of each financial market as well as the inter-structure between the two markets through a moving window that scans the period from 2005 to 2012. We examine the information transfer for different values of k, in particular k = 3, k = 5 and k = 7. The analysis indicates that the cases k = 3 and k = 7 show opposite trends but similar characteristics. Changes in transfer entropy for the intra-structure of the CDS market precede those of the stock market over the entire set of time windows. Abrupt rises and falls in the inter-structural information transfer between the two markets are detected in periods related to the financial crises, and can be considered early warnings.
NASA Astrophysics Data System (ADS)
Kwon, O.; Kim, W.; Kim, J.
2017-12-01
Recently, construction of subsea tunnels has increased globally. For the safe construction of a subsea tunnel, identifying geological structures, including faults, at the design and construction stages is critically important. However, unlike tunnels on land, it is very difficult to obtain data on the geological structure because of the limits of geological surveying at sea. This study addresses such difficulties by developing a technology to identify the geological structure of the seabed automatically using echo sounding data. When investigating a potential site for a deep subsea tunnel, boreholes and geophysical investigations face technical and economic limits. In contrast, echo sounding data are easily obtainable, with higher information reliability than the above approaches. This study aims at developing an algorithm that identifies large-scale geological structures of the seabed using a geostatistical approach, and it is based on the structural-geology principle that topographic features indicate geological structure. The basic concept of the algorithm is as follows: (1) convert the seabed topography to grid data using echo sounding data, (2) apply a moving window of optimal size to the grid data, (3) estimate the spatial statistics of the grid data in the window area, (4) set a percentile standard for the spatial statistics, (5) display the values satisfying the standard on the map, and (6) visualize the geological structure on the map. The important elements of this study include the optimal size of the moving window, the choice of spatial statistics, and the determination of the optimal percentile standard. To determine these optimal elements, numerous simulations were carried out. Eventually, a user program based on R was developed using the optimal analysis algorithm. The user program was designed to show the variations of various spatial statistics, allowing easy analysis of the geological structure depending on the variation of spatial statistics by making it simple to designate the type of spatial statistic and the percentile standard. This research was supported by the Korea Agency for Infrastructure Technology Advancement under the Ministry of Land, Infrastructure and Transport of the Korean government. (Project Number: 13 Construction Research T01)
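The numbered workflow above maps naturally onto a short script; the sketch below uses local standard deviation as the spatial statistic and a 95th-percentile cutoff purely as placeholder choices (the study evaluates several statistics and standards, and its user program is written in R rather than Python).

```python
import numpy as np
from scipy import ndimage

def flag_structures(depth_grid, window=9, percentile=95):
    """Moving-window spatial statistic on gridded bathymetry, then flag cells
    exceeding a percentile standard; statistic and parameters are assumptions."""
    local_std = ndimage.generic_filter(np.asarray(depth_grid, dtype=float),
                                       np.std, size=window, mode="nearest")
    threshold = np.percentile(local_std, percentile)
    return local_std >= threshold   # boolean map of candidate geological structures
```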
The assessment and evaluation of low-frequency noise near the region of infrasound.
Ziaran, Stanislav
2014-01-01
The main aim of this paper is to present recent knowledge about the assessment and evaluation of low-frequency sounds (noise) and infrasound, close to the threshold of hearing, and identify their potential effect on human health and annoyance. Low-frequency noise generated by air flowing over a moving car with an open window was chosen as a typical scenario which can be subjectively assessed by people traveling by automobile. The principle of noise generated within the interior of the car and its effects on the comfort of the driver and passengers are analyzed at different velocities. An open window of a car at high velocity behaves as a source of specifically strong tonal low-frequency noise which is generally perceived as annoying. The interior noise generated by an open window of a passenger car was measured under different conditions: Driving on a highway and driving on a typical roadway. First, an octave-band analysis was used to assess the noise level and its impact on the driver's comfort. Second, a fast Fourier transform (FFT) analysis and one-third octave-band analysis were used for the detection of tonal low-frequency noise. Comparison between two different car makers was also done. Finally, the paper suggests some possibilities for scientifically assessing and evaluating low-frequency sounds in general, and some recommendations are introduced for scientific discussion, since sounds with strong low-frequency content (but not only strong) engender greater annoyance than is predicted by an A-weighted sound pressure level.
On the temporal window of auditory-brain system in connection with subjective responses
NASA Astrophysics Data System (ADS)
Mouri, Kiminori
2003-08-01
The human auditory-brain system processes information extracted from the autocorrelation function (ACF) of the source signal and the interaural cross-correlation function (IACF) of binaural sound signals, which are associated with the left and right cerebral hemispheres, respectively. The purpose of this dissertation is to determine the desirable temporal window (2T: integration interval) for the ACF and IACF mechanisms. For the ACF mechanism, the change of Φ(0), i.e., the power of the ACF, was associated with the change of loudness, and it is shown that the recommended temporal window is about 30(τe)min [s]. The value of (τe)min is the minimum value of the effective duration of the running ACF of the source signal. It is worth noting from the EEG experiment that the most preferred delay time of the first reflection sound is determined by the piece indicating (τe)min in the source signal. For the IACF mechanism, the temporal window is determined as follows: the measured range of τIACC corresponding to the subjective angle of the moving image sound depends on the temporal window. Here, the moving image was simulated by using two loudspeakers located at +/-20° in the horizontal plane, reproducing amplitude-modulated band-limited noise alternately. It is found that the temporal window has a wide range of values from 0.03 to 1 [s] for modulation frequencies below 0.2 Hz. Thesis advisor: Yoichi Ando. Copies of this thesis written in English can be obtained from Kiminori Mouri, 5-3-3-1110 Harayama-dai, Sakai city, Osaka 590-0132, Japan. E-mail address: km529756@aol.com
Pettit runs a drill while looking through a camera mounted on the Nadir window in the U.S. Lab
2003-04-05
ISS006-E-44305 (5 April 2003) --- Astronaut Donald R. Pettit, Expedition Six NASA ISS science officer, runs a drill while looking through a camera mounted on the nadir window in the Destiny laboratory on the International Space Station (ISS). The device is called a barn door tracker. The drill turns the screw, which moves the camera and its spotting scope.
[A fast iterative algorithm for adaptive histogram equalization].
Cao, X; Liu, X; Deng, Z; Jiang, D; Zheng, C
1997-01-01
In this paper, we propose an iterative algorithm called FAHE, which is based on the relationship between the current local histogram and the one before the sliding window moves. Compared with basic AHE, the computing time of FAHE is decreased from 5 hours to 4 minutes on a 486DX/33-compatible computer when using a 65 x 65 sliding window for a 512 x 512 image with an 8-bit gray-level range.
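The abstract does not reproduce the update equations, so the sketch below only illustrates the incremental-histogram idea behind FAHE: when the window slides one column, subtract the departing column's counts and add the entering column's counts instead of rebuilding the histogram from scratch. Names and parameters are placeholders.

```python
import numpy as np

def fahe_local_histograms(img, half=32, levels=256):
    """Incremental local histograms for one image row as a (2*half+1)-wide window
    slides column by column. img: 2D array of non-negative integer gray levels
    (e.g., uint8). The AHE mapping would follow from each histogram's CDF."""
    img = np.asarray(img)
    h, w = img.shape
    row = half                                        # illustrate for one interior row
    rows = slice(row - half, row + half + 1)
    hist = np.bincount(img[rows, 0:2 * half + 1].ravel(), minlength=levels)
    hists = [hist.copy()]
    for col in range(half + 1, w - half):
        leaving = img[rows, col - half - 1]           # column that exits the window
        entering = img[rows, col + half]              # column that enters the window
        hist -= np.bincount(leaving, minlength=levels)
        hist += np.bincount(entering, minlength=levels)
        hists.append(hist.copy())
    return hists
```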
Air change rates of motor vehicles and in-vehicle pollutant concentrations from secondhand smoke.
Ott, Wayne; Klepeis, Neil; Switzer, Paul
2008-05-01
The air change rates of motor vehicles are relevant to the sheltering effect from air pollutants entering from outside a vehicle and also to the interior concentrations from any sources inside its passenger compartment. We made more than 100 air change rate measurements on four motor vehicles under moving and stationary conditions; we also measured the carbon monoxide (CO) and fine particle (PM(2.5)) decay rates from 14 cigarettes smoked inside the vehicle. With the vehicle stationary and the fan off, the ventilation rate in air changes per hour (ACH) was less than 1 h(-1) with the windows closed and increased to 6.5 h(-1) with one window fully opened. The vehicle speed, window position, ventilation system, and air conditioner setting were found to affect the ACH. For closed windows and passive ventilation (fan off and no recirculation), the ACH was linearly related to the vehicle speed over the range from 15 to 72 mph (25 to 116 km h(-1)). With a vehicle moving, windows closed, and the ventilation system off (or the air conditioner set to AC Max), the ACH was less than 6.6 h(-1) for speeds ranging from 20 to 72 mph (32 to 116 km h(-1)). Opening a single window by 3'' (7.6 cm) increased the ACH by 8-16 times. For the 14 cigarettes smoked in vehicles, the deposition rate k and the air change rate a were correlated, following the equation k=1.3a (R(2)=82%; n=14). With recirculation on (or AC Max) and closed windows, the interior PM(2.5) concentration momentarily exceeded 2000 microg m(-3) for all cigarettes tested, regardless of speed. The concentration time series measured inside the vehicle followed the mathematical solutions of the indoor mass balance model, and the 24-h average personal exposure to PM(2.5) could exceed 35 microg m(-3) for just two cigarettes smoked inside the vehicle.
Dual-filter estimation for rotating-panel sample designs
Francis Roesch
2017-01-01
Dual-filter estimators are described and tested for use in the annual estimation for national forest inventories. The dual-filter approach involves the use of a moving window estimator in the first pass, which is used as input to Theil's mixed estimator in the second pass. The moving window and dual-filter estimators are tested along with two other estimators in a...
Can we estimate total magnetization directions from aeromagnetic data using Helbig's integrals?
Phillips, J.D.
2005-01-01
An algorithm that implements Helbig's (1963) integrals for estimating the vector components (mx, my, mz) of the magnetic dipole moment from the first-order moments of the vector magnetic field components (ΔX, ΔY, ΔZ) is tested on real and synthetic data. After a grid of total field aeromagnetic data is converted to vector component grids using Fourier filtering, Helbig's infinite integrals are evaluated as finite integrals in small moving windows using a quadrature algorithm based on the 2-D trapezoidal rule. Prior to integration, best-fit planar surfaces must be removed from the component data within the data windows in order to make the results independent of the coordinate system origin. Two different approaches are described for interpreting the results of the integration. In the "direct" method, results from pairs of different window sizes are compared to identify grid nodes where the angular difference between solutions is small. These solutions provide valid estimates of total magnetization directions for compact sources such as spheres or dipoles, but not for horizontally elongated or 2-D sources. In the "indirect" method, which is more forgiving of source geometry, results of the quadrature analysis are scanned for solutions that are parallel to a specified total magnetization direction.
Time-localized wavelet multiple regression and correlation
NASA Astrophysics Data System (ADS)
Fernández-Macho, Javier
2018-02-01
This paper extends wavelet methodology to handle comovement dynamics of multivariate time series via moving weighted regression on wavelet coefficients. The concept of wavelet local multiple correlation is used to produce one single set of multiscale correlations along time, in contrast with the large number of wavelet correlation maps that need to be compared when using standard pairwise wavelet correlations with rolling windows. Also, the spectral properties of weight functions are investigated, and it is argued that some common time windows, such as the usual rectangular rolling window, are not satisfactory on these grounds. The method is illustrated with a multiscale analysis of the comovements of Eurozone stock markets during this century. It is shown how the evolution of the correlation structure in these markets has been far from homogeneous both along time and across timescales, featuring an acute divide at about the quarterly scale. At longer scales, evidence from the long-term correlation structure can be interpreted as stable perfect integration among Euro stock markets. On the other hand, at intramonth and intraweek scales, the short-term correlation structure has been clearly evolving along time, experiencing a sharp increase during financial crises, which may be interpreted as evidence of financial 'contagion'.
Window of visibility - A psychophysical theory of fidelity in time-sampled visual motion displays
NASA Technical Reports Server (NTRS)
Watson, A. B.; Ahumada, A. J., Jr.; Farrell, J. E.
1986-01-01
A film of an object in motion presents on the screen a sequence of static views, while the human observer sees the object moving smoothly across the screen. Questions related to the perceptual identity of continuous and stroboscopic displays are examined. Time-sampled moving images are considered along with the contrast distribution of continuous motion, the contrast distribution of stroboscopic motion, the frequency spectrum of continuous motion, the frequency spectrum of stroboscopic motion, the approximation of the limits of human visual sensitivity to spatial and temporal frequencies by a window of visibility, the critical sampling frequency, the contrast distribution of staircase motion and the frequency spectrum of this motion, and the spatial dependence of the critical sampling frequency. Attention is given to apparent motion, models of motion, image recording, and computer-generated imagery.
Etchells, Peter J; Benton, Christopher P; Ludwig, Casimir J H; Gilchrist, Iain D
2011-01-01
A growing number of studies in vision research employ analyses of how perturbations in visual stimuli influence behavior on single trials. Recently, we have developed a method along such lines to assess the time course over which object velocity information is extracted on a trial-by-trial basis in order to produce an accurate intercepting saccade to a moving target. Here, we present a simplified version of this methodology, and use it to investigate how changes in stimulus contrast affect the temporal velocity integration window used when generating saccades to moving targets. Observers generated saccades to one of two moving targets which were presented at high (80%) or low (7.5%) contrast. In 50% of trials, target velocity stepped up or down after a variable interval after the saccadic go signal. The extent to which the saccade endpoint can be accounted for as a weighted combination of the pre- or post-step velocities allows for identification of the temporal velocity integration window. Our results show that the temporal integration window takes longer to peak in the low when compared to high contrast condition. By enabling the assessment of how information such as changes in velocity can be used in the programming of a saccadic eye movement on single trials, this study describes and tests a novel methodology with which to look at the internal processing mechanisms that transform sensory visual inputs into oculomotor outputs.
Lian, Yanyun; Song, Zhijian
2014-01-01
Brain tumor segmentation from magnetic resonance imaging (MRI) is an important step toward surgical planning, treatment planning, and monitoring of therapy. However, manual tumor segmentation, commonly used in the clinic, is time-consuming and challenging, and none of the existing automated methods is highly robust, reliable, and efficient in clinical application. An accurate and automated tumor segmentation method has been developed for brain tumor segmentation that provides reproducible and objective results close to manual segmentation. Based on the symmetry of the human brain, we employed a sliding-window technique and the correlation coefficient to locate the tumor position. First, the image to be segmented was normalized, rotated, denoised, and bisected. Subsequently, vertical and horizontal sliding-window techniques were applied in turn: two windows, one in the left and one in the right part of the brain image, were moved simultaneously pixel by pixel while the correlation coefficient of the two windows was calculated. The pair of windows with the minimal correlation coefficient was obtained; the window with the larger average gray value indicates the location of the tumor, and the pixel with the largest gray value is taken as the tumor locating point. Finally, the segmentation threshold was decided by the average gray value of the pixels in the square centered at the locating point with a side length of 10 pixels, and threshold segmentation and morphological operations were used to acquire the final tumor region. The method was evaluated on 3D FSPGR brain MR images of 10 patients. As a result, the average rate of correct location was 93.4% for 575 slices containing tumor, the average Dice similarity coefficient was 0.77 for one scan, and the average time spent on one scan was 40 seconds. A fully automated, simple, and efficient segmentation method for brain tumors is proposed and is promising for future clinical use. The correlation coefficient is a new and effective feature for tumor location.
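A coarse sketch of the symmetry-based location step follows; it slides paired windows over the left half and the mirrored right half with a half-window stride rather than pixel by pixel, so it is only an approximation of the procedure described above, with placeholder window sizes.

```python
import numpy as np

def locate_by_symmetry(img, win=32):
    """Slide same-position windows over the left half and the mirrored right half,
    and keep the position where the two windows are least correlated (a tumor
    breaks left-right symmetry). Returns the side and window corner."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    left, right = img[:, :w // 2], np.fliplr(img[:, w - w // 2:])
    best = ((0, 0), 2.0)                      # (top-left corner, correlation)
    for r in range(0, h - win + 1, win // 2):
        for c in range(0, left.shape[1] - win + 1, win // 2):
            a = left[r:r + win, c:c + win].ravel()
            b = right[r:r + win, c:c + win].ravel()
            if a.std() == 0 or b.std() == 0:
                continue
            corr = np.corrcoef(a, b)[0, 1]
            if corr < best[1]:
                best = ((r, c), corr)
    (r, c), _ = best
    # The window (left or mirrored right) with the larger mean gray value holds the tumor.
    a, b = left[r:r + win, c:c + win], right[r:r + win, c:c + win]
    side = "left" if a.mean() >= b.mean() else "right"
    return side, (r, c)
```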
NASA Astrophysics Data System (ADS)
Gligor, M.; Ausloos, M.
2007-05-01
The statistical distances between countries, calculated for various moving average time windows, are mapped into the ultrametric subdominant space as in classical Minimal Spanning Tree methods. The Moving Average Minimal Length Path (MAMLP) algorithm allows a decoupling of fluctuations with respect to the mass center of the system from the movement of the mass center itself. A Hamiltonian representation given by a factor graph is used and plays the role of cost function. The present analysis pertains to 11 macroeconomic (ME) indicators, namely the GDP (x1), Final Consumption Expenditure (x2), Gross Capital Formation (x3), Net Exports (x4), Consumer Price Index (y1), Rates of Interest of the Central Banks (y2), Labour Force (z1), Unemployment (z2), GDP/hour worked (z3), GDP/capita (w1) and Gini coefficient (w2). The target group of countries is composed of 15 EU countries, data taken between 1995 and 2004. By two different methods (the Bipartite Factor Graph Analysis and the Correlation Matrix Eigensystem Analysis) it is found that the strongly correlated countries with respect to the macroeconomic indicators fluctuations can be partitioned into stable clusters.
Teng, Wei-Zhuo; Song, Jia; Meng, Fan-Xin; Meng, Qing-Fan; Lu, Jia-Hui; Hu, Shuang; Teng, Li-Rong; Wang, Di; Xie, Jing
2014-10-01
Partial least squares (PLS) and radial basis function neural network (RBFNN) combined with near infrared spectroscopy (NIR) were applied to develop models for cordycepic acid, polysaccharide, and adenosine analysis in Paecilomyces hepialid fermentation mycelium. The developed models possess good generalization and predictive ability and can be applied to the determination of crude drugs and related products. During the experiment, 214 Paecilomyces hepialid mycelium samples were obtained via chemical mutagenesis combined with submerged fermentation. The contents of cordycepic acid, polysaccharide, and adenosine were determined via traditional methods, and the near infrared spectroscopy data were collected. The outliers were removed and the composition of the calibration set was confirmed via the Monte Carlo partial least squares (MCPLS) method. Based on the values of the degree of approach (Da), both moving window partial least squares (MWPLS) and moving window radial basis function neural network (MWRBFNN) were applied to optimize the characteristic wavelength variables, the optimum preprocessing methods, and other important variables in the models. After comparison, RBFNN, RBFNN, and PLS models were developed successfully for cordycepic acid, polysaccharide, and adenosine detection, respectively, and the correlations between reference values and predicted values in the calibration set (R2c) and validation set (R2p) of the optimum models were 0.9417 and 0.9663, 0.9803 and 0.9850, and 0.9761 and 0.9728, respectively. All the data suggest that these models possess good fitness and predictive ability.
Sabushimike, Donatien; Na, Seung You; Kim, Jin Young; Bui, Ngoc Nam; Seo, Kyung Sik; Kim, Gil Gyeom
2016-01-01
The detection of a moving target using an IR-UWB Radar involves the core task of separating the waves reflected by the static background and by the moving target. This paper investigates the capacity of the low-rank and sparse matrix decomposition approach to separate the background and the foreground in UWB Radar-based moving target detection. Robust PCA models are criticized for being batch-data-oriented, which makes them inconvenient in realistic environments where frames need to be processed as they are recorded in real time. In this paper, a novel method based on overlapping-windows processing is proposed to cope with online processing. The method consists of processing a small batch of frames that is continually updated, without changing its size, as new frames are captured. We prove that RPCA (via its Inexact Augmented Lagrange Multiplier (IALM) model) can successfully separate the two subspaces, which enhances the accuracy of target detection. The overlapping-windows processing method converges to the same optimal solution as its batch counterpart (i.e., processing batched data with RPCA), and both methods demonstrate the robustness and efficiency of RPCA over the classic PCA and the commonly used exponential averaging method. PMID:27598159
DOE Office of Scientific and Technical Information (OSTI.GOV)
James, S. R.; Knox, H. A.; Abbott, R. E.
Cross correlations of seismic noise can potentially record large changes in subsurface velocity due to permafrost dynamics and be valuable for long-term Arctic monitoring. We applied seismic interferometry, using moving window cross-spectral analysis (MWCS), to 2 years of ambient noise data recorded in central Alaska to investigate whether seismic noise could be used to quantify relative velocity changes due to seasonal active-layer dynamics. The large velocity changes (>75%) between frozen and thawed soil caused prevalent cycle-skipping which made the method unusable in this setting. We developed an improved MWCS procedure which uses a moving reference to measure daily velocity variations that are then accumulated to recover the full seasonal change. This approach reduced cycle-skipping and recovered a seasonal trend that corresponded well with the timing of active-layer freeze and thaw. Lastly, this improvement opens the possibility of measuring large velocity changes by using MWCS and permafrost monitoring by using ambient noise.
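The accumulation step can be sketched very simply: if each daily measurement is the small relative velocity change between consecutive days (the moving reference), the seasonal change is recovered by compounding them. The MWCS measurement itself is not reproduced here, and the function and variable names are illustrative only.

```python
import numpy as np

def accumulate_dvv(daily_dvv):
    """daily_dvv[i] is the relative velocity change measured between day i and
    day i+1 (day i used as the reference). Relative changes compound
    multiplicatively; for small dv/v a cumulative sum is a close approximation,
    but the product form is used here for robustness to large changes."""
    daily_dvv = np.asarray(daily_dvv, dtype=float)
    return np.cumprod(1.0 + daily_dvv) - 1.0   # cumulative dv/v relative to day 0
```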
[A peak recognition algorithm designed for chromatographic peaks of transformer oil].
Ou, Linjun; Cao, Jian
2014-09-01
In the field of chromatographic peak identification for transformer oil, the traditional first-order derivative method requires a slope threshold to achieve peak identification. To address its shortcomings of low automation and susceptibility to distortion, the first-order derivative method was improved by applying a moving-average iterative method and normalized analysis techniques to identify the peaks. Accurate identification of the chromatographic peaks was achieved by using multiple iterations of the moving average of the signal curves and square-wave curves to determine the optimal value of the normalized peak identification parameters, combined with the absolute peak retention times and the peak window. The experimental results show that this algorithm can accurately identify the peaks and is not sensitive to noise, chromatographic peak width, or peak shape changes. It has strong adaptability and meets the on-site requirements of online monitoring devices for dissolved gases in transformer oil.
Traversing Time and Space from the Blessing Window
NASA Astrophysics Data System (ADS)
Huang, Ya-Ling
2013-02-01
The visual graphics for the holographic artwork "Blessing Window" were created from observations of Tainan city, with a focus on the beauty of Chinese characters and their typography. The concept of movement in the artwork comes from a traditional Chinese philosophy: "When the mountain does not move, the road extends; when the road does not extend to the destination, the heart will extend." One multiplex hologram and an interactive installation were used to combine the visual concepts of typography and the philosophy.
Cassidy looks through window into the PMA-2 during STS-127 Mission
2009-07-17
S127-E-006705 (17 July 2009) --- Astronaut Christopher Cassidy, STS-127 mission specialist, peers through a window in the hatch that separates seven Endeavour crew members from six International Space Station inhabitants. But the separation wasn't for long, as soon afterward the hatch was opened and the visitors from Earth moved onto the station to set the population record at 13. More importantly, over a week's worth of joint activities lies ahead for the two crews.
Canonical Probability Distributions for Model Building, Learning, and Inference
2006-07-14
The value of alpha is set in the diagnostic window by moving the slider in the upper right-hand side of the window. The upper bound of alpha can be modified by typing the new value in the small edit box to the right of the slider.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravindran, P; Wui Ann, W; Lim, Y
Purpose: In general, the linear accelerator is gated using a respiratory signal obtained by way of external sensors to account for breathing motion during radiotherapy. One of the commonly used gating devices is the Varian RPM device. The Calypso system, which uses electromagnetic tracking of implanted or surface transponders, could also be used for gating. The aim of this study is to compare the gating efficiency of the RPM device and the Calypso system in phantom studies. Methods: An ArcCheck insert was used as the phantom with a Gafchromic film placed in its holder. The ArcCheck insert was placed on a Motion Sim platform and moved in the longitudinal direction simulating a respiratory motion with a period of 5 seconds and an amplitude of ±6 mm. The Gafchromic film was exposed to a 2 × 2 cm² field (i) with the phantom static, (ii) with the phantom moving but ungated, and (iii) gated with gating windows of 2 mm and 3 mm. This was repeated with the Calypso system using surface transponders with the same gating windows. The Gafchromic films were read with an EPSON 11000 flatbed scanner and analysed with the 'Medphysto' software. Results: The full width at half maximum (FWHM) measured with film at the level of the film holder was 1.65 cm when the phantom was static. The FWHM measured with the phantom moving and without gating was 1.16 cm, and the penumbra was 7 mm (80-20%) on both sides. When the beam was gated with a 2 mm gating window, the FWHM was 1.8 cm with the RPM device and 1.9 cm with Calypso. Similarly, when the beam was gated with a 3 mm window, the FWHM was 1.9 cm with the RPM device and 2 cm with Calypso. Conclusion: This work suggests that the gating efficiency of the RPM device is better than that of the Calypso with surface transponders, with reference to the latency in gating.
Complex Patterns in Financial Time Series Through HIGUCHI’S Fractal Dimension
NASA Astrophysics Data System (ADS)
Grace Elizabeth Rani, T. G.; Jayalalitha, G.
2016-11-01
This paper analyzes the complexity of stock exchanges through fractal theory. Closing price indices of four stock exchanges with different industry sectors are selected. The degree of complexity is assessed through Higuchi's fractal dimension. Various window sizes are considered in evaluating the fractal dimension. It is inferred that the data considered as a whole represent a random walk for all four indices. Analysis of the financial data through a windowing procedure exhibits multi-fractality. Attempts to apply moving averages to reduce noise in the data revealed lower estimates of the fractal dimension, which was verified using fractional Brownian motion. A change in the normalization factor in Higuchi's algorithm did improve the results. It is quintessential to focus on rural development to realize a standard and steady growth of the economy, and tools must be devised to settle the issues in this regard. Micro-level institutions are necessary for the economic growth of a country like India, which would induce a sporadic development in the present global economic scenario.
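Higuchi's fractal dimension has a standard textbook form, sketched below in Python together with a moving-window wrapper of the kind used for the closing-price indices; the window size, step, and k_max values are placeholders, not those used in the paper.

```python
import numpy as np

def higuchi_fd(x, k_max=10):
    """Higuchi's fractal dimension of a 1D series (standard formulation)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    log_k, log_L = [], []
    for k in range(1, k_max + 1):
        Lk = []
        for m in range(k):
            idx = np.arange(m, n, k)                  # subsampled series x[m], x[m+k], ...
            if len(idx) < 2:
                continue
            length = np.sum(np.abs(np.diff(x[idx])))
            norm = (n - 1) / ((len(idx) - 1) * k)     # normalization factor
            Lk.append(length * norm / k)
        log_k.append(np.log(1.0 / k))
        log_L.append(np.log(np.mean(Lk)))
    # Slope of log L(k) versus log(1/k) estimates the fractal dimension.
    return np.polyfit(log_k, log_L, 1)[0]

def windowed_fd(series, window=250, step=50, k_max=10):
    """Evaluate Higuchi's FD in moving windows along a price-index series."""
    return [higuchi_fd(series[s:s + window], k_max)
            for s in range(0, len(series) - window + 1, step)]
```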
Development of an external beam nuclear microprobe on the Aglae facility of the Louvre museum
NASA Astrophysics Data System (ADS)
Calligaro, T.; Dran, J.-C.; Ioannidou, E.; Moignard, B.; Pichon, L.; Salomon, J.
2000-03-01
The external beam line of our facility has been recently equipped with the focusing system previously mounted on a classical nuclear microprobe. When using a 0.1 μm thick Si 3N 4 foil for the exit window and flowing helium on the sample under analysis, a beam spot as small as 10 μm is attainable at a distance of 3 mm from the window. Elemental micromapping is performed by mechanical scanning. An electronic device has been designed which allows XY scanning by moving the sample under the beam by steps down to 0.1 μm. Beam monitoring is carried out by means of the weak X-ray signal emitted by the exit foil and detected by a specially designed Si(Li) detector cooled by Peltier effect. The characteristics of external beams of protons and alpha particles are evaluated by means of resonance scanning and elemental mapping of a grid. An example of application is presented, dealing with elemental micro-mapping of inclusions in gemstones.
Novel windowing technique realized in FPGA for radar system
NASA Astrophysics Data System (ADS)
Escamilla-Hernandez, E.; Kravchenko, V. F.; Ponomaryov, V. I.; Ikuo, Arai
2006-02-01
To improve weak-target detection in radar applications, pulse compression is usually used; in the case of linear FM modulation it can improve the SNR. One drawback is that it can add range side-lobes to reflectivity measurements. Using weighting-window processing in the time domain, it is possible to decrease the side-lobe level (SLL) significantly and resolve small or low-power targets that are masked by powerful ones. Classical windows such as Hamming, Hanning, etc., are usually used in window processing. In addition to classical windows, in this paper we also use a novel class of windows based on atomic function (AF) theory. For comparison of the simulation and experimental results we applied standard parameters, such as the coefficient of amplification, maximum side-lobe level, width of the main lobe, etc. To implement the compression-windowing model at the hardware level, an FPGA was employed. This work aims at demonstrating a reasonably flexible implementation of an FM-linear signal, pulse compression, and windowing employing FPGAs. Classical and novel AF window techniques have been investigated to reduce the SLL, taking into account the influence of noise, and to increase the ability to detect small or weak targets in imaging radar. The paper presents the experimental hardware results of windowing in pulse compression radar resolving several targets for rectangular, Hamming, Kaiser-Bessel, and (see manuscript for formula) function windows. The windows created by using the atomic functions offer substantially better reduction of the SLL in the presence of noise and away from the main lobe, in comparison with classical windows.
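A simple numerical sketch of the windowing effect on pulse compression is given below; it applies a Hamming-weighted matched filter to a linear-FM pulse with illustrative parameters and does not model the FPGA implementation or the atomic-function windows. Comparing the outputs for window="hamming" and window="rect" shows the side-lobe reduction and the accompanying main-lobe widening.

```python
import numpy as np

def lfm_pulse_compression(fs=1e6, pulse_len=100e-6, bandwidth=200e3, window="hamming"):
    """Compress a linear-FM (chirp) pulse with a (optionally weighted) matched
    filter and return the compressed output in dB relative to its peak."""
    t = np.arange(0, pulse_len, 1 / fs)
    k = bandwidth / pulse_len                       # chirp rate
    chirp = np.exp(1j * np.pi * k * t ** 2)         # transmitted LFM pulse
    w = np.hamming(len(t)) if window == "hamming" else np.ones(len(t))
    matched = np.conj(chirp[::-1]) * w              # weighted matched filter
    compressed = np.convolve(chirp, matched)
    power_db = 20 * np.log10(np.abs(compressed) / np.abs(compressed).max() + 1e-12)
    return power_db                                 # inspect the peak side-lobe level
```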
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ojeda-Gonzalez, A.; Prestes, A.; Klausner, V.
Spatio-temporal entropy (STE) analysis is used as an alternative mathematical tool to identify possible magnetic cloud (MC) candidates. We analyze Interplanetary Magnetic Field (IMF) data using a time interval of only 10 days. We select a convenient data interval of 2500 records, moving forward in steps of 200 records until the end of the time series. For every data segment, the STE is calculated at each step. During an MC event, the STE reaches values close to zero. This extremely low value of STE is due to features of the MC structure. However, not all of the magnetic components in MCs have STE values close to zero at the same time. For this reason, we create a standardization index (the so-called Interplanetary Entropy, IE, index). This index is a worthwhile effort to develop new tools to help diagnose ICME structures. The IE was calculated using a time window of one year (1999), and it has a success rate of 70% over other identifiers of MCs. The unsuccessful cases (30%) are caused by small and weak MCs. The results show that the IE methodology identified 9 of 13 MCs and produced nine false alarms. In 1999, a total of 788 windows of 2500 values existed, meaning that the percentage of false alarms was 1.14%, which can be considered a good result. In addition, four time windows, each of 10 days, are studied in which the IE method was effective in finding MC candidates. As a novel result, two new MCs are identified in these time windows.
Reduction of display artifacts by random sampling
NASA Technical Reports Server (NTRS)
Ahumada, A. J., Jr.; Nagel, D. C.; Watson, A. B.; Yellott, J. I., Jr.
1983-01-01
The application of random-sampling techniques to remove visible artifacts (such as flicker, moire patterns, and paradoxical motion) introduced in TV-type displays by discrete sequential scanning is discussed and demonstrated. Sequential-scanning artifacts are described; the window of visibility defined in spatiotemporal frequency space by Watson and Ahumada (1982 and 1983) and Watson et al. (1983) is explained; the basic principles of random sampling are reviewed and illustrated by the case of the human retina; and it is proposed that the sampling artifacts can be replaced by random noise, which can then be shifted to frequency-space regions outside the window of visibility. Vertical sequential, single-random-sequence, and continuously renewed random-sequence plotting displays generating 128 points at update rates up to 130 Hz are applied to images of stationary and moving lines, and best results are obtained with the single random sequence for the stationary lines and with the renewed random sequence for the moving lines.
NASA Astrophysics Data System (ADS)
Jeter, G. W.; Carter, G. A.
2013-12-01
The over-arching goal of this research is to assess habitat change over a seventy-year period to better understand the combined effects of global sea level rise and storm impacts on the stability of Horn Island, MS habitats. Historical aerial photography is often overlooked as a resource for determining habitat change. However, the spatial information provided even by black-and-white imagery can give insight into past habitat composition via textural analysis. This research evaluates characteristic dimensions, most notably patch size, of habitat types using simple geo-statistics and textures of brightness values in historical aerial imagery. It is assumed that each cover type has an identifiable patch size that can be used as a unique classifier of each habitat type. Analytical methods applied to the 1940 imagery were developed using 2010 field data and USDA aerial imagery. Textural moving window methods and basic geo-statistics were used to estimate characteristic dimensions of each cover type in the 1940 aerial photography. The moving window texture analysis was configured with multiple window sizes to capture the characteristic dimensions of six habitat types: water, bare sand, dune herb land, estuarine shrub land, marsh land, and slash pine woodland. Coefficient of variation (CV), contrast, and entropy texture filters were used to analyze the spatial variability of the 1940 and 2010 imagery. CV was used to depict the horizontal variability of each habitat characteristic dimension, contrast was used to represent the variability of bright versus dark pixel values, and entropy was used to show the variation in the slash pine woodland habitat type. Results indicate a substantial increase in marshland habitat relative to other habitat types since 1940. Results also reveal that each habitat type (dune herb land, marsh land, estuarine shrub land, bare sand, slash pine woodland, and water) exhibits a characteristic dimension that may be estimated from horizontal variability in image brightness values. These characteristic dimensions are estimated at less than 1 meter for marsh land, bare sand, and water; 3 meters for estuarine shrub land and dune herb land; and 5 to 7 meters for slash pine woodland.
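A minimal sketch of the coefficient-of-variation texture filter applied in square moving windows of several sizes is shown below in Python; the image array, the window sizes, and the CV formula at zero mean are illustrative assumptions, not the study's exact configuration.

```python
import numpy as np
from scipy.ndimage import generic_filter

def coeff_var(values):
    """Coefficient of variation of the brightness values inside one moving window."""
    m = values.mean()
    return values.std() / m if m > 0 else 0.0

image = np.random.rand(256, 256)               # placeholder for 1940 panchromatic brightness values

# Several window sizes intended to bracket the characteristic patch dimension of each habitat type.
cv_textures = {size: generic_filter(image, coeff_var, size=size) for size in (3, 9, 21)}
```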
Dynamic Granger-Geweke causality modeling with application to interictal spike propagation
Lin, Fa-Hsuan; Hara, Keiko; Solo, Victor; Vangel, Mark; Belliveau, John W.; Stufflebeam, Steven M.; Hamalainen, Matti S.
2010-01-01
A persistent problem in developing plausible neurophysiological models of perception, cognition, and action is the difficulty of characterizing the interactions between different neural systems. Previous studies have approached this problem by estimating causal influences across brain areas activated during cognitive processing using Structural Equation Modeling (SEM) and, more recently, with Granger-Geweke causality. While SEM is complicated by the need for a priori directional connectivity information, the temporal resolution of dynamic Granger-Geweke estimates is limited because the underlying autoregressive (AR) models assume stationarity over the period of analysis. We have developed a novel optimal method for obtaining data-driven directional causality estimates with high temporal resolution in both time and frequency domains. This is achieved by simultaneously optimizing the length of the analysis window and the chosen AR model order using the SURE criterion. Dynamic Granger-Geweke causality in time and frequency domains is subsequently calculated within a moving analysis window. We tested our algorithm by calculating the Granger-Geweke causality of epileptic spike propagation from the right frontal lobe to the left frontal lobe. The results quantitatively suggested that the epileptic activity at the left frontal lobe was propagated from the right frontal lobe, in agreement with the clinical diagnosis. Our novel computational tool can be used to help elucidate complex directional interactions in the human brain. PMID:19378280
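The following Python sketch shows the time-domain Granger-Geweke idea evaluated in a moving window: the measure is the log ratio of residual variances of an AR model of one channel fitted without and with lagged terms of the other channel. It uses a fixed AR order and fixed window length; the SURE-based joint selection of window length and model order described in the abstract is not reproduced, and the function names and defaults are assumptions.

```python
import numpy as np

def granger_causality(x, y, p=5):
    """Geweke's time-domain measure of y -> x: log ratio of AR(p) residual variances
    for a model of x without and with lagged y regressors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    X_own  = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
    X_full = np.column_stack([X_own] + [y[p - k - 1:n - k - 1, None] for k in range(p)])
    target = x[p:]
    r_own  = target - X_own  @ np.linalg.lstsq(X_own,  target, rcond=None)[0]
    r_full = target - X_full @ np.linalg.lstsq(X_full, target, rcond=None)[0]
    return np.log(r_own.var() / r_full.var())

def moving_gc(x, y, width=200, step=20, p=5):
    """Causality of y -> x recomputed in a moving analysis window."""
    starts = range(0, len(x) - width + 1, step)
    return np.array([granger_causality(x[s:s + width], y[s:s + width], p) for s in starts])
```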
Phast4Windows: A 3D graphical user interface for the reactive-transport simulator PHAST
Charlton, Scott R.; Parkhurst, David L.
2013-01-01
Phast4Windows is a Windows® program for developing and running groundwater-flow and reactive-transport models with the PHAST simulator. This graphical user interface allows definition of grid-independent spatial distributions of model properties—the porous media properties, the initial head and chemistry conditions, boundary conditions, and locations of wells, rivers, drains, and accounting zones—and other parameters necessary for a simulation. Spatial data can be defined without reference to a grid by drawing, by point-by-point definitions, or by importing files, including ArcInfo® shape and raster files. All definitions can be inspected, edited, deleted, moved, copied, and switched from hidden to visible through the data tree of the interface. Model features are visualized in the main panel of the interface, so that it is possible to zoom, pan, and rotate features in three dimensions (3D). PHAST simulates single phase, constant density, saturated groundwater flow under confined or unconfined conditions. Reactions among multiple solutes include mineral equilibria, cation exchange, surface complexation, solid solutions, and general kinetic reactions. The interface can be used to develop and run simple or complex models, and is ideal for use in the classroom, for analysis of laboratory column experiments, and for development of field-scale simulations of geochemical processes and contaminant transport.
Queues with Choice via Delay Differential Equations
NASA Astrophysics Data System (ADS)
Pender, Jamol; Rand, Richard H.; Wesson, Elizabeth
Delay or queue length information has the potential to influence the decision of a customer to join a queue. Thus, it is imperative for managers of queueing systems to understand how the information that they provide will affect the performance of the system. To this end, we construct and analyze two two-dimensional deterministic fluid models that incorporate customer choice behavior based on delayed queue length information. In the first fluid model, customers join each queue according to a Multinomial Logit Model, however, the queue length information the customer receives is delayed by a constant Δ. We show that the delay can cause oscillations or asynchronous behavior in the model based on the value of Δ. In the second model, customers receive information about the queue length through a moving average of the queue length. Although it has been shown empirically that giving patients moving average information causes oscillations and asynchronous behavior to occur in U.S. hospitals, we analytically and mathematically show for the first time that the moving average fluid model can exhibit oscillations and determine their dependence on the moving average window. Thus, our analysis provides new insight on how operators of service systems should report queue length information to customers and how delayed information can produce unwanted system dynamics.
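A small Euler-integration sketch of the first (constant-delay) fluid model is given below in Python. The specific equations, with arrivals split by a Multinomial Logit rule applied to queue lengths observed Δ time units in the past and service proportional to queue length, are assumptions modeled on the abstract's description rather than the paper's exact formulation; all parameter values are illustrative.

```python
import numpy as np

lam, mu, delta, theta = 10.0, 1.0, 2.0, 1.0    # arrival rate, service rate, info delay, MNL sensitivity
dt, horizon = 0.01, 200.0
n = int(horizon / dt)
lag = int(delta / dt)

q = np.zeros((n, 2))
q[0] = [5.0, 4.0]                               # slightly asymmetric start to excite oscillations

for t in range(1, n):
    q_seen = q[max(t - 1 - lag, 0)]             # customers react to Δ-old queue length information
    w = np.exp(-theta * q_seen)
    p = w / w.sum()                             # Multinomial Logit join probabilities
    q[t] = np.maximum(q[t - 1] + dt * (lam * p - mu * q[t - 1]), 0.0)

# When Δ exceeds a critical value, the two queues oscillate out of phase; a persistent
# peak-to-peak gap between them late in the run is a simple indicator of that regime.
print("late-time peak-to-peak of q1 - q2:", np.ptp(q[n // 2:, 0] - q[n // 2:, 1]))
```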
Terrill, Philip I; Wilson, Stephen J; Suresh, Sadasivam; Cooper, David M; Dakin, Carolyn
2013-05-01
Breathing dynamics vary between infant sleep states, and are likely to exhibit non-linear behaviour. This study applied the non-linear analytical tool recurrence quantification analysis (RQA) to 400 breath interval periods of REM and N-REM sleep, and then using an overlapping moving window. The RQA variables were different between sleep states, with REM radius 150% greater than N-REM radius, and REM laminarity 79% greater than N-REM laminarity. RQA allowed the observation of temporal variations in non-linear breathing dynamics across a night's sleep at 30s resolution, and provides a basis for quantifying changes in complex breathing dynamics with physiology and pathology. Copyright © 2013 Elsevier Ltd. All rights reserved.
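A minimal Python sketch of recurrence quantification on breath-interval windows follows; the unembedded recurrence plot, the fixed radius, and the minimum vertical-line length are illustrative conventions, not the study's exact settings.

```python
import numpy as np

def recurrence_matrix(x, radius):
    """Unembedded recurrence plot: samples i and j recur if |x_i - x_j| <= radius."""
    d = np.abs(x[:, None] - x[None, :])
    return d <= radius

def rqa_measures(x, radius, lmin=2):
    """Recurrence rate and laminarity (share of recurrence points in vertical lines >= lmin)."""
    R = recurrence_matrix(np.asarray(x, float), radius)
    rr = R.mean()
    vert_pts, laminar_pts = 0, 0
    for col in R.T:
        run = 0
        for v in np.append(col, False):
            if v:
                run += 1
            else:
                vert_pts += run
                if run >= lmin:
                    laminar_pts += run
                run = 0
    lam = laminar_pts / vert_pts if vert_pts else 0.0
    return rr, lam

def moving_rqa(breath_intervals, width=400, step=50, radius=0.15):
    """RQA recomputed over overlapping 400-breath windows across a night's recording."""
    return [rqa_measures(breath_intervals[s:s + width], radius)
            for s in range(0, len(breath_intervals) - width + 1, step)]
```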
NASA Astrophysics Data System (ADS)
Pasten, D.; Comte, D.; Vallejos, J.
2013-05-01
During the last decades several authors have shown that the spatial distribution of earthquakes follows multifractal laws, and the most interesting behavior is the decrease of the fractal dimensions before the occurrence of a large earthquake, and also before its main aftershocks. A multifractal analysis was applied to over 55920 microseismicity events recorded from January 2006 to January 2009 at Creighton mine, Canada. In order to work with a catalogue complete in magnitude, we took the data associated with the linear part of the Gutenberg-Richter law, with magnitudes greater than -1.5. A multifractal analysis was performed using the microseismic data, considering significant events to be those with magnitude MW ≥ 1.0. A moving window containing a constant number of events was used in order to guarantee precise estimation of the fractal dimensions. After different trials, we chose 200 events as the number of data points in each window. Two consecutive windows were shifted by 20 events. The complete data set was separated into six sections and the multifractal analysis was applied to each section of 9320 data points. The multifractal analysis of each section shows that there is a systematic decrease of the fractal dimension (Dq) with time before the occurrence of a rockburst or natural event with magnitude MW ≥ 1.0, as is observed in the seismic sequences of large earthquakes. This methodology was repeated for minimum magnitudes MW ≥ 1.5 and MW ≥ 2.0, obtaining the same results. The best result was obtained using MW ≥ 2.0, with a correct-detection rate varying between fifty and eighty percent. The results show the possibility of systematically using the determination of the Dq parameter to detect the next rockburst or natural event in the studied mine. This project has been financially supported by FONDECyT Grant No 3120237 (D.P).
[Recognition of walking stance phase and swing phase based on moving window].
Geng, Xiaobo; Yang, Peng; Wang, Xinran; Geng, Yanli; Han, Yu
2014-04-01
Wearing a transfemoral prosthesis is the only way for amputees to complete daily physical activity. Motion pattern recognition is important for the control of the prosthesis, especially for recognizing the swing phase and stance phase. In this paper, it is reported that the surface electromyography (sEMG) signal is used in swing and stance phase recognition. The sEMG signal of related muscles was sampled by the Infiniti system of a Canadian company. The sEMG signal was then filtered by a weighted filtering window and analyzed by a height-permitted window. The starting times of the stance phase and swing phase are determined through analyzing specific muscles. The sEMG signal of the rectus femoris was used in stance phase recognition and the sEMG signal of the tibialis anterior was used in swing phase recognition. Within a certain tolerance range, the double-window approach, comprising the weighted filtering window and the height-permitted window, can reach a high accuracy rate. Through experiments, the real walking intention of the subjects was reflected by the sEMG signals of the related muscles. Using related muscles to recognize the swing and stance phases is feasible. The approach used in this paper is useful for analyzing sEMG signals and for actual prosthesis control.
Cockpit Window Edge Proximity Effects on Judgements of Horizon Vertical Displacement
NASA Technical Reports Server (NTRS)
Haines, R. F.
1984-01-01
To quantify the influence of a spatially fixed edge on vertical displacement threshold, twenty-four males (12 pilots, 12 non-pilots) were presented a series of forced-choice, paired-comparison trials in which a 32 deg arc wide, thin, luminous horizontal stimulus line moved smoothly downward through five angles from a common starting position within a three-second period. The five angles were 1.4, 1.7, 2, 2.3, and 2.6 deg. Each angle was presented paired with itself and the other four angles in all combinations in random order. For each pair of trials the observer had to choose which trial possessed the larger displacement. A confidence response also was made. The independent variable was the angular separation between the lower edge of a stable 'window' aperture, through which the stimulus was seen to move, and the lowest position attained by the stimulus. It was found that vertical displacement accuracy is inversely related to the angle separating the stimulus and the fixed window edge (p = .05). In addition, there is a strong tendency for pilot confidence to be lower than that of non-pilots for each of the three angular separations. These results are discussed in terms of selected cockpit features and as they relate to how pilots judge changes in aircraft pitch attitude.
The short time Fourier transform and local signals
NASA Astrophysics Data System (ADS)
Okumura, Shuhei
In this thesis, I examine the theoretical properties of the short time discrete Fourier transform (STFT). The STFT is obtained by applying the Fourier transform over a fixed-size window that moves along the input series. We move the window by one time point at a time, so the windows overlap. I present several theoretical properties of the STFT, applied to various types of complex-valued, univariate time series inputs, and their outputs in closed form. In particular, just like the discrete Fourier transform, the STFT's modulus time series takes large positive values when the input is a periodic signal. One main point is that a white noise input results in an STFT output that is a complex-valued stationary time series, and we can derive its time and time-frequency dependency structure, such as the cross-covariance functions. Our primary focus is the detection of local periodic signals. I present a method to detect local signals by computing the probability that the squared-modulus STFT time series has a run of consecutive values exceeding some threshold, beginning with an exceedance that immediately follows an observation below the threshold. We discuss a method to reduce the computation of such probabilities by means of the Box-Cox transformation and the delta method, and show that it works well in comparison with the Monte Carlo simulation method.
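A minimal Python sketch of the maximally overlapping STFT and a simple run-length threshold rule on its squared modulus is given below; the window width, frequency bin, percentile threshold, and run length are illustrative assumptions, not the thesis's probability calculation.

```python
import numpy as np

def stft(x, width=64):
    """Fixed-width window moved one sample at a time (maximally overlapping)."""
    frames = np.lib.stride_tricks.sliding_window_view(x, width)
    return np.fft.fft(frames, axis=1)            # one spectrum per window position

# A local periodic signal buried in white noise
n = 2000
x = np.random.randn(n)
x[800:1000] += 2.0 * np.sin(2 * np.pi * 0.1 * np.arange(200))

power = np.abs(stft(x)) ** 2
band = power[:, 6]                               # bin near the 0.1 cycles/sample component (64 * 0.1 ≈ 6)
thr = np.percentile(band, 99)
runs = np.convolve((band > thr).astype(int), np.ones(5, int), "same") >= 5
print("window positions flagged as containing a local signal:", np.flatnonzero(runs).size)
```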
NASA Astrophysics Data System (ADS)
Schuchardt, Patrick; Unger, Miriam; Siesler, Heinz W.
2018-01-01
In the present communication the potential of 2DCOS analysis and the spin-off technique perturbation-correlation moving window 2D (PCMW2D) analysis is illustrated with reference to spectroscopic changes observed in a data set recorded by in-line fiber-coupled FT-IR spectroscopy in the attenuated total reflection (ATR) mode during a polyurethane solution polymerization at different temperatures. In view of the chemical functionalities involved, hydrogen bonding plays an important role in this polymerization reaction. Based on the 2DCOS and PCMW2D analysis, the sequence of hydrogen bonding changes accompanying the progress of polymerization and precipitation of solid polymer can be determined. Complementary to the kinetic data derived from the original variable-temperature spectra in a previous publication the results provide a more detailed picture of the investigated solution polymerization.
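A minimal Python sketch of the perturbation-correlation moving-window idea is shown below: within each window along the perturbation axis, the correlation between the intensity at every wavenumber and the perturbation variable is computed. The window of 2m+1 spectra and the Pearson-type correlation are assumed conventions, not necessarily the exact PCMW2D formulation used in the paper.

```python
import numpy as np

def pcmw2d(spectra, perturbation, half_width=2):
    """spectra: (n_perturbation, n_wavenumber) array; perturbation: length n_perturbation.
    Returns a correlation map of the same shape as spectra (window edges left as NaN)."""
    n, _ = spectra.shape
    corr = np.full_like(spectra, np.nan, dtype=float)
    for j in range(half_width, n - half_width):
        win = slice(j - half_width, j + half_width + 1)
        y = spectra[win] - spectra[win].mean(axis=0)       # dynamic spectra within the window
        p = perturbation[win] - perturbation[win].mean()
        corr[j] = (p @ y) / (np.sqrt((p**2).sum()) * np.sqrt((y**2).sum(axis=0)) + 1e-12)
    return corr
```

Positive and negative regions of the resulting map indicate, band by band, whether intensity rises or falls with the perturbation (here, reaction progress or temperature) within each window.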
Whitford, Veronica; O'Driscoll, Gillian A; Pack, Christopher C; Joober, Ridha; Malla, Ashok; Titone, Debra
2013-02-01
Language and oculomotor disturbances are 2 of the best replicated findings in schizophrenia. However, few studies have examined skilled reading in schizophrenia (e.g., Arnott, Sali, Copland, 2011; Hayes & O'Grady, 2003; Revheim et al., 2006; E. O. Roberts et al., 2012), and none have examined the contribution of cognitive and motor processes that underlie reading performance. Thus, to evaluate the relationship of linguistic processes and oculomotor control to skilled reading in schizophrenia, 20 individuals with schizophrenia and 16 demographically matched controls were tested using a moving window paradigm (McConkie & Rayner, 1975). Linguistic skills supporting reading (phonological awareness) were assessed with the Comprehensive Test of Phonological Processing (R. K. Wagner, Torgesen, & Rashotte, 1999). Eye movements were assessed during reading tasks and during nonlinguistic tasks tapping basic oculomotor control (prosaccades, smooth pursuit) and executive functions (predictive saccades, antisaccades). Compared with controls, schizophrenia patients exhibited robust oculomotor markers of reading difficulty (e.g., reduced forward saccade amplitude) and were less affected by reductions in window size, indicative of reduced perceptual span. Reduced perceptual span in schizophrenia was associated with deficits in phonological processing and reduced saccade amplitudes. Executive functioning (antisaccade errors) was not related to perceptual span but was related to reading comprehension. These findings suggest that deficits in language, oculomotor control, and cognitive control contribute to skilled reading deficits in schizophrenia. Given that both language and oculomotor dysfunction precede illness onset, reading may provide a sensitive window onto cognitive dysfunction in schizophrenia vulnerability and be an important target for cognitive remediation. 2013 APA, all rights reserved
A complete passive blind image copy-move forensics scheme based on compound statistics features.
Peng, Fei; Nie, Yun-ying; Long, Min
2011-10-10
Since most sensor pattern noise based image copy-move forensics methods require a known reference sensor pattern noise, it generally results in non-blinded passive forensics, which significantly confines the application circumstances. In view of this, a novel passive-blind image copy-move forensics scheme is proposed in this paper. Firstly, a color image is transformed into a grayscale one, and wavelet transform based de-noising filter is used to extract the sensor pattern noise, then the variance of the pattern noise, the signal noise ratio between the de-noised image and the pattern noise, the information entropy and the average energy gradient of the original grayscale image are chosen as features, non-overlapping sliding window operations are done to the images to divide them into different sub-blocks. Finally, the tampered areas are detected by analyzing the correlation of the features between the sub-blocks and the whole image. Experimental results and analysis show that the proposed scheme is completely passive-blind, has a good detection rate, and is robust against JPEG compression, noise, rotation, scaling and blurring. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
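A sketch of the block-feature stage is given below in Python, under several stated assumptions: the grayscale image is scaled to [0, 1], a Gaussian filter stands in for the paper's wavelet de-noising filter, the feature formulas are plausible stand-ins, and the correlation-based decision rule is a placeholder rather than the paper's detector.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def features(img, noise):
    """Per-region features: pattern-noise variance, SNR, entropy, average energy gradient."""
    var_noise = noise.var()
    snr = 10 * np.log10(img.var() / (noise.var() + 1e-12))
    hist, _ = np.histogram(img, bins=64, range=(0, 1), density=True)
    p = hist[hist > 0] / hist[hist > 0].sum()
    entropy = -(p * np.log2(p)).sum()
    gy, gx = np.gradient(img)
    aeg = np.mean(gx**2 + gy**2)
    return np.array([var_noise, snr, entropy, aeg])

def block_scores(gray, block=64):
    """Compare each non-overlapping block's feature vector with the whole-image features."""
    denoised = gaussian_filter(gray, sigma=2)              # stand-in for wavelet de-noising
    noise = gray - denoised                                # extracted sensor pattern noise
    ref = features(gray, noise)
    scores = {}
    for i in range(0, gray.shape[0] - block + 1, block):
        for j in range(0, gray.shape[1] - block + 1, block):
            f = features(gray[i:i+block, j:j+block], noise[i:i+block, j:j+block])
            scores[(i, j)] = np.corrcoef(f, ref)[0, 1]     # low correlation -> candidate tampered block
    return scores
```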
Remote analysis of planetary soils: X-ray diffractometer development
NASA Technical Reports Server (NTRS)
Gregory, J. C.
1973-01-01
A system is described suitable for remote low power mineralogical analysis of lunar, planetary, or asteroid soils. It includes an X-ray diffractometer, fluorescence spectrometer, and sample preparation system. A one Curie Fe-55 source provides a monochromatic X-ray beam of 5.9 keV. Seeman-Bohlin or focusing geometry is employed in the camera, allowing peak detection to proceed simultaneously at all angles and obviating the need for moving parts. The detector system is an array of 500-600 proportional counters with a wire-spacing of 1 mm. An electronics unit comprising preamplifier, postamplifier, window discriminators, and storage flipflops requiring only 3.5 milliwatts was designed and tested. Total instrument power is less than 5 watts. Powder diffraction patterns using a flat breadboard multiwire counter were recorded.
Model Identification of Integrated ARMA Processes
ERIC Educational Resources Information Center
Stadnytska, Tetiana; Braun, Simone; Werner, Joachim
2008-01-01
This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…
Human-Robot Interface Controller Usability for Mission Planning on the Move
2012-11-01
[Report excerpt; only list-of-figures fragments survive: Figure 3, Microsoft Xbox 360 controller for Windows; Figure 5, Microsoft Trackball Explorer. Section 3.2.1 notes that the HMMWV was equipped with a diesel engine.]
Transparent Conveyor of Dielectric Liquids or Particles
NASA Technical Reports Server (NTRS)
Calle, Carlos I.; Mantovani, James G.
2009-01-01
The concept of a transparent conveyor of small loose dielectric particles or small amounts of dielectric liquids has emerged as an outgrowth of an effort to develop efficient, reliable means of automated removal of dust from solar cells and from windows of optical instruments. This concept is based on the previously reported concept of an electrodynamic screen, according to which a grid-like electric field is established on and near a surface and is moved along the surface perpendicularly to the grid lines. The resulting electrodynamic forces on loose dielectric particles or dielectric liquid drops in the vicinity would move the particles or drops along the surface. In the original dust-removal application, dust particles would thus be swept out of the affected window area. Other potential applications may occur in nanotechnology -- for example, involving mixing of two or more fluids and/or nanoscale particles under optical illumination and/or optical observation.
An accelerating precursor to predict "time-to-failure" in creep and volcanic eruptions
NASA Astrophysics Data System (ADS)
Hao, Shengwang; Yang, Hang; Elsworth, Derek
2017-09-01
Real-time prediction by monitoring the evolution of response variables is a central goal in predicting rock failure. A linear relation Ω̇ Ω̈⁻¹ = C(t_f − t) has been developed to describe the time to failure, where Ω represents a response quantity, Ω̇ and Ω̈ are its first and second time derivatives, C is a constant, and t_f represents the failure time. Observations from laboratory creep failure experiments and precursors to volcanic eruptions are used to test the validity of the approach. Both cumulative and simple moving window techniques are developed to perform predictions and to illustrate the effects of data selection on the results. Laboratory creep failure experiments on granites show that the linear relation works well during the final approach to failure. For blind prediction, the simple moving window technique is preferred because it always uses the most recent data and excludes the effects of early data that deviate significantly from the predicted trend. When the predicted results show only small fluctuations, failure is imminent.
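The relation above can be fit and extrapolated numerically; the Python sketch below differentiates a monitored quantity, fits Ω̇/Ω̈ against time over the most recent samples (a simple moving window), and reads off the predicted failure time as the zero crossing of the fitted line. The window length and the synthetic power-law test signal are illustrative assumptions.

```python
import numpy as np

def predict_tf(t, omega, width=50):
    """Fit  dΩ/dt / d²Ω/dt²  =  C (t_f - t)  over the latest `width` samples; return t_f."""
    d1 = np.gradient(omega, t)
    d2 = np.gradient(d1, t)
    tt, ratio = t[-width:], (d1 / d2)[-width:]
    slope, intercept = np.polyfit(tt, ratio, 1)   # ratio = -C t + C t_f
    return -intercept / slope                     # time at which the fitted ratio reaches zero

# Usage on a synthetic accelerating signal Ω ∝ (t_f - t)^(-m), which satisfies the relation exactly.
tf_true, m = 100.0, 1.0
t = np.linspace(0, 95, 500)
omega = (tf_true - t) ** (-m)
print("predicted t_f:", predict_tf(t, omega))
```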
NASA Astrophysics Data System (ADS)
Elias, E.; Rango, A.; James, D.; Maxwell, C.; Anderson, J.; Abatzoglou, J. T.
2016-12-01
Researchers evaluating climate projections across southwestern North America observed a decreasing precipitation trend. Aridification was most pronounced in the cold (non-monsoonal) season, whereas downward trends in precipitation were smaller in the warm (monsoonal) season. In this region, based upon a multimodel mean of 20 Coupled Model Intercomparison Project 5 models using a business-as-usual (Representative Concentration Pathway 8.5) trajectory, midcentury precipitation is projected to increase slightly during the monsoonal time period (July-September; 6%) and decrease slightly during the remainder of the year (October-June; -4%). We use observed long-term (1915-2015) monthly precipitation records from 16 weather stations to investigate how well measured trends corroborate climate model predictions during the monsoonal and non-monsoonal timeframe. Running trend analysis using the Mann-Kendall test for 15 to 101 year moving windows reveals that half the stations showed significant (p≤0.1), albeit small, increasing trends based on the longest term record. Trends based on shorter-term records reveal a period of significant precipitation decline at all stations representing the 1950s drought. Trends from 1930 to 2015 reveal significant annual, monsoonal and non-monsoonal increases in precipitation (Fig 1). The 1960 to 2015 time window shows no significant precipitation trends. The more recent time window (1980 to 2015) shows a slight, but not significant, increase in monsoonal precipitation and a larger, significant decline in non-monsoonal precipitation. GCM precipitation projections are consistent with more recent trends for the region. Running trends from the most recent time window (mid-1990s to 2015) at all stations show increasing monsoonal precipitation and decreasing Oct-Jun precipitation, with significant trends at 6 of 16 stations. Running trend analysis revealed that the long-term trends were not persistent throughout the series length, but depended on the period examined. Recent trends in Southwest precipitation are directionally consistent with anthropogenic climate change.
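The running trend analysis can be illustrated with a short Python implementation of the Mann-Kendall test (normal approximation, no tie correction) applied to every moving window of a chosen length; the placeholder precipitation series is an assumption, while the window idea and the p ≤ 0.1 significance level follow the abstract.

```python
import numpy as np
from math import erf, sqrt

def mann_kendall(x):
    """Return the MK Z statistic and two-sided p-value (normal approximation, ties ignored)."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / sqrt(var) if s != 0 else 0.0
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p

def running_trends(annual_precip, window):
    """MK test on every `window`-year segment of an annual series."""
    return [(start, *mann_kendall(annual_precip[start:start + window]))
            for start in range(0, len(annual_precip) - window + 1)]

# e.g. flag significant (p <= 0.1) 30-year windows within a 101-year record
precip = np.random.gamma(4.0, 60.0, size=101)   # placeholder annual totals (mm)
flags = [(s, z) for s, z, p in running_trends(precip, 30) if p <= 0.1]
```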
Multifractal analysis of 2001 Mw 7.7 Bhuj earthquake sequence in Gujarat, Western India
NASA Astrophysics Data System (ADS)
Aggarwal, Sandeep Kumar; Pastén, Denisse; Khan, Prosanta Kumar
2017-12-01
The 2001 Mw 7.7 Bhuj mainshock seismic sequence in the Kachchh area, occurring from 2001 to 2012, has been analyzed using mono-fractal and multi-fractal dimension spectrum analysis techniques. This region was characterized by frequent moderate shocks of Mw ≥ 5.0 for more than a decade after the occurrence of the 2001 Bhuj earthquake. The present study is therefore important for precursory analysis using this sequence. The selected long sequence has been investigated for the first time, with a completeness magnitude Mc 3.0 obtained using the maximum curvature method. Multi-fractal Dq spectrum (Dq ∼ q) analysis was carried out using an effective window length of 200 earthquakes with a moving window of 20 events, overlapped by 180 events. The robustness of the analysis has been tested by adding a magnitude completeness correction term of 0.2 to Mc 3.0, giving Mc 3.2, and we have tested the error in the calculation of Dq for each magnitude threshold. On the other hand, the stability of the analysis has been investigated down to the minimum magnitude of Mw ≥ 2.6 in the sequence. The analysis shows that the multi-fractal dimension spectrum Dq decreases with increasing clustering of events with time before a moderate magnitude earthquake in the sequence, which alternatively accounts for non-randomness in the spatial distribution of epicenters and its self-organized criticality. Similar behavior is ubiquitous elsewhere around the globe, and warns of the proximity of a damaging seismic event in an area.
Gap-filling methods to impute eddy covariance flux data by preserving variance.
NASA Astrophysics Data System (ADS)
Kunwor, S.; Staudhammer, C. L.; Starr, G.; Loescher, H. W.
2015-12-01
To represent carbon dynamics, in terms of the exchange of CO2 between the terrestrial ecosystem and the atmosphere, eddy covariance (EC) data have been collected using eddy flux towers at various sites across the globe for more than two decades. However, measurements from EC data are missing for various reasons: precipitation, routine maintenance, or lack of vertical turbulence. In order to have estimates of net ecosystem exchange of carbon dioxide (NEE) with high precision and accuracy, robust gap-filling methods to impute missing data are required. While the methods used so far have provided robust estimates of the mean value of NEE, little attention has been paid to preserving the variance structures embodied in the flux data. Preserving the variance of these data will provide unbiased and precise estimates of NEE over time, which mimic natural fluctuations. We used a non-linear regression approach with moving windows of different lengths (15, 30, and 60 days) to estimate non-linear regression parameters for one year of flux data from a longleaf pine site at the Joseph Jones Ecological Research Center. We used as our base the Michaelis-Menten and Van't Hoff functions. We assessed the potential physiological drivers of these parameters with linear models using micrometeorological predictors. We then used a parameter prediction approach to refine the non-linear gap-filling equations based on micrometeorological conditions. This provides us an opportunity to incorporate additional variables, such as vapor pressure deficit (VPD) and volumetric water content (VWC), into the equations. Our preliminary results indicate that improvements in gap-filling can be gained with a 30-day moving window with additional micrometeorological predictors (as indicated by lower root mean square error (RMSE) of the predicted values of NEE). Our next steps are to use these parameter predictions from moving windows to gap-fill the data with and without incorporation of potential driver variables of the parameters traditionally used. Then, comparisons of the predicted values from these methods and 'traditional' gap-filling methods (using 12 fixed monthly windows) will be assessed to show the extent to which variance is preserved. Further, this method will be applied to impute artificially created gaps to analyze whether variance is preserved.
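The moving-window parameter estimation can be sketched with scipy's curve_fit, as below in Python; the light-response form (respiration term omitted), the starting values, the 50-observation minimum, and the variable names are assumptions for illustration, not the study's exact model.

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(par, alpha, beta):
    """Daytime NEE light response vs PAR (respiration term omitted for brevity)."""
    return -(alpha * beta * par) / (alpha * par + beta)

def window_parameters(doy, par, nee, width=30, step=30):
    """Fit (alpha, beta) within each `width`-day window; return window centre and parameters."""
    out = []
    for start in np.arange(doy.min(), doy.max() - width + 1, step):
        sel = (doy >= start) & (doy < start + width) & ~np.isnan(nee)
        if sel.sum() < 50:                        # skip windows that are mostly gaps
            continue
        p, _ = curve_fit(michaelis_menten, par[sel], nee[sel], p0=(0.05, 20.0), maxfev=5000)
        out.append((start + width / 2, *p))
    return np.array(out)
```

The fitted (alpha, beta) series can then be regressed on micrometeorological drivers such as VPD and VWC, which is the parameter-prediction step the abstract describes.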
Gunn, W J; Shigehisa, T; Shepherd, W T
1979-10-01
The conditions were examined under which more valid and reliable estimates could be made of the effects of aircraft noise on people. In Exper. I, 12 Ss in 2 different houses directly under the flight path of a major airport (JFK) indicated 1 of 12 possible flight paths (4 directly overhead and 8 to one side) for each of 3 jet aircraft flyovers: 3% of cases in House A and 56% in House B (which had open windows) were correctly identified. Despite judgment inaccuracy, Ss were more than moderately certain of the correctness of their judgments. In Exper. II, Ss either inside or outside of 2 houses in Wallops Station, Virginia, indicated on diagrams the direction of flyovers. Each of 4 aircraft (Boeing 737, C-54, UE-1 helicopter, Queenaire) made 8 flyovers directly over the houses and 8 to one side. Windows were either open or closed. All flyovers and conditions were counterbalanced. All sound sources under all conditions were usually judged to be overhead and moving, but for Ss indoors with windows closed the to-the-side flyovers were judged to be off to the side in 24% of cases. Outdoor Ss reported correct direction in 75% of cases while indoor Ss were correct in only 25% (windows open) or 18% (windows closed). Judgments "to the side" were significantly better (p < .02) with windows open vs closed, while with windows closed judgments were significantly better (p < .05) for flyovers overhead vs to the side. In Exper. III, Ss localized in azimuth and in the vertical plane recorded noises (10 1-oct noise bands of CF = 28.12 c/s to 14.4 kc/s, spoken voice, and jet aircraft takeoffs and landings), presented through 1, 2, or 4 floor-level loudspeakers at each corner of a simulated living room (4.2 x 5.4 m) built inside an IAC soundproof room. Aircraft noises presented by 4 loudspeakers were localized as "directly" overhead 80% of the time and "generally overhead" about 90% of the time; other sounds were so localized about 50% and 75% of the time respectively. Through only 2 loudspeakers, aircraft noises were localized 25-36 degrees above horizontal. Implications are that acoustic realism can be achieved in simulated aircraft overflights and that future laboratory noise-effects research should incorporate the required conditions.
ECG artifact cancellation in surface EMG signals by fractional order calculus application.
Miljković, Nadica; Popović, Nenad; Djordjević, Olivera; Konstantinović, Ljubica; Šekara, Tomislav B
2017-03-01
New aspects for automatic electrocardiography artifact removal from surface electromyography signals by application of fractional order calculus in combination with linear and nonlinear moving window filters are explored. Surface electromyography recordings of skeletal trunk muscles are commonly contaminated with spike shaped artifacts. This artifact originates from electrical heart activity, recorded by electrocardiography, commonly present in the surface electromyography signals recorded in heart proximity. For appropriate assessment of neuromuscular changes by means of surface electromyography, application of a proper filtering technique of electrocardiography artifact is crucial. A novel method for automatic artifact cancellation in surface electromyography signals by applying fractional order calculus and nonlinear median filter is introduced. The proposed method is compared with the linear moving average filter, with and without prior application of fractional order calculus. 3D graphs for assessment of window lengths of the filters, crest factors, root mean square differences, and fractional calculus orders (called WFC and WRC graphs) have been introduced. For an appropriate quantitative filtering evaluation, the synthetic electrocardiography signal and analogous semi-synthetic dataset have been generated. The examples of noise removal in 10 able-bodied subjects and in one patient with muscle dystrophy are presented for qualitative analysis. The crest factors, correlation coefficients, and root mean square differences of the recorded and semi-synthetic electromyography datasets showed that the most successful method was the median filter in combination with fractional order calculus of the order 0.9. Statistically more significant (p < 0.001) ECG peak reduction was obtained by the median filter application compared to the moving average filter in the cases of low level amplitude of muscle contraction compared to ECG spikes. The presented results suggest that the novel method combining a median filter and fractional order calculus can be used for automatic filtering of electrocardiography artifacts in the surface electromyography signal envelopes recorded in trunk muscles. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
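A minimal Python sketch of one plausible arrangement of the two ingredients named above, a Grünwald-Letnikov fractional derivative of order 0.9 followed by a moving median filter, is given below. The pipeline ordering, the memory length, and the kernel size are illustrative assumptions; only the fractional order and the use of a median filter follow the abstract's best-performing case.

```python
import numpy as np
from scipy.signal import medfilt

def gl_fractional_diff(x, alpha=0.9, memory=200):
    """Grünwald-Letnikov fractional derivative via its binomial-weight recursion."""
    w = np.zeros(memory)
    w[0] = 1.0
    for k in range(1, memory):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    return np.convolve(x, w, mode="full")[:len(x)]        # causal filtering

def remove_ecg_artifact(semg, kernel=301, alpha=0.9):
    """Fractional differentiation accentuates the ECG spikes; the median window then rejects them."""
    d = gl_fractional_diff(semg, alpha)
    return medfilt(np.abs(d), kernel_size=kernel)
```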
Stuetz, R M
2004-01-01
An online monitoring system based on an array of non-specific sensors was used for the detection of chemical pollutants in wastewater and water. By superimposing sensor profiles for a defined sampling window, the identification of data points outside these normal sensor response patterns was used to represent potential pollution episodes or other abnormalities within the process stream. Principal component analysis supported the detection of outliers or rapid changes in the sensor responses as an indicator of chemical pollutants. A model based on the comparison of sensor relative responses to a moving average for a defined sample window was tested for detecting and identifying sudden changes in the online data over a 6-month period. These results show the technical advantages of using a non-specific monitoring system that can respond to a range of chemical species, owing to the broad selectivity of the sensor compositions. The findings demonstrate how this non-invasive technique could be further developed to provide early warning systems for application at the inlet of wastewater treatment plants.
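A minimal Python sketch of the comparison of relative sensor responses with a moving average over a defined sample window is shown below; the window length and deviation threshold are placeholders, not values from the study.

```python
import numpy as np

def detect_pollution_events(responses, window=60, threshold=0.2):
    """responses: (n_samples, n_sensors). Flag samples whose relative response pattern
    departs from the moving-average baseline by more than `threshold` on any sensor."""
    rel = responses / responses.sum(axis=1, keepdims=True)       # relative sensor responses
    kernel = np.ones(window) / window
    baseline = np.apply_along_axis(lambda c: np.convolve(c, kernel, "same"), 0, rel)
    return np.abs(rel - baseline).max(axis=1) > threshold
```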
Microwave window breakdown experiments and simulations on the UM/L-3 relativistic magnetron
NASA Astrophysics Data System (ADS)
Hoff, B. W.; Mardahl, P. J.; Gilgenbach, R. M.; Haworth, M. D.; French, D. M.; Lau, Y. Y.; Franzi, M.
2009-09-01
Experiments have been performed on the UM/L-3 (6-vane, L-band) relativistic magnetron to test a new microwave window configuration designed to limit vacuum side breakdown. In the baseline case, acrylic microwave windows were mounted between three of the waveguide coupling cavities in the anode block vacuum housing and the output waveguides. Each of the six 3 cm deep coupling cavities is separated from its corresponding anode cavity by a 1.75 cm wide aperture. In the baseline case, vacuum side window breakdown was observed to initiate at single waveguide output powers close to 20 MW. In the new window configuration, three Air Force Research Laboratory-designed, vacuum-rated directional coupler waveguide segments were mounted between the coupling cavities and the microwave windows. The inclusion of the vacuum side power couplers moved the microwave windows an additional 30 cm away from the anode apertures. Additionally, the Lucite microwave windows were replaced with polycarbonate windows and the microwave window mounts were redesigned to better maintain waveguide continuity in the region around the microwave windows. No vacuum side window breakdown was observed in the new window configuration at single waveguide output powers of 120+MW (a factor of 3 increase in measured microwave pulse duration and factor of 3 increase in measured peak power over the baseline case). Simulations were performed to investigate likely causes for the window breakdown in the original configuration. Results from these simulations have shown that in the original configuration, at typical operating voltage and magnetic field ranges, electrons emitted from the anode block microwave apertures strike the windows with a mean kinetic energy of 33 keV with a standard deviation of 14 keV. Calculations performed using electron impact angle and energy data predict a first generation secondary electron yield of 65% of the primary electron population. The effects of the primary aperture electron impacts, combined with multiplication of the secondary populations, were determined to be the likely causes of the poor microwave window performance in the original configuration.
Non-invasive detection of language-related prefrontal high gamma band activity with beamforming MEG.
Hashimoto, Hiroaki; Hasegawa, Yuka; Araki, Toshihiko; Sugata, Hisato; Yanagisawa, Takufumi; Yorifuji, Shiro; Hirata, Masayuki
2017-10-27
High gamma band (>50 Hz) activity is a key oscillatory phenomenon of brain activation. However, there has not been a non-invasive method established to detect language-related high gamma band activity. We used a 160-channel whole-head magnetoencephalography (MEG) system equipped with superconducting quantum interference device (SQUID) gradiometers to non-invasively investigate neuromagnetic activities during silent reading and verb generation tasks in 15 healthy participants. Individual data were divided into alpha (8-13 Hz), beta (13-25 Hz), low gamma (25-50 Hz), and high gamma (50-100 Hz) bands and analysed with the beamformer method. The time window was consecutively moved. Group analysis was performed to delineate common areas of brain activation. In the verb generation task, transient power increases in the high gamma band appeared in the left middle frontal gyrus (MFG) at the 550-750 ms post-stimulus window. We set a virtual sensor on the left MFG for time-frequency analysis, and high gamma event-related synchronization (ERS) induced by a verb generation task was demonstrated at 650 ms. In contrast, ERS in the high gamma band was not detected in the silent reading task. Thus, our study successfully non-invasively measured language-related prefrontal high gamma band activity.
Handling and analysis of ices in cryostats and glove boxes in view of cometary samples
NASA Technical Reports Server (NTRS)
Roessler, K.; Eich, G.; Heyl, M.; Kochan, H.; Oehler, A.; Patnaik, A.; Schlosser, W.; Schulz, R.
1989-01-01
Comet nucleus sample return missions and other return missions from planets and satellites need equipment for handling and analysis of icy samples at low temperatures under vacuum or protective gas. Two methods are reported which were developed for the analysis of small icy samples and which are modified for larger samples in cometary matter simulation experiments (KOSI). A conventional optical cryostat system was modified to allow for transport of samples at 5 K, ion beam irradiation, and measurement in an off-line optical spectrophotometer. The new system consists of a removable window plug containing nozzles for condensation of water and volatiles onto a cold finger. This plug can be removed in a vacuum system and exchanged for another plug (e.g., with other windows (IR, VIS, VUV) or other nozzles). While open, the samples can be treated under vacuum with cooling by manipulators (cutting, removal, sample taking, irradiation with light, photons, or ions). After bringing the plug back, the samples can be moved to another site of analysis. For handling the 30 cm diameter mineral-ice samples from the KOSI experiments, an 80x80x80 cm glove box made of plexiglass was used. The samples were kept in a liquid nitrogen bath, which was filled from the outside. A stream of dry N2 and evaporating gas from the bath purged the glove box of impurity gases and, in particular, H2O, which would otherwise condense onto the samples.
Nonlinear filtering properties of detrended fluctuation analysis
NASA Astrophysics Data System (ADS)
Kiyono, Ken; Tsujimoto, Yutaka
2016-11-01
Detrended fluctuation analysis (DFA) has been widely used for quantifying long-range correlation and fractal scaling behavior. In DFA, to avoid spurious detection of scaling behavior caused by a nonstationary trend embedded in the analyzed time series, a detrending procedure using piecewise least-squares fitting has been applied. However, it has been pointed out that the nonlinear filtering properties involved with detrending may induce instabilities in the scaling exponent estimation. To understand this issue, we investigate the adverse effects of the DFA detrending procedure on the statistical estimation. We show that the detrending procedure using piecewise least-squares fitting results in the nonuniformly weighted estimation of the root-mean-square deviation and that this property could induce an increase in the estimation error. In addition, for comparison purposes, we investigate the performance of a centered detrending moving average analysis with a linear detrending filter and sliding window DFA and show that these methods have better performance than the standard DFA.
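For reference, the conventional DFA procedure the paper analyzes (integrated profile, non-overlapping windows, piecewise linear detrending, RMS fluctuation versus scale) can be sketched in a few lines of Python; the scale range and the white-noise test signal are illustrative, and the centered moving-average variant discussed in the paper is not shown.

```python
import numpy as np

def dfa(x, scales):
    """Return the fluctuation function F(n) for each window size n (first-order detrending)."""
    y = np.cumsum(x - np.mean(x))                # integrated profile
    fluct = []
    for n in scales:
        m = len(y) // n
        segs = y[:m * n].reshape(m, n)
        t = np.arange(n)
        rms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)         # local least-squares linear trend
            rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(rms)))
    return np.array(fluct)

# The scaling exponent is the slope of log F(n) vs log n; roughly 0.5 is expected for white noise.
x = np.random.randn(10000)
scales = np.unique(np.logspace(1, 3, 20).astype(int))
alpha = np.polyfit(np.log(scales), np.log(dfa(x, scales)), 1)[0]
print("estimated scaling exponent:", round(alpha, 2))
```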
Utility of correlation techniques in gravity and magnetic interpretation
NASA Technical Reports Server (NTRS)
Chandler, V. W.; Koski, J. S.; Braice, L. W.; Hinze, W. J.
1977-01-01
Internal correspondence uses Poisson's Theorem in a moving-window linear regression analysis between the anomalous first vertical derivative of gravity and total magnetic field reduced to the pole. The regression parameters provide critical information on source characteristics. The correlation coefficient indicates the strength of the relation between magnetics and gravity. Slope value gives delta j/delta sigma estimates of the anomalous source. The intercept furnishes information on anomaly interference. Cluster analysis consists of the classification of subsets of data into groups of similarity based on correlation of selected characteristics of the anomalies. Model studies are used to illustrate implementation and interpretation procedures of these methods, particularly internal correspondence. Analysis of the results of applying these methods to data from the midcontinent and a transcontinental profile shows they can be useful in identifying crustal provinces, providing information on horizontal and vertical variations of physical properties over province size zones, validating long wavelength anomalies, and isolating geomagnetic field removal problems.
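The internal correspondence calculation can be illustrated with a moving-window regression between two co-registered grids, as in the Python sketch below; the window size, the handling of edges, and the grid layout are assumptions for illustration.

```python
import numpy as np

def internal_correspondence(grav_vd, mag_rtp, win=11):
    """Windowed regression mag_rtp = slope * grav_vd + intercept, plus the correlation coefficient.
    grav_vd: first vertical derivative of gravity; mag_rtp: magnetics reduced to the pole."""
    rows, cols = grav_vd.shape
    half = win // 2
    out = np.full((rows, cols, 3), np.nan)        # slope, intercept, correlation coefficient
    for i in range(half, rows - half):
        for j in range(half, cols - half):
            g = grav_vd[i - half:i + half + 1, j - half:j + half + 1].ravel()
            m = mag_rtp[i - half:i + half + 1, j - half:j + half + 1].ravel()
            slope, intercept = np.polyfit(g, m, 1)
            r = np.corrcoef(g, m)[0, 1]
            out[i, j] = slope, intercept, r
    return out
```

In the interpretation described above, the correlation map indicates where gravity and magnetics share a common source, the slope map approximates the magnetization-to-density ratio, and the intercept map flags anomaly interference.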
NASA Technical Reports Server (NTRS)
Beutter, B. R.; Mulligan, J. B.; Stone, L. S.; Statler, Irving C. (Technical Monitor)
1994-01-01
Mulligan showed that the perceived direction of a moving grating can be biased by the shape of the Gaussian window in which it is viewed. We sought to determine if a 2-D pattern with an unambiguous velocity would also show such biases. Observers viewed a drifting plaid (sum of two orthogonal 2.5 c/d sinusoidal gratings of 12% contrast, each with a TF of 4 Hz) whose contrast was modulated spatially by a stationary, asymmetric 2-D Gaussian window (i.e., unequal standard deviations in the principal directions). The direction of plaid motion with respect to the orientation of the window's major axis (Delta Theta) was varied while all other motion parameters were held fixed. Observers reported the perceived direction of plaid motion by adjusting the orientation of a pointer. All five observers showed systematic biases in perceived plaid direction that depended on Delta Theta and the aspect ratio of the Gaussian window (lambda). For circular Gaussian windows (lambda = 1), plaid direction was perceived veridically. However, biases of up to 10 deg were found for lambda = 2 and Delta Theta = 30 deg. These data present a challenge to models of motion perception which do not explicitly consider the integration of information across the visual field.
NASA Astrophysics Data System (ADS)
Johnson, R. M.; Herrold, A.; Holzer, M. A.; Passow, M. J.
2010-12-01
The geoscience research and education community is interested in developing scalable and effective user-friendly strategies for reaching the public, students and educators with information about the Earth and space sciences. Based on experience developed over the past decade with education and outreach programs seeking to reach these populations, there is a growing consensus that this will be best achieved through collaboration, leveraging the resources and networks already in existence. While it is clear that gifted researchers and developers can create wonderful online educational resources, many programs have been stymied by the difficulty of attracting an audience to these resources. The National Earth Science Teachers Association (NESTA) has undertaken an exciting new project, with support from the William and Flora Hewlett Foundation, that provides a new platform for the geoscience education and research community to share their research, resources, programs, products and services with a wider audience. In April 2010, the Windows to the Universe project (http://windows2universe.org) moved from the University Corporation for Atmospheric Research to NESTA. Windows to the Universe, which started in 1995 at the University of Michigan, is one of the most popular Earth and space science education websites globally, with over 16 million visits annually. The objective of this move is to develop a suite of new opportunities and capabilities on the website that will allow it become a sustainable education and outreach platform for the geoscience research and education community hosting open educational resources. This presentation will provide an update on our progress, highlighting our new strategies, synergies with community needs, and opportunities for collaboration.
Impact of smoking on in-vehicle fine particle exposure during driving
NASA Astrophysics Data System (ADS)
Sohn, Hongji; Lee, Kiyoung
2010-09-01
Indoor smoking bans in public places can reduce secondhand smoke (SHS) exposure. However, smoking in cars and homes has continued. The purpose of this study was to assess the concentration of particulate matter less than 2.5 μm (PM2.5) in moving cars under different window opening conditions. The PM2.5 level was measured by an aerosol spectrometer inside and outside moving cars simultaneously, along with ultrafine particle (UFP) number concentration, speed, temperature and humidity inside the cars. Two sport utility vehicles were used. Three different ventilation conditions were evaluated by up to 20 repeated experiments. In the pre-smoking phase, average in-vehicle PM2.5 concentrations were 16-17 μg m⁻³. Regardless of the window opening conditions, the PM2.5 levels promptly increased when smoking occurred and decreased after the cigarette was extinguished. Although only a single cigarette was smoked, the average PM2.5 levels were 506-1307 μg m⁻³ under the different window opening conditions. When smoking ceased, the average PM2.5 levels for 15 min were several times higher than the US National Ambient Air Quality Standard of 35 μg m⁻³. It took longer than 10 min to reach the level of the pre-smoking phase. Although UFP levels had a temporal profile similar to that of PM2.5, the increased levels during the smoking phase were relatively small. This study demonstrated that SHS exposure in cars with just a single cigarette being smoked could exceed the US EPA NAAQS under realistic window opening conditions. Therefore, the findings support the need for public education against smoking in cars and advocacy for a smoke-free car policy.
Simple and effective method to lock buoy position to ocean currents
NASA Technical Reports Server (NTRS)
Vachon, W. A.; Dahlen, J. M.
1975-01-01
The window-shade drogue, used with drifting buoys to keep them moving at a speed as close as possible to that of the current, has a drag coefficient of 1.93 compared with a maximum of 1.52 for previous drogues. It is remarkably simple to construct, use, and store.
Keep Your Windows Open and Mirrors Polished: On Quality Education in a Changing America
ERIC Educational Resources Information Center
Katz, Lucinda Lee
2011-01-01
Lucinda Lee Katz, head of Marin Country Day School (California), received the 2009 NAIS Diversity Leadership Award. This article presents an edited excerpt of her acceptance speech. In this speech, she outlines what is necessary to move school communities ahead in one's diversity work.
Analysis of oil-pipeline distribution of multiple products subject to delivery time-windows
NASA Astrophysics Data System (ADS)
Jittamai, Phongchai
This dissertation defines the operational problems of, and develops solution methodologies for, the distribution of multiple products in an oil pipeline subject to delivery time-window constraints. A multiple-product oil pipeline is a pipeline system composed of pipes, pumps, valves and storage facilities used to transport different types of liquids. Typically, products delivered by pipelines are petroleum of different grades moving either from production facilities to refineries or from refineries to distributors. Time-windows, which are generally used in logistics and scheduling, are incorporated in this study. The distribution of multiple products in an oil pipeline subject to delivery time-windows is modeled as a multicommodity network flow structure and formulated mathematically. The main focus of this dissertation is the investigation of the operating issues and problem complexity of single-source pipeline problems, and the development of a solution methodology to compute an input schedule that yields minimum total time violation of the due delivery time-windows. The problem is proved to be NP-complete. A heuristic approach, a reversed-flow algorithm, is developed based on pipeline flow reversibility to compute an input schedule for the pipeline problem. This algorithm runs in no more than O(T·E) time. This dissertation also extends the study to examine some operating attributes and the problem complexity of multiple-source pipelines. The multiple-source pipeline problem is also NP-complete. A heuristic algorithm modified from the one used for single-source pipeline problems is introduced. This algorithm can also be implemented in no more than O(T·E) time. Computational results are presented for both methodologies on randomly generated problem sets. The computational experience indicates that the reversed-flow algorithms provide good solutions in comparison with the optimal solutions. Only 25% of the problems tested were more than 30% greater than the optimal values, and approximately 40% of the tested problems were solved optimally by the algorithms.
Operation and control software for APNEA
DOE Office of Scientific and Technical Information (OSTI.GOV)
McClelland, J.H.; Storm, B.H. Jr.; Ahearn, J.
1997-11-01
The human interface software for the Lockheed Martin Specialty Components (LMSC) Active/Passive Neutron Examination & Analysis System (APNEA) provides a user-friendly operating environment for the movement and analysis of waste drums. It is written in Microsoft Visual C++ on a Windows NT platform. Object-oriented and multitasking techniques are used extensively to maximize the capability of the system. A waste drum is placed on a loading platform with a fork lift and then automatically moved into the APNEA chamber in preparation for analysis. A series of measurements is performed, controlled by menu commands to hardware components attached as peripheral devices, in order to create data files for analysis. The analysis routines use the files to identify the pertinent radioactive characteristics of the drum, including the type, location, and quantity of fissionable material. At the completion of the measurement process, the drum is automatically unloaded and the data are archived in preparation for storage as part of the drum's data signature. 3 figs.
Standards for efficient employment of wide-area motion imagery (WAMI) sensors
NASA Astrophysics Data System (ADS)
Randall, L. Scott; Maenner, Paul F.
2013-05-01
Airborne Wide Area Motion Imagery (WAMI) sensors provide the opportunity for continuous high-resolution surveillance of geographic areas covering tens of square kilometers. This is both a blessing and a curse. Data volumes from "gigapixel-class" WAMI sensors are orders of magnitude greater than for traditional "megapixel-class" video sensors. The amount of data greatly exceeds the capacities of downlinks to ground stations, and even if this were not true, the geographic coverage is too large for effective human monitoring. Although collected motion imagery is recorded on the platform, typically only small "windows" of the full field of view are transmitted to the ground; the full set of collected data can be retrieved from the recording device only after the mission has concluded. Thus, the WAMI environment presents several difficulties: (1) data is too massive for downlink; (2) human operator selection and control of the video windows may not be effective; (3) post-mission storage and dissemination may be limited by inefficient file formats; and (4) unique system implementation characteristics may thwart exploitation by available analysis tools. To address these issues, the National Geospatial-Intelligence Agency's Motion Imagery Standards Board (MISB) is developing relevant standard data exchange formats: (1) moving target indicator (MTI) and tracking metadata to support tipping and cueing of WAMI windows using "watch boxes" and "trip wires"; (2) control channel commands for positioning the windows within the full WAMI field of view; and (3) a full-field-of-view spatiotemporal tiled file format for efficient storage, retrieval, and dissemination. The authors previously provided an overview of this suite of standards. This paper describes the latest progress, with specific concentration on a detailed description of the spatiotemporal tiled file format.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mestrovic, Ante; Chitsazzadeh, Shadi; Wells, Derek
2016-08-15
Purpose: To develop a highly sensitive patient-specific QA procedure for gated VMAT stereotactic ablative radiotherapy (SABR) treatments. Methods: A platform was constructed to attach the translational stage of a Quasar respiratory motion phantom to a pinpoint ion chamber insert and move the ion chamber inside the ArcCheck. The Quasar phantom controller uses a patient-specific breathing pattern to translate the ion chamber in a superior-inferior direction inside the ArcCheck. With this system the ion chamber is used to QA the correct phase of the gated delivery and the ArcCheck diodes are used to QA the overall dose distribution. This novel approach requires a single plan delivery for a complete QA of a gated plan. The sensitivity of the gating QA procedure was investigated with respect to the following parameters: PTV size, exhale duration, baseline drift, and gating window size. Results: The difference between the measured dose to a point in the penumbra and the Eclipse-calculated dose was under 2% for small residual motions. The QA procedure was independent of PTV size and duration of exhale. Baseline drift and gating window size, however, significantly affected the penumbral dose measurement, with differences of up to 30% compared to Eclipse. Conclusion: This study described a highly sensitive QA procedure for gated VMAT SABR treatments. The QA outcome was dependent on the gating window size and baseline drift. Analysis of additional patient breathing patterns is currently under way to determine a clinically relevant gating window size and an appropriate tolerance level for this procedure.
NASA Astrophysics Data System (ADS)
Kappler, Karl N.; Schneider, Daniel D.; MacLean, Laura S.; Bleier, Thomas E.
2017-08-01
A method for identification of pulsations in time series of magnetic field data which are simultaneously present in multiple channels of data at one or more sensor locations is described. Candidate pulsations of interest are first identified in geomagnetic time series by inspection. Time series of these "training events" are represented in matrix form and transpose-multiplied to generate time-domain covariance matrices. The ranked eigenvectors of this matrix are stored as a feature of the pulsation. In the second stage of the algorithm, a sliding window (approximately the width of the training event) is moved across the vector-valued time series comprising the channels on which the training event was observed. At each window position, the data covariance matrix and associated eigenvectors are calculated. We compare the orientation of the dominant eigenvectors of the training data to those from the windowed data and flag windows where the dominant eigenvector directions are similar. This was successful in automatically identifying pulses which share polarization and appear to be from the same source process. We apply the method to a case study of continuously sampled (50 Hz) data from six observatories, each equipped with three-component induction coil magnetometers. We examine a 90-day interval of data associated with a cluster of four observatories located within 50 km of Napa, California, together with two remote reference stations, one 100 km to the north of the cluster and the other 350 km south. When the training data contain signals present in the remote reference observatories, we are reliably able to identify and extract global geomagnetic signals such as solar-generated noise. When the training data contain pulsations observed only in the cluster of local observatories, we identify several types of non-plane-wave signals having similar polarization.
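A minimal sketch of the two-stage idea described above, assuming synthetic multichannel data and illustrative window lengths and thresholds; it is not the authors' implementation:

```python
import numpy as np

def dominant_eigvec(segment):
    """Dominant eigenvector of the time-domain covariance of a (samples x channels) segment."""
    cov = segment.T @ segment          # transpose-multiplied, as described above
    w, v = np.linalg.eigh(cov)
    return v[:, -1]                    # eigenvector of the largest eigenvalue (unit norm)

def flag_similar_windows(series, train_vec, win, threshold=0.95):
    """Slide a window across the multichannel series and flag positions whose
    dominant covariance eigenvector is nearly parallel to the training one."""
    flags = []
    for start in range(series.shape[0] - win + 1):
        v = dominant_eigvec(series[start:start + win])
        if abs(np.dot(v, train_vec)) >= threshold:   # |cos| of the angle between unit vectors
            flags.append(start)
    return flags

# Toy usage with synthetic 3-channel data (assumed shapes, not the observatory data).
# With pure noise some windows align by chance, which is what surrogate testing checks.
rng = np.random.default_rng(0)
train = rng.normal(size=(200, 3)) @ np.diag([3.0, 1.0, 0.3])
series = rng.normal(size=(5000, 3))
print(flag_similar_windows(series, dominant_eigvec(train), win=200)[:5])
```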
Baczkowski, Blazej M; Johnstone, Tom; Walter, Henrik; Erk, Susanne; Veer, Ilya M
2017-06-01
We evaluated whether sliding-window analysis can reveal functionally relevant brain network dynamics during a well-established fear conditioning paradigm. To this end, we tested if fMRI fluctuations in amygdala functional connectivity (FC) can be related to task-induced changes in physiological arousal and vigilance, as reflected in the skin conductance level (SCL). Thirty-two healthy individuals participated in the study. For the sliding-window analysis we used windows that were shifted by one volume at a time. Amygdala FC was calculated for each of these windows. Simultaneously acquired SCL time series were averaged over time frames that corresponded to the sliding-window FC analysis, which were subsequently regressed against the whole-brain seed-based amygdala sliding-window FC using the GLM. Surrogate time series were generated to test whether connectivity dynamics could have occurred by chance. In addition, results were contrasted against static amygdala FC and sliding-window FC of the primary visual cortex, which was chosen as a control seed, while a physio-physiological interaction (PPI) was performed as cross-validation. During periods of increased SCL, the left amygdala became more strongly coupled with the bilateral insula and anterior cingulate cortex, core areas of the salience network. The sliding-window analysis yielded a connectivity pattern that was unlikely to have occurred by chance, was spatially distinct from static amygdala FC and from sliding-window FC of the primary visual cortex, but was highly comparable to that of the PPI analysis. We conclude that sliding-window analysis can reveal functionally relevant fluctuations in connectivity in the context of an externally cued task. Copyright © 2017 Elsevier Inc. All rights reserved.
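As a rough illustration of the sliding-window FC idea (not the study's pipeline), the following sketch correlates a seed time series with a target region in windows shifted by one volume and relates the result to a window-averaged SCL trace; all series, lengths, and the single-regressor reduction of the GLM are assumptions:

```python
import numpy as np

def sliding_window_fc(seed, target, win):
    """Pearson correlation between two time series in windows shifted by one volume."""
    n = len(seed) - win + 1
    return np.array([np.corrcoef(seed[i:i + win], target[i:i + win])[0, 1] for i in range(n)])

def window_average(signal, win):
    """Average a simultaneously acquired signal (e.g. SCL) over the same frames."""
    n = len(signal) - win + 1
    return np.array([signal[i:i + win].mean() for i in range(n)])

# Toy usage with synthetic series (lengths and window size are illustrative assumptions).
rng = np.random.default_rng(1)
seed, target, scl = rng.normal(size=(3, 400))
fc = sliding_window_fc(seed, target, win=30)
scl_w = window_average(scl, win=30)
slope = np.polyfit(scl_w, fc, 1)[0]   # a GLM with a single regressor reduces to this slope
print(fc.shape, slope)
```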
2003-04-25
KENNEDY SPACE CENTER, FLA. - Workers in the Payload Hazardous Servicing Facility help guide the Mars Exploration Rover 1 (MER-1) as it is moved to the lander base petal for installation. The MER Mission consists of two identical rovers, landing at different regions of Mars, designed to cover roughly 110 yards each Martian day over various terrain. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The first rover has a launch window opening June 5, and the second rover a window opening June 25. The rovers will be launched from Cape Canaveral Air Force Station.
2003-04-04
KENNEDY SPACE CENTER, FLA. - Workers in the Payload Hazardous Servicing Facility check the Mars Exploration Rover 2 (MER-2) before it is lifted and moved to the lander where it will be mated to the base petal. Set to launch in Spring 2003, the MER Mission consists of two identical rovers, landing at different regions of Mars, designed to cover roughly 110 yards each Martian day over various terrain. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The first rover has a launch window opening May 30, and the second rover a window opening June 25.
NASA Astrophysics Data System (ADS)
Pellerin, Morgane; Castaing, Victor; Gourier, Didier; Chanéac, Corinne; Viana, Bruno
2018-02-01
Persistent luminescence materials have many applications, including security lighting and bio-imaging. Much progress has been made in the elaboration of persistent luminescent nanoparticles suitable for the first NIR partial transparency window (650-950 nm). Moving to the second and third near-infrared partial transparency windows (1000-1800 nm) further reduces scattering, absorption, and tissue autofluorescence effects. In this work, we present the synthesis of Co2+ and Ni2+ doped zinc-gallate nanoparticles with broad emission covering the NIR-II range. Site occupancy, energy levels, optical features, and persistent phenomena are presented.
2003-01-28
KENNEDY SPACE CENTER, FLA. -- The Mars Exploration Rover -2 is moved to a workstand in the Payload Hazardous Servicing Facility. Set to launch in 2003, the Mars Exploration Rover Mission will consist of two identical rovers designed to cover roughly 110 yards (100 meters) each Martian day. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The rovers will be identical to each other, but will land at different regions of Mars. The first rover has a launch window opening May 30, 2003, and the second rover a window opening June 25, 2003.
2003-01-28
KENNEDY SPACE CENTER, FLA. - Workers in the Payload Hazardous Servicing Facility move the Mars Exploration Rover -2 to a workstand in the high bay. Set to launch in 2003, the Mars Exploration Rover Mission will consist of two identical rovers designed to cover roughly 110 yards (100 meters) each Martian day. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The rovers will be identical to each other, but will land at different regions of Mars. The first rover has a launch window opening May 30, 2003, and the second rover a window opening June 25, 2003.
Mechanisms of Cochlear Stimulation Through the Round Window
NASA Astrophysics Data System (ADS)
Lukashkin, Andrei N.; Weddell, Thomas; Russell, Ian J.
2011-11-01
The round window membrane (RW) functions as a pressure relief valve in conventional hearing, allowing structures of the middle ear to move. Investigations in recent years have shown that middle ear implants can be used to stimulate the cochlea via the RW. Isolated clinical uses of this technique have been reported, but more thorough theoretical and empirical studies are required. Using guinea pigs as test subjects, we have investigated the physiological effects of RW stimulation using a simulated active middle ear prosthesis: a cylindrical neodymium-iron-boron disk magnet placed upon the RW and driven by an electromagnetic coil positioned in close proximity to the magnet.
Tunable plasmon-induced transparency in plasmonic metamaterial composed of three identical rings
NASA Astrophysics Data System (ADS)
Tian, Yuchen; Ding, Pei; Fan, Chunzhen
2017-10-01
We numerically investigated the plasmon-induced transparency (PIT) effect in a three-dimensional plasmonic metamaterial composed of three identical rings. It is illustrated that the PIT effect appears as a result of the destructive interference between the electric dipole and the quadrupole resonance mode. By tuning the gap distance, radius, or rotation angle of the metamaterial, the required transmission spectra with a narrow, sharp transparency peak can be realized. In particular, it is found that an on-to-off amplitude modulation of the PIT transparency window can be achieved by moving or rotating the horizontal ring. When the horizontal ring is moved, the two dips in the transmission spectra shift toward higher and lower frequencies, respectively, so the width of the transmission peak becomes larger. When the horizontal ring is rotated, both the width and the position of the transmission peak remain invariant. Our designed structure achieved a maximum group index of 352 in the visible frequency range, which corresponds to a significant slow-light effect. Moreover, the PIT effect is explained based on the classical two-oscillator theory, which is in good agreement with the numerical results. This indicates that our proposed structure and theoretical analysis may open up avenues for the tunable control of light in highly integrated optical circuits.
Shot boundary detection and label propagation for spatio-temporal video segmentation
NASA Astrophysics Data System (ADS)
Piramanayagam, Sankaranaryanan; Saber, Eli; Cahill, Nathan D.; Messinger, David
2015-02-01
This paper proposes a two-stage algorithm for streaming video segmentation. In the first stage, shot boundaries are detected within a window of frames by comparing the dissimilarity between 2-D segmentations of each frame. In the second stage, the 2-D segments are propagated across the window of frames in both the spatial and temporal directions. The window is moved across the video to find all shot transitions and obtain spatio-temporal segments simultaneously. As opposed to techniques that operate on the entire video, the proposed approach consumes significantly less memory and enables segmentation of lengthy videos. We tested our segmentation-based shot detection method on the TRECVID 2007 video dataset and compared it with a block-based technique. Cut detection results on the TRECVID 2007 dataset indicate that our algorithm performs comparably to the best of the block-based methods. The streaming video segmentation routine also achieves promising results on a challenging video segmentation benchmark database.
NASA Astrophysics Data System (ADS)
Moliner, L.; Correcher, C.; Gimenez-Alventosa, V.; Ilisie, V.; Alvarez, J.; Sanchez, S.; Rodríguez-Alvarez, M. J.
2017-11-01
Nowadays, with the increase of the computational power of modern computers together with state-of-the-art reconstruction algorithms, it is possible to obtain Positron Emission Tomography (PET) images in practically real time. These facts open the door to new applications such as radiopharmaceutical tracking inside the body or the use of PET for image-guided procedures, such as biopsy interventions, among others. This work is a proof of concept that aims to improve the user experience with real-time PET images. Fixed, incremental, overlapping, sliding, and hybrid windows are the different statistical combinations of data blocks used to generate intermediate images in order to follow the path of the activity in the Field Of View (FOV). To evaluate these different combinations, a point source is placed in a dedicated breast PET device and moved along the FOV. These acquisitions are reconstructed according to the different statistical windows, resulting in a smoother transition of positions for the image reconstructions that use the sliding and hybrid windows.
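A small sketch of how the statistical windows differ in which data blocks feed each intermediate image; the block indexing, window width, and mode names are assumptions for illustration, not the reconstruction code used in the study:

```python
def window_indices(n_blocks, mode, width=3):
    """Return, for each intermediate image, the data-block indices it would use.
    'incremental' accumulates all blocks so far; 'sliding' keeps only the latest
    `width` blocks. Names and width are illustrative assumptions."""
    frames = []
    for k in range(n_blocks):
        if mode == "incremental":
            frames.append(list(range(0, k + 1)))
        elif mode == "sliding":
            frames.append(list(range(max(0, k - width + 1), k + 1)))
        else:
            raise ValueError(mode)
    return frames

print(window_indices(6, "incremental"))  # growing statistics, but laggy for a moving source
print(window_indices(6, "sliding"))      # fewer counts per image, smoother position tracking
```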
2.5-month-old infants' reasoning about when objects should and should not be occluded.
Aguiar, A; Baillargeon, R
1999-09-01
The present research examined 2.5-month-old infants' reasoning about occlusion events. Three experiments investigated infants' ability to predict whether an object should remain continuously hidden or become temporarily visible when passing behind an occluder with an opening in its midsection. In Experiment 1, the infants were habituated to a short toy mouse that moved back and forth behind a screen. Next, the infants saw two test events that were identical to the habituation event except that a portion of the screen's midsection was removed to create a large window. In one event (high-window event), the window extended from the screen's upper edge; the mouse was shorter than the bottom of the window and thus did not become visible when passing behind the screen. In the other event (low-window event), the window extended from the screen's lower edge; although the mouse was shorter than the top of the window and hence should have become fully visible when passing behind the screen, it never appeared in the window. The infants tended to look equally at the high- and low-window events, suggesting that they were not surprised when the mouse failed to appear in the low window. However, positive results were obtained in Experiment 2 when the low-window event was modified: a portion of the screen above the window was removed so that the left and right sections of the screen were no longer connected (two-screens event). The infants looked reliably longer at the two-screens than at the high-window event. Together, the results of Experiments 1 and 2 suggested that, at 2.5 months of age, infants possess only very limited expectations about when objects should and should not be occluded. Specifically, infants expect objects (1) to become visible when passing between occluders and (2) to remain hidden when passing behind occluders, irrespective of whether these have openings extending from their upper or lower edges. Experiment 3 provided support for this interpretation. The implications of these findings for models of the origins and development of infants' knowledge about occlusion events are discussed. Copyright 1999 Academic Press.
Thomas, Gregory Owen; Poortinga, Wouter; Sautkina, Elena
2016-01-01
Repeated behaviours in stable contexts can become automatic habits. Habits are resistant to information-based techniques to change behaviour, but are contextually cued, so a change in behaviour context (e.g., location) weakens habit strength and can facilitate greater consideration of the behaviour. This idea was demonstrated in previous work, whereby people with strong environmental attitudes have lower car use, but only after recently moving home. We examine the habit discontinuity hypothesis by analysing the Understanding Society dataset with 18,053 individuals representative of the UK population, measuring time since moving home, travel mode to work, and strength of environmental attitudes. Results support previous findings where car use is significantly lower among those with stronger environmental views (but only after recently moving home), and in addition demonstrate a trend whereby this effect decays as the time since moving home increases. We discuss results in light of moving into a new home being a potential 'window of opportunity' to promote pro-environmental behaviours.
Through the Sliding Glass Door: #EmpowerTheReader
ERIC Educational Resources Information Center
Johnson, Nancy J.; Koss, Melanie D.; Martinez, Miriam
2018-01-01
This article seeks to complicate the understanding of Bishop's (1990) metaphor of mirrors, windows, and sliding glass doors, with particular emphasis on sliding glass doors and the emotional connections needed for readers to move through them. The authors begin by examining the importance of the reader and the characters he or she meets. Next, the…
NASA Astrophysics Data System (ADS)
Ma, Zhi-Sai; Liu, Li; Zhou, Si-Da; Yu, Lei; Naets, Frank; Heylen, Ward; Desmet, Wim
2018-01-01
The problem of parametric output-only identification of time-varying structures in a recursive manner is considered. A kernelized time-dependent autoregressive moving average (TARMA) model is proposed by expanding the time-varying model parameters onto the basis set of kernel functions in a reproducing kernel Hilbert space. An exponentially weighted kernel recursive extended least squares TARMA identification scheme is proposed, and a sliding-window technique is subsequently applied to fix the computational complexity for each consecutive update, allowing the method to operate online in time-varying environments. The proposed sliding-window exponentially weighted kernel recursive extended least squares TARMA method is employed for the identification of a laboratory time-varying structure consisting of a simply supported beam and a moving mass sliding on it. The proposed method is comparatively assessed against an existing recursive pseudo-linear regression TARMA method via Monte Carlo experiments and shown to be capable of accurately tracking the time-varying dynamics. Furthermore, the comparisons demonstrate the superior achievable accuracy, lower computational complexity and enhanced online identification capability of the proposed kernel recursive extended least squares TARMA approach.
Power-Efficient Beacon Recognition Method Based on Periodic Wake-Up for Industrial Wireless Devices.
Song, Soonyong; Lee, Donghun; Jang, Ingook; Choi, Jinchul; Son, Youngsung
2018-04-17
Energy harvester-integrated wireless devices are attractive for generating semi-permanent power from wasted energy in industrial environments. The energy-harvesting wireless devices may have difficulty in their communication with access points due to insufficient power supply for beacon recognition during network initialization. In this manuscript, we propose a novel method of beacon recognition based on wake-up control to reduce instantaneous power consumption in the initialization procedure. The proposed method applies a moving window for the periodic wake-up of the wireless devices. For unsynchronized wireless devices, beacons are always located in the same positions within each beacon interval even though the starting offsets are unknown. Using this characteristic, the moving window checks for the existence of the beacon associated with specified resources in a beacon interval, checks again for neighboring resources at the next beacon interval, and so on. This method reduces instantaneous power and generates a surplus of charging time. Thus, the proposed method alleviates the problem of power insufficiency in network initialization. The feasibility of the proposed method is evaluated using computer simulations of power shortage in various energy-harvesting conditions.
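The periodic wake-up idea can be sketched as follows; the beacon-detection callback, slot count, and loop structure are assumptions of this illustration rather than the paper's protocol implementation:

```python
def find_beacon_slot(beacon_present, n_slots, max_intervals=None):
    """Sketch of the moving-window wake-up: in each beacon interval the device wakes
    for a single candidate slot, checks it, and shifts the window to the next slot
    in the next interval. `beacon_present(interval, slot)` stands in for the radio's
    beacon detection and is an assumption of this sketch."""
    max_intervals = max_intervals or n_slots
    for interval in range(max_intervals):
        slot = interval % n_slots            # the moving-window position this interval
        if beacon_present(interval, slot):
            return slot                      # offset learned; the device can now synchronise
    return None

# Toy usage: the (unknown) beacon always sits in slot 7 of a 16-slot interval.
print(find_beacon_slot(lambda i, s: s == 7, n_slots=16))   # -> 7
```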
BOREAS AFM-2 King Air 1994 Aircraft Flux and Moving Window Data
NASA Technical Reports Server (NTRS)
Kelly, Robert D.; Hall, Forrest G. (Editor); Newcomer, Jeffrey A. (Editor); Smith, David E. (Technical Monitor)
2000-01-01
The BOREAS AFM-2 team collected pass-by-pass fluxes (and many other statistics) for a large number of level (constant altitude), straight-line passes used in a variety of flight patterns. The data were collected by the University of Wyoming King Air during the 1994 BOREAS IFCs 1-3. Most of these data were collected at 60-70 m above ground level, but a significant number of passes were also flown at various levels in the planetary boundary layer, up to about the inversion height. This documentation concerns only the data from the straight and level passes, which are presented as original values (over the NSA and SSA) and moving window values (over the Transect). Another archive of King Air data is also available, containing data from all the soundings flown by the King Air during the 1994 IFCs 1-3. The data are stored in tabular ASCII files. The data files are available on a CD-ROM (see document number 20010000884) or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).
Quantifying Wikipedia Usage Patterns Before Stock Market Moves
NASA Astrophysics Data System (ADS)
Moat, Helen Susannah; Curme, Chester; Avakian, Adam; Kenett, Dror Y.; Stanley, H. Eugene; Preis, Tobias
2013-05-01
Financial crises result from a catastrophic combination of actions. Vast stock market datasets offer us a window into some of the actions that have led to these crises. Here, we investigate whether data generated through Internet usage contain traces of attempts to gather information before trading decisions were taken. We present evidence in line with the intriguing suggestion that data on changes in how often financially related Wikipedia pages were viewed may have contained early signs of stock market moves. Our results suggest that online data may allow us to gain new insight into early information gathering stages of decision making.
Improved Scanners for Microscopic Hyperspectral Imaging
NASA Technical Reports Server (NTRS)
Mao, Chengye
2009-01-01
Improved scanners to be incorporated into hyperspectral microscope-based imaging systems have been invented. Heretofore, in microscopic imaging, including spectral imaging, it has been customary to either move the specimen relative to the optical assembly that includes the microscope or else move the entire assembly relative to the specimen. It becomes extremely difficult to control such scanning when submicron translation increments are required, because the high magnification of the microscope enlarges all movements in the specimen image on the focal plane. To overcome this difficulty, in a system based on this invention, no attempt would be made to move either the specimen or the optical assembly. Instead, an objective lens would be moved within the assembly so as to cause translation of the image at the focal plane: the effect would be equivalent to scanning in the focal plane. The upper part of the figure depicts a generic proposed microscope-based hyperspectral imaging system incorporating the invention. The optical assembly of this system would include an objective lens (normally, a microscope objective lens) and a charge-coupled-device (CCD) camera. The objective lens would be mounted on a servomotor-driven translation stage, which would be capable of moving the lens in precisely controlled increments, relative to the camera, parallel to the focal-plane scan axis. The output of the CCD camera would be digitized and fed to a frame grabber in a computer. The computer would store the frame-grabber output for subsequent viewing and/or processing of images. The computer would contain a position-control interface board, through which it would control the servomotor. There are several versions of the invention. An essential feature common to all versions is that the stationary optical subassembly containing the camera would also contain a spatial window, at the focal plane of the objective lens, that would pass only a selected portion of the image. In one version, the window would be a slit, the CCD would contain a one-dimensional array of pixels, and the objective lens would be moved along an axis perpendicular to the slit to spatially scan the image of the specimen in pushbroom fashion. The image built up by scanning in this case would be an ordinary (non-spectral) image. In another version, the optics of which are depicted in the lower part of the figure, the spatial window would be a slit, the CCD would contain a two-dimensional array of pixels, the slit image would be refocused onto the CCD by a relay-lens pair consisting of a collimating and a focusing lens, and a prism-grating-prism optical spectrometer would be placed between the collimating and focusing lenses. Consequently, the image on the CCD would be spatially resolved along the slit axis and spectrally resolved along the axis perpendicular to the slit. As in the first-mentioned version, the objective lens would be moved along an axis perpendicular to the slit to spatially scan the image of the specimen in pushbroom fashion.
Windowing technique in FM radar realized by FPGA for better target resolution
NASA Astrophysics Data System (ADS)
Ponomaryov, Volodymyr I.; Escamilla-Hernandez, Enrique; Kravchenko, Victor F.
2006-09-01
Remote sensing systems such as SAR usually apply FM signals to resolve closely spaced targets (objects) and improve the SNR. A main drawback of pulse compression of the FM radar signal is that it can add range side-lobes to reflectivity measurements. Using weighting-window processing in the time domain, it is possible to significantly decrease the side-lobe level (SLL) of the output radar signal, which permits resolving small or low-power targets that are masked by powerful ones. Classical windows such as Hamming, Hanning, Blackman-Harris, Kaiser-Bessel, Dolph-Chebyshev, Gauss, etc., are usually used in window processing. In addition to the classical windows, we also use a novel class of windows based on atomic function (AF) theory. For comparison of simulation and experimental results we applied the standard parameters, such as coefficient of amplification, maximum side-lobe level, width of the main lobe, etc. In this paper we also propose to implement the compression-windowing model at the hardware level employing a Field Programmable Gate Array (FPGA), which offers benefits such as instantaneous implementation, dynamic reconfiguration, design flexibility, and field programmability. The pulse compression design was investigated on the FPGA, applying the classical and novel window techniques to reduce the SLL in the absence and presence of noise. The paper presents simulated and experimental examples of detection of small or closely spaced targets in the imaging radar. The paper also presents experimental hardware results of windowing in FM radar demonstrating the resolution of several targets for the classical rectangular, Hamming, and Kaiser-Bessel windows, and for some novel ones: Up(x), fup4(x)·D3(x), fup6(x)·G3(x), etc. It is possible to conclude that windows created on the basis of the AFs offer a greater decrease of the SLL, both in the presence and absence of noise and as we move away from the main lobe, in comparison with classical windows.
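A hedged numerical sketch of the effect the abstract describes, weighting the matched-filter reference of a linear FM pulse with a window to lower the compressed side-lobe level; the pulse parameters and the main-lobe guard width are assumptions, and only classical windows (not the atomic-function ones) are shown:

```python
import numpy as np

fs, T, B = 10e6, 20e-6, 2e6                    # sample rate, pulse width, sweep bandwidth (assumed)
t = np.arange(int(T * fs)) / fs
chirp = np.exp(1j * np.pi * (B / T) * t ** 2)  # linear FM pulse

def sidelobe_level_db(window, guard=10):
    """Peak side-lobe level (dB) of the compressed pulse outside an assumed
    +/- `guard`-sample main-lobe region, with `window` applied to the reference."""
    ref = np.conj(chirp[::-1]) * window        # weighted matched-filter reference
    out = np.abs(np.convolve(chirp, ref))
    out /= out.max()
    peak = int(np.argmax(out))
    sidelobes = np.r_[out[:peak - guard], out[peak + guard + 1:]]
    return 20 * np.log10(sidelobes.max())

print("rectangular SLL:", round(sidelobe_level_db(np.ones_like(t)), 1), "dB")
print("Hamming SLL:    ", round(sidelobe_level_db(np.hamming(t.size)), 1), "dB")
```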
An energy function for dynamics simulations of polypeptides in torsion angle space
NASA Astrophysics Data System (ADS)
Sartori, F.; Melchers, B.; Böttcher, H.; Knapp, E. W.
1998-05-01
Conventional simulation techniques to model the dynamics of proteins in atomic detail are restricted to short time scales. A simplified molecular description, in which high-frequency motions with small amplitudes are ignored, can overcome this problem. In this protein model only the backbone dihedrals φ and ψ and the χi of the side chains serve as degrees of freedom. Bond angles and lengths are fixed at ideal geometry values provided by the standard molecular dynamics (MD) energy function CHARMM. In this work a Monte Carlo (MC) algorithm is used, whose elementary moves employ cooperative rotations in a small window of consecutive amide planes, leaving the polypeptide conformation outside of this window invariant. A single window MC move generates only local conformational changes, but the application of many such moves at different parts of the polypeptide backbone leads to global conformational changes. To account for the lack of flexibility in the protein model employed, the energy function used to evaluate conformational energies is split into sequentially neighbored and sequentially distant contributions. The sequentially neighbored part is represented by an effective (φ,ψ)-torsion potential. It is derived from MD simulations of a flexible model dipeptide using a conventional MD energy function. To avoid exaggeration of hydrogen bonding strengths, the electrostatic interactions involving hydrogen atoms are scaled down at short distances. With these adjustments of the energy function, the rigid polypeptide model exhibits the same equilibrium distributions as obtained by conventional MD simulation with a fully flexible molecular model. Also, the same temperature dependence of the stability and build-up of α helices of 18-alanine as found in MD simulations is observed using the adapted energy function for MC simulations. Analyses of transition frequencies demonstrate that dynamical aspects of MD trajectories are also faithfully reproduced. Finally, it is demonstrated that even for high-temperature unfolded polypeptides the MC simulation is more efficient than conventional MD simulations by a factor of 10.
NASA Astrophysics Data System (ADS)
Dai, Junhu; Xu, Yunjia; Wang, Huanjiong; Alatalo, Juha; Tao, Zexing; Ge, Quansheng
2017-12-01
Continuous long-term temperature sensitivity (ST) of leaf unfolding date (LUD) and the main impacting factors in spring over the period 1978-2014 for 40 plant species in Mudanjiang, Heilongjiang Province, Northeast China, were analyzed using observation data from the China Phenological Observation Network (CPON) together with the corresponding meteorological data from the China Meteorological Data Service Center. Temperature sensitivities, defined as the slopes of the regression between LUD and mean temperature during the optimum preseason (OP), were analyzed using a 15-year moving window to determine their temporal trends. Major factors impacting ST were then chosen and evaluated by applying a random sampling method. The results showed that LUD was sensitive to mean temperature in a defined period before phenophase onset for all plant species analyzed. Over the period 1978-2014, the mean ST of LUD for all plant species was -3.2 ± 0.49 days °C⁻¹. The moving window analysis revealed that 75% of species displayed increasing ST of LUD, with 55% showing significant increases (P < 0.05). ST for the other 25% exhibited a decreasing trend, with 17% showing significant decreases (P < 0.05). On average, ST increased by 16%, from -2.8 ± 0.83 days °C⁻¹ during 1980-1994 to -3.30 ± 0.65 days °C⁻¹ during 2000-2014. For species with later LUD and longer OP, ST tended to increase more, while species with earlier LUD and shorter OP tended to display a decreasing ST. The standard deviation of the preseason temperature affected the temporal variation in ST. Chilling conditions influenced ST for some species, but photoperiod limitation did not have significant or coherent effects on changes in ST.
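The moving-window sensitivity estimate can be sketched as a windowed linear regression; the synthetic data, window arithmetic, and variable names below are illustrative assumptions, not CPON data or the study's code:

```python
import numpy as np

def moving_window_sensitivity(years, lud_doy, temp, width=15):
    """Slope (days per deg C) of LUD vs mean preseason temperature in each
    `width`-year window; a sketch of the moving-window ST estimate."""
    years = np.asarray(years)
    out = []
    for start in range(int(years.min()), int(years.max()) - width + 2):
        mask = (years >= start) & (years < start + width)
        slope = np.polyfit(np.asarray(temp)[mask], np.asarray(lud_doy)[mask], 1)[0]
        out.append((start, start + width - 1, slope))
    return out

# Toy usage: a species whose sensitivity is about -3 days/degC plus noise (synthetic data).
rng = np.random.default_rng(2)
yrs = np.arange(1978, 2015)
tmp = 5 + 0.03 * (yrs - 1978) + rng.normal(0, 1, yrs.size)
lud = 130 - 3.0 * tmp + rng.normal(0, 2, yrs.size)
for first, last, st in moving_window_sensitivity(yrs, lud, tmp)[:3]:
    print(f"{first}-{last}: ST = {st:.2f} days/degC")
```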
Comparison of muscles activity of abled bodied and amputee subjects for around shoulder movement.
Kaur, Amanpreet; Agarwal, Ravinder; Kumar, Amod
2016-05-12
Worldwide, about 56% of amputees are upper limb amputees. This research presents a method using two-channel surface electromyogram (SEMG) signals recorded from around the shoulder to estimate changes in muscle activity in non-amputees and in the residual limb of transhumeral amputees during different arm movements. The aim was to identify the activity of different muscles around the shoulder in amputee and non-amputee persons. SEMG signals were acquired during three distinct exercises from three selected muscle locations around the shoulder. The participants were asked to move their dominant arm from an assigned position so that changes in muscle activity with position could be recorded. Results show that activity in the scalene is greater than in the other muscles, such as the pectoralis and infraspinatus, for the same shoulder motion. In addition, an STFT (short-time Fourier transform) spectrogram with a window length of 256 samples and a maximum of 512 frequency bins, using a Hamming window, was used to identify the maximum muscle activity with the best resolution in the spectrum plot. The results suggest that this analysis can be used for designing a suitable device for around-shoulder prosthesis users based on the muscle activation of amputees.
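A brief sketch of the STFT settings mentioned above (256-sample Hamming window, 512-point FFT), using SciPy on a synthetic burst; the sampling rate and signal are assumptions:

```python
import numpy as np
from scipy.signal import stft

fs = 1000.0                                                  # assumed sampling rate
t = np.arange(0, 5, 1 / fs)
semg = np.random.default_rng(3).normal(0, 0.05, t.size)
semg[1000:2000] += np.sin(2 * np.pi * 80 * t[1000:2000])     # a burst of "muscle activity"

f, seg_times, Z = stft(semg, fs=fs, window="hamming", nperseg=256, nfft=512)
power = np.abs(Z) ** 2
burst = seg_times[power.sum(axis=0).argmax()]                # window with the most energy
print(f"strongest activity near t = {burst:.2f} s")
```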
Sahi, Kamal; Jackson, Stuart; Wiebe, Edward; Armstrong, Gavin; Winters, Sean; Moore, Ronald; Low, Gavin
2014-02-01
To assess if "liver window" settings improve the conspicuity of small renal cell carcinomas (RCC). Patients were analysed from our institution's pathology-confirmed RCC database that included the following: (1) stage T1a RCCs, (2) an unenhanced computed tomography (CT) abdomen performed ≤ 6 months before histologic diagnosis, and (3) age ≥ 17 years. Patients with multiple tumours, prior nephrectomy, von Hippel-Lindau disease, and polycystic kidney disease were excluded. The unenhanced CT was analysed, and the tumour locations were confirmed by using corresponding contrast-enhanced CT or magnetic resonance imaging studies. Representative single-slice axial, coronal, and sagittal unenhanced CT images were acquired in "soft tissue windows" (width, 400 Hounsfield unit (HU); level, 40 HU) and liver windows (width, 150 HU; level, 88 HU). In addition, single-slice axial, coronal, and sagittal unenhanced CT images of nontumourous renal tissue (obtained from the same cases) were acquired in soft tissue windows and liver windows. These data sets were randomized, unpaired, and were presented independently to 3 blinded radiologists for analysis. The presence or absence of suspicious findings for tumour was scored on a 5-point confidence scale. Eighty-three of 415 patients met the study criteria. Receiver operating characteristics (ROC) analysis, t test analysis, and kappa analysis were used. ROC analysis showed statistically superior diagnostic performance for liver windows compared with soft tissue windows (area under the curve of 0.923 vs 0.879; P = .0002). Kappa statistics showed "good" vs "moderate" agreement between readers for liver windows compared with soft tissue windows. Use of liver windows settings improves the detection of small RCCs on the unenhanced CT. Copyright © 2014 Canadian Association of Radiologists. Published by Elsevier Inc. All rights reserved.
Transportable Applications Environment Plus, Version 5.1
NASA Technical Reports Server (NTRS)
1994-01-01
Transportable Applications Environment Plus (TAE+) computer program providing integrated, portable programming environment for developing and running application programs based on interactive windows, text, and graphical objects. Enables both programmers and nonprogrammers to construct own custom application interfaces easily and to move interfaces and application programs to different computers. Used to define corporate user interface, with noticeable improvements in application developer's and end user's learning curves. Main components are: WorkBench, What You See Is What You Get (WYSIWYG) software tool for design and layout of user interface; and WPT (Window Programming Tools) Package, set of callable subroutines controlling user interface of application program. WorkBench and WPTs written in C++, and remaining code written in C.
2003-03-28
KENNEDY SPACE CENTER, FLA. - In the Payload Hazardous Servicing Facility, workers move the Mars Exploration Rover-2 (MER-2) into position over the base petal of its lander assembly. Set to launch in Spring 2003, the MER Mission will consist of two identical rovers designed to cover roughly 110 yards each Martian day over various terrain. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The rovers will be identical to each other, but will land at different regions of Mars. The first rover has a launch window opening May 30, and the second rover, a window opening June 25.
2003-01-28
KENNEDY SPACE CENTER, FLA. - After being cleaned up, the Mars Exploration Rover -2 is ready to be moved to a workstand in the Payload Hazardous Servicing Facility. Set to launch in 2003, the Mars Exploration Rover Mission will consist of two identical rovers designed to cover roughly 110 yards (100 meters) each Martian day. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The rovers will be identical to each other, but will land at different regions of Mars. The first rover has a launch window opening May 30, 2003, and the second rover a window opening June 25, 2003.
2003-03-28
KENNEDY SPACE CENTER, FLA. - In the Payload Hazardous Servicing Facility, workers move the Mars Exploration Rover-2 (MER-2) towards the base petal of its lander assembly. Set to launch in Spring 2003, the MER Mission will consist of two identical rovers designed to cover roughly 110 yards each Martian day over various terrain. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The rovers will be identical to each other, but will land at different regions of Mars. The first rover has a launch window opening May 30, and the second rover, a window opening June 25.
2003-01-31
KENNEDY SPACE CENTER, FLA. - Suspended by an overhead crane in the Payload Hazardous Servicing Facility, the Mars Exploration Rover (MER) aeroshell is guided by workers as it moves to a rotation stand. Set to launch in 2003, the MER Mission will consist of two identical rovers designed to cover roughly 110 yards (100 meters) each Martian day. Each rover will carry five scientific instruments that will allow it to search for evidence of liquid water that may have been present in the planet's past. The rovers will be identical to each other, but will land at different regions of Mars. The first rover has a launch window opening May 30, and the second rover a window opening June 25, 2003.
NASA Technical Reports Server (NTRS)
Tescher, Andrew G. (Editor)
1989-01-01
Various papers on image compression and automatic target recognition are presented. Individual topics addressed include: target cluster detection in cluttered SAR imagery, model-based target recognition using laser radar imagery, Smart Sensor front-end processor for feature extraction of images, object attitude estimation and tracking from a single video sensor, symmetry detection in human vision, analysis of high resolution aerial images for object detection, obscured object recognition for an ATR application, neural networks for adaptive shape tracking, statistical mechanics and pattern recognition, detection of cylinders in aerial range images, moving object tracking using local windows, new transform method for image data compression, quad-tree product vector quantization of images, predictive trellis encoding of imagery, reduced generalized chain code for contour description, compact architecture for a real-time vision system, use of human visibility functions in segmentation coding, color texture analysis and synthesis using Gibbs random fields.
Snoopy--a unifying Petri net framework to investigate biomolecular networks.
Rohr, Christian; Marwan, Wolfgang; Heiner, Monika
2010-04-01
To investigate biomolecular networks, Snoopy provides a unifying Petri net framework comprising a family of related Petri net classes. Models can be hierarchically structured, allowing for the mastering of larger networks. To move easily between the qualitative, stochastic and continuous modelling paradigms, models can be converted into each other. We get models sharing structure, but specialized by their kinetic information. The analysis and iterative reverse engineering of biomolecular networks is supported by the simultaneous use of several Petri net classes, while the graphical user interface adapts dynamically to the active one. Built-in animation and simulation are complemented by exports to various analysis tools. Snoopy facilitates the addition of new Petri net classes thanks to its generic design. Our tool with Petri net samples is available free of charge for non-commercial use at http://www-dssz.informatik.tu-cottbus.de/snoopy.html; supported operating systems: Mac OS X, Windows and Linux (selected distributions).
Text-Based Recall and Extra-Textual Generations Resulting from Simplified and Authentic Texts
ERIC Educational Resources Information Center
Crossley, Scott A.; McNamara, Danielle S.
2016-01-01
This study uses a moving windows self-paced reading task to assess text comprehension of beginning and intermediate-level simplified texts and authentic texts by L2 learners engaged in a text-retelling task. Linear mixed effects (LME) models revealed statistically significant main effects for reading proficiency and text level on the number of…
ERIC Educational Resources Information Center
Tam, Cynthia; Wells, David
2009-01-01
Visual-cognitive loads influence the effectiveness of word prediction technology. Adjusting parameters of word prediction programs can lessen visual-cognitive loads. This study evaluated the benefits of WordQ word prediction software for users' performance when the prediction window was moved to a personal digital assistant (PDA) device placed at…
Reading Time Allocation Strategies and Working Memory Using Rapid Serial Visual Presentation
ERIC Educational Resources Information Center
Busler, Jessica N.; Lazarte, Alejandro A.
2017-01-01
Rapid serial visual presentation (RSVP) is a useful method for controlling the timing of text presentations and studying how readers' characteristics, such as working memory (WM) and reading strategies for time allocation, influence text recall. In the current study, a modified version of RSVP (Moving Window RSVP [MW-RSVP]) was used to induce…
Initial Scene Representations Facilitate Eye Movement Guidance in Visual Search
ERIC Educational Resources Information Center
Castelhano, Monica S.; Henderson, John M.
2007-01-01
What role does the initial glimpse of a scene play in subsequent eye movement guidance? In 4 experiments, a brief scene preview was followed by object search through the scene via a small moving window that was tied to fixation position. Experiment 1 demonstrated that the scene preview resulted in more efficient eye movements compared with a…
Visualising Cultures: The "European Picture Book Collection" Moves "Down Under"
ERIC Educational Resources Information Center
Cotton, Penni; Daly, Nicola
2015-01-01
The potential for picture books in national collections to act as mirrors reflecting the reader's cultural identity, is widely accepted. This paper shows that the books in a New Zealand Picture Book Collection can also become windows into unfamiliar worlds for non-New Zealand readers, giving them the opportunity to learn more about a context in…
Pereira, Telma; Lemos, Luís; Cardoso, Sandra; Silva, Dina; Rodrigues, Ana; Santana, Isabel; de Mendonça, Alexandre; Guerreiro, Manuela; Madeira, Sara C
2017-07-19
Predicting progression from a stage of Mild Cognitive Impairment (MCI) to dementia is a major pursuit in current research. It is broadly accepted that cognition declines along a continuum between MCI and dementia. As such, cohorts of MCI patients are usually heterogeneous, containing patients at different stages of the neurodegenerative process. This hampers the prognostic task. Nevertheless, when learning prognostic models, most studies use the entire cohort of MCI patients regardless of their disease stages. In this paper, we propose a Time Windows approach to predict conversion to dementia, learning with patients stratified using time windows, thus fine-tuning the prognosis regarding the time to conversion. In the proposed Time Windows approach, we grouped patients based on the clinical information of whether they converted (converter MCI) or remained MCI (stable MCI) within a specific time window. We tested time windows of 2, 3, 4, and 5 years. We developed a prognostic model for each time window using clinical and neuropsychological data and compared this approach with the one commonly used in the literature, in which all patients are used to learn the models, referred to as the First Last approach. This makes it possible to move from the traditional question "Will an MCI patient convert to dementia somewhere in the future?" to the question "Will an MCI patient convert to dementia in a specific time window?". The proposed Time Windows approach outperformed the First Last approach. The results showed that we can predict conversion to dementia as early as 5 years before the event, with an AUC of 0.88 in the cross-validation set and 0.76 in an independent validation set. Prognostic models using time windows have higher performance when predicting progression from MCI to dementia, compared with the prognostic approach commonly used in the literature. Furthermore, the proposed Time Windows approach is more relevant from a clinical point of view, predicting conversion within a temporal interval rather than sometime in the future and allowing clinicians to adjust treatments and clinical appointments in a timely manner.
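The stratification idea can be sketched as a labelling rule per time window; the exact inclusion and exclusion criteria of the paper may differ, and the field names below are assumptions:

```python
from datetime import date

def time_window_label(baseline, conversion, last_followup, window_years):
    """Label an MCI patient for a given prognostic window (a sketch of the
    stratification idea). Returns 'cMCI', 'sMCI', or None when undecidable."""
    horizon = baseline.replace(year=baseline.year + window_years)
    if conversion is not None and conversion <= horizon:
        return "cMCI"                      # converted within the window
    if conversion is None and last_followup >= horizon:
        return "sMCI"                      # still MCI at (or beyond) the horizon
    return None                            # converted later, or follow-up too short

print(time_window_label(date(2010, 1, 1), date(2013, 6, 1), date(2016, 1, 1), 5))  # cMCI
print(time_window_label(date(2010, 1, 1), None, date(2016, 1, 1), 5))              # sMCI
print(time_window_label(date(2010, 1, 1), None, date(2013, 1, 1), 5))              # None
```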
NASA Astrophysics Data System (ADS)
Lyubushin, Alexey
2016-04-01
The problem of estimating current seismic danger based on monitoring seismic noise properties from the broadband seismic network F-net in Japan (84 stations) is considered. Variations of the following seismic noise parameters are analyzed: multifractal singularity spectrum support width, generalized Hurst exponent, minimum Hölder-Lipschitz exponent, and minimum normalized entropy of squared orthogonal wavelet coefficients. These parameters are estimated within adjacent time windows of length 1 day for seismic noise waveforms from each station. Calculating daily median values of these parameters over all stations provides a 4-dimensional time series that describes integral properties of the seismic noise in the region covered by the network. Cluster analysis is applied to the sequence of clouds of 4-dimensional vectors within a moving time window of length 365 days with a mutual shift of 3 days, starting from the beginning of 1997 up to the current time. The purpose of the cluster analysis is to find the best number of clusters (BNC) from probe values varying from 1 up to a maximum of 40. The BNC is found from the maximum of the pseudo-F statistic (PFS). A 2D map can be created that presents the dependence of the PFS on the tested probe number of clusters and on the right-hand end of the moving time window; it is rather similar to a usual spectral time-frequency diagram. In [1] it was shown that the BNC exhibited a strongly chaotic regime before the Tohoku mega-earthquake of March 11, 2011, with jumps from minimum to maximum values in the year before the event, and that this interval was characterized by high PFS values. The PFS map is proposed as a method for extracting time intervals with high current seismic danger. The next dangerous time interval after the Tohoku mega-earthquake began at the end of 2012 and ended in the middle of 2013. Starting from the middle of 2015, high PFS values and the chaotic regime of BNC variations returned. This could be interpreted as an increased danger of the next mega-earthquake in Japan, in the region of the Nankai Trough [1], in the first half of 2016. References: 1. Lyubushin, A. (2013). How soon would the next mega-earthquake occur in Japan? Natural Science, 5(8A1), 1-7. http://dx.doi.org/10.4236/ns.2013.58A1001
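A hedged sketch of the moving-window cluster analysis, using the Calinski-Harabasz statistic as the pseudo-F criterion; the synthetic 4-D features, probe range, and step size in the toy call are assumptions, not the F-net data or the original code:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import calinski_harabasz_score

def best_cluster_number(daily_vectors, win=365, step=3, k_max=40):
    """For each moving window, choose the probe cluster count with the largest
    pseudo-F (Calinski-Harabasz) statistic; a sketch of the idea, not the original code."""
    results = []
    for start in range(0, len(daily_vectors) - win + 1, step):
        block = daily_vectors[start:start + win]
        scores = {}
        for k in range(2, k_max + 1):        # the score is undefined for k = 1
            labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(block)
            scores[k] = calinski_harabasz_score(block, labels)
        best_k = max(scores, key=scores.get)
        results.append((start, best_k, scores[best_k]))
    return results

# Toy run on synthetic 4-D daily noise features (not F-net data); a small probe range
# and coarse step keep the example fast.
daily = np.random.default_rng(4).normal(size=(600, 4))
for start, k, pfs in best_cluster_number(daily, step=60, k_max=6):
    print(f"window starting at day {start}: best k = {k}, PFS = {pfs:.1f}")
```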
NASA Astrophysics Data System (ADS)
Pan, Chu-Dong; Yu, Ling; Liu, Huan-Lin
2017-08-01
Traffic-induced moving force identification (MFI) is a typical inverse problem in the field of bridge structural health monitoring. Many regularization-based methods have been proposed for MFI. However, the MFI accuracy obtained from the existing methods is low when the moving forces enter and exit the bridge deck, due to the low sensitivity of structural responses to the forces at these zones. To overcome this shortcoming, a novel moving average Tikhonov regularization method is proposed for MFI by incorporating moving-average concepts. Firstly, the bridge-vehicle interaction moving force is assumed to be a discrete finite signal with stable average value (DFS-SAV). Secondly, this signal feature of the DFS-SAV is quantified and introduced to improve the penalty function (‖x‖₂²) defined in classical Tikhonov regularization. Then, a feasible two-step strategy is proposed for selecting the regularization parameter and the balance coefficient defined in the improved penalty function. Finally, both numerical simulations on a simply supported beam and laboratory experiments on a hollow tube beam are performed to assess the accuracy and feasibility of the proposed method. The results show that the moving forces can be accurately identified with strong robustness. Some related issues, such as the selection of the moving window length, the effect of different penalty functions, and the effect of different car speeds, are discussed as well.
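For orientation, the sketch below contrasts classical Tikhonov regularization with one plausible reading of a moving-average-based penalty (penalizing deviation of the identified force from its own moving average); this is an illustrative interpretation under stated assumptions, not the authors' exact formulation:

```python
import numpy as np

def tikhonov(A, b, lam):
    """Classical Tikhonov solution of min ||Ax - b||^2 + lam * ||x||^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

def moving_average_tikhonov(A, b, lam, width=5):
    """Illustrative variant: penalise deviation of the force history from its own
    moving average (one plausible reading of the improved penalty)."""
    n = A.shape[1]
    M = np.zeros((n, n))
    for i in range(n):                        # build a simple moving-average operator
        lo, hi = max(0, i - width // 2), min(n, i + width // 2 + 1)
        M[i, lo:hi] = 1.0 / (hi - lo)
    D = np.eye(n) - M
    return np.linalg.solve(A.T @ A + lam * (D.T @ D), A.T @ b)

# Toy deconvolution: recover a smooth "force" from noisy responses of an assumed system.
rng = np.random.default_rng(5)
n = 120
A = np.tril(np.exp(-0.05 * np.abs(np.subtract.outer(np.arange(n), np.arange(n)))))
x_true = 10.0 * np.sin(np.linspace(0, np.pi, n))
b = A @ x_true + rng.normal(0, 0.5, n)
for name, x_hat in [("classical      ", tikhonov(A, b, 1.0)),
                    ("moving-average ", moving_average_tikhonov(A, b, 1.0))]:
    err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
    print(name, "relative error:", round(err, 3))
```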
Electron wind in strong wave guide fields
NASA Astrophysics Data System (ADS)
Krienen, F.
1985-03-01
The X-ray activity observed near highly powered waveguide structures is usually caused by local electric discharges originating from discontinuities such as couplers, tuners, or bends. In traveling waves, electrons move in the direction of the power flow. Seed electrons can multipactor in a traveling wave; the moving charge pattern is different from the multipactor in a resonant structure and is self-extinguishing. The charge density in the waveguide will modify the impedance and propagation constant of the waveguide. The radiation level inside the output waveguide of the SLAC 50 MW S-band klystron is estimated. Possible contributions of radiation to window failure are discussed.
2001-05-29
KODIAK ISLAND, Alaska -- A boat moves a ramp into place that will allow Castor 120, the first stage of the Athena 1 launch vehicle, to safely move onto the dock at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5:00 to 7:00 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
Optimal Window and Lattice in Gabor Transform. Application to Audio Analysis.
Lachambre, Helene; Ricaud, Benjamin; Stempfel, Guillaume; Torrésani, Bruno; Wiesmeyr, Christoph; Onchis-Moaca, Darian
2015-01-01
This article deals with the use of optimal lattice and optimal window in Discrete Gabor Transform computation. In the case of a generalized Gaussian window, extending earlier contributions, we introduce an additional local window adaptation technique for non-stationary signals. We illustrate our approach and the earlier one by addressing three time-frequency analysis problems to show the improvements achieved by the use of optimal lattice and window: close frequencies distinction, frequency estimation and SNR estimation. The results are presented, when possible, with real world audio signals.
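A minimal Gabor-style analysis can be sketched as an STFT on a regular lattice with a Gaussian window; the lattice parameters (window length and hop) and the Gaussian width below are assumed values, chosen only to illustrate the close-frequencies problem mentioned above:

```python
import numpy as np
from scipy.signal import stft, find_peaks

fs = 8000
t = np.arange(0, 1.0, 1 / fs)
signal = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 466 * t)  # two close tones

win_len, hop, gauss_std = 1024, 256, 160      # assumed lattice and window parameters
f, frames, Z = stft(signal, fs=fs, window=("gaussian", gauss_std),
                    nperseg=win_len, noverlap=win_len - hop)
spectrum = np.abs(Z).mean(axis=1)             # average magnitude over the time frames
peaks, _ = find_peaks(spectrum)
strongest = peaks[np.argsort(spectrum[peaks])[-2:]]
print("two strongest spectral peaks (Hz):", sorted(np.round(f[strongest], 1)))
```

With a wider Gaussian (better frequency resolution) the two tones separate more cleanly, at the cost of time resolution; this trade-off is exactly what an optimal window and lattice choice negotiates.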
Measuring floodplain spatial patterns using continuous surface metrics at multiple scales
Scown, Murray W.; Thoms, Martin C.; DeJager, Nathan R.
2015-01-01
Interactions between fluvial processes and floodplain ecosystems occur upon a floodplain surface that is often physically complex. Spatial patterns in floodplain topography have only recently been quantified over multiple scales, and discrepancies exist in how floodplain surfaces are perceived to be spatially organised. We measured spatial patterns in floodplain topography for pool 9 of the Upper Mississippi River, USA, using moving window analyses of eight surface metrics applied to a 1 × 1 m² DEM over multiple scales. The metrics used were Range, SD, Skewness, Kurtosis, CV, SDCURV, Rugosity, and Vol:Area, and window sizes ranged from 10 to 1000 m in radius. Surface metric values were highly variable across the floodplain and revealed a high degree of spatial organisation in floodplain topography. Moran's I correlograms fit to the landscape of each metric at each window size revealed that patchiness existed at nearly all window sizes, but the strength and scale of patchiness changed with window size, suggesting that multiple scales of patchiness and patch structure exist in the topography of this floodplain. Scale thresholds in the spatial patterns were observed, particularly between the 50 and 100 m window sizes for all surface metrics and between the 500 and 750 m window sizes for most metrics. These threshold scales are ~15-20% and 150% of the main channel width (1-2% and 10-15% of the floodplain width), respectively. These thresholds may be related to structuring processes operating across distinct scale ranges. By coupling surface metrics, multi-scale analyses, and correlograms, quantifying floodplain topographic complexity is possible in ways that should assist in clarifying how floodplain ecosystems are structured.
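Two of the per-cell metrics (SD and Range) can be sketched as moving-window filters over a DEM; the square windows, synthetic surface, and radii below are assumptions for illustration, whereas the study used additional metrics and radii up to 1000 m:

```python
import numpy as np
from scipy import ndimage

def local_std(dem, radius):
    """Moving-window standard deviation via E[x^2] - E[x]^2 with uniform filters."""
    size = 2 * radius + 1
    mean = ndimage.uniform_filter(dem, size)
    mean_sq = ndimage.uniform_filter(dem * dem, size)
    return np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))

def local_range(dem, radius):
    """Moving-window elevation range (max - min)."""
    size = 2 * radius + 1
    return ndimage.maximum_filter(dem, size) - ndimage.minimum_filter(dem, size)

# Toy 1 m DEM: a gentle slope with a channel carved through it (synthetic, not pool 9).
y, x = np.mgrid[0:200, 0:200].astype(float)
dem = 0.01 * x - 2.0 * np.exp(-((y - 100.0) ** 2) / (2 * 15.0 ** 2))
for radius in (10, 50):                      # window radii in cells (metres here)
    print(radius, "m:", round(local_std(dem, radius).mean(), 3),
          round(local_range(dem, radius).max(), 3))
```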
Muilenberg, M L; Skellenger, W S; Burge, H A; Solomon, W R
1991-02-01
Penetration of particulate aeroallergens into the interiors of two new, similar Chrysler Corporation passenger vehicles (having no evidence of intrinsic microbial contamination) was studied on a large circular test track during periods of high pollen and spore prevalence. Impactor collections were obtained at front and rear seat points and at the track center during periods with (1) windows and vents closed and air conditioning on (AC), (2) windows closed, vents open, and no air conditioning (VO), and (3) air conditioner off, front windows open, and vents closed (WO). These conditions were examined sequentially during travel at 40, 50, 60, and 80 kph. Particle recoveries within the vehicles did not vary with the speed of travel, either overall or with regard to each of the three ventilatory modalities. In addition, collections at front and rear seat sampling points were comparable. The highest interior aeroallergen levels were recorded with windows open (WO), and yet these levels averaged only half the concurrent outside concentrations at track center. Recoveries within the cars were well below recoveries obtained outside when the windows were closed (both VO and AC modes). These findings suggest window ventilation as an overriding factor determining particle ingress into moving vehicles. Efforts to delineate additional determinants of exposure by direct sampling are feasible and would appear essential in formulating realistic strategies of avoidance.
In Search of Conversational Grain Size: Modelling Semantic Structure Using Moving Stanza Windows
ERIC Educational Resources Information Center
Siebert-Evenstone, Amanda L.; Irgens, Golnaz Arastoopour; Collier, Wesley; Swiecki, Zachari; Ruis, Andrew R.; Shaffer, David Williamson
2017-01-01
Analyses of learning based on student discourse need to account not only for the content of the utterances but also for the ways in which students make connections across turns of talk. This requires segmentation of discourse data to define when connections are likely to be meaningful. In this paper, we present an approach to segmenting data for…
View of the ISS stack as seen during the fly-around by the STS-96 crew
2017-04-20
S96-E-5218 (3 June 1999) --- Partially silhouetted over clouds and a wide expanse of ocean waters, the unmanned International Space Station (ISS) moves away from the Space Shuttle Discovery. An electronic still camera (ESC) was aimed through aft flight deck windows to capture the image at 23:01:00 GMT, June 3, 1999.
ERIC Educational Resources Information Center
Crossley, Scott A.; Yang, Hae Sung; McNamara, Danielle S.
2014-01-01
This study uses a moving windows self-paced reading task to assess both text comprehension and processing time of authentic texts and these same texts simplified to beginning and intermediate levels. Forty-eight second language learners each read 9 texts (3 different authentic, beginning, and intermediate level texts). Repeated measures ANOVAs…
The Advantage of Word-Based Processing in Chinese Reading: Evidence from Eye Movements
ERIC Educational Resources Information Center
Li, Xingshan; Gu, Junjuan; Liu, Pingping; Rayner, Keith
2013-01-01
In 2 experiments, we tested the prediction that reading is more efficient when characters belonging to a word are presented simultaneously than when they are not in Chinese reading using a novel variation of the moving window paradigm (McConkie & Rayner, 1975). In Experiment 1, we found that reading was slowed down when Chinese readers could…
Power-Efficient Beacon Recognition Method Based on Periodic Wake-Up for Industrial Wireless Devices
Lee, Donghun; Jang, Ingook; Choi, Jinchul; Son, Youngsung
2018-01-01
Energy harvester-integrated wireless devices are attractive for generating semi-permanent power from wasted energy in industrial environments. The energy-harvesting wireless devices may have difficulty in their communication with access points due to insufficient power supply for beacon recognition during network initialization. In this manuscript, we propose a novel method of beacon recognition based on wake-up control to reduce instantaneous power consumption in the initialization procedure. The proposed method applies a moving window for the periodic wake-up of the wireless devices. For unsynchronized wireless devices, beacons are always located in the same positions within each beacon interval even though the starting offsets are unknown. Using these characteristics, the moving window checks the existence of the beacon associated with specified resources in a beacon interval, checks again for neighboring resources at the next beacon interval, and so on. This method can reduce instantaneous power and generate a surplus of charging time. Thus, the proposed method alleviates the problems of power insufficiency in the network initialization. The feasibility of the proposed method is evaluated using computer simulations of power shortage in various energy-harvesting conditions. PMID:29673206
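A hypothetical sketch of the moving-window wake-up idea described in the abstract: the beacon interval is divided into slots ("resources"), and in each successive interval the device wakes only during the next slot, so every slot is eventually checked while instantaneous power stays low. The function names and the beacon_present() helper are illustrative, not from the paper.

```python
def find_beacon(num_slots, beacon_present, max_intervals=None):
    """Return the slot offset of the beacon, checking one slot per interval."""
    max_intervals = max_intervals or num_slots
    for interval in range(max_intervals):
        slot = interval % num_slots          # the window moves one slot per interval
        if beacon_present(interval, slot):   # device wakes up only for this slot
            return slot
    return None

# Example: a beacon fixed in slot 7 of a 16-slot interval is found after 8 intervals.
print(find_beacon(16, lambda interval, slot: slot == 7))   # -> 7
```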
Moving-window dynamic optimization: design of stimulation profiles for walking.
Dosen, Strahinja; Popović, Dejan B
2009-05-01
The overall goal of the research is to improve control for electrical stimulation-based assistance of walking in hemiplegic individuals. We present the simulation for generating offline input (sensors)-output (intensity of muscle stimulation) representation of walking that serves in synthesizing a rule-base for control of electrical stimulation for restoration of walking. The simulation uses a new algorithm termed moving-window dynamic optimization (MWDO). The optimization criterion was to minimize the sum of the squares of tracking errors from desired trajectories with a penalty function on the total muscle efforts. The MWDO was developed in the MATLAB environment and tested using target trajectories characteristic for slow-to-normal walking recorded in a healthy individual and a model with the parameters characterizing the potential hemiplegic user. The outputs of the simulation are piecewise constant intensities of electrical stimulation and trajectories generated when the calculated stimulation is applied to the model. We demonstrated the importance of this simulation by showing the outputs for healthy and hemiplegic individuals, using the same target trajectories. Results of the simulation show that the MWDO is an efficient tool for analyzing achievable trajectories and for determining the stimulation profiles that need to be delivered for good tracking.
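A minimal sketch of the optimization criterion described above (not the authors' MATLAB code): squared tracking error from the desired trajectory plus a penalty on total muscle effort, evaluated over one moving window. The weight w_effort and the demonstration arrays are assumptions.

```python
import numpy as np

def window_cost(q_actual, q_desired, stim_levels, w_effort=0.01):
    """Sum of squared tracking errors plus a penalty on total muscle effort."""
    tracking = np.sum((np.asarray(q_actual) - np.asarray(q_desired)) ** 2)
    effort = np.sum(np.asarray(stim_levels) ** 2)
    return tracking + w_effort * effort

# Example: joint-angle tracking over one 50-sample window with two muscle channels.
q_des = np.sin(np.linspace(0, np.pi, 50))
q_act = q_des + 0.02 * np.random.randn(50)
stim = np.random.rand(2, 50)              # piecewise constant in an MWDO-style run
print(window_cost(q_act, q_des, stim))
```

In an MWDO-style loop this cost would be minimized for each window in turn, with the stimulation intensities held piecewise constant within the window.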
Hussein, Sami; Kruger, Jörg
2011-01-01
Robot assisted training has proven beneficial as an extension of conventional therapy to improve rehabilitation outcome. Further facilitation of this positive impact is expected from the application of cooperative control algorithms to increase the patient's contribution to the training effort according to his level of ability. This paper presents an approach for cooperative training for end-effector based gait rehabilitation devices, and thereby provides the basis for establishing sophisticated cooperative control methods in this class of devices. It uses a haptic control framework to synthesize and render complex, task specific training environments, which are composed of polygonal primitives. Training assistance is integrated as part of the environment into the haptic control framework. A compliant window is moved along a nominal training trajectory, compliantly guiding and supporting the foot motion. The level of assistance is adjusted via the stiffness of the moving window. Further, an iterative learning algorithm is used to automatically adjust this assistance level. Stable haptic rendering of the dynamic training environments and adaptive movement assistance have been evaluated in two example training scenarios: treadmill walking and stair climbing. Data from preliminary trials with one healthy subject are provided in this paper. © 2011 IEEE
Model MTF for the mosaic window
NASA Astrophysics Data System (ADS)
Xing, Zhenchong; Hong, Yongfeng; Zhang, Bao
2017-10-01
An electro-optical targeting system mounted either within an airframe or housed in separate pods requires a window to form an environmental barrier to the outside world. In current practice, such a window is usually a mosaic or segmented window. When scanning the target, internally gimbaled systems sweep over the window, which can affect the modulation transfer function (MTF) due to wave-front division and optical path differences arising from the thickness/wedge differences between panes. In this paper, a mathematical model of the MTF of the mosaic window is presented that allows an analysis of influencing factors; we show how the model may be integrated into ZEMAX® software for optical design. The model can be used to guide both the design and the tolerance analysis of optical systems that employ a mosaic window.
Rivera, Diego; Lillo, Mario; Granda, Stalin
2014-12-01
The concept of time stability has been widely used in the design and assessment of monitoring networks of soil moisture, as well as in hydrological studies, because it is a technique that allows identification of particular locations that represent mean values of soil moisture in the field. In this work, we assess the effect of time stability calculations as new information is added and how time stability calculations are affected over shorter periods, subsampled from the original time series, containing different amounts of precipitation. In doing so, we defined two experiments to explore the time stability behavior. The first experiment sequentially adds new data to the previous time series to investigate the long-term influence of new data in the results. The second experiment applies a windowing approach, taking sequential subsamples from the entire time series to investigate the influence of short-term changes associated with the precipitation in each window. Our results from an operating network (seven monitoring points equipped with four sensors each in a 2-ha blueberry field) show that as information is added to the time series, there are changes in the location of the most stable point (MSP), and that when taking the moving 21-day windows, it is clear that most of the variability of soil water content changes is associated with both the amount and intensity of rainfall. The changes of the MSP over each window depend on the amount of water entering the soil and the previous state of the soil water content. For our case study, the upper strata are proxies for hourly to daily changes in soil water content, while the deeper strata are proxies for medium-range stored water. Thus, different locations and depths are representative of processes at different time scales. This situation must be taken into account when water management depends on soil water content values from fixed locations.
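An assumed sketch of the windowed time-stability calculation (the paper does not give code): for each 21-day subsample the mean relative difference (MRD) of every monitoring point is computed, and the most stable point is the location whose MRD is closest to zero.

```python
import numpy as np

def most_stable_point(theta, window=21, step=21):
    """theta: array (n_times, n_locations) of soil water content."""
    msp_per_window = []
    for start in range(0, theta.shape[0] - window + 1, step):
        block = theta[start:start + window]                     # one subsample
        field_mean = block.mean(axis=1, keepdims=True)           # spatial mean per time
        mrd = ((block - field_mean) / field_mean).mean(axis=0)   # one value per location
        msp_per_window.append(int(np.argmin(np.abs(mrd))))
    return msp_per_window

theta = np.random.rand(200, 7)        # 200 days x 7 monitoring points (demo data)
print(most_stable_point(theta))       # MSP index for each 21-day window
```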
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chitsazzadeh, S; Wells, D; Mestrovic, A
2016-06-15
Purpose: To develop a QA procedure for gated VMAT stereotactic ablative radiotherapy (SABR) treatments. Methods: An interface was constructed to attach the translational stage of a Quasar respiratory motion phantom to a pinpoint ion chamber insert and move the ion chamber inside an ArcCheck diode array. The Quasar phantom controller used a patient specific breathing pattern to translate the ion chamber in a superior-inferior direction inside the ArcCheck. An amplitude-based RPM tracking system was specified to turn the beam on during the exhale phase of the breathing pattern. SABR plans were developed using Eclipse for liver PTVs ranging in size from 3-12 cm in diameter using a 2-arc VMAT technique. Dose was measured in the middle of the penumbra region, where the high dose gradient allowed for sensitive detection of any inaccuracies in gated dose delivery. The overall fidelity of the dose distribution was confirmed using ArcCheck. The sensitivity of the gating QA procedure was investigated with respect to the following four parameters: PTV size, duration of exhale, baseline drift, and gating window size. Results: The difference between the measured dose to a point in the penumbra and the Eclipse calculated dose was under 2% for small residual motions. The QA procedure was independent of PTV size and duration of exhale. Baseline drift and gating window size, however, significantly affected the penumbral dose measurement, with differences of up to 30% compared to Eclipse. Conclusion: This study described a highly sensitive QA procedure for gated VMAT SABR treatments. The QA outcome was dependent on the gating window size and baseline drift. Analysis of additional patient breathing patterns will be required to determine a clinically relevant gating window size and an appropriate tolerance level for this procedure.
Weibull Analysis and Area Scaling for Infrared Window Materials (U)
2016-08-01
This report illustrates how the strength of a window scales inversely with the size of the window. The report was reviewed for technical accuracy by Howard Poisl, Thomas M... Test data are given for aluminum oxynitride (ALON), calcium fluoride, chemical vapor... failure of an optical window in the absence of slow crack growth.
2015-01-19
Proteus measurement and analysis software for data acquisition, storage and evaluation with MS WINDOWS platform, which enables multitasking with simultaneous evaluation and operation.
VizieR Online Data Catalog: OGLE UBVI phot. in Baade's Window (Paczynski+, 1999)
NASA Astrophysics Data System (ADS)
Paczynski, B.; Udalski, A.; Szymanski, M.; Kubiak, M.; Pietrzynski, G.; Soszynski, I.; Wozniak, P.; Zebrun, K.
2000-01-01
We present UBVI photometry for 8530 stars in Baade's Window obtained during the OGLE-II microlensing survey. Among these are over one thousand red clump giants. 1391 of them have photometry with errors smaller than 0.04, 0.06, 0.12, and 0.20 mag in the I, V, B, and U-band, respectively. We constructed a map of interstellar reddening. The corrected colors of the red clump giants: (U-B)0, (B-V)0, and (V-I)0 are very well correlated, indicating that a single parameter determines the observed spread of their values, reaching almost 2mag in the (U-B)0. It seems most likely that heavy element content is the dominant parameter, but it is possible that another parameter: the age (or mass) of a star moves it along the same trajectory in the color-color diagram as the metallicity. The current ambiguity can be resolved with spectral analysis, and our catalog may be useful as a finding list of red clump giants. We point out that these K giants are more suitable for a fair determination of the distribution of metallicity than brighter M giants. We also present a compilation of UBVI data for 308 red clump giants near the Sun, for which Hipparcos parallaxes are more accurate than 10%. Spectral analysis of their metallicity may provide information about the local metallicity distribution as well as the extent to which mass (age) of these stars affects their colors. (3 data files).
NASA Astrophysics Data System (ADS)
Mannon, Timothy Patrick, Jr.
Improving well design has been and always will be the primary goal in drilling operations in the oil and gas industry. Oil and gas plays are continuing to move into increasingly hostile drilling environments, including near- and/or sub-salt proximities. The ability to reduce the risk and uncertainty involved in drilling operations in unconventional geologic settings starts with improving the techniques for mudweight window modeling. To address this issue, an analysis of wellbore stability and well design improvement has been conducted. This study will show a systematic approach to well design by focusing on best practices for mudweight window projection for a field in Mississippi Canyon, Gulf of Mexico. The field includes depleted reservoirs and is in close proximity to salt intrusions. Analysis of offset wells has been conducted in the interest of developing an accurate picture of the subsurface environment by making connections between depth, non-productive time (NPT) events, and mudweights used. Commonly practiced petrophysical methods of pore pressure, fracture pressure, and shear failure gradient prediction have been applied to key offset wells in order to enhance the well design for two proposed wells. For the first time in the literature, the accuracy of the commonly accepted, seismic interval velocity based and the relatively new, seismic frequency based methodologies for pore pressure prediction are qualitatively and quantitatively compared. Accuracy standards will be based on the agreement of the seismic outputs with pressure data obtained while drilling and petrophysically based pore pressure outputs for each well. The results will show significantly higher accuracy for the seismic frequency based approach in wells that were in near/sub-salt environments and higher overall accuracy for all of the wells in the study as a whole.
Burriel-Valencia, Jordi; Martinez-Roman, Javier; Sapena-Bano, Angel
2018-01-01
The aim of this paper is to introduce a new methodology for the fault diagnosis of induction machines working in the transient regime, when time-frequency analysis tools are used. The proposed method relies on the use of the optimized Slepian window for performing the short time Fourier transform (STFT) of the stator current signal. It is shown that for a given sequence length of finite duration, the Slepian window has the maximum concentration of energy, greater than can be reached with a gated Gaussian window, which is usually used as the analysis window. In this paper, the use and optimization of the Slepian window for fault diagnosis of induction machines is theoretically introduced and experimentally validated through the test of a 3.15-MW induction motor with broken bars during the start-up transient. The theoretical analysis and the experimental results show that the use of the Slepian window can highlight the fault components in the current’s spectrogram with a significant reduction of the required computational resources. PMID:29316650
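A hedged sketch of the windowing step described above: a Slepian (DPSS) sequence of the chosen length is used as the STFT analysis window for a stator-current signal. The sampling rate, window length, NW value, and synthetic signal are illustrative, not values from the paper.

```python
import numpy as np
from scipy.signal import stft
from scipy.signal.windows import dpss

fs = 10_000.0                                    # sampling rate (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
current = np.sin(2 * np.pi * 50 * t) + 0.05 * np.random.randn(t.size)

nperseg = 1024
win = dpss(nperseg, NW=2.5)                      # first Slepian sequence of length nperseg
f, tau, Zxx = stft(current, fs=fs, window=win, nperseg=nperseg, noverlap=nperseg // 2)
spectrogram = np.abs(Zxx) ** 2                   # fault components inspected here
```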
ERIC Educational Resources Information Center
Stone, Tammy; Coussons-Read, Mary
2011-01-01
Moving from a faculty position to an administrative office frequently entails gaining considerable responsibility, but ambiguous power. The hope of these two authors is that this volume will serve as a reference and a source of support for current associate and assistant deans and as a window into these jobs for faculty who may be considering such…
Air bubble migration is a random event post embryo transfer.
Confino, E; Zhang, J; Risquez, F
2007-06-01
Air bubble location following embryo transfer (ET) is the presumable placement spot of embryos. The purpose of this study was to document endometrial air bubble position and migration following embryo transfer. Multicenter prospective case study. Eighty-eight embryo transfers were performed under abdominal ultrasound guidance in two countries by two authors. A single or double air bubble was loaded with the embryos using a soft, coaxial, end-opened catheter. The embryos were slowly injected 10-20 mm from the fundus. Air bubble position was recorded immediately, 30 minutes later and when the patient stood up. Bubble marker location analysis revealed a random distribution without visible gravity effect when the patients stood up. The bubble markers demonstrated splitting, moving in all directions and dispersion. Air bubbles move and split frequently post ET with the patient in the horizontal position, suggestive of active uterine contractions. Bubble migration analysis supports a rather random movement of the bubbles and possibly the embryos. Standing up somewhat changed bubble configuration and distribution in the uterine cavity. Gravity-related bubble motion was uncommon, suggesting that horizontal rest post ET may not be necessary. This report challenges the common belief that a very accurate ultrasound guided embryo placement is mandatory. The very random bubble movement observed in this two-center study suggests that a large "window" of embryo placement may be present.
Earthquake Occurrence in Bangladesh and Surrounding Region
NASA Astrophysics Data System (ADS)
Al-Hussaini, T. M.; Al-Noman, M.
2011-12-01
The collision of the northward moving Indian plate with the Eurasian plate is the cause of frequent earthquakes in the region comprising Bangladesh and neighbouring India, Nepal and Myanmar. Historical records indicate that Bangladesh has been affected by five major earthquakes of magnitude greater than 7.0 (Richter scale) between 1869 and 1930. This paper presents some statistical observations of earthquake occurrence as part of the basic groundwork for seismic hazard assessment of this region. An up to date catalogue covering earthquake information in the region bounded within 17°-30°N and 84°-97°E, from the historical period to 2010, is derived from various reputed international sources including ISC, IRIS, Indian sources and available publications. Careful scrutiny is done to remove duplicate or uncertain earthquake events. Earthquake magnitudes in the range of 1.8 to 8.1 have been obtained and relationships between different magnitude scales have been studied. Aftershocks are removed from the catalogue using magnitude dependent space and time windows. The main shock data are then analyzed to obtain the completeness period for different magnitudes by evaluating their temporal homogeneity. Spatial and temporal distribution of earthquakes, magnitude-depth histograms and other statistical analyses are performed to understand the distribution of seismic activity in this region.
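An illustrative declustering sketch for the aftershock-removal step: events falling inside a magnitude-dependent space window (km) and time window (days) around a larger shock are flagged as aftershocks. The window formulas below follow the widely used Gardner-Knopoff (1974) form and are an assumption; the abstract does not state which windows were applied.

```python
import numpy as np

def gk_windows(mag):
    """Gardner-Knopoff style space (km) and time (days) windows for magnitude mag."""
    space_km = 10 ** (0.1238 * mag + 0.983)
    time_days = np.where(mag >= 6.5,
                         10 ** (0.032 * mag + 2.7389),
                         10 ** (0.5409 * mag - 0.547))
    return space_km, time_days

def flag_aftershocks(mags, times_days, dists_km):
    """dists_km[i, j] is the epicentral distance between events i and j."""
    mags = np.asarray(mags, float)
    times_days = np.asarray(times_days, float)
    dists_km = np.asarray(dists_km, float)
    is_aftershock = np.zeros(mags.size, dtype=bool)
    for i in np.argsort(mags)[::-1]:             # largest shocks first
        if is_aftershock[i]:
            continue
        s_win, t_win = gk_windows(mags[i])
        dt = times_days - times_days[i]
        inside = (dists_km[i] <= s_win) & (dt > 0) & (dt <= t_win)
        inside[i] = False
        is_aftershock |= inside
    return is_aftershock
```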
End effector of the Discovery's RMS with tools moves toward Syncom-IV
1985-04-17
51D-44-046 (17 April 1985) --- The Space Shuttle Discovery's Remote Manipulator System (RMS) arm and two specially designed extensions move toward the troubled Syncom-IV (LEASAT) communications satellite during a station keeping mode of the two spacecraft in Earth orbit. Inside the Shuttle's cabin, astronaut Rhea Seddon, 51D mission specialist, controlled the Canadian-built arm in an attempt to move an external lever on the satellite. Crewmembers learned of the satellite's problems shortly after it was deployed from the cargo bay on April 13, 1985. The arm achieved physical contact with the lever as planned. However, the satellite did not respond to the contact as hoped. A 70mm handheld Hasselblad camera, aimed through Discovery's windows, recorded this frame -- one of the first to be released to news media following return of the seven-member crew on April 17, 1985.
Advances and applications of ABCI
NASA Astrophysics Data System (ADS)
Chin, Y. H.
1993-05-01
ABCI (Azimuthal Beam Cavity Interaction) is a computer program which solves the Maxwell equations directly in the time domain when a Gaussian beam goes through an axi-symmetrical structure on or off axis. Many new features have been implemented in the new version of ABCI (presently version 6.6), including the 'moving mesh' and Napoly's method of calculation of wake potentials. The mesh is now generated only for the part of the structure inside a window and moves together with the window frame. This moving mesh option reduces the number of mesh points considerably, and very fine meshes can be used. Napoly's integration method makes it possible to compute wake potentials in a structure such as a collimator, where parts of the cavity material are at smaller radii than that of the beam pipes, in such a way that the contribution from the beam pipes vanishes. For the monopole wake potential, ABCI can be applied even to structures with unequal beam pipe radii. Furthermore, the radial mesh size can be varied over the structure, permitting use of a fine mesh only where actually needed. With these improvements, the program allows computation of wake fields for structures far too complicated for older codes. Plots of a cavity shape and wake potentials can be obtained in the form of a Top Drawer file. The program can also calculate and plot the impedance of a structure and/or the distribution of the deposited energy as a function of the frequency from Fourier transforms of wake potentials. Its usefulness is illustrated by showing some numerical examples.
Muon catalyzed fusion beam window mechanical strength testing and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ware, A.G.; Zabriskie, J.M.
A thin aluminum window (0.127 mm (0.005-inch) thick x 146 mm (5 3/4-inch) diameter) of 2024-T6 alloy was modeled and analyzed using the ABAQUS non-linear finite element analysis code. A group of windows was fabricated, heat-treated and subsequently tested. Testing included both ultimate burst pressure and fatigue. Fatigue testing cycles involved "oil-canning" behavior representing vacuum purge and reversal to pressure. Test results are compared to predictions and the mode of failure is discussed. Operational requirements, based on the above analysis and correlational testing, for the actual beam windows are discussed. 1 ref., 3 figs.
WINDOWS: a program for the analysis of spectral data foil activation measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stallmann, F.W.; Eastham, J.F.; Kam, F.B.K.
The computer program WINDOWS together with its subroutines is described for the analysis of neutron spectral data from foil activation measurements. In particular, it performs the unfolding of the neutron differential spectrum and provides estimated windows and detector contributions, upper and lower bounds for an integral response, and group fluxes obtained from neutron transport calculations. 116 references. (JFP)
Apparatus and method for solar coal gasification
Gregg, David W.
1980-01-01
Apparatus for using focused solar radiation to gasify coal and other carbonaceous materials. Incident solar radiation is focused from an array of heliostats onto a tower-mounted secondary mirror which redirects the focused solar radiation down through a window onto the surface of a vertically-moving bed of coal, or a fluidized bed of coal, contained within a gasification reactor. The reactor is designed to minimize contact between the window and solids in the reactor. Steam introduced into the gasification reactor reacts with the heated coal to produce gas consisting mainly of carbon monoxide and hydrogen, commonly called "synthesis gas", which can be converted to methane, methanol, gasoline, and other useful products. One of the novel features of the invention is the generation of process steam at the rear surface of the secondary mirror.
Apparatus for solar coal gasification
Gregg, D.W.
Apparatus for using focused solar radiation to gasify coal and other carbonaceous materials is described. Incident solar radiation is focused from an array of heliostats onto a tower-mounted secondary mirror which redirects the focused solar radiation down through a window onto the surface of a vertically-moving bed of coal, or a fluidized bed of coal, contained within a gasification reactor. The reactor is designed to minimize contact between the window and solids in the reactor. Steam introduced into the gasification reactor reacts with the heated coal to produce gas consisting mainly of carbon monoxide and hydrogen, commonly called synthesis gas, which can be converted to methane, methanol, gasoline, and other useful products. One of the novel features of the invention is the generation of process steam at the rear surface of the secondary mirror.
Effects of Spatio-Temporal Aliasing on Out-the-Window Visual Systems
NASA Technical Reports Server (NTRS)
Sweet, Barbara T.; Stone, Leland S.; Liston, Dorion B.; Hebert, Tim M.
2014-01-01
Designers of out-the-window visual systems face a challenge when attempting to simulate the outside world as viewed from a cockpit. Many methodologies have been developed and adopted to aid in the depiction of particular scene features, or levels of static image detail. However, because aircraft move, it is necessary to also consider the quality of the motion in the simulated visual scene. When motion is introduced in the simulated visual scene, perceptual artifacts can become apparent. A particular artifact related to image motion, spatio-temporal aliasing, will be addressed. The causes of spatio-temporal aliasing will be discussed, and current knowledge regarding the impact of these artifacts on both motion perception and simulator task performance will be reviewed. Methods of reducing the impact of this artifact are also addressed.
Robotic Attention Processing And Its Application To Visual Guidance
NASA Astrophysics Data System (ADS)
Barth, Matthew; Inoue, Hirochika
1988-03-01
This paper describes a method of real-time visual attention processing for robots performing visual guidance. This robot attention processing is based on a novel vision processor, the multi-window vision system that was developed at the University of Tokyo. The multi-window vision system is unique in that it only processes visual information inside local area windows. These local area windows are quite flexible in their ability to move anywhere on the visual screen, change their size and shape, and alter their pixel sampling rate. By using these windows for specific attention tasks, it is possible to perform high speed attention processing. The primary attention skills of detecting motion, tracking an object, and interpreting an image are all performed at high speed on the multi-window vision system. A basic robotic attention scheme using the attention skills was developed. The attention skills involved detection and tracking of salient visual features. The tracking and motion information thus obtained was utilized in producing the response to the visual stimulus. The response of the attention scheme was quick enough to be applicable to the real-time vision processing tasks of playing a video 'pong' game, and later using an automobile driving simulator. By detecting the motion of a 'ball' on a video screen and then tracking the movement, the attention scheme was able to control a 'paddle' in order to keep the ball in play. The response was faster than that of a human, allowing the attention scheme to play the video game at higher speeds. Further, in the application to the driving simulator, the attention scheme was able to control both direction and velocity of a simulated vehicle following a lead car. These two applications show the potential of local visual processing in its use for robotic attention processing.
Cross, Troy J.; Keller-Ross, Manda; Issa, Amine; Wentz, Robert; Taylor, Bryan; Johnson, Bruce
2015-01-01
Study Objectives: To determine the impact of averaging window-length on the “desaturation” indexes (DIs) obtained via overnight pulse oximetry (SpO2) at high altitude. Design: Overnight SpO2 data were collected during a 10-day sojourn at high altitude. SpO2 was obtained using a commercial wrist-worn finger oximeter whose firmware was modified to store unaveraged beat-to-beat data. Simple moving averages of window lengths spanning 2 to 20 cardiac beats were retrospectively applied to beat-to-beat SpO2 datasets. After SpO2 artifacts were removed, the following DIs were then calculated for each of the averaged datasets: oxygen desaturation index (ODI); total sleep time with SpO2 < 80% (TST < 80), and the lowest SpO2 observed during sleep (SpO2 low). Setting: South Base Camp, Mt. Everest (5,364 m elevation). Participants: Five healthy, adult males (35 ± 5 y; 180 ± 1 cm; 85 ± 4 kg). Interventions: N/A. Measurements and Results: 49 datasets were obtained from the 5 participants, totalling 239 hours of data. For all window lengths ≥ 2 beats, ODI and TST < 80 were lower, and SpO2 low was higher than those values obtained from the beat-to-beat SpO2 time series data (P < 0.05). Conclusions: Our findings indicate that increasing oximeter averaging window length progressively underestimates the frequency and magnitude of sleep disordered breathing events at high altitude, as indirectly assessed via the desaturation indexes. Citation: Cross TJ, Keller-Ross M, Issa A, Wentz R, Taylor B, Johnson B. The impact of averaging window length on the “desaturation” indexes obtained via overnight pulse oximetry at high altitude. SLEEP 2015;38(8):1331–1334. PMID:25581919
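A sketch (assumed, not the study's code) of the retrospective averaging step and one of the desaturation indexes: a simple moving average of a given window length in beats is applied to the beat-to-beat SpO2 series, and desaturation events are then counted as drops of at least 4% below a running baseline. The 4% criterion and the 100-beat baseline are illustrative assumptions.

```python
import numpy as np

def moving_average(spo2, window_beats):
    """Simple moving average over a window of `window_beats` cardiac beats."""
    kernel = np.ones(window_beats) / window_beats
    return np.convolve(spo2, kernel, mode="valid")

def desaturation_index(spo2, hours, drop=4.0, baseline_beats=100):
    """Desaturation events per hour, counted against a running-maximum baseline."""
    spo2 = np.asarray(spo2, float)
    events, in_event = 0, False
    for i, value in enumerate(spo2):
        baseline = spo2[max(0, i - baseline_beats):i + 1].max()
        if not in_event and value <= baseline - drop:
            events, in_event = events + 1, True
        elif in_event and value > baseline - drop:
            in_event = False
    return events / hours

spo2_raw = 88 + 4 * np.random.randn(3600)        # fake beat-to-beat series (~1 h)
print(desaturation_index(moving_average(spo2_raw, 8), hours=1.0))
```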
Sink detection on tilted terrain for automated identification of glacial cirques
NASA Astrophysics Data System (ADS)
Prasicek, Günther; Robl, Jörg; Lang, Andreas
2016-04-01
Glacial cirques are morphologically distinct but complex landforms and represent a vital part of high mountain topography. Their distribution, elevation and relief are expected to hold information on (1) the extent of glacial occupation, (2) the mechanism of glacial cirque erosion, and (3) how glacial in concert with periglacial processes can limit peak altitude and mountain range height. While easily detectable to the expert's eye both in nature and on various representations of topography, their complicated nature makes them a nemesis for computer algorithms. Consequently, manual mapping of glacial cirques is commonplace in many mountain landscapes worldwide, but consistent datasets of cirque distribution and objectively mapped cirques and their morphometrical attributes are lacking. Among the biggest problems for algorithm development are the complexity in shape and the great variability of cirque size. For example, glacial cirques can be rather circular or longitudinal in extent, exist as individual and composite landforms, show prominent topographic depressions or can entirely be filled with water or sediment. For these reasons, attributes like circularity, size, drainage area and topology of landform elements (e.g. a flat floor surrounded by steep walls) have only a limited potential for automated cirque detection. Here we present a novel, geomorphometric method for automated identification of glacial cirques on digital elevation models that exploits their genetic bowl-like shape. First, we differentiate between glacial and fluvial terrain employing an algorithm based on a moving window approach and multi-scale curvature, which is also capable of fitting the analysis window to valley width. We then fit a plane to the valley stretch clipped by the analysis window and rotate the terrain around the center cell until the plane is level. Doing so, we produce sinks of considerable size if the clipped terrain represents a cirque, while no or only very small sinks develop on other valley stretches. We normalize sink area by window size for sink classification, apply this method to the Sawtooth Mountains, Idaho, and to Fiordland, New Zealand, and compare the results to manually mapped reference cirques. Results indicate that false negatives are produced only in very rugged terrain and false positives occur in rare cases, when valleys are strongly curved in longitudinal direction.
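A rough sketch of the leveling step described above (an approximation, not the authors' implementation): a least-squares plane is fitted to the windowed terrain and subtracted, which is the detrending equivalent of rotating the window until the plane is level; cells lying well below the plane then mark a bowl-shaped depression whose area is normalized by window size. The 2 m depth threshold and the synthetic window are assumptions.

```python
import numpy as np

def detrend_window(z):
    """Remove the best-fit plane from a 2-D elevation window."""
    ny, nx = z.shape
    x, y = np.meshgrid(np.arange(nx), np.arange(ny))
    A = np.column_stack([x.ravel(), y.ravel(), np.ones(z.size)])
    coeffs, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)
    return z - (A @ coeffs).reshape(z.shape)

def normalized_sink_area(z, depth_threshold=2.0):
    """Fraction of window cells lying more than depth_threshold below the fitted plane."""
    residual = detrend_window(z)
    return np.count_nonzero(residual < -depth_threshold) / z.size

window = np.random.rand(101, 101) * 5.0           # placeholder terrain window
print(normalized_sink_area(window))
```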
Traffic-Related Air Pollution, Blood Pressure, and Adaptive Response of Mitochondrial Abundance.
Zhong, Jia; Cayir, Akin; Trevisi, Letizia; Sanchez-Guerra, Marco; Lin, Xinyi; Peng, Cheng; Bind, Marie-Abèle; Prada, Diddier; Laue, Hannah; Brennan, Kasey J M; Dereix, Alexandra; Sparrow, David; Vokonas, Pantel; Schwartz, Joel; Baccarelli, Andrea A
2016-01-26
Exposure to black carbon (BC), a tracer of vehicular-traffic pollution, is associated with increased blood pressure (BP). Identifying biological factors that attenuate BC effects on BP can inform prevention. We evaluated the role of mitochondrial abundance, an adaptive mechanism compensating for cellular-redox imbalance, in the BC-BP relationship. At ≥ 1 visits among 675 older men from the Normative Aging Study (observations=1252), we assessed daily BP and ambient BC levels from a stationary monitor. To determine blood mitochondrial abundance, we used whole blood to analyze mitochondrial-to-nuclear DNA ratio (mtDNA/nDNA) using quantitative polymerase chain reaction. Every standard deviation increase in the 28-day BC moving average was associated with 1.97 mm Hg (95% confidence interval [CI], 1.23-2.72; P<0.0001) and 3.46 mm Hg (95% CI, 2.06-4.87; P<0.0001) higher diastolic and systolic BP, respectively. Positive BC-BP associations existed throughout all time windows. BC moving averages (5-day to 28-day) were associated with increased mtDNA/nDNA; every standard deviation increase in 28-day BC moving average was associated with 0.12 standard deviation (95% CI, 0.03-0.20; P=0.007) higher mtDNA/nDNA. High mtDNA/nDNA significantly attenuated the BC-systolic BP association throughout all time windows. The estimated effect of 28-day BC moving average on systolic BP was 1.95-fold larger for individuals at the lowest mtDNA/nDNA quartile midpoint (4.68 mm Hg; 95% CI, 3.03-6.33; P<0.0001), in comparison with the top quartile midpoint (2.40 mm Hg; 95% CI, 0.81-3.99; P=0.003). In older adults, short-term to moderate-term ambient BC levels were associated with increased BP and blood mitochondrial abundance. Our findings indicate that increased blood mitochondrial abundance is a compensatory response and attenuates the cardiac effects of BC. © 2015 American Heart Association, Inc.
NASA Astrophysics Data System (ADS)
Ramezanzadeh, B.; Arman, S. Y.; Mehdipour, M.; Markhali, B. P.
2014-01-01
In this study, the corrosion inhibition properties of two similar heterocyclic compounds, namely benzotriazole (BTA) and benzothiazole (BNS) inhibitors, on copper in 1.0 M H2SO4 solution were studied by electrochemical techniques as well as surface analysis. The results showed that corrosion inhibition of copper largely depends on the molecular structure and concentration of the inhibitors. The effect of the DC trend on the interpretation of electrochemical noise (ECN) results in the time domain was evaluated by the moving average removal (MAR) method. Accordingly, the impact of square and Hanning window functions as drift removal methods in the frequency domain was studied. After DC trend removal, good agreement was observed between the ECN data and the results obtained from EIS and potentiodynamic polarization. Furthermore, the shot noise theory in the frequency domain was applied to estimate the charge of each electrochemical event (q) from the potential and current noise signals.
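A minimal sketch, assuming a symmetric running mean, of moving average removal (MAR) as a DC-trend filter for electrochemical noise records: the running mean over a window of p samples is subtracted from the signal before time-domain statistics are computed. The window length and synthetic record are illustrative.

```python
import numpy as np

def moving_average_removal(signal, p=201):
    """Subtract a running mean of length p (p should be odd) from the signal."""
    pad = p // 2
    padded = np.pad(signal, pad, mode="edge")
    trend = np.convolve(padded, np.ones(p) / p, mode="valid")   # same length as signal
    return signal - trend

i_noise = np.cumsum(np.random.randn(5000)) * 1e-9               # drifting current record
detrended = moving_average_removal(i_noise, p=201)
```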
NASA Astrophysics Data System (ADS)
Liu, Meixian; Xu, Xianli; Sun, Alex
2015-07-01
Climate extremes can cause devastating damage to human society and ecosystems. Recent studies have drawn many conclusions about trends in climate extremes, but few have focused on quantitative analysis of their spatial variability and underlying mechanisms. By using the techniques of overlapping moving windows, the Mann-Kendall trend test, correlation, and stepwise regression, this study examined the spatial-temporal variation of precipitation extremes and investigated the potential key factors influencing this variation in southwestern (SW) China, a globally important biodiversity hot spot and climate-sensitive region. Results showed that the changing trends of precipitation extremes were not spatially uniform, but the spatial variability of these precipitation extremes decreased from 1959 to 2012. Further analysis found that atmospheric circulations rather than local factors (land cover, topographic conditions, etc.) were the main cause of such precipitation extremes. This study suggests that droughts or floods may become more homogenously widespread throughout SW China. Hence, region-wide assessments and coordination are needed to help mitigate the economic and ecological impacts.
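A sketch of a Mann-Kendall trend test applied inside overlapping moving windows, as one way to implement the combination named above; the statistic uses the standard normal approximation without tie correction, and the window and step lengths are illustrative choices.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall_z(x):
    """Mann-Kendall Z statistic (normal approximation, no tie correction)."""
    x = np.asarray(x, float)
    n = x.size
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / np.sqrt(var_s)
    if s < 0:
        return (s + 1) / np.sqrt(var_s)
    return 0.0

def windowed_trends(series, window=30, step=5, alpha=0.05):
    """(window start, Z, significant?) for each overlapping window."""
    out = []
    for start in range(0, len(series) - window + 1, step):
        z = mann_kendall_z(series[start:start + window])
        out.append((start, z, abs(z) > norm.ppf(1 - alpha / 2)))
    return out

series = np.cumsum(np.random.randn(200))          # demo series with a wandering trend
print(windowed_trends(series, window=60, step=20)[:3])
```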
Death from a driverless vehicle.
Das, Siddhartha; Menezes, Ritesh G
2018-03-01
Road traffic accidents are a major cause of fatalities around the world, and a number of deaths are caused by moving traffic on public roads. Deaths from vehicles that are off the highway may be called non-traffic fatalities which can be due to a vehicle reversing, carbon monoxide poisoning, weather-induced over-heating inside the vehicle and electric windows. Children (and animals) are the usual victims. We report a case from India where a man was found lying dead by the roadside with a lorry nearby. The autopsy findings indicated that he had been run over, but as there was no history of a vehicular collision and with no eyewitnesses, the investigators were unsure of the probable sequence of events that led to his death. The autopsy findings, history, circumstantial evidence and chemical analysis enabled us to work out what had happened.
Exponential smoothing weighted correlations
NASA Astrophysics Data System (ADS)
Pozzi, F.; Di Matteo, T.; Aste, T.
2012-06-01
In many practical applications, correlation matrices might be affected by the "curse of dimensionality" and by an excessive sensitivity to outliers and remote observations. These shortcomings can cause problems of statistical robustness that are especially accentuated when a system of dynamic correlations over a running window is concerned. These drawbacks can be partially mitigated by assigning a structure of weights to observational events. In this paper, we discuss Pearson's ρ and Kendall's τ correlation matrices, weighted with an exponential smoothing, computed on moving windows using a data-set of daily returns for 300 NYSE highly capitalized companies in the period between 2001 and 2003. Criteria for jointly determining optimal weights together with the optimal length of the running window are proposed. We find that the exponential smoothing can provide more robust and reliable dynamic measures, and we show that a careful choice of the parameters can reduce the autocorrelation of dynamic correlations whilst keeping significance and robustness of the measure. Weighted correlations are found to be smoother and to recover faster from market turbulence than their unweighted counterparts, helping also to discriminate more effectively genuine from spurious correlations.
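A sketch of an exponentially smoothed Pearson correlation over a running window of daily returns; the decay constant theta and the window length are illustrative parameters, not the values used in the paper.

```python
import numpy as np

def exp_weights(window, theta):
    """Normalized exponential weights; the newest observations weigh the most."""
    w = np.exp(np.arange(window) / theta)
    return w / w.sum()

def weighted_corr(x, y, w):
    """Pearson correlation with observation weights w (summing to one)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = np.sum(w * x), np.sum(w * y)
    cov = np.sum(w * (x - mx) * (y - my))
    return cov / np.sqrt(np.sum(w * (x - mx) ** 2) * np.sum(w * (y - my) ** 2))

def rolling_weighted_corr(rx, ry, window=250, theta=50.0):
    """Weighted correlation for each running window of two return series."""
    w = exp_weights(window, theta)
    return [weighted_corr(rx[t - window:t], ry[t - window:t], w)
            for t in range(window, len(rx) + 1)]
```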
NASA Astrophysics Data System (ADS)
Abrokwah, K.; O'Reilly, A. M.
2017-12-01
Groundwater is an important resource that is extracted every day because of its invaluable use for domestic, industrial and agricultural purposes. The need for sustaining groundwater resources is clearly indicated by declining water levels and has led to modeling and forecasting accurate groundwater levels. In this study, spectral decomposition of climatic forcing time series was used to develop hybrid wavelet analysis (WA) and moving window average (MWA) artificial neural network (ANN) models. These techniques are explored by modeling historical groundwater levels in order to provide understanding of potential causes of the observed groundwater-level fluctuations. Selection of the appropriate decomposition level for WA and window size for MWA helps in understanding the important time scales of climatic forcing, such as rainfall, that influence water levels. The discrete wavelet transform (DWT) is used to decompose the input time-series data into various levels of approximation and detail wavelet coefficients, whilst MWA acts as a low-pass signal-filtering technique for removing high-frequency signals from the input data. The variables used to develop and validate the models were daily average rainfall measurements from five National Oceanic and Atmospheric Administration (NOAA) weather stations and daily water-level measurements from two wells recorded from 1978 to 2008 in central Florida, USA. Using different decomposition levels and different window sizes, several WA-ANN and MWA-ANN models for simulating the water levels were created and their relative performances compared against each other. The WA-ANN models performed better than the corresponding MWA-ANN models; also, higher decomposition levels of the input signal by the DWT gave the best results. The results obtained show the applicability and feasibility of hybrid WA-ANN and MWA-ANN models for simulating daily water levels using only climatic forcing time series as model inputs.
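A sketch of the two input pre-processing schemes named above (illustrative parameters, not the study's configuration): PyWavelets splits a daily rainfall series into approximation and detail coefficients for a WA-ANN, while a simple moving window average low-pass filters the same series for an MWA-ANN.

```python
import numpy as np
import pywt

rain = np.random.gamma(0.3, 5.0, size=3650)       # placeholder daily rainfall series

# Wavelet analysis (WA) inputs: multilevel DWT, e.g. level 3 with a Daubechies-4 wavelet.
coeffs = pywt.wavedec(rain, "db4", level=3)       # [cA3, cD3, cD2, cD1]

# Moving window average (MWA) inputs: e.g. a 30-day low-pass window.
window = 30
mwa = np.convolve(rain, np.ones(window) / window, mode="valid")
```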
The Flash-Preview Moving Window Paradigm: Unpacking Visual Expertise One Glimpse at a Time
ERIC Educational Resources Information Center
Litchfield, Damien; Donovan, Tim
2017-01-01
How we make sense of what we see and where best to look is shaped by our experience, our current task goals and how we first perceive our environment. An established way of demonstrating these factors work together is to study how eye movement patterns change as a function of expertise and to observe how experts can solve complex tasks after only…
Impact of Advanced Avionics Technology on Ground Attack Weapon Systems.
1982-02-01
as the relevant feature. 3.0 Problem: The task is to perform the automatic cueing of moving objects in a natural environment. Additional problems... views on this subject to the American Defense Preparedness Association (ADPA) on 11 February 1981 in Orlando, Florida. ENVIRONMENTAL CONDITIONS... the operating window or the environmental conditions of combat that our forces may encounter worldwide. The three areas selected were Europe, the
ERIC Educational Resources Information Center
Branzburg, Jeffrey
2007-01-01
Interactive whiteboards have made quite a splash in classrooms in recent years. When a computer image is projected on the whiteboard using an LCD projector, users can directly control the computer from the whiteboard. In some systems such as Smart and Mimio, the finger is used in place of a mouse to open and run programs or move windows around. In…
ERIC Educational Resources Information Center
Whitford, Veronica; O'Driscoll, Gillian A.; Pack, Christopher C.; Joober, Ridha; Malla, Ashok; Titone, Debra
2013-01-01
Language and oculomotor disturbances are 2 of the best replicated findings in schizophrenia. However, few studies have examined skilled reading in schizophrenia (e.g., Arnott, Sali, Copland, 2011; Hayes & O'Grady, 2003; Revheim et al., 2006; E. O. Roberts et al., 2012), and none have examined the contribution of cognitive and motor processes that…
Optimal routing of coordinated aircraft to Identify moving surface contacts
2017-06-01
Acronyms: TAO, Tactical Action Officer; TSP, Traveling Salesman Problem; TSPTW, TSP with Time Windows; UAV, unmanned aerial vehicle; VRP, Vehicle Routing... Orienteering Problem (OP), while the ORCA TI formulation follows the structure of a time dependent Traveling Salesman Problem (TSP), or a time dependent... Fox, Kenneth R., Bezalel Gavish, and Stephen C. Graves. 1980. "An n-Constraint Formulation of the (Time Dependent) Traveling Salesman Problem"
ERIC Educational Resources Information Center
Whitford, Veronica; Titone, Debra
2015-01-01
Eye movement measures demonstrate differences in first-language (L1) and second-language (L2) paragraph-level reading as a function of individual differences in current L2 exposure among bilinguals (Whitford & Titone, 2012). Specifically, as current L2 exposure increases, the ease of L2 word processing increases, but the ease of L1 word…
Design and DSP implementation of star image acquisition and star point fast acquiring and tracking
NASA Astrophysics Data System (ADS)
Zhou, Guohui; Wang, Xiaodong; Hao, Zhihang
2006-02-01
Star sensor is a special high accuracy photoelectric sensor. Attitude acquisition time is an important performance index of a star sensor. In this paper, the design target is to acquire 10 attitude samples per second. On the basis of analyzing the CCD signal timing and star image processing, a new design and a special parallel architecture for improving star image processing are presented in this paper. In the design, the operation of moving the data in expanded windows containing the stars to the on-chip memory of the DSP is arranged in the invalid period of the CCD frame signal. While the CCD saves the star image to memory, the DSP processes the data in the on-chip memory. This parallelism greatly improves the efficiency of processing. The scheme proposed here results in enormous savings of the memory normally required. In the scheme, the DSP HOLD mode and CPLD technology are used to create a memory shared between the CCD and DSP. The efficiency of processing is discussed in numerical tests. The five brightest stars are acquired in only 3.5 ms in the star acquisition stage. In 43 µs, the data in the five expanded windows containing the stars are moved into the internal memory of the DSP, and in 1.6 ms, five star coordinates are obtained in the star tracking stage.
Akita, Yasuyuki; Chen, Jiu-Chiuan; Serre, Marc L
2012-09-01
Geostatistical methods are widely used in estimating long-term exposures for epidemiological studies on air pollution, despite their limited capabilities to handle spatial non-stationarity over large geographic domains and the uncertainty associated with missing monitoring data. We developed a moving-window (MW) Bayesian maximum entropy (BME) method and applied this framework to estimate fine particulate matter (PM(2.5)) yearly average concentrations over the contiguous US. The MW approach accounts for the spatial non-stationarity, while the BME method rigorously processes the uncertainty associated with data missingness in the air-monitoring system. In the cross-validation analyses conducted on a set of randomly selected complete PM(2.5) data in 2003 and on simulated data with different degrees of missing data, we demonstrate that the MW approach alone leads to at least 17.8% reduction in mean square error (MSE) in estimating the yearly PM(2.5). Moreover, the MWBME method further reduces the MSE by 8.4-43.7%, with the proportion of incomplete data increased from 18.3% to 82.0%. The MWBME approach leads to significant reductions in estimation error and thus is recommended for epidemiological studies investigating the effect of long-term exposure to PM(2.5) across large geographical domains with expected spatial non-stationarity.
NASA Astrophysics Data System (ADS)
Ham, Seung-Hee; Kato, Seiji; Barker, Howard W.; Rose, Fred G.; Sun-Mack, Sunny
2014-01-01
Three-dimensional (3-D) effects on broadband shortwave top of atmosphere (TOA) nadir radiance, atmospheric absorption, and surface irradiance are examined using 3-D cloud fields obtained from one hour's worth of A-train satellite observations and one-dimensional (1-D) independent column approximation (ICA) and full 3-D radiative transfer simulations. The 3-D minus ICA differences in TOA nadir radiance multiplied by π, atmospheric absorption, and surface downwelling irradiance, denoted as πΔI, ΔA, and ΔT, respectively, are analyzed by cloud type. At the 1 km pixel scale, πΔI, ΔA, and ΔT exhibit poor spatial correlation. Once averaged with a moving window, however, better linear relationships among πΔI, ΔA, and ΔT emerge, especially for moving windows larger than 5 km and large θ0. While cloud properties and solar geometry are shown to influence the relationships amongst πΔI, ΔA, and ΔT, once they are separated by cloud type, their linear relationships become much stronger. This suggests that ICA biases in surface irradiance and atmospheric absorption can be approximated based on ICA biases in nadir radiance as a function of cloud type.
Spatiotemporal Patterns of Contact Across the Rat Vibrissal Array During Exploratory Behavior
Hobbs, Jennifer A.; Towal, R. Blythe; Hartmann, Mitra J. Z.
2016-01-01
The rat vibrissal system is an important model for the study of somatosensation, but the small size and rapid speed of the vibrissae have precluded measuring precise vibrissal-object contact sequences during behavior. We used a laser light sheet to quantify, with 1 ms resolution, the spatiotemporal structure of whisker-surface contact as five naïve rats freely explored a flat, vertical glass wall. Consistent with previous work, we show that the whisk cycle cannot be uniquely defined because different whiskers often move asynchronously, but that quasi-periodic (~8 Hz) variations in head velocity represent a distinct temporal feature on which to lock analysis. Around times of minimum head velocity, whiskers protract to make contact with the surface, and then sustain contact with the surface for extended durations (~25–60 ms) before detaching. This behavior results in discrete temporal windows in which large numbers of whiskers are in contact with the surface. These “sustained collective contact intervals” (SCCIs) were observed on 100% of whisks for all five rats. The overall spatiotemporal structure of the SCCIs can be qualitatively predicted based on information about head pose and the average whisk cycle. In contrast, precise sequences of whisker-surface contact depend on detailed head and whisker kinematics. Sequences of vibrissal contact were highly variable, equally likely to propagate in all directions across the array. Somewhat more structure was found when sequences of contacts were examined on a row-wise basis. In striking contrast to the high variability associated with contact sequences, a consistent feature of each SCCI was that the contact locations of the whiskers on the glass converged and moved more slowly on the sheet. Together, these findings lead us to propose that the rat uses a strategy of “windowed sampling” to extract an object's spatial features: specifically, the rat spatially integrates quasi-static mechanical signals across whiskers during the period of sustained contact, resembling an “enclosing” haptic procedure. PMID:26778990
Dynamic time-correlated single-photon counting laser ranging
NASA Astrophysics Data System (ADS)
Peng, Huan; Wang, Yu-rong; Meng, Wen-dong; Yan, Pei-qin; Li, Zhao-hui; Li, Chen; Pan, Hai-feng; Wu, Guang
2018-03-01
We demonstrate a photon counting laser ranging experiment with a four-channel single-photon detector (SPD). The multi-channel SPD improves the counting rate to more than 4×10⁷ cps, which makes it possible to perform the distance measurement even in daylight. However, the time-correlated single-photon counting (TCSPC) technique cannot easily extract the signal when fast moving targets are submerged in the strong background. We propose a dynamic TCSPC method for measuring fast moving targets by varying the coincidence window in real time. In the experiment, we show that targets with a velocity of 5 km/s can be detected with this method, with an echo rate of 20% at background counts of more than 1.2×10⁷ cps.
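A hedged sketch of a coincidence window that is updated in real time: photon time tags inside the current gate are histogrammed, the histogram peak is taken as the target return, and the gate centre for the next frame is shifted by the range change predicted from the estimated target velocity. All names and parameter values are illustrative, not from the experiment.

```python
import numpy as np

C = 3.0e8                                           # speed of light, m/s

def track_frame(time_tags, gate_center, gate_width, bin_width=1e-9):
    """Histogram photon arrivals inside the gate; return peak TOF and range (m)."""
    lo, hi = gate_center - gate_width / 2, gate_center + gate_width / 2
    in_gate = time_tags[(time_tags >= lo) & (time_tags < hi)]
    if in_gate.size == 0:
        return gate_center, None                    # keep the gate, no detection
    hist, edges = np.histogram(in_gate, bins=np.arange(lo, hi + bin_width, bin_width))
    tof = edges[np.argmax(hist)]
    return tof, tof * C / 2.0

def predict_next_center(tof, velocity_mps, frame_period):
    """Shift the gate centre to follow a target moving at velocity_mps."""
    return tof + 2.0 * velocity_mps * frame_period / C
```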
Using OpenOffice as a Portable Interface to JAVA-Based Applications
NASA Astrophysics Data System (ADS)
Comeau, T.; Garrett, B.; Richon, J.; Romelfanger, F.
2004-07-01
STScI previously used Microsoft Word and Microsoft Access, a Sybase ODBC driver, and the Adobe Acrobat PDF writer, along with a substantial amount of Visual Basic, to generate a variety of documents for the internal Space Telescope Grants Administration System (STGMS). While investigating an upgrade to Microsoft Office XP, we began considering alternatives, ultimately selecting an open source product, OpenOffice.org. This reduces the total number of products required to operate the internal STGMS system, simplifies the build system, and opens the possibility of moving to a non-Windows platform. We describe the experience of moving from Microsoft Office to OpenOffice.org, and our other internal uses of OpenOffice.org in our development environment.
Xie, Li-Hong; Tang, Jie; Miao, Wen-Jie; Tang, Xiang-Long; Li, Heng; Tang, An-Zhou
2018-06-01
We evaluated the risk of cochlear implantation through the round window membrane in the facial recess through a preoperative analysis of the angle between the facial nerve-round window and the cranial midline using high-resolution temporal bone CT. Temporal bone CT films of 176 patients with profound sensorineural hearing loss at our hospital from 2013 to 2015 were reviewed. The preoperative temporal bone CT scans of the patients were retrospectively analysed. The vertical distance (d value) from the leading edge of the facial nerve to the posterior wall of the external auditory canal and the angle (α value) between the line from the leading edge of the facial nerve to the midpoint of the round window membrane and the median sagittal line on the round window membrane plane were measured. Based on intraoperative observation, the round window membrane was divided into complete round window membrane exposure (group A), partial exposure (group B), and unexposed (group C) groups, and statistical analysis was performed. The α value could be effectively measured for all 176 patients (62.60 ± 7.12), and the d value could be effectively measured for 95 cases (5.53 ± 1.00). An analysis of the correlation between the α and d values of these 95 cases found a negative correlation. Of the 176 cases, one-way analysis of variance (ANOVA) showed that the differences among the groups were significant [P = 0.000 (< 0.05)]. The angle (α value) between the line connecting the leading edge of the facial nerve to the midpoint of the round window and the median sagittal line measured in preoperative CT scans was associated with the difficulty of intraoperatively exposing the round window membrane. When the α value was larger than a certain degree, the difficulty of exposing the round window membrane was increased. In such cases, the surgeon should fully expose the round window membrane during surgery, which could decrease the likelihood of complications.
Recognition of the Multi Specularity Objects using the Eigen-Window
1996-02-29
… analysis to each eigen-window. The basic idea is that, even if some of the windows are occluded, the remaining windows are still effective and can be used for recognition.
NASA Technical Reports Server (NTRS)
Ko, William L.; Gong, Leslie
2000-01-01
To visually record the initial free flight event of the Hyper-X research flight vehicle immediately after separation from the Pegasus(registered) booster rocket, a video camera was mounted on the bulkhead of the adapter through which Hyper-X rides on Pegasus. The video camera was shielded by a protective camera window made of heat-resistant quartz material. When Hyper-X separates from Pegasus, this camera window will be suddenly exposed to Mach 7 stagnation thermal shock and dynamic pressure loading (aerothermal loading). To examine the structural integrity, thermoelastic analysis was performed, and the stress distributions in the camera windows were calculated. The critical stress point where the tensile stress reaches a maximum value for each camera window was identified, and the maximum tensile stress level at that critical point was found to be considerably lower than the tensile failure stress of the camera window material.
Occupant-responsive optimal control of smart facade systems
NASA Astrophysics Data System (ADS)
Park, Cheol-Soo
Windows provide occupants with daylight, direct sunlight, visual contact with the outside and a feeling of openness. Windows enable the use of daylighting and offer occupants an outside view. Glazing may also cause a number of problems, such as undesired heat gain or loss in winter. An over-lit window can cause glare, which is another major complaint from occupants. Furthermore, cold or hot window surfaces induce asymmetric thermal radiation, which can result in thermal discomfort. To reduce the potential problems of window systems, double skin facades and airflow window systems were introduced in the 1970s. They typically contain interstitial louvers and ventilation openings. The current problem with double skin facades and airflow windows is that their operation requires adequate dynamic control to reach their expected performance. Many studies have recognized that only optimal control enables these systems to truly act as active energy savers and indoor environment controllers. However, an adequate solution for this dynamic optimization problem has thus far not been developed. The primary objective of this study is to develop occupant-responsive optimal control of smart facade systems. The control could be implemented as a smart controller that operates the motorized Venetian blind system and the opening ratio of ventilation openings. The objective of the control is to combine the benefits of large windows with low energy demands for heating and cooling, while keeping visual well-being and thermal comfort at an optimal level. The control uses a simulation model with an embedded optimization routine that allows occupant interaction via the Web. An occupant can access the smart controller from a standard browser and choose a pre-defined mode (energy saving mode, visual comfort mode, thermal comfort mode, default mode, nighttime mode) or set a preferred mode (user-override mode) by moving preference sliders on the screen. The most prominent feature of these systems is the capability of dynamically reacting to the environmental input data through real-time optimization. The proposed occupant-responsive optimal control of smart facade systems could provide a breakthrough in this under-developed area and lead to a renewed interest in smart facade systems.
Thurlow, W R
1980-01-01
Messages were presented which moved from right to left along an electronic alphabetic display which was varied in "window" size from 4 through 32 letter spaces. Deaf subjects signed the messages they perceived. Relatively few errors were made even at the highest rate of presentation, which corresponded to a typing rate of 60 words/min. It is concluded that many deaf persons can make effective use of a small visual display. A reduced cost is then possible for visual communication instruments for these people through reduced display size. Deaf subjects who can profit from a small display can be located by a sentence test administered by tape recorder which drives the display of the communication device by means of the standard code of the deaf teletype network.
Design and comparison of laser windows for high-power lasers
NASA Astrophysics Data System (ADS)
Niu, Yanxiong; Liu, Wenwen; Liu, Haixia; Wang, Caili; Niu, Haisha; Man, Da
2014-11-01
High-power laser systems are increasingly used in industrial and military applications. It is necessary to develop high-power laser systems that can operate over long periods of time without appreciable degradation in performance. When a high-energy laser beam transmits through a laser window, permanent damage may be caused to the window because of energy absorption by the window material. Therefore, when designing a high-power laser system, a suitable laser window material must be selected and the laser damage threshold of the window must be known. In this paper, a thermal analysis model of a high-power laser window is established, and the relationship between the laser intensity and the thermal-stress field distribution is studied by deriving the formulas with the integral-transform method. The influence of window radius, thickness, and laser intensity on the temperature and stress field distributions is analyzed. Then, the performance of K9 glass and fused silica glass is compared, and the laser-induced damage mechanism is analyzed. Finally, the damage thresholds of laser windows are calculated. The results show that, compared with K9 glass, fused silica glass has a higher damage threshold due to its good thermodynamic properties. The presented theoretical analysis and simulation results are helpful for the design and selection of high-power laser windows.
A Novel Feature Extraction Method for Monitoring (Vehicular) Fuel Storage System Leaks
2014-10-02
gives a continuous output of the DPDF with predefined partitions. The resolution of a DPDF is dependent on the pre-determined signal range and the number of … partitions within that range. Conceptually, the proposed implementation is identical to the creation of a histogram with a moving data window given some … window. The crisp partitions within the specified signal range act as "competing and possible" scenarios or alternatives on which we impose a "winner takes all" …
climwin: An R Toolbox for Climate Window Analysis.
Bailey, Liam D; van de Pol, Martijn
2016-01-01
When studying the impacts of climate change, there is a tendency to select climate data from a small set of arbitrary time periods or climate windows (e.g., spring temperature). However, these arbitrary windows may not encompass the strongest periods of climatic sensitivity and may lead to erroneous biological interpretations. Therefore, there is a need to consider a wider range of climate windows to better predict the impacts of future climate change. We introduce the R package climwin that provides a number of methods to test the effect of different climate windows on a chosen response variable and compare these windows to identify potential climate signals. climwin extracts the relevant data for each possible climate window and uses this data to fit a statistical model, the structure of which is chosen by the user. Models are then compared using an information criteria approach. This allows users to determine how well each window explains variation in the response variable and compare model support between windows. climwin also contains methods to detect type I and II errors, which are often a problem with this type of exploratory analysis. This article presents the statistical framework and technical details behind the climwin package and demonstrates the applicability of the method with a number of worked examples.
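The window-comparison step lends itself to a compact sketch. The following is a minimal Python illustration of the idea (not the climwin R package API; the aggregation by window mean, the Gaussian AIC formula, and all names are illustrative assumptions):

```python
import numpy as np

def window_aic(y, climate, windows):
    """Fit y ~ mean climate over each candidate window and rank windows by AIC.

    y       : (n,) response for n biological records
    climate : (n, T) daily climate values preceding each record (column 0 = most recent day)
    windows : iterable of (open, close) day offsets, with open >= close
    Returns a list of (aic, (open, close)) pairs sorted best-first.
    """
    n = len(y)
    results = []
    for opn, cls in windows:
        x = climate[:, cls:opn + 1].mean(axis=1)      # aggregate climate in the window
        X = np.column_stack([np.ones(n), x])          # intercept + window mean
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        k = X.shape[1] + 1                            # coefficients + error variance
        aic = n * np.log(rss / n) + 2 * k             # Gaussian AIC up to an additive constant
        results.append((aic, (opn, cls)))
    return sorted(results)

# Example: scan all windows opening up to 60 days before each record
# windows = [(o, c) for o in range(60) for c in range(o + 1)]
```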
Cross, Paul C.; Caillaud, Damien; Heisey, Dennis M.
2013-01-01
Many ecological and epidemiological studies occur in systems with mobile individuals and heterogeneous landscapes. Using a simulation model, we show that the accuracy of inferring an underlying biological process from observational data depends on movement and spatial scale of the analysis. As an example, we focused on estimating the relationship between host density and pathogen transmission. Observational data can result in highly biased inference about the underlying process when individuals move among sampling areas. Even without sampling error, the effect of host density on disease transmission is underestimated by approximately 50 % when one in ten hosts move among sampling areas per lifetime. Aggregating data across larger regions causes minimal bias when host movement is low, and results in less biased inference when movement rates are high. However, increasing data aggregation reduces the observed spatial variation, which would lead to the misperception that a spatially targeted control effort may not be very effective. In addition, averaging over the local heterogeneity will result in underestimating the importance of spatial covariates. Minimizing the bias due to movement is not just about choosing the best spatial scale for analysis, but also about reducing the error associated with using the sampling location as a proxy for an individual’s spatial history. This error associated with the exposure covariate can be reduced by choosing sampling regions with less movement, including longitudinal information of individuals’ movements, or reducing the window of exposure by using repeated sampling or younger individuals.
Misfeldt, Renée; Suter, Esther; Mallinson, Sara; Boakye, Omenaa; Wong, Sabrina; Nasmith, Louise
2017-08-01
This paper discusses findings from a high-level scan of the contextual factors and actors that influenced policies on team-based primary healthcare in three Canadian provinces: British Columbia, Alberta and Saskatchewan. The team searched diverse sources (e.g., news reports, press releases, discussion papers) for contextual information relevant to primary healthcare teams. We also conducted qualitative interviews with key health system informants from the three provinces. Data from documents and interviews were analyzed qualitatively using thematic analysis. We then wrote narrative summaries highlighting pivotal policy and local system events and the influence of actors and context. Our overall findings highlight the value of reviewing the context, relationships and power dynamics, which come together and create "policy windows" at different points in time. We observed physician-centric policy processes with some recent moves to rebalance power and be inclusive of other actors and perspectives. The context review also highlighted the significant influence of changes in political leadership and prioritization in driving policies on team-based care. While this existed in different degrees in the three provinces, the push and pull of political and professional power dynamics shaped Canadian provincial policies governing team-based care. If we are to move team-based primary healthcare forward in Canada, the provinces need to review the external factors and the complex set of relationships and trade-offs that underscore the policy process. Copyright © 2017 Longwoods Publishing.
Testing Land-Vegetation retrieval algorithms for the ICESat-2 mission
NASA Astrophysics Data System (ADS)
Zhou, T.; Popescu, S. C.
2017-12-01
The upcoming spaceborne satellite, the Ice, Cloud and land Elevation Satellite 2 (ICESat-2), will provide topography and canopy profiles at the global scale using photon counting LiDAR. To prepare for the mission launch, the aim of this research is to develop a framework for retrieving ground and canopy height in different forest types and noise levels using data from two ICESat-2 testbed sensors: MABEL (Multiple Altimeter Beam Experimental Lidar) data and simulated ICESat-2 data. The first step of the framework is to remove as many noise photons as possible through grid statistical methods and cluster analysis. Subsequently, we employed overlapping moving windows and estimated quantile heights in each window to characterize the likely ground and canopy top using the filtered photons. Both MABEL and simulated ICESat-2 data generated satisfactory results with reasonable accuracy, while the results from the simulated ICESat-2 data were better than those from the MABEL data, with smaller root mean square errors (RMSEs). For example, the RMSEs of canopy top identification in various vegetation types using simulated ICESat-2 data were less than 3.78 m, compared to 6.48 m for the MABEL data. It is anticipated that the methodology will advance data processing for the ICESat-2 mission and expand potential applications of ICESat-2 data once available, such as mapping vegetation canopy heights.
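A minimal sketch of the overlapping moving-window quantile step described above, assuming photons have already been noise-filtered; the window size, overlap, and quantile levels are illustrative choices rather than the authors' settings:

```python
import numpy as np

def window_quantiles(along_track, height, width=30.0, step=15.0,
                     ground_q=0.05, canopy_q=0.98):
    """Estimate candidate ground and canopy-top heights in overlapping windows.

    along_track : (n,) along-track distance of filtered photons (m)
    height      : (n,) photon elevations (m)
    Returns arrays of window centers, ground estimates, and canopy-top estimates.
    """
    centers, ground, canopy = [], [], []
    start = along_track.min()
    while start < along_track.max():
        mask = (along_track >= start) & (along_track < start + width)
        if mask.sum() > 10:                           # require enough photons in the window
            h = height[mask]
            centers.append(start + width / 2)
            ground.append(np.quantile(h, ground_q))   # low quantile ~ ground
            canopy.append(np.quantile(h, canopy_q))   # high quantile ~ canopy top
        start += step                                 # step < width gives overlapping windows
    return np.array(centers), np.array(ground), np.array(canopy)
```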
Power strain imaging based on vibro-elastography techniques
NASA Astrophysics Data System (ADS)
Wen, Xu; Salcudean, S. E.
2007-03-01
This paper describes a new ultrasound elastography technique, power strain imaging, based on vibro-elastography (VE) techniques. With this method, tissue is compressed by a vibrating actuator driven by low-pass or band-pass filtered white noise, typically in the 0-20 Hz range. Tissue displacements at different spatial locations are estimated by correlation-based approaches on the raw ultrasound radio frequency signals and recorded in time sequences. The power spectra of these time sequences are computed by Fourier spectral analysis techniques. As the average of the power spectrum is proportional to the squared amplitude of the tissue motion, the square root of the average power over the range of excitation frequencies is used as a measure of the tissue displacement. Tissue strain is then determined by least squares estimation of the gradient of the displacement field. The computation of the power spectra of the time sequences can be implemented efficiently by using Welch's periodogram method with moving windows or with accumulative windows with a forgetting factor. Compared to the transfer function estimation originally used in VE, the computation of cross spectral densities is not needed, which saves both memory and computation time. Phantom experiments demonstrate that the proposed method produces stable and operator-independent strain images with high signal-to-noise ratio in real time. This approach has also been tested on a few patient data sets from the prostate region, and the results are encouraging.
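The power-spectrum and strain steps can be sketched roughly as follows, assuming displacement time sequences have already been estimated per depth sample; scipy's Welch estimator stands in for the periodogram averaging, and the band limits, window lengths, and all names are illustrative:

```python
import numpy as np
from scipy.signal import welch

def power_displacement(disp, fs, band=(2.0, 20.0), nperseg=256):
    """disp: (n_depth, n_time) displacement sequences; fs: sampling rate (Hz).
    Returns an RMS displacement measure per depth over the excitation band."""
    f, pxx = welch(disp, fs=fs, nperseg=nperseg, axis=-1)   # Welch power spectra
    sel = (f >= band[0]) & (f <= band[1])
    avg_power = pxx[:, sel].mean(axis=-1)                   # average power in the band
    return np.sqrt(avg_power)                               # ~ tissue displacement amplitude

def strain_least_squares(u, dz, half_win=5):
    """Least-squares gradient of displacement u over depth (sample spacing dz) in a
    sliding window of 2*half_win+1 samples; the local slope is the strain estimate."""
    n = len(u)
    strain = np.zeros(n)
    for i in range(n):
        lo, hi = max(0, i - half_win), min(n, i + half_win + 1)
        z = np.arange(lo, hi) * dz
        strain[i] = np.polyfit(z, u[lo:hi], 1)[0]           # slope of the local fit
    return strain
```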
Physical metallurgy of metastable Bcc lanthanide-magnesium alloys for R = La, Gd, and Dy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herchenroeder, J.W.; Manfrinetti, P.; Gschneidner, K.A. Jr.
1989-09-01
Bcc La-Mg, Gd-Mg, and Dy-Mg alloys have been prepared by an ice water/acetone quench from liquid melts. Single-phase alloys could be retained in a window around the eutectoid composition: 13 to 22 at. pct Mg, 23.6 to 29 at. pct Mg, and 27 to 29 at. pct Mg for La, Gd, and Dy alloys, respectively. At the center of the windows, x-ray diffraction peaks are extremely sharp as in equilibrium bcc structures; however, as alloy composition is moved away from the eutectoid, line broadening is observed. Reversion of the bcc phase to the equilibrium microstructure for R-Mg alloys (R = La, Gd, or Dy) has been characterized by differential thermal analysis (DTA) or differential scanning calorimetry (DSC) and isothermal annealing. La-Mg alloys revert directly to αLa (dhcp) + LaMg at about 350°C when heated at 10°C/min. In contrast, the Gd and Dy alloys revert by a two-step process: first, a transition to an intermediate distorted hcp phase between 300°C and 400°C, and, second, the relaxation of this phase to αR (hcp) + RMg at about 490°C when heated at 10°C/min. Isothermal annealing and high temperature x-ray diffraction confirm the nature of these reactions.
A frequency-based window width optimized two-dimensional S-Transform profilometry
NASA Astrophysics Data System (ADS)
Zhong, Min; Chen, Feng; Xiao, Chao
2017-11-01
A new scheme is proposed as a frequency-based window-width-optimized two-dimensional S-transform profilometry, in which parameters pu and pv are introduced to control the width of a two-dimensional Gaussian window. Unlike the standard two-dimensional S-transform, which uses a Gaussian window with width proportional to the reciprocal local frequency of the tested signal, the window width of the optimized two-dimensional S-transform varies with the pu-th (pv-th) power of the reciprocal local frequency fx (fy) in the x (y) direction. The paper gives a detailed theoretical analysis of the optimized two-dimensional S-transform in fringe analysis as well as the characteristics of the modified Gaussian window. Simulations are applied to evaluate the proposed scheme; the results show that the new scheme has better noise reduction ability and can extract the phase distribution more precisely in comparison with the standard two-dimensional S-transform, even when the surface of the measured object varies sharply. Finally, the proposed scheme is demonstrated on three-dimensional surface reconstruction of a complex plastic cat mask to show its effectiveness.
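One plausible form of the modified Gaussian window consistent with this description is sketched below; the normalization and exact conventions of the paper are not reproduced, and setting pu = pv = 1 recovers the usual reciprocal-frequency window:

```python
import numpy as np

def optimized_window(x, y, fx, fy, pu=1.0, pv=1.0):
    """Gaussian window of a width-optimized 2D S-transform (one plausible form).

    Standard S-transform: window width proportional to 1/|f| (pu = pv = 1).
    Optimized version   : window width proportional to 1/|f|**p in each direction.
    x, y may be arrays (e.g. from np.meshgrid); fx, fy are the local frequencies.
    """
    su, sv = abs(fx) ** pu, abs(fy) ** pv          # inverse window widths
    norm = su * sv / (2.0 * np.pi)                 # keeps unit area under the window
    return norm * np.exp(-0.5 * ((x * su) ** 2 + (y * sv) ** 2))
```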
Active member vibration control experiment in a KC-135 reduced gravity environment
NASA Technical Reports Server (NTRS)
Lawrence, C. R.; Lurie, B. J.; Chen, G.-S.; Swanson, A. D.
1991-01-01
An active member vibration control experiment in a KC-135 reduced gravity environment was carried out by the Air Force Flight Dynamics Laboratory and the Jet Propulsion Laboratory. Two active members, consisting of piezoelectric actuators, displacement sensors, and load cells, were incorporated into a 12-meter, 104 kg box-type test structure. The active member control design involved the use of the bridge (compound) feedback concept, in which the collocated force and velocity signals are fed back locally. An impact-type test was designed to accommodate the extremely short duration of the reduced gravity testing window in each parabolic flight. The moving block analysis technique was used to estimate the modal frequencies and dampings from the free-decay responses. A broadband damping performance was demonstrated up to the ninth mode of 40 Hz. The best damping performance achieved in the flight test was about 5 percent in the fourth mode of the test structure.
Rectified directional sensing in long-range cell migration
Nakajima, Akihiko; Ishihara, Shuji; Imoto, Daisuke; Sawai, Satoshi
2014-01-01
How spatial and temporal information are integrated to determine the direction of cell migration remains poorly understood. Here, by precise microfluidics emulation of dynamic chemoattractant waves, we demonstrate that, in Dictyostelium, directional movement as well as activation of small guanosine triphosphatase Ras at the leading edge is suppressed when the chemoattractant concentration is decreasing over time. This ‘rectification’ of directional sensing occurs only at an intermediate range of wave speed and does not require phosphoinositide-3-kinase or F-actin. From modelling analysis, we show that rectification arises naturally in a single-layered incoherent feedforward circuit with zero-order ultrasensitivity. The required stimulus time-window predicts ~5 s transient for directional sensing response close to Ras activation and inhibitor diffusion typical for protein in the cytosol. We suggest that the ability of Dictyostelium cells to move only in the wavefront is closely associated with rectification of adaptive response combined with local activation and global inhibition. PMID:25373620
Solar cycle signatures in the NCEP equatorial annual oscillation
NASA Astrophysics Data System (ADS)
Mayr, H. G.; Mengel, J. G.; Huang, F. T.; Nash, E. R.
2009-08-01
Our analysis of temperature and zonal wind data (1958 to 2006) from the National Center for Atmospheric Research (NCAR) reanalysis (Re-1), supplied by the National Centers for Environmental Prediction (NCEP), shows that the hemispherically symmetric 12-month equatorial annual oscillation (EAO) contains spectral signatures with periods around 11 years. Moving windows of 44 years show that, below 20 km, the 11-year modulation of the EAO is phase locked to the solar cycle (SC). The spectral features from the 48-year data record reveal modulation signatures of 9.6 and 12 years, which produce EAO variations that mimic in limited altitude regimes the varying maxima and minima of the 10.7 cm flux solar index. Above 20 km, the spectra also contain modulation signatures with periods around 11 years, but the filtered variations are too irregular to suggest that systematic SC forcing is the principal agent.
Counter tube window and X-ray fluorescence analyzer study
NASA Technical Reports Server (NTRS)
Hertel, R.; Holm, M.
1973-01-01
A study was performed to determine the best design tube window and X-ray fluorescence analyzer for quantitative analysis of Venusian dust and condensates. The principal objective of the project was to develop the best counter tube window geometry for the sensing element of the instrument. This included formulation of a mathematical model of the window and optimization of its parameters. The proposed detector and instrument has several important features. The instrument will perform a near real-time analysis of dust in the Venusian atmosphere, and is capable of measuring dust layers less than 1 micron thick. In addition, wide dynamic measurement range will be provided to compensate for extreme variations in count rates. An integral pulse-height analyzer and memory accumulate data and read out spectra for detail computer analysis on the ground.
Assessment of gliosis around moveable implants in the brain
Stice, Paula
2010-01-01
Repositioning microelectrodes post-implantation is emerging as a promising approach to achieve long-term reliability in single neuronal recordings. The main goal of this study was to (a) assess glial reaction in response to movement of microelectrodes in the brain post-implantation and (b) determine an optimal window of time post-implantation when movement of microelectrodes within the brain would result in minimal glial reaction. Eleven Sprague-Dawley rats were implanted with two microelectrodes each that could be moved in vivo post-implantation. Three cohorts were investigated: (1) microelectrode moved at day 2 (n = 4 animals), (2) microelectrode moved at day 14 (n = 5 animals) and (3) microelectrode moved at day 28 (n = 2 animals). Histological evaluation was performed in cohorts 1–3 at four-week post-movement (30 days, 42 days and 56 days post-implantation, respectively). In addition, five control animals were implanted with microelectrodes that were not moved. Control animals were implanted for (1) 30 days (n = 1), (2) 42 days (n = 2) and (3) 56 days (n = 2) prior to histological evaluation. Quantitative assessment of glial fibrillary acidic protein (GFAP) around the tip of the microelectrodes demonstrated that GFAP levels were similar around microelectrodes moved at day 2 when compared to the 30-day controls. However, GFAP expression levels around microelectrode tips that moved at day 14 and day 28 were significantly less than those around control microelectrodes implanted for 42 and 56 days, respectively. Therefore, we conclude that moving microelectrodes after implantation is a viable strategy that does not result in any additional damage to brain tissue. Further, moving the microelectrode downwards after 14 days of implantation may actually reduce the levels of GFAP expression around the tips of the microelectrodes in the long term. PMID:19556680
Xu, Yinlin; Ma, Qianli D Y; Schmitt, Daniel T; Bernaola-Galván, Pedro; Ivanov, Plamen Ch
2011-11-01
We investigate how various coarse-graining (signal quantization) methods affect the scaling properties of long-range power-law correlated and anti-correlated signals, quantified by the detrended fluctuation analysis. Specifically, for coarse-graining in the magnitude of a signal, we consider (i) the Floor, (ii) the Symmetry and (iii) the Centro-Symmetry coarse-graining methods. We find that for anti-correlated signals coarse-graining in the magnitude leads to a crossover to random behavior at large scales, and that with increasing the width of the coarse-graining partition interval Δ, this crossover moves to intermediate and small scales. In contrast, the scaling of positively correlated signals is less affected by the coarse-graining, with no observable changes when Δ < 1, while for Δ > 1 a crossover appears at small scales and moves to intermediate and large scales with increasing Δ. For very rough coarse-graining (Δ > 3) based on the Floor and Symmetry methods, the position of the crossover stabilizes, in contrast to the Centro-Symmetry method where the crossover continuously moves across scales and leads to a random behavior at all scales; thus indicating a much stronger effect of the Centro-Symmetry compared to the Floor and the Symmetry method. For coarse-graining in time, where data points are averaged in non-overlapping time windows, we find that the scaling for both anti-correlated and positively correlated signals is practically preserved. The results of our simulations are useful for the correct interpretation of the correlation and scaling properties of symbolic sequences.
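A small sketch of two of the coarse-graining operations as we read their descriptions; the Floor rule and the interpretation of the non-overlapping averaging window are assumptions rather than formulas quoted from the paper:

```python
import numpy as np

def floor_coarse_grain(x, delta):
    """Floor coarse-graining in magnitude: map each value to the lower edge of
    its partition interval of width delta (assumed interpretation)."""
    return np.floor(x / delta) * delta

def time_coarse_grain(x, w):
    """Coarse-graining in time: average the signal in non-overlapping windows
    of w points (the tail that does not fill a window is dropped)."""
    n = (len(x) // w) * w
    return x[:n].reshape(-1, w).mean(axis=1)
```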
Local concurrent error detection and correction in data structures using virtual backpointers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, C.C.J.; Chen, P.P.; Fuchs, W.K.
1989-11-01
A new technique, based on virtual backpointers, is presented in this paper for local concurrent error detection and correction in linked data structures. Two new data structures utilizing virtual backpointers, the Virtual Double-Linked List and the B-Tree with Virtual Backpointers, are described. For these structures, double errors within a fixed-size checking window can be detected in constant time, and single errors detected during forward moves can be corrected in constant time.
Two-zone elastic-plastic single shock waves in solids.
Zhakhovsky, Vasily V; Budzevich, Mikalai M; Inogamov, Nail A; Oleynik, Ivan I; White, Carter T
2011-09-23
By decoupling time and length scales in moving window molecular dynamics shock-wave simulations, a new regime of shock-wave propagation is uncovered characterized by a two-zone elastic-plastic shock-wave structure consisting of a leading elastic front followed by a plastic front, both moving with the same average speed and having a fixed net thickness that can extend to microns. The material in the elastic zone is in a metastable state that supports a pressure that can substantially exceed the critical pressure characteristic of the onset of the well-known split-elastic-plastic, two-wave propagation. The two-zone elastic-plastic wave is a general phenomenon observed in simulations of a broad class of crystalline materials and is within the reach of current experimental techniques.
Can Changes in Eye Movement Scanning Alter the Age-Related Deficit in Recognition Memory?
Chan, Jessica P. K.; Kamino, Daphne; Binns, Malcolm A.; Ryan, Jennifer D.
2011-01-01
Older adults typically exhibit poorer face recognition compared to younger adults. These recognition differences may be due to underlying age-related changes in eye movement scanning. We examined whether older adults’ recognition could be improved by yoking their eye movements to those of younger adults. Participants studied younger and older faces, under free viewing conditions (bases), through a gaze-contingent moving window (own), or a moving window which replayed the eye movements of a base participant (yoked). During the recognition test, participants freely viewed the faces with no viewing restrictions. Own-age recognition biases were observed for older adults in all viewing conditions, suggesting that this effect occurs independently of scanning. Participants in the bases condition had the highest recognition accuracy, and participants in the yoked condition were more accurate than participants in the own condition. Among yoked participants, recognition did not depend on age of the base participant. These results suggest that successful encoding for all participants requires the bottom-up contribution of peripheral information, regardless of the locus of control of the viewer. Although altering the pattern of eye movements did not increase recognition, the amount of sampling of the face during encoding predicted subsequent recognition accuracy for all participants. Increased sampling may confer some advantages for subsequent recognition, particularly for people who have declining memory abilities. PMID:21687460
Akita, Yasuyuki; Chen, Jiu-Chiuan; Serre, Marc L.
2013-01-01
Geostatistical methods are widely used in estimating long-term exposures for air pollution epidemiological studies, despite their limited capabilities to handle spatial non-stationarity over large geographic domains and uncertainty associated with missing monitoring data. We developed a moving-window (MW) Bayesian Maximum Entropy (BME) method and applied this framework to estimate fine particulate matter (PM2.5) yearly average concentrations over the contiguous U.S. The MW approach accounts for the spatial non-stationarity, while the BME method rigorously processes the uncertainty associated with data missingness in the air monitoring system. In the cross-validation analyses conducted on a set of randomly selected complete PM2.5 data in 2003 and on simulated data with different degrees of missing data, we demonstrate that the MW approach alone leads to at least a 17.8% reduction in mean square error (MSE) in estimating the yearly PM2.5. Moreover, the MWBME method further reduces the MSE by 8.4% to 43.7% as the proportion of incomplete data increases from 18.3% to 82.0%. The MWBME approach leads to significant reductions in estimation error and thus is recommended for epidemiological studies investigating the effect of long-term exposure to PM2.5 across large geographical domains with expected spatial non-stationarity. PMID:22739679
Online tracking of instantaneous frequency and amplitude of dynamical system response
NASA Astrophysics Data System (ADS)
Frank Pai, P.
2010-05-01
This paper presents a sliding-window tracking (SWT) method for accurate tracking of the instantaneous frequency and amplitude of arbitrary dynamic response by processing only the three (or more) most recent data points. The Teager-Kaiser algorithm (TKA) is a well-known four-point method for online tracking of frequency and amplitude. Because finite difference is used in TKA, its accuracy is easily destroyed by measurement and/or signal-processing noise. Moreover, because TKA assumes the processed signal to be a pure harmonic, any moving average in the signal can destroy the accuracy of TKA. On the other hand, because SWT uses a constant and a pair of windowed regular harmonics to fit the data and estimate the instantaneous frequency and amplitude, the influence of any moving average is eliminated. Moreover, noise filtering is an implicit capability of SWT when more than three data points are used, and this capability increases with the number of processed data points. To compare the accuracy of SWT and TKA, the Hilbert-Huang transform is used to extract accurate time-varying frequencies and amplitudes by processing the whole data set without assuming the signal to be harmonic. Frequency and amplitude tracking of different amplitude- and frequency-modulated signals, vibrato in music, and nonlinear stationary and non-stationary dynamic signals is studied. Results show that SWT is more accurate, robust, and versatile than TKA for online tracking of frequency and amplitude.
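A minimal sketch of the sliding-window fit idea as described (not the authors' implementation): the signal in the window is modeled as a constant plus one harmonic, the frequency is found by a coarse grid search, and the remaining coefficients follow from linear least squares. More samples than the three linear coefficients are needed for the residual to discriminate between candidate frequencies, so the example assumes a window of five or more points.

```python
import numpy as np

def swt_track(t, x, freqs):
    """Fit x(t) ~ c + a*cos(2*pi*f*t) + b*sin(2*pi*f*t) over the window (t, x),
    scanning f over the candidate grid `freqs`.
    Returns the best-fit frequency and the amplitude sqrt(a^2 + b^2)."""
    best = (np.inf, None, None)
    for f in freqs:
        X = np.column_stack([np.ones_like(t),
                             np.cos(2 * np.pi * f * t),
                             np.sin(2 * np.pi * f * t)])
        beta, *_ = np.linalg.lstsq(X, x, rcond=None)   # linear LS for c, a, b
        rss = np.sum((x - X @ beta) ** 2)
        if rss < best[0]:
            best = (rss, f, np.hypot(beta[1], beta[2]))
    return best[1], best[2]

# Online use: call swt_track on the most recent samples at every new time step.
```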
Influence of gravity and light on the developmental polarity of Ceratopteris richardii fern spores
NASA Technical Reports Server (NTRS)
Edwards, E. S.; Roux, S. J.
1998-01-01
The polarity of germinating single-celled spores of the fern Ceratopteris richardii Brogn. is influenced by gravity during a time period prior to the first cellular division designated a "polarity-determination window". After this window closes, control of polarity is seen in the downward (with respect to gravity) migration of the nucleus along the proximal face of the spore and the subsequent downward growth of the primary rhizoid. When spores are germinated on a clinostat the direction of nuclear migration and subsequent primary rhizoid growth is random. However, in each case the direction of nuclear migration predicts the direction of rhizoid elongation. Although it is the most obvious movement, the downward migration is not the first movement of the nucleus. During the polarity-determination window, the nucleus moves randomly within a region centered behind the trilete marking. While the polarity of many fern spores has been reported to be controlled by light, spores of C. richardii are the first documented to have their polarity influenced by gravity. Directional white light also affects the polarity of these spores, but this influence is slight and is secondary to that of gravity.
Compositional searching of CpG islands in the human genome
NASA Astrophysics Data System (ADS)
Luque-Escamilla, Pedro Luis; Martínez-Aroza, José; Oliver, José L.; Gómez-Lopera, Juan Francisco; Román-Roldán, Ramón
2005-06-01
We report on an entropic edge detector based on the local calculation of the Jensen-Shannon divergence with application to the search for CpG islands. CpG islands are pieces of the genome related to gene expression and cell differentiation, and thus to cancer formation. Searching for these CpG islands is a major task in genetics and bioinformatics. Some algorithms have been proposed in the literature, based on moving statistics in a sliding window, but its size may greatly influence the results. The local use of Jensen-Shannon divergence is a completely different strategy: the nucleotide composition inside the islands is different from that in their environment, so a statistical distance—the Jensen-Shannon divergence—between the composition of two adjacent windows may be used as a measure of their dissimilarity. Sliding this double window over the entire sequence allows us to segment it compositionally. The fusion of those segments into greater ones that satisfy certain identification criteria must be achieved in order to obtain the definitive results. We find that the local use of Jensen-Shannon divergence is very suitable in processing DNA sequences for searching for compositionally different structures such as CpG islands, as compared to other algorithms in literature.
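A compact sketch of the adjacent double-window scan described above; the window length, the ACGT-only alphabet handling, and the base-2 logarithm are illustrative choices:

```python
import math
from collections import Counter

def composition(seq):
    """Nucleotide composition of a sequence as a probability vector over ACGT."""
    counts = Counter(seq)
    total = sum(counts.get(b, 0) for b in "ACGT") or 1
    return [counts.get(b, 0) / total for b in "ACGT"]

def jensen_shannon(p, q):
    """Jensen-Shannon divergence between two composition vectors (base-2 log)."""
    def kl(a, b):
        return sum(x * math.log2(x / y) for x, y in zip(a, b) if x > 0)
    m = [(x + y) / 2 for x, y in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def js_profile(seq, half=500):
    """Slide two adjacent windows of `half` bases along `seq` and return the
    divergence between their compositions at every position; peaks mark
    candidate compositional boundaries such as CpG island edges."""
    return [jensen_shannon(composition(seq[i - half:i]),
                           composition(seq[i:i + half]))
            for i in range(half, len(seq) - half)]
```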
Li, Qinglin; Zhao, Meng; Wang, Xiaodan
2018-01-01
To compare the differences between the Kidney Disease Improving Global Outcomes (KDIGO) criteria of the 48-hour window and the 7-day window in the diagnosis of acute kidney injury (AKI) in very elderly patients, as well as the relationship between the 48-hour and 7-day windows for diagnosis and 90-day mortality. We retrospectively enrolled very elderly patients (≥75 years old) from the geriatrics department of the Chinese PLA General Hospital between January 2007 and December 2015. AKI patients were divided into 48-hour and 7-day groups by their diagnosis criteria. AKI patients were divided into survivor and nonsurvivor groups by their outcomes within 90 days after diagnosis of AKI. In total, 652 patients were included in the final analysis. The median age of the cohort was 87 (84-91) years, the majority (623, 95.6%) of whom were male. Of the 652 AKI patients, 334 cases (51.2%) were diagnosed with AKI by the 48-hour window for diagnosis, while 318 cases (48.8%) were diagnosed by the 7-day window. The 90-day mortality was 42.5% in patients with 48-hour window AKI and 24.2% in patients with 7-day window AKI. Kaplan-Meier curves showed that 90-day mortality was lower in the 7-day window AKI group than in the 48-hour window AKI group (log rank: P < 0.001). Multivariate analysis with the Cox model revealed that diagnosis by the 48-hour window (HR = 1.818; 95% CI: 1.256-2.631; P = 0.002) was associated with higher 90-day mortality. The 90-day mortality was higher in 48-hour window AKI than in 7-day window AKI in very elderly patients. The 48-hour KDIGO window definition may be less sensitive. The 48-hour KDIGO window definition is significantly better correlated with subsequent mortality and is, therefore, still appropriate for clinical use. Finding early, sensitive biomarkers of kidney damage is a future direction of research.
Liu, Xue-Li; Gai, Shuang-Shuang; Zhang, Shi-Le; Wang, Pu
2015-01-01
Background An important attribute of the traditional impact factor was the controversial 2-year citation window. So far, several scholars have proposed using different citation time windows for evaluating journals. However, there is no confirmation of whether a longer citation time window would be better. How do the journal evaluation effects of 3IF, 4IF, and 6IF compare with those of 2IF and 5IF? In order to understand these questions, we made a comparative study of impact factors with different citation time windows against the peer-reviewed scores of ophthalmologic journals indexed in the Science Citation Index Expanded (SCIE) database. Methods The peer-reviewed scores of 28 ophthalmologic journals were obtained through a self-designed survey questionnaire. Impact factors with different citation time windows (including 2IF, 3IF, 4IF, 5IF, and 6IF) of the 28 ophthalmologic journals were computed and compared in accordance with each impact factor's definition and formula, using the citation analysis function of the Web of Science (WoS) database. An analysis of the correlation between impact factors with different citation time windows and peer-reviewed scores was carried out. Results Although impact factor values with different citation time windows were different, there was a high level of correlation between them when it came to evaluating journals. In the current study, for ophthalmologic journals' impact factors with different time windows in 2013, 3IF and 4IF seemed the ideal ranges for comparison, when assessed in relation to peer-reviewed scores. In addition, the 3-year and 4-year windows were quite consistent with the cited peak age of documents published by ophthalmologic journals. Research Limitations Our study is based on ophthalmology journals and we only analyze the impact factors with different citation time windows in 2013, so it has yet to be ascertained whether other disciplines (especially those with a later cited peak) or other years would follow the same or similar patterns. Originality/Value We designed the survey questionnaire ourselves, specifically to assess the real influence of journals. We used peer-reviewed scores to judge the journal evaluation effect of impact factors with different citation time windows. The main purpose of this study was to help researchers better understand the role of impact factors with different citation time windows in journal evaluation. PMID:26295157
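For reference, an n-year impact factor generalizes the familiar 2-year definition in the obvious way (a sketch of the definition as commonly stated, not a formula quoted from this study; n = 2 recovers the traditional impact factor):

```latex
\mathrm{IF}_n(Y) \;=\;
\frac{\text{citations received in year } Y \text{ to items published in years } Y-n,\dots,Y-1}
     {\text{number of citable items published in years } Y-n,\dots,Y-1}
```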
Digital PIV (DPIV) Software Analysis System
NASA Technical Reports Server (NTRS)
Blackshire, James L.
1997-01-01
A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides an automated image capture, test correlation, and autocorrelation analysis capability for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitute a complete DPIV analysis capability. The programs are run on an IBM PC/AT host computer running either Microsoft Windows 3.1 or Windows 95, using a 'quickwin' format that allows a simple user interface and output capabilities to the Windows environment.
Sone, M
1998-10-01
The inner layer of the round window membrane is composed of mesothelial cells and this mesothelial cell layer extends to the scala tympani. This study describes the histopathologic findings of temporal bone analysis from a patient with bilateral perilymphatic fistula of the round window membrane. The left ear showed proliferation of mesothelial cells in the scala tympani of the basal turn adjoining the round window membrane. This cell proliferation is thought to be a reaction to the rupture of the round window membrane.
Window acoustic study for advanced turboprop aircraft
NASA Technical Reports Server (NTRS)
Prydz, R. A.; Balena, F. J.
1984-01-01
An acoustic analysis was performed to establish window designs for advanced turboprop powered aircraft. The window transmission loss requirements were based on A-weighted interior noise goals of 80 and 75 dBA. The analytical results showed that a triple pane window consisting of two glass outer panes and an inner pane of acrylic would provide the required transmission loss and meet the sidewall space limits. Two window test articles were fabricated for laboratory evaluation and verification of the predicted transmission loss. Procedures for performing laboratory tests are presented.
Non-intrusive parameter identification procedure user's guide
NASA Technical Reports Server (NTRS)
Hanson, G. D.; Jewell, W. F.
1983-01-01
Written in standard FORTRAN, NAS is capable of identifying linear as well as nonlinear relations between input and output parameters; the only restriction is that the input/output relation be linear with respect to the unknown coefficients of the estimation equations. The output of the identification algorithm can be specified to be in either the time domain (i.e., the estimation equation coefficients) or in the frequency domain (i.e., a frequency response of the estimation equation). The frame length ("window") over which the identification procedure is to take place can be specified to be any portion of the input time history, thereby allowing the freedom to start and stop the identification procedure within a time history. There also is an option which allows a sliding window, which gives a moving average over the time history. The NAS software also includes the ability to identify several assumed solutions simultaneously for the same or different input data.
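A toy sketch of the identification step described above, under the assumption that the estimation equation is linear in the unknown coefficients and that a sliding window produces one coefficient estimate per window position; all names are hypothetical and nothing here reproduces the NAS code:

```python
import numpy as np

def sliding_window_identify(regressors, output, win, step=1):
    """Least-squares estimates of coefficients theta in output ~ regressors @ theta,
    recomputed over a sliding window of `win` frames.

    regressors : (n, p) matrix of input-derived terms (they may be nonlinear in the
                 inputs, but the model is linear in theta)
    output     : (n,) measured output
    Returns (start_indices, estimates) with one coefficient vector per window.
    """
    starts, thetas = [], []
    for s in range(0, len(output) - win + 1, step):
        A = regressors[s:s + win]
        b = output[s:s + win]
        theta, *_ = np.linalg.lstsq(A, b, rcond=None)
        starts.append(s)
        thetas.append(theta)
    return np.array(starts), np.array(thetas)
```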
Karst database development in Minnesota: Design and data assembly
Gao, Y.; Alexander, E.C.; Tipping, R.G.
2005-01-01
The Karst Feature Database (KFD) of Minnesota is a relational GIS-based Database Management System (DBMS). Previous karst feature datasets used inconsistent attributes to describe karst features in different areas of Minnesota. Existing metadata were modified and standardized to represent comprehensive metadata for all the karst features in Minnesota. Microsoft Access 2000 and ArcView 3.2 were used to develop this working database. Existing county and sub-county karst feature datasets have been assembled into the KFD, which is capable of visualizing and analyzing the entire data set. By November 17, 2002, 11,682 karst features were stored in the KFD of Minnesota. Data tables are stored in a Microsoft Access 2000 DBMS and linked to corresponding ArcView applications. The current KFD of Minnesota has been moved from a Windows NT server to a Windows 2000 Citrix server accessible to researchers and planners through networked interfaces. © Springer-Verlag 2005.
Systemic risk and hierarchical transitions of financial networks
NASA Astrophysics Data System (ADS)
Nobi, Ashadun; Lee, Jae Woo
2017-06-01
In this paper, the change in topological hierarchy, which is measured by the minimum spanning tree constructed from the cross-correlations between the stock indices from the S & P 500 for 1998-2012 in a one year moving time window, was used to analyze a financial crisis. The hierarchy increased in all minor crises in the observation time window except for the sharp crisis of 2007-2008 when the global financial crisis occurred. The sudden increase in hierarchy just before the global financial crisis can be used for the early detection of an upcoming crisis. Clearly, the higher the hierarchy, the higher the threats to financial stability. The scaling relations were developed to observe the changes in hierarchy with the network topology. These scaling relations can also identify and quantify the financial crisis periods, and appear to contain the predictive power of an upcoming crisis.
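A sketch of the moving-window MST construction; the correlation-to-distance map d = sqrt(2(1 - rho)) is the common Mantegna convention rather than a detail taken from this paper, and the hierarchy measure used here (mean level relative to the most connected node) is one of several possibilities:

```python
import numpy as np
import networkx as nx

def mst_hierarchy(returns):
    """Build the MST from a window of index returns (T x N matrix) and return a
    simple hierarchy measure: mean shortest-path level of the nodes relative to
    the most connected node of the tree."""
    corr = np.corrcoef(returns, rowvar=False)          # N x N correlation matrix
    dist = np.sqrt(2.0 * (1.0 - corr))                 # Mantegna distance (assumed choice)
    g = nx.Graph()
    n = dist.shape[0]
    for i in range(n):
        for j in range(i + 1, n):
            g.add_edge(i, j, weight=dist[i, j])
    mst = nx.minimum_spanning_tree(g)
    root = max(mst.degree, key=lambda kv: kv[1])[0]    # most connected node as root
    levels = nx.single_source_shortest_path_length(mst, root)
    return np.mean(list(levels.values()))

# Moving-window use: slide a one-year window over the return series, record
# mst_hierarchy for each window, and look for sudden jumps as crisis indicators.
```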
A view of metals through the terahertz window
NASA Astrophysics Data System (ADS)
Dodge, Steve
2006-05-01
As electrons move through a metal, interaction with their environment tends to slow them down, causing the Drude peak in the optical conductivity to become narrower. The resulting peak width is typically in the terahertz frequency range that sits between microwaves and the far infrared, too fast for conventional electronics and too slow for conventional infrared spectroscopy. With femtosecond laser techniques, however, coherent, broadband terahertz radiation can now be generated and detected with exquisite sensitivity, providing a new window onto electronic interactions in metals. I will discuss the application of this technique to a variety of metallic systems, including elemental lead, the nearly magnetic oxide metal CaRuO3, and CrV alloys that span the quantum phase transition from spin-density wave to paramagnetic metal. M. A. Gilmore, S. Kamal, D. M. Broun, and J. S. Dodge, Appl. Phys. Lett. 88, 141910 (2006).
NASA Astrophysics Data System (ADS)
Wang, Xiaohua; Rong, Mingzhe; Qiu, Juan; Liu, Dingxin; Su, Biao; Wu, Yi
A new type of algorithm for predicting the mechanical faults of a vacuum circuit breaker (VCB) based on an artificial neural network (ANN) is proposed in this paper. There are two types of mechanical faults in a VCB: operation mechanism faults and tripping circuit faults. An angle displacement sensor is used to measure the main axle angle displacement which reflects the displacement of the moving contact, to obtain the state of the operation mechanism in the VCB, while a Hall current sensor is used to measure the trip coil current, which reflects the operation state of the tripping circuit. Then an ANN prediction algorithm based on a sliding time window is proposed in this paper and successfully used to predict mechanical faults in a VCB. The research results in this paper provide a theoretical basis for the realization of online monitoring and fault diagnosis of a VCB.
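A hedged sketch of how a sliding time window over a monitored waveform can be turned into training samples for a prediction ANN; the feature layout, window length, and one-step-ahead target are illustrative assumptions, not the authors' network design:

```python
import numpy as np

def make_windows(signal, win, horizon=1):
    """Turn a 1-D monitoring signal (e.g. samples of the main-axle angle displacement
    or the trip-coil current) into (X, y) pairs: each row of X holds `win` consecutive
    samples and y is the value `horizon` steps ahead, for one-step-ahead prediction."""
    X, y = [], []
    for i in range(len(signal) - win - horizon + 1):
        X.append(signal[i:i + win])
        y.append(signal[i + win + horizon - 1])
    return np.array(X), np.array(y)

# The (X, y) pairs can be fed to any regression ANN; a large gap between the predicted
# and measured waveform on a new switching operation flags a possible mechanical fault.
```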
Veldre, Aaron; Andrews, Sally
2014-01-01
Two experiments used the gaze-contingent moving-window paradigm to investigate whether reading comprehension and spelling ability modulate the perceptual span of skilled adult readers during sentence reading. Highly proficient reading and spelling were both associated with increased use of information to the right of fixation, but did not systematically modulate the extraction of information to the left of fixation. Individuals who were high in both reading and spelling ability showed the greatest benefit from window sizes larger than 11 characters, primarily because of increases in forward saccade length. They were also significantly more disrupted by being denied close parafoveal information than those poor in reading and/or spelling. These results suggest that, in addition to supporting rapid lexical retrieval of fixated words, the high-quality lexical representations indexed by the combination of high reading and spelling ability support efficient processing of parafoveal information and effective saccadic targeting.
The X-windows interactive navigation data editor
NASA Technical Reports Server (NTRS)
Rinker, G. C.
1992-01-01
A new computer program called the X-Windows Interactive Data Editor (XIDE) was developed and demonstrated as a prototype application for editing radio metric data in the orbit-determination process. The program runs on a variety of workstations and employs pull-down menus and graphical displays, which allow users to easily inspect and edit radio metric data in the orbit data files received from the Deep Space Network (DSN). The XIDE program is based on the Open Software Foundation OSF/Motif Graphical User Interface (GUI) and has proven to be an efficient tool for editing radio metric data in the navigation operations environment. It was adopted by the Magellan Navigation Team as their primary data-editing tool. Because the software was designed from the beginning to be portable, the prototype was successfully moved to new workstation environments. It was also integrated into the design of the next-generation software tool for DSN multimission navigation interactive launch support.
Scientific Data Analysis Toolkit: A Versatile Add-in to Microsoft Excel for Windows
ERIC Educational Resources Information Center
Halpern, Arthur M.; Frye, Stephen L.; Marzzacco, Charles J.
2018-01-01
Scientific Data Analysis Toolkit (SDAT) is a rigorous, versatile, and user-friendly data analysis add-in application for Microsoft Excel for Windows (PC). SDAT uses the familiar Excel environment to carry out most of the analytical tasks used in data analysis. It has been designed for student use in manipulating and analyzing data encountered in…
Reading direction and the central perceptual span in Urdu and English.
Paterson, Kevin B; McGowan, Victoria A; White, Sarah J; Malik, Sameen; Abedipour, Lily; Jordan, Timothy R
2014-01-01
Normal reading relies on the reader making a series of saccadic eye movements along lines of text, separated by brief fixational pauses during which visual information is acquired from a region of text. In English and other alphabetic languages read from left to right, the region from which useful information is acquired during each fixational pause is generally reported to extend further to the right of each fixation than to the left. However, the asymmetry of the perceptual span for alphabetic languages read in the opposite direction (i.e., from right to left) has received much less attention. Accordingly, in order to more fully investigate the asymmetry in the perceptual span for these languages, the present research assessed the influence of reading direction on the perceptual span for bilingual readers of Urdu and English. Text in Urdu and English was presented either entirely as normal or in a gaze-contingent moving-window paradigm in which a region of text was displayed as normal at the reader's point of fixation and text outside this region was obscured. The windows of normal text extended symmetrically 0.5° of visual angle to the left and right of fixation, or asymmetrically by increasing the size of each window to 1.5° or 2.5° to either the left or right of fixation. When participants read English, performance for the window conditions was superior when windows extended to the right. However, when reading Urdu, performance was superior when windows extended to the left, and was essentially the reverse of that observed for English. These findings provide a novel indication that the perceptual span is modified by the language being read to produce an asymmetry in the direction of reading and show for the first time that such an asymmetry occurs for reading Urdu.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, P; Tsai, Y; Nien, H
2015-06-15
Purpose: Four-dimensional computed tomography (4DCT) scans reliably record the whole respiratory phase and generate internal target volumes (ITV) for radiotherapy planning. However, image guiding with cone-beam computed tomography (CBCT) cannot acquire all or specific respiratory phases. This study was designed to investigate the correlation between average CT and Maximum Intensity Projection (MIP) images from 4DCT and CBCT. Methods: Retrospective respiratory gating was performed on a GE Discovery CT590 RT. 4DCT and CBCT data from a CRIS Dynamic Thorax Phantom with a simulated breathing mode were analyzed. The lung-tissue-equivalent material encompassed a 3 cm sphere of tissue-equivalent material. The simulated breathing cycle period was set as 4 seconds, 5 seconds, and 6 seconds to represent variation in patient breathing cycle time, and the sphere moved in the inferior and superior directions with 1 cm amplitude, simulating lung tumor motion during respiration. Results: Under the lung window, the volume ratio of CBCT scans to ITVs derived from 10-phase average scans was 1.00 ± 0.02, and 1.03 ± 0.03 for the ratio of CBCT scans to MIP scans. Under the abdomen window, the ratio of CBCT scans to ITVs derived from 10-phase average scans was 0.39 ± 0.06, and 0.06 ± 0.00 for the ratio of CBCT scans to MIP scans. There was a significant difference between the lung window results and the abdomen window results. To reduce image guiding uncertainty, the CBCT window was set with width 500 and level -250. The ratio of CBCT scans to ITVs derived from 4-phase average scans with the abdomen window was 1.19 ± 0.02, and 1.06 ± 0.01 for the ratio of CBCT to MIP scans. Conclusion: CBCT images with a suitable window width and level can efficiently reduce image guiding uncertainty for patients with mobile tumors. With our settings, we can match the moving tumor to the gated tumor location on the planning CT more accurately, neglecting other motion artifacts during CBCT scans.
NASA Astrophysics Data System (ADS)
Salinas Solé, Celia; Peña Angulo, Dhais; Gonzalez Hidalgo, Jose Carlos; Brunetti, Michele
2017-04-01
In this poster we apply the moving-window approach (see Poster I of this collection) to trends in spring and monthly (March, April, May) mean values of maximum (Tmax) and minimum (Tmin) temperature over mainland Spain, in order to detect the effects of period length and starting year. The monthly series belong to the Monthly Temperature dataset of Spanish mainland (MOTEDAS), whose gridded format contains 5236 pixels of monthly series (10x10 km). The threshold used in the spatial analyses is 20% of the land area under a significant trend (p<0.05). The most striking results are as follows: • Seasonal Tmax shows a positive and significant global trend until the mid-1980s, with more than 75% of the area affected from the 1954-2010 to the 1979-2010 windows, afterwards reduced to the northern region; from 1985-2010 onwards no significant trend is detected. The monthly analyses differ: the March trend is not significant (<20% of the area) since 1974-2010, while significant trends in April and May occur between 1961-2010/1979-2010 and 1965-2010/1980-2010 respectively, clearly located in the northern midlands and the Mediterranean coastland. • Spring Tmin trends are significant (>20%) in all temporal windows, although the NW does not show a significant global trend, and in the most recent temporal windows only the SE is significantly affected. The monthly analyses also differ: no significant trend is detected in March from 1979-2010, or in May from 1985-2010, April being the only month with more than 20% of land affected by a significant trend in every temporal window. • Spatial differences are detected between windows (South-North in March, East-West in April-May). We conclude that the spring Tmax trend varies dramatically with the temporal window and that no significance has been detected in recent decades; northern areas and the Mediterranean coastland appear to be the most affected. The monthly Tmax spatial analyses confirm the heterogeneity of diurnal temperatures, with different spatial gradients detected between months across windows. Seasonal Tmin shows a more uniform temporal pattern, although spatial gradients of significance between months are detected, in some respects opposite to those observed for Tmax.
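A minimal sketch of the moving-window trend screening used above, applied to a single synthetic temperature series (all data and variable names are illustrative, not MOTEDAS values): the end year is fixed at 2010, the start year slides forward, and a least-squares slope and p-value are recorded for every window; aggregating the p<0.05 flags over all pixels would then give the share of land under significant trend.

import numpy as np
from scipy import stats

# Synthetic spring Tmax series, 1951-2010 (degrees C); stand-in for one gridded pixel.
rng = np.random.default_rng(0)
years = np.arange(1951, 2011)
tmax = 14.0 + 0.02 * (years - years[0]) + rng.normal(0.0, 0.6, years.size)

end_year = 2010
results = []
for start in range(1951, end_year - 19):            # require at least ~20 years per window
    mask = (years >= start) & (years <= end_year)
    slope, _, _, p_value, _ = stats.linregress(years[mask], tmax[mask])
    results.append((start, slope * 10.0, p_value))  # slope expressed per decade

for start, slope_dec, p in results[::5]:
    flag = "significant" if p < 0.05 else "not significant"
    print(f"{start}-{end_year}: {slope_dec:+.2f} C/decade ({flag}, p={p:.3f})")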
Two-dimensional correlation spectroscopy — Biannual survey 2007-2009
NASA Astrophysics Data System (ADS)
Noda, Isao
2010-06-01
The publication activities in the field of 2D correlation spectroscopy are surveyed with the emphasis on papers published during the last two years. Pertinent review articles and conference proceedings are discussed first, followed by the examination of noteworthy developments in the theory and applications of 2D correlation spectroscopy. Specific topics of interest include Pareto scaling, analysis of randomly sampled spectra, 2D analysis of data obtained under multiple perturbations, evolution of 2D spectra along additional variables, comparison and quantitative analysis of multiple 2D spectra, orthogonal sample design to eliminate interfering cross peaks, quadrature orthogonal signal correction and other data transformation techniques, data pretreatment methods, moving window analysis, extension of kernel and global phase angle analysis, covariance and correlation coefficient mapping, variant forms of sample-sample correlation, and different display methods. Various static and dynamic perturbation methods used in 2D correlation spectroscopy, e.g., temperature, composition, chemical reactions, H/D exchange, physical phenomena like sorption, diffusion and phase transitions, optical and biological processes, are reviewed. Analytical probes used in 2D correlation spectroscopy include IR, Raman, NIR, NMR, X-ray, mass spectrometry, chromatography, and others. Application areas of 2D correlation spectroscopy are diverse, encompassing synthetic and natural polymers, liquid crystals, proteins and peptides, biomaterials, pharmaceuticals, food and agricultural products, solutions, colloids, surfaces, and the like.
Shakil, Sadia; Lee, Chin-Hui; Keilholz, Shella Dawn
2016-01-01
A promising recent development in the study of brain function is the dynamic analysis of resting-state functional MRI scans, which can enhance understanding of normal cognition and alterations that result from brain disorders. One widely used method of capturing the dynamics of functional connectivity is sliding window correlation (SWC). However, in the absence of a “gold standard” for comparison, evaluating the performance of the SWC in typical resting-state data is challenging. This study uses simulated networks (SNs) with known transitions to examine the effects of parameters such as window length, window offset, window type, noise, filtering, and sampling rate on the SWC performance. The SWC time course was calculated for all node pairs of each SN and then clustered using the k-means algorithm to determine how resulting brain states match known configurations and transitions in the SNs. The outcomes show that the detection of state transitions and durations in the SWC is most strongly influenced by the window length and offset, followed by noise and filtering parameters. The effect of the image sampling rate was relatively insignificant. Tapered windows provide less sensitivity to state transitions than rectangular windows, which could be the result of the sharp transitions in the SNs. Overall, the SWC gave poor estimates of correlation for each brain state. Clustering based on the SWC time course did not reliably reflect the underlying state transitions unless the window length was comparable to the state duration, highlighting the need for new adaptive window analysis techniques. PMID:26952197
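The sliding window correlation and clustering pipeline described above can be illustrated with a short sketch; the toy node time series, window length, offset, and number of clusters below are placeholders, not the study's settings.

import numpy as np
from itertools import combinations
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_nodes, n_tr = 6, 600                       # toy "simulated network": 6 nodes, 600 time points
ts = rng.normal(size=(n_nodes, n_tr))
ts[1] += 0.8 * ts[0]                         # impose some shared signal between nodes 0 and 1

win_len, offset = 60, 5                      # window length and step, in samples
starts = range(0, n_tr - win_len + 1, offset)
pairs = list(combinations(range(n_nodes), 2))

# SWC matrix: one row per window, one column per node pair.
swc = np.array([[np.corrcoef(ts[i, s:s + win_len], ts[j, s:s + win_len])[0, 1]
                 for (i, j) in pairs] for s in starts])

# Cluster windows into putative "brain states".
states = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(swc)
print(swc.shape, np.bincount(states))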
Yao, Jingyu; Jia, Lin; Khan, Naheed; Zheng, Qiong-Duan; Moncrief, Ashley; Hauswirth, William W.; Thompson, Debra A.; Zacks, David N.
2012-01-01
Purpose AAV-mediated gene therapy in the rd10 mouse, with retinal degeneration caused by mutation in the rod cyclic guanosine monophosphate phosphodiesterase β-subunit (PDEβ) gene, produces significant, but transient, rescue of photoreceptor structure and function. This study evaluates the ability of AAV-mediated delivery of X-linked inhibitor of apoptosis (XIAP) to enhance and prolong the efficacy of PDEβ gene-replacement therapy. Methods Rd10 mice were bred and housed in darkness. Two groups of animals were generated: Group 1 received sub-retinal AAV5-XIAP or AAV5-GFP at postnatal age (P) 4 or 21 days; Group 2 received sub-retinal AAV5-XIAP plus AAV5-PDEβ, AAV5-GFP plus AAV5-PDEβ, or AAV-PDEβ alone at age P4 or P21. Animals were maintained for an additional 4 weeks in darkness before being moved to a cyclic-light environment. A subset of animals from Group 1 received a second sub-retinal injection of AAV8-733-PDEβ two weeks after being moved to the light. Histology, immunohistochemistry, Western blots, and electroretinograms were performed at different times after moving to the light. Results Injection of AAV5-XIAP alone at P4 and 21 resulted in significant slowing of light-induced retinal degeneration, as measured by outer nuclear thickness and cell counts, but did not result in improved outer segment structure and rhodopsin localization. In contrast, co-injection of AAV5-XIAP and AAV5-PDEβ resulted in increased levels of rescue and decreased rates of retinal degeneration compared to treatment with AAV5-PDEβ alone. Mice treated with AAV5-XIAP at P4, but not P21, remained responsive to subsequent rescue by AAV8-733-PDEβ when injected two weeks after moving to a light-cycling environment. Conclusions Adjunctive treatment with the anti-apoptotic gene XIAP confers additive protective effect to gene-replacement therapy with AAV5-PDEβ in the rd10 mouse. In addition, AAV5-XIAP, when given early, can increase the age at which gene-replacement therapy remains effective, thus effectively prolonging the window of opportunity for therapeutic intervention. PMID:22615940
A case study of exposure to ultrafine particles from secondhand tobacco smoke in an automobile.
Liu, S; Zhu, Y
2010-10-01
Secondhand tobacco smoke (SHS) in enclosed spaces is a major source of potentially harmful airborne particles. To quantify exposure to ultrafine particles (UFP) due to SHS and to investigate the interaction between pollutants from SHS and vehicular emissions, the number concentration and size distribution of UFP and other air pollutants (CO, CO2, and PM2.5) were measured inside a moving vehicle under five different ventilation conditions. A major interstate freeway with a speed limit of 60 mph and an urban roadway with a speed limit of 30 mph were selected to represent typical urban routes. In a typical 30-min commute on urban roadways, the SHS of one cigarette exposed passengers to approximately 10 times the UFP and 120 times the PM2.5 of ambient air. The most effective solution to protect passengers from SHS exposure is to abstain from smoking in the vehicle. Opening a window is an effective method for decreasing pollutant exposures on most urban roadways. However, under road conditions with high UFP concentrations, such as tunnels or busy freeways with a high proportion of heavy-duty diesel trucks (such as the 710 Freeway in Los Angeles, CA, USA), opening a window is not a viable method to reduce UFPs. Time budget studies show that Americans spend, on average, more than 60 min each day in enclosed vehicles. Smoking inside vehicles can expose the driver and other passengers to high levels of pollutants. Thus, an understanding of the variations and interactions of secondhand tobacco smoke (SHS) and vehicular emissions under realistic driving conditions is necessary. Results of this study indicated that high ventilation rates can effectively dilute ultrafine particles (UFP) inside moving vehicles on urban routes. However, driving with open windows and an increased air exchange rate (AER) is not recommended in tunnels and on heavily travelled freeways.
Settling into an increasingly hostile world: the rapidly closing "recruitment window" for corals.
Arnold, Suzanne N; Steneck, Robert S
2011-01-01
Free space is necessary for larval recruitment in all marine benthic communities. Settling corals, with limited energy to invest in competitive interactions, are particularly vulnerable during settlement into well-developed coral reef communities. This situation may be exacerbated for corals settling into coral-depauperate reefs where succession in nursery microhabitats moves rapidly toward heterotrophic organisms inhospitable to settling corals. To study effects of benthic organisms (at millimeter to centimeter scales) on newly settled corals and their survivorship we deployed terra-cotta coral settlement plates at 10 m depth on the Mesoamerican Barrier Reef in Belize and monitored them for 38 mo. During the second and third years, annual recruitment rates declined by over 50% from the previous year. Invertebrate crusts (primarily sponges) were absent at the start of the experiment but increased in abundance annually from 39, 60, to 73% of the plate undersides by year three. Subsequently, substrates hospitable to coral recruitment, including crustose coralline algae, biofilmed terra-cotta and polychaete tubes, declined. With succession, substrates upon which spat settled shifted toward organisms inimical to survivorship. Over 50% of spat mortality was due to overgrowth by sponges alone. This result suggests that when a disturbance creates primary substrate a "recruitment window" for settling corals exists from approximately 9 to 14 mo following the disturbance. During the window, early-succession, facilitating species are most abundant. The window closes as organisms hostile to coral settlement and survivorship overgrow nursery microhabitats.
Clipping polygon faces through a polyhedron of vision
NASA Technical Reports Server (NTRS)
Florence, Judit K. (Inventor); Rohner, Michel A. (Inventor)
1980-01-01
A flight simulator combines flight data and polygon face terrain data to provide a CRT display at each window of the simulated aircraft. The data base specifies the relative position of each vertex of each polygon face therein. Only those terrain faces currently appearing within the pyramid of vision defined by the pilot's eye and the edges of the pilot's window need be displayed at any given time. As the orientation of the pyramid of vision changes in response to flight data, the displayed faces are correspondingly displaced, eventually moving out of the pyramid of vision. Faces which are currently not visible (outside the pyramid of vision) are clipped from the data flow. In addition, faces which are only partially outside of the pyramid of vision are reconstructed to eliminate the outside portion. Window coordinates are generated defining the distance between each vertex and each of the boundary planes forming the pyramid of vision. The sign bit of each window coordinate indicates whether the vertex is on the pyramid-of-vision side of the associated boundary plane (positive) or on the other side thereof (negative). The set of sign bits accompanying each vertex constitutes the outcode of that vertex. The outcodes (O.C.) are systematically processed and examined to determine which faces are completely inside the pyramid of vision (Case A: all signs positive), which faces are completely outside (Case C: all signs negative), and which faces must be reconstructed (Case B: both positive and negative signs).
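A hedged sketch of the outcode idea described in this abstract: each vertex gets one sign bit per boundary plane of a toy viewing pyramid, and faces are classified into the three cases. The plane normals, the inside test, and the simplified handling of Case C are illustrative assumptions, not the patented implementation.

import numpy as np

# Boundary planes of a toy "pyramid of vision": each row is an inward-pointing normal (nx, ny, nz)
# with the apex at the origin, so a vertex v is inside a plane when dot(n, v) >= 0.
planes = np.array([
    [ 1.0,  0.0, 0.5],   # left
    [-1.0,  0.0, 0.5],   # right
    [ 0.0,  1.0, 0.5],   # bottom
    [ 0.0, -1.0, 0.5],   # top
])

def outcode(vertex):
    """One bit per plane: 0 if the vertex is on the visible side, 1 otherwise."""
    signed = planes @ np.asarray(vertex)
    return sum((1 << k) for k, d in enumerate(signed) if d < 0.0)

def classify_face(vertices):
    codes = [outcode(v) for v in vertices]
    if all(c == 0 for c in codes):
        return "A: completely inside"      # drawn as-is
    if all(c != 0 for c in codes):
        # Simplified outside test (every vertex fails some plane); the abstract's
        # literal criterion is "all signs negative", and a full clipper would also
        # handle faces that straddle the apex region.
        return "C: completely outside"
    return "B: straddles a boundary"       # reconstructed to remove the outside portion

face = [(0.2, 0.1, 1.0), (2.5, 0.0, 1.0), (0.0, -0.3, 1.0)]
print(classify_face(face))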
Does preprocessing change nonlinear measures of heart rate variability?
Gomes, Murilo E D; Guimarães, Homero N; Ribeiro, Antônio L P; Aguirre, Luis A
2002-11-01
This work investigated if methods used to produce a uniformly sampled heart rate variability (HRV) time series significantly change the deterministic signature underlying the dynamics of such signals and some nonlinear measures of HRV. Two methods of preprocessing were used: the convolution of inverse interval function values with a rectangular window and the cubic polynomial interpolation. The HRV time series were obtained from 33 Wistar rats submitted to autonomic blockade protocols and from 17 healthy adults. The analysis of determinism was carried out by the method of surrogate data sets and nonlinear autoregressive moving average modelling and prediction. The scaling exponents alpha, alpha(1) and alpha(2) derived from the detrended fluctuation analysis were calculated from raw HRV time series and respective preprocessed signals. It was shown that the technique of cubic interpolation of HRV time series did not significantly change any nonlinear characteristic studied in this work, while the method of convolution only affected the alpha(1) index. The results suggested that preprocessed time series may be used to study HRV in the field of nonlinear dynamics.
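One of the two preprocessings compared above, cubic interpolation of the non-uniform beat-to-beat series onto a uniform grid, can be sketched as follows; the RR data and the 4 Hz resampling rate are invented for illustration.

import numpy as np
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(2)
rr = 0.8 + 0.05 * rng.standard_normal(300)      # toy RR intervals in seconds
beat_times = np.cumsum(rr)                      # time of each beat (non-uniform sampling grid)

# Cubic interpolation of the RR series onto a uniform 4 Hz grid.
fs = 4.0
t_uniform = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
hrv_uniform = CubicSpline(beat_times, rr)(t_uniform)

print(hrv_uniform.shape, hrv_uniform.mean())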
Background subtraction for fluorescence EXAFS data of a very dilute dopant Z in Z + 1 host.
Medling, Scott; Bridges, Frank
2011-07-01
When conducting EXAFS at the Cu K-edge for ZnS:Cu with very low Cu concentration (<0.04% Cu), a large background was present that increased with energy. This background arises from a Zn X-ray Raman peak, which moves through the Cu fluorescence window, plus the tail of the Zn fluorescence peak. This large background distorts the EXAFS and must be removed separately before reducing the data. A simple means to remove this background is described.
Modeling laser-plasma acceleration in the laboratory frame
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2011-01-01
A simulation of laser-plasma acceleration in the laboratory frame. Both the laser and the wakefield buckets must be resolved over the entire domain of the plasma, requiring many cells and many time steps. While researchers often use a simulation window that moves with the pulse, this reduces only the multitude of cells, not the multitude of time steps. For an artistic impression of how to solve the simulation by using the boosted-frame method, watch the video "Modeling laser-plasma acceleration in the wakefield frame".
The Optical Gravitational Lensing Experiment. UBVI Photometry of Stars in Baade's Window
NASA Astrophysics Data System (ADS)
Paczynski, B.; Udalski, A.; Szymanski, M.; Kubiak, M.; Pietrzynski, G.; Soszynski, I.; Wozniak, P.; Zebrun, K.
1999-09-01
We present UBVI photometry for 8530 stars in Baade's Window obtained during the OGLE-II microlensing survey. Among these are over one thousand red clump giants. 1391 of them have photometry with errors smaller than 0.04, 0.06, 0.12, and 0.20 mag in the I, V, B, and U-band, respectively. We constructed a map of interstellar reddening. The corrected colors of the red clump giants: (U-B)_0, (B-V)_0, and (V-I)_0 are very well correlated, indicating that a single parameter determines the observed spread of their values, reaching almost 2 mag in the (U-B)_0. It seems most likely that heavy element content is the dominant parameter, but it is possible that another parameter: the age (or mass) of a star moves it along the same trajectory in the color-color diagram as the metallicity. The current ambiguity can be resolved with spectral analysis, and our catalog may be useful as a finding list of red clump giants. We point out that these K giants are more suitable for a fair determination of the distribution of metallicity than brighter M giants. We also present a compilation of UBVI data for 308 red clump giants near the Sun, for which Hipparcos parallaxes are more accurate than 10%. Spectral analysis of their metallicity may provide information about the local metallicity distribution as well as the extent to which mass (age) of these stars affects their colors. It is remarkable that in spite of a number of problems, stellar models agree with observations at the 0.1-0.2 mag level, making red clump giants not only the best calibrated but also the best understood standard candle.
El-Deftar, Moteaa M; Speers, Naomi; Eggins, Stephen; Foster, Simon; Robertson, James; Lennard, Chris
2014-08-01
A commercially available laser-induced breakdown spectroscopy (LIBS) instrument was evaluated for the determination of elemental composition of twenty Australian window glass samples, consisting of 14 laminated samples and 6 non-laminated samples (or not otherwise specified) collected from broken windows at crime scenes. In this study, the LIBS figures of merit were assessed in terms of accuracy, limits of detection and precision using three standard reference materials (NIST 610, 612, and 1831). The discrimination potential of LIBS was compared to that obtained using laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), X-ray microfluorescence spectroscopy (μXRF) and scanning electron microscopy energy dispersive X-ray spectrometry (SEM-EDX) for the analysis of architectural window glass samples collected from crime scenes in the Canberra region, Australia. Pairwise comparisons were performed using a three-sigma rule, two-way ANOVA and Tukey's HSD test at 95% confidence limit in order to investigate the discrimination power for window glass analysis. The results show that the elemental analysis of glass by LIBS provides a discrimination power greater than 97% (>98% when combined with refractive index data), which was comparable to the discrimination powers obtained by LA-ICP-MS and μXRF. These results indicate that LIBS is a feasible alternative to the more expensive LA-ICP-MS and μXRF options for the routine forensic analysis of window glass samples. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Purged window apparatus. [On-line spectroscopic analysis of gas flow systems]
Ballard, E.O.
1982-04-05
A purged window apparatus is described which utilizes tangentially injected heated purge gases in the vicinity of electromagnetic-radiation-transmitting windows and a tapered external mounting tube to accelerate these gases. This provides a vortex flow on the window surface and a turbulent flow throughout the mounting tube, thereby preventing backstreaming of the flowing gases under investigation in a chamber to which a plurality of similar purged apparatus are attached. As a consequence, spectroscopic analyses can be undertaken for lengthy periods without the necessity of interrupting the flow to clean or replace the windows due to contamination.
Air Traffic Complexity Measurement Environment (ACME): Software User's Guide
NASA Technical Reports Server (NTRS)
1996-01-01
A user's guide for the Air Traffic Complexity Measurement Environment (ACME) software is presented. The ACME consists of two major components, a complexity analysis tool and user interface. The Complexity Analysis Tool (CAT) analyzes complexity off-line, producing data files which may be examined interactively via the Complexity Data Analysis Tool (CDAT). The Complexity Analysis Tool is composed of three independently executing processes that communicate via PVM (Parallel Virtual Machine) and Unix sockets. The Runtime Data Management and Control process (RUNDMC) extracts flight plan and track information from a SAR input file, and sends the information to GARP (Generate Aircraft Routes Process) and CAT (Complexity Analysis Task). GARP in turn generates aircraft trajectories, which are utilized by CAT to calculate sector complexity. CAT writes flight plan, track and complexity data to an output file, which can be examined interactively. The Complexity Data Analysis Tool (CDAT) provides an interactive graphic environment for examining the complexity data produced by the Complexity Analysis Tool (CAT). CDAT can also play back track data extracted from System Analysis Recording (SAR) tapes. The CDAT user interface consists of a primary window, a controls window, and miscellaneous pop-ups. Aircraft track and position data is displayed in the main viewing area of the primary window. The controls window contains miscellaneous control and display items. Complexity data is displayed in pop-up windows. CDAT plays back sector complexity and aircraft track and position data as a function of time. Controls are provided to start and stop playback, adjust the playback rate, and reposition the display to a specified time.
Opto-mechanical design of optical window for aero-optics effect simulation instruments
NASA Astrophysics Data System (ADS)
Wang, Guo-ming; Dong, Dengfeng; Zhou, Weihu; Ming, Xing; Zhang, Yan
2016-10-01
A complete theory is established for the opto-mechanical design of the window in this paper, which makes the design more rigorous. The design consists of three steps. First, a universal model of the aerodynamic environment is established based on computational fluid dynamics, and the pneumatic pressure distribution and temperature data on the optical window surface are obtained for an aircraft flying at 5-30 km altitude, 0.5-3 Ma speed and 0-30° angle of attack. The temperature and pressure distributions corresponding to the most demanding case are selected as the initial external conditions on the optical window surface. Second, the optical window and its mechanical structure are designed; the mechanical structure must meet the requirements of security and tightness. Finally, a rigorous analysis and evaluation of the designed opto-mechanical structure is carried out in two parts. First, a fluid-solid-heat coupled model is built using finite element analysis, from which the deformation of the glass and the structure is obtained in order to assess the feasibility of the designed optical window and its ancillary structure. Second, the deformed optical surface is fitted with Zernike polynomials, which allows the impact of the window deformation on the imaging quality of the spectral camera to be evaluated.
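The final step, fitting the deformed optical surface with Zernike polynomials, can be sketched as a least-squares fit of a few low-order terms; the deformation data, the chosen terms, and the sampling below are illustrative assumptions rather than the paper's actual model.

import numpy as np

# Low-order Zernike terms on the unit disk (piston, tilts, defocus, astigmatisms).
def zernike_basis(rho, theta):
    return np.column_stack([
        np.ones_like(rho),                       # Z0: piston
        rho * np.cos(theta),                     # Z1: tilt x
        rho * np.sin(theta),                     # Z2: tilt y
        2.0 * rho**2 - 1.0,                      # Z3: defocus
        rho**2 * np.cos(2.0 * theta),            # Z4: astigmatism 0/90
        rho**2 * np.sin(2.0 * theta),            # Z5: astigmatism 45
    ])

# Toy FEA output: sampled window deformation (sag, in micrometres) over the clear aperture.
rng = np.random.default_rng(3)
rho = np.sqrt(rng.uniform(0.0, 1.0, 2000))
theta = rng.uniform(0.0, 2.0 * np.pi, 2000)
sag = 0.8 * (2.0 * rho**2 - 1.0) + 0.1 * rho * np.cos(theta) + 0.02 * rng.standard_normal(rho.size)

# Least-squares fit of the deformed surface; the coefficients feed the image-quality evaluation.
A = zernike_basis(rho, theta)
coeffs, *_ = np.linalg.lstsq(A, sag, rcond=None)
residual_rms = np.std(sag - A @ coeffs)
print(np.round(coeffs, 3), residual_rms)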
DOE Office of Scientific and Technical Information (OSTI.GOV)
Belle R. Upadhyaya; J. Wesley Hines
2004-09-27
Integrity monitoring and flaw diagnostics of flat beams and tubular structures were investigated in this research task using guided acoustic signals. A piezo-sensor suite was deployed to activate and collect Lamb wave signals that propagate along metallic specimens. The dispersion curves of Lamb waves along plate and tubular structures were generated through numerical analysis. Several advanced techniques were explored to extract representative features from the acoustic time series. Among them, the Hilbert-Huang transform (HHT) is a recently developed technique for the analysis of nonlinear and transient signals. A moving-window method was introduced to generate local peak characters from the acoustic time series, and a zooming-window technique was developed to localize the structural flaws. Time-frequency analysis and pattern recognition techniques were combined to classify structural defects in brass tubes. Several types of flaws in brass tubes were tested, both in air and in water. The techniques also proved to be effective under background/process noise. A detailed theoretical analysis of Lamb wave propagation was performed and simulations were carried out using the finite element software system ABAQUS. This analytical study confirmed the behavior of the acoustic signals acquired from the experimental studies. The report presents the background of the analysis of acoustic signals acquired from piezo-electric transducers for structural defect monitoring. A comparison of time-frequency techniques, including the Hilbert-Huang transform, is presented. The report presents the theoretical study of Lamb wave propagation in flat beams and tubular structures, and the need for mode separation in order to effectively perform defect diagnosis. The results of an extensive experimental study of the detection, location, and isolation of structural defects in flat aluminum beams and brass tubes are presented. The results of this research show the feasibility of on-line monitoring of small structural flaws by the use of transient and nonlinear acoustic signal analysis, and its implementation through the proper design of a piezo-electric transducer suite.
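A rough sketch of the moving-window and zooming-window idea on a toy acoustic record (the signal, window sizes, and flaw echo are invented): a coarse pass extracts the local peak amplitude of each window, and a finer pass around the strongest window localizes the echo.

import numpy as np

rng = np.random.default_rng(4)
fs = 1.0e6                                          # 1 MHz sampling, illustrative
t = np.arange(0, 2e-3, 1.0 / fs)
signal = 0.02 * rng.standard_normal(t.size)
signal[1200:1400] += np.hanning(200) * np.sin(2 * np.pi * 2e5 * t[1200:1400])  # toy flaw echo

def local_peaks(x, win, step):
    """Peak amplitude of each moving window; a coarse 'local peak character' feature."""
    starts = np.arange(0, x.size - win + 1, step)
    return starts, np.array([np.max(np.abs(x[s:s + win])) for s in starts])

# Coarse pass over the whole record, then a 'zoomed' pass around the strongest window.
starts, peaks = local_peaks(signal, win=256, step=128)
coarse = starts[np.argmax(peaks)]
zoom_starts, zoom_peaks = local_peaks(signal[coarse:coarse + 512], win=32, step=8)
flaw_sample = coarse + zoom_starts[np.argmax(zoom_peaks)]
print(f"flaw echo localized near t = {flaw_sample / fs * 1e3:.3f} ms")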
Independent Orbiter Assessment (IOA): Analysis of the purge, vent and drain subsystem
NASA Technical Reports Server (NTRS)
Bynum, M. C., III
1987-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter PV and D (Purge, Vent and Drain) Subsystem hardware. The PV and D Subsystem controls the environment of unpressurized compartments and window cavities, senses hazardous gases, and purges Orbiter/ET Disconnect. The subsystem is divided into six systems: Purge System (controls the environment of unpressurized structural compartments); Vent System (controls the pressure of unpressurized compartments); Drain System (removes water from unpressurized compartments); Hazardous Gas Detection System (HGDS) (monitors hazardous gas concentrations); Window Cavity Conditioning System (WCCS) (maintains clear windows and provides pressure control of the window cavities); and External Tank/Orbiter Disconnect Purge System (prevents cryo-pumping/icing of disconnect hardware). Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode. Four of the sixty-two failure modes analyzed were determined as single failures which could result in the loss of crew or vehicle. A possible loss of mission could result if any of twelve single failures occurred. Two of the criticality 1/1 failures are in the Window Cavity Conditioning System (WCCS) outer window cavity, where leakage and/or restricted flow will cause failure to depressurize/repressurize the window cavity. Two criticality 1/1 failures represent leakage and/or restricted flow in the Orbiter/ET disconnect purge network which prevent cryopumping/icing of disconnect hardware.
Data in support of energy performance of double-glazed windows.
Shakouri, Mahmoud; Banihashemi, Saeed
2016-06-01
This paper provides the data used in a research project to propose a new simplified windows rating system based on saved annual energy ("Developing an empirical predictive energy-rating model for windows by using Artificial Neural Network" (Shakouri Hassanabadi and Banihashemi Namini, 2012) [1], "Climatic, parametric and non-parametric analysis of energy performance of double-glazed windows in different climates" (Banihashemi et al., 2015) [2]). A full factorial simulation study was conducted to evaluate the performance of 26 different types of windows in a four-story residential building. In order to generalize the results, the selected windows were tested in four climates of cold, tropical, temperate, and hot and arid; and four different main orientations of North, West, South and East. The accompanied datasets include the annual saved cooling and heating energy in different climates and orientations by using the selected windows. Moreover, a complete dataset is provided that includes the specifications of 26 windows, climate data, month, and orientation of the window. This dataset can be used to make predictive models for energy efficiency assessment of double glazed windows.
Numerical and experimental validation for the thermal transmittance of windows with cellular shades
Hart, Robert
2018-02-21
Some highly energy efficient window attachment products are available today, but more rapid market adoption would be facilitated by fair performance metrics. It is important to have validated simulation tools to provide a basis for this analysis. This paper outlines a review and validation of the ISO 15099 center-of-glass zero-solar-load heat transfer correlations for windows with cellular shades. Thermal transmittance was measured experimentally, simulated using computational fluid dynamics (CFD) analysis, and simulated utilizing correlations from ISO 15099 as implemented in Berkeley Lab WINDOW and THERM software. CFD analysis showed ISO 15099 underestimates heat flux of rectangular cavities by up to 60% when aspect ratio (AR) = 1 and overestimates heat flux up to 20% when AR = 0.5. CFD analysis also showed that wave-type surfaces of cellular shades have less than 2% impact on heat flux through the cavities and less than 5% for natural convection of room-side surface. WINDOW was shown to accurately represent heat flux of the measured configurations to a mean relative error of 0.5% and standard deviation of 3.8%. Finally, several shade parameters showed significant influence on correlation accuracy, including distance between shade and glass, inconsistency in cell stretch, size of perimeter gaps, and the mounting hardware.
The window of visibility: A psychological theory of fidelity in time-sampled visual motion displays
NASA Technical Reports Server (NTRS)
Watson, A. B.; Ahumada, A. J., Jr.; Farrell, J. E.
1983-01-01
Many visual displays, such as movies and television, rely upon sampling in the time domain. The spatiotemporal frequency spectra for some simple moving images are derived and illustrations of how these spectra are altered by sampling in the time domain are provided. A simple model of the human perceiver which predicts the critical sample rate required to render sampled and continuous moving images indistinguishable is constructed. The rate is shown to depend upon the spatial and temporal acuity of the observer, and upon the velocity and spatial frequency content of the image. Several predictions of this model are tested and confirmed. The model is offered as an explanation of many of the phenomena known as apparent motion. Finally, the implications of the model for computer-generated imagery are discussed.
Zhong, Ran; Xie, Haiyang; Kong, Fanzhi; Zhang, Qiang; Jahan, Sharmin; Xiao, Hua; Fan, Liuyin; Cao, Chengxi
2016-09-21
In this work, we developed the concept of enzyme catalysis-electrophoresis titration (EC-ET) under ideal conditions, the theory of EC-ET for multiplex enzymatic assay (MEA), and a related method based on a moving reaction boundary (MRB) chip with a collateral channel and cell phone imaging. As a proof of principle, the model enzymes horseradish peroxidase (HRP), laccase and myeloperoxidase (MPO) were chosen for the tests of the EC-ET model. The experiments revealed that the EC-ET model could be achieved via coupling EC with ET within a MRB chip; particularly the MEA analyses of catalysis rate, maximum rate, activity, Km and Kcat could be conducted via a single run of the EC-ET chip, systemically demonstrating the validity of the EC-ET theory. Moreover, the developed method had these merits: (i) two orders of magnitude higher sensitivity than a fluorescence microplate reader, (ii) simplicity and low cost, and (iii) fairly rapid (30 min incubation, 20 s imaging) analysis, fair stability (<5.0% RSD) and accuracy, thus validating the EC-ET method. Finally, the developed EC-ET method was used for the clinical assay of MPO activity in blood samples; the values of MPO activity detected via the EC-ET chip were in agreement with those obtained by a traditional fluorescence microplate reader, indicating the applicability of the EC-ET method. The work opens a window for the development of enzymatic research, enzyme assay, immunoassay, and point-of-care testing as well as titration, one of the oldest methods of analysis, based on a simple chip.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olivas, Eric Richard
2016-02-26
A conjugate heat transfer and thermal structural analysis was completed, with the objective of determining the following: Lead bismuth eutectic (LBE) peak temperature, free convective velocity patterns in the LBE, peak beam window temperature, and thermal stress/deformation in the window.
Galias, Zbigniew
2017-05-01
An efficient method to find positions of periodic windows for the quadratic map f(x)=ax(1-x) and a heuristic algorithm to locate the majority of wide periodic windows are proposed. Accurate rigorous bounds of positions of all periodic windows with periods below 37 and the majority of wide periodic windows with longer periods are found. Based on these results, we prove that the measure of the set of regular parameters in the interval [3,4] is above 0.613960137. The properties of periodic windows are studied numerically. The results of the analysis are used to estimate that the true value of the measure of the set of regular parameters is close to 0.6139603.
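A non-rigorous numerical counterpart of the search described above: the sketch below iterates f(x)=ax(1-x) from the critical point, flags parameters whose attractor appears periodic with period at most 36, and reports the fraction of "regular" parameters on a grid. The tolerance, transient length, and grid density are arbitrary choices, so the estimate is only a rough lower bound on the rigorous measure reported in the paper.

import numpy as np

def attractor_period(a, max_period=36, n_transient=2000, tol=1e-9):
    """Smallest period <= max_period of the attractor of x -> a*x*(1-x), or 0 if none is detected."""
    x = 0.5                                   # the critical point; its orbit tracks the attracting cycle
    for _ in range(n_transient):
        x = a * x * (1.0 - x)
    orbit = np.empty(max_period + 1)
    orbit[0] = x
    for k in range(1, max_period + 1):
        x = a * x * (1.0 - x)
        orbit[k] = x
    for p in range(1, max_period + 1):
        # Weakly attracting cycles may not converge within tol, so this screen underestimates.
        if abs(orbit[p] - orbit[0]) < tol:
            return p
    return 0

# Scan a grid of parameters in [3, 4] and report the fraction that look periodic up to period 36.
a_grid = np.linspace(3.0, 4.0, 2001)
periods = np.array([attractor_period(a) for a in a_grid])
print("approx. fraction of regular parameters:", np.mean(periods > 0))
print("sample periodic parameters:", [(round(a, 4), p) for a, p in zip(a_grid, periods) if p][:5])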
NASA Astrophysics Data System (ADS)
Noda, Isao
2014-07-01
A comprehensive survey review of new and noteworthy developments, which are advancing forward the frontiers in the field of 2D correlation spectroscopy during the last four years, is compiled. This review covers books, proceedings, and review articles published on 2D correlation spectroscopy, a number of significant conceptual developments in the field, data pretreatment methods and other pertinent topics, as well as patent and publication trends and citation activities. Developments discussed include projection 2D correlation analysis, concatenated 2D correlation, and correlation under multiple perturbation effects, as well as orthogonal sample design, predicting 2D correlation spectra, manipulating and comparing 2D spectra, correlation strategy based on segmented data blocks, such as moving-window analysis, features like determination of sequential order and enhanced spectral resolution, statistical 2D spectroscopy using covariance and other statistical metrics, hetero-correlation analysis, and sample-sample correlation technique. Data pretreatment operations prior to 2D correlation analysis are discussed, including the correction for physical effects, background and baseline subtraction, selection of reference spectrum, normalization and scaling of data, derivatives spectra and deconvolution technique, and smoothing and noise reduction. Other pertinent topics include chemometrics and statistical considerations, peak position shift phenomena, variable sampling increments, computation and software, display schemes, such as color coded format, slice and power spectra, tabulation, and other schemes.
Maps of averaged spectral deviations from soil lines and their comparison with traditional soil maps
NASA Astrophysics Data System (ADS)
Rukhovich, D. I.; Rukhovich, A. D.; Rukhovich, D. D.; Simakova, M. S.; Kulyanitsa, A. L.; Bryzzhev, A. V.; Koroleva, P. V.
2016-07-01
The analysis of 34 cloudless fragments of Landsat 5, 7, and 8 images (1985-2014) covering the Plavsk, Arsen'evsk, and Chern districts of Tula oblast has been performed. It is shown that the bare soil surface on the RED-NIR plots derived from the images cannot be described as a sector of the spectral plane, as can be done for NDVI values. The notion of the spectral neighborhood of the soil line (SNSL) is suggested. It is defined as the set of points of the RED-NIR spectral space that have the spectral characteristics of the bare soil used for constructing soil lines. A way of separating the SNSL along the line of lowest point-concentration density in the RED-NIR spectral space is suggested; this line separates the bare soil surface from vegetating plants. The SNSL has been applied to construct a soil line (SL) for each of the 34 images and to delineate the bare soil surface on them. Distances from the points with averaged RED-NIR coordinates to the SL have been calculated using a moving-window method. These distances can be referred to as averaged spectral deviations (ASDs). The calculations have been performed strictly for the SNSL areas. As a result, 34 maps of ASDs have been created; these maps contain ASD values for the 6036 points of the grid used in the study. Then, an integral map of normalized ASD values has been built, taking into account the number of points participating in the calculation (i.e., lying in the SNSL) within the moving window. The integral map of ASD values has been compared with four traditional soil maps of the studied territory. It is shown that this integral map can be interpreted in terms of soil taxa: the areas of seven soil subtypes (soddy moderately podzolic, soddy slightly podzolic, light gray forest, gray forest, dark gray forest, podzolized chernozems, and leached chernozems) belonging to three soil types (soddy-podzolic, gray forest, and chernozemic soils) can be delineated on it.
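A compact sketch of the soil-line and averaged-spectral-deviation computation on synthetic data (reflectances, coordinates, grid, and window size are all invented): a soil line is fitted to bare-soil RED-NIR points, signed distances to the line are computed, and their mean is taken inside each moving window.

import numpy as np

rng = np.random.default_rng(5)

# Toy RED/NIR reflectances for bare-soil pixels (the SNSL would restrict to such points).
red = rng.uniform(0.05, 0.35, 5000)
nir = 1.2 * red + 0.03 + rng.normal(0.0, 0.01, red.size)   # soil line plus scatter
x = rng.uniform(0.0, 100.0, red.size)                       # pixel easting, km (illustrative)
y = rng.uniform(0.0, 100.0, red.size)                       # pixel northing, km

# Fit the soil line NIR = a*RED + b on the bare-soil points.
a, b = np.polyfit(red, nir, 1)

# Perpendicular (signed) distance of every pixel from the soil line.
dist = (nir - (a * red + b)) / np.hypot(a, 1.0)

# Averaged spectral deviation (ASD) on a coarse grid using a moving window.
half = 10.0                                                  # window half-size, km
grid = np.arange(5.0, 100.0, 10.0)
asd = np.full((grid.size, grid.size), np.nan)
for i, gx in enumerate(grid):
    for j, gy in enumerate(grid):
        inside = (np.abs(x - gx) <= half) & (np.abs(y - gy) <= half)
        if inside.sum() >= 30:                               # keep only cells with enough SNSL points
            asd[i, j] = dist[inside].mean()
print(np.nanmin(asd), np.nanmax(asd))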
The Evolution of Root Zone Storage Capacity after Land Use Change
NASA Astrophysics Data System (ADS)
Nijzink, Remko C.; Hutton, Christopher; Pechlivanidis, Ilias; Capell, René; Arheimer, Berit; Wagener, Thorsten; Savenije, Hubert H. G.; Hrachowitz, Markus
2016-04-01
Root zone storage capacity is a crucial parameter in ecosystem functioning, as it determines the partitioning between runoff and transpiration. There is increasing evidence from case studies of specific plants that vegetation adapts to drought conditions; for example, over the long term trees try to restore their internal hydraulic conductivity after droughts by allocating more biomass to roots. In spite of this understanding, the water storage capacity of the root zone is often treated as constant in hydrological models. In this study, it was hypothesized that root zone storage capacities are altered by deforestation and the regrowth of the ecosystem. Three deforested sub-catchments, as well as unaffected nearby control catchments, of the experimental forests of HJ Andrews and Hubbard Brook were selected for this purpose. On the one hand, root zone storage capacities were estimated by a climate-based approach similar to Gao et al. (2014), making use of simple water balance considerations to determine the evaporative demand of the system; in this way, the maximum deficit between evaporative demand and precipitation allows a robust estimation of the root zone storage capacity. On the other hand, three conceptual hydrological models (FLEX, HYPE, HYMOD) were calibrated in a moving-window approach for all catchments. The obtained model parameter values representing the root zone storage capacities of the individual catchments for each moving-window period were then compared to the estimates derived from climate data for the same periods. Model- and climate-derived estimates of root zone storage capacities showed a similar evolution. In the deforested catchments, considerable reductions of the root zone storage capacities, compared to the pre-treatment situation and the control catchments, were observed. In addition, the years after forest clearing were characterized by a gradual recovery of the root zone storage capacities, converging to new equilibrium conditions and linked to forest regrowth. Further trend analysis suggested a relatively quick hydrological recovery of between 5 and 15 years in the study catchments. The results lend evidence to the role of both climate and vegetation dynamics in the development of root zone systems and their controlling influence on hydrological response dynamics.
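The climate-based estimate mentioned above can be sketched as a running water-balance deficit; the daily forcing below is synthetic and the accounting is a simplified stand-in for the approach of Gao et al. (2014).

import numpy as np

rng = np.random.default_rng(6)
days = 3 * 365
precip = rng.gamma(shape=0.4, scale=6.0, size=days)        # mm/day, toy daily precipitation
demand = np.clip(2.5 + 1.5 * np.sin(2 * np.pi * np.arange(days) / 365.0), 0.0, None)  # mm/day evaporative demand

def root_zone_storage_capacity(p, e):
    """Largest cumulative storage deficit the vegetation must bridge (a water-balance estimate)."""
    deficit, max_deficit = 0.0, 0.0
    for pi, ei in zip(p, e):
        deficit = max(deficit + ei - pi, 0.0)   # deficit grows when demand exceeds rain, never negative
        max_deficit = max(max_deficit, deficit)
    return max_deficit

print(f"estimated root zone storage capacity: {root_zone_storage_capacity(precip, demand):.1f} mm")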
Macro-meso-microsystems integration in LTCC : LDRD report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Smet, Dennis J.; Nordquist, Christopher Daniel; Turner, Timothy Shawn
2007-03-01
Low Temperature Cofired Ceramic (LTCC) has proven to be an enabling medium for microsystem technologies, because of its desirable electrical, physical, and chemical properties coupled with its capability for rapid prototyping and scalable manufacturing of components. LTCC is viewed as an extension of hybrid microcircuits, and in that function it enables development, testing, and deployment of silicon microsystems. However, its versatility has allowed it to succeed as a microsystem medium in its own right, with applications in non-microelectronic meso-scale devices and in a range of sensor devices. Applications include silicon microfluidic "chip-and-wire" systems and fluid grid array (FGA)/microfluidic multichip modules using embedded channels in LTCC, and cofired electro-mechanical systems with moving parts. Both the microfluidic and mechanical system applications are enabled by sacrificial volume materials (SVM), which serve to create and maintain cavities and separation gaps during the lamination and cofiring process. SVMs consisting of thermally fugitive or partially inert materials are easily incorporated. Recognizing the premium on devices that are cofired rather than assembled, we report on functional-as-released and functional-as-fired moving parts. Additional applications for cofired transparent windows, some as small as an optical fiber, are also described. The applications described help pave the way for widespread application of LTCC to biomedical, control, analysis, characterization, and radio frequency (RF) functions for macro-meso-microsystems.
Zhang, Liu-Xia; Cao, Yi-Ren; Xiao, Hua; Liu, Xiao-Ping; Liu, Shao-Rong; Meng, Qing-Hua; Fan, Liu-Yin; Cao, Cheng-Xi
2016-03-15
In the present work we describe a simple, rapid and quantitative analytical method for the detection of different proteins present in biological samples. For this purpose, we proposed the model of titration of double protein (TDP) and its leverage theory, which rely on the retardation signal of chip moving reaction boundary electrophoresis (MRBE). The leverage principle states that the product of the first protein content and its absolute retardation signal is equal to the product of the second protein content and its absolute retardation signal. To demonstrate the model, we first derived the leverage principle theoretically. Relevant experiments were then conducted on the TDP-MRBE chip. The results revealed that (i) the leverage principle of the retardation signal holds for the TDP of two pure proteins, and (ii) a lever also exists for two complex protein samples, clearly demonstrating the validity of the TDP model and leverage theory in the MRBE chip. It was also shown that the proposed technique can provide a rapid and simple quantitative analysis of two protein samples in a mixture. Finally, we successfully applied the developed technique to the quantification of soymilk in adulterated infant formula. The TDP-MRBE opens up a new window for detecting the adulteration ratio of a poor-quality food (milk) blended into a high-quality one. Copyright © 2015 Elsevier B.V. All rights reserved.
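Written as an equation, the leverage principle takes a simple form (the symbols are illustrative: m for protein content, |\Delta R| for the absolute retardation signal):

m_1 \, |\Delta R_1| \;=\; m_2 \, |\Delta R_2| \quad\Longrightarrow\quad \frac{m_1}{m_2} \;=\; \frac{|\Delta R_2|}{|\Delta R_1|}

so the unknown content of one protein follows from the known content of the other and the ratio of the measured retardation signals.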
1st- and 2nd-order motion and texture resolution in central and peripheral vision
NASA Technical Reports Server (NTRS)
Solomon, J. A.; Sperling, G.
1995-01-01
STIMULI. The 1st-order stimuli are moving sine gratings. The 2nd-order stimuli are fields of static visual texture, whose contrasts are modulated by moving sine gratings. Neither the spatial slant (orientation) nor the direction of motion of these 2nd-order (microbalanced) stimuli can be detected by a Fourier analysis; they are invisible to Reichardt and motion-energy detectors. METHOD. For these dynamic stimuli, when presented both centrally and in an annular window extending from 8 to 10 deg in eccentricity, we measured the highest spatial frequency for which discrimination between +/- 45 deg texture slants and discrimination between opposite directions of motion were each possible. RESULTS. For sufficiently low spatial frequencies, slant and direction can be discriminated in both central and peripheral vision, for both 1st- and for 2nd-order stimuli. For both 1st- and 2nd-order stimuli, at both retinal locations, slant discrimination is possible at higher spatial frequencies than direction discrimination. For both 1st- and 2nd-order stimuli, motion resolution decreases 2-3 times more rapidly with eccentricity than does texture resolution. CONCLUSIONS. (1) 1st- and 2nd-order motion scale similarly with eccentricity. (2) 1st- and 2nd-order texture scale similarly with eccentricity. (3) The central/peripheral resolution fall-off is 2-3 times greater for motion than for texture.
Self spectrum window method in wigner-ville distribution.
Liu, Zhongguo; Liu, Changchun; Liu, Boqiang; Lv, Yangsheng; Lei, Yinsheng; Yu, Mengsun
2005-01-01
Wigner-Ville distribution (WVD) is an important type of time-frequency analysis in biomedical signal processing. The cross-term interference in the WVD has a disadvantageous influence on its application. In this research, the Self Spectrum Window (SSW) method was put forward to suppress the cross-term interference, based on the fact that the cross terms and auto-WVD terms in the integral kernel function are orthogonal. In the SSW algorithm, a real auto-WVD function is used as a template to cross-correlate with the integral kernel function, and the Short Time Fourier Transform (STFT) spectrum of the signal is used as a window function to process the WVD in the time-frequency plane. The SSW method was confirmed by computer simulation with good analysis results, and a satisfactory time-frequency distribution was obtained.
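A simplified numerical stand-in for the SSW idea (not the authors' exact algorithm): the discrete WVD of a two-tone signal is computed from the instantaneous autocorrelation, and the signal's STFT power spectrum is resampled onto the WVD grid and used as a time-frequency window, which attenuates the cross term midway between the tones. The test signal, window lengths, and nearest-neighbour resampling are illustrative choices.

import numpy as np
from scipy.signal import stft, hilbert

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 100 * t) + np.sin(2 * np.pi * 250 * t)  # two tones; WVD cross term near 175 Hz
z = hilbert(x)                                                   # analytic signal

# Discrete Wigner-Ville distribution from the instantaneous autocorrelation.
n = z.size
wvd = np.zeros((n, n))
for i in range(n):
    m = min(i, n - 1 - i)
    tau = np.arange(0, m + 1)
    kernel = np.zeros(n, dtype=complex)
    kernel[tau] = z[i + tau] * np.conj(z[i - tau])          # tau >= 0
    kernel[(n - tau[1:]) % n] = np.conj(kernel[tau[1:]])    # tau < 0, by conjugate symmetry
    wvd[:, i] = np.real(np.fft.fft(kernel))                 # bin k corresponds to f = k * fs / (2 * n)

# STFT power spectrum used as a time-frequency window ("self spectrum") over the WVD.
f_stft, t_stft, Z = stft(x, fs=fs, nperseg=128)
power = np.abs(Z) ** 2
f_wvd = np.arange(n) * fs / (2 * n)
fi = np.clip(np.round(f_wvd / f_stft[-1] * (f_stft.size - 1)).astype(int), 0, f_stft.size - 1)
ti = np.clip(np.round(t / t_stft[-1] * (t_stft.size - 1)).astype(int), 0, t_stft.size - 1)
wvd_masked = wvd * power[np.ix_(fi, ti)] / power.max()

# The cross term (around 175 Hz) is strongly attenuated while the auto terms are preserved.
k_cross = int(round(175.0 * 2 * n / fs))
k_auto = int(round(100.0 * 2 * n / fs))
print(np.abs(wvd[k_cross]).max(), np.abs(wvd_masked[k_cross]).max(), np.abs(wvd_masked[k_auto]).max())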
NASA Astrophysics Data System (ADS)
Genzano, Nicola; Filizzola, Carolina; Hattori, Katsumi; Lisi, Mariano; Paciello, Rossana; Pergola, Nicola; Tramutoli, Valerio
2017-04-01
In order to increase the reliability and precision of short-term seismic hazard assessment (and possibly of earthquake forecasting), the integration of different kinds of observations (chemical, physical, biological, etc.) in a multi-parametric approach could be a useful strategy. Among the different observational methodologies, fluctuations of the Earth's thermally emitted radiation, measured by satellite sensors operating in the thermal infrared (TIR) spectral range, have been proposed since the eighties as a potential earthquake precursor. Since 2001, the general change detection approach Robust Satellite Techniques (RST), used in combination with the RETIRA (Robust Estimator of TIR Anomalies) index, has shown good ability to discriminate anomalous TIR signals possibly associated with seismic activity from the normal variability of the TIR signal due to other causes (e.g. meteorological). In this paper, the RST data analysis approach has been applied to TIR satellite records collected over Japan by the geostationary satellite sensor MTSAT (Multifunctional Transport SATellites) in the period June 2005 - December 2015, in order to evaluate its possible contribution to an improved multi-parametric system for a time-Dependent Assessment of Seismic Hazard (t-DASH). For the first time, thermal anomalies have been identified by comparing the daily TIR radiation of each location within the considered satellite scenes with its historical expected value and variation range (i.e. the RST reference fields), computed using a 30-day moving window (i.e. 15 days before and 15 days after the considered day of the year) instead of a fixed monthly window. Preliminary results of a correlation analysis between the appearance of Significant Sequences of TIR Anomalies (SSTAs) and the time, location and magnitude of earthquakes (M≥5), performed by applying predefined space-time and magnitude constraints, show that 80% of SSTAs were in an apparent space-time relation with earthquakes, with a false alarm rate of 20%.
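A hedged, single-pixel sketch of the reference-field construction described above: for each day of the year, the mean and standard deviation are taken over a ±15-day window pooled across previous years, and the normalized deviation of the current observation flags candidate TIR anomalies. The record below is synthetic and the index is a simplification of the RETIRA definition.

import numpy as np

rng = np.random.default_rng(7)
years, doy = 10, 365
# Toy brightness-temperature record for one pixel: seasonal cycle + noise (+ one injected anomaly).
t = np.arange(years * doy)
record = 290.0 + 10.0 * np.sin(2 * np.pi * t / doy) + rng.normal(0.0, 1.5, t.size)
record[-100] += 8.0                                   # artificial "TIR anomaly" late in the last year
record = record.reshape(years, doy)

def retira_like_index(series, half_window=15):
    """Normalized deviation of each day of the last year from its multi-year +/-15-day reference."""
    ref_years, target = series[:-1], series[-1]
    index = np.empty(series.shape[1])
    for d in range(series.shape[1]):
        win = np.arange(d - half_window, d + half_window + 1) % series.shape[1]
        ref = ref_years[:, win].ravel()
        index[d] = (target[d] - ref.mean()) / ref.std()
    return index

idx = retira_like_index(record)
print("days exceeding 2 sigma:", np.where(idx > 2.0)[0])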
The software and algorithms for hyperspectral data processing
NASA Astrophysics Data System (ADS)
Shyrayeva, Anhelina; Martinov, Anton; Ivanov, Victor; Katkovsky, Leonid
2017-04-01
Hyperspectral remote sensing is widely used for collecting and processing information about the Earth's surface objects. Hyperspectral data are combined to form a three-dimensional (x, y, λ) data cube. The Department of Aerospace Research of the Institute of Applied Physical Problems of the Belarusian State University presents a general model of the software for hyperspectral image data analysis and processing. The software runs in a Windows XP/7/8/8.1/10 environment on any personal computer. It has been written in C++ using the Qt framework and OpenGL for graphical data visualization. The software has a flexible structure that consists of a set of independent plugins. Each plugin is compiled as a Qt Plugin and represents a Windows dynamic library (dll). Plugins can be categorized in terms of data reading types, data visualization (3D, 2D, 1D) and data processing. The software has various built-in functions for statistical and mathematical analysis and signal processing, such as direct moving-average smoothing, the Savitzky-Golay smoothing technique, RGB correction, histogram transformation, and atmospheric correction. The software provides two of the authors' engineering techniques for the solution of the atmospheric correction problem: an iterative method of refining the spectral albedo parameters using Libradtran, and an analytical least squares method. The main advantages of these methods are a high processing rate (several minutes for 1 GB of data) and a low relative error in albedo retrieval (less than 15%). The software also supports work with spectral libraries, region of interest (ROI) selection, and spectral analysis such as cluster-type image classification and automatic comparison of hypercube spectra, by a similarity criterion, with spectra from spectral libraries and vice versa. The software deals with different kinds of spectral information in order to identify and distinguish spectrally unique materials. The following advantages should also be noted: fast and low-memory hypercube manipulation, a user-friendly interface, modularity, and expandability.
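The smoothing functions mentioned above (direct moving average and Savitzky-Golay) can be illustrated on a synthetic spectrum; band positions, window length, and polynomial order are arbitrary example values, not the software's defaults.

import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(8)
wavelengths = np.linspace(400, 1000, 301)                      # nm, illustrative band set
spectrum = np.exp(-0.5 * ((wavelengths - 680) / 30.0) ** 2) + 0.02 * rng.standard_normal(wavelengths.size)

# Direct moving-average smoothing (boxcar convolution).
win = 11
smoothed_ma = np.convolve(spectrum, np.ones(win) / win, mode="same")

# Savitzky-Golay smoothing: a local polynomial fit that preserves band shape better than the boxcar.
smoothed_sg = savgol_filter(spectrum, window_length=11, polyorder=3)

print(np.std(spectrum - smoothed_ma), np.std(spectrum - smoothed_sg))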
Nanometer resolution optical coherence tomography using broad bandwidth XUV and soft x-ray radiation
Fuchs, Silvio; Rödel, Christian; Blinne, Alexander; ...
2016-02-10
Optical coherence tomography (OCT) is a non-invasive technique for cross-sectional imaging. It is particularly advantageous for applications where conventional microscopy is not able to image deeper layers of samples in a reasonable time, e.g. in fast moving, deeper lying structures. However, at infrared and optical wavelengths, which are commonly used, the axial resolution of OCT is limited to about 1 μm, even if the bandwidth of the light covers a wide spectral range. Here, we present extreme ultraviolet coherence tomography (XCT) and thus introduce a new technique for non-invasive cross-sectional imaging of nanometer structures. XCT exploits the nanometer-scale coherence lengths corresponding to the spectral transmission windows of, e.g., silicon samples. The axial resolution of coherence tomography is thus improved from micrometers to a few nanometers. Tomographic imaging with an axial resolution better than 18 nm is demonstrated for layer-type nanostructures buried in a silicon substrate. Using wavelengths in the water transmission window, nanometer-scale layers of platinum are retrieved with a resolution better than 8 nm. As a result, XCT as a nondestructive method for sub-surface tomographic imaging holds promise for several applications in semiconductor metrology and imaging in the water window.
Generating Daily Synthetic Landsat Imagery by Combining Landsat and MODIS Data
Wu, Mingquan; Huang, Wenjiang; Niu, Zheng; Wang, Changyao
2015-01-01
Owing to low temporal resolution and cloud interference, there is a shortage of high spatial resolution remote sensing data. To address this problem, this study introduces a modified spatial and temporal data fusion approach (MSTDFA) to generate daily synthetic Landsat imagery. This algorithm was designed to avoid the limitations of the conditional spatial temporal data fusion approach (STDFA), including the constant window for disaggregation and the sensor difference. An adaptive window size selection method is proposed in this study to select the best window size and moving steps for the disaggregation of coarse pixels. The linear regression method is used to remove the influence of differences in sensor systems using disaggregated mean coarse reflectance by testing and validation in two study areas located in Xinjiang Province, China. The results show that the MSTDFA algorithm can generate daily synthetic Landsat imagery with a high correlation coefficient (R) ranging from 0.646 to 0.986 between synthetic images and the actual observations. We further show that MSTDFA can be applied to 250 m 16-day MODIS MOD13Q1 products and the Landsat Normalized Difference Vegetation Index (NDVI) data by generating a synthetic NDVI image highly similar to the actual Landsat NDVI observation with a high R of 0.97. PMID:26393607
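The sensor-difference correction step can be sketched as a simple linear regression between disaggregated mean coarse reflectance and coincident fine reflectance; all values below are synthetic, and the disaggregation itself is assumed to have been done already.

import numpy as np

rng = np.random.default_rng(9)

# Disaggregated mean coarse (MODIS-like) reflectance and coincident fine (Landsat-like) reflectance
# for a set of training pixels on a common date; values are synthetic.
coarse = rng.uniform(0.05, 0.45, 500)
fine = 0.92 * coarse + 0.015 + rng.normal(0.0, 0.01, coarse.size)   # sensor difference + noise

# Linear regression removes the systematic sensor difference, as in the MSTDFA adjustment step.
gain, offset = np.polyfit(coarse, fine, 1)

# Apply the correction to coarse observations on the prediction date to obtain synthetic fine values.
coarse_prediction_date = rng.uniform(0.05, 0.45, 500)
synthetic_fine = gain * coarse_prediction_date + offset
r = np.corrcoef(fine, gain * coarse + offset)[0, 1]
print(f"gain={gain:.3f}, offset={offset:.4f}, training R={r:.3f}")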
NASA Technical Reports Server (NTRS)
2005-01-01
KENNEDY SPACE CENTER, FLA. From inside the viewing room of the Launch Control Center, KSC employees watch Space Shuttle Discovery as it creeps along the crawlerway toward the horizon and Launch Pad 39B at NASA's Kennedy Space Center. First motion of the Shuttle out of the Vehicle Assembly Building (VAB) was at 2:04 p.m. EDT. The Mobile Launcher Platform is moved by the Crawler-Transporter underneath. The Crawler is 20 feet high, 131 feet long and 114 feet wide. It moves on eight tracks, each containing 57 shoes, or cleats, weighing one ton each. Loaded with the Space Shuttle, the Crawler can move at a maximum speed of approximately 1 mile an hour. A leveling system in the Crawler keeps the Shuttle vertical while negotiating the 5 percent grade leading to the top of the launch pad. Launch of Discovery on its Return to Flight mission, STS-114, is targeted for May 15 with a launch window that extends to June 3. During its 12-day mission, Discovery's seven-person crew will test new hardware and techniques to improve Shuttle safety, as well as deliver supplies to the International Space Station. Discovery was moved on March 29 from the Orbiter Processing Facility to the VAB and attached to its propulsion elements, a redesigned ET and twin SRBs.
NASA Technical Reports Server (NTRS)
2005-01-01
KENNEDY SPACE CENTER, FLA. As Space Shuttle Discovery creeps along the crawlerway toward the horizon and Launch Pad 39B at NASA's Kennedy Space Center, media and workers in the foreground appear as ants. First motion of the Shuttle out of the Vehicle Assembly Building (VAB) was at 2:04 p.m. EDT. The Mobile Launcher Platform is moved by the Crawler-Transporter underneath. The Crawler is 20 feet high, 131 feet long and 114 feet wide. It moves on eight tracks, each containing 57 shoes, or cleats, weighing one ton each. Loaded with the Space Shuttle, the Crawler can move at a maximum speed of approximately 1 mile an hour. A leveling system in the Crawler keeps the Shuttle vertical while negotiating the 5 percent grade leading to the top of the launch pad. Launch of Discovery on its Return to Flight mission, STS-114, is targeted for May 15 with a launch window that extends to June 3. During its 12-day mission, Discovery's seven-person crew will test new hardware and techniques to improve Shuttle safety, as well as deliver supplies to the International Space Station. Discovery was moved on March 29 from the Orbiter Processing Facility to the VAB and attached to its propulsion elements, a redesigned ET and twin SRBs.
Seismic signal time-frequency analysis based on multi-directional window using greedy strategy
NASA Astrophysics Data System (ADS)
Chen, Yingpin; Peng, Zhenming; Cheng, Zhuyuan; Tian, Lin
2017-08-01
The Wigner-Ville distribution (WVD) is an important time-frequency analysis technique with a highly concentrated energy distribution, widely used in seismic signal processing. However, it is interfered with by many cross terms. To suppress the cross terms of the WVD while preserving the concentration of its energy distribution, an adaptive multi-directional filtering window in the ambiguity domain is proposed. Starting from the relationship between the Cohen class distribution and the Gabor transform, and combining the greedy strategy with the rotational invariance property of the fractional Fourier transform, the multi-directional window extends the one-dimensional, single-direction optimal window function of the optimal fractional Gabor transform (OFrGT) to a two-dimensional, multi-directional window in the ambiguity domain. In this way, the multi-directional window matches the main auto terms of the WVD more precisely. Using the greedy strategy, the proposed window takes into account the optimal direction as well as other suboptimal directions, which also resolves the local concentration phenomenon that affects the OFrGT when it encounters a multi-component signal. Experiments on several signal models and on real seismic signals show that the proposed window overcomes the drawbacks of the WVD and the OFrGT mentioned above. Finally, the proposed method is applied to the spectral decomposition of a seismic signal. The results show that the proposed method can delineate the spatial distribution of a reservoir more precisely.
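The Wigner-Ville distribution that the above method filters can be computed directly from the instantaneous autocorrelation of the analytic signal. Below is a minimal, textbook-style Python sketch of the discrete WVD (not the authors' ambiguity-domain multi-directional filtering); the test signal and record length are illustrative assumptions.

    import numpy as np
    from scipy.signal import hilbert

    def wigner_ville(x):
        """Minimal discrete Wigner-Ville distribution (rows: time, columns: frequency bins)."""
        z = hilbert(np.asarray(x, dtype=float))      # analytic signal
        n_samp = len(z)
        wvd = np.zeros((n_samp, n_samp))
        for n in range(n_samp):
            mmax = min(n, n_samp - 1 - n)            # largest lag that stays inside the record
            lags = np.arange(-mmax, mmax + 1)
            r = z[n + lags] * np.conj(z[n - lags])   # instantaneous autocorrelation at time n
            buf = np.zeros(n_samp, dtype=complex)
            buf[lags % n_samp] = r                   # wrap negative lags for the FFT
            wvd[n] = np.fft.fft(buf).real            # Fourier transform over the lag variable
        return wvd

    # toy two-component signal; the cross terms that the paper suppresses appear between components
    t = np.arange(256) / 256.0
    sig = np.cos(2 * np.pi * (20 * t + 40 * t ** 2)) + np.cos(2 * np.pi * 90 * t)
    tfr = wigner_ville(sig)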
High-temperature, high-pressure optical port for rocket engine applications
NASA Technical Reports Server (NTRS)
Delcher, Ray; Nemeth, ED; Powers, W. T.
1993-01-01
This paper discusses the design, fabrication, and test of a window assembly for instrumentation of liquid-fueled rocket engine hot gas systems. The window was designed to allow optical measurements of hot gas in the SSME fuel preburner and appears to be the first window designed for application in a rocket engine hot gas system. Such a window could allow the use of a number of remote optical measurement technologies, including Raman temperature and species concentration measurement, Rayleigh temperature measurements, flame emission monitoring, flow mapping, laser-induced fluorescence, and hardware imaging during engine operation. The window assembly has been successfully tested to 8,000 psi at 1000 F and over 11,000 psi at room temperature. A computer stress analysis shows the window will withstand high temperature and cryogenic thermal shock.
Experiencing simultanagnosia through windowed viewing of complex social scenes.
Dalrymple, Kirsten A; Birmingham, Elina; Bischof, Walter F; Barton, Jason J S; Kingstone, Alan
2011-01-07
Simultanagnosia is a disorder of visual attention, defined as an inability to see more than one object at once. It has been conceived as being due to a constriction of the visual "window" of attention, a metaphor that we examine in the present article. A simultanagnosic patient (SL) and two non-simultanagnosic control patients (KC and ES) described social scenes while their eye movements were monitored. These data were compared to a group of healthy subjects who described the same scenes under the same conditions as the patients, or through an aperture that restricted their vision to a small portion of the scene. Experiment 1 demonstrated that SL showed unusually low proportions of fixations to the eyes in social scenes, which contrasted with all other participants who demonstrated the standard preferential bias toward eyes. Experiments 2 and 3 revealed that when healthy participants viewed scenes through a window that was contingent on where they looked (Experiment 2) or where they moved a computer mouse (Experiment 3), their behavior closely mirrored that of patient SL. These findings suggest that a constricted window of visual processing has important consequences for how simultanagnosic patients explore their world. Our paradigm's capacity to mimic simultanagnosic behaviors while viewing complex scenes implies that it may be a valid way of modeling simultanagnosia in healthy individuals, providing a useful tool for future research. More broadly, our results support the thesis that people fixate the eyes in social scenes because they are informative to the meaning of the scene. Copyright © 2010 Elsevier B.V. All rights reserved.
An efficient pseudomedian filter for tiling microarrays.
Royce, Thomas E; Carriero, Nicholas J; Gerstein, Mark B
2007-06-07
Tiling microarrays are becoming an essential technology in the functional genomics toolbox. They have been applied to the tasks of novel transcript identification, elucidation of transcription factor binding sites, detection of methylated DNA and several other applications in several model organisms. These experiments are being conducted at increasingly finer resolutions as the microarray technology enjoys increasingly greater feature densities. The increased densities naturally lead to increased data analysis requirements. Specifically, the most widely employed algorithm for tiling array analysis involves smoothing observed signals by computing pseudomedians within sliding windows, an O(n² log n) calculation in each window. This poor time complexity is an issue for tiling array analysis and could prove to be a real bottleneck as tiling microarray experiments become grander in scope and finer in resolution. We therefore implemented Monahan's HLQEST algorithm that reduces the runtime complexity for computing the pseudomedian of n numbers to O(n log n) from O(n² log n). For a representative tiling microarray dataset, this modification reduced the smoothing procedure's runtime by nearly 90%. We then leveraged the fact that elements within sliding windows remain largely unchanged in overlapping windows (as one slides across genomic space) to further reduce computation by an additional 43%. This was achieved by the application of skip lists to maintaining a sorted list of values from window to window. This sorted list could be maintained with simple O(log n) inserts and deletes. We illustrate the favorable scaling properties of our algorithms with both time complexity analysis and benchmarking on synthetic datasets. Tiling microarray analyses that rely upon a sliding window pseudomedian calculation can require many hours of computation. We have eased this requirement significantly by implementing efficient algorithms that scale well with genomic feature density. This result not only speeds the current standard analyses, but also makes possible ones where many iterations of the filter may be required, such as might be required in a bootstrap or parameter estimation setting. Source code and executables are available at http://tiling.gersteinlab.org/pseudomedian/.
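For reference, the pseudomedian (Hodges-Lehmann estimator) of a window is the median of all pairwise (Walsh) averages, which is exactly the naive O(n² log n) per-window computation mentioned above. The Python sketch below implements that naive sliding-window filter; the window size and input signal are illustrative assumptions, and the HLQEST and skip-list speedups described in the abstract are not reproduced here.

    import numpy as np

    def pseudomedian(values):
        """Hodges-Lehmann estimator: median of all pairwise averages (naive O(n^2 log n))."""
        v = np.asarray(values, dtype=float)
        i, j = np.triu_indices(len(v))               # pairs (i, j) with i <= j, including i == j
        return np.median((v[i] + v[j]) / 2.0)

    def sliding_pseudomedian(signal, window=11):
        """Window-centred pseudomedian smoothing of a 1-D probe-intensity track."""
        half = window // 2
        out = np.full(len(signal), np.nan)
        for c in range(half, len(signal) - half):
            out[c] = pseudomedian(signal[c - half:c + half + 1])
        return out

    smoothed = sliding_pseudomedian(np.random.default_rng(1).normal(size=500), window=11)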
Thermal/structural/optical integrated design for optical sensor mounted on unmanned aerial vehicle
NASA Astrophysics Data System (ADS)
Zhang, Gaopeng; Yang, Hongtao; Mei, Chao; Wu, Dengshan; Shi, Kui
2016-01-01
With the rapid development of science and technology and the spur of numerous local conflicts around the world, high-altitude optical sensors mounted on unmanned aerial vehicles are being applied more and more widely in airborne remote sensing, measurement, and detection. To obtain high-quality images from an aero optical remote sensor, it is important to analyze its thermal-optical performance under high-speed, high-altitude conditions. For a key imaging assembly such as the optical window in particular, temperature variation and temperature gradients can cause defocus and aberrations in the optical system, which degrade image quality. To improve the optical performance of the optical window of a high-speed aerial camera, a thermal/structural/optical integrated design method is developed. First, the flight environment of the optical window is analyzed. Based on the theory of aerodynamics and heat transfer, the convective heat transfer coefficient is calculated. The temperature distribution of the optical window is simulated with finite element analysis software, and the maximum temperature difference between the inside and outside of the optical window is obtained. The deformation of the optical window under the boundary condition of this maximum temperature difference is then calculated. The surface deformation of the optical window is fitted with Zernike polynomials at the interface, and the fitted Zernike coefficients are imported into and analyzed with the CODE V optical software. Finally, the transfer function curves of the optical system over the temperature field are compared. The comparison shows that the optical path difference caused by thermal deformation of the optical window is 138.2 nm, which satisfies the PV ≤ λ/4 criterion. This study can serve as a useful reference for other optical window designs.
Smith, Lauren H; Hargrove, Levi J; Lock, Blair A; Kuiken, Todd A
2011-04-01
Pattern recognition-based control of myoelectric prostheses has shown great promise in research environments, but has not been optimized for use in a clinical setting. To explore the relationship between classification error, controller delay, and real-time controllability, 13 able-bodied subjects were trained to operate a virtual upper-limb prosthesis using pattern recognition of electromyogram (EMG) signals. Classification error and controller delay were varied by training different classifiers with a variety of analysis window lengths ranging from 50 to 550 ms and either two or four EMG input channels. Offline analysis showed that classification error decreased with longer window lengths (p < 0.01). Real-time controllability was evaluated with the target achievement control (TAC) test, which prompted users to maneuver the virtual prosthesis into various target postures. The results indicated that user performance improved with lower classification error (p < 0.01) and was reduced with longer controller delay (p < 0.01), as determined by the window length. Therefore, both of these effects should be considered when choosing a window length; it may be beneficial to increase the window length if this results in a reduced classification error, despite the corresponding increase in controller delay. For the system employed in this study, the optimal window length was found to be between 150 and 250 ms, which is within acceptable controller delays for conventional multistate amplitude controllers.
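The window-length trade-off can be made concrete with a windowed feature extractor: longer analysis windows yield smoother features (lower classification error) but also longer controller delay, since each decision reflects data up to roughly half a window old plus processing and update latency. The Python sketch below computes common time-domain EMG features over sliding windows; the sampling rate, window and step lengths, and the feature set are illustrative assumptions, not the study's exact pipeline.

    import numpy as np

    def emg_features(emg, fs=1000, window_ms=200, step_ms=50):
        """Sliding-window time-domain features (mean absolute value, waveform length,
        zero crossings) for an EMG record of shape (n_samples, n_channels)."""
        win = int(fs * window_ms / 1000)
        step = int(fs * step_ms / 1000)
        feats = []
        for start in range(0, emg.shape[0] - win + 1, step):
            seg = emg[start:start + win]
            mav = np.mean(np.abs(seg), axis=0)
            wl = np.sum(np.abs(np.diff(seg, axis=0)), axis=0)
            zc = np.sum(np.diff(np.signbit(seg).astype(np.int8), axis=0) != 0, axis=0)
            feats.append(np.concatenate([mav, wl, zc]))
        return np.array(feats)

    # crude controller-delay estimate for a window-based decision (processing time neglected)
    delay_ms = 200 / 2 + 50   # half the analysis window plus one update step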
NASA Astrophysics Data System (ADS)
Pavlis, Terry; Hurtado, Jose; Langford, Richard; Serpa, Laura
2014-05-01
Although many geologists refuse to admit it, it is time to put paper-based geologic mapping into the historical archives and move to the full potential of digital mapping techniques. For our group, flat map digital geologic mapping is now a routine operation in both research and instruction. Several software options are available, and basic proficiency with the software can be learned in a few hours of instruction and practice. The first practical field GIS software, ArcPad, remains a viable, stable option on Windows-based systems. However, the vendor seems to be moving away from ArcPad in favor of mobile software solutions that are difficult to implement without GIS specialists. Thus, we have pursued a second software option based on the open source program QGIS. Our QGIS system uses the same shapefile-centric data structure as our ArcPad system, including similar pop-up data entry forms and generic graphics for easy data management in the field. The advantage of QGIS is that the same software runs on virtually all common platforms except iOS, although the Android version remains unstable as of this writing. A third software option we are experimenting with for flat map-based field work is Fieldmove, a derivative of the 3D-capable program Move developed by Midland Valley. Our initial experiments with Fieldmove are positive, particularly with the new, inexpensive (< 300 Euros) Windows tablets. However, the lack of flexibility in data structure makes for cumbersome workflows when trying to interface our existing shapefile-centric data structures to Move. Nonetheless, in spring 2014 we will experiment with full-3D immersion in the field using the full Move software package in combination with ground-based LiDAR and photogrammetry. One new workflow suggested by our initial experiments is that field geologists should consider using photogrammetry software to capture 3D visualizations of key outcrops. This process is now straightforward in several software packages, and it affords a previously unheard-of potential for communicating the complexity of key exposures. For example, in studies of metamorphic structures we often search for days to find "Rosetta Stone" outcrops that display key geometric relationships. While conventional photographs rarely can capture the essence of the field exposure, capturing a true 3D representation of the exposure with multiple photos from many orientations can solve this communication problem. As spatial databases evolve, these 3D models should be readily importable into the database.
Li, Min; Yu, Bing-bing; Wu, Jian-hua; Xu, Lin; Sun, Gang
2013-01-01
Purpose As Doppler ultrasound has been proven to be an effective tool to predict and compress the optimal pulsing windows, we evaluated the effective dose and diagnostic accuracy of coronary CT angiography (CTA) incorporating Doppler-guided prospective electrocardiograph (ECG) gating, which presets pulsing windows according to Doppler analysis, in patients with a heart rate >65 bpm. Materials and Methods 119 patients with a heart rate >65 bpm who were scheduled for invasive coronary angiography were prospectively studied, and patients were randomly divided into traditional prospective (n = 61) and Doppler-guided prospective (n = 58) ECG gating groups. The exposure window of traditional prospective ECG gating was set at 30%–80% of the cardiac cycle. For the Doppler group, the length of diastasis was analyzed by Doppler. For lengths greater than 90 ms, the pulsing window was preset during diastole (during 60%–80%); otherwise, the optimal pulsing intervals were moved from diastole to systole (during 30%–50%). Results The mean heart rates of the traditional ECG and the Doppler-guided group during CT scanning were 75.0±7.7 bpm (range, 66–96 bpm) and 76.5±5.4 bpm (range: 66–105 bpm), respectively. The results indicated that whereas the image quality showed no significant difference between the traditional and Doppler groups (P = 0.42), the radiation dose of the Doppler group was significantly lower than that of the traditional group (5.2±3.4mSv vs. 9.3±4.5mSv, P<0.001). The sensitivities of CTA applying traditional and Doppler-guided prospective ECG gating to diagnose stenosis on a segment level were 95.5% and 94.3%, respectively; specificities 98.0% and 97.1%, respectively; positive predictive values 90.7% and 88.2%, respectively; negative predictive values 99.0% and 98.7%, respectively. There was no statistical difference in concordance between the traditional and Doppler groups (P = 0.22). Conclusion Doppler-guided prospective ECG gating represents an improved method in patients with a high heart rate to reduce effective radiation doses, while maintaining high diagnostic accuracy. PMID:23696793
14 CFR 417.229 - Far-field overpressure blast effects analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... characteristics; (2) The potential for broken windows due to peak incident overpressures below 1.0 psi and related... the potentially affected windows, including their size, location, orientation, glazing material, and...
Potential for Bias When Estimating Critical Windows for Air Pollution in Children's Health.
Wilson, Ander; Chiu, Yueh-Hsiu Mathilda; Hsu, Hsiao-Hsien Leon; Wright, Robert O; Wright, Rosalind J; Coull, Brent A
2017-12-01
Evidence supports an association between maternal exposure to air pollution during pregnancy and children's health outcomes. Recent interest has focused on identifying critical windows of vulnerability. An analysis based on a distributed lag model (DLM) can yield estimates of a critical window that are different from those from an analysis that regresses the outcome on each of the 3 trimester-average exposures (TAEs). Using a simulation study, we assessed bias in estimates of critical windows obtained using 3 regression approaches: 1) 3 separate models to estimate the association with each of the 3 TAEs; 2) a single model to jointly estimate the association between the outcome and all 3 TAEs; and 3) a DLM. We used weekly fine-particulate-matter exposure data for 238 births in a birth cohort in and around Boston, Massachusetts, and a simulated outcome and time-varying exposure effect. Estimates using separate models for each TAE were biased and identified incorrect windows. This bias arose from seasonal trends in particulate matter that induced correlation between TAEs. Including all TAEs in a single model reduced bias. DLM produced unbiased estimates and added flexibility to identify windows. Analysis of body mass index z score and fat mass in the same cohort highlighted inconsistent estimates from the 3 methods. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
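The three regression strategies compared above can be sketched in a few lines of Python. The simulation below uses white-noise weekly exposures and a known critical window purely for illustration; it therefore does not reproduce the seasonal correlation between trimester-average exposures (TAEs) that drives the bias reported in the paper, and all sample sizes and effect sizes are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    n, weeks = 500, 37
    X = rng.normal(size=(n, weeks))                      # weekly exposures (illustrative white noise)
    true_lag = np.zeros(weeks)
    true_lag[10:20] = 0.3                                # assumed critical window: weeks 10-19
    y = X @ true_lag + rng.normal(size=n)

    # 1) three separate trimester-average regressions (prone to bias when TAEs are correlated)
    tae = np.stack([X[:, :13].mean(1), X[:, 13:26].mean(1), X[:, 26:].mean(1)], axis=1)
    separate_slopes = [np.polyfit(tae[:, j], y, 1)[0] for j in range(3)]

    # 2) one joint model containing all three TAEs
    joint_coefs, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), tae]), y, rcond=None)

    # 3) unconstrained distributed lag model: one coefficient per week of exposure
    dlm_coefs, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), X]), y, rcond=None)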
2006-08-28
KENNEDY SPACE CENTER, FLA. - Crawler-transporter No. 2 makes its way toward Launch Pad 39B (in the background). The crawler is being moved nearby in the event the mission management team decides to roll back Space Shuttle Atlantis due to Hurricane Ernesto. The hurricane has been forecast on a heading north and east from Cuba, taking it along the eastern coast of Florida. NASA's lighted launch window extends to Sept. 13, but mission managers are hoping to launch on mission STS-115 by Sept. 7 to avoid a conflict with a Russian Soyuz rocket also bound for the International Space Station. The crawler is 131 feet long, 113 feet wide and 20 feet high. It weighs 5.5 million pounds unloaded. The combined weight of crawler, mobile launcher platform and a space shuttle is 12 million pounds. Unloaded, the crawler moves at 2 mph. Loaded, the snail's pace slows to 1 mph. Photo credit: NASA/Kim Shiflett
Unity connecting module moving to new site in SSPF
NASA Technical Reports Server (NTRS)
1998-01-01
In the Space Station Processing Facility (SSPF) Unity is suspended in air as it is moved to a new location in the SSPF. At right, visitors watch through a viewing window, part of the visitors' tour at the Center. As the primary payload on mission STS-88, scheduled to launch Dec. 3, 1998, Unity will be mated to the Russian-built Zarya control module, which should already be in orbit at that time. In the SSPF, Unity is undergoing testing such as the Pad Demonstration Test to verify the compatibility of the module with the Space Shuttle, as well as the ability of the astronauts to send and receive commands to Unity from the flight deck of the orbiter, and the common berthing mechanism to which other space station elements will dock. Unity is expected to be ready for installation into the payload canister on Oct. 25, and transported to Launch Pad 39-A on Oct. 27.
A time-frequency classifier for human gait recognition
NASA Astrophysics Data System (ADS)
Mobasseri, Bijan G.; Amin, Moeness G.
2009-05-01
Radar has established itself as an effective all-weather, day-or-night sensor. Radar signals can penetrate walls and provide information on moving targets. Recently, radar has been used as an effective biometric sensor for classification of gait. The return from a coherent radar system contains a frequency offset in the carrier frequency, known as the Doppler effect. The movements of arms and legs give rise to micro-Doppler signatures, which can be clearly detailed in the time-frequency domain using traditional or modern time-frequency signal representations. In this paper we propose a gait classifier based on subspace learning using principal component analysis (PCA). The training set consists of feature vectors defined as either time or frequency snapshots taken from the spectrogram of radar backscatter. We show that the gait signature is captured effectively in these feature vectors. The feature vectors are then used to train a minimum-distance classifier based on the Mahalanobis distance metric. Results show that gait classification with high accuracy and a short observation window is achievable using the proposed classifier.
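A minimal version of the described classifier is PCA on the training snapshots followed by a minimum Mahalanobis-distance rule in the learned subspace. The Python sketch below assumes the feature vectors (time or frequency snapshots of the spectrogram) have already been extracted as rows of a matrix; the subspace dimension and covariance regularisation are assumptions.

    import numpy as np

    class SubspaceMahalanobisClassifier:
        """PCA subspace learning followed by minimum Mahalanobis-distance classification."""

        def fit(self, X, y, n_components=10):
            self.mean_ = X.mean(axis=0)
            Xc = X - self.mean_
            _, _, vt = np.linalg.svd(Xc, full_matrices=False)   # principal components
            self.components_ = vt[:n_components]
            Z = Xc @ self.components_.T
            self.classes_ = np.unique(y)
            self.centroids_, self.inv_covs_ = [], []
            for c in self.classes_:
                Zc = Z[y == c]
                self.centroids_.append(Zc.mean(axis=0))
                cov = np.cov(Zc, rowvar=False) + 1e-6 * np.eye(n_components)  # regularised
                self.inv_covs_.append(np.linalg.inv(cov))
            return self

        def predict(self, X):
            Z = (X - self.mean_) @ self.components_.T
            d = np.stack([np.einsum('ij,jk,ik->i', Z - m, S, Z - m)   # squared Mahalanobis distance
                          for m, S in zip(self.centroids_, self.inv_covs_)], axis=1)
            return self.classes_[np.argmin(d, axis=1)]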
NASA Astrophysics Data System (ADS)
Sosson, Marc; Stephenson, Randell; Sheremet, Yevgeniya; Rolland, Yann; Adamia, Shota; Melkonian, Rafael; Kangarli, Talat; Yegorova, Tamara; Avagyan, Ara; Galoyan, Ghazar; Danelian, Taniel; Hässig, Marc; Meijers, Maud; Müller, Carla; Sahakyan, Lilit; Sadradze, Nino; Alania, Victor; Enukidze, Onice; Mosar, Jon
2016-01-01
We report new observations in the eastern Black Sea-Caucasus region that allow reconstructing the evolution of the Neotethys in the Cretaceous. At that time, the Neotethys oceanic plate was subducting northward below the continental Eurasia plate. Based on the analysis of the obducted ophiolites that crop out throughout Lesser Caucasus and East Anatolides, we show that a spreading center (AESA basin) existed within the Neotethys, between Middle Jurassic and Early Cretaceous. Later, the spreading center was carried into the subduction with the Neotethys plate. We argue that the subduction of the spreading center opened a slab window that allowed asthenospheric material to move upward, in effect thermally and mechanically weakening the otherwise strong Eurasia upper plate. The local weakness zone favored the opening of the Black Sea back-arc basins. Later, in the Late Cretaceous, the AESA basin obducted onto the Taurides-Anatolides-South Armenia Microplate (TASAM), which then collided with Eurasia along a single suture zone (AESA suture).
Non-monotonic temperature dependence of chaos-assisted diffusion in driven periodic systems
NASA Astrophysics Data System (ADS)
Spiechowicz, J.; Talkner, P.; Hänggi, P.; Łuczka, J.
2016-12-01
The spreading of a cloud of independent Brownian particles typically proceeds more effectively at higher temperatures, as it derives from the commonly known Sutherland-Einstein relation for systems in thermal equilibrium. Here, we report on a non-equilibrium situation in which the diffusion of a periodically driven Brownian particle moving in a periodic potential decreases with increasing temperature within a finite temperature window. We identify as the cause for this non-intuitive behaviour a dominant deterministic mechanism consisting of a few unstable periodic orbits embedded into a chaotic attractor together with thermal noise-induced dynamical changes upon varying temperature. The presented analysis is based on extensive numerical simulations of the corresponding Langevin equation describing the studied setup as well as on a simplified stochastic model formulated in terms of a three-state Markovian process. Because chaos exists in many natural as well as in artificial systems representing abundant areas of contemporary knowledge, the described mechanism may potentially be discovered in plentiful different contexts.
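The setup described, an overdamped Brownian particle in a periodic potential with a time-periodic drive, can be explored with a plain Euler-Maruyama integration of the Langevin equation and a crude diffusion estimate from the ensemble spread. The potential, drive amplitude, frequency and time step below are arbitrary placeholders, not the parameter window in which the paper finds the non-monotonic behaviour.

    import numpy as np

    def diffusion_estimate(temp, a=4.2, omega=4.9, dt=1e-3, n_steps=200_000, n_traj=512, seed=0):
        """Euler-Maruyama integration of dx = [-U'(x) + a*cos(omega*t)] dt + sqrt(2*T) dW,
        with U(x) = -cos(2*pi*x)/(2*pi); returns var(x)/(2*t) as a rough diffusion coefficient."""
        rng = np.random.default_rng(seed)
        x = np.zeros(n_traj)
        for k in range(n_steps):
            drift = -np.sin(2 * np.pi * x) + a * np.cos(omega * k * dt)
            x += drift * dt + np.sqrt(2 * temp * dt) * rng.normal(size=n_traj)
        return x.var() / (2 * n_steps * dt)

    # scan a temperature window; non-monotonic behaviour would appear as D falling while T rises
    temps = [0.05, 0.1, 0.2, 0.4]
    diffusions = [diffusion_estimate(T) for T in temps]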
Network Analyses for Space-Time High Frequency Wind Data
NASA Astrophysics Data System (ADS)
Laib, Mohamed; Kanevski, Mikhail
2017-04-01
Recently, network science has made an important contribution to the analysis, modelling and visualization of complex time series. Numerous methods have been proposed for constructing networks. This work studies spatio-temporal wind data by using networks based on the Granger causality test. Furthermore, a visual comparison is carried out with several data frequencies and different sizes of moving window. The main attention is paid to the temporal evolution of connectivity intensity. The Hurst exponent is applied to the resulting time series in order to explore whether there is a long connectivity memory. The results explore the space-time structure of wind data and can be applied to other environmental data. The dataset used presents a challenging case study. It consists of high frequency (10 minutes) wind data from 120 measuring stations in Switzerland, for a time period of 2012-2013. The distribution of stations covers different geomorphological zones and elevation levels. The results are compared with the Pearson correlation network as well.
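A moving-window Granger-causality network of this kind can be assembled from pairwise tests. The Python sketch below uses statsmodels' grangercausalitytests on one window of multivariate data and records a directed edge wherever the F-test p-value falls below a threshold; the lag order, significance level, and the week-long window with daily steps are assumptions, not the settings used in the study.

    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    def granger_adjacency(window_data, maxlag=3, alpha=0.05):
        """Directed adjacency matrix for one window of shape (n_samples, n_stations):
        adj[i, j] is True if station i Granger-causes station j at level alpha."""
        n_sta = window_data.shape[1]
        adj = np.zeros((n_sta, n_sta), dtype=bool)
        for i in range(n_sta):
            for j in range(n_sta):
                if i == j:
                    continue
                # statsmodels tests whether the second column Granger-causes the first
                res = grangercausalitytests(window_data[:, [j, i]], maxlag=maxlag, verbose=False)
                p_min = min(r[0]['ssr_ftest'][1] for r in res.values())
                adj[i, j] = p_min < alpha
        return adj

    def connectivity_series(data, window=1008, step=144, **kw):
        """Connectivity intensity per moving window (fraction of significant directed links);
        1008 samples of 10-minute data is one week, 144 samples is one day (assumed choices)."""
        return [granger_adjacency(data[s:s + window], **kw).mean()
                for s in range(0, data.shape[0] - window + 1, step)]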
Broadband laser ranging precision and accuracy experiments with PDV benchmarking
NASA Astrophysics Data System (ADS)
Catenacci, Jared; Daykin, Ed; Howard, Marylesa; Lalone, Brandon; Miller, Kirk
2017-06-01
Broadband laser ranging (BLR) is a developmental diagnostic designed to measure the precise position of surfaces and particle clouds moving at velocities of several kilometers per second. Recent single stage gas gun experiments were conducted to quantify the precision and accuracy possible with a typical BLR system. For these experiments, the position of a mirrored projectile is measured relative to the location of a stationary optical flat (uncoated window) mounted within the gun catch tank. Projectile velocity is constrained to one-dimensional motion within the gun barrel. A collimating probe is aligned to be orthogonal to both the target window and the mirrored impactor surface. The probe is used to simultaneously measure the position and velocity with a BLR and conventional Photonic Doppler Velocimetry (PDV) system. Since there is a negligible lateral component to the target velocity, coupled with strong signal returns from a mirrored surface, integrating the PDV measurement provides a high fidelity distance measurement reference to which the BLR measurement may be compared.
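The reference trajectory used for the benchmark follows from integrating the PDV velocity history over time. A minimal version of that reduction with cumulative trapezoidal integration is sketched below; the sampling interval and the constant-velocity record are purely illustrative.

    import numpy as np
    from scipy.integrate import cumulative_trapezoid

    def position_from_pdv(time_s, velocity_m_per_s, x0=0.0):
        """Integrate a PDV velocity record to displacement for comparison with BLR ranging."""
        return x0 + cumulative_trapezoid(velocity_m_per_s, time_s, initial=0.0)

    # illustrative record: a 2 km/s projectile sampled every 10 ns for 50 microseconds
    t = np.arange(0, 50e-6, 10e-9)
    v = np.full_like(t, 2000.0)
    x = position_from_pdv(t, v)            # ends near 0.1 m of travel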
Apparatus for solar coal gasification
Gregg, D.W.
1980-08-04
Apparatus for using focused solar radiation to gasify coal and other carbonaceous materials is described. Incident solar radiation is focused from an array of heliostats through a window onto the surface of a moving bed of coal, contained within a gasification reactor. The reactor is designed to minimize contact between the window and solids in the reactor. Steam introduced into the gasification reactor reacts with the heated coal to produce gas consisting mainly of carbon monoxide and hydrogen, commonly called synthesis gas, which can be converted to methane, methanol, gasoline, and other useful products. One of the novel features of the invention is the generation of process steam in one embodiment at the rear surface of a secondary mirror used to redirect the focused sunlight. Another novel feature of the invention is the location and arrangement of the array of mirrors on an inclined surface (e.g., a hillside) to provide for direct optical communication of said mirrors and the carbonaceous feed without a secondary redirecting mirror.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siegfried, Matthew J.; Radford, Daniel R.; Huffman, Russell K.
An electrostatic particle collector may generally include a housing having sidewalls extending lengthwise between a first end and a second end. The housing may define a plate slot that extends heightwise within the housing between a top end and a bottom end. The housing may further include a plate access window that provides access to the bottom end of the plate slot. The collector may also include a collector plate configured to be installed within the plate slot that extends heightwise between a top edge and a bottom edge. Additionally, when the collector plate is installed within the plate slot, the bottom edge of the collector plate may be accessible from an exterior of the housing via the plate access window so as to allow the bottom edge of the collector plate to be moved relative to the housing to facilitate removal of the collector plate from the housing.
2013-06-15
ISS036-E-008165 (15 June 2013) --- Expedition 36 Flight Engineer Fyodor Yurchikhin with Russia's Federal Space Agency (Roscosmos) takes pictures of a highly anticipated event from a window in the Pirs module on the International Space Station. His electronic still camera is equipped with a 400mm lens to capture distant images of the European Space Agency's Automated Transfer Vehicle-4 (ATV-4) “Albert Einstein.” The spacecraft eventually moved in much closer and successfully docked to the orbital outpost at 2:07 GMT, June 15, 2013, following a ten-day period of free-flight.
21. ORE DOCK, LOOKING SOUTHWEST. THIS VIEW SHOWS THE WEST ...
21. ORE DOCK, LOOKING SOUTHWEST. THIS VIEW SHOWS THE WEST END OF THE DOCK. EMPTY CARS ARE MOVED IN FROM THE WEST BY 'SHUNT CARS,' PUT INTO PLACE AS NEEDED BENEATH THE HULETTS, FILLED, THEN SHUNTED TO THE EAST END OF THE YARD WHERE THEY ARE MADE UP INTO TRAINS. THE POWER HOUSE (WITH TALL ARCHED WINDOWS) AND THE TWO-STORY DOCK OFFICE CAN BE SEEN HERE. - Pennsylvania Railway Ore Dock, Lake Erie at Whiskey Island, approximately 1.5 miles west of Public Square, Cleveland, Cuyahoga County, OH
2008-04-01
V. X. D. Yang, M. L. Gordon, A. Mok, Y. Zhao, Z. Chen, R. Cobbold, B. Wilson, and I. Vitkin, "Improved phase-resolved optical Doppler tomography..." Figure-caption fragment: boundaries marked according to visual measurement (b), and the result of the moving circular window filtering; in this case, the location of the minimum ... boundaries are marked on the OCT image (g); the ODT image for the same OCT scan (h); magnified view of the marked region.
The predictive power of singular value decomposition entropy for stock market dynamics
NASA Astrophysics Data System (ADS)
Caraiani, Petre
2014-01-01
We use a correlation-based approach to analyze financial data from the US stock market, both daily and monthly observations from the Dow Jones. We compute the entropy based on the singular value decomposition of the correlation matrix for the components of the Dow Jones Industrial Index. Based on a moving window, we derive time varying measures of entropy for both daily and monthly data. We find that the entropy has a predictive ability with respect to stock market dynamics as indicated by the Granger causality tests.
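The entropy measure described here takes only a few lines: form the correlation matrix of the index constituents over a moving window, take its singular values, normalise them to a probability distribution, and compute the Shannon entropy. The Python sketch below assumes a matrix of returns with one column per constituent; the window length and the exact normalisation are assumptions and may differ in detail from the paper's definition.

    import numpy as np

    def svd_entropy(returns):
        """Entropy of the singular-value spectrum of the correlation matrix for one window.
        returns: array of shape (n_days, n_stocks)."""
        c = np.corrcoef(returns, rowvar=False)
        s = np.linalg.svd(c, compute_uv=False)
        p = s / s.sum()                          # normalise singular values to a distribution
        return float(-np.sum(p * np.log(p)))

    def rolling_entropy(returns, window=250):
        """Time-varying entropy from a moving window (roughly one trading year of daily data)."""
        return np.array([svd_entropy(returns[t - window:t])
                         for t in range(window, len(returns) + 1)])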
2012-07-27
ISS032-E-010609 (27 July 2012) --- As seen through windows in the Cupola, the station's Canadarm2 robotic arm moves toward the unpiloted Japan Aerospace Exploration Agency (JAXA) H-II Transfer Vehicle (HTV-3) as it approaches the International Space Station. NASA astronaut Joe Acaba and Japan Aerospace Exploration Agency astronaut Aki Hoshide, both Expedition 32 flight engineers, used the station's robotic arm to capture and berth the HTV-3 to the Earth-facing port of the station's Harmony node. The attachment was completed at 10:34 a.m. (EDT) on July 27, 2012.
NASA Astrophysics Data System (ADS)
Magdy, Nancy; Ayad, Miriam F.
2015-02-01
Two simple, accurate, precise, sensitive and economic spectrophotometric methods were developed for the simultaneous determination of Simvastatin and Ezetimibe in fixed-dose combination products without prior separation. The first method depends on a new chemometrics-assisted ratio spectra derivative method using a moving-window polynomial least-squares fitting procedure (Savitzky-Golay filters). The second method is based on a simple modification of the ratio subtraction method. The suggested methods were validated according to USP guidelines and can be applied for routine quality control testing.
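The first method's moving-window polynomial least-squares (Savitzky-Golay) derivative of a ratio spectrum can be illustrated with scipy's built-in routine. The synthetic spectra, divisor, window length, and polynomial order below are assumptions for illustration only, not the validated parameters of the published method.

    import numpy as np
    from scipy.signal import savgol_filter

    # ratio spectrum: mixture spectrum divided by the spectrum of one pure component,
    # followed by a Savitzky-Golay first derivative of the ratio (illustrative arrays only)
    wavelengths = np.linspace(220, 320, 501)
    mixture = np.exp(-((wavelengths - 255) / 12) ** 2) + 0.6 * np.exp(-((wavelengths - 270) / 15) ** 2)
    divisor = np.exp(-((wavelengths - 270) / 15) ** 2) + 1e-6   # hypothetical pure-component spectrum
    ratio = mixture / divisor
    ratio_derivative = savgol_filter(ratio, window_length=21, polyorder=3, deriv=1)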
Grivna, Michal; Al-Marzouqi, Hanan M; Al-Ali, Maryam R; Al-Saadi, Nada N; Abu-Zidan, Fikri M
2017-01-01
Falls of children from heights (balconies and windows) usually result in severe injuries and death. Details on child falls from heights in the United Arab Emirates (UAE) are not easily accessible. Our aim was to assess the incidents, personal, and environmental risk factors for pediatric falls from windows/balconies using newspaper clippings. We used a retrospective study design to electronically assess all major UAE national Arabic and English newspapers for reports of unintentional child falls from windows and balconies during 2005-2016. A structured data collection form was developed to collect information. Data were entered into an Excel sheet and descriptive analysis was performed. Newspaper clippings documented 96 fall incidents. After cleaning the data and excluding duplicate cases and intentional injuries, 81 cases were included into the final analysis. Fifty-three percent (n = 42) were boys. The mean (range) age was 4.9 years (1-15). Thirty-eight (47%) children fell from windows and 36 (44%) from balconies. Twenty-two (27%) children climbed on the furniture placed on a balcony or close to a window. Twenty-five (31%) children were not alone in the apartment when they fell. Twenty-nine children fell from less than 5 floors (37%), 33 from 5 to 10 floors (42%) and 16 from more than 10 floors (21%). Fifteen children (19%) were hospitalized and survived the fall incident, while 66 died (81%). Newspapers proved to be useful to study pediatric falls from heights. It is necessary to improve window safety by installing window guards and raising awareness.
On Time Delay Margin Estimation for Adaptive Control and Optimal Control Modification
NASA Technical Reports Server (NTRS)
Nguyen, Nhan T.
2011-01-01
This paper presents methods for estimating the time delay margin for adaptive control of input delay systems with almost linear structured uncertainty. The bounded linear stability analysis method seeks to represent an adaptive law by a locally bounded linear approximation within a small time window. The time delay margin of this input delay system represents a local stability measure and is computed analytically by three methods: Pade approximation, the Lyapunov-Krasovskii method, and the matrix measure method. These methods are applied to the standard model-reference adaptive control, the sigma-modification adaptive law, and the optimal control modification adaptive law. The windowing analysis results in non-unique estimates of the time delay margin since it is dependent on the length of a time window and parameters which vary from one time window to the next. The optimal control modification adaptive law overcomes this limitation in that, as the adaptive gain tends to infinity and if the matched uncertainty is linear, then the closed-loop input delay system tends to an LTI system. A lower bound of the time delay margin of this system can then be estimated uniquely without the need for the windowing analysis. Simulation results demonstrate the feasibility of the bounded linear stability method for time delay margin estimation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cervino, L; Soultan, D; Pettersson, N
2016-06-15
Purpose: To evaluate the dosimetric and radiobiological consequences of having different gating windows, dose rates, and breathing patterns in gated VMAT lung radiotherapy. Methods: A novel 3D-printed moving phantom with central high and peripheral low tracer uptake regions was 4D FDG-PET/CT-scanned using ideal, patient-specific regular, and irregular breathing patterns. A scan of the stationary phantom was obtained as a reference. Target volumes corresponding to different uptake regions were delineated. Simultaneous integrated boost (SIB) 6 MV VMAT plans were produced for conventional and hypofractionated radiotherapy, using 30–70% and 100% cycle gating scenarios. Prescribed doses were 200 cGy with SIB to 240 cGy to the high-uptake volume for conventional plans, and 800 cGy with SIB to 900 cGy for hypofractionated plans. Dose rates of 600 MU/min (conventional and hypofractionated) and flattening-filter-free 1400 MU/min (hypofractionated) were used. Ion chamber measurements were performed to verify delivered doses. Vials with A549 cells placed in locations matching the ion chamber measurements were irradiated using the same plans to measure clonogenic survival. Differences in survival for the different doses, dose rates, gating windows, and breathing patterns were analyzed. Results: Ion chamber measurements agreed within 3% of the planned dose for all locations, breathing patterns and gating windows. Cell survival depended on dose alone, and not on gating window, breathing pattern, MU rate, or delivery time. The surviving fraction varied from approximately 40% at 2 Gy to 1% at 9 Gy and was within statistical uncertainty relative to that observed for the stationary phantom. Conclusions: Use of gated VMAT in PET-driven SIB radiotherapy was validated using ion chamber measurements and cell survival assays for conventional and hypofractionated radiotherapy.
Addressing scale dependence in roughness and morphometric statistics derived from point cloud data.
NASA Astrophysics Data System (ADS)
Buscombe, D.; Wheaton, J. M.; Hensleigh, J.; Grams, P. E.; Welcker, C. W.; Anderson, K.; Kaplinski, M. A.
2015-12-01
The heights of natural surfaces can be measured with such spatial density that almost the entire spectrum of physical roughness scales can be characterized, down to the morphological form and grain scales. With an ability to measure 'microtopography' comes a demand for analytical/computational tools for spatially explicit statistical characterization of surface roughness. Detrended standard deviation of surface heights is a popular means to create continuous maps of roughness from point cloud data, using moving windows and reporting window-centered statistics of variations from a trend surface. If 'roughness' is the statistical variation in the distribution of relief of a surface, then 'texture' is the frequency of change and spatial arrangement of roughness. The variance in surface height as a function of frequency obeys a power law. In consequence, roughness is dependent on the window size through which it is examined, which has a number of potential disadvantages: 1) the choice of window size becomes crucial, and obstructs comparisons between data; 2) if windows are large relative to multiple roughness scales, it is harder to discriminate between those scales; 3) if roughness is not scaled by the texture length scale, information on the spacing and clustering of roughness `elements' can be lost; and 4) such practice is not amenable to models describing the scattering of light and sound from rough natural surfaces. We discuss the relationship between roughness and texture. Some useful parameters which scale vertical roughness to characteristic horizontal length scales are suggested, with examples of bathymetric point clouds obtained using multibeam from two contrasting riverbeds, namely those of the Colorado River in Grand Canyon, and the Snake River in Hells Canyon. Such work, aside from automated texture characterization and texture segmentation, roughness and grain size calculation, might also be useful for feature detection and classification from point clouds.
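The window-centred "detrended standard deviation" statistic discussed above can be written compactly for gridded heights: fit a plane to each window and report the standard deviation of the residuals at the window centre. The Python sketch below assumes a regular grid and an odd pixel window; gridding of the raw point cloud, edge handling, and any texture-based scaling of the roughness are left out.

    import numpy as np

    def roughness_map(z, win=11):
        """Moving-window detrended standard deviation of a gridded surface z (2-D array)."""
        half = win // 2
        ny, nx = z.shape
        out = np.full(z.shape, np.nan, dtype=float)
        # design matrix of a plane over the window, reused at every position
        yy, xx = np.mgrid[-half:half + 1, -half:half + 1]
        A = np.column_stack([np.ones(win * win), xx.ravel(), yy.ravel()])
        for i in range(half, ny - half):
            for j in range(half, nx - half):
                patch = z[i - half:i + half + 1, j - half:j + half + 1].ravel()
                coef, *_ = np.linalg.lstsq(A, patch, rcond=None)
                out[i, j] = np.std(patch - A @ coef)   # roughness = residual std about the trend plane
        return out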
MOVES regional level sensitivity analysis
DOT National Transportation Integrated Search
2012-01-01
The MOVES Regional Level Sensitivity Analysis was conducted to increase understanding of the operations of the MOVES Model in regional emissions analysis and to highlight the following: : the relative sensitivity of selected MOVES Model input paramet...
Vibration measurement by temporal Fourier analyses of a digital hologram sequence.
Fu, Yu; Pedrini, Giancarlo; Osten, Wolfgang
2007-08-10
A method for whole-field noncontact measurement of displacement, velocity, and acceleration of a vibrating object based on image-plane digital holography is presented. A series of digital holograms of a vibrating object are captured by use of a high-speed CCD camera. The result of the reconstruction is a three-dimensional complex-valued matrix with noise. We apply Fourier analysis and windowed Fourier analysis in both the spatial and the temporal domains to extract the displacement, the velocity, and the acceleration. The instantaneous displacement is obtained by temporal unwrapping of the filtered phase map, whereas the velocity and acceleration are evaluated by Fourier analysis and by windowed Fourier analysis along the time axis. The combination of digital holography and temporal Fourier analyses allows for evaluation of the vibration, without a phase ambiguity problem, and smooth spatial distribution of instantaneous displacement, velocity, and acceleration of each instant are obtained. The comparison of Fourier analysis and windowed Fourier analysis in velocity and acceleration measurements is also presented.
11 Foot Unitary Plan Tunnel Facility Optical Improvement Large Window Analysis
NASA Technical Reports Server (NTRS)
Hawke, Veronica M.
2015-01-01
The test section of the 11- by 11-foot Unitary Plan Transonic Wind Tunnel (11-foot UPWT) may receive an upgrade of larger optical windows on both the North and South sides. These new larger windows will provide better access for optical imaging of test article flow phenomena, including surface and off-body flow characteristics. The installation of these new larger windows will likely produce a change to the aerodynamic characteristics of the flow in the test section. In an effort to understand the effect of this change, a computational model was employed to predict the flows through the slotted walls, in the test section and around the model before and after the tunnel modification. This report documents the solid CAD model that was created and the inviscid computational analysis that was completed as a preliminary estimate of the effect of the changes.
Platform for Postprocessing Waveform-Based NDE
NASA Technical Reports Server (NTRS)
Roth, Don
2008-01-01
Taking advantage of the similarities that exist among all waveform-based non-destructive evaluation (NDE) methods, a common software platform has been developed containing multiple- signal and image-processing techniques for waveforms and images. The NASA NDE Signal and Image Processing software has been developed using the latest versions of LabVIEW, and its associated Advanced Signal Processing and Vision Toolkits. The software is useable on a PC with Windows XP and Windows Vista. The software has been designed with a commercial grade interface in which two main windows, Waveform Window and Image Window, are displayed if the user chooses a waveform file to display. Within these two main windows, most actions are chosen through logically conceived run-time menus. The Waveform Window has plots for both the raw time-domain waves and their frequency- domain transformations (fast Fourier transform and power spectral density). The Image Window shows the C-scan image formed from information of the time-domain waveform (such as peak amplitude) or its frequency-domain transformation at each scan location. The user also has the ability to open an image, or series of images, or a simple set of X-Y paired data set in text format. Each of the Waveform and Image Windows contains menus from which to perform many user actions. An option exists to use raw waves obtained directly from scan, or waves after deconvolution if system wave response is provided. Two types of deconvolution, time-based subtraction or inverse-filter, can be performed to arrive at a deconvolved wave set. Additionally, the menu on the Waveform Window allows preprocessing of waveforms prior to image formation, scaling and display of waveforms, formation of different types of images (including non-standard types such as velocity), gating of portions of waves prior to image formation, and several other miscellaneous and specialized operations. The menu available on the Image Window allows many further image processing and analysis operations, some of which are found in commercially-available image-processing software programs (such as Adobe Photoshop), and some that are not (removing outliers, Bscan information, region-of-interest analysis, line profiles, and precision feature measurements).
Long-wavelength Magnetic and Gravity Anomaly Correlations of Africa and Europe
NASA Technical Reports Server (NTRS)
Vonfrese, R. R. B.; Hinze, W. J. (Principal Investigator); Olivier, R.
1984-01-01
Preliminary MAGSAT scalar magnetic anomaly data were compiled for comparison with long-wavelength-pass filtered free-air gravity anomalies and regional heat-flow and tectonic data. To facilitate the correlation analysis at satellite elevations over a spherical Earth, equivalent point source inversion was used to differentially reduce the magnetic satellite anomalies to the radial pole at 350 km elevation, and to upward continue the first radial derivative of the free-air gravity anomalies. Correlation patterns between these regional geopotential anomaly fields are quantitatively established by moving window linear regression based on Poisson's theorem. Prominent correlations include direct correspondences for the Baltic Shield, where both anomalies are negative, and the central Mediterranean and Zaire Basin, where both anomalies are positive. Inverse relationships are generally common over the Precambrian Shield in northwest Africa, the Basins and Shields in southern Africa, and the Alpine Orogenic Belt. Inverse correlations also persist over the North Sea Rifts, the Benue Rift, and more generally over the East African Rifts. The results of this quantitative correlation analysis support the general inverse relationships of gravity and magnetic anomalies observed for North American continental terrain, which may be broadly related to magnetic crustal thickness variations.
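Moving-window linear regression between co-registered anomaly grids, as used here, amounts to estimating a local slope and correlation of the reduced-to-pole magnetic anomaly against the upward-continued gravity gradient within each window. The Python sketch below shows that windowed regression on two grids; the window size and grid preparation are assumptions, and the Poisson's-theorem interpretation of the slope is not elaborated.

    import numpy as np

    def moving_window_regression(mag, grav_grad, win=9):
        """Window-centred regression slope and correlation of mag against grav_grad
        on two co-registered 2-D grids of equal shape."""
        half = win // 2
        ny, nx = mag.shape
        slope = np.full(mag.shape, np.nan)
        corr = np.full(mag.shape, np.nan)
        for i in range(half, ny - half):
            for j in range(half, nx - half):
                m = mag[i - half:i + half + 1, j - half:j + half + 1].ravel()
                g = grav_grad[i - half:i + half + 1, j - half:j + half + 1].ravel()
                slope[i, j] = np.polyfit(g, m, 1)[0]
                corr[i, j] = np.corrcoef(g, m)[0, 1]
        return slope, corr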
Long-wavelength magnetic and gravity anomaly correlations on Africa and Europe
NASA Technical Reports Server (NTRS)
Vonfrese, R. R. B.; Olivier, R.; Hinze, W. J.
1985-01-01
Preliminary MAGSAT scalar magnetic anomaly data were compiled for comparison with long-wavelength-pass filtered free-air gravity anomalies and regional heat-flow and tectonic data. To facilitate the correlation analysis at satellite elevations over a spherical Earth, equivalent point source inversion was used to differentially reduce the magnetic satellite anomalies to the radial pole at 350 km elevation, and to upward continue the first radial derivative of the free-air gravity anomalies. Correlation patterns between these regional geopotential anomaly fields are quantitatively established by moving window linear regression based on Poisson's theorem. Prominent correlations include direct correspondences for the Baltic Shield, where both anomalies are negative, and the central Mediterranean and Zaire Basin, where both anomalies are positive. Inverse relationships are generally common over the Precambrian Shield in northwest Africa, the Basins and Shields in southern Africa, and the Alpine Orogenic Belt. Inverse correlations also persist over the North Sea Rifts, the Benue Rift, and more generally over the East African Rifts. The results of this quantitative correlation analysis support the general inverse relationships of gravity and magnetic anomalies observed for North American continental terrain, which may be broadly related to magnetic crustal thickness variations.
Simple automatic strategy for background drift correction in chromatographic data analysis.
Fu, Hai-Yan; Li, He-Dong; Yu, Yong-Jie; Wang, Bing; Lu, Peng; Cui, Hua-Peng; Liu, Ping-Ping; She, Yuan-Bin
2016-06-03
Chromatographic background drift correction, which influences peak detection and time shift alignment results, is a critical stage in chromatographic data analysis. In this study, an automatic background drift correction methodology was developed. Local minimum values in a chromatogram were initially detected and organized as a new baseline vector. Iterative optimization was then employed to recognize outliers, which belong to the chromatographic peaks, in this vector, and update the outliers in the baseline until convergence. The optimized baseline vector was finally expanded into the original chromatogram, and linear interpolation was employed to estimate background drift in the chromatogram. The principle underlying the proposed method was confirmed using a complex gas chromatographic dataset. Finally, the proposed approach was applied to eliminate background drift in liquid chromatography quadrupole time-of-flight samples used in the metabolic study of Escherichia coli samples. The proposed method was comparable with three classical techniques: morphological weighted penalized least squares, moving window minimum value strategy and background drift correction by orthogonal subspace projection. The proposed method allows almost automatic implementation of background drift correction, which is convenient for practical use. Copyright © 2016 Elsevier B.V. All rights reserved.
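The local-minima baseline idea can be prototyped in a few lines: take local minima as candidate baseline points, iteratively discard minima that sit too far above a smooth trend (these belong to peaks), and linearly interpolate the surviving points across the whole chromatogram. The sketch below is a generic illustration of that idea, not the authors' exact algorithm; the polynomial trend, MAD threshold, and convergence rule are assumptions.

    import numpy as np
    from scipy.signal import argrelmin

    def drift_correct(y, order=5, deg=3, k=3.0, max_iter=50):
        """Estimate and subtract background drift from a 1-D chromatogram y."""
        y = np.asarray(y, dtype=float)
        x = np.arange(len(y), dtype=float)
        idx = np.concatenate(([0], argrelmin(y, order=order)[0], [len(y) - 1]))
        for _ in range(max_iter):
            coef = np.polyfit(x[idx], y[idx], deg)               # smooth trend through current minima
            resid = y[idx] - np.polyval(coef, x[idx])
            mad = np.median(np.abs(resid - np.median(resid))) + 1e-12
            keep = resid < k * 1.4826 * mad                      # minima far above the trend belong to peaks
            if keep.all():
                break
            idx = idx[keep]
        baseline = np.interp(x, x[idx], y[idx])                  # expand the baseline to every point
        return y - baseline, baseline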
NASA Astrophysics Data System (ADS)
Heckels, R. EG; Savage, M. K.; Townend, J.
2018-05-01
Quantifying seismic velocity changes following large earthquakes can provide insights into fault healing and reloading processes. This study presents temporal velocity changes detected following the 2010 September Mw 7.1 Darfield event in Canterbury, New Zealand. We use continuous waveform data from several temporary seismic networks lying on and surrounding the Greendale Fault, with a maximum interstation distance of 156 km. Nine-component, day-long Green's functions were computed for frequencies between 0.1 and 1.0 Hz for continuous seismic records from immediately after the 2010 September 04 earthquake until 2011 January 10. Using the moving-window cross-spectral method, seismic velocity changes were calculated. Over the study period, an increase in seismic velocity of 0.14 ± 0.04 per cent was determined near the Greendale Fault, providing a new constraint on post-seismic relaxation rates in the region. A depth analysis further showed that velocity changes were confined to the uppermost 5 km of the subsurface. We attribute the observed changes to post-seismic relaxation via crack healing of the Greendale Fault and throughout the surrounding region.
Effects of window size and shape on accuracy of subpixel centroid estimation of target images
NASA Technical Reports Server (NTRS)
Welch, Sharon S.
1993-01-01
A new algorithm is presented for increasing the accuracy of subpixel centroid estimation of (nearly) point target images in cases where the signal-to-noise ratio is low and the signal amplitude and shape vary from frame to frame. In the algorithm, the centroid is calculated over a data window that is matched in width to the image distribution. Fourier analysis is used to explain the dependency of the centroid estimate on the size of the data window, and simulation and experimental results are presented which demonstrate the effects of window size for two different noise models. The effects of window shape were also investigated for uniform and Gaussian-shaped windows. The new algorithm was developed to improve the dynamic range of a close-range photogrammetric tracking system that provides feedback for control of a large gap magnetic suspension system (LGMSS).
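In essence, the estimator is an intensity-weighted centroid computed over a data window whose width is matched to the image spot. The Python sketch below shows a windowed centroid for one target; the matching rule for the window width, the background handling, and the test image are assumptions rather than the paper's exact algorithm.

    import numpy as np

    def windowed_centroid(img, guess_rc, half_width):
        """Intensity-weighted centroid of a point-target image over a square data window
        centred on an integer-pixel guess (row, column)."""
        r0, c0 = guess_rc
        rows, cols = np.mgrid[r0 - half_width:r0 + half_width + 1,
                              c0 - half_width:c0 + half_width + 1]
        win = img[r0 - half_width:r0 + half_width + 1, c0 - half_width:c0 + half_width + 1]
        w = win - win.min()                      # simple local background suppression (assumption)
        total = w.sum()
        return (rows * w).sum() / total, (cols * w).sum() / total

    # toy Gaussian spot with noise; the window half-width would be matched to the spot size
    yy, xx = np.mgrid[0:64, 0:64]
    spot = np.exp(-(((yy - 30.3) ** 2 + (xx - 33.7) ** 2) / (2 * 2.0 ** 2)))
    image = spot + 0.02 * np.random.default_rng(0).normal(size=spot.shape)
    centroid = windowed_centroid(image, (30, 34), half_width=6)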
Displacement and frequency analyses of vibratory systems
NASA Astrophysics Data System (ADS)
Low, K. H.
1995-02-01
This paper deals with the frequency and response studies of vibratory systems, which are represented by a set of n coupled second-order differential equations. The following numerical methods are used in the response analysis: central difference, fourth-order Runge-Kutta and modal methods. Data generated in the response analysis are processed to obtain the system frequencies by using the fast Fourier transform (FFT) or harmonic response methods. Two types of windows are used in the FFT analysis: rectangular and Hanning windows. Examples of two, four and seven degree-of-freedom systems are considered to illustrate the proposed algorithms. Comparisons with existing results confirm the validity of the proposed methods. The Hanning window attenuates the results and gives a narrower bandwidth around the peak compared with the rectangular window. It is also found that in free vibrations of a multi-mass system, the masses vibrate in a manner that is a superposition of the natural frequencies of the system, while in forced vibrations the system vibrates at the driving frequency.
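The difference between the rectangular and Hanning windows in the FFT stage can be seen with a short numerical example: the Hanning taper attenuates the spectrum and suppresses leakage around the peak at the cost of a wider main lobe. The sampling rate, record length, and test frequencies below are arbitrary illustrations, not the systems analysed in the paper.

    import numpy as np

    fs, n = 1000.0, 4096
    t = np.arange(n) / fs
    x = np.sin(2 * np.pi * 50.3 * t) + 0.2 * np.sin(2 * np.pi * 120.0 * t)   # hypothetical response record

    freqs = np.fft.rfftfreq(n, 1 / fs)
    spectrum_rect = np.abs(np.fft.rfft(x))                   # rectangular window (no taper): more leakage
    spectrum_hann = np.abs(np.fft.rfft(x * np.hanning(n)))   # Hanning window: less leakage, wider main lobe

    peak_rect = freqs[np.argmax(spectrum_rect)]
    peak_hann = freqs[np.argmax(spectrum_hann)]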
Fundamental study of compression for movie files of coronary angiography
NASA Astrophysics Data System (ADS)
Ando, Takekazu; Tsuchiya, Yuichiro; Kodera, Yoshie
2005-04-01
When network distribution of movie files is considered, lossy-compressed movie files with small file sizes can be useful. We chose three kinds of coronary stricture movies with different motion speeds as test objects: movies with slow, normal, and fast heart rates. MPEG-1, DivX5.11, WMV9 (Windows Media Video 9), and WMV9-VCM (Windows Media Video 9-Video Compression Manager) versions were made from the three AVI-format movies with different motion speeds. Five kinds of movies, the four compressed versions and the uncompressed AVI (used in place of the DICOM format), were evaluated by Thurstone's method. The evaluation factors were sharpness, granularity, contrast, and comprehensive evaluation. In the virtual bradycardia movie, AVI received the best evaluation on all factors except granularity. In the virtual normal movie, a different compression technique was best for each evaluation factor. In the virtual tachycardia movie, MPEG-1 received the best evaluation on all factors except contrast. The best compression format thus depends on the speed of the movie, because the compression algorithms differ; this is thought to reflect the influence of inter-frame compression. Movie compression algorithms combine inter-frame and intra-frame compression, and because each method affects the image differently, it is necessary to examine the relation between the compression algorithm and our results.
Validation of the Spatial Accuracy of the ExacTracRTM Adaptive Gating System
NASA Astrophysics Data System (ADS)
Twork, Gregory
Stereotactic body radiation therapy (SBRT) is a method of treatment that is used in extracranial locations, including the abdominal and thoracic cavities, as well as spinal and paraspinal locations. At the McGill University Health Centre, liver SBRT treatments include gating, which places the treatment beam on a duty cycle controlled by tracking of fiducial markers moving with the patient's breathing cycle. Respiratory gated treatments aim to spare normal tissue, while delivering a dose properly to a moving target. The ExacTracRTM system (BrainLAB AG Germany) is an image-guided radiotherapy system consisting of a combination of infra-red (IR) cameras and dual kilovoltage (kV) X-ray tubes. The IR system is used to track patient positioning and respiratory motion, while the kV X-rays are used to determine a positional shift based on internal anatomy or fiducial markers. In order to validate the system's ability to treat under gating conditions, each step of the SBRT process was evaluated quantitatively. Initially the system was tested under ideal static conditions, followed by a study including gated parameters. The uncertainties of the isocenters, positioning algorithm, planning computed tomography (CT) and four dimensional CT (4DCT) scans, gating window size and tumor motion were evaluated for their contributions to the total uncertainty in treatment. The mechanical isocenter and 4DCT were found to be the largest sources of uncertainty. However, for tumors with large internal amplitudes (>2.25 cm) that are treated with large gating windows (>30%) the gating parameters can contribute more than 1.1 +/- 1.8 mm.
Whalen, D. H.; Zunshine, Lisa; Holquist, Michael
2015-01-01
Reading fiction is a major component of intellectual life, yet it has proven difficult to study experimentally. One aspect of literature that has recently come to light is perspective embedding (“she thought I left” embedding her perspective on “I left”), which seems to be a defining feature of fiction. Previous work (Whalen et al., 2012) has shown that increasing levels of embedment affect the time that it takes readers to read and understand short vignettes in a moving window paradigm. With increasing levels of embedment from 1 to 5, reading times in a moving window paradigm rose almost linearly. However, level 0 was as slow as levels 3–4. Accuracy on probe questions was relatively constant until dropping at the fifth level. Here, we assessed this effect in a more ecologically valid (“typical”) reading paradigm, in which the entire vignette was visible at once, either for as long as desired (Experiment 1) or a fixed time (Experiment 2). In Experiment 1, reading times followed a pattern similar to that of the previous experiment, with some differences in absolute speed. Accuracy matched previous results: fairly consistent accuracy until a decline at level 5, indicating that both presentation methods allowed understanding. In Experiment 2, accuracy was somewhat reduced, perhaps because participants were less successful at allocating their attention than they were during the earlier experiment; however, the pattern was the same. It seems that literature does not, on average, use the easiest reading level but rather uses a middle ground that challenges the reader, but not too much. PMID:26635684
Computed Tomography Window Blending: Feasibility in Thoracic Trauma.
Mandell, Jacob C; Wortman, Jeremy R; Rocha, Tatiana C; Folio, Les R; Andriole, Katherine P; Khurana, Bharti
2018-02-07
This study aims to demonstrate the feasibility of processing computed tomography (CT) images with a custom window blending algorithm that combines soft-tissue, bone, and lung window settings into a single image; to compare the time for interpretation of chest CT for thoracic trauma with window blending and conventional window settings; and to assess diagnostic performance of both techniques. Adobe Photoshop was scripted to process axial DICOM images from retrospective contrast-enhanced chest CTs performed for trauma with a window-blending algorithm. Two emergency radiologists independently interpreted the axial images from 103 chest CTs with both blended and conventional windows. Interpretation time and diagnostic performance were compared with Wilcoxon signed-rank test and McNemar test, respectively. Agreement with Nexus CT Chest injury severity was assessed with the weighted kappa statistic. A total of 13,295 images were processed without error. Interpretation was faster with window blending, resulting in a 20.3% time saving (P < .001), with no difference in diagnostic performance, within the power of the study to detect a difference in sensitivity of 5% as determined by post hoc power analysis. The sensitivity of the window-blended cases was 82.7%, compared to 81.6% for conventional windows. The specificity of the window-blended cases was 93.1%, compared to 90.5% for conventional windows. All injuries of major clinical significance (per Nexus CT Chest criteria) were correctly identified in all reading sessions, and all negative cases were correctly classified. All readers demonstrated near-perfect agreement with injury severity classification with both window settings. In this pilot study utilizing retrospective data, window blending allows faster preliminary interpretation of axial chest CT performed for trauma, with no significant difference in diagnostic performance compared to conventional window settings. Future studies would be required to assess the utility of window blending in clinical practice. Copyright © 2018 The Association of University Radiologists. All rights reserved.
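The published processing was scripted in Adobe Photoshop; the sketch below only illustrates the generic concept of blending three windowed renderings of the same Hounsfield-unit data, with the window levels, widths, and equal blending weights all assumed for illustration rather than taken from the study:

```python
import numpy as np

def apply_window(hu, level, width):
    """Map Hounsfield units to [0, 1] with a linear window/level transform."""
    lo, hi = level - width / 2.0, level + width / 2.0
    return np.clip((hu - lo) / (hi - lo), 0.0, 1.0)

def blend_windows(hu):
    """Equal-weight blend of soft-tissue, bone and lung windows.
    Window parameters and weights are assumptions, not the published algorithm."""
    soft = apply_window(hu, level=50,   width=400)
    bone = apply_window(hu, level=450,  width=1500)
    lung = apply_window(hu, level=-600, width=1500)
    return (soft + bone + lung) / 3.0
```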
Wavelet-based multiscale window transform and energy and vorticity analysis
NASA Astrophysics Data System (ADS)
Liang, Xiang San
A new methodology, Multiscale Energy and Vorticity Analysis (MS-EVA), is developed to investigate sub-mesoscale, meso-scale, and large-scale dynamical interactions in geophysical fluid flows which are intermittent in space and time. The development begins with the construction of a wavelet-based functional analysis tool, the multiscale window transform (MWT), which is local, orthonormal, self-similar, and windowed on scale. The MWT is first built over the real line and then modified onto a finite domain. Properties are explored, the most important one being the property of marginalization, which brings together a quadratic quantity in physical space with its phase space representation. Based on the MWT, the MS-EVA is developed. Energy and enstrophy equations for the large-, meso-, and sub-meso-scale windows are derived and their terms interpreted. The processes thus represented are classified into four categories: transport, transfer, conversion, and dissipation/diffusion. The separation of transport from transfer is made possible with the introduction of the concept of perfect transfer. By the property of marginalization, the classical energetic analysis proves to be a particular case of the MS-EVA. The MS-EVA developed is validated with classical instability problems. The validation is carried out through two steps. First, it is established that the barotropic and baroclinic instabilities are indicated by the spatial averages of certain transfer term interaction analyses. Then calculations of these indicators are made with an Eady model and a Kuo model. The results agree precisely with what is expected from their analytical solutions, and the energetics reproduced reveal a consistent and important aspect of the unknown dynamic structures of instability processes. As an application, the MS-EVA is used to investigate the Iceland-Faeroe frontal (IFF) variability. An MS-EVA-ready dataset is first generated, through a forecasting study with the Harvard Ocean Prediction System using the data gathered during the 1993 NRV Alliance cruise. The application starts with a determination of the scale window bounds, which characterize a double-peak structure in either the time wavelet spectrum or the space wavelet spectrum. The resulting energetics, when locally averaged, reveal that there is a clear baroclinic instability happening around the cold tongue intrusion observed in the forecast. Moreover, an interaction analysis shows that the energy released by the instability indeed goes to the meso-scale window and fuels the growth of the intrusion. The sensitivity study shows that, in this case, the key to a successful application is a correct decomposition of the large-scale window from the meso-scale window.
Pedersen, Mangor; Omidvarnia, Amir; Zalesky, Andrew; Jackson, Graeme D
2018-06-08
Correlation-based sliding window analysis (CSWA) is the most commonly used method to estimate time-resolved functional MRI (fMRI) connectivity. However, instantaneous phase synchrony analysis (IPSA) is gaining popularity mainly because it offers single time-point resolution of time-resolved fMRI connectivity. We aim to provide a systematic comparison between these two approaches, on both temporal and topological levels. For this purpose, we used resting-state fMRI data from two separate cohorts with different temporal resolutions (45 healthy subjects from Human Connectome Project fMRI data with repetition time of 0.72 s and 25 healthy subjects from a separate validation fMRI dataset with a repetition time of 3 s). For time-resolved functional connectivity analysis, we calculated tapered CSWA over a wide range of different window lengths that were temporally and topologically compared to IPSA. We found a strong association in connectivity dynamics between IPSA and CSWA when considering the absolute values of CSWA. The association between CSWA and IPSA was stronger for a window length of ∼20 s (shorter than filtered fMRI wavelength) than ∼100 s (longer than filtered fMRI wavelength), irrespective of the sampling rate of the underlying fMRI data. Narrow-band filtering of fMRI data (0.03-0.07 Hz) yielded a stronger relationship between IPSA and CSWA than wider-band (0.01-0.1 Hz). On a topological level, time-averaged IPSA and CSWA nodes were non-linearly correlated for both short (∼20 s) and long (∼100 s) windows, mainly because nodes with strong negative correlations (CSWA) displayed high phase synchrony (IPSA). IPSA and CSWA were anatomically similar in the default mode network, sensory cortex, insula and cerebellum. Our results suggest that IPSA and CSWA provide comparable characterizations of time-resolved fMRI connectivity for appropriately chosen window lengths. Although IPSA requires narrow-band fMRI filtering, we recommend the use of IPSA given that it does not mandate a (semi-)arbitrary choice of window length and window overlap. A code for calculating IPSA is provided. Copyright © 2018. Published by Elsevier Inc.
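A compact sketch of the two estimators being compared; the filter order, the Hanning taper, and the particular phase-synchrony formula chosen here are assumptions, while the narrow band (0.03-0.07 Hz) follows the abstract:

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def sliding_window_corr(x, y, win, step=1):
    """Tapered sliding-window correlation (CSWA-style) between two ROI time series."""
    taper = np.hanning(win)
    out = []
    for s in range(0, len(x) - win + 1, step):
        xs, ys = x[s:s + win] * taper, y[s:s + win] * taper
        out.append(np.corrcoef(xs, ys)[0, 1])
    return np.array(out)

def instantaneous_phase_sync(x, y, fs, band=(0.03, 0.07)):
    """Instantaneous phase synchrony (IPSA-style) after narrow-band filtering.
    The synchrony measure 1 - |sin(dphi/2)| is one common definition, assumed here."""
    b, a = butter(2, [f / (fs / 2) for f in band], btype="band")
    px = np.angle(hilbert(filtfilt(b, a, x)))
    py = np.angle(hilbert(filtfilt(b, a, y)))
    return 1.0 - np.abs(np.sin((px - py) / 2.0))   # 1 = perfect phase alignment
```

Note the design difference the abstract emphasises: the sliding-window estimate requires a window length and overlap, whereas the phase-synchrony estimate returns one value per time point.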
Minimal Window Duration for Accurate HRV Recording in Athletes.
Bourdillon, Nicolas; Schmitt, Laurent; Yazdani, Sasan; Vesin, Jean-Marc; Millet, Grégoire P
2017-01-01
Heart rate variability (HRV) is non-invasive and commonly used for monitoring responses to training loads, fitness, or overreaching in athletes. Yet, the recording duration for a series of RR-intervals varies from 1 to 15 min in the literature. The aim of the present work was to assess the minimum record duration to obtain reliable HRV results. RR-intervals from 159 orthostatic tests (7 min supine, SU, followed by 6 min standing, ST) were analyzed. Reference windows were 4 min in SU (min 3-7) and 4 min in ST (min 9-13). Those windows were subsequently divided and the analyses were repeated on eight different fractioned windows: the first min (0-1), the second min (1-2), the third min (2-3), the fourth min (3-4), the first 2 min (0-2), the last 2 min (2-4), the first 3 min (0-3), and the last 3 min (1-4). Correlation and Bland & Altman statistical analyses were systematically performed. The analysis window could be shortened to 0-2 instead of 0-4 for RMSSD only, whereas the 4-min window was necessary for LF and total power. Since there is a need for 1 min of baseline to obtain a steady signal prior the analysis window, we conclude that studies relying on RMSSD may shorten the windows to 3 min (= 1+2) in SU or seated position only and to 6 min (= 1+2 min SU plus 1+2 min ST) if there is an orthostatic test. Studies relying on time- and frequency-domain parameters need a minimum of 5 min (= 1+4) min SU or seated position only but require 10 min (= 1+4 min SU plus 1+4 min ST) for the orthostatic test.
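RMSSD itself is simple to compute from an RR-interval series; the sketch below, including the example window boundaries in the comments, is illustrative rather than the authors' analysis code:

```python
import numpy as np

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (ms)."""
    d = np.diff(np.asarray(rr_ms, dtype=float))
    return np.sqrt(np.mean(d ** 2))

# Example comparison of a short window against the 4-min supine reference window
# (times in seconds; rr is an array of RR intervals in ms):
# t = np.cumsum(rr) / 1000.0
# ref   = rmssd(rr[(t >= 180) & (t < 420)])   # min 3-7, the reference window
# short = rmssd(rr[(t >= 180) & (t < 300)])   # first 2 min of that window
```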
Capillary Electrophoresis Sensitivity Enhancement Based on Adaptive Moving Average Method.
Drevinskas, Tomas; Telksnys, Laimutis; Maruška, Audrius; Gorbatsova, Jelena; Kaljurand, Mihkel
2018-06-05
In the present work, we demonstrate a novel approach to improve the sensitivity of the "out of lab" portable capillary electrophoretic measurements. Nowadays, many signal enhancement methods are (i) underused (nonoptimal), (ii) overused (distorts the data), or (iii) inapplicable in field-portable instrumentation because of a lack of computational power. The described innovative migration velocity-adaptive moving average method uses an optimal averaging window size and can be easily implemented with a microcontroller. The contactless conductivity detection was used as a model for the development of a signal processing method and the demonstration of its impact on the sensitivity. The frequency characteristics of the recorded electropherograms and peaks were clarified. Higher electrophoretic mobility analytes exhibit higher-frequency peaks, whereas lower electrophoretic mobility analytes exhibit lower-frequency peaks. On the basis of the obtained data, a migration velocity-adaptive moving average algorithm was created, adapted, and programmed into capillary electrophoresis data-processing software. Employing the developed algorithm, each data point is processed depending on a certain migration time of the analyte. Because of the implemented migration velocity-adaptive moving average method, the signal-to-noise ratio improved up to 11 times for sampling frequency of 4.6 Hz and up to 22 times for sampling frequency of 25 Hz. This paper could potentially be used as a methodological guideline for the development of new smoothing algorithms that require adaptive conditions in capillary electrophoresis and other separation methods.
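The adaptive idea can be sketched as a moving average whose window scales with migration time, so that late, broad (low-frequency) peaks are smoothed more heavily than early, narrow ones; the linear scaling rule and constants below are assumptions, not the published algorithm:

```python
import numpy as np

def adaptive_moving_average(signal, fs, migration_time_s, base_window_s=1.0, t_ref_s=60.0):
    """Moving average whose half-window grows linearly with migration time.
    fs is the detector sampling rate; scaling constants are illustrative assumptions."""
    signal = np.asarray(signal, dtype=float)
    out = np.empty_like(signal)
    for i, t in enumerate(migration_time_s):
        half = max(1, int(0.5 * fs * base_window_s * (t / t_ref_s)))
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out[i] = signal[lo:hi].mean()
    return out
```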
On interrelations of recurrences and connectivity trends between stock indices
NASA Astrophysics Data System (ADS)
Goswami, B.; Ambika, G.; Marwan, N.; Kurths, J.
2012-09-01
Financial data has been extensively studied for correlations using Pearson's cross-correlation coefficient ρ as the point of departure. We employ an estimator based on recurrence plots - the correlation of probability of recurrence (CPR) - to analyze connections between nine stock indices spread worldwide. We suggest a slight modification of the CPR approach in order to get more robust results. We examine trends in CPR for an approximately 19-month window moved along the time series and compare them to trends in ρ. Binning CPR into three levels of connectedness (strong, moderate, and weak), we extract the trends in number of connections in each bin over time. We also look at the behavior of CPR during the dot-com bubble by shifting the time series to align their peaks. CPR mainly uncovers that the markets move in and out of periods of strong connectivity erratically, instead of moving monotonically towards increasing global connectivity. This is in contrast to ρ, which gives a picture of ever-increasing correlation. CPR also exhibits that time-shifted markets have high connectivity around the dot-com bubble of 2000. We use significance tests using twin surrogates to interpret all the measures estimated in the study.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Junwei; Chen, Ruizhu; Hartlep, Thomas
2015-08-10
Helioseismic and magnetohydrodynamic waves are abundant in and above sunspots. Through cross-correlating oscillation signals in the photosphere observed by the Solar Dynamics Observatory/Helioseismic and Magnetic Imager, we reconstruct how waves propagate away from virtual wave sources located inside a sunspot. In addition to the usual helioseismic wave, a fast-moving wave is detected traveling along the sunspot’s radial direction from the umbra to about 15 Mm beyond the sunspot boundary. The wave has a frequency range of 2.5–4.0 mHz with a phase velocity of 45.3 km s⁻¹, substantially faster than the typical speeds of Alfvén and magnetoacoustic waves in the photosphere. The observed phenomenon is consistent with a scenario in which a magnetoacoustic wave is excited at approximately 5 Mm beneath the sunspot. Its wavefront travels to and sweeps across the photosphere with a speed higher than the local magnetoacoustic speed. The fast-moving wave, if truly excited beneath the sunspot’s surface, will help open a new window for studying the internal structure and dynamics of sunspots.
Aston, Philip J; Christie, Mark I; Huang, Ying H; Nandi, Manasi
2018-03-01
Advances in monitoring technology allow blood pressure waveforms to be collected at sampling frequencies of 250-1000 Hz for long time periods. However, much of the raw data are under-analysed. Heart rate variability (HRV) methods, in which beat-to-beat interval lengths are extracted and analysed, have been extensively studied. However, this approach discards the majority of the raw data. Our aim is to detect changes in the shape of the waveform in long streams of blood pressure data. Our approach involves extracting key features from large complex data sets by generating a reconstructed attractor in a three-dimensional phase space using delay coordinates from a window of the entire raw waveform data. The naturally occurring baseline variation is removed by projecting the attractor onto a plane from which new quantitative measures are obtained. The time window is moved through the data to give a collection of signals which relate to various aspects of the waveform shape. This approach enables visualisation and quantification of changes in the waveform shape and has been applied to blood pressure data collected from conscious unrestrained mice and to human blood pressure data. The interpretation of the attractor measures is aided by the analysis of simple artificial waveforms. We have developed and analysed a new method for analysing blood pressure data that uses all of the waveform data and hence can detect changes in the waveform shape that HRV methods cannot, which is confirmed with an example, and hence our method goes 'beyond HRV'.
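A minimal sketch of the delay-coordinate reconstruction and baseline-removing projection described above; the delay tau and the identification of the baseline direction with the main diagonal (1, 1, 1) are assumptions for illustration:

```python
import numpy as np

def delay_embed(x, tau, dim=3):
    """Reconstruct a dim-dimensional attractor from a scalar waveform using delay coordinates."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def remove_baseline_projection(attractor):
    """Project the 3-D attractor onto the plane orthogonal to (1, 1, 1); slow baseline
    variation moves the attractor along that diagonal and is discarded by the projection."""
    v = np.ones(3) / np.sqrt(3)
    return attractor - np.outer(attractor @ v, v)
```

Quantitative measures of waveform shape would then be computed from the projected attractor as a time window is moved through the recording.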
The Study of Residential Areas Extraction Based on GF-3 Texture Image Segmentation
NASA Astrophysics Data System (ADS)
Shao, G.; Luo, H.; Tao, X.; Ling, Z.; Huang, Y.
2018-04-01
The study uses standard stripmap, dual-polarization SAR images from GF-3 as the basic data. Processes and methods for extracting residential areas based on texture segmentation of GF-3 images are compared and analyzed. GF-3 image processing includes radiometric calibration, complex-data conversion, multi-look processing, and image filtering; a suitability analysis of the different filtering methods showed that the Kuan filter is effective for extracting residential areas. We then calculated and analyzed texture feature vectors using the GLCM (Gray Level Co-occurrence Matrix), whose parameters include the moving-window size, step size, and angle; a window size of 11*11, a step of 1, and an angle of 0° proved effective and optimal for residential area extraction. Using the FNEA (Fractal Net Evolution Approach), we segmented the GLCM texture images and extracted the residential areas by threshold setting. The extraction result was verified and assessed with a confusion matrix: overall accuracy is 0.897 and kappa is 0.881. We also extracted the residential areas by SVM classification of the GF-3 images; its overall accuracy is 0.09 lower than that of the texture-segmentation method. We conclude that residential area extraction based on multi-scale segmentation of GF-3 SAR texture images is simple and highly accurate. Because it is difficult to obtain multi-spectral remote sensing images in southern China, which is cloudy and rainy throughout the year, this work has certain reference value.
Marchant, Carol A; Briggs, Katharine A; Long, Anthony
2008-01-01
Lhasa Limited is a not-for-profit organization that exists to promote the sharing of data and knowledge in chemistry and the life sciences. It has developed the software tools Derek for Windows, Meteor, and Vitic to facilitate such sharing. Derek for Windows and Meteor are knowledge-based expert systems that predict the toxicity and metabolism of a chemical, respectively. Vitic is a chemically intelligent toxicity database. An overview of each software system is provided along with examples of the sharing of data and knowledge in the context of their development. These examples include illustrations of (1) the use of data entry and editing tools for the sharing of data and knowledge within organizations; (2) the use of proprietary data to develop nonconfidential knowledge that can be shared between organizations; (3) the use of shared expert knowledge to refine predictions; (4) the sharing of proprietary data between organizations through the formation of data-sharing groups; and (5) the use of proprietary data to validate predictions. Sharing of chemical toxicity and metabolism data and knowledge in this way offers a number of benefits including the possibilities of faster scientific progress and reductions in the use of animals in testing. Maximizing the accessibility of data also becomes increasingly crucial as in silico systems move toward the prediction of more complex phenomena for which limited data are available.
Ehara, Shoichi; Okuyama, Takuhiro; Shirai, Nobuyuki; Sugioka, Kenichi; Oe, Hiroki; Itoh, Toshihide; Matsuoka, Toshiyuki; Ikura, Yoshihiro; Ueda, Makiko; Naruko, Takahiko; Hozumi, Takeshi; Yoshiyama, Minoru
2009-08-01
Previous studies have shown a correlation between coronary artery cross-sectional diameter and left ventricular (LV) mass. However, no studies have examined the correlation between actual coronary artery volume (CAV) and LV mass. In the present study, measurements of CAV by 64-multislice computed tomography (MSCT) were validated and the relationship between CAV and LV mass was investigated. First, coronary artery phantoms consisting of syringes filled with solutions of contrast medium moving at simulated heart rates were scanned by 64-MSCT. Display window settings permitting accurate calculation of small volumes were optimized by evaluating volume-rendered images of the segmented contrast medium at different window settings. Next, 61 patients without significant coronary artery stenosis were scanned by 64-MSCT with the same protocol as for the phantoms. Coronary arteries were segmented on a workstation and the same window settings were applied to the volume-rendered images to calculate total CAV. Significant correlations between total CAV and LV mass (r=0.660, P<0.0001) were found, whereas an inverse relation was present between total CAV per 100 g of LV mass and LV mass. The novel concept of "CAV" for the characterization of coronary arteries may prove useful for future research, particularly on the causes of LV hypertrophy.
Evaluation of Bias Correction Method for Satellite-Based Rainfall Data
Bhatti, Haris Akram; Rientjes, Tom; Haile, Alemseged Tamiru; Habib, Emad; Verhoef, Wouter
2016-01-01
With the advances in remote sensing technology, satellite-based rainfall estimates are gaining traction in the field of hydrology, particularly in rainfall-runoff modeling. Since estimates are affected by errors, correction is required. In this study, we tested the high resolution National Oceanic and Atmospheric Administration’s (NOAA) Climate Prediction Centre (CPC) morphing technique (CMORPH) satellite rainfall product in the Gilgel Abbey catchment, Ethiopia. CMORPH data at 8 km-30 min resolution is aggregated to daily to match in-situ observations for the period 2003–2010. Study objectives are to assess bias of the satellite estimates, to identify optimum window size for application of bias correction and to test effectiveness of bias correction. Bias correction factors are calculated for moving window (MW) sizes and for sequential windows (SW’s) of 3, 5, 7, 9, …, 31 days with the aim of assessing the error distribution between the in-situ observations and CMORPH estimates. We tested forward, central and backward window (FW, CW and BW) schemes to assess the effect of time integration on accumulated rainfall. Accuracy of cumulative rainfall depth is assessed by Root Mean Squared Error (RMSE). To systematically correct all CMORPH estimates, station based bias factors are spatially interpolated to yield a bias factor map. Reliability of interpolation is assessed by cross validation. The uncorrected CMORPH rainfall images are multiplied by the interpolated bias map to result in bias corrected CMORPH estimates. Findings are evaluated by RMSE, correlation coefficient (r) and standard deviation (SD). Results showed existence of bias in the CMORPH rainfall. It is found that the 7-day SW approach performs best for bias correction of CMORPH rainfall. The outcome of this study showed the efficiency of our bias correction approach. PMID:27314363
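A simplified sketch of how per-station bias factors might be computed for the sequential-window and (centred) moving-window schemes; the handling of zero satellite totals and the window centring are assumptions, not the study's exact implementation:

```python
import numpy as np

def window_bias_factors(gauge, cmorph, width=7, scheme="SW"):
    """Multiplicative bias factor per day from a window of daily rainfall totals.
    scheme='SW' uses non-overlapping sequential windows, 'MW' a centred moving window."""
    gauge, cmorph = np.asarray(gauge, float), np.asarray(cmorph, float)
    factors = np.ones_like(gauge)
    if scheme == "SW":
        for s in range(0, len(gauge), width):
            g, c = gauge[s:s + width].sum(), cmorph[s:s + width].sum()
            factors[s:s + width] = g / c if c > 0 else 1.0
    else:  # centred moving window
        half = width // 2
        for i in range(len(gauge)):
            lo, hi = max(0, i - half), min(len(gauge), i + half + 1)
            g, c = gauge[lo:hi].sum(), cmorph[lo:hi].sum()
            factors[i] = g / c if c > 0 else 1.0
    return factors  # corrected = cmorph * factors (per station, before spatial interpolation)
```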
NASA Astrophysics Data System (ADS)
Zhao, Jinping; Cao, Yong; Wang, Xin
2018-06-01
In order to study the temporal variations of correlations between two time series, a running correlation coefficient (RCC) could be used. An RCC is calculated for a given time window, and the window is then moved sequentially through time. The current calculation method for RCCs is based on the general definition of the Pearson product-moment correlation coefficient, calculated with the data within the time window, which we call the local running correlation coefficient (LRCC). The LRCC is calculated from the two anomalies relative to the two local means; meanwhile, the local means themselves also vary. It becomes clear that the LRCC reflects only the correlation between the two anomalies within the time window and fails to capture the contributions of the two varying means. To address this problem, two unchanged means obtained from all available data are adopted to calculate an RCC, which is called the synthetic running correlation coefficient (SRCC). When the anomaly variations are dominant, the two RCCs are similar. However, when the variations of the means are dominant, the difference between the two RCCs becomes obvious. The SRCC reflects the correlations of both the anomaly variations and the variations of the means. Therefore, the SRCCs from different time points are intercomparable. A criterion for the superiority of the RCC algorithm is that the average value of the RCC should be close to the global correlation coefficient calculated using all data. The SRCC always meets this criterion, while the LRCC sometimes fails. Therefore, the SRCC is better than the LRCC for running correlations. We suggest using the SRCC to calculate the RCCs.
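The distinction between the two coefficients can be written down directly from the definitions above; the sketch below computes both for a given window length:

```python
import numpy as np

def running_correlations(x, y, win):
    """Local (LRCC) and synthetic (SRCC) running correlation coefficients.
    LRCC uses means recomputed inside each window; SRCC uses the global means of x and y."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    gx, gy = x.mean(), y.mean()
    lrcc, srcc = [], []
    for s in range(len(x) - win + 1):
        xs, ys = x[s:s + win], y[s:s + win]
        lrcc.append(np.corrcoef(xs, ys)[0, 1])             # local means inside the window
        ax, ay = xs - gx, ys - gy                           # anomalies relative to global means
        srcc.append((ax * ay).sum() / np.sqrt((ax ** 2).sum() * (ay ** 2).sum()))
    return np.array(lrcc), np.array(srcc)
```

When the two series share a slowly varying mean, the SRCC retains that contribution while the LRCC removes it window by window, which is the behaviour contrasted in the abstract.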
Computed Tomographic Window Setting for Bronchial Measurement to Guide Double-Lumen Tube Size.
Seo, Jeong-Hwa; Bae, Jinyoung; Paik, Hyesun; Koo, Chang-Hoon; Bahk, Jae-Hyon
2018-04-01
The bronchial diameter measured on computed tomography (CT) can be used to guide double-lumen tube (DLT) sizes objectively. The bronchus is known to be measured most accurately in the so-called bronchial CT window. The authors investigated whether using the bronchial window results in the selection of more appropriately sized DLTs than using the other windows. CT image analysis and prospective randomized study. Tertiary hospital. Adults receiving left-sided DLTs. The authors simulated selection of DLT sizes based on the left bronchial diameters measured in the lung (width 1,500 Hounsfield unit [HU] and level -700 HU), bronchial (1,000 HU and -450 HU), and mediastinal (400 HU and 25 HU) CT windows. Furthermore, patients were randomly assigned to undergo imaging with either the bronchial or mediastinal window to guide DLT sizes. Using the underwater seal technique, the authors assessed whether the DLT was appropriately sized, undersized, or oversized for the patient. On 130 CT images, the bronchial diameter (9.9 ± 1.2 mm v 10.5 ± 1.3 mm v 11.7 ± 1.3 mm) and the selected DLT size were different in the lung, bronchial, and mediastinal windows, respectively (p < 0.001). In 13 patients (17%), the bronchial diameter measured in the lung window suggested too small DLTs (28 Fr) for adults. In the prospective study, oversized tubes were chosen less frequently in the bronchial window than in the mediastinal window (6/110 v 23/111; risk ratio 0.38; 95% CI 0.19-0.79; p = 0.003). No tubes were undersized after measurements in these two windows. The bronchial measurement in the bronchial window guided more appropriately sized DLTs compared with the lung or mediastinal windows. Copyright © 2017 Elsevier Inc. All rights reserved.
Study of wavefront error and polarization of a side mounted infrared window
NASA Astrophysics Data System (ADS)
Liu, Jiaguo; Li, Lin; Hu, Xinqi; Yu, Xin
2008-03-01
The wavefront error and polarization of a side-mounted infrared window made of ZnS are studied. Infrared windows suffer from temperature gradients and stress during launch. Generally, the gradient in temperature changes the refractive index of the material, whereas stress produces deformation and birefringence. In this paper, a thermal finite element analysis (FEA) of an IR window is presented. For this purpose, we employed the FEA program Ansys to obtain the time-varying temperature field. The deformation and stress of the window are derived from a structural FEA with the aerodynamic force and the previously obtained temperature field as the loads. The deformation, temperature field, stress field, ray tracing and Jones calculus are used to calculate the wavefront error and the change of polarization state.
Hydrogen Safety Project: Chemical analysis support task. Window "E" analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, T E; Campbell, J A; Hoppe, E W
1992-09-01
Core samples taken from tank 101-SY at Hanford during "window E" were analyzed for organic and radiochemical constituents by staff of the Analytical Chemistry Laboratory at Pacific Northwest Laboratory. Westinghouse Hanford Company submitted these samples to the laboratory.
The relationship between movement speed and duration during soccer matches.
Roecker, Kai; Mahler, Hubert; Heyde, Christian; Röll, Mareike; Gollhofer, Albert
2017-01-01
The relationship between the time duration of movement (t(dur)) and related maximum possible power output has been studied and modeled under many conditions. Inspired by the so-called power profiles known for discontinuous endurance sports like cycling, and the critical power concept of Monod and Scherrer, the aim of this study was to evaluate the numerical characteristics of the function between maximum horizontal movement velocity (HSpeed) and t(dur) in soccer. To evaluate this relationship, GPS data from 38 healthy soccer players and 82 game participations (≥30 min active playtime) were used to select maximum HSpeed for 21 distinct t(dur) values (between 0.3 s and 2,700 s) based on moving medians with an incremental t(dur) window-size. As a result, the relationship between HSpeed and Log(t(dur)) appeared reproducibly as a sigmoidal decay function, and could be fitted to a five-parameter equation with upper and lower asymptotes, and an inflection point, power and decrease rate. Thus, the first three parameters described individual characteristics if evaluated using mixed-model analysis. This study shows for the first time the general numerical relationship between t(dur) and HSpeed in soccer games. In contrast to former descriptions that have evaluated speed against power, HSpeed against t(dur) always yields a sigmoidal shape with a new upper asymptote. The evaluated curve fit potentially describes the maximum moving speed of individual players during the game, and allows for concise interpretations of the functional state of team sports athletes.
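A sketch of the moving-median extraction of the speed-duration profile described above; pandas is assumed for the rolling median, and the GPS sampling rate and duration grid are illustrative (the sigmoidal fit of HSpeed against log(t_dur) would then be performed on the returned pairs):

```python
import numpy as np
import pandas as pd

def max_speed_profile(speed, fs, durations_s):
    """Maximum of the rolling median of horizontal speed for each movement duration.
    speed: array of instantaneous horizontal speeds; fs: GPS sampling rate in Hz."""
    s = pd.Series(np.asarray(speed, dtype=float))
    profile = {}
    for d in durations_s:
        w = max(1, int(round(d * fs)))
        profile[d] = s.rolling(window=w, center=True, min_periods=w).median().max()
    return profile

# e.g. 21 durations spanning 0.3 s to 2700 s on a log scale, at an assumed 10 Hz GPS rate:
# durations = np.logspace(np.log10(0.3), np.log10(2700), 21)
# profile = max_speed_profile(speed, fs=10.0, durations_s=durations)
```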
Arcjet exploratory tests of ARC optical window design for the AFE vehicle
NASA Technical Reports Server (NTRS)
Whiting, Ellis E.; Terrazas-Salinas, Imelda; Craig, Roger A.; Sobeck, Charles K.; Sarver, George L., III; Salerno, Louis J.; Love, Wendell; Maa, Scott; Covington, AL
1991-01-01
Tests were made in the 20 MW arc jet facility at the NASA ARC to determine the suitability of sapphire and fused silica as window materials for the Aeroassist Flight Experiment (AFE) entry vehicle. Twenty-nine tests were made: 25 at a heating rate about 80 percent of that expected during the AFE entry, and 4 at approximately the full (100 percent) AFE heating-rate profile, which produces a temperature of about 2900 F on the surface of the tiles that protect the vehicle. These tests show that a conductively cooled window design using mechanical thermal contacts and sapphire is probably not practical. Cooling the window using mechanical thermal contacts produces thermal stresses in the sapphire that cause the window to crack. An insulated design using sapphire, which cools the window as little as possible, appears promising, although some spectral data in the vacuum-ultraviolet (VUV) will be lost due to the high temperature reached by the sapphire. The surface of the insulated sapphire windows, tested at the 100 percent AFE heating rate, showed some slight ablation, and cracks appeared in two of three test windows. One small group of cracks was clearly caused by mechanical binding of the window in the assembly, which can be eliminated with an improved design. Other cracks were long, straight, thin crystallographic cracks that have very little effect on the optical transmission of the window. Also, the windows did not fall apart along these crystallographic cracks when the windows were removed from their assemblies. Theoretical results from the thermal analysis computer program SINDA indicate that increasing the window thickness from 4 to 8 mm may enable surface ablation to be avoided. An insulated design using a fused silica window tested at the nominal AFE heating rate experienced severe ablation; thus fused silica is not considered to be an acceptable window material.
NASA Astrophysics Data System (ADS)
Chen, Daniel; Chen, Damian; Yen, Ray; Cheng, Mingjen; Lan, Andy; Ghaskadvi, Rajesh
2008-11-01
Identifying hotspots--structures that limit the lithography process window--becomes increasingly important as the industry relies heavily on RET to print sub-wavelength designs. KLA-Tencor's patented Process Window Qualification (PWQ) methodology has been used for this purpose in various fabs. The PWQ methodology has three key advantages: (a) the PWQ layout, for the best sensitivity; (b) design-based binning, for pattern-repeater analysis; and (c) intelligent sampling, for the best DOI sampling rate. This paper evaluates two different analysis strategies for SEM review sampling successfully deployed at Inotera Memories, Inc. We propose a new approach combining the location repeaters and pattern repeaters. In a recent case study, the new sampling flow reduced the data analysis and sampling time from 6 hours to 1.5 hours while maintaining the maximum DOI sample rate.
Johnson, Christine M; Sullivan, Jess; Buck, Cara L; Trexel, Julie; Scarpuzzi, Mike
2015-01-01
Anticipating the location of a temporarily obscured target-what Piaget (the construction of reality in the child. Basic Books, New York, 1954) called "object permanence"-is a critical skill, especially in hunters of mobile prey. Previous research with bottlenose dolphins found they could predict the location of a target that had been visibly displaced into an opaque container, but not one that was first placed in an opaque container and then invisibly displaced to another container. We tested whether, by altering the task to involve occlusion rather than containment, these animals could show more advanced object permanence skills. We projected dynamic visual displays at an underwater-viewing window and videotaped the animals' head moves while observing these displays. In Experiment 1, the animals observed a small black disk moving behind occluders that shifted in size, ultimately forming one large occluder. Nine out of ten subjects "tracked" the presumed movement of the disk behind this occluder on their first trial-and in a statistically significant number of subsequent trials-confirming their visible displacement abilities. In Experiment 2, we tested their invisible displacement abilities. The disk first disappeared behind a pair of moving occluders, which then moved behind a stationary occluder. The moving occluders then reappeared and separated, revealing that the disk was no longer behind them. The subjects subsequently looked to the correct stationary occluder on eight of their ten first trials, and in a statistically significant number of subsequent trials. Thus, by altering the stimuli to be more ecologically valid, we were able to show that the dolphins could indeed succeed at an invisible displacement task.
Defining window-boundaries for genomic analyses using smoothing spline techniques
Beissinger, Timothy M.; Rosa, Guilherme J.M.; Kaeppler, Shawn M.; ...
2015-04-17
High-density genomic data is often analyzed by combining information over windows of adjacent markers. Interpretation of data grouped in windows versus at individual locations may increase statistical power, simplify computation, reduce sampling noise, and reduce the total number of tests performed. However, use of adjacent marker information can result in over- or under-smoothing, undesirable window boundary specifications, or highly correlated test statistics. We introduce a method for defining windows based on statistically guided breakpoints in the data, as a foundation for the analysis of multiple adjacent data points. This method involves first fitting a cubic smoothing spline to the data and then identifying the inflection points of the fitted spline, which serve as the boundaries of adjacent windows. This technique does not require prior knowledge of linkage disequilibrium, and therefore can be applied to data collected from individual or pooled sequencing experiments. Moreover, in contrast to existing methods, an arbitrary choice of window size is not necessary, since these are determined empirically and allowed to vary along the genome.
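A minimal sketch of the spline-based breakpoint idea using SciPy; the smoothing parameter and the dense evaluation grid are assumptions:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def spline_window_boundaries(pos, stat, smooth=None):
    """Window boundaries at the inflection points of a cubic smoothing spline fitted to a
    genome-wide statistic. pos must be sorted genomic positions; smooth is the spline's
    smoothing factor (assumed default if None)."""
    spl = UnivariateSpline(pos, stat, k=3, s=smooth)
    fine = np.linspace(pos.min(), pos.max(), 10 * len(pos))
    curv = spl.derivative(n=2)(fine)
    sign_change = np.where(np.diff(np.sign(curv)) != 0)[0]   # second derivative crosses zero
    return fine[sign_change]                                  # breakpoints between windows
```

Markers falling between consecutive breakpoints would then be grouped and analysed as one window.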
Design and Verification of Critical Pressurised Windows for Manned Spaceflight
NASA Astrophysics Data System (ADS)
Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.
2014-06-01
The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the 'Glass and Façade Technology Research Group' at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed and subsequently applied to the design of a breadboard window demonstrator. Two fused silica glass window panes were procured and subjected to dedicated analyses, inspection and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout this project.
Vendemia, Nicholas; Chao, Jerry; Ivanidze, Jana; Sanelli, Pina; Spinelli, Henry M
2011-01-01
Medpor (Porex Surgical, Inc, Newnan, GA) is composed of porous polyethylene and is commonly used in craniofacial reconstruction. When complications such as seroma or abscess formation arise, diagnostic modalities are limited because Medpor is radiolucent on conventional radiologic studies. This poses a problem in situations where imaging is necessary to distinguish the implant from surrounding tissues. To present a clinically useful method for imaging Medpor with conventional computed tomographic (CT) scanning. Eleven patients (12 total implants) who have undergone reconstructive surgery with Medpor were included in the study. A retrospective review of CT scans done between 1 and 16 months postoperatively was performed using 3 distinct CT window settings. Measurements of implant dimensions and Hounsfield units were recorded and qualitatively assessed. Of the 3 distinct window settings studied, namely, "bone" (W1100/L450), "soft tissue" (W500/L50), and "implant" (W800/L200), the implant window proved the most ideal, allowing the investigators to visualize and evaluate Medpor in all cases. Qualitative analysis revealed that Medpor implants were able to be distinguished from surrounding tissue in both the implant and soft tissue windows, with a density falling between that of fat and fluid. In 1 case, Medpor could not be visualized in the soft tissue window, although it could be visualized in the implant window. Quantitative analysis demonstrated a mean (SD) density of -38.7 (7.4) Hounsfield units. Medpor may be optimally visualized on conventional CT scans using the implant window settings W800/L200, which can aid in imaging Medpor and diagnosing implant-related complications.
Launch window analysis of satellites in high eccentricity or large circular orbits
NASA Technical Reports Server (NTRS)
Renard, M. L.; Bhate, S. K.; Sridharan, R.
1973-01-01
Numerical methods and computer programs for studying the stability and evolution of orbits of large eccentricity are presented. Methods for determining launch windows and target dates are developed. Mathematical models are prepared to analyze the characteristics of specific missions.
An embedded laser marking controller based on ARM and FPGA processors.
Dongyun, Wang; Xinpiao, Ye
2014-01-01
Laser marking is an important branch of laser information processing technology. Existing laser marking machines, based on a PC and the Windows operating system, are large and inconvenient to move; they also cannot work outdoors or in other harsh environments. To compensate for these disadvantages, this paper proposes an embedded laser marking controller based on ARM and FPGA processors. Based on the principle of laser galvanometer scanning marking, the hardware and software were designed for the application. Experiments showed that this new embedded laser marking controller controls the galvanometers synchronously and can achieve precise marking.
NASA TileWorld manual (system version 2.2)
NASA Technical Reports Server (NTRS)
Philips, Andrew B.; Bresina, John L.
1991-01-01
This manual documents the commands of the NASA TileWorld simulator and provides information about how to run and extend it. The simulator, implemented in Common Lisp with Common Windows, encodes a particular range in a spectrum of domains for controllable research experiments. TileWorld consists of a two-dimensional grid of cells, a set of polygonal tiles, and a single agent which can grasp and move tiles. In addition to agent-executable actions, there is an external event over which the agent has no control; this event corresponds to a 'gust of wind'.
2001-08-08
KODIAK ISLAND, Alaska -- The Sapphire payload is moved into position next to the Starshine 3 payload at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5 p.m. to 7 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
Yoon, Jai-Woong; Sawant, Amit; Suh, Yelin; Cho, Byung-Chul; Suh, Tae-Suk; Keall, Paul
2011-07-01
In dynamic multileaf collimator (MLC) motion tracking with complex intensity-modulated radiation therapy (IMRT) fields, target motion perpendicular to the MLC leaf travel direction can cause beam holds, which increase beam delivery time by up to a factor of 4. As a means to balance delivery efficiency and accuracy, a moving average algorithm was incorporated into a dynamic MLC motion tracking system (i.e., moving average tracking) to account for target motion perpendicular to the MLC leaf travel direction. The experimental investigation of the moving average algorithm compared with real-time tracking and no compensation beam delivery is described. The properties of the moving average algorithm were measured and compared with those of real-time tracking (dynamic MLC motion tracking accounting for both target motion parallel and perpendicular to the leaf travel direction) and no compensation beam delivery. The algorithm was investigated using a synthetic motion trace with a baseline drift and four patient-measured 3D tumor motion traces representing regular and irregular motions with varying baseline drifts. Each motion trace was reproduced by a moving platform. The delivery efficiency, geometric accuracy, and dosimetric accuracy were evaluated for conformal, step-and-shoot IMRT, and dynamic sliding window IMRT treatment plans using the synthetic and patient motion traces. The dosimetric accuracy was quantified via a gamma-test with a 3%/3 mm criterion. The delivery efficiency ranged from 89 to 100% for moving average tracking, 26%-100% for real-time tracking, and 100% (by definition) for no compensation. The root-mean-square geometric error ranged from 3.2 to 4.0 mm for moving average tracking, 0.7-1.1 mm for real-time tracking, and 3.7-7.2 mm for no compensation. The percentage of dosimetric points failing the gamma-test ranged from 4 to 30% for moving average tracking, 0%-23% for real-time tracking, and 10%-47% for no compensation. The delivery efficiency of moving average tracking was up to four times higher than that of real-time tracking and approached the efficiency of no compensation for all cases. The geometric accuracy and dosimetric accuracy of the moving average algorithm were between those of real-time tracking and no compensation, with approximately half the percentage of dosimetric points failing the gamma-test compared with no compensation.
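The perpendicular-motion smoothing at the heart of moving average tracking can be sketched as a causal moving average of the tracked position; the window length below is an assumption for illustration, not the study's parameter:

```python
import numpy as np

def moving_average_position(perp_positions, fs, window_s=1.0):
    """Causal moving average of the target position perpendicular to MLC leaf travel,
    used in place of the instantaneous position to reduce beam holds.
    fs is the tracking sampling rate; the 1 s window is an assumed value."""
    perp_positions = np.asarray(perp_positions, dtype=float)
    n = max(1, int(round(window_s * fs)))
    out = np.empty_like(perp_positions)
    for i in range(len(perp_positions)):
        out[i] = perp_positions[max(0, i - n + 1):i + 1].mean()
    return out
```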
Rusterholz, Thomas; Achermann, Peter; Dürr, Roland; Koenig, Thomas; Tarokh, Leila
2017-06-01
Investigating functional connectivity between brain networks has become an area of interest in neuroscience. Several methods for investigating connectivity have recently been developed; however, these techniques need to be applied with care. We demonstrate that global field synchronization (GFS), a global measure of phase alignment in the EEG as a function of frequency, must be applied considering signal processing principles in order to yield valid results. Multichannel EEG (27 derivations) was analyzed for GFS based on the complex spectrum derived by the fast Fourier transform (FFT). We examined the effect of window functions on GFS, in particular of non-rectangular windows. Applying a rectangular window when calculating the FFT revealed high GFS values for high frequencies (>15 Hz) that were highly correlated (r=0.9) with spectral power in the lower frequency range (0.75-4.5 Hz) and tracked the depth of sleep. This turned out to be spurious synchronization. With a non-rectangular window (Tukey or Hanning), this high-frequency synchronization vanished. Both GFS and power density spectra differed significantly for rectangular and non-rectangular windows. Previous papers using GFS typically did not specify the applied window and may have used a rectangular window function. However, the demonstrated impact of the window function raises the question of the validity of some previous findings at higher frequencies. We demonstrated that it is crucial to apply an appropriate window function for determining synchronization measures based on a spectral approach to avoid spurious synchronization in the beta/gamma range. Copyright © 2017 Elsevier B.V. All rights reserved.
Windows Into the Real World From a Virtual Globe
NASA Astrophysics Data System (ADS)
Rich, J.; Urban-Rich, J.
2007-12-01
Virtual globes such as Google Earth can be great tools for learning about the geographical variation of the earth. The key to virtual globes is the use of satellite imagery to provide a highly accurate view of the earth's surface. However, because the images are not updated regularly, variations in climate and vegetation over time can not be easily seen. In order to enhance the view of the earth and observe these changes by region and over time we are working to add near real time "windows" into the real world from a virtual globe. For the past 4 years we have been installing web cameras in areas of the world that will provide long term monitoring of global changes. By archiving hourly images from arctic, temperate and tropical regions we are creating a visual data set that is already beginning to tell the story of climate variability. The cameras are currently installed in 10 elementary schools in 3 countries and show the student's view out each window. The Windows Around the World program (http://www.WindowsAroundTheWorld.org) uses the images from these cameras to help students gain a better understanding of earth process and variability in climate and vegetation between different regions and over time. Previously we have used standard web based technologies such as DHTML and AJAX to provide near real-time access to these images and also provide enhanced functionality such as dynamic time lapse movies that allow users to see changes over months, days or hours up to the current hour (http://www.windowsaroundtheworld.org/north_america.aspx). We have integrated the camera images from Windows Around the World into Google Earth. Through network links and models we are creating a way for students to "fly" to another school in the program and see what the current view is out the window. By using a model as a screen, the image can be viewed from the same direction as the students who are sitting in a classroom at the participating school. Once at the school, visiting students can move around the area in three dimensions and gain a better understanding of what they are seeing out the window. Currently time-lapse images can be viewed at a lower resolution for all schools on the globe or when flying into an individual school, higher resolution time-lapse images can be seen. The observation of shadows, precipitation, movement of the sun and changes in vegetation allows the viewer to gain a better understanding of how the earth works and how the environment changes between regions and over time.
77 FR 12588 - Long Fence & Home, LLLP; Analysis of Proposed Consent Order To Aid Public Comment
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-01
... homeowners can realize by replacing their windows, including the home's geographic location, size, insulation... window of a specific composition in a building having a specific level of insulation in a specific region..., energy savings, energy...
Thermal and Lorentz Force Analysis of Beryllium Windows for the Rectilinear Muon Cooling Channel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Tianhuan; Li, D.; Virostek, S.
Reduction of the 6-dimensional phase-space of a muon beam by several orders of magnitude is a key requirement for a Muon Collider. Recently, a 12-stage rectilinear ionization cooling channel has been proposed to achieve that goal. The channel consists of a series of low frequency (325 MHz-650 MHz) normal conducting pillbox cavities, which are enclosed with thin beryllium windows (foils) to increase shunt impedance and give a higher field on-axis for a given amount of power. These windows are subject to ohmic heating from RF currents and Lorentz force from the EM field in the cavity, both of which will produce out-of-plane displacements that can detune the cavity frequency. In this study, using the TEM3P code, we report on a detailed thermal and mechanical analysis for the actual Be windows used on a 325 MHz cavity in a vacuum ionization cooling rectilinear channel for a Muon Collider.
Thermal and Lorentz force analysis of beryllium windows for a rectilinear muon cooling channel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, T.; Stratakis, D.; Li, D.
Reduction of the 6-dimensional phase-space of a muon beam by several orders of magnitude is a key requirement for a Muon Collider. Recently, a 12-stage rectilinear ionization cooling channel has been proposed to achieve that goal. The channel consists of a series of low frequency (325 MHz-650 MHz) normal conducting pillbox cavities, which are enclosed with thin beryllium windows (foils) to increase shunt impedance and give a higher field on-axis for a given amount of power. These windows are subject to ohmic heating from RF currents and Lorentz force from the EM field in the cavity, both of which will produce out-of-plane displacements that can detune the cavity frequency. In this study, using the TEM3P code, we report on a detailed thermal and mechanical analysis for the actual Be windows used on a 325 MHz cavity in a vacuum ionization cooling rectilinear channel for a Muon Collider.
Assessment of candidates for target window material in accelerator-driven molybdenum-99 production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strons, Philip; Bailey, James; Makarashvili, Vakhtang
2016-10-01
NorthStar Medical Technologies is pursuing production of an important medical isotope, Mo-99, through a photo-nuclear reaction of a Mo-100 target using a high-power electron accelerator. The current target utilizes an Inconel 718 window. The purpose of this study was to evaluate other candidate materials for the target window, which separates the high-pressure helium gas inside the target from the vacuum inside the accelerator beamline and is subjected to significant stress. Our initial analysis assessed the properties (density, thermal conductivity, maximum stress, minimum window thickness, maximum temperature, and figure of merit) for a range of materials, from which the three most promising were chosen: Inconel 718, 250 maraging steel, and standard-grade beryllium. These materials were subjected to further analysis to determine the effects of thermal and mechanical strain versus beam power at varying thicknesses. Both beryllium and the maraging steel were calculated to withstand more than twice the beam power of Inconel 718.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, J; Shi, W; Andrews, D
2015-06-15
Purpose To compare online image registrations of TrueBeam cone-beam CT (CBCT) and BrainLab ExacTrac imaging systems. Methods Tests were performed on a Varian TrueBeam STx linear accelerator (Version 2.0), which is integrated with a BrainLab ExacTrac imaging system (Version 6.0.5). The study was focused on comparing the online image registrations for translational shifts. A Rando head phantom was placed on the treatment couch and immobilized with a BrainLab mask. The phantom was shifted by moving the couch translationally for 8 mm with a step size of 1 mm, in the vertical, longitudinal, and lateral directions, respectively. At each location, the phantom was imaged with CBCT and ExacTrac x-ray. CBCT images were registered with TrueBeam and ExacTrac online registration algorithms, respectively, and ExacTrac x-ray image registrations were also performed. Shifts calculated from different registrations were compared with nominal couch shifts. Results The averages and ranges of absolute differences between couch shifts and calculated phantom shifts obtained from ExacTrac x-ray registration, ExacTrac CBCT registration with default window, ExacTrac CBCT registration with adjusted window (bone), TrueBeam CBCT registration with bone window, and TrueBeam CBCT registration with soft tissue window, were: 0.07 (0.02–0.14), 0.14 (0.01–0.35), 0.12 (0.02–0.28), 0.09 (0–0.20), and 0.06 (0–0.10) mm, in the vertical direction; 0.06 (0.01–0.12), 0.27 (0.07–0.57), 0.23 (0.02–0.48), 0.04 (0–0.10), and 0.08 (0–0.20) mm, in the longitudinal direction; 0.05 (0.01–0.21), 0.35 (0.14–0.80), 0.25 (0.01–0.56), 0.19 (0–0.40), and 0.20 (0–0.40) mm, in the lateral direction. Conclusion The shifts calculated from ExacTrac x-ray and TrueBeam CBCT registrations were close to each other (the differences between them were less than 0.40 mm in any direction) and had better agreement with couch shifts than those from ExacTrac CBCT registrations. There were no significant differences between TrueBeam CBCT registrations using different windows. In ExacTrac CBCT registrations, using the bone window led to better agreement than using the default window.
Removal of Noise from a Voice Signal by Synthesis
1973-05-01
for 102.4 millisecond windows is about five times as great as the cost of computing for 25.6 millisecond windows. Hammett, in his work on an adaptive... spectrum analysis vocoder, has examined the selection of data window widths in detail [18]. The solution Hammett used to optimize the trade-off between... result is: s(t) = Σ_{i=1}^{n} R_i(t - i·T). In this equation, n is the number of impulse responses under consideration and s(t) is the resulting synthetic signal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, J.; Moon, T.J.; Howell, J.R.
This paper presents an analysis of the heat transfer occurring during an in-situ curing process for which infrared energy is provided on the surface of polymer composite during winding. The material system is Hercules prepreg AS4/3501-6. Thermoset composites have an exothermic chemical reaction during the curing process. An Eulerian thermochemical model is developed for the heat transfer analysis of helical winding. The model incorporates heat generation due to the chemical reaction. Several assumptions are made leading to a two-dimensional, thermochemical model. For simplicity, 360° heating around the mandrel is considered. In order to generate the appropriate process windows, the developed heat transfer model is combined with a simple winding time model. The process windows allow for a proper selection of process variables such as infrared energy input and winding velocity to give a desired end-product state. Steady-state temperatures are found for each combination of the process variables. A regression analysis is carried out to relate the process variables to the resulting steady-state temperatures. Using regression equations, process windows for a wide range of cylinder diameters are found. A general procedure to find process windows for Hercules AS4/3501-6 prepreg tape is coded in a FORTRAN program.
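Once regression equations relating the process variables to steady-state temperature are available, a process window can be mapped directly. The sketch below, in Python rather than the FORTRAN program mentioned above, fits a simple linear regression to hypothetical model outputs and flags the combinations of infrared power and winding velocity whose predicted steady-state temperature falls inside an assumed acceptable cure range; all numbers and the linear functional form are illustrative assumptions.

```python
import numpy as np

# Hypothetical steady-state temperatures (deg C) from the thermochemical model
# for combinations of IR power q (W) and winding velocity v (m/min).
q = np.array([400.0, 400.0, 600.0, 600.0, 800.0, 800.0])
v = np.array([1.0, 2.0, 1.0, 2.0, 1.0, 2.0])
T = np.array([175.0, 150.0, 205.0, 172.0, 238.0, 196.0])

# Linear regression T ~ b0 + b1*q + b2*v (an assumed functional form).
A = np.column_stack([np.ones_like(q), q, v])
coef, *_ = np.linalg.lstsq(A, T, rcond=None)

# Scan a grid of process variables and keep combinations whose predicted
# temperature falls inside an assumed acceptable cure range.
qs, vs = np.meshgrid(np.linspace(300, 900, 61), np.linspace(0.5, 3.0, 26))
T_pred = coef[0] + coef[1] * qs + coef[2] * vs
window = (T_pred >= 170) & (T_pred <= 190)   # the "process window"
print(f"{window.mean():.0%} of the scanned grid lies inside the process window")
```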
Short segment search method for phylogenetic analysis using nested sliding windows
NASA Astrophysics Data System (ADS)
Iskandar, A. A.; Bustamam, A.; Trimarsanto, H.
2017-10-01
In bioinformatics, phylogenetic analysis achieves maximal accuracy when it is based on the full coding DNA sequence (CDS) segment. However, analysis of the full CDS costs a lot of time and money, so a short segment that represents the CDS, such as the envelope protein segment or the non-structural 3 (NS3) segment, is needed. After a sliding window is implemented, a short segment better than the envelope protein segment or NS3 can be found. This paper discusses a mathematical method that analyzes sequences using nested sliding windows to find a short segment that is representative of the whole genome. The results show that our method can find a short segment that is about 6.57% more representative of the CDS segment, in terms of tree topology, than the envelope or NS3 segment.
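The nested scan itself is straightforward to express in code. The sketch below generates inner windows nested inside outer sliding windows and picks the highest-scoring segment; the scoring function is deliberately left as a placeholder, since the paper's criterion (topological similarity of the segment tree to the full-CDS tree) requires a tree-building pipeline that is not reproduced here. All names and parameters are illustrative assumptions.

```python
def nested_windows(seq_len, outer_size, outer_step, inner_size, inner_step):
    """Yield (start, end) pairs of inner windows nested inside sliding outer windows."""
    for o_start in range(0, seq_len - outer_size + 1, outer_step):
        o_end = o_start + outer_size
        for i_start in range(o_start, o_end - inner_size + 1, inner_step):
            yield i_start, i_start + inner_size


def best_segment(alignment, score_fn, **window_params):
    """Return the (start, end) inner window with the highest score.

    score_fn(start, end) is a placeholder for the paper's criterion, i.e. the
    topological similarity between the tree built from the segment and the tree
    built from the full CDS; any callable returning a number can be plugged in.
    """
    seq_len = len(alignment[0])
    return max(nested_windows(seq_len, **window_params), key=lambda w: score_fn(*w))
```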
Cerquera, Alexander; Vollebregt, Madelon A; Arns, Martijn
2018-03-01
Nonlinear analysis of EEG recordings allows detection of characteristics that would probably be neglected by linear methods. This study aimed to determine a suitable epoch length for nonlinear analysis of EEG data based on its recurrence rate in EEG alpha activity (electrodes Fz, Oz, and Pz) from 28 healthy subjects and 64 subjects with major depressive disorder. Two nonlinear metrics, Lempel-Ziv complexity and scaling index, were applied in sliding windows of 20 seconds shifted every 1 second and in nonoverlapping windows of 1 minute. In addition, linear spectral analysis was carried out for comparison with the nonlinear results. The analysis with sliding windows showed that the cortical dynamics underlying alpha activity had a recurrence period of around 40 seconds in both groups. In the analysis with nonoverlapping windows, long-term nonstationarities entailed changes over time in the nonlinear dynamics that became significantly different between epochs across time, which was not detected with the linear spectral analysis. The findings suggest that epoch lengths shorter than 40 seconds neglect information in nonlinear EEG studies. In turn, linear analysis did not detect characteristics arising from long-term nonstationarities in the EEG alpha waves of control subjects and patients with major depressive disorder. We recommend that application of nonlinear metrics to EEG time series, particularly of alpha activity, should be carried out with epochs of around 60 seconds. In addition, this study demonstrates that long-term nonlinearities are inherent to cortical brain dynamics regardless of the presence or absence of a mental disorder.
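For readers who want to reproduce the windowing scheme, the following sketch computes Lempel-Ziv complexity in 20-second windows shifted every second, after binarizing each segment about its median (a common choice; the paper's exact binarization rule and LZ variant are not stated in the abstract, so both are assumptions).

```python
import numpy as np

def lz76_complexity(bits):
    """Count the phrases in a Lempel-Ziv-style parsing of a binary sequence."""
    s = "".join("1" if b else "0" for b in bits)
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # extend the current phrase while it is still a substring of the prefix
        while i + l <= n and s[i:i + l] in s[:i]:
            l += 1
        c += 1
        i += l
    return c

def sliding_lz(signal, fs, win_s=20, step_s=1):
    """LZ complexity of median-binarized EEG in 20 s windows shifted by 1 s."""
    win, step = int(win_s * fs), int(step_s * fs)
    values = []
    for start in range(0, len(signal) - win + 1, step):
        seg = signal[start:start + win]
        values.append(lz76_complexity(seg > np.median(seg)))
    return np.array(values)
```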
Ariza, Pedro; Solesio-Jofre, Elena; Martínez, Johann H.; Pineda-Pardo, José A.; Niso, Guiomar; Maestú, Fernando; Buldú, Javier M.
2015-01-01
In this study we used graph theory analysis to investigate age-related reorganization of functional networks during the active maintenance of information that is interrupted by external interference. Additionally, we sought to investigate network differences before and after averaging network parameters between both maintenance and interference windows. We compared young and older adults by measuring their magnetoencephalographic recordings during an interference-based working memory task restricted to successful recognitions. Data analysis focused on the topology/temporal evolution of functional networks during both the maintenance and interference windows. We observed that: (a) Older adults require higher synchronization between cortical brain sites in order to achieve a successful recognition, (b) The main differences between age groups arise during the interference window, (c) Older adults show reduced ability to reorganize network topology when interference is introduced, and (d) Averaging network parameters leads to a loss of sensitivity to detect age differences. PMID:26029079
Launch Window Trade Analysis for the James Webb Space Telescope
NASA Technical Reports Server (NTRS)
Yu, Wayne H.; Richon, Karen
2014-01-01
The James Webb Space Telescope (JWST) is a large-scale space telescope mission designed to study fundamental astrophysical questions ranging from the formation of the universe to the origin of planetary systems and the origins of life. JWST's orbit design is a Libration Point Orbit (LPO) around the Sun-Earth/Moon (SEM) L2 point for a planned mission lifetime of 10.5 years. The launch readiness period for JWST is from October 1, 2018 to November 30, 2018. This paper presents the first launch window analysis for the JWST observatory using finite-burn modeling; previous analysis assumed a single impulsive midcourse correction to achieve the mission orbit. The physical limitations of the JWST hardware, stemming primarily from propulsion, communication, and thermal requirements, alongside updated mission design requirements result in significant launch window constraints within the launch readiness period. Future plans are also discussed.
James Webb Space Telescope Launch Window Trade Analysis
NASA Technical Reports Server (NTRS)
Yu, Wayne; Richon, Karen
2014-01-01
The James Webb Space Telescope (JWST) is a large-scale space telescope mission designed to study fundamental astrophysical questions ranging from the formation of the universe to the origin of planetary systems and the origins of life. JWST's orbit design is a Libration Point Orbit (LPO) around the Sun-Earth/Moon (SEM) L2 point for a planned mission lifetime of 10.5 years. The launch readiness period for JWST is from October 1, 2018 to November 30, 2018. This paper presents the first launch window analysis for the JWST observatory using finite-burn modeling; previous analysis assumed a single impulsive midcourse correction to achieve the mission orbit. The physical limitations of the JWST hardware, stemming primarily from propulsion, communication, and thermal requirements, alongside updated mission design requirements result in significant launch window constraints within the launch readiness period. Future plans are also discussed.
Size and Location of Defects at the Coupling Interface Affect Lithotripter Performance
Li, Guangyan; Williams, James C.; Pishchalnikov, Yuri A.; Liu, Ziyue; McAteer, James A.
2012-01-01
OBJECTIVE To determine how the size and location of coupling defects caught between the therapy head of a lithotripter and the skin of a surrogate patient (acoustic window of a test chamber) affect the features of shock waves responsible for stone breakage. METHODS Model defects were placed in the coupling gel between the therapy head of a Dornier Compact-S electromagnetic lithotripter and the Mylar window of a water-filled coupling test system. A fiber-optic hydrophone was used to measure acoustic pressures and map the lateral dimensions of the focal zone of the lithotripter. The effect of coupling conditions on stone breakage was assessed using Gypsum model stones. RESULTS Stone breakage decreased in proportion to the area of the coupling defect; a centrally located defect blocking only 18% of the transmission area reduced stone breakage by an average of almost 30%. The effect on stone breakage was greater for defects located on-axis and decreased as the defect was moved laterally; an 18% defect located near the periphery of the coupling window (2.0 cm off-axis) reduced stone breakage by only ~15% compared to when coupling was completely unobstructed. Defects centered within the coupling window acted to narrow the focal width of the lithotripter; an 8.2% defect reduced the focal width ~30% compared to no obstruction (4.4 mm versus 6.5 mm). Coupling defects located slightly off center disrupted the symmetry of the acoustic field; an 18% defect positioned 1.0 cm off-axis shifted the focus of maximum positive pressure ~1.0 mm laterally. Defects on and off-axis imposed a significant reduction in the energy density of shock waves across the focal zone. CONCLUSIONS In addition to blocking the transmission of shock wave energy, coupling defects also disrupt the properties of shock waves that play a role in stone breakage, including the focal width of the lithotripter and the symmetry of the acoustic field; the effect is dependent on the size and location of defects, with defects near the center of the coupling window having the greatest effect. These data emphasize the importance of eliminating air pockets from the coupling interface, particularly defects located near the center of the coupling window. PMID:22938566
NASA Astrophysics Data System (ADS)
Li, M.; Yu, T.; Chunliang, X.; Zuo, X.; Liu, Z.
2017-12-01
A new method for estimating equatorial plasma bubble (EPB) motions from airglow emission all-sky images is presented in this paper. This method, called 'cloud-derived wind technology' and widely used in satellite observations of wind, can reasonably derive zonal and meridional velocity vectors of EPB drifts by tracking a series of successive airglow 630.0 nm emission images. Airglow emission image data are available from an all-sky airglow camera at Hainan Fuke (19.5°N, 109.2°E), supported by the China Meridional Project, which receives the 630.0 nm emission from the low-latitude ionospheric F region to observe plasma bubbles. A series of preprocessing steps, e.g. image enhancement, orientation correction, and image projection, is applied to the raw observations. The plasma bubble regions extracted from the images are then divided into several small tracing windows, and for each tracing window a target window is found within a search area in the following image; this target window is taken as the position to which the tracing window has moved. Velocities in each window are then calculated using the cloud-derived wind technique. Two ways of finding the target window are investigated and compared in detail: the maximum correlation coefficient (MCC) method, which seeks the maximum correlation, and the histogram of gradients (HOG) method, which seeks the minimum Euclidean distance between two gradient histograms. The maximum correlation method is finally adopted in this study to analyze the velocity of plasma bubbles because of its better performance than HOG. All-sky images from Hainan Fuke, between August 2014 and October 2014, are analyzed to investigate the plasma bubble drift velocities using the MCC method. Data at different local times on 9 nights are studied; the zonal drift velocity at different latitudes and local times ranges from 50 m/s to 180 m/s, with a peak value at about 20°N. For comparison and validation, EPB motions obtained from three traditional methods are also investigated and compared with the MCC method. The advantages and disadvantages of using cloud-derived wind technology to calculate EPB drift velocities are discussed.
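The MCC step described above amounts to template matching between consecutive frames. The sketch below locates the displaced position of one tracing window by maximizing the normalized cross-correlation over a search area, and converts the displacement to a drift velocity; the window size, search radius, pixel scale, and time step are illustrative assumptions, not values from the paper.

```python
import numpy as np

def find_target_window(prev_img, next_img, top, left, size, search):
    """Locate a tracing window from prev_img in next_img by maximum correlation.

    The tracing window (size x size pixels at (top, left)) is compared with
    every candidate position within +/- search pixels in the following image;
    the best normalized correlation wins.
    """
    tpl = prev_img[top:top + size, left:left + size].astype(float)
    tpl = (tpl - tpl.mean()) / (tpl.std() + 1e-12)
    best, best_pos = -np.inf, (top, left)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + size > next_img.shape[0] or x + size > next_img.shape[1]:
                continue
            cand = next_img[y:y + size, x:x + size].astype(float)
            cand = (cand - cand.mean()) / (cand.std() + 1e-12)
            r = (tpl * cand).mean()          # normalized cross-correlation
            if r > best:
                best, best_pos = r, (y, x)
    return best_pos, best

# Drift velocity from the window displacement, assuming projected images where
# one pixel spans pixel_scale metres and frames are dt seconds apart
# (illustrative values):
# (y1, x1), _ = find_target_window(img_t0, img_t1, 200, 340, 32, 20)
# v_zonal = (x1 - 340) * pixel_scale / dt
```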
NASA Astrophysics Data System (ADS)
Dwi Prastyo, Dedy; Handayani, Dwi; Fam, Soo-Fen; Puteri Rahayu, Santi; Suhartono; Luh Putu Satyaning Pradnya Paramita, Ni
2018-03-01
Risk assessment and evaluation is essential for financial institutions to measure the potential risk of their counterparties. From mid-2016 until the first quarter of 2017, the Indonesian government ran a national program known as Tax Amnesty. One subsector with the potential to receive a positive impact from the Tax Amnesty program is property and real estate. This work evaluates the risk of the top five companies, in terms of capital share, listed on the Indonesia stock exchange (IDX). To do this, the Value-at-Risk (VaR) with an ARMAX-GARCHX approach is employed. The ARMAX-GARCHX simultaneously models the adaptive mean and variance of each company's stock return, considering exogenous variables, i.e., the IDR/USD exchange rate and the Jakarta Composite Index (JCI). The risk is evaluated in a moving time window scheme. Risk evaluation using the 5% quantile with a window size of 500 trading days performs better than the other scenarios. In addition, a duration test is used to test the dependency between shortfalls; it indicates that the series of shortfalls is independent.
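As a simplified stand-in for the paper's ARMAX-GARCHX model, the sketch below shows the moving-window mechanics with plain historical-simulation VaR: for each day, the 5% quantile of the previous 500 returns serves as the risk forecast, and exceedances can then be counted for backtesting. The model substitution and all parameter values are explicit assumptions.

```python
import numpy as np

def rolling_var(returns, window=500, alpha=0.05):
    """Rolling one-day Value-at-Risk from a moving window of past returns.

    Simplified historical-simulation sketch: the VaR forecast for day t is the
    alpha-quantile of the previous `window` returns. (The paper instead fits an
    ARMAX-GARCHX model with exchange-rate and JCI regressors inside each
    window; that model is not reproduced here.)
    """
    returns = np.asarray(returns, dtype=float)
    var = np.full(returns.shape, np.nan)
    for t in range(window, len(returns)):
        var[t] = np.quantile(returns[t - window:t], alpha)
    return var

# Backtest: count how often the realized return falls below the VaR forecast;
# for alpha = 0.05 roughly 5% violations are expected.
# violations = returns[window:] < rolling_var(returns)[window:]
```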
Retinoic acid temporally orchestrates colonization of the gut by vagal neural crest cells.
Uribe, Rosa A; Hong, Stephanie S; Bronner, Marianne E
2018-01-01
The enteric nervous system arises from neural crest cells that migrate as chains into and along the primitive gut, subsequently differentiating into enteric neurons and glia. Little is known about the mechanisms governing neural crest migration en route to and along the gut in vivo. Here, we report that Retinoic Acid (RA) temporally controls zebrafish enteric neural crest cell chain migration. In vivo imaging reveals that RA loss severely compromises the integrity and migration of the chain of neural crest cells during the time window when they are moving along the foregut. After loss of RA, enteric progenitors accumulate in the foregut and differentiate into enteric neurons, but subsequently undergo apoptosis resulting in a striking neuronal deficit. Moreover, ectopic expression of the transcription factor meis3 and/or the receptor ret partially rescues enteric neuron colonization after RA attenuation. Collectively, our findings suggest that retinoic acid plays a critical temporal role in promoting enteric neural crest chain migration and neuronal survival upstream of Meis3 and RET in vivo. Copyright © 2017 Elsevier Inc. All rights reserved.
Software for Viewing Landsat Mosaic Images
NASA Technical Reports Server (NTRS)
Watts, Zack; Farve, Catharine L.; Harvey, Craig
2003-01-01
A Windows-based computer program has been written to enable novice users (especially educators and students) to view images of large areas of the Earth (e.g., the continental United States) generated from image data acquired in the Landsat observations performed circa the year 1990. The large-area images are constructed as mosaics from the original Landsat images, which were acquired in several wavelength bands and each of which spans an area (in effect, one tile of a mosaic) of 0.5° in latitude by 0.6° in longitude. Whereas the original Landsat data are registered on a universal transverse Mercator (UTM) grid, the program converts the UTM coordinates of a mouse pointer in the image to latitude and longitude, which are continuously updated and displayed as the pointer is moved. The mosaic image currently on display can be exported as a Windows bitmap file. Other images (e.g., of state boundaries or interstate highways) can be overlaid on Landsat mosaics. The program interacts with the user via standard toolbar, keyboard, and mouse user interfaces. The program is supplied on a compact disk along with tutorial and educational information.
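The UTM-to-geographic conversion the viewer performs on every mouse move can be illustrated with the pyproj library (a third-party choice assumed here; the original program is a Windows application and does not use Python). The UTM zone chosen in the example is likewise an illustrative assumption.

```python
from pyproj import Transformer  # third-party library, assumed available

# Sketch of the pointer-coordinate conversion the viewer performs: UTM
# easting/northing are converted to longitude/latitude. The UTM zone here
# (EPSG:32614, zone 14N, central USA) is an illustrative assumption.
to_geographic = Transformer.from_crs("EPSG:32614", "EPSG:4326", always_xy=True)

def pointer_to_lat_lon(easting_m, northing_m):
    lon, lat = to_geographic.transform(easting_m, northing_m)
    return lat, lon

print(pointer_to_lat_lon(500000.0, 4649776.0))  # roughly (42 N, 99 W)
```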
Verch, Andreas; Pfaff, Marina; de Jonge, Niels
2015-06-30
Gold nanoparticles were observed to move at a liquid/solid interface 3 orders of magnitude slower than expected for the movement in a bulk liquid by Brownian motion. The nanoscale movement was studied with scanning transmission electron microscopy (STEM) using a liquid enclosure consisting of microchips with silicon nitride windows. The experiments involved a variation of the electron dose, the coating of the nanoparticles, the surface charge of the enclosing membrane, the viscosity, and the liquid thickness. The observed slow movement was not a result of hydrodynamic hindrance near a wall but instead explained by the presence of a layer of ordered liquid exhibiting a viscosity 5 orders of magnitude larger than a bulk liquid. The increased viscosity presumably led to a dramatic slowdown of the movement. The layer was formed as a result of the surface charge of the silicon nitride windows. The exceptionally slow motion is a crucial aspect of electron microscopy of specimens in liquid, enabling a direct observation of the movement and agglomeration of nanoscale objects in liquid.
Car glass microphones using piezoelectric transducers for external alarm detection and localization
NASA Astrophysics Data System (ADS)
Bolzmacher, Christian; Le Guelvouit, Valentin
2015-05-01
This work describes the potential use of car windows as a long-range acoustic sensing device for external alarm signals. The goal is to detect and localize siren signals (e.g. from ambulances and police cars) and to alert presbycusic drivers to their presence by visual and acoustic feedback, in order to improve individual mobility and increase the sense of security. The glass panes of a Renault Zoé operating as an acoustic antenna have been equipped with large 50 mm outer diameter piezoceramic rings, hidden in the lower part of the door structure and the lower part of the windshield and the rear window. The response of the glass to quasi-static signals and sweep excitation has been recorded. In general, the glass pane acts as a high-pass filter due to its inherent stiffness and provides little damping. This effect is compensated by using a charge amplifier electronic circuit. The detection capability up to 120 m as well as a dynamic test where the car is moving towards the sound source is reported.
Hubbard, Timothy L; Motes, Michael A
2005-08-01
Memory for the initial and final positions of moving targets was examined. When targets appeared adjacent to the boundary of a larger enclosing window, memory for initial position exhibited a Fröhlich effect (i.e., a displacement forward), and when distance of initial position from the boundary increased, memory for initial position exhibited a smaller Fröhlich effect or an onset repulsion effect (i.e., a displacement backward). When targets vanished adjacent to the boundary of a larger enclosing window, memory for final position was displaced backward, and when distance of final position from the boundary increased, memory for final position did not exhibit significant displacement. These patterns differed from previously reported displacements of initial and final positions of targets presented on a blank background. Possible influences of attention and extrapolation of trajectory on whether memory for initial position exhibits a Fröhlich effect or an onset repulsion effect and on backward displacement in memory for final position are discussed.
MOVING TO INEQUALITY: NEIGHBORHOOD EFFECTS AND EXPERIMENTS MEET STRUCTURE
Sampson, Robert J.
2014-01-01
The Moving to Opportunity (MTO) housing experiment has proven to be an important intervention not just in the lives of the poor, but in social science theories of neighborhood effects. Competing causal claims have been the subject of considerable disagreement, culminating in the debate between Clampet-Lundquist and Massey (2008) and Ludwig et al. (2008). This paper assesses the debate by clarifying analytically distinct questions posed by neighborhood-level theories, reconceptualizing selection bias as a fundamental social process worthy of study in its own right rather than as a statistical nuisance, and reconsidering the scientific method of experimentation, and hence causality, in the social world of the city. I also analyze MTO and independent survey data from Chicago to examine trajectories of residential attainment. Although MTO provides crucial leverage for estimating neighborhood effects on individuals, as proponents rightly claim, I demonstrate the implications imposed by a stratified urban structure and how MTO simultaneously provides a new window on the social reproduction of concentrated inequality. PMID:25360053
2006-08-28
KENNEDY SPACE CENTER, FLA. - Crawler-transporter No. 2 nears Launch Pad 39B (in the background, right). The tip of the orange external tank can be seen above the rotating service structure surrounding the shuttle. The crawler is being moved nearby in the event the mission management team decides to roll back Space Shuttle Atlantis due to Hurricane Ernesto. The hurricane has been forecast on a heading north and east from Cuba, taking it along the eastern coast of Florida. NASA's lighted launch window extends to Sept. 13, but mission managers are hoping to launch on mission STS-115 by Sept. 7 to avoid a conflict with a Russian Soyuz rocket also bound for the International Space Station. The crawler is 131 feet long, 113 feet wide and 20 feet high. It weighs 5.5 million pounds unloaded. The combined weight of crawler, mobile launcher platform and a space shuttle is 12 million pounds. Unloaded, the crawler moves at 2 mph. Loaded, the snail's pace slows to 1 mph. Photo credit: NASA/Kim Shiflett
2006-08-28
KENNEDY SPACE CENTER, FLA. - Crawler-transporter No. 2 makes its way toward Launch Pad 39B (in the background). The tip of the orange external tank can be seen above the rotating service structure surrounding the shuttle. The crawler is being moved nearby in the event the mission management team decides to roll back Space Shuttle Atlantis due to Hurricane Ernesto. The hurricane has been forecast on a heading north and east from Cuba, taking it along the eastern coast of Florida. NASA's lighted launch window extends to Sept. 13, but mission managers are hoping to launch on mission STS-115 by Sept. 7 to avoid a conflict with a Russian Soyuz rocket also bound for the International Space Station. The crawler is 131 feet long, 113 feet wide and 20 feet high. It weighs 5.5 million pounds unloaded. The combined weight of crawler, mobile launcher platform and a space shuttle is 12 million pounds. Unloaded, the crawler moves at 2 mph. Loaded, the snail's pace slows to 1 mph. Photo credit: NASA/Kim Shiflett
Aperture Synthesis Shows Perceptual Integration of Geometrical Form Across Saccades.
Schreiber, Kai; Morgan, Michael
2018-03-01
We investigated the perceptual bias in perceived relative lengths in the Brentano version of the Müller-Lyer arrowheads figure. The magnitude of the bias was measured both under normal whole-figure viewing condition and under an aperture viewing condition, where participants moved their gaze around the figure but could see only one arrowhead at a time through a Gaussian-weighted contrast window. The extent of the perceptual bias was similar under the two conditions. The stimuli were presented on a CRT in a light-proof room with room-lights off, but visual context was provided by a rectangular frame surrounding the figure. The frame was either stationary with respect to the figure or moved in such a manner that the bias would be counteracted if the observer were locating features with respect to the frame. Biases were reduced in the latter condition. We conclude that integration occurs over saccades, but largely in an external visual framework, rather than in a body-centered frame using an extraretinal signal.
Bzorgi, Fariborz M.
2015-05-19
In various embodiments an apparatus is presented for securing a structure such as a door, window, hatch, or gate that moves between an open and a closed position relative to a fixed structure to provide or deny access to a compartment, a room, an outdoor area, or a facility. Various embodiments provide a delay in opening the closure of sufficient duration to frustrate a rapid activation that might be desired by a person who is attempting to pass through the closure for some illicit purpose. Typically, hydraulics are used to activate the apparatus and no electrical energy or electronic signals are employed. In one embodiment, a plurality of actuations of a hand lever operates a hydraulic pump that moves a locking bolt from a first position in which a locking bolt is engaged with a recess in the fixed structure (preventing opening of a gate) to a second position in which the locking bolt is disengaged from the recess to permit opening of the gate.
Static analysis of a sonar dome rubber window
NASA Technical Reports Server (NTRS)
Lai, J. L.
1978-01-01
The application of NASTRAN (level 16.0.1) to the static analysis of a sonar dome rubber window (SDRW) was demonstrated. The assessment of the conventional model (neglecting the enclosed fluid) for the stress analysis of the SDRW was made by comparing its results to those based on a sophisticated model (including the enclosed fluid). The fluid was modeled with isoparametric linear hexahedron elements with approximate material properties whose shear modulus was much smaller than its bulk modulus. The effect of the chosen material property for the fluid is discussed.
NASA Astrophysics Data System (ADS)
Yang, Shuang-Long; Liang, Li-Ping; Liu, Hou-De; Xu, Ke-Jun
2018-03-01
Aiming at reducing the estimation error of the sensor frequency response function (FRF) estimated by the commonly used window-based spectral estimation method, the error models of interpolation and transient errors are derived in the form of non-parametric models. Accordingly, window effects on the errors are analyzed, revealing that the commonly used Hanning window leads to a smaller interpolation error, which can also be largely eliminated by the cubic spline interpolation method when estimating the FRF from step response data, and that a window with a smaller front-end value can suppress more of the transient error. Thus, a new dual-cosine window with its non-zero discrete Fourier transform bins at -3, -1, 0, 1, and 3 is constructed for FRF estimation. Compared with the Hanning window, the new dual-cosine window has equivalent interpolation error suppression capability and better transient error suppression capability when estimating the FRF from the step response; specifically, it improves the asymptotic decay of the transient error from O(N^-2) for the Hanning window method to O(N^-4) while only increasing the uncertainty slightly (about 0.4 dB). Then, one direction of a wind tunnel strain gauge balance, which is a high-order, lightly damped, non-minimum-phase system, is employed as the example for verifying the new dual-cosine window-based spectral estimation method. The model simulation result shows that the new dual-cosine window method is better than the Hanning window method for FRF estimation, and compared with the Gans method and LPM method, it has the advantages of simple computation, less time consumption, and short data requirement; the actual data calculation result of the balance FRF is consistent with the simulation result. Thus, the new dual-cosine window is effective and practical for FRF estimation.
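The general window-based route from a recorded step response to an FRF estimate can be sketched as follows. The dual-cosine window's coefficients are not given in the abstract, so the Hanning window stands in for it here, and differentiating the step response to an impulse-response estimate is one common (assumed) implementation of the approach rather than the authors' exact algorithm.

```python
import numpy as np

def frf_from_step(step_response, fs, window="hann"):
    """Estimate a sensor FRF from its sampled step response.

    Sketch of a window-based spectral approach: differentiate the step
    response to get an impulse-response estimate, apply a window, and take
    the FFT. The Hanning window is used as a stand-in for the dual-cosine
    window, whose coefficients are not published in the abstract.
    """
    h = np.diff(step_response) * fs                 # impulse-response estimate
    w = np.hanning(len(h)) if window == "hann" else np.ones(len(h))
    H = np.fft.rfft(h * w) / fs                     # approximate continuous FRF
    freqs = np.fft.rfftfreq(len(h), d=1.0 / fs)
    return freqs, H
```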
Low-E Storm Windows Gain Acceptance as a Home Weatherization Measure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbride, Theresa L.; Cort, Katherine A.
This article for Home Energy Magazine describes work by the U.S. Department of Energy to develop low-emissivity storm windows as an energy efficiency-retrofit option for existing homes. The article describes the low-emissivity invisible silver metal coatings on the glass, which reflect heat back into the home in winter or back outside in summer and the benefits of low-e storm windows including insulation, air sealing, noise blocking, protection of antique windows, etc. The article also describes Pacific Northwest National Laboratory's efforts on behalf of DOE to overcome market barriers to adoption of the technology, including performance validation studies in the PNNL Lab Homes, cost effectiveness analysis, production of reports, brochures, how-to guides on low-e storm window installation for the Building America Solution Center, and a video posted on YouTube. PNNL's efforts were reviewed by the Pacific Northwest Regional Technical Forum (RTF), which serves as the advisory board to the Pacific Northwest Electric Power Planning Council and Bonneville Power Administration. In late July 2015, the RTF approved the low-e storm window measure's savings and specifications, a critical step in integrating low-e storm windows into energy-efficiency planning and utility weatherization and incentive programs. PNNL estimates that more than 90 million homes in the United States with single-pane or low-performing double-pane windows would benefit from the technology. Low-e storm windows are suitable not only for private residences but also for small commercial buildings, historic properties, and facilities that house residents, such as nursing homes, dormitories, and in-patient facilities. To further assist in the market transformation of low-e storm windows and other high-efficiency window attachments, DOE helped found the window Attachment Energy Rating Council (AERC) in 2015. AERC is an independent, public interest, non-profit organization whose mission is to rate, label, and certify the performance of window attachments.
Monitoring Wetland Hydro-dynamics in the Prairie Pothole Region Using Landsat Time Series
NASA Astrophysics Data System (ADS)
Zhou, Q.; Rover, J.; Gallant, A.
2017-12-01
Wetlands provide a variety of ecosystem functions, yet they are spatially and temporally dynamic. We mapped the dynamics of wetlands in the North Dakota Prairie Pothole Region using all available clear observations of Landsat sensor data from 1985 to 2014. We used a cluster analysis to group pixels exhibiting similar long-term spectral trends over seven Landsat bands, then applied the tasseled-cap transformation to evaluate the temporal characteristics of brightness, greenness, and wetness for each cluster. We tested relations between these three indices and hydrologic conditions, as represented by the Palmer Hydrological Drought Index (PHDI), using cross-correlation analysis for each cluster performed over an eight-year moving window for the 30 years covered by the study. This temporal window size coincided with the timing of a major shift from a prolonged drought that occurred within the first eight years of the study period to wetter conditions that prevailed throughout the remaining years. The 20 clusters we produced represented a gradient from locations that continuously held water throughout the study period to locations that, at most, held water only for short periods in some years. The spatial distribution of the cluster groups reflected patterns of regional geologic and geomorphologic features. Comparisons of the PHDI to tasseled-cap wetness were the most straightforward to interpret among the results from the three indices. Wetness for most cluster groups had high positive correlations with PHDI during drought years, with the correlations reduced as the landscape entered a lengthy, wetter period; however, wetness generally remained highly and positively correlated with PHDI across all years for four cluster groups where the area exhibited two or more multi-year dry-wet cycles. These same four groups also had strong, generally negative correlations with tasseled-cap brightness. For other cluster groups, brightness often was strongly negatively correlated with the PHDI during the drought years, with the relation weakening for subsequent years of adequate or high moisture. Relations between tasseled-cap greenness and PHDI were highly variable among and within cluster groups. Results from this analysis support ongoing efforts to develop new products that characterize wetland dynamics.
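The windowed comparison between hydrologic conditions and the tasseled-cap indices reduces to a rolling correlation, as in the sketch below; the series are assumed to be annual cluster-mean values, which is an illustrative simplification of the study's cross-correlation analysis.

```python
import numpy as np

def windowed_correlation(phdi, wetness, years, window=8):
    """Pearson correlation between PHDI and tasseled-cap wetness in moving windows.

    Sketch of a moving-window comparison: both inputs are annual cluster means
    aligned on `years`; an 8-year window slides one year at a time over the
    record, returning (start_year, end_year, r) for each position.
    """
    phdi, wetness = np.asarray(phdi, float), np.asarray(wetness, float)
    out = []
    for i in range(len(years) - window + 1):
        p, w = phdi[i:i + window], wetness[i:i + window]
        r = np.corrcoef(p, w)[0, 1]
        out.append((years[i], years[i + window - 1], r))
    return out
```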
Windowed multipole for cross section Doppler broadening
NASA Astrophysics Data System (ADS)
Josey, C.; Ducru, P.; Forget, B.; Smith, K.
2016-02-01
This paper presents an in-depth analysis of the accuracy and performance of the windowed multipole Doppler broadening method. The basic theory behind cross section data is described, along with the basic multipole formalism, followed by the approximations leading to the windowed multipole method and the algorithm used to efficiently evaluate Doppler broadened cross sections. The method is tested by simulating the BEAVRS benchmark with a windowed multipole library composed of 70 nuclides. Accuracy of the method is demonstrated on a single assembly case where total neutron production rates and 238U capture rates agree to within 0.1% with ACE format files at the same temperature. With regards to performance, clock cycle counts and cache misses were measured for single temperature ACE table lookup and for windowed multipole. The windowed multipole method was found to require 39.6% more clock cycles to evaluate, translating to a 7.9% performance loss overall. However, the algorithm has significantly better last-level cache performance, with 3 fewer misses per evaluation, or a 65% reduction in last-level misses. This is due to the small memory footprint of the windowed multipole method and better memory access pattern of the algorithm.
Non-susceptible landslide areas in Italy and in the Mediterranean region
NASA Astrophysics Data System (ADS)
Marchesini, I.; Ardizzone, F.; Alvioli, M.; Rossi, M.; Guzzetti, F.
2014-04-01
We used landslide information for 13 study areas in Italy and morphometric information obtained from the 3 arc-second SRTM DEM to determine areas where landslide susceptibility is expected to be null or negligible in Italy, and in the landmasses surrounding the Mediterranean Sea. The morphometric information consisted of the local terrain slope computed in a square 3 × 3 cell moving window, and the regional relative relief computed in a circular 15 × 15 cell moving window. We tested three different models to determine the non-susceptible landslide areas, including a linear model (LR), a quantile linear model (QLR), and a quantile non-linear model (QNL). We tested the performance of the three models using independent landslide information represented by the Italian Landslide Inventory (Inventario Fenomeni Franosi in Italia - IFFI). Best results were obtained using the QNL model. The corresponding zonation of non-susceptible landslide areas was intersected in a GIS with geographical census data for Italy. The result showed that 57.5% of the population of Italy (in 2001) was located in areas where landslide susceptibility is expected to be null or negligible, and that the remaining 42.5% was located in areas where some landslide susceptibility is expected. We applied the QNL model to the landmasses surrounding the Mediterranean Sea, and we tested the synoptic non-susceptibility zonation using independent landslide information for three study areas in Spain. Results proved that the QNL model was capable of determining where landslide susceptibility is expected to be negligible in the Mediterranean area. We expect our results to be applicable in similar study areas, facilitating the identification of non-susceptible and susceptible landslide areas, at the synoptic scale.
Non-susceptible landslide areas in Italy and in the Mediterranean region
NASA Astrophysics Data System (ADS)
Marchesini, I.; Ardizzone, F.; Alvioli, M.; Rossi, M.; Guzzetti, F.
2014-08-01
We used landslide information for 13 study areas in Italy and morphometric information obtained from the 3-arcseconds shuttle radar topography mission digital elevation model (SRTM DEM) to determine areas where landslide susceptibility is expected to be negligible in Italy and in the landmasses surrounding the Mediterranean Sea. The morphometric information consisted of the local terrain slope which was computed in a square 3 × 3-cell moving window, and in the regional relative relief computed in a circular 15 × 15-cell moving window. We tested three different models to classify the "non-susceptible" landslide areas, including a linear model (LNR), a quantile linear model (QLR), and a quantile, non-linear model (QNL). We tested the performance of the three models using independent landslide information presented by the Italian Landslide Inventory (Inventario Fenomeni Franosi in Italia - IFFI). Best results were obtained using the QNL model. The corresponding zonation of non-susceptible landslide areas was intersected in a geographic information system (GIS) with geographical census data for Italy. The result determined that 57.5% of the population of Italy (in 2001) was located in areas where landslide susceptibility is expected to be negligible. We applied the QNL model to the landmasses surrounding the Mediterranean Sea, and we tested the synoptic non-susceptibility zonation using independent landslide information for three study areas in Spain. Results showed that the QNL model was capable of determining where landslide susceptibility is expected to be negligible in the validation areas in Spain. We expect our results to be applicable in similar study areas, facilitating the identification of non-susceptible landslide areas, at the synoptic scale.
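Both morphometric layers can be derived from the SRTM DEM with standard moving-window (focal) operations, as in the sketch below; the 90 m cell size, the slope formula, and the use of SciPy filters are assumptions for illustration, and the susceptibility thresholds themselves come from the fitted regression models rather than from this snippet.

```python
import numpy as np
from scipy import ndimage

def slope_and_relief(dem, cell_size=90.0):
    """Local slope (3 x 3 window) and relative relief (circular 15 x 15 window).

    Sketch of the two morphometric layers used to screen non-susceptible areas.
    cell_size = 90 m approximates the 3 arc-second SRTM resolution.
    """
    dzdy, dzdx = np.gradient(dem, cell_size)
    slope_deg = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))   # local slope

    # circular footprint of diameter 15 cells for the regional relative relief
    y, x = np.ogrid[-7:8, -7:8]
    footprint = (x**2 + y**2) <= 7**2
    relief = (ndimage.maximum_filter(dem, footprint=footprint)
              - ndimage.minimum_filter(dem, footprint=footprint))
    return slope_deg, relief

# Cells with both very low slope and very low relative relief are candidates
# for the non-susceptible class; the thresholds come from the fitted models.
```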
Four-dimensional layer-stacking carbon-ion beam dose distribution by use of a lung numeric phantom.
Mori, Shinichiro; Kumagai, Motoki; Miki, Kentaro
2015-07-01
To extend layer-stacking irradiation to accommodate intrafractional organ motion, we evaluated the carbon-ion layer-stacking dose distribution using a numeric lung phantom. We designed several types of range compensators. The planning target volume was calculated from the respective respiratory phases for consideration of intrafractional beam range variation. The accumulated dose distribution was calculated by registering the dose distributions at the respective phases to that at the reference phase. We evaluated the dose distribution based on the following six parameters: motion displacement, direction, gating window, respiratory cycle, range-shifter change time, and prescribed dose. All parameters affected the dose conformation to the moving target. By shortening the gating window, dose metrics for superior-inferior (SI) and anterior-posterior (AP) motions were decreased from a D95 of 94 %, Dmax of 108 %, and homogeneity index (HI) of 23 % at T00-T90, to a D95 of 93 %, Dmax of 102 %, and HI of 20 % at T40-T60. In contrast, all dose metrics except the HI were independent of respiratory cycle. All dose metrics in SI motion were almost the same for each motion displacement, with a D95 of 94 %, Dmax of 108 %, Dmin of 89 %, and HI of 23 % for the ungated phase, and D95 of 93 %, Dmax of 102 %, Dmin of 85 %, and HI of 20 % for the gated phase. The dose conformation to a moving target was improved by the gating strategy and by an increase in the prescribed dose. A combination of these approaches is a practical means of incorporating them into existing treatment protocols without modification.
Seismic facies analysis based on self-organizing map and empirical mode decomposition
NASA Astrophysics Data System (ADS)
Du, Hao-kun; Cao, Jun-xing; Xue, Ya-juan; Wang, Xing-jian
2015-01-01
Seismic facies analysis plays an important role in seismic interpretation and reservoir model building by offering an effective way to identify changes in geofacies between wells. The selection of input seismic attributes and their time window has an obvious effect on the validity of classification and requires iterative experimentation and prior knowledge. In general, clustering is sensitive to noise when the waveform serves as the input data, especially with a narrow window. To overcome this limitation, the Empirical Mode Decomposition (EMD) method is introduced into waveform classification based on SOM. We first de-noise the seismic data using EMD and then cluster the data using a 1D grid SOM. The main advantages of this method are resolution enhancement and noise reduction. 3D seismic data from the western Sichuan basin, China, are collected for validation. The application results show that seismic facies analysis can be improved and better support interpretation. The strong tolerance for noise makes the proposed method a better seismic facies analysis tool than the classical 1D grid SOM method, especially for waveform clustering with a narrow window.
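A minimal sketch of the EMD-then-SOM pipeline is given below, using the third-party PyEMD and MiniSom packages as stand-ins for the authors' implementation; treating the first IMF as noise and the choice of eight classes are illustrative assumptions.

```python
import numpy as np
from PyEMD import EMD        # third-party package (PyEMD), assumed available
from minisom import MiniSom  # third-party package (MiniSom), assumed available

def denoise_trace(trace, drop_imfs=1):
    """Remove the highest-frequency IMF(s) from one seismic trace via EMD.

    Dropping the first IMF as "noise" is a simple illustrative choice; the
    paper's exact reconstruction rule is not specified in the abstract.
    """
    imfs = EMD()(np.asarray(trace, float))
    return imfs[drop_imfs:].sum(axis=0)

def classify_waveforms(traces, n_classes=8, iters=5000):
    """Cluster de-noised waveforms with a 1-D grid SOM and return class labels."""
    data = np.array([denoise_trace(t) for t in traces])
    data = (data - data.mean(axis=1, keepdims=True)) / (data.std(axis=1, keepdims=True) + 1e-12)
    som = MiniSom(1, n_classes, data.shape[1], sigma=1.0, learning_rate=0.5)
    som.train_random(data, iters)
    return np.array([som.winner(d)[1] for d in data])
```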
Minimal Window Duration for Accurate HRV Recording in Athletes
Bourdillon, Nicolas; Schmitt, Laurent; Yazdani, Sasan; Vesin, Jean-Marc; Millet, Grégoire P.
2017-01-01
Heart rate variability (HRV) is non-invasive and commonly used for monitoring responses to training loads, fitness, or overreaching in athletes. Yet, the recording duration for a series of RR-intervals varies from 1 to 15 min in the literature. The aim of the present work was to assess the minimum record duration to obtain reliable HRV results. RR-intervals from 159 orthostatic tests (7 min supine, SU, followed by 6 min standing, ST) were analyzed. Reference windows were 4 min in SU (min 3–7) and 4 min in ST (min 9–13). Those windows were subsequently divided and the analyses were repeated on eight different fractioned windows: the first min (0–1), the second min (1–2), the third min (2–3), the fourth min (3–4), the first 2 min (0–2), the last 2 min (2–4), the first 3 min (0–3), and the last 3 min (1–4). Correlation and Bland & Altman statistical analyses were systematically performed. The analysis window could be shortened to 0–2 instead of 0–4 for RMSSD only, whereas the 4-min window was necessary for LF and total power. Since there is a need for 1 min of baseline to obtain a steady signal prior to the analysis window, we conclude that studies relying on RMSSD may shorten the windows to 3 min (= 1+2) in SU or seated position only and to 6 min (= 1+2 min SU plus 1+2 min ST) if there is an orthostatic test. Studies relying on time- and frequency-domain parameters need a minimum of 5 min (= 1+4) in SU or seated position only but require 10 min (= 1+4 min SU plus 1+4 min ST) for the orthostatic test. PMID:28848382
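RMSSD itself is a one-line computation, which is part of why it tolerates the shortest windows; the sketch below computes it for an arbitrary sub-window of an RR series (the window boundaries in seconds are illustrative).

```python
import numpy as np

def rmssd(rr_ms):
    """RMSSD of an RR-interval series (ms): root mean square of successive differences."""
    rr = np.asarray(rr_ms, float)
    return np.sqrt(np.mean(np.diff(rr) ** 2))

def rmssd_in_window(rr_ms, t_start_s, t_end_s):
    """RMSSD restricted to beats whose cumulative time falls in [t_start_s, t_end_s).

    Sketch of the fractioned-window comparison, e.g. the 0-2 min window of the
    supine phase versus the full 0-4 min reference window.
    """
    rr = np.asarray(rr_ms, float)
    t = np.cumsum(rr) / 1000.0                      # beat times in seconds
    mask = (t >= t_start_s) & (t < t_end_s)
    return rmssd(rr[mask])
```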
Analysts guide: TreeVal for Windows, Version 2.0.
R.D. Fight; J.T. Chmelik; E.A. Coulter
2001-01-01
TreeVal for Windows provides financial information and analysis to support silvicultural decisions in coast Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco). It integrates the effect of growth and yield, management costs, harvesting costs, product and mill type, manufacturing costs, product prices, and product grade premiums. Output files from...
The influence of opening windows and doors on the natural ventilation rate of a residential building
Increased building energy efficiency is important in reducing national energy use and greenhouse gas emissions. An analysis of air change rates due to door and window openings in a research test house located in a residential environment is presented. These data inform developme...
MARD—A moving average rose diagram application for the geosciences
NASA Astrophysics Data System (ADS)
Munro, Mark A.; Blenkinsop, Thomas G.
2012-12-01
MARD 1.0 is a computer program for generating smoothed rose diagrams by using a moving average, which is designed for use across the wide range of disciplines encompassed within the Earth Sciences. Available in MATLAB®, Microsoft® Excel and GNU Octave formats, the program is fully compatible with both Microsoft® Windows and Macintosh operating systems. Each version has been implemented in a user-friendly way that requires no prior experience in programming with the software. MARD conducts a moving average smoothing, a form of signal processing low-pass filter, upon the raw circular data according to a set of pre-defined conditions selected by the user. This form of signal processing filter smoothes the angular dataset, emphasising significant circular trends whilst reducing background noise. Customisable parameters include whether the data is uni- or bi-directional, the angular range (or aperture) over which the data is averaged, and whether an unweighted or weighted moving average is to be applied. In addition to the uni- and bi-directional options, the MATLAB® and Octave versions also possess a function for plotting 2-dimensional dips/pitches in a single, lower, hemisphere. The rose diagrams from each version are exportable as one of a selection of common graphical formats. Frequently employed statistical measures that determine the vector mean, mean resultant (or length), circular standard deviation and circular variance are also included. MARD's scope is demonstrated via its application to a variety of datasets within the Earth Sciences.
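The core smoothing operation MARD performs can be sketched as an unweighted circular moving average of the orientation data, as below; the bin width and aperture values are illustrative, and the weighted-average and 2-dimensional dip-plotting options of the full program are not reproduced.

```python
import numpy as np

def smoothed_rose(angles_deg, bin_width=10, aperture=30, bidirectional=True):
    """Moving-average rose diagram counts, in the spirit of MARD.

    angles_deg : raw orientations in degrees
    bin_width  : angular bin size for the output rose
    aperture   : full width of the moving-average window in degrees
    Returns (bin_centres, smoothed_counts). Bi-directional data are folded to
    0-180 degrees, as for strike or joint orientations.
    """
    period = 180.0 if bidirectional else 360.0
    a = np.mod(np.asarray(angles_deg, float), period)
    centres = np.arange(bin_width / 2.0, period, bin_width)
    smoothed = np.empty_like(centres)
    for i, c in enumerate(centres):
        # circular distance from each observation to the bin centre
        d = np.abs((a - c + period / 2.0) % period - period / 2.0)
        smoothed[i] = np.count_nonzero(d <= aperture / 2.0)
    return centres, smoothed
```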
Steveson, Chloe; Schuijf, Joanne D; Vavere, Andrea L; Mather, Richard T; Caton, Teresa; Mehra, Vishal; Betoko, Aisha; Cox, Christopher; Lima, Joao Ac; George, Richard T
The aim of this study is to evaluate the effect of heart rate on exposure window, best phase, and image quality for stress computed tomography perfusion (CTP) in the CORE320 study. The CTP data sets were analyzed to determine the best phase for perfusion analysis. A predefined exposure window covering 75% to 95% of the R-R cycle was used. Of the 368 patients included in the analysis, 93% received oral β blockade before the rest scan. The median heart rate during the stress acquisition was 69 bpm (interquartile range [IQR], 60-77). The median best phase was 81% (IQR, 76-90), and length of exposure window was 22% (IQR, 19-24). The best phase was significantly later in the cardiac cycle with higher heart rates (P < 0.001), and higher heart rates resulted in a small, but higher number of poor quality scans (6%, P < 0.001). The median effective dose of the stress scan was 5.3 mSv (IQR, 3.8-6.1). Stress myocardial CTP imaging can be performed using prospective electrocardiography triggering, an exposure window of 75% to 95%, and β-blockade resulting in good or excellent image quality in the majority (80%) of patients while maintaining a low effective radiation dose.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kohn, S. A.; Aguirre, J. E.; Moore, D. F.
2016-06-01
Current generation low-frequency interferometers constructed with the objective of detecting the high-redshift 21 cm background aim to generate power spectra of the brightness temperature contrast of neutral hydrogen in the primordial intergalactic medium. Two-dimensional (2D) power spectra (power in Fourier modes parallel and perpendicular to the line of sight) formed from interferometric visibilities have been shown to delineate a boundary between spectrally smooth foregrounds (known as the "wedge") and spectrally structured 21 cm background emission (the "EoR window"). However, polarized foregrounds are known to possess spectral structure due to Faraday rotation, which can leak into the EoR window. In this work we create and analyze 2D power spectra from the PAPER-32 imaging array in Stokes I, Q, U, and V. These allow us to observe and diagnose systematic effects in our calibration at high signal-to-noise within the Fourier space most relevant to EoR experiments. We observe well-defined windows in the Stokes visibilities, with Stokes Q, U, and V power spectra sharing a similar wedge shape to that seen in Stokes I. With modest polarization calibration, we see no evidence that polarization calibration errors move power outside the wedge in any Stokes visibility to the noise levels attained. Deeper integrations will be required to confirm that this behavior persists to the depth required for EoR detection.
Visual control of foot placement when walking over complex terrain.
Matthis, Jonathan S; Fajen, Brett R
2014-02-01
The aim of this study was to investigate the role of visual information in the control of walking over complex terrain with irregularly spaced obstacles. We developed an experimental paradigm to measure how far along the future path people need to see in order to maintain forward progress and avoid stepping on obstacles. Participants walked over an array of randomly distributed virtual obstacles that were projected onto the floor by an LCD projector while their movements were tracked by a full-body motion capture system. Walking behavior in a full-vision control condition was compared with behavior in a number of other visibility conditions in which obstacles did not appear until they fell within a window of visibility centered on the moving observer. Collisions with obstacles were more frequent and, for some participants, walking speed was slower when the visibility window constrained vision to less than two step lengths ahead. When window sizes were greater than two step lengths, the frequency of collisions and walking speed were weakly affected or unaffected. We conclude that visual information from at least two step lengths ahead is needed to guide foot placement when walking over complex terrain. When placed in the context of recent research on the biomechanics of walking, the findings suggest that two step lengths of visual information may be needed because it allows walkers to exploit the passive mechanical forces inherent to bipedal locomotion, thereby avoiding obstacles while maximizing energetic efficiency. PsycINFO Database Record (c) 2014 APA, all rights reserved.
NASA Astrophysics Data System (ADS)
Salinas Solé, Celia; Peña Angulo, Dhais; Gonzalez HIgaldo, Jose Carlos; Brunetti, MIchele
2017-04-01
In this poster we applied the moving window approach (see Poster I of this collection) to analyze trends in the mean maximum (Tmax) and minimum (Tmin) temperatures of autumn and its constituent months (September, October, November) over mainland Spain, in order to detect the effects of period length and starting year. The monthly series belong to the Monthly Temperature dataset of mainland Spain (MOTEDAS), whose gridded format contains 5236 pixels of monthly series (10x10 km). The threshold used in the spatial analyses requires 20% of the land to show a significant trend (p<0.05). The most striking results are as follows: • Seasonal trend analyses of autumn Tmax show no significance for any temporal window. Trends in Tmin are significant over more than 20% of the land up to the 1974-2010 window. The area affected in Tmin progressively increases from SE to NW. • Monthly trend analyses do not detect any significance in Tmax, whereas for Tmin, particularly in October, an extended significant area is detected for temporal windows between 1951-2010 and 1978-2010, clearly concentrated on starting years in the early 1970s. The spatial pattern of significantly affected areas runs from SE to NW for October and from south to north for September. In conclusion, autumn trend analyses of Tmax and Tmin over mainland Spain only detect significant trends in Tmin, mostly for windows starting in the 1970s and extending from the SE to the central part of the study area.
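The moving-window trend test applied to each grid cell can be sketched as follows: the end year is held fixed while the start year advances, and a linear trend with its p-value is computed for every window (an ordinary least-squares trend is assumed here; the poster does not state the exact trend test used).

```python
import numpy as np
from scipy import stats

def moving_window_trends(years, tmin, end_year=2010, min_length=20):
    """Linear Tmin trends for windows with a varying start year and a fixed end year.

    Sketch of the moving-window approach: for each starting year the trend up
    to end_year is fitted and its p-value recorded, so the influence of period
    length and starting year can be inspected (p < 0.05 taken as significant).
    """
    years = np.asarray(years)
    tmin = np.asarray(tmin, float)
    results = []
    for y0 in years:
        if end_year - y0 + 1 < min_length:
            break
        sel = (years >= y0) & (years <= end_year)
        fit = stats.linregress(years[sel], tmin[sel])
        results.append((int(y0), end_year, fit.slope, fit.pvalue, fit.pvalue < 0.05))
    return results
```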
Stine-Morrow, Elizabeth A. L.; Noh, Soo Rim; Shake, Matthew C.
2009-01-01
This research examined age differences in the accommodation of reading strategies as a consequence of explicit instruction in conceptual integration. In Experiment 1, young, middle-aged, and older adults read sentences for delayed recall using a moving window method. Readers in an experimental group received instruction in making conceptual links during reading while readers in a control group were simply encouraged to allocate effort. Regression analysis to decompose word-by-word reading times in each condition isolated the time allocated to conceptual processing at the point in the text at which new concepts were introduced, as well as at clause and sentence boundaries. While younger adults responded to instructions by differentially allocating effort to sentence wrap-up, older adults allocated effort to intrasentence wrap-up and to new concepts as they were introduced, suggesting that older readers optimized their allocation of effort to linguistic computations for textbase construction within their processing capacity. Experiment 2 verified that conceptual integration training improved immediate recall among older readers as a consequence of engendering allocation to conceptual processing. PMID:19941199
Osmanski, Bruno-Félix; Pezet, Sophie; Ricobaraza, Ana; Lenkei, Zsolt; Tanter, Mickael
2014-01-01
Long-range coherences in spontaneous brain activity reflect functional connectivity. Here we propose a novel, highly resolved connectivity mapping approach, using ultrafast functional ultrasound (fUS), which enables imaging of cerebral microvascular haemodynamics deep in the anaesthetized rodent brain, through a large thinned-skull cranial window, with pixel dimensions of 100 μm × 100 μm in-plane. The millisecond-range temporal resolution allows unambiguous cancellation of low-frequency cardio-respiratory noise. Both seed-based and singular value decomposition analysis of spatial coherences in the low-frequency (<0.1 Hz) spontaneous fUS signal fluctuations reproducibly report, at different coronal planes, overlapping high-contrast, intrinsic functional connectivity patterns. These patterns are similar to major functional networks described in humans by resting-state fMRI, such as the lateral task-dependent network putatively anticorrelated with the midline default-mode network. These results introduce fUS as a powerful novel neuroimaging method, which could be extended to portable systems for three-dimensional functional connectivity imaging in awake and freely moving rodents. PMID:25277668
Random domain name and address mutation (RDAM) for thwarting reconnaissance attacks
Chen, Xi; Zhu, Yuefei
2017-01-01
Network address shuffling is a novel moving target defense (MTD) that invalidates the address information collected by the attacker by dynamically changing or remapping the host’s network addresses. However, most network address shuffling methods are constrained by the limited address space and rely on the host’s static domain name to map to its dynamic address; therefore, these methods cannot effectively defend against random scanning attacks, nor against an attacker who knows the target’s domain name. In this paper, we propose a network defense method based on random domain name and address mutation (RDAM), which increases the scanning space of the attacker through a dynamic domain name method and reduces the probability that a host will be hit by an attacker scanning IP addresses, using the domain name system (DNS) query list and time window methods. Theoretical analysis and experimental results show that RDAM can defend against scanning attacks and worm propagation more effectively than general network address shuffling methods, while introducing an acceptable operational overhead. PMID:28489910
Vehicle tracking using fuzzy-based vehicle detection window with adaptive parameters
NASA Astrophysics Data System (ADS)
Chitsobhuk, Orachat; Kasemsiri, Watjanapong; Glomglome, Sorayut; Lapamonpinyo, Pipatphon
2018-04-01
In this paper, a fuzzy-based vehicle tracking system is proposed. The proposed system consists of two main processes: vehicle detection and vehicle tracking. In the first process, the Gradient-based Adaptive Threshold Estimation (GATE) algorithm is adopted to provide a suitable threshold value for Sobel edge detection. The estimated threshold can adapt to the changes of diverse illumination conditions throughout the day. This leads to greater vehicle detection performance compared to a fixed, user-defined threshold. In the second process, this paper proposes a novel vehicle tracking algorithm, namely Fuzzy-based Vehicle Analysis (FBA), in order to reduce the false estimations in vehicle tracking caused by uneven edges of large vehicles and by vehicles changing lanes. The proposed FBA algorithm employs the average edge density and the Horizontal Moving Edge Detection (HMED) algorithm to alleviate those problems, adopting fuzzy rule-based algorithms to rectify the vehicle tracking. The experimental results demonstrate that the proposed system provides high vehicle detection accuracy of about 98.22%, while offering a low false detection rate of about 3.92%.
Sojoudi, Alireza; Goodyear, Bradley G
2016-12-01
Spontaneous fluctuations of blood-oxygenation level-dependent functional magnetic resonance imaging (BOLD fMRI) signals are highly synchronous between brain regions that serve similar functions. This provides a means to investigate functional networks; however, most analysis techniques assume functional connections are constant over time. This may be problematic in the case of neurological disease, where functional connections may be highly variable. Recently, several methods have been proposed to determine moment-to-moment changes in the strength of functional connections over an imaging session (so-called dynamic connectivity). Here a novel analysis framework based on a hierarchical observation modeling approach was proposed, to permit statistical inference of the presence of dynamic connectivity. A two-level linear model composed of overlapping sliding windows of fMRI signals, incorporating the fact that overlapping windows are not independent, was described. To test this approach, datasets were synthesized whereby functional connectivity was either constant (significant or insignificant) or modulated by an external input. The method successfully determines the statistical significance of a functional connection in phase with the modulation, and it exhibits greater sensitivity and specificity in detecting regions with variable connectivity, when compared with sliding-window correlation analysis. For real data, this technique possesses greater reproducibility and provides a more discriminative estimate of dynamic connectivity than sliding-window correlation analysis. Hum Brain Mapp 37:4566-4580, 2016. © 2016 Wiley Periodicals, Inc.
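For contrast with the hierarchical model described above, a minimal sketch of the baseline sliding-window correlation analysis that the authors compare against; the window length, step, and synthetic signals are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def sliding_window_correlation(x, y, win_len=30, step=1):
    """Pearson correlation between two BOLD time series in overlapping windows.

    x, y : 1-D arrays of equal length (time points).
    Returns (window start indices, correlation per window).
    """
    starts = np.arange(0, len(x) - win_len + 1, step)
    r = np.empty(len(starts))
    for i, s in enumerate(starts):
        xw, yw = x[s:s + win_len], y[s:s + win_len]
        r[i] = np.corrcoef(xw, yw)[0, 1]
    return starts, r

# Illustrative use on synthetic signals whose connectivity drops halfway through.
rng = np.random.default_rng(1)
t = 300
shared = rng.normal(size=t)
x = shared + 0.5 * rng.normal(size=t)
y = np.where(np.arange(t) < t // 2, shared, rng.normal(size=t)) + 0.5 * rng.normal(size=t)
starts, r = sliding_window_correlation(x, y)
```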
NASA Astrophysics Data System (ADS)
Sitohang, Yosep Oktavianus; Darmawan, Gumgum
2017-08-01
This research attempts to compare two forecasting models in time series analysis for predicting the sales volume of motorcycles in Indonesia. The first forecasting model used in this paper is the Autoregressive Fractionally Integrated Moving Average (ARFIMA). ARFIMA can handle non-stationary data and has better forecasting accuracy than ARIMA on long-memory data. This is because the fractional difference parameter can explain correlation structure in data that has short memory, long memory, or even both structures simultaneously. The second forecasting model is Singular Spectrum Analysis (SSA). The advantage of this technique is that it is able to decompose time series data into the classic components, i.e. trend, cyclical, seasonal and noise components. This makes the forecasting accuracy of the technique significantly better. Furthermore, SSA is a model-free technique, so it is likely to have a very wide range of application. Selection of the best model is based on the lowest MAPE value. Based on the calculation, the best ARFIMA model is ARFIMA(3, d = 0.63, 0) with a MAPE value of 22.95 percent. SSA, with a window length of 53 and 4 groups of reconstructed data, results in a MAPE value of 13.57 percent. Based on these results it is concluded that SSA produces better forecasting accuracy.
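A short sketch of the MAPE criterion used above to select between the two models; the hold-out forecast arrays and sales figures are placeholders, and the ARFIMA order and SSA window settings from the abstract appear only as context, not as reproduced results.

```python
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error, in percent (assumes no zero actuals)."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

# Illustrative comparison with placeholder hold-out forecasts from the two models.
actual = np.array([520.0, 540.0, 610.0, 580.0, 600.0, 650.0])
forecast_arfima = np.array([430.0, 600.0, 700.0, 500.0, 660.0, 540.0])
forecast_ssa = np.array([500.0, 560.0, 590.0, 600.0, 580.0, 640.0])
scores = {"ARFIMA": mape(actual, forecast_arfima), "SSA": mape(actual, forecast_ssa)}
best_model = min(scores, key=scores.get)   # model with the lowest MAPE is selected
```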
Hb Koln [β98(FG5) GTG → ATG, Val → Met]: the first report from India.
Warang, Prashant; Nair, Sona; Nadkarni, Anita; Kedar, Prabhakar; Bhave, Abhay; Ghosh, Kanjaksha; Colah, Roshan
2014-06-01
The group of unstable hemoglobins is associated with congenital non-spherocytic hemolytic anemia due to instability of the hemoglobin molecule. They often lead to formation of the characteristic inclusion bodies or Heinz bodies. To identify the cause of mild anemia, reticulocytosis, and hepatosplenomegaly in a case of non-spherocytic hemolytic anemia. A 34-year-old female patient originating from Maharashtra, western India presented with mild anemia and jaundice which had persisted since childhood. Investigations included a complete blood count, screening for red cell membrane protein defects, Hb analysis by high-performance liquid chromatography (HPLC) and cellulose acetate electrophoresis (pH 8.9), heat instability test and DNA sequencing. Hemoglobin analysis by HPLC showed an abnormal peak in the Hb C window (9.8%) with a retention time of 4.90 minutes. Cellulose acetate electrophoresis (pH 8.9) showed a slow-moving band (6.15%) between Hb A2 and Hb S. The heat instability test was positive. DNA analysis of α globin genes showed absence of both deletional and non-deletional α thalassemia. DNA sequencing of the β globin gene revealed heterozygosity for a mutation at codon 98 [GTG → ATG, Val → Met], which gives rise to Hb Koln. Hb Koln is the commonest unstable Hb variant reported from many populations in the world. However, this is the first report of this unstable Hb variant from India.
Long, Chloe V; Flint, James A; Lepper, Paul A
2010-10-01
Bat mortality resulting from actual or near-collision with operational wind turbine rotors is a phenomenon that is widespread but not well understood. Because bats rely on information contained in high-frequency echoes to determine the nature and movement of a target, it is important to consider how ultrasonic pulses similar to those used by bats for echolocation may be interacting with operational turbine rotor blades. By assessing the characteristics of reflected ultrasonic echoes, moving turbine blades operating under low wind speed conditions (<6 m s⁻¹) were found to produce distinct Doppler shift profiles at different angles to the rotor. Frequency shifts of up to ±700-800 Hz were produced, which may not be perceptible by some bat species. Monte Carlo simulation of bat-like sampling by echolocation revealed that over 50 rotor echoes could be required by species such as Pipistrellus pipistrellus for accurate interpretation of blade movement, which may not be achieved in the bat's approach time-window. In summary, it was found that echoes returned from moving blades had features which could render them attractive to bats or which might make it difficult for the bat to accurately detect and locate blades in sufficient time to avoid a collision.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Puerari, Ivânio; Elmegreen, Bruce G.; Block, David L., E-mail: puerari@inaoep.mx
2014-12-01
We examine 8 μm IRAC images of the grand design two-arm spiral galaxies M81 and M51 using a new method whereby pitch angles are locally determined as a function of scale and position, in contrast to traditional Fourier transform spectral analyses which fit to average pitch angles for whole galaxies. The new analysis is based on a correlation between pieces of a galaxy in circular windows of (lnR,θ) space and logarithmic spirals with various pitch angles. The diameter of the windows is varied to study different scales. The result is a best-fit pitch angle to the spiral structure as a function of position and scale, or a distribution function of pitch angles as a function of scale for a given galactic region or area. We apply the method to determine the distribution of pitch angles in the arm and interarm regions of these two galaxies. In the arms, the method reproduces the known pitch angles for the main spirals on a large scale, but also shows higher pitch angles on smaller scales resulting from dust feathers. For the interarms, there is a broad distribution of pitch angles representing the continuation and evolution of the spiral arm feathers as the flow moves into the interarm regions. Our method shows a multiplicity of spiral structures on different scales, as expected from gas flow processes in a gravitating, turbulent and shearing interstellar medium. We also present results for M81 using classical 1D and 2D Fourier transforms, together with a new correlation method, which shows good agreement with conventional 2D Fourier transforms.
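A schematic sketch of the local pitch-angle estimation idea described above: intensity in a small window of (ln R, θ) space is correlated against logarithmic-spiral templates of varying pitch. The m = 2 symmetry, window extent, pitch grid, and complex-exponential template form are illustrative assumptions, not the authors' exact implementation, and the recovered pitch on the synthetic example is only approximate.

```python
import numpy as np

def local_pitch_angle(window_u, window_theta, window_intensity, m=2,
                      pitch_grid_deg=np.arange(5, 61, 1)):
    """Best-fit pitch angle (deg) for one (ln R, theta) window by template correlation.

    window_u, window_theta : 1-D arrays of ln(R) and azimuth (rad) for pixels in the window.
    window_intensity       : matching 1-D array of image intensities.
    """
    img = window_intensity - window_intensity.mean()
    best_pitch, best_corr = None, -np.inf
    for p_deg in pitch_grid_deg:
        slope = 1.0 / np.tan(np.radians(p_deg))      # d(theta)/d(lnR) along the spiral
        # Complex template keeps sensitivity regardless of the spiral's unknown phase.
        template = np.exp(1j * m * (window_theta - slope * window_u))
        corr = np.abs(np.sum(img * template)) / (np.linalg.norm(img) * np.sqrt(img.size) + 1e-12)
        if corr > best_corr:
            best_pitch, best_corr = p_deg, corr
    return best_pitch, best_corr

# Illustrative use: synthetic window containing a two-armed spiral pattern with a
# 25 deg pitch; the recovered pitch should be close to (though not exactly) 25 deg.
rng = np.random.default_rng(2)
u = rng.uniform(0.0, 1.5, 4000)
theta = rng.uniform(0.0, 2 * np.pi, 4000)
intensity = np.cos(2 * (theta - u / np.tan(np.radians(25.0)))) + 0.2 * rng.normal(size=4000)
pitch, strength = local_pitch_angle(u, theta, intensity)
```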
El-Jardali, Fadi; Ataya, Nour; Jamal, Diana; Jaafar, Maha
2012-05-06
Limited work has been done to promote knowledge translation (KT) in the Eastern Mediterranean Region (EMR). The objectives of this study are to: 1) assess the climate for evidence use in policy; 2) explore views and practices about current processes and weaknesses of health policymaking; 3) identify priorities including short-term requirements for policy briefs; and 4) identify country-specific requirements for establishing KT platforms. Senior policymakers, stakeholders and researchers from Algeria, Bahrain, Egypt, Iran, Jordan, Lebanon, Oman, Sudan, Syria, Tunisia, and Yemen participated in this study. Questionnaires were used to assess the climate for use of evidence and identify windows of opportunity and requirements for policy briefs and for establishing KT platforms. Current processes and weaknesses of policymaking were appraised using case study scenarios. Closed-ended questions were analyzed descriptively. Qualitative data was analyzed using thematic analysis. KT activities were not frequently undertaken by policymakers and researchers in EMR countries, research evidence about high priority policy issues was rarely made available, interaction between policymakers and researchers was limited, and policymakers rarely identified or created places for utilizing research evidence in decision-making processes. Findings emphasized the complexity of policymaking. Donors, political regimes, economic goals and outdated laws were identified as key drivers. Lack of policymakers' abilities to think strategically, constant need to make quick decisions, limited financial resources, and lack of competent and trained human resources were suggested as main weaknesses. Despite the complexity of policymaking processes in countries from this region, the absence of a structured process for decision making, and the limited engagement of policymakers and researchers in KT activities, there are windows of opportunity for moving towards more evidence informed policymaking.
Fransz, Duncan P; Huurnink, Arnold; de Boode, Vosse A; Kingma, Idsart; van Dieën, Jaap H
2016-10-01
The single leg drop jump landing test may assess dynamic and static balance abilities in different phases of the landing. However, objective definitions of the different phases following landing and their associated reliability are lacking. Therefore, we determined the existence of possible distinct phases of single leg drop jump landing on a force plate in 82 elite youth soccer players. Three outcome measures were calculated over moving windows of five sizes: center of pressure (COP) speed, COP sway and horizontal ground reaction force (GRF). Per outcome measure, a Factor Analysis was employed with all windows as input variables. It showed that four factors (patterns of variance) largely (>75%) explained the variance across subjects/trials along the 12s time series. Each factor was highly associated with a distinct phase of the time series signal: dynamic (0.4-2.7s), late dynamic (2.5-5.0s), static 1 (5.0-8.3s) and static 2 (8.1-11.7s). Intra-class correlations (ICC) between trials were lower for the dynamic phases (0.45-0.68) than for the static phases (0.60-0.86). The COP speed showed higher ICCs (0.63-0.86) than COP sway (0.45-0.61) and GRF (0.57-0.71) for all four phases. In conclusion, following a drop jump landing, unique information is available in four distinct phases. The COP speed is most reliable, with higher reliability in the static phases compared to the dynamic phases. Future studies should assess the sensitivity of information from dynamic, late dynamic and static phases. Copyright © 2016 Elsevier B.V. All rights reserved.
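A minimal sketch of the moving-window outcome measures named above (COP speed, COP sway and horizontal GRF), assuming 12 s of force-plate data; the sampling rate, window and step sizes, and the sway definition (mean resultant excursion about the window mean) are illustrative assumptions.

```python
import numpy as np

def moving_window_measures(cop_xy, grf_xy, fs=1000.0, win_s=0.4, step_s=0.1):
    """COP speed, COP sway and horizontal GRF magnitude over moving windows.

    cop_xy : (n, 2) centre-of-pressure coordinates in m.
    grf_xy : (n, 2) horizontal ground reaction force components in N.
    """
    win, step = int(win_s * fs), int(step_s * fs)
    speed_inst = np.linalg.norm(np.diff(cop_xy, axis=0), axis=1) * fs   # m/s per sample
    starts = np.arange(0, len(cop_xy) - win + 1, step)
    cop_speed, cop_sway, grf_mag = [], [], []
    for s in starts:
        seg = cop_xy[s:s + win]
        cop_speed.append(speed_inst[s:s + win - 1].mean())
        cop_sway.append(np.linalg.norm(seg - seg.mean(axis=0), axis=1).mean())
        grf_mag.append(np.linalg.norm(grf_xy[s:s + win], axis=1).mean())
    return starts / fs, np.array(cop_speed), np.array(cop_sway), np.array(grf_mag)

# Illustrative use on 12 s of synthetic landing data.
rng = np.random.default_rng(3)
n = 12000
cop = np.cumsum(rng.normal(scale=1e-4, size=(n, 2)), axis=0)
grf = rng.normal(scale=20.0, size=(n, 2))
t, speed, sway, force = moving_window_measures(cop, grf)
```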
Window encapsulation in car industry by using the 50 Ω RF technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernard, J.P.; Barboteau, M.; Collet, L.
Throughout the world the car industry has been using window encapsulation for a few years now. This technology is mainly used in production lines and is called RIM, for polyurethane reaction injection moulding. This technology, however, brings about some problems such as: glass breaking during mould closure, high production cost, and systematic rough edges. The PSA Group (Peugeot-Citroen), a pioneer in this field, in collaboration with SAIREM has launched a new innovative process for window encapsulation using the 50 Ω RF technology for gelling PVC Plastisol. The study was followed by an industrial prototype. Industrial equipment was then installed at WEBASTO HEULIEZ for window encapsulation of the sunshine roof for the Citroen Xantia. The authors describe the principle of window encapsulation and the different existing processes. They describe the 50 Ω RF technology, an industrial installation and the constraints of this technology in order to get maximum efficiency. In the conclusion they present a technical and economical analysis of the different solutions for window encapsulation. They also present the advantages of the 50 Ω RF technology and the new opportunities it offers.
Bispectral analysis: comparison of two windowing functions
NASA Astrophysics Data System (ADS)
Silvagni, D.; Djerroud, C.; Réveillé, T.; Gravier, E.
2018-02-01
Amongst all the normalized forms of the bispectrum, the bicoherence is shown to be a very useful diagnostic tool in experimental studies of nonlinear wave interactions in plasma, as it measures the fraction of wave power due to quadratic wave coupling in a self-excited fluctuation spectrum [1, 2]. In order to avoid spectral leakage, the application of a windowing function is needed during the bicoherence computation. Spectral leakage from statistically dependent components is of crucial importance in the discrimination between coupled and uncoupled modes, as it will introduce into the bicoherence spectrum phase-coupled modes which in reality do not exist. Therefore, the windowing function plays a key role in the bicoherence estimation. In this paper, two windowing methods are compared: the multiplication of the initial signal by the Hanning function and the subtraction of the straight line which links the two extremities of the signal. The influence of these two windowing methods on both the power spectrum and the bicoherence spectrum is shown. Although both methods give precise results, the Hanning function appears to be the more suitable window.
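A compact sketch of a segment-averaged bicoherence estimate using the Hanning-window approach discussed above; the segment length, overlap, and the test signal (two tones quadratically coupled into their sum frequency) are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def bicoherence(x, nperseg=256, noverlap=128, fs=1.0):
    """Segment-averaged bicoherence b^2(f1, f2) using a Hanning window."""
    win = np.hanning(nperseg)
    step = nperseg - noverlap
    nf = nperseg // 2 + 1
    num = np.zeros((nf, nf), complex)
    d1 = np.zeros((nf, nf))
    d2 = np.zeros((nf, nf))
    for s in range(0, len(x) - nperseg + 1, step):
        X = np.fft.rfft(x[s:s + nperseg] * win)
        for i in range(nf):
            jmax = nf - i                      # keep f1 + f2 within the resolved band
            j = np.arange(jmax)
            triple = X[i] * X[j] * np.conj(X[i + j])   # X(f1) X(f2) X*(f1+f2)
            num[i, :jmax] += triple
            d1[i, :jmax] += np.abs(X[i] * X[j]) ** 2
            d2[i, :jmax] += np.abs(X[i + j]) ** 2
    b2 = np.abs(num) ** 2 / (d1 * d2 + 1e-20)
    freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
    return freqs, b2

# Illustrative use: components at 0.06 and 0.11 cycles/sample quadratically coupled
# into 0.17, which should produce a bicoherence peak near (0.06, 0.11).
rng = np.random.default_rng(4)
t = np.arange(2 ** 14)
p1, p2 = rng.uniform(0, 2 * np.pi, 2)
sig = (np.cos(2 * np.pi * 0.06 * t + p1) + np.cos(2 * np.pi * 0.11 * t + p2)
       + 0.5 * np.cos(2 * np.pi * 0.17 * t + p1 + p2) + 0.2 * rng.normal(size=t.size))
freqs, b2 = bicoherence(sig)
```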
Sound isolation performance of interior acoustical sash
NASA Astrophysics Data System (ADS)
Tocci, Gregory
2002-05-01
In existing, as well as new buildings, an interior light of glass mounted on the inside of a prime window is used to improve the sound transmission loss otherwise obtained by the prime window alone. Interior acoustical sash is most often 1/4 in. (6 mm) monolithic or laminated glass, and is typically spaced 3 in. to 6 in. from the glass of the prime window. This paper presents TL data measured at Riverbank Acoustical Laboratories by Solutia (formerly Monsanto) for lightweight prime windows of various types, with and without interior acoustical sash glazed with 1/4 in. laminated glass. The TL data are used to estimate the A-weighted insertion loss of interior acoustical sash when applied to prime windows glazed with lightweight glass for four transportation noise source types: highway traffic, aircraft, electric rail, and diesel rail. The analysis also has been extended to determine the insertion loss expressed as a change in OITC. The data also exhibit the reductions in insertion loss that can result from short-circuiting the interior acoustical sash with the prime window. [Work supported by Solutia, Inc.]
Looking through the postdisaster policy window
NASA Astrophysics Data System (ADS)
Solecki, William D.; Michaels, Sarah
1994-07-01
Policy windows are transitory opportunities during which the likelihood of adopting new policy or legislative proposals is greater than usual. Accepted wisdom has held that natural disasters serve as focusing events that generate policy windows in their wake. This paper highlights the need for a more circumscribed understanding of when and where policy windows occur based on the experiences of three US regional planning organizations: a hand-picked commission of community leaders, a council of governments, and a special-purpose substate organization. The first operated in the San Francisco Bay Area of California following the Loma Prieta earthquake (October 1989), and the other two in South Carolina's Atlantic coastal plain after Hurricane Hugo (September 1989). The analysis concludes that natural disasters did not transform the agenda or mission of these entities. Policy windows were neither automatic outcomes of focusing events nor did they ensure the adoption of pertinent policy within the organizations investigated. Several conditions are minimally necessary for using policy windows to bring about hazard mitigation: comprehensive institutional conceptualization of hazards management, institutional strength and flexibility, and well-placed, effective policy entrepreneurs.
Fragomeni, Breno de Oliveira; Misztal, Ignacy; Lourenco, Daniela Lino; Aguilar, Ignacio; Okimoto, Ronald; Muir, William M
2014-01-01
The purpose of this study was to determine if the set of genomic regions inferred as accounting for the majority of genetic variation in quantitative traits remains stable over multiple generations of selection. The data set contained phenotypes for five generations of broiler chickens for body weight, breast meat, and leg score. The population consisted of 294,632 animals over five generations and also included genotypes of 41,036 single nucleotide polymorphisms (SNP) for 4,866 animals, after quality control. The SNP effects were calculated by a GWAS-type analysis using a single-step genomic BLUP approach for generations 1-3, 2-4, 3-5, and 1-5. Variances were calculated for windows of 20 SNP. The top ten windows for each trait that explained the largest fraction of the genetic variance across generations were examined. Across generations, the top 10 windows explained more than 0.5% but less than 1% of the total variance. Also, the pattern of the windows was not consistent across generations. The windows that explained the greatest variance changed greatly among the combinations of generations, with a few exceptions. In many cases, a window identified as top for one combination explained less than 0.1% for the other combinations. We conclude that identification of top SNP windows for a population may have little predictive power for genetic selection in the following generations for the traits here evaluated.
Oczeretko, Edward; Swiatecka, Jolanta; Kitlas, Agnieszka; Laudanski, Tadeusz; Pierzynski, Piotr
2006-01-01
In physiological research, we often study multivariate data sets containing two or more simultaneously recorded time series. The aim of this paper is to present the cross-correlation and the wavelet cross-correlation methods to assess synchronization between contractions in different topographic regions of the uterus. From a medical point of view, it is important to identify time delays between contractions, which may be of potential diagnostic significance in various pathologies. The cross-correlation was computed in a moving window with a width corresponding to approximately two or three contractions. As a result, the running cross-correlation function was obtained. The propagation% parameter assessed from this function allows a quantitative description of synchronization in bivariate time series. In general, the uterine contraction signals are very complicated. Wavelet transforms provide insight into the structure of the time series at various frequencies (scales). To show the changes of the propagation% parameter along scales, a wavelet running cross-correlation was used. First, the continuous wavelet transforms of the uterine contraction signals were computed, and afterwards a running cross-correlation analysis was conducted for each pair of transformed time series. The findings show that running functions are very useful in the analysis of uterine contractions.
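A minimal sketch of a running cross-correlation with lag (time-delay) estimation between two simultaneously recorded contraction signals, in the spirit of the analysis above; the window width, sampling rate, maximum lag, and the approximately normalized correlation (full-window energies in the denominator) are illustrative assumptions.

```python
import numpy as np

def running_xcorr_delay(x, y, fs=1.0, win_s=120.0, step_s=10.0, max_lag_s=20.0):
    """For each moving window, return the lag (s) maximising the cross-correlation."""
    win, step, max_lag = int(win_s * fs), int(step_s * fs), int(max_lag_s * fs)
    lags = np.arange(-max_lag, max_lag + 1)
    starts = np.arange(0, len(x) - win + 1, step)
    delays, peak_r = [], []
    for s in starts:
        xw = x[s:s + win] - x[s:s + win].mean()
        yw = y[s:s + win] - y[s:s + win].mean()
        denom = np.sqrt(np.sum(xw ** 2) * np.sum(yw ** 2)) + 1e-12
        r = np.array([np.sum(xw[max(0, -l):win - max(0, l)] *
                             yw[max(0, l):win - max(0, -l)]) for l in lags]) / denom
        k = np.argmax(r)
        delays.append(lags[k] / fs)
        peak_r.append(r[k])
    return starts / fs, np.array(delays), np.array(peak_r)

# Illustrative use: y is a smoothed signal delayed by 5 s relative to x.
rng = np.random.default_rng(5)
fs = 4.0
x = np.convolve(rng.normal(size=4000), np.ones(40) / 40, mode="same")
y = np.roll(x, int(5 * fs)) + 0.05 * rng.normal(size=x.size)
t, delay, r = running_xcorr_delay(x, y, fs=fs)
```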
A Unified Air-Sea Visualization System: Survey on Gridding Structures
NASA Technical Reports Server (NTRS)
Anand, Harsh; Moorhead, Robert
1995-01-01
The goal is to develop a Unified Air-Sea Visualization System (UASVS) to enable the rapid fusion of observational, archival, and model data for verification and analysis. To design and develop UASVS, modelers were polled to determine the gridding structures and visualization systems used, and their needs with respect to visual analysis. A basic UASVS requirement is to allow a modeler to explore multiple data sets within a single environment, or to interpolate multiple datasets onto one unified grid. From this survey, the UASVS should be able to visualize 3D scalar/vector fields; render isosurfaces; visualize arbitrary slices of the 3D data; visualize data defined on spectral element grids with the minimum number of interpolation stages; render contours; produce 3D vector plots and streamlines; provide unified visualization of satellite images, observations and model output overlays; display the visualization on a projection of the user's choice; implement functions so the user can derive diagnostic values; animate the data to see the time-evolution; animate ocean and atmosphere at different rates; store the record of cursor movement, smooth the path, and animate a window around the moving path; repeatedly start and stop the visual time-stepping; generate VHS tape animations; work on a variety of workstations; and allow visualization across clusters of workstations and scalable high performance computer systems.
Hou, Lei; Wu, Peiyi
2016-06-21
Turbidity, DLS and FTIR measurements in combination with the perturbation correlation moving window (PCMW) technique and 2D correlation spectroscopy (2Dcos) analysis have been utilized to investigate the LCST-type transition of an oligo(ethylene glycol) acrylate-based copolymer (POEGA) in aqueous solutions in this work. As demonstrated in turbidity and DLS curves, the macroscopic phase separation was sharp and slightly concentration dependent. Moreover, individual chemical groups along polymer chains also display abrupt changes in temperature-variable IR spectra. However, according to conventional IR analysis, the C-H groups present obvious dehydration, whereas C=O and C-O-C groups exhibit anomalous "forced hydration" during the steep phase transition. From these analyses together with the PCMW and 2Dcos results, it has been confirmed that the hydrophobic interaction among polymer chains drove the chain collapse and dominated the phase transition. In addition, the unexpected enhanced hydration behavior of C=O and C-O-C groups was induced by forced hydrogen bonding between polar groups along polymer chains and entrapped water molecules in the aggregates, which originated from the special chemical structure of POEGA.
Hutka, Stefanie; Bidelman, Gavin M.; Moreno, Sylvain
2013-01-01
There is convincing empirical evidence for bidirectional transfer between music and language, such that experience in either domain can improve mental processes required by the other. This music-language relationship has been studied using linear models (e.g., comparing mean neural activity) that conceptualize brain activity as a static entity. The linear approach limits how we can understand the brain’s processing of music and language because the brain is a nonlinear system. Furthermore, there is evidence that the networks supporting music and language processing interact in a nonlinear manner. We therefore posit that the neural processing and transfer between the domains of language and music are best viewed through the lens of a nonlinear framework. Nonlinear analysis of neurophysiological activity may yield new insight into the commonalities, differences, and bidirectionality between these two cognitive domains not measurable in the local output of a cortical patch. We thus propose a novel application of brain signal variability (BSV) analysis, based on mutual information and signal entropy, to better understand the bidirectionality of music-to-language transfer in the context of a nonlinear framework. This approach will extend current methods by offering a nuanced, network-level understanding of the brain complexity involved in music-language transfer. PMID:24454295
NASA Astrophysics Data System (ADS)
Yao, Yuchen; Bao, Jie; Skyllas-Kazacos, Maria; Welch, Barry J.; Akhmetov, Sergey
2018-04-01
Individual anode current signals in aluminum reduction cells provide localized cell conditions in the vicinity of each anode, which contain more information than the conventionally measured cell voltage and line current. One common use of this measurement is to identify process faults that can cause significant changes in the anode current signals. While this method is simple and direct, it ignores the interactions between anode currents and other important process variables. This paper presents an approach that applies multivariate statistical analysis techniques to individual anode currents and other process operating data, for the detection and diagnosis of local process abnormalities in aluminum reduction cells. Specifically, since the Hall-Héroult process is time-varying with its process variables dynamically and nonlinearly correlated, dynamic kernel principal component analysis with moving windows is used. The cell is discretized into a number of subsystems, with each subsystem representing one anode and cell conditions in its vicinity. The fault associated with each subsystem is identified based on multivariate statistical control charts. The results show that the proposed approach is able to not only effectively pinpoint the problematic areas in the cell, but also assess the effect of the fault on different parts of the cell.
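A simplified sketch of kernel-PCA-based monitoring over a moving window of process data, in the spirit of the approach above; the RBF kernel width, window length, number of components, and the T²-style statistic are illustrative assumptions, and the sketch omits the dynamic (lagged-variable) augmentation and per-subsystem discretization described in the paper.

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kpca_t2(window, x_new, n_comp=5, gamma=0.5):
    """Hotelling-style T^2 for a new sample against a kernel PCA model of the window."""
    n = len(window)
    K = rbf_kernel(window, window, gamma)
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one           # centre the kernel matrix
    lam, alpha = np.linalg.eigh(Kc)
    idx = np.argsort(lam)[::-1][:n_comp]
    lam, alpha = lam[idx], alpha[:, idx]
    alpha = alpha / np.sqrt(np.maximum(lam, 1e-12))      # unit-norm feature-space eigenvectors
    k = rbf_kernel(x_new[None, :], window, gamma)[0]
    kc = k - k.mean() - K.mean(axis=0) + K.mean()        # centre the test kernel vector
    scores = kc @ alpha                                  # projection on kernel PCs
    return np.sum(scores ** 2 / (lam / n))               # score variances are lam / n

# Illustrative use: monitor each new sample against the preceding moving window.
rng = np.random.default_rng(6)
data = rng.normal(size=(300, 8))
data[200:] += 3.0                                        # simulated local abnormality
win_len, t2 = 100, []
for t in range(win_len, len(data)):
    t2.append(kpca_t2(data[t - win_len:t], data[t]))
```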
Field Evaluation of Highly Insulating Windows in the Lab Homes: Winter Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parker, Graham B.; Widder, Sarah H.; Bauman, Nathan N.
2012-06-01
This field evaluation of highly insulating windows was undertaken in a matched pair of 'Lab Homes' located on the Pacific Northwest National Laboratory (PNNL) campus during the 2012 winter heating season. Improving the insulation and solar heat gain characteristics of a home's windows has the potential to significantly improve the home's building envelope and overall thermal performance by reducing heat loss (in the winter), and cooling loss and solar heat gain (in the summer) through the windows. A high quality installation and/or window retrofit will also minimize or reduce air leakage through the window cavity and thus also contribute to reduced heat loss in the winter and cooling loss in the summer. These improvements all contribute to decreasing overall annual home energy use. Occupant comfort (non-quantifiable) can also be increased by minimizing or eliminating the cold 'draft' (temperature) many residents experience at or near window surfaces that are at a noticeably lower temperature than the room air temperature. Lastly, although not measured in this experiment, highly insulating windows (triple-pane in this experiment) also have the potential to significantly reduce the noise transmittance through windows compared to standard double-pane windows. The metered data taken in the Lab Homes and data analysis presented here represent 70 days of data taken during the 2012 heating season. As such, the savings from highly insulating windows in the experimental home (Lab Home B) compared to the standard double-pane clear glass windows in the baseline home (Lab Home A) are only a portion of the energy savings expected from a year-long experiment that would include a cooling season. The cooling season experiment will take place in the homes in the summer of 2012, and results of that experiment will be reported in a subsequent report available to all stakeholders.
Rouze, Ned C; Deng, Yufeng; Palmeri, Mark L; Nightingale, Kathryn R
2017-10-01
Recent measurements of shear wave propagation in viscoelastic materials have been analyzed by constructing the 2-D Fourier transform (2DFT) of the shear wave signal and measuring the phase velocity c(ω) and attenuation α(ω) from the peak location and full width at half-maximum (FWHM) of the 2DFT signal at discrete frequencies. However, when the shear wave is observed over a finite spatial range, the 2DFT signal is a convolution of the true signal and the observation window, and measurements using the FWHM can yield biased results. In this study, we describe a method to account for the size of the spatial observation window using a model of the 2DFT signal and a non-linear, least-squares fitting procedure to determine c(ω) and α(ω). Results from the analysis of finite-element simulation data agree with c(ω) and α(ω) calculated from the material parameters used in the simulation. Results obtained in a viscoelastic phantom indicate that the measured attenuation is independent of the observation window and agree with measurements of c(ω) and α(ω) obtained using the previously described progressive phase and exponential decay analysis. Copyright © 2017 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
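A brief sketch of the basic 2DFT analysis referenced above: the space-time shear wave signal is Fourier transformed, and at each temporal frequency the wavenumber of the spectral peak gives the phase velocity. The sampling settings, synthetic elastic wave, and simple peak picking (without the authors' window-size model and least-squares fit) are illustrative simplifications.

```python
import numpy as np

def twodft_phase_velocity(v, dx, dt, rel_threshold=0.1):
    """Phase velocity c(f) from the peak of the 2-D Fourier transform of v(x, t).

    v  : array of shape (n_x, n_t), particle motion versus lateral position and time.
    dx : spatial sample spacing (m); dt : temporal sample spacing (s).
    """
    spec = np.abs(np.fft.fftshift(np.fft.fft2(v)))
    k = np.fft.fftshift(np.fft.fftfreq(v.shape[0], d=dx))   # spatial frequency (1/m)
    f = np.fft.fftshift(np.fft.fftfreq(v.shape[1], d=dt))   # temporal frequency (Hz)
    c = {}
    for j, fj in enumerate(f):
        if fj <= 0:
            continue
        i = np.argmax(spec[:, j])                            # peak along the wavenumber axis
        if abs(k[i]) > 0 and spec[i, j] > rel_threshold * spec.max():
            c[fj] = fj / abs(k[i])                           # propagation speed = f / |k|
    return c

# Illustrative use: an elastic plane wave travelling at 2 m/s, sampled so that the
# 200 Hz component and its wavenumber fall on exact FFT bins.
dx, dt, c_true = 1e-3, 5e-4, 2.0
x = np.arange(100) * dx
t = np.arange(400) * dt
v = np.sin(2 * np.pi * 200.0 * (t[None, :] - x[:, None] / c_true))
c_of_f = twodft_phase_velocity(v, dx, dt)                    # {200.0: ~2.0}
```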
The influence of opening windows and doors on the natural ventilation rate of a residential building
An analysis of air exchange rates due to intentional window and door openings in a research test house located in a residential environment is presented. These data inform the development of ventilation rate control strategies as building envelopes are tightened to improve the e...
Windowed multitaper correlation analysis of multimodal brain monitoring parameters.
Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander
2015-01-01
Although multimodal monitoring sets the standard in daily practice of neurocritical care, problem-oriented analysis tools to interpret the huge amount of data are lacking. Recently a mathematical model was presented that simulates the cerebral perfusion and oxygen supply in case of a severe head trauma, predicting the appearance of distinct correlations between arterial blood pressure and intracranial pressure. In this study we present a set of mathematical tools that reliably detect the predicted correlations in data recorded at a neurocritical care unit. The time-resolved correlations are identified by a windowing technique combined with Fourier-based coherence calculations. The phasing of the data is detected by means of the Hilbert phase difference within the above-mentioned windows. A statistical testing method is introduced that allows tuning the parameters of the windowing method in such a way that a predefined accuracy is reached. With this method the data of fifteen patients were examined, and the predicted correlation was found in each patient. Additionally, it could be shown that the occurrence of a distinct correlation parameter, called scp, represents a high-quality predictive value for the patient's outcome.
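A minimal sketch of a windowed coherence-and-phase analysis between arterial blood pressure (ABP) and intracranial pressure (ICP), in the spirit of the tools described above; the window length, frequency band, and the use of scipy's Welch-type coherence and cross-spectrum in place of the authors' specific estimator and statistical test are illustrative assumptions.

```python
import numpy as np
from scipy.signal import coherence, csd

def windowed_coherence_phase(abp, icp, fs=100.0, win_s=300.0, band=(0.02, 0.07)):
    """Per-window band-averaged coherence and cross-spectral phase (radians)."""
    win = int(win_s * fs)
    results = []                                   # (window start time, coherence, phase)
    for s in range(0, len(abp) - win + 1, win):
        x, y = abp[s:s + win], icp[s:s + win]
        f, cxy = coherence(x, y, fs=fs, nperseg=win // 8)
        _, pxy = csd(x, y, fs=fs, nperseg=win // 8)
        sel = (f >= band[0]) & (f <= band[1])
        results.append((s / fs, cxy[sel].mean(), np.angle(pxy[sel].mean())))
    return results

# Illustrative use: 30 min of synthetic signals sharing a slow 0.05 Hz component,
# with the ICP component delayed by 2 s relative to ABP.
rng = np.random.default_rng(7)
fs = 100.0
t = np.arange(int(30 * 60 * fs)) / fs
slow = np.sin(2 * np.pi * 0.05 * t)
abp = 90 + 5 * slow + rng.normal(scale=2.0, size=t.size)
icp = 12 + 2 * np.roll(slow, int(2 * fs)) + rng.normal(scale=0.5, size=t.size)
windows = windowed_coherence_phase(abp, icp, fs=fs)
```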
1993-03-01
values themselves. The tools perform risk-adjusted present-value comparisons and compute the ROI using discount factors. The assessment of risk in a...developed X Window system, the de facto industry standard window system in the UNIX environment. An X-terminal's use is limited to display. It has no...2.1 IT HARDWARE The DOS-based PC used in this analysis costs $2,060. It includes an ASL 486DX-33 Industry Standard Architecture (ISA) computer with 8
Feature-Oriented Domain Analysis (FODA) Feasibility Study
1990-11-01
controlling the synchronous behavior of the task. A task may wait for one or more synchronizing or message queue events. Each task is designed using the...Comparative Study; 2.2.1. The Genesis System; 2.2.2. MCC Work; 2.2.2.1. The DESIRE Design Recovery Tool; 2.2.2.2. Domain Analysis Method...Illustration; Figure 6-1: Architectural Layers; Figure 6-2: Window Management Subsystem Design Structure; Figure 7-1: Function of a Window Manager
Processing Cones: A Computational Structure for Image Analysis.
1981-12-01
A computational structure for image analysis applications, referred to as a processing cone, is described and sample algorithms are presented. A fundamental characteristic of the structure is its hierarchical organization into two-dimensional arrays of decreasing resolution. In this architecture, a prototypical function is defined on a local window of data and applied uniformly to all windows in a parallel manner. Three basic modes of processing are supported in the cone: reduction operations (upward processing), horizontal operations (processing at a single level) and projection operations (downward processing).
Non-destructive scanning for applied stress by the continuous magnetic Barkhausen noise method
NASA Astrophysics Data System (ADS)
Franco Grijalba, Freddy A.; Padovese, L. R.
2018-01-01
This paper reports the use of a non-destructive continuous magnetic Barkhausen noise technique to detect applied stress on steel surfaces. The stress profile generated in a sample of 1070 steel subjected to a three-point bending test is analyzed. The influence of different parameters such as pickup coil type, scanner speed, applied magnetic field and frequency band analyzed on the effectiveness of the technique is investigated. A moving smoothing window based on a second-order statistical moment is used to analyze the time signal. The findings show that the technique can be used to detect applied stress profiles.
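A small sketch of a moving smoothing window based on a second-order statistical moment (here, a running variance of the Barkhausen noise time signal), as mentioned above; the window length, step, and synthetic amplitude-modulated noise are illustrative assumptions.

```python
import numpy as np

def moving_second_moment(signal, win=2048, step=256):
    """Running variance (second central moment) of a Barkhausen-noise-like time signal."""
    centres, moments = [], []
    for s in range(0, len(signal) - win + 1, step):
        seg = signal[s:s + win]
        centres.append(s + win // 2)
        moments.append(np.mean((seg - seg.mean()) ** 2))   # second central moment
    return np.array(centres), np.array(moments)

# Illustrative use: noise whose amplitude (hence variance) varies along the scan,
# mimicking a stress profile along the scanned surface.
rng = np.random.default_rng(8)
n = 100_000
envelope = 1.0 + 0.5 * np.sin(2 * np.pi * np.arange(n) / n)
mbn = envelope * rng.normal(size=n)
pos, m2 = moving_second_moment(mbn)
```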
An Embedded Laser Marking Controller Based on ARM and FPGA Processors
Dongyun, Wang; Xinpiao, Ye
2014-01-01
Laser marking is an important branch of laser information processing technology. Existing laser marking machines, based on a PC and the Windows operating system, are large and inconvenient to move, and they cannot work outdoors or in other harsh environments. In order to compensate for the above-mentioned disadvantages, this paper proposes an embedded laser marking controller based on ARM and FPGA processors. Based on the principle of laser galvanometer scanning marking, the hardware and software were designed for the application. Experiments showed that this new embedded laser marking controller controls the galvanometers synchronously and can achieve precise marking. PMID:24772028
A conserved quantity in thin body dynamics
NASA Astrophysics Data System (ADS)
Hanna, James; Pendar, Hodjat
We use an example from textile processing to illustrate the utility of a conserved quantity associated with metric symmetry in a thin body. This quantity, when combined with the usual linear and angular momentum currents, allows us to construct a four-parameter family of curves representing the equilibria of a rotating, flowing string. To achieve this, we introduce a non-material action of mixed Lagrangian-Eulerian type, applicable to fixed windows of axially-moving systems. We will point out intriguing similarities with Bernoulli's equation, discuss the effects of axial flow on rotating conservative systems, and make connections with 19th- and 20th-century results on the dynamics of cables.
2001-05-29
KODIAK ISLAND, Alaska -- A special platform connects the barge with a ramp to allow Castor 120, the first stage of the Athena 1 launch vehicle, to safely move onto the dock at Kodiak Island, Alaska, as preparations to launch Kodiak Star proceed. The first orbital launch to take place from Alaska's Kodiak Launch Complex, Kodiak Star is scheduled to lift off on a Lockheed Martin Athena I launch vehicle on Sept. 17 during a two-hour window that extends from 5:00 to 7:00 p.m. ADT. The payloads aboard include the Starshine 3, sponsored by NASA, and the PICOSat, PCSat and Sapphire, sponsored by the Department of Defense (DoD) Space Test Program.
The JPL Resource Allocation Planning and Scheduling Office (RAPSO) process
NASA Technical Reports Server (NTRS)
Morris, D. G.; Burke, E. S.
2002-01-01
The Jet Propulsion Laboratory's Resource Allocation Planning and Scheduling Office is chartered to divide the limited amount of tracking hours of the Deep Space Network amongst the various missions in as equitable an allotment as can be achieved. To best deal with this division of assets and time, an interactive process has evolved that promotes discussion, with agreement by consensus, between all of the customers that use the Deep Space Network (DSN). Aided by a suite of tools, the task of dividing asset time is then performed in three stages of granularity. Using this approach, DSN loads are either forecasted or scheduled throughout a moving 10-year window.
2005-12-01
KENNEDY SPACE CENTER, FLA. - The third stage, or upper stage for the New Horizons spacecraft, is moved toward the open door of NASA Kennedy Space Center’s Payload Hazardous Servicing Facility. The third stage is a Boeing STAR 48 solid-propellant kick motor. The Atlas V is the launch vehicle for NASA’s New Horizons spacecraft, scheduled to launch from Cape Canaveral Air Force Station, Fla., during a 35-day window that opens Jan. 11 and fly through the Pluto system as early as summer 2015. New Horizons will be powered by a single radioisotope thermoelectric generator (RTG), provided by the Department of Energy, which will be installed shortly before launch.
NASA Astrophysics Data System (ADS)
Brewick, P. T.; Smyth, A. W.
2014-12-01
The accurate and reliable estimation of modal damping from output-only vibration measurements of structural systems is a continuing challenge in the fields of operational modal analysis (OMA) and system identification. In this paper a modified version of the blind source separation (BSS)-based Second-Order Blind Identification (SOBI) method was used to perform modal damping identification on a model bridge structure under varying loading conditions. The bridge model was created with finite elements and consisted of a series of stringer beams supported by a larger girder. The excitation was separated into two categories: ambient noise and traffic loads, with noise modeled with random forcing vectors and traffic simulated with moving loads for cars and partially distributed moving masses for trains. The acceleration responses were treated as the mixed output signals for the BSS algorithm. The modified SOBI method used a windowing technique to maximize the amount of information used for blind identification from the responses. The modified SOBI method successfully found the mode shapes for both types of excitation with high accuracy, but power spectral densities (PSDs) of the recovered modal responses showed signs of distortion for the traffic simulations. The distortion had an adverse effect on the damping ratio estimates for some of the modes, but no correlation could be found between the accuracy of the damping estimates and the accuracy of the recovered mode shapes. The responses and their PSDs were compared to real-world collected data, and patterns similar to distortion were observed, implying that this issue likely affects real-world estimates.
NASA Astrophysics Data System (ADS)
Cheng, Z.; Chen, Y.; Liu, Y.; Liu, W.; Zhang, G.
2015-12-01
Among hydrocarbon reservoir detection techniques, the time-frequency analysis based approach is one of the most widely used because of its straightforward indication of low-frequency anomalies in the time-frequency maps; that is to say, the low-frequency bright spots usually indicate the potential hydrocarbon reservoirs. The time-frequency analysis based approach is easy to implement and, more importantly, is usually of high fidelity in reservoir prediction compared with the state-of-the-art approaches, and thus is of great interest to petroleum geologists, geophysicists, and reservoir engineers. The S transform has been frequently used in obtaining the time-frequency maps because of its better performance in controlling the compromise between the time and frequency resolutions than the alternatives, such as the short-time Fourier transform, Gabor transform, and continuous wavelet transform. The window function used in the majority of previous S transform applications is the symmetric Gaussian window. However, one problem with the symmetric Gaussian window is the degradation of time resolution in the time-frequency map due to the long front taper. In our study, a bi-Gaussian S transform that substitutes the symmetric Gaussian window with an asymmetric bi-Gaussian window is proposed to analyze multi-channel seismic data in order to predict hydrocarbon reservoirs. The bi-Gaussian window introduces asymmetry in the resultant time-frequency spectrum, with time resolution better in the front direction than in the back direction. This is the first time that the bi-Gaussian S transform has been used for analyzing multi-channel post-stack seismic data to predict hydrocarbon reservoirs since its invention in 2003. The superiority of the bi-Gaussian S transform over the traditional S transform is tested on a real land seismic data example. The performance shows that the enhanced temporal resolution can help us depict more clearly the edge of the hydrocarbon reservoir, especially when the thickness of the reservoir is small (such as thin beds).
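A small sketch of the asymmetric bi-Gaussian window idea described above: two half-Gaussians of different widths joined at the window centre, with the narrower half shortening the front taper, applied in a naive windowed Fourier (S-transform-like) analysis. The width parameters, normalization, and frequency scaling are illustrative assumptions rather than the authors' exact definition.

```python
import numpy as np

def bi_gaussian_window(t, t0, f, k_front=0.5, k_back=1.0):
    """Asymmetric window centred at t0: narrower half-Gaussian before t0, wider after.

    Widths scale as 1/f, following the S-transform convention that the window
    narrows at higher frequency.
    """
    sigma_front = k_front / max(abs(f), 1e-12)
    sigma_back = k_back / max(abs(f), 1e-12)
    w = np.where(t < t0,
                 np.exp(-0.5 * ((t - t0) / sigma_front) ** 2),
                 np.exp(-0.5 * ((t - t0) / sigma_back) ** 2))
    return w / (np.sqrt(np.pi / 2) * (sigma_front + sigma_back))   # unit area

def bi_gaussian_s_transform(x, fs, freqs, k_front=0.5, k_back=1.0):
    """Naive time-frequency map: windowed Fourier analysis with the bi-Gaussian window."""
    n = len(x)
    t = np.arange(n) / fs
    tf = np.zeros((len(freqs), n), complex)
    for i, f in enumerate(freqs):
        for j, t0 in enumerate(t):
            w = bi_gaussian_window(t, t0, f, k_front, k_back)
            tf[i, j] = np.sum(x * w * np.exp(-2j * np.pi * f * t)) / fs
    return tf

# Illustrative use on a short two-tone signal with an onset at t = 1 s.
fs = 200.0
t = np.arange(int(2 * fs)) / fs
x = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 35 * t) * (t > 1.0)
tf = bi_gaussian_s_transform(x, fs, freqs=np.array([10.0, 35.0]))
```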
Small-window parametric imaging based on information entropy for ultrasound tissue characterization
Tsui, Po-Hsiang; Chen, Chin-Kuo; Kuo, Wen-Hung; Chang, King-Jen; Fang, Jui; Ma, Hsiang-Yang; Chou, Dean
2017-01-01
Constructing ultrasound statistical parametric images by using a sliding window is a widely adopted strategy for characterizing tissues. Deficiency in spatial resolution, the appearance of boundary artifacts, and the prerequisite data distribution limit the practicability of statistical parametric imaging. In this study, small-window entropy parametric imaging was proposed to overcome the above problems. Simulations and measurements of phantoms were executed to acquire backscattered radiofrequency (RF) signals, which were processed to explore the feasibility of small-window entropy imaging in detecting scatterer properties. To validate the ability of entropy imaging in tissue characterization, measurements of benign and malignant breast tumors were conducted (n = 63) to compare performances of conventional statistical parametric (based on Nakagami distribution) and entropy imaging by the receiver operating characteristic (ROC) curve analysis. The simulation and phantom results revealed that entropy images constructed using a small sliding window (side length = 1 pulse length) adequately describe changes in scatterer properties. The area under the ROC for using small-window entropy imaging to classify tumors was 0.89, which was higher than 0.79 obtained using statistical parametric imaging. In particular, boundary artifacts were largely suppressed in the proposed imaging technique. Entropy enables using a small window for implementing ultrasound parametric imaging. PMID:28106118
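A compact sketch of small-window entropy imaging as described above: Shannon entropy of the envelope amplitude distribution is computed in a small sliding window over the envelope image. The window side length, bin count, and Rayleigh-distributed synthetic data are illustrative assumptions, not the study's acquisition settings.

```python
import numpy as np

def entropy_image(envelope, win=9, n_bins=64):
    """Sliding-window Shannon entropy map of an ultrasound envelope image."""
    half = win // 2
    rows, cols = envelope.shape
    ent = np.full((rows, cols), np.nan)
    lo, hi = envelope.min(), envelope.max()
    for r in range(half, rows - half):
        for c in range(half, cols - half):
            patch = envelope[r - half:r + half + 1, c - half:c + half + 1]
            hist, _ = np.histogram(patch, bins=n_bins, range=(lo, hi))
            p = hist / hist.sum()
            p = p[p > 0]
            ent[r, c] = -np.sum(p * np.log2(p))    # Shannon entropy of the window
    return ent

# Illustrative use: Rayleigh-distributed envelope with a stronger-scattering inclusion.
rng = np.random.default_rng(9)
env = rng.rayleigh(scale=1.0, size=(128, 128))
env[48:80, 48:80] = rng.rayleigh(scale=2.0, size=(32, 32))
ent_map = entropy_image(env)
```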
NASA Astrophysics Data System (ADS)
Fraizier, E.; Antoine, P.; Godefroit, J.-L.; Lanier, G.; Roy, G.; Voltz, C.
Lithium fluoride (LiF) windows are extensively used in traditional shock wave experiments because of their transparency beyond 100 GPa along the [100] axis. A correct knowledge of the optical and mechanical properties of these windows is essential in order to analyze the experimental data and to determine the equation of state of a large variety of metals. With this in mind, the window supply is systematically characterized in order to determine the density, the thermal expansion and the crystalline orientation. Furthermore, an experimental campaign is conducted in order to characterize the window properties under shock loading at 300 K and under preheated conditions (450 K). This article describes the experiments, details the analysis and presents the results. Particle velocity measurements are carried out at the interface of a multiple-window stack using interferometer diagnostics (VISAR and IDL) at 532 nm wavelength. Shock velocity is calculated as a function of the time of flight through each window. The optical correction is calculated as the ratio of the apparent velocity gap and the particle velocity at the free surface. To go further, the Rankine-Hugoniot relations are applied to calculate the pressure and the density. Then, the results and uncertainties are presented and compared with literature data.
Chemical analyses of provided samples
NASA Technical Reports Server (NTRS)
Becker, Christopher H.
1993-01-01
A batch of four samples was received, and chemical analysis of the surface and near-surface regions of the samples was performed by the surface analysis by laser ionization (SALI) method. The samples included four one-inch diameter optics labeled windows no. PR14 and PR17 and MgF2 mirrors 9-93 PPPC exp. and control DMES 26-92. The analyses emphasized surface contamination or modification. In these studies, pulsed desorption by 355 nm laser light and single-photon ionization (SPI) above the sample by coherent 118 nm radiation (at approximately 5 × 10^5 W/cm^2) were used, emphasizing organic analysis. For the two windows with an apparent yellowish contaminant film, higher desorption laser power was needed to provide substantial signals, indicating a less volatile contamination than for the two mirrors. Window PR14 and the 9-93 mirror showed more hydrocarbon components than the other two samples. The mass spectra, which show considerable complexity, are discussed in terms of various potential chemical assignments.
Empirical assessment of a prismatic daylight-redirecting window film in a full-scale office testbed
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thanachareonkit, Anothai; Lee, Eleanor S.; McNeil, Andrew
2013-08-31
Daylight redirecting systems with vertical windows have the potential to offset lighting energy use in deep perimeter zones. Microstructured prismatic window films can be manufactured using low-cost, roll-to-roll fabrication methods and adhered to the inside surface of existing windows as a retrofit measure or installed as a replacement insulating glass unit in the clerestory portion of the window wall. A clear film patterned with linear, 50-250 micrometer high, four-sided asymmetrical prisms was fabricated and installed in the south-facing, clerestory low-e, clear glazed windows of a full-scale testbed facility. Views through the film were distorted. The film was evaluated in a sunny climate over a two-year period to gauge daylighting and visual comfort performance. The daylighting aperture was small (window-to-wall ratio of 0.18) and the lower windows were blocked off to isolate the evaluation to the window film. Workplane illuminance measurements were made in the 4.6 m (15 ft) deep room furnished as a private office. Analysis of discomfort glare was conducted using high dynamic range imaging coupled with the evalglare software tool, which computes the daylight glare probability and other metrics used to evaluate visual discomfort. The window film was found to result in perceptible levels of discomfort glare on clear sunny days from the most conservative view point in the rear of the room looking toward the window. Daylight illuminance levels at the rear of the room were significantly increased above the reference window condition, which was defined as the same glazed clerestory window but with an interior Venetian blind (slat angle set to the cut-off angle), for the equinox to winter solstice period on clear sunny days. For partly cloudy and overcast sky conditions, daylight levels were improved slightly. To reduce glare, the daylighting film was coupled with a diffusing film in an insulating glazing unit. The diffusing film retained the directionality of the redirected light, spreading it within a small range of outgoing angles. This solution was found to reduce glare to imperceptible levels while retaining for the most part the illuminance levels achieved solely by the daylighting film.
Hook Region Represented in a Cochlear Model
NASA Astrophysics Data System (ADS)
Steele, Charles R.; Kim, Namkeun; Puria, Sunil
2009-02-01
The present interest is in discontinuities. Particularly the geometry of the hook region, with the flexible round window nearly parallel with the basilar membrane, is not represented by a standard box model, in which both stapes and round window are placed at the end. A better model represents the round window by a soft membrane in the wall of scala tympani, with the end closed. This complicates the analysis considerably. Features are that the significant compression wave, i.e., the fast wave, is of negligible magnitude in this region, and that significant evanescent waves occur because of the discontinuities at the beginning and end of the simulated round window. The effect of this on both high frequency, with maximum basilar membrane response in the hook region, and lower frequencies are determined.
DOT National Transportation Integrated Search
2012-01-01
Overview of presentation: Evaluation Parameters; EPA's Sensitivity Analysis; Comparison to Baseline Case; MOVES Sensitivity Run Specification; MOVES Sensitivity Input Parameters; Results; Uses of Study.
Kashif, Muhammad; Andersson, Claes; Åberg, Magnus; Nygren, Peter; Sjöblom, Tobias; Hammerling, Ulf; Larsson, Rolf; Gustafsson, Mats G
2014-07-01
For decades, the standard procedure when screening for candidate anticancer drug combinations has been to search for synergy, defined as any positive deviation from trivial cases such as when the drugs are regarded as diluted versions of each other (Loewe additivity), independent actions (Bliss independence), or no interaction terms in a response surface model (no interaction). Here, we show that this kind of conventional synergy analysis may be completely misleading when the goal is to detect if there is a promising in vitro therapeutic window. Motivated by this result, and the fact that a drug combination offering a promising therapeutic window is seldom interesting if one of its constituent drugs can provide the same window alone, the largely overlooked concept of therapeutic synergy (TS) is reintroduced. In vitro TS is said to occur when the largest therapeutic window obtained by the best drug combination cannot be achieved by any single drug within the concentration range studied. Using this definition of TS, we introduce a procedure that enables its use in modern massively parallel experiments supported by a statistical omnibus test for TS designed to avoid the multiple testing problem. Finally, we suggest how one may perform TS analysis, via computational predictions of the reference cell responses, when only the target cell responses are available. In conclusion, the conventional error-prone search for promising drug combinations may be improved by replacing conventional (toxicology-rooted) synergy analysis with an analysis focused on (clinically motivated) TS. ©2014 American Association for Cancer Research.
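One plausible way to operationalize this definition of in vitro TS in code is sketched below: the therapeutic window of a treatment is taken as the difference between its effect on target (tumor) cells and on reference cells, and TS is declared when the best combination window beats the best single-drug window. The viability representation, the window definition as a simple difference, and all variable names are assumptions for illustration, not the authors' pipeline or statistical test.

```python
import numpy as np

def therapeutic_window(target_viability, reference_viability):
    """Window = effect on target (tumor) cells minus effect on reference cells.
    Both arrays hold fractional viabilities (1 = untreated level) for the same
    list of treatments (single drugs at several doses and their combinations)."""
    t = np.asarray(target_viability, dtype=float)
    r = np.asarray(reference_viability, dtype=float)
    return (1.0 - t) - (1.0 - r)

def shows_ts(windows, is_combination):
    """Therapeutic synergy: the best combination window exceeds the best single-drug window."""
    w = np.asarray(windows, dtype=float)
    combo = np.asarray(is_combination, dtype=bool)
    return bool(w[combo].max() > w[~combo].max())
```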
Improved modified energy ratio method using a multi-window approach for accurate arrival picking
NASA Astrophysics Data System (ADS)
Lee, Minho; Byun, Joongmoo; Kim, Dowan; Choi, Jihun; Kim, Myungsun
2017-04-01
To identify accurately the location of microseismic events generated during hydraulic fracture stimulation, it is necessary to detect the first break of the P- and S-wave arrival times recorded at multiple receivers. These microseismic data often contain high-amplitude noise, which makes it difficult to identify the P- and S-wave arrival times. The short-term-average to long-term-average (STA/LTA) and modified energy ratio (MER) methods are based on the differences in the energy densities of the noise and signal, and are widely used to identify the P-wave arrival times. The MER method yields more consistent results than the STA/LTA method for data with a low signal-to-noise (S/N) ratio. However, although the MER method shows good results regardless of the delay of the signal wavelet for signals with a high S/N ratio, it may yield poor results if the signal is contaminated by high-amplitude noise and does not have the minimum delay. Here we describe an improved MER (IMER) method, whereby we apply a multiple-windowing approach to overcome the limitations of the MER method. The IMER method contains calculations of an additional MER value using a third window (in addition to the original MER window), as well as the application of a moving average filter to each MER data point to eliminate high-frequency fluctuations in the original MER distributions. The resulting distribution makes it easier to apply thresholding. The proposed IMER method was applied to synthetic and real datasets with various S/N ratios and mixed-delay wavelets. The results show that the IMER method yields a high accuracy rate of around 80% within five sample errors for the synthetic datasets. Likewise, in the case of real datasets, 94.56% of the P-wave picking results obtained by the IMER method had a deviation of less than 0.5 ms (corresponding to 2 samples) from the manual picks.
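A minimal sketch of the underlying MER statistic and the moving-average smoothing step described above is given below. The window length, the cube exponent, and the argmax-based pick follow the commonly cited MER formulation and are assumptions for illustration, not a reproduction of the authors' IMER code (which additionally uses a third window and thresholding).

```python
import numpy as np

def mer(trace, win):
    """Modified energy ratio: (post-window energy / pre-window energy) * |amplitude|, cubed."""
    x = np.asarray(trace, dtype=float)
    e = x ** 2
    out = np.zeros(len(x))
    for i in range(win, len(x) - win):
        pre = e[i - win:i].sum() + 1e-12   # energy before sample i
        post = e[i:i + win].sum()          # energy after sample i
        out[i] = (post / pre * abs(x[i])) ** 3
    return out

def smooth(values, length):
    """Moving-average filter used to suppress high-frequency fluctuations in the MER curve."""
    kernel = np.ones(length) / length
    return np.convolve(values, kernel, mode="same")

# first-break estimate as the maximum of the smoothed curve (illustrative picking rule):
# pick = int(np.argmax(smooth(mer(trace, win=100), length=20)))
```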
Mock Target Window OTR and IR Design and Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wass, Alexander Joseph
In order to fully verify temperature measurements made on the target window using infrared (IR) optical non-contact methods, actual comparative measurements are made with a real beam distribution as the heat source using Argonne National Laboratory's (ANL) 35 MeV electron accelerator. Using Monte Carlo N-Particle (MCNP) simulations and thermal Finite Element Analysis (FEA), a cooled mock target window with thermocouple implants is designed to be used in such a test to achieve window temperatures up to 700°C. An uncoated and a black-coated mock window are designed to enhance the IR temperature measurements and verify optical transition radiation (OTR) imagery. This allows us to fully verify and characterize our temperature accuracy with our current IR camera method and any future method we may wish to explore using actual production conditions. This test also provides us with valuable conclusions and concerns regarding the calibration method we developed using our IR test stand at TA-53 in MPF-14.
SOT: A rapid prototype using TAE windows
NASA Technical Reports Server (NTRS)
Stephens, Mark; Eike, David; Harris, Elfrieda; Miller, Dana
1986-01-01
The development of the window interface extension feature of the Transportable Applications Executive (TAE) is discussed. This feature is being used to prototype a space station payload interface in order to demonstrate and assess the benefits of using windows on a bit mapped display and also to convey the concept of telescience, the control and operation of space station payloads from remote sites. The prototype version of the TAE with windows operates on a DEC VAXstation 100. This workstation has a high resolution 19 inch bit mapped display, a keyboard and a three-button mouse. The VAXstation 100 is not a stand-alone workstation, but is controlled by software executing on a VAX/8600. A short scenario was developed utilizing the Solar Optical Telescope (SOT) as an example payload. In the scenario the end-user station includes the VAXstation 100 plus an image analysis terminal used to display the CCD images. The layout and use of the prototype elements, i.e., the root menu, payload status window, and target acquisition menu, are described.
Thermal Stress in HFEF Hot Cell Windows Due to an In-Cell Metal Fire
Solbrig, Charles W.; Warmann, Stephen A.
2016-01-01
This work investigates an accident during the pyrochemical extraction of Uranium and Plutonium from PWR spent fuel in an argon atmosphere hot cell. In the accident, the heavy metals (U and Pu) being extracted are accidentally exposed to air from a leaky instrument penetration which goes through the cell walls. The extracted pin size pieces of U and Pu metal readily burn when exposed to air. Technicians perform the electrochemical extraction using manipulators through a 4 foot thick hot cell concrete wall which protects them from the radioactivity of the spent fuel. Four foot thick windows placed in the wall allow the technicians to visually control the manipulators. These windows would be exposed to the heat of the metal fire. As a result, this analysis determines if the thermal stress caused by the fire would crack the windows and if the heat would degrade the window seals allowing radioactivity to escape from the cell.
Ozgul, Betul Memis; Orhan, Kaan; Oz, Firdevs Tulga
2015-09-01
We investigated inhibition of lesion progression in artificial enamel lesions. Lesions were created on primary and permanent anterior teeth (n = 10 each) and were divided randomly into two groups with two windows: Group 1 (window A: resin infiltration; window B: negative control) and Group 2 (window A: resin infiltration + fluoride varnish; window B: fluoride varnish). After pH cycling, micro-computed tomography was used to analyze progression of lesion depth and changes in mineral density. Resin infiltration and resin infiltration + fluoride varnish significantly inhibited progression of lesion depth in primary teeth (P < 0.05). Inhibition of lesion depth progression in permanent teeth was significantly greater after treatment with resin infiltration + fluoride varnish than in the negative control (P < 0.05). Change in mineral density was smaller in the resin infiltration and resin infiltration + fluoride varnish groups; however, the difference was not significant for either group (P > 0.05). Resin infiltration is a promising method of inhibiting progression of caries lesions.
McBride, J.H.; Hatcher, R.D.; Stephenson, W.J.; Hooper, R.J.
2005-01-01
The southern Appalachian Pine Mountain window exposes 1.1 Ga Grenvillian basement and its metasedimentary Paleozoic(?) cover through the allochthonous Inner Piedmont. The issue of whether the crustal block inside the window was either transported above the master Appalachian (late Alleghanian) décollement or is an autochthonous block that was overridden by the décollement has been debated for some time. New detrital zircon geochronologic data from the cover rocks inside the window suggest this crustal block was derived from Gondwana but docked with Laurentia before the Alleghanian event. Reprocessed deep seismic reflection data from west-central Georgia (pre- and poststack noise reduction, amplitude variation analysis, and prestack depth migration) indicate that a significant band of subhorizontal reflections occurs almost continuously beneath the window collinear with the originally recognized décollement reflections north of the window. A marked variation in the décollement image, from strong and coherent north of the window to more diffuse directly beneath the window, is likely a partial consequence of the different geology between the Inner Piedmont and the window. The more diffuse image beneath the window may also result from imaging problems related to changes in topography and fold of cover (i.e., signal-to-noise ratio). Two alternative tectonic models for the Pine Mountain window can partially account for the observed variation in the décollement reflectivity. (1) The Pine Mountain block could be truncated below by a relatively smooth continuation of the décollement. The window would thus expose an allochthonous basement duplex or horse-block thrust upward from the south along the Late Proterozoic rifted continental margin. (2) The window represents localized exhumation of autochthonous basement and cover along a zone of distributed intrabasement shearing directly beneath the window. Either model is viable if only reflector geometry is considered; model (1) is favored if both geometry and kinematics of Blue Ridge-Piedmont thrust sheet emplacement are incorporated. In either model, the southern margin of the window merges to the west with the Iapetan early Alleghanian Central Piedmont suture, which juxtaposes North American-affinity Piedmont rocks to the north and exotic Panafrican rocks of the Carolina (Avalon) terrane to the south. Immediately south of the window, this suture dips southward and merges in the lower crust with the late Alleghanian suture joining the Appalachians with Gondwana. © 2005 Geological Society of America.
Local variance for multi-scale analysis in geomorphometry.
Drăguţ, Lucian; Eisank, Clemens; Strasser, Thomas
2011-07-15
Increasing availability of high resolution Digital Elevation Models (DEMs) is leading to a paradigm shift regarding scale issues in geomorphometry, prompting new solutions to cope with multi-scale analysis and detection of characteristic scales. We tested the suitability of the local variance (LV) method, originally developed for image analysis, for multi-scale analysis in geomorphometry. The method consists of: 1) up-scaling land-surface parameters derived from a DEM; 2) calculating LV as the average standard deviation (SD) within a 3 × 3 moving window for each scale level; 3) calculating the rate of change of LV (ROC-LV) from one level to another, and 4) plotting values so obtained against scale levels. We interpreted peaks in the ROC-LV graphs as markers of scale levels where cells or segments match types of pattern elements characterized by (relatively) equal degrees of homogeneity. The proposed method has been applied to LiDAR DEMs in two test areas different in terms of roughness: low relief and mountainous, respectively. For each test area, scale levels for slope gradient, plan, and profile curvatures were produced at constant increments with either resampling (cell-based) or image segmentation (object-based). Visual assessment revealed homogeneous areas that convincingly associate into patterns of land-surface parameters well differentiated across scales. We found that the LV method performed better on scale levels generated through segmentation as compared to up-scaling through resampling. The results indicate that coupling multi-scale pattern analysis with delineation of morphometric primitives is possible. This approach could be further used for developing hierarchical classifications of landform elements.
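A minimal sketch of steps 1-3 of the LV workflow is given below, assuming the land-surface parameter (e.g. slope gradient) is available as a NumPy grid and that cell-based up-scaling is done by simple block averaging; the exact up-scaling, scale increments, and the ROC-LV normalization in the published method may differ.

```python
import numpy as np
from scipy.ndimage import generic_filter

def local_variance(surface_parameter):
    """Step 2: mean of the standard deviation computed in a 3 x 3 moving window."""
    sd = generic_filter(np.asarray(surface_parameter, float), np.std, size=3, mode="nearest")
    return float(np.nanmean(sd))

def upscale(grid, factor):
    """Illustrative cell-based up-scaling (step 1) by block averaging."""
    g = np.asarray(grid, float)
    r, c = (g.shape[0] // factor) * factor, (g.shape[1] // factor) * factor
    return g[:r, :c].reshape(r // factor, factor, c // factor, factor).mean(axis=(1, 3))

def roc_lv(lv_values):
    """Step 3: rate of change of LV from one scale level to the next,
    here expressed relative to the previous (finer) level; the normalization is an assumption."""
    lv = np.asarray(lv_values, float)
    return 100.0 * (lv[1:] - lv[:-1]) / lv[:-1]

# lv_per_level = [local_variance(upscale(slope, f)) for f in (1, 2, 3, 4, 5)]
# peaks in roc_lv(lv_per_level) mark candidate characteristic scales
```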
Effect of contrast on human speed perception
NASA Technical Reports Server (NTRS)
Stone, Leland S.; Thompson, Peter
1992-01-01
This study is part of an ongoing collaborative research effort between the Life Science and Human Factors Divisions at NASA ARC to measure the accuracy of human motion perception in order to predict potential errors in human perception/performance and to facilitate the design of display systems that minimize the effects of such deficits. The study describes how contrast manipulations can produce significant errors in human speed perception. Specifically, when two simultaneously presented parallel gratings are moving at the same speed within stationary windows, the lower-contrast grating appears to move more slowly. This contrast-induced misperception of relative speed is evident across a wide range of contrasts (2.5-50 percent) and does not appear to saturate (e.g., a 50 percent contrast grating appears slower than a 70 percent contrast grating moving at the same speed). The misperception is large: a 70 percent contrast grating must, on average, be slowed by 35 percent to match a 10 percent contrast grating moving at 2 deg/sec (N = 6). Furthermore, it is largely independent of the absolute contrast level and is a quasilinear function of log contrast ratio. A preliminary parametric study shows that, although spatial frequency has little effect, the relative orientation of the two gratings is important. Finally, the effect depends on the temporal presentation of the stimuli: the effects of contrast on perceived speed appear lessened when the stimuli to be matched are presented sequentially. These data constrain both physiological models of visual cortex and models of human performance. We conclude that viewing conditions that affect contrast, such as fog, may cause significant errors in speed judgments.
Hamuro, Yoshitomo
2017-03-01
A new strategy to analyze amide hydrogen/deuterium exchange mass spectrometry (HDX-MS) data is proposed, utilizing a wider time window and isotope envelope analysis of each peptide. While most current scientific reports present HDX-MS data as a set of time-dependent deuteration levels of peptides, the ideal HDX-MS data presentation is a complete set of backbone amide hydrogen exchange rates. The ideal data set can provide single amide resolution, coverage of all exchange events, and the open/close ratio of each amide hydrogen in EX2 mechanism. Toward this goal, a typical HDX-MS protocol was modified in two aspects: measurement of a wider time window in HDX-MS experiments and deconvolution of isotope envelope of each peptide. Measurement of a wider time window enabled the observation of deuterium incorporation of most backbone amide hydrogens. Analysis of the isotope envelope instead of centroid value provides the deuterium distribution instead of the sum of deuteration levels in each peptide. A one-step, global-fitting algorithm optimized exchange rate and deuterium retention during the analysis of each amide hydrogen by fitting the deuterated isotope envelopes at all time points of all peptides in a region. Application of this strategy to cytochrome c yielded 97 out of 100 amide hydrogen exchange rates. A set of exchange rates determined by this approach is more appropriate for a patent or regulatory filing of a biopharmaceutical than a set of peptide deuteration levels obtained by a typical protocol. A wider time window of this method also eliminates false negatives in protein-ligand binding site identification.
NASA Astrophysics Data System (ADS)
El-Madany, T.; Griessbaum, F.; Maneke, F.; Chu, H.-S.; Wu, C.-C.; Chang, S. C.; Hsia, Y.-J.; Juang, J.-Y.; Klemm, O.
2010-07-01
To estimate carbon dioxide or water vapor fluxes with the eddy covariance method, high-quality data sets are necessary. Under foggy conditions this is challenging, because open path measurements are influenced by the water droplets that cross the measurement path as well as deposit on the windows of the optical path. For the LI-7500 the deposition of droplets on the window results in an intensity reduction of the infrared beam. To keep the strength of the infrared beam under these conditions, the energy is increased. A measure for the increased energy is given by the AGC value (Automatic Gain Control). Up to an AGC threshold value of 70 % the data from the LI-7500 is assumed to be of good quality (personal communication with LICOR). Due to fog deposition on the windows, the AGC value rises above 70 % and stays there until the fog disappears and the water on the windows evaporates. To improve data quality during foggy conditions, a blower system was developed that blows the deposited water droplets off the window. The system is triggered if the AGC value rises above 70 %. Then a pneumatic jack will lift the blower system towards the LI-7500 and the water droplets are blown off with compressed air. After the AGC value drops below 70 %, the pneumatic jack will move back to the idle position. Using this technique showed that not only the fog droplets deposited on the window but also the fog droplets inside the measurement path cause significant problems for the measurement. Under conditions of very dense fog the measured values of carbon dioxide can get unrealistically high, and for water vapor, negative values can be observed even if the AGC value is below 70 %. The negative values can be explained by the scatter of the infrared beam on the fog droplets. It is assumed that different types of fog droplet spectra cause the various error patterns observed. For high quality flux measurements, not only the AGC threshold value of 70 % is important, but also the fluctuation of the AGC value in a flux averaging interval. Such AGC value fluctuations can cause severe jumps in the concentration measurements that can hardly be corrected for. Results of fog effects on the LI-7500 performance and its consequences for flux measurements and budget calculations will be presented.
Rabies Vaccination: Higher Failure Rates in Imported Dogs than in those Vaccinated in Italy.
Rota Nodari, E; Alonso, S; Mancin, M; De Nardi, M; Hudson-Cooke, S; Veggiato, C; Cattoli, G; De Benedictis, P
2017-03-01
The current European Union (EU) legislation decrees that pets entering the EU from a rabies-infected third country have to obtain a satisfactory virus-neutralizing antibody level, while those moving within the EU require only rabies vaccination as the risk of moving a rabid pet within the EU is considered negligible. A number of factors driving individual variations in dog vaccine response have been previously reported, including a high rate of vaccine failure in puppies, especially those subject to commercial transport. A total of 21 001 observations collected from dogs (2006-2012) vaccinated in compliance with the current EU regulations were statistically analysed to assess the effect of different risk factors related to rabies vaccine efficacy. Within this framework, we were able to compare the vaccination failure rate in a group of dogs entering the Italian border from EU and non-EU countries to those vaccinated in Italy prior to international travel. Our analysis identified that cross-breeds and two breed categories showed high vaccine success rates, while Beagles and Boxers were the least likely to show a successful response to vaccination (88.82% and 90.32%, respectively). Our analysis revealed diverse performances among the commercially available vaccines, in terms of serological peak windows, and marked differences according to geographical area. Of note, we found a higher vaccine failure rate in imported dogs (13.15%) than in those vaccinated in Italy (5.89%). Our findings suggest that the choice of vaccine may influence the likelihood of an animal achieving a protective serological level and that time from vaccination to sampling should be considered when interpreting serological results. A higher vaccine failure in imported compared to Italian dogs highlights the key role that border controls still have in assessing the full compliance of pet movements with EU legislation to minimize the risk of rabies being reintroduced into a disease-free area. © 2016 The Authors. Zoonoses and Public Health Published by Blackwell Verlag GmbH.
LinkWinds: An Approach to Visual Data Analysis
NASA Technical Reports Server (NTRS)
Jacobson, Allan S.
1992-01-01
The Linked Windows Interactive Data System (LinkWinds) is a prototype visual data exploration and analysis system resulting from a NASA/JPL program of research into graphical methods for rapidly accessing, displaying and analyzing large multivariate multidisciplinary datasets. It is an integrated multi-application execution environment allowing the dynamic interconnection of multiple windows containing visual displays and/or controls through a data-linking paradigm. This paradigm, which results in a system much like a graphical spreadsheet, is not only a powerful method for organizing large amounts of data for analysis, but provides a highly intuitive, easy to learn user interface on top of the traditional graphical user interface.
Simulation of a Real-Time Brain Computer Interface for Detecting a Self-Paced Hitting Task.
Hammad, Sofyan H; Kamavuako, Ernest N; Farina, Dario; Jensen, Winnie
2016-12-01
An invasive brain-computer interface (BCI) is a promising neurorehabilitation device for severely disabled patients. Although some systems have been shown to work well in restricted laboratory settings, their utility must be tested in less controlled, real-time environments. Our objective was to investigate whether a specific motor task could be reliably detected from multiunit intracortical signals from freely moving animals in a simulated, real-time setting. Intracortical signals were first obtained from electrodes placed in the primary motor cortex of four rats that were trained to hit a retractable paddle (defined as a "Hit"). In the simulated real-time setting, the signal-to-noise-ratio was first increased by wavelet denoising. Action potentials were detected, and features were extracted (spike count, mean absolute values, entropy, and combination of these features) within pre-defined time windows (200 ms, 300 ms, and 400 ms) to classify the occurrence of a "Hit." We found higher detection accuracy of a "Hit" (73.1%, 73.4%, and 67.9% for the three window sizes, respectively) when the decision was made based on a combination of features rather than on a single feature. However, the effect of window length was not statistically significant (p = 0.5). Our results showed the feasibility of detecting a motor task in real time in a less restricted environment compared to environments commonly applied within invasive BCI research, and they showed the feasibility of using information extracted from multiunit recordings, thereby avoiding the time-consuming and complex task of extracting and sorting single units. © 2016 International Neuromodulation Society.
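The three features named above can be computed per window roughly as sketched below; the spike-detection threshold (a common robust median-based estimate), the histogram-based entropy, and all names are assumptions for illustration rather than the authors' exact processing chain.

```python
import numpy as np

def window_features(snippet, n_bins=32):
    """Features from one pre-defined window (e.g. 200/300/400 ms) of a denoised multiunit trace:
    spike count, mean absolute value, and Shannon entropy of the amplitude histogram."""
    x = np.asarray(snippet, dtype=float)
    threshold = 4.0 * np.median(np.abs(x)) / 0.6745          # robust noise-based spike threshold (assumed)
    crossings = (x[1:] < -threshold) & (x[:-1] >= -threshold)  # negative-going threshold crossings
    spike_count = int(np.sum(crossings))
    mav = float(np.mean(np.abs(x)))
    counts, _ = np.histogram(x, bins=n_bins)
    p = counts[counts > 0] / counts.sum()
    entropy = float(-np.sum(p * np.log2(p)))
    return np.array([spike_count, mav, entropy])

# usage sketch: features for a 300 ms window at 30 kHz sampling
# f = window_features(trace[start:start + int(0.3 * 30000)])
```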
Compensation for Blur Requires Increase in Field of View and Viewing Time
Kwon, MiYoung; Liu, Rong; Chien, Lillian
2016-01-01
Spatial resolution is an important factor for human pattern recognition. In particular, low resolution (blur) is a defining characteristic of low vision. Here, we examined spatial (field of view) and temporal (stimulus duration) requirements for blurry object recognition. The spatial resolution of an image such as letter or face, was manipulated with a low-pass filter. In experiment 1, studying spatial requirement, observers viewed a fixed-size object through a window of varying sizes, which was repositioned until object identification (moving window paradigm). Field of view requirement, quantified as the number of “views” (window repositions) for correct recognition, was obtained for three blur levels, including no blur. In experiment 2, studying temporal requirement, we determined threshold viewing time, the stimulus duration yielding criterion recognition accuracy, at six blur levels, including no blur. For letter and face recognition, we found blur significantly increased the number of views, suggesting a larger field of view is required to recognize blurry objects. We also found blur significantly increased threshold viewing time, suggesting longer temporal integration is necessary to recognize blurry objects. The temporal integration reflects the tradeoff between stimulus intensity and time. While humans excel at recognizing blurry objects, our findings suggest compensating for blur requires increased field of view and viewing time. The need for larger spatial and longer temporal integration for recognizing blurry objects may further challenge object recognition in low vision. Thus, interactions between blur and field of view should be considered for developing low vision rehabilitation or assistive aids. PMID:27622710
Reciprocal capacitance transients?
NASA Astrophysics Data System (ADS)
Gfroerer, Tim; Simov, Peter; Wanlass, Mark
2007-03-01
When the reverse bias across a semiconductor diode is changed, charge carriers move to accommodate the appropriate depletion thickness, producing a simultaneous change in the device capacitance. Transient capacitance measurements can reveal inhibited carrier motion due to trapping, where the depth of the trap can be evaluated using the temperature-dependent escape rate. However, when we employ this technique on a GaAs0.72P0.28 n+/p diode (which is a candidate for incorporation in multi-junction solar cells), we observe a highly non-exponential response under a broad range of experimental conditions. Double exponential functions give good fits, but lead to non-physical results. The deduced rates depend on the observation time window and fast and slow rates, which presumably correspond to deep and shallow levels, have identical activation energies. Meanwhile, we have discovered a universal linear relationship between the inverse of the capacitance and time. An Arrhenius plot of the slope of the reciprocal of the transient yields an activation energy of approximately 0.4 eV, independent of the observation window and other experimental conditions. The reciprocal behavior leads us to hypothesize that hopping, rather than escape into high-mobility bands, may govern the transport of trapped holes in this system.
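The two-step analysis described above (a linear fit of 1/C against time for each transient, followed by an Arrhenius fit of the slopes) can be sketched as follows; the fitting routine and variable names are illustrative assumptions, not the authors' analysis code.

```python
import numpy as np

K_B = 8.617e-5  # Boltzmann constant in eV/K

def reciprocal_slope(time_s, capacitance_f):
    """Slope of 1/C versus time for one transient (the quantity reported to be linear)."""
    slope, _intercept = np.polyfit(np.asarray(time_s, float),
                                   1.0 / np.asarray(capacitance_f, float), 1)
    return slope

def activation_energy(temperatures_k, slopes):
    """Arrhenius analysis: slope(T) ~ exp(-Ea / kT), so ln(slope) vs 1/T has gradient -Ea/k."""
    fit = np.polyfit(1.0 / np.asarray(temperatures_k, float),
                     np.log(np.asarray(slopes, float)), 1)
    return -fit[0] * K_B   # activation energy in eV
```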
An artificial reality environment for remote factory control and monitoring
NASA Technical Reports Server (NTRS)
Kosta, Charles Paul; Krolak, Patrick D.
1993-01-01
Work has begun on the merger of two well known systems, VEOS (HITLab) and CLIPS (NASA). In the recent past, the University of Massachusetts Lowell developed a parallel version of NASA CLIPS, called P-CLIPS. This modification allows users to create smaller expert systems which are able to communicate with each other to jointly solve problems. With the merger of a VEOS message system, PCLIPS-V can now act as a group of entities working within VEOS. To display the 3D virtual world we have been using a graphics package called HOOPS, from Ithaca Software. The artificial reality environment we have set up contains actors and objects as found in our Lincoln Logs Factory of the Future project. The environment allows us to view and control the objects within the virtual world. All communication between the separate CLIPS expert systems is done through VEOS. A graphical renderer generates camera views on X-Windows devices; Head Mounted Devices are not required. This allows more people to make use of this technology. We are experimenting with different types of virtual vehicles to give the user a sense that he or she is actually moving around inside the factory looking ahead through windows and virtual monitors.
NASA Astrophysics Data System (ADS)
Qiao, Ruimin; Li, Qinghao; Zhuo, Zengqing; Sallis, Shawn; Fuchs, Oliver; Blum, Monika; Weinhardt, Lothar; Heske, Clemens; Pepper, John; Jones, Michael; Brown, Adam; Spucces, Adrian; Chow, Ken; Smith, Brian; Glans, Per-Anders; Chen, Yanxue; Yan, Shishen; Pan, Feng; Piper, Louis F. J.; Denlinger, Jonathan; Guo, Jinghua; Hussain, Zahid; Chuang, Yi-De; Yang, Wanli
2017-03-01
An endstation with two high-efficiency soft x-ray spectrographs was developed at Beamline 8.0.1 of the Advanced Light Source, Lawrence Berkeley National Laboratory. The endstation is capable of performing soft x-ray absorption spectroscopy, emission spectroscopy, and, in particular, resonant inelastic soft x-ray scattering (RIXS). Two slit-less variable line-spacing grating spectrographs are installed at different detection geometries. The endstation covers the photon energy range from 80 to 1500 eV. For studying transition-metal oxides, the large detection energy window allows a simultaneous collection of x-ray emission spectra with energies ranging from the O K-edge to the Ni L-edge without moving any mechanical components. The record-high efficiency enables the recording of comprehensive two-dimensional RIXS maps with good statistics within a short acquisition time. By virtue of the large energy window and high throughput of the spectrographs, partial fluorescence yield and inverse partial fluorescence yield signals could be obtained for all transition metal L-edges including Mn. Moreover, the different geometries of these two spectrographs (parallel and perpendicular to the horizontal polarization of the beamline) provide contrasts in RIXS features with two different momentum transfers.
Mahmood, Hafiz Sultan; Hoogmoed, Willem B.; van Henten, Eldert J.
2013-01-01
Fine-scale spatial information on soil properties is needed to successfully implement precision agriculture. Proximal gamma-ray spectroscopy has recently emerged as a promising tool to collect fine-scale soil information. The objective of this study was to evaluate a proximal gamma-ray spectrometer to predict several soil properties using energy-windows and full-spectrum analysis methods in two differently managed sandy loam fields: conventional and organic. In the conventional field, both methods predicted clay, pH and total nitrogen with a good accuracy (R2 ≥ 0.56) in the top 0–15 cm soil depth, whereas in the organic field, only clay content was predicted with such accuracy. The highest prediction accuracy was found for total nitrogen (R2 = 0.75) in the conventional field in the energy-windows method. Predictions were better in the top 0–15 cm soil depths than in the 15–30 cm soil depths for individual and combined fields. This implies that gamma-ray spectroscopy can generally benefit soil characterisation for annual crops where the condition of the seedbed is important. Small differences in soil structure (conventional vs. organic) cannot be determined. As for the methodology, we conclude that the energy-windows method can establish relations between radionuclide data and soil properties as accurate as the full-spectrum analysis method. PMID:24287541
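The energy-windows calibration amounts to regressing window count rates against laboratory-measured soil properties; a minimal least-squares sketch is given below, with the predictor layout (one column per energy window) and the R^2 summary being assumptions for illustration rather than the authors' exact chemometric workflow.

```python
import numpy as np

def fit_energy_windows(window_counts, soil_property):
    """Least-squares calibration of energy-window count rates (e.g. 40K, 238U, 232Th, total)
    against a lab-measured soil property such as clay content or total nitrogen.

    window_counts : array of shape (n_samples, n_windows)
    soil_property : array of shape (n_samples,)
    """
    X = np.column_stack([np.ones(len(window_counts)), np.asarray(window_counts, float)])
    y = np.asarray(soil_property, float)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    predicted = X @ coef
    ss_res = np.sum((y - predicted) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return coef, 1.0 - ss_res / ss_tot   # regression coefficients and R^2
```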
Iconic Meaning in Music: An Event-Related Potential Study.
Cai, Liman; Huang, Ping; Luo, Qiuling; Huang, Hong; Mo, Lei
2015-01-01
Although there has been extensive research on the processing of the emotional meaning of music, little is known about other aspects of listeners' experience of music. The present study investigated the neural correlates of the iconic meaning of music. Event-related potentials (ERP) were recorded while a group of 20 music majors and a group of 20 non-music majors performed a lexical decision task in the context of implicit musical iconic meaning priming. ERP analysis revealed a significant N400 effect of congruency in time window 260-510 ms following the onset of the target word only in the group of music majors. Time-course analysis using 50 ms windows indicated significant N400 effects both within the time window 410-460 ms and 460-510 ms for music majors, whereas only a partial N400 effect during time window 410-460 ms was observed for non-music majors. There was also a trend for the N400 effects in the music major group to be stronger than those in the non-major group in the sub-windows of 310-360 ms and 410-460 ms. Especially in the sub-window of 410-460 ms, the topographical map of the difference waveforms between congruent and incongruent conditions revealed different N400 distribution between groups; the effect was concentrated in bilateral frontal areas for music majors, but in central-parietal areas for non-music majors. These results imply probable neural mechanism differences underlying automatic iconic meaning priming of music. Our findings suggest that processing of the iconic meaning of music can be accomplished automatically and that musical training may facilitate the understanding of the iconic meaning of music.
Iconic Meaning in Music: An Event-Related Potential Study
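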
Luo, Qiuling; Huang, Hong; Mo, Lei
2015-01-01
Although there has been extensive research on the processing of the emotional meaning of music, little is known about other aspects of listeners’ experience of music. The present study investigated the neural correlates of the iconic meaning of music. Event-related potentials (ERP) were recorded while a group of 20 music majors and a group of 20 non-music majors performed a lexical decision task in the context of implicit musical iconic meaning priming. ERP analysis revealed a significant N400 effect of congruency in time window 260-510 ms following the onset of the target word only in the group of music majors. Time-course analysis using 50 ms windows indicated significant N400 effects both within the time window 410-460 ms and 460-510 ms for music majors, whereas only a partial N400 effect during time window 410-460 ms was observed for non-music majors. There was also a trend for the N400 effects in the music major group to be stronger than those in the non-major group in the sub-windows of 310-360ms and 410-460ms. Especially in the sub-window of 410-460 ms, the topographical map of the difference waveforms between congruent and incongruent conditions revealed different N400 distribution between groups; the effect was concentrated in bilateral frontal areas for music majors, but in central-parietal areas for non-music majors. These results imply probable neural mechanism differences underlying automatic iconic meaning priming of music. Our findings suggest that processing of the iconic meaning of music can be accomplished automatically and that musical training may facilitate the understanding of the iconic meaning of music. PMID:26161561
Window and Overlap Processing Effects on Power Estimates from Spectra
NASA Astrophysics Data System (ADS)
Trethewey, M. W.
2000-03-01
Fast Fourier transform (FFT) spectral processing is based on the assumption of stationary ergodic data. In engineering practice, the assumption is often violated and non-stationary data processed. Data windows are commonly used to reduce leakage by decreasing the signal amplitudes near the boundaries of the discrete samples. With certain combinations of non-stationary signals and windows, the temporal weighting may attenuate important signal characteristics to adversely affect any subsequent processing. In other words, the window artificially reduces a significant section of the time signal. Consequently, spectra and overall power estimated from the affected samples are unreliable. FFT processing can be particularly problematic when the signal consists of randomly occurring transients superimposed on a more continuous signal. Overlap processing is commonly used in this situation to improve the estimates. However, the results again depend on the temporal character of the signal in relation to the window weighting. A worst-case scenario, a short-duration half sine pulse, is used to illustrate the relationship between overlap percentage and resulting power estimates. The power estimates are shown to depend on the temporal behaviour of the square of overlapped window segments. An analysis shows that power estimates may be obtained to within 0.27 dB for the following windows and overlap combinations: rectangular (0% overlap), Hanning (62.5% overlap), Hamming (60.35% overlap) and flat-top (82.25% overlap).
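The window/overlap power estimate discussed above can be sketched as follows: the signal is cut into overlapped segments, each segment is multiplied by the window, and the mean power is normalized by the window's own power so that a stationary signal is estimated without bias. The segment length and the specific normalization are illustrative assumptions.

```python
import numpy as np

def segment_power(signal, window, overlap):
    """Average power estimated from overlapped, windowed segments."""
    x = np.asarray(signal, dtype=float)
    w = np.asarray(window, dtype=float)
    n = len(w)
    step = max(1, int(round(n * (1.0 - overlap))))
    w_power = np.mean(w ** 2)              # compensates the attenuation introduced by the window
    powers = [np.mean((x[i:i + n] * w) ** 2) / w_power
              for i in range(0, len(x) - n + 1, step)]
    return float(np.mean(powers))

# Combinations cited in the abstract: rectangular 0 %, Hanning 62.5 %,
# Hamming 60.35 %, flat-top 82.25 % overlap, e.g.:
# p = segment_power(x, np.hanning(1024), overlap=0.625)
```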
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maurya, Gulab Singh; Kumar, Rohit; Rai, Awadhesh Kumar, E-mail: awadheshkrai@rediffmail.com
2015-12-15
In the present manuscript, we demonstrate the design of an experimental setup for on-line laser induced breakdown spectroscopy (LIBS) analysis of impurity layers deposited on specimens of interest for fusion technology, namely, plasma-facing components (PFCs) of a tokamak. For investigation of impurities deposited on PFCs, LIBS spectra of a tokamak wall material like a stainless steel sample (SS304) have been recorded through contaminated and cleaned optical windows. To address the problem of identification of dust and gases present inside the tokamak, we have shown the capability of the apparatus to record LIBS spectra of gases. A new approach known as the “back collection method” to record LIBS spectra of impurities deposited on the inner surface of optical window is presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tenent, Robert C.
2017-12-06
NREL will conduct durability testing of Sage Electrochromics dynamic windows products using American Society for Testing and Materials (ASTM) standard methods and drive parameters as defined by Sage. Window units will be tested and standard analysis performed. Data will be summarized and reported back to Sage at the end of the testing period.
Windowed Multitaper Correlation Analysis of Multimodal Brain Monitoring Parameters
Proescholdt, Martin A.; Bele, Sylvia; Brawanski, Alexander
2015-01-01
Although multimodal monitoring sets the standard in daily practice of neurocritical care, problem-oriented analysis tools to interpret the huge amount of data are lacking. Recently a mathematical model was presented that simulates the cerebral perfusion and oxygen supply in case of a severe head trauma, predicting the appearance of distinct correlations between arterial blood pressure and intracranial pressure. In this study we present a set of mathematical tools that reliably detect the predicted correlations in data recorded at a neurocritical care unit. The time resolved correlations are identified by a windowing technique combined with Fourier-based coherence calculations. The phasing of the data is detected by means of the Hilbert phase difference within the above mentioned windows. A statistical testing method is introduced that allows tuning the parameters of the windowing method in such a way that a predefined accuracy is reached. With this method, the data of fifteen patients were examined, and the predicted correlation was found in each patient. Additionally it could be shown that the occurrence of a distinct correlation parameter, called scp, represents a predictive value of high quality for the patient's outcome. PMID:25821507
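The two ingredients named above, windowed Fourier-based coherence and the Hilbert phase difference, can be sketched with SciPy as follows; the sampling rate, window length, frequency band, and non-overlapping window layout are placeholders, not the parameters tuned in the study.

```python
import numpy as np
from scipy.signal import coherence, hilbert

FS = 100.0  # sampling rate in Hz (placeholder)

def windowed_correlation(abp, icp, win_s=300, band=(0.005, 0.05)):
    """For each non-overlapping window: mean ABP-ICP coherence in a slow-wave band
    and the circular-mean Hilbert phase difference."""
    abp = np.asarray(abp, float)
    icp = np.asarray(icp, float)
    n = int(win_s * FS)
    results = []
    for start in range(0, min(len(abp), len(icp)) - n + 1, n):
        a = abp[start:start + n] - np.mean(abp[start:start + n])
        b = icp[start:start + n] - np.mean(icp[start:start + n])
        f, coh = coherence(a, b, fs=FS, nperseg=n // 4)
        sel = (f >= band[0]) & (f <= band[1])
        phase = np.angle(hilbert(a)) - np.angle(hilbert(b))
        results.append((float(np.mean(coh[sel])),
                        float(np.angle(np.mean(np.exp(1j * phase))))))
    return results
```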
The U.S. Lab is moved to payload canister
NASA Technical Reports Server (NTRS)
2000-01-01
In the Space Station Processing Facility, the U.S. Laboratory Destiny, a component of the International Space Station, glides overhead other hardware while visitors watch from a window (right). On the floor, left to right, are two Multi-Purpose Logistics Modules (MPLMs), Raffaello (far left) and Leonardo, and a Pressurized Mating Adapter-3 (right). Destiny is being moved to a payload canister for transfer to the Operations and Checkout Building where it will be tested in the altitude chamber. Destiny is scheduled to fly on mission STS-98 in early 2001. During the mission, the crew will install the Lab in the Space Station during a series of three space walks. The STS-98 mission will provide the Station with science research facilities and expand its power, life support and control capabilities. The U.S. Lab module continues a long tradition of microgravity materials research, first conducted by Skylab and later Shuttle and Spacelab missions. Destiny is expected to be a major feature in future research, providing facilities for biotechnology, fluid physics, combustion, and life sciences research.
Tan, Chaowei; Wang, Bo; Liu, Paul; Liu, Dong
2008-01-01
Wide field of view (WFOV) imaging mode obtains an ultrasound image over an area much larger than the real time window normally available. As the probe is moved over the region of interest, new image frames are combined with prior frames to form a panorama image. Image registration techniques are used to recover the probe motion, eliminating the need for a position sensor. Speckle patterns, which are inherent in ultrasound imaging, change, or become decorrelated, as the scan plane moves, so we pre-smooth the image to reduce the effects of speckle in registration, as well as to reduce effects from thermal noise. Because we wish to track the movement of features such as structural boundaries, we use an adaptive mesh over the entire smoothed image to home in on areas with features. Motion estimation using blocks centered at the individual mesh nodes generates a field of motion vectors. After angular correction of motion vectors, we model the overall movement between frames as a nonrigid deformation. The polygon filling algorithm for precise, persistence-based spatial compounding constructs the final speckle-reduced WFOV image.
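The per-node motion estimation can be illustrated with a basic exhaustive block-matching search; the sum-of-absolute-differences criterion, block size, and search range below are generic assumptions, not the similarity measure or mesh logic used in the paper.

```python
import numpy as np

def block_match(prev, curr, node_rc, block=16, search=8):
    """Motion vector at one mesh node by exhaustive sum-of-absolute-differences search.

    prev, curr : consecutive (pre-smoothed) frames as 2-D arrays.
    node_rc    : (row, col) of the mesh node; it should lie at least
                 block/2 + search samples away from every frame border.
    """
    r, c = node_rc
    h = block // 2
    ref = prev[r - h:r + h, c - h:c + h].astype(float)
    best, best_dv = np.inf, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            cand = curr[r + dr - h:r + dr + h, c + dc - h:c + dc + h].astype(float)
            if cand.shape != ref.shape:
                continue  # candidate block falls outside the frame
            sad = np.abs(ref - cand).sum()
            if sad < best:
                best, best_dv = sad, (dr, dc)
    return best_dv  # (row, col) displacement of this node between frames
```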
Tour of Jupiter Galilean moons: Winning solution of GTOC6
NASA Astrophysics Data System (ADS)
Colasurdo, Guido; Zavoli, Alessandro; Longo, Alessandro; Casalino, Lorenzo; Simeoni, Francesco
2014-09-01
The paper presents the trajectory designed by the Italian joint team Politecnico di Torino & Sapienza Università di Roma (Team5), winner of the 6th edition of the Global Trajectory Optimization Competition (GTOC6). In the short time available in these competitions, Team5 resorted to basic knowledge, simple tools and a powerful indirect optimization procedure. The mission concerns a 4-year tour of the Jupiter Galilean moons. The paper explains the strategy that was preliminarily devised and eventually implemented by looking for a viable trajectory. The first phase is a capture that moves the spacecraft from the arrival hyperbola to a low-energy orbit around Jupiter. Six series of flybys follow; in each one the spacecraft orbits Jupiter in resonance with a single moon; criteria to construct efficient chains of resonant flybys are presented. Transfer legs move the spacecraft from resonance with a moon to another one; precise phasing of the relevant moons is required; mission opportunities in a 11-year launch window are found by assuming ballistic trajectories and coplanar circular orbits for the Jovian satellites. The actual trajectory is found by using an indirect technique.
Unity connecting module moving to new site in SSPF
NASA Technical Reports Server (NTRS)
1998-01-01
In the Space Station Processing Facility (SSPF), Unity (top) is suspended in air as it is moved to a new location (bottom left) in the SSPF. To its left is Leonardo, the Italian-built Multi-Purpose Logistics Module to be launched on STS-100. Above Leonardo, visitors watch through a viewing window, part of the visitors tour at the Center. As the primary payload on mission STS-88, scheduled to launch Dec. 3, 1998, Unity will be mated to the Russian-built Zarya control module which should already be in orbit at that time. In the SSPF, Unity is undergoing testing such as the Pad Demonstration Test to verify the compatibility of the module with the Space Shuttle, as well as the ability of the astronauts to send and receive commands to Unity from the flight deck of the orbiter, and the common berthing mechanism to which other space station elements will dock. Unity is expected to be ready for installation into the payload canister on Oct. 25, and transported to Launch Pad 39-A on Oct. 27.
Altenritter, Matthew E.; Zydlewski, Gayle B.; Kinnison, Michael T.; Zydlewski, Joseph D.; Wippelhauser, Gail S.
2018-01-01
Movement of shortnose sturgeon (Acipenser brevirostrum) among major river systems in the Gulf of Maine is common and has implications for the management of this endangered species. Directed movements of 61 telemetered individuals monitored between 2010 and 2013 were associated with the river of tagging and individual characteristics. While a small proportion of fish tagged in the Kennebec River moved to the Penobscot River (5%), a much higher proportion of fish tagged in the Penobscot River moved to the Kennebec River (66%), during probable spawning windows. This suggests that Penobscot River fish derive from a migratory contingent within a larger Kennebec River population. Despite this connectivity, fish captured in the Penobscot River were larger (∼100 mm fork length) and had higher condition factors (median Fulton’s K: 0.76) than those captured in the Kennebec River (median Fulton’s K: 0.61). Increased abundance and resource limitation in the Kennebec River may be constraining growth and promoting migration to the Penobscot River by individuals with sufficient initial size and condition. Migrants could experience an adaptive reproductive advantage relative to nonmigratory individuals.
A study on scattering correction for γ-photon 3D imaging test method
NASA Astrophysics Data System (ADS)
Xiao, Hui; Zhao, Min; Liu, Jiantang; Chen, Hao
2018-03-01
A pair of 511 keV γ-photons is generated during a positron annihilation, and their directions differ by 180°. The moving path and energy information can be utilized to form the 3D imaging test method in the industrial domain. However, the scattered γ-photons are the major factor influencing the imaging precision of the test method. This study proposes a γ-photon single scattering correction method from the perspective of spatial geometry. The method first determines possible scattering points when the scattered γ-photon pair hits the detector pair. The range of scattering angles can then be calculated according to the energy window. Finally, the number of scattered γ-photons denotes the attenuation of the total scattered γ-photons along their moving path. The corrected γ-photons are obtained by deducting the scattered γ-photons from the original ones. Two experiments were conducted to verify the effectiveness of the proposed scattering correction method. The results show that the proposed scattering correction method can effectively correct for scattered γ-photons and improve the test accuracy.
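The link between the energy window and the admissible scattering angle follows from the Compton relation for 511 keV annihilation photons; the sketch below, with an assumed lower window edge, shows the largest single-scatter angle that still deposits an in-window energy (function and variable names are illustrative).

```python
import numpy as np

M_E_C2 = 511.0  # electron rest energy in keV
E0 = 511.0      # annihilation photon energy in keV

def max_scatter_angle(e_window_low_kev):
    """Largest single-scatter angle still accepted by the energy window.

    Compton formula E' = E0 / (1 + (E0 / me c^2) * (1 - cos(theta))),
    solved for theta at the lower window edge E' = e_window_low_kev."""
    cos_theta = 1.0 - M_E_C2 * (1.0 / e_window_low_kev - 1.0 / E0)
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))

# e.g. a 450 keV lower threshold accepts single scatters up to roughly
# max_scatter_angle(450.0) degrees (about 30 degrees).
```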
NASA Astrophysics Data System (ADS)
Zhao, Jianhua; Zeng, Haishan; Kalia, Sunil; Lui, Harvey
2017-02-01
Background: Raman spectroscopy is a non-invasive optical technique which can measure molecular vibrational modes within tissue. A large-scale clinical study (n = 518) has demonstrated that real-time Raman spectroscopy could distinguish malignant from benign skin lesions with good diagnostic accuracy; this was validated by a follow-up independent study (n = 127). Objective: Most of the previous diagnostic algorithms have typically been based on analyzing the full band of the Raman spectra, either in the fingerprint or high wavenumber regions. Our objective in this presentation is to explore wavenumber selection based analysis in Raman spectroscopy for skin cancer diagnosis. Methods: A wavenumber selection algorithm was implemented using variably-sized wavenumber windows, which were determined by the correlation coefficient between wavenumbers. Wavenumber windows were chosen based on accumulated frequency from leave-one-out cross-validated stepwise regression or least and shrinkage selection operator (LASSO). The diagnostic algorithms were then generated from the selected wavenumber windows using multivariate statistical analyses, including principal component and general discriminant analysis (PC-GDA) and partial least squares (PLS). A total cohort of 645 confirmed lesions from 573 patients encompassing skin cancers, precancers and benign skin lesions were included. Lesion measurements were divided into training cohort (n = 518) and testing cohort (n = 127) according to the measurement time. Result: The area under the receiver operating characteristic curve (ROC) improved from 0.861-0.891 to 0.891-0.911 and the diagnostic specificity for sensitivity levels of 0.99-0.90 increased respectively from 0.17-0.65 to 0.20-0.75 by selecting specific wavenumber windows for analysis. Conclusion: Wavenumber selection based analysis in Raman spectroscopy improves skin cancer diagnostic specificity at high sensitivity levels.
NASA Astrophysics Data System (ADS)
Diodato, A.; Cafarelli, A.; Schiappacasse, A.; Tognarelli, S.; Ciuti, G.; Menciassi, A.
2018-02-01
High intensity focused ultrasound (HIFU) is an emerging therapeutic solution that enables non-invasive treatment of several pathologies, mainly in oncology. On the other hand, accurate targeting of moving abdominal organs (e.g. liver, kidney, pancreas) is still an open challenge. This paper proposes a novel method to compensate the physiological respiratory motion of organs during HIFU procedures, by exploiting a robotic platform for ultrasound-guided HIFU surgery provided with a therapeutic annular phased array transducer. The proposed method enables us to keep the same contact point between the transducer and the patient’s skin during the whole procedure, thus minimizing the modification of the acoustic window during the breathing phases. The motion of the target point is compensated through the rotation of the transducer around a virtual pivot point, while the focal depth is continuously adjusted thanks to the axial electronically steering capabilities of the HIFU transducer. The feasibility of the angular motion compensation strategy has been demonstrated in a simulated respiratory-induced organ motion environment. Based on the experimental results, the proposed method appears to be significantly accurate (i.e. the maximum compensation error is always under 1 mm), thus paving the way for the potential use of this technique for in vivo treatment of moving organs, and therefore enabling a wide use of HIFU in clinics.